WO2023273761A1 - Display device and image processing method - Google Patents

Display device and image processing method

Info

Publication number
WO2023273761A1
WO2023273761A1 (PCT/CN2022/096009)
Authority
WO
WIPO (PCT)
Prior art keywords
area
color
user
pixel
closed area
Prior art date
Application number
PCT/CN2022/096009
Other languages
English (en)
French (fr)
Inventor
董率
张振宝
李乃金
王之奎
肖媛
金玉卿
Original Assignee
海信视像科技股份有限公司 (Hisense Visual Technology Co., Ltd.)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Priority claimed from CN202110736313.3A (published as CN113485614A)
Priority claimed from CN202210107311.2A (published as CN115543116A)
Priority claimed from CN202210128208.6A (published as CN114501107A)
Priority claimed from CN202210191377.4A (published as CN115550718A)
Application filed by 海信视像科技股份有限公司
Priority to CN202280046851.2A (published as CN118120243A)
Publication of WO2023273761A1

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0481 Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F 3/04817 Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance, using icons
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 21/00 Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N 21/40 Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N 21/43 Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
    • H04N 21/431 Generation of visual interfaces for content selection or interaction; Content or additional data rendering

Definitions

  • the present application relates to the field of display technology, in particular to a display device and an image processing method.
  • Display devices refer to terminal devices capable of outputting specific display images, such as smart TVs, mobile terminals, smart advertising screens, projectors, etc.
  • a smart TV is based on Internet application technology: it has an open operating system and chip and an open application platform, supports two-way human-computer interaction, and integrates functions such as audio-visual, entertainment, education, and data.
  • such TV products are intended to meet users' diverse and personalized needs.
  • a sketchpad application can be installed in the display device so that the user can use the sketchpad function it provides.
  • the drawing board application software installed on the smart TV provides the user with various types of coloring pictures to color.
  • the application provides a display device, including:
  • the display is configured to display the electronic drawing board interface; the electronic drawing board interface includes a control area and a drawing area, the control area includes at least one brush control, the brush control is used to input content on the drawing area, and the drawing area is used to present the content input by the brush control and to display the target picture;
  • the touch component is used to receive an instruction input by the user through touch, wherein the touch component and the display form a touch screen;
  • the controller is configured to: in response to an instruction input by the user to display the target picture, display the target picture in the drawing area, the target picture including at least one closed area; in response to a sliding operation input by the user in the drawing area from a drawing start point to a drawing end point, determine the sliding path between the drawing start point and the drawing end point; fill the part of the sliding path located in the target closed area with a preset color, and leave the part of the sliding path outside the target closed area unfilled, where the target closed area is the closed area in which the drawing start point is located.
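The fill behaviour above can be sketched as a small model: a flood fill bounded by boundary pixels finds the closed area containing the drawing start point, and only the sliding-path pixels inside that area are colored. This is a minimal illustration, not the patented implementation; the function name, the row-list image representation, and 4-connectivity are all assumptions.

```python
from collections import deque

def fill_path_in_region(image, boundary, start, path, color):
    """Color only the points of `path` that fall inside the closed
    area containing `start`; path points outside that area stay as-is.

    `image` is a list of rows of pixel values, `boundary` a set of
    (x, y) coordinates of boundary pixels.
    """
    h, w = len(image), len(image[0])
    # 4-connected flood fill from the drawing start point, stopped by
    # boundary pixels, to collect the target closed area.
    region, queue = set(), deque([start])
    while queue:
        x, y = queue.popleft()
        if (x, y) in region or (x, y) in boundary:
            continue
        if not (0 <= x < w and 0 <= y < h):
            continue
        region.add((x, y))
        queue.extend([(x + 1, y), (x - 1, y), (x, y + 1), (x, y - 1)])
    # Fill only the sliding-path pixels that lie in the target area.
    for x, y in path:
        if (x, y) in region:
            image[y][x] = color
    return image
```

Under this sketch, a start point that lands on a boundary pixel yields an empty region, so the whole path is left unfilled.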
  • the present application also provides a display device, including: a display configured to display an electronic drawing board interface, the electronic drawing board interface including a target picture, a coloring control, and an erasing control, and the target picture including a coloring area and a boundary surrounding the coloring area;
  • the touch component is used to receive instructions input by the user through touch, wherein the touch component and the display form a touch screen; the controller is configured to: acquire the movement track formed by the user selecting pixels on the touch screen; determine the target erasure area corresponding to the movement track; for each pixel in the target erasure area, determine whether its position coordinates are boundary position coordinates; if they are not, delete the pixel; if they are, keep the pixel.
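The erase rule above — delete every pixel in the target erasure area unless it lies on the boundary, so the outline survives erasing — can be sketched as follows. The dictionary-of-pixels representation and the function name are illustrative assumptions, not the patent's actual data structures.

```python
def erase_track(pixels, boundary_coords, target_erase_area):
    """Apply the erase rule: every pixel in the target erase area is
    deleted unless its coordinates are boundary coordinates, in which
    case it is kept (the boundary outline survives erasing).

    `pixels` maps (x, y) -> color; `boundary_coords` is a set of
    boundary coordinates; `target_erase_area` is an iterable of
    coordinates covered by the erasing track.
    """
    for point in target_erase_area:
        if point in boundary_coords:
            continue             # boundary pixel: keep
        pixels.pop(point, None)  # colored pixel: delete
    return pixels
```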
  • the present application also provides a display device, including: a display configured to display an electronic drawing board interface, the electronic drawing board interface being used to display a target picture; a touch component configured to receive instructions input by the user through touch, wherein the touch component and the display form a touch screen; and a controller configured to: acquire the target picture and divide it into areas, forming at least one closed area and a border area surrounding the closed area; control the display to display the target picture after area division; and, in response to an input coloring operation, color the closed area so that its color is the color chosen by the user.
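The area division described above could be realized with connected-component labeling over non-boundary pixels; the sketch below is one plausible approach under that assumption, not the patent's actual algorithm, and the names are illustrative.

```python
from collections import deque

def divide_regions(mask):
    """Label the closed (non-boundary) areas of a picture.

    `mask[y][x]` is True for boundary pixels. Returns a grid in which
    boundary pixels are labelled 0 and each closed area gets a
    distinct positive id.
    """
    h, w = len(mask), len(mask[0])
    labels = [[0] * w for _ in range(h)]
    next_id = 1
    for sy in range(h):
        for sx in range(w):
            if mask[sy][sx] or labels[sy][sx]:
                continue  # boundary pixel or already labelled
            # Breadth-first flood fill of one closed area.
            labels[sy][sx] = next_id
            queue = deque([(sx, sy)])
            while queue:
                x, y = queue.popleft()
                for nx, ny in ((x + 1, y), (x - 1, y), (x, y + 1), (x, y - 1)):
                    if 0 <= nx < w and 0 <= ny < h \
                            and not mask[ny][nx] and not labels[ny][nx]:
                        labels[ny][nx] = next_id
                        queue.append((nx, ny))
            next_id += 1
    return labels
```

Coloring a closed area then amounts to recoloring every pixel that shares the label of the area the user tapped.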
  • the present application also provides a display device, including: a display configured to display an electronic drawing board, the electronic drawing board including a control area and a drawing area, the drawing area being used to display a target image, the target image including a color-filled area and an area boundary, and the control area including at least one brush control used for inputting content in the color-filled area;
  • the controller is configured to: in response to the user's trigger operation on the brush control, determine the fill color indicated by the brush control; detect the fill color value corresponding to the fill color; and, when the fill color value is the same as the color value of the pixels on the area boundary, adjust the fill color value so that it differs from the color value of the pixels on the area boundary.
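The color-collision handling above can be illustrated with a toy rule: if the fill color value equals the border color value, nudge one channel by the smallest step. The specific nudge (adjusting the red channel by one) is an invented example; the passage only requires that the adjusted fill value differ from the border's.

```python
def adjust_fill_color(fill_rgb, border_rgb):
    """Return the fill color unchanged unless it equals the border
    color, in which case nudge one channel by the smallest step so
    boundary pixels stay distinguishable from the fill."""
    if fill_rgb != border_rgb:
        return fill_rgb
    r, g, b = fill_rgb
    r = r - 1 if r > 0 else r + 1  # illustrative adjustment rule
    return (r, g, b)
```

A one-step change in a single channel is imperceptible to the user but keeps the boundary detectable by exact color comparison.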
  • Fig. 1 is a schematic diagram of an operation scene between a display device and a control device in some embodiments
  • Fig. 2 is a hardware configuration block diagram of the control device 100 in some embodiments
  • FIG. 3 is a block diagram of a hardware configuration of a display device 200 in some embodiments.
  • FIG. 4 is a software configuration diagram of the display device 200 in some embodiments.
  • FIG. 5 is a schematic diagram of a display device 200 in some embodiments.
  • Figure 6 is a schematic illustration of a user interface in a display in some embodiments.
  • Figure 7 is a schematic diagram of an application list in some embodiments.
  • Fig. 8 is a schematic diagram of an electronic drawing board interface in some embodiments.
  • Fig. 9 is a schematic diagram of a picture option bar in some embodiments.
  • Fig. 10 is a schematic diagram of a toolbar page in some embodiments.
  • Fig. 11 is a schematic diagram of a brush toolbar in some embodiments.
  • Fig. 12 is a schematic diagram of some embodiments in which the content input by the user exceeds the area boundary of the closed area of the target picture.
  • Fig. 13 is a schematic diagram of items in the color protection mode in some embodiments.
  • Fig. 14 is a schematic diagram of the confirmation information page of the color protection mode in some embodiments.
  • Fig. 15 is an information interaction diagram for color filling pixels in some embodiments.
  • Fig. 16 is a schematic diagram of region division of a target picture in some embodiments.
  • Fig. 17 is a schematic diagram of a touch track in some embodiments.
  • Fig. 18 is a flow chart of a coloring method.
  • Fig. 19a is a schematic diagram of an electronic drawing board interface in some embodiments.
  • Fig. 19b is a schematic diagram of inserting pictures in another electronic drawing board interface in some implementations.
  • Fig. 19c is a schematic diagram of another electronic drawing board interface in some embodiments.
  • Fig. 19d is a schematic diagram of another electronic drawing board interface in some embodiments.
  • Figure 20a is a schematic flow diagram of a display method in some embodiments.
  • Figure 20b is a schematic flow diagram of the display method in some embodiments.
  • FIG. 21 is a schematic diagram of a coloring picture displayed on the display device 200 in some embodiments.
  • Figure 22 is a schematic diagram of the target landing point in some embodiments.
  • Fig. 23 is a schematic diagram of the erasing range of the preset erasing width r in some embodiments.
  • Figure 24 is a schematic diagram of the coloring picture being colored in some embodiments.
  • Figure 25 is a schematic diagram of the target landing point of the erasing track in some embodiments.
  • Figure 26 is a schematic diagram of a painted picture after color is erased in some embodiments.
  • Fig. 27 is a schematic flowchart of a method for identifying boundary position information of a painted picture in some embodiments.
  • Fig. 28 is a schematic diagram of the interface after the user completes coloring when the coloring picture has not been divided into regions, in some embodiments.
  • Fig. 29 is a schematic flow chart of a method for eliminating gray levels of regions in some embodiments.
  • Fig. 30 is a schematic flow diagram of dividing a closed area and a border area in some embodiments.
  • Fig. 31 is a schematic diagram of the interface after the user completes coloring when the coloring picture has been divided into regions, in some embodiments.
  • Fig. 32 is a schematic diagram of the interface of the control area in some embodiments.
  • Fig. 33 is a schematic diagram of an interface including a color picker control in the control area in some embodiments.
  • Figure 34 is a schematic diagram of an electronic drawing board in some embodiments.
  • Figure 35 is a schematic diagram of a target image in some embodiments.
  • Fig. 36 is a schematic diagram of a toolbar corresponding to a brush control in some embodiments.
  • Figure 37 is a schematic diagram of color options in a toolbar in some embodiments.
  • FIG. 38 is a schematic diagram of the content input by the user destroying the area boundary in the related art.
  • Fig. 39 is a schematic diagram of displaying area boundary protection mode confirmation information on a display in some embodiments.
  • Figure 40 is a flow chart showing the interaction of various components of the device in some embodiments.
  • Fig. 41 is a schematic diagram showing that the content input by the user will not destroy the area boundary in some embodiments.
  • Fig. 42 is a schematic flowchart of an embodiment of a color setting method in some embodiments.
  • Fig. 1 is a schematic diagram of an operation scene between a display device and a control device according to an embodiment. As shown in Fig. 1, the user can operate the display device 200 through the smart device 300 or the control device 100.
  • control device 100 may be a remote control; communication between the remote control and the display device includes infrared protocol communication, Bluetooth protocol communication, and other short-distance communication methods, and the display device 200 is controlled wirelessly or by wire.
  • the smart device 300 (such as a mobile terminal, a tablet computer, a computer, a notebook computer, etc.) can also be used to control the display device 200 .
  • the display device 200 is controlled using an application program running on the smart device.
  • the display device 200 can also be controlled in a manner other than the control device 100 and the smart device 300.
  • the module for obtaining voice commands configured inside the display device 200 can directly receive the user's voice commands for control;
  • the user's voice command control can also be received through the voice control device provided outside the display device 200 .
  • the display device 200 also performs data communication with the server 400 .
  • the display device 200 may communicate via a local area network (LAN), a wireless local area network (WLAN), or other networks.
  • the server 400 may provide various contents and interactions to the display device 200 .
  • Fig. 2 is a configuration block diagram of the control device 100 according to some embodiments.
  • the control device 100 includes a controller 110 , a communication interface 130 , a user input/output interface 140 , a memory, and a power supply.
  • the control device 100 can receive the user's input operation instruction, and convert the operation instruction into an instruction that the display device 200 can recognize and respond to, and play an intermediary role between the user and the display device 200 .
  • the communication interface 130 is used for communicating with the outside, and includes at least one of a Wi-Fi chip, a Bluetooth module, an NFC module, or an alternative module.
  • the user input/output interface 140 includes at least one of a microphone, a touch pad, a sensor, a button or an alternative module.
  • FIG. 3 is a block diagram of a hardware configuration of a display device 200 according to some embodiments.
  • the display device includes a touch component, through which the display device implements a touch interaction function: the user can operate the host simply by touching the display lightly with a finger, eliminating keyboard, mouse, and remote control operation and making human-computer interaction more straightforward.
  • the user can input different control commands through touch operations. For example, the user may input touch commands such as click, slide, long press, and double click, and different touch commands may represent different control functions.
  • the touch component can generate different electrical signals when the user inputs different touch actions, and send the generated electrical signals to the controller 250 .
  • the controller 250 may perform feature extraction on the received electrical signal, so as to determine the control function intended by the user according to the extracted features. For example, when the user inputs a touch action on any program icon in the application program interface, the touch component senses the touch action and generates an electrical signal. After receiving the electrical signal, the controller 250 may first judge the duration of the level corresponding to the touch action in the electrical signal; when the duration is less than a preset time threshold, it recognizes the user input as a click touch command. The controller 250 then extracts the position feature of the electrical signal to determine the touch position.
  • the click touch instruction is used to execute the function of running the corresponding application program in the current scene, so the controller 250 can start and run the corresponding application program.
  • the touch component when the user inputs a sliding action on the media asset display page, the touch component also sends the sensed electrical signal to the controller 250 .
  • the controller 250 first judges the duration of the signal corresponding to the touch action in the electrical signal. When the duration is greater than the preset time threshold, it checks whether the position at which the signal is generated changes. For a sliding touch action the generation position necessarily changes, so the controller determines that the user has input a sliding touch command.
  • the controller 250 judges the sliding direction of the sliding touch command according to the change of the signal generation position, and controls the media asset display page to turn pages so as to display more media asset options. Further, the controller 250 may also extract features such as the sliding speed and sliding distance of the sliding touch command, and perform page-turning control according to the extracted features to achieve a hand-following effect.
  • the controller 250 can extract different features and determine the type of the touch command through feature judgment, and then execute corresponding control functions according to preset interaction rules.
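The click/slide discrimination described in the preceding passages (level duration compared to a time threshold, then presence of a position change) can be sketched as a small classifier. The thresholds, sample format, and names are illustrative assumptions, not values from the patent.

```python
def classify_touch(samples, time_threshold=0.3, move_threshold=10):
    """Classify a touch from (timestamp, x, y) samples: a short press
    is a click; a longer press whose position changes is a slide;
    a longer press that stays put is a long press."""
    t0, x0, y0 = samples[0]
    t1, x1, y1 = samples[-1]
    duration = t1 - t0
    moved = abs(x1 - x0) + abs(y1 - y0)  # Manhattan displacement
    if duration < time_threshold:
        return "click"
    if moved > move_threshold:
        return "slide"
    return "long_press"
```

For a slide, the per-sample displacements would additionally yield direction and speed for the hand-following page-turn effect mentioned above.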
  • the touch component also supports multi-touch, so that the user can input touch actions through multiple fingers on the touch screen, for example, multi-finger click, multi-finger long press, multi-finger slide, etc.
  • specific functions can also be realized by cooperating with specific application programs.
  • the display 260 can present a drawing area; the user can draw a specific touch motion track in the drawing area through a sliding touch command, and the controller 250 determines the touch action pattern and controls the display 260 to display it in real time, so as to achieve a demonstration effect.
  • the display device 200 further includes at least one of a tuner and demodulator 210, a communicator 220, a detector 230, an external device interface 240, a controller 250, a display 260, an audio output interface 270, a memory, a power supply, and a user interface 280.
  • the controller includes a processor, a video processor, an audio processor, a graphics processor, a RAM, a ROM, a first interface to an nth interface for input/output.
  • the display 260 includes a display screen component for presenting images and a drive component for driving image display; it receives image signals output from the controller and displays video content, image content, menu manipulation interfaces, and user-operable UI components.
  • the display 260 can be a liquid crystal display, an OLED display, and a projection display, and can also be a projection device and a projection screen.
  • the communicator 220 is a component for communicating with external devices or servers according to various communication protocol types.
  • the communicator may include at least one of a Wi-Fi module, a Bluetooth module, a wired Ethernet module, other network communication protocol chips or near field communication protocol chips, and an infrared receiver.
  • the display device 200 may establish transmission and reception of control signals and data signals with the external control device 100 or the server 400 through the communicator 220 .
  • the user input interface can be used to receive a control signal of the control device 100 (such as an infrared remote controller, etc.).
  • the external device interface 240 may include, but is not limited to, any one or more of the following: a high-definition multimedia interface (HDMI), an analog or data high-definition component input interface (Component), a composite video input interface (CVBS), a USB input interface (USB), and RGB ports.
  • the controller 250 controls the operation of the display device and responds to user operations through various software control programs stored in the memory.
  • the graphics processor is used to generate various graphic objects, such as icons, operation menus, and graphics displayed in response to user input instructions.
  • the graphics processor includes an arithmetic unit, which performs calculations on the various interactive instructions input by users and lays out the resulting objects according to their display attributes; it also includes a renderer, which renders the objects produced by the arithmetic unit for display on the display.
  • the user can input user commands through a graphical user interface (GUI) displayed on the display 260, and the user input interface receives user input commands through the graphical user interface (GUI).
  • the user may input a user command by inputting a specific sound or gesture, and the user input interface recognizes the sound or gesture through a sensor to receive the user input command.
  • the system is divided into four layers, from top to bottom: the application (Applications) layer (abbreviated as "application layer"), the application framework (Application Framework) layer (abbreviated as "framework layer"), the Android runtime and system library layer (referred to as the "system runtime layer"), and the kernel layer.
  • the display device can support rotation and/or lifting functions by adding a driving component and an attitude detection component.
  • the driving assembly includes a rotating assembly and/or a lifting assembly
  • the controller 250 can communicate with the rotating assembly and/or the lifting assembly, so that when the display needs to rotate, the rotating assembly is controlled to drive the display to rotate, and when the display needs to rise or fall, the lifting assembly is controlled to drive the display to rise or fall.
  • the rotating assembly and/or the lifting assembly is provided with a GPIO interface
  • the controller reads and changes the state of the GPIO interface of the rotating assembly and/or the lifting assembly.
  • the rotating component and/or the lifting component drives the display to rotate and/or lift according to the changed state of the GPIO interface.
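The GPIO handshake above (the controller changes the GPIO state; the drive component rotates or lifts the display according to the changed state) can be modeled as a toy class. The state values, step sizes, and names are all illustrative assumptions, not the device's actual protocol.

```python
class DriveComponent:
    """Toy model of the GPIO handshake: the controller writes a GPIO
    state, and the drive component reacts to the changed state."""

    IDLE, ROTATE, LIFT = 0, 1, 2  # illustrative GPIO states

    def __init__(self):
        self.gpio = self.IDLE
        self.angle = 0    # display rotation, degrees
        self.height = 0   # display elevation, millimetres

    def write_gpio(self, state):
        """Controller side: change the GPIO state."""
        self.gpio = state
        self._on_gpio_changed()

    def _on_gpio_changed(self):
        """Drive-component side: act on the changed state."""
        if self.gpio == self.ROTATE:
            self.angle = (self.angle + 90) % 360  # rotate one step
        elif self.gpio == self.LIFT:
            self.height += 10                     # raise one step
```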
  • the rotating assembly and/or the lifting assembly includes an MCU chip with an integrated Bluetooth module, so that the assembly supports Bluetooth functions such as Bluetooth Low Energy (BLE); the controller 250 may then communicate with the rotating assembly and/or the lifting assembly based on the Bluetooth protocol.
  • the detecting component includes a sensor for detecting the rotation state of the display and a sensor for detecting the lifting state of the display.
  • the controller monitors the rotation state or elevation state of the display in real time according to the data detected by the posture detection component. For example, in the process of controlling the rotation of the display, information such as rotation angle and angular speed is obtained by monitoring the data of the sensor. In the process of controlling the lifting of the display, information such as lifting distance and lifting speed can be obtained by monitoring the data of the sensor.
  • the detection component is included in the drive component.
  • the sensor for detecting the rotation state of the display is integrated in the rotating assembly and, together with it, constitutes the rotation component.
  • the sensor for detecting the lifting state of the display is integrated in the lifting assembly and, together with it, constitutes the lifting component.
  • FIG. 5 is a schematic diagram of a display device 200 according to some embodiments.
  • the display device includes a display 260 and a lift driving device 511 .
  • the lifting drive device 511 and the lifting guide rail 512 are fixed on the bracket.
  • the rotary driving device is arranged inside the lifting driving device, that is, between the lifting driving device and the display; it is not shown in Fig. 5.
  • the above-mentioned display device 200 may be a touch display device, and its display is a touch display composed of a touch component and a screen.
  • the touch display device supports the touch interaction function, which allows the user to operate the host by simply touching the display with a finger, which eliminates keyboard, mouse, and remote control operations, making human-computer interaction more straightforward.
  • the user can input different control commands through touch operations. For example, the user may input touch commands such as click, slide, long press, and double click, and different touch commands may represent different control functions.
  • the touch-sensitive component may generate different electrical signals when the user inputs different touch actions, and send the generated electrical signals to the controller 250 .
  • the controller 250 may perform feature extraction on the received electrical signal, so as to determine the control function to be executed by the user according to the extracted feature. For example, when the user inputs a touch action on any program icon position in the application program interface, the touch component will sense the touch action and generate an electrical signal. After receiving the electrical signal, the controller 250 may first judge the duration of the level corresponding to the touch action in the electrical signal, and when the duration is less than the preset time threshold, it recognizes that the user input is a click touch command.
  • the controller 250 then extracts the position feature generated by the electrical signal, so as to determine the touch position.
  • the touch position is within the display range of the application icon, it is determined that the user has input a click touch instruction at the position of the application icon.
  • the click touch instruction is used to execute the function of running the corresponding application program in the current scene, so the controller 250 can start and run the corresponding application program.
  • controlling the rotation of a picture shown on the display by rotating the fingers touching the display is a basic function of a touch screen display device.
  • the current interaction method is that after multiple fingers rotate on the screen, the picture immediately rotates to a horizontal or vertical angle according to the direction of finger rotation. There is no interaction process, and the user experience is poor.
  • the display device 200 includes a display for displaying a coloring picture (ie, a target picture).
  • control device 100 may be a stylus, and the user may use the stylus, such as a capacitive pen, to input user instructions by clicking on the touch screen to complete color filling or erasing of the picture on the display device 200 .
  • the control device 100 may also be a mouse. When the mouse is pressed and moved, the color filling or erasing of the picture on the display device 200 is completed.
  • control device 100 includes a control unit, which is used to receive the movement trajectory formed over time by the position of the user's hand sliding on the display, and to generate corresponding control instructions according to the movement trajectory to complete the color filling or erasing of the picture on the display device 200.
  • the smart device 300 (such as a mobile terminal, a tablet computer, a computer, a notebook computer, etc.) can also be used to control the display device 200 .
  • the application program running on the smart device is used to control the display device 200 to complete the color filling of the picture on the display device 200 .
  • the controller 250 controls the work of the display device 200 and responds to user operations related to the display 260 by running various software control programs (such as an operating system and/or various application programs) stored in the memory.
  • the controller presents a user interface on the display, and the user interface includes several UI objects; in response to a user command received on a UI object of the user interface, the controller 250 may perform operations related to the object selected by the user command.
  • Figure 6 illustrates a user interface in a display in accordance with some embodiments.
  • the user can click "My Apps" in the user interface to trigger the display of the application list.
  • the Apps list includes all apps installed on the display device.
  • the display device provided by this application has a drawing board function.
  • the drawing board function may be implemented based on applications related to the drawing board function installed on the display device.
  • the application related to the sketchpad function installed on the display device is referred to as a "sketchpad application".
  • Fig. 7 shows a schematic diagram of an application list according to some embodiments.
  • the display device is installed with image processing applications, player applications, video chat applications, camera applications, online shopping applications and game applications.
  • the application list is displayed on the display, the user can select one of the applications and open it, so as to realize the functions of the application.
  • the above-mentioned installed applications may be system applications or third-party applications.
  • the electronic sketchpad interface displays user interface objects, information and/or inputtable content areas corresponding to one or more functions of the sketchpad application.
  • the aforementioned user interface objects refer to the objects that constitute the electronic sketchpad interface, which may include but are not limited to text, images, icons, soft keys (or "virtual buttons"), drop-down menus, radio buttons, check boxes, selectable lists, etc.
  • the displayed user interface objects may include non-interactive objects for conveying information or constituting the appearance of the user interface, interactive objects available for user interaction, or a combination of non-interactive objects and interactive objects.
  • the user can interact with the user interface object by touching the touch screen at the position of the touch screen corresponding to the interactive object that the user wishes to interact with.
  • the display device detects the contact and responds to the detected contact by performing an operation corresponding to the interaction of the interactive object to enable drawing in the sketchpad application.
  • some or all of the steps involved in the embodiments of the present application are implemented in the operating system and in the application program.
  • the application program used to implement some or all of the steps in the embodiments of the present application is the above-mentioned "sketchpad application", which is stored in the memory; by running the application program in the operating system, the controller 250 controls the operation of the display device 200 and responds to user operations related to the application.
  • the display device involved in the embodiment of the present application includes but is not limited to the display device 200 introduced in the above embodiment, and can also be other terminal devices with image display function, data processing and information sending and receiving functions, such as mobile phone, tablet computer and other portable mobile terminals.
  • the following will take a display device as an example to describe the embodiment of the present application in detail.
  • the present application provides a display device and a coloring method, so as to solve the problem that the coloring traces of the user tend to exceed the boundary of the area to be painted during the coloring process, and improve user experience.
  • FIG. 8 is an exemplary electronic drawing board interface.
  • the electronic drawing board interface includes a drawing area (drawing area) and a control area (control area).
  • the drawing area is an area where content can be input
  • the control area displays item entries corresponding to one or more functions of the sketchpad application, such as an "insert" item entry and a "tool" item entry.
  • the drawing area and the control area can be located in different positions of the same layer, and the drawing area and the control area can also be located in different layers.
  • taking the case where the drawing area and the control area are located in different layers as an example, the drawing area can be part or all of a first layer located on the electronic drawing board interface, and the control area can be part or all of a second layer; the second layer can be superimposed above the first layer or set side by side with the first layer, so as to display the content corresponding to the drawing area and the control area in the electronic drawing board interface.
  • the display device when the user clicks on the "insert" item, in response to an instruction input by the user indicating to display the target picture, the display device is triggered to display the picture option bar as shown in FIG. 9 .
  • the picture option column includes multiple picture options that can be inserted, such as "picture A", "picture B" and "picture C".
  • the selected picture can be inserted into the sketchpad application and displayed in the drawing area.
  • the third layer is generated in response to the instruction input by the user to display the target picture.
  • the third layer is superimposed on the second layer, and the third layer is used to draw the picture option bar.
  • after the user selects a picture to insert, the controller controls the third layer to close and draws the target picture on the first layer, so that the display shows the target picture, wherein the picture the user selected to insert is the target picture.
  • the display device when the user clicks on the "Tools” item, the display device is triggered to display the toolbar page as shown in FIG. 10 .
  • the toolbar page includes tools such as "Brush” and "Eraser”.
  • when the user clicks the "paintbrush" tool, the display device is triggered to display the paintbrush toolbar as shown in FIG. 11.
  • the user can control the brush tool to input content in the drawing area, and the input content is the user's touch trace on the drawing area.
  • the target picture may include at least one closed area, and each closed area is surrounded by a closed area boundary, and the user may control a brush tool to color the target picture.
  • FIG. 12 shows a schematic diagram of a related application in which the content input by the user exceeds the area boundary of the closed area of the target picture.
  • L1 is the line input by the user. The line starts from point A and passes through points B, C, D and E in sequence. Points A and E are located in the closed area, points B and D are located on the boundary of the closed area, and the part of the line from point B through point C to point D is the part beyond the area boundary, which affects the drawing effect.
  • the display device can limit the area boundary of the closed area in the target picture, preventing the content input by the user from exceeding the area boundary of the closed area and affecting the user's subsequent drawing effect .
  • the display device may be provided with a "paint protection mode".
  • the area boundary of the closed area can be restricted, so that the content input by the user will not exceed the area boundary of the closed area.
  • no restriction is made on the area boundary of the closed area.
  • the user can set the "paint protection mode" by interacting with interactive objects in the user interface.
  • the foregoing interactive object may be a "painting protection mode” item displayed on the interface of the electronic drawing board.
  • the interface of the electronic drawing board is provided with a "painting protection mode” item.
  • the controller controls the display device to enter the "painting protection mode”.
  • the controller can control the display device to exit the "painting protection mode".
  • the user may send an instruction to the display device to enter the "painting protection mode" by operating a designated key on the remote control.
  • the correspondence between the "painting protection mode" command and the keys of the remote control is bound in advance. For example, set a "painting protection mode” button on the remote control. When the user touches the button, the remote control sends a "painting protection mode” command to the controller. At this time, the controller controls the display device to enter the "painting protection mode”. model". When the user touches the button again, the controller can control the display device to exit the "painting protection mode".
  • the corresponding relationship between the "painting protection mode” command and multiple remote control keys can also be bound in advance.
  • the remote control sends out the "Painting Protection Mode” command.
  • the keys bound to the "painting protection mode” command are arrow keys (left, down, left, down), that is, when the user continuously touches the keys (left, down, left, In the case of below), the remote control sends the "Painting protection mode” command to the controller.
  • the embodiment of the present application only provides several example binding relationships between the "painting protection mode" command and the keys; the binding relationship can be set according to the user's habits, and is not unduly limited here.
  • the user can use the sound collector of the display device, such as a microphone, to send a "coloring protection mode" instruction to the display device through voice input, so as to control the display device to enter the "coloring protection mode”.
  • an intelligent voice system may be installed in the display device, which can recognize the user's voice to extract the instruction content input by the user. The user can speak a preset wake-up word into the microphone to activate the intelligent voice system, so that the controller responds to a "painting protection mode" command input within a certain period of time and makes the display device enter the "painting protection mode". For example, the user can say "so-and-so classmate" to activate the intelligent voice system, and then say "enter coloring protection mode" to send the "painting protection mode" command to the display device.
  • the user can also send a "painting protection mode" instruction to the display device through a preset gesture.
  • the display device can detect the behavior of the user through an image collector, such as a camera.
  • when the preset gesture is detected, it can be considered that the user has sent a "painting protection mode" instruction to the display device.
  • it may be set as follows: when it is detected that the user draws a "V" gesture, it is determined that the user has input the "painting protection mode" instruction to the display device.
  • the user can also send a "painting protection mode” instruction to the display device through a preset action. For example, it may be set as follows: when it is detected that the user lifts the left foot and the right hand simultaneously, it is determined that the user has input the "painting protection mode" command to the display device.
  • a "painting protection mode” instruction may also be sent to the display device.
  • a control can be set in the mobile phone, through which the user can choose whether to enter the "painting protection mode".
  • the user can also issue a continuous click instruction to the mobile phone.
  • the continuous click instruction refers to: within a preset period, the number of times the user clicks on the same area of the touch screen of the mobile phone exceeds a preset threshold. For example: when the user taps a certain area of the touch screen of the mobile phone 3 times within 1 second, it is regarded as a continuous tap instruction.
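The continuous-click rule above can be sketched as a small detector; the class name, default thresholds, and the coordinate-distance test for "the same area" are illustrative assumptions, not part of the original design:

```python
import time

class TapDetector:
    """Detects a continuous click: N taps on roughly the same screen
    area within a sliding time window (e.g. 3 taps within 1 second)."""

    def __init__(self, taps_required=3, window_s=1.0, area_radius=50):
        self.taps_required = taps_required
        self.window_s = window_s
        self.area_radius = area_radius
        self.taps = []  # list of (timestamp, x, y)

    def on_tap(self, x, y, now=None):
        """Record a tap; return True when the instruction is triggered."""
        now = time.monotonic() if now is None else now
        # Keep only earlier taps inside the time window and near this tap.
        self.taps = [
            (t, tx, ty) for (t, tx, ty) in self.taps
            if now - t <= self.window_s
            and abs(tx - x) <= self.area_radius
            and abs(ty - y) <= self.area_radius
        ]
        self.taps.append((now, x, y))
        if len(self.taps) >= self.taps_required:
            self.taps.clear()
            return True  # send the "coloring protection mode" command
        return False
```

A tap that falls outside the window or the area resets the count, so only deliberate rapid taps on one spot trigger the command.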
  • the mobile phone can send a "coloring protection mode" command to the display device, so that the controller controls the display device to enter the "coloring protection mode".
  • the mobile phone when the user uses the mobile phone to control the display device, it can also be set to: when it is detected that the user's touch pressure value on a certain area of the mobile phone touch screen exceeds the preset pressure threshold, the mobile phone can send a message to the display device. "Paint protection mode" command.
  • in order to prevent the user from accidentally triggering the "painting protection mode", when the controller receives the "painting protection mode" instruction, it can control the display to show a "painting protection mode" confirmation interface, so that the user can confirm on this interface whether to control the display device to enter the "painting protection mode".
  • the "painting protection mode" confirmation interface includes a "yes" option entry and a "no” option entry, when the user selects "yes"
  • the controller controls the display device to enter the "painting protection mode", so that when the user performs coloring operations on the target picture, each operation can only be colored in a part of the target picture, and will not Apply color to other areas.
  • when the user selects "no", the display device will not enter the "coloring protection mode", and the user can color anywhere in the target picture each time.
  • painting protection mode is only an exemplary function name/mode name defined for convenience of description, which represents a certain function of the display device, and does not limit the protection scope of the present application.
  • the target picture is stored in the memory in a standard image file format; for example, the target picture is stored in the memory in Bitmap format. The target picture includes several pixels; a pixel is the basic unit of image display and can be seen as a grid point that cannot be further divided into smaller elements or units.
  • several pixel points of a specific color may enclose at least one closed area, and the above-mentioned several pixel points of a specific color are located at an area boundary of the closed area.
  • the color of the pixel at the boundary of the region is different from the color of the pixel at other locations, for example, the color of the pixel at the boundary of the region is black, and the color of the pixel at other locations is white.
  • the controller controls scanning of each pixel in the target picture stored in the memory, obtaining the position information and color information of each pixel; according to the color information of each pixel, the target picture is divided into several regions, each region being a closed region surrounded by a region boundary formed by pixels of a specific color. Then, according to the position information of each pixel and the closed region it belongs to, all pixels in the target picture are divided into as many categories as there are closed regions, and the pixels of each category are stored in the memory. For example, according to the color information of each pixel, the target picture may be divided into three closed regions: a first closed region, a second closed region and a third closed region.
  • the location information of the pixels located in the first closed area, the second closed area and the third closed area stored in the memory are obtained respectively, and corresponding first pixel point set, second pixel point set and third pixel point set are generated.
  • the first set of pixels includes the position information of all pixels in the first closed area
  • the second set of pixels includes the position information of all pixels in the second closed area
  • the third set of pixels includes the position information of all pixels in the third closed area. It should be noted that a closed area does not only refer to the image area surrounded by the pixels at the boundary positions; for example, the closed areas of the target picture also include the background area surrounded by the pixels at the boundary positions and the edge of the target picture.
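The scanning-and-classification step above can be sketched as a connected-component pass; this is a minimal illustration assuming boundary pixels carry a distinct color value (here `1`), regions are 4-connected, and all names (`label_closed_regions`, `region_of`) are invented for the example:

```python
from collections import deque

def label_closed_regions(image, boundary_color=1):
    """Group every non-boundary pixel of a 2D color grid into a closed
    region via BFS flood fill, returning one pixel set per region."""
    h, w = len(image), len(image[0])
    region_of = {}          # (x, y) -> region index
    regions = []            # list of sets of (x, y)
    for sy in range(h):
        for sx in range(w):
            if image[sy][sx] == boundary_color or (sx, sy) in region_of:
                continue
            pixels, queue = set(), deque([(sx, sy)])
            region_of[(sx, sy)] = len(regions)
            while queue:
                x, y = queue.popleft()
                pixels.add((x, y))
                # visit the four neighbours that are not boundary pixels
                for nx, ny in ((x + 1, y), (x - 1, y), (x, y + 1), (x, y - 1)):
                    if (0 <= nx < w and 0 <= ny < h
                            and image[ny][nx] != boundary_color
                            and (nx, ny) not in region_of):
                        region_of[(nx, ny)] = len(regions)
                        queue.append((nx, ny))
            regions.append(pixels)
    return regions, region_of
```

On a picture with one ring-shaped boundary, this yields two regions: the background (which, as noted above, counts as a closed area) and the enclosed cell.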
  • the controller when the controller scans the pixels of the target picture stored in the memory, the division of the closed area may be missed due to the color information of the pixels in some areas not being scanned.
  • the user can control the brush tool to input content in the drawing area. More specifically, the user controls the input content of the brush tool in the drawing area through the operation of "pen down"-"move"-"lift up” to be a touch track.
  • the touch track includes the drawing start point and the drawing end point.
  • the drawing start point is the first pixel passed when the user inputs the touch track, and the drawing end point is the last pixel passed when the touch track is input; when the user controls the brush tool through the operations of "pen down"-"pen up" without moving, the content input in the drawing area is a single touch point, which is both the drawing start point and the drawing end point.
  • the "drop pen” operation means that the user clicks the brush tool to make the brush tool in the selected state, and controls the brush tool to input content in the drawing area;
  • the "move” operation means to make the brush tool in a continuous motion state while it is in the selected state ;
  • the operation of "lifting the pen” means that the brush tool is not selected or the brush tool is selected but the user stops controlling the brush tool.
  • in response to a sliding operation from the drawing start point to the drawing end point input by the user, the sliding path between the drawing start point and the drawing end point is determined, and the target closed area is obtained, where the target closed area is the closed area in the target picture in which the pixel at the drawing start point is located; according to the position information of that pixel, it can be compared with each closed area in turn to find the target closed area.
  • for example, the target picture includes the first closed area, the second closed area and the third closed area. The pixel position information in the first closed area is obtained; if it includes the position information of the pixel where the drawing start point is located, the first closed area is determined to be the target closed area. If not, the pixel position information in the second closed area is obtained; if it includes the position information of the pixel where the drawing start point is located, the second closed area is determined to be the target closed area. Otherwise, the third closed area is determined to be the target closed area.
  • according to the position information of each pixel passed by the sliding path, it is judged whether the pixel is a pixel in the target closed area; if it is, the pixels passed by the sliding path in the target closed area are color filled according to the fill color value corresponding to the fill color indicated by the brush control.
  • if the fill color is not selected, the default fill color is used to fill the pixels.
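The membership-check fill described above can be sketched as follows; the canvas representation, `DEFAULT_FILL` value and function name are illustrative assumptions, with regions given as the per-area pixel sets produced by the scanning step:

```python
# Stand-in for the default fill color mentioned above (assumed value).
DEFAULT_FILL = (255, 0, 0)

def color_path(canvas, regions, path, start, fill_color=None):
    """canvas: dict (x, y) -> color; regions: list of pixel sets;
    path: pixels passed by the sliding path; start: drawing start point.
    Only path pixels inside the region containing `start` are filled."""
    fill = fill_color if fill_color is not None else DEFAULT_FILL
    # The target closed area is the region containing the start point.
    target = next((r for r in regions if start in r), None)
    if target is None:
        return canvas  # start point lies on a boundary pixel
    for p in path:
        if p in target:        # inside the target closed area: fill
            canvas[p] = fill
        # pixels outside the target area are left unfilled
    return canvas
```

A stroke that wanders out of the start region thus colors only its in-region portion, matching the solid/dotted split described for FIG. 17 below.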
  • the area information of the pen-down point, which includes the position information of all pixels in the closed area where the "pen down" point is located, is obtained; the operation event input after the user's "pen down" operation is received and its type determined. If the operation event type is a "pen up" event, the coloring ends; if the operation event type is a "move" event, the position information of the pixels passed by the corresponding sliding path is obtained, the pixels passed by the sliding path within the closed area where the "pen down" point is located are determined from that position information, and those pixels are color filled.
  • a target picture is exemplarily given, and the target picture includes an image area and a background area, wherein the image area is the area enclosed by the outline of the image of "little pony" in the target picture.
  • the controller scans the target image to obtain 18 closed areas including the background area.
  • the background area is the first closed area
  • the "Little Pony” The body part is the second closed area and so on.
  • eight points, represented by the letters A, B, C, D, E, F, G and H, are set on the target picture, wherein points A, B, C, G and H are located in the second closed area, and points C and G are located on the boundary of the area.
  • the user controls the brush tool, performs a "pen down" operation at point A, and moves the brush tool through points B, C, D, E, F, G and H in sequence to generate a touch track.
  • a "pen operation" at point A
  • moves the brush tool through points B, C, D, E, F, G, and H in sequence to generate a touch track
  • as shown in FIG. 17, only the solid line part passing through A-B-C and G-H will be displayed, and the dotted line part passing through C-D-E-F-G will not be displayed; that is, the pixels of the touch track on the A-B-C and G-H lines can be filled with color, and the pixels on the C-D-E-F-G line will not be filled with color.
  • the shortest distance between any two adjacent pixels passed by the sliding path is determined as the first distance, and the fill radius used when a pixel is filled with color is determined as the second distance; if the first distance is greater than the second distance, a preset number of interpolation pixels is generated between the two adjacent pixels, wherein the preset number may be the ratio of the first distance to the second distance.
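The interpolation rule above can be sketched as follows, a minimal illustration assuming evenly spaced points along the straight segment between two sampled touch points; the function name is invented for the example:

```python
import math

def interpolate_segment(p1, p2, fill_radius):
    """Return interpolation points between two adjacent sampled touch
    points when their distance (the first distance) exceeds the fill
    radius (the second distance); the count is the distance ratio."""
    (x1, y1), (x2, y2) = p1, p2
    dist = math.hypot(x2 - x1, y2 - y1)   # the first distance
    if dist <= fill_radius:               # no gap to bridge
        return []
    n = int(dist / fill_radius)           # preset number = ratio
    # Place n points evenly between p1 and p2 (excluding the endpoints).
    return [
        (x1 + (x2 - x1) * i / (n + 1), y1 + (y2 - y1) * i / (n + 1))
        for i in range(1, n + 1)
    ]
```

For a fast stroke sampled at (0, 0) and (10, 0) with a fill radius of 2, five intermediate points are generated, keeping the rendered stroke continuous.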
  • the position information of each pixel of the target picture can be expressed as coordinates by establishing a coordinate system. For example, the coordinate system can be established with the center of the pixel in the lower left corner of the target picture as the origin, with the distance between the centers of two adjacent pixels as one unit length; a pixel is then expressed in the form (Xi, Yi), where Xi is its distance in unit lengths from the Y axis and Yi is its distance in unit lengths from the X axis. In this way, the coordinate information of the pixels at the region boundary is obtained.
  • if the operation event type is determined to be a "pen up" event, the coloring ends; if the operation event type is determined to be a "move" event, the coordinate information of the pixels passed by the corresponding sliding track is obtained. According to the coordinate range of the closed area where the pen-down point is located and the coordinates of each pixel passed by the sliding track, it is judged whether each track pixel is in the pen-down point area; the interpolation pixels belonging to the pen-down point area are filled with color, and the interpolation pixels not belonging to the pen-down point area are eliminated.
  • the present application also provides a coloring method, including:
  • S101 Display a target picture in a drawing area in response to an instruction input by a user indicating to display a target picture, where the target picture includes at least one closed area.
  • S102 Determine a sliding path between the drawing start point and the drawing end point in response to a sliding operation from the drawing start point to the drawing end point input by the user in the drawing area.
  • S103 controlling the sliding path located in the target closed area to be filled with a preset color, and controlling the sliding path outside the target closed area to not be filled with color, wherein the target closed area is the closed area where the drawing start point is located.
  • the control area further includes a trigger button.
  • in response to the operation of the trigger button, the display device is controlled to enter the coloring protection mode, wherein the coloring protection mode means that only the sliding path located in the target closed area is filled with the preset color.
  • according to the pixel position information in the first closed area, it is determined whether the pixel where the drawing start point is located is a pixel in the first closed area, wherein the first closed area is any closed area in the target picture; if the pixel where the drawing start point is located is a pixel in the first closed area, the first closed area is determined as the target closed area.
  • before judging whether the pixel where the drawing start point is located is a pixel in the first closed area, the pixels in the target picture are traversed to obtain the position information of each pixel; the position information of each pixel is classified and stored according to the closed area where the pixel is located.
  • according to the position information of each pixel passed by the sliding path, it is judged whether the pixel is a pixel in the target closed area; if it is, the pixels passed by the sliding path are color filled.
  • the filling color indicated by the brush control is determined; according to the filling color value corresponding to the filling color, color filling is performed on the pixels passed by the sliding path.
  • the shortest distance between any two adjacent pixel points passed by the sliding path is determined as the first distance, and the filling radius when the first pixel is filled with color is determined as the second distance; if If the first distance is greater than the second distance, a preset number of interpolation pixel points will be generated between the two adjacent pixel points.
  • it is judged whether each interpolation pixel is located in the target closed area; the interpolation pixels located in the target closed area are controlled to be filled with color, and the interpolation pixels located outside the target closed area are controlled to be eliminated.
  • the fill color value of a first pixel point closest to the interpolation pixel point is obtained; and the interpolation pixel point is filled with color according to the fill color value.
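The nearest-pixel rule above can be sketched as follows; the canvas representation and function name are illustrative assumptions, with already-filled path pixels given as a coordinate-to-color mapping:

```python
import math

def color_for_interpolated(point, filled):
    """Return the fill color of the already-filled path pixel closest
    to the given interpolation pixel.
    filled: dict (x, y) -> fill color value (must be non-empty)."""
    nearest = min(
        filled,
        key=lambda p: math.hypot(p[0] - point[0], p[1] - point[1]))
    return filled[nearest]
```

An interpolation pixel inserted mid-segment thus inherits the color of whichever sampled stroke pixel it lies nearest to.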
  • the present application provides a display device and a coloring method: when an input instruction indicating to display the target picture is received, the target picture is displayed on the display device, the target picture including at least one closed area; in response to an input sliding operation from the drawing start point to the drawing end point, the sliding path between the drawing start point and the drawing end point is determined; the sliding path located in the target closed area is controlled to be filled with the preset color, and the sliding path located outside the target closed area is controlled not to be filled with color, so as to solve the problem that the user's coloring traces tend to exceed the boundary of the coloring area during the coloring process, and to improve user experience.
  • if the boundary is also erased at the same time, the boundary information of the coloring picture is destroyed, which will cause the coloring to go outside the area, affecting the coloring function and the user experience.
  • the embodiment of the present application also provides another display device and display method.
  • this method solves the problem that the boundary color is erased at the same time as the completed coloring is erased, and can avoid destroying the boundary information of the coloring picture, so as to prevent the coloring from going outside the coloring area, thereby improving the user experience of operating the display device.
  • Figure 19a is a schematic diagram of an electronic drawing board in some embodiments.
  • the electronic drawing board includes a menu area, a drawing area and a control area.
  • the menu area is used to provide a user navigation menu, specifically including a "Start” control, an "Insert” control, a “Design” control, a “View” control, a “Process” control and a “Tool” control.
  • the user can realize different functions related to image processing by triggering different interactive controls.
  • the drawing area is an area where content can be entered.
  • the control area can centrally display controls corresponding to one or more functions of the image processing application, such as brush controls, erase controls, color controls, etc.
  • the user can use each control to perform corresponding operations, and can also set the parameters of each control, such as the color and line style of the brush control.
  • when the brush control is triggered, the toolbar corresponding to the brush control is displayed; in the toolbar, the color and thickness of the brush can be selected.
  • the toolbar also displays a swatch control and a color picker control.
  • the brush color selected by the user on the swatch, or picked from the drawing area will be configured as the input color for the brush control.
  • the user can input content based on the contact with the drawing area, and the input content is the user's contact track on the drawing area.
  • Fig. 19b is a schematic diagram of inserting a picture in another electronic drawing board interface according to some embodiments.
  • the user may activate the "Insert” control on the display and enter a user command indicating to display a drop-down menu corresponding to the "Insert” control.
  • the controller may present a user interface as shown in FIG. 19b on the display, in which a drop-down menu corresponding to the "insert" control is displayed; the drop-down menu contains multiple items, specifically including "insert picture", "insert text box" and "insert object". The user can insert a picture into the application by operating "insert picture".
  • the controller may respond to the user instruction and present the corresponding user interface.
  • the controller may display the picture in the drawing area in response to the user instruction. For example, when the user chooses to insert the first picture in Figure 19b, the picture is displayed in the drawing area.
  • the format of the picture displayed in the drawing area may be in RGB format, and may also be in one of JPG, PNG or SVG formats.
  • the user after inserting the picture, the user can color the picture.
  • Fig. 19c is a schematic diagram of another electronic drawing board interface in some embodiments.
  • the coloring picture that is, the target picture
  • the patterns include clouds, dinosaurs, grass, etc.
  • the patterns include at least one line and at least one closed area surrounded by continuous lines; at least one line is used as the boundary of the closed area and is displayed in a color different from that of the closed area, so that the user can clearly identify the boundary and the closed area.
  • Users can paint in enclosed areas.
  • the closed areas can be set adjacently or not. When the user paints the picture, it is to paint the closed area in the picture.
• the color of the closed area is a pure base color, such as white; the painted border is also the border line of the closed area, and the border line is set around the closed area.
  • the color of the painted border is a pure border color, for example: black.
• the part of the closed area that has been painted is called the painted area. The configuration information of the pixels in the painted area and the configuration information of the boundary are kept in the same file, so that the painted area and the boundary are shown in the same picture; the configuration information includes color and position.
• if the area selected to be erased includes the border or part of the border, that border will also be erased.
• the boundary is the limit of coloring in the coloring picture. After the boundary is deleted, the boundary information of the coloring area it encloses is destroyed.
• if the aforementioned coloring area is then selected for coloring again, the color will spill outside the coloring area, as shown in Figure 19d.
  • the display device includes a display, and the display is used to display a coloring interface; the controller of the display device can be configured to perform the following display method.
• Fig. 20a shows a schematic flowchart of the display method.
• as shown in Fig. 20a, the display method includes: S191, acquiring the moving track formed by the user selecting pixels on the touch screen; S192, confirming the target erasing area corresponding to the moving track; S193, judging whether the position coordinates of each pixel in the target erasing area are the position coordinates of the boundary; S194, if the position coordinates of the pixel are not the position coordinates of the boundary, deleting the pixel; S195, if the position coordinates of the pixel are the position coordinates of the boundary, retaining the pixel.
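Steps S191 to S195 can be sketched as a short routine. The data structures here (a dict of pixels and a set of boundary coordinates) are illustrative assumptions, not the patent's implementation.

```python
# Sketch of steps S193-S195: delete every pixel in the target erasing
# area unless its coordinates are boundary coordinates.
# The pixel dict and boundary set are illustrative assumptions.

def erase_area(pixels, boundary, target_erase_area):
    for point in target_erase_area:
        if point in boundary:
            continue              # S195: boundary pixels are retained
        pixels.pop(point, None)   # S194: other pixels are deleted
    return pixels
```

For example, erasing the points (0, 0) and (1, 0) from a picture whose boundary contains (1, 0) removes only the pixel at (0, 0).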
• acquiring the movement trajectory formed by the user selecting pixels on the touch screen includes: in response to an instruction input by the user on the touch screen for erasing the color of the painted area (for example, the user selects the erase control on the electronic drawing board), all pixels in the coloring area enter a to-be-selected state; the user then moves across the touch screen selecting pixels, and the selected pixels are set as the movement track.
  • the user may also select the erase control on the electronic drawing board through a specific action.
  • the present application does not limit the selection method of the erasing control.
  • Fig. 20b shows a schematic flowchart of a display method according to some embodiments. As shown in Figure 20b, in a specific implementation manner, the display method includes:
• the coloring picture is preset with patterns that can be colored (the patterns include clouds, dinosaurs, grass, etc.), and the patterns include at least one line and at least one closed area surrounded by lines.
• it includes the line 41 and the closed area 61 surrounded by the line 41, the closed area 62 surrounded by the line 42 and the line 41, and the closed area 63 surrounded by the line 43.
  • the pattern of the closed area 61 and the closed area 62 enclosed by the continuous lines 41 and 42 is a cloud. Users can paint in enclosed areas.
  • the other lines forming the pattern and the painted area surrounded by the lines are the same as the above-mentioned composition, and will not be repeated.
  • the user can perform coloring operation on the coloring picture on the electronic drawing board to realize the color filling or erasing of the coloring area 61 , the coloring area 62 , and the coloring area 63 .
  • color fill or erase can only be done in closed areas.
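Because filling is confined to closed areas, the behavior can be illustrated with a standard flood fill that stops at boundary-colored pixels. The grid representation and color names below are assumptions for this sketch.

```python
from collections import deque

# Flood fill confined to a closed area: the fill spreads only through
# pixels that still have the area's original color, so it stops at the
# black boundary lines. Grid layout and colors are illustrative.

def flood_fill(grid, start, fill_color, boundary_color="black"):
    rows, cols = len(grid), len(grid[0])
    x0, y0 = start
    original = grid[y0][x0]
    if original in (boundary_color, fill_color):
        return grid
    queue = deque([start])
    while queue:
        x, y = queue.popleft()
        if not (0 <= x < cols and 0 <= y < rows):
            continue
        if grid[y][x] != original:
            continue   # boundary pixel or already filled: stop here
        grid[y][x] = fill_color
        queue.extend([(x + 1, y), (x - 1, y), (x, y + 1), (x, y - 1)])
    return grid
```

Filling the center of a 3x3 grid ringed by black boundary pixels colors only the center; the boundary ring is untouched.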
• the target landing point refers to the movement trajectory, presented by the change of position over time, when a human hand, the control device and/or the smart device slides on the coloring picture on the display device 200.
• the control device and/or the smart device may also, by sliding on the coloring picture on the display device 200, trigger a pre-associated corresponding control instruction, which is not limited in this application.
  • the present application provides a method for obtaining pixel position information of a finger's target landing point on a coloring picture.
• the method for obtaining the pixel position information of the target landing point of the finger on the coloring picture includes: reading the touch screen event type, where the types of touch screen events include a finger-down event, a finger-move event, and a finger-up event; on a finger-down event, obtaining the pixel position where the finger lands; on a finger-move event, obtaining the pixel position to which the finger moves; on a finger-up event, ending the operation.
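The three event types can be mirrored in a small event loop. The event names and tuple format below are hypothetical, not taken from any concrete touch API.

```python
# Collect the finger's trajectory from a stream of touch events:
# record the pixel position on finger-down and finger-move events,
# and end the operation on a finger-up event. Event names are assumed.

def collect_trajectory(events):
    trajectory = []
    for event_type, position in events:
        if event_type in ("finger_down", "finger_move"):
            trajectory.append(position)
        elif event_type == "finger_up":
            break   # finger lifted: end the operation
    return trajectory
```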
  • FIG. 22 shows a schematic diagram of a target landing point according to some embodiments.
  • a trajectory "line segment AB” is formed.
• the pixel points of the target drop point in S1 refer to the pixels corresponding to all points on the trajectory "line segment AB", and the set of position information of all these pixels (that is, the coordinates (xi, yi)) is the pixel position information of the target drop point on the coloring picture that needs to be obtained.
• the target landing point determines the erasing position, and the erasing range is determined by a preset erasing width r. That is to say, with a preset erasing width r, all pixels whose distance from a target landing point (xi, yi) is less than or equal to r will be erased.
• the target pixel points also include the pixels in the set S, where the set S is the set of all pixels whose distance from the target landing point is less than or equal to the preset erasing width r.
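The set S can be built by a brute-force scan over the picture's pixels, keeping every pixel within Euclidean distance r of some target landing point. The grid size and function name are illustrative.

```python
import math

# Build the set S of all pixels whose distance from at least one target
# landing point is less than or equal to the preset erasing width r.

def erasing_set(landing_points, width, height, r):
    S = set()
    for px in range(width):
        for py in range(height):
            if any(math.hypot(px - x, py - y) <= r for x, y in landing_points):
                S.add((px, py))
    return S
```

With a single landing point (2, 2) and r = 1 on a 5x5 grid, S contains the point itself and its four direct neighbours.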
  • Fig. 23 shows a schematic diagram of an erasing range with a preset erasing width r according to some embodiments.
• the erasing range is the strip-shaped area of width 2r formed by all circles taking each point of "line segment AB" as the center and the erasing width r as the radius, that is, the illustrated area 621 and area 631.
• if the preset erasing width is r, then based on the position information of the pixel points of the target landing point (that is, the set of coordinates (xi, yi)), the set of position information (that is, coordinates (xi, yi)) of all pixels whose distance from a target landing point is less than or equal to r must be obtained.
• if the pixel point is at a boundary position, the pixel point is retained; the retained color should be the initial color of the boundary. In this way, after multiple coloring and erasing operations, the color of the boundary remains consistent, thereby ensuring that the boundary of the coloring area is not damaged and that the drawing board functions normally.
  • Fig. 24 shows a schematic diagram of a coloring picture being colored according to some embodiments. As shown in FIG. 24, while referring to FIG. 21, the painted area 61, the painted area 62, and the painted area 63 have been painted (the painted color is black).
  • Fig. 25 shows a schematic diagram of a target drop point of an erase track according to some embodiments.
  • the erasing locus is the locus “line segment AB”, and the “line segment AB” passes through the painted area 62 and the painted area 63 .
• the erasing trajectory is the trajectory "line segment AB", and the erasing range is the intersection of the band-shaped area with a width of 2r (formed by all circles taking each point of "line segment AB" as the center and the erasing width r as the radius) with the painted area 62 and the painted area 63. It can be understood that the colors of the pixels at all other positions are preserved.
• the boundary position refers to the boundary, that is, the continuous lines 41, 42, 43 and the other lines enclosing the painted areas as shown in FIG. 21.
• Figure 26 shows a schematic diagram of a painted picture after erasing color, according to some embodiments. As shown in FIG. 26, when the erasing track is the track "line segment AB", the intersections of the aforementioned band-shaped area with the closed area 62 and the closed area 63 are erased; these are the erasing area 621 and the erasing area 631, respectively.
  • the present application provides a method for retaining the line segment 421 , the line segment 422 , the line segment 431 and the line segment 432 .
• all pixels (xi, yi) whose distance from a pixel point of the target landing point is less than or equal to r are obtained and recorded as a set S. Each pixel (xi, yi) in the set S is traversed, and it is judged whether the pixel (xi, yi) is on a boundary; if it is on a boundary, the boundary is kept.
• the boundary position information, that is, the position coordinates of the pixels (xi, yi), can be stored in the control device in advance, so that when an erase instruction arrives, the boundary position information is called directly as the boundary position.
• the coloring picture includes a painted area; the controller is further configured to: save the configuration information of the pixels in the painted area and the configuration information of the boundary in the same file, so that the painted area and the boundary are displayed in the same picture. "If the position coordinates of the pixel are not the position coordinates of the boundary, delete the pixel" includes: if the position coordinates of the pixel are not the position coordinates of the boundary, deleting the pixel's configuration information from that file.
• "if the position coordinates of the pixel are the position coordinates of the boundary, retain the pixel" includes: if the pixel is at a boundary position, retaining the boundary's configuration information in that file.
• confirming the target erasing area corresponding to the moving track includes: confirming a first intersecting track, the first intersecting track being the set of pixels where the moving track intersects the painted area; and confirming the target erasing area according to the first intersecting track.
• the coloring picture also includes an unpainted area; confirming the target erasing area corresponding to the movement track includes: confirming a second intersecting track, the second intersecting track being the set of pixels where the moving track intersects the unpainted area; and confirming the target erasing area according to the pixels on the moving track other than the second intersecting track.
• the initial color of the boundary position is different from the color of the areas other than the boundary position on the coloring picture. In the step of obtaining the initial color of the boundary position, the initial color of the boundary is the border color. In the step of confirming the target pixel, the controller is further configured to: save the coloring picture; set the border color of the coloring picture; and, starting from the first pixel of the coloring picture, traverse and read the color of each pixel of the coloring picture; if the color of a pixel is the border color, save the position coordinates of that pixel to the memory as position coordinates of the boundary.
  • the color of the border is the first color
  • the color of the painted area when it is not painted is the second color
  • the first color is different from the second color
• "the position coordinates of the pixel are saved to the memory as the position coordinates of the boundary" includes: judging whether the color of the pixel is the first color; if the color of the pixel is the first color, saving the position coordinates of the pixel to the memory as position coordinates of the boundary.
  • the coloring picture is preset with patterns that can be colored, and the patterns surround multiple coloring areas that can be colored with continuous lines.
• the continuous lines serve as the boundaries of multiple painted areas (for convenience of description, "boundary" is used hereinafter instead of "continuous lines surrounding multiple painted areas"); in order to allow users to distinguish the boundaries from the painted areas, the line color of the boundary is clearly different from the initial color of the painted areas.
  • the border of the painted picture shown in FIG. 21 uses black lines, and the initial setting color of the painted area is white.
• Fig. 27 shows a schematic flowchart of a method for identifying the boundary position information of a coloring picture according to some embodiments.
• the color of each pixel (xi, yi) of the coloring picture is traversed, and it is judged whether the pixel (xi, yi) has the border color.
• before judging the color of the pixel (xi, yi), the coloring picture can be loaded into memory, saved in bitmap format, and the border color can be set, so as to enable the judging step.
  • the boundary color may be a single color or multiple colors, which is not limited in this application.
• if so, the pixel (xi, yi) is stored in the set A. After traversing the color of every pixel (xi, yi) of the coloring picture, all elements in the set A are used as boundary positions.
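The traversal that builds the set A can be sketched as follows; the nested-list bitmap and color strings are assumptions for illustration.

```python
# Traverse every pixel of the coloring picture and collect the
# positions whose color matches the border color into the set A.

def find_boundary(bitmap, border_color="black"):
    A = set()
    for y, row in enumerate(bitmap):
        for x, color in enumerate(row):
            if color == border_color:
                A.add((x, y))   # (xi, yi) is a boundary position
    return A
```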
• if the pixel (xi, yi) does not have the border color, the pixel (xi, yi) is considered to belong to a painted area, or to the blank area outside the coloring pattern on the coloring picture.
  • the color of the painted area and the blank area can be the same or different, but both should be clearly different from the border color.
  • the pixels are distinguished by color, and the difference in color is converted into pixels with different characteristics (referring to boundary and non-boundary).
• in response to the control instruction for erasing color, the controller is further configured to: traverse all pixels on the coloring picture to determine whether the color of each pixel is the border color; if the color of a pixel is the border color, determine that pixel as a target pixel.
• before the area to be erased is selected, part of the painted areas of the coloring picture has already been painted.
• the coloring picture includes painted areas and unpainted areas; the area to be erased may contain both painted and unpainted areas, and erasing does not apply to the unpainted areas.
• the display method provided by the present application further includes: the painted area is a closed area; in the step of erasing the color of the target pixel, the controller is further configured to: confirm a first intersecting pixel, the first intersecting pixel being the pixel where the target landing point intersects the boundary of the painted area; and, according to the first intersecting pixel, confirm the pixels on the target landing point that coincide with the painted area as the target pixels.
• the coloring picture also includes an unpainted area; the unpainted area is a closed area; the controller is further configured to: confirm a second intersecting pixel, the second intersecting pixel being the pixel where the target landing point intersects the boundary of the unpainted area; and, according to the second intersecting pixel, delete the pixels on the target landing point that coincide with the unpainted area, so as to use the remaining pixels on the target landing point as the target pixels.
  • an image processing application can be installed on the display device, and the image processing application can provide different types of coloring pictures.
  • the coloring picture includes a closed area and a boundary area, and the closed area is an area for the user to color.
  • the boundary area is the boundary line, and the boundary line is set around the closed area.
  • closed areas are colored white and bordered areas are colored black. The user can adjust the closed area in the coloring picture, for example, color the closed area.
• the coloring pictures are saved in compressed form; because of the compression, after a coloring picture is converted into a bitmap file there is a transition area between the closed area and the area boundary, and the color of the transition area is a grayscale gradient. Furthermore, when the user paints the coloring picture, because of the color grayscale in the transition area between the closed area and the boundary area, the user cannot completely fill the closed area, and glitches appear around the boundary area, thereby degrading the coloring effect and the user experience.
  • the embodiment of the present application also provides a display device and a method for eliminating the gray level of an area, so as to solve the problems that the user cannot fill the closed area completely and glitches appear around the boundary area when filling in the color.
• the schematic diagram of the electronic interface of the above-mentioned display device provided in the embodiment of the present application is shown in FIG. 19a, and the corresponding interaction process between the user and the display device and the changes of the user interface are shown in FIGS. 19a to 19c. Since these have been described in detail above, they are not repeated here.
  • the user can color the coloring picture.
  • the color of the closed area is a pure base color, for example: white.
  • the boundary area is also the boundary line, and the boundary line is set around the closed area.
  • the color of the border area is a pure border color, eg black.
  • the display device can perform a grayscale elimination operation on the transition area of the colored picture before displaying the colored picture in the drawing area, so as to prevent the situation that the user cannot completely fill the closed area with color.
  • the display device provided by the present application includes a display; the display is configured to display an electronic drawing board interface, and the electronic drawing board interface is used to display coloring pictures.
• the touch component is configured to receive instructions input by the user through touch, where the touch component and the display form a touch screen. In order to eliminate the color grayscale between the closed area and the boundary area in the coloring picture, so that the closed area and the boundary area are clearly divided, this application sets the colors of all pixels according to their gray values, thereby eliminating the gray levels of the area.
  • a display device and a method for eliminating the gray scale of an area provided by the present application will be described in detail below.
  • Fig. 29 shows a schematic flow chart of a method for eliminating gray levels of regions according to some embodiments.
  • the controller configured in it is configured to perform the following steps when executing the method for eliminating the gray scale of an area:
  • the coloring picture can be inserted into the electronic drawing board interface through the above insertion method.
• a coloring picture collection is pre-stored, and the collection includes several pictures for the user to select from; that is to say, the coloring picture is selected from the aforementioned picture collection.
• the controller can perform uniform area-division processing on the pictures in the picture collection in advance, so that before the user enters an instruction to insert a coloring picture, all candidate pictures in the coloring picture collection are already area-divided coloring pictures.
• when a coloring picture is selected by the user, it can be displayed directly in the drawing area of the electronic drawing board interface, so that the user can perform subsequent operations. It should be noted that this application does not limit the time of area division or the number of divided coloring pictures; that is to say, coloring pictures can be divided into areas in batches or one by one. Likewise, a coloring picture can be divided into regions before being selected by the user, or automatically divided into regions after being selected by the user.
  • Fig. 30 shows a schematic flowchart of dividing a closed area and a boundary area according to some embodiments.
  • the controller in the process of dividing the painted picture into areas to form at least one closed area and a border area surrounding the closed area, the controller is further configured to perform the following steps:
• the coloring picture is loaded into memory, and a bitmap file (Bitmap) corresponding to the coloring picture is created; the bitmap file is used to obtain all pixels in the coloring picture. Further, all pixels are traversed: since the coloring picture is composed of a certain number of pixels, a preset program traverses all the pixels in the coloring picture to determine the original RGB components of each pixel. The gray value of each pixel is then obtained by applying the preset gray value formula to its original RGB components.
  • the preset gray value formula is as follows:
  • Gray is the gray value; R is the red component value; G is the green component value; B is the blue component value.
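The formula itself is not reproduced in the text above; the weighting used in the sketch below is the common ITU-R BT.601 luma approximation and is an assumption, not a quotation from the patent.

```python
# Gray = 0.299*R + 0.587*G + 0.114*B  (assumed BT.601 weighting,
# not quoted from the patent text)

def gray_value(r, g, b):
    return 0.299 * r + 0.587 * g + 0.114 * b
```

Pure black maps to a gray value of 0 and pure white to 255, matching the closed-area/boundary colors described above.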
• the pixel colors of most pixels in the coloring picture do not need to be adjusted; that is, the white and black areas in the coloring picture are left unchanged, and only some pixels need adjustment, namely the gray gradient existing between the closed area and the boundary area. Accordingly, in the process of determining the pixel colors of all pixels, the pixel colors of some pixels are adjusted while each pixel's color is re-determined, completing the division of the coloring picture into closed areas and boundary areas.
• the gray values of all pixels are then judged: the gray value of each pixel is compared with a preset gray threshold, which is used to divide closed areas from boundary areas. It should be noted that the preset gray threshold is not specifically limited in this application and can be chosen according to actual conditions.
• if the gray value does not exceed the preset gray threshold, the color of the corresponding pixel is set to black, so that the black pixels constitute the boundary area.
• if the gray value exceeds the preset gray threshold, the color of the corresponding pixel is set to white, so that the white pixels form the closed area.
  • the colors of all pixels are determined according to the judgment result, so as to divide the closed area and the border area of the painted picture.
• the transition area can thus be readjusted to black (joining the border area), or to white (joining the closed area), or partly black and partly white, thereby eliminating the transition area, that is, the color grayscale, and making the border and closed areas distinctly colored.
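Taken together, the threshold comparison amounts to binarizing the gray-value bitmap: pixels above the threshold become white (closed area), the rest become black (boundary area). The threshold value of 128 and the data layout are illustrative assumptions.

```python
# Binarize a bitmap of gray values against a preset gray threshold,
# eliminating the grayscale transition area between closed areas and
# boundary areas. The threshold 128 is an assumed example value.

def binarize(gray_bitmap, threshold=128):
    return [["white" if g > threshold else "black" for g in row]
            for row in gray_bitmap]
```

A transition pixel such as gray value 130 is snapped to white, while 50 is snapped to black, so no intermediate grays remain.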
• the electronic drawing board interface shown in Figure 31 is presented on the display, showing the coloring picture after the user's coloring. It can be seen that after the user performs a coloring operation on any closed area, there are no glitches around the closed area, and the closed area is completely filled, thereby improving the aesthetics of the coloring picture and the user's experience.
• the controller controls the display to show the divided coloring picture in the drawing area of the electronic drawing board interface, which facilitates the user's subsequent coloring of the closed areas.
  • the electronic drawing board interface further includes a control area, the control area includes at least one brush control, and the brush control is used for inputting content in the drawing area.
• Fig. 32 shows a schematic diagram of the interface of the control area according to some embodiments. In the toolbar, parameter information such as the brush color, line thickness and line style can be chosen. After selecting the parameter information of the brush control, the user can input content by touching the drawing area, and the input content is the user's touch track on the drawing area.
  • the user can select the color of the brush control through the toolbar corresponding to the brush control.
  • the controller can control the display to display the color of the brush control.
• the brush control color panel includes all the colors that the brush control can use and the current color of the brush control.
  • Fig. 33 shows a schematic diagram of an interface including a color picker control in the control area according to some embodiments. The user can see which color the brush control is currently using. Users can also set the color of the brush control by themselves.
  • the color suction control 3301 is at least used to indicate the suction color corresponding to the coloring position, and the suction color is the color at the coloring position in the drawing area.
• when the color picker control indicates, for example, "color 1", the color of the brush control is "color 1".
  • the controller completes the coloring of the closed area according to the suction color corresponding to the coloring position indicated by the color suction control.
• when the color-picking control is displayed, the method further includes receiving an input operation for changing the picked color. A new suction color is determined according to the change operation, and the suction color indicated by the color suction control is updated according to the new suction color, so as to complete the coloring of the closed area.
  • the altering operation includes a touch operation with a touch screen.
  • altering operations include performing gestures on the touchscreen, ie movements of objects/accessories in contact with the touchscreen.
  • the contacting can also be performed using any suitable object or accessory, such as a capacitive pen or the like.
  • Contact may include one or more taps occurring on the touch screen, maintaining continuous contact with the touch screen, moving a point of contact while maintaining continuous contact, breaking contact, or a combination of one or more of the foregoing.
  • the color picker control is an interactive user interface object that the user can interact with to change the picker color and eventually select the latest picker color.
  • the area of the painted picture is divided into a closed area and a boundary area.
  • the color gray scale between the closed area and the boundary area is eliminated, so that the colors of the closed area and the boundary area are distinct. Therefore, the user can fill the entire closed area with color, avoiding glitches around the closed area, thereby improving the user experience.
  • the above UI is an example of a display device.
  • Other types of display devices are basically similar to the above UI in terms of coloring pictures on the electronic drawing board interface, and will not be listed here.
  • the embodiment of the present application also provides a method for eliminating the gray scale of an area, which is applied to a display device.
• the method specifically includes the following steps: acquiring a coloring picture, and dividing the coloring picture into areas to form at least one closed area and a border area surrounding the closed area.
• the display is controlled to show the coloring picture with completed area division.
  • the color of the closed area is the color selected by the user.
  • the method when the electronic drawing board interface is displayed, the method further includes: traversing all the pixels in the painted picture, and calculating the gray value corresponding to each pixel.
  • the pixel color of all pixels is determined based on the gray value, so as to divide the closed area and the border area in the painted picture.
• the method also includes: comparing the grayscale value with a preset grayscale threshold, where the preset gray threshold is used to divide the closed area and the boundary area.
• the method also includes: loading the coloring picture into memory, and creating a bitmap file corresponding to the coloring picture, the bitmap file being used to obtain all pixels in the coloring picture; traversing all pixels to determine the original RGB components of each pixel; and applying the preset gray value formula to the original RGB components to obtain the gray value of each pixel.
  • the preset gray value formula is as follows:
  • Gray is the gray value; R is the red component value; G is the green component value; B is the blue component value.
• the electronic drawing board interface also includes a control area, the control area includes at least one brush control, and the brush control is used to input content in the drawing area. In response to an input coloring operation on the closed area, the method also includes: in response to the input trigger coloring operation, displaying a color pick-up control on the drawing area, the color pick-up control being at least used to indicate the pick-up color corresponding to the painted position; and, in response to an input confirmation coloring operation, completing the coloring of the closed area according to the pick-up color indicated by the color pick-up control.
  • the method further includes: when the color picker control is displayed, receiving an input pick color change operation. Determine a new suction color according to the change operation, and update the suction color indicated by the color suction control according to the new suction color, so as to complete the coloring of the closed area.
  • the format of the coloring picture is any one of RGB, JPG, PNG or SVG formats.
  • the present application provides a display device and a method for eliminating gray scale of an area, by dividing the area in the painted picture into a closed area and a boundary area, and eliminating the gray scale of the area between the closed area and the boundary area.
  • the closed area can be filled when the user paints, and there is no glitch in the border area after coloring.
  • the present application provides a display device and a color setting method, which are used to solve the problem in the related art that the boundary of a picture will be destroyed, resulting in poor user experience.
  • the electronic drawing board includes a drawing area and a control area.
  • the drawing area is an area where content can be input.
• the control area can centrally display controls corresponding to one or more functions of the sketchpad application, such as the brush control, erasing control and paint bucket control. Users can use each control to perform corresponding operations, and can also set the parameters of each control, such as the fill color and line style indicated by the brush control.
  • Figure 34 shows a schematic diagram of an electronic sketchpad in some embodiments.
  • the target picture can also be displayed in the drawing area, and the user can perform operations such as coloring on the target picture.
  • Figure 35 shows a schematic diagram of a target picture in some embodiments.
  • the target picture includes a color-filled area and area boundaries, and the user can perform operations such as painting in the color-filled area.
  • users can also use the controls in the control area to perform various operations on the target picture in the drawing area, for example: use the brush control to input lines, graphics, and so on in the color-filled area of the target picture; use the dye-bucket control to fill an area with a fill color; or use the erase control to erase content entered by the user.
  • the user can use the controls in the control area to perform various operations on the target picture.
  • the user can trigger the brush control so that it enters the picked-up state; the user can then control the brush control to perform corresponding operations in the target picture.
  • the controller may also control the display to display the toolbar corresponding to the paintbrush control.
  • Fig. 36 shows a schematic diagram of the toolbar corresponding to the brush control in some embodiments. In the toolbar, parameter information such as brush color, line thickness, and line style can be selected.
  • after the user selects the parameter information of the brush control, content can be input by touching the drawing area; the input content is the user's touch track on the drawing area.
  • a toolbar corresponding to the paintbrush control will be displayed on the display.
  • the user can select the fill color indicated by the brush control through the toolbar corresponding to the brush control.
  • the controller can control the display to display the filling color.
  • the fill color options include all colors the brush control can indicate and the fill color the brush control is currently indicating.
  • Figure 37 shows a schematic diagram of the color options in the toolbar in some embodiments. Users can view the fill color the brush control is currently indicating, and can also set the fill color indicated by the brush control themselves.
  • FIG. 38 shows a schematic diagram of the content input by the user destroying the boundary of the region in the related art.
  • L1 is a line input by the user, which is determined by the system as a region boundary, thereby destroying the original region boundary of the target image.
  • the display device provided by the present application can protect the area boundary of the target picture to prevent it from being destroyed by the content input by the user.
  • the display device can be set with an area border protection mode.
  • when the display device is in the area boundary protection mode, content input by the user will not be determined to be an area boundary, thereby protecting the area boundaries of the target picture.
  • the way for the user to enter and exit the area boundary protection mode is the same as the above-mentioned method for entering and exiting the "painting protection mode", and will not be repeated here.
  • the area boundary protection mode option can also be set in the UI interface of the display device, and when the user clicks on this option, the display device can be controlled to enter or exit the area boundary protection mode.
  • Fig. 39 shows a schematic diagram of displaying area boundary protection mode confirmation information on a display in some embodiments.
  • Figure 40 shows an interaction flow diagram of various components of a display device in some embodiments.
  • the controller may first detect whether the content to be input by the user will destroy the area boundary of the target image.
  • when detecting whether the content to be input by the user will destroy the area boundary of the target picture, the controller may detect the fill color indicated by the brush control selected by the user. After determining that fill color, the controller may further detect the color value corresponding to it, which is named the fill color value in the embodiments of this application.
  • the controller may further determine the color value of the pixels on the area boundary of the target picture. If the fill color value is the same as the color value of the boundary pixels, the fill color indicated by the brush control is exactly the same as the color of the area boundary. In that case, if the user adopts the fill color indicated by the currently selected brush control, the input content may destroy the area boundary of the target picture. It is therefore necessary to adjust the fill color value so that it differs from the color value of the boundary pixels, ensuring that the area boundary of the target picture is not destroyed. Fig. 41 shows a schematic diagram in some embodiments in which the content input by the user does not destroy the area boundary.
  • L1 is the line input by the user.
  • the color value of this line is different from the color value of the pixel point of the area boundary, so the system will not judge the content input by the user as the area boundary, so that the original area boundary of the target image can be protected.
  • the color values may be ARGB values.
  • ARGB is a color mode, that is, the RGB color mode is added with a transparency channel.
  • the RGB color mode is a mode in which various colors are obtained by changing the three color channels of red (Red), green (Green), and blue (Blue) and superimposing them with each other.
  • the area boundary of the target picture is fixed, and its color is likewise fixed and uniform.
  • the controller can obtain the color values (R0, G0, B0, Alpha0) of the pixels on the border of the region.
  • the controller can determine the current fill color value (R′, G′, B′, Alpha′) corresponding to the fill color. Whether the content input by the user will destroy the area boundary of the target picture can be confirmed by judging whether the color value of the boundary pixels is the same as the fill color value.
  • the controller can fine-tune the fill color value by adding or subtracting a preset value. For example, the value of one color component in the fill color value (R′, G′, B′, Alpha′) can be increased or decreased by one, ensuring that the fill color value differs from the color value of the boundary pixels.
  • the human eye is most sensitive to green (the G component), next most sensitive to red (the R component), and least sensitive to blue (the B component). Therefore, when the fill color value is the same as the color value of the boundary pixels, the controller can fine-tune the B component in the fill color value by adding to or subtracting from its value; this keeps the fill color indicated by the brush control as close as possible to the original color while ensuring that the content input by the user will not form a new area boundary, thus protecting the original area boundary of the target picture.
  • when the fill color is black, the corresponding color value is (0, 0, 0, Alpha), and the value of the B component in the fill color value needs to be increased by one.
  • when the fill color is white, the corresponding color value is (255, 255, 255, Alpha), and the value of the B component in the fill color value needs to be decreased by one.
  • whether the value of the B component is increased or decreased by one can be set by the user.
  • when fine-tuning the fill color value, other components can also be adjusted, and the adjustment value can be set by the user, which is not limited in this application.
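The boundary-protection rule described above (compare the fill color value with the boundary pixel color, and nudge the blue component when they collide) can be sketched as follows. This is a minimal illustration rather than the patent's implementation; the function name and the 0-255 RGBA tuple representation are assumptions.

```python
def adjust_fill_color(fill, boundary):
    """Return a fill color (R, G, B, A) guaranteed to differ from the
    boundary pixel color, nudging only the blue component, to which the
    human eye is least sensitive."""
    if fill != boundary:
        return fill  # already distinguishable; no change needed
    r, g, b, a = fill
    # Add one to B when possible, otherwise subtract one (e.g. pure white).
    b = b + 1 if b < 255 else b - 1
    return (r, g, b, a)
```

For example, a black brush over a black boundary becomes (0, 0, 1, Alpha), which is visually indistinguishable from black but no longer matches the boundary color value.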
  • the controller determines that the color value of the fill color currently indicated by the brush control is the adjusted fill color value.
  • the controller determines that the color value of the fill color currently indicated by the brush control is the fill color value selected by the user.
  • the controller can also detect the user's touch operation.
  • the user can pick up the brush control to paint the target picture in the drawing area, etc.
  • the user can use a finger to click or move on the touch screen, and achieve coloring of the color-filled area in the target picture.
  • the controller can use the touch component to detect the touch track input by the user.
  • the area that the user wishes to color in the target image can be confirmed according to the touch track input by the user.
  • the controller can update the color values of all pixels in the touch track according to the fill color value, thereby inputting the fill color indicated by the brush control into the touch track and coloring the color-filled area.
  • the controller when updating the color values of all pixels in the touch track, may replace the color values of all pixels in the touch track with the current corresponding fill color value of the brush control.
  • the controller may replace all the color values of the pixels in all areas touched by the user with the filling color values. That is, all areas touched by the user are set to the color selected by the user.
  • for example, if the color value of a pixel in the touch track is (R1, G1, B1, Alpha1) and the fill color value of the fill color selected by the user is (R′, G′, B′, Alpha′), the color value of the pixel is replaced with (R′, G′, B′, Alpha′).
  • when updating the color values of all pixels in the touch track, the controller can perform color-mixing processing on the area corresponding to the touch track, that is, perform color mixing on all pixels in the touch track.
  • the controller may first determine the current first color value of each pixel in the touch track. For each pixel, the current first color value and the fill color value are superimposed to obtain the second color value, which is the color value obtained after the color mixing process for each pixel. Further, the controller may update the color value of each pixel in the touch track to its corresponding second color value, so as to realize the color mixing process on the corresponding area of the touch track.
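The patent does not give the exact superposition formula, so the sketch below assumes standard source-over alpha compositing as one plausible way to obtain the second color value from the first color value and the fill color value; the function name and the 0-255 RGBA tuples are illustrative, not the patent's API.

```python
def mix_pixel(first, fill):
    """Blend the brush fill color over a pixel's current (first) color
    using source-over alpha compositing; one plausible reading of the
    'superimpose to obtain the second color value' step."""
    r1, g1, b1, a1 = [c / 255 for c in first]
    rf, gf, bf, af = [c / 255 for c in fill]
    a2 = af + a1 * (1 - af)  # resulting alpha
    if a2 == 0:
        return (0, 0, 0, 0)  # fully transparent result
    mixed = [(cf * af + c1 * a1 * (1 - af)) / a2
             for cf, c1 in ((rf, r1), (gf, g1), (bf, b1))]
    return tuple(round(c * 255) for c in mixed) + (round(a2 * 255),)
```

A fully opaque fill simply replaces the pixel, while a semi-transparent fill tints it, which matches the intuitive "color mixing" behavior described above.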
  • for example, if the first color value of a certain pixel in the touch track is (R1, G1, B1, Alpha1) and the fill color value of the fill color selected by the user is (R′, G′, B′, Alpha′), superimposing them yields the second color value (R2, G2, B2, Alpha2), and the controller then sets the color value of the pixel to (R2, G2, B2, Alpha2).
  • in some cases, the color value obtained after the color mixing process may be the same as the color value of the pixels on the area border; that is, the second color value (R2, G2, B2, Alpha2) is the same as the color value (R0, G0, B0, Alpha0) of the pixels on the area border.
  • the touch track corresponding to these pixels may be determined by the system as the area boundary of the target image, so it is necessary to adjust the color values of the pixels in this part of the touch track. For the specific adjustment method, please refer to the above steps, which will not be repeated here.
  • the controller may update the color values of the pixels in this part of the touch track to the third color value. In this way, it is ensured that the content input by the user will not be judged by the system as the area boundary of the target image.
  • the specific update method refer to the above steps, which will not be repeated here.
  • the user can also switch the fill color indicated by the brush control.
  • the controller may determine a fill color value of the switched fill color.
  • the controller can continue to monitor the touch track input by the user.
  • the color values of all pixels in the touch track are updated according to the switched fill color value.
  • the user may also use the dye-bucket control.
  • when the user clicks the dye-bucket control, it enters the picked-up state; the user can then select the color indicated by the dye-bucket control and click on a certain color-filled area, at which point the color of that area is updated to the color indicated by the dye-bucket control, realizing color filling of the color-filled area.
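The dye-bucket behavior (click inside a closed area and the whole area takes the selected color) matches a classic breadth-first flood fill. The sketch below is an assumed illustration, not the patent's code; it stops at any pixel whose color differs from the clicked pixel's color, which is how the area boundary confines the fill.

```python
from collections import deque

def bucket_fill(pixels, width, height, x, y, new_color):
    """Fill the closed area containing (x, y) with new_color, spreading
    over 4-connected pixels that match the clicked pixel's color and
    stopping at differently colored (e.g. boundary) pixels.
    `pixels` is a dict mapping (x, y) -> color."""
    target = pixels[(x, y)]
    if target == new_color:
        return  # nothing to do; also avoids an infinite loop
    queue = deque([(x, y)])
    while queue:
        cx, cy = queue.popleft()
        if not (0 <= cx < width and 0 <= cy < height):
            continue
        if pixels[(cx, cy)] != target:
            continue  # boundary or already-recolored pixel: stop here
        pixels[(cx, cy)] = new_color
        queue.extend([(cx + 1, cy), (cx - 1, cy), (cx, cy + 1), (cx, cy - 1)])
```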
  • after the color-filled area is colored, it may be judged by the system as an area boundary.
  • the controller can determine whether the color value corresponding to the dye-bucket control is the same as the color value of the pixels on the area border. If so, the controller may adjust the color value corresponding to the dye-bucket control so that it differs from the color value of the boundary pixels. For the specific adjustment method, refer to the above steps, which will not be repeated here.
  • the embodiment of the present application also provides a color setting method, which is applied to a display device.
  • the method includes: step 4201, displaying an electronic drawing board, the electronic drawing board including a brush control and a drawing area, the drawing area being used to display a target picture, the target picture including a color-filled area and area boundaries, and the brush control being used to input content in the color-filled area; step 4202, in response to the user's trigger operation on the brush control, determining the fill color indicated by the brush control; step 4203, detecting the fill color value corresponding to the fill color; step 4204, when the fill color value is the same as the color value of the pixels of the area boundary, adjusting the fill color value so that it differs from the color value of the boundary pixels.

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The present application provides a display device and an image processing method. The display device includes: a display configured to display an electronic drawing board interface, the electronic drawing board interface including a control area and a drawing area; a touch component for receiving instructions input by a user through touch, the touch component and the display forming a touch screen; and a controller configured to: in response to a user-input instruction to display a target picture, display the target picture on the electronic drawing board interface; in response to a sliding operation input by the user in the drawing area from a drawing start point to a drawing end point, determine the sliding path between the drawing start point and the drawing end point; and control the sliding path located in a target closed area to be filled with a preset color, while controlling the sliding path located outside the target closed area not to be filled with color.

Description

Display device and image processing method
Cross-Reference to Related Applications
This application claims priority to Chinese patent applications No. 202110736500.1, No. 202110741457.8, and No. 202110739092.5, all filed on June 30, 2021; No. 202110736313.3, filed on June 30, 2021; No. 202210107311.2, filed on January 28, 2022; No. 202210128208.6, filed on February 11, 2022; and No. 202210191377.4, filed on February 28, 2022, the entire contents of which are incorporated herein by reference.
Technical Field
The present application relates to the field of display technology, and in particular to a display device and an image processing method.
Background
A display device is a terminal device capable of outputting a specific display picture, such as a smart TV, a mobile terminal, a smart advertising screen, or a projector. Taking a smart TV as an example, a smart TV is a television product that, based on Internet application technology, has an open operating system and chip and an open application platform, can realize two-way human-computer interaction, and integrates audio-visual, entertainment, education, data, and other functions, so as to meet users' diverse and personalized needs.
With the popularization of display devices and the continuous updating of multimedia educational TV, more and more teaching, entertainment, and children's educational applications can be used on TV, thereby realizing or assisting in realizing education, teaching, training, and puzzle functions. A drawing board application can be installed on the display device so that the drawing board functions provided by the application can be used. For example, drawing board application software can be installed on a smart TV; the software provides users with various types of coloring pictures for the user to color.
Summary
The present application provides a display device, including:
a display configured to display an electronic drawing board interface, the electronic drawing board interface including a control area and a drawing area, the control area including at least one brush control, the brush control being used to input content in the drawing area, and the drawing area being used to present the content input by the brush control and to display a target picture; a touch component for receiving instructions input by the user through touch, the touch component and the display forming a touch screen; and a controller configured to: in response to a user-input instruction to display a target picture, display the target picture in the drawing area, the target picture including at least one closed area; in response to a sliding operation input by the user in the drawing area from a drawing start point to a drawing end point, determine the sliding path between the drawing start point and the drawing end point; and control the sliding path located in a target closed area to be filled with a preset color, while controlling the sliding path located outside the target closed area not to be filled with color, where the target closed area is the closed area in which the drawing start point is located.
The present application also provides a display device, including: a display configured to present an electronic drawing board interface, the interface including a target picture, a coloring control, and an erase control, the target picture including a coloring area and a boundary enclosing the coloring area; a touch component for receiving instructions input by the user through touch, the touch component and the display forming a touch screen; and a controller configured to: obtain the movement track formed by the pixels selected by the user on the touch screen; determine the target erase area corresponding to the movement track; judge whether the position coordinates of each pixel in the target erase area are position coordinates of the boundary; if the position coordinates of a pixel are not position coordinates of the boundary, delete the pixel; and if the position coordinates of a pixel are position coordinates of the boundary, retain the pixel.
The present application also provides a display device, including: a display configured to display an electronic drawing board interface for presenting a target picture; a touch component configured to receive instructions input by the user through touch, the touch component and the display forming a touch screen; and a controller configured to: obtain the target picture and divide it into areas, forming at least one closed area and a boundary area surrounding the closed area; control the display to show the target picture after the area division is completed; and, in response to an input coloring operation on a closed area, make the color of the closed area the color selected by the user.
The present application also provides a display device, including: a display configured to display an electronic drawing board, the drawing board including a control area and a drawing area, the drawing area being used to present a target picture, the target picture including a color-filled area and area boundaries, the control area including at least one brush control, the brush control being used to input content in the color-filled area; and a controller configured to: in response to the user's trigger operation on the brush control, determine the fill color indicated by the brush control; detect the fill color value corresponding to the fill color; and, when the fill color value is the same as the color value of the pixels on the area boundary, adjust the fill color value so that it differs from the color value of the boundary pixels.
Description of the Drawings
Fig. 1 is a schematic diagram of an operation scenario between a display device and a control apparatus in some embodiments;
Fig. 2 is a hardware configuration block diagram of the control apparatus 100 in some embodiments;
Fig. 3 is a hardware configuration block diagram of the display device 200 in some embodiments;
Fig. 4 is a software configuration diagram of the display device 200 in some embodiments;
Fig. 5 is a schematic diagram of the display device 200 in some embodiments;
Fig. 6 is a schematic diagram of a user interface on the display in some embodiments;
Fig. 7 is a schematic diagram of an application list in some embodiments;
Fig. 8 is a schematic diagram of an electronic drawing board interface in some embodiments;
Fig. 9 is a schematic diagram of a picture option bar in some embodiments;
Fig. 10 is a schematic diagram of a toolbar page in some embodiments;
Fig. 11 is a schematic diagram of a brush toolbar in some embodiments;
Fig. 12 is a schematic diagram of user-input content exceeding the area boundary of a closed area of the target picture in some embodiments;
Fig. 13 is a schematic diagram of a coloring protection mode item in some embodiments;
Fig. 14 is a coloring protection mode confirmation information page in some embodiments;
Fig. 15 is an information interaction diagram of color filling of pixels in some embodiments;
Fig. 16 is a schematic diagram of area division of the target picture in some embodiments;
Fig. 17 is a schematic diagram of a touch track in some embodiments;
Fig. 18 is a flowchart of a coloring method;
Fig. 19a is a schematic diagram of an electronic drawing board interface in some embodiments;
Fig. 19b is a schematic diagram of inserting a picture in another electronic drawing board interface in some embodiments;
Fig. 19c is a schematic diagram of yet another electronic drawing board interface in some embodiments;
Fig. 19d is a schematic diagram of yet another electronic drawing board interface in some embodiments;
Fig. 20a is a schematic flow diagram of a display method in some embodiments;
Fig. 20b is a schematic flow diagram of a display method in some embodiments;
Fig. 21 is a schematic diagram of a coloring picture displayed on the display device 200 in some embodiments;
Fig. 22 is a schematic diagram of a target landing point in some embodiments;
Fig. 23 is a schematic diagram of the erase range of a preset erase width r in some embodiments;
Fig. 24 is a schematic diagram of a coloring picture being colored in some embodiments;
Fig. 25 is a schematic diagram of target landing points of an erase track in some embodiments;
Fig. 26 is a schematic diagram of a coloring picture after colors are erased in some embodiments;
Fig. 27 is a schematic flow diagram of a method for identifying boundary position information of a coloring picture in some embodiments;
Fig. 28 is a schematic interface diagram of a user finishing coloring when the coloring picture has not been divided into areas in some embodiments;
Fig. 29 is a schematic flow diagram of a method for eliminating area gray scale in some embodiments;
Fig. 30 is a schematic flow diagram of dividing closed areas and boundary areas in some embodiments;
Fig. 31 is a schematic interface diagram of a user finishing coloring after the coloring picture has been divided into areas in some embodiments;
Fig. 32 is a schematic interface diagram of the control area in some embodiments;
Fig. 33 is a schematic interface diagram of the control area including a color-pickup control in some embodiments;
Fig. 34 is a schematic diagram of an electronic drawing board in some embodiments;
Fig. 35 is a schematic diagram of a target picture in some embodiments;
Fig. 36 is a schematic diagram of the toolbar corresponding to the brush control in some embodiments;
Fig. 37 is a schematic diagram of the color options in the toolbar in some embodiments;
Fig. 38 is a schematic diagram of user-input content destroying an area boundary in the related art;
Fig. 39 is a schematic diagram of area boundary protection mode confirmation information displayed on the display in some embodiments;
Fig. 40 is an interaction flow diagram of the components of the display device in some embodiments;
Fig. 41 is a schematic diagram of user-input content not destroying an area boundary in some embodiments;
Fig. 42 is a schematic flow diagram of an embodiment of a color setting method in some embodiments.
Detailed Description
To make the objectives and embodiments of the present application clearer, the exemplary embodiments of the present application are described below clearly and completely with reference to the accompanying drawings of the exemplary embodiments. Obviously, the described exemplary embodiments are only a part of the embodiments of the present application, not all of them.
It should be noted that the brief explanations of terms in this application are only intended to facilitate understanding of the embodiments described below, and are not intended to limit the embodiments of this application. Unless otherwise stated, these terms should be understood according to their ordinary and usual meanings.
Fig. 1 is a schematic diagram of an operation scenario between a display device and a control apparatus according to an embodiment. As shown in Fig. 1, a user can operate the display device 200 through a smart device 300 or the control apparatus 100.
In some embodiments, the control apparatus 100 may be a remote control. Communication between the remote control and the display device includes infrared protocol communication, Bluetooth protocol communication, and other short-range communication methods, and the display device 200 is controlled wirelessly or by wire.
In some embodiments, a smart device 300 (such as a mobile terminal, tablet, computer, or laptop) may also be used to control the display device 200, for example using an application running on the smart device.
In some embodiments, the display device 200 may also be controlled in ways other than through the control apparatus 100 and the smart device 300. For example, the user's voice instructions may be received directly through a voice instruction acquisition module configured inside the display device 200, or through a voice control apparatus provided outside the display device 200.
In some embodiments, the display device 200 also performs data communication with a server 400. The display device 200 may be allowed to communicate via a local area network (LAN), a wireless local area network (WLAN), and other networks. The server 400 may provide various content and interactions to the display device 200.
Fig. 2 is a configuration block diagram of the control apparatus 100 according to some embodiments. As shown in Fig. 2, the control apparatus 100 includes a controller 110, a communication interface 130, a user input/output interface 140, a memory, and a power supply. The control apparatus 100 can receive the user's input operation instructions and convert them into instructions that the display device 200 can recognize and respond to, serving as an interaction intermediary between the user and the display device 200.
In some embodiments, the communication interface 130 is used for communicating externally and contains at least one of a WiFi chip, a Bluetooth module, NFC, or an alternative module.
In some embodiments, the user input/output interface 140 contains at least one of a microphone, a touchpad, a sensor, a button, or an alternative module.
Fig. 3 is a hardware configuration block diagram of the display device 200 according to some embodiments.
In some embodiments, the display device includes a touch component, through which the display device can implement touch interaction, allowing the user to operate the host simply by touching the display with a finger. This does away with keyboard, mouse, and remote-control operation and makes human-computer interaction more direct. On the touch display, the user can input different control instructions through touch operations; for example, the user can input click, slide, long-press, and double-click touch instructions, and different touch instructions can represent different control functions.
To implement these different touch actions, the touch component generates different electrical signals when the user inputs different touch actions and sends the generated signals to the controller 250. The controller 250 can perform feature extraction on the received electrical signals and determine, from the extracted features, the control function the user wants to execute. For example, when the user inputs a click touch action at the position of any program icon in the application interface, the touch component senses the touch action and generates an electrical signal. After receiving the electrical signal, the controller 250 may first judge the duration of the level corresponding to the touch action in the signal; when the duration is less than a preset time threshold, it recognizes that the user input a click touch instruction. The controller 250 then extracts the position features of the electrical signal to determine the touch position. When the touch position is within the display range of an application icon, it is determined that the user input a click touch instruction at the application icon position. Accordingly, the click touch instruction is used in the current scenario to execute the function of running the corresponding application, so the controller 250 can start the corresponding application.
As another example, when the user inputs a slide action on the media asset display page, the touch component likewise sends the sensed electrical signal to the controller 250. The controller 250 first judges the duration of the signal corresponding to the touch action. When the duration is determined to be greater than the preset time threshold, it judges the position changes of the signal; obviously, for an interactive touch action, the position where the signal is generated changes, from which it is determined that the user input a slide touch instruction. The controller 250 then judges the slide direction of the slide touch instruction according to the changes in the signal position, and controls page-turning of the displayed picture on the media asset display page to display more media asset options. Further, the controller 250 can also extract features such as the slide speed and slide distance of the slide touch instruction, and control the page-turning according to the extracted features, so as to achieve a follow-the-hand effect.
Similarly, for touch instructions such as double-click and long-press, the controller 250 can extract different features, determine the type of the touch instruction through feature judgment, and then execute the corresponding control function according to preset interaction rules. In some embodiments, the touch component also supports multi-touch, allowing the user to input touch actions on the touch screen with multiple fingers, for example multi-finger click, multi-finger long-press, and multi-finger slide.
The above touch actions can also cooperate with specific applications to realize specific functions. For example, after the user opens the drawing board application, the display 260 can present the drawing area; the user can draw a specific touch-action track in the drawing area through slide touch instructions, and the controller 250 determines the touch-action pattern through the touch actions detected by the touch component and controls the display 260 to display it in real time to achieve the demonstration effect.
In some embodiments, the display device 200 further includes at least one of a tuner-demodulator 210, a communicator 220, a detector 230, an external device interface 240, a controller 250, a display 260, an audio output interface 270, a memory, a power supply, and a user interface 280.
In some embodiments, the controller includes a processor, a video processor, an audio processor, a graphics processor, RAM, ROM, and first through n-th interfaces for input/output.
In some embodiments, the display 260 includes a display screen component for presenting pictures and a driving component for driving image display, components for receiving image signals output from the controller and displaying video content, image content, and menu control interfaces, and a user-operated UI interface.
In some embodiments, the display 260 may be a liquid crystal display, an OLED display, or a projection display, and may also be a projection apparatus and a projection screen.
In some embodiments, the communicator 220 is a component for communicating with external devices or servers according to various communication protocol types. For example, the communicator may include at least one of a WiFi module, a Bluetooth module, a wired Ethernet module and other network communication protocol chips or near-field communication protocol chips, and an infrared receiver. The display device 200 can establish the sending and receiving of control signals and data signals with the external control apparatus 100 or the server 400 through the communicator 220.
In some embodiments, the user input interface can be used to receive control signals from the control apparatus 100 (such as an infrared remote control).
In some embodiments, the external device interface 240 may include, but is not limited to, any one or more of the following interfaces: a high-definition multimedia interface (HDMI), an analog or data high-definition component input interface (Component), a composite video input interface (CVBS), a USB input interface (USB), and an RGB port.
In some embodiments, the controller 250 controls the operation of the display device and responds to user operations through various software control programs stored in the memory.
In some embodiments, the graphics processor is used to generate various graphics objects, such as icons, operation menus, and graphics displayed in response to user-input instructions. The graphics processor includes an arithmetic unit, which performs operations by receiving the various interactive instructions input by the user and displays various objects according to their display attributes, and a renderer, which renders the various objects obtained by the arithmetic unit; the rendered objects are used for display on the display.
In some embodiments, the user may input user commands on a graphical user interface (GUI) displayed on the display 260, and the user input interface receives the user input commands through the GUI. Alternatively, the user may input user commands by inputting specific sounds or gestures, and the user input interface recognizes the sounds or gestures through sensors to receive the user input commands.
Referring to Fig. 4, in some embodiments the system is divided into four layers, from top to bottom: the Applications layer ("application layer"), the Application Framework layer ("framework layer"), the Android runtime and system library layer ("system runtime library layer"), and the kernel layer.
Based on the above display device, the display device can be made to support rotation and/or lifting functions by adding a driving component and a posture detection component. Generally, the driving component includes a rotating component and/or a lifting component, and the controller 250 can communicate with the rotating component and/or lifting component so that, when the display needs to rotate, the rotating component is controlled to drive the display to rotate, and when the display needs to rise or fall, the lifting component is controlled to drive the display to rise or fall.
In some implementations, a GPIO interface is provided on the rotating component and/or lifting component, and the controller reads this GPIO interface to change the GPIO interface state of the rotating component and/or lifting component. When the GPIO interface state changes, the rotating component and/or lifting component drives the display to rotate and/or lift according to the changed GPIO interface state.
In some implementations, the rotating component and/or lifting component includes an MCU chip with an integrated Bluetooth module, so that the component supports Bluetooth functions such as Bluetooth Low Energy (BLE); the controller 250 can then communicate with the rotating component and/or lifting component based on the Bluetooth protocol.
In some embodiments, the detection component includes a sensor for detecting the rotation state of the display and a sensor for detecting the lifting state of the display. During rotation or lifting of the display, the controller monitors the rotation state or lifting state of the display in real time according to the data detected by the posture detection component. For example, while controlling the rotation of the display, information such as rotation angle and angular velocity is obtained by monitoring the sensor data; while controlling the lifting of the display, information such as lifting distance and lifting speed is obtained by monitoring the sensor data.
In some embodiments, the detection component is included in the driving component. For example, the sensor for detecting the rotation state of the display is included in the rotating component and together with it constitutes a rotating assembly; the sensor for detecting the lifting state is included in the lifting component and together with it constitutes a lifting assembly.
Fig. 5 is a schematic diagram of the display device 200 according to some embodiments. As shown in Fig. 5, the display device includes a display 260 and a lifting drive device 511. The lifting drive device 511 engages with a lifting rail 512, and the lifting rail is fixed on a bracket. The rotating drive device is arranged on the inner side of the lifting drive device, that is, between the lifting drive device and the display, and is not shown in Fig. 5.
In some embodiments, the above display device 200 may be a touch display device, whose display is a touch display formed by a touch component together with a screen. The touch display device supports touch interaction, allowing the user to operate the host simply by touching the display with a finger, doing away with keyboard, mouse, and remote-control operation and making human-computer interaction more direct. On the touch display, the user can input different control instructions through touch operations; for example, click, slide, long-press, and double-click touch instructions, with different touch instructions representing different control functions.
To implement these different touch actions, the touch-sensitive component generates different electrical signals for the different touch actions input by the user and sends them to the controller 250, which extracts features from the received signals to determine the control function the user wants to execute. For example, when the user inputs a click touch action at the position of any program icon in the application interface, the touch component senses the touch action and generates an electrical signal; the controller 250 first judges the duration of the level corresponding to the touch action and, when the duration is less than a preset time threshold, recognizes a click touch instruction, and then extracts the position features of the signal to determine the touch position. When the touch position is within the display range of an application icon, it is determined that the user input a click touch instruction at the application icon position; since in the current scenario the click instruction is used to run the corresponding application, the controller 250 can start that application.
As another example, when the user inputs a slide action on the media asset display page, the touch component likewise sends the sensed electrical signal to the controller 250. The controller 250 first judges the duration of the signal corresponding to the touch action; when the duration is greater than the preset time threshold, it judges the position changes of the signal (for an interactive touch action, the position where the signal is generated changes), from which it determines that the user input a slide touch instruction. The controller 250 then judges the slide direction according to the changes in the signal position and controls page-turning of the displayed picture on the media asset display page to display more media asset options. Further, the controller 250 can extract features such as slide speed and slide distance and control the page-turning according to the extracted features to achieve a follow-the-hand effect.
Similarly, for touch instructions such as double-click and long-press, the controller 250 can extract different features, determine the type of the touch instruction through feature judgment, and execute the corresponding control function according to preset interaction rules. In some embodiments, the touch component also supports multi-touch, so the user can input touch actions on the touch screen with multiple fingers, such as multi-finger click, multi-finger long-press, and multi-finger slide.
The above touch actions can also cooperate with specific applications to realize specific functions. For example, after the user opens the drawing board application, the display 260 can present the drawing area; the user can draw a specific touch-action track in the drawing area through slide touch instructions, and the controller 250 determines the touch-action pattern through the touch actions detected by the touch component and controls the display 260 to display it in real time to achieve the demonstration effect. For example, rotating a displayed picture by rotating fingers on the display is a basic function of a touch display device. In the current interaction mode, after multiple fingers rotate on the screen, the picture immediately rotates in the direction of the finger rotation to a horizontal or vertical angle; there is no interactive process, and the user experience is poor.
In some embodiments, the display device 200 includes a display for displaying a coloring picture (i.e., the target picture).
In some embodiments, the control device 100 may be a stylus: with a stylus such as a capacitive pen, the user can input user instructions by tapping the touch screen, completing the filling or erasing of colors on the picture on the display device 200. The control device 100 may also be a mouse: pressing the mouse and moving it completes the filling or erasing of colors on the picture on the display device 200.
In some embodiments, the control device 100 includes a control unit for receiving the movement track presented by the change of position over time caused by a human hand sliding on the display, and generating the corresponding control instruction according to the movement track, completing the filling or erasing of colors on the picture on the display device 200.
In some embodiments, a smart device 300 (such as a mobile terminal, tablet, computer, or laptop) may also be used to control the display device 200, for example using an application running on the smart device to complete color filling on the picture on the display device 200.
In some embodiments, the controller 250 controls the operation of the display device 200 and responds to user operations related to the display 260 by running various software control programs (such as an operating system and/or various applications) stored in the memory. For example, it controls the presentation of a user interface on the display, the user interface including several UI objects; in response to a received user command on a UI object of the user interface, the controller 250 can execute the operation related to the object selected by the user command.
In some embodiments, multiple applications may be installed on the display device. Fig. 6 shows a user interface on the display according to some embodiments. The user can click "My Applications" in the user interface to trigger the display of an application list, which includes all applications installed on the display device.
The display device provided by this application has a drawing board function. The drawing board function can be implemented based on an application related to the drawing board function installed on the display device. For convenience of description, the application related to the drawing board function installed on the display device is called the "drawing board application". For example, the drawing board application may provide pictures in JPG, PNG, or SVG format; the user can color the pictures and erase the color of already colored areas.
Fig. 7 shows a schematic diagram of an application list according to some embodiments. As shown in Fig. 7, the display device has an image processing application, a player application, a video chat application, a camera application, an online shopping application, and a game application installed. When the application list is displayed, the user can select one of the applications and open it to use its functions. It should be noted that the installed applications may be system applications or third-party applications.
After the display device starts the drawing board application, the electronic drawing board interface is presented on the display. The electronic drawing board interface displays user interface objects, information, and/or areas where content can be input corresponding to one or more functions of the drawing board application. The aforementioned user interface objects are the objects that constitute the electronic drawing board interface, and may include, but are not limited to, text, images, icons, soft keys (or "virtual buttons"), drop-down menus, radio buttons, checkboxes, and selectable lists. The displayed user interface objects may include non-interactive objects for conveying information or forming the appearance of the user interface, interactive objects available for user interaction, or a combination of non-interactive and interactive objects. The user contacts the touch screen at the position corresponding to the interactive object with which they wish to interact, thereby interacting with the user interface object. The display device detects the contact and responds to the detected contact by executing the operation corresponding to the interaction with the interactive object, thereby enabling drawing in the drawing board application.
In some embodiments, some or all of the steps involved in the embodiments of this application are implemented within the operating system and in an application. In some embodiments, the application used to implement some or all of the steps of the embodiments of this application is called the above "drawing board application"; it is stored in the memory, and the controller 250 controls the operation of the display device 200 and responds to user operations related to this application by running it in the operating system.
It is worth noting that the display devices involved in the embodiments of this application include, but are not limited to, the display device 200 introduced in the above embodiments; they may also be other terminal devices with image display, data processing, and information transceiving functions, such as portable mobile terminals like mobile phones and tablet computers. The following takes the display device as an example to describe the embodiments of this application in detail.
In the related art, during coloring the user is prone to the problem that coloring traces exceed the boundary of the area to be colored, which is detrimental to the user experience.
The present application provides a display device and a coloring method to solve the problem that coloring traces easily exceed the boundary of the area to be colored during coloring, and to improve the user experience.
Fig. 8 shows an exemplary electronic drawing board interface. As shown in Fig. 8, the electronic drawing board interface includes a drawing area and a control area. The drawing area is the area where content can be input, and the control area displays item entries corresponding to one or more functions of the drawing board application, for example an "Insert" item entry and a "Tools" item entry.
In some embodiments, the drawing area and the control area may be located at different positions in the same layer, or in different layers. Taking the case of different layers as an example, the drawing area may be part or all of a first layer located on the electronic drawing board interface, and the control area may be part or all of a second layer; the second layer may be superimposed on the first layer, or arranged side by side with it, so that the contents corresponding to the drawing area and the control area are displayed in the electronic drawing board interface.
In some embodiments, when the user clicks the "Insert" item, in response to the user-input instruction to display the target picture, the display device is triggered to display the picture option bar shown in Fig. 9. The picture option bar includes multiple insertable picture options, for example "Picture A", "Picture B", and "Picture C". The user clicks any picture option to insert the selected picture into the drawing board application and display it in the drawing area. In a specific implementation, in response to the user-input instruction to display the target picture, a third layer is generated and superimposed on the second layer; the third layer is used to draw the picture option bar. In response to the user's operation of selecting the target picture in the picture option bar, the controller controls closing the third layer and draws the target picture in the first layer so that the display shows the target picture, where the picture the user chooses to insert is the target picture.
In some embodiments, when the user clicks the "Tools" item, the display device is triggered to display the toolbar page shown in Fig. 10. The toolbar page includes a "Brush" tool, an "Eraser" tool, and so on. When the user clicks the "Brush" tool, the display device is triggered to display the brush toolbar shown in Fig. 11. Based on the brush toolbar, parameter information such as brush type, color, line thickness, and line style can be selected to generate the corresponding brush tool; the user can control the brush tool to input content in the drawing area, the input content being the user's touch track on the drawing area.
In some embodiments, the target picture may include at least one closed area, each closed area being enclosed by a closed area boundary, and the user can control the brush tool to perform coloring operations on the target picture.
In related applications, when the user controls the brush tool to color the target picture, no restriction is placed on the area boundaries of closed areas, so the content input by the user may exceed the area boundary of a closed area, affecting the user's subsequent drawing effect and reducing the user's experience. For example, Fig. 12 shows a schematic diagram of content input by the user in a related application exceeding the area boundary of a closed area of the target picture, where L1 is a line input by the user. The line starts at point A and passes through points B, C, D, and E in turn; points A and E are located in the closed area, points B and D are located on the area boundary of the closed area, and the B-C-D part of the line is the part exceeding the area boundary, which affects the drawing effect.
To solve the above technical problem and improve the user's experience, the display device provided by this application can restrict the area boundaries of the closed areas in the target picture, preventing the content input by the user from exceeding the area boundary of a closed area and affecting the user's subsequent drawing effect.
In some embodiments, the display device may be provided with a "coloring protection mode". When the display device enters the "coloring protection mode", the area boundaries of closed areas can be restricted so that the content input by the user does not exceed them; when the display device is not in the "coloring protection mode", no restriction is placed on the area boundaries of closed areas.
In some embodiments, the user can set the "coloring protection mode" by interacting with an interactive object in the user interface. Illustratively, the aforementioned interactive object may be a "coloring protection mode" item displayed on the electronic drawing board interface. Referring to Fig. 13, the electronic drawing board interface is provided with a "coloring protection mode" item; when the user selects the item, the controller controls the display device to enter the "coloring protection mode", and when the user selects the item again, the controller can control the display device to exit the mode.
In some embodiments, the user can send an instruction to enter the "coloring protection mode" to the display device by operating a designated button on the remote control. Illustratively, the correspondence between the "coloring protection mode" instruction and a remote-control button is bound in advance. For example, a "coloring protection mode" button is provided on the remote control; when the user presses the button, the remote control sends the "coloring protection mode" instruction to the controller, and the controller controls the display device to enter the mode. When the user presses the button again, the controller can control the display device to exit the mode.
In some embodiments, the correspondence between the "coloring protection mode" instruction and multiple remote-control buttons may also be bound in advance; when the user presses the buttons bound to the instruction, the remote control issues the "coloring protection mode" instruction. In some embodiments, the buttons bound to the instruction are, in sequence, the direction keys (left, down, left, down); that is, only when the user presses the keys (left, down, left, down) in succession within a preset time does the remote control send the instruction to the controller. This binding method can prevent the "coloring protection mode" from being issued by the user's accidental operation. The embodiments of this application only provide several exemplary bindings between the instruction and buttons; the binding can be set according to the user's habits and is not unduly limited here.
In some embodiments, the user can use the sound collector of the display device, such as a microphone, to send the "coloring protection mode" instruction to the display device by voice input, so as to control the display device to enter the mode. The display device may be provided with an intelligent voice system; the intelligent voice system can recognize the user's speech to extract the content of the input instruction. The user can input a preset wake-up word through the microphone to start the intelligent voice system so that the controller can respond to the instructions the user inputs, and then input the "coloring protection mode" instruction within a certain time so that the display device enters the mode. For example, the user may say the wake-up phrase to start the intelligent voice system, and then say "enter coloring protection mode" to send the instruction to the display device.
In some embodiments, the user can also send the "coloring protection mode" instruction to the display device through a preset gesture. The display device can detect the user's behavior through an image collector, such as a camera. When the user makes the preset gesture, it can be considered that the user has sent the instruction to the display device. For example, it can be set that, when the user is detected drawing a "V", it is judged that the user input the instruction. The user can also send the instruction through a preset action; for example, it can be set that, when the user is detected raising the left foot and the right hand at the same time, it is judged that the user input the instruction.
In some embodiments, when the user controls the display device with a smart device, for example a mobile phone, the "coloring protection mode" instruction can also be sent to the display device. In practical applications, a control can be set in the phone through which the user chooses whether to enter the mode; when the user chooses to enter, the instruction is sent to the display device to control it to enter the "coloring protection mode".
In some embodiments, when the user controls the display device with a phone, a continuous-click instruction can be issued to the phone. A continuous-click instruction means that, within a preset period, the number of times the user clicks the same area of the phone's touch screen exceeds a preset threshold. For example, when the user clicks a certain area of the phone's touch screen three times in succession within one second, it counts as one continuous-click instruction. After receiving the continuous-click instruction, the phone can send the "coloring protection mode" instruction to the display device so that the controller controls the display device to enter the mode.
In some embodiments, when the user controls the display device with a phone, it can also be set that, when the touch pressure on a certain area of the phone's touch screen is detected to exceed a preset pressure threshold, the phone sends the "coloring protection mode" instruction to the display device.
In some embodiments, to prevent the user from accidentally triggering the "coloring protection mode", when the controller receives the instruction it can control the display to show a "coloring protection mode" confirmation interface, so that the user confirms there whether to control the display device to enter the mode. Fig. 14 shows a schematic diagram of the "coloring protection mode" confirmation information displayed on the display in some embodiments. The confirmation interface includes a "Yes" option entry and a "No" option entry. When the user selects "Yes", the controller controls the display device to enter the mode, so that when the user performs coloring operations on the target picture, each operation can only color part of the target picture and will not apply color to other areas; when the user selects "No", the display device does not enter the mode, and each operation can color any position in the target picture.
It should be understood that the above "coloring protection mode" is only an exemplary function/mode name defined for convenience of description; it represents a certain function possessed by the display device and does not limit the protection scope of this application.
In some embodiments, the target picture is stored in the memory in a standard image file format; for example, the target picture is stored in the memory in Bitmap format. The target picture includes several pixels; a pixel is the basic unit of image display, and can be regarded as a grid point that cannot be divided into smaller elements or units. In the target picture, several pixels of a specific color can enclose at least one closed area, and those pixels of the specific color are located at the area boundary position of the closed area. Specifically, the color of the pixels located at the area boundary position differs from the color of the pixels at other positions; for example, the pixels at the boundary position are black and the pixels at other positions are white.
In some embodiments, after receiving the user-input instruction to enable the "coloring protection mode", the controller controls scanning of every pixel of the target picture stored in the memory to obtain the position information and color information of each pixel. According to the color information of each pixel, the target picture is divided into several areas, each of which is a closed area enclosed by an area boundary formed by pixels of the specific color. Then, according to each pixel's position information and its closed area, all pixels of the target picture are divided into as many categories as there are closed areas, and the pixels of each category are stored in the memory separately. For example, if, according to the color information of each pixel, the target picture is divided into three closed areas (a first closed area, a second closed area, and a third closed area), the position information of the pixels stored in the memory that are located in the first, second, and third closed areas is obtained respectively, generating a corresponding first pixel set, second pixel set, and third pixel set, where the first pixel set includes the position information of all pixels in the first closed area, the second pixel set includes the position information of all pixels in the second closed area, and the third pixel set includes the position information of all pixels in the third closed area. It should be noted that a closed area is not only an image area enclosed by pixels at the area boundary position; for example, the closed areas of the target picture also include the background area enclosed by the boundary pixels and the edge of the target picture.
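The scan-and-classify step above (group every pixel into the closed area it belongs to) can be modeled as connected-component labeling over non-boundary pixels. The sketch below is a minimal illustration under assumed conditions: 4-connectivity, a dict-of-positions image representation, and a single boundary color; function and parameter names are mine, not the patent's.

```python
from collections import deque

def divide_into_regions(pixels, width, height, boundary_color="black"):
    """Group non-boundary pixels into closed areas by 4-connected
    component labeling; returns a list of pixel-position sets, one per
    closed area (the background counts as a closed area too)."""
    seen = set()
    regions = []
    for start in ((x, y) for y in range(height) for x in range(width)):
        if start in seen or pixels[start] == boundary_color:
            continue
        region, queue = set(), deque([start])
        seen.add(start)
        while queue:
            x, y = queue.popleft()
            region.add((x, y))
            for nx, ny in ((x + 1, y), (x - 1, y), (x, y + 1), (x, y - 1)):
                if (0 <= nx < width and 0 <= ny < height
                        and (nx, ny) not in seen
                        and pixels[(nx, ny)] != boundary_color):
                    seen.add((nx, ny))
                    queue.append((nx, ny))
        regions.append(region)
    return regions
```

Each returned set plays the role of one "pixel set" described above: membership of a drawing start point in a set identifies its closed area.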
In some embodiments, when the controller controls the scanning of the pixels of the target picture stored in the memory, the division of a closed area may be missed because the color information of the pixels in certain areas was not scanned. To avoid this problem, the number of scanned pixel color-information entries can be compared with the total number of pixels in the target picture; if the number of scanned entries is smaller than the total number of pixels, scanning of the target picture continues. When the number of scanned entries equals the total number of pixels, all areas of the target picture have been scanned, and every closed area in the target picture can be divided based on the scanned pixel color information.
In some embodiments, the user can control the brush tool to input content in the drawing area. More specifically, the content the user inputs by controlling the brush tool through a "pen-down"-"move"-"pen-up" sequence is a touch track; the touch track includes a drawing start point and a drawing end point, the drawing start point being the first pixel passed when the user inputs the track, and the drawing end point being the last pixel passed. The content the user inputs by controlling the brush tool through a "pen-down"-"pen-up" sequence is a single touch point, which is both the drawing start point and the drawing end point. The "pen-down" operation means the user clicks the brush tool so that it is in the selected state and controls it to input content in the drawing area; the "move" operation means the brush tool is in continuous motion while in the selected state; the "pen-up" operation means the brush tool is put into the unselected state, or the brush tool remains selected but the user stops controlling it.
In some embodiments, in response to the sliding operation from the drawing start point to the drawing end point input by the user in the drawing area, the sliding path between the drawing start point and the drawing end point is determined and the target closed area is obtained; the target closed area is the closed area of the target picture in which the pixel of the drawing start point is located, and it can be found by comparing the position information of the pixel of the drawing start point against each closed area in turn. For example, if the target picture includes a first closed area, a second closed area, and a third closed area, the pixel position information of the first closed area is obtained; if it includes the position information of the pixel of the drawing start point, the first closed area is judged to be the target closed area; if not, the pixel position information of the second closed area is obtained; if that includes the position information of the pixel of the drawing start point, the second closed area is judged to be the target closed area; otherwise, the third closed area is judged to be the target closed area.
In some embodiments, according to the position information of each pixel the sliding path passes through, it is judged whether the pixel the sliding path passes through is a pixel of the target closed area; if so, the pixel located in the target closed area that the sliding path passes through is color-filled according to the fill color value corresponding to the fill color indicated by the brush control.
In some embodiments, when the user's "pen-down" operation is detected (see Fig. 15), the controller color-fills the pixel corresponding to the pen-down point; the filled color can be selected in the brush toolbar, and if no fill color is selected, the pixel is filled with a default fill color. The pen-down area information is obtained, which includes the position information of all pixels in the closed area where the pen-down point is located. The operation event the user inputs after the "pen-down" operation is received and judged: if the event type is judged to be a "pen-up" event, this coloring ends; if the event type is judged to be a "move" event, the position information of the pixels the corresponding sliding path passes through is obtained, and from it the path pixels located in the closed area of the pen-down point are obtained and color-filled. For example, Fig. 16 gives an exemplary target picture that includes an image area and a background area, where the image area is the area enclosed by the outline of the "pony" image in the target picture. After the user inputs the "coloring protection mode" enable instruction, the controller scans the target picture and can obtain 18 closed areas including the background area; for example, the background area is the first closed area, and the body of the "pony" is the second closed area. Eight points are set on the target picture, denoted by the letters A, B, C, D, E, F, G, and H, where points A, B, C, G, and H are located in the second closed area, and points C and G are located on the area boundary. The user controls the brush tool, performs a "pen-down" operation at point A, and moves the brush tool through points B, C, D, E, F, G, and H in turn. The generated touch track is shown in Fig. 17: only the solid-line parts passing through A-B-C and G-H are displayed, while the dashed part passing through C-D-E-F-G is not displayed. That is, the pixels of the touch track on the A-B-C and G-H lines can be filled with color, while the pixels on the C-D-E-F-G line are not filled with color.
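The pen-down / move / pen-up flow above can be sketched as a small handler that locks onto the closed area under the pen-down point and fills only those path pixels inside it (so, in the pony example, the C-D-E-F-G segment is never colored). This is a hypothetical illustration; the class and method names, and the region-set data structure, are assumptions.

```python
class StrokeHandler:
    """Region-restricted coloring: the closed area containing the
    pen-down point becomes the target area, and only path pixels inside
    that area are filled."""

    def __init__(self, regions, fill_color):
        self.regions = regions      # list of sets of pixel positions
        self.fill_color = fill_color
        self.target = set()         # target closed area for this stroke
        self.filled = {}            # position -> color of filled pixels

    def pen_down(self, pos):
        # lock onto the closed area containing the start point
        self.target = next((r for r in self.regions if pos in r), set())
        if pos in self.target:
            self.filled[pos] = self.fill_color

    def move(self, pos):
        if pos in self.target:      # fill only inside the target area
            self.filled[pos] = self.fill_color

    def pen_up(self):
        self.target = set()         # stroke finished
```

A pen-down on a boundary pixel yields an empty target area, so subsequent moves fill nothing, which is one conservative way to handle that edge case.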
In some embodiments, the shortest distance between any two adjacent pixels the sliding path passes through is determined as a first distance, and the fill radius used when color-filling a pixel is determined as a second distance. If the first distance is greater than the second distance, a preset number of interpolated pixels are generated between the two adjacent pixels, where the preset number may be the ratio of the first distance to the second distance.
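The interpolation rule above can be sketched as follows: when two adjacent sampled points of the sliding path are farther apart than the fill radius, evenly spaced interpolated points are generated between them, their number being the integer distance-to-radius ratio. The function name and the exact even spacing are assumptions for illustration.

```python
def interpolate_track(p1, p2, fill_radius):
    """Generate interpolated points between two adjacent touch samples
    when their distance exceeds the brush fill radius, so the stroke has
    no gaps; the number of inserted points is the distance/radius ratio."""
    dx, dy = p2[0] - p1[0], p2[1] - p1[1]
    dist = (dx * dx + dy * dy) ** 0.5
    if dist <= fill_radius:
        return []  # samples already close enough; nothing to insert
    n = int(dist / fill_radius)  # preset number = first/second distance
    return [(p1[0] + dx * i / (n + 1), p1[1] + dy * i / (n + 1))
            for i in range(1, n + 1)]
```

Each interpolated point would then be subjected to the same in-area test as a sampled point: filled if inside the target closed area, eliminated otherwise.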
In some embodiments, a coordinate system can be established to express the position information of each pixel of the target picture in the form of coordinates. For example, the coordinate system can be established with the center position of the first pixel in the lower-left corner of the target picture as the origin, with the distance between the centers of two adjacent pixels regarded as one unit length; a pixel is expressed in the form (Xi, Yi), where Xi denotes a point i unit distances from the Y axis and Yi denotes a point i unit distances from the X axis, so that (Xi, Yi) denotes the pixel i unit distances from the Y axis and i unit distances from the X axis. The coordinate information of the pixels located on the area boundary is obtained, together with the coordinate range of the closed area enclosed by the boundary pixels. When the user performs a "pen-down" operation, the controller color-fills the pixel at the pen-down point and obtains the coordinate information of that pixel and the coordinate range of the closed area in which the pen-down point is located. The type of the operation event after the user's "pen-down" operation is judged: if the event type is judged to be a "pen-up" event, this coloring ends; if the event type is judged to be a "move" event, the coordinate information of the pixels the corresponding sliding track passes through is obtained. According to the coordinate range of the closed area of the pen-down point and the coordinates of each pixel the sliding track passes through, it is judged whether each track pixel is in the pen-down area; interpolated pixels belonging to the pen-down area are color-filled, and interpolated pixels not belonging to the pen-down area are eliminated.
Based on the above embodiments, referring to Fig. 18, this application also provides a coloring method, including:
S101: in response to a user-input instruction to display a target picture, displaying the target picture in the drawing area, the target picture including at least one closed area.
S102: in response to a sliding operation input by the user in the drawing area from a drawing start point to a drawing end point, determining the sliding path between the drawing start point and the drawing end point.
S103: controlling the sliding path located in the target closed area to be filled with a preset color, and controlling the sliding path located outside the target closed area not to be filled with color, where the target closed area is the closed area in which the drawing start point is located.
In some embodiments, the control area further includes a trigger button; in response to an operation on the trigger button, the display device is controlled to enter the coloring protection mode, where the coloring protection mode means that only the sliding path located in the target closed area is filled with the preset color.
In some embodiments, before controlling the sliding path located in the target closed area to be filled with the preset color and controlling the sliding path located outside the target closed area not to be filled with color, it is judged, according to the pixel position information of a first closed area, whether the pixel of the drawing start point is a pixel of the first closed area, where the first closed area is any closed area in the target picture; if the pixel of the drawing start point is a pixel of the first closed area, the first closed area is determined as the target area.
In some embodiments, before judging, according to the pixel position information of the first closed area, whether the pixel of the drawing start point is a pixel of the first closed area, the pixels of the target picture are traversed to obtain the position information of each pixel, and the position information of each pixel is classified and stored according to the closed area in which the pixel is located.
In some embodiments, according to the position information of each pixel the sliding path passes through, it is judged whether the pixel the sliding path passes through is a pixel of the target closed area; if so, the pixel the sliding path passes through is color-filled.
In some embodiments, in response to the user's trigger operation on the brush control, the fill color indicated by the brush control is determined, and the pixels the sliding path passes through are color-filled according to the fill color value corresponding to the fill color.
In some embodiments, the shortest distance between any two adjacent pixels the sliding path passes through is determined as a first distance, and the fill radius used when color-filling a first pixel is determined as a second distance; if the first distance is greater than the second distance, a preset number of interpolated pixels are generated between the two adjacent pixels.
In some embodiments, it is judged whether each interpolated pixel is located in the target closed area; interpolated pixels located in the target closed area are color-filled, and interpolated pixels located outside the target closed area are eliminated.
In some embodiments, the fill color value of the first pixel closest to an interpolated pixel is obtained, and the interpolated pixel is color-filled according to that fill color value.
From the above, this application provides a display device and a coloring method for displaying, upon receiving an input instruction to display a target picture, the target picture on the display device, the target picture including at least one closed area; in response to an input sliding operation from a drawing start point to a drawing end point, determining the sliding path between the drawing start point and the drawing end point; and controlling the sliding path located in the target closed area to be filled with a preset color while controlling the sliding path located outside the target closed area not to be filled with color, thereby solving the problem that coloring traces easily exceed the boundary of the coloring area during coloring, and improving the user experience.
相关技术在涂色过程中,由于涂色区域和涂色边界保存在同一张图片中,对已经完成的涂色进行擦除时,会将边界也同时擦除,这破坏了涂色图片的边界信息,会导致涂色时涂到区域以外,影响涂色功能和用户体验。
为解决上述问题,本申请实施方式还提供了另一种显示设备及显示方法,该方法解决了对已经完成的涂色进行擦除时,边界颜色同时被擦除的问题,可避免破坏涂色图片的边界信息,进而避免涂色时涂到涂色区域以外,从而提升用户操作显示设备的用户体验。
图19a为一些实施例中电子画板的示意图。如图19a所示,电子画板包括菜单区域、绘图区域和控件区域。其中,菜单区域用于提供用户导航菜单,具体包括“开始”控件、“插入”控件、“设计”控件、“视图”控件、“进程”控件和“工具”控件。用户可以通过触发不同的交互控件,实现不同的与图像处理有关的功能。绘图区域为可输入内容的区域。控件区域可以集中显示与图像处理应用的一个或者多个功能相对应的控件,例如画笔控件、擦除控件、颜色控件等,用户可以使用各个控件进行相应的操作,还可以设置每个控件的参数,例如设置画笔控件的颜色及线条样式等。当某个画笔控件被选中时,展示该画笔控件对应的工具栏,在工具栏中可以选择画笔颜色、粗细,工具栏中还显示有色板控件以及颜色吸取控件。在显示该电子画板界面时,用户通过点击画笔控件拾起画笔,在拾起画笔后,可以在对应的工具栏中选择现有的画笔颜色选项,或者点击色板控件触发显示色板,以在色板中选择画笔颜色,或者点击颜色吸取控件,触发进入到从绘图区域上拾取颜色的过程。用户在色板上选择的画笔颜色,或者从绘图区域上拾取的颜色,将被配置为画笔控件的输入颜色。在拾起画笔控件的状态下,用户可以基于与绘图区域接触输入内容,输入的内容即为用户在绘图区域上的接触轨迹。当用户选择擦除控件时,用户在绘图区域上的接触轨迹即为被擦除的内容。
图19b为根据一些实施例示出的另一种电子画板界面插入图片的示意图。如图19b所示,用户可以触发显示器上的“插入”控件而输入指示显示“插入”控件对应的下拉菜单的用户指令。控制器可以响应于该用户指令,在显示器上呈现如图19b所示的用户界面,该用户界面中显示有“插入”控件对应的下拉菜单,该下拉菜单包含多个项目,具体包括“插入图片”、“插入文本框”及“插入对象”。其中,用户可以通过操作“插入图片”,向应用中插入图片。具体的,当用户操作控制装置而输入指示选择项目的用户指令时,控制器可以响应于该用户指令,在显示器上呈现如图19b所示的用户界面,该用户界面中显示可插入的图片选项。之后,当用户操作控制装置而输入指示选择某个图片的用户指令时,控制器可以响应于该用户指令,将该图片显示在绘图区域。例如,当用户选择图19b中第一张图片插入后,该图片显示在绘图区域中。
在一些实施例中,在电子画板中,绘图区域展示的图片的格式可以为RGB、JPG、PNG或SVG格式中的任意一种。
在一些实施例中,插入图片后,用户可以对图片进行涂色。
图19c为一些实施例中又一种电子画板界面的示意图。如图19c所示,涂色图片(即目标图片)预置了可以进行涂色的图案(图案有云朵、恐龙以及草等等),图案包括至少一个线条以及以连续的线条围成的至少一个闭合区域。至少一个线条作为闭合区域的边界,与闭合区域内颜色显示不同,以便用户明显的识别边界和闭合区域。用户可以在闭合区域内进行涂色。闭合区域可以相邻设置,也可以不相邻设置。用户对图片进行涂色,即是对图片中的闭合区域进行涂色。通常,闭合区域的颜色为纯底色,例如:白色;涂色边界也就是闭合区域的边界线条,边界线条围绕闭合区域设置。通常,涂色边界的颜色为纯边界色,例如:黑色。涂色时,用户可点击颜色控件,选择需要的颜色,然后点击画笔控件,选择涂色区域,完成涂色操作;擦除颜色时,选择擦除控件,选择需要擦除的区域,完成擦除操作。
在一些实施例中,在对涂色图片进行涂色的过程中,闭合区域中已经完成涂色的部分,称为已涂色区域,将已涂色区域上像素点的配置信息与边界的配置信息保存在同一个文件中,以使已涂色区域和边界展示在同一图片中,其中,配置信息包括颜色和位置。这样,当选择擦除的区域包括边界或包括边界的一部分时,由于边界也在进行擦除操作的同一个文件中,包含在“选择擦除的区域”中的边界也会被擦除。而边界是涂色图片中涂色的界限限制,边界被删除后,会破坏该边界围成的涂色区域的界限信息,当再次选择前述涂色区域进行涂色时,会导致颜色涂到涂色区域以外,如图19d。
下面结合附图说明本申请提供的显示设备,该显示设备包括显示器,显示器用于展示涂色界面;该显示设备的控制器可被配置执行下述显示方法,图20a示出了根据一些实施例的显示方法的流程示意图。如图20a所示,显示方法包括:S191,获取用户在触摸屏上选中像素点形成的移动轨迹;S192,确认移动轨迹对应的目标擦除区域;S193,判断目标擦除区域中每一个像素点的位置坐标是否为边界的位置坐标;S194,若像素点的位置坐标不是边界的位置坐标,则删除像素点;S195,若像素点的位置坐标是边界的位置坐标,则保留该像素点。
在一种实现方式中,获取用户在触摸屏上选中像素点形成的移动轨迹包括:响应于用户在触摸屏上输入的用于擦除涂色区域颜色的指令,例如,用户选中电子画板上的擦除控件,此时,涂色区域的像素点均处于待选中状态,用户通过在触摸屏上移动选中像素点,前述选中的像素点集合为移动轨迹。
在一种实现方式中,用户还可以通过特定的动作选中电子画板上的擦除控件。对于擦除控件的选中方式,本申请不作限定。
图20b示出了根据一些实施例的显示方法的流程示意图。如图20b所示,在某一具体的实现方式中,显示方法包括:
S201,获取涂色图片上目标落点的像素点位置信息。
如图21所示,涂色图片预置了可以进行涂色的图案(图案有云朵、恐龙以及草等等),图案包括至少一个线条以及以线条围成的至少一个闭合区域。如包括线条41以及以线条41围成的闭合区域61,线条42和线条41围成的闭合区域62,还包括线条43围成的闭合区域63。连续的线条41和线条42围成的闭合区域61和闭合区域62的图案为云朵。用户可以在闭合区域内进行涂色。其他组成图案的线条和线条围成的涂色区域与上述构成相同,不做赘述。
可以理解的是,用户可以通过在电子画板上对涂色图片进行涂色操作,实现涂色区域61、涂色区域62和涂色区域63等的颜色填充或擦除。并且颜色填充或擦除只能在闭合区域中进行。
目标落点,是指由人手、控制设备和/或智能设备在显示设备200上的涂色图片上滑动,而产生位置随时间的变化所呈现的移动轨迹。其中,控制设备和/或智能设备在显示设备200上的涂色图片上滑动预先关联对应的控制指令,本申请不作限定。
本申请提供一种获取手指在涂色图片上目标落点的像素点位置信息的方法。在某一具体的实现方式中,以手指为例,在涂色图片上落下和移动时,获取手指在涂色图片上目标落点的像素点位置信息的方法包括:读取触屏事件类型,触屏事件类型包括:手指落下事件、手指移动事件以及手指抬起事件;当为手指落下事件时,获取手指落下的像素点位置;当为手指移动事件时,获取手指移动经过的像素点位置;当为手指抬起事件时,结束操作。
在某一具体的实现方式中,图22示出了根据一些实施例的目标落点的示意图。如图22所示,当人手、控制设备和/或智能设备在显示设备200上的涂色图片上从A点滑动到B点时,形成轨迹“线段AB”。S201中目标落点的像素点,即是指轨迹“线段AB”上所有点对应的像素点,所有像素点的位置信息(即坐标(xi,yi))的集合,即是本步骤中要获取的涂色图片上目标落点的像素点位置信息。
在某一具体的实现方式中,目标落点可以决定擦除的位置,擦除的范围可以由预设的擦除宽度r决定。也就是说,若预设擦除宽度为r,则离目标落点(xi,yi)的距离小于等于r的所有像素点均被擦除。此时,目标像素点还包括集合S中的像素点,集合S为离目标落点的距离小于等于预设擦除宽度r的所有像素点的集合。
图23示出了根据一些实施例的预设擦除宽度r的擦除范围示意图。如图23所示,目标落点为轨迹“线段AB”时,擦除的范围是“以“线段AB”每一点为圆心,以擦除宽度r为半径,构成的所有圆”所形成的宽度为2r的带状区域,即图示区域621以及区域631。也就是说,当预设擦除宽度为r时,不光要获取目标落点的像素点的位置信息(即坐标(xi,yi))的集合,同时,要获取离目标落点的像素点的距离小于等于r的所有像素点的位置信息(即坐标(xi,yi))的集合。
S202,若像素点为边界位置,则保留像素点;保留的像素点的颜色应为边界的初始颜色。这样,多次涂色擦除操作后,可以保证边界的颜色一致,进而保证涂色区域的界限限制不被破坏,保证画板功能的正常使用。
图24示出了根据一些实施例的涂色图片被涂色的示意图。如图24所示,同时,参见图21,涂色区域61、涂色区域62和涂色区域63已经被涂色(涂色颜色为黑色)。
图25示出了根据一些实施例的擦除轨迹的目标落点的示意图。如图25所示,擦除轨迹为轨迹“线段AB”,“线段AB”经过涂色区域62和涂色区域63。
在某一具体的实现方式中,当预设的擦除宽度为r时,擦除轨迹为轨迹“线段AB”,擦除范围为:以“线段AB”每一点为圆心、以擦除宽度r为半径构成的所有圆所形成的宽度为2r的带状区域,与涂色区域62和涂色区域63的相交区域。可以理解的是,其他位置的像素点的颜色被保留。
在某一具体的实现方式中,边界位置是指边界,即如图21所示的连续的线条41、线条42以及线条43等其他围成涂色区域的线条。图26示出了根据一些实施例的擦除颜色后的涂色图片的示意图。如图26所示,当擦除轨迹为轨迹“线段AB”时,前述带状区域,与闭合区域62和闭合区域63的相交区域被擦除,分别为擦除区域621和擦除区域631。线条42与带状区域的交集,即线段421和线段422被保留;线条43与带状区域的交集,即线段431以及线段432被保留。其中,边界位置是指线条42以及线条43等其他围成涂色区域的线条,保留像素点是指,保留线段421、线段422、线段431以及线段432。
本申请提供一种保留线段421、线段422、线段431以及线段432的方法。在某一具体的实现方式中,根据预设的擦除宽度r,获取离目标落点的像素点的距离小于等于r的所有像素点(xi,yi),记作集合S。遍历集合S中的每一个像素点(xi,yi),判断像素点(xi,yi)是否为边界,若为边界,则保留边界。
S203,若像素点不是边界位置,则删除像素点;
在某一具体的实现方式中,如图26所示,当擦除轨迹为轨迹“线段AB”时,前述带状区域,与涂色区域62和涂色区域63的相交区域被擦除,分别为擦除区域621和擦除区域631。可以理解的是,边界位置信息,即像素点(xi,yi)位置坐标,可以预先保存在控制设备中,用以当有擦除的指令时,直接调取边界位置信息作为边界位置。
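结合前述预设擦除宽度r与边界保留的描述,擦除过程可示意如下。示例中以字典表示画布、以集合表示边界位置,这些数据结构均为示意性假设,并非本申请限定的实现方式:

```python
import math

def erase_with_boundary(track, r, boundary, canvas):
    """擦除离轨迹上各目标落点距离不大于 r 的像素,
    但位于边界集合 boundary 中的像素被保留。
    canvas 为 {(x, y): 颜色} 字典,擦除即删除对应键。"""
    rr = int(math.ceil(r))
    for tx, ty in track:
        for dx in range(-rr, rr + 1):
            for dy in range(-rr, rr + 1):
                if dx * dx + dy * dy > r * r:
                    continue            # 不在半径 r 的圆内
                p = (tx + dx, ty + dy)
                if p in boundary:
                    continue            # 边界像素保留
                canvas.pop(p, None)     # 非边界像素被擦除
    return canvas
```

即使擦除范围覆盖了边界线条,边界像素也不会从画布中删除,涂色区域的界限限制因此不被破坏。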
本申请一些实施例中,涂色区域包括已涂色区域;控制器进一步被配置为:将已涂色区域上像素点的配置信息与边界的配置信息保存在同一个文件中,以使已涂色区域和边界展示在同一图片中;若像素点的位置坐标不是边界的位置坐标,则删除像素点;包括:若像素点的位置坐标不是边界的位置坐标,则在同一个文件上删除像素点的配置信息。
本申请一些实施例中,若像素点的位置坐标是边界的位置坐标,则保留该像素点;包括:若像素点是边界位置,则在同一个文件上保留边界的配置信息。
本申请一些实施例中,确认移动轨迹对应的目标擦除区域;包括:确认第一相交轨迹,第一相交轨迹为移动轨迹与已涂色区域相交的像素点集合;根据第一相交轨迹,确认目标擦除区域。
本申请一些实施例中,涂色区域还包括未涂色区域;确认移动轨迹对应的目标擦除区域;包括:确认第二相交轨迹,第二相交轨迹为移动轨迹与未涂色区域相交的像素点集合;根据移动轨迹上除第二相交轨迹以外的像素点,确认目标擦除区域。
下面结合附图说明涂色图片的边界配置信息的识别和保存方法。
在一些实施例中,边界位置的初始颜色与涂色图片上除边界位置以外的区域的颜色不同;获取边界位置的初始颜色的步骤中,边界的初始颜色为边界颜色;获取涂色图片上的目标像素点的步骤中,控制器进一步被配置为:保存涂色图片;设置涂色图片的边界颜色;从涂色图片的第一个像素点开始,遍历读取涂色图片的每一个像素点的颜色;若像素点的颜色为边界颜色,则将像素点的位置坐标作为边界的位置坐标保存至存储器。
在一些实施例中,边界的颜色为第一颜色,涂色区域未涂色时的颜色为第二颜色,第一颜色与第二颜色不同;若像素点的颜色为边界的颜色,则将像素点的位置坐标作为边界的位置坐标保存至存储器;包括:判断像素点的颜色是否为第一颜色;若像素点的颜色为第一颜色,则将像素点的位置坐标作为边界的位置坐标保存至存储器。
如图21所示,涂色图片预置了可以进行涂色的图案,图案以连续的线条围成多个可以进行涂色的涂色区域。其中,连续的线条作为多个涂色区域的边界(为方便描述,下文中使用“边界”代替“围成多个涂色区域的连续的线条”),为使用户识别出边界和涂色区域,边界的线条颜色明显区别于涂色区域的初始设置颜色。比如,图21所示的涂色图片边界使用黑色线条,涂色区域的初始设置颜色为白色。
本申请提供的涂色图片的边界位置信息的识别方法,利用边界颜色明显不同于涂色区域的初始设置颜色的特性,通过区分颜色的方法,来区分边界和涂色区域,进而将边界的像素点的位置信息集合作为边界位置保存。图27示出了根据一些实施例的涂色图片的边界位置信息的识别方法的流程示意图。如图27所示。
S001,判断像素点(xi,yi)的颜色是否为边界颜色。
在一些实施例中,从涂色图片的第一个像素(x0,y0)位置开始,遍历涂色图片的每一个像素点(xi,yi)的颜色,判断该像素点(xi,yi)的颜色是否为边界颜色。
在一些实施例中,在判断像素点(xi,yi)的颜色之前,可以将涂色图片加载到内存,以bitmap格式保存,并设置边界颜色,从而实现判断步骤。可以理解的是,边界颜色可以是单一颜色,也可以是多种颜色,本申请不做限定。
S002,如果像素点(xi,yi)是边界颜色,将此像素点(xi,yi)保存。
在一些实施例中,若某一像素点(xi,yi)是边界颜色,将此像素点(xi,yi)保存在集合A中。遍历涂色图片的每一个像素点(xi,yi)的颜色后,所有集合A中元素均作为边界位置。
可以理解的是,若像素点(xi,yi)不是边界颜色,则认为像素点(xi,yi)为涂色区域,或为涂色图片上涂色图案以外的留白区域。涂色区域和留白区域的颜色可以相同,也可以不同,但都应该明显不同于边界颜色。
通过上述实施例中的方法以颜色区分像素点,将颜色的不同转化为拥有不同特征(指边界和非边界)的像素点,通过将边界的像素点的集合作为边界位置保存,在判断目标像素点是否是边界位置时,控制器响应于擦除颜色的控制指令,进一步被配置为:遍历涂色图片上的所有像素点,用以判断像素点的颜色是否为边界颜色;若像素点的颜色是边界颜色,则像素点确定为目标像素点。
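上述“遍历每一个像素点、按颜色判断是否为边界并将其位置保存为集合A”的识别方法,可示意为如下最小示例(以二维颜色数组表示图片,属示意性假设):

```python
def find_boundary(img, boundary_color):
    """遍历涂色图片的每一个像素点,
    颜色等于边界颜色的像素点位置坐标被保存为边界位置集合。"""
    positions = set()
    for y, row in enumerate(img):
        for x, color in enumerate(row):
            if color == boundary_color:
                positions.add((x, y))
    return positions
```

得到的边界位置集合即可预先保存,在收到擦除指令时直接调取,用以区分边界与涂色区域。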
在一些情境中,选择需要擦除的区域之前,涂色图片已经完成一部分涂色区域的涂色,涂色图片包括已涂色区域和未涂色区域;需要擦除的区域可能同时经过已涂色区域和未涂色区域,未涂色区域无需擦除颜色。
为了更高效的完成擦除动作,本申请提供的显示方法还包括:已涂色区域为闭合区域;擦除目标像素点的颜色的步骤中,控制器进一步被配置为:确认第一相交像素点,第一相交像素点为目标落点与已涂色区域的边界相交的像素点;根据第一相交像素点,确认目标落点上与已涂色区域重合的像素点,用以作为目标像素点;涂色区域还包括未涂色区域;未涂色区域为闭合区域;控制器进一步被配置为:确认第二相交像素点,第二相交像素点为目标落点与未涂色区域的边界相交的像素点;根据第二相交像素点,删去目标落点上与未涂色区域重合的像素点,用以将目标落点上剩余像素点作为目标像素点。
通过本申请提供的上述方法,能够通过扫描涂色区域,记录并保存边界位置信息,将颜色的不同转化为拥有不同特征(指边界和非边界)的像素点,通过将边界的像素点的集合作为边界位置保存,用以在擦除用户在涂色区域涂上的颜色时,区分边界和涂色区域,进而实现了在涂色过程中,对涂色进行擦除时只能擦除用户的涂色,保留涂色边界不被擦除。涂色图片的封闭区域和边界信息不会被破坏,保证涂色功能继续可用。
相关技术中,显示设备上可以安装图像处理应用,图像处理应用可以提供不同类型的涂色图片,涂色图片包括闭合区域和边界区域,闭合区域为用户进行涂色的区域。边界区域为边界线条,边界线条围绕闭合区域设置。通常,闭合区域的颜色为白色,边界区域的颜色为黑色。用户可以对涂色图片中的闭合区域进行调整,比如,对闭合区域进行涂色。
其中,涂色图片均采用压缩格式保存,但由于涂色图片进行了压缩,所以在转换成位图文件后,闭合区域和区域边界之间会存在过渡区域,而过渡区域的颜色为灰度渐变。进而,当用户对涂色图片进行涂色时,由于闭合区域和边界区域之间的过渡区域存在颜色灰度,导致用户在填色时无法将闭合区域全部填满,以及在边界区域周围出现毛刺,从而影响涂色效果,降低用户的使用体验感。
本申请实施例还提供了一种显示设备及消除区域灰度的方法,用以解决用户在填色时无法将闭合区域全部填满,以及在边界区域周围出现毛刺的问题。
在一些实施例中,本申请实施例提供的上述显示设备的电子界面的示意图,如图19a所示,其对应的用户与显示设备的交互过程及用户界面变化,如图19a至19c所示,由于上述已对此进行了详细的阐述,此处不再赘述。
在一些实施例中,插入涂色图片后,用户可以对涂色图片进行涂色。在用户涂色的过程中需对涂色图片中的闭合区域进行涂色。其中,闭合区域的颜色为纯底色,例如:白色。边界区域也就是边界线条,边界线条围绕闭合区域设置。边界区域的颜色为纯边界色,例如:黑色。通常,闭合区域和边界区域之间还存在过渡区域,过渡区域的颜色常为灰度渐变。当用户对涂色图片进行涂色时,即会出现无法将闭合区域全部填满,以及在边界区域周围出现毛刺的情况。
为便于说明上述出现的情况,参见图28所示的用户界面,可以看出由于闭合区域和边界区域之间存在颜色灰度,严重影响用户的使用体验。为优化上述问题,本申请提供的显示设备可以在画图区域展示涂色图片之前,对涂色图片的过渡区域进行灰度消除操作,防止出现用户无法将闭合区域全部涂满颜色的情况。
本申请提供的显示设备包括显示器和触控组件。显示器被配置为显示电子画板界面,电子画板界面用于展示涂色图片;触控组件被配置为接收用户通过触控输入的指令,其中触控组件和显示器形成触摸屏。为消除涂色图片中闭合区域和边界区域之间的颜色灰度,将闭合区域和边界区域进行明确划分,本申请将根据灰度值对全部像素点对应的颜色进行设置,以消除区域颜色灰度。
以下针对本申请提供的一种显示设备及消除区域灰度方法进行具体阐述。
图29示出了根据一些实施例中一种消除区域灰度方法的流程示意图。参见图29,本申请实施例提供的一种显示设备,其配置的控制器在执行消除区域灰度方法时,被配置为执行下述步骤:
S291、获取涂色图片,并将涂色图片进行区域划分,以形成至少一个闭合区域以及包围闭合区域的边界区域。
示例性的,涂色图片可以通过上述插入方式插入到电子画板界面中。在插入涂色图片的过程中,需对涂色图片进行区域划分,形成至少一个闭合区域和包围闭合区域的边界区域,进而消除灰度。同时,在图像处理应用中或者显示设备的本地存储器中可以预存有涂色图片集合,涂色图片集合中包括若干个供用户选择的图片,也就是说,涂色图片选自前述图片集合。控制器可以预先对该图片集合中的图片进行统一的区域划分处理,这样,在用户输入插入涂色图片的指令之前,涂色图片集合中所有备选的图片都是完成区域划分的涂色图片。当某个涂色图片被用户选中后,可直接显示在电子画板界面中的画图区域,以便于用户进行后续的操作。需要说明的是,本申请不对进行区域划分的时间以及划分涂色图片的数量进行限定,也就是说涂色图片可以批量/每张进行区域划分。同时,涂色图片可以在被用户选中前完成区域划分,也可以当被用户选中后自动进行区域划分。
图30示出了根据一些实施例中划分闭合区域和边界区域的流程示意图。参见图30,将涂色图片进行区域划分,以形成至少一个闭合区域以及包围闭合区域的边界区域的过程中,控制器进一步被配置为执行下述步骤:
S301、将涂色图片中全部像素点进行遍历,计算每个像素点对应的灰度值。
具体实现时,将涂色图片加载至内存中,并创建涂色图片对应的位图文件(Bitmap),位图文件用于获取涂色图片中的全部像素点。进一步,遍历全部像素点。由于涂色图片是由一定数量的像素点组成的,需利用预设程序对涂色图片中的全部像素点进行遍历,以确定每个像素点对应的原始RGB分量。利用预设灰度值公式对每个像素点对应的原始RGB分量进行计算,得到每个像素点对应的灰度值。
在一些实施例中,预设灰度值公式如下:
Gray=R×0.30+G×0.59+B×0.11;
其中,Gray为灰度值;R为红色分量值;G为绿色分量值;B为蓝色分量值。
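按常用的加权灰度公式 Gray=R×0.30+G×0.59+B×0.11(系数为图像处理中的常见取值)计算像素灰度值的过程,可示意如下:

```python
def gray_value(r, g, b):
    """按加权公式计算像素灰度值:Gray = R*0.30 + G*0.59 + B*0.11。
    r、g、b 为像素点的原始RGB分量,取值 0~255。"""
    return r * 0.30 + g * 0.59 + b * 0.11
```

三个系数之和为1,因此纯白像素(255,255,255)的灰度值为255,纯黑像素为0。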
S302、基于灰度值确定全部像素点的像素颜色,以划分涂色图片中的闭合区域和边界区域。
在一些实施例中,由于涂色图片全部像素点的像素颜色无需全部调整,即涂色图片中白色和黑色的区域无需调整。仅需对部分像素点的像素颜色进行调整。也就是说,需对闭合区域和边界区域之间存在的灰度渐变进行调整。进而,在确定全部像素点的像素颜色的过程中,对每个像素点的像素颜色进行重新确定的同时,对部分像素点的像素颜色进行调整,以完成对涂色图片中闭合区域和边界区域的划分。
具体实现时,对全部像素点对应的灰度值进行判断。判断每个像素点对应的灰度值与预设灰度阈值的大小,预设灰度阈值用于划分闭合区域和边界区域。需要说明的是,预设灰度阈值本申请并不进行具体限定,可根据实际情况自行设计。
进一步地,在灰度值大于预设灰度阈值的情况下,将灰度值对应像素点的颜色设置为黑色,以使颜色为黑色的像素点构成边界区域。在灰度值小于预设灰度阈值的情况下,将灰度值对应像素点的颜色设置为白色,以使颜色为白色的像素点构成闭合区域。
进而,根据判断结果确定全部像素点的颜色,以划分涂色图片的闭合区域和边界区域。这样,当边界区域和闭合区域之间存在过渡区域时,可以将过渡区域重新调整为边界区域的黑色、或者为闭合区域的白色或者一部分为黑色一部分为白色,从而消除过渡区域即颜色灰度,使边界区域和闭合区域颜色分明。
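上述按预设灰度阈值将像素置黑或置白、消除过渡灰度的划分过程可示意如下。示例中阈值的具体取值属假设;“灰度值大于阈值置黑、否则置白”按本段描述写出:

```python
def binarize(gray_img, threshold):
    """灰度值大于阈值的像素置为黑色(构成边界区域),
    否则置为白色(构成闭合区域),从而消除过渡区域的颜色灰度。"""
    BLACK, WHITE = (0, 0, 0), (255, 255, 255)
    return [[BLACK if g > threshold else WHITE for g in row]
            for row in gray_img]
```

处理后图片中只剩下黑白两种颜色,边界区域和闭合区域颜色分明。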
示例性的,经过完成对涂色图片的区域划分后,即不存在颜色灰度的情况下,在显示器上呈现如图31所示的电子画板界面,该电子画板界面中显示的是用户涂色后的涂色图片。可见,用户在对任意一个闭合区域进行涂色操作后,该闭合区域的周围不会出现毛刺的情况,且该闭合区域被用户全部涂满,从而提高涂色图片的美观性和用户的体验感。
S292、控制显示器显示完成区域划分的涂色图片。
当对涂色图片完成区域划分后,控制器控制显示器呈现出划分完成的涂色图片,并在电子画板界面中的画图区域展示涂色图片。以便于后续用户对闭合区域进行涂色。
S293、响应于输入的对闭合区域的涂色操作,以使闭合区域的颜色为用户选择的颜色。
在一些实施例中,电子画板界面还包括控件区域,控件区域包括至少一个画笔控件,画笔控件用于在画图区域中输入内容。
用户可以使用控件区域中的各种控件对画图区域中的涂色图片进行其他操作,例如,使用画笔控件在涂色图片的闭合区域中输入线条,图形等。具体的,用户可以选中画笔控件,画笔控件处于拾取状态,此时用户可以控制画笔控件在涂色图片中进行相应的操作。当检测到画笔控件被选中时,控制器还可以控制显示器显示该画笔控件对应的工具栏。图32示出了根据一些实施例中控件区域的界面示意图。在工具栏中可以选择画笔的颜色、线条粗细以及线条样式等参数信息。当用户选择好画笔控件的各项参数信息后,可以基于与画图区域触控以输入内容,输入的内容即为用户在画图区域上的触控轨迹。
在一些实施例中,用户可以通过画笔控件对应的工具栏选取画笔控件的颜色。当用户点击工具栏中的颜色选项时,控制器可以控制显示器显示画笔控件颜色情况。画笔控件颜色情况中包括画笔控件可以使用的所有颜色以及画笔控件的当前颜色。图33示出了根据一些实施例中控件区域中包括颜色吸取控件的界面示意图。用户可以查看画笔控件当前正在使用的颜色。用户还可以自行设定画笔控件的颜色。
其中,颜色吸取控件3301至少用于指示出涂色位置对应的吸取颜色,吸取颜色为画图区域中涂色位置处的颜色。当用户对颜色吸取控件进行选中操作后,例如选中“颜色1”,此时画笔控件的颜色为“颜色1”,即涂色位置的颜色为“颜色1”。控制器响应于输入的确认涂色操作,根据颜色吸取控件指示的涂色位置对应的吸取颜色,完成对闭合区域的涂色。
在一些实施例中,在显示颜色吸取控件时,还包括接收输入的吸取颜色变更操作。根据变更操作确定新的吸取颜色,并根据新的吸取颜色对颜色吸取控件指示的吸取颜色进行更新,以完成对闭合区域的涂色。
在一些实施例中,变更操作包括与触摸屏的接触操作。例如,变更操作包括在触摸屏上执行手势,即与触摸屏接触的对象/配件的运动。但是应理解的是,该接触也可以是使用任何适当的物件或配件来执行,例如电容笔等。接触可以包括,在触摸屏上发生的一下或者多下点击、保持与触摸屏持续接触、在保持持续接触的同时移动接触点、中断接触、或前述一项或者多项的组合。在一些实施例中,颜色吸取控件是一个交互式用户界面对象,用户可以与之交互以改变吸取颜色,最终选择出最新的吸取颜色。
由以上实施例可以看出,基于本申请实施例提供的显示设备,通过对涂色图片的区域划分,将涂色图片的区域划分为闭合区域和边界区域。同时在划分的过程中消除闭合区域和边界区域之间存在的颜色灰度,使闭合区域和边界区域的颜色分明。因此,用户可以将闭合区域全部填充满颜色,避免了闭合区域周围出现毛刺的情况,从而提高用户的体验感。
以上UI是以显示设备为例,其他类别的显示设备体现在电子画板界面对图片涂色方面的UI基本与上述UI相似,这里不再一一列举。
本申请实施例还提供一种消除区域灰度的方法,应用于显示设备,该方法具体包括以下步骤:获取涂色图片,并将涂色图片进行区域划分,以形成至少一个闭合区域以及包围闭合区域的边界区域。控制显示器显示完成区域划分的涂色图片。响应于输入的对闭合区域的涂色操作,以使闭合区域的颜色为用户选择的颜色。
在一些实施例中,在显示电子画板界面时,方法还包括:将涂色图片中全部像素点进行遍历,计算每个像素点对应的灰度值。基于灰度值确定全部像素点的像素颜色,以划分涂色图片中的闭合区域和边界区域。
在一些实施例中,基于灰度值确定全部像素点的像素颜色,以划分涂色图片中的闭合区域和边界区域,方法还包括:判断灰度值与预设灰度阈值的大小,预设灰度阈值用于划分闭合区域和边界区域。在灰度值大于预设灰度阈值的情况下,将灰度值对应像素点的颜色设置为黑色,以使颜色为黑色的像素点构成边界区域。在灰度值小于预设灰度阈值的情况下,将灰度值对应像素点的颜色设置为白色,以使颜色为白色的像素点构成闭合区域。
在一些实施例中,将涂色图片中全部像素点进行遍历,计算每个像素点对应的灰度值,方法还包括:将涂色图片加载至内存中,并创建涂色图片对应的位图文件,位图文件用于获取涂色图片中的全部像素点。遍历全部像素点,确定每个像素点对应的原始RGB分量。利用预设灰度值公式对原始RGB分量进行计算,得到每个像素点对应的灰度值。
在一些实施例中,预设灰度值公式如下:
Gray=R×0.30+G×0.59+B×0.11;
其中,Gray为灰度值;R为红色分量值;G为绿色分量值;B为蓝色分量值。
在一些实施例中,电子画板界面还包括控件区域,控件区域包括至少一个画笔控件,画笔控件用于在画图区域中输入内容;响应于输入的对闭合区域的涂色操作,方法还包括:响应于输入的触发涂色操作,在画图区域上显示颜色吸取控件,颜色吸取控件至少用于指示出涂色位置对应的吸取颜色。响应于输入的确认涂色操作,根据颜色吸取控件指示的涂色位置对应的吸取颜色,以完成对闭合区域的涂色。
在一些实施例中,方法还包括:在显示颜色吸取控件时,接收输入的吸取颜色变更操作。根据变更操作确定新的吸取颜色,并根据新的吸取颜色对颜色吸取控件指示的吸取颜色进行更新,以完成对闭合区域的涂色。
在一些实施例中,涂色图片的格式为RGB、JPG、PNG或SVG格式中的任意一种。
由以上内容可知,本申请提供的一种显示设备及消除区域灰度方法,通过将涂色图片中的区域划分为闭合区域和边界区域,消除闭合区域和边界区域之间的区域灰度。使闭合区域和边界区域的颜色分明,即闭合区域为白色,边界区域为黑色。在用户涂色时闭合区域可以填满,并且边界区域在填色后不存在毛刺。
相关技术中,当用户对图片进行涂色时,如果绘制的部分和图片边界的颜色相同时,这些部分有可能被认定为图片边界,导致原来的图片边界被破坏,从而影响涂色效果,导致用户的体验性较差。
本申请提供了一种显示设备和颜色设置方法,用以解决相关技术中,图片边界会被破坏,导致用户体验性较差的问题。
在一些实施例中,电子画板包括绘图区域和控件区域。其中,绘图区域为可输入内容的区域。控件区域可以集中显示与画板应用的一个或者多个功能相对应的控件,例如画笔控件、擦除控件、染色桶控件等,用户可以使用各个控件进行相应的操作,还可以设置每个控件的参数,例如设置画笔控件指示的填充颜色及线条样式等。图34示出了一些实施例中电子画板的示意图。
在一些实施例中,画图区域中还可以显示目标图片,用户可以对目标图片进行涂色等操作。图35示出了一些实施例中目标图片的示意图。目标图片中包括颜色填充区域和区域边界,用户可以对颜色填充区域进行涂色等操作。用户还可以使用控件区域中的各种控件对画图区域中的目标图片进行各种操作,例如:使用画笔控件在目标图片的颜色填充区域中输入线条,图形等;使用染色桶控件将颜色填充区域内填充颜色;或者使用擦除控件将用户输入的内容进行擦除等。
在一些实施例中,用户可以使用控件区域中的控件对目标图片进行各种操作。例如,用户可以触发画笔控件,画笔控件处于拾取状态,此时用户可以控制画笔控件在目标图片中进行相应的操作。当检测到画笔控件被选中时,控制器还可以控制显示器显示该画笔控件对应的工具栏。图36示出了一些实施例中画笔控件对应的工具栏的示意图。在工具栏中可以选择画笔的颜色、线条粗细以及线条样式等参数信息。
当用户选择好画笔控件的各项参数信息后,可以基于与画图区域触控以输入内容,输入的内容即为用户在画图区域上的触控轨迹。
在一些实施例中,当用户触发画笔控件,例如,点击画笔控件时,显示器中会显示画笔控件对应的工具栏。用户可以通过画笔控件对应的工具栏选取画笔控件指示的填充颜色。当用户点击工具栏中的颜色选项时,控制器可以控制显示器显示填充颜色情况。填充颜色情况中包括画笔控件可以指示的所有颜色以及画笔控件当前指示的填充颜色。图37示出了一些实施例中工具栏中的颜色选项的示意图。用户可以查看画笔控件当前正在指示的填充颜色。用户还可以自行设定画笔控件指示的填充颜色。
相关技术中,当用户选取的画笔控件指示的填充颜色和目标图片的区域边界的颜色完全相同时,用户输入的内容有可能被判定为目标图片的区域边界,导致目标图片原有的区域边界被破坏掉。因此会降低用户后续的绘制效果,严重影响用户体验性。图38示出了相关技术中用户输入的内容破坏掉区域边界的示意图。其中,L1为用户输入的线条,该线条被系统判定为区域边界,从而破坏掉目标图片原有的区域边界。
考虑到这一问题,本申请提供的显示设备可以对目标图片的区域边界进行保护,防止其被用户输入的内容破坏掉。
显示设备可以设置有区域边界保护模式。当显示设备处于区域边界保护模式时,可以令用户输入的内容不会被判定为区域边界,从而实现对目标图片的区域边界进行保护。
在一些实施例中,用户进入以及退出区域边界保护模式的方式,与上述进入以及退出“涂色保护模式”的方式相同,此处不再赘述。
还可以在显示设备的UI界面中设置区域边界保护模式选项,当用户点击该选项时,可以控制显示设备进入或退出区域边界保护模式。
在一些实施例中,为防止用户误触发区域边界保护模式,当控制器接收到区域边界保护模式指令时,可以控制显示器显示区域边界保护模式确认信息,从而使得用户进行二次确认,是否要控制显示设备进入区域边界保护模式。图39示出了一些实施例中显示器中显示区域边界保护模式确认信息的示意图。
图40示出了一些实施例中显示设备各部件的交互流程图。
在一些实施例中,当显示设备进入区域边界保护模式时,控制器可以首先检测用户将要输入的内容是否会破坏目标图片的区域边界。
在一些实施例中,在检测用户将要输入的内容是否会破坏目标图片的区域边界时,控制器可以检测用户选择的画笔控件指示的填充颜色。当确定用户选择的画笔控件指示的填充颜色后,控制器可以进一步检测填充颜色对应的颜色值,本申请实施例中命名为填充颜色值。
在确定填充颜色的填充颜色值后,控制器可以进一步确定目标图片的区域边界的像素点的颜色值。如果填充颜色的填充颜色值和区域边界的像素点的颜色值相同,则说明画笔控件指示的填充颜色和区域边界的颜色完全相同。此时,如果用户采用当前选择的画笔控件指示的填充颜色,那么输入的内容有可能破坏目标图片的区域边界。因此需要对填充颜色的填充颜色值进行调整,从而保证填充颜色值和区域边界的像素点的颜色值是不同的,实现目标图片的区域边界不会被破坏。图41示出了一些实施例中用户输入的内容不会破坏掉区域边界的示意图。其中,L1为用户输入的线条,该线条的颜色值和区域边界的像素点的颜色值不同,因此系统不会将用户输入的内容判定为区域边界,从而可以保护目标图片原有的区域边界。
在一些实施例中,颜色值可以是ARGB值。其中,ARGB是一种色彩模式,也就是RGB色彩模式附加上透明度通道。RGB色彩模式是通过对红(Red)、绿(Green)、蓝(Blue)三个颜色通道的变化以及它们相互之间的叠加来得到各式各样的颜色的模式。
在一些实施例中,目标图片的区域边界是确定的,其颜色也是确定的,并且为同一颜色。控制器可以获取区域边界的像素点的颜色值(R0,G0,B0,Alpha0)。
当用户选取画笔控件指示的填充颜色后,控制器可以确定填充颜色当前对应的填充颜色值(R^,G^,B^,Alpha^)。通过判断区域边界的像素点的颜色值和填充颜色值是否相同即可确认用户输入的内容是否会破坏目标图片的区域边界。
在一些实施例中,当检测到用户输入的内容有可能破坏目标图片的区域边界,即填充颜色值与区域边界的像素点的颜色值相同时。控制器可以对填充颜色值进行微调,可以将填充颜色值加减预设值。例如,可以将填充颜色值(R^,G^,B^,Alpha^)中的某个颜色值分量的数值加一或减一,从而保证填充颜色值和区域边界的像素点的颜色值是不同的。
在一些实施例中,考虑到人眼对绿色(G分量)的敏感度最高,对红色(R分量)的敏感度次之,对蓝色(B分量)的敏感度最低。因此,当填充颜色值与区域边界的像素点的颜色值相同时,控制器可以将填充颜色值中的B分量进行微调,通过将B分量的数值进行加一或者减一处理,这样最大程度地保持画笔控件指示的填充颜色与原颜色一致,又保证了用户输入的内容不会形成新的区域边界,从而对目标图片原有的区域边界进行了保护。
当检测到用户选择的画笔控件指示的填充颜色和区域边界的颜色都为黑色时,对应的颜色值为(0,0,0,Alpha),此时需要将填充颜色值中的B分量的数值加一。当检测到用户选择的画笔控件指示的填充颜色和区域边界的颜色都为白色时,对应的颜色值为(255,255,255,Alpha),此时需要将填充颜色值中的B分量的数值减一。当用户选择的画笔控件指示的填充颜色为其他颜色时,可以将B分量的数值进行加一或者减一处理,具体可由用户自行设定。
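填充颜色值与边界颜色值相同时对B分量加一或减一的微调,可示意如下。示例中颜色值以(R,G,B,Alpha)元组表示;“B分量为255时减一、否则加一”的取舍为示意性假设,覆盖了本段所述黑色加一、白色减一两种情形:

```python
def adjust_fill_color(fill, boundary):
    """若填充颜色值与区域边界的颜色值相同,则微调B分量,
    保证调整后的填充颜色值与边界颜色值不同。
    颜色值为 (R, G, B, Alpha) 元组,各分量取 0~255。"""
    if fill != boundary:
        return fill                       # 颜色不同,无需调整
    r, g, b, a = fill
    b = b - 1 if b == 255 else b + 1      # 白色减一,其余(含黑色)加一
    return (r, g, b, a)
```

调整后的颜色在视觉上与原颜色几乎无差别,但不会再被判定为区域边界。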
在一些实施例中,在对填充颜色值进行微调时,也可以对其他的分量进行调整,调整的数值可以有用户自行设定,本申请不做过多限定。
在一些实施例中,在对填充颜色值进行调整后,可以保证填充颜色值和区域边界的像素点的颜色值不同,用户输入的内容不会再被系统判定为目标图片的区域边界。此时,控制器确定画笔控件当前指示的填充颜色的颜色值为调整后的填充颜色值。
当用户选择的填充颜色值和区域边界的像素点的颜色值不同时,则不需要对填充颜色值进行调整,可以直接使用该填充颜色值进行操作。此时,控制器确定画笔控件当前指示的填充颜色的颜色值为用户选择的填充颜色值。
在一些实施例中,当确定好填充颜色值时,控制器还可以检测用户的触控操作。用户可以拾取画笔控件对画图区域中的目标图片进行涂色等。例如,用户可以利用手指在触摸屏中点击或移动,并实现对目标图片中的颜色填充区域进行涂色。
响应于用户在颜色填充区域中的操作,控制器可以利用触控组件来检测用户输入的触控轨迹。根据用户输入的触控轨迹即可确认用户希望在目标图片中涂色的区域。
控制器可以根据填充颜色值对触控轨迹中的所有像素点的颜色值进行更新。实现将画笔控件指示的填充颜色输入到触控轨迹中,从而实现对颜色填充区域进行涂色。
在一些实施例中,在对触控轨迹中的所有像素点的颜色值进行更新时,控制器可以将触控轨迹中的所有像素点的颜色值替换为画笔控件当前对应的填充颜色值。
具体的,当用户拾取画笔控件在画图区域中涂色时,控制器可以将用户触控到的所有区域中的像素点的颜色值全部替换为填充颜色值。即用户触控到的区域全部设置为用户选择的颜色。例如,触控轨迹中像素点的颜色值为(R1,G1,B1,Alpha1),用户选择的填充颜色的填充颜色值为(R^,G^,B^,Alpha^)。此时,需要将触控轨迹中的所有像素点的颜色值全部设置为填充颜色值,即(R^,G^,B^,Alpha^)。
在一些实施例中,在对触控轨迹中的所有像素点的颜色值进行更新时,控制器可以对触控轨迹对应的区域进行混色处理,即对触控轨迹中的所有像素点分别进行混色处理。
具体的,控制器可以首先确定触控轨迹中的每个像素点当前的第一颜色值。对于每一个像素点,将其当前的第一颜色值和填充颜色值进行叠加处理,可以得到第二颜色值,第二颜色值即为每个像素点进行混色处理后得到颜色值。进一步的,控制器可以将触控轨迹中的每个像素点的颜色值更新为各自对应的第二颜色值,从而实现对触控轨迹对应区域的混色处理。例如,触控轨迹中某个像素点的第一颜色值为(R1,G1,B1,Alpha1),用户选择的填充颜色的填充颜色值为(R^,G^,B^,Alpha^)。此时,需要对颜色值(R1,G1,B1,Alpha1)和填充颜色值(R^,G^,B^,Alpha^)叠加处理,得到第二颜色值(R2,G2,B2,Alpha2)。控制器进一步将该像素点的颜色值设置为(R2,G2,B2,Alpha2)。
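上述将第一颜色值与填充颜色值叠加得到第二颜色值的混色处理,一种常见实现是按填充颜色的Alpha通道做线性混合。原文未限定具体的叠加算法,以下实现方式属假设:

```python
def blend(first, fill):
    """按填充颜色的 Alpha 通道对第一颜色值与填充颜色值做线性混合,
    返回混色后的第二颜色值。颜色为 (R, G, B, Alpha),各分量取 0~255。"""
    a = fill[3] / 255.0
    mixed = tuple(round(f * a + c * (1 - a))
                  for f, c in zip(fill[:3], first[:3]))
    return mixed + (first[3],)   # 保留像素点原有的 Alpha 分量
```

当填充颜色完全不透明(Alpha为255)时结果即为填充颜色,完全透明时保持像素原色,介于两者之间时得到两色的混合。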
在一些实施例中,在对触控轨迹中的所有像素点的颜色值进行更新后,对于某些像素点,得到的混色处理后的颜色值有可能与区域边界的像素点的颜色值相同,即第二颜色值(R2,G2,B2,Alpha2)和区域边界的像素点的颜色值(R0,G0,B0,Alpha0)相同。对于这些像素点对应的触控轨迹,有可能被系统判定为目标图片的区域边界,因此需要对这部分触控轨迹中的像素点的颜色值进行调整。具体的调整方法可参照上述步骤,此处不再赘述。
通过对第二颜色值进行调整后,可以得到第三颜色值,从而保证第三颜色值和区域边界的像素点的颜色值是不同的。此时,控制器可以将这部分触控轨迹中的像素点的颜色值更新为第三颜色值。从而保证用户输入的内容不会被系统判定为目标图片的区域边界。具体的更新方法可参照上述步骤,此处不再赘述。
在一些实施例中,用户还可以对画笔控件指示的填充颜色进行切换。当检测到用户切换画笔控件指示的填充颜色时,控制器可以确定切换后的填充颜色的填充颜色值。此时,当用户继续触控触摸屏时,控制器可以继续监测用户输入的触控轨迹。同时根据切换后的填充颜色值对触控轨迹中的所有像素点的颜色值进行更新。具体的更新方法可参照上述步骤,此处不再赘述。
在一些实施例中,用户还可以使用染色桶控件。当用户点击染色桶控件后,染色桶控件处于拾取状态,用户可以进一步选择染色桶控件指示的颜色,并对某个颜色填充区域进行点击,此时该颜色填充区域的颜色可以更新为用户选择的染色桶控件指示的颜色,实现对该颜色填充区域的颜色填充。
当用户选择染色桶控件指示的颜色和目标图片的区域边界的颜色完全相同时,即染色桶控件指示的颜色对应的颜色值和区域边界的像素点的颜色值相同时,颜色填充区域在进行颜色填充后,有可能被系统判定为区域边界。
因此,当用户选择染色桶控件指示的颜色后,控制器可以判断染色桶控件对应的颜色值和区域边界的像素点的颜色值是否相同。若是,则控制器可以对染色桶对应的颜色值进行调整,以使染色桶对应的颜色值和区域边界的像素点的颜色值不同。具体的调整方法可参照上述步骤,此处不再赘述。
本申请实施例还提供一种颜色设置方法,应用于显示设备,如图42所示,该方法包括:步骤4201、显示电子画板,电子画板包括画笔控件和画图区域,画图区域用于展示目标图片,目标图片中包括颜色填充区域和区域边界,画笔控件用于在颜色填充区域中输入内容;步骤4202、响应于用户对画笔控件的触发操作,确定画笔控件指示的填充颜色;步骤4203、检测填充颜色对应的填充颜色值;步骤4204、当填充颜色值与区域边界的像素点的颜色值相同时,对填充颜色值进行调整,以使填充颜色值和区域边界的像素点的颜色值不同。
本说明书中各个实施例之间相同相似的部分互相参照即可,在此不再赘述。
为了方便解释,已经结合具体的实施方式进行了上述说明。但是,上述示例性的讨论不是意图穷尽或者将实施方式限定到上述公开的具体形式。根据上述的教导,可以得到多种修改和变形。上述实施方式的选择和描述是为了更好的解释原理以及实际的应用,从而使得本领域技术人员更好的使用实施方式以及适于具体使用考虑的各种不同的变形的实施方式。

Claims (10)

  1. 一种显示设备,包括:
    显示器,被配置为显示电子画板界面,所述电子画板界面包括控件区域和绘图区域,所述控件区域包括至少一个画笔控件,所述画笔控件用于在所述绘图区域上输入内容,所述绘图区域用于呈现所述画笔控件输入的内容和显示目标图片;
    触控组件,用于接收用户通过触控输入的指令,其中所述触控组件和显示器形成触摸屏;
    控制器,被配置为:
    响应于用户输入的指示显示目标图片的指令,在所述绘图区域显示所述目标图片,所述目标图片包括至少一个闭合区域;
    响应于用户在所述绘图区域输入的由绘制起点到绘制终点的滑动操作,确定所述绘制起点到所述绘制终点之间的滑动路径;
    控制位于目标闭合区域的所述滑动路径填充预设颜色,且控制位于所述目标闭合区域外的所述滑动路径不填充颜色,其中,所述目标闭合区域是所述绘制起点所处的所述闭合区域。
  2. 根据权利要求1所述的显示设备,所述控制器进一步配置为:
    根据第一闭合区域中的像素点位置信息,判断所述绘制起点所在的像素点是否是所述第一闭合区域中的像素点,其中,所述第一闭合区域是所述目标图片中的任意一个闭合区域;
    若所述绘制起点所在的像素点是所述第一闭合区域中的像素点,则将所述第一闭合区域确定为目标闭合区域。
  3. 根据权利要求2所述的显示设备,所述控制器进一步配置为:
    遍历所述目标图片中的像素点,获取每一个所述像素点的位置信息;
    根据所述像素点所处的所述闭合区域,分类存储每一个所述像素点的位置信息。
  4. 根据权利要求3所述的显示设备,所述控制器进一步配置为:
    根据所述滑动路径经过的每一个像素点的位置信息,判断所述滑动路径经过的像素点是否是所述目标闭合区域中的像素点;
    若所述滑动路径经过的像素点是所述目标闭合区域中的像素点,则对所述滑动路径经过的像素点进行颜色填充。
  5. 根据权利要求4所述的显示设备,所述控制器进一步配置为:
    响应于用户对所述画笔控件的触发操作,确定所述画笔控件指示的填充颜色;
    根据所述填充颜色对应的填充颜色值,对所述滑动路径经过的像素点进行颜色填充。
  6. 根据权利要求5所述的显示设备,所述控制器进一步配置为:
    将所述滑动路径经过的任意两个相邻的像素点之间的最短距离确定为第一距离,将对第一像素点进行颜色填充时的填充半径确定为第二距离,其中,所述第一像素点为位于所述目标闭合区域的像素点;
    若所述第一距离大于所述第二距离,则在这两个相邻的所述像素点之间生成预设数量的插值像素点。
  7. 根据权利要求6所述的显示设备,所述控制器进一步配置为:
    判断每一个所述插值像素点是否位于所述目标闭合区域中;
    控制对位于所述目标闭合区域的所述插值像素点进行颜色填充;
    控制对位于所述目标闭合区域之外的所述插值像素点进行消除。
  8. 根据权利要求7所述的显示设备,所述控制器进一步配置为:
    获取距离所述插值像素点最近的一个所述第一像素点的填充颜色值;
    根据所述填充颜色值,对所述插值像素点进行颜色填充。
  9. 根据权利要求1所述的显示设备,所述控件区域还包括触发按钮,所述控制器进一步配置为:
    响应于对所述触发按钮的操作,控制所述显示设备进入涂色保护模式,其中,所述涂色保护模式是指只令位于目标闭合区域中的所述滑动路径填充预设颜色。
  10. 一种用于显示设备的图像处理方法,包括:
    响应于用户输入的指示显示目标图片的指令,在绘图区域显示所述目标图片,所述目标图片包括至少一个闭合区域;
    响应于用户在所述绘图区域输入的由绘制起点到绘制终点的滑动操作,确定所述绘制起点到所述绘制终点之间的滑动路径;
    控制位于目标闭合区域的所述滑动路径填充预设颜色,且控制位于所述目标闭合区域外的所述滑动路径不填充颜色,其中,所述目标闭合区域是所述绘制起点所处的所述闭合区域。
PCT/CN2022/096009 2021-06-30 2022-05-30 一种显示设备及图像处理方法 WO2023273761A1 (zh)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202280046851.2A CN118120243A (zh) 2021-06-30 2022-05-30 一种显示设备及图像处理方法

Applications Claiming Priority (14)

Application Number Priority Date Filing Date Title
CN202110741457.8 2021-06-30
CN202110739092.5 2021-06-30
CN202110739092 2021-06-30
CN202110736500 2021-06-30
CN202110736500.1 2021-06-30
CN202110736313.3A CN113485614A (zh) 2021-06-30 2021-06-30 显示设备和颜色设置方法
CN202110741457 2021-06-30
CN202110736313.3 2021-06-30
CN202210107311.2A CN115543116A (zh) 2021-06-30 2022-01-28 一种显示设备及消除区域灰度的方法
CN202210107311.2 2022-01-28
CN202210128208.6 2022-02-11
CN202210128208.6A CN114501107A (zh) 2021-06-30 2022-02-11 一种显示设备及涂色方法
CN202210191377.4 2022-02-28
CN202210191377.4A CN115550718A (zh) 2021-06-30 2022-02-28 一种显示设备及显示方法

Publications (1)

Publication Number Publication Date
WO2023273761A1 true WO2023273761A1 (zh) 2023-01-05

Family

ID=84692516

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2022/096009 WO2023273761A1 (zh) 2021-06-30 2022-05-30 一种显示设备及图像处理方法

Country Status (1)

Country Link
WO (1) WO2023273761A1 (zh)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN116309893A (zh) * 2023-05-18 2023-06-23 深圳市微克科技有限公司 一种图片压缩方法、装置、存储介质和电子设备

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH06231271A (ja) * 1993-02-05 1994-08-19 Toshiba Corp ビットイメージデータ編集処理の塗りつぶし方法
CN102376099A (zh) * 2010-08-19 2012-03-14 北大方正集团有限公司 一种改善矢量图形填充效果的方法及系统
CN102542580A (zh) * 2010-12-14 2012-07-04 上海三旗通信科技股份有限公司 基于单元格的画板功能
JP2012256226A (ja) * 2011-06-09 2012-12-27 Nikon Corp 透過描画装置
CN103854296A (zh) * 2012-12-06 2014-06-11 腾讯科技(深圳)有限公司 控制颜料涂色的方法及装置
CN110458921A (zh) * 2019-08-05 2019-11-15 腾讯科技(深圳)有限公司 一种图像处理方法、装置、终端以及存储介质
CN114501107A (zh) * 2021-06-30 2022-05-13 海信视像科技股份有限公司 一种显示设备及涂色方法


Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN116309893A (zh) * 2023-05-18 2023-06-23 深圳市微克科技有限公司 一种图片压缩方法、装置、存储介质和电子设备
CN116309893B (zh) * 2023-05-18 2023-08-11 深圳市微克科技有限公司 一种图片压缩方法、装置、存储介质和电子设备

Similar Documents

Publication Publication Date Title
US11023055B2 (en) Devices, methods, and graphical user interfaces for an electronic device interacting with a stylus
KR102042169B1 (ko) 컨텐츠를 표시하는 사용자 단말 장치 및 그 방법
US8610678B2 (en) Information processing apparatus and method for moving a displayed object between multiple displays
US9317131B2 (en) System and method for generating a representative computerized display of a user'S interactions with a touchscreen based hand held device on a gazed-at screen
US9298266B2 (en) Systems and methods for implementing three-dimensional (3D) gesture based graphical user interfaces (GUI) that incorporate gesture reactive interface objects
CN103729055B (zh) 多显示设备、输入笔、多显示控制方法和多显示系统
KR102083937B1 (ko) 멀티 디스플레이 장치 및 그 툴 제공 방법
EP2715491B1 (en) Edge gesture
US20160364091A1 (en) Devices and Methods for Manipulating User Interfaces with a Stylus
US20110157053A1 (en) Device and method of control
US20140139430A1 (en) Virtual touch method
US20140223490A1 (en) Apparatus and method for intuitive user interaction between multiple devices
US20150160849A1 (en) Bezel Gesture Techniques
US20120304131A1 (en) Edge gesture
KR20140025494A (ko) 에지 제스처 기법
CN114501107A (zh) 一种显示设备及涂色方法
US20150138082A1 (en) Image display apparatus and image display system
CN114115637B (zh) 显示设备及电子画板优化方法
WO2023273761A1 (zh) 一种显示设备及图像处理方法
CN115129214A (zh) 一种显示设备和颜色填充方法
WO2023273462A1 (zh) 一种显示设备及填色方法
JP5657269B2 (ja) 画像処理装置、表示装置、画像処理方法、画像処理プログラム、記録媒体
WO2023273434A1 (zh) 一种显示设备及多指触控显示方法
CN118120243A (zh) 一种显示设备及图像处理方法
WO2023065766A1 (zh) 显示设备及其显示方法

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 22831583

Country of ref document: EP

Kind code of ref document: A1

WWE Wipo information: entry into national phase

Ref document number: 202280046851.2

Country of ref document: CN

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 22831583

Country of ref document: EP

Kind code of ref document: A1