CN113810746B - Display equipment and picture sharing method

Info

Publication number: CN113810746B
Application number: CN202111084170.9A
Authority: CN (China)
Prior art keywords: operation data, data set, drawing operation, target, user
Legal status: Active (granted)
Other languages: Chinese (zh)
Other versions: CN113810746A
Inventors: 姜伟伟, 董率, 李乃金
Current Assignee: Hisense Visual Technology Co Ltd
Original Assignee: Hisense Visual Technology Co Ltd
Filing and publication: application CN202111084170.9A filed by Hisense Visual Technology Co Ltd; published as CN113810746A; granted and published as CN113810746B.

Classifications

    • H04N21/4126: Client peripherals receiving signals from specially adapted client devices, the peripheral being portable, e.g. PDAs or mobile phones
    • H04N21/4312: Generation of visual interfaces for content selection or interaction involving specific graphical features, e.g. screen layout, special fonts or colors, blinking icons, highlights or animations
    • H04N21/44222: Monitoring of end-user related data; analytics of user selections, e.g. selection of programs or purchase activity
    • H04N21/4438: OS processes; window management, e.g. event handling following interaction with the user interface

Abstract

The application provides a display device and a picture sharing method. When a sending end device displays an electronic drawing board interface, a user can share a pictorial representation to a target receiving end device by selecting a sharing control. In response to an input instruction to send the pictorial representation to the target receiving end device, the sending end device generates a target picture corresponding to the pictorial representation and sends the drawing operation data set corresponding to the pictorial representation, together with the target picture, to the target receiving end device, so that the target picture is displayed on the target receiving end device or the target receiving end device displays an editable pictorial representation according to the drawing operation data set. Compared with the prior art, in which a finished pictorial representation can only be viewed on the display device where the drawing board application software is located and cannot be viewed on other devices, the picture sharing method provided by the application allows the pictorial representation to be viewed, and further edited, on multiple devices, which improves the user experience.

Description

Display equipment and picture sharing method
Technical Field
The application relates to the technical field of intelligent television drawing boards, in particular to display equipment and a picture sharing method.
Background
The display device refers to a terminal device capable of outputting a specific display picture, such as a smart television, a mobile terminal, a smart advertising screen, or a projector. Taking a smart television as an example, the smart television is based on Internet application technology, has an open operating system and chip as well as an open application platform, can realize a bidirectional man-machine interaction function, and is a television product integrating multiple functions such as video, entertainment, education and data, which meets the diversified and personalized needs of users. For example, a drawing board application may be installed on a smart television, and a user may edit a drawing in the drawing board application.
However, after the user finishes a drawing in the drawing board application software, the finished drawing can only be viewed on the display device where the drawing board application software is located and cannot be viewed on other display devices, which degrades the user experience.
Disclosure of Invention
The application provides a display device and a picture sharing method, which are used to solve the problem that, after a user finishes a drawing in drawing board application software, the finished drawing can only be viewed on the display device where the drawing board application software is located and cannot be viewed on other display devices.
In a first aspect, the present application provides a display apparatus comprising:
a display for displaying a user interface;
a controller configured to:
presenting an electronic palette interface, the electronic palette interface comprising a control area and a drawing area, the control area comprising at least one brush control, the drawing area being configured to present a pictorial representation corresponding to a drawing operation data set, the drawing operation data set comprising drawing operation data input by a user through the brush control;
and in response to an input instruction to send the pictorial representation to a target receiving end device, generating a target picture corresponding to the pictorial representation, and transmitting the drawing operation data set corresponding to the pictorial representation and the target picture to the target receiving end device, so as to display the target picture on the target receiving end device or enable the target receiving end device to display the editable pictorial representation according to the drawing operation data set.
In a second aspect, the present application also provides a display apparatus including:
a display for displaying a user interface;
a controller configured to:
receiving a target picture and a drawing operation data set which are sent by a sending end device and correspond to a pictorial representation;
and displaying the target picture on the user interface, or displaying the editable pictorial representation on the user interface according to the drawing operation data set.
In a third aspect, the present application provides a picture sharing method, including:
presenting an electronic palette interface, the electronic palette interface comprising a control area and a drawing area, the control area comprising at least one brush control, the drawing area being configured to present a pictorial representation corresponding to a drawing operation data set, the drawing operation data set comprising drawing operation data input by a user through the brush control;
and in response to an input instruction to send the pictorial representation to a target receiving end device, generating a target picture corresponding to the pictorial representation, and transmitting the drawing operation data set corresponding to the pictorial representation and the target picture to the target receiving end device, so as to display the target picture on the target receiving end device or enable the target receiving end device to display the editable pictorial representation according to the drawing operation data set.
In a fourth aspect, the present application further provides a picture sharing method, including:
receiving a target picture and a drawing operation data set which are sent by a sending end device and correspond to a pictorial representation;
and displaying the target picture on the user interface, or displaying the editable pictorial representation on the user interface according to the drawing operation data set.
Based on the display device and the picture sharing method provided by the application, when the electronic drawing board interface is displayed, a user can share the pictorial representation to a target receiving end device by selecting the sharing control. In response to an input instruction to send the pictorial representation to the target receiving end device, the display device generates a target picture corresponding to the pictorial representation and sends the drawing operation data set corresponding to the pictorial representation, together with the target picture, to the target receiving end device, so that the target picture is displayed on the target receiving end device or the target receiving end device displays an editable pictorial representation according to the drawing operation data set. Compared with the prior art, in which a finished pictorial representation can only be viewed on the display device where the drawing board application software is located and cannot be viewed on other receiving end devices, the picture sharing method provided by the application allows the pictorial representation to be viewed, and further edited, on multiple display devices, which improves the user experience.
Drawings
In order to more clearly illustrate the technical solution of the present application, the drawings that are needed in the embodiments will be briefly described below, and it will be obvious to those skilled in the art that other drawings can be obtained from these drawings without inventive effort.
FIG. 1 is a diagram illustrating an operational scenario between a display device and a control apparatus, according to some embodiments;
FIG. 2 is a block diagram of a hardware configuration of the control apparatus 100, according to some embodiments;
FIG. 3 is a block diagram of a hardware configuration of the display device 200, according to some embodiments;
FIG. 4 is a diagram of a software configuration in the display device 200, according to some embodiments;
FIG. 5 is a schematic diagram of the display device 200, according to some embodiments;
FIG. 6 is a schematic diagram of an electronic palette interface according to some embodiments of the application;
FIG. 7 is a schematic diagram of an electronic palette interface according to some embodiments of the application;
FIG. 8 is a schematic diagram of a brush toolbar according to some embodiments of the application;
FIG. 9 is a schematic diagram of a sharing page according to some embodiments of the present application;
FIG. 10 is a schematic diagram of a sharing page according to some embodiments of the present application;
FIG. 11 is a schematic diagram of a sharing page according to some embodiments of the present application;
FIG. 12 is a schematic diagram of a sharing page according to some embodiments of the present application;
FIG. 13 is a schematic diagram of a selection interface according to some embodiments of the present application;
FIG. 14 is a flowchart of a picture sharing method according to some embodiments of the present application;
FIG. 15 is a flowchart of a picture sharing method according to some embodiments of the present application.
Detailed Description
For the purposes of making the objects and embodiments of the present application clearer, exemplary embodiments of the present application will be described in detail below with reference to the accompanying drawings, in which exemplary embodiments of the present application are illustrated. It is apparent that the described exemplary embodiments are only some, but not all, of the embodiments of the present application.
It should be noted that the brief description of the terminology in the present application is for the purpose of facilitating understanding of the embodiments described below only and is not intended to limit the embodiments of the present application. Unless otherwise indicated, these terms should be construed in their ordinary and customary meaning.
The terms first, second, third and the like in the description, in the claims and in the above-described figures are used for distinguishing between similar or analogous objects or entities and are not necessarily intended to describe a particular sequential or chronological order, unless otherwise indicated. It is to be understood that the terms so used are interchangeable under appropriate circumstances.
The terms "comprises," "comprising," and "having," and any variations thereof, are intended to cover a non-exclusive inclusion, such that a product or apparatus that comprises a list of elements is not necessarily limited to all elements explicitly listed, but may include other elements not expressly listed or inherent to such product or apparatus.
The term "module" refers to any known or later developed hardware, software, firmware, artificial intelligence, fuzzy logic, or combination of hardware or/and software code that is capable of performing the function associated with that element.
Fig. 1 is a schematic diagram of an operation scenario between a display device and a control apparatus according to an embodiment. As shown in fig. 1, a user may operate the display device 200 through the smart device 300 or the control apparatus 100.
In some embodiments, the control apparatus 100 may be a remote controller, and the communication between the remote controller and the display device includes infrared protocol communication or bluetooth protocol communication, and other short-range communication modes, and the display device 200 is controlled by a wireless or wired mode. The user may control the display device 200 by inputting user instructions through keys on a remote control, voice input, control panel input, etc.
In some embodiments, a smart device 300 (e.g., mobile terminal, tablet, computer, notebook, etc.) may also be used to control the display device 200. For example, the display device 200 is controlled using an application running on a smart device.
In some embodiments, the display device 200 may also be controlled in a manner other than through the control apparatus 100 and the smart device 300. For example, the voice command of the user may be received directly through a module configured inside the display device 200 for acquiring voice commands, or the voice command of the user may be received through a voice control apparatus configured outside the display device 200.
In some embodiments, the display device 200 is also in data communication with a server 400. The display device 200 may be allowed to communicate via a local area network (LAN), a wireless local area network (WLAN), or other networks. The server 400 may provide various contents and interactions to the display device 200. The server 400 may be one cluster or multiple clusters, and may include one or more types of servers.
Fig. 2 exemplarily shows a block diagram of a configuration of the control apparatus 100 in accordance with an exemplary embodiment. As shown in fig. 2, the control apparatus 100 includes a controller 110, a communication interface 130, a user input/output interface 140, a memory, and a power supply. The control apparatus 100 may receive an input operation instruction from the user, convert the operation instruction into an instruction that the display device 200 can recognize and respond to, and serve as an intermediary for interaction between the user and the display device 200.
Fig. 3 shows a hardware configuration block diagram of the display device 200 in accordance with an exemplary embodiment.
In some embodiments, display apparatus 200 includes at least one of a modem 210, a communicator 220, a detector 230, an external device interface 240, a controller 250, a display 260, an audio output interface 270, memory, a power supply, a user interface.
In some embodiments, the controller includes a processor, a video processor, an audio processor, a graphics processor, RAM, ROM, and first to nth interfaces for input/output.
In some embodiments, the display 260 includes a display screen component for presenting pictures and a driving component for driving image display, and is used for receiving image signals output by the controller and displaying video content, image content and menu manipulation interfaces, as well as a UI interface for user manipulation.
In some embodiments, the display 260 may be a liquid crystal display, an OLED display, or a projection device and projection screen.
In some embodiments, communicator 220 is a component for communicating with external devices or servers according to various communication protocol types. For example: the communicator may include at least one of a Wifi module, a bluetooth module, a wired ethernet module, or other network communication protocol chip or a near field communication protocol chip, and an infrared receiver. The display apparatus 200 may establish transmission and reception of control signals and data signals with the external control device 100 or the server 400 through the communicator 220.
In some embodiments, the user interface may be configured to receive control signals from the control device 100 (e.g., an infrared remote control, etc.).
In some embodiments, the detector 230 is used to collect signals of the external environment or interaction with the outside. For example, detector 230 includes a light receiver, a sensor for capturing the intensity of ambient light; alternatively, the detector 230 includes an image collector such as a camera, which may be used to collect external environmental scenes, user attributes, or user interaction gestures, or alternatively, the detector 230 includes a sound collector such as a microphone, or the like, which is used to receive external sounds.
In some embodiments, the external device interface 240 may include, but is not limited to, the following: high Definition Multimedia Interface (HDMI), analog or data high definition component input interface (component), composite video input interface (CVBS), USB input interface (USB), RGB port, or the like. The input/output interface may be a composite input/output interface formed by a plurality of interfaces.
In some embodiments, the modem 210 receives broadcast television signals via wired or wireless reception and demodulates audio and video signals, as well as data signals such as EPG data signals, from a plurality of wireless or wired broadcast television signals.
In some embodiments, the controller 250 and the modem 210 may be located in separate devices, i.e., the modem 210 may also be located in an external device to the main device in which the controller 250 is located, such as an external set-top box or the like.
In some embodiments, the controller 250 controls the operation of the display device and responds to user operations through various software control programs stored on the memory. The controller 250 controls the overall operation of the display apparatus 200. For example: in response to receiving a user command to select a UI object to be displayed on the display 260, the controller 250 may perform an operation related to the object selected by the user command.
In some embodiments, the object may be any one of selectable objects, such as a hyperlink, an icon, or other operable control. The operations related to the selected object are: displaying an operation of connecting to a hyperlink page, a document, an image, or the like, or executing an operation of a program corresponding to the icon.
In some embodiments, the controller includes at least one of a central processing unit (Central Processing Unit, CPU), a video processor, an audio processor, a graphics processor (Graphics Processing Unit, GPU), RAM (Random Access Memory), ROM (Read-Only Memory), first to nth interfaces for input/output, a communication bus (Bus), and the like.
The CPU processor is used for executing the operating system and application program instructions stored in the memory, and for executing various application programs, data and content according to the various interactive instructions received from the outside, so as to finally display and play various audio and video content. The CPU processor may include a plurality of processors, such as one main processor and one or more sub-processors.
In some embodiments, a graphics processor is used to generate various graphical objects, such as icons, operation menus, and graphics displayed in response to user input instructions. The graphics processor comprises an arithmetic unit, which receives the various interactive instructions input by the user, performs the corresponding operations, and displays the various objects according to their display attributes; it also comprises a renderer for rendering the various objects obtained by the arithmetic unit, and the rendered objects are displayed on the display.
In some embodiments, the video processor is configured to receive an external video signal and perform video processing such as decompression, decoding, scaling, noise reduction, frame rate conversion, resolution conversion and image composition according to the standard codec protocol of the input signal, so as to obtain a signal that can be directly displayed or played on the display device 200.
In some embodiments, the video processor includes a demultiplexing module, a video decoding module, an image synthesis module, a frame rate conversion module, a display formatting module, and the like. The demultiplexing module is used for demultiplexing the input audio and video data stream. The video decoding module is used for processing the demultiplexed video signal, including decoding, scaling and the like. The image synthesis module, such as an image synthesizer, is used for superimposing and mixing the GUI signal generated by the graphics generator, based on user input or otherwise, with the scaled video image, so as to generate an image signal for display. The frame rate conversion module is used for converting the frame rate of the input video. The display formatting module is used for converting the frame-rate-converted video signal into an output signal conforming to the display format, for example an RGB data signal.
In some embodiments, the audio processor is configured to receive an external audio signal, decompress and decode the audio signal according to a standard codec protocol of an input signal, and perform noise reduction, digital-to-analog conversion, and amplification processing to obtain a sound signal that can be played in a speaker.
In some embodiments, a user may input a user command through a Graphical User Interface (GUI) displayed on the display 260, and the user input interface receives the user input command through the Graphical User Interface (GUI). Alternatively, the user may input the user command by inputting a specific sound or gesture, and the user input interface recognizes the sound or gesture through the sensor to receive the user input command.
In some embodiments, a "user interface" is a media interface for interaction and exchange of information between an application or operating system and a user that enables conversion between an internal form of information and a form acceptable to the user. A commonly used presentation form of the user interface is a graphical user interface (Graphic User Interface, GUI), which refers to a user interface related to computer operations that is displayed in a graphical manner. It may be an interface element such as an icon, a window, a control, etc. displayed in a display screen of the electronic device, where the control may include a visual interface element such as an icon, a button, a menu, a tab, a text box, a dialog box, a status bar, a navigation bar, a Widget, etc.
In some embodiments, the display device 200 may be a touch display device, in which the display is a touch display formed by a touch component and a screen. The touch display device supports the touch interaction function, and a user can operate the host simply by lightly touching the display with fingers, which eliminates keyboard, mouse and remote controller operations and makes man-machine interaction more direct. On the touch display, the user can input different control instructions through touch operations. For example, a user may input touch instructions such as clicking, sliding, long pressing and double clicking, and different touch instructions may represent different control functions.
To recognize the different touch actions, the touch assembly may generate different electrical signals when the user inputs different touch actions and transmit the generated electrical signals to the controller 250. The controller 250 may perform feature extraction on the received electrical signal to determine the control function to be performed based on the extracted features. For example, when a user enters a click touch action at any program icon location in the application program interface, the touch component senses the touch action and generates an electrical signal. After receiving the electrical signal, the controller 250 may determine the duration of the level corresponding to the touch action in the electrical signal, and recognize that the user has input a click command when the duration is less than the preset time threshold. The controller 250 then extracts the location features generated by the electrical signal to determine the touch location. When the touch position is within the application icon display range, it is determined that the user has input a click touch instruction at the application icon position. Accordingly, the click touch instruction is used to run the corresponding application program in the current scenario, and the controller 250 may therefore start running the corresponding application program.
For another example, when the user inputs a sliding action on the media presentation page, the touch assembly also sends the sensed electrical signal to the controller 250. The controller 250 first determines the duration of the signal corresponding to the touch action in the electrical signal. When the duration is determined to be longer than the preset time threshold, the controller judges the change of the position at which the signal is generated; since the generation position of the signal changes for a sliding touch action, it is determined that the user has input a sliding touch instruction. The controller 250 then determines the sliding direction of the sliding touch command according to the change of the signal generation position, and controls page turning of the display picture in the media display page so as to display more media options. Further, the controller 250 may extract characteristics such as the sliding speed and sliding distance of the sliding touch instruction, and control the page-turning picture according to the extracted characteristics, so as to achieve a hand-following effect.
Similarly, for the touch instructions such as double-click and long-press, the controller 250 may extract different features, determine the type of the touch instruction through feature judgment, and execute corresponding control functions according to a preset interaction rule. In some embodiments, the touch assembly also supports multi-touch, such that a user may enter touch actions on the touch screen via multiple fingers, e.g., multi-finger clicks, multi-finger long presses, multi-finger swipes, etc.
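By way of illustration only, the following Kotlin sketch shows one possible way to implement the click/slide/long-press distinction described above on top of standard Android touch events; the threshold values and class names are assumptions introduced for this example and are not part of the claimed design.

    import android.view.MotionEvent
    import kotlin.math.hypot

    // Illustrative sketch: classify a touch sequence as a click, a slide or a long press
    // based on its duration and the displacement of the contact position.
    class TouchClassifier(
        private val clickTimeThresholdMs: Long = 200L,    // assumed preset time threshold
        private val slideDistanceThresholdPx: Float = 24f // assumed minimum displacement for a slide
    ) {
        private var downTime = 0L
        private var downX = 0f
        private var downY = 0f

        fun onTouchEvent(event: MotionEvent): String? = when (event.actionMasked) {
            MotionEvent.ACTION_DOWN -> {
                downTime = event.eventTime
                downX = event.x
                downY = event.y
                null
            }
            MotionEvent.ACTION_UP -> {
                val duration = event.eventTime - downTime
                val distance = hypot(event.x - downX, event.y - downY)
                when {
                    // short contact with little movement: a click instruction
                    duration < clickTimeThresholdMs && distance < slideDistanceThresholdPx -> "click"
                    // the contact position changed noticeably: a sliding instruction;
                    // direction, speed and distance can be derived from the deltas
                    distance >= slideDistanceThresholdPx -> "slide"
                    // long contact without movement: a long-press instruction
                    else -> "long_press"
                }
            }
            else -> null
        }
    }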
The touch action can also be matched with a specific application program to realize a specific function. For example, after the user opens the drawing board application, the display 260 may present a drawing area, the user may draw a specific touch action track in the drawing area through sliding touch instructions, and the controller 250 determines the touch action pattern from the touch action detected by the touch component and controls the display 260 to display it in real time, so as to achieve a demonstration effect. As another example, rotating a displayed picture by rotating the fingers touching the display is a basic function of a touch screen display device. In the current interaction mode, after several fingers rotate on the screen, the picture immediately rotates to a horizontal or vertical angle according to the rotation direction of the fingers; there is no intermediate interaction process, and the user experience is poor.
In some embodiments, a system of the display device may include a kernel (Kernel), a command parser (shell), a file system, and application programs. The kernel, shell, and file system together form the basic operating system structure that allows users to manage files, run programs, and use the system. After power-up, the kernel is started, the kernel space is activated, hardware is abstracted, hardware parameters are initialized, and virtual memory, the scheduler, signals and inter-process communication (IPC) are operated and maintained. After the kernel is started, the shell and user application programs are then loaded. An application program is compiled into machine code after being started, forming a process.
Referring to FIG. 4, in some embodiments, the system is divided into four layers, which are, from top to bottom, an application layer (referred to as the "application layer"), an application framework layer (Application Framework layer) (referred to as the "framework layer"), an Android Runtime and system library layer (referred to as the "system runtime layer"), and a kernel layer.
In some embodiments, at least one application program is running in the application program layer, and these application programs may be a Window (Window) program of an operating system, a system setting program, a clock program, or the like; or may be an application developed by a third party developer. In particular implementations, the application packages in the application layer are not limited to the above examples.
The framework layer provides an application programming interface (application programming interface, API) and a programming framework for the applications. The application framework layer includes a number of predefined functions. The application framework layer acts as a processing center that decides the actions of the applications in the application layer. Through the API interface, an application can, during execution, access the resources in the system and obtain the services of the system.
As shown in fig. 4, the application framework layer in the embodiment of the present application includes a manager (Manager), a Content Provider (Content Provider), and the like, where the manager includes at least one of the following modules: an Activity Manager (Activity Manager), used to interact with all activities running in the system; a Location Manager (Location Manager), used to provide system services or applications with access to the system location services; a Package Manager (Package Manager), used to retrieve various information about the application packages currently installed on the device; a Notification Manager (Notification Manager), used to control the display and clearing of notification messages; and a Window Manager (Window Manager), used to manage the icons, windows, toolbars, wallpaper, and desktop components on the user interface.
In some embodiments, the activity manager is used to manage the lifecycle of the individual applications as well as the usual navigation and back functions, such as controlling the exit, opening and back operations of the applications. The window manager is used for managing all window programs, for example obtaining the size of the display screen, judging whether a status bar exists, locking the screen, capturing the screen, and controlling changes of the display window (for example, shrinking the display window, shaking the display, distorting the display, etc.).
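As a small illustrative example of the window manager responsibility mentioned above, the following Kotlin sketch queries the window manager service for the display size; it uses a deprecated but widely available Android API and is not taken from the application itself.

    import android.content.Context
    import android.util.DisplayMetrics
    import android.view.WindowManager

    // Illustrative sketch: obtaining the size of the display screen through the
    // window manager service, one of the window-manager tasks described above.
    fun displaySize(context: Context): Pair<Int, Int> {
        val wm = context.getSystemService(Context.WINDOW_SERVICE) as WindowManager
        val metrics = DisplayMetrics()
        @Suppress("DEPRECATION")
        wm.defaultDisplay.getMetrics(metrics) // deprecated on recent APIs, used here for brevity
        return metrics.widthPixels to metrics.heightPixels
    }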
In some embodiments, the system runtime layer provides support for the upper layer, the framework layer, and when the framework layer is in use, the android operating system runs the C/C++ libraries contained in the system runtime layer to implement the functions to be implemented by the framework layer.
In some embodiments, the kernel layer is a layer between hardware and software. As shown in fig. 4, the kernel layer contains at least one of the following drivers: audio drive, display drive, bluetooth drive, camera drive, WIFI drive, USB drive, HDMI drive, sensor drive (e.g., fingerprint sensor, temperature sensor, pressure sensor, etc.), and power supply drive, etc.
Based on the display device described above, the display device can support rotation and/or lifting functions by adding a driving assembly and a gesture detection assembly. Typically, the driving assembly includes a rotating assembly and/or a lifting assembly, with which the controller 110 may communicate, so as to control the rotating assembly to drive the display to rotate when the display needs to be rotated, and to control the lifting assembly to drive the display to rise or fall when the display needs to be raised or lowered.
In a possible implementation manner, the rotating assembly and/or the lifting assembly is provided with a GPIO interface, and the controller changes the state of the GPIO interface of the rotating assembly and/or the lifting assembly by reading and writing the GPIO interface. When the state of the GPIO interface changes, the rotating assembly and/or the lifting assembly drives the display to rotate and/or to rise or fall according to the changed state of the GPIO interface.
In a possible implementation, the rotating assembly and/or the lifting assembly includes an MCU chip on which a Bluetooth module is integrated, so that the rotating assembly and/or the lifting assembly supports Bluetooth functions, such as Bluetooth Low Energy (BLE). Further, the controller 110 may communicate with the rotating assembly and/or the lifting assembly based on a Bluetooth protocol.
In some embodiments, the detection assembly includes a sensor for detecting the rotation state of the display and a sensor for detecting the lifting state of the display. During the rotation or lifting of the display, the controller monitors the rotation state or lifting state of the display in real time according to the data detected by the gesture detection assembly. For example, in the process of controlling the rotation of the display, information such as the rotation angle and angular speed is acquired by monitoring the sensor data; in the process of controlling the lifting of the display, information such as the lifting distance and lifting speed is acquired by monitoring the sensor data.
In some embodiments, the detection assembly is included in the driving assembly. For example, the sensor for detecting the rotation state of the display is arranged in the rotating assembly and forms part of the rotating assembly, and the sensor for detecting the lifting state of the display is arranged in the lifting assembly and forms part of the lifting assembly.
Fig. 5 is a schematic rear view of a display device according to some exemplary embodiments of the present application. As shown in fig. 5, the display device includes a display 260 and a lifting drive device 511. The lifting drive device 511 includes a lifting guide rail fixed to a bracket 512. A rotation drive device, not shown in fig. 5, is arranged inside the lifting drive device, i.e. between the lifting drive device and the display.
In some embodiments, the controller 250 controls the operation of the display device 200 and responds to user operations associated with the display 260 by running various software control programs (e.g., an operating system and/or various application programs) stored on the memory. For example, the controller presents a user interface on the display, the user interface including a number of UI objects; in response to a received user command for a UI object on the user interface, the controller 250 may perform an operation related to the object selected by the user command.
The display device provided by the application has a drawing board (palette) function. The palette function is implemented based on an application related to the palette function that is installed on the display device. For ease of description, the application related to the palette function will be referred to as the palette application.
In some embodiments, some or all of the steps involved in embodiments of the present application are implemented within an operating system and in an application. In some embodiments, the application program used to implement some or all of the steps of embodiments of the present application is the "palette application" described above, which is stored in memory, and the controller 250 controls the operation of the display device 200 and responds to user operations associated with the application program by running the application program in an operating system.
It should be noted that the display device according to the embodiment of the present application includes, but is not limited to, the display device 200 described in the above embodiment, and may also be other terminal devices having an image display function, a data processing function, and an information receiving and transmitting function, such as a mobile phone, a tablet computer, and other portable mobile terminals. Hereinafter, embodiments of the present application will be described in detail by taking a display device as an example.
After the display device starts the palette application, an electronic palette interface is presented on the display. The electronic palette interface displays user interface objects, information, and/or areas in which content can be input, corresponding to one or more functions of the palette application. The aforementioned user interface objects refer to the objects that make up the electronic palette interface and may include, but are not limited to, text, images, icons, soft keys (or "virtual buttons"), drop-down menus, radio buttons, check boxes, selectable lists, and the like. The displayed user interface objects may include non-interactive objects for conveying information or constituting the appearance of the user interface, interactive objects available for user interaction, or a combination of non-interactive and interactive objects. The user may interact with a user interface object by making contact with the touch screen at a location corresponding to the interactive object with which the user wishes to interact. The display device detects the contact and responds to the detected contact by performing the operation corresponding to the interaction with the interactive object.
In some embodiments, the electronic palette interface includes a drawing area and a control area. The drawing area is an area in which content can be input, and the control area is used for centrally displaying the user interface objects and information corresponding to one or more functions of the drawing board application. The user interface objects include, but are not limited to, a brush control, an erasure control, a sharing control, and the like, and the information includes various parameter information corresponding to the brush control, such as the current input color, the selectable colors, thickness, line shape, and the like.
FIG. 6 shows an electronic palette interface according to some embodiments of the application. As shown in fig. 6, the electronic palette interface includes a drawing area 610 and a control area 620. The drawing area 610 is used for receiving content input by a user through a control in the control area 620 and is part or all of a layer for displaying the received content; the content received by the drawing area 610 may be at least one of lines, graphics, and characters. The control area 620 is part or all of a layer for displaying various functional controls, and the functional controls displayed in the control area 620 include at least one of a brush control 621, an erasure control 622, a cut control 623, a delete control 624, a video control 625, a share control 626, a cancel control 627, a restore control 628, a save control 629, and a close application control 630.
In some embodiments, referring to fig. 7, the layer to which the control area 620 belongs may be superimposed on the layer to which the drawing area 610 belongs, and the user may select the layer to which the control area 620 belongs, and move the layer to which the control area 620 belongs to any position on the layer to which the drawing area 610 belongs through touch drag.
In some embodiments, when the brush control 621 is selected, the brush toolbar 700 corresponding to the brush control is triggered and displayed. Referring to fig. 8, the brush type, line color, line type, line thickness, and the like may be selected in the brush toolbar 700. The user selects the target type of the brush control 621 to pick up the brush, and after picking up the brush, the user can continue to select among the line color, line type, line thickness and other options in the toolbar. The line color, line type, and line thickness selected by the user in the toolbar are used as the input attributes of the brush control 621 configured with the target type. With the brush control picked up, the user may input content through contact with the drawing area 610, i.e., through the contact trajectory of the user on the drawing area 610.
In some embodiments, each time the user inputs a piece of operation content in the drawing area 610, the display device stores the drawing operation data corresponding to that operation content. For example, after the user picks up the brush, one continuous touch track, from the moment the user contacts the drawing area 610 until the brush leaves the drawing area, is taken as one piece of operation content, and the display device stores information such as the position coordinates, filling color, filling type and filling radius corresponding to each pixel point on the touch track in the drawing operation data set according to the input time sequence. The filling color, filling type, filling radius and other information correspond to the line color, line type, line thickness and other options selected in the toolbar after the user picks up the brush.
In some embodiments, the user may edit the content input in the drawing area 610 by selecting at least one of the erase control 622, the cut control 623, the delete control 624, the cancel control 627, the restore control 628, and the like. The editing operations performed through these controls are also regarded as operation content input by the user in the drawing area 610, and each time the user performs such an operation, the display device stores the corresponding drawing operation data in the drawing operation data set in time order.
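Purely as an illustration of how such a drawing operation data set could be organized, the following Kotlin sketch models the operations as an ordered list carrying the stroke trajectory, the brush attributes selected in the toolbar, and the editing operations; all field and type names are assumptions and do not prescribe the format used by the application.

    // Illustrative data model for a drawing operation data set.
    // Field names are assumptions derived from the description above.

    enum class OperationType { STROKE, ERASE, CUT, DELETE, CANCEL, RESTORE }

    data class Point(val x: Float, val y: Float)

    data class DrawingOperation(
        val type: OperationType,
        val trajectory: List<Point> = emptyList(), // one continuous touch track
        val fillColor: Int = 0,                    // line color selected in the brush toolbar
        val fillType: String = "solid",            // line type selected in the brush toolbar
        val fillRadius: Float = 1f,                // line thickness selected in the brush toolbar
        val inputIndex: Long                       // position in the input time sequence
    )

    // The data set keeps operations in the order they were input, so a receiver
    // can replay them to reconstruct the pictorial representation.
    class DrawingOperationDataSet {
        private val operations = mutableListOf<DrawingOperation>()

        fun append(op: DrawingOperation) {
            operations += op
        }

        fun ordered(): List<DrawingOperation> = operations.sortedBy { it.inputIndex }
    }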
In some embodiments, the user may share the edited pictorial representation with other users. In the following embodiments, the display device that shares an edited pictorial representation is referred to as the sending end device, and the display device that receives the pictorial representation shared by the sending end device is referred to as the target receiving end device or receiving end device.
Referring to fig. 9, after the user selects the share control 626, the sending end device is triggered to display a sharing page 800, where the sharing page 800 includes, but is not limited to, two-dimensional code 810, parent circle 820, family album 830, associated device 840, associated account 850, and applet 860 options. The user can select a target option, and the pictorial representation is shared with the target receiving end device through the channel corresponding to the target option.
In some embodiments, after the user selects the two-dimensional code 810 option, the sending end device saves the content already input in the drawing area as a target picture and transmits the target picture and the drawing operation data set corresponding to the target picture to the server. A two-dimensional code is generated according to the storage location of the target picture and of the corresponding drawing operation data set on the server, and a sharing page as shown in fig. 10 is triggered. After the receiving end device scans the two-dimensional code on the sharing page, it can access that storage location to obtain the target picture and the corresponding drawing operation data set, and display an editable or non-editable target picture according to the target picture and the corresponding drawing operation data set.
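A minimal sketch of this two-dimensional code sharing flow is given below; the server interface and the QR-code rendering callback are hypothetical placeholders, and error handling is omitted.

    // Hypothetical sketch of the two-dimensional code sharing flow described above.
    // ShareServer and renderQrCode are assumed placeholders, not real APIs.

    interface ShareServer {
        // Uploads the target picture and its drawing operation data set and
        // returns their storage location (a URL) on the server.
        fun upload(targetPicture: ByteArray, drawingOperationData: ByteArray): String
    }

    fun shareByQrCode(
        server: ShareServer,
        targetPicture: ByteArray,
        drawingOperationData: ByteArray,
        renderQrCode: (String) -> Unit
    ) {
        // 1. Store the target picture and the drawing operation data set on the server.
        val storageUrl = server.upload(targetPicture, drawingOperationData)
        // 2. Encode the storage location as a two-dimensional code and show it on the
        //    sharing page (fig. 10) for the receiving end device to scan.
        renderQrCode(storageUrl)
    }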
In some embodiments, after the user selects the parent circle 820 option, the sending end device saves the content already input in the drawing area as a target picture and transmits the target picture and the drawing operation data set corresponding to the target picture to the server that has established a communication connection with the sending end device. The server sends the target picture and the corresponding drawing operation data set to the information platform associated with the drawing board application. After the receiving end device logs in to the information platform, it can choose, as needed, whether to download the target picture and the corresponding drawing operation data set, so as to display an editable or non-editable target picture.
In some embodiments, after the user selects the family album 830 option, the sending end device saves the content already input in the drawing area as a target picture and transmits the target picture and the drawing operation data set corresponding to the target picture to the server that has established a communication connection with the sending end device. The server then sends a reminder message to the receiving end device that has established a communication connection with the sending end device; for example, the receiving end device receives "Your friend XXX has shared a picture with you, click the link to view it now: https://www.mubiaotupian.com". After receiving the reminder message, the receiving end device can obtain the target picture and the corresponding drawing operation data set from the server by clicking the link "https://www.mubiaotupian.com", and display an editable or non-editable target picture on the receiving end device.
In some embodiments, after the user selects the associated device 840 option, a sharing page as shown in fig. 11 is triggered. The sharing page includes the device names of all receiving end devices that have established a communication connection with the present sending end device, and the user selects a target receiving end device, for example the target receiving end device XXX001. The sending end device saves the content already input in the drawing area as a target picture and transmits the target picture and the drawing operation data set corresponding to the target picture to the target receiving end device XXX001. After the target receiving end device XXX001 receives the target picture and the corresponding drawing operation data set, an editable or non-editable target picture may be displayed on the target receiving end device XXX001. The receiving end device that has established a communication connection with the sending end device may be a receiving end device in the same local area network as the sending end device, or a receiving end device connected to the sending end device through a near field communication technology, such as a Bluetooth transmission protocol or an infrared transmission protocol.
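The following sketch illustrates one possible direct transfer to an associated receiving end device over the same local area network; the port number and the simple length-prefixed framing are assumptions made for this example and are not specified by the application.

    import java.io.DataOutputStream
    import java.net.Socket

    // Hypothetical sketch: sending the target picture and its drawing operation data set
    // to a selected receiving end device (e.g. "XXX001") over the local area network.
    fun sendToAssociatedDevice(host: String, port: Int, picture: ByteArray, operationData: ByteArray) {
        Socket(host, port).use { socket ->
            DataOutputStream(socket.getOutputStream()).use { out ->
                out.writeInt(picture.size)
                out.write(picture)            // target picture generated from the drawing area
                out.writeInt(operationData.size)
                out.write(operationData)      // serialized drawing operation data set
                out.flush()
            }
        }
    }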
In some embodiments, after the user selects the associated account 850 option, a sharing page as shown in fig. 12 is triggered. The sharing page includes the names of all other user accounts associated with the account logged in on the sending end device, and the user selects a target user account, for example the target account LLL001. The sending end device saves the content already input in the drawing area as a target picture and transmits the target picture, the drawing operation data set corresponding to the target picture, and the related information of the target user account LLL001 (such as the account name and account ID) to the server that has established a communication connection with the sending end device. According to the related information of the target user account LLL001, the server sends the target picture and the corresponding drawing operation data set to the receiving end device logged in with the target user account, and after the receiving end device receives the target picture and the corresponding drawing operation data set, an editable or non-editable target picture may be displayed.
In some embodiments, after the user selects the applet 860 option, a sharing page including options for a plurality of application software installed on the sending end device is triggered. The user selects a corresponding application software option, and the sending end device saves the content already input in the drawing area as a target picture and shares the target picture and the drawing operation data set corresponding to the target picture through the selected application software.
In the above embodiment, the content that has been input in the drawing area is the pictorial representation, which corresponds to the drawing operation data set.
In some embodiments, the sending end device may also function as a receiving end device, and receive the target pictures and the drawing operation data sets corresponding to the pictorial representations transmitted by other sending end devices, so as to display editable or non-editable target pictures.
In some embodiments, the sending end device may further send the name of an application capable of parsing the drawing operation data set to the receiving end device. After receiving the application name, the receiving end device matches it against the names of all installed applications. If an installed application with the same name as the received application name is found, the matching succeeds, which indicates that an application capable of parsing the drawing operation data set is installed on the receiving end device.
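On an Android-based receiving end device, this check could look like the sketch below; treating the received application name as a package name is an assumption made for the example.

    import android.content.pm.PackageManager

    // Illustrative sketch: check whether an application able to parse the drawing
    // operation data set is installed, by matching the received application name
    // (assumed here to be a package name) against the installed packages.
    fun canParseDrawingData(packageManager: PackageManager, receivedAppName: String): Boolean =
        packageManager.getInstalledPackages(0).any { it.packageName == receivedAppName }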
In some embodiments, if the receiving end device has an application capable of parsing the drawing operation data set, for example a drawing board application is installed on the receiving end device, a selection interface as shown in fig. 13 is triggered. The selection interface includes a view option and an edit option. After the user selects the view option, the receiving end device invokes its built-in picture viewer to display the target picture. After the user selects the edit option, the receiving end device invokes the drawing board application with the received target picture and drawing operation data set; after the drawing board application reads the target picture and the drawing operation data set, the target picture is displayed on the interface corresponding to the drawing board application in an editable state, so that the user can continue to edit the target picture on the receiving end device. For example, the user may add lines to and fill colors in the target picture through the brush control 621, erase content in the target picture through the erase control 622, and cancel or restore the input operation content corresponding to the target picture through the cancel control or the restore control.
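A minimal sketch of how a drawing board application might replay the received drawing operation data set to rebuild an editable pictorial representation is shown below; it reuses the illustrative data model sketched earlier and handles only stroke operations.

    import android.graphics.Canvas
    import android.graphics.Paint
    import android.graphics.Path

    // Illustrative sketch: replay the ordered drawing operations onto a canvas so the
    // pictorial representation is reconstructed in an editable state. Only STROKE
    // operations from the illustrative data model above are handled here.
    fun replayOperations(canvas: Canvas, dataSet: DrawingOperationDataSet) {
        for (op in dataSet.ordered()) {
            if (op.type != OperationType.STROKE || op.trajectory.isEmpty()) continue
            val paint = Paint(Paint.ANTI_ALIAS_FLAG).apply {
                style = Paint.Style.STROKE
                color = op.fillColor
                strokeWidth = op.fillRadius
            }
            val path = Path().apply {
                moveTo(op.trajectory.first().x, op.trajectory.first().y)
                op.trajectory.drop(1).forEach { lineTo(it.x, it.y) }
            }
            canvas.drawPath(path, paint)
        }
    }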
In some embodiments, if the receiving end device does not have an application capable of parsing the drawing operation data set, the receiving end device calls its built-in picture viewer to display the target picture after receiving the target picture and the drawing operation data set sent by the sending end device.
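For completeness, the sketch below shows a common Android way of handing the received target picture to an image viewer when no parsing application is installed; the content URI is assumed to point at the locally saved target picture.

    import android.content.Context
    import android.content.Intent
    import android.net.Uri

    // Illustrative sketch: hand the received target picture to an image viewer when no
    // application can parse the drawing operation data set. The URI is assumed to point
    // at the locally saved target picture.
    fun openWithPictureViewer(context: Context, pictureUri: Uri) {
        val intent = Intent(Intent.ACTION_VIEW).apply {
            setDataAndType(pictureUri, "image/*")
            addFlags(Intent.FLAG_GRANT_READ_URI_PERMISSION)
            addFlags(Intent.FLAG_ACTIVITY_NEW_TASK) // needed when starting from a non-Activity context
        }
        context.startActivity(intent)
    }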
According to the above embodiments, when the sending end device displays the electronic drawing board interface, the user can share the pictorial representation to the target receiving end device by selecting the sharing control. In response to an input instruction to send the pictorial representation to the target receiving end device, the sending end device generates a target picture corresponding to the pictorial representation and sends the drawing operation data set corresponding to the pictorial representation, together with the target picture, to the target receiving end device, so that the target picture is displayed on the target receiving end device or the target receiving end device displays an editable pictorial representation according to the drawing operation data set. Compared with the prior art, in which a finished pictorial representation can only be viewed on the display device where the drawing board application software is located and cannot be viewed on other devices, the picture sharing method provided by the application allows the pictorial representation to be viewed, and further edited, on multiple devices, which improves the user experience.
Based on the display device provided by the above embodiments, the present application further provides a picture sharing method applied to the above display device. As shown in fig. 14, the method may include:
s141: presenting an electronic palette interface, the electronic palette interface comprising a control region and a drawing region, the control region comprising at least one brush control, the drawing region being configured to present a pictorial representation corresponding to a pictorial manipulation dataset, the pictorial manipulation dataset comprising pictorial manipulation data entered by a user through the brush control;
In some embodiments, the control area further includes a trigger button, and sending the drawing operation data set and the target picture corresponding to the pictorial representation to the target receiving end device further includes:
in response to an operation on the trigger button, displaying a sharing window on the electronic drawing board interface, wherein the sharing window includes at least one sharing option, and the sharing option corresponds to a target receiving end device in communication connection with the display device;
and in response to a selection of the sharing option, sending the drawing operation data set and the target picture to the target receiving end device corresponding to the selected sharing option.
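The direct sharing path could be sketched as follows; ConnectedDevice, SharePayload, pickSharingOption, and sendTo are illustrative names for the connected-device list, the shared data, the sharing-window selection, and the transport, none of which are defined by the patent.

    // Build the sharing window from currently connected devices and send the
    // drawing operation data set plus target picture to the device the user picks.
    data class ConnectedDevice(val id: String, val name: String)

    class SharePayload(val targetPicture: ByteArray, val drawingOperationData: ByteArray)

    fun onTriggerButtonPressed(
        connectedDevices: List<ConnectedDevice>,
        pickSharingOption: (List<ConnectedDevice>) -> ConnectedDevice?,
        sendTo: (ConnectedDevice, SharePayload) -> Unit,
        payload: SharePayload
    ) {
        // Each sharing option in the sharing window corresponds to one connected device.
        val chosen = pickSharingOption(connectedDevices) ?: return // user dismissed the window
        sendTo(chosen, payload)
    }

    fun main() {
        val devices = listOf(ConnectedDevice("tv-01", "Living room TV"), ConnectedDevice("pad-02", "Tablet"))
        onTriggerButtonPressed(
            connectedDevices = devices,
            pickSharingOption = { it.firstOrNull() },              // pretend the user picks the first option
            sendTo = { device, _ -> println("sending to ${device.name}") },
            payload = SharePayload(ByteArray(0), ByteArray(0))
        )
    }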
In some embodiments, the control area further includes a trigger button, and sending the drawing operation data set and the target picture corresponding to the pictorial representation to the target receiving end device further includes:
in response to an operation on the trigger button, displaying a sharing window on the electronic drawing board interface, wherein the sharing window includes at least one sharing option, and the sharing option corresponds to at least one other user account associated with the account logged in on the display device;
and in response to a selection of the sharing option, sending the drawing operation data set, the target picture, and the target user account corresponding to the selected sharing option to a server, so that the server sends the drawing operation data set and the target picture to the target receiving end device logged in with the target user account.
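The server-relayed path could be sketched as follows, assuming a hypothetical Server.relay API; the field names and the Base64/JSON packaging are illustrative choices, not a format specified by the patent.

    import java.util.Base64

    // Upload the drawing operation data set, the target picture, and the selected target
    // user account; the server forwards the data to the device logged in with that account.
    data class RelayRequest(
        val targetUserAccount: String,
        val targetPictureBase64: String,
        val drawingOperationJson: String
    )

    interface Server { fun relay(request: RelayRequest) }

    fun shareViaServer(server: Server, account: String, picture: ByteArray, operationsJson: String) {
        val request = RelayRequest(
            targetUserAccount = account,
            targetPictureBase64 = Base64.getEncoder().encodeToString(picture),
            drawingOperationJson = operationsJson
        )
        server.relay(request) // the server pushes the payload to the target receiving end device
    }

    fun main() {
        val fakeServer = object : Server {
            override fun relay(request: RelayRequest) =
                println("relay drawing data to account ${request.targetUserAccount}")
        }
        shareViaServer(fakeServer, account = "family_account_01", picture = ByteArray(4), operationsJson = "[]")
    }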
In some embodiments, when a drawing operation input by a user through the brush control is received, corresponding drawing operation data is stored in the drawing operation data set, and the drawing operation data in the drawing operation data set is ordered according to the input time sequence.
In some embodiments, the drawing operation data set includes at least a brush type, a brush color, a brush width, a brush trajectory, and an input order of the brush trajectory corresponding to content input by the brush control in the drawing area.
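One possible in-memory shape for such a drawing operation data set is sketched below; the field names mirror the wording above (brush type, brush color, brush width, brush trajectory, input order), while the concrete types are assumptions.

    // Each drawing operation records what the user drew and in which order.
    data class Point(val x: Float, val y: Float)

    data class DrawingOperation(
        val inputOrder: Int,            // position in the input time sequence
        val brushType: String,          // e.g. "pencil", "marker"
        val brushColor: String,         // e.g. "#FF0000"
        val brushWidth: Float,          // stroke width in pixels
        val trajectory: List<Point>     // the brush trajectory drawn in the drawing area
    )

    class DrawingOperationDataSet {
        private val operations = mutableListOf<DrawingOperation>()

        // Operations are appended as the user draws, so the list stays ordered by input time.
        fun record(op: DrawingOperation) { operations += op }

        fun orderedOperations(): List<DrawingOperation> = operations.sortedBy { it.inputOrder }
    }

    fun main() {
        val dataSet = DrawingOperationDataSet()
        dataSet.record(DrawingOperation(1, "pencil", "#FF0000", 4.0f, listOf(Point(10f, 10f), Point(20f, 20f))))
        dataSet.record(DrawingOperation(2, "marker", "#0000FF", 8.0f, listOf(Point(5f, 40f))))
        println(dataSet.orderedOperations())
    }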
S142: in response to an input instruction to send the pictorial representation to a target receiving end device, generating a target picture corresponding to the pictorial representation, and sending the drawing operation data set corresponding to the pictorial representation and the target picture to the target receiving end device, so that the target picture is displayed on the target receiving end device, or the target receiving end device displays the editable pictorial representation according to the drawing operation data set.
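Generating the target picture can be thought of as replaying the recorded drawing operations onto an off-screen canvas. The Kotlin sketch below uses java.awt purely for illustration; an actual display device would use its own rendering stack, and BrushStroke and renderTargetPicture are assumed names.

    import java.awt.BasicStroke
    import java.awt.Color
    import java.awt.image.BufferedImage
    import java.io.File
    import javax.imageio.ImageIO

    // A simplified stroke: color, width, and the points of the brush trajectory.
    data class BrushStroke(val color: Color, val width: Float, val points: List<Pair<Int, Int>>)

    fun renderTargetPicture(strokes: List<BrushStroke>, width: Int, height: Int): BufferedImage {
        val image = BufferedImage(width, height, BufferedImage.TYPE_INT_ARGB)
        val g = image.createGraphics()
        g.color = Color.WHITE
        g.fillRect(0, 0, width, height)            // blank canvas
        for (stroke in strokes) {                  // replay the operations in input order
            g.color = stroke.color
            g.stroke = BasicStroke(stroke.width)
            stroke.points.zipWithNext { (x1, y1), (x2, y2) -> g.drawLine(x1, y1, x2, y2) }
        }
        g.dispose()
        return image
    }

    fun main() {
        val strokes = listOf(
            BrushStroke(Color.RED, 4f, listOf(10 to 10, 60 to 60, 120 to 40)),
            BrushStroke(Color.BLUE, 2f, listOf(20 to 100, 100 to 100))
        )
        val picture = renderTargetPicture(strokes, width = 200, height = 150)
        ImageIO.write(picture, "png", File("target_picture.png"))  // the generated target picture
    }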
In some embodiments, the present application further provides a picture sharing method. Referring to fig. 15, the method includes:
S151: receiving a target picture and a drawing operation data set that are sent by a sending end device and correspond to a pictorial representation;
In some embodiments, the drawing operation data set includes at least the brush type, brush color, brush width, brush trajectory, and input order of the brush trajectory corresponding to the target picture.
S152: displaying the target picture on the user interface, or displaying the editable pictorial representation on the user interface according to the drawing operation data set.
In some embodiments, displaying the target picture on the user interface or displaying the editable pictorial representation on the user interface according to the drawing operation data set further includes:
when the display device has an application capable of parsing the drawing operation data set, displaying, through the application, the editable pictorial representation on the interface corresponding to the application according to the drawing operation data set;
and when the display device does not have an application capable of parsing the drawing operation data set, displaying the non-editable target picture on the user interface.
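For the editable case, the parser application has to turn the received drawing operation data set back into operations it can re-edit. The sketch below assumes a simple line-based serialization (order;color;width;x1,y1 x2,y2 ...) purely for illustration; the patent does not define a wire format.

    // Parse a received drawing operation data set back into editable operations,
    // restoring the original input order so editing can continue where it left off.
    data class ParsedOperation(
        val inputOrder: Int,
        val brushColor: String,
        val brushWidth: Float,
        val trajectory: List<Pair<Float, Float>>
    )

    fun parseDrawingOperationDataSet(serialized: String): List<ParsedOperation> =
        serialized.lines()
            .filter { it.isNotBlank() }
            .map { line ->
                val (order, color, width, points) = line.split(";", limit = 4)
                ParsedOperation(
                    inputOrder = order.trim().toInt(),
                    brushColor = color.trim(),
                    brushWidth = width.trim().toFloat(),
                    trajectory = points.trim().split(" ").map {
                        val (x, y) = it.split(",")
                        x.toFloat() to y.toFloat()
                    }
                )
            }
            .sortedBy { it.inputOrder }

    fun main() {
        val received = "1;#FF0000;4.0;10,10 20,20 30,25\n2;#0000FF;2.5;5,40 15,42"
        parseDrawingOperationDataSet(received).forEach(::println)
    }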
In some embodiments, the interface to which the application corresponds includes a drawing area for displaying the pictorial representation and a control area including at least one control for editing the pictorial representation.
Based on the picture sharing method provided by the embodiments of the present application, when the electronic drawing board interface is displayed, the user can share the pictorial representation to the target receiving end device by selecting the sharing control. In response to an input instruction to send the pictorial representation to the target receiving end device, the sending end device may generate a target picture corresponding to the pictorial representation and send the drawing operation data set corresponding to the pictorial representation, together with the target picture, to the target receiving end device, so that the target picture is displayed on the target receiving end device, or the target receiving end device displays the editable pictorial representation according to the drawing operation data set. Compared with the prior art, in which a finished pictorial representation can only be viewed on the display device where the drawing board application resides and cannot be viewed on other receiving end devices, the picture sharing method provided by the present application allows the pictorial representation to be viewed, and further edited, on multiple display devices, thereby improving the user experience.
In a specific implementation, the present invention further provides a computer storage medium, where the computer storage medium may store a program, and the program, when executed, may perform some or all of the steps in each embodiment of the picture sharing method provided by the present invention. The storage medium may be a magnetic disk, an optical disk, a read-only memory (ROM), a random access memory (RAM), or the like.
It will be apparent to those skilled in the art that the techniques in the embodiments of the present invention may be implemented by software plus a necessary general-purpose hardware platform. Based on such an understanding, the technical solutions in the embodiments of the present invention, in essence or in the part contributing to the prior art, may be embodied in the form of a software product. The software product may be stored in a storage medium, such as a ROM/RAM, a magnetic disk, or an optical disk, and includes several instructions for causing a computer device (which may be a personal computer, a server, or a network device) to execute the methods described in the embodiments, or in some parts of the embodiments, of the present invention.
For the same or similar parts between the various embodiments in this specification, reference may be made to each other. In particular, the display device embodiments are described relatively briefly because they are substantially similar to the method embodiments; for relevant details, reference may be made to the description of the method embodiments.
The embodiments of the present invention described above do not limit the scope of the present invention.

Claims (7)

1. A display device, characterized by comprising:
a display for displaying a user interface;
a controller configured to:
presenting an electronic drawing board interface, wherein the electronic drawing board interface comprises a control area and a drawing area, the control area comprises at least one brush control, the drawing area is used for presenting a pictorial representation corresponding to a drawing operation data set, and the drawing operation data set comprises drawing operation data input by a user through the brush control, as well as a brush color, a brush width, a brush trajectory and an input order of the brush trajectory corresponding to the content input through the brush control in the drawing area;
when a drawing operation input by the user through the brush control is received, storing corresponding drawing operation data into the drawing operation data set, wherein the drawing operation data in the drawing operation data set are ordered according to an input time sequence;
in response to an input instruction to send the pictorial representation to a target receiving end device, generating a target picture corresponding to the pictorial representation, and sending the drawing operation data set corresponding to the pictorial representation and the target picture to the target receiving end device;
when the target receiving end device has an application capable of parsing the drawing operation data set, displaying, through the application, the editable pictorial representation on an interface corresponding to the application according to the drawing operation data set;
and when the target receiving end device does not have an application capable of parsing the drawing operation data set, displaying the non-editable target picture on the user interface.
2. The display device according to claim 1, wherein the control area further comprises a trigger button, and the controller, when performing the sending of the drawing operation data set corresponding to the pictorial representation and the target picture to the target receiving end device, is further configured to:
in response to an operation on the trigger button, displaying a sharing window on the electronic drawing board interface, wherein the sharing window comprises at least one sharing option, and the sharing option corresponds to a target receiving end device in communication connection with the display device;
and in response to a selection of the sharing option, sending the drawing operation data set and the target picture to the target receiving end device corresponding to the selected sharing option.
3. The display device according to claim 1, wherein the control area further comprises a trigger button, and the controller, when performing the sending of the drawing operation data set corresponding to the pictorial representation and the target picture to the target receiving end device, is further configured to:
in response to an operation on the trigger button, displaying a sharing window on the electronic drawing board interface, wherein the sharing window comprises at least one sharing option, and the sharing option corresponds to at least one other user account associated with the account logged in on the display device;
and in response to a selection of the sharing option, sending the drawing operation data set, the target picture, and the target user account corresponding to the selected sharing option to a server, so that the server sends the drawing operation data set and the target picture to a target receiving end device logged in with the target user account.
4. A display device, characterized by comprising:
a display for displaying a user interface;
a controller configured to:
receiving a target picture and a drawing operation data set that are sent by a sending end device and correspond to a pictorial representation, wherein the drawing operation data set at least comprises a brush color, a brush width, a brush trajectory and an input order of the brush trajectory corresponding to the target picture;
when the display device has an application capable of parsing the drawing operation data set, displaying, through the application, the editable pictorial representation on an interface corresponding to the application according to the drawing operation data set;
and when the display device does not have an application capable of parsing the drawing operation data set, displaying the non-editable target picture on the user interface.
5. The display device according to claim 4, wherein the interface corresponding to the application comprises a drawing area and a control area, the drawing area is used for displaying the pictorial representation, and the control area comprises at least one control for editing the pictorial representation.
6. A picture sharing method, characterized by comprising:
presenting an electronic drawing board interface, wherein the electronic drawing board interface comprises a control area and a drawing area, the control area comprises at least one brush control, the drawing area is used for presenting a pictorial representation corresponding to a drawing operation data set, and the drawing operation data set comprises drawing operation data input by a user through the brush control, as well as a brush color, a brush width, a brush trajectory and an input order of the brush trajectory corresponding to the content input through the brush control in the drawing area;
when a drawing operation input by the user through the brush control is received, storing corresponding drawing operation data into the drawing operation data set, wherein the drawing operation data in the drawing operation data set are ordered according to an input time sequence;
in response to an input instruction to send the pictorial representation to a target receiving end device, generating a target picture corresponding to the pictorial representation, and sending the drawing operation data set corresponding to the pictorial representation and the target picture to the target receiving end device;
when the target receiving end device has an application capable of parsing the drawing operation data set, displaying, through the application, the editable pictorial representation on an interface corresponding to the application according to the drawing operation data set;
and when the target receiving end device does not have an application capable of parsing the drawing operation data set, displaying the non-editable target picture on a user interface.
7. A picture sharing method, characterized by comprising:
receiving a target picture and a drawing operation data set that are sent by a sending end device and correspond to a pictorial representation, wherein the drawing operation data set at least comprises a brush color, a brush width, a brush trajectory and an input order of the brush trajectory corresponding to the target picture;
when the target receiving end device has an application capable of parsing the drawing operation data set, displaying, through the application, the editable pictorial representation on an interface corresponding to the application according to the drawing operation data set;
and when the target receiving end device does not have an application capable of parsing the drawing operation data set, displaying the non-editable target picture on a user interface.
CN202111084170.9A 2021-09-14 2021-09-14 Display equipment and picture sharing method Active CN113810746B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202111084170.9A CN113810746B (en) 2021-09-14 2021-09-14 Display equipment and picture sharing method

Publications (2)

Publication Number Publication Date
CN113810746A CN113810746A (en) 2021-12-17
CN113810746B true CN113810746B (en) 2023-09-22

Family

ID=78895474

Country Status (1)

Country Link
CN (1) CN113810746B (en)



Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant