CN114721563A - Display device, annotation method, and storage medium - Google Patents

Display device, annotation method, and storage medium

Info

Publication number
CN114721563A
CN114721563A (application CN202210334026.4A)
Authority
CN
China
Prior art keywords
touch
display
annotation
content
display device
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202210334026.4A
Other languages
Chinese (zh)
Inventor
张敬坤
赵洋
林乐
黄萌瑶
周琼琼
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Hisense Visual Technology Co Ltd
Original Assignee
Hisense Visual Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Hisense Visual Technology Co Ltd filed Critical Hisense Visual Technology Co Ltd
Priority to CN202210334026.4A
Publication of CN114721563A
Legal status: Pending


Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0481 Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F 3/04815 Interaction with a metaphor-based environment or interaction object displayed as three-dimensional, e.g. changing the user viewpoint with respect to the environment or object
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0487 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F 3/0488 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. tap gestures based on pressure sensed by a digitiser, using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F 3/04886 Interaction techniques based on graphical user interfaces [GUI] using a touch-screen or digitiser, by partitioning the display area of the touch-screen or the surface of the digitising tablet into independently controllable areas, e.g. virtual keyboards or menus

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The disclosure relates to a display device, an annotation method, and a storage medium, applied in the field of display technology. It addresses the prior-art problem that a display device which cannot be annotated directly through touch operations is inconvenient to use and provides a poor user experience. The display device includes: a display; and a controller configured to: receive touch information sent by a touch device, where the touch device mirrors the display content of the display, and the touch information includes: a touch operation event and position information corresponding to the touch operation event; draw annotation content corresponding to the touch information on the display content of the display based on the touch operation event and its corresponding position information, and control the display to display the annotation content.

Description

Display device, annotation method, and storage medium
Technical Field
The present disclosure relates to the field of display technologies, and in particular, to a display device, an annotation method, and a storage medium.
Background
With the continuous development of technology, display devices offer increasingly rich functions that bring great convenience to people's work and lives. For example, in a conference scenario, a user may play multimedia content through a display device (mainly a non-touch device, such as a multimedia device) so that all participants can view the multimedia content. However, a non-touch device cannot be annotated directly through touch operations, which is inconvenient for the user and results in a poor user experience.
Disclosure of Invention
To solve, or at least partially solve, this technical problem, the present disclosure provides a display device, an annotation method, and a storage medium that, through interaction between a touch device and the display device, allow corresponding annotation content to be drawn on the display content of the display device and displayed, even when the user cannot annotate the display device directly through touch operations.
In a first aspect, the present disclosure provides a display device comprising:
a display;
a controller configured to: receive touch information sent by a touch device, wherein the touch device mirrors the display content of the display, and the touch information includes: a touch operation event and position information corresponding to the touch operation event;
draw annotation content corresponding to the touch information on the display content of the display based on the touch operation event and the position information corresponding to the touch operation event, and control the display to display the annotation content.
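The controller behavior described above can be sketched as follows. This is an illustrative reconstruction, not the disclosed implementation: the event names (DOWN, MOVE, UP) and the data layout are assumptions. A press starts an annotation stroke, moves extend it, and a lift completes it; the display would then render each completed stroke.

```python
# Hypothetical sketch: turning a stream of touch operation events plus
# positions into annotation strokes. Event names are assumptions.

def build_strokes(touch_events):
    """touch_events: list of (event_type, (x, y)) tuples."""
    strokes = []    # completed strokes, each a list of points
    current = None  # the stroke currently being drawn
    for event_type, pos in touch_events:
        if event_type == "DOWN":
            current = [pos]
        elif event_type == "MOVE" and current is not None:
            current.append(pos)
        elif event_type == "UP" and current is not None:
            current.append(pos)
            strokes.append(current)
            current = None
    return strokes

events = [("DOWN", (10, 10)), ("MOVE", (12, 14)), ("UP", (15, 20))]
print(build_strokes(events))  # [[(10, 10), (12, 14), (15, 20)]]
```

A stroke that never receives an UP event is discarded, which mirrors the idea that only completed touch gestures become visible annotation content.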
In some embodiments of the present disclosure, the position information corresponding to the touch operation event is determined based on a resolution of the touch device;
the controller is specifically configured to:
convert the position information corresponding to the touch operation event based on the resolution of the touch device and the resolution of the display device to obtain target position information;
and draw target content corresponding to the touch information on the display content of the display based on the touch operation event and the target position information.
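The resolution-based conversion described above amounts to a linear scaling between the two coordinate spaces. The sketch below is illustrative; the function name and tuple layout are assumptions:

```python
def convert_position(pos, touch_resolution, display_resolution):
    """Scale a touch point from the touch device's coordinate space
    to the display device's coordinate space."""
    tx, ty = pos
    tw, th = touch_resolution
    dw, dh = display_resolution
    return (tx * dw / tw, ty * dh / th)

# A point at the centre of a 1080x2400 phone screen maps to the centre
# of a 3840x2160 display.
print(convert_position((540, 1200), (1080, 2400), (3840, 2160)))  # (1920.0, 1080.0)
```

When, as in the alternative embodiment below, the touch device already reports positions in the display device's resolution, this conversion step is unnecessary.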
In some embodiments of the present disclosure, the position information corresponding to the touch operation event is determined based on a resolution of the display device.
In some embodiments of the present disclosure, the controller is further configured to:
and receiving a touch command sent by the touch equipment, wherein the touch command is used for indicating the display equipment to annotate the display content of the display.
In some embodiments of the present disclosure, the display is further configured to:
display an annotation window after the controller receives a touch command sent by the touch device;
the controller is specifically configured to: draw annotation content corresponding to the touch information on the annotation window.
In some embodiments of the present disclosure, the controller is further configured to:
after the display displays the annotation window, set characteristic attributes for the annotation symbols in the annotation window;
and draw annotation content corresponding to the touch information on the annotation window according to the characteristic attributes.
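The characteristic attributes described above (for instance, annotation colour and line width) can be sketched as state held by the annotation window and inherited by each symbol drawn afterwards. All names and attribute choices here are illustrative assumptions, not the disclosed implementation:

```python
# Hypothetical annotation window holding characteristic attributes that
# every subsequently drawn annotation symbol inherits.

class AnnotationWindow:
    def __init__(self, color="#FF0000", line_width=4):
        self.color = color
        self.line_width = line_width
        self.symbols = []

    def set_attributes(self, color=None, line_width=None):
        if color is not None:
            self.color = color
        if line_width is not None:
            self.line_width = line_width

    def draw(self, points):
        # Each drawn symbol records the attributes active at draw time.
        self.symbols.append({"points": points,
                             "color": self.color,
                             "width": self.line_width})

window = AnnotationWindow()
window.set_attributes(color="#00FF00", line_width=2)
window.draw([(0, 0), (5, 5)])
print(window.symbols[0]["color"])  # #00FF00
```

Recording the attributes per symbol, rather than globally, lets earlier annotations keep their appearance when the user later changes pen settings.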
In some embodiments of the present disclosure, the controller is further configured to:
receive a touch-end command sent by the touch device;
and remove the annotation window and the annotation content corresponding to the annotation window according to the touch-end command.
In a second aspect, the present disclosure provides an annotation method, comprising:
receiving touch information sent by a touch device, wherein the touch device mirrors the display content of a display device, and the touch information includes: a touch operation event and position information corresponding to the touch operation event;
drawing annotation content corresponding to the touch information on display content of a display based on the touch operation event and the position information corresponding to the touch operation event, and controlling the display to display the annotation content.
In some embodiments of the present disclosure, the position information corresponding to the touch operation event is determined based on a resolution of the touch device;
the drawing annotation content corresponding to the touch information on the display content of the display based on the touch operation event and the position information corresponding to the touch operation event includes:
converting the position information corresponding to the touch operation event based on the resolution of the touch equipment and the resolution of the display equipment to obtain target position information;
and drawing target content corresponding to the touch information on the display content of the display based on the touch operation event and the target position information.
In some embodiments of the present disclosure, the position information corresponding to the touch operation event is determined based on a resolution of the display device.
In some embodiments of the present disclosure, the method further comprises:
and receiving a touch command sent by the touch equipment, wherein the touch command is used for indicating the display equipment to annotate the display content of the display.
In some embodiments of the present disclosure, the method further comprises:
displaying an annotation window after the controller receives a touch command sent by the touch device;
and drawing annotation content corresponding to the touch information on the annotation window.
In some embodiments of the present disclosure, the method further comprises:
after the display displays the annotation window, setting characteristic attributes for the annotation symbols in the annotation window;
and drawing annotation content corresponding to the touch information on the annotation window according to the characteristic attributes.
In some embodiments of the present disclosure, the method further comprises:
receiving a touch-end command sent by the touch device;
and removing the annotation window and the annotation content corresponding to the annotation window according to the touch-end command.
In a third aspect, the present disclosure provides a computer-readable storage medium having stored thereon a computer program which, when executed by a processor, implements the annotation method according to the second aspect.
In a fourth aspect, the present disclosure provides a computer program product which, when run on a computer, causes the computer to implement the annotation method according to the second aspect.
Compared with the prior art, the technical solution provided by the embodiments of the present disclosure has the following advantages. Touch information sent by a touch device is first received, where the touch device mirrors the display content of the display and the touch information includes a touch operation event and its corresponding position information. Annotation content corresponding to the touch information is then drawn on the display content of the display based on the touch operation event and the position information, and the display is controlled to display the annotation content. Through the interaction between the touch device and the display device, corresponding annotation content can be drawn and displayed on the display content of the display device even when the user cannot annotate the display device directly through touch operations.
Drawings
The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate embodiments consistent with the present disclosure and together with the description, serve to explain the principles of the disclosure.
To illustrate the embodiments of the present disclosure or the technical solutions in the prior art more clearly, the drawings needed in the description of the embodiments or the prior art are briefly introduced below; it will be apparent to those skilled in the art that other drawings can be obtained from these drawings without inventive effort.
Fig. 1 is a schematic diagram of an operational scenario between a display device and a control apparatus according to one or more embodiments of the present disclosure;
fig. 2 is a block diagram of a hardware configuration of the control apparatus 100 according to one or more embodiments of the present disclosure;
fig. 3 is a block diagram of a hardware configuration of a display apparatus 200 according to one or more embodiments of the present disclosure;
fig. 4 is a schematic diagram of a software configuration in a display device 200 according to one or more embodiments of the present disclosure;
FIG. 5 is a schematic illustration of an icon control interface display of an application in a display device 200 according to one or more embodiments of the present disclosure;
FIG. 6A is a system framework diagram of an annotation process implemented in accordance with one or more embodiments of the disclosure;
FIG. 6B is an architecture diagram for implementing an annotation methodology in accordance with one or more embodiments of the present disclosure;
fig. 7A is a schematic flow chart diagram of an annotation method provided by an embodiment of the disclosure;
fig. 7B is a schematic diagram illustrating an annotation process according to an embodiment of the disclosure;
fig. 7C is a schematic view of an interaction process between the touch device and the display device in the annotation method provided by the embodiment of the disclosure;
fig. 7D is a schematic diagram of a first display interface of a display device according to an embodiment of the disclosure;
fig. 7E is a schematic diagram of a second display interface when the display of the display device displays the annotation content according to the embodiment of the disclosure;
fig. 8A is a schematic flow chart of another annotation method provided in the embodiment of the present disclosure;
fig. 8B is a schematic diagram illustrating another annotation method provided by an embodiment of the disclosure;
fig. 8C is a schematic view of an interaction process between the touch device and the display device in the annotation method provided by the embodiment of the disclosure;
fig. 9A is a schematic flow chart of another annotation method provided by the embodiment of the disclosure;
fig. 9B is a schematic diagram illustrating a principle of another annotation method provided by an embodiment of the disclosure;
fig. 9C is a schematic view of an interaction process between the touch device and the display device in the annotation method provided by the embodiment of the disclosure;
fig. 9D is a schematic diagram of a third display interface when the display of the display device displays the annotation window according to the embodiment of the disclosure;
fig. 9E is a schematic diagram of a fourth display interface when the display of the display device displays annotation content according to the embodiment of the disclosure.
Detailed Description
In order that the above objects, features and advantages of the present disclosure may be more clearly understood, aspects of the present disclosure will be further described below. It should be noted that the embodiments and features of the embodiments of the present disclosure may be combined with each other without conflict.
In the following description, numerous specific details are set forth in order to provide a thorough understanding of the present disclosure, but the present disclosure may be practiced in other ways than those described herein; it is to be understood that the embodiments disclosed in the specification are only a few embodiments of the present disclosure, and not all embodiments.
The terms "first" and "second," etc. in this disclosure are used to distinguish between different objects, rather than to describe a particular order of objects. For example, the first display interface and the second display interface, etc. are used to distinguish different display interfaces, rather than to describe a particular order of display interfaces.
Currently, multi-screen interaction is a common scenario in the display field, for example, content interaction between a small-screen device (such as a mobile phone or notebook computer) and a large-screen device (such as a television). Mobile phones generally run one of two operating systems: iOS or Android. On iOS, screen projection and mirroring can be realized through the AirPlay protocol (a proprietary protocol). On Android, the phone can project its screen through the Digital Living Network Alliance (DLNA) protocol, or mirror screen images and audio content to nearby devices over Wi-Fi Direct based on Miracast. A smart television, or a television with an external set-top box, can interact with the mobile phone through the AirPlay and DLNA protocols to realize multi-screen interaction (mainly screen projection); other protocols may also be used, which this embodiment does not specifically limit.
Annotation is a way of commenting on and marking up the content of a multimedia file, such as a presentation (PPT) or an Excel spreadsheet, so that users can grasp the file's content more easily. Annotation has specific applications in many scenarios; for example, in a conference scenario, the core requirement is to project a user's document and annotate explanations at any time during a talk. The embodiments of the present disclosure are mainly applicable to the case where a user cannot annotate the content displayed by the display device directly through touch operations.
It should be noted that the touch device in this embodiment may be understood as a device having a touch screen, while the display device mainly refers to a non-touch device, that is, a device without a touch screen, such as a multimedia device or a television. The touch screen may be a capacitive screen, an electromagnetic screen, an infrared screen, etc., which this embodiment does not limit.
In some technologies, when the display device cannot be annotated directly through touch operations, its display content is sent to a terminal; when the user annotates on the terminal, the annotation handwriting is drawn directly in a canvas layer of the terminal, and the annotation trajectory is projected back to the display device for viewing. However, with this approach the annotation handwriting easily lags under poor wireless conditions or limited hardware resources, which is inconvenient and degrades the user experience.
To solve the above problem, the present disclosure may first mirror the content of the display device to the touch device using a screen projection or screen recording function, so that the touch device mirrors the display content of the display. The controller of the display device then receives touch information sent by the touch device, where the touch information includes a touch operation event and position information corresponding to that event. Based on the touch operation event and its position information, the controller draws annotation content corresponding to the touch information on the display content of the display and controls the display to display it. Through this interaction between the touch device and the display device, corresponding annotation content can be drawn and displayed on the display content of the display device even when the user cannot annotate the display device directly through touch operations.
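The touch information exchanged in this flow, a touch operation event plus its position, could for illustration be serialized as a small message. The disclosure does not specify a wire format, so the JSON encoding and field names below are assumptions:

```python
import json

# Hypothetical wire format for the touch information a touch device
# sends to the display device: the touch operation event and the
# position at which it occurred.

def encode_touch_info(event_type, x, y):
    return json.dumps({"event": event_type, "x": x, "y": y})

def decode_touch_info(message):
    info = json.loads(message)
    return info["event"], (info["x"], info["y"])

msg = encode_touch_info("MOVE", 320, 540)
print(decode_touch_info(msg))  # ('MOVE', (320, 540))
```

On the display-device side, the decoded event type and position would feed directly into the drawing step described above.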
Fig. 1 is a schematic diagram of an operation scenario between a display device and a control apparatus according to one or more embodiments of the present disclosure. As shown in fig. 1, a user may operate the display device 200 through a touch device 300 (a smart terminal) or the control apparatus 100, and various multimedia files, such as video files, PPTs, or Excel spreadsheets, may be played on the display device 200. Suppose a user playing a PPT on the display device needs to annotate a piece of content on a certain page. Since the display device has no touch function, it cannot receive the user's touch operations directly and therefore cannot be annotated directly. To annotate that content, the user may mirror the PPT onto the touch device 300 through a screen projection or screen recording operation, and then perform touch operations on the touch device 300, such as pressing, lifting, or moving. Meanwhile, the touch device 300 sends the touch information, mainly the touch operation event and its corresponding position information, to the display device. After receiving the touch information, the controller of the display device draws annotation content corresponding to it on the display content of its display (that is, the piece of content on that page of the PPT) based on the touch operation event and the position information, and controls the display to display the annotation content. This solves the problems in the prior art: the content displayed by the display device can be annotated, operational inconvenience is avoided, and the user experience is improved.
In some embodiments, the touch device 300 may be a mobile phone, a notebook computer, a tablet computer, or a device that is used for annotating a display device and is configured with the display device, and the like, which is not limited in this embodiment.
In some embodiments, the control apparatus 100 may be a remote controller, which communicates with the display device through infrared protocol communication, Bluetooth protocol communication, other wireless communication, or wired communication to control the display device 200. The user may input user commands through keys on the remote controller, voice input, control panel input, and the like to control the display apparatus 200. In some embodiments, the display device 200 may also be controlled using mobile terminals, tablets, computers, laptops, and other smart devices.
In some embodiments, the display device 200 may receive user control through touch, gestures, or the like, instead of receiving instructions through the smart devices or control apparatus described above.
In some embodiments, the display device 200 may also be controlled in ways other than through the control apparatus 100 and the touch device 300; for example, a user's voice instructions may be received directly by a module configured inside the display device 200, or by a voice control device provided outside the display device 200.
In some embodiments, the touch device 300 and the display device 200 may each run a software application and establish connection and communication through a network communication protocol, achieving one-to-one control operation and data communication. Audio and video content displayed on the touch device 300 can be transmitted to the display device 200, and audio and video content displayed on the display device 200 can likewise be transmitted to the touch device 300, realizing a synchronous display function. The display apparatus 200 also communicates data with the server 400 through various communication methods; it may be communicatively connected through a local area network (LAN), a wireless local area network (WLAN), or other networks. The server 400 may be one cluster or a plurality of clusters and may include one or more types of servers; it may provide various content and interactions to the display apparatus 200. The display device 200 may be a liquid crystal display, an OLED display, a projection display device, etc. In addition to the broadcast receiving television function, the display apparatus 200 may additionally provide a smart network television function with computer support.
Fig. 2 is a block diagram of a hardware configuration of the control apparatus 100 according to one or more embodiments of the present disclosure. As shown in fig. 2, the control apparatus 100 includes a controller 110, a communication interface 130, a user input/output interface 140, a memory, and a power supply. The control apparatus 100 may receive input operation instructions from a user and convert them into instructions that the display device 200 can recognize and respond to, serving as an intermediary for interaction between the user and the display device 200. The communication interface 130 is used for external communication and includes at least one of a Wi-Fi chip, a Bluetooth module, NFC, or an alternative module. The user input/output interface 140 includes at least one of a microphone, a touch pad, a sensor, a key, or an alternative module.
Fig. 3 is a block diagram of a hardware configuration of the display apparatus 200 according to one or more embodiments of the present disclosure. As shown in fig. 3, the display apparatus 200 includes at least one of a tuner demodulator 210, a communicator 220, a detector 230, an external device interface 240, a controller 250, a display 260, an audio output interface 270, a memory, a power supply, and a user interface (i.e., a user input interface) 280. The controller 250 includes a central processor, a video processor, an audio processor, a graphics processor, RAM, ROM, and first to nth input/output interfaces. The display 260 may be at least one of a liquid crystal display, an OLED display, a touch display, and a projection display, and may also be a projection device with a projection screen. The tuner demodulator 210 receives broadcast television signals in a wired or wireless manner and demodulates audio/video signals, as well as data signals such as EPG data, from a plurality of wireless or wired broadcast television signals. The communicator 220 is a component for communicating with an external device or a server according to various communication protocol types; for example, it may include at least one of a Wi-Fi module, a Bluetooth module, a wired Ethernet module, other network or near-field communication protocol chips, and an infrared receiver. The display apparatus 200 may establish transmission and reception of control signals and data signals with the external control apparatus 100 or the server 400 through the communicator 220. The detector 230 is used to collect signals from the external environment or from interaction with the outside.
The controller 250 and the tuner demodulator 210 may be located in separate devices; that is, the tuner demodulator 210 may also be located in a device external to the main device where the controller 250 is located, such as an external set-top box. The user interface 280 may be used to receive control signals from the control apparatus 100 (e.g., an infrared remote control).
In some embodiments, the controller 250 controls the operation of the display device and responds to user operations through various software control programs stored in memory; it controls the overall operation of the display apparatus 200. A user may input a user command on a graphical user interface (GUI) displayed on the display 260, and the user input interface receives the command through the GUI. Alternatively, the user may input a command through a specific sound or gesture, which the user input interface recognizes through a sensor.
In some embodiments, a "user interface" is a media interface for interaction and information exchange between an application or operating system and a user that enables conversion between an internal form of information and a form that is acceptable to the user. A commonly used presentation form of the User Interface is a Graphical User Interface (GUI), which refers to a User Interface related to computer operation and displayed in a graphical manner. It may be an interface element such as an icon, a window, and a control displayed in a display screen of the electronic device, where the control may include at least one of an icon, a button, a menu, a tab, a text box, a dialog box, a status bar, a navigation bar, a Widget, and other visual interface elements.
Fig. 4 is a schematic diagram of a software configuration in a display device 200 according to one or more embodiments of the present disclosure, and as shown in fig. 4, the system is divided into four layers, which are, from top to bottom, an Application (Applications) layer (referred to as an "Application layer"), an Application Framework (Application Framework) layer (referred to as a "Framework layer"), an Android runtime (Android runtime) and system library layer (referred to as a "system runtime library layer"), and a kernel layer.
In some embodiments, at least one application program runs in the application layer. These applications may be programs carried by the operating system, such as a window program, a system setting program, or a clock program; they may also be applications developed by third-party developers. In particular implementations, the applications in the application layer are not limited to the above examples.
In some embodiments, the system runtime library layer provides support for the upper layer, i.e., the framework layer. When the framework layer is used, the Android operating system runs the C/C++ libraries included in the system runtime library layer to implement the functions required by the framework layer.
In some embodiments, the kernel layer is a layer between hardware and software, and includes at least one of the following drivers: an audio driver, a display driver, a Bluetooth driver, a camera driver, a Wi-Fi driver, a USB driver, an HDMI driver, a sensor driver (e.g., for a fingerprint sensor, a temperature sensor, a pressure sensor, etc.), a power driver, and the like.
Fig. 5 is a schematic diagram of an icon control interface display of applications in the display device 200 according to one or more embodiments of the present disclosure. As shown in fig. 5, the application layer includes at least one application whose corresponding icon control can be displayed on the display, for example: a live television application icon control, a video-on-demand application icon control, a media center application icon control, an application center icon control, a game application icon control, and the like. The live television application can provide live television through different signal sources. Unlike live television, a video-on-demand application provides video playback from different storage sources. The media center application can provide various applications for playing multimedia content. The application center can provide and store various applications.
In some embodiments, the display device is a terminal device with a display function, such as a television, a mobile phone, a computer, a learning machine, and the like. In the display device:
an output interface (display 260, and/or audio output interface 270) configured to output user interaction information;
a communicator 220 for communicating with the server 400;
a controller 250 configured to: receiving touch information sent by a touch device, wherein the touch device displays the display content of the display in a mirror image mode, and the touch information comprises: a touch operation event and position information corresponding to the touch operation event;
drawing annotation content corresponding to the touch information on display content of the display based on the touch operation event and the position information corresponding to the touch operation event, and controlling the display to display the annotation content.
In some embodiments, the position information corresponding to the touch operation event is determined based on a resolution of the touch device;
the controller 250, in particular, is configured to:
converting the position information corresponding to the touch operation event based on the resolution of the touch equipment and the resolution of the display equipment to obtain target position information;
and drawing target content corresponding to the touch information on the display content of the display based on the touch operation event and the target position information.
In some embodiments, the position information corresponding to the touch operation event is determined based on a resolution of the display device.
In some embodiments, the controller 250 is further configured to:
and receiving a touch command sent by the touch equipment, wherein the touch command is used for indicating the display equipment to annotate the display content of the display.
In some embodiments, the display is further configured to:
after the controller receives a touch control command sent by the touch control equipment, displaying an annotation window;
the controller is specifically configured to: and drawing annotation content corresponding to the touch information on the annotation window.
In some embodiments, the controller 250 is further configured to:
after the display displays the annotation window, setting characteristic attributes corresponding to the annotation symbols in the annotation window;
and drawing annotation content corresponding to the touch information on the annotation window according to the characteristic attribute.
In some embodiments, the controller 250 is further configured to:
receiving a touch ending command sent by the touch equipment;
and removing the annotation window and annotation contents corresponding to the annotation window according to the touch control ending command.
In summary, the present disclosure first receives touch information sent by a touch device, where the touch device displays the display content of the display in a mirror image manner, and the touch information includes a touch operation event and position information corresponding to the touch operation event. Annotation content corresponding to the touch information is then drawn on the display content of the display based on the touch operation event and its position information, and the display is controlled to display the annotation content. Through this interaction between the touch device and the display device, even when a user cannot annotate the display device directly through a touch operation, the corresponding annotation content can still be drawn on the display content of the display device and displayed.
Fig. 6A is a system framework diagram for implementing an annotation method according to one or more embodiments of the present disclosure. As shown in fig. 6A, the system may include an information receiving module 601, a content drawing module 602, and a content display module 603. When a user plays certain multimedia content through a display device, suppose that some part of the multimedia content needs to be annotated; because the display device does not have a touch function, it cannot directly receive the user's touch operation, and thus the annotation cannot be performed directly. To annotate that piece of content, the user may first mirror the multimedia content into the touch device 300 through operations such as screen projection or screen recording, and then perform a touch operation on the touch device 300. The information receiving module 601 in the system can receive the touch information sent by the touch device; the content drawing module 602 then draws annotation content corresponding to the touch information on the display content of the display of the display device based on the touch operation event and the position information corresponding to the touch operation event; finally, the display is controlled to display the annotation content through the content display module 603.
Fig. 6B is an architecture diagram of an annotation method implemented according to one or more embodiments of the present disclosure. Based on the system framework in fig. 6A, the implementation of the present disclosure in an Android system is shown in fig. 6B. The Android system mainly includes an application layer, a framework layer, a system runtime library layer, and a kernel layer; the implementation logic is mainly embodied in the application layer and includes the information receiving module, the content drawing module, and the content display module.
The annotation method provided in the embodiments of the present disclosure can receive touch information sent by a touch device when a user cannot annotate a display device directly through a touch operation. The touch device displays the display content of the display in a mirror image manner, and the touch information includes a touch operation event and position information corresponding to the touch operation event. Annotation content corresponding to the touch information is drawn on the display content of the display based on the touch operation event and its position information, and the display is controlled to display the annotation content; in this way, through the interaction between the touch device and the display device, the corresponding annotation content can be drawn on the display content of the display device and displayed. For a more detailed description of the present solution, reference is made to fig. 7A below. It should be understood that, in actual implementation, the steps involved in fig. 7A may be more or fewer, and the order between the steps may also differ, as long as the annotation method provided in the embodiments of the present disclosure can be implemented; the embodiments of the present disclosure are not limited in this respect.
Fig. 7A is a schematic flowchart of an annotation method provided in an embodiment of the disclosure, and fig. 7B is a schematic principle diagram of the annotation method. This embodiment can be applied to a situation in which the display content of a display device is annotated through a touch device.
As shown in fig. 7A, the annotation method specifically includes the following steps:
S710, receiving touch information sent by the touch device, where the touch device displays the display content of the display in a mirror image manner, and the touch information includes: the touch operation event and the position information corresponding to the touch operation event.
The touch information may be understood as the touch operation events generated when a user performs touch operations (i.e., operations corresponding to annotation) on the content displayed in the touch device, together with the position information corresponding to each touch operation event. A touch operation may be a click (down), a lift (up), a move (move), or the like; accordingly, a touch operation event may be a click event, a lift event, a move event, or the like. The position information may be understood as the specific information, mainly coordinate information, of the position where a touch operation event occurs.
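As an illustrative aid (the disclosure does not prescribe any particular data structure), the touch information described above might be modeled as a small event record; the class and field names below are hypothetical:

```python
from dataclasses import dataclass

# Hypothetical model of one unit of touch information: the touch operation
# event type (down / move / up) plus the coordinates where it occurred.
@dataclass
class TouchEvent:
    action: str  # "down" (click), "up" (lift), or "move"
    x: int       # horizontal coordinate on the touch device's screen
    y: int       # vertical coordinate on the touch device's screen

    def is_valid(self) -> bool:
        # A minimal sanity check on the event type and coordinates.
        return self.action in ("down", "up", "move") and self.x >= 0 and self.y >= 0

ev = TouchEvent("down", 120, 340)
```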
When multi-screen interaction is performed between a touch device and a display device, two mirroring modes are mainly used: one is mirroring the touch device to the display device; the other is mirroring the display device to the touch device.
In the annotation method in this embodiment, when a user wants to annotate the display content of the display in the display device, and the display device cannot be annotated directly through a touch operation, the display device needs to mirror the display content of its display to the touch device so that the touch device displays that content in a mirror image manner. Specifically, the display content in the display of the display device is converted into a plurality of corresponding image frames by starting a screen recording interface of the display device, and the image frames are sent to the touch device so that they can be displayed there synchronously; alternatively, the display content of the display device can be projected to the touch device for synchronous display through a screen projection function. The touch device can receive the user's touch operations, so while the display content of the display is being mirrored to the touch device, the user can select an annotation function through a menu bar displayed in the touch device and then perform the corresponding touch operations. The touch screen of the touch device receives a touch operation triggered by the user, and a sensor in the touch device collects the touch information generated by the touch operation and transmits it to the display device, specifically through the Transmission Control Protocol (TCP). The position information corresponding to the touch operation event may be transmitted by the touch device to the display device through a TCP command in the form of a character string, which may start with pullminor_touch. Accordingly, the controller of the display device can receive the touch information sent by the touch device.
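The exact wire format of the touch-information string is not given in this disclosure beyond its fixed prefix, so the following sketch assumes a hypothetical "PREFIX:action,x,y" layout (and writes the prefix in the uppercase PULLMIRROR style of the disclosure's other command strings) purely for illustration:

```python
# Hypothetical parser for the touch-information string carried over TCP.
# Only the fixed prefix is stated in the disclosure; the "action,x,y"
# payload layout assumed here is illustrative.
TOUCH_PREFIX = "PULLMIRROR_TOUCH"

def parse_touch_message(message: str):
    """Return (action, x, y) parsed from a 'PULLMIRROR_TOUCH:action,x,y' string."""
    if not message.startswith(TOUCH_PREFIX):
        raise ValueError("not a touch message: " + message)
    payload = message[len(TOUCH_PREFIX):].lstrip(":")
    action, x, y = payload.split(",")
    return action, int(x), int(y)
```

On the display device side, such a parser would run on each string received over the TCP connection before the drawing step.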
S720, drawing annotation content corresponding to the touch information on the display content of the display based on the touch operation event and the position information corresponding to the touch operation event, and controlling the display to display the annotation content.
After receiving the touch operation event and its corresponding position information from the touch device, the controller of the display device simulates the touch operation event through a corresponding simulation method; at the same time, annotation content corresponding to the touch information can be drawn on the display content of the display based on the position information corresponding to the touch operation event, and the display is controlled to display the annotation content.
In some embodiments, the controller may simulate the touch operation event by using a sendPointSync method to obtain simulated event information, draw the annotation content corresponding to the touch information on the display content of the display through a Canvas according to the simulated event information and the position information corresponding to the touch operation event, and control the display to display the annotation content while it is being drawn.
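The Android sendPointSync and Canvas calls mentioned above cannot be demonstrated outside a device, so the sketch below only models the drawing logic they would drive: a down event starts a stroke, move events extend it, and an up event completes it. All names here are illustrative, not the disclosure's implementation:

```python
# Illustrative stroke model: annotation content is accumulated as a list of
# strokes, where each stroke is the sequence of points between a down event
# and the matching up event. A real implementation would render each new
# point onto a Canvas as it arrives.
class AnnotationLayer:
    def __init__(self):
        self.strokes = []     # finished strokes
        self._current = None  # stroke currently being drawn, if any

    def handle_event(self, action, x, y):
        if action == "down":
            self._current = [(x, y)]
        elif action == "move" and self._current is not None:
            self._current.append((x, y))
        elif action == "up" and self._current is not None:
            self._current.append((x, y))
            self.strokes.append(self._current)
            self._current = None
```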
In some embodiments, when the touch device is mirrored to the display device, since the touch device can annotate directly through touch operation, the user can first operate a menu bar of the touch device to select an annotation function, and then the touch device draws corresponding annotation content on the display content displayed by the touch device according to touch information generated by the real-time touch operation of the user. And the touch control equipment can transmit the real-time data stream when drawing the annotation content in the self equipment to the display equipment for displaying through a local area network or a wide area network.
For example, the process of mirroring the touch device to the display device can be applied to the following scenario: when a mobile phone mirrors content such as PPT or Word documents to a television for playback, annotations are made at the mobile phone end to obtain annotation content, and the annotation content is displayed synchronously at the television end, thereby realizing a process in which the mobile phone end annotates content shared on the television end.
Fig. 7C is a schematic view of the interaction process between the touch device and the display device in the method provided by the embodiment of the disclosure. As shown in fig. 7C, the process of implementing the annotation method through interaction between the touch device and the display device has been described in detail in the foregoing embodiments and is not repeated here.
Fig. 7D is a schematic diagram of a first display interface of a display device according to an embodiment of the disclosure, and as shown in fig. 7D, the first display interface displayed by the display device before annotation is exemplarily shown, and the first display interface is synchronously displayed in the touch device.
Fig. 7E is a schematic diagram of a second display interface when the display of the display device displays annotation content according to the embodiment of the disclosure, as shown in fig. 7E:
assuming that a user intends to annotate content displayed on the first display interface in fig. 7D, fig. 7E exemplarily shows a schematic diagram of a corresponding second display interface when controlling the display to display annotated content, which is obtained by the annotation method in the above embodiment. Fig. 7E shows that 11 is the annotation content, and the second display interface is synchronously displayed in the touch device.
In some embodiments, displaying the display content of the display in a mirror image manner by the touch device may specifically include: the display device starts a screen recording interface, and the content displayed by the display is sent to the touch device through the screen recording interface until the screen recording interface is closed, at which point the sending stops; in this way, the touch device can synchronously display the display content of the display.
In this embodiment, the touch device can synchronously display the content of the display device through the above method, so that the user can perform touch operations on the display content shown in the touch device to generate touch information, which the touch device then sends to the display device.
Exemplarily, in the process of mirroring the display device to the touch device, assuming the display device is a television and the touch device is a mobile phone, the mobile phone controls the television to draw the corresponding annotation content and display it; meanwhile, with the screen recording interface enabled, the television transmits its annotation display process back to the mobile phone for real-time display, so that accurate annotation and synchronous display on both the display device and the touch device can be realized.
Fig. 8A is a schematic flow chart of another annotation method provided in the embodiment of the present disclosure, and fig. 8B is a schematic principle diagram of another annotation method provided in the embodiment of the present disclosure. Optionally, the present embodiment mainly further describes a process of drawing annotation content corresponding to the touch information.
As shown in fig. 8A, the annotation method specifically includes the following steps:
S810, receiving touch information sent by the touch device, where the touch device displays the display content of the display in a mirror image manner, and the touch information includes: a touch operation event and position information corresponding to the touch operation event, where the position information corresponding to the touch operation event is determined based on the resolution of the touch device.
The position information corresponding to the touch operation event is determined based on the resolution of the touch device; it can be understood as the original position information collected by the touch device. When the resolution of the touch device differs from the resolution of the display device, this original position information needs to be converted before it can be used in subsequent steps.
It should be noted that the resolution of the touch device is typically different from the resolution of the display device.
S820, converting the position information corresponding to the touch operation event based on the resolution of the touch device and the resolution of the display device to obtain target position information.
The target position information may be understood as position information obtained by converting the original position information.
Since the position information corresponding to the touch operation event is determined based on the resolution of the touch device, the controller of the display device needs to convert the position information after receiving it. Specifically, the position information corresponding to the touch operation event is converted according to the resolution of the touch device and the resolution of the display device, and the target position information is obtained after the conversion.
The following is exemplary: assuming that the position information corresponding to the touch operation event determined based on the resolution of the touch device is (X1, Y1), the target position information is (X2, Y2), the resolution of the touch device is (W1, H1), and the resolution of the display device is (W2, H2), the conversion formula for converting the position information is as follows:

X2 = X1 × W2 / W1
Y2 = Y1 × H2 / H1
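The resolution-based conversion above amounts to linear scaling by the ratio of the two resolutions; a minimal sketch (rounding to whole pixels, which the disclosure does not specify) is:

```python
def convert_position(x1, y1, touch_res, display_res):
    """Scale a point from the touch device's resolution to the display device's.

    touch_res = (W1, H1) and display_res = (W2, H2); the point is scaled as
    X2 = X1 * W2 / W1 and Y2 = Y1 * H2 / H1.
    """
    w1, h1 = touch_res
    w2, h2 = display_res
    return round(x1 * w2 / w1), round(y1 * h2 / h1)
```

For example, the center of a 1080x2340 phone screen maps to the center of a 3840x2160 television screen.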
and S830, drawing target content corresponding to the touch information on the display content of the display based on the touch operation event and the target position information, and controlling the display to display the target content.
After obtaining the target position information, the controller of the display device simulates the touch operation event through a corresponding simulation method; at the same time, it can draw the target content corresponding to the touch information on the display content of the display based on the target position information and control the display to display the target content.
In this embodiment, the target position information is obtained by converting the position information corresponding to the touch operation event that was determined based on the resolution of the touch device; the target content corresponding to the touch information is drawn on the display content of the display based on the touch operation event and the target position information, and the display is controlled to display the target content. This improves the accuracy of the position information and, in turn, the accuracy of the drawn target content and the user experience.
Fig. 8C is a schematic view of the interaction process between the touch device and the display device in the method provided by the embodiment of the present disclosure. As shown in fig. 8C, the process of implementing this annotation method through interaction between the touch device and the display device has been described in detail in the above embodiments and is not repeated here.
In some embodiments, the position information corresponding to the touch operation event is determined based on a resolution of the display device.
Specifically, the touch screen of the touch device can receive a touch operation triggered by the user, and the sensor in the touch device can collect the touch information generated by the touch operation. After obtaining position information determined based on the resolution of the touch device, the processor of the touch device can convert the position information according to the resolution of the touch device and the resolution of the display device (the specific conversion formula is described in the above embodiment) and, after the conversion is completed, send the converted result to the display device. Accordingly, the controller of the display device can receive the converted position information transmitted by the touch device.
In this embodiment, the conversion is performed in the touch device and the converted position information is sent to the display device, which reduces the processing load of the display device: the display device does not need to convert the position information again and can directly draw the annotation content corresponding to the touch information on the display content of the display based on the touch operation event and its corresponding position information, thereby improving efficiency.
Fig. 9A is a schematic flowchart of another annotation method provided in the embodiment of the present disclosure, and fig. 9B is a schematic diagram of a principle of the another annotation method provided in the embodiment of the present disclosure. Optionally, this embodiment mainly explains a process before receiving touch information sent by the touch device.
As shown in fig. 9A, the annotation method specifically includes the following steps:
S910, receiving a touch command sent by the touch device, where the touch command is used to instruct the display device to annotate the display content of the display.
The touch control command is an annotation starting command.
When the touch device is not a device matched with the display device and dedicated to annotating it, such as a mobile phone or a laptop, the touch device needs to send a touch command to the display device first; after receiving the touch command, the controller of the display device can ensure that the display content of the display can subsequently be annotated (i.e., S920-S930).
Illustratively, the touch command may begin with a string PULLMIRROR _ COMMENTS that can be sent to the display device via a TCP command.
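Since each of the disclosure's TCP command strings is distinguished by a fixed prefix (PULLMIRROR_COMMENTS here, and the brush-attribute prefixes described later), the display device could route incoming commands with a simple prefix dispatcher; the handler names below are hypothetical:

```python
# Hypothetical prefix-based dispatcher for the TCP command strings in this
# disclosure. Longer prefixes are checked first so that overlapping
# prefixes cannot shadow one another.
COMMAND_PREFIXES = {
    "PULLMIRROR_COMMENTS": "start_annotation",
    "PULLMIRROR_PAINTBRUSHCOLOR": "set_brush_color",
    "PULLMIRROR_PAINTBRUSHWIDTH": "set_brush_width",
}

def classify_command(message: str) -> str:
    for prefix, action in sorted(COMMAND_PREFIXES.items(),
                                 key=lambda item: -len(item[0])):
        if message.startswith(prefix):
            return action
    return "unknown"
```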
S920, receiving touch information sent by the touch device, where the touch device displays display content of the display in a mirror image manner, and the touch information includes: the touch control operation event and the position information corresponding to the touch control operation event.
S930, drawing annotation content corresponding to the touch information on the display content of the display based on the touch operation event and the position information corresponding to the touch operation event, and controlling the display to display the annotation content.
In this embodiment, before receiving the touch information sent by the touch device, the display device also needs to receive a touch command sent by the touch device, so that smooth proceeding of a subsequent annotation process can be ensured.
In some embodiments, when the touch device is a device matched with the display device and dedicated to annotating it, the touch device is used to control the display device to implement the annotation function, and the two are in a corresponding relationship. In this case, the touch device does not need to send a touch command to the display device: after receiving the touch information sent by the touch device, the display device can directly draw the annotation content corresponding to the touch information on the display content of the display, based on the touch operation event and the position information corresponding to the touch operation event included in the touch information, and control the display to display the annotation content.
Fig. 9C is a schematic view of the interaction process between the touch device and the display device in the annotation method provided by the embodiment of the disclosure. As shown in fig. 9C, the process of implementing this annotation method through interaction between the touch device and the display device has been described in detail in the above embodiments and is not repeated here.
In some embodiments, the method may further specifically include:
after the controller receives a touch control command sent by the touch control equipment, displaying an annotation window;
and drawing annotation content corresponding to the touch information on the annotation window.
Specifically, after the controller of the display device receives the touch command sent by the touch device, an annotation window is displayed on the display of the display device; the position of the annotation window can be customized by the user, and the controller can subsequently draw the annotation content corresponding to the touch information on the annotation window.
In this embodiment, by showing the annotation window, the user can subsequently watch the annotation process and correct errors in time when they are found during annotation.
Fig. 9D is a schematic diagram of a third display interface when a display of a display device displays an annotation window according to an embodiment of the present disclosure, and an implementation manner is exemplarily shown, where a dashed box indicated by 12 in fig. 9D is the annotation window.
Fig. 9E is a schematic view of a fourth display interface when the display of the display device displays annotation content provided by the embodiment of the disclosure, and on the basis of fig. 9D, the display device draws the annotation content corresponding to the touch information on the annotation window shown in fig. 9D, as shown in fig. 9E, the annotation content is 13.
In some embodiments, the method may further specifically include:
after the controller receives the touch command sent by the touch device, the annotation window can be set to be transparent (i.e., invisible to the user) and made to float on the uppermost layer of the displayed content, so that the annotation window is prevented from blocking part of the display content; correspondingly, when the display opens the annotation window, the window does not capture focus, which prevents other control devices from interfering with the annotation process.
In some embodiments, the method may further specifically include:
after the display displays the annotation window, setting characteristic attributes corresponding to the annotation symbols in the annotation window;
and drawing annotation content corresponding to the touch information on the annotation window according to the characteristic attribute.
The annotation symbol may be understood as each symbol for annotation displayed when the annotation function is activated, such as a brush, a pencil, a straight line, a rectangle, a circle, an arrow, and the like. The characteristic attribute can be understood as the characteristics such as color and thickness corresponding to the annotation symbol.
Specifically, after the display displays the annotation window, the controller can also set a characteristic attribute corresponding to the annotation symbol in the annotation window, so that the annotation content corresponding to the touch information is drawn on the annotation window according to the set characteristic attribute.
For example, assuming that the annotation symbol is a brush, its color attribute is red, and its thickness attribute is a value a, then when the annotation content corresponding to the touch information is subsequently drawn on the display content of the display, the brush draws annotation content as red lines of thickness a.
In some embodiments, the characteristic attribute corresponding to the annotation symbol may be a default color and a default thickness set by the controller after starting the annotation window and performing an initialization operation; the characteristic attribute corresponding to the annotation symbol may also be determined by the user by operating the touch device, and the touch device sends the determined characteristic attribute corresponding to the annotation symbol to the display device.
For example, the touch device may send the characteristic attribute corresponding to the determined annotation symbol to the display device in a form of a character string through a TCP command. Wherein, the color in the character string can be as follows: starting with PULLMIRROR _ PAINTBRUSHCOLOR, the coarseness in the string may start with PULLMIRROR _ PAINTBRUSHWIDTH.
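Assuming the attribute value simply follows the stated prefix in each string (the disclosure fixes only the prefixes, not the payload layout), the display device could recover the brush attributes as follows; the ":value" separator is an assumption:

```python
# Hypothetical parsing of the brush-attribute command strings; only the
# prefixes are stated in the disclosure, so the ':value' payload layout
# assumed here is illustrative.
def parse_brush_attribute(message: str):
    for prefix, key in (("PULLMIRROR_PAINTBRUSHCOLOR", "color"),
                        ("PULLMIRROR_PAINTBRUSHWIDTH", "width")):
        if message.startswith(prefix):
            value = message[len(prefix):].lstrip(":")
            return key, (int(value) if key == "width" else value)
    raise ValueError("not a brush attribute: " + message)
```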
In some embodiments, after the display device receives the characteristic attribute corresponding to the annotation symbol sent by the touch device, a new annotation window can be started to distinguish it from the original annotation window, and the characteristic attribute of the new annotation window is set at the same time. In this way, the annotation color can be changed in real time, improving the user's annotation experience and enriching annotation scenarios.
In some embodiments, the display device may receive the characteristic attribute and the touch information through onTouchEvent.
In some embodiments, the method may further specifically include:
receiving a touch ending command sent by the touch device;
and removing the annotation window and the annotation content corresponding to the annotation window according to the touch ending command.
Specifically, the controller of the display device can also receive a touch ending command sent by the touch device, and remove the annotation window and the annotation content corresponding to the annotation window according to the touch ending command.
In this embodiment, after annotation is finished, the above method restores the display content of the display to the original multimedia content.
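The window lifecycle described above — open an annotation window, draw annotation content on it, and remove both the window and its content on the touch ending command so the original multimedia content is shown again — can be sketched as follows. The class and method names are illustrative, not from this disclosure.

```python
# Minimal sketch of the annotation-window lifecycle described above.
# AnnotationController and all member names are hypothetical.

class AnnotationController:
    def __init__(self, multimedia_content: str):
        self.multimedia_content = multimedia_content   # original display content
        self.window = None                             # transparent annotation window
        self.strokes = []                              # drawn annotation content

    def on_touch_start_command(self):
        # Touch control command received: show the annotation window.
        self.window = {"transparent": True, "topmost": True}

    def on_touch_info(self, stroke):
        # Draw annotation content corresponding to incoming touch information.
        if self.window is not None:
            self.strokes.append(stroke)

    def on_touch_end_command(self):
        # Touch ending command: remove the window and its annotation content,
        # so the display falls back to the original multimedia content.
        self.window = None
        self.strokes = []

    def displayed(self):
        return (self.multimedia_content, list(self.strokes))

ctrl = AnnotationController("movie frame")
ctrl.on_touch_start_command()
ctrl.on_touch_info("red line from (10, 10) to (40, 40)")
ctrl.on_touch_end_command()
print(ctrl.displayed())  # ('movie frame', [])
```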
In some embodiments, the process of mirroring the display content of the display device to the touch device and controlling the display device to annotate through the touch device may be applied to the following scenarios:
1. After-sales support: a remote connection can be established, and when a user cannot use a certain function, support staff can annotate remotely to guide the user's operation on the television side, achieving the effect of helping the user solve the problem;
2. When the television is mirrored to a mobile phone, annotating on the phone side produces a picture in which the key annotated content is highlighted.
In some embodiments, when the display device is mirrored to the touch device, the user may store the annotation content displayed on the touch device through a menu bar option of the touch device, for example by taking a screenshot, and may share the stored data so that other users can also see the annotation content.
In some embodiments, when the touch device is mirrored to the display device, the touch device can annotate directly through touch operations. The user first operates the menu bar of the touch device to select the annotation function; the touch device then starts a window service (WindowService) to set up an annotation window that has a transparent property and floats on the uppermost layer of the touch device's display content. According to the user's real-time touch operations, the touch device obtains the real-time motion trajectory and draws the corresponding annotation content in the annotation window. The real-time data stream produced while drawing the annotation content on the touch device can be transmitted to the display device for display through a local area network or a wide area network. Meanwhile, the annotation window, when started, does not need to capture focus, so operation of the original control device (such as a remote controller) is unaffected and other operations of the display device can be performed while annotating, making the usage scenarios broader.
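Grouping the user's real-time touch operations into motion trajectories (one stroke per press-move-release sequence) can be sketched as below. The action names mirror Android's MotionEvent actions delivered through onTouchEvent, but this is plain Python illustrating the grouping logic, not Android code.

```python
# Illustrative sketch: build per-stroke motion trajectories from touch events.
# Event names follow Android's MotionEvent convention; the function name and
# tuple layout (action, x, y) are assumptions for this sketch.

DOWN, MOVE, UP = "ACTION_DOWN", "ACTION_MOVE", "ACTION_UP"

def collect_trajectories(events):
    """Group (action, x, y) touch events into per-stroke point lists."""
    strokes, current = [], None
    for action, x, y in events:
        if action == DOWN:
            current = [(x, y)]                 # a new stroke begins
        elif action == MOVE and current is not None:
            current.append((x, y))             # extend the current trajectory
        elif action == UP and current is not None:
            current.append((x, y))
            strokes.append(current)            # stroke finished: ready to stream
            current = None
    return strokes

events = [(DOWN, 0, 0), (MOVE, 5, 5), (MOVE, 10, 8), (UP, 12, 9)]
print(collect_trajectories(events))  # [[(0, 0), (5, 5), (10, 8), (12, 9)]]
```

Each completed trajectory would then be rendered into the transparent annotation window and its data stream forwarded to the display device over the local or wide area network.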
To sum up, the present disclosure executes the annotation method on the display device. The display device first receives touch information sent by the touch device, where the touch device displays the display content of the display in a mirror image manner, and the touch information includes a touch operation event and position information corresponding to the touch operation event. Based on the touch operation event and the corresponding position information, annotation content corresponding to the touch information is drawn on the display content of the display, and the display is controlled to display the annotation content. Through this interaction between the touch device and the display device, even when a user cannot annotate the display device directly through touch operation, the corresponding annotation content is drawn on the display content of the display device and displayed.
The disclosed embodiments provide a computer-readable storage medium on which a computer program is stored. When the computer program is executed by a processor, it implements each process of the annotation method above and can achieve the same technical effects; to avoid repetition, the details are not repeated here.
The computer-readable storage medium may be a Read-Only Memory (ROM), a Random Access Memory (RAM), a magnetic disk or an optical disk.
The present disclosure provides a computer program product, comprising: when the computer program product is run on a computer, the computer is caused to implement the annotation method described above.
The foregoing description has, for purposes of explanation, been presented in conjunction with specific embodiments. However, the illustrative discussions above are not intended to be exhaustive or to limit the implementations to the precise forms disclosed. Many modifications and variations are possible in light of the above teachings. The embodiments were chosen and described in order to best explain the principles and practical applications, thereby enabling others skilled in the art to best utilize the various embodiments, with such modifications as are suited to the particular use contemplated.

Claims (10)

1. A display device, comprising:
a display;
a controller configured to: receiving touch information sent by a touch device, wherein the touch device displays the display content of the display in a mirror image manner, and the touch information comprises: a touch operation event and position information corresponding to the touch operation event;
based on the touch operation event and the position information corresponding to the touch operation event, drawing annotation content corresponding to the touch information on the display content of the display, and controlling the display to display the annotation content.
2. The display device according to claim 1, wherein the position information corresponding to the touch operation event is determined based on a resolution of the touch device;
the controller is specifically configured to:
converting the position information corresponding to the touch operation event based on the resolution of the touch device and the resolution of the display device to obtain target position information;
and drawing target content corresponding to the touch information on the display content of the display based on the touch operation event and the target position information.
3. The display device according to claim 1, wherein the position information corresponding to the touch operation event is determined based on a resolution of the display device.
4. The display device of claim 1, wherein the controller is further configured to:
and receiving a touch control command sent by the touch device, wherein the touch control command is used for instructing the display device to perform an annotation operation on the display content of the display.
5. The display device of claim 4, wherein the display is further configured to:
after the controller receives the touch control command sent by the touch device, displaying an annotation window;
the controller is specifically configured to: and drawing annotation content corresponding to the touch information on the annotation window.
6. The display device of claim 5, wherein the controller is further configured to:
after the display displays the annotation window, setting characteristic attributes corresponding to the annotation symbols in the annotation window;
and drawing annotation content corresponding to the touch information on the annotation window according to the characteristic attribute.
7. The display device according to claim 5 or 6, wherein the controller is further configured to:
receiving a touch ending command sent by the touch device;
and removing the annotation window and the annotation content corresponding to the annotation window according to the touch ending command.
8. A method of annotating, comprising:
receiving touch information sent by a touch device, wherein the touch device displays display content of a display device in a mirror image manner, and the touch information comprises: a touch operation event and position information corresponding to the touch operation event;
drawing annotation content corresponding to the touch information on display content of a display based on the touch operation event and the position information corresponding to the touch operation event, and controlling the display to display the annotation content.
9. A computer-readable storage medium, comprising: the computer-readable storage medium has stored thereon a computer program which, when executed by a processor, implements the annotation method according to claim 8.
10. A computer program product, comprising: when the computer program product is run on a computer, the computer is caused to implement the annotation method of claim 8.
CN202210334026.4A 2022-03-30 2022-03-30 Display device, annotation method, and storage medium Pending CN114721563A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210334026.4A CN114721563A (en) 2022-03-30 2022-03-30 Display device, annotation method, and storage medium


Publications (1)

Publication Number Publication Date
CN114721563A true CN114721563A (en) 2022-07-08


Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20130293486A1 (en) * 2010-09-01 2013-11-07 Exent Technologies, Ltd. Touch-based remote control
CN105446689A (en) * 2015-12-16 2016-03-30 广州视睿电子科技有限公司 Synchronization method and system of remote commenting
CN109614178A (en) * 2018-09-04 2019-04-12 广州视源电子科技股份有限公司 Annotate display methods, device, equipment and storage medium



Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination