CN111221491A - Interaction control method and device, electronic equipment and storage medium - Google Patents


Info

Publication number
CN111221491A
CN111221491A
Authority
CN
China
Prior art keywords
terminal
picture
touch event
screen
displayed
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202010022596.0A
Other languages
Chinese (zh)
Inventor
林进全
Current Assignee
Oppo Chongqing Intelligent Technology Co Ltd
Original Assignee
Oppo Chongqing Intelligent Technology Co Ltd
Priority date
Filing date
Publication date
Application filed by Oppo Chongqing Intelligent Technology Co Ltd
Priority to CN202010022596.0A
Publication of CN111221491A

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/14: Digital output to display device; Cooperation and interconnection of the display device with other functional units
    • G06F 3/1454: Digital output to display device involving copying of the display data of a local workstation or window to a remote workstation or window so that an actual copy of the data is displayed simultaneously on two or more displays, e.g. teledisplay
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01: Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048: Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0487: Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F 3/0488: Interaction techniques based on graphical user interfaces [GUI] using a touch-screen or digitiser, e.g. input of commands through traced gestures

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

Embodiments of the present disclosure relate to an interaction control method and apparatus, an electronic device, and a storage medium, in the field of computer technology. The method includes: in response to a connection request, projecting a picture displayed by a first terminal to a second terminal according to attribute information of the second terminal to form a target picture, so that the target picture is displayed by the second terminal; and if a touch event acting on the target picture is detected, controlling the picture displayed on the first terminal in response to the touch event, so that the first terminal and the second terminal interact. The technical solution of the present disclosure improves the efficiency and convenience of controlling the first terminal.

Description

Interaction control method and device, electronic equipment and storage medium
Technical Field
The present disclosure relates to the field of computer technologies, and in particular, to an interaction control method, an interaction control apparatus, an electronic device, and a computer-readable storage medium.
Background
In order to share more information, different terminals can interact with each other through screen projection. In the related art, the content of a mobile terminal is generally projected to other terminals through a software application. In this manner, the interactive operations are limited and relatively single, and convenience is poor.
It is to be noted that the information disclosed in the above background section is only for enhancement of understanding of the background of the present disclosure, and thus may include information that does not constitute prior art known to those of ordinary skill in the art.
Disclosure of Invention
An object of the present disclosure is to provide an interaction control method and apparatus, an electronic device, and a computer-readable storage medium, which overcome, at least to some extent, the limited interaction caused by the limitations and disadvantages of the related art.
Additional features and advantages of the disclosure will be set forth in the detailed description which follows, or in part will be obvious from the description, or may be learned by practice of the disclosure.
According to an aspect of the present disclosure, there is provided an interaction control method including: in response to a connection request, projecting a picture displayed by a first terminal to a second terminal according to attribute information of the second terminal to form a target picture, so that the target picture is displayed by the second terminal; and if a touch event acting on the target picture is detected, controlling the picture displayed on the first terminal in response to the touch event, so that the first terminal and the second terminal interact.
According to an aspect of the present disclosure, there is provided an interaction control apparatus including: a screen projection module configured to, in response to a connection request, project the picture displayed by a first terminal to a second terminal according to attribute information of the second terminal to form a target picture, and display the target picture through the second terminal; and a terminal interaction module configured to, if a touch event acting on the target picture is detected, control the picture displayed on the first terminal in response to the touch event, so that the first terminal and the second terminal interact.
According to an aspect of the present disclosure, there is provided a computer-readable storage medium having stored thereon a computer program which, when executed by a processor, implements an interaction control method as recited in any one of the above.
According to an aspect of the present disclosure, there is provided an electronic device including: a processor; and a memory for storing executable instructions of the processor; wherein the processor is configured to perform any one of the above-described interaction control methods via execution of the executable instructions.
In the interaction control method, interaction control apparatus, electronic device, and computer-readable storage medium provided by the embodiments of the present disclosure, in response to a connection request, the picture displayed by a first terminal is displayed on a second terminal according to attribute information of the second terminal, and the first terminal is then controlled and interacted with according to a touch event. The screen can thus be projected in the reverse direction, which avoids the limitations of one-way projection in the related art and the complicated operations it entails, improves projection quality, and makes it more convenient for the user to view the picture on the second terminal. The first terminal can also be controlled and operated quickly through touch events on the second terminal, which improves the convenience of interactive control of the first terminal as well as the efficiency and diversity of the interaction.
It is to be understood that both the foregoing general description and the following detailed description are exemplary and explanatory only and are not restrictive of the disclosure.
Drawings
The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate embodiments consistent with the present disclosure and together with the description, serve to explain the principles of the disclosure. It is to be understood that the drawings in the following description are merely exemplary of the disclosure, and that other drawings may be derived from those drawings by one of ordinary skill in the art without the exercise of inventive faculty.
Fig. 1 schematically shows a system architecture diagram for implementing an interaction control method according to an embodiment of the present disclosure.
Fig. 2 schematically illustrates a schematic diagram of an interaction control method in an embodiment of the present disclosure.
Fig. 3 schematically illustrates a schematic diagram for controlling a first terminal in an embodiment of the present disclosure.
Fig. 4 schematically illustrates another schematic diagram for controlling the first terminal in the embodiment of the present disclosure.
Fig. 5 schematically illustrates a block diagram of an interactive control device in an embodiment of the present disclosure.
Fig. 6 schematically illustrates a block diagram of an electronic device in an embodiment of the disclosure.
Detailed Description
Example embodiments will now be described more fully with reference to the accompanying drawings. Example embodiments may, however, be embodied in many different forms and should not be construed as limited to the examples set forth herein; rather, these embodiments are provided so that this disclosure will be thorough and complete, and will fully convey the concept of example embodiments to those skilled in the art. The described features, structures, or characteristics may be combined in any suitable manner in one or more embodiments.
In the following description, numerous specific details are provided to give a thorough understanding of embodiments of the disclosure. One skilled in the relevant art will recognize, however, that the subject matter of the present disclosure can be practiced without one or more of the specific details, or with other methods, components, devices, steps, and the like.
Further, the drawings are merely schematic illustrations of the present disclosure, in which the same reference numerals denote the same or similar parts, so repeated descriptions are omitted. Some of the block diagrams shown in the figures are functional entities and do not necessarily correspond to physically or logically separate entities. These functional entities may be implemented in software, in one or more hardware modules or integrated circuits, or in different networks and/or processor devices and/or microcontroller devices.
In the present exemplary embodiment, a system architecture for performing the interaction control method is first provided. Referring to fig. 1, a system architecture 100 may include a first end 101, a network 102, and a second end 103. The first end 101 may be a first terminal, specifically a terminal having a display screen, for example a television, a projector, a computer, or a smart speaker with a display screen. The second end 103 may be a second terminal, i.e., a client, and may be, for example, a handheld device such as a smartphone, a tablet computer, or a bracelet with an image display function. The network 102 serves as the medium providing a communication link between the first end 101 and the second end 103 and may include various connection types. In the embodiment of the present disclosure, the network 102 between the first end 101 and the second end 103 may be a wired communication link, such as one provided by a serial connection line, or a wireless communication link, such as one provided by a wireless network.
It should be understood that the numbers of first ends, networks, and second ends in fig. 1 are merely illustrative. There may be any number of first ends, networks, and second ends, as required by the implementation.
It should be noted that the interaction control method provided in the embodiment of the present disclosure may be executed entirely by the second end or the first end, or jointly by the first end and the second end; the execution subject of the interaction control method is not particularly limited here. Accordingly, the interaction control apparatus may be disposed in the second end 103 or in the first end 101.
Based on the system architecture, the embodiment of the disclosure provides an interaction control method. Fig. 2 schematically shows a flowchart of the interaction control method, which can be applied to any application scenario in which screen projection control is performed on different terminals. Referring to fig. 2, the interactive control method at least includes steps S210 to S220, which are described in detail as follows:
in step S210, in response to the connection request, the picture displayed by the first terminal is projected to the second terminal according to the attribute information of the second terminal to form a target picture, so that the target picture is displayed by the second terminal.
In the embodiments of the present disclosure, the connection request may be used to control whether the first terminal is mirrored to the second terminal. The roles of the first terminal and the second terminal may be interchanged; in the embodiment of the present disclosure, the first terminal is taken as the screen-projecting terminal and the second terminal as the projected-to terminal. The first terminal may be a terminal with a fixed position, such as a television or a computer, or a terminal with a display function such as a projector, and it should support a wireless projection technology. The wireless projection technology may be WFD (Wi-Fi Display), i.e., Miracast, or any other technology that can implement a wireless projection function. In addition, an application program for issuing the wireless casting request can be installed on the first terminal. The second terminal may be a terminal with a variable position, i.e., one whose position can change at any time, such as a smartphone or a tablet computer; the second terminal likewise needs to support the wireless projection technology. The first terminal and the second terminal may be in the same network, and their locations may be the same or different. There may be at least one first terminal, and one or more second terminals.
The connection request may be triggered by a user operation. The user operation may be a trigger operation on the application program on the first terminal that implements the wireless casting request; the trigger operation may be performed by voice, by touch, or by clicking an external device of the first terminal, which is not limited here. If the first terminal and the second terminal are in the same network and both support the wireless projection technology, and a connection request for connecting the two is detected, a communication connection can be established between them. The first terminal can be provided with an HID module, a Wi-Fi module, an H264 encoder, a network protocol stack, and other modules; the second terminal can be configured with a Wi-Fi module, an H264 decoder, a network protocol stack, and other modules, so that the two terminals can be connected communicatively. The process of establishing the communication connection may proceed as follows: the three-stage connection completes successfully, and both devices obtain an IP address.
The first terminal then sends a selection request to the second terminal, and the second terminal returns a selection response; the second terminal sends a selection request to the first terminal, and the first terminal returns a selection response; the first terminal sends another selection request to the second terminal, and the second terminal returns a confirmation response; finally, the first terminal sends a parameter-selection request to the second terminal, the second terminal returns a confirmation response, and with the exchange complete, the connection between the first terminal and the second terminal is established.
After the first terminal and the second terminal establish a communication connection, the picture displayed on the first terminal can be projected to the second terminal for display. The picture displayed on the first terminal may be a static desktop, a streaming media file picture, an image, a game scene picture, or another type of picture, and the displayed picture may contain objects. When the displayed picture is a static desktop, the objects may be application program identifiers, documents, and the like; when it is a streaming media file picture, the object may be the image of the streaming media file being played; when it is a game scene picture, the object may be the game scene being displayed; and when it is an image, the object may be the image itself.
Screen projection refers to projecting a picture (such as a streaming media file) displayed on one device onto another device for playing, so that the content can be viewed conveniently on the other device. In the embodiment of the present disclosure, unlike the related art, the picture of the first terminal may be projected to the second terminal for display, for example a picture displayed on a television is projected to a smartphone for display.
The attribute information of the second terminal may be parameters of the second terminal itself, and specifically may include data processing methods and hardware parameters. In the embodiment of the present disclosure, the attribute information may include hardware parameters such as the code rate, the resolution, and the size, as well as data processing methods such as the encoding mode and the decoding mode. Target information of the target picture can be obtained from the attribute information; the target information may include any one or more of display parameters, an encoding mode, and a decoding mode. When acquiring the target information, the parameters used by the first terminal may be set, according to the attribute information acquired from the second terminal, to values supported by both terminals, so that the two remain consistent.
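The consistency-keeping step above can be sketched as a simple parameter negotiation. This is an illustrative sketch only; the function name negotiate() and the dictionary representation are assumptions, not identifiers from the patent:

```python
def negotiate(first_supported, second_attrs):
    """Keep only those attribute values reported by the second terminal
    (e.g. codec, resolution) that the first terminal also supports, so
    that both sides end up with consistent parameters.

    first_supported: mapping from attribute name to the set of values
                     the first terminal supports.
    second_attrs:    mapping from attribute name to the value the
                     second terminal reports."""
    return {key: value
            for key, value in second_attrs.items()
            if value in first_supported.get(key, ())}
```

For example, if the first terminal supports H264 and H265 while the second terminal reports H264, the negotiated codec is H264; attributes the first terminal does not support at all are dropped.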
First, the attribute information is described taking hardware parameters as an example. The screen size, code rate, and resolution of the second terminal can be obtained. The size may correspond to the resolution, and the screen aspect ratio of the second terminal may be 16:9 or 18:9. When the aspect ratio is 16:9, the resolution may be any one of 1080 × 1920, 1440 × 2560, 1280 × 720, and 4096 × 2160; when the aspect ratio is 18:9, the corresponding resolution may be 1080 × 2160, 1440 × 2880, 1440 × 720, or a similar value. The code rate refers to the number of data bits transmitted per unit time; the higher the code rate, the lower the distortion of the image and the clearer the picture. In general, the code rate may be represented by a range of values and may be fixed. For example, suppose the code rate of the second terminal is between 1 kbps and 10 kbps, and the code rate of the first terminal is between 5 kbps and 12 kbps. Because the code rates of different terminals are represented by intervals, the overlapping interval of the first terminal and the second terminal can be used as the screen-projection code rate, and the picture is then projected according to it. In this example, the screen-projection code rate would be 5 kbps to 10 kbps. By selecting a suitable code rate, image distortion can be reduced and the quality of the projected target picture improved.
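The interval-overlap rule described above can be written down directly. The function name and the (low, high) tuple representation are illustrative assumptions:

```python
def overlap_bitrate_range(first, second):
    """Return the overlapping (low, high) code-rate interval of two
    terminals, or None if their supported intervals do not overlap.

    first and second are (low, high) tuples, e.g. in kbps."""
    low = max(first[0], second[0])
    high = min(first[1], second[1])
    return (low, high) if low <= high else None
```

With the example intervals above, overlap_bitrate_range((5, 12), (1, 10)) yields the screen-projection code rate interval (5, 10).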
If the sizes of the first terminal and the second terminal differ, for example the first terminal is larger than the second terminal, the size is converted directly to that of the second terminal. Correspondingly, if the resolution of the first terminal is greater than that of the second terminal, the resolution of the target picture is determined with the resolution of the second terminal as the reference. That is, the resolution of the target picture is set to the smallest among the participating terminals. Alternatively, logic that combines the resolutions of different terminals may be used to determine the resolution of the target picture. Once the processed display parameters, consisting of the code rate and the resolution, are determined, the target picture projected to the second terminal can be determined according to them. This avoids inaccurate projection caused by differing or unsuitable display parameters on different terminals, improves the quality of the projected target picture and the user experience, and also improves playback quality. In the embodiment of the present disclosure, after the display parameters of the second terminal are obtained, if the first terminal also supports them, the display parameters can be agreed upon directly during the interaction stage according to those of the second terminal, so that the first terminal mirrors its picture to the second terminal completely and can then be controlled back through the target picture displayed on the second terminal.
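The take-the-smallest rule for the target-picture resolution can be sketched as follows, comparing candidate resolutions by total pixel count. The function name and tuple layout are illustrative:

```python
def target_resolution(first_res, second_res):
    """Pick the target-picture resolution as the smallest among the
    participating terminals, where each resolution is a
    (width, height) tuple and 'smallest' means fewest total pixels."""
    return min(first_res, second_res, key=lambda wh: wh[0] * wh[1])
```

For instance, projecting from a 4096 × 2160 first terminal to a 1080 × 1920 second terminal selects the second terminal's resolution as the reference.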
Next, the attribute information is described taking a data processing method as an example. The encoding mode of the second terminal may be obtained; it may differ according to the type of the object being encoded. For example, when the encoding object is a streaming media file of a video type, the encoding method may be any one of, or a combination of, H263, H264, H265, and 3D video. If the streaming media file is of an audio type, the encoding method may be any one or a combination of AAC, AC3, LPCM, and other methods. In the embodiment of the present disclosure, H264 is used as the example encoding method.
After the encoding mode of the second terminal is obtained, the streaming media file corresponding to the picture displayed by the first terminal may be encoded with H264 to obtain encoded data. Specifically, the H264 encoding flow may include the following steps: grouping, i.e., grouping a number of image frames into a group (a sequence); defining frames, i.e., classifying each frame in a group as one of three types: I frame, B frame, or P frame; predicting frames, i.e., predicting P frames from I frames used as base frames, and predicting B frames from both I frames and P frames; and data transmission, i.e., finally storing and transmitting the I frame data together with the predicted difference information. An I frame is an intra-coded frame: it is a key frame that preserves a complete picture. A B frame is a bidirectional predictive interpolation coding frame: a bidirectional difference frame that records the differences between the current frame and both the previous and the next frame. A P frame is a forward predictive coded frame: it records the difference between this frame and the preceding key frame (or P frame). The streaming media file corresponding to the picture displayed by the first terminal may support multiple encoding methods, including the encoding method of the second terminal.
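The grouping-and-prediction idea behind this flow can be illustrated with a toy difference coder. This is only a didactic sketch: real H264 uses block-based, motion-compensated prediction, and B frames (which reference both neighbors) are omitted here:

```python
def encode_group(frames):
    """Toy sketch of grouping and prediction: the first frame of a
    group is kept whole (an 'I' frame); every later frame is stored as
    the element-wise difference from the previous frame (a simplified
    'P' frame).  Each frame is a flat list of pixel values."""
    encoded = [("I", list(frames[0]))]
    for prev, cur in zip(frames, frames[1:]):
        diff = [c - p for p, c in zip(prev, cur)]
        encoded.append(("P", diff))
    return encoded


def decode_group(encoded):
    """Invert encode_group by accumulating the stored differences
    on top of the reconstructed previous frame."""
    frames = [list(encoded[0][1])]
    for _, diff in encoded[1:]:
        frames.append([p + d for p, d in zip(frames[-1], diff)])
    return frames
```

Decoding the encoded group reproduces the original frames exactly, mirroring how the second terminal reconstructs the target picture from the transmitted I frame data and difference information.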
After the encoded data is obtained, the second terminal may decode it according to a decoding mode, for example H264 decoding, and form the target picture from the decoded data, so that the picture displayed by the first terminal is projected to the second terminal for display. In the embodiment of the disclosure, the picture of the first terminal is encoded and decoded by selecting, from the encoding modes the first terminal supports, one that matches the encoding mode of the second terminal. This improves encoding and decoding efficiency and quality, greatly reduces the difficulty of connecting the devices and transmitting data between them, and improves data transmission performance.
When the picture displayed by the first terminal is projected to the second terminal to form the target picture, the target picture can be determined according to the type of the picture displayed by the first terminal. If the type of the first terminal's picture belongs to a preset type, the picture can be mirrored directly as the target picture. The preset type may be a type that does not require real-time operation; on this basis, it may be any one of a video stream type, a desktop type, and an image type. Mirroring means that the data on one terminal's disk has an identical copy on the other terminal's disk; that is, the target picture is identical to the picture of the first terminal. For example, if the picture displayed on the first terminal is image 1, the target picture is image 1. If the type of the picture displayed by the first terminal does not belong to the preset type, an associated picture of the picture displayed on the first terminal can be used as the target picture displayed on the second terminal. An associated picture is a picture that provides assistance for the displayed picture. For example, if the picture on the first terminal is a game scene picture, it does not belong to the preset type, so the target picture may be an associated picture of the game scene, for example a map picture or an item picture. In the embodiment of the present disclosure, the target picture is taken to be the same as the picture displayed on the first terminal.
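The type-based choice of target picture can be sketched as follows. The type names and the function are illustrative placeholders, not identifiers from the patent:

```python
# Types that need no real-time operation and can be mirrored directly
# (illustrative names for the "preset types" described above).
PRESET_TYPES = {"video_stream", "desktop", "image"}


def pick_target_picture(picture_type, picture, associated_picture=None):
    """Mirror the displayed picture when its type belongs to the preset
    types; otherwise fall back to the associated (auxiliary) picture,
    e.g. a map or item panel for a game scene."""
    if picture_type in PRESET_TYPES:
        return picture            # exact mirror of the first terminal
    return associated_picture     # auxiliary picture instead
```

So an image is mirrored as-is, while a game scene yields its associated map or item picture as the target picture.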
When there is one first terminal, the picture displayed on it can be projected directly to the one or more second terminals that have successfully established a communication connection with it, so that the first terminal's picture is displayed on those second terminals. If one second terminal needs to be selected from several, it can be determined from the signal strength and battery information of the candidates; for example, the second terminal with the greatest signal strength and the most remaining battery is chosen. If there are multiple first terminals, the first terminal to use can be selected by signal strength and distance, and its picture is then projected to the second terminal; specifically, the first terminal with the greatest signal strength and the closest distance may be selected. For example, suppose mobile phone A and television B are communicatively connected and in the same network, and the user triggers, by voice, an application installed on television B; image 1 displayed on television B can then be projected to mobile phone A, so that image 1 is displayed on the phone. By projecting the first terminal to the second terminal, a user who for some reason cannot conveniently watch the first terminal can still check its picture on the mobile phone, which provides convenience for the user.
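Selecting one second terminal among several candidates by signal strength, with remaining battery as the tie-breaker, might look like this. The tuple layout and function name are assumptions for illustration:

```python
def pick_second_terminal(candidates):
    """Pick the projected-to terminal from several candidates,
    preferring the strongest signal and breaking ties by remaining
    battery.  Each candidate is (name, signal_strength, battery)."""
    best = max(candidates, key=lambda c: (c[1], c[2]))
    return best[0]
```

The same max-by-key pattern applies to choosing among multiple first terminals, with distance (negated) substituted for battery.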
It should be added that if the reference information of the second terminal does not satisfy the screen-projection condition, projecting the picture displayed on the first terminal to the second terminal may be stopped. The reference information may be information indicating the state of the second terminal, such as its remaining battery and remaining storage space. The screen-projection condition may be defined in terms of the remaining battery or the remaining storage space: a first threshold may be configured for the remaining battery and a second threshold for the remaining storage space, and both thresholds may be set according to actual requirements, which are not limited here. For example, the first threshold may be 10% remaining battery, and the second threshold may also be 10% remaining storage. If the remaining battery of the second terminal is below the first threshold, the screen-projection condition is not met; if the remaining storage space of the second terminal is below the second threshold, the screen-projection condition is not met; and if both are below their thresholds, the condition is likewise not met. When the reference information of the second terminal does not meet the screen-projection condition, projecting the first terminal's picture to the second terminal is stopped. In this case, the first terminal and the second terminal may be restored to their state before projection began, for example operating independently of each other, or the picture of the second terminal may be projected to the first terminal, which is not limited here.
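The two-threshold screen-projection condition can be expressed directly. The 10% values follow the example above; the names are illustrative:

```python
BATTERY_THRESHOLD = 0.10   # first threshold: 10% remaining battery
STORAGE_THRESHOLD = 0.10   # second threshold: 10% remaining storage


def meets_projection_condition(remaining_battery, remaining_storage):
    """The screen-projection condition fails if either the remaining
    battery or the remaining storage drops below its threshold;
    projection continues only while both are at or above them."""
    return (remaining_battery >= BATTERY_THRESHOLD
            and remaining_storage >= STORAGE_THRESHOLD)
```

When this returns False, projection is stopped and the terminals revert to their pre-projection state as described above.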
When screen projection from the first terminal to the second terminal is stopped because the reference information does not satisfy the screen projection condition, the second terminal may also be switched to a power-saving mode to reduce its resource consumption.
Continuing to refer to fig. 2, in step S220, if a touch event acting on the target picture is detected, the picture displayed on the first terminal is controlled in response to the touch event, so that the first terminal and the second terminal interact with each other.
In the embodiment of the disclosure, after the picture of the first terminal has been projected to the second terminal to form the target picture, it can be detected whether a touch event acting on the target picture is received on the second terminal. A touch event here is an event in which the target picture is operated by a touch device or a finger, and may specifically be one or more of a click event, a slide event, or a press event acting on the target picture; different types of touch events may correspond to different functions. If a touch event acting on the target picture is detected, the touch event can be mapped back to the first terminal for back control, so that the picture displayed on the first terminal is controlled accordingly. For example, if a slide event acting on target picture 1 displayed on the second terminal is detected, it may be mapped to picture 1 on the first terminal, so that picture 1 on the first terminal is controlled by the slide event, for example by adding or deleting content.
The specific step of controlling the picture displayed on the first terminal in response to the touch event includes: controlling the picture corresponding to the position on which the touch event acts. Specifically, the position acted on by the touch event, for example position A, may be determined from the touch point of the touch event or from the swiped track. Further, the position may be compared with the picture regions to obtain the picture corresponding to that position and determine the content displayed there; the content may be, for example, an application program, a streaming media file, or an image. For example, the content of the picture corresponding to position A may be application 1, so application 1 on the first terminal can be operated based on the touch event acting on the second terminal. In other words, the second terminal is used to control the first terminal, which improves interaction diversity and user experience, realizes control of a terminal in a fixed position through a terminal whose position can change, and makes it convenient for the user to control the first terminal. When screen projection from the first terminal to the second terminal is stopped, control of the first terminal through the second terminal can also be stopped, ending the interaction between the two terminals and avoiding misoperation. Back-controlling the first terminal through the second terminal adds an interaction mode for the first terminal, improves interaction diversity, avoids the limitations of controlling the first terminal in the related art, and enlarges the application range.
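Mapping the touch position back to a picture region on the first terminal amounts to a coordinate rescale followed by a hit test. The following sketch assumes each piece of content occupies an axis-aligned rectangle; the function and region names are illustrative, not taken from the disclosure.

```python
def map_to_source(x, y, target_size, source_size):
    """Rescale a touch point from the target picture's resolution
    to the source (first terminal) picture's resolution."""
    tw, th = target_size
    sw, sh = source_size
    return x * sw / tw, y * sh / th

def hit_test(regions, x, y):
    """regions: content name -> (left, top, right, bottom) on the source picture.
    Return the content whose region contains the point, or None."""
    for name, (left, top, right, bottom) in regions.items():
        if left <= x < right and top <= y < bottom:
            return name
    return None
```

A touch at (640, 360) on a 1280x720 target picture maps to (960, 540) on a 1920x1080 source picture, and the hit test then names the application displayed there.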
Fig. 3 schematically shows a flow chart for controlling the first terminal. Referring to fig. 3, the flow mainly involves the following parts: a first terminal 310, a second terminal 320, an encoded data packet 330, and a decoded data packet 340. The first terminal acts as the video source device: it acquires the picture data it is displaying and sends that picture data to the second terminal. The second terminal acts as the playback device and displays the acquired picture data on its user interface. On the second terminal, a touch event (e.g. a click event) may be captured together with its associated data, such as the position, force, track, and duration of the touch event. For convenience of processing, the data associated with the touch event can be encoded into a data packet, which ensures the security and convenience of data transmission. The data packet may then be sent over a TCP (Transmission Control Protocol) connection and decoded to recover the data associated with the touch event, which is passed to the first terminal so that the click event is delivered to the user interface of the first terminal. Both encoding and decoding can be performed according to a communication protocol, for example the UIBC (User Input Back Channel) protocol. UIBC defines how control signals of the second terminal are sent back to the first terminal, covering device events such as touch, mouse, keyboard, and joystick input.
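The encode/send/decode step can be illustrated with a length-prefixed JSON packet. This framing is only a stand-in for illustration; it does not reproduce the actual UIBC wire format.

```python
import json
import struct

def encode_touch_event(event):
    """Pack the data associated with a touch event (position, force,
    track, duration, ...) into a length-prefixed packet for TCP transmission."""
    payload = json.dumps(event).encode("utf-8")
    return struct.pack(">I", len(payload)) + payload

def decode_touch_event(packet):
    """Recover the touch-event data on the receiving side."""
    (length,) = struct.unpack(">I", packet[:4])
    return json.loads(packet[4:4 + length].decode("utf-8"))
```

The 4-byte big-endian length prefix lets the receiver delimit packets on a TCP stream, which carries no message boundaries of its own.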
Fig. 4 schematically shows another flow chart for controlling the first terminal. Referring to fig. 4, the flow mainly involves the following parts: a first terminal 410, a second terminal 420, an input event 430, an encoded data packet 440, a decoded data packet 450, and a touch event 460. As before, the first terminal acts as the video source device, acquiring the picture data it displays and sending it to the second terminal, while the second terminal acts as the playback device and displays the acquired picture data on its user interface. On the second terminal, a device event may be captured together with its associated data. The device may be a mouse, keyboard, joystick, remote control, or the like attached to the second terminal, and the data associated with the device event may be its position, force, track, duration, and so on. The data associated with the device event is then encoded into a data packet to ensure the security and convenience of data transmission. On this basis, the data packet can be sent over the TCP connection and decoded to recover the data associated with the device event, which is passed to the first terminal so that the device event is delivered to the user interface of the first terminal. For convenience of processing, the device event may be converted into a touch event. Both encoding and decoding can be performed according to a communication protocol, for example the UIBC protocol, which defines how control signals of the second terminal are looped back to the first terminal. When a device event is recognized, it can be accurately fed back to the user interface of the first terminal through an HID (Human Interface Device) module and then delivered to the HID driver.
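The conversion of a device event into a touch event can be sketched as a small mapping table; the event names and fields here are assumptions for illustration, not the UIBC event model.

```python
def device_event_to_touch_event(device_event):
    """Translate an input-device event (mouse, keyboard, joystick,
    remote control) into the touch event the first terminal expects,
    so the source terminal only ever handles one event type."""
    mapping = {
        "mouse_click": "click",
        "mouse_drag": "slide",
        "key_press": "press",
    }
    touch_type = mapping.get(device_event["type"])
    if touch_type is None:
        raise ValueError(f"unsupported device event: {device_event['type']}")
    return {
        "type": touch_type,
        "x": device_event.get("x", 0),
        "y": device_event.get("y", 0),
    }
```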
According to the technical solution in the embodiment of the disclosure, the picture on the first terminal is projected to the second terminal, so that the picture on the first terminal can be back-controlled through touch events acting on the target picture displayed on the second terminal. Remote control of the first terminal is thus realized through back control, which avoids interaction limitations and enlarges the application range. Because the first terminal can be controlled through a second terminal the user already uses frequently, such as a mobile phone, the first terminal can be controlled more precisely and accurately than with traditional control modes, improving convenience and intelligent control. Since the picture displayed by the first terminal can be projected to the second terminal, in application scenarios where the first terminal is far from the user and the second terminal is near, or where the user has only the second terminal at hand, the user can view the picture of the first terminal through the second terminal, which improves viewing and operating efficiency, improves user experience, and provides convenience. A television, for example, can then support richer interaction with the user, freed from the limitations of remote-controller operation, and can be controlled like a mobile phone. In addition, by combining the UIBC technology, a user can use a smart device as a portable host: remote interaction requires only a screen and an input device, without worrying about data synchronization, transmission security, or mobility of the working environment, which increases usability and the application range.
In an embodiment of the present disclosure, an interaction control apparatus is further provided, and as shown in fig. 5, the interaction control apparatus 500 mainly includes the following modules:
the screen projection module 501, configured to respond to a connection request by projecting the picture displayed by the first terminal to the second terminal according to the attribute information of the second terminal to form a target picture, and to display the target picture through the second terminal;
a terminal interaction module 502, configured to, if a touch event acting on the target picture is detected, control the picture displayed on the first terminal in response to the touch event, so that the first terminal and the second terminal interact with each other.
In an exemplary embodiment of the present disclosure, a screen projection module includes: and the display parameter determining module is used for acquiring the display parameters of the second terminal, processing the display parameters of the picture of the first terminal according to the display parameters and forming the target picture according to the processed display parameters.
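The display-parameter processing performed by this module can be illustrated by aspect-ratio-preserving scaling; a minimal sketch, assuming the only display parameter handled is resolution:

```python
def fit_resolution(source, target):
    """Scale the source picture to fit the target display while preserving
    its aspect ratio (letterboxing); returns the displayed width and height."""
    sw, sh = source
    tw, th = target
    scale = min(tw / sw, th / sh)
    return round(sw * scale), round(sh * scale)
```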
In an exemplary embodiment of the present disclosure, a screen projection module includes: the coding module is used for coding the picture of the first terminal according to the coding mode of the second terminal to form coded data; and the decoding module is used for decoding the coded data according to the decoding mode of the second terminal so as to form the target picture.
In an exemplary embodiment of the present disclosure, a screen projection module includes: the first determining module is used for mirroring the picture as the target picture if the type of the picture belongs to a preset type; and the second determining module is used for taking the related picture of the picture as the target picture if the type of the picture does not belong to the preset type.
In an exemplary embodiment of the present disclosure, the terminal interaction module includes: the data packet acquisition module is used for acquiring a touch event acting on the second terminal and coding data related to the touch event to form a data packet; and the event back control module is used for decoding the data packet, sending the data related to the touch event obtained by decoding to a first terminal, and controlling the picture of the first terminal through the second terminal based on the data related to the touch event.
In an exemplary embodiment of the present disclosure, the terminal interaction module includes: the device event acquisition module is used for acquiring a device event acting on the second terminal through input equipment; the data packing module is used for packing the data related to the equipment event to form a data package; and the event back control module is used for decoding the data packet, converting the equipment event into a touch event according to the decoded data packet related to the equipment event, and controlling the picture of the first terminal through the second terminal based on the data related to the touch event.
In an exemplary embodiment of the present disclosure, the event back control module includes: a picture control module, configured to control, according to the position of the touch event, the picture displayed on the first terminal corresponding to that position.
It should be noted that the details of each module of the interaction control apparatus have already been described in detail in the corresponding interaction control method and are therefore not repeated here.
It should be noted that although several modules or units of the device for action execution are mentioned in the detailed description above, such a division is not mandatory. Indeed, according to embodiments of the present disclosure, the features and functionality of two or more modules or units described above may be embodied in one module or unit. Conversely, the features and functions of one module or unit described above may be further divided into and embodied by a plurality of modules or units.
Moreover, although the steps of the methods of the present disclosure are depicted in the drawings in a particular order, this does not require or imply that the steps must be performed in this particular order, or that all of the depicted steps must be performed, to achieve desirable results. Additionally or alternatively, certain steps may be omitted, multiple steps combined into one step execution, and/or one step broken down into multiple step executions, etc.
In the embodiment of the disclosure, an electronic device capable of implementing the method is also provided.
As will be appreciated by one skilled in the art, aspects of the present disclosure may be embodied as a system, method or program product. Accordingly, various aspects of the present disclosure may be embodied in the following forms: an entirely hardware embodiment, an entirely software embodiment (including firmware, microcode, etc.), or an embodiment combining hardware and software aspects, which may all generally be referred to herein as a "circuit", "module" or "system".
An electronic device 600 according to this embodiment of the disclosure is described below with reference to fig. 6. The electronic device 600 shown in fig. 6 is only an example and should not bring any limitations to the function and scope of use of the embodiments of the present disclosure.
As shown in fig. 6, the electronic device 600 is embodied in the form of a general purpose computing device. The components of the electronic device 600 may include, but are not limited to: the at least one processing unit 610, the at least one memory unit 620, a bus 630 connecting different system components (including the memory unit 620 and the processing unit 610), and a display unit 640.
Wherein the storage unit stores program code that is executable by the processing unit 610 to cause the processing unit 610 to perform steps according to various exemplary embodiments of the present disclosure as described in the above section "exemplary methods" of this specification. For example, the processing unit 610 may perform the steps as shown in fig. 2.
The storage unit 620 may include readable media in the form of volatile memory units, such as a random access memory unit (RAM)6201 and/or a cache memory unit 6202, and may further include a read-only memory unit (ROM) 6203.
The memory unit 620 may also include a program/utility 6204 having a set (at least one) of program modules 6205, such program modules 6205 including, but not limited to: an operating system, one or more application programs, other program modules, and program data, each of which, or some combination thereof, may comprise an implementation of a network environment.
Bus 630 may be one or more of several types of bus structures, including a memory unit bus or memory unit controller, a peripheral bus, a graphics acceleration interface, a processing unit, or a local bus using any of a variety of bus architectures.
The electronic device 600 may also communicate with one or more external devices 700 (e.g., keyboard, pointing device, bluetooth device, etc.), with one or more devices that enable a user to interact with the electronic device 600, and/or with any devices (e.g., router, modem, etc.) that enable the electronic device 600 to communicate with one or more other computing devices. Such communication may occur via an input/output (I/O) interface 650. Also, the electronic device 600 may communicate with one or more networks (e.g., a Local Area Network (LAN), a Wide Area Network (WAN), and/or a public network such as the Internet) via the network adapter 660. As shown, the network adapter 660 communicates with the other modules of the electronic device 600 over the bus 630. It should be appreciated that although not shown in the figures, other hardware and/or software modules may be used in conjunction with the electronic device 600, including but not limited to: microcode, device drivers, redundant processing units, external disk drive arrays, RAID systems, tape drives, and data backup storage systems, among others.
In an embodiment of the present disclosure, a computer-readable storage medium is further provided, on which a program product capable of implementing the above-mentioned method of the present specification is stored. In some possible embodiments, various aspects of the disclosure may also be implemented in the form of a program product comprising program code for causing a terminal device to perform the steps according to various exemplary embodiments of the disclosure described in the "exemplary methods" section above of this specification, when the program product is run on the terminal device.
A program product for implementing the above method according to the embodiments of the present disclosure may employ a portable compact disc read-only memory (CD-ROM), include program code, and be run on a terminal device such as a personal computer. However, the program product of the present disclosure is not limited thereto; in this document, a readable storage medium may be any tangible medium that can contain or store a program for use by or in connection with an instruction execution system, apparatus, or device.
The program product may employ any combination of one or more readable media. The readable medium may be a readable signal medium or a readable storage medium. A readable storage medium may be, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any combination of the foregoing. More specific examples (a non-exhaustive list) of the readable storage medium include: an electrical connection having one or more wires, a portable disk, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing.
A computer readable signal medium may include a propagated data signal with readable program code embodied therein, for example, in baseband or as part of a carrier wave. Such a propagated data signal may take many forms, including, but not limited to, electro-magnetic, optical, or any suitable combination thereof. A readable signal medium may also be any readable medium that is not a readable storage medium and that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device.
Program code embodied on a readable medium may be transmitted using any appropriate medium, including but not limited to wireless, wireline, optical fiber cable, RF, etc., or any suitable combination of the foregoing.
Program code for carrying out operations of the present disclosure may be written in any combination of one or more programming languages, including object-oriented programming languages such as Java and C++ as well as conventional procedural programming languages such as the "C" programming language or similar languages. The program code may execute entirely on the user's computing device, partly on the user's device, as a stand-alone software package, partly on the user's computing device and partly on a remote computing device, or entirely on the remote computing device or server. In the case of a remote computing device, the remote computing device may be connected to the user's computing device through any kind of network, including a local area network (LAN) or a wide area network (WAN), or may be connected to an external computing device (for example, through the Internet using an Internet service provider).
Furthermore, the above-described figures are merely schematic illustrations of processes included in methods according to embodiments of the present disclosure, and are not intended to be limiting. It will be readily understood that the processes shown in the above figures are not intended to indicate or limit the chronological order of the processes. In addition, it is also readily understood that these processes may be performed synchronously or asynchronously, e.g., in multiple modules.
Other embodiments of the disclosure will be apparent to those skilled in the art from consideration of the specification and practice of the disclosure disclosed herein. This application is intended to cover any variations, uses, or adaptations of the disclosure following, in general, the principles of the disclosure and including such departures from the present disclosure as come within known or customary practice within the art to which the disclosure pertains. It is intended that the specification and examples be considered as exemplary only, with a true scope and spirit of the disclosure being indicated by the following claims.

Claims (10)

1. An interaction control method, comprising:
in response to a connection request, a picture displayed by a first terminal is projected to a second terminal to form a target picture according to attribute information of the second terminal, so that the target picture is displayed through the second terminal;
and if a touch event acting on the target picture is detected, responding to the touch event to control the picture displayed on the first terminal, so as to carry out interaction between the first terminal and the second terminal.
2. The interaction control method according to claim 1, wherein the projecting the screen displayed by the first terminal to the second terminal to form the target screen according to the attribute information of the second terminal comprises:
and acquiring the display parameters of the second terminal, processing the display parameters of the picture of the first terminal according to the display parameters, and forming the target picture according to the processed display parameters.
3. The interaction control method according to claim 1, wherein the projecting the screen displayed by the first terminal to the second terminal to form the target screen according to the attribute information of the second terminal comprises:
coding the picture of the first terminal according to the coding mode of the second terminal to form coded data;
and decoding the coded data according to the decoding mode of the second terminal to form the target picture.
4. The interaction control method according to any one of claims 1 to 3, wherein the method further comprises:
if the type of the picture belongs to a preset type, mirroring the picture as the target picture;
and if the type of the picture does not belong to the preset type, taking the associated picture of the picture as the target picture.
5. The interaction control method according to claim 1, wherein if a touch event acting on the target screen is detected, controlling the screen displayed on the first terminal in response to the touch event comprises:
acquiring a touch event acting on the second terminal, and encoding data associated with the touch event to form a data packet;
and decoding the data packet, sending the data related to the touch event obtained by decoding to a first terminal, and controlling the picture of the first terminal through the second terminal based on the data related to the touch event.
6. The interaction control method according to claim 1, wherein if a touch event acting on the target screen is detected, controlling the screen displayed in the first terminal in response to the touch event comprises:
acquiring a device event acting on the second terminal through an input device;
packaging the data related to the equipment event to form a data packet;
and decoding the data packet, converting the device event into a touch event according to the decoded data packet associated with the device event, and controlling the picture of the first terminal through the second terminal based on the data associated with the touch event.
7. The interaction control method according to claim 5 or 6, wherein controlling, by the second terminal, the screen of the first terminal based on the data associated with the touch event comprises:
and controlling a picture corresponding to the position displayed in the first terminal according to the position of the touch event.
8. An interactive control apparatus, comprising:
the screen projection module is used for responding to the connection request, projecting the screen displayed by the first terminal to the second terminal according to the attribute information of the second terminal to form a target screen, and displaying the target screen through the second terminal;
and the terminal interaction module is used for, if a touch event acting on the target picture is detected, responding to the touch event to control the picture displayed on the first terminal, so as to carry out interaction between the first terminal and the second terminal.
9. A computer-readable storage medium, on which a computer program is stored, which, when being executed by a processor, carries out the interaction control method according to any one of claims 1 to 7.
10. An electronic device, comprising:
a processor; and
a memory for storing executable instructions of the processor;
wherein the processor is configured to perform the interaction control method of any of claims 1-7 via execution of the executable instructions.
CN202010022596.0A 2020-01-09 2020-01-09 Interaction control method and device, electronic equipment and storage medium Pending CN111221491A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010022596.0A CN111221491A (en) 2020-01-09 2020-01-09 Interaction control method and device, electronic equipment and storage medium

Publications (1)

Publication Number Publication Date
CN111221491A true CN111221491A (en) 2020-06-02

Family

ID=70809742

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010022596.0A Pending CN111221491A (en) 2020-01-09 2020-01-09 Interaction control method and device, electronic equipment and storage medium

Country Status (1)

Country Link
CN (1) CN111221491A (en)

Cited By (13)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112114765A (en) * 2020-09-27 2020-12-22 北京小米移动软件有限公司 Screen projection method and device and storage medium
CN112162717A (en) * 2020-10-10 2021-01-01 深圳市维派创展科技有限公司 Screen expansion method and system and multifunctional display
CN112394895A (en) * 2020-11-16 2021-02-23 Oppo广东移动通信有限公司 Cross-equipment display method and device of picture and electronic equipment
CN112422586A (en) * 2020-12-10 2021-02-26 努比亚技术有限公司 Computer screen and mobile terminal screen expansion method and expansion system
CN112667181A (en) * 2020-12-31 2021-04-16 努比亚技术有限公司 Screen projection method, screen projection equipment, screen projection control system and storage medium
CN112684993A (en) * 2020-12-23 2021-04-20 北京小米移动软件有限公司 Display method, device and medium based on cross-screen cooperation
CN112905289A (en) * 2021-03-10 2021-06-04 Oppo广东移动通信有限公司 Application picture display method, device, terminal, screen projection system and medium
CN114115629A (en) * 2020-08-26 2022-03-01 华为技术有限公司 Interface display method and equipment
CN114339183A (en) * 2021-12-30 2022-04-12 深圳迈瑞动物医疗科技有限公司 Endoscope system and screen projection method thereof
CN114584828A (en) * 2020-11-30 2022-06-03 上海新微技术研发中心有限公司 Android screen projection method, computer-readable storage medium and device
WO2023030099A1 (en) * 2021-09-03 2023-03-09 华为技术有限公司 Cross-device interaction method and apparatus, and screen projection system and terminal
WO2023141857A1 (en) * 2022-01-27 2023-08-03 京东方科技集团股份有限公司 Screen projection method and apparatus, electronic device and computer readable medium
WO2024045985A1 (en) * 2022-08-31 2024-03-07 京东方科技集团股份有限公司 Screen control method, screen control apparatus, electronic device, program, and medium

Citations (16)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20130001063A (en) * 2011-06-24 2013-01-03 에스케이플래닛 주식회사 Terminal device, method for streaming ui, and storage medium thereof
CN103024500A (en) * 2012-12-18 2013-04-03 四川长虹电器股份有限公司 System and method for remotely controlling television terminal
KR20130105004A (en) * 2012-03-16 2013-09-25 주식회사 엔씨소프트 System and method for providing on-line game service by using instant message
CN106371608A (en) * 2016-09-21 2017-02-01 努比亚技术有限公司 Display control method and device for screen projection
CN107145278A (en) * 2016-03-01 2017-09-08 阿里巴巴集团控股有限公司 Apparatus control method, device and mobile terminal based on mobile terminal
WO2017211088A1 (en) * 2016-06-08 2017-12-14 中兴通讯股份有限公司 Multi-screen interaction method and device
CN107483994A (en) * 2017-07-31 2017-12-15 广州指观网络科技有限公司 It is a kind of reversely to throw screen control system and method
WO2019007402A1 (en) * 2017-07-06 2019-01-10 中兴通讯股份有限公司 Video interaction processing method, apparatus and system
CN109819493A (en) * 2019-03-06 2019-05-28 深圳前海达闼云端智能科技有限公司 Control method and throwing screen method and device, storage medium, electronic equipment
CN109976844A (en) * 2019-01-25 2019-07-05 维沃移动通信有限公司 A kind of message display method and mobile terminal
CN110248216A (en) * 2019-06-03 2019-09-17 广东有线广播电视网络有限公司 TV throws screen method, apparatus, TV throws screen system and computer equipment
CN110248226A (en) * 2019-07-16 2019-09-17 广州视源电子科技股份有限公司 Information screen projection method, device, system, storage medium and processor
CN110333836A (en) * 2019-07-05 2019-10-15 网易(杭州)网络有限公司 Throwing screen method, apparatus, storage medium and the electronic device of information
CN110515580A (en) * 2019-09-02 2019-11-29 联想(北京)有限公司 A kind of display control method, device and terminal
CN110602087A (en) * 2019-09-10 2019-12-20 腾讯科技(深圳)有限公司 Intelligent screen projection method and device, intelligent terminal and server
CN110647303A (en) * 2019-08-30 2020-01-03 北京文渊佳科技有限公司 Multimedia playing method, device, storage medium and electronic equipment

Patent Citations (16)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20130001063A (en) * 2011-06-24 2013-01-03 에스케이플래닛 주식회사 Terminal device, method for streaming ui, and storage medium thereof
KR20130105004A (en) * 2012-03-16 2013-09-25 주식회사 엔씨소프트 System and method for providing on-line game service by using instant message
CN103024500A (en) * 2012-12-18 2013-04-03 四川长虹电器股份有限公司 System and method for remotely controlling television terminal
CN107145278A (en) * 2016-03-01 2017-09-08 阿里巴巴集团控股有限公司 Apparatus control method, device and mobile terminal based on mobile terminal
WO2017211088A1 (en) * 2016-06-08 2017-12-14 中兴通讯股份有限公司 Multi-screen interaction method and device
CN106371608A (en) * 2016-09-21 2017-02-01 努比亚技术有限公司 Display control method and device for screen projection
WO2019007402A1 (en) * 2017-07-06 2019-01-10 中兴通讯股份有限公司 Video interaction processing method, apparatus and system
CN107483994A (en) * 2017-07-31 2017-12-15 广州指观网络科技有限公司 It is a kind of reversely to throw screen control system and method
CN109976844A (en) * 2019-01-25 2019-07-05 维沃移动通信有限公司 A kind of message display method and mobile terminal
CN109819493A (en) * 2019-03-06 2019-05-28 深圳前海达闼云端智能科技有限公司 Control method and throwing screen method and device, storage medium, electronic equipment
CN110248216A (en) * 2019-06-03 2019-09-17 广东有线广播电视网络有限公司 TV throws screen method, apparatus, TV throws screen system and computer equipment
CN110333836A (en) * 2019-07-05 2019-10-15 网易(杭州)网络有限公司 Throwing screen method, apparatus, storage medium and the electronic device of information
CN110248226A (en) * 2019-07-16 2019-09-17 广州视源电子科技股份有限公司 Information screen projection method, device, system, storage medium and processor
CN110647303A (en) * 2019-08-30 2020-01-03 北京文渊佳科技有限公司 Multimedia playing method, device, storage medium and electronic equipment
CN110515580A (en) * 2019-09-02 2019-11-29 联想(北京)有限公司 A kind of display control method, device and terminal
CN110602087A (en) * 2019-09-10 2019-12-20 腾讯科技(深圳)有限公司 Intelligent screen projection method and device, intelligent terminal and server

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
高天宇 (Gao Tianyu): "Application of Wireless Multimedia Communication Technology in the Home in the Era of Computer Information Technology", 《信息通信》 (Information & Communications), pages 159 - 160 *

Cited By (16)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114115629A (en) * 2020-08-26 2022-03-01 华为技术有限公司 Interface display method and equipment
CN112114765A (en) * 2020-09-27 2020-12-22 北京小米移动软件有限公司 Screen projection method and device and storage medium
CN112162717A (en) * 2020-10-10 2021-01-01 深圳市维派创展科技有限公司 Screen expansion method and system and multifunctional display
CN112394895A (en) * 2020-11-16 2021-02-23 Oppo广东移动通信有限公司 Cross-equipment display method and device of picture and electronic equipment
CN112394895B (en) * 2020-11-16 2023-10-13 Oppo广东移动通信有限公司 Picture cross-device display method and device and electronic device
WO2022100305A1 (en) * 2020-11-16 2022-05-19 Oppo广东移动通信有限公司 Cross-device picture display method and apparatus, and electronic device
CN114584828B (en) * 2020-11-30 2024-05-17 上海新微技术研发中心有限公司 Android screen-throwing method, computer readable storage medium and equipment
CN114584828A (en) * 2020-11-30 2022-06-03 上海新微技术研发中心有限公司 Android screen projection method, computer-readable storage medium and device
CN112422586A (en) * 2020-12-10 2021-02-26 努比亚技术有限公司 Computer screen and mobile terminal screen expansion method and expansion system
CN112684993A (en) * 2020-12-23 2021-04-20 北京小米移动软件有限公司 Display method, device and medium based on cross-screen cooperation
CN112667181A (en) * 2020-12-31 2021-04-16 努比亚技术有限公司 Screen projection method, screen projection equipment, screen projection control system and storage medium
CN112905289A (en) * 2021-03-10 2021-06-04 Oppo广东移动通信有限公司 Application picture display method, device, terminal, screen projection system and medium
WO2023030099A1 (en) * 2021-09-03 2023-03-09 华为技术有限公司 Cross-device interaction method and apparatus, and screen projection system and terminal
CN114339183A (en) * 2021-12-30 2022-04-12 深圳迈瑞动物医疗科技有限公司 Endoscope system and screen projection method thereof
WO2023141857A1 (en) * 2022-01-27 2023-08-03 京东方科技集团股份有限公司 Screen projection method and apparatus, electronic device and computer readable medium
WO2024045985A1 (en) * 2022-08-31 2024-03-07 京东方科技集团股份有限公司 Screen control method, screen control apparatus, electronic device, program, and medium

Similar Documents

Publication Publication Date Title
CN111221491A (en) Interaction control method and device, electronic equipment and storage medium
US11824913B2 (en) Video stream management for remote graphical user interfaces
US11727079B2 (en) Cooperative web browsing using multiple devices
US10652506B2 (en) High quality multimedia transmission from a mobile device for live and on-demand viewing
US9264478B2 (en) Home cloud with virtualized input and output roaming over network
JP4585479B2 (en) Server apparatus and video distribution method
CN104685873B (en) Encoding controller and coding control method
US8982135B2 (en) Information processing apparatus and image display method
JP6511038B2 (en) Cloud streaming service providing method and apparatus therefor
WO2019164753A1 (en) Efficient streaming video for static video content
KR101942269B1 (en) Apparatus and method for playing back and seeking media in web browser
CN112399257B (en) Cloud desktop video playing method, server, terminal and storage medium
CN113225585A (en) Video definition switching method and device, electronic equipment and storage medium
CN113973224B (en) Media information transmission method, computing device and storage medium
CN115243074A (en) Video stream processing method and device, storage medium and electronic equipment
US20140099039A1 (en) Image processing device, image processing method, and image processing system
CN110798700B (en) Video processing method, video processing device, storage medium and electronic equipment
CN114339415A (en) Client video playing method and device, electronic equipment and readable medium
CN114339315A (en) Program playing method, system, electronic equipment and storage medium
CN116112476A (en) Multimedia playing method, system, equipment and storage medium of cloud desktop
CN116016968A (en) Audio and video data processing method and device, storage medium and electronic equipment

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination