CN111970546A - Method and device for controlling terminal interaction, electronic equipment and storage medium

Info

Publication number
CN111970546A
Authority
CN
China
Prior art keywords
terminal
interface
target
control
multimedia content
Prior art date
Legal status
Pending
Application number
CN202010704511.7A
Other languages
Chinese (zh)
Inventor
郜光耀
Current Assignee
Tencent Technology Shenzhen Co Ltd
Original Assignee
Tencent Technology Shenzhen Co Ltd
Priority date
Filing date
Publication date
Application filed by Tencent Technology Shenzhen Co Ltd
Priority to CN202010704511.7A
Publication of CN111970546A


Classifications

    • H04N21/4122 Peripherals receiving signals from specially adapted client devices: additional display device, e.g. video projector
    • G06F9/451 Execution arrangements for user interfaces
    • H04N21/41407 Specialised client platforms embedded in a portable device, e.g. video client on a mobile phone, PDA, laptop
    • H04N21/436 Interfacing a local distribution network, e.g. communicating with another STB or one or more peripheral devices inside the home
    • H04N21/472 End-user interface for requesting content, additional data or services; End-user interface for interacting with content, e.g. for content reservation or setting reminders, for requesting event notification, for manipulating displayed content


Abstract

The present application relates to the field of communications technologies, and in particular to a method and apparatus for controlling terminal interaction, an electronic device, and a storage medium, with the aim of improving the flexibility of terminal interaction. The method comprises the following steps: a first terminal sends the interface elements of the operation controls arranged on a first operation interface to a second terminal to be interacted with, and the first terminal controls the second terminal to display multimedia content; the first terminal receives an interface operation instruction that the second terminal generates and sends according to a control operation on a target interface element displayed on a second operation interface; and the first terminal executes the corresponding operation on the target operation control corresponding to the target interface element according to the interface operation instruction, and sends the control result to the second terminal so that the second terminal displays the control result. Because the first terminal can send the interface elements of the operation controls to the second terminal and the second terminal can send interface operation instructions to the first terminal, the two terminals interact bidirectionally, which improves the flexibility of terminal interaction.

Description

Method and device for controlling terminal interaction, electronic equipment and storage medium
Technical Field
The present application relates to the field of communications technologies, and in particular, to a method and an apparatus for controlling terminal interaction, an electronic device, and a storage medium.
Background
With the development of the internet and the popularization of terminals, the number of terminal users keeps growing, and users expect software to be more intelligent and user-friendly. Projecting a terminal's screen onto a display device has become a popular feature. In the related art, media content such as videos and pictures on a mobile phone can be played on a smart television through projection; however, this is one-way interaction, and only the media-related functions for videos or pictures are projected, so the flexibility is low.
Disclosure of Invention
The embodiment of the application provides a method and a device for controlling terminal interaction, electronic equipment and a storage medium, which are used for improving the flexibility of terminal interaction.
The first method for controlling terminal interaction provided by the embodiment of the application comprises the following steps:
the method comprises the steps that a first terminal sends interface elements of all operation controls arranged on a first operation interface to a second terminal to be interacted, so that the second terminal displays the interface elements of all the operation controls on a second operation interface, and the first terminal controls the second terminal to display multimedia contents;
the first terminal receives an interface operation instruction sent by the second terminal, wherein the interface operation instruction is generated by the second terminal according to control operation aiming at a target interface element displayed on the second operation interface;
and the first terminal executes corresponding operation on the target operation control corresponding to the target interface element according to the interface operation instruction, and sends a control result to the second terminal so that the second terminal displays the control result.
The second method for controlling terminal interaction provided by the embodiment of the application comprises the following steps:
the method comprises the steps that a second terminal receives interface elements of all operation controls arranged on a first operation interface sent by a first terminal, and displays the interface elements of all the operation controls on a second operation interface, wherein the second terminal displays multimedia content based on the control of the first terminal;
the second terminal generates an interface operation instruction according to control operation aiming at a target interface element displayed on the second operation interface, and sends the interface operation instruction to the first terminal, so that the first terminal executes corresponding operation on a target operation control corresponding to the target interface element according to the interface operation instruction;
and the second terminal receives the control result returned by the first terminal and displays the control result.
The first device for controlling terminal interaction provided by the embodiment of the application comprises:
the first sending unit is used for sending the interface elements of the operation controls arranged on the first operation interface to a second terminal to be interacted so that the second terminal displays the interface elements of the operation controls on the second operation interface, and the first terminal controls the second terminal to display multimedia contents;
the first receiving unit is used for receiving an interface operation instruction sent by the second terminal, wherein the interface operation instruction is generated by the second terminal according to control operation aiming at a target interface element displayed on the second operation interface;
and the control unit is used for executing corresponding operation on the target operation control corresponding to the target interface element according to the interface operation instruction, and sending a control result to the second terminal so that the second terminal displays the control result.
Optionally, the control unit is specifically configured to:
if interface operation instructions sent by a plurality of second terminals are received at the same time and the interface operation instructions sent by each second terminal are the same, executing corresponding operation on a target operation control corresponding to a target interface element according to the interface operation instructions; or
If interface operation instructions sent by a plurality of second terminals are received at the same time and the interface operation instructions sent by at least two second terminals are different, executing corresponding operations on target operation controls corresponding to corresponding target interface elements according to the sequence from high to low of the priority of each second terminal sending the interface operation instructions and the interface operation instructions sent by each second terminal in sequence; or
And if the interface operation instructions sent by the plurality of second terminals are received at the same time and the interface operation instructions sent by at least two second terminals are different, executing corresponding operation on the target operation control corresponding to the corresponding target interface element according to the priority of each second terminal sending the interface operation instruction and the interface operation instruction sent by the second terminal with the highest priority.
The second device for controlling terminal interaction provided by the embodiment of the application comprises:
the second receiving unit is used for receiving the interface elements of the operation controls arranged on the first operation interface and sent by the first terminal, and displaying the interface elements of the operation controls on the second operation interface, wherein the second terminal displays multimedia contents based on the control of the first terminal;
the operation unit is used for generating an interface operation instruction according to the control operation aiming at the target interface element displayed on the second operation interface and sending the interface operation instruction to the first terminal so that the first terminal executes corresponding operation on the target operation control corresponding to the target interface element according to the interface operation instruction;
and the execution unit is used for receiving the control result returned by the first terminal and displaying the control result.
Optionally, the second receiving unit is specifically configured to:
receiving content information of the multimedia content sent by the first terminal, and displaying the multimedia content according to the content information, wherein the multimedia content is the multimedia content currently displayed by the first terminal; or
And receiving address information of the multimedia content sent by the first terminal, and acquiring and displaying the multimedia content according to the address information, wherein the multimedia content is the multimedia content currently displayed by the first terminal.
Optionally, the apparatus further comprises:
and the second updating unit is used for receiving the interface element of the operation control after the interface updating sent by the first terminal and updating and displaying the changed interface element, wherein the interface element of the operation control after the interface updating is sent by the first terminal according to the updating operation responding to the first operation interface.
An electronic device provided in an embodiment of the present application includes a processor and a memory, where the memory stores program codes, and when the program codes are executed by the processor, the processor is caused to execute any one of the above steps of the method for controlling terminal interaction.
An embodiment of the present application provides a computer-readable storage medium including program code; when the program code runs on an electronic device, it causes the electronic device to perform the steps of any one of the above methods for controlling terminal interaction.
Embodiments of the present application provide a computer program product or computer program comprising computer instructions stored in a computer readable storage medium. The processor of the computer device reads the computer instructions from the computer readable storage medium, and the processor executes the computer instructions to make the computer device execute the steps of any one of the above methods for controlling terminal interaction.
The beneficial effects of this application are as follows:
according to the method, the device, the electronic equipment and the storage medium for controlling terminal interaction provided by the embodiment of the application, because the first terminal can send the interface elements of each operation control to the second terminal and the second terminal displays the interface elements, when the first terminal controls the second terminal to display multimedia contents, the second terminal can further respond to the control operation aiming at the target interface elements displayed on the second operation interface, further generate an interface operation instruction to be sent to the first terminal, the interface operation instruction is further controlled by the first terminal, the two-way interaction between the first terminal and the second terminal is realized, based on the above manner, the action is transmitted to the first terminal by the second terminal, the process of executing the action result is simultaneously embodied by the first terminal and the second terminal, and the second terminal can display the interface elements besides the multimedia contents, so that the interaction information between applications is enriched, the flexibility of terminal interaction is improved.
Additional features and advantages of the application will be set forth in the description which follows, and in part will be obvious from the description, or may be learned by the practice of the application. The objectives and other advantages of the application may be realized and attained by the structure particularly pointed out in the written description and claims hereof as well as the appended drawings.
Drawings
The accompanying drawings, which are included to provide a further understanding of the application and are incorporated in and constitute a part of this application, illustrate embodiment(s) of the application and together with the description serve to explain the application and not to limit the application. In the drawings:
FIG. 1A is an alternative schematic diagram of a first projection method in the related art;
FIG. 1B is an alternative schematic diagram of a second projection method of the related art;
FIG. 1C is an alternative schematic diagram of a third projection method of the related art;
fig. 2 is a schematic diagram of an application scenario in an embodiment of the present application;
fig. 3A is a timing diagram illustrating a connection between a source device and a display device according to an embodiment of the present disclosure;
fig. 3B is a timing diagram illustrating another connection between a source device and a display device in the embodiment of the present application;
fig. 4 is a schematic flowchart of a first method for controlling terminal interaction in an embodiment of the present application;
FIG. 5A is a schematic view of a first interface in an embodiment of the present application;
FIG. 5B is a diagram illustrating a first second operation interface according to an embodiment of the present disclosure;
FIG. 5C is a diagram illustrating a second interface of the present application;
FIG. 5D is a diagram illustrating a second exemplary first interface in an embodiment of the present application;
FIG. 5E is a schematic view of a third interface for operations according to an embodiment of the present application;
FIG. 6 is a schematic flow chart illustrating updating a display interface element according to an embodiment of the present application;
FIG. 7A is a schematic view of a projection method in an embodiment of the present application;
FIG. 7B is a schematic diagram of another projection method in the embodiment of the present application;
FIG. 8 is a schematic view of a fourth interface for operation according to an embodiment of the present application;
fig. 9 is a flowchart illustrating a second method for controlling terminal interaction in an embodiment of the present application;
FIG. 10 is a schematic diagram illustrating an alternative interactive implementation timing sequence in the embodiments of the present application;
fig. 11 is a schematic structural diagram of a first apparatus for controlling terminal interaction in an embodiment of the present application;
fig. 12 is a schematic structural diagram of a second apparatus for controlling terminal interaction in an embodiment of the present application;
fig. 13 is a schematic diagram of a hardware component structure of an electronic device to which an embodiment of the present application is applied;
fig. 14 is a schematic diagram of a hardware component structure of a terminal device to which an embodiment of the present application is applied.
Detailed Description
In order to make the objects, technical solutions and advantages of the embodiments of the present application clearer, the technical solutions of the present application will be clearly and completely described below with reference to the drawings in the embodiments of the present application, and it is obvious that the described embodiments are some embodiments, but not all embodiments, of the technical solutions of the present application. All other embodiments obtained by a person skilled in the art without any inventive step based on the embodiments described in the present application are within the scope of the protection of the present application.
Some concepts related to the embodiments of the present application are described below.
Multimedia content: the content information displayed on the application program interface may include, but is not limited to, at least one of picture content, video content, audio content, file content, web page content, and text content, and may also relate to many fields, for example, stock, news, weather, and advertisement, etc.
Operation control: a control used to manage the display of multimedia content. An operation control is shown on an application's operation interface in the form of interface elements. In the embodiments of the present application, interface elements refer to the elements contained in software or a system interface that satisfy the user's interaction requirements. Interface elements are indispensable in software development, human-computer interaction and other areas. A human-machine interface is built from the interface elements supported by the chosen interface support system: in the design stage, the interface elements that satisfy the interaction requirements are selected according to the analysis of human-computer interaction requirements, and it is planned how these elements will form the human-machine interface. Common interface elements in a GUI are dialog boxes, menus, scroll bars, and so on.
Target presentation APP (Application): a multi-screen interaction tool that connects a mobile device (such as a mobile phone or tablet) to a large-screen terminal (such as a smart television, set-top box, projector or VR (Virtual Reality) device). Content from the mobile device can be wirelessly delivered (mirrored) to the large-screen terminal, so that the mobile phone can control the smart television to play games (without changing the operating experience of the game), watch movies, listen to music, share photos and so on. It is suitable for scenarios such as large-screen gaming, meetings and office work, video pushing, audio and video sharing, and parent-child interaction.
Duplexing: allowing two communication devices to transmit data in both directions. The communication link between mobile devices may occupy two frequencies: a transport channel from the terminal to the network (uplink) and a channel in the reverse direction (downlink). Duplex means that transmission in both directions can take place simultaneously, as in an ordinary telephone call. The method for controlling terminal interaction in the embodiments of the present application supports duplex interaction. For example, when multimedia content is displayed on a TV (smart television) based on the mobile phone version of an APP, both the mobile phone and the smart television reflect the execution result of an operation, no matter whether the operation is performed on the mobile phone or on the TV. For instance, when an operation is performed on the TV, the TV passes the action to the mobile phone, the mobile phone executes the corresponding action and returns the execution result to the TV, and the TV and the mobile phone display the result at the same time.
Server (Server): it has soft and hard points. From a hardware perspective, a Server is a physically present Server; from the software perspective, the Server refers to the computer software with Server-side functions and the running Server-side software. The whole network is constructed by countless nodes and connecting channels. And in terms of "hardware", is constructed from numerous hardware servers and other digital computing device terminals (e.g., personal computers, cell phones, etc.) and intermediate connection devices (e.g., network wires, routers, etc.). In terms of "soft", the software is constructed by numerous running server-side software and application software (or terminal software) and their interconnection.
DLNA (Digital Living Network Alliance): its vision is "Enjoy your music, photos and videos, anywhere, anytime". It aims to solve the interconnection and interworking of wireless and wired networks spanning personal computers, consumer electronics and mobile devices, making unlimited sharing and growth of digital media and content services possible. DLNA is not a new technology in itself; it forms a solution, a specification that can be adhered to, so the technologies and protocols it selects are ones that are already widely used. DLNA divides its whole application into five functional components, from bottom to top: network interconnection, network protocols, media transport, discovery, control and management of devices, and media formats.
DMR (Digital Media Renderer ): such as a digital smart television, a smart set-top box, etc. DMRs belong to consumer electronics devices that receive digital media streams from computers through wired or wireless home networks. Some DMRs integrate a display screen and speakers. Some DMRs must be connected to external output devices such as smart televisions, active speakers, or stereo systems. DMRs are available in a variety of sizes, shapes and configurations from different manufacturers.
Simple Service Discovery Protocol: an application-layer protocol and one of the core protocols of Universal Plug and Play (UPnP). It provides a mechanism for discovering devices within a local network. A control point (i.e. the application receiving the service) can use the Simple Service Discovery Protocol to query, according to its own needs, the devices in its local network that provide a specific service. A device (i.e. the server side providing the service) can also use the Simple Service Discovery Protocol to announce its presence to the control points in its local network.
The following briefly introduces the design concept of the embodiments of the present application:
In the related art, a terminal device can serve a user as a game console, a smart television, a learning machine and the like, bringing more enjoyment to daily life. In addition, based on the DLNA protocol, digital content can be shared between terminal devices; for example, media such as videos and pictures on a mobile phone can be projected onto a TV for playback or display.
The process of projection is mainly divided into two stages of equipment discovery and equipment control. The device discovery phase is divided into two steps, wherein the first step is to acquire basic description of the device, and the second step is to acquire detailed description of the device.
In the first step there are two ways to implement, active discovery and passive discovery. Fig. 1A is a schematic diagram of a method for obtaining a basic description of a device in a device discovery phase in the related art.
With reference to fig. 1A, the active discovery process is as follows: the actively discovering device (control end) sends a Search multicast to a designated address and port over the User Datagram Protocol (UDP), and after receiving the multicast the target device (receiving end, DMR) returns its basic device information by UDP unicast. The passive discovery process is as follows: the target device sends a Notify multicast to the local area network over UDP, and a passively discovering device that receives the multicast obtains the device description address.
In the second step, obtaining the detailed description of the device, the control end device obtains the detailed device description from the basic description address.
Through device discovery, the control end device can establish connection with the projection target device and acquire the projection type supported by the target device, so that a foundation is laid for the next device control.
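As an illustration of the active-discovery step in the device discovery phase described above, the following minimal sketch (not part of the patent) multicasts an SSDP M-SEARCH and collects the unicast replies. The address 239.255.255.250:1900 and the MediaRenderer search target come from the UPnP specification, while the timeout and the returned fields are illustrative assumptions.

    import socket

    SSDP_ADDR, SSDP_PORT = "239.255.255.250", 1900   # standard SSDP multicast address and port

    def discover_renderers(timeout: float = 3.0) -> list[dict]:
        """Active discovery: multicast an M-SEARCH and collect unicast replies from media renderers (DMRs)."""
        msearch = "\r\n".join([
            "M-SEARCH * HTTP/1.1",
            f"HOST: {SSDP_ADDR}:{SSDP_PORT}",
            'MAN: "ssdp:discover"',
            "MX: 2",
            "ST: urn:schemas-upnp-org:device:MediaRenderer:1",
            "", "",
        ]).encode()

        sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
        sock.settimeout(timeout)
        sock.sendto(msearch, (SSDP_ADDR, SSDP_PORT))

        devices = []
        try:
            while True:
                data, addr = sock.recvfrom(65507)
                headers = {}
                for line in data.decode(errors="ignore").split("\r\n")[1:]:
                    if ":" in line:
                        key, value = line.split(":", 1)
                        headers[key.strip().upper()] = value.strip()
                # LOCATION points at the device description, i.e. the "basic description" address
                devices.append({"ip": addr[0], "location": headers.get("LOCATION", "")})
        except socket.timeout:
            pass
        finally:
            sock.close()
        return devices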
In the device control phase, taking video playback as an example, the control end directly sends the type (video playback) plus the playback address to the target device, provided the target device supports video playback (supported by default); the target device receives the video playback address and plays the video directly, which completes the projection. In addition, if the user performs an action through the control end, the control end sends an Action to the receiving end, and the receiving end replies with a Response to the control end, as shown in fig. 1B.
Fig. 1C shows a projection timing chart in the related art, taking a mobile phone as the control end and a TV as the receiving end; the specific process is as follows:
In the device discovery phase the mobile phone discovers the TV and obtains the basic device description returned by the TV; the mobile phone then requests the detailed device description from the TV and receives it; finally, a video playback url (Uniform Resource Locator) is sent to the TV, and the TV plays the video from that url.
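To make the device control phase concrete, here is a hedged sketch (not from the patent) of sending a playback address to the receiving end with the standard UPnP AVTransport action SetAVTransportURI; the control URL would be read from the device's detailed description, instance id 0 is the usual default, and playback starts once a Play action is sent in the same way.

    import urllib.request

    def send_play_url(control_url: str, video_url: str) -> None:
        """Control end: push a video playback url to the receiving end (DMR) over UPnP AVTransport."""
        body = (
            '<?xml version="1.0"?>'
            '<s:Envelope xmlns:s="http://schemas.xmlsoap.org/soap/envelope/" '
            's:encodingStyle="http://schemas.xmlsoap.org/soap/encoding/"><s:Body>'
            '<u:SetAVTransportURI xmlns:u="urn:schemas-upnp-org:service:AVTransport:1">'
            '<InstanceID>0</InstanceID>'
            f'<CurrentURI>{video_url}</CurrentURI>'
            '<CurrentURIMetaData></CurrentURIMetaData>'
            '</u:SetAVTransportURI></s:Body></s:Envelope>'
        )
        req = urllib.request.Request(
            control_url,
            data=body.encode("utf-8"),
            headers={
                "Content-Type": 'text/xml; charset="utf-8"',
                "SOAPAction": '"urn:schemas-upnp-org:service:AVTransport:1#SetAVTransportURI"',
            },
            method="POST",
        )
        urllib.request.urlopen(req)   # the receiving end replies with a SOAP Response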
As can be seen from the above, current projection only supports one-way interaction. Taking the mobile phone as the control end and the smart television as the receiving end as an example, only operations performed on the mobile phone can be shown on the smart television; operations on the smart television cannot be fed back to and displayed on the mobile phone. Moreover, when operations on the mobile phone are shown on the smart television, only media-related functions are involved, such as playing videos or pictures from the mobile phone on the smart television; only the projection of multimedia content is supported, which is a limitation.
In view of this, the embodiments of the present application provide a method, an apparatus, an electronic device and a storage medium for controlling terminal interaction. The interaction between the applications is bidirectional: for example, an interface of the mobile phone version of an APP can be projected onto a TV and displayed there, and then, no matter whether the mobile phone or the TV is operated, both devices display the execution result of the action at the same time. As a result, applications with the same functionality do not need to be adapted separately to different terminal devices such as a mobile phone, a TV and a smart watch; the applications in the embodiments of the present application support development on one end with automatic adaptation on the other ends. For example, an application developed for the mobile phone can be adapted automatically on the TV and on the computer. In addition, the embodiments of the present application support not only the projection of multimedia content but also the projection of interface elements of operation controls, such as buttons, selection boxes and other complex interface elements, which improves the flexibility of terminal interaction.
The preferred embodiments of the present application will be described below with reference to the accompanying drawings of the specification, it should be understood that the preferred embodiments described herein are merely for illustrating and explaining the present application, and are not intended to limit the present application, and that the embodiments and features of the embodiments in the present application may be combined with each other without conflict.
Fig. 2 is a schematic view of an application scenario according to an embodiment of the present application. The scenario includes two terminal devices 210 and a server 220; a terminal device 210 can log in to a relevant interface, such as the first operation interface or the second operation interface. The terminal devices 210 and the server 220 can communicate with each other through a communication network. Each terminal device corresponds to one user; fig. 2 takes the case where user A and user B each have one terminal device 210 as an example, and the number of terminal devices is not limited in practice. In some cases, the terminal devices may first communicate with each other through the server 220 and then establish direct communication between themselves; direct communication between terminal devices may be referred to as point-to-point communication, in which case some interaction between the terminal devices no longer requires the server 220 as a relay.
Wherein each terminal device may have installed therein an application for displaying multimedia content. The application related to the embodiment of the application may be a pre-installed application, an applet embedded in a certain application, or a web page version application, and the specific type of the application is not limited.
In an alternative embodiment, the communication network is a wired network or a wireless network. The terminal device 210 and the server 220 may be directly or indirectly connected through wired or wireless communication, and the application is not limited herein.
In the embodiment of the present application, the terminal device 210 is an electronic device used by a user, and the electronic device may be a computer device having a certain computing capability and running software or a website capable of displaying multimedia content, such as a personal computer, a mobile phone, a tablet computer, a notebook computer, a smart television, an e-book reader, and the like. Each terminal device 210 is connected to the server 220 through a wireless Network, and the server 220 may be an independent physical server, a server cluster or a distributed system formed by a plurality of physical servers, or a cloud server providing basic cloud computing services such as a cloud service, a cloud database, cloud computing, a cloud function, cloud storage, a Network service, cloud communication, middleware service, a domain name service, a security service, a CDN (Content Delivery Network), and a big data and artificial intelligence platform.
In the embodiment of the present application, an application or a browser for displaying multimedia content may be installed on the first terminal and the second terminal for interaction, and hereinafter, the application is taken as an example, and for the purpose of distinguishing, an application for displaying multimedia content located on the first terminal may be referred to as a first application, and an application for displaying multimedia content located on the second terminal may be referred to as a second application. In this embodiment of the application, the first terminal is a source device, and the second terminal is a display device, so the first application may be referred to as a source APP, and the second application may be referred to as a target presentation APP. For example, the source device is a mobile phone, the display device is a smart television, the source APP may be a certain video application installed on the mobile phone, and the target display APP may be a certain playing application installed on the smart television.
The source device and the display device first need to establish a connection for association. Specifically, there are two ways: in the first mode, the source equipment and the display equipment are directly connected through a local area network; in the second mode, the server 220 may be used as a relay device when the source device and the display device establish a connection. These two connection modes are described in detail below with reference to the accompanying drawings:
For the first mode, fig. 3A shows a timing chart of establishing a connection between a source device and a display device in this embodiment. In this mode the connection is established mainly through a local area network, and the source device and the display device are located in the same local area network:
First, the source device discovers devices, here display devices, in its local area network through the Simple Service Discovery Protocol; a display device returns information such as its ip address to the source device; after receiving the ip address, the source device requests a socket connection to that address; and after the display device receives the request and agrees to establish the connection with the source device, it returns a message agreeing to establish the connection, so the connection between the source device and the display device can be established. Further, the interaction between the source device and the display device mainly refers to the interaction between the first application on the source device and the second application on the display device; in this embodiment of the present application, the first application on the source device and the second application on the display device may communicate through a predefined interaction protocol.
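A minimal sketch of the socket step in this first mode, assuming the display device's ip address was obtained through discovery, that the target presentation APP listens on a fixed port, and that the two applications exchange newline-delimited JSON messages; the port number and message fields are assumptions rather than details defined by the patent.

    import json
    import socket

    DISPLAY_PORT = 9500   # assumed port on which the target presentation APP listens

    def connect_to_display(display_ip: str) -> socket.socket:
        """Source APP side: request a socket connection using the ip returned during discovery."""
        sock = socket.create_connection((display_ip, DISPLAY_PORT), timeout=5)
        hello = {"type": "connect_request", "from": "source_app"}
        sock.sendall((json.dumps(hello) + "\n").encode())
        reply = json.loads(sock.makefile().readline())   # the display device answers whether it agrees
        if reply.get("type") != "connect_accepted":
            sock.close()
            raise ConnectionError("display device refused the connection")
        return sock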
In the second mode, the connection is relayed through the server and does not depend on a local area network; a common example is binding a device by scanning a code, after which messages can be sent to the bound device. For example, when the source device is on a mobile data network and the display device is on a WIFI (Wireless-Fidelity) network, the source device and the display device can be connected with the Server acting as a relay.
Fig. 3B is a timing diagram illustrating another way of establishing a connection between a source device and a display device in the embodiment of the present application:
the source device first needs to obtain a list of bound devices from the Server. The binding device list may include at least one display device, each of the display devices has a second application installed thereon, and the second applications are bound to the first application on the source device in advance in a code scanning manner or the like. After receiving the binding device list returned by the Server, the source device may display the binding device list on the current display interface, or may display the binding device list in a pop-up window manner.
Then, the source device receives the user's selection instruction for the bound device list, determines the display device selected by the user as the target display device according to that selection, sends a message to the target display device, and receives the result returned by the target display device.
In this embodiment, by displaying the bound device list, receiving the user's selection instruction for it, and determining the display device selected by the user as the target display device, terminal interaction with a display device chosen by the user is achieved, which meets the user's personalized needs and improves the user experience.
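For the second mode, the sketch below shows one possible shape of the Server-relayed exchange: fetching the bound device list and relaying a message to the selected target display device. The endpoint paths, field names and token handling are hypothetical placeholders, not part of the patent.

    import json
    import urllib.request

    SERVER = "https://relay.example.com"   # hypothetical relay Server

    def fetch_bound_devices(user_token: str) -> list[dict]:
        """Ask the Server for the display devices previously bound, e.g. by scanning a code."""
        req = urllib.request.Request(
            f"{SERVER}/bound-devices",
            headers={"Authorization": f"Bearer {user_token}"},
        )
        with urllib.request.urlopen(req) as resp:
            return json.load(resp)   # e.g. [{"device_id": "tv-01", "name": "Living room TV"}, ...]

    def send_to_target(user_token: str, device_id: str, message: dict) -> dict:
        """Relay a message to the display device the user selected from the bound device list."""
        req = urllib.request.Request(
            f"{SERVER}/devices/{device_id}/messages",
            data=json.dumps(message).encode(),
            headers={"Authorization": f"Bearer {user_token}",
                     "Content-Type": "application/json"},
            method="POST",
        )
        with urllib.request.urlopen(req) as resp:
            return json.load(resp)   # result returned by the target display device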
Referring to fig. 4, an implementation flow chart of a method for controlling terminal interaction provided in the embodiment of the present application is applied to a first terminal, and a specific implementation flow of the method is as follows:
in step S41: the first terminal sends the interface elements of the operation controls arranged on the first operation interface to a second terminal to be interacted, so that the second terminal displays the interface elements of the operation controls on a second operation interface, and the first terminal controls the second terminal to display multimedia contents;
in this application, the first operation interface may be an operation interface of a first application, and the second operation interface may be an operation interface of a second application, where the first application is mainly taken as a source APP on a mobile phone in this application, and the source APP may be an application that can display multimedia content, such as a video application. The second application, namely the target display APP on the smart television, is an application newly proposed in the embodiment of the application, and is used for realizing an interaction protocol and communicating with the source APP, and the target display APP can also be made into an H5(HTML5.0) page form, so that the target display APP can be issued in a background under the condition of change without upgrading; that is to say, the target presentation APP in the embodiment of the present application is a general APP, and any source APP needs to establish a connection with the target presentation APP only by implementing the target presentation APP according to a defined protocol. The specific implementation manner when the first terminal establishes the connection with the second terminal may refer to the method shown in fig. 3A or fig. 3B.
Each operation control arranged on the first operation interface is used to control the playing or display of the multimedia content. For example, when the multimedia content is a video, the interface elements may be buttons, scroll bars, menus and the like used to control the playback mode, playback progress and so on, such as a fast-forward button, a speed selection box, a definition menu selection box, and a scroll bar for adjusting the playback progress. When the multimedia content is a picture, the interface elements may be buttons for previous, next, rotate, crop and the like. The following description mainly takes the case where the multimedia content is a video as an example.
Fig. 5A is a schematic diagram of a first operation interface given in the embodiments of the present application. The first terminal is a mobile phone and the first application is video application A on the mobile phone. The first operation interface mainly consists of two parts, a content display area and an operation area: the content display area shows the multimedia content, and the operation area shows the interface elements of the operation controls. The content display area is the middle area of the first operation interface, and the operation area is the border area, such as the upper and lower border areas. As shown in fig. 5A, the first operation interface is displayed in landscape mode; the interface elements shown in the lower border area are S51 to S55, and the interface element shown in the upper border area is S56. S51 is a pause button whose operation control governs the playback state of the multimedia content, i.e. paused or playing; S52 is the next-episode button whose operation control governs switching of the multimedia content; S53 is a scroll bar whose operation control governs the playback progress of the multimedia content; S54 is a speed selection box whose operation control governs the playback speed; S55 is a definition selection box whose operation control governs the definition of the multimedia content; and S56 is a bullet-screen switch that controls the display of bullet-screen comments.
It should be noted that the interface elements listed in fig. 5A are only examples, and actually may also include other interface elements, which are not specifically limited herein.
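To make the projection of interface elements concrete, the controls of fig. 5A could be serialized into a message like the one below when the source APP sends them to the target presentation APP; the JSON shape and field names are illustrative assumptions, since the patent only requires that each operation control be described well enough for adaptive display on the second operation interface.

    interface_elements_message = {
        "type": "interface_elements",
        "elements": [
            {"id": "S51", "kind": "button",     "label": "pause/play"},
            {"id": "S52", "kind": "button",     "label": "next episode"},
            {"id": "S53", "kind": "scroll_bar", "label": "progress", "min": 0, "max": 100},
            {"id": "S54", "kind": "select_box", "label": "speed",
             "options": ["0.5X", "1.0X", "1.5X", "2.0X"], "selected": "1.0X"},
            {"id": "S55", "kind": "select_box", "label": "definition",
             "options": ["SD", "HD", "FHD"], "selected": "HD"},
            {"id": "S56", "kind": "switch",     "label": "bullet screen", "on": True},
        ],
    }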
In the embodiment of the application, after the second terminal receives the interface elements sent by the first terminal, it also displays the interface elements of the operation controls on the second operation interface. For example, when the second terminal is a smart television and the second application is the target presentation APP on the smart television, fig. 5B shows a schematic diagram of the second operation interface in the embodiment of the present application. The second operation interface shown in fig. 5B is likewise divided into a content display area and an operation area: the content display area shows the multimedia content and the operation area shows the interface elements. When the interface elements of the operation controls are displayed on the second operation interface, they can be displayed adaptively according to the size of the smart television screen. In this way, the interface elements of the operation controls are also shown on the second operation interface of fig. 5B, so the user can perform control operations directly on the second operation interface.
The first terminal in the embodiment of the present application controls the second terminal to display the multimedia content, and the specific implementation manner is as follows:
the first terminal displays the multimedia content and sends the content information of the multimedia content to the second terminal so that the second terminal synchronously displays the multimedia content according to the content information; or the first terminal displays the multimedia content and sends the address information of the multimedia content to the second terminal so that the second terminal acquires and displays the multimedia content according to the address information.
Taking video content as an example, in the embodiment of the present application, besides the interface elements, the first terminal also needs to send the audio and video data of the currently played video content, or its url address, to the second terminal.
If the first terminal directly sends the audio and video data, namely the content information of the video content to the second terminal, the second terminal can directly display the video content according to the audio and video data; if the first terminal sends the url address, that is, the address information of the video content to the second terminal, the second terminal may download and play the video content according to the url address.
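The two ways of controlling the second terminal to display the multimedia content could be carried by two different message shapes, sketched below with assumed field names; the playback url is a placeholder.

    # Mode 1: send the content information (audio and video data) directly
    content_message = {
        "type": "multimedia_content",
        "mode": "content_info",
        "media": {"format": "mp4", "data": "<audio/video data, streamed separately>"},
    }

    # Mode 2: send only the address information; the second terminal downloads and plays the content itself
    address_message = {
        "type": "multimedia_content",
        "mode": "address_info",
        "url": "https://example.com/videos/x-video.m3u8",   # hypothetical playback url
    }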
In the above embodiment, the first terminal may project the multimedia content to the second terminal for display, and may also project the interface elements of the operation controls to the second terminal for display, so that the projection types are enriched.
In step S42: the method comprises the steps that a first terminal receives an interface operation instruction sent by a second terminal, wherein the interface operation instruction is generated by the second terminal according to control operation aiming at a target interface element displayed on a second operation interface;
In the above process, the second terminal displays the interface elements of the operation controls on the second operation interface, so the user can perform control operations, such as click events or slide events, directly on the interface elements of the second operation interface. For example, as shown in fig. 5C, the user selects 1.5X speed by clicking on the second operation interface; this triggers a control operation on the target interface element S54, an interface operation instruction is generated according to the control operation, and the interface operation instruction is sent to the first terminal.
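On the second terminal, the click that selects 1.5X on element S54 could be wrapped into an interface operation instruction such as the following before being sent to the first terminal; the structure matches the element message sketched earlier and is likewise an assumption.

    import json

    def build_operation_instruction(element_id: str, event: str, value=None) -> bytes:
        """Second terminal: wrap a control operation on a target interface element into an instruction."""
        instruction = {
            "type": "interface_operation",
            "target_element": element_id,   # e.g. "S54", the speed selection box
            "event": event,                 # e.g. "click", "slide"
            "value": value,                 # e.g. "1.5X"
        }
        return (json.dumps(instruction) + "\n").encode()

    # Usage: the user clicks 1.5X on the second operation interface
    message = build_operation_instruction("S54", "click", "1.5X")
    # sock.sendall(message)   # sent to the first terminal over the established connection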
In step S43: and the first terminal executes corresponding operation on the target operation control corresponding to the target interface element according to the interface operation instruction, and sends the control result to the second terminal so that the second terminal displays the control result.
The operation control corresponding to the target interface element is the target operation control; for the target interface element S54 above, this is the operation control governing the playback speed. After receiving the interface operation instruction, the first terminal executes the corresponding operation on the target operation control according to the instruction. For example, for the click that selects the 1.5X speed above, the following operation is performed: the playback speed of the video content is adjusted to 1.5 times normal speed, with the result shown in fig. 5D. The control result (i.e. the adjustment result) is then sent to the second terminal, which displays it, achieving synchronized playback control of the video content by the mobile phone and the smart television: the adjusted playback speed is reported to the second terminal as the control result, and after receiving it the second terminal synchronously adjusts its playback speed to 1.5 times normal, as shown in fig. 5E.
It should be noted that the above takes the target interface element S54 as an example; the same applies when the target interface element is another interface element. For example, when the target interface element is S55, the definition can be adjusted and controlled in a manner similar to the above process.
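A sketch of how the first terminal might dispatch such an instruction to the target operation control and return the control result to every connected second terminal; the player object and its set_speed/set_definition methods are hypothetical stand-ins for whatever playback interface the source APP actually exposes.

    import json

    def handle_operation_instruction(instruction: dict, player, connections: list) -> None:
        """First terminal: execute the corresponding operation, then send the control result back."""
        element_id = instruction["target_element"]
        value = instruction.get("value")

        if element_id == "S54":          # speed selection box
            player.set_speed(float(value.rstrip("X")))        # e.g. 1.5 times normal speed
            result = {"element": element_id, "state": {"speed": value}}
        elif element_id == "S55":        # definition selection box
            player.set_definition(value)
            result = {"element": element_id, "state": {"definition": value}}
        else:
            result = {"element": element_id, "state": "unsupported"}

        payload = (json.dumps({"type": "control_result", "result": result}) + "\n").encode()
        for sock in connections:         # every connected second terminal displays the result synchronously
            sock.sendall(payload)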
In the above embodiment, the first terminal can send the interface elements of each operation control to the second terminal, and the interface elements are displayed by the second terminal, so that when the first terminal controls the second terminal to display multimedia content, the second terminal can further respond to the control operation of the target interface element displayed on the second operation interface, and further generate an interface operation instruction to send to the first terminal, which is further controlled by the first terminal, thereby realizing bidirectional interaction between the first terminal and the second terminal.
In addition, considering that the first application is continuously updated, after an update the operation controls for controlling the display of multimedia content, or their interface elements, may change; the user may also trigger an update operation on the first operation interface at the first terminal to update its interface elements. In this embodiment of the application, when an interface element on the first operation interface is updated, the updated interface element needs to be sent to the second terminal.
Specifically, the first terminal responds to the updating operation of the first operation interface, sends the interface element of the operation control with the updated interface to the second terminal, and the second terminal receives the interface element of the operation control with the updated interface sent by the first terminal and updates and displays the changed interface element.
The user can directly control the display of the multimedia content through the first terminal. For example, after the user adjusts the playback definition on the first operation interface, the interface elements change and the updated interface elements can be sent to the second terminal; or, after the bullet-screen setting is turned on, an operation control for entering and sending bullet-screen comments is newly added, and the interface element of this newly added operation control is sent to the second terminal so that the second terminal updates the displayed interface elements on the second operation interface.
In the embodiment of the application, when the first application on the first terminal is updated, the interface elements of the operation controls on the first operation interface may also change. When they do, the first terminal needs to send the updated interface elements to the second terminal, so that the second terminal updates and displays the changed interface elements on the second operation interface. This keeps the interface elements on the second terminal consistent with those on the first terminal, so that control operations on a target interface element can still be triggered through the second terminal.
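When the first operation interface is updated, the changed interface elements could be pushed as an incremental message like the one below (for example a newly added bullet-screen input control after the bullet-screen switch is turned on); the diff format and the element S57 are illustrative assumptions.

    interface_update_message = {
        "type": "interface_update",
        "added": [
            {"id": "S57", "kind": "text_input", "label": "send bullet screen"},   # hypothetical new control
        ],
        "changed": [
            {"id": "S56", "kind": "switch", "label": "bullet screen", "on": True},
        ],
        "removed": [],
    }
    # The second terminal applies the differences and re-renders only the changed interface elements.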
Fig. 6 is a schematic flow chart of updating displayed interface elements in the embodiment of the present application, taking the first terminal being a mobile phone and the second terminal being a smart television as an example.
In step S61, the mobile phone sends a message to the smart television, where the message includes the interface elements of each operation control on the first operation interface;
in step S62, the smart television performs interface adaptive display according to the received message;
in step S63, the smart television replies to the mobile phone that the message has been received;
in step S64, the smart television responds to the control operation for the target interface element, and generates an interface operation instruction;
in step S65, the smart television sends a message to the mobile phone, where the message includes an interface operation instruction;
in step S66, the mobile phone executes the relevant action according to the interface operation command;
in step S67, the mobile phone sends a message to the smart television again, where the message includes the updated interface element on the first operation interface;
in step S68, the smart television performs interface adaptive display according to the received message, and updates display interface elements;
in step S69, the smart television replies to the mobile phone that the message has been received.
Based on the above embodiment, the display and updating of interface elements is realized and the second terminal stays synchronized with the first terminal, so control operations on target interface elements can be triggered through the second terminal. In addition, the action is transmitted from the second terminal to the first terminal and the result of executing the action is reflected on both terminals at the same time; since the second terminal can display interface elements in addition to the multimedia content, the interaction information between applications is enriched and the flexibility of application interaction is improved.
The above embodiments describe the first terminal and the second terminal in a one-to-one manner as an example. In the embodiments of the present application, the user may select one or more display devices. When the user selects several display devices, there may be several second applications to interact with the first application on the source device. For example, as shown in fig. 7A, the source device is a mobile phone and the display devices include a notebook, a computer and a smart television; the target presentation APP installed on the notebook, the target presentation APP installed on the computer and the target presentation APP installed on the smart television are all second applications to interact with the first application on the mobile phone.
Optionally, the first terminal may project the multimedia content to a plurality of second terminals, for example three second terminals to be interacted with, namely a computer, a smart television and a tablet. In this case, when the first terminal receives an interface operation instruction sent by any one of the second terminals, it executes the corresponding operation on the target operation control corresponding to the target interface element according to the received interface operation instruction, and sends the control result to each of the second terminals, so that each second terminal synchronously displays the effect of the control result.
Suppose the first application is video application A on a mobile phone. After user A opens video application A and selects video X for projection, video X is projected to three target display devices, namely a computer, a tablet and a smart television; video X can then be played synchronously by the target display APPs on the three target display devices.
When the first terminal receives interface operation instructions sent by a plurality of second terminals at the same time, the following specific situations can be distinguished:
in the first case, the first terminal receives interface operation instructions sent by the plurality of second terminals at the same time, and the interface operation instructions sent by each second terminal are the same.
At this time, the first terminal executes corresponding operation on the target operation control corresponding to the target interface element according to the interface operation instruction.
For example, user B triggers control operation 1 for target interface element 1 through the second operation interface of target display APP1 on the computer, and user C triggers control operation 1 for target interface element 1 through the second operation interface of target display APP2 on the smart television. In this case, the first terminal receives the interface operation instructions sent by target display APP1 and target display APP2 at the same time, and the two instructions are the same, so the first terminal only needs to execute control operation 1 on target interface element 1 according to the instruction sent by either target display APP1 or target display APP2, and then synchronously send the control result to target display APP1 on the computer, target display APP2 on the smart television, and target display APP3 on the tablet.
In the second case, the first terminal receives interface operation instructions sent by the plurality of second terminals at the same time, and the interface operation instructions sent by at least two of the second terminals are different.
For example, user B triggers control operation 1 for target interface element 1 through the second operation interface of target display APP1 on the computer, and user C triggers control operation 2 for target interface element 2 through the second operation interface of target display APP2 on the smart television. In this case, the first terminal may receive the interface operation instructions sent by target display APP1 and target display APP2 at the same time, and the two instructions are different.
In this case, either of the following two processing modes can be adopted; a code sketch of both modes is given after the second mode.
In the first processing mode, the corresponding operations are executed in sequence on the target operation controls corresponding to the respective target interface elements, according to the interface operation instruction sent by each second terminal, in descending order of the priority of the second terminals that sent the instructions.
Assume the priorities of the three terminal devices, from high to low, are: smart television, computer, tablet. The priority of the computer, on which target display APP1 that sent an interface operation instruction is installed, is lower than the priority of the smart television, on which target display APP2 is installed. Therefore, the first terminal first executes control operation 2 on target operation control 2 corresponding to target interface element 2 according to the interface operation instruction sent by target display APP2, and synchronously sends the control result to target display APP1 on the computer, target display APP2 on the smart television and target display APP3 on the tablet.
Then, the first terminal executes control operation 1 on target operation control 1 corresponding to target interface element 1 according to the interface operation instruction sent by target display APP1, and synchronously sends the control result to target display APP1 on the computer, target display APP2 on the smart television and target display APP3 on the tablet.
In the second processing mode, according to the priorities of the second terminals that sent the interface operation instructions, the corresponding operation is executed on the target operation control corresponding to the target interface element only according to the interface operation instruction sent by the second terminal with the highest priority.
In this way, according to the priorities, the first terminal only needs to execute control operation 2 on target operation control 2 corresponding to target interface element 2 according to the interface operation instruction sent by target display APP2, and then synchronously send the control result to target display APP1 on the computer, target display APP2 on the smart television and target display APP3 on the tablet, so that target display APP1, target display APP2 and target display APP3 all display the corresponding result.
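The two processing modes can be sketched as follows. The priority values, device identifiers and function names are assumptions made for the sketch; the embodiment only requires that simultaneous, differing instructions be ordered or filtered by the priority of the sending second terminal.

    interface OpInstruction { senderId: string; elementId: string; action: string; }

    // Hypothetical static priority table (assumed): smart television > computer > tablet.
    const PRIORITY: Record<string, number> = { tv: 3, computer: 2, tablet: 1 };

    // Mode 1: execute every instruction, ordered from the highest-priority sender to the lowest.
    function handleAllByPriority(instrs: OpInstruction[], exec: (i: OpInstruction) => void) {
      [...instrs]
        .sort((a, b) => (PRIORITY[b.senderId] ?? 0) - (PRIORITY[a.senderId] ?? 0))
        .forEach(exec);
    }

    // Mode 2: execute only the instruction from the highest-priority sender (assumes instrs is non-empty).
    function handleHighestOnly(instrs: OpInstruction[], exec: (i: OpInstruction) => void) {
      const top = instrs.reduce((a, b) =>
        (PRIORITY[a.senderId] ?? 0) >= (PRIORITY[b.senderId] ?? 0) ? a : b);
      exec(top);
    }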
Based on this embodiment, synchronous operation across different terminals can be achieved: the multimedia content is displayed synchronously on multiple terminal devices and the corresponding operations are executed synchronously, which greatly improves the user experience.
In the embodiment of the application, whether interface operation instructions sent by second terminals are the same is determined according to whether the target interface elements they act on are the same and whether the corresponding control operations are the same. If the target interface elements targeted by two interface operation instructions are different, the two instructions are different. If the target interface elements are the same but the corresponding control operations are different, the two instructions are also different; for example, both instructions act on the target interface element S54, but one sets 2x playback speed while the other sets 1.5x playback speed. Only if the target interface elements and the corresponding control operations are both the same are the two interface operation instructions the same.
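This comparison can be sketched in one line, reusing the OpInstruction shape assumed in the earlier sketch; the field names are assumptions.

    // Two interface operation instructions are equal only if both the target element and the operation match.
    function sameInstruction(a: OpInstruction, b: OpInstruction): boolean {
      return a.elementId === b.elementId && a.action === b.action;
    }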
In addition, besides the projection manner illustrated in fig. 7A, the multimedia content projected by a plurality of source devices, or by a plurality of first applications on one source device, may be displayed on a single display device. For example, as shown in fig. 7B, the notebook and the computer are both source devices and the smart television is the display device; the first applications are a video application on the notebook and a video application on the computer, and the second application is the target display APP on the smart television. In this case, the multimedia content projected by the two first applications is displayed in a split-screen manner in the target display APP.
That is, when the second application receives address information or content information of multimedia content sent by a plurality of first applications, the second application may display the content in a split-screen manner. For example, if the second application receives video content from two first applications, video application A and video application B, it may display video content 1 sent by video application A in a first area and video content 2 sent by video application B in a second area, as shown in fig. 8: the left side of the smart television screen displays video content 1 sent by video application A on the notebook, together with its corresponding interface elements, and the right side displays video content 2 sent by video application B on the computer, together with its corresponding interface elements. In this case, the display and control of video content 1 and video content 2 are independent and do not affect each other.
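A rough sketch of the split-screen assignment is given below; the region layout, the Source type and the function name are assumptions, since the embodiment does not prescribe how screen areas are allocated.

    interface Source { appName: string; contentUrl: string; }

    // Assign each projecting first application its own region of the display device's screen,
    // e.g. two sources produce a left half and a right half.
    function layoutSources(sources: Source[], screenWidth: number, screenHeight: number) {
      const regionWidth = screenWidth / sources.length;
      return sources.map((src, i) => ({
        source: src,
        region: { x: i * regionWidth, y: 0, width: regionWidth, height: screenHeight },
      }));
    }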
Referring to fig. 9, which is an implementation flow chart of a method for controlling terminal interaction provided in the embodiment of the present application and applied to a second terminal, the specific implementation flow of the method is as follows:
in step S91, the second terminal receives interface elements of each operation control set on the first operation interface sent by the first terminal, and displays the interface elements of each operation control on the second operation interface, and the second terminal displays multimedia content based on the control of the first terminal;
in step S92, the second terminal generates an interface operation instruction according to the control operation for the target interface element displayed on the second operation interface, and sends the interface operation instruction to the first terminal, so that the first terminal executes a corresponding operation on the target operation control corresponding to the target interface element according to the interface operation instruction;
in step S93, the second terminal receives the control result returned by the first terminal, and displays the control result.
In the foregoing embodiment, the first terminal sends the interface elements of each operation control to the second terminal, and the second terminal displays them on the second operation interface. Therefore, while the first terminal controls the second terminal to display the multimedia content, the second terminal can also respond to a control operation for a target interface element displayed on the second operation interface, generate an interface operation instruction and send it to the first terminal, thereby realizing bidirectional interaction between the second terminal and the first terminal. After receiving the interface operation instruction, the first terminal performs the corresponding control. In this way, the second terminal transmits the action to the first terminal, the process and result of executing the action are reflected on both terminals at the same time, and the second terminal can display interface elements in addition to the multimedia content, which enriches the interaction information between applications and improves the flexibility of terminal interaction.
Optionally, the displaying, by the second terminal, of the multimedia content based on the control of the first terminal specifically includes one of the following two modes (a brief sketch of both modes follows them):
the second terminal receives content information of the multimedia content sent by the first terminal and displays the multimedia content according to the content information, wherein the multimedia content is the multimedia content currently displayed by the first terminal; or
And the second terminal receives the address information of the multimedia content sent by the first terminal, and acquires and displays the multimedia content according to the address information, wherein the multimedia content is the multimedia content currently displayed by the first terminal.
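A minimal sketch of the two modes, assuming hypothetical fetchByAddress and renderContent helpers standing in for whatever player the target display APP actually uses:

    type CastPayload =
      | { mode: "content"; data: ArrayBuffer }   // content information pushed directly by the first terminal
      | { mode: "address"; url: string };        // address information; the second terminal fetches the content

    async function displayOnSecondTerminal(payload: CastPayload) {
      if (payload.mode === "content") {
        renderContent(payload.data);                        // display what the first terminal currently shows
      } else {
        renderContent(await fetchByAddress(payload.url));   // pull the same multimedia content by its address
      }
    }

    async function fetchByAddress(url: string): Promise<ArrayBuffer> {
      return (await fetch(url)).arrayBuffer();
    }
    function renderContent(_data: ArrayBuffer) { /* hand the media to the playback component */ }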
Optionally, the method further includes:
and receiving the interface element of the operation control with the updated interface, which is sent by the first terminal, and updating and displaying the changed interface element, wherein the interface element of the operation control with the updated interface is sent by the first terminal according to the updating operation responding to the first operation interface.
In addition, the first terminal and the second terminal in the embodiment of the application communicate with each other through a pre-established interaction protocol.
In the embodiment of the present application, when the first terminal and the second terminal interact with each other, the interface elements, interface operation instructions, control results, and the content information or address information of the multimedia content may all be transmitted based on the interaction protocol. The interaction protocol may adopt the JSON (JavaScript Object Notation) format; JSON is an open, text-based data exchange format that is easy to extend. For example:
[The example send message and reply message are shown as figures BDA0002594173410000211 and BDA0002594173410000221 in the original publication; their JSON bodies are not reproduced here.]
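Purely as an illustration, a send message and its reply under such a JSON protocol might look like the following; every field name and value here is an assumption, since the original figures defining the format are not reproduced.

    // Hypothetical send message from the first terminal (field names are assumed).
    const sendMessage = {
      msgType: "syncInterface",
      mediaUrl: "http://example.com/video/x.m3u8",   // address information of the multimedia content
      elements: [
        { id: "play", type: "button", label: "Play/Pause" },
        { id: "speed", type: "select", label: "Playback speed", options: ["1.0x", "1.5x", "2.0x"] },
      ],
    };

    // Hypothetical reply from the second terminal acknowledging receipt.
    const replyMessage = { msgType: "ack", of: "syncInterface", status: "received" };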
It should be noted that the above only illustrates a JSON-format send message and reply message exchanged between the first terminal and the second terminal; the specific implementation may be determined according to the actual situation and is not limited here.
Fig. 10 shows an interaction sequence diagram of a method for controlling terminal interaction. The specific implementation flow of the method is as follows (a condensed end-to-end sketch is given after the steps):
step S101: the first terminal displays the multimedia content and sends the address information of the multimedia content and interface elements of each operation control arranged on the first operation interface to the second terminal;
step S102: the second terminal acquires and displays the multimedia content according to the address information, and displays interface elements of each operation control on a second operation interface;
step S103: the second terminal responds to the control operation aiming at the target interface element displayed on the second operation interface;
step S104: the second terminal generates an interface operation instruction according to the control operation and sends the interface operation instruction to the first terminal;
step S105: the first terminal executes corresponding operation on a target operation control corresponding to the target interface element according to the interface operation instruction;
step S106: the first terminal sends the control result to the second terminal;
step S107: the second terminal displays the control result;
step S108: the first terminal responds to the updating operation of the first operation interface;
step S109: the first terminal sends the interface element of the operation control with the updated interface to the second terminal;
step S110: and the second terminal updates and displays the changed interface elements.
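The end-to-end sequence of fig. 10 can be condensed into the following sketch. The Channel abstraction, the message shapes and the example values are all assumptions layered on the steps above; they stand in for whatever transport the interaction protocol actually uses.

    // Minimal channel abstraction (assumed) standing in for the transport between the two terminals.
    interface Channel {
      send(msg: object): void;
      on(kind: string, handler: (msg: any) => void): void;
    }

    function firstTerminalFlow(tv: Channel) {
      // S101: send the media address and the interface elements of each operation control.
      tv.send({ kind: "syncElements", mediaUrl: "http://example.com/x.m3u8", elements: [{ id: "play" }] });
      // S104 -> S105 -> S106: act on the target control and return the control result.
      tv.on("operate", (msg) => tv.send({ kind: "result", elementId: msg.elementId, ok: true }));
      // S108 -> S109: after a local interface update, the changed elements would be pushed the same way.
    }

    function secondTerminalFlow(phone: Channel) {
      // S102: fetch and display the content by its address, and show the interface elements.
      phone.on("syncElements", (msg) => console.log("display", msg.mediaUrl, msg.elements));
      // S103 -> S104: a user tap on a target interface element becomes an interface operation instruction.
      phone.send({ kind: "operate", elementId: "play", action: "toggle" });
      // S107 / S110: show the control result or refresh the changed elements.
      phone.on("result", (msg) => console.log("result", msg));
    }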
Based on the same inventive concept, the embodiment of the application also provides a device for controlling terminal interaction. As shown in fig. 11, which is a schematic structural diagram of a first apparatus 1100 for controlling terminal interaction in this embodiment of the application, the apparatus may include:
a first sending unit 1101, configured to send the interface elements of each operation control set on the first operation interface to a second terminal to be interacted, so that the second terminal displays the interface elements of each operation control on the second operation interface, and the first terminal controls the second terminal to display multimedia content;
a first receiving unit 1102, configured to receive an interface operation instruction sent by a second terminal, where the interface operation instruction is generated by the second terminal according to a control operation for a target interface element displayed on a second operation interface;
the control unit 1103 is configured to execute a corresponding operation on the target operation control corresponding to the target interface element according to the interface operation instruction, and send the control result to the second terminal, so that the second terminal displays the control result.
Optionally, the first sending unit 1101 is specifically configured to:
displaying the multimedia content, and sending the content information of the multimedia content to the second terminal so that the second terminal synchronously displays the multimedia content according to the content information; or
And displaying the multimedia content, and sending the address information of the multimedia content to the second terminal so that the second terminal acquires and displays the multimedia content according to the address information.
Optionally, the apparatus further comprises:
and a first updating unit 1104, configured to send, in response to an updating operation of the first operation interface, the interface element of the operation control with the updated interface to the second terminal, so that the second terminal updates and displays the changed interface element.
Optionally, if there are multiple second terminals to be interacted with, the control unit 1103 is specifically configured to:
and when receiving an interface operation instruction sent by at least one second terminal, executing corresponding operation on the target operation control corresponding to each target interface element according to each interface operation instruction, and sending the control result to each second terminal so that each second terminal synchronously displays the control result.
Optionally, the control unit 1103 is specifically configured to:
if interface operation instructions sent by a plurality of second terminals are received at the same time and the interface operation instructions sent by each second terminal are the same, executing corresponding operation on a target operation control corresponding to the target interface element according to the interface operation instructions; or
If interface operation instructions sent by a plurality of second terminals are received at the same time and the interface operation instructions sent by at least two second terminals are different, executing corresponding operations on target operation controls corresponding to corresponding target interface elements according to the sequence from high to low of the priority of each second terminal sending the interface operation instructions and the interface operation instructions sent by each second terminal in sequence; or
And if the interface operation instructions sent by the plurality of second terminals are received at the same time and the interface operation instructions sent by at least two second terminals are different, executing corresponding operation on the target operation control corresponding to the corresponding target interface element according to the priority of each second terminal sending the interface operation instruction and the interface operation instruction sent by the second terminal with the highest priority.
Based on the same inventive concept, the embodiment of the application also provides a device for controlling terminal interaction. As shown in fig. 12, which is a schematic structural diagram of a second apparatus 1200 for controlling terminal interaction in the embodiment of the present application, the apparatus may include:
a second receiving unit 1201, configured to receive interface elements of each operation control set on a first operation interface sent by a first terminal, and display the interface elements of each operation control on a second operation interface, where the second terminal displays multimedia content based on control of the first terminal;
the operation unit 1202 is configured to generate an interface operation instruction according to a control operation for a target interface element displayed on the second operation interface, and send the interface operation instruction to the first terminal, so that the first terminal executes a corresponding operation on a target operation control corresponding to the target interface element according to the interface operation instruction;
and the executing unit 1203 is configured to receive the control result returned by the first terminal, and display the control result.
Optionally, the second receiving unit 1201 is specifically configured to:
receiving content information of multimedia content sent by a first terminal, and displaying the multimedia content according to the content information, wherein the multimedia content is the multimedia content currently displayed by the first terminal; or
And receiving address information of the multimedia content sent by the first terminal, and acquiring and displaying the multimedia content according to the address information, wherein the multimedia content is the multimedia content currently displayed by the first terminal.
Optionally, the apparatus further comprises:
the second updating unit 1204 is configured to receive an interface element of the operation control sent by the first terminal after the interface is updated, and update and display the changed interface element, where the interface element of the operation control after the interface is updated is sent by the first terminal according to an update operation in response to the first operation interface.
For convenience of description, the above parts are described separately as modules (or units) according to their functions. Of course, when implementing the present application, the functionality of the various modules (or units) may be implemented in one or more pieces of software or hardware.
Having described the method and apparatus for controlling terminal interaction according to an exemplary embodiment of the present application, an electronic device according to another exemplary embodiment of the present application is described next.
As will be appreciated by one skilled in the art, aspects of the present application may be embodied as a system, method or program product. Accordingly, various aspects of the present application may be embodied in the form of: an entirely hardware embodiment, an entirely software embodiment (including firmware, microcode, etc.), or an embodiment combining hardware and software aspects that may all generally be referred to herein as a "circuit," "module," or "system."
Fig. 13 is a block diagram illustrating an electronic device 1300 according to an example embodiment. The device comprises:
a processor 1310;
a memory 1320 for storing instructions executable by the processor 1310;
wherein the processor 1310 is configured to execute the instructions to implement the method of controlling terminal interaction in the embodiments of the present application, such as the steps shown in fig. 4 or the steps shown in fig. 9.
In an exemplary embodiment, a storage medium comprising instructions, such as the memory 1320 comprising instructions, is also provided; the instructions are executable by the processor 1310 of the electronic device 1300 to perform the above-described method. Alternatively, the storage medium may be a non-transitory computer-readable storage medium, for example, a ROM, a random access memory (RAM), a CD-ROM, a magnetic tape, a floppy disk, an optical data storage device, and the like.
Based on the same inventive concept, the embodiment of the present application further provides a terminal device 210. The terminal device 210 is an electronic device used by a user, and may be a computer device that has certain computing capability and runs software or a website capable of displaying multimedia content, such as a personal computer, a mobile phone, a tablet computer, a notebook computer, a smart television, or an e-book reader.
Referring to fig. 14, the terminal device 210 includes a display unit 1440, a processor 1480, and a memory 1420. The display unit 1440 includes a display panel 1441 for displaying information input by the user or provided to the user, as well as various interfaces of the terminal device 210; in the embodiment of the present application, the display panel 1441 is mainly used for displaying the interface of an application installed in the terminal device 210, a shortcut window, and the like. Optionally, the display panel 1441 may be configured in the form of an LCD (Liquid Crystal Display) or an OLED (Organic Light-Emitting Diode).
The processor 1480 is used to read the computer program and then execute a method defined by the computer program; for example, the processor 1480 reads an application program of the first terminal or the second terminal, thereby running the application on the terminal device 210 and displaying the interface of the application on the display unit 1440. The processor 1480 may include one or more general-purpose processors and may further include one or more DSPs (Digital Signal Processors) for performing the relevant operations to implement the solutions provided by the embodiments of the present application.
The memory 1420 generally includes an internal memory and an external memory. The internal memory may be a random access memory (RAM), a read-only memory (ROM), or a cache (CACHE); the external memory may be a hard disk, an optical disk, a USB disk, a floppy disk, or a tape drive. The memory 1420 is used for storing computer programs, including the application programs corresponding to the applications, and other data, which may include data generated after the operating system or the application programs are executed, including system data (e.g., configuration parameters of the operating system) and user data. The program instructions in the embodiments of the present application are stored in the memory 1420, and the processor 1480 executes the program instructions stored in the memory 1420 to implement the method of controlling terminal interaction discussed above or to implement the function of adapting an application discussed above.
In addition, the display unit 1440 of the terminal device 210 may also be used for receiving input digital information, character information, or contact touch operations/contactless gestures, and for generating signal inputs related to user settings and function control of the terminal device 210. Specifically, in the embodiment of the present application, the display unit 1440 may include the display panel 1441. The display panel 1441, such as a touch screen, may collect touch operations of the user on or near it (e.g., operations performed on or near the display panel 1441 with a finger, a stylus, or any other suitable object or accessory) and drive the corresponding connection device according to a preset program. Optionally, the display panel 1441 may include two parts: a touch detection device and a touch controller. The touch detection device detects the position and direction of the user's touch, detects the signal brought by the touch operation, and transmits the signal to the touch controller; the touch controller receives the touch information from the touch detection device, converts it into touch point coordinates, sends the coordinates to the processor 1480, and can receive and execute commands sent by the processor 1480. In this embodiment, if the user performs a control operation on a target interface element displayed on the second operation interface by touch, the touch detection device in the display panel 1441 detects the touch operation and sends the corresponding signal, the touch controller converts the signal into touch point coordinates and sends them to the processor 1480, and the processor 1480 determines the target interface element selected by the user according to the received touch point coordinates.
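The last step, mapping the reported touch point to the selected target interface element, can be illustrated as follows; the hit-test logic and the element-bounds structure are assumptions, not part of the described hardware.

    interface Bounds { x: number; y: number; width: number; height: number; }
    interface DisplayedElement { id: string; bounds: Bounds; }

    // Given touch point coordinates from the touch controller, find the interface element that was hit.
    function hitTest(point: { x: number; y: number }, elements: DisplayedElement[]): DisplayedElement | null {
      return elements.find(e =>
        point.x >= e.bounds.x && point.x <= e.bounds.x + e.bounds.width &&
        point.y >= e.bounds.y && point.y <= e.bounds.y + e.bounds.height) ?? null;
    }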
The display panel 1441 may be implemented in various types, such as resistive, capacitive, infrared, or surface acoustic wave. Besides the display unit 1440, the terminal device 210 may further include an input unit 1430, which may include, but is not limited to, one or more of a physical keyboard, function keys (such as volume control keys and switch keys), a trackball, a mouse, a joystick, and the like. In fig. 14, the input unit 1430 includes an image input device 1431 and another input device 1432 as an example.
In addition to the above, the terminal device 210 may further include a power supply 1490 for powering the other modules, an audio circuit 1460, a near field communication module 1470, and an RF circuit 1410. The terminal device 210 may also include one or more sensors 1450, such as an acceleration sensor, a light sensor, and a pressure sensor. The audio circuit 1460 specifically includes a speaker 1461, a microphone 1462, and the like. For example, the terminal device 210 may collect the user's voice through the microphone 1462 so that the user can control the device by voice, and when the user needs to be prompted, a corresponding prompt sound is played through the speaker 1461.
In an alternative embodiment, the present application further provides a computer program product or computer program comprising computer instructions stored in a computer readable storage medium. The processor of the computer device reads the computer instructions from the computer-readable storage medium, and the processor executes the computer instructions to cause the computer device to perform the method provided in the various alternative implementations of the above embodiments.
In an alternative embodiment, various aspects of the method for controlling terminal interaction provided by the present application may also be implemented in the form of a program product including program code. When the program product runs on a computer device, the program code causes the computer device to perform the steps of the method for controlling terminal interaction according to the various exemplary embodiments of the present application described above in this specification; for example, the computer device may perform the steps shown in fig. 4 or fig. 9.
The program product may employ any combination of one or more readable media. The readable medium may be a readable signal medium or a readable storage medium. A readable storage medium may be, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any combination of the foregoing. More specific examples (a non-exhaustive list) of the readable storage medium include: an electrical connection having one or more wires, a portable disk, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing.
The program product of embodiments of the present application may employ a portable compact disc read only memory (CD-ROM) and include program code, and may be run on a computing device. However, the program product of the present application is not limited thereto, and in this document, a readable storage medium may be any tangible medium that can contain, or store a program for use by or in connection with a command execution system, apparatus, or device.
A readable signal medium may include a propagated data signal with readable program code embodied therein, for example, in baseband or as part of a carrier wave. Such a propagated data signal may take many forms, including, but not limited to, electro-magnetic, optical, or any suitable combination thereof. A readable signal medium may also be any readable medium that is not a readable storage medium and that can communicate, propagate, or transport a program for use by or in connection with a command execution system, apparatus, or device.
Program code embodied on a readable medium may be transmitted using any appropriate medium, including but not limited to wireless, wireline, optical fiber cable, RF, etc., or any suitable combination of the foregoing.
While the preferred embodiments of the present application have been described, additional variations and modifications in those embodiments may occur to those skilled in the art once they learn of the basic inventive concepts. Therefore, it is intended that the appended claims be interpreted as including preferred embodiments and all alterations and modifications as fall within the scope of the application.
It will be apparent to those skilled in the art that various changes and modifications may be made in the present application without departing from the spirit and scope of the application. Thus, if such modifications and variations of the present application fall within the scope of the claims of the present application and their equivalents, the present application is intended to include such modifications and variations as well.

Claims (15)

1. A method for controlling terminal interaction is characterized in that the method comprises the following steps:
the method comprises the steps that a first terminal sends interface elements of all operation controls arranged on a first operation interface to a second terminal to be interacted, so that the second terminal displays the interface elements of all the operation controls on a second operation interface, and the first terminal controls the second terminal to display multimedia contents;
the first terminal receives an interface operation instruction sent by the second terminal, wherein the interface operation instruction is generated by the second terminal according to control operation aiming at a target interface element displayed on the second operation interface;
and the first terminal executes corresponding operation on the target operation control corresponding to the target interface element according to the interface operation instruction, and sends a control result to the second terminal so that the second terminal displays the control result.
2. The method of claim 1, wherein the first terminal controls the second terminal to display multimedia content, specifically comprising:
the first terminal displays the multimedia content and sends the content information of the multimedia content to the second terminal so that the second terminal synchronously displays the multimedia content according to the content information; or
And the first terminal displays the multimedia content and sends the address information of the multimedia content to the second terminal so that the second terminal acquires and displays the multimedia content according to the address information.
3. The method of claim 1, further comprising:
and the first terminal responds to the updating operation of the first operation interface and sends the interface element of the operation control with the updated interface to the second terminal so that the second terminal updates and displays the changed interface element.
4. The method according to any one of claims 1 to 3, wherein if there are multiple second terminals to be interacted with, the first terminal performs corresponding operation on the target operation control corresponding to the target interface element according to the interface operation instruction, and sends a control result to the second terminal, so that the second terminal displays the control result, specifically including:
and when the first terminal receives an interface operation instruction sent by at least one second terminal, executing corresponding operation on the target operation control corresponding to each target interface element according to each interface operation instruction, and sending a control result to each second terminal so that each second terminal synchronously displays the control result.
5. The method according to claim 4, wherein when the first terminal receives an interface operation instruction sent by at least one second terminal, the first terminal performs, according to each interface operation instruction, the corresponding operation on the target operation control corresponding to each target interface element, which specifically comprises:
if the first terminal receives interface operation instructions sent by a plurality of second terminals at the same time and the interface operation instructions sent by each second terminal are the same, executing corresponding operation on a target operation control corresponding to a target interface element according to the interface operation instructions; or
If the first terminal receives interface operation instructions sent by a plurality of second terminals at the same time and the interface operation instructions sent by at least two second terminals are different, executing corresponding operation on a target operation control corresponding to a corresponding target interface element according to the interface operation instructions sent by each second terminal in sequence from high to low according to the priority of each second terminal sending the interface operation instructions; or
And if the first terminal receives interface operation instructions sent by a plurality of second terminals at the same time and the interface operation instructions sent by at least two second terminals are different, executing corresponding operation on the target operation control corresponding to the corresponding target interface element according to the priority of each second terminal sending the interface operation instruction and the interface operation instruction sent by the second terminal with the highest priority.
6. A method for controlling terminal interaction is characterized in that the method comprises the following steps:
the method comprises the steps that a second terminal receives interface elements of all operation controls arranged on a first operation interface sent by a first terminal, and displays the interface elements of all the operation controls on a second operation interface, wherein the second terminal displays multimedia content based on the control of the first terminal;
the second terminal generates an interface operation instruction according to control operation aiming at a target interface element displayed on the second operation interface, and sends the interface operation instruction to the first terminal, so that the first terminal executes corresponding operation on a target operation control corresponding to the target interface element according to the interface operation instruction;
and the second terminal receives the control result returned by the first terminal and displays the control result.
7. The method of claim 6, wherein the second terminal displays multimedia content based on the control of the first terminal, specifically comprising:
the second terminal receives content information of the multimedia content sent by the first terminal, and displays the multimedia content according to the content information, wherein the multimedia content is the multimedia content currently displayed by the first terminal; or
And the second terminal receives the address information of the multimedia content sent by the first terminal, and acquires and displays the multimedia content according to the address information, wherein the multimedia content is the multimedia content currently displayed by the first terminal.
8. The method of claim 6, further comprising:
and receiving the interface element of the operation control with the updated interface, which is sent by the first terminal, and updating and displaying the changed interface element, wherein the interface element of the operation control with the updated interface is sent by the first terminal according to the update operation responding to the first operation interface.
9. An apparatus for controlling terminal interaction, comprising:
the first sending unit is used for sending the interface elements of the operation controls arranged on the first operation interface to a second terminal to be interacted so that the second terminal displays the interface elements of the operation controls on the second operation interface, and the first terminal controls the second terminal to display multimedia contents;
the first receiving unit is used for receiving an interface operation instruction sent by the second terminal, wherein the interface operation instruction is generated by the second terminal according to control operation aiming at a target interface element displayed on the second operation interface;
and the control unit is used for executing corresponding operation on the target operation control corresponding to the target interface element according to the interface operation instruction, and sending a control result to the second terminal so that the second terminal displays the control result.
10. The apparatus of claim 9, wherein the first sending unit is specifically configured to:
displaying the multimedia content, and sending content information of the multimedia content to the second terminal, so that the second terminal synchronously displays the multimedia content according to the content information; or
And displaying the multimedia content, and sending the address information of the multimedia content to the second terminal so that the second terminal acquires and displays the multimedia content according to the address information.
11. The apparatus of claim 9, wherein the apparatus further comprises:
and the first updating unit is used for responding to the updating operation of the first operation interface and sending the interface element of the operation control with the updated interface to the second terminal so as to enable the second terminal to update and display the changed interface element.
12. The apparatus according to any one of claims 9 to 11, wherein if there are multiple second terminals to be interacted with, the control unit is specifically configured to:
and when receiving an interface operation instruction sent by at least one second terminal, executing corresponding operation on the target operation control corresponding to each target interface element according to each interface operation instruction, and sending a control result to each second terminal so that each second terminal synchronously displays the control result.
13. An apparatus for controlling terminal interaction, comprising:
the second receiving unit is used for receiving the interface elements of the operation controls arranged on the first operation interface and sent by the first terminal, and displaying the interface elements of the operation controls on the second operation interface, wherein the second terminal displays multimedia contents based on the control of the first terminal;
the operation unit is used for generating an interface operation instruction according to the control operation aiming at the target interface element displayed on the second operation interface and sending the interface operation instruction to the first terminal so that the first terminal executes corresponding operation on the target operation control corresponding to the target interface element according to the interface operation instruction;
and the execution unit is used for receiving the control result returned by the first terminal and displaying the control result.
14. An electronic device, comprising a processor and a memory, wherein the memory stores program code which, when executed by the processor, causes the processor to perform the steps of the method of any of claims 1 to 5 or the steps of the method of any of claims 6 to 8.
15. A computer-readable storage medium, characterized in that it comprises program code for causing an electronic device to carry out the steps of the method of any one of claims 1 to 5 or the steps of the method of any one of claims 6 to 8, when said program product is run on said electronic device.
CN202010704511.7A 2020-07-21 2020-07-21 Method and device for controlling terminal interaction, electronic equipment and storage medium Pending CN111970546A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010704511.7A CN111970546A (en) 2020-07-21 2020-07-21 Method and device for controlling terminal interaction, electronic equipment and storage medium

Publications (1)

Publication Number Publication Date
CN111970546A true CN111970546A (en) 2020-11-20

Family

ID=73362239

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010704511.7A Pending CN111970546A (en) 2020-07-21 2020-07-21 Method and device for controlling terminal interaction, electronic equipment and storage medium

Country Status (1)

Country Link
CN (1) CN111970546A (en)

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2019117308A (en) * 2017-12-27 2019-07-18 コニカミノルタ株式会社 Head-mounted display
CN110377250A (en) * 2019-06-05 2019-10-25 华为技术有限公司 A kind of touch control method and electronic equipment thrown under screen scene
CN110515580A (en) * 2019-09-02 2019-11-29 联想(北京)有限公司 A kind of display control method, device and terminal

Cited By (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113419452A (en) * 2021-06-29 2021-09-21 海信集团控股股份有限公司 Vehicle, control method thereof and mobile terminal
CN113419452B (en) * 2021-06-29 2023-02-10 海信集团控股股份有限公司 Vehicle, control method thereof and mobile terminal
CN113596595A (en) * 2021-07-30 2021-11-02 北京奇艺世纪科技有限公司 Information interaction method and device
CN113873309A (en) * 2021-07-30 2021-12-31 北京达佳互联信息技术有限公司 Object playing method and device, electronic equipment and storage medium
CN114007125A (en) * 2021-10-15 2022-02-01 杭州逗酷软件科技有限公司 Volume control method, mobile terminal, target device and storage medium
CN114286167A (en) * 2021-12-03 2022-04-05 杭州逗酷软件科技有限公司 Cross-device interaction method and device, electronic device and storage medium

Similar Documents

Publication Publication Date Title
CN111970546A (en) Method and device for controlling terminal interaction, electronic equipment and storage medium
US10235305B2 (en) Method and system for sharing content, device and computer-readable recording medium for performing the method
WO2020244266A1 (en) Remote control method for smart television, mobile terminal, and smart television
WO2019120008A1 (en) Smart television and method for displaying graphical user interface of television screen shot
US9081477B2 (en) Electronic device and method of controlling the same
EP4130963A1 (en) Object dragging method and device
US9569159B2 (en) Apparatus, systems and methods for presenting displayed image information of a mobile media device on a large display and control of the mobile media device therefrom
WO2020181906A1 (en) Method and apparatus for interacting with smart television
US10798153B2 (en) Terminal apparatus and server and method of controlling the same
CN112350981B (en) Method, device and system for switching communication protocol
CN101316318A (en) Remote control for devices with connectivity to a service delivery platform
CN103491179A (en) Multi-screen interaction method and system based on Web
WO2022100308A1 (en) Information processing method and related apparatus
WO2014173115A1 (en) Method, device, and system for network communication
WO2020248627A1 (en) Video call method and display device
CN112073664A (en) Video call method and display device
US10097482B2 (en) Method, device, and system for network communication
KR20120105318A (en) Method for sharing of presentation data and mobile terminal using this method
Dooley et al. Followme: The persistent gui
US8638766B2 (en) Electronic device and method of controlling the same
CN108141697B (en) Electronic device, corollary device and method for operating electronic device
CN114979756B (en) Method, device and equipment for realizing one-to-many screen-throwing independent display and interaction
KR101748153B1 (en) Method for displaying information in home network and mobile terminal using this method
CN115904514B (en) Method for realizing cloud rendering pixel flow based on three-dimensional scene and terminal equipment
KR20140014788A (en) Mobile terminal and control method for mobile terminal

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication

Application publication date: 20201120