CN108462729B - Method and device for realizing interaction of terminal equipment, terminal equipment and server - Google Patents
- Publication number
- CN108462729B CN108462729B CN201710087563.2A CN201710087563A CN108462729B CN 108462729 B CN108462729 B CN 108462729B CN 201710087563 A CN201710087563 A CN 201710087563A CN 108462729 B CN108462729 B CN 108462729B
- Authority
- CN
- China
- Prior art keywords
- content
- user
- terminal device
- shared
- terminal equipment
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Active
Classifications
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04L—TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
- H04L67/00—Network arrangements or protocols for supporting network services or applications
- H04L67/50—Network services
- H04L67/56—Provisioning of proxy services
- H04L67/565—Conversion or adaptation of application format or content
- H04L67/5651—Reducing the amount or size of exchanged application data
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04L—TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
- H04L67/00—Network arrangements or protocols for supporting network services or applications
- H04L67/01—Protocols
- H04L67/10—Protocols in which an application is distributed across nodes in the network
- H04L67/104—Peer-to-peer [P2P] networks
- H04L67/1074—Peer-to-peer [P2P] networks for supporting data block transmission mechanisms
- H04L67/1078—Resource delivery mechanisms
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04L—TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
- H04L67/00—Network arrangements or protocols for supporting network services or applications
- H04L67/50—Network services
- H04L67/52—Network services specially adapted for the location of the user terminal
Landscapes
- Engineering & Computer Science (AREA)
- Computer Networks & Wireless Communication (AREA)
- Signal Processing (AREA)
- Information Transfer Between Computers (AREA)
Abstract
A method and an apparatus for implementing terminal device interaction, a terminal device, and a server are provided. The method comprises the following steps: (A) determining, in the content currently provided by a first terminal device for a user, shared content to be shared with a second terminal device; and (B) sending the determined shared content to the second terminal device for sharing. The method can effectively reduce the amount of shared data to be transmitted.
Description
Technical Field
The present invention relates to the field of terminal device interaction technologies, and in particular, to a method and an apparatus for implementing terminal device interaction, a terminal device, and a server.
Background
In the prior art, the content currently displayed by a terminal device can be shared in real time with other terminal devices connected to it, so that the users of those devices can also view the displayed content. However, the devices are usually connected by wire or located in the same local area network, which keeps the network environment good and controllable and allows a large amount of content to be shared in real time. When the terminal device instead needs to share content with remote terminal devices, it is difficult to transmit a large amount of image data in real time, and real-time interaction cannot be guaranteed.
Disclosure of Invention
An exemplary embodiment of the present invention provides a method and an apparatus for implementing terminal device interaction, a terminal device, and a server, which can effectively reduce the amount of shared data to be transmitted.
According to an exemplary embodiment of the present invention, a method for implementing terminal device interaction is provided, comprising: (A) determining, in the content currently provided by a first terminal device for a user, shared content to be shared with a second terminal device; and (B) sending the determined shared content to the second terminal device for sharing.
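By way of illustration only (the patent does not prescribe an implementation), steps (A) and (B) might be sketched as follows; the dictionary-based content model, the region-based selection, and the `send` callback are assumptions made for this sketch:

```python
# Hypothetical sketch of the two-step sharing method: (A) determine the
# shared content within what the first device currently provides, then
# (B) send only that subset to the second device.

def determine_shared_content(current_content: dict, shared_region: set) -> dict:
    # (A) Keep only the items that fall inside the shared region,
    # instead of sharing the full displayed content.
    return {k: v for k, v in current_content.items() if k in shared_region}

def share(current_content: dict, shared_region: set, send) -> dict:
    # (B) Transmit only the determined shared content.
    shared = determine_shared_content(current_content, shared_region)
    send(shared)
    return shared

# Usage: only the two items inside the shared region are transmitted.
sent = []
share({"a": 1, "b": 2, "c": 3}, {"a", "c"}, sent.append)
# sent == [{"a": 1, "c": 3}]
```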
Optionally, the first terminal device and/or the second terminal device is a virtual reality device.
Optionally, the shared content to be shared with the second terminal device is determined according to at least one of: a user interest object corresponding to the first terminal device, content corresponding to a shared area, content corresponding to a specified event, a user interest object corresponding to the second terminal device, changed content containing valid content, and an operation performed by the user of the second terminal device on the received shared content.
Optionally, the user interest object corresponding to the first terminal device is determined according to at least one of the following: user behavior corresponding to the first terminal device, association relationships among objects in the content currently provided for the user, a scene corresponding to the content currently provided for the user, and user attributes corresponding to the first terminal device; and/or the user interest object corresponding to the second terminal device is determined according to at least one of the following: user behavior corresponding to the second terminal device, and user attributes corresponding to the second terminal device; and/or the shared area comprises at least one of the following areas: a shared area set for the first terminal device, a shared area set for the second terminal device, an area where the user interest object corresponding to the first terminal device is located, and an area where the user interest object corresponding to the second terminal device is located; and/or the specified event comprises at least one of a system-related event and a content-related event; and/or the changed content is the part of the content currently provided for the user that has changed relative to the shared content sent last time, wherein the changed content is determined to contain valid content when at least one of the following conditions is met: the ratio of the changed content to the content currently provided for the user is not less than a first preset threshold; the changed content comprises content corresponding to the shared area; the changed content comprises content corresponding to the specified event; the changed content comprises the user interest object corresponding to the first terminal device and/or the second terminal device; the changed content comprises an operable object.
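The conditions under which changed content is deemed to contain valid content can be illustrated with a hypothetical check; modeling content items as hashable identifiers and the exact threshold semantics are assumptions for this sketch:

```python
def changed_content_is_valid(changed, current, shared_area, event_items,
                             interest_objects, operable_objects,
                             first_threshold=0.3):
    # Changed content contains "valid content" when at least one of the
    # listed conditions holds; items are modeled as hashable identifiers.
    changed, current = set(changed), set(current)
    if current and len(changed) / len(current) >= first_threshold:
        return True  # ratio not less than the first preset threshold
    special = (set(shared_area) | set(event_items)
               | set(interest_objects) | set(operable_objects))
    return bool(changed & special)  # touches a shared area, event,
                                    # interest object, or operable object

# A small change outside every special set is not valid ...
assert not changed_content_is_valid({"x"}, {"a", "b", "c", "d", "x"},
                                    [], [], [], [])
# ... but the same change is valid once it touches the shared area.
assert changed_content_is_valid({"x"}, {"a", "b", "c", "d", "x"},
                                ["x"], [], [], [])
```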
Optionally, step (A) is performed when at least one of the following is satisfied: the movement speed of the first terminal device is not greater than a second preset threshold; the content currently provided for the user includes at least one of content corresponding to the shared area, content corresponding to the specified event, the user interest object corresponding to the first terminal device, the user interest object corresponding to the second terminal device, an operable object, and changed content containing valid content.
Optionally, step (B) comprises: reconstructing the shared content in a preset manner, and sending the reconstructed content to the second terminal device for sharing.
Optionally, the preset manner includes at least one of the following: acquiring the content in a minimum area including the shared content; adjusting the shared content according to the device attributes of the second terminal device; reconstructing the shared content according to a redundant area; adjusting the shared content according to the current network condition; correcting the shared content based on the user attributes corresponding to the first terminal device and/or the second terminal device; filtering the shared content based on a filtering condition; and optimizing the shared content according to operations performed by the user of the first terminal device and/or the second terminal device on the shared content.
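One of the preset manners listed above, acquiring the content in a minimum area including the shared content, can be illustrated as follows; modeling the shared items as axis-aligned bounding boxes is an assumption for this sketch:

```python
def minimum_enclosing_area(boxes):
    # Smallest axis-aligned rectangle that contains every shared item,
    # so only this region (rather than the full frame) need be sent.
    # Each box is (x1, y1, x2, y2) with x1 <= x2 and y1 <= y2.
    xs1, ys1, xs2, ys2 = zip(*boxes)
    return (min(xs1), min(ys1), max(xs2), max(ys2))

assert minimum_enclosing_area([(0, 0, 2, 2), (1, 1, 5, 4)]) == (0, 0, 5, 4)
```

Cropping to this rectangle before transmission is one concrete way the manners above could reduce the amount of shared data sent.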
Optionally, the method further comprises: receiving, from the second terminal device, position information of a target object selected by the user from the received shared content, and/or a motion path from the current viewing angle of the first terminal device to the viewing angle corresponding to the target object; and guiding the user of the first terminal device to move the viewing angle and/or automatically switching the viewing angle of the user of the first terminal device according to the received position information and/or motion path.
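The form of the motion path between viewing angles is not specified further; a naive linear interpolation over (yaw, pitch) pairs, offered purely as an assumption, could serve as such a path:

```python
def motion_path(current, target, steps=4):
    # Linearly interpolate viewing angles (yaw, pitch) from the first
    # device's current view to the view containing the target object,
    # producing intermediate angles used to guide or switch the view.
    (cy, cp), (ty, tp) = current, target
    return [(cy + (ty - cy) * i / steps, cp + (tp - cp) * i / steps)
            for i in range(steps + 1)]

path = motion_path((0.0, 0.0), (90.0, 30.0), steps=3)
# The path starts at the current view and ends at the target view.
```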
Optionally, the method further comprises: receiving, from the second terminal device, operation information about operations performed by the user on the received shared content; and executing the function corresponding to the received operation information on the content currently provided for the user.
Optionally, the method further comprises: in response to an operation performed by the user on an external operation device of the first terminal device, acquiring the position of the operation focus of the external operation device within the content currently provided for the user; and when invalid movement of the external operation device is detected, restoring the current operation focus to the position of the operation focus before the movement.
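How invalid movement is detected is left open by the text; one plausible reading, sketched below under the assumption that an implausibly large jump counts as invalid, restores the pre-movement focus position:

```python
def update_focus(prev_focus, new_focus, max_jump=100.0):
    # Accept the new focus position unless the movement is judged
    # invalid (modeled here as a jump larger than max_jump pixels),
    # in which case the focus recorded before the movement is restored.
    dx = new_focus[0] - prev_focus[0]
    dy = new_focus[1] - prev_focus[1]
    if (dx * dx + dy * dy) ** 0.5 > max_jump:
        return prev_focus  # invalid movement: revert to previous focus
    return new_focus

assert update_focus((10, 10), (500, 500)) == (10, 10)  # reverted
assert update_focus((10, 10), (20, 15)) == (20, 15)    # accepted
```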
Optionally, the method further comprises: before step (A), acquiring the content currently provided by the first terminal device for the user.
Optionally, the content currently provided by the first terminal device for the user is the content currently displayed to the user by the first terminal device, wherein the step of acquiring this content comprises: acquiring the current panoramic view of the first terminal device; receiving current viewing-angle information corresponding to the first terminal device; and determining the content currently displayed to the user by the first terminal device according to the current viewing-angle information and the current panoramic view.
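Determining the displayed content from the panoramic view and the viewing-angle information could, for an equirectangular panorama and under the simplifying assumptions noted in the comments, amount to selecting a horizontal slice:

```python
def visible_range(panorama_width, yaw_deg, fov_deg=90):
    # Map the reported viewing angle onto the panoramic view: return the
    # horizontal pixel range of an equirectangular panorama currently
    # shown to the user (seam wrap-around is ignored in this sketch).
    center = (yaw_deg % 360) / 360.0 * panorama_width
    half = fov_deg / 360.0 * panorama_width / 2.0
    return (center - half, center + half)

assert visible_range(3600, 180, 90) == (1350.0, 2250.0)
```

Only the pixels in this range (the content actually displayed) would then be candidates for sharing, rather than the full panorama.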
Optionally, the method further comprises: receiving, from the first terminal device, operation information about operations performed by the user on the content currently provided by the first terminal device for the user; and executing the function corresponding to the received operation information to acquire the content currently provided by the first terminal device for the user.
According to another exemplary embodiment of the present invention, an apparatus for implementing terminal device interaction is provided, including: the shared content determining unit is used for determining shared content to be shared to the second terminal equipment in the content which is currently provided for the user by the first terminal equipment; and the sharing unit is used for sending the determined sharing content to the second terminal equipment for sharing.
Optionally, the first terminal device and/or the second terminal device is a virtual reality device.
Optionally, the shared content determining unit determines the shared content to be shared with the second terminal device according to at least one of: a user interest object corresponding to the first terminal device, content corresponding to a shared area, content corresponding to a specified event, a user interest object corresponding to the second terminal device, changed content containing valid content, and an operation performed by the user of the second terminal device on the received shared content.
Optionally, the shared content determining unit determines the user interest object corresponding to the first terminal device according to at least one of the following: user behavior corresponding to the first terminal device, association relationships among objects in the content currently provided for the user, a scene corresponding to the content currently provided for the user, and user attributes corresponding to the first terminal device; and/or the shared content determining unit determines the user interest object corresponding to the second terminal device according to at least one of the following: user behavior corresponding to the second terminal device, and user attributes corresponding to the second terminal device; and/or the shared area comprises at least one of the following areas: a shared area set for the first terminal device, a shared area set for the second terminal device, an area where the user interest object corresponding to the first terminal device is located, and an area where the user interest object corresponding to the second terminal device is located; and/or the specified event comprises at least one of a system-related event and a content-related event; and/or the changed content is the part of the content currently provided for the user that has changed relative to the shared content sent last time, wherein the shared content determining unit determines that the changed content contains valid content when at least one of the following conditions is satisfied: the ratio of the changed content to the content currently provided for the user is not less than a first preset threshold; the changed content comprises content corresponding to the shared area; the changed content comprises content corresponding to the specified event; the changed content comprises the user interest object corresponding to the first terminal device and/or the second terminal device; the changed content comprises an operable object.
Optionally, the shared content determining unit determines the shared content to be shared to the second terminal device among the content currently provided by the first terminal device for the user when determining that at least one of the following is satisfied: the movement speed of the first terminal equipment is not greater than a second preset threshold value; the content currently provided for the user includes at least one of content corresponding to the shared area, content corresponding to the specified event, a user interest object corresponding to the first terminal device, a user interest object corresponding to the second terminal device, an operable object, and changed content including effective content.
Optionally, the sharing unit reconstructs the shared content according to a preset manner, and sends the reconstructed content to the second terminal device for sharing.
Optionally, the preset manner comprises at least one of the following items: acquiring content in a minimum area including the shared content; adjusting the shared content according to the equipment attribute of the second terminal equipment; reconstructing the shared content according to the redundant area; adjusting the shared content according to the current network condition; correcting the shared content based on the user attribute corresponding to the first terminal equipment and/or the user attribute corresponding to the second terminal equipment; filtering the shared content based on a filtering condition; and optimizing the shared content according to the operation of the user of the first terminal equipment and/or the second terminal equipment aiming at the shared content.
Optionally, the apparatus further comprises: a receiving unit that receives, from the second terminal device, position information of a target object selected by the user from the received shared content, and/or a motion path from the current viewing angle of the first terminal device to the viewing angle corresponding to the target object; and a viewing-angle conversion unit that guides the user of the first terminal device to move the viewing angle and/or automatically switches the viewing angle of the user of the first terminal device according to the received position information and/or motion path.
Optionally, the apparatus further comprises: an operation information receiving unit that receives, from the second terminal device, operation information about operations performed by the user on the received shared content; and an execution unit that executes the function corresponding to the received operation information on the content currently provided for the user.
Optionally, the apparatus further comprises: a focus acquisition unit that, in response to an operation performed by the user on an external operation device of the first terminal device, acquires the position of the operation focus of the external operation device within the content currently provided for the user; and a focus adjustment unit that, when invalid movement of the external operation device is detected, restores the current operation focus to the position of the operation focus before the movement.
Optionally, the apparatus further comprises: and the acquisition unit acquires the content currently provided by the first terminal equipment for the user before the shared content determination unit determines the shared content to be shared by the second terminal equipment in the content currently provided by the first terminal equipment for the user.
Optionally, the content currently provided by the first terminal device for the user is content currently displayed to the user by the first terminal device, where the obtaining unit includes: the panoramic view acquisition unit is used for acquiring the current panoramic view of the first terminal equipment; the viewing angle information receiving unit is used for receiving current viewing angle information corresponding to the first terminal equipment; and the determining unit is used for determining the content currently displayed to the user by the first terminal equipment according to the current viewing angle information and the current panoramic view.
Optionally, the apparatus further comprises: an operation information receiving unit that receives, from the first terminal device, operation information about operations performed by the user on the content currently provided by the first terminal device for the user; and an execution unit that executes the function corresponding to the received operation information to acquire the content currently provided by the first terminal device for the user.
According to another exemplary embodiment of the present invention, there is provided a terminal device including: a shared content determining unit that determines shared content to be shared to another terminal device among contents currently provided by the terminal device for a user; and the sharing unit is used for sending the determined sharing content to the other terminal equipment for sharing.
Optionally, the terminal device and/or the another terminal device is a virtual reality device.
Optionally, the shared content determining unit determines the shared content to be shared to the other terminal device according to at least one of: the object of interest of the user corresponding to the terminal device, the content corresponding to the shared area, the content corresponding to the designated event, the object of interest of the user corresponding to the other terminal device, the changed content containing the valid content, and the operation of the user of the other terminal device on the received shared content.
Optionally, the shared content determining unit determines the user interest object corresponding to the terminal device according to at least one of the following: user behavior corresponding to the terminal device, association relationships among objects in the content currently provided for the user, a scene corresponding to the content currently provided for the user, and user attributes corresponding to the terminal device; and/or the shared content determining unit determines the user interest object corresponding to the other terminal device according to at least one of the following: user behavior corresponding to the other terminal device, and user attributes corresponding to the other terminal device; and/or the shared area comprises at least one of the following areas: a shared area set for the terminal device, a shared area set for the other terminal device, an area where the user interest object corresponding to the terminal device is located, and an area where the user interest object corresponding to the other terminal device is located; and/or the specified event comprises at least one of a system-related event and a content-related event; and/or the changed content is the part of the content currently provided for the user that has changed relative to the shared content sent last time, wherein the shared content determining unit determines that the changed content contains valid content when at least one of the following conditions is satisfied: the ratio of the changed content to the content currently provided for the user is not less than a first preset threshold; the changed content comprises content corresponding to the shared area; the changed content comprises content corresponding to the specified event; the changed content comprises the user interest object corresponding to the terminal device and/or the other terminal device; the changed content comprises an operable object.
Optionally, the shared content determining unit determines the shared content to be shared to the other terminal device among the content currently provided by the terminal device for the user, when it is determined that at least one of the following is satisfied: the movement speed of the terminal equipment is not greater than a second preset threshold; the content currently provided for the user includes at least one of content corresponding to a shared area, content corresponding to a specified event, a user interest object corresponding to the terminal device, a user interest object corresponding to the other terminal device, an operable object, and changed content including valid content.
Optionally, the sharing unit reconstructs the shared content according to a preset mode, and sends the reconstructed content to the other terminal device for sharing.
Optionally, the preset manner comprises at least one of the following items: acquiring content in a minimum area including the shared content; adjusting the shared content according to the device attribute of the other terminal device; reconstructing the shared content according to the redundant area; adjusting the shared content according to the current network condition; correcting the shared content based on the user attribute corresponding to the terminal equipment and/or the user attribute corresponding to the other terminal equipment; filtering the shared content based on a filtering condition; and optimizing the shared content according to the operation of the terminal equipment and/or the user of the other terminal equipment for the shared content.
Optionally, the terminal device further includes: a receiving unit that receives, from the other terminal device, position information of a target object selected by the user from the received shared content, and/or a motion path from the current viewing angle of the terminal device to the viewing angle corresponding to the target object; and a viewing-angle conversion unit that guides the user of the terminal device to move the viewing angle and/or automatically switches the viewing angle of the user of the terminal device according to the received position information and/or motion path.
Optionally, the terminal device further includes: an operation information receiving unit that receives, from the other terminal device, operation information about operations performed by the user on the received shared content; and an execution unit that executes the function corresponding to the received operation information on the content currently provided for the user.
Optionally, the terminal device further includes: a focus acquisition unit that, in response to an operation performed by the user on an external operation device of the terminal device, acquires the position of the operation focus of the external operation device within the content currently provided for the user; and a focus adjustment unit that, when invalid movement of the external operation device is detected, restores the current operation focus to the position of the operation focus before the movement.
According to another exemplary embodiment of the present invention, there is provided a terminal device including: the shared content receiving unit is used for receiving shared content sent by another terminal device, wherein the shared content is determined in the content currently provided by the other terminal device for a user; and a display unit which displays the received shared content.
Optionally, the terminal device further includes: a sending unit configured to send, to the other terminal device, position information of the target object selected by the user from the received shared content and/or a motion path from the current viewing angle of the other terminal device to the viewing angle corresponding to the target object.
Optionally, the terminal device further includes: an operation receiving unit configured to receive operation information on which a user operates with respect to the received shared content; a sending unit, configured to send the operation information to the other terminal device.
According to another exemplary embodiment of the present invention, there is provided a server including: the acquisition unit is used for acquiring the content currently provided by the first terminal equipment for the user; the shared content determining unit is used for determining shared content to be shared to the second terminal equipment in the content which is currently provided for the user by the first terminal equipment; and the sharing unit is used for sending the determined sharing content to the second terminal equipment for sharing.
Optionally, the first terminal device and/or the second terminal device is a virtual reality device.
Optionally, the shared content determining unit determines the shared content to be shared with the second terminal device according to at least one of: a user interest object corresponding to the first terminal device, content corresponding to a shared area, content corresponding to a specified event, a user interest object corresponding to the second terminal device, changed content containing valid content, and an operation performed by the user of the second terminal device on the received shared content.
Optionally, the shared content determining unit determines the user interest object corresponding to the first terminal device according to at least one of the following: user behavior corresponding to the first terminal device, association relationships among objects in the content currently provided for the user, a scene corresponding to the content currently provided for the user, and user attributes corresponding to the first terminal device; and/or the shared content determining unit determines the user interest object corresponding to the second terminal device according to at least one of the following: user behavior corresponding to the second terminal device, and user attributes corresponding to the second terminal device; and/or the shared area comprises at least one of the following areas: a shared area set for the first terminal device, a shared area set for the second terminal device, an area where the user interest object corresponding to the first terminal device is located, and an area where the user interest object corresponding to the second terminal device is located; and/or the specified event comprises at least one of a system-related event and a content-related event; and/or the changed content is the part of the content currently provided for the user that has changed relative to the shared content sent last time, wherein the shared content determining unit determines that the changed content contains valid content when at least one of the following conditions is satisfied: the ratio of the changed content to the content currently provided for the user is not less than a first preset threshold; the changed content comprises content corresponding to the shared area; the changed content comprises content corresponding to the specified event; the changed content comprises the user interest object corresponding to the first terminal device and/or the second terminal device; the changed content comprises an operable object.
Optionally, the shared content determining unit determines the shared content to be shared to the second terminal device among the content currently provided by the first terminal device for the user when determining that at least one of the following is satisfied: the movement speed of the first terminal equipment is not greater than a second preset threshold value; the content currently provided for the user includes at least one of content corresponding to the shared area, content corresponding to the specified event, a user interest object corresponding to the first terminal device, a user interest object corresponding to the second terminal device, an operable object, and changed content including effective content.
Optionally, the sharing unit reconstructs the shared content according to a preset mode, and sends the reconstructed content to the second terminal device for sharing.
Optionally, the preset manner includes at least one of the following: acquiring content in a minimum area including the shared content; adjusting the shared content according to the equipment attribute of the second terminal equipment; reconstructing the shared content according to the redundant area; adjusting the shared content according to the current network condition; correcting the shared content based on the user attribute corresponding to the first terminal equipment and/or the user attribute corresponding to the second terminal equipment; filtering the shared content based on a filtering condition; and optimizing the shared content according to the operation of the user of the first terminal device and/or the second terminal device aiming at the shared content.
Optionally, the server further comprises: a receiving unit that receives, from the second terminal device: position information of a target object selected by a user from the received shared content; and the path determining unit is used for determining a motion path from the current viewing angle of the first terminal equipment to the viewing angle corresponding to the target object according to the received position information, and sending the motion path to the first terminal equipment.
Optionally, the server further comprises: an operation information receiving unit that receives, from the second terminal device, operation information about operations performed by the user on the received shared content; and an execution unit that executes the function corresponding to the received operation information on the content currently provided for the user, and sends the operation information to the first terminal device.
Optionally, the content currently provided by the first terminal device for the user is content currently displayed to the user by the first terminal device, where the obtaining unit includes: the panoramic view acquisition unit is used for acquiring the current panoramic view of the first terminal equipment; the viewing angle information receiving unit is used for receiving current viewing angle information corresponding to the first terminal equipment; and the determining unit is used for determining the content currently displayed to the user by the first terminal equipment according to the current viewing angle information and the current panoramic view.
Optionally, the server further comprises: an operation information receiving unit that receives, from a first terminal device, operation information in which a user operates a content that the first terminal device currently provides for the user; and the execution unit executes a function corresponding to the received operation information so as to acquire the content currently provided by the first terminal equipment for the user.
In the method and apparatus for implementing terminal device interaction, the terminal device, and the server according to the exemplary embodiments of the present invention, only the part of the content provided by a terminal device for its user that needs to be shared is sent to other terminal devices, thereby reducing the transmission of redundant data, reducing the amount of data transmitted among devices, saving transmission resources, and improving the real-time performance of communication among devices. In addition, operation sharing between terminal devices can be realized.
Additional aspects and/or advantages of the present general inventive concept will be set forth in part in the description which follows and, in part, will be obvious from the description, or may be learned by practice of the general inventive concept.
Drawings
The above and other objects and features of exemplary embodiments of the present invention will become more apparent from the following description taken in conjunction with the accompanying drawings which illustrate exemplary embodiments, wherein:
FIG. 1 illustrates a flowchart of a method of implementing terminal device interaction according to an exemplary embodiment of the present invention;
FIG. 2 illustrates an example of determining the user interest object corresponding to a first terminal device according to an exemplary embodiment of the present invention;
FIG. 3 illustrates another example of determining the user interest object corresponding to a first terminal device according to an exemplary embodiment of the present invention;
FIG. 4 illustrates an example of determining shared content according to content corresponding to a shared area according to an exemplary embodiment of the present invention;
FIG. 5 illustrates an example of determining shared content from changed content containing valid content according to an exemplary embodiment of the present invention;
FIG. 6 illustrates an example of determining whether to share content according to movement speed according to an exemplary embodiment of the present invention;
FIG. 7 illustrates an example of reconstructing shared content according to an exemplary embodiment of the present invention;
FIG. 8 illustrates another example of reconstructing shared content according to an exemplary embodiment of the present invention;
FIG. 9 illustrates another example of reconstructing shared content according to an exemplary embodiment of the present invention;
FIG. 10 illustrates another example of reconstructing shared content according to an exemplary embodiment of the present invention;
FIG. 11 illustrates another example of reconstructing shared content according to an exemplary embodiment of the present invention;
FIG. 12 illustrates a flowchart of a method of implementing terminal device interaction according to another exemplary embodiment of the present invention;
FIGS. 13 to 15 illustrate an example of guiding a user of a first terminal device to move a viewing perspective according to an exemplary embodiment of the present invention;
FIG. 16 illustrates an example of guiding a user of a first terminal device to move a viewing perspective according to another exemplary embodiment of the present invention;
FIG. 17 illustrates a flowchart of a method of implementing terminal device interaction according to another exemplary embodiment of the present invention;
FIGS. 18 and 19 illustrate examples of correspondence between different operating devices according to exemplary embodiments of the present invention;
FIG. 20 illustrates an example of matching of operating parameters between different operating devices according to an exemplary embodiment of the present invention;
FIG. 21 illustrates an example of sharing a gesture operation according to an exemplary embodiment of the present invention;
FIG. 22 illustrates an example of adaptively adjusting an operation focus according to an exemplary embodiment of the present invention;
FIG. 23 illustrates a flowchart of a method of implementing terminal device interaction according to another exemplary embodiment of the present invention;
FIG. 24 illustrates a block diagram of an apparatus for implementing terminal device interaction according to an exemplary embodiment of the present invention;
FIG. 25 illustrates a block diagram of a terminal device according to another exemplary embodiment of the present invention;
FIG. 26 illustrates a block diagram of a server according to another exemplary embodiment of the present invention;
FIG. 27 illustrates a block diagram of a terminal device according to another exemplary embodiment of the present invention.
Detailed Description
Reference will now be made in detail to the embodiments of the present invention, examples of which are illustrated in the accompanying drawings, wherein like reference numerals refer to like elements throughout. The embodiments are described below in order to explain the present invention by referring to the figures.
Fig. 1 shows a flowchart of a method for implementing terminal device interaction according to an exemplary embodiment of the present invention. The method may be executed by a terminal device or by a server, and may be implemented by a computer program. For example, the method may be performed by an application for interaction installed in the terminal device or the server, or by a functional program implemented in an operating system of the terminal device or the server.
In step S10, a shared content to be shared to the second terminal device is determined among the contents currently provided by the first terminal device for the user.
As an example, the first terminal device and/or the second terminal device may be virtual reality devices.
In step S20, the determined shared content is transmitted to the second terminal device for sharing.
As an example, the method for implementing terminal device interaction according to an exemplary embodiment of the present invention may further include: before step S10, establishing a connection between the first terminal device and the second terminal device for interaction. The connection request may be initiated by either the first terminal device or the second terminal device, depending on the specific scenario. For example, if the user of the first terminal device wants to view the content provided by the first terminal device together with the user of the second terminal device, the user may request to establish a connection with the second terminal device through the first terminal device, and the connection establishment process is completed when the user of the second terminal device accepts the connection request through the second terminal device. Conversely, if the user of the second terminal device wants to view, through the second terminal device, the content provided by the first terminal device for its user, the user may request to establish a connection with the first terminal device through the second terminal device, and the connection establishment process is completed after the first terminal device accepts the connection request.
As an example, when the first terminal device is the connection request initiator, after the second terminal device accepts the request of the first terminal device and the connection is established, the second terminal device may return at least one of the following information to the first terminal device for better interaction: the device attributes of the second terminal device, the network status of the second terminal device, the user attributes corresponding to the second terminal device, and the user requirements corresponding to the second terminal device (for example, the user interest object corresponding to the second terminal device, the shared area set for the second terminal device, etc.). If the connection request initiator is the second terminal device, the second terminal device may include at least one of the above items in the connection request, or may send at least one of the above items to the first terminal device after the first terminal device accepts the connection request.
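Purely as an illustrative, non-limiting sketch (not part of the claimed method), the information returned upon connection establishment could be modeled as a simple message payload; every field and function name below is a hypothetical assumption:

```python
def build_connection_response(device):
    """Assemble the optional information a second terminal device may
    return to the first terminal device after accepting a connection.
    All field names are illustrative, not mandated by the method."""
    return {
        "device_attributes": device.get("attributes", {}),   # e.g. screen resolution
        "network_status": device.get("network", "unknown"),
        "user_attributes": device.get("user", {}),
        "user_requirements": {
            "interest_objects": device.get("interest_objects", []),
            "shared_area": device.get("shared_area"),
        },
    }

# Hypothetical second terminal device accepting a connection request.
second_device = {
    "attributes": {"resolution": (1920, 1080)},
    "network": "wifi",
    "interest_objects": ["white ball"],
}
response = build_connection_response(second_device)
```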
In step S10, the shared content to be shared to the second terminal device may be determined in various suitable manners among the content currently provided by the first terminal device for the user.
It should be understood that the content currently provided by the first terminal device to the user may be the content currently displayed to the user, or may include the content currently displayed to the user and the content currently not displayed to the user due to the viewing perspective of the user. For example, taking the first terminal device as a virtual reality device as an example, the content currently provided by the first terminal device for the user may be content currently presented in the virtual field of view of the user, or may be content of a current entire virtual scene, that is, content of a current panoramic view.
As a preferred example, the shared content to be shared with the second terminal device may be determined according to at least one of the following: a user interest object corresponding to the first terminal device, content corresponding to a shared area, content corresponding to a specified event, a user interest object corresponding to the second terminal device, changed content containing valid content, and an operation of the user of the second terminal device on the received shared content.
The following detailed description is to be read in connection with specific exemplary embodiments:
(A) Determining shared content according to user interest object corresponding to first terminal equipment
As an example, the user interest object corresponding to the first terminal device may be determined according to at least one of the following: the user behavior corresponding to the first terminal device, the association relations among objects in the content currently provided for the user, the user attributes corresponding to the first terminal device, and the scene corresponding to the content currently provided for the user.
(a1) Determining a user interest object corresponding to a first terminal device according to a user behavior corresponding to the first terminal device
As an example, the object of interest to the user corresponding to the first terminal device may be determined according to an object operated by the user of the first terminal device in the content currently provided by the first terminal device for the user. For example, the object currently operated by the user may be directly determined as the user interest object, and objects associated with the currently operated object may also be determined as user interest objects together with it. Fig. 2 illustrates an example of determining the user interest object corresponding to the first terminal device according to an exemplary embodiment of the present invention. As shown in fig. 2, the content currently provided for the user is a billiards game scene, and the object currently operated by the user of the first terminal device is the gray ball; therefore, the currently operated ball can be determined as an object of interest to the user, and at the same time the other balls on the table (the balls numbered 1, 5, 6, and 8) can also be determined as objects of interest to the user.
As an example, the user interest object corresponding to the first terminal device may be determined according to an operation of the user of the first terminal device on the content currently provided by the first terminal device for the user. Fig. 3 illustrates another example of determining the user interest object corresponding to the first terminal device according to an exemplary embodiment of the present invention. As shown in fig. 3, in a shooting game, when most users perform a shooting operation, the object of interest is the type of object that is the shooting target. Therefore, it can be determined from the user's shooting operation that the object of interest to the user is the type of object targeted by the shooting. As an example, the user interest object may be predicted from the user's current operation of the first terminal device by means of a prediction model that associates different operations with their respective corresponding user interest objects. Here, the prediction model may be trained, or an existing prediction model may be optimized, based on the objects of interest corresponding to various historical operations of the current user or of other users.
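As a non-limiting sketch of the prediction-model idea described above, a toy frequency-based model could associate operation types with the interest objects observed in historical data; the class, data, and operation names below are hypothetical and greatly simplified relative to a real trained model:

```python
from collections import Counter, defaultdict

class InterestPredictor:
    """Toy frequency model: learns which objects users were interested
    in after each operation type (illustrative only)."""
    def __init__(self):
        self.counts = defaultdict(Counter)

    def train(self, history):
        # history: iterable of (operation, interest_object) pairs
        for operation, interest_object in history:
            self.counts[operation][interest_object] += 1

    def predict(self, operation):
        # Return the most frequently observed interest object, if any.
        if not self.counts[operation]:
            return None
        return self.counts[operation].most_common(1)[0][0]

# Hypothetical historical operations of the current user or other users.
history = [
    ("shoot", "enemy"), ("shoot", "enemy"), ("shoot", "crate"),
    ("write", "paper"),
]
model = InterestPredictor()
model.train(history)
```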
As an example, the object of interest to the user corresponding to the first terminal device may be determined according to the current action of the user of the first terminal device. As an example, an object corresponding to the current action of the user of the first terminal device may be determined as the object of interest to the user. For example, most users are interested in an object if their eyes gaze at it for more than 3 seconds; therefore, it can be determined from the user's gazing action that the object of interest to the user is the object currently gazed at. As another example, the current action (e.g., a gesture) of the user of the first terminal device may be semantically analyzed to obtain, as the object of interest to the user, an object matching the action. For example, if the user of the first terminal device makes a writing action, objects such as a pen, paper, or book in the content currently provided for the user may be determined as objects of interest to the user; if the user makes a drinking action, objects such as a cup or a beverage bottle in the content currently provided for the user may be determined as objects of interest to the user.
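The gaze-duration criterion above (an object gazed at for more than 3 seconds) could be sketched as follows; the sampling format and function name are illustrative assumptions:

```python
def gazed_objects(gaze_samples, dwell_threshold_s=3.0):
    """Return objects the user gazed at continuously for at least the
    dwell threshold. gaze_samples is a time-ordered list of
    (timestamp_seconds, object_name) pairs (format is an assumption)."""
    interests = set()
    current, start = None, None
    for t, obj in gaze_samples:
        if obj != current:
            # Gaze moved to a new object: restart the dwell timer.
            current, start = obj, t
        elif obj is not None and t - start >= dwell_threshold_s:
            interests.add(obj)
    return interests

# Hypothetical eye-tracking samples: the cup is gazed at for 3.5 s,
# the pen for only 1 s.
samples = [(0.0, "cup"), (1.0, "cup"), (3.5, "cup"), (4.0, "pen"), (5.0, "pen")]
```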
Further, as an example, a more accurate user object of interest may also be determined from both the current action of the user of the first terminal device and the object that the user operates in the content that the first terminal device currently provides for the user. For example, the object operated by the user in the content currently provided for the user by the first terminal device is a stylus, and by performing semantic analysis on the current action of the user, if the current action of the user is determined to be an action simulating dipping ink, the object interested by the user is determined to be the stylus and the ink bottle; if it is determined that the current motion of the user is a motion simulating writing, it is determined that the object of interest of the user is a stylus pen and writing paper.
As an example, the object of interest to the user corresponding to the first terminal device may be determined according to the current voice of the user of the first terminal device. Specifically, the user of the first terminal device usually engages in spoken exchanges while using the first terminal device, and this speech often contains information about the user's objects of interest. Thus, an object mentioned in the speech of the user of the first terminal device may be recognized by means of a speech recognition technique and determined as an object of interest to the user. For example, in a billiards game, if the white ball is mentioned in the voice of the user of the first terminal device, the white ball may be determined as an object of interest to the user. As another example, when the content currently provided for the user is a game scene, if the user of the first terminal device says "see the pipes" to the user of the second terminal device through the remote connection, it may be determined from the voice content that the "pipe" is the user interest object corresponding to the first terminal device.
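The speech-based determination could, under the simplifying assumption that recognized text is matched against the names of objects present in the current scene, be sketched as follows (a real system would rely on an actual speech recognition engine; the function name and scene data are hypothetical):

```python
def interest_objects_from_speech(transcript, scene_objects):
    """Match recognized speech text against the names of objects in
    the current scene; objects mentioned in the speech are treated as
    user interest objects."""
    text = transcript.lower()
    return [obj for obj in scene_objects if obj.lower() in text]

# Hypothetical scene objects and a recognized utterance.
scene = ["white ball", "black ball", "cue", "pipe"]
found = interest_objects_from_speech("Look, the white ball is near the pipe!", scene)
```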
As an example, the object of interest to the user corresponding to the first terminal device may be determined according to a user setting of the first terminal device. For example, the user of the first terminal device may set an object of interest in advance, and if it is detected that the preset object of interest is included in the content currently provided for the user, the detected object may be determined as the user object of interest. In addition, the user may also designate an object among the contents currently provided for the user as an object of interest to the user in real time.
As an example, the object of interest to the user corresponding to the first terminal device may be determined according to the historical behavior of the user of the first terminal device. Specifically, the historical behavior of the user reflects, to some extent, the user's past demand at a certain moment and also indicates the user's potential demand at some future moment; therefore, objects that match the user's historical behavior can be detected in the content currently provided for the user and taken as objects of interest to the user. As an example, the historical behavior may be eye gaze behavior, search behavior, sharing behavior, browsing behavior, editing behavior, clicking behavior, and the like.
(a2) Determining the object of interest of the user corresponding to the first terminal device according to the association relations among objects in the content currently provided for the user
Because the objects in the scene are often not independent and have some correlation with other objects, the objects related to the objects of interest of the user can be obtained and used as the objects of interest of the user together according to the correlation between the objects. For example, an object associated with an object currently operated by the user may be determined together as an object of interest to the user.
By way of example, the associated object may be an object of the same type as the object of interest to the user, a matching object, or the like. It should be understood that the associated objects may be determined according to various suitable manners, for example, the associated objects may be determined according to association relations preset in a scene; or whether the objects belong to the same type can be determined according to attributes such as categories, colors, shapes and the like; or whether it is a matching object may be determined according to attributes such as categories, matching relationships, and the like. For example, if the object operated by the user in the content currently provided for the user is a pen, both the pen and an object associated therewith (e.g., book, whiteboard, etc.) may be determined as the object of interest to the user.
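The association-based expansion of the user interest object could be sketched as below, assuming each scene object carries illustrative `category` and `matches` attributes (these attribute names are assumptions, not part of the method):

```python
def associated_objects(target, scene_objects):
    """Collect scene objects related to the target: objects of the
    same category, or objects that list the target as a match."""
    related = []
    for obj in scene_objects:
        if obj is target:
            continue
        same_type = obj["category"] == target["category"]
        matches = target["name"] in obj.get("matches", [])
        if same_type or matches:
            related.append(obj["name"])
    return related

# Hypothetical scene: the user operates a pen; the book shares its
# category and the whiteboard declares a match with the pen.
pen = {"name": "pen", "category": "stationery", "matches": []}
scene = [
    pen,
    {"name": "book", "category": "stationery", "matches": []},
    {"name": "whiteboard", "category": "furniture", "matches": ["pen"]},
    {"name": "cup", "category": "tableware", "matches": []},
]
```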
(a3) Determining a user interest object corresponding to a first terminal device according to a scene corresponding to content currently provided for a user
For example, in a billiards game scene, the objects of interest to most users are the billiard balls. Therefore, the object of interest to the user can be determined to be the billiard balls according to the billiards game scene corresponding to the content currently provided for the user.
As an example, the user interest object may be predicted from the scene corresponding to the content currently provided for the user by means of a prediction model that associates different scenes with their respective corresponding user interest objects. Here, the prediction model may be trained, or an existing prediction model may be optimized, based on the objects of interest corresponding to the current user or other users in various scenes.
(a4) Determining a user interest object corresponding to the first terminal device according to the user attribute corresponding to the first terminal device
As an example, the user attributes may include at least one of the following: the user's educational background, the user's identity, the user's income status, the user's interests, and the user's habits.
(B) Determining shared content according to content corresponding to shared area
As an example, the shared region may include at least one of the following regions: a shared area set for the first terminal device, a shared area set for the second terminal device, the area where the user interest object corresponding to the first terminal device is located, and the area where the user interest object corresponding to the second terminal device is located. It should be understood that the shared region may include one or more regions.
(b1) With respect to a shared area set for a first terminal device
The user of the first terminal device may set a fixed position and size for the shared area in advance, or the position and size of the shared area may be updated in real time as specified by the user. The user may specify the shared region through various suitable operations, for example, eye gaze, gestures, clicks, circling, and the like. For example, the location points gazed at by the user's eyes within a certain period of time may be counted to obtain a distribution map of location points, and the user's region of interest may then be determined according to this distribution map and set as the shared region.
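The gaze-distribution approach to deriving a shared region could be sketched as follows, simplified to the bounding box of the accumulated gaze points plus a margin (the coordinate format and margin value are illustrative assumptions):

```python
def gaze_region(gaze_points, margin=10):
    """Derive a rectangular region of interest from accumulated gaze
    points, simplified here to the bounding box of all points plus a
    margin; a real system might instead find the densest cluster.
    Returns (left, top, right, bottom)."""
    xs = [x for x, _ in gaze_points]
    ys = [y for _, y in gaze_points]
    return (min(xs) - margin, min(ys) - margin,
            max(xs) + margin, max(ys) + margin)

# Hypothetical gaze samples collected over a period of time.
points = [(100, 120), (110, 125), (105, 130)]
region = gaze_region(points)
```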
Fig. 4 illustrates an example of determining shared content according to content corresponding to a shared area according to an exemplary embodiment of the present invention. As shown in fig. 4, the content currently provided to the user includes a personal bank card and a soccer game, and the user only wants to share the content related to the soccer game to the user of the second terminal device, so that the user of the first terminal device can designate a sharing area (an area corresponding to a dotted square in fig. 4) by a gesture operation, thereby sharing only the content corresponding to the sharing area to the user of the second terminal device.
(b2) About the area where the user interest object corresponding to the first terminal device is located
Here, the shared area may be adaptively adjusted: the position and size of the shared area may be updated in real time as the size and position of the user interest object change. For example, the user may set the user interest object (which may be specified in real time or preset), and the minimum area containing the user interest object is then determined as the current shared area.
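Determining the minimum area containing the user interest objects amounts to taking the union of their bounding boxes, which could be sketched as follows (the rectangle format is an illustrative assumption):

```python
def minimal_shared_area(object_boxes):
    """Smallest axis-aligned rectangle containing every user interest
    object; each box is given as (left, top, right, bottom)."""
    lefts, tops, rights, bottoms = zip(*object_boxes)
    return (min(lefts), min(tops), max(rights), max(bottoms))

# Hypothetical bounding boxes of two user interest objects.
boxes = [(10, 20, 50, 60), (40, 10, 80, 55)]
area = minimal_shared_area(boxes)
```

Recomputing this on every frame lets the shared area track the interest objects as they move or resize.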
In addition, the user may also designate an area in which the dynamic user object of interest may appear in the content currently provided by the first terminal device for the user as a sharing area, and the first terminal device shares the content in the sharing area to the second terminal device in real time. When the occurrence of the dynamic user interest object is detected, the current sharing area can be kept unchanged according to the requirements of the user of the first terminal device and/or the second terminal device, or the area where the dynamic user interest object is located is selected as the sharing area, that is, the sharing area is adaptively adjusted along with the change of the user interest object.
(b3) With respect to the shared area set for the second terminal device and the area where the user interest object corresponding to the second terminal device is located
When setting the shared area, not only should the requirements of the user of the first terminal device be met, but also, according to the requirements of the user of the second terminal device, the area requested by the user of the second terminal device may be used as the shared area, and/or the area where the user interest object corresponding to the second terminal device is located may be used as the shared area.
(C) Determining shared content according to content corresponding to specified event
In the process of sharing content from the first terminal device to the second terminal device, the first terminal device is often accompanied by some events, and for some specified events, a user of the second terminal device needs to be reminded. Therefore, the content corresponding to the specified event can be regarded as the shared content.
By way of example, the specified events may include system-related events and/or content-related events.
(c1) About system-related events
As an example, the system-related events may include at least one of the following: system notifications, error reminders, timed reminders, low power, abnormal network speed, abnormal device temperature, and other system pop-up events. For example, when the first terminal device issues a low-power alarm, the event prompt window may optionally be sent to the second terminal device so as to promptly notify the user of the second terminal device that the first terminal device is in a low-power state, and the sharing of content may be interrupted or the resolution of the shared content may be reduced.
(c2) Relating to content-related events
As an example, the content-related event may include that a user interest object corresponding to the first terminal device and/or the second terminal device appears in the content currently provided for the user. For example, when a user interest object corresponding to the first terminal device and/or the second terminal device appears in the content currently provided for the user, the content of the user interest object or the area where the user interest object is located may be shared to the second terminal device.
It should be understood that the specified event may be set by a user of the first terminal device, and may also be set by a user of the second terminal device. For example, the user of the second terminal device may request that when a certain user interest object appears, the content of the area where the user interest object is located is sent to the second terminal device for sharing.
In addition, when the second terminal device receives the content corresponding to the specified event, the user of the second terminal device can be prompted in the modes of voice, vibration and the like while the received content is displayed.
(D) Determining shared content according to user interest objects corresponding to the second terminal equipment
As an example, the object of interest of the user corresponding to the second terminal device may be determined according to at least one of the following: the user behavior corresponding to the second terminal device and the user attribute corresponding to the second terminal device.
(d1) Determining the user interest object corresponding to the second terminal device according to the user behavior corresponding to the second terminal device
As an example, the user interest object corresponding to the second terminal device may be determined according to at least one of an operation of the user of the second terminal device on the content displayed by the second terminal device, an object operated in the displayed content, a current action, a voice, a set interest object, and a historical behavior.
(d2) Determining a user interest object corresponding to the second terminal device according to the user attribute corresponding to the second terminal device
As an example, the user attributes may include at least one of the following: the user's educational background, the user's identity, the user's income status, the user's interests, and the user's habits.
(E) Determining shared content from changed content containing valid content
As an example, the changed content may be the part of the content currently provided for the user that has changed relative to the most recently transmitted shared content. As an example, the changed content may be determined to contain valid content when at least one of the following conditions is satisfied: the ratio of the changed content to the content currently provided for the user is not less than a first predetermined threshold; the changed content includes content corresponding to the shared area; the changed content includes content corresponding to a specified event; the changed content includes a user interest object corresponding to the first terminal device and/or the second terminal device; the changed content includes an operable object.
As an example, the first predetermined threshold may be a fixed value (e.g., 30%) by default, or may be a value set according to the user requirements of the first terminal device and/or the second terminal device.
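The first condition above (the ratio of changed content not below the first predetermined threshold) could be sketched as follows, with frames simplified to flat lists of pixel values of equal length and the 30% default used as the threshold; this sketch covers only that one condition, not the event- or object-based ones:

```python
def contains_valid_content(prev_frame, cur_frame, threshold=0.30):
    """Decide whether the changed part of the frame is worth sharing
    based solely on the fraction of changed pixels (one of several
    conditions in the method; frame format is an assumption)."""
    changed = sum(1 for a, b in zip(prev_frame, cur_frame) if a != b)
    return changed / len(cur_frame) >= threshold

# Hypothetical 10-pixel frames.
prev = [0] * 10
minor = [1] + [0] * 9      # 10% changed: below the threshold
major = [1] * 4 + [0] * 6  # 40% changed: contains valid content
```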
Fig. 5 illustrates an example of determining shared content from changed content containing valid content according to an exemplary embodiment of the present invention. As shown in fig. 5, (a) in fig. 5 shows the shared content most recently transmitted to the second terminal device, hereinafter referred to as screen one, and the content currently provided to the user (hereinafter referred to as screen two) has changed relative to screen one, as shown in (b) in fig. 5. If the part of screen two that has changed relative to screen one does not contain valid content, the content currently provided for the user need not be shared, and, as shown in (c) in fig. 5, the second terminal device continues to display the most recently received shared content. If the part of screen two that has changed relative to screen one contains valid content, the content of that changed part can be determined as the shared content; the second terminal device can then synthesize the changed image (that is, screen two) from screen one and the currently received changed part, and display it, as shown in (d) of fig. 5.
(F) Determining shared content according to operation of user of second terminal equipment on received shared content
As an example, the user interest object corresponding to the second terminal device and/or the sharing area set for the second terminal device may be determined according to an operation of the user of the second terminal device on the received shared content, so as to determine the shared content according to the determined user interest object corresponding to the second terminal device and/or the sharing area set for the second terminal device.
In addition, as an example, the method for implementing terminal device interaction according to the exemplary embodiment of the present invention may further include: step S10 is performed when at least one of the following is satisfied: the movement speed of the first terminal equipment is not greater than a second preset threshold value; the content currently provided for the user includes at least one of content corresponding to the shared area, content corresponding to the specified event, a user interest object corresponding to the first terminal device, a user interest object corresponding to the second terminal device, an operable object, and changed content including effective content.
As an example, it may be determined not to perform step S10 when at least one of the following conditions is satisfied: the movement speed of the first terminal device exceeds the second predetermined threshold; or the movement speed of the first terminal device exceeds a third predetermined threshold while the content currently provided to the user does not include content that needs to be shared. As an example, the content that needs to be shared may include at least one of: content corresponding to the shared area, content corresponding to the specified event, a user interest object corresponding to the first terminal device, a user interest object corresponding to the second terminal device, an operable object, and changed content that includes valid content.
It should be appreciated that the movement speed of the first terminal device may be obtained in any suitable manner. For example, when the first terminal device is a virtual reality device, the head movement of the user can be detected through a gyroscope and an acceleration sensor built into the first terminal device, and the first terminal device can be controlled accordingly to present the changed content to the user. The data detected by the gyroscope and the acceleration sensor also reflect the motion state of the first terminal device, namely the rotation angle and the rotation speed. Motion analysis may therefore be performed on the gyroscope and acceleration sensor data to obtain the motion state (e.g., the movement speed) of the first terminal device. If this motion state indicates that the head of the user of the first terminal device is turning quickly (e.g., the movement speed exceeds the second predetermined threshold), the user may be considered uninterested in the content presented during the rotation, and it may thus be determined not to perform step S10.
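The gating of step S10 on head-motion speed can be sketched as below; representing a gyroscope sample as a rotation-rate vector and the concrete threshold value are assumptions for illustration:

```python
import math

def angular_speed(gyro_sample):
    """Magnitude of the rotation-rate vector (rad/s) from a gyroscope reading."""
    return math.sqrt(sum(axis * axis for axis in gyro_sample))

def should_share(gyro_sample, second_threshold):
    """Skip sharing while the user's head turns faster than the second threshold."""
    return angular_speed(gyro_sample) <= second_threshold
```

In practice the raw sensor data would be smoothed over a short window before thresholding, so that a single noisy sample does not suppress sharing.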
Fig. 6 illustrates an example of determining whether to share content according to motion speed, according to an exemplary embodiment of the present invention. As shown in fig. 6, when the user of the first terminal device turns his or her head, the motion state of the first terminal device may be detected through the built-in gyroscope and acceleration sensor. When the detected motion speed exceeds the second predetermined threshold, it may be determined that the user is not interested in the content currently provided, and that content is not shared; as shown in (a) of fig. 6, the second terminal device keeps displaying the shared content most recently received from the first terminal device. If the detected motion speed does not exceed the second predetermined threshold, indicating that the user may be interested in the content currently provided, steps S10 and S20 may be performed and the determined shared content transmitted to the second terminal device for sharing, as shown in (b) of fig. 6.
As another example, whether to perform step S10 may be determined according to the content currently provided to the user in combination with the motion state of the first terminal device. Specifically, motion analysis may be performed on the data of the built-in gyroscope and acceleration sensor to obtain the motion state (e.g., the movement speed) of the first terminal device. If this motion state indicates that the head of the user is turning quickly (e.g., the movement speed exceeds a third predetermined threshold), the content of the current virtual view image may be further analyzed. If the analysis shows that, while the head is turning quickly, the content provided by the first terminal device does not include content that needs to be shared, the user may be considered uninterested in the content provided during the rotation, and step S10 may not be executed.
It should be understood that the second predetermined threshold and the third predetermined threshold may be the same or different. As a preferred example, the second predetermined threshold may be the greater of the two and the third predetermined threshold the lesser; that is, it is determined not to perform step S10 only when the user of the first terminal device turns the head very quickly, whereas if the user turns the head only moderately quickly, whether to execute step S10 may be further determined according to the content currently provided to the user.
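The interplay of the two thresholds can be sketched as a small decision function; the concrete threshold values are hypothetical:

```python
def decide_sharing(speed, has_shareable_content,
                   second_threshold=3.0, third_threshold=1.5):
    """Decide whether to perform step S10, with second_threshold > third_threshold.

    - above the second threshold: never share (head turns very quickly);
    - between the two thresholds: share only if the view contains shareable content;
    - at or below the third threshold: always share.
    """
    if speed > second_threshold:
        return False
    if speed > third_threshold:
        return has_shareable_content
    return True
```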
As a preferred example, in step S20, the shared content may be reconstructed in a preset manner, and the reconstructed content may be transmitted to the second terminal device for sharing.
As an example, the preset manner may include at least one of the following: acquiring the content within a minimum area including the shared content; adjusting the shared content according to a device attribute of the second terminal device; reconstructing the shared content according to redundant areas; adjusting the shared content according to current network conditions; correcting the shared content based on a user attribute corresponding to the first terminal device and/or a user attribute corresponding to the second terminal device; filtering the shared content based on a filtering condition; and optimizing the shared content according to an operation of the user of the first terminal device and/or the second terminal device on the shared content.
Each of these manners is described in detail below with reference to specific exemplary embodiments:
(A) Obtaining content within a minimum area including the shared content
When the shared content is an object, the content within a minimum area including the shared content may be acquired from the content currently provided to the user, so that the user of the second terminal device can also see surrounding content such as the background and thereby enjoy an enhanced viewing experience. Fig. 7 illustrates an example of reconstructing shared content according to an exemplary embodiment of the present invention. As shown in fig. 7, if the determined shared content is the user interest object "pipe" corresponding to the first terminal device, the content within the minimum area including the "pipe" may be sent to the second terminal device for sharing.
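Computing the minimum area around an object, optionally padded so some background survives for context, can be sketched as follows; representing object positions as (row, column) pairs is an assumed simplification:

```python
def min_area(object_pixels, margin=0):
    """Axis-aligned bounding box (top, left, bottom, right) of the object,
    optionally padded so some surrounding background is kept."""
    rows = [r for r, _ in object_pixels]
    cols = [c for _, c in object_pixels]
    return (min(rows) - margin, min(cols) - margin,
            max(rows) + margin, max(cols) + margin)
```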
(B) Adjusting the shared content according to the device attribute of the second terminal device
As an example, the device properties may include at least one of: screen size, type of image that can be displayed (e.g., 2D, 3D, etc.), resolution of the screen, memory, processor.
As an example, the size of the shared content may be adjusted according to the screen size of the second terminal device. For example, the shared content may be enlarged so that it is displayed as large as possible within the limited screen of the second terminal device, or it may be reduced so that it is displayed completely within that screen, thereby improving the sharing effect.
As an example, if the second terminal device can display only a 2D image, the shared content may be converted from a 3D content form to a 2D content form.
As an example, if the resolution of the screen of the second terminal device is low, the resolution of the shared content may be reduced.
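The screen-size adjustment described in this section amounts to choosing a uniform scale factor; a minimal sketch, assuming pixel dimensions are known for both the shared content and the target screen:

```python
def fit_to_screen(content_w, content_h, screen_w, screen_h):
    """Scale factor that lets the shared content fill the second device's
    screen as fully as possible while still fitting completely inside it."""
    return min(screen_w / content_w, screen_h / content_h)
```

A factor greater than 1 corresponds to the enlargement case, and a factor below 1 to the reduction case.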
(C) Reconstructing the shared content from redundant regions
It should be understood that the determined shared content may comprise a plurality of shared items. According to an exemplary embodiment of the present invention, these items may therefore be arranged compactly together according to their positional relationship in the content provided to the user, so as to minimize redundant blank areas. In addition, the arrangement positions of the shared items may be determined according to user requirements of the first terminal device and/or the second terminal device.
Fig. 8 illustrates another example of reconstructing shared content according to an exemplary embodiment of the present invention. As shown in fig. 8, if the determined shared content is the user interest objects "application icons" corresponding to the first terminal device, the "application icons" may be enlarged and then arranged compactly together according to their positional relationship in the content currently provided to the user, so that redundant areas are reduced as much as possible.
(D) Adjusting the shared content based on current network conditions
As an example, if the current network conditions are good, the resolution of the shared content may be increased; if the current network conditions are poor, the resolution of the shared content may be reduced.
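A resolution choice driven by network conditions can be sketched as follows; the bandwidth-per-resolution model and the resolution levels are purely hypothetical assumptions for illustration:

```python
def pick_resolution(bandwidth_mbps, levels=(480, 720, 1080)):
    """Choose the highest resolution the measured bandwidth supports,
    assuming (hypothetically) about 3 Mbit/s per 360 lines of video."""
    affordable = [h for h in levels if bandwidth_mbps >= h / 360 * 3]
    return max(affordable) if affordable else min(levels)
```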
(E) Correcting the shared content based on the user attribute corresponding to the first terminal device and/or the user attribute corresponding to the second terminal device
The users of the first terminal device and the second terminal device each have certain specific attributes that reflect their particular needs. Reconstructing the shared content according to these user attributes accommodates the users' personalized requirements and improves their experience. As an example, the user attributes may include at least one of: the user's educational background, the user's identity, the user's income status, the user's interests, the user's habits, and the user's physical condition.
As an example, if analysis of the eye attributes of the user of the first terminal device determines that the user is color-blind, the shared content may be automatically corrected according to this attribute. As shown in fig. 9, the user interest object corresponding to the first terminal device may be determined to be a "red ball" according to the voice of the user of the first terminal device; however, the user attribute corresponding to the first terminal device reveals that the user is red-green color-blind, so after correction the content of the area where the "green ball" is located is sent to the second terminal device for sharing.
As an example, if it is determined that the user is accustomed to using the left hand through analysis of the dominant hand attributes of the user of the first terminal device and/or the second terminal device, the shared content may be automatically corrected according to the left-handed attributes of the user.
As an example, portrait analysis may be performed on the user of the first terminal device and/or the second terminal device according to conditions such as educational background, identity, and income, so as to obtain the user's preferences and correct the shared content accordingly, achieving personalization.
(F) Filtering the shared content based on filtering conditions
As an example, the filtering condition may be one preset by the user of the first terminal device and/or one preset by the user of the second terminal device. For example, the user of the second terminal device may preset that event windows popped up by the system are to be filtered out, so as to avoid affecting the experience.
As an example, the filtering condition may be determined from the historical behavior of the user of the first terminal device and/or the second terminal device. The historical behavior of a user reflects, to a certain extent, the user's points of interest, so the corresponding filtering condition can be determined by analyzing that historical behavior. For example, in a virtual drawing scene, a user always uses a writing brush to create wash paintings and rarely uses a pencil for sketching, so the pencil may be automatically filtered out of the shared content.
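Deriving a filtering condition from usage history, as in the writing-brush example, can be sketched as a simple frequency count; the minimum-use threshold is an illustrative assumption:

```python
from collections import Counter

def filter_by_history(objects, usage_log, min_uses=2):
    """Drop objects the user has rarely used, keeping frequently used ones."""
    counts = Counter(usage_log)
    return [o for o in objects if counts[o] >= min_uses]
```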
As an example, the filter condition may be a filter condition determined according to a user attribute. Here, the user attribute may include at least one of: the user attribute corresponding to the first terminal device, the user attribute corresponding to the second terminal device, and the attribute of the role in the content provided for the user. For example, objects that do not match the user attributes may be filtered out, e.g., if the user of the first terminal device is a student, then content other than content related to the student may be filtered out.
As an example, the filter condition may be a filter condition determined according to a user authority of the sharing device set by the user of the first terminal device.
Different user permissions can be set for different sharing devices or different users of a sharing device, so as to protect the private content of the user of the first terminal device from disclosure, or to prevent a user of a sharing device from performing erroneous operations on the shared content and causing unnecessary loss. For example, if the user of the first terminal device is currently playing a game and the user of the second terminal device inadvertently turns the game off, the user of the first terminal device cannot return to the game in progress.
As an example, the user permissions may include at least one of: the system sets authority, user privacy authority and application program range authority.
The system setting permission concerns which system settings can be shared and which cannot. For example, settings of system parameters such as sound, brightness, and vibration feedback may be set as non-shareable, to avoid unnecessary disturbance to the user of the first terminal device caused by erroneous operation by the user of the second terminal device.
The user privacy permission concerns which content involving the user's privacy cannot be shared. For example, content including the bank card, personal account, and friend information of the user of the first terminal device may be set as non-shareable, so as to protect the personal privacy of the user of the first terminal device from being freely viewed by the user of the second terminal device.
The application scope permission concerns which applications' content can be shared and which cannot; for a shareable application, it may further be set which content within the application interface can be shared. For example, the user of the first terminal device may be chatting while watching a video, and the dialog box in the chat application may include a chat history with another person that the user does not want the user of the second terminal device to see.
In addition, specific permission levels can be set, for example invisible (i.e., not shared), read-only, and operable, and the user of the first terminal device can flexibly set these permissions to protect the shared content. As shown in fig. 10, the content to be shared includes personal bank-card information and a soccer-match video interface. For user A of the second terminal device, the permission level for the bank card is invisible, so the bank-card information is filtered out of the content to be shared, and only the shared content including the soccer-match video interface is sent to user A. For user B of the second terminal device, the permission level for the bank card is visible, so the shared content containing both the personal bank-card information and the soccer-match video interface is sent to user B directly.
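The permission-level filtering of fig. 10 can be sketched as a lookup per (viewer, item) pair; the level names and the design choice of treating items with no explicit entry as invisible are assumptions:

```python
INVISIBLE, READ_ONLY, OPERABLE = 0, 1, 2

def filter_for_viewer(items, permissions, viewer):
    """Keep only the items the viewer may at least see, and record
    whether the viewer may also operate on each of them."""
    shared = []
    for name, content in items:
        level = permissions.get((viewer, name), INVISIBLE)
        if level == INVISIBLE:
            continue  # e.g. user A never receives the bank-card region
        shared.append((name, content, level == OPERABLE))
    return shared
```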
By filtering the content to be shared based on the filtering condition, the range of the shared content can be narrowed, the amount of transmitted shared data can be reduced, and the privacy of the user of the first terminal device can be protected.
(G) Optimizing shared content according to operation of user of first terminal device and/or second terminal device for shared content
While content is being shared, the users of the first terminal device and the second terminal device may perform operations on the shared content, for example an enlargement operation. These operations reflect, to some extent, the users' sharing requirements; an enlargement operation, for instance, reflects a requirement on the quality of the shared content.
Based on these operations by the user of the first terminal device and/or the second terminal device, the shared content may be reconstructed and then shared; for example, the quality of the shared content may be improved (e.g., its resolution increased).
For example, if the user of the first terminal device performs an enlargement operation on the shared content, a finer-grained (high-resolution) image of the shared content can be transmitted to the second terminal device for sharing.
As an example, step S20 may include: transmitting the shared content in a high-resolution form, and transmitting the rest of the content currently provided to the user to the second terminal device in a low-resolution form.
Further, as an example, step S10 may further include: determining, based on the screen size of the second terminal device, the part of the content currently provided to the user that cannot be displayed within the screen of the second terminal device; and overlaying the content and/or operable objects that need to be shared from that non-displayable part onto the part that can be displayed within the screen of the second terminal device, the overlaid result being taken as the shared content to be shared with the second terminal device.
As shown in fig. 11, the screen of the second terminal device is small and can display only limited content; to display as much of the shared content as possible within this limited display area, the shared content may be determined according to the screen size of the second terminal device. For example, (a) in fig. 11 shows the content currently provided to the user, and (b) shows the content the second terminal device could display if the content currently provided to the user were transmitted directly for sharing. According to an exemplary embodiment of the present invention, the user interest object corresponding to the first terminal device in the portion that the second terminal device cannot display (the boxed content in (a) of fig. 11) may be extracted and superimposed on the portion that can be displayed within the screen of the second terminal device, as shown in (c), so that the user of the second terminal device obtains more information within the limited display area. In addition, to help the user of the second terminal device accurately recognize the actual position of a superimposed object, a thumbnail of the content currently provided to the user, indicating that actual position, may also be transmitted to the second terminal device for display.
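The superimposition of off-screen interest objects onto the displayable region can be sketched as follows, with content modeled as a position-to-value mapping and overlay slots chosen in advance (both assumptions made for illustration):

```python
def compose_for_small_screen(visible, offscreen_objects, overlay_slots):
    """Superimpose shareable objects extracted from the non-displayable part
    onto reserved slots inside the displayable part."""
    composed = dict(visible)
    for slot, obj in zip(overlay_slots, offscreen_objects):
        composed[slot] = obj  # the overlaid object replaces the slot's content
    return composed
```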
In addition, as an example, the method for implementing terminal device interaction according to the exemplary embodiment of the present invention may further include: and optimizing the shared content according to the operation of the user of the second terminal equipment on the received shared content, and sending the optimized content to the second terminal equipment.
During content sharing between the first terminal device and the second terminal device, the user of the second terminal device may perform operations, such as an enlargement operation, on the received shared content. These operations reflect, to some extent, the sharing requirements of the user of the second terminal device; an enlargement operation, for instance, reflects a requirement on the quality of the shared content. Based on these operations, the first terminal device may optimize the shared content and transmit the optimized content to the second terminal device.
For example, after the first terminal device sends the shared content to the second terminal device, the user of the second terminal device performs an enlargement operation on it. The second terminal device sends information indicating this enlargement operation to the first terminal device, and the first terminal device sends a finer-grained (higher-resolution) image of the shared content, or of the enlarged area the user is viewing, to the second terminal device for display, thereby meeting the requirements of the user of the second terminal device.
Fig. 12 shows a flowchart of a method of implementing terminal device interaction according to another exemplary embodiment of the present invention. Step S10 and step S20 may be implemented with reference to the specific implementation described with reference to fig. 1, and are not described herein again.
In step S30, the following is received from the second terminal device: position information of a target object selected by the user from the received shared content, and/or a motion path from the current viewing perspective of the first terminal device to the viewing perspective corresponding to the target object.
In particular, the user of the second terminal device may choose to view previously received shared content rather than the content currently being shared in real time; that is, the users of the two devices may be viewing different content, in other words different viewing angles of the panoramic view. As an example, the panoramic view may be shared with the second terminal device in advance, or the second terminal device may reconstruct the panoramic view from previously received shared content, so that its user can browse previously shared content through the panoramic view.
In step S40, according to the received position information and/or the motion path, the user of the first terminal device is guided to move the viewing angle and/or the viewing angle of the user of the first terminal device is automatically switched.
As an example, when the position information of the target object is received, a motion path from the current viewing perspective to the viewing perspective corresponding to the target object (i.e., the perspective at which the target object appears in the field of view of the user of the first terminal device) may be calculated from the received position information and the user's current viewing perspective; the user may then be guided to move the viewing perspective, and/or the viewing perspective may be switched automatically, according to the calculated motion path.
As an example, the user may be guided to move the viewing perspective by voice, by displaying the movement path, by an alert tone, or the like. Taking the alert tone as an example: as the user of the first terminal device approaches the target object, the alert tone grows louder and the interval between tones shorter; as the user moves away from the target object, the tone grows quieter and the interval longer.
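The distance-driven alert tone can be sketched as two monotone functions of the distance to the target; the specific formulas are illustrative assumptions, chosen only so that volume falls and interval grows with distance:

```python
def alert_tone(distance, max_volume=1.0, base_interval=2.0):
    """Louder and more frequent as the user nears the target;
    quieter and sparser as the user moves away."""
    volume = max_volume / (1.0 + distance)
    interval = base_interval * (1.0 + distance)
    return volume, interval
```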
As shown in fig. 13, the user of the second terminal device may choose to view previously received shared content, i.e., content different from what the user of the first terminal device is currently viewing. As shown in fig. 14, the user may select a target object, "computer", from the previously received shared content and transmit the position information of the selected target object to the first terminal device. According to the received position information of the "computer", the first terminal device guides its user to move the viewing perspective so that the "computer" appears in the user's field of view, as shown in fig. 15. It should be understood that, since the viewing perspective of the user of the first terminal device changes, the content the first terminal device currently provides to the user may also change; steps S10 and S20 may therefore be performed again, so that the user of the second terminal device can determine whether the user of the first terminal device has moved to the position of the selected target object.
As another example, the following may be received directly from the second terminal device: a motion path from the current viewing perspective of the first terminal device to the viewing perspective corresponding to the target object.
Specifically, the user of the second terminal device may select a target object from previously received shared content. The second terminal device may calculate a motion path from the current viewing perspective of the user of the first terminal device (which may be determined, for example, from the current shared content) to the viewing perspective corresponding to the target object, based on that current viewing perspective and the position information of the selected target object, and may transmit the calculated motion path to the first terminal device. The first terminal device may then guide its user to move the viewing perspective, and/or automatically switch its user's viewing perspective, according to the received motion path.
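The motion path from the current viewing perspective to the target's perspective can be sketched as a simple interpolation; treating a viewing perspective as a (yaw, pitch) angle pair is an assumed simplification of whatever representation the devices actually exchange:

```python
def motion_path(current_view, target_view, steps=4):
    """Linear interpolation from the current (yaw, pitch) viewing angles to
    those at which the target object appears in the field of view."""
    cy, cp = current_view
    ty, tp = target_view
    return [(cy + (ty - cy) * i / steps, cp + (tp - cp) * i / steps)
            for i in range(1, steps + 1)]
```

Each intermediate pair can then drive either on-screen guidance or automatic switching of the viewing perspective.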
Fig. 16 (a) shows the current shared content. Fig. 16 (b) shows the second terminal device currently displaying a panoramic view, on which the current viewing perspective of the user of the first terminal device can be displayed synchronously. The second terminal device calculates a motion path from the current viewing perspective of the user of the first terminal device to the viewing perspective corresponding to the target object, according to the position information of the target object selected by the user in the panoramic view and that current viewing perspective, and transmits the calculated motion path to the first terminal device. The first terminal device then guides its user to move the viewing perspective, or automatically switches its user's viewing perspective, according to the received motion path. It should be understood that, since the viewing perspective of the user of the first terminal device changes, the content currently provided to that user may also change; the first terminal device may therefore return to executing steps S10 and S20, so that the user of the second terminal device can determine whether the user of the first terminal device has moved to the position of the target object, as shown in fig. 16 (c).
Fig. 17 shows a flowchart of a method of implementing terminal device interaction according to another exemplary embodiment of the present invention. Step S10 and step S20 may be implemented with reference to the specific implementation described with reference to fig. 1, and are not described herein again.
When using the first terminal device, its user may be unfamiliar with some manipulations and may want to obtain operation guidance from the user of the second terminal device by sharing content with that device. After the second terminal device receives the shared content, its user may perform a series of guidance operations. Optionally, the method for implementing terminal device interaction according to another exemplary embodiment of the present invention may include steps S30 and S40, prompting the user of the first terminal device how to move from the current viewing perspective to the position of the object operated on by the user of the second terminal device, so that the user of the first terminal device can find that object. Alternatively, steps S30 and S40 may be omitted, for example when the object operated on by the user of the second terminal device is already within the current viewing perspective of the user of the first terminal device.
In step S50, operation information describing the operation performed by the user on the received shared content is received from the second terminal device.
As an example, the operation information describing the operation performed by the user of the second terminal device on the received shared content may include at least one of: attribute information and operation parameters of the operating device; operation information about a gesture operation; and operation information about a voice operation. For example, the operating device may be an externally connected device such as a handle, keyboard, or mouse, or may be the touch screen, keys, or the like of the second terminal device itself. The operation parameter may be a specific parameter such as a click, slide, or long press.
In step S60, a function corresponding to the received operation information is executed for the content currently provided to the user.
As an example, corresponding operation information for performing the same operation on the content currently provided to the user may first be determined from the received operation information, and then the function corresponding to the determined operation information may be executed. Operation sharing between the first terminal device and the second terminal device is thereby realized, and operations by the user of the second terminal device on the shared content can be reflected in the content provided by the first terminal device to its user.
The first terminal device and the second terminal device may use different operations to achieve the same processing on the same object. For example, the user of the first terminal device may move the viewing perspective to the right by operating a handle, while the user of the second terminal device achieves the same result by sliding rightward on the screen. Therefore, in practice, when the user of the second terminal device guides the user of the first terminal device in operating on the shared content, the current operation of the user of the second terminal device needs to be matched to the operating device of the first terminal device, and the matched operation can be displayed on the first terminal device in real time to show the user of the first terminal device how to achieve the operation effect.
As an example, when the operation information includes attribute information and operation parameters of the operating device of the second terminal device, it may be determined from the attribute information whether the operating device of the second terminal device is consistent or compatible with that of the first terminal device. If so, the operation parameters may be applied to the first terminal device directly; otherwise, they need to be converted into the corresponding operation parameters for performing the same operation on the content currently provided by the first terminal device to its user, and the converted parameters applied to the first terminal device. It should be understood that, since the first terminal device executes the function corresponding to the converted operation information, the content currently provided to the user also changes; steps S10 and S20 may therefore be performed again, and the effect produced by the operation fed back to the user of the second terminal device in real time, thereby achieving real-time sharing of the operation. In addition, the matched operation parameters can be displayed in real time on the first terminal device.
As shown in fig. 18, the first terminal device may store in advance the correspondence of operation parameters between different handles. As shown in fig. 19, the handle connected to the second terminal device differs from the handle connected to the first terminal device, and by matching the attribute information of the devices, the "O" key of the handle connected to the second terminal device is found to have the same function as the "A" key of the handle connected to the first terminal device. Therefore, when the user of the second terminal device presses the "O" key of the connected handle, the first terminal device automatically matches it to the "A" key of its own handle and processes the content currently provided for the user according to the function of the "A" key. Further, the matched handle operation can be displayed on the first terminal device, as shown in fig. 20.
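The pre-stored correspondence of operation parameters between handles can be sketched as a lookup table, as in the following minimal Python sketch. The handle model names and the mapping entries are hypothetical, not taken from the specification:

```python
# Hypothetical pre-stored button correspondence between handle models
# (cf. figs. 18-19). Keys are (sender handle model, sender key); the value
# is the equivalent key on the receiving handle.
BUTTON_MAP = {
    ("handle_model_X", "O"): "A",
    ("handle_model_X", "X"): "B",
}

def match_operation(sender_model: str, sender_key: str,
                    receiver_model: str) -> str:
    """Translate a key press on the second device's handle into the
    equivalent key on the first device's handle."""
    if sender_model == receiver_model:
        return sender_key  # identical handles: apply the key as-is
    return BUTTON_MAP[(sender_model, sender_key)]

# The "O" key on the second device's handle maps to "A" on the first's.
print(match_operation("handle_model_X", "O", "handle_model_Y"))  # -> A
```

If no entry exists for a key, a real implementation would fall back to ignoring the operation or prompting the user, which the specification leaves open.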
When the user of the second terminal device instructs the user of the first terminal device to perform an operation, if the operation is a gesture operation, as shown in fig. 21, the user of the second terminal device may first select an operation object. An image capture device of the second terminal device (e.g., a front camera) then captures the gesture of the user of the second terminal device, the second terminal device sends the information of the operation object and the captured gesture operation to the first terminal device, and the first terminal device processes the operation object accordingly based on the received operation information. It should be understood that, since the first terminal device processes the operation object, the content currently provided by the first terminal device for the user also changes; therefore, steps S10 and S20 may be executed again, and the effect produced by the operation may be fed back to the user of the second terminal device in real time. In addition, the gesture operation of the user of the second terminal device may be displayed to the user of the first terminal device, so that the user of the first terminal device can learn the specific operation manner.
As an example, in an actual scenario, when the user of the second terminal device operates a selected operation object, the second terminal device may filter out redundant operations according to the operation types the operation object can accept. Because gestures are continuous, redundant gesture operations often occur while the user of the second terminal device is making a gesture, and other incidental actions and behaviors may also be recognized. For example, in a billiards game, when a ball-hitting action is completed, the actions captured by the camera may include the ball-hitting action, a bending-over action, and an aiming action, but the operated object, the billiard ball, is known from its operation attributes to accept only the ball-hitting action, so redundant operations such as bending over and aiming can be filtered out.
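The filtering step can be sketched as intersecting the recognized gestures with the operation types the object accepts. The object name and gesture labels below are illustrative, not defined in the specification:

```python
# Hypothetical table of operation types each object accepts
# (the "operation attribute" of the object in the billiards example).
ACCEPTED_OPS = {"billiard_ball": {"hit"}}

def filter_gestures(target: str, recognized: list) -> list:
    """Keep only the recognized gestures the target object accepts,
    discarding redundant incidental actions."""
    accepted = ACCEPTED_OPS.get(target, set())
    return [g for g in recognized if g in accepted]

# The camera captured three actions, but only the hit is forwarded.
captured = ["bend_over", "aim", "hit"]
print(filter_gestures("billiard_ball", captured))  # -> ['hit']
```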
In addition, the method for implementing terminal device interaction according to the exemplary embodiment of the present invention shown in fig. 1, 12 and 17 may further include: responding to the operation of a user on an external operation device of the first terminal device, and acquiring the position of an operation focus of the external operation device in the content currently provided for the user; and when the external operation equipment is detected to have invalid movement, adjusting the current operation focus to the position of the operation focus before movement.
As an example, the invalid movement may be a movement of the external operating device caused by the user of the first terminal device and the user of the second terminal device transferring the external operating device.
For example, when the user of the first terminal device and the user of the second terminal device share the external operation device at close range, in order to show how the user of the second terminal device operates, through the external operation device (e.g., a handle), the content currently provided by the first terminal device for the user, the user of the first terminal device may hand the external operation device to the user of the second terminal device during guidance. The user of the second terminal device then operates the external operation device, and the first terminal device displays the operation result in real time and may display the specific operations the user of the second terminal device performs on the external operation device.
However, during the transfer of the external operation device (i.e., the invalid movement), the operation focus of the external operation device may move. Therefore, when an invalid movement of the external operation device is detected, the current operation focus may be adjusted back to the position of the operation focus before the movement.
For example, when the user of the first terminal device hands the external operation device to the user of the second terminal device, the user of the first terminal device may press a certain set key on the external operation device before the hand-over. When the first terminal device receives the key signal, it may record the position of the current operation focus. After the hand-over is completed, the user of the second terminal device may press the set key again; the first terminal device thereby learns that the hand-over is completed (that is, the external operation device has undergone an invalid movement) and adjusts the current operation focus of the external operation device to the recorded position (that is, the position of the operation focus before the hand-over).
As shown in fig. 22, in the billiards game, the operation focus of the external operation device initially points to the "white ball", but because the data of sensors such as the gyroscope and accelerometer on the external operation device change during the hand-over, the operation focus points to "ball 1" afterwards. At this point, if the operation focus is not adjusted and the user of the second terminal device operates directly, an erroneous operation will result. Therefore, the position of the operation focus after the hand-over can be adaptively adjusted according to its position before the hand-over, so that the two positions are consistent.
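The set-key protocol described above (record focus before the hand-over, restore it after) can be sketched with a small state holder. The class name and focus labels are hypothetical:

```python
# Minimal sketch of the record/restore protocol for the operation focus.
# First press of the set key records the focus; second press restores it,
# discarding any drift caused by the invalid movement in between.
class FocusGuard:
    def __init__(self):
        self.saved = None

    def on_set_key(self, current_focus):
        if self.saved is None:            # first press: hand-over begins
            self.saved = current_focus    # record the focus position
            return current_focus
        restored, self.saved = self.saved, None  # second press: restore
        return restored

guard = FocusGuard()
guard.on_set_key("white_ball")     # before the hand-over: record "white_ball"
# ... sensor data drifts during the hand-over; focus now points at "ball 1" ...
print(guard.on_set_key("ball_1"))  # -> white_ball (drift discarded)
```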
Fig. 23 shows a flowchart of a method of implementing terminal device interaction according to another exemplary embodiment of the present invention. The method may be performed by a server or may be implemented by a computer program. For example, the method may be performed by an application installed on the server for interaction, or by a functional program implemented in the operating system of the server. Steps S10 and S20 may be implemented as described above in conjunction with fig. 1, and are not described again here.
In step S70, before step S10, the content currently provided by the first terminal device for the user is acquired.
Here, before content sharing, the first terminal device and each second terminal device that needs the sharing must each establish a connection with the server, and a second terminal device that needs the sharing may send at least one of the following to the server: identification information of the second terminal device (e.g., an ID), the device attributes of the second terminal device, the network status of the second terminal device, the user attributes corresponding to the second terminal device, and the user requirements corresponding to the second terminal device (e.g., the user interest object corresponding to the second terminal device, the shared area set for the second terminal device, and the like), so as to interact better with the first terminal device.
As an example, the content currently provided by the first terminal device for the user is the content currently displayed by the first terminal device for the user, wherein step S70 may include: acquiring a current panoramic view of a first terminal device; receiving current viewing angle information corresponding to first terminal equipment; and determining the content currently displayed to the user by the first terminal equipment according to the current viewing angle information and the current panoramic view.
As an example, the current panoramic view of the first terminal device may be obtained directly from the first terminal device. As another example, instead of acquiring the current panoramic view from the first terminal device, the server may run the same application as the one currently run by the first terminal device, receive in real time from the first terminal device the operation information of the user operating the content currently provided by the first terminal device for the user, and execute the function corresponding to the received operation information. This ensures that an image consistent with the current panoramic view of the first terminal device can always be obtained, so as to acquire the content currently provided by the first terminal device for the user.
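Step S70 — deriving the currently displayed content from the panoramic view plus the viewing-angle information — can be sketched as cropping a field-of-view window out of the panorama. Modeling the panorama as a 2D pixel grid and the viewing angle as a window center in pixels is an illustrative simplification, not the specification's representation:

```python
# Sketch of step S70: crop the field-of-view window, centered at the
# viewing direction, out of the panoramic view. Horizontal wrap-around
# models a 360-degree panorama.
def displayed_content(panorama, yaw_px, pitch_px, fov_w, fov_h):
    h, w = len(panorama), len(panorama[0])
    rows = range(max(0, pitch_px - fov_h // 2),
                 min(h, pitch_px + fov_h // 2))
    cols = [(yaw_px - fov_w // 2 + i) % w for i in range(fov_w)]
    return [[panorama[r][c] for c in cols] for r in rows]

# 4x10 toy panorama; the 4x2 window at yaw 9 wraps past the right edge.
panorama = [[r * 10 + c for c in range(10)] for r in range(4)]
view = displayed_content(panorama, yaw_px=9, pitch_px=2, fov_w=4, fov_h=2)
print(view)  # -> [[17, 18, 19, 10], [27, 28, 29, 20]]
```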
Moreover, the sharing modes between the first terminal device and the second terminal device can be divided into the following:
(a) One-to-one interaction mode: a first terminal device interacts with a second terminal device.
(b) One-to-many interaction mode: one first terminal device interacts with a plurality of second terminal devices, that is, the plurality of second terminal devices obtain the content of the same first terminal device through the server.
(c) Many-to-one interaction mode: a plurality of first terminal devices interact with one second terminal device. The second terminal device can acquire the contents of the plurality of first terminal devices simultaneously; it may select and display the shared content of any one first terminal device, switching at will, or display and operate the shared content of the plurality of first terminal devices on the same screen simultaneously.
(d) Many-to-many interaction mode: a plurality of first terminal devices interact with a plurality of second terminal devices.
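The four modes can all be expressed by one routing table kept by the server, mapping each sharing first device to its set of receiving second devices; the many-to-one and many-to-many modes simply correspond to overlapping entries. The class and device identifiers below are illustrative:

```python
from collections import defaultdict

# Sketch of the server-side routing table behind modes (a)-(d).
class ShareRouter:
    def __init__(self):
        self.routes = defaultdict(set)  # first device id -> second device ids

    def subscribe(self, first_id, second_id):
        self.routes[first_id].add(second_id)

    def receivers(self, first_id):
        """Second devices that receive content from this first device."""
        return sorted(self.routes[first_id])

    def sources(self, second_id):
        """First devices whose content this second device currently receives."""
        return sorted(f for f, s in self.routes.items() if second_id in s)

router = ShareRouter()
router.subscribe("first_1", "second_A")  # one-to-many: A and B watch first_1
router.subscribe("first_1", "second_B")
router.subscribe("first_2", "second_A")  # many-to-one: A also watches first_2
print(router.receivers("first_1"))       # -> ['second_A', 'second_B']
print(router.sources("second_A"))        # -> ['first_1', 'first_2']
```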
In a first embodiment, a first terminal device shares a picture of a certain video currently viewed by a user of the first terminal device with a second terminal device a and a second terminal device B.
The user of the second terminal device A may preset a specified event: when object 1 appears in the video picture, the picture of the area where object 1 is located is sent to the second terminal device A. When object 1 appears in the content currently provided by the first terminal device for the user, the first terminal device (or the server) recognizes the area where object 1 is located and automatically transmits the content of that area to the second terminal device A. Further, if the user of the second terminal device A enlarges the shared content on the second terminal device A, the second terminal device A transmits information indicating the enlargement operation to the first terminal device (or the server), and the first terminal device (or the server) can transmit a finer-grained (higher-resolution) image of the area to the second terminal device A, thereby satisfying the needs of the user of the second terminal device A.
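The preset specified event of device A can be sketched as a check run on each recognized frame: if the watched object is detected, its region is shared. The object names, bounding-box format, and return shape are hypothetical:

```python
# Sketch of the specified event set by second terminal device A:
# when the watched object is recognized in the current content,
# share the region where it is located.
def check_specified_event(detected_objects, watched="object_1"):
    """detected_objects: mapping object name -> bounding box (x, y, w, h).
    Returns a share instruction, or None if the event has not occurred."""
    if watched in detected_objects:
        return {"share_to": "second_A", "region": detected_objects[watched]}
    return None

frame = {"object_3": (0, 0, 50, 50), "object_1": (120, 40, 64, 64)}
print(check_specified_event(frame))
# -> {'share_to': 'second_A', 'region': (120, 40, 64, 64)}
```

A subsequent enlargement operation by the user of device A would then request the same region again at a higher resolution, which this sketch does not model.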
In addition, the first terminal device (or the server) may share the current panoramic view with the second terminal device B in real time, so that the user of the second terminal device B can see it. When the video picture contains too many targets and the user of the first terminal device cannot find target 2, the help of the user of the second terminal device B may be sought. The user of the second terminal device B locates target 2 by viewing the current panoramic view, and the second terminal device B sends the position information of target 2 to the first terminal device, or to the server, which forwards it to the first terminal device. After receiving the position information, the first terminal device may guide its user in moving the viewing angle to the position of target 2; for example, the distance between the user and target 2 may be conveyed by the volume of a prompt tone, and the user of the first terminal device can finally find target 2 by following the prompt.
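The prompt-tone guidance can be sketched as a volume that grows as the viewing angle approaches the target. The one-dimensional angular model and the linear volume scale are illustrative assumptions, not the specification's method:

```python
# Sketch of distance-coded prompt-tone volume: 1.0 when the viewing
# direction is on the target, falling linearly to 0.0 on the opposite side.
# Angles are in degrees on a 360-degree panorama (illustrative model).
def prompt_volume(view_deg, target_deg, max_dist=180.0):
    dist = abs(target_deg - view_deg) % 360
    dist = min(dist, 360 - dist)             # shortest angular distance
    return round(1.0 - dist / max_dist, 2)

print(prompt_volume(30, 30))  # -> 1.0 (looking straight at target 2)
print(prompt_volume(0, 90))   # -> 0.5 (a quarter turn away)
```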
In the second embodiment, the content currently provided by the first terminal device for the user is a scene in a shooting game, and the first terminal device and the second terminal device are connected to handles of different models. When a shooting target appears in the content the first terminal device currently provides for the user, the user of the second terminal device shoots the target in the shared content by using the handle and clicking the "O" key. The second terminal device sends the operation information to the first terminal device (or the server), which parses it and, by comparing the model parameters of the different handles, maps the operation of the handle used by the user of the second terminal device to the "A" key of the handle of the user of the first terminal device; executing the function corresponding to the "A" key then completes the shooting of the target. Therefore, the target shot by the user of the second terminal device can be displayed in the content currently provided by the first terminal device for the user. In addition, the corresponding key on the handle of the first terminal device can be prompted to the user, so that the user of the first terminal device knows how to use his or her own handle.
Fig. 24 is a block diagram illustrating an apparatus for implementing terminal device interaction according to an exemplary embodiment of the present invention. As shown in fig. 24, an apparatus for implementing terminal device interaction according to an exemplary embodiment of the present invention includes a shared content determining unit 101 and a sharing unit 102.
Specifically, the shared content determining unit 101 is configured to determine shared content to be shared to the second terminal device among the content currently provided by the first terminal device for the user.
The sharing unit 102 is configured to send the determined shared content to the second terminal device for sharing.
As an example, the first terminal device and/or the second terminal device is a virtual reality device.
As an example, the shared content determining unit 101 may determine the shared content to be shared to the second terminal device according to at least one of: a user interest object corresponding to the first terminal device, content corresponding to a shared area, content corresponding to a specified event, a user interest object corresponding to the second terminal device, changed content containing valid content, and an operation of the user of the second terminal device on the received shared content.
As an example, the shared content determining unit 101 may determine the user interest object corresponding to the first terminal device according to at least one of the following items: the user behavior corresponding to the first terminal device, the incidence relation among the objects in the content provided for the user currently, the scene corresponding to the content provided for the user currently, and the user attribute corresponding to the first terminal device.
As an example, the shared content determining unit 101 may determine the user interest object corresponding to the second terminal device according to at least one of the following items: the user behavior corresponding to the second terminal device and the user attribute corresponding to the second terminal device.
As an example, the shared area may include at least one of the following areas: a shared area set for the first terminal device, a shared area set for the second terminal device, an area where the user interest object corresponding to the first terminal device is located, and an area where the user interest object corresponding to the second terminal device is located.
As an example, the specified event may include at least one of a system related event, a content related event.
As an example, the changed content may be the partial content in which the content currently provided for the user differs from the shared content transmitted last time, wherein the shared content determining unit 101 may determine that valid content is included in the changed content when at least one of the following conditions is satisfied: the ratio of the changed content to the content currently provided for the user is not less than a first preset threshold; the changed content comprises content corresponding to the shared area; the changed content comprises content corresponding to a specified event; the changed content comprises a user interest object corresponding to the first terminal device and/or the second terminal device; the changed content comprises an operable object.
As an example, the shared content determining unit 101 may determine shared content to be shared to the second terminal device among the content that the first terminal device currently provides for the user, when it is determined that at least one of the following is satisfied: the movement speed of the first terminal equipment is not greater than a second preset threshold value; the content currently provided for the user includes at least one of content corresponding to the shared area, content corresponding to the specified event, a user interest object corresponding to the first terminal device, a user interest object corresponding to the second terminal device, an operable object, and changed content including effective content.
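The trigger conditions above can be sketched as a predicate evaluated before the shared content determining unit runs: sharing proceeds when at least one condition holds. The threshold value and flag names are illustrative:

```python
# Sketch of the sharing trigger: at least one of the listed conditions
# suffices — the device is moving slowly enough, or the current content
# contains at least one share-worthy item. Threshold is hypothetical.
def should_determine_shared_content(move_speed, speed_threshold,
                                    content_flags):
    """content_flags: the share-worthy items present in the current
    content, e.g. {'shared_area', 'interest_object', 'operable_object'}."""
    return move_speed <= speed_threshold or bool(content_flags)

print(should_determine_shared_content(0.2, 0.5, set()))               # -> True
print(should_determine_shared_content(0.9, 0.5, {"interest_object"}))  # -> True
print(should_determine_shared_content(0.9, 0.5, set()))               # -> False
```

Treating the conditions as alternatives follows the "at least one of" wording; an implementation could equally require both, which the specification does not mandate.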
As an example, the sharing unit 102 may reconstruct the shared content in a preset manner and transmit the reconstructed content to the second terminal device for sharing.
As an example, the preset manner may include at least one of: acquiring content in a minimum area including the shared content; adjusting the shared content according to the equipment attribute of the second terminal equipment; reconstructing the shared content according to the redundant area; adjusting the shared content according to the current network condition; correcting the shared content based on the user attribute corresponding to the first terminal equipment and/or the user attribute corresponding to the second terminal equipment; filtering the shared content based on a filtering condition; and optimizing the shared content according to the operation of the user of the first terminal equipment and/or the second terminal equipment aiming at the shared content.
As an example, the apparatus for implementing terminal device interaction according to an exemplary embodiment of the present invention may further include: a receiving unit (not shown) and a viewing angle converting unit (not shown).
The receiving unit is used for receiving from the second terminal equipment: the user selects position information of a target object from the received shared content, and/or a motion path from a current viewing perspective of the first terminal device to a viewing perspective corresponding to the target object.
The visual angle conversion unit is used for guiding the user of the first terminal device to move the viewing visual angle and/or automatically switching the viewing visual angle of the user of the first terminal device according to the received position information and/or the received motion path.
As an example, the apparatus for implementing terminal device interaction according to an exemplary embodiment of the present invention may further include: an operation information receiving unit (not shown) and an execution unit (not shown).
The operation information receiving unit is used for receiving operation information of the user for operating the received shared content from the second terminal equipment.
The execution unit is used for executing the function corresponding to the received operation information on the content which is currently provided for the user.
As an example, the apparatus for implementing terminal device interaction according to an exemplary embodiment of the present invention may further include: a focus acquisition unit (not shown) and a focus adjustment unit (not shown).
The focus obtaining unit is used for responding to the operation of the user on the external operation equipment of the first terminal equipment, and obtaining the position of the operation focus of the external operation equipment in the content currently provided for the user. The focus adjusting unit is used for adjusting the current operation focus to the position of the operation focus before movement when the external operation equipment is detected to have invalid movement.
As an example, the apparatus for implementing terminal device interaction according to an exemplary embodiment of the present invention may further include: an acquisition unit (not shown).
The obtaining unit is used for obtaining the content which is currently provided by the first terminal device for the user before the shared content determining unit determines the shared content to be shared by the second terminal device in the content which is currently provided by the first terminal device for the user.
As an example, the content currently provided by the first terminal device to the user is the content currently displayed by the first terminal device to the user, wherein the obtaining unit may include: a panorama view acquiring unit (not shown), a viewing angle information receiving unit (not shown), and a determining unit (not shown).
The panoramic view acquisition unit is used for acquiring the current panoramic view of the first terminal equipment. The viewing angle information receiving unit is used for receiving current viewing angle information corresponding to the first terminal device. The determining unit is used for determining the content currently displayed to the user by the first terminal device according to the current viewing angle information and the current panoramic view.
As an example, the apparatus for implementing terminal device interaction according to an exemplary embodiment of the present invention may further include: an operation information receiving unit (not shown) and an execution unit (not shown).
The operation information receiving unit is used for receiving operation information of a user for operating the content currently provided for the user by the first terminal equipment from the first terminal equipment. The execution unit is used for executing the function corresponding to the received operation information so as to acquire the content currently provided by the first terminal device for the user.
It should be understood that specific implementation manners of the apparatus for implementing terminal device interaction according to the exemplary embodiment of the present invention may be implemented with reference to the related specific implementation manners described in conjunction with fig. 1 to fig. 23, and are not described herein again.
Fig. 25 illustrates a block diagram of a terminal device according to an exemplary embodiment of the present invention. As shown in fig. 25, the terminal device according to the exemplary embodiment of the present invention includes a shared content determining unit 201 and a sharing unit 202.
Hereinafter, the terminal device will be referred to as the first terminal device. It should be understood that, in addition to the shared content determining unit 201 and the sharing unit 202, the first terminal device may include other components that perform its own functions as a terminal device, such as a display device.
Specifically, the shared content determining unit 201 is configured to determine shared content to be shared to the second terminal device among the content currently provided by the first terminal device for the user.
The sharing unit 202 is configured to send the determined shared content to the second terminal device for sharing.
As an example, the first terminal device and/or the second terminal device is a virtual reality device.
As an example, the shared content determining unit 201 may determine the shared content to be shared to the second terminal device according to at least one of: a user interest object corresponding to the first terminal device, content corresponding to a shared area, content corresponding to a specified event, a user interest object corresponding to the second terminal device, changed content containing valid content, and an operation of the user of the second terminal device on the received shared content.
As an example, the shared content determining unit 201 may determine the user interest object corresponding to the first terminal device according to at least one of the following items: the user behavior corresponding to the first terminal device, the incidence relation among the objects in the content provided for the user currently, the scene corresponding to the content provided for the user currently, and the user attribute corresponding to the first terminal device.
As an example, the shared content determining unit 201 may determine the user interest object corresponding to the second terminal device according to at least one of: the user behavior corresponding to the second terminal device and the user attribute corresponding to the second terminal device.
As an example, the shared area may include at least one of the following areas: a shared area set for the first terminal device, a shared area set for the second terminal device, an area where the user interest object corresponding to the first terminal device is located, and an area where the user interest object corresponding to the second terminal device is located.
As an example, the specified event may include at least one of a system related event, a content related event.
As an example, the changed content may be the partial content in which the content currently provided for the user differs from the shared content transmitted last time, wherein the shared content determining unit 201 may determine that valid content is included in the changed content when at least one of the following conditions is satisfied: the ratio of the changed content to the content currently provided for the user is not less than a first preset threshold; the changed content comprises content corresponding to the shared area; the changed content comprises content corresponding to a specified event; the changed content comprises a user interest object corresponding to the first terminal device and/or the second terminal device; the changed content comprises an operable object.
As an example, the shared content determining unit 201 may determine shared content to be shared to the second terminal device among the content that the first terminal device currently provides for the user, when it is determined that at least one of the following is satisfied: the movement speed of the first terminal equipment is not greater than a second preset threshold value; the content currently provided for the user includes at least one of content corresponding to the shared area, content corresponding to the specified event, a user interest object corresponding to the first terminal device, a user interest object corresponding to the second terminal device, an operable object, and changed content including effective content.
As an example, the sharing unit 202 may reconstruct the shared content in a preset manner and transmit the reconstructed content to the second terminal device for sharing.
As an example, the preset manner may include at least one of: acquiring content in a minimum area including the shared content; adjusting the shared content according to the equipment attribute of the second terminal equipment; reconstructing the shared content according to the redundant area; adjusting the shared content according to the current network condition; correcting the shared content based on the user attribute corresponding to the first terminal equipment and/or the user attribute corresponding to the second terminal equipment; filtering the shared content based on a filtering condition; and optimizing the shared content according to the operation of the user of the first terminal equipment and/or the second terminal equipment aiming at the shared content.
As an example, the terminal device according to an exemplary embodiment of the present invention may further include: a receiving unit (not shown) and a viewing angle converting unit (not shown).
The receiving unit is used for receiving the following information from the second terminal equipment: the user selects position information of a target object from the received shared content, and/or a motion path from a current viewing perspective of the first terminal device to a viewing perspective corresponding to the target object.
The visual angle conversion unit is used for guiding the user of the first terminal device to move the viewing visual angle and/or automatically switching the viewing visual angle of the user of the first terminal device according to the received position information and/or the received motion path.
As an example, the terminal device according to an exemplary embodiment of the present invention may further include: an operation information receiving unit (not shown) and an execution unit (not shown).
The operation information receiving unit is used for receiving operation information of the user for operating the received shared content from the second terminal equipment.
The execution unit is used for executing the function corresponding to the received operation information on the content which is currently provided for the user.
As an example, the terminal device according to an exemplary embodiment of the present invention may further include: a focus acquisition unit (not shown) and a focus adjustment unit (not shown).
The focus obtaining unit is used for responding to the operation of the user on the external operation equipment of the first terminal equipment, and obtaining the position of the operation focus of the external operation equipment in the content currently provided for the user. The focus adjusting unit is used for adjusting the current operation focus to the position of the operation focus before movement when the external operation equipment is detected to have invalid movement.
It should be understood that the specific implementation of the terminal device according to the exemplary embodiment of the present invention may be implemented with reference to the related specific implementation described in conjunction with fig. 1 to fig. 22, and will not be described herein again.
Fig. 26 illustrates a block diagram of a server according to an exemplary embodiment of the present invention. As shown in fig. 26, the server according to an exemplary embodiment of the present invention includes: an acquisition unit 301, a shared content determination unit 302, and a sharing unit 303.
Specifically, the obtaining unit 301 is configured to obtain content currently provided by the first terminal device for the user.
The shared content determining unit 302 is configured to determine shared content to be shared to the second terminal device among content currently provided by the first terminal device for the user.
The sharing unit 303 is configured to send the determined shared content to the second terminal device for sharing.
As an example, the first terminal device and/or the second terminal device is a virtual reality device.
As an example, the shared content determining unit 302 may determine the shared content to be shared to the second terminal device according to at least one of: the user interest object corresponding to the first terminal device, content corresponding to the shared area, content corresponding to a specified event, the user interest object corresponding to the second terminal device, changed content containing effective content, and an operation of the user of the second terminal device on the received shared content.
As an example, the shared content determining unit 302 may determine the user interest object corresponding to the first terminal device according to at least one of the following: user behavior corresponding to the first terminal device, association relationships among objects in the content currently provided for the user, the scene corresponding to the content currently provided for the user, and user attributes corresponding to the first terminal device.
As an example, the shared content determining unit 302 may determine the user interest object corresponding to the second terminal device according to at least one of the following items: the user behavior corresponding to the second terminal device and the user attribute corresponding to the second terminal device.
As an example, the shared area may include at least one of the following areas: a sharing area set for the first terminal device, a sharing area set for the second terminal device, the area where the user interest object corresponding to the first terminal device is located, and the area where the user interest object corresponding to the second terminal device is located.
As an example, the specified event may include at least one of a system related event, a content related event.
As an example, the changed content may be the part of the content currently provided for the user that has changed relative to the shared content transmitted last time. The shared content determining unit 302 may determine that the changed content contains effective content when at least one of the following conditions is satisfied: the ratio of the changed content to the content currently provided for the user is not less than a first preset threshold; the changed content includes content corresponding to the shared area; the changed content includes content corresponding to a specified event; the changed content includes a user interest object corresponding to the first terminal device and/or the second terminal device; or the changed content includes an operable object.
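Modelling each piece of content as a set of object identifiers, the effective-content test above can be sketched as follows. The set-based model, the function name, and the default threshold are assumptions for illustration; any single condition suffices.

```python
def contains_effective_content(changed, current,
                               shared_region=frozenset(),
                               event_content=frozenset(),
                               interest_objects=frozenset(),
                               operable_objects=frozenset(),
                               ratio_threshold=0.2):
    """Return True when the changed content should be treated as containing
    effective content, i.e. when at least one listed condition holds."""
    changed, current = set(changed), set(current)
    # Condition 1: the changed portion is a large enough fraction of the
    # content currently provided for the user.
    if current and len(changed) / len(current) >= ratio_threshold:
        return True
    # Conditions 2-5: the change touches the shared area, a specified event,
    # a user interest object, or an operable object.
    for pool in (shared_region, event_content, interest_objects, operable_objects):
        if changed & set(pool):
            return True
    return False
```

Only changes passing this test would trigger a re-share, which is how redundant transmissions are avoided.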
As an example, the shared content determining unit 302 may determine the shared content to be shared to the second terminal device, among the content currently provided by the first terminal device for the user, when at least one of the following is satisfied: the movement speed of the first terminal device is not greater than a second preset threshold; the content currently provided for the user includes at least one of content corresponding to the shared area, content corresponding to the specified event, a user interest object corresponding to the first terminal device, a user interest object corresponding to the second terminal device, an operable object, and changed content containing effective content.
As an example, the sharing unit 303 may reconstruct the shared content in a preset manner and transmit the reconstructed content to the second terminal device for sharing.
As an example, the preset manner may include at least one of: acquiring the content in a minimum area including the shared content; adjusting the shared content according to device attributes of the second terminal device; reconstructing the shared content according to a redundant area; adjusting the shared content according to the current network condition; correcting the shared content based on user attributes corresponding to the first terminal device and/or user attributes corresponding to the second terminal device; filtering the shared content based on a filtering condition; and optimizing the shared content according to operations of the user of the first terminal device and/or the second terminal device on the shared content.
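For example, the first preset manner — transmitting only the minimum area that encloses the shared content — can be sketched as a bounding-box computation over the shared objects. The (x0, y0, x1, y1) box representation is an assumption made for illustration.

```python
def minimal_region(boxes):
    """Minimum axis-aligned region enclosing all shared objects, each given
    as an (x0, y0, x1, y1) box; only this crop, rather than the full frame,
    then needs to be transmitted."""
    xs0, ys0, xs1, ys1 = zip(*boxes)
    return (min(xs0), min(ys0), max(xs1), max(ys1))
```

Cropping to this region before encoding is one concrete way the reconstruction step reduces the amount of data sent to the second terminal device.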
As an example, the server according to an exemplary embodiment of the present invention may further include: a receiving unit (not shown) and a path determining unit (not shown).
The receiving unit is configured to receive the following information from the second terminal device: position information of a target object selected by the user from the received shared content.
The path determining unit is used for determining a motion path from the current viewing angle of the first terminal device to the viewing angle corresponding to the target object according to the received position information, and sending the motion path to the first terminal device.
As an example, the server according to an exemplary embodiment of the present invention may further include: an operation information receiving unit (not shown) and an execution unit (not shown).
The operation information receiving unit is configured to receive, from the second terminal device, operation information describing the user's operation on the received shared content.
The execution unit is configured to execute the function corresponding to the received operation information on the content currently provided for the user, and to send the operation information to the first terminal device.
As an example, the server according to an exemplary embodiment of the present invention may further include: a focus acquisition unit (not shown) and a focus adjustment unit (not shown).
As an example, the content currently provided by the first terminal device for the user is the content currently displayed by the first terminal device for the user, wherein the obtaining unit 301 may include: a panorama view acquiring unit (not shown), a viewing angle information receiving unit (not shown), and a determining unit (not shown).
The panoramic view acquisition unit is configured to acquire the current panoramic view of the first terminal device. The viewing angle information receiving unit is configured to receive current viewing angle information corresponding to the first terminal device. The determining unit is configured to determine, according to the current viewing angle information and the current panoramic view, the content currently displayed by the first terminal device to the user.
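Assuming an equirectangular panorama, the determination above amounts to mapping the viewing angle and field of view onto a pixel range of the panoramic view. The following sketch (hypothetical names, horizontal axis only) shows the idea; a full implementation would also handle pitch.

```python
def viewport_columns(pano_width, yaw_deg, fov_deg):
    """Column range of an equirectangular panorama covered by the current
    viewing yaw and horizontal field of view (wraps around at 360 degrees)."""
    deg_per_px = 360.0 / pano_width
    center = (yaw_deg % 360.0) / deg_per_px
    half = (fov_deg / 2.0) / deg_per_px
    left = int(center - half) % pano_width
    right = int(center + half) % pano_width
    return left, right
```

When `left > right` the viewport straddles the panorama seam, so the displayed content is assembled from the two wrapped column ranges.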
As an example, the server according to an exemplary embodiment of the present invention may further include: an operation information receiving unit (not shown) and an execution unit (not shown).
The operation information receiving unit is configured to receive, from the first terminal device, operation information describing the user's operation on the content currently provided by the first terminal device for the user. The execution unit is configured to execute the function corresponding to the received operation information so as to acquire the content currently provided by the first terminal device for the user.
It should be understood that the specific implementation of the server according to the exemplary embodiment of the present invention may be implemented with reference to the related specific implementation described in conjunction with fig. 1 to fig. 23, and details are not described herein again.
Fig. 27 shows a block diagram of a terminal device according to another exemplary embodiment of the present invention. As shown in fig. 27, a terminal device according to an exemplary embodiment of the present invention includes: a shared content receiving unit 401 and a display unit 402. Here, the terminal device may be the second terminal device in the above-described exemplary embodiment.
Specifically, the shared content receiving unit 401 is configured to receive shared content sent by the first terminal device, where the shared content is determined from content currently provided by the first terminal device for the user.
The display unit 402 is configured to display the received shared content.
As an example, the terminal device according to another exemplary embodiment of the present invention may further include: a transmitting unit (not shown).
The sending unit is configured to send the following information to the first terminal device: position information of a target object selected by the user from the received shared content, and/or a motion path from the current viewing angle of the first terminal device to the viewing angle corresponding to the target object.
As an example, the terminal device according to another exemplary embodiment of the present invention may further include: an operation receiving unit (not shown).
The operation receiving unit is configured to receive operation information describing the user's operation on the received shared content, and the sending unit sends this operation information to the first terminal device.
It should be understood that the specific implementation of the terminal device according to the exemplary embodiment of the present invention may be implemented with reference to the related specific implementation described in conjunction with fig. 1 to fig. 23, and will not be described herein again.
According to the method and apparatus for implementing terminal device interaction, the terminal device, and the server described above, only the part of the content that a terminal device provides for its user and that actually needs to be shared is sent to other terminal devices. This reduces the transmission of redundant data, lowers the amount of data transferred between devices, saves transmission resources, and improves the real-time performance of inter-device communication. In addition, operation sharing between terminal devices can be realized.
Furthermore, it should be understood that each unit in the apparatus for implementing terminal device interaction, the terminal device, and the server according to the exemplary embodiments of the present invention may be implemented as a hardware component and/or a software component. For example, those skilled in the art may implement each unit using a Field Programmable Gate Array (FPGA) or an Application Specific Integrated Circuit (ASIC), according to the processing performed by that unit.
Further, the method for implementing terminal device interaction according to an exemplary embodiment of the present invention may be implemented as computer code on a computer-readable recording medium. Those skilled in the art can implement the computer code from the description of the method above; when executed on a computer, the computer code carries out the above-described method of the present invention.
Although a few exemplary embodiments of the present invention have been shown and described, it would be appreciated by those skilled in the art that changes may be made in these embodiments without departing from the principles and spirit of the invention, the scope of which is defined in the claims and their equivalents.
Claims (47)
1. A method for implementing terminal device interaction, comprising the following steps:
(A) Determining shared content to be shared to a second terminal device in the content currently provided by a first terminal device for a user;
(B) Sending the determined shared content to the second terminal equipment for sharing,
wherein the method further comprises:
receiving, from the second terminal device: position information of a target object selected by the user from the received shared content, and/or a motion path from the current viewing angle of the first terminal device to the viewing angle corresponding to the target object;
and guiding the user of the first terminal device to move the viewing angle, and/or automatically switching the viewing angle of the user of the first terminal device, according to the received position information and/or motion path.
2. The method of claim 1, wherein the first terminal device and/or the second terminal device is a virtual reality device.
3. The method of claim 1, wherein the shared content to be shared to the second terminal device is determined according to at least one of: the user interest object corresponding to the first terminal device, content corresponding to the shared area, content corresponding to a specified event, the user interest object corresponding to the second terminal device, changed content containing effective content, and an operation of the user of the second terminal device on the received shared content,
wherein the specified event comprises at least one of a system-related event and a content-related event;
wherein the changed content is determined to contain effective content when at least one of the following conditions is satisfied: the ratio of the changed content to the content currently provided for the user is not less than a first preset threshold; the changed content includes content corresponding to the shared area; the changed content includes content corresponding to a specified event; the changed content includes a user interest object corresponding to the first terminal device and/or the second terminal device; or the changed content includes an operable object.
4. The method according to claim 3, wherein the user interest object corresponding to the first terminal device is determined according to at least one of: user behavior corresponding to the first terminal device, association relationships among objects in the content currently provided for the user, the scene corresponding to the content currently provided for the user, and user attributes corresponding to the first terminal device;
and/or the user interest object corresponding to the second terminal device is determined according to at least one of: user behavior corresponding to the second terminal device and user attributes corresponding to the second terminal device;
and/or the shared area comprises at least one of the following areas: a sharing area set for the first terminal device, a sharing area set for the second terminal device, the area where the user interest object corresponding to the first terminal device is located, and the area where the user interest object corresponding to the second terminal device is located;
and/or the changed content is the part of the content currently provided for the user that has changed relative to the shared content sent last time.
5. The method of claim 3, wherein,
performing step (A) when at least one of the following is satisfied:
the movement speed of the first terminal equipment is not greater than a second preset threshold value;
the content currently provided for the user includes at least one of content corresponding to the shared area, content corresponding to the specified event, a user interest object corresponding to the first terminal device, a user interest object corresponding to the second terminal device, an operable object, and changed content including effective content.
6. The method of any one of claims 1 to 5, wherein step (B) comprises:
reconstructing the shared content in a preset manner, and sending the reconstructed content to the second terminal device for sharing.
7. The method of claim 6, wherein the preset manner comprises at least one of:
acquiring the content in a minimum area including the shared content;
adjusting the shared content according to device attributes of the second terminal device;
reconstructing the shared content according to a redundant area;
adjusting the shared content according to the current network condition;
correcting the shared content based on user attributes corresponding to the first terminal device and/or user attributes corresponding to the second terminal device;
filtering the shared content based on a filtering condition;
and optimizing the shared content according to operations of the user of the first terminal device and/or the second terminal device on the shared content.
8. The method of any of claims 1 to 5, further comprising:
receiving, from the second terminal device, operation information describing the user's operation on the received shared content;
and executing the function corresponding to the received operation information on the content currently provided for the user.
9. The method of any of claims 1 to 5, further comprising:
in response to an operation of the user on an external operation device of the first terminal device, acquiring the position of the operation focus of the external operation device within the content currently provided for the user;
and, when an invalid movement of the external operation device is detected, restoring the current operation focus to its position before the movement.
10. The method of any of claims 1 to 5, further comprising:
before the step (A), acquiring the content currently provided by the first terminal device for the user.
11. The method of claim 10, wherein the content currently provided by the first terminal device to the user is content currently displayed by the first terminal device to the user, and wherein the step of obtaining the content currently provided by the first terminal device to the user comprises:
acquiring a current panoramic view of the first terminal device;
receiving current viewing angle information corresponding to the first terminal device;
and determining, according to the current viewing angle information and the current panoramic view, the content currently displayed by the first terminal device to the user.
12. The method of claim 10, further comprising:
receiving, from the first terminal device, operation information describing the user's operation on the content currently provided by the first terminal device for the user;
and executing a function corresponding to the received operation information to acquire the content currently provided by the first terminal device for the user.
13. An apparatus for implementing terminal device interaction, comprising:
a shared content determining unit that determines, among the content currently provided by the first terminal device for the user, shared content to be shared to the second terminal device;
a sharing unit that sends the determined shared content to the second terminal device for sharing,
wherein the apparatus further comprises:
a receiving unit that receives, from the second terminal device: position information of a target object selected by the user from the received shared content, and/or a motion path from the current viewing angle of the first terminal device to the viewing angle corresponding to the target object;
and a viewing angle conversion unit that guides the user of the first terminal device to move the viewing angle, and/or automatically switches the viewing angle of the user of the first terminal device, according to the received position information and/or motion path.
14. The apparatus of claim 13, wherein the first terminal device and/or the second terminal device is a virtual reality device.
15. The apparatus according to claim 13, wherein the shared content determining unit determines the shared content to be shared to the second terminal device according to at least one of: the user interest object corresponding to the first terminal device, the content corresponding to the shared area, the content corresponding to the designated event, the user interest object corresponding to the second terminal device, the changed content containing the effective content, the operation of the user of the second terminal device on the received shared content,
wherein the specified event comprises at least one of a system-related event and a content-related event;
wherein the shared content determining unit determines that the changed content contains effective content when at least one of the following conditions is satisfied: the ratio of the changed content to the content currently provided for the user is not less than a first preset threshold; the changed content includes content corresponding to the shared area; the changed content includes content corresponding to a specified event; the changed content includes a user interest object corresponding to the first terminal device and/or the second terminal device; or the changed content includes an operable object.
16. The apparatus of claim 15, wherein the shared content determining unit determines the user interest object corresponding to the first terminal device according to at least one of: user behavior corresponding to the first terminal device, association relationships among objects in the content currently provided for the user, the scene corresponding to the content currently provided for the user, and user attributes corresponding to the first terminal device;
and/or the shared content determining unit determines the user interest object corresponding to the second terminal device according to at least one of the following items: the user behavior corresponding to the second terminal equipment and the user attribute corresponding to the second terminal equipment;
and/or the shared area comprises at least one of the following areas: a sharing area set for the first terminal device, a sharing area set for the second terminal device, the area where the user interest object corresponding to the first terminal device is located, and the area where the user interest object corresponding to the second terminal device is located;
and/or the changed content is the part of the content currently provided for the user that has changed relative to the shared content sent last time.
17. The apparatus of claim 15, wherein,
the shared content determination unit determines shared content to be shared to the second terminal device among the content currently provided by the first terminal device for the user when it is determined that at least one of the following is satisfied:
the movement speed of the first terminal equipment is not greater than a second preset threshold value;
the content currently provided for the user includes at least one of content corresponding to the shared area, content corresponding to the specified event, a user interest object corresponding to the first terminal device, a user interest object corresponding to the second terminal device, an operable object, and changed content including effective content.
18. The apparatus of any one of claims 13 to 17,
wherein the sharing unit reconstructs the shared content in a preset manner and sends the reconstructed content to the second terminal device for sharing.
19. The apparatus of claim 18, wherein the preset manner comprises at least one of:
acquiring the content in a minimum area including the shared content;
adjusting the shared content according to device attributes of the second terminal device;
reconstructing the shared content according to a redundant area;
adjusting the shared content according to the current network condition;
correcting the shared content based on user attributes corresponding to the first terminal device and/or user attributes corresponding to the second terminal device;
filtering the shared content based on a filtering condition;
and optimizing the shared content according to operations of the user of the first terminal device and/or the second terminal device on the shared content.
20. The apparatus of any of claims 13 to 17, further comprising:
an operation information receiving unit that receives, from the second terminal device, operation information describing the user's operation on the received shared content;
and an execution unit that executes the function corresponding to the received operation information on the content currently provided for the user.
21. The apparatus of any of claims 13 to 17, further comprising:
a focus acquisition unit that, in response to an operation of the user on an external operation device of the first terminal device, acquires the position of the operation focus of the external operation device within the content currently provided for the user;
and a focus adjustment unit that, when an invalid movement of the external operation device is detected, restores the current operation focus to its position before the movement.
22. The apparatus of any of claims 13 to 17, further comprising:
an acquisition unit that acquires the content currently provided by the first terminal device for the user before the shared content determining unit determines, among that content, the shared content to be shared to the second terminal device.
23. The apparatus of claim 22, wherein the content currently provided by the first terminal device to the user is content currently displayed by the first terminal device to the user, and wherein the obtaining unit comprises:
a panoramic view acquisition unit that acquires the current panoramic view of the first terminal device;
a viewing angle information receiving unit that receives current viewing angle information corresponding to the first terminal device;
and a determining unit that determines, according to the current viewing angle information and the current panoramic view, the content currently displayed by the first terminal device to the user.
24. The apparatus of claim 22, further comprising:
an operation information receiving unit that receives, from the first terminal device, operation information describing the user's operation on the content currently provided by the first terminal device for the user;
and an execution unit that executes a function corresponding to the received operation information so as to acquire the content currently provided by the first terminal device for the user.
25. A terminal device, comprising:
a shared content determining unit that determines shared content to be shared to another terminal device among contents currently provided by the terminal device for a user;
a sharing unit that transmits the determined shared content to the other terminal device for sharing,
wherein the terminal device further comprises:
a receiving unit that receives, from the other terminal device: position information of a target object selected by the user from the received shared content, and/or a motion path from the current viewing angle of the terminal device to the viewing angle corresponding to the target object;
and a viewing angle conversion unit that guides the user of the terminal device to move the viewing angle, and/or automatically switches the viewing angle of the user of the terminal device, according to the received position information and/or motion path.
26. The terminal device of claim 25, wherein the terminal device and/or the another terminal device is a virtual reality device.
27. The terminal device according to claim 25, wherein the shared content determining unit determines the shared content to be shared to the other terminal device according to at least one of: the user interest object corresponding to the terminal device, content corresponding to the shared area, content corresponding to a specified event, the user interest object corresponding to the other terminal device, changed content containing effective content, and an operation of the user of the other terminal device on the received shared content,
wherein the specified event comprises at least one of a system-related event and a content-related event;
wherein the shared content determining unit determines that the changed content contains effective content when at least one of the following conditions is satisfied: the ratio of the changed content to the content currently provided for the user is not less than a first preset threshold; the changed content includes content corresponding to the shared area; the changed content includes content corresponding to a specified event; the changed content includes a user interest object corresponding to the terminal device and/or the other terminal device; or the changed content includes an operable object.
28. The terminal device of claim 27, wherein the shared content determining unit determines the user interest object corresponding to the terminal device according to at least one of: user behavior corresponding to the terminal device, association relationships among objects in the content currently provided for the user, the scene corresponding to the content currently provided for the user, and user attributes corresponding to the terminal device;
and/or the shared content determining unit determines the user interest object corresponding to the other terminal equipment according to at least one of the following items: the user behavior corresponding to the other terminal device and the user attribute corresponding to the other terminal device;
and/or, the shared region comprises at least one of the following regions: a sharing area set for the terminal device, a sharing area set for the other terminal device, an area where a user interested object corresponding to the terminal device is located, and an area where a user interested object corresponding to the other terminal device is located;
and/or the changed content is the part of the content currently provided for the user that has changed relative to the shared content sent last time.
29. The terminal device of claim 27, wherein,
the shared content determining unit determines the shared content to be shared to the other terminal device, among the content currently provided by the terminal device for the user, when it is determined that at least one of the following is satisfied:
the movement speed of the terminal equipment is not greater than a second preset threshold value;
the content currently provided for the user includes at least one of content corresponding to a shared area, content corresponding to a specified event, a user interest object corresponding to the terminal device, a user interest object corresponding to the other terminal device, an operable object, and changed content including valid content.
30. The terminal device of any of claims 25 to 29,
wherein the sharing unit reconstructs the shared content in a preset manner and sends the reconstructed content to the other terminal device for sharing.
31. The terminal device of claim 30, wherein the preset manner comprises at least one of:
acquiring content in a minimum area containing the shared content;
adjusting the shared content according to a device attribute of the other terminal device;
reconstructing the shared content according to a redundant area;
adjusting the shared content according to a current network condition;
correcting the shared content based on the user attribute corresponding to the terminal device and/or the user attribute corresponding to the other terminal device;
filtering the shared content based on a filtering condition;
and optimizing the shared content according to operations performed on the shared content by the user of the terminal device and/or of the other terminal device.
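As a non-limiting illustration of the "minimum area" and "device attribute" items of the preset manner above, the following Python sketch crops the smallest rectangle covering the shared objects and computes a scale factor for the receiving device's display. All function names, data shapes, and values here are assumptions for illustration, not part of the claims.

```python
# Hypothetical sketch: reconstruct shared content by (1) taking the
# minimal region containing the shared objects and (2) scaling it to
# fit the receiving device's display. Boxes are (x1, y1, x2, y2).

def minimal_region(boxes):
    """Smallest axis-aligned rectangle covering all bounding boxes."""
    xs1, ys1, xs2, ys2 = zip(*boxes)
    return (min(xs1), min(ys1), max(xs2), max(ys2))

def fit_to_device(region, device_w, device_h):
    """Scale factor so the cropped region fits the target display."""
    x1, y1, x2, y2 = region
    w, h = x2 - x1, y2 - y1
    return min(device_w / w, device_h / h, 1.0)  # never upscale

shared_boxes = [(100, 50, 300, 200), (250, 180, 400, 260)]
region = minimal_region(shared_boxes)      # minimal covering rectangle
scale = fit_to_device(region, 1920, 1080)  # fit a 1920x1080 display
```

A real implementation would additionally apply the network-condition, redundant-area, and filtering adjustments listed in the claim.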
32. The terminal device of any of claims 25 to 29, further comprising:
an operation information receiving unit that receives, from the other terminal device, operation information indicating an operation performed by a user on the received shared content;
and an execution unit that executes a function corresponding to the received operation information on the content currently provided for the user.
33. The terminal device of any of claims 25 to 29, further comprising:
a focus acquisition unit configured to, in response to a user operation on an external operation device of the terminal device, acquire the position of the operation focus of the external operation device within the content currently provided for the user;
and a focus adjusting unit configured to, when an invalid movement of the external operation device is detected, restore the current operation focus to the position of the operation focus before the movement.
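The focus-restoration behavior described above can be sketched as follows. The "invalid movement" criterion used here (an implausibly large jump within one sampling interval) and the threshold value are assumptions; the claim does not specify how invalid movement is detected.

```python
# Hypothetical sketch of focus restoration: if an external controller's
# movement is judged invalid (here: a jump larger than an assumed
# threshold in one sample), keep the focus at its pre-movement position.

INVALID_JUMP = 500  # pixels per sample; assumed threshold

class FocusTracker:
    def __init__(self, x=0, y=0):
        self.x, self.y = x, y

    def move(self, nx, ny):
        # Treat an implausibly large jump as invalid movement and
        # restore the operation focus to where it was before the move.
        if abs(nx - self.x) + abs(ny - self.y) > INVALID_JUMP:
            return (self.x, self.y)  # focus stays put
        self.x, self.y = nx, ny
        return (self.x, self.y)

t = FocusTracker(100, 100)
t.move(120, 110)          # small move: focus follows the controller
pos = t.move(5000, 5000)  # invalid jump: focus stays at (120, 110)
```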
34. A terminal device, comprising:
a shared content receiving unit configured to receive shared content sent by another terminal device, wherein the shared content is determined from the content currently provided by the other terminal device for a user;
a display unit that displays the received shared content;
and a transmitting unit configured to transmit, to the other terminal device, at least one of: position information of a target object selected by the user from the received shared content, and a motion path from the current viewing angle of the other terminal device to the viewing angle corresponding to the target object.
35. The terminal device of claim 34, further comprising:
an operation receiving unit configured to receive operation information indicating an operation performed by the user on the received shared content;
wherein the transmitting unit is configured to transmit the operation information to the other terminal device.
36. A server, comprising:
an acquisition unit that acquires the content currently provided by a first terminal device for a user;
a shared content determining unit configured to determine, in the content currently provided by the first terminal device for the user, shared content to be shared to a second terminal device;
a sharing unit configured to send the determined shared content to the second terminal device for sharing,
wherein the server further comprises:
a receiving unit that receives, from the second terminal device, position information of a target object selected by a user from the received shared content;
and a path determining unit configured to determine, according to the received position information, a motion path from the current viewing angle of the first terminal device to the viewing angle corresponding to the target object, and to send the motion path to the first terminal device.
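One way to realize the path determining unit, sketched below, is to interpolate intermediate viewing angles from the first device's current yaw toward the yaw of the selected target object. The step count and the shortest-arc rule are assumptions; the claim leaves the path construction open.

```python
# Hypothetical sketch of motion-path determination: build waypoints of
# viewing angles (in degrees) from the current yaw to the target yaw
# along the shorter arc of the circle.

def motion_path(current_yaw, target_yaw, steps=5):
    """Viewing-angle waypoints from current_yaw to target_yaw (degrees)."""
    # Signed shortest arc in [-180, 180): e.g. 350 -> 20 is +30, not -330.
    delta = (target_yaw - current_yaw + 180) % 360 - 180
    return [round((current_yaw + delta * i / steps) % 360, 2)
            for i in range(1, steps + 1)]

path = motion_path(350, 20, steps=3)  # crosses 0 degrees the short way
```

The server would send such a waypoint list to the first terminal device, which can then animate its viewing angle toward the target object.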
37. The server of claim 36, further comprising:
an operation information receiving unit that receives, from the second terminal device, operation information indicating an operation performed by a user on the received shared content;
and an execution unit that executes a function corresponding to the received operation information on the content currently provided for the user and sends the operation information to the first terminal device.
38. The server of claim 36, wherein the content currently provided by the first terminal device for the user is content currently displayed by the first terminal device to the user, and wherein the acquisition unit comprises:
a panoramic view acquisition unit configured to acquire the current panoramic view of the first terminal device;
a viewing angle information receiving unit configured to receive current viewing angle information corresponding to the first terminal device;
and a determining unit configured to determine, according to the current viewing angle information and the current panoramic view, the content currently displayed by the first terminal device to the user.
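For the determining unit above, a common approach in panoramic (e.g. VR) systems is to map the reported viewing angle onto a sub-region of an equirectangular panorama. The sketch below models only horizontal yaw and field of view; the equirectangular mapping and all parameter names are assumptions, not claimed specifics.

```python
# Hypothetical sketch: derive which pixel columns of an equirectangular
# panorama are visible for a viewport centered at a given yaw with a
# given horizontal field of view.

def visible_columns(pano_width, yaw_deg, fov_deg):
    """(left, right) pixel-column bounds of the visible viewport."""
    left = (yaw_deg - fov_deg / 2) % 360    # left edge, wrapped to [0, 360)
    right = (yaw_deg + fov_deg / 2) % 360   # right edge, wrapped
    to_px = lambda deg: int(deg / 360 * pano_width)
    return to_px(left), to_px(right)

# 4096-px-wide panorama, user looking at yaw 90 deg with a 90 deg FOV:
cols = visible_columns(4096, 90, 90)
```

The server can crop this column range (plus the analogous pitch range) from the panoramic view to recover the content the user currently sees.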
39. The server of claim 36, wherein the first terminal device and/or the second terminal device is a virtual reality device.
40. The server of claim 36, wherein the shared content determining unit determines the shared content to be shared to the second terminal device according to at least one of: a user interest object corresponding to the first terminal device, content corresponding to a shared area, content corresponding to a specified event, a user interest object corresponding to the second terminal device, changed content containing valid content, and an operation performed by the user of the second terminal device on the received shared content,
wherein the specified event comprises at least one of a system-related event and a content-related event;
and wherein the shared content determining unit determines that the changed content contains valid content when at least one of the following conditions is satisfied: the ratio of the changed content to the content currently provided for the user is not less than a first preset threshold; the changed content comprises content corresponding to the shared area; the changed content comprises content corresponding to a specified event; the changed content comprises a user interest object corresponding to the first terminal device and/or the second terminal device; the changed content comprises an operable object.
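The valid-content test enumerated above reduces to an "any of" check over the listed conditions, as in the sketch below. The field names and the 0.2 threshold are assumptions; the claim only requires that at least one condition hold.

```python
# Hypothetical sketch of the valid-content test: changed content is
# worth sharing when at least one of the claimed conditions holds.

FIRST_THRESHOLD = 0.2  # assumed minimum changed-to-total content ratio

def change_is_valid(change):
    return any([
        change["ratio"] >= FIRST_THRESHOLD,  # change is large enough
        change["in_shared_area"],            # touches a shared area
        change["has_specified_event"],       # tied to a specified event
        change["has_interest_object"],       # covers a user interest object
        change["has_operable_object"],       # contains an operable object
    ])

change = {"ratio": 0.05, "in_shared_area": False,
          "has_specified_event": True, "has_interest_object": False,
          "has_operable_object": False}
ok = change_is_valid(change)  # a specified event occurred, so share it
```

Only changes that pass this test would be pushed to the second terminal device, avoiding retransmission of trivial changes.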
41. The server of claim 40, wherein the shared content determining unit determines the user interest object corresponding to the first terminal device according to at least one of: a user behavior corresponding to the first terminal device, association relations among objects in the content currently provided for the user, a scene corresponding to the content currently provided for the user, and a user attribute corresponding to the first terminal device;
and/or the shared content determining unit determines the user interest object corresponding to the second terminal device according to at least one of: a user behavior corresponding to the second terminal device and a user attribute corresponding to the second terminal device;
and/or the shared area comprises at least one of: a sharing area set for the first terminal device, a sharing area set for the second terminal device, an area where the user interest object corresponding to the first terminal device is located, and an area where the user interest object corresponding to the second terminal device is located;
and/or the changed content is the part of the content currently provided for the user that has changed relative to the shared content sent last time.
42. The server of claim 40, wherein the shared content determining unit determines the shared content to be shared to the second terminal device, among the content currently provided by the first terminal device for the user, when at least one of the following is satisfied:
the movement speed of the first terminal device is not greater than a second preset threshold;
the content currently provided for the user includes at least one of: content corresponding to the shared area, content corresponding to the specified event, a user interest object corresponding to the first terminal device, a user interest object corresponding to the second terminal device, an operable object, and changed content containing valid content.
43. The server of any one of claims 36 to 42, wherein the sharing unit reconstructs the shared content in a preset manner and sends the reconstructed content to the second terminal device for sharing.
44. The server of claim 43, wherein the preset manner comprises at least one of:
acquiring content in a minimum area containing the shared content;
adjusting the shared content according to a device attribute of the second terminal device;
reconstructing the shared content according to a redundant area;
adjusting the shared content according to a current network condition;
correcting the shared content based on the user attribute corresponding to the first terminal device and/or the user attribute corresponding to the second terminal device;
filtering the shared content based on a filtering condition;
and optimizing the shared content according to operations performed on the shared content by the user of the first terminal device and/or of the second terminal device.
45. The server of claim 36, further comprising:
an operation information receiving unit that receives, from the first terminal device, operation information indicating an operation performed by a user on the content currently provided by the first terminal device for the user;
and an execution unit that executes a function corresponding to the received operation information, so as to acquire the content currently provided by the first terminal device for the user.
46. A terminal device, wherein the terminal device comprises:
a processor;
a memory storing a computer program which, when executed by the processor, implements the method for realizing terminal device interaction according to any one of claims 1 to 12.
47. A server, wherein the server comprises:
a processor;
a memory storing a computer program which, when executed by the processor, implements the method for realizing terminal device interaction according to any one of claims 1 to 12.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201710087563.2A CN108462729B (en) | 2017-02-17 | 2017-02-17 | Method and device for realizing interaction of terminal equipment, terminal equipment and server |
Publications (2)
Publication Number | Publication Date |
---|---|
CN108462729A CN108462729A (en) | 2018-08-28 |
CN108462729B true CN108462729B (en) | 2023-01-10 |
Family
ID=63221602
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201710087563.2A Active CN108462729B (en) | 2017-02-17 | 2017-02-17 | Method and device for realizing interaction of terminal equipment, terminal equipment and server |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN108462729B (en) |
Families Citing this family (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN111338590A (en) * | 2020-02-19 | 2020-06-26 | 北京翼鸥教育科技有限公司 | Screen sharing initiating and responding method and interaction system |
CN111629452A (en) * | 2020-04-02 | 2020-09-04 | 北京大米科技有限公司 | Data transmission control method and device, storage medium and electronic equipment |
CN114327924A (en) * | 2020-09-30 | 2022-04-12 | 华为终端有限公司 | Terminal equipment interaction method and device |
CN117008777A (en) * | 2020-10-30 | 2023-11-07 | 华为技术有限公司 | Cross-equipment content sharing method, electronic equipment and system |
CN112770159A (en) * | 2020-12-30 | 2021-05-07 | 北京字节跳动网络技术有限公司 | Multi-screen interaction system, method, device, equipment and storage medium |
CN116710979A (en) * | 2021-12-31 | 2023-09-05 | 华为技术有限公司 | Man-machine interaction method, system and processing device |
Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN1845064A (en) * | 2005-04-08 | 2006-10-11 | 佳能株式会社 | Information processing method and apparatus |
CN102656542A (en) * | 2009-12-17 | 2012-09-05 | 微软公司 | Camera navigation for presentations |
CN104090706A (en) * | 2014-07-31 | 2014-10-08 | 北京智谷睿拓技术服务有限公司 | Content obtaining method, content sharing method, content obtaining device and content sharing device |
CN104540012A (en) * | 2015-01-20 | 2015-04-22 | 三星电子(中国)研发中心 | Method, device and terminal for content sharing |
CN106339192A (en) * | 2016-08-24 | 2017-01-18 | 腾讯科技(深圳)有限公司 | Area sharing method and device as well as system |
Family Cites Families (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
GB201001728D0 (en) * | 2010-02-03 | 2010-03-24 | Skype Ltd | Screen sharing |
WO2015054868A1 (en) * | 2013-10-17 | 2015-04-23 | 华为技术有限公司 | Content sharing method and terminal device |
CN105872723A (en) * | 2015-12-28 | 2016-08-17 | 乐视致新电子科技(天津)有限公司 | Video sharing method and device based on virtual reality system |
CN106201259A (en) * | 2016-06-30 | 2016-12-07 | 乐视控股(北京)有限公司 | A kind of method and apparatus sharing full-view image in virtual reality system |
CN106302427B (en) * | 2016-08-09 | 2019-11-29 | 深圳市摩登世纪科技有限公司 | Sharing method and device in reality environment |
CN106331764A (en) * | 2016-09-14 | 2017-01-11 | 乐视控股(北京)有限公司 | Panoramic video sharing method and panoramic video sharing device |
CN106385587B (en) * | 2016-09-14 | 2019-08-02 | 三星电子(中国)研发中心 | Share the method, apparatus and system at virtual reality visual angle |
- 2017-02-17: application CN201710087563.2A (CN), granted as patent CN108462729B, status Active
Also Published As
Publication number | Publication date |
---|---|
CN108462729A (en) | 2018-08-28 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN108462729B (en) | Method and device for realizing interaction of terminal equipment, terminal equipment and server | |
US10832448B2 (en) | Display control device, display control method, and program | |
US11237717B2 (en) | Information processing device and information processing method | |
US10516830B2 (en) | Guided image composition on mobile devices | |
JP2020530631A (en) | Interaction locating methods, systems, storage media, and smart devices | |
CN111580652B (en) | Video playing control method and device, augmented reality equipment and storage medium | |
US20170192500A1 (en) | Method and electronic device for controlling terminal according to eye action | |
US9965039B2 (en) | Device and method for displaying user interface of virtual input device based on motion recognition | |
CN110546601B (en) | Information processing device, information processing method, and program | |
CN111970456B (en) | Shooting control method, device, equipment and storage medium | |
CN113655887B (en) | Virtual reality equipment and static screen recording method | |
CN111045511A (en) | Gesture-based control method and terminal equipment | |
EP2939411B1 (en) | Image capture | |
US20220191577A1 (en) | Changing Resource Utilization associated with a Media Object based on an Engagement Score | |
WO2024131669A1 (en) | Photography processing method and electronic device | |
CN111818382B (en) | Screen recording method and device and electronic equipment | |
CN107408186A (en) | The display of privacy content | |
US11756302B1 (en) | Managing presentation of subject-based segmented video feed on a receiving device | |
CN111610886A (en) | Method and device for adjusting brightness of touch screen and computer readable storage medium | |
CN111782053B (en) | Model editing method, device, equipment and storage medium | |
CN113835664A (en) | Information processing method and device and electronic equipment | |
US12028645B2 (en) | Subject-based smart segmentation of video feed on a transmitting device | |
US9693016B2 (en) | Data processing method, data processing apparatus and electronic device | |
CN115562500B (en) | Method for controlling smart phone through eye movement | |
CN112732088B (en) | Virtual reality equipment and monocular screen capturing method |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||