CN111897437A - Cross-terminal interaction method and device, electronic equipment and storage medium


Info

Publication number
CN111897437A
Authority
CN
China
Prior art keywords
information
display
interface
display interface
motion data
Prior art date
Legal status
Pending
Application number
CN202010839960.2A
Other languages
Chinese (zh)
Inventor
黄剑鑫
Current Assignee
Tencent Technology Shenzhen Co Ltd
Original Assignee
Tencent Technology Shenzhen Co Ltd
Priority date
Filing date
Publication date
Application filed by Tencent Technology Shenzhen Co Ltd filed Critical Tencent Technology Shenzhen Co Ltd
Priority to CN202010839960.2A
Publication of CN111897437A

Classifications

    • G06F3/011 Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G06F3/0481 Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • H04L67/141 Setup of application sessions

Abstract

Embodiments of this application disclose a cross-terminal interaction method and apparatus, an electronic device, and a storage medium. The method includes: establishing a communication connection with a second device based on a preset communication connection mode; acquiring motion data of the second device; and controlling changes of display information in a display interface of a first device according to the motion data. The embodiments of this application improve the simplicity of cross-terminal interaction, offer high applicability, better meet practical application requirements, and improve the user experience.

Description

Cross-terminal interaction method and device, electronic equipment and storage medium
Technical Field
The present application relates to the field of computer technologies, and in particular, to a cross-terminal interaction method and apparatus, an electronic device, and a storage medium.
Background
At present, with the widespread adoption of network technologies and intelligent terminals, interaction between terminals, and between people and terminals, has greatly enriched everyday life.
However, most existing cross-terminal interactions require special devices, such as industrial cameras or infrared sensing devices, to participate in the interaction, which makes the interaction process complex and limits it to a narrow range of scenarios. Moreover, most cross-terminal interactions can only simulate simple operations such as clicking and sliding, resulting in a poor user experience.
Therefore, improving the simplicity, diversity, and convenience of cross-terminal interaction has become an urgent problem to be solved.
Disclosure of Invention
The embodiments of this application provide a cross-terminal interaction method and apparatus, an electronic device, and a storage medium, which can improve the simplicity of cross-terminal interaction and offer high applicability.
In a first aspect, an embodiment of the present application provides a cross-terminal interaction method, where the method includes:
establishing a communication connection with a second device based on a preset communication connection mode;
acquiring motion data of the second device;
and controlling changes of display information in a display interface of a first device according to the motion data.
In a second aspect, an embodiment of the present application provides a cross-terminal interaction device, where the device includes:
a communication unit, configured to establish a communication connection with a second device based on a preset communication connection mode;
an acquisition unit, configured to acquire motion data of the second device;
and a control unit, configured to control changes of display information in a display interface of the first device according to the motion data.
In a third aspect, an embodiment of the present application provides an electronic device, including a processor and a memory, where the processor and the memory are connected to each other;
the memory is used for storing computer programs;
the processor is configured to perform the method provided by the first aspect when the computer program is called.
In a fourth aspect, an embodiment of this application provides a computer-readable storage medium storing a computer program which, when executed by a processor, implements the method provided in the first aspect.
In a fifth aspect, embodiments of the present application provide a computer program product or a computer program comprising computer instructions stored in a computer-readable storage medium. The processor of the electronic device reads the computer instructions from the computer-readable storage medium, and the processor executes the computer instructions to cause the computer device to perform the method provided by the first aspect.
In the embodiments of this application, the first device controls changes of display information in its display interface based on motion data generated when the second device moves. This improves the simplicity of cross-terminal interaction, offers high applicability, better meets practical application requirements, and improves the user experience.
Drawings
To illustrate the technical solutions in the embodiments of this application more clearly, the drawings required in the embodiments are briefly described below. Obviously, the drawings in the following description show only some embodiments of this application, and those of ordinary skill in the art may derive other drawings from them without creative effort.
Fig. 1 is a schematic diagram of device interaction provided by an embodiment of this application;
Fig. 2 is a flowchart of a cross-terminal interaction method provided by an embodiment of this application;
Fig. 3 is a schematic diagram of a first scenario for controlling display information changes provided by an embodiment of this application;
Fig. 4 is a schematic diagram of a second scenario for controlling display information changes provided by an embodiment of this application;
Fig. 5 is a schematic diagram of a third scenario for controlling display information changes provided by an embodiment of this application;
Fig. 6 is a schematic diagram of a fourth scenario for controlling display information changes provided by an embodiment of this application;
Fig. 7 is another flowchart of a cross-terminal interaction method provided by an embodiment of this application;
Fig. 8 is a schematic diagram of a rotation state provided by an embodiment of this application;
Fig. 9 is a schematic diagram of a fifth scenario for controlling display information changes provided by an embodiment of this application;
Fig. 10 is a schematic diagram of a cross-terminal interaction method provided by an embodiment of this application;
Fig. 11 is a scene schematic diagram of a cross-terminal interaction method provided by an embodiment of this application;
Fig. 12 is a schematic structural diagram of a cross-terminal interaction apparatus provided by an embodiment of this application;
Fig. 13 is a schematic structural diagram of an electronic device provided by an embodiment of this application.
Detailed Description
The technical solutions in the embodiments of the present application will be clearly and completely described below with reference to the drawings in the embodiments of the present application, and it is obvious that the described embodiments are only a part of the embodiments of the present application, and not all of the embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present application.
Referring to fig. 1, fig. 1 is a schematic diagram of device interaction provided by an embodiment of this application. As shown in fig. 1, the device 101 may obtain motion data of the device 102 and then control changes of display information in the display interface of the device 101 according to that motion data. The device 101 is a device with a display function, including but not limited to a mobile phone, a tablet computer, a notebook computer, a handheld computer, a television, or an electronic screen with data processing capability. The device 102 may be a mobile phone, a handheld computer, a wearable device, or any other device or electronic device with data transmission capability. The devices 101 and 102 may be determined based on the actual application scenario, which is not limited herein.
The display information in the display interface of the device 101 may be scene content preset by the device 101, including but not limited to game scenes, images, video animations, and web pages displayed by the device 101 in its display interface, and may be specifically determined based on the application scenario of the device 101. The device 101 may control changes of the display information in its display interface based on the motion data of the device 102; for example, the device 101 may control switching and movement of the display information, or changes of some display elements (such as certain scene elements in a game scene), based on changes in the motion data of the device 102.
Optionally, the information displayed in the display interface of the device 101 may be information associated with the device 102, including but not limited to the movement trajectory of the device 102 during motion, the rotation state of the device 102 during motion, and a projection image, stereoscopic graphic, or plane schematic diagram corresponding to the device 102 in the display interface.
It should be particularly noted that the cross-terminal interaction method provided in the embodiments of this application does not limit the interaction scenario between the device 101 and the device 102; it may be any interaction scenario in which a device 101 with a display function and a device 102 with a data transmission function participate, including but not limited to a game interaction scenario in which the device 102 serves as a game control device and the device 101 serves as a game display device, and a test interaction scenario in which the device 102 serves as a device under test and the device 101 serves as a test device.
The cross-terminal interaction method provided by the embodiments of this application can be executed by a device with a display function. For convenience of description, the device with the display function is referred to as a first device, and the device participating in the interaction with the first device is referred to as a second device.
Referring to fig. 2, fig. 2 is a flowchart of a cross-terminal interaction method provided by an embodiment of this application. As shown in fig. 2, the method may include the following steps:
step S201, establishing a communication connection with the second device based on a preset communication connection mode.
In some possible embodiments, the first device may establish a communication connection with the second device based on a preset communication connection mode. The preset communication connection mode includes, but is not limited to, a Bluetooth connection, a network connection, or a connection based on a short-range communication technology such as NFC (Near Field Communication), and may be determined based on the actual device types of the first device and the second device and their interaction scenario, which is not limited herein. For example, if the first device discovers the Bluetooth address of the second device, it establishes a communication connection with the second device based on that Bluetooth address.
Optionally, the first device may establish a communication connection with the second device based on a preset communication protocol, for example, HTTP (Hyper Text Transfer Protocol) or the WebSocket protocol, which is not limited herein. When the WebSocket protocol is used, the first device creates and connects to a WebSocket object, and once the second device is determined to have connected to the WebSocket object, the communication connection between the first device and the second device is established.
Optionally, the first device may establish a communication connection with the second device in other specific ways. For example, the second device may be instructed to access a preset connection by scanning a preset two-dimensional code shown on the display interface or by logging in to a preset website; the communication connection between the first device and the second device is established when both devices access the preset connection.
Optionally, the communication connection with the second device may also be established through an intermediate device, that is, a device that can connect to both the first device and the second device and has data forwarding capability, such as a router or a switch. After the first device and the second device have each established a communication connection with the intermediate device, the communication connection between the first device and the second device is realized via the intermediate device.
It should be particularly noted that the above specific implementations for establishing the communication connection based on a preset communication connection mode are only examples and are not limited herein.
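For illustration only, the following TypeScript sketch shows one possible WebSocket-based setup of the kind described above. The relay server URL, room identifier, and message fields are assumptions introduced for the example, not part of this application; the relay is assumed to forward messages between the two peers in the same room.

```typescript
const ROOM_ID = "demo-room"; // hypothetical shared room identifier

// First device (display side): create the WebSocket object and wait for
// the second device to join.
const displaySocket = new WebSocket(`wss://relay.example.com/${ROOM_ID}?role=display`);
displaySocket.onmessage = (ev: MessageEvent) => {
  const msg = JSON.parse(ev.data as string);
  if (msg.type === "joined") {
    console.log("second device joined; communication connection established");
  }
};

// Second device (controller side): access the same WebSocket object, e.g.
// after scanning a QR code on the display interface that encodes this URL.
const controllerSocket = new WebSocket(`wss://relay.example.com/${ROOM_ID}?role=controller`);
controllerSocket.onopen = () => controllerSocket.send(JSON.stringify({ type: "joined" }));
```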
Step S202, acquiring motion data of the second device.
In some possible embodiments, the motion data of the second device is data generated by the second device during motion, such as data generated when a user shakes the second device from side to side or rotates it, which is not limited herein. The motion data of the second device may be gravity sensing data generated by a gravity sensor built into the second device during its motion, gyroscope data generated by a built-in gyroscope, or movement data generated by a navigation application in the second device during the device's motion, and may be specifically determined based on the specific type of the second device and the data processing capability of the first device, which is not limited herein.
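As a sketch of how such motion data might be sampled and forwarded by a browser-based second device, the following uses the standard deviceorientation event; the message format and the controllerSocket from the previous sketch are assumptions.

```typescript
// Second device: sample orientation angles from the built-in sensors and
// forward them to the first device over the established connection.
window.addEventListener("deviceorientation", (ev: DeviceOrientationEvent) => {
  const sample = {
    type: "motion",
    alpha: ev.alpha ?? 0, // rotation about the z-axis, in degrees
    beta: ev.beta ?? 0,   // rotation about the x-axis, in degrees
    gamma: ev.gamma ?? 0, // rotation about the y-axis, in degrees
    t: Date.now(),        // sampling timestamp
  };
  controllerSocket.send(JSON.stringify(sample));
});
```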
Step S203, controlling changes of the display information in the display interface of the first device according to the motion data.
In some possible embodiments, when controlling changes of the display information in the display interface of the first device based on the motion data of the second device, the first device may convert the motion data into interface control information of the display interface and then control changes of the display information in the display interface according to that interface control information.
Specifically, when the display information in the display interface of the first device is information associated with the second device, the first device may convert the motion data of the second device into interface control information that keeps the associated information changing in step with the motion of the second device, and then control the display information in the display interface accordingly based on that interface control information.
For example, referring to fig. 3, fig. 3 is a schematic diagram of a first scenario for controlling display information changes provided by an embodiment of this application. In the scenario shown in fig. 3, the second device is a handheld device, the first device is a display device, and the first device displays a stereoscopic graphic corresponding to the second device in its display interface; that is, the stereoscopic graphic corresponding to the second device is the display information in the display interface. When a user holds the second device horizontally, the first device generates interface control information based on the motion data of the second device and, based on that information, controls the stereoscopic graphic in the display interface to be placed horizontally, consistent with the state in which the user holds the second device.
Further, when the user changes from holding the second device horizontally to holding it obliquely, the first device generates new interface control information based on the motion data produced during the motion, and then controls the stereoscopic graphic in the display interface to assume an oblique placement state, so that the placement state of the stereoscopic graphic again matches the placement state of the second device (the way the user holds it).
Optionally, when the display information in the display interface of the first device is scene content built into the first device, the first device may convert the motion data of the second device into interface control information for controlling changes of that scene content, and then control the scene content in the display interface to change correspondingly with the motion of the second device based on the interface control information.
For example, referring to fig. 4, fig. 4 is a schematic diagram of a second scenario for controlling display information changes provided by an embodiment of this application. In fig. 4, the second device is a game control device, the first device is a display device, and the first device displays a preset game scene in its display interface; that is, the game scene is the display information of the first device. When the second device is held horizontally and stationary, the first device generates interface control information based on the motion data of the second device and, based on it, keeps the game screen in the game scene of "the game character runs forward".
Further, when the second device is rotated (tilted) to the left, the first device may generate interface control information based on the motion data of the leftward rotation, and then control the game screen to change from the scene "the game character runs forward" to the scene "the game character runs to the left" based on that interface control information.
Optionally, when the display information in the display interface of the first device is scene content built into the first device, the first device may generate interface control information for controlling changes of that scene content based on changes in the frequency of the motion data of the second device, and then control the scene content in the display interface to change correspondingly with the motion of the second device.
For example, when the motion data of the second device changes multiple times within a unit time (e.g., 1 second), the first device may control the rate of change of the display information in the display interface based on interface control information generated from the change frequency of the motion data, such as controlling the switching rate of scene content or the playback speed of an animation video.
In some possible embodiments, when the motion data of the second device is gyroscope data, the gyroscope data may be converted into a quaternion, and the quaternion may be used as the interface control information of the display interface of the first device, so as to control changes of the display information in the display interface according to the interface control information. The quaternion can represent the rotation of the second device. Specifically, after acquiring the gyroscope data of the second device, the first device may determine the rotation angles of the second device based on that data, i.e., the angles, such as Euler angles, through which the second device successively rotates about the three coordinate axes of its corresponding coordinate system. The Euler angles are then converted into a quaternion, which serves as the interface control information of the display interface of the first device, so that changes of the display information are controlled based on the quaternion.
If the Euler rotation determined from the gyroscope data is (α, β, γ), the second device has rotated by α, β, and γ around the x-axis, y-axis, and z-axis, respectively; that is, the rotation angles of the second device are α, β, and γ. The quaternion q = ((x, y, z), w) is obtained based on the following expressions:
x = sin(α/2)·cos(β/2)·cos(γ/2) − cos(α/2)·sin(β/2)·sin(γ/2)
y = cos(α/2)·sin(β/2)·cos(γ/2) + sin(α/2)·cos(β/2)·sin(γ/2)
z = cos(α/2)·cos(β/2)·sin(γ/2) − sin(α/2)·sin(β/2)·cos(γ/2)
w = cos(α/2)·cos(β/2)·cos(γ/2) + sin(α/2)·sin(β/2)·sin(γ/2)
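A direct transcription of these expressions into code might look like the following sketch (angles in radians; the type and function names are illustrative):

```typescript
type Quaternion = { x: number; y: number; z: number; w: number };

// Convert Euler angles (successive rotations about the x-, y-, and z-axes)
// into a quaternion, implementing the four expressions above.
function eulerToQuaternion(alpha: number, beta: number, gamma: number): Quaternion {
  const ca = Math.cos(alpha / 2), sa = Math.sin(alpha / 2);
  const cb = Math.cos(beta / 2),  sb = Math.sin(beta / 2);
  const cg = Math.cos(gamma / 2), sg = Math.sin(gamma / 2);
  return {
    x: sa * cb * cg - ca * sb * sg,
    y: ca * sb * cg + sa * cb * sg,
    z: ca * cb * sg - sa * sb * cg,
    w: ca * cb * cg + sa * sb * sg,
  };
}
```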
specifically, the first device controls the change of the display information in the display interface according to the interface control information, which may be expressed as controlling the rotation state of the display information in the display interface based on the quaternion. Where each movement of the second device can be considered as a rotational movement, the first device is therefore based on the quaternion available to the gyroscope data of the second device at different moments of the movement. The quaternion of each moment can represent the rotation mode of the second equipment at the moment compared with the second equipment at the previous moment, and the quaternion can be used as interface control information by the first equipment to control the rotation state of the display information in the display interface. At this time, the rotation state of the display information in the display interface is synchronized with the rotation state of the second device itself, and the first device may rotate the scene content preset by the first device in the display interface, or may control the information associated with the second device in the display interface to rotate, which may be specifically determined based on the display content in the display interface of the first device, which is not limited herein.
Referring to fig. 5, fig. 5 is a schematic diagram of a third scenario for controlling display information changes provided by an embodiment of this application. As shown in fig. 5, the second device is a user's handheld device, and the display information in the display interface of the first device is a stereoscopic graphic corresponding to the second device. In one rotational movement, the user rotates the second device from a horizontal placement state to a vertical placement state. During this movement, the first device continuously acquires the motion data (gyroscope data) of the second device and converts the motion data at each moment into a quaternion, using the quaternion at each moment as the interface control information for that moment. Throughout the rotational movement, the first device controls the stereoscopic graphic in the display interface according to the quaternion at each moment, rotating it from the rotation state of the previous moment in the manner represented by the current quaternion. The rotation state of the stereoscopic graphic is thus continuously adjusted over the complete movement, finally presenting a vertically placed stereoscopic graphic.
The process by which the first device controls the rotation state of the stereoscopic graphic, based on the quaternions converted from the motion data of the second device, is synchronized with the rotation of the second device. In other words, while the second device performs the rotational movement, the first device adjusts the rotation state of the stereoscopic graphic in its display interface in real time according to the motion data of the second device.
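A minimal sketch of this per-moment accumulation, reusing the Quaternion type from the sketch above (the handler name and rendering hook are illustrative assumptions):

```typescript
// Compose two quaternions: applying b first, then a.
function quatMultiply(a: Quaternion, b: Quaternion): Quaternion {
  return {
    w: a.w * b.w - a.x * b.x - a.y * b.y - a.z * b.z,
    x: a.w * b.x + a.x * b.w + a.y * b.z - a.z * b.y,
    y: a.w * b.y - a.x * b.z + a.y * b.w + a.z * b.x,
    z: a.w * b.z + a.x * b.y - a.y * b.x + a.z * b.w,
  };
}

// Rotation state of the stereoscopic graphic; starts at the identity
// rotation (the initial placement state).
let displayState: Quaternion = { x: 0, y: 0, z: 0, w: 1 };

// Called once per motion sample with the quaternion for that moment.
function onMotionQuaternion(qt: Quaternion): void {
  displayState = quatMultiply(qt, displayState); // rotate from the previous state
  // re-render the stereoscopic graphic using displayState here
}
```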
Optionally, the first device controls changes of the display information in the display interface according to the interface control information by determining, based on the quaternion, a motion trajectory of the second device corresponding to the display interface, and then displaying a preset picture corresponding to the display information when the motion trajectory coincides with that display information in the display interface.
The first device can determine, based on the quaternions converted from the motion data of the second device, the successive position changes of the second device corresponding to the display interface during the motion, and thus obtain the complete motion trajectory of the second device corresponding to the display interface. Further, the first device displays a plurality of content elements on its display interface; each content element may serve as display information, and each may correspond to a different preset picture, such as a different animation or text information, which is not limited herein. When the motion trajectory of the second device corresponding to the display interface coincides with any display information in the display interface, the first device can control the display interface to display the preset picture corresponding to that display information.
Alternatively, when the motion trajectory of the second device corresponding to the display interface passes through a fixed area of the display interface, the first device may control the display interface to display the preset picture corresponding to that fixed area, either within the fixed area or across the entire display interface.
Referring to fig. 6, fig. 6 is a schematic diagram of a fourth scenario for controlling display information changes provided by an embodiment of this application. Fig. 6 shows a shooting game scenario, where the display information in the display interface of the first device consists of the balloons in the shooting game, each balloon corresponding to a preset picture. The gameplay is that once the sight aims at a balloon, the balloon bursts; the picture of the bursting balloon is the preset picture corresponding to that balloon. Since the game is based on the interaction of the second device with the first device, aiming the sight is achieved by moving the second device. In this game scenario, the sight is therefore the motion trajectory of the second device corresponding to the display screen during its motion.
For the aiming process based on the second device, the first device may obtain the corresponding quaternions from the motion data of the second device and then determine the motion trajectory of the sight, i.e., the motion trajectory of the second device corresponding to the display interface. When the second device moves to the lower left as shown in fig. 6, the sight likewise moves from its current position toward the lower left; when the trajectory coincides with a balloon, the sight has aimed at the balloon and the shot is completed. The first device then controls the display interface to display the burst picture corresponding to that balloon, indicating that the balloon was successfully shot. In this way, the first device completes an interaction by controlling changes of the display information in the display interface according to the quaternions.
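The coincidence check itself can be sketched as a simple hit test; the balloon representation and field names below are assumptions for illustration:

```typescript
interface Balloon {
  x: number;            // balloon centre in display-interface coordinates
  y: number;
  radius: number;
  burstPicture: string; // preset picture shown when this balloon is hit
}

// Returns the balloon the sight currently coincides with, or null if the
// sight overlaps no display information.
function checkHit(sight: { x: number; y: number }, balloons: Balloon[]): Balloon | null {
  for (const b of balloons) {
    const dx = sight.x - b.x;
    const dy = sight.y - b.y;
    if (dx * dx + dy * dy <= b.radius * b.radius) {
      return b; // trajectory coincides with this display information
    }
  }
  return null;
}
```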
Optionally, the first device controls changes of the display information in the display interface according to the interface control information by determining, based on the quaternion, the motion trajectory of the second device corresponding to the display interface, and then displaying a preset picture corresponding to a preset trajectory when the motion trajectory matches that preset trajectory. For example, if the preset trajectory corresponding to the display interface is a square, then when the first device determines, based on the quaternions of the second device's motion data, that the motion trajectory of the second device corresponding to the display screen is a square, it controls the display interface to display the preset picture corresponding to the square trajectory.
In some possible embodiments, the first device may further display, in the display interface, operation information corresponding to the interface control information. Specifically, after the motion data of the second device is converted into the interface control information of the display interface, the first device may determine the motion state of the second device based on the interface control information and then display the corresponding operation information according to that motion state. The motion state of the second device includes, but is not limited to, information characterizing its manner of motion, such as motion speed, motion trajectory, and pause duration, and is not specifically limited. The operation information corresponding to the interface control information is the response operation to the motion state of the second device together with its result, including but not limited to a play operation and the played picture, a switching operation and the switched-to picture, or a click operation and the post-click result, and may be determined based on the actual application scenario, which is not limited herein.
For example, after the motion data of the second device is converted into the interface control information of the display interface, the motion of the second device may be determined to be a leftward or rightward motion. Based on this left or right movement, the interface control information may trigger a page-back or page-forward operation (where the display information in the display interface of the first device is the text content shown before the second device moved), and the text content after paging back or forward is displayed through the display interface of the first device as the corresponding operation result. Alternatively, according to the left or right movement of the second device, the interface control information may trigger playing the next or previous item, paging forward or backward, or an undo or cancel operation, with the operation result displayed through the display interface of the first device.
For another example, after the motion data of a continuous motion of the second device is converted into the interface control information of the display interface, the second device's manner of motion within a certain duration may be determined. If the second device moves up and down several times in succession with short pauses, a preset operation corresponding to that motion state, such as pausing or starting playback, is triggered based on the interface control information, and the first device responds to the preset operation and displays the corresponding result through the display interface.
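One way such a motion pattern could be detected is sketched below; the thresholds and the sample format are illustrative assumptions:

```typescript
// Detect "several up-and-down motions with short pauses" from a window of
// recent orientation samples (beta = rotation about the x-axis, in degrees).
function isVerticalShake(samples: { beta: number; t: number }[]): boolean {
  if (samples.length < 3) return false;
  let reversals = 0;
  for (let i = 2; i < samples.length; i++) {
    const d1 = samples[i - 1].beta - samples[i - 2].beta;
    const d2 = samples[i].beta - samples[i - 1].beta;
    if (d1 * d2 < 0 && Math.abs(d2) > 5) reversals++; // direction flipped
  }
  const duration = samples[samples.length - 1].t - samples[0].t;
  return reversals >= 4 && duration < 1500; // several flips within 1.5 s
}
```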
Referring to fig. 7, fig. 7 is another flowchart illustrating a cross-terminal interaction method provided in an embodiment of the present application. The cross-terminal interaction method shown in fig. 7 may include the following steps:
Step S701, establishing a communication connection with the second device based on a preset communication connection mode.
Step S702, acquiring motion data of the second device.
In some possible embodiments, for the specific implementations of steps S701 and S702, reference may be made to steps S201 and S202 in fig. 2, and details are not repeated here.
Step S703, determining reference position information of the second device corresponding to the display interface.
In some possible embodiments, when the display information in the display interface of the first device is information associated with the second device, the reference position information of the second device corresponding to the display interface is the initial position of that display information at the moment the second device establishes the communication connection with the first device. For example, when the display information is the motion trajectory of the second device corresponding to the display interface, the reference position information is the starting point of the motion trajectory. When the display information is the stereoscopic graphic of the second device corresponding to the display interface, the reference position information is the initial display position of the stereoscopic graphic and its initial rotation state, where the initial rotation state represents the initial placement state of the stereoscopic graphic, such as horizontal or vertical placement. In this case, the reference position information is preset by the first device; that is, when the second device establishes the communication connection with the first device, the display information in the display interface is displayed based on the reference position information.
Optionally, when the display information in the display interface of the first device is scene content preset by the first device, the reference position information of the second device corresponding to the display interface may be the starting point of the motion trajectory of the second device corresponding to the display interface, such as the starting position of the sight in fig. 6.
Optionally, when the reference position information of the second device corresponding to the display interface is predetermined by the first device, an initial placement state of the second device may be required at connection time; for example, the second device may need to be kept horizontal while establishing the communication connection with the first device.
In some possible embodiments, the reference position information of the second device corresponding to the display interface may also be a preset position designated by the first device. When the display information in the display interface of the first device is information associated with the second device, the first device may determine the initial position of the display information in the display interface based on the preset position and the motion data acquired when the second device establishes the communication connection. If the display information is the motion trajectory of the second device corresponding to the display interface, the first device may determine the starting point of the trajectory based on the motion data at connection time and the preset position. If the display information is the stereoscopic graphic of the second device corresponding to the display interface, the first device may determine the initial rotation state of the graphic based on the motion data at connection time and the preset position. In this case, the reference position information is not preset by the first device; rather, it is determined, when the second device establishes the communication connection with the first device, based on the motion data of the second device and the preset position.
The reference position information and the preset position designated by the first device may both be coordinates in the coordinate system corresponding to the display interface, such as two-dimensional coordinates in a two-dimensional coordinate system or three-dimensional coordinates in a three-dimensional coordinate system, which is not limited herein. Further, the first device may control changes of the display information in its display interface based on the reference position information and the motion data of the second device; a specific implementation is described in step S704.
Step S704, controlling changes of the display information in the display interface of the first device according to the reference position information and the motion data.
In some possible embodiments, when the display information in the display interface of the first device is the motion trajectory or the stereoscopic graphic of the second device corresponding to the display interface, and the reference position information is the starting point of the motion trajectory or the initial rotation state of the stereoscopic graphic at the time the communication connection was established, the extension of the motion trajectory in the display interface and the change of the rotation state of the stereoscopic graphic may be controlled based on the reference position information and the motion data of the second device.
Specifically, for the motion trajectory, the first device may determine, based on the quaternion corresponding to the motion data of the second device's first movement, how the second device has rotated relative to the starting point of the trajectory, and then rotate the starting point accordingly to obtain the rotated point. That is, based on the quaternion, the starting coordinate corresponding to the starting point is rotated in the coordinate system corresponding to the display interface to obtain the rotated coordinate. The trajectory from the starting point (starting coordinate) to the rotated point (rotated coordinate) is then the motion trajectory corresponding to the display interface for the second device's first movement. As the second device continues to move, the coordinate for each moment can be determined from the motion data at that moment and the coordinate of the previous moment, thereby controlling the extension of the motion trajectory.
Specifically, for the rotation state of the stereoscopic graphic, the initial rotation state in the display interface may be represented by a reference vector indicated by the reference position information. As shown in fig. 8, fig. 8 is a schematic diagram of a rotation state provided by an embodiment of this application. In the three-dimensional coordinate system shown in fig. 8, each vector represents a unique direction by its mathematical nature, so the initial rotation state of the stereoscopic graphic can be represented by the reference vector. Further, the first device rotates the reference vector based on the quaternion corresponding to the motion data when the second device moves, obtaining a rotated vector that represents the rotation state of the stereoscopic graphic after the rotation. As the second device continues to move, the vector for each moment can be determined from the motion data at that moment and the vector representing the previous moment's rotation state, thereby continuously changing the rotation state of the stereoscopic graphic.
Optionally, in the above process, the first device may convert the quaternion into a vector value and then obtain the motion trajectory of the second device during its motion, in the coordinate system corresponding to the display interface, from the coordinates corresponding to that vector value.
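Rotating a starting coordinate or reference vector by a quaternion can be sketched as follows, reusing the Quaternion type from the earlier sketch and applying the identity v' = q·v·q* in expanded form (the names are illustrative):

```typescript
type Vec3 = { x: number; y: number; z: number };

// Rotate a point or reference vector v by quaternion q (v' = q · v · q*),
// used to extend the motion trajectory from its starting coordinate and to
// update the stereoscopic graphic's reference vector.
function rotateByQuaternion(v: Vec3, q: Quaternion): Vec3 {
  // t = 2 * (q.xyz × v)
  const tx = 2 * (q.y * v.z - q.z * v.y);
  const ty = 2 * (q.z * v.x - q.x * v.z);
  const tz = 2 * (q.x * v.y - q.y * v.x);
  // v' = v + w * t + (q.xyz × t)
  return {
    x: v.x + q.w * tx + (q.y * tz - q.z * ty),
    y: v.y + q.w * ty + (q.z * tx - q.x * tz),
    z: v.z + q.w * tz + (q.x * ty - q.y * tx),
  };
}
```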
It should be noted that, in the process of controlling the change of the rotation state, the rotation state (placement state) of the second device when it establishes the communication connection with the first device is kept consistent with the initial rotation state represented by the reference vector indicated by the reference position information; the first device can then control the rotation state of the stereoscopic graphic to match the rotation state of the second device throughout the motion, based on the motion data of the second device.
In some possible embodiments, when the reference position information of the second device corresponding to the display interface is a preset position designated by the first device, then for the starting point of the motion trajectory, the first device may rotate the preset position based on the quaternion corresponding to the motion data acquired when the communication connection with the second device is established, obtaining a rotated position, which the first device takes as the starting point of the motion trajectory.
For the stereoscopic graphic, when the reference position information is a preset position designated by the first device (in this case a preset vector), the vector corresponding to the initial rotation state of the stereoscopic graphic is obtained by rotating the preset vector based on the quaternion corresponding to the motion data at connection time. The first device can then continuously control the rotation state of the stereoscopic graphic in the display interface based on the quaternions corresponding to the second device's subsequent motion data. In this way, when the communication connection is established, the first device can show the second device's rotation state at that moment in the display interface, without the user having to adjust the placement state of the second device.
When the communication connection between the first device and the second device is established, in addition to the motion data of the second device at that moment, its size information, shape information, and orientation information may also be obtained, so that a stereoscopic graphic identical in appearance to the second device (though possibly at a different scale) can be derived, further improving the display of the second device's rotation state. The orientation information may indicate the placement manner of the second device when the connection is established (e.g., right side up), so that the stereoscopic graphic always stays consistent with the second device.
It should be particularly noted that the motion trajectory and the stereoscopic graphic in the embodiments of this application are only examples for describing the provided method; the display information may be characterized by any graphic, symbol, or other indication information, which is not limited herein.
In addition, the motion trajectory of the second device corresponding to the display interface, the corresponding stereoscopic graphic, and the scene content preset by the first device may exist at the same time; that is, the first device may comprehensively control changes of the display information in the display interface by combining the methods provided by the embodiments of this application.
Referring to fig. 9, fig. 9 is a schematic diagram of a fifth scenario for controlling display information changes provided by an embodiment of this application. In fig. 9, the second device moves a certain distance from left to right and then from right to left. The first device continuously acquires the motion data of the second device during this motion and determines, in the manner described above, the starting point of the motion trajectory of the second device corresponding to the display interface. It then determines the position in the display interface corresponding to each motion moment from the motion data of that moment and the position corresponding to the previous moment, so that the first device displays, through the display interface, the trajectory moving from left to right and then from right to left.
The cross-terminal interaction method provided by the embodiments of this application is described below with reference to fig. 10. Fig. 10 is a schematic diagram of a cross-terminal interaction method provided by an embodiment of this application. As shown in fig. 10, if the communication connection between the first device and the second device is established based on the WebSocket protocol, the first device creates a WebSocket object, and the connection is completed after the first device and the second device each access the WebSocket object. The second device then acquires motion data through its built-in gyroscope or other components and sends the motion data to the first device. After establishing the communication connection, the first device can display information through its display interface; at this point the display information is the first device's preset scene content. Further, after acquiring the motion data of the second device, the first device controls changes of the display information in the display interface according to that motion data.
In some possible embodiments, to further enhance the interaction experience, after the first device establishes the communication connection with the second device, interactive operation guidance information may be displayed through the display interface. The interactive operation guidance information is information that guides the user in completing the interactive operations; its specific content may be determined based on the actual application scenarios of the first device and the second device, which is not limited herein.
For example, in the game scenario shown in fig. 6, the first device may display game operation instructions as the interactive operation guidance information through the display interface, to guide the user in moving the second device so that the motion trajectory of the second device corresponding to the display interface meets the aiming and shooting requirements. For another example, in an audio-visual playback scenario, the first device may show operation instructions as guidance information through the display interface, to guide the user in moving the second device to complete different operations such as fast-forward and pause. The interactive operation guidance information may also instruct the user to adjust the placement state of the second device so that, when the communication connection is established, the placement state of the second device is consistent with the display information in the display interface of the first device.
It should be particularly noted that the above-mentioned interactive operation guidance information includes, but is not limited to, text information, animation video information, and information combining text and animation video, and may be determined based on an actual application scenario, and is not limited herein.
In some possible implementations, in addition to interacting with the second device, the first device may also complete interaction with the user through the second device. That is, besides controlling changes of the display information in the display interface based on the motion data of the second device, the first device may perform corresponding operations based on the user's operation information.
Specifically, the second device displays a corresponding operation interface, through which it can receive user operation information and then send that information to the first device. The user operation information may be a touch event on the operation interface, such as a click, a double click, or a sliding trajectory, or a related operation responding to information displayed in the display interface of the first device, and may be specifically determined based on the actual application scenario, which is not limited herein.
The first device may establish an association between user operation information and corresponding operations, so that after receiving user operation information from the second device, it executes the corresponding operation, for example, resetting the display screen, pausing the game, or switching the display screen through different touch events of the user. Taking the game scenario of fig. 6 as an example, suppose the first device does not display the preset picture merely because the motion trajectory of the second device corresponding to the display interface coincides with a balloon; only when the sight aims at the balloon and the user's click operation information is received does the first device play the preset picture corresponding to the balloon, that is, the balloon burst picture.
Optionally, when the display information in the display interface of the first device is a web page, the motion trajectory of the second device corresponding to the display interface may represent the movement of the mouse pointer in the web page, and the first device may perform operations such as opening a web page link or navigating forward and back after receiving the user operation information.
Optionally, when the user operation information is a touch event, the first device may further display, in the display interface, the touch point or touch trajectory corresponding to the touch event based on the user operation information acquired from the second device, visualizing the user operation information and further improving the user experience.
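A sketch of this forwarding and visualization path, reusing the controllerSocket and displaySocket assumed in the earlier connection sketch (the message fields are illustrative):

```typescript
// Second device: forward touch events from the operation interface.
document.addEventListener("touchstart", (ev: TouchEvent) => {
  const t = ev.touches[0];
  controllerSocket.send(JSON.stringify({ type: "touch", x: t.clientX, y: t.clientY }));
});

// First device: map forwarded user operation information to operations and
// optionally visualize the touch point.
displaySocket.addEventListener("message", (ev: MessageEvent) => {
  const msg = JSON.parse(ev.data as string);
  if (msg.type === "touch") {
    // e.g. draw a touch-point marker at (msg.x, msg.y) in the display
    // interface and execute the operation associated with this touch event
  }
});
```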
It should be noted that the first device may interact with the second device and, through the second device, with the user at the same time; that is, besides controlling changes of the display information based on the motion data of the second device, the first device may perform corresponding operations based on the user's operation information.
As shown in fig. 11, fig. 11 is a scene schematic diagram of a cross-terminal interaction method provided in the embodiment of the present application. In fig. 11, the first device controls the change of the display information in the display interface based on the motion data of the second device. The display information in the display interface is a three-dimensional graph corresponding to the second equipment, and the terminal controls the rotation state of the three-dimensional graph based on the motion data of the second equipment. On the other hand, the second device obtains the user operation information of the user based on the operation interface, for example, in fig. 11, after the user finger clicks the operation interface, the user operation information is further sent to the first device, the first device may display the corresponding touch point through the display interface, and further perform the corresponding operation based on the user operation information.
By way of further example, assuming that a display screen in a display interface of a first device is a game scene of weiqi, a motion track of a second device corresponding to the display interface represents a motion track of an active chess piece in the game, and the first device controls the active chess piece in the game to move on a chessboard based on motion data of the second device. When the second device receives user operation information (such as a double-click operation interface) of a user through the operation interface and the first device acquires the user operation information through the second device, the acquired chessman is controlled to stay on the chessboard to fall completely, and therefore cross-terminal interaction between the first device and the second device and interaction between the first device and the user are achieved.
In the embodiment of the application, based on the motion data generated when the second device moves, the first device can control the rotation state of the display information in its display interface, determine the motion trajectory of the second device on that interface, and display the corresponding picture based on the motion trajectory, which greatly enriches the application range of cross-terminal interaction. Meanwhile, the first device can interact with the user while realizing cross-terminal interaction, and the interaction method of the embodiment of the application can support multiple communication modes, further improving the user experience. Moreover, the motion data in this application may be any of multiple types of motion data, including gyroscope data, and controlling the change of the display information based on quaternions improves the simplicity of cross-terminal interaction, with high applicability.
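The gyroscope-to-quaternion conversion referred to above can be realized, for example, by the standard integration of angular velocity over each sampling interval. The following TypeScript sketch shows this; the sampling interval and axis conventions are assumptions not fixed by this application:

```typescript
// Sketch: integrate gyroscope angular velocity (rad/s, device axes) into an
// orientation quaternion. dt is the sampling interval in seconds.
type Quat = { x: number; y: number; z: number; w: number };

function integrateGyro(q: Quat, wx: number, wy: number, wz: number, dt: number): Quat {
  const omega = Math.sqrt(wx * wx + wy * wy + wz * wz); // angular speed
  if (omega < 1e-9) return q; // no measurable rotation this interval
  const halfTheta = (omega * dt) / 2;
  const s = Math.sin(halfTheta) / omega;
  // Incremental rotation over this interval, in axis-angle form.
  const dq: Quat = { x: wx * s, y: wy * s, z: wz * s, w: Math.cos(halfTheta) };
  // Hamilton product: new orientation = previous orientation * increment.
  const r: Quat = {
    w: q.w * dq.w - q.x * dq.x - q.y * dq.y - q.z * dq.z,
    x: q.w * dq.x + q.x * dq.w + q.y * dq.z - q.z * dq.y,
    y: q.w * dq.y - q.x * dq.z + q.y * dq.w + q.z * dq.x,
    z: q.w * dq.z + q.x * dq.y - q.y * dq.x + q.z * dq.w,
  };
  // Renormalize to counteract floating-point drift.
  const n = Math.hypot(r.x, r.y, r.z, r.w);
  return { x: r.x / n, y: r.y / n, z: r.z / n, w: r.w / n };
}
```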
Referring to fig. 12, fig. 12 is a schematic structural diagram of a cross-terminal interaction device provided in an embodiment of the present application. The device 12 provided by the embodiment of the application comprises:
a communication unit 121, configured to establish a communication connection with a second device based on a preset communication connection manner;
an acquisition unit 122 for acquiring motion data of the second device;
and the control unit 123 is configured to control a change of display information in the display interface of the first device according to the motion data.
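This unit decomposition can be summarized in a brief TypeScript sketch; the interface and type names below are direct translations of the units above, not an API defined by this application:

```typescript
// Sketch of the apparatus decomposition described above.
type MotionData = { gyro: [number, number, number]; timestamp: number }; // assumed shape

interface CrossTerminalInteractionDevice {
  /** Communication unit 121: establish a connection with the second device. */
  connect(): Promise<void>;
  /** Acquisition unit 122: acquire motion data of the second device. */
  acquireMotionData(): MotionData;
  /** Control unit 123: control the change of display information accordingly. */
  controlDisplay(data: MotionData): void;
}
```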
In some possible embodiments, a gyroscope is installed in the second device, and the motion data includes gyroscope data.
In some possible embodiments, the control unit 123 is configured to:
converting the motion data into interface control information of the display interface;
and controlling the change of the display information in the display interface according to the interface control information.
In some possible embodiments, the motion data includes gyroscope data, and the control unit 123 is configured to:
and converting the gyroscope data into quaternions, and taking the quaternions as the interface control information.
In some possible embodiments, the control unit 123 is configured to:
controlling the rotation state of the display information in the display interface according to the quaternion;
determining a motion track of the second device corresponding to the display interface according to the quaternion; and when the motion track is coincident with the display information in the display interface, displaying a preset picture corresponding to the display information.
In some possible embodiments, the control unit 123 is configured to:
and displaying the operation information corresponding to the interface control information in the display interface according to the interface control information.
In some possible embodiments, the control unit 123 is further configured to:
determining reference position information of the second device corresponding to the display interface;
and controlling the change of the display information in the display interface of the first device according to the reference position information and the motion data.
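One way such a reference position can be used, sketched below, is to calibrate the second device's current orientation as the center of the display interface and map subsequent relative rotations to on-screen coordinates; the axis assignment, Euler extraction (ZYX convention), and the mapping range are all illustrative assumptions:

```typescript
// Sketch: map the second device's orientation to a position on the display
// interface, relative to a calibrated reference orientation.
type Quat = { x: number; y: number; z: number; w: number };

let reference: Quat | null = null;

// Calibration: the device's current orientation becomes the interface center.
function calibrate(q: Quat): void {
  reference = q;
}

function toScreen(q: Quat, width: number, height: number): { x: number; y: number } | null {
  if (!reference) return null;
  // Relative rotation r = inverse(reference) * q (reference assumed unit-length).
  const inv: Quat = { x: -reference.x, y: -reference.y, z: -reference.z, w: reference.w };
  const r = mul(inv, q);
  // Yaw (about z) and pitch (about y), ZYX convention.
  const yaw = Math.atan2(2 * (r.w * r.z + r.x * r.y), 1 - 2 * (r.y * r.y + r.z * r.z));
  const pitch = Math.asin(Math.max(-1, Math.min(1, 2 * (r.w * r.y - r.z * r.x))));
  const range = Math.PI / 4; // assumed: +/-45 degrees spans half the screen
  return {
    x: width / 2 + (yaw / range) * (width / 2),
    y: height / 2 - (pitch / range) * (height / 2),
  };
}

// Hamilton product of two quaternions.
function mul(a: Quat, b: Quat): Quat {
  return {
    w: a.w * b.w - a.x * b.x - a.y * b.y - a.z * b.z,
    x: a.w * b.x + a.x * b.w + a.y * b.z - a.z * b.y,
    y: a.w * b.y - a.x * b.z + a.y * b.w + a.z * b.x,
    z: a.w * b.z + a.x * b.y - a.y * b.x + a.z * b.w,
  };
}
```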
In some possible embodiments, the control unit 123 is further configured to:
and displaying the interactive operation guide information.
In some possible embodiments, the preset communication connection manner is a communication connection manner based on the WebSocket protocol.
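A minimal sketch of such a connection on the first device's side follows, using the standard browser WebSocket API; the relay URL, room identifier, and message schema are illustrative assumptions (the application does not prescribe them):

```typescript
// Sketch: the first device establishes a WebSocket connection and routes
// incoming messages to display control (motion data) or operation handling
// (user operation information).
const socket = new WebSocket('wss://relay.example.com/session?room=demo');

socket.onopen = () => {
  console.log('communication connection with the second device established');
};

socket.onmessage = (event: MessageEvent<string>) => {
  const msg = JSON.parse(event.data);
  if (msg.type === 'motion') {
    updateDisplayOrientation(msg.quaternion); // gyroscope-derived quaternion
  } else if (msg.type === 'operation') {
    executeUserOperation(msg.operation); // user operation information
  }
};

// Stubs standing in for the display-control and operation logic sketched above.
function updateDisplayOrientation(q: unknown): void { /* apply quaternion to the interface */ }
function executeUserOperation(op: unknown): void { /* perform the corresponding operation */ }
```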
In some possible embodiments, after the communication connection with the second device is established, the second device displays a corresponding operation interface;
the acquiring unit 122 is configured to:
acquiring user operation information received through the operation interface;
the control unit 123 is configured to:
and executing a corresponding operation according to the user operation information.
In a specific implementation, the device 12 may execute, through its built-in functional modules, the implementations provided in the steps of fig. 2 and/or fig. 7; reference may be made to those implementations, which are not described again here.
In the embodiment of the application, based on the motion data generated when the second device moves, the first device can control the rotation state of the display information in its display interface, determine the motion trajectory of the second device on that interface, and display the corresponding picture based on the motion trajectory, which greatly enriches the application range of cross-terminal interaction. Meanwhile, the first device can interact with the user through the second device while realizing cross-terminal interaction, further improving the user experience. Moreover, the motion data in this application may be any of multiple types of motion data, including gyroscope data, which further improves the practicality and broad applicability of cross-terminal interaction; and controlling the change of the display information based on quaternions improves the simplicity of cross-terminal interaction, with high applicability.
Referring to fig. 13, fig. 13 is a schematic structural diagram of an electronic device provided in an embodiment of the present application. As shown in fig. 13, the electronic device 1000 in this embodiment may include: a processor 1001, a network interface 1004, and a memory 1005; in addition, the electronic device 1000 may further include a user interface 1003 and at least one communication bus 1002. The communication bus 1002 is used to enable connection and communication between these components. The user interface 1003 may include a display screen (Display) and a keyboard (Keyboard); optionally, the user interface 1003 may also include a standard wired interface and a standard wireless interface. The network interface 1004 may optionally include a standard wired interface and a wireless interface (e.g., a WI-FI interface). The memory 1005 may be a high-speed RAM memory or a non-volatile memory, for example, at least one disk memory. Optionally, the memory 1005 may also be at least one storage device located remotely from the processor 1001. As shown in fig. 13, the memory 1005, as a computer-readable storage medium, may include an operating system, a network communication module, a user interface module, and a device control application program.
In the electronic device 1000 shown in fig. 13, the network interface 1004 may provide a network communication function; the user interface 1003 is an interface for providing a user with input; and the processor 1001 may be used to invoke a device control application stored in the memory 1005 to implement:
establishing a communication connection with a second device based on a preset communication connection mode;
acquiring motion data of the second device;
and controlling the change of the display information in the display interface of the first device according to the motion data.
In some possible embodiments, a gyroscope is installed in the second device, and the motion data includes gyroscope data.
In some possible embodiments, the processor 1001 is configured to:
converting the motion data into interface control information of the display interface;
and controlling the change of the display information in the display interface according to the interface control information.
In some possible embodiments, the processor 1001 is configured to:
and converting the gyroscope data into quaternions, and taking the quaternions as the interface control information.
In some possible embodiments, the processor 1001 is configured to:
controlling the rotation state of the display information in the display interface according to the quaternion;
determining a motion track of the second device corresponding to the display interface according to the quaternion; and when the motion track is coincident with the display information in the display interface, displaying a preset picture corresponding to the display information.
In some possible embodiments, the processor 1001 is further configured to:
and displaying the operation information corresponding to the interface control information in the display interface according to the interface control information.
In some possible embodiments, the processor 1001 is further configured to:
determining reference position information of the second device corresponding to the display interface;
and controlling the change of the display information in the display interface of the first device according to the reference position information and the motion data.
In some possible embodiments, the processor 1001 is further configured to:
and displaying the interactive operation guide information.
In some possible embodiments, the preset communication connection manner is a communication connection manner based on the WebSocket protocol.
In some possible embodiments, after the communication connection with the second device is established, the second device displays a corresponding operation interface;
the processor 1001 is further configured to:
acquiring user operation information received through the operation interface;
and executing a corresponding operation according to the user operation information.
It should be understood that in some possible embodiments, the processor 1001 may be a central processing unit (CPU), or another general-purpose processor, a digital signal processor (DSP), an application-specific integrated circuit (ASIC), a field-programmable gate array (FPGA) or other programmable logic device, a discrete gate or transistor logic device, a discrete hardware component, or the like. A general-purpose processor may be a microprocessor, or the processor may be any conventional processor. The memory may include a read-only memory and a random access memory, and provides instructions and data to the processor. A portion of the memory may also include a non-volatile random access memory. For example, the memory may also store device type information.
In a specific implementation, the electronic device 1000 may execute, through its built-in functional modules, the implementations provided in the steps of fig. 2 and/or fig. 7; reference may be made to those implementations, which are not described again here.
In the embodiment of the application, based on the motion data generated when the second device moves, the first device can control the rotation state of the display information in its display interface, determine the motion trajectory of the second device on that interface, and display the corresponding picture based on the motion trajectory, which greatly enriches the application range of cross-terminal interaction. Meanwhile, the first device can interact with the user through the second device while realizing cross-terminal interaction, further improving the user experience. Moreover, the motion data in this application may be any of multiple types of motion data, including gyroscope data, which further improves the practicality and broad applicability of cross-terminal interaction; and controlling the change of the display information based on quaternions improves the simplicity of cross-terminal interaction, with high applicability.
An embodiment of the present application further provides a computer-readable storage medium storing a computer program which, when executed by a processor, implements the method provided in each step of fig. 2 and/or fig. 7; reference may be made to the implementations provided in those steps, which are not described again here.
The computer-readable storage medium may be an internal storage unit of the device provided in any of the foregoing embodiments, for example, a hard disk or memory of the electronic device. The computer-readable storage medium may also be an external storage device of the electronic device, such as a plug-in hard disk, a smart media card (SMC), a secure digital (SD) card, or a flash card provided on the electronic device. The computer-readable storage medium may further include a magnetic disk, an optical disk, a read-only memory (ROM), a random access memory (RAM), and the like. Further, the computer-readable storage medium may include both an internal storage unit and an external storage device of the electronic device. The computer-readable storage medium is used to store the computer program and the other programs and data required by the electronic device, and may also be used to temporarily store data that has been output or is to be output.
Embodiments of the present application further provide a computer program product or computer program comprising computer instructions stored in a computer-readable storage medium. The processor of the electronic device reads the computer instructions from the computer-readable storage medium and executes them, causing the electronic device to perform the methods provided in the steps of fig. 2 and/or fig. 7.
The terms "first", "second", and the like in the claims, description, and drawings of the present application are used to distinguish different objects, not to describe a particular order. Furthermore, the terms "include" and "have", as well as any variations thereof, are intended to cover non-exclusive inclusion. For example, a process, method, system, article, or electronic device that comprises a list of steps or elements is not limited to the listed steps or elements, but may optionally include other steps or elements not listed, or steps or elements inherent to such a process, method, article, or electronic device. Reference herein to "an embodiment" means that a particular feature, structure, or characteristic described in connection with the embodiment can be included in at least one embodiment of the application. The appearances of this phrase in various places in the specification do not necessarily all refer to the same embodiment, nor are separate or alternative embodiments mutually exclusive with other embodiments. It is understood, explicitly and implicitly, by those skilled in the art that the embodiments described herein can be combined with other embodiments. The term "and/or" as used in this specification and the appended claims refers to and includes any and all possible combinations of one or more of the associated listed items.
Those of ordinary skill in the art will appreciate that the units and algorithm steps of the examples described in connection with the embodiments disclosed herein may be implemented in electronic hardware, computer software, or a combination of the two, and the components and steps of the examples have been described generally in terms of their functionality in the foregoing description in order to clearly illustrate the interchangeability of hardware and software. Skilled artisans may implement the described functionality in different ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present application.
The above disclosure is only for the purpose of illustrating the preferred embodiments of the present application and is not intended to limit the scope of the present application, which is defined by the appended claims.

Claims (13)

1. A cross-terminal interaction method is characterized by comprising the following steps:
establishing a communication connection with a second device based on a preset communication connection mode;
acquiring motion data of the second device;
and controlling a change of display information in a display interface of a first device according to the motion data.
2. The method of claim 1, wherein a gyroscope is mounted in the second device and the motion data comprises gyroscope data.
3. The method according to claim 1 or 2, wherein the controlling the change of the display information in the display interface of the first device according to the motion data comprises:
converting the motion data into interface control information of the display interface;
and controlling the change of the display information in the display interface according to the interface control information.
4. The method of claim 3, wherein the motion data comprises gyroscope data, and wherein converting the motion data into the interface control information of the display interface comprises:
and converting the gyroscope data into quaternions, and taking the quaternions as the interface control information.
5. The method of claim 4, wherein the controlling the change of the display information in the display interface according to the interface control information comprises at least one of:
controlling the rotation state of display information in the display interface according to the quaternion;
determining a motion track of the second device corresponding to the display interface according to the quaternion; and when the motion track is coincident with the display information in the display interface, displaying a preset picture corresponding to the display information.
6. The method of claim 3, further comprising:
and displaying operation information corresponding to the interface control information in the display interface according to the interface control information.
7. The method of claim 1, further comprising:
determining reference position information of the second device corresponding to the display interface;
the controlling the change of the display information in the display interface of the first device according to the motion data includes:
and controlling the change of the display information in the display interface of the first equipment according to the reference position information and the motion data.
8. The method of claim 1 or 7, further comprising:
and displaying the interactive operation guide information.
9. The method according to claim 1, wherein the preset communication connection mode is a communication connection mode based on the WebSocket protocol.
10. The method according to claim 8, wherein after the communication connection with the second device is established, the second device displays a corresponding operation interface;
the method further comprises:
acquiring user operation information received through the operation interface;
and executing a corresponding operation according to the user operation information.
11. A cross-terminal interaction apparatus, the apparatus comprising:
a communication unit, configured to establish a communication connection with a second device based on a preset communication connection mode;
an acquisition unit, configured to acquire motion data of the second device;
and a control unit, configured to control the change of display information in a display interface of the first device according to the motion data.
12. An electronic device comprising a processor and a memory, the processor and the memory being interconnected;
the memory is used for storing a computer program;
the processor is configured to perform the method of any of claims 1 to 10 when the computer program is invoked.
13. A computer-readable storage medium, characterized in that the computer-readable storage medium stores a computer program which is executed by a processor to implement the method of any one of claims 1 to 10.
CN202010839960.2A 2020-08-19 2020-08-19 Cross-terminal interaction method and device, electronic equipment and storage medium Pending CN111897437A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010839960.2A CN111897437A (en) 2020-08-19 2020-08-19 Cross-terminal interaction method and device, electronic equipment and storage medium

Publications (1)

Publication Number Publication Date
CN111897437A true CN111897437A (en) 2020-11-06

Family

ID=73230244

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010839960.2A Pending CN111897437A (en) 2020-08-19 2020-08-19 Cross-terminal interaction method and device, electronic equipment and storage medium

Country Status (1)

Country Link
CN (1) CN111897437A (en)

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20180255284A1 (en) * 2017-03-03 2018-09-06 Fyusion, Inc. Tilts as a measure of user engagement for multiview interactive digital media representations
CN109085925A (en) * 2018-08-21 2018-12-25 福建天晴在线互动科技有限公司 It is a kind of realize MR mixed reality interaction method, storage medium
CN110413110A (en) * 2019-07-05 2019-11-05 深圳市工匠社科技有限公司 The control method and Related product of virtual role
CN110633018A (en) * 2019-07-30 2019-12-31 华为技术有限公司 Method for controlling display of large-screen equipment, mobile terminal and first system
CN111193832A (en) * 2020-04-10 2020-05-22 杭州脸脸会网络技术有限公司 Method for calculating speed and direction of parabolic motion based on mobile phone gyroscope sensor

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113207087A (en) * 2021-04-27 2021-08-03 上海闻泰信息技术有限公司 Wireless communication device connection method, system, wireless communication device and storage medium
CN113207087B (en) * 2021-04-27 2022-09-23 上海闻泰信息技术有限公司 Wireless communication device connection method, system, wireless communication device and storage medium
CN114053704A (en) * 2021-10-28 2022-02-18 腾讯科技(深圳)有限公司 Information display method, device, terminal and storage medium
CN114053704B (en) * 2021-10-28 2023-06-09 腾讯科技(深圳)有限公司 Information display method, device, terminal and storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
REG Reference to a national code (ref country code: HK; ref legal event code: DE; ref document number: 40030754; country of ref document: HK)
RJ01 Rejection of invention patent application after publication (application publication date: 20201106)