CN115814403A - Equipment control method and device - Google Patents

Equipment control method and device

Info

Publication number
CN115814403A
CN115814403A (application number CN202111089487.1A)
Authority
CN
China
Prior art keywords
terminal
interface
data
target
control
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202111089487.1A
Other languages
Chinese (zh)
Inventor
刘华军
杨金华
杨毅轩
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Honor Device Co Ltd
Original Assignee
Honor Device Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Honor Device Co Ltd filed Critical Honor Device Co Ltd
Priority to CN202111089487.1A
Priority to PCT/CN2022/118543 (published as WO2023040848A1)
Publication of CN115814403A
Legal status: Pending

Classifications

    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00 Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/40 Processing input control signals of video game devices, e.g. signals generated by the player or derived from the environment
    • A63F13/42 Processing input control signals of video game devices, e.g. signals generated by the player or derived from the environment by mapping the input signals into game commands, e.g. mapping the displacement of a stylus on a touch screen to the steering angle of a virtual vehicle
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00 Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/50 Controlling the output signals based on the game progress
    • A63F13/52 Controlling the output signals based on the game progress involving aspects of the displayed game scene
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F3/04886 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures by partitioning the display area of the touch-screen or the surface of the digitising tablet into independently controllable areas, e.g. virtual keyboards or menus
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F9/00 Arrangements for program control, e.g. control units
    • G06F9/06 Arrangements for program control, e.g. control units using stored programs, i.e. using an internal store of processing equipment to receive or retain programs
    • G06F9/44 Arrangements for executing specific programs
    • G06F9/451 Execution arrangements for user interfaces

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Software Systems (AREA)
  • General Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Telephone Function (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The application provides a device control method and device, and relates to the technical field of terminals. According to the device control method provided by the application, a user only needs to establish a communication connection between a first terminal and a second terminal. The first terminal sends control information to the second terminal, and the second terminal starts a target function unit in response to the control information. Then, based on the target function unit, the second terminal cooperates with the first terminal to execute a first function. In this way, the operation steps of the user are simplified, and operation efficiency is improved.

Description

Equipment control method and device
Technical Field
The present application relates to the field of terminal technologies, and in particular, to a device control method and apparatus.
Background
Currently, with the development of terminal technologies, terminal devices have become a part of people's work and life. As the functions of terminal devices become richer, a terminal device often needs the assistance of another terminal device to complete a relatively complex function. For example, when a first terminal needs to execute a target function, a second terminal is needed to control the first terminal to complete the target function.
In general, the process of controlling the first terminal to complete the target function by means of the second terminal involves complicated operation steps and is inefficient.
Disclosure of Invention
The application provides a device control method and device, so that a user can, through only a simple operation, control a first terminal to implement a target function by means of a second terminal, which improves the operation efficiency of the user.
In a first aspect, the application provides a device control method, applied to a communication system, where the communication system includes a first terminal and a second terminal, and the method includes: the first terminal is in a first state, where the first terminal can execute a first function when in the first state; the first terminal sends control information to the second terminal based on an established communication connection; the second terminal starts a target function unit in response to the control information; and the second terminal cooperates with the first terminal to execute the first function based on the target function unit.
According to the device control method, the user only needs to establish a communication connection between the first terminal and the second terminal. The first terminal sends the control information to the second terminal, and the second terminal starts the target function unit in response to the control information. Then, based on the target function unit, the second terminal cooperates with the first terminal to execute the first function. In this way, the operation steps of the user are simplified, and operation efficiency is improved.
In a possible implementation manner, that the second terminal cooperates with the first terminal to execute the first function based on the target function unit includes: the second terminal sends target data to the first terminal in response to a trigger operation on the target function unit; and the first terminal executes the first function associated with the target data.
In one possible implementation, the target function unit is a virtual text key in a second interface, and the target data is text data. That the first terminal is in the first state includes: the first terminal displays a first interface, where the first interface includes an area where text is to be input. That the first terminal executes the first function associated with the target data includes: when displaying the first interface, the first terminal inputs the text data in the area of the first interface where text is to be input.
In this way, the user needs at most two operations to input the text data in the first interface of the first terminal by means of the second terminal. This simplifies the user's operation steps and improves operation efficiency.
In one possible implementation, the target data is control data. That the first terminal is in the first state includes: the first terminal displays a first interface, where the first interface includes a game character. That the first terminal executes the first function associated with the target data includes: when displaying the first interface, the first terminal controls the game character in the first interface to perform a game behavior corresponding to the control data.
In this way, the user needs at most two operations to control, by means of the second terminal, the game character in the first terminal to perform the game behavior associated with the trigger operation. This simplifies the user's operation steps and improves operation efficiency.
In one possible implementation, the target function unit is a virtual control keyboard in the second interface, where the virtual control keyboard includes virtual keys for controlling the game behavior of the game character. Alternatively, the control data is attitude data, and the target function unit is an attitude sensor of the second terminal, configured to collect the attitude data.
In a possible implementation, the target function unit is a virtual control keyboard in the second interface, and after the first terminal sends the control information to the second terminal, the method further includes: the second terminal receives first interface data of the first interface from the first terminal; and the second terminal displays second interface data in the second interface, where the second interface data includes display content corresponding to the first interface data.
In a possible implementation, the target function unit is a video capture module, and the target data is second video data captured by the video capture module. That the first terminal is in the first state includes: the first terminal displays a first interface, where the first interface includes first video data from a third terminal. That the first terminal executes the first function associated with the target data includes: the first terminal receives the second video data from the second terminal while displaying the first interface; and the first terminal sends the second video data to the third terminal.
In this way, with at most two operations, the user can have the first terminal perform video interaction by means of the second terminal, freeing the user's hands. This simplifies the user's operation steps and improves operation efficiency.
In one possible implementation, after the second terminal turns on the target functional unit in response to the control information, the method further includes: the second terminal receives the first video data from the first terminal. And the second terminal displays a second interface, wherein the second interface comprises the first video data.
In this way, the user can also view the first video data from the third terminal at the second terminal.
In a possible implementation manner, the communication connection may be established in any of the following manners. The first terminal performs device discovery when it is in the first state, and establishes the communication connection with the second terminal when the second terminal is discovered. Alternatively, when the distance between the second terminal and a target electronic device is smaller than a distance threshold, the second terminal acquires a device identifier of the first terminal stored in the target electronic device, and establishes the communication connection with the first terminal according to the device identifier. Alternatively, the first terminal receives voice information input by the user, where the voice information is used to indicate connection with the second terminal; the first terminal performs device discovery and establishes the communication connection with the second terminal when the second terminal is discovered. Alternatively, the second terminal transmits a UWB signal carrying a device identifier of the second terminal, the first terminal receives the UWB signal from the second terminal, and the first terminal establishes the communication connection with the second terminal according to the device identifier in the UWB signal.
In a possible implementation, before the first terminal sends the control information to the second terminal based on the established communication connection, the method further includes: the second terminal displays a fifth interface, where the fifth interface includes a first control for controlling entry into or exit from a third preset mode; and the second terminal enters the third preset mode in response to a trigger operation on the first control.
In a possible implementation manner, the control information is information indicating that the first terminal is in the first state, and that the second terminal starts the target function unit in response to the control information includes: the second terminal starts the target function unit according to the information indicating that the first terminal is in the first state. Alternatively, the control information is an instruction for starting the target function unit, and that the second terminal starts the target function unit in response to the control information includes: the second terminal starts the target function unit according to the instruction for starting the target function unit.
In this way, the decision to start the target function unit may be made either by the first terminal or by the second terminal.
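For illustration only, the two forms of control information described above could be handled on the second terminal roughly as in the following Kotlin sketch; the type names, identifier strings, and function unit names are assumptions and are not part of the claimed method.

```kotlin
// Illustrative sketch only: all names are assumed, not part of the claims.
sealed class ControlInformation {
    // Form 1: information indicating that the first terminal is in the first state,
    // for example the interface identifier of the first interface.
    data class FirstTerminalState(val interfaceId: String) : ControlInformation()
    // Form 2: an explicit instruction to start a given target function unit.
    data class StartUnitInstruction(val unitName: String) : ControlInformation()
}

fun onControlInformation(info: ControlInformation) = when (info) {
    // The second terminal decides which unit matches the first terminal's state.
    is ControlInformation.FirstTerminalState ->
        startTargetFunctionUnit(chooseUnitForState(info.interfaceId))
    // The first terminal has already decided; start the named unit directly.
    is ControlInformation.StartUnitInstruction ->
        startTargetFunctionUnit(info.unitName)
}

fun chooseUnitForState(interfaceId: String): String = when (interfaceId) {
    "text_input_interface" -> "virtual_text_keyboard"
    "game_interface" -> "virtual_control_keyboard"
    else -> "unknown_unit"
}

fun startTargetFunctionUnit(unitName: String) {
    println("Starting target function unit: $unitName")
}
```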
In a second aspect, the present application further provides a device control apparatus applied to a first terminal, where the apparatus includes: a first processing unit, configured to control the first terminal to be in a first state, where the first terminal can execute a first function when in the first state; and a first communication unit, configured to send control information to a second terminal based on an established communication connection, where the control information is used to instruct the second terminal to start a target function unit so as to cooperate with the first terminal to execute the first function.
In a possible implementation, the first communication unit is further configured to receive target data from the second terminal. The first processing unit is further used for executing a first function related to the target data.
In one possible implementation, the target function unit is a virtual text key in the second interface, and the target data is text data. The first processing unit is specifically configured to control the first terminal to display a first interface, where the first interface includes an area where text is to be input. The first processing unit is further specifically configured to control the first terminal to input the text data in the area of the first interface where text is to be input when the first interface is displayed.
In one possible embodiment, the target data is control data. The first processing unit is specifically configured to control the first terminal to display a first interface, where the first interface includes a game character. The first processing unit is further specifically configured to control the first terminal to, when the first interface is displayed, control a game character in the first interface to execute a game behavior corresponding to the control data.
In one possible implementation, the target function unit is a virtual control keyboard in the second interface, where the virtual control keyboard includes virtual keys for controlling the game behavior of the game character. Alternatively, the control data is attitude data, and the target function unit is an attitude sensor of the second terminal, configured to collect the attitude data.
In a possible implementation, the target functional unit is a video capture module, and the target data is second video data captured by the video capture module. The first processing unit is specifically configured to control the first terminal to display a first interface, where the first interface includes first video data from a third terminal, and the first processing unit is specifically configured to control the first terminal to receive second video data from a second terminal when the first interface is displayed; and transmitting the second video data to the third terminal.
In a third aspect, the present application further provides a device control apparatus applied to a second terminal, where the apparatus includes: a second communication unit, configured to receive control information from a first terminal based on an established communication connection; and a second processing unit, configured to start a target function unit in response to the control information, and further configured to cooperate with the first terminal to execute a first function based on the target function unit.
In a possible implementation manner, the second processing unit is specifically configured to control the second communication unit to send the target data to the first terminal in response to a trigger operation on the target function unit; wherein the target data is used to instruct the first terminal to perform a first function associated with the target data.
In a possible implementation manner, the target function unit is a virtual text key in the second interface, the target data is text data, and the text data is used for instructing the first terminal to input the text data in an area of the first interface where text is to be input when the first interface is displayed.
In a possible implementation manner, the target data is control data, and the target data is used for instructing the first terminal to control a game character in the first interface to execute a game behavior corresponding to the control data when the first interface is displayed.
In one possible implementation, the target function unit is a virtual control keyboard in the second interface, wherein the virtual control keyboard comprises virtual keys for controlling game behaviors of the game character; or the control data is attitude data, and the target function unit is an attitude sensor of the second terminal and used for acquiring the attitude data.
In a possible implementation manner, the target function unit is a video capture module, the target data is second video data captured by the video capture module, and the second video data is used for instructing the first terminal to send the second video data to the third terminal when displaying the first interface including the first video data.
In a possible implementation manner, the control information is information used to indicate that the first terminal is in the first state, and the second processing unit is specifically configured to start the target function unit according to the information indicating that the first terminal is in the first state. Or the control information is an instruction for starting the target functional unit, and the second processing unit is specifically configured to start the target functional unit according to the instruction for starting the target functional unit.
In a fourth aspect, the present application also provides a computer-readable storage medium storing instructions that, when executed, cause a computer to perform a method performed by a first terminal or a second terminal as described in the first aspect or any implementation manner of the first aspect.
In a fifth aspect, the present application also provides a computer program product comprising a computer program that, when executed, causes a computer to perform the method performed by the first terminal or the second terminal as described in the first aspect or any implementation manner of the first aspect.
It should be understood that the second aspect to the fifth aspect of the present application correspond to the technical solutions of the first aspect of the present application, and the beneficial effects achieved by the aspects and the corresponding possible implementations are similar and will not be described again.
Drawings
Fig. 1-2 are schematic interface diagrams of inputting text data into a first interface displayed by the smart television 100 by means of the mobile phone 200;
fig. 3 is a schematic diagram of a hardware system architecture of a terminal device according to an embodiment of the present application;
fig. 4 is a schematic diagram of a software system architecture of a terminal device according to an embodiment of the present application;
fig. 5 is a schematic diagram illustrating that the mobile phone 200 is brought close to the remote controller 300 of the smart tv 100 to obtain the device identifier of the smart tv 100 according to the embodiment of the present application;
fig. 6 is a schematic diagram of the mobile phone 200 receiving the voice message "connect with tv" input by the user according to the embodiment of the present application;
fig. 7 is a schematic diagram illustrating that the mobile phone 200 establishes a communication connection with the smart tv 100 according to an embodiment of the present application;
fig. 8 is an interface schematic diagram illustrating that the smart television 100 sends an interface identifier of the first interface 101 to the mobile phone 200 in the process of displaying the first interface 101 according to the embodiment of the present application;
fig. 9 is a schematic diagram illustrating that, in the process of displaying the second interface 207, the mobile phone 200 responds to a user's trigger operation on a virtual text key in the second interface 207 to input an account and a password in the first interface 101 of the smart television 100 according to the embodiment of the present application;
fig. 10 is a schematic interface diagram illustrating that when the smart tv 100 displays a game interface, a communication connection is established with the mobile phone 100 according to an embodiment of the present application;
fig. 11 is an interface schematic diagram illustrating that the smart television 100 sends an interface identifier of a game interface to the mobile phone 200 in a process of displaying the game interface according to the embodiment of the present application;
fig. 12 is an interface schematic diagram of the mobile phone 200, which is provided in the embodiment of the present application, controlling the game character a in the first interface 101 of the smart television 100 to attack the game character B with a light hand in response to a trigger operation of the user on the light-hand attack key "a" in the second interface 207;
fig. 13 is a schematic interface diagram illustrating that when the mobile phone 200 displays a video playing interface, a communication connection is established with the smart television 100 according to the embodiment of the present application;
fig. 14 is a schematic interface diagram of the mobile phone 200 according to the embodiment of the present application, which implements a large-screen video interaction function by means of software and hardware of the smart tv 100;
fig. 15 is a schematic interface diagram illustrating the mobile phone 200 set in the third preset mode according to the embodiment of the present application;
fig. 16 is a flowchart illustrating an apparatus control method according to an embodiment of the present application;
fig. 17 is a schematic diagram of a functional architecture of a first terminal performing a function associated with target data by a second terminal according to an embodiment of the present application;
fig. 18 is a block diagram of a terminal device 1800 according to an embodiment of the present application;
fig. 19 is a block diagram of a terminal device 1900 according to an embodiment of the present application;
fig. 20 is a schematic hardware structure diagram of a terminal device according to an embodiment of the present application;
fig. 21 is a schematic structural diagram of a chip according to an embodiment of the present application.
Detailed Description
In the embodiments of the present application, terms such as "first" and "second" are used to distinguish the same or similar items having substantially the same function and action. For example, the first value and the second value are only used to distinguish different values, and the order of the values is not limited. Those skilled in the art will appreciate that the terms "first," "second," etc. do not denote any order or quantity, nor do the terms "first," "second," etc. denote any order or importance.
It is noted that, in the present application, words such as "exemplary" or "for example" are used to mean exemplary, illustrative, or descriptive. Any embodiment or design described herein as "exemplary" or "such as" is not necessarily to be construed as preferred or advantageous over other embodiments or designs. Rather, use of the word "exemplary" or "such as" is intended to present concepts related in a concrete fashion.
In the present application, "at least one" means one or more, "a plurality" means two or more. "and/or" describes the association relationship of the associated object, indicating that there may be three relationships, for example, a and/or B, which may indicate: a exists alone, A and B exist simultaneously, and B exists alone, wherein A and B can be singular or plural. The character "/" generally indicates that the former and latter associated objects are in an "or" relationship. "at least one of the following" or similar expressions refer to any combination of these items, including any combination of the singular or plural items. For example, at least one (one) of a, b, or c, may represent: a, b, c, a-b, a-c, b-c, or a-b-c, wherein a, b, c may be single or multiple.
As the functions of terminal devices become richer, a first terminal often needs a second terminal in order to complete a relatively complex function more conveniently. For example, taking the first terminal being the smart tv 100 and the second terminal being the mobile phone 200 as an example, the following describes, with reference to fig. 1-2, how the smart tv 100 completes a certain function by means of the mobile phone 200.
As shown in fig. 1, the smart television 100 displays a first interface 101, where the first interface 101 includes an account input area A and a password input area B, and the user needs to input an account in the account input area A and a password in the password input area B to complete the login and use the smart television 100. For example, the user can perform at least the following steps to input the account and the password in the first interface 101 with the aid of the mobile phone 200.
Step 1: as shown in fig. 1, a user may establish a communication connection between a mobile phone 200 and a smart tv 100.
Step 2: as shown in fig. 2 (a) - (b), the mobile phone 200 displays a system main interface 201 in response to an unlocking operation by the user.
Step 3: as shown in fig. 2 (b), the system main interface 201 includes an icon 202 of the "collaboration" application, and the mobile phone 200 displays a main interface 203 of the "collaboration" application in response to the user's trigger operation on the icon 202 of the "collaboration" application, as shown in fig. 2 (c).
Step 4: as shown in fig. 2 (c), the main interface 203 of the "collaboration" application includes a control 204 for indicating that a collaboration function is implemented. The mobile phone 200 can display a function list interface 205 in response to the user's trigger operation on the control 204. The function list interface 205 displays "text input application", "gamepad application", and "game application", and a switch control 206 is displayed on one side of each of them.
Step 5: as shown in fig. 2 (c) - (d), the mobile phone 200 may display a second interface 207 in response to the user's trigger operation on the switch control 206 on the "text input application" side, where the second interface 207 includes virtual keys. In response to the user's trigger operations on the virtual keys, the mobile phone 200 may control the smart television 100 to input an account in the account input area A and a password in the password input area B, so as to complete the login.
However, for the smart television 100 to complete the function of inputting the account and the password by means of the mobile phone 200, the user needs to perform at least the above steps 1 to 5; the operation steps are complicated and inefficient.
In view of this, an embodiment of the present application provides a device control method: when a first terminal displays a first interface and needs a second terminal to complete a certain function, a communication connection between the first terminal and the second terminal may be established; the first terminal then sends an identifier of the first interface to the second terminal; the second terminal recognizes the identifier of the first interface and displays a second interface corresponding to the identifier of the first interface; the second terminal acquires target data while displaying the second interface and sends the target data to the first terminal; and the first terminal executes a function associated with the target data. Therefore, the operation steps of the user are reduced, the first terminal can complete the target function by means of the second terminal, and efficiency is improved.
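To make this flow concrete, the following is a minimal Kotlin sketch of the exchange from the perspective of both terminals; the Connection abstraction, message format, and function names are illustrative assumptions rather than part of the embodiment.

```kotlin
// Minimal sketch under assumed names; the real transport and UI logic are omitted.
interface Connection {
    fun send(message: String)
    fun receive(): String
}

// First terminal side (e.g., a smart TV displaying the first interface).
fun firstTerminalFlow(connection: Connection, firstInterfaceId: String) {
    connection.send("INTERFACE_ID:$firstInterfaceId")   // send the identifier of the first interface
    val targetData = connection.receive()               // e.g., a key identifier or control data
    println("Executing the function associated with: $targetData")
}

// Second terminal side (e.g., a mobile phone).
fun secondTerminalFlow(connection: Connection) {
    val interfaceId = connection.receive().removePrefix("INTERFACE_ID:")
    println("Displaying the second interface corresponding to: $interfaceId")
    val targetData = "key:A"                            // stand-in for data acquired while the second interface is shown
    connection.send(targetData)                         // the first terminal then executes the associated function
}
```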
It is to be understood that the first terminal and the second terminal described above may be referred to as terminal devices. The terminal device may be a mobile phone (mobile phone), a smart tv, a wearable device, a tablet computer (Pad), a computer with a wireless transceiving function, a Virtual Reality (VR) terminal device, an Augmented Reality (AR) terminal device, a wireless terminal in industrial control (industrial control), a wireless terminal in self-driving (self-driving), a wireless terminal in remote surgery (remote medical surgery), a wireless terminal in smart grid (smart grid), a wireless terminal in transportation safety (transportation safety), a wireless terminal in smart city (smart city), a wireless terminal in smart home (smart home), and so on. The embodiment of the present application does not limit the specific technology and the specific device form adopted by the terminal device.
In order to better understand the embodiments of the present application, the following describes a structure of a terminal device according to the embodiments of the present application. Exemplarily, fig. 3 is a schematic structural diagram of a terminal device provided in an embodiment of the present application.
The terminal device may include a processor 110, an external memory interface 120, an internal memory 121, a Universal Serial Bus (USB) interface 130, a charging management module 140, a power management module 141, an antenna 1, an antenna 2, a mobile communication module 150, a wireless communication module 160, an audio module 170, a speaker 170A, a receiver 170B, a microphone 170C, a sensor module 180, a key 190, an indicator 192, a camera 193, a display 194, and the like. The sensor module 180 may include a pressure sensor 180A, a gyroscope sensor 180B, an air pressure sensor 180C, a magnetic sensor 180D, an acceleration sensor 180E, a distance sensor 180F, a proximity light sensor 180G, a fingerprint sensor 180H, a temperature sensor 180J, a touch sensor 180K, an ambient light sensor 180L, a bone conduction sensor 180M, and the like. Among them, the combination of the gyro sensor 180B and the acceleration sensor 180E may be understood as an attitude sensor.
It is to be understood that the illustrated structure of the embodiments of the present application does not constitute a specific limitation to the terminal device. In other embodiments of the present application, a terminal device may include more or fewer components than shown, or some components may be combined, some components may be split, or a different arrangement of components may be used. The illustrated components may be implemented in hardware, software, or a combination of software and hardware.
Processor 110 may include one or more processing units. The different processing units may be separate devices or may be integrated into one or more processors. A memory may also be provided in processor 110 for storing instructions and data.
The USB interface 130 is an interface conforming to the USB standard specification, and may specifically be a Mini USB interface, a Micro USB interface, a USB Type C interface, or the like. The USB interface 130 may be used to connect a charger to charge the terminal device, and may also be used to transmit data between the terminal device and the peripheral device. And the earphone can also be used for connecting an earphone and playing audio through the earphone. The interface may also be used to connect other electronic devices, such as AR devices and the like.
The charging management module 140 is configured to receive charging input from a charger. The charger can be a wireless charger or a wired charger. The power management module 141 is used for connecting the charging management module 140 and the processor 110.
The wireless communication function of the terminal device can be realized by the antenna 1, the antenna 2, the mobile communication module 150, the wireless communication module 160, the modem processor, the baseband processor, and the like.
The antennas 1 and 2 are used for transmitting and receiving electromagnetic wave signals. Antennas in terminal devices may be used to cover single or multiple communication bands. Different antennas can also be multiplexed to improve the utilization of the antennas.
The mobile communication module 150 may provide a solution including 2G/3G/4G/5G wireless communication applied on the terminal device. The mobile communication module 150 may include at least one filter, a switch, a power amplifier, a Low Noise Amplifier (LNA), and the like. The mobile communication module 150 may receive the electromagnetic wave from the antenna 1, filter, amplify, etc. the received electromagnetic wave, and transmit the electromagnetic wave to the modem processor for demodulation.
The wireless communication module 160 may provide a solution for wireless communication applied to a terminal device, including Wireless Local Area Networks (WLANs) (e.g., wireless fidelity (Wi-Fi) networks), bluetooth (BT), global Navigation Satellite System (GNSS), frequency Modulation (FM), and the like.
The terminal device realizes the display function through the GPU, the display screen 194, and the application processor. The GPU is a microprocessor for image processing, and is connected to the display screen 194 and an application processor. The GPU is used to perform mathematical and geometric calculations for graphics rendering.
The display screen 194 is used to display images, video, and the like. The display screen 194 includes a display panel. In some embodiments, the terminal device may include 1 or N display screens 194, N being a positive integer greater than 1.
The terminal device can realize the shooting function through the ISP, the camera 193, the video codec, the GPU, the display screen 194, the application processor, and the like.
The camera 193 is used to capture still images or video. In some embodiments, the terminal device may include 1 or N cameras 193, N being a positive integer greater than 1.
The external memory interface 120 may be used to connect an external memory card, such as a Micro SD card, to extend the storage capability of the terminal device. The external memory card communicates with the processor 110 through the external memory interface 120 to implement a data storage function. For example, files such as music, video, etc. are saved in an external memory card.
The internal memory 121 may be used to store computer-executable program code, which includes instructions. The internal memory 121 may include a program storage area and a data storage area. Illustratively, the internal memory 121 may be used to store a first target table and a second target table. Specific contents of the first target table may refer to table 1 in the following embodiments, and specific contents of the second target table may refer to table 2 in the following embodiments, which are not described herein.
The terminal device may implement an audio function through the audio module 170, the speaker 170A, the receiver 170B, the microphone 170C, and the application processor. Such as music playing, recording, etc.
The audio module 170 is used to convert digital audio information into an analog audio signal output and also to convert an analog audio input into a digital audio signal. The speaker 170A, also called a "horn", is used to convert the audio electrical signal into an acoustic signal. The terminal device can play music through the speaker 170A or listen to voice. The receiver 170B, also called "earpiece", is used to convert the electrical audio signal into an acoustic signal. When the terminal device answers a call or voice information, it is possible to answer a voice by bringing the receiver 170B close to the human ear. The microphone 170C, also referred to as a "microphone," is used to convert sound signals into electrical signals, such as collected speech.
The pressure sensor 180A is used for sensing a pressure signal, and can convert the pressure signal into an electrical signal. In some embodiments, the pressure sensor 180A may be disposed on the display screen 194. The gyro sensor 180B may be used to determine the motion attitude of the terminal device. The air pressure sensor 180C is used to measure air pressure. The magnetic sensor 180D includes a hall sensor. The acceleration sensor 180E can detect the magnitude of the acceleration of the terminal device in various directions (generally, three axes). A distance sensor 180F for measuring a distance. The proximity light sensor 180G may include, for example, a Light Emitting Diode (LED) and a light detector, such as a photodiode. The ambient light sensor 180L is used to sense ambient light brightness. The fingerprint sensor 180H is used to collect a fingerprint. The temperature sensor 180J is used to detect temperature. The touch sensor 180K is also called a "touch device". The touch sensor 180K may be disposed on the display screen 194, and the touch sensor 180K and the display screen 194 form a touch screen, which is also called a "touch screen". The bone conduction sensor 180M can acquire a vibration signal.
The keys 190 include a power-on key, a volume key, and the like. The keys 190 may be mechanical keys. Or may be touch keys. The terminal device may receive a key input, and generate a key signal input related to user setting and function control of the terminal device. Indicator 192 may be an indicator light that may be used to indicate a state of charge, a change in charge, or a message, missed call, notification, etc.
The software system of the terminal device may adopt a layered architecture, an event-driven architecture, a micro-core architecture, a micro-service architecture, or a cloud architecture, which is not described herein again.
The embodiment of the application takes an Android system with a layered architecture as an example to exemplarily illustrate the software structure of a terminal device. Fig. 4 is a block diagram of the software structure of a terminal device to which the embodiment of the present application is applicable. The layered architecture divides the software system of the terminal device into several layers, and each layer has a clear role and division of labor. The layers communicate with each other through software interfaces. In some embodiments, the Android system may be divided into five layers, which are, from top to bottom, an application layer (applications), an application framework layer (application framework), an Android runtime (Android runtime) and system library, a hardware abstraction layer (HAL), and a kernel layer (kernel).
The application layer may include a series of application packages, and the application layer runs an application by calling an application programming interface (API) provided by the application framework layer. As shown in fig. 4, the application packages may include applications such as camera, text input, game, gamepad, cooperative control, WeChat, telephony, maps, navigation, WLAN, Bluetooth, and messages. The display interface shown when the text input application is opened includes virtual text keys, and the virtual text keys are used to input text. Exemplary virtual text keys include number keys, letter keys, symbol keys, and the like, which are not limited herein. The display interface shown when the game application is opened is a game interface. For example, the game interface may include a game character A and a game character B in a fighting game. The display interface shown when the gamepad application is opened may include a virtual control keyboard. The virtual control keyboard may include a forward key "←", a backward key "→", a jump key "↑", a squat key "↓", a light-hand attack key "A", a heavy-hand attack key "B", a light-foot attack key "C", and a heavy-foot attack key "D". The cooperative control application may be used to call applications such as the text input application, the gamepad application, and a video application, and may also be used to call hardware modules such as the acceleration sensor 180E, the gyro sensor 180B, the speaker 170A, and the microphone 170C.
The application framework layer provides an API and programming framework for the applications of the application layer. The application framework layer includes a number of predefined functions. As shown in FIG. 4, the application framework layers may include a window manager, content provider, view system, phone manager, resource manager, notification manager, and the like.
The window manager is used for managing window programs. The window manager can obtain the size of the display screen, judge whether a status bar exists, lock the screen, intercept the screen and the like. Content providers are used to store and retrieve data and make it accessible to applications. The view system includes visual controls such as controls to display text, controls to display pictures, and the like. The view system may be used to build applications. The display interface may be composed of one or more views.
The Android runtime includes a core library and a virtual machine, and is responsible for scheduling and managing the Android system. The core library includes two parts: one part is the functions that the java language needs to call, and the other part is the core library of Android. The application layer and the application framework layer run in the virtual machine. The virtual machine executes the java files of the application layer and the application framework layer as binary files. The virtual machine is used to perform functions such as object life cycle management, stack management, thread management, security and exception management, and garbage collection. The system library may include a plurality of functional modules.
The surface manager is used to manage the display subsystem and provide fusion of 2D and 3D layers for multiple applications. The media library supports playback and recording of various commonly used audio and video formats, still image files, and the like.
The hardware abstraction layer may contain a plurality of library modules, such as a camera library module and a motor library module. The Android system can load the corresponding library modules for the device hardware, so that the application framework layer can access the device hardware. The device hardware may include, for example, a speaker, a display screen, and a camera in the terminal device.
The kernel layer is a layer between hardware and software. The kernel layer is used to drive the hardware so that the hardware works. The kernel layer includes at least a display driver, a motor driver, a sensor driver, and the like, which is not limited in the embodiments of the present application.
The following describes the technical solutions of the present application and how to solve the above technical problems with specific embodiments. The following embodiments may be implemented independently or in combination, and details of the same or similar concepts or processes may not be repeated in some embodiments.
In the following, taking the first terminal as the smart television 100 and the second terminal as the mobile phone 200 as an example, and referring to the interface schematic diagrams of fig. 5 to fig. 15, a device control method provided in the embodiment of the present application is described, and this example does not limit the embodiment of the present application. The following embodiments may be combined with each other and are not described in detail with respect to the same or similar concepts or processes.
As shown in fig. 5 or fig. 6, when the user turns on the smart tv 100, the smart tv 100 displays the first interface 101 (i.e., the smart tv 100 is in the first state). The first interface 101 includes an account input area A and a password input area B, both of which are areas where text is to be input (i.e., the smart tv 100 can implement a text input function, i.e., the first function). The first interface 101 may be understood as a text input interface. In this case, the user needs to input an account and a password in the first interface 101 by means of the mobile phone 200 (i.e., the smart tv 100 needs to input text data by means of the mobile phone 200), and the user can use the smart tv 100 after the smart tv 100 successfully verifies the account and the password, for example, to watch a live channel, request a movie, or play a game.
Before the user inputs an account and a password in the first interface 101 by means of the mobile phone 200, the smart tv 100 needs to establish a communication connection with the mobile phone 200. Illustratively, the communication connection mode of the mobile phone 200 and the smart tv 100 may include, but is not limited to, the following modes:
First manner: the smart tv 100 performs device discovery while displaying the first interface 101, and establishes a communication connection with the mobile phone 200 when the mobile phone 200 is discovered. The communication connection may be a Bluetooth connection, a WiFi connection, an NFC connection, a UWB connection, or the like, which is not limited herein. In this way, the communication connection between the smart television 100 and the mobile phone 200 can be established without any user operation, which reduces the user's operation steps and improves efficiency.
Second manner: as shown in fig. 5, when the mobile phone 200 is brought close to the remote controller 300 of the smart tv 100 (e.g., the user touches the remote controller 300 once with the mobile phone 200), the mobile phone 200 may establish an NFC communication connection with the remote controller 300. In this way, the mobile phone 200 may obtain, through the NFC communication, the device identifier of the smart tv 100 (which may include the communication address of the smart tv 100) stored in the remote controller 300. Further, as shown in fig. 7, the mobile phone 200 may establish a communication connection with the smart tv 100 according to the device identifier. Therefore, the user only needs to bring the mobile phone 200 close to the remote controller 300 to conveniently and quickly establish the communication connection between the mobile phone 200 and the smart television 100.
In addition, to prevent the mobile phone 200 from establishing a communication connection with the smart television 100 when the mobile phone 200 accidentally touches the remote controller 300, the mobile phone 200 may establish the communication connection with the smart tv 100 only after NFC communication has been established with the remote controller 300 twice (i.e., the user touches the remote controller 300 twice with the mobile phone 200).
Alternatively, the remote controller 300 may be replaced with a router, the smart tv 100, or the like, which is not limited herein. It is to be understood that the remote controller 300, the router, or the smart tv 100 described above is the target electronic device.
Third manner: as shown in fig. 6, the mobile phone 200 may receive the voice message "connect with tv" input by the user. Further, as also shown in fig. 7, the mobile phone 200 discovers the smart tv 100 and establishes a communication connection with the smart tv 100. It can be understood that the voice information is used to indicate the communication connection with the smart tv 100; as such, the voice message may be replaced by "connect to tv" or other expressions with the same meaning, which is not limited herein.
Fourth manner: the mobile phone 200 may transmit a UWB signal, where the UWB signal carries the device identifier of the mobile phone 200 (which may include the communication address of the mobile phone 200). When the antenna of the mobile phone 200 is aimed at the smart tv 100, the smart tv 100 can receive the device identifier of the mobile phone 200. Further, the smart tv 100 establishes a communication connection with the mobile phone 200 according to the device identifier. Therefore, the user only needs to aim the mobile phone 200 at the smart tv 100 to conveniently and quickly establish the communication connection between the mobile phone 200 and the smart television 100.
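As a hedged sketch of the second manner above, assume the device identifier read over NFC from the remote controller 300 carries the smart tv 100's address and port; the NFC read itself, the data class, and the two-tap rule wiring are illustrative assumptions.

```kotlin
import java.net.Socket

// Hypothetical device identifier obtained from the remote controller 300; per the
// description above it may include the communication address of the smart tv 100.
data class DeviceIdentifier(val deviceName: String, val ipAddress: String, val port: Int)

// Counts NFC taps so that a single accidental touch does not trigger a connection:
// only the second tap establishes the communication connection, as described above.
class NfcTapConnector {
    private var tapCount = 0

    fun onNfcTagRead(id: DeviceIdentifier): Socket? {
        tapCount++
        return if (tapCount >= 2) connectTo(id) else null
    }

    private fun connectTo(id: DeviceIdentifier): Socket =
        // Plain TCP connection to the address carried in the device identifier.
        Socket(id.ipAddress, id.port)
}
```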
After the mobile phone 200 is successfully connected with the smart television 100, the smart television 100 controls the mobile phone 200 to display a second interface in the process of displaying the first interface 101. The second interface comprises virtual text keys, and the virtual text keys are used for inputting texts. Exemplary virtual text keys include number keys, letter keys, symbol keys, and the like, without limitation. It will be appreciated that the second interface described above is a keyboard interface. Next, how the smart tv 100 controls the mobile phone 200 to display the second interface is described with reference to table 1 and table 2, respectively.
Exemplarily, the mobile phone 200 may store a first target table, where the first target table includes a mapping relationship between interface identifiers of the smart tv 100 and interface identifiers of the mobile phone 200. The contents of the first target table may be as shown in Table 1 below:

| Interface identifier of the smart television 100 | Interface identifier of the mobile phone 200 |
| --- | --- |
| Interface identifier of the text input interface | Interface identifier of the keyboard interface |
| Interface identifier of the game interface | Interface identifier of the gamepad interface |
| Interface identifier of the video playing interface | Interface identifier of the video playing interface |

Table 1
After the mobile phone 200 and the smart tv 100 are successfully connected, as shown in fig. 8, while displaying the first interface 101, the smart tv 100 may send the interface identifier of the first interface 101 (i.e., the interface identifier of the text input interface, i.e., the control information) to the mobile phone 200. According to the interface identifier of the first interface 101, the mobile phone 200 looks up the interface identifier of the second interface 207 (i.e., the interface identifier of the keyboard interface) in the first target table. Further, the mobile phone 200 displays the second interface 207 (i.e., the keyboard interface) according to the found interface identifier.
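A possible in-memory form of the first target table and the lookup it supports is sketched below; the identifier strings are placeholders, not values defined by the embodiment.

```kotlin
// Sketch of the first target table (Table 1) on the mobile phone 200; strings are assumed.
val firstTargetTable: Map<String, String> = mapOf(
    "text_input_interface" to "keyboard_interface",
    "game_interface" to "gamepad_interface",
    "video_playing_interface" to "video_playing_interface",
)

// On receiving the interface identifier of the first interface 101 (the control information),
// the phone looks up the interface identifier of the second interface 207 to display.
fun resolveSecondInterfaceId(firstInterfaceId: String): String? =
    firstTargetTable[firstInterfaceId]
```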
In addition, the mapping relationship may instead be a mapping relationship between interface identifiers of the smart television 100 and application package names of the mobile phone 200. Illustratively, the contents of the first target table may also be as shown in Table 2 below:
Table 2 (mapping between interface identifiers of the smart television 100 and application package names of the mobile phone 200; its contents are provided as figures in the original publication)
Based on Table 2, the mobile phone 200 may instead find the package name of the keyboard application in the first target table according to the interface identifier of the first interface 101. Further, the mobile phone 200 opens the keyboard application according to the found package name to display the second interface 207.
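For the Table 2 variant, the lookup could resolve an application package name instead and open that application; the package names below are hypothetical, and the launch call is the standard Android API.

```kotlin
import android.content.Context
import android.content.Intent

// Sketch of the package-name form of the first target table (Table 2); package names are assumed.
val packageNameTable: Map<String, String> = mapOf(
    "text_input_interface" to "com.example.keyboard",   // hypothetical keyboard application
    "game_interface" to "com.example.gamepad",          // hypothetical gamepad application
)

// Resolve the package name from the first interface's identifier and open that application
// so that its interface (e.g., the keyboard interface) is displayed as the second interface 207.
fun openSecondInterfaceApp(context: Context, firstInterfaceId: String) {
    val packageName = packageNameTable[firstInterfaceId] ?: return
    val launchIntent: Intent? = context.packageManager.getLaunchIntentForPackage(packageName)
    launchIntent?.let { context.startActivity(it) }
}
```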
As shown in fig. 9, during the process of displaying the second interface 207, the mobile phone 200 may input an account and a password in the first interface 101 of the smart tv 100 in response to a user's trigger operation on a virtual text key in the second interface 207. The method for inputting the account and the password in the first interface 101 by the mobile phone 200 includes, but is not limited to, the following two methods:
the first method comprises the following steps: the virtual text keys are marked with key identifications. The key identifier may be a number "0", a number "1", or the like, or an english letter "a", a number "b", or the like. The mobile phone 200 may transmit the key identification (i.e., the target data) of the triggered virtual text key to the smart tv 100 in response to the triggering operation of the virtual text key by the user. The smart tv 100 inputs the received key identifier in the first interface 101.
Second manner: the virtual text keys are marked with key identifiers. A key identifier may be a number such as "0" or "1", or an English letter such as "a" or "b". A key identifier may have a mapping relationship with a key code. Illustratively, the smart tv 100 and the mobile phone 200 each store a second target table, where the second target table includes the mapping relationship between key identifiers and key codes. The contents of the second target table may be as shown in Table 3 below:
| Key identifier | Key code |
| --- | --- |
| Key identifier '0' | 48 |
| Key identifier '1' | 49 |
| Key identifier '2' | 50 |
| Key identifier '3' | 51 |
| Key identifier '4' | 52 |
| Key identifier '5' | 53 |
| Key identifier '6' | 54 |
| Key identifier '7' | 55 |
| Key identifier '8' | 56 |
| Key identifier '9' | 57 |
| Key identifier 'Enter' | 13 |
| Key identifier 'A' | 65 |
| Key identifier 'B' | 66 |
| Key identifier 'C' | 67 |
| ... | ... |
| Key identifier 'Z' | 90 |

Table 3
In response to the user's trigger operation on a virtual text key, the mobile phone 200 may look up, in the second target table, the key code corresponding to the key identifier of the triggered virtual text key. The mobile phone 200 transmits the found key code to the smart tv 100. The smart tv 100 finds the key identifier corresponding to the key code in the second target table, and inputs the found key identifier in the first interface 101.
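For illustration only, the following Kotlin sketch shows one possible realization of the second method, assuming both terminals hold the same second target table; the key codes follow Table 3, and the transport and function names are hypothetical.

```kotlin
// Sketch of the second method, assuming both terminals share the second target
// table (Table 3). Transport and helper names are hypothetical.

val keyCodeTable: Map<String, Int> =
    ('0'..'9').associate { it.toString() to it.code } +   // '0'..'9' -> 48..57
    ('A'..'Z').associate { it.toString() to it.code } +   // 'A'..'Z' -> 65..90
    mapOf("Enter" to 13)

val keyIdTable: Map<Int, String> =
    keyCodeTable.entries.associate { (id, code) -> code to id }

// On the mobile phone 200: a virtual text key was triggered.
fun onVirtualKeyTriggered(keyId: String) {
    val keyCode = keyCodeTable[keyId] ?: return
    sendToTv(keyCode)                       // transmit only the key code
}

// On the smart television 100: a key code was received.
fun onKeyCodeReceived(keyCode: Int) {
    val keyId = keyIdTable[keyCode] ?: return
    inputIntoFirstInterface(keyId)          // e.g. append to the account/password field
}

fun sendToTv(keyCode: Int) { /* send over the established connection */ }
fun inputIntoFirstInterface(text: String) { /* write into the first interface 101 */ }
```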
In summary, in the above embodiment, the user only needs to establish a communication connection between the smart tv 100 and the mobile phone 200. The smart tv 100 then sends the identifier of the first interface to the mobile phone 200. After recognizing the first interface identifier, the mobile phone 200 displays a second interface corresponding to the first interface identifier. The mobile phone 200 receives a trigger operation of the user while displaying the second interface, and the smart television 100 inputs the account and the password corresponding to the trigger operation in the first interface. Thus, with at most two operations, the user can input the account and the password in the first interface of the smart television 100 by using the mobile phone 200. The operation steps of the user are thereby simplified, and the operation efficiency is improved.
It should be noted that, in the above embodiment, the first interface 101 of the smart television 100 is taken as a text input interface, and the user inputs an account and a password on the text input interface of the smart television 100 by using the mobile phone 200 as an example. Next, taking the first interface 101 of the smart tv 100 as a game interface as an example, how the user controls a game character in the game interface of the smart tv 100 to execute a game behavior by using the mobile phone 200 will be described.
As shown in fig. 10, when the user plays a fighting game using the smart tv 100, the smart tv 100 displays the first interface 101 (i.e., the smart tv 100 is in the first state). The first interface 101 includes a game character A and a game character B (i.e., the smart tv 100 has a function of controlling the behavior of a game character, i.e., a first function). It can be understood that the first interface 101 is a game interface. In addition, at this time, the mobile phone 200 displays the system main interface 201. Of course, instead of displaying the system main interface 201, the mobile phone 200 may also be in a lock screen state or display another browsing interface, etc.; the system main interface 201 is merely an example.
As also shown in fig. 10, if the user wants to control the game behavior of the game character a via the mobile phone 200, the smart tv 100 needs to establish a communication connection with the mobile phone 200. For example, the manner of the communication connection between the mobile phone 200 and the smart tv 100 may refer to the description in the above embodiments, and is not described herein again.
After the mobile phone 200 establishes a communication connection with the smart television 100, the smart television 100 controls the mobile phone 200 to display the second interface 207 in the process of displaying the first interface 101.
Illustratively, as shown in (a) - (b) of fig. 11, during the process of displaying the first interface 101, the smart tv 100 sends the interface identifier of the first interface 101 (i.e., the interface identifier of the game interface, i.e., the control information) to the mobile phone 200. The mobile phone 200 displays the second interface 207 according to the interface identifier of the first interface 101. The principle of displaying the second interface 207 by the mobile phone 200 according to the interface identifier of the first interface 101 is the same as that in the above embodiment, and is not described herein again.
Illustratively, as shown in fig. 11 (b), the second interface 207 may include a virtual control keyboard 208. The virtual control keyboard 208 includes virtual keys for controlling the game behavior of the game character A. For example, the virtual control keyboard 208 may include a forward key "←", a backward key "→", a jump key "↑", a squat key "↓", a light-hand attack key "A", a heavy-hand attack key "B", a light-foot attack key "C", and a heavy-foot attack key "D".
In addition, as also shown in fig. 11 (b), the mobile phone 200 may further receive interface data of the first interface 101 from the smart tv 100 and display the interface data of the first interface 101 on the second interface 207, so that the user can view the content of the first interface 101 of the smart tv 100 at close range on the mobile phone 200.
It is understood that, in the process of displaying the second interface 207 by the mobile phone 200, the mobile phone 200 can control the game behavior of the game character a in the first interface 101 in response to the trigger operation of the user.
As shown in (a) in fig. 12, the mobile phone 200 may control the game character A in the first interface 101 of the smart tv 100 to lightly attack the game character B in response to the user's trigger operation on the light-hand attack key "A" in the second interface 207. The principle by which the user controls the game character A in the first interface 101 of the smart television 100 to lightly attack the game character B by using the mobile phone 200 is the same as the principle by which the user inputs the account and the password into the first interface 101 of the smart television 100 by using the mobile phone 200 in the above embodiment, and details are not repeated here. It can be understood that the mobile phone 200 can also control the game character A in the first interface 101 of the smart television 100 to perform other operations based on the same principle, which is not limited or further described here.
In addition, fig. 11 (b) may be replaced with fig. 11 (c). As shown in fig. 11 (c), the mobile phone 200 displays the second interface 211. The second interface 211 includes a virtual control pad 212. When the virtual control pad 212 is triggered, control data (i.e., target data) for the game character is generated, where the control data is used to control the behavior of the game character A in the first interface 101. The virtual control pad 212 may include a forward key "←", a backward key "→", a jump key "↑", a squat key "↓", a light-hand attack key "A", a heavy-hand attack key "B", a light-foot attack key "C", and a heavy-foot attack key "D". As can be seen, the second interface 211 is the handle interface. The mobile phone 200 may also control the game character A in the first interface 101 of the smart tv 100 to lightly attack the game character B in response to the user's trigger operation on the light-hand attack key "A" in the second interface 211.
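For illustration only, the following Kotlin sketch shows one way the virtual control keys could be translated into control data on the mobile phone 200 and applied on the smart television 100; the command names and helper functions are assumptions, not part of the original filing.

```kotlin
// Illustrative mapping (assumed, not from the filing) from virtual control keys
// of the second interface to control data sent to the television.
enum class GameCommand {
    MOVE_FORWARD, MOVE_BACKWARD, JUMP, SQUAT,
    LIGHT_HAND_ATTACK, HEAVY_HAND_ATTACK, LIGHT_FOOT_ATTACK, HEAVY_FOOT_ATTACK
}

val virtualKeyToCommand = mapOf(
    "←" to GameCommand.MOVE_FORWARD,
    "→" to GameCommand.MOVE_BACKWARD,
    "↑" to GameCommand.JUMP,
    "↓" to GameCommand.SQUAT,
    "A" to GameCommand.LIGHT_HAND_ATTACK,
    "B" to GameCommand.HEAVY_HAND_ATTACK,
    "C" to GameCommand.LIGHT_FOOT_ATTACK,
    "D" to GameCommand.HEAVY_FOOT_ATTACK,
)

// Mobile phone 200: translate a key press into control data and send it.
fun onControlKeyTriggered(key: String) {
    virtualKeyToCommand[key]?.let { sendControlData(it.name) }
}

// Smart television 100: apply the received control data to game character A.
fun onControlDataReceived(command: String) {
    applyToGameCharacterA(GameCommand.valueOf(command))
}

fun sendControlData(data: String) { /* send over the established connection */ }
fun applyToGameCharacterA(cmd: GameCommand) { /* execute the game behavior */ }
```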
In addition, fig. 11 (b) may also be replaced with fig. 11 (d). As shown in (d) in fig. 11, the mobile phone 200 displays the third interface 213. The third interface 213 displays the first prompt message. The first prompt message is used to indicate that the mobile phone 200 is in a first preset mode, where the first preset mode may be understood as the mobile phone 200 having started a game mode. For example, the first prompt message may be, but is not limited to, the text message "in game mode". In addition to displaying the third interface 213, the mobile phone 200 also finds the gesture sensor according to the identifier of the first interface 101 of the smart tv 100 and the preset mapping relationship. In this way, the mobile phone 200 may share the gesture data of the mobile phone 200 collected by the gesture sensor with the smart tv 100. After receiving the gesture data, the smart television 100 may control the game character A to perform the operation associated with the gesture data.
Illustratively, when the mobile phone 200 moves to the right, it sends rightward gesture data (i.e., target data) to the smart tv 100. In this way, the smart tv 100 can control the game character A to move to the right according to the rightward gesture data. For another example, when the mobile phone 200 moves downward, it sends downward gesture data to the smart tv 100, and the smart tv 100 can control the game character A to squat according to the downward gesture data. For another example, when the mobile phone 200 is flipped forward, it sends forward-flip gesture data to the smart tv 100, and the smart tv 100 can control the game character A to lightly attack the game character B according to the forward-flip gesture data.
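For illustration only, the following Kotlin sketch shows one way the gesture data sharing described above could be organized, with the mobile phone 200 forwarding pose events and the smart television 100 mapping them to game behaviors; the type and function names are hypothetical.

```kotlin
// Sketch (assumed names) of sharing gesture-sensor data in the first preset mode.
sealed class GestureData {
    object MoveRight : GestureData()
    object MoveDown : GestureData()
    object FlipForward : GestureData()
}

// Mobile phone 200: forward each pose change captured by the gesture sensor.
fun onGestureSensorEvent(gesture: GestureData) = sendGestureData(gesture)

// Smart television 100: translate the received pose into a game behavior.
fun onGestureDataReceived(gesture: GestureData) = when (gesture) {
    GestureData.MoveRight   -> moveGameCharacterRight()
    GestureData.MoveDown    -> squatGameCharacter()
    GestureData.FlipForward -> lightAttack()
}

fun sendGestureData(g: GestureData) { /* send over the established connection */ }
fun moveGameCharacterRight() { /* move game character A to the right */ }
fun squatGameCharacter() { /* make game character A squat */ }
fun lightAttack() { /* game character A lightly attacks game character B */ }
```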
In summary, in the above embodiment, the user only needs to establish a communication connection between the smart television 100 and the mobile phone 200. The smart tv 100 then sends the identifier of the first interface to the mobile phone 200. After recognizing the first interface identifier, the mobile phone 200 displays a second interface corresponding to the first interface identifier. The mobile phone 200 receives a trigger operation of the user while displaying the second interface, and controls the game character in the smart tv 100 to execute the game behavior associated with the trigger operation. Thus, with at most two operations, the user can use the mobile phone 200 to control the game character in the smart television 100 to execute the game behavior associated with the trigger operation. The operation steps of the user are thereby simplified, and the operation efficiency is improved.
It should be noted that, in the above two embodiments, the smart television 100 implements a certain function by means of software and/or hardware of the mobile phone 200. For example, the smart television 100 inputs an account and a password by means of the mobile phone 200; or, the smart television 100 controls a game character to execute a game behavior by means of the mobile phone 200. The following takes the first terminal being the mobile phone 200 and the second terminal being the smart tv 100 as an example to describe how the mobile phone 200 implements a large-screen video interaction function by means of software and hardware of the smart tv 100.
When the user is watching a television program using the smart tv 100, the user's mobile phone 200 receives a video call request from a third terminal. As shown in fig. 13, in response to the confirmation operation of the video call request input by the user, the mobile phone 200 receives first video data from the third terminal and captures second video data. The first video data includes an image D captured by the third terminal and first sound data. The second video data includes sound data collected by the microphone of the mobile phone 200 and an image C collected by the camera of the mobile phone 200. The mobile phone 200 then displays the first interface 209 (i.e., a video playing interface; the mobile phone 200 is in the first state), where the first interface 209 includes the image C and the image D, and the mobile phone 200 plays the first sound data on its speaker (i.e., the mobile phone 200 has a video interaction function, i.e., a first function). The mobile phone 200 sends the second video data to the third terminal so that the third terminal can play the second video data; in this way, video interaction between the mobile phone 200 and the third terminal is realized.
If the user does not want to capture the second video data by holding the mobile phone 200 (i.e. the user wants to free both hands) during the video interaction process, the mobile phone 200 may capture the second video data by using the smart tv 100. Alternatively, if the user wants to watch the images C and D more clearly, the mobile phone 200 may display the images C and D by means of the smart tv 100. In the following, with reference to fig. 13 and fig. 14, it is described how the mobile phone 200 acquires the second video data by means of the smart tv 100, and how the mobile phone 200 displays the image C and the image D by means of the smart tv 100.
As shown in fig. 13, the mobile phone 200 establishes a communication connection with the smart tv 100 during the process of displaying the first interface 209. For example, the manner of the communication connection between the mobile phone 200 and the smart tv 100 may refer to the description in the above embodiments, and is not described herein again.
After the communication connection between the mobile phone 200 and the smart television 100 is established, the mobile phone 200 controls the smart television 100 to display the second interface 103 in the process of displaying the first interface 209. The second interface 103 also includes an image C and an image D. Next, how the mobile phone 200 controls the smart tv 100 to display the second interface 103 is specifically described.
As shown in (a) - (b) of fig. 14, during the process of displaying the first interface 209, the mobile phone 200 sends the interface identifier of the first interface 209 (i.e., the interface identifier of the video playing interface, i.e., the control information) to the smart tv 100. The smart television 100 finds the identifier of the second interface 103 and the identifier of the speaker according to the interface identifier of the first interface 209 and the preset mapping relationship. The smart tv 100 receives first video data from the mobile phone 200, where the first video data includes a captured image D and first sound data of the third terminal. As such, the smart tv 100 displays the second interface 103 (i.e., a video playing interface), wherein the second interface 103 includes the image D, and plays the first sound data on the speaker of the smart tv 100. Generally, the size of the image D displayed on the smart tv 100 is larger than that of the image D displayed on the cell phone 200. In this way, the user can watch the image D on the smart tv 100 more clearly.
In addition, optionally, as shown in (d) in fig. 14, the mobile phone 200 may further display a fourth interface 210 after transmitting the first video data to the smart tv 100. The fourth interface 210 displays the second prompt message. The second prompt message is used to indicate that the mobile phone 200 is in a second preset mode (e.g., a video mode). Illustratively, the second prompt message may be, but is not limited to, the text message "in video call".
In addition, the smart television 100 finds the identifier of the microphone and the identifier of the camera according to the interface identifier of the first interface 209 and the preset mapping relationship. In the process of displaying the second interface 103 by the smart television 100, the camera of the smart television 100 acquires the image C, and the microphone of the smart television 100 acquires the second sound data. Further, as shown in (C) - (d) of fig. 14, the smart tv 100 also transmits second video data (i.e., target data) including the image C and second sound data to the cell phone 200. Further, the mobile phone 200 sends the second video data to the third terminal, so that the third terminal can play the second video data. In this way, the mobile phone 200 realizes video interaction with the smart television 100 and the third terminal. Since the second video data is collected by the smart television 100, the user does not need to hold the mobile phone 200 to collect the second video data, and both hands of the user are released.
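For illustration only, the following Kotlin sketch outlines the data flow of the large-screen video interaction described above; the message and function names are hypothetical and the transport is left abstract.

```kotlin
// Rough data-flow sketch (hypothetical names): the phone forwards the far-end
// stream to the TV, and the TV's camera/microphone capture the near-end stream
// that the phone relays back to the third terminal.

data class VideoData(val image: ByteArray, val sound: ByteArray)

// Mobile phone 200, after the communication connection is established.
fun onFirstVideoDataFromThirdTerminal(first: VideoData) {
    sendToTv("VIDEO_PLAYING_INTERFACE", first)      // control information + first video data
}

fun onSecondVideoDataFromTv(second: VideoData) {
    sendToThirdTerminal(second)                     // relay image C + second sound data
}

// Smart television 100.
fun onVideoCallStarted(first: VideoData) {
    displaySecondInterface(first.image)             // show image D on the second interface 103
    playOnSpeaker(first.sound)                      // play the first sound data
    startCapture { second -> sendToPhone(second) }  // camera + microphone -> second video data
}

fun sendToTv(interfaceId: String, data: VideoData) { /* transmit to the television */ }
fun sendToThirdTerminal(data: VideoData) { /* transmit to the third terminal */ }
fun sendToPhone(data: VideoData) { /* transmit to the mobile phone */ }
fun displaySecondInterface(image: ByteArray) { /* render image D */ }
fun playOnSpeaker(sound: ByteArray) { /* play on the television speaker */ }
fun startCapture(onFrame: (VideoData) -> Unit) { /* camera + microphone capture loop */ }
```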
In summary, in the above embodiments, the user only needs to establish a communication connection between the smart tv 100 and the mobile phone 200. The mobile phone 200 then sends the identifier of the video playing interface to the smart tv 100. After recognizing the interface identifier, the smart television 100 displays a video playing interface corresponding to the identifier of the video playing interface, and the speaker of the smart television 100 plays the first sound data from the mobile phone 200. The smart television 100 collects second video data while displaying the video playing interface and transmits the second video data to the mobile phone 200. The mobile phone 200 sends the second video data to the third terminal so that the third terminal can play the second video data. Therefore, with at most two operations, the user can have the mobile phone 200 control the smart television 100 to perform large-screen video interaction, and both of the user's hands are freed. The operation steps of the user are thereby simplified, and the operation efficiency is improved.
It should be noted that, in the above three embodiments, in order to further prevent the mobile phone 200 and the smart tv 100 from mistakenly establishing a communication connection, the mobile phone 200 needs to be in a third preset mode to establish a communication connection with the smart tv 100. Illustratively, the user may trigger the mobile phone 200 to be in the third preset mode. The third preset mode may be understood as a sharing mode, a coordination mode, or the like. As shown in (a) - (b) of fig. 15, the mobile phone 200 can display the function list interface 212 (i.e., a fifth interface) in response to the user's trigger operation on the "Settings" icon 211 in the system main interface 201. The function list interface 212 includes a first control 213 for controlling entering/exiting the third preset mode. The mobile phone 200 may enter the third preset mode in response to a trigger operation on the first control 213. In the third preset mode, the mobile phone 200 may establish a communication connection with the smart tv 100.
It is to be understood that the above description in connection with fig. 5-15 describes the embodiments of the present application with reference to schematic interface diagrams. Next, the device control method provided in the embodiments of the present application is described with reference to the flowchart provided in fig. 16. It should be noted that the basic principle and the technical effects of the device control method provided in fig. 16 are the same as those of the above embodiments; for brevity, for parts not mentioned in this embodiment of the present application, reference may be made to the corresponding contents in the above embodiments.
Fig. 16 is a flowchart of an apparatus control method according to an embodiment of the present application. As shown in fig. 16, the apparatus control method provided in the embodiment of the present application may include:
s1601: the first terminal displays a first interface.
The first interface may be a text input interface, a game interface, or a video playing interface in the above embodiments, which is not limited herein.
S1602: In the process of displaying the first interface, the first terminal establishes a communication connection with the second terminal.
For the specific process of S1602, reference may be made to the description in the foregoing embodiments, and details are not repeated here. The first terminal may be the smart television 100 in the above embodiments, and the second terminal may be the mobile phone 200 described above. That is, the smart television 100 can serve as the function demanding party and rely on the mobile phone 200, serving as the function providing party, to implement functions such as text input and game character behavior control. Of course, the first terminal may also be the mobile phone 200 described above, and the second terminal may also be the smart television 100 described above. That is, the mobile phone 200 may serve as the function demanding party and rely on the smart television 100, serving as the function providing party, to implement functions such as video interaction.
S1603: The first terminal sends control information to the second terminal in the process of displaying the first interface. The control information is used for instructing the second terminal to start the target function unit; when the target function unit is triggered, the function associated with the trigger can be implemented.
Illustratively, the target functional unit may be a second interface including virtual text keys; when the virtual text key is triggered, the function of inputting text data (i.e. target data) to the text input interface of the first terminal can be realized. When the second interface including the virtual text key is triggered, the function of inputting text data to the text input interface of the first terminal may be implemented, which may refer to the description of fig. 8 and 9 in the foregoing embodiment, and is not described herein again.
The target function unit may also be a second interface including a virtual control keyboard; when the virtual control keyboard is triggered, the game character on the first interface of the first terminal can be controlled to execute the game behavior associated with the trigger. The target function unit may also be an attitude sensor; when the attitude sensor is triggered, the game character on the first interface of the first terminal can be controlled to execute the game behavior associated with the trigger. For how the game character on the first interface of the first terminal is controlled to execute the game behavior associated with the trigger when the virtual control keyboard or the attitude sensor is triggered, reference may be made to the descriptions of fig. 11 to 12 in the foregoing embodiments, and details are not repeated here.
The target function unit can be a video acquisition module (such as a camera and a microphone). When the video acquisition module is started, the first terminal and the third terminal can be controlled to realize a video interaction function. When the video capture module is turned on, the first terminal and the third terminal may be controlled to implement a video interaction function, which may refer to the description of fig. 14 and is not described herein again.
S1604: The second terminal responds to a trigger operation on the target function unit, so that the first terminal executes the function associated with the trigger operation while displaying the first interface.
For example, referring to the explanation of fig. 9 above, when the first interface is a text input interface and the target function unit is a virtual text key in the second interface, the second terminal may obtain text data in response to a triggering operation of the virtual text key by the user. As will be appreciated, text data is the target data.
Or, for example, referring to the above description of (a) in fig. 12, when the first interface is a game interface and the target function unit is the virtual control keyboard in the second interface, the second terminal may, in response to a trigger operation on the virtual control keyboard in the second interface, control the game character on the first interface of the first terminal to perform the game behavior associated with the trigger. It can be understood that controlling the game character on the first interface of the first terminal to perform the game behavior associated with the trigger is the function associated with the trigger operation.
Or, illustratively, when the first interface is a game interface and the target function unit is a gesture sensor, the second terminal may respond to a trigger operation of the user so that the gesture sensor acquires the gesture data of the second terminal, thereby controlling the game character on the first interface of the first terminal to perform the game behavior associated with the trigger. It can be understood that controlling the game character on the first interface of the first terminal to perform the game behavior associated with the trigger is the function associated with the trigger operation.
Or, illustratively, when the first interface is a video playing interface and the target function unit includes a video capture module (such as a microphone and a camera), the first terminal and the third terminal may be controlled to implement a video interaction function. It can be understood that controlling the first terminal and the third terminal to implement the video interaction function is the function associated with the trigger operation.
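For illustration only, the following Kotlin sketch condenses S1601-S1604 into a pair of loops, one per terminal, under the assumption that control information and target data are exchanged as plain strings over an abstract connection; all names are hypothetical.

```kotlin
// Condensed sketch (assumed message shapes): the first terminal advertises its
// current interface, the second terminal opens the matching target function
// unit and returns target data on each trigger operation.

interface Connection {
    fun send(msg: String)
    fun receive(): String
}

// First terminal (function demanding side).
fun firstTerminalLoop(connection: Connection, firstInterfaceId: String) {
    connection.send(firstInterfaceId)               // S1603: control information
    while (true) {
        val targetData = connection.receive()       // data produced by the target function unit
        performFirstFunction(targetData)            // S1604: e.g. input text, move the character
    }
}

// Second terminal (function providing side).
fun secondTerminalLoop(connection: Connection) {
    val controlInfo = connection.receive()
    openTargetFunctionUnit(controlInfo)             // keyboard/handle interface, sensor, camera...
    onTriggerOperation { targetData -> connection.send(targetData) }
}

fun performFirstFunction(data: String) { /* execute the function associated with the target data */ }
fun openTargetFunctionUnit(info: String) { /* choose the unit from the mapping table */ }
fun onTriggerOperation(handler: (String) -> Unit) { /* register UI/sensor callbacks */ }
```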
Fig. 17 is a schematic structural diagram illustrating how, when the first terminal is the smart television 100 and the second terminal is the mobile phone 200, the smart television 100 implements a function associated with the target data by means of the mobile phone 200, or how, when the first terminal is the mobile phone 200 and the second terminal is the smart television 100, the mobile phone 200 implements a function associated with the target data by means of the smart television 100.
As shown in fig. 17, the smart tv 100 and the mobile phone 200 may be communicatively connected through a short-range communication module. The short-range communication module may be a Bluetooth module, a WiFi module, an NFC module, or the like, which is not limited here. When the smart television 100 receives the interface identifier of the video playing interface from the mobile phone 200, the cooperative control application of the smart television 100 calls the video application to open the video playing interface (including the image C and the image D) and calls the speaker to play the first sound data from the mobile phone 200; in addition, it calls the camera of the smart television 100 to capture the image C and calls the microphone to capture the second sound data. In this case, the cooperative control application of the smart television 100 decides to invoke the video application, the speaker, the camera, and the microphone based on the interface identifier of the video playing interface. Alternatively, the cooperative control application of the mobile phone 200 may also generate, when the video is played, an instruction to call the video application, the speaker, the camera, and the microphone. After receiving the instruction from the mobile phone 200, the smart television 100 calls the video application, the speaker, the camera, and the microphone. In this way, the mobile phone 200 may also make the decision to invoke the video application, the speaker, the camera, and the microphone. The mobile phone 200 receives the image C and the second sound data from the smart tv 100, and transmits them to the third terminal. In this way, the mobile phone 200 implements the large-screen video interaction function by means of the smart television 100. It can be understood that the interface identifier of the video playing interface and the instruction for invoking the video application, the speaker, the camera, and the microphone are both control information.
In the above process, it can be understood that the smart television 100 may decide to invoke the video application, the speaker, the camera, and the microphone according to the function currently running on the smart television 100 and the function currently running on the mobile phone 200; alternatively, the mobile phone 200 may decide to invoke the video application, the speaker, the camera, and the microphone according to the function currently running on the mobile phone 200 and the function currently running on the smart television 100.
When the mobile phone 200 receives the identifier of the text input interface from the smart tv 100, the cooperative control application of the mobile phone 200 decides to invoke the text input application to open the keyboard interface. In response to the user's trigger operation on the keyboard interface, the mobile phone 200 sends the text data associated with the trigger operation to the smart television 100. The smart tv 100 displays the received text data. In addition, based on the same principle, the cooperative control application of the smart television 100 may instead decide to generate an instruction for invoking the text input application and control the mobile phone 200 to invoke the text input application to open the keyboard interface, which is not described here again.
When the mobile phone 200 receives the identification of the game interface from the smart television 100, the cooperative control application of the mobile phone 200 decides to invoke the handle application to open the handle interface. The mobile phone 200 responds to the trigger operation of the user on the handle interface, and sends control data of the game character associated with the trigger operation to the smart television 100. The smart tv 100 controls a game character in the game interface to perform a game action associated with the control data.
Alternatively, when the mobile phone 200 receives the identifier of the game interface from the smart tv 100, the cooperative control application of the mobile phone 200 decides to call the gesture sensor to acquire the gesture data (i.e., the control data) of the mobile phone 200. The mobile phone 200 transmits the gesture data to the smart tv 100. The smart tv 100 controls the game character in the game interface to perform the game behavior associated with the gesture data. In addition, based on the same principle, the cooperative control application of the smart television 100 may instead decide to generate an instruction for invoking the handle application or the attitude sensor, and control the mobile phone 200 to invoke the handle application or the attitude sensor to obtain the control data, which is not described here again.
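For illustration only, the following Kotlin sketch shows the kind of decision the cooperative control application could make when control information arrives; the identifier strings and function names are assumptions rather than the actual implementation.

```kotlin
// Sketch (assumed structure) of the cooperative control application's decision:
// pick the local capability to invoke from the received interface identifier.
fun onControlInformationReceived(interfaceId: String) = when (interfaceId) {
    "TEXT_INPUT_INTERFACE"    -> openKeyboardApplication()
    "GAME_INTERFACE"          -> openHandleApplicationOrGestureSensor()
    "VIDEO_PLAYING_INTERFACE" -> {
        openVideoInterface(); enableSpeaker(); enableCameraAndMicrophone()
    }
    else                      -> Unit   // unknown identifier: ignore
}

fun openKeyboardApplication() { /* text input function */ }
fun openHandleApplicationOrGestureSensor() { /* game control function */ }
fun openVideoInterface() { /* display image C and image D */ }
fun enableSpeaker() { /* play the first sound data */ }
fun enableCameraAndMicrophone() { /* capture the second video data */ }
```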
It should be noted that the carrier of the cooperative control application may be an operable icon located on a negative screen, activity of an application program, or an interface of a background service, and is not limited herein.
In addition, in the device control method provided by the foregoing embodiments, the trigger operation may include, but is not limited to: a click operation, a long-press operation, an operation of changing the posture of the terminal device, and the like.
In addition, in the above embodiments, the first terminal being in the first state means that the first terminal displays the first interface. The first state may alternatively be a music playing state, in which case the first terminal plays music by means of a loudspeaker of the second terminal. This is not limited here.
In addition, the above embodiments take the first terminal executing the first function as an example. In the scenario where the first terminal plays music by using the speaker of the second terminal, the second terminal executes the first function, i.e., the second terminal performs the music playing function.
Referring to fig. 18, an apparatus control device 1800 for a first terminal is further provided in the present embodiment. The device control apparatus 1800 provided in the embodiment of the present application includes: a first processing unit 1802 for controlling the first terminal to be in a first state, wherein the first terminal is capable of performing a first function when being in the first state. A first communication unit 1801, configured to send control information to the second terminal based on the established communication connection. The control information is used for indicating the second terminal to start the target function unit so as to cooperate with the first terminal to execute the first function.
In a possible implementation, the first communication unit 1801 is further configured to receive target data from the second terminal. The first processing unit 1802 is further adapted to perform a first function associated with the target data.
In a possible implementation, the target function unit is a virtual text key in the second interface, and the target data is text data. The first processing unit 1802 is specifically configured to control the first terminal to display a first interface, where the first interface includes an area of a text to be input. The first processing unit 1802 is further specifically configured to control the first terminal to input text data in an area of a first interface where text is to be input when the first interface is displayed.
In one possible embodiment, the target data is control data. The first processing unit 1802 is specifically configured to control the first terminal to display a first interface, where the first interface includes a game character. The first processing unit 1802 is further specifically configured to control, when the first terminal displays the first interface, a game character in the first interface to execute a game behavior corresponding to the control data.
In one possible embodiment, the target function unit is a virtual control pad in the second interface, wherein the virtual control pad includes virtual keys for controlling game behavior of the game character. Or the control data is attitude data, and the target function unit is an attitude sensor of the second terminal for acquiring the attitude data.
In a possible implementation, the target functional unit is a video capture module, and the target data is second video data captured by the video capture module. The first processing unit 1802 is specifically configured to control the first terminal to display a first interface, where the first interface includes first video data from a third terminal, and the first processing unit 1802 is specifically configured to control the first terminal to receive second video data from a second terminal when the first interface is displayed; and transmitting the second video data to the third terminal.
Referring to fig. 19, an apparatus control device 1900 is further provided in the embodiment of the present application, and is applied to a second terminal. The device control apparatus 1900 provided in the embodiment of the present application includes: a second communication unit 1901, configured to receive control information from the first terminal based on the established communication connection. The second processing unit 1902 is configured to turn on the target functional unit in response to the control information. The second processing unit 1902 is further configured to cooperate with the first terminal to execute the first function based on the target function unit.
In a possible implementation, the second processing unit 1902 is specifically configured to, in response to a trigger operation on the target function unit, control the second communication unit 1901 to send the target data to the first terminal; wherein the target data is used to instruct the first terminal to perform a first function associated with the target data.
In a possible implementation manner, the target function unit is a virtual text key in the second interface, the target data is text data, and the text data is used for instructing the first terminal to input the text data in an area of the first interface where text is to be input when the first interface is displayed.
In a possible implementation manner, the target data is control data, and the target data is used for instructing the first terminal to control a game character in the first interface to execute a game behavior corresponding to the control data when the first interface is displayed.
In one possible implementation, the target function unit is a virtual control keyboard in the second interface, wherein the virtual control keyboard comprises virtual keys for controlling game behaviors of the game character; or the control data is attitude data, and the target function unit is an attitude sensor of the second terminal and used for acquiring the attitude data.
In a possible implementation manner, the target function unit is a video capture module, the target data is second video data captured by the video capture module, and the second video data is used for instructing the first terminal to send the second video data to the third terminal when displaying the first interface including the first video data.
In a possible implementation manner, the control information is information used to indicate that the first terminal is in the first state, and the second processing unit 1902 is specifically configured to start the target function unit according to the information indicating that the first terminal is in the first state. Alternatively, the control information is an instruction for starting the target functional unit, and the second processing unit 1902 is specifically configured to start the target functional unit according to the instruction for starting the target functional unit.
Fig. 20 is a schematic diagram of a hardware structure of a first terminal or a second terminal according to an embodiment of the present disclosure, and as shown in fig. 20, the first terminal or the second terminal includes a processor 2001, a communication line 2004, and at least one communication interface (an exemplary communication interface 2003 in fig. 20 is illustrated as an example).
The processor 2001 may be a general-purpose central processing unit (CPU), a microprocessor, an application-specific integrated circuit (ASIC), or one or more integrated circuits for controlling the execution of the programs of the solutions of the present application.
Communication lines 2004 may include circuitry to transfer information between the above-described components.
The communication interface 2003, which may be any transceiver or similar device, is used to communicate with other devices or communication networks, such as an Ethernet or a wireless local area network (WLAN).
Possibly, the first terminal or the second terminal may further comprise a memory 2002.
The memory 2002 may be, but is not limited to, a read-only memory (ROM) or other type of static storage device that may store static information and instructions, a Random Access Memory (RAM) or other type of dynamic storage device that may store information and instructions, an electrically erasable programmable read-only memory (EEPROM), a compact disk read-only memory (CD-ROM) or other optical disk storage, optical disk storage (including compact disk, laser disk, optical disk, digital versatile disk, blu-ray disk, etc.), a magnetic disk storage medium or other magnetic storage device, or any other medium that can be used to carry or store desired program code in the form of instructions or data structures and that can be accessed by a computer. The memory may be self-contained and coupled to the processor via communication link 2004. The memory may also be integral to the processor.
The memory 2002 is used for storing computer-executable instructions for executing the present application, and is controlled by the processor 2001. The processor 2001 is configured to execute the computer execution instructions stored in the memory 2002, so as to implement the device control method executed by the first terminal or the second terminal according to the embodiment of the present application.
Possibly, the computer executed instructions in the embodiments of the present application may also be referred to as application program codes, which are not specifically limited in the embodiments of the present application.
In particular implementations, processor 2001 may include one or more CPUs, such as CPU0 and CPU1 in fig. 20, as one embodiment.
In particular implementations, the first terminal or the second terminal may include multiple processors, such as processor 2001 and processor 2005 in fig. 20, as an example. Each of these processors may be a single-core (single-CPU) processor or a multi-core (multi-CPU) processor. A processor herein may refer to one or more devices, circuits, and/or processing cores for processing data (e.g., computer program instructions).
Exemplarily, fig. 21 is a schematic structural diagram of a chip provided in an embodiment of the present application. The chip 210 includes one or more than two (including two) processors 2110 and a communication interface 2130.
In some embodiments, the memory 2140 stores the following elements: an executable module or a data structure, or a subset thereof, or an expanded set thereof.
In the present embodiment, the memory 2140 may include a read-only memory and a random access memory, and provides instructions and data to the processor 2110. The portion of the memory 2140 may also include non-volatile random access memory (NVRAM).
In this embodiment of the present application, the processor 2110, the communication interface 2130, and the memory 2140 are coupled together by a bus system 2120. The bus system 2120 may include a power bus, a control bus, a status signal bus, and the like, in addition to a data bus. For ease of description, the various buses are labeled as the bus system 2120 in fig. 21.
The method described in the embodiments of the present application can be applied to the processor 2110 or implemented by the processor 2110. The processor 2110 may be an integrated circuit chip having signal processing capabilities. In an implementation, the steps of the above method may be completed by an integrated logic circuit of hardware in the processor 2110 or by instructions in the form of software. The processor 2110 may be a general-purpose processor (e.g., a microprocessor or a conventional processor), a digital signal processor (DSP), an application-specific integrated circuit (ASIC), a field-programmable gate array (FPGA) or another programmable logic device, a discrete gate, a transistor logic device, or a discrete hardware component, and the processor 2110 may implement or execute the methods, steps, and logical blocks disclosed in the embodiments of the present application.
The steps of the method disclosed in the embodiments of the present application may be directly implemented by a hardware decoding processor, or implemented by a combination of hardware and software modules in a decoding processor. The software module may be located in a storage medium mature in the art, such as a random access memory, a read-only memory, a programmable read-only memory, or an electrically erasable programmable read-only memory (EEPROM). The storage medium is located in the memory 2140, and the processor 2110 reads the information in the memory 2140 and completes the steps of the above method in combination with its hardware.
In the above embodiments, the instructions stored by the memory for execution by the processor may be implemented in the form of a computer program product. The computer program product may be written in the memory in advance, or may be downloaded in the form of software and installed in the memory.
The computer program product includes one or more computer instructions. When the computer program instructions are loaded and executed on a computer, the procedures or functions according to the embodiments of the present application are generated in whole or in part. The computer may be a general-purpose computer, a special-purpose computer, a computer network, or another programmable apparatus. The computer instructions may be stored in a computer-readable storage medium or transmitted from one computer-readable storage medium to another computer-readable storage medium, for example, from one website, computer, server, or data center to another website, computer, server, or data center in a wired manner (e.g., coaxial cable, optical fiber, or digital subscriber line (DSL)) or a wireless manner (e.g., infrared, radio, or microwave). The computer-readable storage medium may be any available medium accessible by a computer, or a data storage device such as a server or a data center integrating one or more available media.
The embodiment of the application also provides a computer readable storage medium. The method performed by the first terminal or the second terminal described in the above embodiments may be implemented in whole or in part by software, hardware, firmware, or any combination thereof. Computer-readable media may include computer storage media and communication media, and may include any medium that can communicate a computer program from one place to another. A storage medium may be any target medium that can be accessed by a computer.
As one possible design, the computer-readable medium may include a compact disk read-only memory (CD-ROM), RAM, ROM, EEPROM, or other optical disk storage; the computer readable medium may include a disk memory or other disk storage device. Also, any connecting line may also be referred to as a computer-readable medium, where appropriate. For example, if the software is transmitted from a website, server, or other remote source using a coaxial cable, fiber optic cable, twisted pair, DSL, or wireless technologies such as infrared, radio, and microwave, then the coaxial cable, fiber optic cable, twisted pair, DSL, or wireless technologies such as infrared, radio, and microwave are included in the definition of medium. Disk and disc, as used herein, includes Compact Disc (CD), laser disc, optical disc, digital Versatile Disc (DVD), floppy disk and blu-ray disc where disks usually reproduce data magnetically, while discs reproduce data optically with lasers.
Combinations of the above should also be included within the scope of computer-readable media. The above description is only for the specific embodiments of the present application, but the scope of the present application is not limited thereto, and any person skilled in the art can easily conceive of the changes or substitutions within the technical scope of the present application, and shall be covered by the scope of the present application. Therefore, the protection scope of the present application shall be subject to the protection scope of the claims.

Claims (25)

1. An apparatus control method applied to a communication system including a first terminal and a second terminal, the method comprising:
the first terminal is in a first state, wherein the first terminal can execute a first function when in the first state;
the first terminal sends control information to the second terminal based on the established communication connection;
the second terminal responds to the control information to start a target function unit;
and the second terminal cooperates with the first terminal to execute a first function based on the target function unit.
2. The method of claim 1, wherein the second terminal performs the first function with the first terminal based on the target function unit, and wherein the performing comprises:
the second terminal responds to the trigger operation of the target function unit and sends target data to the first terminal;
the first terminal performs the first function associated with the target data.
3. The method of claim 2, wherein the target functional unit is a virtual text key in a second interface, the target data is text data,
the first terminal is in a first state comprising: the first terminal displays a first interface, wherein the first interface comprises an area of a text to be input;
the first terminal performs the first function associated with the target data, including:
and when the first terminal displays the first interface, inputting the text data in the area of the text to be input of the first interface.
4. The method of claim 2, wherein the target data is control data,
the first terminal is in a first state comprising: the first terminal displays a first interface, wherein the first interface includes a game character,
the first terminal performs the first function associated with the target data, including:
and when the first terminal displays the first interface, controlling the game role in the first interface to execute game behaviors corresponding to the control data.
5. The method of claim 4,
the target function unit is a virtual control keyboard in a second interface, wherein the virtual control keyboard comprises virtual keys for controlling the game behavior of the game role;
or, the control data is attitude data, and the target function unit is an attitude sensor of the second terminal for acquiring the attitude data.
6. The method of claim 5, wherein the target function unit is a virtual control keyboard in the second interface, and after the first terminal sends control information to the second terminal, the method further comprises:
the second terminal receives first interface data of the first interface from the first terminal;
and the second terminal displays second interface data in the second interface, wherein the second interface data comprises display content corresponding to the first interface data.
7. The method of claim 2, wherein the target functional unit is a video capture module, the target data is second video data captured by the video capture module,
the first terminal is in a first state comprising: the first terminal displays a first interface, wherein the first interface comprises first video data from a third terminal,
the first terminal performs the first function associated with the target data, including: the first terminal receives the second video data from the second terminal when displaying the first interface; and the first terminal sends the second video data to the third terminal.
8. The method of claim 7, wherein after the second terminal turns on the target functional unit in response to the control information, the method further comprises:
the second terminal receives first video data from the first terminal;
and the second terminal displays a second interface, wherein the second interface comprises the first video data.
9. The method according to claim 1, wherein the establishing of the communication connection comprises:
when the first terminal is in a first state, the first terminal performs device discovery; and the first terminal establishes the communication connection with the second terminal under the condition that the first terminal discovers the second terminal;
or, when a distance between the second terminal and a target electronic device is smaller than a distance threshold, the second terminal acquires a device identifier of the first terminal stored in the target electronic device; and the second terminal establishes the communication connection with the first terminal according to the device identifier;
or, the first terminal receives voice information input by a user, wherein the voice information is used for indicating connection with the second terminal; the first terminal performs device discovery, and establishes the communication connection with the second terminal under the condition that the first terminal discovers the second terminal;
or, the second terminal transmits a UWB signal, wherein the UWB signal carries a device identifier of the second terminal; the first terminal receives the UWB signal from the second terminal; and the first terminal establishes the communication connection with the second terminal according to the device identifier in the UWB signal.
10. The method according to claim 1, wherein before the first terminal sends control information to the second terminal based on the established communication connection, the method further comprises:
the second terminal displays a fifth interface, wherein the fifth interface comprises a first control used for controlling to be in/out of a third preset mode;
and the second terminal responds to the triggering operation of the first control, and the second terminal is in the third preset mode.
11. The method of any one of claims 1 to 10,
the control information is information for indicating that the first terminal is in a first state, and the second terminal responds to the control information to start the target function unit, including: the second terminal starts the target function unit according to the information indicating that the first terminal is in the first state;
or, the control information is an instruction for starting the target function unit, and the second terminal responds to the control information to start the target function unit, including: and the second terminal starts the target function unit according to the instruction for starting the target function unit.
12. An apparatus for controlling a device, applied to a first terminal, the apparatus comprising:
the first processing unit is used for controlling the first terminal to be in a first state, wherein the first terminal can execute a first function when being in the first state;
a first communication unit configured to transmit control information to the second terminal based on the established communication connection;
the control information is used for indicating the second terminal to start a target function unit so as to cooperate with the first terminal to execute a first function.
13. The apparatus of claim 12, wherein the first communication unit is further configured to receive target data from the second terminal;
the first processing unit is further configured to execute the first function associated with the target data.
14. The apparatus of claim 13, wherein the target functional unit is a virtual text key in the second interface, the target data is text data,
the first processing unit is specifically configured to control the first terminal to display a first interface, where the first interface includes an area of a text to be input;
the first processing unit is further specifically configured to control the first terminal to input the text data in the area of the text to be input on the first interface when the first interface is displayed on the first terminal.
15. The apparatus of claim 13, wherein the target data is control data,
the first processing unit is specifically configured to control the first terminal to display a first interface, where the first interface includes a game character,
the first processing unit is further specifically configured to control the game character in the first interface to execute a game behavior corresponding to the control data when the first terminal displays the first interface.
16. The apparatus of claim 15,
the target function unit is a virtual control keyboard in a second interface, wherein the virtual control keyboard comprises virtual keys for controlling the game behavior of the game role;
or, the control data is attitude data, and the target function unit is an attitude sensor of the second terminal for acquiring the attitude data.
17. The apparatus of claim 13, wherein the target functional unit is a video capture module, the target data is second video data captured by the video capture module,
the first processing unit is specifically configured to control the first terminal to display a first interface, where the first interface includes first video data from a third terminal,
the first processing unit is specifically configured to control the first terminal to receive the second video data from the second terminal when the first interface is displayed; and sending the second video data to the third terminal.
18. An apparatus for controlling a device, applied to a second terminal, the apparatus comprising:
a second communication unit for receiving control information from the first terminal based on the established communication connection;
the second processing unit is used for responding to the control information to start the target function unit;
the second processing unit is further configured to cooperate with the first terminal to execute a first function based on the target function unit.
19. The apparatus according to claim 18, wherein the second processing unit is specifically configured to control the second communication unit to send target data to the first terminal in response to a trigger operation on the target function unit; wherein the target data is for instructing the first terminal to perform the first function associated with the target data.
20. The apparatus of claim 19, wherein the target function unit is a virtual text key in the second interface, and the target data is text data, where the text data is used to instruct the first terminal to input the text data in an area of the first interface where text is to be input when the first interface is displayed.
21. The apparatus of claim 19, wherein the target data is control data, and the target data is used to instruct the first terminal to control a game character in the first interface to execute a game behavior corresponding to the control data when the first interface is displayed.
22. The apparatus of claim 21,
the target function unit is a virtual control keyboard in a second interface, wherein the virtual control keyboard comprises virtual keys for controlling the game behavior of the game role;
or, the control data is attitude data, and the target function unit is an attitude sensor of the second terminal for acquiring the attitude data.
23. The apparatus of claim 19, wherein the target function unit is a video capture module, and the target data is second video data captured by the video capture module, the second video data being used to instruct the first terminal to send the second video data to a third terminal when displaying the first interface including the first video data.
24. The apparatus of any one of claims 18-23,
wherein the control information is information indicating that the first terminal is in a first state, and the second processing unit is specifically configured to start the target function unit according to the information indicating that the first terminal is in the first state;
or, the control information is an instruction for starting the target function unit, and the second processing unit is specifically configured to start the target function unit according to the instruction for starting the target function unit.
25. A computer-readable storage medium storing a computer program which, when executed by a processor, causes a computer to perform the method performed by the first terminal or the second terminal according to any one of claims 1 to 11.
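
Claims 14 and 20 together describe the remote text-input case: the second terminal exposes virtual text keys as the target function unit, and the first terminal inserts the received text data into the text input area of its first interface. The following Kotlin sketch is a minimal, hypothetical illustration of that flow; none of the class or function names (VirtualTextKeys, FirstInterface, and so on) come from the patent, and the lambda merely stands in for the established communication connection.

```kotlin
// Hypothetical sketch of the remote text-input case of claims 14 and 20.
// The second terminal's virtual text keys produce text data; the first terminal
// inputs that data into the text area of the first interface it is displaying.

class VirtualTextKeys(private val send: (String) -> Unit) {
    // Each key press on the second interface yields text data for the first terminal.
    fun onKeyPressed(character: String) = send(character)
}

class FirstInterface {
    private val textInputArea = StringBuilder()

    // Claim 14: while the first interface is displayed, input the received
    // text data into the area for text to be input.
    fun inputText(textData: String) {
        textInputArea.append(textData)
    }

    fun currentText(): String = textInputArea.toString()
}

fun main() {
    val firstInterface = FirstInterface()
    // The lambda stands in for the established communication connection.
    val keys = VirtualTextKeys { textData -> firstInterface.inputText(textData) }
    "hello".forEach { keys.onKeyPressed(it.toString()) }
    println(firstInterface.currentText())   // prints "hello"
}
```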
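
Claims 15, 16, 21 and 22 describe two forms of control data for the game case: key presses on a virtual control keyboard shown in the second interface, or attitude data acquired by an attitude sensor of the second terminal. A minimal Kotlin sketch of how such control data might be mapped to game behaviors follows; the types, key bindings and tilt thresholds are illustrative assumptions, not taken from the patent.

```kotlin
// Hypothetical sketch of the game-control case of claims 15-16 and 21-22.
// Control data comes either from virtual keys on the second interface or from
// the second terminal's attitude sensor; the first terminal maps it to a game
// behavior of the character shown in the first interface.

sealed class ControlData {
    data class KeyPress(val key: String) : ControlData()                   // virtual control keyboard
    data class Attitude(val pitch: Float, val roll: Float) : ControlData() // attitude sensor reading
}

enum class GameBehavior { MOVE_LEFT, MOVE_RIGHT, JUMP, IDLE }

class GameCharacterController {
    // Claims 15 and 21: execute the game behavior corresponding to the control data.
    fun behaviorFor(data: ControlData): GameBehavior = when (data) {
        is ControlData.KeyPress -> when (data.key) {
            "A" -> GameBehavior.MOVE_LEFT
            "D" -> GameBehavior.MOVE_RIGHT
            "SPACE" -> GameBehavior.JUMP
            else -> GameBehavior.IDLE
        }
        is ControlData.Attitude -> when {      // tilt thresholds are illustrative only
            data.roll < -15f -> GameBehavior.MOVE_LEFT
            data.roll > 15f -> GameBehavior.MOVE_RIGHT
            data.pitch > 20f -> GameBehavior.JUMP
            else -> GameBehavior.IDLE
        }
    }
}
```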
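
Claims 17 and 23 describe the video-call case: while the first terminal displays first video data received from a third terminal, it forwards the second video data captured by the second terminal's video capture module on to that third terminal, so the second terminal effectively lends its camera to the call. A short, hypothetical Kotlin sketch of that relay follows; the names and the callback representing the call link are assumptions, not the patent's own implementation.

```kotlin
// Hypothetical sketch of the video relay of claims 17 and 23: second video data
// captured on the second terminal is forwarded by the first terminal to the
// third terminal while the first interface (showing the third terminal's video)
// is displayed.

class FirstTerminalVideoRelay(
    private val sendToThirdTerminal: (ByteArray) -> Unit   // existing call link to the third terminal
) {
    var displayingFirstInterface = false                    // first interface shows the first video data

    fun onSecondVideoData(frame: ByteArray) {
        // Relay only while the first interface of the video call is being displayed.
        if (displayingFirstInterface) sendToThirdTerminal(frame)
    }
}
```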
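
Claims 18, 19 and 24 describe the second terminal's side of the exchange: it receives control information over the established connection, starts the target function unit either because the control information indicates that the first terminal is in a first state or because it is an explicit start instruction, and then returns target data in response to trigger operations on that unit. The Kotlin sketch below illustrates that dispatch under those assumptions; every identifier in it is hypothetical and only stands in for the claimed units.

```kotlin
// Hypothetical sketch of the control-information handling of claims 18, 19 and 24.

sealed class ControlInformation {
    // The control information may report that the first terminal is in a "first state"
    // (for example: a text box is focused, a game is in the foreground, a call is active).
    data class StateIndication(val firstState: String) : ControlInformation()
    // Or it may be an explicit instruction naming the function unit to start.
    data class StartInstruction(val unitName: String) : ControlInformation()
}

interface TargetFunctionUnit {
    val name: String
    fun start()
    // A trigger operation (key press, sensor reading, captured frame) produces target data.
    fun onTrigger(input: String): ByteArray
}

class SecondTerminal(
    private val units: Map<String, TargetFunctionUnit>,
    private val stateToUnit: Map<String, String>,           // which unit serves which first state
    private val sendToFirstTerminal: (ByteArray) -> Unit    // the established communication connection
) {
    private var active: TargetFunctionUnit? = null

    // Claim 24: start the target function unit according to either form of control information.
    fun onControlInformation(info: ControlInformation) {
        val unitName = when (info) {
            is ControlInformation.StateIndication -> stateToUnit[info.firstState]
            is ControlInformation.StartInstruction -> info.unitName
        } ?: return
        active = units[unitName]?.also { it.start() }
    }

    // Claim 19: in response to a trigger operation on the target function unit,
    // send the resulting target data to the first terminal.
    fun onTriggerOperation(input: String) {
        val data = active?.onTrigger(input) ?: return
        sendToFirstTerminal(data)
    }
}
```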
CN202111089487.1A 2021-09-16 2021-09-16 Equipment control method and device Pending CN115814403A (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN202111089487.1A CN115814403A (en) 2021-09-16 2021-09-16 Equipment control method and device
PCT/CN2022/118543 WO2023040848A1 (en) 2021-09-16 2022-09-13 Device control method and apparatus

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202111089487.1A CN115814403A (en) 2021-09-16 2021-09-16 Equipment control method and device

Publications (1)

Publication Number Publication Date
CN115814403A (en) 2023-03-21

Family

ID=85515170

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202111089487.1A Pending CN115814403A (en) 2021-09-16 2021-09-16 Equipment control method and device

Country Status (2)

Country Link
CN (1) CN115814403A (en)
WO (1) WO2023040848A1 (en)

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103634419A (en) * 2013-11-15 2014-03-12 北京洋浦伟业科技发展有限公司 Remote control method of terminal and terminal
CN104104992A (en) * 2014-07-08 2014-10-15 深圳市同洲电子股份有限公司 Multi-screen interaction method, device and system
CN104679239A (en) * 2015-01-23 2015-06-03 深圳市金立通信设备有限公司 Terminal input method
CN106371935A (en) * 2016-08-26 2017-02-01 上海浮罗创业投资有限公司 Game control interface acquisition method and apparatus
CN108632560A (en) * 2018-05-14 2018-10-09 聚好看科技股份有限公司 Video call method, device and terminal device
CN110337020A (en) * 2019-06-26 2019-10-15 华为技术有限公司 A kind of control method and relevant apparatus showing equipment
CN113032766A (en) * 2021-05-26 2021-06-25 荣耀终端有限公司 Application authority management method and device

Family Cites Families (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7634263B2 (en) * 2006-01-30 2009-12-15 Apple Inc. Remote control of electronic devices
CN103634645A (en) * 2013-12-10 2014-03-12 青岛海尔软件有限公司 Method for controlling smart television by using mobile phone
CN105635625B (en) * 2014-10-31 2019-12-20 腾讯科技(深圳)有限公司 Video call method and device
CN104486684A (en) * 2014-12-18 2015-04-01 百度在线网络技术(北京)有限公司 Input method and device for electronic equipment
CN107750003A (en) * 2017-10-11 2018-03-02 广州指观网络科技有限公司 A kind of long-range touching input system and method
CN110138937B (en) * 2019-05-07 2021-06-15 华为技术有限公司 Call method, device and system
CN114489876A (en) * 2020-11-09 2022-05-13 华为技术有限公司 Text input method, electronic equipment and system

Also Published As

Publication number Publication date
WO2023040848A1 (en) 2023-03-23
WO2023040848A9 (en) 2023-05-25

Similar Documents

Publication Publication Date Title
CN110381197B (en) Method, device and system for processing audio data in many-to-one screen projection
JP7450734B2 (en) Audio output method and terminal device
CN113325996B (en) Split screen display method and device
JP7369281B2 (en) Device capacity scheduling method and electronic devices
CN110493626B (en) Video data processing method and device
KR20140112900A (en) Communication connecting method for bluetooth device and apparatus therefor
WO2022048500A1 (en) Display method, and device
CN116360725B (en) Display interaction system, display method and device
CN109067981B (en) Split screen application switching method and device, storage medium and electronic equipment
CN112130788A (en) Content sharing method and device
CN113051015A (en) Page rendering method and device, electronic equipment and storage medium
CN114143906B (en) Electronic equipment connection method and electronic equipment
CN113703849B (en) Screen-casting application opening method and device
CN110086814B (en) Data acquisition method and device and storage medium
CN115814403A (en) Equipment control method and device
CN114513479B (en) Message transmitting and receiving method, device, terminal, server and storage medium
CN114173315B (en) Bluetooth reconnection method and terminal equipment
CN115087134B (en) Bluetooth connection method and electronic equipment
CN114125805B (en) Bluetooth reconnection method and terminal equipment
US20240353988A1 (en) Display Interaction System, and Display Method and Device
WO2024067170A1 (en) Device management method and electronic device
US20240353987A1 (en) Display Interaction System, and Display Method and Device
WO2023045966A1 (en) Capability sharing method, electronic devices and computer-readable storage medium
US20240201932A1 (en) Display method, electronic device, and system
CN115700443A (en) Split screen recommendation method and device

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination