WO2023040848A9 - Device control method and apparatus

Device control method and apparatus

Info

Publication number
WO2023040848A9
Authority
WO
WIPO (PCT)
Prior art keywords
terminal
interface
data
target
control
Prior art date
Application number
PCT/CN2022/118543
Other languages
English (en)
Chinese (zh)
Other versions
WO2023040848A1 (fr)
Inventor
刘华军 (LIU, Huajun)
杨金华 (YANG, Jinhua)
杨毅轩 (YANG, Yixuan)
Original Assignee
荣耀终端有限公司 (Honor Device Co., Ltd.)
Priority date
Filing date
Publication date
Application filed by Honor Device Co., Ltd. (荣耀终端有限公司)
Publication of WO2023040848A1
Publication of WO2023040848A9


Classifications

    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F 13/00 Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F 13/40 Processing input control signals of video game devices, e.g. signals generated by the player or derived from the environment
    • A63F 13/42 Processing input control signals of video game devices, e.g. signals generated by the player or derived from the environment by mapping the input signals into game commands, e.g. mapping the displacement of a stylus on a touch screen to the steering angle of a virtual vehicle
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F 13/00 Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F 13/50 Controlling the output signals based on the game progress
    • A63F 13/52 Controlling the output signals based on the game progress involving aspects of the displayed game scene
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0487 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F 3/0488 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F 3/04886 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures by partitioning the display area of the touch-screen or the surface of the digitising tablet into independently controllable areas, e.g. virtual keyboards or menus
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 9/00 Arrangements for program control, e.g. control units
    • G06F 9/06 Arrangements for program control, e.g. control units using stored programs, i.e. using an internal store of processing equipment to receive or retain programs
    • G06F 9/44 Arrangements for executing specific programs
    • G06F 9/451 Execution arrangements for user interfaces

Definitions

  • the present application relates to the technical field of terminals, and in particular to a device control method and device.
  • terminal equipment has become a part of people's work and life.
  • a function of a terminal device often needs another terminal to complete. For example, when the first terminal needs to execute a target function, the user needs to use the second terminal to control the first terminal to complete the target function.
  • the present application provides a device control method and apparatus, which enable the user, with simple operations, to realize target functions more conveniently by using the second terminal to control the first terminal, thereby improving the user's operation efficiency.
  • the present application provides a device control method, which is applied to a communication system.
  • the communication system includes a first terminal and a second terminal.
  • the method includes: the first terminal is in a first state, where the first terminal can execute a first function when in the first state; the first terminal sends control information to the second terminal based on the established communication connection; the second terminal activates the target functional unit in response to the control information; and the second terminal cooperates with the first terminal, based on the target functional unit, to execute the first function.
  • the user only needs to establish a communication connection between the first terminal and the second terminal.
  • the first terminal sends control information to the second terminal.
  • the second terminal turns on the target functional unit in response to the control information, and then cooperates with the first terminal, based on the target functional unit, to execute the first function. In this way, the user's operation steps are simplified and operation efficiency is improved.
  • the second terminal cooperates with the first terminal to execute the first function based on the target functional unit, including: the second terminal sends target data to the first terminal in response to a trigger operation on the target functional unit; and the first terminal performs a first function associated with the target data.
  • the target functional unit is a virtual text button in the second interface
  • the target data is text data.
  • the first terminal being in the first state includes: the first terminal displays a first interface, where the first interface includes an area where text is to be input.
  • the first terminal executes the first function associated with the target data, including: when the first terminal displays the first interface, inputting text data in an area where text is to be input on the first interface.
  • the user needs at most two operations to realize the text data input in the first interface of the first terminal by means of the second terminal. Furthermore, the user's operation steps are simplified, and the operation efficiency is improved.
  • the target data is control data.
  • the first terminal being in the first state includes: the first terminal displays a first interface, where the first interface includes game characters.
  • Executing the first function associated with the target data by the first terminal includes: when the first terminal displays the first interface, controlling the game character in the first interface to perform the game behavior corresponding to the control data.
  • the user needs at most two operations to control the game character in the first terminal to perform the game behavior associated with the triggering operation by means of the second terminal. Furthermore, the user's operation steps are simplified, and the operation efficiency is improved.
  • the target functional unit is a virtual control keyboard in the second interface, where the virtual control keyboard includes virtual keys for controlling the game behavior of the game character.
  • the control data is attitude data
  • the target functional unit is an attitude sensor of the second terminal for collecting attitude data.
  • the target functional unit is a virtual control keyboard in the second interface
  • the method further includes: the second terminal receives the first interface data of the first interface.
  • the second terminal displays second interface data on the second interface, where the second interface data includes display content corresponding to the first interface data.
  • the target functional unit is a video collection module
  • the target data is the second video data collected by the video collection module.
  • the first terminal being in the first state includes: the first terminal displays a first interface, where the first interface includes first video data from the third terminal.
  • the first terminal executes the first function associated with the target data, including: the first terminal receives the second video data from the second terminal when the first interface is displayed; the first terminal sends the second video data to the third terminal.
  • the user needs at most two operations to implement the function of the first terminal performing video interaction with the help of the second terminal, and the user's hands are freed. Furthermore, the user's operation steps are simplified, and the operation efficiency is improved.
  • the method further includes: the second terminal receives the first video data from the first terminal.
  • the second terminal displays a second interface, where the second interface includes the first video data.
  • the user can also watch the first video data from the third terminal on the second terminal.
  • the method for establishing a communication connection includes: when the first terminal is in the first state, performing device discovery; when the first terminal discovers the second terminal, establishing a communication connection with the second terminal .
  • the second terminal acquires the device identifier of the first terminal stored in a target electronic device; the second terminal establishes a communication connection with the first terminal according to the device identifier.
  • the first terminal receives voice information input by the user, where the voice information is used to indicate a connection with the second terminal; the first terminal performs device discovery and establishes a communication connection with the second terminal when the second terminal is found.
  • the second terminal transmits a UWB signal
  • the UWB signal carries the device identifier of the second terminal
  • the first terminal receives the UWB signal from the second terminal
  • the first terminal establishes a communication connection with the second terminal according to the device identifier in the UWB signal .
  • before the first terminal sends the control information to the second terminal based on the established communication connection, the method further includes: the second terminal displays a fifth interface, where the fifth interface includes a first control for the third preset mode; in response to a trigger operation on the first control, the second terminal enters the third preset mode.
  • control information is information indicating that the first terminal is in the first state
  • the second terminal starts the target functional unit in response to the control information, including: the second terminal enables the target functional unit according to the information indicating that the first terminal is in the first state.
  • control information is an instruction for enabling the target functional unit
  • the second terminal responds to the control information for enabling the target functional unit, including: the second terminal activates the target functional unit according to the instruction for enabling the target functional unit.
  • In other words, the decision to activate the target functional unit may be made by the first terminal, or it may be made by the second terminal, as illustrated in the sketch below.
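  • The following is a minimal, hypothetical sketch of this dispatch on the second terminal; the application does not prescribe a message format, and all type and field names here are assumptions:

```java
// Hypothetical sketch of the two control-information variants described above.
// The application does not prescribe a message format; all names are assumed.
enum ControlType { STATE_INDICATION, ENABLE_INSTRUCTION }

class ControlMessage {
    ControlType type;
    String firstTerminalState; // variant 1: e.g. "text_input_interface"
    String targetUnitId;       // variant 2: unit chosen by the first terminal
}

class SecondTerminalDispatcher {
    void onControlInformation(ControlMessage msg) {
        switch (msg.type) {
            case STATE_INDICATION:
                // Variant 1: the second terminal decides which unit to enable.
                enableUnit(chooseUnitForState(msg.firstTerminalState));
                break;
            case ENABLE_INSTRUCTION:
                // Variant 2: the first terminal already decided.
                enableUnit(msg.targetUnitId);
                break;
        }
    }

    String chooseUnitForState(String state) {
        // e.g. a game interface maps to the virtual control keyboard,
        // a text input interface maps to the virtual text keys.
        return "game_interface".equals(state)
                ? "virtual_control_keyboard" : "virtual_text_keys";
    }

    void enableUnit(String unitId) {
        System.out.println("enabling target functional unit: " + unitId);
    }
}
```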
  • the present application also provides a device control device, which is applied to a first terminal.
  • the device control device provided in the present application includes: a first processing unit, configured to control the first terminal to be in the first state, where the first terminal can perform a first function while in the first state.
  • the first communication unit is configured to send control information to the second terminal based on the established communication connection, where the control information is used to instruct the second terminal to activate the target functional unit so as to cooperate with the first terminal to execute the first function.
  • the first communication unit is further configured to receive target data from the second terminal.
  • the first processing unit is further configured to execute a first function associated with the target data.
  • the target functional unit is a virtual text button in the second interface
  • the target data is text data.
  • the first processing unit is specifically configured to control the first terminal to display the first interface, where the first interface includes an area where text is to be input.
  • the first processing unit is further specifically configured to control the first terminal to input text data in an area where text is to be input on the first interface when the first interface is displayed.
  • the target data is control data.
  • the first processing unit is specifically configured to control the first terminal to display the first interface, where the first interface includes game characters.
  • the first processing unit is further specifically configured to control the first terminal to control the game character in the first interface to perform the game behavior corresponding to the control data when the first interface is displayed.
  • the target functional unit is a virtual control keyboard in the second interface, where the virtual control keyboard includes virtual keys for controlling the game behavior of the game character.
  • the control data is attitude data
  • the target functional unit is an attitude sensor of the second terminal for collecting attitude data.
  • the target functional unit is a video collection module
  • the target data is the second video data collected by the video collection module.
  • the first processing unit is specifically configured to control the first terminal to display the first interface, where the first interface includes first video data from the third terminal; and the first processing unit is further configured to, when the first interface is displayed, receive the second video data from the second terminal and send the second video data to the third terminal.
  • the present application further provides a device control device, which is applied to a second terminal.
  • the device control device provided in the third aspect includes: a second communication unit, configured to receive control information from the first terminal based on an established communication connection.
  • the second processing unit is configured to enable the target function unit in response to the control information.
  • the second processing unit is further configured to cooperate with the first terminal to execute the first function based on the target function unit.
  • the second processing unit is specifically configured to, in response to a trigger operation on the target functional unit, control the second communication unit to send target data to the first terminal, where the target data is used to instruct the first terminal to perform a first function associated with the target data.
  • the target functional unit is a virtual text button in the second interface
  • the target data is text data
  • the text data is used to instruct the first terminal, when displaying the first interface, to enter the text data in the area of the first interface where text is to be input.
  • the target data is control data
  • the target data is used to instruct the first terminal to control the game character in the first interface to perform the game behavior corresponding to the control data when the first interface is displayed.
  • the target functional unit is a virtual control keyboard in the second interface, where the virtual control keyboard includes virtual keys for controlling the game behavior of the game character; or, the control data is attitude data, and the target functional unit is an attitude sensor of the second terminal for collecting attitude data.
  • the target functional unit is a video collection module
  • the target data is second video data collected by the video collection module
  • the second video data is used to instruct the first terminal, when displaying the first interface including the first video data, to send the second video data to the third terminal.
  • control information is information indicating that the first terminal is in the first state
  • the second processing unit is specifically configured to enable the target functional unit according to the information indicating that the first terminal is in the first state.
  • control information is an instruction for enabling the target functional unit
  • the second processing unit is specifically configured to enable the target functional unit according to the instruction for enabling the target functional unit.
  • the present application also provides a computer-readable storage medium that stores instructions; when the instructions are executed, the computer performs the method executed by the first terminal or the second terminal as described in the first aspect or any implementation manner of the first aspect.
  • the present application further provides a computer program product, including a computer program, which, when run, causes the computer to perform the method executed by the first terminal or the second terminal as described in the first aspect or any implementation manner of the first aspect.
  • FIG. 1 and FIG. 2 are schematic diagrams of the interface for inputting text data, by means of the mobile phone 200, in the first interface displayed by the smart TV 100;
  • FIG. 3 is a schematic diagram of a hardware system architecture of a terminal device provided in an embodiment of the present application.
  • FIG. 4 is a schematic diagram of a software system architecture of a terminal device provided in an embodiment of the present application.
  • FIG. 5 is a schematic diagram of the mobile phone 200 obtaining the device identifier of the smart TV 100 by approaching the remote control 300 of the smart TV 100 provided by the embodiment of the present application;
  • FIG. 6 is a schematic diagram of the mobile phone 200 receiving the voice information "connect to TV" input by the user provided by the embodiment of the present application;
  • FIG. 7 is a schematic diagram of establishing a communication connection between the mobile phone 200 and the smart TV 100 provided by the embodiment of the present application;
  • FIG. 8 is a schematic diagram of an interface in which the smart TV 100 sends the interface identifier of the first interface 101 to the mobile phone 200 during the process of displaying the first interface 101 provided by the embodiment of the present application;
  • FIG. 9 is a schematic diagram of inputting an account and password in the first interface 101 of the smart TV 100, in response to the user's trigger operation on a virtual text button in the second interface 207, while the mobile phone 200 provided by the embodiment of the present application displays the second interface 207;
  • FIG. 10 is a schematic diagram of an interface in which the smart TV 100 provided in the embodiment of the present application establishes a communication connection with the mobile phone 200 while displaying a game interface;
  • FIG. 11 is a schematic diagram of an interface of the smart TV 100 sending the interface identifier of the game interface to the mobile phone 200 during the process of displaying the game interface provided by the embodiment of the present application;
  • FIG. 12 is a schematic diagram of the mobile phone 200 provided by the embodiment of the present application responding to the user's trigger operation on the light hand attack button "A" in the second interface 207 and controlling the game character A in the first interface 101 of the smart TV 100 to perform a light hand attack on game character B;
  • FIG. 13 is a schematic diagram of an interface in which the mobile phone 200 establishes a communication connection with the smart TV 100 while displaying the video playback interface provided by the embodiment of the present application;
  • FIG. 14 is a schematic diagram of the interface of the mobile phone 200 implementing the large-screen video interaction function by means of the software and hardware of the smart TV 100 provided by the embodiment of the present application;
  • FIG. 15 is a schematic diagram of an interface for setting the mobile phone 200 in the third preset mode provided by the embodiment of the present application.
  • FIG. 16 is one of the schematic flowcharts of the device control method provided by the embodiment of the present application.
  • FIG. 17 is a schematic diagram of a functional architecture in which the first terminal executes the function associated with the target data by means of the second terminal provided by the embodiment of the present application;
  • FIG. 18 is a structural block diagram of a terminal device 1800 provided in an embodiment of the present application.
  • FIG. 19 is a structural block diagram of a terminal device 1900 provided in an embodiment of the present application.
  • FIG. 20 is a schematic diagram of a hardware structure of a terminal device provided in an embodiment of the present application.
  • FIG. 21 is a schematic structural diagram of a chip provided by an embodiment of the present application.
  • words such as “first” and “second” are used to distinguish the same or similar items with basically the same function and effect.
  • the first value and the second value are only used to distinguish different values, and their sequence is not limited.
  • words such as "first" and "second" do not limit the quantity or execution order, and do not necessarily indicate that the items referred to are different.
  • "At least one" means one or more, and "multiple" means two or more.
  • “And/or” describes the association relationship of associated objects, indicating that there may be three types of relationships, for example, A and/or B, which can mean: A exists alone, A and B exist at the same time, and B exists alone, where A, B can be singular or plural.
  • the character “/” generally indicates that the contextual objects are an “or” relationship.
  • “At least one of the following" or similar expressions refer to any combination of these items, including any combination of single or plural items.
  • "At least one item (piece) of a, b, or c" can represent: a; b; c; a and b; a and c; b and c; or a, b, and c, where a, b, and c can each be single or multiple.
  • some functions of the first terminal often need the second terminal to be completed more conveniently.
  • Taking the first terminal as the smart TV 100 and the second terminal as the mobile phone 200 as an example, how the smart TV 100 can more conveniently complete a certain function with the help of the mobile phone 200 is described below with reference to FIGS. 1-2.
  • the smart TV 100 displays a first interface 101, which includes an account input area A and a password input area B; the user needs to input an account in the account input area A and a password in the password input area B to complete the login and use the smart TV 100.
  • the user may use the mobile phone 200 to perform at least the following steps to enter the account and password on the first interface 101 .
  • Step 1: As shown in FIG. 1, the user can establish a communication connection between the mobile phone 200 and the smart TV 100.
  • Step 2: As shown in (a)-(b) in FIG. 2, the mobile phone 200 displays the system main interface 201 in response to the user's unlocking operation.
  • Step 3: As shown in (b) in Figure 2, the displayed system main interface 201 includes the icon 202 of the "collaboration" application; the mobile phone 200, in response to the user's trigger operation on the icon 202 of the "collaboration" application, displays the main interface 203 of the "collaboration" application, as shown in (c) in Figure 2.
  • Step 4: As shown in (c) of FIG. 2, the main interface 203 of the "collaboration" application includes a control 204 for instructing to realize the coordination function.
  • the mobile phone 200 may display the function list interface 205 in response to the user's trigger operation on the control 204 .
  • the “text input application”, “handle application” and “game application” are displayed in the function list interface 205 , and switch controls 206 are displayed on one side of the “text input application”, “handle application” and “game application”.
  • Step 5: As shown in (c)-(d) in FIG. 2, the mobile phone 200 may display a second interface 207 in response to the user's trigger operation on the switch control 206 on the "text input application" side, where the second interface 207 includes virtual keys.
  • the mobile phone 200 can control the smart TV 100 to input an account in the account input area A and a password in the password input area B in response to the user's trigger operation on the virtual key, so as to complete the login.
  • For the smart TV 100 to complete the function of inputting the account and password with the help of the mobile phone 200 as described above, the user must perform at least the above steps 1-5; the operation steps are complicated and inefficient.
  • an embodiment of the present application provides a device control method.
  • when the first terminal displays the first interface and needs to use the second terminal to complete a certain function,
  • a communication connection can be established between the first terminal and the second terminal; the first terminal sends the identifier of the first interface to the second terminal; the second terminal recognizes the identifier of the first interface and displays the second interface corresponding to that identifier; during the process of displaying the second interface, the second terminal acquires the target data; the second terminal sends the target data to the first terminal; and the first terminal executes a function associated with the target data. In this way, the user's operation steps are reduced, the first terminal can complete the target function by means of the second terminal, and efficiency is improved. A minimal sketch of this flow follows.
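  • The sketch below assumes a plain TCP socket as the established communication connection and a line-based message format; neither is specified by the embodiments, so this is illustrative only:

```java
import java.io.BufferedReader;
import java.io.IOException;
import java.io.InputStreamReader;
import java.io.PrintWriter;
import java.net.Socket;

// Rough sketch of the first-terminal side of the flow. The transport (a plain
// TCP socket here) and the line-based message format are assumptions; the
// embodiments do not fix either.
class FirstTerminal {
    private final Socket link; // the established communication connection

    FirstTerminal(Socket link) { this.link = link; }

    // Called when the first terminal enters the first state, for example when
    // it starts displaying a text input interface or a game interface.
    void onEnterFirstState(String interfaceId) throws IOException {
        PrintWriter out = new PrintWriter(link.getOutputStream(), true);
        out.println("INTERFACE_ID:" + interfaceId); // the control information
    }

    // Target data arriving from the second terminal drives the first function.
    void receiveLoop() throws IOException {
        BufferedReader in = new BufferedReader(
                new InputStreamReader(link.getInputStream()));
        String targetData;
        while ((targetData = in.readLine()) != null) {
            executeFirstFunction(targetData); // e.g. enter text, move character
        }
    }

    void executeFirstFunction(String targetData) {
        System.out.println("executing first function for: " + targetData);
    }
}
```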
  • the terminal device can be a mobile phone, smart TV, wearable device, tablet computer (Pad), computer with wireless transceiver function, virtual reality (VR) terminal device, augmented reality (AR) terminal device, wireless terminal in industrial control, wireless terminal in self-driving, wireless terminal in remote medical surgery, wireless terminal in smart grid, wireless terminal in transportation safety, wireless terminal in smart city, wireless terminal in smart home, etc.
  • the embodiment of the present application does not limit the specific technology and specific device form adopted by the terminal device.
  • FIG. 3 is a schematic structural diagram of a terminal device provided in an embodiment of the present application.
  • the terminal device may include a processor 110, an external memory interface 120, an internal memory 121, a universal serial bus (universal serial bus, USB) interface 130, a charging management module 140, a power management module 141, an antenna 1, an antenna 2, and a mobile communication module 150, a wireless communication module 160, an audio module 170, a speaker 170A, a receiver 170B, a microphone 170C, a sensor module 180, a button 190, an indicator 192, a camera 193, and a display screen 194, etc.
  • the sensor module 180 may include a pressure sensor 180A, a gyroscope sensor 180B, an air pressure sensor 180C, a magnetic sensor 180D, an acceleration sensor 180E, a distance sensor 180F, a proximity light sensor 180G, a fingerprint sensor 180H, a temperature sensor 180J, a touch sensor 180K, an ambient light sensor 180L, bone conduction sensor 180M, etc.
  • the combination of the gyro sensor 180B and the acceleration sensor 180E can be understood as an attitude sensor.
  • the structure shown in the embodiment of the present application does not constitute a specific limitation on the terminal device.
  • the terminal device may include more or fewer components than shown in the figure, or combine certain components, or separate certain components, or arrange different components.
  • the illustrated components can be realized in hardware, software or a combination of software and hardware.
  • Processor 110 may include one or more processing units. Wherein, different processing units may be independent devices, or may be integrated in one or more processors.
  • a memory may also be provided in the processor 110 for storing instructions and data.
  • the USB interface 130 is an interface conforming to the USB standard specification, specifically, it can be a Mini USB interface, a Micro USB interface, a USB Type C interface, and the like.
  • the USB interface 130 can be used to connect a charger to charge the terminal device, and can also be used to transmit data between the terminal device and peripheral devices. It can also be used to connect headphones and play audio through them. This interface can also be used to connect other electronic devices, such as AR devices.
  • the charging management module 140 is configured to receive a charging input from a charger.
  • the charger may be a wireless charger or a wired charger.
  • the power management module 141 is used for connecting the charging management module 140 and the processor 110 .
  • the wireless communication function of the terminal device can be realized by the antenna 1, the antenna 2, the mobile communication module 150, the wireless communication module 160, the modem processor and the baseband processor.
  • Antenna 1 and Antenna 2 are used to transmit and receive electromagnetic wave signals.
  • Antennas in terminal devices can be used to cover single or multiple communication frequency bands. Different antennas can also be multiplexed to improve antenna utilization.
  • the mobile communication module 150 can provide wireless communication solutions including 2G/3G/4G/5G applied on terminal equipment.
  • the mobile communication module 150 may include at least one filter, switch, power amplifier, low noise amplifier (low noise amplifier, LNA) and the like.
  • the mobile communication module 150 can receive electromagnetic waves through the antenna 1, filter and amplify the received electromagnetic waves, and send them to the modem processor for demodulation.
  • the wireless communication module 160 can provide wireless communication solutions applied to the terminal device, including wireless local area network (WLAN) (such as a wireless fidelity (Wi-Fi) network), Bluetooth (BT), global navigation satellite system (GNSS), frequency modulation (FM), and the like.
  • the terminal device realizes the display function through the GPU, the display screen 194, and the application processor.
  • the GPU is a microprocessor for image processing, and is connected to the display screen 194 and the application processor. GPUs are used to perform mathematical and geometric calculations for graphics rendering.
  • the display screen 194 is used to display images, videos and the like.
  • the display screen 194 includes a display panel.
  • the terminal device may include 1 or N display screens 194, where N is a positive integer greater than 1.
  • the terminal device can realize the shooting function through ISP, camera 193 , video codec, GPU, display screen 194 and application processor.
  • Camera 193 is used to capture still images or video.
  • the terminal device may include 1 or N cameras 193, where N is a positive integer greater than 1.
  • the external memory interface 120 can be used to connect an external memory card, such as a Micro SD card, to expand the storage capacity of the terminal device.
  • the external memory card communicates with the processor 110 through the external memory interface 120 to implement a data storage function. Such as saving music, video and other files in the external memory card.
  • the internal memory 121 may be used to store computer-executable program codes including instructions.
  • the internal memory 121 may include an area for storing programs and an area for storing data.
  • the internal memory 121 can be used to store the first target table and the second target table.
  • For the specific content of the first target table, refer to Table 1 and Table 2 in the following embodiments; for the specific content of the second target table, refer to Table 3 in the following embodiments. They are not introduced here.
  • the terminal device can implement audio functions through the audio module 170, the speaker 170A, the receiver 170B, the microphone 170C, and the application processor. Such as music playback, recording, etc.
  • the audio module 170 is used to convert digital audio information into analog audio signal output, and is also used to convert analog audio input into digital audio signal.
  • the speaker 170A, also referred to as a "horn", is used to convert audio electrical signals into sound signals.
  • the terminal device can play music or listen to voice through the speaker 170A.
  • the receiver 170B, also called an "earpiece", is used to convert audio electrical signals into sound signals. When the terminal device answers a phone call or voice information, the receiver 170B can be placed close to the human ear to listen to the voice.
  • the microphone 170C, also called a "mic", is used to convert sound signals into electrical signals, for example when collecting voice.
  • the pressure sensor 180A is used to sense the pressure signal and convert the pressure signal into an electrical signal.
  • pressure sensor 180A may be disposed on display screen 194 .
  • the gyroscope sensor 180B can be used to determine the motion posture of the terminal device.
  • the air pressure sensor 180C is used to measure air pressure.
  • the magnetic sensor 180D includes a Hall sensor.
  • the acceleration sensor 180E can detect the acceleration of the terminal device in various directions (generally three axes).
  • the distance sensor 180F is used to measure the distance.
  • Proximity light sensor 180G may include, for example, light emitting diodes (LEDs) and light detectors, such as photodiodes.
  • the ambient light sensor 180L is used for sensing ambient light brightness.
  • the fingerprint sensor 180H is used to collect fingerprints.
  • the temperature sensor 180J is used to detect temperature.
  • the touch sensor 180K is also called “touch device”.
  • the touch sensor 180K can be disposed on the display screen 194, and the touch sensor 180K and the display screen 194 form a touch screen, also called a “touch screen”.
  • the bone conduction sensor 180M can acquire vibration signals.
  • the keys 190 include a power key, a volume key and the like.
  • the key 190 may be a mechanical key. It can also be a touch button.
  • the terminal device can receive key input and generate key signal input related to user settings and function control of the terminal device.
  • the indicator 192 can be an indicator light, and can be used to indicate charging status, power change, and can also be used to indicate messages, missed calls, notifications, and the like.
  • the software system of the terminal device may adopt a layered architecture, an event-driven architecture, a micro-kernel architecture, a micro-service architecture, or a cloud architecture, etc., which will not be repeated here.
  • the embodiment of the present application takes the Android system with a layered architecture as an example to illustrate the software structure of the terminal device.
  • FIG. 4 is a software structural block diagram of a terminal device applicable to an embodiment of the present application.
  • the layered architecture divides the software system of the terminal device into several layers, and each layer has a clear role and division of labor. Layers communicate through software interfaces.
  • the Android system can be divided into five layers: the application layer, the application framework layer, the Android runtime and system library, the hardware abstraction layer (HAL), and the kernel layer (kernel).
  • the application program layer may include a series of application program packages, and the application program layer runs the application program by calling an application program interface (application programming interface, API) provided by the application program framework layer.
  • the application package can include applications such as camera, text input, game, handle, collaborative control, WeChat, phone, map, navigation, WLAN, Bluetooth, and messages.
  • the display interface when the text input application is opened includes a virtual text key, and the virtual text key is used for inputting text.
  • the virtual text keys include number keys, letter keys, and symbol keys, etc., which are not limited herein.
  • the display interface is the game interface.
  • the game interface may include game character A and game character B in a fighting game.
  • the display interface when the handle application is opened may include a virtual control keyboard.
  • the virtual control keyboard can include a forward key "→", a back key "←", a jump key "↑", a squat key "↓", a light hand attack key "A", a heavy hand attack key "B", a light foot attack key "C", and a heavy foot attack key "D".
  • the coordinated control application can be used to call application programs such as text input application, joystick application, and video application.
  • the coordinated control application can also be used to call hardware modules such as the acceleration sensor 180E, the gyroscope sensor 180B, the microphone 170C, and the speaker 170A.
  • the application framework layer provides API and programming framework for applications in the application layer.
  • the application framework layer includes some predefined functions. As shown in Figure 4, the application framework layer can include window manager, content provider, view system, phone manager, resource manager, notification manager, etc.
  • a window manager is used to manage window programs.
  • the window manager can get the size of the display screen, determine whether there is a status bar, lock the screen, capture the screen, etc.
  • Content providers are used to store and retrieve data and make it accessible to applications.
  • the view system includes visual controls, such as controls for displaying text, controls for displaying pictures, and so on.
  • the view system can be used to build applications.
  • a display interface can consist of one or more views.
  • the Android runtime includes core libraries and a virtual machine.
  • the Android runtime is responsible for the scheduling and management of the Android system.
  • the core library consists of two parts: one part is the functions that the Java language needs to call, and the other part is the core library of Android.
  • the application layer and the application framework layer run in virtual machines.
  • the virtual machine executes the java files of the application program layer and the application program framework layer as binary files.
  • the virtual machine is used to perform functions such as object life cycle management, stack management, thread management, security and exception management, and garbage collection.
  • a system library can include multiple function modules.
  • the surface manager is used to manage the display subsystem and provides the fusion of 2D and 3D layers for multiple applications.
  • the media library supports playback and recording of various commonly used audio and video formats, as well as still image files, etc.
  • the hardware abstraction layer can contain multiple library modules, such as camera library modules, motor library modules, etc.
  • the Android system can load corresponding library modules for the device hardware, and then realize the purpose of the application framework layer accessing the device hardware.
  • Device hardware may include, for example, speakers, display screens, and cameras in terminal devices.
  • the kernel layer is the layer between hardware and software.
  • the kernel layer is used to drive the hardware and make the hardware work.
  • the kernel layer includes at least a display driver, a motor driver, a sensor driver, etc., which are not limited in this embodiment of the present application.
  • the smart TV 100 displays a first interface 101 (that is, the smart TV 100 is in a first state).
  • the first interface 101 includes an account input area A and a password input area B, that is, areas where text is to be input (in other words, the smart TV 100 can realize the text input function, that is, the first function).
  • the first interface 101 can be understood as a text input interface.
  • the user needs to use the mobile phone 200 to enter the account and password in the first interface 101 (that is, the smart TV 100 needs to use the mobile phone 200 to input text data), and the user can use the smart TV 100 only after the smart TV 100 successfully verifies the account and password.
  • the user uses the smart TV 100 to watch live channels, order movies, and play games.
  • Before the user enters the account and password on the first interface 101 with the help of the mobile phone 200, the smart TV 100 needs to establish a communication connection with the mobile phone 200.
  • the ways for the mobile phone 200 to communicate with the smart TV 100 may include, but are not limited to, the following:
  • The first way: the smart TV 100 performs device discovery during the process of displaying the first interface 101.
  • When the smart TV 100 discovers the mobile phone 200, it establishes a communication connection with the mobile phone 200.
  • the communication connection may be a Bluetooth connection, a WiFi connection, an NFC connection, a UWB connection, etc., which is not limited herein. In this way, the communication connection between the smart TV 100 and the mobile phone 200 can be established without user operation, which reduces user operation steps and improves efficiency. A sketch of such device discovery follows.
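  • The sketch assumes classic Bluetooth on an Android-based terminal, one of the connection types named above; runtime permission handling and the subsequent connection setup are omitted, so this is an illustration rather than the claimed implementation:

```java
import android.bluetooth.BluetoothAdapter;
import android.bluetooth.BluetoothDevice;
import android.content.BroadcastReceiver;
import android.content.Context;
import android.content.Intent;
import android.content.IntentFilter;

// Illustrative only: classic Bluetooth discovery on an Android-based terminal.
// Runtime permissions, pairing, and the later connection setup are omitted.
class DiscoveryHelper {
    void discoverNearbyDevices(Context ctx) {
        BluetoothAdapter adapter = BluetoothAdapter.getDefaultAdapter();
        ctx.registerReceiver(new BroadcastReceiver() {
            @Override
            public void onReceive(Context c, Intent intent) {
                BluetoothDevice device =
                        intent.getParcelableExtra(BluetoothDevice.EXTRA_DEVICE);
                // Once the smart TV 100 discovers the mobile phone 200 here,
                // it can go on to establish the communication connection.
                System.out.println("discovered: " + device.getAddress());
            }
        }, new IntentFilter(BluetoothDevice.ACTION_FOUND));
        adapter.startDiscovery(); // begin scanning for nearby devices
    }
}
```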
  • The second way: as shown in FIG. 5, the mobile phone 200 is brought close to the remote control 300 of the smart TV 100 (for example, the user touches the remote control 300 once with the mobile phone 200), and the mobile phone 200 can establish an NFC communication connection with the remote control 300.
  • Then, the mobile phone 200 can obtain, through NFC communication, the device identifier of the smart TV 100 (which may include the communication address of the smart TV 100) stored in the remote controller 300.
  • the mobile phone 200 can establish a communication connection with the smart TV 100 according to the device identifier. In this way, the user only needs to bring the mobile phone 200 close to the remote control 300 to establish a communication connection between the mobile phone 200 and the smart TV 100 , which is convenient and fast.
  • the mobile phone 200 may establish a communication connection with the smart TV 100 only when the mobile phone 200 establishes NFC communication with the remote control 300 twice (that is, the user uses the mobile phone 200 to touch the remote control 300 twice).
  • the aforementioned remote controller 300 may be replaced with a router, or a smart TV 100 , etc., which is not limited herein. It can be understood that the aforementioned remote control 300 , router, or smart TV 100 are target electronic devices.
  • The third way: as shown in FIG. 6, the mobile phone 200 can receive the voice information "connect to TV" input by the user. Then, as shown in FIG. 7, the mobile phone 200 discovers the smart TV 100 and establishes a communication connection with it. Understandably, the above voice information is used to indicate a communication connection with the smart TV 100, and it may also be replaced with "connect TV", etc., which is not limited here.
  • The fourth way: the mobile phone 200 can transmit a UWB signal.
  • the UWB signal carries the device identifier of the mobile phone 200 (which may include the communication address of the mobile phone 200).
  • the smart TV 100 can receive the device identification of the mobile phone 200 .
  • the smart TV 100 establishes a communication connection with the mobile phone 200 according to the device identifier. In this way, the user only needs to align the mobile phone 200 with the smart TV 100 to establish a communication connection between the mobile phone 200 and the smart TV 100 , which is convenient and quick.
  • the smart TV 100 controls the mobile phone 200 to display the second interface during the process of displaying the first interface 101 .
  • the second interface includes a virtual text button, and the virtual text button is used for inputting text.
  • the virtual text keys include number keys, letter keys, and symbol keys, etc., which are not limited herein. Understandably, the above-mentioned second interface is a keyboard interface. In the following, how the smart TV 100 controls the mobile phone 200 to display the second interface will be described in conjunction with Table 1 and Table 2 respectively.
  • the mobile phone 200 may store a first target table, and the first target table includes a mapping relationship between the interface identifier of the smart TV 100 and the interface identifier of the mobile phone 200 .
  • the contents of the first target table may be shown in Table 1 below:
  • Table 1:
    Interface identifier of the smart TV 100 | Interface identifier of the mobile phone 200
    Interface ID of the text input interface | Interface ID of the keyboard interface
    Interface ID of the game interface | Interface ID of the handle interface
    Interface ID of the video playback interface | Interface ID of the video playback interface
  • After the mobile phone 200 is successfully connected with the smart TV 100, as shown in FIG. 8, the smart TV 100 sends the interface identifier of the first interface 101 (that is, control information) to the mobile phone 200 during the process of displaying the first interface 101.
  • the mobile phone 200 searches out the interface identifier of the second interface 207 (ie, the interface identifier of the keyboard interface) from the first target table according to the interface identifier of the first interface 101 .
  • the mobile phone 200 displays the second interface 207 (ie, the keyboard interface) according to the found interface identifier.
  • The above mapping relationship can also be replaced by a mapping relationship between the interface identifier of the smart TV 100 and the program package name of the mobile phone 200.
  • content of the first target table may also be shown in Table 2 below:
  • Table 2:
    Interface identifier of the smart TV 100 | Program package name of the mobile phone 200
    Interface ID of the text input interface | Package name of the keyboard application
    Interface ID of the game interface | Package name of the handle application
    Interface ID of the video playback interface | Package name of the video application
  • the mobile phone 200 may also find out the program package name of the keyboard application from the first target table according to the interface identifier of the first interface 101 . Furthermore, the mobile phone 200 opens the keyboard application to display the second interface 207 according to the found program package name of the keyboard application.
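  • As a minimal sketch of this Table 2 lookup on the mobile phone 200 (the interface identifiers and package names below are placeholders, and launching via PackageManager is just one way to open the matched application):

```java
import java.util.HashMap;
import java.util.Map;

import android.content.Context;
import android.content.Intent;

// Minimal sketch of the Table 2 variant: the interface identifier received
// from the smart TV 100 is mapped to a package name on the mobile phone 200.
// All identifiers and package names below are placeholders.
class FirstTargetTable {
    private static final Map<String, String> ID_TO_PACKAGE = new HashMap<>();
    static {
        ID_TO_PACKAGE.put("text_input_interface", "com.example.keyboard");
        ID_TO_PACKAGE.put("game_interface", "com.example.handle");
        ID_TO_PACKAGE.put("video_play_interface", "com.example.video");
    }

    // Called when the control information carrying the interface identifier
    // of the first interface 101 arrives over the communication connection.
    static void openMatchingApp(Context ctx, String interfaceId) {
        String pkg = ID_TO_PACKAGE.get(interfaceId);
        if (pkg == null) return; // unknown identifier: nothing to display
        Intent launch = ctx.getPackageManager().getLaunchIntentForPackage(pkg);
        if (launch != null) ctx.startActivity(launch); // show the second interface
    }
}
```

  • For example, calling openMatchingApp(ctx, "text_input_interface") would open the keyboard application and thereby display the second interface 207.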
  • the mobile phone 200 may respond to the user's trigger operation on the virtual text button in the second interface 207, and enter the account and password.
  • the ways for the mobile phone 200 to input the account and password on the first interface 101 include, but are not limited to, the following two:
  • The first way: the virtual text key is marked with a key identifier.
  • the key identifier can be a number such as "0" or "1", an English letter such as "a" or "b", and so on.
  • the mobile phone 200 may, in response to the user's trigger operation on the virtual text key, send the key identification (ie target data) of the triggered virtual text key to the smart TV 100 .
  • the smart TV 100 inputs the received button identification on the first interface 101 .
  • The second way: a key identifier is marked on the virtual text key.
  • the key identifier may be a number such as "0" or "1", or an English letter such as "a" or "b".
  • both the smart TV 100 and the mobile phone 200 store a second target table, and the second target table includes a mapping relationship between key identifiers and key codes.
  • the content of the second target table may be shown in Table 3 below:
  • Table 3:
    Key identifier | Key code
    "0" | 48
    "1" | 49
    "2" | 50
    "3" | 51
    "4" | 52
    "5" | 53
    "6" | 54
    "7" | 55
    "8" | 56
    "9" | 57
    "Enter" | 13
    "A" | 65
    "B" | 66
    "C" | 67
    ... | ...
    "Z" | 90
  • the mobile phone 200 may, in response to the user's trigger operation on the virtual text key, search out the key code corresponding to the key identifier of the triggered virtual text key from the second target table.
  • the mobile phone 200 sends the found key code to the smart TV 100 .
  • the smart TV 100 finds the key identifier corresponding to the key code from the second target table, and inputs the found key identifier on the first interface 101. A sketch of this shared mapping follows.
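  • Because Table 3 assigns digits and capital letters their ASCII values, the shared mapping can be built programmatically. Both directions of the second way are sketched below; this is an illustration, not the claimed implementation:

```java
import java.util.HashMap;
import java.util.Map;

// Sketch of the second way: both terminals hold the Table 3 mapping, so only
// the numeric key code crosses the connection. The codes below follow Table 3
// (they coincide with ASCII values for digits and capital letters).
class SecondTargetTable {
    private static final Map<String, Integer> KEY_TO_CODE = new HashMap<>();
    private static final Map<Integer, String> CODE_TO_KEY = new HashMap<>();
    static {
        for (char c = '0'; c <= '9'; c++) put(String.valueOf(c), c); // "0"->48 ... "9"->57
        for (char c = 'A'; c <= 'Z'; c++) put(String.valueOf(c), c); // "A"->65 ... "Z"->90
        put("Enter", 13);
    }
    private static void put(String keyId, int code) {
        KEY_TO_CODE.put(keyId, code);
        CODE_TO_KEY.put(code, keyId);
    }

    // Mobile phone 200 side: triggered virtual text key -> key code to send.
    static int encode(String keyId) { return KEY_TO_CODE.get(keyId); }

    // Smart TV 100 side: received key code -> key identifier to input.
    static String decode(int code) { return CODE_TO_KEY.get(code); }
}
```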
  • the user only needs to establish a communication connection between the smart TV 100 and the mobile phone 200 .
  • the smart TV 100 sends the identifier of the first interface to the mobile phone 200 .
  • When the mobile phone 200 recognizes the identifier of the first interface, it displays the second interface corresponding to that identifier.
  • the mobile phone 200 receives a user's trigger operation; the smart TV 100 inputs the account and password corresponding to the trigger operation on the first interface.
  • the user needs at most two operations to enter the account and password in the first interface of the smart TV 100 by means of the mobile phone 200.
  • the user's operation steps are simplified, and the operation efficiency is improved.
  • The above embodiments take the first interface 101 of the smart TV 100 as a text input interface, where the user enters the account and password on the text input interface of the smart TV 100 by means of the mobile phone 200, as an example.
  • The following describes, taking the first interface 101 of the smart TV 100 as a game interface as an example, how the user controls the game character in the game interface of the smart TV 100 to execute game behavior with the help of the mobile phone 200.
  • the smart TV 100 displays a first interface 101 (that is, the smart TV 100 is in a first state).
  • the first interface 101 includes game character A and game character B (that is, the smart TV 100 has a function of controlling the behavior of the game character, that is, the first function). Understandably, the first interface 101 is a game interface.
  • the mobile phone 200 displays the system main interface 201 at this moment.
  • the main interface 201 of the system displayed on the mobile phone 200 can also be replaced by the mobile phone 200 being in a locked screen state or displaying other browsing interfaces, etc., and this is just an example.
  • the smart TV 100 needs to establish a communication connection with the mobile phone 200 .
  • the manner of the communication connection between the mobile phone 200 and the smart TV 100 may refer to the description in the foregoing embodiments, which will not be repeated here.
  • the smart TV 100 controls the mobile phone 200 to display the second interface 207 during the process of displaying the first interface 101 .
  • the mobile phone 200 displays the second interface 207 according to the interface identifier of the first interface 101 .
  • the principle of the mobile phone 200 displaying the second interface 207 according to the interface identifier of the first interface 101 is the same as that in the above-mentioned embodiment, and will not be repeated here.
  • the second interface 207 may include a virtual control keyboard 208 .
  • the virtual control keyboard 208 includes virtual keys for controlling the game behavior of the game character A.
  • the virtual control keyboard 208 may include a forward key "→", a backward key "←", a jump key "↑", a squat key "↓", a light hand attack key "A", a heavy hand attack key "B", a light foot attack key "C", and a heavy foot attack key "D".
  • the mobile phone 200 can also receive the interface data displayed on the first interface 101 from the smart TV 100, and display the interface data of the first interface 101 on the second interface 207, and then , the user can view the interface data of the first interface 101 in the smart TV 100 at close range on the mobile phone 200 .
  • When the mobile phone 200 is displaying the second interface 207, it can control the game behavior of the game character A in the first interface 101 in response to a user's trigger operation.
  • For example, in response to the user's trigger operation on the light hand attack key "A" in the second interface 207, the mobile phone 200 can control the game character A in the first interface 101 of the smart TV 100 to perform a light hand attack on game character B.
  • The principle by which the user, with the help of the mobile phone 200, controls the game character A in the first interface 101 of the smart TV 100 to attack the game character B is the same as the principle, described in the above embodiment, by which the user uses the mobile phone 200 to enter the account and password in the first interface 101 of the smart TV 100, and will not be repeated here. It can be understood that the mobile phone 200 can also control the game character A in the first interface 101 of the smart TV 100 to perform other operations based on the same principle, which is neither limited nor repeated here.
  • Alternatively, the interface shown in FIG. 11 may be replaced with that shown in (c) of FIG. 11.
  • the mobile phone 200 displays a second interface 211 .
  • the second interface 211 includes a virtual control keyboard 212, wherein, when the virtual control keyboard 212 is triggered, control data (that is, target data) on the game character is generated, and the control data is used to control the behavior of the game character A in the first interface 101.
  • the virtual control keyboard 212 may include a forward key "→", a backward key "←", a jump key "↑", a squat key "↓", a light hand attack key "A", a heavy hand attack key "B", a light foot attack key "C" and a heavy foot attack key "D".
  • the second interface 211 is the handle interface.
  • the mobile phone 200 can also control the game character A in the first interface 101 of the smart TV 100 to lightly attack the game character B in response to the user's trigger operation on the light-hand attack button "A" in the second interface 211 .
  • the mobile phone 200 displays a third interface 213 .
  • the first prompt information is displayed on the third interface 213 .
  • the first prompt information is used to indicate that the mobile phone 200 is in the first preset mode, where the first preset mode can be understood as the mobile phone 200 turning on the game mode.
  • the first prompt information may be but not limited to text information of "in game mode”.
  • the mobile phone 200 also finds the attitude sensor according to the identifier of the first interface 101 of the smart TV 100 and the preset mapping relationship. In this way, the mobile phone 200 can share the attitude data of the mobile phone 200 collected by the attitude sensor with the smart TV 100. After receiving the attitude data, the smart TV 100 can control the game character A to perform the operation associated with the attitude data.
  • when the mobile phone 200 moves to the right, it sends rightward attitude data (that is, target data) to the smart TV 100.
  • the smart TV 100 can control the game character A to move to the right according to the rightward attitude data.
  • when the mobile phone 200 moves downward, it sends downward attitude data to the smart TV 100.
  • the smart TV 100 can control the game character A to squat according to the downward attitude data.
  • when the mobile phone 200 is flipped forward, it sends forward-flip attitude data to the smart TV 100. In this way, the smart TV 100 can control the game character A to perform a light hand attack on the game character B according to the forward-flip attitude data.
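  • An Android-flavored Kotlin sketch of this attitude-sensor path is given below: accelerometer readings are reduced to coarse gesture labels and handed to a send callback. The ±4 m/s² threshold, the gesture names, and the sendToTv callback are assumptions; real attitude recognition would need filtering and calibration.

```kotlin
import android.hardware.Sensor
import android.hardware.SensorEvent
import android.hardware.SensorEventListener
import android.hardware.SensorManager

// Coarse attitude-to-gesture mapping; the threshold and labels are assumed.
private const val THRESHOLD = 4f // m/s², an illustrative value

class AttitudeForwarder(
    private val sensorManager: SensorManager,
    private val sendToTv: (String) -> Unit, // assumed send callback to the TV
) : SensorEventListener {

    fun start() {
        val accel = sensorManager.getDefaultSensor(Sensor.TYPE_ACCELEROMETER)
        sensorManager.registerListener(this, accel, SensorManager.SENSOR_DELAY_GAME)
    }

    override fun onSensorChanged(event: SensorEvent) {
        val (x, y, _) = event.values
        when {
            x > THRESHOLD -> sendToTv("MOVE_RIGHT") // phone moved to the right
            y < -THRESHOLD -> sendToTv("SQUAT")     // phone moved downward
        }
    }

    override fun onAccuracyChanged(sensor: Sensor?, accuracy: Int) = Unit
}
```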
  • the user only needs to establish a communication connection between the smart TV 100 and the mobile phone 200 .
  • the smart TV 100 sends the identifier of the first interface to the mobile phone 200 .
  • the mobile phone 200 displays the second interface corresponding to the identifier of the first interface.
  • the mobile phone 200 receives the user's trigger operation, and controls the game character in the smart TV 100 to perform the game behavior associated with the trigger operation.
  • the user needs at most two operations to control the game character in the smart TV 100 to perform the game behavior associated with the triggering operation by means of the mobile phone 200 .
  • the user's operation steps are simplified, and the operation efficiency is improved.
  • the smart TV 100 uses the software and/or hardware of the mobile phone 200 to implement a certain function.
  • the smart TV 100 uses the mobile phone 200 to input the account and password; or, the smart TV 100 uses the mobile phone 200 to control the game character to execute the game behavior.
  • the mobile phone 200 implements the large-screen video interactive function by means of the software and hardware of the smart TV 100 .
  • the user's mobile phone 200 receives a video call request from the third terminal.
  • the mobile phone 200 receives the first video data from the third terminal and collects the second video data in response to the confirmation operation of the video call request input by the user.
  • the first video data includes the image D collected by the third terminal and first sound data.
  • the second video data includes the second sound data collected by the microphone of the mobile phone 200 and the image C collected by the camera of the mobile phone 200.
  • the mobile phone 200 displays the first interface 209 (that is, the video playback interface), and plays the first sound data on the speaker of the mobile phone 200 (that is, the mobile phone 200 has a function of video interaction, that is, the first function).
  • the mobile phone 200 sends the second video data to the third terminal, so that the third terminal can play the second video data. In this way, video interaction between the mobile phone 200 and the third terminal can be realized.
  • the mobile phone 200 can use the smart TV 100 to collect the second video data.
  • the mobile phone 200 may use the smart TV 100 to display image C and image D. With reference to FIG. 13 and FIG. 14, the following describes how the mobile phone 200 collects the second video data with the help of the smart TV 100, and how the mobile phone 200 displays the image C and the image D with the help of the smart TV 100.
  • the mobile phone 200 establishes a communication connection with the smart TV 100 during the process of displaying the first interface 209 .
  • the manner of the communication connection between the mobile phone 200 and the smart TV 100 may refer to the description in the foregoing embodiments, which will not be repeated here.
  • after the mobile phone 200 establishes a communication connection with the smart TV 100, the mobile phone 200 controls the smart TV 100 to display the second interface 103 during the process of displaying the first interface 209. Wherein, image C and image D are also included in the second interface 103. Next, the process of how the mobile phone 200 controls the smart TV 100 to display the second interface 103 will be specifically introduced.
  • the mobile phone 200 sends the interface identifier of the first interface 209 (that is, the interface identifier of the video playback interface, that is, the control information) to the smart TV 100.
  • the smart TV 100 finds the identifier of the second interface 103 and the identifier of the speaker according to the interface identifier of the first interface 209 and the preset mapping relationship.
  • the smart TV 100 receives the first video data from the mobile phone 200, and the first video data includes the image D collected by the third terminal and the first sound data.
  • the smart TV 100 displays the second interface 103 (namely, the video playback interface), wherein the second interface 103 includes the image D, and the speaker of the smart TV 100 plays the first sound data.
  • the size of the image D displayed on the smart TV 100 is larger than the size of the image D displayed on the mobile phone 200 . In this way, the user can watch the image D on the smart TV 100 more clearly.
  • the mobile phone 200 may also display a fourth interface 210 after sending the first video data to the smart TV 100 .
  • Second prompt information is displayed on the fourth interface 210 .
  • the second prompt information is used to indicate that the mobile phone 200 is in a second preset mode (such as video mode).
  • the second prompt information may be, but is not limited to, the text information "in video call".
  • the smart TV 100 also finds the identifier of the microphone and the identifier of the camera according to the interface identifier of the first interface 209 and the preset mapping relationship.
  • the smart TV 100 displays the second interface 103
  • the camera of the smart TV 100 collects the image C
  • the microphone of the smart TV 100 collects the second sound data.
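  • The preset mapping relationship referred to above can be pictured as a lookup table from an interface identifier (the control information) to the identifiers of the functional units to enable. The Kotlin sketch below uses invented string identifiers; the embodiments do not specify the identifier format or unit granularity.

```kotlin
// Lookup table standing in for the preset mapping relationship; the string
// identifiers are invented for illustration.
val presetMapping: Map<String, List<String>> = mapOf(
    "text_input_interface" to listOf("keyboard_interface"),
    "game_interface" to listOf("handle_interface", "attitude_sensor"),
    "video_playback_interface" to listOf(
        "video_playback_interface", "speaker", "camera", "microphone",
    ),
)

// On receiving control information, the terminal enables every functional
// unit mapped to the interface identifier.
fun enableTargetFunctionUnits(interfaceId: String) {
    val units = presetMapping[interfaceId]
        ?: return println("no mapping for $interfaceId, ignoring")
    units.forEach { println("enabling target function unit: $it") }
}

fun main() = enableTargetFunctionUnits("video_playback_interface")
```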
  • the smart TV 100 also sends second video data (ie target data) to the mobile phone 200 , the second video data includes image C and second sound data.
  • the mobile phone 200 sends the second video data to the third terminal, so that the third terminal can play the second video data. In this way, the mobile phone 200 implements video interaction with the third terminal through the smart TV 100 . Since the second video data is collected by the smart TV 100, the user does not need to hold the mobile phone 200 to collect the second video data, which frees the user's hands.
  • the user only needs to establish a communication connection between the smart TV 100 and the mobile phone 200 .
  • the mobile phone 200 sends the interface identifier of the video playback interface to the smart TV 100.
  • after the smart TV 100 recognizes the interface identifier, it displays the video playback interface corresponding to the identifier of the video playback interface, and the speaker of the smart TV 100 plays the first sound data from the mobile phone 200.
  • the smart TV 100 collects the second video data.
  • the smart TV 100 sends the second video data to the mobile phone 200 .
  • the mobile phone 200 sends the second video data to the third terminal, so that the third terminal can play the second video data.
  • the user needs at most two operations to realize the function of the mobile phone 200 controlling the smart TV 100 to perform large-screen video interaction, which also frees the user's hands. Furthermore, the user's operation steps are simplified, and the operation efficiency is improved.
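  • In this flow the mobile phone 200 essentially acts as a relay for the second video data. The sketch below forwards bytes arriving from the TV to the third terminal unchanged; plain chunked stream copying is a stand-in for whatever media transport and codec the real call path uses.

```kotlin
import java.io.InputStream
import java.io.OutputStream

// Forwards the second video data from the smart TV to the third terminal.
// Chunked byte copying is a stand-in for the real media transport.
fun relaySecondVideoData(fromTv: InputStream, toThirdTerminal: OutputStream) {
    val buffer = ByteArray(16 * 1024)
    while (true) {
        val read = fromTv.read(buffer)
        if (read < 0) break                    // the TV side closed the stream
        toThirdTerminal.write(buffer, 0, read) // forward to the call peer
    }
    toThirdTerminal.flush()
}
```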
  • in order to further prevent the mobile phone 200 and the smart TV 100 from establishing a communication connection by mistake, the mobile phone 200 needs to be in a third preset mode before it can establish a communication connection with the smart TV 100.
  • the user can trigger the mobile phone 200 to be in the third preset mode.
  • the third preset mode can be understood as a sharing mode, a collaborative mode, and the like.
  • the mobile phone 200 can display a function list interface 212 (that is, the fifth interface) in response to the user's trigger operation on the "settings" icon 211 in the system main interface 201.
  • the function list interface 212 includes a first control 213 for controlling entering/exiting the third preset mode.
  • the mobile phone 200 may be in the third preset mode in response to the trigger operation on the first control 213 .
  • the mobile phone 200 can establish a communication connection with the smart TV 100 .
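  • Conceptually, the third preset mode acts as a gate in front of connection establishment. Below is a minimal sketch assuming a boolean flag toggled by the first control 213; the class and method names are illustrative.

```kotlin
import java.util.concurrent.atomic.AtomicBoolean

// Connection requests are accepted only while the user has switched the
// third preset mode (sharing/collaborative mode) on via the first control.
class SharingModeGate {
    private val sharingMode = AtomicBoolean(false)

    fun onFirstControlToggled(enabled: Boolean) = sharingMode.set(enabled)

    fun onConnectionRequest(peer: String): Boolean =
        if (sharingMode.get()) {
            println("accepting connection from $peer")
            true
        } else {
            println("rejecting $peer: not in the third preset mode")
            false
        }
}

fun main() {
    val gate = SharingModeGate()
    gate.onConnectionRequest("smart TV 100") // rejected: mode still off
    gate.onFirstControlToggled(true)         // user taps the first control
    gate.onConnectionRequest("smart TV 100") // accepted
}
```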
  • FIG. 16 is a schematic flowchart of the device control method provided by the embodiment of the present application. As shown in FIG. 16, the device control method provided in the embodiment of the present application may include the following steps.
  • the first terminal displays a first interface.
  • the first interface may be a text input interface, a game interface, or a video playback interface, etc. in the above embodiments, which is not limited herein.
  • the first terminal may be the smart TV 100 in the above-mentioned embodiment
  • the second terminal may be the above-mentioned mobile phone 200 . That is to say, the smart TV 100 can act as a function demander, and realize functions such as text input and game character behavior control with the help of the mobile phone 200 as a function provider.
  • the first terminal may also be the aforementioned mobile phone 200
  • the second terminal may also be the aforementioned smart TV 100 . That is, the mobile phone 200 may serve as a function demander, and realize functions such as video interaction with the help of the smart TV 100 as a function provider.
  • the first terminal sends control information to the second terminal.
  • the control information is used to instruct the second terminal to enable the target function unit; when the target function unit is triggered, the function associated with the trigger can be realized.
  • the target function unit may be a second interface including a virtual text key; when the virtual text key is triggered, the function of inputting text data (that is, target data) to the text input interface of the first terminal may be implemented.
  • when the virtual text key included in the second interface is triggered, the function of inputting text data to the text input interface of the first terminal can be realized; reference may be made to the foregoing introduction of the figures, which will not be repeated here.
  • the target function unit can also be the second interface including the virtual control keyboard.
  • the target function unit can also be an attitude sensor; when the attitude sensor is triggered, the game character on the first interface of the first terminal can be controlled to perform the game behavior associated with the trigger.
  • that is, when the virtual control keyboard or the attitude sensor is triggered, the game character on the first interface of the first terminal can be controlled to perform the game behavior associated with the trigger.
  • the target functional unit may be a video acquisition module (such as a camera, a microphone).
  • the video acquisition module When the video acquisition module is turned on, it can control the first terminal and the third terminal to realize the video interaction function.
  • the first terminal and the third terminal may be controlled to realize video interaction function, and reference may be made to the above introduction to FIG. 14 , which will not be repeated here.
  • S1604 The second terminal responds to the trigger operation on the target function unit, so that the first terminal executes a function associated with the trigger operation when the first interface is displayed.
  • when the target function unit is the virtual text key in the second interface, the second terminal may obtain text data in response to the user's trigger operation on the virtual text key. Understandably, the text data is the target data.
  • when the target function unit is the virtual control keyboard in the second interface, the second terminal may, in response to the user's trigger operation, control the game character on the first interface of the first terminal to perform the game behavior associated with the trigger. Understandably, controlling the game character on the first interface of the first terminal to perform the game behavior associated with the trigger is the function associated with the trigger operation.
  • when the target function unit is the attitude sensor, the second terminal may respond to the user's trigger operation, so that the attitude data of the second terminal collected by the attitude sensor is used to control the game character on the first interface of the first terminal to perform the game behavior associated with the trigger.
  • likewise, controlling the game character on the first interface of the first terminal to perform the game behavior associated with the trigger is the function associated with the trigger operation.
  • the first interface is a video playback interface
  • the target functional unit includes a video acquisition module (such as a microphone and a camera)
  • the first terminal and the third terminal may be controlled to implement a video interaction function.
  • the first terminal and the third terminal are controlled to implement a video interaction function, that is, a function associated with a trigger operation.
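  • The exchange in the flow of FIG. 16 can be summarized as two message families: control information from the first terminal to the second terminal, and target data flowing back once the target function unit is triggered. The Kotlin sketch below models them as sealed types; the type names and fields are assumptions, since the embodiments do not fix a message schema.

```kotlin
// Control information flows from the first terminal to the second terminal;
// target data flows back once the target function unit is triggered.
sealed interface Message

data class ControlInfo(val interfaceId: String) : Message // control information

sealed interface TargetData : Message                      // target data (S1604)
data class TextData(val text: String) : TargetData         // virtual text key
data class ControlData(val opcode: Int) : TargetData       // virtual control keyboard
data class GestureData(val x: Float, val y: Float, val z: Float) : TargetData
data class VideoData(val frame: ByteArray, val sound: ByteArray) : TargetData

// How the first terminal could dispatch on the received target data.
fun firstTerminalHandle(data: TargetData) = when (data) {
    is TextData -> println("input '${data.text}' into the area where text is to be input")
    is ControlData -> println("game character performs behavior ${data.opcode}")
    is GestureData -> println("map attitude (${data.x}, ${data.y}, ${data.z}) to a game behavior")
    is VideoData -> println("forward second video data to the third terminal")
}
```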
  • FIG. 17 shows the functional architecture, provided by the embodiment of the present application, in which, when the first terminal is the smart TV 100 and the second terminal is the mobile phone 200, the smart TV 100 implements the function associated with the target data with the help of the mobile phone 200; or, when the first terminal is the mobile phone 200 and the second terminal is the smart TV 100, the mobile phone 200 implements the function associated with the target data with the help of the smart TV 100.
  • the smart TV 100 and the mobile phone 200 can be connected through a short-distance communication module.
  • the short-distance communication module may be a Bluetooth module, a WiFi module or an NFC module, etc., which are not limited herein.
  • the cooperative control application of the smart TV 100 invokes and opens the video playback interface (comprising image C and image D) of the video application, and calls the speaker to play the video from the mobile phone 200.
  • the cooperative control application of the smart TV 100 decides to call the video application, speaker, camera, and microphone based on the interface identifier of the video playback interface.
  • the cooperative control application of the mobile phone 200 may also generate instructions for invoking the video application, speaker, camera, and microphone of the video playback interface.
  • the smart TV 100 invokes the video application, speaker, camera and microphone after receiving the instruction from the mobile phone 200.
  • that is to say, the mobile phone 200 can also decide to call the video application, speaker, camera, and microphone.
  • the mobile phone 200 receives the image C and the second sound data from the smart TV 100, and sends the image C and the second sound data to the third terminal.
  • the mobile phone 200 can use the smart TV 100 to realize a large-screen video interactive function. It can be understood that the interface identification of the above-mentioned video playback interface and the instructions for invoking the video application, speaker, camera, and microphone are all control information.
  • the smart TV 100 can decide to call the video application, speaker, camera, and microphone according to the functions currently running on the smart TV 100 and the functions currently running on the mobile phone 200; or, the mobile phone 200 can decide to call the video application, speaker, camera, and microphone according to the functions currently running on the mobile phone 200 and the functions currently running on the smart TV 100.
  • the cooperative control application of the mobile phone 200 decides to invoke the text input application to open the keyboard interface.
  • the mobile phone 200 sends text data associated with the trigger operation to the smart TV 100 .
  • the smart TV 100 displays the received text data.
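  • On the receiving side, displaying the received text data amounts to appending it to the area where text is to be input. A toy Kotlin sketch is shown below, with a StringBuilder standing in for the focused input field.

```kotlin
// A StringBuilder stands in for the focused input area on the smart TV.
class TextInputArea {
    private val content = StringBuilder()

    fun onTextDataReceived(textData: String) {
        content.append(textData) // display the received text data
        println("text input area now shows: $content")
    }
}

fun main() {
    val area = TextInputArea()
    area.onTextDataReceived("user@example.com") // e.g. an account typed on the phone
}
```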
  • the cooperative control application of the smart TV 100 can also decide, based on the same principle as above, to generate an instruction to call the text input application, and control the mobile phone 200 to call the text input application to open the keyboard interface, which will not be repeated here.
  • the cooperative control application of the mobile phone 200 decides to invoke the handle application to open the handle interface.
  • the mobile phone 200 sends control data on the game character associated with the trigger operation to the smart TV 100 .
  • the smart TV 100 controls the game character in the game interface to execute the game behavior associated with the control data.
  • the cooperative control application of the mobile phone 200 decides to call the attitude sensor to collect the attitude data (that is, control data) of the mobile phone 200.
  • the mobile phone 200 sends attitude data to the smart TV 100 .
  • the smart TV 100 controls the game character in the game interface to perform the game behavior associated with the attitude data.
  • the cooperative control application of the smart TV 100 can also decide, based on the same principle as above, to generate an instruction to call the handle application or the attitude sensor, and control the mobile phone 200 to call the handle application or the attitude sensor to obtain control data, which will not be repeated here.
  • the carrier of the cooperative control application mentioned above may be an operable icon located on the minus-one screen (the leftmost home screen), an activity of an application program, or an interface of a background service, etc., which is not limited herein.
  • the trigger operation mentioned may include: click operation, long press operation, and operation of changing the posture of the terminal device, etc., which are not limited here.
  • the above embodiments take the case where the first terminal is in the first state and the first terminal displays the first interface as an example.
  • when the first terminal is in the first state, it may also be in a state of playing music; further, the first terminal may play music through the speaker of the second terminal, which is not limited here.
  • the above embodiments take the case where the first terminal performs the first function as an example. When the first terminal plays music through the speaker of the second terminal, it is the second terminal that performs the first function; that is, the second terminal performs the function of playing music.
  • the embodiment of the present application further provides a device control apparatus 1800, which is applied to a first terminal.
  • the device control apparatus 1800 provided in the embodiment of the present application includes: a first processing unit 1802 configured to control the first terminal to be in the first state, where the first terminal can execute the first function when in the first state.
  • the first communication unit 1801 is configured to send control information to the second terminal based on the established communication connection. Wherein, the control information is used to instruct the second terminal to activate the target function unit, so as to cooperate with the first terminal to execute the first function.
  • the first communication unit 1801 is further configured to receive target data from the second terminal.
  • the first processing unit 1802 is further configured to execute a first function associated with the target data.
  • the target functional unit is a virtual text button in the second interface
  • the target data is text data.
  • the first processing unit 1802 is specifically configured to control the first terminal to display a first interface, where the first interface includes an area where text is to be input.
  • the first processing unit 1802 is further specifically configured to control the first terminal to input text data in an area where text is to be input on the first interface when the first interface is displayed.
  • the target data is control data.
  • the first processing unit 1802 is specifically configured to control the first terminal to display a first interface, where the first interface includes game characters.
  • the first processing unit 1802 is also specifically configured to control the first terminal to control the game character in the first interface to perform the game behavior corresponding to the control data when the first interface is displayed.
  • the target functional unit is a virtual control keyboard in the second interface, where the virtual control keyboard includes virtual keys for controlling the game behavior of the game character.
  • the control data is attitude data
  • the target functional unit is an attitude sensor of the second terminal for collecting attitude data.
  • the target functional unit is a video collection module
  • the target data is the second video data collected by the video collection module.
  • the first processing unit 1802 is specifically configured to control the first terminal to display the first interface, wherein the first interface includes the first video data from the third terminal; the first processing unit 1802 is further specifically configured to control the first terminal to send the second video data to the third terminal when the first interface is displayed.
  • the embodiment of the present application further provides a device control apparatus 1900, which is applied to a second terminal.
  • the device control apparatus 1900 provided in the embodiment of the present application includes: a second communication unit 1901, configured to receive control information from a first terminal based on an established communication connection.
  • the second processing unit 1902 is configured to enable the target function unit in response to the control information.
  • the second processing unit 1902 is further configured to cooperate with the first terminal to execute the first function based on the target function unit.
  • the second processing unit 1902 is specifically configured to control the second communication unit 1901 to send target data to the first terminal in response to a trigger operation on the target function unit, so that the first terminal performs the first function associated with the target data.
  • the target functional unit is a virtual text button in the second interface
  • the target data is text data
  • the text data is used to indicate that, when the first terminal displays the first interface, the text data is to be input into the area where text is to be input in the first interface.
  • the target data is control data
  • the target data is used to instruct the first terminal to control the game character in the first interface to perform the game behavior corresponding to the control data when the first interface is displayed.
  • the target function unit is a virtual control keyboard in the second interface, wherein the virtual control keyboard includes virtual keys for controlling the game behavior of the game character; or, the control data is attitude data, and the target function unit is an attitude sensor of the second terminal for collecting the attitude data.
  • the target functional unit is a video collection module
  • the target data is second video data collected by the video collection module
  • the second video data is used to instruct the first terminal to send the second video data to the third terminal when displaying the first interface including the first video data.
  • control information is information indicating that the first terminal is in the first state
  • the second processing unit 1902 is specifically configured to enable the target functional unit according to the information indicating that the first terminal is in the first state
  • control information is an instruction for enabling the target functional unit
  • the second processing unit 1902 is specifically configured to enable the target functional unit according to the instruction for enabling the target functional unit.
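  • The unit structure of apparatus 1800 and apparatus 1900 can be read as the pairing of a communication unit with a processing unit. The interface-level Kotlin sketch below mirrors that split; the method names merely paraphrase the configured behaviors and are not taken from the source.

```kotlin
// Pairing of a communication unit with a processing unit, mirroring units
// 1801/1802 and 1901/1902; the method names are illustrative paraphrases.
interface CommunicationUnit {
    fun send(payload: ByteArray)
    fun onReceive(handler: (ByteArray) -> Unit)
}

interface ProcessingUnit {
    fun enterFirstState()                       // first-terminal side (1802)
    fun executeFirstFunction(target: ByteArray) // e.g. input text, move character
}

class DeviceControlApparatus(
    private val comm: CommunicationUnit,
    private val proc: ProcessingUnit,
) {
    fun start() {
        proc.enterFirstState()
        comm.onReceive { target -> proc.executeFirstFunction(target) }
    }
}
```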
  • FIG. 20 is a schematic diagram of a hardware structure of a first terminal or a second terminal provided in an embodiment of the present application.
  • the first terminal or the second terminal includes a processor 2001, a communication line 2004 and at least one communication interface (the communication interface 2003 is used as an example in FIG. 20 for illustration).
  • the processor 2001 can be a general-purpose central processing unit (CPU), a microprocessor, an application-specific integrated circuit (ASIC), or one or more integrated circuits for controlling the execution of the programs of the present application.
  • Communication lines 2004 may include circuitry that communicates information between the components described above.
  • the communication interface 2003 uses any device such as a transceiver for communicating with other devices or communication networks, such as Ethernet, wireless local area networks (wireless local area networks, WLAN) and so on.
  • the first terminal or the second terminal may further include a memory 2002 .
  • the memory 2002 may be a read-only memory (ROM) or other types of static storage devices that can store static information and instructions, or a random access memory (RAM) or other types of dynamic storage devices that can store information and instructions; it may also be an electrically erasable programmable read-only memory (EEPROM), a compact disc read-only memory (CD-ROM) or other optical disc storage (including compact discs, laser discs, optical discs, digital versatile discs, Blu-ray discs, etc.), magnetic disk storage media or other magnetic storage devices, or any other medium that can be used to carry or store desired program code in the form of instructions or data structures and that can be accessed by a computer, but is not limited thereto.
  • the memory may exist independently and be connected to the processor through the communication line 2004. The memory may also be integrated with the processor.
  • the memory 2002 is used to store computer-executed instructions for implementing the solution of the present application, and the execution is controlled by the processor 2001 .
  • the processor 2001 is configured to execute computer-executed instructions stored in the memory 2002, so as to implement the device control method performed by the first terminal or the second terminal provided in the embodiment of the present application.
  • the computer-executed instructions in the embodiment of the present application may also be referred to as application program code, which is not specifically limited in the embodiment of the present application.
  • the processor 2001 may include one or more CPUs, for example, CPU0 and CPU1 in FIG. 20 .
  • the first terminal or the second terminal may include multiple processors, for example, processor 2001 and processor 2005 in FIG. 20 .
  • processors may be a single-core (single-CPU) processor or a multi-core (multi-CPU) processor.
  • a processor herein may refer to one or more devices, circuits, and/or processing cores for processing data (eg, computer program instructions).
  • FIG. 21 is a schematic structural diagram of a chip provided by an embodiment of the present application.
  • the chip 210 includes one or more processors 2110 and a communication interface 2130.
  • the memory 2140 stores the following elements: executable modules or data structures, or subsets thereof, or extensions thereof.
  • the memory 2140 may include a read-only memory and a random access memory, and provides instructions and data to the processor 2110 .
  • a part of the memory 2140 may also include a non-volatile random access memory (non-volatile random access memory, NVRAM).
  • the processor 2110, the communication interface 2130 and the memory 2140 are coupled together through the bus system 2120.
  • the bus system 2120 may include not only a data bus, but also a power bus, a control bus, and a status signal bus.
  • various buses are labeled as bus system 2120 in FIG. 21 .
  • the methods described in the foregoing embodiments of the present application may be applied to the processor 2110 or implemented by the processor 2110 .
  • the processor 2110 may be an integrated circuit chip with signal processing capability.
  • each step of the above method may be implemented by an integrated logic circuit of hardware in the processor 2110 or instructions in the form of software.
  • the above-mentioned processor 2110 may be a general-purpose processor (for example, a microprocessor or a conventional processor), a digital signal processor (DSP), an application-specific integrated circuit (ASIC), a field-programmable gate array (FPGA) or other programmable logic device, a discrete gate, a transistor logic device or a discrete hardware component. The processor 2110 can implement or execute the methods, steps and logical block diagrams disclosed in the embodiments of the present application.
  • the steps of the method disclosed in connection with the embodiments of the present application may be directly implemented by a hardware decoding processor, or implemented by a combination of hardware and software modules in the decoding processor.
  • the software module may be located in a mature storage medium in the field such as random access memory, read-only memory, programmable read-only memory, or electrically erasable programmable read only memory (EEPROM).
  • the storage medium is located in the memory 2140, and the processor 2110 reads the information in the memory 2140, and completes the steps of the above method in combination with its hardware.
  • the instructions stored in the memory for execution by the processor may be implemented in the form of computer program products.
  • the computer program product may be written in the memory in advance, or may be downloaded and installed in the memory in the form of software.
  • a computer program product includes one or more computer instructions. When the computer program instructions are loaded and executed on the computer, the processes or functions according to the embodiments of the present application will be generated in whole or in part.
  • the computer can be a general purpose computer, special purpose computer, computer network, or other programmable device.
  • the computer instructions may be stored in a computer-readable storage medium, or transmitted from one computer-readable storage medium to another computer-readable storage medium; for example, the computer instructions may be transmitted from one website, computer, server or data center to another website, computer, server or data center by wire (for example, coaxial cable, optical fiber, digital subscriber line (DSL)) or wirelessly (for example, infrared, radio, microwave, etc.).
  • the computer-readable storage medium can be any available medium that a computer can access, or a data storage device such as a server or a data center integrating one or more available media.
  • the available media may include magnetic media (for example, floppy disks, hard disks, or tapes), optical media (for example, digital versatile discs (DVD)), or semiconductor media (for example, solid state disks (SSD)), etc.
  • the embodiment of the present application also provides a computer-readable storage medium.
  • the methods performed by the first terminal or the second terminal described in the foregoing embodiments may be implemented in whole or in part by software, hardware, firmware or any combination thereof.
  • Computer-readable media may include computer storage media and communication media, and may include any medium that can transfer a computer program from one place to another.
  • a storage medium may be any target medium that can be accessed by a computer.
  • the computer-readable medium may include RAM, ROM, EEPROM, compact disc read-only memory (CD-ROM) or other optical disc storage; the computer-readable medium may include magnetic disk memory or other disk storage devices.
  • also, any connection is properly termed a computer-readable medium.
  • disk and disc, as used herein, include compact disc (CD), laser disc, optical disc, digital versatile disc (DVD), floppy disk and Blu-ray disc, where disks usually reproduce data magnetically, while discs reproduce data optically with lasers.

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Software Systems (AREA)
  • General Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Telephone Function (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The present application relates to the technical field of terminals. A device control method and apparatus are provided. With the device control method provided in the present application, a user only needs to establish a communication connection between a first terminal and a second terminal. The method includes the following steps: the first terminal sends control information to the second terminal; the second terminal enables a target function unit in response to the control information; then the second terminal executes a first function in cooperation with the first terminal based on the target function unit. In this way, the user's operation steps are simplified, that is, the second terminal can execute the first function in cooperation with the first terminal based on the target function unit, thereby improving operation efficiency.
PCT/CN2022/118543 2021-09-16 2022-09-13 Procédé et appareil de commande de dispositif WO2023040848A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN202111089487.1 2021-09-16
CN202111089487.1A CN115814403A (zh) 2021-09-16 2021-09-16 设备控制方法和装置

Publications (2)

Publication Number Publication Date
WO2023040848A1 WO2023040848A1 (fr) 2023-03-23
WO2023040848A9 true WO2023040848A9 (fr) 2023-05-25

Family

ID=85515170

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2022/118543 WO2023040848A1 (fr) 2021-09-16 2022-09-13 Procédé et appareil de commande de dispositif

Country Status (2)

Country Link
CN (1) CN115814403A (fr)
WO (1) WO2023040848A1 (fr)

Family Cites Families (14)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7634263B2 (en) * 2006-01-30 2009-12-15 Apple Inc. Remote control of electronic devices
CN103634419B (zh) * 2013-11-15 2016-08-31 北京洋浦伟业科技发展有限公司 终端的远程控制方法及终端
CN103634645A (zh) * 2013-12-10 2014-03-12 青岛海尔软件有限公司 利用手机控制智能电视的方法
CN104104992A (zh) * 2014-07-08 2014-10-15 深圳市同洲电子股份有限公司 一种多屏互动方法、装置及系统
CN105635625B (zh) * 2014-10-31 2019-12-20 腾讯科技(深圳)有限公司 视频通话方法和装置
CN104486684A (zh) * 2014-12-18 2015-04-01 百度在线网络技术(北京)有限公司 电子设备的输入方法和装置
CN104679239B (zh) * 2015-01-23 2018-05-25 深圳市金立通信设备有限公司 一种终端输入方法
CN106371935A (zh) * 2016-08-26 2017-02-01 上海浮罗创业投资有限公司 游戏操控界面的获取方法及装置
CN107750003A (zh) * 2017-10-11 2018-03-02 广州指观网络科技有限公司 一种远程触碰输入系统及方法
CN113630574B (zh) * 2018-05-14 2023-09-29 聚好看科技股份有限公司 视频通话方法及终端设备
CN110138937B (zh) * 2019-05-07 2021-06-15 华为技术有限公司 一种通话方法、设备及系统
CN110337020A (zh) * 2019-06-26 2019-10-15 华为技术有限公司 一种显示设备的控制方法及相关装置
CN114489876A (zh) * 2020-11-09 2022-05-13 华为技术有限公司 一种文本输入的方法、电子设备和系统
CN113032766B (zh) * 2021-05-26 2021-09-24 荣耀终端有限公司 应用权限管理的方法和装置

Also Published As

Publication number Publication date
WO2023040848A1 (fr) 2023-03-23
CN115814403A (zh) 2023-03-21

Similar Documents

Publication Publication Date Title
WO2021013158A1 (fr) Procédé d'affichage et appareil associé
US20220163932A1 (en) Device control page display method, related apparatus, and system
EP4075249A1 (fr) Procédé d'interaction pour traitement de tâches inter-appareils, et dispositif électronique et support de stockage
US11864248B2 (en) Application function implementation method and electronic device
EP4109891A1 (fr) Procédé d'interaction avec le dispositif et dispositif électronique
WO2021147406A1 (fr) Procédé de sortie audio et dispositif terminal
CN112527174B (zh) 一种信息处理方法及电子设备
WO2023010940A1 (fr) Procédé et appareil d'affichage d'écran partagé
CN110493626B (zh) 视频数据处理方法及装置
CN112130788A (zh) 一种内容分享方法及其装置
CN113986092A (zh) 消息显示方法和装置
US20230362782A1 (en) Data Sharing Method, Electronic Device, and System
WO2022048453A1 (fr) Procédé de déverrouillage et dispositif électronique
CN113703849B (zh) 投屏应用打开方法和装置
CN114356195B (zh) 一种文件传输的方法及相关设备
CN115242994B (zh) 视频通话系统、方法和装置
WO2023040848A9 (fr) Procédé et appareil de commande de dispositif
EP4202666A1 (fr) Procédé d'accès à une application et appareil associé
WO2022143310A1 (fr) Procédé de projection sur écran à double canal et dispositif électronique
WO2021104000A1 (fr) Procédé d'affichage d'écran et dispositif électronique
WO2024109443A1 (fr) Procédé de connexion de dispositif, dispositif et système
WO2023005348A1 (fr) Appareil et procédé de recommandation d'écran divisé
WO2023020496A1 (fr) Procédé d'affichage, puce et dispositif électronique
WO2024078306A1 (fr) Procédé d'affichage de message de notification en bannière et dispositif électronique
US20240201932A1 (en) Display method, electronic device, and system

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 22869217

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE