WO2022089102A1 - Control method and apparatus, and electronic device - Google Patents

Control method and apparatus, and electronic device (一种控制方法、装置及电子设备)

Info

Publication number
WO2022089102A1
WO2022089102A1 (PCT/CN2021/119707)
Authority
WO
WIPO (PCT)
Prior art keywords
application
electronic device
control
command information
information
Prior art date
Application number
PCT/CN2021/119707
Other languages
English (en)
French (fr)
Inventor
许顺
黄敏
陆建海
周雨沛
宋亚鲁
Original Assignee
华为技术有限公司
Priority date
Filing date
Publication date
Application filed by 华为技术有限公司 (Huawei Technologies Co., Ltd.)
Priority to EP21884832.3A (published as EP4220627A4)
Priority to JP2023523533A (published as JP2023547821A)
Publication of WO2022089102A1
Priority to US18/308,244 (published as US20230259250A1)

Classifications

    • G PHYSICS
    • G10 MUSICAL INSTRUMENTS; ACOUSTICS
    • G10L SPEECH ANALYSIS TECHNIQUES OR SPEECH SYNTHESIS; SPEECH RECOGNITION; SPEECH OR VOICE PROCESSING TECHNIQUES; SPEECH OR AUDIO CODING OR DECODING
    • G10L15/00 Speech recognition
    • G10L15/22 Procedures used during a speech recognition process, e.g. man-machine dialogue
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00 Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/20 Input arrangements for video game devices
    • A63F13/23 Input arrangements for video game devices for interfacing with the game device, e.g. specific interfaces between game controller and console
    • A63F13/235 Input arrangements for video game devices for interfacing with the game device, e.g. specific interfaces between game controller and console using a wireless connection, e.g. infrared or piconet
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00 Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/80 Special adaptations for executing a specific game genre or game mode
    • A63F13/812 Ball games, e.g. soccer or baseball
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011 Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0481 Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F3/04817 Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance using icons
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0484 Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0484 Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F3/04842 Selection of displayed objects or displayed text elements
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/16 Sound input; Sound output
    • G06F3/165 Management of the audio stream, e.g. setting of volume, audio stream path
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/16 Sound input; Sound output
    • G06F3/167 Audio in a user interface, e.g. using voice commands for navigating, audio feedback
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F9/00 Arrangements for program control, e.g. control units
    • G06F9/06 Arrangements for program control, e.g. control units using stored programs, i.e. using an internal store of processing equipment to receive or retain programs
    • G06F9/44 Arrangements for executing specific programs
    • G06F9/451 Execution arrangements for user interfaces
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F9/00 Arrangements for program control, e.g. control units
    • G06F9/06 Arrangements for program control, e.g. control units using stored programs, i.e. using an internal store of processing equipment to receive or retain programs
    • G06F9/46 Multiprogramming arrangements
    • G06F9/48 Program initiating; Program switching, e.g. by interrupt
    • G06F9/4806 Task transfer initiation or dispatching
    • G06F9/4843 Task transfer initiation or dispatching by program, e.g. task dispatcher, supervisor, operating system
    • G06F9/485 Task life-cycle, e.g. stopping, restarting, resuming execution
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F9/00 Arrangements for program control, e.g. control units
    • G06F9/06 Arrangements for program control, e.g. control units using stored programs, i.e. using an internal store of processing equipment to receive or retain programs
    • G06F9/46 Multiprogramming arrangements
    • G06F9/54 Interprogram communication
    • G06F9/547 Remote procedure calls [RPC]; Web services
    • G PHYSICS
    • G10 MUSICAL INSTRUMENTS; ACOUSTICS
    • G10L SPEECH ANALYSIS TECHNIQUES OR SPEECH SYNTHESIS; SPEECH RECOGNITION; SPEECH OR VOICE PROCESSING TECHNIQUES; SPEECH OR AUDIO CODING OR DECODING
    • G10L15/00 Speech recognition
    • G10L15/08 Speech classification or search
    • G10L15/18 Speech classification or search using natural language modelling
    • G10L15/1822 Parsing for meaning understanding
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04M TELEPHONIC COMMUNICATION
    • H04M1/00 Substation equipment, e.g. for use by subscribers
    • H04M1/72 Mobile telephones; Cordless telephones, i.e. devices for establishing wireless links to base stations without route selection
    • H04M1/724 User interfaces specially adapted for cordless or mobile telephones
    • H04M1/72403 User interfaces specially adapted for cordless or mobile telephones with means for local support of applications that increase the functionality
    • H04M1/72409 User interfaces specially adapted for cordless or mobile telephones with means for local support of applications that increase the functionality by interfacing with external accessories
    • H04M1/72412 User interfaces specially adapted for cordless or mobile telephones with means for local support of applications that increase the functionality by interfacing with external accessories using two-way short-range wireless interfaces
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00 Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40 Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/41 Structure of client; Structure of client peripherals
    • H04N21/4104 Peripherals receiving signals from specially adapted client devices
    • H04N21/4126 The peripheral being portable, e.g. PDAs or mobile phones
    • H04N21/41265 The peripheral being portable, e.g. PDAs or mobile phones having a remote control device for bidirectional communication between the remote control device and client device
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00 Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40 Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/41 Structure of client; Structure of client peripherals
    • H04N21/422 Input-only peripherals, i.e. input devices connected to specially adapted client devices, e.g. global positioning system [GPS]
    • H04N21/42203 Input-only peripherals, i.e. input devices connected to specially adapted client devices, e.g. global positioning system [GPS] sound input device, e.g. microphone
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00 Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40 Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/41 Structure of client; Structure of client peripherals
    • H04N21/422 Input-only peripherals, i.e. input devices connected to specially adapted client devices, e.g. global positioning system [GPS]
    • H04N21/42204 User interfaces specially adapted for controlling a client device through a remote control device; Remote control devices therefor
    • H04N21/42206 User interfaces specially adapted for controlling a client device through a remote control device; Remote control devices therefor characterized by hardware details
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00 Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40 Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/41 Structure of client; Structure of client peripherals
    • H04N21/422 Input-only peripherals, i.e. input devices connected to specially adapted client devices, e.g. global positioning system [GPS]
    • H04N21/42204 User interfaces specially adapted for controlling a client device through a remote control device; Remote control devices therefor
    • H04N21/42206 User interfaces specially adapted for controlling a client device through a remote control device; Remote control devices therefor characterized by hardware details
    • H04N21/42225 User interfaces specially adapted for controlling a client device through a remote control device; Remote control devices therefor characterized by hardware details characterized by types of remote control, e.g. universal remote control
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00 Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40 Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/43 Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
    • H04N21/442 Monitoring of processes or resources, e.g. detecting the failure of a recording device, monitoring the downstream bandwidth, the number of times a movie has been viewed, the storage space available from the internal hard disk
    • H04N21/44227 Monitoring of local network, e.g. connection or bandwidth variations; Detecting new devices in the local network
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00 Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40 Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/45 Management operations performed by the client for facilitating the reception of or the interaction with the content or administrating data related to the end-user or to the client device itself, e.g. learning user preferences for recommending movies, resolving scheduling conflicts
    • H04N21/4508 Management of client data or end-user data
    • H04N21/4518 Management of client data or end-user data involving characteristics of one or more peripherals, e.g. peripheral type, software version, amount of memory available or display capabilities
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00 Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/20 Input arrangements for video game devices
    • A63F13/21 Input arrangements for video game devices characterised by their sensors, purposes or types
    • A63F13/211 Input arrangements for video game devices characterised by their sensors, purposes or types using inertial sensors, e.g. accelerometers or gyroscopes
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F2300/00 Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
    • A63F2300/80 Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game specially adapted for executing a specific type of game
    • A63F2300/8011 Ball
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03 Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/033 Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
    • G06F3/0346 Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor with detection of the device orientation or free movement in a 3D space, e.g. 3D mice, 6-DOF [six degrees of freedom] pointers using gyroscopes, accelerometers or tilt-sensors
    • G PHYSICS
    • G10 MUSICAL INSTRUMENTS; ACOUSTICS
    • G10L SPEECH ANALYSIS TECHNIQUES OR SPEECH SYNTHESIS; SPEECH RECOGNITION; SPEECH OR VOICE PROCESSING TECHNIQUES; SPEECH OR AUDIO CODING OR DECODING
    • G10L15/00 Speech recognition
    • G10L15/22 Procedures used during a speech recognition process, e.g. man-machine dialogue
    • G10L2015/223 Execution procedure of a spoken command
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04M TELEPHONIC COMMUNICATION
    • H04M1/00 Substation equipment, e.g. for use by subscribers
    • H04M1/72 Mobile telephones; Cordless telephones, i.e. devices for establishing wireless links to base stations without route selection
    • H04M1/724 User interfaces specially adapted for cordless or mobile telephones
    • H04M1/72403 User interfaces specially adapted for cordless or mobile telephones with means for local support of applications that increase the functionality
    • H04M1/72409 User interfaces specially adapted for cordless or mobile telephones with means for local support of applications that increase the functionality by interfacing with external accessories
    • H04M1/72415 User interfaces specially adapted for cordless or mobile telephones with means for local support of applications that increase the functionality by interfacing with external accessories for remote control of appliances

Definitions

  • the present application relates to the technical field of electronic equipment, and in particular, to a control method, device, and electronic equipment.
  • the present application provides a control method, an apparatus and an electronic device for realizing a cross-device application control function.
  • an embodiment of the present application provides a control method, and the method can be applied to various application scenarios including multiple electronic devices that can communicate.
  • the steps of the method will be described below by taking the first electronic device as an example, and the method includes:
  • After acquiring the command information of a first application, the first electronic device generates a control application according to the command information of the first application; wherein the first application is located in the second electronic device, the command information of the first application is used to implement the action of the first application, and the control application is used to make the second electronic device implement the action of the first application.
  • the command information of the first application may be an intent (Intent) of the first application.
  • the first electronic device can generate the control application according to the command information of the application located in other electronic devices, so that the user can start the control application to make other electronic devices realize the action of the application.
  • the electronic device can realize the cross-device application control function by generating a control application, thereby realizing multi-device collaboration, thereby improving user experience.
  • the first electronic device may acquire the command information of the first application in the following manner:
  • Manner 1: receive the command information of the first application from the second electronic device.
  • Manner 2: obtain the command information of the first application input by the user.
  • the first electronic device can flexibly obtain command information of applications located in other electronic devices in various ways.
  • the first electronic device may also generate a control icon corresponding to the control application, and display the control icon on the display screen.
  • the first electronic device can generate a control icon, which is convenient for starting the control application by clicking on the control icon.
  • the first electronic device may, but is not limited to, generate the control icon in the following manner:
  • Manner 1: the first electronic device acquires icon information corresponding to the first application, and generates the control icon according to the icon information corresponding to the first application.
  • Manner 2: the first electronic device generates the control icon according to a preset picture or a picture selected by the user.
  • the first electronic device generates a control icon corresponding to the control application.
  • In a possible design, after acquiring the start command of the control application, the first electronic device may also send the command information of the first application to the second electronic device, so that the second electronic device executes the action of the first application according to the received command information of the first application.
  • Through the above technical solution, the first electronic device can implement cross-device application control by sending the command information of the first application to the second electronic device.
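  • As an illustration of this design (not part of the patent text), the following Kotlin sketch shows how a receiving device might rebuild an Android Intent from received command information and execute the corresponding action; the CommandInfo type and its fields are assumptions made for this example.

```kotlin
import android.content.Context
import android.content.Intent
import android.net.Uri

// Hypothetical wire form of an application's "command information";
// the field names are assumptions, roughly mirroring an Android Intent.
data class CommandInfo(
    val action: String,                      // type of action, e.g. Intent.ACTION_VIEW
    val dataUri: String? = null,             // operation data
    val packageName: String? = null,         // optional component hint
    val extras: Map<String, String> = emptyMap()
)

// On the second electronic device: rebuild an Intent from the received
// command information and execute the first application's action.
fun executeReceivedCommand(context: Context, cmd: CommandInfo) {
    val intent = Intent(cmd.action).apply {
        cmd.dataUri?.let { data = Uri.parse(it) }
        cmd.packageName?.let { setPackage(it) }
        cmd.extras.forEach { (key, value) -> putExtra(key, value) }
        addFlags(Intent.FLAG_ACTIVITY_NEW_TASK)  // started from outside an Activity
    }
    context.startActivity(intent)
}
```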
  • the first electronic device may generate a control application through the following steps, including:
  • acquiring command information of a second application, wherein the second application is located in the first electronic device and/or a third electronic device, and the command information of the second application is used to implement the action of the second application;
  • generating the control application according to the command information of the first application and the command information of the second application, wherein the control application is further used to enable the first electronic device and/or the third electronic device to implement the action of the second application.
  • In a possible design, when the second application is located in the first electronic device, the first electronic device may further execute the action of the second application according to the command information of the second application after acquiring the start command of the control application; when the second application is located in the third electronic device, the first electronic device may also send the command information of the second application to the third electronic device after acquiring the start command of the control application, so that the third electronic device executes the action of the second application according to the received command information of the second application.
  • control application can not only make the second electronic device realize the action of the first application, but also enable the first electronic device or the third electronic device to realize the action of the second application. It should be noted that, the embodiments of the present application do not limit the number of electronic devices that need to be controlled by cooperation, nor the number of applications that need to be controlled by cooperation.
  • control application can merge multiple applications located in multiple electronic devices.
  • the first electronic device can implement the application control function of multiple electronic devices by starting the control application.
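  • A minimal data-model sketch of such a control application, reusing the hypothetical CommandInfo type from the previous example; the type and function names are assumptions, not the patent's implementation.

```kotlin
// One cooperating application: where it lives and the command
// information needed to trigger its action.
data class ControlledApp(
    val deviceId: String?,   // null for an application on this (first) device
    val command: CommandInfo
)

// A control application is essentially a named bundle of
// (device, command information) pairs spanning one or more devices.
data class ControlApplication(
    val name: String,
    val members: List<ControlledApp>
)

// Generate the control application from the command information of the first
// application (on the second device) and of one or more second applications.
fun generateControlApplication(
    name: String,
    firstApp: ControlledApp,
    secondApps: List<ControlledApp>
): ControlApplication = ControlApplication(name, listOf(firstApp) + secondApps)
```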
  • the first electronic device may obtain the start command of the control application in the following manner:
  • Manner 1: detect a user's operation on the control icon corresponding to the control application, and generate the start command of the control application in response to the operation.
  • Manner 2: receive a user's voice instruction through a voice assistant application, and obtain the start command of the control application that the voice assistant application obtains by parsing the voice instruction.
  • In a possible design, before the first electronic device acquires the start command of the control application obtained by the voice assistant application parsing the voice instruction, the first application also needs to be added to the application list of the voice assistant application.
  • the use range of the voice assistant application can be expanded, and through the voice assistant application, the user can open applications located in other electronic devices based on voice commands.
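  • The two manners above could be wired up roughly as follows; this sketch reuses the hypothetical ControlApplication model, and executeLocally/sendToDevice stand in for the local-execution and remote-send paths shown in the other examples.

```kotlin
// Manner 1: the user taps the control icon.
fun onControlIconClicked(app: ControlApplication) = startControlApplication(app)

// Manner 2: the voice assistant resolves an utterance such as "open <name>",
// assuming the corresponding entry was added to its application list beforehand.
fun onVoiceInstruction(utterance: String, applicationList: List<ControlApplication>) {
    applicationList
        .firstOrNull { utterance.contains(it.name, ignoreCase = true) }
        ?.let { startControlApplication(it) }
}

fun startControlApplication(app: ControlApplication) {
    app.members.forEach { member ->
        if (member.deviceId == null) executeLocally(member.command)
        else sendToDevice(member.deviceId, member.command)
    }
}

fun executeLocally(cmd: CommandInfo) { /* start the local activity, as sketched earlier */ }
fun sendToDevice(deviceId: String, cmd: CommandInfo) { /* via the remote system service */ }
```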
  • In a possible design, when acquiring the command information of the first application, the first electronic device can also acquire the information of the second electronic device; in this way, the first electronic device can send the command information of the first application to the second electronic device according to the information of the second electronic device.
  • In a possible design, before the first electronic device sends the command information of the first application to the second electronic device, if it determines that a connection has not been established with the second electronic device, the first electronic device sends a power-on signal to the second electronic device, and after the second electronic device is powered on, establishes a connection with the second electronic device.
  • the first electronic device can also automatically turn on other electronic devices and establish connections with other electronic devices, thereby reducing user operations during the collaborative control process and improving user experience.
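  • A hedged sketch of this power-on-then-connect-then-send flow, assuming some transport layer (Wi-Fi, Bluetooth, etc.) exposes connection state, a wake primitive, and a send primitive; the RemoteTransport interface is invented for illustration and reuses the CommandInfo type from earlier.

```kotlin
// Assumed transport abstraction, not a real platform API.
interface RemoteTransport {
    fun isConnected(deviceId: String): Boolean
    suspend fun sendPowerOnSignal(deviceId: String)
    suspend fun awaitPoweredOn(deviceId: String)
    suspend fun connect(deviceId: String)
    suspend fun send(deviceId: String, cmd: CommandInfo)
}

suspend fun sendCommandInfo(transport: RemoteTransport, deviceId: String, cmd: CommandInfo) {
    if (!transport.isConnected(deviceId)) {
        transport.sendPowerOnSignal(deviceId)   // turn the second device on first
        transport.awaitPoweredOn(deviceId)
        transport.connect(deviceId)             // then establish the connection
    }
    transport.send(deviceId, cmd)               // finally deliver the command information
}
```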
  • the first electronic device may also acquire information of a fourth electronic device associated with the second electronic device.
  • Before the first electronic device sends the command information of the first application to the second electronic device, if it determines that a connection has not been established with the fourth electronic device, the first electronic device sends a power-on signal to the fourth electronic device; and after the fourth electronic device is powered on, a connection is established with the fourth electronic device.
  • Through the above technical solution, the first electronic device can also control the power-on of the fourth electronic device associated with the second electronic device, so that when the fourth electronic device and the second electronic device are both turned on, the connection between them is automatically established, thereby ensuring that the second electronic device can cooperate with the fourth electronic device to realize the action of the first application.
  • the first electronic device may acquire the command information of the first application in the following manner:
  • Manner 1: send a first control request to the second electronic device, so that the second electronic device feeds back the command information of the first application according to the first control request, and receive the command information of the first application fed back by the second electronic device.
  • Manner 2: receive a second control request from the second electronic device, where the second control request includes the command information of the first application.
  • the first electronic device can acquire the command information of the first application in various ways.
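  • The two acquisition manners can be pictured as a small message protocol between the devices' remote system services; the message and field names below are assumptions for illustration only.

```kotlin
// Hypothetical messages exchanged between the two devices' remote system services.
sealed interface ControlMessage {
    // Manner 1 (pull): the first device asks, the second device replies.
    data class FirstControlRequest(val requesterDeviceId: String) : ControlMessage
    data class CommandInfoReply(val command: CommandInfo, val iconPng: ByteArray?) : ControlMessage

    // Manner 2 (push): the second device offers an application unprompted.
    data class SecondControlRequest(
        val sourceDeviceId: String,
        val command: CommandInfo,
        val iconPng: ByteArray?
    ) : ControlMessage
}
```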
  • the first electronic device may also send command information of the control application to the fifth electronic device after generating the control application, where the command information of the control application is used to start the control application .
  • the fifth electronic device can generate a new control application at the fifth electronic device according to the command information of the control application.
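  • One possible, purely illustrative way to share a control application with the fifth electronic device is to serialize its members and rebuild them on the receiving side; device-identity remapping (what "local" means on each side) is glossed over in this sketch, which reuses the ControlApplication model above.

```kotlin
// Serialize a control application into a simple line-based payload.
fun exportControlApplication(app: ControlApplication): String = buildString {
    appendLine(app.name)
    app.members.forEach { m ->
        appendLine("${m.deviceId ?: "local"}|${m.command.action}|${m.command.dataUri.orEmpty()}")
    }
}

// Rebuild an equivalent control application from the received payload.
fun importControlApplication(payload: String): ControlApplication {
    val lines = payload.trim().lines()
    val members = lines.drop(1).map { line ->
        val (device, action, data) = line.split("|", limit = 3)
        ControlledApp(
            deviceId = device.takeUnless { it == "local" },
            command = CommandInfo(action = action, dataUri = data.ifEmpty { null })
        )
    }
    return ControlApplication(lines.first(), members)
}
```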
  • an embodiment of the present application further provides a control apparatus, which can be applied to an electronic device, and includes a unit or module for performing the steps of the first aspect above.
  • the present application provides an electronic device, comprising at least one processing element and at least one storage element, wherein the at least one storage element is used to store programs and data, and the at least one processing element is used to execute the method provided in the first aspect of the present application.
  • the embodiments of the present application further provide a computer storage medium, where a software program is stored in the storage medium, and the software program, when read and executed by one or more processors, can implement the method provided by the first aspect or any one of its designs.
  • embodiments of the present application further provide a computer program product including instructions, which, when run on a computer, cause the computer to execute the method provided by the first aspect or any one of the designs.
  • an embodiment of the present application provides a chip system, where the chip system includes a processor for supporting an electronic device to implement the functions involved in the first aspect above.
  • the chip system further includes a memory for storing necessary program instructions and data of the electronic device.
  • the chip system can be composed of chips, and can also include chips and other discrete devices.
  • FIG. 1 is a schematic diagram of an application scenario provided by an embodiment of the present application.
  • FIG. 2 is a structural diagram of an electronic device provided by an embodiment of the present application.
  • FIG. 3 is a software architecture diagram of an electronic device provided by an embodiment of the present application.
  • FIG. 4A is a schematic diagram of a generation process of a control application provided by an embodiment of the present application.
  • FIG. 4B is a schematic diagram of a startup process of a control application provided by an embodiment of the present application.
  • FIG. 6 is a schematic diagram of an example of a control method provided by an embodiment of the present application.
  • FIG. 7 is a schematic diagram of an example of another control method provided by an embodiment of the present application.
  • FIG. 8 is a schematic diagram of an example of another control method provided by an embodiment of the present application.
  • FIG. 9 is a schematic diagram of an example of still another control method provided by an embodiment of the present application.
  • FIG. 10 is a structural diagram of a control device provided by an embodiment of the application.
  • FIG. 11 is a structural diagram of an electronic device according to an embodiment of the application.
  • the present application provides a control method, device, and electronic device, which are used to implement a cross-device application control function, thereby realizing multi-device collaboration.
  • The method, the apparatus, and the electronic device are based on the same technical concept. Since they have similar problem-solving principles, the implementations of the apparatus, the electronic device, and the method can be referred to each other, and the repeated parts will not be described again.
  • the electronic device can acquire command information of applications located in other electronic devices, and generate a control application according to the command information, so that the user can start the control application to enable other electronic devices to realize Actions for the app.
  • the electronic device can realize the cross-device application control function by generating a control application, thereby realizing multi-device collaboration, thereby improving user experience.
  • The electronic device, which is a device with data connection, data computation, and data processing functions.
  • the electronic device may be a mobile phone, a tablet computer, a notebook computer, a netbook, a vehicle-mounted device, a business intelligent terminal (including a video phone, a conference desktop intelligent terminal, etc.), a personal digital assistant (PDA), an augmented reality (AR)/virtual reality (VR) device, etc.
  • An application which is installed in an electronic device and has the function of providing services to users.
  • For example: camera applications that provide shooting services; WeChat and QQ applications that provide chat services; iQiyi and Tencent Video applications that provide video services; the QQ Music application that provides music services; and so on.
  • the application may be developed by a manufacturer of the electronic device, or developed by a supplier of an operating system of the electronic device, or developed by a third-party application manufacturer, which is not limited in this application.
  • the command information of the application which is used to realize the action of the application (ie, function, service, task, operation, etc.).
  • the command information of the application may include relevant information of the action, specifically including: the type of the action, involved data, additional data, and the like.
  • the command information of an application can be represented by an Intent.
  • the Android system can assist the interaction and communication between applications through the Intent mechanism.
  • Intent is an abstract description of the action that needs to be performed in the application, and can be used as a parameter of the application.
  • the Android system can find the corresponding component according to the description of the Intent, and transmit the Intent to the component that needs to be called, so as to complete the invocation of the component.
  • Intent can not only be used between applications, but also can be applied to the interaction between activities (Activity) and services (Service) within the application. Therefore, Intent plays the role of a medium, providing relevant information about components calling each other, and realizing the decoupling between the caller and the callee.
  • The manifestations of an Intent include the following attributes: the main attributes of the Intent information are the type of the action to be executed (Action) and the operation data (Data); the secondary attributes (that is, the additional data contained in the Intent information) include the category (category), data type (type), component (component), and additional information (extras).
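  • For instance, a typical Android Intent carrying these attributes can be built as follows; the URI and extra key are made-up example values.

```kotlin
import android.content.Intent
import android.net.Uri

// An Intent carrying the main attributes (Action, Data) and some
// secondary attributes (category, extras) listed above.
val playVideo: Intent = Intent(Intent.ACTION_VIEW).apply {
    data = Uri.parse("content://media/external/video/media/42")  // operation data
    addCategory(Intent.CATEGORY_DEFAULT)                         // category
    putExtra("start_position_ms", 30_000)                        // additional information
}
// The system resolves this abstract description to a matching component and
// delivers the Intent to it, decoupling the caller from the callee.
```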
  • Multiple refers to two or more. At least one means one or more.
  • the application scenario includes multiple electronic devices.
  • the communication network may be a local area network (eg, a home local area network, a smart home local area network, etc.).
  • the communication network may also be a network formed by technologies such as wireless fidelity (Wi-Fi), Bluetooth (BT), near field communication (NFC), infrared (IR), sidelink communication, etc., which is not limited in this application.
  • any electronic device in the application scenario can be used as a control device with a control function; similarly, any electronic device can also be used as a cooperative device, and is controlled by the control device for application control.
  • a smart phone can control at least one of electronic devices such as a smart TV, a smart speaker, a notebook computer, and a wearable device.
  • the method can also be applied to various other application scenarios, such as: vehicle to everything (V2X), long term evolution-vehicle (LTE-V), vehicle to vehicle (V2V), Internet of Vehicles, machine type communications (MTC), Internet of Things (IoT), long term evolution-machine to machine (LTE-M), machine to machine (M2M), and other application scenarios.
  • FIG. 2 shows a structural diagram of a possible electronic device to which the method provided by the embodiment of the present application is applicable.
  • the electronic device 200 includes: a communication unit 201 , a processor 202 , a memory 203 , a display unit 204 , an input unit 205 , an audio circuit 206 , a sensor 207 , a camera 208 and other components. Each component of the electronic device 200 will be described in detail below with reference to FIG. 2 .
  • the communication unit 201 is used to realize the functions of the electronic device 200 and realize data communication with other devices.
  • the communication unit 201 may include a wireless communication module 2011 and a mobile communication module 2012 .
  • the electronic device 200 also needs to cooperate with components such as an antenna, a modem processor and a baseband processor in the processor 202 to implement a communication function.
  • the wireless communication module 2011 can provide applications on electronic devices including wireless local area networks (WLAN) (such as wireless fidelity (Wi-Fi) networks), Bluetooth (BT), global navigation satellite system (global navigation satellite system, GNSS), frequency modulation (frequency modulation, FM), near field communication technology (near field communication, NFC), infrared technology (infrared, IR) and other wireless communication solutions.
  • the wireless communication module 2011 may be one or more devices integrating at least one communication processing module.
  • the wireless communication module 2011 receives electromagnetic waves via an antenna, performs signal frequency modulation and filtering processing on the electromagnetic waves, and sends the processed signals to the processor 202 .
  • the wireless communication module 2011 can also receive the signal to be sent from the processor 202, perform frequency modulation and amplification on it, and convert it into an electromagnetic wave for radiation through the antenna.
  • the mobile communication module 2012 can provide mobile communication solutions including 2G/3G/4G/5G etc. applied on the electronic device.
  • the mobile communication module 2012 may include at least one filter, switch, power amplifier, low noise amplifier (LNA), and the like.
  • the mobile communication module 2012 can receive electromagnetic waves through the antenna, filter, amplify, etc. the received electromagnetic waves, and transmit them to the modulation and demodulation processor for demodulation.
  • the mobile communication module 2012 can also amplify the signal modulated by the modulation and demodulation processor, and then convert it into electromagnetic waves and radiate it out through the antenna.
  • at least part of the functional modules of the mobile communication module 2012 may be provided in the processor 202 .
  • at least part of the functional modules of the mobile communication module 2012 may be provided in the same device as at least part of the modules of the processor 202 .
  • the electronic device 200 can establish a wireless connection with a base station in a mobile communication system according to the mobile communication module 2012 , and receive services of the mobile communication system through the mobile communication module 2012 .
  • the electronic device 200 may send application command information to other electronic devices through the wireless communication module 2011 or the mobile communication module 2012 in the communication unit 201 , or receive application command information of other electronic devices, etc.; it can also send power-on signals to other electronic devices or receive power-on signals of other electronic devices.
  • the communication unit 201 may further include a communication interface for physically connecting the electronic device 200 with other devices.
  • the communication interface may be connected with the communication interface of the other device through a cable to realize data transmission between the terminal device 200 and the other device.
  • the memory 203 can be used to store software programs and data.
  • the processor 202 executes various functions and data processing of the terminal device 200 by running software programs and data stored in the memory 203 .
  • the software program may be a control program for implementing the control method, a program for various applications, and the like.
  • the memory 203 may mainly include a program storage area and a data storage area.
  • the storage program area may store an operating system, various software programs, etc.; the storage data area may store user input or data created by the terminal device 200 while running the software programs. Wherein, the operating system may be, for example, an Android system.
  • the memory 203 may include high-speed random access memory, and may also include non-volatile memory, such as at least one magnetic disk storage device, flash memory device, or other volatile solid-state storage device.
  • the control program implementing the control method and the programs of various applications can be stored in the stored program area, and the application command information and data such as icons can be stored in the storage data area.
  • the input unit 205 can be used to receive character information and signals input by the user.
  • the input unit 205 may include a touch panel 2051 and other input devices (eg, function keys).
  • The touch panel 2051, also called a touch screen, can collect the user's touch operations on or near it, generate corresponding touch information, and send it to the processor 202, so that the processor 202 can execute the command corresponding to the touch information.
  • The touch panel 2051 can be implemented using various types such as resistive, capacitive, infrared, and surface acoustic wave. For example, in this embodiment of the present application, the user can select an application to be merged or activated through the touch panel 2051.
  • the display unit 204 is used for presenting a user interface and realizing human-computer interaction.
  • the display unit 204 can display information input by the user or information provided to the user, as well as content such as various menus of the terminal device 200, various main interfaces (including icons of various applications), and windows of various applications.
  • the processor 202 may display icons of various applications in the display unit 204 .
  • the display unit 204 may include a display panel 2041, and the display panel 2041 may be configured in the form of a liquid crystal display (liquid crystal display, LCD), an organic light-emitting diode (organic light-emitting diode, OLED) or the like.
  • The touch panel 2051 can cover the display panel 2041. Although in FIG. 2 the touch panel 2051 and the display panel 2041 are shown as two independent components to realize the input and output functions of the electronic device 200, in this embodiment of the present application the touch panel 2051 may be integrated with the display panel 2041 (i.e., as a touch display screen) to realize the input and output functions of the electronic device 200.
  • The processor 202 is the control center of the electronic device 200. It connects the various components using various interfaces and lines, and executes the various functions of the electronic device 200 and processes data by running or executing the software programs and/or modules stored in the memory 203 and calling the data stored in the memory 203, thereby realizing the various services of the electronic device 200.
  • the processor 202 may run a control program stored in the memory 203 to implement the control method provided by the embodiments of the present application, and generate a control application.
  • the processor 202 may further control the communication unit 201 to send the command information of the application to other electronic devices after acquiring the start command of the control application.
  • the processor 202 may include one or more processing units.
  • the processor 202 may integrate an application processor, a modem processor, a baseband processor, a graphics processing unit (GPU), etc.; the application processor mainly processes the operating system, user interface, application programs, and the like, and the modem processor mainly handles wireless communication. It can be understood that the above-mentioned modem processor may alternatively not be integrated into the processor 202.
  • the audio circuit 206 (including the speaker 2061 and the microphone 2062 ) can provide an audio interface between the user and the terminal device 200 .
  • the audio circuit 206 can transmit the electrical signal converted from the received audio data to the speaker 2061, and the speaker 2061 converts it into a sound signal for output.
  • the microphone 2062 converts the collected sound signals into electrical signals, which are received by the audio circuit 206 and then converted into audio data for further processing such as transmission or storage.
  • the voice assistant application in the electronic device 200 may collect the user's voice command through the microphone 2062, so as to parse the voice command to obtain the corresponding command.
  • the electronic device 200 may also include one or more sensors 207, such as light sensors, motion sensors, ultrasonic sensors, and other sensors.
  • the electronic device 200 can implement various functions according to the real-time sensor data collected by the sensor 207 .
  • the electronic device 200 may further include a camera 208 to capture images.
  • the structure of the terminal device shown in FIG. 2 does not constitute a limitation on the terminal device, and the terminal device provided in this embodiment of the present application may include more or fewer components than those shown in the figure, or combine certain components, or have a different arrangement of components.
  • the software system of the electronic device provided by this application may adopt a layered architecture, an event-driven architecture, a microkernel architecture, a microservice architecture, or a cloud architecture.
  • the embodiments of the present application take an Android system with a layered architecture as an example to illustrate the software structure of an electronic device.
  • FIG. 3 shows a block diagram of a software structure of an electronic device provided by an embodiment of the present application.
  • the software structure of the electronic device can be a layered architecture, for example, the software can be divided into several layers, and each layer has a clear role and division of labor. Layers communicate with each other through software interfaces.
  • the Android system is divided into four layers, which are, from top to bottom, an application layer, a framework layer (framework, FWK), an Android runtime (Android runtime) and a system library, and a kernel layer.
  • the application layer can include a series of applications. As shown in FIG. 3 , the application layer may include camera applications, voice assistant applications, desktop management (eg, Huawei desktop HuaWei Launcher) applications, music applications, video applications, map applications, and third-party applications.
  • the third-party applications may include WeChat applications, iQiyi applications, and the like.
  • the framework layer provides an application programming interface (API) and a programming framework for applications in the application layer.
  • the application framework layer can include some predefined functions. As shown in FIG. 3, the application framework layer may include: a system service (System Service), a view system (View System), a web page service (Web Service), a phone manager, a resource manager, and the like.
  • the system service may include a window manager service (WMS) and an activity manager service (activity manager service, AMS).
  • In this embodiment of the present application, a new system-level service, namely a remote system service (remote system service), may also be added to the system service.
  • Each service in the system service is described below.
  • the window management service provides window management services for windows, specifically controlling the display and hiding of all windows and the position of windows on the display screen.
  • the window management service can specifically be responsible for the following functions: 1. Allocate a display plane (surface) for each window; 2. Manage the display order, size, and position of the surface; 3. By calling management functions (such as the surface control function (SurfaceControl.Transaction)), adjust the transparency, stretch factor, position, and size of the window to realize the animation effect of the window; 4. Provide the user with an appropriate window to display or handle a message when the electronic device needs to present it.
  • the activity management service provides management services for the activities in the application.
  • the activity management service may, but is not limited to, be responsible for the following functions: 1. Unified scheduling of the life cycle of all applications' activities; 2. Start or end the process of the application; 3. Start and schedule the life cycle of the service; 4. Register the broadcast receiver (Broadcast Receiver), and receive and distribute broadcasts (Broadcast); 5. Query the current operating status of the system; 6. Schedule tasks (task).
  • the remote system service is used to realize the signaling between different electronic devices in the control method in the embodiment of the present application, the interaction of command information of the application, and the like.
  • the electronic device can send a first control request to other electronic devices through the remote system service, so that the other electronic devices feed back the command information of the application that needs to be controlled according to the first control request, and can subsequently receive, through the remote system service, the application command information sent by the other electronic devices.
  • the electronic device may receive a second control request (including command information of an application) sent by another electronic device.
  • After the electronic device obtains the start command of the application to be controlled, it can also send the command information of the application to other electronic devices through the remote system service.
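  • A minimal sketch of such a remote-system-service channel, assuming plain TCP sockets; the port number and wire format are assumptions, and a real implementation would additionally need device discovery, authentication, and encryption.

```kotlin
import java.net.ServerSocket
import java.net.Socket
import kotlin.concurrent.thread

const val REMOTE_SERVICE_PORT = 38866  // hypothetical port

// Receive side: accept serialized control requests / command information
// from peer devices and hand them to a callback.
fun startRemoteSystemService(onMessage: (String) -> Unit) = thread(isDaemon = true) {
    ServerSocket(REMOTE_SERVICE_PORT).use { server ->
        while (true) {
            server.accept().use { client ->
                onMessage(client.getInputStream().bufferedReader().readText())
            }
        }
    }
}

// Send side: deliver a serialized message to another device's remote system service.
fun sendToRemoteSystemService(host: String, payload: String) {
    Socket(host, REMOTE_SERVICE_PORT).use { socket ->
        socket.getOutputStream().bufferedWriter().use { it.write(payload) }
    }
}
```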
  • the view system includes visual controls, such as controls for displaying text, controls for displaying pictures, and so on. View systems can be used to build applications.
  • An interface can consist of one or more controls.
  • the interface including the SMS notification icon may include controls for displaying text and controls for displaying pictures.
  • a web service is an API that can be called through a web page.
  • the phone manager is used to provide the communication function of the electronic device. For example, the management of call status (including connecting, hanging up, etc.).
  • the resource manager provides various resources for the application, such as localization strings, icons, pictures, layout files, video files and so on.
  • Android runtime includes core library (Kernel Library) and virtual machine.
  • Android runtime is responsible for scheduling and management of the Android system.
  • the core library consists of two parts: one part is the function functions that the Java language needs to call, and the other part is the core library of the Android system, which is used to provide the Android system with input/output services (Input/Output Service) and core services (Kernel Service).
  • the application layer and framework layer can run in virtual machines.
  • the virtual machine executes the java files of the application layer and the framework layer as binary files.
  • the virtual machine is used to perform functions such as object lifecycle management, stack management, thread management, safety and exception management, and garbage collection.
  • a system library can include multiple functional modules. For example: icon management module, fusion application management module, media library, image processing library, etc.
  • the control application management module (i.e., the fusion application management module) is used to determine the command information of the locally located application that is selected by the user and needs to be controlled, or to generate a control application (also referred to as a fusion application) according to the obtained command information of applications located in other electronic devices.
  • the icon management module is used for correspondingly generating a control icon of the control application in the process of generating the control application.
  • the media library supports playback and recording of audio and video in multiple formats, and supports opening of still images in multiple formats.
  • the media library can support a variety of audio and video encoding formats, such as: MPEG4, H.264, MP3, AAC, AMR, JPG, PNG, etc.
  • the kernel layer is the layer between hardware and software.
  • the kernel layer at least includes display drivers, sensor drivers, processor drivers, camera drivers, audio drivers, etc., which are used to drive the hardware in the hardware layer.
  • the hardware layer can include various sensors, displays, processors, input devices, memory, cameras, etc.
  • an embodiment of the present application provides a control method, which can be applied to an application scenario with multiple electronic devices as shown in FIG. 1. The method specifically includes two processes: control application generation and control application startup. Based on the software structure of the electronic device shown in FIG. 3, and taking cooperative control of a first application of the first electronic device and a second application of the second electronic device as an example, the two processes of control application generation and control application startup are described in detail below with reference to FIG. 4A and FIG. 4B respectively.
  • the process of controlling application generation includes the following steps:
  • the user When the user wants to generate the control application of the first application and the second application on the first electronic device, the user needs to operate the second electronic device, select the second application to be cooperatively controlled on the second electronic device, and the device to be cooperatively controlled for the first electronic device.
  • the second electronic device transmits the icon 2 (Icon2) and the intent 2 (Intent2) of the second application to its own remote system service; the icon 2 and the intent 2 can be transmitted to the remote system service through various network communication methods, such as broadcast (Broadcast) or socket (Socket).
  • after the remote system service of the second electronic device establishes a connection with the remote system service of the first electronic device, it sends a control request to the remote system service of the first electronic device, where the control request includes the icon 2 and the intent 2 of the second application, and the information of the second electronic device.
  • the control request may further include related information of the second application (eg, application name, application function information, etc.).
  • after the first electronic device receives the control request of the second electronic device through the remote system service, it can prompt the user whether to select a local application to perform coordinated control (that is, fusion/combination) with the second application, and transmit the information in the control request to its own activity management service (Activity Management Service, AMS).
  • after the user operates the first electronic device to select the first application to be coordinated, the first electronic device transmits the icon 1 (Icon1) and the intent 1 (Intent1) of the first application to the activity management service according to the user's operation; exemplarily, the icon 1 and the intent 1 can also be transmitted through network communication such as broadcast or socket.
  • the activity management service of the first electronic device transmits the received icon 1 and intent 1 of the first application, the icon 2 and intent 2 of the second application, and the information of the second electronic device to the desktop management function of the first electronic device (for example, the desktop management application (HuaWei Launcher)); exemplarily, the above-mentioned information can also be transmitted to the desktop management function by means of network communication such as broadcast or socket.
  • the desktop management function generates a control application and a control icon according to the above information received, as shown in the figure, including the following steps:
  • the desktop management function generates a control application according to the intent 1 of the first application and the intent 2 of the second application.
  • the desktop management function performs icon redrawing (eg, combined drawing) according to the icon 1 of the first application and the icon 2 of the second application to generate a control icon.
  • the desktop management function may also generate a control icon according to a preset picture or a picture selected by the user.
  • the desktop management function associates the control icon with the control application (ie, realizes the association between the control icon and the intent 1 and the intent 2), and associates the control icon with the information of the second electronic device.
  • the control icon is actually a new type of shortcut; a single icon may correspond to the intents of applications located on at least one electronic device.
  • the first electronic device may store the generated control application and control icon in the database.
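To make the icon redrawing and the association between the control icon and the two intents more concrete, the following is a minimal, purely illustrative sketch using standard Android APIs (Canvas drawing and ShortcutManager pinned shortcuts). The patent does not specify this implementation; the class name, the side-by-side drawing, and the choice of ACTION_VIEW are assumptions made only for the example.

```java
// Illustrative sketch only: one way a launcher-side component could redraw two
// application icons into a single control icon and pin it as a shortcut that
// carries both intents. Class and method names are assumptions for illustration.
import android.content.Context;
import android.content.Intent;
import android.content.pm.ShortcutInfo;
import android.content.pm.ShortcutManager;
import android.graphics.Bitmap;
import android.graphics.Canvas;
import android.graphics.drawable.Icon;

public final class ControlIconHelper {

    /** Draws icon1 and icon2 side by side into one combined bitmap ("combined drawing"). */
    public static Bitmap combineIcons(Bitmap icon1, Bitmap icon2, int size) {
        Bitmap combined = Bitmap.createBitmap(size, size, Bitmap.Config.ARGB_8888);
        Canvas canvas = new Canvas(combined);
        int half = size / 2;
        canvas.drawBitmap(Bitmap.createScaledBitmap(icon1, half, size, true), 0, 0, null);
        canvas.drawBitmap(Bitmap.createScaledBitmap(icon2, half, size, true), half, 0, null);
        return combined;
    }

    /** Pins a single shortcut whose intent array references both applications. */
    public static void pinControlShortcut(Context context, String id,
                                          CharSequence label, Bitmap controlIcon,
                                          Intent intent1, Intent intent2) {
        ShortcutManager sm = context.getSystemService(ShortcutManager.class);
        if (sm == null || !sm.isRequestPinShortcutSupported()) {
            return; // the launcher does not support pinned shortcuts
        }
        // Intents used in a shortcut must carry an explicit action.
        intent1.setAction(Intent.ACTION_VIEW);
        intent2.setAction(Intent.ACTION_VIEW);
        ShortcutInfo info = new ShortcutInfo.Builder(context, id)
                .setShortLabel(label)
                .setIcon(Icon.createWithBitmap(controlIcon))
                .setIntents(new Intent[] {intent1, intent2})
                .build();
        sm.requestPinShortcut(info, null);
    }
}
```

Pinned shortcuts of this kind require API level 26 or later and a launcher that supports shortcut pinning; placing the two source icons side by side is only one possible form of the combined drawing mentioned above.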
  • the process of controlling application startup includes the following steps:
  • when the user wants to start the control application on the first electronic device, the user operates the first electronic device and clicks the control icon on the first electronic device.
  • the first electronic device notifies an internal activity management service (AMS) to start a fusion activity (StartMultiActivity) according to the user's operation.
  • the activity management service of the first electronic device determines that this startup is the startup of a cross-device control application; therefore, according to the intents associated with the control application (intent 1 and intent 2), it determines that the first application corresponding to intent 1 is located locally and that the second application corresponding to intent 2 is located in the second electronic device.
  • the activity management service of the first electronic device directly starts the first application by starting an activity (StartActivity) to realize the action corresponding to intent 1. For the second electronic device, the first electronic device first checks whether a connection with the second electronic device has been established; if it determines that the connection has not been established (indicating that the second electronic device is not powered on), the first electronic device sends a power-on signal (such as an infrared signal) to the second electronic device, so that the second electronic device automatically establishes a connection with the first electronic device once it is powered on. When the first electronic device determines that the connection with the second electronic device has been established, it notifies its internal activity management service (AMS) to start a remote activity (StartRemoteActivity); the activity management service of the first electronic device starts the remote system service, so that the remote system service of the first electronic device sends intent 2 to the remote system service of the second electronic device according to the information of the second electronic device.
  • after receiving intent 2, the remote system service of the second electronic device sends intent 2 to its internal activity management service (AMS).
  • the activity management service of the second electronic device starts the second application by starting the activity (StartActivity), so as to realize the action corresponding to the intent 2 .
  • in the above process, the first electronic device can generate a control application according to the intent of an application located in the second electronic device, and, when the user starts the control application, send that intent to the second electronic device so that the second electronic device realizes the action corresponding to the intent; this finally realizes cross-device application control and thereby multi-device collaboration.
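The startup dispatch described above can be summarized in a short, purely illustrative sketch: every intent bound to the control icon is either started locally with startActivity or handed to a remote system service for the device that owns it. RemoteSystemService and DeviceInfo below are hypothetical placeholders for components the patent only names; only Intent, startActivity, and the intent-to-URI conversion are standard Android APIs.

```java
// Illustrative sketch only: how the startup dispatch of a cross-device control
// application might look. RemoteSystemService and DeviceInfo are hypothetical
// placeholders; they are not defined by the patent or by Android.
import android.content.Context;
import android.content.Intent;
import java.util.Map;

public final class ControlAppLauncher {

    /** Each stored intent is mapped to the device that owns it (null = local application). */
    public static void launch(Context context, Map<Intent, DeviceInfo> boundIntents,
                              RemoteSystemService remoteService) {
        for (Map.Entry<Intent, DeviceInfo> entry : boundIntents.entrySet()) {
            Intent intent = entry.getKey();
            DeviceInfo device = entry.getValue();
            if (device == null) {
                // Local application: start it directly (StartActivity).
                intent.addFlags(Intent.FLAG_ACTIVITY_NEW_TASK);
                context.startActivity(intent);
            } else {
                // Remote application: hand the intent to the remote system service,
                // which forwards it to the peer device (StartRemoteActivity).
                if (!remoteService.isConnected(device)) {
                    remoteService.sendPowerOnSignal(device); // e.g. an infrared signal
                    remoteService.waitForConnection(device);
                }
                remoteService.sendIntent(device, intent.toUri(Intent.URI_INTENT_SCHEME));
            }
        }
    }

    // Hypothetical interfaces standing in for components the patent only names.
    public interface RemoteSystemService {
        boolean isConnected(DeviceInfo device);
        void sendPowerOnSignal(DeviceInfo device);
        void waitForConnection(DeviceInfo device);
        void sendIntent(DeviceInfo device, String intentUri);
    }

    public interface DeviceInfo { }
}
```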
  • the embodiment of the present application provides another control method, and the method can be applied to an application scenario with multiple electronic devices as shown in FIG. 1 .
  • the first electronic device is used as an electronic device for controlling other devices, which may be an electronic device that is easily operated by a user and carried around, such as a smart phone, a wearable device, and the like.
  • the second electronic device can be various electronic devices, which is not limited in this application. The specific process of the method will be described in detail below with reference to the flowchart of the control method shown in FIG. 5 .
  • a first electronic device acquires command information of a first application, where the first application is located in a second electronic device, and the command information of the first application is used to implement an action of the first application.
  • the command information of the first application may be an intent (Intent) of the first application.
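Because the command information may take the form of an Android Intent, one plausible way to carry it between devices is to serialize the Intent to a URI string and rebuild it on the receiving side. This is a hedged sketch using only the standard Intent.toUri/Intent.parseUri pair; the patent does not specify the wire format, so treat it as an assumption.

```java
// Illustrative sketch only: Android's standard way to turn an Intent into a
// transportable string and back, which is one plausible form the "command
// information" could take when sent between devices.
import android.content.Intent;
import java.net.URISyntaxException;

public final class IntentCodec {

    /** Serializes the intent to a URI string that can be sent over any transport. */
    public static String encode(Intent intent) {
        return intent.toUri(Intent.URI_INTENT_SCHEME);
    }

    /** Rebuilds the intent on the receiving device from the transmitted string. */
    public static Intent decode(String uri) {
        try {
            return Intent.parseUri(uri, Intent.URI_INTENT_SCHEME);
        } catch (URISyntaxException e) {
            throw new IllegalArgumentException("Malformed intent URI", e);
        }
    }
}
```

The receiving device can then pass the decoded Intent to startActivity(), which corresponds to starting the activity (StartActivity) to realize the action of the first application.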
  • the first electronic device may, but is not limited to, acquire the command information of the first application in three ways as shown in the figure. Each method is described below:
  • Manner 1: the first electronic device receives, through S501a, the command information of the first application input by the user or sent from another device.
  • Manner 2: the user operates the second electronic device, selects, among the applications of the second electronic device, the first application to be collaboratively controlled, and selects the first electronic device as the device to perform the collaborative control; then the second electronic device sends a control request to the first electronic device through S501b, where the control request carries the command information of the first application, and the first electronic device receives the command information of the first application from the second electronic device.
  • Manner 3: the user operates the first electronic device, selects the second electronic device to be controlled in coordination with the first electronic device, and selects the first application to be cooperatively controlled; then the first electronic device sends a control request to the second electronic device through S501c1, where the control request carries the information of the first application, so that the second electronic device feeds back the command information of the first application according to the control request.
  • after the second electronic device receives the control request, it prompts the user whether to allow the first electronic device to perform coordinated control of the first application; if the user confirms, the second electronic device sends a control response to the first electronic device through S501c2, where the control response includes the command information of the first application.
  • alternatively, the user operates the first electronic device and selects, on the first electronic device, the second electronic device to be cooperatively controlled; then the first electronic device sends a control request to the second electronic device through S501c1, so that the second electronic device feeds back the command information of the first application to be cooperatively controlled according to the control request; after receiving the control request, the second electronic device prompts the user to select the application to be cooperatively controlled; the user then operates the second electronic device and selects, among the applications of the second electronic device, the first application to be cooperatively controlled; the second electronic device sends a control response to the first electronic device, where the control response includes the command information of the first application.
  • in both Manner 2 and Manner 3, the second electronic device sends the command information of the first application to the first electronic device; therefore, optionally, in Manners 2 and 3, the second electronic device may also send the icon of the first application or the information of the second electronic device to the first electronic device.
  • the icon of the first application is used by the first electronic device to subsequently generate a control icon for controlling the application, and the information of the second electronic device may identify that the first application is located in the second electronic device.
  • the first electronic device generates a control application according to the command information of the first application, where the control application is used to make the second electronic device realize the action of the first application.
  • in a first implementation manner, the first electronic device generates a control application only according to the command information of the first application.
  • the control application is for the first electronic device to cooperatively control the first application on the second electronic device.
  • the first electronic device may prompt the user whether to select a local application to perform cooperative control with the first application; if the user chooses not to use a local application for cooperative control, the first electronic device generates the control application only according to the command information of the first application; if the user selects a local second application to perform cooperative control with the first application, the first electronic device generates the control application according to the command information of the second application and the command information of the first application.
  • in the latter case, the control application can not only make the second electronic device realize the action of the first application, but also make the first electronic device realize the action of the second application.
  • the first electronic device may also acquire command information of a third application located on other electronic devices (the third electronic device will be used as an example for description in the following).
  • the first electronic device may generate the control application according to the command information of the first application and the command information of the third application.
  • the control application can not only enable the second electronic device to implement the action of the first application, but also enable the third electronic device to realize the action of the third application.
  • the first electronic device may further prompt the user whether to select a local application to perform cooperative control with the first application and the third application. If the user selects the local second application to perform coordinated control with the first application and the third application, the first electronic device generates the control application according to the command information of the first application, the command information of the second application, and the command information of the third application; at this time, the control application enables the three electronic devices to respectively implement the actions of their respective applications. If the user does not select a local application, the first electronic device generates the control application according to the command information of the first application and the command information of the third application; at this time, the control application not only enables the second electronic device to implement the action of the first application, but also enables the third electronic device to implement the action of the third application.
  • the first electronic device may also generate a control icon corresponding to the control application, and display the control icon on the display screen of the first electronic device.
  • the control icon is displayed, so that the user can intuitively see that the control application has been generated, and the user can start the control application by clicking the control icon.
  • the first electronic device may generate the control icon according to a preset picture or a picture selected by a user.
  • the first electronic device may also acquire the icons of each application to be collaboratively controlled, and perform icon redrawing (for example, combined drawing, layered drawing, etc.) according to these icons to generate the control icon.
  • the first electronic device can obtain the icon of the corresponding application in the same way as obtaining the command information of the application of other electronic devices.
  • for the specific process, refer to the description of obtaining the command information of the first application in S501, which will not be repeated here.
  • control icon is actually a new type of shortcut, and a single icon may correspond to command information of an application located in at least one electronic device.
  • so that the first electronic device can send the corresponding command information to other electronic devices when the control application is subsequently started, the first electronic device, when generating the control icon, can also associate the control icon with the information of the other electronic devices.
  • the first electronic device can also obtain the information of the corresponding electronic device in the same way as obtaining the command information of the application of other electronic devices. The specific description will not be repeated here.
  • S503: after the first electronic device obtains the start command of the control application, it sends the command information of the first application to the second electronic device, so that the second electronic device executes the action of the first application according to the received command information of the first application.
  • when the first electronic device further generates the control application according to the command information of a second application located locally, the first electronic device, when executing S503, also needs to execute the action of the second application according to the command information of the second application; in other words, the first electronic device starts the second application according to the command information of the second application, and executes the action of the application through the second application.
  • when the first electronic device further generates the control application according to the command information of a third application located in the third electronic device, the first electronic device, when performing S503, also sends the command information of the third application to the third electronic device, so that the third electronic device executes the action of the third application according to the received command information of the third application.
  • in the case that the first electronic device also associates the control icon with the information of other electronic devices in the process of generating the control application, the first electronic device can send the command information of the first application to the second electronic device according to the information of the second electronic device associated with the control icon, and can send the command information of the third application to the third electronic device according to the information of the third electronic device associated with the control icon.
  • the first electronic device may acquire the start command of the control application in the following manner:
  • Manner 1: the first electronic device detects an operation by the user on the control icon corresponding to the control application; the first electronic device generates a start command of the control application in response to the operation.
  • Manner 2: the first electronic device receives a user's voice command through a voice assistant application; the first electronic device obtains a start command of the control application obtained by the voice assistant application parsing the voice command.
  • the applications managed by the voice assistant application of the first electronic device are generally applications located in the first electronic device, and their number is limited. Therefore, in order for the voice assistant of the first electronic device to manage applications (for example, the first application) located on other electronic devices, before performing S503 the first electronic device may also add the first application (and, optionally, the managed electronic device, that is, the second electronic device) to the list of applications managed by the voice assistant application.
  • the first electronic device may send a control request to the second electronic device after adding the first application to the list managed by the voice assistant application, so that the second electronic device feeds back the command information of the first application in response to the control request; for the specific process, reference may be made to the description of Manner 3 in S501, which will not be described in detail here.
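As a purely illustrative sketch of how a voice assistant could keep track of applications hosted on other devices, the hypothetical registry below maps a spoken application name to the serialized command information (an intent URI) and the identifier of the hosting device. None of these names come from the patent; they are assumptions made only to show the bookkeeping involved.

```java
// Illustrative sketch only: a hypothetical registry that lets a voice assistant
// manage an application hosted on another electronic device.
import java.util.HashMap;
import java.util.Map;

public final class ManagedAppRegistry {

    public static final class Entry {
        public final String intentUri; // command information of the remote application
        public final String deviceId;  // information of the hosting electronic device
        public Entry(String intentUri, String deviceId) {
            this.intentUri = intentUri;
            this.deviceId = deviceId;
        }
    }

    private final Map<String, Entry> managedApps = new HashMap<>();

    /** Called after the control response arrives, e.g. add("huawei video", uri, tvId). */
    public void add(String spokenName, String intentUri, String deviceId) {
        managedApps.put(spokenName.toLowerCase(), new Entry(intentUri, deviceId));
    }

    /** Looks up the entry whose spoken name appears in the recognized utterance. */
    public Entry match(String utterance) {
        String lower = utterance.toLowerCase();
        for (Map.Entry<String, Entry> e : managedApps.entrySet()) {
            if (lower.contains(e.getKey())) {
                return e.getValue();
            }
        }
        return null;
    }
}
```

In this sketch, the assistant would call add(...) once after the control response from the second electronic device arrives, and check every later utterance with match(...) to decide whether a remote application should be started.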
  • before sending the command information of the corresponding application to another electronic device, the first electronic device can also determine whether the other device is powered on (that is, whether a connection has been established with it); if the other device is not powered on, the first electronic device can send a power-on signal (such as an infrared signal) to it.
  • if the first electronic device determines that the connection with the second electronic device has not been established, it sends a power-on signal to the second electronic device; after the second electronic device is powered on, the first electronic device establishes a connection with it.
  • the first electronic device can also automatically turn on other electronic devices and establish connections with other electronic devices, thereby reducing user operations during the collaborative control process and improving user experience.
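For the power-on signal mentioned above, a phone with an infrared emitter could use Android's standard ConsumerIrManager, as in the hedged sketch below. The carrier frequency and mark/space pattern are placeholders only; the real values depend on the remote-control protocol of the target device, and the app needs the android.permission.TRANSMIT_IR permission.

```java
// Illustrative sketch only: sending a power-on signal over the phone's infrared
// emitter with the standard ConsumerIrManager API. The pattern values below are
// placeholders, not a real power-on code for any particular device.
import android.content.Context;
import android.hardware.ConsumerIrManager;

public final class PowerOnHelper {

    public static boolean sendPowerOnSignal(Context context) {
        ConsumerIrManager ir =
                (ConsumerIrManager) context.getSystemService(Context.CONSUMER_IR_SERVICE);
        if (ir == null || !ir.hasIrEmitter()) {
            return false; // this phone has no IR blaster
        }
        int carrierFrequencyHz = 38000;      // typical consumer IR carrier frequency
        int[] onOffPatternMicros = {         // placeholder mark/space pattern in microseconds
                9000, 4500, 560, 560, 560, 1690, 560, 560
        };
        ir.transmit(carrierFrequencyHz, onOffPatternMicros);
        return true;
    }
}
```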
  • when the first electronic device obtains the command information of the first application and the information of the second electronic device through S501, it may also obtain the information of a fourth electronic device associated with the second electronic device; the information of the fourth electronic device indicates that the cooperation of the fourth electronic device may be required when the second electronic device implements the action of the first application. In this case, before sending the command information of the first application to the second electronic device, the first electronic device also needs to establish a connection with the fourth electronic device, that is, perform the following steps:
  • if the first electronic device determines that the connection with the fourth electronic device has not been established, it sends a power-on signal to the fourth electronic device; after the fourth electronic device is powered on, the first electronic device establishes a connection with it.
  • in this way, the first electronic device can also control the powering on of the fourth electronic device associated with the second electronic device, so that the fourth electronic device and the second electronic device can automatically establish a connection once both are powered on, thereby ensuring that the second electronic device can cooperate with the fourth electronic device to realize the action of the first application.
  • S504: the second electronic device starts the first application according to the received command information of the first application, and executes the action of the first application through the first application.
  • after the first electronic device generates a control application, if the user has further application control or multi-device coordination requirements, the first electronic device may send the command information of the control application to a fifth electronic device, where the command information of the control application is used to start the control application.
  • the fifth electronic device can generate a new control application at the fifth electronic device according to the command information of the control application.
  • the embodiment of the present application provides a control method, through which an electronic device can acquire command information of applications located in other electronic devices and generate a control application according to the command information, so that a user can start the control application to make the other electronic devices realize the actions of the applications.
  • the electronic device can realize the cross-device application control function by generating a control application, thereby realizing multi-device collaboration, thereby improving user experience.
  • the connection between any two electronic devices may be any of various wireless communication connections, for example, at least one of a local area network connection, a Wi-Fi connection, a Bluetooth connection, an IR connection, an NFC connection, a sidelink connection, and the like.
  • Example 1: the applicable application scenario is that the user uses the Huawei video application in the smart TV to watch videos, and the user wishes to use the smartphone as a remote control.
  • the process of generating a cross-device control application includes the following steps:
  • the user selects the application to be collaboratively controlled (to be shared, to be integrated) as the Huawei video application, and selects the electronic device to be shared as the smartphone.
  • the smart TV sends the icon and intent of the Huawei Video app, as well as the smart TV's logo, to the smartphone through a control request.
  • after the smartphone receives the control request, it asks the user whether to choose a local application for collaborative control with the Huawei video application (that is, whether to choose a local application to integrate or combine with the Huawei video application); if the user chooses the local intelligent remote control application for collaborative control, the smartphone generates a control application according to the intent of the Huawei video application and the intent of the intelligent remote control application, and generates a control icon according to the icon of the Huawei video application and the icon of the intelligent remote control application; the control icon is associated with the control application, and the control icon is associated with the identification of the smart TV.
  • the process of starting the control application in this example includes the following steps:
  • the smart phone starts the smart remote control application according to the normal local application startup process.
  • the smartphone sends the intent of the Huawei video app to the smart TV. Afterwards, the smart TV launches the Huawei Video app.
  • the user can perform various operations on the smart remote control application of the smart phone, and the smart phone will send the infrared remote control signal corresponding to the operation to the smart TV, so that the Huawei video application of the smart TV can perform corresponding actions according to the infrared remote control signal.
  • in this way, when the user does not want to operate the remote control of the smart TV, the user can start the smart TV by clicking the control icon on the smart phone and open the corresponding applications on the smart phone and the smart TV respectively; the user can then operate the smart remote control application on the smart phone, using the smart phone as a remote control, so that the program played by the Huawei video application in the smart TV can be directly controlled.
  • this example extends the function of the desktop icon of the smartphone, that is, a single control icon can achieve the purpose of opening multiple devices and multiple applications.
  • the control application in the smart phone is associated with the identification of the smart TV, so that the smart phone can automatically complete the process of starting the smart TV and establishing a connection with the smart TV according to the identification of the smart TV.
  • Example 2: the applicable application scenario is that the user plays games using the badminton somatosensory game application of the smart TV, and the user wishes to use the smartphone as a somatosensory controller (that is, an input device for inputting other data).
  • the process of generating a cross-device control application includes the following steps:
  • the user selects the application to be collaboratively controlled (to be shared, to be integrated) as the badminton somatosensory game application (that is, selects the application that needs to be associated with the input device as the badminton somatosensory game application), and selects the electronic device to be shared as the smartphone.
  • the smart TV sends the icon, intent of the badminton somatosensory game application, and the identity of the smart TV to the smart phone through a control request.
  • after the smartphone receives the control request, it asks the user whether to choose a local application for collaborative control with the badminton somatosensory game application (that is, whether to choose a local application to fuse or combine with the badminton somatosensory game application); if the user chooses the local somatosensory control application for collaborative control with the badminton somatosensory game application, the smartphone generates a control application according to the intent of the badminton somatosensory game application and the intent of the somatosensory control application, and generates a control icon according to the icon of the badminton somatosensory game application and the icon of the somatosensory control application; the control icon is associated with the control application, and the control icon is associated with the identification of the smart TV.
  • the process of starting the control application in this example includes the following steps:
  • the smartphone starts the somatosensory control application according to the normal local application startup process.
  • the smartphone sends the intent of the badminton somatosensory game application to the smart TV. After that, the smart TV starts the badminton somatosensory game application.
  • the user can use the smart phone as the somatosensory controller, and as the user moves the smart phone, the smart phone sends the somatosensory input data to the smart TV, so that the badminton somatosensory game application of the smart TV can perform the action corresponding to the received somatosensory input data.
  • the smart phone can transmit the somatosensory input data through an already established connection with the smart TV (for example, a Bluetooth connection, a Wi-Fi connection, etc.), or the smart phone can establish a new connection with the smart TV to transmit the somatosensory input data.
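A minimal sketch of the somatosensory data path, under the assumption that the established connection is exposed as a plain TCP socket: the phone registers an accelerometer listener and streams the readings to the TV. The host/port and the float-triplet wire format are assumptions; the patent does not specify the transport encoding.

```java
// Illustrative sketch only: streaming accelerometer readings from the phone to
// the TV over a plain TCP socket. The transport format is an assumption.
import android.content.Context;
import android.hardware.Sensor;
import android.hardware.SensorEvent;
import android.hardware.SensorEventListener;
import android.hardware.SensorManager;
import java.io.DataOutputStream;
import java.net.Socket;

public final class MotionSender implements SensorEventListener {

    private DataOutputStream out;

    public void start(Context context, String tvHost, int tvPort) throws Exception {
        // In a real app the socket must be opened off the main thread.
        Socket socket = new Socket(tvHost, tvPort);
        out = new DataOutputStream(socket.getOutputStream());
        SensorManager sm = (SensorManager) context.getSystemService(Context.SENSOR_SERVICE);
        Sensor accel = sm.getDefaultSensor(Sensor.TYPE_ACCELEROMETER);
        sm.registerListener(this, accel, SensorManager.SENSOR_DELAY_GAME);
    }

    @Override
    public void onSensorChanged(SensorEvent event) {
        try {
            // x, y, z acceleration; the game interprets these as racket motion.
            out.writeFloat(event.values[0]);
            out.writeFloat(event.values[1]);
            out.writeFloat(event.values[2]);
            out.flush();
        } catch (Exception ignored) {
        }
    }

    @Override
    public void onAccuracyChanged(Sensor sensor, int accuracy) { }
}
```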
  • in this way, by clicking the control icon on the smart phone, the smart TV can be started and the badminton somatosensory game application in the smart TV can be opened, and then the user can use the smart phone as the somatosensory controller of the badminton somatosensory game application.
  • this example extends the function of the desktop icon of the smartphone, that is, a single control icon can achieve the purpose of opening multiple devices and multiple applications.
  • the control application in the smart phone is associated with the identification of the smart TV, so that the smart phone can automatically complete the process of starting the smart TV and establishing a connection with the smart TV according to the identification of the smart TV.
  • Example 3: the applicable application scenario is that the user sings using the national karaoke application in the smart TV, and the user wishes to use the smart phone as a microphone and use the smart speaker to play audio.
  • the process of generating a cross-device control application includes the following steps:
  • the user selects the application to be collaboratively controlled (to be shared, to be integrated) as the national karaoke application, and selects the electronic device to be shared as the smartphone, and selects the associated device as the smart speaker.
  • the smart TV sends the icon and intent of the national karaoke application, as well as the identification of the smart TV and the identification of the smart speaker, to the smart phone through a control request.
  • after the smartphone receives the control request, it asks the user whether to choose a local application for collaborative control with the national karaoke application (that is, whether to choose a local application to integrate or combine with the national karaoke application); if the user chooses the local microphone application for collaborative control with the national karaoke application, the smart phone generates a control application according to the intent of the national karaoke application and the intent of the microphone application, and generates a control icon according to the icon of the national karaoke application and the icon of the microphone application; the control icon is associated with the control application, and the control icon is associated with the identification of the smart TV and the identification of the smart speaker.
  • the smartphone starts the microphone application according to the normal local application startup process.
  • the smartphone sends the intent of the national karaoke application to the smart TV; after that, the smart TV starts the national karaoke application.
  • the user can use the smart phone as the microphone of the smart TV to collect the user's voice data, and the smart phone sends the voice data to the smart TV, so that the national karaoke application of the smart TV can process the voice data, thereby generating audio data.
  • the smart TV can also send audio data to the smart speaker so that the smart speaker can output the audio data.
  • the smart phone can transmit the intent of the national karaoke application or the voice data to the smart TV through the connection with the smart TV (such as a Bluetooth connection), and the smart TV can transmit the audio data to the smart speaker through the connection with the smart speaker (such as a Bluetooth connection).
  • in this way, the user can click the control icon on the smart phone to start the smart TV and the smart speaker and to open the corresponding applications on the smart phone and the smart TV respectively; the user can then use the smartphone as the microphone of the smart TV to collect voice data, and the smart TV can play audio data through the smart speaker, which significantly improves the user experience.
  • this example extends the function of the desktop icon of the smartphone, that is, a single control icon can achieve the purpose of opening multiple devices and multiple applications.
  • the control application in the smart phone is associated with the identification of the smart TV and the identification of the smart speaker, so that the smart phone can automatically complete the process of turning on the smart TV and establishing a connection with the smart TV according to the identification of the smart TV, and can automatically complete the startup of the smart speaker according to the identification of the smart speaker.
  • Example 4: the applicable application scenario is that the user wishes to use the voice assistant application on the smart phone to collaboratively control the Huawei video application in the smart TV, and use the smart speaker to play audio.
  • the process of generating a cross-device control application by a smartphone includes the following steps:
  • the user adds the smart TV to the list of electronic devices managed by the voice assistant application of the smart phone (that is, the smart TV is the electronic device to be collaboratively controlled).
  • the smartphone sends a control request to the smart TV.
  • after the smart TV receives the control request, it prompts the user to select the application to be collaboratively controlled; the user then operates the smart TV, selects the local Huawei video application as the application to be collaboratively controlled, and selects the associated device of the application as the smart speaker.
  • the smart TV sends the icon and intent of the Huawei video application, as well as the identification of the smart TV and the identification of the smart speaker, to the smartphone through a control response.
  • after receiving the control response, the smartphone generates a control application according to the intent of the Huawei video application, and generates a control icon according to the icon of the Huawei video application; it associates the generated control icon with the control application, and associates the control icon with the identification of the smart TV and the identification of the smart speaker.
  • the process of starting the control application (including the second control application in the above-mentioned second embodiment) on the smart phone in this example includes the following steps:
  • the user starts the voice assistant application in the smartphone, and inputs the voice information "TV plays XXXX"; the voice assistant application parses the voice information, starts the control application on the smartphone, and generates a command message instructing to play XXXX.
  • the smartphone detects the connection to the smart TV and the connection to the smart speaker, respectively. If the smartphone does not detect a connection with the smart TV, it means that the smart TV is not turned on, and the smartphone uses an infrared remote control signal to turn on the smart TV and establish a connection with it; similarly, if the smartphone does not detect a connection with the smart speaker, it means that the smart speaker is not turned on, and the smartphone uses an infrared remote control signal to turn on the smart speaker and establish a connection with it. In this way, a connection between the smart speaker and the smart TV can be established once both are turned on.
  • the smartphone After the smartphone establishes a connection with the smart TV, it sends the intent of the Huawei video application and the command message obtained by parsing the voice information to the smart TV. After that, the smart TV starts the Huawei video app and plays the XXXX video according to the command message. In addition, the audio data of the XXXX video played by the smart TV is sent to the smart speaker, so that the smart speaker outputs the audio data.
  • the smart TV can transmit audio data to the smart speaker through a connection with the smart speaker (eg, a Bluetooth connection).
  • Example 5: the applicable application scenario is the same as that of Example 4.
  • the user generates the first control application in the smart TV.
  • the first control application is generated according to the intent of the Huawei video application, and is associated with the smart speaker.
  • the specific generation process is as follows: the user selects the application to be collaboratively controlled as the Huawei video application in the application list interface of the smart TV, and selects the electronic device to be associated as the smart speaker.
  • the smart TV generates the first control application according to the intention of the Huawei video application, and generates the first control icon according to the icon of the Huawei video application; associates the first control icon with the first control application, and associates the first control icon with the identification of the smart speaker .
  • the process of generating a cross-device control application by a smartphone includes the following steps:
  • the user adds the smart TV to the list of electronic devices managed by the voice assistant application of the smart phone (that is, the smart TV is the electronic device to be collaboratively controlled).
  • the smartphone sends a control request to the smart TV.
  • after the smart TV receives the control request, it prompts the user to select the application to be cooperatively controlled; the user then operates the smart TV and selects the local first control application as the application to be cooperatively controlled.
  • the smart TV sends the icon and intent of the first control application (for opening the Huawei video application and playing through the smart speaker), as well as the identification of the smart TV and the identification of the smart speaker to the smartphone through a control response.
  • after receiving the control response, the smart phone generates a second control application according to the intent of the first control application, and generates a second control icon according to the icon of the first control application; it associates the generated second control icon with the second control application, and associates the second control icon with the identification of the smart TV and the identification of the smart speaker.
  • the process of starting the second control application on the smartphone in this example includes the following steps:
  • the user starts the voice assistant application in the smartphone, and inputs the voice information "TV plays XXXX"; the voice assistant application parses the voice information, starts the second control application on the smartphone, and generates a command message instructing to play XXXX.
  • the smartphone detects the connection to the smart TV and the connection to the smart speaker, respectively. If the smartphone does not detect a connection with the smart TV, it means that the smart TV is not turned on, and the smartphone uses an infrared remote control signal to turn on the smart TV and establish a connection with it; similarly, if the smartphone does not detect a connection with the smart speaker, it means that the smart speaker is not turned on, and the smartphone uses an infrared remote control signal to turn on the smart speaker and establish a connection with it. In this way, a connection between the smart speaker and the smart TV can be established once both are turned on.
  • after the smart phone establishes a connection with the smart TV, it sends the intent of the first control application and the command message obtained by parsing the voice information to the smart TV. After that, the smart TV starts the first control application (including opening the local Huawei video application and establishing a connection with the smart speaker), and plays the XXXX video in the Huawei video application according to the command message. In addition, the audio data of the XXXX video played by the smart TV is sent to the smart speaker, so that the smart speaker outputs the audio data.
  • users can manage applications located in other electronic devices through the voice assistant application, which greatly expands the application scope of the voice assistant application, and can open all applications and electronic devices that meet the user's needs at one time through the voice assistant application.
  • the user can directly operate the voice assistant application on the smartphone side to turn on the smart TV and smart speaker at one time, and open the Huawei video application on the smart TV to play the desired video.
  • Example 4 and Example 5 expand the functions of the voice assistant application, so that the voice assistant application can also manage other electronic devices.
  • in the process of managing other electronic devices, the voice assistant application does not depend on the power-on state of the other electronic devices or the open state of their voice assistants, and can start the corresponding devices by means of the startup of the control application.
  • the present application also provides a control apparatus, which can be applied to the electronic equipment in the above embodiments or examples, and is described below by taking the application to the first electronic equipment as an example.
  • the device can implement the above control method.
  • the control device 1000 includes a communication unit 1001 and a processing unit 1002. The function of each unit is described below.
  • the communication unit 1001 is used for receiving and sending data.
  • the communication unit 1001 may be implemented by a mobile communication module and/or a wireless communication module.
  • a processing unit 1002 configured to acquire command information of a first application, where the first application is located in a second electronic device, and the command information of the first application is used to implement actions of the first application; according to the The command information of the first application generates a control application, wherein the control application is used to make the second electronic device realize the action of the first application.
  • the processing unit 1002 when acquiring the command information of the first application, is specifically configured to:
  • processing unit 1002 is further configured to:
  • the control icon is displayed on the display screen of the first electronic device.
  • the processing unit 1002 when generating the control icon corresponding to the control application, is specifically configured to:
  • the control icon is generated according to the icon information corresponding to the first application.
  • processing unit 1002 is further configured to:
  • the processing unit 1002 when generating the control application according to the command information of the first application, is specifically configured to:
  • acquire command information of a second application, wherein the second application is located in the first electronic device and/or the third electronic device, and the command information of the second application is used to implement actions of the second application;
  • generate the control application according to the command information of the first application and the command information of the second application, wherein the control application is further used to enable the first electronic device and/or the third electronic device to implement the action of the second application;
  • the processing unit 1002 is further configured to: after acquiring the start command of the control application, execute the action of the second application according to the command information of the second application;
  • the processing unit 1002 is further configured to: after acquiring the start command of the control application, send the command information of the second application to the third electronic device through the communication unit 1001, so that the third electronic device executes the action of the second application according to the received command information of the second application.
  • the processing unit 1002 is specifically configured to obtain the start command of the control application in the following manner:
  • Manner 1: detecting a user's operation on the control icon corresponding to the control application, and, in response to the operation, generating a start command of the control application;
  • Manner 2: receiving a user's voice instruction through a voice assistant application, and obtaining the start command of the control application obtained by the voice assistant application parsing the voice instruction.
  • processing unit 1002 is further configured to:
  • the first application is added to the application list managed by the voice assistant application before acquiring the start command of the control application obtained by parsing the voice instruction by the voice assistant application.
  • the processing unit 1002 is further configured to acquire the information of the second electronic device
  • the processing unit 1002 is specifically configured to:
  • send the command information of the first application to the second electronic device through the communication unit 1001.
  • processing unit 1002 is further configured to:
  • a connection is established with the second electronic device.
  • processing unit 1002 is further configured to:
  • a connection is established with the fourth electronic device.
  • the processing unit 1002 is further configured to send a first control request to the second electronic device through the communication unit 1001 before acquiring the command information of the first application, so as to cause the second electronic device to feed back the command information of the first application according to the first control request; or, when acquiring the command information of the first application, the processing unit 1002 is specifically configured to receive, through the communication unit 1001, a second control request from the second electronic device, where the second control request includes the command information of the first application.
  • processing unit 1002 is further configured to:
  • command information of the control application is sent to the fifth electronic device through the communication unit 1001, where the command information of the control application is used to start the control application.
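Purely as an illustration of the division of responsibilities described above, the sketch below maps communication unit 1001 and processing unit 1002 onto two plain Java interfaces wired together by a container class. This is not the patent's implementation; the method names are assumptions chosen only to mirror the functions listed in this section.

```java
// Illustrative sketch only: the functional units of the control device expressed
// as plain Java interfaces. All names here are assumptions for illustration.
public final class ControlDevice1000 {

    /** Communication unit 1001: receives and sends data. */
    public interface CommunicationUnit {
        void send(String deviceId, String payload);
        String receive();
    }

    /** Processing unit 1002: acquires command information and generates the control application. */
    public interface ProcessingUnit {
        void acquireCommandInformation(String intentUri, String deviceId);
        void generateControlApplication();
        void onStartCommand(); // e.g. triggered by a tap on the control icon or a voice command
    }

    private final CommunicationUnit communicationUnit;
    private final ProcessingUnit processingUnit;

    public ControlDevice1000(CommunicationUnit communicationUnit, ProcessingUnit processingUnit) {
        this.communicationUnit = communicationUnit;
        this.processingUnit = processingUnit;
    }

    public CommunicationUnit communicationUnit() { return communicationUnit; }
    public ProcessingUnit processingUnit() { return processingUnit; }
}
```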
  • each functional unit in each embodiment of the present application can be integrated into one processing unit, or can exist physically alone, or two or more units can be integrated into one unit.
  • the above-mentioned integrated units may be implemented in the form of hardware, or may be implemented in the form of software functional units.
  • the integrated unit if implemented in the form of a software functional unit and sold or used as an independent product, may be stored in a computer-readable storage medium.
  • the technical solutions of the present application can be embodied in the form of software products in essence, or the parts that contribute to the prior art, or all or part of the technical solutions, and the computer software products are stored in a storage medium , including several instructions for causing a computer device (which may be a personal computer, a server, or a network device, etc.) or a processor (processor) to execute all or part of the steps of the methods described in the various embodiments of the present application.
  • the aforementioned storage medium includes: U disk, mobile hard disk, read-only memory (ROM), random access memory (RAM), magnetic disk or optical disk and other media that can store program codes .
  • the embodiments of the present application further provide an electronic device, which is used to implement the control methods provided by the above embodiments, and has the functions of the control apparatus 1000 shown in FIG. 10 .
  • the electronic device 1100 includes: a transceiver 1101 , a processor 1102 , a memory 1103 , and a display screen 1104 .
  • the transceiver 1101, the processor 1102, the memory 1103, and the display screen 1104 are connected to each other.
  • the transceiver 1101, the processor 1102, the memory 1103, and the display screen 1104 are connected to each other through a bus.
  • the bus may be a peripheral component interconnect standard (peripheral component interconnect, PCI) bus or an extended industry standard architecture (extended industry standard architecture, EISA) bus or the like.
  • the bus can be divided into an address bus, a data bus, a control bus, and the like. For ease of presentation, only one thick line is used in FIG. 11, but it does not mean that there is only one bus or one type of bus.
  • the transceiver 1101 is used to receive and transmit data, and implement communication with other devices.
  • the transceiver 1101 may be implemented by a radio frequency device and an antenna.
  • the processor 1102 is configured to implement the control methods provided by the above embodiments or examples. For the specific process, reference may be made to the descriptions in the above embodiments or examples, which will not be repeated here.
  • the display screen 1104 is used to display an interface.
  • the processor 1102 may be a central processing unit (central processing unit, CPU), a network processor (network processor, NP), or a combination of CPU and NP, and so on.
  • the processor 1102 may further include hardware chips.
  • the above-mentioned hardware chip may be an application-specific integrated circuit (ASIC), a programmable logic device (PLD) or a combination thereof.
  • the above-mentioned PLD can be a complex programmable logic device (CPLD), a field-programmable gate array (FPGA), a general-purpose array logic (generic array logic, GAL) or any combination thereof.
  • the memory 1103 is used to store program instructions and the like.
  • the program instructions may include program code, and the program code includes computer operation instructions.
  • the memory 1103 may include random access memory (RAM), and may also include non-volatile memory (non-volatile memory), such as at least one disk storage.
  • the processor 1102 executes the program instructions stored in the memory 1103 to implement the above functions, thereby implementing the methods provided by the above embodiments.
  • the embodiments of the present application further provide a computer program, when the computer program runs on a computer, the computer can execute the methods provided by the above embodiments.
  • the embodiments of the present application further provide a computer-readable storage medium, where a computer program is stored in the computer-readable storage medium, and when the computer program is executed by a computer, the computer executes the methods provided by the above embodiments.
  • the storage medium may be any available medium that the computer can access.
  • computer readable media may include RAM, ROM, EEPROM, CD-ROM or other optical disk storage, magnetic disk storage media or other magnetic storage devices, or any other medium capable of carrying or storing desired program code in the form of instructions or data structures and that can be accessed by a computer.
  • an embodiment of the present application further provides a chip, where the chip is used to read a computer program stored in a memory to implement the methods provided by the above embodiments.
  • the embodiments of the present application provide a chip system, where the chip system includes a processor for supporting a computer apparatus to implement the functions involved in the communication device in the above embodiments.
  • the chip system further includes a memory for storing necessary programs and data of the computer device.
  • the chip system may be composed of chips, or may include chips and other discrete devices.
  • the embodiments of the present application provide a control method, an apparatus, and an electronic device.
  • the electronic device can obtain command information of applications located in other electronic devices, and generate a control application according to the command information, so that the user can start the control application to enable other electronic devices to implement the action of the application.
  • the electronic device can realize the cross-device application control function by generating a control application, thereby realizing multi-device collaboration, thereby improving user experience.
  • the embodiments of the present application may be provided as a method, a system, or a computer program product. Accordingly, the present application may take the form of an entirely hardware embodiment, an entirely software embodiment, or an embodiment combining software and hardware aspects. Furthermore, the present application may take the form of a computer program product embodied on one or more computer-usable storage media (including, but not limited to, disk storage, CD-ROM, optical storage, etc.) having computer-usable program code embodied therein.
  • These computer program instructions may also be stored in a computer-readable memory capable of directing a computer or other programmable data processing apparatus to function in a particular manner, such that the instructions stored in the computer-readable memory produce an article of manufacture comprising instruction means, which implements the functions specified in one or more flows of the flowcharts and/or one or more blocks of the block diagrams.

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Multimedia (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Signal Processing (AREA)
  • Software Systems (AREA)
  • Audiology, Speech & Language Pathology (AREA)
  • Health & Medical Sciences (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Acoustics & Sound (AREA)
  • Computational Linguistics (AREA)
  • General Health & Medical Sciences (AREA)
  • Databases & Information Systems (AREA)
  • Artificial Intelligence (AREA)
  • Telephone Function (AREA)
  • User Interface Of Digital Computer (AREA)
  • Stored Programmes (AREA)
  • Two-Way Televisions, Distribution Of Moving Picture Or The Like (AREA)

Abstract

一种控制方法、装置及电子设备。在本方法中,电子设备可以获取位于其他电子设备的应用的命令信息,并根据命令信息,生成控制应用,从而用户可以通过启动控制应用,以使其他电子设备实现应用的动作。通过本方法,电子设备可以通过生成控制应用的方式,实现跨设备应用控制功能,从而实现多设备协同,进而提高用户体验。

Description

一种控制方法、装置及电子设备
相关申请的交叉引用
本申请要求在2020年10月30日提交中国专利局、申请号为202011193906.1、申请名称为“一种控制方法、装置及电子设备”的中国专利申请的优先权,其全部内容通过引用结合在本申请中。
技术领域
本申请涉及电子设备技术领域,尤其涉及一种控制方法、装置及电子设备。
背景技术
随着全场景多种类电子设备的普及,使多设备协同成为发展趋势。然而,目前要实现多设备协同,需要用户分别对多设备协同所涉及的每个电子设备分别进行操作,这个过程繁琐、操作复杂,用户体验不高。
发明内容
本申请提供了一种控制方法、装置及电子设备,用以实现跨设备应用控制功能。
第一方面，本申请实施例提供了一种控制方法，该方法可以应用于包含多个能够相互通信的电子设备的各种应用场景。下面以第一电子设备为例，对该方法的步骤进行说明，该方法包括：
第一电子设备获取第一应用的命令信息之后,根据所述第一应用的命令信息,生成控制应用;其中,所述第一应用位于第二电子设备中,所述第一应用的命令信息用于实现所述第一应用的动作;所述控制应用用于使所述第二电子设备实现所述第一应用的动作。示例性的,所述第一应用的命令信息可以为所述第一应用的意图(Intent)。
在本方法中,第一电子设备可以根据位于其他电子设备的应用的命令信息,生成控制应用,从而用户可以通过启动该控制应用,以使其他电子设备实现该应用的动作。显然通过该方法,电子设备可以通过生成控制应用的方式,实现跨设备应用控制功能,从而实现多设备协同,进而提高用户体验。
在一种可能的设计中,所述第一电子设备可以通过以下方式获取所述第一应用的命令信息:
方式一:接收来自所述第二电子设备的所述第一应用的命令信息。
方式二:获取用户输入的所述第一应用的命令信息。
通过该方法,所述第一电子设备可以灵活地通过多种方式获取位于其他电子设备的应用的命令信息。
在一种可能的设计中,所述第一电子设备还可以生成所述控制应用对应的控制图标,并在显示屏中显示所述控制图标。
通过该方法，所述第一电子设备可以生成控制图标，便于用户通过点击所述控制图标启动所述控制应用。
在一种可能的设计中,所述第一电子设备可以但不限于通过以下方式,生成所述控制图标:
方式一:所述第一电子设备获取所述第一应用对应的图标信息,并根据所述第一应用对应的图标信息,生成所述控制图标。
方式二:所述第一电子设备可以根据预设的图片或用户选择的图片,生成所述控制图标。
通过该方法,所述第一电子设备生成控制应用对应的控制图标。
在一种可能的设计中,所述第一电子设备在获取所述控制应用的启动命令之后,还可以向所述第二电子设备发送所述第一应用的命令信息,以使所述第二电子设备根据接收的所述第一应用的命令信息执行所述第一应用的动作。
通过该方法，所述第一电子设备可以通过向所述第二电子设备发送所述第一应用的命令信息，以实现跨设备应用控制。
在一种可能的设计中,所述第一电子设备可以通过以下步骤生成控制应用,包括:
获取第二应用的命令信息,其中,所述第二应用位于所述第一电子设备和/或第三电子设备中,所述第二应用的命令信息用于实现所述第二应用的动作;
根据所述第一应用的命令信息和所述第二应用的命令信息,生成所述控制应用,其中,所述控制应用还用于使所述第一电子设备和/或所述第三电子设备实现所述第二应用的动作。
当所述第二应用位于所述第一电子设备时，所述第一电子设备在获取所述控制应用的启动命令之后，还可以根据所述第二应用的命令信息，执行所述第二应用的动作；当所述第二应用位于所述第三电子设备时，所述第一电子设备在获取所述控制应用的启动命令之后，还可以向所述第三电子设备发送所述第二应用的命令信息，以使所述第三电子设备根据接收到的第二应用的命令信息执行所述第二应用的动作。
通过该设计,所述控制应用不仅能够使所述第二电子设备实现所述第一应用的动作,还能够使所述第一电子设备或第三电子设备实现所述第二应用的动作。需要说明的是,本申请实施例不限定需要协同控制的电子设备的数量,也不限定需要协同控制的应用的数量。
在该设计中,所述控制应用可以融合位于多个电子设备中的多个应用。通过该设计,所述第一电子设备可以通过启动所述控制应用,实现多个电子设备的应用控制功能。
在一种可能的设计中,所述第一电子设备可以通过以下方式获取所述控制应用的启动命令:
方式一:检测到用户对所述控制应用对应的控制图标的操作;响应于所述操作,生成所述控制应用的启动命令;
方式二:通过语音助手应用接收用户的语音指令;获取所述语音助手应用对所述语音指令进行解析得到的所述控制应用的启动命令。
通过该设计,用户可以通过多种操作,灵活地启动所述控制应用。
在一种可能的设计中,所述第一电子设备在获取所述语音助手应用对所述语音指令进行解析得到的所述控制应用的启动命令之前,还需要在所述语音助手应用所管理的应用列表中添加所述第一应用。
通过该设计,可以扩展语音助手应用的使用范围,通过语音助手应用,用户可以基于语音指令打开位于其他电子设备中的应用。
在一种可能的设计中,所述第一电子设备在获取所述第一应用的命令信息时,还可以获取所述第二电子设备的信息;这样,所述第一电子设备可以根据所述第二电子设备的信息,向所述第二电子设备发送所述第一应用的命令信息。
在一种可能的设计中,所述第一电子设备在向所述第二电子设备发送所述第一应用的命令信息之前,若确定与所述第二电子设备未建立连接,则向所述第二电子设备发送开机信号;并在所述第二电子设备开机之后,与所述第二电子设备建立连接。
通过该设计,所述第一电子设备还可以自动完成其他电子设备的开启,以及和其他电子设备建立连接,从而减少了用户在协同控制过程中的操作,提高了用户体验。
在一种可能的设计中,所述第一电子设备还可以获取所述第二电子设备关联的第四电子设备的信息。在该情况下,所述第一电子设备在向所述第二电子设备发送所述第一应用的命令信息之前,若确定与所述第四电子设备未建立连接,则向所述第四电子设备发送开机信号;并在所述第四电子设备开机之后,与所述第四电子设备建立连接。
通过该设计,所述第一电子设备还可以控制与所述第二电子设备相关联的第四电子设备的开启,从而使所述第四电子设备与所述第二电子设备在开机状态下可以自动建立连接,从而保证所述第二电子设备可以与所述第四电子设备配合实现所述第一应用的动作。
在一种可能的设计中,所述第一电子设备可以通过以下方式,获取所述第一应用的命令信息:
方式一:向所述第二电子设备发送第一控制请求,以使所述第二电子设备根据所述第一控制请求反馈所述第一应用的命令信息;接收所述第二电子设备发送的所述第一应用的命令信息;
方式二:接收来自所述第二电子设备的第二控制请求,所述第二控制请求中包含所述第一应用的命令信息。
通过该设计,所述第一电子设备可以通过多种方式,获取所述第一应用的命令信息。
在一种可能的设计中,所述第一电子设备还可以在生成控制应用之后,向第五电子设备发送所述控制应用的命令信息,所述控制应用的命令信息用于启动所述控制应用。这样所述第五电子设备可以根据所述控制应用的命令信息,在所述第五电子设备处生成新的控制应用。
第二方面,本申请实施例还提供了一种控制装置,该控制装置可以应用于电子设备中,包括用于执行上述第一方面各个步骤的单元或模块。
第三方面,本申请提供一种电子设备,包括至少一个处理元件和至少一个存储元件,其中所述至少一个存储元件用于存储程序和数据,所述至少一个处理元件用于执行本申请第一方面中提供的方法。
第四方面,本申请实施例中还提供一种计算机存储介质,该存储介质中存储软件程序,该软件程序在被一个或多个处理器读取并执行时可实现第一方面或其中任意一种设计提供的方法。
第五方面,本申请实施例还提供一种包含指令的计算机程序产品,当其在计算机上运行时,使得计算机执行上述第一方面或其中任一种设计提供的方法。
第六方面，本申请实施例提供了一种芯片系统，该芯片系统包括处理器，用于支持电子设备实现上述第一方面中所涉及的功能。在一种可能的设计中，所述芯片系统还包括存储器，所述存储器，用于保存电子设备必要的程序指令和数据。该芯片系统，可以由芯片构成，也可以包含芯片和其他分立器件。
附图说明
图1为本申请实施例提供的一种应用场景示意图;
图2为本申请实施例提供的一种电子设备的结构图;
图3为本申请实施例提供的一种电子设备的软件架构图;
图4A为本申请实施例提供的一种控制应用的生成过程示意图;
图4B为本申请实施例提供的一种控制应用的启动过程示意图;
图5为本申请实施例提供的一种控制方法的流程图;
图6为本申请实施例提供的一种控制方法的实例示意图;
图7为本申请实施例提供的另一种控制方法的实例示意图;
图8为本申请实施例提供的又一种控制方法的实例示意图;
图9为本申请实施例提供的再一种控制方法的实例示意图;
图10为本申请实施例提供的一种控制装置的结构图;
图11为本申请实施例提供的一种电子设备的结构图。
具体实施方式
本申请提供一种控制方法、装置及电子设备,用以实现跨设备应用控制功能,进而实现多设备协同。其中,方法和电子设备是基于同一技术构思的,由于方法与装置、电子设备解决问题的原理相似,因此装置、电子设备与方法的实施可以相互参见,重复之处不再赘述。
在本申请实施例提供的方案中，电子设备可以获取位于其他电子设备的应用的命令信息，并根据该命令信息，生成控制应用，从而用户可以通过启动该控制应用，以使其他电子设备实现该应用的动作。显然通过该方法，电子设备可以通过生成控制应用的方式，实现跨设备应用控制功能，从而实现多设备协同，进而提高用户体验。
以下,对本申请中的部分用语进行解释说明,以便于本领域技术人员理解。
1)、电子设备,为具有数据连通功能、数据计算和处理功能的设备或装置。例如,所述电子设备可以为手机、平板电脑、笔记本电脑、上网本、车载设备,以及商务智能终端(包括:可视电话、会议桌面智能终端等)、个人数字助理(personal digital assistant,PDA)、增强现实(augmented reality,AR)\虚拟现实(virtual reality,VR)设备等,本申请对所述电子设备的具体形态不作限定。
2)、应用(application,APP),用于安装在电子设备中,具有向用户提供服务的功能。例如,具有提供拍摄服务功能的相机应用,具有提供聊天服务功能的微信应用、QQ应用等,具有提供视频服务功能的爱奇艺应用、腾讯视频应用等,具有提供音乐服务功能的QQ音乐应用等。应用可以是电子设备的生产厂商开发,或者为电子设备的操作系统的供应商开发,或者由第三方应用厂商开发,本申请对此不作限定。
3)、应用的命令信息,用于实现该应用的动作(即功能、服务、任务、操作等)。在一些实施方式中,应用的命令信息可以包含该动作的相关信息,具体包括:该动作的类型、所涉及的数据、附加数据等。
在安卓（Android）系统中，应用的命令信息可以用意图（Intent）表示。Android系统可以通过Intent机制来协助应用之间的交互与通讯。Intent为对应用中需要执行的动作的抽象描述，可以作为应用的参数。Android系统可以根据Intent的描述，负责找到对应的组件，将Intent传输给该需要调用的组件，从而完成组件的调用。Intent不仅可以用于应用之间，还可以应用于应用内部的活动（Activity）和服务（Service）之间的交互。因此，Intent起着媒介的作用，提供组件互相调用的相关信息，实现调用者和被调用者之间的解耦。Intent的表现形式包含：
启动Activity、启动Service、绑定Activity和Service以建立二者之间的通信，以及发送广播（Broadcast）。
其中，发送广播时，可以通过广播函数Context.sendBroadcast()/Context.sendOrderedBroadcast()/Context.sendStickyBroadcast()将Intent发给Broadcast Receivers。
Intent信息的主要属性包括:执行的动作类型(Action)、操作数据(Data);其次要属性(即Intent信息包含的附加数据)包含:类别(category)、数据类型(type)、组件(component)、附加信息(extras)。
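作为参考，下面给出一段基于公开Android API的示意性代码，展示上述主要属性（Action、Data）与次要属性（category、component、extras等）如何设置在一个Intent对象上；其中的包名“com.example.video”、组件名“PlayerActivity”以及附加信息的键值均为举例假设，并非对本申请实现方式的限定：

```java
import android.content.ComponentName;
import android.content.Intent;
import android.net.Uri;

public class IntentAttributeDemo {
    /** 构造一个携带主要属性与次要属性的Intent（仅作属性示意）。 */
    public static Intent buildSampleIntent() {
        Intent intent = new Intent();
        // 主要属性：执行的动作类型（Action）与操作数据（Data）
        intent.setAction(Intent.ACTION_VIEW);
        intent.setData(Uri.parse("content://media/external/video/media/1"));
        // 次要属性：类别（category）、组件（component）、附加信息（extras）
        intent.addCategory(Intent.CATEGORY_DEFAULT);
        intent.setComponent(new ComponentName("com.example.video", "com.example.video.PlayerActivity"));
        intent.putExtra("title", "demo");
        return intent;
    }
}
```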
4)、多个,是指两个或两个以上。至少一个是指一个和多个。
5)、“和/或”,描述关联对象的关联关系,表示可以存在三种关系,例如,A和/或B,可以表示:单独存在A,同时存在A和B,单独存在B这三种情况。字符“/”一般表示前后关联对象是一种“或”的关系。
另外,需要理解的是,在本申请的描述中,“第一”、“第二”等词汇,仅用于区分描述的目的,而不能理解为指示或暗示相对重要性,也不能理解为指示或暗示顺序。
下面说明本申请实施例可以使用的应用场景架构图,参阅图1所示,该应用场景中包含多个电子设备。
在该应用场景中,不同电子设备之间能够通过通信网络进行通信。示例性的,所述通信网络可以为局域网(例如家庭局域网、智能家居局域网等)。又例如,所述通信网络还可以为通过无线保真(wireless-fidelity,Wi-Fi)、蓝牙(Bluetooth,BT)、近距离无线通信技术(near field communication,NFC)、红外技术(infrared,IR)、直连连接(sidelink)通信技术等技术形成的网络,本申请对此不做限定。其中,在该应用场景中,在两个电子设备之间在同一通信网络中建立过通信连接的情况下,当二者同时处于开机状态,且该通信网络通信正常,那么二者均可以自动接入通信网络,并建立二者之间的连接。
需要说明的是,所述应用场景中的任一个电子设备可以作为具有控制功能的控制设备;同样的,任一个电子设备也可以作为协同设备,被控制设备进行应用控制。
示例性的,在图1所示的智能家居系统应用场景中,智能手机可以控制智能电视、智能音响、笔记本电脑、可穿戴设备等电子设备中的至少一项。
可以理解的是,本发明实施例描述的上述应用场景是为了更加清楚的说明本发明实施例的技术方案,并不构成对于本发明实施例提供的技术方案的限定,本领域普通技术人员可知,随着网络架构的演变和新业务的出现,本发明实施例提供的技术方案对于类似的技术问题,同样适用。例如,该方法还可以适用于各种其他应用场景,例如:车到万物(vehicle to everything,V2X)、长期演进-车联网(LTE-vehicle,LTE-V)、车到车(vehicle to vehicle,V2V)、车联网、机器类通信(Machine Type Communications,MTC)、物联网(internet of  things,IoT)、长期演进-机器到机器(LTE-machine to machine,LTE-M)、机器到机器(machine to machine,M2M)等应用场景中。
本申请实施例提供的控制方法可以适用于如图1所示的应用场景中的任一个电子设备中,下面对该电子设备的结构进行说明。图2示出了本申请实施例提供方法适用的可能的电子设备的结构图。参阅图2所示,电子设备200中包含:通信单元201、处理器202、存储器203、显示单元204、输入单元205、音频电路206、传感器207、摄像头208等部件。下面结合图2对所述电子设备200的各个构成部件进行具体的介绍。
通信单元201用于实现所述电子设备200的功能,实现与其他设备的数据通信。可选的,所述通信单元201中可以包含无线通信模块2011和移动通信模块2012。除了所述通信单元201,所述电子设备200还需要配合天线、处理器202中的调制解调处理器和基带处理器等部件实现通信功能。
无线通信模块2011可以提供应用在电子设备上的包括无线局域网(wireless local area networks,WLAN)(如无线保真(wireless fidelity,Wi-Fi)网络),蓝牙(Bluetooth,BT),全球导航卫星系统(global navigation satellite system,GNSS),调频(frequency modulation,FM),近距离无线通信技术(near field communication,NFC),红外技术(infrared,IR)等无线通信的解决方案。无线通信模块2011可以是集成至少一个通信处理模块的一个或多个器件。无线通信模块2011经由天线接收电磁波,将电磁波进行信号调频以及滤波处理,将处理后的信号发送到处理器202。无线通信模块2011还可以从处理器202接收待发送的信号,对其进行调频、放大,经天线转为电磁波辐射出去。
移动通信模块2012可以提供应用在电子设备上的包括2G/3G/4G/5G等移动通信的解决方案。移动通信模块2012可以包括至少一个滤波器,开关,功率放大器,低噪声放大器(low noise amplifier,LNA)等。移动通信模块2012可以由天线接收电磁波,并对接收的电磁波进行滤波,放大等处理,传送至调制解调处理器进行解调。移动通信模块2012还可以对经调制解调处理器调制后的信号放大,经天线转为电磁波辐射出去。在一些实施例中,移动通信模块2012的至少部分功能模块可以被设置于处理器202中。在一些实施例中,移动通信模块2012的至少部分功能模块可以与处理器202的至少部分模块被设置在同一个器件中。
所述电子设备200可以根据所述移动通信模块2012与移动通信系统中的基站建立无线连接,并通过所述移动通信模块2012接受移动通信系统的服务。
在所述电子设备200实现本申请实施例提供的控制方法时,所述电子设备200可以通过所述通信单元201中的无线通信模块2011或移动通信模块2012,向其他电子设备发送应用的命令信息,或接收其他电子设备的应用的命令信息等;还可以向其他电子设备发送开机信号或接收其他电子设备的开机信号。
所述通信单元201中还可以包括通信接口,用于所述电子设备200与其他设备实现物理连接。所述通信接口可以与所述其他设备的通信接口通过电缆连接,实现所述终端设备200和其他设备之间的数据传输。
所述存储器203可用于存储软件程序以及数据。所述处理器202通过运行存储在所述存储器203的软件程序以及数据，从而执行所述终端设备200的各种功能以及数据处理。在本申请实施例中，所述软件程序可以为实现控制方法的控制程序，以及各种应用的程序等。
可选的，所述存储器203可以主要包含存储程序区和存储数据区。其中，存储程序区可存储操作系统、各种软件程序等；存储数据区可存储用户输入或者所述终端设备200在运行软件程序过程中创建的数据等。其中，所述操作系统可以为安卓（Android）等操作系统。此外，所述存储器203可以包括高速随机存取存储器，还可以包括非易失性存储器，例如至少一个磁盘存储器件、闪存器件、或其他非易失性固态存储器件。例如，在本申请实施例中，实现控制方法的控制程序，以及各种应用的程序等可以存储在存储程序区中，应用的命令信息，以及图标等数据可以存储在存储数据区中。
所述输入单元205可用于接收用户输入的字符信息以及信号。可选的，输入单元205可包括触控面板2051以及其他输入设备（例如功能键）。其中，所述触控面板2051，也称为触摸屏，可收集用户在其上或附近的触摸操作，生成相应的触摸信息发送给处理器202，以使处理器202执行该触摸信息对应的命令。触控面板2051可以采用电阻式、电容式、红外线以及表面声波等多种类型实现。例如，在本申请实施例中，用户可以通过所述触控面板2051选择需要融合或启动的应用。
所述显示单元204用于呈现用户界面，实现人机交互。例如，所述显示单元204可以显示由用户输入的信息，或提供给用户的信息，以及所述终端设备200的各种菜单、各个主界面（包含各种应用的图标），各个应用的窗口等内容。在本申请实施例中，所述处理器202可以在所述显示单元204中显示各种应用的图标。
所述显示单元204可以包括显示面板2041,所述显示面板2041可以采用液晶显示屏(liquid crystal display,LCD)、有机发光二极管(organic light-emitting diode,OLED)等形式来配置。
需要说明的是，所述触控面板2051可覆盖所述显示面板2041，虽然在图2中，所述触控面板2051与所述显示面板2041是作为两个独立的部件来实现所述电子设备200的输入和输出功能，但是在本申请实施例中，可以将所述触控面板2051与所述显示面板2041集成（即触摸显示屏）而实现所述电子设备200的输入和输出功能。
所述处理器202是所述电子设备200的控制中心,利用各种接口和线路连接各个部件,通过运行或执行存储在所述存储器203内的软件程序和/或模块,以及调用存储在所述存储器203内的数据,执行所述电子设备200的各种功能和处理数据,从而实现所述电子设备200的多种业务。例如,所述处理器202可以运行存储在所述存储器203中的控制程序,实现本申请实施例提供的控制方法,生成控制应用。另外,在生成控制应用后,所述处理器202还可以在获取到所述控制应用的启动命令之后,控制通信单元201向其他电子设备发送应用的命令信息。
可选的,所述处理器202可包括一个或多个处理单元。所述处理器202可集成应用处理器、调制解调处理器、基带处理器,图形处理器(graphics processing unit,GPU)等,其中,应用处理器主要处理操作系统、用户界面和应用程序等,调制解调处理器主要处理无线通信。可以理解的是,上述调制解调处理器也可以不集成到所述处理器202中。
所述音频电路206（包括扬声器2061，麦克风2062）可提供用户与所述终端设备200之间的音频接口。音频电路206可将接收到的音频数据转换后的电信号，传输到所述扬声器2061，由所述扬声器2061转换为声音信号输出。另一方面，所述麦克风2062将收集的声音信号转换为电信号，由所述音频电路206接收后转换为音频数据，以进行传输或存储等进一步处理。在本申请实施例中，电子设备200中的语音助手应用可以通过麦克风2062采集用户的语音指令，从而解析该语音指令，得到相应的命令。
所述电子设备200还可以包括一种或多种传感器207,比如光传感器、运动传感器、超声波传感器以及其他传感器。所述电子设备200可以根据所述传感器207采集的实时传感器数据,实现各种功能。
所述电子设备200内部还可以包括摄像头208,以采集图像。
本领域技术人员可以理解，图2中示出的终端设备的结构并不构成对终端设备的限定，本申请实施例提供的终端设备可以包括比图示更多或更少的部件，或者组合某些部件，或者不同的部件布置。
本申请提供的电子设备的软件系统可以采用分层架构、事件驱动架构,微核架构、微服务架构,或云架构。本申请实施例以分层架构的安卓(Android)系统为例,示例性说明电子设备的软件结构。
图3示出了本申请实施例提供的电子设备的软件结构框图。如图3所示,电子设备的软件结构可以是分层架构,例如可以将软件分成若干个层,每一层都有清晰的角色和分工。层与层之间通过软件接口通信。在一些实施例中,将Android系统分为四层,从上至下分别为应用程序层,框架层(framework,FWK),安卓运行时(Android runtime)和系统库,以及内核层。
应用程序层可以包括一系列应用程序。如图3中所示,应用程序层可以包括相机应用、语音助手应用、桌面管理(例如华为桌面HuaWei Launcher)应用、音乐应用、视频应用、地图应用,以及第三方应用程序等。其中,第三方应用程序可以包括微信应用、爱奇艺应用等。
框架层为应用程序层中的应用程序提供应用编程接口（application programming interface，API）和编程框架。应用程序框架层可以包括一些预先定义的函数。如图3所示，应用程序框架层可以包括：系统服务（System Service）、视图系统（View System）、网页服务（Web Service），电话管理器，资源管理器等。
其中,系统服务中可以包含窗口管理服务(window manager service,WMS)、活动管理服务(activity manager service,AMS)。其中,在本申请实施例中,所述系统服务中还可以增加一个新的系统级服务——远端系统服务(remote system service)。下面分别对系统服务中的各个服务进行说明。
窗口管理服务,为窗口(window)提供窗口管理服务,具体控制所有窗口的显示、隐藏以及窗口在显示屏中的位置。窗口管理服务具体可以负责以下功能:1、为每个窗口分配显示平面(surface);2、管理surface的显示顺序、尺寸、位置;3、通过调用管理函数(例如surface控制函数(SurfaceControl.Transaction)),调节窗口的透明度、拉伸系数、位置和尺寸,实现窗口的动画效果;4、与输入系统相关,例如当电子设备接收到一个触摸事件时,电子设备可以通过窗口管理服务为用户提供一个合适的窗口来显示或处理这个消息。
活动管理服务，为应用中的活动（activity）提供管理服务。所述活动管理服务可以但不限于负责以下功能：1、统一调度所有应用的活动的生命周期；2、启动或结束应用的进程；3、启动并调度服务的生命周期；4、注册广播接收器（Broadcast Receiver），并接收和分发广播（Broadcast）；5、查询系统当前运行状态；6、调度任务（task）。
远端系统服务，用于实现本申请实施例中控制方法中不同电子设备之间的信令、应用的命令信息的交互等。例如，电子设备可以通过该远端系统服务向其他电子设备发送第一控制请求，以使其他电子设备根据该第一控制请求反馈需要控制的应用的命令信息，并在后续可以通过该远端系统服务接收该其他电子设备发送的应用的命令信息。又例如，电子设备可以接收其他电子设备发送的第二控制请求（包含应用的命令信息）。再例如，当电子设备在获取待控制应用的启动命令之后，还可以通过远端系统服务向其他电子设备发送应用的命令信息。
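为便于理解远端系统服务所承担的信令与命令信息交互，下面给出一个概念性的接口草图；接口名RemoteSystemService及各方法名均为本说明用途的假设，并非Android系统已有的API，命令信息的序列化方式也仅为一种可能的选择：

```java
/** 远端系统服务的概念性接口草图（接口名、方法名与参数均为假设，仅示意交互流程）。 */
public interface RemoteSystemService {

    /** 与对端电子设备的远端系统服务通过验证建立连接，返回是否连接成功。 */
    boolean connect(String deviceId);

    /** 向对端发送第一控制请求，请求对端反馈待协同控制的应用的命令信息（意图）。 */
    void sendControlRequest(String deviceId, String appName);

    /** 接收对端通过控制请求/控制响应携带的应用命令信息（例如以Intent URI字符串序列化）。 */
    void onCommandInfoReceived(String deviceId, byte[] serializedIntent);

    /** 控制应用启动后，向对端发送相应应用的命令信息，使对端执行该应用的动作。 */
    void sendCommandInfo(String deviceId, byte[] serializedIntent);
}
```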
视图系统中包括可视控件，例如显示文字的控件，显示图片的控件等。视图系统可用于构建应用程序。界面可以由一个或多个控件组成。例如，包括短信通知图标的界面，可以包括显示文字的控件以及显示图片的控件。
网页服务(Web Service),为能够通过网页进行调用的API。电话管理器用于提供电子设备的通信功能。例如通话状态的管理(包括接通,挂断等)。资源管理器为应用程序提供各种资源,比如本地化字符串,图标,图片,布局文件,视频文件等等。
Android runtime包括核心库(Kernel Library)和虚拟机。Android runtime负责安卓系统的调度和管理。其中,核心库包含两部分:一部分是java语言需要调用的功能函数,另一部分是安卓系统的核心库,用于为安卓系统提供输入/输出服务(Input/Output Service)和核心服务(Kernel Service)。应用程序层和框架层可以运行在虚拟机中。虚拟机将应用程序层和框架层的java文件执行为二进制文件。虚拟机用于执行对象生命周期的管理,堆栈管理,线程管理,安全和异常的管理,以及垃圾回收等功能。
系统库可以包括多个功能模块。例如:图标管理模块、融合应用管理模块、媒体库(media libraries),图像处理库等。
控制应用管理模块,用于确定位于本地的、用户选择的需要控制的应用的命令信息;或者,根据获取的位于其他电子设备的应用的命令信息,生成控制应用(还可以称为融合应用)。
图标管理模块,用于在生成控制应用的过程中,相应的生成控制应用的控制图标。
媒体库支持多种格式的音频、视频的回放和录制,以及支持打开多种格式的静态图像等。媒体库可以支持多种音视频编码格式,例如:MPEG4,H.264,MP3,AAC,AMR,JPG,PNG等。
内核层是硬件和软件之间的层。内核层至少包含显示驱动,传感器驱动、处理器驱动、摄像头驱动,音频驱动等,用于驱动硬件层中的硬件。
硬件层可以包括各类传感器、显示屏、处理器、输入设备、内存、摄像头等。
为了实现电子设备的跨设备应用控制和多设备协同,本申请实施例提供了一种控制方法,该方法可以适用于如图1所示的具有多个电子设备的应用场景中,该方法具体包含控制应用生成和控制应用启动两个过程。下面基于图3所示的电子设备的软件结构,以实现第一电子设备协同控制自身的第一应用和第二电子设备的第二应用为例,并结合图4A和图4B分别对控制应用生成和控制应用启动两个过程进行详细说明。
如图4A所示,控制应用生成的过程包括以下步骤:
用户希望在第一电子设备上生成第一应用和第二应用的控制应用时，需要用户操作第二电子设备，在第二电子设备上选择待协同控制的第二应用，以及选择待协同控制的设备为第一电子设备。
第二电子设备根据用户的操作,将第二应用的图标2(即Icon2)和意图2(即Intent2)传输至自身的远端系统服务;其中,此时的图标2和意图2可以通过多种网络通信方式传输至该远端系统服务,例如,广播(即Broadcast)、套接字(Socket)等。第二电子设备的远端系统服务在与第一电子设备的远端系统服务通过验证建立连接之后,向第一电子设备的远端系统服务发送控制请求,其中,所述控制请求中包含第二应用的图标2和意图2,以及第二电子设备的信息。可选的,所述控制请求中还可以包含第二应用的相关信息(例如应用名称、应用功能信息等)。
第一电子设备通过远端系统服务接收第二电子设备的控制请求后,可以提示用户是否选择本地的应用与该第二应用进行协同控制(即融合/组合),并将控制请求中的各项信息传输给自身的活动管理服务(AMS)。在用户操作第一电子设备选择待协同的第一应用后,第一电子设备根据用户的操作,将第一应用的图标1(即Icon1)和意图1(即Intent1)传输至活动管理服务;示例性的,所述图标1和意图1也可以通过广播或套接字等网络通信方式传输。第一电子设备的活动管理服务将接收的第一应用的图标1、意图1,第二应用的图标2和意图2,以及第二电子设备的信息传输到第一电子设备的桌面管理功能(例如桌面管理应用(HuaWei Launcher))中;示例性的,上述各项信息也可以通过广播或套接字等网络通信方式传输至该桌面管理功能。桌面管理功能根据接收的上述各项信息,生成控制应用和控制图标,如图中所示,具体包括以下步骤:
1、桌面管理功能根据第一应用的意图1和第二应用的意图2,生成控制应用。
2、桌面管理功能根据第一应用的图标1和第二应用的图标2进行图标重绘(例如合并绘制),生成控制图标。在一些其他实施例中,桌面管理功能还可以根据预设的图片或用户选择的图片,生成控制图标。
3、桌面管理功能将控制图标与控制应用关联(即实现控制图标与意图1和意图2的关联),并将控制图标与所述第二电子设备的信息关联。
需要说明的是，控制图标实际上为一个新的类型的快捷方式，单个图标可以对应位于至少一个电子设备中的应用的意图。
最后,第一电子设备可以将生成的控制应用和控制图标存储到数据库中。
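下面以一段简化的Java草图示意上述融合过程：根据意图1、意图2生成控制应用记录，并将重绘后的控制图标与各意图及第二电子设备的信息关联。其中ControlAppBuilder、ControlApp等类名、字段以及图标重绘的具体方式均为举例假设，实际实现（例如桌面管理功能中的数据结构与持久化方式）不限于此：

```java
import android.content.Intent;
import android.graphics.Bitmap;
import java.util.ArrayList;
import java.util.List;

public class ControlAppBuilder {

    /** 控制应用记录：关联若干意图、各意图所在设备的标识以及控制图标。 */
    public static class ControlApp {
        public final List<Intent> intents = new ArrayList<>();
        public final List<String> deviceIds = new ArrayList<>(); // 本地应用可记为"local"
        public Bitmap controlIcon;                               // 重绘得到的控制图标
    }

    /** 根据本地第一应用与第二电子设备上第二应用的意图、图标，生成控制应用。 */
    public static ControlApp build(Intent intent1, Bitmap icon1,
                                   Intent intent2, Bitmap icon2,
                                   String remoteDeviceId) {
        ControlApp app = new ControlApp();
        app.intents.add(intent1);
        app.deviceIds.add("local");
        app.intents.add(intent2);
        app.deviceIds.add(remoteDeviceId);          // 控制图标与第二电子设备的信息关联
        app.controlIcon = mergeIcons(icon1, icon2); // 图标重绘（例如合并绘制）
        return app;
    }

    private static Bitmap mergeIcons(Bitmap icon1, Bitmap icon2) {
        // 占位实现：实际可按合并、组合或分层等方式重绘，此处直接返回图标1
        return icon1;
    }
}
```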
如图4B所示，控制应用启动的过程包括以下步骤：
用户希望启动第一电子设备的控制应用时,需要用户操作第一电子设备,在第一电子设备上点击控制图标。
第一电子设备根据用户的操作，通知内部的活动管理服务（AMS）启动融合活动（StartMultiActivity）。第一电子设备的活动管理服务确定本次启动为跨设备的控制应用的启动，因此，根据控制应用关联的意图（意图1和意图2），确定意图1对应的第一应用位于本地，并确定意图2对应的第二应用位于第二电子设备中。第一电子设备的活动管理服务通过启动活动（StartActivity），直接启动第一应用，以实现意图1对应的动作；另外对于第二电子设备，第一电子设备首先检查是否与第二电子设备建立连接，若确定未建立连接（表示第二电子设备未开机），则第一电子设备向第二电子设备发送开机信号（例如红外信号），这样当第二电子设备开机后则会自动与第一电子设备建立连接；在第一电子设备确定与第二电子设备建立连接时，通知内部的活动管理服务（AMS）启动远端活动（StartRemoteActivity）。第一电子设备的活动管理服务启动远端系统服务，以使第一电子设备的远端系统服务根据第二电子设备的信息，将意图2发送到第二电子设备的远端系统服务。
第二电子设备的远端系统服务接收到意图2之后,将意图2发送给内部的活动管理服务(AMS)。第二电子设备的活动管理服务通过启动活动(StartActivity),启动第二应用,以实现意图2对应的动作。
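下面给出控制应用启动时本地/远端分发逻辑的一个简化草图，沿用前文假设的ControlAppBuilder.ControlApp与RemoteSystemService类型；startMultiActivity、sendPowerOnSignal等方法名均为说明用途的假设，Intent的序列化此处示例性地采用Intent.toUri()：

```java
import android.content.Context;
import android.content.Intent;
import java.nio.charset.StandardCharsets;

public class MultiActivityLauncher {

    private final Context context;
    private final RemoteSystemService remoteService; // 见前文概念性接口草图

    public MultiActivityLauncher(Context context, RemoteSystemService remoteService) {
        this.context = context;
        this.remoteService = remoteService;
    }

    /** 启动控制应用：本地意图直接启动，远端意图经远端系统服务发送至对应设备。 */
    public void startMultiActivity(ControlAppBuilder.ControlApp app) {
        for (int i = 0; i < app.intents.size(); i++) {
            Intent intent = app.intents.get(i);
            String deviceId = app.deviceIds.get(i);
            if ("local".equals(deviceId)) {
                // 意图对应的应用位于本地，按普通流程启动（StartActivity）
                intent.addFlags(Intent.FLAG_ACTIVITY_NEW_TASK);
                context.startActivity(intent);
            } else {
                // 意图对应的应用位于其他电子设备：未连接则先发送开机信号再建立连接
                if (!remoteService.connect(deviceId)) {
                    sendPowerOnSignal(deviceId);
                    remoteService.connect(deviceId);
                }
                remoteService.sendCommandInfo(deviceId, serialize(intent));
            }
        }
    }

    private void sendPowerOnSignal(String deviceId) {
        // 省略：例如通过红外信号唤醒对应设备（假设的辅助方法）
    }

    private byte[] serialize(Intent intent) {
        // 一种可能的序列化方式：转为Intent URI字符串
        return intent.toUri(Intent.URI_INTENT_SCHEME).getBytes(StandardCharsets.UTF_8);
    }
}
```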
通过本申请实施例,第一电子设备可以根据位于第二电子设备的应用的意图,生成控制应用,并在用户启动控制应用时,向第二电子设备发送该意图,以使第二电子设备实现该意图对应的动作,最终实现跨设备应用控制,进而实现多设备协同。
需要说明的是,在本申请实施例中关于应用的意图的描述可以参考前述内容中对应用的命令信息的用语解释中的描述,此处不再赘述。
本申请实施例提供了另一种控制方法,该方法可以适用于如图1所示的具有多个电子设备的应用场景中。其中,在本申请实施例中,第一电子设备作为控制其他设备的电子设备,其可以为用户便于操作、随身携带的电子设备,例如智能手机、可穿戴设备等。第二电子设备作为被控制的电子设备,可以为各种电子设备,本申请对此不作限定。下面结合图5所示的控制方法流程图,对该方法的具体过程进行详细描述。
S501:第一电子设备获取第一应用的命令信息,其中所述第一应用位于第二电子设备中,所述第一应用的命令信息用于实现所述第一应用的动作。
可选的,所述第一应用的命令信息可以为第一应用的意图(Intent)。
根据具体的场景的不同,所述第一电子设备可以但不限于通过如图所示的三种方式,获取所述第一应用的命令信息。下面分别对每种方式进行说明:
方式一:所述第一电子设备通过S501a,接收用户输入的或者来自其他设备的所述第一应用的命令信息。
方式二:用户操作第二电子设备,在第二电子设备的应用中选择待协同控制的第一应用,以及选择待协同控制的第一电子设备;然后所述第二电子设备通过S501b向所述第一电子设备发送控制请求,所述控制请求中携带所述第一应用的命令信息;所述第一电子设备接收来自所述第二电子设备的所述第一应用的命令信息。
方式三:
在一种实施方式中，用户操作第一电子设备，在第一电子设备中选择待协同控制的第二电子设备，并选择待协同控制的第一应用；然后，所述第一电子设备通过S501c1向第二电子设备发送控制请求，所述控制请求中携带所述第一应用的信息，以使所述第二电子设备根据所述控制请求反馈所述第一应用的命令信息。所述第二电子设备接收到所述控制请求后，向用户提示是否需要第一电子设备对第一应用进行协同控制，在用户选择需要协同控制的情况下，所述第二电子设备根据控制请求，通过S501c2向所述第一电子设备发送控制响应，所述控制响应中包含所述第一应用的命令信息。
在另一种实施方式中，用户操作第一电子设备，在第一电子设备中选择待协同控制的第二电子设备；然后所述第一电子设备通过S501c1向第二电子设备发送控制请求，以使第二电子设备根据所述控制请求反馈待协同控制的第一应用的命令信息；所述第二电子设备接收到所述控制请求后，向用户提示需要选择待协同控制的应用；然后用户操作第二电子设备，在第二电子设备的应用中选择待协同控制的第一应用；所述第二电子设备向所述第一电子设备发送控制响应，所述控制响应中包含所述第一应用的命令信息。
通过以上描述可知,方式二和方式三均为第二电子设备向第一电子设备发送所述第一应用的命令信息。因此,可选的,在方式二和方式三中所述第二电子设备还可以向所述第一电子设备发送所述第一应用的图标或所述第二电子设备的信息。所述第一应用的图标用于所述第一电子设备在后续生成控制应用的控制图标,所述第二电子设备的信息可以标识第一应用位于第二电子设备中。
需要说明的是,所述第二电子设备向所述第一电子设备发送第一应用的命令信息(例如S501b或S501c2)的具体过程,可以参考图4A中的第二电子设备发送图标2、意图2的过程,此处不再详细赘述。
S502:所述第一电子设备根据所述第一应用的命令信息,生成控制应用,其中,所述控制应用用于使所述第二电子设备实现所述第一应用的动作。
在第一种实施方式中,所述第一电子设备仅根据所述第一应用的命令信息,生成控制应用。在该情况下,所述控制应用用于所述第一电子设备协同控制所述第二电子设备上的第一应用。
在第二种实施方式中,所述第一电子设备可以在通过S501接收到所述第一应用的命令信息之后,提示用户是否选择本地的应用与该第一应用进行协同控制;若用户选择不需要本地的应用进行协同控制,则所述第一电子设备仅根据所述第一应用的命令信息生成控制应用;若用户选择本地的第二应用与所述第一应用进行协同控制时,则所述第一电子设备根据所述第二应用的命令信息和所述第一应用的命令信息,生成控制应用。
在该情况下,所述控制应用不仅能够使所述第二电子设备实现所述第一应用的动作,还能够使所述第一电子设备实现所述第二应用的动作。
在第三种实施方式中,所述第一电子设备还可以获取位于其他电子设备(后续以第三电子设备为例进行说明)上的第三应用的命令信息。
可选的,与第一种实施方式类似的,所述第一电子设备可以根据所述第一应用的命令信息和所述第三应用的命令信息,生成所述控制应用。此时,所述控制应用不仅能够使所述第二电子设备实现所述第一应用的动作,还能够使所述第三电子设备实现所述第三应用的动作。
可选的，与所述第二种实施方式类似的，所述第一电子设备还可以提示用户是否选择本地的应用与第一应用和第三应用进行协同控制。若用户选择本地的第二应用与第一应用和第三应用进行协同控制时，则所述第一电子设备根据所述第一应用的命令信息、所述第二应用的命令信息，所述第三应用的命令信息，生成控制应用；此时，所述控制应用能够使三个电子设备分别实现各自应用的动作。而若用户选择不需要本地的应用进行协同控制时，则所述第一电子设备根据所述第一应用的命令信息和所述第三应用的命令信息生成所述控制应用；此时，所述控制应用不仅能够使所述第二电子设备实现所述第一应用的动作，还能够使所述第三电子设备实现所述第三应用的动作。
需要说明的是,本申请实施例不限定需要协同控制的电子设备的数量,也不限定需要协同控制的应用的数量。
另外，所述第一电子设备在执行S502生成控制应用的过程中，所述第一电子设备还可以生成所述控制应用对应的控制图标，并且在所述第一电子设备的显示屏中显示所述控制图标，以便用户可以直观的看到该控制应用已经生成，并且用户可以通过点击所述控制图标启动所述控制应用。
在一种实施方式中,所述第一电子设备可以根据预设的图片或用户选择的图片,生成所述控制图标。
在另一种实施方式中,所述第一电子设备还可以获取各个待协同控制的应用的图标,并根据这些图标进行图标重绘(例如,合并绘制、组合绘制,分层绘制等等),生成所述控制图标。示例性的,所述第一电子设备可以采用与获取其他电子设备的应用的命令信息同样的方式,获取相应的应用的图标,具体过程可以参考S501中对获取第一应用的命令信息的具体描述,此处不再赘述。
需要说明的是,所述控制图标实际上为一个新的类型的快捷方式,单个图标可以对应位于至少一个电子设备中的应用的命令信息。为了实现后续在所述控制应用启动时所述第一电子设备可以向其他电子设备发送响应的命令信息,所述第一电子设备在生成所述控制图标时,还可以将所述控制图标与其他电子设备的信息进行关联。
示例性的,所述第一电子设备也可以采用与获取其他电子设备的应用的命令信息同样的方式,获取相应的电子设备的信息,具体过程可以参考S501中对获取第一应用的命令信息的具体描述,此处不再赘述。
S503:所述第一电子设备获取所述控制应用的启动命令之后,向所述第二电子设备发送所述第一应用的命令信息,以使所述第二电子设备根据接收的所述第一应用的命令信息执行所述第一应用的动作。
在一些实施方式中,当所述第一电子设备还根据位于本地的第二应用的命令信息生成所述控制应用时,所述第一电子设备在执行S503时,还需要根据所述第二应用的命令信息执行所述第二应用的动作,换句话说,所述第一电子设备根据所述第二应用的命令信息启动所述第二应用,并通过所述第二应用执行该应用的动作。
在另一些实施方式中,当所述第一电子设备还根据位于第三电子设备中的第三应用的命令信息生成所述控制应用时,所述第一电子设备在执行S503时,还需要向所述第三电子设备发送所述第三应用的命令信息,以使所述第三电子设备根据接收的所述第三应用的命令信息执行所述第三应用的动作。
需要说明的是,在所述第一电子设备在生成所述控制应用过程中,还将所述控制图标与其他电子设备的信息进行关联的情况下,所述第一电子设备可以根据所述控制图标关联的第二电子设备的信息,向所述第二电子设备发送所述第一应用的命令信息;以及所述第一电子设备可以根据所述控制图标关联的第三电子设备的信息,向所述第三电子设备发送所述第三应用的命令信息。
在本申请实施例中,所述第一电子设备可以通过以下方式,获取所述控制应用的启动命令:
方式一:所述第一电子设备检测到用户对所述控制应用对应的控制图标的操作;所述第一电子设备响应于所述操作,生成所述控制应用的启动命令。
方式二:所述第一电子设备通过语音助手应用接收用户的语音指令;所述第一电子设备获取所述语音助手应用对所述语音指令进行解析得到的所述控制应用的启动命令。
需要说明的是，一般情况下，第一电子设备的语音助手应用管理的应用为位于第一电子设备的应用，且数量有限。因此，为了实现第一电子设备的语音助手能够管理位于其他电子设备上的应用（例如所述第一应用），在所述第一电子设备执行S503之前，所述第一电子设备还可以在所述语音助手应用所管理的应用列表中添加所述第一应用（可选的，还可以添加管理的电子设备——第二电子设备）。
在该情况下，所述第一电子设备可以在语音助手应用的列表中添加所述第一应用之后，向所述第二电子设备发送控制请求，以使所述第二电子设备根据所述控制请求反馈所述第一应用的命令信息，具体过程可以参考S501中方式三的描述，此处不再详细赘述。
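下面给出语音助手应用管理跨设备应用列表的一个示意性数据结构草图，其中类名、字段与方法均为假设，仅用于说明“应用名到所在设备”的映射关系：

```java
import java.util.HashMap;
import java.util.Map;

/** 语音助手应用管理的应用列表草图：记录应用名与其所在电子设备的对应关系（均为假设）。 */
public class VoiceAssistantAppList {

    private final Map<String, String> managedApps = new HashMap<>();

    /** 添加第一应用及其所在的第二电子设备；本机应用可记为"local"。 */
    public void addManagedApp(String appName, String deviceId) {
        managedApps.put(appName, deviceId);
    }

    /** 解析语音指令得到应用名后，查询该应用所在设备，以决定本地启动还是发送命令信息。 */
    public String deviceOf(String appName) {
        return managedApps.getOrDefault(appName, "local");
    }
}
```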
另外,所述第一电子设备在向其他电子设备发送相应的应用的命令信息之前,还可以确定对方是否开机(自身是否与对方建立连接),若对方未开机,所述第一电子设备还可以向其发送开机信号(例如红外信号)。继续以第二电子设备为例,所述第一电子设备在向所述第二电子设备发送所述第一应用的命令信息之前,还包括以下步骤:
所述第一电子设备确定与所述第二电子设备未建立连接时,向所述第二电子设备发送开机信号;待所述第二电子设备开机后,与所述第二电子设备建立连接。
通过该步骤,所述第一电子设备还可以自动完成其他电子设备的开启,以及和其他电子设备建立连接,从而减少了用户在协同控制过程中的操作,提高了用户体验。
另外,在本申请实施例中,若所述第一电子设备通过S501,获取所述第一应用的命令信息和所述第二电子设备的信息时,还获取到所述第二电子设备关联的第四电子设备的信息(表示所述第二电子设备实现所述第一应用的动作时,可能需要所述第四电子设备的配合)时,那么所述第一电子设备在向所述第二电子设备发送所述第一应用的命令信息之前,还需要与所述第四电子设备建立连接,即执行以下步骤:
所述第一电子设备确定与所述第四电子设备未建立连接时,向所述第四电子设备发送开机信号;待所述第四电子设备开机后,与所述第四电子设备建立连接。这样,所述第一电子设备还可以控制与所述第二电子设备相关联的第四电子设备的开启,从而使所述第四电子设备与所述第二电子设备在开机状态下可以自动建立连接,从而保证所述第二电子设备可以与所述第四电子设备配合实现所述第一应用的动作。
S504:所述第二电子设备根据接收的所述第一应用的命令信息,启动所述第一应用,并通过所述第一应用执行所述第一应用的动作。
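作为参考，下面给出被控端（第二电子设备）收到第一应用的命令信息后启动该应用的一个简化草图；类名、反序列化方式均为假设，并假定发送端以Intent URI字符串方式序列化命令信息：

```java
import android.content.Context;
import android.content.Intent;
import java.net.URISyntaxException;
import java.nio.charset.StandardCharsets;

public class RemoteCommandReceiver {

    private final Context context;

    public RemoteCommandReceiver(Context context) {
        this.context = context;
    }

    /** 远端系统服务收到命令信息后，还原Intent并按普通流程启动第一应用以执行其动作。 */
    public void onCommandInfoReceived(byte[] serializedIntent) {
        Intent intent = deserialize(serializedIntent);
        intent.addFlags(Intent.FLAG_ACTIVITY_NEW_TASK);
        context.startActivity(intent);
    }

    private Intent deserialize(byte[] data) {
        try {
            // 与发送端约定一致：按Intent URI字符串还原
            return Intent.parseUri(new String(data, StandardCharsets.UTF_8), Intent.URI_INTENT_SCHEME);
        } catch (URISyntaxException e) {
            throw new IllegalArgumentException("无法解析的命令信息", e);
        }
    }
}
```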
还需要说明的是，在本申请实施例中，所述第一电子设备在生成控制应用之后，若用户有进一步的应用控制或多设备协同需求，所述第一电子设备还可以向第五电子设备发送所述控制应用的命令信息，所述控制应用的命令信息用于启动所述控制应用。这样所述第五电子设备可以根据所述控制应用的命令信息，在所述第五电子设备处生成新的控制应用。其中，所述第五电子设备生成新的控制应用的过程，可以参考以上步骤中第一电子设备生成控制应用的过程，此处不再赘述。
另外,在本申请实施例中,在控制应用的生成过程和启动过程中各个电子设备内部的执行动作可以参考图4A和图4B中的描述,此处也不再详细赘述。
本申请实施例提供了一种控制方法，通过该方法，电子设备可以获取位于其他电子设备的应用的命令信息，并根据该命令信息，生成控制应用，从而用户可以通过启动该控制应用，以使其他电子设备实现该应用的动作。显然通过该方法，电子设备可以通过生成控制应用的方式，实现跨设备应用控制功能，从而实现多设备协同，进而提高用户体验。
本申请实施例提供的方法可以适用于各种应用场景，下面结合几个具体的实例，对本申请实施例提供的控制方法进行说明。需要说明的是，以下实例中涉及的控制应用的生成过程和启动过程可以参考图4A和图4B中的描述，以及图5所示的实施例中的描述，以下各实例不再详细赘述。另外，在以下实例中，任意两个电子设备之间的连接可以为各种无线通信连接，例如，局域网连接、Wi-Fi连接、蓝牙连接、IR连接、NFC连接、sidelink连接等中的至少一项。
实例1:适用的应用场景为用户使用智能电视中的华为视频应用观看视频,并且用户希望可以使用智能手机作为遥控器。
在本实例中,生成跨设备的控制应用(又可以称为融合应用)的过程包括以下步骤:
用户在智能电视的应用列表界面中选择待协同控制(待分享、待融合)的应用为华为视频应用,并选择需要分享到的电子设备为智能手机。智能电视将华为视频应用的图标、意图,以及智能电视的标识通过控制请求发送给智能手机。
智能手机接收到控制请求后,询问用户是否选择本地的应用与华为视频应用进行协同控制(是否选择本地的应用与华为视频应用进行融合或组合),若用户选择本地的智能遥控应用与华为视频应用进行协同控制,那么所述智能手机根据华为视频应用的意图和智能遥控应用的意图,生成控制应用,并根据华为视频应用的图标和智能遥控应用的图标,生成控制图标;将所述控制图标与所述控制应用关联,并将所述控制图标与智能电视的标识关联。
另外,参阅图6所示,在本实例中启动该控制应用的过程包括以下步骤:
用户点击智能手机的主界面中的控制应用的控制图标,智能手机检测与智能电视的连接。若智能手机未检测到与智能电视的连接,说明智能电视未打开,则使用红外遥控信号开启智能电视,并建立与智能电视的连接。
智能手机按正常本地的应用启动流程,启动智能遥控应用。
智能手机将华为视频应用的意图发送给智能电视。之后智能电视启动华为视频应用。
这样,用户可以对智能手机的智能遥控应用进行各种操作,智能手机会将操作对应的红外遥控信号发送给智能电视,从而使智能电视的华为视频应用根据红外遥控信号执行对应的动作。
通过本实例，用户在不需要操作智能电视的遥控器的情况下，通过在智能手机上点击控制图标这一个操作，即可完成智能电视启动、智能手机与智能电视分别打开对应应用，之后用户可以将智能手机作为遥控器，对智能手机中的智能遥控应用进行操作，从而可以直接操控智能电视中华为视频应用播放的节目。
显然,本实例扩展了智能手机的桌面图标的功能,即单个控制图标可以达到打开多设备多应用的目的。另外,智能手机中的控制应用与智能电视的标识关联,从而使所述智能手机可以根据智能电视的标识,自动完成智能电视的开机和与智能电视建立连接的过程。
实例2：适用的应用场景为用户使用智能电视的羽毛球体感游戏应用玩游戏，并且用户希望可以将智能手机作为体感控制器（或者作为输入其他数据的输入设备）。
在本实例中,生成跨设备的控制应用(又可以称为融合应用)的过程包括以下步骤:
用户在智能电视的应用列表界面中选择待协同控制（待分享、待融合）的应用为羽毛球体感游戏应用（即选择需要关联输入设备的应用为羽毛球体感游戏应用），并选择需要分享到的电子设备为智能手机。智能电视将羽毛球体感游戏应用的图标、意图，以及智能电视的标识通过控制请求发送给智能手机。
智能手机接收到控制请求后,询问用户是否选择本地的应用与羽毛球体感游戏应用进行协同控制(是否选择本地的应用与羽毛球体感游戏应用进行融合或组合),若用户选择本地的体感控制应用与羽毛球体感游戏应用进行协同控制,那么所述智能手机根据羽毛球体感游戏应用的意图和体感控制应用的意图,生成控制应用,并根据羽毛球体感游戏应用的图标和体感控制应用的图标,生成控制图标;将所述控制图标与所述控制应用关联,并将所述控制图标与智能电视的标识关联。
另外，参阅图7所示，在本实例中启动该控制应用的过程包括以下步骤：
用户点击智能手机的主界面中的控制应用的控制图标,智能手机检测与智能电视的连接。若智能手机未检测到与智能电视的连接,说明智能电视未打开,则使用红外遥控信号开启智能电视,并建立与智能电视的连接。
智能手机按正常本地的应用启动流程,启动体感控制应用。
智能手机将羽毛球体感游戏应用的意图发送给智能电视。之后智能电视启动羽毛球体感游戏应用。
这样，用户可以将智能手机作为体感控制器，随着用户移动所述智能手机的位置，所述智能手机将体感输入数据发送给智能电视，从而使智能电视的羽毛球体感游戏应用可以根据接收的体感输入数据执行对应的动作。其中，所述智能手机可以通过与所述智能电视已经建立的连接（例如，蓝牙连接、Wi-Fi连接等）传输所述体感输入数据，或者所述智能手机与所述智能电视建立新的连接以传输所述体感输入数据。
通过本实例，用户在不需要游戏手柄的情况下，通过在智能手机上点击控制图标这一个操作，即可完成智能电视启动、打开智能电视中的羽毛球体感游戏应用，之后用户可以将智能手机作为体感控制器或游戏手柄，对智能电视中的羽毛球体感游戏应用中的游戏对象进行控制。
显然,本实例扩展了智能手机的桌面图标的功能,即单个控制图标可以达到打开多设备多应用的目的。另外,智能手机中的控制应用与智能电视的标识关联,从而使所述智能手机可以根据智能电视的标识,自动完成智能电视的开机和与智能电视建立连接的过程。
实例3:适用的应用场景为用户使用智能电视中的全民K歌应用唱歌,并且用户希望可以使用智能手机作为麦克风,并使用智能音响播放音频。
在本实例中,生成跨设备的控制应用(又可以称为融合应用)的过程包括以下步骤:
用户在智能电视的应用列表界面中选择待协同控制(待分享、待融合)的应用为全民K歌应用,并选择需要分享到的电子设备为智能手机,以及选择关联设备为智能音响。智能电视将全民K歌应用的图标、意图,以及智能电视的标识和智能音响的标识通过控制请求发送给智能手机。
智能手机接收到控制请求后，询问用户是否选择本地的应用与全民K歌应用进行协同控制（是否选择本地的应用与全民K歌应用进行融合或组合），若用户选择本地的麦克风应用与全民K歌应用进行协同控制，那么所述智能手机根据全民K歌应用的意图和麦克风应用的意图，生成控制应用，并根据全民K歌应用的图标和麦克风应用的图标，生成控制图标；将所述控制图标与所述控制应用关联，并将所述控制图标与智能电视的标识、智能音响的标识关联。
另外,参阅图8所示,在本实例中启动该控制应用的过程包括以下步骤:
用户点击智能手机的主界面中的控制应用的控制图标,智能手机分别检测与智能电视的连接,以及与智能音响的连接。若智能手机未检测到与智能电视的连接,说明智能电视未打开,则使用红外遥控信号开启智能电视,并建立与智能电视的连接。同样的,若智能手机未检测到与智能音响的连接,说明智能音响未打开,则使用红外遥控信号开启智能音响,并建立与智能音响的连接。这样,在智能音响和智能电视均打开的情况下,可以建立二者之间的连接。
智能手机按正常本地的应用启动流程,启动麦克风应用。
智能手机将全民K歌应用的意图发送给智能电视。之后智能电视启动全民K歌应用。
这样,通过智能手机中的麦克风应用,用户可以将智能手机作为智能电视的麦克风采集用户语音数据,智能手机会将语音数据发送给智能电视,从而使智能电视的全民K歌应用对语音数据进行处理,从而生成音频数据。智能电视还可以将音频数据发送给智能音响以使智能音响对该音频数据进行输出。
其中,智能手机可以通过与智能电视之间的连接(例如蓝牙连接)向智能电视传输该全民K歌应用的意图或语音数据,而智能电视也可以通过与智能音响之间的连接(例如蓝牙连接)向智能音响传输音频数据。
通过本实例，用户在不需要操作智能电视的遥控器的情况下，通过在智能手机上点击控制图标这一个操作，即可完成智能电视和智能音响启动、智能手机与智能电视分别打开对应应用，之后用户可以将智能手机作为智能电视的麦克风来采集语音数据，另外智能电视还可以通过智能音响进行音频数据播放，明显提高了用户体验。
显然,本实例扩展了智能手机的桌面图标的功能,即单个控制图标可以达到打开多设备多应用的目的。另外,智能手机中的控制应用与智能电视的标识、智能音响的标识关联,从而使所述智能手机可以根据智能电视的标识,自动完成智能电视的开机和与智能电视建立连接的过程,根据智能音响的标识,自动完成智能音响的开机。
实例4:适用的应用场景为用户希望通过智能手机上的语音助手应用来协同控制智能电视中的华为视频应用,并且使用智能音响播放音频。
在本实例中,智能手机生成跨设备的控制应用的过程包括以下步骤:
用户在智能手机的语音助手应用管理的电子设备列表中添加智能电视（即智能电视为待协同控制的电子设备）。智能手机向智能电视发送控制请求。
智能电视接收到控制请求后,向用户提示需要选择待协同控制的应用;然后用户操作智能电视,选择本地的华为视频应用为待协同控制的应用,并选择该应用的关联设备为智能音响。智能电视将华为视频应用的图标、意图,以及智能电视的标识和智能音响的标识通过控制响应发送给智能手机。
智能手机接收到控制响应后，根据华为视频应用的意图，生成控制应用；并根据华为视频应用的图标，生成控制图标；将生成的控制图标与控制应用关联，并将该控制图标与智能电视的标识、智能音响的标识关联。
另外，参阅图9所示，本实例中启动智能手机上的控制应用（包括上述第二种实施方式中的第二控制应用）的过程包括以下步骤：
用户启动智能手机中的语音助手应用,并输入语音信息“电视播放XXXX”;语音助手应用解析该语音信息,启动智能手机上的控制应用,并生成指示播放XXXX的命令消息。智能手机分别检测与智能电视的连接,和与智能音响的连接。若智能手机未检测到与智能电视的连接,说明智能电视未打开,则使用红外遥控信号开启智能电视,并建立与智能电视的连接。同样的,若智能手机未检测到与智能音响的连接,说明智能音响未打开,则使用红外遥控信号开启智能音响,并建立与智能音响的连接。这样,在智能音响和智能电视均打开的情况下,可以建立二者之间的连接。
智能手机在与智能电视建立连接之后，将华为视频应用的意图、以及解析该语音信息得到的命令消息发送给智能电视。之后，智能电视启动华为视频应用，并根据该命令消息，播放该XXXX视频。另外，智能电视将播放XXXX视频的音频数据发送给智能音响，以使智能音响对该音频数据进行输出。其中，智能电视可以通过与智能音响之间的连接（例如蓝牙连接）向智能音响传输音频数据。
实例5:适用的应用场景与实施例4相同。
需要说明的是,本实例的前提为用户在智能电视中生成第一控制应用。该第一控制应用是根据华为视频应用的意图生成的,并与智能音响关联。具体生成过程为:用户在智能电视的应用列表界面中选择待协同控制的应用为华为视频应用,并选择需要关联的电子设备为智能音响。智能电视根据华为视频应用的意图生成第一控制应用,根据华为视频应用的图标生成第一控制图标;将第一控制图标与第一控制应用关联,并将第一控制图标与智能音响的标识关联。
在本实例中,智能手机生成跨设备的控制应用的过程包括以下步骤:
用户在智能手机的语音助手应用管理的电子设备列表中添加智能电视（即智能电视为待协同控制的电子设备）。智能手机向智能电视发送控制请求。
智能电视接收到控制请求后,向用户提示需要选择待协同控制的应用;然后用户操作智能电视,选择本地的第一控制应用为待协同控制的应用。智能电视将第一控制应用的图标、意图(用于打开华为视频应用,并通过智能音响播放),以及智能电视的标识和智能音响的标识通过控制响应发送给智能手机。
智能手机接收到控制响应后，根据第一控制应用的意图，生成第二控制应用；并根据第一控制应用的图标，生成第二控制图标；将生成的第二控制图标与第二控制应用关联，并将该第二控制图标与智能电视的标识、智能音响的标识关联。
另外,参阅图9所示,本实例中启动智能手机上的第二控制应用的过程包括以下步骤:
用户启动智能手机中的语音助手应用,并输入语音信息“电视播放XXXX”;语音助手应用解析该语音信息,启动智能手机上的第二控制应用,并生成指示播放XXXX的命令消息。智能手机分别检测与智能电视的连接,和与智能音响的连接。若智能手机未检测到与智能电视的连接,说明智能电视未打开,则使用红外遥控信号开启智能电视,并建立与智能电视的连接。同样的,若智能手机未检测到与智能音响的连接,说明智能音响未打开,则使用红外遥控信号开启智能音响,并建立与智能音响的连接。这样,在智能音响和智能电视均打开的情况下,可以建立二者之间的连接。
智能手机在与智能电视建立连接之后，将第一控制应用的意图、以及解析该语音信息得到的命令消息发送给智能电视。之后，智能电视启动该第一控制应用（包含打开本地的华为视频应用，以及与智能音响建立连接），并根据该命令消息，在华为视频应用中播放该XXXX视频。另外，智能电视将播放XXXX视频的音频数据发送给智能音响，以使智能音响对该音频数据进行输出。
通过实例4和实例5，用户可以通过语音助手应用管理位于其他电子设备中的应用，大大扩展了语音助手应用的适用范围，并且通过语音助手应用可以一次性开启满足用户需求的所有应用和电子设备。如以上实例所示，用户可以直接在智能手机侧操作语音助手应用，即可一次性打开智能电视、智能音响，并打开智能电视中的华为视频应用播放希望看的视频。
显然,实例4和实例5扩展了语音助手应用的功能,使语音助手应用还可以管理其他电子设备。另外,语音助手应用在管理其他电子设备过程中也不依赖其他电子设备的开机状态、语音助手的开启状态,可以通过控制应用的启动,启动对应的设备。
基于以上实施例和实例,本申请还提供了一种控制装置,所述装置能够应用于以上实施例或实例中的电子设备,以下以应用于第一电子设备为例进行说明。该装置能够实现以上控制方法。参阅图10所示,所述控制装置1000中包含:通信单元1001和处理单元1002。下面对各个单元的功能进行描述。
通信单元1001,用于接收和发送数据。示例性的,所述通信单元1001可以通过移动通信模块和/或无线通信模块实现。
处理单元1002,用于获取第一应用的命令信息,其中,所述第一应用位于第二电子设备中,所述第一应用的命令信息用于实现所述第一应用的动作;根据所述第一应用的命令信息,生成控制应用,其中,所述控制应用用于使所述第二电子设备实现所述第一应用的动作。
在一种可能的实施方式中,所述处理单元1002,在获取所述第一应用的命令信息时,具体用于:
通过所述通信单元1001接收来自所述第二电子设备的所述第一应用的命令信息;或者获取用户输入的所述第一应用的命令信息。
在一种可能的实施方式中,所述处理单元1002还用于:
生成所述控制应用对应的控制图标;
在所述第一电子设备的显示屏中显示所述控制图标。
在一种可能的实施方式中,所述处理单元1002,在生成所述控制应用对应的控制图标时,具体用于:
获取所述第一应用对应的图标信息;
根据所述第一应用对应的图标信息,生成所述控制图标。
在一种可能的实施方式中,所述处理单元1002还用于:
获取所述控制应用的启动命令之后,通过所述通信单元1001向所述第二电子设备发送所述第一应用的命令信息,以使所述第二电子设备根据接收的所述第一应用的命令信息执行所述第一应用的动作。
在一种可能的实施方式中,所述处理单元1002,在根据所述第一应用的命令信息,生成控制应用时,具体用于:
获取第二应用的命令信息,其中,所述第二应用位于所述第一电子设备和/或第三电子设备中,所述第二应用的命令信息用于实现所述第二应用的动作;
根据所述第一应用的命令信息和所述第二应用的命令信息,生成所述控制应用,其中,所述控制应用还用于使所述第一电子设备和/或所述第三电子设备实现所述第二应用的动作;
当所述第二应用位于所述第一电子设备时,所述处理单元1002,还用于:在获取所述控制应用的启动命令之后,根据所述第二应用的命令信息,执行所述第二应用的动作;
当所述第二应用位于所述第三电子设备时，所述处理单元1002，还用于：在获取所述控制应用的启动命令之后，通过所述通信单元1001向所述第三电子设备发送所述第二应用的命令信息，以使所述第三电子设备根据接收到的第二应用的命令信息执行所述第二应用的动作。
在一种可能的实施方式中,所述处理单元1002,具体用于通过以下方式获取所述控制应用的启动命令:
方式一:检测到用户对所述控制应用对应的控制图标的操作;响应于所述操作,生成所述控制应用的启动命令;
方式二:通过语音助手应用接收用户的语音指令;获取所述语音助手应用对所述语音指令进行解析得到的所述控制应用的启动命令。
在一种可能的实施方式中,所述处理单元1002,还用于:
在获取所述语音助手应用对所述语音指令进行解析得到的所述控制应用的启动命令之前,在所述语音助手应用所管理的应用列表中添加所述第一应用。
在一种可能的实施方式中,所述处理单元1002,还用于获取所述第二电子设备的信息;
所述处理单元1002,在通过所述通信单元1001向所述第二电子设备发送所述第一应用的命令信息时,具体用于:
根据所述第二电子设备的信息,通过所述通信单元1001向所述第二电子设备发送所述第一应用的命令信息。
在一种可能的实施方式中,所述处理单元1002,还用于:
在向所述第二电子设备发送所述第一应用的命令信息之前,确定与所述第二电子设备未建立连接时,通过所述通信单元1001向所述第二电子设备发送开机信号;
与所述第二电子设备建立连接。
在一种可能的实施方式中,所述处理单元1002,还用于:
获取所述第二电子设备关联的第四电子设备的信息;
在向所述第二电子设备发送所述第一应用的命令信息之前,确定与所述第四电子设备未建立连接时,通过所述通信单元1001向所述第四电子设备发送开机信号;
与所述第四电子设备建立连接。
在一种可能的实施方式中,所述处理单元1002,还用于在获取所述第一应用的命令信息之前,通过所述通信单元1001向所述第二电子设备发送第一控制请求,以使所述第二电子设备根据所述第一控制请求反馈所述第一应用的命令信息;或者所述处理单元1002,在获取所述第一应用的命令信息时,具体用于:通过所述通信单元1001接收来自所述第二电子设备的第二控制请求,所述第二控制请求中包含所述第一应用的命令信息。
在一种可能的实施方式中,所述处理单元1002,还用于:
在生成控制应用之后,通过所述通信单元1001向第五电子设备发送所述控制应用的命令信息,所述控制应用的命令信息用于启动所述控制应用。
需要说明的是,本申请实施例中对模块的划分是示意性的,仅仅为一种逻辑功能划分,实际实现时可以有另外的划分方式,另外,在本申请各个实施例中的各功能单元可以集成在一个处理单元中,也可以是单独物理存在,也可以两个或两个以上单元集成在一个单元中。上述集成的单元既可以采用硬件的形式实现,也可以采用软件功能单元的形式实现。
所述集成的单元如果以软件功能单元的形式实现并作为独立的产品销售或使用时,可以存储在一个计算机可读取存储介质中。基于这样的理解,本申请的技术方案本质上或者说对现有技术做出贡献的部分或者该技术方案的全部或部分可以以软件产品的形式体现出来,该计算机软件产品存储在一个存储介质中,包括若干指令用以使得一台计算机设备(可以是个人计算机,服务器,或者网络设备等)或处理器(processor)执行本申请各个实施例所述方法的全部或部分步骤。而前述的存储介质包括:U盘、移动硬盘、只读存储器(read-only memory,ROM)、随机存取存储器(random access memory,RAM)、磁碟或者光盘等各种可以存储程序代码的介质。
基于以上实施例和实例,本申请实施例还提供了一种电子设备,所述电子设备用于实现以上实施例提供的控制方法,具有图10所示的控制装置1000的功能。参阅图11所示,所述电子设备1100中包括:收发器1101、处理器1102、存储器1103,以及显示屏1104。
其中,所述收发器1101、所述处理器1102、所述存储器1103,以及显示屏1104之间相互连接。可选的,所述收发器1101、所述处理器1102、所述存储器1103,以及显示屏1104之间通过总线相互连接。所述总线可以是外设部件互连标准(peripheral component interconnect,PCI)总线或扩展工业标准结构(extended industry standard architecture,EISA)总线等。所述总线可以分为地址总线、数据总线、控制总线等。为便于表示,图11中仅用一条粗线表示,但并不表示仅有一根总线或一种类型的总线。
所述收发器1101，用于接收和发送数据，实现与其他设备之间的通信。示例性的，所述收发器1101可以通过移动通信模块和/或无线通信模块实现。具体的，所述收发器1101可以通过射频装置和天线实现。
所述处理器1102,用于实现以上实施例或实例提供的控制方法,具体过程可以参考以上实施例或实例中的描述,此处不再赘述。
所述显示屏1104,用于显示界面。
其中,处理器1102可以是中央处理器(central processing unit,CPU),网络处理器(network processor,NP)或者CPU和NP的组合等等。处理器1102还可以进一步包括硬件芯片。上述硬件芯片可以是专用集成电路(application-specific integrated circuit,ASIC),可编程逻辑器件(programmable logic device,PLD)或其组合。上述PLD可以是复杂可编程逻辑器件(complex programmable logic device,CPLD),现场可编程逻辑门阵列(field-programmable gate array,FPGA),通用阵列逻辑(generic array logic,GAL)或其任意组合。处理器1102在实现上述功能时,可以通过硬件实现,当然也可以通过硬件执行相应的软件实现。
所述存储器1103,用于存放程序指令等。具体地,程序指令可以包括程序代码,该程序代码包括计算机操作指令。存储器1103可能包含随机存取存储器(random access memory, RAM),也可能还包括非易失性存储器(non-volatile memory),例如至少一个磁盘存储器。处理器1102执行存储器1103所存放的程序指令,实现上述功能,从而实现上述实施例提供的方法。
基于以上实施例,本申请实施例还提供了一种计算机程序,当所述计算机程序在计算机上运行时,使得所述计算机执行以上实施例提供的方法。
基于以上实施例,本申请实施例还提供了一种计算机可读存储介质,该计算机可读存储介质中存储有计算机程序,所述计算机程序被计算机执行时,使得计算机执行以上实施例提供的方法。
其中,存储介质可以是计算机能够存取的任何可用介质。以此为例但不限于:计算机可读介质可以包括RAM、ROM、EEPROM、CD-ROM或其他光盘存储、磁盘存储介质或者其他磁存储设备、或者能够用于携带或存储具有指令或数据结构形式的期望的程序代码并能够由计算机存取的任何其他介质。
基于以上实施例,本申请实施例还提供了一种芯片,所述芯片用于读取存储器中存储的计算机程序,实现以上实施例提供的方法。
基于以上实施例,本申请实施例提供了一种芯片系统,该芯片系统包括处理器,用于支持计算机装置实现以上实施例中通信设备所涉及的功能。在一种可能的设计中,所述芯片系统还包括存储器,所述存储器用于保存该计算机装置必要的程序和数据。该芯片系统,可以由芯片构成,也可以包含芯片和其他分立器件。
综上所述，本申请实施例提供了一种控制方法、装置及电子设备。通过该方案，电子设备可以获取位于其他电子设备的应用的命令信息，并根据该命令信息，生成控制应用，从而用户可以通过启动该控制应用，以使其他电子设备实现该应用的动作。显然通过该方法，电子设备可以通过生成控制应用的方式，实现跨设备应用控制功能，从而实现多设备协同，进而提高用户体验。
本领域内的技术人员应明白,本申请的实施例可提供为方法、系统、或计算机程序产品。因此,本申请可采用完全硬件实施例、完全软件实施例、或结合软件和硬件方面的实施例的形式。而且,本申请可采用在一个或多个其中包含有计算机可用程序代码的计算机可用存储介质(包括但不限于磁盘存储器、CD-ROM、光学存储器等)上实施的计算机程序产品的形式。
本申请是参照根据本申请的方法、设备(系统)、和计算机程序产品的流程图和/或方框图来描述的。应理解可由计算机程序指令实现流程图和/或方框图中的每一流程和/或方框、以及流程图和/或方框图中的流程和/或方框的结合。可提供这些计算机程序指令到通用计算机、专用计算机、嵌入式处理机或其他可编程数据处理设备的处理器以产生一个机器,使得通过计算机或其他可编程数据处理设备的处理器执行的指令产生用于实现在流程图一个流程或多个流程和/或方框图一个方框或多个方框中指定的功能的装置。
这些计算机程序指令也可存储在能引导计算机或其他可编程数据处理设备以特定方式工作的计算机可读存储器中,使得存储在该计算机可读存储器中的指令产生包括指令装置的制造品,该指令装置实现在流程图一个流程或多个流程和/或方框图一个方框或多个方框中指定的功能。
这些计算机程序指令也可装载到计算机或其他可编程数据处理设备上，使得在计算机或其他可编程设备上执行一系列操作步骤以产生计算机实现的处理，从而在计算机或其他可编程设备上执行的指令提供用于实现在流程图一个流程或多个流程和/或方框图一个方框或多个方框中指定的功能的步骤。
显然,本领域的技术人员可以对本申请进行各种改动和变型而不脱离本申请的范围。这样,倘若本申请的这些修改和变型属于本申请权利要求及其等同技术的范围之内,则本申请也意图包含这些改动和变型在内。

Claims (29)

  1. 一种控制方法,应用于第一电子设备,其特征在于,包括:
    获取第一应用的命令信息,其中,所述第一应用位于第二电子设备中,所述第一应用的命令信息用于实现所述第一应用的动作;
    根据所述第一应用的命令信息,生成控制应用,其中,所述控制应用用于使所述第二电子设备实现所述第一应用的动作。
  2. 如权利要求1所述的方法,其特征在于,获取所述第一应用的命令信息,包括:
    接收来自所述第二电子设备的所述第一应用的命令信息;或者
    获取用户输入的所述第一应用的命令信息。
  3. 如权利要求1或2所述的方法,其特征在于,所述方法还包括:
    生成所述控制应用对应的控制图标;
    在显示屏中显示所述控制图标。
  4. 如权利要求3所述的方法,其特征在于,生成所述控制应用对应的控制图标,包括:
    获取所述第一应用对应的图标信息;
    根据所述第一应用对应的图标信息,生成所述控制图标。
  5. 如权利要求1-4任一项所述的方法,其特征在于,所述方法还包括:
    获取所述控制应用的启动命令之后,向所述第二电子设备发送所述第一应用的命令信息,以使所述第二电子设备根据接收的所述第一应用的命令信息执行所述第一应用的动作。
  6. 如权利要求5所述的方法,其特征在于,根据所述第一应用的命令信息,生成控制应用,包括:
    获取第二应用的命令信息,其中,所述第二应用位于所述第一电子设备和/或第三电子设备中,所述第二应用的命令信息用于实现所述第二应用的动作;
    根据所述第一应用的命令信息和所述第二应用的命令信息,生成所述控制应用,其中,所述控制应用还用于使所述第一电子设备和/或所述第三电子设备实现所述第二应用的动作;
    当所述第二应用位于所述第一电子设备时,在获取所述控制应用的启动命令之后,所述方法还包括:根据所述第二应用的命令信息,执行所述第二应用的动作;或者
    当所述第二应用位于所述第三电子设备时,在获取所述控制应用的启动命令之后,所述方法还包括:向所述第三电子设备发送所述第二应用的命令信息,以使所述第三电子设备根据接收的所述第二应用的命令信息执行所述第二应用的动作。
  7. 如权利要求5或6所述的方法,其特征在于,获取所述控制应用的启动命令,包括:
    检测到用户对所述控制应用对应的控制图标的操作;响应于所述操作,生成所述控制应用的启动命令;或者
    通过语音助手应用接收用户的语音指令;获取所述语音助手应用对所述语音指令进行解析得到的所述控制应用的启动命令。
  8. 如权利要求7所述的方法,其特征在于,在获取所述语音助手应用对所述语音指令进行解析得到的所述控制应用的启动命令之前,所述方法还包括:
    在所述语音助手应用所管理的应用列表中添加所述第一应用。
  9. 如权利要求5-8任一项所述的方法,其特征在于,所述方法还包括:
    获取所述第二电子设备的信息;
    向所述第二电子设备发送所述第一应用的命令信息,包括:
    根据所述第二电子设备的信息,向所述第二电子设备发送所述第一应用的命令信息。
  10. 如权利要求5-9任一项所述的方法,其特征在于,在向所述第二电子设备发送所述第一应用的命令信息之前,所述方法还包括:
    确定与所述第二电子设备未建立连接时,向所述第二电子设备发送开机信号;
    与所述第二电子设备建立连接。
  11. 如权利要求5-10任一项所述的方法,其特征在于,所述方法还包括:
    获取所述第二电子设备关联的第四电子设备的信息;
    在向所述第二电子设备发送所述第一应用的命令信息之前,所述方法还包括:
    确定与所述第四电子设备未建立连接时,向所述第四电子设备发送开机信号;
    与所述第四电子设备建立连接。
  12. 如权利要求1-11任一项所述的方法,其特征在于,
    在获取所述第一应用的命令信息之前,所述方法还包括:向所述第二电子设备发送第一控制请求,以使所述第二电子设备根据所述第一控制请求反馈所述第一应用的命令信息;或者
    获取所述第一应用的命令信息,包括:接收来自所述第二电子设备的第二控制请求,所述第二控制请求中包含所述第一应用的命令信息。
  13. 如权利要求1-12任一项所述的方法,其特征在于,在生成控制应用之后,所述方法还包括:
    向第五电子设备发送所述控制应用的命令信息,所述控制应用的命令信息用于启动所述控制应用。
  14. 一种控制装置,应用于第一电子设备,其特征在于,包括:
    通信单元,用于接收和发送数据;
    处理单元,用于获取第一应用的命令信息,其中,所述第一应用位于第二电子设备中,所述第一应用的命令信息用于实现所述第一应用的动作;根据所述第一应用的命令信息,生成控制应用,其中,所述控制应用用于使所述第二电子设备实现所述第一应用的动作。
  15. 如权利要求14所述的装置,其特征在于,所述处理单元,在获取所述第一应用的命令信息时,具体用于:
    通过所述通信单元接收来自所述第二电子设备的所述第一应用的命令信息;或者
    获取用户输入的所述第一应用的命令信息。
  16. 如权利要求14或15所述的装置,其特征在于,所述处理单元还用于:
    生成所述控制应用对应的控制图标;
    在所述第一电子设备的显示屏中显示所述控制图标。
  17. 如权利要求16所述的装置,其特征在于,所述处理单元,在生成所述控制应用对应的控制图标时,具体用于:
    获取所述第一应用对应的图标信息;
    根据所述第一应用对应的图标信息,生成所述控制图标。
  18. 如权利要求14-17任一项所述的装置,其特征在于,所述处理单元还用于:
    获取所述控制应用的启动命令之后，通过所述通信单元向所述第二电子设备发送所述第一应用的命令信息，以使所述第二电子设备根据接收的所述第一应用的命令信息执行所述第一应用的动作。
  19. 如权利要求18所述的装置,其特征在于,所述处理单元,在根据所述第一应用的命令信息,生成控制应用时,具体用于:
    获取第二应用的命令信息,其中,所述第二应用位于所述第一电子设备和/或第三电子设备中,所述第二应用的命令信息用于实现所述第二应用的动作;
    根据所述第一应用的命令信息和所述第二应用的命令信息,生成所述控制应用,其中,所述控制应用还用于使所述第一电子设备和/或所述第三电子设备实现所述第二应用的动作;
    当所述第二应用位于所述第一电子设备时,所述处理单元,还用于:在获取所述控制应用的启动命令之后,根据所述第二应用的命令信息,执行所述第二应用的动作;或者
    当所述第二应用位于所述第三电子设备时，所述处理单元，还用于：在获取所述控制应用的启动命令之后，通过所述通信单元向所述第三电子设备发送所述第二应用的命令信息，以使所述第三电子设备根据接收到的第二应用的命令信息执行所述第二应用的动作。
  20. 如权利要求18或19所述的装置,其特征在于,所述处理单元,在获取所述控制应用的启动命令时,具体用于:
    检测到用户对所述控制应用对应的控制图标的操作;响应于所述操作,生成所述控制应用的启动命令;或者
    通过语音助手应用接收用户的语音指令;获取所述语音助手应用对所述语音指令进行解析得到的所述控制应用的启动命令。
  21. 如权利要求20所述的装置,其特征在于,所述处理单元,还用于:
    在获取所述语音助手应用对所述语音指令进行解析得到的所述控制应用的启动命令之前,在所述语音助手应用所管理的应用列表中添加所述第一应用。
  22. 如权利要求18-21任一项所述的装置,其特征在于,所述处理单元,还用于获取所述第二电子设备的信息;
    所述处理单元,在通过所述通信单元向所述第二电子设备发送所述第一应用的命令信息时,具体用于:
    根据所述第二电子设备的信息,通过所述通信单元向所述第二电子设备发送所述第一应用的命令信息。
  23. 如权利要求18-22任一项所述的装置,其特征在于,所述处理单元,还用于:
    在向所述第二电子设备发送所述第一应用的命令信息之前,确定与所述第二电子设备未建立连接时,通过所述通信单元向所述第二电子设备发送开机信号;
    与所述第二电子设备建立连接。
  24. 如权利要求18-23任一项所述的装置,其特征在于,所述处理单元,还用于:
    获取所述第二电子设备关联的第四电子设备的信息;
    在向所述第二电子设备发送所述第一应用的命令信息之前,确定与所述第四电子设备未建立连接时,通过所述通信单元向所述第四电子设备发送开机信号;
    与所述第四电子设备建立连接。
  25. 如权利要求14-24任一项所述的装置,其特征在于,所述处理单元,还用于在获取所述第一应用的命令信息之前,通过所述通信单元向所述第二电子设备发送第一控制请求, 以使所述第二电子设备根据所述第一控制请求反馈所述第一应用的命令信息;或者
    所述处理单元,在获取所述第一应用的命令信息时,具体用于:通过所述通信单元接收来自所述第二电子设备的第二控制请求,所述第二控制请求中包含所述第一应用的命令信息。
  26. 如权利要求14-25任一项所述的装置,其特征在于,所述处理单元,还用于:
    在生成控制应用之后,通过所述通信单元向第五电子设备发送所述控制应用的命令信息,所述控制应用的命令信息用于启动所述控制应用。
  27. 一种电子设备,其特征在于,包括:显示屏、处理器,以及存储器;其中,所述存储器存储有计算机程序,所述计算机程序包括指令,当所述指令被所述处理器执行时,使得所述电子设备执行如权利要求1-13任一项所述的方法。
  28. 一种计算机存储介质,其特征在于,所述计算机存储介质中存储有计算机程序,当所述计算机程序被计算机执行时,使得所述计算机执行如权利要求1-13任一项所述的方法。
  29. 一种芯片,其特征在于,所述芯片用于读取存储器中存储的计算机程序,执行如权利要求1-13任一项所述的方法。
PCT/CN2021/119707 2020-10-30 2021-09-22 一种控制方法、装置及电子设备 WO2022089102A1 (zh)

Priority Applications (3)

Application Number Priority Date Filing Date Title
EP21884832.3A EP4220627A4 (en) 2020-10-30 2021-09-22 CONTROL METHOD AND DEVICE AND ELECTRONIC DEVICE
JP2023523533A JP2023547821A (ja) 2020-10-30 2021-09-22 制御方法及び装置、及び電子デバイス
US18/308,244 US20230259250A1 (en) 2020-10-30 2023-04-27 Control method and apparatus, and electronic device

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN202011193906.1A CN114530148A (zh) 2020-10-30 2020-10-30 一种控制方法、装置及电子设备
CN202011193906.1 2020-10-30

Related Child Applications (1)

Application Number Title Priority Date Filing Date
US18/308,244 Continuation US20230259250A1 (en) 2020-10-30 2023-04-27 Control method and apparatus, and electronic device

Publications (1)

Publication Number Publication Date
WO2022089102A1 true WO2022089102A1 (zh) 2022-05-05

Family

ID=81383575

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2021/119707 WO2022089102A1 (zh) 2020-10-30 2021-09-22 一种控制方法、装置及电子设备

Country Status (5)

Country Link
US (1) US20230259250A1 (zh)
EP (1) EP4220627A4 (zh)
JP (1) JP2023547821A (zh)
CN (1) CN114530148A (zh)
WO (1) WO2022089102A1 (zh)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115002059A (zh) * 2022-05-06 2022-09-02 深圳市雷鸟网络传媒有限公司 信息处理方法、装置、计算机可读存储介质及计算机设备

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN116863913B (zh) * 2023-06-28 2024-03-29 上海仙视电子科技有限公司 一种语音控制的跨屏互动控制方法

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101237256A (zh) * 2007-02-01 2008-08-06 联想移动通信科技有限公司 电子设备及其被其它电子设备通过nfc控制的方法
CN101933339A (zh) * 2008-01-31 2010-12-29 夏普株式会社 电子设备、远程控制系统、信号处理方法、控制程序及记录介质
CN103970396A (zh) * 2013-01-31 2014-08-06 鸿富锦精密工业(深圳)有限公司 手持设备及控制方法
WO2016075560A1 (en) * 2014-11-14 2016-05-19 Sony Corporation Control apparatus and method and electronic device
CN107863103A (zh) * 2017-09-29 2018-03-30 珠海格力电器股份有限公司 一种设备控制方法、装置、存储介质及服务器
WO2020008256A1 (en) * 2018-07-02 2020-01-09 Orange Method for connecting an electronic device, e.g. a smart speaker, to a target wireless access point
CN111049935A (zh) * 2013-05-22 2020-04-21 三星电子株式会社 远程控制电子设备的系统及其电子设备

Family Cites Families (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US11445011B2 (en) * 2014-05-15 2022-09-13 Universal Electronics Inc. Universal voice assistant
US11595397B2 (en) * 2017-12-15 2023-02-28 Google Llc Extending application access across devices
CN110381195A (zh) * 2019-06-05 2019-10-25 华为技术有限公司 一种投屏显示方法及电子设备
CN110958475A (zh) * 2019-10-30 2020-04-03 华为终端有限公司 一种跨设备的内容投射方法及电子设备
CN111523095B (zh) * 2020-03-31 2024-03-15 华为技术有限公司 一种跨设备交互的方法和终端设备

Patent Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101237256A (zh) * 2007-02-01 2008-08-06 联想移动通信科技有限公司 电子设备及其被其它电子设备通过nfc控制的方法
CN101933339A (zh) * 2008-01-31 2010-12-29 夏普株式会社 电子设备、远程控制系统、信号处理方法、控制程序及记录介质
CN103970396A (zh) * 2013-01-31 2014-08-06 鸿富锦精密工业(深圳)有限公司 手持设备及控制方法
CN111049935A (zh) * 2013-05-22 2020-04-21 三星电子株式会社 远程控制电子设备的系统及其电子设备
WO2016075560A1 (en) * 2014-11-14 2016-05-19 Sony Corporation Control apparatus and method and electronic device
CN107863103A (zh) * 2017-09-29 2018-03-30 珠海格力电器股份有限公司 一种设备控制方法、装置、存储介质及服务器
WO2020008256A1 (en) * 2018-07-02 2020-01-09 Orange Method for connecting an electronic device, e.g. a smart speaker, to a target wireless access point

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
See also references of EP4220627A4 *

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115002059A (zh) * 2022-05-06 2022-09-02 深圳市雷鸟网络传媒有限公司 信息处理方法、装置、计算机可读存储介质及计算机设备
CN115002059B (zh) * 2022-05-06 2024-03-12 深圳市雷鸟网络传媒有限公司 信息处理方法、装置、计算机可读存储介质及计算机设备

Also Published As

Publication number Publication date
EP4220627A1 (en) 2023-08-02
JP2023547821A (ja) 2023-11-14
US20230259250A1 (en) 2023-08-17
EP4220627A4 (en) 2024-03-20
CN114530148A (zh) 2022-05-24

Similar Documents

Publication Publication Date Title
US20220342850A1 (en) Data transmission method and related device
KR102481065B1 (ko) 애플리케이션 기능 구현 방법 및 전자 디바이스
CN112558825A (zh) 一种信息处理方法及电子设备
EP3726376B1 (en) Program orchestration method and electronic device
US20230259250A1 (en) Control method and apparatus, and electronic device
WO2024016559A1 (zh) 一种多设备协同方法、电子设备及相关产品
JP2023503679A (ja) マルチウィンドウ表示方法、電子デバイス及びシステム
US20230094172A1 (en) Cross-Device Application Invoking Method and Electronic Device
WO2022127661A1 (zh) 应用共享方法、电子设备和存储介质
WO2021135734A1 (zh) 应用中传输文件的方法、电子设备及系统
CN113593279B (zh) 车辆及其交互参数调整方法、移动终端
WO2021052488A1 (zh) 一种信息处理方法及电子设备
US20230139886A1 (en) Device control method and device
JP7319431B2 (ja) アプリケーション機能の実施方法及び電子装置
WO2023005711A1 (zh) 一种服务的推荐方法及电子设备
WO2022053062A1 (zh) 一种IoT设备的管理方法及终端
WO2022052928A1 (zh) 一种应用接入方法及相关装置
CN112786022B (zh) 终端、第一语音服务器、第二语音服务器及语音识别方法
CN113608610A (zh) 交互控制方法、电子设备及系统
US11991040B2 (en) Network configuration method and device
WO2024078306A1 (zh) 横幅通知消息的显示方法与电子设备
WO2024066992A1 (zh) 一种多设备组网系统、方法及终端设备
CN114020379B (zh) 一种终端设备、信息反馈方法和存储介质
WO2024008017A1 (zh) 内容分享方法、图形界面及相关装置
WO2024007816A1 (zh) 通信方法及装置

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 21884832

Country of ref document: EP

Kind code of ref document: A1

WWE Wipo information: entry into national phase

Ref document number: 2023523533

Country of ref document: JP

ENP Entry into the national phase

Ref document number: 2021884832

Country of ref document: EP

Effective date: 20230426

NENP Non-entry into the national phase

Ref country code: DE