CN113311975A - Application interaction method among multiple devices and related devices - Google Patents

Application interaction method among multiple devices and related devices

Info

Publication number
CN113311975A
CN113311975A (application CN202010125173.1A)
Authority
CN
China
Prior art keywords
application
control
devices
information
user interface
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202010125173.1A
Other languages
Chinese (zh)
Inventor
鲁波
王振
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Huawei Technologies Co Ltd
Original Assignee
Huawei Technologies Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Huawei Technologies Co Ltd filed Critical Huawei Technologies Co Ltd
Priority to CN202010125173.1A (published as CN113311975A)
Priority to CN202210410313.9A (published as CN114968614A)
Publication of CN113311975A
Legal status: Pending

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F9/00Arrangements for program control, e.g. control units
    • G06F9/06Arrangements for program control, e.g. control units using stored programs, i.e. using an internal store of processing equipment to receive or retain programs
    • G06F9/46Multiprogramming arrangements
    • G06F9/54Interprogram communication
    • G06F9/543User-generated data transfer, e.g. clipboards, dynamic data exchange [DDE], object linking and embedding [OLE]

Landscapes

  • Engineering & Computer Science (AREA)
  • Software Systems (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Telephone Function (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The embodiments of this application provide an application interaction method among multiple devices and related devices. The method includes the following steps: the first device displays a selection menu, where the selection menu includes one or more controls, the device indicated by each of the one or more controls is a single device or a combination of devices, and each indicated device has a running environment for all or part of the sub-applications included in a first application on the first device; in response to a selection operation on a first control, which is a control in the selection menu, the first device sends a request to start the first application to the device indicated by the first control; and the first device receives response information indicating that the device indicated by the first control has successfully started the first application. With this method and these devices, a selection menu of selectable devices can be displayed accurately, and an application on a target device can be started remotely and accurately.

Description

Application interaction method among multiple devices and related devices
Technical Field
The present application relates to the field of terminal and communication technologies, and in particular, to an application interaction method among multiple devices and related devices.
Background
With the growing popularity of smart devices such as mobile phones, televisions, computers, smart speakers, and various internet of things (IoT) devices and smart home appliances, smart devices are becoming ever more common in daily life, and interaction between devices is gradually becoming a function that users pay attention to and need.
In the prior art, the Apple ecosystem connects the MacBook, iPhone, iPad, and Apple Watch through its Continuity feature, providing functions such as Handoff (relay), a universal clipboard, and shared calls and text messages. Microsoft has similar prior art: through a selection component, a status component, a notification component, a gesture component, and a synchronization component, an application (APP) can continue on another device. Task state may be synchronized across devices via cloud services or peer-to-peer (P2P) connections. A client application on each device collects the state of each application as part of the synchronization and uses that state to restore the same application on a different device.
However, both Apple's Continuity framework and Microsoft's continuation framework only provide a way to restore context between applications: the user must first click to start the corresponding application on the target device, after which the synchronized context is processed and restored. In a scenario with many devices, the user cannot designate a specific device to start a specific application.
Disclosure of Invention
The application interaction method among multiple devices and the related devices provided in the embodiments of this application can accurately and remotely start an application on another device from one device without operating the peer device, and can display a selection menu of selectable devices more accurately based on different applications and the currently networked devices, so that the method better matches user expectations and is easier to use.
In a first aspect, the present application discloses a method for application interaction among multiple devices, the method comprising:
the first device displays a selection menu, where the selection menu includes one or more controls; the device indicated by each of the one or more controls is a single device or a combination of devices, and each indicated device has a running environment for all or part of the sub-applications included in a first application on the first device; the running environment includes the hardware environment and the software environment required to run the first application.
The first device, in response to a selection operation on a first control, sends a request to start the first application to the device indicated by the first control, where the first control is a control in the selection menu.
The first device receives response information indicating that the device indicated by the first control has successfully started the first application.
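The three steps above can be sketched as follows. This is a minimal illustrative sketch; all class, field, and message names are assumptions, not taken from the patent:

```python
# Hypothetical sketch of the first-aspect flow: display a menu of
# controls, send a start request to the indicated device(s), and
# collect the success responses. Names are illustrative only.

class Control:
    """A selection-menu control indicating a single device or a device combination."""
    def __init__(self, devices):
        self.devices = devices  # list of device IDs

class FirstDevice:
    def __init__(self, menu_controls, transport):
        self.menu_controls = menu_controls
        self.transport = transport  # callable: (device_id, message) -> response

    def display_selection_menu(self):
        # In a real UI this would render the menu; here we just return it.
        return self.menu_controls

    def on_control_selected(self, control, app_id):
        # Send a start request to every device the control indicates.
        responses = [self.transport(dev, {"type": "start", "app": app_id})
                     for dev in control.devices]
        # Succeed only if every indicated device reports a successful start.
        return all(r.get("status") == "ok" for r in responses)

# Usage: a fake transport standing in for the network.
def fake_transport(device_id, message):
    return {"status": "ok", "device": device_id}

menu = [Control(["tv"]), Control(["tv", "speaker"])]
first = FirstDevice(menu, fake_transport)
assert first.on_control_selected(menu[1], "video_app") is True
```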
Compared with the prior art, in which the synchronized context can be processed and restored only after the relevant application is started by clicking on the corresponding device, this application can accurately and remotely start an application on another device from one device without operating the peer device, and displays the selection menu of selectable devices more accurately based on different applications and the currently networked devices, so that it better matches user expectations and is easier to use.
In one possible implementation, the first device displaying a selection menu includes: the first device displays the selection menu in response to a first operation on an object associated with the first application.
Optionally, the object associated with the first application includes a user interface of the first application, an icon of the first application, a thumbnail of the user interface of the first application, or a display plug-in of the first application, and it should be noted that the object associated with the first application in a specific implementation is not limited to the example here.
In one possible implementation, the first operation on the object associated with the first application includes: a touch-and-slide operation on the object associated with the first application; or an operation of tapping the object associated with the first application with multiple fingers; or a long-press operation on the object associated with the first application.
In this application, the selection menu can be called up from various objects associated with the first application; the operation for calling up the selection menu is simple and matches users' habits, which can improve the user experience.
In one possible embodiment, the object associated with the first application comprises a user interface of the first application, and the first operation on the object associated with the first application includes: switching from the user interface of the first application to another user interface.
In this application, when the device detects a switch of the application's user interface (for example, returning to the previous user interface, being about to exit the application, or locking the screen), the device can display the selection menu based on these operations, prompting the user to select another device to run the application. This helps the user discover new functions of the device, thereby improving the user experience.
In one possible implementation, the selection operation on the first control includes: dragging the object associated with the first application onto the first control, where the object associated with the first application is scaled down during the drag; or, the selection operation on the first control includes: tapping the first control.
In this application, a control is selected by a drag or tap operation, which is simple and convenient.
In one possible implementation, before the first device displays the selection menu, the method further includes: the first device starts the first application; the first device detects that the device indicated by the first control is connected to the network where the first device is located, and detects that the device indicated by the first control has logged in to a first account, where the first account is the account the first device has logged in to, or an account belonging to the same account group as that account. In this scenario, the first device may start the first application either before or after detecting that the other device has joined the network and logged in to the account.
Alternatively, before the first device displays the selection menu, the method further includes: the first device starts the first application; the first device accesses the network where the device indicated by the first control is located, and logs in to a second account, where the second account is the account the device indicated by the first control has logged in to, or an account belonging to the same account group as that account. In this scenario, the first device may start the first application either before or after joining the network and logging in to the account.
Alternatively, before the first device displays the selection menu, the method further includes: the first device starts the first application; and the first device establishes a pairing connection with the device indicated by the first control, for example a Bluetooth pairing connection or another form of pairing connection. In this scenario, the first device may start the first application either before or after the pairing connection is established.
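The networking, account, and pairing conditions described above can be sketched as one eligibility check. This is an illustrative sketch; the field names (`network`, `account`, `paired`) and the account-group representation are assumptions, not from the patent:

```python
# Hypothetical precondition check: a candidate device may appear in the
# selection menu if it shares the first device's network and its account
# is the same or in the same account group, or if the two are paired.

def can_show_in_menu(first_dev, candidate, account_groups):
    """Return True if `candidate` may appear in the selection menu."""
    same_network = first_dev["network"] == candidate["network"]
    same_account = first_dev["account"] == candidate["account"]
    same_group = any(first_dev["account"] in g and candidate["account"] in g
                     for g in account_groups)
    paired = candidate["id"] in first_dev.get("paired", set())
    return (same_network and (same_account or same_group)) or paired

groups = [{"alice", "alice_family"}]
phone = {"id": "phone", "network": "home", "account": "alice", "paired": {"watch"}}
tv = {"id": "tv", "network": "home", "account": "alice_family"}
watch = {"id": "watch", "network": "other", "account": "bob"}
assert can_show_in_menu(phone, tv, groups)     # same network, same account group
assert can_show_in_menu(phone, watch, groups)  # paired, so network is irrelevant
```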
In this application, as long as the device has started the relevant application, it can automatically display the selection menu in an appropriate scenario to prompt the user that another device can be selected to run the application. This helps the user discover new functions of the device and improves the user experience.
In one possible implementation, the device indicated by the first control is a second device. After the first device receives the response information indicating that the second device has successfully started the first application, the method further includes: the first device sends, to the second device according to the response information, one or more of the context information and the running state information of the first application currently running on the first device. The context information includes data that enables continued execution of the first application. The running state information includes user interface data of the first application.
In this application, after the application on the second device is started remotely, the application can continue running on the second device, that is, it picks up from the running state of the application on the first device, which makes the application more convenient for the user.
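A possible shape for the hand-off data sent to the second device is sketched below. The JSON layout and every field name are assumptions for illustration only:

```python
import json

# Hypothetical hand-off message built by the first device after it
# receives a successful-start response from the second device.

def build_handoff_message(app_id, context=None, running_state=None):
    """Bundle one or more of context information and running state.

    `context` carries data enabling continued execution (e.g. a playback
    position); `running_state` carries user-interface data. Either may
    be omitted, matching "one or more items" in the text.
    """
    msg = {"app": app_id}
    if context is not None:
        msg["context"] = context
    if running_state is not None:
        msg["running_state"] = running_state
    return json.dumps(msg)

msg = build_handoff_message("video_app",
                            context={"position_s": 342},
                            running_state={"ui": "player", "paused": False})
decoded = json.loads(msg)
assert decoded["context"]["position_s"] == 342
```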
In one possible implementation, the device indicated by the first control is a device combination of a third device and a fourth device; the third device is used for running a first sub-application of the first application, the fourth device is used for running a second sub-application of the first application, and the first sub-application and the second sub-application jointly form the first application;
the first device responds to the selection operation of a first control, and sends a request for starting the first application to the device indicated by the first control, wherein the request comprises:
the first device responds to the selection operation of the first control and sends a request for starting the first application to the third device and the fourth device respectively;
After the first device receives the response information indicating that the device indicated by the first control has successfully started the first application, the method further includes:
the first device remotely controls one or more of the running logic sequence, the user interface update state, and the multimedia playback progress of the first sub-application on the third device, and remotely controls one or more of the running logic sequence, the user interface update state, and the multimedia playback progress of the second sub-application on the fourth device.
In this application, one device can remotely start specific applications on multiple target devices and control how those applications run on the target devices, providing users with more and better functional experiences.
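The device-combination case can be sketched as a small coordinator on the first device that assigns each sub-application to a device and later remote-controls it. All class, field, and message names here are hypothetical:

```python
# Hypothetical coordinator for the third-device/fourth-device case:
# each sub-application is started on its assigned device, and control
# items (e.g. playback progress) are sent to the device running it.

class Coordinator:
    def __init__(self, transport):
        self.transport = transport   # callable: (device_id, message) -> None
        self.assignment = {}         # sub_app -> device_id

    def start_combination(self, app_id, assignment):
        """`assignment` maps sub-applications to the devices that run them,
        e.g. {"video": "tv", "audio": "speaker"}."""
        self.assignment = dict(assignment)
        for sub_app, device in assignment.items():
            self.transport(device, {"type": "start", "app": app_id, "sub": sub_app})

    def remote_control(self, sub_app, **items):
        """Send one or more control items, e.g. progress_s=... or ui_state=..."""
        device = self.assignment[sub_app]
        self.transport(device, {"type": "control", "sub": sub_app, **items})

sent = []
coord = Coordinator(lambda dev, msg: sent.append((dev, msg)))
coord.start_combination("video_app", {"video": "tv", "audio": "speaker"})
coord.remote_control("video", progress_s=120)  # sync playback progress on the TV
assert sent[-1] == ("tv", {"type": "control", "sub": "video", "progress_s": 120})
```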
In one possible implementation, the shared data of the first device and of the device indicated by the first control are synchronized to a distributed data management server. The method further includes: the first device queries the distributed data management server and finds that the first application is not installed on the device indicated by the first control; the first device then remotely controls the device indicated by the first control to install the first application.
In this application, the peer device is remotely controlled to install the application that needs to be started, which is convenient and fast.
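A minimal sketch of this query-and-install step, with the distributed data management server modeled as a plain dictionary (the record layout is an assumption, not from the patent):

```python
# Hypothetical query-and-install flow against the synchronized package
# records held by a distributed data management server (DDMS).

def ensure_installed(ddms, device_id, app_id, remote_install):
    """Check the synchronized package records; if the target device lacks
    the app, remotely trigger installation. Returns True if an install
    was triggered, False if the app was already present."""
    installed = ddms.get(device_id, {}).get("packages", [])
    if app_id not in installed:
        remote_install(device_id, app_id)
        return True
    return False

ddms = {"tv": {"packages": ["music_app"]}}
installs = []
triggered = ensure_installed(ddms, "tv", "video_app",
                             lambda dev, app: installs.append((dev, app)))
assert triggered and installs == [("tv", "video_app")]
```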
In a second aspect, the present application discloses a method for application interaction among multiple devices, the method comprising:
The second device receives a request to start a first application sent by a first device, where the second device is one of the devices indicated by one or more controls, the one or more controls are controls in a selection menu displayed by the first device, the device indicated by each of the one or more controls is a single device or a combination of devices, and each indicated device has a running environment for all or part of the sub-applications included in the first application; the request is sent by the first device in response to a selection operation on the control indicating the second device in the selection menu.
The second device starts the first application according to the request.
The second device sends, to the first device, response information indicating that the first application has been started successfully.
In one possible implementation, after the second device sends, to the first device, response information indicating that the first application has been started successfully, the method further includes:
the second device receives one or more of the context information and the running state information of the first application currently running on the first device, sent by the first device;
and the second device continues running the first application according to the one or more of the context information and the running state information.
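The second-device side of this flow might look as follows; all class, field, and message names are illustrative, not taken from the patent:

```python
# Hypothetical second-device handlers: start the application when the
# request arrives, then resume it from whichever hand-off items
# (context and/or running state) are later received.

class SecondDevice:
    def __init__(self):
        self.running = {}   # app_id -> state record

    def handle_start_request(self, request):
        app = request["app"]
        self.running[app] = {"state": "started"}
        # Response information sent back to the first device.
        return {"status": "ok", "app": app}

    def handle_handoff(self, app, context=None, running_state=None):
        # Apply whichever items were sent ("one or more items").
        entry = self.running[app]
        if context is not None:
            entry["context"] = context
        if running_state is not None:
            entry["ui"] = running_state
        entry["state"] = "resumed"

second = SecondDevice()
resp = second.handle_start_request({"type": "start", "app": "video_app"})
assert resp["status"] == "ok"
second.handle_handoff("video_app", context={"position_s": 342})
assert second.running["video_app"]["state"] == "resumed"
```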
In one possible implementation, the second device is configured to run a first sub-application of the first application. After the second device sends, to the first device, response information indicating that the first application has been started successfully, the method further includes:
the second device receives a remote control instruction sent by the first device for controlling one or more of the running logic sequence, the user interface update state, and the multimedia playback progress of the first sub-application;
and the second device controls, according to the remote control instruction, one or more of the running logic sequence, the user interface update state, and the multimedia playback progress of the first sub-application on the device.
In one possible implementation, the shared data of the second device and the shared data of the first device are synchronized to a distributed data management server. After the second device sends, to the first device, response information indicating that the first application has been started successfully, the method further includes:
the second device queries the distributed data management server for one or more of the context information and the running state information of the first application currently running on the first device;
and the second device continues running the first application on the device according to the one or more of the context information and the running state information of the first application.
According to the method and the device, shared data synchronization among the devices can be achieved, and data interaction efficiency among the devices can be improved.
In a third aspect, the present application discloses an apparatus comprising means for performing the method according to any of the first aspect above.
In a fourth aspect, the present application discloses an apparatus comprising means for performing the method according to any of the second aspects above.
In a fifth aspect, the present application discloses an apparatus comprising one or more processors, a memory, and a communication interface; the memory and the communication interface are coupled to the one or more processors, the memory stores a computer program, and when the one or more processors execute the computer program, the apparatus is caused to perform the method according to any implementation of the first aspect.
In a sixth aspect, the present application discloses an apparatus comprising one or more processors, a memory, and a communication interface; the memory and the communication interface are coupled to the one or more processors, the memory stores a computer program, and when the one or more processors execute the computer program, the apparatus is caused to perform the method according to any implementation of the second aspect.
In a seventh aspect, the present application discloses an information interaction system, which includes a first device and a second device, where the first device is the device in the third aspect, and the second device is the device in the fourth aspect; alternatively, the first device is the device of the fifth aspect, and the second device is the device of the sixth aspect.
In an eighth aspect, the present application discloses a computer storage medium having a computer program stored thereon, the computer program being executable by a processor to perform the method of any of the above first aspects or the method of any of the above second aspects.
In summary, compared with the prior art, in which the synchronized context can be processed and restored only after the relevant application is started by clicking on the corresponding device, this application can accurately and remotely start an application on another device from one device without operating the peer device, and displays the selection menu of selectable devices more accurately based on different applications and the currently networked devices, so that it better matches user expectations and is easier to use.
Drawings
Fig. 1 is a schematic diagram of a system architecture applicable to an application interaction method between multiple devices provided in the present application;
fig. 2 is a schematic diagram illustrating a software architecture applicable to the application interaction method between multiple devices provided in the present application;
fig. 3 is a schematic diagram illustrating a hardware structure of an electronic device provided in the present application;
fig. 4A to 4E are schematic user interfaces of an application interaction method between multiple devices according to the present application;
fig. 5 is a schematic device interaction flow diagram of an application interaction method among multiple devices according to the present application;
fig. 6A to 6E are schematic user interfaces of an application interaction method between multiple devices according to the present application;
fig. 7 is a schematic device interaction flow diagram of an application interaction method among multiple devices according to the present application;
fig. 8A to 8E are schematic user interfaces of an application interaction method between multiple devices according to the present application;
fig. 9A to 9B are schematic user interface diagrams of an application interaction method between multiple devices according to the present application;
fig. 10A to 10B are schematic user interfaces of an application interaction method between multiple devices according to the present application;
fig. 11A to 11B are schematic user interfaces of an application interaction method between multiple devices according to the present application;
FIG. 12 is a home screen interface schematic diagram of a first device provided herein;
fig. 13A to 13C are schematic user interface diagrams of an application interaction method between multiple devices according to the present application;
FIG. 14 is a schematic diagram of a logical structure of an apparatus provided herein;
fig. 15 is a schematic diagram of a logical structure of another apparatus provided in the present application.
Detailed Description
The technical solutions in the embodiments of the present application will be described below with reference to the drawings in the embodiments of the present application.
In order to better understand the application interaction method among multiple devices and the related devices provided in the embodiments of the present application, the framework of the information interaction system to which the method is applicable is described first. Referring to fig. 1, fig. 1 is a schematic structural diagram of an information interaction system according to an embodiment of the present application. As shown in fig. 1, the information interaction system may include a plurality of electronic devices 11, and the plurality of electronic devices 11 may communicate with each other through a network 12, where:
the electronic device 11 may install and run one or more Applications (APPs), which may include, for example, various video playback APPs, and may also include various APPs such as WeChat, Mei Tuo, email, and so on. The electronic device 11 may include, but is not limited to, any intelligent operating system based electronic product that can interact with a user through input devices such as a keyboard, a virtual keyboard, a touch pad, a touch screen, and a voice control device, such as a smart phone, a Tablet PC, a handheld computer, a wearable electronic device, a Personal Computer (PC), and a desktop computer. The electronic device 11 may also include, but is not limited to, any IoT device, such as a smart speaker, a Television (TV), a car display of an automobile, and the like. The smart operating system includes, but is not limited to, any operating system that enriches device functionality by providing various APPs to the device, such as Android, IOS, Windows, and MAC systems, among others.
The electronic devices 11 may be connected to the same network 12 for communicating with each other. The network 12 may be a local area network (LAN) or the internet. For example, the network 12 may also be a local area network formed by wireless communication (e.g., a Bluetooth, Wi-Fi, or ZigBee network). Of course, besides these networks, the electronic devices 11 may also communicate over other forms of network, depending on the actual situation; this solution does not limit this.
The software architecture of the information interaction system provided by the embodiment of the present application is exemplarily described below, and refer to fig. 2. The information interaction system includes n (where n is an integer greater than 1) devices, which may be the electronic devices 11 shown in fig. 1. The n devices, the application market and the cloud account management server can be communicated with each other through network connection.
The application marketplace may be used to provide APP download services for the n devices. The cloud account management server may be configured to manage account information of the n devices. Optionally, the n devices may log in to the same account, or the n devices log in to different accounts, but the different accounts are in the same account group.
In addition, the shared data of the n devices may be synchronized to a distributed data management server, which is synchronously managed by the distributed data management server. The distributed data management server may be a cloud server.
Each of the n devices may include an application suite (application kit) module, an Application Management Service (AMS) module, a distributed scheduling management service (DMS), a Distributed Data Management Service (DDMS) module, a Package Management Service (PMS) module, and an account management module. Wherein:
the application kit module is a module for developing APP, and the APP comprises two parts of Activity (interface) and Service (non-interface). Wherein, the Activity is used for providing a User Interface (UI) entry operated by the user; the Service is used for providing background services. The overview Profile configuration file of the Activity/Service sets the capabilities supported by the Activity/Service, such as audio, video, editable and the like; or the types of the devices supported by the Activity/Service are set, such as a personal computer, a smart television, a smart sound box and the like.
The AMS module is the server-side counterpart of the application kit module and may be configured to schedule the life cycle of the minimum scheduling units in the App (such as an Activity or a Service, but not limited to these examples). For example, the AMS module may schedule an Activity into the active state or the background state. The AMS module also manages process starting, resource loading, thread-environment creation, and the like for Activities/Services.
The DMS module may be used to decide whether an App initiated by the user runs on the local device or on another device. Specifically, it interacts with the DDMS to query the AMS of the device that is to run the user-initiated App, so as to run the corresponding APP there.
The DDMS module may be configured to interact with the distributed data management server, synchronize the device's data to the distributed data management server, and query required data from it. For example, the AMS information of a device may be queried according to the device's ID.
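The synchronize-and-query behavior of the DDMS described above can be sketched with an in-memory stand-in for the distributed data management server; the record fields (`ams`, `packages`) are assumptions for illustration:

```python
# Hypothetical in-memory stand-in for the distributed data management
# server: each device synchronizes its record, and any device can then
# query another device's information by device ID.

class DDMS:
    def __init__(self):
        self.records = {}

    def sync(self, device_id, record):
        # Synchronize this device's shared data to the server.
        self.records[device_id] = record

    def query(self, device_id, field):
        # Query a field of another device's record; None if unknown.
        return self.records.get(device_id, {}).get(field)

ddms = DDMS()
ddms.sync("tv-001", {"ams": "ams.tv.local:7000", "packages": ["video_app"]})
assert ddms.query("tv-001", "ams") == "ams.tv.local:7000"
```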
The PMS module may be configured to manage installation and uninstallation of APP installation packages as well as parsing and querying of installation-package configuration information, and to store the device ID of the current device and the APP package information into the DDMS module.
The account management module may be configured to manage account information of the device.
Based on the above description, an exemplary electronic apparatus provided in the following embodiments of the present application is first described below.
Fig. 3 shows a schematic structural diagram of the electronic device 100, and the electronic device 100 may be the electronic device 11 shown in fig. 1.
The electronic device 100 may include a processor 110, an external memory interface 120, an internal memory 121, a Universal Serial Bus (USB) interface 130, a charging management module 140, a power management module 141, a battery 142, an antenna 1, an antenna 2, a mobile communication module 150, a wireless communication module 160, an audio module 170, a speaker 170A, a receiver 170B, a microphone 170C, an earphone interface 170D, a sensor module 180, a key 190, a motor 191, an indicator 192, a camera 193, a display screen 194, a Subscriber Identification Module (SIM) card interface 195, and the like. The sensor module 180 may include a pressure sensor 180A, a gyroscope sensor 180B, an air pressure sensor 180C, a magnetic sensor 180D, an acceleration sensor 180E, a distance sensor 180F, a proximity light sensor 180G, a fingerprint sensor 180H, a temperature sensor 180J, a touch sensor 180K, an ambient light sensor 180L, a bone conduction sensor 180M, and the like.
It is to be understood that the illustrated structure of the embodiment of the present application does not specifically limit the electronic device 100. In other embodiments of the present application, electronic device 100 may include more or fewer components than shown, or some components may be combined, some components may be split, or a different arrangement of components. The illustrated components may be implemented in hardware, software, or a combination of software and hardware.
Processor 110 may include one or more processing units, such as: the processor 110 may include an Application Processor (AP), a modem processor, a Graphics Processing Unit (GPU), an Image Signal Processor (ISP), a controller, a video codec, a Digital Signal Processor (DSP), a baseband processor, and/or a neural-Network Processing Unit (NPU), etc. The different processing units may be separate devices or may be integrated into one or more processors.
The controller can generate an operation control signal according to the instruction operation code and the timing signal to complete the control of instruction fetching and instruction execution.
A memory may also be provided in the processor 110 for storing instructions and data. In some embodiments, the memory in the processor 110 is a cache. The memory may hold instructions or data that the processor 110 has just used or uses cyclically. If the processor 110 needs the instructions or data again, it can call them directly from this memory, which avoids repeated accesses, reduces the waiting time of the processor 110, and thereby improves system efficiency.
In some embodiments, processor 110 may include one or more interfaces. The interface may include an inter-integrated circuit (I2C) interface, an inter-IC sound (I2S) interface, a pulse code modulation (PCM) interface, a universal asynchronous receiver/transmitter (UART) interface, a mobile industry processor interface (MIPI), a general-purpose input/output (GPIO) interface, a subscriber identity module (SIM) interface, and/or a universal serial bus (USB) interface, etc.
The I2C interface is a bidirectional synchronous serial bus that includes a serial data line (SDA) and a serial clock line (SCL). In some embodiments, the processor 110 may include multiple sets of I2C buses. The processor 110 may be coupled to the touch sensor 180K, a charger, a flash, the camera 193, and the like through different I2C bus interfaces. For example, the processor 110 may be coupled to the touch sensor 180K through an I2C interface, so that the processor 110 and the touch sensor 180K communicate through the I2C bus interface to implement the touch function of the electronic device 100.
The I2S interface may be used for audio communication. In some embodiments, processor 110 may include multiple sets of I2S buses. The processor 110 may be coupled to the audio module 170 via an I2S bus to enable communication between the processor 110 and the audio module 170. In some embodiments, the audio module 170 may communicate audio signals to the wireless communication module 160 via the I2S interface, enabling answering of calls via a bluetooth headset.
The PCM interface may also be used for audio communication, to sample, quantize, and encode an analog signal. In some embodiments, the audio module 170 and the wireless communication module 160 may be coupled through a PCM bus interface. In some embodiments, the audio module 170 may also transmit audio signals to the wireless communication module 160 through the PCM interface, to implement the function of answering a call through a Bluetooth headset. Both the I2S interface and the PCM interface may be used for audio communication.
The UART interface is a universal serial data bus used for asynchronous communication. The bus may be a bidirectional communication bus. It converts the data to be transmitted between serial and parallel forms. In some embodiments, a UART interface is generally used to connect the processor 110 to the wireless communication module 160. For example, the processor 110 communicates with a Bluetooth module in the wireless communication module 160 through a UART interface to implement the Bluetooth function. In some embodiments, the audio module 170 may transmit audio signals to the wireless communication module 160 through the UART interface, to implement the function of playing music through a Bluetooth headset.
MIPI interfaces may be used to connect processor 110 with peripheral devices such as display screen 194, camera 193, and the like. The MIPI interface includes a Camera Serial Interface (CSI), a Display Serial Interface (DSI), and the like. In some embodiments, processor 110 and camera 193 communicate through a CSI interface to implement the capture functionality of electronic device 100. The processor 110 and the display screen 194 communicate through the DSI interface to implement the display function of the electronic device 100.
The GPIO interface may be configured by software. It may be configured to carry control signals or data signals. In some embodiments, a GPIO interface may be used to connect the processor 110 to the camera 193, the display screen 194, the wireless communication module 160, the audio module 170, the sensor module 180, and the like. The GPIO interface may also be configured as an I2C interface, an I2S interface, a UART interface, a MIPI interface, or the like.
The USB interface 130 is an interface conforming to the USB standard specification, and may specifically be a Mini USB interface, a Micro USB interface, a USB Type-C interface, or the like. The USB interface 130 may be used to connect a charger to charge the electronic device 100, and may also be used to transmit data between the electronic device 100 and a peripheral device. It may also be used to connect earphones and play audio through them. The interface may further be used to connect other electronic devices, such as AR devices.
It should be understood that the interface connection relationship between the modules illustrated in the embodiments of the present application is only an illustration, and does not limit the structure of the electronic device 100. In other embodiments of the present application, the electronic device 100 may also adopt different interface connection manners or a combination of multiple interface connection manners in the above embodiments.
The charging management module 140 is configured to receive charging input from a charger. The charger may be a wireless charger or a wired charger. In some wired charging embodiments, the charging management module 140 may receive charging input from a wired charger via the USB interface 130. In some wireless charging embodiments, the charging management module 140 may receive a wireless charging input through a wireless charging coil of the electronic device 100. The charging management module 140 may also supply power to the electronic device through the power management module 141 while charging the battery 142.
The power management module 141 is used to connect the battery 142, the charging management module 140 and the processor 110. The power management module 141 receives input from the battery 142 and/or the charge management module 140, and supplies power to the processor 110, the internal memory 121, the display 194, the camera 193, the wireless communication module 160, and the like. The power management module 141 may also be used to monitor parameters such as battery capacity, battery cycle count, battery state of health (leakage, impedance), etc. In some other embodiments, the power management module 141 may also be disposed in the processor 110. In other embodiments, the power management module 141 and the charging management module 140 may be disposed in the same device.
The wireless communication function of the electronic device 100 may be implemented by the antenna 1, the antenna 2, the mobile communication module 150, the wireless communication module 160, a modem processor, a baseband processor, and the like.
The antennas 1 and 2 are used for transmitting and receiving electromagnetic wave signals. Each antenna in the electronic device 100 may be used to cover a single or multiple communication bands. Different antennas can also be multiplexed to improve the utilization of the antennas. For example: the antenna 1 may be multiplexed as a diversity antenna of a wireless local area network. In other embodiments, the antenna may be used in conjunction with a tuning switch.
The mobile communication module 150 may provide a solution including 2G/3G/4G/5G wireless communication applied to the electronic device 100. The mobile communication module 150 may include at least one filter, a switch, a power amplifier, a Low Noise Amplifier (LNA), and the like. The mobile communication module 150 may receive the electromagnetic wave from the antenna 1, filter, amplify, etc. the received electromagnetic wave, and transmit the electromagnetic wave to the modem processor for demodulation. The mobile communication module 150 may also amplify the signal modulated by the modem processor, and convert the signal into electromagnetic wave through the antenna 1 to radiate the electromagnetic wave. In some embodiments, at least some of the functional modules of the mobile communication module 150 may be disposed in the processor 110. In some embodiments, at least some of the functional modules of the mobile communication module 150 may be disposed in the same device as at least some of the modules of the processor 110.
The modem processor may include a modulator and a demodulator. The modulator is used for modulating a low-frequency baseband signal to be transmitted into a medium-high frequency signal. The demodulator is used for demodulating the received electromagnetic wave signal into a low-frequency baseband signal. The demodulator then passes the demodulated low frequency baseband signal to a baseband processor for processing. The low frequency baseband signal is processed by the baseband processor and then transferred to the application processor. The application processor outputs a sound signal through an audio device (not limited to the speaker 170A, the receiver 170B, etc.) or displays an image or video through the display screen 194. In some embodiments, the modem processor may be a stand-alone device. In other embodiments, the modem processor may be provided in the same device as the mobile communication module 150 or other functional modules, independent of the processor 110.
The wireless communication module 160 may provide a solution for wireless communication applied to the electronic device 100, including Wireless Local Area Networks (WLANs) (e.g., wireless fidelity (Wi-Fi) networks), bluetooth (bluetooth, BT), Global Navigation Satellite System (GNSS), Frequency Modulation (FM), Near Field Communication (NFC), Infrared (IR), and the like. The wireless communication module 160 may be one or more devices integrating at least one communication processing module. The wireless communication module 160 receives electromagnetic waves via the antenna 2, performs frequency modulation and filtering processing on electromagnetic wave signals, and transmits the processed signals to the processor 110. The wireless communication module 160 may also receive a signal to be transmitted from the processor 110, perform frequency modulation and amplification on the signal, and convert the signal into electromagnetic waves through the antenna 2 to radiate the electromagnetic waves.
In some embodiments, the antenna 1 of the electronic device 100 is coupled to the mobile communication module 150, and the antenna 2 is coupled to the wireless communication module 160, so that the electronic device 100 can communicate with networks and other devices through wireless communication technologies. The wireless communication technologies may include global system for mobile communications (GSM), general packet radio service (GPRS), code division multiple access (CDMA), wideband code division multiple access (WCDMA), time-division synchronous code division multiple access (TD-SCDMA), long term evolution (LTE), BT, GNSS, WLAN, NFC, FM, and/or IR technologies, etc. The GNSS may include the global positioning system (GPS), the global navigation satellite system (GLONASS), the BeiDou navigation satellite system (BDS), the quasi-zenith satellite system (QZSS), and/or a satellite-based augmentation system (SBAS).
The electronic device 100 implements display functions via the GPU, the display screen 194, and the application processor. The GPU is a microprocessor for image processing, and is connected to the display screen 194 and an application processor. The GPU is used to perform mathematical and geometric calculations for graphics rendering. The processor 110 may include one or more GPUs that execute program instructions to generate or alter display information.
The display screen 194 is used to display images, video, and the like. The display screen 194 includes a display panel. The display panel may be a liquid crystal display (LCD), an organic light-emitting diode (OLED), an active-matrix organic light-emitting diode (AMOLED), a flexible light-emitting diode (FLED), a Mini-LED, a Micro-LED, a Micro-OLED, a quantum dot light-emitting diode (QLED), or the like. In some embodiments, the electronic device 100 may include 1 or N display screens 194, with N being a positive integer greater than 1.
The electronic device 100 may implement a shooting function through the ISP, the camera 193, the video codec, the GPU, the display 194, the application processor, and the like.
The ISP is used to process the data fed back by the camera 193. For example, when a photo is taken, the shutter is opened, light is transmitted to the camera photosensitive element through the lens, the optical signal is converted into an electrical signal, and the camera photosensitive element transmits the electrical signal to the ISP for processing, which converts it into an image visible to the naked eye. The ISP may also perform algorithmic optimization on the noise, brightness, and skin tone of the image. The ISP may also optimize parameters such as exposure and color temperature of a shooting scene. In some embodiments, the ISP may be provided in the camera 193.
The camera 193 is used to capture still images or video. The object generates an optical image through the lens and projects the optical image to the photosensitive element. The photosensitive element may be a Charge Coupled Device (CCD) or a complementary metal-oxide-semiconductor (CMOS) phototransistor. The light sensing element converts the optical signal into an electrical signal, which is then passed to the ISP where it is converted into a digital image signal. And the ISP outputs the digital image signal to the DSP for processing. The DSP converts the digital image signal into image signal in standard RGB, YUV and other formats. In some embodiments, the electronic device 100 may include 1 or N cameras 193, N being a positive integer greater than 1.
The digital signal processor is used to process digital signals; in addition to digital image signals, it can process other digital signals. For example, when the electronic device 100 performs frequency point selection, the digital signal processor is used to perform a Fourier transform or the like on the frequency point energy.
Video codecs are used to compress or decompress digital video. The electronic device 100 may support one or more video codecs. In this way, the electronic device 100 may play or record video in a variety of encoding formats, such as: moving Picture Experts Group (MPEG) 1, MPEG2, MPEG3, MPEG4, and the like.
The NPU is a neural-network (NN) computing processor. By drawing on the structure of biological neural networks, for example the signal transfer mode between neurons of the human brain, it processes input information quickly and can also continuously learn on its own. Applications such as intelligent cognition of the electronic device 100 can be implemented through the NPU, for example: image recognition, face recognition, speech recognition, and text understanding.
The external memory interface 120 may be used to connect an external memory card, such as a Micro SD card, to extend the memory capability of the electronic device 100. The external memory card communicates with the processor 110 through the external memory interface 120 to implement a data storage function. For example, files such as music, video, etc. are saved in an external memory card.
The internal memory 121 may be used to store computer-executable program code, which includes instructions. The internal memory 121 may include a program storage area and a data storage area. The storage program area may store an operating system, an application program (such as a sound playing function, an image playing function, etc.) required by at least one function, and the like. The storage data area may store data (such as audio data, phone book, etc.) created during use of the electronic device 100, and the like. In addition, the internal memory 121 may include a high-speed random access memory, and may further include a nonvolatile memory, such as at least one magnetic disk storage device, a flash memory device, a universal flash memory (UFS), and the like. The processor 110 executes various functional applications of the electronic device 100 and data processing by executing instructions stored in the internal memory 121 and/or instructions stored in a memory provided in the processor.
The electronic device 100 may implement audio functions, such as music playing and recording, through the audio module 170, the speaker 170A, the receiver 170B, the microphone 170C, the headphone interface 170D, and the application processor.
The audio module 170 is used to convert digital audio information into an analog audio signal output and also to convert an analog audio input into a digital audio signal. The audio module 170 may also be used to encode and decode audio signals. In some embodiments, the audio module 170 may be disposed in the processor 110, or some functional modules of the audio module 170 may be disposed in the processor 110.
The speaker 170A, also called a "horn", is used to convert an audio electrical signal into a sound signal. The electronic device 100 can play music or answer a hands-free call through the speaker 170A.
The receiver 170B, also called an "earpiece", is used to convert an audio electrical signal into a sound signal. When the electronic device 100 answers a call or receives voice information, the user can listen to the voice by holding the receiver 170B close to the ear.
The microphone 170C, also referred to as a "mic", is used to convert a sound signal into an electrical signal. When making a call or sending voice information, the user can input a sound signal into the microphone 170C by speaking with the mouth close to the microphone 170C. The electronic device 100 may be provided with at least one microphone 170C. In other embodiments, the electronic device 100 may be provided with two microphones 170C to implement a noise reduction function in addition to collecting sound signals. In other embodiments, the electronic device 100 may further be provided with three, four, or more microphones 170C to collect sound signals, reduce noise, identify sound sources, perform directional recording, and so on.
The headphone interface 170D is used to connect a wired headphone. The headset interface 170D may be the USB interface 130, or may be a 3.5mm open mobile electronic device platform (OMTP) standard interface, a cellular telecommunications industry association (cellular telecommunications industry association of the USA, CTIA) standard interface.
The pressure sensor 180A is used to sense a pressure signal and convert it into an electrical signal. In some embodiments, the pressure sensor 180A may be disposed on the display screen 194. There are many types of pressure sensors 180A, such as resistive pressure sensors, inductive pressure sensors, and capacitive pressure sensors. A capacitive pressure sensor may include at least two parallel plates made of conductive material. When a force acts on the pressure sensor 180A, the capacitance between the electrodes changes, and the electronic device 100 determines the pressure intensity from the change in capacitance. When a touch operation acts on the display screen 194, the electronic device 100 detects the intensity of the touch operation through the pressure sensor 180A. The electronic device 100 may also calculate the touched position from the detection signal of the pressure sensor 180A. In some embodiments, touch operations that act on the same touch position but with different intensities may correspond to different operation instructions. For example, when a touch operation whose intensity is less than a first pressure threshold acts on the short message application icon, an instruction to view the short message is executed; when a touch operation whose intensity is greater than or equal to the first pressure threshold acts on the short message application icon, an instruction to create a new short message is executed.
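The pressure-dependent dispatch described above can be sketched as follows. This is a minimal illustration only; the threshold value, function name, and action names are assumptions for the sketch, not taken from the patent:

```python
# Hypothetical sketch of pressure-dependent touch dispatch for the pressure
# sensor 180A: a touch on the same position triggers different instructions
# depending on the touch intensity relative to a threshold.

FIRST_PRESSURE_THRESHOLD = 0.5  # assumed normalized pressure value


def dispatch_touch_on_sms_icon(pressure: float) -> str:
    """Return the instruction executed for a touch on the SMS app icon."""
    if pressure < FIRST_PRESSURE_THRESHOLD:
        return "view_sms"        # lighter touch: view short messages
    return "create_new_sms"      # firmer touch: create a new short message
```

For example, `dispatch_touch_on_sms_icon(0.2)` would select the viewing instruction, while a firm press such as `0.9` would select the new-message instruction.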
The gyro sensor 180B may be used to determine the motion attitude of the electronic device 100. In some embodiments, the angular velocity of electronic device 100 about three axes (i.e., the x, y, and z axes) may be determined by gyroscope sensor 180B. The gyro sensor 180B may be used for photographing anti-shake. For example, when the shutter is pressed, the gyro sensor 180B detects a shake angle of the electronic device 100, calculates a distance to be compensated for by the lens module according to the shake angle, and allows the lens to counteract the shake of the electronic device 100 through a reverse movement, thereby achieving anti-shake. The gyroscope sensor 180B may also be used for navigation, somatosensory gaming scenes.
The air pressure sensor 180C is used to measure air pressure. In some embodiments, electronic device 100 calculates altitude, aiding in positioning and navigation, from barometric pressure values measured by barometric pressure sensor 180C.
The magnetic sensor 180D includes a Hall sensor. The electronic device 100 may detect the opening and closing of a flip holster using the magnetic sensor 180D. In some embodiments, when the electronic device 100 is a flip phone, the electronic device 100 may detect the opening and closing of the flip cover according to the magnetic sensor 180D. Features such as automatic unlocking upon opening may then be set according to the detected open or closed state of the holster or flip cover.
The acceleration sensor 180E may detect the magnitude of acceleration of the electronic device 100 in various directions (typically three axes). The magnitude and direction of gravity can be detected when the electronic device 100 is stationary. It can also be used to recognize the posture of the electronic device, and is applied to landscape/portrait switching, pedometers, and similar applications.
The distance sensor 180F is used to measure distance. The electronic device 100 may measure distance by infrared or laser. In some embodiments, in a shooting scenario, the electronic device 100 may use the distance sensor 180F to measure distance for fast focusing.
The proximity light sensor 180G may include, for example, a light-emitting diode (LED) and a light detector, such as a photodiode. The light-emitting diode may be an infrared light-emitting diode. The electronic device 100 emits infrared light outward through the light-emitting diode and uses the photodiode to detect infrared light reflected from nearby objects. When sufficient reflected light is detected, the electronic device 100 can determine that there is an object nearby; when insufficient reflected light is detected, it can determine that there is no object nearby. The electronic device 100 can use the proximity light sensor 180G to detect that the user is holding the electronic device 100 close to the ear during a call, so as to automatically turn off the screen to save power. The proximity light sensor 180G may also be used in holster mode and pocket mode to automatically unlock and lock the screen.
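The reflected-light decision above reduces to a simple threshold test. The following is a hypothetical sketch; the threshold value and function names are illustrative assumptions, not specified in the patent:

```python
# Hypothetical sketch of the proximity decision for sensor 180G:
# sufficient reflected infrared light means an object is near.

REFLECTED_LIGHT_THRESHOLD = 10.0  # assumed sensor units


def object_nearby(reflected_light: float) -> bool:
    """True when enough reflected light is detected, i.e. an object is near."""
    return reflected_light >= REFLECTED_LIGHT_THRESHOLD


def screen_should_turn_off(in_call: bool, reflected_light: float) -> bool:
    """Turn the screen off when the device is held to the ear during a call."""
    return in_call and object_nearby(reflected_light)
```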
The ambient light sensor 180L is used to sense the ambient light level. Electronic device 100 may adaptively adjust the brightness of display screen 194 based on the perceived ambient light level. The ambient light sensor 180L may also be used to automatically adjust the white balance when taking a picture. The ambient light sensor 180L may also cooperate with the proximity light sensor 180G to detect whether the electronic device 100 is in a pocket to prevent accidental touches.
The fingerprint sensor 180H is used to collect fingerprints. The electronic device 100 can use the collected fingerprint characteristics to implement fingerprint unlocking, application-lock access, fingerprint-triggered photographing, fingerprint-based call answering, and the like.
The temperature sensor 180J is used to detect temperature. In some embodiments, the electronic device 100 implements a temperature processing strategy using the temperature detected by the temperature sensor 180J. For example, when the temperature reported by the temperature sensor 180J exceeds a threshold, the electronic device 100 reduces the performance of a processor located near the temperature sensor 180J, so as to reduce power consumption and implement thermal protection. In other embodiments, when the temperature is below another threshold, the electronic device 100 heats the battery 142 to avoid an abnormal shutdown caused by low temperature. In still other embodiments, when the temperature is below a further threshold, the electronic device 100 boosts the output voltage of the battery 142 to avoid an abnormal shutdown caused by low temperature.
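The three-threshold temperature processing strategy described above can be sketched as a simple policy function. The concrete threshold values below are assumptions for illustration; the patent states only that the thresholds exist:

```python
# Hypothetical sketch of the temperature processing strategy for sensor 180J.
# Threshold values are assumed; only the three-tier structure comes from the text.

THROTTLE_ABOVE_C = 45.0      # assumed upper threshold
HEAT_BATTERY_BELOW_C = 0.0   # assumed lower threshold
BOOST_BELOW_C = -10.0        # assumed further (lowest) threshold


def thermal_action(temp_c: float) -> str:
    """Map a reported temperature to the action taken by the device."""
    if temp_c > THROTTLE_ABOVE_C:
        # Reduce performance of the nearby processor for thermal protection.
        return "throttle_nearby_processor"
    if temp_c < BOOST_BELOW_C:
        # Boost battery output voltage to avoid abnormal low-temperature shutdown.
        return "boost_battery_output_voltage"
    if temp_c < HEAT_BATTERY_BELOW_C:
        # Heat the battery to avoid abnormal low-temperature shutdown.
        return "heat_battery"
    return "normal_operation"
```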
The touch sensor 180K is also referred to as a "touch panel". The touch sensor 180K may be disposed on the display screen 194, and together they form a touch screen, also called a "touchscreen". The touch sensor 180K is used to detect a touch operation acting on or near it. The touch sensor may pass the detected touch operation to the application processor to determine the type of touch event. Visual output related to the touch operation may be provided through the display screen 194. In other embodiments, the touch sensor 180K may also be disposed on a surface of the electronic device 100 at a position different from that of the display screen 194.
The bone conduction sensor 180M may acquire a vibration signal. In some embodiments, the bone conduction sensor 180M may acquire the vibration signal of the bone vibrated by a person's voice. The bone conduction sensor 180M may also contact the human pulse to receive a blood pressure pulsation signal. In some embodiments, the bone conduction sensor 180M may also be disposed in a headset, integrated into a bone conduction headset. The audio module 170 may parse out a voice signal based on the voice-vibrated bone vibration signal acquired by the bone conduction sensor 180M, to implement a voice function. The application processor may parse out heart rate information based on the blood pressure pulsation signal acquired by the bone conduction sensor 180M, to implement a heart rate detection function.
The keys 190 include a power key, a volume key, and the like. The keys 190 may be mechanical keys or touch keys. The electronic device 100 may receive key inputs and generate key signal inputs related to user settings and function control of the electronic device 100.
The motor 191 may generate vibration cues. The motor 191 may be used for incoming-call vibration cues as well as touch vibration feedback. For example, touch operations acting on different applications (such as photographing and audio playing) may correspond to different vibration feedback effects. The motor 191 may also produce different vibration feedback effects for touch operations applied to different areas of the display screen 194. Different application scenarios (such as time reminders, receiving information, alarm clocks, and games) may also correspond to different vibration feedback effects. The touch vibration feedback effect may also be customized.
Indicator 192 may be an indicator light that may be used to indicate a state of charge, a change in charge, or a message, missed call, notification, etc.
The SIM card interface 195 is used to connect a SIM card. A SIM card can be connected to or disconnected from the electronic device 100 by inserting it into or removing it from the SIM card interface 195. The electronic device 100 may support 1 or N SIM card interfaces, N being a positive integer greater than 1. The SIM card interface 195 may support a Nano SIM card, a Micro SIM card, a SIM card, and the like. Multiple cards may be inserted into the same SIM card interface 195 at the same time; the types of the cards may be the same or different. The SIM card interface 195 may also be compatible with different types of SIM cards, as well as with external memory cards. The electronic device 100 interacts with the network through the SIM card to implement functions such as calls and data communication. In some embodiments, the electronic device 100 uses an eSIM, that is, an embedded SIM card. The eSIM card can be embedded in the electronic device 100 and cannot be separated from it.
Based on the system framework described in fig. 1 and fig. 2 and the hardware framework of the electronic device described in fig. 3, an embodiment of the present application provides an application interaction method between multiple devices.
In this embodiment, the prerequisite for achieving interaction among multiple devices is that the devices are connected to the same network and are logged in to the same account, or the accounts logged in on the devices belong to the same account group, so that each device can discover the other devices and obtain their related information. For example, referring to fig. 2, in this case distributed data management may be performed on the data of the multiple devices. All the embodiments described below can be implemented under this prerequisite.
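The prerequisite above (same network, plus same account or accounts in the same account group) can be expressed as a small eligibility check. This is a minimal sketch; the field names and data shapes are illustrative assumptions:

```python
# Hypothetical sketch of the interaction precondition: two devices can
# interact only when they are on the same network and either share an
# account or their accounts belong to the same account group.

def can_interact(dev_a: dict, dev_b: dict, account_groups: list) -> bool:
    if dev_a["network_id"] != dev_b["network_id"]:
        return False  # must be connected to the same network
    if dev_a["account"] == dev_b["account"]:
        return True   # same account logged in on both devices
    # Otherwise both accounts must belong to the same account group.
    return any(dev_a["account"] in group and dev_b["account"] in group
               for group in account_groups)
```

For instance, a phone and a TV on the same home network logged in to accounts that share an account group would pass this check, while a device on a different network would not.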
First embodiment, cross-device start-up application among multiple electronic devices
The process of launching an application across devices is described below by taking any two devices of the plurality of electronic devices included in an information interaction system to which the present application is applicable as an example. The two devices may be referred to as a first device and a second device, respectively. The first device serves as the device that initiates the cross-device start-up process, and the second device serves as the device on which the application is started.
It should be noted that, to implement cross-device application start-up, the electronic device on which the application is to be started must support running the application, that is, it must have the application's running environment. The running environment includes the hardware environment and the software environment required to run the application. For example, if the application is a video playing application, the hardware environment required to run it includes a display screen, a sound player, a video codec, and the like; the required software environment includes, for example, operating system support. Only an electronic device equipped with such a hardware environment and software environment can have the video playing application started on it across devices.
Assume, then, that the first device initiates a procedure for starting a first application on the second device. The first application may be any one of the applications included in the first device, and the second device has the running environment for the first application. The first device may sense, through a user interface (UI) on the display screen, a first operation on the icon of the first application, and then, in response to the first operation, display a selection menu containing a control for the second device. The first device then senses a selection operation on the control of the second device and, in response, sends a request to the second device to start the first application. After receiving the request, the second device starts the first application accordingly.
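The end-to-end flow just described can be sketched as follows. All class, method, and field names below are illustrative assumptions, not from the patent; the sketch only mirrors the sequence of steps (first operation, selection menu, start request, environment check, start):

```python
# Hypothetical sketch of the cross-device start-up flow between a first
# device (initiator) and a second device (where the application starts).

class SecondDevice:
    def __init__(self, name, environment):
        self.name = name
        self.environment = set(environment)  # available hardware/software
        self.running_apps = []

    def handle_start_request(self, app_name, required_env):
        # Start the application only when the running environment is present.
        if set(required_env) <= self.environment:
            self.running_apps.append(app_name)
            return "started"
        return "rejected: missing running environment"


class FirstDevice:
    def __init__(self, peers):
        self.peers = peers  # devices discovered on the same network/account group

    def long_press_app_icon(self, app_name):
        # First operation: display a selection menu of candidate devices.
        return [peer.name for peer in self.peers]

    def select_device(self, device_name, app_name, required_env):
        # Selection operation: send the start request to the chosen device.
        target = next(p for p in self.peers if p.name == device_name)
        return target.handle_start_request(app_name, required_env)
```

As a usage example, a phone could offer a TV and a watch in its menu; selecting the TV for a video application that needs a display and a video codec would succeed, while the watch, lacking a codec, would reject the request.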
Optionally, the first operation may be, for example, a long-press operation on the icon of the first application, or may also be, for example, a click operation of multiple fingers on the icon of the first application, or may be, for example, a contact sliding operation of multiple fingers on the icon of the first application, or the like.
The long-press operation may be, for example, an operation in which the icon of the application is pressed for no less than a first preset time period. The first preset time period may be, for example, 1 second, 2 seconds, or 3 seconds; its specific value depends on the circumstances and is not limited in this embodiment.
The multi-finger slide operation on the icon of the first application may be, for example, an operation in which any two fingers, such as the index finger and the middle finger, slide across the icon by a first preset length. The first preset length may be, for example, 0.1 cm or 0.3 cm. The first preset length and the fingers used depend on the circumstances, and the present solution is not limited in this respect.
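As an illustration only, the threshold logic for recognizing the first operation described above (a long press no shorter than a first preset time period, or a multi-finger slide no shorter than a first preset length) might be sketched as follows. The function name and the concrete threshold values are assumptions, not values fixed by the embodiment:

```python
LONG_PRESS_SECONDS = 1.0   # first preset time period (assumed example value)
SLIDE_LENGTH_CM = 0.1      # first preset length (assumed example value)


def classify_first_operation(press_duration_s, finger_count, slide_length_cm):
    """Classify a touch on an application icon as one kind of 'first operation'."""
    if finger_count == 1 and press_duration_s >= LONG_PRESS_SECONDS:
        return "long-press"
    if finger_count >= 2 and slide_length_cm >= SLIDE_LENGTH_CM:
        return "multi-finger-slide"
    if finger_count >= 2:
        return "multi-finger-click"
    return None  # not a first operation
```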
It should be noted that, in this specification, operations on the user interface of the electronic device, such as clicking, dragging, and the like, may be implemented by a tool, such as a stylus, a mouse, and the like, or may also be implemented by a finger of a user, which is determined according to actual situations, and this is not limited by the present solution.
To facilitate understanding of the first embodiment, the following description is made with reference to the accompanying drawings.
Referring to fig. 4A, fig. 4A is a schematic view of a user interface of the first device. The user interface includes icons of one or more applications. The applications may be of various types, such as video playing applications, music applications, social applications, office applications, shopping applications, financial applications, and so forth. The specific applications depend on the circumstances, and the present solution is not limited in this respect.
Assume that the first application is any one of the applications in fig. 4A. The first device may sense a first operation on the icon of the first application through the interface shown in fig. 4A. Illustratively, the first operation shown in fig. 4A is a long-press operation on the icon of the first application with a finger.
In response to the first operation on the icon of the first application, the first device searches for one or more devices that have logged in to the same account as the first device, or whose logged-in accounts belong to the same account group, and then selects the devices that have the running environment of the first application and displays them on the display screen in the form of a selection menu.
Referring to fig. 4B, the first device displays a selection menu 401 in the interface in response to the above-described first operation on the icon of the first application. The selection menu 401 includes controls of one or more electronic devices; for example, it may include a second device control 4011, a third device control 4012, a fourth device control 4013, a fifth electronic device control 4014, and controls of other devices. Illustratively, the second device may be a TV, the third device a PC, the fourth device an iPhone2, and the fifth electronic device a Tablet PC. This is merely an example; which devices the second, third, fourth, and fifth electronic devices actually are depends on the circumstances, and the present solution is not limited in this respect.
Here, the selection menu 401 exemplarily includes controls of four electronic devices. In practice, the number of electronic devices included in the selection menu 401, and which devices' controls are included, depend on the circumstances and are not limited in this embodiment.
Furthermore, the controls of the electronic devices in the selection menu 401 may be displayed in the interface as the names of the electronic devices, or as icons together with names. The specific form of presentation depends on the circumstances, and the present solution does not limit it.
The devices in the selection menu 401 are all devices that are connected to the same network as the first device and that have logged in to the same account, or whose logged-in accounts belong to the same account group. Optionally, the devices in the selection menu 401 are all devices that meet the requirements of the running environment of some or all of the sub-applications of the first application.
The following example illustrates a device that meets the requirements of the running environment of some sub-applications of the first application. Assume that the first application is a video playing application whose running requires a display, a sound player, a video codec module, and the like. The video playing application may be decomposed into two sub-applications: one that displays the video picture and one that plays the video sound. Assume that the sub-application displaying the video picture requires a running environment such as a display and a video codec module, while the sub-application playing the video sound requires a running environment such as a sound player. Any device that satisfies the running environment of one of the two sub-applications is called a device that meets the requirements of the running environment of some sub-applications of the first application, and may be displayed in the selection menu 401. For example, a smart speaker has a running environment that includes a sound player and can play the sound of the video played by the first application, so the sound-playing sub-application of the first application can be started on the smart speaker. Accordingly, a control of the smart speaker may be displayed in the selection menu 401.
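The sub-application matching in this example can be sketched as a simple capability check. The decomposition into two sub-applications and their required environments follow the video-playing example above; the capability names and function names are illustrative assumptions:

```python
# Each sub-application declares the running environment it requires
# (the video-playing decomposition from the example above).
VIDEO_APP_SUBAPPS = {
    "display-video": {"display", "video-codec"},
    "play-sound": {"sound-player"},
}


def runnable_subapps(device_capabilities, subapps=VIDEO_APP_SUBAPPS):
    """Return the sub-applications whose required environment the device meets."""
    return {name for name, required in subapps.items()
            if required <= device_capabilities}


def should_appear_in_menu(device_capabilities):
    # A device appears in selection menu 401 if it can run at least one
    # sub-application of the first application.
    return bool(runnable_subapps(device_capabilities))
```

For instance, a smart speaker advertising only a sound player would match the sound-playing sub-application and therefore appear in the menu.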
It should be noted that the position where the selection menu 401 is displayed on the display screen of the first device is not limited to the position shown in fig. 4B, and may be displayed at any position in the display screen, such as the middle position or the bottom position of the display screen. Optionally, the selection menu 401 displayed on the display screen does not obscure the icon of the first application.
The first device may sense a selection operation of the second device control 4011 at, for example, the interface shown in fig. 4B.
Optionally, the selection operation may be, for example, an operation in which an icon of the first application is dragged into the second device control 4011, for example, see fig. 4C. Specifically, the selection operation may be that the first application is pressed and dragged into the second device control 4011 and then released.
Alternatively, optionally, the selection operation may also be an operation that the second device control 4011 is clicked once, for example, as shown in fig. 4D. Specifically, after the second device control 4011 is clicked once, the second device is selected as the device on which the first application is started.
In response to the selection operation on the second device control 4011, the first device sends a request for starting the first application to the second device. After receiving the request, the second device starts the first application according to the request and then sends start-success information to the first device. After receiving this information, the first device may display it on the display screen, for example, as shown in fig. 4E, in the form of a prompt box 402. The prompt box 402 may include, for example, the description "the first application has been successfully launched in the second device". The prompt box 402 may be displayed at any position on the display screen of the first device, which is not limited in this embodiment.
The implementation process of the first embodiment is described below with reference to the software architecture diagram shown in fig. 2 and fig. 5:
after the first device senses the first operation on the icon of the first application, it may invoke its package management service module to query the configuration information of the first application, where the configuration information includes information about the running environment required by the first application. The first device then matches the configuration information against the one or more devices it has found that have logged in to the same account as the first device, or whose logged-in accounts belong to the same account group, and selects the devices that match the configuration information, that is, the devices that have the running environment of some or all of the sub-applications of the first application. The display screen is then invoked to display the selected devices on the user interface.
After the first device senses a selection operation on the control 4011 of the second device through the display screen, a distributed scheduling management service module of the first device is called to initiate starting of a first application of the second device. Specifically, the distributed scheduling management service module calls a distributed data management service module to query the application management service module of the second device in the distributed data management server according to the device identifier ID of the second device.
Then, the distributed scheduling management service module of the first device sends a request for starting the first application to the application management service module of the second device. And the application management Service module of the second device calls the package management Service module of the second device according to the request to inquire the configuration information of the Activity or the Service of the first application to be started.
Then, the application management Service module of the second device loads the running environment for running the first application according to the configuration information, and creates an Activity or Service thread of the first application.
Then the application management Service module of the second device calls the application suite module to initiate scheduling of the Activity or Service lifecycle of the first application, for example activating, hiding, or stopping the Activity.
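The second device's side of this sequence (query the configuration, load the running environment, create the Activity or Service thread, schedule its lifecycle) might be sketched as below. The class and method names are hypothetical stand-ins for the service modules named above, not the patent's actual interfaces:

```python
class ApplicationManagementService:
    """Illustrative stand-in for the second device's application management service."""

    def __init__(self, package_configs):
        # package_configs stands in for the package management service module:
        # it maps an application name to its Activity/Service configuration.
        self.package_configs = package_configs
        self.lifecycle_log = []

    def start_application(self, app):
        # Query the configuration information of the Activity or Service to start.
        config = self.package_configs[app]
        # Load the running environment and create the Activity/Service thread.
        self.load_running_environment(config)
        # Call the application suite module to schedule the lifecycle.
        self.schedule_lifecycle(app, "activate")
        return True

    def load_running_environment(self, config):
        pass  # placeholder: load codecs, display surfaces, etc.

    def schedule_lifecycle(self, app, state):
        # e.g. activate, hide, or stop the Activity
        self.lifecycle_log.append((app, state))
```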
In this way, the first device starts the first application of the second device across devices; no manual operation on the second device is needed, which is convenient, quick, and easy to operate.
The above embodiment is implemented in a case where the first application is already installed in the second device. The following exemplarily describes a procedure of launching an application across devices in a case where the first application has not been installed in the second device.
In a specific embodiment, after the first device sends the request for starting the first application to the second device, it is detected that the first application is not installed on the second device. The first device then sends an installation request for the first application to the second device; the request may be initiated, for example, through inter-process communication (IPC) between the devices. After receiving the installation request, the second device may send a request to a cloud application market to download the installation package of the first application, download the package, and install the first application. Alternatively, the second device may obtain the installation package of the first application from the first device through IPC and then install it. After the installation succeeds, the second device starts the first application according to the start request sent by the first device.
Optionally, the inter-process communication may be implemented, for example, using Google's gRPC remote procedure call mechanism.
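The install-if-missing fallback described above (try the cloud application market first, otherwise fetch the installation package from the first device over IPC) can be sketched as follows. The function signature and the use of plain sets to model the market and package stores are illustrative assumptions:

```python
def ensure_installed_and_start(second_device_apps, app, cloud_market,
                               first_device_packages):
    """Install the first application on the second device if absent, then start it.

    The second device first tries the cloud application market; failing that,
    it obtains the installation package from the first device over IPC.
    """
    if app not in second_device_apps:
        if app in cloud_market:
            second_device_apps.add(app)   # download + install from the market
        elif app in first_device_packages:
            second_device_apps.add(app)   # transfer the package over IPC, then install
        else:
            return False                  # no way to obtain the package
    return True  # start the application per the original start request
```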
The process of launching an application across devices without a first application installed in a second device is described below in conjunction with the software architecture diagram shown in fig. 2:
after the distributed scheduling management service module of the first device sends the request for starting the first application to the application management service module of the second device, the distributed scheduling management service module of the first device invokes the distributed data management service module and, when querying the distributed data management server by the device identifier ID of the second device, finds no Activity or Service installation information for the first application on the second device. This indicates that the first application is not installed on the second device. At this time, the first device may instruct the second device through IPC to install the first application.
Specifically, the second device invokes, according to the instruction of the first device, the package management service module of the second device to initiate a request for downloading the first application to an application market, and downloads and installs the installation package of the first application from the application market. Or, the first device may directly send the installation package of the first application to the second device through the local network, and the second device receives the installation package and then installs the installation package. After the installation is successful, the application management Service module of the second device calls the package management Service module of the second device to inquire the Activity or Service configuration information of the first application to be started according to the request for starting the first application. The following steps may correspond to the description of fig. 5, and are not described herein again.
Compared with the prior art, the above-described embodiments are not restricted to a specific application or by hardware limitations. All applications can run on corresponding devices according to their capabilities, without additional adaptation. In addition, the user can start and run an application on a specified device simply by clicking or dragging within the device list, which is convenient to operate. Moreover, the whole operation flow is easy for users to understand: no manual operation on the other device is needed, and the application on the other device can be started remotely and directly through one device.
In summary, according to the embodiments of the application, an application of another device can be started accurately from one device; based on the different applications and the currently networked devices, the menu of selectable devices can be displayed more accurately, which better matches the user's expectations and is easier to use.
In one possible implementation, based on the user's past selections and learning of the scenario, the first device may by default initiate starting the application on the corresponding device according to the selection the user commonly makes.
For ease of understanding, an example is given. Suppose the first device learns that, after the selection menu is displayed upon sensing the first operation on the email application, the selection operation it senses next is on the control of the tablet computer device. Then, the next time the first device senses the first operation on the email application, it may not display the selection menu for the user to choose from, but may instead directly select the tablet device according to the learned selection pattern and start the email application of the tablet computer.
Alternatively, the next time the first device senses the first operation on the email application, it may not display the selection menu for the user to choose from, but may instead display an inquiry window. The inquiry window may include a description such as "whether to start the email application of the tablet computer", together with "yes" and "no" controls. In response to a selection of the "yes" control, the first device may start the email application of the tablet computer across devices.
Of course, this is only an example, and the determination of which application is to be started and in which device the first device selects to start the application is determined according to specific situations, and this is not limited in this embodiment. According to the embodiment of the application, the flow can be simplified, and the user experience is improved.
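The learned default selection described in this implementation might be sketched as a simple frequency counter over past selections. The class name and the threshold of three observed selections before a default kicks in are assumptions for illustration only:

```python
from collections import Counter


class SelectionLearner:
    """Remember which device the user usually picks for each application."""

    def __init__(self, threshold=3):
        self.history = {}           # app -> Counter of chosen devices
        self.threshold = threshold  # assumed: selections seen before defaulting

    def record(self, app, device):
        # Called whenever the user picks a device from the selection menu.
        self.history.setdefault(app, Counter())[device] += 1

    def default_device(self, app):
        # Return the habitual device, or None to show the selection menu
        # (or an inquiry window) as usual.
        counts = self.history.get(app)
        if not counts:
            return None
        device, n = counts.most_common(1)[0]
        return device if n >= self.threshold else None
```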
Migrating an application running on the first device to the second device to continue running.
The following continues, taking any two devices (referred to as the first device and the second device) among the plurality of electronic devices included in the information interaction system to which the present application applies, to describe the process of migrating an application running on the first device to the second device to continue running.
Similarly, it should be noted that migrating the application running on the first device to the second device to continue running requires that the second device can support running some or all of the sub-applications of the application, that is, that it has the environment in which some or all of those sub-applications run.
Then, assume that the first application running in the first device is migrated to the second device to run continuously. The first application may be any one of applications included in the first device. And the second device is provided with a running environment for running part of or all of the sub-applications in the first application. The first device may first launch and run the first application. For example, assuming that the first application is a video playing application, the first device may first start the first application and play a video through the first application.
Then, the first device senses a first operation on a user interface of the first application, and then displays a selection menu including the second device in a display screen in response to the first operation. Then, the first device senses a selection operation of the second device, and sends a request for starting the first application to the second device in response to the selection operation, and after receiving the request, the second device starts the first application according to the request.
After the second device successfully starts the first application, it returns a start-success message to the first device. According to the message, the first device sends to the second device the context information of the first application as run on the first device, or the running state information of the first application, or both. The context information includes data that enables the first application to continue running; for example, it may include the Activity state of the first application, the UI component inheritance relationships of the displayed page, and the like. The running state information of the first application includes user interface data of the first application; for example, it may include the information displayed on the display screen of the first device while the first application was running there, information to be displayed, and the like.
After the second device receives the information sent by the first device, if the received information includes both the context information and the running state information, the second device may restore the application interface that was displayed on the first device according to this information, continue to display the application interface on its own display screen, and then resume the application's normal running state. For example, if the video playing interface is restored, playback can continue at normal speed after the second device displays the playing interface.
If the information received by the second device includes the context information but not the running state information, the second device may continue to run the first application according to the context information. Alternatively, the second device may obtain the running state information of the first application from the distributed data management server and then restore the application interface that was displayed on the first device.
If the information received by the second device includes the running state information but not the context information, the second device may, according to the running state information, run the first application independently on its own, restore the application interface that was displayed on the first device, and continue running.
Or, after the second device successfully starts the first application, the second device may directly obtain the context information and/or the running state information from the distributed data management server without the need for the first device to send the context information and/or the running state information, and then continue to run the first application on the second device according to the obtained information.
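The cases above (both items received, context only, running state only, or neither sent) might be summarized in a sketch like the following, where `data_server` is an illustrative stand-in for the distributed data management server and all names are assumptions:

```python
def resume_on_second_device(context_info, running_state, data_server):
    """Decide how the second device continues running the first application.

    Mirrors the cases described above; data_server is a dict-like stand-in
    for the distributed data management server.
    """
    if context_info is not None and running_state is not None:
        # Both received: restore the interface and resume normal running.
        return "restore-interface-and-continue"
    if context_info is not None:
        # Context only: continue from context, or fetch state from the server
        # to restore the interface as well.
        fetched = data_server.get("running_state")
        return "restore-interface-and-continue" if fetched else "continue-from-context"
    if running_state is not None:
        # Running state only: run independently and restore the interface.
        return "restore-interface-independently"
    # Neither sent: fetch everything from the distributed data server directly.
    return "fetch-from-server"
```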
In order to facilitate understanding of the second embodiment, the following description is made with reference to the accompanying drawings.
Referring to fig. 6A, fig. 6A is a schematic diagram of a user interface of the first device that has started and run the first application. The user interface includes a first application user interface 601, and the first application user interface 601 is mainly used for displaying the running content of the first application. For example, assuming that the first application is a video playing application, a screen displayed in the first application user interface 601 may include a played video, and the like. Here, the first application user interface 601 is only for example, and the content specifically displayed by the first application user interface 601 is determined according to specific situations, and the present solution is not limited to this.
When the content displayed in the first application user interface 601 needs to be migrated to the second device for continued display and operation, the first device may sense a first operation on the first application user interface 601 through the interface shown in fig. 6A. The first operation may be, for example, a multi-finger slide operation, such as a two-finger slide on the user interface 601; a multi-finger click operation, such as a two-finger click on the user interface 601; a long-press operation on the user interface 601; or the like.
The two-finger sliding operation, for example, referring to fig. 6A, may be two fingers, for example, an index finger and a middle finger, which are combined to press the first application user interface 601 for sliding, or may be any two adjacent fingers, which are combined to press the first application user interface 601 for sliding. The sliding direction may be any direction, and the present solution is not limited to this.
The sliding distance of the two-finger sliding operation may be not less than a first preset distance, the first preset distance may be, for example, 0.3 cm, 0.5 cm, or 1 cm, and the first preset distance may be determined according to specific situations, and this is not limited in this embodiment.
The two-finger click operation may be, for example, one click or two clicks on the first application user interface 601 performed with the index finger and the middle finger together within a third preset time period, or with any two adjacent fingers together within the third preset time period. The third preset time period may be, for example, 0.3 second, 0.5 second, or 0.8 second; its value depends on the circumstances and is not limited in this embodiment.
In response to the first operation on the first application user interface 601, the first device searches for one or more devices that have logged in to the same account as the first device, or whose logged-in accounts belong to the same account group, and then selects the devices that have the running environment of some or all of the sub-applications of the first application and displays them on the display screen in the form of a selection menu.
Referring to fig. 6B, the first device displays a selection menu 602 in the interface in response to the above-described first operation on the first application user interface 601. The selection menu 602 includes controls of one or more electronic devices, for example a second device control 6021, a third device control 6022, a fourth device control 6023, and a fifth electronic device control 6024. Illustratively, the second device may be a TV, the third device a PC, the fourth device a Phone2, and the fifth electronic device a Tablet PC. This is merely an example; which devices the second, third, fourth, and fifth electronic devices actually are depends on the circumstances, and the present solution is not limited in this respect.
Here, it is exemplarily given that the selection menu 602 includes controls of 4 electronic devices, and actually, the number of the electronic devices included in the selection menu 602 and the controls of which devices are specifically included are determined according to specific situations, which is not limited in this embodiment.
Similarly, the controls of the electronic devices included in the selection menu 602 may be displayed in the interface in the form of names of the electronic devices, or may be displayed in the interface in the form of icons plus names. The specific form of expression is determined according to the actual situation, and the scheme does not limit the form.
The devices in the selection menu 602 are all devices that are connected to the same network as the first device and login to the same account or login accounts belonging to the same account group. Optionally, the devices in the selection menu 602 are all devices that can meet the requirements of the running environment of some or all of the sub-applications of the first application.
It should be noted that the position where the selection menu 602 is displayed on the display screen of the first device is not limited to the position shown in fig. 6B, and may be displayed at any position in the display screen, such as the middle position or the bottom position of the display screen.
The first device may sense a selection operation of the second device control 6021 at an interface such as that shown in fig. 6B.
The selection operation may be, for example, an operation in which the first application user interface 601 is dragged into the second device control 6021. For example, referring again to fig. 6B, the drag may be performed with the index finger and the middle finger together pressing and dragging the first application user interface 601, or with a single finger, such as the index finger or the middle finger, pressing and dragging it. Which manner of dragging is used depends on the circumstances, and the present solution is not limited in this respect.
During the above-described dragging of the first application user interface 601 toward the second device control 6021, the first application user interface 601 may be scaled down to facilitate the dragging, as may be seen, for example, in fig. 6C.
The first application user interface 601 is dragged into the second device control 6021 described above, as may be seen in fig. 6D, for example. The first application user interface 601 is then released, thus completing the selection of the second device.
In one possible implementation, the selection operation may be, for example, a one-click operation of the second device control 6021, for example, as shown in fig. 6E. In particular, the second device control 6021 may complete the selection of the second device after a single click.
In response to the selection operation on the second device control 6021, the first device sends a request for starting the first application to the second device. After receiving the request, the second device starts the first application according to the request.
Similarly, the second embodiment is implemented in the case where the first application is already installed on the second device. For the process of starting the application across devices when the first application is not installed on the second device, reference may be made to the corresponding description in the first embodiment, and details are not repeated here.
After the second device successfully starts the first application, it returns a start-success message to the first device. According to the message, the first device sends one or more of the context information and the running state information of the first application, as run on the first device, to the second device.
The implementation process of the second embodiment is described below with reference to the software architecture diagram shown in fig. 2 and fig. 7:
first, the process in which the first device starts and runs the first application is described. When the first device senses a click operation on the icon of the first application through the display screen, it invokes the distributed scheduling management service module in response to the click operation, which determines that the first application is an application of the first device itself. The distributed scheduling management service module then sends a request for starting the first application to the application management service module of the first device. After receiving the request, the application management service module calls the application suite module of the first device to schedule the Activity lifecycle of the first application, and the application suite module then calls the display screen to display the user interface of the first application.
The first device displays a selection menu 602 through the display screen in response to a first operation of the user interface of the first application. Specifically, after the first device senses a first operation on the user interface of the first application, the first device may invoke a package management service module of the first device to query configuration information of the first application, where the configuration information includes information of an operating environment required for the operation of the first application. Then, the first device matches the configuration information with one or more devices which are searched for and log in the same account as the first device or the account which is logged in belongs to an account group. And selecting a device matched with the configuration information, namely selecting a device with the running environment of part of or all of the sub-applications of the first application. The display screen is then invoked to display the selected device in the user interface, i.e., to display the selection menu 602 described above.
When the first device senses, through the display screen, a selection operation on the second device control 6021 in the selection menu 602, the distributed scheduling management service module of the first device is invoked to initiate the launch of the first application on the second device. Specifically, the distributed scheduling management service module calls the distributed data management service module to look up the application management service module of the second device in the distributed data management server according to the device identifier (ID) of the second device.
Then, the distributed scheduling management service module of the first device sends a request for starting the first application to the application management service module of the second device. The request carries a callback parameter, which is used to return to the first device an indication that the second device has successfully launched the first application. According to the request, the application management service module of the second device calls the package management service module of the second device to query the configuration information of the Activity of the first application to be started.
Then, the application management service module of the second device loads the running environment for running the first application according to the configuration information, and creates a thread for the Activity of the first application.
Then, the application management service module of the second device calls the application suite module to initiate scheduling of the life cycle of the Activity of the first application.
After the Activity of the first application is successfully activated, the application suite module of the second device sends callback information to the application management service module of the second device according to the callback parameter. The callback information may include, for example, parameters that control the running of the first application in the second device. The application management service module of the second device then sends the callback information to the distributed scheduling management service module of the first device, which forwards the callback parameter to the application suite module of the first device. According to the callback parameter, the application suite module sends one or more items of the context information and the running state information of the first application running in the first device to the application suite module of the second device. The application suite module of the second device restores the running interface of the first application according to the one or more items of context information and running state information, displays it on the display screen, and continues running the first application.
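The launch-and-migrate handshake just described can be sketched as below. This is a hypothetical model for illustration; the class names, fields, and the shape of the callback information are invented and are not the patent's APIs.

```python
# Hypothetical sketch of the migration handshake: the second device activates
# the Activity and returns callback information; the first device then hands
# over its context and running state, which the second device restores.

class SecondDevice:
    def start_activity(self, callback_param):
        # Activity activated; callback info (e.g. control parameters)
        # flows back toward the first device.
        self.ready = True
        return {"callback": callback_param, "control": {"session": "s-1"}}

    def restore(self, context, running_state):
        # Rebuild the running interface from the migrated context/state.
        return {"interface": context["page"], "position": running_state["pos"]}

class FirstDevice:
    context = {"page": "player"}      # e.g. which user interface was open
    running_state = {"pos": 42}       # e.g. playback position in seconds

first, second = FirstDevice(), SecondDevice()
cb = second.start_activity(callback_param="migrate-1")        # start request
resumed = second.restore(first.context, first.running_state)  # state hand-off
# resumed describes the interface the second device continues running
```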
In one possible implementation, the above-mentioned one or more items of context information and running state information of the first application running in the first device may be sent to the application suite module of the second device through a remote agent.
In this way, the application running in the first device is migrated to the second device to continue running without any manual operation on the second device, which is convenient, quick, and easy to operate.
It should be noted that the implementation process described above with reference to the software architecture diagrams shown in fig. 2 and fig. 7 also assumes that the first application is installed in the second device. When the first application is not installed in the second device, referring to the description, given above with reference to the software architecture diagram shown in fig. 2, of starting an application across devices when the first application is not installed in the second device, the first application is first installed in the second device, and is then started and migrated. For details, reference may be made to the foregoing description, which is not repeated here.
Through this embodiment of the application, in a multi-device scenario, a user can flexibly migrate and link applications among devices, achieving a better experience.
In one possible implementation, based on the user's past selections and learning of the scenario, the first device may by default initiate the starting of the application on the device the user usually selects, and then migrate the application running in the first device to that device to continue running. How the starting of the application on the corresponding device is initiated according to the user's usual selection may be similar to the corresponding description in the first embodiment, and details are not repeated here.
Controlling a plurality of electronic devices to jointly run an application through the first device.
The following describes the process of controlling, by the first device, a plurality of electronic devices to jointly run an application, taking as an example any three devices (referred to as the first device, the third device, and the fourth device) among the plurality of electronic devices included in an information interaction system to which this application is applicable.
In this embodiment, the first device may be regarded as a controller capable of simultaneously controlling the third device and the fourth device to jointly run an application. Of course, these devices need to have the corresponding running environments. For example, assume that the first device is a mobile phone, the third device is a TV, and the fourth device is a smart speaker. The application running on the mobile phone is a video playing application, which may include two sub-applications: displaying the video picture and playing the video sound. The third device, the TV, can run the sub-application of displaying the video picture, and the fourth device, the smart speaker, can run the sub-application of playing the video sound. Meanwhile, the mobile phone can remotely control one or more items of the running logic sequence, the user interface update state, and the multimedia playing progress of the sub-applications running on the TV and the smart speaker.
A first application in a first device is described as an example below. The first application may be any one of applications included in the first device. The first device may first launch and run the first application. For example, assuming that the first application is a video playing application, the first device may first start the first application and play a video through the first application.
The first device then senses a first operation on the user interface of the first application, and in response displays in the display screen a selection menu including a combination control for the third device and the fourth device. The first device then senses a selection operation on the combination control of the third device and the fourth device, and in response sends a request for starting the first application to each of the third device and the fourth device; after receiving the request, the third device and the fourth device each start their respective first application according to the request.
Optionally, the first device may send the requests for starting the first application to the third device and the fourth device concurrently, or sequentially. Concurrent sending enables the first application to start synchronously on the two devices, which facilitates synchronous control of the applications on them. Sequential sending is easier to implement and reduces the load on the processor. Which method is used is selected according to the actual situation, and this solution does not limit it.
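The two dispatch strategies can be sketched as follows. This is a hypothetical illustration: `send_request` stands in for the cross-device start request, and all names are invented.

```python
import threading

# Hypothetical sketch of concurrent vs. sequential dispatch of the start
# requests to the third and fourth devices. send_request stands in for the
# real cross-device RPC described in the text.

def send_request(device, results, lock):
    with lock:
        results[device] = "start-first-application"

def dispatch_concurrent(devices):
    """Send to all devices at once, so the applications start in sync."""
    results, lock = {}, threading.Lock()
    threads = [threading.Thread(target=send_request, args=(d, results, lock))
               for d in devices]
    for t in threads:
        t.start()
    for t in threads:  # both requests are in flight at the same time
        t.join()
    return results

def dispatch_sequential(devices):
    """Send one by one: simpler, with lower peak load on the processor."""
    results, lock = {}, threading.Lock()
    for d in devices:
        send_request(d, results, lock)
    return results

out = dispatch_concurrent(["third-device", "fourth-device"])
```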
It should be noted that the first application in the third device and the fourth device may implement the functions of some of the sub-applications of the first application in the first device, or the functions of all of them. See in particular the example of the mobile phone, TV, and smart speaker mentioned above.
A binding relationship is configured between the first application of the first device and the first application of each of the third device and the fourth device. For example, this may be achieved by remotely invoking the service-binding interface bindService. This ensures that the first device can remotely invoke and control the related objects of the first application in the third device and the fourth device.
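The binding relationship can be sketched as below. This is a hypothetical model loosely analogous to Android's bindService pattern; the proxy and binder classes are invented for illustration.

```python
# Hypothetical sketch of the binding relationship: once bound, the first
# device (controller) holds a proxy per peer device, through which it can
# remotely invoke objects of the peer's first application. Invented names.

class RemoteProxy:
    def __init__(self, device_id):
        self.device_id = device_id
        self.calls = []  # record of remote invocations, for illustration

    def invoke(self, method, *args):
        self.calls.append((method, args))  # stand-in for the remote call
        return f"{self.device_id}:{method}"

class Binder:
    def __init__(self):
        self.bindings = {}

    def bind(self, device_id):
        self.bindings[device_id] = RemoteProxy(device_id)
        return self.bindings[device_id]

binder = Binder()
tv = binder.bind("third-device")
speaker = binder.bind("fourth-device")
r1 = tv.invoke("set_progress", 120)   # remote control through the binding
r2 = speaker.invoke("volume_up")
```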
After the third device and the fourth device successfully start the first application on their respective devices, each returns a message of successful start to the first device. Then, the first device may remotely control, according to the configured binding relationships, one or more of the running logic sequence, the user interface update status, and the multimedia playing progress of the first application in the third device and the fourth device.
For ease of understanding, again take the first device to be a mobile phone whose currently running first application is a video playing application that is playing a video, the third device to be a TV, and the fourth device to be a smart speaker. A sub-application displaying the video picture may run on the third device, and a sub-application playing the video sound may run on the fourth device. The first device can then, according to the binding relationships configured as described above, remotely control the progress of video playing on the third device (the TV), such as pausing or fast-forwarding, and the sound of the video on the fourth device (the smart speaker), such as adjusting the volume. The specific control mode is set according to the actual situation, and this solution does not limit it.
Of course, the process of controlling a plurality of electronic devices to jointly run an application through the first device is described above with the first device controlling two other electronic devices as an example; in practice, the first device may also control three, four, or even more electronic devices to jointly run an application, which is determined according to the actual situation and is not described again here.
To facilitate understanding of the present embodiments, the following description is made with reference to the accompanying drawings.
Referring to fig. 8A, fig. 8A is a schematic diagram of a user interface of the first device after it has started and is running the first application. The user interface includes a first application user interface 801, which is mainly used for displaying the running content of the first application. For example, if the first application is a video playing application, the picture of the video being played, and the like, is displayed in the first application user interface 801. As another example, if the first application is an email application, an interface of a mailbox, and the like, is displayed in the first application user interface 801. What the first application user interface 801 specifically displays is determined according to the specific situation, and this solution does not limit it.
The first device may sense a first operation on the first application user interface 801 through the interface shown in fig. 8A. For specific implementation of the first operation, reference may be made to the description of the first operation in the second embodiment, and details are not described here.
In response to the first operation on the first application user interface 801, the first device searches for one or more devices that are logged in to the same account as the first device, or whose logged-in account belongs to the same account group, and then selects the devices that have the running environment of part or all of the sub-applications of the first application and displays them on the display screen in the form of a selection menu.
Referring to fig. 8B, the first device displays a selection menu 802 in the first application user interface 801 in response to the first operation described above. The selection menu 802 includes controls of one or more electronic devices and of combinations of electronic devices, for example a second device control 8021, a third device control 8022, a fourth device control 8023, a fifth electronic device control 8024, and a combination control 8025 of the third device and the fourth device. Illustratively, the second device may be a TV, the third device may be a smart speaker, the fourth device may be an iPhone2, and the fifth electronic device may be a Tablet PC. This is only an illustration; what kinds of devices the second device, the third device, the fourth device, and the fifth electronic device specifically are is determined according to the actual situation, and this solution does not limit it.
Here, a selection menu 802 including five electronic devices and their combination controls is given as an example; in practice, how many electronic-device controls and combination controls the selection menu 802 includes, and which devices they represent, is determined according to the specific situation, which is not limited in this embodiment.
In addition, the controls of the electronic devices included in the selection menu 802 may be displayed in the interface as the names of the electronic devices, or as icons plus names. The specific form of presentation is determined according to the actual situation, and this solution does not limit it.
The devices in the selection menu 802 are all devices that are connected to the same network as the first device and are logged in to the same account, or whose logged-in account belongs to the same account group. The devices in the selection menu 802 all meet the requirements of the running environment of part or all of the sub-applications of the first application.
It should be noted that the position where the selection menu 802 is displayed on the display screen of the first device is not limited to the position shown in fig. 8B, and may be displayed at any position in the display screen, such as the middle position or the bottom position of the display screen.
The first device may sense a selection operation of a combination control 8025 of the third device and the fourth device at an interface such as that shown in fig. 8B.
The selection operation may be, for example, an operation of dragging the first application user interface 801 into the combination control 8025 of the third device and the fourth device. Illustratively, referring again to fig. 8B, the dragging may be performed, for example, with the index finger and the middle finger together, or with a single finger, such as the index finger or the middle finger, pressing the first application user interface 801 and dragging it. Which manner of dragging is used is determined according to the specific situation, and this solution does not limit it.
During the dragging of the first application user interface 801 to the combination control 8025 of the third device and the fourth device, the first application user interface 801 may be scaled down to facilitate the dragging, as shown, for example, in fig. 8C.
After the first application user interface 801 is dragged into the combination control 8025 of the third device and the fourth device, as shown, for example, in fig. 8D, the first application user interface 801 is released, completing the selection of the combination of the third device and the fourth device.
In one possible implementation, the selection operation may also be an operation of clicking the combination control 8025 of the third device and the fourth device, for example, as shown in fig. 8E. Clicking the combination control 8025 once completes the selection of the combination of the third device and the fourth device.
In response to the selection operation on the combination control 8025 of the third device and the fourth device, the first device sends a request for starting the first application to each of the third device and the fourth device. After receiving the request, the third device and the fourth device each start their respective first application according to the request.
Likewise, the third embodiment described above assumes that the first application has been installed in the third device and the fourth device. For the process of starting the application across devices when the first application is not installed in the third device and the fourth device, reference may be made to the process, described above, of starting the application across devices when the first application is not installed in the second device, and details are not repeated here.
After the third device and the fourth device successfully start the first application, each returns a message of successful start to the first device. Then, the first device may remotely control, according to the configured binding relationships, one or more of the running logic sequence, the user interface update status, and the multimedia playing progress of the first application in the third device and the fourth device.
In the implementation of the third embodiment, the interaction between the software modules of the first device and the third device, and between the software modules of the first device and the fourth device, may refer to the implementation process of the second embodiment described with reference to the software architecture diagrams shown in fig. 2 and fig. 7. The difference is that after the third device and the fourth device successfully start their respective first applications, they send their respective control-agent data to the first device according to the preset callback parameter, and the first device controls the related objects of the first application in the third device and the fourth device according to the received control agents. The rest is not described in detail here.
In this embodiment of the application, combining devices makes it possible to exploit the complementary strengths of multiple devices and bring a better experience to specific applications. The different devices are not in a peer-to-peer relationship but in a master-slave cooperation relationship, and combining a plurality of devices can achieve an experience that a single device cannot. For example, if the first device, the third device, and the fourth device are respectively a mobile phone, a TV, and a smart speaker, this embodiment combines the convenience of mobile phone operation, the high-definition resolution and large-screen viewing experience of the TV, and the better sound quality output of the speaker to provide a better experience for the user.
In one possible implementation, based on the user's past selections and learning of the scenario, the first device may by default initiate the starting of the application on the device combination the user usually selects.
For ease of understanding, an example is given. Suppose the first device learns that, each time a selection menu is displayed after a first operation on a certain video playing application is sensed, a selection operation on the control of the TV-and-smart-speaker combination is sensed next. Then, the next time the first device senses the first operation on that video playing application, it may not present the selection menu again for the user to select, but may directly select the TV-and-smart-speaker combination according to the learned selection pattern and start that video playing application on the TV and the smart speaker respectively.
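The learned-default behaviour just described can be sketched as below. This is a hypothetical illustration; the habit threshold and all names are invented, and the patent does not specify a learning algorithm.

```python
from collections import Counter

# Hypothetical sketch of defaulting to the user's usual choice: if past
# selections for an application are dominated by one device combination,
# the menu can be skipped and that combination selected directly.

def usual_choice(history, min_count=3):
    """Return the dominant past selection, or None if no habit is
    established yet (fewer than min_count occurrences)."""
    if not history:
        return None
    combo, count = Counter(history).most_common(1)[0]
    return combo if count >= min_count else None

history = ["TV+Speaker", "TV+Speaker", "TV", "TV+Speaker"]
default = usual_choice(history)  # habit established: menu can be skipped
```

When `usual_choice` returns `None`, the first device would fall back to displaying the selection menu as usual.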
Alternatively, the next time the first device again senses the first operation on that video playing application, it may display a query window instead of the selection menu. The query window may include, for example, a question asking whether to start the video playing application of the TV-and-smart-speaker combination, together with "yes" and "no" controls. The first device may start the video playing application of the TV-and-smart-speaker combination across devices in response to a selection of the "yes" control.
Of course, this is only an example; which application is started, and on which device combination the first device chooses to start it, is determined according to the specific situation, which is not limited in this embodiment. This embodiment of the application simplifies the flow and improves the user experience.
In one possible implementation, as can be seen from the foregoing description, the first device, the third device, and the fourth device are configured so that they can remotely invoke one another, so that the application in the first device can be decomposed into a first sub-application and a second sub-application, which are then run in association on the third device and the fourth device respectively. The configuration information and the information of the three devices, such as the installation package, may then be packaged together into a shared data package. The shared data package can be shared with other users, saving them the trouble of reconfiguration and improving the user experience.
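The packaging step can be sketched as follows. This is a hypothetical illustration; the field names and JSON format are invented, and the patent does not specify the package format.

```python
import json

# Hypothetical sketch of building the shared data package: the sub-application
# decomposition, the roles of the cooperating devices, and a reference to the
# installation package are bundled so another user can reuse the same setup.

def build_shared_package(config, devices, installer="first-app.pkg"):
    package = {
        "configuration": config,  # how the application is decomposed
        "devices": devices,       # the cooperating devices and their roles
        "installer": installer,   # installation package reference
    }
    return json.dumps(package, sort_keys=True)

pkg = build_shared_package(
    {"sub_apps": {"video": "third-device", "audio": "fourth-device"}},
    ["first-device", "third-device", "fourth-device"],
)
restored = json.loads(pkg)  # what a receiving user would unpack
```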
In one possible implementation of the second and third embodiments, the first device may also display the selection menu when it senses that the user interface of the first application is being switched to another user interface. The specific implementation of this mode is described below by way of example with reference to the accompanying drawings.
In a specific embodiment, taking the user interface of the first device shown in fig. 6A or fig. 8A as an example, when the first device senses, based on that user interface, an operation of returning to the previously displayed interface without exiting the first application, an operation that is about to exit the first application, or an operation of locking the screen, the selection menu is displayed in the display screen of the first device in response to the operation.
For ease of understanding, the following examples illustrate.
See fig. 9A and fig. 9B. Fig. 9A is a schematic diagram of user interface 1 (901) of the first application displayed on the first device. When the first device senses an operation of returning to user interface 2 of the first application (user interface 2 being the user interface displayed on the first device before user interface 1), a selection menu may be displayed on user interface 2, as shown in fig. 9B. The selection menu 9021 may include prompt information, for example "the devices below can be selected to run the first application", to prompt the user to make a selection. Controls of the selectable electronic devices may be displayed below the prompt. Likewise, these controls may be displayed in the form of device icons, names, or icons plus names.
See fig. 10A and fig. 10B. Fig. 10A is a schematic diagram of a user interface of the first application displayed on the first device. When the first device senses that the first application is about to exit (at this time, a prompt message 10011 prompting the exit, such as "press again to exit the program", appears in the user interface), a selection menu may be displayed on the user interface, as shown in fig. 10B. Similarly, the selection menu 10012 may include prompt information, such as "the devices below can be selected to run the first application", to prompt the user to make a selection. Controls of the selectable electronic devices may be displayed below the prompt. Likewise, these controls may be displayed in the form of device icons, names, or icons plus names.
Optionally, in fig. 10A the prompt message 10011 may not be displayed even when the first application is exited by pressing again; the first device can still sense that the first application is being exited by the repeated press, and at this time the first device may also display the selection menu 10012, as shown in fig. 10B.
For some applications, such as music playing applications, when the first device runs them, some associated interfaces and information of the applications can be displayed in the lock screen interface even after the screen is locked; see, for example, fig. 11A.
Fig. 11A is a schematic diagram of the lock screen user interface of the first device, where the first application of the first device is a music playing application. After the screen is locked, information such as the name 11011 of the song being played by the music playing application, a next-song control 11012, a play/pause control 11013, and a previous-song control 11014 may be displayed on the lock screen interface. At this time, the selection menu described above may be displayed on the lock screen interface, for example, as shown in fig. 11B. Likewise, the selection menu 11015 may include prompt information, such as "the devices below can be selected to run the first application", to prompt the user to make a selection. Controls of the selectable electronic devices may be displayed below the prompt, in the form of device icons, names, or icons plus names.
In this embodiment of the application, actively displaying the selection menu reminds the user in time that the application can be migrated, achieving a better user experience. For the interaction between the first device and the second device, or between the first device and the third device/fourth device, after the selection menu is displayed upon sensing that the user interface of the first application is being switched to another user interface, refer to the descriptions of the second embodiment and the third embodiment, which are not repeated here.
In one possible implementation manner, in the second and third embodiments, the first device may also display the selection menu in response to a first operation on a display plug-in of the first application.
The display plug-in of the first application may be a widget of the first application, for example if the first application is a clock application, a weather application, or a calendar application. When the first device runs such applications, some of their information may be displayed at a certain position on the desktop, for example time information, weather information, or date information displayed in the home screen interface. The time information, weather information, and date information displayed in the home screen interface are widgets of the clock application, the weather application, and the calendar application, respectively.
The first operation on the display plug-in of the first application may be a first operation on a widget of the first application.
Optionally, the first operation may be, for example, a long-press operation on the widget of the first application, a click operation of multiple fingers on the widget, or an operation of multiple fingers sliding in contact on the widget.
The long-press operation may be, for example, an operation of pressing the application widget for no less than a first preset time duration, where the first preset time duration may be, for example, 1 second, 2 seconds, or 3 seconds; the first preset time duration is determined according to the specific situation, which is not limited in this embodiment.
The operation of multiple fingers sliding in contact on the widget of the first application may be, for example, an operation of any two fingers, such as the index finger and the middle finger, sliding a first preset length on the widget. The first preset length may be, for example, 0.1 cm or 0.3 cm. The specific first preset length and which fingers slide are determined according to the specific situation, and this solution does not limit them.
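The threshold-based recognition of these first operations can be sketched as below. This is a hypothetical illustration: the classification logic and the particular threshold values (taken from the examples in the text) are not fixed by the patent.

```python
# Hypothetical sketch of classifying a first operation on a widget from the
# thresholds described above: a long press of at least a preset duration, a
# multi-finger click, or a multi-finger slide of at least a preset length.

LONG_PRESS_SECONDS = 1.0   # example "first preset time duration"
SLIDE_LENGTH_CM = 0.1      # example "first preset length"

def classify(press_seconds=0.0, fingers=1, slide_cm=0.0):
    if fingers >= 2 and slide_cm >= SLIDE_LENGTH_CM:
        return "multi-finger-slide"
    if fingers >= 2 and press_seconds > 0:
        return "multi-finger-click"
    if press_seconds >= LONG_PRESS_SECONDS:
        return "long-press"
    return "none"  # does not qualify as a first operation

ops = [classify(press_seconds=2.0),            # single-finger long press
       classify(fingers=2, slide_cm=0.3)]      # two-finger contact slide
```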
The specific implementation process of the mode is described in the following by way of example in conjunction with the attached drawings.
See fig. 12. Fig. 12 illustrates a home screen interface 12 on a first device.
Home screen interface 12 may include: status bar 1201, calendar information 1202, weather information 1203, tray 1204 with frequently used application icons, navigation bar 1205, location information 1206, time information 1212, and other application icons.
Wherein:
status bar 1201 may include: operator name (e.g., "china mobile") 1201A, one or more signal strength information 1201B for wireless fidelity (Wi-Fi) signals, one or more signal strength information 1201C for mobile communication signals (which may also be referred to as cellular signals), battery status information 1201D, time information 1201E.
Calendar information 1202 may be used to indicate current date information, etc.
Time information 1212 may be used to indicate time, such as hours, minutes, etc.
The weather information 1203 may be used to indicate a weather type, such as cloudy sunny, light rain, etc., and may also be used to indicate information such as temperature, etc.
A tray 1204 with common application icons may show: phone icon 1204A, address book icon 1204B, short message icon 1204C, camera icon 1204D.
The navigation bar 1205 may include: a return key 1205A, a home screen key 1205B, a multitasking key 1205C, and other system navigation keys. When it is detected that the user has clicked the return key 1205A, the electronic device may display the page previous to the current page. When it is detected that the user has clicked the home screen key 1205B, the electronic device may display the home screen interface. When it is detected that the user has clicked the multitasking key 1205C, the electronic device may display the applications recently opened by the user. The navigation keys may also have other names, which this application does not limit. The navigation keys in the navigation bar 1205 are not limited to virtual keys and may also be implemented as physical keys.
Location information 1206 may be used to indicate information such as the city and/or area of the city in which it is currently located.
Other application icons may be, for example: an icon 1207 for a mailbox, an icon 1208 for a cell phone steward, an icon 1209 for settings, an icon 1210 for a gallery, and so on.
Home screen interface 12 may also include page indicator 1211. Other application icons may be distributed across multiple pages, and page indicator 1211 may be used to indicate which page of applications the user is currently browsing. The user may slide left or right in the area of the other application icons to browse the application icons on other pages.
Based on the home screen interface 12, when the first device senses a first operation on the calendar information 1202, the time information 1212, or the weather information 1203, the first device may display the selection menu in the home screen interface 12. After the selection menu is displayed, for the interaction between the first device and the second device, or between the first device and the third and fourth devices, refer to the descriptions of the second and third embodiments; details are not repeated here.
In one possible implementation manner, in the second and third embodiments, the first device may also display the selection menu in response to a first operation on a thumbnail of the first application user interface. The specific implementation process of the mode is described in the following by way of example in conjunction with the attached drawings.
Referring to fig. 13A, fig. 13A is a schematic diagram of a user interface 1300 displayed on a display screen of the first device. The user interface 1300 may include thumbnails of the user interfaces of multiple applications, for example, thumbnails 1301, 1302, 1303, and 1304 of the first, second, third, and fourth application user interfaces. The thumbnails may be displayed when the first device calls up applications running in the background, or may be tiled on the display when the first device has the applications open at the same time.
In the case shown in fig. 13A, when the first device senses a first operation on a thumbnail, for example, the thumbnail 1301 of the first application user interface, such as the two-finger touch-and-slide operation on the thumbnail 1301 shown in fig. 13B, the first device may display a selection menu 1305, as shown in fig. 13C.
The first operation may also be an operation of clicking the thumbnail 1301 with multiple fingers, for example, two fingers, or an operation of long-pressing the thumbnail 1301, or the like. For a description of the specific first operation, refer to the foregoing embodiments; details are not repeated here.
The selection menu 1305 includes selectable controls of single devices and/or device combinations; for a detailed description of the selection menu, refer to the foregoing embodiments; details are not repeated here.
After the first device displays the selection menu 1305, for the interaction between the first device and the second device, or between the first device and the third and fourth devices, refer to the descriptions of the second and third embodiments; details are not repeated here.
In one possible implementation manner, in the second and third embodiments, the first device may invoke the selection menu in the following manners.
In a first manner, in a specific embodiment, the first device has already started the first application and displays a user interface of the first application. If the first device then detects that a new device has accessed the network where the first device is located and has logged in to an account, and that account is the same as the account of the first device or belongs to the same account group, the first device may directly display, on the user interface of the first application, a selection menu including a control of the newly accessed device. Of course, the selection menu may also include controls of other devices or device combinations for selection. The first device may also start the first application only after detecting that a new device has joined the network and logged in to an account, and then display the selection menu. The order in which the application is started depends on the actual situation and is not limited by this solution.
For ease of understanding, the first manner is illustrated with an example. Suppose the first device is a mobile phone and the second device is a TV. A first application, such as a video playing application, is being used on the phone to watch a video. The TV then connects to the network where the phone is located and logs in to the account registered on the phone, or to an account in the same account group as the phone's account. The phone may detect that the TV has logged in to the corresponding account, for example by querying the distributed data management server, and may then directly display, on the video-viewing user interface, a selection menu including a control for the TV, to prompt the user that the video being watched can be switched to the first application on the TV for playback. Of course, if the newly accessed device is a device combination, the process is similar and is not repeated here.
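For ease of understanding only, the account check in the first manner can be sketched as follows. This is an illustrative sketch, not part of the embodiments; all names (`ACCOUNT_GROUPS`, `on_device_joined`, the device dictionaries) are hypothetical.

```python
# Hypothetical sketch of the "first manner" check: when a new device joins
# the network and logs in, the first device shows the selection menu only if
# the new device's account matches its own account or shares an account group.

ACCOUNT_GROUPS = {
    "family-group": {"alice@example.com", "alice-tv@example.com"},
}

def same_account_or_group(first_account: str, new_account: str) -> bool:
    """True if the accounts are identical or belong to one account group."""
    if first_account == new_account:
        return True
    return any(
        first_account in group and new_account in group
        for group in ACCOUNT_GROUPS.values()
    )

def on_device_joined(first_account: str, new_device: dict) -> list:
    """Build the selection-menu controls to display, if any."""
    if same_account_or_group(first_account, new_device["account"]):
        return [{"label": new_device["name"], "target": new_device["id"]}]
    return []  # no menu: the new device's account is unrelated

# The phone detects the TV logging in with an account in the same group.
menu = on_device_joined(
    "alice@example.com",
    {"id": "tv-01", "name": "Living-room TV", "account": "alice-tv@example.com"},
)
```

In this sketch, an unrelated account produces no menu, matching the embodiment's condition that the accounts be identical or in one account group.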
In a second manner, in a specific embodiment, the first device has already started the first application. The first device then accesses the network where the second device (or the third and fourth devices) is located and logs in to an account that is the same as, or belongs to the same account group as, the account those devices are logged in to. The first device may then display a selection menu on the user interface of the first application; the selection menu includes a selectable control of the second device, or a combination control of the third and fourth devices, to prompt the user that the first application can be migrated to run on those devices. Of course, the first device may also join the network and log in first, then start the first application, and then display the selection menu. The order in which the application is started depends on the actual situation and is not limited by this solution.
For ease of understanding, the second manner is illustrated with an example. Suppose the first device is a mobile phone and the second device is a TV. A first application, such as a video playing application, is being used on the phone to watch a video. The phone then moves to a certain place, switches to the network provided at that place, and logs in to an account, or an account group, that some devices at that place are logged in to. The phone may then display, on the video-viewing interface, a selection menu including controls for those devices.
After the first device displays the selection menu, for the interaction between the first device and the second device, or between the first device and the third and fourth devices, refer to the descriptions of the second and third embodiments; details are not repeated here.
In one possible implementation manner, multiple devices do not need to log in to the same account or account group; interaction between the devices can be achieved simply by establishing a corresponding pairing connection.
Alternatively, the pairing connection may be a Bluetooth pairing connection or another form of pairing connection between devices. The specific type of pairing connection is determined according to the actual situation and is not limited by this solution. For ease of understanding, a Bluetooth pairing connection is described below as an example.
In a specific embodiment, a Bluetooth connection is established between the first device and the second device, or between the first device and the third and fourth devices. When the first device starts the first application, the first device may display the selection menu, where the selection menu includes a control of the second device, or a control of the combination of the third and fourth devices, or both, and may further include controls of other devices or device combinations, which is not limited herein.
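For illustration only, the pairing-based variant can be sketched as follows. All names here (`build_selection_menu`, the device records) are hypothetical and not part of the embodiments; the sketch only shows that the menu is built from currently paired devices, with single-device controls and a combination control.

```python
# Hypothetical sketch: under the pairing-based variant no shared account is
# needed; when the first application starts, the selection menu is assembled
# from the devices that currently hold a pairing connection with the first device.

paired = [
    {"id": "tv-01", "name": "TV", "connected": True},
    {"id": "spk-01", "name": "Speaker", "connected": True},
    {"id": "pad-01", "name": "Tablet", "connected": False},  # pairing lost
]

def build_selection_menu(paired_devices: list) -> list:
    """One control per connected device, plus one combination control when
    two or more devices are available (e.g. TV + Speaker)."""
    connected = [d for d in paired_devices if d["connected"]]
    controls = [{"targets": [d["id"]], "label": d["name"]} for d in connected]
    if len(connected) >= 2:
        controls.append({
            "targets": [d["id"] for d in connected],
            "label": " + ".join(d["name"] for d in connected),
        })
    return controls

menu = build_selection_menu(paired)
```

The disconnected tablet is excluded, and the combination control corresponds to the third-and-fourth-device combination described above.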
After the first device displays the selection menu, for the interaction between the first device and the second device, or between the first device and the third and fourth devices, refer to the descriptions of the second and third embodiments; details are not repeated here.
According to this embodiment, the selection menu is actively displayed to prompt the user to select another device to run the currently displayed application, so that the user can conveniently discover new functions among the devices, improving the user experience.
In summary, compared with the prior art, in which the synchronized context can be processed and restored only after the relevant application is started by clicking on the corresponding device, the present application can accurately start an application of another device from one device without operating the peer device, and displays the selection menu of selectable devices more accurately based on the specific application and the currently networked devices, which better meets user expectations and is easier to use.
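The application-aware filtering summarized above can be sketched as follows. This is an illustrative sketch under assumed data structures (`app_subapps`, `eligible_targets` and the capability sets are all hypothetical); it only shows how a menu could list exactly the devices, or device combinations, that have a running environment for the sub-applications of the first application.

```python
# Hypothetical sketch: only devices (or pairs of devices) whose running
# environments together cover the first application's sub-applications are
# offered in the selection menu, so the menu depends on both the application
# and the currently networked devices.

app_subapps = {"video-player": {"display", "audio"}}

devices = {
    "tv": {"display"},      # can run the display sub-application only
    "speaker": {"audio"},   # can run the audio sub-application only
    "watch": set(),         # no suitable running environment
}

def eligible_targets(app: str, devices: dict) -> list:
    """Single devices covering every sub-application, plus pairs whose
    combined environments cover them (and where neither alone suffices)."""
    need = app_subapps[app]
    singles = [d for d, caps in devices.items() if need <= caps]
    names = sorted(devices)
    combos = [
        (a, b)
        for i, a in enumerate(names)
        for b in names[i + 1:]
        if need <= (devices[a] | devices[b])
        and not (need <= devices[a] or need <= devices[b])
    ]
    return singles + combos
```

With these capabilities no single device qualifies, but the TV-plus-speaker combination does, mirroring the third-and-fourth-device combination in the embodiments.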
The above mainly describes the method provided by the present application from the perspective of interaction between multiple devices. It is understood that, to realize the corresponding functions, each device includes a hardware structure and/or a software module for performing each function. Those of skill in the art would readily appreciate that the various illustrative elements and algorithm steps described in connection with the embodiments disclosed herein may be implemented as hardware or as combinations of hardware and computer software. Whether a function is performed by hardware or by computer software driving hardware depends upon the particular application and the design constraints imposed on the solution. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present application.
In the embodiments of the present application, the functional modules of a device may be divided according to the above method examples; for example, each functional module may be divided corresponding to each function, or two or more functions may be integrated into one module. The integrated module may be implemented in the form of hardware or in the form of a software functional module. It should be noted that the division of modules in the embodiments of the present application is schematic and is only one kind of logical function division; there may be other division manners in actual implementation.
In the case of dividing each functional module by corresponding functions, fig. 14 shows a schematic logical structure diagram of a first device provided in the embodiment of the present application, where the first device may be the first device in the foregoing method embodiment. The first device 1400 may include:
a display unit 1401 for implementing an operation of the first device 1400 to display information in the above-described method embodiment;
a sending unit 1402, configured to implement an operation of sending information by the first device 1400 in the foregoing method embodiment;
a receiving unit 1403, configured to implement the operation of the first device 1400 for receiving information in the above method embodiment.
Optionally, the first device 1400 may further include: a starting unit, configured to implement an operation of the first device 1400 in the foregoing method embodiment to start an application.
Optionally, the first device 1400 may further include: an access unit, configured to implement an operation of the first device 1400 accessing the network in the foregoing method embodiment.
Optionally, the first device 1400 may further include: an establishing unit, configured to implement the operation of establishing the pairing connection by the first device 1400 in the foregoing method embodiment.
Optionally, the first device 1400 may further include: and a remote control unit, configured to implement the operation of remote control of the first device 1400 in the foregoing method embodiment.
In the case of dividing each functional module by corresponding functions, fig. 15 shows a schematic logical structure diagram of the second device provided in the embodiment of the present application. The second device may be any one of the devices indicated by the one or more controls included in the selection menu according to the above method embodiments. The second device 1500 may include:
a receiving unit 1501 for implementing an operation of the second device 1500 to receive information in the above-described method embodiment;
a starting unit 1502, configured to implement the operation of starting the application by the second device 1500 in the foregoing method embodiment;
a sending unit 1503, configured to implement the operation of sending information by the second device 1500 in the foregoing method embodiment.
Optionally, the second device 1500 may further include: and an execution unit, configured to implement an operation of the second device 1500 executing the application in the foregoing method embodiment.
Optionally, the second device 1500 may further include: and a control unit, configured to implement the operation of controlling the running condition and the like of the application by the second device 1500 in the above method embodiment.
Optionally, the second device 1500 may further include: and a query unit, configured to implement an operation of querying information by the second device 1500 in the foregoing method embodiment.
The embodiment of the present application also provides a computer-readable storage medium, which stores a computer program, where the computer program is executed by a processor to implement the method described in fig. 5 or fig. 7.
Finally, it should be noted that: the above embodiments are only used to illustrate the technical solution of the present invention, and not to limit the same; while the invention has been described in detail and with reference to the foregoing embodiments, it will be understood by those skilled in the art that: the technical solutions described in the foregoing embodiments may still be modified, or some or all of the technical features may be equivalently replaced; and the modifications or the substitutions do not make the essence of the corresponding technical solutions depart from the scope of the technical solutions of the embodiments of the present invention.

Claims (18)

1. A method for application interaction among multiple devices, the method comprising:
the first device displays a selection menu, wherein the selection menu comprises one or more controls; the device indicated by each of the one or more controls is a single device or a combination of devices, and the devices indicated by the one or more controls have a running environment of all or part of the sub-applications included in a first application in the first device;
the first device responds to the selection operation of a first control and sends a request for starting the first application to the device indicated by the first control; wherein the first control is a control in the selection menu;
and the first device receives response information of successful starting of the first application by the device indicated by the first control.
2. The method of claim 1, wherein the first device displaying a selection menu comprises:
the first device displays the selection menu in response to a first operation on an object associated with the first application.
3. The method of claim 2, wherein the object associated with the first application comprises a user interface of the first application, an icon of the first application, a thumbnail of the user interface of the first application, or a display plug-in of the first application.
4. The method according to claim 2 or 3,
the first operation on the object associated with the first application comprises: an operation of contact sliding on an object associated with the first application;
or, the first operation on the object associated with the first application includes: an operation of clicking on an object associated with the first application by a plurality of fingers;
or, the first operation on the object associated with the first application includes: and long-pressing the object associated with the first application.
5. The method of claim 2, wherein the object associated with the first application comprises a user interface of the first application;
the first operation on the object associated with the first application comprises: and switching the user interface of the first application to another user interface.
6. The method according to any one of claims 2 to 5,
the selecting operation of the first control comprises: dragging the object associated with the first application to the first control, wherein the object associated with the first application is scaled down in the dragging process;
or, the selecting operation of the first control comprises: and clicking the first control.
7. The method of claim 1, wherein prior to the first device displaying the selection menu, further comprising:
the first device detects that the device indicated by the first control is connected with a network where the first device is located, and detects that the device indicated by the first control logs in a first account; the first account is an account logged in by the first device, or an account belonging to the same account group as the account logged in by the first device;
or the first device accesses the network where the device indicated by the first control is located, and logs in a second account; the second account is an account for equipment login indicated by the first control, or belongs to the same account group with the account for equipment login indicated by the first control;
or the first device establishes pairing connection with the device indicated by the first control.
8. The method according to any one of claims 1 to 7, wherein the device indicated by the first control is a second device; after the first device receives the response information that the first control indicates that the device successfully starts the first application, the method further includes:
and the first equipment sends one or more items of context information and running state information of the first application currently running in the first equipment to the second equipment according to the response information.
9. The method according to any one of claims 1 to 7, wherein the device indicated by the first control is a device combination of a third device and a fourth device; the third device is used for running a first sub-application of the first application, the fourth device is used for running a second sub-application of the first application, and the first sub-application and the second sub-application jointly form the first application;
the first device responds to the selection operation of a first control, and sends a request for starting the first application to the device indicated by the first control, wherein the request comprises:
the first device responds to the selection operation of the first control and sends a request for starting the first application to the third device and the fourth device respectively;
after the first device receives the response information that the first control indicates that the device successfully starts the first application, the method further includes:
the first device remotely controls one or more of the running logic sequence, the user interface updating state and the multimedia playing progress of the first sub-application in the third device, and remotely controls one or more of the running logic sequence, the user interface updating state and the multimedia playing progress of the second sub-application in the fourth device.
10. The method according to any one of claims 1 to 9, wherein the shared data in the first device and the device indicated by the first control are synchronized into a distributed data management server; the method further comprises the following steps:
the first device inquires in the distributed data management server that the first application is not installed in the device indicated by the first control;
the first device remotely controls the device indicated by the first control to install the first application.
11. A method for application interaction among multiple devices, the method comprising:
the second equipment receives a request for starting the first application sent by the first equipment; the second device is one of devices indicated by one or more controls, the one or more controls are controls in a selection menu displayed by the first device, the device indicated by each of the one or more controls is a single device or a combination of devices, and the device indicated by the one or more controls has a running environment of all or part of sub-applications included in the first application; the request is sent by the first device in response to a selection operation of a control of the second device in the selection menu;
the second equipment starts the first application according to the request;
and the second equipment sends response information of successfully starting the first application to the first equipment.
12. The method of claim 11, wherein after the second device sends the response message to the first device that the first application is successfully started, the method further comprises:
the second device receives one or more items of context information and running state information of the first application currently running in the first device, which are sent by the first device;
and the second equipment continuously runs the first application according to one or more items of the context information and the running state information.
13. The method of claim 11, wherein the second device is configured to run a first sub-application of the first application; after the second device sends response information of successfully starting the first application to the first device, the method further includes:
the second device receives a remote control instruction which is sent by the first device and used for controlling one or more items of the running logic sequence, the user interface updating state and the multimedia playing progress of the first sub-application;
and the second equipment controls one or more items of the running logic sequence, the user interface updating state and the multimedia playing progress of the first sub-application on the equipment according to the remote control instruction.
14. The method of claim 11, wherein the shared data in the second device and the first device are synchronized to a distributed data management server; after the second device sends response information of successfully starting the first application to the first device, the method further includes:
the second device inquires data of one or more items of context information and running state information of the first application currently running in the first device from the distributed data management server;
and the second equipment continuously runs the first application on the equipment according to the data of one or more items of the context information and the running state information of the first application.
15. An apparatus, characterized in that the apparatus comprises one or more processors, memory, and a communication interface; the memory and the communication interface are coupled to the one or more processors, the memory storing a computer program that, when executed by the one or more processors, causes the apparatus to perform the method of any of claims 1-10.
16. An apparatus, characterized in that the apparatus comprises one or more processors, memory, and a communication interface; the memory and the communication interface are coupled to the one or more processors, the memory storing a computer program that, when executed by the one or more processors, causes the apparatus to perform the method of any of claims 11-14.
17. An information interaction system, comprising a first device and a second device, wherein the first device is the device of claim 15, and the second device is the device of claim 16.
18. A computer-readable storage medium, characterized in that the computer-readable storage medium stores a computer program which is executed by a processor to implement the method of any one of claims 1 to 10 or any one of claims 11 to 14.
CN202010125173.1A 2020-02-27 2020-02-27 Application interaction method among multiple devices and related devices Pending CN113311975A (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN202010125173.1A CN113311975A (en) 2020-02-27 2020-02-27 Application interaction method among multiple devices and related devices
CN202210410313.9A CN114968614A (en) 2020-02-27 2020-02-27 Application interaction method among multiple devices and related devices

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202010125173.1A CN113311975A (en) 2020-02-27 2020-02-27 Application interaction method among multiple devices and related devices

Related Child Applications (1)

Application Number Title Priority Date Filing Date
CN202210410313.9A Division CN114968614A (en) 2020-02-27 2020-02-27 Application interaction method among multiple devices and related devices

Publications (1)

Publication Number Publication Date
CN113311975A true CN113311975A (en) 2021-08-27

Family

ID=77370762

Family Applications (2)

Application Number Title Priority Date Filing Date
CN202210410313.9A Pending CN114968614A (en) 2020-02-27 2020-02-27 Application interaction method among multiple devices and related devices
CN202010125173.1A Pending CN113311975A (en) 2020-02-27 2020-02-27 Application interaction method among multiple devices and related devices


Country Status (1)

Country Link
CN (2) CN114968614A (en)


Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102203711A (en) * 2008-11-13 2011-09-28 高通股份有限公司 Method and system for context dependent pop-up menus
CN103856807A (en) * 2014-03-25 2014-06-11 北京奇艺世纪科技有限公司 Method and device for controlling interaction between screens
CN104079597A (en) * 2013-03-26 2014-10-01 华为终端有限公司 Transfer method of media stream and user equipment
CN104471917A (en) * 2014-04-15 2015-03-25 华为技术有限公司 Application information sharing method and device
CN105321096A (en) * 2014-05-30 2016-02-10 苹果公司 Continuity
CN107943439A (en) * 2016-10-13 2018-04-20 阿里巴巴集团控股有限公司 Interface Moving method, apparatus, intelligent terminal, server and operating system
CN109660842A (en) * 2018-11-14 2019-04-19 华为技术有限公司 A kind of method and electronic equipment playing multi-medium data

Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9746995B2 (en) * 2011-07-14 2017-08-29 Microsoft Technology Licensing, Llc Launcher for context based menus
CN102830921B (en) * 2012-08-06 2016-09-28 广东欧珀移动通信有限公司 A kind of touch screen unlocking method and mobile terminal


Cited By (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2022089207A1 (en) * 2020-10-28 2022-05-05 华为技术有限公司 Cross-device application interaction method, electronic device, and server
CN113766292A (en) * 2021-09-18 2021-12-07 海信视像科技股份有限公司 Display apparatus and content connecting method
WO2023045876A1 (en) * 2021-09-24 2023-03-30 花瓣云科技有限公司 Application installation method and related devices
CN114741213A (en) * 2021-10-22 2022-07-12 华为技术有限公司 Notification processing method, chip, electronic device and computer-readable storage medium
CN114422566A (en) * 2021-12-29 2022-04-29 Oppo广东移动通信有限公司 Multi-device connection method, device, system, device and storage medium
WO2023142943A1 (en) * 2022-01-28 2023-08-03 华为技术有限公司 Application component configuration method and related device
CN116700552A (en) * 2022-09-28 2023-09-05 荣耀终端有限公司 Application connection method and terminal equipment
WO2024067225A1 (en) * 2022-09-28 2024-04-04 荣耀终端有限公司 Application handoff method and terminal device
CN116700552B (en) * 2022-09-28 2024-04-19 荣耀终端有限公司 Application connection method and terminal equipment
CN116708062A (en) * 2022-09-30 2023-09-05 荣耀终端有限公司 Equipment management method and electronic equipment
CN116708062B (en) * 2022-09-30 2024-05-31 荣耀终端有限公司 Equipment management method and electronic equipment

Also Published As

Publication number Publication date
CN114968614A (en) 2022-08-30


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination