CN112286618A - Device cooperation method, device, system, electronic device and storage medium - Google Patents


Info

Publication number
CN112286618A
Authority
CN
China
Prior art keywords
list
function
target
interface
equipment
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202011284011.9A
Other languages
Chinese (zh)
Inventor
杨俊拯
缪敬
谭柯
钟卫东
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Guangdong Oppo Mobile Telecommunications Corp Ltd
Original Assignee
Guangdong Oppo Mobile Telecommunications Corp Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Guangdong Oppo Mobile Telecommunications Corp Ltd filed Critical Guangdong Oppo Mobile Telecommunications Corp Ltd
Priority to CN202011284011.9A priority Critical patent/CN112286618A/en
Publication of CN112286618A publication Critical patent/CN112286618A/en
Priority to PCT/CN2021/116501 priority patent/WO2022100239A1/en
Pending legal-status Critical Current

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F9/00Arrangements for program control, e.g. control units
    • G06F9/06Arrangements for program control, e.g. control units using stored programs, i.e. using an internal store of processing equipment to receive or retain programs
    • G06F9/44Arrangements for executing specific programs
    • G06F9/451Execution arrangements for user interfaces
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F9/00Arrangements for program control, e.g. control units
    • G06F9/06Arrangements for program control, e.g. control units using stored programs, i.e. using an internal store of processing equipment to receive or retain programs
    • G06F9/44Arrangements for executing specific programs
    • G06F9/445Program loading or initiating

Landscapes

  • Engineering & Computer Science (AREA)
  • Software Systems (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Human Computer Interaction (AREA)
  • User Interface Of Digital Computer (AREA)
  • Telephone Function (AREA)

Abstract

The application discloses a device cooperation method, apparatus, system, electronic device, and storage medium. In the method, a first device displays a first interface used to display a function list associated with a target application, where the target application runs on the first device and the function list includes a separation control option of the target application. In response to a first selection operation on the separation control option, the first device displays a second interface used to display a first associated-device list, i.e., the devices associated with the first device. In response to a second selection operation on at least one associated device in the first associated-device list, the target application of the first device becomes controllable by the at least one associated device. Because at least one associated device can be selected from multiple devices to carry out the function the user chose on the first device, the first device can fuse the functions provided by multiple associated devices and invoke the functions of multiple different types of devices at the same time, achieving multi-device cooperative work.

Description

Device cooperation method, device, system, electronic device and storage medium
Technical Field
The present application relates to the field of electronic technologies, and in particular, to a device cooperation method, apparatus, system, electronic device, and storage medium.
Background
As the number of information devices grows, in some application scenarios a user needs to combine the services of different devices to complete a task jointly and thereby improve execution efficiency. For example, while listening to music on a mobile phone, a user may want the sound played through a smart speaker to improve the audio playback effect.
At present, screen-casting technologies such as Miracast and DLNA can deliver multimedia data (picture output and audio output) to a single multimedia peripheral: for example, casting to a television, where the picture is shown on the television screen and the sound is played through the television's speakers, or casting to a smart speaker, which then plays the sound. However, Miracast and DLNA can only project to a single class of device and cannot make multiple devices work on a service cooperatively.
Disclosure of Invention
The embodiments of the present application provide a device cooperation method, apparatus, system, electronic device, and storage medium, which can invoke the functions of multiple different types of devices simultaneously to achieve multi-device cooperative work.
In a first aspect, an embodiment of the present application provides an apparatus coordination method, where the method includes:
a first device displays a first interface, where the first interface is used to display a function list associated with a target application, the target application runs on the first device, and the function list includes a separation control option of the target application;
in response to a first selection operation on the separation control option, the first device displays a second interface, where the second interface is used to display a first associated-device list associated with the first device;
in response to a second selection operation on at least one associated device in the first associated-device list, the target application of the first device is caused to be controlled by the at least one associated device.
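The three claimed steps above can be sketched as follows. This is a minimal illustration only: the class, method, and device names are assumptions and not part of the claims.

```python
# Hypothetical sketch of the claimed three-step flow; all class, method, and
# device names here are illustrative assumptions, not part of the claims.

class FirstDevice:
    def __init__(self, target_app, associated_devices):
        self.target_app = target_app                # target application running locally
        self.associated = list(associated_devices)  # first associated-device list
        self.controllers = []                       # devices allowed to control the app

    def show_first_interface(self):
        # First interface: function list for the target app, including the
        # separation control option.
        return {"app": self.target_app, "functions": ["separation control"]}

    def on_first_selection(self):
        # The user picks the separation control option -> show the second
        # interface, i.e. the first associated-device list.
        return list(self.associated)

    def on_second_selection(self, chosen):
        # The user picks one or more associated devices; the target application
        # becomes controllable by each of them.
        for dev in chosen:
            if dev in self.associated and dev not in self.controllers:
                self.controllers.append(dev)
        return list(self.controllers)

phone = FirstDevice("game", ["laptop", "smart tv", "smart speaker"])
second_interface = phone.on_first_selection()
controllers = phone.on_second_selection(["laptop", "smart tv"])
```

Note that selecting two associated devices at once is exactly the multi-device fusion the embodiments emphasize: more than one device class can control the same target application simultaneously.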
In a second aspect, an embodiment of the present application provides an apparatus for device coordination, where the apparatus includes:
a display unit, configured to display a first interface, where the first interface is used to display a function list associated with a target application, the target application runs on the first device, and the function list includes a separation control option of the target application;
the display unit is further configured to display, in response to a first selection operation on the separation control option, a second interface used to display a first associated-device list associated with the first device;
a processing unit, configured to cause, in response to a second selection operation on at least one associated device in the first associated-device list, the target application of the first device to be controlled by the at least one associated device.
In a third aspect, an embodiment of the present application provides an electronic device, including a processor, a memory, a communication interface, and one or more programs, where the one or more programs are stored in the memory and configured to be executed by the processor, and the program includes instructions for executing steps in any method of the first aspect of the embodiment of the present application.
In a fourth aspect, the present application provides a computer-readable storage medium, where the computer-readable storage medium stores a computer program for electronic data exchange, where the computer program makes a computer perform part or all of the steps described in any one of the methods of the first aspect of the present application.
In a fifth aspect, the present application provides a computer program product, wherein the computer program product includes a non-transitory computer-readable storage medium storing a computer program, and the computer program is operable to cause a computer to perform some or all of the steps as described in any one of the methods of the first aspect of the embodiments of the present application. The computer program product may be a software installation package.
It can be seen that, in the embodiments of the present application, a first device displays a first interface used to display a function list associated with a target application, where the target application runs on the first device and the function list includes a separation control option of the target application; in response to a first selection operation on the separation control option, the first device displays a second interface used to display a first associated-device list associated with the first device; and in response to a second selection operation on at least one associated device in that list, the target application of the first device is caused to be controlled by the at least one associated device. At least one associated device can thus be selected from multiple devices to carry out the function the user chose on the first device, so the first device can fuse the functions provided by multiple associated devices and invoke the functions of multiple different types of devices simultaneously, achieving multi-device cooperative work.
Drawings
To illustrate the embodiments of the present application or the technical solutions in the prior art more clearly, the drawings needed for describing the embodiments or the prior art are briefly introduced below. Obviously, the drawings described below show only some embodiments of the present application; those skilled in the art can derive other drawings from them without creative effort.
Fig. 1 is a schematic structural diagram of an electronic device provided by an embodiment of the present application;
Fig. 2 is a schematic diagram of a software structure of an electronic device provided by an embodiment of the present application;
Fig. 3a is a schematic structural diagram of a device coordination system provided by an embodiment of the present application;
Fig. 3b is a schematic structural diagram of another device coordination system provided by an embodiment of the present application;
Fig. 4 is a schematic flowchart of a device coordination method provided by an embodiment of the present application;
Fig. 5a is a schematic diagram of a first interface display provided by an embodiment of the present application;
Fig. 5b is a schematic diagram of another first interface display provided by an embodiment of the present application;
Fig. 5c is a schematic diagram of a second interface display provided by an embodiment of the present application;
Fig. 5d is a schematic diagram of another second interface display provided by an embodiment of the present application;
Fig. 5e is a schematic diagram of another second interface display provided by an embodiment of the present application;
Fig. 5f is a schematic diagram of another second interface display provided by an embodiment of the present application;
Fig. 6a is a schematic structural diagram of functions provided by an electronic device according to an embodiment of the present application;
Fig. 6b is a schematic structural diagram of multi-device cooperation provided by an embodiment of the present application;
Fig. 7a is a schematic structural diagram of another device coordination system provided by an embodiment of the present application;
Fig. 7b is a schematic structural diagram of another device coordination system provided by an embodiment of the present application;
Fig. 8 is a schematic flowchart of invoking a target function on an associated device according to an embodiment of the present application;
Fig. 9 is a schematic structural diagram of a device cooperation apparatus provided by an embodiment of the present application.
Detailed Description
The technical solutions in the embodiments of the present application will be described below with reference to the drawings. In the description of the embodiments, "/" means "or" unless otherwise specified; for example, A/B may mean A or B. "And/or" herein merely describes an association between associated objects and indicates that three relationships may exist; for example, A and/or B may mean: A exists alone, A and B exist simultaneously, or B exists alone. In addition, in the description of the embodiments of the present application, "a plurality" means two or more.
In the following, the terms "first", "second" are used for descriptive purposes only and are not to be understood as indicating or implying relative importance or implicitly indicating the number of technical features indicated. Thus, a feature defined as "first" or "second" may explicitly or implicitly include one or more of that feature. In the description of the present embodiment, "a plurality" means two or more unless otherwise specified.
The device cooperation method provided by the embodiments of the present application can be applied to a handheld device, a vehicle-mounted device, a wearable device, an augmented reality (AR) device, a virtual reality (VR) device, a projection device, a projector, or another device connected to a wireless modem. The terminal device may also take various specific forms, such as user equipment (UE), a terminal device, a smartphone, a smart screen, a smart television, a smart watch, a notebook computer, a smart speaker, a camera, a joystick, a mouse, a microphone, a station (STA), an access point (AP), a mobile station (MS), a personal digital assistant (PDA), a personal computer (PC), or a relay device.
For example, the terminal device may be a station (ST) in a WLAN, a cellular phone, a cordless phone, a Session Initiation Protocol (SIP) phone, a wireless local loop (WLL) station, a personal digital assistant (PDA) device, a handheld device with wireless communication capability, a computing device or other processing device connected to a wireless modem, a vehicle-mounted device, an Internet-of-Vehicles terminal, a computer, a laptop, a handheld communication device, a handheld computing device, a satellite radio device, a wireless modem card, a television set-top box (STB), customer premises equipment (CPE), another device for communicating over a wireless system, or a next-generation communication device, such as a mobile terminal in a 5G network or in a future-evolved public land mobile network (PLMN).
By way of example and not limitation, when the terminal device is a wearable device, the wearable device may be a general term for devices developed by applying wearable technology to the intelligent design of everyday wear, such as glasses, gloves, watches, clothing, and shoes. A wearable device is a portable device worn directly on the body or integrated into the user's clothing or accessories. A wearable device is not merely a hardware device; it also realizes powerful functions through software support, data interaction, and cloud interaction. In a broad sense, wearable smart devices include full-featured, large-sized devices that can realize complete or partial functions without relying on a smartphone, such as smart watches or smart glasses, as well as devices that focus on only one type of application function and need to be used together with other devices such as a smartphone, for example various smart bracelets and smart jewelry for monitoring physical signs.
For example, two electronic devices, a notebook computer and a mobile phone, are taken as examples. When the notebook computer is connected with the mobile phone through a wireless communication technology (such as Bluetooth, wireless fidelity, Zigbee, near field communication and the like) or a data line (such as a USB data line), a user can call the notebook computer to display a game picture of the mobile phone in a mode of equipment cooperation when finishing a game through the mobile phone; or, when the user finishes playing the game on the notebook computer, the mobile phone can be called in a device cooperation mode to control the game on the notebook computer.
In the embodiment of the application, the first device performing device coordination and the associated device may be directly connected, for example, the direct connection between the two electronic devices is realized through bluetooth, WiFi, and the like; alternatively, the two electronic devices may be connected to each other through a connection with another electronic device, such as a cloud server, to achieve indirect connection. In the device cooperation process, the connection between two electronic devices may be switched between a direct connection and an indirect connection, which is not limited in the embodiments of the present application.
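The direct/indirect connection choice described above can be illustrated roughly as follows. The transport names and the fallback policy are assumptions for illustration; the embodiments only require that both kinds of connection exist and that switching between them is possible.

```python
# Sketch of selecting a link between the first device and an associated device.
# Preferring a direct transport and falling back to a cloud relay is an assumed
# policy, not something the patent text mandates.

def choose_link(direct_available: bool, cloud_reachable: bool) -> str:
    """Return which kind of connection the pair of devices would use."""
    if direct_available:      # e.g. Bluetooth or WiFi peer-to-peer
        return "direct"
    if cloud_reachable:       # e.g. both devices can reach a cloud server
        return "indirect"
    return "unavailable"
```

During cooperation, re-evaluating `choose_link` as conditions change models the switching between direct and indirect connections that the text allows.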
Fig. 1 shows a schematic structural diagram of an electronic device 100. Taking the electronic device as a mobile phone as an example, the electronic device 100 may include a processor 110, an external memory interface 120, an internal memory 121, a USB interface 130, a charging management module 140, a power management module 141, a battery 142, an antenna 1, an antenna 2, a mobile communication module 150, a wireless communication module 160, an audio module 170, a speaker 170A, a receiver 170B, a microphone 170C, an earphone interface 170D, a sensor module 180, a button 190, a motor 191, an indicator 192, a camera 193, a display screen 194, a SIM card interface 195, and the like. The sensor module 180 may include a gyroscope sensor 180A, an acceleration sensor 180B, an air pressure sensor 180C, a magnetic sensor 180D, an ambient light sensor 180E, a proximity light sensor 180G, a fingerprint sensor 180H, a temperature sensor 180J, and a touch sensor 180K (of course, the electronic device 100 may further include other sensors, such as a temperature sensor, a pressure sensor, a distance sensor, a bone conduction sensor, and the like, which are not shown in the figure).
It is to be understood that the illustrated structure of the embodiment of the present invention does not specifically limit the electronic device 100. In other embodiments of the present application, electronic device 100 may include more or fewer components than shown, or some components may be combined, some components may be split, or a different arrangement of components. The illustrated components may be implemented in hardware, software, or a combination of software and hardware.
Processor 110 may include one or more processing units, such as: the processor 110 may include an Application Processor (AP), a modem processor, a Graphics Processing Unit (GPU), an Image Signal Processor (ISP), a controller, a memory, a video codec, a Digital Signal Processor (DSP), a baseband processor, and/or a Neural-Network Processing Unit (NPU), etc. The different processing units may be separate devices or may be integrated into one or more processors. The controller may be, among other things, a neural center and a command center of the electronic device 100. The controller can generate an operation control signal according to the instruction operation code and the timing signal to complete the control of instruction fetching and instruction execution.
A memory may also be provided in processor 110 for storing instructions and data. In some embodiments, the memory in the processor 110 is a cache memory. The memory may hold instructions or data that have just been used or recycled by the processor 110. If the processor 110 needs to reuse the instruction or data, it can be called directly from the memory. Avoiding repeated accesses reduces the latency of the processor 110, thereby increasing the efficiency of the system.
The processor 110 may run the screen projection method provided in the embodiments of the present application, so as to enrich the screen projection function, improve the flexibility of screen projection, and improve the user experience. The processor 110 may include different devices; for example, when a CPU and a GPU are integrated, the CPU and the GPU may cooperate to execute the screen projection method provided in the embodiments of the present application: part of the algorithm is executed by the CPU and another part by the GPU, to obtain faster processing efficiency.
The display screen 194 is used to display images, video, and the like. The display screen 194 includes a display panel. The display panel may adopt a liquid crystal display (LCD), an organic light-emitting diode (OLED), an active-matrix organic light-emitting diode (AMOLED), a flexible light-emitting diode (FLED), a Mini LED, a Micro LED, a Micro OLED, a quantum dot light-emitting diode (QLED), or the like. In some embodiments, the electronic device 100 may include 1 or N display screens 194, where N is a positive integer greater than 1. The display screen 194 may be used to display information input by or provided to the user as well as various graphical user interfaces (GUIs). For example, the display 194 may display a photograph, a video, a web page, or a file. As another example, the display 194 may display a graphical user interface that includes a status bar, a hideable navigation bar, a time and weather widget, and application icons such as a browser icon. The status bar includes the operator name (e.g., China Mobile), the mobile network (e.g., 4G), the time, and the remaining battery level. The navigation bar includes a back key icon, a home key icon, and a forward key icon. Further, in some embodiments, the status bar may also include a Bluetooth icon, a Wi-Fi icon, an external-device icon, and the like. In other embodiments, the graphical user interface may also include a Dock bar, and the Dock bar may include icons of commonly used applications. When the processor detects a touch event of a user's finger (or a stylus, etc.) on an application icon, in response to the touch event, the user interface of the application corresponding to the icon is opened and displayed on the display 194.
In this embodiment, the display screen 194 may be an integrated flexible display screen, or may be a spliced display screen formed by two rigid screens and a flexible screen located between the two rigid screens. After the processor 110 runs the screen projection method provided by the embodiment of the present application, the processor 110 may control an external audio output device to switch the output audio signal.
The camera 193 (a front camera or a rear camera, or a camera that can serve as both) is used to capture still images or video. In general, the camera 193 may include a photosensitive element such as a lens group and an image sensor, where the lens group includes a plurality of lenses (convex or concave) for collecting the optical signal reflected by the object to be photographed and transferring it to the image sensor. The image sensor generates an original image of the object to be photographed according to the optical signal.
The internal memory 121 may be used to store computer-executable program code, which includes instructions. The processor 110 executes various functional applications of the electronic device 100 and data processing by executing instructions stored in the internal memory 121. The internal memory 121 may include a program storage area and a data storage area. Wherein the storage program area may store an operating system, codes of application programs (such as a camera application, a WeChat application, etc.), and the like. The storage data area may store data created during use of the electronic device 100 (e.g., images, videos, etc. captured by a camera application), and the like.
The internal memory 121 may further store one or more computer programs corresponding to the screen projection method provided in the embodiments of the present application. The one or more computer programs are stored in the internal memory 121, are configured to be executed by the one or more processors 110, and include instructions; the instructions may include an account verification module, a priority comparison module, and a state synchronization module. The account verification module is used to authenticate system accounts of other terminal devices in the local area network. The priority comparison module can be used to compare the priority of an audio output request service with the priority of the service currently output by the audio output device. The state synchronization module can be used to synchronize the device state of the audio output device currently accessed by the terminal device to other terminal devices, or to synchronize the device state of the audio output device currently accessed by another device locally. When the code of the screen projection method stored in the internal memory 121 is executed by the processor 110, the processor 110 may control the transmitting end to perform screen projection data processing.
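The priority comparison module's decision can be sketched as below. The service names, the numeric priorities, and the strict "outranks" rule are illustrative assumptions; the text only states that the request's priority is compared with that of the current output service.

```python
# Hypothetical priority-comparison module: the priority table and preemption
# rule are assumptions for illustration only.

PRIORITY = {"call": 3, "navigation": 2, "music": 1}

def should_preempt(requested_service: str, current_service: str) -> bool:
    """Grant the audio output request only if it outranks the current service."""
    return PRIORITY.get(requested_service, 0) > PRIORITY.get(current_service, 0)
```

With strict comparison, a request at the same priority as the current service does not interrupt it, which avoids two equal-priority services ping-ponging over the audio output device.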
In addition, the internal memory 121 may include a high-speed random access memory, and may further include a nonvolatile memory, such as at least one magnetic disk storage device, a flash memory device, or a universal flash storage (UFS) device.
Of course, the codes of the screen projection method provided by the embodiment of the application can also be stored in the external memory. In this case, the processor 110 may execute the code of the screen projection method stored in the external memory through the external memory interface 120, and the processor 110 may control the transmitting end to perform the screen projection data processing.
The gyro sensor 180A may be used to determine the motion attitude of the electronic device 100. In some embodiments, the angular velocity of electronic device 100 about three axes (i.e., the x, y, and z axes) may be determined by gyroscope sensor 180A. That is, the gyro sensor 180A may be used to detect the current motion state of the electronic device 100, such as shaking or standing still.
When the display screen in the embodiment of the present application is a foldable screen, the gyro sensor 180A may be used to detect a folding or unfolding operation acting on the display screen 194. The gyro sensor 180A may report the detected folding operation or unfolding operation as an event to the processor 110 to determine the folded state or unfolded state of the display screen 194.
The acceleration sensor 180B may detect the magnitude of the acceleration of the electronic device 100 in various directions (typically along three axes); that is, the acceleration sensor 180B may be used to detect the current motion state of the electronic device 100, such as shaking or standing still. When the display screen in the embodiments of the present application is a foldable screen, the acceleration sensor 180B may be used to detect a folding or unfolding operation acting on the display screen 194. The acceleration sensor 180B may report the detected folding or unfolding operation as an event to the processor 110 to determine the folded or unfolded state of the display screen 194.
The proximity light sensor 180G may include, for example, a light-emitting diode (LED) and a light detector such as a photodiode. The light-emitting diode may be an infrared LED. The mobile phone emits infrared light outward through the LED and uses the photodiode to detect infrared light reflected by nearby objects. When sufficient reflected light is detected, the phone can determine that an object is nearby; when insufficient reflected light is detected, it can determine that no object is nearby. When the display screen in the embodiments of the present application is a foldable screen, the proximity light sensor 180G may be disposed on the first screen of the foldable display screen 194, and may detect the folding or unfolding angle between the first screen and the second screen according to the optical path difference of the infrared signal.
The gyro sensor 180A (or the acceleration sensor 180B) may transmit the detected motion state information (such as an angular velocity) to the processor 110. The processor 110 determines whether the electronic device 100 is currently in the handheld state or the tripod state (for example, when the angular velocity is not 0, the electronic device is in the handheld state) based on the motion state information.
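The handheld/tripod decision above can be sketched as follows. The text only says "when the angular velocity is not 0"; the small epsilon threshold is an added assumption to tolerate sensor noise.

```python
# Sketch of classifying the device state from gyroscope angular velocity
# (x, y, z axes). The epsilon threshold is an assumption; the patent text
# only distinguishes zero from nonzero angular velocity.

def motion_state(angular_velocity, eps=1e-3):
    """Classify the device as handheld when any axis rotates noticeably."""
    return "handheld" if any(abs(w) > eps for w in angular_velocity) else "tripod"
```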
The fingerprint sensor 180H is used to collect fingerprints. The electronic device 100 can use the collected fingerprint characteristics to implement fingerprint unlocking, application-lock access, fingerprint-based photographing, fingerprint-based answering of incoming calls, and so on.
The touch sensor 180K is also referred to as a "touch panel". The touch sensor 180K may be disposed on the display screen 194, and the touch sensor 180K and the display screen 194 form a touch screen, which is also called a "touch screen". The touch sensor 180K is used to detect a touch operation applied thereto or nearby. The touch sensor can communicate the detected touch operation to the application processor to determine the touch event type. Visual output associated with the touch operation may be provided through the display screen 194. In other embodiments, the touch sensor 180K may be disposed on a surface of the electronic device 100, different from the position of the display screen 194.
Illustratively, the display screen 194 of the electronic device 100 displays a main interface including icons for a plurality of applications (e.g., a camera application, a WeChat application, etc.). The user clicks the icon of the camera application in the home interface through the touch sensor 180K, which triggers the processor 110 to start the camera application and open the camera 193. The display screen 194 displays an interface, such as a viewfinder interface, for the camera application.
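The icon-touch dispatch just described can be illustrated with a simple hit test. The icon names, positions, and hit radius below are hypothetical; a real launcher resolves touches through its view hierarchy rather than raw geometry.

```python
# Illustrative hit test mapping a touch point to a home-screen application
# icon; all coordinates and the radius are made-up values for the sketch.

ICONS = {"camera": (40, 120), "wechat": (120, 120)}  # icon centre coordinates

def hit_test(x, y, radius=30):
    """Return the app whose icon contains the touch point, if any."""
    for app, (cx, cy) in ICONS.items():
        if (x - cx) ** 2 + (y - cy) ** 2 <= radius ** 2:
            return app
    return None
```

A hit on the camera icon is what triggers the processor to start the camera application and open the camera 193, as in the example above.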
The wireless communication function of the electronic device 100 may be implemented by the antenna 1, the antenna 2, the mobile communication module 150, the wireless communication module 160, a modem processor, a baseband processor, and the like.
The antennas 1 and 2 are used for transmitting and receiving electromagnetic wave signals. Each antenna in the electronic device 100 may be used to cover a single or multiple communication bands. Different antennas can also be multiplexed to improve the utilization of the antennas. For example: the antenna 1 may be multiplexed as a diversity antenna of a wireless local area network. In other embodiments, the antenna may be used in conjunction with a tuning switch.
The mobile communication module 150 may provide a solution including 2G/3G/4G/5G wireless communication applied to the electronic device 100. The mobile communication module 150 may include at least one filter, a switch, a power amplifier, a low noise amplifier (LNA), and the like. The mobile communication module 150 may receive electromagnetic waves from the antenna 1, perform processing such as filtering and amplification on the received electromagnetic waves, and transmit the result to the modem processor for demodulation. The mobile communication module 150 may also amplify the signal modulated by the modem processor and convert it into electromagnetic waves that are radiated through the antenna 1. In some embodiments, at least some of the functional modules of the mobile communication module 150 may be disposed in the processor 110. In some embodiments, at least some of the functional modules of the mobile communication module 150 may be disposed in the same device as at least some of the modules of the processor 110. In this embodiment of the application, the mobile communication module 150 may also be configured to exchange information with other terminal devices, for example, to send screen-projection-related data to other terminal devices, or the mobile communication module 150 may be configured to receive a screen-projection request and encapsulate the received request into a message in a specified format.
The modem processor may include a modulator and a demodulator. The modulator is used for modulating a low-frequency baseband signal to be transmitted into a medium-high frequency signal. The demodulator is used for demodulating the received electromagnetic wave signal into a low-frequency baseband signal. The demodulator then passes the demodulated low-frequency baseband signal to a baseband processor for processing. The low-frequency baseband signal is processed by the baseband processor and then transferred to the application processor. The application processor outputs a sound signal through an audio device (not limited to the speaker 170A, the receiver 170B, etc.) or displays an image or video through the display screen 194. In some embodiments, the modem processor may be a stand-alone device. In other embodiments, the modem processor may be independent of the processor 110 and provided in the same device as the mobile communication module 150 or other functional modules.
The wireless communication module 160 may provide a solution for wireless communication applied to the electronic device 100, including Wireless Local Area Networks (WLANs) (e.g., wireless fidelity (Wi-Fi) networks), bluetooth (BT), Global Navigation Satellite System (GNSS), Frequency Modulation (FM), Near Field Communication (NFC), Infrared (IR), and the like. The wireless communication module 160 may be one or more devices integrating at least one communication processing module. The wireless communication module 160 receives electromagnetic waves via the antenna 2, performs frequency modulation and filtering processing on the electromagnetic wave signals, and transmits the processed signals to the processor 110. The wireless communication module 160 may also receive a signal to be transmitted from the processor 110, perform frequency modulation and amplification on the signal, and convert it into electromagnetic waves that are radiated through the antenna 2. In the embodiment of the present application, the wireless communication module 160 is configured to establish a connection with a receiving end and display screen-projection content through the receiving end. Alternatively, the wireless communication module 160 may be used to access an access point device, send a message corresponding to a screen-projection request to other terminal devices, or receive a message corresponding to an audio output request sent from other terminal devices.
In addition, the electronic device 100 may implement audio functions, such as music playing and recording, through the audio module 170, the speaker 170A, the receiver 170B, the microphone 170C, the earphone interface 170D, and the application processor. The electronic device 100 may receive key 190 inputs, generating key signal inputs related to user settings and function control of the electronic device 100. The electronic device 100 may generate a vibration alert (e.g., an incoming call vibration alert) using the motor 191. The indicator 192 in the electronic device 100 may be an indicator light and may be used to indicate a charging status, a power change, or a message, a missed call, a notification, etc. The SIM card interface 195 in the electronic device 100 is used to connect a SIM card. The SIM card can be brought into contact with or separated from the electronic device 100 by being inserted into or pulled out of the SIM card interface 195.
It should be understood that, in practical applications, the illustrated electronic device 100 is merely an example: the electronic device 100 may include more or fewer components than those shown in fig. 1, may combine two or more components, or may have a different configuration of components, and the embodiment of the present application is not limited thereto. The various components shown in the figures may be implemented in hardware, software, or a combination of hardware and software, including one or more signal processing and/or application specific integrated circuits.
Fig. 2 shows a block diagram of a software structure of the electronic device 100. The layered architecture divides the software into several layers, each layer having a clear role and division of labor. The layers communicate with each other through software interfaces. In some embodiments, the Android system is divided, from top to bottom, into four layers: an application layer, an application framework layer, an Android runtime and system library layer, and a kernel layer. The application layer may include a series of application packages.
As shown in fig. 2, the application package may include applications such as camera, gallery, calendar, phone call, map, navigation, WLAN, bluetooth, music, video, short message, etc.
The application framework layer provides an Application Programming Interface (API) and a programming framework for the application program of the application layer. The application framework layer includes a number of predefined functions.
As shown in FIG. 2, the application framework layers may include a window manager, content provider, view system, phone manager, resource manager, notification manager, and the like.
The window manager is used for managing window programs. The window manager can obtain the size of the display screen, determine whether there is a status bar, lock the screen, capture the screen, and the like.
The content provider is used to store and retrieve data and make it accessible to applications. The data may include video, images, audio, calls made and received, browsing history and bookmarks, phone books, etc.
The view system includes visual controls such as controls to display text, controls to display pictures, and the like. The view system may be used to build applications. The display interface may be composed of one or more views. For example, the display interface including the short message notification icon may include a view for displaying text and a view for displaying pictures.
The phone manager is used to provide communication functions of the electronic device 100. Such as management of call status (including on, off, etc.).
The resource manager provides various resources for the application, such as localized strings, icons, pictures, layout files, video files, and the like.
The notification manager enables an application to display notification information in the status bar and can be used to convey notification-type messages; such a notification can disappear automatically after a short stay without user interaction. For example, the notification manager is used to notify of download completion, provide message alerts, and the like. The notification manager may also present a notification in the form of a chart or scroll-bar text in the status bar at the top of the system, such as a notification of an application running in the background, or a notification that appears on the screen in the form of a dialog window. For example, text information may be prompted in the status bar, a prompt tone may be sounded, the electronic device may vibrate, or an indicator light may flash.
The Android runtime comprises a core library and a virtual machine, and is responsible for scheduling and managing the Android system.
The core library comprises two parts: one part consists of functions that the Java language needs to call, and the other part is the core library of Android.
The application layer and the application framework layer run in the virtual machine. The virtual machine executes the Java files of the application layer and the application framework layer as binary files, and is used to perform functions such as object life cycle management, stack management, thread management, security and exception management, and garbage collection.
The system library may include a plurality of functional modules. For example: surface managers (surface managers), media libraries (media libraries), three-dimensional graphics processing libraries (e.g., OpenGL ES), 2D graphics engines (e.g., SGL), and the like.
The surface manager is used to manage the display subsystem and provide fusion of 2D and 3D layers for multiple applications.
The media library supports playback and recording of a variety of commonly used audio and video formats, as well as still image files, etc. The media library may support a variety of audio and video encoding formats, such as MPEG4, H.264, MP3, AAC, AMR, JPG, and PNG.
The three-dimensional graphic processing library is used for realizing three-dimensional graphic drawing, image rendering, synthesis, layer processing and the like.
The 2D graphics engine is a drawing engine for 2D drawing.
The kernel layer is a layer between hardware and software. The kernel layer comprises at least a display driver, a camera driver, an audio driver, and a sensor driver.
In the second section, example application scenarios disclosed in the embodiments of the present application are described below.
Illustratively, the technical solution of the embodiment of the present application may be applied to the device cooperation system 30 as shown in fig. 3a. The device cooperation system 30 may include a first device 310 and a plurality of associated devices 320. The plurality of associated devices 320 may include an electronic device 320A, an electronic device 320B, and an electronic device 320C. Meanwhile, the first device 310 and the plurality of associated devices 320 may be communicatively connected to each other through a wireless network or a wired connection.
Specifically, the first device 310 and the multiple associated devices 320 may be devices under the same user account. For example, when a user logs in to a mobile phone, a desktop computer, a smart screen, a notebook computer, a relay device and a smart watch using the same user account, the first device 310 and/or the associated device 320 may be the mobile phone, the desktop computer, the smart screen, the notebook computer, the relay device and the smart watch, and the mobile phone, the desktop computer, the smart screen, the notebook computer, the relay device and the smart watch may communicate with each other through a wireless network.
Specifically, the first device 310 and the plurality of association devices 320 may be connected to the same WLAN network through a relay device (e.g., a router). For example, when a user accesses a mobile phone, a desktop computer, a smart screen, a notebook computer and a smart watch to a Wi-Fi network provided by a relay device, the first device 310 and the associated device 320 may include the mobile phone, the desktop computer, the smart screen, the notebook computer, the relay device and the smart watch, and the mobile phone, the desktop computer, the smart screen, the notebook computer, the relay device and the smart watch form a WLAN network, so that the devices in the WLAN network can communicate with each other through the relay device.
Further, the first device may call at least one associated device to display a screen of its currently running application, and at the same time, the first device may call at least one associated device to operate its current application. For example, the first device 310 calls the electronic device 320A to display a screen, while the first device 310 calls the electronic device 320B to operate a current application and calls the electronic device 320C to play a sound.
It should be understood that the device coordination system 30 may also include other numbers of electronic devices, which are not specifically limited herein.
Illustratively, the technical solution of the embodiment of the present application may be applied to the device coordination system 30 as shown in fig. 3 b. The device coordination system 30 may include a first device 310, a plurality of associated devices 320, and a cloud device 330. The first device 310 and the multiple associated devices 320 may be devices under the same user account, and the cloud device 330 is configured to manage the devices under the same user account, and store a function list and authorization information of the first device 310 and the multiple associated devices 320. The first device 310 may obtain a plurality of associated devices 320 that meet the functional requirements of the first device from the cloud device while requesting device cooperation.
In the third section, the scope of protection of the claims disclosed in the embodiments of the present application is described below.
Referring to fig. 4, fig. 4 is a flowchart illustrating a device coordination method according to an embodiment of the present application, and as shown in fig. 4, the device coordination method includes the following operations.
S410, a first interface is displayed on a first device, where the first interface is used for displaying a function list associated with a target application program, the target application program runs on the first device, and the function list comprises a separation control option of the target application program.
In the embodiment of the application, when a user opens an application on the first device, the user can start the related collaborative process through a preset operation. The preset operation may be set by the user or by the system of the first device, which is not limited in this embodiment of the application. For example, on a mobile phone, the collaborative flow may be invoked by long-pressing the screen of the currently running application; on a computer, the collaborative process may be invoked by right-clicking on a running application. After the collaborative process is invoked, the system displays a first window on the current interface, where the content of the first window is associated with the user, the current application, and the first device.
Optionally, the function list further includes a separate screen option and/or a separate sound option.
It is understood that the manner in which each device obtains the service list may be implemented differently on different devices, some may be coupled to the mechanisms of the operating system, and the timing of obtaining the service list and reporting the service list may be different on different devices.
For different devices and different target applications, the functions included in the function list differ. After the first device used by the user starts the collaboration flow, a function list that needs to be collaborated under the target application program may be determined, where the function list may include a separation control option, a separation screen option, and a separation sound option. For example, when the current target application is a game-type application, the function list may include a separation control option, a separation screen option, and a separation sound option. When the current target application is an album, the function list may include a separation control option and a separation screen option. When the current target application is a voice call, the function list may include a separation control option and a separation sound option.
Further, the function list may also include a separate location option, a separate payment option, a separate search option, and the like, depending on the functions implemented by the target application.
In actual operation, an application program may implement multiple tasks, and the function list required for each task may differ. For example, when the target application program is social software and the user makes a video call using the social software, the required function list of the target application program includes a separation control option, a separation screen option, and a separation sound option; when the user makes a voice call using the social software, the required function list includes a separation control option and a separation sound option. Thus, the manner of determining the function list of the current target application may include, but is not limited to: the first device used by the user starting a certain APP; the first device executing a certain task in that APP; and the user inputting a trigger instruction in a set manner while the first device executes the task.
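The per-task function lists described above can be sketched as a simple lookup. The sketch below is illustrative only: the option constants and task names are hypothetical, not identifiers from the embodiment itself.

```python
# Hypothetical sketch of how a first device might determine the function
# list for the current task of a target application. Names are illustrative.

SEPARATE_CONTROL = "separate_control"
SEPARATE_SCREEN = "separate_screen"
SEPARATE_SOUND = "separate_sound"

# Per-task function lists, following the examples in the text: a video call
# needs control, screen, and sound; a voice call needs control and sound.
TASK_FUNCTION_LISTS = {
    ("social", "video_call"): [SEPARATE_CONTROL, SEPARATE_SCREEN, SEPARATE_SOUND],
    ("social", "voice_call"): [SEPARATE_CONTROL, SEPARATE_SOUND],
    ("album", "browse"): [SEPARATE_CONTROL, SEPARATE_SCREEN],
    ("game", "play"): [SEPARATE_CONTROL, SEPARATE_SCREEN, SEPARATE_SOUND],
}

def function_list(app: str, task: str) -> list:
    """Return the function list to show on the first interface."""
    # Fall back to the separation control option alone for unknown tasks
    # (an assumption; the embodiment does not specify a default).
    return TASK_FUNCTION_LISTS.get((app, task), [SEPARATE_CONTROL])

print(function_list("social", "voice_call"))
```

Keying the list on (application, task) rather than application alone reflects the point made above: the same social application needs different cooperative functions for a video call than for a voice call.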
For example, the manner in which the user invokes the collaborative process may further include other triggering manners, such as Near Field Communication (NFC), Ultra Wide Band (UWB), and the like. Taking NFC as an example, when the user is playing a game on a mobile phone, the user may touch a television with the phone, and the television may be determined as an associated device that meets the collaboration requirement of the mobile phone; when the user touches a smart speaker, the smart speaker may be determined as an associated device that meets the collaboration requirement of the mobile phone; and when the user touches a computer, the computer or a device attached to the computer may be determined as an associated device that meets the collaboration requirement of the mobile phone.
Illustratively, the manner in which the user invokes the collaborative flow may also include triggering by a contactless specific gesture. The first device can detect a non-contact gesture operation near the first device, and when the non-contact gesture operation meets a preset condition, display of the first interface is triggered; the function for which the first device needs assistance and at least one associated device capable of providing that assistance are then determined according to the user's selection. The user only needs to perform a non-contact gesture operation near the first device to start the assisting operation process, without touching the first device or the associated device, which is convenient and user-friendly.
It should be noted that the first device and the associated devices of the present invention are classified only by function. In practical applications, each device in the device cooperation system may have the functions of both the first device and an associated device, and when performing the cooperative response, a device may act as the first device or an associated device depending on the specific situation.
It should be noted that, while the first device controls the plurality of associated devices to perform the cooperation process, the first device itself may also participate in the cooperation process. Specifically, at least one associated device and the first device perform cooperation processing according to their cooperation response information.
For example, the first interface may be suspended and disposed at a fixed position of the current interface. As shown in fig. 5a, the starting coordinate position 1 at the upper left corner and the coordinate position 2 at the lower right corner of the first interface may be set by a user, or may be set by a system of the first electronic device, which is not limited in this application.
Illustratively, the display position of the first interface may also be determined according to a display direction of the screen of the first device. For example, when the screen of the first device is displayed in a landscape orientation, the display position of the first interface may be as shown in fig. 5 a; when the screen of the first device is displayed in a portrait orientation, the display position of the first interface may be as shown in fig. 5 b.
Illustratively, the display position of the first interface may be movable. When the user drags or slides the first interface, the first interface can move according to the direction in which the user drags or slides.
Illustratively, the display position of the first interface may be random. The display position of the first interface may be a position displayed last time by the user, a display position randomly determined by the system, or a position touched by the user, which is not limited in the embodiment of the present application.
Optionally, the device operating system and/or the application program in the first device provides the function list.
In the embodiment of the present application, the first device may include applications provided by the device operating system, for example, voice call, sound playing, position locating, and the like. The first device may also include applications that the user installs for their own services, for example, applications for social interaction, watching movies and videos, listening to music, and the like. Each application has its own provided function list. Since the functions in the function list can be divided into services provided by the device operating system on the first device and services provided by the user's applications, when the user starts a collaborative process in different applications, the function list changes with the device list under the user's name and with the different applications. Only the functions that are currently available are displayed on the first interface.
Wherein the method further comprises: when the first device is started, the first device scans the device operating system and/or all the application programs to obtain a service list of the first device, wherein the service list comprises a function list provided by the device operating system and/or all the application programs; and the first equipment reports the service list to the cloud equipment.
Specifically, at startup, the first device scans the applications developed in a given manner on the first device and records the functions that these applications can provide in a list. Meanwhile, when an application program is installed or uninstalled, the first device can monitor the application program and record the monitored content. When the functions provided by the device operating system and/or the applications on the first device change, the first device reports the changes to the cloud device so as to synchronize them.
Further, the devices under the management of the cloud device may also upload their service lists to the cloud device, and the cloud device may store the uploaded service lists at the granularity of device, application program, and function. When the first device needs other devices to cooperatively complete a certain function, the first device can directly search, through the cloud device, for a device capable of realizing the function that needs to be coordinated. The device capabilities are adapted by a system application, so that a third-party application can realize cooperation among devices without any adaptation in the development stage.
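The scan-and-report flow described above can be sketched with an in-memory stand-in for the cloud device. This is an illustrative sketch under assumed data layouts, not the patented implementation: the class and field names are invented for the example.

```python
# Illustrative sketch: a device scans its applications at startup, builds a
# service list, and "reports" it to an in-memory stand-in for the cloud device.

class CloudDevice:
    """Stores reported service lists at device/application/function granularity."""
    def __init__(self):
        self.service_lists = {}  # device_id -> {app_name: [functions]}

    def report(self, device_id, service_list):
        # Overwriting the previous entry models re-reporting after a change.
        self.service_lists[device_id] = service_list

class FirstDevice:
    def __init__(self, device_id, installed_apps):
        self.device_id = device_id
        # installed_apps: app_name -> list of functions the app provides
        self.installed_apps = dict(installed_apps)

    def scan_service_list(self):
        # On startup, scan the OS and all applications for provided functions.
        return {app: list(funcs) for app, funcs in self.installed_apps.items()}

    def report_to_cloud(self, cloud):
        cloud.report(self.device_id, self.scan_service_list())

cloud = CloudDevice()
phone = FirstDevice("phone-1",
                    {"os": ["separate_sound"], "video_app": ["separate_screen"]})
phone.report_to_cloud(cloud)
print(cloud.service_lists["phone-1"])
```

Calling `report_to_cloud` again after an install or uninstall would resend the fresh scan, mirroring the synchronization-on-change behavior described in the text.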
S420, in response to a first selection operation for the separation control option, the first device displays a second interface, where the second interface is used for displaying a first associated device list associated with the first device.
The first selection operation is mainly used for selecting, from the function list, a function of the target application program that needs to be operated cooperatively. The first selection operation may be any preset operation, and may be set by the user or by the system of the first electronic device, which is not limited in the present application. The first selection operation may take different forms depending on the first device and its display mode. For example, for a first device that supports a touch screen, such as a smartphone, the first selection operation may be a touch operation; for a first device controlled by a mouse, such as a desktop computer, the first selection operation may be a click operation; for a remotely controlled first device, such as a smart television, the first selection operation may be a remote-control selection operation.
Wherein the method further comprises: in response to a third selection operation for the split screen option, the first device displays the second interface for displaying a second associated device list associated with the first device; and/or, in response to a fifth selection operation for the separate sound option, the first device displays the second interface for displaying a third list of associated devices associated with the first device.
In this application, the first associated device list includes an associated device list capable of implementing an operating function; the second associated device list includes an associated device list capable of displaying a screen; the third list of associated devices includes a list of associated devices capable of playing sound. It should be noted that, when the function list includes the separate screen option and/or the separate sound option, the first device may also display the second interface in response to the first selection operation for the separate screen option and/or the separate sound option.
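The mapping from the selected option to the associated-device list shown on the second interface can be written out directly. The sketch below is an assumption based on the description above; the string keys and values are illustrative names, not identifiers from the embodiment.

```python
# Illustrative mapping (assumed from the embodiment's description) from the
# option selected on the first interface to the associated-device list that
# the second interface displays.

OPTION_TO_DEVICE_LIST = {
    "separation_control": "first associated device list",   # devices able to operate the app
    "separation_screen":  "second associated device list",  # devices able to display a screen
    "separation_sound":   "third associated device list",   # devices able to play sound
}

def device_list_for(option: str) -> str:
    """Return which associated-device list the second interface should show."""
    return OPTION_TO_DEVICE_LIST[option]

print(device_list_for("separation_screen"))
```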
Specifically, after the user starts the collaboration flow of the target application program on the first device, the first interface is displayed on the current interface of the first device, and the functions for which the target application program can perform collaboration are displayed in the first interface, including at least one of a separation control option, a separation screen option, and a separation sound option. The user then selects, through the first selection operation, the function that needs to be coordinated from the function list according to his or her own needs.
In one possible implementation manner, the target associated device list is obtained from a cloud device, and the target associated device list includes at least one of the first associated device list, the second associated device list, and the third associated device list.
The first device and the other devices under the management of the cloud device can upload their service lists to the cloud device, and the cloud device may store the uploaded service lists at the granularity of device, application program, and function. When the first device has determined a function that the target application program needs to cooperate on, the first device may obtain, from the cloud device, the devices capable of implementing that function.
Optionally, the method further includes: the first device reports a target account to the cloud device, where the target account is the user account logged in on the first device.
When multiple devices are managed through the cloud device, they can be managed according to the user account logged in on each device. When a user logs in to the first device using a user account, the first device may upload the logged-in user account to the cloud device, and the cloud device may store the service list at the granularity of the user account. The user account logged in on the first device may be the currently logged-in terminal system account or a currently logged-in application account.
For example, after a user logs in by using a user account, the device under the management range of the cloud device may upload the user account that logs in the device to the cloud device.
The user account logged in on each associated device in the target associated device list is the target account.
Specifically, after receiving the service lists uploaded by the devices, the cloud device may store the service lists of the devices according to user account; specifically, the service lists uploaded by devices under the same user account are stored together. Therefore, when the first device requests the cloud device to realize the selected function requiring cooperation, the cloud device may search for devices that are under the same user account as the first device and that can realize the function requiring cooperation, that is, associated devices, and send the found devices to the first device in the form of a list. The first device displays the found devices in the second interface, so that the user can execute the function of the first device through other devices. Management by user account prevents others from using the user's devices at will.
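The account-scoped lookup just described can be sketched as a filter over the stored device records. This is a hedged illustration under an assumed data layout; the device IDs, account names, and function strings are invented for the example.

```python
# Sketch: how a cloud device might look up associated devices that are logged
# in with the same user account as the first device and that can provide a
# requested function. The data layout is an assumption for illustration.

DEVICES = [
    {"id": "gamepad", "account": "user-a", "functions": {"separate_control"}},
    {"id": "tv",      "account": "user-a", "functions": {"separate_screen", "separate_sound"}},
    {"id": "speaker", "account": "user-b", "functions": {"separate_sound"}},
]

def associated_devices(account: str, required: str, requester_id: str = None):
    """Return devices under the same account that provide the required function."""
    return [
        d["id"] for d in DEVICES
        if d["account"] == account          # same user account only
        and required in d["functions"]      # can realize the cooperative function
        and d["id"] != requester_id         # exclude the requesting first device
    ]

print(associated_devices("user-a", "separate_sound"))
```

Note how the account filter keeps `speaker` (logged in under a different account) out of the result even though it provides the requested function, which is exactly the access-control property the text attributes to user-account management.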
In this embodiment of the application, after the user performs a first selection operation on the first interface, the first device displays a second interface on the current interface, and one or more associated devices acquired from the cloud device are displayed in the second interface. For example, as shown in fig. 5c, the user selects the separation control option on the first interface through a first selection operation; the first device responds to the first selection operation and displays the second interface on the current interface, where a gamepad, a mouse, a keyboard, and mobile-phone touch control, all capable of operating the target application program of the first device, are displayed in the second interface. As shown in fig. 5d, the user selects the separation screen option through the first selection operation on the first interface; the first device responds to the first selection operation and displays the second interface on the current interface, where a living room television and my tablet, capable of displaying a screen for the target application program of the first device, are displayed. As shown in fig. 5e, the user selects the separation sound option through the first selection operation on the first interface; the first device responds to the first selection operation and displays the second interface on the current interface, where a bluetooth headset, a bluetooth speaker, and a living room television, capable of playing sound for the target application program of the first device, are displayed.
For example, the display position of the second interface may be determined according to the display position of the first interface. When the first selection operation selects the separate operation option, the second interface may be displayed beside the separate operation option, as shown in fig. 5c; when the first selection operation selects the separate screen option, the second interface may be displayed beside the separate screen option, as shown in fig. 5d.
Illustratively, the display position of the second interface may also be determined according to the display orientation of the screen of the first device. For example, when the screen of the first device is in landscape orientation, the second interface may be displayed beside the selected option, as shown in figs. 5c-5e; when the screen of the first device is in portrait orientation, the second interface may be displayed floating over the first interface, as shown in fig. 5f.
Illustratively, the display position of the second interface may be movable. When the user drags or slides the second interface, the second interface moves in the direction of the user's drag or slide.
Illustratively, the display position of the second interface is not fixed. It may be the position where the second interface was last displayed, a position randomly determined by the system, or the position touched by the user, which is not limited in the embodiment of the present application.
In a possible implementation manner, the associated devices may be devices that the first device can detect and connect to, and depending on the scene where the user is located, the number of associated devices that finally meet the requirement may be 0, 1, or more. For example, assume that in the current scenario the function list of the target application includes a separate operation option, a separate screen option, and a separate sound option, and that the scenario contains device A (capable of implementing the functions of the separate operation option and the separate screen option), device B (capable of implementing the functions of the separate operation option and the separate sound option), and device C (capable of implementing the functions of the separate operation option, the separate screen option, and the separate sound option). The associated device combinations that satisfy the requirements of the current scenario then include 5 combinations, namely device C alone, device A + device B, device A + device C, device B + device C, and device A + device B + device C.
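The combination search in this example amounts to enumerating every device subset whose pooled functions cover all required functions. A sketch, with device and function names chosen only for illustration:

```python
from itertools import combinations

def covering_combinations(devices, required):
    """Enumerate every combination of devices whose pooled functions
    cover all required functions (illustrative sketch)."""
    names = sorted(devices)
    result = []
    for r in range(1, len(names) + 1):
        for combo in combinations(names, r):
            # Pool the functions contributed by each device in the combination.
            pooled = set().union(*(devices[d] for d in combo))
            if set(required) <= pooled:
                result.append(combo)
    return result
```

With device A providing {operation, screen}, device B providing {operation, sound}, and device C providing all three functions, this yields exactly the 5 combinations counted above.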
Optionally, the first device may obtain attribute information of each terminal device within a preset range, where the attribute information includes the functions the terminal device can provide. According to this attribute information, the first device selects the terminal devices capable of realizing the function requiring cooperation and stores them in a target associated device list.
In the scene where a user is located, the first device may initiate a search using the Bluetooth protocol, the Hilink protocol, or the like, detect and connect to peripheral terminal devices, and then read the attribute information of each terminal device to determine the functions each can implement. For example, it may read the profile information of a peripheral terminal device to learn which functions that device has (the Bluetooth protocol specifies many profiles, each describing a function; the Hilink protocol is similar). In addition, each terminal device in the scene may broadcast the functions it can realize. After determining which terminal devices exist nearby and which functions each has, the first device can select from them one or more terminal devices capable of realizing the function requiring cooperation selected by the user, that is, the determined associated devices.
Optionally, the first device may further obtain a pre-stored device function comparison table, in which the functions each terminal device can provide are recorded. According to the device function comparison table, the first device determines the terminal devices within a preset range that can realize the function requiring cooperation and stores them in the target associated device list.
In another mode, the first device may pre-construct and store a device function comparison table that records the functions each known type of terminal device can provide; for example, a smart speaker has an audio playing function, and a smart television has both an image display function and an audio playing function. After detecting and connecting peripheral terminal devices in a wired or wireless (such as Bluetooth) manner, the first device can determine the functions each terminal device provides by table lookup, and finally select from them the devices capable of realizing the function requiring cooperation.
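The table-lookup mode above can be sketched as a static mapping from device type to function set, followed by filtering the detected peripherals. The device types and function names are illustrative assumptions:

```python
# Sketch of the pre-stored device-function comparison table described above.
DEVICE_FUNCTION_TABLE = {
    "smart_speaker": {"audio_playback"},
    "smart_tv": {"image_display", "audio_playback"},
    "keyboard": {"input"},
}

def functions_of(device_type):
    """Look up the functions a known device type can provide."""
    return DEVICE_FUNCTION_TABLE.get(device_type, set())

def select_capable_devices(detected, required_function):
    """From detected peripherals (device id -> device type), keep those
    whose table entry includes the function requiring cooperation."""
    return [dev_id for dev_id, dev_type in detected.items()
            if required_function in functions_of(dev_type)]
```

An unknown device type simply maps to an empty function set, so unrecognized peripherals are excluded rather than causing an error.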
Optionally, the method further includes: the first device sends a function request to the cloud device, where the function request is used to request a function query, and the function query includes at least one of the following: querying all functions under the target account, querying all associated devices that provide a given function, and querying all functions on a given associated device.
After the user logs in on the first device, the user can request function queries from the cloud device, mainly in the following ways:
1. By querying the function list, obtain all functions that can be provided on all devices under the current user account.
2. By querying a specific function, such as a display function or a positioning function, find all devices that provide it; some functions may be available only on certain devices, so this query screens out a subset of the devices.
3. By querying a device, the first device can find all available functions on the selected device.
With these queries, the user can meet different needs in different application scenarios. For example, a user has a device A and needs another device to implement the separate operation function in cooperation with device A; on finding that there is a device B nearby, the user can determine whether device B can cooperatively implement the operation function by querying all the functions device B can provide.
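The three query modes can be sketched against one account's device-to-function mapping. The class and method names are illustrative assumptions, not from the patent:

```python
class FunctionQueryService:
    """Sketch of the three function-query modes for one user account."""

    def __init__(self, account_devices):
        # account_devices: device_id -> set of function names
        self.account_devices = account_devices

    def all_functions(self):
        """Mode 1: all functions available on all devices under the account."""
        out = set()
        for funcs in self.account_devices.values():
            out |= funcs
        return out

    def devices_with(self, function):
        """Mode 2: all devices that provide a given function."""
        return sorted(d for d, f in self.account_devices.items() if function in f)

    def functions_on(self, device_id):
        """Mode 3: all available functions on a selected device."""
        return self.account_devices.get(device_id, set())
```

In the device A/device B example above, the user would call the mode-3 query on device B and check whether the operation function appears in the result.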
S430, responding to a second selection operation aiming at least one associated device in the first associated device list, so that the target application program of the first device is controlled by the at least one associated device.
In this embodiment of the application, after the second interface is displayed, the first device may determine, according to a second selection operation of the user, at least one associated device for cooperation, so that the at least one associated device manipulates the target application program of the first device.
Optionally, the method further includes: in response to a fourth selection operation for at least one associated device in the second list of associated devices, causing a screen of the target application of the first device to be displayed by the at least one associated device; and/or, in response to a sixth selection operation for at least one associated device in the third list of associated devices, causing sound of the target application of the first device to be played by the at least one associated device.
It should be noted that, through multiple selection operations, the user may have the functions in the function list of the target application performed by associated devices. For example, the control function corresponding to the separate operation option in fig. 5a may be implemented by a mouse and a keyboard, the screen display function corresponding to the separate screen option in fig. 5a may be implemented by my tablet, and the sound playing function corresponding to the separate sound option in fig. 5a may be implemented by a bluetooth speaker. The second selection operation is mainly used to select, from the function list, the function of the target application program that needs cooperative operation. The second selection operation may be any preset operation, set either by the user or by the system of the first electronic device, which is not limited in the present application. The second selection operation may take different forms depending on the first device and its display mode. For example, for a first device supporting a touch screen, such as a smartphone, the second selection operation may be a touch-screen operation; for a first device controlled by a mouse, such as a desktop computer, the second selection operation may be a click operation; for a remotely controlled first device, such as a smart television, the second selection operation may be a selection operation performed via the remote control.
Illustratively, after the first device determines at least one associated device according to the operation of the user, the first device may invoke a remote service to perform interaction between the first device and the at least one associated device.
For example, as shown in fig. 6a, device A runs an application whose function list is implemented using the device operating system to provide display, input, and positioning functions. As shown in fig. 6b, when device A starts the cooperation flow, device B provides a positioning function to cooperate with the separate positioning option of the function list on device A, device C provides an input function to cooperate with the separate operation option, and device D provides a display function to cooperate with the separate screen option.
Optionally, the method further includes: the first device sends an acquisition request to the cloud device, wherein the acquisition request is used for requesting to acquire authorization information of a target function on the at least one associated device, and the target function is a function of realizing a split operation function of the target application program; and the first device receives the authorization information sent by the cloud device in response to the acquisition request.
After the first device establishes a connection with the at least one associated device, the first device may verify, through the cloud device, whether the target function on the at least one associated device has been authorized by the user. Specifically, when the user needs cooperation for the separate operation option on the first device, the first device may verify whether the target function on the associated device capable of implementing the operation function is authorized. For example, when a user wants a mouse on a desktop computer to control a game running on the mobile phone, the mobile phone needs to verify whether the mouse on the desktop computer is authorized by the user; when the user wants a smart speaker to play music being played on the mobile phone, the mobile phone needs to verify whether the smart speaker is authorized by the user; when the user wants a smart television to display a video being played on the mobile phone, the mobile phone needs to verify whether the smart television is authorized by the user.
Optionally, the method further includes: if the authorization information comprises an authorization result of the target function on first associated equipment, the first equipment calls the target function on the first associated equipment, and the at least one associated equipment comprises the first associated equipment; if the authorization information does not include the authorization result of the target function on the at least one associated device, the first device terminates calling the target function of the at least one associated device, and displays a third interface, where the third interface is used to prompt that calling the target function of the at least one associated device fails.
Specifically, if the associated device providing the function is authorized, the next step is performed; if not, the call of the whole function is terminated, and a third interface is displayed on the first device to indicate to the user that the cooperative call failed. The display position of the third interface is the same as that of the second interface.
Optionally, the method further includes: and the first equipment detects whether the target function is available, terminates calling the target function and displays the third interface when the target function is unavailable.
If the target function is not running, the associated device starts it; if starting the target function fails, the call is terminated, and a third interface is displayed on the first device to indicate to the user that this cooperative call failed.
In summary, the device cooperation method provided by the present application has been introduced with reference to the drawings. The method can select at least one associated device from multiple devices to implement the function the user selected on the first device, so that the first device can merge the functions provided by multiple associated devices and call functions of multiple devices of different categories at the same time, achieving cooperative work among multiple devices. Meanwhile, through the user's operation, at least one associated device is selected from the multiple devices to realize manipulation of the target application program on the first device, so that an associated device can control the first device.
In a specific implementation process, as shown in fig. 7a, the device cooperation method provided by the present application may be implemented based on cooperation of three modules, which specifically includes: the system comprises a coordination module, a service management module and a connection module.
Both the first device and the associated device include the cooperation module, the service administration module, and the connection module. The cooperation module is used to start the cooperation process; the service administration module is used to scan the device operating system and/or all the application programs to generate and report a service list; the connection module is used to establish communication connections with the cloud device and other devices.
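The service administration module's scan can be sketched as merging the operating system's functions with each installed application's function list into one reportable service list. Structure and names are illustrative assumptions:

```python
def build_service_list(os_functions, installed_apps):
    """Sketch of the service administration module's start-up scan:
    merge OS-provided functions with each application's function list."""
    # Functions exposed by the device operating system itself.
    service_list = {"system": sorted(os_functions)}
    # Function lists provided by installed applications.
    for app, funcs in installed_apps.items():
        service_list[app] = sorted(funcs)
    return service_list
```

The resulting dictionary is what the connection module would upload to the cloud device as the device's service list.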
The functions on a device may include functions carried by the device itself, such as a display function, a positioning function, an input function, and a sound function, and may further include the function lists provided by downloaded application programs. The functions on the device may be exposed in a form that the service administration module can scan and identify. Developed functions need to meet certain development specifications so that the service administration module on the device can identify them. In particular, default functions are developed on the device to expose its capabilities, such as display, sound, input, and positioning; these are consistent with the capabilities exposed by the operating system of the device and are used by many applications, but are exposed again in the form of well-defined functions, so that remote devices can access them as well.
Illustratively, the development specification of a function is extensible and may support remote calling in various forms, such as mature modes like HTTP and gRPC. Special modes may also be used to meet special requirements, such as proprietary protocols for efficient transmission of streaming data. The framework in this technical scheme can fuse the multiple remote calling modes and adopt the appropriate mode in each scenario.
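One way to "fuse" several remote-call modes behind a single interface is a transport registry that selects a mode per call. This is a sketch under the assumption of such a registry design; the transport names are placeholders:

```python
class RemoteCallFramework:
    """Sketch of fusing multiple remote-call modes (e.g. HTTP, gRPC,
    or a proprietary streaming protocol) behind one registry."""

    def __init__(self):
        self._transports = {}

    def register(self, name, call_fn):
        """Register a transport by name; call_fn performs the remote call."""
        self._transports[name] = call_fn

    def invoke(self, transport, function, *args):
        """Invoke a remote function over the chosen transport."""
        if transport not in self._transports:
            raise ValueError(f"no transport registered: {transport}")
        return self._transports[transport](function, *args)
```

A caller would pick, say, the streaming transport for screen data but plain HTTP for a one-shot query, without the function's development specification changing.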
For example, the device cooperation method provided by the present application also requires the cooperation of the cloud device. As shown in fig. 7b, the cloud device may include a service administration module used to store the service list of each device, a device management module for establishing connections with each device, and a user management module for managing each user.
Illustratively, as shown in fig. 8, the determining, by the first device, of the associated device may further include:
and S81, the first device establishes connection between the first device and the associated device through the connection module on the first device and the device management module on the cloud device.
S82, the first device queries the service administration module on the cloud device to verify whether the first device's use of the target function on the associated device has been authorized by the user.
The service administration module on the cloud device stores the service list of each device and/or the authorization information of each device. After establishing a connection with the associated device, the first device can send a request to the cloud device to obtain the authorization information of the target function on the associated device, so as to call the associated device.
S83, when the service administration module on the cloud device does not store the authorization information of the target function on the associated device, the cloud device requests a one-time use authorization for the associated device.
And S84, when the service administration module on the cloud device stores the authorization information of the target function on the associated device or the cloud device obtains the one-time use authorization of the associated device, the gateway detection module on the associated device detects whether the target function is available.
And S85, when the gateway detection module on the associated equipment detects that the target function is available, the first equipment calls the target function on the associated equipment.
And S86, when the gateway detection module on the associated equipment detects that the target function is unavailable, the associated equipment tries to start the target function.
S87, when the associated device fails to start the target function, or the cloud device does not obtain the one-time use authorization for the associated device, the first device displays that the call to the target function on the associated device failed.
The first device calls the target function on the associated device in a well-defined manner.
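Steps S82-S87 above can be sketched as one decision flow: check stored authorization (falling back to a one-time authorization), check availability (trying to start the function if needed), and only then call the target function. The callback hooks here are illustrative assumptions:

```python
def call_target_function(cloud_authorized, request_one_time_auth,
                         is_available, try_start, invoke):
    """Sketch of the S82-S87 flow for calling a target function on an
    associated device. Each argument is a hook standing in for a module:
    cloud_authorized  - bool, stored authorization on the cloud (S82)
    request_one_time_auth - callable, asks for one-time authorization (S83)
    is_available      - callable, gateway availability check (S84)
    try_start         - callable, attempt to start the function (S86)
    invoke            - callable, the actual remote call (S85)
    """
    if not cloud_authorized:
        # S83: no stored authorization; request a one-time use authorization.
        if not request_one_time_auth():
            return "call failed"          # S87: authorization not granted
    if not is_available():
        # S86: target function not running; try to start it.
        if not try_start():
            return "call failed"          # S87: starting the function failed
    invoke()                              # S85: call the target function
    return "called"
```

Usage-wise, the first device would display the third interface whenever this flow returns "call failed".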
It will be appreciated that the electronic device, in order to implement the above-described functions, includes corresponding hardware and/or software modules for performing each function. The present application can be implemented in hardware or in a combination of hardware and computer software, in conjunction with the exemplary algorithm steps described in connection with the embodiments disclosed herein. Whether a function is performed by hardware or by computer software driving hardware depends on the particular application and the design constraints imposed on the solution. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present application.
In this embodiment, the electronic device may be divided into functional modules according to the above method example, for example, each functional module may be divided corresponding to each function, or two or more functions may be integrated into one processing module. The integrated module may be implemented in the form of hardware. It should be noted that the division of the modules in this embodiment is schematic, and is only a logic function division, and there may be another division manner in actual implementation.
In the case of dividing each functional module by corresponding functions, fig. 9 shows a schematic diagram of a device cooperation apparatus, as shown in fig. 9, the device cooperation apparatus 900 is applied to an electronic device, and the device cooperation apparatus 900 may include: a display unit 910 and a processing unit 920.
Among other things, display unit 910 may be used to support an electronic device performing S410, S420, etc., described above, and/or other processes for the techniques described herein.
The processing unit 920 may be used to support the electronic device in performing the above-described S430, and/or the like, and/or other processes for the techniques described herein.
It should be noted that all relevant contents of each step related to the above method embodiment may be referred to the functional description of the corresponding functional module, and are not described herein again.
The electronic device provided by the embodiment is used for executing the device cooperation method, so that the same effect as the implementation method can be achieved.
In the case where an integrated unit is employed, the electronic device may include a processing module, a storage module, and a communication module. The processing module may be configured to control and manage the actions of the electronic device, for example, to support the electronic device in performing the steps performed by the display unit 910 and the processing unit 920. The storage module may be used to store program code and data for the electronic device. The communication module may be used to support communication between the electronic device and other devices.
The processing module may be a processor or a controller, which may implement or execute the various illustrative logical blocks, modules, and circuits described in connection with this disclosure. A processor may also be a combination that implements computing functions, for example, a combination of one or more microprocessors, or a combination of a digital signal processor (DSP) and a microprocessor. The storage module may be a memory. The communication module may specifically be a radio frequency circuit, a Bluetooth chip, a Wi-Fi chip, or another device that interacts with other electronic devices.
In an embodiment, when the processing module is a processor and the storage module is a memory, the electronic device according to this embodiment may be a device having the structure shown in fig. 1.
The present embodiment also provides a computer storage medium, where computer instructions are stored in the computer storage medium, and when the computer instructions are run on an electronic device, the electronic device is caused to execute the above related method steps to implement the device cooperation method in the above embodiments.
The present embodiment also provides a computer program product, which when running on a computer, causes the computer to execute the relevant steps described above, so as to implement the device cooperation method in the foregoing embodiments.
In addition, embodiments of the present application further provide a first device, where the first device may specifically be a chip, a component, or a module, and the first device may include a processor and a memory connected to each other; the memory is used for storing computer execution instructions, and when the first device runs, the processor may execute the computer execution instructions stored in the memory, so that the chip executes the device cooperation method in the above-mentioned method embodiments.
The electronic device, the computer storage medium, the computer program product, or the chip provided in this embodiment are all configured to execute the corresponding method provided above, so that the beneficial effects achieved by the electronic device, the computer storage medium, the computer program product, or the chip may refer to the beneficial effects in the corresponding method provided above, and are not described herein again.
Through the description of the above embodiments, those skilled in the art will understand that, for convenience and simplicity of description, only the division of the above functional modules is used as an example, and in practical applications, the above function distribution may be completed by different functional modules according to needs, that is, the internal structure of the first device is divided into different functional modules to complete all or part of the above described functions.
In the several embodiments provided in the present application, it should be understood that the disclosed first device and method may be implemented in other manners. For example, the first device embodiment described above is merely illustrative: the division into modules or units is only a logical function division, and there may be other division manners in actual implementation; for example, multiple units or components may be combined or integrated into another first device, or some features may be omitted or not executed. In addition, the mutual coupling, direct coupling, or communication connection shown or discussed may be an indirect coupling or communication connection through some interfaces, first devices, or units, and may be in electrical, mechanical, or other forms.
Units described as separate parts may or may not be physically separate, and parts displayed as units may be one physical unit or a plurality of physical units, may be located in one place, or may be distributed to a plurality of different places. Some or all of the units can be selected according to actual needs to achieve the purpose of the solution of the embodiment.
In addition, functional units in the embodiments of the present application may be integrated into one processing unit, or each unit may exist alone physically, or two or more units are integrated into one unit. The integrated unit can be realized in a form of hardware, and can also be realized in a form of a software functional unit.
The integrated unit, if implemented in the form of a software functional unit and sold or used as a stand-alone product, may be stored in a readable storage medium. Based on such understanding, the technical solutions of the embodiments of the present application, in essence, or the part contributing to the prior art, or all or part of the technical solutions, may be embodied in the form of a software product. The software product is stored in a storage medium and includes several instructions for enabling a device (which may be a single-chip microcomputer, a chip, or the like) or a processor to execute all or part of the steps of the methods of the embodiments of the present application. The aforementioned storage medium includes: various media capable of storing program code, such as a USB flash drive, a removable hard disk, a read-only memory (ROM), a random access memory (RAM), a magnetic disk, or an optical disk.
The above description is only for the specific embodiments of the present application, but the scope of the present application is not limited thereto, and any person skilled in the art can easily conceive of the changes or substitutions within the technical scope of the present application, and shall be covered by the scope of the present application. Therefore, the protection scope of the present application shall be subject to the protection scope of the claims.

Claims (22)

1. A method of device cooperation, the method comprising:
the method comprises the steps that a first interface is displayed on a first device, the first interface is used for displaying a function list associated with a target application program, the target application program runs on the first device, and the function list comprises a separation control option of the target application program;
in response to a first selection operation for the separation manipulation option, the first device displays a second interface for displaying a first associated device list associated with the first device;
in response to a second selection operation for at least one associated device in the first list of associated devices, causing the target application of the first device to be manipulated by the at least one associated device.
2. The method of claim 1, wherein the list of functions further comprises a split screen option;
the method further comprises the following steps:
in response to a third selection operation for the split screen option, the first device displays the second interface for displaying a second associated device list associated with the first device;
in response to a fourth selection operation for at least one associated device in the second list of associated devices, causing a screen of the target application of the first device to be displayed by the at least one associated device.
3. The method of claim 1 or 2, wherein the list of functions further comprises a separate sound option;
the method further comprises the following steps:
in response to a fifth selection operation for the separate sound option, the first device displays the second interface for displaying a third list of associated devices associated with the first device;
in response to a sixth selection operation for at least one associated device in the third list of associated devices, causing sound of the target application of the first device to be played by the at least one associated device.
4. The method of any of claims 1-3, wherein a target associated device list is obtained from a cloud device, the target associated device list comprising at least one of the first associated device list, the second associated device list, and the third associated device list.
5. The method of claim 4, further comprising:
and the first equipment reports a target account to the cloud equipment, wherein the target account is a user account logged in by the first equipment.
6. The method according to claim 5, wherein the user account logged in by each associated device in the target associated device list is the target account.
7. The method according to any of claims 1-6, wherein the list of functions is provided by a device operating system and/or an application in the first device.
8. The method of claim 7, further comprising:
when the first device is started, the first device scans the device operating system and/or all the application programs to obtain a service list of the first device, wherein the service list comprises a function list provided by the device operating system and/or all the application programs;
and the first equipment reports the service list to the cloud equipment.
9. The method of claim 8, further comprising:
the first device sends a function request to the cloud device, wherein the function request is used for requesting a function query, and the function query comprises at least one of the following: querying all functions under the target account, querying all associated devices that provide a given function, and querying all functions on a given associated device.
10. The method of claim 9, further comprising:
the first device sends an acquisition request to the cloud device, wherein the acquisition request is used for requesting authorization information of a target function on the at least one associated device, and the target function is a function for implementing the separated operation of the target application program;

the first device receives the authorization information sent by the cloud device in response to the acquisition request.
11. The method of claim 10, further comprising:
if the authorization information comprises an authorization result of the target function on a first associated device, the first device calls the target function on the first associated device, wherein the at least one associated device comprises the first associated device;

if the authorization information does not comprise an authorization result of the target function on the at least one associated device, the first device terminates calling the target function of the at least one associated device and displays a third interface, wherein the third interface is used to prompt that calling the target function of the at least one associated device has failed.
12. The method of claim 11, further comprising:
the first device detects whether the target function is available, and when the target function is unavailable, terminates calling the target function and displays the third interface.
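Claims 10-12 together describe a call gate: the target function is invoked only where the cloud returned an authorization result and the function is currently available; otherwise the call is terminated (and, in the claimed method, the third failure-prompt interface is displayed). A minimal sketch of that decision, with all names hypothetical:

```python
# Illustrative sketch of claims 10-12: invoke the target function on an
# associated device only if the authorization information covers that
# device (claim 11) and the function is currently available (claim 12);
# otherwise terminate the call. All names are hypothetical.

def try_invoke(authorization, device, function, available):
    """Return 'invoked' on success, 'failed' when the call must terminate."""
    if device not in authorization.get(function, set()):
        return "failed"  # no authorization result -> terminate (claim 11)
    if not available:
        return "failed"  # function detected as unavailable (claim 12)
    return "invoked"

auth = {"separation_control": {"tv-B"}}  # function -> authorized devices
print(try_invoke(auth, "tv-B", "separation_control", available=True))
print(try_invoke(auth, "phone-A", "separation_control", available=True))
print(try_invoke(auth, "tv-B", "separation_control", available=False))
```

In the claimed method both failure branches lead to the same third interface, which is why the sketch collapses them into one return value.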
13. An apparatus for device cooperation, the apparatus comprising:
a display unit, configured to display a first interface, wherein the first interface is used for displaying a function list associated with a target application program, the target application program runs on a first device, and the function list comprises a separation control option of the target application program;

the display unit is further configured to display, in response to a first selection operation for the separation control option, a second interface, wherein the second interface is used for displaying a first associated device list associated with the first device;
a processing unit, configured to, in response to a second selection operation for at least one associated device in the first associated device list, cause the target application of the first device to be manipulated by the at least one associated device.
14. The apparatus of claim 13, wherein the list of functions further comprises a split screen option;
the display unit is further configured to: in response to a third selection operation for the split screen option, display the second interface for displaying a second associated device list associated with the first device;
the processing unit is further to: in response to a fourth selection operation for at least one associated device in the second list of associated devices, causing a screen of the target application of the first device to be displayed by the at least one associated device.
15. The apparatus of claim 13 or 14, wherein the list of functions further comprises a separate sound option;
the display unit is further configured to: in response to a fifth selection operation for the separate sound option, display the second interface for displaying a third associated device list associated with the first device;
the processing unit is further to: in response to a sixth selection operation for at least one associated device in the third list of associated devices, causing sound of the target application of the first device to be played by the at least one associated device.
16. The apparatus of any one of claims 13-15, wherein a target associated device list is obtained from a cloud device, the target associated device list comprising at least one of the first associated device list, the second associated device list, and the third associated device list.
17. The apparatus of claim 16, further comprising a transceiver unit,
the transceiver unit is configured to report a target account to the cloud device, wherein the target account is the user account logged in on the first device.
18. The apparatus according to claim 17, wherein the user account logged in on each associated device in the target associated device list is the target account.
19. The apparatus of any of claims 13-18, wherein the list of functions is provided by a device operating system and/or an application in the first device.
20. The apparatus of claim 19,
the processing unit is further configured to scan, when the first device is started, the device operating system and/or all the application programs to obtain a service list of the first device, wherein the service list comprises the function lists provided by the device operating system and/or all the application programs;

the transceiver unit is further configured to report the service list to the cloud device.
21. An electronic device, characterized in that the electronic device comprises a processor, a memory, a communication interface, and one or more programs stored in the memory and configured to be executed by the processor, the one or more programs comprising instructions for performing the steps in the method of any one of claims 1-12.
22. A computer-readable storage medium, characterized in that it stores a computer program for electronic data exchange, wherein the computer program causes a computer to perform the method according to any one of claims 1-12.
CN202011284011.9A 2020-11-16 2020-11-16 Device cooperation method, device, system, electronic device and storage medium Pending CN112286618A (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN202011284011.9A CN112286618A (en) 2020-11-16 2020-11-16 Device cooperation method, device, system, electronic device and storage medium
PCT/CN2021/116501 WO2022100239A1 (en) 2020-11-16 2021-09-03 Device cooperation method, apparatus and system, electronic device and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202011284011.9A CN112286618A (en) 2020-11-16 2020-11-16 Device cooperation method, device, system, electronic device and storage medium

Publications (1)

Publication Number Publication Date
CN112286618A true CN112286618A (en) 2021-01-29

Family

ID=74399056

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202011284011.9A Pending CN112286618A (en) 2020-11-16 2020-11-16 Device cooperation method, device, system, electronic device and storage medium

Country Status (2)

Country Link
CN (1) CN112286618A (en)
WO (1) WO2022100239A1 (en)

Cited By (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113827953A (en) * 2021-09-28 2021-12-24 完美世界(北京)软件科技发展有限公司 Game control system
CN114302501A (en) * 2021-12-31 2022-04-08 联想(北京)有限公司 Method and device for establishing connection and electronic system
CN114422566A (en) * 2021-12-29 2022-04-29 Oppo广东移动通信有限公司 Multi-device connection method, device, system, device and storage medium
CN114491684A (en) * 2022-01-07 2022-05-13 广州三七极耀网络科技有限公司 Terminal device collaborative display method, system, terminal device and medium
CN114489540A (en) * 2022-01-12 2022-05-13 广州三七极耀网络科技有限公司 Method, system, device and medium for cooperatively displaying game pictures
WO2022100239A1 (en) * 2020-11-16 2022-05-19 Oppo广东移动通信有限公司 Device cooperation method, apparatus and system, electronic device and storage medium
CN115002937A (en) * 2022-07-18 2022-09-02 荣耀终端有限公司 Multi-device cooperation method, electronic device and related product
CN115617498A (en) * 2022-12-15 2023-01-17 安徽淘云科技股份有限公司 Application optimization method and device, electronic equipment and storage medium
CN116680020A (en) * 2022-11-22 2023-09-01 荣耀终端有限公司 Multi-device collaborative management method, electronic device and storage medium
WO2023185817A1 (en) * 2022-03-28 2023-10-05 维沃移动通信有限公司 Multi-device cooperation method and apparatus, and electronic device and medium
WO2024027238A1 (en) * 2022-08-05 2024-02-08 荣耀终端有限公司 Multi-device cooperation method, electronic device and related product
WO2024066992A1 (en) * 2022-09-30 2024-04-04 华为技术有限公司 Multi-device networking system and method, and terminal devices

Families Citing this family (1)

Publication number Priority date Publication date Assignee Title
CN116028148B (en) * 2022-08-23 2024-04-12 荣耀终端有限公司 Interface processing method and device and electronic equipment

Citations (4)

Publication number Priority date Publication date Assignee Title
CN109660842A (en) * 2018-11-14 2019-04-19 华为技术有限公司 A kind of method and electronic equipment playing multi-medium data
WO2020034227A1 (en) * 2018-08-17 2020-02-20 华为技术有限公司 Multimedia content synchronization method and electronic device
CN111367448A (en) * 2020-03-10 2020-07-03 北京达佳互联信息技术有限公司 Application function execution method and device, electronic equipment and storage medium
CN111666119A (en) * 2019-03-06 2020-09-15 华为终端有限公司 UI component display method and electronic equipment

Family Cites Families (4)

Publication number Priority date Publication date Assignee Title
EP3195098A2 (en) * 2014-07-21 2017-07-26 Apple Inc. Remote user interface
CN110221798A (en) * 2019-05-29 2019-09-10 华为技术有限公司 A kind of throwing screen method, system and relevant apparatus
CN111523095B (en) * 2020-03-31 2024-03-15 华为技术有限公司 Cross-equipment interaction method and terminal equipment
CN112286618A (en) * 2020-11-16 2021-01-29 Oppo广东移动通信有限公司 Device cooperation method, device, system, electronic device and storage medium

Patent Citations (4)

Publication number Priority date Publication date Assignee Title
WO2020034227A1 (en) * 2018-08-17 2020-02-20 华为技术有限公司 Multimedia content synchronization method and electronic device
CN109660842A (en) * 2018-11-14 2019-04-19 华为技术有限公司 A kind of method and electronic equipment playing multi-medium data
CN111666119A (en) * 2019-03-06 2020-09-15 华为终端有限公司 UI component display method and electronic equipment
CN111367448A (en) * 2020-03-10 2020-07-03 北京达佳互联信息技术有限公司 Application function execution method and device, electronic equipment and storage medium

Cited By (16)

Publication number Priority date Publication date Assignee Title
WO2022100239A1 (en) * 2020-11-16 2022-05-19 Oppo广东移动通信有限公司 Device cooperation method, apparatus and system, electronic device and storage medium
CN113827953A (en) * 2021-09-28 2021-12-24 完美世界(北京)软件科技发展有限公司 Game control system
CN113827953B (en) * 2021-09-28 2024-03-22 完美世界(北京)软件科技发展有限公司 Game control system
CN114422566A (en) * 2021-12-29 2022-04-29 Oppo广东移动通信有限公司 Multi-device connection method, device, system, device and storage medium
CN114302501A (en) * 2021-12-31 2022-04-08 联想(北京)有限公司 Method and device for establishing connection and electronic system
CN114491684A (en) * 2022-01-07 2022-05-13 广州三七极耀网络科技有限公司 Terminal device collaborative display method, system, terminal device and medium
CN114491684B (en) * 2022-01-07 2022-09-06 广州三七极耀网络科技有限公司 Terminal device collaborative display method, system, terminal device and medium
CN114489540A (en) * 2022-01-12 2022-05-13 广州三七极耀网络科技有限公司 Method, system, device and medium for cooperatively displaying game pictures
WO2023185817A1 (en) * 2022-03-28 2023-10-05 维沃移动通信有限公司 Multi-device cooperation method and apparatus, and electronic device and medium
CN115002937B (en) * 2022-07-18 2022-12-23 荣耀终端有限公司 Multi-device cooperation method, electronic device and related product
CN115002937A (en) * 2022-07-18 2022-09-02 荣耀终端有限公司 Multi-device cooperation method, electronic device and related product
WO2024027238A1 (en) * 2022-08-05 2024-02-08 荣耀终端有限公司 Multi-device cooperation method, electronic device and related product
WO2024066992A1 (en) * 2022-09-30 2024-04-04 华为技术有限公司 Multi-device networking system and method, and terminal devices
CN116680020A (en) * 2022-11-22 2023-09-01 荣耀终端有限公司 Multi-device collaborative management method, electronic device and storage medium
CN115617498A (en) * 2022-12-15 2023-01-17 安徽淘云科技股份有限公司 Application optimization method and device, electronic equipment and storage medium
CN115617498B (en) * 2022-12-15 2023-08-22 安徽淘云科技股份有限公司 Application optimization method, device, electronic equipment and storage medium

Also Published As

Publication number Publication date
WO2022100239A1 (en) 2022-05-19

Similar Documents

Publication Publication Date Title
WO2022100239A1 (en) Device cooperation method, apparatus and system, electronic device and storage medium
CN112286477B (en) Screen projection display method and related product
CN110839096B (en) Touch method of equipment with folding screen and folding screen equipment
CN110597473A (en) Screen projection method and electronic equipment
WO2021147406A1 (en) Audio output method and terminal device
CN111221845A (en) Cross-device information searching method and terminal device
CN117008777A (en) Cross-equipment content sharing method, electronic equipment and system
CN112130788A (en) Content sharing method and device
CN111464987B (en) Method for displaying Bluetooth device identification and electronic device
CN112527174B (en) Information processing method and electronic equipment
CN112527222A (en) Information processing method and electronic equipment
WO2022028537A1 (en) Device recognition method and related apparatus
CN114065706A (en) Multi-device data cooperation method and electronic device
CN114442969A (en) Inter-device screen cooperation method and device
WO2022048453A1 (en) Unlocking method and electronic device
WO2021197354A1 (en) Device positioning method and relevant apparatus
CN115016697A (en) Screen projection method, computer device, readable storage medium, and program product
CN114356195B (en) File transmission method and related equipment
CN114510186A (en) Cross-device control method and device
WO2022152174A1 (en) Screen projection method and electronic device
CN112351411A (en) Information transmission method and electronic equipment
WO2022105793A1 (en) Image processing method and device
CN114079691B (en) Equipment identification method and related device
CN114520867B (en) Camera control method based on distributed control and terminal equipment
CN114567871A (en) File sharing method and device, electronic equipment and readable storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination