CN114065706A - Multi-device data cooperation method and electronic device - Google Patents


Info

Publication number
CN114065706A
Authority
CN
China
Prior art keywords
electronic device
data
document editing
application
interface
Prior art date
Legal status
Pending
Application number
CN202010784936.3A
Other languages
Chinese (zh)
Inventor
庾能国
马红欣
冯鹏
胡俊
Current Assignee
Huawei Technologies Co Ltd
Original Assignee
Huawei Technologies Co Ltd
Priority date
Filing date
Publication date
Application filed by Huawei Technologies Co Ltd
Priority application: CN202010784936.3A
Related PCT application: PCT/CN2021/110646 (published as WO2022028494A1)
Publication: CN114065706A
Legal status: Pending

Classifications

    • G06F 40/166 Text processing: Editing, e.g. inserting or deleting
    • G06F 9/451 Execution arrangements for user interfaces
    • G06F 9/54 Interprogram communication
    • G06F 9/542 Event management; Broadcasting; Multicasting; Notifications
    • G06F 9/546 Message passing systems or structures, e.g. queues
    • G10L 15/26 Speech to text systems

Abstract

An embodiment of this application provides a multi-device data collaboration method and an electronic device, applied to a system including a first electronic device and a second electronic device. In response to a first operation on a first document editing application of the second electronic device, the second electronic device displays a first document editing box of the first document editing application. The second electronic device sends a first notification message to the first electronic device, and the first electronic device displays a first prompt box, where the first prompt box includes at least one of the following: a first prompt message and a first control. The first prompt message prompts the user whether to start data collaboration with the second electronic device. In response to an operation on the first control, the first electronic device displays a data collaboration interface. The first electronic device receives the user's voice input and converts it into text; in response to a second user operation, the first electronic device sends first text data to the second electronic device, and the second electronic device displays the first text data in the first document editing box.

Description

Multi-device data cooperation method and electronic device
Technical Field
Embodiments of this application relate to the field of communication technologies, and in particular, to a multi-device data collaboration method and an electronic device.
Background
Office work is a primary use scenario of the personal computer (PC), and document creation is one of the main office scenarios. Today, a manuscript is typically created by a user who sits in front of the computer and, using a keyboard or a stylus, types it into a document editing application with a long-text editing function, such as Word or a txt editor.
At present, long-text editing on a personal computer is limited to this single approach: the user has to sit in front of the personal computer for a long time to input the text, so the creation scenario cannot change, and the user's need to vary the manuscript creation scenario cannot be met.
Disclosure of Invention
Embodiments of this application provide a multi-device data collaboration method and an electronic device for manuscript creation scenarios, which can meet the user's need to change scenarios while creating a manuscript.
In a first aspect, an embodiment of this application provides a multi-device data collaboration method applied to a second electronic device, where a first document editing application may run on the second electronic device. In response to a first operation on the first document editing application of the second electronic device, the second electronic device displays a first document editing box of the first document editing application; the second electronic device sends a first notification message to a first electronic device, where the first notification message is used to establish data collaboration between the first electronic device and the second electronic device; and the second electronic device receives first text data sent by the first electronic device and displays the first text data in the first document editing box.
With this method, the second electronic device sends the first notification message to the first electronic device in response to the first operation on its first document editing application, so that the two devices can establish data collaboration. The user does not need to open corresponding data collaboration interfaces on both devices and establish the collaboration scenario manually, which improves the user experience.
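For illustration, the following Kotlin sketch outlines this exchange from the second electronic device's point of view. It is a minimal sketch under stated assumptions: the message names (FirstNotification, FirstResponse, FirstTextData), the SecondDevice class, and the callback used as a transport channel are all illustrative and are not APIs defined in this application.

```kotlin
// Illustrative sketch of the handshake described above, seen from the second device (the PC).
sealed class CollabMessage {
    object FirstNotification : CollabMessage()                     // PC -> phone: request data collaboration
    object FirstResponse : CollabMessage()                         // phone -> PC: collaboration confirmed
    data class FirstTextData(val text: String) : CollabMessage()   // phone -> PC: recognized text
}

class SecondDevice(private val sendToFirstDevice: (CollabMessage) -> Unit) {
    private val firstEditBox = StringBuilder()

    // Called when the "first operation" (e.g. opening the first document editing application) is detected.
    fun onFirstOperation() {
        println("Display the first document editing box")
        sendToFirstDevice(CollabMessage.FirstNotification)
    }

    fun onMessageFromFirstDevice(msg: CollabMessage) {
        when (msg) {
            is CollabMessage.FirstResponse -> println("Data collaboration established")
            is CollabMessage.FirstTextData -> {
                firstEditBox.append(msg.text)                      // display the text in the editing box
                println("Editing box content: $firstEditBox")
            }
            else -> Unit
        }
    }
}

fun main() {
    val pc = SecondDevice { msg -> println("PC -> phone: $msg") }
    pc.onFirstOperation()
    pc.onMessageFromFirstDevice(CollabMessage.FirstResponse)
    pc.onMessageFromFirstDevice(CollabMessage.FirstTextData("dictated sentence"))
}
```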
In a possible implementation, considering that the user may want to customize which applications trigger the data collaboration scenario, and to avoid triggering the scenario falsely when data collaboration is not wanted, a document editing whitelist may be configured, so that the first document editing application is an application in the whitelist configured by the user. This improves the data collaboration experience.
In a possible implementation, the first operation includes at least one of the following: opening the first document editing application, clicking a document editing box in the first document editing application, switching to the first document editing application, or switching to a document editing box in the first document editing application.
With this method, the second electronic device can determine, from the user's operation in the first document editing application, that the user intends to enter a document editing scenario, and so trigger the establishment of data collaboration with the first electronic device. This reduces the manual steps otherwise needed to trigger the collaboration scenario; in addition, for a user who has never used the data collaboration scenario, collaboration with the first electronic device can be triggered without extra effort, which helps the user establish the scenario successfully and improves the data collaboration experience.
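As one possible illustration of the whitelist check and the operation types above, the Kotlin sketch below shows a hypothetical trigger component; the EditOperation names and the CollaborationTrigger class are assumptions, not part of this application.

```kotlin
// Hypothetical trigger logic on the second electronic device (assumed names).
enum class EditOperation { OPEN_APP, CLICK_EDIT_BOX, SWITCH_APP, SWITCH_EDIT_BOX }

class CollaborationTrigger(
    private val documentWhitelist: Set<String>,        // document editing whitelist configured by the user
    private val notifyFirstDevice: () -> Unit          // sends the first notification message
) {
    fun onUserOperation(appName: String, op: EditOperation) {
        // Only whitelisted document editing applications trigger data collaboration,
        // which avoids false triggers when collaboration is not wanted.
        if (appName !in documentWhitelist) return
        when (op) {
            EditOperation.OPEN_APP,
            EditOperation.CLICK_EDIT_BOX,
            EditOperation.SWITCH_APP,
            EditOperation.SWITCH_EDIT_BOX -> notifyFirstDevice()
        }
    }
}
```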
In a possible implementation, before the second electronic device sends the first notification message to the first electronic device, a first device-search interface may be displayed on the second electronic device. The first device-search interface displays at least one electronic device that has a communication connection with the second electronic device. In response to a confirmation operation on the first electronic device among the at least one electronic device on the first device-search interface, the second electronic device determines the first electronic device to which the first notification message is sent.
With this method, considering that the second electronic device may have communication connections with more than one electronic device, the second electronic device can present the candidate devices to the user on the first device-search interface and determine the first electronic device selected by the user in response to a confirmation operation on that device. Data collaboration between the second electronic device and the first electronic device is thus established according to the user's needs, improving the flexibility and adaptability of data collaboration.
In a possible implementation, a communication connection request is received from the first electronic device, where the communication connection request is implemented through a near field communication technology; a communication connection with the first electronic device is established, and a first connection message is displayed on the second electronic device, where the first connection message prompts the user that the second electronic device has established a communication connection with the first electronic device.
Because the communication connection between the first electronic device and the second electronic device is implemented through near field communication, the two devices are not required to be in the same local area network to establish the connection. This reduces the difficulty of connecting different electronic devices, improves the compatibility of establishing data collaboration between them, and increases the diversity and flexibility of data collaboration scenarios.
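A minimal sketch of this connection step is shown below, with the near field communication transport abstracted into a plain callback; real code would rely on the platform's NFC APIs, which are neither shown nor named here.

```kotlin
// Assumed class name; the NFC transport is reduced to a callback for illustration.
class SecondDeviceConnection {
    var connected = false
        private set

    // Invoked when a communication connection request arrives from the phone over NFC.
    fun onConnectionRequest(acknowledge: () -> Unit) {
        connected = true
        acknowledge()
        // The first connection message tells the user that the two devices are now connected.
        println("First connection message: the PC has established a connection with the phone")
    }
}
```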
In a possible implementation manner, after sending a first notification message to a first electronic device, a second electronic device may further receive a first response message sent by the first electronic device, where the first response message is used to confirm that the first electronic device establishes data collaboration with the second electronic device; thus, the second electronic device may display the first response message on the second electronic device.
Through the method, the first electronic device may send the first response message to the second electronic device, so that the second electronic device determines that the first electronic device and the second electronic device have established data collaboration. Further, the second electronic device displays the first response message, so that the user can determine that the second electronic device and the first electronic device have successfully established data cooperation, and user experience is improved.
In a possible implementation, the second electronic device includes a plurality of document editing applications. The second electronic device sends a second notification message to the first electronic device, where the second notification message notifies the first electronic device that the second electronic device includes the plurality of document editing applications; the second electronic device receives a switching message sent by the first electronic device, where the switching message instructs the second electronic device to switch its focus from the first document editing application to a second document editing application; the second electronic device switches the focus to a second document editing box of the second document editing application; and the second electronic device receives third text data sent by the first electronic device and displays the third text data in the second document editing box.
Considering that a plurality of document editing applications may be open on the second electronic device and the user needs to perform data collaboration in the editing boxes of several of them, the user may, while collaborating with the first document editing box of the first document editing application, have a switching message sent to the second electronic device, so that the second electronic device switches the focus to the second document editing box of the second document editing application and third text data can be entered there. The data collaboration function thus adapts to scenarios with multiple document editing applications, improving its applicability.
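The following Kotlin sketch illustrates one way the second electronic device could route incoming text to the currently focused editing box after a switching message; the SwitchMessage and TextData types and the DocumentFocusManager class are assumptions for illustration only.

```kotlin
// Assumed message and manager names; not APIs defined in this application.
data class SwitchMessage(val targetApp: String)
data class TextData(val text: String)

class DocumentFocusManager(openApps: List<String>) {
    private val editBoxes = openApps.associateWith { StringBuilder() }
    private var focusedApp = openApps.first()

    fun onSwitchMessage(msg: SwitchMessage) {
        require(msg.targetApp in editBoxes) { "unknown document editing application" }
        focusedApp = msg.targetApp                        // move focus to the requested editor
    }

    fun onTextData(msg: TextData) {
        editBoxes.getValue(focusedApp).append(msg.text)   // third text data lands in the focused edit box
    }
}
```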
Considering that, during data collaboration, the second electronic device and/or the first electronic device may interrupt the collaboration scenario because of other events, in this application, when it is determined that the collaboration scenario is to be resumed, the second electronic device may resume data collaboration in a third document editing application, where the third document editing application is the document editing application that the first electronic device and the second electronic device were using when the collaboration was interrupted. This prevents the second text data entered by the user from being lost because of the interruption, and effectively improves the user experience.
In one possible implementation, when a first condition is met, displaying a third document editing application on the second electronic device; receiving, by a second electronic device, second text data of the first electronic device, and displaying the second text data on a third document editing box of the third document editing application; the second text data is text data generated by the first electronic equipment when the data cooperation is interrupted. Wherein the first condition is: in response to a first switching operation on an application, the first switching operation being to switch focus of the second electronic device to the third document editing application; or receiving a data cooperation recovery message of the first electronic device, where the data cooperation recovery message is used to request recovery of data cooperation between the first electronic device and the second electronic device.
With this method, the data collaboration recovery scenario can be triggered either by the user performing a first switching operation on the second electronic device, or by a data collaboration recovery message that the first electronic device sends to the second electronic device, so that the second electronic device displays the third document editing application and receives the second text data sent by the first electronic device.
In response to a second switching operation on the document editing application, a switched interface is displayed on the second electronic device, where the second switching operation switches the focus of the second electronic device out of the third document editing application. The second electronic device sends a second notification message to the first electronic device, where the second notification message notifies the first electronic device that data collaboration with the second electronic device is interrupted.
With this method, the second electronic device can determine, in response to the second switching operation on the document editing application, that data collaboration between the first electronic device and the second electronic device is interrupted. The second electronic device is thereby triggered to send the second notification message to the first electronic device, so that the first electronic device can save the second text data generated when the collaboration is interrupted, avoiding data loss caused by the interruption.
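A simple sketch of this interruption path on the second electronic device is given below; the message strings stand in for the second and third notification messages and are purely illustrative.

```kotlin
// Placeholder message strings; the real notification formats are not specified here.
class CollaborationSession(private val sendToFirstDevice: (String) -> Unit) {
    var interrupted = false
        private set

    // Focus moved out of the document editing application (second switching operation).
    fun onFocusLeftEditingApp() {
        interrupted = true
        sendToFirstDevice("SECOND_NOTIFICATION: data collaboration interrupted")
    }

    // Focus returned to the document editing application; tell the phone to resume.
    fun onFocusReturnedToEditingApp() {
        interrupted = false
        sendToFirstDevice("THIRD_NOTIFICATION: resume data collaboration")
    }
}
```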
In a possible implementation manner, the second electronic device receives an interrupt notification message of the first electronic device, where the interrupt notification message is used to notify the second electronic device of an interrupt in data cooperation with the first electronic device.
By the method, the second electronic device determines that the first electronic device interrupts the data cooperation through the interrupt notification message, so that the second electronic device is prevented from being unaware of the interruption of the data cooperation, and the use experience of a user in the data cooperation process is improved.
In a possible implementation manner, before the second electronic device receives the second text data of the first electronic device, a third notification message may be sent to the first electronic device; the third notification message is used for notifying the first electronic device to restore data collaboration with the second electronic device.
With this method, when the second electronic device determines that the data collaboration scenario is to be resumed, it can send the third notification message to the first electronic device to notify it that the collaboration scenario is resumed. This improves the efficiency of resuming the collaboration scenario, allows the first electronic device to send the second text data to the second electronic device promptly, reduces latency, and improves the user experience.
In a possible implementation, before receiving the second text data of the first electronic device, the second electronic device may also send a third request message to the first electronic device, where the third request message requests resumption of data collaboration between the first electronic device and the second electronic device; and the second electronic device receives a data collaboration recovery message sent by the first electronic device, where the data collaboration recovery message confirms resumption of data collaboration between the first electronic device and the second electronic device.
With this method, by sending the third request message to the first electronic device, the second electronic device lets the user confirm on the first electronic device whether to resume the data collaboration scenario. After receiving the data collaboration recovery message sent by the first electronic device, the second electronic device determines that the collaboration scenario is resumed, and is thereby triggered to display the third document editing box and receive the second text data sent by the first electronic device. Having the user confirm the resumption improves the accuracy of the moment at which the second electronic device resumes data collaboration, and improves the smoothness of the user's experience across the second electronic device and the first electronic device.
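The confirmation-based variant of resuming collaboration could look like the following sketch on the second electronic device's side, where the strings are placeholders for the third request message and the data collaboration recovery message.

```kotlin
// Placeholder strings stand in for the third request / recovery messages.
class ResumeHandshake(private val sendToFirstDevice: (String) -> Unit) {
    private var awaitingConfirmation = false

    // Ask the phone (and thus the user) to confirm resuming data collaboration.
    fun requestResume() {
        awaitingConfirmation = true
        sendToFirstDevice("THIRD_REQUEST: confirm resuming data collaboration")
    }

    // Resume only after the phone's data collaboration recovery message arrives.
    fun onMessageFromFirstDevice(message: String) {
        if (awaitingConfirmation && message.startsWith("RECOVERY_CONFIRMED")) {
            awaitingConfirmation = false
            println("Display the third document editing box and wait for the cached text")
        }
    }
}
```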
In a second aspect, the present application provides a method for multi-device data collaboration, which is applied to a first electronic device, where the first electronic device receives a first notification message from a second electronic device, and displays a first prompt box; the first notification message is used for requesting the first electronic device to establish data cooperation with the second electronic device; the first prompt box includes at least one of: the first prompt message and the first control; displaying a data collaboration interface in response to a first user operation acting on the first control; collecting first voice data, converting the first voice data into first text data, and displaying the first text data on the data collaboration interface; and responding to a second user operation, and sending the first text data to the second electronic equipment.
The first user operation may be a confirmation operation of the user on the first control, and the second user operation may be a confirmation operation of the user on the data collaboration interface.
By the method, the first electronic device can receive the first notification message of the second electronic device, prompt whether the user wants to establish data cooperation or not through the first prompt box, and respond to the operation of the first control, the data cooperation interface is displayed on the first electronic device, so that the first voice data input by the user is collected, converted into the first text data and sent to the second electronic device, and the data cooperation between the first electronic device and the second electronic device is realized. In the process, the process of establishing the data cooperation between the first electronic device and the second electronic device is simple, the operation difficulty of establishing the data cooperation is reduced by the mode of sending the first notification message by the second electronic device, and the convenience and the applicability of creating the manuscript by a user in various scenes are improved.
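For illustration, the Kotlin sketch below walks through the phone-side flow described in this aspect: prompt, confirm, dictate, and send. The speechToText function is a placeholder for any speech recognition engine; the class and callback names are assumptions rather than APIs of this application.

```kotlin
// Assumed names; speechToText stands in for an arbitrary speech recognition engine.
class FirstDeviceFlow(
    private val speechToText: (ByteArray) -> String,
    private val sendToSecondDevice: (String) -> Unit
) {
    private var collaborationInterfaceShown = false
    private var pendingText = ""

    fun onFirstNotification() {
        println("First prompt box: start data collaboration with the second electronic device?")
    }

    fun onUserConfirmsPrompt() {                // first user operation on the first control
        collaborationInterfaceShown = true
    }

    fun onVoiceCaptured(audio: ByteArray) {
        if (!collaborationInterfaceShown) return
        pendingText = speechToText(audio)       // convert the first voice data into text
        println("Data collaboration interface shows: $pendingText")
    }

    fun onUserConfirmsSend() {                  // second user operation
        sendToSecondDevice(pendingText)         // first text data goes to the second electronic device
        pendingText = ""
    }
}
```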
In one possible implementation, the data collaboration interface further includes: editing a control; recognizing the first voice data as initial text data, and displaying the initial text data on the data collaboration interface; in response to a third user operation acting on the editing control, determining an editing instruction for the initial text data, and displaying the edited initial text data on the data collaboration interface; and in response to a confirmation operation acting on the data collaboration interface, taking the edited initial text data as the first text data.
With this method, the recognized initial text data can be edited through the editing control on the data collaboration interface, which improves the operability of entering text by voice. The editing control can also be used to edit the format of the initial text data, generating an editing instruction for the initial text data and displaying the text in its edited state. This strengthens the editing capability for text data and makes it more convenient to enter text on the second electronic device through data collaboration.
In one possible implementation, the first text data includes: initial text data and an editing instruction for the initial text data.
With this method, after the second electronic device receives the first text data, the editing instruction for the initial text data allows the second electronic device to display the initial text data in its edited state, improving the applicability of data collaboration.
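One possible shape of such first text data is sketched below: the recognized initial text travels together with a list of editing instructions that the second electronic device can re-apply. The instruction set shown (bold, delete) is an assumption for illustration; the application does not fix a concrete format.

```kotlin
// Assumed instruction set (bold, delete) for illustration only.
sealed class EditInstruction {
    data class Bold(val start: Int, val end: Int) : EditInstruction()
    data class Delete(val start: Int, val end: Int) : EditInstruction()
}

data class FirstTextData(
    val initialText: String,
    val instructions: List<EditInstruction> = emptyList()
)

// On the second electronic device: re-apply the edits so the text appears in its edited state.
fun applyEdits(data: FirstTextData): String {
    var text = data.initialText
    for (instruction in data.instructions) {
        text = when (instruction) {
            is EditInstruction.Bold -> text     // formatting is left to the document editor itself
            is EditInstruction.Delete -> text.removeRange(instruction.start, instruction.end)
        }
    }
    return text
}
```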
In a possible implementation manner, the first electronic device may further receive a first confirmation message sent by the second electronic device, where the first confirmation message is used to indicate that the second electronic device successfully receives the first text data; accordingly, the first electronic device may delete the first text data displayed on the data collaboration interface.
By the method, the first electronic device can delete the first text data displayed on the data cooperation interface after confirming that the second electronic device successfully receives the first text data, operability of inputting voice data and editing the text data displayed on the data cooperation interface can be improved for a user, memory occupation can be reduced for the first electronic device, and performance is improved.
In one possible implementation, the first electronic device further includes a short-range wireless communication apparatus; before the first electronic device receives the first notification message from the second electronic device, the first electronic device may further send a communication connection request to the second electronic device through near field communication when determining that the distance to the second electronic device satisfies a first distance threshold; the first electronic equipment establishes communication connection with the second electronic equipment and displays a first connection message; the first connection message is used for prompting a user that the second electronic equipment establishes communication connection with the first electronic equipment.
With this method, when the distance between the short-range wireless communication apparatus of the first electronic device and that of the second electronic device satisfies the first distance threshold, the two devices can begin establishing a communication connection. They are not required to be in the same local area network, which improves the convenience and the range of scenarios in which the user can use data collaboration. The first distance threshold may be set according to the short-range wireless communication apparatus, or determined in other ways; this is not limited here.
In one possible implementation manner, the first electronic device may further send a first response message to the second electronic device in response to a first user operation acting on the first control; the first response message is used for confirming that data cooperation between the first electronic device and the second electronic device is established.
With this method, the user confirms the establishment of data collaboration by operating the first control, and the first response message is sent to the second electronic device, so that the second electronic device determines that data collaboration between the first electronic device and the second electronic device has been established.
In one possible implementation, the second electronic device includes a plurality of document editing applications thereon; the first electronic device may also receive an application notification message of the second electronic device, and display controls of the plurality of document editing applications on the data collaboration interface; the application notification message is to indicate that the second electronic device includes the plurality of document editing applications; sending a switching message to the second electronic device in response to a switching operation acting on controls of the plurality of document editing applications; the switching operation is used for indicating the focus of the second electronic equipment to be switched from the first document editing application to the second document editing application; the switching message is used for instructing the second electronic device to switch the focus of the second electronic device to a second document editing frame of a second document editing application; collecting second voice data, converting the second voice data into third text data, and displaying the third text data on the data cooperation interface; and responding to a confirmation operation acted on the data cooperation interface, and sending the third text data to the second electronic equipment.
Considering that a plurality of document editing applications may be opened on the second electronic device, at this time, the first electronic device may display controls of the plurality of document editing applications on the data collaboration interface through the received application notification message, so as to implement switching between the plurality of document editing applications on the first electronic device by the user in response to an operation on the controls of the plurality of document editing applications, for example, switching from the first document editing application to the second document editing application, and by sending a switching message to the second electronic device, causing the second electronic device to switch the focus to a second document editing frame of the second document editing application, thereby implementing a data collaboration scenario in which third text data is input in the second document editing frame. The data cooperation function can be suitable for scenes of multiple document editing applications, and the applicability of data cooperation is improved.
In a possible implementation, when a second condition is met, the first electronic device caches second text data, where the second text data is the text data displayed on the data collaboration interface when data collaboration between the first electronic device and the second electronic device is interrupted, and the second condition indicates that the data collaboration is interrupted. When a third condition is met, the first electronic device displays the data collaboration interface and displays the second text data on it, where the third condition indicates that data collaboration between the first electronic device and the second electronic device is resumed; and the first electronic device sends the second text data to the second electronic device.
With this method, when the second condition is met, an interruption of the collaboration scenario is determined and the first electronic device caches the text data displayed on the data collaboration interface, so the text entered by the user is not lost because of the interruption. When data collaboration is resumed, that is, when the third condition is met, the first electronic device sends the second text data to the second electronic device. This prevents the second text data entered by the user from being lost, or from failing to reach the second electronic device, because of the interruption, and effectively improves the user experience.
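A minimal sketch of the phone-side cache across an interruption is shown below; the class and callback names are illustrative, and the second and third conditions are reduced to two method calls.

```kotlin
// Assumed names; the transport to the second device is a plain callback.
class InterruptionCache(private val sendToSecondDevice: (String) -> Unit) {
    private var cachedText: String? = null
    private var collaborating = true

    fun onInterrupted(textOnScreen: String) {      // second condition met
        collaborating = false
        cachedText = textOnScreen                  // keep what the user already dictated
    }

    fun onResumed() {                              // third condition met
        collaborating = true
        cachedText?.let { text ->
            println("Redisplay cached text: $text")
            sendToSecondDevice(text)               // deliver the second text data
        }
        cachedText = null
    }
}
```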
In a possible implementation, the second condition is: the first electronic device receives a second notification message sent by the second electronic device, where the second notification message notifies the first electronic device that data collaboration with the second electronic device is interrupted; or the first electronic device, in response to an application switching operation on the first electronic device, displays the interface of the application switched to, where the application switching operation starts the switched-to application on the first electronic device.
By the method, in consideration of a scenario that the second electronic device triggers the interruption of the data cooperation, the first electronic device may determine that the data cooperation is interrupted through the second notification message sent to the first electronic device. In addition, the first electronic device may determine that the data cooperation is interrupted in response to a switching application operation of the first electronic device.
In a possible implementation manner, after responding to an application switching operation of the first electronic device, the first electronic device may further send an interrupt notification message to the second electronic device; the interruption notification message is used for notifying the second electronic device of data cooperation interruption with the first electronic device.
Considering a scenario in which the first electronic device interrupts data cooperation, the first electronic device may transmit an interrupt notification message to the second electronic device, thereby causing the second electronic device to determine that data cooperation has been interrupted.
In a possible implementation, the third condition is a first trigger condition, where the first trigger condition includes at least one of the following: in response to detecting a gesture of the first electronic device, determining that the gesture satisfies a first gesture condition, where the first gesture condition includes the first electronic device moving into a range in which its distance to the user is greater than a first threshold and less than a second threshold, and the gesture includes being lifted upwards or moved downwards; or the first electronic device is not in a call state; or the camera module of the first electronic device is not in an open state; or a third notification message sent by the second electronic device is received, where the third notification message notifies the first electronic device to resume data collaboration with the second electronic device; or a third request message sent by the second electronic device is received, where the third request message requests the first electronic device to resume data collaboration with the second electronic device; the third request message is displayed in a second prompt box on the first electronic device, and resumption of data collaboration between the first electronic device and the second electronic device is confirmed in response to an operation on the second prompt box.
With this method, resumption can be determined in different ways: when the first electronic device meets the first trigger condition, it is determined that the first electronic device wants to resume the collaboration scenario; or, when the second electronic device wants to resume the scenario, it sends a third request message to the first electronic device; or, after the second electronic device confirms the resumption, it sends a third notification message to the first electronic device. The method thereby adapts to different resumption scenarios and improves the usability of data collaboration.
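The sketch below evaluates the third condition as literally phrased, that is, as an "at least one of" check over the listed alternatives; in a real product several of these signals would likely be combined, and all names here are assumptions.

```kotlin
// Assumed inputs; how gesture, call state and camera state are obtained is outside this sketch.
data class DeviceState(
    val distanceToUserMeters: Double?,   // null when the device's posture/distance is unknown
    val inCall: Boolean,
    val cameraOpen: Boolean
)

fun shouldResumeCollaboration(
    state: DeviceState,
    firstThreshold: Double,
    secondThreshold: Double,
    receivedThirdNotification: Boolean,
    userConfirmedThirdRequest: Boolean
): Boolean {
    val distance = state.distanceToUserMeters
    val gestureOk = distance != null && distance > firstThreshold && distance < secondThreshold
    // The description lists these as alternatives ("at least one of"), so they are OR-ed here;
    // a practical implementation would likely require several of them together.
    return gestureOk || !state.inCall || !state.cameraOpen ||
        receivedThirdNotification || userConfirmedThirdRequest
}
```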
In a possible implementation, after confirming, in response to the operation on the second prompt box, that data collaboration between the first electronic device and the second electronic device is resumed, the first electronic device may further send a data collaboration recovery message to the second electronic device, where the data collaboration recovery message confirms resumption of data collaboration between the first electronic device and the second electronic device.
By the method, the first electronic device can send the data cooperation recovery message to the second electronic device to inform the second electronic device of recovering the data cooperation, so that false triggering of a data cooperation scene when the data cooperation is not recovered is avoided, and the use experience of a user on the data cooperation is improved.
In a third aspect, the present application provides a method for multi-device data collaboration, which is applied to a system including a first electronic device and a second electronic device, where the second electronic device may run a first document editing application. The second electronic device responds to a first operation on a first document editing application of the second electronic device, and displays a first document editing frame of the first document editing application on the second electronic device; the method comprises the steps that a second electronic device sends a first notification message to a first electronic device, wherein the first notification message is used for establishing data cooperation between the first electronic device and the second electronic device; the first electronic equipment receives a first notification message from the second electronic equipment and displays a first prompt box; the first prompt box includes at least one of: the first prompt message and the first control; the first prompt message is used for prompting a user whether to start data cooperation with the second electronic equipment; displaying the data collaboration interface in response to a first user operation acting on the first control; collecting first voice data, converting the first voice data into first text data, and displaying the first text data on the data collaboration interface; and responding to a second user operation, and sending the first text data to the second electronic equipment. And the second electronic equipment receives the first text data sent by the first electronic equipment and displays the first text data on the first document editing box. The first user operation may be a confirmation operation of the user on the first control, and the second user operation may be a confirmation operation of the user on the data collaboration interface.
In one possible implementation, the first document editing application is an application in a user-configured document editing whitelist.
A possible implementation manner, the first operation includes at least one of: opening the operation of the first document editing application, clicking the operation of the document editing box in the first document editing application, switching the operation of the first document editing application, and switching the operation of the document editing box in the first document editing application.
In one possible implementation, the data collaboration interface further includes: editing a control; recognizing the first voice data as initial text data, and displaying the initial text data on the data collaboration interface; in response to a third user operation acting on the editing control, determining an editing instruction for the initial text data, and displaying the edited initial text data on the data collaboration interface; and in response to a confirmation operation acting on the data cooperation interface, taking the edited initial text data as the first text data.
In one possible implementation, the first text data includes: initial text data and an editing instruction for the initial text data.
In a possible implementation manner, the first electronic device may further receive a first confirmation message sent by the second electronic device, where the first confirmation message is used to indicate that the second electronic device successfully receives the first text data; accordingly, the first electronic device may delete the first text data displayed on the data collaboration interface.
In a possible implementation manner, before the second electronic device sends the first notification message to the first electronic device, a first search device interface may be further displayed; the first searching equipment interface is used for displaying at least one piece of electronic equipment which is in communication connection with the second electronic equipment; and in response to a confirmation operation of a first electronic device of the at least one electronic device acting on the first search device interface, determining the first electronic device which sends the first notification message.
In one possible implementation, the first electronic device further includes a short-range wireless communication apparatus; when the first electronic device determines that the distance between the first electronic device and the second electronic device meets a first distance threshold value, sending a communication connection request to the second electronic device through near field communication; the second electronic equipment establishes communication connection with the first electronic equipment, and displays a first connection message on the second electronic equipment or the first electronic equipment; the first connection message is used for prompting a user that the second electronic equipment establishes communication connection with the first electronic equipment.
In one possible implementation manner, after responding to the first user operation acting on the first control, the first electronic device may further send a first response message to the second electronic device; the first response message is used for confirming that data cooperation between the first electronic device and the second electronic device is established. After the second electronic device sends the first notification message to the first electronic device, the second electronic device may further receive a first response message sent by the first electronic device, where the first response message is used to confirm that the first electronic device and the second electronic device establish data cooperation; thus, the second electronic device may display the first response message.
In a possible implementation, the second electronic device includes a plurality of document editing applications, and sends a second notification message to the first electronic device, where the second notification message notifies the first electronic device that the second electronic device includes the plurality of document editing applications. The first electronic device may also receive an application notification message from the second electronic device and display controls of the plurality of document editing applications on the data collaboration interface, where the application notification message indicates that the second electronic device includes the plurality of document editing applications. In response to a switching operation on the controls of the plurality of document editing applications, a switching message is sent to the second electronic device, where the switching operation indicates switching the focus of the second electronic device from the first document editing application to a second document editing application. The second electronic device receives the switching message sent by the first electronic device, where the switching message instructs the second electronic device to switch its focus from the first document editing application to the second document editing application, and switches the focus to a second document editing box of the second document editing application. The first electronic device collects second voice data, converts the second voice data into third text data, and displays the third text data on the data collaboration interface; in response to a second user operation, the first electronic device sends the third text data to the second electronic device; and the second electronic device receives the third text data and displays it in the second document editing box.
In a possible implementation, when a second condition is met, the first electronic device caches second text data, where the second text data is the text data displayed on the data collaboration interface when data collaboration between the first electronic device and the second electronic device is interrupted, and the second condition indicates that the data collaboration is interrupted. When a third condition is met, the first electronic device displays the data collaboration interface and displays the second text data on it, where the third condition indicates that data collaboration between the first electronic device and the second electronic device is resumed; and the first electronic device sends the second text data to the second electronic device. When the second electronic device determines that a first condition is met, it displays a third document editing application, receives the second text data of the first electronic device, and displays the second text data in a third document editing box of the third document editing application, where the second text data is the text data generated by the first electronic device when the collaboration was interrupted. The first condition is: the second electronic device responds to a first switching operation on an application, where the first switching operation switches the focus of the second electronic device to the third document editing application; or the second electronic device receives a data collaboration recovery message of the first electronic device, where the data collaboration recovery message requests resumption of data collaboration between the first electronic device and the second electronic device.
In a possible implementation, the second condition is: the first electronic device receives a second notification message sent by the second electronic device, where the second notification message notifies the first electronic device that data collaboration with the second electronic device is interrupted; or the first electronic device, in response to an application switching operation on the first electronic device, displays the interface of the application switched to, where the application switching operation starts the switched-to application on the first electronic device.
In a possible implementation, the second electronic device, in response to a second switching operation on the document editing application, displays a switched interface, where the second switching operation switches the focus of the second electronic device out of the third document editing application; and the second electronic device sends a second notification message to the first electronic device, where the second notification message notifies the first electronic device that data collaboration with the second electronic device is interrupted.
In a possible implementation, the third condition is a first trigger condition, where the first trigger condition includes at least one of the following: in response to detecting a gesture of the first electronic device, determining that the gesture satisfies a first gesture condition, where the first gesture condition includes the first electronic device moving into a range in which its distance to the user is greater than a first threshold and less than a second threshold, and the gesture includes being lifted upwards or moved downwards; or the first electronic device is not in a call state; or the camera module of the first electronic device is not in an open state; or a third notification message sent by the second electronic device is received, where the third notification message notifies the first electronic device to resume data collaboration with the second electronic device; or a third request message sent by the second electronic device is received, where the third request message requests the first electronic device to resume data collaboration with the second electronic device; the third request message is displayed in a second prompt box on the first electronic device, and resumption of data collaboration between the first electronic device and the second electronic device is confirmed in response to an operation on the second prompt box.
In a possible implementation manner, a first electronic device sends an interrupt notification message to a second electronic device, and the second electronic device receives the interrupt notification message of the first electronic device, where the interrupt notification message is used to notify the second electronic device of data cooperation interrupt with the first electronic device.
In a possible implementation manner, before the second electronic device receives the second text data of the first electronic device, a third notification message may be sent to the first electronic device; the third notification message is used for notifying the first electronic device to restore data collaboration with the second electronic device.
In a possible implementation, before the second electronic device receives the second text data of the first electronic device, it may also send a third request message to the first electronic device, where the third request message requests resumption of data collaboration between the first electronic device and the second electronic device; the first electronic device sends a data collaboration recovery message to the second electronic device; and the second electronic device receives the data collaboration recovery message, where the data collaboration recovery message confirms resumption of data collaboration between the first electronic device and the second electronic device.
In a fourth aspect, the present application provides an electronic device comprising: a display screen; one or more processors; one or more memories; wherein the one or more memories store one or more computer programs, the one or more computer programs comprising instructions, which when executed by the one or more processors, cause the electronic device to perform any of the possible aspects of the first and second aspects.
In a fifth aspect, there is also provided an electronic device comprising means for performing the method of any one of the possible designs of the first or second aspect; these modules/units may be implemented by hardware, or by hardware executing corresponding software.
In a sixth aspect, an embodiment of the present application provides a chip, where the chip is coupled with a memory in an electronic device, and executes any one of the possible technical solutions of the first aspect or the second aspect of the embodiment of the present application; "coupled" in the context of this application means that two elements are joined to each other either directly or indirectly.
In a seventh aspect, an embodiment of the present application provides a system, where the system includes the electronic device of the fourth aspect or the electronic device of the fifth aspect.
In an eighth aspect, there is provided a computer program product comprising: computer program (also called code, or instructions), which when executed, causes a computer to perform the method of any of the possible implementations of the first or second aspect.
In a ninth aspect, a computer-readable storage medium is provided, which stores a computer program (which may also be referred to as code or instructions) that, when executed on a computer, causes the computer to perform the method of any one of the possible implementations of the first or second aspect.
In a tenth aspect, a graphical user interface on an electronic device is further provided, where the electronic device has a display screen, one or more memories, and one or more processors configured to execute one or more computer programs stored in the one or more memories, and the graphical user interface includes a graphical user interface displayed when the electronic device executes any one of the possible technical solutions of the first, second, or third aspects of the embodiments of the present application.
Drawings
Fig. 1 is a schematic view of an application scenario provided in an embodiment of the present application;
fig. 2 is a schematic structural diagram of an electronic device according to an embodiment of the present disclosure;
fig. 3 is a schematic structural diagram of an android operating system according to an embodiment of the present application;
fig. 4a is a schematic flowchart of a method for multi-device data collaboration according to an embodiment of the present application;
figs. 4b to 4g are schematic diagrams of interfaces involved in document editing on a second electronic device according to an embodiment of the present application;
figs. 5a to 5c are schematic diagrams of a data collaboration control interface of a second electronic device according to an embodiment of the present application;
fig. 5d is a schematic view of a data collaboration control interface of a second electronic device according to an embodiment of the present application;
figs. 6a to 6c are schematic diagrams of an interface of a first electronic device according to an embodiment of the present application;
figs. 7a to 7b are schematic diagrams of a data collaboration interface of a second electronic device according to an embodiment of the present application;
figs. 8a to 8b are schematic diagrams of a data collaboration interface of a first electronic device according to an embodiment of the present application;
FIG. 9a is a diagram of a rich text field provided by an embodiment of the present application;
fig. 9b is a schematic flowchart of parsing text data into a keyboard buffer according to an embodiment of the present application;
fig. 10a is a schematic flowchart of a method for multi-device data collaboration according to an embodiment of the present application;
fig. 10b is a schematic view of a data collaboration control interface of a first electronic device according to an embodiment of the present application;
fig. 10c is a schematic view of a data collaboration control interface of a second electronic device according to an embodiment of the present application;
fig. 11 is a schematic gesture diagram of a first electronic device according to an embodiment of the present application;
fig. 12a is a schematic flowchart of a method for multi-device data collaboration according to an embodiment of the present application;
fig. 12b is a schematic view of a data collaboration control interface of a second electronic device according to an embodiment of the present application;
fig. 12c is a schematic view of a data collaboration control interface of a first electronic device according to an embodiment of the present application.
Detailed Description
With the rapid development of society, mobile terminals such as mobile phones have become increasingly widespread. A mobile phone not only provides communication, but also has strong processing and storage capabilities, a photographing function, and so on. It therefore serves both as a communication tool and as the user's mobile file library, and can connect to other electronic devices to transmit the user's personal information, photos, videos, and other data. Based on the mobility and convenience of the mobile terminal, it can be used for manuscript creation and combined with the document editing function of the document editing software on a personal computer, so that it suits a variety of manuscript creation scenarios. On this basis, this application provides a multi-device data collaboration method.
The embodiments of the present application may be applied to electronic devices such as a mobile phone, a tablet computer, a wearable device (e.g., a watch, a bracelet, a helmet, an earphone, etc.), an in-vehicle device, an Augmented Reality (AR)/Virtual Reality (VR) device, a notebook computer, an ultra-mobile personal computer (UMPC), a netbook, a Personal Digital Assistant (PDA), a smart home device (e.g., a smart television, a smart speaker, a smart camera, etc.), and the like. It is understood that the embodiment of the present application does not set any limit to the specific type of the electronic device.
Referring to fig. 1, a simplified diagram of a system architecture according to an embodiment of the present application is provided. As shown in fig. 1, the system architecture may include: a first electronic device 101 and a second electronic device 102. The first electronic device 101 and the second electronic device 102 may access the same local area network or different local area networks. The example that the first electronic device 101 and the second electronic device 102 access the same local area network may specifically be: the first electronic device 101 and the second electronic device 102 establish a wireless connection with the same wireless access point. For example, the first electronic device 101 and the second electronic device 102 access the same wireless fidelity (WI-FI) hotspot, and for example, the first electronic device 101 and the second electronic device 102 may also access the same bluetooth beacon through a bluetooth protocol. For another example, the first electronic device 101 and the second electronic device 102 may also trigger a communication connection through a Near Field Communication (NFC) tag, and transmit encrypted information through a bluetooth module to perform identity authentication. After the authentication is successful, data transmission is performed in a point-to-point (P2P) manner.
The second electronic device 102 also supports a network communication protocol that allows the first electronic device 101 to establish a network connection with it. The second electronic device 102 may act as the target client of text collaboration, so that collaboration data is input on the first electronic device 101 and the text data is output on the second electronic device 102. In some embodiments, the second electronic device 102 may be a mobile phone, a tablet, or a computer, and the computer may be a desktop computer or a notebook computer.
The following describes embodiments of the first electronic device 101, a graphical user interface (hereinafter abbreviated as GUI) for the first electronic device 101, and embodiments for using the first electronic device 101. In some embodiments of the present application, the first electronic device 101 may be a portable electronic device, such as a mobile phone, a tablet computer, or a wearable device with wireless communication capabilities (e.g., a smart watch), that also includes other functionality, such as personal digital assistant and/or music player functionality. Exemplary embodiments of the portable electronic device include, but are not limited to, a portable electronic device equipped with
Figure BDA0002621586840000111
or another operating system. The portable electronic device described above may also be another portable electronic device, such as a laptop computer (Laptop) with a touch-sensitive surface (e.g., a touch panel). It should also be understood that, in some other embodiments of the present application, the first electronic device 101 may not be a portable electronic device, but may be a desktop computer having a touch-sensitive surface (e.g., a touch panel).
It should be understood that "at least one" in the embodiments of the present application means one or more, "a plurality" means two or more. "and/or" describes the association relationship of the associated objects, meaning that there may be three relationships, e.g., a and/or B, which may mean: a alone, both A and B, and B alone, where A, B may be singular or plural. The character "/" generally indicates that the former and latter associated objects are in an "or" relationship. "at least one of the following" or similar expressions refer to any combination of these items, including any combination of the singular or plural items. For example, at least one (one) of a, b, or c, may represent: a, b, c, a and b, a and c, b and c, or a, b and c, wherein a, b and c can be single or multiple.
Referring to fig. 2, an electronic device 200 may be the first electronic device 101 or the second electronic device 102 in the embodiment of the present application, and the electronic device 200 provided in the embodiment of the present application is described by taking the first electronic device 101 as the electronic device 200 in the embodiment of the present application as an example. It will be understood by those skilled in the art that the electronic device 200 shown in fig. 2 is merely an example and does not constitute a limitation of the electronic device 200, and that the electronic device 200 may have more or fewer components than those shown, may combine two or more components, or may have a different configuration of components. The various components shown in fig. 2 may be implemented in hardware, software, or a combination of hardware and software, including one or more signal processing and/or application specific integrated circuits.
The electronic device 200 may include a processor 210, an external memory interface 220, an internal memory 221, a USB interface 230, a charging management module 240, a power management module 241, a battery 242, an antenna 1, an antenna 2, a mobile communication module 251, a wireless communication module 252, an audio module 270, a speaker 270A, a receiver 270B, a microphone 270C, an earphone interface 270D, a sensor module 280, keys 290, a motor 291, an indicator 292, a camera 293, a display 294, a SIM card interface 295, and the like. The sensor module 280 may include a gyroscope sensor 280A, an acceleration sensor 280B, a proximity light sensor 280G, a fingerprint sensor 280H, and a touch sensor 280K (of course, the electronic device 200 may further include other sensors, such as a temperature sensor, a pressure sensor, a distance sensor, a magnetic sensor, an ambient light sensor, an air pressure sensor, a bone conduction sensor, and the like, which are not shown in the figure).
Processor 210 may include one or more processing units, such as: the processor 210 may include an Application Processor (AP), a modem processor, a Graphics Processing Unit (GPU), an Image Signal Processor (ISP), a controller, a memory, a video codec, a Digital Signal Processor (DSP), a baseband processor, and/or a Neural-Network Processing Unit (NPU), etc. The different processing units may be separate devices or may be integrated into one or more processors. The controller may be, among other things, a neural center and a command center of the electronic device 200. The controller can generate an operation control signal according to the instruction operation code and the timing signal to complete the control of instruction fetching and instruction execution.
A memory may also be provided in processor 210 for storing instructions and data. In some embodiments, the memory in the processor 210 is a cache memory. The memory may hold instructions or data that have just been used or recycled by processor 210. If the processor 210 needs to use the instruction or data again, it can be called directly from the memory. Avoiding repeated accesses reduces the latency of the processor 210, thereby increasing the efficiency of the system.
The processor 210 may operate the audio output method provided in the embodiment of the present application, so as to reduce the operation complexity of the user, improve the intelligent degree of the terminal device, and improve the user experience. The processor 210 may include different devices, for example, when the CPU and the GPU are integrated, the CPU and the GPU may cooperate to execute the audio output method provided in the embodiment of the present application, for example, part of the algorithm in the audio output method is executed by the CPU, and another part of the algorithm is executed by the GPU, so as to obtain faster processing efficiency.
The display screen 294 is used to display images, video, and the like. The display screen 294 includes a display panel. The display panel may adopt a liquid crystal display (LCD), an organic light-emitting diode (OLED), an active-matrix organic light-emitting diode (AMOLED), a flexible light-emitting diode (FLED), a Mini-LED, a Micro-LED, a Micro-OLED, a quantum dot light-emitting diode (QLED), or the like. In some embodiments, the electronic device 200 may include 1 or N display screens 294, N being a positive integer greater than 1. The display screen 294 may be used to display information input by or provided to the user as well as various graphical user interfaces (GUIs). For example, the display 294 may display a photograph, video, web page, or file, among others. As another example, the display 294 may display a graphical user interface of the first electronic device 101 as shown in fig. 1. The graphical user interface shown in fig. 1 includes, among other things, a status bar, a Dock bar, a concealable navigation bar, time and weather widgets, and icons of applications, such as a browser icon. The status bar includes the name of the operator (e.g., China Mobile), the mobile network (e.g., 4G), the time, and the remaining power. A back key icon, a home key icon, and a forward key icon may be included in the navigation bar. Further, it is understood that, in some embodiments, a Bluetooth icon, a Wi-Fi icon, an add-on icon, and the like may also be included in the status bar. It is also understood that, in other embodiments, a Dock bar may be included in the graphical user interface shown in fig. 1, and commonly used application icons may be included in the Dock bar. When the processor 210 detects a touch event of a finger (or a stylus, etc.) of a user with respect to an application icon, in response to the touch event, a user interface of the application corresponding to the application icon is opened and displayed on the display 294.
In the embodiment of the present application, the display screen 294 may be an integrated flexible display screen, or a spliced display screen formed by two rigid screens and a flexible screen located between the two rigid screens may be adopted. After the processor 210 runs the audio output method provided by the embodiment of the present application, the processor 210 may control an external audio output device to switch the output audio signal.
The cameras 293 (front camera or rear camera, or one camera may be used as both front camera and rear camera) are used for capturing still images or video. In general, the camera 293 may include a photosensitive element such as a lens group including a plurality of lenses (convex or concave lenses) for collecting an optical signal reflected by an object to be photographed and transferring the collected optical signal to an image sensor, and an image sensor. And the image sensor generates an original image of the object to be shot according to the optical signal.
Internal memory 221 may be used to store computer-executable program code, including instructions. The processor 210 executes various functional applications of the electronic device 200 and data processing by executing instructions stored in the internal memory 221. The internal memory 221 may include a program storage area and a data storage area. Wherein the storage program area may store an operating system, codes of application programs (such as a camera application, a WeChat application, etc.), and the like. The storage data area may store data created during use of the electronic device 200 (e.g., images, video, etc. captured by a camera application), and the like.
The internal memory 221 may also store one or more computer programs 1310 corresponding to the audio output algorithms provided by embodiments of the present application. The one or more computer programs 1310 are stored in the internal memory 221 and configured to be executed by the one or more processors 210. The one or more computer programs 1310 include instructions that can be used to perform the steps in the corresponding embodiments of, for example, fig. 4a, fig. 10a, and fig. 12a, and the computer programs 1310 may include an account verification module 2211, a priority comparison module 2212, and a state synchronization module 2213. The account verification module 2211 is used for authenticating system authentication accounts of other terminal devices in the local area network; the priority comparison module 2212 may be used for comparing the priority of the audio output request service with the priority of the current output service of the audio output device; and the state synchronization module 2213 may be configured to synchronize the device state of the audio output device currently accessed by the terminal device to another terminal device, or to synchronize the device state of the audio output device currently accessed by another device to the local device. When the code of the audio output algorithm stored in the internal memory 221 is executed by the processor 210, the processor 210 may control the audio output device to switch the output audio signal.
In addition, the internal memory 221 may include a high-speed random access memory, and may further include a nonvolatile memory, such as at least one magnetic disk storage device, a flash memory device, a universal flash memory (UFS), and the like.
Of course, the codes of the audio output algorithm provided by the embodiment of the present application may also be stored in the external memory. In this case, the processor 210 may execute the code of the audio output algorithm stored in the external memory through the external memory interface 220, and the processor 210 may control the audio output device to switch the output audio signal.
The function of the sensor module 280 is described below.
The gyro sensor 280A may be used to determine the motion pose of the electronic device 200. In some embodiments, the angular velocity of the electronic device 200 about three axes (i.e., x, y, and z axes) may be determined by the gyroscope sensor 280A. I.e., the gyro sensor 280A may be used to detect the current motion state of the electronic device 200, such as shaking or standing still.
When the display screen in the embodiment of the present application is a foldable screen, the gyro sensor 280A may be used to detect a folding or unfolding operation acting on the display screen 294. The gyro sensor 280A may report the detected folding operation or unfolding operation as an event to the processor 210 to determine the folded state or unfolded state of the display screen 294.
The acceleration sensor 280B may detect the magnitude of acceleration of the electronic device 200 in various directions (typically three axes); that is, the acceleration sensor 280B may also be used to detect the current motion state of the electronic device 200, such as shaking or standing still. When the display screen in the embodiment of the present application is a foldable screen, the acceleration sensor 280B may be used to detect a folding or unfolding operation acting on the display screen 294. The acceleration sensor 280B may report the detected folding operation or unfolding operation as an event to the processor 210 to determine the folded state or unfolded state of the display screen 294.
The proximity light sensor 280G may include, for example, a Light Emitting Diode (LED) and a light detector, such as a photodiode. The light emitting diode may be an infrared light emitting diode. The mobile phone emits infrared light outwards through the light emitting diode. The handset uses a photodiode to detect infrared reflected light from nearby objects. When sufficient reflected light is detected, it can be determined that there is an object near the handset. When insufficient reflected light is detected, the handset can determine that there are no objects near the handset. When the display screen in the embodiment of the present application is a foldable display screen, the proximity optical sensor 280G may be disposed on a first screen of the foldable display screen 294, and the proximity optical sensor 280G may detect a folding angle or an unfolding angle of the first screen and the second screen according to an optical path difference of the infrared signal.
The gyro sensor 280A (or the acceleration sensor 280B) may transmit the detected motion state information (such as an angular velocity) to the processor 210. The processor 210 determines whether the electronic device 200 is currently in the handheld state or the tripod state (for example, when the angular velocity is not 0, the electronic device 200 is in the handheld state) based on the motion state information.
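As an illustrative, non-limiting sketch of how the angular velocity reported by the gyro sensor could be turned into such a coarse motion state, the following example uses Android's sensor framework; the class name and the threshold value are assumptions made purely for this example and do not describe the actual implementation of this embodiment.

```java
import android.hardware.Sensor;
import android.hardware.SensorEvent;
import android.hardware.SensorEventListener;
import android.hardware.SensorManager;

// Illustrative sketch only: infer a coarse motion state (e.g., handheld vs. stationary)
// from gyroscope angular velocity, as described above. The threshold is an assumed value.
public class MotionStateMonitor implements SensorEventListener {
    private static final float STILL_THRESHOLD = 0.05f; // rad/s, assumed

    private final SensorManager sensorManager;
    private volatile boolean handheld;

    public MotionStateMonitor(SensorManager sensorManager) {
        this.sensorManager = sensorManager;
    }

    public void start() {
        Sensor gyro = sensorManager.getDefaultSensor(Sensor.TYPE_GYROSCOPE);
        sensorManager.registerListener(this, gyro, SensorManager.SENSOR_DELAY_NORMAL);
    }

    public void stop() {
        sensorManager.unregisterListener(this);
    }

    public boolean isHandheld() {
        return handheld;
    }

    @Override
    public void onSensorChanged(SensorEvent event) {
        // event.values holds the angular velocity around the x, y and z axes in rad/s.
        float wx = event.values[0];
        float wy = event.values[1];
        float wz = event.values[2];
        // A non-zero angular velocity suggests the device is being held in the hand.
        handheld = Math.abs(wx) > STILL_THRESHOLD
                || Math.abs(wy) > STILL_THRESHOLD
                || Math.abs(wz) > STILL_THRESHOLD;
    }

    @Override
    public void onAccuracyChanged(Sensor sensor, int accuracy) {
        // Not needed for this sketch.
    }
}
```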
The fingerprint sensor 280H is used to collect a fingerprint. The electronic device 200 can utilize the collected fingerprint characteristics to unlock the fingerprint, access the application lock, photograph the fingerprint, answer an incoming call with the fingerprint, and the like.
The touch sensor 280K is also referred to as a "touch panel". The touch sensor 280K may be disposed on the display screen 294, and the touch sensor 280K and the display screen 294 form a touch screen, which is also called a "touch screen". The touch sensor 280K is used to detect a touch operation applied thereto or nearby. The touch sensor can communicate the detected touch operation to the application processor to determine the touch event type. Visual output related to touch operations may be provided through the display screen 294. In other embodiments, the touch sensor 280K can be disposed on a surface of the electronic device 200 at a different location than the display screen 294.
Illustratively, the display screen 294 of the electronic device 200 displays a home interface including icons for a plurality of applications (e.g., a camera application, a WeChat application, etc.). The user clicks an icon of the camera application in the main interface through the touch sensor 280K, and the processor 210 is triggered to start the camera application and open the camera 293. Display screen 294 displays an interface, such as a viewfinder interface, for a camera application.
The wireless communication function of the electronic device 200 may be implemented by the antenna 1, the antenna 2, the mobile communication module 251, the wireless communication module 252, the modem processor, the baseband processor, and the like.
The antennas 1 and 2 are used for transmitting and receiving electromagnetic wave signals. Each antenna in the electronic device 200 may be used to cover a single or multiple communication bands. Different antennas can also be multiplexed to improve the utilization of the antennas. For example: the antenna 1 may be multiplexed as a diversity antenna of a wireless local area network. In other embodiments, the antenna may be used in conjunction with a tuning switch.
The mobile communication module 251 may provide a solution including 2G/3G/4G/5G wireless communication applied to the electronic device 200. The mobile communication module 251 may include at least one filter, a switch, a power amplifier, a Low Noise Amplifier (LNA), and the like. The mobile communication module 251 can receive electromagnetic waves from the antenna 1, and filter, amplify, etc. the received electromagnetic waves, and transmit the electromagnetic waves to the modem processor for demodulation. The mobile communication module 251 can also amplify the signal modulated by the modem processor, and convert the signal into electromagnetic wave through the antenna 1 to radiate the electromagnetic wave. In some embodiments, at least some of the functional modules of the mobile communication module 251 may be disposed in the processor 210. In some embodiments, at least some of the functional modules of the mobile communication module 251 may be disposed in the same device as at least some of the modules of the processor 210. In this embodiment, the mobile communication module 251 may be further configured to perform information interaction with other terminal devices, that is, send an audio output request to other terminal devices, or the mobile communication module 251 may be configured to receive the audio output request and encapsulate the received audio output request into a message in a specified format.
The modem processor may include a modulator and a demodulator. The modulator is used for modulating a low-frequency baseband signal to be transmitted into a medium-high frequency signal. The demodulator is used for demodulating the received electromagnetic wave signal into a low-frequency baseband signal. The demodulator then passes the demodulated low frequency baseband signal to a baseband processor for processing. The low frequency baseband signal is processed by the baseband processor and then transferred to the application processor. The application processor outputs sound signals through an audio device (not limited to the speaker 270A, the receiver 270B, etc.) or displays images or video through the display screen 294. In some embodiments, the modem processor may be a stand-alone device. In other embodiments, the modem processor may be provided in the same device as the mobile communication module 251 or other functional modules, independent of the processor 210.
The wireless communication module 252 may provide a solution for wireless communication applied to the electronic device 200, including Wireless Local Area Networks (WLANs) (e.g., wireless fidelity (Wi-Fi) networks), Bluetooth (BT), Global Navigation Satellite System (GNSS), Frequency Modulation (FM), NFC, Infrared (IR), and the like. The wireless communication module 252 may be one or more devices that integrate at least one communication processing module. The wireless communication module 252 receives electromagnetic waves via the antenna 2, performs frequency modulation and filtering processing on electromagnetic wave signals, and transmits the processed signals to the processor 210. The wireless communication module 252 may also receive a signal to be transmitted from the processor 210, perform frequency modulation on the signal, amplify the signal, and convert the signal into electromagnetic waves via the antenna 2 to radiate the electromagnetic waves. In this embodiment, the wireless communication module 252 is configured to establish a connection with an audio output device, and output a voice signal through the audio output device. Or the wireless communication module 252 may be used to access the access point device, send messages corresponding to audio output requests to other terminal devices, or receive messages corresponding to audio output requests sent from other terminal devices. Optionally, the wireless communication module 252 may also be used to receive voice data from other terminal devices.
In addition, the electronic device 200 may implement an audio function through the audio module 270, the speaker 270A, the receiver 270B, the microphone 270C, the headphone interface 270D, the application processor, and the like. Such as music playing, recording, etc. The electronic apparatus 200 may receive a key 290 input, generating a key signal input related to user setting and function control of the electronic apparatus 200. The electronic device 200 may generate a vibration alert (such as an incoming call vibration alert) using the motor 291. The indicator 292 of the electronic device 200 may be an indicator light, and may be used to indicate a charging status, a power change, or a message, a missed call, a notification, etc. The SIM card interface 295 in the electronic device 200 is used to connect a SIM card. The SIM card can be attached to and detached from the electronic apparatus 200 by being inserted into the SIM card interface 295 or being pulled out from the SIM card interface 295.
It should be understood that in practical applications, the electronic device 200 may include more or less components than those shown in fig. 2, and the embodiment of the present application is not limited thereto. The illustrated electronic device 200 is merely an example, and the electronic device 200 may have more or fewer components than shown in the figures, may combine two or more components, or may have a different configuration of components. The various components shown in the figures may be implemented in hardware, software, or a combination of hardware and software, including one or more signal processing and/or application specific integrated circuits.
The software system of the terminal device may adopt a layered architecture, an event-driven architecture, a micro-kernel architecture, a micro-service architecture, or a cloud architecture. The embodiment of the invention takes an Android system with a layered architecture as an example to illustrate the software structure of a terminal device. Fig. 3 is a block diagram of a software structure of a terminal device according to an embodiment of the present invention, and shows a software architecture that can run in the first electronic device. The layered architecture divides the software into several layers, each layer having a clear role and division of labor. The layers communicate with each other through software interfaces. As shown in fig. 3, the software architecture may be divided into five layers, which are an application layer, an application framework layer, the android runtime and system library, a hardware abstraction layer, and a Linux kernel layer.
The application layer is the top layer of the operating system and includes native applications of the operating system, such as email clients, notes, calls, calendars, browsers, contacts, and the like. An application (APP) referred to in the embodiments of the present application is a software program capable of implementing one or more specific functions. Generally, a plurality of applications can be installed in a terminal device, for example, a camera application, a mailbox application, a smart home control application, and the like. The applications mentioned below may be system applications installed when the terminal device leaves the factory, or may be third-party applications downloaded from a network or acquired from another terminal device by the user during use of the terminal device.
Of course, for a developer, the developer may write an application and install it into the layer. The application framework layer may include a window manager, content provider, view system, phone manager, resource manager, notification manager, and the like. Wherein, the window manager is used for managing the window program. The window manager can obtain the size of the display screen, judge whether a status bar exists, lock the screen, intercept the screen and the like.
In some embodiments of the present application, the application layer may be configured to implement presentation of a setting interface, which may be used for a user to set a data collaboration function of the first electronic device 101. For example, the user may perform on/off setting of the data cooperation function in the setting interface, and may also perform configuration of the data cooperation function in the setting interface, such as document type setting available for the data cooperation function, user name and password setting for login authentication, and the like. For example, the setting interface may be content in a status bar or a notification bar displayed on the touch screen of the first electronic device 101, or may be a collaboration control interface displayed on the touch screen of the first electronic device 101.
In a possible implementation manner, the application program may be developed using Java language, and is completed by calling an Application Programming Interface (API) provided by an application framework layer, and a developer may interact with a bottom layer (e.g., a hardware abstraction layer, a kernel layer, etc.) of an operating system through the application framework to develop its own application program. The application framework is primarily a series of services and management systems for the operating system.
In some embodiments of the present application, the application framework layer includes a document editing discovery service, which is mainly responsible for calling a service interface for communicating with the hardware abstraction layer, so as to transmit a request for opening or closing the data collaboration function to the hardware abstraction layer, and is also responsible for managing document editing discovery service information, such as managing the document types (word/outlook/txt/notepad++/UltraEdit, etc.) that trigger the document editing discovery service, and managing the user name and password for login authentication. Illustratively, the document editing discovery service may include various modules for managing the data collaboration function. For example, the document editing discovery service includes an interface adapter, a document discovery setting, a document discovery state machine, a local controller, and the like. The document editing discovery service may also include a configuration file. The configuration file may include a white list of applications whose manuscript authoring may trigger the document editing discovery service. For example, applications in the white list may include document editing applications, such as word, excel, wps, txt, notepad++, and UltraEdit; mail applications, such as outlook; instant messaging applications, such as WeChat, DingTalk, and the Taobao client; and browser applications, such as IE and Firefox. Of course, there may be other applications in which manuscript authoring occurs, such as multimedia applications like photo editing applications. For example, a user may need to perform text editing when making multimedia files such as an album, a video file, or a voice file; the user may therefore add such an application to the white list, so that the application in the white list is stored in the configuration file, and the document editing discovery service may determine that a document editing scene is opened when an application in the white list is opened.
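As an illustrative, non-limiting sketch of the white list handling described above, the following example shows how the document editing discovery service could check an application against a user-extensible white list; the class and method names are assumptions made for this example, and the real configuration file format is not specified here.

```java
import java.util.Arrays;
import java.util.HashSet;
import java.util.Locale;
import java.util.Set;

// Illustrative sketch only: a white list held by the document editing discovery
// service, used to decide whether opening an application should be treated as
// entering a document editing scene. The entries mirror the examples in the text.
public class DocumentEditingWhiteList {
    private final Set<String> entries = new HashSet<>(Arrays.asList(
            "word", "excel", "wps", "txt", "notepad++", "ultraedit",
            "outlook", "wechat", "dingtalk", "ie", "firefox"));

    // Called when the user adds an application (for example, a photo editing
    // application used for album or video authoring) to the configuration file.
    public void add(String applicationName) {
        entries.add(applicationName.toLowerCase(Locale.ROOT));
    }

    // Returns true if opening this application should cause the document editing
    // discovery service to treat the device as being in a document editing scene.
    public boolean triggersDocumentEditingScene(String applicationName) {
        return entries.contains(applicationName.toLowerCase(Locale.ROOT));
    }
}
```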
In some other embodiments, the data cooperation function is implemented as a sub-function of an original document editing function in the first electronic device, for example, the data cooperation function may be implemented as a sub-function of a Huawei sharing (Huawei Share) function of a Huawei mobile phone, and in this case, the document editing discovery service may exist in the Huawei Share application in the form of a sub-module.
For example, when a user operates the cooperation switch button of the data cooperation function, the interface adapter may provide a corresponding service for the setting interface, so as to change the state of the cooperation switch button of the data cooperation function in the setting interface (e.g., switch from an off state to an on state). The document editing discovery service setting is mainly responsible for storing the document types that trigger the document editing discovery service and/or the user name and password for login authentication. The document discovery state machine is mainly responsible for monitoring the running state of the underlying collaboration service. The local controller is mainly responsible for controlling signals of the bottom layer. The configuration file is mainly used for storing the cooperation directory, the file directory, the cooperation range set by the user, and the like.
The content provider is used to store and retrieve data and make it accessible to applications. The data may include video, images, audio, calls made and received, browsing history and bookmarks, phone books, and so on. The view system includes visual controls, such as controls for displaying text and controls for displaying pictures, and may be used to build applications. A display interface may be composed of one or more views; for example, a display interface including a short message notification icon may include a view for displaying text and a view for displaying a picture. The telephone manager is used for providing the communication function of the terminal device, such as management of the call state (including connected, hung up, and the like). The resource manager provides various resources for applications, such as localized strings, icons, pictures, layout files, and video files. The notification manager enables an application to display notification information in the status bar, and can be used to convey notification-type messages that disappear automatically after a short stay without user interaction, such as notifications of download completion or message alerts. The notification manager may also present a notification in the form of a chart or scroll-bar text in the status bar at the top of the system, such as a notification of an application running in the background, or a notification that appears on the screen in the form of a dialog window. For example, text information is prompted in the status bar, a prompt tone is issued, the terminal device vibrates, or an indicator light blinks.
In the embodiment of the application, the application framework layer may further include a file viewer, a WI-FI service, and a notification manager; these three modules are mainly responsible for cooperating with the document editing discovery service to provide the data collaboration function. For example, the WI-FI service is used for monitoring whether the first electronic device has accessed the wireless access point. As another example, the notification manager is configured to transmit the notification message to an upper layer for presentation on the touch screen of the first electronic device.
The android runtime includes a core library and a virtual machine. The android runtime is responsible for scheduling and managing the android system. The core library of the android system comprises two parts: one part is a function which needs to be called by java language, and the other part is a core library of the android system. The application layer and the application framework layer run in a virtual machine. Taking java as an example, the virtual machine executes java files of the application layer and the application framework layer as binary files. The virtual machine is used for performing the functions of object life cycle management, stack management, thread management, safety and exception management, garbage collection and the like.
The system library may include a plurality of functional modules. For example: surface managers, media libraries, three-dimensional graphics processing libraries (e.g., OpenGL ES), two-dimensional graphics engines (e.g., SGL), and the like. The surface manager is used to manage the display subsystem and provide a fusion of two-dimensional and three-dimensional layers for multiple applications. The media library supports a variety of commonly used audio, video format playback and recording, and still image files, among others. The media library may support a variety of audio-video encoding formats, such as: MPEG4, H.264, MP3, AAC, AMR, JPG, PNG, etc. The three-dimensional graphic processing library is used for realizing three-dimensional graphic drawing, image rendering, synthesis, layer processing and the like. The two-dimensional graphics engine is a two-dimensional drawing engine.
In some embodiments of the present application, the system library may further include a data cooperation configuration module, a document editing discovery configuration file, a document editing discovery service password, a service interface module, a data cooperation service module, and the like. The data cooperation configuration module is used for providing a service interface for communicating with the application framework layer, and is also used for managing the data cooperation service, managing the document discovery configuration file, the document editing discovery service password, and the like. The document discovery configuration file may be used to store document editing discovery service information, and the document editing discovery service password may be used to store information such as a user name and a password for login authentication.
The Hardware Abstraction Layer (HAL) is a support of the application framework, and is an important link for connecting the application framework Layer and the Linux kernel Layer, and can provide services for developers through the application framework Layer.
Illustratively, the functions of the data collaboration service and the document editing discovery service in the embodiments of the present application may be implemented by configuring a first process at a hardware abstraction layer, and the first process may be a sub-process separately constructed in the hardware abstraction layer. The first process may include modules such as an interface, a document editing discovery controller, a data collaboration controller, and a document editing discovery configuration interface. The interface is a service interface for communicating with the application framework layer. The document editing discovery controller is configured to monitor a document editing discovery service configuration of an upper layer, for example, to control whether authentication is required. The document editing discovery controller is mainly responsible for monitoring whether the data input into the first electronic device needs to be cached or updated, and when the input data needs to be cached or updated, the upper layer can be informed to cache or update the collaboration directory and the like. The hardware abstraction layer may further include a daemon process, which may be used to cache data in the first process, and may be a sub-process separately constructed in the hardware abstraction layer.
The Linux kernel layer provides core system services of the operating system; security, memory management, process management, the network protocol stack, the driver model, and the like are all based on the Linux kernel. The Linux kernel also serves as an abstraction layer between the hardware and the software stack. This layer has many drivers associated with the electronic device, the main ones being: a display driver; a Linux-based frame buffer driver; a keyboard driver as an input device; a flash driver based on a memory technology device; a camera driver; an audio driver; a Bluetooth driver; a WI-FI driver, and the like. In some embodiments of the present application, the Linux kernel relies on a local file system. The local file system can be accessed through the collaboration service, and the collaboration data in the local file system can be configured through the service interface of the hardware abstraction layer.
The technical solutions in the following embodiments may be implemented in an electronic device having the above hardware architecture and software architecture.
For convenience of understanding, the multi-device text processing method provided in the embodiment of the present application is specifically described by taking as an example that the first electronic device is a mobile phone, the second electronic device is a notebook computer, and the notebook computer accesses the collaboration data in the mobile phone. As shown in fig. 4a, the method includes the following steps:
s401, the document editing discovery service of the second electronic device determines that the second electronic device is in a document editing scene.
In some embodiments of the present application, a first document editing box of a document editing application of the second electronic device is displayed on the second electronic device in response to a first operation on the document editing application. Wherein the first operation may include at least one of: opening the operation of the first document editing application, clicking the operation of the document editing box in the first document editing application, switching the operation of the first document editing application, and switching the operation of the document editing box in the first document editing application.
In one possible approach, the document editing discovery service may determine the currently opened document editing application by checking the current processes. For example, in response to an operation acting on the second electronic device to open the word application, the second electronic device starts the process of the word application, so that the document editing discovery service may determine from the processes that the process of the word application is enabled; as another example, in response to an operation of opening the mail application on the second electronic device, the second electronic device starts the process of the mail application, so that the document editing discovery service may determine from the processes that the process of the mail application is enabled. Alternatively, the application may also be a browser application, an instant messaging application, or another white-listed application, which is not limited herein.
Another possibility is that the second electronic device may be provided with a document editing discovery service; for example, the document editing discovery service may identify that the second electronic device is currently focused on a text edit box. The text edit box may be a text edit box in an application having a text editing function. The application having the text editing function (document editing application) may be a browser, a mail application, a picture editing application, a call application, a video browsing application, a document editing application, or the like.
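As an illustrative, non-limiting sketch of the first possibility above (determining the document editing scene by inspecting the current processes), the following example enumerates the running processes of the second electronic device and matches their names against a white list; the process names and class names are assumptions made for this example and vary between operating systems and installations.

```java
import java.util.Arrays;
import java.util.HashSet;
import java.util.Locale;
import java.util.Optional;
import java.util.Set;

// Illustrative sketch only: the "check the current processes" approach described
// above. The white-listed process names are assumed stand-ins.
public class DocumentEditingProcessScanner {
    private final Set<String> whiteListedProcesses = new HashSet<>(Arrays.asList(
            "winword", "excel", "wps", "notepad++", "ultraedit", "outlook"));

    public boolean documentEditingApplicationRunning() {
        return ProcessHandle.allProcesses()
                .map(process -> process.info().command())
                .filter(Optional::isPresent)
                .map(command -> baseName(command.get()))
                .anyMatch(whiteListedProcesses::contains);
    }

    // Strips the directory part and a trailing ".exe" so that, for example,
    // "C:\\Program Files\\...\\WINWORD.EXE" becomes "winword".
    private static String baseName(String path) {
        int separator = Math.max(path.lastIndexOf('/'), path.lastIndexOf('\\'));
        String name = separator >= 0 ? path.substring(separator + 1) : path;
        name = name.toLowerCase(Locale.ROOT);
        return name.endsWith(".exe") ? name.substring(0, name.length() - 4) : name;
    }
}
```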
The following illustrates possible ways in which the document editing discovery service of the second electronic device determines that the second electronic device is in a document editing scenario by way of mode 1-mode 2.
In mode 1, taking the word application as the document editing application as an example, in response to an operation of opening the word application by the user, the second electronic device opens the word application and displays a document opening interface on its display screen; a control for opening a document (for example, the opening control 4013 shown in fig. 4b) and the like may be displayed in the document opening interface. As shown in fig. 4b, in response to an operation of the user to open the first document, the second electronic device may display an open document interface 4016, where the open document interface 4016 may include a plurality of text edit boxes 4017, for example, a text edit box for inputting a file name, a text edit box for inputting a document path, and a search input box. The second electronic device then displays the editing interface of the first document in the word application. As shown in fig. 4c, the editing interface of the first document may include at least one editable text edit box, for example, a main interface 4010 of text editing, a text edit box 4011 of a search bar, and text edit boxes in a toolbar. For example, a start tab page in the toolbar may include a text edit box 4012 for the font and the font size, and a page layout tab page in the toolbar may include a text edit box corresponding to a paragraph. As another example, as shown in fig. 4d, when word opens the interface of the formula editor, the interface of the formula editor may also include a text edit box 4014 for formula editing. Alternatively, as shown in fig. 4e, the editing interface of word may include an editable text edit box 4015, and the contents that may be inserted in the text edit box include the picture type, the ppt type, and other editable picture types, for example, a file generated by a drawing application such as Visio.
In response to the click operation on the editable text edit box, the word application may put the editable text edit box in an editable state (for example, a dashed box indicated by 4015 shown in fig. 4e, but of course, the editable state may also be identified by other means, which is not limited herein), and at this time, the user may edit the inserted content. In response to a closing operation on the editable text edit box, the word application may put the editable text edit box in an uneditable state, at which time the word application closes a function of editing the inserted content.
Possible manners of determining to trigger S403 are illustrated in mode 1.1 to mode 1.3 below.
Mode 1.1, the document editing discovery service uses the document editing boxes in the application as the document editing boxes of the document editing scene. At this time, the second electronic device may perform S403 after determining the document editing scene.
In response to an operation acting on a text edit box, the word application may switch focus between the text edit boxes. For example, in response to a click operation acting on the main interface, the word application displays the cursor of the first document on the main interface 4010; at this time, the document editing discovery service may recognize that the second electronic device is currently focused on the main interface 4010. In response to a click operation on the text edit box 4015, the word application displays the focus of the first document within the text edit box 4015; at this point, the document editing discovery service may identify that the second electronic device is currently focused on the text edit box 4015. In response to a close operation on the text edit box 4015, the word application displays the focus of the first document in the main interface 4010; at this time, the document editing discovery service may identify that the second electronic device is currently focused on the main interface 4010.
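As an illustrative, non-limiting sketch of the focus tracking in mode 1.1, the following example observes which text edit box currently holds the input focus; Swing is used purely for demonstration, and the class names are assumptions made for this example rather than part of any actual document editing application.

```java
import java.awt.FlowLayout;
import java.awt.event.FocusAdapter;
import java.awt.event.FocusEvent;
import javax.swing.JFrame;
import javax.swing.JTextField;
import javax.swing.SwingUtilities;

// Illustrative sketch only: report whenever the input focus enters a text edit box,
// which corresponds to the discovery service recognizing the currently focused box.
public class FocusTrackingDemo {
    public static void main(String[] args) {
        SwingUtilities.invokeLater(() -> {
            JFrame frame = new JFrame("Document editing scene demo");
            frame.setLayout(new FlowLayout());
            frame.setDefaultCloseOperation(JFrame.EXIT_ON_CLOSE);

            JTextField mainEditBox = new JTextField(20);   // stands in for the main interface
            JTextField searchEditBox = new JTextField(10); // stands in for a search bar edit box
            addFocusLogging(mainEditBox, "main interface");
            addFocusLogging(searchEditBox, "search bar edit box");

            frame.add(mainEditBox);
            frame.add(searchEditBox);
            frame.pack();
            frame.setVisible(true);
        });
    }

    private static void addFocusLogging(JTextField field, String name) {
        field.addFocusListener(new FocusAdapter() {
            @Override
            public void focusGained(FocusEvent e) {
                System.out.println("Focus is now on: " + name);
            }
        });
    }
}
```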
In mode 1.2, the document editing box in the data collaboration scenario may be triggered by user confirmation in S403 described below, so that the user confirmation triggers the second electronic device to send the first notification message to the first electronic device, thereby establishing the data collaboration scenario.
For example, as shown in (a) of fig. 4f, the operation of triggering the data collaboration scenario may be an operation on a control 4018 that triggers the data collaboration scenario on the document editing discovery service interface; or, as shown in (b) of fig. 4f, it may be an operation on a document editing discovery control 4018 embedded in the document editing application, which is not limited herein. In this manner, in response to an operation of the user selecting a text edit box (e.g., the main interface 4010) of the document editing application and operating the control that triggers the data collaboration scenario, the second electronic device may determine to trigger the data collaboration scenario.
In mode 1.3, the document editing discovery service may determine the document editing box that triggers the document editing scene according to the type of the document editing box in the application. For example, in response to an operation of the user selecting a preset type of text edit box, the second electronic device determines to trigger the data collaboration scenario. For example, the preset type of text edit box may be a text edit box that is preset by the user or the system as usable for data collaboration, and may be a text edit box for inputting long text. In connection with the above example, the preset type of text edit box may be the text edit box of the main interface 4010, or the like. At this time, the second electronic device may determine, according to its display focus being located in the main interface 4010, to trigger the data collaboration scene and perform S403.
In mode 2, taking the mail application as the document editing application as an example, in response to an operation of opening the mail application by the user, the second electronic device opens the mail application and displays a mail application interface on its display screen. As shown in (a) of fig. 4g, a control for opening mail editing (for example, the file control shown in (a) of fig. 4g) may be displayed in the mail application interface, and a text edit box 4021 of a search bar may also be included. In response to an operation of opening the first mail by the user, the second electronic device displays an editing interface of the first mail in the mail application. As shown in (b) of fig. 4g, the editing interface of the first mail may include at least one editable text edit box, for example, a main interface 4020 for text editing, a recipient edit box 4022, a subject edit box 4023, and the like, and may further include a text edit box 4024 in the search bar.
Possible manners of determining to trigger S403 are illustrated in mode 2.1 to mode 2.3 below.
In mode 2.1, the document editing discovery service uses the edit boxes in the application, for example, the text edit box 4021, the main interface 4020, the recipient edit box 4022, the subject edit box 4023, the text edit box 4024, and the like, as the document editing boxes of the document editing scene. At this time, the second electronic device may perform S403 after determining the document editing scene.
In response to an operation acting on the text editing box, the mail application may switch focus between the text editing boxes, for example, in response to a click operation acting on the main interface 4020, the mail application displays a cursor of the first document on the main interface 4020. At this time, the document editing discovery service may identify that the second electronic device is currently focused on the primary interface 4020. In response to an operation acting on the text editing box 4022, the mail application displays the focus of the first document within the text editing box 4022. At this point, the document editing discovery service may identify that the second electronic device is currently focused on the text edit box 4022.
Mode 2.2, the document editing box in the data collaboration scenario may be triggered by the user confirmation in S403 described below, so that the data collaboration scenario is established by the user confirmation to trigger the second electronic device to send the first notification to the first electronic device.
For example, the operation for triggering the data collaboration scenario may be a control for triggering the data collaboration scenario on the document editing discovery service interface, or may be a document editing discovery control embedded in the document editing application, which is not limited herein. In this manner, in response to a user selecting the text editing box 4020 of the document editing application and an operation of the control triggering the data collaboration scenario, the second electronic device may determine to trigger the data collaboration scenario. Reference may be made specifically to the embodiment in the mode 1.2, which is not described herein again.
In mode 2.3, the document editing discovery service may determine the document editing box that triggers the document editing scene according to the type of the document editing box in the application. For example, in response to an operation of the user selecting a preset type of text edit box, the second electronic device determines to trigger the data collaboration scenario. For example, the preset type of text edit box may be a text edit box that is preset by the user or the system as usable for data collaboration, and may be a text edit box for inputting long text. In connection with the above example, the preset type of text edit box may be the main interface 4020, the text edit box 4024, or the like. At this time, the second electronic device may determine, according to its display focus being located in the main interface 4020, to trigger the data collaboration scene and perform S403.
In some embodiments of the present application, the document editing discovery service may be implemented as a sub-function of the data collaboration function in the second electronic device; for example, the data collaboration function in a Huawei computer may be implemented as a sub-function of the Huawei Share function of the computer. Referring to fig. 5a, the user may operate on a data collaboration control interface displayed on the display screen of the computer to start the document editing discovery service of the second electronic device. For example, the data collaboration control interface 510 may include a switch button 520 for the Huawei Share function and a switch button for the document editing discovery service. Thus, the Huawei Share function is turned on or off in response to an operation acting on the switch button of the Huawei Share function, or the document editing discovery service is turned on or off in response to an operation acting on the switch button of the document editing discovery service.
For another example, referring to fig. 5b, the user may operate on the collaboration control interface displayed on the display screen of the computer, or may operate a prompt box in a notification bar or a task bar of the second electronic device, where the prompt box is used to set whether the second electronic device starts a data collaboration function or a document editing discovery service. Fig. 5b (a) shows a prompt box of the second electronic device when the user is not operating the second electronic device. Fig. 5b (b) shows a prompt box of the second electronic device displayed when the user operates the control of the Huawei Share function, in which case the control of the Huawei Share function includes an open control and a close control of the Huawei Share function. The operation may also be an operation on a related icon in a notification bar or a task bar of the second electronic device, which is not limited herein.
In order to avoid false triggering of the document editing discovery service, the service may be configured with a document editing white list and a document editing control ID. The document editing white list may include the application types that trigger the document editing discovery service, such as word/outlook/txt/notepad++/UltraEdit, and the like; when the user's input focus is switched to one of the white-listed applications, the document editing discovery service is triggered to send a notification message to the first electronic device. For example, the applications in the document editing white list may be set by the user. Illustratively, as shown in fig. 5a, a document editing white list that triggers the document editing discovery service may also be included in the data collaboration control interface 501.
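As an illustrative, non-limiting sketch of the check described above on the second electronic device, the following example sends a notification message only when the input focus switches to an application in the document editing white list; the interface and method names, and the transport used for the notification, are assumptions made for this example.

```java
import java.util.Locale;
import java.util.Set;

// Illustrative sketch only: trigger the document editing discovery notification when
// the user's input focus switches to a white-listed application.
public class DocumentEditingFocusMonitor {

    // Abstraction over the channel used to notify the peer device; assumed for this example.
    public interface NotificationSender {
        void sendDocumentEditingNotification(String applicationName, String controlId);
    }

    private final Set<String> documentEditingWhiteList;
    private final NotificationSender sender;

    public DocumentEditingFocusMonitor(Set<String> documentEditingWhiteList,
                                       NotificationSender sender) {
        this.documentEditingWhiteList = documentEditingWhiteList;
        this.sender = sender;
    }

    // Called whenever the input focus moves to a control of some application,
    // e.g. a text edit box identified by a document editing control ID.
    public void onFocusChanged(String applicationName, String controlId) {
        if (documentEditingWhiteList.contains(applicationName.toLowerCase(Locale.ROOT))) {
            // A document editing scene is detected; notify the peer device.
            sender.sendDocumentEditingNotification(applicationName, controlId);
        }
    }
}
```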
Further, as shown in fig. 5a, the data cooperation control interface 501 may further include a control for a first electronic device list, so as to set the list of first electronic devices that can use the data cooperation function, thereby preventing an untrusted first electronic device from establishing a data cooperation connection with the second electronic device and improving the security of the data cooperation function.
S402, the first electronic device and the second electronic device are in communication connection.
The first electronic device and the second electronic device may establish a communication connection through Bluetooth, NFC, or WI-FI. For example, the second electronic device triggers the communication connection in an NFC manner to discover the first electronic device, so as to receive a communication connection request from the first electronic device and further establish the communication connection with the first electronic device. This is exemplified in manner a1 to manner a3 below.
In manner a1, the first electronic device and the second electronic device may establish wireless connections with the same WI-FI hotspot, and the second electronic device may search, through the communication connection, for the first electronic device that establishes the communication connection with the second electronic device. After establishing a wireless connection with the WI-FI hotspot, the second electronic device may broadcast a WI-FI frame, and the WI-FI frame may carry an identifier of the WI-FI hotspot with which the second electronic device has established the wireless connection, so that the first electronic device can learn, from the received WI-FI frame, whether the second electronic device and the first electronic device have established wireless connections with the same WI-FI hotspot.
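As an illustrative, non-limiting sketch of the same-hotspot check in manner a1 on the first electronic device side, the following example compares the hotspot identifier carried in the received WI-FI frame with the identifier of the access point the local device is connected to; parsing of the broadcast frame is not shown, and the class and parameter names are assumptions made for this example.

```java
import android.content.Context;
import android.net.wifi.WifiInfo;
import android.net.wifi.WifiManager;

// Illustrative sketch only: decide whether the two devices are connected to the same
// WI-FI hotspot by comparing hotspot identifiers. Requires the appropriate WI-FI
// permissions on the device running it.
public final class SameHotspotChecker {

    private SameHotspotChecker() {
    }

    public static boolean onSameHotspot(Context context, String remoteHotspotId) {
        WifiManager wifiManager = (WifiManager) context.getApplicationContext()
                .getSystemService(Context.WIFI_SERVICE);
        if (wifiManager == null || remoteHotspotId == null) {
            return false;
        }
        WifiInfo connectionInfo = wifiManager.getConnectionInfo();
        // The BSSID identifies the access point the local device is currently
        // associated with; matching identifiers imply the same WI-FI hotspot.
        return connectionInfo != null
                && remoteHotspotId.equalsIgnoreCase(connectionInfo.getBSSID());
    }
}
```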
The identifier of the first electronic device discovered by the second electronic device may be displayed in an interface of the data collaboration function of the second electronic device. The identifier of the first electronic device may include a device icon of the first electronic device and/or a discovery name of the first electronic device when the first electronic device serves as the collaboration server. Exemplarily, as shown in (a) of fig. 5c, taking the first electronic device being a mobile phone and the second electronic device being a notebook computer as an example, the user opens the data collaboration function of the notebook computer. In response to the click operation of the user, the notebook computer displays a first search device interface. The first search device interface may be a search device interface 530 of the data collaboration service of the notebook computer, and the search device interface 530 may include the identifier of the discovered first electronic device; specifically, as shown in fig. 5c, the search device interface includes an icon 540 of the mobile phone and a discovery name 550 of the mobile phone when the mobile phone serves as the collaboration server. In this way, the user can conveniently distinguish the device type of the collaboration server discovered by the second electronic device, for example, whether the device type of the collaboration server is a tablet computer or a mobile phone. In some other embodiments of the present application, the discovered devices may not be distinguished according to the device type of the collaboration server in the first search device interface.
For example, in some other embodiments, the user may operate a prompt box in the notification bar or the task bar of the second electronic device, and in response to the operation, the second electronic device opens the first search device interface. In some other embodiments, the user may operate a related icon in the notification bar or the task bar of the second electronic device, and in response to the operation, the second electronic device opens the first search device interface. In response to an operation of a data collaboration request acting on the first search device interface, the second electronic device displays a login interface, where the login interface is used to prompt the user to input a user name and a password for login authentication. The login interface may also be referred to as a verification interface.
In a case in which the first electronic device side does not allow password-free access, after the user operates the identifier of the first electronic device in the first search device interface shown in fig. 5c, in response to the operation, the second electronic device displays a login interface for the user to input a user name and a password for login authentication. The second electronic device then carries the user name and the password input by the user in communication connection request information and sends the communication connection request information to the first electronic device.
After the second electronic device receives the confirmation input of the user, the second electronic device may carry the user name and the password input by the user in the communication connection request information and send the communication connection request information to the first electronic device through the wireless access point.
Illustratively, in conjunction with fig. 5c (a), after the user clicks the identifier of the first electronic device in the first search device interface 530 shown in fig. 5c (a), in response to the click operation acting on the identifier of the first electronic device, as shown in fig. 5c (b), the notebook computer displays a login interface 560, and the login interface 560 may include a pop-up window for the user to enter a user name and password for login authentication. After the pop-up window is displayed on the notebook computer, the user can input the user name and the password in the pop-up window and click a connection button after the input is finished. In response to the click operation acting on the connection button, the notebook computer carries the user name and the password input by the user in the communication connection request information and sends the communication connection request information to the mobile phone through the wireless access point.
The first electronic device receives the communication connection request of the second electronic device from a first port and verifies the validity of the user name and the password carried in the communication connection request. The first port is a service port of a local area network collaboration access protocol.
After the first electronic device turns on the data collaboration function, the first electronic device can monitor the first port, so as to learn in time whether another electronic device needs to establish a communication connection with the local device. After the first electronic device receives, from the first port, the communication connection request information forwarded by the second electronic device via the wireless access point, the first electronic device may verify the validity of the second electronic device based on the communication connection request information, that is, the first electronic device may verify the user name and the password that are input by the user and carried in the received communication connection request information. For example, the above validity verification operation may be performed by the application layer or the hardware abstraction layer.
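For illustration only, the following sketch outlines how the first electronic device could listen on the first port and verify the user name and password carried in each communication connection request; the port handling, message encoding, and class names are assumptions made for this example and do not describe the actual local area network collaboration access protocol.

    // Illustrative sketch: listen on the service port and verify the credentials carried in
    // each communication connection request. Port number and encoding are assumptions.
    import java.net.ServerSocket

    data class ConnectRequest(val userName: String, val password: String)

    class CollaborationAccessServer(
        private val firstPort: Int,                          // assumed service port number
        private val credentialStore: Map<String, String>     // legal user name -> password
    ) {
        fun listen() {
            ServerSocket(firstPort).use { server ->
                while (true) {
                    server.accept().use { socket ->
                        val request = parseRequest(socket.getInputStream().readBytes())
                        val valid = credentialStore[request.userName] == request.password
                        // Reply with a single status byte: 1 = accepted, 0 = rejected.
                        socket.getOutputStream().write(if (valid) 1 else 0)
                    }
                }
            }
        }

        private fun parseRequest(bytes: ByteArray): ConnectRequest {
            // Assumed encoding: "userName\npassword" in UTF-8.
            val (user, pass) = String(bytes, Charsets.UTF_8).split("\n", limit = 2)
            return ConnectRequest(user, pass)
        }
    }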
In some other embodiments of the present application, after the first electronic device verifies the validity of the second electronic device, the first electronic device may further display a notification message to notify the user that a device has successfully established a connection with the first electronic device and may establish a data collaboration scenario with the first electronic device. Illustratively, the hardware abstraction layer performs the validity verification. After the hardware abstraction layer verifies the validity of the second electronic device, the hardware abstraction layer may send, to the application framework layer, a notification message indicating that an electronic device has successfully established a connection with the local device and may establish data collaboration with the local device. After receiving the notification message, the application framework layer forwards the notification message to the application layer. After the application layer receives the notification message, the notification message indicating that an electronic device has successfully established a connection with the local device and may establish data collaboration with the local device can be displayed in a notification bar or a status bar displayed on the touch screen of the mobile phone. For example, the mobile phone may display the notification message in the notification bar displayed on the touch screen of the mobile phone to prompt the user that an electronic device has successfully established a connection with the local device and may establish a data collaboration scenario with the local device.
In some other embodiments of the present application, in a case in which the first electronic device side allows password-free access, after the user operates the identifier of the first electronic device in the first search device interface shown in fig. 5c, in response to the operation acting on the identifier of the first electronic device, the second electronic device may send data collaboration request information to the first electronic device to request to enable the data collaboration function of the first electronic device.
In manner a2, the second electronic device may discover the first electronic device with which a communication connection is to be established.
A first possible implementation may be that the second electronic device discovers the first electronic device through the local area network and then establishes a communication connection with the first electronic device; for example, the first electronic device and the second electronic device may be located in the same local area network, so that the second electronic device searches for all first electronic devices in the local area network in which the second electronic device is located. The second electronic device may also discover the first electronic device in an NFC manner. In this case, the first electronic device and the second electronic device do not both need to be connected to the local area network for the first electronic device to be discovered. For example, the first electronic device is not networked and the second electronic device accesses the mobile network through data traffic; at this time, the second electronic device may discover the first electronic device in the NFC manner.
For example, the communication connection request may be triggered by the first electronic device, that is, the first electronic device sends the communication connection request to the second electronic device. For example, the user may scan an identifier, such as a two-dimensional code, of the second electronic device through a scanning function of the first electronic device, so that the first electronic device sends the communication connection request to the second electronic device according to the acquired identifier of the second electronic device; or the user may obtain the NFC identifier of the second electronic device through the NFC function of the first electronic device to send the communication connection request to the second electronic device. The second electronic device may send a communication connection response to the first electronic device in response to an operation of the communication connection request acting on the discovered first electronic device. The first electronic device then confirms, according to the sent communication connection response, that the communication connection with the second electronic device is established.
As another example, the communication connection request may also be triggered by the second electronic device. For example, the user may obtain the Bluetooth identifier of the first electronic device through the Bluetooth function of the second electronic device to send a communication connection request to the first electronic device. After receiving the communication connection request, the first electronic device sends a communication connection response to the second electronic device. The second electronic device then confirms, according to the sent communication connection response, that the communication connection with the first electronic device is established. In addition, before sending the communication connection response to the second electronic device, the first electronic device may further display a connection request interface on the first electronic device and confirm establishment of the communication connection with the second electronic device in response to an operation acting on the connection request interface.
For the process in which the first electronic device verifies the communication connection request in manner a2, refer to manner a1; details are not described herein again.
In manner a3, the first electronic device and the second electronic device may separately establish connections with the same Wi-Fi hotspot, and when the first electronic device and the second electronic device trust each other, the first electronic device that can establish a communication connection with the second electronic device is determined.
According to a possible implementation manner, the second electronic device establishes communication connection with the first electronic device by default, so that disturbance to a user is avoided.
There are many mutually trusted scenarios, which are illustrated below by manner b1 and manner b2.
In manner b1, the first electronic device and the second electronic device log in to the same account. The account may be an account provided by an operator for the user, such as an HONOR account or a HUAWEI account, or may be an application account, such as a social application account or a video application account. The Wi-Fi frame may also carry the account, so that the first electronic device can learn, according to the Wi-Fi frame, whether the second electronic device and the first electronic device have logged in to the same account.
Illustratively, when the first electronic device and the second electronic device establish connections with the same Wi-Fi hotspot, the first electronic device acquires a second user identifier of the second electronic device, for example, the HUAWEI account logged in on the second electronic device, and determines whether the second user identifier of the second electronic device is the same as a first user identifier stored in the first electronic device, for example, the HUAWEI account logged in on the first electronic device. If the first electronic device determines that the second user identifier is the same as the first user identifier, the first electronic device and the second electronic device trust each other, and at this time, the first electronic device may automatically establish a communication connection with the second electronic device.
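For illustration only, the following sketch shows one possible form of the mutual-trust check in manner b1, in which the broadcast payload carries both the hotspot identifier and the logged-in account; the payload fields and function names are assumptions made for this example.

    // Illustrative sketch of manner b1: the peer is treated as trusted when it is connected to
    // the same Wi-Fi hotspot and has logged in with the same account.
    data class DiscoveryPayload(val hotspotId: String, val accountId: String)

    fun isTrustedPeer(localHotspotId: String, localAccountId: String, peer: DiscoveryPayload): Boolean =
        peer.hotspotId == localHotspotId && peer.accountId == localAccountId

    // When isTrustedPeer(...) returns true, the first electronic device may establish the
    // communication connection automatically, without prompting the user.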
In manner b2, a mutually trusted scenario may be that the first electronic device and the second electronic device have historically established a communication connection.
For example, a first electronic device that has historically established a communication connection may be stored in the second electronic device, and the second electronic device may regard that first electronic device as a trusted first electronic device. At this time, an identifier list of trusted first electronic devices may be displayed on an interface of the second electronic device. In response to a selection operation performed on the identifier list of the trusted first electronic devices, the first electronic device with which a communication connection is to be established is determined, a communication connection request is sent to the first electronic device, and a communication connection response returned by the first electronic device is received, so as to complete establishment of the communication connection between the first electronic device and the second electronic device.
Optionally, a notification message may also be displayed on the display screen of the second electronic device, for example, the first connection message is displayed on the second electronic device; the first connection message is used for prompting a user that the second electronic equipment establishes communication connection with the first electronic equipment. Similarly, a notification message may be displayed on the touch screen of the first electronic device to notify the user that the second electronic device is in communication with the first electronic device. For example, a first connection message is displayed on a first electronic device; the first connection message is used for prompting the user that the second electronic equipment establishes communication connection with the first electronic equipment.
For the process in which the first electronic device verifies the communication connection request in manner a3, refer to manner a1; details are not described herein again.
Optionally, a first response message may also be displayed on the display screen of the second electronic device, where the first response message is used to confirm that the first electronic device and the second electronic device have started data collaboration. That is, the first response message is used to notify the user that the first electronic device has turned on the data collaboration function and that the first electronic device has established a communication connection with the second electronic device. As shown in fig. 5d, a data collaboration control interface may also be displayed in the notification bar of the second electronic device, where the data collaboration control interface includes prompt information, and the prompt information may be: "Performing data collaboration with this device using HUAWEI P40". The data collaboration control interface 510 may also include a prompt that the first electronic device has established the connection (for example, the discovery name of the first electronic device is HUAWEI P40), and the prompt may be: "HUAWEI P40 connected". In this way, the user is informed that the first electronic device has turned on the data collaboration function and that the first electronic device has established a communication connection with the second electronic device. Similarly, a notification message may be displayed on the touch screen of the first electronic device to notify the user that the first electronic device has turned on the data collaboration function and that the second electronic device has established a communication connection with the first electronic device. For example, as shown in (a) of fig. 6b, a data collaboration control interface 710 is displayed in the notification bar of the first electronic device, and the data collaboration control interface 710 includes prompt information of the data collaboration service, where the prompt information may be: "Performing data collaboration with HUAWEI MATEBOOK using this phone". The data collaboration control interface 710 may also include a prompt that the second electronic device has established the connection (for example, the discovery name of the second electronic device is HUAWEI MATEBOOK), and the prompt may be: "HUAWEI MATEBOOK connected".
In the scenario in which the first electronic device establishes a communication connection with the second electronic device, the second electronic device may discover the first electronic device, in preparation for the user subsequently receiving, on the second electronic device, the collaboration data sent by the first electronic device.
Further, to prevent the user from having to operate the data collaboration function frequently in the process of establishing the communication connection between the first electronic device and the second electronic device, in a possible implementation, the second electronic device may preset a trigger condition for starting the data collaboration function; for example, when the first electronic device establishes the communication connection with the second electronic device and the second electronic device starts an application for document editing, the second electronic device automatically starts the data collaboration function.
Correspondingly, the first electronic device may also preset a trigger condition for turning on the data collaboration function; for example, after the first electronic device establishes a communication connection with the second electronic device, the first electronic device automatically turns on the data collaboration function. After the data collaboration is started, the first electronic device sends a first response message to the second electronic device, where the first response message is used by the second electronic device to confirm that the first electronic device and the second electronic device have started data collaboration. Optionally, the second electronic device may further display the first response message on the display screen, where the first response message may be displayed in the notification bar (for example, as shown in fig. 5d, "HUAWEI P40 connected") or in a pop-up window; for the specific display manner, refer to the display manner of the first notification message, which is not described herein again.
In another possible implementation, considering that the communication connection is used to implement the data collaboration function, the user may set, on the data collaboration control interface of the second electronic device, that the first electronic devices to be searched for are the electronic devices that have turned on the data collaboration function. In this way, the second electronic device establishes a communication connection only with a first electronic device that has turned on the data collaboration function, so as to improve the security of the communication connection. Illustratively, when the second electronic device determines that the first electronic device has turned on the data collaboration function, the second electronic device establishes the communication connection with the first electronic device.
Accordingly, the first electronic device may preset to start the data cooperation function.
The following illustrates manners A1 to A5 of turning on the data collaboration function.
In manner A1, the second electronic device may send a data collaboration request to the first electronic device in response to a data collaboration request operation acting on the discovered first electronic device. The first electronic device displays the data collaboration control interface in response to a confirmation operation acting on the data collaboration request, and confirms to turn on the data collaboration function in response to an operation acting on the data collaboration control interface.
In this embodiment of the present application, the data collaboration function of the first electronic device may be implemented as a sub-function of the document editing service function in the first electronic device; for example, the data collaboration function in a Huawei mobile phone may be implemented as a sub-function of the Huawei Share function of the mobile phone. In this case, referring to fig. 6a, the user operates on the collaboration control interface displayed on the touch screen of the mobile phone to set the data collaboration function of the first electronic device. For example, the data collaboration control interface 610 may include a switch button 620 of the Huawei Share function and a switch button 630 of the data collaboration service, and may also include the document editing white list of the second electronic device that triggers the data collaboration function and a second electronic device list corresponding to trusted second electronic devices.
The main function of the switch button 620 of the Huawei Share function is to turn the Huawei Share function on or off. The user may operate the switch button 620 of the Huawei Share function; upon receiving the user's operation on the switch button 620 of the Huawei Share function, the mobile phone may invoke the relevant system service interface to turn the Huawei Share function on or off, and may switch the state of the switch button 620 of the Huawei Share function (for example, from the off state to the on state) in response to the user's operation. The data collaboration control interface 610 may further include a real-time status identifier (not shown in fig. 6a) of the Huawei Share function. Similarly, the main function of the switch button 630 of the data collaboration service is to turn the data collaboration service on or off. The user may operate the switch button 630 of the data collaboration service; after receiving the operation acting on the switch button 630 of the data collaboration service, the mobile phone may invoke the relevant system service interface to turn the data collaboration service on or off, and may switch the state of the switch button 630 of the data collaboration service (for example, from the off state to the on state) in response to the user operation. The data collaboration control interface 610 may further include a real-time status identifier (not shown in fig. 6a) of the data collaboration service.
In a possible embodiment, as shown in fig. 6b (a), the user operates a collaboration control 720 ("input") of the data collaboration function in a first prompt box 710 in the notification bar displayed on the touch screen of the mobile phone, so as to start the data collaboration function of the first electronic device. The notification bar may be an interface displayed on the touch screen of the mobile phone after the user performs a sliding operation on, for example, the home screen of the mobile phone.
Illustratively, in conjunction with fig. 3 and fig. 5a, when the user wants the first electronic device and the second electronic device to collaboratively compose text, the user may perform a touch operation on the collaboration switch button 502 of the data collaboration function in the collaboration control interface 501 displayed on the touch screen of the first electronic device. At this time, the application layer of the operating system running in the mobile phone can detect the operation of starting the data collaboration function. The application layer converts the detected operation into a request for starting the data collaboration function and then transfers the request to the application framework layer of the operating system running in the mobile phone. After receiving the request, the application framework layer invokes a service interface that communicates with the hardware abstraction layer of the operating system running in the mobile phone, so as to transfer the request for starting the data collaboration function to the hardware abstraction layer. After receiving the request for starting the data collaboration function, the hardware abstraction layer starts the data collaboration function of the mobile phone. For example, the hardware abstraction layer may start two daemon processes to turn on the data collaboration function of the mobile phone. The first daemon process can provide functions such as the data collaboration service and user permission verification, can open the document editing discovery service, and enables the notebook computer to receive text data files from the Linux kernel layer of the operating system in the mobile phone.
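For illustration only, the following sketch shows how the request for starting the data collaboration function could be passed from the application layer through the application framework layer down to the hardware abstraction layer; the interface and class names are assumptions made for this example and do not correspond to real Android framework APIs.

    // Illustrative sketch: passing the "turn on data collaboration" request down through the layers.
    interface HalDataCollaboration {                        // hardware abstraction layer
        fun enable(): Boolean                               // e.g. starts the two daemon processes
    }

    class DataCollaborationManagerService(                  // application framework layer
        private val hal: HalDataCollaboration
    ) {
        fun requestEnable(): Boolean = hal.enable()
    }

    class CollaborationSettingsUi(                          // application layer
        private val service: DataCollaborationManagerService
    ) {
        fun onCollaborationSwitchToggled(on: Boolean) {
            if (on && service.requestEnable()) {
                // e.g. switch the button state and show the collaboration status identifier
            }
        }
    }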
In some other embodiments of the present application, the cooperation switch button of the data cooperation function may be a physical button, and the button may be disposed on one surface of the first electronic device; when the physical key is pressed or toggled by the finger of the user (i.e., the first input), the first electronic device starts the data cooperation function, and when the physical key is pressed or toggled by the finger of the user again, the first electronic device may stop the data cooperation function.
In manner A2, the second electronic device establishes the communication connection with the first electronic device by default, and the first electronic device automatically starts the data collaboration function, so as to avoid disturbing the user.
For example, the first electronic device may turn on the data collaboration function in response to the user having set, on the data collaboration control interface, the data collaboration function to be turned on by default before the communication connection with the second electronic device is established.
In manner A3, the first electronic device may also automatically start the data collaboration function of the first electronic device when a specific condition is met (for example, S404 occurs, that is, the first electronic device receives a data collaboration request from the second electronic device, or the first electronic device meets a preset posture condition).
In manner A4, the first electronic device and the second electronic device may separately establish connections with the same Wi-Fi hotspot; when the first electronic device and the second electronic device trust each other, the first electronic device may determine that it can establish a communication connection with the second electronic device, and the first electronic device may automatically start the data collaboration function.
For example, after the first electronic device sets, on the data collaboration control interface, a trusted second electronic device with which a communication connection is to be established, the first electronic device starts the data collaboration function.
In manner A5, an identifier list of trusted first electronic devices may be displayed on an interface of the second electronic device. In response to a selection operation performed on the identifier list of the trusted first electronic devices, the first electronic device with which a communication connection is to be established is determined, and the second electronic device sends a data collaboration request to the first electronic device. The first electronic device, in response to a confirmation operation acting on the data collaboration request, confirms to turn on the data collaboration function and establishes the communication connection with the second electronic device.
In this embodiment of the application, after the data collaboration function is started, a collaboration status identifier may also be displayed on the touch screen of the first electronic device, where the collaboration status identifier is used to indicate that the data collaboration function has been started. For example, after the hardware abstraction layer turns on the data collaboration function, a notification message that the data collaboration function has been turned on may be sent to the application framework layer. After receiving the notification message, the application framework layer forwards the notification message to the application layer. After the application layer receives the notification message, the collaboration status identifier used to indicate that the data collaboration function has been started can be displayed on the touch screen of the mobile phone. For example, as shown in fig. 6c, the mobile phone may display a collaboration icon 740 in a status bar 730 displayed on the touch screen of the mobile phone to remind the user that the data collaboration function of the mobile phone has been turned on. The first electronic device may monitor the real-time status of the data collaboration service in the settings database by registering an observer; as the status of the data collaboration service changes, the real-time status identifier of the data collaboration service may be updated correspondingly in the control interface 610, or the real-time status identifier of the data collaboration service may be updated in the notification bar, which is not limited herein.
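For illustration only, the following sketch shows one way an observer could be registered to monitor the real-time status of the data collaboration service in the settings database; the settings key name "data_collaboration_on" is an assumption made for this example.

    // Illustrative sketch: monitor the real-time status of the data collaboration service by
    // registering an observer on a settings entry, and refresh the status identifier on change.
    import android.content.Context
    import android.database.ContentObserver
    import android.os.Handler
    import android.os.Looper
    import android.provider.Settings

    fun watchCollaborationStatus(context: Context, onChanged: (Boolean) -> Unit) {
        val uri = Settings.Secure.getUriFor("data_collaboration_on")   // assumed key name
        val observer = object : ContentObserver(Handler(Looper.getMainLooper())) {
            override fun onChange(selfChange: Boolean) {
                val on = Settings.Secure.getInt(context.contentResolver, "data_collaboration_on", 0) == 1
                onChanged(on)   // e.g. update the real-time status identifier in the control interface
            }
        }
        context.contentResolver.registerContentObserver(uri, false, observer)
    }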
In combination with the above embodiments, the data collaboration control interface 610 may also include a toggle button controlling visibility to the second electronic device. For example, the first electronic device may receive the data collaboration request of the second electronic device only when the switch button 630 of the data collaboration service and the toggle button controlling visibility to the second electronic device establishing the communication connection are both turned on.
Optionally, the data collaboration control interface 610 may further include a pop-up box (not shown in fig. 6a) for switching the visible range, where the pop-up box is used by the user to switch the visible range of the data collaboration. The user can operate in a notification option of a settings interface or in the notification bar to trigger the display of the pop-up box on the touch screen of the mobile phone. When the state of the data collaboration service changes, if the pop-up box is being displayed, the pop-up box disappears. In some embodiments, in a case in which the user selects the visible range to be "all devices", the mobile phone may display a prompt pop-up window to remind the user that it is risky to open the data collaboration service when the mobile phone is connected to an open wireless access point or another electronic device is allowed password-free access. In addition, when the user starts the data collaboration service, the mobile phone can also verify the permissions of the user, for example, whether the data collaboration service is allowed to be started and/or whether the Huawei account is allowed to be acquired.
In some other embodiments of the present application, the user may further set, in the collaboration control interface, a discovery name of the first electronic device when the first electronic device serves as a collaboration server, a range of collaboration data that can be accessed by the collaboration client, a user name and a password for login authentication, and the like.
Illustratively, as shown in fig. 3 and fig. 5a to 5d, taking the user name and password for login authentication that the user wants to set as an example, the data collaboration control interface 510 further includes a computer access verification option. The user may operate on the computer access verification option in the data collaboration control interface 510. At this point, the application layer may detect the above operation. The application layer converts the operation into a request for setting the user name and password for login authentication and then transfers the request to the application framework layer. After receiving the request, the application framework layer provides the application layer with the function of setting the user name and password for login authentication. At this time, the application layer may, in response to the above operation and according to the function provided by the application framework layer, display a user name and password setting window on the touch screen of the mobile phone. The user may enter the corresponding user name and password in the setting window and click the OK button after the entry is complete. According to the click operation acting on the OK button, the application layer transfers the user name and password input by the user to the application framework layer. After receiving the user name and password, the application framework layer invokes the service interface that communicates with the hardware abstraction layer, so as to transfer the received user name and password to the hardware abstraction layer. After receiving the user name and password, the hardware abstraction layer stores them in the password file of the document editing discovery service, so as to complete the setting of the user name and password for login authentication. In some other embodiments of the present application, the user name and password for login authentication may be stored in the password file of the document editing discovery service in the hardware abstraction layer, and a copy of the user name and password may also be stored in the application framework layer, so that when the validity of the user name and password input by the user on the computer is verified, the legal user name and password can be queried from the copy stored in the application framework layer, thereby reducing interaction between layers. In some other embodiments of the present application, the user may not set the user name and password for login authentication, in which case the user name and password for login authentication adopt a default configuration. In some other embodiments of the present application, for the setting of the user name and password for login authentication, if the user deletes the default user name and password for login authentication and does not set new ones, the second electronic device can automatically access the first electronic device without inputting a user name and password. In addition, in some embodiments of the present application, the first electronic device may further provide an interface for the user to select whether the second electronic device needs to be verified before access, so as to determine whether to enable the password-free access function.
For example, taking the discovery name that the user wants to set for the mobile phone as the collaboration server as an example, as shown in fig. 5a, the user may operate the computer name option (the first electronic device list) in the data collaboration control interface 510. At this point, the application layer may detect the above operation. The application layer converts the operation into a request for setting the discovery name and then transfers the request to the application framework layer. After receiving the request, the application framework layer provides the application layer with the function of setting the discovery name. At this time, the application layer may, in response to the above operation and according to the function provided by the application framework layer, display a setting window of the discovery name on the touch screen of the mobile phone. The user may input the corresponding name in the setting window of the discovery name and click the OK button after the input is completed. According to the click operation acting on the OK button, the application layer transfers the name input by the user to the application framework layer. The application framework layer receives the name and invokes the service interface that communicates with the hardware abstraction layer to transfer the received name to the hardware abstraction layer. After receiving the name, the hardware abstraction layer stores the received name in the data collaboration configuration file, so as to complete the setting of the discovery name. In some embodiments, before the user clicks the OK button, the mobile phone may further perform validity detection on the discovery name input by the user; the user may operate the OK button only when the discovery name input by the user is detected to be legal, and if the discovery name input by the user is detected to be illegal, a prompt box may be displayed to prompt the user that the discovery name is illegal. In some other embodiments of the present application, the discovery name may be stored in the data collaboration configuration file of the hardware abstraction layer, and a copy of the discovery name may also be stored in the application framework layer, so that when the discovery name needs to be queried subsequently, it can be queried from the copy stored in the application framework layer, thereby reducing interaction between layers. In some other embodiments of the present application, the user may not set the discovery name, in which case the discovery name adopts a default configuration. For example, as shown in (a) of fig. 5c, the default discovery name of the mobile phone as the collaboration server is: HUAWEI Phone 1. If the user does not modify the discovery name, the name of the collaboration server found on the notebook computer side is this default discovery name.
In this embodiment of the application, when the user sets the data cooperation function for the first time, the first electronic device may display a series of guide interfaces to guide the user to set the data cooperation function, and the first electronic device may further store the setting preference of the user, so that the cooperation function may be set according to the setting preference of the user next time.
It should be noted that S401 and S402 are not limited in sequence. In a specific implementation process, S401 and S402 may be performed simultaneously, or S402 may be performed first and S401 performed later, or S401 may be performed first and S402 performed later, which is not limited herein. The step of starting the data collaboration function may be performed simultaneously with the establishment of the network connection or may be performed after S404, and may be specifically set according to the scenario. The specific embodiment performed after S404 is described in detail below.
S403, in response to a first operation acting on the document editing application, the second electronic device sends a first notification message to the first electronic device.
As shown in fig. 7a, the second electronic device may determine that the second electronic device is in the document editing scenario according to the display focus of the second electronic device being located in the document editing box 720 in the document editing application 710, or may determine that the second electronic device is in the document editing scenario according to the display focus of the second electronic device being located on the document editing application 710.
In one possible implementation, the second electronic device may send the first notification message to the first electronic device upon determining that the second electronic device is in a document editing scenario.
In another possible implementation manner, the second electronic device may send a first notification message to the first electronic device when it is determined that the second electronic device triggers the data collaboration scenario. Wherein the second electronic device may determine to trigger the data collaboration scenario in response to an operation acting on the document editing application. In combination with the manner 1 and the manner 2, in response to the user selecting the text editing box of the document editing application and the operation of triggering the data collaboration scenario, the second electronic device determines to trigger the data collaboration scenario. The second electronic device may determine to trigger the data collaboration scenario in response to an operation of a user selecting a preset type of text edit box. Thus, the second electronic device may send a first notification message to the first electronic device to establish a data collaboration scenario.
The first notification message sent by the second electronic device may further include an identifier of the document editing software at the focus of the second electronic device, so as to ask the user whether the first electronic device is required to collaboratively input text into the document editing software at the focus of the second electronic device.
In some possible embodiments, the document editing application may include only one window (interface). In other possible embodiments, the document editing application may include multiple windows (interfaces), such as in the case of multiple application windows, a floating window, and so on. Considering the scenario in which multiple application interfaces can be displayed simultaneously on the second electronic device, the second electronic device may include a plurality of editing interfaces. As shown in fig. 7b, the currently opened applications of the second electronic device include the document editing application and application 1, where the document editing application opens two interfaces, including an interface 710 for document 1 and an interface 712 for document 2. The interface 710 includes an edit box 720 for document editing, and the interface 712 includes an edit box 722 for document editing. Application 1 opens one interface 711, and application 1 includes an edit box 721 in which document editing is possible. The second electronic device may determine that it is in the document editing scenario when the user's focus is switched to a document editing application (for example, the interface for document 1 shown in fig. 7b), or when the user opens a document editing application (for example, the interface of document 1 or the interface of document 2 shown in fig. 7b). Thus, the second electronic device may transmit notification messages of the plurality of document editing applications to the first electronic device, where the notification messages may be used to notify the first electronic device of the plurality of document editing boxes of the plurality of document editing applications of the second electronic device. After receiving the notification messages, the first electronic device displays a plurality of edit preview interfaces of the plurality of document edit boxes on the data collaboration interface (for example, the edit preview interface in the third control shown in (b) of fig. 8a or the edit preview interfaces correspondingly displayed in the fourth control shown in (c) of fig. 8a), and sends a switching message to the second electronic device in response to a switching operation acting on the plurality of edit preview interfaces, where the switching operation is used to indicate a second document editing box to be switched to. After receiving the switching message sent by the first electronic device, the second electronic device switches the focus of the second electronic device to the second document editing box. In this way, the second electronic device can receive third text data sent by the first electronic device and display the third text data in the second document editing box.
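For illustration only, the following sketch shows, from the second electronic device side, how the plurality of document editing boxes could be reported in the notification message and how the focus could be switched when the switching message arrives; the message shapes and names are assumptions made for this example.

    // Illustrative sketch (second electronic device side): report the editable document edit
    // boxes to the first electronic device, and switch focus when a switching message arrives.
    data class EditBoxInfo(val interfaceId: String, val title: String)     // e.g. "doc1", "Document 1"
    data class EditScenarioNotification(val editBoxes: List<EditBoxInfo>)
    data class SwitchMessage(val targetInterfaceId: String)

    class DocumentCollaborationClient(
        private val sendNotification: (EditScenarioNotification) -> Unit,
        private val moveFocusTo: (String) -> Unit
    ) {
        fun onDocumentEditingDetected(openEditBoxes: List<EditBoxInfo>) {
            sendNotification(EditScenarioNotification(openEditBoxes))      // first notification message
        }

        fun onSwitchMessage(message: SwitchMessage) {
            moveFocusTo(message.targetInterfaceId)                         // focus moves to the second edit box
        }
    }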
S404, the data cooperation service of the first electronic device receives the first notification message and displays a first prompt box on the first electronic device.
The first prompt box may be displayed in the notification bar; for example, as shown in (a) of fig. 6b, the first prompt box 710 is displayed in the notification bar. Alternatively, the first prompt box may also be displayed on the main interface of the first electronic device; for example, as shown in (b) of fig. 6b, the first prompt box 730 is displayed on the main interface of the first electronic device. The first prompt box may include first prompt information to display the first notification message. As shown in (a) and (b) of fig. 6b, the identifier of the second electronic device is HUAWEI MATEBOOK. The prompt information may be: "Performing data collaboration with HUAWEI MATEBOOK using this phone".
One possible implementation is to display the first control on the first prompt box in response to an operation, such as a slide operation, a check operation, etc., acting on the first prompt box.
As shown in fig. 6b (a), a first control 711 included in the first prompt box 710 may be a confirmation control for prompting the user to click on the first control to enter the data collaboration display interface. An ignore control 712 may also be included in the first prompt box 710 for the user to close the first prompt box 710. As shown in fig. 6b (b), a first control 731 included in the first prompt box 730 may be a confirmation control for prompting the user to click the first control to enter the data collaboration display interface. An ignore control 732 may also be included in the first prompt box 730 for the user to close the first prompt box 730.
Further, the first control may also be a cooperative switch of the data cooperation function. The switch may be used to turn the data collaboration function on or off. For example, as shown in fig. 6a, the first prompt box may be a collaboration control interface popped up by the first electronic device, and the collaboration switch button of the data collaboration function may be included in the collaboration control interface, for example, in some embodiments, the first operation may be an operation of the collaboration switch button 630 of the data collaboration function by the user in the data collaboration control interface 610 displayed on the touch screen of the mobile phone. In response to the operation of the cooperation switch button 630 acting on the data cooperation function, the state of the cooperation switch button 630 of the data cooperation function may be changed, such as being changed from the off state to the on state. And when the cooperation switch button of the data cooperation function is in an opening state, the first electronic equipment confirms that the data cooperation scene is entered.
When the user operates the first control to start the data cooperation function, the first electronic device displays a data cooperation interface. When the user performs an operation of closing the data cooperation function on the first control, the first electronic device displays a notification message of closing the data cooperation function, and a display mode of the notification message may refer to a display mode of the first prompt box, which is not described herein again. The operation of the cooperation switch of the data cooperation function of the first electronic device by the user may include: clicking, sliding, double clicking, multi-gesture operation or voice control, etc., without limitation.
S405, in response to a first user operation acting on the first control, a data collaboration interface pops up.
Optionally, the first electronic device may further execute S405a to send a first response message to the second electronic device, where the first response message is used to confirm that the data cooperation function of the first electronic device and the second electronic device is turned on. Therefore, after the second electronic device receives the first response message, the data cooperation service can be displayed in the notification bar to confirm that the first electronic device and the second electronic device establish data cooperation.
In response to the confirmation operation acting on the first control, the first electronic device pops up the data collaboration interface on the display screen. In one possible implementation, the data collaboration interface may be displayed on the first electronic device in the form of a first display frame.
As shown in (a) of fig. 8a, the first display frame 810 may include at least one of: a second prompt, a second control 820.
The second prompt information may be used to prompt that the data collaboration scenario has started and that the user may perform voice input.
The second control 820 may include a collaboration operation button of the data collaboration function. The operation button is used to start or close functions such as voice collection, text entry, and text operation by the user.
Further, considering that the second electronic device may have opened interfaces of a plurality of document editing applications, the first display frame 810 may further include a third control. As shown in (b) of fig. 8a, the third control 830 may display edit preview interfaces of the plurality of document editing boxes of the second electronic device, that is, small icons (including document 1, document 2, and application 1) of the interfaces of the plurality of document editing applications opened by the second electronic device, for the user to select the interface of the document editing application corresponding to the collaboratively authored document. For example, in response to an operation on the icon of document 1, the second electronic device switches to the document editing application of document 1 for data collaboration with the first electronic device. In response to an operation on the icon of document 2, the second electronic device switches to the document editing application of document 2 for data collaboration with the first electronic device. In response to an operation on the icon of application 1, the second electronic device switches to the document editing box of application 1 for data collaboration with the first electronic device. In the switching process, the first electronic device may send the corresponding interface identifier to the second electronic device when determining the small icon selected by the user, so as to implement real-time switching. In another possible manner, after the first electronic device determines the text to be sent, the first electronic device sends the interface identifier to the second electronic device at the same time as the text to be sent, so that, after receiving the text to be sent and the interface identifier, the second electronic device switches the focus to that interface and then displays the text to be sent on that interface.
Considering that the second electronic device may open at least one editable document editing box in the interface of the plurality of document editing applications, the first electronic device may determine the plurality of document editing applications of the second electronic device by receiving notification messages of the plurality of document editing applications of the second electronic device. Thereby, a plurality of edit preview interfaces of the plurality of document editing applications are displayed on the data collaboration interface. The notification message may also be used to notify the first electronic device of the plurality of document editing boxes of the second electronic device so that the first electronic device determines the plurality of document editing boxes of the plurality of document editing applications of the second electronic device. Thereby, a plurality of edit preview interfaces of the plurality of document edit boxes are displayed on the data collaboration interface.
In response to a switching operation acting on the plurality of edit preview interfaces, a switching message is sent to the second electronic device, where the switching operation is used to indicate the second document editing box to be switched to, and the switching message is used to instruct the second electronic device to switch the focus of the second electronic device to the second document editing box. For example, the current focus of the second electronic device is on a first document editing box of a first document editing application, and after receiving the switching message, the second electronic device may switch the focus from the first document editing box of the first document editing application to the second document editing box of a second document editing application. At this time, if the first electronic device collects second voice data, the first electronic device converts the second voice data into third text data, displays the third text data on the data collaboration interface, and sends the third text data to the second electronic device in response to a second user operation. Correspondingly, the second electronic device receives the third text data sent by the first electronic device, and the third text data can be displayed in the second document editing box.
In one possible embodiment, multiple edit preview interfaces can be displayed within the first display frame, e.g., the first display frame can further include a fourth control, as shown in fig. 8a (c), the fourth control 840 can be a button of a next focus interface (e.g., an edit box), and in response to an operation acting on the fourth control 840, focus in the second electronic device can be moved to the next editable edit box. Further, the fourth control 840 may further include an edit preview interface of the focus interface in the second electronic device, so as to ensure that the focus interface currently selected by the user is the focus interface desired by the user.
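For illustration only, the following sketch shows, from the first electronic device side, how the edit preview interfaces could be presented and how the switching message (or the interface identifier bundled with the text to be sent) could be issued; all names are assumptions made for this example.

    // Illustrative sketch (first electronic device side): show the edit preview entries and either
    // send a switching message immediately or bundle the interface identifier with the text.
    data class EditPreview(val interfaceId: String, val title: String)
    data class OutgoingText(val interfaceId: String?, val text: String)

    class CollaborationPanel(
        previews: List<EditPreview>,
        private val sendSwitch: (String) -> Unit,
        private val sendText: (OutgoingText) -> Unit
    ) {
        private var selected: EditPreview? = previews.firstOrNull()

        fun onPreviewSelected(preview: EditPreview) {
            selected = preview
            sendSwitch(preview.interfaceId)       // real-time switching of the remote focus
        }

        fun onTextConfirmed(text: String) {
            // Alternative: send the interface identifier together with the text, so the second
            // electronic device first switches focus and then displays the text.
            sendText(OutgoingText(selected?.interfaceId, text))
        }
    }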
S406, in response to a first operation acting on the first display frame, the first electronic device receives first voice data of the user and displays first text data on the data collaboration interface.
The operation of the cooperative operation button of the data cooperation function of the first electronic device by the user may include: clicking, sliding, double clicking, multi-gesture operation or voice control, etc., without limitation.
The user may make a voice input based on the second control 820 in the first display box.
As shown in (a) - (c) of fig. 8a, the voice input mode may be a mode of pressing the second control for a long time, and when the user presses the second control, the voice signal starts to be received; and when the user releases the second control, stopping receiving the voice signal, thereby completing one voice input. Or clicking the second control to trigger the starting of the voice input function, and displaying the current voice input state in the first display frame. And stopping receiving the voice signal in response to the operation that the user clicks the second control again, thereby completing one voice input.
The second control 820 may invoke a speech recognition application to recognize the input speech, in this embodiment of the application, the speech recognition application invoked by the second control may be a default speech recognition application of the first electronic device, or may also be a third-party application set in a speech recognition setting interface by the user, which is not limited herein.
The first voice data is recognized as initial text data, and the initial text data is displayed on the data collaboration interface. In one possible manner, as shown in (a) of fig. 8b, the first display frame may further include a preview interface 850. The preview interface 850 is used to display the converted initial text data.
The display may occur in various ways. In one possible mode, the received voice data is recognized and converted into text while the voice is being input, and the converted text is displayed on the preview interface in the first display frame. In another possible mode, after one voice input is completed, the input voice data is recognized and the recognized text is displayed on the preview interface. Furthermore, error correction may be performed on the recognized text, and the error correction process may be displayed on the preview interface at the same time.
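Purely as an illustration of the two display modes, the sketch below assumes a hypothetical recognizer callback interface with partial (streaming) and final (batch) results; it does not reflect any specific speech recognition API.

```kotlin
// Hypothetical recognizer callbacks illustrating the two display modes:
// streaming (update preview as speech arrives) and batch (show text when done).
interface RecognitionListener {
    fun onPartialResult(text: String)  // streaming mode: update preview continuously
    fun onFinalResult(text: String)    // batch mode: display text after one voice input
}

class PreviewInterface {
    var shownText: String = ""
        private set
    fun show(text: String) { shownText = text }
}

fun attachPreview(preview: PreviewInterface): RecognitionListener =
    object : RecognitionListener {
        override fun onPartialResult(text: String) = preview.show(text) // live update
        override fun onFinalResult(text: String) = preview.show(text)   // final text
    }

fun main() {
    val preview = PreviewInterface()
    val listener = attachPreview(preview)
    listener.onPartialResult("meeting at")      // preview updates while speaking
    listener.onFinalResult("meeting at 3 pm")   // final recognized text
    println(preview.shownText)
}
```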
S407, in response to a confirmation operation acting on the first display frame, sending the first text data to the second electronic device.
The confirmation operation acting on the first display box may be to confirm that the text of the preview interface is the text data to be sent.
Illustratively, as shown in (b) of fig. 8b, the first display frame 810 may further include an edit control 860, and the edit control 860 may be a confirmation button for confirming that the text of the preview interface is the text to be sent by the user.
Before confirming the text to be sent, the user may also edit the text in the preview interface. In this case, the data collaboration interface may include an editing control for editing the text in the preview interface. The first electronic device, in response to a third user operation acting on the editing control, determines an editing instruction for the initial text data and displays the edited initial text data on the data cooperation interface; in response to the second user operation, the first electronic device takes the edited initial text data as the first text data. The editing may be performed in various ways, exemplified by modes c1-c3 below.
Mode c1: the preview interface 850 may be an editable interface in which a cursor is displayed to indicate the position where text is currently inserted. In response to a cursor operation by the user on the editable interface, e.g., moving the cursor or selecting at least one piece of text, the position of the cursor or the selected text is displayed on the preview interface 850. In response to an operation acting on the voice confirmation control, voice data input by the user is received and recognized, and the resulting text is inserted or updated at the cursor position or at the position of the selected text.
Mode c2: the text in the preview interface 850 may be selected by the user, and in response to a selection operation on the text, an editing control is displayed, which may include editing sub-controls such as delete, insert, line break, backspace, copy, and paste. The text is edited in response to a click operation acting on the editing control, for example, to wrap the input text, add punctuation, or modify the text content.
Mode c3: the preview interface 850 may be an editable interface, and a keyboard input box is displayed in response to the user clicking the editable interface. In response to an operation on the keyboard input box, the text entered by the user is received and displayed on the preview interface, thereby editing the text.
The text data sent by the first electronic device to the second electronic device may also be rich text data. For example, the first text data transmitted by the first electronic device may include: initial text data and an editing instruction for the initial text data. In one possible implementation, the rich text data includes a field for a rich text editing operation. The text data may satisfy a preset protocol. For example, as shown in fig. 9a, the protocol fields may include a Cmd identifier of the protocol, fields corresponding to the protocol requester and the protocol responder, and a text data field. A Cmd identifier of 0x01 may indicate that the protocol requester sends a first notification message to the responder to establish a data collaboration scenario. A Cmd identifier of 0x02 may indicate that the protocol requester sends a second notification message to the responder to notify it of a data collaboration interruption. A Cmd identifier of 0x03 may indicate that the protocol requester sends text data to the responder, where the text data includes only text and no rich text editing instruction. A Cmd identifier of 0x04 may indicate that the protocol requester sends text data to the responder, where the text data includes both text and a rich text editing instruction. The fields corresponding to the protocol requester and the protocol responder may occupy 4 bits; for example, 0x01 indicates that the second electronic device sends a text input request to the first electronic device, and 0x81 indicates that the first electronic device sends a text input response to the second electronic device; 0x02 indicates that the first electronic device sends a text input request to the second electronic device, and 0x82 indicates that the second electronic device sends a text input response to the first electronic device.
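To make the field layout above concrete, here is a minimal, non-normative Kotlin sketch that packs a Cmd identifier, a requester/responder field, and a UTF-8 text payload into a byte array; the exact byte order, any field widths beyond those stated, and the class names are assumptions for illustration only.

```kotlin
// Hypothetical sketch of the preset protocol described above.
// Cmd values follow the text: 0x01 establish, 0x02 interrupt,
// 0x03 plain text, 0x04 text plus rich-text editing instruction.
data class CollabMessage(val cmd: Int, val peerField: Int, val text: String)

fun encode(msg: CollabMessage): ByteArray {
    val payload = msg.text.toByteArray(Charsets.UTF_8)
    val out = ByteArray(2 + payload.size)
    out[0] = msg.cmd.toByte()        // Cmd identifier field
    out[1] = msg.peerField.toByte()  // requester/responder field (e.g. 0x01, 0x81)
    payload.copyInto(out, 2)         // text data field
    return out
}

fun decode(bytes: ByteArray): CollabMessage =
    CollabMessage(
        cmd = bytes[0].toInt() and 0xFF,
        peerField = bytes[1].toInt() and 0xFF,
        text = String(bytes, 2, bytes.size - 2, Charsets.UTF_8)
    )

fun main() {
    // First device sends plain text (Cmd 0x03) as a text input response (0x81).
    val wire = encode(CollabMessage(cmd = 0x03, peerField = 0x81, text = "hello"))
    println(decode(wire))
}
```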
S408, the second electronic device receives the first text data and displays the first text data in a first document editing box of the first document editing application.
After receiving the first text data, the second electronic device parses the text data field in the first text data at the application layer and sends the text data field to the kernel layer, and the kernel layer stores the text data field in the keyboard buffer, so that the text data field is displayed in the first document editing frame of the first document editing application through the keyboard buffer.
The manner in which the application layer transmits the text data field and other fields of the text data to the kernel layer and stores them in the keyboard buffer is illustrated below in modes C1-C3, as shown in fig. 9b; a simplified code sketch follows these modes.
In mode C1, the text data field and other fields of the text data parsed from the protocol are converted into UNICODE keyboard codes at the application layer, and the converted UNICODE keyboard codes are sent to the keyboard buffer at the kernel layer.
In mode C2, a keyboard driver is added to the application layer, and the keyboard driver parses the text data field and other fields of the text data to generate UNICODE keyboard codes. The UNICODE keyboard codes generated by the keyboard driver are sent to the keyboard buffer through the interface between the keyboard driver and the keyboard buffer. This mode can improve the success rate of parsing the text data field and other fields of the text data.
In mode C3, an input method interface of a third-party application is added to the application layer, and the first text data is transmitted through the input method interface to the keyboard driver corresponding to the third-party application, so that this keyboard driver parses the text data field and other fields of the text data, generates the UNICODE keyboard codes, and inputs them into the keyboard buffer. It should be noted that this mode requires the second electronic device to load the third-party application, so that the text data is stored in the keyboard buffer by the keyboard driver of the third-party application.
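The sketch below, corresponding roughly to mode C1, converts a parsed text field into Unicode code points and enqueues them into a keyboard-buffer-like queue; the buffer class and the simplification of "UNICODE keyboard codes" to raw code points are assumptions made for illustration, not the embodiment's actual kernel interface.

```kotlin
import java.util.ArrayDeque

// Hypothetical sketch of mode C1: the application layer turns the parsed
// text field into Unicode code points and hands them to a keyboard buffer,
// from which a document editing frame consumes characters.
class KeyboardBuffer {
    private val codes = ArrayDeque<Int>()
    fun push(codePoint: Int) = codes.addLast(codePoint)
    fun drainToText(): String {
        val sb = StringBuilder()
        while (codes.isNotEmpty()) sb.appendCodePoint(codes.removeFirst())
        return sb.toString()
    }
}

fun injectTextField(textField: String, buffer: KeyboardBuffer) {
    // Simplification: treat each Unicode code point as one keyboard code.
    textField.codePoints().forEach { buffer.push(it) }
}

fun main() {
    val buffer = KeyboardBuffer()
    injectTextField("meeting notes draft", buffer)  // parsed text data field
    println(buffer.drainToText())                   // shown in the edit box
}
```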
S409, the first electronic device determines that the second electronic device has successfully received the first text data, deletes the first text data, and removes the first text data from the display interface of the first display frame.
In some embodiments of the present application, the first electronic device may determine that the second electronic device successfully receives the first text data by receiving the first confirmation message sent by the second electronic device. The first confirmation message is used for indicating that the second electronic equipment successfully receives the first text data.
In some other embodiments of the present application, after it is determined that the data cooperation scenario between the second electronic device and the first electronic device has ended, the data cooperation function of the first electronic device may be turned off. In one possible implementation, the user actively turns off the data cooperation function of the first electronic device. In this case, the application layer detects the closing operation on the data collaboration interface, converts the operation into a request for closing the data cooperation function, and transmits the request to the application framework layer. After receiving the request, the application framework layer calls the service interface that communicates with the hardware abstraction layer, so as to transmit the request for closing the data cooperation function to the hardware abstraction layer. After receiving the request, the hardware abstraction layer closes the data cooperation function of the first electronic device.
In another possible implementation, the first electronic device may dynamically monitor the communication connection state between the first electronic device and the second electronic device, or the connection state between the first electronic device and the wireless access point. If the first electronic device detects that it has disconnected from the second electronic device or from the wireless access point, it may automatically turn off the data cooperation function to save power. In addition, the first electronic device may display a notification message on its touch screen to prompt the user that the data cooperation function of the local device has been automatically turned off because the local device has disconnected from the second electronic device or from the wireless access point.
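As an illustrative sketch of this monitoring behavior (names such as ConnectionMonitor and the callback shape are assumptions, not part of the embodiment), the collaboration function could be tied to a connection-state listener like this:

```kotlin
// Hypothetical sketch: automatically turning off the data collaboration
// function when the link to the peer device or the access point is lost.
class DataCollaboration {
    var enabled = false
        private set
    fun turnOn() { enabled = true }
    fun turnOff(reason: String) {
        enabled = false
        println("Data collaboration turned off: $reason")  // shown as a notification
    }
}

class ConnectionMonitor(private val collaboration: DataCollaboration) {
    fun onPeerStateChanged(connected: Boolean) {
        if (!connected && collaboration.enabled) {
            collaboration.turnOff("disconnected from the second electronic device")
        }
    }
    fun onAccessPointStateChanged(connected: Boolean) {
        if (!connected && collaboration.enabled) {
            collaboration.turnOff("disconnected from the wireless access point")
        }
    }
}

fun main() {
    val collab = DataCollaboration().apply { turnOn() }
    val monitor = ConnectionMonitor(collab)
    monitor.onPeerStateChanged(connected = false)  // link to second device drops
    println(collab.enabled)                        // false
}
```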
While text is being processed cooperatively, the second electronic device may need to jump to another application to handle other matters, which interrupts the data collaboration scenario. In the embodiment of the present application, as shown in fig. 10a, the method for data collaboration among multiple devices may further include the following steps:
S501, the second electronic device responds to a second switching operation on an application and displays the switched interface on the second electronic device. The second switching operation switches the focus of the second electronic device out of a third document editing application, where the third document editing application is the document editing application in use when the data cooperation between the first electronic device and the second electronic device is interrupted. The data collaboration service of the second electronic device thereby determines that the second electronic device exits the data collaboration scenario.
In one possible implementation, the second switching operation on the application may be that the user jumps to another application, which causes the focus of the second electronic device to be switched into the display interface of the other application. The display interface of the other application may be an interface of an application newly opened by the user, or may be an interface of an application already opened by the user, and is not limited herein.
In another possible implementation, the second switching operation on the application may be displaying a notification message on the second electronic device, or receiving a popup window of another application (e.g., an instant messaging application, a video communication application, etc.), which causes the focus of the second electronic device to switch to a display interface of the other application.
S502, the second electronic device sends a second notification message to the first electronic device.
The second notification message is used to notify the first electronic device that the focus of the second electronic device has been switched, so as to notify the first electronic device that the data cooperation with the second electronic device is interrupted.
Further, the data collaboration service of the second electronic device may store the third document editing frame of the interrupted document editing application (the third document editing application). When the data collaboration scenario is restored, the second electronic device can then directly display the third document editing frame of the third document editing application according to the stored frame, or move its focus to that frame, thereby increasing the response speed of restoring the data collaboration scenario.
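A minimal sketch of this save-and-restore idea follows; the session holder and its fields are hypothetical names used only to illustrate why restoring from a saved frame is faster than searching for it again.

```kotlin
// Hypothetical sketch: remembering the interrupted editing frame so the
// collaboration scenario can be restored without searching for it again.
data class SavedFrame(val appName: String, val frameId: Int)

class CollaborationSession {
    private var interruptedFrame: SavedFrame? = null

    fun onInterrupted(appName: String, frameId: Int) {
        interruptedFrame = SavedFrame(appName, frameId)   // store before leaving
    }

    // Returns the frame to focus when collaboration resumes, if one was saved.
    fun onResumed(): SavedFrame? = interruptedFrame
}

fun main() {
    val session = CollaborationSession()
    session.onInterrupted("ThirdDocApp", frameId = 3)     // user jumps to another app
    println("Restore focus to: ${session.onResumed()}")   // fast path on resume
}
```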
S503, the first electronic device receives the second notification message, and displays a second prompt box on the first electronic device.
The second prompt box is used for prompting the user that the focus of the second electronic equipment is switched.
As shown in (a) of fig. 10b, the second prompt box 1010 may include third prompt information, which may be: "The HUAWEI MATEBOOK data collaboration scenario has been interrupted". Further, the second prompt box 1010 may also include a control 1020 for stopping input, so that, in response to an operation on the control 1020 for stopping input, the first electronic device closes the data collaboration interface.
S504, the second text data that has not been sent on the preview interface is saved.
In one possible implementation, the second text data may be stored directly in a cache region of the data collaboration service; alternatively, a file corresponding to the document editing application of the second electronic device selected by the user may be established for the second text data, so that the second text data is cached in the corresponding file.
In some embodiments of the present application, the file may be dynamically updated. For example, the application framework layer may monitor the text data in the preview interface of the first electronic device and notify the hardware abstraction layer when it detects that the text data in the preview interface has changed, so that the hardware abstraction layer updates the corresponding file.
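The following sketch shows one way the preview text could be observed and the cached copy refreshed on every change; the observer shape and the in-memory "file" are assumptions for illustration and stand in for the framework-layer monitoring and hardware-abstraction-layer update described above.

```kotlin
// Hypothetical sketch: keep a cached copy of the unsent preview text
// up to date whenever the preview content changes.
class CachedFile {
    var content: String = ""
        private set
    fun update(newContent: String) { content = newContent }  // stands in for a file write
}

class PreviewText(private val onChange: (String) -> Unit) {
    var text: String = ""
        set(value) {
            field = value
            onChange(value)          // notify the layer that maintains the cache
        }
}

fun main() {
    val cache = CachedFile()
    val preview = PreviewText { cache.update(it) }
    preview.text = "Agenda: review Q3"      // user edits the preview
    preview.text = "Agenda: review Q3 plan" // cache follows each change
    println(cache.content)
}
```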
Further, the first electronic device may close the preview interface.
In this way, when a first condition is satisfied, the second electronic device displays the third document editing frame of the third document editing application on the second electronic device. The first condition indicates that the first electronic device and the second electronic device restore data collaboration. The first condition may take various forms; for example, the restoration of data collaboration may be triggered by the second electronic device or by the first electronic device, as illustrated in S505a and S505c below.
S505a, in response to a first switching operation of the second electronic device on an application, where the first switching operation switches the focus of the second electronic device back to the third document editing application, the third document editing application being the document editing application in use when the data collaboration was interrupted. Accordingly, the data collaboration service of the second electronic device determines that the second electronic device resumes the data collaboration scenario and therefore sends a third request message to the first electronic device, where the third request message is used to request restoration of data collaboration between the first electronic device and the second electronic device.
S506a, the first electronic device receives the third request message sent by the second electronic device, and displays a third prompt box on the first electronic device.
The third prompt box is used for prompting the user that the focus of the second electronic equipment is restored to the third document editing box of the third document editing application. The third prompt box may be displayed in the notification bar, or may be displayed in the main interface, which is not limited herein.
Taking the display of the third prompt box in the notification bar as an example, as shown in (b) of fig. 10b, the third prompt box 1011 may include prompt information, which may be: "Whether to use the mobile phone for data collaboration with the HUAWEI MATEBOOK". Further, the third prompt box 1011 may also include a resume input control 1021, so that, in response to an operation on the resume input control, the first electronic device displays the data collaboration interface and sends a data collaboration recovery message to the second electronic device, the message being used to confirm that the first electronic device resumes data collaboration with the second electronic device. Here, the data collaboration interface may be the data collaboration interface shown in figs. 8a-8b, and details are not repeated here.
In the solutions of S505a-S506a, the data collaboration scenario between the first electronic device and the second electronic device is restored only after the first electronic device confirms, which avoids interrupting other tasks the user may be performing on the first electronic device and improves the user experience.
S505b, in response to a first switching operation of the second electronic device on an application, where the first switching operation switches the focus of the second electronic device back to the document editing application, the data collaboration service of the second electronic device determines that the second electronic device resumes the data collaboration scenario and therefore sends a third notification message to the first electronic device, where the third notification message is used to confirm that the data collaboration between the first electronic device and the second electronic device is restored. Correspondingly, the first electronic device receives the third notification message sent by the second electronic device and displays a third prompt box on the first electronic device. In this case, the third prompt box may include prompt information, which may be: "Resumed using the mobile phone for data collaboration with the HUAWEI MATEBOOK".
In the S505b scenario, the first electronic device does not need to confirm, and the data cooperation scenario between the first electronic device and the second electronic device is directly recovered, so that excessive confirmation by the user is avoided, and the user experience is improved.
S505c, when the first electronic device determines that a first trigger condition is satisfied, the data collaboration service of the first electronic device determines that the first electronic device resumes the data collaboration scenario and therefore sends a data collaboration recovery message to the second electronic device.
The first trigger condition is used to trigger the data collaboration service of the first electronic device. Possible implementations of the first trigger condition are illustrated below in manners d1-d2, followed by a short code sketch.
Manner d1: in response to a recovery operation of the data collaboration service on the first electronic device, the recovery operation may be a gesture acting on the first electronic device, and the gesture may include being lifted upwards or moved downwards. The first electronic device may determine that the user has performed a lifting-up or moving-down gesture through a gyroscope, an accelerometer, a gravity accelerometer, or the like. Further, in order to improve the accuracy with which the first electronic device triggers restoration of the data collaboration scenario, the first electronic device may determine that it is in the data collaboration scenario only when the gesture meets a first gesture condition.
As shown in fig. 11, in one possible implementation, the first gesture condition may include: the first electronic device moves into a range in which the distance between the first electronic device and the user is greater than a first threshold and smaller than a second threshold. For example, as shown in fig. 11, when the first electronic device moves from an initial position to a first position, the corresponding first distance is within the first threshold; when it moves from the initial position to a second position, the corresponding first distance is greater than the second threshold; and when it moves from the initial position to a third position, the corresponding first distance is greater than or equal to the first threshold and smaller than or equal to the second threshold. The first threshold may be the distance between the first electronic device and the user when the first electronic device is used to answer a call. The second threshold may be the distance at which the user uses a front camera of the terminal device.
Manner d2: the first trigger condition may further include that the first electronic device is not in a call state, or that the camera module of the first electronic device is not in an open state, and the like.
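Combining manners d1 and d2, a simplified sketch of the trigger check might look as follows; the distance-estimation inputs, threshold values, and function names are assumptions for illustration rather than the embodiment's actual sensor pipeline.

```kotlin
// Hypothetical sketch combining manners d1 and d2: the first device resumes
// data collaboration only when the gesture lands in the target distance band
// and the device is neither on a call nor using its camera.
data class DeviceState(
    val distanceToUserMeters: Double,  // estimated from motion/gesture sensors
    val inCall: Boolean,
    val cameraOpen: Boolean
)

// Illustrative thresholds only: closer than this suggests "answering a call",
// farther than this suggests "front-camera" distance.
const val FIRST_THRESHOLD_M = 0.10
const val SECOND_THRESHOLD_M = 0.50

fun shouldResumeCollaboration(state: DeviceState): Boolean {
    val inGestureBand = state.distanceToUserMeters in FIRST_THRESHOLD_M..SECOND_THRESHOLD_M
    return inGestureBand && !state.inCall && !state.cameraOpen
}

fun main() {
    println(shouldResumeCollaboration(DeviceState(0.30, inCall = false, cameraOpen = false))) // true
    println(shouldResumeCollaboration(DeviceState(0.05, inCall = true, cameraOpen = false)))  // false
}
```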
When determining that the first trigger condition is satisfied, the first electronic device sends a data collaboration recovery message to the second electronic device, so that, after receiving the message, the second electronic device switches its focus to the third document editing frame of the interrupted third document editing application. The third document editing application is the document editing application in use when the first electronic device and the second electronic device interrupted data collaboration.
Further, the second electronic device may display a fourth prompt box on the second electronic device after receiving the data collaboration recovery message of the first electronic device. The fourth prompt box is used for prompting the user of a data cooperation recovery message sent by the first electronic device to the second electronic device so that the user can determine whether data cooperation needs to be recovered.
As shown in (a) of fig. 10c, the fourth prompt box 1030 may include prompt information, which may be: "Whether to resume data collaboration with the HUAWEI P40". Further, the fourth prompt box 1030 may also include a control 1031 for resuming input, so that, in response to an operation on this control, a collaboration recovery response is sent to the first electronic device to restore the data collaboration scenario.
S506c, the first electronic device receives the third notification message from the second electronic device, and displays a third prompt box on the first electronic device.
The third notification message is used to notify the first electronic device to restore data collaboration with the second electronic device. The third prompt box is used to prompt the user that the focus of the second electronic device has been restored to the interface of the document editing application. The third prompt box may be displayed in the notification bar or in the main interface, which is not limited herein. For the display of the third prompt box, reference may be made to the third prompt box 1011 in fig. 10b, and details are not repeated here.
It should be noted that S506c may be an optional step. For example, the first electronic device and the second electronic device may negotiate in advance that the restoration of the data collaboration scenario is triggered by the first electronic device without confirmation by the second electronic device; that is, the first electronic device confirms that the data collaboration scenario of the two devices is restored when it confirms that the first trigger condition is satisfied, and displays the third prompt box on the first electronic device at that time.
S507, in response to an operation acting on the third prompt box, the first electronic device displays the data collaboration interface.
The data collaboration interface may display the text data that was not sent previously.
Further, in response to an operation on the third prompt box, the first electronic device may further display a preview interface.
S508, in response to the second user operation, the second text data is sent to the second electronic device.
The second user operation may be a confirmation operation acting on the data collaboration interface, and the second text data is text data that is not sent on the preview interface stored in the first electronic device.
In S508, the first electronic device may further transmit a confirmation message (e.g., a third notification message) for restoring the data collaboration to the second electronic device, and the second electronic device may then display a fifth prompt box on its display screen. As shown in (b) of fig. 10c, the fifth prompt box 1040 may include prompt information, which may be: "Resumed data collaboration with the HUAWEI P40".
Further, the user can continue to input voice on the data collaboration interface to continue the data collaboration scenario. For this step, reference may be made to S406-S409, and details are not repeated here.
While text is being processed cooperatively, the first electronic device may also be interrupted by other applications: for example, the first electronic device may need to answer a call or switch its network access mode, the user may need to switch to another application or search for data, or the network of the first electronic device may be interrupted and go through a reconnection process, all of which interrupt the data collaboration scenario. In this embodiment of the present application, as shown in fig. 12a, the method for data collaboration among multiple devices may further include the following steps:
S601, the first electronic device responds to an application switching operation on the first electronic device, and the data collaboration service of the first electronic device determines that the first electronic device exits the data collaboration scenario, that is, the data collaboration between the second electronic device and the first electronic device is interrupted.
In one possible implementation, the application switching operation may be that the user jumps to another application, which causes the display interface of the first electronic device to display the display interface of the other application. The other application may be an application newly opened by the user, or the first electronic device may switch to an interface of an application the user has already opened, which is not limited herein.
In another possible implementation, the application switching operation may be the display of a notification message on the first electronic device, or the receipt of a popup window of another application (e.g., an instant messaging application or a video communication application), which causes the display interface of the first electronic device to display the display interface of the other application.
S602, the first electronic device sends an interrupt notification message to the second electronic device.
The interruption notification message is used for notifying that the data cooperation scene of the first electronic device is interrupted.
Accordingly, the first electronic device may close the preview interface.
In one possible implementation, the first electronic device may save the second text data that is not sent on the preview interface. The manner in which the first electronic device stores the second text data that is not sent on the preview interface may refer to S502, which is not described herein again.
In another possible implementation manner, when the user confirms that the second text data is the data to be sent before the first electronic device is interrupted, the first electronic device may send the second text data to the second electronic device, so that the second electronic device displays the second text data in the corresponding document editing application.
S603, the second electronic device receives the interrupt notification message and displays a second prompt interface on the second electronic device.
The second prompt interface is used for prompting the user that the data collaboration scene of the first electronic device is interrupted.
For example, as shown in (a) of fig. 12b, the second prompt interface 1210 may include prompt information, which may be: "The HUAWEI P40 data collaboration scenario has been interrupted". Further, the second prompt interface 1210 may also include a control 1211 for stopping data collaboration, so that, in response to an operation on the control 1211, the second electronic device may stop the data collaboration service and may store the interface of the document editing application corresponding to the data collaboration. For example, the data collaboration service of the second electronic device may store the interface of the interrupted data collaboration application (the third document editing application), so that when the data collaboration scenario is restored, the second electronic device can directly display that interface according to the stored interface, or move its focus to that interface, thereby increasing the response speed of restoring the data collaboration scenario.
When the first electronic device determines that a third condition is satisfied, the data collaboration interface is displayed on the first electronic device and the second text data is displayed on the data collaboration interface. The third condition indicates that the first electronic device and the second electronic device restore data collaboration.
The third condition may be satisfied in various ways, triggered by either the first electronic device or the second electronic device; possible scenarios for the third condition are illustrated below in S604a-S604c.
S604a, when the first electronic device satisfies the first trigger condition, it determines to start the data collaboration service of the first electronic device.
Manner D1: in response to a recovery operation on the first electronic device, the data collaboration service of the first electronic device determines that the first electronic device resumes the data collaboration scenario, and the first electronic device therefore sends a data collaboration recovery message to the second electronic device.
In one possible implementation, the recovery operation may be a gesture acting on the first electronic device; when it is determined that the gesture satisfies the first gesture condition, a third request message is sent to the second electronic device. After receiving the data collaboration recovery message of the first electronic device, the second electronic device switches its focus to the third document editing frame of the interrupted third document editing application. Reference may be made to manner d1, and details are not repeated here.
Manner D2: the first trigger condition may be that the first electronic device determines that it is not in a call state, or that the camera module of the first electronic device is not in an open state. Reference may be made to manner d2, and details are not repeated here.
S605a, the second electronic device receives a data cooperation recovery message sent by the first electronic device, where the data cooperation recovery message is used to request recovery of data cooperation between the first electronic device and the second electronic device; and displaying a third prompt interface on the second electronic equipment.
The third prompt interface is used for prompting the user that the data collaboration interface of the first electronic device is recovered. The third prompt interface may be displayed in the notification bar, or may be displayed in a third document editing box of a third document editing application, which is not limited herein.
For example, as shown in (b) of fig. 12b, the third prompt interface 1220 may include prompt information, which may be: "Resume the HUAWEI P40 data collaboration scenario". Further, the third prompt interface 1220 may also include a control 1221 for resuming data collaboration, so that, in response to an operation on the control 1221, the second electronic device may start the data collaboration service and display the stored third document editing frame of the document editing application corresponding to the data collaboration before the interruption.
In some embodiments, the second electronic device, in response to an operation of the control acting to resume data collaboration, may send a third notification message to the first electronic device, the third notification message confirming resumption of data collaboration of the first electronic device with the second electronic device. In other embodiments, the second electronic device may also confirm that the data cooperation between the first electronic device and the second electronic device is recovered after receiving the data cooperation recovery message, and a third notification message does not need to be sent to the first electronic device.
S604b, in response to the first switching operation of the second electronic device, the data collaboration service of the second electronic device determines that the second electronic device resumes the data collaboration scenario and therefore sends a third notification message to the first electronic device, where the third notification message is used to confirm that the first electronic device resumes data collaboration with the second electronic device.
For a specific implementation, reference may be made to S505b, which is not described herein again.
S604c, in response to the first switching operation of the second electronic device, the data cooperation service of the second electronic device determines that the second electronic device resumes the data cooperation scenario, and thereby sends a third request message to the first electronic device, the third request message being used for requesting the first electronic device to resume data cooperation with the second electronic device.
For the manner in which the data collaboration service of the second electronic device determines that the second electronic device resumes the data collaboration scenario, reference may be made to S505a, and details are not repeated here.
S605c, the first electronic device receives a data cooperation restoring message from the second electronic device, where the data cooperation restoring message is used to notify the first electronic device to restore data cooperation with the second electronic device, and a fourth prompt interface is displayed on the first electronic device.
The fourth prompting interface is used for prompting the user that the focus of the second electronic equipment is restored to a third document editing frame of the document editing application. The fourth prompt interface may be displayed in the notification bar, or may be displayed in the main interface, which is not limited herein.
For example, as shown in (a) of fig. 12c, the fourth prompt interface 1230 may include fourth prompt information, which may be: "Resume the HUAWEI P40 data collaboration scenario". Further, the fourth prompt interface 1230 may also include a recovery confirmation control 1231, so that, in response to an operation on the recovery confirmation control 1231, the first electronic device may start the data collaboration service and send a recovery collaboration response to the second electronic device.
S606, in response to an operation on the fifth prompt interface acting on the first electronic device, the first electronic device displays a preview interface.
For example, as shown in (b) of fig. 12c, the fifth prompt interface 1240 may include fifth prompt information, which may be: "Resume the HUAWEI P40 data collaboration scenario". Further, the fifth prompt interface 1240 may also include a control 1241 for resuming input, so that, in response to an operation on the control 1241 for confirming the resumption, the first electronic device may start the data collaboration service and display the data collaboration interface used before the data collaboration was interrupted. The data collaboration interface may include the preview interface used before the data collaboration was interrupted.
Wherein the preview interface may display the last unsent text data. The display manner of the preview interface can refer to the display manners of (a) and (b) in fig. 8b, and details are not repeated here.
S607, in response to the second user operation, the third text data is sent to the second electronic device.
The second user operation may be a confirmation operation acting on the data collaboration interface, and the third text data may be text data that is not sent on a preview interface stored in the first electronic device, or text data to be sent that is newly confirmed by the first electronic device.
Further, the user can continue to input voice on the data collaboration interface to continue the data collaboration scenario. For this step, reference may be made to S406-S409, and details are not repeated here.
In the embodiments provided in the present application, the method provided in the embodiments of the present application is described from the perspective of an electronic device as an execution subject. In order to implement the functions in the method provided by the embodiment of the present application, the terminal device may include a hardware structure and/or a software module, and implement the functions in the form of a hardware structure, a software module, or a hardware structure and a software module. Whether any of the above-described functions is implemented as a hardware structure, a software module, or a hardware structure plus a software module depends upon the particular application and design constraints imposed on the technical solution.
As used in the above embodiments, the terms "when …" or "after …" may be interpreted to mean "if …" or "after …" or "in response to determination …" or "in response to detection …", depending on the context. Similarly, depending on the context, the phrase "at the time of determination …" or "if (a stated condition or event) is detected" may be interpreted to mean "if the determination …" or "in response to the determination …" or "upon detection (a stated condition or event)" or "in response to detection (a stated condition or event)". In addition, in the above-described embodiments, relational terms such as first and second are used to distinguish one entity from another entity without limiting any actual relationship or order between the entities.
In the above embodiments, the implementation may be wholly or partially realized by software, hardware, firmware, or any combination thereof. When implemented in software, the implementation may take the form, in whole or in part, of a computer program product. The computer program product includes one or more computer instructions. When the computer instructions are loaded and executed on a computer, the processes or functions described in the embodiments of the invention are produced in whole or in part. The computer may be a general purpose computer, a special purpose computer, a computer network, or another programmable device. The computer instructions may be stored in a computer readable storage medium or transmitted from one computer readable storage medium to another, for example, from one website, computer, server, or data center to another website, computer, server, or data center via a wired (e.g., coaxial cable, optical fiber, Digital Subscriber Line (DSL)) or wireless (e.g., infrared, radio, microwave) connection. The computer-readable storage medium can be any available medium that can be accessed by a computer, or a data storage device such as a server or a data center that incorporates one or more available media. The available medium may be a magnetic medium (e.g., a floppy disk, a hard disk, or magnetic tape), an optical medium (e.g., a DVD), a semiconductor medium (e.g., a Solid State Disk (SSD)), or the like.
It is noted that a portion of this patent application contains material which is subject to copyright protection. The copyright owner reserves all copyright rights, except for the making of copies of the patent document or the patent disclosure as it appears in the patent office files or records.

Claims (28)

1. A method for multi-device data collaboration is applied to a second electronic device, and is characterized by comprising the following steps:
displaying a first document editing frame of a first document editing application in response to a first operation acting on the first document editing application;
sending a first notification message to a first electronic device, wherein the first notification message is used for establishing data cooperation between the first electronic device and the second electronic device;
and receiving first text data sent by the first electronic equipment, and displaying the first text data on the first document editing box.
2. The method of claim 1, wherein the first document-editing application is an application in a document-editing whitelist configured in the second electronic device.
3. The method of claim 1, wherein the first operation comprises at least one of:
an operation of opening the first document editing application; clicking a document editing box in the first document editing application; switching operation of the first document editing application; or switching the operation of the document editing box in the first document editing application.
4. The method of claim 1, wherein before sending the first notification message to the first electronic device, further comprising:
displaying a first searching device interface, wherein the first searching device interface is used for displaying electronic equipment which establishes communication connection with the second electronic equipment;
determining to send the first notification message to the first electronic device in response to a second operation of selecting the first one of the electronic devices.
5. The method of claim 1, further comprising:
receiving a communication connection request sent by the first electronic device, wherein the communication connection request is sent through near field communication;
establishing a communication connection with the first electronic device;
displaying a first connection message, wherein the first connection message is used for prompting a user that the second electronic equipment establishes communication connection with the first electronic equipment.
6. The method of any of claims 1-5, wherein after sending the first notification message to the first electronic device, further comprising:
receiving a first response message sent by the first electronic device, wherein the first response message is used for confirming that the first electronic device establishes data cooperation with the second electronic device;
and displaying the first response message.
7. The method of any of claims 1-6, wherein the second electronic device includes a plurality of document editing applications thereon, the plurality of document editing applications including at least the first document editing application and a second document editing application, the method further comprising:
sending a second notification message to the first electronic device, the second notification message instructing the second electronic device to run the plurality of document editing applications,
receiving a switching message sent by the first electronic device, wherein the switching message is used for instructing the second electronic device to switch the focus from the first document editing application to the second document editing application;
switching the focus to a second document editing box of the second document editing application;
and receiving third text data sent by the first electronic equipment, and displaying the third text data on the second document editing box.
8. The method of claim 1, further comprising:
displaying a third document editing application on the second electronic equipment when a first condition is met, wherein the third document editing application is a corresponding document editing application when the first electronic equipment and the second electronic equipment interrupt data cooperation;
receiving second text data sent by the first electronic equipment, and displaying the second text data on a third document editing frame of a third document editing application;
the second text data is text data generated by the first electronic equipment when the data cooperation is interrupted;
wherein the first condition comprises at least one of: in response to a first switching operation, the first switching operation switching focus of the second electronic device to the third document editing application;
or receiving a data cooperation recovery message sent by the first electronic device, where the data cooperation recovery message is used to request recovery of data cooperation between the first electronic device and the second electronic device.
9. The method of claim 8, further comprising:
displaying a switched interface in response to a second switching operation acting on the third document editing application, wherein the second switching operation is used for switching the focus of the second electronic equipment out of the third document editing application;
and sending a third notification message to the first electronic device, wherein the third notification message is used for notifying the second electronic device of data cooperation interruption with the first electronic device.
10. The method of claim 8, wherein the first electronic device is configured to interrupt data collaboration with the second electronic device, comprising:
receiving an interrupt notification message of the first electronic device, wherein the interrupt notification message is used for notifying the second electronic device of data cooperation interrupt with the first electronic device.
11. The method according to any one of claims 8-10, wherein before receiving the second text data sent by the first electronic device, further comprising:
and sending a fourth notification message to the first electronic device, wherein the fourth notification message is used for notifying the first electronic device to recover the data cooperation with the second electronic device.
12. The method according to any one of claims 8-10, wherein before receiving the second text data sent by the first electronic device, further comprising:
sending a recovery request message to the first electronic device, wherein the recovery request message is used for requesting recovery of data cooperation between the first electronic device and the second electronic device;
and receiving a data cooperation recovery message sent by the first electronic device, wherein the data cooperation recovery message is used for confirming recovery of data cooperation between the first electronic device and the second electronic device.
13. A method for multi-device data collaboration is applied to a first electronic device, and is characterized by comprising the following steps:
receiving a first notification message sent by second electronic equipment, and displaying a first prompt box, wherein the first notification message is used for requesting the first electronic equipment to establish data cooperation with the second electronic equipment, and the first prompt box comprises a first control;
displaying a data collaboration interface in response to a first user operation acting on the first control;
collecting first voice data, converting the first voice data into first text data, and displaying the first text data on the data collaboration interface;
and responding to a second user operation, and sending the first text data to the second electronic equipment.
14. The method of claim 13, wherein the data collaboration interface further comprises an edit control that converts the first speech data to first text data, comprising:
converting the first voice data into initial text data, and displaying the initial text data on the data collaboration interface;
responding to a third user operation acting on the editing control, executing an editing instruction on the initial text data, and displaying the edited text data on the data collaboration interface;
and replacing the first text data with the edited text data in response to a confirmation operation acting on the data collaboration interface.
15. The method of claim 14, wherein the first text data comprises initial text data and editing instructions for the initial text data.
16. The method according to any one of claims 13-15, further comprising:
receiving a first confirmation message sent by the second electronic device, wherein the first confirmation message is used for indicating that the second electronic device successfully receives the first text data;
and deleting the first text data displayed on the data collaboration interface.
17. The method according to any of claims 13-16, wherein prior to receiving the first notification message sent by the second electronic device, further comprising:
when the distance between the first electronic equipment and the second electronic equipment meets a first distance threshold value, sending a communication connection request to the second electronic equipment through near field communication;
and establishing communication connection with the first electronic equipment, and displaying a first connection message, wherein the first connection message is used for prompting the second electronic equipment to establish communication connection with the first electronic equipment.
18. The method of any of claims 13-17, wherein in response to a first user operation acting on the first control, further comprising:
and sending a first response message to the second electronic device, wherein the first response message is used for confirming that the data cooperation between the first electronic device and the second electronic device is started.
19. The method of any of claims 13-18, wherein the second electronic device includes a plurality of document editing applications thereon, the plurality of document editing applications including at least a first document editing application and a second document editing application, the method further comprising:
receiving an application notification message sent by the second electronic device, and displaying controls of the plurality of document editing applications on the data collaboration interface; the application notification message is to indicate that the second electronic device includes the plurality of document editing applications;
sending a switching message to the second electronic device in response to a switching operation of a control acting on the plurality of document editing applications, wherein the switching operation is to switch focus from the first document editing application to the second document editing application by the second electronic device, and the switching message is used for instructing the second electronic device to switch focus of the second electronic device to a second document editing frame of the second document editing application;
collecting second voice data, converting the second voice data into third text data, and displaying the third text data on the data cooperation interface;
and responding to the second user operation, and sending the third text data to the second electronic equipment.
20. The method according to any one of claims 13-19, further comprising:
when a second condition is met, caching second text data, wherein the second text data is text data displayed on the data cooperation interface when the first electronic equipment and the second electronic equipment interrupt data cooperation, and the second condition is used for indicating data cooperation interruption of the first electronic equipment and the second electronic equipment;
when a third condition is met, displaying the data cooperation interface, and displaying the second text data on the data cooperation interface, wherein the third condition is used for indicating data cooperation recovery of the first electronic device and the second electronic device;
and sending the second text data to the second electronic equipment.
21. The method of claim 20, wherein the second condition comprises at least one of:
receiving a second notification message sent by the second electronic device, where the second notification message is used to notify the second electronic device of data cooperation interruption with the first electronic device;
or, in response to an application switching operation of the first electronic device, displaying a switched application interface on the first electronic device, where the application switching operation is used to start a switched application on the first electronic device.
22. The method of claim 21, wherein applying the operation in response to the switching of the first electronic device further comprises:
and sending an interrupt notification message to the second electronic device, wherein the interrupt notification message is used for notifying the second electronic device of data cooperation interrupt with the first electronic device.
23. The method of claim 20, wherein the third condition comprises at least one of:
detecting a gesture on the first electronic device, determining that the gesture meets a first gesture condition, wherein the first gesture condition means that the first electronic device moves to a range where the distance between the first electronic device and a user is larger than a first threshold value and smaller than a second threshold value, and the gesture comprises: lifted upwards or moved downwards;
the first electronic equipment is not in a call state; or the camera module of the first electronic device is not in an open state; receiving a third notification message sent by the second electronic device, where the third notification message is used to notify the first electronic device to recover data collaboration with the second electronic device;
or receiving a third request message sent by the second electronic device, where the third request message is used to request the first electronic device to restore data cooperation with the second electronic device, displaying the third request message in a second prompt box on the first electronic device, and confirming restoration of data cooperation between the first electronic device and the second electronic device in response to an operation on the second prompt box.
24. The method of claim 23, wherein after confirming recovery of data collaboration of the first electronic device with the second electronic device in response to the operation of the second prompt box, further comprising:
and sending a recovery data cooperation message to the second electronic device, wherein the recovery data cooperation message is used for confirming recovery of data cooperation between the first electronic device and the second electronic device.
25. An electronic device, comprising a memory and one or more processors, wherein the memory is configured to store computer program code, and the computer program code comprises computer instructions; and when the computer instructions are executed by the one or more processors, the electronic device is caused to perform the method according to any one of claims 1 to 12.
26. An electronic device, comprising a memory and one or more processors, wherein the memory is configured to store computer program code, and the computer program code comprises computer instructions; and when the computer instructions are executed by the one or more processors, the electronic device is caused to perform the method according to any one of claims 13 to 24.
27. A computer-readable storage medium, characterized in that the computer-readable storage medium comprises program instructions which, when run on a terminal device, cause the terminal device to carry out the method according to any one of claims 1 to 12.
28. A computer-readable storage medium, characterized in that the computer-readable storage medium comprises program instructions which, when run on a terminal device, cause the terminal device to carry out the method according to any one of claims 13 to 24.
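For orientation, the following is a minimal, hypothetical sketch of the interruption-and-recovery flow described in claims 20-24: the first electronic device caches the text shown on the data cooperation interface when cooperation is interrupted, then redisplays that cached text and resends it to the second electronic device when cooperation is recovered. It is not part of the patent disclosure; all class, method, and message names are illustrative assumptions.

```python
# Hypothetical sketch of the interruption/recovery behaviour in claims 20-24.
# Names such as PeerLink and DataCooperationSession are illustrative assumptions,
# not APIs disclosed in the patent.

class PeerLink:
    """Stand-in for the channel to the second electronic device."""
    def send(self, message: dict) -> None:
        print("-> second device:", message)


class DataCooperationSession:
    def __init__(self, peer: PeerLink):
        self.peer = peer
        self.cached_text = ""   # the "second text data"
        self.active = True

    def on_interrupt(self, current_text: str, caused_locally: bool) -> None:
        # Second condition met: cache whatever the data cooperation
        # interface was showing at the moment of interruption (claim 20).
        self.cached_text = current_text
        self.active = False
        if caused_locally:
            # Claim 22: an app switch on the first device also notifies the peer.
            self.peer.send({"type": "interrupt_notification"})

    def on_recover(self) -> str:
        # Third condition met (device posture check, peer notification, or user
        # confirmation in the second prompt box): redisplay the cached text and
        # send it back to the second device (claims 20 and 24).
        self.active = True
        self.peer.send({"type": "recover_confirm", "text": self.cached_text})
        return self.cached_text   # caller redraws the data cooperation interface


if __name__ == "__main__":
    session = DataCooperationSession(PeerLink())
    session.on_interrupt("draft paragraph typed so far", caused_locally=True)
    restored = session.on_recover()
    print("redisplayed:", restored)
```

Under this reading, the cached "second text data" survives the interruption locally on the first electronic device, so recovery does not depend on the second electronic device having retained any state.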
CN202010784936.3A 2020-08-06 2020-08-06 Multi-device data cooperation method and electronic device Pending CN114065706A (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN202010784936.3A CN114065706A (en) 2020-08-06 2020-08-06 Multi-device data cooperation method and electronic device
PCT/CN2021/110646 WO2022028494A1 (en) 2020-08-06 2021-08-04 Multi-device data collaboration method and electronic device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202010784936.3A CN114065706A (en) 2020-08-06 2020-08-06 Multi-device data cooperation method and electronic device

Publications (1)

Publication Number Publication Date
CN114065706A true CN114065706A (en) 2022-02-18

Family

ID=80120001

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010784936.3A Pending CN114065706A (en) 2020-08-06 2020-08-06 Multi-device data cooperation method and electronic device

Country Status (2)

Country Link
CN (1) CN114065706A (en)
WO (1) WO2022028494A1 (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115941674A (en) * 2023-02-21 2023-04-07 荣耀终端有限公司 Multi-device application connection method, device and storage medium

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114489920B (en) * 2022-04-18 2022-07-05 北京麟卓信息科技有限公司 Input method of android application on Linux platform

Family Cites Families (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP6328797B2 (en) * 2014-05-30 2018-05-23 Apple Inc. Transition from using one device to using another device
US9699152B2 (en) * 2014-08-27 2017-07-04 Microsoft Technology Licensing, Llc Sharing content with permission control using near field communication
CN105072246A (en) * 2015-07-01 2015-11-18 小米科技有限责任公司 Information synchronization method, device and terminal
CN113157168B (en) * 2019-04-03 2022-02-18 腾讯科技(深圳)有限公司 Document display method and device
CN111367689A (en) * 2020-03-02 2020-07-03 北京字节跳动网络技术有限公司 Interactive prompt information sending method and device of online document and electronic equipment

Also Published As

Publication number Publication date
WO2022028494A1 (en) 2022-02-10

Similar Documents

Publication Publication Date Title
US20220147228A1 (en) Display Method and Related Apparatus
US20220342850A1 (en) Data transmission method and related device
CN110839096B (en) Touch method of equipment with folding screen and folding screen equipment
CN111666055B (en) Data transmission method and device
WO2022100239A1 (en) Device cooperation method, apparatus and system, electronic device and storage medium
KR102481065B1 (en) Application function implementation method and electronic device
AU2020423946B2 (en) Audio output method and terminal device
CN111221845A (en) Cross-device information searching method and terminal device
WO2022100237A1 (en) Screen projection display method and related product
CN111464987B (en) Method for displaying Bluetooth device identification and electronic device
WO2022028494A1 (en) Multi-device data collaboration method and electronic device
CN114442969A (en) Inter-device screen cooperation method and device
CN114371963A (en) Fault detection method and electronic terminal
CN114510186A (en) Cross-device control method and device
CN114237529A (en) Navigation bar display method, navigation bar display method and first electronic equipment
CN115484404B (en) Camera control method based on distributed control and terminal equipment
CN115114607A (en) Sharing authorization method, device and storage medium
CN111787157A (en) Mobile terminal and operation response method thereof
CN114513760B (en) Font library synchronization method, device and storage medium
WO2022166614A1 (en) Method and apparatus for executing control operation, storage medium, and control
WO2022194005A1 (en) Control method and system for synchronous display across devices
EP4345590A1 (en) Task synchronization system and method, and device
CN114615362A (en) Camera control method, device and storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination