WO2023051116A1 - Distributed implementation method, system, electronic device and storage medium - Google Patents

Distributed implementation method, system, electronic device and storage medium

Info

Publication number
WO2023051116A1
WO2023051116A1 (PCT/CN2022/114797, CN2022114797W)
Authority
WO
WIPO (PCT)
Prior art keywords
electronic device
user
mobile phone
instruction
smart
Prior art date
Application number
PCT/CN2022/114797
Other languages
English (en)
French (fr)
Inventor
胡怡洁
Original Assignee
Huawei Technologies Co., Ltd. (华为技术有限公司)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Huawei Technologies Co., Ltd. (华为技术有限公司)
Publication of WO2023051116A1

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; output arrangements for transferring data from the processing unit to the output unit, e.g. interface arrangements
    • G06F 3/14 Digital output to display device; cooperation and interconnection of the display device with other functional units
    • G06F 9/00 Arrangements for program control, e.g. control units
    • G06F 9/06 Arrangements for program control using stored programs, i.e. using an internal store of processing equipment to receive or retain programs
    • G06F 9/30 Arrangements for executing machine instructions, e.g. instruction decode
    • G06F 9/38 Concurrent instruction execution, e.g. pipeline or look ahead
    • G06F 9/3854 Instruction completion, e.g. retiring, committing or graduating
    • G06F 9/46 Multiprogramming arrangements
    • G06F 9/50 Allocation of resources, e.g. of the central processing unit [CPU]

Definitions

  • The present invention relates to the technical field of intelligent terminals, and in particular to a distributed implementation method, system, electronic device, and storage medium for use among electronic devices.
  • Whole-house smart scenarios have gradually become popular, and more and more smart electronic devices are being added to them.
  • Barrier-free interaction between electronic devices greatly improves the user experience.
  • For example, a video the user is playing on a mobile phone can be cast to a TV to continue playing, or the music on the mobile phone can be played through a smart speaker at home.
  • The navigation application on the mobile phone can also plan the optimal commute route and project it onto the screen of the on-board computer for navigation.
  • However, the interactive operations among the multiple devices involved are mainly performed manually by the user, and the process is relatively cumbersome.
  • The embodiments of the present application provide a distributed implementation method, system, electronic device, and storage medium for use among electronic devices.
  • In the distributed implementation method, the operation information of each operation that the user performs on each electronic device to accomplish a specified task in a distributed manner is pre-recorded and used to generate an executable file. Subsequently, when the user interacts with the devices, the user only needs to trigger execution of this executable file instead of manually repeating the relatively cumbersome interactive operations each time the specified task is to be accomplished across the electronic devices. This makes the distributed realization of specified tasks among electronic devices more convenient and faster, improves the user's operation efficiency, and helps improve the user experience.
  • In a first aspect, an embodiment of the present application provides a distributed implementation method among electronic devices. The method includes: a first electronic device detects a first instruction of a user, the first instruction being used to instruct a second electronic device to complete a first specified task; in response to the first instruction, the first electronic device simulates one or more first user operations on the first electronic device, where a first user operation is a user operation that needs to be performed by the user on the first electronic device in the process of controlling the second electronic device to complete the first specified task; and the first electronic device instructs the second electronic device to execute the first specified task.
  • That is, the first electronic device detects a first instruction corresponding to an operation performed by the user on the first electronic device, and based on the first instruction determines the second electronic device as the execution device for the first specified task. In response to the first instruction, the first electronic device can simulate the first user operations on itself to carry out the process of controlling the second electronic device to complete the first specified task, and the second electronic device can then begin executing the first specified task.
  • The above first electronic device may be, for example, the mobile phone 100 in the following embodiments, and the above second electronic device may be, for example, the smart TV 200 in the following embodiments.
  • The above first instruction may be, for example, an instruction issued when the user clicks an automatic-operation button created on the mobile phone 100 (for example, the "one-key projection" button), instructing the mobile phone 100 to control and realize the screen-projection scenario.
  • Based on the instruction corresponding to the user clicking the "one-key projection" button, the mobile phone 100 can determine the smart TV 200 as the execution device for the video playback task, that is, the video being played on the mobile phone 100 needs to be played through the smart TV 200.
  • In response to the user's operation of clicking the "one-key projection" button, the mobile phone 100 can also simulate the user's operations of running the video application on the mobile phone 100 and playing the desired video.
  • In a possible implementation, the method further includes: in response to the first instruction, the first electronic device generates and sends at least one simulated instruction to the second electronic device, where the simulated instruction is used to instruct the second electronic device to simulate a second user operation, and a second user operation is a user operation that needs to be performed by the user on the second electronic device in the process of completing the first specified task.
  • That is, in response to the first instruction, the first electronic device may instruct the second electronic device to simulate second user operations by sending it simulated instructions, so that during execution of the first specified task the second electronic device completes operations related to that task, such as switching task content and setting parameters.
  • The simulated instruction may be, for example, the automatic operation instruction pushed to the smart TV 200 in steps 802 and 804 to 805 of the following embodiments, when the mobile phone 100 determines that the subject of the automatic operation instruction to be executed is the smart TV 200. After receiving the automatic operation instruction sent by the mobile phone 100, the smart TV 200 executes it to simulate the user's operations on the smart TV 200, such as setting the playback volume, the playback brightness, and the definition.
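  • The dispatch logic described above (the first device replays operations whose subject is itself, and pushes operations whose subject is the second device as simulated instructions) can be sketched as follows. This is a minimal illustration, not the patent's implementation; all names and the message shape are assumptions:

```python
# Hypothetical sketch: replaying a recorded scenario, dispatching each
# recorded operation either to local simulation or to the peer device.
from dataclasses import dataclass
from typing import Callable, List

@dataclass
class RecordedOp:
    device: str    # which device the user originally operated, e.g. "phone" or "tv"
    action: str    # e.g. "launch_app", "tap", "set_volume"
    payload: dict  # action parameters captured at recording time

def replay(ops: List[RecordedOp],
           simulate_local: Callable[[RecordedOp], None],
           send_to_peer: Callable[[RecordedOp], None],
           local_device: str = "phone") -> None:
    """Replay operations in recorded order: operations recorded on this
    device are simulated locally, operations recorded on the peer are
    pushed to it as simulated instructions over the existing connection."""
    for op in ops:
        if op.device == local_device:
            simulate_local(op)
        else:
            send_to_peer(op)
```

In this sketch, `simulate_local` would stand in for injecting taps and slides into the local UI, and `send_to_peer` for the transport that delivers automatic operation instructions to the second device.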
  • In a possible implementation, the first user operation or the second user operation includes any one of the following: the operation of running an application program; the operation of inputting text information; the operation of connecting to a network or connecting a device; the operation of setting parameters; the operation of selecting the content of the first specified task; the operation of switching pages.
  • That is, the first user operation to be simulated on the first electronic device may be any of the above operations, and the second user operation to be simulated on the second electronic device may be any of the above operations.
  • It can be understood that the first user operation to be simulated on the first electronic device and the second user operation to be simulated on the second electronic device may also be operations other than those listed above, which is not limited here.
  • In a possible implementation, the first user operation includes the user's operation of clicking a control on the first electronic device, or the user's sliding operation on the first electronic device.
  • The second user operation includes the user's operation of clicking a control on the second electronic device, or the user's control operation on the second electronic device through a remote control device.
  • That is, the first user operation simulated by the first electronic device may be an operation in which the user clicks a control on the display interface of the first electronic device, or a sliding operation performed by the user on that display interface.
  • The second user operation simulated by the second electronic device may be an operation in which the user clicks a control on the display interface of the second electronic device, or a control operation performed by the user on the second electronic device through a remote control device, which is not limited here.
  • The above second electronic device may be, for example, the smart TV 200 in the following embodiments, and the remote control device may be, for example, a remote control for controlling the smart TV 200.
  • In a possible implementation, the first user operation or the second user operation includes any one of the following: the operation of running an application program; the operation of inputting text information; the operation of connecting to a network or connecting a device; the operation of setting parameters; the operation of selecting the content of the first specified task; the operation of switching pages.
  • That is, the first user operation simulated by the first electronic device is used to realize a function required to complete the first specified task, and the second user operation is likewise used to realize a function required to complete the first specified task.
  • Such functions include running an application program, inputting text information, connecting to a network or a device, setting parameters, selecting the content of the first specified task, and switching pages.
  • Selecting the content of the first specified task may be, for example, selecting the video content played by the video playback task in the embodiments below.
  • In a possible implementation, the operation of running the application program includes the user clicking the application icon control on the first electronic device to run the application; the operation of inputting text information includes the user clicking an input control on the first electronic device to input text information; the operation of connecting to a network or connecting a device includes the user clicking a network-connection or device-connection control on the first electronic device to send a connection request; the operation of setting parameters includes the user clicking a parameter-setting control on the first electronic device to set parameters, and/or the user sliding on the first electronic device to set parameters; and the operation of switching pages includes the user sliding on the first electronic device to switch pages.
  • In a possible implementation, the first specified task is a video playback task.
  • The first user operation includes any of the following: the user clicks the video application icon on the first electronic device to run the video application; the user clicks buttons on the input-method interface in the display interface of the first electronic device to input text information; the user selects, in the display interface of the first electronic device, the second electronic device as the device for completing the video playback task and initiates a connection request; the user clicks a parameter-setting button in the video playback interface displayed by the first electronic device to set parameters; the user clicks a video-content switch button or a video option button in the video application interface displayed by the first electronic device to select the video content to be played by the video playback task; the user slides up or down in the video playback interface displayed by the first electronic device to set the volume, brightness, or definition; the user slides left or right on the display interface of the first electronic device to switch pages.
  • The above video option button may be, for example, the episode switching button 543 in the following embodiments, or a button displayed with an episode number, or the like.
  • The above first specified task is, for example, the screen-projection task in the following embodiments, that is, casting the video played on the mobile phone 100 to the smart TV 200 for playback, so that the video playback task started on the mobile phone 100 is completed by the smart TV 200.
  • In a possible implementation, the operation of running the application program includes the user clicking the application icon control on the second electronic device to run the application, or the user controlling the second electronic device through the remote control device to run the application; the operation of inputting text information includes the user clicking an input control on the second electronic device to input text information, or the user controlling the second electronic device through the remote control device to input text information; the operation of connecting to a network or connecting a device includes the user clicking a network-connection or device-connection control on the second electronic device to send a connection request, or the user controlling the second electronic device through the remote control device to send a request to connect to the network or connect to a device; and the operation of setting parameters includes the user clicking a parameter-setting control on the second electronic device to set parameters, or the user controlling the second electronic device through the remote control device to set parameters.
  • In a possible implementation, the first specified task is a video playback task.
  • The second user operation includes any of the following: the user clicks buttons on the input-method interface in the display interface of the second electronic device to input text information; the user controls the second electronic device through the remote control device to input text information in its display interface; the user clicks a parameter-setting button in the video playback interface displayed by the second electronic device to set parameters; the user clicks a video-content switch button or a video option button in the display interface of the second electronic device to select the video content played by the video playback task; the user sets the playback volume, brightness, or definition on the second electronic device.
  • In a possible implementation, the first electronic device stores a first program corresponding to the first instruction, and the method includes: the first electronic device executes the first program in response to the first instruction, where the first program can simulate a first user operation on the first electronic device and can generate and send a simulated instruction to the second electronic device.
  • That is, the first electronic device is pre-stored with a first program corresponding to the first instruction, and in response to the first instruction it can run this pre-stored first program.
  • While running the first program, the first electronic device may also generate and send a simulated instruction to the second electronic device to instruct it to simulate the second user operation.
  • For example, the automatic operation program executed by the mobile phone 100 can make the mobile phone 100 simulate the operations performed by the user on the mobile phone 100 to realize the screen-projection scenario.
  • The automatic operation program executed by the mobile phone 100 can also enable the mobile phone 100, when it judges that the subject of an automatic operation instruction to be executed is the smart TV 200, to send that automatic operation instruction to the smart TV 200 for execution, and the mobile phone 100 may also send the smart TV 200 an instruction that triggers the execution of the automatic operation instruction, which is not limited here.
  • In a possible implementation, the first program includes an executable file.
  • The method for generating the executable file includes: the first electronic device records the user's first user operations to generate first operation data, where the first operation data is used to be invoked by the first electronic device to execute a first operation instruction in response to the first user operation; and the first electronic device builds the executable file according to the first operation data and the generation time of the first operation data.
  • The executable file included in the above first program may be, for example, the executable file corresponding to the screen-projection scenario 10 in the following embodiments, and the above first operation data may be, for example, the operation information corresponding to each operation performed by the user on the mobile phone 100 to realize the screen-projection scenario. The mobile phone 100 can generate the executable file corresponding to the screen-projection scenario 10 based on the recorded operation information of each such operation and the generation time of that operation information. For details, refer to the relevant description in the following embodiments, which is not repeated here.
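  • As a rough illustration of "record operation data together with its generation time, then build an executable file", one might capture each operation with a timestamp and order the steps by that timestamp before emitting a script. The JSON layout and class names below are assumptions for the sketch, not the patent's actual file format:

```python
import json
import time

class OperationRecorder:
    """Hypothetical recorder: captures operation data plus its generation
    time, then builds an 'executable file' (here: a JSON script)."""

    def __init__(self):
        self._ops = []

    def record(self, action: str, payload: dict, ts: float = None) -> None:
        # The generation time is stored alongside the operation data.
        self._ops.append({"ts": ts if ts is not None else time.time(),
                          "action": action, "payload": payload})

    def build_executable(self) -> str:
        # Steps are ordered by generation time so replay matches the
        # order in which the user originally performed the operations.
        steps = sorted(self._ops, key=lambda op: op["ts"])
        return json.dumps({"version": 1, "steps": steps}, indent=2)
```

A real implementation would capture the operation data from the platform's input/accessibility layer rather than take it as explicit arguments, but the ordering-by-generation-time idea is the same.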
  • In a possible implementation, the method for generating the executable file further includes: the first electronic device receives second operation data, recorded by the second electronic device, that was generated by the user's second user operations on the second electronic device, where the second operation data is used to be invoked by the second electronic device to execute a second operation instruction in response to the second user operation; and the first electronic device generates the executable file according to the first operation data and its generation time together with the second operation data and its generation time.
  • The above second operation data may be, for example, the operation information corresponding to the user setting the playback volume, brightness, and/or definition on the smart TV 200 in the following embodiments. The mobile phone 100 can generate the above executable file corresponding to the screen-projection scenario 10 based on this information and the time it was generated. For details, refer to the relevant description in the following embodiments, which is not repeated here.
  • In a possible implementation, before generating the executable file, the method includes: the first electronic device prompts the user to complete the first user operation on the first electronic device.
  • That is, the first electronic device may prompt the user, through its display interface, as to what operations the user needs to perform on the first electronic device during the process of controlling the second electronic device to complete the first specified task.
  • For example, the mobile phone 100 can prompt the user with a notification pop-up asking "whether to enable the scenario copy function to quickly realize multi-device interaction", and the mobile phone 100 can also display a tutorial interface prompting the user how to operate the mobile phone 100 to record operation information, how to operate the mobile phone 100 to generate the "one-key projection" button, and so on. For details, refer to the relevant description in the following embodiments, which is not repeated here.
  • In a possible implementation, before generating the executable file, the method further includes: the first electronic device prompts the user to complete the second user operation on the second electronic device.
  • That is, the first electronic device may prompt the user, through its display interface, as to what operations the user needs to perform on the second electronic device during the process of controlling the second electronic device to complete the first specified task.
  • For example, the mobile phone 100 may display a tutorial interface to prompt the user how to operate the smart TV 200 to record operation information, and so on.
  • In a possible implementation, the process by which the first electronic device records the first operation data generated by the user's first user operations includes: when an operation for instructing to start recording is detected on the display interface of the first electronic device, recording the first operation data generated by the user's first user operations on the first electronic device; and when an operation for instructing to stop recording is detected on the display interface of the first electronic device, stopping the recording.
  • That is, the user can control when the first electronic device starts and stops recording the first operation data by performing the corresponding operations on its display interface.
  • The operations for instructing to start and stop recording may be, for example, the operations in which the user clicks the operation recording button 411 displayed on the mobile phone 100 in the following embodiments: when the user clicks the operation recording button, the mobile phone 100 enters the operation recording mode and starts recording operation information; when the user clicks the operation recording button again, the mobile phone 100 exits the operation recording mode and stops recording operation information.
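  • The button behavior described above is a simple toggle: the same control switches recording on and off, and only operations performed while recording is active are captured. A minimal sketch, with all names assumed for illustration:

```python
class LocalRecorder:
    """Hypothetical sketch: one on-screen button toggles the operation
    recording mode; user operations are captured only while recording."""

    def __init__(self):
        self.recording = False
        self.ops = []

    def on_record_button(self) -> None:
        # First click enters recording mode, second click exits it.
        self.recording = not self.recording

    def on_user_operation(self, op) -> None:
        if self.recording:
            self.ops.append(op)
```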
  • In a possible implementation, the process by which the first electronic device receives the second operation data includes: based on a received operation-focus switching notification sent by the second electronic device, the first electronic device sends the second electronic device an instruction to start recording, where this instruction is used to instruct the second electronic device to start recording the second operation data generated by the user's second user operations on the second electronic device; the first electronic device then sends the second electronic device an instruction to stop recording, where this instruction is used to instruct the second electronic device to stop the recording; and the first electronic device receives the second operation data sent by the second electronic device.
  • That is, when the second electronic device detects a user operation, it can send an operation-focus switching notification to the first electronic device to notify it that the user's operation focus has switched to the second electronic device.
  • The first electronic device may send the instruction to start recording to the second electronic device based on the connection relationship between them, and when the first electronic device detects that the user performs an operation for instructing to stop recording on its display interface, it may send the instruction to stop recording to the second electronic device.
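  • The exchange above (a focus-switch notification from the second device triggers a start-recording instruction; the user's stop action triggers a stop-recording instruction) can be sketched as a small state machine on the first device's side. Message names and structure are illustrative assumptions:

```python
class PeerRecordingController:
    """Hypothetical sketch of the first device's side of the recording
    handshake: a focus-switch notification from the peer triggers a
    'start recording' instruction; the user's stop action triggers a
    'stop recording' instruction."""

    def __init__(self, send):
        self.send = send           # delivers a message to the peer device
        self.peer_recording = False

    def on_focus_switch_notification(self) -> None:
        # The peer reported that the user's operation focus moved to it.
        if not self.peer_recording:
            self.send({"type": "start_recording"})
            self.peer_recording = True

    def on_user_stops_recording(self) -> None:
        # The user clicked the stop-recording control on this device.
        if self.peer_recording:
            self.send({"type": "stop_recording"})
            self.peer_recording = False
```

The guard flags keep the exchange idempotent: repeated focus notifications while recording is already active do not resend the start instruction.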
  • The connection relationship between the first electronic device and the second electronic device can be established when the user operates the first electronic device to send a connection request to the second electronic device, or when the user operates the second electronic device to send a connection request to the first electronic device.
  • The above instruction to start recording may be, for example, the recording instruction sent by the mobile phone 100 to the smart TV 200 in step 308 of the embodiments below, and the above instruction to stop recording may be, for example, the end-recording instruction sent by the mobile phone 100 to the smart TV 200 in step 311 of the embodiments below.
  • As for the connection relationship, the smart TV 200 may, upon receiving the connection request sent by the mobile phone 100 when it is selected as the peer device for screen projection as described in step 306 of the embodiments below, establish a connection with the first electronic device and enter a preparation mode in which user operations can be detected at any time.
  • In a possible implementation, the sequence in which the first user operations are simulated on the first electronic device is the same as the sequence in which the first operation data was generated, and the sequence in which the second user operations are simulated on the second electronic device is the same as the sequence in which the second operation data was generated.
  • That is, the time order of the first user operations simulated by the first electronic device matches the time order of the first operation data generated when the user performed the first user operations on the first electronic device, and likewise for the second user operations simulated by the second electronic device and the second operation data.
  • For example, the order in which the mobile phone 100 and the smart TV 200 execute the corresponding automatic operation instructions can follow the order of the operations corresponding to the "one-key projection" button 630: the user first opens the video application on the mobile phone 100, clicks the video to be played, and clicks the screen-projection button to select the smart TV 200 as the screen-projection device in sequence, and then sets the definition, playback volume, or brightness on the smart TV 200 in sequence.
  • For the user's specific operations, refer to FIG. 9B and the related descriptions in the following embodiments, which are not repeated here.
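  • Replaying both devices' operations in generation order can be realized by merging the two sets of operation data on their generation timestamps. A minimal sketch, assuming each operation record carries a `ts` field as in the recording step; the field and function names are illustrative:

```python
def merged_replay_order(first_ops, second_ops):
    """Hypothetical: interleave the first device's and the second
    device's recorded operations so that the replay sequence matches
    the order in which the operation data was originally generated."""
    return sorted(first_ops + second_ops, key=lambda op: op["ts"])
```

Because the phone-side operations were all recorded before the TV-side parameter settings in the screen-projection example, the merged sequence replays the phone operations first and the TV operations afterwards.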
  • In a possible implementation, the first electronic device stores the first program and a second program corresponding to a second instruction, where the second instruction is used to instruct a third electronic device to complete a second specified task, and the second electronic device is different from the third electronic device and/or the first specified task is different from the second specified task.
  • That is, in response to a user operation such as clicking an automatic-operation button, the first electronic device may instruct the same electronic device to perform two different specified tasks, instruct two different electronic devices to perform two different specified tasks, or instruct two different electronic devices to perform the same specified task, which is not limited here.
  • For example, the video played on the mobile phone 100 can be played by the smart TV 200, and a document received by the mobile phone 100 can be processed by a tablet computer.
  • In a possible implementation, the first instruction includes at least one of a screen-projection instruction, a music-delivery instruction, and a document-processing instruction, where the screen-projection instruction is used to instruct the second electronic device to complete the video playback task, the music-delivery instruction is used to instruct the second electronic device to complete a music playback task, and the document-processing instruction is used to instruct the second electronic device to complete a document-processing task.
  • In a second aspect, an embodiment of the present application provides a distributed implementation method among electronic devices. The method includes: the second electronic device executes the first specified task in response to an instruction from the first electronic device indicating that the first specified task is to be completed; and the second electronic device simulates a second user operation on itself in response to the first electronic device's instruction to simulate a second user operation, where a second user operation is a user operation that needs to be performed by the user on the second electronic device in the process of completing the first specified task.
  • In a possible implementation, the second user operation includes any one of the following: the operation of running an application program; the operation of inputting text information; the operation of connecting to a network or connecting a device; the operation of setting parameters; the operation of selecting the content of the first specified task; the operation of switching pages.
  • the operation of running the application program includes an operation of the user clicking an application program icon control on the second electronic device to run the application program, or an operation of the user controlling, through a remote control device, the second electronic device to run the application program.
  • the operation of inputting text information includes an operation of the user clicking an input control on the second electronic device to input text information, or an operation of the user controlling, through the remote control device, the second electronic device to input text information; the operation of connecting to a network or connecting a device includes an operation of the user clicking a control on the second electronic device for connecting to the network or sending a connection request to a device, or an operation of the user controlling, through the remote control device, the second electronic device to send a request to connect to the network or connect to a device.
  • the operation of setting parameters includes an operation of the user clicking a parameter setting control on the second electronic device to set the parameters, or an operation of the user controlling, through the remote control device, the second electronic device to set the parameters;
  • the operation of selecting the content of the first designated task includes an operation of the user clicking a selection control on the second electronic device to select the content of the executed first designated task; the operation of switching pages includes an operation of the user controlling, through the remote control device, the second electronic device to switch pages.
  • the first specified task is a video playback task
  • the second user operation includes any of the following: an operation of the user clicking a button on an input method interface in the display interface of the second electronic device to input text information; an operation of the user controlling, through the remote control device, the input of text information in the display interface of the second electronic device; an operation of the user clicking a parameter setting button in the video playback interface displayed by the second electronic device to set a parameter; an operation of the user clicking a video content switch button, or a video option button, in the display interface of the second electronic device to select the video content played by the executed video playback task; and an operation of adjusting the volume value, brightness value, or definition.
  • the simulation instruction is generated by running the first program on the first electronic device.
  • the first program includes an executable file
  • the method for generating the executable file includes: the second electronic device sends the second operation data to the first electronic device for generating the executable file, wherein the second operation data is data generated on the second electronic device by the second electronic device recording the user's second user operation, and the second operation data is used to be invoked by the second electronic device when simulating the second user operation in response to the simulation instruction.
  • the process of the second electronic device recording the second operation data generated on the second electronic device by the user's second user operation includes: the second electronic device, in response to the instruction to start recording sent by the first electronic device, records the second operation data generated by the second user operation on the second electronic device; and the second electronic device stops the recording operation in response to the instruction to stop recording sent by the first electronic device, and sends the second operation data to the first electronic device.
  • the sequence of simulating the second user's operations on the second electronic device is the same as the sequence in which the second operation data is generated.
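The ordering guarantee above can be sketched as follows (the data structure and function names are illustrative assumptions for this sketch, not part of the application): the second electronic device could key each piece of recorded operation data with a timestamp captured at recording time, then sort by that timestamp before simulating, so that the simulation order matches the generation order even if the records arrive out of order.

```python
# Illustrative sketch only: replay recorded operations in the order
# their operation data was generated, keyed by a recording timestamp.
from dataclasses import dataclass

@dataclass
class RecordedOperation:
    timestamp_ms: int   # time the operation data was generated
    description: str    # e.g. "click input control", "set volume"

def replay_in_recorded_order(operations):
    """Simulate operations in the same order the data was generated."""
    replay_log = []
    for op in sorted(operations, key=lambda o: o.timestamp_ms):
        replay_log.append(op.description)  # stand-in for actual simulation
    return replay_log

# Operations may arrive out of order (e.g. over the network) ...
ops = [
    RecordedOperation(300, "set volume"),
    RecordedOperation(100, "run video application"),
    RecordedOperation(200, "select episode"),
]
# ... but are simulated in generation order.
assert replay_in_recorded_order(ops) == [
    "run video application", "select episode", "set volume",
]
```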
  • the second electronic device displays the change in its display content caused by the second user operation, and/or the change in volume.
  • the smart TV 200 displays the volume change icon and the volume value set by the user when the user operates to adjust the volume.
  • for details, refer to FIG. 2B and the related descriptions in the following embodiments, which will not be repeated here.
  • the embodiment of the present application provides a distributed system, including a first electronic device and a second electronic device. The first electronic device is used to simulate one or more first user operations, and to generate and send a first instruction and at least one simulation instruction to the second electronic device, wherein the first instruction is used to instruct the second electronic device to complete the first specified task, and the first user operation is a user operation that needs to be performed on the first electronic device in the process of controlling the second electronic device to complete the first specified task; the simulation instruction is used to instruct the simulation of a second user operation on the second electronic device, and the second user operation is a user operation that needs to be performed on the second electronic device in the process of completing the first specified task. The second electronic device is used to execute the first specified task in response to the instruction of the first electronic device indicating to complete the first specified task, and to simulate the second user operation on the second electronic device in response to the simulation instruction sent by the first electronic device.
  • the embodiment of the present application provides an electronic device, including: one or more processors; and one or more memories storing one or more programs, wherein when the one or more programs are executed by the one or more processors, the electronic device executes the distributed implementation method between electronic devices in the claims.
  • the embodiment of the present application provides a computer-readable storage medium, where instructions are stored on the storage medium, and when the instructions are executed on a computer, the computer executes the above-mentioned distributed implementation method between electronic devices.
  • an embodiment of the present application provides a computer program product, including a computer program/instruction, and when the computer program/instruction is executed by a processor, the above-mentioned distributed implementation method among electronic devices is implemented.
  • FIG. 1 is a schematic diagram of a screen projection scene provided by an embodiment of the present application in which the technical solution of the present application is used to realize screen projection between devices.
  • FIGS. 2A to 2D are schematic diagrams of interfaces provided by the embodiment of the present application to implement the interaction process between the mobile phone 100 and the smart TV 200 in the screen projection scene using the distributed implementation method among electronic devices of the present application.
  • FIG. 3 is a schematic interaction flowchart for generating a "one-key projection" button by the mobile phone 100 based on the operation information recorded by the mobile phone 100 and the smart TV 200 in the screen projection scenario provided by the embodiment of the present application.
  • FIGS. 4A to 4B are schematic diagrams of the operation interface provided by the embodiment of the present application for the user to operate the mobile phone 100 to enter the operation recording mode.
  • FIGS. 5A to 5D are schematic diagrams of interfaces corresponding to various operations performed by the user on the mobile phone 100 for realizing the screen projection scene 10 provided by the embodiment of the present application.
  • FIGS. 6A to 6B show the interface changes corresponding to the mobile phone 100 provided by the embodiment of the present application displaying a pop-up window to prompt the user to operate the mobile phone 100 to complete the creation of an automatic operation button.
  • FIG. 7A to 7B are schematic flow diagrams showing the interactive operation information between the mobile phone 100 , the smart TV 200 and the operation information database when the operation information database provided by the embodiment of the present application is set on the mobile phone 100 .
  • FIG. 8 shows the mobile phone 100 provided by the embodiment of the present application, in response to the user clicking the "one-key projection" button, controlling the execution of the automatic operation instructions corresponding to the various operations of the projection scene 10 in the executable file corresponding to the button.
  • FIG. 9A is a schematic diagram of an interaction process in which the mobile phone 100 pushes the automatic operation instruction to the execution subject corresponding to the automatic operation instruction under the situation that the mobile phone 100 is directly connected to the smart TV 200 according to the embodiment of the present application.
  • FIG. 9B is a schematic diagram of the interactive process in which the mobile phone 100 pushes the automatic operation instruction to the execution subject corresponding to the automatic operation instruction when the mobile phone 100 and the smart TV 200 are connected through the cloud server 300 according to the embodiment of the present application.
  • FIG. 10 is a schematic diagram of an interface in which the mobile phone 100 prompts the user for abnormal termination of program execution provided by the embodiment of the present application.
  • FIG. 11 is a schematic structural diagram of a mobile phone 100 provided by an embodiment of the present application.
  • FIG. 12 is a block diagram of a software structure of a mobile phone 100 provided by an embodiment of the present application.
  • the illustrative embodiments of the present application include, but are not limited to, a distributed implementation method, system, electronic device, and storage medium among electronic devices.
  • the embodiment of the present application discloses a distributed implementation method among electronic devices, which can pre-record the operation information of the various operations performed by the user on each electronic device for the distributed implementation of a specified task among the electronic devices, and generate an executable file; subsequently, when the user needs the devices to interact, the user only needs to trigger the execution of the executable file, for example through an automatic operation button, a voice command, or another preset scenario, to reproduce the required interactive operations on the multiple devices.
  • the process of recording operation information can be understood as follows: when a user performs, on the electronic device, operations such as opening an application program, entering text information, connecting to other electronic devices, setting parameters, clicking a control, or selecting a peer device for a distributed task, the electronic device records the corresponding operation information.
  • for example, when the user clicks a settings application button, the electronic device can record operation information such as the coordinate information of the touch position of the settings application button clicked by the user and the program name of the settings application.
  • subsequently, when the electronic device runs the executable file generated based on the recorded operation information, it can simulate the operation of clicking the settings application button based on the recorded touch position coordinate information, so that the settings application opens and displays its interface.
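This record-then-simulate idea can be illustrated with a minimal sketch (the record fields, the package name, and the injected tap function are all assumptions for the sketch, not the application's actual format): the device stores the touch coordinates and program name at recording time, then later replays the click by injecting a tap at the stored coordinates.

```python
# Illustrative sketch: record a click's operation information, then
# simulate the same click later by replaying the stored coordinates.
def record_click(x, y, program_name):
    """Capture the operation information generated by a user click."""
    return {"type": "click", "x": x, "y": y, "program": program_name}

def simulate_click(op_info, inject_tap):
    """Replay a recorded click via an injected tap function
    (standing in for the platform's input-injection service)."""
    if op_info["type"] != "click":
        raise ValueError("not a click record")
    return inject_tap(op_info["x"], op_info["y"])

taps = []
op = record_click(540, 1200, "com.example.settings")  # hypothetical package
simulate_click(op, lambda x, y: taps.append((x, y)))
assert taps == [(540, 1200)]
```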
  • the operation information recorded by each interacting electronic device above may include, but is not limited to, the execution sequence information of each operation, the name and model of the electronic device involved in performing each operation, the name of the network to which the relevant electronic device is connected, the name of the application running on the relevant electronic device, the text entered by the user on the relevant electronic device and the position where the text was entered, the type of operation performed by the user on each electronic device (including click operations, slide operations, shortcut operation gestures, etc.), and other relevant data information.
  • the user can record the operation information of the various operations performed by the user on each electronic device when the specified task is implemented in a distributed manner for the first time through interactive operations on each device.
  • alternatively, after learning about the function of recording operation information provided by electronic devices such as mobile phones based on the distributed implementation method among electronic devices provided by this application, the user can record the operation information of each operation performed on the devices when a certain device interaction completes a specified task in a distributed manner, generate an executable file, and then trigger the execution of the executable file through the automatic operation button to realize the device interaction.
  • FIG. 1 shows a screen projection scenario in which the technical solution of the present application is used to implement screen projection between devices.
  • the screen projection scene 10 includes a mobile phone 100 and a smart TV 200 .
  • the executable file corresponding to the projection scene 10 generated according to the distributed implementation method among electronic devices of the present application is pre-stored in the mobile phone 100, and an automatic operation that can trigger the execution of the executable file is also created on the mobile phone 100. button.
  • FIGS. 2A to 2D show schematic diagrams of interfaces related to implementing the interaction process between the mobile phone 100 and the smart TV 200 in the screen projection scene 10 shown in FIG. 1 by adopting the distributed implementation method among electronic devices of the present application.
  • the control center menu bar 210 of the mobile phone 100 includes an operation record button 211; the user can click the operation record button 211 in the control center menu bar 210 of the mobile phone 100 to make the mobile phone 100 enter the operation information recording mode, and the mobile phone 100 can serve as the main device in the screen projection scene 10.
  • the user can click the video to be played in the video application running on the mobile phone 100, and then click the screen projection button 220 on the video playback interface to select the smart TV 200 as the screen projection device, so that the video played on the mobile phone 100 is played through the smart TV 200, thereby realizing the various operations of the screen projection scene 10.
  • the mobile phone 100 can record the execution instructions, signals, and related application information, device information, etc. generated by the mobile phone 100 in response to the above-mentioned various operations of the user as the operation information corresponding to the various operations, such as the package name of the video application running on the mobile phone 100, the application icon access path information, etc.; the smart TV 200 can also record the operation information of the screen-projection-related operations performed by the user on the smart TV 200, for example recording, as operation information, the volume value, brightness value, definition, and other parameters set by the user operating the smart TV 200.
  • the smart TV 200 can send the recorded operation information to the mobile phone 100, and the mobile phone 100 synthesizes the operation information of the various operations to generate automatic operation instructions and stores them as an executable file.
  • the mobile phone 100 can control the execution of the executable file by creating an automatic operation button, wherein the automatic operation button created by the mobile phone 100 can be, for example, the "one-key projection" button 230 shown in FIG. 2C.
  • as shown in FIG. 2C, after the "one-key projection" button 230 corresponding to the screen projection scene 10 shown in FIG. 1 has been created, when the user clicks the button, the mobile phone 100 can execute the executable file corresponding to the "one-key projection" button 230, and control the mobile phone 100 and the smart TV 200 to automatically execute the automatic operation instructions corresponding to the various operations in the screen projection scene 10, completing the interaction process between the mobile phone 100 and the smart TV 200 with one key and realizing the screen projection scene 10, that is, the scenario of the distributed screen projection task (or video playback task) between the mobile phone 100 and the smart TV 200.
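The one-key execution described above can be sketched as a simple dispatcher (the device names, record fields, and function below are illustrative assumptions, not part of the application): the executable file is treated as an ordered list of automatic operation instructions, each tagged with its execution subject; the mobile phone executes its own instructions itself and pushes the rest to the smart TV.

```python
# Illustrative sketch: dispatch each automatic operation instruction in
# the executable file to its execution subject (main device or peer).
def run_executable(instructions, local_device, send_to_peer):
    executed_locally, pushed = [], []
    for instr in instructions:        # preserve the recorded order
        if instr["device"] == local_device:
            executed_locally.append(instr["action"])
        else:
            pushed.append(instr)      # e.g. forward to the smart TV
            send_to_peer(instr)
    return executed_locally, pushed

sent = []
instructions = [
    {"device": "phone", "action": "open video application"},
    {"device": "phone", "action": "select screen-projection device"},
    {"device": "tv", "action": "set volume to 40"},
]
local, pushed = run_executable(instructions, "phone", sent.append)
assert local == ["open video application", "select screen-projection device"]
assert pushed == [{"device": "tv", "action": "set volume to 40"}]
```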
  • in response to the user clicking the "one-key projection" button 230, the mobile phone 100 can automatically control the mobile phone 100 or the smart TV 200 to perform the related operations in the process of controlling and executing the automatic operation instructions corresponding to the various operations in the screen projection scene 10, realizing the interaction without user participation.
  • therefore, the distributed implementation method between electronic devices of this application can automate the multiple cumbersome operations involved in the specified tasks that users need to perform when reproducing commonly used multi-device interaction scenarios, realizing them with a one-key automatic operation button, which simplifies the user's operation process, thereby improving operation efficiency and saving operation time.
  • the mobile phone 100 and other electronic devices can quickly realize the interaction scene between multiple devices based on the distributed implementation method provided by this application.
  • the mobile phone 100 can appropriately push a notification introducing the feature.
  • the mobile phone 100 can prompt the user whether to record the relevant operation process of the interaction of each device during the screen projection process; as shown in FIG. 2D, for example, when the user clicks the projection button 220 on the video playback interface of the mobile phone 100, the mobile phone 100 displays the notification pop-up window 240 shown in FIG. 2D.
  • the function introduction 242 can also be displayed below the notification content 241, for example, "a one-click projection button can be generated after the recording is completed, and the next time the screen projection can be completed with one button, which is convenient and fast"; the user can click the "View" button 243 on the notification pop-up window 240 to view the tutorial, including how to operate the mobile phone 100 to record operation information, how to operate the mobile phone 100 to generate the "one-key projection" button 230, and how to use the generated "one-key projection" button 230 for one-key projection.
  • the user can also click the "close” button 244 on the notification pop-up window 240 to close the notification pop-up window 240, and continue the manual operation process related to screen projection.
  • the mobile phone 100 can also display a pop-up window or a tutorial interface, etc., to prompt the user how to operate the smart TV 200 to record operation information.
  • the pop-up window, the tutorial interface, and the like may also take other interface forms, which are not limited here.
  • a database for storing the operation information recorded by each interacting electronic device may be set on the main device used to control the recording of operation information; after each electronic device completes recording the operation information of its operations, it can send the recorded operation information to the main device to be stored in the operation information database for the main device to call.
  • an operation information database can also be set on the cloud server; each electronic device can upload the recorded operation information to the cloud server after recording the operation information of each operation, and when the master device needs to obtain the operation information recorded by each electronic device, it can request it from the cloud server, which is not limited here.
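The two database placements described above can be sketched with a minimal store (the class and method names are assumptions for this sketch): the same interface can back either a database on the main device or one on a cloud server, and the main device fetches the recorded operation information from whichever holds it.

```python
# Illustrative sketch: operation-information database kept either on the
# main device or on a cloud server, behind the same interface.
class OperationInfoDB:
    def __init__(self):
        self._records = {}  # device name -> list of operation info

    def upload(self, device, op_info):
        """Store one piece of recorded operation information."""
        self._records.setdefault(device, []).append(op_info)

    def fetch(self, device):
        """Return the operation information recorded by a device."""
        return list(self._records.get(device, []))

# On-main-device placement: the peer sends its records to the phone's DB.
phone_db = OperationInfoDB()
phone_db.upload("smart_tv", {"action": "set definition"})
assert phone_db.fetch("smart_tv") == [{"action": "set definition"}]
# A cloud placement would use the same calls against a remote service,
# with the main device requesting the records on demand.
```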
  • the distributed implementation method between electronic devices can be applied to various electronic devices, including but not limited to the mobile phone 100, the smart TV 200, smart speakers, portable computers, smart screens, desktop computers, and tablet computers.
  • for ease of description, the following takes as an example the case in which the interacting electronic devices include the mobile phone 100, and the main device executing the distributed implementation method among electronic devices of the present application is also the mobile phone 100. It can be understood that, in some other embodiments, the interacting electronic devices in the multi-device interaction scene may not include the mobile phone 100, or the main device among the interacting electronic devices in the multi-device interaction scene may be an electronic device other than the mobile phone 100, which is not limited here.
  • FIG. 3 shows a schematic interaction flowchart for generating a "one-key projection" button by the mobile phone 100 based on the operation information recorded by the mobile phone 100 and the smart TV 200 in the screen projection scene 10 according to an embodiment of the present application.
  • the process includes the following steps:
  • the mobile phone 100 detects that the user clicks the recording control to start recording an operation instruction, and enters the operation recording mode.
  • the interacting mobile phone 100 and smart TV 200 are each installed with an application program capable of recording the operation information related to the various operations performed by the user on the mobile phone 100 or the smart TV 200, and the user can make the mobile phone 100 enter the operation recording mode by clicking the program icon corresponding to the application program on the mobile phone 100, or by triggering the control that runs the application program to start recording.
  • before operating the mobile phone 100 to enter the operation recording mode, the user can first complete the network connection settings of the smart TV 200, such as starting the smart TV 200, entering the network settings of the smart TV 200, selecting a network name or network account, and entering a password to connect the smart TV 200 to the home wireless local area network. If there are multiple wireless local area networks that the smart TV 200 can connect to, one of them can be set as the default connection network of the smart TV 200. Details are not repeated here.
  • the operation interface for the user to operate the mobile phone 100 to enter the operation record mode can be referred to in FIG. 4A to FIG. 4B .
  • FIG. 4A shows a schematic diagram of an operation interface in which the user operates the mobile phone 100 to control and start recording operation information.
  • the user can slide down from the top right side of the screen of the mobile phone 100 to call out the control center menu bar 410; then, referring to operation 2 shown in FIG. 4A, the user clicks the operation record button 411 displayed in the control center menu bar 410 to make the mobile phone 100 enter the operation recording mode, at which point the mobile phone 100, in response to the user's operation, starts recording operation information.
  • the screen of the mobile phone 100 may display a status bar or a control indicating that the mobile phone 100 is in the operation recording mode, refer to the floating control 420 shown in FIG. 4B .
  • the above-mentioned operation record button 411 may be a function entry provided by an application program installed on the mobile phone 100 for recording operation information; the user may also click the icon of the application to start the process of recording operation information.
  • FIG. 4B shows a schematic interface of the mobile phone 100 after entering the operation recording mode.
  • a floating control 420 can be displayed on the screen of the mobile phone 100, and the floating control 420 can display a text indicating that the mobile phone 100 is in the operation recording mode, such as "recording".
  • the recording duration can also be displayed.
  • the interface after the mobile phone 100 enters the operation record mode may also be another interface form different from that shown in FIG. 4B , which is not limited here.
  • the mobile phone 100 detects the operations performed by the user in the interactive scene, and records and stores the operation information corresponding to each operation.
  • the user's operation focus is on the mobile phone 100 . That is, the mobile phone 100 can detect various operations of the user, and record and store operation information generated by various operations performed by the user on the mobile phone 100 .
  • the interface change process corresponding to various operations performed by the user on the mobile phone 100 will be described in detail below, and will not be repeated here.
  • the operation information corresponding to the above-mentioned various operations recorded by the mobile phone 100 includes, for example, information related to the video application opened by the user operating the mobile phone 100 (such as the name of the video application, the icon of the video application, etc.), information related to the click operation (such as the coordinate information of the click position, the name of the control corresponding to the click position, the video name corresponding to the click position, the video access address, etc.), information related to the user's click operation on the video playback interface (for example, the coordinate information of the clicked position, the name of the control corresponding to the clicked position, the name and access ID of the screen projection device corresponding to the clicked position), and information related to playback parameter setting operations such as switching the resolution or adjusting the playback volume after the screen projection is completed.
  • the mobile phone 100 can also record the above-mentioned operation information through the window management service (Window Manager Service, WMS), or record the above-mentioned operation information through the processor responding to the above-mentioned user operations, and the operation information recorded by the mobile phone 100 can be stored first.
  • the system allocates a database for storing the operation information (hereinafter referred to as the operation information database), and the operation information recorded by the mobile phone 100 can also be directly stored in the operation information database.
  • the operation information database can also be set on the cloud server; in that case, the mobile phone 100 can first store the recorded operation information locally, and when the recording is finished, the mobile phone 100 can upload all the recorded operation information to the operation information database on the cloud server for calling.
  • the operation information data transmission process corresponding to the operation information database being set on the mobile phone 100 or on the cloud server will be described in detail below, and will not be repeated here.
  • the order in which the mobile phone 100 records the above operation information corresponds to the order in which the user performs the related operations on the mobile phone 100.
  • the operation information recorded by the mobile phone 100 may carry sequence information corresponding to the order of the user's operations, such as time information of when the user performed each operation.
  • the mobile phone 100 detects an operation in which the user selects the smart TV 200 as the peer device.
  • this is the operation of selecting, for the main device, the peer device that performs the interactive task; for example, among the various operations of the screen projection scene 10, when the user operates the mobile phone 100 to perform screen projection, it is necessary to select, on the mobile phone 100, a screen-casting device (that is, a peer device) for performing the video playback task, such as the smart TV 200.
  • the specific operation of selecting a screen-casting device may be, for example: the user clicks the screen projection button on the video playback interface opened on the mobile phone 100 and selects the smart TV 200 as the screen projection device.
  • the mobile phone 100 sends an instruction to the smart TV 200 to enter the operation recording mode.
  • the mobile phone 100 sends an instruction to enter the operation record mode to the peer device (the smart TV 200 ).
  • the user will select a peer device for interactive operation through the master device, and the peer device is another electronic device that interacts with the master device in the scenario.
  • the master device may send an instruction of the operation record mode to the peer device, so that the peer device enters the operation record mode.
  • when the mobile phone 100 detects the operation of the user selecting the smart TV 200 as the peer device, it can also send, at this time, a status label indicating that the mobile phone 100 is in the operation recording mode to the smart TV 200, so that the smart TV 200 enters a preparation state for recording operation information, which is not limited here.
  • the smart TV 200 receives an instruction to enter the operation recording mode, and enters the operation recording mode.
  • the processor of the smart TV 200 can analyze and execute the instruction, and control the smart TV 200 to enter the operation recording mode.
  • the smart TV 200 detects that the user implements an operation of an interactive scene.
  • the various operations for the user to realize the interactive scene include operations on the smart TV 200, as the peer device, for realizing the screen projection scene 10.
  • for example, the mobile phone 100 can send a connection request to the smart TV 200, requesting to establish a connection with the smart TV 200 to transmit the relevant data of the video playback task, and the user can perform any operation on the smart TV 200.
  • through such operations, the smart TV 200 further implements the screen projection scene 10.
  • the smart TV 200 can determine that the user's operation has been detected after receiving the control signal sent by the user through the remote control.
  • the user can also perform operations such as switching episodes, switching the playback progress, setting the definition, setting the playback volume, or setting the brightness, which are not limited here.
  • the smart TV 200 sends an operation focus switching notification to the mobile phone 100 .
  • the smart TV 200 sends a notification to the mobile phone 100 that the focus of the user's operation has been switched to the smart TV 200 when it detects that the user has implemented an operation in an interactive scene.
  • in some embodiments, after the smart TV 200 enters the operation recording mode, it may, upon detecting a user operation, immediately start recording the operation information corresponding to the various operations performed by the user on the smart TV 200 to realize the screen projection scene 10, without limitation here.
  • when the smart TV 200 detects a user operation, it may also send an operation focus switching notification and a request to record operation information to the mobile phone 100; after receiving the request, the mobile phone 100 confirms the switch of the user's operation focus and authorizes the smart TV 200 to record the user operation information, which is not limited here.
  • the mobile phone 100 sends a recording instruction to the smart TV 200 based on the received operation focus switching notification.
  • the mobile phone 100 may send a recording instruction to start recording operation information to the smart TV 200, so as to instruct the smart TV 200 to start recording the operation information corresponding to the various operations performed by the user.
  • based on the received recording instruction, the smart TV 200 records and stores the operation information corresponding to the various operations performed by the user on the smart TV 200.
  • through its processor or the like, the smart TV 200 can respond to the recording instruction sent by the mobile phone 100 and to the various operations performed by the user on the smart TV 200 to realize the screen projection scene 10 (for example, the user uses the remote control of the smart TV 200 to switch episodes, change the playback progress, set the definition, set the playback volume, or set the brightness), and record the operation information corresponding to these operations.
  • the operation information recorded by the smart TV 200 may be stored first, for example, stored in a memory of the smart TV 200 or a connected external memory, etc., and no limitation is set here.
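The operation information described above (operation type, parameters, the device label mentioned in later steps, and the recording order) can be sketched as a simple log structure. This is an illustrative assumption only; the patent does not specify a concrete format, and every field name below is hypothetical:

```python
import json
import time

def record_operation(log, device_label, op_type, params):
    """Append one recorded operation; all field names are hypothetical."""
    entry = {
        "seq": len(log) + 1,       # preserves the user's operation order
        "device": device_label,    # e.g. "smart_tv_200" or "phone_100"
        "op": op_type,             # e.g. "set_volume", "switch_episode"
        "params": params,          # e.g. {"value": 30}
        "ts": time.time(),         # recording time, usable for batched sending
    }
    log.append(entry)
    return entry

# the smart TV records two parameter-setting operations in order
log = []
record_operation(log, "smart_tv_200", "set_definition", {"value": "720p"})
record_operation(log, "smart_tv_200", "set_volume", {"value": 30})
print(json.dumps(log, indent=2))
```

Such a log could later be sent to the phone all at once or in batches keyed on the recording time, matching the batched sending by recording time described later.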
  • the electronic devices interacting with the mobile phone 100 include not only the smart TV 200 but also a tablet computer; at this time, after the smart TV 200 finishes recording the operation information corresponding to the various operations performed by the user on the smart TV 200, it may enter a state of waiting for user operations.
  • if the user starts to operate the mobile phone 100 after completing the operations on the smart TV 200 and selects the tablet computer as the peer device for a certain task, for example, as the peer device for a document processing task, the mobile phone 100 repeats the above step 302 to record the corresponding operation information, and performs the above step 303 to send an instruction to enter the operation recording mode to the tablet computer. Afterwards, when the user operates the tablet computer, the tablet computer acts as the execution subject to perform the above steps 305 to 309 and record the operation information corresponding to the various operations performed by the user on the tablet computer.
  • the electronic device that detects the user's operation can act as the execution subject to perform the above steps 305 to 309, and record the operation information of each operation according to the user's operation sequence.
  • the mobile phone 100 as the main device can be operated to end the operation recording mode, and the mobile phone 100 continues to execute the following step 310 .
  • there is no limitation here.
  • the mobile phone 100 detects the user's operation instruction to end the recording, and ends the recording of the operation information.
  • an end record button 421 may be displayed on the floating control 420.
  • the user may click the end recording button 421 displayed in the floating control 420 on the screen of the mobile phone 100 to make the mobile phone 100 exit the operation recording mode (refer to operation 3 shown in FIG. 4B); at this time, the mobile phone 100 finishes recording the operation information in response to the user's operation to end the recording.
  • for the user's operation to end the recording, reference may also be made to operation 1 shown in FIG. 4A above.
  • the user can click the operation recording button 411 again in the control center menu bar 410 displayed on the mobile phone 100 to make the mobile phone 100 exit the operation recording mode and end the recording of operation information, with no limitation here.
  • the mobile phone 100 sends an instruction to end recording to the smart TV 200 .
  • the mobile phone 100 may send an end recording instruction to the smart TV 200 .
  • the smart TV 200 receives a recording end instruction, and ends recording operation information.
  • after receiving the instruction to end the recording, the smart TV 200 exits the operation recording mode and ends the recording of the operation information.
  • the smart TV 200 sends the recorded operation information to the mobile phone 100.
  • after the smart TV 200 finishes recording the operation information, it can send all the recorded operation information to the mobile phone 100 at once, or the smart TV 200 may send the recorded operation information to the mobile phone 100 in batches according to the recording time, which is not limited here.
  • each electronic device can send the recorded operation information to the mobile phone 100; it can be understood that the operation information sent by each electronic device to the mobile phone 100 includes the device information of that electronic device, such as a device label.
  • each electronic device sends the recorded operation information to the mobile phone 100.
  • if the operation information database is set on the mobile phone 100, the mobile phone 100 can directly store the received operation information in the operation information database; if the operation information database is set on the cloud server, each electronic device can upload the recorded operation information to the cloud server for the mobile phone 100 to request and retrieve.
  • the operation information data transmission process corresponding to the operation information database being set on the mobile phone 100 or on the cloud server will be described in detail below, and will not be repeated here.
  • the mobile phone 100 generates an executable file based on the operation information recorded by the mobile phone 100 and the smart TV 200 .
  • the mobile phone 100 can retrieve all the operation information stored in the operation information database, for example, including the operation information recorded by the mobile phone 100 and the smart TV 200, and based on the operation information generate an executable automated operation program script corresponding to the various operations, or store the automatic operation instructions corresponding to the various operations in an executable file.
  • when the executable file is triggered to execute, the automatic operation instructions corresponding to the various operations can be executed according to the sequence of the user's operations on the mobile phone 100 and the smart TV 200; when the executable file generated by the mobile phone 100 is triggered to execute, the mobile phone 100 can control the sending of the corresponding automatic operation instructions to the execution subject of each operation, which will be described in detail below and will not be repeated here.
  • for the various operations performed by the user on the mobile phone 100, refer to the relevant descriptions in step 302 above; for the various operations performed by the user on the smart TV 200, refer to the relevant descriptions in step 309 above, which will not be repeated here.
  • the operation information recorded by the above mobile phone 100 or smart TV 200 may be information such as instructions and related data generated in response to user operations, or may be instructions and data information converted into an executable script language for description.
  • the mobile phone 100, based on the operation information recorded by the mobile phone 100 or the smart TV 200, generates an automatic operation program script or an automatic operation instruction corresponding to each operation.
  • data such as the text input content and set parameters corresponding to the operations performed, and the corresponding processing methods in the process of converting this operation information into automatic operation instructions, will be described in detail below and will not be repeated here.
  • the mobile phone 100 prompts and responds to the user's operation, and finishes creating an automatic operation button.
  • the mobile phone 100 may display a pop-up window to prompt the user to operate the mobile phone 100 to complete the process of generating an automatic operation button, which will be described in detail below in conjunction with the accompanying drawings, and will not be repeated here.
  • the automatic operation button created by the mobile phone 100 will be used as an operation entry for one-click interaction between the mobile phone 100 and the smart TV 200.
  • the mobile phone 100 controls the process of executing the automatic operation instructions corresponding to various operations in response to the user's operation of clicking the automatic operation button, which will be described in detail below and will not be repeated here.
  • the smart TV 200 may also, after receiving the instruction to enter the operation recording mode from the mobile phone 100, enter the operation recording mode and immediately detect user operations and start recording the operation information corresponding to the various operations performed by the user on the smart TV 200. That is, in some other embodiments, when implementing the distributed implementation method among electronic devices provided by the present application, after the above steps 305 to 306 are performed, step 309 may be performed directly to record the operations performed by the user on the smart TV 200, instead of performing the processes described in steps 307 to 308 to transmit the operation focus switching notification and the recording instruction between the mobile phone 100 and the smart TV 200 before performing step 309 to record the operation information, which is not limited here.
  • the user may first complete the connection between the mobile phone 100 and the smart TV 200, for example, using the smart TV 200 as a distributed device of the mobile phone 100.
  • the connection is realized through a distributed soft bus.
  • the mobile phone 100 as the main device enters the operation information recording mode, it can send an instruction to enter the operation recording mode to the smart TV 200 that has been connected with the mobile phone 100, so that the smart TV 200 enters the operation information recording mode after receiving the instruction.
  • the smart TV 200 detects a user operation, it starts to record relevant operation information, which is not limited here.
  • the interfaces corresponding to the various operations performed by the user on the mobile phone 100 for realizing the screen projection scene 10 may refer to FIGS. 5A to 5D .
  • FIG. 5A shows a schematic interface diagram of the main interface of the video application displayed by the mobile phone 100 in response to the user's operation of opening the video application.
  • the video application opened by the user operating the mobile phone 100 is Huawei™ Video; that is, the user clicks the application icon of Huawei™ Video among the desktop applications of the mobile phone 100, the mobile phone 100 runs the Huawei™ Video application, and when the user clicks the "My" button 511 at the bottom of the video interface 510, the video interface 510 displayed on the mobile phone 100 is the interface shown in FIG. 5A.
  • the user can also click other buttons at the bottom of the video interface 510, such as the "Home" button 512, the "Aspect" button 513, the "Special Area" button 514, and the "Education" button 515, to enter the corresponding display interface and choose the video to play, with no limitation here.
  • FIG. 5B shows a schematic diagram of a video playback interface displayed by the mobile phone 100 in response to the user's selection operation of clicking on a video to be played.
  • the user can click a video below the "play history" option 516 on the interface shown in FIG. , and click the screen projection button 521 on the video playback interface 520 to perform screen projection.
  • the screen projection button 521 displayed on the video playback interface 520 may refer to the screen projection button 220 shown in FIG. 2B above.
  • the video playback interface 520 displayed on the mobile phone 100 may also be in other interface forms, for example, it may be a video playback interface displayed on a horizontal screen, which is not limited here.
  • FIG. 5C shows a schematic interface diagram of the screen projection device selection interface displayed by the mobile phone 100 in response to the user's operation of clicking the screen projection button 521 .
  • a screen projection device selection window 530 is displayed on the video playback interface 520, and the user can select the searched smart TV 200 connected to the mobile phone 100 as the screen projection device (that is, the peer device interacting with the mobile phone 100); refer to operation 4 shown in FIG. 5C.
  • FIG. 5D shows a schematic diagram of the interface of the mobile phone 100 after screen projection is completed in response to the user selecting the smart TV 200 as the screen projection device.
  • the user can click the resolution switch button 541 on the interface 540 to set the resolution of the video played on the smart TV 200 (for example, setting the resolution to 720p), or click the volume adjustment button 542 to adjust the playback volume of the video on the smart TV 200.
  • when the mobile phone 100 is in the operation recording mode, it can temporarily stop responding to other events; for example, internal events such as alarm clocks and memos set on the mobile phone 100 can temporarily stop being responded to, and the mobile phone 100 responds to these internal events after exiting the operation recording mode. In some other embodiments, the mobile phone 100 can also respond to other events while in the operation recording mode; for example, while the mobile phone 100 is recording the operation information of the user operating the mobile phone 100, the mobile phone 100 can still respond to internal events such as alarm clocks and memos.
  • however, the user should not perform any operation on the alarm clock reminder window or memo reminder window displayed on the mobile phone 100 at this time, so that the user's operations on these reminder windows are not recorded as operation information, which would affect the execution of subsequent steps.
  • for the corresponding processing methods in the process of converting operation information into automated operation instructions in step 314, reference may be made, for example, to the following descriptions of converting into automatic operation instructions the operation information corresponding to text entry operations, operations of running applications and jumping between applications, connection operations between multiple devices, parameter setting operations, operations of clicking controls on the screen, or sliding operations on the screen.
  • for the operation information corresponding to a text entry operation, when the operation information is converted into an automatic operation instruction, the text input by the user and the information about the position of the user's input operation in the operation information can be read into the instruction.
  • the electronic device that receives the text input by the user can automatically fill in the input text in the instruction at the corresponding input position, based on the input position information in the instruction.
  • for example, the user enters a video name such as "Chen Qing Ling" in the search box of the video application interface opened on the mobile phone 100; the recorded operation information will then include the text "Chen Qing Ling".
  • when the operation information corresponding to the above operation process is converted into an automated operation instruction, the text "Chen Qing Ling" and the input position (such as the search box of the video application interface) can be read from the operation information and written into the instruction, so that when the mobile phone 100 subsequently executes the automatic operation instruction corresponding to the above operation process, it can automatically enter the text "Chen Qing Ling" in the search box of the video application interface to search.
  • if the operation information recorded by the mobile phone 100 includes the operation information corresponding to the user operating the electronic device to run application program A and application program B, then when the operation information recorded by the mobile phone 100 is converted into automatic operation instructions, the package names corresponding to application program A and application program B in the operation information can be read, and the sequence information of the mobile phone 100 running application program A and application program B can be written into the instruction, for example, that during the recording of the operation information the mobile phone 100 first ran application program A and then ran application program B.
  • when the instruction is executed, the package can be queried based on the package name of application program A in the instruction to automatically run application program A, and then the package can be queried based on the package name of application program B in the instruction to automatically run application program B, completing the jump from application program A to application program B.
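As a minimal sketch of the application-jump conversion just described, the snippet below turns recorded run-application operations into launch instructions keyed by package name and replays them in the recorded order. The record layout and the stand-in `execute` helper are hypothetical:

```python
def convert_app_jump(records):
    """Convert recorded run-application operations into launch instructions,
    preserving the recorded order (application A before application B)."""
    return [{"action": "launch_app", "package": r["package"]}
            for r in records if r["op"] == "run_app"]

def execute(instructions, launched):
    """Stand-in executor: a real device would query each package by name
    and run the application; here we only note the launch order."""
    for ins in instructions:
        launched.append(ins["package"])

records = [{"op": "run_app", "package": "com.example.appA"},
           {"op": "run_app", "package": "com.example.appB"}]
launched = []
execute(convert_app_jump(records), launched)
print(launched)
```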
  • for the operation information corresponding to a parameter setting operation performed by the user on a device, such as the operation information corresponding to the parameter setting operation performed by the user on the smart TV 200 shown in FIG.
  • when the operation information is converted into an automatic operation instruction, the setting parameter data in the operation information can be read and written into the instruction, for example, the user's set values for parameters such as volume, definition, and brightness when the above operation information was recorded.
  • when the instruction is executed, the smart TV 200 can automatically adjust parameters such as the volume, definition, and brightness of the video played by the smart TV 200 to the set values, based on the setting parameter data in the instruction.
  • the above describes how the operation information of text entry operations, application jump operations, connection operations between multiple devices, setting operations, and related operations performed by clicking on the screen is converted into automatic operation instructions; other reasonable processing methods can also be adopted, which are not limited here.
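The conversions above can be sketched as one dispatch function that reads the relevant fields of each recorded operation into an automatic operation instruction. The record and instruction layouts are illustrative assumptions, not the patent's actual data format:

```python
def to_instruction(record):
    """Convert one recorded operation into an automatic operation instruction."""
    op = record["op"]
    if op == "text_entry":
        # read the user-input text and the input position into the instruction
        return {"action": "input_text",
                "target": record["params"]["input_box"],
                "text": record["params"]["text"]}
    if op == "set_parameter":
        # read the set parameter value into the instruction
        return {"action": "set_parameter",
                "name": record["params"]["name"],
                "value": record["params"]["value"]}
    raise ValueError(f"unsupported operation type: {op}")

records = [
    {"op": "text_entry",
     "params": {"input_box": "video_app_search_box", "text": "Chen Qing Ling"}},
    {"op": "set_parameter", "params": {"name": "volume", "value": 30}},
]
instructions = [to_instruction(r) for r in records]
```

Executing the first resulting instruction would then automatically fill "Chen Qing Ling" into the recorded input position, and the second would restore the recorded volume value.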
  • the mobile phone 100 can copy and write the operation information read from the operation information database into the same automated operation program file according to the corresponding operation sequence, and the operation program can be run from that file.
  • the scripting languages that can be used to run the various operations corresponding to the operation information include but are not limited to Scala, JavaScript, VBScript, ActionScript, MAXScript, ASP, JSP, PHP, SQL, Perl, Shell, Python, Ruby, JavaFX, Lua, AutoIt, etc., with no limitation here.
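As one hedged illustration of the executable file described above, the phone could serialize the instruction list into a single runnable script in one of the listed languages (Python here). The file name and instruction contents are assumptions for illustration:

```python
import json
import os
import subprocess
import sys
import tempfile

# instruction list assembled from the recorded operation information
instructions = [
    {"action": "run_app", "package": "com.example.video"},
    {"action": "input_text", "target": "search_box", "text": "Chen Qing Ling"},
]

# write the instructions into a self-contained, runnable Python script
script = "instructions = " + json.dumps(instructions, indent=2) + """

for ins in instructions:
    # a real implementation would dispatch each instruction to the system;
    # here each action is printed in its recorded order
    print(ins["action"])
"""

path = os.path.join(tempfile.mkdtemp(), "one_key_projection.py")
with open(path, "w", encoding="utf-8") as f:
    f.write(script)

# triggering the "executable file" runs the operations in order
out = subprocess.run([sys.executable, path], capture_output=True, text=True)
print(out.stdout)
```

Triggering the generated file replays the recorded operations in the order they were performed, which is the behavior step 314 attributes to the executable file.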
  • the mobile phone 100 prompts the user to operate the mobile phone 100 to complete the interface change process corresponding to the creation of the automatic operation button by displaying a pop-up window, as shown in FIG. 6A to FIG. 6B .
  • FIG. 6A shows a schematic diagram of an interface where the mobile phone 100 prompts the user to create an automatic operation button by displaying a pop-up window.
  • the mobile phone 100 can display a selection window 610 through a pop-up window, in which a generate automatic operation button 611 and a re-record operation information button 612 are displayed; referring to operation 5 shown in FIG. 6A, the user can click the generate automatic operation button 611 to generate an automatic operation button from the operation information stored in the operation information database.
  • the mobile phone 100 reads the operation information in the operation information database in response to the user's operation of clicking the generate automatic operation button 611, writes an automatic operation script, and generates an automatic operation button. If the user clicks the re-record operation information button 612 shown in FIG. 6A , the mobile phone 100 can restart the above step 301 to enter the operation record mode.
  • FIG. 6B shows a schematic diagram of an interface in which the mobile phone 100 prompts the user to perform corresponding settings on the created automatic operation button through a pop-up window after the automatic operation button is created.
  • the mobile phone 100 can display a selection window 620.
  • the selection window 620 includes a custom naming option box 621, an "Add to desktop" option button 622, and an "Add to Control Center" option button 623.
  • the user can click the custom naming option box 621 to input a custom name; for example, the user can input "one-key projection" as the name of the automatic operation button generated by the mobile phone 100. By clicking the Add to Desktop option button 622 on the selection window 620, the user can add the icon corresponding to the generated automatic operation button (or the "one-key projection" button after custom naming) to the desktop of the mobile phone 100; refer to the "one-key projection" button 630 displayed on the desktop of the mobile phone 100 shown in FIG. 6B.
  • the mobile phone 100 may also prompt the user to create a voice command.
  • the process of the mobile phone 100 prompting the user to create a voice command may refer to the above-mentioned FIGS. 6A to 6B , which will not be repeated here.
  • the user can first wake up the voice assistant of the mobile phone 100 through a preset wake-up word; for example, the user can wake up the voice assistant of the mobile phone 100 by saying "Xiaoyi".
  • the operation information database can be set on the mobile phone 100 as the main device, or it can be set on the cloud server.
  • the process of interaction between the mobile phone 100, the smart TV 200 and the operation information database will be different.
  • FIG. 7A shows a schematic flow chart of interactive operation information between the mobile phone 100 , the smart TV 200 and the operation information database when the operation information database is set on the mobile phone 100 .
  • the operation information database 101 for storing the operation information recorded by the mobile phone 100 and the smart TV 200 can be set on the mobile phone 100; for example, the system of the mobile phone 100 sets the above operation information database in the storage space configured for the application program.
  • the mobile phone 100 and the smart TV 200 are directly connected by communication and can transmit data such as operation information to each other.
  • the connection between the mobile phone 100 and the smart TV 200 can be realized in various ways.
  • the mobile phone 100 and the smart TV 200 can be connected through a distributed soft bus.
  • the connection between the mobile phone 100 and the smart TV 200 may also be realized through other means such as Bluetooth, which is not limited here.
  • the window management service 102 of the mobile phone 100 can, while responding to the user's operations, record and store the operation information corresponding to the various operations performed by the user on the mobile phone 100; for details, refer to the relevant description in step 302 above, which will not be repeated here.
  • the mobile phone 100 may first store the recorded operation information in the internal memory, and then store the recorded operation information in the operation information database when the mobile phone 100 finishes recording. It can be understood that, in some other embodiments, the operation information corresponding to various operations performed by the user on the mobile phone 100 may also be directly stored in the operation information database on the mobile phone 100 , which is not limited here.
  • the smart TV 200 can record and store the operation information corresponding to the various operations performed by the user on the smart TV 200; for details, refer to the relevant description in step 309 above, which will not be repeated here. When the smart TV 200 detects the user's operation, it can perform notification and instruction interaction with the mobile phone 100; for the interaction process, refer to the relevant descriptions in steps 305 to 309 above. When the smart TV 200 finishes recording, it can send the recorded and stored operation information to the mobile phone 100, and the mobile phone 100 can store the operation information in the operation information database 101 after receiving the operation information recorded by the smart TV 200.
  • after the mobile phone 100 obtains the operation information sent by the smart TV 200, the mobile phone 100 can load all the operation information from the operation information database, including the operation information recorded by the mobile phone 100 and the operation information recorded by the smart TV 200. The mobile phone 100 further generates an automated operation program script and a corresponding executable file based on all the operation information, and then the mobile phone 100 prompts and responds to user operations to create an automated operation button. After the creation is completed, the created automated operation button can be displayed on the mobile phone 100.
  • the process of generating the executable file based on the operation information by the mobile phone 100 can refer to the relevant description in the above step 314, and the process of creating an automatic operation button by the mobile phone 100 can refer to the above step 315, which will not be repeated here.
  • FIG. 7B shows a schematic flow diagram of the interactive operation information between the mobile phone 100, the smart TV 200 and the operation information database when the operation information database is set on the cloud server 300.
  • the operation information database 101 for storing the operation information recorded by the mobile phone 100 and the smart TV 200 can also be set on the cloud server 300.
  • the operation information database 101 set on the cloud server 300 can also be used to respond to and transmit instruction information, such as that related to the user operating the mobile phone 100 to request the recording of operation information.
  • the mobile phone 100 and the smart TV 200 can communicate and transmit data such as operation information through the cloud server 300.
  • the mobile phone 100 can send a link application requesting a communication connection to the smart TV 200 through the cloud server 300. After the smart TV 200 receives the link application, it can display a notification window asking whether to authorize connecting with the mobile phone 100 and sending its device information to the mobile phone 100, for the user to confirm. If the user confirms the authorization, the smart TV 200 can send its own device information to the mobile phone 100 through the cloud server 300, and the mobile phone 100 completes the device connection with the smart TV 200 after acquiring the device information of the smart TV 200.
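The cloud-relayed connection flow above (link application, user authorization, device-info return) can be sketched as message passing through a relay. All class names, message types, and fields below are illustrative assumptions:

```python
class CloudServer:
    """Minimal relay: forwards messages between registered devices."""
    def __init__(self):
        self.devices = {}

    def register(self, name, device):
        self.devices[name] = device

    def forward(self, target, message):
        return self.devices[target].receive(message)

class SmartTV:
    def __init__(self, device_info, user_authorizes=True):
        self.device_info = device_info
        self.user_authorizes = user_authorizes  # stands in for the user's choice

    def receive(self, message):
        if message["type"] == "link_application":
            # a notification window would ask the user to authorize; if
            # confirmed, this device's info is returned to the requester
            if self.user_authorizes:
                return {"type": "device_info", "info": self.device_info}
            return {"type": "rejected"}

cloud = CloudServer()
tv = SmartTV({"label": "smart_tv_200", "model": "demo"})
cloud.register("smart_tv_200", tv)

# the phone sends its link application to the TV through the cloud server
reply = cloud.forward("smart_tv_200", {"type": "link_application",
                                       "from": "phone_100"})
print(reply["type"])  # the phone completes the connection on "device_info"
```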
  • the mobile phone 100 can, while responding to the user's operations, record and store the operation information corresponding to the various operations performed by the user on the mobile phone 100.
  • the operation information recorded by the mobile phone 100 may also be referred to as the mobile phone side operation information shown in FIG. 7B .
  • the mobile phone 100 as the main device can store the recorded operation information in the internal memory of the mobile phone 100 for calling, and can also upload the recorded operation information to the cloud server 300 and store it in the operation information database when the recording is finished.
  • when the smart TV 200 detects the user's operation, it can send a focus switching notification to the mobile phone 100 through the cloud server 300. After receiving the notification, the mobile phone 100 sends a recording instruction to the smart TV 200 through the cloud server 300. After the smart TV 200 receives the recording instruction, it starts to record and store the operation information corresponding to the various operations performed by the user on the smart TV 200; for details, refer to the relevant descriptions in steps 305 to 309 above, which will not be repeated here.
  • the operation information recorded by the smart TV 200 may also be referred to as terminal-side operation information shown in FIG. 7B .
  • when the smart TV 200 finishes recording, it can upload the recorded and stored operation information to the cloud server 300 and store it in the operation information database 101.
  • the mobile phone 100 as the main device needs to obtain all the operation information, including the operation information recorded by the mobile phone 100 and the operation information recorded by the smart TV 200.
  • the cloud server 300 can push the operation information stored in the operation information database 101 to the mobile phone 100 as the main device, or the cloud server 300 can push all the operation information stored in the operation information database 101 to the mobile phone 100 based on a data request sent by the mobile phone 100, which is not limited here.
  • after the mobile phone 100 obtains all the operation information, it can generate an automated operation program script and a corresponding executable file based on all the operation information, and then the mobile phone 100 prompts and responds to the user's operations to create an automatic operation button. After the creation of the automatic operation button is completed, the created automatic operation button can be displayed on the mobile phone 100. For the process of the mobile phone 100 generating an executable file based on the operation information, refer to the relevant description in step 314 above; for the process of the mobile phone 100 creating an automatic operation button, refer to the relevant descriptions shown in FIGS. 6A to 6B.
  • FIG. 8 shows, according to an embodiment of the present application, a schematic interaction flowchart of the mobile phone 100 controlling the execution of the automatic operation instructions, corresponding to the various operations in the screen projection scene 10, in the executable file corresponding to the button, in response to the user clicking the "one-key screen projection" button.
  • the process includes the following steps:
  • the mobile phone 100 detects that the user clicks an automatic operation button, and invokes an automatic operation instruction in an executable file corresponding to the button.
  • the mobile phone 100 detects that the user clicks an automatic operation button for realizing the screen projection scene 10, such as the "one-key screen projection" button 630 shown in FIG. Run the executable file corresponding to the "one-key projection” button 630.
  • the executable file stores the automatic operation instructions corresponding to the various operations, and the mobile phone 100 can call the automatic operation instructions in the executable file and then proceed to the following judgment step 802.
  • the execution subjects of the automatic operation instructions corresponding to the various operations include the mobile phone 100 and the smart TV 200. That is, the automatic operation instructions generated based on the operation information, recorded by the mobile phone 100, of the various operations performed by the user on the mobile phone 100 are executed by the mobile phone 100; the automatic operation instructions generated based on the operation information, recorded by the smart TV 200, of the various operations performed by the user on the smart TV 200 are executed by the smart TV 200.
  • the mobile phone 100 judges the execution subject device of the automatic operation instruction to be executed. If the execution subject corresponding to the automatic operation instruction to be executed is the mobile phone 100, continue to execute step 803;
  • the mobile phone 100 can sequentially judge the execution subject of each automatic operation instruction according to the operation sequence of the operation corresponding to each instruction, and push each automatic operation instruction to the corresponding execution subject for execution, wherein the order in which the mobile phone 100 pushes the instructions is consistent with the operation order of the corresponding operations.
  • if the mobile phone 100 judges that the execution subject corresponding to the automatic operation instruction to be executed is the mobile phone 100, the mobile phone 100 can directly execute the instruction, that is, proceed to the following step 803; if the mobile phone 100 judges that the execution subject corresponding to the automatic operation instruction to be executed is the smart TV 200, the mobile phone 100 needs to push the instruction to the smart TV 200 for execution, that is, proceed to the following step 804.
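  The judgment of step 802 can be sketched as follows. This is a hypothetical illustration only: the `dispatch` function, the `subject` field, and the `push` callback are names chosen for the sketch, not identifiers from the patent.

```python
def dispatch(instruction, local_device, push):
    """Step 802 sketch: if this device is the execution subject of the
    instruction, execute it locally (step 803); otherwise push it to the
    subject device for execution (step 804)."""
    if instruction["subject"] == local_device:
        # Step 803: the local processor invokes the instruction directly.
        return ("executed_locally", instruction["action"])
    # Step 804: push the instruction to its execution subject (e.g. the smart TV).
    push(instruction)
    return ("pushed", instruction["subject"])

pushed = []
result = dispatch({"subject": "tv", "action": "set_volume"}, "phone", pushed.append)
print(result)  # ('pushed', 'tv')
```

  A real implementation would transmit the pushed instruction over the device-to-device or cloud connection; here `push` simply records it.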
  • the mobile phone 100 executes the automatic operation instruction with the mobile phone 100 as the execution subject.
  • the processor of the mobile phone 100 may directly invoke the instruction for execution.
  • the specific instruction push process and instruction execution process will be described in detail below in conjunction with the accompanying drawings, and will not be repeated here.
  • for the scripts corresponding to operations such as text entry, application jump, device connection, and parameter setting involved in the instruction, reference may be made to the relevant description in step 314 above for processing, which is not limited here.
  • the mobile phone 100 sends to the smart TV 200 an automatic operation command whose execution subject is the smart TV 200 .
  • the mobile phone 100 pushes the instruction to the smart TV 200 for execution.
  • the specific instruction push process and instruction execution process will be described in detail below in conjunction with the accompanying drawings, and will not be repeated here.
  • the smart TV 200 executes the received automatic operation instruction.
  • the processor of the smart TV 200 may invoke the instruction for execution.
  • the process of specifically executing the received instruction will be described in detail below in conjunction with the accompanying drawings, and will not be repeated here.
  • the smart TV 200 feeds back a signal to the mobile phone 100 that execution of the instruction is completed.
  • the smart TV 200 may feed back a signal of completion of execution to the mobile phone 100 .
  • the specific process of feeding back the execution completion signal will be described in detail below in conjunction with the accompanying drawings, and will not be repeated here.
  • the mobile phone 100 can push the automatic operation instructions to their execution subjects one by one according to the operation sequence; after an execution subject finishes executing a pushed automatic operation instruction and feeds back a signal that execution is completed, the mobile phone 100 pushes the next automatic operation instruction to be executed according to its execution subject.
  • the mobile phone 100 can also, when calling the automatic operation instructions in the executable file, browse all the instructions and determine the execution subject corresponding to each instruction.
  • in this case, if multiple automatic operation instructions to be executed consecutively correspond to the same execution subject, the mobile phone 100 can push these automatic operation instructions to the corresponding execution subject together; that is, the pushed instructions can be stacked. The execution subject executes them sequentially, and after completing all the automatic operation instructions pushed by the mobile phone 100, it feeds back a signal of completion of execution to the mobile phone 100; the mobile phone 100 then pushes the following instructions according to the execution subject of the next automatic operation instruction to be executed.
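  The two push strategies above (one-by-one with completion feedback, and "stacked" batches grouped by execution subject) can be sketched as follows. This is a hypothetical illustration; the function names, the `"done"` completion signal, and the instruction fields are assumptions for the sketch.

```python
def run_sequence(instructions, local_device, local_exec, remote_exec):
    """One-by-one strategy: process instructions in operation order; for a
    remote subject, push the instruction and wait for its completion signal
    before moving to the next instruction."""
    trace = []
    for ins in instructions:
        if ins["subject"] == local_device:
            local_exec(ins)
            trace.append(("local", ins["action"]))
        else:
            signal = remote_exec(ins)  # blocks until the execution-completed signal
            assert signal == "done"
            trace.append(("remote", ins["action"]))
    return trace

def batch_by_subject(instructions):
    """'Stacked' strategy: group consecutive instructions that share an
    execution subject so each group can be pushed together."""
    batches = []
    for ins in instructions:
        if batches and batches[-1][0] == ins["subject"]:
            batches[-1][1].append(ins)
        else:
            batches.append((ins["subject"], [ins]))
    return batches

ops = [
    {"subject": "phone", "action": "open_app"},
    {"subject": "phone", "action": "start_cast"},
    {"subject": "tv", "action": "set_definition"},
    {"subject": "tv", "action": "set_volume"},
]
trace = run_sequence(ops, "phone", lambda i: None, lambda i: "done")
batches = batch_by_subject(ops)
print(len(batches))  # 2
```

  In both strategies the operation order is preserved; the stacked variant simply reduces the number of round trips between the master device and each execution subject.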
  • FIG. 9A shows a schematic diagram of an interaction process in which the mobile phone 100 pushes an automatic operation instruction to an execution subject corresponding to the automatic operation instruction when the mobile phone 100 is directly connected to the smart TV 200 through communication.
  • when the mobile phone 100 responds to the user's operation of clicking the "one-key screen projection" button 630 and controls the execution of the automatic operation instructions corresponding to the various operations, if it determines that the execution subject of the automatic operation instruction to be executed is the mobile phone 100, the mobile phone 100 directly executes the instruction; if the mobile phone 100 judges that the execution subject of the automatic operation instruction to be executed is the smart TV 200, the mobile phone 100 can directly push the instruction to the smart TV 200, and the smart TV 200 executes the instruction after receiving the automatic operation instruction pushed by the mobile phone 100. Wherein, after the smart TV 200 finishes executing the corresponding instruction, it may directly feed back (send) a signal of completion of execution to the mobile phone 100.
  • FIG. 9B shows a schematic diagram of an interaction process in which the mobile phone 100 pushes the automatic operation instruction to the execution subject corresponding to the automatic operation instruction when the mobile phone 100 is connected to the smart TV 200 through the cloud server 300 .
  • when the mobile phone 100 responds to the user's operation of clicking the "one-key screen projection" button 630 and controls the execution of the automatic operation instructions corresponding to the various operations, if it determines that the execution subject of the automatic operation instruction to be executed is the mobile phone 100, the mobile phone 100 directly executes the instruction; if the mobile phone 100 judges that the execution subject of the automatic operation instruction to be executed is the smart TV 200, the mobile phone 100 needs to push the instruction to the smart TV 200 through the cloud server 300, and the smart TV 200 executes the instruction after receiving, from the cloud server 300, the automatic operation instruction pushed by the mobile phone 100. Wherein, after the smart TV 200 finishes executing the corresponding instruction, it can feed back a signal of completion of execution to the mobile phone 100 through the cloud server 300.
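  The cloud-relayed path of FIG. 9B can be sketched as a forwarding layer between the master device and the execution subject. This is a hypothetical illustration; `CloudRelay`, the handler registration, and the `"done"` signal are names invented for the sketch.

```python
class CloudRelay:
    """Minimal stand-in for the cloud server 300: the master device cannot
    reach the smart TV directly, so instructions travel to the target through
    the relay, and the completion signal travels back the same way."""

    def __init__(self):
        self.handlers = {}  # device name -> callable(instruction) -> completion signal

    def register(self, name, handler):
        self.handlers[name] = handler

    def forward(self, target, instruction):
        # Deliver the instruction to the target device and relay its feedback.
        return self.handlers[target](instruction)

executed = []

def tv_handler(instruction):
    """Plays the role of the smart TV: execute, then feed back completion."""
    executed.append(instruction["action"])
    return "done"

relay = CloudRelay()
relay.register("tv", tv_handler)
signal = relay.forward("tv", {"action": "set_volume", "level": 30})
print(signal)  # done
```

  Compared with the direct-connection case of FIG. 9A, only the transport changes: the dispatch logic on the master device is identical, but the push and the feedback each take an extra hop through the relay.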
  • the order in which the mobile phone 100 or the smart TV 200 executes the corresponding automatic operation instructions can refer to the relevant descriptions in the above-mentioned steps 302, 309 and 314.
  • for example, the user sequentially performs, on the mobile phone 100, operations such as opening the video application, clicking the video to be played, and clicking the screen projection button to select the smart TV 200 as the screen projection device, and then the user sequentially performs operations such as setting the definition, the playback volume, or the brightness on the smart TV 200. Therefore, when executing the automatic operation instructions corresponding to the various operations, the mobile phone 100 can first sequentially execute the instructions corresponding to the above-mentioned operations of opening the video application, clicking the video to be played, and clicking the screen projection button to select the smart TV 200.
  • then the mobile phone 100 pushes the automatic operation instructions that need to be executed by the smart TV 200 to the smart TV 200, and the smart TV 200 executes them in order after receiving them.
  • the smart TV 200 feeds back a signal of completion of execution to the mobile phone 100 .
  • the mobile phone 100 or the smart TV 200 can execute related troubleshooting algorithms to eliminate corresponding execution failures.
  • the processing methods corresponding to related troubleshooting algorithms can refer to the following processing methods:
  • the mobile phone 100 can control the re-execution of the execution instruction of the current operation to eliminate the execution failure.
  • if the smart TV 200 terminates the execution instruction of the current operation, the smart TV 200 can feed back a message of suspending the execution instruction of the current operation to the mobile phone 100, and after receiving the feedback message, the mobile phone 100 can send the execution instruction of the current operation and related operation information to the smart TV 200 again, so as to control the smart TV 200 to re-execute the execution instruction of the current operation.
  • the mobile phone 100 then sends, to the smart TV 200, the execution instruction corresponding to the next operation and related operation information.
  • the mobile phone 100 can also automatically resend the execution instruction of the current operation and related operation information to the smart TV 200 when the smart TV 200 times out in feeding back the signal of completion of execution, so as to eliminate the execution fault on the smart TV 200, which is not limited here.
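  The resend-on-fault behavior described above can be sketched as a bounded retry loop. This is a hypothetical illustration only: the `push_with_retry` name, the `"done"`/`"timeout"` signals, and the attempt limit are assumptions, not values specified by the patent.

```python
def push_with_retry(push, instruction, max_attempts=3):
    """Troubleshooting sketch: push the current instruction; if the execution
    subject reports a fault or times out, resend the same instruction. Give up
    after a few attempts rather than retrying forever."""
    for attempt in range(1, max_attempts + 1):
        if push(instruction) == "done":
            return attempt  # number of attempts needed before success
    raise RuntimeError("execution fault not eliminated after retries")

calls = []

def flaky_tv(instruction):
    """Simulated smart TV that times out once, then succeeds."""
    calls.append(instruction)
    return "done" if len(calls) >= 2 else "timeout"

attempts = push_with_retry(flaky_tv, {"action": "set_volume"})
print(attempts)  # 2
```

  Capping the retries matters: if the fault persists (for example, the multi-device connection itself is broken), the master device should stop, exit the automatic execution, and prompt the user to check the environment, as described below.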
  • the mobile phone 100 can automatically exit the execution of the automatic operation instructions corresponding to the "one-key screen projection" button 630 and remind the user to check the current multi-device environment to eliminate execution interference factors, after which the user can click the "one-key screen projection" button 630 again to execute the corresponding automatic operation instructions.
  • the user checks the current multi-device environment, including, for example, checking whether the respective network connections of the multiple devices are normal, whether the connection between the main device and the other electronic devices is normal, and so on.
  • the interface of the mobile phone 100 prompting the user that the program execution is abnormally terminated can be referred to as shown in FIG. 10 .
  • the mobile phone 100 displays a prompt window 1010, in which option buttons are provided, such as a "Find Problems and Fix" option button 1011 and a "Rerun" button 1012.
  • the user can click the option button 1011 to start the troubleshooting program on the mobile phone 100 to find the cause of the abnormal termination and perform related processing.
  • for the troubleshooting method, reference may be made to the troubleshooting methods described above; the user can click the option button 1012 to make the mobile phone 100 rerun the automatic operation instructions corresponding to the "one-key screen projection" button 630.
  • the interface that the mobile phone 100 prompts the user for the abnormal termination of program execution may also be in other forms, which is not limited here.
  • in the distributed implementation method between electronic devices provided by the embodiments of the present application, the process in which the master device among the interacting electronic devices generates an automatic operation button based on the recorded operation information may be, for example, as follows: first, based on the summarized operation information of each operation, the master device generates the execution control instructions, specific operation items, parameters, and the like for controlling each electronic device to automatically execute the corresponding operation, wherein the electronic device that records the operation information of a certain operation is also the execution device that executes the operation; furthermore, the master device writes the execution control instructions, specific operation items, parameters, and the like into an automatic operation program script.
  • the main device can store the above-mentioned automatic operation instructions by generating an executable file.
  • the executable file is triggered to execute, for example, when the user clicks the corresponding automatic operation button; the main device can then control the execution of the automatic operation instructions stored in the executable file.
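  The store-then-trigger flow above can be sketched as follows. This is a hypothetical illustration: the patent does not specify the file format, so the sketch simply serializes the instruction list to JSON and replays it when the button is "clicked"; all function names are invented for the example.

```python
import json
import os
import tempfile

def save_as_executable(path, instructions):
    """Persist the generated automatic operation instructions so that a later
    button press can trigger them (JSON stands in for the executable file)."""
    with open(path, "w") as f:
        json.dump(instructions, f)

def on_button_click(path, dispatch):
    """Triggered when the user clicks the automatic operation button: load the
    stored instructions and hand each one to the dispatcher in recorded order."""
    with open(path) as f:
        instructions = json.load(f)
    for ins in instructions:
        dispatch(ins)
    return len(instructions)

path = os.path.join(tempfile.gettempdir(), "one_key_cast.json")
save_as_executable(path, [{"subject": "phone", "action": "open_app"},
                          {"subject": "tv", "action": "set_volume"}])

ran = []
count = on_button_click(path, lambda ins: ran.append(ins["action"]))
print(count, ran)  # 2 ['open_app', 'set_volume']
```

  In a real device the dispatcher would be the execution-subject judgment of step 802, executing local instructions directly and pushing remote ones to their subject devices.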
  • in this way, the user can use the created automatic operation button to realize the interaction between multiple devices with one key, and to realize the corresponding multi-device interaction scene, such as the screen projection scene 10 above, which simplifies the cumbersome process of the user manually operating the multi-device interaction and is conducive to improving user experience.
  • Fig. 11 shows a schematic structural diagram of a mobile phone 100 according to an embodiment of the present application.
  • the mobile phone 100 may include a processor 110, an external memory interface 120, an internal memory 121, a universal serial bus (USB) interface 130, a charging management module 140, a power management module 141, a battery 142, an antenna 1, an antenna 2, a mobile communication module 150, a wireless communication module 160, an audio module 170, a speaker 170A, a receiver 170B, a microphone 170C, an earphone jack 170D, a sensor module 180, a button 190, a motor 191, an indicator 192, a camera 193, a display screen 194, a subscriber identification module (SIM) card interface 195, and the like.
  • the sensor module 180 may include a pressure sensor 180A, a gyroscope sensor 180B, an air pressure sensor 180C, a magnetic sensor 180D, an acceleration sensor 180E, a distance sensor 180F, a proximity light sensor 180G, a fingerprint sensor 180H, a temperature sensor 180J, a touch sensor 180K, an ambient light sensor 180L, etc.
  • the structure shown in the embodiment of the present invention does not constitute a specific limitation on the mobile phone 100 .
  • the mobile phone 100 may include more or fewer components than shown in the figure, or combine certain components, or separate certain components, or arrange different components.
  • the illustrated components can be realized in hardware, software or a combination of software and hardware.
  • the processor 110 may include one or more processing units, for example: the processor 110 may include an application processor (application processor, AP), a modem processor, a graphics processing unit (graphics processing unit, GPU), an image signal processor (image signal processor, ISP), controller, video codec, digital signal processor (digital signal processor, DSP), baseband processor, and/or neural network processor (neural-network processing unit, NPU), etc. Wherein, different processing units may be independent devices, or may be integrated in one or more processors.
  • the controller can generate an operation control signal according to the instruction opcode and timing signal, and complete the control of fetching and executing the instruction.
  • the processor 110 can control the mobile phone 100 through the controller to implement the distributed implementation method between electronic devices of the present application in response to user operations, for example, controlling the mobile phone 100 to record operation information, send recording instructions, and control the execution of automatic operations instructions etc.
  • a memory may also be provided in the processor 110 for storing instructions and data.
  • the memory in processor 110 is a cache memory.
  • the memory may hold instructions or data that the processor 110 has just used or recycled. If the processor 110 needs to use the instruction or data again, it can be called directly from the memory. Repeated access is avoided, and the waiting time of the processor 110 is reduced, thus improving the efficiency of the system.
  • processor 110 may include one or more interfaces.
  • the interface may include an inter-integrated circuit (I2C) interface, an inter-integrated circuit sound (I2S) interface, a pulse code modulation (PCM) interface, a universal asynchronous receiver/transmitter (UART) interface, a mobile industry processor interface (MIPI), a general-purpose input/output (GPIO) interface, a subscriber identity module (SIM) interface, and/or a universal serial bus (USB) interface, etc.
  • the I2C interface is a bidirectional synchronous serial bus, including a serial data line (SDA) and a serial clock line (SCL).
  • processor 110 may include multiple sets of I2C buses.
  • the processor 110 can be respectively coupled to the touch sensor 180K, the charger, the flashlight, the camera 193 and the like through different I2C bus interfaces.
  • the processor 110 may be coupled to the touch sensor 180K through the I2C interface, so that the processor 110 and the touch sensor 180K communicate through the I2C bus interface to realize the touch function of the mobile phone 100 .
  • the I2S interface can be used for audio communication.
  • processor 110 may include multiple sets of I2S buses.
  • the processor 110 may be coupled to the audio module 170 through an I2S bus to implement communication between the processor 110 and the audio module 170 .
  • the PCM interface can also be used for audio communication, sampling, quantizing and encoding the analog signal.
  • the audio module 170 and the wireless communication module 160 may be coupled through a PCM bus interface.
  • the audio module 170 can also transmit audio signals to the wireless communication module 160 through the PCM interface, so as to realize the function of answering calls through the Bluetooth headset.
  • the UART interface is a universal serial data bus used for asynchronous communication.
  • the bus can be a bidirectional communication bus. It converts the data to be transmitted between serial communication and parallel communication.
  • a UART interface is generally used to connect the processor 110 and the wireless communication module 160 .
  • the processor 110 communicates with the Bluetooth module in the wireless communication module 160 through the UART interface to realize the Bluetooth function.
  • the audio module 170 can transmit audio signals to the wireless communication module 160 through the UART interface, so as to realize the function of playing music through the Bluetooth headset.
  • the MIPI interface can be used to connect the processor 110 with peripheral devices such as the display screen 194 and the camera 193 .
  • MIPI interface includes camera serial interface (camera serial interface, CSI), display serial interface (display serial interface, DSI), etc.
  • the processor 110 communicates with the camera 193 through the CSI interface to realize the shooting function of the mobile phone 100 .
  • the processor 110 communicates with the display screen 194 through the DSI interface to realize the display function of the mobile phone 100 .
  • the GPIO interface can be configured by software.
  • the GPIO interface can be configured as a control signal or as a data signal.
  • the GPIO interface can be used to connect the processor 110 with the camera 193 , the display screen 194 , the wireless communication module 160 , the audio module 170 , the sensor module 180 and so on.
  • the GPIO interface can also be configured as an I2C interface, I2S interface, UART interface, MIPI interface, etc.
  • the USB interface 130 is an interface conforming to the USB standard specification, specifically, it can be a Mini USB interface, a Micro USB interface, a USB Type C interface, and the like.
  • the USB interface 130 can be used to connect a charger to charge the mobile phone 100, and can also be used to transmit data between the mobile phone 100 and peripheral devices. It can also be used to connect headphones and play audio through them. This interface can also be used to connect other electronic devices, such as AR devices.
  • the interface connection relationship between modules shown in the embodiment of the present invention is only a schematic illustration, and does not constitute a structural limitation of the mobile phone 100 .
  • the mobile phone 100 may also adopt different interface connection methods in the above embodiments, or a combination of multiple interface connection methods.
  • the charging management module 140 is configured to receive a charging input from a charger.
  • the charger may be a wireless charger or a wired charger.
  • the charging management module 140 can receive charging input from the wired charger through the USB interface 130 .
  • the charging management module 140 can receive wireless charging input through the wireless charging coil of the mobile phone 100 . While the charging management module 140 is charging the battery 142 , it can also provide power for electronic devices through the power management module 141 .
  • the wireless communication function of the mobile phone 100 can be realized by the antenna 1, the antenna 2, the mobile communication module 150, the wireless communication module 160, the modem processor and the baseband processor.
  • Antenna 1 and Antenna 2 are used to transmit and receive electromagnetic wave signals.
  • Each antenna in handset 100 can be used to cover single or multiple communication frequency bands. Different antennas can also be multiplexed to improve the utilization of the antennas.
  • the mobile communication module 150 can provide wireless communication solutions including 2G/3G/4G/5G applied on the mobile phone 100 .
  • the wireless communication module 160 can provide wireless communication solutions applied on the mobile phone 100, including wireless local area networks (WLAN) (such as a wireless fidelity (Wi-Fi) network), Bluetooth (BT), global navigation satellite system (GNSS), frequency modulation (FM), near field communication (NFC), infrared (IR), and the like.
  • the wireless communication module 160 may be one or more devices integrating at least one communication processing module.
  • the wireless communication module 160 receives electromagnetic waves via the antenna 2 , frequency-modulates and filters the electromagnetic wave signals, and sends the processed signals to the processor 110 .
  • the wireless communication module 160 can also receive the signal to be sent from the processor 110 , frequency-modulate it, amplify it, and convert it into electromagnetic waves through the antenna 2 for radiation.
  • the antenna 1 of the mobile phone 100 is coupled to the mobile communication module 150, and the antenna 2 is coupled to the wireless communication module 160, so that the mobile phone 100 can communicate with the network and other devices through wireless communication technology.
  • the wireless communication technology may include global system for mobile communications (GSM), general packet radio service (GPRS), code division multiple access (CDMA), wideband code division multiple access (WCDMA), time-division code division multiple access (TD-SCDMA), long term evolution (LTE), BT, GNSS, WLAN, NFC, FM, and/or IR technologies, etc.
  • the GNSS may include a global positioning system (GPS), a global navigation satellite system (GLONASS), a BeiDou navigation satellite system (BDS), a quasi-zenith satellite system (QZSS), and/or a satellite based augmentation system (SBAS).
  • the mobile phone 100 realizes the display function through the GPU, the display screen 194, and the application processor.
  • the GPU is a microprocessor for image processing, and is connected to the display screen 194 and the application processor. GPUs are used to perform mathematical and geometric calculations for graphics rendering.
  • Processor 110 may include one or more GPUs that execute program instructions to generate or change display information.
  • the display screen 194 is used to display images, videos and the like.
  • the display screen 194 includes a display panel.
  • the display panel can be a liquid crystal display (LCD), an organic light-emitting diode (OLED), an active-matrix organic light-emitting diode (AMOLED), a flexible light-emitting diode (FLED), a Mini-LED, a Micro-LED, a Micro-OLED, quantum dot light-emitting diodes (QLED), etc.
  • the mobile phone 100 may include 1 or N display screens 194, where N is a positive integer greater than 1.
  • the mobile phone 100 can realize the shooting function through ISP, camera 193 , video codec, GPU, display screen 194 and application processor.
  • the ISP is used for processing the data fed back by the camera 193 .
  • the light is transmitted to the photosensitive element of the camera through the lens, the optical signal is converted into an electrical signal, and the photosensitive element of the camera transmits the electrical signal to the ISP for processing, which converts it into an image visible to the naked eye.
  • ISP can also perform algorithm optimization on image noise, brightness, and skin color.
  • ISP can also optimize the exposure, color temperature and other parameters of the shooting scene.
  • the ISP may be located in the camera 193 .
  • Camera 193 is used to capture still images or video.
  • the object generates an optical image through the lens and projects it to the photosensitive element.
  • the photosensitive element may be a charge coupled device (CCD) or a complementary metal-oxide-semiconductor (CMOS) phototransistor.
  • the photosensitive element converts the light signal into an electrical signal, and then transmits the electrical signal to the ISP to convert it into a digital image signal.
  • the ISP outputs the digital image signal to the DSP for processing.
  • DSP converts digital image signals into standard RGB, YUV and other image signals.
  • the mobile phone 100 may include 1 or N cameras 193, where N is a positive integer greater than 1.
  • Digital signal processors are used to process digital signals. In addition to digital image signals, they can also process other digital signals. For example, when the mobile phone 100 selects a frequency point, the digital signal processor is used to perform Fourier transform on the energy of the frequency point.
  • Video codecs are used to compress or decompress digital video.
  • the handset 100 may support one or more video codecs.
  • the mobile phone 100 can play or record videos in various encoding formats, for example: moving picture experts group (moving picture experts group, MPEG) 1, MPEG2, MPEG3, MPEG4 and so on.
  • the NPU is a neural-network (NN) computing processor.
  • Applications such as intelligent cognition of the mobile phone 100 can be implemented through the NPU, such as image recognition, face recognition, speech recognition, text understanding, and the like.
  • the external memory interface 120 can be used to connect an external memory card, such as a Micro SD card, to expand the storage capacity of the mobile phone 100.
  • the external memory card communicates with the processor 110 through the external memory interface 120 to implement a data storage function. Such as saving music, video and other files in the external memory card.
  • the operation information recorded by the mobile phone 100 or the operation information recorded by the smart TV 200 received by the mobile phone 100 can be stored in an external memory for calling when generating an automatic operation instruction.
  • the internal memory 121 may be used to store computer-executable program codes including instructions.
  • the internal memory 121 may include an area for storing programs and an area for storing data.
  • the stored program area can store an operating system, at least one application program required by a function (such as a sound playing function, an image playing function, etc.) and the like.
  • the storage data area can store data (such as audio data, phone book, etc.) created during the use of the mobile phone 100 .
  • the internal memory 121 may include a high-speed random access memory, and may also include a non-volatile memory, such as at least one magnetic disk storage device, flash memory device, universal flash storage (universal flash storage, UFS) and the like.
  • the processor 110 executes various functional applications and data processing of the mobile phone 100 by executing instructions stored in the internal memory 121 and/or instructions stored in a memory provided in the processor.
  • the automatic operation instructions corresponding to various operations generated by the processor 110 of the mobile phone 100 based on the operation information can be stored in the internal memory 121 (for example, stored as an executable file) for the processor 110 calls.
  • the mobile phone 100 can realize the audio function through the audio module 170 , the speaker 170A, the receiver 170B, the microphone 170C, the earphone interface 170D, and the application processor. Such as music playback, recording, etc.
  • the audio module 170 is used to convert digital audio information into analog audio signal output, and is also used to convert analog audio input into digital audio signal.
  • the audio module 170 may also be used to encode and decode audio signals.
  • the audio module 170 may be set in the processor 110 , or some functional modules of the audio module 170 may be set in the processor 110 .
  • Speaker 170A, also referred to as a "loudspeaker", is used to convert audio electrical signals into sound signals.
  • the mobile phone 100 can play music through the speaker 170A, or take hands-free calls.
  • Receiver 170B, also called the "earpiece", is used to convert audio electrical signals into sound signals.
  • the receiver 170B can be placed close to the human ear to listen to the voice.
  • the microphone 170C, also called a "mouthpiece" or "sound transmitter", is used to convert sound signals into electrical signals.
  • the user can speak with the mouth close to the microphone 170C, inputting the sound signal into the microphone 170C.
  • the mobile phone 100 can be provided with at least one microphone 170C.
  • the mobile phone 100 can be provided with two microphones 170C, which can also implement a noise reduction function in addition to collecting sound signals.
  • the mobile phone 100 can also be provided with three, four or more microphones 170C to realize sound signal collection, noise reduction, identify sound sources, realize directional recording functions, and the like.
  • the earphone interface 170D is used for connecting wired earphones.
  • the earphone interface 170D can be a USB interface 130, or a 3.5mm open mobile terminal platform (OMTP) standard interface, or a cellular telecommunications industry association of the USA (CTIA) standard interface.
  • the pressure sensor 180A is used to sense the pressure signal and convert the pressure signal into an electrical signal.
  • pressure sensor 180A may be disposed on display screen 194 .
  • pressure sensors 180A such as resistive pressure sensors, inductive pressure sensors, and capacitive pressure sensors.
  • a capacitive pressure sensor may be comprised of at least two parallel plates with conductive material.
  • touch operations acting on the same touch position but with different touch intensities may correspond to different operation instructions. For example, when a touch operation whose intensity is less than the first pressure threshold acts on the short message application icon, an instruction to view short messages is executed; when a touch operation whose intensity is greater than or equal to the first pressure threshold acts on the same icon, an instruction to create a new short message is executed.
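The threshold logic just described can be sketched as follows. This is a minimal illustrative model, not the actual handset implementation; the function name and the value of the first pressure threshold are assumptions.

```python
# Hypothetical sketch of mapping touch intensity at the same position
# (the short message application icon) to different operation instructions.
FIRST_PRESSURE_THRESHOLD = 0.5  # assumed, device-specific constant

def dispatch_touch_on_sms_icon(intensity: float) -> str:
    """Return the instruction executed for a touch on the SMS app icon."""
    if intensity < FIRST_PRESSURE_THRESHOLD:
        # a light touch views existing short messages
        return "view_short_messages"
    # a touch at or above the threshold creates a new short message
    return "create_new_short_message"
```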
  • the gyroscope sensor 180B can be used to determine the motion posture of the mobile phone 100 .
  • the angular velocity of the mobile phone 100 about three axes may be determined by the gyro sensor 180B.
  • the gyro sensor 180B can be used for image stabilization. Exemplarily, when the shutter is pressed, the gyro sensor 180B detects the shaking angle of the mobile phone 100, calculates the distance that the lens module needs to compensate according to the angle, and allows the lens to counteract the shaking of the mobile phone 100 through reverse movement to achieve anti-shake.
  • the gyro sensor 180B can also be used for navigation and somatosensory game scenes.
  • the air pressure sensor 180C is used to measure air pressure.
  • the mobile phone 100 calculates the altitude based on the air pressure value measured by the air pressure sensor 180C to assist positioning and navigation.
  • the magnetic sensor 180D includes a Hall sensor.
  • the mobile phone 100 can use the magnetic sensor 180D to detect the opening and closing of a flip holster; when the mobile phone 100 is a flip phone, it can likewise detect the opening and closing of the flip according to the magnetic sensor 180D, and accordingly set features such as automatic unlocking when the cover is flipped open.
  • the acceleration sensor 180E can detect the acceleration of the mobile phone 100 in various directions (generally three axes). When the mobile phone 100 is stationary, the magnitude and direction of gravity can be detected. It can also be used to identify the posture of electronic devices, and can be used in applications such as horizontal and vertical screen switching, pedometers, etc.
  • the distance sensor 180F is used to measure the distance.
  • the mobile phone 100 can measure the distance by infrared or laser. In some embodiments, when shooting a scene, the mobile phone 100 can use the distance sensor 180F for distance measurement to achieve fast focusing.
  • Proximity light sensor 180G may include, for example, light emitting diodes (LEDs) and light detectors, such as photodiodes.
  • the light emitting diodes may be infrared light emitting diodes.
  • the mobile phone 100 emits infrared light through the light emitting diode.
  • Cell phone 100 uses photodiodes to detect infrared reflected light from nearby objects. When sufficient reflected light is detected, it can be determined that there is an object near the mobile phone 100 . When insufficient reflected light is detected, the cell phone 100 may determine that there is no object in the vicinity of the cell phone 100 .
  • the mobile phone 100 can use the proximity light sensor 180G to detect that the user is holding the mobile phone 100 close to the ear to make a call, so as to automatically turn off the screen to save power.
  • the proximity light sensor 180G can also be used in leather case mode, automatic unlock and lock screen in pocket mode.
  • the ambient light sensor 180L is used for sensing ambient light brightness.
  • the mobile phone 100 can adaptively adjust the brightness of the display screen 194 according to the perceived ambient light brightness.
  • the ambient light sensor 180L can also be used to automatically adjust the white balance when taking pictures.
  • the ambient light sensor 180L can also cooperate with the proximity light sensor 180G to detect whether the mobile phone 100 is in the pocket to prevent accidental touch.
  • the fingerprint sensor 180H is used to collect fingerprints.
  • the mobile phone 100 can use the collected fingerprint features to realize fingerprint unlocking, access to the application lock, take pictures with the fingerprint, answer calls with the fingerprint, and the like.
  • the temperature sensor 180J is used to detect temperature.
  • the mobile phone 100 uses the temperature detected by the temperature sensor 180J to implement a temperature processing strategy. For example, when the temperature reported by the temperature sensor 180J exceeds the threshold, the mobile phone 100 may reduce the performance of the processor located near the temperature sensor 180J, so as to reduce power consumption and implement thermal protection.
  • in other embodiments, when the temperature is lower than another threshold, the mobile phone 100 heats the battery 142 to avoid abnormal shutdown of the mobile phone 100 caused by low temperature.
  • in still other embodiments, when the temperature is lower than a further threshold, the mobile phone 100 boosts the output voltage of the battery 142 to avoid abnormal shutdown caused by low temperature.
  • the touch sensor 180K is also called “touch device”.
  • the touch sensor 180K can be disposed on the display screen 194, and the touch sensor 180K and the display screen 194 form a touch screen, also called a “touch screen”.
  • the touch sensor 180K is used to detect a touch operation on or near it.
  • the touch sensor can pass the detected touch operation to the application processor to determine the type of touch event.
  • Visual output related to the touch operation can be provided through the display screen 194 .
  • the touch sensor 180K may also be disposed on the surface of the mobile phone 100 , which is different from the position of the display screen 194 .
  • the keys 190 include a power key, a volume key and the like.
  • the key 190 may be a mechanical key. It can also be a touch button.
  • the mobile phone 100 can receive key input and generate key signal input related to user settings and function control of the mobile phone 100 .
  • the motor 191 can generate a vibrating reminder.
  • the motor 191 can be used for incoming call vibration prompts, and can also be used for touch vibration feedback.
  • touch operations applied to different applications may correspond to different vibration feedback effects.
  • the motor 191 may also correspond to different vibration feedback effects for touch operations acting on different areas of the display screen 194 .
  • touch operations in different application scenarios (for example: time reminders, receiving messages, alarm clocks, games, etc.) may also correspond to different vibration feedback effects.
  • the touch vibration feedback effect can also support customization.
  • the indicator 192 can be an indicator light, and can be used to indicate charging status, power change, and can also be used to indicate messages, missed calls, notifications, and the like.
  • the SIM card interface 195 is used for connecting a SIM card.
  • the SIM card can be connected and separated from the mobile phone 100 by inserting it into the SIM card interface 195 or pulling it out from the SIM card interface 195 .
  • the mobile phone 100 interacts with the network through the SIM card to implement functions such as calling and data communication.
  • the mobile phone 100 adopts eSIM, that is, an embedded SIM card.
  • the eSIM card can be embedded in the mobile phone 100 and cannot be separated from the mobile phone 100 .
  • FIG. 12 is a block diagram of the software structure of the mobile phone 100 according to the embodiment of the present invention.
  • the software system of the mobile phone 100 can adopt a layered architecture, an event-driven architecture, a micro-kernel architecture, a micro-service architecture, or a cloud architecture.
  • the software structure of the mobile phone 100 is illustrated by taking the Android system with a layered architecture as an example.
  • the layered architecture divides the software into several layers, and each layer has a clear role and division of labor. Layers communicate through software interfaces.
  • the Android system is divided into four layers, which are, from top to bottom, the application layer, the application framework layer, the Android runtime and system libraries, and the kernel layer.
  • the application layer can consist of a series of application packages.
  • the application package may include application programs such as camera, gallery, calendar, call, map, navigation, WLAN, Bluetooth, music, video, and short message.
  • the application framework layer provides an application programming interface (application programming interface, API) and a programming framework for applications in the application layer.
  • the application framework layer includes some predefined functions.
  • the application framework layer can include window manager, content provider, view system, phone manager, resource manager, notification manager, etc.
  • a window manager is used to manage window programs.
  • the window manager can get the size of the display screen, determine whether there is a status bar, lock the screen, capture the screen, etc.
  • Content providers are used to store and retrieve data and make it accessible to applications.
  • Said data may include video, images, audio, calls made and received, browsing history and bookmarks, phonebook, etc.
  • the view system includes visual controls, such as controls for displaying text, controls for displaying pictures, and so on.
  • the view system can be used to build applications.
  • a display interface can consist of one or more views.
  • a display interface including a text message notification icon may include a view for displaying text and a view for displaying pictures.
  • the phone manager is used to provide communication functions of the mobile phone 100 . For example, the management of call status (including connected, hung up, etc.).
  • the resource manager provides various resources for the application, such as localized strings, icons, pictures, layout files, video files, and so on.
  • the notification manager enables the application to display notification information in the status bar, which can be used to convey notification-type messages, and can automatically disappear after a short stay without user interaction.
  • the notification manager is used to notify the download completion, message reminder, etc.
  • notifications managed by the notification manager can also appear in the top status bar of the system in the form of a chart or scroll-bar text, for example a notification of an application running in the background, or appear on the screen in the form of a dialog window.
  • the notification manager can also prompt with text information in the status bar, issue a prompt sound, vibrate the electronic device, flash the indicator light, and the like.
  • the Android Runtime includes core library and virtual machine. The Android runtime is responsible for the scheduling and management of the Android system.
  • the core library consists of two parts: one part is the functions that the Java language needs to call, and the other part is the core library of Android.
  • the application layer and the application framework layer run in virtual machines.
  • the virtual machine executes the java files of the application program layer and the application program framework layer as binary files.
  • the virtual machine is used to perform functions such as object life cycle management, stack management, thread management, security and exception management, and garbage collection.
  • a system library can include multiple function modules. For example: surface manager (surface manager), media library (Media Libraries), 3D graphics processing library (eg: OpenGL ES), 2D graphics engine (eg: SGL), etc.
  • the surface manager is used to manage the display subsystem and provides the fusion of 2D and 3D layers for multiple applications.
  • the media library supports playback and recording of various commonly used audio and video formats, as well as still image files, etc.
  • the media library can support a variety of audio and video encoding formats, such as: MPEG4, H.264, MP3, AAC, AMR, JPG, PNG, etc.
  • the 3D graphics processing library is used to implement 3D graphics drawing, image rendering, compositing, and layer processing, etc.
  • the 2D graphics engine is a drawing engine for 2D drawing.
  • the kernel layer is the layer between hardware and software.
  • the kernel layer includes at least a display driver, a camera driver, an audio driver, and a sensor driver.
  • when the touch sensor 180K receives a touch operation, a corresponding hardware interrupt is sent to the kernel layer.
  • the kernel layer processes touch operations into original input events (including touch coordinates, time stamps of touch operations, and other information).
  • Raw input events are stored at the kernel level.
  • the application framework layer obtains the original input event from the kernel layer, and identifies the control corresponding to the input event.
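The kernel-to-framework input path described above (hardware interrupt, raw input event with touch coordinates and timestamp, then control identification in the framework layer) can be sketched as a simplified model. The class and method names below are illustrative, not the actual Android APIs, and the control bounding boxes are assumed values.

```python
import time
from dataclasses import dataclass, field

@dataclass
class RawInputEvent:
    # Raw event as produced at the kernel layer: touch coordinates plus timestamp.
    x: int
    y: int
    timestamp: float = field(default_factory=time.time)

class KernelLayer:
    def __init__(self):
        self._events = []  # raw input events are stored at the kernel layer

    def on_hardware_interrupt(self, x: int, y: int) -> None:
        # a touch operation triggers a hardware interrupt; package it as a raw event
        self._events.append(RawInputEvent(x, y))

    def poll_event(self):
        # the framework layer obtains raw input events from the kernel layer (FIFO)
        return self._events.pop(0) if self._events else None

class FrameworkLayer:
    def __init__(self, controls):
        # controls: mapping of control name -> (x0, y0, x1, y1) bounding box
        self.controls = controls

    def identify_control(self, event: RawInputEvent):
        # identify which control the input event corresponds to
        for name, (x0, y0, x1, y1) in self.controls.items():
            if x0 <= event.x <= x1 and y0 <= event.y <= y1:
                return name
        return None
```

Under these assumptions, a tap at (120, 40) inside the bounding box registered for a "one-key projection" control would be identified as a click on that control.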
  • taking the touch operation of clicking the "one-key projection" button 630 as an example: the control corresponding to the button 630 calls the interface of the application framework layer and starts execution of the executable file corresponding to the button 630; the mobile phone 100 reads the automatic operation instructions in the executable file; and for an automatic operation instruction whose execution subject is the mobile phone 100, the mobile phone 100 invokes the kernel layer to start the touch driver and executes the touch operation instruction in the automatic operation instruction.
  • the present disclosure also relates to an apparatus for performing the operations herein.
  • This apparatus may be specially constructed for the required purposes or it may comprise a general purpose computer selectively activated or reconfigured by a computer program stored in the computer.
  • a computer program may be stored on a computer-readable medium such as, but not limited to, any type of disk including floppy disks, compact disks, CD-ROMs, magneto-optical disks, read-only memory (ROM), random-access memory (RAM), EPROM, EEPROM, magnetic or optical cards, application-specific integrated circuits (ASICs), or any type of medium suitable for storing electronic instructions, each of which may be coupled to a computer system bus.
  • computers referred to in the specification may comprise a single processor or may be architectures involving multiple processors for increased computing power.


Abstract

The present application relates to the technical field of smart terminals, and in particular to a distributed implementation method between electronic devices, a system, an electronic device, and a storage medium. The method includes: a first electronic device detects a first instruction from a user, the first instruction being used to instruct that a first specified task be completed by a second electronic device; in response to the first instruction, the first electronic device simulates one or more first user operations on the first electronic device, where a first user operation is a user operation that the user would need to perform on the first electronic device in the course of controlling the second electronic device to complete the first specified task; and the first electronic device instructs the second electronic device to execute the first specified task. By pre-recording the operation information of the operations the user performs on each electronic device, the present application generates an executable file for completing a specified task in a distributed manner, and then triggers the executable file to run based on a user operation, thereby implementing the specified task across electronic devices in a distributed manner and improving the user's operating efficiency.

Description

Distributed implementation method, system, electronic device, and storage medium
This application claims priority to the Chinese patent application No. 202111154648.0, entitled "Distributed implementation method, system, electronic device and storage medium", filed with the China National Intellectual Property Administration on September 29, 2021, the entire contents of which are incorporated herein by reference.
Technical Field
The present invention relates to the technical field of smart terminals, and in particular to a distributed implementation method between electronic devices, a system, an electronic device, and a storage medium.
Background
With the development of smart terminal technology, whole-house smart scenarios have gradually become widespread, and more and more smart electronic devices are joining them. In a whole-house smart scenario, seamless interaction between electronic devices greatly improves the user experience. For example, a video playing on a user's mobile phone can be cast to a TV to continue playing, music on the phone can be played through a smart speaker at home, and when the user starts the car, the optimal commute route planned by the navigation application on the phone can be projected onto the screen of the in-vehicle computer for navigation.
However, at present, in the process of completing a specified task in a distributed manner, the interactions among the multiple devices involved are mainly performed manually by the user, and the process is rather cumbersome. For example, when the user casts a video playing on the mobile phone to the TV to continue playing, the user first needs to turn on the TV, establish a communication connection between the phone and the TV, and then perform the relevant operations on the phone to cast the playing video to the TV.
Summary
Embodiments of the present application provide a distributed implementation method between electronic devices, a system, an electronic device, and a storage medium. By pre-recording the operation information of the operations a user performs on each electronic device for the distributed completion of a specified task, an executable file is generated; afterwards, when the user performs device interaction, the user only needs to trigger the executable file to run, instead of manually repeating the rather cumbersome interactive operations every time the specified task is completed across electronic devices in a distributed manner. This makes distributed completion of a specified task across electronic devices more convenient and faster, improves the user's operating efficiency, and also helps improve the user experience.
In a first aspect, an embodiment of the present application provides a distributed implementation method between electronic devices, the method including: a first electronic device detects a first instruction from a user, the first instruction being used to instruct that a first specified task be completed by a second electronic device; in response to the first instruction, the first electronic device simulates one or more first user operations on the first electronic device, where a first user operation is a user operation that the user would need to perform on the first electronic device in the course of controlling the second electronic device to complete the first specified task; and the first electronic device instructs the second electronic device to execute the first specified task.
That is, the first electronic device detects a first instruction corresponding to an operation performed by the user on the first electronic device; based on the first instruction, the first electronic device can determine the second electronic device as the executing device of the first specified task; in response to the first instruction, the first electronic device can simulate the first user operations on the first electronic device to carry out the process of controlling the second electronic device to complete the first specified task; and after receiving the instruction to execute the first specified task, the second electronic device can begin executing it.
The first electronic device may be, for example, the mobile phone 100 in the embodiments below, the second electronic device may be, for example, the smart TV 200 in the embodiments below, and the first instruction may be, for example, an instruction by which the user clicks an automated-operation button created on the mobile phone 100 (such as a "one-key projection" button) to instruct the mobile phone 100 to realize the screen-casting scenario. Based on the instruction corresponding to the user clicking the "one-key projection" button, the mobile phone 100 can determine the smart TV 200 as the executing device of the video playback task, i.e., the video playing on the mobile phone 100 is to be played through the smart TV 200. In response to the operation instruction of the user clicking the "one-key projection" button, the mobile phone 100 can also simulate the operations the user would perform on the mobile phone 100: running the video application, playing the desired video, clicking the projection button on the video playback interface, and selecting the smart TV 200 as the projection device so that the playing video is cast to the smart TV 200. In this way, the smart TV 200 can play the specified video upon receiving the instruction from the mobile phone 100 to execute the video playback task.
In a possible implementation of the first aspect above, the method further includes: in response to the first instruction, the first electronic device generates and sends at least one simulation instruction to the second electronic device, where the simulation instruction is used to instruct that a second user operation be simulated on the second electronic device, and the second user operation is a user operation that the user would need to perform on the second electronic device in the process of completing the first specified task.
That is, in response to the first instruction, the first electronic device can instruct the second electronic device to simulate the second user operations by sending simulation instructions to it, so that in the course of executing the first specified task the second electronic device completes operations related to the task, such as switching the task content and setting parameters. The simulation instructions may include, for example, the automated operation instructions pushed to the smart TV 200 in steps 802 and 804 to 805 of the embodiments below, when the mobile phone 100 determines that the execution subject of the automated operation instruction about to be executed is the smart TV 200; after receiving an automated operation instruction sent by the mobile phone 100, the smart TV 200 executes it, simulating the operations the user would perform on the smart TV 200, such as setting the playback volume, setting the playback brightness, and setting the definition.
In a possible implementation of the first aspect above, the first user operation or the second user operation includes any one of the following: an operation of running an application; an operation of entering text information; an operation of connecting to a network or connecting to a device; an operation of setting a parameter; an operation of selecting the content of the first specified task; and an operation of switching pages.
For example, in the process of controlling the second electronic device to complete the first specified task of the first electronic device, the first user operation to be simulated on the first electronic device may be any of the above operations, and the second user operation to be simulated on the second electronic device may be any of the above operations. It can be understood that the first user operation to be simulated on the first electronic device and the second user operation to be simulated on the second electronic device may also be operations other than those listed above, which is not limited here.
In a possible implementation of the first aspect above, the first user operation includes an operation by which the user clicks a control on the first electronic device, or a sliding operation by the user on the first electronic device; the second user operation includes an operation by which the user clicks a control on the second electronic device, or a control operation performed by the user on the second electronic device via a remote-control device.
That is, the first user operation simulated by the first electronic device may be the user clicking controls on the display interface of the first electronic device, or a sliding operation performed by the user on that display interface, which is not limited here. The second user operation simulated by the second electronic device may be the user clicking controls on the display interface of the second electronic device, or a control operation such as the user clicking controls on the display interface of the second electronic device via a remote-control device, which is not limited here. The second electronic device may be, for example, the smart TV 200 in the embodiments below, and the remote-control device may be, for example, the remote control used to control the smart TV 200.
In a possible implementation of the first aspect above, the first user operation or the second user operation includes any one of the following: an operation of running an application; an operation of entering text information; an operation of connecting to a network or connecting to a device; an operation of setting a parameter; an operation of selecting the content of the first specified task; and an operation of switching pages.
That is, whether it is the clicking operation or the sliding operation described above, the first user operation simulated by the first electronic device serves to realize some function required to complete the first specified task; likewise, whether it is the clicking operation or the control operation performed on the second electronic device via a remote-control device, the second user operation simulated by the second electronic device also serves to realize some function required to complete the first specified task. Examples include running an application, entering text information, connecting to a network or a device, setting parameters, selecting the content of the first specified task, or switching pages. Selecting the content of the first specified task may be, for example, selecting the video content to be played by the video playback task in the embodiments below.
In a possible implementation of the first aspect above, the operation of running an application includes the user clicking an application icon control on the first electronic device to run the application; the operation of entering text information includes the user clicking an input control on the first electronic device to enter text information; the operation of connecting to a network or a device includes the user clicking a network-connection or device-connection control on the first electronic device to send a connection request; the operation of setting a parameter includes the user clicking a parameter-setting control on the first electronic device to set a parameter, and/or the user sliding on the first electronic device to set a parameter; the operation of selecting the content of the first specified task includes the user clicking a selection control on the first electronic device to select the content of the first specified task; and the operation of switching pages includes the user sliding on the first electronic device to switch pages.
In a possible implementation of the first aspect above, the first specified task is a video playback task, and the first user operation includes any one of the following: the user clicking a video application icon on the first electronic device to run the video application; the user clicking keys on an input-method interface in the display interface of the first electronic device to enter text information; the user selecting, in the display interface of the first electronic device, the second electronic device as the device to complete the video playback task, so as to initiate a connection request; the user clicking a parameter-setting button in the video playback interface displayed by the first electronic device to set a parameter; the user clicking a video-content-switching button or a video option button in the video application interface displayed by the first electronic device to select the video content to be played by the video playback task; the user sliding up or down in the video playback interface displayed by the first electronic device to set the volume, brightness, or definition; and the user sliding left or right on the display interface of the first electronic device to switch pages.
The video option button may be, for example, the episode-switching button 543 in the embodiments below, or a button displaying an episode number, etc.
The first specified task may be, for example, the screen-casting task in the embodiments below, i.e., casting the video playing on the mobile phone 100 to the smart TV 200 for playback, which is the process of completing, through the smart TV 200, the video playback task executed on the mobile phone 100.
In a possible implementation of the first aspect above, the operation of running an application includes the user clicking an application icon control on the second electronic device to run the application, or the user controlling, via a remote-control device, the running of an application on the second electronic device; the operation of entering text information includes the user clicking an input control on the second electronic device to enter text information, or the user entering text information on the second electronic device via a remote-control device; the operation of connecting to a network or a device includes the user clicking a network-connection or device-connection control on the second electronic device to send a connection request, or the user controlling the second electronic device via a remote-control device to send a request to connect to a network or a device; the operation of setting a parameter includes the user clicking a parameter-setting control on the second electronic device to set a parameter, or the user setting a parameter on the second electronic device via a remote-control device; the selection operation includes the user clicking a selection control on the second electronic device to select the content of the first specified task being executed; and the operation of switching pages includes the user controlling the second electronic device via a remote-control device to switch pages.
In a possible implementation of the first aspect above, the first specified task is a video playback task, and the second user operation includes any one of the following: the user clicking keys on an input-method interface in the display interface of the second electronic device to enter text information; the user entering text information in the display interface of the second electronic device via a remote-control device; the user clicking a parameter-setting button in the video playback interface displayed by the second electronic device to set a parameter; the user clicking a video-content-switching button or a video option button in the display interface of the second electronic device to select the video content to be played by the video playback task being executed; and the user setting the volume, brightness, or definition on the second electronic device via a remote-control device.
In a possible implementation of the first aspect above, the first electronic device stores a first program corresponding to the first instruction, and the method includes: in response to the first instruction, the first electronic device runs the first program, where the first program is capable of simulating the first user operations on the first electronic device, and of generating and sending the simulation instructions to the second electronic device.
That is, a first program corresponding to the first instruction is pre-stored on the first electronic device. When the first electronic device detects the user's first instruction, it can run the pre-stored first program to simulate the first user operations on the first electronic device; by running the first program, the first electronic device can also generate and send simulation instructions to the second electronic device to instruct it to simulate the second user operations. For example, in the embodiments below, when the mobile phone 100 responds to the user clicking the "one-key projection" button, the automated operation program it executes can cause the mobile phone 100 to simulate the operations the user would perform on the mobile phone 100 to realize the screen-casting scenario; the program can also cause the mobile phone 100, when it determines that the execution subject of the automated operation instruction about to be executed is the smart TV 200, to send the automated operation instruction to the smart TV 200 for execution, and the mobile phone 100 may also send the smart TV 200 an instruction triggering the execution of the automated operation instruction, which is not limited here.
In a possible implementation of the first aspect above, the first program includes an executable file, and the executable file is generated as follows: the first electronic device records first operation data produced on the first electronic device by the user's first user operations, where the first operation data is to be called by the first electronic device to execute first operation instructions in response to the first user operations; and the first electronic device generates the executable file based on the first operation data and the generation times of the first operation data.
The executable file included in the first program may be, for example, the executable file corresponding to the screen-casting scenario 10 in the embodiments below, and the first operation data may be, for example, the operation information corresponding to the operations the user performs on the mobile phone 100 to realize the screen-casting scenario. The mobile phone 100 can generate the executable file corresponding to the screen-casting scenario 10 based on the recorded operation information of those operations and the generation time of the operation information of each operation; for details, refer to the relevant description in the embodiments below, which will not be repeated here.
In a possible implementation of the first aspect above, the generation of the executable file further includes: the first electronic device receives second operation data recorded by the second electronic device and produced on the second electronic device by the user's second user operations, where the second operation data is to be called by the second electronic device to execute second operation instructions in response to the second user operations; and the first electronic device generates the executable file based on the first operation data, the generation times of the first operation data, the second operation data, and the generation times of the second operation data.
The second operation data may be, for example, the operation information corresponding to operations such as the user setting the playback volume, brightness, and/or definition on the smart TV 200 in the embodiments below. The mobile phone 100 can generate the executable file corresponding to the screen-casting scenario 10 based on the operation information recorded by the mobile phone 100 for the operations the user performs on the mobile phone 100 to realize the screen-casting scenario, the operation information recorded by the smart TV 200 for the operations the user performs on the smart TV 200, and the generation time of the operation information of each operation; for details, refer to the relevant description in the embodiments below, which will not be repeated here.
In a possible implementation of the first aspect above, before generating the executable file, the method includes: the first electronic device prompts the user to complete the first user operations on the first electronic device.
That is, the first electronic device can prompt the user, through its display interface, as to which operations the user needs to perform on the first electronic device in the course of controlling the second electronic device to complete the first specified task. For example, in the embodiments below, the mobile phone 100 can display a notification pop-up prompting the user "Enable the scenario-copy function to quickly realize multi-device interaction?"; the mobile phone 100 can also display a tutorial interface showing the user how to operate the mobile phone 100 to record operation information, how to operate the mobile phone 100 to generate the "one-key projection" button, and so on; for details, refer to the relevant description in the embodiments below, which will not be repeated here.
In a possible implementation of the first aspect above, before generating the executable file, the method further includes: the first electronic device prompts the user to complete the second user operations on the second electronic device.
That is, the first electronic device can prompt the user, through its display interface, as to which operations the user needs to perform on the second electronic device in the course of controlling the second electronic device to complete the first specified task. For example, in the embodiments below, the mobile phone 100 can display a tutorial interface showing the user how to operate the smart TV 200 to record operation information; for details, refer to the relevant description in the embodiments below, which will not be repeated here.
In a possible implementation of the first aspect above, the process of the first electronic device recording the first operation data produced on the first electronic device by the user's first user operations includes: upon detecting that the user performs, on the display interface of the first electronic device, an operation instructing recording to start, recording the first operation data produced on the first electronic device by the user's first user operations; and upon detecting that the user performs, on the display interface of the first electronic device, an operation instructing recording to stop, stopping the recording.
That is, the user can control when the first electronic device starts and stops recording the first operation data. The operations the user performs on the display interface of the first electronic device to instruct recording to start or stop may be, for example, the user clicking the operation-record button 411 displayed by the mobile phone 100 in the embodiments below: the first time the user clicks the operation-record button, the mobile phone 100 enters the operation-record mode and starts recording operation information; when the user clicks the button again, the mobile phone 100 exits the operation-record mode and stops recording. For details, refer to FIG. 4A to FIG. 4B and the related description in the embodiments below, which will not be repeated here.
In a possible implementation of the first aspect above, the process of the first electronic device receiving the second operation data includes: based on a received operation-focus switch notification sent by the second electronic device, the first electronic device sends a start-recording instruction to the second electronic device, where the start-recording instruction instructs the second electronic device to start recording the second operation data produced on the second electronic device by the user's second user operations; upon detecting that the user performs, on the display interface of the first electronic device, an operation instructing recording to stop, the first electronic device sends a stop-recording instruction to the second electronic device, where the stop-recording instruction instructs the second electronic device to stop recording; and the first electronic device receives the second operation data sent by the second electronic device.
That is, when the second electronic device detects a user operation, it can send an operation-focus switch notification to the first electronic device to notify it that the user's operation focus has switched to the second electronic device; the first electronic device can then send a start-recording instruction to the second electronic device based on the connection between them. When the first electronic device detects that the user performs, on its display interface, an operation instructing recording to stop, it can send a stop-recording instruction to the second electronic device. The connection between the first and second electronic devices may be established when the user operates the first electronic device to send a connection request to the second electronic device, or when the user operates the second electronic device to send a connection request to the first electronic device, which is not limited here. The start-recording instruction may be, for example, the recording instruction sent by the mobile phone 100 to the smart TV 200 in step 308 of the embodiments below, and the stop-recording instruction may be, for example, the end-recording instruction sent by the mobile phone 100 to the smart TV 200 in step 311. The connection between the first and second electronic devices may be established as described in step 306 of the embodiments below: when the smart TV 200, upon being selected as the screen-casting peer device, receives the connection request sent by the mobile phone 100, it establishes the connection with the first electronic device and enters a standby mode ready to detect user operations at any time; for details, refer to the related descriptions of steps 306, 308 and 311 in the embodiments below, which will not be repeated here.
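The recording handshake just described — the second device reports an operation-focus switch, the first device replies with start-recording and stop-recording instructions, and finally collects the recorded data — can be sketched as follows. This is a minimal in-memory model under assumed names, not the protocol's actual implementation.

```python
# Hypothetical sketch of the focus-switch / start-recording / stop-recording
# exchange between the first device (e.g. phone) and second device (e.g. TV).
class SecondDevice:
    def __init__(self):
        self.recording = False
        self.operation_data = []  # second operation data recorded locally

    def on_user_operation(self, first_device, op: str) -> None:
        if not self.recording:
            # operation focus has switched here: notify the first device,
            # which responds with a start-recording instruction
            first_device.on_focus_switch(self)
        if self.recording:
            self.operation_data.append(op)

class FirstDevice:
    def on_focus_switch(self, second_device: SecondDevice) -> None:
        # send the "start recording" instruction over the established connection
        second_device.recording = True

    def stop(self, second_device: SecondDevice) -> list:
        # send the "stop recording" instruction and receive the recorded data
        second_device.recording = False
        return list(second_device.operation_data)
```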
In a possible implementation of the first aspect above, the order in which the first user operations are simulated on the first electronic device is the same as the order in which the first operation data was produced, and the order in which the second user operations are simulated on the second electronic device is the same as the order in which the second operation data was produced.
That is, the chronological order of the first user operations simulated by the first electronic device is the same as the chronological order in which the first operation data was produced when the user performed the first user operations on the first electronic device; likewise, the chronological order of the second user operations simulated by the second electronic device is the same as the chronological order in which the second operation data was produced. For example, as described in the embodiments below, the order in which the mobile phone 100 or the smart TV 200 executes the corresponding automated operation instructions can follow the order of the operations corresponding to the "one-key projection" button 630: the user first opens the video application on the mobile phone 100, taps the video to be played, and clicks the projection button to select the smart TV 200 as the projection device; the user then sets the definition, playback volume, or brightness on the smart TV 200 in sequence. For details, refer to FIG. 9B and the related description in the embodiments below, which will not be repeated here.
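Replaying the recorded operations in the order their operation data was produced, while routing each to its execution subject, can be sketched as below. The dictionary keys and subject names are illustrative assumptions, not the format actually used by the devices.

```python
# Hypothetical sketch: replay recorded operations in the order the operation
# data was produced, dispatching each to its execution subject
# (e.g. "phone" or "tv").
def replay(operations, executors):
    """operations: list of dicts with 'timestamp', 'subject', 'action'.
    executors: mapping of subject name -> callable that performs the action.
    Returns the actions in the order they were executed."""
    done = []
    for op in sorted(operations, key=lambda o: o["timestamp"]):
        executors[op["subject"]](op["action"])  # simulate on the right device
        done.append(op["action"])
    return done
```

Sorting by the recorded generation time guarantees that, for example, opening the video application on the phone is simulated before the volume is set on the TV, mirroring the user's original sequence.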
In a possible implementation of the first aspect above, in the process of simulating the first user operations on the first electronic device, the changes in the display content and/or volume of the first electronic device caused by the first user operations are displayed.
In a possible implementation of the first aspect above, the first electronic device stores the first program and a second program corresponding to a second instruction, where the second instruction is used to instruct that a second specified task be completed by a third electronic device, and where the second electronic device differs from the third electronic device, and/or the first specified task differs from the second specified task.
That is, in response to a user operation such as clicking an automated-operation button, the first electronic device may instruct the same electronic device to execute two different specified tasks, instruct two different electronic devices to each execute a different specified task, or instruct two different electronic devices to execute the same specified task, which is not limited here. An example is the multi-device interaction scenario described in the embodiments below, in which the video playing on the mobile phone 100 is played through the smart TV 200 while a document received by the mobile phone 100 is processed on a tablet computer.
In a possible implementation of the first aspect above, the first instruction includes at least one of a screen-casting instruction, a music-casting instruction, and a document-processing instruction, where the screen-casting instruction is used to instruct that a video playback task be completed by the second electronic device; the music-casting instruction is used to instruct that a music playback task be completed by the second electronic device; and the document-processing instruction is used to instruct that a document-processing task be completed by the second electronic device.
In a second aspect, an embodiment of the present application provides a distributed implementation method between electronic devices, the method including: a second electronic device executes a first specified task in response to an instruction from a first electronic device indicating that the first specified task be completed; and the second electronic device, in response to a simulation instruction from the first electronic device instructing that a second user operation be simulated on the second electronic device, simulates the second user operation on the second electronic device, where the second user operation is a user operation that the user would need to perform on the second electronic device in the process of completing the first specified task.
In a possible implementation of the second aspect above, the second user operation includes any one of the following: an operation of running an application; an operation of entering text information; an operation of connecting to a network or connecting to a device; an operation of setting a parameter; an operation of selecting the content of the first specified task; and an operation of switching pages.
In a possible implementation of the second aspect above, the operation of running an application includes the user clicking an application icon control on the second electronic device to run the application, or the user controlling, via a remote-control device, the running of an application on the second electronic device; the operation of entering text information includes the user clicking an input control on the second electronic device to enter text information, or the user entering text information on the second electronic device via a remote-control device; the operation of connecting to a network or a device includes the user clicking a network-connection or device-connection control on the second electronic device to send a connection request, or the user controlling the second electronic device via a remote-control device to send a request to connect to a network or a device; the operation of setting a parameter includes the user clicking a parameter-setting control on the second electronic device to set a parameter, or the user setting a parameter on the second electronic device via a remote-control device; the selection operation includes the user clicking a selection control on the second electronic device to select the content of the first specified task being executed; and the operation of switching pages includes the user controlling the second electronic device via a remote-control device to switch pages.
In a possible implementation of the second aspect above, the first specified task is a video playback task, and the second user operation includes any one of the following: the user clicking keys on an input-method interface in the display interface of the second electronic device to enter text information; the user entering text information in the display interface of the second electronic device via a remote-control device; the user clicking a parameter-setting button in the video playback interface displayed by the second electronic device to set a parameter; the user clicking a video-content-switching button or a video option button in the display interface of the second electronic device to select the video content to be played by the video playback task being executed; and the user setting the volume, brightness, or definition on the second electronic device via a remote-control device.
In a possible implementation of the second aspect above, the simulation instruction is generated by the first electronic device running the first program.
In a possible implementation of the second aspect above, the first program includes an executable file, and the generation of the executable file includes: the second electronic device sends second operation data to the first electronic device for use in generating the executable file, where the second operation data is data produced on the second electronic device by the user's second user operations and recorded by the second electronic device, and is to be called by the second electronic device to execute second operation instructions in response to the second user operations.
In a possible implementation of the second aspect above, the process of the second electronic device recording the second operation data produced on the second electronic device by the user's second user operations includes: in response to a start-recording instruction sent by the first electronic device, the second electronic device records the second operation data produced on the second electronic device by the user's second user operations; and in response to a stop-recording instruction sent by the first electronic device, the second electronic device stops recording and sends the second operation data to the first electronic device.
In a possible implementation of the second aspect above, the order in which the second user operations are simulated on the second electronic device is the same as the order in which the second operation data was produced.
In a possible implementation of the second aspect above, in the process of simulating the second user operations on the second electronic device, the second electronic device displays the changes in its display content and/or volume caused by the second user operations.
For example, in the embodiments below, when the user performs a volume-adjustment operation, the smart TV 200 displays a volume-change icon and the volume value set by the user; for details, refer to FIG. 2B and the related description in the embodiments below, which will not be repeated here.
In a third aspect, an embodiment of the present application provides a distributed system including a first electronic device and a second electronic device. The first electronic device is configured to, in response to a user's first instruction, simulate one or more first user operations on the first electronic device, and to generate and send at least one simulation instruction to the second electronic device, where the first instruction is used to instruct that a first specified task be completed by the second electronic device, the first user operation is a user operation that the user would need to perform on the first electronic device in the course of controlling the second electronic device to complete the first specified task, the simulation instruction is used to instruct that a second user operation be simulated on the second electronic device, and the second user operation is a user operation that the user would need to perform on the second electronic device in the process of completing the first specified task. The second electronic device is configured to execute the first specified task in response to the instruction from the first electronic device indicating that the first specified task be completed, and to simulate the second user operation on the second electronic device in response to the simulation instruction sent by the first electronic device.
In a fourth aspect, an embodiment of the present application provides an electronic device, including: one or more processors; and one or more memories storing one or more programs that, when executed by the one or more processors, cause the electronic device to perform the distributed implementation method between electronic devices described above.
In a fifth aspect, an embodiment of the present application provides a computer-readable storage medium having instructions stored thereon which, when executed on a computer, cause the computer to perform the distributed implementation method between electronic devices described above.
In a sixth aspect, an embodiment of the present application provides a computer program product including a computer program/instructions which, when executed by a processor, implement the distributed implementation method between electronic devices described above.
Brief Description of the Drawings
FIG. 1 is a schematic diagram of a screen-casting scenario in which screen casting between devices is realized using the technical solution of the present application, according to an embodiment of the present application.
FIG. 2A to FIG. 2D are schematic diagrams of interfaces related to the interaction process between the mobile phone 100 and the smart TV 200 in the screen-casting scenario realized using the distributed implementation method between electronic devices of the present application, according to an embodiment of the present application.
FIG. 3 is a schematic interaction flowchart of the mobile phone 100 generating the "one-key projection" button based on the operation information recorded by the mobile phone 100 and the smart TV 200 in the screen-casting scenario, according to an embodiment of the present application.
FIG. 4A to FIG. 4B are schematic diagrams of the operation interfaces through which the user operates the mobile phone 100 to enter the operation-record mode, according to an embodiment of the present application.
FIG. 5A to FIG. 5D are schematic diagrams of the interfaces corresponding to the operations the user performs on the mobile phone 100 to realize the screen-casting scenario 10, according to an embodiment of the present application.
FIG. 6A to FIG. 6B show the interface changes corresponding to the mobile phone 100 displaying pop-ups prompting the user to operate the mobile phone 100 to complete the creation of the automated-operation button, according to an embodiment of the present application.
FIG. 7A to FIG. 7B are schematic flowcharts of the exchange of operation information among the mobile phone 100, the smart TV 200, and the operation-information database when the database is provided on the mobile phone 100, according to an embodiment of the present application.
FIG. 8 is a schematic interaction flowchart of the mobile phone 100, in response to the user clicking the "one-key projection" button, controlling the execution of the automated operation instructions, in the executable file corresponding to that button, which correspond to the operations of the screen-casting scenario 10, according to an embodiment of the present application.
FIG. 9A is a schematic diagram of the interaction process in which the mobile phone 100 pushes automated operation instructions to their corresponding execution subjects when the mobile phone 100 and the smart TV 200 are directly communicatively connected, according to an embodiment of the present application.
FIG. 9B is a schematic diagram of the interaction process in which the mobile phone 100 pushes automated operation instructions to their corresponding execution subjects when the mobile phone 100 and the smart TV 200 are communicatively connected through the cloud server 300, according to an embodiment of the present application.
FIG. 10 is a schematic diagram of the interface in which the mobile phone 100 prompts the user that program execution has been abnormally aborted, according to an embodiment of the present application.
FIG. 11 is a schematic structural diagram of the mobile phone 100 according to an embodiment of the present application.
FIG. 12 is a block diagram of the software structure of the mobile phone 100 according to an embodiment of the present application.
Detailed Description
To make the objectives, technical solutions, and advantages of the present application clearer, the technical solutions of the embodiments of the present application are described in further detail below with reference to the accompanying drawings and implementations.
Illustrative embodiments of the present application include, but are not limited to, a distributed implementation method between electronic devices, a system, an electronic device, and a storage medium.
As described above, current smart scenarios have the problem that the user must manually perform the interactions among multiple devices, and the process is rather cumbersome. To solve this problem, an embodiment of the present application discloses a distributed implementation method between electronic devices: the operation information of the operations the user performs on each electronic device for the distributed completion of a specified task can be recorded in advance to generate an executable file; afterwards, when the user performs device interaction, simply triggering the executable file to run, for example via an automated-operation button, a voice command, or another preset scenario, automatically completes the interactive operations that would need to be performed on the multiple devices in the course of completing the specified task across electronic devices in a distributed manner, for example, as mentioned above, casting a video playing on the user's phone to the TV to continue playing, or playing the music on the phone through a smart speaker at home, without the user having to manually repeat the rather cumbersome interactive operations every time. This makes distributed completion of a specified task across electronic devices more convenient and faster.
The process of recording operation information can be understood as follows: when the user performs operations on an electronic device such as opening an application, entering text information, connecting to another electronic device, setting parameters, clicking controls, and selecting the peer device for a distributed task, the electronic device records the corresponding operation data produced on the electronic device by these operations. For example, while the user clicks the settings application button on the electronic device and opens the settings application interface, the electronic device can record data such as the touch position coordinates of the settings application button clicked by the user and the program name of the settings application as the recorded operation information. In this way, when the electronic device runs the executable file generated from the recorded operation information, it can simulate the operation of clicking the settings application button based on the recorded touch position coordinates and open the settings application interface.
It can be understood that the operation information recorded by the interacting electronic devices may include, but is not limited to, data such as the execution order of the operations, the names and models of the electronic devices involved in executing the operations, the names of the networks to which the relevant electronic devices are connected, the names of the applications running on the relevant electronic devices, the text and other content entered by the user on the relevant electronic devices, the positions where the user entered text, and the types of the user's operations on each electronic device (including click operations, slide operations, shortcut gestures, etc.).
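The fields of operation information enumerated above can be modeled as a single record per operation. The structure below is a minimal illustrative sketch; the field names are assumptions, not the format actually used by the devices.

```python
from dataclasses import dataclass, asdict
from typing import Optional, Tuple

@dataclass
class OperationRecord:
    # One recorded user operation; field names are illustrative only.
    seq: int                    # execution-order index of the operation
    device_name: str            # e.g. "mobile phone 100"
    device_model: str           # model of the device the operation ran on
    network: Optional[str]      # name of the network the device is connected to
    app: Optional[str]          # name of the application the operation targets
    op_type: str                # "click", "slide", "gesture", "text_input"
    position: Optional[Tuple[int, int]]  # touch coordinates, if any
    text: Optional[str] = None  # text entered by the user, if any

# Example: the click that opens the video application on the phone.
record = OperationRecord(seq=1, device_name="mobile phone 100",
                         device_model="EXAMPLE-01", network="home-wlan",
                         app="video app", op_type="click",
                         position=(540, 1200))
```

Serializing such records (e.g. via `asdict`) is one plausible way the recorded operation information could be exchanged between devices and later assembled into the executable file.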
In addition, it can be understood that the user may record the operation information of the operations performed on each electronic device the first time the interactive operations for the distributed completion of a specified task are performed. In other embodiments, after learning about the operation-information recording function that the mobile phone or other electronic device provides based on the distributed implementation method between electronic devices of the present application, the user may, during some later device interaction that completes the specified task in a distributed manner, record the operation information of the operations performed on each electronic device and generate the executable file; thereafter the user can trigger the executable file to run via the automated-operation button or the like to realize the device interaction. This is not limited here.
For example, FIG. 1 shows a screen-casting scenario in which screen casting between devices is realized using the technical solution of the present application. As shown in FIG. 1, the screen-casting scenario 10 includes a mobile phone 100 and a smart TV 200. The mobile phone 100 pre-stores an executable file corresponding to the screen-casting scenario 10 generated according to the distributed implementation method between electronic devices of the present application, and an automated-operation button capable of triggering the execution of the executable file has been created on the mobile phone 100. When the user wants to use the screen-casting scenario 10, the user can click the automated-operation button created on the mobile phone 100 to trigger the mobile phone 100 to run the executable file corresponding to the screen-casting scenario 10, controlling the mobile phone 100 and the smart TV 200 to respectively execute the operations corresponding to the screen-casting scenario 10 and completing the interaction between the mobile phone 100 and the smart TV 200 in the screen-casting scenario 10.
For example, FIG. 2A to FIG. 2D show schematic diagrams of the interfaces related to the interaction process between the mobile phone 100 and the smart TV 200 in the screen-casting scenario 10 shown in FIG. 1, realized using the distributed implementation method between electronic devices of the present application.
As shown in FIG. 2A, the control center menu bar 210 of the mobile phone 100 includes an operation-record button 211; the user can click the operation-record button 211 in the control center menu bar 210 to put the mobile phone 100 into the operation-information record mode, and the mobile phone 100 can serve as the main device in the screen-casting scenario 10. After the mobile phone 100 enters the operation-information record mode, as shown in FIG. 2B, the user can perform the operations for realizing the screen-casting scenario 10, such as tapping the video to be played in the video application running on the mobile phone 100, and clicking the projection button 220 on the video playback interface to select the smart TV 200 as the projection device so that the video playing on the mobile phone 100 is played through the smart TV 200. The mobile phone 100 can record, as the operation information corresponding to each operation, the execution instructions, signals, and related application and device information produced by the mobile phone 100 in response to the above operations, for example, the package name of the video application running on the mobile phone 100 and the access path of the application icon. The smart TV 200 can likewise record the operation information of the projection-related operations the user performs on the smart TV 200, for example, recording as operation information the volume, brightness, and definition parameters corresponding to the values the user sets on the smart TV 200.
After both the mobile phone 100 and the smart TV 200 have finished recording operation information, the smart TV 200 can send its recorded operation information to the mobile phone 100, and the mobile phone 100 combines the operation information of all operations to generate the automated operation instructions and the executable file storing those instructions. The mobile phone 100 can then control whether the executable file is executed by creating an automated-operation button, which may be, for example, the "one-key projection" button 230 shown in FIG. 2C. As shown in FIG. 2C, after the "one-key projection" button 230 corresponding to the screen-casting scenario 10 shown in FIG. 1 is generated using the distributed implementation method between electronic devices of the present application, when the user needs to cast the screen again, the user only needs to click the "one-key projection" button 230 on the mobile phone 100; the mobile phone 100 then executes the executable file corresponding to the button 230, controlling the mobile phone 100 and the smart TV 200 to automatically execute the automated operation instructions corresponding to the operations of the screen-casting scenario 10 and completing the interaction between the mobile phone 100 and the smart TV 200 with one tap, realizing the screen-casting scenario 10, i.e., the scenario in which the mobile phone 100 and the smart TV 200 complete the screen-casting task (or video playback task) in a distributed manner.
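Combining both devices' recorded operation information into one ordered instruction list (the content of the executable file) can be sketched as follows. This is a hypothetical serialization under assumed field names, not the file format the devices actually use.

```python
import json

def build_executable(phone_ops, tv_ops):
    """Merge the operation info recorded on both devices, ordered by the
    time each piece of operation data was generated, into a serializable
    list of automated operation instructions. Field names are illustrative:
    each op is a dict with 'time', 'device', 'action', optional 'params'."""
    merged = sorted(phone_ops + tv_ops, key=lambda op: op["time"])
    instructions = [
        {"subject": op["device"],          # execution subject of the instruction
         "action": op["action"],
         "params": op.get("params", {})}
        for op in merged
    ]
    return json.dumps(instructions)
```

When the "one-key projection" button is later clicked, the main device would read this list back and dispatch each instruction to its execution subject in order, executing phone-side instructions locally and pushing TV-side instructions to the TV.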
It can be seen that, in the process of the mobile phone 100, in response to the user clicking the "one-key projection" button 230, controlling the execution of the automated operation instructions corresponding to the operations of the screen-casting scenario 10, the mobile phone 100 or the smart TV 200 can automatically execute the relevant operations to realize the interaction without user involvement. Therefore, when the user reproduces a frequently used multi-device interaction scenario, the distributed implementation method between electronic devices of the present application can accomplish, with one tap of the automated-operation button, the multiple cumbersome operations involved in the specified task that the user would otherwise need to perform, simplifying the user's operation process, thereby improving operating efficiency and saving operating time.
It can be understood that electronic devices such as the mobile phone 100 can quickly realize multi-device interaction scenarios based on the distributed implementation method provided by the present application. Where the user is unaware that the electronic device has this function, the mobile phone 100 may appropriately push notifications introducing it. For example, the first time the user casts a video playing on the mobile phone 100 to the smart TV 200 to continue playing, the mobile phone 100 can ask the user whether to record the operations involved in the device interactions of this screen-casting process. Referring to FIG. 2D, for example, when the user clicks the projection button 220 on the video playback interface, the mobile phone 100 displays the notification pop-up 240 shown in FIG. 2D; the displayed notification content 241 may be, for example, "Enable the scenario-copy function to quickly realize multi-device interaction?", below which a function introduction 242 may be displayed, for example, "After recording, a one-key projection button can be generated; next time you can complete screen casting with one tap, convenient and fast". The user can click the "View" button 243 on the notification pop-up 240 to view tutorials, including tutorial interfaces on how to operate the mobile phone 100 to record operation information, how to generate the "one-key projection" button 230, and how to use the generated "one-key projection" button 230 for one-key projection. The user can also click the "Close" button 244 on the notification pop-up 240 to close it and continue the manual projection-related operations. It can be understood that the mobile phone 100 may also display pop-ups or tutorial interfaces prompting the user how to operate the smart TV 200 to record operation information; in other embodiments, the form in which the mobile phone 100 prompts the user, or the notification content and tutorial interfaces it displays, may take other interface forms, which is not limited here.
It can be understood that, in a multi-device interaction scenario, the main device used to control the recording of operation information may be provided with a database for storing the operation information recorded by the interacting electronic devices (hereinafter referred to as the operation-information database); after each electronic device finishes recording the operation information of its operations, it can send the recorded operation information to the main device to be stored in the operation-information database for the main device to call. In other embodiments, the operation-information database may instead be provided on a cloud server: after each electronic device finishes recording the operation information of its operations, it can upload the recorded operation information to the cloud server, and when the main device needs to obtain the operation information recorded by each electronic device, it can request it from the cloud server. This is not limited here.
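Either storage arrangement (database on the main device, or on a cloud server) exposes the same two operations: each device uploads its recorded operation information, and the main device fetches everything for a scenario when generating the executable file. A minimal in-memory stand-in, with assumed names, might look like this:

```python
class OperationInfoDatabase:
    # Minimal stand-in for the operation-information database described above;
    # it may live on the main device or on a cloud server, with the same API.
    def __init__(self):
        self._store = {}  # scenario name -> list of operation records

    def upload(self, scenario: str, device: str, records: list) -> None:
        # each device sends its recorded operation info, tagged with its name
        self._store.setdefault(scenario, []).extend(
            dict(r, device=device) for r in records
        )

    def fetch(self, scenario: str) -> list:
        # called by the main device when generating the executable file
        return list(self._store.get(scenario, []))
```

In the cloud variant, `upload` and `fetch` would be network requests to the cloud server rather than local calls, but the flow of operation information is the same.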
It can be understood that the distributed implementation method between electronic devices provided by the embodiments of the present application is applicable to all kinds of electronic devices, not limited to the mobile phone 100, the smart TV 200, smart speakers, portable computers, smart screens, desktop computers, tablet computers, laptop computers, wearable devices, head-mounted displays, mobile email devices, portable game consoles, portable music players, reader devices, personal digital assistants, virtual reality or augmented reality devices, and electronic devices in which one or more processors are embedded or coupled.
For convenience of description, when introducing the specific implementation of the distributed implementation method between electronic devices of the present application below, the description takes as an example the case where the interacting electronic devices include the mobile phone 100 and the main device executing the method is also the mobile phone 100. It can be understood that, in other embodiments, the interacting electronic devices in the multi-device interaction scenario may not include the mobile phone 100, or the main device among the interacting electronic devices may be an electronic device other than the mobile phone 100, which is not limited here.
The following first describes in detail, with reference to the accompanying drawings, the specific process, in the implementation of the distributed implementation method between electronic devices of the present application, of generating the executable file based on the operation information recorded by the interacting electronic devices and creating the automated-operation button that triggers the executable file to run; it then describes in detail, with reference to the accompanying drawings, the specific process of running the executable file via the created automated-operation button to execute the automated operation program and realize multi-device interaction with one tap.
Illustratively, based on the screen-casting scenario 10 shown in FIG. 1 and the related interface diagrams shown in FIG. 2A to FIG. 2C, the following first describes in detail, with reference to a flowchart, the specific process of generating, based on the operation information recorded by the interacting electronic devices, the executable file and the automated-operation button that triggers the executable file to run.
FIG. 3 shows, according to an embodiment of the present application, a schematic interaction flowchart of the mobile phone 100 generating the "one-key projection" button based on the operation information recorded by the mobile phone 100 and the smart TV 200 in the screen-casting scenario 10.
如图3所示,该流程包括以下步骤:
301:手机100检测到用户点击记录控件启动记录的操作指令,进入操作记录模式。
示例性地,在投屏场景10中,交互的手机100和智能电视200上均安装有能够分别记录用户在手机100或者在智能电视200上进行的各项操作相关的操作信息的应用程序,用户可以通过点击手机100上对应于该应用程序的程序图标或者触发运行该应用程序的控件启动记录,使手机100进入操作记录模式。
可以理解,用户在操作手机100进入操作记录模式之前,可以先完成智能电视200的网络连接设置,例如启动智能电视200、进入智能电视200的网络设置选择网络名称或者网络账号、输入密码将智能电视200接入家庭无线局域网络,如果智能电视200可连接的无线局域网络较多,则可以设置其中一个无线局域网络为智能电视200的默认连接网络。在此不再赘述。
上述用户操作手机100进入操作记录模式的操作界面可以参考图4A至图4B所示。
作为示例,图4A示出了用户操作手机100控制开启记录操作信息的一种操作界面示意图。
参考图4A所示的操作①,用户可以从手机100的屏幕顶部右侧下滑调出控制中心菜单栏410,进而参考图4A所示的操作②,用户点击控制中心菜单栏410中显示的操作记录按钮411可以使手机100进入操作记录模式,此时手机100响应于用户操作控制开启记录操作信息。手机100进入操作记录模式后,手机100屏幕上可以显示表示手机100正处于操作记录模式的状态栏或者控件,参考图4B所示的悬浮控件420。可以理解,上述操作记录按钮411可以是手机100上安装的用于记录操作信息的应用程序所提供的一个功能入口,用户点击操作记录按钮411即相当于点击手机100所安装的用于记录操作信息的应用程序的图标启动记录操作信息的过程。
作为示例,图4B示出了手机100进入操作记录模式后的一种界面示意图。
参考图4B所示,手机100进入操作记录模式后,手机100屏幕上可以显示悬浮控件420,悬浮控件420可以显示表示手机100正处于操作记录模式的文字,例如“正在记录”,悬浮控件420上还可以显示记录时长。在另一些实施例中,手机100进入操作记录模式后的界面也可以是不同于图4B所示的另一界面形式,在此不做限制。
302:手机100检测到用户实现交互场景的操作,记录各项操作所对应的操作信息并存储。
示例性地,手机100进入操作记录模式后,用户点击手机100开始进行实现交互场景的操作,例如实现投屏场景10的操作,包括用户在手机100上依次进行的打开视频应用、点开将要播放的视频、以及点击投屏按钮选择智能电视200作为投屏设备等操作,此时用户的操作焦点在手机100上。即手机100可以检测到用户的各项操作,并记录用户在手机100上进行的各项操作产生的操作信息并存储。其中,用户在手机100上进行的各项操作所对应的界面变化过程,将在下文具体描述,在此不再赘述。
示例性地,手机100所记录的对应于上述各项操作的操作信息,例如包括用户操作手机100打开的视频应用相关信息(例如视频应用的名称、视频应用图标等)、用户在视频应用主界面上的点击操作相关信息(例如点击位置的坐标信息、点击位置所对应的控件名称、点击位置所对应的视频名称、视频访问地址等信息)、以及用户在视频播放界面上的点击操作相关信息(例如点击位置的坐标信息、点击位置所对应的控件名称、点击位置所对应的投屏设备的名称及访问ID、以及完成投屏后用户点击切换清晰度或者调节播放音量等播放参数设置操作相关信息等)。
可以理解,手机100可以通过窗口管理服务(Window Manager Service,WMS)记录上述操作信息,也可以通过响应上述用户操作的处理器等对上述操作信息进行记录,手机100记录得到的操作信息可以先进行存储。在一些实施例中,如果在手机100上安装用于记录操作信息的应用程序时,系统分配有用于存储操作信息的数据库(下称操作信息数据库),则手机100所记录的操作信息也可以直接存储至操作信息数据库中,在另一些实施例中,操作信息数据库也可以设置在云服务器上,则手机100可以先将所记录的操作信息进行存储,待结束记录时手机100可以将所记录的全部操作信息上传至云服务器上的操作信息数据库中,以供调用。具体地,操作信息数据库设置在手机100上或者设置在云服务器上所对应的操作信息数据传送过程将在下文详细描述,在此不再赘述。
另外可以理解,手机100记录上述操作信息的顺序对应于用户在手机100上进行相关操作的顺序,手机100记录的操作信息中可以携带对应于用户操作顺序的时序信息,例如用户操作时的时间信息。
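作为一种示意性的理解,下面以Python(本文列举的可选脚本语言之一)给出携带时序信息的操作信息记录、以及按时序信息还原用户操作顺序的一种可能实现。其中函数名与字段名(如device、type、ts等)均为示例性假设,并非本申请限定的数据结构:

```python
import time

def make_op_record(device_id, op_type, payload, ts=None):
    """构造一条操作信息记录,携带时间戳以保留用户操作顺序(字段名为示例假设)。"""
    return {
        "device": device_id,   # 记录并执行该操作的设备标识
        "type": op_type,       # 操作类型,例如 "launch_app"、"tap"
        "payload": payload,    # 操作相关数据,例如程序包名、触控坐标
        "ts": ts if ts is not None else time.time(),  # 操作时的时间信息
    }

def sort_by_user_order(records):
    """按时间戳升序排列,得到与用户操作顺序一致的记录序列。"""
    return sorted(records, key=lambda r: r["ts"])
```

如此,即使手机100与智能电视200分别记录、事后汇总,主设备也能依据时间戳恢复各项操作的先后关系。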
303:手机100检测到用户选择智能电视200作为对端设备的操作。
示例性地,在用户实现交互场景的各项操作中,包括为主设备选择执行交互任务的对端设备的操作,例如实现投屏场景10各项操作中,用户在操作手机100进行投屏时,需要选择执行手机100上的视频播放任务的投屏设备(即对端设备),例如智能电视200,具体选择投屏设备的操作例如可以是:用户在手机100所打开的视频播放界面点击投屏按钮选择智能电视200作为投屏设备。
304:手机100向智能电视200发送进入操作记录模式的指令。
示例性地,手机100确定智能电视200作为对端设备后,手机100则向对端设备(智能电视200)发送进入操作记录模式的指令。
可以理解,在多设备交互场景中,用户会通过主设备选择交互操作的对端设备,该对端设备即为该场景中与主设备交互的其他电子设备。用户选择对端设备操作的过程中,主设备可以将操作记录模式的指令发送给该对端设备,使该对端设备进入操作记录模式。
可以理解,在另一些实施例中,手机100在检测到用户选择智能电视200作为对端设备的操作时,也可以向智能电视200发送手机100此时处于操作记录模式的状态标签,以使智能电视200进入记录操作信息的准备状态,在此不做限制。
305:智能电视200接收进入操作记录模式的指令,并进入操作记录模式。
示例性地,智能电视200接收到手机100发来的进入操作记录模式的指令后,智能电视200的处理器可以解析并执行该指令,控制智能电视200进入操作记录模式。
可以理解,智能电视200进入操作记录模式后,即可响应于用户操作继续执行下述步骤306及其后续各步骤。
306:智能电视200检测到用户实现交互场景的操作。
示例性地,在用户实现交互场景的各项操作中,包括对作为对端设备的智能电视200进行的用于实现投屏场景10的操作,当用户选择智能电视200作为投屏的对端设备时,手机100可以向智能电视200发送连接请求,请求与智能电视200建立连接以传送视频播放任务的相关数据,此时智能电视200可以进入检测用户操作的准备模式中,检测用户在智能电视200上的任何操作。例如,在投屏场景10中,用户完成在手机100上所进行的实现投屏场景10的各项操作后,手机100所播放的视频由智能电视200进行播放,此时用户如果继续通过遥控器操作智能电视200进一步实现投屏场景10,则智能电视200在接收到用户通过遥控器发来的控制信号时,即可确定已检测到用户的操作。
在另一些实施例中,如果智能电视200采用触控屏,用户也可以通过触摸智能电视200的屏幕实现在智能电视200进行切换剧集、切换播放进度、以及设置清晰度、设置播放音量、或者设置亮度等操作,在此不做限制。
307:智能电视200向手机100发送操作焦点切换通知。
示例性地,智能电视200在检测到用户实现交互场景的操作时即向手机100发送用户操作焦点已切换至智能电视200的通知。
在另一些实施例中,智能电视200也可以在进入操作记录模式后,检测到用户操作便立即开始记录用户在智能电视200上进行的用户实现投屏场景10的各项操作所对应的操作信息,在此不做限制。
另外,在另一些实施例中,当智能电视200检测到用户操作时也可以向手机100发送操作焦点切换通知及申请记录操作信息的请求,手机100接收请求后确认用户的操作焦点切换并授权智能电视200记录用户操作信息,在此不做限制。
308:手机100基于接收到的操作焦点切换通知,向智能电视200发送记录指令。
示例性地,手机100接收到智能电视200发来的操作焦点切换通知后,可以向智能电视200发送开始记录操作信息的记录指令,以指令智能电视200开始记录用户所进行的各项操作对应的操作信息。
309:智能电视200基于所接收到的记录指令,记录用户在智能电视200上进行的各项操作对应的操作信息并存储。
示例性地,智能电视200可以通过处理器等响应于手机100发来的记录指令以及用户在智能电视200上进行用于实现投屏场景10的各项操作,例如用户通过智能电视200的遥控器控制智能电视200切换剧集、切换播放进度、以及设置清晰度、设置播放音量、或者设置亮度等操作,并记录对应于这些操作的操作信息。可以理解,智能电视200所记录的操作信息可以先进行存储,例如存储至智能电视200的内存或所连接的外部存储器等,在此不做限制。
可以理解,在另一些多设备交互场景中执行上述步骤309时,当与主设备交互的电子设备为两个以上,例如与手机100交互的电子设备不仅包括智能电视200,还包括平板电脑,此时智能电视200在完成记录用户在智能电视200上进行各项操作对应的操作信息后,可以进入等待用户操作的状态。用户如果在智能电视200上完成操作后,又开始操作手机100选择平板电脑作为某项任务的对端设备,例如作为文档处理任务的对端设备时,此时手机100重复执行上述步骤302记录相应操作信息、以及执行上述步骤303至304向平板电脑发送进入操作记录模式的指令。之后,用户操作平板电脑时,平板电脑则作为执行主体执行上述步骤305至309,记录用户在平板电脑上进行的各项操作对应的操作信息。可以理解,多设备交互过程中,当用户的操作焦点发生变化时,检测到用户操作的电子设备便可以作为执行主体执行上述步骤305至309,按照用户操作顺序记录各项操作的操作信息。当用户完成多设备交互场景的全部操作后,可以操作作为主设备的手机100结束操作记录模式,手机100继续执行下述步骤310。在此不做限制。
310:手机100检测到用户结束记录的操作指令,结束记录操作信息。
示例性地,用户完成在智能电视200上的操作之后,可以操作手机100退出操作记录模式。作为示例,参考上述图4B,悬浮控件420上可以显示结束记录按钮421,当用户完成用于实现投屏场景10的全部操作后,可以点击手机100屏幕上悬浮控件420中显示的结束记录按钮421使手机100退出操作记录模式(参考图4B所示的操作③),此时手机100响应于用户结束记录的操作结束记录操作信息。
在另一些实施例中,用户结束记录的操作也可以参考上述图4A所示的操作①,用户可以在手机100所显示的控制中心菜单栏410中再次点击操作记录按钮411使手机100退出操作记录模式,结束记录操作信息,在此不做限制。
311:手机100向智能电视200发送结束记录指令。
示例性地,当手机100响应于用户操作退出操作记录模式时,手机100可以向智能电视200发送结束记录指令。
312:智能电视200接收结束记录指令,结束记录操作信息。
示例性地,智能电视200接收到结束记录指令后,退出操作记录模式,结束记录操作信息。
313:智能电视200将所记录的操作信息发送给手机100。
示例性地,智能电视200结束记录操作信息后,可以将已记录的操作信息发送给手机100,例如智能电视200可以将所记录的操作信息打包成数据包之后基于与手机100之间的通信连接传送给手机100,或者智能电视200也可以将所记录的操作信息按照记录时间批量发送给手机100,在此不做限制。
可以理解,在另一些涉及多设备间的交互场景中,各电子设备在接收到结束记录指令、结束记录操作信息后,各电子设备可以同时或者按照用户操作顺序分别将所记录的操作信息发送给手机100。可以理解,各电子设备发送给手机100的操作信息中包含各电子设备的设备信息,例如设备标签等。
另外,参考上述步骤302中的相关描述,如果用于存储交互的各电子设备所记录的操作信息的操作信息数据库设置在手机100上,则各电子设备将所记录的操作信息发送给手机100后,手机100可以将所接收的操作信息直接存入操作信息数据库中;如果操作信息数据库设置在云服务器上,则各电子设备可以将所记录的操作信息上传至云服务器,以供手机100申请调用。具体地,操作信息数据库设置在手机100上或者设置在云服务器上所对应的操作信息数据传送过程将在下文详细描述,在此不再赘述。
314:手机100基于手机100及智能电视200所记录的操作信息生成可执行文件。
示例性地,手机100可以调用操作信息数据库中所存储的全部操作信息,例如包括手机100及智能电视200所记录的操作信息,并基于操作信息生成对应于各项操作的可执行自动化操作程序脚本,或者对应于各项操作的自动化操作指令,存于可执行文件中,当可执行文件被触发执行时,对应于各项操作的自动化操作指令可以按照用户在手机100上进行各项操作以及用户在智能电视200上进行各项操作的操作顺序进行执行,手机100所生成的可执行文件被触发执行时,手机100可以控制向各项操作的执行主体发送相对应的自动化操作指令,具体将在下文详细描述,在此不再赘述。另外,其中用户在手机100上进行的各项操作可以参考上述步骤302中相关描述,用户在智能电视200上进行各项操作可以参考上述步骤309中相关描述,在此不再赘述。
可以理解,上述手机100或智能电视200所记录的操作信息可以是响应于用户操作所产生的指令及相关数据等信息,也可以是转换为可执行的脚本语言进行描述的指令及数据信息,手机100基于手机100或智能电视200所记录的操作信息生成对应于各项操作的自动化操作程序脚本或自动化操作指令,所生成的自动化操作程序脚本或自动化操作指令中可以包括但不限于执行主体标识、所执行操作对应的文字输入内容、设置的参数等数据,对这些操作信息转换成自动化操作指令过程中所对应的处理方式将在下文详细描述,在此不再赘述。
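基于上述描述,主设备汇总各电子设备所记录的操作信息、并按操作顺序生成带有执行主体标识的自动化操作指令序列的过程,可以用如下Python片段作简化示意。其中记录与指令的字段名(device、type、payload、ts、executor等)均为示例性假设,仅示意“按时间戳合并排序、为每条指令标注执行主体”这一思路,并非本申请限定的可执行文件格式:

```python
def build_executable(phone_records, tv_records):
    """汇总主设备与对端设备所记录的操作信息,按时间戳排序后生成
    自动化操作指令列表(指令以字典示意,真实实现可写入脚本或可执行文件)。"""
    merged = sorted(phone_records + tv_records, key=lambda r: r["ts"])
    return [
        {"executor": r["device"], "action": r["type"], "args": r["payload"]}
        for r in merged
    ]
```

可执行文件被触发执行时,主设备即可按该列表的先后顺序,将各条指令交由其executor字段所标识的设备执行。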
315:手机100提示并响应于用户操作,完成创建自动化操作按钮。
示例性地,手机100在上述步骤314中基于手机100及智能电视200所记录的操作信息生成对应于图1所示投屏场景10的可执行文件后,可以提示用户进行创建自动化操作按钮的操作。例如,手机100可以通过显示弹窗来提示用户操作手机100完成生成自动化操作按钮的过程,具体将在下文结合附图详细描述,在此不再赘述。
可以理解,手机100所创建的自动化操作按钮将作为一键实现手机100与智能电视200间交互的操作入口,用户通过点击手机100上对应于图1所示投屏场景10的自动化操作按钮,即可触发各项操作的自动化执行过程。手机100响应于用户点击自动化操作按钮的操作,控制执行对应于各项操作的自动化操作指令的过程,将在下文详细描述,在此不再赘述。
可以理解,对于上述步骤301至315,在另一些实施例中,智能电视200也可以在接收到手机100发来的进入操作记录模式的指令、进入操作记录模式后,在检测到用户操作便立即开始记录用户在智能电视200上进行的各项操作所对应的操作信息。即在另一些实施例中,本申请所提供的电子设备间的分布式实现方法实施时,可以执行上述步骤305至306后,直接执行步骤309记录用户在智能电视200上进行的各项操作对应的操作信息,而不执行步骤307至308所描述的手机100与智能电视200之间传送操作焦点切换通知、及传送记录指令等过程后再执行步骤309记录操作信息,在此不做限制。
另外,在另一些实施例中,用户在操作手机100进入操作记录模式之前,也可以先完成手机100与智能设备200之间的连接,例如将智能电视200作为手机100的分布式设备,二者通过分布式软总线实现连接。进而在上述步骤301中,当作为主设备的手机100进入操作信息记录模式时,可以向已经与手机100完成连接的智能电视200发送进入操作记录模式的指令,如此,智能电视200接收指令后进入记录操作模式,当智能电视200检测到用户操作时,开始记录相关操作信息,在此不做限制。
在上述步骤302中,用户在手机100上进行用于实现投屏场景10的各项操作所对应的界面可以参考图5A至5D所示。
图5A示出了手机100响应于用户打开视频应用的操作所显示的视频应用主界面的界面示意图。作为示例,例如用户操作手机100所打开的视频应用为华为TM视频,即用户在手机100的桌面应用程序中点击华为TM视频的应用图标,手机100运行华为TM视频应用,用户点击视频界面510下方“我的”按钮511,则手机100所显示的视频界面510为图5A所示界面。在另一些实施例中,用户也可以点击视频界面510下方的其他按钮,例如“首页”按钮512、“看点”按钮513、“专区”按钮514、“教育”按钮515进入相应的显示界面以选择想要播放的视频,在此不做限制。
图5B示出了手机100响应于用户点开将要播放的视频的选择操作所显示的视频播放界面的界面示意图。用户在上述图5A所示界面上可以点击“播放历史”选项516下方的视频以点开将要播放的视频,手机100显示图5B所示的视频播放界面520,用户可以在视频播放界面520上点击投屏按钮521进行投屏。其中,视频播放界面520所显示的投屏按钮521可以参考上述图2B所示的投屏按钮220。在另一些实施例中,手机100所显示的视频播放界面520还可以是其他界面形式,例如可以是横屏显示的视频播放界面,在此不做限制。
图5C示出了手机100响应于用户点击投屏按钮521的操作所显示的投屏设备选择界面的界面示意图。如图5C所示,用户可以在视频播放界面520上点击投屏按钮521进行投屏后,视频播放界面520上显示投屏设备选择窗口530,用户可以选择搜索到的与手机100互联的智能电视200作为投屏设备(即与手机100交互的对端设备),参考图5C所示的操作④。
图5D示出了手机100响应于用户选择智能电视200作为投屏设备、完成投屏后的界面示意图。如图5D所示,用户在上述图5C所示界面选择智能电视200作为投屏设备后,手机100上正在播放的视频投放至智能电视200上继续播放,此时手机100显示图5D所示的完成投屏后的界面540,用户可以在界面540上点击清晰度切换按钮541设置视频在智能电视200上播放的清晰度(例如清晰度设置为720p),或者点击音量调节按钮542调节视频在智能电视200上播放的音量,或者点击剧集切换按钮543切换剧集等操作。可以理解,当手机100处于操作记录模式下,用户在界面540上操作的相关操作信息均会被记录,因此不推荐用户在手机100处于操作记录模式下时进行选择剧集或者切换剧集等操作,在此不做限制。
可以理解,手机100处于操作记录模式下可以暂时停止响应其他事件,例如手机100上设置的闹钟、备忘录等内部事件可以暂时停止响应,待手机100退出操作记录模式后再响应闹钟、备忘录等内部事件。在另一些实施例中,手机100处于操作记录模式下也可以同时响应其他事件,如果此时手机100正在记录用户操作手机100的相关操作信息,手机100响应闹钟、备忘录等内部事件的同时可以提醒用户此时不要对手机100显示的闹钟提醒窗口、备忘录提醒窗口等执行任何操作,以免将用户对闹钟提醒窗口或备忘录提醒窗口的操作信息记录下来,从而影响后续其他步骤的执行过程。在此不做限制。
在上述步骤314中,对于操作信息转换成自动化操作指令过程中所对应的处理方式,示例性地,可以参考以下对文字录入操作、运行应用程序及应用跳转操作、多设备之间的连接操作、参数设置操作、以及点击屏幕上的控件的操作、或者在屏幕上进行的滑动操作等对应的操作信息转换成自动化操作指令的处理方式。
(1)针对文字录入操作所对应的操作信息,在将该操作信息转换为自动化操作指令时可以读取该操作信息中的用户输入文字以及用户操作输入的位置相关信息写入指令中。如此,在后续执行该自动化操作指令时,接收用户输入文字的电子设备能够自动基于指令中的输入位置相关信息在相应输入位置上自动填写指令中的输入文字。例如在投屏场景中,如果用户在手机100打开的视频应用的界面搜索框内输入视频名称,例如“陈情令”,则用户在视频应用的界面搜索框内输入视频名称的操作对应的操作信息中会包括文字“陈情令”,在将对应于上述操作过程的操作信息转换为自动化操作指令时,可以读取文字“陈情令”以及输入位置(例如视频应用的界面搜索框)相关信息写入指令中,如此,在后续手机100执行对应于上述操作过程的自动化操作指令时则能够自动在视频应用的界面搜索框内输入文字“陈情令”进行搜索。
(2)针对运行应用程序及应用跳转操作所对应的操作信息,在将该操作信息转换为自动化操作指令时可以读取相关应用的程序包名称等信息、以及相关应用的运行顺序信息等写入指令中。如此,在后续执行该自动化操作指令时,运行相关应用程序的电子设备能够基于相关应用的运行顺序信息查找相应应用的程序包名称并自动运行该应用程序。
例如,在一些涉及多设备的交互场景中,手机100所记录的操作信息中包括用户操作该电子设备运行应用程序A和应用程序B所对应的操作信息,则在将手机100所记录的操作信息转换为自动化操作指令时可以读取操作信息中的应用程序A和应用程序B所对应的程序包名称、以及手机100运行应用程序A和应用程序B的顺序信息写入指令中,例如记录操作信息的过程中手机100先运行应用程序A再运行应用程序B。则在手机100执行上述自动化操作指令时,可以先基于指令中应用程序A的程序包名称查询程序包并自动运行应用程序A、再基于指令中应用程序B的程序包名称查询程序包并自动运行应用程序B,完成应用程序A至应用程序B的跳转运行过程。
(3)针对多设备之间的连接操作所对应的操作信息,例如上述图2A所示的手机100与智能电视200的连接操作所对应的操作信息,在将该操作信息转换为自动化操作指令时可以读取操作信息中的对端连接设备(例如智能电视200)的设备信息以及用户操作发起搜索及连接请求等对应的操作控件信息等写入指令中,如此,在后续执行该自动化操作指令时,手机100可以基于指令中的操作控件信息自动触发发起搜索的操作,搜索临近设备,并基于指令中的对端连接设备智能电视200的设备信息进行自动匹配,该自动匹配的过程例如可以是:在搜索到的所有临近设备信息中顺序查看是否存在智能电视200的设备信息,匹配成功则自动向智能电视200发起连接请求进行自动连接。
(4)针对用户对设备进行的参数设置操作所对应的操作信息,例如上述图2B所示的用户在智能电视200上进行的播放参数设置操作所对应的操作信息,在将该操作信息转换为自动化操作指令时可以读取操作信息中的设置参数数据写入指令中,例如记录上述操作信息时用户对音量、清晰度、亮度等参数的设置值。如此,在后续执行该自动化操作指令时,智能电视200可以基于指令中的设置参数数据自动调整智能电视200播放视频的音量、清晰度、亮度等参数为设置值。
(5)针对用户在具有触控屏的设备屏幕上点击控件的操作、或者在设备屏幕上进行的滑动操作等所对应的操作信息,例如用户在手机100屏幕上点击控件所对应的操作信息,在将该操作信息转换为自动化操作指令时可以读取操作信息中的屏幕点击事件相关数据,例如屏幕点击事件所对应的触控位置坐标数据等。如此,在后续执行该自动化操作指令时,手机100可以基于指令中的触控位置坐标数据等屏幕点击事件相关数据自动响应于该屏幕点击事件执行相应操作。
可以理解,在另一些实施例中,对于上述示例的文字录入操作、应用跳转操作、多设备之间的连接操作、设置操作以及点击屏幕执行的相关操作的操作信息转换为自动化操作指令的处理也可以采用其他合理的处理方式,在此不做限制。
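对于上述(1)至(5)所示例的几类操作信息转换为自动化操作指令的处理方式,可以用如下Python片段作一个简化示意。其中操作类型名、字段名以及指令格式均为示例性假设,真实实现中读取与写入的内容以各电子设备所记录的操作信息为准:

```python
def to_instruction(record):
    """将一条操作信息记录转换为自动化操作指令(处理方式与字段名均为示意假设)。"""
    op, data = record["type"], record["payload"]
    if op == "input_text":       # (1) 文字录入:写入输入文字及输入位置
        return {"cmd": "input", "text": data["text"], "target": data["field"]}
    if op == "launch_app":       # (2) 运行应用:写入程序包名称
        return {"cmd": "launch", "package": data["package"]}
    if op == "connect_device":   # (3) 设备连接:写入对端设备信息以便自动匹配
        return {"cmd": "connect", "peer": data["peer_id"]}
    if op == "set_param":        # (4) 参数设置:写入设置参数数据
        return {"cmd": "set", "name": data["name"], "value": data["value"]}
    if op == "tap":              # (5) 屏幕点击:写入触控位置坐标数据
        return {"cmd": "tap", "x": data["x"], "y": data["y"]}
    raise ValueError("未知操作类型: " + op)
```

例如,记录了“在搜索框内输入‘陈情令’”这一操作后,转换得到的指令中即携带输入文字与输入位置,供后续自动填写。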
另外,可以理解,如果上述手机100和智能电视200所记录的操作信息是通过脚本语言记录的,则手机100可以将从操作信息数据库中读取的操作信息按照相应的操作顺序复制编写到同一自动化操作程序可运行文件中即可。
可以理解,用于运行操作信息所对应的各项操作的自动化操作程序脚本语言包括但不限于Scala、JavaScript,VBScript,ActionScript,MAX Script,ASP,JSP,PHP,SQL,Perl,Shell,python,Ruby,JavaFX,Lua,Auto It等,在此不做限制。
在上述步骤315中,手机100通过显示弹窗来提示用户操作手机100完成创建自动化操作按钮所对应的界面变化过程,可以参考图6A至图6B所示。
图6A示出了一种手机100通过显示弹窗提示用户创建自动化操作按钮的界面示意图。
如图6A所示,手机100生成对应于图1所示投屏场景10的可执行文件后,手机100上可以通过弹窗显示选择窗口610,示例性地,选择窗口610中可以包括生成自动化操作按钮611以及重新记录操作信息按钮612,参考图6A所示的操作⑤,用户可以点击生成自动化操作按钮611将存储在操作信息数据库中的操作信息生成为自动化操作按钮。手机100响应于用户点击生成自动化操作按钮611的操作读取操作信息数据库中的操作信息,编写自动化操作脚本并生成自动化操作按钮。如果用户点击图6A所示的重新记录操作信息按钮612,则手机100可以重新开始执行上述步骤301进入操作记录模式。
图6B示出了一种手机100创建自动化操作按钮后通过弹窗提示用户对所创建的自动化操作按钮进行相应设置的界面示意图。
如图6B所示,手机100生成自动化操作按钮后可以显示选择窗口620,作为示例,选择窗口620中包括自定义命名选项框621、“添加至桌面”对应的选项按钮622、以及“添加至控制中心”的选项按钮623。用户在手机100所显示的选择窗口620上,可以点击自定义命名选项框621输入自定义名称,例如用户可以输入“一键投屏”作为手机100生成的自动化操作按钮的名称;用户点击选择窗口620上的“添加至桌面”选项按钮622,可以将生成的自动化操作按钮(或者自定义名称后的“一键投屏”按钮)对应的图标添加至手机100的桌面上,参考图6B所示的手机100桌面上显示的“一键投屏”630。
可以理解,在另一些实施例中,手机100也可以提示用户创建语音指令,手机100提示用户创建语音指令的过程可以参考上述图6A至6B所示,在此不再赘述。手机100完成创建语音指令后,用户可以先通过预设的唤醒词唤醒手机100的语音助手,例如用户可以通过向手机100说出“小艺!小艺!”唤醒手机100的语音助手,然后用户可以说出手机100创建对应于投屏场景的上述语音指令时所设置的指令词,例如“一键投屏”,进而触发手机100执行上述可执行文件对应的自动化操作程序实现手机100与智能电视200间的交互,实现图1所示的投屏场景,在此不做限制。具体手机100执行自动化操作程序手机100与智能电视200间的交互的具体过程,将在下文详细描述,在此不再赘述。
另外,可以理解,上述步骤301至315所描述的投屏场景10中手机100基于手机100及智能电视200所记录的操作信息生成“一键投屏”按钮的交互流程中,操作信息数据库既可以设置在作为主设备的手机100上,也可以设置在云服务器上,相对应地,手机100、智能电视200与操作信息数据库之间进行交互的过程会有所区别。
示例性地,图7A示出了操作信息数据库设置在手机100上时,手机100、智能电视200与操作信息数据库之间交互操作信息的流程示意图。
如图7A所示,在涉及手机100与智能电视200间交互的投屏场景10中,用于存储手机100及智能电视200所记录的操作信息的操作信息数据库101可以设置在手机100上,例如在手机100上安装用于记录操作信息的应用程序时,手机100的系统为该应用程序所配置的存储空间内设置上述操作信息数据库。并且,手机100与智能电视200之间直接通信连接并且可以相互传送操作信息等数据。例如,手机100与智能电视200之间的连接可以通过多种方式实现。例如,如果手机100和智能电视200所搭载的操作系统为鸿蒙 TM操作系统(Harmony OS),则手机100与智能电视200之间可以通过分布式软总线连接。在另一些实施例中,手机100与智能电视200之间也可以通过蓝牙等其他方式实现连接等,在此不做限制。
继续如图7A所示,当手机100进入操作记录模式后,手机100检测到用户的操作时,手机100的窗口管理服务102可以在响应于用户操作的同时记录用户在手机100上进行的各项操作对应的操作信息并存储,具体参考上述步骤302中相关描述,在此不再赘述。手机100可以将所记录的操作信息先存储至内部存储器中,当手机100结束记录时,再将所记录的操作信息存入操作信息数据库中。可以理解,在另一些实施例中,用户在手机100上进行的各项操作所对应的操作信息也可以直接存入手机100上的操作信息数据库中,在此不做限制。
当智能电视200检测到用户的操作时,智能电视200可以记录用户在智能电视200上进行的各项操作所对应的操作信息并存储,具体可以参考上述步骤309中相关描述,在此不再赘述。其中,智能电视200检测到用户的操作时可以与手机100进行通知及指令交互,该交互过程可以参考上述步骤305至309中相关描述,在此不再赘述。智能电视200结束记录时,可以将所记录并存储的操作信息发送给手机100,手机100接收智能电视200所记录的操作信息后可以将该操作信息存入操作信息数据库101中。
在手机100获取智能电视200发来的操作信息后,手机100可以从操作信息数据库中加载全部操作信息,包括手机100所记录的操作信息及智能电视200所记录的操作信息。手机100进而基于全部操作信息生成自动化操作程序脚本及相应的可执行文件,进而手机100提示并响应于用户操作,创建自动化操作按钮,完成创建自动化操作按钮后,手机100上可以显示所创建的自动化操作按钮。其中,手机100基于操作信息生成可执行文件的过程可以参考上述步骤314中相关描述,手机100创建自动化操作按钮的过程可以参考上述步骤315,在此不再赘述。
图7B示出了操作信息数据库设置在云服务器300上时,手机100、智能电视200与操作信息数据库之间交互操作信息的流程示意图。
如图7B所示,在涉及手机100与智能电视200间交互的投屏场景10中,用于存储手机100及智能电视200所记录的操作信息的操作信息数据库101也可以设置在云服务器300上,云服务器300上所设置的操作信息数据库101还可以用于响应及传送用户操作手机100运行记录操作信息的应用程序相关的指令信息等。手机100与智能电视200可以通过云服务器300通信连接及传送操作信息等数据,例如手机100可以通过云服务器300向智能电视200发送请求通信连接的连接申请,智能电视200接收到通过云服务器300传送来的连接申请后,可以显示是否授权与手机100连接并向手机100发送设备信息的通知窗口供用户确认,如果用户确认授权,则智能电视200可以将自身的设备信息通过云服务器300发送给手机100,手机100获取智能电视200的设备信息后完成与智能电视200之间的设备连接。
继续如图7B所示,当手机100进入操作记录模式后,手机100检测到用户的操作时,手机100可以在响应于用户操作的同时记录用户在手机100上进行的各项操作对应的操作信息并存储,具体参考上述步骤302中相关描述、以及图5A至5D所示及相关描述,在此不再赘述。手机100所记录的操作信息,也可以称为图7B所示的手机侧操作信息。可以理解,作为主设备的手机100可以将所记录的操作信息存储在手机100内存中以供调用,也可以在结束记录时,将所记录的操作信息上传云服务器300存入操作信息数据库中,在此不做限制。
当智能电视200检测到用户的操作时,可以通过云服务器300向手机100发送焦点切换通知,手机100接收到该通知后通过云服务器300向智能电视200发送记录指令,智能电视200接收记录指令后开始记录用户在智能电视200上进行的各项操作所对应的操作信息并存储,具体可以参考上述步骤305至309中相关描述,在此不再赘述。智能电视200所记录的操作信息,也可以称为图7B所示的终端侧操作信息。智能电视200结束记录时,可以将所记录并存储的操作信息上传至云服务器300存入操作信息数据库101中。
完成记录后,作为主设备的手机100需要获取全部操作信息,包括手机100所记录的操作信息及智能电视200所记录的操作信息,此时云服务器300可以将存入操作信息数据库101中的操作信息推送给作为主设备的手机100,云服务器300也可以基于手机100发送的数据请求向手机100推送存入操作信息数据库101中的全部操作信息,在此不做限制。
在手机100获取全部操作信息后,可以基于全部操作信息生成自动化操作程序脚本及相应的可执行文件,进而手机100提示并响应于用户操作,创建自动化操作按钮,完成创建自动化操作按钮后,手机100上可以显示所创建的自动化操作按钮。其中,手机100基于操作信息生成可执行文件的过程可以参考上述步骤314中相关描述,手机100创建自动化操作按钮的过程可以参考图6A至6B所示及相关描述,在此不再赘述。
下面结合另一流程图详细说明本申请的电子设备间的分布式实现方法实施过程中通过所创建的自动化操作按钮触发执行可执行文件中的自动化操作指令、一键实现多设备间的交互的具体过程。
图8根据本申请实施例示出了手机100响应于用户点击“一键投屏”按钮控制执行对应于该按钮的可执行文件中对应于投屏场景10中各项操作的自动化操作指令的示意性交互流程图。
如图8所示,该流程包括以下步骤:
801:手机100检测到用户点击自动化操作按钮的操作,调取对应于该按钮的可执行文件中的自动化操作指令。
示例性地,手机100检测到用户点击用于实现投屏场景10的自动化操作按钮,例如上述图6B所示的“一键投屏”按钮630,用户对该按钮630的点击操作可以触发手机100运行“一键投屏”按钮630所对应的可执行文件,如上所述,该可执行文件中存储有对应于各项操作的自动化操作指令,手机100可以调取该可执行文件中的自动化操作指令,再继续进行下述判断步骤802。
可以理解,参考上述步骤314中相关描述,对应于各项操作的自动化操作指令的执行主体包括手机100,即基于手机100记录的用户在手机100上进行的各项操作的操作信息生成的自动化操作指令,执行主体为手机100;基于智能电视200记录的用户在智能电视200上进行的各项操作的操作信息生成的自动化操作指令,执行主体为智能电视200。
802:手机100判断即将执行的自动化操作指令的执行主体设备。若即将执行的自动化操作指令对应的执行主体为手机100,则继续执行步骤803;若即将执行的自动化操作指令对应的执行主体为智能电视200,则继续执行步骤804。
示例性地,手机100调取可执行文件中的自动化操作指令后,可以按照各指令对应的操作的操作顺序,依次判断各自动化操作指令的执行主体,并将各自动化操作指令推送给相应的执行主体进行执行,其中手机100推送指令的顺序与相应的各项操作的操作顺序一致。对于“一键投屏”按钮630对应的可执行文件中的自动化操作指令,如果手机100判断即将执行的自动化操作指令对应的执行主体为手机100,则手机100可以直接执行该指令,即进行下述步骤803;如果手机100判断即将执行的自动化操作指令对应的执行主体为智能电视200,则手机100需要将该指令推送给智能电视200执行,即进行下述步骤804。
803:手机100执行以手机100作为执行主体的自动化操作指令。
示例性地,如果待执行的自动化操作指令的执行主体为手机100,则手机100的处理器可以直接调用该指令进行执行。具体的指令推送过程和指令执行过程,将在下文结合附图详细描述,在此不再赘述。
其中,手机100执行自动化操作指令时,对于指令中涉及的文字录入、应用跳转、设备连接、参数设置等操作对应的脚本可以参考上述步骤314中相关描述进行处理,在此不做限制。
804:手机100发送执行主体为智能电视200的自动化操作指令给智能电视200。
示例性地,如果待执行的自动化操作指令的执行主体为智能电视200,则手机100将该指令推送给智能电视200进行执行。具体的指令推送过程和指令执行过程,将在下文结合附图详细描述,在此不再赘述。
805:智能电视200执行所接收的自动化操作指令。
示例性地,智能电视200接收到需要执行的自动化操作指令后,智能电视200的处理器可以调用该指令进行执行。具体执行所接收的指令的过程,将在下文结合附图详细描述,在此不再赘述。
806:智能电视200向手机100反馈完成执行指令的信号。
示例性地,智能电视200完成执行所接收的指令后,可以向手机100反馈完成执行的信号。具体反馈完成执行的信号的过程,将在下文结合附图详细描述,在此不再赘述。
对于上述步骤803至806所涉及的指令推送、指令执行以及反馈完成执行的信号的具体过程,以下将进行具体说明。
可以理解,手机100可以根据操作顺序向执行主体逐条推送自动化操作指令,待已推送的自动化操作指令执行完毕后执行主体反馈完成执行的信号时,再根据即将执行的下一个自动化操作指令所对应的执行主体推送该指令。在另一些实施例中,手机100还可以在调取可执行文件中的自动化操作指令时先遍览全部指令并确定各指令对应的执行主体,如果即将执行的多条自动化操作指令的执行主体为同一设备,则手机100可以将即将执行的多条自动化操作指令一并推送给相应的执行主体,即此种情形下可以堆叠推送指令;相应的执行主体接收到手机100推送的自动化操作指令后按序依次执行,并在完成执行手机100推送的全部自动化操作指令后,向手机100反馈完成执行的信号;手机100在接收到完成执行的信号后,继续向即将执行的下一个或多个自动化操作指令的执行主体推送指令。在此不做限制。
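上述“逐条判断执行主体、相邻的同一执行主体指令可堆叠成批推送”的分发逻辑,可以用如下Python片段作简化示意。其中send、execute为假设的发送与执行回调,并省略了等待对端反馈完成执行信号的过程:

```python
from itertools import groupby

def dispatch(instructions, local_device, send, execute):
    """按操作顺序执行自动化操作指令:执行主体为本机则直接逐条执行,
    相邻的同一对端设备指令堆叠成批一并推送(简化示意)。"""
    for executor, group in groupby(instructions, key=lambda i: i["executor"]):
        batch = list(group)
        if executor == local_device:
            for ins in batch:
                execute(ins)        # 本机作为执行主体,直接执行
        else:
            send(executor, batch)   # 一并推送给相应的对端执行主体
```

例如在投屏场景10中,手机100先本机执行打开视频应用等指令,再将设置清晰度、音量等指令成批推送给智能电视200。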
作为示例,图9A示出了手机100与智能电视200直接通信连接的情形下,手机100向自动化操作指令对应的执行主体推送自动化操作指令的交互过程示意图。
如图9A所示,手机100响应于用户点击“一键投屏”按钮630的操作、控制执行对应于各项操作的自动化操作指令时,如果判断即将执行的自动化操作指令的执行主体为手机100,则手机100直接执行该指令;如果手机100判断即将执行的自动化操作指令的执行主体为智能电视200,则手机100可以直接向智能电视200推送该指令,智能电视200接收手机100推送的自动化操作指令后执行该指令。其中,智能电视200完成执行相应指令后,可以直接向手机100反馈(发送)完成执行的信号。
图9B示出了手机100与智能电视200通过云服务器300进行通信连接的情形下,手机100向自动化操作指令对应的执行主体推送自动化操作指令的交互过程示意图。
如图9B所示,手机100响应于用户点击“一键投屏”按钮630的操作、控制执行对应于各项操作的自动化操作指令时,如果判断即将执行的自动化操作指令的执行主体为手机100,则手机100直接执行该指令;如果手机100判断即将执行的自动化操作指令的执行主体为智能电视200,则手机100需通过云服务器300向智能电视200推送该指令,智能电视200接收到云服务器300转送的由手机100推送的自动化操作指令后执行该指令。其中,智能电视200完成执行相应指令后,可以通过云服务器300向手机100反馈完成执行的信号。
其中,手机100或智能电视200执行相应自动化操作指令的顺序,可以参考上述步骤302、309及314中相关描述,例如“一键投屏”按钮630对应的各项操作的操作顺序为用户先在手机100上依次进行打开视频应用、点开将要播放的视频、以及点击投屏按钮选择智能电视200作为投屏设备等操作,然后用户在智能电视200上依次进行设置清晰度、设置播放音量、或者设置亮度等操作,因此,在执行对应于各项操作的自动化操作指令时,手机100可以按序先执行对应于上述打开视频应用、点开将要播放的视频、以及点击投屏按钮选择智能电视200作为投屏设备等操作的自动化操作指令,完成执行后手机100再向智能电视200推送需要智能电视200执行的自动化操作指令,智能电视200接收指令后按序执行对应于上述设置清晰度、设置播放音量、或者设置亮度等操作的自动化操作指令,执行完成后智能电视200向手机100反馈完成执行的信号。
另外,可以理解,在上述对应于各项操作的自动化指令的执行过程中,可能会遇到手机100执行相应指令或者智能电视200执行相应指令中止等执行故障,此时手机100或智能电视200可以执行相关排障算法排除相应执行故障,作为示例,相关排障算法所对应的处理方式可以参考以下几种处理方式:
例如,当手机100执行当前操作的执行指令中止时,手机100控制重新执行当前操作的执行指令排除执行故障即可。当智能电视200执行当前操作的执行指令中止的情况下,智能电视200可以向手机100反馈中止执行当前操作的执行指令的消息,手机100在接收到反馈消息后可以向智能电视200再次发送当前操作的执行指令及相关操作信息,以控制智能电视200重新执行当前操作的执行指令。当智能电视200向手机100反馈已完成当前指令的执行信息后,手机100再向智能电视200发送下一个操作对应的执行指令及相关操作信息。在另一些实施例中,手机100也可以基于智能电视200超时反馈完成执行指令的情况,自发控制向智能电视200再次发送当前操作的执行指令及相关操作信息,以排除智能电视200上的执行故障,在此不做限制。
例如,当手机100或者智能电视200执行当前操作的执行指令中止,并且重复执行当前操作的执行指令超时,例如超过预设的执行时长,则手机100可以自动退出“一键投屏”按钮630对应的自动化操作指令的执行,并提示用户检查当前多设备环境排除执行干扰因素后,再点击“一键投屏”按钮630执行相应的自动化操作指令。用户检查当前多设备环境,例如包括检查多设备的各自网络连接是否正常、主设备与其他电子设备之间的连接是否正常等。
其中,手机100提示用户程序执行异常中止的界面可以参考图10所示,手机100显示提示窗口1010,提示窗口1010上例如可以显示“自动化程序异常中止!”等提示字样,提示窗口1010上还可以提供选项按钮,例如“查找问题并解决”选项按钮1011以及“重新运行”按钮1012。参考图10所示的操作⑦,用户可以点击选项按钮1011启动手机100上的故障排除程序查找异常中止的原因并进行相关处理,故障排除处理方式例如可以参考上述排除执行故障的方式;用户也可以点击“重新运行”按钮1012使手机100重新运行“一键投屏”按钮630对应的自动化操作指令。在另一些实施例中,手机100提示用户程序执行异常中止的界面也可以是其他形式,在此不做限制。
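上述排障处理中“执行中止则重新执行、重复执行超过预设次数(或时长)则中止自动化流程并提示用户检查多设备环境”的思路,可以用如下Python片段作简化示意。其中重试次数阈值与异常类型均为示例性假设:

```python
def run_with_retry(instruction, execute, max_attempts=3):
    """执行单条自动化操作指令的简单排障示意:执行中止则重试,
    超过预设重试次数则中止自动化流程并提示用户(阈值为示例假设)。"""
    for attempt in range(1, max_attempts + 1):
        try:
            return execute(instruction)     # 执行成功,返回执行结果
        except RuntimeError:                # 执行中止,准备重试
            if attempt == max_attempts:
                raise TimeoutError("自动化程序异常中止,请检查多设备环境后重试")
```

偶发的执行中止经重试即可排除;持续失败时抛出的异常则可用于触发图10所示的异常中止提示窗口。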
基于上述步骤301至315、以及步骤801至806所描述的本申请的电子设备间的分布式实现方法的实施过程,可以理解在多设备交互场景中,交互的各电子设备中的主设备基于所记录的操作信息生成自动化操作按钮的过程,例如可以是:首先,该主设备基于所汇总的各项操作的操作信息生成控制各电子设备自动执行相应操作的执行控制指令、具体操作事项、参数等,其中记录某一项操作的操作信息的电子设备也是执行该项操作的执行设备;进而,该主设备将上述生成的执行控制指令、具体操作事项、参数等写入可执行的自动化操作程序脚本或自动化操作指令中;最后,该主设备可以通过生成一个可执行文件来存储上述自动化操作指令,该可执行文件被触发执行时,例如用户点击相应的自动化操作按钮时,主设备可以控制执行该可执行文件中存储的自动化操作指令。如此,通过本申请的电子设备间的分布式实现方法,经过一次操作信息的记录过程以及基于操作信息生成可执行文件、创建自动化操作按钮的过程之后,用户便可以通过所创建的自动化操作按钮一键实现多设备间的交互,实现相应的多设备交互场景,例如上述投屏场景10,使得用户手动操作多设备交互的繁琐过程变得简化,利于提高用户体验。
图11根据本申请实施例示出了一种手机100的结构示意图。
手机100可以包括处理器110,外部存储器接口120,内部存储器121,通用串行总线(universal serial bus,USB)接口130,充电管理模块140,电源管理模块141,电池142,天线1,天线2,移动通信模块150,无线通信模块160,音频模块170,扬声器170A,受话器170B,麦克风170C,耳机接口170D,传感器模块180,按键190,马达191,指示器192,摄像头193,显示屏194,以及用户标识模块(subscriber identification module,SIM)卡接口195等。其中传感器模块180可以包括压力传感器180A,陀螺仪传感器180B,气压传感器180C,磁传感器180D,加速度传感器180E,距离传感器180F,接近光传感器180G,指纹传感器180H,温度传感器180J,触摸传感器180K,环境光传感器180L等。
可以理解的是,本发明实施例示意的结构并不构成对手机100的具体限定。在本申请另一些实施例中,手机100可以包括比图示更多或更少的部件,或者组合某些部件,或者拆分某些部件,或者不同的部件布置。图示的部件可以以硬件,软件或软件和硬件的组合实现。
处理器110可以包括一个或多个处理单元,例如:处理器110可以包括应用处理器(application processor,AP),调制解调处理器,图形处理器(graphics processing unit,GPU),图像信号处理器(image signal processor,ISP),控制器,视频编解码器,数字信号处理器(digital signal processor,DSP),基带处理器,和/或神经网络处理器(neural-network processing unit,NPU)等。其中,不同的处理单元可以是独立的器件,也可以集成在一个或多个处理器中。
控制器可以根据指令操作码和时序信号,产生操作控制信号,完成取指令和执行指令的控制。在本申请实施例中,处理器110可以通过控制器控制手机100响应于用户操作实施本申请的电子设备间的分布式实现方法,例如控制手机100记录操作信息、发送记录指令以及控制执行自动化操作指令等。
处理器110中还可以设置存储器,用于存储指令和数据。在一些实施例中,处理器110中的存储器为高速缓冲存储器。该存储器可以保存处理器110刚用过或循环使用的指令或数据。如果处理器110需要再次使用该指令或数据,可从所述存储器中直接调用。避免了重复存取,减少了处理器110的等待时间,因而提高了系统的效率。
在一些实施例中,处理器110可以包括一个或多个接口。接口可以包括集成电路(inter-integrated circuit,I2C)接口,集成电路内置音频(inter-integrated circuit sound,I2S)接口,脉冲编码调制(pulse code modulation,PCM)接口,通用异步收发传输器(universal asynchronous receiver/transmitter,UART)接口,移动产业处理器接口(mobile industry processor interface,MIPI),通用输入输出(general-purpose input/output,GPIO)接口,用户标识模块(subscriber identity module,SIM)接口,和/或通用串行总线(universal serial bus,USB)接口等。
I2C接口是一种双向同步串行总线,包括一根串行数据线(serial data line,SDA)和一根串行时钟线(serial clock line,SCL)。在一些实施例中,处理器110可以包含多组I2C总线。处理器110可以通过不同的I2C总线接口分别耦合触摸传感器180K,充电器,闪光灯,摄像头193等。例如:处理器110可以通过I2C接口耦合触摸传感器180K,使处理器110与触摸传感器180K通过I2C总线接口通信,实现手机100的触摸功能。
I2S接口可以用于音频通信。在一些实施例中,处理器110可以包含多组I2S总线。处理器110可以通过I2S总线与音频模块170耦合,实现处理器110与音频模块170之间的通信。
PCM接口也可以用于音频通信,将模拟信号抽样,量化和编码。在一些实施例中,音频模块170与无线通信模块160可以通过PCM总线接口耦合。在一些实施例中,音频模块170也可以通过PCM接口向无线通信模块160传递音频信号,实现通过蓝牙耳机接听电话的功能。
UART接口是一种通用串行数据总线,用于异步通信。该总线可以为双向通信总线。它将要传输的数据在串行通信与并行通信之间转换。在一些实施例中,UART接口通常被用于连接处理器110与无线通信模块160。例如:处理器110通过UART接口与无线通信模块160中的蓝牙模块通信,实现蓝牙功能。在一些实施例中,音频模块170可以通过UART接口向无线通信模块160传递音频信号,实现通过蓝牙耳机播放音乐的功能。
MIPI接口可以被用于连接处理器110与显示屏194,摄像头193等外围器件。MIPI接口包括摄像头串行接口(camera serial interface,CSI),显示屏串行接口(display serial interface,DSI)等。在一些实施例中,处理器110和摄像头193通过CSI接口通信,实现手机100的拍摄功能。处理器110和显示屏194通过DSI接口通信,实现手机100的显示功能。
GPIO接口可以通过软件配置。GPIO接口可以被配置为控制信号,也可被配置为数据信号。在一些实施例中,GPIO接口可以用于连接处理器110与摄像头193,显示屏194,无线通信模块160,音频模块170,传感器模块180等。GPIO接口还可以被配置为I2C接口,I2S接口,UART接口,MIPI接口等。
USB接口130是符合USB标准规范的接口,具体可以是Mini USB接口,Micro USB接口,USB Type C接口等。USB接口130可以用于连接充电器为手机100充电,也可以用于手机100与外围设备之间传输数据。也可以用于连接耳机,通过耳机播放音频。该接口还可以用于连接其他电子设备,例如AR设备等。
可以理解的是,本发明实施例示意的各模块间的接口连接关系,只是示意性说明,并不构成对手机100的结构限定。在本申请另一些实施例中,手机100也可以采用上述实施例中不同的接口连接方式,或多种接口连接方式的组合。
充电管理模块140用于从充电器接收充电输入。其中,充电器可以是无线充电器,也可以是有线充电器。在一些有线充电的实施例中,充电管理模块140可以通过USB接口130接收有线充电器的充电输入。在一些无线充电的实施例中,充电管理模块140可以通过手机100的无线充电线圈接收无线充电输入。充电管理模块140为电池142充电的同时,还可以通过电源管理模块141为电子设备供电。
手机100的无线通信功能可以通过天线1,天线2,移动通信模块150,无线通信模块160,调制解调处理器以及基带处理器等实现。
天线1和天线2用于发射和接收电磁波信号。手机100中的每个天线可用于覆盖单个或多个通信频带。不同的天线还可以复用,以提高天线的利用率。
移动通信模块150可以提供应用在手机100上的包括2G/3G/4G/5G等无线通信的解决方案。
无线通信模块160可以提供应用在手机100上的包括无线局域网(wireless local area networks,WLAN)(如无线保真(wireless fidelity,Wi-Fi)网络),蓝牙(Bluetooth,BT),全球导航卫星系统(global navigation satellite system,GNSS),调频(frequency modulation,FM),近距离无线通信技术(near field communication,NFC),红外技术(infrared,IR)等无线通信的解决方案。无线通信模块160可以是集成至少一个通信处理模块的一个或多个器件。无线通信模块160经由天线2接收电磁波,将电磁波信号调频以及滤波处理,将处理后的信号发送到处理器110。无线通信模块160还可以从处理器110接收待发送的信号,对其进行调频,放大,经天线2转为电磁波辐射出去。
在一些实施例中,手机100的天线1和移动通信模块150耦合,天线2和无线通信模块160耦合,使得手机100可以通过无线通信技术与网络以及其他设备通信。所述无线通信技术可以包括全球移动通讯系统(global system for mobile communications,GSM),通用分组无线服务(general packet radio service,GPRS),码分多址接入(code division multiple access,CDMA),宽带码分多址(wideband code division multiple access,WCDMA),时分码分多址(time-division code division multiple access,TD-SCDMA),长期演进(long term evolution,LTE),BT,GNSS,WLAN,NFC,FM,和/或IR技术等。所述GNSS可以包括全球卫星定位系统(global positioning system,GPS),全球导航卫星系统(global navigation satellite system,GLONASS),北斗卫星导航系统(beidou navigation satellite system,BDS),准天顶卫星系统(quasi-zenith satellite system,QZSS)和/或星基增强系统(satellite based augmentation systems,SBAS)。
手机100通过GPU,显示屏194,以及应用处理器等实现显示功能。GPU为图像处理的微处理器,连接显示屏194和应用处理器。GPU用于执行数学和几何计算,用于图形渲染。处理器110可包括一个或多个GPU,其执行程序指令以生成或改变显示信息。
显示屏194用于显示图像,视频等。显示屏194包括显示面板。显示面板可以采用液晶显示屏(liquid crystal display,LCD),有机发光二极管(organic light-emitting diode,OLED),有源矩阵有机发光二极管(active-matrix organic light emitting diode,AMOLED),柔性发光二极管(flex light-emitting diode,FLED),Mini-LED,Micro-LED,Micro-OLED,量子点发光二极管(quantum dot light emitting diodes,QLED)等。在一些实施例中,手机100可以包括1个或N个显示屏194,N为大于1的正整数。
手机100可以通过ISP,摄像头193,视频编解码器,GPU,显示屏194以及应用处理器等实现拍摄功能。
ISP用于处理摄像头193反馈的数据。例如,拍照时,打开快门,光线通过镜头被传递到摄像头感光元件上,光信号转换为电信号,摄像头感光元件将所述电信号传递给ISP处理,转化为肉眼可见的图像。ISP还可以对图像的噪点,亮度,肤色进行算法优化。ISP还可以对拍摄场景的曝光,色温等参数优化。在一些实施例中,ISP可以设置在摄像头193中。
摄像头193用于捕获静态图像或视频。物体通过镜头生成光学图像投射到感光元件。感光元件可以是电荷耦合器件(charge coupled device,CCD)或互补金属氧化物半导体(complementary metal-oxide-semiconductor,CMOS)光电晶体管。感光元件把光信号转换成电信号,之后将电信号传递给ISP转换成数字图像信号。ISP将数字图像信号输出到DSP加工处理。DSP将数字图像信号转换成标准的RGB,YUV等格式的图像信号。在一些实施例中,手机100可以包括1个或N个摄像头193,N为大于1的正整数。
数字信号处理器用于处理数字信号,除了可以处理数字图像信号,还可以处理其他数字信号。例如,当手机100在频点选择时,数字信号处理器用于对频点能量进行傅里叶变换等。
视频编解码器用于对数字视频压缩或解压缩。手机100可以支持一种或多种视频编解码器。这样,手机100可以播放或记录多种编码格式的视频,例如:动态图像专家组(moving picture experts group,MPEG)1,MPEG2,MPEG3,MPEG4等。
NPU为神经网络(neural-network,NN)计算处理器,通过借鉴生物神经网络结构,例如借鉴人脑神经元之间传递模式,对输入信息快速处理,还可以不断的自学习。通过NPU可以实现手机100的智能认知等应用,例如:图像识别,人脸识别,语音识别,文本理解等。
外部存储器接口120可以用于连接外部存储卡,例如Micro SD卡,实现扩展手机100的存储能力。外部存储卡通过外部存储器接口120与处理器110通信,实现数据存储功能。例如将音乐,视频等文件保存在外部存储卡中。在本申请的一些实施例中,可以将手机100所记录的操作信息或手机100所接收的智能电视200记录的操作信息存入外部存储器中,以供生成自动化操作指令时调用。
内部存储器121可以用于存储计算机可执行程序代码,所述可执行程序代码包括指令。内部存储器121可以包括存储程序区和存储数据区。其中,存储程序区可存储操作系统,至少一个功能所需的应用程序(比如声音播放功能,图像播放功能等)等。存储数据区可存储手机100使用过程中所创建的数据(比如音频数据,电话本等)等。此外,内部存储器121可以包括高速随机存取存储器,还可以包括非易失性存储器,例如至少一个磁盘存储器件,闪存器件,通用闪存存储器(universal flash storage,UFS)等。处理器110通过运行存储在内部存储器121的指令,和/或存储在设置于处理器中的存储器的指令,执行手机100的各种功能应用以及数据处理。在本申请的一些实施例中,可以将手机100的处理器110基于操作信息生成的对应于各项操作的自动化操作指令存入内部存储器121中(例如存为可执行文件),以供处理器110调用。
手机100可以通过音频模块170,扬声器170A,受话器170B,麦克风170C,耳机接口170D,以及应用处理器等实现音频功能。例如音乐播放,录音等。
音频模块170用于将数字音频信息转换成模拟音频信号输出,也用于将模拟音频输入转换为数字音频信号。音频模块170还可以用于对音频信号编码和解码。在一些实施例中,音频模块170可以设置于处理器110中,或将音频模块170的部分功能模块设置于处理器110中。
扬声器170A,也称“喇叭”,用于将音频电信号转换为声音信号。手机100可以通过扬声器170A收听音乐,或收听免提通话。
受话器170B,也称“听筒”,用于将音频电信号转换成声音信号。当手机100接听电话或语音信息时,可以通过将受话器170B靠近人耳接听语音。
麦克风170C,也称“话筒”,“传声器”,用于将声音信号转换为电信号。当拨打电话或发送语音信息时,用户可以通过人嘴靠近麦克风170C发声,将声音信号输入到麦克风170C。手机100可以设置至少一个麦克风170C。在另一些实施例中,手机100可以设置两个麦克风170C,除了采集声音信号,还可以实现降噪功能。在另一些实施例中,手机100还可以设置三个,四个或更多麦克风170C,实现采集声音信号,降噪,还可以识别声音来源,实现定向录音功能等。
耳机接口170D用于连接有线耳机。耳机接口170D可以是USB接口130,也可以是3.5mm的开放移动电子设备平台(open mobile terminal platform,OMTP)标准接口,美国蜂窝电信工业协会(cellular telecommunications industry association of the USA,CTIA)标准接口。
压力传感器180A用于感受压力信号,可以将压力信号转换成电信号。在一些实施例中,压力传感器180A可以设置于显示屏194。压力传感器180A的种类很多,如电阻式压力传感器,电感式压力传感器,电容式压力传感器等。电容式压力传感器可以是包括至少两个具有导电材料的平行板。当有力作用于压力传感器180A,电极之间的电容改变。手机100根据电容的变化确定压力的强度。当有触摸操作作用于显示屏194,手机100根据压力传感器180A检测所述触摸操作强度。手机100也可以根据压力传感器180A的检测信号计算触摸的位置。在一些实施例中,作用于相同触摸位置,但不同触摸操作强度的触摸操作,可以对应不同的操作指令。例如:当有触摸操作强度小于第一压力阈值的触摸操作作用于短消息应用图标时,执行查看短消息的指令。当有触摸操作强度大于或等于第一压力阈值的触摸操作作用于短消息应用图标时,执行新建短消息的指令。
陀螺仪传感器180B可以用于确定手机100的运动姿态。在一些实施例中,可以通过陀螺仪传感器180B确定手机100围绕三个轴(即,x,y和z轴)的角速度。陀螺仪传感器180B可以用于拍摄防抖。示例性的,当按下快门,陀螺仪传感器180B检测手机100抖动的角度,根据角度计算出镜头模组需要补偿的距离,让镜头通过反向运动抵消手机100的抖动,实现防抖。陀螺仪传感器180B还可以用于导航,体感游戏场景。
气压传感器180C用于测量气压。在一些实施例中,手机100通过气压传感器180C测得的气压值计算海拔高度,辅助定位和导航。
磁传感器180D包括霍尔传感器。手机100可以利用磁传感器180D检测翻盖皮套的开合。在一些实施例中,当手机100是翻盖机时,手机100可以根据磁传感器180D检测翻盖的开合。进而根据检测到的皮套的开合状态或翻盖的开合状态,设置翻盖自动解锁等特性。
加速度传感器180E可检测手机100在各个方向上(一般为三轴)加速度的大小。当手机100静止时可检测出重力的大小及方向。还可以用于识别电子设备姿态,应用于横竖屏切换,计步器等应用。
距离传感器180F,用于测量距离。手机100可以通过红外或激光测量距离。在一些实施例中,拍摄场景,手机100可以利用距离传感器180F测距以实现快速对焦。
接近光传感器180G可以包括例如发光二极管(LED)和光检测器,例如光电二极管。发光二极管可以是红外发光二极管。手机100通过发光二极管向外发射红外光。手机100使用光电二极管检测来自附近物体的红外反射光。当检测到充分的反射光时,可以确定手机100附近有物体。当检测到不充分的反射光时,手机100可以确定手机100附近没有物体。手机100可以利用接近光传感器180G检测用户手持手机100贴近耳朵通话,以便自动熄灭屏幕达到省电的目的。接近光传感器180G也可用于皮套模式,口袋模式自动解锁与锁屏。
环境光传感器180L用于感知环境光亮度。手机100可以根据感知的环境光亮度自适应调节显示屏194亮度。环境光传感器180L也可用于拍照时自动调节白平衡。环境光传感器180L还可以与接近光传感器180G配合,检测手机100是否在口袋里,以防误触。
指纹传感器180H用于采集指纹。手机100可以利用采集的指纹特性实现指纹解锁,访问应用锁,指纹拍照,指纹接听来电等。
温度传感器180J用于检测温度。在一些实施例中,手机100利用温度传感器180J检测的温度,执行温度处理策略。例如,当温度传感器180J上报的温度超过阈值,手机100执行降低位于温度传感器180J附近的处理器的性能,以便降低功耗实施热保护。在另一些实施例中,当温度低于另一阈值时,手机100对电池142加热,以避免低温导致手机100异常关机。在其他一些实施例中,当温度低于又一阈值时,手机100对电池142的输出电压执行升压,以避免低温导致的异常关机。
触摸传感器180K,也称“触控器件”。触摸传感器180K可以设置于显示屏194,由触摸传感器180K与显示屏194组成触摸屏,也称“触控屏”。触摸传感器180K用于检测作用于其上或附近的触摸操作。触摸传感器可以将检测到的触摸操作传递给应用处理器,以确定触摸事件类型。可以通过显示屏194提供与触摸操作相关的视觉输出。在另一些实施例中,触摸传感器180K也可以设置于手机100的表面,与显示屏194所处的位置不同。
按键190包括开机键,音量键等。按键190可以是机械按键。也可以是触摸式按键。手机100可以接收按键输入,产生与手机100的用户设置以及功能控制有关的键信号输入。
马达191可以产生振动提示。马达191可以用于来电振动提示,也可以用于触摸振动反馈。例如,作用于不同应用(例如拍照,音频播放等)的触摸操作,可以对应不同的振动反馈效果。作用于显示屏194不同区域的触摸操作,马达191也可对应不同的振动反馈效果。不同的应用场景(例如:时间提醒,接收信息,闹钟,游戏等)也可以对应不同的振动反馈效果。触摸振动反馈效果还可以支持自定义。
指示器192可以是指示灯,可以用于指示充电状态,电量变化,也可以用于指示消息,未接来电,通知等。
SIM卡接口195用于连接SIM卡。SIM卡可以通过插入SIM卡接口195,或从SIM卡接口195拔出,实现和手机100的接触和分离。手机100通过SIM卡和网络交互,实现通话以及数据通信等功能。在一些实施例中,手机100采用eSIM,即:嵌入式SIM卡。eSIM卡可以嵌在手机100中,不能和手机100分离。
图12是本发明实施例的手机100的软件结构框图。
手机100的软件系统可以采用分层架构,事件驱动架构,微核架构,微服务架构,或云架构。本发明实施例以分层架构的Android系统为例,示例性说明手机100的软件结构。
分层架构将软件分成若干个层,每一层都有清晰的角色和分工。层与层之间通过软件接口通信。在一些实施例中,将Android系统分为四层,从上至下分别为应用程序层,应用程序框架层,安卓运行时(Android runtime)和系统库,以及内核层。
应用程序层可以包括一系列应用程序包。
如图12所示,应用程序包可以包括相机,图库,日历,通话,地图,导航,WLAN,蓝牙,音乐,视频,短信息等应用程序。
应用程序框架层为应用程序层的应用程序提供应用编程接口(application programming interface,API)和编程框架。应用程序框架层包括一些预先定义的函数。
如图12所示,应用程序框架层可以包括窗口管理器,内容提供器,视图系统,电话管理器,资源管理器,通知管理器等。
窗口管理器用于管理窗口程序。窗口管理器可以获取显示屏大小,判断是否有状态栏,锁定屏幕,截取屏幕等。
内容提供器用来存放和获取数据,并使这些数据可以被应用程序访问。所述数据可以包括视频,图像,音频,拨打和接听的电话,浏览历史和书签,电话簿等。
视图系统包括可视控件,例如显示文字的控件,显示图片的控件等。视图系统可用于构建应用程序。显示界面可以由一个或多个视图组成的。例如,包括短信通知图标的显示界面,可以包括显示文字的视图以及显示图片的视图。
电话管理器用于提供手机100的通信功能。例如通话状态的管理(包括接通,挂断等)。
资源管理器为应用程序提供各种资源,比如本地化字符串,图标,图片,布局文件,视频文件等等。
通知管理器使应用程序可以在状态栏中显示通知信息,可以用于传达告知类型的消息,可以短暂停留后自动消失,无需用户交互。比如通知管理器被用于告知下载完成,消息提醒等。通知管理器还可以是以图表或者滚动条文本形式出现在系统顶部状态栏的通知,例如后台运行的应用程序的通知,还可以是以对话窗口形式出现在屏幕上的通知。例如在状态栏提示文本信息,发出提示音,电子设备振动,指示灯闪烁等。
Android Runtime包括核心库和虚拟机。Android runtime负责安卓系统的调度和管理。
核心库包含两部分:一部分是java语言需要调用的功能函数,另一部分是安卓的核心库。
应用程序层和应用程序框架层运行在虚拟机中。虚拟机将应用程序层和应用程序框架层的java文件执行为二进制文件。虚拟机用于执行对象生命周期的管理,堆栈管理,线程管理,安全和异常的管理,以及垃圾回收等功能。
系统库可以包括多个功能模块。例如:表面管理器(surface manager),媒体库(Media Libraries),三维图形处理库(例如:OpenGL ES),2D图形引擎(例如:SGL)等。
表面管理器用于对显示子系统进行管理,并且为多个应用程序提供了2D和3D图层的融合。
媒体库支持多种常用的音频,视频格式回放和记录,以及静态图像文件等。媒体库可以支持多种音视频编码格式,例如:MPEG4,H.264,MP3,AAC,AMR,JPG,PNG等。
三维图形处理库用于实现三维图形绘图,图像渲染,合成,和图层处理等。
2D图形引擎是2D绘图的绘图引擎。内核层是硬件和软件之间的层。内核层至少包含显示驱动,摄像头驱动,音频驱动,传感器驱动。
下面结合投屏场景,示例性说明手机100软件以及硬件的工作流程。
当触摸传感器180K接收到触摸操作,相应的硬件中断被发给内核层。内核层将触摸操作加工成原始输入事件(包括触摸坐标,触摸操作的时间戳等信息)。原始输入事件被存储在内核层。应用程序框架层从内核层获取原始输入事件,识别该输入事件所对应的控件。以该触摸操作是触摸单击操作,该单击操作所对应的控件为“一键投屏”按钮630为例,“一键投屏”按钮630调用应用框架层的接口,启动对应于该按钮630的可执行文件的执行,手机100读取可执行文件中的自动化操作指令,对于执行主体为手机100的自动化执行指令,手机100调用内核层启动触控驱动,执行自动化操作指令中的触控操作指令。
在说明书对“一个实施例”或“实施例”的引用意指结合实施例所描述的具体特征、结构或特性被包括在根据本申请公开的至少一个范例实施方案或技术中。说明书中的各个地方的短语“在一个实施例中”的出现不一定全部指代同一个实施例。
本申请公开还涉及用于执行本文中的操作的装置。该装置可以专门为所要求的目的而构造,或者其可以包括由存储在计算机中的计算机程序选择性地激活或者重新配置的通用计算机。这样的计算机程序可以被存储在计算机可读介质中,诸如,但不限于任何类型的盘,包括软盘、光盘、CD-ROM、磁光盘、只读存储器(ROM)、随机存取存储器(RAM)、EPROM、EEPROM、磁或光卡、专用集成电路(ASIC)或者适于存储电子指令的任何类型的介质,并且每个可以被耦合到计算机系统总线。此外,说明书中所提到的计算机可以包括单个处理器,或者可以是针对增加的计算能力而采用多个处理器的架构。
本文所提出的过程和显示固有地不涉及任何具体计算机或其他装置。各种通用系统也可以与根据本文中的教导的程序一起使用,或者构造更多专用装置以执行一个或多个方法步骤可以证明是方便的。在以下描述中讨论了用于各种这些系统的结构。另外,可以使用足以实现本申请公开的技术和实施方案的任何具体编程语言。各种编程语言可以被用于实施本公开,如本文所讨论的。
另外,在本说明书所使用的语言已经主要被选择用于可读性和指导性的目的并且可能未被选择为描绘或限制所公开的主题。因此,本申请公开旨在说明而非限制本文所讨论的概念的范围。

Claims (30)

  1. 一种电子设备间的分布式实现方法,其特征在于,包括:
    第一电子设备检测到用户的第一指令,所述第一指令用于指令通过第二电子设备完成第一指定任务;
    所述第一电子设备响应于所述第一指令,在第一电子设备上模拟一个或多个第一用户操作,其中所述第一用户操作为:在控制所述第二电子设备完成所述第一指定任务的过程中,需要用户在第一电子设备上进行的用户操作;
    所述第一电子设备指示所述第二电子设备执行所述第一指定任务。
  2. 根据权利要求1所述的方法,其特征在于,还包括:
    所述第一电子设备响应于所述第一指令,生成并向所述第二电子设备发送至少一条模拟指令,其中所述模拟指令用于指示在第二电子设备上模拟第二用户操作,其中,
    所述第二用户操作为:在完成所述第一指定任务的过程中,需要用户在第二电子设备上进行的用户操作。
  3. 根据权利要求2所述的方法,其特征在于,所述第一用户操作或所述第二用户操作,包括下列操作中的任一项:
    运行应用程序的操作;
    输入文字信息的操作;
    连接网络或者连接设备的操作;
    设置参数的操作;
    选择第一指定任务的内容的操作;
    切换页面的操作。
  4. 根据权利要求3所述的方法,其特征在于,所述运行应用程序的操作,包括用户在所述第一电子设备上点击应用程序图标控件以运行应用程序的操作;
    所述输入文字信息的操作,包括用户在所述第一电子设备上点击输入控件以输入文字信息的操作;
    所述连接网络或连接设备的操作,包括用户在所述第一电子设备上点击连接网络或者连接设备的控件以发送连接请求的操作;
    所述设置参数的操作,包括用户在所述第一电子设备上点击参数设置控件以设置参数的操作,和/或用户在所述第一电子设备上滑动以设置参数的操作;
    所述选择第一指定任务的内容的操作,包括用户在所述第一电子设备上点击选择控件以选择所述第一指定任务的内容的操作;
    所述切换页面的操作,包括用户在所述第一电子设备上滑动以切换页面的操作。
  5. 根据权利要求4所述的方法,其特征在于,所述第一指定任务为视频播放任务,所述第一用户操作包括下列中的任一项:
    用户在所述第一电子设备上点击视频应用图标,以运行视频应用的操作;
    用户在所述第一电子设备的显示界面中点击输入法界面上的按键以输入文字信息的操作;
    用户在所述第一电子设备的显示界面中选择所述第二电子设备作为完成所述视频播放任务的设备以发起连接请求的操作;
    用户在所述第一电子设备显示的视频播放界面中点击参数设置按钮以设置参数的操作;
    用户在所述第一电子设备显示的视频应用界面中点击视频内容切换按钮、或者视频选项按钮以选择视频播放任务播放的视频内容的操作;
    用户在所述第一电子设备显示的视频播放界面中上下滑动以设置音量、亮度值或清晰度的操作;
    用户在所述第一电子设备的显示界面上左右滑动以切换页面的操作。
  6. The method according to claim 3, characterized in that the operation of running an application comprises an operation of the user tapping an application icon control on the second electronic device to run an application, or an operation of the user controlling, by means of a remote control device, the second electronic device to run an application;
    the operation of inputting text information comprises an operation of the user tapping an input control on the second electronic device to input text information, or an operation of the user controlling, by means of a remote control device, the input of text information on the second electronic device;
    the operation of connecting to a network or connecting to a device comprises an operation of the user tapping a network-connection or device-connection control on the second electronic device to send a connection request, or an operation of the user controlling, by means of a remote control device, the second electronic device to send a network-connection or device-connection request;
    the operation of setting a parameter comprises an operation of the user tapping a parameter-setting control on the second electronic device to set a parameter, or an operation of the user controlling, by means of a remote control device, the setting of a parameter on the second electronic device;
    the operation of selecting content comprises an operation of the user tapping a selection control on the second electronic device to select the content of the first specified task being executed;
    the operation of switching pages comprises an operation of the user controlling, by means of a remote control device, the second electronic device to switch pages.
  7. The method according to claim 6, characterized in that the first specified task is a video playback task, and the second user operation comprises any one of the following:
    an operation of the user tapping keys of an input-method interface in a display interface of the second electronic device to input text information;
    an operation of the user controlling, by means of a remote control device, the input of text information in the display interface of the second electronic device;
    an operation of the user tapping a parameter-setting button in a video playback interface displayed by the second electronic device to set a parameter;
    an operation of the user tapping a video content switching button or a video option button in the display interface of the second electronic device to select the video content to be played by the video playback task being executed;
    an operation of the user controlling, by means of a remote control device, the setting of the volume, brightness, or video definition on the second electronic device.
  8. The method according to any one of claims 2 to 7, characterized in that the first electronic device stores a first program corresponding to the first instruction, and the method comprises:
    the first electronic device, in response to the first instruction, running the first program, wherein the first program is capable of simulating the first user operation on the first electronic device; and
    the first program is capable of generating and sending the simulation instruction to the second electronic device.
  9. The method according to claim 8, characterized in that the first program comprises an executable file, and the manner of generating the executable file comprises:
    the first electronic device recording first operation data generated on the first electronic device by the first user operation of the user, wherein the first operation data is to be invoked by the first electronic device to execute a first operation instruction in response to the first user operation;
    the first electronic device generating the executable file according to the first operation data and the generation time of the first operation data.
  10. The method according to claim 9, characterized in that the manner of generating the executable file further comprises:
    the first electronic device receiving second operation data recorded by the second electronic device and generated on the second electronic device by the second user operation of the user, wherein the second operation data is to be invoked by the second electronic device to execute a second operation instruction in response to the second user operation;
    the first electronic device generating the executable file according to the first operation data, the generation time of the first operation data, the second operation data, and the generation time of the second operation data.
  11. The method according to claim 10, characterized in that, before the executable file is generated, the method comprises:
    the first electronic device prompting the user to complete the first user operation on the first electronic device.
  12. The method according to claim 11, characterized in that, before the executable file is generated, the method further comprises:
    the first electronic device prompting the user to complete the second user operation on the second electronic device.
  13. The method according to any one of claims 9 to 12, characterized in that the process of the first electronic device recording the first operation data generated on the first electronic device by the first user operation of the user comprises:
    upon detecting that the user performs, on a display interface of the first electronic device, an operation for instructing recording to start, recording the first operation data generated on the first electronic device by the first user operation of the user; and
    upon detecting that the user performs, on the display interface of the first electronic device, an operation for instructing recording to stop, stopping the recording operation.
  14. The method according to any one of claims 10 to 12, characterized in that the process of the first electronic device receiving the second operation data comprises:
    the first electronic device, on the basis of a received operation-focus switching notification sent by the second electronic device, sending a start-recording instruction to the second electronic device, wherein the start-recording instruction is used to instruct the second electronic device to start recording the second operation data generated on the second electronic device by the second user operation of the user;
    the first electronic device, upon detecting that the user performs, on a display interface of the first electronic device, an operation for instructing recording to stop, sending a stop-recording instruction to the second electronic device, wherein the stop-recording instruction is used to instruct the second electronic device to stop the recording operation;
    the first electronic device receiving the second operation data sent by the second electronic device.
  15. The method according to claim 14, characterized in that the order in which the first user operations are simulated on the first electronic device is the same as the order in which the first operation data was generated; and
    the order in which the second user operations are simulated on the second electronic device is the same as the order in which the second operation data was generated.
  16. The method according to claim 2, characterized in that, in the process of simulating the first user operation on the first electronic device, the first electronic device displays the change in the display content of the first electronic device, and/or the change in volume, caused by the first user operation.
  17. The method according to any one of claims 8 to 15, characterized in that the first electronic device stores the first program and a second program corresponding to a second instruction, wherein the second instruction is used to instruct that a second specified task be completed by a third electronic device, and wherein
    the second electronic device is different from the third electronic device; and/or
    the first specified task is different from the second specified task.
  18. The method according to any one of claims 1 to 17, characterized in that the first instruction comprises at least one of a screen projection instruction, a music casting instruction, or a document processing instruction, wherein
    the screen projection instruction is used to instruct that a video playback task be completed by the second electronic device;
    the music casting instruction is used to instruct that a music playback task be completed by the second electronic device;
    the document processing instruction is used to instruct that a document processing task be completed by the second electronic device.
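Claims 9, 13, and 15 above describe recording timestamped operation data between explicit start/stop operations and ordering it into an "executable file" whose replay order matches the recording order. As an illustrative sketch only — the disclosure names no APIs, so every class, method, and action name below (`Recorder`, `capture`, `"tap"`, etc.) is a hypothetical stand-in — that recording step might look like:

```python
import json
import time
from dataclasses import dataclass, field, asdict

@dataclass
class OperationRecord:
    device: str      # "first" (local) or "second" (remote), per claims 9-10
    action: str      # e.g. "tap", "swipe", "input_text" -- hypothetical names
    payload: dict
    timestamp: float = field(default_factory=time.monotonic)

class Recorder:
    """Records operation data only between start() and stop() (claim 13)."""

    def __init__(self) -> None:
        self.records: list[OperationRecord] = []
        self.recording = False

    def start(self) -> None:
        self.recording = True

    def stop(self) -> None:
        self.recording = False

    def capture(self, device: str, action: str, payload: dict) -> None:
        # Operations performed outside the recording window are ignored.
        if self.recording:
            self.records.append(OperationRecord(device, action, payload))

    def to_executable_file(self) -> str:
        # Order by generation time so that replay reproduces the original
        # sequence of user operations (claim 15).
        ordered = sorted(self.records, key=lambda r: r.timestamp)
        return json.dumps([asdict(r) for r in ordered])
```

The stop-recording step would, per claim 25, also trigger sending the second device's records back to the first device; that transport is omitted here.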
  19. A distributed implementation method between electronic devices, characterized by comprising:
    a second electronic device executing a first specified task in response to an instruction, from a first electronic device, instructing that the first specified task be completed;
    the second electronic device, in response to a simulation instruction from the first electronic device instructing that a second user operation be simulated on the second electronic device, simulating the second user operation on the second electronic device, wherein the second user operation is a user operation that a user would need to perform on the second electronic device in the process of completing the first specified task.
  20. The method according to claim 19, characterized in that the second user operation comprises any one of the following operations:
    an operation of running an application;
    an operation of inputting text information;
    an operation of connecting to a network or connecting to a device;
    an operation of setting a parameter;
    an operation of selecting content of the first specified task;
    an operation of switching pages.
  21. The method according to claim 20, characterized in that the operation of running an application comprises an operation of the user tapping an application icon control on the second electronic device to run an application, or an operation of the user controlling, by means of a remote control device, the second electronic device to run an application;
    the operation of inputting text information comprises an operation of the user tapping an input control on the second electronic device to input text information, or an operation of the user controlling, by means of a remote control device, the input of text information on the second electronic device;
    the operation of connecting to a network or connecting to a device comprises an operation of the user tapping a network-connection or device-connection control on the second electronic device to send a connection request, or an operation of the user controlling, by means of a remote control device, the second electronic device to send a network-connection or device-connection request;
    the operation of setting a parameter comprises an operation of the user tapping a parameter-setting control on the second electronic device to set a parameter, or an operation of the user controlling, by means of a remote control device, the setting of a parameter on the second electronic device;
    the operation of selecting content comprises an operation of the user tapping a selection control on the second electronic device to select the content of the first specified task being executed;
    the operation of switching pages comprises an operation of the user controlling, by means of a remote control device, the second electronic device to switch pages.
  22. The method according to claim 21, characterized in that the first specified task is a video playback task, and the second user operation comprises any one of the following:
    an operation of the user tapping keys of an input-method interface in a display interface of the second electronic device to input text information;
    an operation of the user controlling, by means of a remote control device, the input of text information in the display interface of the second electronic device;
    an operation of the user tapping a parameter-setting button in a video playback interface displayed by the second electronic device to set a parameter;
    an operation of the user tapping a video content switching button or a video option button in the display interface of the second electronic device to select the video content to be played by the video playback task being executed;
    an operation of the user controlling, by means of a remote control device, the setting of the volume, brightness, or video definition on the second electronic device.
  23. The method according to any one of claims 19 to 22, characterized in that the simulation instruction is generated by the first electronic device running a first program.
  24. The method according to claim 23, characterized in that the first program comprises an executable file, and the manner of generating the executable file comprises:
    the second electronic device sending second operation data to the first electronic device for use in generating the executable file, wherein the second operation data is data generated on the second electronic device by the second user operation of the user and recorded by the second electronic device, and wherein
    the second operation data is to be invoked by the second electronic device to execute a second operation instruction in response to the second user operation.
  25. The method according to claim 24, characterized in that the process of the second electronic device recording the second operation data generated on the second electronic device by the second user operation of the user comprises:
    the second electronic device, in response to a start-recording instruction sent by the first electronic device, recording the second operation data generated on the second electronic device by the second user operation of the user;
    the second electronic device, in response to a stop-recording instruction sent by the first electronic device, stopping the recording operation and sending the second operation data to the first electronic device.
  26. The method according to claim 25, characterized in that the order in which the second user operations are simulated on the second electronic device is the same as the order in which the second operation data was generated.
  27. The method according to claim 19, characterized in that, in the process of simulating the second user operation on the second electronic device, the second electronic device displays the change in the display content of the second electronic device, and/or the change in volume, caused by the second user operation.
  28. A distributed system, characterized by comprising a first electronic device and a second electronic device, wherein
    the first electronic device is configured to, in response to a first instruction of a user, simulate one or more first user operations on the first electronic device, and to generate and send at least one simulation instruction to the second electronic device, wherein
    the first instruction is used to instruct that a first specified task be completed by the second electronic device, and the first user operations are user operations that the user would need to perform on the first electronic device in the process of controlling the second electronic device to complete the first specified task;
    the simulation instruction is used to instruct that a second user operation be simulated on the second electronic device, and the second user operation is a user operation that the user would need to perform on the second electronic device in the process of completing the first specified task;
    the second electronic device is configured to execute the first specified task in response to the instruction from the first electronic device instructing that the first specified task be completed; and
    the second electronic device is configured to simulate the second user operation on the second electronic device in response to the simulation instruction sent by the first electronic device.
  29. An electronic device, characterized by comprising: one or more processors; and one or more memories, wherein the one or more memories store one or more programs which, when executed by the one or more processors, cause the electronic device to perform the distributed implementation method between electronic devices according to any one of claims 1 to 27.
  30. A computer-readable storage medium, characterized in that instructions are stored on the storage medium, and when executed on a computer, the instructions cause the computer to perform the distributed implementation method between electronic devices according to any one of claims 1 to 27.
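Taken together, the controller-side claims (1–2, 15) and the target-side claims (19, 26) describe replaying the recorded operation data in generation order: operations recorded on the first device are simulated locally, while simulation instructions for operations recorded on the second device are sent to that device. A minimal sketch, with entirely hypothetical callback and field names (the disclosure specifies no data format or API):

```python
import json

def replay(executable_file: str, run_local, send_simulation_instruction) -> None:
    """Replay an 'executable file' of recorded operations, in generation order.

    run_local(action, payload): simulate a first user operation on this device.
    send_simulation_instruction(action, payload): instruct the second device
    to simulate a second user operation (claims 2 and 19).
    Both callbacks are hypothetical stand-ins for transport/UI layers.
    """
    for record in json.loads(executable_file):
        if record["device"] == "first":
            run_local(record["action"], record["payload"])
        else:
            send_simulation_instruction(record["action"], record["payload"])

# Example: a cast-video flow -- open the video app locally, then have the
# second device (e.g. a TV) type the search text and set the volume.
recorded = json.dumps([
    {"device": "first", "action": "tap_app_icon", "payload": {"app": "video"}},
    {"device": "second", "action": "input_text", "payload": {"text": "movie"}},
    {"device": "second", "action": "set_volume", "payload": {"level": 30}},
])
local_ops, remote_ops = [], []
replay(recorded,
       run_local=lambda a, p: local_ops.append(a),
       send_simulation_instruction=lambda a, p: remote_ops.append(a))
```

Because the file is already ordered by generation time, a single linear pass preserves the replay order required by claims 15 and 26.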
PCT/CN2022/114797 2021-09-29 2022-08-25 Distributed implementation method, system, electronic device and storage medium WO2023051116A1 (zh)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN202111154648.0 2021-09-29
CN202111154648.0A CN115878180A (zh) 2021-09-29 2021-09-29 Distributed implementation method, system, electronic device and storage medium

Publications (1)

Publication Number Publication Date
WO2023051116A1 true WO2023051116A1 (zh) 2023-04-06

Family

ID=85756357

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2022/114797 WO2023051116A1 (zh) Distributed implementation method, system, electronic device and storage medium 2021-09-29 2022-08-25

Country Status (2)

Country Link
CN (1) CN115878180A (zh)
WO (1) WO2023051116A1 (zh)

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2008097006A1 (en) * 2007-02-09 2008-08-14 Kaonmedia Co., Ltd. Record control apparatus for mobile terminal, and method for the same
CN105282587A (zh) * 2015-07-13 2016-01-27 深圳市美贝壳科技有限公司 Mobile-phone-based method for synchronous, visual control of a smart TV
CN112817790A (zh) * 2021-03-02 2021-05-18 Tencent Music Entertainment Technology (Shenzhen) Co., Ltd. Method for simulating user behavior


Also Published As

Publication number Publication date
CN115878180A (zh) 2023-03-31

Similar Documents

Publication Publication Date Title
WO2021052263A1 (zh) Voice assistant display method and apparatus
KR102470275B1 (ko) Voice control method and electronic device
CN111345010B (zh) Multimedia content synchronization method, electronic device, and storage medium
WO2021063343A1 (zh) Voice interaction method and apparatus
WO2021213164A1 (zh) Application interface interaction method, electronic device, and computer-readable storage medium
JP7180007B2 (ja) Translation method and electronic device
CN113691842B (zh) Cross-device content projection method and electronic device
CN111666119A (zh) Method for displaying a UI component and electronic device
WO2022017393A1 (zh) Display interaction system, display method, and device
CN113496426A (zh) Service recommendation method, electronic device, and system
WO2022068819A1 (zh) Interface display method and related apparatus
WO2022042770A1 (zh) Method for controlling communication service state, terminal device, and readable storage medium
WO2022042326A1 (zh) Display control method and related apparatus
WO2022135157A1 (zh) Page display method and apparatus, electronic device, and readable storage medium
CN114040242A (zh) Screen projection method and electronic device
CN112383664B (zh) Device control method, first terminal device, second terminal device, and computer-readable storage medium
WO2022160991A1 (zh) Permission control method and electronic device
WO2022007707A1 (zh) Home device control method, terminal device, and computer-readable storage medium
WO2021104122A1 (zh) Call demand response method and apparatus, and electronic device
WO2024045801A1 (zh) Method for taking a screenshot, electronic device, medium, and program product
WO2022143258A1 (zh) Voice interaction processing method and related apparatus
WO2022033355A1 (zh) Email processing method and electronic device
WO2022002213A1 (zh) Translation result display method and apparatus, and electronic device
WO2021238740A1 (zh) Screenshot method and electronic device
WO2022001279A1 (zh) Cross-device desktop management method, first electronic device, and second electronic device

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 22874508

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE