CN114510186A - Cross-device control method and device - Google Patents

Cross-device control method and device

Info

Publication number
CN114510186A
CN114510186A
Authority
CN
China
Prior art keywords
electronic device
screen
electronic equipment
control
instruction
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202011173806.2A
Other languages
Chinese (zh)
Inventor
马良川
杜奕全
陆建海
闫玉林
Current Assignee
Huawei Technologies Co Ltd
Original Assignee
Huawei Technologies Co Ltd
Priority date
Filing date
Publication date
Application filed by Huawei Technologies Co Ltd filed Critical Huawei Technologies Co Ltd
Priority to CN202011173806.2A
Publication of CN114510186A
Legal status: Pending

Classifications

    • G06F3/04845: GUI interaction techniques for image manipulation, e.g. dragging, rotation, expansion or change of colour
    • G06F3/04847: GUI interaction techniques to control parameter settings, e.g. interaction with sliders or dials
    • G06F3/0488: GUI interaction using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F9/451: Execution arrangements for user interfaces
    • G11B19/025: 'Virtual' control panels, e.g. Graphical User Interface [GUI]
    • G11B20/10527: Audio or video recording; data buffering arrangements
    • H04L12/2816: Controlling appliance services of a home automation network by calling their functionalities
    • G11B2020/10546: Audio or video recording specifically adapted for audio data

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Software Systems (AREA)
  • Signal Processing (AREA)
  • Automation & Control Theory (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Multimedia (AREA)
  • Selective Calling Equipment (AREA)

Abstract

The application provides a cross-device control method and device, applied to a system consisting of a first electronic device and a second electronic device. The method comprises the following steps: the first electronic device and the second electronic device establish a wireless communication connection; the first electronic device, in response to a received first operation, executes a first target acquisition operation; the first electronic device, in response to a received second operation, sends a first instruction to the second electronic device, the first instruction instructing the second electronic device to execute a second target acquisition operation; and the second electronic device, in response to the received first instruction, executes the second target acquisition operation. The target acquisition operation comprises a screen capture operation, a screen recording operation or a sound recording operation. In this scheme, after the first electronic device and the second electronic device establish the communication connection, the first electronic device controls itself and the second electronic device, according to the different operations it receives, to execute a screen capture, screen recording or sound recording operation, so that quick cross-device screen capture, screen recording or sound recording control can be realized.

Description

Cross-device control method and device
Technical Field
The present application relates to the field of electronic devices, and in particular, to a cross-device control method and device.
Background
At present, functions such as quick screen capture and screen recording are widely used on electronic devices. Through a shortcut operation, a user can quickly capture or record the screen content displayed by an electronic device. However, such shortcut operations act only on the electronic device itself; the application scenario is limited and cannot cover multi-device scenarios, such as a device linkage scenario in which one electronic device controls other electronic devices. Therefore, current screen capture and recording methods cannot realize cross-device screen capture and recording control.
Disclosure of Invention
The application provides a cross-device control method and device for cross-device screen capture, screen recording and sound recording control, improving user experience.
In a first aspect, an embodiment of the present application provides a cross-device control method, which is applied to a system formed by a first electronic device and a second electronic device, and the method includes:
the first electronic device and the second electronic device establish a wireless communication connection;
the first electronic device, in response to a received first operation, executes a first target acquisition operation;
the first electronic device, in response to a received second operation, sends a first instruction to the second electronic device, the first instruction instructing the second electronic device to execute a second target acquisition operation;
the second electronic device, in response to the received first instruction, executes the second target acquisition operation;
wherein the target acquisition operation comprises: a screen capture operation, a screen recording operation or a sound recording operation.
According to this method, after the first electronic device and the second electronic device establish the communication connection, the first electronic device, on receiving different operations, controls itself and the second electronic device respectively to execute a screen capture, screen recording or sound recording operation. A user can therefore, through corresponding operations on the first electronic device, make either the first or the second electronic device capture the screen, record the screen or record sound, realizing cross-device screen capture, screen recording and sound recording control over different target objects and improving user experience.
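The first-aspect flow above can be sketched as a small dispatcher on the first (controller) device. This is an illustrative sketch only; the class name `ControllerDevice`, the operation names and the `{"cmd": "capture"}` payload are hypothetical, not taken from the patent:

```python
# Hypothetical sketch of the first aspect: the controller device performs
# a capture locally for a "first operation" and forwards an instruction
# over the wireless link for a "second operation".

CAPTURE_OPS = {"screenshot", "screen_record", "audio_record"}

class ControllerDevice:
    def __init__(self, send_instruction):
        # send_instruction(payload) stands in for the established
        # wireless communication connection to the second device.
        self.send_instruction = send_instruction
        self.log = []

    def perform_capture(self, op):
        # Placeholder for the local screenshot / recording implementation.
        self.log.append(("local", op))

    def on_user_operation(self, target, op):
        assert op in CAPTURE_OPS
        if target == "self":           # "first operation": act locally
            self.perform_capture(op)
        else:                          # "second operation": instruct peer
            self.send_instruction({"cmd": "capture", "op": op})

sent = []
dev = ControllerDevice(sent.append)
dev.on_user_operation("self", "screenshot")      # first operation
dev.on_user_operation("remote", "screen_record")  # second operation
```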
In one possible design, before the second target acquisition operation is performed by the second electronic device, the method further includes:
the first electronic device determines the transmission delay between itself and the second electronic device and sends the transmission delay to the second electronic device; or
the second electronic device determines the transmission delay.
With this method, once the second electronic device knows the transmission delay between the two devices, it can accurately determine the time at which the user indicated the target acquisition operation and, based on that time, accurately execute the screen capture, screen recording or sound recording operation.
In one possible design, the determining, by the first electronic device, a transmission delay with the second electronic device includes:
the first electronic device sends a heartbeat data packet to the second electronic device according to a heartbeat cycle;
the second electronic device receives the heartbeat data packet and sends a heartbeat response data packet corresponding to the heartbeat data packet to the first electronic device;
the first electronic device receives the heartbeat response data packet and determines the transmission delay according to the sending time of the heartbeat data packet and the receiving time of the heartbeat response data packet.
In this method, because the time information in the heartbeat packet exchange is accurate, the first electronic device can obtain a relatively accurate transmission delay from the exchange. Moreover, since the exchange is executed periodically, the determined transmission delay is updated in time, which ensures its accuracy.
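Assuming a roughly symmetric link (an assumption, not stated in the claim), the round-trip heartbeat measurement above reduces to halving the measured round-trip time. A minimal sketch, with an illustrative function name:

```python
def delay_from_rtt(send_time, ack_receive_time):
    """Estimate one-way transmission delay as half the round-trip time
    between sending a heartbeat packet and receiving its response.
    Assumes the uplink and downlink delays are roughly equal."""
    return (ack_receive_time - send_time) / 2

# e.g. heartbeat sent at t = 100.0 ms, response received at t = 130.0 ms
print(delay_from_rtt(100.0, 130.0))  # one-way delay estimate: 15.0 ms
```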
In one possible design, the determining, by the second electronic device, the transmission delay includes:
the first electronic device sends a heartbeat data packet carrying a sending timestamp to the second electronic device according to a heartbeat cycle;
the second electronic device receives the heartbeat data packet and determines the transmission delay according to the receiving time of the heartbeat data packet and the sending timestamp.
In this method, the first electronic device carries the sending time in the heartbeat packet itself, so the second electronic device can determine the transmission delay simply by receiving the packet, without returning a heartbeat response to the first device. This simplifies the interaction between the two devices during delay determination and speeds it up.
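This timestamp-carrying variant can be sketched as a single subtraction on the receiving side. Note the implicit assumption, not spelled out in the claim, that the two device clocks are synchronized; otherwise the result also contains the clock offset:

```python
def delay_from_timestamp(receive_time, send_timestamp):
    """One-way delay from a sender-stamped heartbeat packet.
    Only equals the true transmission delay if both device clocks
    are synchronized (assumption made explicit here)."""
    return receive_time - send_timestamp

# e.g. packet stamped t = 200.0 ms by the sender, received at t = 205.0 ms
print(delay_from_timestamp(205.0, 200.0))  # 5.0 ms
```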
In one possible design, the determining, by the second electronic device, the transmission delay includes:
the second electronic device sends a heartbeat data packet to the first electronic device according to a heartbeat cycle;
the first electronic device receives the heartbeat data packet and sends a heartbeat response data packet corresponding to the heartbeat data packet to the second electronic device;
the second electronic device receives the heartbeat response data packet and determines the transmission delay according to the sending time of the heartbeat data packet and the receiving time of the heartbeat response data packet.
In this method, the second electronic device can determine a relatively accurate transmission delay from its own heartbeat exchange with the first electronic device, without requiring the first electronic device to report the delay. Since the exchange is executed periodically, the transmission delay is updated in time, which ensures its accuracy.
In one possible design, the determining, by the second electronic device, the transmission delay includes:
the second electronic device sends a heartbeat data packet carrying a sending timestamp to the first electronic device according to a heartbeat cycle;
the first electronic device receives the heartbeat data packet and sends a heartbeat response data packet corresponding to the heartbeat data packet to the second electronic device, the heartbeat response data packet including the sending timestamp from the heartbeat data packet;
the second electronic device receives the heartbeat response data packet and determines the transmission delay according to the receiving time of the heartbeat response data packet and the sending timestamp.
In this method, the same time information is carried in the heartbeat packets exchanged between the two devices. On one hand, the second electronic device can obtain a relatively accurate transmission delay from the exchange; on the other hand, the shared timestamp identifies the correspondence between a heartbeat packet and its response, ensuring the transmission delay is determined from one successful heartbeat round trip.
In one possible design, the first instruction carries a sending timestamp of the first instruction, and the second electronic device determines the transmission delay, including:
and the second electronic equipment determines the transmission delay according to the receiving time and the sending time stamp of the first instruction.
In the method, the first electronic device directly informs the second electronic device of the indication time of the target acquisition operation, and the second electronic device can quickly execute the target acquisition operation according to the indication time after receiving the instruction, so that the process of calculating the indication time by the second electronic device is omitted, and the response speed of the second electronic device can be improved.
In one possible design, the method further includes:
the second electronic device sends the acquired data obtained by executing the second target acquisition operation to the first electronic device;
the first electronic device displays the received acquired data.
With this method, after the second electronic device executes the target acquisition operation (screen capture, screen recording or sound recording), the resulting data is returned to the first electronic device and presented to the user there, so the captured or recorded data is fed back in time. After controlling the second electronic device through the first electronic device, the user can directly and promptly view the resulting data on the first electronic device, which gives a good user experience.
In one possible design, the system further includes a third electronic device, and before the second electronic device transmits the collected data to the first electronic device, the method further includes:
the first electronic equipment and the third electronic equipment establish wireless communication connection;
the first electronic device responds to the received second operation and sends a second instruction to the third electronic device, and the second instruction is used for instructing the third electronic device to execute a third target acquisition operation;
the third electronic equipment responds to the received second instruction and executes the third target acquisition operation;
the first electronic device displays a first option and a second option;
the first electronic device, in response to an operation on the first option, sends a third instruction to the second electronic device, the third instruction instructing the second electronic device to send the acquired data to the first electronic device;
the first electronic device, in response to an operation on the second option, sends a fourth instruction to the third electronic device, the fourth instruction instructing the third electronic device to send the acquired data to the first electronic device.
In this method, the first electronic device can control screen capture, screen recording or sound recording on multiple electronic devices, so the method applies to cross-device control in multi-device scenarios, further improving its generality. When the first electronic device controls multiple electronic devices in this way, the data produced by the target acquisition operation of whichever device the user selects can be fed back to the user on demand, further improving user experience.
In one possible design, the second electronic device performs the second target acquisition operation, including:
and if the screen projection window from the first electronic equipment to the second electronic equipment is included in the display screen of the second electronic equipment, the second electronic equipment executes the second target acquisition operation on other display areas except the screen projection window on the display screen.
In the method, when the second electronic equipment receives the indication of the first electronic equipment and performs target acquisition operation under the condition of receiving screen projection of the first electronic equipment, the target acquisition operation is performed on the area except the area from the screen projection of the first electronic equipment to the screen projection window of the second electronic equipment on the real screen, so that the influence of screen projection can be eliminated, and the acquired data returned to the first electronic equipment is guaranteed to be the data of the second electronic equipment.
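One way to realize the exclusion of the projection window is to compute a capture region as the screen area minus the window. The sketch below is a hypothetical simplification that only handles a projection window docked to the left or right edge of the screen; the patent does not prescribe this geometry:

```python
def capture_region(screen_w, screen_h, win):
    """Return the (x, y, w, h) region to capture, excluding the
    screen-projection window `win` = (x, y, w, h).
    Simplifying assumption: the window spans the full screen height
    and is docked to the left or right edge."""
    x, y, w, h = win
    if x == 0:
        # Window docked at the left edge: capture everything to its right.
        return (w, 0, screen_w - w, screen_h)
    # Window docked at the right edge: capture everything to its left.
    return (0, 0, x, screen_h)

# 1920x1080 screen with a 480-px-wide projection window on the left
print(capture_region(1920, 1080, (0, 0, 480, 1080)))  # (480, 0, 1440, 1080)
```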
In one possible design, the second electronic device performs the second target acquisition operation, including:
the second electronic device executes a continuous screen capture operation within a target time period of preset length; or
the second electronic device executes a screen recording operation or a sound recording operation starting from a target time point;
wherein the target time point is obtained by subtracting the transmission delay from the receiving time point of the first instruction, and the central time point of the target time period is the target time point.
Because there is a transmission delay when the first electronic device sends the instruction to the second electronic device, the second electronic device would otherwise execute the target acquisition operation late. In this method, the second electronic device compensates for this error using the determined transmission delay before executing the operation, so the screen capture, screen recording or sound recording is performed quickly and accurately, further improving user experience.
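The compensation described above can be sketched as backdating the instruction's arrival by the transmission delay and, for continuous screen capture, centring the capture window on that backdated point. Function names are illustrative:

```python
def target_time_point(instruction_receive_time, transmission_delay):
    """Compensated moment at which the user actually issued the
    instruction: receive time minus the measured transmission delay."""
    return instruction_receive_time - transmission_delay

def burst_capture_window(receive_time, delay, period):
    """(start, end) of a continuous-screen-capture window of length
    `period` whose centre is the compensated target time point."""
    centre = target_time_point(receive_time, delay)
    return (centre - period / 2, centre + period / 2)

# Instruction received at t = 1000.0 ms with a 20.0 ms measured delay
print(target_time_point(1000.0, 20.0))          # 980.0
print(burst_capture_window(1000.0, 20.0, 100.0))  # (930.0, 1030.0)
```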
In one possible design, the first operation or the second operation includes at least one of:
an operation acting on a touch display screen of the first electronic device;
an operation acting on at least one key of the first electronic device.
In one possible design, the touch display screen of the first electronic device comprises a first touch operation area and a second touch operation area;
the first touch operation area is used for receiving the first operation, and the second touch operation area is used for receiving the second operation.
With this method, by partitioning the display screen and receiving touch operations for different control objects in different partitions, the first electronic device can distinguish, from the user operation, whether it should execute the target acquisition operation (screen capture, screen recording or sound recording) itself or control another electronic device to execute it, realizing control over different control objects.
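A minimal sketch of such a partition, assuming (hypothetically) that the left half of the screen controls the first device and the right half controls the second; the patent does not fix any particular geometry:

```python
def control_target(touch_x, screen_width):
    """Map a touch position to the control object: left half of the
    screen -> the first device itself, right half -> the second device.
    The half-and-half split is an illustrative assumption."""
    return "first_device" if touch_x < screen_width / 2 else "second_device"

print(control_target(100, 1080))  # first_device
print(control_target(900, 1080))  # second_device
```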
In one possible design, the touch display screen of the first electronic device comprises a first touch operation area and a second touch operation area;
the first touch operation area is used for receiving a first operation controlling the first electronic device to execute the screen capture operation, and is also used for receiving a second operation controlling the second electronic device to execute the screen capture operation;
the second touch operation area is used for receiving a first operation controlling the first electronic device to execute the screen recording operation, and is also used for receiving a second operation controlling the second electronic device to execute the screen recording operation.
In this method, by partitioning the display screen and receiving touch operations indicating different target acquisition operations in different partitions, the first electronic device can distinguish the specific type of target acquisition operation from the user operation, which makes the control scheme more flexible.
In a second aspect, an embodiment of the present application provides a cross-device control method, which is applied to a first electronic device, and includes:
establishing a wireless communication connection with a second electronic device;
executing a first target acquisition operation in response to the received first operation;
responding to the received second operation, and sending a first instruction to the second electronic equipment, wherein the first instruction is used for instructing the second electronic equipment to execute a second target acquisition operation;
wherein the target acquisition operation comprises: screen capture operation, screen recording operation or sound recording operation.
In one possible design, when sending the first instruction to the second electronic device, the method further includes:
determining the transmission delay with the second electronic device, and sending the transmission delay to the second electronic device.
In one possible design, determining the transmission delay with the second electronic device includes:
sending a heartbeat data packet to the second electronic device according to a heartbeat cycle;
receiving a heartbeat response data packet corresponding to the heartbeat data packet returned by the second electronic device;
determining the transmission delay according to the sending time of the heartbeat data packet and the receiving time of the heartbeat response data packet.
In one possible design, the method further includes:
sending a heartbeat data packet carrying a sending timestamp to the second electronic device according to a heartbeat cycle.
In one possible design, the method further includes:
receiving a heartbeat data packet from the second electronic device, and sending a heartbeat response data packet corresponding to the heartbeat data packet to the second electronic device.
In one possible design, the method further includes:
receiving a heartbeat data packet carrying a sending timestamp from the second electronic device, and sending a heartbeat response data packet corresponding to the heartbeat data packet to the second electronic device, the heartbeat response data packet including the sending timestamp from the heartbeat data packet.
In one possible design, the first instruction carries a transmission time stamp of the first instruction.
In one possible design, the method further includes:
receiving acquired data from the second electronic device, the acquired data being obtained by the second electronic device executing the second target acquisition operation;
displaying the received acquired data.
In one possible design, prior to receiving the collected data from the second electronic device, the method further includes:
establishing a wireless communication connection with a third electronic device;
responding to the received second operation, and sending a second instruction to the third electronic device, wherein the second instruction is used for instructing the third electronic device to execute a third target acquisition operation;
displaying a first option and a second option;
responding to the operation of the first option, and sending a third instruction to the second electronic device, wherein the third instruction is used for instructing the second electronic device to send the acquired data to the first electronic device;
and responding to the operation of the second option, and sending a fourth instruction to the third electronic equipment, wherein the fourth instruction is used for instructing the third electronic equipment to send the acquired data to the first electronic equipment.
In one possible design, the first operation or the second operation includes at least one of:
an operation acting on a touch display screen of the first electronic device;
an operation acting on at least one key of the first electronic device.
In one possible design, the touch display screen of the first electronic device comprises a first touch operation area and a second touch operation area;
the first touch operation area is used for receiving the first operation, and the second touch operation area is used for receiving the second operation.
In one possible design, the touch display screen of the first electronic device comprises a first touch operation area, a second touch operation area and a third touch operation area;
the first touch operation area is used for receiving a first operation and/or a second operation controlling execution of the screen capture operation, the second touch operation area is used for receiving a first operation and/or a second operation controlling execution of the screen recording operation, and the third touch operation area is used for receiving a first operation and/or a second operation controlling execution of the sound recording operation.
In a third aspect, an embodiment of the present application provides a cross-device control method, which is applied to a second electronic device, and the method includes:
establishing a wireless communication connection with a first electronic device;
receiving a first instruction from the first electronic device, wherein the first instruction is used for instructing the second electronic device to execute a second target acquisition operation;
executing the second target acquisition operation in response to the first instruction;
wherein the target acquisition operation comprises: screen capture operation, screen recording operation or sound recording operation.
In one possible design, before performing the second target acquisition operation, the method further includes:
receiving, from the first electronic device, the transmission delay between the first electronic device and the second electronic device; or
determining the transmission delay.
In one possible design, before receiving the transmission delay sent by the first electronic device, the method further includes:
receiving a heartbeat data packet from the first electronic device, and sending a heartbeat response data packet corresponding to the heartbeat data packet to the first electronic device.
In one possible design, determining the transmission delay includes:
receiving a heartbeat data packet carrying a sending timestamp from the first electronic device, and determining the transmission delay according to the receiving time of the heartbeat data packet and the sending timestamp.
In one possible design, determining the transmission delay includes:
sending a heartbeat data packet to the first electronic device according to a heartbeat cycle;
receiving a heartbeat response data packet corresponding to the heartbeat data packet returned by the first electronic device, and determining the transmission delay according to the sending time of the heartbeat data packet and the receiving time of the heartbeat response data packet.
In one possible design, determining the transmission delay includes:
sending a heartbeat data packet carrying a sending timestamp to the first electronic device according to a heartbeat period;
receiving a heartbeat response data packet corresponding to the heartbeat data packet returned by the first electronic device, wherein the heartbeat response data packet comprises a sending time stamp in the heartbeat data packet;
determining the transmission delay according to the receiving time of the heartbeat response data packet and the sending timestamp carried in it.
In one possible design, the first instruction carries a sending timestamp of the first instruction, and the determining the transmission delay includes:
determining the transmission delay according to the receiving time of the first instruction and the sending timestamp it carries.
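The delay-determination designs above reduce to two calculations: a one-way estimate from a packet's sending timestamp (which assumes the two devices' clocks are roughly synchronized), and a round-trip estimate that halves the interval between sending a heartbeat data packet and receiving its heartbeat response data packet. A minimal sketch in Python; the function names are illustrative and not from the embodiments:

```python
def one_way_delay(send_timestamp_ms: int, receive_time_ms: int) -> int:
    """One-way estimate: the packet (heartbeat or first instruction) carries
    its sending timestamp; assumes the two devices' clocks are synchronized."""
    return receive_time_ms - send_timestamp_ms

def round_trip_delay(send_time_ms: int, response_receive_time_ms: int) -> float:
    """Round-trip estimate: half the interval between sending a heartbeat
    data packet and receiving the corresponding heartbeat response packet."""
    return (response_receive_time_ms - send_time_ms) / 2
```

For instance, a heartbeat sent at t=1000 ms whose response arrives at t=1080 ms yields an estimated one-way transmission delay of 40 ms.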
In one possible design, the method further includes:
sending collected data to the first electronic device, wherein the collected data is obtained by the second electronic device executing the second target acquisition operation.
In one possible design, before sending the collected data to the first electronic device, the method further includes:
receiving a third instruction sent by the first electronic device, wherein the third instruction is used for instructing the second electronic device to send the collected data to the first electronic device.
In one possible design, the performing the second target acquisition operation includes:
if the display screen of the second electronic device includes a screen projection window from the first electronic device, executing the second target acquisition operation on the display area of the display screen other than the screen projection window.
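As a rough illustration of this design, the sketch below blanks out the screen-projection window region of a captured frame so that only the second electronic device's own display content is kept. This is a simplified Python model with hypothetical names; a real implementation would operate on the device's frame buffer rather than a list of pixel rows:

```python
def capture_excluding_window(frame, win_x, win_y, win_w, win_h, fill=0):
    """Return a copy of the captured frame with the screen-projection
    window rectangle blanked out, keeping only the display areas other
    than the projection window. `frame` is a list of pixel rows."""
    out = [row[:] for row in frame]  # do not mutate the original frame
    for y in range(win_y, min(win_y + win_h, len(out))):
        for x in range(win_x, min(win_x + win_w, len(out[y]))):
            out[y][x] = fill
    return out
```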
In one possible design, the performing the second target acquisition operation includes:
executing a continuous screen capture operation within a target time period of a preset length; or
starting to execute a screen recording operation or a sound recording operation from a target time point;
the target time point is obtained by subtracting the transmission delay from the receiving time point of the first instruction, and the central time point of the target time period is the target time point.
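The timing rule above can be expressed in a few lines: the target time point compensates for the instruction's transmission delay, and the continuous-capture variant uses a window of preset length centered on that point. An illustrative Python sketch, assuming millisecond timestamps (the names are not from the embodiments):

```python
def target_time_point(receive_time_ms: int, transmission_delay_ms: int) -> int:
    """The moment the user actually triggered the operation on the first
    electronic device: the first instruction's receiving time point minus
    the transmission delay."""
    return receive_time_ms - transmission_delay_ms

def target_time_period(receive_time_ms: int, transmission_delay_ms: int,
                       period_len_ms: int) -> tuple:
    """A target time period of preset length whose central time point is
    the target time point, used for continuous screen capture."""
    center = target_time_point(receive_time_ms, transmission_delay_ms)
    return (center - period_len_ms // 2, center + period_len_ms // 2)
```

For example, an instruction received at t=5000 ms with a 40 ms transmission delay gives a target time point of 4960 ms, and a 200 ms capture period spanning 4860–5060 ms.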
In a fourth aspect, embodiments of the present application provide an electronic device that includes a touch display screen, a memory, and one or more processors;
wherein the memory is configured to store computer program code, and the computer program code includes computer instructions; when the one or more processors execute the computer instructions, the electronic device is enabled to perform the method according to the second aspect or any possible design of the second aspect.
In a fifth aspect, embodiments of the present application provide an electronic device, which includes a touch display screen, a memory and one or more processors;
wherein the memory is configured to store computer program code, and the computer program code includes computer instructions; when the one or more processors execute the computer instructions, the electronic device is enabled to perform the method according to the third aspect or any possible design of the third aspect.
In a sixth aspect, an embodiment of the present application provides a chip, where the chip is coupled to a memory in an electronic device, so that when running, the chip invokes a computer program stored in the memory to implement the method of the first aspect or any possible design of the first aspect, the method of the second aspect or any possible design of the second aspect, or the method of the third aspect or any possible design of the third aspect of the embodiments of the present application.
In a seventh aspect, an embodiment of the present application provides a computer storage medium, where the computer storage medium stores a computer program, and when the computer program runs on an electronic device, the electronic device is caused to perform the method of the first aspect or any possible design of the first aspect, the method of the second aspect or any possible design of the second aspect, or the method of the third aspect or any possible design of the third aspect of the embodiments of the present application.
In an eighth aspect, an embodiment of the present application provides a computer program product, which, when run on an electronic device, causes the electronic device to perform the method of the first aspect or any possible design of the first aspect, the method of the second aspect or any possible design of the second aspect, or the method of the third aspect or any possible design of the third aspect of the embodiments of the present application.
Drawings
Fig. 1 is a simplified schematic diagram of an architecture of a cross-device control system according to an embodiment of the present application;
fig. 2 is a schematic structural diagram of a terminal device according to an embodiment of the present application;
fig. 3 is a schematic structural diagram of an android operating system of a terminal device according to an embodiment of the present application;
fig. 4 is a schematic flowchart of a cross-device control method according to an embodiment of the present application;
fig. 5 is a schematic flowchart of a control device implementing cross-device control by using a knuckle operation according to an embodiment of the present application;
fig. 6 is a schematic diagram of an instruction triggering manner in a remote device according to an embodiment of the present application;
fig. 7 is a schematic diagram of an instruction triggering manner in a remote device according to an embodiment of the present application;
fig. 8 is a schematic diagram of an instruction triggering manner in a remote device according to an embodiment of the present application;
fig. 9 is a schematic diagram of an instruction triggering manner in a remote device according to an embodiment of the present application;
fig. 10 is a schematic view of a finger joint operation provided by an embodiment of the present application;
fig. 11 is a schematic flowchart of a cross-device control method according to an embodiment of the present application;
fig. 12 is a schematic effect diagram of a cross-device control method according to an embodiment of the present application;
fig. 13 is a schematic flowchart of a cross-device control method according to an embodiment of the present application;
fig. 14 is a schematic effect diagram of a cross-device control method according to an embodiment of the present application;
fig. 15 is a schematic effect diagram of a cross-device control method according to an embodiment of the present application;
fig. 16 is a schematic diagram of an error time compensation method in cross-device control according to an embodiment of the present application;
fig. 17 is a schematic diagram of a method for determining information transmission delay in cross-device control according to an embodiment of the present application;
fig. 18 is a schematic diagram of a method for determining information transmission delay in cross-device control according to an embodiment of the present application;
fig. 19 is a schematic diagram of a method for determining information transmission delay in cross-device control according to an embodiment of the present application;
fig. 20 is a schematic flowchart of a cross-device control method according to an embodiment of the present application;
fig. 21 is a schematic diagram of a control device outputting display screenshot data according to an embodiment of the present application;
fig. 22 is a schematic view of a multi-screen collaboration interface of a remote device according to an embodiment of the present application;
fig. 23 is a schematic view of a screen capture interface of a remote device according to an embodiment of the present application;
fig. 24 is a schematic diagram of a screen capture interface of a remote device according to an embodiment of the present application;
fig. 25 is a schematic flowchart of a remote device screen capture method according to an embodiment of the present application;
fig. 26 is a schematic diagram of a cross-device control method according to an embodiment of the present application;
fig. 27 is a schematic structural diagram of an electronic device according to an embodiment of the present application.
Detailed Description
In order to make the objects, technical solutions and advantages of the embodiments of the present application clearer, the embodiments of the present application will be described in further detail with reference to the accompanying drawings.
For ease of understanding, some terms used in this application are explained below.
1) The electronic device is a device with a wireless connection function. In some embodiments of the present application, the electronic device may be a portable device, such as a mobile phone, a tablet computer, a wearable device with wireless communication function (e.g., a watch, a bracelet, a helmet, an earphone, etc.), an in-vehicle terminal device, an Augmented Reality (AR)/Virtual Reality (VR) device, a notebook computer, an ultra-mobile personal computer (UMPC), a netbook, a Personal Digital Assistant (PDA), a smart home device (e.g., a smart television, a smart speaker, etc.), and other smart terminal devices.
In some embodiments of the present application, the electronic device may also be a portable terminal device that further includes other functions, such as a personal digital assistant function and/or a music player function. Exemplary embodiments of the portable terminal device include, but are not limited to, portable terminal devices running various operating systems. The portable terminal device may also be another portable terminal device, such as a laptop computer (Laptop) with a touch-sensitive surface (e.g., a touch panel). It should also be understood that, in some other embodiments of the present application, the electronic device may not be a portable terminal device but a desktop computer with a touch-sensitive surface (e.g., a touch panel).
2) The control device is an electronic device used for controlling other electronic devices so that the other electronic devices implement certain functions. Illustratively, the control device may be an electronic device that the user uses frequently and can operate conveniently, such as a mobile phone or a tablet computer. In the embodiments of the present application, the control device is an electronic device capable of playing/displaying multimedia files.
For convenience of description, in the embodiments of the present application, the control device is described as the device that issues a screen capture, screen recording, or sound recording operation instruction.
3) The remote device is an electronic device that is controlled by the control device and executes corresponding operations. For example, the remote device may be an electronic device that the user does not use frequently, that is inconvenient to operate, or that cannot be carried around, for example, a smart home device (e.g., a smart television or a smart speaker). In the embodiments of the present application, the remote device may be an electronic device capable of playing/displaying multimedia files.
In some embodiments of the present application, the remote device may be an electronic device having at least one of the image display, video playing, and audio playing functions, for example, a smart television, a smart speaker, and the like; the remote device may also be another electronic device of the same type as the control device.
For convenience of description, in the embodiments of the present application, the remote device is described as the device that performs the screen capture operation, the screen recording operation, or the sound recording operation.
4) The multimedia file is a data file for storing and managing information of at least one media type, and can provide various forms of information display for users. The media types of the information include text, sound, image, video and other forms. In the embodiment of the present application, the multimedia file may be a picture file, a video file, an audio file, a three-dimensional model file, or the like.
In the embodiments of the present application, multimedia files may be divided into static multimedia files and dynamic multimedia files. A static multimedia file is a file whose data content remains static, such as a static image; a dynamic multimedia file is a file whose data content changes dynamically over time, such as a dynamic image, a video, an audio file, or a three-dimensional model file. A dynamic image or video may be understood as a collection of static images that are switched over time to realize a dynamic effect.
It should be understood that "at least one" in the embodiments of the present application means one or more, "a plurality" means two or more. "and/or" describes the association relationship of the associated object, indicating that there may be three relationships, for example, a and/or B, which may indicate: a alone, both A and B, and B alone, where A, B may be singular or plural. The character "/" generally indicates that the former and latter associated objects are in an "or" relationship. "at least one of the following" or similar expressions refer to any combination of these items, including any combination of the singular or plural items. For example, at least one (one) of a, b, or c, may represent: a, b, c, a and b, a and c, b and c, or a, b and c, wherein a, b and c can be single or multiple.
At present, an electronic device can be connected to other electronic devices and control them to implement corresponding functions, so as to improve the usability and convenience of cross-device control.
To realize cross-device control, the present application provides a cross-device control method in which cross-device screen capture, screen recording, or sound recording control can be implemented through application software. In this method, both the control device and the remote device install an application (APP) that implements the cross-device control function; the control device sends a control instruction through the APP, and the remote device receives the control instruction through the APP and executes the operation corresponding to the control instruction, so that the control device controls the remote device. This control mode has strong applicability: any two electronic devices that support installing the APP can serve as the control device and the remote device, respectively, to implement cross-device control. However, the operation of this mode is relatively complex, and the information interaction between the control device and the remote device has a large delay, which easily makes it impossible to accurately capture the dynamic content displayed on the remote device. In addition, after the remote device performs operations such as screen capture and screen recording under the control of the control device, the acquired files cannot be transmitted back to the control device in real time, so the control device obtains the screen capture and screen recording files from the remote device slowly.
When the control device controls the remote device to perform operations such as screen capture, screen recording, or sound recording and acquires the related result data from the remote device, the operation is complex and the accuracy and real-time performance are low, which cannot meet the user's requirements for real-time and rapid operation. In view of this, the present application further provides another cross-device control method, which is applied to a system formed by a first electronic device and a second electronic device, and in particular may be applied to cross-device screen capture, screen recording, or sound recording scenarios.
In the embodiment of the application, the first electronic device and the second electronic device are both electronic devices capable of playing/displaying multimedia files.
In this embodiment of the application, the first electronic device serves as a control-side device and can control the second electronic device to execute some operations, and the second electronic device serves as a controlled-side device and can execute corresponding operations according to the control of the first electronic device. It can be understood that the roles of the control-side device and the controlled-side device may be switched, that is, when the second electronic device serves as the control-side device, the first electronic device may also serve as the controlled-side device.
Hereinafter, for convenience of description, the first electronic device is referred to as a "control device", and the second electronic device is referred to as a "remote device".
Fig. 1 is a system architecture diagram of cross-device control according to an embodiment of the present application. As shown in fig. 1, the system architecture may include: a control device 101 (e.g., a cell phone as shown in fig. 1) and a remote device 102 (e.g., a smart tv as shown in fig. 1).
The control device 101 is configured to control the remote device 102 to perform a screen capture, screen recording or sound recording operation, and obtain corresponding screen capture, screen recording or sound recording data from the remote device 102. The remote device 102 is configured to perform a screen capture operation, a screen recording operation, or a sound recording operation according to the control of the control device 101, and return corresponding screen capture, screen recording, or sound recording data to the control device 101 in real time.
In this system, communication is enabled between the control device 101 and the remote device 102. Optionally, the control device 101 and the remote device 102 may access the same local area network, or may access different local area networks.
In an example that the control device 101 and the remote device 102 access the same local area network, specifically, the following may be used: the control device 101 and the remote device 102 establish a wireless connection with the same wireless access point.
For example, the control device 101 and the remote device 102 may access the same wireless fidelity (Wi-Fi) hotspot. For another example, the control device 101 and the remote device 102 may also access the same Bluetooth beacon through the Bluetooth protocol. For another example, the control device 101 and the remote device 102 may also trigger a communication connection through a near field communication (NFC) tag, and transmit encrypted information through a Bluetooth module for identity authentication; after the authentication succeeds, data transmission is performed in a point-to-point (P2P) manner.
It should be noted that the system shown in fig. 1 does not limit the applicable scenarios of the cross-device control method provided in the present application. For example, in some scenarios, a cell phone may act as a control device to control other remote devices; in other scenarios, the mobile phone can also be used as a remote device to be controlled by other control devices.
It should be noted that, the present application also does not limit the number of remote devices controlled by each control device, and may control one remote device, or may control a plurality of remote devices, for example, 3 or 4 remote devices.
Referring to fig. 2, a structure of an electronic device to which the method provided by the embodiment of the present application is applied will be described.
As shown in fig. 2, the electronic device 200 may include a processor 210, an external memory interface 220, an internal memory 221, a USB interface 230, a charging management module 240, a power management module 241, a battery 242, an antenna 1, an antenna 2, a mobile communication module 250, a wireless communication module 260, an audio module 270, a speaker 270A, a receiver 270B, a microphone 270C, an earphone interface 270D, a sensor module 280, keys 290, a motor 291, an indicator 292, a camera 293, a display 294, a SIM card interface 295, and the like. Wherein the sensor module 280 may include a gyroscope sensor, an acceleration sensor, a proximity light sensor, a fingerprint sensor, a touch sensor, a temperature sensor, a pressure sensor, a distance sensor, a magnetic sensor, an ambient light sensor, an air pressure sensor, a bone conduction sensor, and the like.
It is to be understood that the electronic device 200 shown in fig. 2 is merely an example and is not to be construed as limiting the electronic device, and that the electronic device may have more or fewer components than shown in the figures, may combine two or more components, or may have a different configuration of components. The various components shown in fig. 2 may be implemented in hardware, software, or a combination of hardware and software, including one or more signal processing and/or application specific integrated circuits.
Processor 210 may include one or more processing units, such as: the processor 210 may include an Application Processor (AP), a modem processor, a Graphics Processing Unit (GPU), an Image Signal Processor (ISP), a controller, a memory, a video codec, a Digital Signal Processor (DSP), a baseband processor, and/or a Neural-Network Processing Unit (NPU), etc. The different processing units may be separate devices or may be integrated into one or more processors. The controller may be, among other things, a neural center and a command center of the electronic device 200. The controller can generate an operation control signal according to the instruction operation code and the timing signal to complete the control of instruction fetching and instruction execution.
A memory may also be provided in processor 210 for storing instructions and data. In some embodiments, the memory in the processor 210 is a cache memory. The memory may hold instructions or data that have just been used or recycled by processor 210. If the processor 210 needs to use the instruction or data again, it can be called directly from the memory. Avoiding repeated accesses reduces the latency of the processor 210, thereby increasing the efficiency of the system.
The execution of the cross-device control method provided by the embodiment of the present application may be controlled or completed by calling other components by the processor 210, for example, calling the processing program of the embodiment of the present application stored in the internal memory 221, or calling the processing program of the embodiment of the present application stored in the third-party device through the external memory interface 220, so as to control the wireless communication module 260 to perform data communication to other terminal devices, thereby implementing cross-device control, improving the intelligence and convenience of the electronic device 200, and improving the user experience. The processor 210 may include different devices, for example, when the CPU and the GPU are integrated, the CPU and the GPU may cooperate to execute the cross-device control method provided in the embodiment of the present application, for example, part of algorithms in the cross-device control method are executed by the CPU, and another part of algorithms are executed by the GPU, so as to obtain faster processing efficiency.
The display screen 294 is used to display images, video, and the like. The display screen 294 includes a display panel. The display panel may adopt a liquid crystal display (LCD), an organic light-emitting diode (OLED), an active-matrix organic light-emitting diode (AMOLED), a flexible light-emitting diode (FLED), a Mini-LED, a Micro-LED, a Micro-OLED, a quantum dot light-emitting diode (QLED), or the like. In some embodiments, the electronic device 200 may include 1 or N display screens 294, N being a positive integer greater than 1. The display screen 294 may be used to display information input by or provided to the user as well as various graphical user interfaces (GUIs). For example, the display screen 294 may display a photograph, video, web page, or file, among others. As another example, the display screen 294 may display a graphical user interface of a terminal device such as that shown in fig. 1. The graphical user interface of the terminal device shown in fig. 1 includes a status bar, a Dock bar, a time and weather widget, and icons of applications, such as a phone icon, a short message icon, a browser icon, and the like. The status bar includes the name of the operator (e.g., China Mobile), the mobile network (e.g., 4G), the time, and the remaining power. Further, it is understood that, in some embodiments, a Bluetooth icon, a Wi-Fi icon, an add-on icon, etc. may also be included in the status bar. It is further understood that, in other embodiments, a Dock bar may be further included in the graphical user interface of the terminal device shown in fig. 1, and commonly-used application icons may be included in the Dock bar. When the processor 210 detects a touch event of a user's finger (or a stylus, etc.) with respect to an application icon, in response to the touch event, a user interface of the application corresponding to the application icon is opened and displayed on the display screen 294.
In this embodiment, the display screen 294 may be an integrated flexible display screen, or a spliced display screen formed by two rigid screens and a flexible screen located between the two rigid screens.
The camera 293 (a front camera or a rear camera; alternatively, one camera may serve as both a front camera and a rear camera) is used for capturing still images or videos. Generally, the camera 293 may include a lens group and an image sensor, where the lens group includes a plurality of lenses (convex or concave) for collecting an optical signal reflected by an object to be photographed and transferring the collected optical signal to the image sensor, and the image sensor generates an original image of the object to be photographed according to the optical signal.
Internal memory 221 may be used to store computer-executable program code, including instructions. The processor 210 executes various functional applications of the electronic device 200 and data processing by executing instructions stored in the internal memory 221. The internal memory 221 may include a program storage area and a data storage area. Wherein the storage program area may store codes of an operating system, an application program (such as a cross device control function, etc.), and the like. The storage data area may store data created during use of the electronic device 200 (such as preset data set for a cross-device control function, etc.), and the like.
The internal memory 221 may also store one or more computer programs corresponding to the cross-device control algorithm provided by the embodiments of the present application. The one or more computer programs stored in the internal memory 221 and configured to be executed by the one or more processors 210 include instructions that may be used to perform the steps in the following embodiments.
In addition, the internal memory 221 may include a high-speed random access memory, and may further include a nonvolatile memory, such as at least one magnetic disk storage device, a flash memory device, a universal flash memory (UFS), and the like.
Of course, the code of the cross-device control algorithm provided by the embodiment of the present application may also be stored in the external memory. In this case, the processor 210 may run the code of the cross-device control algorithm stored in the external memory through the external memory interface 220.
The sensor module 280 may include a gyroscope sensor, an acceleration sensor, a proximity light sensor, a fingerprint sensor, a touch sensor, and the like.
Touch sensors, also known as "touch panels". The touch sensor may be disposed on the display screen 294, and the touch sensor and the display screen 294 form a touch display screen, which is also called a "touch screen". The touch sensor is used to detect a touch operation applied thereto or nearby. The touch sensor can communicate the detected touch operation to the application processor to determine the touch event type. Visual output related to touch operations may be provided through the display screen 294. In other embodiments, the touch sensor may be disposed on a surface of the electronic device 200, different from the position of the display screen 294.
Illustratively, the display screen 294 of the electronic device 200 displays a home interface including icons of a plurality of applications (e.g., a camera application, a WeChat application, etc.). The user clicks the icon of the camera application in the home interface through the touch sensor, which triggers the processor 210 to start the camera application and open the camera 293. The display screen 294 then displays an interface of the camera application, such as a viewfinder interface.
The wireless communication function of the electronic device 200 may be implemented by the antenna 1, the antenna 2, the mobile communication module 250, the wireless communication module 260, the modem processor, the baseband processor, and the like.
The antennas 1 and 2 are used for transmitting and receiving electromagnetic wave signals. Each antenna in the electronic device 200 may be used to cover a single or multiple communication bands. Different antennas can also be multiplexed to improve the utilization of the antennas. For example: the antenna 1 may be multiplexed as a diversity antenna of a wireless local area network. In other embodiments, the antenna may be used in conjunction with a tuning switch.
The mobile communication module 250 may provide a solution including 2G/3G/4G/5G wireless communication applied on the electronic device 200. The mobile communication module 250 may include at least one filter, a switch, a power amplifier, a Low Noise Amplifier (LNA), and the like. The mobile communication module 250 may receive the electromagnetic wave from the antenna 1, filter, amplify, etc. the received electromagnetic wave, and transmit the electromagnetic wave to the modem processor for demodulation. The mobile communication module 250 may also amplify the signal modulated by the modem processor, and convert the signal into electromagnetic wave through the antenna 1 to radiate the electromagnetic wave. In some embodiments, at least some of the functional modules of the mobile communication module 250 may be disposed in the processor 210. In some embodiments, at least some of the functional blocks of the mobile communication module 250 may be provided in the same device as at least some of the blocks of the processor 210. In this embodiment, the mobile communication module 250 may also be configured to perform information interaction with other terminal devices, that is, send a control instruction to the other terminal devices, or the mobile communication module 250 may be configured to receive data returned by the other terminal devices.
The modem processor may include a modulator and a demodulator. The modulator is used for modulating a low-frequency baseband signal to be transmitted into a medium-high frequency signal. The demodulator is used for demodulating the received electromagnetic wave signal into a low-frequency baseband signal. The demodulator then passes the demodulated low frequency baseband signal to a baseband processor for processing. The low frequency baseband signal is processed by the baseband processor and then passed to the application processor. The application processor outputs a sound signal through an audio device (not limited to the speaker 270A, the receiver 270B, etc.) or displays an image or video through the display screen 294. In some embodiments, the modem processor may be a stand-alone device. In other embodiments, the modem processor may be separate from the processor 210, and may be disposed in the same device as the mobile communication module 250 or other functional modules.
The wireless communication module 260 may provide solutions for wireless communication applied to the electronic device 200, including Wireless Local Area Networks (WLANs) (e.g., wireless fidelity (Wi-Fi) networks), Bluetooth (BT), Global Navigation Satellite System (GNSS), Frequency Modulation (FM), Near Field Communication (NFC), Infrared (IR), and the like. The wireless communication module 260 may be one or more devices integrating at least one communication processing module. The wireless communication module 260 receives electromagnetic waves via the antenna 2, performs frequency modulation and filtering processing on the electromagnetic wave signal, and transmits the processed signal to the processor 210. The wireless communication module 260 may also receive a signal to be transmitted from the processor 210, frequency-modulate and amplify the signal, and convert the signal into electromagnetic waves via the antenna 2 to radiate the electromagnetic waves. In this embodiment, the wireless communication module 260 is configured to establish a connection with other terminal devices to perform data interaction. Or the wireless communication module 260 may be used to access the access point device, send control instructions to other terminal devices, or receive data sent from other terminal devices.
For example, as shown in fig. 1, the control device 101 and the remote device 102 may receive and transmit control instructions and data through the mobile communication module 250 or the wireless communication module 260, thereby implementing the cross-device control function. The control device 101 and the remote device 102 may further transmit heartbeat packets to each other through the mobile communication module 250 or the wireless communication module 260, so that each side can confirm information such as the state of its connection with the other.
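The heartbeat mechanism described above can be sketched in a few lines. This is only an illustrative assumption about how connection state might be tracked; the patent does not specify a timeout value or data structure, and all names below are invented for illustration.

```java
// Hypothetical sketch: each side records the time of the peer's last
// heartbeat packet and treats the link as disconnected once a timeout
// elapses. The timeout value and class name are assumptions.
public class HeartbeatMonitor {
    private final long timeoutMillis;
    private long lastHeartbeatMillis;

    public HeartbeatMonitor(long timeoutMillis, long nowMillis) {
        this.timeoutMillis = timeoutMillis;
        this.lastHeartbeatMillis = nowMillis;
    }

    /** Called whenever a heartbeat packet arrives from the peer device. */
    public void onHeartbeat(long nowMillis) {
        lastHeartbeatMillis = nowMillis;
    }

    /** The peer is considered connected while heartbeats keep arriving in time. */
    public boolean isConnected(long nowMillis) {
        return nowMillis - lastHeartbeatMillis <= timeoutMillis;
    }
}
```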
In addition, the electronic device 200 may implement audio functions, such as music playing and sound recording, through the audio module 270, the speaker 270A, the receiver 270B, the microphone 270C, the headphone interface 270D, the application processor, and the like. The electronic device 200 may receive input from the keys 290 and generate key signal input related to user settings and function control of the electronic device 200. The electronic device 200 may generate a vibration alert (such as an incoming call vibration alert) using the motor 291. The indicator 292 of the electronic device 200 may be an indicator light and may be used to indicate a charging status or a power change, or to indicate a message, a missed call, a notification, and the like. The SIM card interface 295 in the electronic device 200 is used to connect a SIM card. The SIM card can be attached to or detached from the electronic device 200 by being inserted into or pulled out of the SIM card interface 295.
It should be understood that the illustrated electronic device 200 is merely an example. In practical applications, the electronic device 200 may have more or fewer components than shown in fig. 2, may combine two or more components, or may have a different configuration of components; the embodiment of the present application is not limited thereto. The various components shown in the figure may be implemented in hardware, software, or a combination of hardware and software, including one or more signal processing and/or application-specific integrated circuits.
The software system of the electronic device 200 may employ a layered architecture, an event-driven architecture, a micro-core architecture, a micro-service architecture, or a cloud architecture. The embodiment of the invention takes an Android system with a layered architecture as an example, and exemplarily illustrates a software structure of a terminal device. Fig. 3 is a block diagram of a software structure of the terminal device according to the embodiment of the present invention. By way of example, fig. 3 is a schematic diagram of a software architecture that may be run in the control device described above. The layered architecture divides the software into several layers, each layer having a clear role and division of labor. The layers communicate with each other through a software interface. As shown in fig. 3, the software architecture may be divided into five layers, which are an application layer, an application framework layer, an android runtime and system library, a hardware abstraction layer, and a Linux kernel layer.
The application layer is the top layer of the operating system and includes native applications of the operating system, such as an email client, Bluetooth, camera, music, video, text messages, calls, calendar, browser, and contacts. An application (app for short) in the embodiments of the present application is a software program capable of implementing one or more specific functions. Generally, a plurality of applications can be installed in a terminal device, for example, a camera application, an email application, and a smart home control application. An application mentioned below may be a system application installed when the terminal device leaves the factory, or a third-party application downloaded from a network or acquired from another terminal device by the user while using the terminal device.
Of course, a developer may also write an application and install it in this layer.
In some embodiments of the present application, the application layer may be configured to implement presentation of a setting interface, and the setting interface may be configured to enable a user to set a cross-device control function of the terminal device. For example, a user may perform on or off setting of the cross-device control function in the setting interface, and may also perform configuration of the cross-device control function in the setting interface, such as a form and a reception setting of a user instruction in the cross-device control function. For example, the setting interface may display a user instruction form, such as a touch operation instruction, a key operation instruction, a voice operation instruction, and the like, which may be selected by a user, and the user may select the user instruction form in the setting interface and perform specific operation setting. For example, if the user selects a form of a touch operation instruction in the setting interface, the user can trigger the electronic device to execute a corresponding function by performing a touch operation on the touch display screen, and the user can also set a specific action format of the touch operation in the setting interface, such as a finger joint tapping the touch display screen, a finger joint sliding on the touch display screen, and the like. For example, the setting interface may be content in a status bar or a notification bar displayed on a touch screen of the terminal device, or may be a control interface of a device control function displayed on the touch screen of the terminal device.
In a possible implementation manner, the Application program may be developed using Java language, and is completed by calling an Application Programming Interface (API) provided by an Application framework layer, and a developer may interact with a bottom layer (e.g., a hardware abstraction layer, a kernel layer, etc.) of an operating system through the Application framework layer to develop its own Application program. The application framework is primarily a series of services and management systems for the operating system.
The application framework layer may include a window manager, content provider, view system, phone manager, resource manager, notification manager, and the like.
The window manager is used for managing window programs. The window manager can obtain the size of the display screen, judge whether a status bar exists, lock the screen, intercept the screen and the like. The content provider is used to store and retrieve data and make it accessible to applications. The view system may be used to build applications. The display interface may be composed of one or more views. The resource manager provides various resources for the application, such as localized strings, icons, pictures, layout files, video files, and the like. The notification manager enables the application to display notification information in the status bar, can be used to convey notification-type messages, can disappear automatically after a short dwell, and does not require user interaction.
In this embodiment of the application, the application framework layer may further include a cross-device control service, which specifically includes three sub-service modules: an Input service, a gesture service, and a multi-device management service. These three sub-service modules cooperate with the cross-device control service to provide the device control function. Specifically, the Input service is used to identify a received Input event and determine the corresponding operation type; the gesture service is used to identify a gesture operation and determine the corresponding control instruction when the operation type determined by the Input service is a gesture operation; and the multi-device management service is used to manage the currently connected terminal devices. The cross-device control service issues the control instruction determined by the gesture service to the remote device so that the remote device executes the corresponding operation according to the control instruction, receives the screen capture/screen recording/sound recording result data reported by the remote device, and reports the result data to the application layer, thereby completing the device control function.
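As a rough illustration of how the three sub-service modules could cooperate, the sketch below combines a gesture-to-instruction table (gesture service), a list of connected devices (multi-device management service), and a dispatch step (cross-device control service). Every class, method, and gesture name here is an assumption of this sketch, not part of the disclosed implementation.

```java
import java.util.ArrayList;
import java.util.HashMap;
import java.util.List;
import java.util.Map;

// Illustrative sketch only: all names are invented for this example.
public class CrossDeviceControlService {
    // Gesture service: pre-stored mapping from gesture names to instructions.
    private final Map<String, String> gestureToInstruction = new HashMap<>();
    // Multi-device management service: currently connected remote devices.
    private final List<String> connectedRemoteDevices = new ArrayList<>();

    public CrossDeviceControlService() {
        gestureToInstruction.put("single_knuckle_double_tap", "SCREEN_CAPTURE");
        gestureToInstruction.put("double_knuckle_double_tap", "SCREEN_RECORD");
        gestureToInstruction.put("single_knuckle_slide", "SOUND_RECORD");
    }

    public void onRemoteDeviceConnected(String deviceId) {
        connectedRemoteDevices.add(deviceId);
    }

    /** Gesture service: resolve a recognized gesture to a control instruction. */
    public String resolveInstruction(String gesture) {
        return gestureToInstruction.getOrDefault(gesture, "NONE");
    }

    /** Issue the instruction to every connected remote device; returns the dispatch log. */
    public List<String> dispatch(String gesture) {
        List<String> issued = new ArrayList<>();
        String instruction = resolveInstruction(gesture);
        if (instruction.equals("NONE")) return issued;  // not a recognized gesture
        for (String device : connectedRemoteDevices) {
            issued.add(device + ":" + instruction);     // stand-in for a network send
        }
        return issued;
    }
}
```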
In some embodiments of the present application, the cross-device control service may further include a notification manager for interacting with other data layers, such as transmitting notification messages to upper layers for presentation in a display interface of the control device, such as a touch screen.
The android runtime includes a core library and a virtual machine. The android runtime is responsible for scheduling and managing the android system. The core library of the android system comprises two parts: one part is a function which needs to be called by the Java language, and the other part is a core library of the android system. The application layer and the application framework layer run in a virtual machine. Taking Java as an example, the virtual machine executes Java files of the application layer and the application framework layer as binary files. The virtual machine is used for performing the functions of object life cycle management, stack management, thread management, safety and exception management, garbage collection and the like.
The system library may include a plurality of functional modules. For example: surface managers, media libraries, three-dimensional graphics processing libraries (e.g., OpenGL ES), two-dimensional graphics engines (e.g., SGL), and the like. The surface manager is used to manage the display subsystem and provide a fusion of two-dimensional and three-dimensional layers for multiple applications. The media library supports a variety of commonly used audio, video format playback and recording, and still image files, among others. The media library may support a variety of audio-video encoding formats, such as: MPEG4, H.264, MP3, AAC, AMR, JPG, PNG, etc. The three-dimensional graphic processing library is used for realizing three-dimensional graphic drawing, image rendering, synthesis, layer processing and the like. The two-dimensional graphics engine is a two-dimensional drawing engine.
The Hardware Abstraction Layer (HAL) supports the application framework layer and is an important link connecting the application framework layer and the Linux kernel layer; it can provide services to developers through the application framework layer.
In some embodiments of the application, the hardware abstraction layer includes an Input reporting process and a gesture algorithm process, where the Input reporting process is configured to receive an Input event reported by the kernel layer and report the Input event to the application framework layer, and the gesture algorithm process is configured to provide a gesture algorithm library for the application framework layer, so that the application framework layer identifies a control instruction corresponding to a gesture operation according to the gesture algorithm library.
The kernel layer provides the core system services of the operating system; services such as security, memory management, process management, the network protocol stack, and the driver model are implemented based on the kernel layer. The kernel layer also acts as an abstraction layer between the hardware and the software stack. This layer contains many drivers related to the electronic device, the main ones being: a display driver; a Linux-based frame buffer driver; a keyboard driver as an input device; a flash driver based on memory technology devices; a camera driver; an audio driver; a Bluetooth driver; a Wi-Fi driver; and the like.
In some embodiments of the present application, the kernel layer serves as an abstraction layer between hardware and a software stack, and includes a TP driver service, which is configured to acquire, through a serial peripheral interface, operation information related to a control instruction received by a hardware portion (for example, a touch screen, a touch sensor, and the like), convert the operation information into an Input event, and report the Input event to the hardware abstraction layer.
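The kernel-layer conversion step can be pictured as packaging raw values read over the serial peripheral interface into a structured Input event that is then reported upward. The frame layout and field names below are assumptions for illustration only; the patent does not specify them.

```java
// Hypothetical sketch of the TP driver service's conversion step.
public class TpDriverService {
    /** A structured Input event produced from raw TP operation information. */
    public static final class InputEvent {
        public final int x, y;
        public final int pressure;
        public final long timestampMillis;
        public InputEvent(int x, int y, int pressure, long timestampMillis) {
            this.x = x; this.y = y; this.pressure = pressure;
            this.timestampMillis = timestampMillis;
        }
    }

    /** Convert raw TP operation info (as read over SPI) into an Input event.
     *  Assumed frame layout: [x, y, pressure]. */
    public static InputEvent toInputEvent(int[] rawSpiFrame, long nowMillis) {
        return new InputEvent(rawSpiFrame[0], rawSpiFrame[1], rawSpiFrame[2], nowMillis);
    }
}
```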
It should be noted that, in the cross-device control system shown in fig. 1, both the control device and the remote device may be implemented by the above hardware architecture and software architecture.
With reference to the flowchart of the cross-device control method shown in fig. 4, and in combination with the hardware structure and software structure of the electronic device shown in fig. 2 and fig. 3, the cross-device control method provided by the embodiment of the present application is described in detail below, taking finger joint operations that control screen capture/screen recording/sound recording on the remote device as an example.
Step 1: The control device detects a touch operation of the user on the touch display screen.
After the touch display screen (TP) of the control device detects a touch operation performed on it by the user, an interrupt is triggered and the kernel layer is notified to acquire the corresponding TP operation information.
Step 2: The control device acquires the TP operation information corresponding to the touch operation and converts it into an Input event.
The TP driver service in the kernel layer of the control device receives the interrupt sent by the touch display screen, acquires the TP operation information corresponding to the touch operation through the serial peripheral interface, converts the TP operation information into an Input event, and reports the Input event to the hardware abstraction layer.
Step 3: The control device internally forwards the Input event.
Internal forwarding is a communication behavior occurring between the internal components of a computer. In this application, internal forwarding may be the forwarding or reporting of data between the internal hardware structures of the electronic device, or between its internal software structures.
In this embodiment, the hardware abstraction layer of the control device internally forwards the Input event, and reports the Input event reported by the kernel layer to the application framework layer through an Input reporting process.
Step 4: The control device determines, according to the TP operation information, whether the touch operation is a finger joint operation.
The Input service in the application framework layer of the control device calls a finger joint algorithm in the gesture algorithm library of the hardware abstraction layer to identify the TP operation information and determine whether the touch operation is a finger joint operation. In addition, when the touch operation is a finger joint operation, the specific type of the finger joint operation can also be identified through the gesture algorithm library.
If the touch operation is identified as a finger joint operation through the gesture algorithm library, step 5 is executed.
The gesture algorithm library comprises an algorithm for identifying whether the operation of the user on the touch display screen is gesture operation or not and an algorithm for identifying the specific type of the gesture operation. For example, the gesture algorithm library may include a finger operation recognition algorithm, a finger joint operation recognition algorithm, and the like, and a finger operation type recognition algorithm, a finger joint operation type recognition algorithm, and the like.
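The patent does not disclose the recognition algorithms inside the gesture algorithm library, so the following is only a plausible sketch: a knuckle tap typically produces a smaller contact area and a sharper acceleration spike than a finger-pad touch, so a simple threshold test on those two features can separate the cases. The feature names and threshold values are invented for illustration.

```java
// Plausible sketch only; not the disclosed algorithm. Thresholds are assumptions.
public class KnuckleClassifier {
    static final double MAX_KNUCKLE_CONTACT_AREA = 40.0; // mm^2, assumed
    static final double MIN_KNUCKLE_PEAK_ACCEL = 15.0;   // m/s^2, assumed

    /** True if the touch features look like a knuckle tap rather than a finger pad. */
    public static boolean isKnuckle(double contactAreaMm2, double peakAccel) {
        return contactAreaMm2 <= MAX_KNUCKLE_CONTACT_AREA
                && peakAccel >= MIN_KNUCKLE_PEAK_ACCEL;
    }
}
```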
Step 5: When the touch operation is determined to be a finger joint operation, the control device determines the control instruction corresponding to the finger joint operation, where the control instruction is used to indicate a screen capture/screen recording/sound recording operation.
When the Input service in the application framework layer of the control device determines that the TP operation information indicates a finger joint operation, the gesture service module determines the screen capture/screen recording/sound recording control instruction corresponding to the finger joint operation. The gesture service module may store correspondences between different finger joint operations and different control instructions in advance.
Step 6: the control device determines a remote device to which the control instruction is directed.
For example, the multi-device management service in the application framework layer of the control device determines whether a currently connected remote device exists; if so, step 7 is executed, and otherwise no processing is performed.
Step 7: The control device sends the control instruction to the remote device.
The control device sends the control instruction to the remote device through the mobile communication module or the wireless communication module. For example, the control device may send the control instruction to the remote device through the mobile communication module in a wireless communication mode such as 2G/3G/4G/5G, or through the wireless communication module in a short-range communication mode such as Wi-Fi or Bluetooth.
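The payload actually transmitted in this step is not specified in the patent. Purely to make the send/receive contract concrete, the sketch below assumes a simple "version|target|instruction" text encoding; the format, delimiter, and names are all assumptions of this example.

```java
import java.nio.charset.StandardCharsets;

// Hypothetical wire format for the control message; not part of the disclosure.
public class ControlMessageCodec {
    public static byte[] encode(String targetDeviceId, String instruction) {
        return ("v1|" + targetDeviceId + "|" + instruction)
                .getBytes(StandardCharsets.UTF_8);
    }

    /** Returns {targetDeviceId, instruction} parsed from an encoded payload. */
    public static String[] decode(byte[] payload) {
        String[] parts = new String(payload, StandardCharsets.UTF_8).split("\\|");
        if (parts.length != 3 || !parts[0].equals("v1")) {
            throw new IllegalArgumentException("unrecognized control message");
        }
        return new String[] { parts[1], parts[2] };
    }
}
```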
Step 8: The remote device executes the corresponding screen capture/screen recording/sound recording operation according to the control instruction.
The software structure of the remote device includes a screen capture/screen recording/sound recording service, which performs screen capture/screen recording/sound recording according to the control instruction of the control device.
Step 9: The remote device transmits the result data obtained after executing the screen capture/screen recording/sound recording operation to the control device.
The remote device sends the result data to the control device through the mobile communication module or the wireless communication module. For example, the remote device may send the result data to the control device through the mobile communication module in a wireless communication mode such as 2G/3G/4G/5G, or through the wireless communication module in a short-range communication mode such as Wi-Fi or Bluetooth.
Step 10: The control device outputs, displays, or stores the received result data.
The multi-device management service in the application framework layer of the control device receives the screen capture/screen recording/sound recording result data sent by the remote device and reports it to the application layer, and the application layer outputs, stores, or otherwise processes the result data from the remote device.
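How the control device handles the returned data is left open ("outputs, stores, or otherwise processes"). The sketch below assumes one possible routing by result kind, purely for illustration; the kind labels and routing decisions are invented.

```java
// Hypothetical routing of returned result data; not part of the disclosure.
public class ResultDataHandler {
    /** Route result data by kind: images are displayed, recordings are saved. */
    public static String route(String resultKind) {
        switch (resultKind) {
            case "SCREEN_CAPTURE": return "DISPLAY_AND_SAVE";
            case "SCREEN_RECORD":  return "SAVE_VIDEO";
            case "SOUND_RECORD":   return "SAVE_AUDIO";
            default:               return "DISCARD";
        }
    }
}
```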
Illustratively, as shown in fig. 5, the flow in which the control device implements cross-device control through a finger joint operation in the embodiment of the present application includes:
Step S501: the TP device in the control device receives the touch operation of the user on the touch display screen and triggers an interrupt;
Step S502: the TP driver in the kernel layer of the control device receives the interrupt, converts it into an Input event, and reports the Input event to the application framework layer;
Step S503: the application framework layer in the control device calls the gesture algorithm library to identify whether the touch operation is a finger joint operation; if so, step S504 is executed, and otherwise step S505 is executed;
Step S504: the application framework layer in the control device sends the control instruction corresponding to the touch operation to the remote device;
Step S505: the original related control flow is kept.
The specific implementation flows provided by the above embodiments are only examples of applicable method flows in the embodiments of the present application; the execution order of the steps may be adjusted according to actual requirements, and steps may be added or removed. The method flows provided by the above embodiments may also be executed in combination with the method flows provided by other embodiments to implement the cross-device control method provided by the embodiments of the present application.
In the above embodiment, the control device receives a touch operation instruction from the user for controlling the remote device, generates the corresponding control instruction, and issues it to the remote device, thereby controlling the remote device to perform screen capture/screen recording/sound recording. With the control device as a carrier, this method can control screen capture/screen recording/sound recording on other electronic devices; it is simple to operate and highly universal, and cross-device screen capture/screen recording/sound recording control can be achieved through nothing more than the user's touch operation.
In an embodiment of the present application, an application scenario of cross-device control includes at least one of:
1) Using the control device as a carrier, the remote device connected to the control device is controlled in real time to perform screen capture, and the remote device returns the corresponding screen capture data to the control device.
2) Using the control device as a carrier, the remote device connected to the control device is controlled in real time to perform screen recording, and the remote device returns the corresponding screen recording data to the control device.
3) Using the control device as a carrier, the remote device connected to the control device is controlled in real time to perform sound recording, and the remote device returns the corresponding sound recording data to the control device.
In the embodiment of the application, the control device may be connected to only one remote device and control the remote device to perform screen capture/screen recording/recording, or the control device may be connected to a plurality of remote devices and control at least one of the remote devices to perform screen capture/screen recording/recording.
In the above scenario, the control device monitors in real time whether a control instruction for screen capture/screen recording/recording of the remote device is received, and if the control device receives the control instruction, the control device issues the control instruction to the connected remote device, so that the remote device performs screen capture/screen recording/recording according to the control instruction. Optionally, when the touch display screen of the control device is in a screen-off state or a working state, the control device may monitor whether the control instruction is received in real time, and the monitoring and receiving modes of the control device in the two states may be the same.
In the embodiment of the application, a cross-device control feature switch may be provided in both the control device and the remote device. The feature switch may be turned on or off by the user or by another device, or the cross-device control function may be turned on automatically after the control device and the remote device start up.
If the feature switch in the control device is turned on, the control device can respond to a user-triggered control instruction for screen capture/screen recording/sound recording on the remote device and issue the control instruction to the connected remote device; otherwise, the control device does not respond to the control instruction.
If the feature switch in the remote device is turned on, the remote device can perform the screen capture/screen recording/sound recording operation according to the control instruction sent by the control device; otherwise, the remote device does not respond to the control instruction.
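The combined effect of the two switches can be stated as a one-line predicate: the cross-device operation ultimately executes only when both the control device's switch and the remote device's switch are on. A minimal sketch, assuming nothing beyond what the two paragraphs above state:

```java
// Minimal sketch of the feature-switch gating described above.
public class FeatureSwitchGate {
    /** End-to-end effect: the operation runs only when both switches are on. */
    public static boolean operationExecutes(boolean controlSwitchOn, boolean remoteSwitchOn) {
        // The control device only issues the instruction when its switch is on;
        // the remote device only executes a received instruction when its switch is on.
        return controlSwitchOn && remoteSwitchOn;
    }
}
```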
In the embodiment of the application, the user can trigger the control device, through a shortcut operation, to control the remote device connected to it to perform screen capture/screen recording/sound recording operations. However, given that schemes already exist in which a user triggers screen capture or screen recording on the display screen of the electronic device itself through certain shortcut operations, in order to make the two schemes compatible, when the control device receives a user-triggered screen capture/screen recording/sound recording control instruction, it needs to distinguish whether the control object is the remote device or the control device itself. In the embodiment of the application, the control device can distinguish the trigger mode of the control instruction, so that it can determine the corresponding control object from the received control instruction. Illustratively, the control device may distinguish the control objects by, but not limited to, the following means:
the control objects are distinguished by the type of touch operation and/or by setting different areas for different control objects in the touch display screen.
In the embodiment of the application, the control instruction for screen capture/screen recording/sound recording is triggered in at least one of the following modes:
Instruction trigger mode 1: touch operation trigger.
Instruction trigger mode 2: key operation trigger.
Instruction trigger mode 3: voice instruction trigger.
For instruction trigger mode 1, the control device may preset different types of touch operations and determine the control instruction or control object corresponding to the user's touch operation by mapping different touch operations to different control instructions or different control objects. Alternatively, the control device may define different areas in the touch display screen and determine the control instruction or control object corresponding to a touch operation performed by the user in a given area by mapping different areas to different control instructions or different control objects. In the cross-device control process, the control device responds to the user's touch operation on the touch display screen, generates the corresponding control instruction, and controls the corresponding control object to execute it. The control instruction here is a control instruction for screen capture, screen recording, or sound recording.
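One way to realize instruction trigger mode 1 is to let the touched screen region choose the control object and the operation type choose the instruction. The region layout (upper half of the screen targets the control device itself, lower half the remote device), the gesture names, and the labels below are all assumptions of this sketch, not the disclosed configuration.

```java
// Illustrative sketch of mode 1: region selects the object, gesture selects
// the instruction. All mappings here are invented for illustration.
public class TouchTriggerResolver {
    /** Resolve a touch to "TARGET:INSTRUCTION", or "NO_MATCH" if unrecognized. */
    public static String resolve(int touchY, int screenHeight, String operationType) {
        // Assumed layout: upper half = control device itself, lower half = remote device.
        String target = (touchY < screenHeight / 2) ? "LOCAL" : "REMOTE";
        String instruction;
        switch (operationType) {
            case "single_knuckle_double_tap": instruction = "SCREEN_CAPTURE"; break;
            case "double_knuckle_double_tap": instruction = "SCREEN_RECORD"; break;
            case "single_knuckle_slide":      instruction = "SOUND_RECORD"; break;
            default: return "NO_MATCH";
        }
        return target + ":" + instruction;
    }
}
```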
Illustratively, the different types of touch operations preset by the control device may include a knuckle tapping operation, a knuckle sliding operation, a finger clicking or long-pressing operation, a finger sliding operation, and the like, wherein the various operations may be further divided into different touch operations, for example, the knuckle tapping operation may further include a single knuckle clicking operation, a single knuckle double clicking operation, a double knuckle double clicking operation, and the like.
In the embodiment of the application, the corresponding relationship between different touch operations and the control instruction or the control object, and the corresponding relationship between different areas and the control instruction or the control object are preset fixed corresponding relationships or corresponding relationships set by a user. The different types of touch operations may be preset operations or setting operations set by the user.
In the embodiment of the present application, the control device may determine the control instruction and the control object corresponding to the touch operation in at least one of, but not limited to, the following manners:
1) A plurality of touch operation areas are provided in the touch display screen, different areas respectively correspond to different control objects, and the user controls the control object corresponding to each area by performing a touch operation in that area.
Optionally, in this manner, the touch operations corresponding to different types of control instructions are set to be different.
For example, in this manner, the first area may correspond to the control device itself, and the second area may correspond to the remote device.
For example, in this mode, different touch operations corresponding to control instructions of screen capturing, screen recording, and sound recording may be set. For example, a single-finger joint double click, a double-finger joint double click and a single-finger joint sliding along a set track respectively correspond to control instructions of screen capture, screen recording and recording, and when a second area in the touch display screen receives an operation of double clicking the display screen by the single-finger joint, the screen capture control instruction of the remote device can be determined to be triggered.
2) A plurality of touch operation areas are arranged in the touch display screen, different areas correspond to different types of control instructions respectively, and a user performs touch operation in each area to trigger generation of the corresponding control instructions respectively.
For example, in this mode, a first area may be set for receiving a screen capture instruction, a second area for receiving a screen recording instruction, and a third area for receiving a sound recording instruction.
Optionally, in this manner, the touch operations corresponding to different types of control objects are set to be different.
For example, in this mode, the touch operations for screen capture/recording control of the control device itself and of the remote device may be set to be different. For instance, a single-knuckle single tap and a single-knuckle double tap may correspond to the screen capture control instructions for the control device and the remote device respectively; when the first area of the touch display screen receives a single-knuckle double tap, the control device can determine that the screen capture control instruction for the remote device is triggered.
3) Different types of touch operation are set for different types of control objects, and a user controls different control objects by performing different types of touch operation on the touch display screen.
Optionally, in this manner, the touch operation areas corresponding to different types of control instructions are set to be different. For example, in this mode, the touch operation areas corresponding to the control commands of the screen capture, the screen recording and the sound recording may be set to be different.
4) Different types of touch operations are set for different types of control instructions, and the user triggers generation of the corresponding control instruction by performing the matching type of touch operation on the touch display screen.
Optionally, in this manner, the touch operations corresponding to different types of control objects are set to be different. For example, in this mode, the touch operations for screen capture/recording control on the control device itself and the remote device may be set to be different.
In each of the above manners, the control object may be further divided into the control device itself and different types of remote devices connected to the control device, and different types of touch operations or different areas in the touch display screen are set correspondingly.
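As a concrete illustration, the correspondences described in modes 1) to 4) can be reduced to table lookups. The following sketch shows how an (area, operation) pair could resolve to a control object and a control instruction; all area names, gesture labels, and mappings are hypothetical, not taken from the application:

```python
# Illustrative dispatch: the area determines the control object (mode 1)
# and the gesture determines the control instruction. All names here are
# hypothetical examples.
AREA_TO_OBJECT = {
    "area1": "remote_device",   # e.g. upper half of the touch screen
    "area2": "local_device",    # e.g. lower half of the touch screen
}

GESTURE_TO_INSTRUCTION = {
    "single_knuckle_double_tap": "screen_capture",
    "double_knuckle_double_tap": "screen_record",
    "knuckle_swipe_track": "audio_record",
}

def resolve(area: str, gesture: str):
    """Map a touch event to a (control object, control instruction) pair."""
    obj = AREA_TO_OBJECT.get(area)
    instruction = GESTURE_TO_INSTRUCTION.get(gesture)
    if obj is None or instruction is None:
        return None  # unrecognized: fall back to the original control flow
    return obj, instruction
```

A mode-2 configuration would simply invert the two tables, keying areas to instructions and gestures to control objects.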
The cross-device control method provided by the embodiment of the present application is described in detail below with reference to specific examples 1 to 4.
Example 1
Exemplarily, the control device sets different areas in the touch display screen for itself and for a connected remote device: if the user performs a touch operation in the area corresponding to the control device, screen capture/screen recording/sound recording control is performed on the control device; if the user performs a touch operation in the area corresponding to the remote device, screen capture/screen recording/sound recording control is performed on the remote device.
For example, as shown in fig. 6, the control device divides the touch display screen into two areas, area 1 and area 2, setting area 1 as the area corresponding to the remote device and area 2 as the area corresponding to the control device itself. When the user performs a touch operation in area 1 in the upper half of the touch display screen, a control instruction for screen capture/screen recording/sound recording of the remote device is triggered; when the user performs a touch operation in area 2 in the lower half, a control instruction for screen capture/screen recording/sound recording of the control device itself is triggered.
In some embodiments of the present application, the touch display screen of the control device may be divided into a plurality of display areas, where different areas respectively correspond to the control device itself and different types of remote devices connected to the control device.
For example, as shown in fig. 7, the remote devices that the control device supports may be distinguished by device type, for example into two types, display devices and audio devices. The touch display screen of the control device may then be divided into three areas: area 1 corresponding to display devices, area 2 corresponding to audio devices, and area 3 corresponding to the control device itself. When the user performs a touch operation in area 1 at the top of the touch display screen, a control instruction for screen capture/screen recording of the display device is triggered; when the user performs a touch operation in area 2 in the middle, a control instruction for sound recording of the audio device is triggered; and when the user performs a touch operation in area 3 at the bottom, a control instruction for screen capture/screen recording/sound recording of the control device itself is triggered.
As an optional implementation, on the basis of the region division shown in fig. 6, area 1 of fig. 6 may be further divided according to the two device types (display devices and audio devices) to obtain areas 1 and 3 shown in fig. 8, with area 1 set as the area corresponding to the display device, area 3 as the area corresponding to the audio device, and area 2 as the area corresponding to the control device itself. When the user performs a touch operation in area 1 at the upper left of the display screen, a control instruction for screen capture/screen recording/sound recording of the display device is triggered; in area 3 at the upper right, a control instruction for screen capture/screen recording/sound recording of the audio device is triggered; and in area 2 in the lower half, a control instruction for screen capture/screen recording/sound recording of the control device itself is triggered.
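The region divisions of figs. 6 to 8 amount to a hit test on the touch coordinates. A minimal sketch of a fig. 8-style layout follows; the screen dimensions and device names are illustrative assumptions:

```python
# Hypothetical hit test for a fig. 8-style layout: upper-left maps to the
# display device, upper-right to the audio device, and the lower half to the
# control device itself. Dimensions are illustrative.
SCREEN_W, SCREEN_H = 1080, 2340

def hit_test(x: int, y: int) -> str:
    """Return the control object for a touch at pixel (x, y)."""
    if y < SCREEN_H // 2:               # upper half: remote devices
        return "display_device" if x < SCREEN_W // 2 else "audio_device"
    return "control_device"             # lower half: the control device itself
```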
Example 2
Exemplarily, different areas are set for control instructions of screen capturing, screen recording and sound recording in a touch display screen, and if a user performs touch operation in the area corresponding to the control instruction of the screen capturing, the control instruction of the screen capturing is triggered and generated; if the user performs touch operation in an area corresponding to the control instruction of the screen recording, triggering to generate the control instruction of the screen recording; and if the user performs touch operation in the area corresponding to the recorded control instruction, triggering to generate the recorded control instruction.
For example, as shown in fig. 9, the touch display screen of the control device is divided into three areas: area 1, area 2, and area 3. Area 1 may be set as the area corresponding to screen capture, area 2 to screen recording, and area 3 to sound recording. A touch operation received in area 1 then corresponds to the screen capture control instruction, one received in area 2 to the screen recording control instruction, and one received in area 3 to the sound recording control instruction; the touch operations received in the different areas may be the same or different.
Example 3
Exemplarily, different types of touch operations are set for the control device and the remote device: if the type of touch operation performed by the user corresponds to the control device, screen capture/screen recording/sound recording control is performed on the control device; if it corresponds to the remote device, screen capture/screen recording/sound recording control is performed on the remote device.
For example, multi-finger operations may be assigned to the control device itself, and knuckle operations to the remote device. When the user performs a set knuckle touch operation on the touch display screen, a control instruction for screen capture/screen recording/sound recording of a connected remote device is triggered; when the user performs a multi-finger touch operation, a control instruction for screen capture/screen recording/sound recording of the control device itself is triggered.
In some embodiments of the present application, different types of touch operations may also be respectively corresponding to the control device itself and different types of remote devices, where the control device triggers to generate corresponding control instructions by receiving different types of touch operations on the touch display screen, and respectively controls the control device itself and the different types of remote devices connected to the control device itself.
For example, when the remote devices connected to the control device are divided into display devices and audio devices, multi-finger operations may correspond to control instructions for the control device itself, knuckle taps on the touch display screen may correspond to the display device, and knuckle slides on the touch display screen may correspond to the audio device. When the user performs a preset knuckle tap operation, a control instruction for screen capture/screen recording/sound recording of the display device is triggered; when the user performs a set knuckle slide operation, a control instruction for screen capture/screen recording/sound recording of the audio device is triggered; and when the user performs a multi-finger touch operation, a control instruction for screen capture/screen recording/sound recording of the control device itself is triggered.
Example 4
Exemplarily, different types of touch operations are set for the screen capture, screen recording, and sound recording control instructions: if the type of touch operation performed by the user corresponds to screen capture, the screen capture control instruction is triggered; if it corresponds to screen recording, the screen recording control instruction is triggered; and if it corresponds to sound recording, the sound recording control instruction is triggered.
For example, the touch operation may be a knuckle operation, that is, the user operates the touch display screen with the knuckle of any finger. The control device may map a single-knuckle double tap to the screen capture control instruction, a double-knuckle double tap to the screen recording control instruction, and a preset knuckle sliding along a preset track to the sound recording control instruction. Then, as shown in diagram (a) of fig. 10, the user triggers the screen capture control instruction by double-tapping the touch display screen with the knuckle of any finger; as shown in diagram (b) of fig. 10, the user triggers the screen recording control instruction by double-tapping with the knuckles of any two fingers; and as shown in diagram (c) of fig. 10, the user triggers the sound recording control instruction by sliding the knuckle of any finger along the preset track.
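The gesture-to-instruction mapping of Example 4 might be sketched as follows, assuming the input event has already been reduced to a finger count, tap count, and swipe flag; these fields are illustrative, not the application's actual input format:

```python
# Illustrative classification of knuckle gestures into the three control
# instructions of Example 4. The event fields are hypothetical.
def classify_gesture(finger_count: int, tap_count: int, swiped_track: bool):
    """Return the control instruction for a knuckle gesture, or None."""
    if swiped_track and finger_count == 1:
        return "audio_record"           # knuckle slides along a preset track
    if tap_count == 2 and finger_count == 1:
        return "screen_capture"         # single-knuckle double tap
    if tap_count == 2 and finger_count == 2:
        return "screen_record"          # double-knuckle double tap
    return None                         # unrecognized: keep the original flow
```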
In the above embodiments, the control device can trigger the control instruction of screen capture/screen recording/sound recording in multiple ways, so as to implement the device control function, and in the specific implementation, the implementation mode can be flexibly adjusted according to the actual requirements, so that the method has better applicability and can adapt to various application scene requirements.
For example, the cross-device control method provided by the embodiment of the present application is described below in combination with the above instruction triggering manner, taking knuckle-controlled screen capture/screen recording of a display device by the control device as an example. A touch operation performed in the upper half of the touch display screen corresponds to a control instruction for screen capture/screen recording of the display device, and one performed in the lower half corresponds to a control instruction for screen capture/screen recording of the control device itself; a single-knuckle double tap on the touch display screen corresponds to the screen capture control instruction, and a double-knuckle double tap corresponds to the screen recording control instruction. As shown in fig. 11, the specific process includes the following steps:
Step S1101: the control device is connected to the display device, and detects a touch operation of the user double-tapping the touch display screen with a knuckle.
Step S1102: the control device determines whether the area corresponding to the touch operation is the upper half part of the touch display screen. If so, go to step S1103, otherwise, go to step S1104.
Step S1103: the control device determines whether the cross-device control feature switch is enabled. If so, go to step S1105; otherwise, go to step S1104.
Step S1104: the original related control flow is kept.
Step S1105: the control device determines whether the touch operation is a double-knuckle double tap on the display screen. If so, go to step S1106; otherwise, go to step S1108.
Step S1106: and the control equipment sends a screen recording control instruction generated according to the touch operation to the display equipment connected with the control equipment so as to instruct the display equipment to record the screen.
Step S1107: when the control device detects a touch operation for triggering the stop of screen recording, it generates a stop-recording control instruction and sends it to the display device to instruct the display device to stop recording, then receives the screen recording data returned by the display device after recording stops.
Step S1108: the control equipment sends a screen capture control instruction to the display equipment connected with the control equipment to indicate the display equipment to capture the screen, and receives screen capture data returned after the screen capture of the display equipment.
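The branching of steps S1101 to S1108 can be sketched as a single handler; `send_instruction` is a hypothetical stand-in for the channel to the connected display device:

```python
# Sketch of the fig. 11 flow (steps S1101-S1108). The callback and return
# labels are illustrative assumptions, not the application's actual API.
def handle_knuckle_double_tap(in_upper_half: bool, feature_enabled: bool,
                              two_knuckles: bool, send_instruction):
    # S1102/S1103: outside the remote area, or feature off -> original flow (S1104)
    if not (in_upper_half and feature_enabled):
        return "original_flow"
    # S1105: a double-knuckle double tap means screen recording
    if two_knuckles:
        send_instruction("display_device", "screen_record")   # S1106
        return "screen_record_started"
    send_instruction("display_device", "screen_capture")      # S1108
    return "screen_capture_requested"
```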
A schematic diagram of the screen capture/recording effect of the above method is shown in fig. 12. Specifically, when the control device controls the display device to perform screen capturing, as shown in a schematic diagram (a) in fig. 12, when the control device receives a single finger joint double-click operation in an area corresponding to the display device on the display screen, the control device instructs the display device to perform screen capturing; as shown in the schematic diagram (b) in fig. 12, the display device performs screen capture according to the instruction of the control device, and returns a screen capture picture to the control device; as shown in the schematic diagram (c) in fig. 12, the control device displays the screenshot returned by the display device to feed back to the user. When the control device controls the display device to record a screen, as shown in a schematic diagram (d) in fig. 12, when the control device receives a double-click operation of a double-finger joint in an area corresponding to the display device on the display screen, the control device instructs the display device to record the screen; as shown in a schematic diagram (e) in fig. 12, the display device records a screen according to the instruction of the control device, and returns a video obtained by recording the screen to the control device; as shown in diagram (f) of fig. 12, the control device displays the video returned by the display device for feedback to the user.
The specific implementation flows provided by the above embodiments are only examples of applicable method flows in the embodiments of the present application; the execution order of the steps may be adjusted according to actual requirements, and steps may be added or omitted. The method flows provided by the above embodiments may also be executed in combination with those of other embodiments to implement the cross-device control method provided by the embodiments of the present application.
According to the method provided by this embodiment, by classifying the operation types that trigger generation of control instructions and dividing the touch display screen into regions, the user's operations can be mapped to different control objects and control instructions. The control device can thus perform screen capture/recording control both on itself and, across devices, on other connected devices, covering a wider range of application scenarios.
For example, the cross-device control method provided by the embodiment of the present application is described below in combination with the above instruction triggering manner, taking knuckle-controlled sound recording of an audio device connected to the control device as an example. A touch operation performed in the upper half of the touch display screen corresponds to a control instruction for sound recording of the audio device, and one performed in the lower half corresponds to a control instruction for sound recording of the control device itself; a single knuckle sliding along a preset track on the touch display screen corresponds to the sound recording control instruction. As shown in fig. 13, the specific process includes the following steps:
Step S1301: the control device is connected to the audio device, and detects a touch operation of the user sliding a knuckle along the preset track on the touch display screen.
Step S1302: the control device determines whether the cross-device control feature switch is enabled. If so, go to step S1303; otherwise, go to step S1304.
Step S1303: and the control equipment generates a recording control instruction according to the touch operation and sends the recording control instruction to the audio equipment so as to indicate the audio equipment to record. And performs step S1305.
Step S1304: the original related control flow is kept.
Step S1305: when the control device detects a touch operation for triggering the stop of recording, it generates a stop-recording control instruction and sends it to the audio device to instruct the audio device to stop recording, then receives the recording data returned by the audio device after recording stops.
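A minimal sketch of the start/stop recording exchange of steps S1301 to S1305, with `AudioDevice` as an illustrative stand-in for the connected remote audio device (class, method, and field names are assumptions):

```python
# Hypothetical stand-in for the remote audio device: it starts recording on
# the start instruction and returns the recorded data on the stop instruction.
class AudioDevice:
    def __init__(self):
        self.recording = False
        self.samples = []

    def handle(self, instruction):
        if instruction == "audio_record_start":
            self.recording = True
        elif instruction == "audio_record_stop":
            self.recording = False
            return list(self.samples)   # return recorded data to the controller
        return None

def record_session(device, feature_enabled: bool):
    """Sketch of steps S1301-S1305 on the control device side."""
    if not feature_enabled:
        return "original_flow"                      # S1304
    device.handle("audio_record_start")             # S1303
    device.samples.extend([0.1, 0.2])               # placeholder captured audio
    return device.handle("audio_record_stop")       # S1305
```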
A schematic diagram of the recording effect of the above method is shown in fig. 14. Specifically, when the control device controls the audio device to record: as shown in diagram (a) of fig. 14, when the control device receives an operation of a knuckle sliding along the preset track in the area of the display screen corresponding to the audio device, it instructs the audio device to record; as shown in diagram (b) of fig. 14, the audio device records according to the instruction and returns the recording data to the control device. After receiving the audio data returned by the audio device, the control device can display corresponding prompt information notifying the user that the audio data recorded by the audio device has been acquired, and perform further processing according to the user's operation. For example, as shown in diagram (c) of fig. 14, after receiving audio data 1 sent by the audio device, the control device displays prompt information indicating that the recorded audio data has been acquired, together with prompt information asking whether the user wants to store audio data 1. If the user chooses to store it, the control device stores audio data 1 and may display its storage interface, as shown in fig. 15, where the user can continue to perform related operations on audio data 1, for example playing it.
The specific implementation flows provided by the above embodiments are only examples of applicable method flows in the embodiments of the present application; the execution order of the steps may be adjusted according to actual requirements, and steps may be added or omitted. The method flows provided by the above embodiments may also be executed in combination with those of other embodiments to implement the cross-device control method provided by the embodiments of the present application.
In the method provided by the above embodiment, the control device receives the user's touch operation for controlling the audio device, generates the corresponding control instruction, and instructs the remote device to record. This allows the control device to act as a carrier for quickly performing recording control over other audio devices, and thus to quickly acquire data from those devices.
In instruction triggering mode 2, the control device presets different types of key operations and maps different key operations to different control instructions and control objects, thereby determining the control instruction and control object corresponding to the user's operation. In the cross-device control process, the control device responds to the user's operation on its physical keys and/or on virtual keys on the touch display screen, generates the corresponding control instruction, and controls the corresponding control object to execute it. The control instruction is a screen capture or recording control instruction.
In the embodiment of the application, the correspondence between different key operations and the control instructions and control objects is either a preset fixed correspondence or one determined by the user's settings. The different types of key operations may be preset fixed operations or operations set by the user.
In some embodiments of the present application, the operation performed by the user on the physical key of the control device may be an operation performed on a shortcut key, and the operation performed on the virtual key on the touch display screen may be an operation performed on a virtual key configured by default of the control device, or an operation performed on a preset virtual key corresponding to a control instruction for screen capture/screen recording/recording. For example, different virtual keys and/or physical keys of the control device may be combined and then respectively correspond to different control objects and different control instructions, and if the user simultaneously presses the combined keys or simultaneously presses the combined keys and maintains a preset time, the corresponding control instructions are triggered.
For example, simultaneously pressing the power key and the volume-up key may correspond to the screen capture control instruction, simultaneously pressing the power key and the volume-down key to the screen recording control instruction, and simultaneously pressing the volume-up and volume-down keys to the sound recording control instruction. Simply pressing a key combination may correspond to a control instruction for the remote device, while pressing the combination and holding it for a preset time corresponds to a control instruction for the control device itself.
For another example, virtual keys corresponding to the screen capture, screen recording, and sound recording control instructions may be added to the touch display screen of the control device; the user triggers the corresponding control instruction by tapping or long-pressing a virtual key. Each added virtual key may be provided with a user-controlled switch: if the user turns the switch on, the virtual key is displayed on the touch display screen; otherwise, it is not displayed.
In the above manner, the control object may be divided into the control device itself and different types of remote devices connected to the control device, and different types of key operations are set correspondingly.
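The key-combination mapping described above might look like the following sketch; the key names, long-press threshold, and encoding are all assumptions for illustration:

```python
# Illustrative mapping for instruction triggering mode 2: a key combination
# selects the instruction, and holding the combination selects the control
# device itself rather than the remote device. All values are hypothetical.
COMBO_TO_INSTRUCTION = {
    frozenset({"power", "volume_up"}): "screen_capture",
    frozenset({"power", "volume_down"}): "screen_record",
    frozenset({"volume_up", "volume_down"}): "audio_record",
}
LONG_PRESS_MS = 800   # held at least this long -> the control device itself

def resolve_keys(pressed, held_ms: int):
    """Map a set of pressed keys to a (control object, instruction) pair."""
    instruction = COMBO_TO_INSTRUCTION.get(frozenset(pressed))
    if instruction is None:
        return None
    target = "control_device" if held_ms >= LONG_PRESS_MS else "remote_device"
    return target, instruction
```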
In instruction triggering mode 3, the user can issue a screen capture/screen recording/sound recording control instruction by voice, and the control device determines the corresponding control instruction and control object by receiving and recognizing the voice information uttered by the user.
Illustratively, the control device triggers a corresponding control instruction when it receives preset voice information corresponding to the screen capture/screen recording/sound recording control instruction, or when it receives voice information that corresponds to such a control instruction and contains a preset key field, where the preset key field includes information such as the name of the remote device to be controlled and the name of the operation to be performed.
For example, when the display device connected to the control device is a smart TV named "smart screen", the preset voice information may be "take a screenshot/record the screen/record audio on the smart screen", "screenshot the smart screen", and the like; when the control device detects that the user has uttered such voice information, it triggers the corresponding control instruction. The preset keywords may include fields such as "smart screen", "screen capture/screen recording/sound recording", and "capture, screen"; when the control device detects that the user's voice contains the preset keywords, it triggers the corresponding control instruction. If the control device detects that the voice information uttered by the user is "screenshot", it captures its own touch display screen; if the voice information is "screenshot the smart screen", it controls the connected smart screen to perform the screen capture operation.
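Keyword-based matching of this kind could be sketched as below; the device names, verb list, and matching strategy are illustrative assumptions (a real implementation would operate on speech-recognition output):

```python
# Hypothetical keyword matching for instruction triggering mode 3: a verb
# keyword selects the instruction, and a named device selects the target;
# with no device name, the control device itself is the target.
DEVICE_NAMES = {"smart screen": "smart_tv"}
VERBS = {
    "screen capture": "screen_capture",
    "screen record": "screen_record",
    "record audio": "audio_record",
}

def parse_voice(text: str):
    """Map recognized speech text to a (target, instruction) pair, or None."""
    text = text.lower()
    instruction = next((v for k, v in VERBS.items() if k in text), None)
    if instruction is None:
        return None
    target = next((d for n, d in DEVICE_NAMES.items() if n in text),
                  "control_device")
    return target, instruction
```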
In the voice control mode, the user only needs to speak the control requirement, which is more convenient than triggering control through touch or key operations, so the cross-device control function can be realized more quickly and efficiently.
In the embodiment of the application, when the control device receives a control instruction of screen capture/screen recording/sound recording, the control device issues the control instruction to the corresponding remote device in at least one of the following ways:
Instruction issuing mode 1: the control device directly sends the control instruction to all remote devices connected to it.
Specifically, when receiving a control instruction for screen capture/screen recording/sound recording of a remote device, the control device directly sends the instruction to all connected remote devices. After receiving the instruction, each remote device performs the screen capture/screen recording/sound recording operation and, on completion, returns the acquired data to the control device, which stores and/or displays it. If the control device is connected to only one remote device, it directly stores and/or displays the data returned by that device. If it is connected to multiple remote devices, it displays prompt information for selecting among the data returned by them, stores and/or displays the data returned by the remote device the user selects, and deletes the data returned by the remote devices the user did not select.
In the embodiment of the application, the control device thus first obtains the result data of the screen capture/screen recording/sound recording operations from all connected remote devices and then presents it to the user for selection, which shortens the execution time as much as possible and ensures the continuity and real-time performance of cross-device control.
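Instruction issuing mode 1 can be sketched as a broadcast followed by a user selection; the device callables below are hypothetical stand-ins for the connected remote devices:

```python
# Sketch of instruction issuing mode 1: broadcast the instruction to every
# connected remote device, collect the returned data, and keep only the
# user-selected result when several devices respond. All names are illustrative.
def broadcast_and_select(devices, instruction, choose):
    """devices: name -> callable(instruction) returning that device's data.
    choose: callback that prompts the user to pick among device names."""
    results = {name: dev(instruction) for name, dev in devices.items()}
    if len(results) == 1:
        return next(iter(results.values()))   # single device: use directly
    chosen = choose(sorted(results))          # multiple devices: user selects
    # Data from unselected devices is discarded by the controller.
    return results[chosen]
```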
Instruction issuing mode 2: the control device sends a control instruction carrying a deferred-return indication to all remote devices connected to it.
Specifically, when the control device receives a control instruction for screen capture/screen recording/sound recording of a remote device, it adds deferred-return indication information to the instruction and sends it to all connected remote devices. The deferred-return indication instructs each remote device, after performing the screen capture/screen recording/sound recording operation, not to return the acquired data immediately but only to report whether the operation succeeded. After receiving the instruction, each remote device performs the operation, temporarily stores the acquired data if successful, and sends success/failure feedback to the control device. Based on the feedback, the control device displays the remote devices that executed successfully and prompts the user to select the device from which to obtain the screen capture/screen recording/sound recording data. According to the user's selection, the control device sends a return-data indication to the selected remote device and a delete-data indication to the unselected remote devices.
If receiving the indication information of the returned data, each remote device sends the data acquired after the screen capturing/recording operation to the control device; and if the instruction information for deleting the data is received, deleting the previously stored data acquired after the screen capturing/recording operation is executed. And after receiving the data returned by the remote equipment selected by the user, the control equipment stores and/or outputs and displays the data.
For example, suppose the control device is simultaneously connected to three remote devices: remote device A, remote device B, and remote device C. If an instruction for capturing the screens of the remote devices is received, the control device sends a screen capture control instruction to remote devices A, B, and C, which each perform a screen capture operation according to the received instruction. If remote devices A and B successfully complete the screen capture operation while remote device C fails, remote devices A and B each send feedback information indicating a successful screen capture to the control device, and remote device C sends feedback information indicating a failed screen capture. After receiving the feedback information and determining that remote devices A and B captured their screens successfully, the control device displays prompt information for selecting between remote device A and remote device B. According to the received user selection operation on remote devices A and B, if the control device determines that remote device A is selected, it instructs remote device A to send the picture obtained by the screen capture and displays the received picture, while remote device B may delete the picture it captured.
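The deferred-return flow above can be sketched in a few lines. This is a minimal in-process illustration, not an implementation from the patent: the controller asks every connected remote device to capture but hold its data, then fetches data only from the device the user selects and tells the others to discard. All class and method names are illustrative.

```python
class RemoteDevice:
    """Simulates a remote device that can capture and hold back its screenshot."""
    def __init__(self, name, will_succeed=True):
        self.name = name
        self.will_succeed = will_succeed
        self.pending = None  # captured data withheld until instructed

    def capture(self, defer_return):
        if not self.will_succeed:
            return {"device": self.name, "ok": False}
        data = f"screenshot-from-{self.name}"
        if defer_return:
            self.pending = data  # store locally, do not send yet
            return {"device": self.name, "ok": True}
        return {"device": self.name, "ok": True, "data": data}

    def return_data(self):
        data, self.pending = self.pending, None
        return data

    def delete_data(self):
        self.pending = None


class Controller:
    def __init__(self, remotes):
        self.remotes = remotes

    def capture_all(self):
        # Send the capture command with the deferred-return flag to every remote;
        # collect only the devices that report success, for the user to pick from.
        acks = [r.capture(defer_return=True) for r in self.remotes]
        return [a["device"] for a in acks if a["ok"]]

    def fetch_selected(self, selected_name):
        result = None
        for r in self.remotes:
            if r.name == selected_name:
                result = r.return_data()  # the user's choice returns its data
            else:
                r.delete_data()           # everyone else discards
        return result
```

With remotes A and B succeeding and C failing, `capture_all()` offers A and B, and selecting A transfers only A's screenshot, mirroring the example above.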
As an optional implementation, in this embodiment of the application, the control device may also issue the control instruction to the remote devices by combining instruction issuing mode 1 and instruction issuing mode 2. Specifically, when the control device receives a control instruction for screen capture/screen recording/recording on the remote devices, if it determines that it is currently connected to only one remote device, the control device treats the control instruction as targeting that remote device, adopts instruction issuing mode 1 to send the control instruction directly to the remote device, and receives the data returned after the remote device executes the instruction; if it determines that it is currently connected to multiple remote devices, it adopts instruction issuing mode 2 and sends a control instruction carrying the deferred-return indication to each remote device. The specific implementation is the same as described above.
In the above embodiment of the present application, the control device controls all connected remote devices to perform screen capture/screen recording/recording operations, and simultaneously suspends data return of the remote devices, and after the user selects a remote device to be controlled, the corresponding remote device returns data. Therefore, the control device does not need to receive and process data of all remote devices, the data processing amount of the control device can be reduced, and the execution efficiency can be improved.
When the control device issues the screen capture/screen recording/recording control instruction to the remote device by using the instruction issuing method 1 or the instruction issuing method 2, because information transmission delay exists between the control device and the remote device, the problem of delay also exists when the remote device receives the control instruction to perform screen capture/screen recording/recording, so that the remote device cannot perform accurate screen capture/screen recording/recording, and the control device cannot obtain accurate screen capture/screen recording/recording data. In view of this, in the present application, the remote device stores the data in the specific time before the current time in advance, so that when the remote device performs screen capture/screen recording/recording, the data before and after the time point when the control device triggers the control by the user can be acquired, thereby acquiring accurate screen capture/screen recording/recording data.
Specifically, after the remote device is connected to the control device, if the multimedia data displayed or played by the remote device is already stored locally, the remote device performs no additional processing. If the multimedia data displayed or played by the remote device is not stored locally, such as during online video playback, the remote device caches, at each moment during display or playback, the multimedia data within a preset time before that moment.
For example, suppose that after the remote device is connected to the control device, it plays video data with a duration of 10 minutes, and the preset time is 50 ms. During playback, when the data at playback position 2 min 20.500 s is being played, the remote device buffers the data within the preceding 50 ms, i.e., the data between playback positions 2 min 20.450 s and 2 min 20.500 s; when the data at 2 min 20.600 s is being played, it buffers the data within the preceding 50 ms, i.e., the data between 2 min 20.550 s and 2 min 20.600 s.
In a specific implementation, the video data can be cached in real time during playback, and any cached data older than the preset time before the current moment is deleted, where the current moment can be any moment during playback. For example, if the current playback position is 2 min 20.500 s, the point corresponding to the preset time before the current moment is 2 min 20.450 s, so the data cached before playback position 2 min 20.450 s is deleted and only the data within the preset time before the current moment (2 min 20.450 s to 2 min 20.500 s) is retained.
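The rolling cache described above can be sketched as follows, assuming frames arrive with millisecond timestamps: only the last `window_ms` of frames behind the newest one are retained, and everything older is evicted, so a later capture request can still reach slightly into the past. The class and method names are illustrative.

```python
from collections import deque

class RollingFrameCache:
    """Keeps only the frames from the last window_ms milliseconds of playback."""
    def __init__(self, window_ms):
        self.window_ms = window_ms
        self.frames = deque()  # (timestamp_ms, frame) pairs, oldest first

    def on_frame(self, timestamp_ms, frame):
        self.frames.append((timestamp_ms, frame))
        # Evict frames older than the retention window behind the newest frame.
        while self.frames and self.frames[0][0] < timestamp_ms - self.window_ms:
            self.frames.popleft()

    def frames_at_or_after(self, timestamp_ms):
        """Frames still cached from timestamp_ms onward (for a delayed capture)."""
        return [f for t, f in self.frames if t >= timestamp_ms]
```

With a 50 ms window and a frame every 10 ms, the cache always holds the six most recent frames, matching the eviction example above.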
In the embodiment of the application, after receiving a control instruction from the control device, the remote device determines the information transmission delay T between itself and the control device, and then starts the screen capture/screen recording/recording from the point in the pre-cached data corresponding to the current time minus the information transmission delay T. Here, the current time is any moment during the display or playback of the multimedia data by the remote device. If the control instruction is a screen capture instruction, the remote device continuously captures the data within a preset time interval before and after the point corresponding to the current time minus T, obtaining the screen capture data. If the control instruction is a screen recording/recording instruction, the remote device starts recording the screen/sound from the point corresponding to the current time minus T, until it receives a stop-recording control instruction sent by the control device.
The preset time is a set time greater than the sum of the information transmission delay and the preset time interval. For example, the preset time interval may be 10 ms.
In the embodiment of the application, the control instruction for stopping screen recording/recording is triggered in the same way as the control instruction for starting screen recording/recording: the control device receives the user-triggered stop instruction and then sends it to the remote device. The remote device stops screen recording/recording after receiving the stop control instruction sent by the control device; alternatively, after stopping, the remote device also deletes the screen recording/recording data within the error period immediately preceding receipt of the stop instruction, where the error period equals the information transmission delay.
In the embodiment of the present application, the remote device determines the information transmission delay T by any one of the following manners:
Delay determination mode 1: the remote device determines the corresponding information transmission delay T according to the instruction sending time indicated by the control device.
the control device carries instruction sending time in a control instruction sent to the remote device, and after the remote device receives the control instruction, the remote device determines the difference between the instruction receiving time for receiving the control instruction and the instruction sending time carried in the control instruction as information transmission delay. The instruction sending time is the time when the control device sends the control instruction to the remote device, and is the same as the time when the control device receives the user-triggered screen capture/screen recording/recording operation, for example, the time may be the time when the control device detects that the user performs the touch operation of triggering the screen capture/screen recording/recording control instruction on the touch display screen.
Illustratively, the time when the control device detects the touch operation performed by the user is T1, the time when the remote device receives the corresponding control instruction sent by the control device is T2, and according to the time T1 indicated by the control device, the remote device can determine that the information transmission delay with the control device is T2-T1.
For example, as shown in fig. 16, assume that the remote device starts playing video data at time t0, and the control device receives a touch operation performed by the user at time t1 and sends the corresponding control instruction to the remote device, carrying the instruction sending time t1. The remote device receives the control instruction at time t2, and the preset time interval is t. The remote device determines the information transmission delay T = t2 - t1.
When the remote device receives a screen capture instruction sent by the control device at time t2, it captures the data at time t1 (i.e., t2 - T), or continuously captures the data within time t before and after t1, to obtain the screen capture data. In practice, considering that t1 may carry a small error and that the remote device's display content may update quickly, continuously capturing the data within t before and after t1 allows the required picture to be selected from the multiple captured pictures, further ensuring that accurate screen capture data is obtained.
When the remote device receives a screen recording/recording instruction sent by the control device at time t2 and receives a stop-recording control instruction sent by the control device at time t3, it records the data starting from time t2 - T, i.e., time t1, and stops at time t3, obtaining the screen recording/recording data. After stopping at time t3, the remote device may also compute t4 = t3 - T and delete the data recorded within the period t4 to t3, obtaining more accurate data. Here, t1 < t2 < t4 < t3.
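The compensation arithmetic above is simple enough to state directly. This sketch, with illustrative function names, computes the time span to grab for a screenshot (an interval of t on each side of t1 = t2 - T) and the effective span of a recording (start shifted back by T, trailing T trimmed off):

```python
def screenshot_window(t2_ms, delay_ms, interval_ms):
    """Span to capture for a screenshot: interval_ms on each side of t1 = t2 - T."""
    t1 = t2_ms - delay_ms
    return (t1 - interval_ms, t1 + interval_ms)

def recording_span(t2_ms, t3_ms, delay_ms):
    """Effective recording span: start at t2 - T, trim the trailing T after stop."""
    start = t2_ms - delay_ms  # compensate the late arrival of the start instruction
    end = t3_ms - delay_ms    # t4 = t3 - T: drop data recorded after the user hit stop
    return (start, end)
```

For example, with T = 30 ms and t = 10 ms, a capture instruction arriving at t2 = 1000 ms grabs the span (960, 980), centered on the moment the user actually triggered the capture.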
In the above embodiment, the control device instructs the remote device to perform the start time point of screen capture/screen recording/recording, so that the remote device can perform accurate screen capture/screen recording/recording operation to obtain a relatively accurate operation result, thereby avoiding an error caused by information forwarding delay between devices.
Delay determination mode 2: the remote device determines the information transmission delay T using the heartbeat packets exchanged with the control device.
A heartbeat packet (heartbeat data packet) is a message sent at regular, agreed intervals between devices in a connected state to notify each other of their status. In the embodiment of the application, after the remote device establishes a connection with the control device, the two devices periodically send heartbeat packets to each other so that the peer can confirm the connection state; the remote device can therefore use the heartbeat packets to determine the information transmission delay between itself and the control device.
Specifically, as an optional implementation manner, after the remote device sends the heartbeat packet to the control device, the control device returns a heartbeat packet response (heartbeat response data packet) to the remote device, and the remote device determines that half of a time difference between receiving the heartbeat packet response and sending the heartbeat packet is information transmission delay. As shown in fig. 17, the remote device sends a heartbeat packet with the current sending time to the control device, the control device sends a heartbeat packet response to the remote device after receiving the heartbeat packet, the remote device receives the heartbeat packet response and determines the time for receiving the heartbeat packet response, and the remote device determines half of the time difference between receiving the heartbeat packet response and sending the heartbeat packet as the information transmission delay.
As another optional implementation, the control device carries the sending time of the heartbeat packet in the heartbeat packet sent to the remote device, and the remote device determines the difference between the receiving time of the heartbeat packet and the sending time of the heartbeat packet as the information transmission delay.
As another optional implementation manner, after the control device sends the heartbeat packet to the remote device, the remote device sends a heartbeat packet response to the control device, and the control device determines a half of a time difference between receiving the heartbeat packet response and sending the heartbeat packet as the information transmission delay and indicates the information transmission delay to the remote device.
As another optional implementation, the remote device sends a heartbeat packet carrying its sending time to the control device, the control device returns a heartbeat packet carrying that same sending time, and the remote device determines half of the difference between the time at which it receives the returned heartbeat packet and the carried sending time as the information transmission delay. As shown in fig. 18, the remote device sends a heartbeat packet with an attached sending timestamp to the control device; after receiving it, the control device sends a heartbeat packet carrying the same sending timestamp back to the remote device; after receiving it, the remote device determines half of the difference between the reception time and the time corresponding to the sending timestamp as the information transmission delay.
In the above embodiment, the heartbeat packet between the control device and the remote device is sent at a fixed time, so that the remote device or the control device can not only determine the accurate information transmission delay according to the heartbeat packet, but also update the determined information transmission delay at a fixed time according to the fixed time sending of the heartbeat packet, thereby ensuring the accuracy of the information transmission delay.
For example, as shown in fig. 19, assuming that the remote device sends a heartbeat packet to the control device at time T1 and receives the heartbeat packet response returned by the control device at time T2, it can determine that the information transmission delay with the control device is T = (T2 - T1)/2. Alternatively, assuming that the control device sends a heartbeat packet carrying the sending time T3 to the remote device at time T3, and the remote device receives the heartbeat packet at time T4, it can determine that the information transmission delay with the control device is T = T4 - T3. Here, T1 < T2 and T3 < T4.
Further, in order to improve the precision, in this embodiment of the application, the remote device and the control device may determine a plurality of corresponding information transmission delays by using the heartbeat packet for a plurality of times, and average a plurality of information transmission delay values determined for a plurality of times, as the information transmission delay used when executing the screen capture/screen recording/recording instruction.
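The round-trip estimate and the averaging above can be sketched as two small functions. This is an illustrative sketch (the timestamps would come from real heartbeat traffic; the flow below averages 5 samples), using the same half-RTT estimate as the text:

```python
def one_way_delay(send_time_ms, reply_time_ms):
    """Half the heartbeat round trip: T = (T2 - T1) / 2."""
    return (reply_time_ms - send_time_ms) / 2

def averaged_delay(samples):
    """samples: list of (heartbeat_sent_ms, reply_received_ms) pairs.
    Averages several one-way estimates for a more stable T."""
    delays = [one_way_delay(s, r) for s, r in samples]
    return sum(delays) / len(delays)
```

Averaging several heartbeats smooths out jitter on any single round trip, which is why the flow below repeats the measurement a preset number of times before using T.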
The method is illustrated below, in conjunction with fig. 20, by the example of the control device controlling the remote device to perform screen capture/recording. As shown in the figure, the specific process includes the following steps:
Step S2001: the remote device establishes a connection with the control device, and if the feature switch is on, the remote device sends a heartbeat packet carrying the current time T1 (the heartbeat packet sending time) to the control device.
Step S2002: and after receiving the heartbeat packet sent by the remote equipment, the control equipment returns a heartbeat packet response to the remote equipment.
Step S2003: the remote device determines the time T2 at which the heartbeat packet reply returned by the control device is received.
Step S2004: the remote device determines that a one-way information transmission between itself and the control device takes T5 = (T2 - T1)/2.
Step S2005: the remote device and the control device repeat the above steps a preset number of times to determine multiple values of T5, and average them to obtain the information transmission delay T between the control device and the remote device.
Wherein, the preset times can be 5 times.
Step S2006: the remote device determines whether dynamic multimedia data is currently displayed. If so, go to step S2007, otherwise, go to step S2009.
The remote device identifies the data type of the current display data to determine whether the current display data is dynamic multimedia data.
Step S2007: the remote device determines whether current display data has been stored. If so, go to step S2009, otherwise, go to step S2008.
Step S2008: and the remote equipment caches the multimedia data in a preset time before the current time.
The preset time is the sum of T determined by the remote device and a preset time interval, and the preset time interval may be 10 ms.
Step S2009: and when the remote equipment receives a screen capturing/recording control instruction sent by the control equipment, the screen capturing/recording is carried out according to the control instruction, and the acquired screen capturing/recording data is sent to the control equipment.
When the remote equipment receives a screen capturing control instruction sent by the control equipment, the remote equipment captures the screen of data in each preset time interval before and after the corresponding time point obtained by subtracting T from the current time; and when the remote equipment receives a screen recording control instruction sent by the control equipment, starting to record the screen at a corresponding time point after subtracting T from the current time.
When the preset time interval is 10ms, the remote device performs screen capture on data within 10ms, namely 20ms, before and after the corresponding time point obtained by subtracting T from the current time.
Step S2010: and the control equipment receives screen capturing/recording data returned by the remote equipment and feeds the data back to the user.
The screen capture data here is continuous data captured by the remote device within a certain time, specifically an image set containing multiple images, from which the user can further select the desired image. For example, as shown in fig. 21, after the remote device performs the screen capture operation and returns the captured image set to the control device, the control device outputs and displays each image in the image set to the user, and the user may select which of the images to retain and may also perform processing such as image adjustment. According to the user's selection among the multiple images, the control device keeps the images selected by the user and deletes the images not selected.
The specific implementation flows provided by the above embodiments are only examples of applicable method flows in the embodiments of the present application, and the execution sequence of each step may be adjusted accordingly according to actual requirements, and other steps may be added or some steps may be reduced. The method flows provided by the above embodiments may also be executed in combination with the method flows provided by other embodiments to implement the cross-device control method provided by the embodiments of the present application.
In some scenarios, the remote device may be outputting and displaying a static image throughout a certain period; in these scenarios, when the remote device is controlled to capture the screen, it only needs to capture the current touch display screen, without caching the data from the preceding preset time. Therefore, as an optional implementation, when the remote device outputs and displays a multimedia file, it may identify the file type of the displayed file: if the displayed file is determined to be static, the remote device directly executes the corresponding screen capture or screen recording operation when a screen capture or screen recording control instruction is received; if the displayed file is determined to be dynamic, the remote device first performs the transmission-delay time compensation according to the method above and then executes the corresponding screen capture or screen recording operation.
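The branch just described reduces to a one-line decision. This sketch, with an assumed function name, returns the timestamp whose cached data should be captured: static content is grabbed as-is at the current moment, while dynamic content reaches back by the delay T:

```python
def capture_point_ms(file_is_dynamic, now_ms, delay_ms):
    """Timestamp whose data should be captured, per the static/dynamic branch."""
    if not file_is_dynamic:
        return now_ms             # static content: grab the current screen directly
    return now_ms - delay_ms      # dynamic content: compensate by reaching back T
```

The static path avoids the rolling cache entirely, which is the point of the optimization: no buffering cost when the screen is not changing.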
In the above embodiment, by determining the information transmission delay between the control device and the remote device, the remote device performs error time compensation during screen capture/screen recording/recording operation according to the information transmission delay, so as to implement accurate screen capture/screen recording/recording, and solve the problem of hysteresis existing during screen capture/screen recording/recording of a dynamic image.
In some embodiments of the present application, the remote device may be in a multi-screen cooperative state when receiving the screen capture/recording instruction; that is, the touch display screen of the remote device displays not only the remote device's own data but also data projected to it by other electronic devices. Therefore, when the remote device is controlled to perform screen capture/screen recording/recording, it can acquire either only the data belonging to the remote device in the currently displayed data, or all currently displayed data.
As an optional implementation, the remote device may perform screen capture/screen recording/recording only on its own displayed data. Specifically, when the remote device is in the multi-screen cooperative state, it adds a multi-screen cooperative mark to the data displayed by other electronic devices on the touch display screen; during screen capture/screen recording/recording it skips the data carrying the multi-screen cooperative mark and processes only the data belonging to the remote device itself.
As another alternative, the remote device may directly perform screen capture/recording on all data displayed on the display screen without distinguishing the data projected by itself or other electronic devices.
As another optional implementation, when the remote device receives a screen capture/screen recording/recording instruction sent by the control device, if it determines that the other electronic devices projected to it include the control device, it performs screen capture/screen recording/recording on the displayed data excluding the control device's data; if the other electronic devices projected to it do not include the control device, it performs screen capture/screen recording/recording either on the data belonging to the remote device or on all the data displayed on the display screen.
For example, as shown in fig. 22, the remote device is in a multi-screen coordination state, where the layer interface 1 is used to display data of the remote device itself, and the layer interface 2 is located above the layer interface 1 and is used to display data of other electronic devices that are projected to the remote device. For example, when the remote device is an intelligent television and the electronic device projected to the remote device is a mobile phone, the window corresponding to the layer interface 1 displays the interface of the intelligent television, and the window corresponding to the layer interface 2 displays the interface of the mobile phone.
When the remote equipment receives a screen capture instruction sent by the control equipment, the remote equipment determines whether a mobile phone projected to the remote equipment is the control equipment, if so, the remote equipment skips the layer interface 2 during screen capture and only captures the content displayed by the layer interface 1, otherwise, the remote equipment directly captures the corresponding content of the whole display screen, or skips the layer interface 2 and only captures the content displayed by the layer interface 1.
In the embodiment of the application, when the remote device draws a display interface or a display window, it adds a multi-screen cooperative mark to layer interface 2 corresponding to the other electronic devices, and this mark identifies whether layer interface 2 should be skipped during screen capture. When the remote device receives a screen capture or screen recording control instruction and performs the capture, if it determines that layer interface 2 carries the multi-screen cooperative mark, it skips layer interface 2 during the capture; the resulting screenshot is illustrated in fig. 23. If it determines that layer interface 2 carries no multi-screen cooperative mark, it captures the display interface corresponding to the entire display screen; the resulting screenshot is illustrated in fig. 24. An exemplary specific flow is shown in fig. 25 and includes the following steps:
step S2501: the remote device establishes a connection with the control device.
Step S2502: the remote device determines whether the multi-screen collaborative scene is currently present. If so, go to step S2503, otherwise, go to step S2504.
Step S2503: when the remote device draws a display window, a multi-screen cooperative mark is added to the window projected to the remote device.
Step S2504: and when the remote equipment receives a screen capturing control instruction sent by the control equipment, judging whether a window with a multi-screen cooperative sign exists in the current window of the touch display screen. If so, go to step S2505, otherwise, go to step S2506.
Step S2505: and after the remote equipment skips the window with the multi-screen cooperative mark, screen capture is carried out to obtain a screenshot of the window corresponding to the self data in the touch display screen. And performs step S2507.
Step S2506: and the remote equipment directly captures the screen to obtain the screenshot of the window corresponding to the touch display screen.
Step S2507: and after the screen capture of the remote equipment is finished, storing the screen capture data or sending the screen capture data to the control equipment.
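The check in steps S2504 to S2506 can be sketched as a filter over the windows currently drawn on the screen. This is an illustrative sketch (the `Window` class and flag name are assumptions, not the patent's API): windows tagged with the multi-screen cooperative mark are skipped so only the remote device's own content ends up in the screenshot.

```python
class Window:
    """A drawn window; multi_screen_flag marks content projected from another device."""
    def __init__(self, owner, content, multi_screen_flag=False):
        self.owner = owner
        self.content = content
        self.multi_screen_flag = multi_screen_flag

def capture_screen(windows, skip_projected=True):
    """Capture window contents, optionally skipping projected (flagged) windows."""
    if skip_projected:
        windows = [w for w in windows if not w.multi_screen_flag]
    return [w.content for w in windows]
```

Setting `skip_projected=False` corresponds to the alternative of capturing the entire display screen without distinguishing which device each layer belongs to.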
As another optional implementation, when the remote device receives a control instruction for screen capture or screen recording, the remote device directly performs screen capture without distinguishing the terminal devices to which the layer interface 1 and the layer interface 2 belong. The intercepted screen is shown in FIG. 23.
The specific implementation flows provided by the above embodiments are only examples of applicable method flows in the embodiments of the present application, and the execution sequence of each step may be adjusted accordingly according to actual requirements, and other steps may be added or some steps may be reduced. The method flows provided by the above embodiments may also be executed in combination with the method flows provided by other embodiments to implement the cross-device control method provided by the embodiments of the present application.
In the above embodiment, when the remote device is in a multi-screen collaborative scene, the device objects of the data displayed in the touch display screen are distinguished through the multi-screen collaborative flag bits, and the screen capturing or recording operation can be performed in a targeted manner according to user requirements in the screen capturing or recording process, so that various control requirements are met.
The cross-device control method provided by the foregoing embodiments of the present application may be executed in combination to implement a corresponding cross-device control function.
Based on the foregoing embodiments, an embodiment of the present application provides a cross-device control method, which is applied to a system formed by a first electronic device and a second electronic device, and as shown in fig. 26, the method includes the following steps:
step S2601: the first electronic device establishes a wireless communication connection with the second electronic device.
Step S2602: the first electronic device executes a first target acquisition operation in response to the received first operation.
Step S2603: and the first electronic equipment responds to the received second operation and sends a first instruction to the second electronic equipment, wherein the first instruction is used for instructing the second electronic equipment to execute a second target acquisition operation.
Step S2604: and the second electronic equipment responds to the received first instruction and executes the second target acquisition operation.
The target acquisition operation includes a screen capture operation, a screen recording operation, or a sound recording operation.
Specifically, for the steps executed by the first electronic device and the second electronic device in the method, reference may be made to the foregoing embodiments; details are not described herein again.
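Purely as an illustration, the flow of steps S2601 to S2604 can be sketched as follows. All class and method names are hypothetical; a direct object reference stands in for the wireless communication connection of step S2601, and a screen capture stands in for the target acquisition operation.

```python
# Illustrative sketch only, not the patented implementation.
# All names are hypothetical; the peer reference stands in for
# the wireless communication connection of step S2601.

class SecondDevice:
    def __init__(self):
        self.captured = []

    def on_instruction(self, instruction):
        # Step S2604: execute the second target acquisition operation
        # in response to the received first instruction.
        if instruction == "CAPTURE":
            self.captured.append("screenshot")

class FirstDevice:
    def __init__(self, peer):
        self.peer = peer
        self.captured = []

    def on_first_operation(self):
        # Step S2602: execute the first target acquisition operation locally.
        self.captured.append("screenshot")

    def on_second_operation(self):
        # Step S2603: send the first instruction to the second device.
        self.peer.on_instruction("CAPTURE")

second = SecondDevice()
first = FirstDevice(second)
first.on_first_operation()   # capture on the first device
first.on_second_operation()  # capture on the second device
```

The point of the sketch is only the division of labor: the first operation is handled locally, while the second operation is turned into an instruction for the peer device.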
Based on the above embodiments, an embodiment of the present application further provides an electronic device. The electronic device may be the first electronic device or the second electronic device, and is used to implement the cross-device control method applied to the first electronic device or the second electronic device. As shown in fig. 27, the electronic device 2700 may include: a display 2701, one or more processors 2702, a memory 2703, and one or more computer programs (not shown). The foregoing components may be coupled by one or more communication buses 2704.
The display 2701 is used for displaying images, videos, and related user interfaces. The one or more computer programs, including instructions, are stored in the memory 2703. The processor 2702 invokes the instructions stored in the memory 2703 to cause the electronic device 2700 to perform the cross-device control method provided by the embodiments of the present application.
In the embodiments provided in the present application, the method is described from the perspective of the electronic device as the execution subject. To implement the functions in the method provided by the embodiments of the present application, the terminal device may include a hardware structure and/or a software module, and implement the functions in the form of a hardware structure, a software module, or a combination of the two. Whether a function is implemented as a hardware structure, a software module, or a hardware structure plus a software module depends on the particular application and the design constraints imposed on the technical solution.
As used in the above embodiments, the term "when" may be interpreted to mean "if", "after", "in response to determining", or "in response to detecting", depending on the context. Similarly, depending on the context, the phrase "upon determining" or "if (a stated condition or event) is detected" may be interpreted to mean "if it is determined", "in response to determining", "upon detecting (the stated condition or event)", or "in response to detecting (the stated condition or event)". In addition, in the above embodiments, relational terms such as "first" and "second" are used only to distinguish one entity from another, and do not limit any actual relationship or order between the entities.
In the above embodiments, the implementation may be wholly or partially realized by software, hardware, firmware, or any combination thereof. When software is used, the implementation may take the form, in whole or in part, of a computer program product. The computer program product includes one or more computer instructions. When the computer instructions are loaded and executed on a computer, the processes or functions described in the embodiments of the present application are wholly or partially generated. The computer may be a general-purpose computer, a special-purpose computer, a computer network, or another programmable apparatus. The computer instructions may be stored in a computer-readable storage medium or transmitted from one computer-readable storage medium to another, for example, from one website, computer, server, or data center to another website, computer, server, or data center in a wired (e.g., coaxial cable, optical fiber, or digital subscriber line (DSL)) or wireless (e.g., infrared, radio, or microwave) manner. The computer-readable storage medium may be any usable medium accessible by a computer, or a data storage device, such as a server or a data center, that integrates one or more usable media. The usable medium may be a magnetic medium (e.g., a floppy disk, a hard disk, or a magnetic tape), an optical medium (e.g., a DVD), a semiconductor medium (e.g., a solid state disk (SSD)), or the like.
Those skilled in the art will recognize that, in one or more of the examples described above, the functions described in the embodiments of the present application may be implemented in hardware, software, firmware, or any combination thereof. When implemented in software, the functions may be stored on, or transmitted as, one or more instructions or code on a computer-readable medium. Computer-readable media include both computer storage media and communication media, where communication media include any medium that facilitates transfer of a computer program from one place to another. A storage medium may be any available medium accessible by a general-purpose or special-purpose computer.
The above-mentioned embodiments further illustrate the objects, technical solutions, and advantages of the embodiments of the present application. It should be understood that the above are only specific embodiments of the present application and are not intended to limit the scope of the embodiments of the present application; any modification, equivalent substitution, improvement, or the like made on the basis of the technical solutions of the embodiments of the present application shall fall within the protection scope of the embodiments of the present application. The foregoing description of the embodiments is provided to enable any person skilled in the art to make or use the teachings of the embodiments of the present application, and any modification based on the disclosed teachings should be considered obvious to those skilled in the art. The general principles described in the embodiments of the present application may be applied to other variations without departing from the inventive concept and scope of the present application. Thus, the disclosure of the embodiments of the present application is not intended to be limited to the embodiments and designs described, but is to be accorded the widest scope consistent with the principles and novel features disclosed herein.
Although the present application has been described in conjunction with specific features and embodiments thereof, it is apparent that various modifications and combinations can be made without departing from the spirit and scope of the embodiments of the application. Accordingly, the specification and drawings are merely exemplary descriptions of the present application as defined by the appended claims, and are intended to cover any and all modifications, variations, combinations, or equivalents within the scope of the present application. If such modifications and variations of the embodiments of the present application fall within the scope of the claims of the present application and their equivalents, the embodiments of the present application are intended to include them as well.

Claims (23)

1. A cross-device control method, applied to a system formed by a first electronic device and a second electronic device, wherein the method comprises the following steps:
the first electronic device establishes a wireless communication connection with the second electronic device;
the first electronic device executes a first target acquisition operation in response to a received first operation;
the first electronic device sends, in response to a received second operation, a first instruction to the second electronic device, wherein the first instruction is used for instructing the second electronic device to execute a second target acquisition operation;
the second electronic device executes the second target acquisition operation in response to the received first instruction;
wherein the target acquisition operation comprises: a screen capture operation, a screen recording operation, or a sound recording operation.
2. The method of claim 1, wherein prior to the second electronic device performing the second target acquisition operation, the method further comprises:
the first electronic device determines a transmission delay between the first electronic device and the second electronic device, and sends the transmission delay to the second electronic device; or
the second electronic device determines the transmission delay.
3. The method of claim 2, wherein the first electronic device determining the transmission delay comprises:
the first electronic device sends a heartbeat data packet to the second electronic device according to a heartbeat period;
the second electronic device receives the heartbeat data packet and sends a heartbeat response data packet corresponding to the heartbeat data packet to the first electronic device; and
the first electronic device receives the heartbeat response data packet and determines the transmission delay according to the sending time of the heartbeat data packet and the receiving time of the heartbeat response data packet.
4. The method of claim 2, wherein the second electronic device determining the transmission delay comprises:
the first electronic device sends a heartbeat data packet carrying a sending time stamp to the second electronic device according to a heartbeat period; and
the second electronic device receives the heartbeat data packet and determines the transmission delay according to the receiving time of the heartbeat data packet and the sending time stamp.
5. The method of claim 2, wherein the second electronic device determining the transmission delay comprises:
the second electronic device sends a heartbeat data packet to the first electronic device according to a heartbeat period;
the first electronic device receives the heartbeat data packet and sends a heartbeat response data packet corresponding to the heartbeat data packet to the second electronic device; and
the second electronic device receives the heartbeat response data packet and determines the transmission delay according to the sending time of the heartbeat data packet and the receiving time of the heartbeat response data packet.
6. The method of claim 2, wherein the second electronic device determining the transmission delay comprises:
the second electronic device sends a heartbeat data packet carrying a sending time stamp to the first electronic device according to a heartbeat period;
the first electronic device receives the heartbeat data packet and sends a heartbeat response data packet corresponding to the heartbeat data packet to the second electronic device, wherein the heartbeat response data packet comprises the sending time stamp in the heartbeat data packet; and
the second electronic device receives the heartbeat response data packet and determines the transmission delay according to the receiving time of the heartbeat response data packet and the sending time stamp.
7. The method of claim 2, wherein the first instruction carries a sending time stamp of the first instruction, and the second electronic device determining the transmission delay comprises:
the second electronic device determines the transmission delay according to the receiving time of the first instruction and the sending time stamp.
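Claims 3 to 7 determine the transmission delay either from a heartbeat round trip measured on the sending side or from a sending time stamp read on the receiving side. A minimal sketch of both variants follows; halving the round-trip time and assuming synchronized clocks are illustrative assumptions that the claims themselves do not mandate.

```python
def delay_from_round_trip(send_time, response_receive_time):
    # Claims 3 and 5 style: the sender measures the time between sending
    # a heartbeat data packet and receiving the heartbeat response.
    # Halving the round trip assumes symmetric links (an assumption).
    return (response_receive_time - send_time) / 2.0

def delay_from_send_timestamp(receive_time, send_timestamp):
    # Claims 4, 6, and 7 style: the receiver compares its own receiving
    # time with the carried sending time stamp (assumes the two device
    # clocks are synchronized).
    return receive_time - send_timestamp

rtt_delay = delay_from_round_trip(100.0, 100.080)      # 80 ms round trip
one_way = delay_from_send_timestamp(200.050, 200.010)  # 40 ms one way
```

Both estimates feed the same downstream use: aligning the remote device's capture timing with the moment the user actually issued the operation.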
8. The method of any one of claims 1 to 7, further comprising:
the second electronic device sends, to the first electronic device, acquired data obtained by executing the second target acquisition operation; and
the first electronic device displays the received acquired data.
9. The method of claim 8, wherein the system further comprises a third electronic device, and before the second electronic device sends the acquired data to the first electronic device, the method further comprises:
the first electronic device establishes a wireless communication connection with the third electronic device;
the first electronic device sends, in response to the received second operation, a second instruction to the third electronic device, wherein the second instruction is used for instructing the third electronic device to execute a third target acquisition operation;
the third electronic device executes the third target acquisition operation in response to the received second instruction;
the first electronic device displays a first option and a second option;
the first electronic device sends, in response to an operation on the first option, a third instruction to the second electronic device, wherein the third instruction is used for instructing the second electronic device to send the acquired data to the first electronic device; and
the first electronic device sends, in response to an operation on the second option, a fourth instruction to the third electronic device, wherein the fourth instruction is used for instructing the third electronic device to send the acquired data to the first electronic device.
10. The method of any of claims 1 to 9, wherein the second electronic device performing the second target acquisition operation comprises:
if the display screen of the second electronic device includes a screen projection window projected from the first electronic device, the second electronic device performs the second target acquisition operation on the display area other than the screen projection window on the display screen.
11. The method of any of claims 2 to 10, wherein the second electronic device performing the second target acquisition operation comprises:
the second electronic device executes a continuous screen capture operation within a target time period of a preset length; or
the second electronic device starts to execute a screen recording operation or a sound recording operation from a target time point;
wherein the target time point is obtained by subtracting the transmission delay from the receiving time point of the first instruction, and the central time point of the target time period is the target time point.
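The timing rule of claim 11 can be sketched as follows (the function names are illustrative, not from the embodiments): the target time point is the receiving time of the first instruction minus the transmission delay, and the continuous screen capture window of preset length is centered on that point.

```python
def target_time_point(instruction_receive_time, transmission_delay):
    # Claim 11: subtract the transmission delay from the receiving
    # time point of the first instruction.
    return instruction_receive_time - transmission_delay

def capture_window(instruction_receive_time, transmission_delay, preset_length):
    # The central time point of the target time period is the target
    # time point, so the window extends half the preset length each way.
    center = target_time_point(instruction_receive_time, transmission_delay)
    return (center - preset_length / 2.0, center + preset_length / 2.0)

# Instruction received at t = 10.0 s, 40 ms transmission delay, 2 s window.
start, end = capture_window(10.0, 0.04, 2.0)
```

Centering the window on the delay-corrected time point is what lets the continuous capture cover the content that was on screen when the user actually triggered the operation, despite the instruction arriving late.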
12. The method according to any of claims 1 to 11, wherein the first operation or the second operation comprises at least one of the following:
an operation acting on a touch display screen of the first electronic device; or
an operation acting on at least one key of the first electronic device.
13. The method according to any one of claims 1 to 12, wherein the touch display screen of the first electronic device comprises a first touch operation area and a second touch operation area;
the first touch operation area is used for receiving the first operation, and the second touch operation area is used for receiving the second operation.
14. The method according to any one of claims 1 to 12, wherein the touch display screen of the first electronic device comprises a first touch operation area and a second touch operation area;
the first touch operation area is used for receiving a first operation for controlling the first electronic device to execute the screen capture operation, and the first touch operation area is also used for receiving a second operation for controlling the second electronic device to execute the screen capture operation;
the second touch operation area is used for receiving a first operation for controlling the first electronic device to execute the screen recording operation, and the second touch operation area is also used for receiving a second operation for controlling the second electronic device to execute the screen recording operation.
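The mapping of touch operation areas to capture operations in claims 13 and 14 can be sketched as a simple dispatch table. The area names and operation labels below are hypothetical placeholders, not identifiers from the embodiments.

```python
# Hypothetical dispatch table following claim 14: the first touch
# operation area triggers screen capture and the second triggers
# screen recording, on either the local (first) or remote (second) device.
AREA_TO_OPERATION = {
    "first_area": "screen_capture",
    "second_area": "screen_recording",
}

def dispatch(area, operation_kind):
    # operation_kind is "first" for a first operation (performed by the
    # first device itself) or "second" for a second operation (a first
    # instruction is sent to the second electronic device).
    op = AREA_TO_OPERATION[area]
    target = "local" if operation_kind == "first" else "remote"
    return (op, target)

result = dispatch("first_area", "second")  # -> ("screen_capture", "remote")
```

The design point of claim 14 is that the touch area selects *what* is captured (screenshot vs. recording) while the kind of operation selects *where* it runs.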
15. A cross-device control method, applied to a first electronic device, wherein the method comprises the following steps:
establishing a wireless communication connection with a second electronic device;
executing a first target acquisition operation in response to the received first operation;
sending, in response to a received second operation, a first instruction to the second electronic device, wherein the first instruction is used for instructing the second electronic device to execute a second target acquisition operation;
wherein the target acquisition operation comprises: screen capture operation, screen recording operation or sound recording operation.
16. The method of claim 15, further comprising:
sending a heartbeat data packet carrying a sending time stamp to the second electronic device according to a heartbeat period.
17. The method of claim 15, further comprising:
receiving a heartbeat data packet from the second electronic device, and sending a heartbeat response data packet corresponding to the heartbeat data packet to the second electronic device.
18. The method of claim 15, further comprising:
receiving a heartbeat data packet carrying a sending time stamp from the second electronic device, and sending a heartbeat response data packet corresponding to the heartbeat data packet to the second electronic device, wherein the heartbeat response data packet comprises the sending time stamp in the heartbeat data packet.
19. The method of any one of claims 15 to 18, further comprising:
receiving acquired data from the second electronic device, wherein the acquired data is obtained by the second electronic device executing the second target acquisition operation; and
displaying the received acquired data.
20. The method of claim 19, wherein before receiving the acquired data from the second electronic device, the method further comprises:
establishing a wireless communication connection with a third electronic device;
sending, in response to the received second operation, a second instruction to the third electronic device, wherein the second instruction is used for instructing the third electronic device to execute a third target acquisition operation;
displaying a first option and a second option;
sending, in response to an operation on the first option, a third instruction to the second electronic device, wherein the third instruction is used for instructing the second electronic device to send the acquired data to the first electronic device; and
sending, in response to an operation on the second option, a fourth instruction to the third electronic device, wherein the fourth instruction is used for instructing the third electronic device to send the acquired data to the first electronic device.
21. The method according to any one of claims 15 to 20, wherein the touch display screen of the first electronic device comprises a first touch operation area and a second touch operation area;
the first touch operation area is used for receiving the first operation, and the second touch operation area is used for receiving the second operation.
22. An electronic device, comprising a touch display screen, a memory, and one or more processors;
wherein the memory is configured to store computer program code comprising computer instructions; the computer instructions, when executed by the one or more processors, cause the electronic device to perform the method of any of claims 15-21.
23. A computer-readable storage medium, comprising a computer program which, when run on an electronic device, causes the electronic device to perform the method of any of claims 1 to 14 or to perform the method of any of claims 15 to 21.
CN202011173806.2A 2020-10-28 2020-10-28 Cross-device control method and device Pending CN114510186A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202011173806.2A CN114510186A (en) 2020-10-28 2020-10-28 Cross-device control method and device


Publications (1)

Publication Number Publication Date
CN114510186A true CN114510186A (en) 2022-05-17

Family

ID=81547179

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202011173806.2A Pending CN114510186A (en) 2020-10-28 2020-10-28 Cross-device control method and device

Country Status (1)

Country Link
CN (1) CN114510186A (en)


Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101536535A (en) * 2006-01-27 2009-09-16 Lg电子株式会社 Remote controlling system for electric device
CN106406710A (en) * 2016-09-30 2017-02-15 维沃移动通信有限公司 Screen recording method and mobile terminal
CN107958168A (en) * 2017-12-19 2018-04-24 广东欧珀移动通信有限公司 Record screen method, apparatus and terminal
CN108809617A (en) * 2018-04-18 2018-11-13 京信通信系统(中国)有限公司 A kind of delay compensation method and terminal
CN108924614A (en) * 2018-06-01 2018-11-30 联想(北京)有限公司 A kind of control method, control system and electronic equipment
CN109684025A (en) * 2019-01-08 2019-04-26 深圳市网心科技有限公司 A kind of remote communication method and relevant apparatus
CN111565321A (en) * 2020-04-28 2020-08-21 聚好看科技股份有限公司 Terminal device, server and method for screen recording


Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2023236800A1 (en) * 2022-06-06 2023-12-14 华为技术有限公司 Method for controlling cross-device application, and electronic device
CN117130471A (en) * 2023-03-31 2023-11-28 荣耀终端有限公司 Man-machine interaction method, electronic equipment and system
CN117130471B (en) * 2023-03-31 2024-07-26 荣耀终端有限公司 Man-machine interaction method, electronic equipment and system

Similar Documents

Publication Publication Date Title
WO2022100237A1 (en) Screen projection display method and related product
US20220342850A1 (en) Data transmission method and related device
WO2022100239A1 (en) Device cooperation method, apparatus and system, electronic device and storage medium
KR102064952B1 (en) Electronic device for operating application using received data
CN112558825A (en) Information processing method and electronic equipment
CN111666055B (en) Data transmission method and device
WO2021147406A1 (en) Audio output method and terminal device
CN114520868B (en) Video processing method, device and storage medium
CN111221845A (en) Cross-device information searching method and terminal device
CN112527174B (en) Information processing method and electronic equipment
US11481357B2 (en) Album display method, electronic device, and storage medium
CN114442969B (en) Inter-equipment screen collaboration method and equipment
US20230342104A1 (en) Data Transmission Method and Device
CN112527222A (en) Information processing method and electronic equipment
WO2022028494A1 (en) Multi-device data collaboration method and electronic device
CN114079691B (en) Equipment identification method and related device
CN114510186A (en) Cross-device control method and device
CN114513689A (en) Remote control method, electronic equipment and system
CN111176766A (en) Communication terminal and component display method
US20240012534A1 (en) Navigation Bar Display Method, Display Method, and First Electronic Device
CN112148401A (en) View display method and electronic equipment
KR101987463B1 (en) Mobile terminal and method for controlling of the same
CN116028148B (en) Interface processing method and device and electronic equipment
CN113079332B (en) Mobile terminal and screen recording method thereof
CN115113832A (en) Cross-device synchronous display control method and system

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination