CN113778311A - Operation method and device and electronic equipment - Google Patents

Operation method and device and electronic equipment

Info

Publication number
CN113778311A
Authority
CN
China
Prior art keywords
target
electronic device
information
electronic equipment
interface
Prior art date
Legal status
Pending
Application number
CN202110939222.XA
Other languages
Chinese (zh)
Inventor
郭广饶
Current Assignee
Vivo Mobile Communication Hangzhou Co Ltd
Original Assignee
Vivo Mobile Communication Hangzhou Co Ltd
Priority date
Filing date
Publication date
Application filed by Vivo Mobile Communication Hangzhou Co Ltd
Priority to CN202110939222.XA
Publication of CN113778311A
Legal status: Pending


Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0487 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F 3/0488 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F 3/04883 Interaction techniques based on graphical user interfaces [GUI] using a touch-screen or digitiser for inputting data by handwriting, e.g. gesture or text
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0484 Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range

Abstract

The application discloses an operation method, an operation device, and an electronic device, applied to a first electronic device and belonging to the technical field of mobile terminals. The method includes the following steps: the first electronic device obtains target information corresponding to a second electronic device, where the target information includes at least one of: target gesture information corresponding to a user's input to the second electronic device, and target interface information displayed by the second electronic device; the first electronic device performs a target operation according to the target information, where the target operation is at least one of the following: the first electronic device adjusts a corresponding setting parameter according to the target gesture information, and the first electronic device displays a target interface according to the target interface information.

Description

Operation method and device and electronic equipment
Technical Field
The application belongs to the technical field of terminal operation, and particularly relates to an operation method, an operation device, and an electronic device.
Background
Currently, there are many brands of electronic devices on the market, and their functions are largely similar. However, different brands of electronic devices, or even different individual devices, may use different operation modes for the same function. For example, on electronic device A a user can trigger the screen-capture function by double-tapping the screen with a knuckle, but after switching devices, the same operation (double-tapping the screen with a knuckle) produces no reaction on electronic device B, and the screen-capture function cannot be triggered.
Therefore, when unfamiliar with the operation of electronic device B, the user has to consult others or search online to learn how to operate it. This is time-consuming and labor-intensive for the user, and does not necessarily yield the correct operation mode. As a result, when a user replaces an electronic device, the efficiency of operating the replacement device is low.
Disclosure of Invention
An object of the embodiments of the present application is to provide an operation method, an operation device, and an electronic device that can solve the problem of low operation efficiency when a user switches to a replacement electronic device.
In a first aspect, an embodiment of the present application provides an operation method, including: a first electronic device obtains target information corresponding to a second electronic device, where the target information includes at least one of: target gesture information corresponding to a user's input to the second electronic device, and target interface information displayed by the second electronic device; the first electronic device performs a target operation according to the target information, where the target operation is at least one of the following: the first electronic device adjusts a corresponding setting parameter according to the target gesture information, and the first electronic device displays a target interface according to the target interface information.
In a second aspect, an embodiment of the present application provides an operation device, including an acquisition module and an execution module. The acquisition module is used for a first electronic device to acquire target information corresponding to a second electronic device, where the target information includes at least one of: target gesture information corresponding to a user's input to the second electronic device, and target interface information displayed by the second electronic device. The execution module is used for the first electronic device to perform a target operation according to the target information, where the target operation is at least one of the following: the first electronic device adjusts a corresponding setting parameter according to the target gesture information, and the first electronic device displays a target interface according to the target interface information.
In a third aspect, an embodiment of the present application provides an electronic device, including a processor, a memory, and a program or instructions stored on the memory and executable on the processor, where the program or instructions, when executed by the processor, implement the steps of the method according to the first aspect.
In a fourth aspect, embodiments of the present application provide a readable storage medium, on which a program or instructions are stored, which when executed by a processor implement the steps of the method according to the first aspect.
In a fifth aspect, an embodiment of the present application provides a chip, where the chip includes a processor and a communication interface, where the communication interface is coupled to the processor, and the processor is configured to execute a program or instructions to implement the method according to the first aspect.
In the embodiments of the present application, the first electronic device obtains the target gesture information corresponding to the user's input to the second electronic device and/or the target interface information displayed by the second electronic device, and performs the corresponding target operation according to this information: the first electronic device adjusts a corresponding setting parameter according to the target gesture information, and/or displays a target interface according to the target interface information. By acquiring the target information corresponding to the second electronic device, the first electronic device can determine which gesture the user must input to trigger the second electronic device to execute a given operation, and which interface the second electronic device displays in response. The first electronic device can therefore determine the setting parameters and interface display state of the second electronic device and synchronously adjust its own setting parameters and interface display state accordingly. In this way, when the user replaces an electronic device, the efficiency with which the user operates the replacement device is improved.
Drawings
FIG. 1 is the first schematic diagram of an operation method provided by an embodiment of the present application;
FIG. 2 is the second schematic diagram of an operation method provided by an embodiment of the present application;
FIG. 3 is the first schematic diagram of an example mobile phone interface provided by an embodiment of the present application;
FIG. 4 is the third schematic diagram of an operation method provided by an embodiment of the present application;
FIG. 5 is the fourth schematic diagram of an operation method provided by an embodiment of the present application;
FIG. 6 is the second schematic diagram of an example mobile phone interface provided by an embodiment of the present application;
FIG. 7 is the third schematic diagram of an example mobile phone interface provided by an embodiment of the present application;
FIG. 8 is the fifth schematic diagram of an operation method provided by an embodiment of the present application;
FIG. 9 is the first schematic structural diagram of an operation device provided by an embodiment of the present application;
FIG. 10 is the second schematic structural diagram of an operation device provided by an embodiment of the present application;
FIG. 11 is a schematic structural diagram of an electronic device provided by an embodiment of the present application;
FIG. 12 is a schematic hardware structure diagram of an electronic device provided by an embodiment of the present application.
Detailed Description
The technical solutions in the embodiments of the present application will be described clearly below with reference to the drawings in the embodiments of the present application. It is evident that the described embodiments are some, but not all, of the embodiments of the present application. All other embodiments that a person of ordinary skill in the art can derive from the embodiments given herein fall within the scope of the present application.
The terms "first", "second", and the like in the description and claims of the present application are used to distinguish between similar elements and do not necessarily describe a particular sequence or chronological order. It should be appreciated that data so labeled may be interchanged where appropriate, so that the embodiments of the application can be practiced in sequences other than those illustrated or described herein. Moreover, "first", "second", and the like do not limit the number of objects; for example, a first object may be one object or more than one. In addition, "and/or" in the description and claims denotes at least one of the connected objects, and the character "/" generally indicates an "or" relationship between the preceding and following objects.
The operation method provided by the embodiments of the present application applies to scenarios in which a user switches between different electronic devices. The specific application scenario can be determined according to actual use requirements and is not specifically limited in the present application.
Take as an example a user switching to a first electronic device. Suppose the user's original device is a second electronic device that now needs to be replaced, so the user switches to the first electronic device. The user can acquire, through the first electronic device, the target information corresponding to the second electronic device in order to determine the target gesture information corresponding to the user's input to the second electronic device and the target interface information displayed by the second electronic device. When the first electronic device obtains this information, it performs the corresponding target operation: it adjusts its setting parameters according to the target gesture information, runs the target application program according to the target interface information of the second electronic device, and synchronously displays the target interface.
Specifically, the first electronic device may capture, through its camera, an image (a picture or a video) of the user operating the second electronic device. The first electronic device then analyzes the captured image to obtain the target information corresponding to the second electronic device: the target gesture information corresponding to the user's input (i.e., the user's input parameters) and the target interface information displayed by the second electronic device (i.e., the interface with which the second electronic device responds to the user's input). From this, the first electronic device can determine the setting parameters of the second electronic device (i.e., the input parameters that trigger the second electronic device to execute a certain function) and the interface currently displayed by the second electronic device (i.e., the application currently running on it and the specific page displayed). The first electronic device may then adjust its own setting parameters and displayed interface accordingly: it sets the input parameters that trigger a given function on the first electronic device to the same input parameters used on the second electronic device, and adjusts its currently running application and displayed page to match the interface currently displayed by the second electronic device.
Therefore, in the embodiments of the present application, when the user replaces an electronic device, the user does not need to consult others or search to learn how to operate the replacement. The replacement device directly obtains the relevant information of the previous device and determines the corresponding setting parameters and display interface, which improves the efficiency with which the user operates the replacement device.
The following describes in detail an operation method provided by the embodiments of the present application with reference to the accompanying drawings and application scenarios thereof.
An embodiment of the present application provides an operation method, and fig. 1 shows a flowchart of an operation method provided in an embodiment of the present application, where the method may be applied to an electronic device. As shown in fig. 1, the operation method provided by the embodiment of the present application may include steps 201 and 202 described below.
Step 201, the first electronic device obtains target information corresponding to the second electronic device.
In an embodiment of the present application, the target information includes at least one of: target gesture information corresponding to the user's input to the second electronic device, and target interface information displayed by the second electronic device.
In the embodiment of the present application, by acquiring the target information corresponding to the second electronic device, the first electronic device can determine the target gesture information corresponding to the user's input to the second electronic device and the target interface information displayed by the second electronic device. Once the first electronic device obtains this information, it can perform the corresponding target operation: it adjusts its setting parameters according to the target gesture information, runs the target application program according to the target interface information of the second electronic device, and synchronously displays the target interface.
Optionally, in this embodiment of the application, the application scenario is one in which the user replaces an electronic device; specifically, the user replaces the second electronic device with the first electronic device.
Optionally, in this embodiment of the application, the first electronic device is the electronic device whose setting parameters are to be adjusted, or the electronic device on which the target interface is to be displayed. That is, the first electronic device is the replacement device that the user switches to, and the second electronic device is the device being replaced.
Optionally, in this embodiment of the application, the target gesture information corresponding to the user's input to the second electronic device may be understood as follows: information about the specific gesture the user inputs to the second electronic device when the user needs to trigger the second electronic device to execute a target function. The target gesture information includes at least one of: input position, input track, input mode, input duration, and the like. The input mode may specifically be any of the following: a slide input, a click input, a tap input, etc.
Optionally, in this embodiment of the application, the target interface information displayed by the second electronic device may be understood as follows: how the interface changes when the second electronic device executes the target function in response to the user's input. The target interface information includes at least one of: the application program corresponding to the target interface, the specific content contained in the target interface, the picture change process displayed in the target interface, and the like.
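The patent text does not define a concrete representation for the target information, but the fields enumerated above can be sketched as plain data structures. The class and field names below are illustrative assumptions, not part of the disclosed method:

```python
from dataclasses import dataclass, field
from typing import List, Tuple

@dataclass
class TargetGestureInfo:
    """Gesture the user input to the second electronic device (illustrative)."""
    input_position: Tuple[float, float]      # screen coordinates of the input
    input_track: List[Tuple[float, float]]   # sampled points along the gesture path
    input_mode: str                          # e.g. "slide", "click", "tap"
    input_duration_ms: int                   # how long the input lasted

@dataclass
class TargetInterfaceInfo:
    """Interface displayed by the second electronic device (illustrative)."""
    application: str                         # application program of the target interface
    content: str                             # content contained in the target interface
    frames: List[str] = field(default_factory=list)  # picture-change process, if captured
```

Either structure (or both) can stand in for the "target information" that step 201 acquires, matching the "at least one of" wording above.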
Optionally, in this embodiment of the application, the first electronic device and the second electronic device may be electronic devices of the same brand or electronic devices of different brands, which is not limited in this application.
Optionally, in this embodiment of the application, the step 201 may be specifically implemented by the following step 201 a.
Step 201a, the first electronic device obtains target information corresponding to the second electronic device in a target mode.
In the embodiment of the present application, the target mode is any one of the following: collecting images through a camera, receiving information through the wireless communication technology Wi-Fi, receiving information through Bluetooth, and receiving information through a shared hotspot.
Optionally, in this embodiment of the application, the first electronic device may acquire, through the camera, an image of the second electronic device when the user operates the second electronic device, so as to obtain an input action of the user and an interface display condition of the second electronic device. Therefore, the first electronic device can analyze the acquired image to determine the target information corresponding to the second electronic device.
Optionally, in this embodiment of the application, the first electronic device may obtain the target information corresponding to the second electronic device through a network connection with the second electronic device, where the network connection includes direct connections and indirect connections. Direct connections may include a Bluetooth connection and a shared-hotspot connection; indirect connections may include a Wi-Fi connection, meaning that the first electronic device and the second electronic device access the same Wi-Fi network.
Optionally, in this embodiment of the application, when the first electronic device acquires the target information corresponding to the second electronic device in a network connection manner, the first electronic device may send a request message to the second electronic device to acquire the target information, so that the second electronic device may send the target information to the first electronic device according to the received request message.
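As an illustration of the request/response exchange just described, the sketch below shows one way the first device could request target information over an already-established connection and the second device could answer. The JSON message format, newline framing, and function names are assumptions for illustration; the patent does not specify a wire protocol:

```python
import json
import socket

# Hypothetical request message sent by the first electronic device.
REQUEST = {"type": "get_target_info"}

def request_target_info(sock: socket.socket) -> dict:
    """First device: send a request message, then read the target information reply."""
    sock.sendall((json.dumps(REQUEST) + "\n").encode())
    reply = sock.makefile().readline()
    return json.loads(reply)

def serve_target_info(sock: socket.socket, target_info: dict) -> None:
    """Second device: answer a received request with its target information."""
    line = sock.makefile().readline()
    if json.loads(line).get("type") == "get_target_info":
        sock.sendall((json.dumps(target_info) + "\n").encode())
```

The same exchange would apply whether the underlying transport is Bluetooth, a shared hotspot, or a common Wi-Fi network; only the socket setup differs.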
Step 202, the first electronic device executes the target operation according to the target information.
In an embodiment of the present application, the target operation is at least one of: and the first electronic equipment adjusts corresponding setting parameters according to the target gesture information, and displays a target interface according to the target interface information.
Optionally, in this embodiment of the application, after the first electronic device acquires the target information corresponding to the second electronic device, the first electronic device may execute the target operation locally on the first electronic device according to the target information.
It can be understood that the first electronic device obtains the target information corresponding to the second electronic device in order to determine the setting parameters of the second electronic device and its current operating state and display state. The first electronic device can then adjust its own setting parameters, operating state, and display state to be the same as those of the second electronic device according to this target information.
Optionally, in this embodiment of the application, the first electronic device adjusting the corresponding setting parameter according to the target gesture information may be understood as follows: the first electronic device adjusts its own trigger parameter corresponding to the target function to be the same as the trigger parameter corresponding to the target function on the second electronic device.
Optionally, in this embodiment of the application, the first electronic device displaying the target interface according to the target interface information may be understood as follows: the first electronic device determines, according to the target interface information, the target application program currently running on the second electronic device and that the interface displayed is a target interface within that application program; the first electronic device then runs the target application program and displays the target interface, so that the content displayed by the first electronic device is the same as the content displayed by the second electronic device.
It should be noted that when the first electronic device acquires the target interface information of the second electronic device, it is not necessary to acquire the target gesture information corresponding to the input of the user to the second electronic device.
The embodiments of the present application provide an operation method in which the first electronic device obtains the target gesture information corresponding to the user's input to the second electronic device and/or the target interface information displayed by the second electronic device, and performs the corresponding target operation according to this information: the first electronic device adjusts a corresponding setting parameter according to the target gesture information, and/or displays a target interface according to the target interface information. By acquiring the target information corresponding to the second electronic device, the first electronic device can determine which gesture the user must input to trigger the second electronic device to execute a given operation, and which interface the second electronic device displays in response. The first electronic device can therefore determine the setting parameters and interface display state of the second electronic device and synchronously adjust its own setting parameters and interface display state accordingly. In this way, when the user replaces an electronic device, the efficiency with which the user operates the replacement device is improved.
Optionally, in this embodiment of the application, when the first electronic device learns that the user's input to the second electronic device, given by the target gesture information, triggers the second electronic device to execute the target function, the target information includes both the target gesture information and the target interface information. With reference to fig. 1, as shown in fig. 2, step 202 can be implemented by step 202a described below.
Step 202a, the first electronic device sets the target gesture information as an operation mode for triggering execution of the target function according to the target gesture information and the target interface information.
Optionally, in this embodiment of the application, when the user performs a target operation on the second electronic device (corresponding to the target gesture information), the second electronic device executes a target function in response (corresponding to the target interface information). The first electronic device can therefore determine, from the obtained target gesture information and target interface information, which operation triggers execution of the target function on the second electronic device.
It is to be understood that, where the user makes a knuckle double-tap input (i.e., the target gesture information) on the screen of the second electronic device to trigger the second electronic device to perform the screen-capture function (i.e., the target function), the first electronic device may determine that "a knuckle double-tap on the screen" is the trigger operation for the screen-capture function.
For example, taking a mobile phone as the electronic device, as shown in fig. 3, when the user double-taps the screen of the first mobile phone 10 with a knuckle (i.e., the target gesture information) to trigger the first mobile phone 10 to perform a screen-capture operation (i.e., the target interface information), the second mobile phone 11 may capture, through its camera, an image of the user's input to the first mobile phone 10, and analyze the image to determine the input mode that triggers the screen-capture operation, namely a knuckle double-tap on the screen. The second mobile phone 11 may then set its own trigger for the screen-capture operation to a knuckle double-tap on the screen.
In the embodiment of the application, the first electronic device may determine, according to the obtained target gesture information and target interface information, that the operation corresponding to the target gesture information triggers execution of the target function corresponding to the target interface information on the second electronic device. The first electronic device can then adjust its setting parameters to set the target gesture information as the operation mode that triggers execution of the target function, improving the efficiency of adjusting setting parameters on the electronic device.
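The remapping in step 202a can be pictured as a settings table from target functions to trigger gestures, where the entry for a function is overwritten with the gesture observed on the second device. The class, default trigger, and gesture/function names below are hypothetical, not from the patent:

```python
class GestureSettings:
    """Illustrative per-device table mapping target functions to trigger gestures."""

    def __init__(self) -> None:
        # Hypothetical default trigger on the first electronic device.
        self.triggers = {"screenshot": "three_finger_swipe_down"}

    def adopt(self, target_gesture: str, target_function: str) -> None:
        """Set the observed gesture as the operation mode that triggers the function
        (the adjustment of setting parameters described in step 202a)."""
        self.triggers[target_function] = target_gesture

    def function_for(self, gesture: str):
        """Resolve an incoming gesture to the function it triggers, if any."""
        for function, trigger in self.triggers.items():
            if trigger == gesture:
                return function
        return None
```

After `adopt("knuckle_double_tap", "screenshot")`, the gesture that worked on the old device triggers the same function on the new one, which is the behavior the fig. 3 example describes.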
Optionally, in this embodiment of the application, when the first electronic device acquires a target interface corresponding to a target application displayed by the second electronic device, the target information includes the target interface information. With reference to fig. 1, as shown in fig. 4, step 202 can be specifically implemented by step 202b described below.
Step 202b, the first electronic device runs the target application program according to the target interface information and synchronously displays the target interface.
Optionally, in this embodiment of the application, in a case where the second electronic device displays a target interface in the target application, the first electronic device may obtain only the target interface information of the second electronic device, and analyze it to determine the target application run by the second electronic device and the target interface displayed.
Optionally, in this embodiment of the application, the first electronic device locally runs the target application according to the obtained target interface information, and displays a target interface in the target application.
It should be noted that, in this embodiment, the first electronic device does not need to acquire target gesture information, that is, the user does not need to provide input to the second electronic device, and the target interface displayed by the first electronic device is synchronized with the target interface displayed by the second electronic device (that is, the interface content is consistent).
For example, if the second electronic device is playing video b through video application a, the user may shoot the second electronic device with the first electronic device to acquire an image of the second electronic device. The first electronic device determines from the image that the second electronic device is playing video b through video application a, then runs video application a locally and plays video b, with the current playing position of video b on the first electronic device being the same as that on the second electronic device; that is, the first electronic device and the second electronic device play video b synchronously.
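As a rough illustration of this synchronized-playback step, the sketch below assumes the first device has already recognized the application, the video, and the playback position from the image; `launch` and `seek` are placeholders standing in for whatever platform APIs would actually start the application and move the playhead. All names and the dictionary layout are hypothetical.

```python
def sync_playback(target_interface_info, launch, seek):
    """Run the recognized application and align the playback position."""
    app = target_interface_info["app"]            # e.g. video application a
    video = target_interface_info["video"]        # e.g. video b
    position_s = target_interface_info["position_s"]
    player = launch(app, video)                   # run the target application
    seek(player, position_s)                      # jump to the same position
    return player

# Toy launch/seek callbacks standing in for real platform APIs;
# 1425 s corresponds to a playing progress of 23 minutes 45 seconds.
state = {}
player = sync_playback(
    {"app": "video_app_a", "video": "video_b", "position_s": 1425},
    launch=lambda app, video: state.setdefault(
        "player", {"app": app, "video": video}),
    seek=lambda p, s: p.update(position_s=s),
)
```

The point of the sketch is only the ordering: launch first, then seek, so both devices end up showing the same interface content at the same playback position.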
In the embodiment of the application, the first electronic device may determine, according to the obtained target interface information, that the second electronic device is running the target application in the foreground and displaying the target interface, and may then locally run the target application and display the target interface synchronously, so that the content displayed by the first electronic device is the same as the content displayed by the second electronic device, which improves the efficiency with which the electronic devices display the same content.
Optionally, in this embodiment of the present application, as shown in fig. 5 in combination with fig. 1, before step 202, the operation method provided in this embodiment of the present application may further include step 301 described below.
Step 301, the first electronic device displays a prompt message.
In the embodiment of the application, the prompt information is used for prompting a user whether to execute a target operation; wherein the prompt message includes at least one of: target gesture information, target functions, target applications, and target interfaces.
Optionally, in this embodiment of the application, when the first electronic device acquires target information corresponding to the second electronic device, the first electronic device may prompt the user whether to execute the target operation in a form of displaying a prompt message. The prompt message is used for prompting the user of the current state of the second electronic equipment and the operation mode of executing the target function.
Optionally, in this embodiment of the application, the prompt information may further include an operation mode corresponding to the currently triggered execution of the target function in the first electronic device, a determination control and a cancellation control, where the determination control is used to trigger the first electronic device to execute the target operation, and the cancellation control is used to trigger the first electronic device to cancel the execution of the target operation.
Optionally, in this embodiment of the application, the first electronic device may display the prompt information in a pop-up window, and the user may tap the determination control or the cancel control displayed with the prompt information to trigger the electronic device to execute or cancel the target operation.
For example, referring to fig. 3 and as shown in fig. 6, when the second mobile phone 11 acquires, through the camera, an image corresponding to the user's input to the first mobile phone 10 and determines that the input mode for triggering the first mobile phone 10 to perform the screen capturing operation is double-clicking the screen with a finger joint, first prompt information 12 may be displayed on the screen of the second mobile phone 11 to ask the user, for example, "The current screen capturing mode of this mobile phone is a three-finger upward slide. Adjust the screen capturing shortcut to double-clicking the screen with a finger joint?", and a determination control and a cancel control are displayed.
As another example, as shown in fig. 7, when the second mobile phone 11 detects through the camera that the first mobile phone 10 is playing video b through video application a, second prompt information 13 may be displayed on the screen of the second mobile phone 11 to prompt the user: "Video playback detected. The playing software is video application a, the movie being played is video b, and the playing progress is 23 minutes and 45 seconds." A determination control and a cancel control are also displayed.
In the embodiment of the application, the first electronic device can prompt the user of the target gesture information, the target function, the target application program and the target interface corresponding to the second electronic device in a manner of displaying the prompt information, and prompt the user whether to execute the target operation, so that the accuracy of executing the operation by the electronic device is improved.
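The prompt-and-confirm flow of step 301 can be sketched as follows. This is a hypothetical outline only: the dialog is simulated with callbacks rather than a real UI, and every name is illustrative.

```python
def confirm_and_execute(prompt_text, get_user_choice, execute, cancel):
    """Display prompt information, then act only on the user's choice.

    get_user_choice stands in for showing a pop-up window with a
    determination control and a cancel control and waiting for a tap.
    """
    choice = get_user_choice(prompt_text)  # "confirm" or "cancel"
    if choice == "confirm":
        return execute()                   # perform the target operation
    return cancel()                        # leave the device unchanged

result = confirm_and_execute(
    "Current screen capture mode is a three-finger upward slide. "
    "Change the shortcut to a knuckle double-tap?",
    get_user_choice=lambda text: "confirm",   # simulate tapping the
                                              # determination control
    execute=lambda: "shortcut_updated",
    cancel=lambda: "unchanged",
)
```

Routing the operation through the user's explicit choice is what gives the accuracy benefit described above: the target operation runs only after confirmation.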
Optionally, in this embodiment of the application, with reference to fig. 1, as shown in fig. 8, the step 201 may be specifically implemented by a step 201b and a step 201c described below.
Step 201b, the first electronic device acquires an image picture corresponding to the second electronic device through the camera, and determines a plurality of pieces of picture information according to the image picture.
In an embodiment of the present application, the plurality of pieces of picture information include at least one of the following: the target gesture information corresponding to the user's input, the operation executed by the second electronic device in response to the user's input, the application program currently running on the second electronic device, and the interface currently displayed by the second electronic device.
Optionally, in this embodiment of the application, after the first electronic device acquires the image picture corresponding to the second electronic device, the first electronic device performs analysis processing on the image picture to determine a plurality of pieces of picture information corresponding to the second electronic device.
Optionally, in this embodiment of the application, the image corresponding to the second electronic device includes at least one of the following items: a picture displayed in the second electronic device screen, a target gesture of the user, a change in the picture in the second electronic device screen, and the like.
Step 201c, the first electronic device obtains the target information from the plurality of pieces of picture information.
Optionally, in this embodiment of the application, the first electronic device may perform analysis processing on the acquired image corresponding to the second electronic device, so as to determine information included in the image through an object recognition technology, a text recognition technology, a screen analysis technology, and the like.
Optionally, in this embodiment of the application, the first electronic device may obtain, from information included in the determined image picture, target information corresponding to the second electronic device through analysis.
Optionally, in this embodiment of the application, the first electronic device may perform analysis processing on an image corresponding to the second electronic device, so as to analyze and determine an operation of the user, a function in the electronic device corresponding to the operation, and the like according to a motion change condition corresponding to the target gesture of the user and a change of the image in the screen of the second electronic device.
In this embodiment of the application, the first electronic device may acquire an image picture corresponding to the second electronic device through the camera, and determine from the image picture a plurality of pieces of picture information corresponding to the second electronic device, such as the target gesture corresponding to the user's input, the operation executed by the second electronic device in response to the user's input, the application program currently running on the second electronic device, and the interface currently displayed by the second electronic device. The first electronic device then obtains the target information corresponding to the second electronic device from the plurality of pieces of picture information, so that the first electronic device has high flexibility in obtaining the target information.
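Steps 201b and 201c together can be sketched as a small pipeline. The recognizers below are stubs standing in for the object-recognition, text-recognition, and screen-analysis techniques mentioned above; all names, keys, and return values are invented for illustration.

```python
def analyze_picture(frame, recognizers):
    """Step 201b: run each recognizer over the image picture and keep
    whichever pieces of picture information could be determined."""
    picture_info = {}
    for name, recognize in recognizers.items():
        result = recognize(frame)
        if result is not None:          # skip information not visible
            picture_info[name] = result
    return picture_info

def extract_target_info(picture_info):
    """Step 201c: select the fields that constitute the target
    information from the plurality of pieces of picture information."""
    keys = ("target_gesture", "displayed_interface")
    return {k: picture_info[k] for k in keys if k in picture_info}

frame = "raw_camera_frame"  # placeholder for pixel data from the camera
recognizers = {
    "target_gesture": lambda f: "knuckle_double_tap",
    "executed_operation": lambda f: "screenshot",
    "running_app": lambda f: None,           # not visible in this frame
    "displayed_interface": lambda f: "home_screen",
}
target = extract_target_info(analyze_picture(frame, recognizers))
```

Separating "determine all picture information" from "select the target information" mirrors the two steps: the same captured frame can yield several pieces of information, of which only some are needed for the target operation.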
In the operation method provided in the embodiment of the present application, the execution main body may be an operation device, or a control module in the operation device for executing the operation method. In the embodiment of the present application, an operation device executing the operation method is taken as an example to describe the operation device provided in the embodiment of the present application.
Fig. 9 shows a schematic diagram of a possible configuration of the operating device referred to in the embodiments of the present application. As shown in fig. 9, the operating device 70 may include: an acquisition module 71 and an execution module 72.
The obtaining module 71 is configured to obtain, by a first electronic device, target information corresponding to a second electronic device; the target information includes at least one of: target gesture information corresponding to a user input to the second electronic device, and target interface information displayed by the second electronic device. The executing module 72 is configured to execute, by the first electronic device according to the target information, a target operation, where the target operation is at least one of: the first electronic device adjusting corresponding setting parameters according to the target gesture information, and the first electronic device displaying a target interface according to the target interface information.
In a possible implementation manner, when the first electronic device acquires that the user inputs to the second electronic device based on the target gesture information and triggers the second electronic device to execute the target function, the target information includes: target gesture information and target interface information; the executing module 72 is specifically configured to set, by the first electronic device, the target gesture information as an operation mode for triggering execution of the target function according to the target gesture information and the target interface information.
In a possible implementation manner, in a case where the first electronic device acquires a target interface corresponding to a target application displayed by the second electronic device, the target information includes: target interface information; the execution module 72 is specifically configured to run the target application program by the first electronic device according to the target interface information, and synchronously display the target interface.
In a possible implementation manner, with reference to fig. 9 and as shown in fig. 10, the operating device 70 provided in the embodiment of the present application may further include: a display module 73. The display module 73 is configured to, before the first electronic device executes the target operation according to the target information, display a prompt message by the first electronic device, where the prompt message is used to prompt a user whether to execute the target operation; wherein the prompt message includes at least one of: target gesture information, target functions, target applications, and target interfaces.
In a possible implementation manner, the obtaining module 71 is specifically configured to: acquire, by the first electronic device through a camera, an image picture corresponding to the second electronic device, and determine a plurality of pieces of picture information according to the image picture, where the plurality of pieces of picture information include at least one of the following: target gesture information corresponding to the user's input, an operation executed by the second electronic device in response to the user's input, an application program currently running on the second electronic device, and an interface currently displayed by the second electronic device; and acquire the target information from the plurality of pieces of picture information.
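The module structure of figs. 9 and 10 can be sketched as follows. The class and method names loosely mirror the description of the obtaining and executing modules, but the API is invented for illustration and is not the actual implementation.

```python
class AcquisitionModule:
    """Counterpart of obtaining module 71: gathers target information
    corresponding to the second electronic device (stubbed here; the
    patent obtains it by analyzing camera frames)."""
    def obtain(self, second_device):
        return second_device.get("target_info", {})

class ExecutionModule:
    """Counterpart of executing module 72: performs the target operation
    on the first electronic device according to the target information."""
    def execute(self, target_info, first_device):
        if "target_gesture" in target_info:
            # adjust setting parameters per the target gesture information
            first_device["shortcut"] = target_info["target_gesture"]
        if "target_interface" in target_info:
            # display the target interface per the target interface info
            first_device["displayed"] = target_info["target_interface"]
        return first_device

class OperatingDevice:
    """Composition of the two modules, as in fig. 9."""
    def __init__(self):
        self.acquisition = AcquisitionModule()
        self.execution = ExecutionModule()

    def run(self, first_device, second_device):
        info = self.acquisition.obtain(second_device)
        return self.execution.execute(info, first_device)

out = OperatingDevice().run(
    first_device={},
    second_device={"target_info": {"target_gesture": "knuckle_double_tap",
                                   "target_interface": "video_b"}},
)
```

Keeping acquisition and execution in separate modules matches the device diagram: either the setting-parameter adjustment or the interface display (or both) can be performed, depending on which pieces of target information were obtained.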
The operation device provided in the embodiment of the present application can implement each process implemented by the operation device in the above method embodiments, and for avoiding repetition, detailed description is not repeated here.
The embodiment of the application provides an operating device. Because the first electronic device can obtain, by obtaining the target information corresponding to the second electronic device, the target gesture information corresponding to the user's input to the second electronic device and the target interface information displayed by the second electronic device, the first electronic device can analyze and determine, from the obtained information, the target gesture information the user needs to input to trigger the second electronic device to execute a corresponding operation and the target interface information displayed by the second electronic device. The first electronic device can therefore determine the setting parameters and interface display state of the second electronic device from this information and synchronously adjust its own setting parameters and interface display state accordingly, so that when the user changes electronic devices, the user's operating efficiency on the new device is improved.
The operation device in the embodiment of the present application may be a device, or may be a component, an integrated circuit, or a chip in a terminal. The device can be mobile electronic equipment or non-mobile electronic equipment. By way of example, the mobile electronic device may be a mobile phone, a tablet computer, a notebook computer, a palm top computer, a vehicle-mounted electronic device, a wearable device, an ultra-mobile personal computer (UMPC), a netbook or a Personal Digital Assistant (PDA), and the like, and the non-mobile electronic device may be a server, a Network Attached Storage (NAS), a Personal Computer (PC), a Television (TV), a teller machine or a self-service machine, and the like, and the embodiments of the present application are not particularly limited.
The operating device in the embodiment of the present application may be a device having an operating system. The operating system may be an Android operating system (Android), an iOS operating system, or other possible operating systems, which is not specifically limited in the embodiments of the present application.
Optionally, as shown in fig. 11, an electronic device M00 is further provided in this embodiment of the present application, and includes a processor M01, a memory M02, and a program or an instruction stored in the memory M02 and executable on the processor M01, where the program or the instruction when executed by the processor M01 implements each process of the foregoing operation method embodiment, and can achieve the same technical effect, and details are not repeated here to avoid repetition.
It should be noted that the electronic device in the embodiment of the present application includes the mobile electronic device and the non-mobile electronic device described above.
Fig. 12 is a schematic hardware structure diagram of an electronic device implementing an embodiment of the present application.
The electronic device 100 includes, but is not limited to: a radio frequency unit 101, a network module 102, an audio output unit 103, an input unit 104, a sensor 105, a display unit 106, a user input unit 107, an interface unit 108, a memory 109, and a processor 110.
Those skilled in the art will appreciate that the electronic device 100 may further comprise a power source (e.g., a battery) for supplying power to various components, and the power source may be logically connected to the processor 110 through a power management system, so as to implement functions of managing charging, discharging, and power consumption through the power management system. The electronic device structure shown in fig. 12 does not constitute a limitation of the electronic device, and the electronic device may include more or less components than those shown, or combine some components, or arrange different components, and thus, the description is not repeated here.
The processor 110 is configured to enable a first electronic device to obtain target information corresponding to a second electronic device; the target information includes at least one of: target gesture information corresponding to a user input to the second electronic device, and target interface information displayed by the second electronic device;
the processor 110 is further configured to execute, by the first electronic device according to the target information, a target operation, where the target operation is at least one of: the first electronic device adjusting corresponding setting parameters according to the target gesture information, and the first electronic device displaying a target interface according to the target interface information.
The embodiment of the application provides an electronic device. Because the first electronic device can obtain, by obtaining the target information corresponding to the second electronic device, the target gesture information corresponding to the user's input to the second electronic device and the target interface information displayed by the second electronic device, the first electronic device can analyze and determine, from the obtained information, the target gesture information the user needs to input to trigger the second electronic device to execute a corresponding operation and the target interface information displayed by the second electronic device. The first electronic device can therefore determine the setting parameters and interface display state of the second electronic device from this information and synchronously adjust its own setting parameters and interface display state accordingly, so that when the user changes electronic devices, the user's operating efficiency on the new device is improved.
The processor 110 is specifically configured to set, by the first electronic device, the target gesture information as an operation mode for triggering execution of the target function according to the target gesture information and the target interface information.
In the embodiment of the application, the first electronic device may determine, according to the obtained target gesture information and target interface information, that in the second electronic device the operation triggering execution of the target function corresponding to the target interface information is the target operation corresponding to the target gesture information. The first electronic device may therefore adjust its setting parameters so as to set the target gesture information as the operation mode for triggering execution of the target function, which improves the efficiency with which the electronic device adjusts setting parameters.
The processor 110 is specifically configured to run the target application program by the first electronic device according to the target interface information, and synchronously display the target interface.
In the embodiment of the application, the first electronic device may determine, according to the obtained target interface information, that the second electronic device is running the target application in the foreground and displaying the target interface, and may then locally run the target application and display the target interface synchronously, so that the content displayed by the first electronic device is the same as the content displayed by the second electronic device, which improves the efficiency with which the electronic devices display the same content.
A display unit 106, configured to display, by the first electronic device, prompt information for prompting a user whether to execute a target operation; wherein the prompt message includes at least one of: target gesture information, target functions, target applications, and target interfaces.
In the embodiment of the application, the first electronic device can prompt the user of the target gesture information, the target function, the target application program and the target interface corresponding to the second electronic device in a manner of displaying the prompt information, and prompt the user whether to execute the target operation, so that the accuracy of executing the operation by the electronic device is improved.
The processor 110 is specifically configured to acquire, by the first electronic device through the camera, an image picture corresponding to the second electronic device, and determine a plurality of pieces of picture information according to the image picture, where the plurality of pieces of picture information include at least one of the following: target gesture information corresponding to the user's input, an operation executed by the second electronic device in response to the user's input, an application program currently running on the second electronic device, and an interface currently displayed by the second electronic device; and to acquire the target information from the plurality of pieces of picture information.
In this embodiment of the application, the first electronic device may acquire an image picture corresponding to the second electronic device through the camera, and determine from the image picture a plurality of pieces of picture information corresponding to the second electronic device, such as the target gesture corresponding to the user's input, the operation executed by the second electronic device in response to the user's input, the application program currently running on the second electronic device, and the interface currently displayed by the second electronic device. The first electronic device then obtains the target information corresponding to the second electronic device from the plurality of pieces of picture information, so that the first electronic device has high flexibility in obtaining the target information.
It should be understood that, in the embodiment of the present application, the input Unit 104 may include a Graphics Processing Unit (GPU) 1041 and a microphone 1042, and the Graphics Processing Unit 1041 processes image data of a still picture or a video obtained by an image capturing device (such as a camera) in a video capturing mode or an image capturing mode. The display unit 106 may include a display panel 1061, and the display panel 1061 may be configured in the form of a liquid crystal display, an organic light emitting diode, or the like. The user input unit 107 includes a touch panel 1071 and other input devices 1072. The touch panel 1071 is also referred to as a touch screen. The touch panel 1071 may include two parts of a touch detection device and a touch controller. Other input devices 1072 may include, but are not limited to, a physical keyboard, function keys (e.g., volume control keys, switch keys, etc.), a trackball, a mouse, and a joystick, which are not described in detail herein. The memory 109 may be used to store software programs as well as various data including, but not limited to, application programs and an operating system. The processor 110 may integrate an application processor, which primarily handles operating systems, user interfaces, applications, etc., and a modem processor, which primarily handles wireless communications. It will be appreciated that the modem processor described above may not be integrated into the processor 110.
The embodiment of the present application further provides a readable storage medium, where a program or an instruction is stored on the readable storage medium, and when the program or the instruction is executed by a processor, the program or the instruction implements each process of the foregoing operation method embodiment, and can achieve the same technical effect, and in order to avoid repetition, details are not repeated here.
The processor is the processor in the electronic device described in the above embodiment. The readable storage medium includes a computer readable storage medium, such as a Read-Only Memory (ROM), a Random Access Memory (RAM), a magnetic disk or an optical disk, and so on.
The embodiment of the present application further provides a chip, where the chip includes a processor and a communication interface, the communication interface is coupled to the processor, and the processor is configured to execute a program or an instruction to implement each process of the foregoing operation method embodiment, and can achieve the same technical effect, and for avoiding repetition, details are not repeated here.
It should be understood that the chips mentioned in the embodiments of the present application may also be referred to as system-on-chip, system-on-chip or system-on-chip, etc.
It should be noted that, in this document, the terms "comprises," "comprising," or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising an … …" does not exclude the presence of other like elements in a process, method, article, or apparatus that comprises the element. Further, it should be noted that the scope of the methods and apparatus of the embodiments of the present application is not limited to performing the functions in the order illustrated or discussed, but may include performing the functions in a substantially simultaneous manner or in a reverse order based on the functions involved, e.g., the methods described may be performed in an order different than that described, and various steps may be added, omitted, or combined. In addition, features described with reference to certain examples may be combined in other examples.
Through the above description of the embodiments, those skilled in the art will clearly understand that the method of the above embodiments can be implemented by software plus a necessary general hardware platform, and certainly can also be implemented by hardware, but in many cases, the former is a better implementation manner. Based on such understanding, the technical solutions of the present application may be embodied in the form of a computer software product, which is stored in a storage medium (such as ROM/RAM, magnetic disk, optical disk) and includes instructions for enabling a terminal (such as a mobile phone, a computer, a server, or a network device) to execute the method according to the embodiments of the present application.
While the present embodiments have been described with reference to the accompanying drawings, it is to be understood that the invention is not limited to the precise embodiments described above, which are meant to be illustrative and not restrictive, and that various changes may be made therein by those skilled in the art without departing from the spirit and scope of the invention as defined by the appended claims.

Claims (10)

1. An operating method applied to a first electronic device, the method comprising:
the first electronic equipment acquires target information corresponding to second electronic equipment; the target information includes at least one of: target gesture information corresponding to a user input to the second electronic equipment, and target interface information displayed by the second electronic equipment;
the first electronic equipment executes a target operation according to the target information, wherein the target operation is at least one of the following operations: the first electronic equipment adjusting corresponding setting parameters according to the target gesture information, and the first electronic equipment displaying a target interface according to the target interface information.
2. The method according to claim 1, wherein when the first electronic device acquires that a user inputs to the second electronic device based on the target gesture information to trigger the second electronic device to execute a target function, the target information includes: the target gesture information and the target interface information;
the first electronic equipment executes target operation according to the target information, and the target operation comprises the following steps:
the first electronic equipment sets the target gesture information as an operation mode for triggering execution of the target function according to the target gesture information and the target interface information.
3. The method according to claim 1, wherein when the first electronic device acquires that the second electronic device displays the target interface corresponding to the target application program, the target information includes: the target interface information;
the first electronic equipment executes target operation according to the target information, and the target operation comprises the following steps:
the first electronic equipment runs the target application program according to the target interface information and synchronously displays the target interface.
4. The method according to any one of claims 1 to 3, wherein before the first electronic device performs a target operation according to the target information, the method further comprises:
the first electronic equipment displays prompt information, and the prompt information is used for prompting a user whether to execute the target operation;
wherein the prompt message comprises at least one of: target gesture information, target functions, target applications, and target interfaces.
5. The method according to claim 1, wherein the obtaining, by the first electronic device, target information corresponding to the second electronic device includes:
the first electronic device acquires an image picture corresponding to the second electronic device through a camera, and determines a plurality of pieces of picture information according to the image picture, wherein the plurality of pieces of picture information comprise at least one of the following items: the target gesture information corresponding to the input of the user, the operation executed by the second electronic equipment in response to the input of the user, the application program currently running on the second electronic equipment, and the interface currently displayed by the second electronic equipment;
and acquiring the target information from the plurality of pieces of picture information.
6. An operating device applied to a first electronic device, the operating device comprising an acquisition module and an execution module;
the acquisition module is configured to acquire target information corresponding to a second electronic device, wherein the target information includes at least one of: target gesture information corresponding to an input of a user to the second electronic device, and target interface information displayed by the second electronic device;
the execution module is configured to execute a target operation according to the target information, wherein the target operation is at least one of: adjusting, by the first electronic device, a corresponding setting parameter according to the target gesture information, and displaying a target interface according to the target interface information.
7. The operating device according to claim 6, wherein, when the first electronic device detects that a user inputs to the second electronic device based on the target gesture information to trigger the second electronic device to execute a target function, the target information includes the target gesture information and the target interface information;
the execution module is specifically configured to set, on the first electronic device and according to the target gesture information and the target interface information, the target gesture information as an operation mode for triggering execution of the target function.
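The execution step of claim 7 amounts to recording the observed gesture as the trigger for the target function on the first device. A minimal sketch, assuming an in-memory registry (the class and identifiers are illustrative, not from the claim):

```python
# Hedged sketch of claim 7: the observed target gesture is registered as
# the operation mode that triggers the target function, so that later
# inputs of the same gesture on the first device execute that function.

class GestureTriggerRegistry:
    """Maps gesture identifiers to the functions they should trigger."""
    def __init__(self):
        self._triggers = {}

    def set_trigger(self, gesture_id: str, target_function: str) -> None:
        # Set the target gesture as the operation mode for the function.
        self._triggers[gesture_id] = target_function

    def dispatch(self, gesture_id: str):
        # Subsequent inputs of the registered gesture trigger the function.
        return self._triggers.get(gesture_id)

registry = GestureTriggerRegistry()
registry.set_trigger("double-knuckle-tap", "screenshot")
print(registry.dispatch("double-knuckle-tap"))  # -> screenshot
```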
8. The operating device according to claim 6, wherein, when the first electronic device detects that the second electronic device displays a target interface corresponding to a target application program, the target information includes the target interface information;
the execution module is specifically configured to run, on the first electronic device, the target application program according to the target interface information and synchronously display the target interface.
9. An electronic device, comprising a processor, a memory, and a program or instructions stored in the memory and executable on the processor, wherein the program or instructions, when executed by the processor, implement the steps of the operation method according to any one of claims 1 to 5.
10. A readable storage medium, storing a program or instructions which, when executed by a processor, implement the steps of the operation method according to any one of claims 1 to 5.
CN202110939222.XA 2021-08-16 2021-08-16 Operation method and device and electronic equipment Pending CN113778311A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110939222.XA CN113778311A (en) 2021-08-16 2021-08-16 Operation method and device and electronic equipment

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202110939222.XA CN113778311A (en) 2021-08-16 2021-08-16 Operation method and device and electronic equipment

Publications (1)

Publication Number Publication Date
CN113778311A true CN113778311A (en) 2021-12-10

Family

ID=78838085

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110939222.XA Pending CN113778311A (en) 2021-08-16 2021-08-16 Operation method and device and electronic equipment

Country Status (1)

Country Link
CN (1) CN113778311A (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115801948A (en) * 2022-09-28 2023-03-14 维沃软件技术有限公司 Operation method and device


Similar Documents

Publication Publication Date Title
CN112486444B (en) Screen projection method, device, equipment and readable storage medium
CN112399006B (en) File sending method and device and electronic equipment
CN113794795B (en) Information sharing method and device, electronic equipment and readable storage medium
CN112269508B (en) Display method and device and electronic equipment
CN112083854A (en) Application program running method and device
CN112540740A (en) Split screen display method and device, electronic equipment and readable storage medium
CN112188001B (en) Shortcut setting method, shortcut setting device, electronic equipment and readable storage medium
CN113467660A (en) Information sharing method and electronic equipment
CN112433693A (en) Split screen display method and device and electronic equipment
CN112399010B (en) Page display method and device and electronic equipment
CN112783406B (en) Operation execution method and device and electronic equipment
CN112269509B (en) Information processing method and device and electronic equipment
CN113778311A (en) Operation method and device and electronic equipment
CN112230817B (en) Link page display method and device and electronic equipment
CN113190162A (en) Display method, display device, electronic equipment and readable storage medium
CN113485625A (en) Electronic equipment response method and device and electronic equipment
CN113672136A (en) Information display method, device, equipment and storage medium
CN113807831A (en) Payment method and device
CN113885981A (en) Desktop editing method and device and electronic equipment
CN112269511A (en) Page display method and device and electronic equipment
CN112783998A (en) Navigation method and electronic equipment
CN112764648A (en) Screen capturing method and device, electronic equipment and storage medium
CN112818094A (en) Chat content processing method and device and electronic equipment
CN113126780A (en) Input method, input device, electronic equipment and readable storage medium
CN112637407A (en) Voice input method and device and electronic equipment

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination