CN110266881B - Application control method and related product - Google Patents

Application control method and related product

Info

Publication number
CN110266881B
CN110266881B (application CN201910528820.0A)
Authority
CN
China
Prior art keywords
eye movement
application
display area
control operation
movement control
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201910528820.0A
Other languages
Chinese (zh)
Other versions
CN110266881A (en)
Inventor
韩世广
方攀
陈岩
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Guangdong Oppo Mobile Telecommunications Corp Ltd
Original Assignee
Guangdong Oppo Mobile Telecommunications Corp Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Guangdong Oppo Mobile Telecommunications Corp Ltd filed Critical Guangdong Oppo Mobile Telecommunications Corp Ltd
Priority to CN201910528820.0A priority Critical patent/CN110266881B/en
Publication of CN110266881A publication Critical patent/CN110266881A/en
Application granted granted Critical
Publication of CN110266881B publication Critical patent/CN110266881B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical


Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G06F3/013Eye tracking input arrangements
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F3/04886Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures by partitioning the display area of the touch-screen or the surface of the digitising tablet into independently controllable areas, e.g. virtual keyboards or menus
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04MTELEPHONIC COMMUNICATION
    • H04M1/00Substation equipment, e.g. for use by subscribers
    • H04M1/72Mobile telephones; Cordless telephones, i.e. devices for establishing wireless links to base stations without route selection
    • H04M1/724User interfaces specially adapted for cordless or mobile telephones
    • H04M1/72403User interfaces specially adapted for cordless or mobile telephones with means for local support of applications that increase the functionality
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04MTELEPHONIC COMMUNICATION
    • H04M1/00Substation equipment, e.g. for use by subscribers
    • H04M1/72Mobile telephones; Cordless telephones, i.e. devices for establishing wireless links to base stations without route selection
    • H04M1/724User interfaces specially adapted for cordless or mobile telephones
    • H04M1/72469User interfaces specially adapted for cordless or mobile telephones for operating the device by selecting functions from two or more displayed items, e.g. menus or icons

Landscapes

  • Engineering & Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Signal Processing (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The embodiment of the application discloses an application control method and a related product, which are applied to electronic equipment, wherein the method comprises the following steps: running a first application in a first display area of the display screen and running a second application in a second display area of the display screen; acquiring eye movement control operation aiming at the first display area based on a preset eye movement tracking strategy, and controlling the first application according to the eye movement control operation; and acquiring touch operation aiming at the second display area, and controlling the second application according to the touch operation. The method and the device are beneficial to improving the convenience and the real-time performance of split-screen application operation.

Description

Application control method and related product
Technical Field
The application relates to the technical field of electronics, in particular to an application control method and a related product.
Background
At present, smartphones support split-screen display, and some are equipped with dual screens, so two or more applications can be displayed simultaneously. Through multi-foreground technology, two or more applications can be in the foreground at the same time and respond to operations concurrently.
In split-screen or dual-screen scenarios, a user may need to operate applications running on both screens at the same time, which can degrade the real-time performance of the operation and make it inconvenient.
Disclosure of Invention
Embodiments of the present application provide an application control method and related products, so as to improve the real-time performance and convenience of application control.
In a first aspect, an embodiment of the present application provides an application control method, which is applied to an electronic device, and the method includes:
running a first application in a first display area of a display screen and running a second application in a second display area of the display screen;
acquiring eye movement control operation aiming at the first display area based on a preset eye movement tracking strategy, and controlling the first application according to the eye movement control operation;
and acquiring touch operation aiming at the second display area, and controlling the second application according to the touch operation.
In a second aspect, an embodiment of the present application provides an application control apparatus, which is applied to an electronic device, and includes a processing unit and a communication unit, wherein,
the processing unit is configured to run a first application in a first display area of a display screen and run a second application in a second display area of the display screen; the communication unit is configured to transmit an eye movement control operation signal for the first display area, and the processing unit controls the first application according to the eye movement control operation; the processing unit is further configured to acquire a touch operation for the second display area, transmit the touch operation signal through the communication unit, and control the second application according to the touch operation.
In a third aspect, an embodiment of the present application provides an electronic device, including a processor, a memory, a communication interface, and one or more programs, where the one or more programs are stored in the memory and configured to be executed by the processor, and the program includes instructions for executing steps in any method of the first aspect of the embodiment of the present application.
In a fourth aspect, the present application provides a computer-readable storage medium, where the computer-readable storage medium stores a computer program for application control, where the computer program makes a computer perform part or all of the steps described in any one of the methods of the first aspect of the present application.
In a fifth aspect, the present application provides a computer program product, wherein the computer program product includes a non-transitory computer-readable storage medium storing a computer program, and the computer program is operable to cause a computer to perform some or all of the steps as described in any one of the methods of the first aspect of the embodiments of the present application. The computer program product may be a software installation package.
It can be seen that, in the embodiment of the present application, an electronic device firstly runs a first application in a first display area of a display screen and runs a second application in a second display area of the display screen, and secondly obtains an eye movement control operation for the first display area based on a preset eye movement tracking policy, controls the first application according to the eye movement control operation, obtains a touch operation for the second display area, and controls the second application according to the touch operation. Therefore, when the electronic equipment is used for split-screen display application, the electronic equipment can control the application in different split-screen display areas by setting different control modes which are not interfered with each other in the application control process, so that the accuracy of application control is improved, and the application in different display areas is controlled by eye movement control operation and touch operation respectively, so that the convenience of controlling the application by a target user is improved, and the effectiveness of the application control under the split-screen display condition is improved.
Drawings
In order to more clearly illustrate the embodiments of the present application or the technical solutions in the prior art, the drawings used in the description of the embodiments or the prior art are briefly introduced below. It is obvious that the drawings in the following description are only some embodiments of the present application; for those skilled in the art, other drawings can be obtained from these drawings without creative effort.
FIG. 1 is a schematic diagram of a display interface of multiple applications provided in an embodiment of the present application;
fig. 2 is a schematic flowchart of an application control method provided in an embodiment of the present application;
FIG. 3 is a schematic flow chart diagram illustrating another application control method provided in an embodiment of the present application;
fig. 4 is a schematic flowchart of another application control method provided in the embodiment of the present application;
fig. 5 is a schematic structural diagram of an electronic device provided in an embodiment of the present application;
fig. 6 is a block diagram of functional units of an application control device according to an embodiment of the present application.
Detailed Description
In order to make the technical solutions of the present application better understood, the technical solutions in the embodiments of the present application will be clearly and completely described below with reference to the drawings in the embodiments of the present application, and it is obvious that the described embodiments are only a part of the embodiments of the present application, and not all of the embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present application.
The terms "first," "second," and the like in the description and claims of the present application and in the above-described drawings are used for distinguishing between different objects and not for describing a particular order. Furthermore, the terms "include" and "have," as well as any variations thereof, are intended to cover non-exclusive inclusions. For example, a process, method, system, article, or apparatus that comprises a list of steps or elements is not limited to only those steps or elements listed, but may alternatively include other steps or elements not listed, or inherent to such process, method, article, or apparatus.
Reference herein to "an embodiment" means that a particular feature, structure, or characteristic described in connection with the embodiment can be included in at least one embodiment of the application. The appearances of the phrase in various places in the specification are not necessarily all referring to the same embodiment, nor are separate or alternative embodiments mutually exclusive of other embodiments. It is explicitly and implicitly understood by one skilled in the art that the embodiments described herein can be combined with other embodiments.
The electronic device according to the embodiment of the present application may include various handheld devices, vehicle-mounted devices, wearable devices, computing devices or other processing devices connected to a wireless modem, which have wireless communication functions, and various forms of User Equipment (UE), Mobile Stations (MS), terminals (terminal device), and the like.
The following describes embodiments of the present application in detail.
Referring to fig. 1, fig. 1 is a schematic view of a display interface of an electronic device according to an embodiment of the present disclosure. As shown in fig. 1, the electronic device 10 supports multiple foreground applications, that is, the foreground can run one or more applications simultaneously. The electronic device 10 includes a first display area 11 and a second display area 12, where a first application (application A in the drawing) runs in the first display area 11 and a second application (application B in the drawing) runs in the second display area 12; the first display area 11 is controlled by eye movement, and the second display area 12 is controlled manually. At the same time, the user can control the first display area 11 and the second display area 12 of the electronic device 10 with the eyes and hands, respectively.
Referring to fig. 2, fig. 2 is a schematic flowchart illustrating an application control method according to an embodiment of the present application, applied to the electronic device shown in fig. 1, where as shown in the diagram, the application control method includes:
s201, the electronic equipment runs a first application in a first display area of a display screen and runs a second application in a second display area of the display screen.
The electronic device supports multi-foreground technology, that is, multiple applications can run in the foreground at the same time. The number of applications running in the foreground simultaneously may be 1, 2, 3, and so on; it is not specifically limited here.
The positions, sizes, shapes, and the like of the first display area and the second display area on the display screen of the electronic device are not particularly limited. The electronic device may lay out the display areas through a preset display area division policy; for example, it may determine the position, size, and shape of the first and second display areas according to the type of the application opened or selected by the user, or by acquiring a manual operation of the user.
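The division policy above can be sketched as follows. This is a minimal illustrative sketch, not from the patent itself; the function name and the portrait/landscape rule are assumptions chosen for clarity.

```python
# Hypothetical sketch of a preset display area division policy: split the
# screen into a first and a second display area, stacked vertically in
# portrait orientation and side by side in landscape.

def split_display(screen_w, screen_h):
    """Return two display areas as (x, y, width, height) tuples."""
    if screen_h >= screen_w:  # portrait: split top/bottom
        first = (0, 0, screen_w, screen_h // 2)
        second = (0, screen_h // 2, screen_w, screen_h - screen_h // 2)
    else:                     # landscape: split left/right
        first = (0, 0, screen_w // 2, screen_h)
        second = (screen_w // 2, 0, screen_w - screen_w // 2, screen_h)
    return first, second
```

A real policy could additionally consult the application types or a manual drag operation, as the text notes.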
The first application and the second application may be different; for example, the first application is a game application and the second application is a chat application. The correspondence between applications and display areas is not fixed, and the user may adjust the display positions of the first and second applications as needed.
As can be seen, in this example, the electronic device can simultaneously run multiple applications in the foreground to implement control of multiple applications simultaneously in different ways.
S202, the electronic equipment obtains eye movement control operation aiming at the first display area based on a preset eye movement tracking strategy, and controls the first application according to the eye movement control operation.
The eye movement control operation refers to eye actions made by the user to control the first display area; eye actions include blinking, gaze movement, and the like. Different eye actions correspond to different eye movement control operations, so as to implement various controls over the first application in the first display area.
For example, the electronic device monitors the eye operation of the user, and when detecting that the point of regard of the user falls on the virtual button for closing the first application, the first application is closed, and at the same time, the first display area can also be closed.
Therefore, in this example, the electronic device can control the display area by acquiring the eye movement control operation for the specific area without affecting the application of other display areas, and the convenience of application control is improved.
S203, the electronic device obtains touch operation aiming at the second display area, and controls the second application according to the touch operation.
The touch operation for the second display area obtained by the electronic device may be a sliding touch operation, a tapping touch operation, or another type of touch operation; the type of the touch operation is not particularly limited.
For example, when the user clicks a virtual close button in the second display area to close the application, the electronic device closes the second application, and at the same time, the second display area may be closed.
Therefore, in this example, the electronic device can control the display area by acquiring the manual control operation for the specific area without affecting the applications of other display areas, and convenience in application control is improved.
It can be seen that, in the embodiment of the present application, an electronic device firstly runs a first application in a first display area of a display screen and runs a second application in a second display area of the display screen, and secondly obtains an eye movement control operation for the first display area based on a preset eye movement tracking policy, controls the first application according to the eye movement control operation, obtains a touch operation for the second display area, and controls the second application according to the touch operation. Therefore, when the electronic equipment is used for split-screen display application, the electronic equipment can control the application in different split-screen display areas by setting different control modes which are not interfered with each other in the application control process, so that the accuracy of application control is improved, and the application in different display areas is controlled by eye movement control operation and touch operation respectively, so that the convenience of controlling the application by a target user is improved, and the effectiveness of the application control under the split-screen display condition is improved.
In one possible example, the electronic device obtains an eye movement control operation for the first display area based on a preset eye movement tracking strategy, and the method includes: the electronic equipment acquires the eye movement condition of a target object in a monitoring range; the electronic equipment determines eye movement control operation aiming at the first display area according to the eye movement condition.
The electronic device analyzes the eye movement to obtain the eye movement control operation for the first display area. Here, the electronic device extracts valid eye movements from those acquired; for example, when the eye control operation is a blinking control operation, the electronic device filters out the target user's invalid blinking movements, such as natural blinks.
It should be noted that, before the electronic device obtains the eye movement of the target object within the monitoring range, the method includes: the electronic device determines the target object. Generally, when only one user is within the monitoring range, that user is taken as the target object by default. When the monitoring range includes multiple users, the facial features of each user may be extracted and compared with preset facial features, and the successfully matched user is taken as the target user; alternatively, the user closest to the electronic device may be taken as the target user by default.
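The target-object selection just described can be sketched as below. This is a hedged illustration, not the patent's implementation: the similarity scores are assumed to be precomputed, and the threshold value is an arbitrary placeholder.

```python
# Hypothetical sketch: pick the target user from those in the monitoring
# range. A lone user is the target by default; otherwise prefer the best
# facial match against preset features, falling back to the nearest user.

def pick_target_user(users, match_threshold=0.8):
    """Each user is a dict with a precomputed facial 'similarity' in [0, 1]
    against the preset features, and a 'distance' to the device."""
    if len(users) == 1:
        return users[0]
    best = max(users, key=lambda u: u["similarity"])
    if best["similarity"] >= match_threshold:
        return best
    # No sufficiently good facial match: default to the closest user.
    return min(users, key=lambda u: u["distance"])
```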
Therefore, in this example, the electronic device determines the corresponding eye movement control operation according to the eye movement condition of the user, and convenience of application control is improved.
In this possible example, the eye movement situation comprises a gaze situation and/or a blink situation.
The gazing situation includes gaze points and saccades. A gaze point is a position at which the user's gaze dwells for longer than a preset time, that is, the target user continuously gazes at a position for more than the preset time to form a gaze point. The preset time may be, for example, 50 ms, 60 ms, or 65 ms; it can be set as needed, and its value is not specifically limited.
A saccade (eye jump) is the rapid movement that occurs between gaze points. The blinking situation includes the number of blinks, the frequency of blinks, and the like.
Therefore, in this example, the electronic device specifically determines the corresponding eye movement control operation according to the gaze condition and/or the blink condition of the user, so as to improve the accuracy of application control.
In one possible example, the eye movement condition comprises a gaze condition, the electronic device determining an eye movement control operation for the first display region based on the eye movement condition, comprising: the electronic equipment analyzes the watching condition to obtain at least one watching point, wherein the watching point is a position where the watching time of the target object exceeds a preset watching time and is positioned on the first display area; the electronic device determines an eye movement control operation for the first display region according to the at least one gaze point.
For example, suppose the target object's gaze dwells at a first position of the first display area for 69 ms and the preset time is 60 ms. If the first position is a virtual button, an eye-click operation on that button is performed. If it is not a virtual button, the first position is selected as the first gaze point, and the user's eye movement continues to be monitored until the target object dwells at a second position of the first display area for more than 60 ms; that is, the target object selects, through eye movement control, the content jointly determined by the first and second positions.
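The button-click-versus-selection logic in this example can be sketched as below. This is an assumed illustration; the button representation and return values are invented for clarity.

```python
# Hypothetical sketch: interpret detected gaze points. A gaze point inside
# a virtual button's rect (x, y, w, h) is an eye click on that button;
# otherwise two consecutive non-button gaze points jointly determine a
# selected region.

def handle_gaze_points(gaze_points, buttons):
    anchors = []
    for px, py in gaze_points:
        for name, (x, y, w, h) in buttons.items():
            if x <= px < x + w and y <= py < y + h:
                return ("click", name)
        anchors.append((px, py))
        if len(anchors) == 2:
            return ("select", anchors[0], anchors[1])
    return None  # not enough gaze points yet
```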
As can be seen, in this example, the electronic device determines the corresponding eye movement control operation by analyzing the related gaze point according to the gaze situation of the user on the display area, thereby expanding the control mechanism for the application.
In one possible example, the eye movement condition comprises a blinking condition, and the electronic device determines an eye movement control operation for the first display region according to the eye movement condition, comprising: the electronic equipment determines eye movement control operation aiming at the first display area according to the number of eye blinks and the frequency of eye blinks.
For example, while the target user is reading, when the electronic device detects that the number of blinks reaches three, it displays the next page of the current page; when the number of blinks reaches five, it displays the previous page; and so on. This is just one possible implementation. It should be noted that the correspondence between blink counts and execution events is stored in the electronic device in advance.
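The pre-stored correspondence between blink counts and execution events amounts to a simple lookup, sketched below with the example's page-turning mapping; the table contents and names are illustrative.

```python
# Hypothetical sketch: a blink-count -> event table stored in advance, and
# a dispatcher that resolves a detected (valid) blink count to an event.

BLINK_EVENTS = {3: "next_page", 5: "previous_page"}

def dispatch_blinks(blink_count, events=BLINK_EVENTS):
    """Return the event bound to this blink count, or None if unbound."""
    return events.get(blink_count)
```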
Therefore, in this example, the electronic device obtains the corresponding eye movement control operation according to the effective blinking condition of the user, and the different blinking times and blinking frequencies correspond to the corresponding eye movement control operation, so that the accuracy of the applied eye movement control is improved.
In one possible example, after the electronic device obtains an eye movement control operation for the first display area based on a preset eye movement tracking strategy, the method further includes: the electronic equipment controls the first display area according to the eye movement control operation.
The electronic device may control the first display area according to the eye movement control operation by adjusting the first display area, including adjusting its size and shape, or by closing the first display area.
The electronic device may also control the first display area according to the eye movement control operation by switching the display areas of the first and second applications, that is, running the second application in the first display area (controlled by eye movement) and the first application in the second display area (controlled by hand).
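The area-switching operation described above can be sketched as a swap of the area-to-application assignment. The dictionary keys here are invented names, not from the patent.

```python
# Hypothetical sketch: swap which application runs in which display area.
# "first_area" is the eye-controlled area, "second_area" the touch-
# controlled one; after the swap each app is driven by the other modality.

def swap_display_areas(assignment):
    return {
        "first_area": assignment["second_area"],
        "second_area": assignment["first_area"],
    }
```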
Therefore, in this example, the electronic device controls the display area according to the eye movement control operation, that is, the electronic device dynamically adjusts the display area according to the eye movement control operation, so that convenience in controlling the display area is improved.
In one possible example, the method further comprises: the electronic equipment acquires touch operation aiming at the second display area; the electronic equipment controls the second application to acquire target content according to the touch operation; the electronic equipment acquires eye movement control operation aiming at the first display area based on a preset eye movement tracking strategy; and the electronic equipment controls the first application to acquire the target content according to the eye movement control operation and executes a preset event according to the target content.
The first application and the second application are related applications; for example, the output of the first application can be used as the input of the second application. While manually controlling the second application, the user's gaze is tracked in real time, so that the user can control the first application in real time by directing the gaze at the first display area.
For example, the first application is a notepad and the second application is a web page. The electronic device obtains a manual copy operation on the notepad content and, according to the copy operation, copies the content to a clipboard shared by the first and second applications. When the electronic device detects that the user's gaze point falls on a virtual paste button of the web page, it pastes the content copied from the notepad to the clipboard into the web page and performs the subsequent required operations in the web page.
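The shared clipboard linking the two applications can be sketched as below; a touch-driven copy in one application writes to it, and an eye-driven paste in the other reads it back. This is an illustrative stand-in, not the device's actual clipboard implementation.

```python
# Hypothetical sketch: a clipboard shared by the first and second
# applications, bridging a manual copy and a gaze-triggered paste.

class SharedClipboard:
    def __init__(self):
        self._content = None

    def copy(self, content):
        # e.g. invoked by a manual copy operation in the notepad
        self._content = content

    def paste(self):
        # e.g. invoked when the gaze point hits the web page's paste button
        return self._content
```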
Therefore, in this example, the electronic device keeps tracking the user's gaze in real time while the user manually controls the second application, so that the user can control the first application in real time by directing the gaze at the first display area, which is beneficial to improving the real-time performance of application control when multiple foreground applications run simultaneously.
In this possible example, the target content is text, and the electronic device executes the event including at least one of: the electronic equipment carries out search query on the characters; the electronic equipment broadcasts the characters in a voice mode; and the electronic equipment queries and translates the characters.
The event executed by the electronic device may be preset by the system, or may be selected by the user through eye movement on a virtual button: when the user's eye movement selects a virtual button, the event bound to that button is executed. For example, the current interface of the first application displays virtual buttons such as "search query", "voice broadcast", and "query translation"; when it is detected that the user's gaze point falls on the "query translation" virtual button, the pasted content is queried and translated. This function can be implemented in a networked state.
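The gaze-selected button dispatch above can be sketched as a table of handlers. The handler bodies are placeholders standing in for the real networked actions (search, voice broadcast, translation); names and return strings are assumptions.

```python
# Hypothetical sketch: dispatch the event bound to the gazed virtual
# button. Real handlers would perform a web search, text-to-speech, or an
# online translation; stubs are used here.

def execute_button_event(button, text):
    handlers = {
        "search_query": lambda t: f"searching: {t}",
        "voice_broadcast": lambda t: f"speaking: {t}",
        "query_translation": lambda t: f"translating: {t}",
    }
    handler = handlers.get(button)
    return handler(text) if handler else None
```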
Therefore, in this example, one application can be controlled manually while another application related to it is controlled in real time by eye movement, so that the two control modes complement each other and the convenience of application control is improved.
In this possible example, the target content is a picture, and the event executed by the electronic device includes at least one of the following: the electronic device performs a search query on the picture; and the electronic device performs character recognition on the picture.
For example, the current interface of the first application displays virtual buttons such as "search query" and "character recognition"; when it is detected that the user's gaze point falls on the "search query" virtual button, a search query is performed on the pasted picture. This function can be implemented in a networked state. It should be noted that the executed event may also be another event, such as opening a shopping application (e.g., Taobao).
Therefore, in this example, one application can be controlled manually while another application related to it is controlled in real time by eye movement, so that the two control modes complement each other and the convenience of application control is improved.
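The button-driven preset events for text and pictures can be sketched as a small dispatch table. This is an illustrative sketch only: the event identifiers (`"search_query"`, `"voice_broadcast"`, `"query_translation"`, `"character_recognition"`) and the returned tuple as a stand-in for actually running the event are assumptions, not the patent's internal names.

```python
def execute_preset_event(target_content, content_type, button):
    """Return the preset event triggered for the pasted content when the
    gaze point lands on `button`, or None if that button does not apply
    to the content type (event and button names are illustrative)."""
    events = {
        "text": {"search_query", "voice_broadcast", "query_translation"},
        "picture": {"search_query", "character_recognition"},
    }
    if button not in events.get(content_type, set()):
        return None                       # button inapplicable: no event fires
    return (button, target_content)       # stand-in for performing the event


print(execute_preset_event("hello", "text", "query_translation"))   # ('query_translation', 'hello')
print(execute_preset_event("img.png", "picture", "voice_broadcast"))  # None
```

Keeping the allowed events per content type in one table makes it easy for the system to preset defaults or for the user to extend them.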
Referring to fig. 3, fig. 3 is a schematic flowchart of another application control method provided in the embodiment of the present application, and the application control method is applied to the electronic device shown in fig. 1, where as shown in the diagram, the application control method includes:
s301, the electronic equipment runs a first application in a first display area of a display screen and runs a second application in a second display area of the display screen;
s302, the electronic equipment acquires the eye movement condition of a target object in a monitoring range;
s303, the electronic equipment determines eye movement control operation aiming at the first display area according to the eye movement condition;
s304, the electronic equipment controls the first application according to the eye movement control operation;
s305, the electronic equipment acquires touch operation aiming at the second display area and controls the second application according to the touch operation.
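Steps S301 to S305 amount to routing each input event to the application that owns its display area and modality. The sketch below is an assumption about how such routing could look; the event shape `(kind, area, action)` and the log lists standing in for the applications are illustrative, not from the patent.

```python
def route_input(event, first_app_log, second_app_log):
    """Route one input event to the application owning its display area:
    eye-movement events drive the first app (S303-S304), touch events
    drive the second app (S305)."""
    kind, area, action = event
    if kind == "eye" and area == "first":
        first_app_log.append(action)
    elif kind == "touch" and area == "second":
        second_app_log.append(action)
    # Inputs of the wrong modality for an area are ignored, so the two
    # control modes cannot interfere with each other.
    return first_app_log, second_app_log


first, second = [], []
for ev in [("eye", "first", "scroll"), ("touch", "second", "tap"),
           ("touch", "first", "tap")]:   # touch on the first area is ignored
    route_input(ev, first, second)
print(first, second)   # ['scroll'] ['tap']
```

Discarding mismatched modality/area pairs is what gives the two control modes the non-interference property the embodiment emphasizes.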
It can be seen that, in the embodiment of the present application, the electronic device first runs a first application in a first display area of a display screen and a second application in a second display area of the display screen; it then obtains an eye movement control operation for the first display area based on a preset eye movement tracking policy and controls the first application according to that operation, and obtains a touch operation for the second display area and controls the second application according to that operation. Therefore, when the electronic device displays applications in split screen, it controls the applications in the different display areas through control modes that do not interfere with each other, which improves the accuracy of application control; and because the applications in the different display areas are controlled by eye movement and touch respectively, the convenience of application control for the user is improved, improving the effectiveness of application control under split-screen display.
Referring to fig. 4, fig. 4 is a schematic flowchart of an application control method according to an embodiment of the present application, applied to the electronic device shown in fig. 1, where as shown in the figure, the application control method includes:
s401, the electronic equipment runs a first application in a first display area of a display screen and runs a second application in a second display area of the display screen;
s402, the electronic equipment acquires touch operation aiming at the second display area;
s403, the electronic equipment controls the second application to acquire target content according to the touch operation;
s404, the electronic equipment acquires eye movement control operation aiming at the first display area based on a preset eye movement tracking strategy;
s405, the electronic equipment controls the first application to acquire the target content according to the eye movement control operation and executes a preset event according to the target content.
It can be seen that, in the embodiment of the present application, the electronic device first runs a first application in a first display area of a display screen and a second application in a second display area of the display screen; it then obtains an eye movement control operation for the first display area based on a preset eye movement tracking policy and controls the first application according to that operation, and obtains a touch operation for the second display area and controls the second application according to that operation. Therefore, when the electronic device displays applications in split screen, it controls the applications in the different display areas through control modes that do not interfere with each other, which improves the accuracy of application control; and because the applications in the different display areas are controlled by eye movement and touch respectively, the convenience of application control for the user is improved, improving the effectiveness of application control under split-screen display.
In accordance with the embodiments shown in fig. 2, fig. 3, and fig. 4, please refer to fig. 5, which is a schematic structural diagram of an electronic device 500 according to an embodiment of the present application. As shown in the figure, the electronic device 500 includes an application processor 510, a memory 520, a communication interface 530, and one or more programs 521, where the one or more programs 521 are stored in the memory 520 and configured to be executed by the application processor 510, and the one or more programs 521 include instructions for performing the following steps:
running a first application in a first display area of a display screen and running a second application in a second display area of the display screen;
acquiring eye movement control operation aiming at the first display area based on a preset eye movement tracking strategy, and controlling the first application according to the eye movement control operation;
and acquiring touch operation aiming at the second display area, and controlling the second application according to the touch operation.
It can be seen that, in the embodiment of the present application, the electronic device first runs a first application in a first display area of a display screen and a second application in a second display area of the display screen; it then obtains an eye movement control operation for the first display area based on a preset eye movement tracking policy and controls the first application according to that operation, and obtains a touch operation for the second display area and controls the second application according to that operation. Therefore, when the electronic device displays applications in split screen, it controls the applications in the different display areas through control modes that do not interfere with each other, which improves the accuracy of application control; and because the applications in the different display areas are controlled by eye movement and touch respectively, the convenience of application control for the user is improved, improving the effectiveness of application control under split-screen display.
In this possible example, in terms of obtaining the eye movement control operation for the first display area based on the preset eye movement tracking policy, the instructions in the one or more programs 521 are specifically configured to perform: acquiring the eye movement condition of a target object within a monitoring range; and determining the eye movement control operation for the first display area according to the eye movement condition.
In this possible example, the eye movement situation comprises a gaze situation and/or a blink situation.
In this possible example, where the eye movement condition comprises a gaze condition, in the aspect of determining the eye movement control operation for the first display area according to the eye movement condition, the instructions in the one or more programs 521 are specifically configured to perform: analyzing the gaze condition to obtain at least one gaze point, where a gaze point is a position on the first display area at which the gaze duration of the target object exceeds a preset gaze duration; and determining the eye movement control operation for the first display area according to the at least one gaze point.
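The gaze-point determination just described — keep only positions inside the first display area where the gaze dwells longer than a preset duration — can be sketched over raw gaze samples. The sample format `(x, y, t_ms)`, the exact-position grouping, and the 800 ms default threshold are all illustrative assumptions; real trackers would cluster noisy coordinates rather than compare them exactly.

```python
def detect_gaze_points(samples, region, dwell_ms=800):
    """Collapse raw gaze samples (x, y, t_ms) into gaze points: positions
    inside `region` (x, y, w, h) where the gaze dwells >= dwell_ms."""
    def inside(x, y):
        rx, ry, rw, rh = region
        return rx <= x < rx + rw and ry <= y < ry + rh

    points, i = [], 0
    while i < len(samples):
        x0, y0, t0 = samples[i]
        j = i
        # Extend the run while the gaze stays at the same position.
        while j + 1 < len(samples) and samples[j + 1][:2] == (x0, y0):
            j += 1
        if inside(x0, y0) and samples[j][2] - t0 >= dwell_ms:
            points.append((x0, y0))
        i = j + 1
    return points


samples = [(120, 40, 0), (120, 40, 850), (500, 40, 900)]   # (x, y, t_ms)
print(detect_gaze_points(samples, region=(0, 0, 300, 200)))  # [(120, 40)]
```

The dwell threshold is what distinguishes an intentional gaze point from the eyes merely sweeping across the first display area.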
In this possible example, where the eye movement condition comprises a blink condition, in the aspect of determining the eye movement control operation for the first display area according to the eye movement condition, the instructions in the one or more programs 521 are specifically configured to perform: determining the eye movement control operation for the first display area according to the blink count and the blink frequency.
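A blink-based mapping might look like the following. The thresholds and the operation names are illustrative assumptions made for this sketch; the patent only states that blink count and blink frequency jointly determine the operation.

```python
def blink_control_operation(blink_count, blinks_per_second):
    """Map the blink count and blink frequency observed in the monitoring
    range to an eye-movement control operation for the first display area
    (thresholds and operation names are assumed, not patent values)."""
    if blink_count >= 3 and blinks_per_second >= 1.5:
        return "close_first_app"     # rapid triple blink
    if blink_count == 2 and blinks_per_second >= 1.0:
        return "confirm"             # quick double blink acts as a click
    return None                      # too few or too slow: no operation


print(blink_control_operation(2, 1.2))   # confirm
print(blink_control_operation(1, 0.5))   # None
```

Requiring both a count and a frequency threshold helps reject involuntary blinking, which happens at a much lower rate than deliberate command blinks.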
In one possible example, the one or more programs 521 further include instructions for: and after the eye movement control operation aiming at the first display area is obtained based on the preset eye movement tracking strategy, controlling the first display area according to the eye movement control operation.
In one possible example, the one or more programs 521 further include instructions for performing the following steps: acquiring a touch operation for the second display area; controlling the second application to acquire target content according to the touch operation; acquiring an eye movement control operation for the first display area based on a preset eye movement tracking policy; and controlling the first application to acquire the target content according to the eye movement control operation and executing a preset event according to the target content.
In this possible example, the target content is text, and the execution event includes at least one of: performing search query on the characters; carrying out voice broadcast on the characters; and inquiring and translating the characters.
In this possible example, the target content is a picture, and the execution event includes at least one of: searching and inquiring the picture; and performing character recognition on the picture.
It is understood that, in order to realize the above-mentioned functions, the electronic device comprises corresponding hardware structures and/or software modules for performing the respective functions. Those of skill in the art will readily appreciate that the various illustrative elements and algorithm steps described in connection with the embodiments provided herein can be implemented as hardware or as a combination of hardware and computer software. Whether a function is performed by hardware or by computer software driving hardware depends upon the particular application and the design constraints imposed on the solution. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present application.
In the embodiment of the present application, the electronic device may be divided into the functional units according to the method example, for example, each functional unit may be divided corresponding to each function, or two or more functions may be integrated into one processing unit. The integrated unit can be realized in a form of hardware, and can also be realized in a form of a software functional unit. It should be noted that the division of the unit in the embodiment of the present application is schematic, and is only a logic function division, and there may be another division manner in actual implementation.
Fig. 6 is a block diagram showing functional units of an application control device 600 according to an embodiment of the present application. The application control device 600 is applied to an electronic device and comprises a processing unit 601 and a communication unit 602, wherein,
the processing unit 601 is configured to run a first application in a first display area of a display screen and run a second application in a second display area of the display screen; to acquire an eye movement control operation for the first display area based on a preset eye movement tracking policy, transmit the eye movement control operation signal through the communication unit 602, and control the first application according to the eye movement control operation; and to acquire a touch operation for the second display area, transmit the touch operation signal through the communication unit 602, and control the second application according to the touch operation.
The application control apparatus 600 may further include a storage unit 603 for storing program codes and data of the electronic device. The processing unit 601 may be a processor, the communication unit 602 may be a touch display screen or a transceiver, and the storage unit 603 may be a memory.
It can be seen that, in the embodiment of the present application, the electronic device first runs a first application in a first display area of a display screen and a second application in a second display area of the display screen; it then obtains an eye movement control operation for the first display area based on a preset eye movement tracking policy and controls the first application according to that operation, and obtains a touch operation for the second display area and controls the second application according to that operation. Therefore, when the electronic device displays applications in split screen, it controls the applications in the different display areas through control modes that do not interfere with each other, which improves the accuracy of application control; and because the applications in the different display areas are controlled by eye movement and touch respectively, the convenience of application control for the user is improved, improving the effectiveness of application control under split-screen display.
In this possible example, in terms of the obtaining of the eye movement control operation for the first display region based on the preset eye movement tracking policy, the processing unit 601 is specifically configured to: acquiring the eye movement condition of a target object in a monitoring range; and determining an eye movement control operation for the first display area according to the eye movement condition.
In this possible example, the eye movement situation comprises a gaze situation and/or a blink situation.
In this possible example, when the eye movement condition comprises a gaze condition, in the aspect of determining the eye movement control operation for the first display area according to the eye movement condition, the processing unit 601 is specifically configured to: analyze the gaze condition to obtain at least one gaze point, where a gaze point is a position on the first display area at which the gaze duration of the target object exceeds a preset gaze duration; and determine the eye movement control operation for the first display area according to the at least one gaze point.
In this possible example, in the case that the eye movement condition comprises a blink condition, in the aspect of determining the eye movement control operation for the first display area according to the eye movement condition, the processing unit 601 is specifically configured to: determine the eye movement control operation for the first display area according to the blink count and the blink frequency.
In a possible example, after the obtaining of the eye movement control operation for the first display region based on the preset eye movement tracking strategy, the processing unit 601 is further configured to: and controlling the first display area according to the eye movement control operation.
In one possible example, the processing unit 601 is further configured to: acquire a touch operation for the second display area; control the second application to acquire target content according to the touch operation; acquire an eye movement control operation for the first display area based on a preset eye movement tracking policy; and control the first application to acquire the target content according to the eye movement control operation and execute a preset event according to the target content.
In this possible example, the target content is text, and the execution event includes at least one of: performing search query on the characters; carrying out voice broadcast on the characters; and inquiring and translating the characters.
In this possible example, the target content is a picture, and the execution event includes at least one of: searching and inquiring the picture; and performing character recognition on the picture.
Embodiments of the present application also provide a computer storage medium, where the computer storage medium stores a computer program for application control, the computer program enabling a computer to execute part or all of the steps of any one of the methods described in the above method embodiments, and the computer includes an electronic device.
Embodiments of the present application also provide a computer program product comprising a non-transitory computer readable storage medium storing a computer program operable to cause a computer to perform some or all of the steps of any of the methods as described in the above method embodiments. The computer program product may be a software installation package, the computer comprising an electronic device.
It should be noted that, for simplicity of description, the above-mentioned method embodiments are described as a series of acts or combination of acts, but those skilled in the art will recognize that the present application is not limited by the order of acts described, as some steps may occur in other orders or concurrently depending on the application. Further, those skilled in the art should also appreciate that the embodiments described in the specification are preferred embodiments and that the acts and modules referred to are not necessarily required in this application.
In the foregoing embodiments, the descriptions of the respective embodiments have respective emphasis, and for parts that are not described in detail in a certain embodiment, reference may be made to related descriptions of other embodiments.
In the embodiments provided in the present application, it should be understood that the disclosed apparatus may be implemented in other manners. For example, the above-described embodiments of the apparatus are merely illustrative; the above-described division of the units is only one type of division of logical functions, and other divisions may be used in practice. For example, a plurality of units or components may be combined or integrated into another system, or some features may be omitted or not executed. In addition, the shown or discussed mutual coupling, direct coupling, or communication connection may be an indirect coupling or communication connection through some interfaces, devices, or units, and may be electrical or take other forms.
The units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the units can be selected according to actual needs to achieve the purpose of the solution of the embodiment.
In addition, functional units in the embodiments of the present application may be integrated into one processing unit, or each unit may exist alone physically, or two or more units are integrated into one unit. The integrated unit can be realized in a form of hardware, and can also be realized in a form of a software functional unit.
The integrated unit, if implemented in the form of a software functional unit and sold or used as a stand-alone product, may be stored in a computer-readable memory. Based on such understanding, the technical solution of the present application, in essence, or the part contributing to the prior art, or all or part of the technical solution, may be embodied in the form of a software product stored in a memory, including several instructions for causing a computer device (which may be a personal computer, a server, or a network device) to execute all or part of the steps of the method of the embodiments of the present application. The aforementioned memory includes various media capable of storing program codes, such as a USB flash drive, a Read-Only Memory (ROM), a Random Access Memory (RAM), a removable hard disk, a magnetic disk, or an optical disk.
Those skilled in the art will appreciate that all or part of the steps in the methods of the above embodiments may be implemented by associated hardware instructed by a program, which may be stored in a computer-readable memory, which may include: flash disks, Read-Only memories (ROMs), Random Access Memories (RAMs), magnetic or optical disks, and the like.
The foregoing detailed description of the embodiments of the present application has been presented to illustrate the principles and implementations of the present application, and the above description of the embodiments is only provided to help understand the method and the core concept of the present application; meanwhile, for a person skilled in the art, according to the idea of the present application, there may be variations in the specific embodiments and the application scope, and in summary, the content of the present specification should not be construed as a limitation to the present application.

Claims (11)

1. An application control method applied to an electronic device, the method comprising:
running a first application in a first display area of a display screen and running a second application in a second display area of the display screen;
acquiring eye movement control operation aiming at the first display area based on a preset eye movement tracking strategy, and controlling the first application according to the eye movement control operation;
simultaneously acquiring touch operation aiming at the second display area, and controlling the second application according to the touch operation;
the method further comprises the following steps:
acquiring touch operation aiming at the second display area;
controlling the second application to acquire target content according to the touch operation;
acquiring eye movement control operation aiming at the first display area based on a preset eye movement tracking strategy;
and controlling the first application to acquire the target content according to the eye movement control operation and executing a preset event according to the target content.
2. The method of claim 1, wherein obtaining an eye movement control operation for the first display region based on a preset eye movement tracking strategy comprises:
acquiring the eye movement condition of a target object in a monitoring range;
and determining the eye movement control operation aiming at the first display area according to the eye movement condition.
3. The method of claim 2, wherein the eye movement condition comprises a gaze condition and/or a blink condition.
4. The method of claim 3, wherein the eye movement condition comprises a gaze condition, and wherein determining an eye movement control operation for the first display region based on the eye movement condition comprises:
analyzing the gaze condition to obtain at least one gaze point, wherein a gaze point is a position on the first display area at which the gaze duration of the target object exceeds a preset gaze duration;
determining an eye movement control operation for the first display region according to the at least one gaze point.
5. The method of claim 3, wherein the eye movement comprises a blinking condition, and wherein determining an eye movement control operation for the first display region based on the eye movement comprises:
and determining the eye movement control operation aiming at the first display area according to the blink count and the blink frequency.
6. The method of claim 1, wherein after obtaining the eye movement control operation for the first display region based on the preset eye movement tracking strategy, the method further comprises:
and controlling the first display area according to the eye movement control operation.
7. The method of claim 1, wherein the target content is text, and the executing the preset event comprises at least one of:
performing search query on the characters;
carrying out voice broadcast on the characters;
and inquiring and translating the characters.
8. The method of claim 1, wherein the target content is a picture, and the performing the preset event comprises at least one of:
searching and inquiring the picture;
and performing character recognition on the picture.
9. An application control apparatus, applied to an electronic device, comprising a processing unit and a communication unit, wherein,
the processing unit is used for running a first application in a first display area of a display screen and running a second application in a second display area of the display screen; for acquiring an eye movement control operation for the first display area based on a preset eye movement tracking policy, transmitting the eye movement control operation signal through the communication unit, and controlling the first application according to the eye movement control operation; for simultaneously acquiring a touch operation for the second display area, transmitting the touch operation signal through the communication unit, and controlling the second application according to the touch operation; and further for acquiring a touch operation for the second display area, controlling the second application to acquire target content according to the touch operation, acquiring an eye movement control operation for the first display area based on a preset eye movement tracking policy, controlling the first application to acquire the target content according to the eye movement control operation, and executing a preset event according to the target content.
10. An electronic device comprising a processor, a memory, a communication interface, and one or more programs stored in the memory and configured to be executed by the processor, the programs comprising instructions for performing the steps in the method of any of claims 1-8.
11. A computer-readable storage medium, in which a computer program for application control is stored, wherein the computer program causes a computer to perform the method according to any one of claims 1 to 8.
CN201910528820.0A 2019-06-18 2019-06-18 Application control method and related product Active CN110266881B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201910528820.0A CN110266881B (en) 2019-06-18 2019-06-18 Application control method and related product

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201910528820.0A CN110266881B (en) 2019-06-18 2019-06-18 Application control method and related product

Publications (2)

Publication Number Publication Date
CN110266881A CN110266881A (en) 2019-09-20
CN110266881B true CN110266881B (en) 2021-03-12

Family

ID=67919088

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201910528820.0A Active CN110266881B (en) 2019-06-18 2019-06-18 Application control method and related product

Country Status (1)

Country Link
CN (1) CN110266881B (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111193938B (en) * 2020-01-14 2021-07-13 腾讯科技(深圳)有限公司 Video data processing method, device and computer readable storage medium

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104301507A (en) * 2013-07-15 2015-01-21 Lg电子株式会社 Mobile terminal and control method thereof
CN104769524A (en) * 2012-11-07 2015-07-08 高通股份有限公司 Techniques for utilizing a computer input device with multiple computers
CN105378595A (en) * 2013-06-06 2016-03-02 微软技术许可有限责任公司 Calibrating eye tracking system by touch input
CN105849712A (en) * 2013-10-23 2016-08-10 三星电子株式会社 Method and device for transmitting data, and method and device for receiving data
CN108803867A (en) * 2018-04-12 2018-11-13 珠海市魅族科技有限公司 A kind of information processing method and device

Family Cites Families (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US11010042B2 (en) * 2014-02-13 2021-05-18 Lenovo (Singapore) Pte. Ltd. Display of different versions of user interface element
US9936195B2 (en) * 2014-11-06 2018-04-03 Intel Corporation Calibration for eye tracking systems
CN104834446B (en) * 2015-05-04 2018-10-26 惠州Tcl移动通信有限公司 A kind of display screen multi-screen control method and system based on eyeball tracking technology
CN104991642A (en) * 2015-06-18 2015-10-21 惠州Tcl移动通信有限公司 Method for intelligent terminal question answering
CN105204993A (en) * 2015-09-18 2015-12-30 中国航天员科研训练中心 Questionnaire test system and method based on multimodal interactions of eye movement, voice and touch screens
CN106095112B (en) * 2016-06-24 2020-06-23 联想(北京)有限公司 Information processing method and device
CN106919294B (en) * 2017-03-10 2020-07-21 京东方科技集团股份有限公司 3D touch interaction device, touch interaction method thereof and display device
US10496162B2 (en) * 2017-07-26 2019-12-03 Microsoft Technology Licensing, Llc Controlling a computer using eyegaze and dwell
US10678116B1 (en) * 2017-11-09 2020-06-09 Facebook Technologies, Llc Active multi-color PBP elements
CN109587344A (en) * 2018-12-28 2019-04-05 北京七鑫易维信息技术有限公司 Call control method, device, mobile terminal and medium based on mobile terminal

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104769524A (en) * 2012-11-07 2015-07-08 Qualcomm Incorporated Techniques for utilizing a computer input device with multiple computers
CN105378595A (en) * 2013-06-06 2016-03-02 Microsoft Technology Licensing, LLC Calibrating eye tracking system by touch input
CN104301507A (en) * 2013-07-15 2015-01-21 LG Electronics Inc. Mobile terminal and control method thereof
CN105849712A (en) * 2013-10-23 2016-08-10 Samsung Electronics Co., Ltd. Method and device for transmitting data, and method and device for receiving data
CN108803867A (en) * 2018-04-12 2018-11-13 Zhuhai Meizu Technology Co., Ltd. Information processing method and device

Also Published As

Publication number Publication date
CN110266881A (en) 2019-09-20

Similar Documents

Publication Publication Date Title
CN109062479B (en) Split screen application switching method and device, storage medium and electronic equipment
US11328044B2 (en) Dynamic recognition method and terminal device
CN107801096A Video playback control method, device, terminal device and storage medium
CN106681635A Method and device for preventing accidental touch in split-screen mode, and mobile terminal
CN113170000A Device control method, apparatus and system, electronic device, and cloud server
CN108965981B (en) Video playing method and device, storage medium and electronic equipment
CN111338838A (en) Method for controlling frequency of central processing unit and related device
CN109062715B (en) Method and device for determining memory clock frequency and terminal
WO2019011098A1 (en) Unlocking control method and relevant product
US10732724B2 (en) Gesture recognition method and apparatus
CN108108117B (en) Screen capturing method and device and terminal
CN112689201A (en) Barrage information identification method, barrage information display method, server and electronic equipment
US10088897B2 (en) Method and electronic device for improving performance of non-contact type recognition function
CN110266881B (en) Application control method and related product
CN110572704A (en) method, device, equipment and medium for controlling bullet screen playing speed
CN107728877B (en) Application recommendation method and mobile terminal
WO2015085874A1 (en) Method and device for identifying gesture
CN112654957B (en) Suspended window control method and related products
CN111556358B (en) Display method and device and electronic equipment
CN104980582A (en) Screen switching method and mobile terminal
WO2020124454A1 (en) Font switching method and related product
CN109885170A (en) Screenshotss method, wearable device and computer readable storage medium
CN116980719A (en) Video opening method and device and computer readable storage medium
CN111081197A (en) Brightness parameter synchronization method, related device and readable storage medium
CN114510188A (en) Interface processing method, intelligent terminal and storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant