CN108958587B - Split screen processing method and device, storage medium and electronic equipment

Split screen processing method and device, storage medium and electronic equipment

Info

Publication number
CN108958587B
CN108958587B (application CN201810739554.1A; pre-grant publication CN108958587A)
Authority
CN
China
Prior art keywords
screen
split
split screen
application
electronic equipment
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201810739554.1A
Other languages
Chinese (zh)
Other versions
CN108958587A (en)
Inventor
曹丹
林志泳
付亮晶
李同喜
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Guangdong Oppo Mobile Telecommunications Corp Ltd
Original Assignee
Guangdong Oppo Mobile Telecommunications Corp Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Guangdong Oppo Mobile Telecommunications Corp Ltd
Priority to CN201810739554.1A
Publication of CN108958587A
Application granted
Publication of CN108958587B
Legal status: Active (current)
Anticipated expiration

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0481 Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F 3/04812 Interaction techniques based on cursor appearance or behaviour, e.g. being affected by the presence of displayed objects
    • G06F 3/0484 Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F 3/04847 Interaction techniques to control parameter settings, e.g. interaction with sliders or dials
    • G06F 3/0487 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser

Abstract

The embodiment of the application discloses a split-screen processing method and device, a storage medium, and an electronic device. The method includes: displaying a split-screen interface, where the split-screen interface includes split-screen areas corresponding to a plurality of split-screen applications currently running on the electronic device; acquiring a focus area where the eyes focus on the screen of the electronic device, and determining the durations for which the eyes focus on each of the split-screen areas within a preset duration; determining a target split-screen application among the plurality of split-screen applications according to the durations; and switching the target split-screen application to a preset split-screen area. In this way, when the electronic device displays a split-screen interface, the target split-screen application the user is paying attention to can be determined from the focus area of the user's gaze on the screen, and the target split-screen application can then be switched to the preset split-screen area, which improves the efficiency of switching between split-screen areas.

Description

Split screen processing method and device, storage medium and electronic equipment
Technical Field
The application relates to the field of electronic equipment, in particular to a split screen processing method and device, a storage medium and electronic equipment.
Background
With the development of terminal technology, terminals have evolved from devices that simply provide telephony into platforms for running general-purpose software. Such a platform no longer aims only at call management; it provides an operating environment for a wide range of application software, such as call management, games and entertainment, office work, and mobile payment. With their widespread adoption, these platforms have become deeply embedded in people's daily life and work.
Split-screen use is now a common scenario on Android smart terminals. Users place high demands on the split-screen experience: the responsiveness of split-screen operations, the quality of the display, how easily split-screen mode can be entered and exited, and how user-friendly it feels all matter, and fluency during operation also strongly affects the experience. Users associate the performance of split-screen operation with their overall impression of the phone, which in turn affects the brand and reputation of the product.
However, the existing Android system performs no special handling for the scenario in which a user works with split screens; the interaction remains fairly primitive, so users cannot operate it quickly.
Disclosure of Invention
The embodiment of the application provides a split screen processing method and device, a storage medium and electronic equipment, which can improve the switching efficiency of the electronic equipment to a split screen area.
In a first aspect, an embodiment of the present application provides a split-screen processing method, including:
displaying a split screen interface, wherein the split screen interface comprises split screen areas corresponding to a plurality of split screen applications currently running on the electronic equipment;
acquiring a focusing area of an eye focused on the screen of the electronic equipment, and determining the time length of the eye respectively focused on the multiple split screen areas within a preset time length;
determining a target split screen application among the plurality of split screen applications according to the duration;
and switching the target split screen application to a preset split screen area.
In a second aspect, an embodiment of the present application further provides a split screen processing apparatus, including: the device comprises a display module, an acquisition module, a determination module and a switching module;
the display module is used for displaying a split screen interface, and the split screen interface comprises split screen areas corresponding to a plurality of split screen applications currently running on the electronic equipment;
the acquisition module is used for acquiring a focusing area of an eye focused on the screen of the electronic equipment and determining the time length of the eye respectively focused on the multiple split screen areas within a preset time length;
the determining module is used for determining a target split screen application in the plurality of split screen applications according to the duration;
the switching module is used for switching the target split screen application to a preset split screen area.
In a third aspect, an embodiment of the present application further provides a storage medium, on which a computer program is stored, where the computer program, when executed by a processor, implements the steps of the split-screen processing method.
In a fourth aspect, an embodiment of the present application further provides an electronic device, which includes a memory, a processor, and a computer program stored in the memory and executable on the processor, where the processor implements the steps of the split-screen processing method when executing the program.
In the split-screen processing method provided by the embodiment of the application, a split-screen interface is first displayed, where the split-screen interface includes split-screen areas corresponding to a plurality of split-screen applications currently running on the electronic device. A focus area where the eyes focus on the screen of the electronic device is acquired, and the durations for which the eyes focus on each of the split-screen areas within a preset duration are determined. A target split-screen application is then determined among the plurality of split-screen applications according to the durations, and the target split-screen application is switched to a preset split-screen area. In this way, when the electronic device displays a split-screen interface, the target split-screen application the user is paying attention to can be determined from the focus area of the user's gaze on the screen, the target split-screen application can be switched to the preset split-screen area, and the efficiency of switching between split-screen areas is improved.
Drawings
In order to more clearly illustrate the technical solutions in the embodiments of the present application, the drawings needed in the description of the embodiments are briefly introduced below. It is apparent that the drawings in the following description show only some embodiments of the present application, and that other drawings can be obtained from them by those skilled in the art without creative effort.
Fig. 1 is a schematic flow chart of a split-screen processing method according to an embodiment of the present application.
Fig. 2 is a scene schematic diagram of a split-screen processing method provided in the embodiment of the present application.
Fig. 3 is another schematic flow chart of a split-screen processing method according to an embodiment of the present application.
Fig. 4 is a schematic structural diagram of a split-screen processing apparatus according to an embodiment of the present application.
Fig. 5 is another schematic structural diagram of a split-screen processing apparatus according to an embodiment of the present application.
Fig. 6 is a schematic structural diagram of an electronic device according to an embodiment of the present application.
Fig. 7 is another schematic structural diagram of an electronic device according to an embodiment of the present application.
Detailed Description
Referring to the drawings, wherein like reference numbers refer to like elements, the principles of the present application are illustrated as being implemented in a suitable computing environment. The following description is based on illustrated embodiments of the application and should not be taken as limiting the application with respect to other embodiments that are not detailed herein.
In the description that follows, specific embodiments of the present application will be described with reference to steps and symbols executed by one or more computers, unless otherwise indicated. Accordingly, these steps and operations will at times be referred to as being computer-executed, meaning that they are performed by a processing unit of the computer operating on electronic signals that represent data in a structured form. This operation transforms the data or maintains it at locations in the computer's memory system, which may be reconfigured or otherwise altered in a manner well known to those skilled in the art. The data structures in which the data is maintained are physical locations of the memory that have particular properties defined by the data format. However, while the principles of the application are described in these terms, this is not meant to be limiting, and those of ordinary skill in the art will recognize that various of the steps and operations described below may also be implemented in hardware.
The principles of the present application may be employed in numerous other general-purpose or special-purpose computing, communication environments or configurations. Examples of well known computing systems, environments, and configurations that may be suitable for use with the application include, but are not limited to, hand-held telephones, personal computers, servers, multiprocessor systems, microcomputer-based systems, mainframe-based computers, and distributed computing environments that include any of the above systems or devices.
The details will be described below separately.
The present embodiment will be described from the perspective of a split-screen processing apparatus, which may be integrated in an electronic device. The electronic device may be a device with an image processing function, such as a mobile internet device (e.g., a smartphone or a tablet computer).
Referring to fig. 1 first, fig. 1 is a schematic flow chart of a split-screen processing method provided in an embodiment of the present application, including the following steps:
step S101, displaying a split screen interface, wherein the split screen interface comprises split screen areas corresponding to a plurality of split screen applications currently running on the electronic equipment.
In one embodiment, the electronic device may receive a split-screen instruction, where the split-screen instruction is used to start the split-screen mode of the electronic device and display a split-screen interface, and the user selects at least two applications to be displayed in the split-screen interface. In split-screen mode, the user may also be allowed to select or divide the split-screen areas of the current split-screen interface of the electronic device.
In an embodiment, the split-screen instruction may also be generated by a gesture, for example a two-finger touch on the screen, a finger pressed on the screen display interface and held stationary, or a finger sliding across the screen display interface. Alternatively, the split-screen instruction may be generated by voice; correspondingly, the electronic device is a device with voice-reception control. For example, the electronic device displays a first application and a second application in a split-screen manner according to the split-screen instruction.
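As a rough illustration of how such gesture triggers might be recognized, the following Kotlin sketch maps raw touch samples to one of the gestures mentioned above. The TouchSample type, the trigger names, and the 500 ms and distance thresholds are illustrative assumptions, not details taken from this application.

```kotlin
import kotlin.math.hypot

// Raw touch data as assumed for this sketch (not an Android API type).
data class TouchSample(val pointerCount: Int, val x: Float, val y: Float, val timeMillis: Long)

enum class SplitScreenTrigger { NONE, TWO_FINGER_TOUCH, LONG_PRESS, SLIDE }

fun detectSplitScreenGesture(samples: List<TouchSample>): SplitScreenTrigger {
    if (samples.isEmpty()) return SplitScreenTrigger.NONE
    val first = samples.first()
    val last = samples.last()
    val durationMillis = last.timeMillis - first.timeMillis
    val distance = hypot(last.x - first.x, last.y - first.y)
    return when {
        samples.any { it.pointerCount >= 2 } -> SplitScreenTrigger.TWO_FINGER_TOUCH  // two-finger touch
        durationMillis >= 500 && distance < 20f -> SplitScreenTrigger.LONG_PRESS     // press held stationary
        distance >= 200f -> SplitScreenTrigger.SLIDE                                  // finger slides across the interface
        else -> SplitScreenTrigger.NONE
    }
}
```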
In an embodiment, the screen of the electronic device may display a separator bar on the display interface, where the separator bar sits between the split-screen applications. For example, if the current split-screen interface includes two split-screen applications, one separator bar is generated between them; if it includes three split-screen applications, a separator bar is generated between each pair of adjacent applications, i.e. two separator bars in total. The user can pull or drag a separator bar to perform split-screen operations. A separator bar can be displayed directly on the screen display interface of the electronic device, or it can be called up (or generated) in various ways. For example, the separator bar may be an edge line of the screen display interface.
The split-screen display function may be preset in the electronic device, or may be obtained by downloading a corresponding application program. The function of the split-screen display may correspond to a split-screen instruction, and when the split-screen instruction is not triggered, the electronic device may display the running application program through the full screen of the display screen of the electronic device, as with the electronic device in the prior art. When the screen splitting instruction is triggered, the display screen of the electronic device can be divided into a plurality of display areas, such as a first display area and a second display area. The first display region and the second display region may not overlap each other to ensure independence of display contents therein.
In an embodiment, before split-screen display, a split-screen display mode of a display screen of an electronic device may be preset, and the split-screen display mode may include, for example, up-down split-screen display or left-right split-screen display. The display screen of the electronic device can be divided into display areas consistent with the split-screen display mode according to the split-screen display mode. For example, the display screen of the electronic device may be divided into a first display area and a second display area which are uniform in size and distributed vertically, and of course, the display screen of the electronic device may be divided into a first display area and a second display area which are unequal in size and distributed horizontally. The split-screen display mode can be determined according to the current display mode of the first application program. For example, if the first application is a landscape display, the first application and the second application may be split left and right, and if the first application is a portrait display, the first application and the second application may be split up and down.
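A minimal Kotlin sketch of the layout decision just described, in which the split direction follows the first application's current display orientation; the enum and function names below are illustrative assumptions, not names from this application.

```kotlin
enum class AppOrientation { PORTRAIT, LANDSCAPE }
enum class SplitLayout { TOP_BOTTOM, LEFT_RIGHT }

fun chooseSplitLayout(firstAppOrientation: AppOrientation): SplitLayout =
    when (firstAppOrientation) {
        AppOrientation.LANDSCAPE -> SplitLayout.LEFT_RIGHT  // landscape first app: split left and right
        AppOrientation.PORTRAIT -> SplitLayout.TOP_BOTTOM   // portrait first app: split up and down
    }
```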
Step S102, acquiring a focusing area of the eyes focused on the screen of the electronic equipment, and determining the time length of the eyes respectively focused on the plurality of split screen areas within a preset time length.
In an embodiment, the focus area where the eyes focus on the screen of the electronic device may be acquired in various ways. For example, a human-eye image of the user may be acquired, and the focal position of the human eye may then be determined from that image; this focal position is the focus area where the eyes focus on the screen of the electronic device.
In an embodiment, a face image of the user may be acquired first, and the human-eye image may then be extracted from the face image, which improves the speed and efficiency of acquiring the human-eye image. For example, a front camera of the electronic device performs face recognition to obtain a face image in the captured image, and a human-eye image is then extracted from the face image. Face recognition is a biometric technology that identifies a person based on facial feature information: a camera or video camera captures an image or video stream containing a face, the face in the image is automatically detected and tracked, and recognition is then performed on the detected face. The face in the original image may be detected using an AdaBoost (adaptive boosting) algorithm based on Haar features, or using another detection algorithm, which is not limited in this embodiment.
After the human-eye image is extracted, the focus area of the human eye on the screen of the electronic device can be determined using eyeball-tracking technology. Eyeball tracking can be performed in several ways: first, by tracking feature changes of the eyeball and its surroundings; second, by tracking changes in the angle of the iris; and third, by actively projecting light beams such as infrared onto the iris and extracting features from the reflections. In terms of precision, the infrared-projection approach has a clear advantage: it can be accurate to within about 1 cm on a 30-inch screen, and together with techniques such as blink recognition and gaze recognition it can, to some extent, replace a mouse or touchpad for limited operations. Other image acquisition devices, such as the camera on a computer or mobile phone, can also perform eyeball tracking with software support, though with differences in accuracy, speed, and stability. Thus, in one embodiment, eye tracking may be performed with the aid of an infrared distance sensor on the electronic device to determine the focus area of the human eye on the screen.
In one embodiment, determining the focus area of the human eye on the screen of the electronic device essentially amounts to determining the target split-screen area at which the eye is focused. For example, the focal position of the human eye on the screen of the electronic device is determined from the human-eye image, and the target split-screen area is then determined from that focal position together with the split-screen areas, so that the durations for which the eyes focus on each split-screen area within the preset duration can be determined.
Specifically, the coordinates of the focal position where the eyes focus on the screen of the electronic device may be acquired, and the coordinate set (i.e. the split-screen area) containing those coordinates may be determined. The duration for which the eyes focus on a given split-screen area within the preset duration is then the duration for which the user pays attention to the corresponding application within that period. For example, if the current split-screen interface of the electronic device includes an application 1 and an application 2, then when the coordinates of the focal position fall within the split-screen area corresponding to application 1, the user is determined to be currently paying attention to application 1, and when they fall within the split-screen area corresponding to application 2, the user is determined to be currently paying attention to application 2.
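As an illustration of mapping a focal position to a split-screen area via its coordinate set, a minimal Kotlin sketch follows; the ScreenPoint and SplitRegion helper types are assumptions introduced only for this example.

```kotlin
data class ScreenPoint(val x: Float, val y: Float)

// One split-screen area: the application it shows plus the rectangle (coordinate set) it occupies on screen.
data class SplitRegion(val appName: String, val left: Float, val top: Float, val right: Float, val bottom: Float) {
    fun contains(p: ScreenPoint): Boolean = p.x in left..right && p.y in top..bottom
}

// Returns the split-screen area whose coordinate set contains the focal position, if any.
fun regionUnderGaze(focus: ScreenPoint, regions: List<SplitRegion>): SplitRegion? =
    regions.firstOrNull { it.contains(focus) }
```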
In an embodiment, the preset duration may be a time window immediately before the current time. For example, if the preset duration is 20 minutes and the current time is 2:00 pm, the durations for which the eyes focus on each split-screen area are measured over the period from 1:40 pm to 2:00 pm. In other embodiments, the preset duration may instead run from the moment the electronic device enters split-screen mode up to the current time. In actual use, the split-screen application shown in a split-screen area may be switched after the user starts split-screen mode, so to further improve the accuracy of the measured durations, the durations for which the eyes focus on each split-screen area may be measured over the preset duration following such a switch of the split-screen application.
For example, within a 20-minute window, the user's eyes may focus on the split-screen area corresponding to application 1 for 6 minutes and on the split-screen area corresponding to application 2 for 12 minutes; that is, within those 20 minutes the user pays attention to application 1 for 6 minutes and to application 2 for 12 minutes.
And step S103, determining a target split screen application among the plurality of split screen applications according to the time length.
In an embodiment, the split-screen application corresponding to the split-screen area on which the user's gaze dwells longest within the preset duration may be determined as the target split-screen application. For example, if the preset duration is 20 minutes and, within those 20 minutes, the user pays attention to application 1 for 6 minutes and to application 2 for 12 minutes, application 2 may be determined as the target split-screen application. The foregoing is described using two split-screen applications as an example; other embodiments may involve three or more split-screen applications, which is not further limited in this application.
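A sketch of how the focus durations of steps S102 and S103 could be accumulated from periodic gaze samples and used to pick the target split-screen application. It reuses the ScreenPoint and SplitRegion helpers from the earlier sketch; the sampling period and the default 20-minute window are illustrative assumptions.

```kotlin
data class GazeSample(val timeMillis: Long, val focus: ScreenPoint)

fun targetSplitApp(
    samples: List<GazeSample>,
    regions: List<SplitRegion>,
    nowMillis: Long,
    windowMillis: Long = 20 * 60 * 1000L,   // preset duration, e.g. the 20 minutes before now
    sampleIntervalMillis: Long = 100L       // assumed gaze sampling period
): String? {
    val focusTime = mutableMapOf<String, Long>()
    samples
        .filter { nowMillis - it.timeMillis <= windowMillis }   // keep samples inside the preset window
        .forEach { sample ->
            regionUnderGaze(sample.focus, regions)?.let { region ->
                // each sample adds one sampling interval of focus time to its region's application
                focusTime[region.appName] = (focusTime[region.appName] ?: 0L) + sampleIntervalMillis
            }
        }
    // the application watched longest within the window is the target split-screen application
    return focusTime.maxByOrNull { it.value }?.key
}
```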
And step S104, switching the target split screen application to a preset split screen area.
In the embodiment of the application, a preset split-screen area can be designated among the multiple split-screen areas. For example, the area of the screen that is most convenient for the user to operate can be set as the preset split-screen area: in practice, because the screens of electronic devices on the market are getting larger and larger, setting the area close to the user's fingers as the preset split-screen area keeps touch operation convenient. For example, in vertical screen mode the lower split-screen area can be set as the preset split-screen area, and in horizontal screen mode the right-hand split-screen area can be set as the preset split-screen area; of course, if the user habitually uses the left hand, the left-hand split-screen area can be set as the preset split-screen area instead.
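The choice of preset split-screen area described above could be expressed as follows; the enum names and the handedness setting are assumptions for illustration only (HORIZONTAL_SCREEN and VERTICAL_SCREEN correspond to the horizontal and vertical screen modes mentioned above).

```kotlin
enum class ScreenDisplayMode { VERTICAL_SCREEN, HORIZONTAL_SCREEN }
enum class Handedness { LEFT, RIGHT }
enum class PresetRegion { BOTTOM, LEFT, RIGHT }

fun presetRegion(mode: ScreenDisplayMode, handedness: Handedness = Handedness.RIGHT): PresetRegion =
    when (mode) {
        // vertical screen: the lower area is easiest to reach
        ScreenDisplayMode.VERTICAL_SCREEN -> PresetRegion.BOTTOM
        // horizontal screen: right side for right-handed users, left side otherwise
        ScreenDisplayMode.HORIZONTAL_SCREEN ->
            if (handedness == Handedness.LEFT) PresetRegion.LEFT else PresetRegion.RIGHT
    }
```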
For example, as shown in fig. 2, fig. 2 is a scene schematic diagram of the split-screen processing method provided in the embodiment of the present application. The split-screen interface currently displayed by the electronic device includes the split-screen areas corresponding to application 1 and application 2. If the preset duration is 20 minutes and the electronic device detects that, within those 20 minutes, the user's eyes focus on the split-screen area corresponding to application 1 for 6 minutes and on the split-screen area corresponding to application 2 for 12 minutes, application 2 can be determined as the target split-screen application. A switch button can then be generated, and when the user taps the button, application 2 is switched to the lower, preset split-screen area.
In an embodiment, the switching button for switching the split screen area may be further disposed in the split screen area where the target split screen application is located, so that the user is informed of the target split screen application determined by the electronic device, and user experience is improved.
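A minimal sketch of the switch itself, performed once the user taps the switch button: the target application's pane and the preset pane exchange their region assignments. The mutable map of region name to application is an assumed model of the split-screen interface, not an Android API.

```kotlin
fun switchToPresetRegion(
    regionToApp: MutableMap<String, String>,  // e.g. "upper" -> "application 1", "lower" -> "application 2"
    targetApp: String,
    presetRegion: String
) {
    val currentRegion = regionToApp.entries.firstOrNull { it.value == targetApp }?.key ?: return
    if (currentRegion == presetRegion) return              // already in the preset split-screen area
    val displacedApp = regionToApp[presetRegion]
    regionToApp[presetRegion] = targetApp                  // move the target application into the preset area
    if (displacedApp != null) regionToApp[currentRegion] = displacedApp  // the displaced app takes the old slot
}
```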
In this embodiment of the present application, the electronic device may be any intelligent electronic device capable of running applications, for example a mobile phone, a tablet computer (Tablet Personal Computer), a laptop computer, a personal digital assistant (PDA), a mobile internet device (MID), a wearable device, or the like.
As can be seen from the above, in the embodiment of the application a split-screen interface can be displayed, where the split-screen interface includes split-screen areas corresponding to multiple split-screen applications currently running on the electronic device; a focus area where the eyes focus on the screen of the electronic device is acquired; the durations for which the eyes focus on each of the split-screen areas within a preset duration are determined; a target split-screen application is determined among the multiple split-screen applications according to the durations; and the target split-screen application is switched to the preset split-screen area. In this way, when the electronic device displays a split-screen interface, the target split-screen application the user is paying attention to can be determined from the focus area of the user's gaze on the screen, the target split-screen application can be switched to the preset split-screen area, and the efficiency of switching between split-screen areas is improved.
According to the description of the previous embodiment, the screen splitting processing method of the present application will be further explained below.
Referring to fig. 3, fig. 3 is a schematic flow chart of another split-screen processing method according to an embodiment of the present application, including the following steps:
step S201, displaying a split screen interface, wherein the split screen interface comprises split screen areas corresponding to a plurality of split screen applications currently running on the electronic equipment.
In one embodiment, the electronic device may receive a split-screen instruction, where the split-screen instruction is used to start the split-screen mode of the electronic device and display a split-screen interface, and the user selects at least two applications to be displayed in the split-screen interface. In split-screen mode, the user may also be allowed to select or divide the split-screen areas of the current split-screen interface of the electronic device.
In an embodiment, the split-screen instruction may also be generated by a gesture, for example a two-finger touch on the screen, a finger pressed on the screen display interface and held stationary, or a finger sliding across the screen display interface. Alternatively, the split-screen instruction may be generated by voice; correspondingly, the electronic device is a device with voice-reception control. For example, the electronic device displays a first application and a second application in a split-screen manner according to the split-screen instruction.
Step S202, a human eye image of the user is obtained, and eyeball characteristic information in the human eye image is extracted.
In an embodiment, the front camera can first acquire a face image of the user, and the human-eye image is then extracted from the face image, which improves the speed and efficiency of acquiring the human-eye image. For example, the front camera of the electronic device performs face recognition to obtain the face image in the captured image, the human-eye image is then extracted from the face image, and the eyeball feature information in the human-eye image can then be extracted. The eyeball feature information may include the position of the pupil within the human eye.
Step S203, determining the focal position of the eyes focused on the screen of the electronic equipment according to the eyeball characteristic information.
Specifically, the focal position of the human eye on the screen of the electronic device can be determined from the position of the pupil within the human eye. In other embodiments, tracking can be performed according to feature changes of the eyeball and its surroundings, or according to changes in the angle of the iris, or features can be extracted by actively projecting light beams such as infrared onto the iris, in order to determine the focal position of the human eye on the screen of the electronic device. That is, the eyeball feature information includes the position of the pupil within the human eye, and determining the focal position where the eyes focus on the screen of the electronic device according to the eyeball feature information includes:
and determining the focal position of the eye focused on the screen of the electronic equipment according to the position of the pupil in the human eye.
And step S204, determining a target split screen area according to the focal position of the eye focused on the screen of the electronic equipment.
Specifically, the focus position of human eyes on the screen of the electronic equipment is determined through a human eye image, and then a target split screen area is determined according to the focus position of the eyes focused on the screen of the electronic equipment and the multiple split screen areas.
For example, coordinates of a focal position of an eye focused on a screen of the electronic device may be acquired, a coordinate set where the coordinates are located may be determined, and since each of the plurality of split screen areas includes one coordinate set, a target split screen area where the eye of the user is currently focused may be further determined.
Step S205, determining the time length of focusing of the eyes on the plurality of split screen areas respectively within the preset time length.
For example, if the current split-screen interface of the electronic device includes an application 1 and an application 2, then when the coordinates of the focal position where the eyes focus on the screen fall within the split-screen area corresponding to application 1, the user is determined to be currently paying attention to application 1, and when they fall within the split-screen area corresponding to application 2, the user is determined to be currently paying attention to application 2.
In an embodiment, the preset duration may be a time window immediately before the current time. For example, if the preset duration is 20 minutes and the current time is 2:00 pm, the durations for which the eyes focus on each split-screen area are measured over the period from 1:40 pm to 2:00 pm.
And step S206, determining a target split-screen application among the plurality of split-screen applications according to the time length.
For example, within a 20-minute window, the user's eyes may focus on the split-screen area corresponding to application 1 for 6 minutes and on the split-screen area corresponding to application 2 for 12 minutes; that is, the user pays attention to application 1 for 6 minutes and to application 2 for 12 minutes, so application 2 can be determined as the target split-screen application. The foregoing is described using two split-screen applications as an example; other embodiments may involve three or more split-screen applications, which is not further limited in this application.
In an embodiment, before switching the target split-screen application to the preset split-screen area, it may be determined whether the target split-screen application is currently located in the preset split-screen area, if so, the switching is not necessary, and if not, the step of switching the target split-screen application to the preset split-screen area is continuously performed. That is, before the target split-screen application is switched to the preset split-screen area, the method further includes:
judging whether the target split-screen application is located in the preset split-screen area currently;
if not, generating an identifier for switching the target split-screen application position;
and receiving touch operation of a user aiming at the identifier, and switching the target split screen application to a preset split screen area according to the touch operation.
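Put together, the pre-switch check listed above might look like the following sketch, which reuses switchToPresetRegion from the earlier sketch; showSwitchIdentifier and awaitTouchOnIdentifier are placeholders for UI code, not real APIs.

```kotlin
fun maybeSwitch(
    regionToApp: MutableMap<String, String>,
    targetApp: String,
    presetRegion: String,
    showSwitchIdentifier: (String) -> Unit,   // placeholder: render the switch identifier in the target pane
    awaitTouchOnIdentifier: () -> Boolean     // placeholder: true if the user touches the identifier
) {
    if (regionToApp[presetRegion] == targetApp) return  // target already in the preset area: no switch needed
    showSwitchIdentifier(targetApp)
    if (awaitTouchOnIdentifier()) {
        switchToPresetRegion(regionToApp, targetApp, presetRegion)
    }
}
```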
Step S207, acquiring the current screen display mode of the electronic equipment, wherein the screen display mode comprises a horizontal screen display mode and a vertical screen display mode.
And S208, determining a preset split screen area according to the screen display mode, and switching the target split screen application to the preset split screen area.
In the embodiment of the application, a preset split-screen area can be designated among the multiple split-screen areas. For example, the area of the screen that is most convenient for the user to operate can be set as the preset split-screen area: in practice, because the screens of electronic devices on the market are getting larger and larger, setting the area close to the user's fingers as the preset split-screen area keeps touch operation convenient. For example, in vertical screen mode the lower split-screen area can be set as the preset split-screen area, and in horizontal screen mode the right-hand split-screen area can be set as the preset split-screen area; the target split-screen application is then switched to the preset split-screen area.
In this embodiment of the present application, the electronic device may be any intelligent electronic device capable of running applications, for example a mobile phone, a tablet computer (Tablet Personal Computer), a laptop computer, a personal digital assistant (PDA), a mobile internet device (MID), a wearable device, or the like.
As can be seen from the above, the split-screen processing method provided in the embodiment of the present application can display a split-screen interface that includes split-screen areas corresponding to multiple split-screen applications currently running on the electronic device; acquire a human-eye image of the user and extract the eyeball feature information in the human-eye image; determine, according to the eyeball feature information, the focal position where the eyes focus on the screen of the electronic device; determine the target split-screen area according to that focal position; determine the durations for which the eyes focus on each of the split-screen areas within a preset duration; determine a target split-screen application among the multiple split-screen applications according to the durations; acquire the current screen display mode of the electronic device, where the screen display mode includes a horizontal screen display mode and a vertical screen display mode; determine the preset split-screen area according to the screen display mode; and switch the target split-screen application to the preset split-screen area. In this way, when the electronic device displays a split-screen interface, the target split-screen application the user is paying attention to can be determined from the focus area of the user's gaze on the screen, the target split-screen application can be switched to the preset split-screen area, and the efficiency of switching between split-screen areas is improved.
In order to better implement the split-screen processing method provided by the embodiment of the application, the embodiment of the application also provides a device based on the split-screen processing method. The terms used have the same meanings as in the split-screen processing method above, and specific implementation details can be found in the description of the method embodiments.
Referring to fig. 4, fig. 4 is a schematic structural diagram of a split-screen processing device according to an embodiment of the present application, where the split-screen processing device 30 includes: a display module 301, an acquisition module 302, a determination module 303 and a switching module 304;
the display module 301 is configured to display a split-screen interface, where the split-screen interface includes split-screen areas corresponding to multiple split-screen applications currently running on the electronic device;
the obtaining module 302 is configured to obtain a focusing area where an eye focuses on the screen of the electronic device, and determine a time length for which the eye focuses on the multiple split screen areas respectively within a preset time length;
the determining module 303 is configured to determine a target split-screen application among the multiple split-screen applications according to the duration;
the switching module 304 is configured to switch the target split-screen application to a preset split-screen area.
In an embodiment, as shown in fig. 5, the obtaining module 302 may include: a first acquisition submodule 3021, a first determination submodule 3022, and a second determination submodule 3023;
the first obtaining submodule 3021 is configured to obtain an eye image of a user, and extract eyeball feature information in the eye image;
the first determining submodule 3022 is configured to determine a focal position where an eye focuses on the screen of the electronic device according to the eyeball characteristic information;
the second determining submodule 3023 is configured to determine a target split screen area according to a focal position where an eye focuses on the screen of the electronic device.
In an embodiment, the switching module 304 includes: a second acquisition submodule 3041 and a switching submodule 3042;
the second obtaining sub-module 3041 is configured to obtain a current screen display mode of the electronic device, where the screen display mode includes a horizontal screen display mode and a vertical screen display mode;
the switching sub-module 3042 is configured to determine a preset split screen area according to the screen display manner, and switch the target split screen application to the preset split screen area.
As can be seen from the above, the split-screen processing device 30 provided in this embodiment of the present application can display a split-screen interface that includes split-screen areas corresponding to multiple split-screen applications currently running on the electronic device, acquire a focus area where the eyes focus on the screen of the electronic device, determine the durations for which the eyes focus on each of the split-screen areas within a preset duration, determine a target split-screen application among the multiple split-screen applications according to the durations, and switch the target split-screen application to the preset split-screen area. In this way, when the electronic device displays a split-screen interface, the target split-screen application the user is paying attention to can be determined from the focus area of the user's gaze on the screen, the target split-screen application can be switched to the preset split-screen area, and the efficiency of switching between split-screen areas is improved.
The application also provides a storage medium, on which a computer program is stored, wherein the computer program is executed by a processor to implement the split-screen processing method provided by the method embodiment.
The application also provides an electronic device, which comprises a memory, a processor and a computer program stored on the memory and capable of running on the processor, wherein the processor executes the program to implement the split-screen processing method provided by the method embodiment.
In another embodiment of the present application, an electronic device is also provided, and the electronic device may be a smart phone, a tablet computer, or the like. As shown in fig. 6, the electronic device 400 includes a processor 401, a memory 402. The processor 401 is electrically connected to the memory 402.
The processor 401 is a control center of the electronic device 400, connects various parts of the entire electronic device using various interfaces and lines, and performs various functions of the electronic device and processes data by running or loading an application program stored in the memory 402 and calling data stored in the memory 402, thereby integrally monitoring the electronic device.
In this embodiment, the processor 401 in the electronic device 400 loads instructions corresponding to processes of one or more application programs into the memory 402 according to the following steps, and the processor 401 runs the application programs stored in the memory 402, thereby implementing various functions:
displaying a split screen interface, wherein the split screen interface comprises split screen areas corresponding to a plurality of split screen applications currently running on the electronic equipment;
acquiring a focusing area of an eye focused on the screen of the electronic equipment, and determining the time length of the eye respectively focused on the multiple split screen areas within a preset time length;
determining a target split screen application among the plurality of split screen applications according to the duration;
and switching the target split screen application to a preset split screen area.
Referring to fig. 7, fig. 7 is a schematic structural diagram of an electronic device according to an embodiment of the present disclosure. The electronic device 500 may include radio frequency (RF) circuitry 501, a memory 502 including one or more computer-readable storage media, an input unit 503, a display unit 504, a sensor 505, audio circuitry 506, a wireless fidelity (WiFi) module 507, a processor 508 including one or more processing cores, and a power supply 509. Those skilled in the art will appreciate that the electronic device configuration shown in fig. 7 does not constitute a limitation of the electronic device; it may include more or fewer components than shown, some components may be combined, or the components may be arranged differently.
The rf circuit 501 may be used for receiving and transmitting information, or receiving and transmitting signals during a call, and in particular, receives downlink information of a base station and then sends the received downlink information to one or more processors 508 for processing; in addition, data relating to uplink is transmitted to the base station. In general, radio frequency circuit 501 includes, but is not limited to, an antenna, at least one Amplifier, a tuner, one or more oscillators, a Subscriber Identity Module (SIM) card, a transceiver, a coupler, a Low Noise Amplifier (LNA), a duplexer, and the like. In addition, the radio frequency circuit 501 may also communicate with a network and other devices through wireless communication. The wireless communication may use any communication standard or protocol, including but not limited to Global System for Mobile communications (GSM), General Packet Radio Service (GPRS), Code Division Multiple Access (CDMA), Wideband Code Division Multiple Access (WCDMA), Long Term Evolution (LTE), email, Short Message Service (SMS), and the like.
The memory 502 may be used to store applications and data. Memory 502 stores applications containing executable code. The application programs may constitute various functional modules. The processor 508 executes various functional applications and data processing by executing application programs stored in the memory 502. The memory 502 may mainly include a program storage area and a data storage area, wherein the program storage area may store an operating system, an application program required by at least one function (such as a sound playing function, an image playing function, etc.), and the like; the storage data area may store data (such as audio data, a phonebook, etc.) created according to the use of the electronic device, and the like. Further, the memory 502 may include high speed random access memory, and may also include non-volatile memory, such as at least one magnetic disk storage device, flash memory device, or other volatile solid state storage device. Accordingly, the memory 502 may also include a memory controller to provide the processor 508 and the input unit 503 access to the memory 502.
The input unit 503 may be used to receive input numbers, character information, or user characteristic information (such as a fingerprint), and generate a keyboard, mouse, joystick, optical, or trackball signal input related to user setting and function control. In particular, in one particular embodiment, the input unit 503 may include a touch-sensitive surface as well as other input devices. The touch-sensitive surface, also referred to as a touch display screen or a touch pad, may collect touch operations by a user (e.g., operations by a user on or near the touch-sensitive surface using a finger, a stylus, or any other suitable object or attachment) thereon or nearby, and drive the corresponding connection device according to a predetermined program. Alternatively, the touch sensitive surface may comprise two parts, a touch detection means and a touch controller. The touch detection device detects the touch direction of a user, detects a signal brought by touch operation and transmits the signal to the touch controller; the touch controller receives touch information from the touch sensing device, converts the touch information into touch point coordinates, sends the touch point coordinates to the processor 508, and can receive and execute commands sent by the processor 508.
The display unit 504 may be used to display information input by or provided to a user and various graphical user interfaces of the electronic device, which may be made up of graphics, text, icons, video, and any combination thereof. The display unit 504 may include a display panel. Alternatively, the display panel may be configured in the form of a Liquid Crystal Display (LCD), an Organic Light-Emitting Diode (OLED), or the like. Further, the touch-sensitive surface may overlay the display panel, and when a touch operation is detected on or near the touch-sensitive surface, the touch operation is transmitted to the processor 508 to determine the type of touch event, and then the processor 508 provides a corresponding visual output on the display panel according to the type of touch event. Although in FIG. 7 the touch-sensitive surface and the display panel are two separate components to implement input and output functions, in some embodiments the touch-sensitive surface may be integrated with the display panel to implement input and output functions.
The electronic device may also include at least one sensor 505, such as light sensors, motion sensors, and other sensors. In particular, the light sensor may include an ambient light sensor that may adjust the brightness of the display panel according to the brightness of ambient light, and a proximity sensor that may turn off the display panel and/or the backlight when the electronic device is moved to the ear. As one of the motion sensors, the gravity acceleration sensor can detect the magnitude of acceleration in each direction (generally, three axes), can detect the magnitude and direction of gravity when the mobile phone is stationary, and can be used for applications of recognizing the posture of the mobile phone (such as horizontal and vertical screen switching, related games, magnetometer posture calibration), vibration recognition related functions (such as pedometer and tapping), and the like; as for other sensors such as a gyroscope, a barometer, a hygrometer, a thermometer, and an infrared sensor, which may be further configured to the electronic device, detailed descriptions thereof are omitted.
The audio circuit 506 may provide an audio interface between the user and the electronic device through a speaker and a microphone. The audio circuit 506 can convert received audio data into an electrical signal and transmit it to the speaker, which converts it into a sound signal for output; conversely, the microphone converts a collected sound signal into an electrical signal, which the audio circuit 506 receives and converts into audio data. The audio data is output to the processor 508 for processing and then sent to another electronic device via the RF circuit 501, or output to the memory 502 for further processing. The audio circuit 506 may also include an earbud jack to allow a peripheral headset to communicate with the electronic device.
Wireless fidelity (WiFi) belongs to short-distance wireless transmission technology, and electronic equipment can help users to send and receive e-mails, browse webpages, access streaming media and the like through a wireless fidelity module 507, and provides wireless broadband internet access for users. Although fig. 7 shows the wireless fidelity module 507, it is understood that it does not belong to the essential constitution of the electronic device, and may be omitted entirely as needed within the scope not changing the essence of the invention.
The processor 508 is a control center of the electronic device, connects various parts of the entire electronic device using various interfaces and lines, and performs various functions of the electronic device and processes data by running or executing an application program stored in the memory 502 and calling data stored in the memory 502, thereby integrally monitoring the electronic device. Optionally, processor 508 may include one or more processing cores; preferably, the processor 508 may integrate an application processor, which primarily handles operating systems, user interfaces, application programs, etc., and a modem processor, which primarily handles wireless communications. It will be appreciated that the modem processor described above may not be integrated into the processor 508.
The electronic device also includes a power supply 509 (such as a battery) to power the various components. Preferably, the power source may be logically connected to the processor 508 through a power management system, so that the power management system may manage charging, discharging, and power consumption management functions. The power supply 509 may also include any component such as one or more dc or ac power sources, recharging systems, power failure detection circuitry, power converters or inverters, power status indicators, and the like.
Although not shown in fig. 7, the electronic device may further include a camera, a bluetooth module, and the like, which are not described in detail herein.
In specific implementation, the above modules may be implemented as independent entities, or may be combined arbitrarily to be implemented as the same or several entities, and specific implementation of the above modules may refer to the foregoing method embodiments, which are not described herein again.
It should be noted that, as one of ordinary skill in the art would understand, all or part of the steps in the various methods of the foregoing embodiments may be implemented by relevant hardware instructed by a program, where the program may be stored in a computer-readable storage medium, such as a memory of a terminal, and executed by at least one processor in the terminal, and during the execution, the flow of the embodiments, such as the split-screen processing method, may be included. Among others, the storage medium may include: read Only Memory (ROM), Random Access Memory (RAM), magnetic or optical disks, and the like.
The split-screen processing method, split-screen processing device, storage medium, and electronic device provided above have been described in detail. Each functional module may be integrated in one processing chip, each module may exist alone physically, or two or more modules may be integrated in one module; an integrated module may be implemented in the form of hardware or in the form of a software functional module. The principles and implementations of the present application are explained herein using specific examples, and the above description of the embodiments is only intended to help understand the method and core idea of the present application. Meanwhile, those skilled in the art may make changes to the specific embodiments and the application scope according to the idea of the present application. In summary, the content of this specification should not be construed as limiting the present application.

Claims (9)

1. A split screen processing method is characterized by comprising the following steps:
displaying a split screen interface, wherein the split screen interface comprises split screen areas corresponding to a plurality of split screen applications currently running on the electronic equipment;
acquiring a focus area where an eye is focused on the screen of the electronic equipment, and determining the duration for which the eye is focused on each of the plurality of split screen areas within a preset time period;
determining a target split screen application among the plurality of split screen applications according to the durations;
judging whether the target split screen application is currently located in a preset split screen area;
if not, generating an identifier for switching the position of the target split screen application;
and receiving a touch operation performed by a user on the identifier, and switching the target split screen application to the preset split screen area according to the touch operation.
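For illustration only, the flow recited in claim 1 can be sketched as a short Kotlin fragment. The class, function, and parameter names below (SplitScreenController, recordGaze, onIdentifierTouched, and so on) are assumptions introduced for the sketch, not elements defined by the claims, and the swap policy in onIdentifierTouched is just one possible choice.

```kotlin
// Hypothetical sketch of the claim-1 flow; nothing here is an API defined by the patent.
data class SplitRegion(val id: Int, val appId: String)

class SplitScreenController(
    private var regions: List<SplitRegion>,
    private val presetRegionId: Int
) {
    // Gaze time accumulated per split screen region within the preset window.
    private val gazeMillis = mutableMapOf<Int, Long>()

    fun recordGaze(regionId: Int, durationMillis: Long) {
        gazeMillis[regionId] = (gazeMillis[regionId] ?: 0L) + durationMillis
    }

    // The target split screen application is the one whose region attracted
    // the longest gaze duration during the observation window.
    fun targetRegion(): SplitRegion? {
        val best = gazeMillis.maxByOrNull { it.value } ?: return null
        return regions.find { it.id == best.key }
    }

    // When the window elapses: if the target application is not already in the
    // preset region, an identifier for switching it is shown to the user.
    fun onWindowElapsed(showSwitchIdentifier: (SplitRegion) -> Unit) {
        val target = targetRegion() ?: return
        if (target.id != presetRegionId) showSwitchIdentifier(target)
    }

    // A touch on the identifier swaps the target application into the preset
    // split screen region and the previous occupant into the target's region.
    fun onIdentifierTouched(target: SplitRegion) {
        val presetApp = regions.first { it.id == presetRegionId }.appId
        regions = regions.map {
            when (it.id) {
                presetRegionId -> it.copy(appId = target.appId)
                target.id -> it.copy(appId = presetApp)
                else -> it
            }
        }
        gazeMillis.clear()
    }
}
```

In such a sketch, a caller would feed recordGaze with per-region gaze samples during the preset time period, call onWindowElapsed when the period ends, and call onIdentifierTouched when the user touches the displayed identifier.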
2. The split screen processing method of claim 1, wherein acquiring a focus area where the eye is focused on the screen of the electronic equipment comprises:
acquiring a human eye image of the user, and extracting eyeball characteristic information from the human eye image;
determining a focus position at which the eye is focused on the screen of the electronic equipment according to the eyeball characteristic information;
and determining a target split screen area according to the focus position at which the eye is focused on the screen of the electronic equipment.
3. The split screen processing method according to claim 2, wherein the eyeball characteristic information includes a position of a pupil in the human eye;
determining a focus position at which the eye is focused on the screen of the electronic equipment according to the eyeball characteristic information comprises:
determining the focus position at which the eye is focused on the screen of the electronic equipment according to the position of the pupil in the human eye.
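For illustration only, the mapping recited in claims 2 and 3 could look like the following Kotlin sketch. It assumes a simple linear calibration between the normalized pupil position and screen coordinates and a two-region split; neither assumption comes from the claims, and the type names are introduced here.

```kotlin
// Hypothetical sketch: a linear pupil-to-screen calibration is assumed.
data class PupilPosition(val nx: Float, val ny: Float) // normalized to 0..1 within the eye image
data class FocusPosition(val x: Float, val y: Float)   // pixel coordinates on the screen

// Map the pupil position extracted from the human eye image to a focus
// position on the screen of the electronic equipment.
fun focusPositionFromPupil(p: PupilPosition, screenWidth: Int, screenHeight: Int): FocusPosition =
    FocusPosition(p.nx * screenWidth, p.ny * screenHeight)

// Map the focus position to a target split screen area, assuming a simple
// two-region split (region 0 in the upper half, region 1 in the lower half).
fun targetSplitRegion(f: FocusPosition, screenHeight: Int): Int =
    if (f.y < screenHeight / 2f) 0 else 1
```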
4. The split screen processing method of claim 1, wherein switching the target split screen application to the preset split screen area comprises:
acquiring a current screen display mode of the electronic equipment, wherein the screen display mode comprises a landscape display mode and a portrait display mode;
and determining the preset split screen area according to the screen display mode, and switching the target split screen application to the preset split screen area.
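For illustration only, claim 4's dependence of the preset split screen area on the screen display mode could be sketched as below. Treating the left half as the preset area in landscape mode and the top half in portrait mode is an assumption made for the example, not something the claim fixes; the names are likewise hypothetical.

```kotlin
// Hypothetical sketch: the preset area per display mode is an illustrative choice.
enum class ScreenDisplayMode { LANDSCAPE, PORTRAIT }
data class RegionBounds(val left: Int, val top: Int, val right: Int, val bottom: Int)

// Determine the preset split screen area from the current screen display mode;
// the target split screen application would then be switched into these bounds.
fun presetSplitRegion(mode: ScreenDisplayMode, screenWidth: Int, screenHeight: Int): RegionBounds =
    when (mode) {
        ScreenDisplayMode.LANDSCAPE -> RegionBounds(0, 0, screenWidth / 2, screenHeight) // left half
        ScreenDisplayMode.PORTRAIT -> RegionBounds(0, 0, screenWidth, screenHeight / 2)  // top half
    }
```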
5. A split screen processing apparatus, characterized in that the apparatus comprises: a display module, an acquisition module, a determination module, a judgment module, and a switching module;
the display module is used for displaying a split screen interface, and the split screen interface comprises split screen areas corresponding to a plurality of split screen applications currently running on the electronic equipment;
the acquisition module is used for acquiring a focus area where an eye is focused on the screen of the electronic equipment, and determining the duration for which the eye is focused on each of the plurality of split screen areas within a preset time period;
the determination module is used for determining a target split screen application among the plurality of split screen applications according to the durations;
the judgment module is used for judging whether the target split screen application is currently located in a preset split screen area, and if not, generating an identifier for switching the position of the target split screen application;
and the switching module is used for receiving a touch operation performed by a user on the identifier, and switching the target split screen application to the preset split screen area according to the touch operation.
6. The split screen processing apparatus of claim 5, wherein the acquisition module comprises: a first acquisition submodule, a first determination submodule, and a second determination submodule;
the first acquisition submodule is used for acquiring a human eye image of the user and extracting eyeball characteristic information from the human eye image;
the first determination submodule is used for determining a focus position at which the eye is focused on the screen of the electronic equipment according to the eyeball characteristic information;
and the second determination submodule is used for determining a target split screen area according to the focus position at which the eye is focused on the screen of the electronic equipment.
7. The split screen processing apparatus according to claim 5, wherein the switching module comprises: a second acquisition submodule and a switching submodule;
the second acquisition submodule is used for acquiring a current screen display mode of the electronic equipment, wherein the screen display mode comprises a landscape display mode and a portrait display mode;
and the switching submodule is used for determining the preset split screen area according to the screen display mode and switching the target split screen application to the preset split screen area.
8. A storage medium having a computer program stored thereon, wherein the computer program, when executed by a processor, performs the steps of the method according to any one of claims 1-4.
9. An electronic device comprising a memory, a processor and a computer program stored on the memory and executable on the processor, characterized in that the steps of the method according to any of claims 1-4 are implemented when the processor executes the program.
CN201810739554.1A 2018-07-06 2018-07-06 Split screen processing method and device, storage medium and electronic equipment Active CN108958587B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201810739554.1A CN108958587B (en) 2018-07-06 2018-07-06 Split screen processing method and device, storage medium and electronic equipment

Publications (2)

Publication Number Publication Date
CN108958587A CN108958587A (en) 2018-12-07
CN108958587B true CN108958587B (en) 2020-09-08

Family

ID=64482326

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201810739554.1A Active CN108958587B (en) 2018-07-06 2018-07-06 Split screen processing method and device, storage medium and electronic equipment

Country Status (1)

Country Link
CN (1) CN108958587B (en)

Families Citing this family (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109743602A (en) * 2018-12-29 2019-05-10 百视通网络电视技术发展有限责任公司 A kind of multi-screen displaying and switching method for smart television
CN110727381A (en) * 2019-09-03 2020-01-24 华为技术有限公司 Application switching method and electronic equipment
CN110928407B (en) * 2019-10-30 2023-06-09 维沃移动通信有限公司 Information display method and device
CN113362749B (en) * 2021-05-25 2023-02-03 维沃移动通信有限公司 Display method and device
CN113934392A (en) * 2021-10-13 2022-01-14 广东睿盟计算机科技有限公司 Isolated dual-computer same-screen control method and device, computer equipment and storage medium

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103049180A (en) * 2012-12-18 2013-04-17 深圳创维-Rgb电子有限公司 Method and device for processing user data at information inquiry terminal as well as terminal
CN103869946A (en) * 2012-12-14 2014-06-18 联想(北京)有限公司 Display control method and electronic device
CN106537319A (en) * 2016-10-31 2017-03-22 北京小米移动软件有限公司 Screen-splitting display method and device
CN106775334A (en) * 2016-11-14 2017-05-31 北京奇虎科技有限公司 File call method, device and mobile terminal on mobile terminal
CN107589900A (en) * 2017-09-06 2018-01-16 广东欧珀移动通信有限公司 Multi-screen display method, device, terminal and storage medium

Family Cites Families (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2006211176A (en) * 2005-01-27 2006-08-10 Kyocera Mita Corp Image-forming device
CN104834446B (en) * 2015-05-04 2018-10-26 惠州Tcl移动通信有限公司 A kind of display screen multi-screen control method and system based on eyeball tracking technology
JP6113898B2 (en) * 2016-09-09 2017-04-12 シャープ株式会社 Cooker
CN107132985A (en) * 2017-04-30 2017-09-05 上海爱优威软件开发有限公司 A kind of display changeover method
CN107316573A (en) * 2017-06-15 2017-11-03 常州机电职业技术学院 Split screen device, combination of liquid crystals display screen and its method of work under single-chip microcomputer control
CN107256129A (en) * 2017-07-20 2017-10-17 广东欧珀移动通信有限公司 Switch method, device and its relevant device of application under span mode

Also Published As

Publication number Publication date
CN108958587A (en) 2018-12-07

Similar Documents

Publication Publication Date Title
CN108958587B (en) Split screen processing method and device, storage medium and electronic equipment
US10445482B2 (en) Identity authentication method, identity authentication device, and terminal
CN109062467B (en) Split screen application switching method and device, storage medium and electronic equipment
US9906406B2 (en) Alerting method and mobile terminal
CN109905754B (en) Virtual gift receiving method and device and storage equipment
CN107817939B (en) Image processing method and mobile terminal
CN108549519B (en) Split screen processing method and device, storage medium and electronic equipment
CN108984064B (en) Split screen display method and device, storage medium and electronic equipment
EP2851779A1 (en) Method, device, storage medium and terminal for displaying a virtual keyboard
CN109067981B (en) Split screen application switching method and device, storage medium and electronic equipment
CN108958606B (en) Split screen display method and device, storage medium and electronic equipment
CN107241552B (en) Image acquisition method, device, storage medium and terminal
US20160133006A1 (en) Video processing method and apparatus
CN108958629B (en) Split screen quitting method and device, storage medium and electronic equipment
CN108418969B (en) Antenna feed point switching method and device, storage medium and electronic equipment
EP3242447A1 (en) Information recommendation management method, device and system
WO2020007114A1 (en) Method and apparatus for switching split-screen application, storage medium, and electronic device
CN108984142B (en) Split screen display method and device, storage medium and electronic equipment
CN109164908B (en) Interface control method and mobile terminal
CN109040427B (en) Split screen processing method and device, storage medium and electronic equipment
CN108803961B (en) Data processing method and device and mobile terminal
CN107622234B (en) Method and device for displaying budding face gift
CN108984075B (en) Display mode switching method and device and terminal
CN108920086B (en) Split screen quitting method and device, storage medium and electronic equipment
EP3862853A1 (en) Touch operation locking method and electronic device

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant