Disclosure of Invention
A technical problem addressed by the present application is to provide a display control method based on human-computer interaction, which can effectively protect the privacy of user information when a user uses an intelligent terminal.
According to an embodiment of an aspect of the present application, there is provided a display control method based on human-computer interaction, which is applied to an intelligent terminal, where the intelligent terminal has a human-computer interaction interface, and a plurality of application icons are displayed in the human-computer interaction interface, and the method includes:
receiving an interface control instruction sent by a user, wherein the interface control instruction is sent by touching a blank area of the human-computer interaction interface and moving a preset distance along any one of the transverse direction and the longitudinal direction of the interface;
dividing the human-computer interaction interface into two display areas in the moving direction, and displaying the plurality of application icons in a second area of the two display areas;
receiving an application icon selected by a user;
displaying the user-selected application icon in a first area of the two display areas;
and displaying the opened application interface in the second area after receiving an instruction from the user confirming that the application is to be opened.
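The claimed flow above can be sketched as a minimal controller. This is only an illustrative sketch; the class and method names are assumptions of this sketch and are not part of the claimed method:

```python
# Illustrative sketch of the claimed control flow; all names are hypothetical.

class SplitScreenController:
    def __init__(self, icons):
        self.icons = list(icons)   # icons shown on the full interface
        self.first_area = []       # area receiving the selected icon(s)
        self.second_area = []      # area for the remaining icons / app UI
        self.split = False
        self.open_app = None

    def receive_control_instruction(self):
        # Divide the interface; all icons move to the second area.
        self.split = True
        self.second_area = list(self.icons)
        self.first_area = []

    def select_icon(self, icon):
        # The user-selected icon is displayed in the first area.
        if self.split and icon in self.second_area:
            self.second_area.remove(icon)
            self.first_area.append(icon)

    def confirm_open(self, icon):
        # The opened application interface is shown in the second area.
        if icon in self.first_area:
            self.open_app = icon
```

A usage sketch: after `receive_control_instruction()`, selecting and confirming an icon leaves the opened application confined to the second area.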
Optionally, the step of receiving an interface control instruction sent by a user includes:
recognizing that a user touches a blank area of the human-computer interaction interface and moves along either the transverse direction or the longitudinal direction of the interface;
and when the moving distance reaches a preset distance, displaying a preset pattern at the edge of the human-computer interaction interface along the moving direction so as to prompt the user that the interface control instruction has been received.
Optionally, the predetermined pattern includes: a crease pattern represented by a plurality of transverse lines.
Optionally, the step of dividing the human-computer interaction interface into two display areas in the moving direction includes:
selecting a virtual line closest to the movement ending position from a plurality of virtual lines preset on a human-computer interaction interface as a dividing line;
and displaying the dividing line on the human-computer interaction interface so as to divide the human-computer interaction interface into two display areas.
Optionally, after the second area displays the opened application interface, the method further includes:
receiving a touch in which the user slides on the human-computer interaction interface, wherein the sliding track intersects the dividing line;
and closing the opened application interface.
Optionally, after receiving a touch in which the user slides on the human-computer interaction interface and the sliding track intersects the dividing line, the method further includes:
and restoring the display mode of the interface to the original display mode.
Optionally, after receiving a touch in which the user slides on the human-computer interaction interface and the sliding track intersects the dividing line, the method further includes:
and locking the screen and displaying a main screen background picture.
Optionally, after the human-computer interaction interface is divided into two display areas in the moving direction, the method further includes:
receiving a command that a user selects the dividing line and moves the dividing line;
and dividing the human-computer interaction interface into two display areas according to the moved dividing line.
Optionally, the step of displaying the application icon selected by the user in the first area of the two display areas includes:
moving the user-selected application icon to the first area of the two display areas so that it fills the first area at an equal scale.
Optionally, after the opened application interface is displayed in the second area, the method further includes:
receiving a designated operation instruction sent by a user in the second area through touching a human-computer interaction interface;
and displaying the opened application interface on the whole human-computer interaction interface.
Optionally, the step of displaying the opened application interface in the second area includes:
and displaying the opened application interface in the second area, and displaying the rest application icons in the first area.
Optionally, the step of displaying the plurality of application icons in a second area of the two display areas includes:
and displaying the plurality of application icons in the second area, scaled down at an equal proportion.
According to another embodiment of the present application, there is provided a display control apparatus applied to an intelligent terminal, the intelligent terminal having a human-computer interaction interface in which a plurality of application icons are displayed, the apparatus including:
the first receiving unit is used for receiving an interface control instruction sent by a user, wherein the interface control instruction is sent by touching a blank area of the human-computer interaction interface and moving a preset distance along any one direction of the transverse direction and the longitudinal direction of the interface;
the dividing unit is used for dividing the human-computer interaction interface into two display areas in the moving direction and displaying the application icons in a second area of the two display areas;
a second receiving unit for receiving an application icon selected by a user;
a first display control unit for displaying the application icon selected by the user in a first area of the two display areas;
and a second display control unit, configured to display the opened application interface in the second area after receiving an instruction from the user confirming that the application is to be opened.
Optionally, the first receiving unit is specifically configured to:
recognizing that a user touches a blank area of the human-computer interaction interface and moves along either the transverse direction or the longitudinal direction of the interface;
and when the moving distance reaches a preset distance, displaying a preset pattern at the edge of the human-computer interaction interface along the moving direction so as to prompt the user that the interface control instruction has been received.
Optionally, the predetermined pattern includes: a crease pattern represented by a plurality of transverse lines.
Optionally, the dividing unit further includes:
the dividing line selecting subunit is used for selecting a virtual line closest to the movement ending position from a plurality of virtual lines preset on the human-computer interaction interface as a dividing line;
and the division display subunit is used for displaying the division line on the human-computer interaction interface so as to divide the human-computer interaction interface into two display areas.
Optionally, the apparatus further comprises:
and a third display control unit, configured to close the opened application interface upon receiving a touch in which the user slides on the human-computer interaction interface and the sliding track intersects the dividing line.
Optionally, the third display control unit is further configured to:
and restoring the interface display mode to the original display mode upon receiving a touch in which the user slides on the human-computer interaction interface and the sliding track intersects the dividing line.
Optionally, the third display control unit is further configured to:
and locking the screen and displaying a main screen background picture upon receiving a touch in which the user slides on the human-computer interaction interface and the sliding track intersects the dividing line.
Optionally, the dividing unit further includes:
the receiving subunit is used for receiving a command that the user selects the dividing line and moves the dividing line;
and a dividing line moving subunit, configured to divide the human-computer interaction interface into two display areas according to the moved dividing line.
Optionally, the first display control unit is specifically configured to:
moving the user-selected application icon to the first area of the two display areas so that it fills the first area at an equal scale.
Optionally, the second display control unit is specifically configured to:
and after receiving a specified operation instruction sent by a user in the second area through touching the human-computer interaction interface, displaying the opened application interface on the whole human-computer interaction interface.
Optionally, the second display control unit is specifically configured to:
and displaying the opened application interface in the second area, and displaying the rest application icons in the first area.
Optionally, the second display control unit is specifically configured to:
and displaying the plurality of application icons in the second area, scaled down at an equal proportion.
According to an embodiment of yet another aspect of the present application, there is provided a device for display control, comprising a memory, and one or more programs, wherein the one or more programs are stored in the memory and configured to be executed by one or more processors, the one or more programs including instructions for performing any of the human-machine-interaction based display control methods described above.
According to an embodiment of yet another aspect of the present application, there is provided a machine-readable medium having stored thereon instructions which, when executed by one or more processors of an apparatus, cause the apparatus to perform the human-machine interaction based display control method as described in any one of the above.
After receiving an interface control instruction sent by a user, an embodiment of the present application divides the human-computer interaction interface into two display areas and displays the plurality of application icons in a second area of the two display areas; receives an application icon selected by the user; displays the user-selected application icon in a first area of the two display areas; and, after receiving an instruction from the user confirming that the application is to be opened, displays the opened application interface in the second area. Through these operations the user can split the display of the whole human-computer interaction interface, and this split display mode allows the application icons to be shown at a reduced size, so that an application requiring privacy protection is displayed, after being opened, only in part of the divided interface. This reduced display mode is not only convenient to operate and effective in protecting privacy; it also does not hinder the user's operation of the intelligent terminal and requires no additional protective film or the like to be fitted to the terminal.
It will be appreciated by those of ordinary skill in the art that although the following detailed description will proceed with reference being made to illustrative embodiments, the present application is not intended to be limited to these embodiments. Rather, the scope of the application is broad and is intended to be defined only by the claims that follow.
Detailed Description
Before discussing exemplary embodiments in more detail, it should be noted that some exemplary embodiments are described as processes or methods depicted as flowcharts. Although a flowchart may describe the operations as a sequential process, many of the operations can be performed in parallel, concurrently, or simultaneously. In addition, the order of the operations may be re-arranged. The process may be terminated when its operations are completed, but may have additional steps not included in the figure. The processes may correspond to methods, functions, procedures, subroutines, and the like.
The computer equipment includes user equipment and network equipment. The user equipment includes, but is not limited to, computers, smart phones, PDAs, and the like; the network equipment includes, but is not limited to, a single network server, a server group consisting of a plurality of network servers, or a cloud, based on cloud computing, consisting of a large number of computers or network servers, where cloud computing is a form of distributed computing: a super virtual computer composed of a collection of loosely coupled computers. The computer equipment may operate independently to implement the application, or may access a network and implement the application through interaction with other computer equipment in the network. The network in which the computer equipment is located includes, but is not limited to, the Internet, a wide area network, a metropolitan area network, a local area network, a VPN, and the like.
It should be noted that the user equipment, the network equipment, the network, and so on are only examples; other existing or future computer devices or networks, if applicable, are also intended to fall within the scope of the present application and are incorporated herein by reference.
The methods discussed below, some of which are illustrated by flow diagrams, may be implemented by hardware, software, firmware, middleware, microcode, hardware description languages, or any combination thereof. When implemented in software, firmware, middleware or microcode, the program code or code segments to perform the necessary tasks may be stored in a machine or computer readable medium such as a storage medium. The processor(s) may perform the necessary tasks.
Specific structural and functional details disclosed herein are merely representative and are provided for purposes of describing example embodiments of the present application. This application may, however, be embodied in many alternate forms and should not be construed as limited to only the embodiments set forth herein.
It will be understood that, although the terms first, second, etc. may be used herein to describe various elements, these elements should not be limited by these terms. These terms are only used to distinguish one element from another. For example, a first element may be termed a second element, and, similarly, a second element may be termed a first element, without departing from the scope of example embodiments. As used herein, the term "and/or" includes any and all combinations of one or more of the associated listed items.
It will be understood that when an element is referred to as being "connected" or "coupled" to another element, it can be directly connected or coupled to the other element or intervening elements may be present. In contrast, when an element is referred to as being "directly connected" or "directly coupled" to another element, there are no intervening elements present. Other words used to describe the relationship between elements (e.g., "between" versus "directly between", "adjacent" versus "directly adjacent to", etc.) should be interpreted in a similar manner.
The terminology used herein is for the purpose of describing particular embodiments only and is not intended to be limiting of example embodiments. As used herein, the singular forms "a", "an" and "the" are intended to include the plural forms as well, unless the context clearly indicates otherwise. It will be further understood that the terms "comprises" and/or "comprising," when used herein, specify the presence of stated features, integers, steps, operations, elements, and/or components, but do not preclude the presence or addition of one or more other features, integers, steps, operations, elements, components, and/or groups thereof.
It should also be noted that, in some alternative implementations, the functions/acts noted may occur out of the order noted in the figures. For example, two figures shown in succession may, in fact, be executed substantially concurrently, or the figures may sometimes be executed in the reverse order, depending upon the functionality/acts involved.
The technical solution of the present application is further described in detail below with reference to the accompanying drawings.
Fig. 1 is a flowchart of a display control method based on human-computer interaction according to an embodiment of the present application. The method is applied to an intelligent terminal that has a human-computer interaction interface; the type of the interface is not limited and may, for example, be any of various display screen types such as a curved screen or a flat screen. A plurality of application icons are displayed in the human-computer interaction interface; the icon types are not limited in the present application and may be the icons of any applications in the intelligent terminal. The method includes the following steps:
s10, receiving an interface control instruction sent by a user, wherein the interface control instruction is sent by touching a blank area of the human-computer interaction interface and moving a preset distance along any one direction of the transverse direction and the longitudinal direction of the interface;
s11, dividing the human-computer interaction interface into two display areas in the moving direction, and displaying the application icons in a second area of the two display areas;
s12, receiving an application icon selected by a user;
s13, displaying the application icon selected by the user in a first area of the two display areas;
and S14, displaying the opened application interface in the second area after receiving an instruction from the user confirming that the application is to be opened.
After receiving an interface control instruction sent by a user, an embodiment of the present application divides the human-computer interaction interface into two display areas and displays the plurality of application icons in a second area of the two display areas; receives an application icon selected by the user; displays the user-selected application icon in a first area of the two display areas; and, after receiving an instruction from the user confirming that the application is to be opened, displays the opened application interface in the second area. Through these operations the user can split the display of the whole human-computer interaction interface, and this split display mode allows the application icons to be shown at a reduced size, so that an application requiring privacy protection is displayed, after being opened, only in part of the divided interface. This reduced display mode is not only convenient to operate and effective in protecting privacy; it also does not hinder the user's operation of the intelligent terminal and requires no additional protective film or the like to be fitted to the terminal.
One embodiment of receiving the interface control instruction sent by the user in step S10 includes: recognizing that the user touches a blank area of the human-computer interaction interface and moves along either the transverse or the longitudinal direction of the interface; and, when the moving distance reaches a preset distance, displaying a preset pattern at the edge of the interface along the moving direction so as to prompt the user that the interface control instruction has been received. The blank area is a non-icon display area, that is, an area in which a finger touch does not open any application program. For example, the blank area may be the edge of the human-computer interaction interface of the intelligent terminal: on a curved screen, the application icons are displayed in the central area, the curved edge can serve as the blank area, and the user can send the interface control instruction by touching the edge and moving up or down, which is easy to perform because the curved edge is close to the fingers. The transverse and longitudinal directions of the interface are relative concepts, defined with respect to the human-computer interaction interface: the longitudinal direction is the up-down direction of the interface in its current display state, and the transverse direction is its left-right direction in the current display state.
The transverse or longitudinal direction is not limited to being strictly parallel or perpendicular to the edges of the human-computer interaction interface; it may deviate by a certain angle. For example, the longitudinal direction may be any direction whose included angle with the corresponding edge of the interface does not exceed 45 degrees, and this tolerance may be set as required.
To enhance the user's operating experience, when the user touches the human-computer interaction interface and moves in one direction for the preset distance, a preset pattern may be displayed in the corresponding direction to prompt the user that the control instruction has been received, at which point the finger may be lifted. The preset pattern may be a crease pattern represented by a plurality of transverse lines, a page-folding pattern, a page-stretching pattern, or the like.
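The gesture recognition described above can be sketched as follows. The threshold value and function name are assumptions of this sketch, not values fixed by the application:

```python
# Hypothetical sketch: recognize a touch that moves a preset distance
# along one axis of the interface. The threshold is an assumed value.

PRESET_DISTANCE = 120  # pixels; illustrative only

def recognize_gesture(start, end, preset=PRESET_DISTANCE):
    """Return the movement axis ('transverse' or 'longitudinal') once the
    preset distance is reached along that axis, otherwise None."""
    dx, dy = end[0] - start[0], end[1] - start[1]
    if abs(dx) >= preset and abs(dx) >= abs(dy):
        return "transverse"
    if abs(dy) >= preset and abs(dy) > abs(dx):
        return "longitudinal"
    return None  # distance not yet reached: no instruction is issued
```

Once a non-None axis is returned, the terminal would display the preset pattern (for example, the crease pattern) to signal that the instruction has been received.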
One embodiment of step S11, dividing the human-computer interaction interface into two display areas in the moving direction, may be: selecting, from a plurality of virtual lines preset on the human-computer interaction interface, the virtual line closest to the position where the movement ended as the dividing line. For example, a plurality of virtual lines at preset intervals are laid out in the transverse or longitudinal direction of the interface; these lines are not shown in the normal display state. When the user sends the interface control instruction by touching the interface, moving the preset distance from the starting touch position to an end position, and lifting the finger, the virtual line closest to the end position is selected as the dividing line and displayed on the interface, thereby dividing the whole human-computer interaction interface into two display areas.
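Snapping to the nearest preset virtual line can be sketched in a few lines. The even spacing of the virtual lines is an assumption of this sketch; the application only requires a plurality of preset lines:

```python
# Illustrative sketch: pick the preset virtual line closest to the
# position where the finger lifted. Even spacing is assumed here.

def nearest_dividing_line(end_pos, screen_len, spacing):
    """Return the coordinate of the virtual line nearest to end_pos."""
    lines = list(range(spacing, screen_len, spacing))  # interior lines only
    return min(lines, key=lambda line: abs(line - end_pos))
```

For a 1000-pixel-tall interface with lines every 100 pixels, lifting the finger at y = 430 would snap the dividing line to y = 400.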
In an embodiment of the present application, after the human-computer interaction interface has been divided into two display areas, the user may adjust the size of each display area. Specifically: a command in which the user selects the dividing line and moves it is received, and the interface is re-divided into two display areas according to the moved dividing line. When the terminal recognizes that the user's finger has touched the position of the dividing line for a preset duration, the dividing line is treated as selected; the user may then move it as required by sliding the finger on the interface, and the position where the movement ends becomes the new position of the dividing line.
It can be understood that the original display state of the human-computer interaction interface can also be restored by moving the dividing line: for example, when the dividing line is moved to within a preset range of the upper or lower edge (or, for a transverse split, the left or right edge), the dividing line is deleted and the original display state, that is, the state before the interface control instruction was received, is restored.
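The dividing-line adjustment and edge-based dismissal described above can be sketched as one small function; the edge threshold is an assumption of this sketch:

```python
# Hedged sketch: move the selected dividing line, and dismiss the split
# display when the line is dragged within a preset edge range.

EDGE_RANGE = 50  # pixels from the top/bottom edge; assumed threshold

def move_dividing_line(new_pos, screen_len, edge=EDGE_RANGE):
    """Return the new dividing-line position, or None when the line is
    dragged into the edge range, meaning the original display is restored."""
    if new_pos <= edge or new_pos >= screen_len - edge:
        return None  # delete the dividing line; restore original display
    return new_pos
```

A None result corresponds to deleting the dividing line and returning to the pre-instruction display state.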
After the interface is divided, all the icons displayed in the original interface may be shown in one of the display areas; for example, all the application icons may be displayed in the second area of the two display areas, scaled down at an equal proportion.
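Equal-proportion scaling amounts to one uniform factor applied to every icon, chosen so the original icon layout fits the new area. The function below is an illustrative sketch, not part of the claimed method:

```python
# Illustrative sketch: compute one uniform scale factor so the icon grid
# that filled the whole interface fits inside the second area.

def icon_scale(full_size, area_size):
    """full_size and area_size are (width, height) tuples; return the
    equal-proportion factor applied to every icon after the split."""
    return min(area_size[0] / full_size[0], area_size[1] / full_size[1])
```

For a full screen of 1000 x 2000 pixels and a second area of 1000 x 1000, every icon would be scaled by 0.5.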
The manner in which the user selects the application icon in step S12 is not limited in the present application; the selection may, for example, be a click on the application icon.
In the embodiment of the present application, after all the application icons are reduced and displayed in the divided second area, each icon is smaller and the spacing between icons is reduced, so that when the user selects one of the application icons, a mis-operation may occur in which two icons are selected at the same time.
In step S14, after an instruction from the user confirming that the application is to be opened is received, the opened application interface is displayed in the second area. It is understood that this embodiment chooses to display the opened application interface in the second area, but it may instead be displayed in the first area. If the application the user chooses to open does not need privacy protection, the opened application interface can be displayed on the whole human-computer interaction interface after a specified operation instruction, sent by the user by touching the second area of the interface, is received. The same applies when the application does need privacy protection but the user has moved from a public place to a place with fewer people, such as a home or an office: the normal display state can be restored and the application interface displayed on the whole human-computer interaction interface. The specified operation instruction is not limited in this application and may, for example, be a double-click instruction.
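The full-screen restoration on a specified operation can be sketched as a simple handler. The "double_tap" gesture name and the state dictionary are assumptions of this sketch; the application leaves the specified operation open:

```python
# Hedged sketch: a specified operation (assumed here to be a double tap)
# in the second area expands the opened application to the whole interface.

def on_second_area_gesture(gesture, state):
    """state is a dict with keys 'open_app' and 'fullscreen' (illustrative)."""
    if gesture == "double_tap" and state.get("open_app") is not None:
        state["fullscreen"] = True  # show the app on the whole interface
    return state
```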
In one embodiment, the manner of displaying the opened application interface in the second area may be:
1) displaying the opened application interface in the second area, wherein the opened application interface covers the application icons displayed in the second area; or
2) displaying the opened application interface in the second area and displaying the remaining application icons in the first area, that is, moving the application icons originally displayed in the second area to the first area.
In an embodiment of the present application, after displaying the opened application interface in one of the areas, the user may select to close the application interface in the following manner, specifically:
When a touch is received in which the user slides on the human-computer interaction interface and the sliding track intersects the dividing line, that is, the user performs a gesture that cuts across the dividing line, the terminal recognizes that the user has issued an instruction to close the currently displayed application interface. Optionally, the display mode of the interface is then restored to the original display mode, that is, the non-split display mode in effect before the interface control instruction of step S10 was received. Optionally, the screen may instead be locked, with the main screen background picture displayed in the locked state.
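For a horizontal dividing line, the "sliding track intersects the dividing line" condition reduces to the start and end points of the slide lying on opposite sides of the line. The following is a minimal sketch under that assumption:

```python
# Hypothetical sketch: detect that a sliding track crosses a horizontal
# dividing line at y = line_y, i.e. start and end are on opposite sides.

def track_crosses_line(start_y, end_y, line_y):
    """True when the slide's endpoints straddle the dividing line."""
    return (start_y - line_y) * (end_y - line_y) < 0
```

A True result would trigger closing the opened application interface and, optionally, restoring the original display mode or locking the screen.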
For the convenience of understanding, the implementation of the above method will be described by way of example in fig. 2-1 to 2-4.
As shown in fig. 2-1, an interface of an intelligent terminal displays 6 application icons, namely application a, application b, application c, application d, application e, and application f; in the figure each application is marked only by its letter. The 6 application icons are displayed in the current interface at an equal proportion. To divide the interface into two areas, the touch point shown in fig. 2-1 is used as the starting position. When the user slides down from the starting position (in the direction indicated by the arrow) for the preset distance, the terminal recognizes the interface control instruction, a plurality of crease patterns, shown as short transverse lines in the figure, appear in the interface to indicate that the instruction has been received, and the interface of fig. 2-2 is entered: a dividing line appears at the centerline of the interface and divides the whole interface longitudinally into an upper and a lower display area, the upper area serving as the first area and the lower area as the second area, with all the application icons of the original interface displayed in the second area at an equal proportion.
When the user selects application b, the icons are small and application d is selected along with it, so the two application icons are moved to the first area and spread across the whole of that area, as shown in fig. 2-3, while the remaining unselected icons are scaled at an equal proportion to fill the second area. Because the two icons in the first area are enlarged, operation there is more convenient, and the user can further confirm which application is actually intended, for example confirm that application b is to be opened, thereby avoiding the mis-operation. The opened interface of application b is then displayed in the second area and the remaining icons are moved to the first area, as shown in fig. 2-4, a schematic diagram of the interface after application b is opened. Since the display interface in the second area is small, other people cannot peep at the specific content even if the user opens and operates the interface in a public place, so the privacy of user information is protected without affecting the user's use of the application. If the user moves from a public place to a non-public place such as a home, the original interface display mode can be restored, for example by double-clicking the application display interface so that it fills the whole interface. If the user wants to close the split display mode, performing the operation of cutting across the dividing line restores the original interface of fig. 2-1 and closes the application. Of course, the split display mode can also be turned off by moving the dividing line to the top or bottom of the interface, as described in the embodiments above.
The embodiment of the present application further provides a display control device corresponding to a display control method based on human-computer interaction, which is applied to an intelligent terminal, where the intelligent terminal has a human-computer interaction interface, and a plurality of application icons are displayed in the human-computer interaction interface, and as shown in fig. 3, the device mainly includes the following units:
a first receiving unit 30, configured to receive an interface control instruction sent by a user, where the interface control instruction is sent by touching a blank area of the human-computer interaction interface and moving a predetermined distance along either the transverse direction or the longitudinal direction of the interface;
a dividing unit 31, configured to divide the human-computer interaction interface into two display areas in a moving direction, and display the plurality of application icons in a second area of the two display areas;
a second receiving unit 32 for receiving an application icon selected by a user;
a first display control unit 33 for displaying the application icon selected by the user in a first area of the two display areas;
and the second display control unit 34 is configured to display the opened application interface in the second area after receiving an instruction from the user confirming to open the application.
Optionally, the first receiving unit 30 is specifically configured to:
recognizing that a user has touched a blank area of the human-computer interaction interface and is moving along either the transverse direction or the longitudinal direction of the interface;
and when the moving distance reaches the predetermined distance, displaying a predetermined pattern on the edge of the human-computer interaction interface along the moving direction, so as to prompt the user that the sending of the interface control instruction is finished.
Optionally, the predetermined pattern includes: a crease pattern represented by a plurality of transverse lines.
Optionally, the dividing unit 31 further includes:
a dividing line selecting subunit 311 (not shown in the figure) configured to select, as a dividing line, a virtual line closest to the movement end position from among a plurality of virtual lines preset in the human-computer interaction interface;
a dividing display subunit 312 (not shown in the figure) for displaying the dividing line on the human-computer interaction interface to divide the human-computer interaction interface into two display areas.
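As a rough sketch of how the dividing line selecting subunit 311 might choose among the preset virtual lines, the following picks the virtual line whose coordinate is nearest to the point where the user's move gesture ended. The function name and the pixel coordinates used here are illustrative assumptions.

```python
def select_dividing_line(virtual_lines, end_position):
    """Return the preset virtual line closest to the movement end position.

    virtual_lines: coordinates (e.g. y-offsets in pixels) of the virtual
                   lines preset in the human-computer interaction interface.
    end_position:  coordinate of the point where the move gesture ended.
    """
    if not virtual_lines:
        raise ValueError("no virtual lines are preset on the interface")
    return min(virtual_lines, key=lambda line: abs(line - end_position))
```

For instance, with virtual lines preset at 320, 640, and 960 pixels, a gesture ending at 700 selects the line at 640 as the dividing line.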
Optionally, the apparatus further comprises:
and a third display control unit 35 (not shown in the figure), configured to close the opened application interface when the user touches the human-computer interaction interface and slides on it such that the sliding track intersects the dividing line.
Optionally, the third display control unit 35 is further configured to:
and restoring the interface display mode to the original display mode in a case where the user touches the human-computer interaction interface and slides on it such that the sliding track intersects the dividing line.
Optionally, the third display control unit 35 is further configured to:
and locking the screen and displaying the home screen background picture in a case where the user touches the human-computer interaction interface and slides on it such that the sliding track intersects the dividing line.
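The condition shared by the third display control unit's behaviors, a sliding track that intersects the dividing line, can be tested as below. The sketch assumes a transverse (horizontal) dividing line at a fixed y-coordinate and a track sampled as touch points; both representations are assumptions made for illustration.

```python
def track_crosses_line(track, line_y):
    """True if the user's sliding track intersects a transverse dividing
    line at y = line_y.

    track: sequence of (x, y) touch points sampled along the slide.
    The track intersects the line when a sample lies exactly on the line,
    or two consecutive samples lie on opposite sides of it.
    """
    if any(y == line_y for _, y in track):
        return True
    return any((y0 - line_y) * (y1 - line_y) < 0
               for (_, y0), (_, y1) in zip(track, track[1:]))
```

A slide from y=100 to y=300 crosses a dividing line at y=200 and would trigger the unit; a slide that stays above the line does not.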
Optionally, the dividing unit 31 further includes:
a receiving subunit 313, configured to receive an instruction by which the user selects the dividing line and moves it;
and the dividing line moving subunit 314 is configured to divide the human-computer interaction interface into two display areas according to the moved dividing line.
Optionally, the first display control unit 33 is specifically configured to:
scaling the application icon selected by the user in equal proportion so that it fills the first area of the two display areas.
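Filling the first area with the selected icon in equal proportion amounts to scaling the icon uniformly by the largest factor at which it still fits the area. A minimal sketch, assuming pixel dimensions (the function name is an assumption):

```python
def fit_scale(icon_w, icon_h, area_w, area_h):
    """Largest uniform (aspect-ratio-preserving) scale factor at which an
    icon_w x icon_h icon still fits inside an area_w x area_h area."""
    return min(area_w / icon_w, area_h / icon_h)
```

For example, a 100x100 icon displayed in a 400x200 first area would be enlarged by a factor of 2, limited by the area's height.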
Optionally, the second display control unit 34 is specifically configured to:
and after receiving a specified operation instruction sent by the user by touching the human-computer interaction interface in the second area, displaying the opened application interface on the entire human-computer interaction interface.
Optionally, the second display control unit 34 is specifically configured to:
and displaying the opened application interface in the second area, and displaying the remaining application icons in the first area.
Optionally, the second display control unit 34 is specifically configured to:
and scaling down the plurality of application icons in equal proportion for display in the second area.
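Displaying the plurality of icons in the second area at an equal scale-down can be realized by choosing a near-square grid and a uniform cell size so that all icons fit the area. The layout heuristic below is one possible sketch, not the embodiment's prescribed algorithm; all names and dimensions are assumptions.

```python
import math

def grid_for_icons(n_icons, area_w, area_h):
    """Choose a grid (cols, rows) roughly matching the area's aspect
    ratio, and the uniform square cell side length at which n_icons
    equally scaled icons all fit inside an area_w x area_h area."""
    cols = math.ceil(math.sqrt(n_icons * area_w / area_h))
    rows = math.ceil(n_icons / cols)
    cell = min(area_w / cols, area_h / rows)
    return cols, rows, cell
```

For example, laying out 12 icons in a 600x400 second area yields a 5x3 grid with 120-pixel cells, so every icon is reduced by the same factor.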
With regard to the apparatus in the above-described embodiment, the specific manner in which each unit performs the operation has been described in detail in the embodiment related to the method, and will not be described in detail here.
To sum up, after receiving an interface control instruction sent by a user, the embodiment of the application divides the human-computer interaction interface into two display areas and displays the plurality of application icons in a second area of the two display areas; receives an application icon selected by the user; displays the user-selected application icon in a first area of the two display areas; and, after receiving an instruction of the user confirming to open the application, displays the opened application interface in the second area. Through the above operations, the user can split the display of the entire human-computer interaction interface. With this split display mode, the application icons can be displayed in a reduced size, and an application requiring privacy protection, once opened, is displayed only in part of the divided interface. This reduced display mode is not only convenient to operate but also effectively protects privacy; it neither hinders the user's operation of the intelligent terminal nor requires the intelligent terminal to be fitted with an additional protective film or the like.
The embodiments in the present specification are described in a progressive manner, each embodiment focuses on differences from other embodiments, and the same and similar parts among the embodiments are referred to each other.
An embodiment of the present invention provides an apparatus for display control, comprising a memory, and one or more programs, wherein the one or more programs are stored in the memory and configured to be executed by one or more processors, the one or more programs comprising instructions for: receiving an interface control instruction sent by a user, wherein the interface control instruction is sent by touching a blank area of the human-computer interaction interface and moving a preset distance along any one of the transverse direction and the longitudinal direction of the interface; dividing the human-computer interaction interface into two display areas in the moving direction, and displaying the plurality of application icons in a second area of the two display areas; receiving an application icon selected by a user; displaying the user-selected application icon in a first area of the two display areas; and displaying the opened application interface in the second area after receiving an instruction of confirming to open the application by the user.
Fig. 4 is a block diagram illustrating an apparatus 800 for display control in accordance with an example embodiment. For example, the apparatus 800 may be a mobile phone, a computer, a digital broadcast terminal, a messaging device, a game console, a tablet device, a medical device, an exercise device, a personal digital assistant, and the like.
Referring to fig. 4, the apparatus 800 may include one or more of the following components: processing component 802, memory 804, power component 806, multimedia component 808, audio component 810, input/output (I/O) interface 812, sensor component 814, and communication component 816.
The processing component 802 generally controls overall operation of the apparatus 800, such as operations associated with display, telephone calls, data communications, camera operations, and recording operations. The processing component 802 may include one or more processors 820 to execute instructions to perform all or a portion of the steps of the methods described above. Further, the processing component 802 can include one or more modules that facilitate interaction between the processing component 802 and other components. For example, the processing component 802 may include a multimedia module to facilitate interaction between the multimedia component 808 and the processing component 802.
The memory 804 is configured to store various types of data to support operation at the device 800. Examples of such data include instructions for any application or method operating on device 800, contact data, phonebook data, messages, pictures, videos, and so forth. The memory 804 may be implemented by any type or combination of volatile or non-volatile memory devices such as Static Random Access Memory (SRAM), electrically erasable programmable read-only memory (EEPROM), erasable programmable read-only memory (EPROM), programmable read-only memory (PROM), read-only memory (ROM), magnetic memory, flash memory, magnetic or optical disks.
Power components 806 provide power to the various components of device 800. The power components 806 may include a power management system, one or more power supplies, and other components associated with generating, managing, and distributing power for the apparatus 800.
The multimedia component 808 includes a screen that provides an output interface between the device 800 and a user. In some embodiments, the screen may include a Liquid Crystal Display (LCD) and a Touch Panel (TP). If the screen includes a touch panel, the screen may be implemented as a touch screen to receive an input signal from a user. The touch panel includes one or more touch sensors to sense touch, slide, and gestures on the touch panel. The touch sensor may not only sense the boundary of a touch or slide action, but also detect the duration and pressure associated with the touch or slide operation. In some embodiments, the multimedia component 808 includes a front facing camera and/or a rear facing camera. The front-facing camera and/or the rear-facing camera may receive external multimedia data when the device 800 is in an operating mode, such as a shooting mode or a video mode. Each front camera and rear camera may be a fixed optical lens system or have a focal length and optical zoom capability.
The audio component 810 is configured to output and/or input audio signals. For example, the audio component 810 includes a microphone (MIC) configured to receive external audio signals when the apparatus 800 is in an operational mode, such as a call mode, a recording mode, or a voice recognition mode. The received audio signals may further be stored in the memory 804 or transmitted via the communication component 816. In some embodiments, the audio component 810 also includes a speaker for outputting audio signals.
The I/O interface 812 provides an interface between the processing component 802 and peripheral interface modules, which may be keyboards, click wheels, buttons, etc. These buttons may include, but are not limited to: a home button, a volume button, a start button, and a lock button.
The sensor assembly 814 includes one or more sensors for providing various aspects of state assessment for the apparatus 800. For example, the sensor assembly 814 may detect the open/closed state of the apparatus 800 and the relative positioning of components, such as the display and keypad of the apparatus 800. The sensor assembly 814 may also detect a change in the position of the apparatus 800 or of a component of the apparatus 800, the presence or absence of user contact with the apparatus 800, the orientation or acceleration/deceleration of the apparatus 800, and a change in the temperature of the apparatus 800. The sensor assembly 814 may include a proximity sensor configured to detect the presence of a nearby object without any physical contact. The sensor assembly 814 may also include a light sensor, such as a CMOS or CCD image sensor, for use in imaging applications. In some embodiments, the sensor assembly 814 may also include an acceleration sensor, a gyroscope sensor, a magnetic sensor, a pressure sensor, or a temperature sensor.
The communication component 816 is configured to facilitate wired or wireless communication between the apparatus 800 and other devices. The apparatus 800 may access a wireless network based on a communication standard, such as WiFi, 2G or 3G, or a combination thereof. In an exemplary embodiment, the communication component 816 receives a broadcast signal or broadcast-related information from an external broadcast management system via a broadcast channel. In an exemplary embodiment, the communication component 816 further includes a Near Field Communication (NFC) module to facilitate short-range communication. For example, the NFC module may be implemented based on radio frequency identification (RFID) technology, infrared data association (IrDA) technology, ultra wideband (UWB) technology, Bluetooth (BT) technology, and other technologies.
In an exemplary embodiment, the apparatus 800 may be implemented by one or more Application Specific Integrated Circuits (ASICs), Digital Signal Processors (DSPs), Digital Signal Processing Devices (DSPDs), Programmable Logic Devices (PLDs), Field Programmable Gate Arrays (FPGAs), controllers, micro-controllers, microprocessors or other electronic components for performing the above-described methods.
In an exemplary embodiment, a non-transitory computer-readable storage medium comprising instructions, such as the memory 804 comprising instructions, executable by the processor 820 of the device 800 to perform the above-described method is also provided. For example, the non-transitory computer readable storage medium may be a ROM, a Random Access Memory (RAM), a CD-ROM, a magnetic tape, a floppy disk, an optical data storage device, and the like.
Fig. 5 is a schematic diagram of a server in some embodiments of the invention. The server 1900 may vary considerably in configuration or performance and may include one or more central processing units (CPUs) 1922 (e.g., one or more processors), memory 1932, and one or more storage media 1930 (e.g., one or more mass storage devices) storing applications 1942 or data 1944. The memory 1932 and the storage medium 1930 may provide transient or persistent storage. The program stored in the storage medium 1930 may include one or more modules (not shown), each of which may include a series of instructions operating on the server. Further, the central processing unit 1922 may be configured to communicate with the storage medium 1930 so as to execute, on the server 1900, the series of instruction operations in the storage medium 1930.
The server 1900 may also include one or more power supplies 1926, one or more wired or wireless network interfaces 1950, one or more input/output interfaces 1958, one or more keyboards 1956, and/or one or more operating systems 1941, such as Windows Server™, Mac OS X™, Unix™, Linux™, FreeBSD™, and the like.
A non-transitory computer-readable storage medium in which instructions, when executed by a processor of an apparatus (server or terminal), enable the apparatus to perform the human-machine interaction based display control method shown in fig. 1.
A non-transitory computer-readable storage medium, wherein, when instructions in the storage medium are executed by a processor of an apparatus (a server or a terminal), the apparatus is enabled to perform the display control method described in the embodiment corresponding to fig. 1, which will therefore not be repeated here. Likewise, the beneficial effects of the same method are not described again. For technical details not disclosed in the embodiments of the computer program product or the computer program referred to in the present application, reference is made to the description of the method embodiments of the present application.
Further, it should be noted that the embodiments of the present application also provide a computer program product or computer program, which may include computer instructions stored in a computer-readable storage medium. A processor of a computer device reads the computer instructions from the computer-readable storage medium and executes them, so that the computer device performs the display control method described in the embodiment corresponding to fig. 1, which will therefore not be repeated here. Likewise, the beneficial effects of the same method are not described again. For technical details not disclosed in the embodiments of the computer program product or the computer program referred to in the present application, reference is made to the description of the method embodiments of the present application.
Other embodiments of the invention will be apparent to those skilled in the art from consideration of the specification and practice of the invention disclosed herein. This invention is intended to cover any variations, uses, or adaptations of the invention following, in general, the principles of the invention and including such departures from the present disclosure as come within known or customary practice within the art to which the invention pertains. It is intended that the specification and examples be considered as exemplary only, with a true scope and spirit of the invention being indicated by the following claims.
It will be understood that the invention is not limited to the precise arrangements described above and shown in the drawings and that various modifications and changes may be made without departing from the scope thereof. The scope of the invention is limited only by the appended claims.
The above description is only for the purpose of illustrating the preferred embodiments of the present invention and is not to be construed as limiting the invention, and any modifications, equivalents, improvements and the like that fall within the spirit and principle of the present invention are intended to be included therein.