CN114895820A - Display control method based on human-computer interaction - Google Patents

Display control method based on human-computer interaction

Info

Publication number
CN114895820A
CN114895820A (application CN202210377010.1A; granted publication CN114895820B)
Authority
CN
China
Prior art keywords
interface
human-computer interaction
user
area
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202210377010.1A
Other languages
Chinese (zh)
Other versions
CN114895820B (en)
Inventor
Inventor not disclosed
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Tibet Tenghu Technology Development Co ltd
Original Assignee
Shenzhen Zhongtiandi Network Communication Technology Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shenzhen Zhongtiandi Network Communication Technology Co ltd
Priority to CN202210377010.1A
Publication of CN114895820A
Application granted
Publication of CN114895820B
Legal status: Active

Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 - Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048 - Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0481 - Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F 3/04817 - Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance, using icons
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 - Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048 - Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0487 - Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F 3/0488 - Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F 3/04886 - Interaction techniques based on graphical user interfaces [GUI] using a touch-screen or digitiser, by partitioning the display area of the touch-screen or the surface of the digitising tablet into independently controllable areas, e.g. virtual keyboards or menus
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F 9/00 - Arrangements for program control, e.g. control units
    • G06F 9/06 - Arrangements for program control, e.g. control units using stored programs, i.e. using an internal store of processing equipment to receive or retain programs
    • G06F 9/44 - Arrangements for executing specific programs
    • G06F 9/451 - Execution arrangements for user interfaces

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Software Systems (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The application provides a display control method based on human-computer interaction, which comprises the following steps: receiving an interface control instruction sent by a user, the instruction being issued by touching a blank area of the human-computer interaction interface and moving a preset distance along either the transverse or the longitudinal direction of the interface; dividing the human-computer interaction interface into two display areas along the moving direction, and displaying the plurality of application icons in the second of the two areas; receiving an application icon selected by the user; displaying the selected application icon in the first of the two areas; and, after receiving an instruction from the user confirming that the application is to be opened, displaying the opened application interface in the second area. The invention thus realizes split display of the human-computer interaction interface with reduced display of application icons and content, protecting the privacy of user information without affecting the user's normal use of each application.

Description

Display control method based on human-computer interaction
Technical Field
The application relates to the field of computers, in particular to a display control method based on human-computer interaction.
Background
Smartphones are now ubiquitous, and to meet users' needs the range of application software keeps growing. When people open their phones and light up the screen in public places, for example on the subway, on a bus, or at a gathering, many application icons are visible on the screen, and when an application is opened its interface content fills the whole screen, so the user's privacy cannot be protected.
To solve this problem, the current approach is to attach a special privacy protection film to the phone screen so that the screen cannot be seen beyond a certain viewing angle. The inventor, however, has found that this approach has at least the following problems:
after the protection film is attached, the effective resolution of the screen is degraded;
the user must separately purchase a protection film matched to the phone model, which adds to the user's burden.
Therefore, a technique is needed that overcomes the above problems and protects the privacy of user information while the mobile phone is in use.
Disclosure of Invention
One technical problem addressed by the present application is to provide a display control method based on human-computer interaction that effectively protects the privacy of user information when a user uses an intelligent terminal.
According to an embodiment of an aspect of the present application, there is provided a display control method based on human-computer interaction, which is applied to an intelligent terminal, where the intelligent terminal has a human-computer interaction interface, and a plurality of application icons are displayed in the human-computer interaction interface, and the method includes:
receiving an interface control instruction sent by a user, wherein the interface control instruction is sent by touching a blank area of the human-computer interaction interface and moving a preset distance along any one of the transverse direction and the longitudinal direction of the interface;
dividing the human-computer interaction interface into two display areas in the moving direction, and displaying the plurality of application icons in a second area of the two display areas;
receiving an application icon selected by a user;
displaying the user-selected application icon in a first area of the two display areas;
and displaying the opened application interface in the second area after receiving an instruction of confirming to open the application by the user.
Optionally, the step of receiving an interface control instruction sent by a user includes:
recognizing that a user touches a blank area of the human-computer interaction interface and moves along either the transverse or the longitudinal direction of the interface;
and when the moving distance reaches the preset distance, displaying a predetermined pattern at the edge of the human-computer interaction interface along the moving direction, to prompt the user that the interface control instruction has been issued.
Optionally, the predetermined pattern includes: a crease pattern represented by a plurality of transverse lines.
Optionally, the step of dividing the human-computer interaction interface into two display areas in the moving direction includes:
selecting a virtual line closest to the movement ending position from a plurality of virtual lines preset on a human-computer interaction interface as a dividing line;
and displaying the dividing line on the human-computer interaction interface so as to divide the human-computer interaction interface into two display areas.
Optionally, after the second area displays the opened application interface, the method further includes:
receiving that a user touches the human-computer interaction interface and slides on it, with a sliding track that intersects the dividing line;
and closing the opened application interface.
Optionally, after receiving that a user touches the human-computer interaction interface and slides on it with a sliding track that intersects the dividing line, the method further includes:
and restoring the display mode of the interface to the original display mode.
Optionally, after receiving that a user touches the human-computer interaction interface and slides on it with a sliding track that intersects the dividing line, the method further includes:
and locking the screen and displaying a home screen background picture.
Optionally, after the human-computer interaction interface is divided into two display areas in the moving direction, the method further includes:
receiving a command that a user selects the dividing line and moves the dividing line;
and dividing the human-computer interaction interface into two display areas according to the moved dividing line.
Optionally, the step of displaying the application icon selected by the user in the first area of the two display areas includes:
moving the user-selected application icon to the first area of the two display areas and scaling it in equal proportion to fill that area.
Optionally, after the opened application interface is displayed in the second area, the method further includes:
receiving a designated operation instruction sent by a user in the second area through touching the human-computer interaction interface;
and displaying the opened application interface on the whole human-computer interaction interface.
Optionally, the step of displaying the opened application interface in the second area includes:
and displaying the opened application interface in the second area, and displaying the rest application icons in the first area.
Optionally, the step of displaying the plurality of application icons in a second area of the two display areas includes:
and displaying the plurality of application icons in the second area in an equal scaling-down mode.
According to another embodiment of the present application, there is provided a display control apparatus applied to an intelligent terminal, the intelligent terminal having a human-computer interaction interface in which a plurality of application icons are displayed, the apparatus including:
the first receiving unit is used for receiving an interface control instruction sent by a user, wherein the interface control instruction is sent by touching a blank area of the human-computer interaction interface and moving a preset distance along any one direction of the transverse direction and the longitudinal direction of the interface;
the dividing unit is used for dividing the human-computer interaction interface into two display areas in the moving direction and displaying the application icons in a second area of the two display areas;
a second receiving unit for receiving an application icon selected by a user;
a first display control unit for displaying the application icon selected by the user in a first area of the two display areas;
and the second display control unit is used for displaying the opened application interface in the second area after receiving an instruction of confirming the opening of the application by the user.
Optionally, the first receiving unit is specifically configured to:
recognizing that a user touches a blank area of the human-computer interaction interface and moves along either the transverse or the longitudinal direction of the interface;
and when the moving distance reaches the preset distance, displaying a predetermined pattern at the edge of the human-computer interaction interface along the moving direction, to prompt the user that the interface control instruction has been issued.
Optionally, the predetermined pattern includes: a crease pattern represented by a plurality of transverse lines.
Optionally, the segmentation unit further includes:
the dividing line selecting subunit is used for selecting a virtual line closest to the movement ending position from a plurality of virtual lines preset on the human-computer interaction interface as a dividing line;
and the division display subunit is used for displaying the division line on the human-computer interaction interface so as to divide the human-computer interaction interface into two display areas.
Optionally, the apparatus further comprises:
and the third display control unit is used for closing the opened application interface when receiving the condition that the user touches the human-computer interaction interface and slides on the human-computer interaction interface, and the sliding track is intersected with the dividing line.
Optionally, the third display control unit is further configured to:
and restoring the interface display mode to the original display mode under the condition that the user touches the human-computer interaction interface and slides on the human-computer interaction interface, and the sliding track is intersected with the dividing line.
Optionally, the third display control unit is further configured to:
and locking the screen and displaying a main screen background picture under the condition that the user touches the human-computer interaction interface and slides on the human-computer interaction interface and the sliding track is intersected with the dividing line.
Optionally, the dividing unit further includes:
the receiving subunit is used for receiving a command that the user selects the dividing line and moves the dividing line;
and the parting line moving subunit is used for dividing the human-computer interaction interface into two display areas according to the moved parting line.
Optionally, the first display control unit is specifically configured to:
scaling the application icon selected by the user in equal proportion to fill the first area of the two display areas.
Optionally, the second display control unit is specifically configured to:
and after receiving a specified operation instruction sent by a user in the second area through touching the human-computer interaction interface, displaying the opened application interface on the whole human-computer interaction interface.
Optionally, the second display control unit is specifically configured to:
and displaying the opened application interface in the second area, and displaying the rest application icons in the first area.
Optionally, the second display control unit is specifically configured to:
and displaying the plurality of application icons in the second area in an equal scaling-down mode.
According to an embodiment of yet another aspect of the present application, there is provided a device for display control, comprising a memory, and one or more programs, wherein the one or more programs are stored in the memory and configured to be executed by one or more processors, the one or more programs including instructions for performing any of the human-machine-interaction based display control methods described above.
According to an embodiment of yet another aspect of the present application, there is provided a machine-readable medium having stored thereon instructions which, when executed by one or more processors of an apparatus, cause the apparatus to perform the human-machine interaction based display control method as described in any one of the above.
After receiving an interface control instruction sent by a user, the embodiment of the application divides the human-computer interaction interface into two display areas and displays the plurality of application icons in the second of the two areas; receives an application icon selected by the user; displays the selected application icon in the first area; and, after receiving an instruction from the user confirming that the application is to be opened, displays the opened application interface in the second area. Through these operations the user can split the entire human-computer interaction interface and have the application icons displayed at reduced size, and an application that needs privacy protection is shown, once opened, only in one part of the divided interface. This reduced presentation is convenient to operate and effectively protects privacy; it neither hinders the user's operation of the intelligent terminal nor requires the terminal to be fitted with any additional protective film.
It will be appreciated by those of ordinary skill in the art that although the following detailed description will proceed with reference being made to illustrative embodiments, the present application is not intended to be limited to these embodiments. Rather, the scope of the application is broad and is intended to be defined only by the claims that follow.
Drawings
Other features, objects and advantages of the present application will become more apparent upon reading of the following detailed description of non-limiting embodiments thereof, made with reference to the accompanying drawings in which:
fig. 1 is a flowchart of a display control method based on human-computer interaction according to an embodiment of the present application.
Fig. 2-1 is a schematic view of the original display state of the interface according to an embodiment of the present application.
Fig. 2-2 is a schematic diagram of the interface after division according to an embodiment of the present application.
Fig. 2-3 is a diagram of the partitioned display of the interface's application icons according to an embodiment of the present application.
Fig. 2-4 is a diagram of an application opened after the partitioned icon display according to an embodiment of the present application.
Fig. 3 is a schematic structural diagram of a display control device according to an embodiment of the present application.
Fig. 4 is a block diagram of an apparatus for display control according to an embodiment of the present application.
Fig. 5 is a schematic structural diagram of a server according to an embodiment of the present application.
It will be appreciated by those of ordinary skill in the art that although the following detailed description will proceed with reference being made to illustrative embodiments, the present application is not intended to be limited to these embodiments. Rather, the scope of the application is broad and is intended to be defined only by the claims that follow.
Detailed Description
Before discussing exemplary embodiments in more detail, it should be noted that some exemplary embodiments are described as processes or methods depicted as flowcharts. Although a flowchart may describe the operations as a sequential process, many of the operations can be performed in parallel, concurrently, or simultaneously. In addition, the order of the operations may be re-arranged. The process may be terminated when its operations are completed, but may have additional steps not included in the figure. The processes may correspond to methods, functions, procedures, subroutines, and the like.
The computer equipment comprises user equipment and network equipment. The user equipment includes but is not limited to computers, smart phones, PDAs, and the like; the network equipment includes but is not limited to a single network server, a server group consisting of a plurality of network servers, or a cloud composed of a large number of computers or network servers based on cloud computing, where cloud computing is a form of distributed computing: a super virtual computer composed of a group of loosely coupled computers. The computer equipment may run independently to realize the application, or may access a network and realize the application through interaction with other computer equipment in the network. The network in which the computer equipment is located includes but is not limited to the internet, a wide area network, a metropolitan area network, a local area network, a VPN network, and the like.
It should be noted that the user equipment, the network device, the network, etc. are only examples, and other existing or future computer devices or networks may also be included in the scope of the present application, if applicable, and are included by reference.
The methods discussed below, some of which are illustrated by flow diagrams, may be implemented by hardware, software, firmware, middleware, microcode, hardware description languages, or any combination thereof. When implemented in software, firmware, middleware or microcode, the program code or code segments to perform the necessary tasks may be stored in a machine or computer readable medium such as a storage medium. The processor(s) may perform the necessary tasks.
Specific structural and functional details disclosed herein are merely representative and are provided for purposes of describing example embodiments of the present application. This application may, however, be embodied in many alternate forms and should not be construed as limited to only the embodiments set forth herein.
It will be understood that, although the terms first, second, etc. may be used herein to describe various elements, these elements should not be limited by these terms. These terms are only used to distinguish one element from another. For example, a first element may be termed a second element, and, similarly, a second element may be termed a first element, without departing from the scope of example embodiments. As used herein, the term "and/or" includes any and all combinations of one or more of the associated listed items.
It will be understood that when an element is referred to as being "connected" or "coupled" to another element, it can be directly connected or coupled to the other element or intervening elements may be present. In contrast, when an element is referred to as being "directly connected" or "directly coupled" to another element, there are no intervening elements present. Other words used to describe the relationship between elements (e.g., "between" versus "directly between", "adjacent" versus "directly adjacent to", etc.) should be interpreted in a similar manner.
The terminology used herein is for the purpose of describing particular embodiments only and is not intended to be limiting of example embodiments. As used herein, the singular forms "a", "an" and "the" are intended to include the plural forms as well, unless the context clearly indicates otherwise. It will be further understood that the terms "comprises" and/or "comprising," when used herein, specify the presence of stated features, integers, steps, operations, elements, and/or components, but do not preclude the presence or addition of one or more other features, integers, steps, operations, elements, components, and/or groups thereof.
It should also be noted that, in some alternative implementations, the functions/acts noted may occur out of the order noted in the figures. For example, two figures shown in succession may, in fact, be executed substantially concurrently, or the figures may sometimes be executed in the reverse order, depending upon the functionality/acts involved.
The technical solution of the present application is further described in detail below with reference to the accompanying drawings.
Fig. 1 is a flowchart of a display control method based on human-computer interaction according to an embodiment of the present application. The method is applied to an intelligent terminal that has a human-computer interaction interface; the type of the interface is not limited and may, for example, be any display type such as a curved screen or a flat screen. A plurality of application icons are displayed in the human-computer interaction interface; the icon types are likewise not limited in this application and may be those of any applications on the intelligent terminal. The method comprises the following steps:
s10, receiving an interface control instruction sent by a user, wherein the interface control instruction is sent by touching a blank area of the human-computer interaction interface and moving a preset distance along any one direction of the transverse direction and the longitudinal direction of the interface;
s11, dividing the human-computer interaction interface into two display areas in the moving direction, and displaying the application icons in a second area of the two display areas;
s12, receiving an application icon selected by a user;
s13, displaying the application icon selected by the user in a first area of the two display areas;
and S14, displaying the opened application interface in the second area after receiving an instruction of confirming to open the application by the user.
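For illustration, the flow S10 to S14 can be modeled as a small controller, as in the following minimal Python sketch. All names here (DisplayController, on_control_gesture, and so on) are hypothetical and are not the patented implementation; the sketch assumes a host UI framework that delivers touch events and renders the two display areas.

```python
from dataclasses import dataclass, field

@dataclass
class DisplayController:
    """Hypothetical sketch of the S10-S14 flow; not the patented implementation."""
    icons: list                        # all application icons on the interface
    divider: float | None = None       # dividing-line position; None = undivided
    first_area: list = field(default_factory=list)  # icons picked by the user
    open_app: str | None = None        # app whose interface fills the second area

    def on_control_gesture(self, end_pos: float) -> None:
        # S10/S11: a valid gesture splits the interface at a dividing line;
        # all icons are then shown, scaled down, in the second area.
        self.divider = end_pos

    def on_icon_selected(self, icon: str) -> None:
        # S12/S13: a selected icon is moved to the first area, enlarged.
        if self.divider is not None and icon in self.icons:
            self.first_area.append(icon)

    def on_open_confirmed(self, icon: str) -> None:
        # S14: after confirmation, the app's interface is shown in the
        # second area only, keeping its content out of full-screen view.
        if icon in self.first_area:
            self.open_app = icon
```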
After receiving an interface control instruction sent by a user, the embodiment of the application divides the human-computer interaction interface into two display areas and displays the plurality of application icons in the second of the two areas; receives an application icon selected by the user; displays the selected application icon in the first area; and, after receiving an instruction from the user confirming that the application is to be opened, displays the opened application interface in the second area. Through these operations the user can split the entire human-computer interaction interface and have the application icons displayed at reduced size, and an application that needs privacy protection is shown, once opened, only in one part of the divided interface. This reduced presentation is convenient to operate and effectively protects privacy; it neither hinders the user's operation of the intelligent terminal nor requires the terminal to be fitted with any additional protective film.
One embodiment of the step of receiving the interface control instruction sent by the user in step S10 includes: recognizing that the user touches a blank area of the human-computer interaction interface and moves along either the transverse or the longitudinal direction of the interface; and, when the moving distance reaches the preset distance, displaying a predetermined pattern at the edge of the human-computer interaction interface along the moving direction, to prompt the user that the interface control instruction has been issued. The blank area is a non-icon display area, that is, an area where touching with a finger does not open any application program. For example, the blank area may be the edge of the human-computer interaction interface of the intelligent terminal: on a curved screen the application icons are displayed in the central area, so the curved edge can serve as the blank area, and the user can issue the interface control instruction by touching that edge and sliding up or down, which is easy to perform because the curved edge sits close to the fingers. The transverse and longitudinal directions are relative concepts defined with respect to the human-computer interaction interface: the longitudinal direction is the up-down direction of the interface in its current display state, and the transverse direction is its left-right direction. Neither direction needs to be exactly parallel or perpendicular to the interface's edges; a certain deviation may be allowed, for example the longitudinal direction may be any direction at an included angle of no more than 45 degrees to the interface's vertical edge, and this tolerance can be set as required.
To improve the user's operating experience, when the user touches the human-computer interaction interface and moves in one direction for the preset distance, a predetermined pattern can be displayed along that direction to prompt the user that the control instruction has been received, at which point the finger can be lifted. The predetermined pattern may be a crease pattern represented by a plurality of transverse lines, a page-folding pattern, a page-stretching pattern, or the like.
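A rough sketch of this recognition step follows; the patent does not prescribe an algorithm, so the function name, the 120-pixel threshold, and the dominant-axis classification below are assumptions made purely for illustration.

```python
def detect_control_gesture(start, current, icon_rects, min_distance=120.0):
    """Return 'transverse', 'longitudinal', or None for a touch that began
    at `start` and has now reached `current` (both (x, y) pixel tuples).
    `icon_rects` holds (left, top, right, bottom) boxes of displayed icons.
    Hypothetical sketch; names and thresholds are assumptions."""
    sx, sy = start
    # The touch must begin in a blank (non-icon) area.
    for (left, top, right, bottom) in icon_rects:
        if left <= sx <= right and top <= sy <= bottom:
            return None
    dx, dy = current[0] - sx, current[1] - sy
    # Classify by the dominant axis (tolerating up to 45 degrees of skew, as
    # the description allows); fire once the preset distance is reached, at
    # which point the UI would draw the crease pattern as the prompt.
    if abs(dx) >= abs(dy) and abs(dx) >= min_distance:
        return 'transverse'
    if abs(dy) > abs(dx) and abs(dy) >= min_distance:
        return 'longitudinal'
    return None
```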
One embodiment of dividing the human-computer interaction interface into two display areas along the moving direction in step S11 may be: selecting, from a plurality of virtual lines preset on the human-computer interaction interface, the virtual line closest to the position where the movement ended as the dividing line. For example, a plurality of virtual lines at preset intervals are laid out along the transverse or longitudinal direction of the interface; these lines are not displayed in the normal display state. When the user issues the interface control instruction by touching the interface, moving the preset distance from the initial touch position to an end position, and lifting the finger, the virtual line closest to the end position is selected as the dividing line and displayed on the human-computer interaction interface, dividing it into two display areas. Through this operation the entire human-computer interaction interface is divided into two display areas.
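Selecting the nearest preset virtual line reduces to a one-line minimum search. The sketch below, including the 300-pixel line spacing, is a hypothetical illustration rather than the patent's implementation.

```python
def pick_dividing_line(end_y, virtual_lines):
    """Choose the preset virtual line nearest to the y-position where the
    finger was lifted. `virtual_lines` is a list of y-offsets."""
    return min(virtual_lines, key=lambda line_y: abs(line_y - end_y))

# e.g. a 2400-px-tall screen with invisible lines every 300 px:
lines = list(range(300, 2400, 300))          # 300, 600, ..., 2100
divider_y = pick_dividing_line(1120, lines)  # -> 1200
```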
In an embodiment of the present application, after the human-computer interaction interface is divided into two display areas, the user may adjust the size of each display area, specifically by: receiving a command in which the user selects the dividing line and moves it; and dividing the human-computer interaction interface into two display areas according to the moved dividing line. When the user's finger is recognized as resting on the dividing line for a preset time, the dividing line is considered selected; the user can then move it as required, the moving operation being that, after selecting the line, the user's finger moves across the human-computer interaction interface, and the position where the movement ends becomes the line's new position.
It will be appreciated that the original display state of the human-computer interaction interface can also be restored by moving the dividing line: for example, when the line is moved to within a preset range of the upper or lower edge (for a longitudinal split) or of the left or right side (for a transverse one), the dividing line is deleted and the interface returns to the original display state it had before the interface control instruction was received.
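A compact sketch of this drag-to-resize and drag-to-dismiss behaviour might look as follows; the function name and the 100-pixel edge margin are assumptions for illustration only.

```python
def move_divider(new_y, screen_height, edge_margin=100):
    """After a long press selects the dividing line, the finger drags it to
    `new_y`. Dragging the line to within `edge_margin` px of the top or
    bottom edge dismisses the split and restores the original, undivided
    display (signalled here by returning None)."""
    if new_y <= edge_margin or new_y >= screen_height - edge_margin:
        return None          # delete the dividing line, restore original state
    return new_y             # otherwise re-split the interface at new_y
```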
After the interface is divided, all the icons displayed on the original interface may be shown in one of the display areas; for example, all the application icons may be displayed in the second of the two areas, scaled down in equal proportion.
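The equal-proportion reduction can be pictured as mapping the original full-screen icon layout into the second area with a single scale factor. The helper below is an illustrative assumption: it scales both axes by the vertical ratio so that icons keep their aspect ratio.

```python
def shrink_icons(icon_rects, screen_height, area_top, area_bottom):
    """Map icons laid out over the full screen height into the second area
    [area_top, area_bottom), applying one scale factor to both axes so each
    icon keeps its aspect ratio (equal-proportion reduction). Sketch only."""
    scale = (area_bottom - area_top) / screen_height
    return [(left * scale, area_top + top * scale,
             right * scale, area_top + bottom * scale)
            for (left, top, right, bottom) in icon_rects]
```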
The manner of selecting the application icon by the user in step S12 is not limited in the present invention, and the selection method may be, for example, clicking the application icon.
In the embodiment of the present application, after all the application icons are displayed at reduced size in the divided second area, each icon becomes smaller and the spacing between icons shrinks, so when the user selects one of the icons there is a risk of misoperation in which two icons are selected at once. For this reason the selected icons are moved to the first area and displayed enlarged, so that the user can confirm which application is actually to be opened.
In step S14, after receiving the user's instruction confirming that the application is to be opened, the opened application interface is displayed in the second area. It will be appreciated that this embodiment chooses to display the opened application interface in the second area, but it could equally be displayed in the first area. If the application the user opens does not need privacy protection, or if it does but the user has moved from a public place to somewhere with fewer people, such as home or an office, the normal display state can be restored and the opened application interface displayed across the whole human-computer interaction interface after receiving a specified operation instruction issued by the user by touching the human-computer interaction interface within the second area. The specified operation instruction is not limited in this application and may, for example, be a double-click.
In one embodiment, the manner of displaying the opened application interface in the second area may be:
1) displaying the opened application interface in the second area, the opened interface covering the application icons displayed there; or
2) displaying the opened application interface in the second area and moving the application icons originally displayed in the second area to the first area.
In an embodiment of the present application, after the opened application interface is displayed in one of the areas, the user may close it in the following manner:
when it is received that the user touches the human-computer interaction interface and slides across it with a sliding track that intersects the dividing line (that is, the user performs a gesture that cuts across the dividing line), this is recognized as the user's instruction to close the currently displayed application interface. Optionally, the interface display mode is then restored to the original display mode, namely the undivided display mode that existed before the interface control instruction of step S10 was received. Optionally, the screen may instead be locked, displaying the home screen background picture in the locked state.
For ease of understanding, an implementation of the above method is described below by way of example with reference to figs. 2-1 to 2-4.
Fig. 2-1 shows the interface of an intelligent terminal displaying six application icons, application a through application f, each marked in the figure only by its letter and laid out at equal size. To divide the interface into two areas, the user takes the touch point shown in fig. 2-1 as the starting position and slides downward (in the direction of the arrow) for the preset distance. The terminal recognizes the interface control instruction, a crease pattern of several short transverse lines appears in the interface to confirm that the instruction has been received, and the interface of fig. 2-2 is entered: a dividing line appears at the centerline of the interface, splitting it longitudinally into upper and lower display areas, with the upper area serving as the first area and the lower area as the second area, and all the application icons of the original interface displayed in the second area in equal proportion. When the user selects application b, the icons are small and application d is selected at the same time by mistake, so both icons are moved to the first area and enlarged to fill it, as shown in fig. 2-3, while the remaining unselected icons are proportionally scaled to fill the second area. Because the two icons in the first area are displayed enlarged, operation is easier and the user can further confirm which application is actually wanted, for example confirming that application b is to be opened, thereby avoiding the misoperation. The opened interface of application b is then displayed in the second area and the remaining icons are moved to the first area; fig. 2-4 is a schematic diagram of the interface with application b open. Since the display area of the second region is small, other people cannot peep at the specific content even if the user views and operates the interface in a public place, so the privacy of the user's information is protected without affecting the user's use of the application. If the user moves from a public place to a non-public place such as home, the original interface display mode can be restored, for example by double-clicking the application display interface so that it fills the whole interface. If the user wants to exit the split display mode, performing the gesture that cuts across the dividing line restores the original interface of fig. 2-1 and closes the application; as described in the embodiments above, the split display mode can of course also be exited by moving the dividing line to the top or bottom of the interface.
The embodiment of the present application further provides a display control device corresponding to a display control method based on human-computer interaction, which is applied to an intelligent terminal, where the intelligent terminal has a human-computer interaction interface, and a plurality of application icons are displayed in the human-computer interaction interface, and as shown in fig. 3, the device mainly includes the following units:
the first receiving unit 30 is configured to receive an interface control instruction sent by a user, where the interface control instruction is sent by touching a blank area of the human-computer interaction interface and moving a predetermined distance in any one of a horizontal direction and a longitudinal direction of the interface;
a dividing unit 31, configured to divide the human-computer interaction interface into two display areas in a moving direction, and display the plurality of application icons in a second area of the two display areas;
a second receiving unit 32 for receiving an application icon selected by a user;
a first display control unit 33 for displaying the application icon selected by the user in a first area of the two display areas;
and the second display control unit 34 is configured to, after receiving an instruction of a user to open an application, display the opened application interface in the second area.
Optionally, the first receiving unit 30 is specifically configured to:
recognizing that a user touches a blank area of the human-computer interaction interface and moves along either the transverse or the longitudinal direction of the interface;
and when the moving distance reaches the preset distance, displaying a predetermined pattern at the edge of the human-computer interaction interface along the moving direction, to prompt the user that the interface control instruction has been issued.
Optionally, the predetermined pattern includes: a crease pattern represented by a plurality of transverse lines.
Optionally, the dividing unit 31 further includes:
a dividing line selecting subunit 311 (not shown in the figure) configured to select, as a dividing line, a virtual line closest to the movement end position from among a plurality of virtual lines preset in the human-computer interaction interface;
a dividing display subunit 312 (not shown in the figure) for displaying the dividing line on the human-computer interaction interface to divide the human-computer interaction interface into two display areas.
Optionally, the apparatus further comprises:
and a third display control unit 35 (not shown in the figure) for closing the opened application interface when the user touches the human-computer interaction interface and slides on the human-computer interaction interface, and the sliding track intersects with the dividing line.
Optionally, the third display control unit 35 is further configured to:
and restoring the interface display mode to the original display mode under the condition that the user touches the human-computer interaction interface and slides on the human-computer interaction interface, and the sliding track is intersected with the dividing line.
Optionally, the third display control unit 35 is further configured to:
and locking the screen and displaying a main screen background picture under the condition that the user touches the human-computer interaction interface and slides on the human-computer interaction interface and the sliding track is intersected with the dividing line.
Optionally, the dividing unit 31 further includes:
a receiving subunit 313, configured to receive a command that a user selects the dividing line and moves the dividing line;
and the dividing line moving subunit 314 is configured to divide the human-computer interaction interface into two display areas according to the moved dividing line.
Optionally, the first display control unit 33 is specifically configured to:
scaling the application icon selected by the user in equal proportion to fill the first area of the two display areas.
Optionally, the second display control unit 34 is specifically configured to:
and after receiving a specified operation instruction sent by a user in the second area through touching the human-computer interaction interface, displaying the opened application interface on the whole human-computer interaction interface.
Optionally, the second display control unit 34 is specifically configured to:
and displaying the opened application interface in the second area, and displaying the rest application icons in the first area.
Optionally, the second display control unit 34 is specifically configured to:
and displaying the plurality of application icons in the second area in an equal scaling-down mode.
With regard to the apparatus in the above-described embodiment, the specific manner in which each unit performs the operation has been described in detail in the embodiment related to the method, and will not be described in detail here.
To sum up, after receiving an interface control instruction sent by a user, the embodiment of the application divides the human-computer interaction interface into two display areas and displays the plurality of application icons in the second of the two areas; receives an application icon selected by the user; displays the selected application icon in the first area; and, after receiving an instruction from the user confirming that the application is to be opened, displays the opened application interface in the second area. Through these operations the user can split the entire human-computer interaction interface and have the application icons displayed at reduced size, and an application that needs privacy protection is shown, once opened, only in one part of the divided interface. This reduced presentation is convenient to operate and effectively protects privacy; it neither hinders the user's operation of the intelligent terminal nor requires the terminal to be fitted with any additional protective film.
The embodiments in the present specification are described in a progressive manner, each embodiment focuses on differences from other embodiments, and the same and similar parts among the embodiments are referred to each other.
An embodiment of the present invention provides an apparatus for display control, comprising a memory, and one or more programs, wherein the one or more programs are stored in the memory and configured to be executed by one or more processors, the one or more programs comprising instructions for: receiving an interface control instruction sent by a user, wherein the interface control instruction is sent by touching a blank area of the human-computer interaction interface and moving a preset distance along any one of the transverse direction and the longitudinal direction of the interface; dividing the human-computer interaction interface into two display areas in the moving direction, and displaying the plurality of application icons in a second area of the two display areas; receiving an application icon selected by a user; displaying the user-selected application icon in a first area of the two display areas; and displaying the opened application interface in the second area after receiving an instruction of confirming to open the application by the user.
Fig. 4 is a block diagram illustrating an apparatus 800 for display control in accordance with an example embodiment. For example, the apparatus 800 may be a mobile phone, a computer, a digital broadcast terminal, a messaging device, a game console, a tablet device, a medical device, an exercise device, a personal digital assistant, and the like.
Referring to fig. 4, the apparatus 800 may include one or more of the following components: processing component 802, memory 804, power component 806, multimedia component 808, audio component 810, input/output (I/O) interface 812, sensor component 814, and communication component 816.
The processing component 802 generally controls the overall operation of the device 800, such as operations associated with display, telephone calls, data communications, camera operations, and recording operations. The processing component 802 may include one or more processors 820 to execute instructions to perform all or a portion of the steps of the methods described above. Further, the processing component 802 can include one or more modules that facilitate interaction between the processing component 802 and other components. For example, the processing component 802 may include a multimedia module to facilitate interaction between the multimedia component 808 and the processing component 802.
The memory 804 is configured to store various types of data to support operation at the device 800. Examples of such data include instructions for any application or method operating on device 800, contact data, phonebook data, messages, pictures, videos, and so forth. The memory 804 may be implemented by any type or combination of volatile or non-volatile memory devices such as Static Random Access Memory (SRAM), electrically erasable programmable read-only memory (EEPROM), erasable programmable read-only memory (EPROM), programmable read-only memory (PROM), read-only memory (ROM), magnetic memory, flash memory, magnetic or optical disks.
Power components 806 provide power to the various components of device 800. The power components 806 may include a power management system, one or more power supplies, and other components associated with generating, managing, and distributing power for the apparatus 800.
The multimedia component 808 includes a screen that provides an output interface between the device 800 and a user. In some embodiments, the screen may include a liquid crystal display (LCD) and a touch panel (TP). If the screen includes a touch panel, the screen may be implemented as a touch screen to receive input signals from the user. The touch panel includes one or more touch sensors to sense touches, slides, and gestures on the touch panel. The touch sensors may not only sense the boundary of a touch or slide action, but also detect the duration and pressure associated with the touch or slide operation. In some embodiments, the multimedia component 808 includes a front-facing camera and/or a rear-facing camera. The front-facing camera and/or the rear-facing camera may receive external multimedia data when the device 800 is in an operating mode, such as a shooting mode or a video mode. Each front and rear camera may be a fixed optical lens system or have focus and optical zoom capability.
The audio component 810 is configured to output and/or input audio signals. For example, the audio component 810 includes a microphone (MIC) configured to receive external audio signals when the apparatus 800 is in an operational mode, such as a call mode, a recording mode, or a voice recognition mode. The received audio signals may further be stored in the memory 804 or transmitted via the communication component 816. In some embodiments, the audio component 810 also includes a speaker for outputting audio signals.
The I/O interface 812 provides an interface between the processing component 802 and peripheral interface modules, which may be keyboards, click wheels, buttons, etc. These buttons may include, but are not limited to: a home button, a volume button, a start button, and a lock button.
The sensor assembly 814 includes one or more sensors for providing various aspects of state assessment for the device 800. For example, the sensor assembly 814 may detect the open/closed state of the device 800 and the relative positioning of components, such as the display and keypad of the apparatus 800; it may also detect a change in the position of the apparatus 800 or a component of the apparatus 800, the presence or absence of user contact with the apparatus 800, the orientation or acceleration/deceleration of the apparatus 800, and a change in its temperature. The sensor assembly 814 may include a proximity sensor configured to detect the presence of a nearby object without any physical contact. The sensor assembly 814 may also include a light sensor, such as a CMOS or CCD image sensor, for use in imaging applications. In some embodiments, the sensor assembly 814 may also include an acceleration sensor, a gyroscope sensor, a magnetic sensor, a pressure sensor, or a temperature sensor.
The communication component 816 is configured to facilitate wired or wireless communication between the apparatus 800 and other devices. The device 800 may access a wireless network based on a communication standard, such as WiFi, 2G, or 3G, or a combination thereof. In an exemplary embodiment, the communication component 816 receives a broadcast signal or broadcast-related information from an external broadcast management system via a broadcast channel. In an exemplary embodiment, the communication component 816 further includes a near field communication (NFC) module to facilitate short-range communication. For example, the NFC module may be implemented based on radio frequency identification (RFID) technology, infrared data association (IrDA) technology, ultra-wideband (UWB) technology, Bluetooth (BT) technology, and other technologies.
In an exemplary embodiment, the apparatus 800 may be implemented by one or more Application Specific Integrated Circuits (ASICs), Digital Signal Processors (DSPs), Digital Signal Processing Devices (DSPDs), Programmable Logic Devices (PLDs), Field Programmable Gate Arrays (FPGAs), controllers, micro-controllers, microprocessors or other electronic components for performing the above-described methods.
In an exemplary embodiment, a non-transitory computer-readable storage medium comprising instructions, such as the memory 804 comprising instructions, executable by the processor 820 of the device 800 to perform the above-described method is also provided. For example, the non-transitory computer readable storage medium may be a ROM, a Random Access Memory (RAM), a CD-ROM, a magnetic tape, a floppy disk, an optical data storage device, and the like.
Fig. 5 is a schematic diagram of a server in some embodiments of the invention. The server 1900 may vary widely by configuration or performance and may include one or more central processing units (CPUs) 1922 (e.g., one or more processors), memory 1932, and one or more storage media 1930 (e.g., one or more mass storage devices) storing applications 1942 or data 1944. The memory 1932 and the storage medium 1930 may be transient or persistent storage. The program stored in the storage medium 1930 may include one or more modules (not shown), each of which may include a series of instruction operations on the server. Further, the central processor 1922 may be configured to communicate with the storage medium 1930 to execute the series of instruction operations in the storage medium 1930 on the server 1900.
The server 1900 may also include one or more power supplies 1926, one or more wired or wireless network interfaces 1950, one or more input-output interfaces 1958, one or more keyboards 1956, and/or one or more operating systems 1941, such as Windows Server™, Mac OS X™, Unix™, Linux™, FreeBSD™, etc.
A non-transitory computer-readable storage medium in which instructions, when executed by a processor of an apparatus (server or terminal), enable the apparatus to perform the human-machine interaction based display control method shown in fig. 1.
A non-transitory computer-readable storage medium is further provided, wherein instructions in the storage medium, when executed by a processor of an apparatus (a server or a terminal), enable the apparatus to perform the data processing method described in the embodiment corresponding to Fig. 1; that description is therefore not repeated here, and the beneficial effects of the same method are likewise not described again. For technical details not disclosed here, reference is made to the description of the method embodiments of the present application.
Further, it should be noted that embodiments of the present application also provide a computer program product or a computer program, which may include computer instructions stored in a computer-readable storage medium. A processor of a computer device reads the computer instructions from the computer-readable storage medium and executes them, so that the computer device performs the data processing method described in the embodiment corresponding to Fig. 1; that description is therefore not repeated here, and the beneficial effects of the same method are likewise not described again. For technical details not disclosed in the embodiments of the computer program product or the computer program referred to in the present application, reference is made to the description of the method embodiments of the present application.
Other embodiments of the invention will be apparent to those skilled in the art from consideration of the specification and practice of the invention disclosed herein. This invention is intended to cover any variations, uses, or adaptations of the invention following, in general, the principles of the invention and including such departures from the present disclosure as come within known or customary practice within the art to which the invention pertains. It is intended that the specification and examples be considered as exemplary only, with a true scope and spirit of the invention being indicated by the following claims.
It will be understood that the invention is not limited to the precise arrangements described above and shown in the drawings and that various modifications and changes may be made without departing from the scope thereof. The scope of the invention is limited only by the appended claims.
The above description is only for the purpose of illustrating the preferred embodiments of the present invention and is not to be construed as limiting the invention, and any modifications, equivalents, improvements and the like that fall within the spirit and principle of the present invention are intended to be included therein.

Claims (10)

1. A display control method based on human-computer interaction, applied to an intelligent terminal, the intelligent terminal being provided with a human-computer interaction interface in which a plurality of application icons are displayed, characterized in that the method comprises the following steps:
receiving an interface control instruction sent by a user, wherein the interface control instruction is sent by touching a blank area of the human-computer interaction interface and moving a preset distance along either the transverse or the longitudinal direction of the interface;
dividing the human-computer interaction interface into two display areas in the moving direction, and displaying the application icons, proportionally reduced, in a second area of the two display areas;
receiving an application icon selected by a user;
displaying the user-selected application icon in a first area of the two display areas;
and after receiving an instruction from the user confirming that the application is to be opened, displaying the opened application interface in the second area.
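Purely by way of illustration, and not as part of the claims, the flow of claim 1 can be read as a small state machine: split gesture, then icon selection, then confirmation. The following Kotlin sketch assumes hypothetical names (Icon, Region, SplitScreenController) and an ordering check that the claim implies but does not spell out:

```kotlin
// Illustrative sketch only; names and the strict ordering are assumptions.
data class Icon(val name: String)

enum class Region { FIRST, SECOND }

class SplitScreenController {
    private var split = false
    private var selected: Icon? = null

    // Step 1: a blank-area touch moved the preset distance, so split the
    // interface; icons are then shown proportionally reduced in the second area.
    fun onSplitGesture() { split = true }

    // Steps 2-3: the user selects an icon, which is shown in the first area.
    fun onIconSelected(icon: Icon) {
        check(split) { "split gesture must come first" }
        selected = icon
    }

    // Step 4: the user confirms; the application opens in the second area.
    fun onOpenConfirmed(): Pair<Icon, Region> =
        requireNotNull(selected) to Region.SECOND
}

fun main() {
    val controller = SplitScreenController()
    controller.onSplitGesture()
    controller.onIconSelected(Icon("Mail"))
    println(controller.onOpenConfirmed())  // (Icon(name=Mail), SECOND)
}
```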
2. The method of claim 1, wherein the step of receiving an interface control instruction sent by a user comprises:
recognizing that the user has touched a blank area of the human-computer interaction interface and is moving along either the transverse or the longitudinal direction of the interface;
and when the moving distance reaches the preset distance, displaying a preset pattern at the edge of the human-computer interaction interface in the moving direction, so as to prompt the user that the sending of the interface control instruction is complete.
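Illustratively, the recognition step of claim 2 amounts to axis-wise distance thresholding from the touch-down point. A minimal Kotlin sketch, assuming a hypothetical 120 px preset distance and a simple TouchPoint type, neither of which is specified in the patent:

```kotlin
import kotlin.math.abs

// The concrete threshold is an assumption; the claim only requires
// "a preset distance" along one axis, starting from a blank area.
const val PRESET_DISTANCE_PX = 120f

data class TouchPoint(val x: Float, val y: Float)

// Returns the axis of a completed instruction ("transverse"/"longitudinal"),
// or null while the gesture is still short of the preset distance.
fun detectInstruction(down: TouchPoint, current: TouchPoint, onBlankArea: Boolean): String? {
    if (!onBlankArea) return null
    val dx = abs(current.x - down.x)
    val dy = abs(current.y - down.y)
    return when {
        dx >= PRESET_DISTANCE_PX && dx >= dy -> "transverse"
        dy >= PRESET_DISTANCE_PX && dy > dx -> "longitudinal"
        else -> null  // keep tracking; show the edge pattern once non-null
    }
}
```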
3. The method of claim 1, wherein the step of dividing the human-computer interaction interface into two display areas in the moving direction comprises:
selecting, as a dividing line, the virtual line closest to the position where the movement ends from among a plurality of virtual lines preset on the human-computer interaction interface;
and displaying the dividing line on the human-computer interaction interface so as to divide the human-computer interaction interface into two display areas.
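Illustratively, selecting the dividing line in claim 3 is a nearest-neighbour choice among the preset virtual lines. A minimal Kotlin sketch, assuming hypothetical line positions at 25%, 50%, and 75% of the screen extent (the patent does not fix the positions or their number):

```kotlin
import kotlin.math.abs

// Preset virtual-line positions, as fractions of the screen extent; assumed.
val presetLineFractions = listOf(0.25f, 0.5f, 0.75f)

// Pick the preset virtual line closest to where the movement ended.
fun chooseDividingLine(moveEndPx: Float, screenExtentPx: Float): Float =
    presetLineFractions
        .map { it * screenExtentPx }
        .minByOrNull { abs(it - moveEndPx) }!!  // list is non-empty
```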
4. The method of claim 3, wherein after the second area displays the opened application interface, the method further comprises:
receiving a touch-and-slide operation of the user on the human-computer interaction interface, wherein the sliding track intersects the dividing line;
and closing the opened application interface and restoring the interface to its original display mode.
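Illustratively, the intersection test of claim 4 can be performed by checking whether consecutive samples of the sliding track fall on opposite sides of the dividing line. A Kotlin sketch assuming a vertical dividing line and sampled x-coordinates (a horizontal line would compare y-coordinates instead):

```kotlin
// The track is the list of sampled x-positions of the slide; names assumed.
fun trackCrossesLine(trackXs: List<Float>, lineX: Float): Boolean =
    // A sign change (or a touch, hence <=) between neighbours means the
    // track crossed the dividing line at x = lineX.
    trackXs.zipWithNext().any { (a, b) -> (a - lineX) * (b - lineX) <= 0f }

fun onSlideFinished(trackXs: List<Float>, lineX: Float, closeAndRestore: () -> Unit) {
    // Claim 4: a slide whose track intersects the dividing line closes the
    // opened application and restores the original display mode.
    if (trackCrossesLine(trackXs, lineX)) closeAndRestore()
}
```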
5. The method of claim 3, wherein after dividing the human-computer interaction interface into two display areas in the moving direction, the method further comprises:
receiving a command by which the user selects the dividing line and moves it;
and re-dividing the human-computer interaction interface into two display areas according to the moved dividing line.
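Illustratively, claim 5 reduces to updating the dividing-line position from the drag end point and re-dividing accordingly. A one-function Kotlin sketch; the clamping bounds are assumptions, since the claim only says the interface is re-divided according to the line's new position:

```kotlin
// Clamp the dragged line so neither area collapses; 10%/90% bounds assumed.
fun moveDividingLine(dragToPx: Float, screenExtentPx: Float): Float =
    dragToPx.coerceIn(0.1f * screenExtentPx, 0.9f * screenExtentPx)
```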
6. The method of claim 1, wherein after the second area displays the opened application interface, the method further comprises:
receiving a designated operation instruction sent by the user in the second area by touching the human-computer interaction interface;
and displaying the opened application interface on the whole human-computer interaction interface.
7. The method of claim 1, wherein the step of displaying the opened application interface in the second area comprises:
displaying the opened application interface in the second area while displaying the remaining application icons in the first area.
8. The method of claim 4, wherein the preset pattern comprises a wrinkle pattern represented by a plurality of transverse lines; and after receiving the touch-and-slide operation of the user on the human-computer interaction interface in which the sliding track intersects the dividing line, the method further comprises the following step:
locking the screen and displaying the main screen background picture.
9. The method of claim 1, wherein the step of moving the user-selected application icon to a first area of the two display areas comprises:
moving the application icon selected by the user to the first area of the two display areas, and enlarging the first area proportionally.
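Illustratively, the equal-scale enlargement of claim 9 multiplies both dimensions of the first area by the same factor, so the area keeps its aspect ratio. A Kotlin sketch; the AreaSize type and the 1.25 factor are assumptions:

```kotlin
// Proportional (equal-scale) enlargement of the first display area.
data class AreaSize(val width: Float, val height: Float)

fun enlargeProportionally(area: AreaSize, factor: Float = 1.25f): AreaSize =
    AreaSize(area.width * factor, area.height * factor)
```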
10. The method of claim 1, wherein after the second area displays the opened application interface, the method further comprises:
receiving a designated operation instruction sent by the user in the second area by touching the human-computer interaction interface;
displaying the opened application interface on the whole human-computer interaction interface; or
displaying the opened application interface in the second area while displaying the remaining application icons in the first area.
CN202210377010.1A 2022-04-12 2022-04-12 Display control method based on man-machine interaction Active CN114895820B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210377010.1A CN114895820B (en) 2022-04-12 2022-04-12 Display control method based on man-machine interaction

Publications (2)

Publication Number Publication Date
CN114895820A true CN114895820A (en) 2022-08-12
CN114895820B CN114895820B (en) 2024-04-19

Family

ID=82717199

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210377010.1A Active CN114895820B (en) 2022-04-12 2022-04-12 Display control method based on man-machine interaction

Country Status (1)

Country Link
CN (1) CN114895820B (en)

Patent Citations (18)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101630224A (en) * 2009-08-14 2010-01-20 宇龙计算机通信科技(深圳)有限公司 Method and system for processing control icons on interface and touch terminal
US20150089418A1 (en) * 2012-07-18 2015-03-26 Huawei Device Co., Ltd. Method for managing icon on user interface, and touch-control device
CN103049199A (en) * 2012-12-14 2013-04-17 中兴通讯股份有限公司 Touch screen terminal, control device and working method of touch screen terminal
CN103324435A (en) * 2013-05-24 2013-09-25 华为技术有限公司 Multi-screen display method and device and electronic device thereof
CN106164857A (en) * 2014-04-01 2016-11-23 微软技术许可有限责任公司 Scalable user interface shows
WO2015186069A2 (en) * 2014-06-03 2015-12-10 Realitygate (Pty) Ltd Display and interaction method in a user interface
CN104536660A (en) * 2014-12-16 2015-04-22 小米科技有限责任公司 Interface displaying method and device
CN106484283A (en) * 2016-09-19 2017-03-08 广东欧珀移动通信有限公司 A kind of display control method and mobile terminal
US20180335939A1 (en) * 2017-05-16 2018-11-22 Apple Inc. Devices, Methods, and Graphical User Interfaces for Navigating Between User Interfaces and Interacting with Control Objects
US20200183549A1 (en) * 2017-06-16 2020-06-11 Beijing Xiaomi Mobile Software Co., Ltd. Application icon moving method and apparatus, terminal and storage medium
CN109960443A (en) * 2017-12-22 2019-07-02 华为终端有限公司 A kind of display application drawing calibration method and terminal device
US20200333944A1 (en) * 2017-12-28 2020-10-22 Huawei Technologies Co., Ltd. Icon management method and apparatus
CN109597546A (en) * 2018-11-30 2019-04-09 维沃移动通信有限公司 A kind of icon processing method and terminal device
CN109697003A (en) * 2018-12-11 2019-04-30 广州市久邦数码科技有限公司 A kind of dynamic desktop background display method and mobile terminal
CN112068907A (en) * 2019-05-25 2020-12-11 华为技术有限公司 Interface display method and electronic equipment
US20210011609A1 (en) * 2019-07-12 2021-01-14 Qingdao Hisense Media Networks Ltd. Method for displaying user interface and display device
US20210400152A1 (en) * 2020-06-17 2021-12-23 Konica Minolta, Inc. Image display device, image forming apparatus, and display position changing method
CN111831182A (en) * 2020-07-09 2020-10-27 维沃移动通信有限公司 Application icon control method and device and electronic equipment

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
XU Yanjin: "Exploration of Touch-Screen User Interface Design", Manufacturing Automation (制造业自动化), no. 08, 25 April 2012 (2012-04-25), pages 132-134 *
QIN Jingyan; CHEN Shan: "Analysis of Interaction Design Methods for Touch-Screen Smartphones", Packaging Engineering (包装工程), no. 22, 20 November 2010 (2010-11-20), pages 30-32 *

Also Published As

Publication number Publication date
CN114895820B (en) 2024-04-19

Similar Documents

Publication Publication Date Title
KR101802760B1 (en) Mobile terminal and method for controlling thereof
KR101526995B1 (en) Mobile terminal and method for controlling display thereof
EP2177976B1 (en) Mobile terminal with image projection
KR101788051B1 (en) Mobile terminal and method for controlling thereof
KR101520689B1 (en) a mobile telecommunication device and a method of scrolling a screen using the same
KR101527014B1 (en) Mobile terminal and method for controlling display thereof
KR101078929B1 (en) Terminal and internet-using method thereof
KR101504210B1 (en) Terminal and method for controlling the same
KR101853057B1 (en) Mobile Terminal And Method Of Controlling The Same
CN111680521A (en) Translation processing method and device and translation processing device
KR101878141B1 (en) Mobile terminal and method for controlling thereof
CN111381737B (en) Dock display method and device and storage medium
KR20100053093A (en) Mobile terminal and method for inputting instructions thereto
CN111522498A (en) Touch response method and device and storage medium
CN111092971A (en) Display method and device for displaying
CN114895820B (en) Display control method based on man-machine interaction
CN106293405B (en) Page moving method and device
KR101559778B1 (en) Mobile terminal and method for controlling the same
CN114900582A (en) Display control method and system based on human-computer interaction
KR101904938B1 (en) Mobile terminal and method for controlling thereof
KR101542386B1 (en) Mobile terminal with an image projector and method for controlling the same
KR101527020B1 (en) Mobile terminal and method for controlling the same
KR20110084624A (en) Mobile terminal and communication control method thererof
KR20110134617A (en) Mobile terminal and method for managing list thereof
CN114816161A (en) Icon display method and system

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
TA01 Transfer of patent application right

Effective date of registration: 20240325

Address after: Room 605, Floor 6, Liuwu Honeycomb+Innovation Center, Building 1, International Headquarters City, Liuwu New District, Lhasa, Xizang Autonomous Region, 850000

Applicant after: Tibet Tenghu Technology Development Co.,Ltd.

Country or region after: China

Address before: 518130 521, Mintai building, Minkang intersection, Minzhi street, Longhua New District, Shenzhen, Guangdong Province

Applicant before: Shenzhen zhongtiandi Network Communication Technology Co.,Ltd.

Country or region before: China

GR01 Patent grant