CN107861664B - Display control method and device, storage medium and processor - Google Patents

Display control method and device, storage medium and processor

Info

Publication number
CN107861664B
CN107861664B (application CN201711098147.9A)
Authority
CN
China
Prior art keywords
face
polyhedron
display control
control object
selection operation
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201711098147.9A
Other languages
Chinese (zh)
Other versions
CN107861664A (en)
Inventor
古祁琦
李瑞恒
郑贤钢
李润皋
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Netease Hangzhou Network Co Ltd
Original Assignee
Netease Hangzhou Network Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Netease Hangzhou Network Co Ltd filed Critical Netease Hangzhou Network Co Ltd
Priority to CN201711098147.9A priority Critical patent/CN107861664B/en
Publication of CN107861664A publication Critical patent/CN107861664A/en
Application granted granted Critical
Publication of CN107861664B publication Critical patent/CN107861664B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • G06F 3/0481: Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F 3/04815: Interaction with a metaphor-based environment or interaction object displayed as three-dimensional, e.g. changing the user viewpoint with respect to the environment or object
    • G06F 3/04886: Interaction techniques based on graphical user interfaces [GUI] using a touch-screen or digitiser, e.g. input of commands through traced gestures, by partitioning the display area of the touch-screen or the surface of the digitising tablet into independently controllable areas, e.g. virtual keyboards or menus

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The invention discloses a display control method and device, a storage medium, and a processor. The method comprises: detecting a selection operation acting on a display control object in a display screen, where the display control object is a polyhedron in the displayed picture and at least two faces of the polyhedron correspond to different user interfaces (UIs); determining a first face of the polyhedron according to the selection operation; and opening a first UI corresponding to the first face. The invention solves the technical problem that switching between user interfaces is cumbersome in the related art.

Description

Display control method and device, storage medium and processor
Technical Field
The invention relates to the field of computers, in particular to a display control method and device, a storage medium and a processor.
Background
Games contain a large number of functions that must be reached through entry buttons. Most current games arrange these buttons in a tiled or drawer-style layout. Moreover, the second-level interfaces opened by two different buttons rarely allow jumping between each other; switching is possible only by closing one interface and then opening the other. Fig. 1 is a schematic diagram of a related-art layout of buttons in a game, such as a tiled or drawer-style layout.
This button layout has the following drawbacks:
(1) tiled buttons take up too much space, making the interface look crowded;
(2) a drawer layout is no different from tiled buttons once expanded; while collapsed it looks clean, but it must first be expanded before use, which adds an extra operation;
(3) after opening an interface with one button, a user who wants to open another interface must first close the current one and then open the new one with a different button; that is, switching interfaces is cumbersome. Fig. 2 is a schematic diagram of interface switching in a related-art game.
In view of the above problems, no effective solution has been proposed.
Disclosure of Invention
Embodiments of the invention provide a display control method and device, a storage medium, and a processor, so as to at least solve the technical problem that switching between user interfaces is cumbersome in the related art.
According to an aspect of an embodiment of the present invention, there is provided a display control method comprising: detecting a selection operation acting on a display control object in a display screen, where the display control object is a polyhedron in the displayed picture and at least two faces of the polyhedron correspond to different user interfaces (UIs); determining a first face of the polyhedron according to the selection operation; and opening a first UI corresponding to the first face.
Optionally, determining the first face of the polyhedron according to the selection operation comprises: controlling the display control object to rotate according to the selection operation, and determining the face of the polyhedron located at a preset position as the first face.
Optionally, opening the first UI corresponding to the first face includes: detecting a trigger operation acting on the first surface; and when the trigger operation is detected, opening a first UI corresponding to the first face.
Optionally, in a case where the first UI is opened, the first face is used to identify the currently displayed interface as the first UI.
Optionally, determining a face of the polyhedron at the preset position as the first face includes: and under the condition that two surfaces at the preset position in the polyhedron are determined, determining a surface which is closest to a screen of the terminal where the display control object is located in the two surfaces as a first surface.
Optionally, at least two faces of the polyhedron have different transparencies, and the degree of transparency of a face is related to the rotation angle of that face of the polyhedron.
According to an aspect of an embodiment of the present invention, there is provided a display control apparatus including: a detection module configured to detect a selection operation acting on a display control object in a display screen; the display control object is a polyhedron in a display picture, and at least two surfaces in the polyhedron correspond to different User Interfaces (UIs) respectively; the determining module is used for determining a first surface of the polyhedron according to the selection operation; and the opening module is used for opening the first UI corresponding to the first surface.
Optionally, the determining module is further configured to control the display control object to rotate according to the selection operation, and determine that a surface in the preset position in the polyhedron is the first surface.
Optionally, the opening module comprises: a detection unit configured to detect a trigger operation applied to the first surface; and the opening unit is used for opening the first UI corresponding to the first surface when the trigger operation is detected.
Optionally, the determining module is further configured to determine, when it is determined that there are two faces in the polyhedron at the preset position, a face closest to a screen on which the display control object is located in the two faces as the first face.
According to an aspect of the embodiments of the present invention, there is provided a storage medium including a stored program, wherein when the program is executed, a device on which the storage medium is located is controlled to execute any one of the above-described display control methods.
According to an aspect of the embodiments of the present invention, there is provided a processor configured to execute a program, where the program executes to perform the display control method according to any one of the above.
According to an aspect of an embodiment of the present invention, there is provided a terminal including: one or more processors, a memory, a display device, and one or more programs, wherein the one or more programs are stored in the memory and configured to be executed by the one or more processors, the one or more programs including instructions for performing the display control method of any of the above.
In the embodiments of the present invention, the display control object in the display picture is set to be a polyhedron in which at least two faces correspond to different user interfaces (UIs); a first face of the polyhedron is determined by a selection operation performed on the display control object, and the first UI corresponding to the first face is then opened. In other words, multiple buttons controlling different game interfaces are integrated as the faces of a single multi-faced entity. This saves space in the game interface and solves the technical problem that switching between user interfaces is cumbersome in the related art.
Drawings
The accompanying drawings, which are included to provide a further understanding of the invention and are incorporated in and constitute a part of this application, illustrate embodiment(s) of the invention and together with the description serve to explain the invention without limiting the invention. In the drawings:
FIG. 1 is a schematic diagram of a related art layout of buttons in a game, such as a tiled or drawer-type layout;
FIG. 2 is a schematic diagram of an in-game switch interface of the related art;
FIG. 3 is a flow chart illustrating a display control method according to an embodiment of the present invention;
FIG. 4 is a schematic diagram of a game control provided in accordance with a preferred embodiment of the present invention;
FIG. 5 is a schematic diagram of an opening interface and a closing interface using game controls provided in accordance with a preferred embodiment of the present invention;
fig. 6 is a block diagram of a display control apparatus according to an embodiment of the present invention.
Detailed Description
In order to make the technical solutions of the present invention better understood, the technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are only a part of the embodiments of the present invention, and not all of the embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
It should be noted that the terms "first," "second," and the like in the description and claims of the present invention and in the drawings described above are used for distinguishing between similar elements and not necessarily for describing a particular sequential or chronological order. It is to be understood that the data so used is interchangeable under appropriate circumstances such that the embodiments of the invention described herein are capable of operation in sequences other than those illustrated or described herein. Furthermore, the terms "comprises," "comprising," and "having," and any variations thereof, are intended to cover a non-exclusive inclusion, such that a process, method, system, article, or apparatus that comprises a list of steps or elements is not necessarily limited to those steps or elements expressly listed, but may include other steps or elements not expressly listed or inherent to such process, method, article, or apparatus.
In accordance with an embodiment of the present invention, there is provided a method embodiment of a display control method, it should be noted that the steps illustrated in the flowchart of the accompanying drawings may be performed in a computer system such as a set of computer executable instructions, and that while a logical order is illustrated in the flowchart, in some cases the steps illustrated or described may be performed in an order different than that presented herein.
Fig. 3 is a schematic flowchart of a display control method according to an embodiment of the present invention. As shown in fig. 3, the method may be implemented by means of a game control and includes the following steps:
step S302, detecting a selection operation acting on a display control object in a display screen; the display control object is a polyhedron in a display picture, and at least two surfaces in the polyhedron correspond to different User Interfaces (UIs) respectively;
step S304, determining a first surface of the polyhedron according to the selection operation;
step S306, a first UI corresponding to the first face is opened.
Through the above steps, the display control object in the display picture is set to be a polyhedron, at least two faces of which correspond to different user interfaces (UIs); the first face of the polyhedron is determined by a selection operation acting on the display control object, and the first UI corresponding to the first face is then opened. In other words, multiple buttons controlling different game interfaces are integrated as the faces of a single multi-faced entity, which saves space in the game interface and solves the technical problem that switching between user interfaces is cumbersome in the related art.
The selection operation may be a drag operation or a continuous click operation, but is not limited thereto.
It should be noted that, one surface of the polyhedron may be a sub-button, and each sub-button corresponds to a UI, but the invention is not limited thereto. It should be noted that the UI may be a game interface, but is not limited thereto.
In an embodiment of the present invention, "at least two faces of the polyhedron correspond to different user interfaces (UIs)" may be realised as: different faces of the polyhedron correspond to different UIs, but this is not limiting.
It should be noted that the selection operation may control the display control object to rotate, and the rotation is what determines the face of the polyhedron. Thus, in an embodiment of the present invention, step S304 may be expressed as: controlling the display control object to rotate according to the selection operation, and determining the face of the polyhedron at a preset position as the first face.
The display control object may be a polyhedron that can rotate in any direction; the actual direction of rotation depends on the direction of the selection operation, but is not limited thereto.
It should be noted that the switching of the UI may be realized by controlling the display control object to rotate through the selection operation, for example, after step S306, the method may further include: detecting a second selection operation acting on the display control object, wherein the second selection operation controls the display control object to rotate from a first surface to a second surface of the display control object; and switching the currently displayed UI from the first UI to a second UI corresponding to the second side. By the mode, the UI switching operation is simplified, and the UI switching efficiency is improved.
It should be noted that when the first UI is open and the second selection operation rotates the display control object from the first face to the second face, the display automatically switches to the second UI. When the first UI is closed, rotating the display control object from the first face to the second face merely puts the second face in an activated state; only when a trigger operation is detected on the second face is the second UI opened, i.e., switched to from the first UI.
In addition, the second selection operation may rotate the display control object from the first face to the second face in at least one of the following ways. In the first way, the start position of the second selection operation is on the first face and its end position is on the second face. In the second way, the start position of the second selection operation is on the first face and its end position is on a third face of the polyhedron, but owing to the rotational inertia imparted by the second selection operation, the rotation of the display control object ends on the second face.
It should be noted that when the rotational inertia imparted by the second selection operation leaves the display control object resting on an edge between two faces of the polyhedron, the face forming the smallest angle with the normal of the terminal screen may be taken as the second face, but this is not limiting. For example, if the rotation ends on the edge between face 1 and face 2 of the polyhedron, and the angle between face 2 and the screen normal is smaller than the angle between face 1 and the screen normal, then face 2 is the second face.
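The edge tie-break described above (choosing the face whose normal makes the smallest angle with the screen normal) can be sketched as follows; the function name and the use of unit normals are illustrative assumptions, not part of the patent:

```python
import math

def pick_front_face(face_normals, screen_normal=(0.0, 0.0, 1.0)):
    """Return the index of the face whose unit normal makes the smallest
    angle with the screen normal, i.e. the face most directly facing the
    viewer (hypothetical helper illustrating the tie-break)."""
    def angle(n):
        dot = sum(a * b for a, b in zip(n, screen_normal))
        return math.acos(max(-1.0, min(1.0, dot)))  # clamp to acos domain
    return min(range(len(face_normals)), key=lambda i: angle(face_normals[i]))
```

For two faces meeting at an edge, the call returns the index of the face tilted less away from the viewer, matching the face-2 example above.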
The second selection operation may be a drag operation or a continuous click operation, but is not limited thereto.
It should be noted that there may be two faces of the polyhedron at the preset position. Thus, in an embodiment of the present invention, determining the face at the preset position as the first face may be expressed as: when two faces are found at the preset position, the one closer to the screen of the terminal on which the display control object is located is determined to be the first face.
In an embodiment of the present invention, the step S306 may be represented as: detecting a trigger operation acting on the first surface; and when the trigger operation is detected, opening a first UI corresponding to the first face.
It should be noted that the trigger operation may be a click operation such as a single click or a double click operation, but is not limited thereto.
In one embodiment of the present invention, when the first UI is open, the first face is used to identify the currently displayed interface as the first UI. That is, when the first UI is open, the face corresponding to the first UI functions as a tab.
It should be noted that the first face may carry an identifier indicating that the currently displayed UI is the first UI. When the state of the identifier changes, it indicates that the state of the UI corresponding to the marked face has changed. For example, the identifier may be rendered in black or white: white may indicate that the first UI is open, i.e., that the currently displayed UI is the first UI, while black may indicate that the first UI is closed. The invention is not limited to this scheme.
It should be noted that the first face acting as a tab and the first face carrying an identifier are merely examples; the same applies to the other faces of the polyhedron, i.e., other faces may also act as tabs, implemented in the same way as the first face, and the details are not repeated here.
In one embodiment of the present invention, when the first UI corresponding to the first face is open and the trigger operation described above is detected again, the first UI is closed.
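The tab and identifier behaviour just described (white = open, black = closed, and a second trigger closing the UI) can be sketched as a small state holder; the class name and colour strings are illustrative assumptions:

```python
class FaceIndicator:
    """Hypothetical per-face marker whose colour tracks the open/closed
    state of the UI bound to that face (white = open, black = closed,
    as in the example in the text)."""

    def __init__(self):
        self.ui_open = False  # the bound UI starts closed

    @property
    def color(self):
        return "white" if self.ui_open else "black"

    def toggle(self):
        """A trigger operation opens a closed UI and closes an open one."""
        self.ui_open = not self.ui_open
        return self.ui_open
```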
Each face of the polyhedron may be a polygon, for example a triangle, but is not limited thereto.
To give the polyhedron a 3D appearance when displayed, in an embodiment of the present invention several faces of the polyhedron may be given a certain transparency, with at least two faces having different transparencies; the degree of transparency of a face is related to its rotation angle. Each face of the polyhedron corresponds to a display area, and the size of the display area of a given face is determined by the area of that face's maximum response area and the face's rotation angle, where the maximum response area is the area of the face when its normal is perpendicular to the screen of the terminal on which the display control object is shown.
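One plausible realisation of "display area and transparency related to the rotation angle" is a cosine falloff, sketched below; the cosine factor and the minimum alpha are our assumptions, since the patent does not specify the exact relationship:

```python
import math

def projected_area(max_area, angle_rad):
    """Screen-space area of a face rotated angle_rad away from facing
    the screen: the full maximum response area at 0 rad, shrinking to
    nothing when the face is edge-on at pi/2 (cosine factor assumed)."""
    return max_area * abs(math.cos(angle_rad))

def face_alpha(angle_rad, min_alpha=0.2):
    """Transparency sketch: fully opaque when facing the screen,
    fading toward min_alpha as the face turns away (assumed scheme)."""
    return min_alpha + (1.0 - min_alpha) * abs(math.cos(angle_rad))
```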
The display control object may be a button or a key, but is not limited thereto, and specifically may be a game button, a game control, or a game key, but is not limited thereto.
The following description will be given taking the above display control object as a game control as an example.
A preferred game control is provided in an embodiment of the present invention. Fig. 4 is a schematic diagram of a game control provided according to a preferred embodiment of the present invention. As shown in fig. 4, the game button is a polyhedral button that can rotate in any direction. In all types of interfaces (including touch screen/AR/VR), multiple button entries (corresponding to the sub-buttons) are gathered into one entity; different faces of the entity correspond to different buttons (the sub-buttons), and the player can rotate the entity in any direction. When the button on the front face (the sub-button of the face whose normal is perpendicular to the terminal screen, i.e., the first face) is in an activated state and the entity is clicked (corresponding to detecting the trigger operation), the interface corresponding to that face (the first UI) is opened, as shown in fig. 5; once opened, the entity simultaneously acts as a tab bar. When the entity is clicked again, the interface corresponding to that face is closed, as shown in fig. 5. It should be noted that the interface can also be switched by scrolling or dragging the entity (the switch occurs when scrolling ends).
It should be noted that the game button is a physical model divided into several sub-buttons (corresponding to the sub-buttons above), each of which is in fact an ordinary touchable button. By default the sub-buttons are displayed facing the screen, at the topmost touch priority of the game interface. Because a 3D model must be displayed, a 3D rotation must be applied to each sub-button: its vertices are multiplied in screen space by a 3D rotation matrix about the centre point, i.e., the vertex positions of a sub-button are transformed by the camera's perspective-projection matrix together with the rotation matrix, which produces the 3D visual effect.
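A minimal sketch of the vertex transformation just described (rotation about the centre followed by perspective projection), assuming a simple pinhole camera on the Z axis; all names and the single-axis rotation are illustrative simplifications:

```python
import math

def rotate_y(vertex, angle):
    """Rotate a 3-D vertex (x, y, z) about the Y axis through the
    model centre (one component of the full 3D rotation matrix)."""
    x, y, z = vertex
    c, s = math.cos(angle), math.sin(angle)
    return (c * x + s * z, y, -s * x + c * z)

def project(vertex, camera_distance):
    """Perspective-project a rotated vertex onto the screen plane,
    with the camera camera_distance in front of the model centre."""
    x, y, z = vertex
    scale = camera_distance / (camera_distance - z)
    return (x * scale, y * scale)
```

A sub-button's screen-space outline would be obtained by applying `rotate_y` (and its Z-axis counterpart) to each vertex and then `project`.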
It should be noted that, since each sub-button is a non-rectangular button, the vertex information of each button must be set so that the button can correctly detect its own response area (the display area described above), i.e., the front-facing response area of the sub-button (facing the screen) transformed by the 3D rotation angle.
It should be noted that the 3D rotation angle may be obtained as follows: a default rotation angle of the sub-button is set (zero by default); the movement distance of the user's touch is then recorded through the OnTouchMove event and converted into an actual 3D rotation angle relative to the centre position of the sub-button.
When the game control is displayed, because a 3D effect is required, the displayed transparency can be set according to the current rotation angle of each sub-button. The selected sub-button is then determined from the player's click area, and the click events of back-facing buttons are swallowed based on the face's angle relative to the positive X axis. Usually two sub-buttons correspond to the clicked area; the face whose X coordinate is positive (i.e., the face closer to the screen) is the one that responds.
The scrolling of the game-button model may be driven by the player's drag touch events: the sliding distance is acquired, converted into a sliding angle, and decomposed into rotation angles in three directions. Rotational inertia may be modelled as a gradual decay of the player's sliding input. Meanwhile, the face with the smallest angle to the X axis can be selected and reset to the positive X direction for display, producing a snap (adsorption) effect.
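The snap ("adsorption") step can be sketched as picking the face whose unit normal is closest to the +X axis (the axis pointing at the player here) and computing the residual rotations needed to bring it exactly front-facing; the atan2 decomposition is an assumption about how this might be implemented:

```python
import math

def snap_to_front(face_normals):
    """Hypothetical snap helper: return the index of the face whose
    unit normal is closest to +X, plus the Y- and Z-axis rotations
    that would align it with the positive X direction."""
    # For unit normals, the largest x component means the smallest
    # angle to the +X axis.
    best = max(range(len(face_normals)), key=lambda i: face_normals[i][0])
    nx, ny, nz = face_normals[best]
    yaw = math.atan2(nz, nx)    # rotation about Y to zero the z component
    pitch = math.atan2(ny, nx)  # rotation about Z to zero the y component
    return best, yaw, pitch
```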
It should be noted that, when the X axis points toward the player, the sliding distance can be converted into sliding angles as follows. The Y and Z axes of the 3D world are parallel to the screen. Suppose the player's touch point slides from (x1, y1) to (x2, y2), and the distance between the model's centre point and the screen is R; then, relative to its original position, the model rotates about the Y axis by α = arctan((x2 - x1)/R) and about the Z axis by β = arctan((y2 - y1)/R).
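The formulas above translate directly into code; `atan2` is used instead of a bare division so the sketch also tolerates R approaching zero:

```python
import math

def slide_to_angles(p1, p2, r):
    """Convert a touch slide from p1 = (x1, y1) to p2 = (x2, y2) in
    screen coordinates into rotation angles about the Y and Z axes,
    for a model whose centre sits at distance r from the screen:
    alpha = arctan((x2 - x1) / r), beta = arctan((y2 - y1) / r)."""
    (x1, y1), (x2, y2) = p1, p2
    alpha = math.atan2(x2 - x1, r)  # rotation about the Y axis
    beta = math.atan2(y2 - y1, r)   # rotation about the Z axis
    return alpha, beta
```

For example, a horizontal slide of 10 pixels with the model centre 10 units behind the screen yields a 45° rotation about the Y axis and none about the Z axis.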
It should be noted that the executing subject of the above method may be a terminal, such as a mobile terminal, a computer terminal, but is not limited thereto.
According to an aspect of an embodiment of the present invention, there is provided a display control apparatus, and fig. 6 is a block diagram of a structure of the display control apparatus according to the embodiment of the present invention, as shown in fig. 6, the apparatus includes:
a detection module 62 configured to detect a selection operation acting on a display control object in the display screen; the display control object is a polyhedron in a display picture, and at least two surfaces in the polyhedron correspond to different User Interfaces (UIs) respectively;
a determining module 64, connected to the detecting module 62, for determining a first face of the polyhedron according to the selection operation;
and an opening module 66 connected to the determining module 64 for opening the first UI corresponding to the first side.
Through this device, the display control object in the display picture is set to be a polyhedron, at least two faces of which correspond to different user interfaces (UIs); the first face of the polyhedron is determined by a selection operation acting on the display control object, and the first UI corresponding to the first face is then opened. In other words, multiple buttons controlling different game interfaces are integrated as the faces of a single multi-faced entity, which saves space in the game interface and solves the technical problem that switching between user interfaces is cumbersome in the related art.
It should be noted that the determining module 64 may be further configured to control the display control object to rotate according to the selection operation, and determine a surface in the preset position in the polyhedron to be the first surface.
It should be noted that the determining module 64 may be further configured to determine, when two faces of the polyhedron are at the preset position, the face closer to the screen on which the display control object is displayed as the first face.
In an embodiment of the present invention, the opening module 66 includes: a detection unit configured to detect a trigger operation acting on the first face; and an opening unit, connected to the detection unit, configured to open the first UI corresponding to the first face when the trigger operation is detected.
The detection module 62 is further configured to detect a second selection operation acting on the display control object, where the second selection operation controls the display control object to rotate from the first face to a second face of the display control object. The apparatus further includes a switching module, connected to the detection module 62, configured to switch the currently displayed UI from the first UI to a second UI corresponding to the second face. In this way, the UI switching operation is simplified and switching efficiency is improved.
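The detect/determine/open/switch flow of the modules above can be sketched as a minimal Python class. The class name, face names, and UI identifiers below are assumptions chosen for illustration, not part of the patent.

```python
class PolyhedronUIController:
    """Sketch of the flow of the apparatus described above: a selection
    operation rotates a face to the preset position (determining module),
    and a trigger operation opens that face's UI (opening module); a
    second selection followed by a trigger switches UIs directly.
    """

    def __init__(self, face_to_ui, front_face):
        self.face_to_ui = face_to_ui  # maps each face to its UI
        self.front_face = front_face  # face currently at the preset position
        self.current_ui = None        # UI currently displayed, if any

    def on_selection(self, target_face):
        # The selection operation rotates the polyhedron so that the
        # chosen face moves to the preset (front) position.
        if target_face not in self.face_to_ui:
            raise ValueError("unknown face: " + target_face)
        self.front_face = target_face

    def on_trigger(self):
        # A trigger operation on the front face opens its UI. When a UI
        # is already open, this switches straight to the new face's UI
        # without returning to a main menu.
        self.current_ui = self.face_to_ui[self.front_face]
        return self.current_ui
```

For example, with faces mapped as {"front": "inventory", "right": "map"}, selecting "right" and triggering switches the displayed UI directly from "inventory" to "map", illustrating the simplified switching described above.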
It should be noted that the above apparatus may be located in a terminal, such as a mobile terminal or a computer terminal, but is not limited thereto.
It should be noted that the above modules may be implemented by software or by hardware. In the latter case, the implementation may be, but is not limited to, the following: the modules are all located in the same processor; alternatively, the modules are located in different processors in any combination.
According to an embodiment of the present invention, there is also provided a storage medium including a stored program, wherein, when the program runs, the device on which the storage medium is located is controlled to execute the display control method described above. The storage medium may include, but is not limited to, various media capable of storing program code, such as a USB flash drive, a read-only memory (ROM), a random access memory (RAM), a removable hard disk, a magnetic disk, or an optical disk.
According to an embodiment of the present invention, there is also provided a processor configured to run a program, wherein the program, when running, executes the display control method described above. The processor may include, but is not limited to, a processing device such as a microcontroller (MCU) or a programmable logic device (FPGA).
According to an embodiment of the present invention, there is also provided a terminal including: one or more processors, a memory, a display device, and one or more programs, wherein the one or more programs are stored in the memory and configured to be executed by the one or more processors, the programs comprising instructions for performing the above-described display control method. In some embodiments, the terminal may be a device such as a smartphone (e.g., an Android or iOS phone), a tablet computer, a palmtop computer, or a mobile Internet device (MID). The display device may be a touch-screen liquid crystal display (LCD) that enables a user to interact with the user interface of the terminal. In addition, the terminal may further include an input/output (I/O) interface, a universal serial bus (USB) port, a network interface, a power supply, and/or a camera.
The above-mentioned serial numbers of the embodiments of the present invention are merely for description and do not represent the merits of the embodiments.
In the above embodiments of the present invention, the descriptions of the respective embodiments have respective emphasis, and for parts that are not described in detail in a certain embodiment, reference may be made to related descriptions of other embodiments.
In the embodiments provided in the present application, it should be understood that the disclosed technology can be implemented in other ways. The above-described embodiments of the apparatus are merely illustrative, and for example, the division of the units may be a logical division, and in actual implementation, there may be another division, for example, multiple units or components may be combined or integrated into another system, or some features may be omitted, or not executed. In addition, the shown or discussed mutual coupling or direct coupling or communication connection may be an indirect coupling or communication connection through some interfaces, units or modules, and may be in an electrical or other form.
The units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of units. Some or all of the units can be selected according to actual needs to achieve the purpose of the solution of the embodiment.
In addition, functional units in the embodiments of the present invention may be integrated into one processing unit, or each unit may exist alone physically, or two or more units are integrated into one unit. The integrated unit can be realized in a form of hardware, and can also be realized in a form of a software functional unit.
The integrated unit, if implemented in the form of a software functional unit and sold or used as a stand-alone product, may be stored in a computer-readable storage medium. Based on such understanding, the technical solution of the present invention may be embodied in the form of a software product, which is stored in a storage medium and includes instructions for causing a computer device (which may be a personal computer, a server, or a network device) to execute all or part of the steps of the method according to the embodiments of the present invention. The aforementioned storage medium includes various media capable of storing program code, such as a USB flash drive, a read-only memory (ROM), a random access memory (RAM), a removable hard disk, a magnetic disk, or an optical disk.
The foregoing is only a preferred embodiment of the present invention. It should be noted that, for those skilled in the art, various modifications and refinements can be made without departing from the principle of the present invention, and these modifications and refinements shall also fall within the protection scope of the present invention.

Claims (10)

1. A display control method, characterized in that the method comprises:
detecting a selection operation acting on a display control object in a display screen, wherein the display control object is a polyhedron in the display picture, at least two faces of the polyhedron correspond to different user interfaces (UIs), the at least two faces of the polyhedron differ in transparency, and the transparency of a face is related to the rotation angle of that face of the polyhedron;
determining a first face of the polyhedron according to the selection operation, wherein an identifier is marked on the first face, the identifier indicating the state of a first UI corresponding to the first face;
opening a first UI corresponding to the first face;
wherein the opening of the first UI corresponding to the first face comprises:
detecting a trigger operation acting on the first surface;
and when the trigger operation is detected, opening a first UI corresponding to the first face.
2. The method according to claim 1, wherein said determining a first face of said polyhedron according to said selecting operation comprises:
controlling the display control object to rotate according to the selection operation, and determining the face at a preset position in the polyhedron as the first face.
3. The method of claim 1, wherein, in a case where the first UI is opened, the first face is used to identify that the currently displayed interface is the first UI.
4. The method of claim 2, wherein determining a face of the polyhedron at a preset position as the first face comprises:
in a case where two faces of the polyhedron are at the preset position, determining, of the two faces, the face closer to the screen of the terminal on which the display control object is located as the first face.
5. A display control apparatus, characterized in that the apparatus comprises:
a detection module, configured to detect a selection operation acting on a display control object in a display screen, wherein the display control object is a polyhedron in the display picture, at least two faces of the polyhedron correspond to different user interfaces (UIs), the at least two faces of the polyhedron differ in transparency, and the transparency of a face is related to the rotation angle of that face of the polyhedron;
a determining module, configured to determine a first face of the polyhedron according to the selection operation, wherein an identifier is marked on the first face, the identifier indicating the state of a first UI corresponding to the first face;
an opening module, configured to open the first UI corresponding to the first face;
wherein the opening module comprises:
a detection unit configured to detect a trigger operation applied to the first surface;
and the opening unit is used for opening the first UI corresponding to the first surface when the trigger operation is detected.
6. The apparatus according to claim 5, wherein the determining module is further configured to control the display control object to rotate according to the selection operation, and to determine the face at a preset position in the polyhedron as the first face.
7. The apparatus according to claim 6, wherein the determining module is further configured to determine, in a case where two faces of the polyhedron are at the preset position, the face closer to the screen of the terminal on which the display control object is located as the first face.
8. A storage medium characterized by comprising a stored program, wherein an apparatus in which the storage medium is located is controlled to execute the display control method according to any one of claims 1 to 4 when the program is executed.
9. A processor, characterized in that the processor is configured to execute a program, wherein the program executes the display control method according to any one of claims 1 to 4.
10. A terminal, comprising: one or more processors, a memory, a display device, and one or more programs, wherein the one or more programs are stored in the memory and configured to be executed by the one or more processors, the one or more programs comprising instructions for performing the display control method of any of claims 1-4.
CN201711098147.9A 2017-11-09 2017-11-09 Display control method and device, storage medium and processor Active CN107861664B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201711098147.9A CN107861664B (en) 2017-11-09 2017-11-09 Display control method and device, storage medium and processor

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201711098147.9A CN107861664B (en) 2017-11-09 2017-11-09 Display control method and device, storage medium and processor

Publications (2)

Publication Number Publication Date
CN107861664A CN107861664A (en) 2018-03-30
CN107861664B true CN107861664B (en) 2020-09-11

Family

ID=61699997

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201711098147.9A Active CN107861664B (en) 2017-11-09 2017-11-09 Display control method and device, storage medium and processor

Country Status (1)

Country Link
CN (1) CN107861664B (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110620805B (en) * 2018-12-29 2022-02-08 北京时光荏苒科技有限公司 Method and apparatus for generating information

Citations (4)

Publication number Priority date Publication date Assignee Title
CN103530018A (en) * 2013-09-27 2014-01-22 深圳天珑无线科技有限公司 Establishment method of widget interfaces in android operating system and mobile terminal
CN104142775A (en) * 2013-05-08 2014-11-12 中兴通讯股份有限公司 Mobile terminal and function item rapid operation implementing method thereof
CN106325722A (en) * 2015-06-19 2017-01-11 深圳创锐思科技有限公司 3D user interface interaction method based on touch terminal and the touch terminal
CN107148610A (en) * 2014-09-29 2017-09-08 三星电子株式会社 Subscriber terminal equipment and its method for controlling the subscriber terminal equipment

Family Cites Families (2)

Publication number Priority date Publication date Assignee Title
TW201619801A (en) * 2014-11-20 2016-06-01 富智康(香港)有限公司 Method and system for displaying files
KR20180020452A (en) * 2016-08-18 2018-02-28 엘지전자 주식회사 Terminal and method for controlling the same

Also Published As

Publication number Publication date
CN107861664A (en) 2018-03-30

Similar Documents

Publication Publication Date Title
EP2815299B1 (en) Thumbnail-image selection of applications
CN104718528B (en) Determine the method, apparatus and terminal device of the color of interface control
US10296140B2 (en) Information processing method for avoidance of a mis-touch and electronic device thereof
CN102197366B (en) Child window surfacing and management
CN107562316A (en) Method for showing interface, device and terminal
US20090178011A1 (en) Gesture movies
WO2014088471A2 (en) Action initiatiation in multi-face device
KR101863425B1 (en) Multiple displays for displaying workspaces
CN107596688B (en) Skill release control method and device, storage medium, processor and terminal
US20120167005A1 (en) Creating an immersive environment
WO2012166182A1 (en) Multi-application environment
EP2715503A1 (en) Multi-application environment
US9013509B2 (en) System and method for manipulating digital images on a computer display
US9495064B2 (en) Information processing method and electronic device
WO2018119584A1 (en) Interaction method and device for flexible display screen
US20190026004A1 (en) Three Dimensional Icons for Computer Applications
US10606447B2 (en) Method and apparatus for interface presentation, method and apparatus for user interface interaction, and computer readable storage medium
CN107577415A (en) Touch operation response method and device
WO2022127304A1 (en) Method and apparatus for adjusting interface display state, and device and storage medium
US9377944B2 (en) Information processing device, information processing method, and information processing program
CN107608550A (en) Touch operation response method and device
CN107608551A (en) Touch operation response method and device
US10698566B2 (en) Touch control based application launch
CN107861664B (en) Display control method and device, storage medium and processor
CN107479902B (en) Control processing method and device, storage medium, processor and terminal

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant