CN113093971B - Object display control method and device - Google Patents

Object display control method and device

Info

Publication number
CN113093971B
CN113093971B (application CN202110408955.0A)
Authority
CN
China
Prior art keywords
display
switching
gesture controlled
switching control
icon
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202110408955.0A
Other languages
Chinese (zh)
Other versions
CN113093971A (en)
Inventor
赵振涛
黄国锋
张浩阳
伍毅书
Current Assignee
Netease Hangzhou Network Co Ltd
Original Assignee
Netease Hangzhou Network Co Ltd
Priority date
Filing date
Publication date
Application filed by Netease Hangzhou Network Co Ltd
Priority: CN202110408955.0A
Publication of CN113093971A
Application granted
Publication of CN113093971B
Legal status: Active

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0484 Interaction techniques based on GUIs for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F3/0485 Scrolling or panning
    • G06F3/0481 Interaction techniques based on GUIs based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F3/04817 Interaction techniques based on GUIs using icons
    • G06F3/0487 Interaction techniques based on GUIs using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488 Interaction techniques based on GUIs using a touch-screen or digitiser, e.g. input of commands through traced gestures

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The present invention provides an object display control method and device in the field of computer technology. The method comprises: providing a first switching control on a graphical user interface; in response to a switching operation on the first switching control, switching the gesture-controlled target to a second display object; and in response to a sliding operation on the gesture-controlled target, controlling the gesture-controlled target to move in the virtual scene. Because object switching is handled by an independent switching control associated with the display objects, while movement of the gesture-controlled target is handled by sliding gestures, the possibility of gesture conflicts is reduced.

Description

Object display control method and device
Technical Field
The present invention relates to the field of computer technology, and in particular to an object display control method and device.
Background
Currently, switching the controlled target is mainly implemented by sliding left and right. On a mobile device, however, the left-right slide gesture is often already occupied by other functions, which causes one of the functions to fail. For example, if the switching function is bound to left-right sliding, the object's position can no longer be dragged by sliding, and a gesture conflict occurs.
Disclosure of Invention
The invention aims to provide an object display control method and device, intended to alleviate the technical problem in the prior art that gestures are prone to conflict.
In a first aspect, the present invention provides an object display control method. A graphical user interface is provided through a terminal; the content displayed in the graphical user interface includes at least one display object located in a virtual scene, and a first display object among the at least one display object is the gesture-controlled target. The method comprises the following steps:
providing a first switching control on the graphical user interface;
in response to a switching operation on the first switching control, switching the gesture-controlled target to a second display object;
and in response to a sliding operation on the gesture-controlled target, controlling the gesture-controlled target to move in the virtual scene.
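The three steps above can be sketched as a minimal illustration (all names are assumptions for illustration, not the patented implementation): switching is bound to a dedicated control, so a slide gesture is never overloaded with a switching meaning and only moves the current target.

```python
class DisplayController:
    """Minimal sketch of the claimed flow: one display object at a time is
    the gesture-controlled target; a dedicated switching control rebinds
    the target, and slide gestures only ever move the current target."""

    def __init__(self, display_objects):
        self.objects = list(display_objects)
        # The first display object starts as the gesture-controlled target.
        self.target = self.objects[0]

    def on_switch(self, new_target):
        # Switching operation on the first switching control.
        if any(obj is new_target for obj in self.objects):
            self.target = new_target

    def on_slide(self, touched, dx, dy):
        # A slide moves an object only if it is the current target.
        if touched is self.target:
            x, y = touched["pos"]
            touched["pos"] = (x + dx, y + dy)
```

Sliding on a non-target object is simply ignored here; the optional implementations below describe the variants where such a slide either switches the target or leaves the object in place.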
In an optional implementation, the display objects among the at least one display object other than the gesture-controlled target are determined to be non-gesture-controlled targets, and the non-gesture-controlled targets include a third display object. The method further comprises:
in response to a sliding operation on the third display object, switching the gesture-controlled target to the third display object.
In an optional implementation, the display objects among the at least one display object other than the gesture-controlled target are determined to be non-gesture-controlled targets, and the non-gesture-controlled targets include a third display object. The method further comprises:
in response to a sliding operation on the third display object, controlling the third display object to stay at its current position in the virtual scene.
In an optional implementation, controlling the gesture-controlled target to move in the virtual scene in response to a sliding operation on the gesture-controlled target includes:
in response to the sliding operation on the gesture-controlled target, determining a target position in the virtual scene according to the end touch point of the sliding operation;
and moving the gesture-controlled target from its corresponding placement position to the target position.
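As a sketch of this step (the linear screen-to-scene mapping is an assumption; a real 3D engine would project or raycast the touch point into the scene), the target position can be derived from the end touch point like this:

```python
def target_position_from_touch(end_touch, screen_size, scene_bounds):
    """Map the end touch point of a slide (in pixels) to a position in the
    virtual scene.  This stand-in linearly interpolates over a 2D bounding
    box; an actual engine would do a screen-to-scene projection."""
    sx, sy = end_touch
    w, h = screen_size
    (x0, y0), (x1, y1) = scene_bounds
    return (x0 + (x1 - x0) * sx / w, y0 + (y1 - y0) * sy / h)

def move_to_target(obj, end_touch, screen_size, scene_bounds):
    # Move the gesture-controlled target from its placement position to
    # the position determined by the end touch point of the slide.
    obj["pos"] = target_position_from_touch(end_touch, screen_size, scene_bounds)
```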
In an optional implementation, the content displayed by the graphical user interface further includes at least one display area located in the virtual scene;
each display area corresponds to one switching control and one placement position set;
each switching control includes at least one icon, and the icons in a switching control correspond one-to-one to the placement positions in the placement position set of the display area corresponding to that switching control;
each icon corresponds to a display state, and the display state of an icon indicates the state of the placement position corresponding to that icon;
the perspective and posture of an icon are consistent with those of the placement position corresponding to the icon;
the perspective and posture of a switching control are consistent with those of the display area corresponding to the switching control.
In an optional implementation, the correspondence between the display state of an icon and the state of the placement position corresponding to the icon includes one or more of the following:
the placement position corresponding to the gesture-controlled target, with a virtual object placed, corresponds to a first state;
the placement position corresponding to the gesture-controlled target, with no virtual object placed, corresponds to a second state;
a placement position corresponding to a non-gesture-controlled target, with a virtual object placed, corresponds to a third state;
a placement position corresponding to a non-gesture-controlled target, with no virtual object placed, corresponds to a fourth state.
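This four-state correspondence can be sketched as a simple lookup (state names and function signature are illustrative assumptions, not terms from the patent):

```python
from enum import Enum

class IconState(Enum):
    FIRST = 1    # target's placement position, virtual object placed
    SECOND = 2   # target's placement position, no virtual object
    THIRD = 3    # non-target placement position, virtual object placed
    FOURTH = 4   # non-target placement position, no virtual object

def icon_state(is_target_position, has_virtual_object):
    """Derive an icon's display state from the state of its placement position."""
    if is_target_position:
        return IconState.FIRST if has_virtual_object else IconState.SECOND
    return IconState.THIRD if has_virtual_object else IconState.FOURTH
```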
In an optional implementation, the display objects displayed in the graphical user interface include information displayed by empty placement positions where no virtual object is placed and/or information displayed by real placement positions where a virtual object is placed.
In an optional implementation, switching the gesture-controlled target to the second display object in response to a switching operation on the first switching control includes:
in response to the switching operation on the first switching control, determining the second display object, according to the arrangement order of the placement positions in the first placement position set, from the information displayed by the next real placement position after the placement position where the first display object is located;
and switching the gesture-controlled target to the second display object.
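The "next real placement position" rule can be sketched as follows (empty positions are modeled as None; the wrap-around at the end of the order is an assumption, since the text only specifies an arrangement order):

```python
def next_real_position(placements, current):
    """Return the index of the next placement position, in arrangement
    order, that actually holds a virtual object (a 'real' position);
    empty positions are skipped.  Wraps around, and stays put if no
    other real position exists."""
    n = len(placements)
    for step in range(1, n + 1):
        i = (current + step) % n
        if placements[i] is not None:
            return i
    return current
```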
In an optional implementation, the icons in the first switching control include a first icon and a second icon, and the first display object corresponds to the first icon; switching the gesture-controlled target to a second display object in response to a switching operation on the first switching control includes:
in response to a switching operation on the second icon, determining the placement position corresponding to the second icon, and determining the information displayed by that placement position as the second display object;
and switching the gesture-controlled target to the second display object.
In an optional implementation, the method further includes:
in response to a sliding operation on the first switching control, controlling the at least one icon of the first switching control to scroll along with the sliding operation.
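A minimal sketch of this scrolling behavior (the clamping rule and the pixel geometry are assumptions for illustration):

```python
def scroll_icons(offset, slide_delta, icon_count, visible_count, icon_width):
    """Scroll the icon strip of a switching control by the slide's delta
    (in pixels), clamped so the strip never scrolls past its first or
    last icon."""
    max_offset = max(0, (icon_count - visible_count) * icon_width)
    return min(max_offset, max(0, offset + slide_delta))
```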
In an optional implementation, providing a first switching control on the graphical user interface includes:
displaying the first switching control in response to a triggering operation on a first display area.
In an optional implementation, the triggering operation is a click operation on a mode control; or the triggering operation is a triggering operation on a designated area, the designated area being the display area of the switching control associated with the placement position.
In an optional implementation, the virtual scene is a three-dimensional game scene, and the graphical user interface includes a primary display area and a secondary display area; the primary display area is used for displaying the gesture-controlled target, and the secondary display area is used for displaying the non-gesture-controlled targets.
In a second aspect, an object display control device is provided. A graphical user interface is provided through a terminal; the content displayed in the graphical user interface includes at least one display object located in a virtual scene, and a first display object among the at least one display object is the gesture-controlled target. The device comprises:
a display module, configured to provide a first switching control through the graphical user interface;
a switching module, configured to switch the gesture-controlled target to a second display object in response to a switching operation on the first switching control;
and a moving module, configured to control the gesture-controlled target to move in the virtual scene in response to a sliding operation on the gesture-controlled target.
In an optional implementation, the display objects among the at least one display object other than the gesture-controlled target are determined to be non-gesture-controlled targets, and the non-gesture-controlled targets include a third display object. The switching module is further configured to:
switch the gesture-controlled target to the third display object in response to a sliding operation on the third display object.
In an optional implementation, the display objects among the at least one display object other than the gesture-controlled target are determined to be non-gesture-controlled targets, and the non-gesture-controlled targets include a third display object. The switching module is further configured to:
control the third display object to stay at its current position in the virtual scene in response to a sliding operation on the third display object.
In an optional implementation, the moving module is specifically configured to:
determine, in response to a sliding operation on the gesture-controlled target, a target position in the virtual scene according to the end touch point of the sliding operation;
and move the gesture-controlled target from its corresponding placement position to the target position.
In an optional implementation, the content displayed by the graphical user interface further includes at least one display area located in the virtual scene;
each display area corresponds to one switching control and one placement position set;
each switching control includes at least one icon, and the icons in a switching control correspond one-to-one to the placement positions in the placement position set of the display area corresponding to that switching control;
each icon corresponds to a display state, and the display state of an icon indicates the state of the placement position corresponding to that icon;
the perspective and posture of an icon are consistent with those of the placement position corresponding to the icon;
the perspective and posture of a switching control are consistent with those of the display area corresponding to the switching control.
In an optional implementation, the correspondence between the display state of an icon and the state of the placement position corresponding to the icon includes one or more of the following:
the placement position corresponding to the gesture-controlled target, with a virtual object placed, corresponds to a first state;
the placement position corresponding to the gesture-controlled target, with no virtual object placed, corresponds to a second state;
a placement position corresponding to a non-gesture-controlled target, with a virtual object placed, corresponds to a third state;
a placement position corresponding to a non-gesture-controlled target, with no virtual object placed, corresponds to a fourth state.
In an optional implementation, the display objects displayed in the graphical user interface include information displayed by empty placement positions where no virtual object is placed and/or information displayed by real placement positions where a virtual object is placed.
In an optional implementation, the switching module is specifically configured to:
determine, in response to the switching operation on the first switching control and according to the arrangement order of the placement positions in the first placement position set, the second display object from the information displayed by the next real placement position after the placement position where the first display object is located;
and switch the gesture-controlled target to the second display object.
In an optional implementation, the icons in the first switching control include a first icon and a second icon, and the first display object corresponds to the first icon; the switching module is specifically configured to:
determine, in response to a switching operation on the second icon, the placement position corresponding to the second icon, and determine the information displayed by that placement position as the second display object;
and switch the gesture-controlled target to the second display object.
In an optional implementation, the device further comprises a scrolling module, configured to:
control, in response to a sliding operation on the first switching control, the at least one icon of the first switching control to scroll along with the sliding operation.
In an optional implementation, the display module is specifically configured to:
display the first switching control in response to a triggering operation on a first display area.
In an optional implementation, the triggering operation is a click operation on a mode control; or the triggering operation is a triggering operation on a designated area, the designated area being the display area of the switching control associated with the placement position.
In an optional implementation, the virtual scene is a three-dimensional game scene, and the graphical user interface includes a primary display area and a secondary display area; the primary display area is used for displaying the gesture-controlled target, and the secondary display area is used for displaying the non-gesture-controlled targets.
In a third aspect, the present invention provides a computer device comprising a touch display, a memory, and a processor. The touch display is used for displaying a graphical user interface and receiving user operations on the graphical user interface; the memory stores a computer program executable on the processor; and when the processor executes the computer program, the steps of the method of any one of the foregoing embodiments are implemented.
In a fourth aspect, the present invention provides a computer readable storage medium having stored thereon machine executable instructions which, when invoked and executed by a processor, cause the processor to carry out the method of any one of the preceding embodiments.
The present invention provides an object display control method and device: a first switching control is provided through a graphical user interface; in response to a switching operation on the first switching control, the gesture-controlled target is switched to a second display object; and in response to a sliding operation on the gesture-controlled target, the gesture-controlled target is controlled to move in the virtual scene. Object switching is thus handled by an independent switching control associated with the display objects, while movement of the gesture-controlled target is handled by sliding gestures, reducing the possibility of gesture conflicts.
Drawings
To illustrate the embodiments of the present application or the technical solutions in the prior art more clearly, the drawings needed in the description of the embodiments or the prior art are briefly introduced below. Obviously, the drawings described below illustrate some embodiments of the present application, and those skilled in the art can obtain other drawings from them without creative effort.
Fig. 1 illustrates an application scenario diagram provided in an embodiment of the present application;
fig. 2 shows a schematic structural diagram of a mobile phone provided in an embodiment of the present application;
fig. 3 is a schematic view illustrating a usage scenario of a touch terminal according to an embodiment of the present application;
fig. 4 is a flowchart illustrating a method for controlling display of an object according to an embodiment of the present application;
FIG. 5 is a schematic diagram of a graphical user interface displayed by a touch terminal according to an embodiment of the present application;
FIG. 6 is a schematic diagram of another graphical user interface displayed by a touch terminal according to an embodiment of the present application;
FIG. 7 is a schematic diagram of another graphical user interface displayed by a touch terminal according to an embodiment of the present application;
FIG. 8 is a schematic diagram of another graphical user interface displayed by a touch terminal according to an embodiment of the present application;
FIG. 9 is a schematic diagram of another graphical user interface displayed by a touch terminal according to an embodiment of the present application;
fig. 10 is a schematic structural diagram illustrating a display control apparatus for an object according to an embodiment of the present application.
Detailed Description
The technical solutions of the present application will be clearly and completely described below with reference to the following embodiments, and it should be understood that the described embodiments are only a part of the embodiments of the present application, and not all of the embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present application.
The terms "comprises" and "comprising," and any variations thereof, in the embodiments of the present application are intended to cover non-exclusive inclusion. For example, a process, method, system, article, or apparatus that comprises a list of steps or elements is not limited to those steps or elements, but may include other steps or elements not expressly listed or inherent to such process, method, article, or apparatus.
The embodiments of the present application provide an object display control method and a touch terminal. When the controlled target is switched by this method, an independent switching control visually associated with the display objects is used to control the display of objects, reducing the possibility of gesture conflicts.
The display control method of the object in the embodiment of the application can be applied to a touch terminal. The touch terminal comprises a touch screen and a processor, wherein the touch screen is used for presenting a graphical user interface and receiving operation aiming at the graphical user interface.
In some embodiments, the graphical user interface presented by the touch terminal may be used to control the local touch terminal or a remote server.
For example, as shown in fig. 1, fig. 1 is a schematic view of an application scenario provided in the embodiment of the present application. The application scenario may include a touch terminal (e.g., a cell phone 102) and a server 101, and the touch terminal may communicate with the server 101 through a wired network or a wireless network. The touch terminal is used for operating a virtual desktop, and can interact with the server 101 through the virtual desktop to control the server 101.
The touch terminal of this embodiment is described by taking the mobile phone 102 as an example. The handset 102 includes components such as RF (Radio Frequency) circuitry 110, a memory 120, a touch screen 130, and a processor 140. Those skilled in the art will appreciate that the handset configuration shown in fig. 2 is not limiting: it may include more or fewer components than shown, combine certain components, split certain components, or use a different arrangement of components. Those skilled in the art will also appreciate that the touch screen 130 belongs to the user interface (UI), and that the handset 102 may include fewer user-interface components than illustrated, or different ones.
The RF circuitry 110 may also communicate with networks and other devices via wireless communications. The wireless communication may use any communication standard or protocol, including but not limited to GSM (Global System for Mobile communications), GPRS (General Packet Radio Service), CDMA (Code Division Multiple Access), WCDMA (Wideband Code Division Multiple Access), LTE (Long Term Evolution), email, SMS (Short Messaging Service), etc.
The memory 120 may be used to store software programs and modules, and the processor 140 executes various functional applications and data processing of the handset 102 by executing the software programs and modules stored in the memory 120. The memory 120 may mainly include a storage program area and a storage data area, wherein the storage program area may store an operating system, an application program required for at least one function, and the like; the stored data area may store data created from use of the handset 102, and the like. Further, the memory 120 may include high speed random access memory, and may also include non-volatile memory, such as at least one magnetic disk storage device, flash memory device, or other volatile solid state storage device.
The touch screen 130 may be used to display a graphical user interface and receive user operations on the graphical user interface. Specifically, the touch screen 130 may include a display panel and a touch panel. The display panel may be configured in the form of an LCD (Liquid Crystal Display), an OLED (Organic Light-Emitting Diode) display, and the like. The touch panel may collect contact or non-contact operations of a user on or near it (for example, as shown in fig. 3, operations performed on or near the touch panel using a finger 103, a stylus, or any other suitable object or accessory) and generate preset operation instructions. The touch panel may include a touch detection device and a touch controller. The touch detection device detects the user's touch position and gesture, detects the signals produced by the touch operation, and transmits the signals to the touch controller; the touch controller receives this touch information from the touch detection device, converts it into information the processor can handle, sends it to the processor 140, and receives and executes commands sent back by the processor 140. The touch panel may be implemented as a resistive, capacitive, infrared, or surface-acoustic-wave type, among others, or by any technology developed in the future. Further, the touch panel may cover the display panel: a user operates on or near the touch panel according to the graphical user interface shown by the display panel beneath it, the touch panel detects the operation and passes it to the processor 140 to determine the user input, and the processor 140 then provides the corresponding visual output on the display panel.
In addition, the touch panel and the display panel can be implemented as two independent components or can be implemented by integration.
The processor 140 is the control center of the handset 102. It connects the various parts of the handset using various interfaces and lines, and performs the handset 102's functions and processes data by running or executing software programs and/or modules stored in the memory 120 and calling data stored in the memory 120, thereby monitoring the handset as a whole.
The handset 102 also includes a power supply (e.g., a battery) for powering the various components, which may be logically coupled to the processor 140 via a power management system to manage charging, discharging, and power consumption via the power management system.
To facilitate understanding of this embodiment, the object display control method and touch terminal disclosed in the embodiments of the present application are first described in detail.
Fig. 4 is a flowchart illustrating a method for controlling display of an object according to an embodiment of the present application.
The method is applied to a touch terminal (for example, the mobile phone 102 shown in fig. 2) capable of presenting a graphical user interface, the graphical user interface is provided through the terminal, content displayed in the graphical user interface includes at least one display object located in a virtual scene, and a first display object in the at least one display object is a gesture controlled target. As shown in fig. 4, the method may include the steps of:
s410, providing a first switching control by the graphical user interface;
in the embodiment of the present invention, the content displayed by the graphical user interface may further include at least one presentation area located in the virtual scene;
the display area is mainly used for displaying display objects, the display area can display a plurality of display objects simultaneously and can also display only one display object, when the display area simultaneously displays a plurality of objects, the displayed display objects can be divided into primary and secondary objects, and the primary and secondary relation can be distinguished based on the size, the front and back relation and the like of display. The displayed object usually comprises a gesture controlled target, and the gesture controlled target is controlled to realize operations such as movement of the gesture controlled target and viewing of attribute information.
For example, the virtual scene is a three-dimensional game scene, and the graphical user interface includes a primary display area and a secondary display area; the primary display area is used for displaying the gesture-controlled target, and the secondary display area is used for displaying the non-gesture-controlled targets. The non-gesture-controlled targets are the display objects among the at least one display object other than the gesture-controlled target.
Each display area in the at least one display area corresponds to one switching control and one placement position set; the first display area corresponds to a first placement position set, and the placement positions are used for displaying information. The presentation objects displayed in the graphical user interface may include information presented by empty placement locations where no virtual objects are placed and/or information presented by real placement locations where virtual objects are placed. For example, the placement position may be used to place a virtual object, which is the presentation information presented by the placement position, and if the virtual object is displayed in the graphical user interface, the virtual object may also be referred to as a presentation object; for another example, the placement position may also be empty, the display information displayed by the placement position is empty, and if the placement position is displayed in the graphical user interface, the display object corresponding to the placement position may be empty.
For example, the set of placement positions may belong to a warehouse, a production line, or the like in a game, and the placement positions are the grids of the warehouse or production line in which users place virtual objects such as materials and products. The grids of a warehouse or production line are usually arranged in a certain order, and the switching control corresponding to the placement position set can show the number and arrangement order of the grids.
The display object may refer only to grids in which a virtual object is placed; in this case, empty grids are skipped by default during switching and are not displayed. Alternatively, the display object may include both grids in which a virtual object is placed and empty grids in which no virtual object is placed; that is, during switching, empty grids are also switched to. In an extreme case, all grids may be empty, or all grids may have virtual objects placed in them; the method may be applied to either of the above two manners.
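The two switching manners above — skipping empty grids by default, or switching through them as well — can be sketched as one small selection routine. The function name and the list-of-slots data layout below are illustrative assumptions, not part of the embodiment:

```python
def next_target_index(slots, current, skip_empty=True):
    """Return the index of the next placement position to display.

    slots: list where each entry is the placed virtual object, or None
           for an empty grid.
    current: index of the placement position currently shown.
    skip_empty: True  -> empty grids are skipped during switching;
                False -> empty grids are also switched to.
    """
    n = len(slots)
    for step in range(1, n + 1):
        candidate = (current + step) % n  # wrap around the arrangement order
        if not skip_empty or slots[candidate] is not None:
            return candidate
    return current  # every grid is empty and skip_empty=True: stay put
```

In the extreme case where all grids are empty and empty grids are skipped, the routine simply keeps the current position, matching the "either manner" flexibility described above.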
The switching control can correspond to various implementation modes.
As one example, each switching control can include at least one icon, and the icons in a switching control correspond one-to-one to the placement positions in the placement position set of the display area corresponding to that switching control. For example, the number of icons in the switching control is consistent with the number of placement positions in the set, and the ordering of the icons is consistent with the ordering of the placement positions.
As another example, each icon in the switching control may also correspond to a display state. The display state of an icon can be used to indicate the state of the placement position corresponding to that icon;
wherein, the corresponding relation between the display state of the icon and the state of the placement position corresponding to the icon comprises one or more of the following:
a placement position corresponding to the gesture controlled target, in which a virtual object is placed, corresponds to a first state;
a placement position corresponding to the gesture controlled target, in which no virtual object is placed, corresponds to a second state;
a placement position corresponding to a non-gesture controlled target, in which a virtual object is placed, corresponds to a third state;
a placement position corresponding to a non-gesture controlled target, in which no virtual object is placed, corresponds to a fourth state.
For example, the first state is yellow, the second state is gray, the third state is white, and the fourth state is black. For another example, the placement position corresponding to the gesture controlled target may also be indicated by highlighting.
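Using the example colors above, the four-way correspondence reduces to a simple lookup keyed on two booleans. The color strings are just the illustrative values from this paragraph:

```python
def icon_state(is_controlled_target, has_object):
    """Map a placement position's state to an icon display state.

    First state:  controlled target's position, object placed -> yellow
    Second state: controlled target's position, empty          -> gray
    Third state:  other position, object placed                -> white
    Fourth state: other position, empty                        -> black
    """
    if is_controlled_target:
        return "yellow" if has_object else "gray"
    return "white" if has_object else "black"
```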
As another example, since the icon is not located in the virtual scene, in order to keep the icon visually consistent with the information in the scene when displayed in the graphical user interface, the perspective relation and posture of the icon may be kept consistent with the perspective relation and posture of the placement position corresponding to the icon. The perspective relation of a placement position mainly refers to the size effect produced by the different distances between points in the virtual scene and the virtual camera, and the perspective relation of an icon mainly refers to the three-dimensional effect conveyed by the length and slope of the lines in the icon. The posture of a placement position may include its placement direction, and the posture of an icon may refer to the icon direction. For example, the icon may be a two-dimensional graphic with the same shape as a face of the display object on the placement position displayed in the graphical user interface; in this case, that face of the display object is oriented in the same direction as the two-dimensional graphic.
As another example, the perspective relation and posture of the switching control are consistent with those of the display area corresponding to the switching control. The perspective relation of the display area mainly refers to the size effect produced by the different distances between points in the virtual scene and the virtual camera, and the perspective relation of the switching control mainly refers to the three-dimensional effect conveyed by the length and slope of the lines in the switching control. The posture of the display area may include the trend of the display area, and the posture of the switching control may refer to the trend of the switching control. For example, the outline of the switching control may be a two-dimensional graphic with the same shape as the display area; in this case, a face of the display object is oriented in the same direction as the two-dimensional graphic.
Therefore, the switching control can be visually associated with the display objects displayed in the display area, while the area where the switching control is located does not overlap the display area in the graphical user interface.
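As a rough illustration of the perspective consistency described above, an icon's on-screen scale could shrink with its placement position's distance from the virtual camera, pinhole-style, so that control and scene stay visually matched. The formula and the `focal` parameter below are simplifying assumptions for illustration only; a real renderer would use the camera's actual projection:

```python
def apparent_scale(distance, focal=1.0):
    # Pinhole-style falloff: a placement position farther from the
    # virtual camera appears smaller, and the matching icon can be
    # drawn at the same relative scale to remain visually consistent.
    return focal / (focal + distance)
```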
And S420, responding to the switching operation aiming at the first switching control, and switching the gesture controlled target into a second display object.
In the embodiment of the present invention, the switching operation may include various implementations.
As an example, the switching may be performed sequentially in a preset order. Based on this, the determination of the second presentation object may be achieved as follows: in response to the switching operation for the first switching control, according to the arrangement order of the placement positions in the first placement position set, determining the information presented by the next real placement position after the placement position where the first display object is located as the second display object; and switching the gesture controlled target to the second display object.
As another example, switching may also be accomplished by clicking an icon. In this case, the icons in the first switching control include a first icon and a second icon, and the first display object corresponds to the first icon. The step S420 may be specifically implemented as follows: in response to the switching operation for the second icon, determining the placement position corresponding to the second icon, and determining the information presented by that placement position as the second display object; and switching the gesture controlled target to the second display object.
In addition, if there are too many icons in the first switching control, the icons can be scrolled by sliding within the first switching control, for simplicity and convenience of operation. Based on this, when the second icon is selected, at least one icon of the first switching control can be controlled to scroll along with the sliding operation in response to a sliding operation for the first switching control. The scrolling order may follow the arrangement order of the placement positions in the first placement position set.
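The scrolling behavior can be sketched as clamping an offset into the icon strip so that the visible window never runs past either end; the `visible_count` parameter and integer-offset model are illustrative assumptions:

```python
def scroll_icons(offset, drag_delta, icon_count, visible_count):
    # Scroll the icon strip by the drag amount, keeping the window of
    # visible icons inside the strip. The strip's order follows the
    # arrangement order of the placement positions.
    max_offset = max(0, icon_count - visible_count)
    return min(max(offset + drag_delta, 0), max_offset)
```

A slide past the last icon simply pins the strip at its end, which matches the bounded, sequential scrolling described above.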
After the second presentation object is determined, the gesture controlled target presented in the graphical user interface may be switched to the second presentation object. For example, the image of the second display object may directly replace the image of the first display object, or a switching animation may be played, and after the switching animation completes, the second display object is displayed.
And S430, controlling the gesture controlled object to move in the virtual scene in response to the sliding operation acting on the gesture controlled object.
The sliding operation acting on the gesture controlled target may be a sliding operation whose starting touch point is located in the area where the gesture controlled target is located.
The target position may be determined based on the ending touch point of the sliding operation. Based on this, the step S430 can be specifically implemented as follows: in response to the sliding operation acting on the gesture controlled target, determining the target position in the virtual scene according to the ending touch point of the sliding operation; and moving the gesture controlled target from its corresponding placement position to the target position. The target position can be determined by converting the ending touch point of the sliding operation from a position in the image coordinate system to a position in the virtual scene coordinate system.
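The coordinate conversion mentioned above can be illustrated with the simplest possible mapping — an axis-aligned scale-and-offset from image coordinates to scene coordinates. A real three-dimensional scene would instead unproject the touch point through the virtual camera, so this is only a sketch under that flat-view simplifying assumption, with invented parameter names:

```python
def screen_to_scene(touch, view_origin, pixels_per_unit):
    # Convert the ending touch point (image coordinates, in pixels) into
    # a target position in scene coordinates, assuming a flat,
    # axis-aligned view: scene = origin + screen / scale.
    tx, ty = touch
    ox, oy = view_origin
    return (ox + tx / pixels_per_unit, oy + ty / pixels_per_unit)
```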
According to the embodiment of the invention, an independent switching control associated with the display objects can be used to perform switching control on the objects, and switching of the gesture controlled target can also be realized through a sliding gesture, so that the possibility of gesture conflicts is reduced and the operation is convenient. In addition, the switching control can also provide a preview of some attribute information; thus the switching control not only realizes the switching function but also displays and conveys attribute information, improving the richness of information transmission and the resource utilization rate.
In some embodiments, the switching control may be provided in the graphical user interface at all times, or may be presented only when needed. As an example, before step S410, the method may further include: displaying the first switching control in response to a triggering operation for the first display area. The triggering operation is a click operation on a mode control; or the triggering operation acts on a designated area, where the designated area is the display area of the switching control associated with the presentation position.
For example, a mode control may be preconfigured, and the mode control may be used to switch between a switching mode in which the switching control is displayed and a locking mode in which the switching control is hidden.
For another example, there may be at least one display area; in this case, if the switching controls corresponding to all display areas were displayed, the display interface might become cluttered, so the switching control may be displayed only for the current display area.
The triggering operation may be a click operation or another designated operation. For example, the triggering operation for the designated area may be a double-click operation on the designated area.
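The mode control described above amounts to a two-state toggle gating the visibility of the switching control; a click flips between the switching mode (control shown) and the locking mode (control hidden). The class and method names in this sketch are invented for illustration:

```python
class ModeControl:
    # Toggles between the switching mode (switching control displayed)
    # and the locking mode (switching control hidden).
    def __init__(self):
        self.switching_mode = False  # start in locking mode

    def click(self):
        self.switching_mode = not self.switching_mode
        return self.switching_mode   # True => display the switching control
```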
In some embodiments, the remaining display objects of the at least one display object, other than the gesture controlled target, are determined to be non-gesture controlled targets, and a sliding operation on a non-gesture controlled target can switch the gesture controlled target. For example, the non-gesture controlled targets include a third presentation object; the method may further include the following step: switching the gesture controlled target to the third display object in response to a sliding operation acting on the third display object.
The sliding operation applied to the third display object may be a sliding operation in which the initial touch point is located in an area where the third display object is located.
In some embodiments, a sliding operation on a non-gesture controlled target may instead be treated as an invalid operation and receive no response. For example, the non-gesture controlled targets include a third presentation object; the method may further include the following step: controlling the third display object to stay at its current position in the virtual scene in response to a sliding operation acting on the third display object.
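The two alternative treatments of a slide on a non-gesture controlled object — switch the controlled target to it, or ignore the slide — can be combined into a single dispatcher. The `policy` flag and the returned action tuples are illustrative, not part of the embodiment:

```python
def handle_slide(slide_target, controlled, policy="switch"):
    """Dispatch a slide whose starting touch point lies on slide_target.

    policy="switch": sliding on a non-controlled object makes it the new
                     gesture controlled target (previous paragraph);
    policy="ignore": the slide is an invalid operation and nothing changes
                     (this paragraph).
    """
    if slide_target == controlled:
        return ("move", slide_target)    # slide moves the controlled target
    if policy == "switch":
        return ("switch", slide_target)  # it becomes the controlled target
    return ("ignore", controlled)        # no response; target stays put
```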
The following further describes embodiments of the present invention with reference to a specific example. See fig. 5-9.
Referring to fig. 5, the graphical user interface may include a first display area for displaying containers on the track 520, wherein the track 520 has a plurality of placement positions for placing containers, and the areas of the graphical user interface corresponding to the track 520 and the containers 531-533 may be referred to as the display area. The first display area may be the upper-right portion of the graphical user interface, and the track 520 may correspond to a plurality of containers; for example, the container 531, the container 532, the container 533, and containers not shown in fig. 5 may be switched back and forth on the track 520. In a specific implementation, one placement position may correspond to one container; of course, a plurality of placement positions may also correspond to one container, or one placement position may hold a plurality of containers, as determined by actual needs.
In the graphical user interface shown in fig. 5, a switching control 510 is provided. The switching control 510 is visually associated with the containers 531, 532, and 533 on the track 520 displayed in the graphical user interface, mainly through orientation and perspective as shown in fig. 5. The switching control 510 includes a plurality of identifiers (e.g., identifiers 541-546): identifier 541 corresponds to the placement position of container 531, identifier 542 to that of container 532, identifier 543 to that of container 533, and identifiers 544-546 to placement positions not shown in fig. 5. As shown in fig. 5, the different display states of the identifiers in the switching control 510 indicate the different states of the placement positions: the container 531 at the front of the screen is the gesture controlled target, so the display state of its identifier 541 is a first filling effect; the display states of identifiers 542 and 543, corresponding to containers 532 and 533, are a second filling effect; and the display states of identifiers 544 to 546, corresponding to placement positions where no container is placed, are an unfilled effect. Of course, the display states in fig. 5 are only examples; in a specific implementation, any distinguishable display states can implement the above function, and no limitation is imposed here.
In the graphical user interface shown in fig. 5, the gesture controlled target is the container 531, and the container 531 is the first display object.
Referring to fig. 6, in response to a switching operation for the switching control 510, the gesture controlled target is switched to the container 532, and the container 532 is the second presentation object.
For example, the switching operation for the switching control 510 can be triggered by clicking the identifier 542 in the switching control 510.
For another example, the switching operation for the switching control 510 may be triggered by clicking switching buttons located on the two sides of the switching control 510, where the left button switches to the left and the right button switches to the right.
In some embodiments, the effect of the switching process of the gesture controlled target may be that the container 531, the container 532, the container 533, and the containers outside the screen move on the track, switching the front-most container to the container 532; at this time, the display states of the identifiers in the switching control 510 also switch adaptively.
After the gesture controlled target is selected, the gesture controlled target can be controlled. For example, the container 532 is controlled to move in the virtual scene in response to a sliding operation applied to the container 532.
For example, as shown in fig. 7, the container 532 may be moved by a left-right sliding operation 561. The start touch point of the sliding operation 561 is located in the area where the container 532 is located, and the end touch point of the sliding operation 561 is located at the position 551, so that the container 532 can be moved to the position 551.
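Whether the slide "acts on" container 532 is decided by its starting touch point, which can be checked against the container's on-screen region; the axis-aligned rectangle model below is an assumption for illustration:

```python
def acts_on(start_touch, region):
    # A sliding operation acts on a display object when its starting
    # touch point lies inside the object's on-screen region
    # (left, top, right, bottom).
    x, y = start_touch
    left, top, right, bottom = region
    return left <= x <= right and top <= y <= bottom
```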
After the sliding operation is completed, a moving animation may be displayed to show the progress of the movement, for example as shown in fig. 8; after the movement is completed, the state shown in fig. 9 is presented.
In some embodiments, referring to fig. 7, the gesture-controlled object may be switched to the container 533 in response to a sliding operation acting on the container 533. Alternatively, no response is made to the sliding operation applied to the container 533.
Fig. 10 is a schematic structural diagram of an object display control apparatus according to an embodiment of the present invention. Providing a graphical user interface through a terminal, wherein content displayed in the graphical user interface comprises at least one display object positioned in a virtual scene, and a first display object in the at least one display object is a gesture controlled target, and the device comprises:
a display module 1001 for providing a first switching control through a graphical user interface;
the switching module 1002 is configured to switch the gesture controlled object to a second display object in response to a switching operation for the first switching control;
and a moving module 1003, configured to control the gesture controlled object to move in the virtual scene in response to a sliding operation applied to the gesture controlled object.
In some embodiments, the rest of the at least one presentation object except the gesture controlled target is determined to be a non-gesture controlled target, the non-gesture controlled target comprising a third presentation object; the switching module 1002 is further configured to:
and switching the gesture controlled target to the third display object in response to the sliding operation on the third display object.
In some embodiments, the remaining display objects of the at least one display object except the gesture controlled object are determined to be non-gesture controlled objects, the non-gesture controlled objects including a third display object; the switching module 1002 is further configured to:
and controlling the third display object to stay at the current position in the virtual scene in response to the sliding operation acting on the third display object.
In some embodiments, the moving module 1003 is specifically configured to:
responding to sliding operation acting on the gesture controlled target, and determining a target position in the virtual scene according to an ending touch point of the sliding operation;
and moving the controlled target from the corresponding placing position to the target position.
In some embodiments, the content displayed by the graphical user interface further comprises at least one presentation area located in the virtual scene;
each display area corresponds to one switching control and one placing position set;
each switching control comprises at least one icon, and at least one icon in the switching control corresponds to at least one placing position in a placing position set corresponding to the display area corresponding to the switching control in a one-to-one mode;
each icon corresponds to a display state, and the display state of the icon is used for indicating the state of the placement position corresponding to the icon;
the perspective relation and the posture of the icon are consistent with those of the placement position corresponding to the icon;
the perspective relation and the posture of the switching control are consistent with those of the display area corresponding to the switching control.
In some embodiments, the correspondence between the display state of an icon and the state of the place corresponding to the icon includes one or more of:
a placement position corresponding to the gesture controlled target, in which a virtual object is placed, corresponds to a first state;
a placement position corresponding to the gesture controlled target, in which no virtual object is placed, corresponds to a second state;
a placement position corresponding to a non-gesture controlled target, in which a virtual object is placed, corresponds to a third state;
a placement position corresponding to a non-gesture controlled target, in which no virtual object is placed, corresponds to a fourth state.
In some embodiments, the presentation objects displayed in the graphical user interface include information presented by empty placement locations where no virtual objects are placed and/or information presented by real placement locations where virtual objects are placed.
In some embodiments, the switching module 1002 is specifically configured to:
in response to the switching operation for the first switching control, according to the arrangement order of the placement positions in the first placement position set, determining the information presented by the next real placement position after the placement position where the first display object is located as the second display object;
and switching the gesture controlled target into a second display object.
In some embodiments, the icons in the first switching control include a first icon and a second icon, and the first presentation object corresponds to the first icon; the switching module 1002 is specifically configured to:
in response to the switching operation for the second icon, determining the placement position corresponding to the second icon, and determining the information presented by that placement position as the second display object;
and switching the gesture controlled target into a second display object.
In some embodiments, further comprising a scrolling module to:
and in response to the sliding operation of the first switching control, controlling at least one icon of the first switching control to scroll along with the sliding operation.
In some embodiments, the display module 1001 is specifically configured to:
and displaying a first switching control in response to the triggering operation aiming at the first display area.
In some embodiments, the trigger operation is a click operation on a mode control; or the trigger operation acts on a designated area, where the designated area is the display area of the switching control associated with the presentation position.
In some embodiments, the virtual scene is a three-dimensional game scene, and the graphical user interface includes a main display area and a slave display area, the main display area is used for displaying the gesture controlled object, and the slave display area is used for displaying the non-gesture controlled object.
The touch terminal provided by the embodiment of the application has the same technical characteristics as the object display control method provided by the embodiment of the application, so that the same technical problems can be solved, and the same technical effects can be achieved.
The processor may be an integrated circuit chip having signal processing capabilities. In implementation, the steps of the above method may be performed by integrated logic circuits of hardware in a processor or instructions in the form of software. The Processor may be a general-purpose Processor, and includes a Central Processing Unit (CPU), a Network Processor (NP), and the like; the device can also be a Digital Signal Processor (DSP), an Application Specific Integrated Circuit (ASIC), a Field Programmable Gate Array (FPGA) or other Programmable logic device, a discrete Gate or transistor logic device, or a discrete hardware component. The various methods, steps, and logic blocks disclosed in this application may be implemented or performed. A general purpose processor may be a microprocessor or the processor may be any conventional processor or the like. The steps of the method disclosed in this application may be directly implemented by a hardware decoding processor, or implemented by a combination of hardware and software modules in the decoding processor. The software modules may be located in ram, flash, rom, prom, or eprom, registers, etc. as is well known in the art. The storage medium is located in a memory, and a processor reads information in the memory and completes the steps of the method in combination with hardware of the processor.
The embodiment of the application also provides a machine-readable storage medium, and the machine-readable storage medium stores machine executable instructions, and when the machine executable instructions are called and executed by a processor, the machine executable instructions cause the processor to realize the display control method of the object provided by the embodiment of the application.
In the several embodiments provided in the present application, it should be understood that the disclosed apparatus and method may be implemented in other manners. The apparatus embodiments described above are merely illustrative and, for example, the flowcharts and block diagrams in the figures illustrate the architecture, functionality, and operation of possible implementations of apparatus, methods and computer program products according to various embodiments of the present application. In this regard, each block in the flowchart or block diagrams may represent a module, segment, or portion of code, which comprises one or more executable instructions for implementing the specified logical function(s). It should also be noted that, in some alternative implementations, the functions noted in the block may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved. It will also be noted that each block of the block diagrams and/or flowchart illustration, and combinations of blocks in the block diagrams and/or flowchart illustration, can be implemented by special purpose hardware-based systems which perform the specified functions or acts, or combinations of special purpose hardware and computer instructions.
Finally, it should be noted that: although the present application has been described in detail with reference to the foregoing embodiments, those skilled in the art should understand that: those skilled in the art can still make modifications or changes to the embodiments described in the foregoing embodiments, or make equivalent substitutions for some features, within the technical scope of the present disclosure; the modifications, changes or substitutions do not cause the essence of the corresponding technical solutions to depart from the technical solutions of the embodiments of the present application, and are intended to be covered by the protection scope of the present application.

Claims (15)

1. The method for controlling the display of the objects is characterized in that a terminal provides a graphical user interface, the content displayed in the graphical user interface comprises at least one display object positioned in a virtual scene, a first display object in the at least one display object is a gesture controlled target, and the content displayed in the graphical user interface also comprises at least one display area positioned in the virtual scene;
each display area corresponds to one switching control and one placing position set;
each switching control comprises at least one icon, and at least one icon in the switching control corresponds to at least one placing position in a placing position set corresponding to the display area corresponding to the switching control in a one-to-one mode;
the method comprises the following steps:
the graphical user interface provides a first switching control;
in response to the switching operation aiming at the first switching control, switching the gesture controlled target to a second display object;
controlling the gesture controlled target to move in a virtual scene in response to a sliding operation acting on the gesture controlled target;
wherein, in response to the switching operation for the first switching control, switching the gesture controlled target to the second presentation object comprises:
responding to the switching operation for the first switching control, and determining, according to the arrangement order of the placement positions in the first placement position set, the information presented by the next real placement position after the placement position where the first display object is located as the second display object;
and switching the gesture controlled target into the second display object.
2. The method according to claim 1, wherein the rest of the at least one presentation object except the gesture controlled target is determined to be a non-gesture controlled target, and the non-gesture controlled target comprises a third presentation object; the method further comprises the following steps:
and responding to the sliding operation acted on the third display object, and switching the gesture controlled target into the third display object.
3. The method according to claim 1, wherein the rest of the at least one presentation object except the gesture controlled target is determined to be a non-gesture controlled target, and the non-gesture controlled target comprises a third presentation object; the method further comprises the following steps:
and controlling the third display object to stay at the current position in the virtual scene in response to the sliding operation acting on the third display object.
4. The method according to claim 1, wherein the controlling the gesture controlled object to move in the virtual scene in response to the sliding operation acting on the gesture controlled object comprises:
responding to sliding operation acting on the gesture controlled target, and determining a target position in a virtual scene according to an ending touch point of the sliding operation;
moving the controlled object from the corresponding placement position to the target position.
5. The method according to claim 1, wherein:
each icon corresponds to a display state, the display state of an icon indicating the state of the placement position corresponding to the icon;
the perspective relation and posture of each icon are consistent with those of the placement position corresponding to the icon; and
the perspective relation and posture of each switching control are consistent with those of the display area corresponding to the switching control.
6. The method according to claim 5, wherein the correspondence between the display state of an icon and the state of the placement position corresponding to the icon comprises one or more of the following:
a placement position corresponding to the gesture controlled target in which a virtual object is placed corresponds to a first state;
a placement position corresponding to the gesture controlled target in which no virtual object is placed corresponds to a second state;
a placement position corresponding to a non-gesture controlled target in which a virtual object is placed corresponds to a third state; and
a placement position corresponding to a non-gesture controlled target in which no virtual object is placed corresponds to a fourth state.
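The four display states enumerated in claim 6 reduce to a two-bit mapping (controlled-or-not × occupied-or-not). A minimal sketch, with the function name and the 1–4 state numbering assumed for illustration:

```python
def icon_display_state(is_gesture_controlled, has_virtual_object):
    """Map a placement position's two attributes to the icon's display
    state (1..4), mirroring the four cases enumerated in claim 6."""
    if is_gesture_controlled:
        return 1 if has_virtual_object else 2
    return 3 if has_virtual_object else 4
```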
7. The method according to claim 6, wherein the display objects displayed in the graphical user interface comprise information displayed at an empty placement position where no virtual object is placed and/or information displayed at a real placement position where a virtual object is placed.
8. The method according to claim 7, wherein the icons in the first switching control comprise a first icon and a second icon, and the first display object corresponds to the first icon; switching the gesture controlled target to the second display object in response to the switching operation for the first switching control comprises:
in response to a switching operation for the second icon, determining the placement position corresponding to the second icon, and determining the information displayed at that placement position as the second display object; and
switching the gesture controlled target to the second display object.
9. The method according to claim 5, further comprising:
in response to a sliding operation on the first switching control, controlling the at least one icon of the first switching control to scroll along with the sliding operation.
10. The method according to claim 5, wherein providing the first switching control through the graphical user interface comprises:
displaying the first switching control in response to a trigger operation for a first display area.
11. The method according to claim 10, wherein the trigger operation is a click operation on a mode control; or a trigger operation on a specified area, the specified area being the display area of the switching control associated with a placement position.
12. The method according to claim 2 or 3, wherein the virtual scene is a three-dimensional game scene, and the graphical user interface comprises a main display area and a secondary display area, the main display area being used for displaying the gesture controlled target and the secondary display area for displaying the non-gesture controlled targets.
13. An object display control apparatus, wherein a terminal provides a graphical user interface, the content displayed in the graphical user interface comprises at least one display object located in a virtual scene, a first display object of the at least one display object is a gesture controlled target, and the content displayed in the graphical user interface further comprises at least one display area located in the virtual scene; each display area corresponds to one switching control and one placement position set; each switching control comprises at least one icon, and the at least one icon in a switching control corresponds one-to-one to at least one placement position in the placement position set corresponding to the display area corresponding to that switching control; the apparatus comprises:
a display module, configured to provide a first switching control through the graphical user interface;
a switching module, configured to switch the gesture controlled target to a second display object in response to a switching operation for the first switching control; and
a moving module, configured to control the gesture controlled target to move in the virtual scene in response to a sliding operation acting on the gesture controlled target;
wherein the switching module is further configured to, in response to the switching operation for the first switching control, determine, according to the arrangement order of the placement positions in the first placement position set, the information displayed at the next real placement position after the placement position where the first display object is located as the second display object; and
switch the gesture controlled target to the second display object.
14. A computer device, comprising a touch display, a memory and a processor; the touch display is configured to display a graphical user interface and to receive a user's operations on the graphical user interface; the memory stores a computer program executable on the processor, and the processor, when executing the computer program, implements the steps of the method according to any one of claims 1 to 12.
15. A computer-readable storage medium having stored thereon machine-executable instructions which, when invoked and executed by a processor, cause the processor to perform the method according to any one of claims 1 to 12.
CN202110408955.0A 2021-04-15 2021-04-15 Object display control method and device Active CN113093971B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110408955.0A CN113093971B (en) 2021-04-15 2021-04-15 Object display control method and device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202110408955.0A CN113093971B (en) 2021-04-15 2021-04-15 Object display control method and device

Publications (2)

Publication Number Publication Date
CN113093971A CN113093971A (en) 2021-07-09
CN113093971B true CN113093971B (en) 2022-11-01

Family

ID=76678141

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110408955.0A Active CN113093971B (en) 2021-04-15 2021-04-15 Object display control method and device

Country Status (1)

Country Link
CN (1) CN113093971B (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN117193585A (en) * 2022-05-31 2023-12-08 京东方科技集团股份有限公司 Interaction method based on light field display device and related equipment

Family Cites Families (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108399036A (en) * 2017-02-06 2018-08-14 中兴通讯股份有限公司 A kind of control method, device and terminal
CN110825279A (en) * 2018-08-09 2020-02-21 北京微播视界科技有限公司 Method, apparatus and computer readable storage medium for inter-plane seamless handover
CN110064193A (en) * 2019-04-29 2019-07-30 网易(杭州)网络有限公司 Manipulation control method, device and the mobile terminal of virtual objects in game
CN112044067A (en) * 2020-10-14 2020-12-08 腾讯科技(深圳)有限公司 Interface display method, device, equipment and storage medium

Also Published As

Publication number Publication date
CN113093971A (en) 2021-07-09

Similar Documents

Publication Publication Date Title
CN107422934B (en) Icon setting method and electronic equipment
KR101527827B1 (en) Split-screen display method and apparatus, and electronic device thereof
CN110955370B (en) Switching method and device of skill control in game and touch terminal
US20200183574A1 (en) Multi-Task Operation Method and Electronic Device
CN106775420B (en) Application switching method and device and graphical user interface
EP3200061B1 (en) Icon management method and apparatus, and terminal
US9891816B2 (en) Method and mobile terminal for processing touch input in two different states
EP2487579A1 (en) Method and apparatus for providing graphic user interface in mobile terminal
CN102478986B (en) Portable electronic equipment and information sharing method thereof
US20100107067A1 (en) Input on touch based user interfaces
US9323451B2 (en) Method and apparatus for controlling display of item
KR20140021346A (en) Method for displaying graphic user interface and apparatus thereof
CN110442297B (en) Split screen display method, split screen display device and terminal equipment
CN106406741B (en) A kind of operation processing method and mobile terminal of mobile terminal
CN108228020B (en) Information processing method and terminal
CA2826933C (en) Method and apparatus for providing graphic user interface in mobile terminal
CN111796734B (en) Application program management method, management device, electronic device and storage medium
CN104182123A (en) Method for processing information and electronic device
CN113663322A (en) Interactive control method and device in game
KR102095039B1 (en) Apparatus and method for receiving touch input in an apparatus providing a touch interface
CN113093971B (en) Object display control method and device
CN112416199B (en) Control method and device and electronic equipment
CN105511597B (en) A kind of page control method and device based on browser
JP5943856B2 (en) Mobile terminal having multifaceted graphic objects and display switching method
CN107728898B (en) Information processing method and mobile terminal

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant