CN117234630A - Interface display method and device, electronic equipment and medium - Google Patents

Interface display method and device, electronic equipment and medium

Info

Publication number
CN117234630A
Authority
CN
China
Prior art keywords
target
user
identifier
display
identifiers
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202210635847.1A
Other languages
Chinese (zh)
Inventor
王新宇
黄诗涵
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Beijing Zitiao Network Technology Co Ltd
Original Assignee
Beijing Zitiao Network Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Beijing Zitiao Network Technology Co Ltd filed Critical Beijing Zitiao Network Technology Co Ltd
Priority to CN202210635847.1A priority Critical patent/CN117234630A/en
Publication of CN117234630A publication Critical patent/CN117234630A/en
Pending legal-status Critical Current


Abstract

The embodiments of the disclosure provide an interface display method and device, an electronic device, and a medium. The method includes: displaying a target display interface, wherein an object identifier and target identifiers of a plurality of levels are displayed in a display area of the target display interface, and the object identifier includes a user identifier of a target user and an associated identifier; and in response to a triggering operation on a target processing control in the target display interface, processing a virtual object corresponding to the target processing control, wherein the target processing control is displayed when target progress information corresponding to the user identifier meets a set condition, and the set condition is determined based on the target identifier corresponding to an unprocessed virtual object. By adding the target display interface to the application and displaying the user identifier of the target user and the associated identifier in the display area of the target display interface, the method increases the association between the target user and associated users and enriches the display of the target display interface.

Description

Interface display method and device, electronic equipment and medium
Technical Field
Embodiments of the disclosure relate to Internet technology, and in particular to an interface display method and device, an electronic device, and a medium.
Background
An application is a computer program that runs in user mode, has a visual user interface, and interacts with a user to perform one or more specific tasks. Applications are now highly diverse; for example, video applications can be regarded as applications for video-based social networking.
Most video applications mainly display videos, and users can browse videos through them. However, the interfaces currently displayed in video applications are rather uniform, for example interfaces that merely display videos, so how to enrich the interfaces displayed in an application is a problem that currently needs to be solved.
Disclosure of Invention
The disclosure provides an interface display method and device, an electronic device, and a medium, which increase the association between a target user and associated users and enrich the display of a target display interface.
In a first aspect, an embodiment of the present disclosure provides an interface display method, including:
displaying a target display interface, wherein an object identifier and target identifiers of a plurality of levels are displayed in a display area of the target display interface, a display position of the object identifier in the display area is determined based on acquired target progress information, the object identifier includes a user identifier of a target user and an associated identifier, and the associated identifier is an identifier selected, based on order information of the identifiers, from first identifiers of users associated with the target user;
and in response to a triggering operation on a target processing control in the target display interface, processing a virtual object corresponding to the target processing control, wherein the target processing control is displayed when the target progress information corresponding to the user identifier meets a set condition, and the set condition is determined based on the target identifier corresponding to an unprocessed virtual object.
In a second aspect, embodiments of the present disclosure further provide an interface display apparatus, including:
a display module, configured to display a target display interface, wherein an object identifier and target identifiers of a plurality of levels are displayed in a display area of the target display interface, a display position of the object identifier in the display area is determined based on acquired target progress information, the object identifier includes a user identifier of a target user and an associated identifier, and the associated identifier is an identifier selected, based on order information of the identifiers, from first identifiers of users associated with the target user;
a response module, configured to, in response to a triggering operation on a target processing control in the target display interface, process a virtual object corresponding to the target processing control, wherein the target processing control is displayed when the target progress information corresponding to the user identifier meets a set condition, and the set condition is determined based on the target identifier corresponding to an unprocessed virtual object.
In a third aspect, an embodiment of the present disclosure further provides an electronic device, including:
one or more processing devices;
a storage means for storing one or more programs;
wherein, when the one or more programs are executed by the one or more processing devices, the one or more processing devices implement the interface display method provided by the embodiments of the present disclosure.
In a fourth aspect, the embodiments of the present disclosure further provide a computer readable medium having stored thereon a computer program which, when executed by a processing device, implements the interface display method provided by the embodiments of the present disclosure.
The embodiments of the disclosure provide an interface display method and device, an electronic device, and a medium. The method includes: displaying a target display interface, wherein an object identifier and target identifiers of a plurality of levels are displayed in a display area of the target display interface, a display position of the object identifier in the display area is determined based on acquired target progress information, the object identifier includes a user identifier of a target user and an associated identifier, and the associated identifier is an identifier selected, based on order information of the identifiers, from first identifiers of users associated with the target user; and in response to a triggering operation on a target processing control in the target display interface, processing a virtual object corresponding to the target processing control, wherein the target processing control is displayed when the target progress information corresponding to the user identifier meets a set condition, and the set condition is determined based on the target identifier corresponding to an unprocessed virtual object. With this technical solution, a target display interface is added to the application, and displaying the user identifier of the target user and the associated identifier in the display area of the target display interface increases the association between the target user and the associated users and enriches the display of the target display interface.
Drawings
The above and other features, advantages, and aspects of embodiments of the present disclosure will become more apparent by reference to the following detailed description when taken in conjunction with the accompanying drawings. The same or similar reference numbers will be used throughout the drawings to refer to the same or like elements. It should be understood that the figures are schematic and that elements and components are not necessarily drawn to scale.
Fig. 1 is a flow chart of an interface display method according to an embodiment of the disclosure;
FIG. 2 is a schematic diagram of a target presentation interface provided by an embodiment of the present disclosure;
FIG. 3 is a schematic view of a pop-up window provided in an embodiment of the present disclosure;
FIG. 4 is a schematic diagram of a process success window provided by an embodiment of the present disclosure;
FIG. 5 is a schematic diagram of a sequence information presentation interface provided by an embodiment of the present disclosure;
FIG. 6 is a schematic diagram of an updated target presentation interface provided by an embodiment of the present disclosure;
FIG. 7 is a schematic diagram of a target presentation interface provided by an embodiment of the present disclosure;
FIG. 8 is a flowchart of an interface display method according to an embodiment of the disclosure;
FIG. 9 is a schematic diagram of an interface display device according to an embodiment of the disclosure;
Fig. 10 is a schematic structural diagram of an electronic device according to an embodiment of the present disclosure.
Detailed Description
Embodiments of the present disclosure will be described in more detail below with reference to the accompanying drawings. While certain embodiments of the present disclosure are shown in the accompanying drawings, it should be understood that the present disclosure may be embodied in various forms and should not be construed as limited to the embodiments set forth herein; rather, these embodiments are provided so that the present disclosure will be understood more thoroughly and completely. It should be understood that the drawings and embodiments of the present disclosure are for illustration purposes only and are not intended to limit the scope of the present disclosure.
It should be understood that the various steps recited in the method embodiments of the present disclosure may be performed in a different order and/or performed in parallel. Furthermore, method embodiments may include additional steps and/or omit performing the illustrated steps. The scope of the present disclosure is not limited in this respect.
The term "including" and variations thereof as used herein are intended to be open-ended, i.e., including, but not limited to. The term "based on" is based at least in part on. The term "one embodiment" means "at least one embodiment"; the term "another embodiment" means "at least one additional embodiment"; the term "some embodiments" means "at least some embodiments. Related definitions of other terms will be given in the description below.
It should be noted that the terms "first," "second," and the like in this disclosure are merely used to distinguish between different devices, modules, or units and are not used to define an order or interdependence of functions performed by the devices, modules, or units.
It should be noted that the modifications "a" and "a plurality of" mentioned in this disclosure are intended to be illustrative rather than limiting, and those of ordinary skill in the art will appreciate that, unless the context clearly indicates otherwise, they should be understood as "one or more".
The names of messages or information interacted between the various devices in the embodiments of the present disclosure are for illustrative purposes only and are not intended to limit the scope of such messages or information.
It will be appreciated that, before using the technical solutions disclosed in the embodiments of the present disclosure, the user should be informed, in an appropriate manner and in accordance with the relevant laws and regulations, of the type, scope of use, and usage scenarios of the personal information involved in the present disclosure, and the user's authorization should be obtained.
For example, in response to receiving an active request from a user, prompt information is sent to the user to explicitly remind the user that the operation requested will require acquiring and using the user's personal information. The user can thus autonomously decide, according to the prompt information, whether to provide personal information to the software or hardware, such as an electronic device, application, server, or storage medium, that performs the operations of the technical solution of the present disclosure.
As an optional but non-limiting implementation, in response to receiving an active request from the user, the prompt information may be sent to the user by way of, for example, a popup window, in which the prompt information may be presented as text. In addition, the popup window may carry a selection control for the user to choose to "agree" or "disagree" to provide personal information to the electronic device.
It will be appreciated that the above-described notification and user authorization process is merely illustrative and not limiting of the implementations of the present disclosure, and that other ways of satisfying relevant legal regulations may be applied to the implementations of the present disclosure.
Fig. 1 is a schematic flow chart of an interface display method provided by an embodiment of the present disclosure. The embodiment of the present disclosure is applicable to the case of displaying an interface. The method may be performed by an interface display device, which may be implemented in the form of software and/or hardware and, optionally, by an electronic device such as a mobile terminal, a PC, or a server.
As shown in fig. 1, the method includes:
s110, displaying a target display interface, wherein an object identifier and a plurality of levels of target identifiers are displayed in a display area of the target display interface, the display position of the object identifier in the display area is determined based on the acquired target progress information, the object identifier comprises a user identifier of a target user and an associated identifier, and the associated identifier is an identifier selected from first identifiers of users associated with the target user based on the sequence information of the identifiers.
The target display interface may be an interface for displaying the object identifier and the target identifiers, and the display area is the area of the target display interface in which the object identifier and the target identifiers are displayed. The specific size and position of the display area are not limited and may be determined according to the actual layout of the target display interface.
An object identifier may be understood as an identifier used to characterize a user; for example, the object identifier may be an avatar or a nickname. The display position of the object identifier in the display area may be determined according to the acquired target progress information. The target progress information may refer to the user's current progress information for a task, such as step-count information or the duration for which the electronic device has been used; the manner of acquiring the target progress information is not expanded upon here. Different tasks may correspond to different target progress information, which characterizes the progress of the user in currently completing the task.
In one embodiment, the target progress information is determined based on acquired motion information of the target user.
The motion information may be regarded as information characterizing the motion of the target user, such as motion mode, motion duration, motion rate, and/or number of steps taken. In this embodiment, whether the motion information of the target user is allowed to be used may be set through a corresponding control; if use of the motion information is permitted, the motion information of the target user may be obtained directly so as to determine the target progress information.
When the target progress information is determined based on the motion information, the task may be a motion-related task, and accordingly, the present disclosure may determine the target progress information based on the motion information of the target user. The target presentation interface may be an interface within an application, where the application displaying the target presentation interface is not limited, and may be any application, such as a video-type application.
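As a non-limiting illustration, the following Kotlin sketch shows one way the target progress information could be derived from the step count in the motion information, gated by the user authorization described above; the type and function names (MotionInfo, TargetProgress, deriveTargetProgress) are assumptions of this sketch and do not appear in the disclosure.

    // Illustrative sketch only: MotionInfo and TargetProgress are hypothetical names.
    data class MotionInfo(val stepCount: Int, val durationMinutes: Int)

    data class TargetProgress(val steps: Int)

    // Derives target progress information from the target user's motion information,
    // but only when the user has authorized its use (see the notification-and-authorization
    // process described above).
    fun deriveTargetProgress(motion: MotionInfo?, useAllowed: Boolean): TargetProgress? {
        if (!useAllowed || motion == null) return null   // no authorization: no progress is derived
        return TargetProgress(steps = motion.stepCount)  // a step-count task uses the step count directly
    }

    fun main() {
        println(deriveTargetProgress(MotionInfo(stepCount = 2051, durationMinutes = 40), useAllowed = true))
    }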
There may be one or more object identifiers. In an embodiment, the object identifier may include a user identifier of a target user and an associated identifier, where the target user may refer to the user currently using the electronic device, and the user identifier is an identifier characterizing the target user, such as the target user's avatar. The associated identifier may be regarded as an identifier selected from the first identifiers, where the first identifiers are identifiers of users associated with the target user.
The first identifiers may include one or more identifiers, and these identifiers may be arranged in a certain order. The associated identifier may be an identifier selected from the first identifiers based on order information of the identifiers, where the order information characterizes the order of each identifier among the first identifiers. The arrangement order in the order information is not limited and may be determined according to the actual situation, for example according to the degree of association between the associated users and the target user. One or more associated identifiers may be selected, and the means of selecting them is not limited here; illustratively, when the order information arranges the identifiers according to the degree of association between the associated users and the target user, the first N identifiers in descending order of association may be selected from the first identifiers as associated identifiers, where N is a positive integer. When the associated identifier is selected based on the degree of association, the position at which the associated identifier is displayed within the display area may be determined based on the target progress information of the associated identifier.
When the target progress information is determined based on the motion information, at least an identifier whose position in the order information differs from that of the target user by one place may be selected as the associated identifier. Once the associated identifier is determined, the position at which it is displayed in the display area may be determined based on the target progress information corresponding to the associated identifier.
In one embodiment, the order information is determined based on the target progress information of the target user and of the users associated with the target user; alternatively, the order information is determined based on the degree of association between the target user and the users associated with the target user.
In this embodiment, the order information may be determined based on the target progress information of the target user and of the users associated with the target user, i.e., the user identifier and the first identifiers may be sorted according to the users' target progress information. Alternatively, the order information may be determined based on the degree of association between the target user and the users associated with the target user, i.e., the user identifier and the first identifiers may be sorted according to the degree of association, where the degree of association may be determined from the interaction between the target user and the associated users, such as special follows, likes, or the number of comments.
In one embodiment, the associated identifier is selected based on the corresponding target progress information; alternatively, the associated identifier is selected based on the corresponding degree of association.
In this step, the selection of the associated identifier may correspond to the order information, i.e., the associated identifier may be selected based on the magnitude of the corresponding target progress information; alternatively, the associated identifier may be selected based on the degree of association corresponding to the user.
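By way of example only, the following Kotlin sketch orders the first identifiers either by target progress information or by degree of association and takes the first N as associated identifiers; all class, field, and function names here are illustrative assumptions.

    // Hypothetical sketch of selecting associated identifiers from the first identifiers.
    data class FirstIdentifier(
        val userId: String,
        val progressSteps: Int,        // target progress information of the associated user
        val associationDegree: Double  // e.g. derived from special follows, likes and comments
    )

    enum class OrderBasis { PROGRESS, ASSOCIATION_DEGREE }

    // Orders the first identifiers according to the chosen basis and takes the first N.
    fun selectAssociatedIdentifiers(
        firstIdentifiers: List<FirstIdentifier>,
        basis: OrderBasis,
        n: Int
    ): List<FirstIdentifier> {
        val ordered = when (basis) {
            OrderBasis.PROGRESS -> firstIdentifiers.sortedByDescending { it.progressSteps }
            OrderBasis.ASSOCIATION_DEGREE -> firstIdentifiers.sortedByDescending { it.associationDegree }
        }
        return ordered.take(n)
    }

    fun main() {
        val friends = listOf(
            FirstIdentifier("user B", progressSteps = 1200, associationDegree = 0.9),
            FirstIdentifier("user C", progressSteps = 4800, associationDegree = 0.4)
        )
        println(selectAssociatedIdentifiers(friends, OrderBasis.PROGRESS, n = 1))
    }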
The target identifiers may be understood as preset identifiers for intuitively displaying the targets that the user needs to complete; there may be a plurality of target identifiers, corresponding to a plurality of levels respectively. The target progress information may be the progress toward an uncompleted target.
Specifically, a target display interface may be displayed, where the user identifier of the target user, the associated identifier, and the target identifiers of the plurality of levels may be displayed in a display area of the target display interface. The specific display manner is not limited in this embodiment; for example, the target progress information corresponding to the user identifier of the target user and to the associated identifier may be displayed directly in text form together with each target identifier, or the relationship between the user identifier of the target user, the associated identifier, and the target identifiers of the plurality of levels may be represented graphically.
In one embodiment, the target identifiers are identifiers characterizing 1,000 steps and 5,000 steps. The target progress information is the step count of the target user, and the display position of the target user on the target display interface may be determined based on the step count of the target user.
The positions at which the object identifier and the target identifiers are displayed in the display area are not limited here, as long as they can be displayed in the display area. For example, the target identifiers may be displayed in the target display interface in order of level from low to high, and the object identifiers are then arranged among the target identifiers so as to highlight the progress of each object identifier in completing the task.
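For illustration, the following Kotlin sketch computes a display position for an object identifier by linear interpolation between the two adjacent target identifiers, assuming the target identifiers are laid out along a line; the coordinate model and all names are assumptions of the sketch, not part of the claims.

    // A minimal layout sketch: target identifiers placed on a horizontal line.
    data class Milestone(val steps: Int, val x: Float)  // x: horizontal position in the display area

    // Places an object identifier between the two adjacent target identifiers by
    // linearly interpolating on its step count (the target progress information).
    fun positionForSteps(steps: Int, milestones: List<Milestone>): Float {
        val sorted = milestones.sortedBy { it.steps }               // levels from low to high
        if (steps <= sorted.first().steps) return sorted.first().x
        if (steps >= sorted.last().steps) return sorted.last().x
        val upperIdx = sorted.indexOfFirst { it.steps >= steps }
        val lower = sorted[upperIdx - 1]
        val upper = sorted[upperIdx]
        val t = (steps - lower.steps).toFloat() / (upper.steps - lower.steps)
        return lower.x + t * (upper.x - lower.x)
    }

    fun main() {
        val milestones = listOf(Milestone(1, 0f), Milestone(1000, 100f), Milestone(3000, 200f), Milestone(10000, 300f))
        println(positionForSteps(2051, milestones))  // roughly halfway between the 1,000- and 3,000-step markers
    }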
S120, in response to a triggering operation on a target processing control in the target display interface, processing a virtual object corresponding to the target processing control, wherein the target processing control is displayed when the target progress information corresponding to the user identifier meets a set condition, and the set condition is determined based on the target identifier corresponding to an unprocessed virtual object.
The target processing control may be a control for processing a corresponding virtual object, and the virtual object may be regarded as a virtual resource that can be obtained when the target progress information corresponding to the user identifier satisfies the set condition; for example, the virtual object may be points, gold coins, or a red packet, and the set condition may be determined based on the target identifier corresponding to an unprocessed virtual object.
It may be appreciated that the display state of the target processing control may be determined according to the target progress information corresponding to the user identifier. For example, when the target progress information corresponding to the user identifier meets the set condition, the target processing control may be displayed; when it does not, the target processing control is not displayed. Alternatively, when the target progress information meets the set condition, the target processing control may be displayed normally in the target display interface, waiting for a triggering operation; when it does not, the target processing control may be displayed in an inactive state (such as a greyed-out state), in which it cannot be triggered, and only when the target progress information corresponding to the user identifier meets the set condition is the target processing control displayed normally, indicating that it can be triggered.
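The following Kotlin sketch captures one possible policy for the display state of the target processing control described above (hidden, shown in an inactive state, or shown normally), taking the set condition as reaching the lowest-level target identifier whose virtual object is still unprocessed; the enum and function names are assumptions of the sketch.

    // A sketch of one possible display-state policy for the target processing control.
    enum class ControlState { HIDDEN, INACTIVE, ACTIVE }

    // The set condition here is that the current step count reaches the lowest-level
    // target identifier whose virtual object has not yet been processed.
    fun targetControlState(
        currentSteps: Int,
        unprocessedMilestoneSteps: List<Int>,
        useInactiveStyle: Boolean   // true: show a greyed-out control instead of hiding it
    ): ControlState {
        val nextMilestone = unprocessedMilestoneSteps.minOrNull()
            ?: return ControlState.HIDDEN                       // everything already processed
        return when {
            currentSteps >= nextMilestone -> ControlState.ACTIVE
            useInactiveStyle -> ControlState.INACTIVE           // shown but cannot be triggered
            else -> ControlState.HIDDEN
        }
    }

    fun main() {
        // shown greyed out until 3,000 steps are reached
        println(targetControlState(currentSteps = 2051, unprocessedMilestoneSteps = listOf(3000, 10000), useInactiveStyle = true))
    }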
In this step, when the target processing control is displayed in the target display interface, the virtual object corresponding to the target processing control may be processed (e.g., collected) in response to a triggering operation on the target processing control. The specific processing procedure is not limited; for example, processing the virtual object within the target display interface may show the target user the processing path of the virtual object in the form of an animation, or the interface may jump to another interface in which the virtual object is processed. The movement path of the virtual object before and after processing may be presented as an animation during processing. For example, the number of virtual objects the target user has processed is displayed at a target position within the target display interface; while the virtual objects are being processed, the processed virtual objects move from the display area to the target position, and the number of virtual objects currently processed is displayed at the target position. After processing, the number of processed virtual objects shown at the target position may be increased to indicate the number of virtual objects currently processed.
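A simplified Kotlin sketch of processing (collecting) a virtual object and updating the count shown at the target position is given below; the animation callback is merely an assumed hook for the movement animation described above, and all names and values are illustrative.

    // Sketch of collecting a virtual object and updating the running total at the target position.
    data class ClaimResult(val newTotal: Int, val claimedMilestone: Int)

    fun processVirtualObject(
        milestoneSteps: Int,
        rewardAmount: Int,
        currentTotal: Int,
        playMoveAnimation: (from: String, to: String) -> Unit = { _, _ -> }
    ): ClaimResult {
        playMoveAnimation("display area", "target position")   // objects move toward the target position
        return ClaimResult(newTotal = currentTotal + rewardAmount, claimedMilestone = milestoneSteps)
    }

    fun main() {
        // e.g. collecting the 70 virtual objects attached to the 1,000-step target identifier
        println(processVirtualObject(milestoneSteps = 1000, rewardAmount = 70, currentTotal = 0))
    }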
In one embodiment, each target identifier may correspond to a virtual object, and the virtual objects may be displayed in one-to-one correspondence with the target identifiers. The display position of a virtual object in the display area is not limited, as long as its association with the corresponding target identifier can be conveyed; for example, the virtual object may be located above the corresponding target identifier, or to its left, and so on.
FIG. 2 is a schematic diagram of a target display interface provided in an embodiment of the present disclosure. As shown in FIG. 2, a user identifier 2 of the target user (i.e., user D), associated identifiers 3 and 4, and target identifiers 5 of a plurality of levels (i.e., 1 step, 1,000 steps, 3,000 steps, and 10,000 steps) are displayed in a display area 1 of the target display interface; in addition, the target display interface also includes a target processing control 6 for processing the virtual object corresponding to the target processing control 6.
The target progress information of the target user in fig. 2 indicates that the target user has completed the 1,000-step target, so the virtual objects corresponding to 1,000 steps, i.e., 70 virtual objects, can be collected. The target identifiers in this example are arranged linearly in order from low to high (1 step, 1,000 steps, 3,000 steps, and 10,000 steps), and the target user is displayed among the linearly arranged target identifiers based on the corresponding target progress information.
The interface display method includes: displaying a target display interface, wherein an object identifier and target identifiers of a plurality of levels are displayed in a display area of the target display interface, a display position of the object identifier in the display area is determined based on acquired target progress information, the object identifier includes a user identifier of a target user and an associated identifier, and the associated identifier is an identifier selected, based on order information of the identifiers, from first identifiers of users associated with the target user; and in response to a triggering operation on a target processing control in the target display interface, processing a virtual object corresponding to the target processing control, wherein the target processing control is displayed when the target progress information corresponding to the user identifier meets a set condition, and the set condition is determined based on the target identifier corresponding to an unprocessed virtual object. With this method, a target display interface is added to the application, and displaying the user identifier of the target user and the associated identifier in the display area of the target display interface increases the association between the target user and the associated users and enriches the display of the target display interface.
In one embodiment, the method further comprises:
order update information is displayed within the display area or within a popup window of the display area, the order update information being triggered if an order characterized by the order information of the user identifier changes.
The order update information may refer to a change in the order characterized by the order information of the user identifier; when that order changes, display of the order update information may be triggered, and the specific display time and form may be determined according to the actual situation. For example, if the electronic device is displaying the target display interface when the order characterized by the order information of the user identifier changes, the order update information may be displayed as an animation in the display area of the target display interface. If the electronic device is not displaying the target display interface or the client is not active when the order changes, the order update information may be displayed when the target display interface is displayed again, for example by showing a popup window in which the order update information is presented as text and/or animation.
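The following Kotlin sketch expresses the two presentation paths described above, namely an in-area animation when the target display interface is currently shown, or a popup window the next time it is shown; the enum values and function name are assumptions of the sketch.

    // Sketch of choosing where the order update information is presented.
    enum class OrderUpdatePresentation { NONE, IN_AREA_ANIMATION, POPUP_ON_NEXT_SHOW }

    fun presentOrderUpdate(oldRank: Int, newRank: Int, interfaceCurrentlyShown: Boolean): OrderUpdatePresentation =
        when {
            oldRank == newRank -> OrderUpdatePresentation.NONE          // order unchanged: nothing to show
            interfaceCurrentlyShown -> OrderUpdatePresentation.IN_AREA_ANIMATION
            else -> OrderUpdatePresentation.POPUP_ON_NEXT_SHOW          // shown when the interface is displayed again
        }

    fun main() {
        println(presentOrderUpdate(oldRank = 10, newRank = 8, interfaceCurrentlyShown = false))
    }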
In this disclosure, the order of the user identifier may also be regarded as the order of the target user, and the order of an associated identifier may be regarded as the order of the user that the associated identifier characterizes.
In one embodiment, the order update information is displayed in an animated manner, the displayed animation comprising:
dynamically adjusting the position of the object identifier from the position of the order before the update to the position of the order after the update.
In this embodiment, in the case that the order represented by the order information of the object identifier (i.e., the user identifier or the associated identifier) changes, the position of the user identifier or the associated identifier in the order before update may be dynamically adjusted to the position of the order after update, i.e., the order update information may be displayed in an animated manner.
Fig. 3 is a schematic diagram of a popup window according to an embodiment of the disclosure. As shown in fig. 3, when the order characterized by the order information of the user identifier of user D changes, order update information may be displayed in a popup window of the display area, i.e., the position of the user identifier of user D is dynamically adjusted from the position of the order before the update to the position of the order after the update. The position adjustment may be presented as an animation to improve the display effect.
In one embodiment, the popup window includes the user identifier and second identifiers whose order differs from that of the user identifier by no more than a set value, and the second identifiers include a third identifier, where the third identifier is an identifier whose order relationship with the user identifier has changed.
In one embodiment, the popup window further includes the user identifier and second identifiers, where a second identifier is an identifier whose order differs from that of the user identifier by no more than a set value, and the set value may be determined according to how the order characterized by the order information of the user identifier has changed. The second identifiers may include object identifiers ordered before the user identifier, and may also include object identifiers ordered after the user identifier.
The second identifiers may include a third identifier, the third identifier being an identifier whose order relationship with the user identifier has changed; e.g., the ranking of the third identifier changes from before the user identifier to after it, or from after the user identifier to before it.
For example, if the user identifier was ranked tenth, the object identifiers ranked 5th to 12th may be displayed in the popup window; if the user identifier is then updated to 8th, the object identifiers ranked 3rd to 10th may be displayed in the popup window, where the identifiers ranked 9th and 10th are the third identifiers that the target user has surpassed this time.
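The following Kotlin sketch reproduces this example: it assembles the range of second identifiers shown in the popup window around the user's updated rank and derives the third identifiers as the users just surpassed; the window offsets (5 before, 2 after) are assumptions chosen only to match the example above, not values given in the disclosure.

    // Sketch of assembling the popup content for an order update.
    data class PopupContent(val shownRanks: IntRange, val thirdIdentifierRanks: IntRange)

    fun buildOrderUpdatePopup(oldRank: Int, newRank: Int, before: Int = 5, after: Int = 2): PopupContent {
        // second identifiers: object identifiers whose rank is within a set value of the user's rank
        val shown = (newRank - before).coerceAtLeast(1)..(newRank + after)
        // third identifiers: users whose order relationship with the user identifier changed,
        // i.e. those the target user has just surpassed
        val surpassed = if (newRank < oldRank) (newRank + 1)..oldRank else IntRange.EMPTY
        return PopupContent(shownRanks = shown, thirdIdentifierRanks = surpassed)
    }

    fun main() {
        // rank 10 -> rank 8: ranks 3..10 are shown, and ranks 9..10 are the third identifiers
        println(buildOrderUpdatePopup(oldRank = 10, newRank = 8))
    }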
In one embodiment, the method further comprises:
Displaying a first interaction control corresponding to the third identifier;
and responding to the triggering operation of the first interaction control, and displaying an interaction interface of the user corresponding to the third identifier.
The first interaction control may be a control for interacting with the user corresponding to the third identifier, and the interaction interface may be an interface for interacting with the user corresponding to the third identifier. In this step, the first interaction control corresponding to the third identifier is displayed in the popup window; when the first interaction control is triggered, the interaction interface of the user corresponding to the third identifier is displayed in response to the triggering operation, so that the target user can interact with the user corresponding to the first interaction control. In addition, the popup window may also contain a view control for returning to the target display interface for viewing.
As shown in fig. 3, a first interaction control 7 corresponding to the third identifier is displayed in the popup window, so as to be used for displaying an interaction interface of a user corresponding to the third identifier; and a view control 8 for returning to the target display interface for viewing.
In one embodiment, the method further comprises:
displaying a processing success window, wherein the processing success window comprises a second interaction control;
And responding to the triggering operation of the second interaction control, and displaying an order information display interface.
The processing success window may be regarded as a window characterizing that the virtual object has been processed successfully; the second interaction control may be a control that triggers display of the order information display interface, and the order information display interface is an interface that displays the order information of the user identifier and the first identifiers.
Specifically, after the virtual object corresponding to the target processing control is processed, a processing success window may be displayed, and the processing success window may include a second interaction control; when the second interaction control is triggered, the order information display interface may be displayed in response to the triggering operation on the second interaction control.
Fig. 4 is a schematic diagram of a processing success window provided by an embodiment of the disclosure, and as shown in fig. 4, the processing success window includes a second interaction control 9 for triggering a display order information display interface.
In one embodiment, the method further comprises:
the order information display interface includes a third interaction control, and in response to a triggering operation on the third interaction control, the display state of the third interaction control is updated on the order information display interface.
The third interaction control may be a control for interacting with the user corresponding to a first identifier; for example, the third interaction control may be a like control, so as to like the user corresponding to the first identifier.
In one embodiment, a third interaction control can be included in the order information display interface, and when the third interaction control is triggered, the display state of the third interaction control can be updated on the order information display interface in response to the triggering operation of the third interaction control. After the display state is changed, the interaction with the user corresponding to the third interaction control can be considered to be completed.
In this embodiment, each object identifier in the order information display interface corresponds to a third interaction control, so that the target user interacts with the corresponding user through the third interaction control.
In one embodiment, the method further comprises: the order information display interface includes a fourth interaction control, and in response to a triggering operation on the fourth interaction control, the user display interface of the user corresponding to the fourth interaction control is displayed.
The fourth interaction control may be a control for interacting with the user corresponding to a first identifier so as to display the user display interface of the corresponding user, where the user display interface is an interface for displaying user information. The user information may be information that the user allows to be displayed within the application, such as the user's personal homepage.
In an embodiment, the order information display interface may further include a fourth interaction control, and when the fourth interaction control is triggered, the user display interface of the user corresponding to the fourth interaction control is displayed in response to the triggering operation.
FIG. 5 is a schematic diagram of an order information display interface provided by an embodiment of the present disclosure. As shown in FIG. 5, the order information display interface includes a third interaction control 11, and when the third interaction control 11 is triggered, the display state of the third interaction control 11 is updated on the order information display interface; in addition, the order information display interface includes a fourth interaction control 12, and after the fourth interaction control 12 is triggered, the user display interface corresponding to user B of the fourth interaction control 12 is displayed.
In one embodiment, the method further comprises:
the processing success window includes a close control;
and in response to a triggering operation on the close control, displaying an updated target display interface, wherein the display state of the processed virtual object is updated in the updated target display interface.
The close control is a control for closing the processing success window. Specifically, the processing success window further includes a close control; when the close control is triggered, the updated target display interface is displayed in response to the triggering operation on the close control, wherein the display state of the processed virtual object is updated in the updated target display interface so that it is distinguished from the display state before the virtual object was processed.
After a virtual object is processed, its display state may be a processed state; when the virtual object has not been processed, its display state may be an unprocessed state. The specific form of the display state is not limited, as long as it can be distinguished whether the object has been processed.
Illustratively, upon triggering the close control 10 in FIG. 4, an updated target display interface may be displayed. FIG. 6 is a schematic diagram of an updated target display interface provided by an embodiment of the present disclosure; as shown in FIG. 6, an unprocessed virtual object (i.e., 14) may be displayed in a shaded state, and a processed virtual object (i.e., 13) may be displayed with a check mark.
In one embodiment, the target display interface includes an inactive control, where the inactive control is updated to the target processing control after being activated, and the inactive control indicates the condition required for it to be updated to the target processing control and the corresponding virtual object.
The inactive control may be used to indicate the condition required for it to be updated to the target processing control and the corresponding virtual object. The required condition may be determined according to the target progress information currently corresponding to the user identifier and the unprocessed virtual objects; e.g., the lowest-level target identifier among those corresponding to unprocessed virtual objects is determined, the difference between the target progress information and that target identifier is the required condition, and the corresponding virtual object is that unprocessed virtual object. When the target progress information currently corresponding to the user identifier reaches the required condition, the inactive control is activated and updated to the target processing control.
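As an illustrative sketch, the following Kotlin code derives the information the inactive control could indicate, namely the remaining difference to the lowest-level unprocessed target identifier and the amount of the corresponding virtual object (cf. the difference of 949 and the 100 objects shown in fig. 7); the data structures and the example step counts are assumptions of the sketch.

    // Sketch of the label carried by the inactive control.
    data class InactiveControlLabel(val stepsRemaining: Int, val rewardAmount: Int)

    fun inactiveControlLabel(
        currentSteps: Int,
        unprocessed: Map<Int, Int>   // milestone step count -> amount of its unprocessed virtual object
    ): InactiveControlLabel? {
        // the required condition is derived from the lowest-level unprocessed target identifier
        val (milestone, reward) = unprocessed.entries.minByOrNull { it.key } ?: return null
        return InactiveControlLabel(
            stepsRemaining = (milestone - currentSteps).coerceAtLeast(0),
            rewardAmount = reward
        )
    }

    fun main() {
        // e.g. 2,051 steps walked, next unprocessed target assumed at 3,000 steps worth 100 objects
        println(inactiveControlLabel(currentSteps = 2051, unprocessed = mapOf(3000 to 100, 10000 to 300)))
    }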
In one embodiment, the target display interface includes an identification element corresponding to the user identifier, where the identification element dynamically displays when the corresponding target progress information is updated.
The identification element may be an element that identifies the target user and simulates the movement behavior of the target user; it may be displayed dynamically when the target progress information corresponding to the target user is updated. The manner of dynamic display of the identification element is not limited, and different targets may correspond to different display manners; when the target user moves, the identification element may move synchronously with the target user. When the target user completes a target and processes the corresponding virtual object, the identification element may process the virtual object synchronously.
In one embodiment, the method further comprises: the target display interface comprises a first indication area, the first indication area comprises a fifth interaction control, and a user display interface corresponding to a user of the fifth interaction control is displayed in response to triggering operation of the fifth interaction control.
The first indication area is used for indicating the state of a user associated with the target user, such as having updated works in the application; the fifth interaction control may be a control for interacting with a user associated with the target user in the first indication area, so as to display the user display interface of the corresponding user.
In one embodiment, the target display interface further includes a first indication area, where the first indication area includes a fifth interaction control, and when the fifth interaction control is triggered, the triggering operation of the fifth interaction control may be responded, that is, a user display interface of the fifth interaction control corresponding to the user is displayed.
In one embodiment, the method further comprises: the target display interface comprises a second indication area, the second indication area comprises a sixth interaction control, and an interaction interface corresponding to the sixth interaction control for user interaction is displayed in response to triggering operation of the sixth interaction control.
The second indication area is an area for indicating invitations to other users; the other users may be users who are associated with the target user but have not participated in the activity, such as users who have not triggered display of the target display interface. The sixth interaction control may be a control for interacting with the other users in the second indication area so as to display an interaction interface with those users. After the sixth interaction control is triggered, the target user can interact with the other users outside the application.
In one embodiment, the target display interface further includes a second indication area, where the second indication area includes a sixth interaction control, and when the sixth interaction control is triggered, the triggering operation of the sixth interaction control may be responded, that is, an interaction interface corresponding to the user interaction by the sixth interaction control is displayed.
Fig. 7 is a schematic diagram of a target display interface provided by an embodiment of the present disclosure. As shown in fig. 7, the target display interface includes an inactive control 15, where the inactive control 15 indicates the condition required for updating to the target processing control (i.e., a difference of 949) and the corresponding virtual objects (i.e., 100 available to collect).
In addition, the target display interface includes a first indication area 16, and the first indication area 16 includes a fifth interaction control 17, so as to display the user display interface of user A corresponding to the fifth interaction control 17; the target display interface further includes a second indication area 18, and the second indication area 18 includes a sixth interaction control 19, so as to display an interaction interface for interacting with user F corresponding to the sixth interaction control 19.
Fig. 8 is a flow chart of an interface display method according to an embodiment of the disclosure. This embodiment is further refined on the basis of the alternatives in the foregoing embodiments. In this embodiment, displaying the target display interface is further embodied as: displaying the target identifiers in the display area of the target display interface in order of level from low to high; and arranging the object identifiers among the target identifiers based on the corresponding target progress information.
For details not described in detail in this embodiment, reference is made to the first embodiment.
As shown in fig. 8, a method for displaying an interface according to a second embodiment of the present disclosure includes the following steps:
S210, displaying the target identifiers in the display area of the target display interface in order of level from low to high.
The target identifiers may be displayed in the display area in the form of a straight line or a curve.
S220, arranging the object identifiers among the target identifiers based on the corresponding target progress information.
Specifically, the target identifiers may be displayed in the display area of the target display interface in order of level from low to high, and the user identifier and the associated identifier are then arranged among the target identifiers according to the magnitudes of the target progress information corresponding to the user identifier and to the associated identifier, respectively.
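A minimal Kotlin sketch of this arrangement is given below: the target identifiers are ordered by level from low to high and the object identifiers are slotted among them according to their step counts; all names and example values are illustrative assumptions.

    // Sketch of the S210/S220 arrangement of identifiers in the display area.
    sealed class DisplayItem {
        data class Target(val steps: Int) : DisplayItem()
        data class ObjectId(val userId: String, val steps: Int) : DisplayItem()
    }

    fun buildDisplayList(
        milestoneSteps: List<Int>,
        objectIdentifiers: List<DisplayItem.ObjectId>
    ): List<DisplayItem> =
        (milestoneSteps.map { DisplayItem.Target(it) as DisplayItem } + objectIdentifiers)
            .sortedBy {
                when (it) {
                    is DisplayItem.Target -> it.steps
                    is DisplayItem.ObjectId -> it.steps
                }
            }

    fun main() {
        val items = buildDisplayList(
            milestoneSteps = listOf(1, 1000, 3000, 10000),
            objectIdentifiers = listOf(
                DisplayItem.ObjectId("user D", 2051),   // the target user
                DisplayItem.ObjectId("user B", 1200)    // an associated user
            )
        )
        items.forEach(::println)
    }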
S230, in response to a triggering operation on a target processing control in the target display interface, processing a virtual object corresponding to the target processing control, wherein the target processing control is displayed when the target progress information corresponding to the user identifier meets a set condition, and the set condition is determined based on the target identifier corresponding to an unprocessed virtual object.
According to the interface display method provided by the second embodiment of the present disclosure, the target identifiers are displayed in the display area of the target display interface in order of level from low to high, and the object identifiers are arranged among the target identifiers based on the corresponding target progress information, so that intuitive display of the user identifier and the associated identifier is achieved and user experience is improved.
From the above description it can be seen that the interface display method provided by the embodiments of the present disclosure can combine walking with social scenes and add more social attributes (likes, interaction with the corresponding users through interaction interfaces, visits to personal homepages, etc.) as well as a ranking mechanism, thereby promoting social penetration.
Specifically, the walking progress module (i.e., the display area) provides a status prompt when the target user is surpassed by friends (i.e., the order update information displayed as an animation); meanwhile, the nodes to be passed (i.e., the target identifiers) carry bubble prompts (a display state indicating that a reward can be collected) to prompt the user to collect, and after collection is completed the display state is changed to indicate that the reward has been collected. On collection, a success popup window may be shown, and clicking the popup button (i.e., the second interaction control 9 in fig. 4) jumps to the ranking-list module below the page (i.e., the order information display interface, which may be displayed within the target display interface as the interface that displays order information, or may be an interface independent of the target display interface). The stickiness between the target user and the application is improved through the way virtual objects are processed, and social penetration is promoted through the plurality of configured controls, such as the first interaction control, the second interaction control, the third interaction control, and the fourth interaction control.
Fig. 9 is a schematic structural diagram of an interface display device according to an embodiment of the disclosure, as shown in fig. 9, where the device includes: a display module 310 and a response module 320.
A display module 310, configured to display a target display interface, where a display area of the target display interface displays object identifiers and multiple levels of target identifiers, where a display position of the object identifiers in the display area is determined based on the obtained target progress information, where the object identifiers include user identifiers of target users and associated identifiers, and the associated identifiers are identifiers selected from first identifiers of users associated with the target users based on order information of the identifiers;
and the response module 320 is configured to process the virtual object corresponding to the target processing control in response to a triggering operation of the target processing control in the target display interface, where the target processing control is displayed when the target progress information corresponding to the user identifier meets a setting condition, and the setting condition is determined based on a target identifier corresponding to an unprocessed virtual object.
According to the technical solution provided by the embodiment of the present disclosure, a target display interface is displayed by the display module 310, wherein an object identifier and target identifiers of a plurality of levels are displayed in a display area of the target display interface, a display position of the object identifier in the display area is determined based on acquired target progress information, the object identifier includes a user identifier of a target user and an associated identifier, and the associated identifier is an identifier selected, based on order information of the identifiers, from first identifiers of users associated with the target user; and, in response to a triggering operation on a target processing control in the target display interface, the response module 320 processes a virtual object corresponding to the target processing control, wherein the target processing control is displayed when the target progress information corresponding to the user identifier meets a set condition, and the set condition is determined based on the target identifier corresponding to an unprocessed virtual object. With this device, a target display interface is added to the application, and displaying the user identifier of the target user and the associated identifier in the display area of the target display interface increases the association between the target user and the associated users and enriches the display of the target display interface.
Further, the display module 310 is specifically configured to:
display the target identifiers in the display area of the target display interface in order of level from low to high;
and arrange the object identifiers among the target identifiers based on the corresponding target progress information.
Further, the interface display device provided in this embodiment further includes:
order update information is displayed within the display area or within a popup window of the display area, the order update information being triggered if an order characterized by the order information of the user identifier changes.
Further, the order update information is displayed in an animated manner, the displayed animation including:
dynamically adjusting the position of the object identifier from the position of the order before the update to the position of the order after the update.
Further, the popup window includes the user identifier and second identifiers whose order differs from that of the user identifier by no more than a set value, the second identifiers include a third identifier, and the third identifier is an identifier whose order relationship with the user identifier has changed.
Further, the interface display device provided in this embodiment further includes:
Displaying a first interaction control corresponding to the third identifier;
and responding to the triggering operation of the first interaction control, and displaying an interaction interface of the user corresponding to the third identifier.
Further, the interface display device provided in this embodiment further includes:
displaying a processing success window, wherein the processing success window comprises a second interaction control;
and responding to the triggering operation of the second interaction control, and displaying an order information display interface.
Further, the interface display device provided in this embodiment further includes:
the order information display interface includes a third interaction control, and in response to a triggering operation on the third interaction control, the display state of the third interaction control is updated on the order information display interface.
Further, the interface display device provided in this embodiment further includes:
and the sequence information display interface comprises a fourth interaction control, and in response to a triggering operation on the fourth interaction control, a user display interface of the user corresponding to the fourth interaction control is displayed.
Further, the interface display device provided in this embodiment further includes:
the processing success window comprises a close control;
and in response to a triggering operation on the close control, displaying an updated target display interface, wherein the display state of the processed virtual object is updated in the updated target display interface.
Further, the target display interface comprises an inactive control, the inactive control is updated to the target processing control after being activated, and the inactive control indicates the condition required for updating to the target processing control and the corresponding virtual object.
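A minimal sketch of this activation rule is shown below, assuming each virtual object carries a numeric progress threshold: the control remains inactive until the user's target progress reaches the threshold of the next unprocessed virtual object, at which point it is updated to the target processing control. The threshold field and the state shape are assumptions, not the disclosure's implementation.

```typescript
// Hypothetical control-state resolution for the inactive / target processing control.
interface GatedObject { id: string; threshold: number; processed: boolean }

type ControlState =
  | { kind: "inactive"; requiredProgress: number; objectId: string }
  | { kind: "targetProcessing"; objectId: string };

function resolveControlState(progress: number, objects: GatedObject[]): ControlState | null {
  // The setting condition is derived from the next unprocessed virtual object.
  const next = objects
    .filter(o => !o.processed)
    .sort((a, b) => a.threshold - b.threshold)[0];
  if (!next) return null; // everything processed: no control to display
  return progress >= next.threshold
    ? { kind: "targetProcessing", objectId: next.id }
    : { kind: "inactive", requiredProgress: next.threshold, objectId: next.id };
}
```

In the inactive state the returned `requiredProgress` and `objectId` are what the control would surface as the required condition and the corresponding virtual object.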
Further, the order information is determined based on target progress information of the target user and of the user associated with the target user; alternatively, the order information is determined based on a degree of association between the target user and the user associated with the target user.
Further, the association identifier is selected based on the corresponding target progress information; alternatively, the association identifier is selected based on the corresponding degree of association.
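For illustration only, selecting the associated identifiers under either criterion might look like the following sketch, which ranks the associated users by progress or by degree of association and keeps the top entries; the field names and the top-N rule are assumptions.

```typescript
// Illustrative selection of which associated identifiers to display.
interface AssociatedUser { id: string; progress: number; association: number }

function selectAssociatedIdentifiers(
  candidates: AssociatedUser[],
  count: number,
  by: "progress" | "association"
): AssociatedUser[] {
  return [...candidates]
    .sort((a, b) => b[by] - a[by]) // higher progress or closer association ranks first
    .slice(0, count);              // keep only as many identifiers as are displayed
}
```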
Further, the target display interface includes an identification element corresponding to the user identifier, and the identification element is dynamically displayed when the corresponding target progress information is updated.
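A small sketch of this dynamic cue, under the assumption that a highlight callback is available for the identification element, is given below; the watcher and callback are invented for illustration.

```typescript
// Hypothetical watcher that animates the identification element on progress updates.
class ProgressWatcher {
  private last = new Map<string, number>();

  update(userId: string, progress: number, highlight: (userId: string) => void): void {
    const previous = this.last.get(userId);
    if (previous !== undefined && previous !== progress) {
      highlight(userId); // e.g. briefly pulse the identification element
    }
    this.last.set(userId, progress);
  }
}
```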
Further, the interface display device provided in this embodiment further includes:
The target display interface comprises a first indication area, the first indication area comprises a fifth interaction control, and in response to a triggering operation on the fifth interaction control, a user display interface of the user corresponding to the fifth interaction control is displayed.
Further, the interface display device provided in this embodiment further includes:
the target display interface comprises a second indication area, the second indication area comprises a sixth interaction control, and in response to a triggering operation on the sixth interaction control, an interaction interface for interacting with the user corresponding to the sixth interaction control is displayed.
Further, the target progress information is determined based on the acquired motion information of the target user.
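Assuming, purely for illustration, that the motion information arrives as step-count samples, the target progress could be accumulated as in the sketch below; the sample shape is an assumption.

```typescript
// Hypothetical derivation of target progress from motion information.
interface MotionSample { steps: number; timestamp: number }

function progressFromMotion(samples: MotionSample[]): number {
  // Total steps reported since the activity started.
  return samples.reduce((total, s) => total + s.steps, 0);
}
```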
The interface display device provided by the embodiment of the disclosure can execute the interface display method provided by any embodiment of the disclosure, and has functional modules and beneficial effects corresponding to the executed method.
It should be noted that each unit and module included in the above apparatus are only divided according to the functional logic, but not limited to the above division, so long as the corresponding functions can be implemented; in addition, the specific names of the functional units are also only for convenience of distinguishing from each other, and are not used to limit the protection scope of the embodiments of the present disclosure.
Fig. 10 is a schematic structural diagram of an electronic device according to an embodiment of the present disclosure. Referring now to fig. 10, a schematic diagram of an electronic device 500 suitable for use in implementing embodiments of the present disclosure is shown. The terminal devices in the embodiments of the present disclosure may include, but are not limited to, mobile terminals such as mobile phones, notebook computers, digital broadcast receivers, PDAs (personal digital assistants), PADs (tablet computers), PMPs (portable multimedia players), in-vehicle terminals (e.g., in-vehicle navigation terminals), and the like, and stationary terminals such as digital TVs, desktop computers, and the like. The electronic device shown in fig. 10 is merely an example and should not be construed to limit the functionality and scope of use of the disclosed embodiments.
As shown in fig. 10, the electronic device 500 may include a processing means (e.g., a central processing unit, a graphics processor, etc.) 501, which may perform various appropriate actions and processes according to a program stored in a Read Only Memory (ROM) 502 or a program loaded from a storage means 508 into a Random Access Memory (RAM) 503. In the RAM 503, various programs and data required for the operation of the electronic apparatus 500 are also stored. The processing device 501, the ROM 502, and the RAM 503 are connected to each other via a bus 504. An input/output (I/O) interface 505 is also connected to the bus 504.
In general, the following devices may be connected to the I/O interface 505: input devices 506 including, for example, a touch screen, touchpad, keyboard, mouse, camera, microphone, accelerometer, gyroscope, etc.; an output device 507 including, for example, a Liquid Crystal Display (LCD), a speaker, a vibrator, and the like; storage 508 including, for example, magnetic tape, hard disk, etc.; and communication means 509. The communication means 509 may allow the electronic device 500 to communicate with other devices wirelessly or by wire to exchange data. While fig. 10 shows an electronic device 500 having various means, it is to be understood that not all of the illustrated means are required to be implemented or provided. More or fewer devices may be implemented or provided instead.
In particular, according to embodiments of the present disclosure, the processes described above with reference to flowcharts may be implemented as computer software programs. For example, embodiments of the present disclosure include a computer program product comprising a computer program embodied on a non-transitory computer readable medium, the computer program comprising program code for performing the method shown in the flow chart. In such an embodiment, the computer program may be downloaded and installed from a network via the communication means 509, or from the storage means 508, or from the ROM 502. The above-described functions defined in the methods of the embodiments of the present disclosure are performed when the computer program is executed by the processing device 501.
The names of messages or information interacted between the various devices in the embodiments of the present disclosure are for illustrative purposes only and are not intended to limit the scope of such messages or information.
The electronic device provided in the embodiment of the present disclosure and the interface display method provided in the foregoing embodiment belong to the same inventive concept, and technical details not described in detail in the present embodiment may be referred to the foregoing embodiment, and the present embodiment has the same beneficial effects as the foregoing embodiment.
The embodiment of the present disclosure provides a computer storage medium having stored thereon a computer program which, when executed by a processing apparatus, implements the interface display method provided by the above embodiment.
It should be noted that the computer readable medium described in the present disclosure may be a computer readable signal medium or a computer readable storage medium, or any combination of the two. The computer readable storage medium can be, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or a combination of any of the foregoing. More specific examples of the computer-readable storage medium may include, but are not limited to: an electrical connection having one or more wires, a portable computer diskette, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing. In the context of this disclosure, a computer-readable storage medium may be any tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device. In the present disclosure, however, the computer-readable signal medium may include a data signal propagated in baseband or as part of a carrier wave, with the computer-readable program code embodied therein. Such a propagated data signal may take any of a variety of forms, including, but not limited to, electro-magnetic, optical, or any suitable combination of the foregoing. A computer readable signal medium may also be any computer readable medium that is not a computer readable storage medium and that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device. Program code embodied on a computer readable medium may be transmitted using any appropriate medium, including but not limited to: electrical wires, fiber optic cables, RF (radio frequency), and the like, or any suitable combination of the foregoing.
In some implementations, the client and the server may communicate using any currently known or future developed network protocol, such as HTTP (HyperText Transfer Protocol), and may be interconnected with any form or medium of digital data communication (e.g., a communication network). Examples of communication networks include a local area network ("LAN"), a wide area network ("WAN"), an internetwork (e.g., the Internet), and peer-to-peer networks (e.g., ad hoc peer-to-peer networks), as well as any currently known or future developed networks.
The computer readable medium may be contained in the electronic device; or may exist alone without being incorporated into the electronic device.
The computer readable medium carries one or more programs which, when executed by the electronic device, cause the electronic device to: display a target display interface, wherein an object identifier and a plurality of levels of target identifiers are displayed in a display area of the target display interface, the display position of the object identifier in the display area is determined based on the acquired target progress information, the object identifier comprises a user identifier of a target user and an associated identifier, and the associated identifier is an identifier selected from first identifiers of users associated with the target user based on the sequence information of the identifiers; and in response to a triggering operation on the target processing control in the target display interface, process the virtual object corresponding to the target processing control, wherein the target processing control is displayed when the target progress information corresponding to the user identifier meets the setting condition, and the setting condition is determined based on the target identifier corresponding to the unprocessed virtual object.
Computer program code for carrying out operations of the present disclosure may be written in one or more programming languages, including, but not limited to, object oriented programming languages such as Java, Smalltalk, and C++, and conventional procedural programming languages, such as the "C" programming language or similar programming languages. The program code may execute entirely on the user's computer, partly on the user's computer, as a stand-alone software package, partly on the user's computer and partly on a remote computer, or entirely on the remote computer or server. In the case of a remote computer, the remote computer may be connected to the user's computer through any kind of network, including a Local Area Network (LAN) or a Wide Area Network (WAN), or may be connected to an external computer (for example, through the Internet using an Internet service provider).
The flowcharts and block diagrams in the figures illustrate the architecture, functionality, and operation of possible implementations of systems, methods and computer program products according to various embodiments of the present disclosure. In this regard, each block in the flowchart or block diagrams may represent a module, segment, or portion of code, which comprises one or more executable instructions for implementing the specified logical function(s). It should also be noted that, in some alternative implementations, the functions noted in the block may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved. It will also be noted that each block of the block diagrams and/or flowchart illustration, and combinations of blocks in the block diagrams and/or flowchart illustration, can be implemented by special purpose hardware-based systems which perform the specified functions or acts, or combinations of special purpose hardware and computer instructions.
The units involved in the embodiments of the present disclosure may be implemented by means of software, or may be implemented by means of hardware. The name of a unit does not in any way constitute a limitation of the unit itself; for example, the first acquisition unit may also be described as "a unit that acquires at least two Internet Protocol addresses".
The functions described above herein may be performed, at least in part, by one or more hardware logic components. For example, without limitation, exemplary types of hardware logic components that may be used include: a Field Programmable Gate Array (FPGA), an Application Specific Integrated Circuit (ASIC), an Application Specific Standard Product (ASSP), a system on a chip (SOC), a Complex Programmable Logic Device (CPLD), and the like.
In the context of this disclosure, a machine-readable medium may be a tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device. The machine-readable medium may be a machine-readable signal medium or a machine-readable storage medium. The machine-readable medium may include, but is not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any suitable combination of the foregoing. More specific examples of a machine-readable storage medium would include an electrical connection based on one or more wires, a portable computer diskette, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing.
According to one or more embodiments of the present disclosure, example 1 provides an interface display method, comprising:
displaying a target display interface, wherein an object identifier and a plurality of levels of target identifiers are displayed in a display area of the target display interface, the display position of the object identifier in the display area is determined based on the acquired target progress information, the object identifier comprises a user identifier of a target user and an associated identifier, and the associated identifier is an identifier selected from first identifiers of users associated with the target user based on the sequence information of the identifiers;
and responding to the triggering operation of the target processing control in the target display interface, processing the virtual object corresponding to the target processing control, and displaying the target processing control when the target progress information corresponding to the user identifier meets the setting condition, wherein the setting condition is determined based on the target identifier corresponding to the unprocessed virtual object.
In accordance with one or more embodiments of the present disclosure, example 2 is in accordance with the method of example 1,
the display target presentation interface includes:
displaying the target identifiers in the display area of the target display interface in order of level from low to high;
and arranging the object identifiers among the target identifiers based on the corresponding target progress information.
According to one or more embodiments of the present disclosure, example 3 is the method of example 1, further comprising:
order update information is displayed within the display area or within a popup window of the display area, the order update information being triggered if an order characterized by the order information of the user identifier changes.
In accordance with one or more embodiments of the present disclosure, example 4 is in accordance with the method of example 3,
displaying the order update information in an animated manner, the displayed animation comprising:
and dynamically adjusting the position of the object identifier from the position of the sequence before updating to the position of the sequence after updating.
According to one or more embodiments of the present disclosure, example 5 is a method according to example 3,
the popup window comprises the user identifier and a second identifier which is different from the sequence of the user identifier by a set value, wherein the second identifier comprises a third identifier, and the third identifier is an identifier with a changed sequence relation with the user identifier.
According to one or more embodiments of the present disclosure, example 6 is the method of example 5, further comprising:
Displaying a first interaction control corresponding to the third identifier;
and responding to the triggering operation of the first interaction control, and displaying an interaction interface of the user corresponding to the third identifier.
According to one or more embodiments of the present disclosure, example 7 is the method of example 1, further comprising:
displaying a processing success window, wherein the processing success window comprises a second interaction control;
and responding to the triggering operation of the second interaction control, and displaying an order information display interface.
According to one or more embodiments of the present disclosure, example 8 is the method of example 7, further comprising:
and the sequence information display interface comprises a third interaction control, and the display state of the third interaction control is updated on the sequence information display interface in response to the triggering operation of the third interaction control.
According to one or more embodiments of the present disclosure, example 9 is the method of example 7, further comprising:
and the sequence information display interface comprises a fourth interaction control, and a user display interface corresponding to a user of the fourth interaction control is displayed in response to the triggering operation of the fourth interaction control.
According to one or more embodiments of the present disclosure, example 10 is the method of example 7, further comprising:
the processing success window comprises a close control;
and in response to a triggering operation on the close control, displaying an updated target display interface, wherein the display state of the processed virtual object is updated in the updated target display interface.
In accordance with one or more embodiments of the present disclosure, example 11 is in accordance with the method of example 1,
the target display interface comprises an inactive control, the inactive control is updated to the target processing control after being activated, and the inactive control indicates the condition required for updating to the target processing control and the corresponding virtual object.
In accordance with one or more embodiments of the present disclosure, example 12 is in accordance with the method of example 1,
the order information is determined based on target progress information of the target user and of the user associated with the target user; alternatively, the order information is determined based on a degree of association between the target user and the user associated with the target user.
In accordance with one or more embodiments of the present disclosure, example 13 is in accordance with the method of example 1,
the associated identifier is selected based on the corresponding target progress information; alternatively, the association identifier is selected based on the corresponding degree of association.
In accordance with one or more embodiments of the present disclosure, example 14 is in accordance with the method of example 1,
and the target display interface comprises an identification element corresponding to the user identifier, and the identification element is dynamically displayed when the corresponding target progress information is updated.
According to one or more embodiments of the present disclosure, example 15 is the method of example 1, further comprising:
the target display interface comprises a first indication area, the first indication area comprises a fifth interaction control, and a user display interface corresponding to a user of the fifth interaction control is displayed in response to triggering operation of the fifth interaction control.
According to one or more embodiments of the present disclosure, example 16 is the method of example 1, further comprising:
the target display interface comprises a second indication area, the second indication area comprises a sixth interaction control, and an interaction interface corresponding to the sixth interaction control for user interaction is displayed in response to triggering operation of the sixth interaction control.
In accordance with one or more embodiments of the present disclosure, example 17 is in accordance with the method of example 1,
the target progress information is determined based on the acquired motion information of the target user.
Example 18 provides an interface display apparatus according to one or more embodiments of the present disclosure, comprising:
the display module is used for displaying a target display interface, wherein an object identifier and a plurality of levels of target identifiers are displayed in a display area of the target display interface, the display position of the object identifier in the display area is determined based on the acquired target progress information, the object identifier comprises a user identifier of a target user and an associated identifier, and the associated identifier is an identifier selected from first identifiers of users associated with the target user based on the sequence information of the identifiers;
the response module is used for responding to the triggering operation of the target processing control in the target display interface, processing the virtual object corresponding to the target processing control, displaying the target processing control when the target progress information corresponding to the user identifier meets the setting condition, and determining the setting condition based on the target identifier corresponding to the unprocessed virtual object.
Example 19 provides an electronic device, according to one or more embodiments of the present disclosure, comprising:
one or more processing devices;
a storage means for storing one or more programs;
The one or more programs, when executed by the one or more processing devices, cause the one or more processing devices to implement the methods of any of examples 1-17.
Example 20 provides a computer-readable medium having stored thereon a computer program which, when executed by a processing device, implements a method as in any of examples 1-17, according to one or more embodiments of the present disclosure.
The foregoing description is only of the preferred embodiments of the present disclosure and an explanation of the principles of the technology employed. It will be appreciated by persons skilled in the art that the scope of the disclosure is not limited to the specific combinations of features described above, but also covers other embodiments formed by any combination of the above features or their equivalents without departing from the spirit of the disclosure, for example, embodiments formed by replacing the above features with (but not limited to) technical features having similar functions disclosed in the present disclosure.
Moreover, although operations are depicted in a particular order, this should not be understood as requiring that such operations be performed in the particular order shown or in sequential order. In certain circumstances, multitasking and parallel processing may be advantageous. Likewise, while several specific implementation details are included in the above discussion, these should not be construed as limiting the scope of the present disclosure. Certain features that are described in the context of separate embodiments can also be implemented in combination in a single embodiment. Conversely, various features that are described in the context of a single embodiment can also be implemented in multiple embodiments separately or in any suitable subcombination.
Although the subject matter has been described in language specific to structural features and/or methodological acts, it is to be understood that the subject matter defined in the appended claims is not necessarily limited to the specific features or acts described above. Rather, the specific features and acts described above are example forms of implementing the claims.

Claims (20)

1. An interface display method, comprising:
displaying a target display interface, wherein an object identifier and a plurality of levels of target identifiers are displayed in a display area of the target display interface, the display position of the object identifier in the display area is determined based on the acquired target progress information, the object identifier comprises a user identifier of a target user and an associated identifier, and the associated identifier is an identifier selected from first identifiers of users associated with the target user based on the sequence information of the identifiers;
and responding to the triggering operation of the target processing control in the target display interface, processing the virtual object corresponding to the target processing control, and displaying the target processing control when the target progress information corresponding to the user identifier meets the setting condition, wherein the setting condition is determined based on the target identifier corresponding to the unprocessed virtual object.
2. The method of claim 1, wherein displaying the target presentation interface comprises:
displaying the target identifiers in the display area of the target display interface in order of level from low to high;
and arranging the object identifiers among the target identifiers based on the corresponding target progress information.
3. The method as recited in claim 1, further comprising:
order update information is displayed within the display area or within a popup window of the display area, the order update information being triggered if an order characterized by the order information of the user identifier changes.
4. The method of claim 3, wherein the step of,
displaying the order update information in an animated manner, the displayed animation comprising:
and dynamically adjusting the position of the object identifier from the position of the sequence before updating to the position of the sequence after updating.
5. A method according to claim 3, wherein the popup window includes the user identifier and a second identifier which is different from the order of the user identifier by a set value, and the second identifier includes a third identifier, and the third identifier is an identifier whose order relation with the user identifier is changed.
6. The method as recited in claim 5, further comprising:
displaying a first interaction control corresponding to the third identifier;
and responding to the triggering operation of the first interaction control, and displaying an interaction interface of the user corresponding to the third identifier.
7. The method as recited in claim 1, further comprising:
displaying a processing success window, wherein the processing success window comprises a second interaction control;
and responding to the triggering operation of the second interaction control, and displaying an order information display interface.
8. The method as recited in claim 7, further comprising:
and the sequence information display interface comprises a third interaction control, and the display state of the third interaction control is updated on the sequence information display interface in response to the triggering operation of the third interaction control.
9. The method as recited in claim 7, further comprising:
and the sequence information display interface comprises a fourth interaction control, and a user display interface corresponding to a user of the fourth interaction control is displayed in response to the triggering operation of the fourth interaction control.
10. The method as recited in claim 7, further comprising:
the processing success window comprises a close control;
and in response to a triggering operation on the close control, displaying an updated target display interface, wherein the display state of the processed virtual object is updated in the updated target display interface.
11. The method of claim 1, wherein the target presentation interface includes an inactive control, the inactive control updated to the target processing control after being activated, the inactive control indicating a condition required to be updated to the target processing control and a corresponding virtual object.
12. The method of claim 1, wherein the order information is determined based on target progress information of the target user and of the user associated with the target user; alternatively, the order information is determined based on a degree of association between the target user and the user associated with the target user.
13. The method of claim 1, wherein the association identifier is selected based on the corresponding target progress information; alternatively, the association identifier is selected based on the corresponding degree of association.
14. The method according to claim 1, wherein the target display interface includes an identification element corresponding to the user identifier, and the identification element is dynamically displayed when the corresponding target progress information is updated.
15. The method as recited in claim 1, further comprising:
the target display interface comprises a first indication area, the first indication area comprises a fifth interaction control, and a user display interface corresponding to a user of the fifth interaction control is displayed in response to triggering operation of the fifth interaction control.
16. The method as recited in claim 1, further comprising:
the target display interface comprises a second indication area, the second indication area comprises a sixth interaction control, and an interaction interface corresponding to the sixth interaction control for user interaction is displayed in response to triggering operation of the sixth interaction control.
17. The method of claim 1, wherein the target progress information is determined based on acquired motion information of the target user.
18. An interface display device, comprising:
the display module is used for displaying a target display interface, wherein an object identifier and a plurality of levels of target identifiers are displayed in a display area of the target display interface, the display position of the object identifier in the display area is determined based on the acquired target progress information, the object identifier comprises a user identifier of a target user and an associated identifier, and the associated identifier is an identifier selected from first identifiers of users associated with the target user based on the sequence information of the identifiers;
The response module is used for responding to the triggering operation of the target processing control in the target display interface, processing the virtual object corresponding to the target processing control, displaying the target processing control when the target progress information corresponding to the user identifier meets the setting condition, and determining the setting condition based on the target identifier corresponding to the unprocessed virtual object.
19. An electronic device, the electronic device comprising:
one or more processing devices;
storage means for storing one or more programs,
when the one or more programs are executed by the one or more processing devices, the one or more processing devices are caused to implement the interface display method of any of claims 1-17.
20. A storage medium containing computer executable instructions for performing the interface display method of any one of claims 1-17 when executed by a computer processing device.
