CN110025962B - Object matching method, device, equipment and storage medium


Info

Publication number: CN110025962B
Authority: CN (China)
Prior art keywords: interactive, groups, group, interaction, matching
Legal status: Active (granted)
Application number: CN201910323381.XA
Other languages: Chinese (zh)
Other versions: CN110025962A (en)
Inventor: 钱宏图
Current and original assignee: Shenzhen Tencent Domain Computer Network Co Ltd
Application filed by Shenzhen Tencent Domain Computer Network Co Ltd
Priority to CN201910323381.XA
Publication of application CN110025962A and of grant CN110025962B


Classifications

    • A: HUMAN NECESSITIES
    • A63: SPORTS; GAMES; AMUSEMENTS
    • A63F: CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00: Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/70: Game security or game management aspects
    • A63F13/79: Game security or game management aspects involving player-related data, e.g. identities, accounts, preferences or play histories
    • A63F13/795: Game security or game management aspects involving player-related data, e.g. identities, accounts, preferences or play histories for finding other players; for building a team; for providing a buddy list
    • A: HUMAN NECESSITIES
    • A63: SPORTS; GAMES; AMUSEMENTS
    • A63F: CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F2300/00: Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
    • A63F2300/50: Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game characterized by details of game servers
    • A63F2300/55: Details of game data or player data management
    • A63F2300/5546: Details of game data or player data management using player registration data, e.g. identification, account, preferences, game history
    • A63F2300/5566: Details of game data or player data management using player registration data, e.g. identification, account, preferences, game history by matching opponents or finding partners to build a team, e.g. by skill level, geographical area, background, play style

Abstract

The application discloses an object matching method, apparatus, device and storage medium, belonging to the field of internet technology. The method includes: acquiring the number of groups participating in interaction and the score of each participating group; determining the number of interactive groups according to the number of participating groups; grouping all the groups according to each group's score and the number of interactive groups; for each interactive group, matching a reference number of groups within the interactive group pairwise; randomly matching the unmatched groups across all interactive groups; allocating a plurality of interactive scenes to every two successfully matched groups; and sending the matching information and the allocated interactive scene information to the clients of the corresponding groups, the participating groups interacting based on the matching information and the interactive scene information. The method and apparatus can improve the matching degree of objects, achieve a better interaction effect, increase the interaction rate and improve the user experience.

Description

Object matching method, device, equipment and storage medium
Technical Field
Embodiments of the present application relate to the field of internet technology, and in particular to an object matching method, apparatus, device and storage medium.
Background
With the continuous development of the internet, online games have become more and more widely accepted. In current online games, player versus player (PVP) combat is a popular play mode in which players compete directly against other players; players can also form groups to fight together, i.e., perform group interaction, which further increases the fun of the game. How objects are matched is therefore a key factor in the player's game experience.
In the related art, during object matching, the server matches pairs of groups among the participating groups by closeness of score, and the two successfully matched groups enter a single battlefield to fight, i.e., the two groups interact.
In the course of implementing the present application, the inventors found that the related art has at least the following problems:
in the related art, groups are matched by closeness of score, so strong groups are always matched against strong groups. Players in two strong groups fighting each other become overly fatigued, while fights between weak groups are comparatively dull. As a result, the matching degree of the objects is not high, the group interaction effect is poor, and the user experience suffers.
Disclosure of Invention
The embodiment of the application provides an object matching method, device, equipment and storage medium, which can be used for solving the problems in the related art. The technical scheme is as follows:
in one aspect, an embodiment of the present application provides an object matching method, where the method includes:
acquiring the number of groups participating in interaction and the score of each group in the groups participating in interaction, wherein each group participating in interaction comprises a plurality of interaction objects, and the score is used for indicating the interaction capacity of the group;
determining the number of interactive groups according to the number of the groups participating in the interaction;
grouping all the groups according to the score of each group and the number of the interactive groups, wherein the number of the interactive groups is one or more;
for each interactive group, matching a reference number of groups within the interactive group pairwise; and randomly matching the unmatched groups across all interactive groups;
distributing a plurality of interactive scenes for every two groups successfully matched, wherein the number of interactive objects of each group contained in each interactive scene is within a threshold range;
and sending the matching information and the distributed interactive scene information to the client side of the corresponding group, wherein the group participating in the interaction carries out interaction based on the matching information and the interactive scene information.
There is also provided a method of object matching, the method comprising:
acquiring matching information of a group, wherein the matching information comprises information of a first group and information of a second group which are interactive, and the first group is a group in which an interactive object of a current client is located; the matching information of the groups is obtained by the server through determining the number of interactive groups based on the number of groups participating in interaction, grouping all the groups according to the score of each group and the number of the interactive groups, matching the groups with reference number in the interactive groups in pairs, and randomly matching the groups which are not matched in all the interactive groups;
acquiring interactive scene information of a plurality of interactive scenes distributed by the server, wherein the number of interactive objects contained in each interactive scene is within a threshold range;
and interacting with the interactive objects in the second group based on the matching information of the group and the interactive scene information of the plurality of interactive scenes.
In another aspect, an apparatus for object matching is provided, the apparatus including:
a first acquisition module, configured to acquire the number of groups participating in the interaction and the score of each participating group, where each participating group includes a plurality of interactive objects and the score indicates the interactive ability of the group;
the determining module is used for determining the number of the interactive groups according to the number of the groups participating in the interaction;
the grouping module is used for grouping all the groups according to the score of each group and the number of the interactive groups, wherein the number of the interactive groups is one or more;
the matching module is used for matching every two groups of the reference number in the interactive groups for each interactive group; carrying out random matching on unmatched groups in all interactive groups;
the distribution module is used for distributing a plurality of interactive scenes for every two groups which are successfully matched, and the number of interactive objects of each group contained in each interactive scene is within a threshold range;
and the sending module is used for sending the matching information and the distributed interactive scene information to the client side of the corresponding group, and the group participating in the interaction carries out interaction based on the matching information and the interactive scene information.
There is also provided an apparatus for object matching, the apparatus comprising:
a first acquisition module, configured to acquire matching information of groups, where the matching information includes information of a first group and information of a second group that interact with each other, the first group being the group of the current client's interactive object; the matching information is obtained by the server by determining the number of interactive groups based on the number of participating groups, grouping all the groups according to each group's score and the number of interactive groups, matching a reference number of groups within each interactive group pairwise, and randomly matching the unmatched groups across all the interactive groups;
the second acquisition module is used for acquiring interactive scene information of a plurality of interactive scenes distributed by the server, and the number of interactive objects contained in each interactive scene is within a threshold range;
and the interaction module is used for interacting with the interaction objects in the second group based on the matching information of the group and the interaction scene information of the plurality of interaction scenes.
In another aspect, a computer device is provided, which includes a processor and a memory, wherein the memory stores at least one instruction, and the at least one instruction, when executed by the processor, implements any one of the above-mentioned object matching methods.
In another aspect, a computer-readable storage medium is provided, in which at least one instruction is stored, and when executed, the at least one instruction implements any one of the above-described object matching methods.
The technical scheme provided by the embodiment of the application at least has the following beneficial effects:
in the process of object matching, all groups are grouped according to the scores of the groups participating in the interaction; within each interactive group a reference number of groups are matched pairwise, and the groups left unmatched across all interactive groups are then matched randomly, so that some groups meet opponents of comparable ability while others meet opponents of noticeably different ability. This helps the pleasure and the fatigue of the interactive objects in the group interaction reach a reasonable balance point, improves the matching degree of the objects, produces a better group interaction effect, and thereby increases the interaction rate and improves the user experience.
Drawings
To more clearly illustrate the technical solutions in the embodiments of the present application, the drawings needed in the description of the embodiments are briefly introduced below. Obviously, the drawings described below show only some embodiments of the present application, and those of ordinary skill in the art can derive other drawings from them without creative effort.
FIG. 1 is a schematic illustration of an implementation environment provided by an embodiment of the present application;
FIG. 2 is a flowchart of a method for object matching according to an embodiment of the present disclosure;
FIG. 3 is a diagram illustrating an object matching process provided by an embodiment of the present application;
FIG. 4 is a schematic illustration of a multi-pair field according to an embodiment of the present disclosure;
FIG. 5 is a diagram illustrating a group engagement representation according to an embodiment of the present disclosure;
FIG. 6 is a flowchart of a method for object matching according to an embodiment of the present disclosure;
FIG. 7 is a diagram illustrating a result of team formation in a target scene according to an embodiment of the present disclosure;
FIG. 8 is a schematic diagram of a model displayed during interaction in a target scene according to an embodiment of the present disclosure;
fig. 9 is a schematic structural diagram of an object matching apparatus provided in an embodiment of the present application;
fig. 10 is a schematic structural diagram of an object matching apparatus provided in an embodiment of the present application;
fig. 11 is a schematic structural diagram of an object matching apparatus according to an embodiment of the present disclosure;
fig. 12 is a schematic structural diagram of an interactive module according to an embodiment of the present disclosure;
fig. 13 is a schematic structural diagram of an object matching apparatus provided in an embodiment of the present application;
fig. 14 is a schematic structural diagram of an object matching device according to an embodiment of the present application.
Detailed Description
To make the objects, technical solutions and advantages of the present application more clear, embodiments of the present application will be described in further detail below with reference to the accompanying drawings.
With the continuous development of the internet, online games have become more and more widely accepted. In current online games, player versus player (PVP) combat is a popular play mode in which players compete directly against other players; players can also form groups to fight together, i.e., perform group interaction, which further increases the fun of the game. How objects are matched is therefore a key factor in the player's game experience.
In view of the above, an embodiment of the present application provides an object matching method. Please refer to fig. 1, which shows a schematic diagram of an implementation environment of the method provided in the embodiment of the present application. The implementation environment may include: at least one terminal 11 and at least one server 12.
The terminal 11 is installed with an application program, or can display a web page, capable of performing object matching, for example a game application or a game web page; the method provided by the embodiment of the present application can be applied when group interaction is performed based on such a game application or web page. The terminal 11 may collect information of an interactive object, for example the interactive object's score, its skills, the number of days it has logged in, and its win-loss results, and upload the information to the server 12 for storage. The terminal 11 may also store the information of the interactive object.
Alternatively, the terminal 11 may be an electronic device such as a mobile phone, a tablet computer, a personal computer, or the like. The server 12 may be a server, a server cluster composed of a plurality of servers, or a cloud computing service center. The terminal 11 establishes a communication connection with the server 12 through a wired or wireless network.
It should be understood by those skilled in the art that the above terminal 11 and server 12 are only examples, and that other existing or future terminals or servers, where applicable to the present application, also fall within the scope of protection of the present application and are incorporated herein by reference.
Based on the implementation environment shown in fig. 1, an embodiment of the present application provides an object matching method, which is applied to the server 12 in the implementation environment shown in fig. 1 as an example. As shown in fig. 2, the method provided by the embodiment of the present application may include the following steps:
in step 201, the number of groups participating in the interaction and the score of each group in the groups participating in the interaction are obtained, wherein each group participating in the interaction comprises a plurality of interaction objects, and the score is used for indicating the interaction capability of the group.
Before object matching, groups meeting the conditions are automatically registered to participate in group interaction, and each group meeting the conditions comprises a plurality of interaction objects meeting the conditions.
Optionally, the conditions an interactive object must satisfy to participate in the group interaction are: the interactive object is an active interactive object, its level reaches a first threshold, and the time since it joined the group reaches a second threshold, where an interactive object that has logged in within the last 7 days counts as active. For example, if the first threshold is level 36 and the second threshold is 24 hours, an interactive object of level 38 that joined its group 30 hours ago and has logged in within 7 days satisfies the conditions. In addition, the conditions a group must satisfy to be automatically registered for the group interaction are: the group has sufficient funds, the number of its interactive objects satisfying the above conditions reaches a third threshold, and the group level reaches a fourth threshold. For example, if a group has sufficient funds, contains 60 qualifying interactive objects and has a group level of 3, the group is automatically registered to participate in the group interaction.
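As an illustration of these registration checks, the following is a minimal sketch under the example thresholds above (active = logged in within the last 7 days, object level at least 36, time in group at least 24 hours; group: sufficient funds, at least 60 qualifying members, group level at least 3). All function and parameter names are assumptions for illustration, not taken from the patent.

from datetime import datetime, timedelta

def object_qualifies(last_login: datetime, level: int,
                     hours_in_group: float, now: datetime) -> bool:
    # Active (logged in within 7 days), level reaches the first threshold
    # (36), and time in the group reaches the second threshold (24 hours).
    active = now - last_login <= timedelta(days=7)
    return active and level >= 36 and hours_in_group >= 24

def group_qualifies(has_funds: bool, qualifying_members: int,
                    group_level: int) -> bool:
    # Sufficient funds, at least the third threshold (60) of qualifying
    # members, and group level at the fourth threshold (3) or above.
    return has_funds and qualifying_members >= 60 and group_level >= 3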
Optionally, before the group interaction activity starts, the method further includes: detecting whether the number of qualifying groups reaches a fifth threshold, and cancelling the group interaction activity if it does not. If the number of qualifying groups reaches the fifth threshold, the group interaction activity proceeds normally, and all qualifying groups are automatically registered to participate in the group interaction.
In the embodiment of the application, when the number of qualifying groups reaches the fifth threshold, the group interaction activity proceeds normally and all automatically registered groups participate in the group interaction, whereby the number of participating groups is obtained; each participating group includes a plurality of interactive objects.
In the groups participating in the interaction, each interactive object in each group has an interactive object score, a quantitative value of that object's interactive ability: the higher the score, the stronger the interactive ability. The score of a group is obtained by summing the interactive object scores of all its interactive objects. The higher a group's score, the stronger its interactive ability.
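In code form, the group score described above is simply the sum of its members' interactive object scores; a one-line sketch (the scoring scale itself is not specified here):

def group_score(member_scores: list) -> float:
    # Higher total score = stronger interactive ability of the group.
    return sum(member_scores)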
In step 202, the number of interactive groups is determined based on the number of groups participating in the interaction.
The number of interactive groups is determined according to the number of groups participating in the interaction; the interactive groups are used to partition the participating groups. The more groups participate in the interaction, the more interactive groups there are.
Optionally, the number of interactive groups is calculated according to the following formula:

number of interactive groups = max{min{int[number of groups participating in the interaction / 10], 4}, 1}.

Taking the group battle (group interaction) activity in a game as an example, the number of battle regions (interactive groups) is calculated as:

number of battle regions = max{min{int[number of groups participating in battle / 10], 4}, 1}.

For example, assuming 104 participating groups, the number of battle regions (interactive groups) is max{min{int[104/10], 4}, 1} = max{min{int[10.4], 4}, 1} = max{min{10, 4}, 1} = max{4, 1} = 4. That is, when 104 groups participate, 4 battle regions (interactive groups) are opened, namely a first battle region (first interactive group), a second battle region (second interactive group), a third battle region (third interactive group) and a fourth battle region (fourth interactive group).
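A minimal sketch of this formula follows; the integer division and the constants 10, 4 and 1 come directly from the formula above, while the function name is an illustrative assumption.

def interactive_group_count(participating_groups: int) -> int:
    # number of interactive groups = max{min{int[participating / 10], 4}, 1}
    return max(min(participating_groups // 10, 4), 1)

# 104 participating groups: int[10.4] = 10 -> min(10, 4) = 4 -> max(4, 1) = 4
assert interactive_group_count(104) == 4
# Small events still get one interactive group.
assert interactive_group_count(5) == 1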
In step 203, all the groups are grouped according to the score of each group and the number of interactive groups, wherein the number of interactive groups is one or more.
All participating groups are sorted in descending order of score, and are then assigned to the interactive groups according to the sorted order, i.e., the participating groups are filled into the interactive groups, with the number of groups in each interactive group kept as equal as possible.
For example, taking the group battle (group interaction) activity in a game: if 104 groups participate, 4 battle regions (interactive groups) are opened, and 26 participating groups are filled, in descending score order, into each of the first battle region (first interactive group), the second battle region (second interactive group), the third battle region (third interactive group) and the fourth battle region (fourth interactive group). If 102 groups participate, 4 battle regions (interactive groups) are still opened; 26 participating groups are filled into each of the first and second battle regions, and 25 into each of the third and fourth battle regions, in descending score order.
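A sketch of this grouping step is given below, assuming the sorted groups are dealt into the battle regions as contiguous blocks (consistent with step 204's remark that the groups filled into one interactive group have close scores) and that earlier regions absorb any remainder, as in the 102-group example. Names are illustrative.

def fill_interactive_groups(scores, n_regions):
    # Sort group indices by score, descending.
    order = sorted(range(len(scores)), key=lambda i: scores[i], reverse=True)
    base, extra = divmod(len(order), n_regions)
    regions, start = [], 0
    for r in range(n_regions):
        size = base + (1 if r < extra else 0)  # e.g. 102 -> 26, 26, 25, 25
        regions.append(order[start:start + size])
        start += size
    return regions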
In step 204, for each interactive group, pairwise matching is performed on the groups of the reference number in the interactive group; and carrying out random matching on the unmatched groups in all the interactive groups.
For each interactive group, a reference number of groups within the interactive group are matched pairwise, where the reference number can be determined from a reference proportion. Assume the reference proportion is x%; that is, within each interactive group, x% of the groups are matched two by two. The reference number is even, so no group inside an interactive group is left with a failed match. Because the groups filled into one interactive group have close scores, i.e., close interactive abilities, the x% of groups matched this way face opponents of comparable ability.
Within each interactive group, a proportion y% of groups remains unmatched, where y% = 100% - x%. These unmatched groups from all interactive groups are then matched randomly. Since their interactive abilities vary widely, a strong group may be randomly matched with another strong group or with a weak one; likewise, a weak group may be matched with a weak group or with a strong one.
Alternatively, the value of x is 60, the value of y is 40, that is, for all groups in each interactive group, there is a 60% probability of pairwise matching in the interactive group, and a 40% probability of random matching with the groups in other interactive groups, as shown in fig. 3.
For example, in a group battle activity in a game, assume that 4 battle regions (interactive groups) are opened and that 30 groups are filled into each of the first, second, third and fourth battle regions (first to fourth interactive groups). With x = 60 and y = 40, 18 groups can be matched pairwise within each battle region (interactive group), leaving 12 unmatched groups per region; the resulting 48 unmatched groups are matched randomly.
Although the reference number of groups matched within each interactive group (the x% proportion) is even, the total number of unmatched groups across all interactive groups may be even or odd. When it is even, all unmatched groups can be matched randomly; when it is odd, one group will fail to find an opponent and cannot take part in this round of the group interaction activity.
A player interacting with an opponent of comparable ability feels like a chess player who has met his match: interactive pleasure is highest, but so is the corresponding fatigue. The values of x and y can therefore be adjusted dynamically according to the actual situation, so that the pleasure and fatigue of the interactive objects in the group interaction reach an ideal balance point.
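The following sketch puts steps 203 and 204 together, assuming that within each interactive group the reference number is x% of the region size rounded down to an even number, that pairwise matching pairs score-adjacent groups (so matched opponents have close ability), and that the leftover pool is shuffled and paired; these rounding and pooling details are illustrative assumptions.

import random

def match_groups(regions, x_percent=60):
    # regions: per interactive group, group indices already in score order.
    pairs, pool = [], []
    for members in regions:
        ref = int(len(members) * x_percent / 100)
        ref -= ref % 2                      # keep the reference number even
        for i in range(0, ref, 2):          # adjacent scores: close ability
            pairs.append((members[i], members[i + 1]))
        pool.extend(members[ref:])          # the unmatched y% of this region
    random.shuffle(pool)                    # random matching across regions
    while len(pool) >= 2:
        pairs.append((pool.pop(), pool.pop()))
    return pairs, pool                      # pool keeps the odd group out, if any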
Optionally, after all the parties participating in the interaction are matched, the method further includes the following steps.
In step 205, a plurality of interactive scenes are allocated to each two communities for which matching is successful, and each interactive scene contains the number of interactive objects of each community within a threshold range.
For example, 5 interactive scenes are allocated to every two successfully matched groups: a first, second, third, fourth and fifth interactive scene. Assuming a threshold of 20, each interactive scene contains no more than 20 interactive objects of each group. The allocated interactive scene information is sent to the clients of the corresponding groups, so that each client's interactive object can enter one of the interactive scenes to interact.
Using multiple interactive scenes and limiting the number of interactive objects in each scene lets a group deploy tactics before the interaction, i.e., determine an interaction strategy, which raises game pleasure and, in turn, the interaction rate. It also reduces the number of interactive object models presented simultaneously in a scene, which mitigates model stacking to some extent and improves visibility.
Likewise, in the group battle activity in a game, a plurality of battlefields (interactive scenes) is allocated to every two successfully matched groups, and the number of battle objects (interactive objects) of each group in each battlefield (interactive scene) is kept within a threshold range. For example, as shown in fig. 4, 5 battlefields are allocated to every two successfully matched groups: a first battlefield (first interactive scene), a second battlefield (second interactive scene), a third battlefield (third interactive scene), a fourth battlefield (fourth interactive scene) and a fifth battlefield (fifth interactive scene). In each battlefield, the number of battle objects (interactive objects) from each group must not exceed the threshold. For example, with a threshold of 20 and 5 battlefields (interactive scenes) allocated to the successfully matched first and second groups, a first group of 130 battle objects (interactive objects) and a second group of 140 can each field at most 100 battle objects across the 5 battlefields. Pre-battle strategy deployment is therefore needed: suitable battle objects (interactive objects) are chosen to enter a given battlefield according to the number and ability of the opposing group's battle objects that have already entered it. To inspect the details of the opposing group's battle objects (interactive objects) in a certain battlefield (interactive scene), one can enter that battlefield to scout.
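A small sketch of the per-scene capacity rule, using the example values above (5 scenes per matched pair, at most 20 interactive objects of each group per scene); the data layout is an assumption.

SCENES_PER_MATCH = 5
CAP_PER_GROUP = 20          # threshold from the example above

def can_enter(scene_counts, scene_id, group_id):
    # scene_counts: {scene_id: {group_id: objects already inside}}.
    return scene_counts[scene_id].get(group_id, 0) < CAP_PER_GROUP

# With 5 scenes and a cap of 20 per group per scene, a group can field at
# most 100 objects, so a 130-strong group must pick which members fight.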
In step 206, the matching information and the distributed interaction scenario information are sent to the clients of the corresponding group, and the group participating in the interaction performs interaction based on the matching information and the interaction scenario information.
Optionally, the matching information and the allocated interactive scene information may be sent to the clients of the corresponding groups separately or together. Taking separate sending as an example, the matching information may be sent to the clients of the corresponding groups in either of the following ways:
the first method is as follows: and storing the matching information of every two groups which are successfully matched independently, and sending the matching information to the clients of the two groups corresponding to the matching information. For example, if the first group and the second group are successfully matched, the matching information is sent to the clients of the first group and the second group, and the matching information may include the name of the group, the ID number of the group, the score of the group, the number of interactive objects in the group, and the like.
The second way: the matching information of all successfully matched groups is stored together and sent to the clients of all the groups. Optionally, an interaction table is generated from the matching information of all successfully matched groups. The interaction table contains the information of every successfully matched group, which may include the group name, the group ID number, the group score, the number of interactive objects in the group, and the like. For example, in the group battle event in a game, after all the participating groups are successfully matched, a battle table (interaction table) as shown in fig. 5 can be generated from the matching information; the table shows that the first group and the second group are each other's opponents, the ID number of the first group being 11111 and that of the second group 22222.
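As an illustration of the second way, a possible shape for interaction-table entries is sketched below; the field names and the score values are assumptions based on the contents listed above, while the ID numbers and member counts reuse the examples in this description.

interaction_table = [
    {"group_name": "first group", "group_id": 11111,
     "score": 9800, "members": 130, "opponent_id": 22222},
    {"group_name": "second group", "group_id": 22222,
     "score": 9650, "members": 140, "opponent_id": 11111},
]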
After the matching information and the allocated interactive scene information are sent to the clients of the corresponding groups, the groups participating in the interaction interact based on them.
After the groups participating in the interaction interact based on the matching information and the allocated interactive scene information, the method further includes: for each pair of successfully matched groups, acquiring their interaction results in all the allocated interactive scenes, and taking those results as the total interaction result of the pair.
For each pair of successfully matched groups, during the interaction in each interactive scene, an interactive object whose interaction failures reach a certain number of times is kicked out of the scene, and the result in each interactive scene is judged by the number of surviving interactive objects of the two groups in that scene.
Optionally, the rule for determining the result of the interaction in each interaction scenario is as follows:
starting 2 minutes after the interaction begins, the interactive scene is checked every minute; if the number of interactive objects of one group in the interactive scene is 0, the opposing group wins immediately;
if there is still no winner when the interaction times out, the group with more interactive objects remaining in the interactive scene wins; if both sides have the same number of interactive objects remaining, the result of the two teams in that interactive scene is a tie.
The interaction results of every two successfully matched groups across all allocated interactive scenes are taken as their total interaction result. Optionally, the rule for the total interaction result is: once all interactive scenes have results, the side with more scene wins takes the overall victory; if the two groups have the same number of scene wins, their total interaction result is a tie.
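A sketch of these adjudication rules follows; the string results and function names are illustrative assumptions.

def scene_result(survivors_a, survivors_b, timed_out):
    # Wipe-out: the opposing group wins immediately.
    if survivors_a == 0:
        return "B"
    if survivors_b == 0:
        return "A"
    if timed_out:                   # timeout: more survivors wins, equal = tie
        if survivors_a > survivors_b:
            return "A"
        if survivors_b > survivors_a:
            return "B"
        return "tie"
    return "undecided"              # keep checking every minute

def total_result(scene_results):
    # Overall winner = more scene wins; equal wins = tie.
    wins_a = scene_results.count("A")
    wins_b = scene_results.count("B")
    return "A" if wins_a > wins_b else ("B" if wins_b > wins_a else "tie")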
In the embodiment of the application, in the process of object matching, all groups are grouped according to the scores of the groups participating in the interaction; within each interactive group a reference number of groups are matched pairwise, and the groups left unmatched across all interactive groups are then matched randomly, so that some groups meet opponents of comparable ability while others meet opponents of noticeably different ability. This helps the pleasure and the fatigue of the interactive objects in the group interaction reach a reasonable balance point, improves the matching degree of the objects, produces a better group interaction effect, and thereby increases the interaction rate and improves the user experience.
Based on the implementation environment shown in fig. 1, an embodiment of the present application provides an object matching method, which is applied to the terminal 11 in the implementation environment shown in fig. 1 as an example. As shown in fig. 6, the method provided by the embodiment of the present application may include the following steps:
in step 601, obtaining matching information of a group, where the matching information includes information of a first group and information of a second group that are interactive, and the first group is a group in which an interactive object of a current client is located; the group matching information is obtained by the server determining the number of interactive groups based on the number of groups participating in interaction, grouping all groups according to the score of each group and the number of the interactive groups, matching the groups with reference number in the interactive groups pairwise, and performing random matching on the groups which are not matched in all the interactive groups.
The server determines the number of interactive groups based on the number of groups participating in interaction, groups all groups according to the score of each group and the number of the interactive groups, matches groups with reference number in the interactive groups in pairs, and randomly matches groups which are not matched in all the interactive groups to obtain matching information. And the server sends the matching information to the terminals where all the interactive objects of the corresponding group are located.
And the terminal receives the matching information sent by the server, namely the information of the matched group is obtained. The first group is assumed to be the group where the interactive object of the current client is located, and the second group is the group matched with the first group for interaction. The interactive object of the current client may refer to an object which is logged in the client of the current terminal and participates in the interaction, and the logging manner includes but is not limited to: account password login, two-dimensional code scanning login, communication number login and the like.
Depending on how the server stores the matching information, the matching information acquired by the terminal falls into the following two types:
if the server stores the matching information of every two groups which are successfully matched separately, the matching information obtained by the terminal only comprises the information of the first group and the information of the second group.
If the matching information of all the groups successfully matched is stored together, the matching information acquired by the terminal includes the information of all the groups successfully matched, and the information of the first group and the information of the second group can be found in all the information. Optionally, if the server generates the interactive table according to the matching information of all the groups successfully matched, the terminal may obtain the interactive table. For example, in a group battle activity in a game, an interactive object of a current client may obtain a battle table (interactive table) as shown in fig. 5, and information of a first group and information of a second group performing the battle may be obtained in the battle table (interactive table).
In step 602, interactive scene information of a plurality of interactive scenes distributed by a server is obtained, and the number of interactive objects included in each interactive scene is within a threshold range.
The distribution process of the interactive scene information is detailed in step 205, and the interactive scene information of the interactive scene may be obtained when the matching information is obtained, or the matching information and the interactive scene information may be obtained separately, which is not limited in this application.
In step 603, the group is interacted with the interactive objects in the second group based on the matching information of the group and the interactive scene information of the plurality of interactive scenes.
In the screen of the current terminal, a plurality of selectable interactive scenes distributed by the server can be seen, and the interactive object of the current client can select any one of the interactive scenes for interaction. If the number of the interactive objects entering a certain interactive scene exceeds the threshold range, the interactive object of the current client cannot select the interactive scene.
The interaction with the interactive object in the second group based on the matching information of the group and the interactive scene information of the plurality of interactive scenes may include the following steps:
the method comprises the following steps: and entering a target interactive scene based on the interactive scene information of the multiple interactive scenes, and forming a team in the target interactive scene.
The screen of the current terminal displays the multiple interactive scenes and the number of interactive objects of each of the two groups that have already entered each scene. As shown in fig. 4, in the group battle activity in a game, 5 battlefields (interactive scenes) are displayed on the screen of the current terminal, together with the number of battle objects (interactive objects) of each side that have entered each battlefield (interactive scene): 3 battle objects of the first group and 4 of the second group have entered the second battlefield (interactive scene).
In each interactive scene, the number of interactive objects from each group must not exceed the threshold, for example 20. The interactive object of the current client selects an interactive scene in which the number of interactive objects has not yet reached the threshold as the target interactive scene. Within the target interactive scene, the current client's interactive object forms a team with other interactive objects of the first group that have entered the same scene. The number of interactive objects in a team cannot exceed a sixth threshold, for example 3, and a team consists of a team leader and team members. For example, assume the number of interactive objects of the first group entering the target interactive scene is 6: interactive object a1, interactive object a2, interactive object a3, interactive object a4, interactive object a5 and interactive object a6, where interactive object a1 is the interactive object of the current client. The interactive object a1 of the current client may form, with interactive object a2 and interactive object a3, a team A1 whose leader is interactive object a1.
Step two: and in the target interaction scene, filtering the currently displayed model of the interaction object according to the team forming result.
The team formation result includes, but is not limited to, the name of each team, the name of the interactive object serving as leader of each team, and the names of the individual interactive objects that are in no team. For example, as shown in fig. 7, assuming the number of interactive objects of the first group entering the target interactive scene is 6 (the current client's interactive object a1, and interactive objects a2, a3, a4, a5 and a6), the team formation result of the first group in the target interactive scene is: the current client's interactive object a1 forms, with interactive objects a2 and a3, a team A1 led by a1; interactive objects a4 and a5 form a team A2 led by a4; interactive object a6 fails to form a team and remains team-less. Similarly, assume the number of interactive objects of the second group entering the target interactive scene is also 6: interactive objects b1, b2, b3, b4, b5 and b6. The team formation result of the second group in the target interactive scene is: interactive objects b1 and b2 form a team B1 led by b1; interactive objects b3, b4 and b5 form a team B2 led by b3; interactive object b6 fails to form a team and remains team-less.
Filtering the currently displayed models of the interactive objects according to the team formation result includes: for interactive objects outside the team of the current client's interactive object, if such an object belongs to another team, only the model of that team's leader is displayed; if it belongs to no team, its own model is displayed.
As shown in fig. 7, the team of the current client's interactive object is A1, and the interactive objects in team A1 are the current client's interactive object a1, interactive object a2 and interactive object a3. In the target interactive scene, for interactive objects other than a1, a2 and a3, if an interactive object is in another team, the model of that team's leader is displayed and the models of its team members are not. For example, the model of interactive object a4, the leader of team A2, is displayed, while the model of interactive object a5, a member of team A2, is not; the model of interactive object b1, the leader of team B1, is displayed, while the model of interactive object b2, a member of team B1, is not; the model of interactive object b3, the leader of team B2, is displayed, while the models of interactive objects b4 and b5, members of team B2, are not. For interactive objects other than a1, a2 and a3 that are not in any team, their own models are displayed; for example, since neither interactive object a6 nor interactive object b6 is in a team, the model of a6 and the model of b6 are displayed.
Optionally, if the interactive object is in another team, displaying a model of the interactive object as the team leader in the other team, and displaying the number of the interactive objects in the team above the model of the interactive object as the team leader, where the number may be changed according to the survival condition of the interactive object during the interaction. As shown in fig. 8, (2) shown on the left side of the "team B1-team leader B1" represents that the number of interactive objects in the team B1 is 2.
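A minimal sketch of this model filter, assuming each interactive object carries its team and a leader flag; the dataclass and field names are illustrative, not from the patent.

from dataclasses import dataclass
from typing import Optional

@dataclass
class InteractiveObject:
    name: str
    group: str                    # "first" or "second"
    team: Optional[str] = None    # None = not in any team
    is_leader: bool = False

def visible_models(me: InteractiveObject, others):
    shown = []
    for obj in others:
        if me.team is not None and obj.team == me.team:
            shown.append(obj)     # teammates are always displayed
        elif obj.team is None or obj.is_leader:
            shown.append(obj)     # leaders stand in for their team members
    return shown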
Step three: displaying a reference number of models among the models of the other interactive objects, in priority order, within the filtered interactive object models, where the models of the other interactive objects are the models, other than the current client's interactive object model, within a seventh threshold range of the position of the current client's interactive object.
In the target interactive scene, the filtered models of the interactive objects include the models of all interactive objects in the team of the current client's interactive object, the models of interactive objects serving as team leaders in other teams, and the models of team-less interactive objects. Optionally, if a system character model exists in the target interactive scene, the filtered models also include the system character model; if an interactive object that became detached from its team due to unpredictable factors exists in the target interactive scene, the filtered models also include the model of that detached interactive object.
For example, according to the result of the team formation as shown in FIG. 7, the model of the filtered interactive object includes: a model of the interactive object a1, a model of the interactive object a2, a model of the interactive object a3, a model of the interactive object a4 (captain), a model of the interactive object a6, a model of the interactive object b1 (captain), a model of the interactive object b3 (captain), and a model of the interactive object b6 of the current client.
The reference number of models among the models of the other interactive objects in the filtered set are then displayed in priority order. The models of the other interactive objects are the models, other than the current client's interactive object model, within the seventh threshold range of the position of the current client's interactive object.
Optionally, the priority order is: the system character model; the models of the other interactive objects in the team of the current client's interactive object; the models of team leaders in the second group; the models of team-less interactive objects in the second group; the models of team leaders of the first group's teams other than the team of the current client's interactive object; the models of team-less interactive objects in the first group; and the models of interactive objects of the first or second group that became detached from their teams during the interaction.
According to the result of the formation shown in fig. 7, except for the model of the interactive object of the current client, in the filtered model of the interactive object, the priority order is in turn: a model of interactive object a2 and a model of interactive object a3, a model of interactive object b1 (captain) and a model of interactive object b3 (captain), a model of interactive object b6, a model of interactive object a4 (captain), and a model of interactive object a 6.
Within the seventh threshold range of the position of the current client's interactive object, the reference number of models are displayed in priority order. For example, if the reference number is 5, at most 5 models of other interactive objects are displayed, besides the model of the current client's interactive object, within the seventh threshold range of its position. Optionally, if two models have the same priority, the straight-line distance from each to the current client's interactive object model is obtained, and the nearer model is displayed preferentially.
Assuming that all the models of the filtered interactive objects are within the seventh threshold range of the position of the interactive object of the current client, and assuming that the reference number of the models is 5, as shown in fig. 8, a model of the interactive object a2, a model of the interactive object a3, a model of the interactive object b1 (captain), a model of the interactive object b3 (captain), and a model of the interactive object b6 are displayed.
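Step three can be sketched as a sort-and-truncate over the filtered models: rank by the priority order above, break ties by straight-line distance to the current client's interactive object, and keep at most the reference number. The priority_of callback and the 2-D positions are illustrative assumptions.

def display_models(candidates, me_pos, priority_of, reference=5):
    # candidates: list of (model, (x, y)) after filtering;
    # priority_of: model -> integer rank, lower = higher priority.
    def sort_key(entry):
        model, pos = entry
        dist = ((pos[0] - me_pos[0]) ** 2 + (pos[1] - me_pos[1]) ** 2) ** 0.5
        return (priority_of(model), dist)   # same priority: nearer first
    return [model for model, _ in sorted(candidates, key=sort_key)[:reference]]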
Step four: and interacting with the interactive objects in the second group according to the displayed model of the interactive objects.
And according to the displayed model of the interactive object, the interactive object of the current client interacts with the interactive object belonging to the second group.
As shown in fig. 8, according to the above steps, in addition to the model of the interactive object of the current client, the models of the interactive objects displayed on the screen of the current terminal are: a model of interactive object a2, a model of interactive object a3, a model of interactive object b1 (captain), a model of interactive object b3 (captain), and a model of interactive object b 6. Among them, the interactive object b1 (captain), the interactive object b3 (captain), and the interactive object b6 are interactive objects belonging to the second group. The interactive object of the current client interacts with the interactive object b1 (captain), the interactive object b3 (captain), and the interactive object b 6.
When the interactive object of the current client interacts with a team leader of the second group, it is in effect interacting with all the interactive objects that leader represents. For example, when the interactive object a1 of the current client interacts with interactive object b1 (team leader), it interacts with both interactive object b1 and interactive object b2 of the team that b1 represents.
After the interaction in the target interactive scene, the win-or-lose result of the first group and the second group in that scene can be judged from the interaction result. Interactive objects whose interaction failures reach a certain number of times are kicked out of the target interactive scene, and the outcome between the first group and the second group in the target interactive scene is judged by the number of surviving interactive objects of each group in the scene.
Optionally, the specific rule for determining the win or lose result of the first group and the second group in the target scene is as follows: if the number of the interactive objects of a certain group in the target interactive scene is 0, the opposite group wins immediately; if the interaction is still not win or lose after overtime, the group with a large number of interaction objects left in the target interaction scene wins; if the number of the interaction objects left in the target interaction scene is the same, the interaction results of the two teams in the target interaction scene are a tie.
In the embodiments of the present application, in the process of object matching, all groups are grouped according to the scores of the groups participating in the interaction; within each interactive group, a reference number of groups are matched in pairs, and the groups left unmatched across all interactive groups are then matched randomly, so that some groups are matched with opponents of comparable ability while others are matched with opponents whose ability differs more markedly. This helps the interest and fatigue of the interactive objects in the group interaction reach a reasonable balance point, improves the matching degree of the objects, makes the group interaction effect better, increases the interaction rate, and thus improves the user experience.
Based on the same technical concept, referring to fig. 9, an embodiment of the present application provides an apparatus for object matching, including:
a first obtaining module 901, configured to obtain the number of groups participating in interaction and a score of each group in the groups participating in interaction, where each group participating in interaction includes multiple interaction objects, and the score is used to indicate an interaction capability of the group;
a determining module 902, configured to determine the number of interactive groups according to the number of groups participating in the interaction;
a grouping module 903, configured to group all groups according to the score of each group and the number of interactive groups, where the number of interactive groups is one or more;
a matching module 904, configured to match, for each interactive group, a reference number of groups in the interactive group in pairs, and to randomly match the groups that remain unmatched in all the interactive groups (a sketch of this flow follows the module list below);
the allocating module 905 is configured to allocate a plurality of interactive scenes to every two groups that are successfully matched, where the number of interactive objects of each group included in each interactive scene is within a threshold range;
a sending module 906, configured to send the matching information and the distributed interaction scenario information to a client of a corresponding group, where the group participating in the interaction performs interaction based on the matching information and the interaction scenario information.
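For illustration, the cooperation of modules 902 to 904 can be sketched as below; slicing the score-sorted list into interactive groups and pairing an even count inside each slice are assumptions made for the sketch, and all names are hypothetical.

    import random

    def match_groups(groups, groups_per_interactive_group, reference_number):
        # Sort by score and slice into interactive groups (modules 902-903).
        ordered = sorted(groups, key=lambda g: g["score"])
        slices = [ordered[i:i + groups_per_interactive_group]
                  for i in range(0, len(ordered), groups_per_interactive_group)]
        pairs, leftovers = [], []
        # Inside each interactive group, match a reference number of groups
        # in twos, collecting the rest (module 904, first half).
        for s in slices:
            k = min(len(s), reference_number)
            k -= k % 2  # pairing in twos needs an even count
            pairs += [(s[i], s[i + 1]) for i in range(0, k, 2)]
            leftovers += s[k:]
        # Randomly match the unmatched groups across all interactive groups
        # (module 904, second half); an odd leftover stays unmatched.
        random.shuffle(leftovers)
        pairs += [(leftovers[i], leftovers[i + 1])
                  for i in range(0, len(leftovers) - 1, 2)]
        return pairs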
Optionally, referring to fig. 10, the apparatus further comprises:
a second obtaining module 907, configured to obtain, for every two groups that are successfully matched, the interaction results of the two groups in all the distributed interactive scenes, and to use those interaction results as the total interaction result of the two groups.
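How the per-scene results are combined into a total is not fixed by the embodiment; one plausible reading is a simple tally, sketched below with hypothetical labels.

    from collections import Counter

    def total_result(scene_results):
        # scene_results: per-scene outcomes such as ["first", "second", "tie"].
        tally = Counter(scene_results)
        if tally["first"] > tally["second"]:
            return "first group wins overall"
        if tally["second"] > tally["first"]:
            return "second group wins overall"
        return "overall tie"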
In the embodiments of the present application, in the process of object matching, all groups are grouped according to the scores of the groups participating in the interaction; within each interactive group, a reference number of groups are matched in pairs, and the groups left unmatched across all interactive groups are then matched randomly, so that some groups are matched with opponents of comparable ability while others are matched with opponents whose ability differs more markedly. This helps the interest and fatigue of the interactive objects in the group interaction reach a reasonable balance point, improves the matching degree of the objects, makes the group interaction effect better, increases the interaction rate, and thus improves the user experience.
Referring to fig. 11, an embodiment of the present application further provides an apparatus for object matching, where the apparatus includes:
a first obtaining module 1101, configured to obtain matching information of a group, where the matching information includes information of a first group and information of a second group that interact with each other, and the first group is the group where the interactive object of the current client is located; the matching information of the group is obtained by the server by determining the number of interactive groups based on the number of groups participating in the interaction, grouping all the groups according to the score of each group and the number of interactive groups, matching a reference number of groups in each interactive group in pairs, and randomly matching the groups that remain unmatched in all the interactive groups;
a second obtaining module 1102, configured to obtain interaction scene information of multiple interaction scenes distributed by a server, where a number of interaction objects included in each interaction scene is within a threshold range;
the interaction module 1103 is configured to interact with an interaction object in the second group based on the group matching information and the interaction scene information of the multiple interaction scenes.
Optionally, referring to fig. 12, the interaction module 1103 includes:
a team forming unit 11031, configured to enter a target interaction scene based on the matching information of the group and the interaction scene information of the multiple interaction scenes, and form a team in the target interaction scene;
a filtering unit 11032, configured to filter, in the target interactive scene, the models of the currently displayed interactive objects according to the result of the team formation;
a display unit 11033, configured to display, among the filtered models of the interactive objects, a reference number of models of other interactive objects according to a priority order, where the models of the other interactive objects are the models, other than the model of the interactive object of the current client, within a reference threshold range of the position of the interactive object of the current client;
and an interacting unit 11034, configured to interact with the interactive objects in the second group according to the displayed model of the interactive objects.
Optionally, the filtering unit 11032 is configured to: for a non-co-team interactive object outside the team where the interactive object of the current client is located, if the non-co-team interactive object is in another team, display the model of the interactive object serving as the team leader of that team; and if the non-co-team interactive object is not in any team, display the model of the non-co-team interactive object itself.
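A sketch of this filtering rule, under the assumption that each object record carries a team_id (None when teamless); one captain model stands in for each foreign team:

    def filter_displayed_models(non_co_team_objects):
        shown, teams_represented = [], set()
        for obj in non_co_team_objects:
            team = obj.get("team_id")
            if team is not None:
                # A teamed object is represented by its team leader's model,
                # shown once per team.
                if team not in teams_represented:
                    teams_represented.add(team)
                    shown.append({"team_id": team, "role": "captain"})
            else:
                # A teamless object is displayed as itself.
                shown.append(obj)
        return shown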
Optionally, the display unit 11033 is configured to display the reference number of models according to the following priority order: a system character model; models of the other interactive objects in the team where the interactive object of the current client is located; models of the team leaders of the teams in the second group; models of the teamless interactive objects in the second group; models of the team leaders of the teams in the first group other than the team where the interactive object of the current client is located; models of the teamless interactive objects in the first group; and models of the interactive objects of the first group or the second group that have left a team during the interaction.
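One way to encode this ordering is a small enumeration whose values feed the priority-based selection sketched earlier; the names and numeric values below are illustrative assumptions.

    from enum import IntEnum

    class DisplayPriority(IntEnum):
        # Lower value = displayed first, mirroring the order listed above.
        SYSTEM_CHARACTER = 0    # system character model
        OWN_TEAMMATE = 1        # other members of the current client's team
        ENEMY_TEAM_LEADER = 2   # team leaders of teams in the second group
        ENEMY_TEAMLESS = 3      # teamless objects in the second group
        ALLY_TEAM_LEADER = 4    # team leaders of other first-group teams
        ALLY_TEAMLESS = 5       # teamless objects in the first group
        DETACHED = 6            # objects that left a team during interaction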
In the embodiments of the present application, in the process of object matching, all groups are grouped according to the scores of the groups participating in the interaction; within each interactive group, a reference number of groups are matched in pairs, and the groups left unmatched across all interactive groups are then matched randomly, so that some groups are matched with opponents of comparable ability while others are matched with opponents whose ability differs more markedly. This helps the interest and fatigue of the interactive objects in the group interaction reach a reasonable balance point, improves the matching degree of the objects, makes the group interaction effect better, increases the interaction rate, and thus improves the user experience.
It should be noted that when the apparatus provided in the foregoing embodiments implements its functions, the division into the functional modules described above is merely illustrative; in practical applications, the functions may be assigned to different functional modules as needed, that is, the internal structure of the apparatus may be divided into different functional modules to implement all or part of the functions described above. In addition, the apparatus embodiments and the method embodiments provided above belong to the same concept; for their specific implementation processes, refer to the method embodiments, which are not described herein again.
Fig. 13 is a schematic structural diagram of an object matching device according to an embodiment of the present application. The device may be a server, and the server may be an individual server or a server cluster. Specifically:
the server includes a Central Processing Unit (CPU)1301, a system memory 1304 of a Random Access Memory (RAM)1302 and a Read Only Memory (ROM)1303, and a system bus 1305 connecting the system memory 1304 and the central processing unit 1301. The server also includes a basic input/output system (I/O system) 1306, which facilitates transfer of information between devices within the computer, and a mass storage device 1307 for storing an operating system 1313, application programs 1314, and other program modules 1315.
The basic input/output system 1306 includes a display 1308 for displaying information and an input device 1309, such as a mouse or keyboard, for a user to input information. The display 1308 and the input device 1309 are both connected to the central processing unit 1301 through an input/output controller 1310 connected to the system bus 1305. The basic input/output system 1306 may also include the input/output controller 1310 for receiving and processing input from a number of other devices, such as a keyboard, a mouse, or an electronic stylus. Similarly, the input/output controller 1310 also provides output to a display screen, a printer, or another type of output device.
The mass storage device 1307 is connected to the central processing unit 1301 through a mass storage controller (not shown) connected to the system bus 1305. The mass storage device 1307 and its associated computer-readable media provide non-volatile storage for the server. That is, the mass storage device 1307 may include a computer-readable medium (not shown) such as a hard disk or CD-ROM drive.
Without loss of generality, computer readable media may comprise computer storage media and communication media. Computer storage media includes volatile and nonvolatile, removable and non-removable media implemented in any method or technology for storage of information such as computer readable instructions, data structures, program modules or other data. Computer storage media includes RAM, ROM, EPROM, EEPROM, flash memory or other solid state memory technology, CD-ROM, DVD, or other optical storage, magnetic cassettes, magnetic tape, magnetic disk storage or other magnetic storage devices. Of course, those skilled in the art will appreciate that computer storage media is not limited to the foregoing. The system memory 1304 and mass storage device 1307 described above may be collectively referred to as memory.
According to various embodiments of the present application, the server may also be run on a remote computer connected through a network, such as the Internet. That is, the server may be connected to the network 1312 through a network interface unit 1311 coupled to the system bus 1305, or the network interface unit 1311 may be used to connect to other types of networks or remote computer systems (not shown).
The memory further includes one or more programs, which are stored in the memory and configured to be executed by the CPU. The one or more programs contain instructions for performing the method of object matching provided by the embodiments of the present application.
Fig. 14 is a schematic structural diagram of an object matching device according to an embodiment of the present application. The device may be a terminal, for example: a smartphone, a tablet computer, an MP3 (Moving Picture Experts Group Audio Layer III) player, an MP4 (Moving Picture Experts Group Audio Layer IV) player, a notebook computer, or a desktop computer. A terminal may also be referred to by other names such as user equipment, portable terminal, laptop terminal, or desktop terminal.
Generally, a terminal includes: a processor 1401, and a memory 1402.
Processor 1401 may include one or more processing cores, such as a 4-core processor, an 8-core processor, and so forth. The processor 1401 may be implemented in at least one hardware form of DSP (Digital Signal Processing), FPGA (Field-Programmable Gate Array), and PLA (Programmable Logic Array). Processor 1401 may also include a main processor and a coprocessor, where the main processor is a processor for Processing data in an awake state, and is also referred to as a Central Processing Unit (CPU); a coprocessor is a low power processor for processing data in a standby state. In some embodiments, the processor 1401 may be integrated with a GPU (Graphics Processing Unit) that is responsible for rendering and drawing content that the display screen needs to display. In some embodiments, processor 1401 may further include an AI (Artificial Intelligence) processor for processing computing operations related to machine learning.
Memory 1402 may include one or more computer-readable storage media, which may be non-transitory. Memory 1402 may also include high speed random access memory, as well as non-volatile memory, such as one or more magnetic disk storage devices, flash memory storage devices. In some embodiments, a non-transitory computer readable storage medium in memory 1402 is used to store at least one instruction for execution by processor 1401 to implement the method of object matching provided by the method embodiments herein.
In some embodiments, the terminal may further include: a peripheral device interface 1403 and at least one peripheral device. The processor 1401, the memory 1402, and the peripheral device interface 1403 may be connected by buses or signal lines. Each peripheral device may be connected to the peripheral device interface 1403 via a bus, signal line, or circuit board. Specifically, the peripheral device includes: at least one of radio frequency circuitry 1404, a touch display 1405, a camera assembly 1406, audio circuitry 1407, a positioning assembly 1408, and a power supply 1409.
The peripheral device interface 1403 can be used to connect at least one peripheral device related to I/O (Input/Output) to the processor 1401 and the memory 1402. In some embodiments, the processor 1401, memory 1402, and peripheral interface 1403 are integrated on the same chip or circuit board; in some other embodiments, any one or both of the processor 1401, the memory 1402, and the peripheral device interface 1403 may be implemented on a separate chip or circuit board, which is not limited in this embodiment.
The Radio Frequency circuit 1404 is used for receiving and transmitting RF (Radio Frequency) signals, also called electromagnetic signals. The radio frequency circuit 1404 communicates with communication networks and other communication devices via electromagnetic signals. The radio frequency circuit 1404 converts an electrical signal into an electromagnetic signal for transmission, or converts a received electromagnetic signal into an electrical signal. Optionally, the radio frequency circuit 1404 includes: an antenna system, an RF transceiver, one or more amplifiers, a tuner, an oscillator, a digital signal processor, a codec chipset, a subscriber identity module card, and so forth. The radio frequency circuit 1404 may communicate with other terminals via at least one wireless communication protocol. The wireless communication protocols include, but are not limited to: metropolitan area networks, various generations of mobile communication networks (2G, 3G, 4G, and 5G), wireless local area networks, and/or WiFi (Wireless Fidelity) networks. In some embodiments, the radio frequency circuit 1404 may further include NFC (Near Field Communication) related circuits, which are not limited in this application.
The display screen 1405 is used to display a UI (User Interface). The UI may include graphics, text, icons, video, and any combination thereof. When the display screen 1405 is a touch display screen, the display screen 1405 also has the ability to capture touch signals at or above the surface of the display screen 1405. The touch signal may be input to the processor 1401 for processing as a control signal. At this point, the display screen 1405 may also be used to provide virtual buttons and/or a virtual keyboard, also referred to as soft buttons and/or a soft keyboard. In some embodiments, there may be one display screen 1405, disposed on the front panel of the terminal; in other embodiments, there may be at least two display screens 1405, respectively disposed on different surfaces of the terminal or in a folded design; in still other embodiments, the display screen 1405 may be a flexible display, disposed on a curved surface or a folded surface of the terminal. The display screen 1405 may even be arranged in a non-rectangular irregular figure, i.e., a shaped screen. The display screen 1405 can be made of LCD (Liquid Crystal Display), OLED (Organic Light-Emitting Diode), and the like.
The camera assembly 1406 is used to capture images or video. Optionally, camera assembly 1406 includes a front camera and a rear camera. Generally, a front camera is disposed at a front panel of the terminal, and a rear camera is disposed at a rear surface of the terminal. In some embodiments, the number of the rear cameras is at least two, and each rear camera is any one of a main camera, a depth-of-field camera, a wide-angle camera and a telephoto camera, so that the main camera and the depth-of-field camera are fused to realize a background blurring function, and the main camera and the wide-angle camera are fused to realize panoramic shooting and VR (Virtual Reality) shooting functions or other fusion shooting functions. In some embodiments, camera assembly 1406 may also include a flash. The flash lamp can be a monochrome temperature flash lamp or a bicolor temperature flash lamp. The double-color-temperature flash lamp is a combination of a warm-light flash lamp and a cold-light flash lamp, and can be used for light compensation at different color temperatures.
The audio circuit 1407 may include a microphone and a speaker. The microphone is used for collecting sound waves of a user and the environment, converting the sound waves into electric signals, and inputting the electric signals to the processor 1401 for processing or inputting the electric signals to the radio frequency circuit 1404 to realize voice communication. For the purpose of stereo sound collection or noise reduction, a plurality of microphones can be arranged at different parts of the terminal respectively. The microphone may also be an array microphone or an omni-directional pick-up microphone. The speaker is then used to convert electrical signals from the processor 1401 or the radio frequency circuit 1404 into sound waves. The loudspeaker can be a traditional film loudspeaker or a piezoelectric ceramic loudspeaker. When the speaker is a piezoelectric ceramic speaker, the speaker can be used for purposes such as converting an electric signal into a sound wave audible to a human being, or converting an electric signal into a sound wave inaudible to a human being to measure a distance. In some embodiments, the audio circuit 1407 may also include a headphone jack.
The positioning component 1408 is used to locate the current geographic location of the terminal to implement navigation or LBS (Location Based Service). The positioning component 1408 may be a positioning component based on the GPS (Global Positioning System) of the United States, the BeiDou system of China, the GLONASS system of Russia, or the Galileo system of the European Union.
The power supply 1409 is used to supply power to the various components in the terminal. The power supply 1409 may be an alternating current power supply, a direct current power supply, a disposable battery, or a rechargeable battery. When the power supply 1409 includes a rechargeable battery, the rechargeable battery can support wired or wireless charging. The rechargeable battery may also be used to support fast-charge technology.
In some embodiments, the terminal also includes one or more sensors 1410. The one or more sensors 1410 include, but are not limited to: acceleration sensor 1411, gyroscope sensor 1412, pressure sensor 1413, fingerprint sensor 1414, optical sensor 1415, and proximity sensor 1416.
The acceleration sensor 1411 may detect the magnitude of acceleration on three coordinate axes of a coordinate system established with the terminal. For example, the acceleration sensor 1411 may be used to detect components of the gravitational acceleration in three coordinate axes. The processor 1401 can control the touch display 1405 to display a user interface in a landscape view or a portrait view according to the gravitational acceleration signal collected by the acceleration sensor 1411. The acceleration sensor 1411 may also be used for the acquisition of motion data of a game or a user.
The gyro sensor 1412 may detect a body direction and a rotation angle of the terminal, and the gyro sensor 1412 and the acceleration sensor 1411 may cooperate to collect a 3D motion of the user on the terminal. The processor 1401 can realize the following functions according to the data collected by the gyro sensor 1412: motion sensing (such as changing the UI according to a user's tilting operation), image stabilization at the time of photographing, game control, and inertial navigation.
The pressure sensor 1413 may be disposed on the side frame of the terminal and/or underneath the touch display 1405. When the pressure sensor 1413 is disposed on the side frame of the terminal, a holding signal of the user on the terminal can be detected, and the processor 1401 performs left-right hand recognition or shortcut operations according to the holding signal collected by the pressure sensor 1413. When the pressure sensor 1413 is disposed at the lower layer of the touch display 1405, the processor 1401 controls the operability controls on the UI interface according to the pressure operation of the user on the touch display 1405. The operability controls comprise at least one of a button control, a scroll bar control, an icon control, and a menu control.
The fingerprint sensor 1414 is used for collecting a fingerprint of a user, and the processor 1401 identifies the user according to the fingerprint collected by the fingerprint sensor 1414, or the fingerprint sensor 1414 identifies the user according to the collected fingerprint. Upon recognizing that the user's identity is a trusted identity, the processor 1401 authorizes the user to perform relevant sensitive operations, including unlocking the screen, viewing encrypted information, downloading software, making payments, changing settings, and the like. The fingerprint sensor 1414 may be disposed on the front, back, or side of the terminal. When a physical button or vendor Logo is provided on the terminal, the fingerprint sensor 1414 may be integrated with the physical button or vendor Logo.
The optical sensor 1415 is used to collect ambient light intensity. In one embodiment, processor 1401 can control the display brightness of touch display 1405 based on the ambient light intensity collected by optical sensor 1415. Specifically, when the ambient light intensity is high, the display luminance of the touch display 1405 is increased; when the ambient light intensity is low, the display brightness of the touch display 1405 is turned down. In another embodiment, the processor 1401 can also dynamically adjust the shooting parameters of the camera assembly 1406 according to the intensity of the ambient light collected by the optical sensor 1415.
The proximity sensor 1416, also known as a distance sensor, is typically provided on the front panel of the terminal. The proximity sensor 1416 is used to collect the distance between the user and the front face of the terminal. In one embodiment, when the proximity sensor 1416 detects that the distance between the user and the front face of the terminal gradually decreases, the processor 1401 controls the touch display 1405 to switch from the bright-screen state to the screen-off state; when the proximity sensor 1416 detects that the distance between the user and the front face of the terminal gradually increases, the processor 1401 controls the touch display 1405 to switch from the screen-off state to the bright-screen state.
Those skilled in the art will appreciate that the configuration shown in fig. 14 is not intended to be limiting and may include more or fewer components than those shown, or some components may be combined, or a different arrangement of components may be used.
In an exemplary embodiment, a computer device is also provided that includes a processor and a memory having at least one instruction, at least one program, set of codes, or set of instructions stored therein. The at least one instruction, at least one program, set of code, or set of instructions is configured to be executed by one or more processors to implement any of the above methods of object matching.
In an exemplary embodiment, there is also provided a computer readable storage medium having stored therein at least one instruction, at least one program, a set of codes, or a set of instructions which, when executed by a processor of a computer device, implement any of the above methods of object matching.
Alternatively, the computer-readable storage medium may be a ROM, a Random Access Memory (RAM), a CD-ROM, a magnetic tape, a floppy disk, an optical data storage device, and the like.
It should be understood that reference to "a plurality" herein means two or more. "and/or" describes the association relationship of the associated objects, meaning that there may be three relationships, e.g., a and/or B, which may mean: a exists alone, A and B exist simultaneously, and B exists alone. The character "/" generally indicates that the former and latter associated objects are in an "or" relationship.
The above-mentioned serial numbers of the embodiments of the present application are merely for description and do not represent the merits of the embodiments.
The above description is only exemplary of the present application and should not be taken as limiting the present application, and any modifications, equivalents, improvements and the like that are made within the spirit and principle of the present application should be included in the protection scope of the present application.

Claims (10)

1. A method of object matching, the method comprising:
acquiring the number of groups participating in interaction and the score of each group in the groups participating in interaction, wherein each group participating in interaction comprises a plurality of interaction objects, and the score is used for indicating the interaction capacity of the group;
determining the number of interactive groups according to the number of the groups participating in the interaction;
grouping all the groups according to the score of each group and the number of the interactive groups, wherein the number of the interactive groups is one or more;
for each interactive group, matching a reference number of groups in the interactive group in pairs; and randomly matching the groups that remain unmatched in all the interactive groups;
distributing a plurality of interactive scenes for every two groups successfully matched, wherein the number of interactive objects of each group contained in each interactive scene is within a threshold range;
and sending the matching information and the distributed interactive scene information to the client side of the corresponding group, wherein the group participating in the interaction carries out interaction based on the matching information and the interactive scene information.
2. The method of claim 1, wherein after the matching information and the distributed interactive scene information are sent to the clients of the corresponding groups, the method further comprises:
and for each two groups successfully matched, acquiring interaction results of each two groups successfully matched under all the allocated interaction scenes, and taking the interaction results of each two groups successfully matched under all the allocated interaction scenes as the total interaction result of each two groups successfully matched.
3. A method of object matching, the method comprising:
acquiring matching information of a group, wherein the matching information comprises information of a first group and information of a second group that interact with each other, and the first group is the group where the interactive object of the current client is located; the matching information of the group is obtained by a server by determining the number of interactive groups based on the number of groups participating in the interaction, grouping all the groups according to the score of each group and the number of interactive groups, matching a reference number of groups in each interactive group in pairs, and randomly matching the groups that remain unmatched in all the interactive groups;
acquiring interactive scene information of a plurality of interactive scenes distributed by the server, wherein the number of interactive objects contained in each interactive scene is within a threshold range;
and interacting with the interactive objects in the second group based on the matching information of the group and the interactive scene information of the plurality of interactive scenes.
4. The method of claim 3, wherein the interacting with the interactive object in the second group based on the matching information of the group and the interactive scene information of the plurality of interactive scenes comprises:
entering a target interaction scene based on the matching information of the group and the interaction scene information of the plurality of interaction scenes, and forming a team in the target interaction scene;
filtering a model of the currently displayed interactive object according to the result of the team formation in the target interactive scene;
displaying, among the filtered models of the interactive objects, a reference number of models of other interactive objects according to a priority order, wherein the models of the other interactive objects are the models, other than the model of the interactive object of the current client, within a reference threshold range of the position of the interactive object of the current client;
and interacting with the interactive objects in the second group according to the displayed model of the interactive objects.
5. The method of claim 4, wherein the filtering the model of the currently displayed interactive object according to the result of the team formation comprises:
for a non-co-team interactive object outside the team where the interactive object of the current client is located: if the non-co-team interactive object is in another team, displaying the model of the interactive object serving as the team leader of that team; and if the non-co-team interactive object is not in any team, displaying the model of the non-co-team interactive object itself.
6. The method of claim 4, wherein the priority order comprises:
the priority order is as follows: a system character model; models of the other interactive objects in the team where the interactive object of the current client is located; models of the team leaders of the teams in the second group; models of the teamless interactive objects in the second group; models of the team leaders of the teams in the first group other than the team where the interactive object of the current client is located; models of the teamless interactive objects in the first group; and models of the interactive objects of the first group or the second group that have left a team during the interaction.
7. An apparatus for object matching, the apparatus comprising:
the system comprises a first acquisition module, a second acquisition module and a third acquisition module, wherein the first acquisition module is used for acquiring the number of groups participating in interaction and the score of each group in the groups participating in interaction, each group participating in interaction comprises a plurality of interaction objects, and the score is used for indicating the interaction capacity of the group;
a determining module, configured to determine the number of interactive groups according to the number of groups participating in the interaction;
a grouping module, configured to group all the groups according to the score of each group and the number of interactive groups, wherein the number of interactive groups is one or more;
a matching module, configured to match, for each interactive group, a reference number of groups in the interactive group in pairs, and to randomly match the groups that remain unmatched in all the interactive groups;
a distribution module, configured to distribute a plurality of interactive scenes to every two groups that are successfully matched, wherein the number of interactive objects of each group contained in each interactive scene is within a threshold range;
and a sending module, configured to send the matching information and the distributed interactive scene information to the clients of the corresponding groups, wherein the groups participating in the interaction interact based on the matching information and the interactive scene information.
8. An apparatus for object matching, the apparatus comprising:
the system comprises a first acquisition module, a second acquisition module and a third acquisition module, wherein the first acquisition module is used for acquiring the matching information of groups, the matching information comprises the information of a first interactive group and the information of a second interactive group, and the first interactive group is the group where the interactive object of the current client is located; the matching information of the groups is obtained by the server through determining the number of interactive groups based on the number of groups participating in interaction, grouping all the groups according to the score of each group and the number of the interactive groups, matching the groups with reference number in the interactive groups in pairs, and randomly matching the groups which are not matched in all the interactive groups;
the second acquisition module is used for acquiring interactive scene information of a plurality of interactive scenes distributed by the server, and the number of interactive objects contained in each interactive scene is within a threshold range;
and the interaction module is used for interacting with the interaction objects in the second group based on the matching information of the group and the interaction scene information of the plurality of interaction scenes.
9. A computer device comprising a processor and a memory, the memory having stored therein at least one instruction which, when executed by the processor, implements a method of object matching as claimed in any one of claims 1 to 2, or a method of object matching as claimed in any one of claims 3 to 6.
10. A computer-readable storage medium having stored therein at least one instruction, which when executed, implements a method of object matching as claimed in any of claims 1-2, or a method of object matching as claimed in any of claims 3-6.
CN201910323381.XA 2019-04-22 2019-04-22 Object matching method, device, equipment and storage medium Active CN110025962B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201910323381.XA CN110025962B (en) 2019-04-22 2019-04-22 Object matching method, device, equipment and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201910323381.XA CN110025962B (en) 2019-04-22 2019-04-22 Object matching method, device, equipment and storage medium

Publications (2)

Publication Number Publication Date
CN110025962A CN110025962A (en) 2019-07-19
CN110025962B true CN110025962B (en) 2022-03-18

Family

ID=67239491

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201910323381.XA Active CN110025962B (en) 2019-04-22 2019-04-22 Object matching method, device, equipment and storage medium

Country Status (1)

Country Link
CN (1) CN110025962B (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111905377B (en) * 2020-08-20 2021-12-10 腾讯科技(深圳)有限公司 Data processing method, device, equipment and storage medium


Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104063574A (en) * 2013-06-03 2014-09-24 腾讯科技(深圳)有限公司 Team battle matching method and server
KR20150012327A (en) * 2013-07-24 2015-02-04 (주)위메이드엔터테인먼트 Method of matching players in online game's PvP mode, a game server including PvP system, Computer readable storage medium of recording the method
CN106202142A (en) * 2016-05-24 2016-12-07 北京畅游天下网络技术有限公司 Object matching method in a kind of MMORPG game and server
CN106730850A (en) * 2016-12-16 2017-05-31 网易(杭州)网络有限公司 Game partners matching process and device
CN108744525A (en) * 2018-05-29 2018-11-06 深圳市紫石文化传播有限公司 A kind of matching process and device
CN109513215A (en) * 2018-11-23 2019-03-26 腾讯科技(深圳)有限公司 A kind of object matching method, model training method and server

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
Research on the Balance of MOBA Games; Lin Aokai; China Master's Theses Full-text Database, Information Science and Technology Series; 2015-09-15; Vol. 2015, No. 09; full text *
More than skills: A novel matching proposal for multiplayer video games;Nadja Stroh-Marauna;《Entertainment Computing》;20171218;第25卷;全文 *

Also Published As

Publication number Publication date
CN110025962A (en) 2019-07-19

Similar Documents

Publication Publication Date Title
CN110755850B (en) Team forming method, device, equipment and storage medium for competitive game
CN112704883B (en) Method, device, terminal and storage medium for grouping virtual objects in virtual environment
CN107982918B (en) Game game result display method and device and terminal
CN111050189B (en) Live broadcast method, device, equipment and storage medium
CN111318026B (en) Team forming method, device, equipment and storage medium for competitive game
CN110496392B (en) Virtual object control method, device, terminal and storage medium
CN109771955B (en) Invitation request processing method, device, terminal and storage medium
CN112827166B (en) Card object-based interaction method and device, computer equipment and storage medium
CN108579075B (en) Operation request response method, device, storage medium and system
CN112915538A (en) Method and device for displaying game information, terminal and storage medium
CN110548277B (en) Method, device and equipment for acquiring hand cards in card game program and readable medium
CN109806583B (en) User interface display method, device, equipment and system
CN111589116A (en) Method, device, terminal and storage medium for displaying function options
CN112774185B (en) Virtual card control method, device and equipment in card virtual scene
CN110833695A (en) Service processing method, device, equipment and storage medium based on virtual scene
CN113599819A (en) Prompt message display method, device, equipment and storage medium
CN111760296B (en) Team forming processing method, device, terminal, server and storage medium
CN113413615A (en) Game invitation method, device, computer equipment and storage medium
CN113181647A (en) Information display method, device, terminal and storage medium
CN113144598A (en) Virtual exchange-matching reservation method, device, equipment and medium
CN110025962B (en) Object matching method, device, equipment and storage medium
CN112306332A (en) Method, device and equipment for determining selected target and storage medium
CN112156454A (en) Virtual object generation method and device, terminal and readable storage medium
CN111589117A (en) Method, device, terminal and storage medium for displaying function options
CN113413587B (en) Information determination method, device, equipment and medium for card sports

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant