Detailed Description
In the following description, numerous specific details are set forth in order to provide a thorough understanding of the present application. The application can, however, be implemented in many ways other than those described herein, and those skilled in the art can make similar modifications without departing from the spirit of the application; the application is therefore not limited to the specific implementations disclosed below.
The terminology used in the one or more embodiments of the present application is for the purpose of describing particular embodiments only and is not intended to be limiting of the one or more embodiments of the present application. As used in one or more embodiments of the present application and the appended claims, the singular forms "a," "an," and "the" are intended to include the plural forms as well, unless the context clearly indicates otherwise. It should also be understood that the term "and/or" as used in one or more embodiments of the present application refers to and encompasses any and all possible combinations of one or more of the associated listed items.
It will be understood that, although the terms first, second, etc. may be used herein in one or more embodiments of the present application to describe various information, this information should not be limited by these terms. These terms are only used to distinguish one type of information from another. For example, a first aspect may be termed a second aspect, and, similarly, a second aspect may be termed a first aspect, without departing from the scope of one or more embodiments of the present application. The word "if," as used herein, may be interpreted as "responsive to a determination," depending on the context.
In the present application, a scene control method and apparatus, a computing device, and a computer-readable storage medium are provided, which are described in detail in the following embodiments one by one.
FIG. 1 shows a block diagram of a computing device 100 according to an embodiment of the present application. The components of the computing device 100 include, but are not limited to, memory 110 and processor 120. The processor 120 is coupled to the memory 110 via a bus 130, and a database 150 is used to store data.
Computing device 100 also includes access device 140, which enables computing device 100 to communicate via one or more networks 160. Examples of such networks include the Public Switched Telephone Network (PSTN), a Local Area Network (LAN), a Wide Area Network (WAN), a Personal Area Network (PAN), or a combination of communication networks such as the Internet. Access device 140 may include one or more of any type of network interface (e.g., a Network Interface Card (NIC)), whether wired or wireless, such as an IEEE 802.11 Wireless Local Area Network (WLAN) interface, a Worldwide Interoperability for Microwave Access (WiMAX) interface, an Ethernet interface, a Universal Serial Bus (USB) interface, a cellular network interface, a Bluetooth interface, a Near Field Communication (NFC) interface, and so forth.
In one embodiment of the present application, the above-mentioned components of the computing device 100, as well as other components not shown in FIG. 1, may also be connected to each other, for example, by a bus. It should be understood that the block diagram of the computing device architecture shown in FIG. 1 is for purposes of example only and does not limit the scope of the present application. Other components may be added or replaced as desired by those skilled in the art.
Computing device 100 may be any type of stationary or mobile computing device, including a mobile computer or mobile computing device (e.g., tablet, personal digital assistant, laptop, notebook, netbook, etc.), a mobile phone (e.g., smartphone), a wearable computing device (e.g., smartwatch, smartglasses, etc.), or other type of mobile device, or a stationary computing device such as a desktop computer or PC. Computing device 100 may also be a mobile or stationary server.
The processor 120 may perform the steps of the scene control method shown in FIG. 2. FIG. 2 shows a flowchart of a scene control method according to an embodiment of the present application, which includes steps 202 to 206.
Step 202: loading a configuration file corresponding to the current scene of the program, and recording an operation record of the user.
In a specific embodiment, after a user enters a current scene, a configuration file corresponding to the scene is loaded. The scene may be a scene that a user needs to pass through currently, such as a current level, a sub-level in the level, a transition scene between levels, a map, and the like, which are well known to those skilled in the art and are not described herein.
Meanwhile, the operation record of the user in the current scene is recorded. The operation record may include: the cumulative number of levels the user has passed, the number of consecutive level failures, the number of consecutive level successes, the user's operation records within the same scene, and the like.
In another specific embodiment, the configuration data of the configuration file is generated from scene operation result data, uploaded by users, that corresponds to the scene configuration data. In the prior art, because a game has numerous scenes and different users differ in playing skill and habits, manually configuring the data is inefficient and yields poor results. In this embodiment, the user's terminal collects the operation result data produced in each scene under that scene's configuration data, and uploads the result data to a server. Optionally, the uploaded scene operation result data includes, but is not limited to, the level clearance state, the level score, and the time consumed by the level, and may further include the types and numbers of props used.
In another specific embodiment, the game server collects massive scene operation result data uploaded by user terminals to form paired big data of scene configuration data and operation result data; the big data is then analyzed by mathematical modeling to obtain an improved configuration file. The model may be a statistical model, such as cluster analysis or time-series analysis, or a machine-learning or data-mining model; those skilled in the art may select a model according to the actual situation, which is not limited herein. As shown in FIG. 3, the improved configuration file includes a two-dimensional array _diffs, each value of which represents a scene coefficient.
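As an illustrative sketch, a configuration file carrying the two-dimensional array _diffs might be represented as follows; the dictionary wrapper and all numeric values are hypothetical, only the array _diffs itself is described in the application:

```python
# Hypothetical sketch of the improved configuration file produced by the
# server-side big-data analysis; values are illustrative only.
config = {
    # _diffs[X][Y]: the scene coefficient selected by a step-interval
    # index X and a score-interval index Y
    "_diffs": [
        [1.0, 1.2, 1.5],
        [0.8, 1.0, 1.2],
        [0.6, 0.8, 1.0],
    ],
}

# each value in the two-dimensional array represents one scene coefficient
scene_coefficient = config["_diffs"][1][2]  # 1.2
```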
Step 204: selecting, from the configuration file, a scene coefficient matching the user's operation record.
In one embodiment, the types of user operation records include, but are not limited to: the cumulative number of levels passed, the number of consecutive level failures, the number of consecutive level successes, the operation record within the same level, and the like. In this step, whether the current situation meets the scene's adjustment condition is determined from the different types of user operation record data; if so, a scene coefficient matching the operation record is selected from the configuration file loaded in step 202.
Step 206: applying the scene coefficient to the current scene of the program.
In one embodiment, the scene coefficients obtained from the configuration file are applied to the current scene, thereby adjusting the game difficulty of the current scene.
In the scheme of this embodiment, compared with having product staff subjectively define the configuration data of each game scene, the configuration data determined here better matches how the game is actually played, which improves both the accuracy of the match between configuration data and game scene and the efficiency of generating the configuration file.
Another embodiment of the present application describes how to adjust a scene according to an operation record of the user; this embodiment is described below with reference to code:
As shown in FIG. 4, the configuration file of the scene further includes a one-dimensional step-interval array _steps[i] and a one-dimensional score-interval array _accs[j]; each value in the step-interval array corresponds to a step count, and each value in the score-interval array corresponds to a score proportion.
In one embodiment, the operation record of the user within the same scene is kept as follows: in a match-elimination game, each time the user moves a piece, the step count is increased by 1; in a shooting game, each time the user fires, the shot count is increased by 1; in a sports game, each shot taken increases the shot count by 1; and so on.
While the user is playing a level, each time an operation is performed, the current accumulated step count is compared in turn with the values in the step-interval array; when the accumulated step count is smaller than the value at the current position of the step-interval array, the index i of that position is defined as X.
The current game score is then compared with the target total score of the current level to obtain the proportion still remaining before clearance:
var nowScore = GameManager.…;                        // current score
var offset = 1 - (float)nowScore / _maxLevelScore;   // proportion remaining before clearance
The offset is compared in turn with the values in the score-interval array; when the offset is greater than or equal to the value at the current position of the array, the index j of that position plus 1 is defined as Y.
X and Y are then used as the row and column indices of the two-dimensional array to locate the scene coefficient in the two-dimensional array of the configuration file, and the scene coefficient is applied to the current scene of the program:

diffCurrent = _diffs[X][Y];
This embodiment describes a scene adjustment mode while the user is playing a level. Following the array definitions established in the configuration file by the big-data model, the user's step count and score within the level are recorded; each time the user performs a game operation, the step count and score are compared with the corresponding array values in the configuration file to obtain the scene coefficient matching the user's current step count and score. In this way, the in-level difficulty adjustment is decomposed at a more reasonable granularity, the user is unlikely to notice the difficulty changing, and the user's game experience is greatly improved.
As shown in fig. 5, another embodiment of the present application describes how to adjust a scene according to another operation record of a user.
In this embodiment, when the user enters a game, the distance D between the current entry into the game and the last occurrence of the maximum number of consecutive level failures in the preceding plays is obtained.
Specifically, as shown in FIG. 5, when the user enters the game at level 9, the previous N plays are examined, where N is also established by the data model from the big data uploaded by terminals; assume N is 10. Among these 10 plays, the maximum number of consecutive failures, 3, occurred at level 5. The distance D therefore counts from the last of those failures onward: the last level-5 failure, plus the 1 level-5 success, plus the 1 level-6 success, plus the 2 failures and 1 success at level 7, plus the 1 failure and 1 success at level 8. That is, in FIG. 5, D = 2 (level 8) + 3 (level 7) + 1 (level 6) + 2 (level 5) = 8.
A corresponding interval score S is then obtained from the configuration file according to the maximum number of consecutive level failures. In this example, according to the maximum number of consecutive failures, 3, the corresponding interval score obtained from the configuration file in FIG. 7 is 10.
The distance D is compared with a distance threshold T; if D is greater than T, the distance coefficient A is set to a value less than or equal to 1, and the interval score S is multiplied by A to obtain an addition score. In this example, T is taken to be 3; since the distance 8 > the threshold 3, A is 1, and the addition score is 10 × 1 = 10.
In another example, as shown in FIG. 6, when the user enters the game at level 9, the previous N plays are examined, where N is again assumed to be 10. Among these 10 plays, the maximum number of consecutive failures, 4, occurred at level 8. The distance D is therefore the last level-8 failure plus the 1 level-8 success; that is, in FIG. 6, D = 2.
A corresponding interval score S is again obtained from the configuration file according to the maximum number of consecutive level failures. According to the 4 consecutive failures, the corresponding interval score obtained from the configuration file in FIG. 7 is 20.
The distance D is then compared with the distance threshold T; if D is less than or equal to T, the distance coefficient A is set to a value greater than 1, and the interval score S is multiplied by A to obtain the addition score. In this example, T is again 3; since the distance 2 < the threshold 3, A is 1.5, and the addition score is 20 × 1.5 = 30.
The scene coefficient corresponding to the addition score is then obtained from the configuration file: as shown in FIG. 7, the scene coefficient corresponding to the addition score 10 is 1, and the scene coefficient corresponding to the addition score 30 is 3. In this embodiment, the game difficulty corresponding to scene coefficient 3 is lower than that corresponding to scene coefficient 1.
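The distance and addition-score computation of the two walkthroughs above can be sketched as follows. The function name, the history encoding, and the default values of T and A are illustrative assumptions; in the described method, T, A, and the interval scores come from the server-generated configuration file:

```python
def addition_score(history, interval_scores, t=3, a_far=1.0, a_near=1.5):
    """Compute the addition score from a recent play history.

    history         -- chronological results of the previous N plays,
                       True = level cleared, False = failure
    interval_scores -- maps the maximum consecutive-failure count to
                       the interval score S (values as in FIG. 7)
    t, a_far, a_near -- distance threshold T and distance coefficients A
    """
    # find the longest run of consecutive failures and where it ends
    best_len, best_end, run = 0, -1, 0
    for i, cleared in enumerate(history):
        run = 0 if cleared else run + 1
        if run > best_len:
            best_len, best_end = run, i
    # distance D counts the last failure of that run and every play after it
    d = len(history) - best_end
    s = interval_scores[best_len]       # interval score S
    a = a_far if d > t else a_near      # distance coefficient A
    return s * a

interval_scores = {3: 10, 4: 20}        # per FIG. 7

# FIG. 5: three consecutive failures at level 5 -> D = 8 > T -> score 10
fig5 = [False, False, False, True,      # level 5
        True,                           # level 6
        False, False, True,             # level 7
        False, True]                    # level 8
score5 = addition_score(fig5, interval_scores)

# FIG. 6: four consecutive failures at level 8, then one success
# -> D = 2 <= T -> 20 * 1.5 = 30
fig6 = [True] * 5 + [False] * 4 + [True]
score6 = addition_score(fig6, interval_scores)

# the addition score then maps to a scene coefficient, e.g. per FIG. 7:
scene_coeff = {10: 1, 30: 3}
```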
This embodiment describes adjusting the scene according to the user's record across consecutive levels, specifically adjusting the current game's difficulty according to the user's record of consecutive level failures. The various judgment thresholds, such as the distance threshold T and the distance coefficient A, are computed from the big data submitted by terminals in combination with the data model, configured into the configuration file, delivered to the terminal, and continuously updated according to the game scene operation data submitted by users. The user's game experience thus better matches the real situation, the abruptness of difficulty changes is reduced, and the user's game experience is improved.
Another embodiment of the present application describes how to adjust the scene according to the change of the scene coefficient.
In a specific implementation, when a change in the scene coefficient of a scene is detected, the weights of the objects to appear in the current scene are adjusted according to the changed scene coefficient, and the appearance weight of each object to appear is determined anew.
as an example, in an elimination game, all of the potentially dropped chess pieces for the current level, e.g., 3-color red, yellow and blue chess pieces, are obtained, and the initial drop weight for each color chess piece is 1. When the scene coefficient of the current level is monitored to be changed, the falling weight of the chessmen with each color needs to be adjusted. If the scene coefficient is larger, the difficulty of the current checkpoint is reduced. In order to adjust the weight of the chessmen, as a way, the configuration file also stores the corresponding relation between the current pass clearance and the chessman falling weight, for example, for the pass 7, if the red chessman falling weight is increased to 2, the predicted pass clearance of the pass can be increased by 10%; if the weight of the blue chess piece falling is increased to 3, the estimated rate of passing the current level can be increased by 30%;
In another way, the weight adjustment for the dropping piece colours may be computed in real time from the current scene, for example by computing, from all pieces already dropped, which colour's weight should be increased.
In yet another way, before pieces drop, each cell of the board is traversed frame by frame, and when a cell is in a droppable state, the weight of the piece to drop is computed from the colours of the pieces in the cell's neighbouring cells.
After the appearance weight of each object is determined, its appearance probability is determined from the adjusted weight, and the object is controlled to appear in the current scene with the corresponding probability. In one embodiment, different weights correspond to different appearance probabilities. The correspondence between weight and probability may be kept in the configuration file and continuously updated according to the game scene operation data submitted by users, or computed in real time in the game as the weights change; when the weight of an object grows, its drop probability grows as well.
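A minimal sketch of weight-proportional appearance probability, with hypothetical colours and weights:

```python
import random

def drop_probabilities(weights):
    """Convert drop weights into drop probabilities; each probability is
    proportional to the corresponding weight."""
    total = sum(weights.values())
    return {colour: w / total for colour, w in weights.items()}

def choose_piece(weights, rng=random):
    """Pick the next piece colour with probability proportional to weight."""
    colours = list(weights)
    return rng.choices(colours, weights=[weights[c] for c in colours])[0]

# initial weights are all 1; after a coefficient change, blue is raised to 3
weights = {"red": 1, "yellow": 1, "blue": 3}
probs = drop_probabilities(weights)   # blue now drops 60% of the time
```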
After the appearance probability of each object is determined, the game controls the appearance of the objects. As an example, when pieces are allowed to drop, the manager traverses each cell of the board frame by frame; when a cell is in a droppable state, the cell below it is empty, and dropping is allowed, the cell is marked as droppable and a drop instruction is attached to it. Each piece in a cell has a fall state, which is the not-falling state by default. When a drop instruction is attached, the piece's state is marked as the start-falling state; while the fall action is performed, the state is the falling state; when the piece has landed, its frame is finished and the state is marked as the finished state, which reverts to the not-falling state on the next frame.
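The per-piece fall states described above can be sketched as a small state machine; the state names and transition function are illustrative, not the application's actual implementation:

```python
from enum import Enum, auto

class FallState(Enum):
    NOT_FALLING = auto()    # default state
    START_FALLING = auto()  # a drop instruction was attached this frame
    FALLING = auto()        # the fall action is in progress
    FINISHED = auto()       # landed; reverts to NOT_FALLING next frame

def next_state(state, drop_instruction=False, landed=False):
    """Advance one piece's fall state by one frame."""
    if state is FallState.NOT_FALLING and drop_instruction:
        return FallState.START_FALLING
    if state is FallState.START_FALLING:
        return FallState.FALLING
    if state is FallState.FALLING and landed:
        return FallState.FINISHED
    if state is FallState.FINISHED:
        return FallState.NOT_FALLING
    return state
```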
Corresponding to the above method embodiment, the present application further provides an embodiment of a scene control device, and fig. 8 shows a schematic structural diagram of a scene control device according to an embodiment of the present application. As shown in fig. 8, the apparatus 800 includes:
the loading module is used for loading the configuration file corresponding to the current scene when the program runs;
the recording module is used for recording the operation record of a user when the program runs;
the matching module is used for selecting a scene coefficient matched with the user operation record from a configuration file when a program runs;
a configuration module for applying the scene coefficients to a current scene of the program.
Optionally, the apparatus 800 further comprises:
the generating module is used for generating the configuration file from the scene operation result data, uploaded by the user, that corresponds to the scene configuration data.
Optionally, the configuration file further comprises:
a two-dimensional array, each value in the two-dimensional array representing a scene coefficient.
Optionally, the configuration file further comprises:
a one-dimensional step-interval array and a one-dimensional score-interval array.
Optionally, the matching module is configured to:
while the user is playing a level, compare, each time an operation is performed, the current accumulated step count in turn with the values in the step-interval array, and when the accumulated step count is smaller than the value at the current position of the array, define the index i of that position as X;
compare the current game score with the target total score of the current level:
offset = 1 - nowScore / _maxLevelScore
where nowScore is the current game score and _maxLevelScore is the target total score of the current level;
compare the offset in turn with the values in the score-interval array, and when the offset is greater than or equal to the value at the current position of the array, define the index j of that position plus 1 as Y;
and use X and Y as the row and column indices of the two-dimensional array to locate the scene coefficient in the two-dimensional array of the configuration file.
Optionally, the matching module is further configured to:
when the user enters a game, acquire the distance between the current entry and the last occurrence of the maximum number of consecutive level failures in the preceding plays;
acquire the corresponding interval score S from the configuration file according to the maximum number of consecutive level failures;
when the distance is greater than a distance threshold, multiply the interval score S by a distance coefficient less than or equal to 1 to obtain an addition score;
when the distance is less than or equal to the distance threshold, multiply the interval score S by a distance coefficient greater than 1 to obtain the addition score;
and acquire the corresponding scene coefficient from the configuration file according to the addition score, where a larger scene coefficient corresponds to a lower game difficulty.
The above is a schematic scheme of a scene control apparatus of the present embodiment. It should be noted that the technical solution of the scene control device and the technical solution of the scene control method belong to the same concept, and details that are not described in detail in the technical solution of the scene control device can be referred to the description of the technical solution of the scene control method.
There is also provided in an embodiment of the present application a computing device, including a memory, a processor, and computer instructions stored on the memory and executable on the processor, where the processor implements the steps of the scene control method when executing the instructions.
The above is an illustrative scheme of a computing device of the present embodiment. It should be noted that the technical solution of the computing device and the technical solution of the above-mentioned scene control method belong to the same concept, and details that are not described in detail in the technical solution of the computing device can be referred to the description of the technical solution of the above-mentioned scene control method.
An embodiment of the present application further provides a computer-readable storage medium, which stores computer instructions, and when the instructions are executed by a processor, the computer-readable storage medium implements the steps of the scene control method as described above.
The above is an illustrative scheme of a computer-readable storage medium of the present embodiment. It should be noted that the technical solution of the storage medium belongs to the same concept as the technical solution of the above-mentioned scene control method, and details that are not described in detail in the technical solution of the storage medium can be referred to the description of the technical solution of the above-mentioned scene control method.
The foregoing description of specific embodiments of the present application has been presented. Other embodiments are within the scope of the following claims. In some cases, the actions or steps recited in the claims may be performed in a different order than in the embodiments and still achieve desirable results. In addition, the processes depicted in the accompanying figures do not necessarily require the particular order shown, or sequential order, to achieve desirable results. In some embodiments, multitasking and parallel processing may also be possible or may be advantageous.
The computer instructions comprise computer program code, which may be in source code form, object code form, an executable file, some intermediate form, or the like. The computer-readable medium may include: any entity or device capable of carrying the computer program code, a recording medium, a USB flash drive, a removable hard disk, a magnetic disk, an optical disk, a computer memory, a Read-Only Memory (ROM), a Random Access Memory (RAM), an electrical carrier signal, a telecommunications signal, a software distribution medium, and the like. It should be noted that the content of the computer-readable medium may be increased or decreased as appropriate according to the requirements of legislation and patent practice in a jurisdiction; for example, in some jurisdictions, in accordance with legislation and patent practice, computer-readable media do not include electrical carrier signals and telecommunications signals.
It should be noted that, for the sake of simplicity, the above-mentioned method embodiments are described as a series of acts or combinations, but those skilled in the art should understand that the present application is not limited by the described order of acts, as some steps may be performed in other orders or simultaneously according to the present application. Further, those skilled in the art will appreciate that the embodiments described in this specification are presently considered to be preferred embodiments and that acts and modules are not required in the present application.
In the above embodiments, the descriptions of the respective embodiments have respective emphasis, and for parts that are not described in detail in a certain embodiment, reference may be made to related descriptions of other embodiments.
The preferred embodiments of the present application disclosed above are intended only to aid in the explanation of the application. Alternative embodiments are not exhaustive and do not limit the invention to the precise embodiments described. Obviously, many modifications and variations are possible in light of the above teaching. The embodiments were chosen and described in order to best explain the principles of the application and its practical applications, to thereby enable others skilled in the art to best understand and utilize the application. The application is limited only by the claims and their full scope and equivalents.