CN113384900B - Scene control method and device - Google Patents

Scene control method and device

Info

Publication number: CN113384900B (application CN202110759766.8A)
Authority: CN (China)
Prior art keywords: scene, user, current, configuration file, coefficient
Legal status: Active (the legal status is an assumption and is not a legal conclusion; Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed)
Other languages: Chinese (zh)
Other versions: CN113384900A (en)
Inventors: 王承振, 周欣
Current assignees (the listed assignees may be inaccurate; Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list): Beijing Yunyou Interactive Network Technology Co ltd, Online Tuyoo Beijing Technology Co ltd, Tuyou World Beijing Technology Co ltd
Original assignees: Beijing Yunyou Interactive Network Technology Co ltd, Online Tuyoo Beijing Technology Co ltd
Application filed by Beijing Yunyou Interactive Network Technology Co ltd and Online Tuyoo Beijing Technology Co ltd, with priority to CN202110759766.8A
Published as CN113384900A; application granted and published as CN113384900B

Classifications

    • A63F 13/77 — Video games: game security or game management aspects involving data related to game devices or game servers, e.g. configuration data, software version or amount of memory
    • A63F 13/46 — Video games: controlling the progress of the video game; computing the game score
    • A63F 13/60 — Video games: generating or modifying game content before or while executing the game program, e.g. authoring tools specially adapted for game development or game-integrated level editor
    • G06F 8/71 — Software engineering, software maintenance or management: version control; configuration management
    • Y02P 90/02 — Enabling technologies for greenhouse gas emissions mitigation: total factory control, e.g. smart factories, flexible manufacturing systems [FMS] or integrated manufacturing systems [IMS]

Abstract

The application provides a scene control method and apparatus, a computing device, and a computer-readable storage medium. The method includes: loading a configuration file corresponding to the current scene of a program, and recording the user's operation record; determining whether the operation record meets a scene adjustment condition, and if so, selecting a scene coefficient matching the operation record from the configuration file and applying it to the current scene of the program. The method improves both the efficiency of generating configuration files and how well they match actual play, makes scene changes more natural, and improves the user experience.

Description

Scene control method and device
Technical Field
The present application relates to the field of computer technologies, and in particular, to a scene control method and apparatus, a computing device, and a computer-readable storage medium.
Background
Electronic games provide players with multiple levels, and different scenes within those levels, so that the game presents varied visuals and gameplay. In the prior art, configuration data is used when designing a game level: changing the level configuration data changes the level's content, reducing monotony and simple repetition. At present, level configuration data is mainly written by developers based on experience. This approach is highly subjective, so levels often do not match the difficulty appropriate for each user, and it consumes substantial human resources. Manual adjustment is also coarse-grained, so level difficulty tends to change abruptly, the user easily notices the change, and the user experience suffers.
Disclosure of Invention
In view of this, embodiments of the present application provide a scene control method and apparatus, a computing device, and a computer-readable storage medium, so as to solve technical defects in the prior art.
According to a first aspect of embodiments of the present application, there is provided a scene control method, including:
loading a configuration file corresponding to the current scene of the program, and recording the user's operation record;
determining whether the user's operation record meets a scene adjustment condition, and if so, selecting a scene coefficient matching the operation record from the configuration file; and applying the scene coefficient to the current scene of the program.
According to a second aspect of embodiments of the present application, there is provided a scene control apparatus, including:
a loading module, configured to load, while the program is running, the configuration file corresponding to the current scene;
a recording module, configured to record the user's operation record while the program is running;
a judging module, configured to determine, while the program is running, whether the user's operation record meets the scene adjustment condition;
and a configuration module, configured to select, while the program is running, the scene coefficient matching the user's operation record from the configuration file and apply it to the current scene of the program.
According to a third aspect of embodiments herein, there is provided a computing device comprising a memory, a processor and computer instructions stored on the memory and executable on the processor, the processor implementing the steps of the scene control method when executing the instructions.
According to a fourth aspect of embodiments of the present application, there is provided a computer-readable storage medium storing computer instructions which, when executed by a processor, implement the steps of the scene control method.
According to the embodiments of the application, the configuration files for the scenes in a game are derived from aggregated user data, so they better reflect actual play experience; this is more accurate than having product staff define configuration files subjectively, and it improves development efficiency. Moreover, because many kinds of user operation records are considered and reasonably modeled when the configuration data is derived from user data, scene adjustment becomes finer-grained, the user is less likely to perceive the adjustment, and the gaming experience is greatly improved.
Drawings
FIG. 1 is a block diagram of a computing device provided by an embodiment of the present application;
fig. 2 is a flowchart of a scene control method provided in an embodiment of the present application;
FIG. 3 is a schematic diagram of a configuration file provided by an embodiment of the present application;
FIG. 4 is another schematic diagram of a configuration file provided by an embodiment of the present application;
FIG. 5 is a schematic diagram of a user operation record provided in an embodiment of the present application;
FIG. 6 is another schematic diagram of a user operation record provided in an embodiment of the present application;
FIG. 7 is another schematic diagram of a configuration file provided by an embodiment of the present application;
fig. 8 is a schematic structural diagram of a scene control apparatus according to an embodiment of the present application.
Detailed Description
In the following description, numerous specific details are set forth to provide a thorough understanding of the present application. The application can, however, be implemented in many ways other than those described herein, and those skilled in the art can make similar modifications without departing from its spirit; the application is therefore not limited to the specific implementations disclosed below.
The terminology used in the one or more embodiments of the present application is for the purpose of describing particular embodiments only and is not intended to be limiting of the one or more embodiments of the present application. As used in one or more embodiments of the present application and the appended claims, the singular forms "a," "an," and "the" are intended to include the plural forms as well, unless the context clearly indicates otherwise. It should also be understood that the term "and/or" as used in one or more embodiments of the present application refers to and encompasses any and all possible combinations of one or more of the associated listed items.
It will be understood that, although the terms first, second, etc. may be used herein in one or more embodiments of the present application to describe various information, these information should not be limited by these terms. These terms are only used to distinguish one type of information from another. For example, a first aspect may be termed a second aspect, and, similarly, a second aspect may be termed a first aspect, without departing from the scope of one or more embodiments of the present application. The word "if," as used herein, may be interpreted as "responsive to a determination," depending on the context.
In the present application, a scene control method and apparatus, a computing device, and a computer-readable storage medium are provided, which are described in detail in the following embodiments one by one.
FIG. 1 shows a block diagram of a computing device 100 according to an embodiment of the present application. The components of the computing device 100 include, but are not limited to, memory 110 and processor 120. The processor 120 is coupled to the memory 110 via a bus 130 and a database 150 is used to store data.
Computing device 100 also includes an access device 140 that enables the computing device 100 to communicate via one or more networks 160. Examples of such networks include the Public Switched Telephone Network (PSTN), a Local Area Network (LAN), a Wide Area Network (WAN), a Personal Area Network (PAN), or a combination of communication networks such as the Internet. The access device 140 may include one or more of any type of wired or wireless network interface (e.g., a Network Interface Card (NIC)), such as an IEEE 802.11 Wireless Local Area Network (WLAN) interface, a Worldwide Interoperability for Microwave Access (WiMAX) interface, an Ethernet interface, a Universal Serial Bus (USB) interface, a cellular network interface, a Bluetooth interface, a Near Field Communication (NFC) interface, and so forth.
In one embodiment of the present application, the above-mentioned components of the computing device 100 and other components not shown in fig. 1 may also be connected to each other, for example, by a bus. It should be understood that the block diagram of the computing device architecture shown in FIG. 1 is for purposes of example only and is not limiting as to the scope of the present application. Other components may be added or replaced as desired by those skilled in the art.
Computing device 100 may be any type of stationary or mobile computing device, including a mobile computer or mobile computing device (e.g., tablet, personal digital assistant, laptop, notebook, netbook, etc.), a mobile phone (e.g., smartphone), a wearable computing device (e.g., smartwatch, smartglasses, etc.), or other type of mobile device, or a stationary computing device such as a desktop computer or PC. Computing device 100 may also be a mobile or stationary server.
Wherein the processor 120 may perform the steps of the scene control method shown in fig. 2. Fig. 2 shows a flowchart of a scenario control method according to an embodiment of the present application, including step 202 to step 206.
Step 202: loading the configuration file corresponding to the current scene of the program, and recording the user's operation record.
In a specific embodiment, after the user enters the current scene, the configuration file corresponding to that scene is loaded. The scene may be any scene the user currently needs to pass, such as the current level, a sub-level within a level, a transition scene between levels, or a map; these are well known to those skilled in the art and are not described further here.
Meanwhile, the user's operation record in the current scene is recorded. It may include: the user's cumulative number of cleared levels, the number of consecutive level failures, the number of consecutive level successes, the user's operations within the current scene, and so on.
In another specific embodiment, the configuration data in the configuration file is generated from scene operation result data, uploaded by users, that corresponds to the scene configuration data. In the prior art, because a game has numerous scenes and different users differ in play skill and habit, manually configuring the data is inefficient and produces poor results. In this embodiment, each user's terminal collects the operation result data produced in a scene under that scene's configuration data and uploads it to the server. Optionally, the uploaded scene operation result data includes, but is not limited to, the level clearance state, the level score, and the time spent on the level, and may further include the types and quantities of props used.
In another specific embodiment, the game server aggregates the massive scene operation result data uploaded by user terminals into paired big data of scene configuration data and operation result data; this big data is analyzed via mathematical modeling to obtain an improved configuration file. The model may be statistical, such as cluster analysis or time-series analysis, or a machine-learning or data-mining model; those skilled in the art may choose according to the actual situation, and no limitation is imposed here. As shown in fig. 3, the improved configuration file includes a two-dimensional array _diffs, each value of which represents a scene coefficient.
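To make the data layout concrete, the following is a minimal sketch of the configuration file's core; only the two-dimensional array of scene coefficients comes from the description above, while the dictionary wrapper and all numeric values are illustrative assumptions.

```python
# Hypothetical sketch of the improved configuration file: a two-dimensional
# array _diffs in which every entry is one scene coefficient. The wrapper
# structure and the values are illustrative assumptions, not taken from fig. 3.
scene_config = {
    "_diffs": [
        [1, 1, 2],
        [1, 2, 3],
        [2, 3, 4],
    ],
}

# A row index and a column index, later derived from the user's operation
# record, select a single coefficient from the array:
x, y = 1, 2
print(scene_config["_diffs"][x][y])  # -> 3
```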
Step 204: selecting, from the configuration file, a scene coefficient matching the user's operation record.
In one embodiment, the types of user operation records include, but are not limited to: the user's cumulative number of cleared levels, the number of consecutive level failures, the number of consecutive level successes, the user's operations within the same level, and so on. In this step, the data of the different types of operation records is used to determine whether the current situation meets the scene adjustment condition; if so, a scene coefficient matching the operation record is selected from the configuration file loaded in step 202.
Step 206: applying the scene coefficients into a current scene of the program.
In one embodiment, the scene coefficients obtained from the configuration file are applied to the current scene, thereby adjusting the game difficulty of the current scene.
Compared with having product staff subjectively define the configuration data for each scene of a game, the configuration data determined in this embodiment better matches how the game is actually played, improving both the accuracy with which configuration data matches each game scene and the efficiency of generating the configuration files.
Another embodiment of the present application describes how to adjust a scene according to the user's operation record; this embodiment is described below in combination with code.
As shown in fig. 4, the scene's configuration file further includes a one-dimensional step-interval array steps[i] and a one-dimensional score-interval array _accs[j]; each value in the step-interval array corresponds to a step count, and each value in the score-interval array corresponds to a score proportion.
In one embodiment, for the user's operations within the same scene: in a match-three (elimination) game, each move of a piece adds 1 to the step count; in a shooting game, each shot adds 1 to the shot count; in a sports game, each shot attempt adds 1 to the attempt count; and so on.
While the user is attempting a level, after each operation the current accumulated step count is compared sequentially with the values in the step-interval array; when the accumulated step count is smaller than the value at index i of the step-interval array, that index i is defined as X;
The current game score is then compared with the target total score of the current level to obtain the proportion remaining before level clearance:
var nowScore = gamemanager.…; // current score
var offset = 1 - (float)nowScore / _maxLevelScore; // proportion remaining before level clearance
The offset is compared sequentially with the values in the score-interval array; when the offset is greater than or equal to the value at index j of the score-interval array, j + 1 is defined as Y;
X and Y are then used as the row and column indices of the two-dimensional array to locate the scene coefficient in the configuration file's two-dimensional array, and that coefficient is applied to the current scene of the program:
diffCurrent = _diffs[X][Y];
This embodiment describes how the scene is adjusted while the user is attempting a given level. Using the arrays defined in the configuration file by the big-data model, the user's step count and corresponding score are recorded at every step within the level; on each game operation, the step count and score are compared against the corresponding arrays in the configuration file to obtain a scene coefficient matching the user's current step count and score. In this way, the in-level difficulty adjustment is decomposed at a more reasonable granularity, the user is unlikely to notice the difficulty change, and the game experience is greatly improved.
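The lookup just described can be sketched as follows. The array names follow the figures, but their contents, the descending order of _accs, and the fallback indices are illustrative assumptions rather than values from the patent.

```python
# Sketch of the in-level scene-coefficient lookup. Array contents are
# illustrative assumptions, not values from the patent's configuration file.
steps = [10, 20, 30]       # step-interval array: accumulated-step bounds
_accs = [0.75, 0.5, 0.25]  # score-interval array: score-proportion thresholds
_diffs = [
    [1, 1, 2, 3],
    [1, 2, 3, 4],
    [2, 3, 4, 5],
]                          # scene coefficients, indexed as _diffs[X][Y]

def scene_coefficient(accumulated_steps, now_score, max_level_score):
    # X: first step interval whose bound the accumulated step count is below
    x = next((i for i, bound in enumerate(steps) if accumulated_steps < bound),
             len(steps) - 1)
    # offset: proportion of the target score still missing
    offset = 1 - now_score / max_level_score
    # Y: j + 1 for the first interval value that the offset reaches, else 0
    y = next((j + 1 for j, acc in enumerate(_accs) if offset >= acc), 0)
    return _diffs[x][y]

# 12 accumulated steps fall in the second interval (X = 1); a score of 40/100
# leaves offset 0.6 >= 0.5 at index 1, so Y = 2, selecting _diffs[1][2].
print(scene_coefficient(12, 40, 100))  # -> 3
```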
As shown in fig. 5, another embodiment of the present application describes how to adjust a scene according to another operation record of a user.
In this embodiment, when the user enters the game, a distance D is obtained between the current entry into the game and the last failure in the longest run of consecutive level failures among the user's previous attempts.
Specifically, as shown in fig. 5, when the user enters the game at level 9, the previous N game results are examined, where N is likewise derived from the data model built on big data uploaded by terminals; assume N = 10. Among those 10 games, the longest run of consecutive failures is 3, at level 5. The distance D then counts every attempt from the last failure in that run onward: the final level-5 failure and the subsequent level-5 success (2), the level-6 success (1), two level-7 failures and one level-7 success (3), and one level-8 failure and one level-8 success (2). That is, in fig. 5, D = 2 + 1 + 3 + 2 = 8.
A corresponding interval score S is then obtained from the configuration file according to the maximum number of consecutive failures. In this example, according to the maximum of 3 consecutive failures in fig. 7, the corresponding interval score obtained from the configuration file is 10.
The distance D is compared with a distance threshold T: if D is greater than T, the distance coefficient A is set to at most 1, and the interval score S is multiplied by A to obtain the addition score. In this example the distance threshold T is 3; since distance 8 > threshold 3, the distance coefficient A is 1 and the addition score is 10 × 1 = 10.
In another embodiment, as shown in fig. 6, when the user enters the game at level 9, the previous N game results are examined, again with N = 10. Among those 10 games, the longest run of consecutive failures is 4, at level 8. The distance D counts the final level-8 failure and the subsequent level-8 success, i.e., in fig. 6, D = 2.
A corresponding interval score S is obtained from the configuration file according to the maximum number of consecutive failures: per the 4 consecutive failures in fig. 7, the corresponding interval score is 20.
The distance D is then compared with the distance threshold T: if D is less than T, the distance coefficient A is set to greater than 1, and the interval score S is multiplied by A to obtain the addition score. In this embodiment the distance threshold T is 3; since distance 2 < threshold 3, the distance coefficient A is 1.5 and the addition score is 20 × 1.5 = 30.
The scene coefficient corresponding to the addition score is then obtained from the configuration file. As shown in fig. 7, addition score 10 corresponds to scene coefficient 1, and addition score 30 corresponds to scene coefficient 3. In this embodiment, the game difficulty corresponding to scene coefficient 3 is lower than that corresponding to scene coefficient 1.
This embodiment describes adjusting the scene according to the user's record across consecutive level attempts, specifically adjusting the current game's difficulty according to the user's record of consecutive level failures. The various judgment thresholds, such as the distance threshold T and the distance coefficient A, are computed from big data submitted by terminals in combination with the data model, configured into the configuration file, and issued to the terminals; they are continuously updated from the game scene operation data submitted by users, so the difficulty better matches real play, abrupt difficulty changes are reduced, and the user's game experience improves.
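The failure-based adjustment can be sketched as follows; the interval-score table and both coefficients mirror the worked examples above, whereas in a real deployment they would come from the big-data-derived configuration file.

```python
# Sketch of the consecutive-failure addition-score computation. The table and
# thresholds echo the examples above (figs. 5-7); real values would be issued
# in the configuration file.
INTERVAL_SCORES = {3: 10, 4: 20}  # max consecutive failures -> interval score S
DISTANCE_THRESHOLD_T = 3
COEF_FAR = 1.0    # distance coefficient A (at most 1) when D > T
COEF_NEAR = 1.5   # distance coefficient A (greater than 1) when D <= T

def addition_score(max_consecutive_failures, distance_d):
    s = INTERVAL_SCORES[max_consecutive_failures]
    a = COEF_FAR if distance_d > DISTANCE_THRESHOLD_T else COEF_NEAR
    return s * a

print(addition_score(3, 8))  # first example: S=10, D=8 > T, so 10 * 1.0 = 10.0
print(addition_score(4, 2))  # second example: S=20, D=2 <= T, so 20 * 1.5 = 30.0
```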
Another embodiment of the present application describes how to adjust the scene according to the change of the scene coefficient.
In a specific implementation, when a change in a scene's coefficient is detected, the weights of the objects about to appear in the current scene are adjusted according to the changed coefficient, and their appearance weights are re-determined.
As an example, in a match-three game, all piece colors that may drop in the current level are obtained (for example, red, yellow, and blue pieces), and each color's initial drop weight is 1. When the scene coefficient of the current level changes, the drop weight of each color must be adjusted; a larger scene coefficient means the current level's difficulty is lowered. To adjust the weights, one way is for the configuration file to also store the relationship between the current level's clearance rate and piece-drop weights: for level 7, for example, raising the red pieces' drop weight to 2 is predicted to raise the level's clearance rate by 10%, and raising the blue pieces' drop weight to 3 is predicted to raise it by 30%.
In another mode, the weight adjustment for the dropping colors can be computed in real time from the current scene, for example by computing which color's weight should be increased from all the pieces that have already dropped.
In another mode, before pieces drop, each cell of the board is traversed frame by frame; when a cell is in a droppable state, the weight of the piece to drop there is computed from the colors of the pieces in the adjacent cells.
After the appearance weights of the objects are determined, the appearance probability of each object is determined from its adjusted weight, and the objects are made to appear in the current scene with the corresponding probabilities. In one embodiment, different weights correspond to different appearance probabilities. The mapping between weight and probability may be kept in the configuration file and continuously updated from the game scene operation data submitted by users, or it may be computed in real time in the game as weights change; when an object's weight grows, its drop probability grows as well.
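Weight-proportional appearance of this kind can be sketched with the standard library; the color names and weight values below are illustrative, not taken from the patent.

```python
# Sketch of weight-proportional piece selection: each candidate color appears
# with probability equal to its weight divided by the total weight.
import random

def pick_piece(weights):
    """weights: mapping color -> drop weight; returns one color at random,
    with probability proportional to its weight."""
    colors = list(weights)
    return random.choices(colors, weights=[weights[c] for c in colors])[0]

weights = {"red": 1, "yellow": 1, "blue": 1}
weights["red"] = 2  # e.g. the scene coefficient changed, so red's weight rises

# red should now drop with probability 2 / (2 + 1 + 1) = 50%
random.seed(0)
counts = {c: 0 for c in weights}
for _ in range(10_000):
    counts[pick_piece(weights)] += 1
print(counts["red"] / 10_000)
```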
After the appearance probabilities are determined, the game controls the objects' appearance. As an example, when pieces are allowed to drop, the manager traverses each cell of the board frame by frame; when a cell is in a droppable state, the cell below it is empty, and dropping is allowed, that cell is marked droppable and a drop instruction is attached to it. Each piece in a cell carries a drop state, which defaults to "not dropping". When a drop instruction is attached, the piece's state is marked "start dropping"; during the subsequent drop action its state is "dropping"; in the frame in which the piece lands, its state is marked "finished"; and on the next frame it reverts to "not dropping".
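The per-piece drop-state cycle just described can be sketched as a small state machine; the state and parameter names are illustrative.

```python
# Sketch of the drop-state cycle: not dropping -> start dropping (when a drop
# instruction is attached) -> dropping -> finished (the frame the piece lands)
# -> back to not dropping on the next frame.
from enum import Enum, auto

class DropState(Enum):
    NOT_DROPPING = auto()
    START_DROPPING = auto()
    DROPPING = auto()
    FINISHED = auto()

def next_state(state, drop_instruction=False, landed=False):
    if state is DropState.NOT_DROPPING:
        return DropState.START_DROPPING if drop_instruction else state
    if state is DropState.START_DROPPING:
        return DropState.DROPPING
    if state is DropState.DROPPING:
        return DropState.FINISHED if landed else state
    return DropState.NOT_DROPPING  # FINISHED reverts on the next frame

s = DropState.NOT_DROPPING
s = next_state(s, drop_instruction=True)  # drop instruction attached
s = next_state(s)                         # drop action underway
s = next_state(s, landed=True)            # piece lands this frame
s = next_state(s)                         # next frame: ready to drop again
print(s.name)  # -> NOT_DROPPING
```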
Corresponding to the above method embodiment, the present application further provides an embodiment of a scene control device, and fig. 8 shows a schematic structural diagram of a scene control device according to an embodiment of the present application. As shown in fig. 8, the apparatus 800 includes:
the loading module is used for loading the configuration file corresponding to the current scene when the program runs;
the recording module is used for recording the operation record of a user when the program runs;
the matching module is used for selecting a scene coefficient matched with the user operation record from a configuration file when a program runs;
a configuration module for applying the scene coefficients to a current scene of the program.
Optionally, the apparatus 800 further comprises:
and the generating module is used for generating a configuration file according to the scene operation result data corresponding to the scene configuration data uploaded by the user.
Optionally, the configuration file further comprises:
a two-dimensional array, each value in the two-dimensional array representing a scene coefficient.
Optionally, the configuration file further comprises:
a one-dimensional step interval array and a one-dimensional integration interval array.
Optionally, the matching module is configured to:
while the user is attempting a level, compare, after each operation, the current accumulated step count sequentially with the values in the step-interval array, and when the accumulated step count is smaller than the value at index i of the step-interval array, define that index i as X;
comparing the current game point with the target total point of the current level:
offset=1-nowScore/_maxLevelScore
where nowScore is the current game score and _maxLevelScore is the target total score of the current level;
compare the offset sequentially with the values in the score-interval array, and when the offset is greater than or equal to the value at index j of the score-interval array, define j + 1 as Y;
and use X and Y as the row and column indices of the two-dimensional array to locate the scene coefficient in the configuration file's two-dimensional array.
Optionally, the matching module is further configured to:
when a user enters a game, acquiring the distance between the current level and the level at which the maximum number of consecutive level failures last occurred;
acquiring the corresponding interval score S from the configuration file according to the maximum number of consecutive level failures;
when the distance is greater than a distance threshold, multiplying the interval score S by a distance coefficient to obtain an addition score, wherein the distance coefficient is less than or equal to 1;
when the distance is less than or equal to the distance threshold, multiplying the interval score S by a distance coefficient to obtain an addition score, wherein the distance coefficient is greater than 1;
and acquiring the corresponding scene coefficient from the configuration file according to the addition score, wherein the higher the scene coefficient, the lower the game difficulty.
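The addition-score computation above can be sketched as follows. The concrete coefficient values are assumptions; the scheme only requires one coefficient no greater than 1 for distant failures and one greater than 1 for recent ones:

```python
def addition_score(interval_score, distance, distance_threshold,
                   far_coefficient=0.8, near_coefficient=1.2):
    """Scale the interval score S by a distance-dependent coefficient."""
    if distance > distance_threshold:
        # The consecutive failures lie far in the past: damp the bonus.
        return interval_score * far_coefficient   # far_coefficient <= 1
    # The failures are recent: boost the bonus so the next scene is easier.
    return interval_score * near_coefficient      # near_coefficient > 1
```

A higher addition score then maps to a higher scene coefficient, i.e. a lower game difficulty.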
The above is an illustrative scheme of the scene control apparatus of this embodiment. It should be noted that the technical solution of the scene control apparatus and the technical solution of the scene control method belong to the same concept; for details not described in the technical solution of the scene control apparatus, refer to the description of the technical solution of the scene control method.
There is also provided in an embodiment of the present application a computing device, including a memory, a processor, and computer instructions stored on the memory and executable on the processor, where the processor implements the steps of the scene control method when executing the instructions.
The above is an illustrative scheme of a computing device of the present embodiment. It should be noted that the technical solution of the computing device and the technical solution of the above-mentioned scene control method belong to the same concept, and details that are not described in detail in the technical solution of the computing device can be referred to the description of the technical solution of the above-mentioned scene control method.
An embodiment of the present application further provides a computer-readable storage medium, which stores computer instructions, and when the instructions are executed by a processor, the computer-readable storage medium implements the steps of the scene control method as described above.
The above is an illustrative scheme of a computer-readable storage medium of the present embodiment. It should be noted that the technical solution of the storage medium belongs to the same concept as the technical solution of the above-mentioned scene control method, and details that are not described in detail in the technical solution of the storage medium can be referred to the description of the technical solution of the above-mentioned scene control method.
The foregoing description of specific embodiments of the present application has been presented. Other embodiments are within the scope of the following claims. In some cases, the actions or steps recited in the claims may be performed in a different order than in the embodiments and still achieve desirable results. In addition, the processes depicted in the accompanying figures do not necessarily require the particular order shown, or sequential order, to achieve desirable results. In some embodiments, multitasking and parallel processing may also be possible or may be advantageous.
The computer instructions comprise computer program code, which may be in source code form, object code form, an executable file, some intermediate form, or the like. The computer-readable medium may include: any entity or device capable of carrying the computer program code, a recording medium, a USB flash drive, a removable hard disk, a magnetic disk, an optical disk, a computer memory, a read-only memory (ROM), a random access memory (RAM), an electrical carrier signal, a telecommunications signal, a software distribution medium, and the like. It should be noted that the content of the computer-readable medium may be appropriately increased or decreased as required by legislation and patent practice in a given jurisdiction; for example, in some jurisdictions, computer-readable media do not include electrical carrier signals and telecommunications signals.
It should be noted that, for the sake of simplicity, the above-mentioned method embodiments are described as a series of acts or combinations, but those skilled in the art should understand that the present application is not limited by the described order of acts, as some steps may be performed in other orders or simultaneously according to the present application. Further, those skilled in the art will appreciate that the embodiments described in this specification are presently considered to be preferred embodiments and that acts and modules are not required in the present application.
In the above embodiments, the descriptions of the respective embodiments have respective emphasis, and for parts that are not described in detail in a certain embodiment, reference may be made to related descriptions of other embodiments.
The preferred embodiments of the present application disclosed above are intended only to aid in the explanation of the application. Alternative embodiments are not exhaustive and do not limit the invention to the precise embodiments described. Obviously, many modifications and variations are possible in light of the above teaching. The embodiments were chosen and described in order to best explain the principles of the application and its practical applications, to thereby enable others skilled in the art to best understand and utilize the application. The application is limited only by the claims and their full scope and equivalents.

Claims (9)

1. A scene control method is applied to a program and comprises the following steps:
loading a configuration file corresponding to the current program scene, and recording the operation record of a user;
selecting a scene coefficient matched with the user operation record from a configuration file according to the user operation record;
applying the scene coefficients into a current scene of the program;
further comprising:
when a user enters a game, acquiring the distance between the current level and the level at which the maximum number of consecutive level failures last occurred;
acquiring the corresponding interval score S from the configuration file according to the maximum number of consecutive level failures;
when the distance is greater than a distance threshold, multiplying the interval score S by a distance coefficient to obtain an addition score, wherein the distance coefficient is less than or equal to 1;
when the distance is less than or equal to the distance threshold, multiplying the interval score S by a distance coefficient to obtain an addition score, wherein the distance coefficient is greater than 1;
and acquiring the corresponding scene coefficient from the configuration file according to the addition score, wherein the higher the scene coefficient, the lower the game difficulty.
2. The method of claim 1, wherein the configuration data of the configuration file is generated from scene operation result data corresponding to the scene configuration data uploaded by a user.
3. The method of claim 2, further comprising:
the configuration file comprises a two-dimensional array, and each value in the two-dimensional array represents a scene coefficient.
4. The method of claim 3, further comprising:
the configuration file comprises a one-dimensional step interval array and a one-dimensional score interval array;
when a user plays through a level, sequentially comparing the current accumulated game step count with the values in the step interval array after each move, and if the current accumulated step count is smaller than the value of the current step interval entry, defining the index i of that entry as X;
comparing the current game score with the target total score of the current level:
offset = 1 - nowScore / maxLevelScore
wherein nowScore is the current game score and maxLevelScore is the target total score of the current level;
sequentially comparing the offset with the values in the score interval array, and if the offset is greater than or equal to the value of the current score interval entry, defining the index j of that entry plus 1 as Y;
and using X and Y as the row and column numbers of the two-dimensional array, locating the scene coefficient in the two-dimensional array of the configuration file.
5. The method of claim 4, further comprising:
according to the scene coefficient of the current scene, adjusting the weight of an object to appear in the current scene and re-determining its appearance weight; and determining the appearance probability of the object according to its appearance weight, and controlling the object to appear in the current scene with the corresponding probability.
6. The method of claim 1, wherein the record of user actions includes at least one of:
the user's cumulative number of levels passed, the number of consecutive level failures, the number of consecutive level successes, or the user's operation record in the same scene.
7. An apparatus for scene control, comprising:
the loading module is used for loading the configuration file corresponding to the current scene when the program runs;
the recording module is used for recording the operation record of a user when the program runs;
the matching module is used for selecting a scene coefficient matched with the user operation record from a configuration file when a program runs;
a configuration module for applying the scene coefficients to a current scene of the program;
the matching module is further configured to:
when a user enters a game, acquiring the distance between the current level and the level at which the maximum number of consecutive level failures last occurred;
acquiring the corresponding interval score S from the configuration file according to the maximum number of consecutive level failures;
when the distance is greater than a distance threshold, multiplying the interval score S by a distance coefficient to obtain an addition score, wherein the distance coefficient is less than or equal to 1;
when the distance is less than or equal to the distance threshold, multiplying the interval score S by a distance coefficient to obtain an addition score, wherein the distance coefficient is greater than 1;
and acquiring the corresponding scene coefficient from the configuration file according to the addition score, wherein the higher the scene coefficient, the lower the game difficulty.
8. A computing device comprising a memory, a processor, and computer instructions stored on the memory and executable on the processor, wherein the processor implements the steps of the method of any one of claims 1-6 when executing the instructions.
9. A computer-readable storage medium storing computer instructions, which when executed by a processor, perform the steps of the method of any one of claims 1 to 6.
CN202110759766.8A 2021-07-06 2021-07-06 Scene control method and device Active CN113384900B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110759766.8A CN113384900B (en) 2021-07-06 2021-07-06 Scene control method and device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202110759766.8A CN113384900B (en) 2021-07-06 2021-07-06 Scene control method and device

Publications (2)

Publication Number Publication Date
CN113384900A CN113384900A (en) 2021-09-14
CN113384900B true CN113384900B (en) 2022-09-30

Family

ID=77625148

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110759766.8A Active CN113384900B (en) 2021-07-06 2021-07-06 Scene control method and device

Country Status (1)

Country Link
CN (1) CN113384900B (en)

Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111035922A (en) * 2019-10-31 2020-04-21 咪咕互动娱乐有限公司 Game assisting method, device and storage medium

Family Cites Families (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9675889B2 (en) * 2014-09-10 2017-06-13 Zynga Inc. Systems and methods for determining game level attributes based on player skill level prior to game play in the level
US10357718B2 (en) * 2017-02-28 2019-07-23 Electronic Arts Inc. Realtime dynamic modification and optimization of gameplay parameters within a video game application
CN107376353B (en) * 2017-07-14 2018-09-04 腾讯科技(深圳)有限公司 Outpost of the tax office configuration method and device
CN108031120B (en) * 2017-12-13 2020-10-16 腾讯科技(深圳)有限公司 Scene object control method and device, electronic equipment and storage medium
CN108920221B (en) * 2018-06-29 2023-01-10 网易(杭州)网络有限公司 Game difficulty adjusting method and device, electronic equipment and storage medium
CN109806584A (en) * 2019-01-24 2019-05-28 网易(杭州)网络有限公司 Scene of game generation method and device, electronic equipment, storage medium

Patent Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111035922A (en) * 2019-10-31 2020-04-21 咪咕互动娱乐有限公司 Game assisting method, device and storage medium

Also Published As

Publication number Publication date
CN113384900A (en) 2021-09-14

Similar Documents

Publication Publication Date Title
CN106390456B (en) The generation method and device of role-act in game
CN109513215B (en) Object matching method, model training method and server
CN108920221B (en) Game difficulty adjusting method and device, electronic equipment and storage medium
CN107335220B (en) Negative user identification method and device and server
CN106204597B (en) A kind of video object dividing method based on from the step Weakly supervised study of formula
CN110458109A (en) A kind of tealeaves disease recognition system and working method based on image recognition technology
CN104090573A (en) Robot soccer dynamic decision-making device and method based on ant colony algorithm
CN113018866A (en) Map resource loading method and device, storage medium and electronic device
CN114392560A (en) Method, device and equipment for processing running data of virtual scene and storage medium
CN115510042A (en) Power system load data filling method and device based on generation countermeasure network
CN112417990A (en) Examination student violation behavior identification method and system
Wang et al. Distortion recognition for image quality assessment with convolutional neural network
CN114024737B (en) Method, apparatus and computer readable storage medium for determining live room volume
CN113384900B (en) Scene control method and device
CN111282281A (en) Image processing method and device, electronic equipment and computer readable storage medium
CN113230650B (en) Data processing method and device and computer readable storage medium
CN112235571B (en) Video bit depth expansion method and device, electronic equipment and storage medium
CN113393063A (en) Match result prediction method, system, program product and storage medium
CN110852425A (en) Optimization-based neural network processing method and device and electronic system
CN110852224A (en) Expression recognition method and related device
CN114490618B (en) Ant-lion algorithm-based data filling method, device, equipment and storage medium
CN113946604B (en) Staged go teaching method and device, electronic equipment and storage medium
CN105279266B (en) A kind of method based on mobile Internet social activity picture prediction user context information
JP2022101461A (en) Joint sparse method based on mixed particle size used for neural network
CN110009749B (en) Virtual object positioning method, device, computing equipment and storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant
TR01 Transfer of patent right

Effective date of registration: 20230620

Address after: 100041 room a-0860, 2 / F, building 3, yard 30, Shixing street, Shijingshan District, Beijing

Patentee after: ONLINE TUYOO (BEIJING) TECHNOLOGY CO.,LTD.

Patentee after: Beijing Yunyou Interactive Network Technology Co.,Ltd.

Patentee after: Tuyou World (Beijing) Technology Co.,Ltd.

Address before: Block B, warm mountain life, No. 36, Hongjunying South Road, Chaoyang District, Beijing 100012

Patentee before: ONLINE TUYOO (BEIJING) TECHNOLOGY CO.,LTD.

Patentee before: Beijing Yunyou Interactive Network Technology Co.,Ltd.
