CN112675534A - Data processing method and device, server and storage medium - Google Patents


Info

Publication number
CN112675534A
Authority
CN
China
Prior art keywords
map
data processing
distance
real object
processing method
Prior art date
Legal status
Pending
Application number
CN202011527399.0A
Other languages
Chinese (zh)
Inventor
赵墨强
Current Assignee
Beijing Pixel Software Technology Co Ltd
Original Assignee
Beijing Pixel Software Technology Co Ltd
Priority date
Filing date
Publication date
Application filed by Beijing Pixel Software Technology Co Ltd
Priority to CN202011527399.0A
Publication of CN112675534A
Legal status: Pending

Abstract

The embodiments of the present application provide a data processing method and apparatus, a server, and a storage medium, relating to the technical field of data processing. The data processing method is applied to a server that stores a plurality of segmented maps and comprises the following steps: first, for each segmented map, acquiring the real objects in the maps adjacent to the segmented map; second, mapping the real objects in the adjacent maps into the segmented map to obtain corresponding mapping objects. In this way, game characters in adjacent maps are mapped into the current map, which alleviates the high data-processing pressure that arises in the prior art when the game characters attached to all game scenes must be displayed in order to preserve the player's large-scene experience.

Description

Data processing method and device, server and storage medium
Technical Field
The present application relates to the field of data processing technologies, and in particular, to a data processing method and apparatus, a server, and a storage medium.
Background
In the prior art, in order to ensure a large-scene experience for players, the NPCs and players attached to all scenes need to be displayed. As the number of game characters grows, server performance degrades and CPU usage becomes overloaded, so there is a problem of high data-processing pressure.
Disclosure of Invention
In view of the above, an object of the present application is to provide a data processing method and apparatus, a server, and a storage medium, so as to solve the problems in the prior art.
In order to achieve the above purpose, the embodiment of the present application adopts the following technical solutions:
In a first aspect, the present invention provides a data processing method applied to a server that stores a plurality of segmented maps, the data processing method comprising:
for each segmented map, acquiring a real object in a map adjacent to the segmented map; and
mapping the real object in the adjacent map into the segmented map to obtain a corresponding mapping object.
In an optional embodiment, the step of mapping the real object in the adjacent map into the segmented map to obtain a corresponding mapping object includes:
calculating a scaling ratio according to a contact distance from the real object in the adjacent map to a map boundary and a preset distance; and
mapping the real object into the segmented map according to the scaling ratio to obtain the corresponding mapping object.
In an optional embodiment, the contact distance includes a first contact distance, the first contact distance being the distance from the real object to the closest point on the boundary of the segmented map; the preset distance is a first preset distance, the first preset distance being the sum of the first contact distance and the distance from the real object to the boundary of the adjacent map that is far from the segmented map; the scaling ratio includes a first scaling ratio; and the step of calculating the scaling ratio according to the contact distance from the real object in the adjacent map to the map boundary and the preset distance includes:
calculating the first scaling ratio according to the first contact distance and the first preset distance.
In an optional embodiment, the contact distance includes a second contact distance, the second contact distance being the distance from the real object to the closest point on the boundary of the adjacent map; the preset distance includes a second preset distance, the second preset distance being the sum of the second contact distance and the distance from the real object to the boundary of the adjacent map that is far from the closest point; the scaling ratio includes a second scaling ratio; and the step of calculating the scaling ratio according to the contact distance from the real object in the adjacent map to the map boundary and the preset distance includes:
calculating the second scaling ratio according to the second contact distance and the second preset distance.
In an optional embodiment, the step of acquiring, for each segmented map, a real object in a map adjacent to the segmented map includes:
judging whether an object in the adjacent map is within a preset range of the segmented map; and
if so, taking the object within the preset range as the real object.
In an optional embodiment, the data processing method further comprises a step of obtaining the plurality of segmented maps, the step comprising:
performing segmentation processing on a preset map to obtain the plurality of segmented maps.
In an optional embodiment, the data processing method further includes:
judging whether the real object in the adjacent map has been updated; and
if so, mapping the updated real object into the segmented map to obtain an updated mapping object.
In a second aspect, the present invention provides a data processing apparatus applied to a server storing a plurality of segmented maps, the data processing apparatus comprising:
an object acquisition module, configured to acquire, for each segmented map, a real object in a map adjacent to the segmented map; and
an object mapping module, configured to map the real object in the adjacent map into the segmented map to obtain a corresponding mapping object.
In a third aspect, the present invention provides a server, comprising a memory and a processor, wherein the processor is configured to execute an executable computer program stored in the memory to implement the data processing method of any one of the foregoing embodiments.
In a fourth aspect, the present invention provides a storage medium having stored thereon a computer program which, when executed, implements the steps of the data processing method of any one of the preceding embodiments.
According to the data processing method and apparatus, the server, and the storage medium provided above, the real objects in the maps adjacent to each segmented map are mapped into that segmented map to obtain corresponding mapping objects. Game characters in adjacent maps are thereby mapped into the current map, which alleviates the high data-processing pressure that arises in the prior art when the game characters attached to all game scenes must be displayed in order to preserve the player's large-scene experience.
Drawings
In order to more clearly illustrate the technical solutions of the embodiments of the present application, the drawings required for the embodiments are briefly described below. It should be understood that the following drawings illustrate only some embodiments of the present application and therefore should not be regarded as limiting its scope; for those skilled in the art, other related drawings can be obtained from these drawings without inventive effort.
Fig. 1 is a block diagram of a data processing system according to an embodiment of the present disclosure.
Fig. 2 is a schematic flow chart of a data processing method according to an embodiment of the present application.
Fig. 3 is another schematic flow chart of a data processing method according to an embodiment of the present application.
Fig. 4 is another schematic flow chart of the data processing method according to the embodiment of the present application.
Fig. 5 is another schematic flow chart of the data processing method according to the embodiment of the present application.
Fig. 6 is a scene schematic diagram of object mapping provided in the embodiment of the present application.
Fig. 7 is another scene schematic diagram of object mapping according to an embodiment of the present application.
Fig. 8 is another schematic flow chart of a data processing method according to an embodiment of the present application.
Fig. 9 is a block diagram of a data processing apparatus according to an embodiment of the present application.
Reference numerals: 10 - data processing system; 100 - server; 200 - terminal device; 900 - data processing apparatus; 910 - object acquisition module; 920 - object mapping module.
Detailed Description
As game technology matures, players' demands on the game experience keep increasing, and the experience of very large scenes (the environments, buildings, machines, props, and the like in a game) is one of the important things players pursue. The larger the scene, the more attached objects there are, such as NPCs (system characters in the game, e.g., monsters) and players (the game characters operated by players in an online game), which leads to server performance problems, CPU overload, and the like.
In the prior art, a common approach is either not to build large scenes at all or to cut a large scene into several small scenes with no interaction between them. Because the small scenes do not interact, a player near the edge of one small scene cannot see the players and NPCs of the neighboring small scene, and when switching between small scenes the surrounding players and NPCs appear and disappear abruptly, so the player never truly feels a very large scene. That is, the conventional technique suffers both from high data-processing pressure and from the inability to see the players and NPCs of a neighboring small scene from near the edge of the current one.
In order to address at least one of the above technical problems, embodiments of the present application provide a data processing method and apparatus, a server, and a storage medium. The technical solutions of the present application are described below through possible implementations.
The defects of the above solutions were identified by the inventor through practice and careful study; therefore, the discovery of the above problems and the solution proposed by the present application should be regarded as the inventor's contribution to the present application.
For the purpose of making the objects, technical solutions, and advantages of the embodiments of the present application clearer, the technical solutions of the embodiments are described in detail below with reference to the drawings. It should be understood that the drawings in the present application are for illustrative and descriptive purposes only and are not used to limit its scope; additionally, the schematic drawings are not necessarily drawn to scale. The flowcharts used in this application illustrate operations implemented according to some embodiments of the present application. It should be understood that the operations of a flowchart may be performed out of order, and steps without a logical context may be performed in reverse order or simultaneously. Under the guidance of this application, one skilled in the art may add one or more other operations to a flowchart or remove one or more operations from it.
In addition, the described embodiments are only a part of the embodiments of the present application, and not all of the embodiments. The components of the embodiments of the present application, generally described and illustrated in the figures herein, can be arranged and designed in a wide variety of different configurations. Thus, the following detailed description of the embodiments of the present application, presented in the accompanying drawings, is not intended to limit the scope of the claimed application, but is merely representative of selected embodiments of the application. All other embodiments, which can be derived by a person skilled in the art from the embodiments of the present application without making any creative effort, shall fall within the protection scope of the present application.
In order to enable a person skilled in the art to make use of the present disclosure, the following embodiments are given. It will be apparent to those skilled in the art that the general principles defined herein may be applied to other embodiments and applications without departing from the spirit and scope of the application. Applications of the system or method of the present application may include web pages, plug-ins for browsers, client terminals, customization systems, internal analysis systems, or artificial intelligence robots, among others, or any combination thereof.
It should be noted that in the embodiments of the present application, the term "comprising" is used to indicate the presence of the features stated hereinafter, but does not exclude the addition of further features.
It should be noted that: like reference numbers and letters refer to like items in the following figures, and thus, once an item is defined in one figure, it need not be further defined and explained in subsequent figures.
Fig. 1 is a block diagram of a data processing system 10 provided in an embodiment of the present application, which provides a possible implementation manner of the data processing system 10, and referring to fig. 1, the data processing system 10 may include one or more of a server 100 and a terminal device 200.
The server 100 can be in communication connection with a plurality of terminal devices 200, and the server 100 acquires a plurality of divided maps on the terminal devices 200 and maps real objects in adjacent maps into the divided maps for display.
For the server 100, it should be noted that, in some embodiments, the server 100 may be a single server device or a server group. The set of servers may be centralized or distributed (e.g., server 100 may be a distributed system). In some embodiments, the server 100 may be local or remote to the terminal device 200. For example, the server 100 may access information and/or data stored in the terminal device 200 via a network. As another example, the server 100 may be directly connected to the terminal device 200 to access stored information and/or data. In some embodiments, the server 100 may be implemented on a cloud platform. By way of example only, the cloud platform may include a private cloud, a public cloud, a hybrid cloud, a resilient cloud, a community cloud (community cloud), a distributed cloud, a cross-cloud (inter-cloud), a multi-cloud (multi-cloud), and the like, or any combination thereof. In some embodiments, the server 100 may be implemented on the terminal device 200.
In some embodiments, the server 100 may include a processor. The processor may process information and/or data transmitted by terminal device 200 to perform one or more of the functions described herein. In some embodiments, a processor may include one or more processing cores (e.g., a single-core processor (S) or a multi-core processor (S)). Merely by way of example, a Processor may include a Central Processing Unit (CPU), an Application Specific Integrated Circuit (ASIC), an Application Specific Instruction Set Processor (ASIP), a Graphics Processing Unit (GPU), a Physical Processing Unit (PPU), a Digital Signal Processor (DSP), a Field Programmable Gate Array (FPGA), a Programmable Logic Device (PLD), a controller, a microcontroller Unit, a Reduced Instruction Set computer (Reduced Instruction Set computer, RISC), a microprocessor, or the like, or any combination thereof.
The network may be used for the exchange of information and/or data. In some embodiments, one or more components in data processing system 10 (e.g., server 100 and terminal device 200) may send information and/or data to other components. For example, the server 100 may acquire data from the terminal device 200 via a network. In some embodiments, the network may be any type of wired or wireless network, or combination thereof. Merely by way of example, the network may include a wired network, a wireless network, a fiber optic network, a telecommunications network, an intranet, the Internet, a Local Area Network (LAN), a Wide Area Network (WAN), a Wireless Local Area Network (WLAN), a Metropolitan Area Network (MAN), a Public Switched Telephone Network (PSTN), a Bluetooth network, a ZigBee network, a Near Field Communication (NFC) network, or the like, or any combination thereof.
In some embodiments, the network may include one or more network access points. For example, a network may include wired or wireless network access points, such as base stations and/or network switching nodes, through which one or more components of data processing system 10 may connect to the network to exchange data and/or information.
A database may be included in server 100 and may store data and/or instructions. In some embodiments, the database may store data obtained from the terminal device 200. In some embodiments, a database may store data and/or instructions for the exemplary methods described herein. In some embodiments, the database may include mass storage, removable storage, volatile read-write memory, or Read-Only Memory (ROM), among others, or any combination thereof. By way of example, mass storage may include magnetic disks, optical disks, solid state drives, and the like; removable memory may include flash drives, floppy disks, optical disks, memory cards, zip disks, tapes, and the like; volatile read-write memory may include Random Access Memory (RAM); the RAM may include Dynamic RAM (DRAM), Double Data Rate Synchronous Dynamic RAM (DDR SDRAM), Static RAM (SRAM), Thyristor-Based Random Access Memory (T-RAM), Zero-Capacitor RAM (Z-RAM), and the like. By way of example, ROM may include Mask Read-Only Memory (MROM), Programmable ROM (PROM), Erasable Programmable ROM (EPROM), Electrically Erasable Programmable ROM (EEPROM), Compact Disc ROM (CD-ROM), Digital Versatile Disc ROM (DVD-ROM), and the like. In some embodiments, the database may be implemented on a cloud platform. By way of example only, the cloud platform may include a private cloud, a public cloud, a hybrid cloud, a community cloud, a distributed cloud, a cross-cloud, a multi-cloud, an elastic cloud, or the like, or any combination thereof.
In some embodiments, the database may be connected to a network to communicate with one or more components in the data processing system 10 (e.g., the server 100 and the terminal device 200). One or more components in data processing system 10 may access data or instructions stored in a database via a network. In some embodiments, the database may be directly connected to one or more components in data processing system 10 (e.g., server 100 and terminal device 200). Alternatively, in some embodiments, the database may also be part of the server 100. In some embodiments, one or more components in data processing system 10 (e.g., server 100 and terminal device 200) may have access to a database.
Fig. 2 shows one of flowcharts of a data processing method provided in an embodiment of the present application, where the method is applicable to the server 100 shown in fig. 1 and is executed by the server 100 in fig. 1. It should be understood that, in other embodiments, the order of some steps in the data processing method of this embodiment may be interchanged according to actual needs, or some steps may be omitted or deleted. The flow of the data processing method shown in fig. 2 is described in detail below.
Step S210, for each segmented map, acquiring a real object in a map adjacent to the segmented map.
Step S220, mapping the real object in the adjacent map into the segmented map to obtain a corresponding mapping object.
According to the above method, the real objects in the maps adjacent to each segmented map are mapped into the segmented map to obtain corresponding mapping objects, so that the game characters in the adjacent maps are mapped into the current map. This alleviates the high data-processing pressure that arises in the prior art when the game characters attached to all game scenes must be displayed in order to preserve the player's large-scene experience.
Before step S210, the data processing method provided in the embodiment of the present application may further include a step of obtaining a plurality of divided maps. Therefore, on the basis of fig. 2, fig. 3 is a schematic flow chart of another data processing method provided in the embodiment of the present application, and referring to fig. 3, the data processing method may further include:
step S230, performing segmentation processing on a preset map to obtain a plurality of segmented maps.
In detail, when the map needs to be divided, the server 100 reads the size of the preset map and divides it; the division can be performed while the scene is running, without planning in advance. For example, the preset map provided in the embodiment of the present application may be 16384 x 16384 meters, and this large scene (the preset map) may be divided into 1024 small scenes (segmented maps) of 512 x 512 meters. With the abscissa of the preset map denoted x and the ordinate denoted y, the map with x in the range 0-512 and y in the range 0-512 may be taken as the 1st segmented map, the map with x in the range 0-512 and y in the range 512-1024 as the 2nd segmented map, and so on, up to the map with x in the range 15872-16384 and y in the range 15872-16384 as the 1024th segmented map.
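The division described above can be illustrated with a short sketch. It is only an illustration under assumed names (split_preset_map, TILE_SIZE, and the tuple layout are not part of the application); it reproduces the 1024-tile numbering used in the example.

PRESET_SIZE = 16384   # side length of the preset map, in meters
TILE_SIZE = 512       # side length of one segmented map, in meters

def split_preset_map(preset_size=PRESET_SIZE, tile_size=TILE_SIZE):
    """Return segmented maps as (map_id, x_min, x_max, y_min, y_max) tuples."""
    tiles_per_axis = preset_size // tile_size          # 32 tiles per axis
    segmented_maps = []
    map_id = 1
    for ix in range(tiles_per_axis):                   # column index along x
        for iy in range(tiles_per_axis):               # row index along y
            segmented_maps.append((
                map_id,
                ix * tile_size, (ix + 1) * tile_size,  # x range
                iy * tile_size, (iy + 1) * tile_size,  # y range
            ))
            map_id += 1
    return segmented_maps

tiles = split_preset_map()
assert len(tiles) == 1024
# tiles[0]  == (1, 0, 512, 0, 512)                  the 1st segmented map
# tiles[-1] == (1024, 15872, 16384, 15872, 16384)   the 1024th segmented map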
The objects in a map may include players and NPCs; after the segmentation process, each segmented map holds all of the players and NPCs located within it. It should be noted that some objects lie on the boundary between several maps rather than strictly inside a single segmented map; when the map is divided, an object on a boundary is assigned to the adjacent segmented map with the smaller ID. For example, if object a lies on the boundary between the 1st and 2nd segmented maps, object a belongs to the 1st segmented map after the division.
For step S210, it should be noted that the specific manner of acquiring the real object is not limited, and may be set according to the actual application requirement. For example, in an alternative example, step S210 may include a step of acquiring a real object according to the range. Therefore, on the basis of fig. 2, fig. 4 is a schematic flowchart of another data processing method provided in the embodiment of the present application, and referring to fig. 4, step S210 may include:
step S211, determining whether the object in the adjacent map is within the preset range of the divided map.
In the embodiment of the present application, when the object in the adjacent map is within the preset range of the divided map, it is determined that the real object exists in the adjacent map, and step S212 is executed; and when the object in the adjacent map is not within the preset range of the segmentation map, judging that no real object exists in the adjacent map.
In step S212, the object in the preset range is used as the real object.
Optionally, the specific value of the preset range is not limited, and may be set according to the actual application requirement. For example, in an alternative example, the specific value of the preset range may be 50 meters. The preset range may be a distance from the object to a boundary of the divided map, or may be a distance from the object to a center point of the divided map.
In detail, the NPCs and players within the preset range in the adjacent map each have a mapping object on the segmented map. A mapping object is the shadow, on the segmented map, of a real object in the adjacent map: it holds all of the data of the real object but cannot change that data actively; its data is modified only in synchronization with changes to the real object. In this way a scene only needs to manage the players, NPCs, and mapping objects on its own map, and the surrounding players and NPCs neither appear nor disappear suddenly when a player switches scenes, which improves the user experience.
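The range check of steps S211/S212 and the shadow relationship just described can be sketched as follows; all names (GameObject, collect_real_objects, the 50 m constant) are assumptions for illustration, not part of the application, and the preset range is taken here as the distance to the segmented map's boundary.

from dataclasses import dataclass, field

PRESET_RANGE = 50.0  # meters; the example value used in the text

@dataclass
class GameObject:
    obj_id: int
    x: float
    y: float
    data: dict = field(default_factory=dict)

def distance_to_boundary(obj, seg_map):
    """Shortest distance from an object to the rectangle of a segmented map
    (seg_map is a (map_id, x_min, x_max, y_min, y_max) tuple)."""
    _, x_min, x_max, y_min, y_max = seg_map
    dx = max(x_min - obj.x, 0.0, obj.x - x_max)
    dy = max(y_min - obj.y, 0.0, obj.y - y_max)
    return (dx * dx + dy * dy) ** 0.5

def collect_real_objects(adjacent_objects, seg_map, preset_range=PRESET_RANGE):
    """Steps S211/S212: keep only the adjacent-map objects inside the preset range."""
    return [o for o in adjacent_objects
            if distance_to_boundary(o, seg_map) <= preset_range]

def make_mapping_object(real_obj):
    """A mapping object mirrors all data of its real object; it is refreshed
    only when the real object changes and is never modified directly."""
    return GameObject(real_obj.obj_id, real_obj.x, real_obj.y, dict(real_obj.data))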
For step S220, it should be noted that the specific manner of mapping the real object is not limited, and may be set according to the actual application requirement. For example, in an alternative example, step S220 may include the step of mapping the real object according to a scaling ratio. Therefore, on the basis of fig. 2, fig. 5 is a schematic flowchart of another data processing method provided in the embodiment of the present application, and referring to fig. 5, step S220 may include:
Step S221, calculating a scaling ratio according to the contact distance from the real object in the adjacent map to the map boundary and a preset distance.
Step S222, mapping the real object into the segmented map according to the scaling ratio to obtain a corresponding mapping object.
For step S221, it should be noted that the specific way of calculating the scaling ratio is not limited and may be set according to the actual application requirements. For example, in an alternative example, the contact distance includes a first contact distance, which is the distance from the real object to the closest point on the boundary of the segmented map; the preset distance is a first preset distance, which is the sum of the first contact distance and the distance from the real object to the boundary of the adjacent map that is far from the segmented map; and the scaling ratio includes a first scaling ratio. Step S221 may thus include the following sub-step:
calculating the first scaling ratio according to the first contact distance and the first preset distance.
Referring to fig. 6, the segmented map is the area covered by the four points A-B-C-D, the adjacent map is the area covered by the four points A-E-F-D, and the real object is located at point M in the adjacent map. The first contact distance is the distance MA from point M to point A, the closest point on the boundary of the segmented map; the first preset distance is EA, the sum of the first contact distance MA and the distance ME from point M to point E on the far boundary of the adjacent map; and the first scaling ratio is calculated from the first contact distance MA and the first preset distance EA.
The adjacent map A-E-F-D may be mapped as a whole into the segmented map A-B-C-D to obtain a mapped adjacent map A1-E1-F1-D1; that is, the first preset distance EA is mapped to E1A1 in the segmented map, and the position M1 of the mapping object on E1A1 is then obtained from the position of the real object within the first preset distance and the first scaling ratio.
That is, when mapping the position of the real object, its abscissa may be mapped through the above sub-step of step S221 to obtain the mapped abscissa.
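One possible reading of this abscissa mapping is sketched below. The exact formula is not spelled out in the text, so the sketch assumes that the first scaling ratio is the ratio MA / EA and that the mapping object keeps on E1A1 the same relative position the real object has on EA; the coordinates of E1 and A1 and all names are assumptions.

def first_scaling(first_contact, first_preset):
    """Assumed form of the first scaling ratio: MA / EA, where MA is the
    distance from the real object to the shared boundary of the segmented
    map and EA is the full extent of the adjacent map along that axis."""
    return first_contact / first_preset

def map_abscissa(x_a1, x_e1, scale):
    """Place the mapping object M1 on segment E1A1 so that it keeps the same
    relative position that M has on EA (scale of the way from A1 toward E1)."""
    return x_a1 + scale * (x_e1 - x_a1)

# Example with the layout of fig. 6: the adjacent map spans x in [0, 512],
# the shared boundary A-D lies at x = 512, and E1A1 is assumed to run from
# x = 562 (E1) back to x = 512 (A1) inside the segmented map.
ma, me = 30.0, 482.0                      # M is 30 m from the shared boundary
scale = first_scaling(ma, ma + me)        # 30 / 512
x_m1 = map_abscissa(512.0, 562.0, scale)  # about 514.9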
For step S221, it should also be noted that the scaling ratio may be calculated in another way according to the actual application requirements. For example, in an alternative example, the contact distance includes a second contact distance, which is the distance from the real object to the closest point on the boundary of the adjacent map; the preset distance includes a second preset distance, which is the sum of the second contact distance and the distance from the real object to the boundary of the adjacent map that is far from the closest point; and the scaling ratio includes a second scaling ratio. Step S221 may thus include the following sub-step:
calculating the second scaling ratio according to the second contact distance and the second preset distance.
Referring to fig. 7, the segmented map is the area covered by the four points A-B-C-D, the adjacent map is the area covered by the four points A-E-F-D, and the real object is located at point N in the adjacent map. The second contact distance is the distance NE from point N to point E, the closest point on the boundary of the adjacent map; the second preset distance is EF, the sum of the second contact distance NE and the distance NF from point N to point F, the boundary point of the adjacent map far from the closest point; and the second scaling ratio is calculated from the second contact distance NE and the second preset distance EF.
The adjacent map A-E-F-D may likewise be mapped as a whole into the segmented map A-B-C-D to obtain a mapped adjacent map A1-E1-F1-D1; that is, the second preset distance EF is mapped to E1F1 in the segmented map, and the position N1 of the mapping object on E1F1 is then obtained from the position of the real object within the second preset distance and the second scaling ratio.
That is, when mapping the position of the real object, its ordinate may be mapped through the above sub-step of step S221 to obtain the mapped ordinate.
Further, when the real object is not a point but a game character with an area (the area being determined by a lateral distance and a longitudinal distance), the lateral distance may be scaled by the first scaling ratio and the longitudinal distance by the second scaling ratio to obtain a scaled real object. The scaled real object is then mapped into the segmented map to obtain a scaled mapping object, so that the area and the position of the real object are scaled together.
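The ordinate mapping and the area scaling can be sketched in the same spirit; as before, the ratio forms and the names are assumptions, not the application's own formulas.

def second_scaling(second_contact, second_preset):
    """Assumed form of the second scaling ratio: NE / EF, where NE is the
    distance from the real object to the nearest boundary point E of the
    adjacent map and EF is the full extent of the adjacent map along the
    boundary-parallel axis."""
    return second_contact / second_preset

def map_ordinate(y_e1, y_f1, scale):
    """Place the mapping object N1 on segment E1F1 at the relative position
    given by the second scaling ratio."""
    return y_e1 + scale * (y_f1 - y_e1)

def scale_character_extent(lateral, longitudinal, first_scale, second_scale):
    """Scale a character that has an area: the lateral distance by the first
    scaling ratio, the longitudinal distance by the second scaling ratio."""
    return lateral * first_scale, longitudinal * second_scale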
It should be noted that, after step S220, the data processing method provided in the embodiment of the present application may further include a step of performing an update. Therefore, on the basis of fig. 2, fig. 8 is a schematic flowchart of another data processing method provided in the embodiment of the present application, and referring to fig. 8, the data processing method may further include:
in step S240, it is determined whether the real object in the adjacent map is updated.
In the embodiment of the present application, when the real object in the adjacent map has been updated, it is determined that the mapping object needs to be updated, and step S250 is executed; when the real object in the adjacent map has not been updated, it is determined that the mapping object does not need to be updated.
Step S250, mapping the updated real object to the segmented map to obtain an updated mapping object.
For example, suppose scene A (an adjacent map) and scene B (the segmented map) are adjacent, and the part of scene A that is no more than 50 meters from scene B serves as the synchronization range. Each NPC and player within the synchronization range has a mapping object in scene B; when an NPC or player within the synchronization range of scene A changes, scene A notifies scene B, and the corresponding mapping object in scene B reacts accordingly. When a player walks from the synchronization range of scene A into scene B, the player only needs to be marked as a mapping object in scene A, while the corresponding mapping object in scene B becomes a real object.
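A minimal sketch of this synchronization follows, again with assumed names (SceneSync, on_real_object_changed, and so on are illustrative, not part of the application).

class SceneSync:
    """Keeps scene B's mapping objects in step with the real objects that live
    inside scene A's synchronization range (steps S240/S250)."""

    def __init__(self):
        # mapping objects in scene B, keyed by the real object's id in scene A
        self.mappings = {}

    def on_real_object_changed(self, real_obj):
        """Scene A notifies scene B of a change: mirror it onto the shadow."""
        shadow = self.mappings.get(real_obj.obj_id)
        if shadow is not None:
            shadow.x, shadow.y = real_obj.x, real_obj.y
            shadow.data = dict(real_obj.data)

    def on_object_entered_scene_b(self, real_obj):
        """The object walked from scene A into scene B: the mapping object in
        scene B becomes the real object, and scene A keeps only a shadow."""
        promoted = self.mappings.pop(real_obj.obj_id, real_obj)
        return promoted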
With this method, scene B only needs to process the players, NPCs, and mapping objects within scene B, and the surrounding players and NPCs neither appear nor disappear suddenly when a player switches scenes. Moreover, because the scene is cut into 1024 segmented maps and each segmented map processes only the players, NPCs, and mapping objects of its own small scene, only about one hundredth of the objects of the large scene need to be handled at a time, which greatly relieves the data-processing pressure.
With reference to fig. 9, an embodiment of the present application further provides a data processing apparatus 900, where the functions implemented by the data processing apparatus 900 correspond to the steps executed by the foregoing method. The data processing apparatus 900 may be understood as a processor of the server 100, or may be understood as a component that is independent of the server 100 or a processor and that implements the functions of the present application under the control of the server 100. The data processing apparatus 900 may include an object obtaining module 910 and an object mapping module 920.
And an object obtaining module 910, configured to, for each of the divided maps, obtain a real object in a map adjacent to the divided map. In this embodiment of the application, the object obtaining module 910 may be configured to perform step S210 shown in fig. 2, and for relevant contents of the object obtaining module 910, reference may be made to the foregoing description of step S210.
The object mapping module 920 is configured to map the real object in the adjacent map to the segmented map, so as to obtain a corresponding mapping object. In this embodiment of the application, the object mapping module 920 may be configured to perform step S220 shown in fig. 2, and reference may be made to the foregoing description of step S220 regarding relevant contents of the object mapping module 920.
In addition, an embodiment of the present application further provides a computer-readable storage medium, where a computer program is stored on the computer-readable storage medium, and when the computer program is executed by a processor, the computer program performs the steps of the data processing method.
The computer program product of the data processing method provided in the embodiment of the present application includes a computer-readable storage medium storing a program code, where instructions included in the program code may be used to execute steps of the data processing method in the above method embodiment, which may be referred to specifically in the above method embodiment, and are not described herein again.
In summary, according to the data processing method and apparatus, the server, and the storage medium provided in the embodiments of the present application, the real objects in the maps adjacent to each segmented map are mapped into that segmented map to obtain corresponding mapping objects, so that the game characters in the adjacent maps are mapped into the current map. This alleviates the high data-processing pressure that arises in the prior art when the game characters attached to all game scenes must be displayed in order to preserve the player's large-scene experience.
In the embodiments provided in the present application, it should be understood that the disclosed apparatus and method may be implemented in other ways. The apparatus embodiments described above are merely illustrative, and for example, the flowchart and block diagrams in the figures illustrate the architecture, functionality, and operation of possible implementations of apparatus, methods and computer program products according to embodiments of the present application. In this regard, each block in the flowchart or block diagrams may represent a module, segment, or portion of code, which comprises one or more executable instructions for implementing the specified logical function(s). It should also be noted that, in some alternative implementations, the functions noted in the block may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved. It will also be noted that each block of the block diagrams and/or flowchart illustration, and combinations of blocks in the block diagrams and/or flowchart illustration, can be implemented by special purpose hardware-based systems which perform the specified functions or acts, or combinations of special purpose hardware and computer instructions.
In addition, functional modules in the embodiments of the present application may be integrated together to form an independent part, or each module may exist separately, or two or more modules may be integrated to form an independent part.
The functions, if implemented in the form of software functional modules and sold or used as a stand-alone product, may be stored in a computer readable storage medium. Based on such understanding, the technical solution of the present application or portions thereof that substantially contribute to the prior art may be embodied in the form of a software product stored in a storage medium and including instructions for causing a computer device (which may be a personal computer, a server 100, or a network device) to execute all or part of the steps of the method according to the embodiments of the present application. And the aforementioned storage medium includes: a U-disk, a removable hard disk, a Read-Only Memory (ROM), a Random Access Memory (RAM), a magnetic disk or an optical disk, and other various media capable of storing program codes.
It should be noted that, in this document, the terms "comprises," "comprising," or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising an … …" does not exclude the presence of other identical elements in a process, method, article, or apparatus that comprises the element.
The above description is only a preferred embodiment of the present application and is not intended to limit the present application, and various modifications and changes may be made by those skilled in the art. Any modification, equivalent replacement, improvement and the like made within the spirit and principle of the present application shall be included in the protection scope of the present application.

Claims (10)

1. A data processing method applied to a server storing a plurality of segmented maps, the data processing method comprising:
for each segmented map, acquiring a real object in a map adjacent to the segmented map; and
mapping the real object in the adjacent map into the segmented map to obtain a corresponding mapping object.
2. The data processing method of claim 1, wherein the step of mapping the real object in the adjacent map into the segmented map to obtain the corresponding mapping object comprises:
calculating a scaling ratio according to a contact distance from the real object in the adjacent map to a map boundary and a preset distance; and
mapping the real object into the segmented map according to the scaling ratio to obtain the corresponding mapping object.
3. The data processing method of claim 2, wherein the contact distance comprises a first contact distance, the first contact distance being the distance from the real object to the closest point on the boundary of the segmented map; the preset distance is a first preset distance, the first preset distance being the sum of the first contact distance and the distance from the real object to the boundary of the adjacent map that is far from the segmented map; the scaling ratio comprises a first scaling ratio; and the step of calculating the scaling ratio according to the contact distance from the real object in the adjacent map to the map boundary and the preset distance comprises:
calculating the first scaling ratio according to the first contact distance and the first preset distance.
4. The data processing method of claim 2, wherein the contact distance comprises a second contact distance, the second contact distance being the distance from the real object to the closest point on the boundary of the adjacent map; the preset distance comprises a second preset distance, the second preset distance being the sum of the second contact distance and the distance from the real object to the boundary of the adjacent map that is far from the closest point; the scaling ratio comprises a second scaling ratio; and the step of calculating the scaling ratio according to the contact distance from the real object in the adjacent map to the map boundary and the preset distance comprises:
calculating the second scaling ratio according to the second contact distance and the second preset distance.
5. The data processing method of claim 1, wherein the step of acquiring, for each segmented map, the real object in the map adjacent to the segmented map comprises:
judging whether an object in the adjacent map is within a preset range of the segmented map; and
if so, taking the object within the preset range as the real object.
6. The data processing method of claim 1, further comprising a step of obtaining the plurality of segmented maps, the step comprising:
performing segmentation processing on a preset map to obtain the plurality of segmented maps.
7. The data processing method of claim 1, wherein the data processing method further comprises:
judging whether the real object in the adjacent map has been updated; and
if so, mapping the updated real object into the segmented map to obtain an updated mapping object.
8. A data processing apparatus applied to a server storing a plurality of segmented maps, comprising:
an object acquisition module, configured to acquire, for each segmented map, a real object in a map adjacent to the segmented map; and
an object mapping module, configured to map the real object in the adjacent map into the segmented map to obtain a corresponding mapping object.
9. A server, comprising a memory and a processor for executing an executable computer program stored in the memory to implement the data processing method of any one of claims 1 to 7.
10. A storage medium, characterized in that a computer program is stored thereon, which when executed performs the steps of the data processing method of any one of claims 1-7.
CN202011527399.0A 2020-12-22 2020-12-22 Data processing method and device, server and storage medium Pending CN112675534A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202011527399.0A CN112675534A (en) 2020-12-22 2020-12-22 Data processing method and device, server and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202011527399.0A CN112675534A (en) 2020-12-22 2020-12-22 Data processing method and device, server and storage medium

Publications (1)

Publication Number Publication Date
CN112675534A true CN112675534A (en) 2021-04-20

Family

ID=75450469

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202011527399.0A Pending CN112675534A (en) 2020-12-22 2020-12-22 Data processing method and device, server and storage medium

Country Status (1)

Country Link
CN (1) CN112675534A (en)

Patent Citations (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN1918574A (en) * 2004-02-05 2007-02-21 Nhn株式会社 Method for processing the data distributed at online game server and a system thereof
CN101266633A (en) * 2006-11-29 2008-09-17 优万科技(北京)有限公司 Seamless super large scale dummy game world platform
CN103795782A (en) * 2013-12-27 2014-05-14 北京像素软件科技股份有限公司 Cross server method and system of network game
WO2017163702A1 (en) * 2016-03-25 2017-09-28 株式会社セガゲームス Information processing device, terminal device, and program
CN108710525A (en) * 2018-05-18 2018-10-26 腾讯科技(深圳)有限公司 Map methods of exhibiting, device, equipment and storage medium in virtual scene
CN108905203A (en) * 2018-07-11 2018-11-30 网易(杭州)网络有限公司 Information processing method, device, storage medium and electronic device
CN109364483A (en) * 2018-10-10 2019-02-22 苏州好玩友网络科技有限公司 Large scene map dividing method and the player visual angle scene update method for applying it
CN109771951A (en) * 2019-02-13 2019-05-21 网易(杭州)网络有限公司 Method, apparatus, storage medium and the electronic equipment that map generates
CN111185009A (en) * 2020-01-02 2020-05-22 腾讯科技(深圳)有限公司 Map generation method and device
CN111957041A (en) * 2020-09-07 2020-11-20 网易(杭州)网络有限公司 Map viewing method in game, terminal, electronic equipment and storage medium

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
谭哲锋: "大型多人在线网络游戏中无缝地图的研究与实现", 《中国优秀硕士学位论文全文数据库信息科技辑》, no. 3, 15 March 2012 (2012-03-15), pages 139 - 508 *

Similar Documents

Publication Publication Date Title
CN110990516B (en) Map data processing method, device and server
AU2020416878B2 (en) Map generation method and apparatus, electronic device, and computer storage medium
CN111127615A (en) Data scheduling method and device of three-dimensional model and electronic equipment
CN111275054A (en) Image processing method, image processing device, electronic equipment and storage medium
US20110161060A1 (en) Optimization-Based exact formulation and solution of crowd simulation in virtual worlds
US20180268606A1 (en) Model object building method, server, and system
CN113593033A (en) Three-dimensional model feature extraction method based on grid subdivision structure
CN114255160A (en) Data processing method, device, equipment and storage medium
CN111773717A (en) Object control method and apparatus, storage medium, and electronic apparatus
CN109802859B (en) Node recommendation method and server in network graph
CN114404984A (en) Data processing method and device for game scene, computer equipment and medium
CN111013146A (en) Dynamically modifiable way-finding navigation method and device for ultra-large map
CN112675534A (en) Data processing method and device, server and storage medium
US20230401806A1 (en) Scene element processing method and apparatus, device, and medium
US6577308B1 (en) Data processing method and apparatus and information furnishing medium
CN115809696B (en) Virtual image model training method and device
US6674433B1 (en) Adaptively subdividing a subdivision surface
CN110321184B (en) Scene mapping method and computer storage medium
CN115775024A (en) Virtual image model training method and device
CN108627884B (en) Meteorological data processing method and device
CN111402369A (en) Interactive advertisement processing method and device, terminal equipment and storage medium
CN112121435B (en) Game way finding method, device, server and storage medium
CN112263836B (en) Virtual scene processing method and device and storage medium
CN111744196A (en) Task target guiding method and device in game task
CN116109806B (en) Space dynamic adjustment method, system and storage medium for virtual meeting place

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination