CN109126136B - Three-dimensional virtual pet generation method, device, equipment and storage medium - Google Patents

Three-dimensional virtual pet generation method, device, equipment and storage medium

Info

Publication number
CN109126136B
CN109126136B (application CN201810840540.9A)
Authority
CN
China
Prior art keywords
dimensional virtual
virtual pet
pet
layers
image
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201810840540.9A
Other languages
Chinese (zh)
Other versions
CN109126136A (en)
Inventor
杨旭
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Tencent Technology Shenzhen Co Ltd
Original Assignee
Tencent Technology Shenzhen Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Tencent Technology Shenzhen Co Ltd filed Critical Tencent Technology Shenzhen Co Ltd
Priority to CN201810840540.9A priority Critical patent/CN109126136B/en
Publication of CN109126136A publication Critical patent/CN109126136A/en
Application granted granted Critical
Publication of CN109126136B publication Critical patent/CN109126136B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • A63F 13/00: Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F 13/60: Generating or modifying game content before or while executing the game program, e.g. authoring tools specially adapted for game development or game-integrated level editor
    • A63F 13/825: Fostering virtual characters
    • A63F 2300/6009: Methods for processing data by generating or executing the game program for importing or creating game content, e.g. authoring tools during game development, adapting content to different platforms, use of a scripting language to create content
    • A63F 2300/8058: Virtual breeding, e.g. tamagotchi

Abstract

The application discloses a method, device, equipment and storage medium for generating a three-dimensional virtual pet, belonging to the field of computer graphics. The method comprises the following steps: sending an acquisition request for the three-dimensional virtual pet to a server; receiving image parameters of the three-dimensional virtual pet sent by the server; determining n layers of target production materials of the three-dimensional virtual pet from a material set of the three-dimensional virtual pet according to the image parameters, wherein the material set comprises n layers of production materials, the n layers of production materials respectively correspond to n character parts of the three-dimensional virtual pet, and at least one layer of production materials comprises at least two different materials corresponding to the same part; and superimposing and combining the n layers of target production materials of the three-dimensional virtual pet in hierarchical order to generate a pet image of the three-dimensional virtual pet. According to the embodiments of the application, an exponential number of different pet images can be generated for the same type of three-dimensional virtual pet based on a limited set of production materials.

Description

Three-dimensional virtual pet generation method, device, equipment and storage medium
Technical Field
The embodiment of the application relates to the field of computer graphics, in particular to a method, a device, equipment and a storage medium for generating a three-dimensional virtual pet.
Background
There are many three-dimensional virtual characters of the same type in game applications, including soldiers, heroes, pets, non-player characters (NPCs), and so on.
To generate three-dimensional virtual pets of the same type, the related art sets a three-dimensional body model for that type and then superimposes different patterns and clothes on the body and/or face of the body model, thereby generating three-dimensional virtual pets with different pet images.
However, since three-dimensional virtual pets of the same type with different pet images each require some personalized materials to be added, the generation process consumes considerable time and computing resources, and the number of pet image types that can be generated is limited.
Disclosure of Invention
The embodiments of the application provide a method, device, equipment and storage medium for generating a three-dimensional virtual pet, which can solve the problem in the related art that considerable time and computing resources are consumed because personalized materials must be added to each three-dimensional virtual pet of the same type. The technical solution is as follows:
according to an aspect of the present application, there is provided a method for generating a three-dimensional virtual pet, applied to an application program provided with three-dimensional virtual pets, wherein the pet image of at least one of the three-dimensional virtual pets is generated based on genetic rules, the method including:
sending an acquisition request of the three-dimensional virtual pet to a server;
receiving image parameters of the three-dimensional virtual pet sent by the server;
determining n layers of target production materials of the three-dimensional virtual pet from a material set of the three-dimensional virtual pet according to the image parameters, wherein the material set comprises n layers of production materials, the n layers of production materials respectively correspond to n character parts of the three-dimensional virtual pet, at least one layer of production materials comprises at least two different materials corresponding to the same part, the ith layer of target production material is one of the ith layer of production materials, 0 ≤ i ≤ n, and n is an integer greater than 1;
and superimposing and combining the n layers of target production materials of the three-dimensional virtual pet in hierarchical order to generate a pet image of the three-dimensional virtual pet.
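The client-side flow above (select one target material per layer according to the image parameters, then superimpose the layers in hierarchical order) can be sketched as follows. This is a minimal illustration under stated assumptions; all names and the material data are hypothetical, not taken from the patent.

```python
# Minimal sketch of the client-side generation flow. All names and the
# material data are illustrative, not taken from the patent.

def generate_pet_image(material_set, image_params):
    """Select one target material per layer, then combine the layers.

    material_set: list of n lists; material_set[i] holds the candidate
                  production materials for the i-th character part.
    image_params: list of n indices received from the server;
                  image_params[i] selects the target material of layer i.
    """
    n = len(material_set)
    targets = [material_set[i][image_params[i]] for i in range(n)]
    # Superimposed in hierarchical order (index 0 = bottom layer).
    return targets

# Example with 3 layers: body model, skin, eyes.
materials = [
    ["body_default"],
    ["skin_smooth", "skin_striped"],
    ["eyes_blue", "eyes_green"],
]
print(generate_pet_image(materials, [0, 1, 0]))
# ['body_default', 'skin_striped', 'eyes_blue']
```

In a real client the returned list would drive the renderer; here the hierarchical order is simply the list order.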
According to another aspect of the present application, there is provided a method for generating a three-dimensional virtual pet, applied to a server provided with three-dimensional virtual pets, wherein the pet image of at least one of the three-dimensional virtual pets is generated based on genetic rules, the method including:
receiving a three-dimensional virtual pet acquisition request sent by a terminal;
determining image parameters of the three-dimensional virtual pet according to the acquisition request, wherein the image parameters are used for instructing the terminal to determine n layers of target production materials of the three-dimensional virtual pet from a material set of the three-dimensional virtual pet, the n layers of production materials respectively correspond to n character parts of the three-dimensional virtual pet, each layer of production materials comprises at least two different materials corresponding to the same part, the ith layer of target production material is one of the ith layer of production materials, 0 ≤ i ≤ n, and n is an integer greater than 1;
and sending the image parameters of the three-dimensional virtual pet to the terminal.
According to another aspect of the present application, there is provided an apparatus for generating a three-dimensional virtual pet, the apparatus being provided with three-dimensional virtual pets, wherein the pet image of at least one of the three-dimensional virtual pets is generated based on genetic rules, the apparatus including:
the sending module is used for sending an acquisition request of the three-dimensional virtual pet to the server;
the receiving module is used for receiving the image parameters of the three-dimensional virtual pet sent by the server;
the determining module is used for determining n layers of target production materials of the three-dimensional virtual pet from the material set of the three-dimensional virtual pet according to the image parameters, wherein the material set comprises n layers of production materials, the n layers of production materials respectively correspond to n character parts of the three-dimensional virtual pet, at least one layer of production materials comprises at least two different materials corresponding to the same part, the ith layer of target production material is one of the ith layer of production materials, 0 ≤ i ≤ n, and n is an integer greater than 1;
and the generating module is used for superimposing and combining the n layers of target production materials of the three-dimensional virtual pet in hierarchical order to generate the pet image of the three-dimensional virtual pet.
According to another aspect of the present application, there is provided an apparatus for generating a three-dimensional virtual pet, the apparatus being provided with three-dimensional virtual pets, wherein the pet image of at least one of the three-dimensional virtual pets is generated based on genetic rules, the apparatus including:
the receiving module is used for receiving a three-dimensional virtual pet acquisition request sent by the terminal;
a determining module, configured to determine image parameters of the three-dimensional virtual pet according to the acquisition request, where the image parameters are used to instruct the terminal to determine n layers of target production materials of the three-dimensional virtual pet from a material set of the three-dimensional virtual pet, the n layers of production materials respectively correspond to n character parts of the three-dimensional virtual pet, each layer of production materials includes at least two different materials corresponding to the same part, the ith layer of target production material is one of the ith layer of production materials, 0 ≤ i ≤ n, and n is an integer greater than 1;
and the sending module is used for sending the image parameters of the three-dimensional virtual pet to the terminal.
According to another aspect of the present application, there is provided an electronic device comprising a memory and a processor; the memory stores at least one program that is loaded and executed by the processor to implement the method for generating a three-dimensional virtual pet as described above.
According to another aspect of the present application, there is provided a computer-readable storage medium having at least one program stored therein, the at least one program being loaded and executed by a processor to implement the method for generating a three-dimensional virtual pet as described above.
The beneficial effects brought by the technical scheme provided by the embodiment of the application at least comprise:
the n layers of manufacturing materials of the three-dimensional virtual pet are provided and correspond to n angular color parts of the three-dimensional virtual pet respectively, at least one layer of manufacturing material comprises at least two different materials corresponding to the same part, and the n layers of target manufacturing materials of the current three-dimensional virtual pet are determined according to the image parameters sent by the server, so that different index-level pet images, such as different billions of millions of pet images, can be generated for the same type of three-dimensional virtual pet based on the limited manufacturing materials.
Drawings
To describe the technical solutions in the embodiments of the present application more clearly, the drawings needed in the description of the embodiments are briefly introduced below. Obviously, the drawings in the following description show only some embodiments of the present application, and those skilled in the art can obtain other drawings based on these drawings without creative effort.
FIG. 1 is a block diagram of a computer system provided in an exemplary embodiment of the present application;
FIG. 2 is a block diagram of a server system provided in an exemplary embodiment of the present application;
FIG. 3 is a flowchart of a method for generating a three-dimensional virtual pet, according to an exemplary embodiment of the present application;
FIG. 4 is a schematic diagram of the n layers of production materials of a three-dimensional virtual pet being superimposed and combined, according to an exemplary embodiment of the present application;
FIG. 5 is a flowchart of a method for generating a three-dimensional virtual pet, according to an exemplary embodiment of the present application;
FIG. 6 is a diagram of a material set for a three-dimensional virtual pet, provided in accordance with an exemplary embodiment of the present application;
FIG. 7 is a schematic view of a three-dimensional virtual pet in superimposed combination, as provided by another exemplary embodiment of the present application;
FIG. 8 is a schematic illustration of a plurality of different three-dimensional virtual pets generated in accordance with an exemplary embodiment of the present application;
FIG. 9 is a flowchart of a method for generating a three-dimensional virtual pet, according to another exemplary embodiment of the present application;
FIG. 10 is a schematic illustration of an interface for purchasing a three-dimensional virtual pet, as provided in another exemplary embodiment of the present application;
FIG. 11 is a flowchart of a method for generating a three-dimensional virtual pet, according to another exemplary embodiment of the present application;
FIG. 12 is a schematic illustration of an interface for breeding a three-dimensional virtual pet according to another exemplary embodiment of the present application;
FIG. 13 is a block diagram of an apparatus for generating a three-dimensional virtual pet according to another exemplary embodiment of the present application;
FIG. 14 is a block diagram of an apparatus for generating a three-dimensional virtual pet according to another embodiment of the present application;
fig. 15 is a block diagram of a terminal according to another exemplary embodiment of the present application;
fig. 16 is a block diagram of a server according to another exemplary embodiment of the present application.
Detailed Description
To make the objects, technical solutions and advantages of the embodiments of the present application more clear, the embodiments of the present application will be further described in detail with reference to the accompanying drawings.
Virtual pets: digital pets presented with a pet image in cartoon and/or animal form. A virtual pet is a two-dimensional digital pet or a three-dimensional digital pet; for example, a three-dimensional virtual pet represented by a pet image in the form of a cartoon cat. Optionally, the pet images of some virtual pets are randomly generated, such as the pet image of a 0th-generation virtual pet; the pet images of other virtual pets are generated according to genetic rules from the pet images of their parent virtual pets and/or other ancestor virtual pets, for example, the pet images of all offspring virtual pets other than 0th-generation virtual pets are generated according to genetic rules. Optionally, each virtual pet has a unique gene sequence comprising generation parameters used to determine the pet image of the virtual pet, also referred to as image parameters.
In some embodiments, the pet information of each virtual pet is stored on a blockchain system, and is stored and authenticated via the consensus mechanism of multiple nodes on the blockchain system. The pet information at least includes the unique gene sequence of the virtual pet, and may optionally include: at least one of an identifier of the virtual pet, parent information of the virtual pet, generation information of the virtual pet, genealogical information of the virtual pet, historical transaction record information of the virtual pet, historical lifetime event information of the virtual pet, and other information of the virtual pet. Because the gene sequence of each virtual pet is unique and the information stored on the blockchain system is authentic and unique, virtual pets have a collectible attribute. Meanwhile, since the pet information of the virtual pet is stored in the blockchain system, even if the virtual pet is a digital pet designed for use in a first application program, it can be conveniently migrated to a second application program for use. The first application and the second application are different applications.
In some embodiments, the virtual pet is a digital pet presented by an application program (or app) running in the terminal. The application program includes at least one of the following functions: grabbing a virtual pet, generating a virtual pet, breeding a virtual pet, trading a virtual pet, using a virtual pet for combat, using a virtual pet for Augmented Reality (AR) interaction, using a virtual pet for social interaction, and using a virtual pet for AR education. In other embodiments, the application is a blockchain system-based application for virtual pet acquisition, breeding, and/or trading. In other embodiments, the application is a geographic location-based social gaming program that provides at least one of collecting, growing, and/or fighting with virtual pets.
In some embodiments, the application has the functionality to combat using a virtual pet. In this case, the gene sequence determines the characteristics of the virtual pet. The above features may include: extrinsic features and/or intrinsic features.
The appearance characteristic is a characteristic representing the pet image of the virtual pet. Alternatively, the virtual pet may include different body parts such as skin, patches, ears, beard, floral designs, eyes, and mouth, each of which may have a variety of different appearance characteristics. The appearance may include visible features such as color, shape, texture, etc. For example, the skin appearance may include different colors such as white skin, red skin, orange skin, yellow skin, green skin, cyan skin, blue skin, and purple skin. For another example, the external features of the ear may include different shapes such as a long ear, a short ear, a curled ear, a folded ear, and a normal ear.
Intrinsic characteristics refer to characteristics that embody intrinsic attributes of a virtual pet. For example, the intrinsic attributes may include a variety of different attributes such as intelligence values, attack force values, defense force values, spirit values, magic values, force values, endurance values, agility values, latent values, speed values, life values, and the like.
Gene sequence of virtual pet: includes a set of parameter values, also referred to as character parameters, for generating a pet character for the virtual pet. Taking the virtual pet as an example of a 3D virtual pet, the pet image of each virtual pet comprises a plurality of types of 3D image materials, each type of 3D image material corresponds to different role parts and/or texture levels, each 3D image material corresponds to a material identifier, and each type of 3D material identifier can be regarded as a parameter value in the gene sequence. Illustratively, if the 3D body model of a 3D virtual pet is the same, the pet image of the 3D virtual pet comprises at least 8 3D image materials (also called local features): 3D body model, ear model, skin material, eye material, nose material, mouth material, beard material, body stripe material, chest and abdomen pattern material. Optionally, the pet image of the 3D virtual pet may further optionally include: tail material, outside pendant material and global characteristics. Tail material is a characteristic of the tail model used to determine the virtual pet, such as a long, thin tail or a short, thick tail when the pet image is an animal type; the external hanging material is used for determining the characteristics of accessories of the virtual pet, and the accessories comprise but are not limited to at least one of a backpack, glasses, handheld props, a belt, clothes, a hat, shoes and a head ornament; the global feature is an integrated character feature of the body model for covering the virtual pet and has the highest priority for display. When the target image parameters include the global features, the global features can cover the local features to be displayed completely, that is, the local features can be hidden and not displayed. 
For example, when a certain pet cat has a superhuman global feature, the pet cat does not display the own cat image, but displays a pet image with a superhuman appearance or a pet image with a robot appearance.
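The display-priority rule described above, in which a global feature has the highest priority and completely covers the local features, can be sketched as follows. The function name and feature strings are hypothetical.

```python
# Hypothetical sketch of the display-priority rule: a global feature,
# when present, covers (hides) all local features of the body model.

def visible_materials(local_features, global_feature=None):
    if global_feature is not None:
        return [global_feature]   # global feature covers everything
    return local_features         # otherwise the local features are shown

print(visible_materials(["body_model", "skin", "eyes", "mouth"]))
# ['body_model', 'skin', 'eyes', 'mouth']
print(visible_materials(["body_model", "skin", "eyes", "mouth"], "superman_suit"))
# ['superman_suit']
```

This mirrors the pet-cat example: with a superman global feature present, only the superman appearance is rendered and the cat's own local features are hidden.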
Correspondingly, the gene sequences include: at least one of global characteristic parameter, skin texture characteristic parameter, skin color characteristic parameter, belly texture characteristic parameter, belly color characteristic parameter, eye texture characteristic parameter, eye color characteristic parameter, mouth texture characteristic parameter, mouth color characteristic parameter, beard texture characteristic parameter, beard color characteristic parameter, ear characteristic parameter, tail characteristic parameter, and pendant characteristic parameter. A gene sequence may be represented by a plurality of key-value pairs arranged in sequence, which may take the form of (gene name, parameter value). In an illustrative example, the Gene sequence is represented as Gene ═ 3D body model feature, default), (skin feature, smooth), (belly feature, floral 1), (mouth texture feature, small tiger 1), (mouth color feature, red), (tail feature, thick short shape) ].
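The key-value representation above can be written out directly. The gene names and values below mirror the illustrative example in the text and are not an exhaustive schema.

```python
# A gene sequence as an ordered list of (gene name, parameter value)
# pairs, mirroring the illustrative example above. Names are hypothetical.
gene = [
    ("3d_body_model_feature", "default"),
    ("skin_feature", "smooth"),
    ("belly_feature", "floral_1"),
    ("mouth_texture_feature", "small_tiger_1"),
    ("mouth_color_feature", "red"),
    ("tail_feature", "thick_short"),
]

# Each parameter value doubles as a material identifier that selects a
# concrete 3D image material for the corresponding character part.
lookup = dict(gene)
print(lookup["mouth_color_feature"])  # red
```

Keeping the pairs ordered preserves the sequence semantics of the gene, while the dict view gives constant-time lookup of a single feature.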
Genetic rules: rules by which the pet image of a parent virtual pet and/or other ancestor virtual pets is passed on, imitating the inheritance rules of real organisms, so as to generate the pet image of a child virtual pet. In some embodiments, to ensure that each virtual pet is a unique personalized virtual pet, each virtual pet has a unique gene sequence. In some embodiments, the genetic rule is a rule for generating a pet image with unique characteristics for a child virtual pet by recombining and de-duplicating the pet images of the parent virtual pets and/or other ancestor virtual pets. De-duplication refers to a mechanism whereby, when a gene sequence identical to that of an existing virtual pet appears during inheritance, the gene sequence of the virtual pet is regenerated, thereby guaranteeing the genetic uniqueness of each virtual pet. Optionally, since the genetic rules imitate those of real organisms, there are also limitations in the breeding process, such as a pregnancy duration and the inability of close relatives to breed.
In the embodiments of the present application, genetic genes exist between two virtual pets having a genetic relationship. A genetic gene refers to a gene inherited by one of two virtual pets having a genetic relationship from the other. A characteristic determined by a genetic gene may be referred to as a genetic feature. Two virtual pets with a genetic relationship share the same genetic features, that is, the same image material features. For example, two virtual pets with a genetic relationship both have yellow skin. As another example, two virtual pets with a genetic relationship both have red skin and folded ears. The number of genetic features may be one or more, which is not limited in this embodiment. In general, the closer the generations between two virtual pets having a genetic relationship, the more genetic features they share; conversely, the more distant the generations, the fewer the genetic features.
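The recombination-with-de-duplication rule described above can be sketched as follows. The function, gene data, and retry bound are all hypothetical; this is a sketch of the mechanism, not the patent's actual implementation.

```python
import random

def breed(father, mother, existing_sequences, max_tries=100):
    """Recombine two parent gene sequences into a child sequence.

    Each gene value is inherited from either the father or the mother.
    If the resulting sequence duplicates an existing one, it is re-rolled
    (the de-duplication mechanism), so every virtual pet keeps a unique
    gene sequence.
    """
    for _ in range(max_tries):
        child = tuple(
            (name, random.choice((father_value, mother_value)))
            for (name, father_value), (_, mother_value) in zip(father, mother)
        )
        if child not in existing_sequences:
            existing_sequences.add(child)
            return list(child)
    raise RuntimeError("could not generate a unique gene sequence")

father = [("skin_color", "yellow"), ("ear_feature", "long")]
mother = [("skin_color", "red"), ("ear_feature", "folded")]
seen = set()
print(breed(father, mother, seen))
```

A real system would also apply the breeding limitations mentioned above (pregnancy duration, no close-relative breeding) before recombination is attempted.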
Generation information of the virtual pet: information indicating the generation number of the virtual pet in the entire virtual pet world, determined by the generations of its father virtual pet and mother virtual pet. In some embodiments, the generation of a child virtual pet is obtained by adding one to the larger of the generations of the father virtual pet and the mother virtual pet; for example, if the father virtual pet is a 0th-generation virtual pet and the mother virtual pet is a 4th-generation virtual pet, the child virtual pet is a 5th-generation virtual pet. In some embodiments, the generation of a primary virtual pet is the lowest, e.g., 0. The generation of a non-primary virtual pet is determined by the generations of its parent virtual pets, and the generation of a child virtual pet bred from parent virtual pets is higher than that of its parents. In one example, if only parent virtual pets of the same generation are allowed to breed a child virtual pet (i.e., the next-generation virtual pet), the generation of the child virtual pet is equal to the generation of the parent virtual pets plus 1. For example, if the generations of the parent virtual pets are both 1, the generation of the child virtual pet is 2; as another example, if the generations of the parent virtual pets are both 0, the generation of the child virtual pet is 1. In another example, if parent virtual pets of both the same generation and different generations are allowed to breed a child virtual pet, the generation of the child virtual pet is equal to the higher of the two parent generations plus 1. For example, if the generation of the father virtual pet is 0 and the generation of the mother virtual pet is 2, the generation of the child virtual pet is 3.
In addition, a primary virtual pet is not bred from a father virtual pet and a mother virtual pet, but is automatically produced by the virtual pet system. Therefore, a primary virtual pet has no father virtual pet or mother virtual pet, nor any other virtual pet of a higher generation that has a genetic relationship with it.
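The generation arithmetic described above reduces to one line: the child's generation is the larger of the two parent generations plus one. A minimal sketch, with the function name being an assumption:

```python
def child_generation(father_generation, mother_generation):
    """Generation of a child virtual pet: the max of the parents' plus 1."""
    return max(father_generation, mother_generation) + 1

print(child_generation(0, 4))  # 5  (0th-generation father, 4th-generation mother)
print(child_generation(0, 2))  # 3
print(child_generation(1, 1))  # 2  (same-generation parents)
```

Note the rule covers both the same-generation and different-generation breeding cases given in the examples above, since max of two equal values is just that value.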
FIG. 1 shows a block diagram of a computer system 100 provided in an exemplary embodiment of the present application. The computer system 100 includes: a first terminal 120, a server cluster 140, and a second terminal 160.
The first terminal 120 is connected to the server cluster 140 through a wireless network or a wired network. The first terminal 120 may be at least one of a smartphone, a game console, a desktop computer, a tablet computer, an e-book reader, an MP3 player, an MP4 player, and a laptop portable computer. The first terminal 120 is installed with and runs an application program supporting virtual pets. The application program may be any one of a pet-raising game program, an AR game program, and an AR education program. The first terminal 120 is a terminal used by a first user, and a first user account is logged into the application program in the first terminal 120.
The server cluster 140 includes at least one of a server, multiple servers, a cloud computing platform, and a virtualization center. The server cluster 140 is used to provide background services for applications supporting virtual pets. Optionally, the server cluster 140 undertakes the primary computing work while the first terminal 120 and the second terminal 160 undertake secondary computing work; alternatively, the server cluster 140 undertakes secondary computing work while the first terminal 120 and the second terminal 160 undertake the primary computing work; or the server cluster 140, the first terminal 120, and the second terminal 160 perform cooperative computing using a distributed computing architecture. The server cluster 140 may be referred to simply as a server.
Optionally, the server cluster 140 includes an access server 141 and a game server 142. The access server 141 is used to provide access services and information transceiving services for the first terminal 120 and the second terminal 160, and forwards information between the terminals and the game server 142. The game server 142 is used to provide background services for the application, such as at least one of a game logic service, a material providing service, a virtual pet generation service, a virtual pet transaction service, and a virtual pet breeding service. There may be one or more game servers 142. When there are multiple game servers 142, at least two game servers 142 may provide different services, and/or at least two game servers 142 may provide the same service, which is not limited in the embodiments of the present application.
The second terminal 160 is installed with and runs an application program supporting virtual pets. The application program may be any one of a pet-raising game program, an AR game program, and an AR education program. The second terminal 160 is a terminal used by a second user, and a second user account is logged into the application program in the second terminal 160.
Optionally, the first user account and the second user account are in the same virtual social network. Optionally, the first user account and the second user account may belong to the same team or the same organization, have a friend relationship, or have temporary communication rights. Optionally, the first user account and the second user account may belong to different teams, different organizations, or two groups with a hostile relationship.
Alternatively, the applications installed on the first terminal 120 and the second terminal 160 are the same, or the applications installed on the two terminals are the same type of application on different operating system platforms. The application program includes at least one of the following functions: capturing a virtual pet, generating a virtual pet, breeding a virtual pet, trading a virtual pet, using a virtual pet for combat, using a virtual pet for AR interaction, using a virtual pet for social interaction, and using a virtual pet for AR education. In other embodiments, the application is a blockchain-system-based application for virtual pet acquisition, breeding, and/or trading. In other embodiments, the application is a geographic-location-based social gaming program that provides at least one of collecting, growing, and fighting with virtual pets.
The first terminal 120 may generally refer to one of a plurality of terminals, and the second terminal 160 may generally refer to one of a plurality of terminals; this embodiment is only illustrated with the first terminal 120 and the second terminal 160. The terminal types of the first terminal 120 and the second terminal 160 are the same or different, and include: at least one of a smartphone, a gaming console, a desktop computer, a tablet, an e-book reader, an MP3 player, an MP4 player, and a laptop portable computer. The following embodiments are illustrated with the first terminal 120 and/or the second terminal 160 being a smartphone.
Those skilled in the art will appreciate that the number of terminals described above may be greater or fewer. For example, the number of the terminals may be only one, or several tens or hundreds of the terminals, or more. The number of terminals and the type of the device are not limited in the embodiments of the present application.
In some alternative embodiments, the server cluster 140 is configured to store pet information and transaction records for each virtual pet. The pet information includes at least one of: a pet identifier used to uniquely identify the virtual pet, image parameters used to represent the pet image of the virtual pet, and a preview image of the virtual pet. In an alternative embodiment as shown in FIG. 2, the server cluster 140 is further coupled to a blockchain system 180, and the server cluster 140 stores the pet information and/or transaction records of each virtual pet in the blockchain system 180. In some alternative embodiments, the server cluster 140 may itself also operate and store data as a node in the blockchain system 180.
FIG. 3 is a flowchart illustrating a method for generating a three-dimensional virtual pet according to an exemplary embodiment of the present application. This embodiment is exemplified by applying the generating method to the first terminal or the second terminal shown in fig. 1 (simply referred to as a terminal), where the terminal runs an application program providing a three-dimensional virtual pet, and the pet image of at least one three-dimensional virtual pet (such as a non-first-generation virtual pet) is generated based on genetic rules. The method comprises the following steps:
in step 301, sending a request for acquiring a three-dimensional virtual pet to a server;
the three-dimensional virtual pet is a character for display in an application within the terminal. Types of three-dimensional virtual pets include, but are not limited to: soldiers, general, heros, pets, NPC, saddles.
An application program runs in the terminal. The application program is provided with a function of using a three-dimensional virtual pet. The functions of using the three-dimensional virtual pet include: at least one of playing a game in a virtual environment using a three-dimensional virtual pet, performing a simulation using a three-dimensional virtual pet, playing an AR game using a three-dimensional virtual pet, and performing AR education using a three-dimensional virtual pet.
The three-dimensional virtual pet in the application program can be obtained by the following methods: at least one of a purchasing mode, a lottery mode, a gift bag receiving mode, an accumulated point exchanging mode and a breeding mode. That is, the present step includes at least one of the following forms:
sending an acquisition request of the three-dimensional virtual pet to a server in a purchasing mode;
sending an acquisition request of the three-dimensional virtual pet to a server in a lottery mode;
sending an acquisition request of the three-dimensional virtual pet to a server in a gift bag getting mode;
sending an acquisition request of the three-dimensional virtual pet to a server in a point exchange mode;
and sending an acquisition request of the three-dimensional virtual pet to a server based on the existing breeding mode of the three-dimensional virtual pet.
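The acquisition request in the different modes above can be sketched as a single message whose optional fields vary with the mode. The field names and mode strings below are assumptions for illustration, not the patent's actual wire format:

```python
def build_acquire_request(mode, user_account, pet_id=None, parent_ids=None):
    """Build an acquisition request for a three-dimensional virtual pet.

    mode is one of "purchase", "lottery", "gift_bag", "points", "breeding"
    (names are illustrative). pet_id is set when a specific listed pet is
    requested; parent_ids is set for the breeding mode.
    """
    request = {"type": "acquire_pet", "mode": mode, "account": user_account}
    if pet_id is not None:
        request["pet_id"] = pet_id        # e.g. purchase of a listed pet
    if parent_ids is not None:
        request["parents"] = parent_ids   # breeding based on existing pets
    return request

# Breeding based on two existing pets owned by the same account:
req = build_acquire_request("breeding", "user_42", parent_ids=["6613", "6614"])
print(req["mode"])  # breeding
```

This mirrors the parameter combinations listed for the request later in the description (a user account alone, a user account plus a pet identifier, or a user account plus parent identifiers).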
The server generates the image parameters of the three-dimensional virtual pet before or after receiving the acquisition request. After receiving the acquisition request, the server sends the image parameters of the three-dimensional virtual pet to the terminal. For example, the server may randomly generate the image parameters of each three-dimensional virtual pet in advance. As another example, the server may generate the image parameters of the three-dimensional virtual pet after receiving the acquisition request, according to information carried in the request.
In step 302, receiving an image parameter of the three-dimensional virtual pet sent by the server;
correspondingly, the terminal receives the image parameters of the three-dimensional virtual pet sent by the server.
Optionally, the image parameters of the three-dimensional virtual pet include: material identifiers of the n layers of target production materials.
In step 303, determining n layers of target making materials of the three-dimensional virtual pet from the material set of the three-dimensional virtual pet according to the image parameters;
the application program of the terminal stores a material set of the three-dimensional virtual pet. The material set includes n layers of production materials, the n layers of production materials respectively correspond to n character parts of the three-dimensional virtual pet, at least one layer of production material includes at least two different materials corresponding to the same part, the ith layer of target production material is one of the materials in the ith layer of production material, 0 ≤ i ≤ n, and n is an integer greater than 1.
The application program determines n layers of target making materials corresponding to the three-dimensional virtual pet from the material set of the three-dimensional virtual pet according to the image parameters after acquiring the image parameters of the three-dimensional virtual pet.
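The determination in this step can be sketched as one dictionary lookup per layer. This is a minimal sketch, assuming the material set is stored as n layers with each layer mapping material identifiers to materials; the identifier numbering follows the scheme used later in fig. 6, and the material names are placeholders:

```python
# Three layers shown for brevity; a real material set would have n layers.
MATERIAL_SET = [
    {101: "white skin", 102: "red skin"},           # layer 1: skin
    {201: "patch style 1", 203: "patch style 3"},   # layer 2: patches
    {301: "ear style 1", 303: "ear style 3"},       # layer 3: ears
]

def resolve_target_materials(image_params, material_set):
    """Map each material identifier in the image parameters to the
    corresponding material stored in its layer of the material set."""
    return [layer[mid] for layer, mid in zip(material_set, image_params)]

targets = resolve_target_materials([102, 201, 303], MATERIAL_SET)
print(targets)  # ['red skin', 'patch style 1', 'ear style 3']
```

Because each identifier is looked up only inside its own layer, identifiers only need to be unique within a layer, which is consistent with the per-layer numbering ranges shown in the figures.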
In step 304, the n layers of target making materials of the three-dimensional virtual pet are overlapped and combined according to the hierarchical sequence to generate the pet image of the three-dimensional virtual pet.
Since the n layers of production materials correspond to the n character parts of the three-dimensional virtual pet, respectively, and the display levels corresponding to the different character parts are different, the n layers of production materials correspond to the respective level orders (also referred to as priority orders).
The terminal superimposes and combines the n layers of target production materials of the three-dimensional virtual pet according to their hierarchical order to generate the pet image of the three-dimensional virtual pet. The pet image may be a two-dimensional, 2.5-dimensional, or three-dimensional pet image.
After the pet image of the three-dimensional virtual pet is generated, the terminal displays the pet image of the three-dimensional virtual pet on a user interface of an application program; and/or in a real scene picture shot by the camera, the terminal superposes the pet image of the three-dimensional virtual pet on the real scene picture; and/or, superposing and displaying the pet image of the three-dimensional virtual pet in the virtual environment world.
In summary, in the method provided in this embodiment, n layers of production materials of the three-dimensional virtual pet are provided, the n layers of production materials respectively correspond to n character parts of the three-dimensional virtual pet, each layer of production material includes at least two different materials corresponding to the same part, and the n layers of target production materials of the current three-dimensional virtual pet are determined according to the image parameters sent by the server, so that an exponential number of different pet images, for example billions of distinct pet images, can be generated for the same type of three-dimensional virtual pet from a limited set of production materials.
Taking a pet cat as an example of the three-dimensional virtual pet, as shown in fig. 4, the material set of the pet cat includes 8 layers of production materials: a three-dimensional body model plus 7 layers of texture material. The 7 layers of texture material comprise: ear 41, beard 42, mouth 43, eye 44, pattern 45, patches 46, and skin 47. Optionally, ear 41 comprises a plurality of differently shaped ears; beard 42 comprises a plurality of differently shaped beards; mouth 43 comprises a plurality of differently shaped mouths; eye 44 comprises a plurality of eyes of different shapes and/or colors; pattern 45 comprises a plurality of facial and abdominal patterns of different shapes and/or colors; patches 46 comprise a plurality of body markings of different shapes and/or colors; skin 47 comprises a plurality of skin background colors. When a certain pet cat needs to be generated, a target texture material is determined in each layer of texture material, and the n layers of target texture materials are overlaid layer by layer according to the hierarchical order to generate the pet image of the pet cat.
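The layer-by-layer overlay just described can be sketched with the standard alpha-over compositing operation, shown here on tiny pixel grids in pure Python. The patent does not specify a blending rule, so alpha-over is an assumption:

```python
def over(fg, bg):
    """Composite one RGBA pixel over another. Channels are floats in [0, 1];
    fg is drawn on top of bg (Porter-Duff source-over)."""
    fr, fg_g, fb, fa = fg
    br, bg_g, bb, ba = bg
    a = fa + ba * (1.0 - fa)
    if a == 0.0:
        return (0.0, 0.0, 0.0, 0.0)
    mix = lambda f, b: (f * fa + b * ba * (1.0 - fa)) / a
    return (mix(fr, br), mix(fg_g, bg_g), mix(fb, bb), a)

def composite(layers):
    """Overlay same-sized pixel grids bottom-to-top, the way the terminal
    overlays the texture layers in their hierarchical order."""
    result = layers[0]
    for layer in layers[1:]:
        result = [[over(f, b) for f, b in zip(frow, brow)]
                  for frow, brow in zip(layer, result)]
    return result

skin  = [[(1.0, 1.0, 1.0, 1.0)]]   # opaque white skin layer (1x1 "image")
patch = [[(0.0, 0.0, 0.0, 0.5)]]   # half-transparent black patch on top
print(composite([skin, patch]))    # [[(0.5, 0.5, 0.5, 1.0)]]
```

The half-transparent patch leaves the skin color visible underneath, which is exactly the translucency behavior the description attributes to the texture material layers.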
FIG. 5 is a flowchart illustrating a method for generating a three-dimensional virtual pet according to another exemplary embodiment of the present application. This embodiment is exemplified by applying the method to the computer system shown in fig. 1. In the computer system, an application program providing a three-dimensional virtual pet runs in the first terminal or the second terminal, and the pet image of at least one three-dimensional virtual pet (such as a non-first-generation virtual pet) is generated based on genetic rules. The method comprises the following steps:
in step 501, a terminal sends a request for acquiring a three-dimensional virtual pet to a server;
an application program runs in the terminal, and the application program logs in a user account. Multiple terminals may communicate with the server simultaneously, each terminal communicating with the server using a respective user account. Different user accounts are used to distinguish different users.
Taking as an example the case where the three-dimensional virtual pet is a pet cat and the application program is a program providing the pet cat, the manners of obtaining a three-dimensional virtual pet in the application program include but are not limited to: at least one of a purchasing mode, a lottery mode, a gift bag receiving mode, a point exchanging mode, and a breeding mode. In some embodiments, the breeding mode comprises: a local breeding mode and a breeding mode among different users. The local breeding mode is a breeding mode between two virtual pets provided by the same user account; the breeding mode among different users is a breeding mode between two virtual pets provided by different user accounts.
And the terminal sends an acquisition request to the server according to the operation triggered by the user on the application program. And the terminal sends an acquisition request of the three-dimensional virtual pet to the server through a wired network or a wireless network. The get request includes, but is not limited to, the following parameters:
a user account; or,
a user account and an identifier of the three-dimensional virtual pet; or,
a user account and an identifier of a parent of the three-dimensional virtual pet.
In step 502, the server receives an acquisition request sent by the terminal;
in step 503, the server determines the image parameters of the three-dimensional virtual pet according to the obtaining request;
optionally, the server generates and stores in advance the image parameters of a plurality of three-dimensional virtual pets. After receiving the acquisition request, the server determines the image parameters of the three-dimensional virtual pet to be used this time from the stored image parameters of the plurality of three-dimensional virtual pets. For example, the server randomly generates and stores the image parameters of first-generation three-dimensional virtual pets.
Optionally, after receiving the obtaining request, the server generates an image parameter of the three-dimensional virtual pet according to information carried in the obtaining request.
The image parameters are used for instructing the terminal to determine the n layers of target production materials of the three-dimensional virtual pet from the material set of the three-dimensional virtual pet. In the material set, the n layers of production materials respectively correspond to n character parts of the three-dimensional virtual pet, at least one layer of production material includes at least two different materials corresponding to the same part, the ith layer of target production material is one of the materials in the ith layer of production material, and 0 ≤ i ≤ n. Typically, the image parameters include a material identifier for each of the n layers of target production materials. When the n layers of production materials include the three-dimensional body model of the three-dimensional virtual pet and the three-dimensional body model is unique, the image parameters may omit the identifier of the three-dimensional body model and include the material identifiers of the other n-1 layers of target production materials.
In step 504, the server sends the image parameters of the three-dimensional virtual pet to the terminal;
and the server sends the image parameters of the three-dimensional virtual pet to the terminal through a wired network or a wireless network. Optionally, the profile parameters include: and material identifications of the n layers of target making materials or material identifications of other n-1 layers of target making materials except the three-dimensional body model.
In step 505, the terminal receives the image parameters of the three-dimensional virtual pet sent by the server;
in step 506, the terminal determines n layers of target making materials of the three-dimensional virtual pet from the material set of the three-dimensional virtual pet according to the image parameters;
the application program of the terminal stores resource files. The resource file includes a material collection of the three-dimensional virtual pet. And the terminal determines the target production material from the material set according to the material identifier in the image parameter.
Taking the example that the three-dimensional virtual pet is a pet cat, referring to fig. 6, the material set of the pet cat includes 8 layers of production materials: a three-dimensional body model plus 7 layers of texture material. The pet cats may share the same three-dimensional body model, and the 7 layers of texture material include: skin 61, patches 62, ears 63, beard 64, pattern 65, eyes 66, and mouth 67. The patches are markings formed mainly of spots, and the patterns are markings formed mainly of lines. Each material within the same layer of texture material has a different color and/or shape. Optionally, each material in a texture material layer has a transparency value greater than 0% and less than 100%. That is, each material in the texture material layer is translucent, so the texture patterns of different layers remain visible after the texture materials of different layers are overlapped. Alternatively, the background part of a texture material layer is transparent and the foreground part is opaque, so that a complete pet image of a virtual pet is obtained after the multiple texture material layers are overlapped.
Fig. 6 shows 8 skin materials 101 to 108, which are respectively white skin, red skin, orange skin, yellow skin, green skin, cyan skin, blue skin, and purple skin, and the various skins can be distinguished by different colors and have the same shape; correspondingly, 8 speckle materials 201 to 208, 8 ear materials 301 to 308, 8 beard materials 401 to 408, 8 texture materials 501 to 508, 8 eye materials 601 to 608, and 8 mouth materials 701 to 708 are also shown.
Alternatively, the 8 speckle materials 201 to 208 in fig. 6 are divided according to different shapes, but the speckle material of the same shape may be divided into more speckle materials according to different colors; correspondingly, the 8 pattern materials 501 to 508 in fig. 6 are divided according to different shapes, but the pattern material in the same shape can be divided into more pattern materials according to different colors; correspondingly, the 8 eye materials 601 to 608 in fig. 6 are divided according to different shapes, but the eye material of the same shape can also be divided into more eye materials according to different colors, which is not limited in the present application.
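The combinatorics behind the "billions" claim is simply the product of the per-layer material counts. A quick check of the arithmetic, using the 8 materials per layer shown in fig. 6 and one hypothetical subdivided configuration:

```python
# 8 interchangeable materials in each of the 7 texture layers of fig. 6
# (the three-dimensional body model is shared by all pet cats):
base_count = 8 ** 7
print(base_count)      # 2097152  (about two million distinct images)

# If every layer were subdivided by color into, say, 32 variants
# (a hypothetical figure), the same scheme would exceed thirty billion:
richer_count = 32 ** 7
print(richer_count)    # 34359738368
```

This is why subdividing the speckle, pattern, and eye materials by color, as this paragraph describes, grows the image space exponentially rather than linearly.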
Optionally, the terminal determines the ith layer of target production material from the ith layer of production material according to the material identifier corresponding to the ith layer of target production material.
In one illustrative example, the image parameters of a pet cat include: target skin material 102, target patch material 201, target ear material 301, target beard material 404, target pattern material 501, target eye material 604, and target mouth material 704. The image parameters can be abbreviated as: {102, 201, 301, 404, 501, 604, 704}. In other embodiments, the image parameters may be represented by key-value pairs, abbreviated as: {(skin, 102), (patches, 201), (ears, 301), (beard, 404), (pattern, 501), (eyes, 604), (mouth, 704)}.
After the terminal acquires the image parameters of the pet cat, it extracts the target skin material from the skin material set according to the material identifier 102; extracts the target patch material from the patch material set according to the material identifier 201; extracts the target ear material from the ear material set according to the material identifier 301; extracts the target beard material from the beard material set according to the material identifier 404; extracts the target pattern material from the pattern material set according to the material identifier 501; extracts the target eye material from the eye material set according to the material identifier 604; and extracts the target mouth material from the mouth material set according to the material identifier 704.
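With the key-value form of the image parameters, the extraction above becomes one plain dictionary lookup per layer. A sketch with only one material per layer filled in; the layer names and material strings are illustrative:

```python
# Material set keyed by character part; only the needed entries are shown.
MATERIAL_SET = {
    "skin":    {102: "red skin"},
    "patches": {201: "patch style 1"},
    "ears":    {301: "ear style 1"},
    "beard":   {404: "beard style 4"},
    "pattern": {501: "pattern style 1"},
    "eyes":    {604: "eye style 4"},
    "mouth":   {704: "mouth style 4"},
}

# Key-value image parameters from the example above:
image_params = {"skin": 102, "patches": 201, "ears": 301, "beard": 404,
                "pattern": 501, "eyes": 604, "mouth": 704}

# One lookup per layer yields the seven target texture materials:
targets = {part: MATERIAL_SET[part][mid] for part, mid in image_params.items()}
print(targets["eyes"])  # eye style 4
```

The same loop works whether the parameters arrive as a bare identifier list or as key-value pairs; only the indexing scheme changes.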
In step 507, the terminal superimposes and combines n layers of target making materials of the three-dimensional virtual pet according to the hierarchical sequence to generate a pet image of the three-dimensional virtual pet.
The hierarchical order is used for representing the sequential overlapping relation between the target making materials of all layers when the pet image is generated. Optionally, each layer of the production material also has a respective display priority. The hierarchical order of each layer of the object creation material may be the same as the display priority of the corresponding hierarchy, or may be different from the display priority of the corresponding hierarchy.
Taking the hierarchical order ear > skin > patches > pattern > eyes > mouth > beard as an example, after the terminal determines the n layers of target production materials, it sequentially superimposes the target ear material 1, target skin material 2, target patch material 3, target pattern material 4, target eye material 5, target mouth material 6, and target beard material 7 of the pet cat on the three-dimensional body model 0 of the pet cat, thereby generating a pet cat with an individualized pet image. The terminal or server can generate billions of different pet cats in the manner described above; fig. 8 shows 87 different cats generated in this manner.
After the pet image of the three-dimensional virtual pet is generated, the terminal displays the pet image of the three-dimensional virtual pet on a user interface of an application program; and/or the terminal superposes the pet image of the three-dimensional virtual pet on the real scene picture in the real scene picture shot by the camera; and/or the terminal displays the pet image of the three-dimensional virtual pet in a virtual environment world in an overlapping mode.
Optionally, when the pet image of the three-dimensional virtual pet is three-dimensional, the terminal may further display the pet image from different observation angles and/or different observation distances in a three-dimensional display mode, which is not limited in this embodiment of the present application.
In summary, in the method provided in this embodiment, n layers of production materials of the three-dimensional virtual pet are provided, the n layers of production materials respectively correspond to n character parts of the three-dimensional virtual pet, each layer of production material includes at least two different materials corresponding to the same part, and the n layers of target production materials of the current three-dimensional virtual pet are determined according to the image parameters sent by the server, so that an exponential number of different pet images, for example billions of distinct pet images, can be generated for the same type of three-dimensional virtual pet from a limited set of production materials.
Because the image parameters of each virtual pet are unique, each virtual pet is itself unique. When the pet information of a virtual pet is stored on the blockchain system, its image parameters are authentic and unique, which guarantees the collectible value of each virtual pet as well as the scarcity of some virtual pets.
In an alternative embodiment based on fig. 5, the terminal may obtain the image parameters of a three-dimensional virtual pet through a purchasing method, which will be explained below by using the embodiment shown in fig. 9; the terminal can also obtain the image parameters of a three-dimensional virtual pet by breeding, which will be explained below by using the embodiment shown in fig. 10.
FIG. 9 is a flowchart illustrating a method for generating a three-dimensional virtual pet according to another exemplary embodiment of the present application. The embodiment is exemplified by applying the method to the computer system shown in fig. 1. The method comprises the following steps:
in step 901, the server randomly generates image parameters and preview pictures of a plurality of three-dimensional virtual pets;
taking a three-dimensional pet cat as an example, the server randomly generates in advance the image parameters of a plurality of three-dimensional virtual pets and a preview image of each three-dimensional virtual pet. Optionally, the server randomly generates the image parameters of first-generation three-dimensional virtual pets, for example generating the image parameters of a batch of first-generation three-dimensional virtual pets every 10 days.
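The random generation can be sketched as one uniform draw per texture layer. The identifier layout (eight materials per layer, numbered 101-108, 201-208, and so on) follows fig. 6; the function itself is an assumption about how the server might implement the draw:

```python
import random

# Base of the identifier range for each layer: skin, patches, ears,
# beard, pattern, eyes, mouth (as numbered in fig. 6).
LAYER_BASES = [100, 200, 300, 400, 500, 600, 700]

def random_image_params(rng):
    """Pick one of the 8 material identifiers in each of the 7 texture
    layers, producing the image parameters of one first-generation pet."""
    return [base + rng.randint(1, 8) for base in LAYER_BASES]

rng = random.Random(0)          # seeded so a batch is reproducible
params = random_image_params(rng)
# params is a list of 7 identifiers, one per layer, e.g. skin in 101..108
```

A batch job generating many first-generation pets would simply call this in a loop, storing each result alongside a freshly assigned pet identifier and rendered preview.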
Optionally, the server further stores the pet identification of each three-dimensional virtual pet, the avatar parameter of each three-dimensional virtual pet, and the preview of each three-dimensional virtual pet. In some embodiments, the server stores the pet identification and character parameters for each three-dimensional virtual pet in a blockchain system.
Table One schematically shows the relationship between the three.

Table One

Pet cat identifier | Image parameters                  | Preview
6613               | 101, 204, 305, 405, 506, 601, 702 | Picture 1
6614               | 102, 202, 303, 404, 504, 602, 708 | Picture 2
6611               | 102, 203, 303, 404, 502, 601, 707 | Picture 3
6609               | 102, 208, 301, 401, 502, 602, 706 | Picture 4
In different embodiments, the material identifier is represented by one or a combination of numbers, letters, special characters, and Chinese characters; this embodiment only illustrates the material identifier using a three-digit representation, and the specific representation form of the material identifier is not limited.
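Table One is, in effect, a mapping from pet identifier to image parameters, and the server's later lookup by pet identifier is a plain dictionary access. A sketch with the table's rows loaded in memory (preview images omitted):

```python
# Table One as an in-memory mapping: pet identifier -> image parameters.
PET_TABLE = {
    "6613": [101, 204, 305, 405, 506, 601, 702],
    "6614": [102, 202, 303, 404, 504, 602, 708],
    "6611": [102, 203, 303, 404, 502, 601, 707],
    "6609": [102, 208, 301, 401, 502, 602, 706],
}

def lookup_image_params(pet_id):
    """Return the stored image parameters for a pet identifier."""
    return PET_TABLE[pet_id]

print(lookup_image_params("6611"))  # [102, 203, 303, 404, 502, 601, 707]
```

In a deployment, this mapping would live in the server's database or the blockchain system rather than in process memory; the structure of the lookup is the same.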
In step 902, the server sends preview images of a plurality of three-dimensional virtual pets to the terminal;
after an application program in the terminal is started, the user logs in to a user account in the application program. The application program then receives an interface display operation from the user. When the interface display operation requests display of a preview interface for three-dimensional virtual pets, the application requests preview images of a plurality of three-dimensional virtual pets from the server. Optionally, the preview interface may be a purchase interface for three-dimensional virtual pets.
Correspondingly, the terminal receives the preview images of the three-dimensional virtual pets transmitted by the server. Illustratively, the server sends the pet identification and the preview image of the three-dimensional virtual pet in the table I to the terminal, and the terminal receives the pet identification and the preview image of the three-dimensional virtual pet. The preview may be a two-dimensional image.
In step 903, the terminal displays a first user interface for purchasing a virtual object;
and after receiving the pet identification and the preview image of the three-dimensional virtual pet, the terminal generates a first user interface according to the pet identification and the preview image of the three-dimensional virtual pet. The first user interface is an interface for purchasing a virtual object.
Fig. 10 schematically shows the display process of the first user interface. First, a preview interface 91 for a plurality of pet cats is displayed in the application program; the preview interface 91 includes pet identifiers and preview images of 8 different pet cats. When the user wishes to purchase the first pet cat in the second row, clicking the card area where that pet cat is located jumps to the first user interface 92 for that pet cat. The first user interface 92 displays a preview image of the pet cat, a price trend, the pet identifier of the pet cat, the character of the pet cat, and a purchase button. The purchase button also displays the price of the pet cat, 634 coins; the user can click the purchase button to make the purchase.
In step 904, the terminal receives a purchase operation triggered on the first user interface;
optionally, a control, such as a purchase button, is displayed on the first user interface for purchasing the three-dimensional virtual pet. When the user operates the control on the first user interface, the terminal receives purchase operation triggered on the first user interface.
In step 905, the terminal sends a first obtaining request to the server according to the purchasing operation, wherein the first obtaining request is used for purchasing the three-dimensional virtual pet;
optionally, the first obtaining request carries a pet identifier of the three-dimensional virtual pet. Or the first obtaining request also carries the pet identification and the price of the three-dimensional virtual pet; or the first acquisition request also carries a pet identifier of the three-dimensional virtual pet and a user account; or the first obtaining request also carries the pet identification, the price and the user account of the three-dimensional virtual pet.
And after receiving the purchase operation on the first user interface, the application program in the terminal sends a first acquisition request to the server.
In step 906, the server receives a first acquisition request;
and after receiving the first acquisition request, the server extracts the pet identification of the three-dimensional virtual pet from the first acquisition request. Optionally, the server further extracts the price of the three-dimensional virtual pet and the user account number from the first obtaining request.
The server determines whether the user account number meets the purchase condition of the three-dimensional virtual pet, wherein the purchase condition comprises but is not limited to at least one of the following conditions: the user account has a purchase permission, the account resources of the user account are larger than the price of the three-dimensional virtual pet, and the login state of the user account is in an effective state.
And when the user account accords with the purchasing condition of the three-dimensional virtual pet, the server transfers the account resources corresponding to the price from the user account to another user account corresponding to the seller, and the successful purchase is determined.
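The condition check and resource transfer can be sketched as follows. The account fields (`has_permission`, `balance`, `logged_in`) and the return values are assumptions for illustration, not the patent's server logic:

```python
def try_purchase(buyer, seller, price):
    """Check the three purchase conditions named above, then transfer the
    account resources from buyer to seller. Accounts are plain dicts with
    hypothetical fields."""
    if not buyer.get("has_permission", False):
        return "no purchase permission"
    if buyer.get("balance", 0) < price:
        return "insufficient account resources"
    if not buyer.get("logged_in", False):
        return "login state not valid"
    buyer["balance"] -= price
    seller["balance"] = seller.get("balance", 0) + price
    return "ok"

buyer = {"has_permission": True, "balance": 700, "logged_in": True}
seller = {"balance": 0}
print(try_purchase(buyer, seller, 634))      # ok
print(buyer["balance"], seller["balance"])   # 66 634
```

A real server would perform the balance check and transfer inside one transaction so a concurrent purchase cannot spend the same resources twice; the sketch omits that concern.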
In step 907, the server determines image parameters of the three-dimensional virtual pet according to the first obtaining request;
when the obtaining request is a first obtaining request for purchasing the three-dimensional virtual pet, the server determines the image parameters of the three-dimensional virtual pet according to the pet identification in the first obtaining request.
The server extracts the pet identifier of the three-dimensional virtual pet from the first acquisition request, and then determines, from the stored image parameters, the image parameters corresponding to that pet identifier.
Wherein, the image parameters of the three-dimensional virtual pet are randomly generated.
Illustratively, in conjunction with table one, when the pet identifier is "6611", determining the avatar parameter corresponding to the pet identifier includes: 102. 203, 303, 404, 502, 601, 707.
In step 908, the server sends the image parameters of the three-dimensional virtual pet to the terminal;
and the server sends the pet identification and the image parameters of the three-dimensional virtual pet to the terminal. Illustratively, the server sends first feedback information to the terminal, wherein the first feedback information carries the pet identifier and the image parameters of the three-dimensional virtual pet.
In step 909, the terminal receives the image parameters of the three-dimensional virtual pet sent by the server;
and the terminal extracts the image parameters of the three-dimensional virtual pet from the first feedback information.
In step 910, the terminal determines n layers of target making materials of the three-dimensional virtual pet from the material set of the three-dimensional virtual pet according to the image parameters;
the application program of the terminal stores resource files. The resource file includes a material collection of the three-dimensional virtual pet. And the terminal determines the target production material from the material set according to the material identifier in the image parameter.
Taking the case where the three-dimensional virtual pet is a pet cat as an example, the material set of the pet cat includes 8 layers of production materials: a three-dimensional body model plus 7 layers of texture material. The pet cats may share the same three-dimensional body model, and the 7 layers of texture material include: skin, patches, ears, beard, pattern, eyes, and mouth.
And the terminal determines the ith layer of target making material from the ith layer of making material according to the material identification corresponding to the ith layer of target making material.
Taking the example that the image parameters of the three-dimensional virtual pet comprise material identifiers 102, 203, 303, 404, 502, 601 and 707, after the terminal acquires the image parameters of the pet cat, extracting a target skin material from a skin material set according to the material identifiers 102; extracting target speckle materials from the speckle material set according to the material identifier 203; extracting target ear material from the ear material set according to the material identification 303; extracting target beard materials from the set of beard materials according to the material identification 404; extracting target pattern materials from the pattern material set according to the material identification 502; extracting target eye materials from the eye material set according to the material identification 601; target mouth material is extracted from the set of mouth material according to the material identification 707.
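The per-layer extraction above amounts to a simple lookup, which can be sketched as follows; mapping the hundreds digit of a material identifier to a layer name is an assumption inferred from the example ids, not stated in the application.

```python
# Hypothetical decoding of image parameters into per-layer target materials,
# assuming the hundreds digit of each material id names its texture layer.
LAYER_NAMES = {1: "skin", 2: "patch", 3: "ear", 4: "beard",
               5: "pattern", 6: "eye", 7: "mouth"}

def select_target_materials(image_params):
    """Return a mapping {layer_name: material_id} for the texture layers."""
    return {LAYER_NAMES[material_id // 100]: material_id
            for material_id in image_params}

targets = select_target_materials([102, 203, 303, 404, 502, 601, 707])
```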
In step 911, the terminal superimposes and combines n layers of target making materials of the three-dimensional virtual pet according to the hierarchical order to generate the pet image of the three-dimensional virtual pet.
The hierarchical order is used for representing the sequential overlapping relation between the target making materials of all layers when the pet image is generated. Optionally, each layer of the production material also has a respective display priority. The hierarchical order of each layer of the object creation material may be the same as the display priority of the corresponding hierarchy, or may be different from the display priority of the corresponding hierarchy.
The hierarchical order may be: ear > skin > speckle > pattern > eye > mouth > beard.
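A minimal sketch of the superposition in step 911, assuming the order above reads left-to-right as bottom-to-top (earlier layers are drawn first); the function name and the string stand-ins for real assets are illustrative, not the patent's implementation.

```python
# Assumed hierarchical order, bottom layer first.
HIERARCHICAL_ORDER = ["ear", "skin", "patch", "pattern", "eye", "mouth", "beard"]

def compose_pet_image(body_model, targets):
    """Stack the body model and each layer's target material bottom-to-top.
    targets: {layer_name: material_id}; returns the stack handed to a renderer."""
    stack = [body_model]
    for layer in HIERARCHICAL_ORDER:
        if layer in targets:  # a layer may be absent (e.g. dropped by a loss rule)
            stack.append(targets[layer])
    return stack

stack = compose_pet_image("cat_body_model", {"skin": 102, "eye": 601})
```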
In summary, in the method provided in this embodiment, n layers of production materials of the three-dimensional virtual pet are provided, the n layers correspond respectively to n character parts of the three-dimensional virtual pet, each layer includes at least two different materials for the same part, and the n layers of target production materials of the current three-dimensional virtual pet are determined according to the image parameters sent by the server. In this way, an exponential number of different pet images, for example hundreds of millions, can be generated for the same type of three-dimensional virtual pet from a limited set of production materials.
In the method provided by this embodiment, the server generates the image parameters of the three-dimensional virtual pet randomly, which maximizes the total number of pet images that can be generated. For example, divide a three-dimensional virtual pet into 4 material layers and make 4 material shapes for each layer, each shape in 4 different colors; each layer then has 16 variants, so 16 × 16 × 16 × 16 = 65536 three-dimensional virtual pets with different images can be produced.
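The count above can be checked directly: the total number of distinct images is the product of the per-layer variant counts.

```python
# Total distinct pet images = product of the number of variants in each layer.
def total_images(variants_per_layer):
    total = 1
    for v in variants_per_layer:
        total *= v
    return total

# 4 layers, each with 4 shapes x 4 colors = 16 variants -> 16**4 images.
count = total_images([16, 16, 16, 16])  # -> 65536
```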
FIG. 11 is a flowchart illustrating a method for generating a three-dimensional virtual pet according to another exemplary embodiment of the present application. The embodiment is exemplified by applying the method to the computer system shown in fig. 1. The method comprises the following steps:
in step 1101, the terminal receives a breeding operation triggered on a second user interface;
the second user interface is a user interface for breeding a child three-dimensional virtual pet based on the parent three-dimensional virtual pet. The second user interface may be one user interface or a combination of several user interfaces. The user can pair the father three-dimensional virtual pet and the mother three-dimensional virtual pet, and then breeding is carried out on the father three-dimensional virtual pet and the mother three-dimensional virtual pet to obtain a new three-dimensional virtual pet.
Optionally, at least one of the father three-dimensional virtual pet and the mother three-dimensional virtual pet belongs to the current user account. For example, the current user account has a father three-dimensional virtual pet, and that father pet is paired with a mother three-dimensional virtual pet owned by another user account to breed a new three-dimensional virtual pet; for another example, the current user account has a mother three-dimensional virtual pet, and that mother pet is paired with a father three-dimensional virtual pet owned by another user account to breed a new three-dimensional virtual pet; for another example, the current user account has both a father three-dimensional virtual pet and a mother three-dimensional virtual pet, from which a new three-dimensional virtual pet is bred.
Taking the example where the three-dimensional virtual pet is a pet cat, fig. 12 schematically illustrates the second user interface. The user can add a mother pet cat 'pet cat 000033' in the mother column, add a father pet cat 'cat 000024' in the father column, and breed a new pet cat through two pet cats. The user clicking the 'confirm breeding' button on the second user interface can be regarded as the breeding operation triggered by the user.
In step 1102, the terminal sends a second acquisition request to the server according to the breeding operation, wherein the second acquisition request is used for requesting to generate a three-dimensional virtual pet according to the manufacturing material of the three-dimensional virtual pet of the parent and the genetic rule;
and the terminal sends a second acquisition request to the server according to the breeding operation triggered by the user. The second acquisition request is used for requesting the server to generate the three-dimensional virtual pet according to the manufacturing material of the three-dimensional virtual pet of the father and mother and the genetic rule.
In some embodiments, the second obtaining request carries: the pet identification of the three-dimensional virtual pet of the father and the mother and/or the image parameters of the three-dimensional virtual pet of the father and the mother. In still other embodiments, the second get request may further include: a first user account corresponding to the father three-dimensional virtual pet and a second user account corresponding to the mother three-dimensional virtual pet.
In step 1103, the server receives a second acquisition request, where the second acquisition request carries the pet identifier of the parent's three-dimensional virtual pet;
and after receiving the second acquisition request, the server extracts the information carried in the second acquisition request. For example, the server extracts from the second acquisition request: the pet identification of the three-dimensional virtual pet of the father and the mother and/or the image parameters of the three-dimensional virtual pet of the father and the mother; for another example, the server further extracts from the second acquisition request: a first user account corresponding to the father three-dimensional virtual pet, and a second user account corresponding to the mother three-dimensional virtual pet.
In step 1104, the server determines the image parameters of the three-dimensional virtual pet according to the second obtaining request;
When the acquisition request is a second acquisition request for requesting generation of the three-dimensional virtual pet from the parent three-dimensional virtual pets, the server generates the three-dimensional virtual pet from the production materials of the parent three-dimensional virtual pets according to the genetic rule.
Optionally, when the second acquisition request carries the pet identifiers of the father and mother three-dimensional virtual pets, the server obtains the image parameters of the father three-dimensional virtual pet according to its pet identifier, obtains the image parameters of the mother three-dimensional virtual pet according to its pet identifier, and then generates the three-dimensional virtual pet from the two sets of image parameters according to the genetic rule.
Illustratively, a genetic rule comprises: the i target production materials of the bred three-dimensional virtual pet are the same as the corresponding production materials of the father three-dimensional virtual pet, the j target production materials of the bred three-dimensional virtual pet are the same as the corresponding production materials of the mother three-dimensional virtual pet, and i + j is equal to n.
Illustratively, another genetic rule comprises: a target production materials of the bred three-dimensional virtual pet are the same as the corresponding production materials of the father three-dimensional virtual pet, b target production materials are the same as the corresponding production materials of the mother three-dimensional virtual pet, and c1 target production materials are randomly generated, where a + b + c1 is equal to n.
Illustratively, yet another genetic rule comprises: a target production materials of the bred three-dimensional virtual pet are the same as the corresponding production materials of the father three-dimensional virtual pet, b target production materials are the same as the corresponding production materials of the mother three-dimensional virtual pet, and c2 target production materials are generated according to a preset correspondence, where the preset correspondence maps the pet identifier of the father three-dimensional virtual pet to the c2 target production materials, and a + b + c2 is equal to n.
Illustratively, another genetic rule includes: at least one of an inheritance rule, a mutation rule, and a loss rule;
the inheritance rule means that all or part of the image material of the third pet image of the child virtual pet is copied from the first pet image and/or the second pet image. Optionally, the inheritance rule further includes: all or part of image materials of a third pet image of the child virtual pet are copied from the pet image of the ancestor virtual pet;
the variation rule means that the third pet image of the child virtual pet comprises image materials obtained by variation, and the image materials obtained by variation are image materials which are not possessed by the first pet image and the second pet image;
the loss rule is that when the first pet image and/or the second pet image has/have image material of global feature, the third pet image has image material of global feature, and the image material of global feature is integrated image material covering the body model of the virtual pet and has the highest display priority.
The genetic rules above are merely examples; this embodiment does not limit the specific form of the genetic rule.
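As one concrete illustration, the rule "a layers from the father, b from the mother, c1 random, with a + b + c1 equal to n" might be sketched as follows. Which layers come from which parent, and the id scheme for randomly generated materials, are assumptions of this sketch, not specified by the application.

```python
import random

def breed_image_params(father, mother, a, b, c, variants_per_layer=8, seed=None):
    """Child image parameters: `a` layers copied from the father, `b` from the
    mother, `c` generated randomly; requires a + b + c == n layers."""
    n = len(father)
    assert len(mother) == n and a + b + c == n
    rng = random.Random(seed)
    order = list(range(n))
    rng.shuffle(order)  # assumed: the split of layers among parents is random
    child = [0] * n
    for idx in order[:a]:
        child[idx] = father[idx]          # inherited from the father
    for idx in order[a:a + b]:
        child[idx] = mother[idx]          # inherited from the mother
    for idx in order[a + b:]:
        # randomly generated, using the assumed layer*100 + variant id scheme
        child[idx] = (idx + 1) * 100 + rng.randint(1, variants_per_layer)
    return child
```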
In step 1105, the server sends the image parameters of the three-dimensional virtual pet to the terminal;
and the server sends the pet identification and the image parameters of the three-dimensional virtual pet to the terminal. Illustratively, the server sends second feedback information to the terminal, wherein the second feedback information carries the pet identifier and the image parameters of the three-dimensional virtual pet bred this time.
Optionally, there is a time difference between the feedback time of the second feedback information and the sending time of the second acquisition request, where the time difference is greater than a threshold. This time difference may be referred to as the breeding time. Illustratively, the breeding time is several hours, days, weeks, or months.
In step 1106, the terminal receives the image parameters of the three-dimensional virtual pet sent by the server;
and the terminal extracts the image parameters of the three-dimensional virtual pet from the second feedback information.
In step 1107, the terminal determines n layers of target making materials of the three-dimensional virtual pet from the material set of the three-dimensional virtual pet according to the image parameters;
the application program of the terminal stores resource files. The resource file includes a material collection of the three-dimensional virtual pet. And the terminal determines the target production material from the material set according to the material identifier in the image parameter.
Taking a pet cat as an example of the three-dimensional virtual pet, the material set of the pet cat comprises 8 layers of production materials: a three-dimensional body model plus 7 layers of texture material. All pet cats may share the same three-dimensional body model, and the 7 layers of texture material include: skin, patches, ears, beard, patterns, eyes, and mouth.
And the terminal determines the ith layer of target making material from the ith layer of making material according to the material identification corresponding to the ith layer of target making material.
In step 1108, the terminal superimposes and combines the n layers of target making materials of the three-dimensional virtual pet according to the hierarchical order to generate the pet image of the three-dimensional virtual pet.
The hierarchical order is used for representing the sequential overlapping relation between the target making materials of all layers when the pet image is generated. Optionally, each layer of the production material also has a respective display priority. The hierarchical order of each layer of the object creation material may be the same as the display priority of the corresponding hierarchy, or may be different from the display priority of the corresponding hierarchy.
The hierarchical order may be: ear > skin > speckle > pattern > eye > mouth > beard. Illustratively, the eye has the highest display priority, and the display priority of other production materials is associated with the hierarchical order. For example, the display priority of each hierarchy is: eyes > beard > mouth > pattern > speckle > skin > ears.
In summary, in the method provided in this embodiment, n layers of production materials of the three-dimensional virtual pet are provided, the n layers correspond respectively to n character parts of the three-dimensional virtual pet, each layer includes at least two different materials for the same part, and the n layers of target production materials of the current three-dimensional virtual pet are determined according to the image parameters sent by the server. In this way, an exponential number of different pet images, for example hundreds of millions, can be generated for the same type of three-dimensional virtual pet from a limited set of production materials.
According to the method provided by this embodiment, a new child three-dimensional virtual pet is bred from existing father and mother three-dimensional virtual pets, and the production materials of the child are derived from the production materials of its parents according to the genetic rule. This simulates inheritance in the real world, so the pet image of the three-dimensional virtual pet has a more realistic simulation effect.
The following are apparatus embodiments of the present application, which correspond one-to-one with the method embodiments above; for details not described in the apparatus embodiments, reference may be made to the corresponding method embodiments.
Fig. 13 is a block diagram of an apparatus for generating a three-dimensional virtual pet according to an exemplary embodiment of the present application. The apparatus may be implemented as all or a portion of the terminal in software, hardware, or a combination of both. Three-dimensional virtual pets are provided in the apparatus, and the pet image of at least one of the three-dimensional virtual pets is generated based on genetic rules. The apparatus includes: a sending module 1320, a receiving module 1340, a determining module 1360, and a generating module 1380.
A sending module 1320, configured to send an obtaining request of the three-dimensional virtual pet to the server.
The receiving module 1340 is configured to receive the image parameters of the three-dimensional virtual pet sent by the server.
A determining module 1360, configured to determine n layers of target production materials of the three-dimensional virtual pet from the material set of the three-dimensional virtual pet according to the image parameters; the material set comprises n layers of production materials, the n layers of production materials correspond respectively to n character parts of the three-dimensional virtual pet, at least one layer of production material comprises at least two different materials corresponding to the same part, the ith layer of target production material is one of the ith layer of production materials, and i is greater than or equal to 0 and less than or equal to n. i is an integer and n is a positive integer greater than 1. Optionally, n is 8.
The generating module 1380 is configured to superimpose and combine the n layers of target production materials of the three-dimensional virtual pet in hierarchical order to generate the pet image of the three-dimensional virtual pet.
In an optional embodiment, the image parameters comprise material identifications of n layers of target production materials of the three-dimensional virtual pet;
the determining module 1360 is configured to determine, according to the material identifier of the i-th layer of the target production material of the three-dimensional virtual pet, the i-th layer of the target production material from the i-th layer of the production material set.
In an alternative embodiment, the n layers of manufacturing materials include: a three-dimensional body model of a three-dimensional virtual pet and n-1 layers of texture materials;
the generating module 1380 is configured to, based on the three-dimensional body model of the three-dimensional virtual pet, sequentially superimpose each layer of the texture material on the three-dimensional body model according to the hierarchical order of the n-1 layers of the texture material.
In some alternative embodiments, the at least two layers of textured material comprise at least two of the following layers of textured material: ear, skin, patches, patterns, eyes, mouth, and beard.
In alternative embodiments, the materials in the same layer of textured material have different colors and/or shapes.
In other alternative embodiments, each material in the layers of texture material has a transparency greater than 0% and less than 100%.
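Because each texture material is partially transparent, lower layers remain visible after superposition. A minimal sketch of one pixel's result using standard "over" alpha compositing follows; the blending formula is the standard one, not taken from this application, and colors are represented as (r, g, b, a) tuples with components in [0, 1].

```python
def blend_over(top, bottom):
    """Alpha-composite the `top` color over the `bottom` color ("over" operator)."""
    tr, tg, tb, ta = top
    br, bg, bb, ba = bottom
    out_a = ta + ba * (1.0 - ta)
    out_rgb = tuple((t * ta + b * ba * (1.0 - ta)) / out_a
                    for t, b in ((tr, br), (tg, bg), (tb, bb)))
    return out_rgb + (out_a,)

# A 50%-transparent red patch material over an opaque blue body-model pixel:
pixel = blend_over((1.0, 0.0, 0.0, 0.5), (0.0, 0.0, 1.0, 1.0))
```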
In some optional embodiments, the receiving module 1340 is configured to receive a purchase operation triggered on the first user interface; the sending module 1320 is further configured to send a first obtaining request to the server according to the purchasing operation, where the first obtaining request is used to purchase the three-dimensional virtual pet.
In some optional embodiments, the receiving module 1340 is configured to receive a breeding operation triggered on the second user interface; the sending module 1320 is configured to send a second obtaining request to the server according to the breeding operation, where the second obtaining request is used to request to generate the three-dimensional virtual pet according to the manufacturing material of the parent three-dimensional virtual pet and the genetic rule.
Fig. 14 is a block diagram illustrating a three-dimensional virtual pet generation apparatus according to an exemplary embodiment of the present application. The apparatus may be implemented as all or part of a server in software, hardware, or a combination of both. Three-dimensional virtual pets are provided in the apparatus, and the pet image of at least one of the three-dimensional virtual pets is generated based on genetic rules. The apparatus includes: a receiving module 1420, a determining module 1440, and a sending module 1460.
The receiving module 1420 is configured to receive an obtaining request of the three-dimensional virtual pet sent by the terminal;
a determining module 1440, configured to determine an image parameter of the three-dimensional virtual pet according to the acquisition request, where the image parameter is used to instruct the terminal to determine n layers of target production materials of the three-dimensional virtual pet from the material set of the three-dimensional virtual pet, the n layers of production materials correspond respectively to n character parts of the three-dimensional virtual pet, each layer of production material includes at least two different materials corresponding to the same part, the ith layer of target production material is one of the ith layer of production materials, and i is greater than or equal to 0 and less than or equal to n;
a sending module 1460, configured to send the image parameters of the three-dimensional virtual pet to the terminal.
In some optional embodiments, the determining module 1440 is configured to, when the obtaining request is a first obtaining request for purchasing the three-dimensional virtual pet, determine an image parameter of the three-dimensional virtual pet according to a pet identifier in the first obtaining request; wherein the image parameters of the three-dimensional virtual pet are randomly generated.
In some optional embodiments, the determining module 1440 is configured to generate the appearance parameters of the three-dimensional virtual pet according to the production materials of the parent three-dimensional virtual pet according to the genetic rules, when the obtaining request is a second obtaining request for requesting to generate the three-dimensional virtual pet according to the parent three-dimensional virtual pet.
Fig. 15 shows a block diagram of a terminal 1500 according to an exemplary embodiment of the present application. The terminal 1500 may be: a smart phone, a tablet computer, an MP3 player (Moving Picture Experts Group Audio Layer III), an MP4 player (Moving Picture Experts Group Audio Layer IV), a notebook computer, or a desktop computer. Terminal 1500 may also be referred to as user equipment, a portable terminal, a laptop terminal, a desktop terminal, or by other names.
In general, terminal 1500 includes: a processor 1501 and memory 1502.
Processor 1501 may include one or more processing cores, such as a 4-core processor, an 8-core processor, or the like. The processor 1501 may be implemented in at least one hardware form of a DSP (Digital Signal Processing), an FPGA (Field-Programmable Gate Array), and a PLA (Programmable Logic Array). Processor 1501 may also include a main processor and a coprocessor, where the main processor is a processor for processing data in an awake state, and is also called a Central Processing Unit (CPU); a coprocessor is a low power processor for processing data in a standby state. In some embodiments, the processor 1501 may be integrated with a GPU (Graphics Processing Unit), which is responsible for rendering and drawing the content required to be displayed on the display screen. In some embodiments, processor 1501 may also include an AI (Artificial Intelligence) processor for processing computational operations related to machine learning.
The memory 1502 may include one or more computer-readable storage media, which may be non-transitory. The memory 1502 may also include high-speed random access memory, as well as non-volatile memory, such as one or more magnetic disk storage devices, flash memory storage devices. In some embodiments, a non-transitory computer readable storage medium in memory 1502 is used to store at least one instruction for execution by processor 1501 to implement the method of generating a three-dimensional virtual pet provided by the method embodiments of the present application.
In some embodiments, the terminal 1500 may further include: a peripheral interface 1503 and at least one peripheral. The processor 1501, memory 1502, and peripheral interface 1503 may be connected by buses or signal lines. Various peripheral devices may be connected to peripheral interface 1503 via buses, signal lines, or circuit boards. Specifically, the peripheral device includes: at least one of radio frequency circuitry 1504, touch screen display 1505, camera 1506, audio circuitry 1507, positioning assembly 1508, and power supply 1509.
The peripheral interface 1503 may be used to connect at least one peripheral related to I/O (Input/Output) to the processor 1501 and the memory 1502. In some embodiments, the processor 1501, memory 1502, and peripheral interface 1503 are integrated on the same chip or circuit board; in some other embodiments, any one or two of the processor 1501, the memory 1502, and the peripheral interface 1503 may be implemented on separate chips or circuit boards, which is not limited in this embodiment.
The Radio Frequency circuit 1504 is used to receive and transmit RF (Radio Frequency) signals, also known as electromagnetic signals. The radio frequency circuitry 1504 communicates with communication networks and other communication devices via electromagnetic signals. The radio frequency circuit 1504 converts an electrical signal into an electromagnetic signal to transmit, or converts a received electromagnetic signal into an electrical signal. Optionally, the radio frequency circuit 1504 includes: an antenna system, an RF transceiver, one or more amplifiers, a tuner, an oscillator, a digital signal processor, a codec chipset, a subscriber identity module card, and so forth. The radio frequency circuit 1504 can communicate with other terminals via at least one wireless communication protocol. The wireless communication protocols include, but are not limited to: the world wide web, metropolitan area networks, intranets, generations of mobile communication networks (2G, 3G, 4G, and 5G), Wireless local area networks, and/or WiFi (Wireless Fidelity) networks. In some embodiments, the radio frequency circuit 1504 may also include NFC (Near Field Communication) related circuits, which are not limited in this application.
The display screen 1505 is used to display a UI (User Interface). The UI may include graphics, text, icons, video, and any combination thereof. When the display screen 1505 is a touch display screen, the display screen 1505 also has the ability to capture touch signals on or over the surface of the display screen 1505. The touch signal may be input to the processor 1501 as a control signal for processing. In this case, the display screen 1505 may also be used to provide virtual buttons and/or a virtual keyboard, also referred to as soft buttons and/or a soft keyboard. In some embodiments, display 1505 may be one, providing the front panel of terminal 1500; in other embodiments, display 1505 may be at least two, each disposed on a different surface of terminal 1500 or in a folded design; in still other embodiments, display 1505 may be a flexible display disposed on a curved surface or a folded surface of terminal 1500. Even further, the display 1505 may be configured in a non-rectangular irregular pattern, i.e., a shaped screen. The Display 1505 can be made of LCD (Liquid Crystal Display), OLED (Organic Light-Emitting Diode), and other materials.
The camera assembly 1506 is used to capture images or video. Optionally, the camera assembly 1506 includes a front camera and a rear camera. Generally, a front camera is disposed at a front panel of the terminal, and a rear camera is disposed at a rear surface of the terminal. In some embodiments, the number of the rear cameras is at least two, and each rear camera is any one of a main camera, a depth-of-field camera, a wide-angle camera and a telephoto camera, so that the main camera and the depth-of-field camera are fused to realize a background blurring function, and the main camera and the wide-angle camera are fused to realize panoramic shooting and VR (Virtual Reality) shooting functions or other fusion shooting functions. In some embodiments, camera assembly 1506 may also include a flash. The flash lamp can be a monochrome temperature flash lamp or a bicolor temperature flash lamp. The double-color-temperature flash lamp is a combination of a warm-light flash lamp and a cold-light flash lamp, and can be used for light compensation at different color temperatures.
The audio circuitry 1507 may include a microphone and speaker. The microphone is used for collecting sound waves of a user and the environment, converting the sound waves into electric signals, and inputting the electric signals to the processor 1501 for processing or inputting the electric signals to the radio frequency circuit 1504 to realize voice communication. For stereo capture or noise reduction purposes, multiple microphones may be provided, each at a different location of the terminal 1500. The microphone may also be an array microphone or an omni-directional pick-up microphone. The speaker is used to convert electrical signals from the processor 1501 or the radio frequency circuit 1504 into sound waves. The loudspeaker can be a traditional film loudspeaker or a piezoelectric ceramic loudspeaker. When the speaker is a piezoelectric ceramic speaker, the speaker can be used for purposes such as converting an electric signal into a sound wave audible to a human being, or converting an electric signal into a sound wave inaudible to a human being to measure a distance. In some embodiments, the audio circuitry 1507 may also include a headphone jack.
The positioning component 1508 is used to locate the current geographic position of the terminal 1500 to implement navigation or LBS (Location Based Service). The positioning component 1508 may be a positioning component based on the GPS (Global Positioning System) of the United States, the BeiDou system of China, or the Galileo system of the European Union.
Power supply 1509 is used to power the various components in terminal 1500. The power supply 1509 may be alternating current, direct current, disposable or rechargeable. When the power supply 1509 includes a rechargeable battery, the rechargeable battery may be a wired rechargeable battery or a wireless rechargeable battery. The wired rechargeable battery is a battery charged through a wired line, and the wireless rechargeable battery is a battery charged through a wireless coil. The rechargeable battery may also be used to support fast charge technology.
In some embodiments, the terminal 1500 also includes one or more sensors 1510. The one or more sensors 1510 include, but are not limited to: acceleration sensor 1511, gyro sensor 1512, pressure sensor 1513, fingerprint sensor 1514, optical sensor 1515, and proximity sensor 1516.
The acceleration sensor 1511 may detect the magnitude of acceleration on three coordinate axes of the coordinate system established with the terminal 1500. For example, the acceleration sensor 1511 may be used to detect components of the gravitational acceleration in three coordinate axes. The processor 1501 may control the touch screen display 1505 to display the user interface in a landscape view or a portrait view according to the gravitational acceleration signal collected by the acceleration sensor 1511. The acceleration sensor 1511 may also be used for acquisition of motion data of a game or a user.
The gyroscope sensor 1512 can detect the body direction and the rotation angle of the terminal 1500, and the gyroscope sensor 1512 and the acceleration sensor 1511 cooperate to collect the 3D motion of the user on the terminal 1500. The processor 1501 may implement the following functions according to the data collected by the gyro sensor 1512: motion sensing (such as changing the UI according to a user's tilting operation), image stabilization at the time of photographing, game control, and inertial navigation.
The pressure sensor 1513 may be disposed on a side bezel of the terminal 1500 and/or underneath the touch display 1505. When the pressure sensor 1513 is disposed on the side bezel of the terminal 1500, a grip signal of the user on the terminal 1500 can be detected, and the processor 1501 performs left/right-hand recognition or shortcut operations according to the grip signal collected by the pressure sensor 1513. When the pressure sensor 1513 is disposed underneath the touch display 1505, the processor 1501 controls operability controls on the UI according to the pressure operations of the user on the touch display 1505. The operability controls include at least one of a button control, a scroll bar control, an icon control, and a menu control.
The fingerprint sensor 1514 is configured to capture the user's fingerprint; the processor 1501 identifies the user based on the fingerprint captured by the fingerprint sensor 1514, or the fingerprint sensor 1514 itself identifies the user based on the captured fingerprint. Upon recognizing that the user's identity is trusted, the processor 1501 authorizes the user to perform relevant sensitive operations, including unlocking the screen, viewing encrypted information, downloading software, making payments, changing settings, and the like. The fingerprint sensor 1514 may be disposed on the front, back, or side of the terminal 1500. When a physical key or vendor logo is provided on the terminal 1500, the fingerprint sensor 1514 may be integrated with the physical key or vendor logo.
The optical sensor 1515 is used to collect the ambient light intensity. In one embodiment, the processor 1501 may control the display brightness of the touch display 1505 based on the ambient light intensity collected by the optical sensor 1515. Specifically, when the ambient light intensity is high, the display brightness of the touch display 1505 is increased; when the ambient light intensity is low, the display brightness of the touch display 1505 is decreased. In another embodiment, the processor 1501 may also dynamically adjust the shooting parameters of the camera assembly 1506 based on the ambient light intensity collected by the optical sensor 1515.
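A minimal sketch of such brightness adjustment; the lux thresholds and the [0, 1] brightness scale are illustrative assumptions, not values from this application:

```python
def display_brightness(lux: float, lo: float = 10.0, hi: float = 1000.0) -> float:
    """Map ambient light intensity (lux) to a display brightness in [0.0, 1.0].
    Below `lo` lux the display stays at a dim floor; above `hi` lux it runs
    at full brightness; in between, brightness rises linearly."""
    floor = 0.2
    if lux <= lo:
        return floor            # dim floor in dark surroundings
    if lux >= hi:
        return 1.0              # full brightness in bright light
    # Linear interpolation between the dim floor and full brightness.
    return floor + (1.0 - floor) * (lux - lo) / (hi - lo)
```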
The proximity sensor 1516, also known as a distance sensor, is typically provided on the front panel of the terminal 1500. The proximity sensor 1516 is used to collect the distance between the user and the front surface of the terminal 1500. In one embodiment, when the proximity sensor 1516 detects that the distance between the user and the front surface of the terminal 1500 gradually decreases, the processor 1501 controls the touch display 1505 to switch from the bright-screen state to the screen-off state; when the proximity sensor 1516 detects that the distance between the user and the front surface of the terminal 1500 gradually increases, the processor 1501 controls the touch display 1505 to switch from the screen-off state to the bright-screen state.
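The bright-screen/screen-off switching can be sketched as follows; the state names and the distance-trend comparison are illustrative assumptions:

```python
def screen_state(prev_distance: float, distance: float, current: str) -> str:
    """Decide the display state ('bright' or 'off') from two consecutive
    proximity-sensor readings (e.g. centimeters to the front panel)."""
    if distance < prev_distance:
        return "off"       # user moving closer, e.g. raising the phone to the ear
    if distance > prev_distance:
        return "bright"    # user moving away from the front panel
    return current         # distance unchanged: keep the current state
```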
Those skilled in the art will appreciate that the configuration shown in fig. 15 does not constitute a limitation of terminal 1500, and may include more or fewer components than shown, or some components may be combined, or a different arrangement of components may be employed.
Fig. 16 is a schematic structural diagram of a server according to an embodiment of the present application. Specifically, the server 1600 includes a central processing unit (CPU) 1601, a system memory 1604 including a random access memory (RAM) 1602 and a read-only memory (ROM) 1603, and a system bus 1605 connecting the system memory 1604 and the CPU 1601. The server 1600 also includes a basic input/output system (I/O system) 1606, which facilitates the transfer of information between devices within the computer, and a mass storage device 1607 for storing an operating system 1613, application programs 1614, and other program modules 1615.
The basic input/output system 1606 includes a display 1608 for displaying information and an input device 1609, such as a mouse or keyboard, for the user to input information. The display 1608 and the input device 1609 are both connected to the central processing unit 1601 through an input/output controller 1610 connected to the system bus 1605. The basic input/output system 1606 may also include the input/output controller 1610 for receiving and processing input from a number of other devices, such as a keyboard, a mouse, or an electronic stylus. Similarly, the input/output controller 1610 may also provide output to a display screen, a printer, or another type of output device.
The mass storage device 1607 is connected to the central processing unit 1601 through a mass storage controller (not shown) connected to the system bus 1605. The mass storage device 1607 and its associated computer-readable media provide non-volatile storage for the server 1600. That is, the mass storage device 1607 may include a computer-readable medium (not shown) such as a hard disk or a Compact Disc Read-Only Memory (CD-ROM) drive.
Without loss of generality, the computer-readable media may comprise computer storage media and communication media. Computer storage media include volatile and nonvolatile, removable and non-removable media implemented in any method or technology for the storage of information such as computer-readable instructions, data structures, program modules, or other data. Computer storage media include RAM, ROM, erasable programmable read-only memory (EPROM), electrically erasable programmable read-only memory (EEPROM), flash memory or other solid-state memory technology, CD-ROM, digital versatile discs (DVDs) or other optical storage, magnetic cassettes, magnetic tape, magnetic disk storage, or other magnetic storage devices. Of course, those skilled in the art will appreciate that computer storage media are not limited to the foregoing. The system memory 1604 and the mass storage device 1607 described above may be collectively referred to as memory.
According to various embodiments of the present application, the server 1600 may also operate with remote computers connected through a network such as the Internet. That is, the server 1600 may be connected to the network 1612 through the network interface unit 1611 coupled to the system bus 1605, or the network interface unit 1611 may be used to connect to other types of networks or remote computer systems (not shown).
The present application further provides a computer-readable storage medium, in which at least one instruction, at least one program, a code set, or a set of instructions is stored, and the at least one instruction, the at least one program, the code set, or the set of instructions is loaded and executed by the processor to implement the method for generating a three-dimensional virtual pet provided in the above-mentioned method embodiments.
The present application further provides a computer program product, which when run on an electronic device, causes the electronic device to execute the method for generating a three-dimensional virtual pet according to the above-mentioned method embodiments.
It will be understood by those skilled in the art that all or part of the steps for implementing the above embodiments may be implemented by hardware, or may be implemented by a program instructing relevant hardware, where the program may be stored in a computer-readable storage medium, and the above-mentioned storage medium may be a read-only memory, a magnetic disk or an optical disk, etc.
The above description is only exemplary of the present application and should not be taken as limiting the present application, as any modification, equivalent replacement, or improvement made within the spirit and principle of the present application should be included in the protection scope of the present application.

Claims (14)

1. A method for generating a three-dimensional virtual pet, applied to an application program provided with the three-dimensional virtual pet, wherein at least one pet image of the three-dimensional virtual pet is generated based on a genetic rule, the genetic rule being a rule for generating a pet image with unique characteristics for a child three-dimensional virtual pet after the pet images of a parent three-dimensional virtual pet and/or other ancestor three-dimensional virtual pets are recombined and deduplicated according to the genetic rule, the method comprising:
sending an acquisition request of the three-dimensional virtual pet to a server;
receiving image parameters of the three-dimensional virtual pet sent by the server, wherein the image parameters are generated by the server according to the production materials of the parent three-dimensional virtual pet and/or the other ancestor three-dimensional virtual pets and the genetic rule when the server confirms that the acquisition request is a second acquisition request for requesting generation of the three-dimensional virtual pet from the parent three-dimensional virtual pet;
determining n layers of target production materials of the three-dimensional virtual pet from a material set of the three-dimensional virtual pet according to the image parameters, wherein the material set comprises n layers of production materials, the n layers of production materials respectively correspond to n character parts of the three-dimensional virtual pet, at least one layer of production materials comprises at least two different materials corresponding to the same part, the i-th layer of target production material is one material in the i-th layer of production materials, i is greater than or equal to 1 and less than or equal to n, and n is an integer greater than 1;
and superposing and combining the n layers of target production materials of the three-dimensional virtual pet according to a hierarchical order to generate a pet image of the three-dimensional virtual pet.
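By way of illustration only (not part of the claimed method), the selection and layered superposition described above can be sketched as follows; the material set, the material identifiers, and the list representation of a "pet image" are hypothetical placeholders:

```python
def generate_pet_image(material_set, image_params):
    """material_set: list of n layers, each a dict mapping material id -> material;
    image_params: list of n chosen material ids, one per layer.
    Returns the pet image as the chosen target materials stacked in
    hierarchical (layer) order."""
    pet_image = []
    for layer, material_id in zip(material_set, image_params):
        # The i-th target production material is one material in the i-th layer.
        pet_image.append(layer[material_id])
    return pet_image


# Example: a 3-layer material set (body model, skin texture, eye texture).
materials = [
    {"body-1": "base body model"},
    {"skin-a": "brown skin", "skin-b": "white skin"},
    {"eye-a": "round eyes", "eye-b": "narrow eyes"},
]
image = generate_pet_image(materials, ["body-1", "skin-b", "eye-a"])
```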
2. The method of claim 1, wherein the image parameters include material identifiers of the n layers of target production materials of the three-dimensional virtual pet;
the determining n layers of target production materials of the three-dimensional virtual pet from the material set of the three-dimensional virtual pet according to the image parameters comprises:
determining the i-th layer of target production material from the i-th layer of production materials in the material set according to the material identifier of the i-th layer of target production material of the three-dimensional virtual pet.
3. The method of claim 2, wherein the n layers of production materials comprise: a three-dimensional body model of the three-dimensional virtual pet and n-1 texture material layers;
the method for generating the pet image of the three-dimensional virtual pet by overlapping and combining the n layers of target making materials of the three-dimensional virtual pet according to the hierarchical sequence comprises the following steps:
and on the basis of the three-dimensional body model of the three-dimensional virtual pet, sequentially superposing each layer of texture material on the three-dimensional body model according to the hierarchical sequence of the n-1 layers of texture material.
4. The method of claim 3, wherein the n-1 texture material layers comprise at least two of the following texture material layers:
an ear layer, a skin layer, a patch layer, a pattern layer, an eye layer, a mouth layer, and a beard layer.
5. The method of claim 4, wherein different materials in a same texture material layer differ in color and/or shape.
6. The method of claim 4, wherein each texture material layer has a transparency, the transparency being greater than 0% and less than 100%.
7. The method according to any one of claims 1 to 6, wherein the sending an acquisition request of the three-dimensional virtual pet to a server comprises:
receiving a purchase operation triggered on a first user interface;
and sending a first acquisition request to the server according to the purchase operation, wherein the first acquisition request is used for purchasing the three-dimensional virtual pet.
8. The method according to any one of claims 1 to 6, wherein the sending an acquisition request of the three-dimensional virtual pet to a server comprises:
receiving a breeding operation triggered on a second user interface;
and sending a second acquisition request to the server according to the breeding operation, wherein the second acquisition request is used for requesting generation of the three-dimensional virtual pet according to the production materials of the parent three-dimensional virtual pet and the genetic rule.
9. A method for generating a three-dimensional virtual pet, applied to a server provided with the three-dimensional virtual pet, wherein at least one pet image of the three-dimensional virtual pet is generated based on a genetic rule, the genetic rule being a rule for generating a pet image with unique characteristics for a child three-dimensional virtual pet after the pet images of a parent three-dimensional virtual pet and/or other ancestor three-dimensional virtual pets are recombined and deduplicated according to the genetic rule, the method comprising:
receiving an acquisition request of the three-dimensional virtual pet sent by a terminal;
when the acquisition request is a second acquisition request for requesting generation of the three-dimensional virtual pet from the parent three-dimensional virtual pet, generating image parameters of the three-dimensional virtual pet according to the production materials of the parent three-dimensional virtual pet and/or the other ancestor three-dimensional virtual pets and the genetic rule, wherein the image parameters are used for instructing the terminal to determine n layers of target production materials of the three-dimensional virtual pet from a material set of the three-dimensional virtual pet, the material set comprises n layers of production materials, the n layers of production materials respectively correspond to n character parts of the three-dimensional virtual pet, each layer of production materials comprises at least two different materials corresponding to the same part, the i-th layer of target production material is one material in the i-th layer of production materials, i is greater than or equal to 1 and less than or equal to n, and n is an integer greater than 1;
and sending the image parameters of the three-dimensional virtual pet to the terminal.
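As a non-authoritative illustration of the "recombine and deduplicate" genetic rule of claim 9, the sketch below picks each layer's material from one of the two parents and retries until the child's image parameters differ from every existing pet image; the per-layer random choice, the seeding scheme, and the retry budget are assumptions for this example, not the claimed rule:

```python
import random


def breed_image_params(parent_a, parent_b, seed=None):
    """parent_a / parent_b: lists of material ids, one per layer.
    Recombination step: for each layer, take that layer's material id
    from one of the two parents."""
    rng = random.Random(seed)
    return [rng.choice(pair) for pair in zip(parent_a, parent_b)]


def breed_unique(parent_a, parent_b, existing, attempts=100):
    """Deduplication step: retry recombination until the child's image
    parameters are not already present in `existing` (a set of tuples
    of material ids for previously generated pets)."""
    for i in range(attempts):
        child = breed_image_params(parent_a, parent_b, seed=i)
        if tuple(child) not in existing:
            return child
    return None  # no unique combination found within the attempt budget
```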
10. The method of claim 9, further comprising:
when the acquisition request is a first acquisition request for purchasing the three-dimensional virtual pet, determining the image parameters of the three-dimensional virtual pet according to a pet identifier in the first acquisition request;
wherein the image parameters of the three-dimensional virtual pet are randomly generated.
11. An apparatus for generating a three-dimensional virtual pet, wherein the apparatus is provided with the three-dimensional virtual pet, at least one pet image of the three-dimensional virtual pet is generated based on a genetic rule, and the genetic rule is a rule for generating a pet image with unique characteristics for a child three-dimensional virtual pet after the pet images of a parent three-dimensional virtual pet and/or other ancestor three-dimensional virtual pets are recombined and deduplicated according to the genetic rule, the apparatus comprising:
the sending module is used for sending an acquisition request of the three-dimensional virtual pet to a server;
the receiving module is used for receiving image parameters of the three-dimensional virtual pet sent by the server, wherein the image parameters are generated according to the production materials of the parent three-dimensional virtual pet and/or the other ancestor three-dimensional virtual pets and the genetic rule when the server confirms that the acquisition request is a second acquisition request for requesting generation of the three-dimensional virtual pet from the parent three-dimensional virtual pet;
the determining module is used for determining n layers of target production materials of the three-dimensional virtual pet from a material set of the three-dimensional virtual pet according to the image parameters, wherein the material set comprises n layers of production materials, the n layers of production materials respectively correspond to n character parts of the three-dimensional virtual pet, at least one layer of production materials comprises at least two different materials corresponding to the same part, the i-th layer of target production material is one material in the i-th layer of production materials, i is greater than or equal to 1 and less than or equal to n, and n is an integer greater than 1;
and the generating module is used for superposing and combining the n layers of target production materials of the three-dimensional virtual pet according to the hierarchical order to generate the pet image of the three-dimensional virtual pet.
12. An apparatus for generating a three-dimensional virtual pet, wherein the apparatus is provided with the three-dimensional virtual pet, at least one pet image of the three-dimensional virtual pet is generated based on a genetic rule, and the genetic rule is a rule for generating a pet image with unique characteristics for a child three-dimensional virtual pet after the pet images of a parent three-dimensional virtual pet and/or other ancestor three-dimensional virtual pets are recombined and deduplicated according to the genetic rule, the apparatus comprising:
the receiving module is used for receiving the acquisition request of the three-dimensional virtual pet sent by the terminal;
a determination module for determining whether the acquisition request is a second acquisition request for requesting generation of the three-dimensional virtual pet from the parent three-dimensional virtual pet, generating image parameters of the three-dimensional virtual pet according to the production materials of the parent three-dimensional virtual pet and/or the other ancestor three-dimensional virtual pets and genetic rules, the image parameter is used for instructing the terminal to determine n layers of target making materials of the three-dimensional virtual pet from the material set of the three-dimensional virtual pet, the material set comprises n layers of manufacturing materials, the n layers of manufacturing materials correspond to n angular color parts of the three-dimensional virtual pet respectively, each layer of manufacturing material comprises at least two different materials corresponding to the same part, the ith layer of target manufacturing material is one of the ith layer of manufacturing material, i is more than or equal to 1 and less than or equal to n, and n is an integer more than 1;
and the sending module is used for sending the image parameters of the three-dimensional virtual pet to the terminal.
13. An electronic device, comprising a memory and a processor;
the memory stores at least one program, and the at least one program is loaded and executed by the processor to implement the method for generating a three-dimensional virtual pet according to any one of claims 1 to 8, or the method for generating a three-dimensional virtual pet according to claim 9 or 10.
14. A computer-readable storage medium, wherein at least one program is stored in the computer-readable storage medium, and the at least one program is loaded and executed by a processor to implement the method for generating a three-dimensional virtual pet according to any one of claims 1 to 8, or the method for generating a three-dimensional virtual pet according to claim 9 or 10.
CN201810840540.9A 2018-07-27 2018-07-27 Three-dimensional virtual pet generation method, device, equipment and storage medium Active CN109126136B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201810840540.9A CN109126136B (en) 2018-07-27 2018-07-27 Three-dimensional virtual pet generation method, device, equipment and storage medium


Publications (2)

Publication Number Publication Date
CN109126136A CN109126136A (en) 2019-01-04
CN109126136B true CN109126136B (en) 2020-09-15

Family

ID=64799170

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201810840540.9A Active CN109126136B (en) 2018-07-27 2018-07-27 Three-dimensional virtual pet generation method, device, equipment and storage medium

Country Status (1)

Country Link
CN (1) CN109126136B (en)

Families Citing this family (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109908587B (en) * 2019-03-20 2022-07-15 北京小米移动软件有限公司 Method and device for generating image parameters of reproducible virtual character and storage medium
CN112634416B (en) * 2020-12-23 2023-07-28 北京达佳互联信息技术有限公司 Method and device for generating virtual image model, electronic equipment and storage medium
CN113769393A (en) * 2021-09-27 2021-12-10 上海完美时空软件有限公司 Method and device for generating character image, storage medium and electronic device

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103903292A (en) * 2012-12-27 2014-07-02 北京新媒传信科技有限公司 Method and system for realizing head portrait editing interface
CN104376160A (en) * 2014-11-07 2015-02-25 薛景 Real person simulation individuality ornament matching system
CN104965997A (en) * 2015-06-05 2015-10-07 浙江工业大学 Crop virtual breeding method based on plant function and structure model
CN106204698A (en) * 2015-05-06 2016-12-07 北京蓝犀时空科技有限公司 Virtual image for independent assortment creation generates and uses the method and system of expression
CN108295465A (en) * 2018-02-12 2018-07-20 腾讯科技(深圳)有限公司 Share the method, apparatus, equipment and storage medium in the visual field in three-dimensional virtual environment

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
GB2449694B (en) * 2007-05-31 2010-05-26 Sony Comp Entertainment Europe Entertainment system and method


Also Published As

Publication number Publication date
CN109126136A (en) 2019-01-04

Similar Documents

Publication Publication Date Title
JP7237096B2 (en) Virtual pet information display method and device, terminal, server, computer program and system thereof
JP7142113B2 (en) Virtual pet display method and device, terminal and program
CN110276840B (en) Multi-virtual-role control method, device, equipment and storage medium
JP7090837B2 (en) Virtual pet information display method and devices, terminals, servers and their computer programs
CN110585726A (en) User recall method, device, server and computer readable storage medium
CN109107166B (en) Virtual pet breeding method, device, equipment and storage medium
CN110420464B (en) Method and device for determining virtual pet image parameters and readable storage medium
CN110496392B (en) Virtual object control method, device, terminal and storage medium
WO2022052620A1 (en) Image generation method and electronic device
CN109126136B (en) Three-dimensional virtual pet generation method, device, equipment and storage medium
WO2020233403A1 (en) Personalized face display method and apparatus for three-dimensional character, and device and storage medium
CN112891931A (en) Virtual role selection method, device, equipment and storage medium
JP7186901B2 (en) HOTSPOT MAP DISPLAY METHOD, DEVICE, COMPUTER DEVICE AND READABLE STORAGE MEDIUM
CN112306332B (en) Method, device and equipment for determining selected target and storage medium
CN110399183B (en) Virtual pet breeding method, device, equipment and storage medium
CN109806583A (en) Method for displaying user interface, device, equipment and system
CN113599819A (en) Prompt message display method, device, equipment and storage medium
CN112711335B (en) Virtual environment picture display method, device, equipment and storage medium
CN112717391A (en) Role name display method, device, equipment and medium for virtual role
CN115869624A (en) Game area marking method, device, equipment and storage medium
CN113769397A (en) Virtual object setting method, device, equipment, medium and program product
CN115494992A (en) Region division method, device and equipment for virtual scene

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant