WO2013137609A1 - Method and system for providing an online game with character makeup - Google Patents
Method and system for providing an online game with character makeup
- Publication number
- WO2013137609A1 (PCT/KR2013/001952)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- makeup
- character
- information
- providing
- effect
- Prior art date
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06Q—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
- G06Q50/00—Information and communication technology [ICT] specially adapted for implementation of business processes of specific business sectors, e.g. utilities or tourism
- G06Q50/10—Services
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63F—CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
- A63F13/00—Video games, i.e. games using an electronically generated display having two or more dimensions
- A63F13/60—Generating or modifying game content before or while executing the game program, e.g. authoring tools specially adapted for game development or game-integrated level editor
- A63F13/63—Generating or modifying game content before or while executing the game program, e.g. authoring tools specially adapted for game development or game-integrated level editor by the player, e.g. authoring using a level editor
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63F—CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
- A63F13/00—Video games, i.e. games using an electronically generated display having two or more dimensions
- A63F13/30—Interconnection arrangements between game servers and game devices; Interconnection arrangements between game devices; Interconnection arrangements between game servers
- A63F13/35—Details of game servers
Definitions
- The present application relates to a technology for providing a game and, more specifically, to an online game providing method and system for providing character makeup, in which a virtual makeup function is given to the virtual character manipulated by a user so that the character can be decorated to reflect the user's individual personality.
- MORPG: multiplayer online role playing game
- A typical feature of such online games is the use of a virtual character that acts on the user's behalf as the object that progresses the game. The virtual character that directly performs the content of the online game therefore represents the user in the virtual world provided by the game.
- The present application seeks to provide a technology that gives the virtual character a makeup environment equivalent to actual makeup, so that the character can be decorated in various ways and embellished differently and individually from other characters.
- By setting the effects of makeup and the handling of each makeup tool to resemble actual makeup, the present application provides an online game technology in which the process of dressing up a character is itself as interesting as a game.
- The present application may vary the makeup effect by interpreting the user's click input differently according to the type of makeup tool: a fixed or variable application range to which the effect is applied is set, and an effect parameter is calculated for that range.
- The present invention thereby aims to provide an online game technology in which the makeup effect is applied efficiently and an input step for exerting the makeup effect is provided.
- The online game providing method is performed in an online game providing system that can connect with at least one game client through a network and can provide a virtual-character-based online game to the connected game client.
- The online game providing method includes: (a) providing to a game client a makeup input interface that displays a character's face together with a list of at least one makeup tool; (b) receiving, from the game client, makeup tool information on a tool selected from the makeup tool list and pointer input information entered on the displayed character's face; (c) determining the application method of the makeup effect using the makeup tool information, determining, according to the determined method, the application area to which the effect is applied using the pointer input information, and calculating the effect parameter to be applied to the determined area; and (d) correcting the character's face texture by applying the effect parameter to the application area, and providing the corrected face texture through the makeup input interface.
- the online game providing system may be connected with at least one game client through a network, and may provide an online game based on a virtual character to the connected game client.
- the online game providing system includes a makeup function control unit, an application area determination unit, and a parameter calculation unit.
- the makeup function controller provides a makeup input interface that displays a list of makeup tools and the appearance of the character, and provides the corrected appearance of the character by applying effect parameters to the designated application area.
- the application area determination unit determines an application area to which the makeup effect is to be applied using at least one of pointer input information about the displayed character's appearance and makeup tool information including the makeup tool selected from the tool list.
- the parameter calculator calculates an effect parameter to be applied to the determined application area by using at least one of the makeup tool information and the pointer input information.
- the recording medium records a program for executing the online game providing method.
- The program may be run in an online game providing system that can connect to at least one game client through a network and can provide a virtual-character-based online game to the connected game client, and causes the system to execute: (a) providing to a game client a makeup input interface that displays a character's face together with a list of at least one makeup tool; and (b) receiving, from the game client, makeup tool information on a tool selected from the makeup tool list, together with the subsequent steps.
- FIG. 1 is a reference diagram for explaining an online game providing system and a game client according to the disclosed technology.
- FIG. 2 is a block diagram illustrating an embodiment of an online game providing system according to the disclosed technology.
- FIG. 3 is a reference diagram illustrating one embodiment of a processing procedure for providing the makeup function according to the disclosed technology.
- FIG. 4 is a reference diagram illustrating an example of information about a makeup tool that may be stored in a makeup tool database.
- FIGS. 5 through 11 are reference diagrams for explaining various makeup tools and their effects that may be provided in the disclosed technology.
- FIG. 12 is a flowchart illustrating an embodiment of an online game providing method according to the disclosed technology.
- FIG. 13 is a flowchart illustrating another embodiment of an online game providing method according to the disclosed technology.
- Terms such as first and second are intended only to distinguish one component from another, and the scope of rights should not be limited by these terms.
- A first component may be named a second component and, similarly, the second component may also be named a first component.
- Identification codes (e.g., a, b, c) are used for convenience of description and do not describe the order of the steps; unless a specific order is clearly stated in context, the steps may occur out of the order noted. That is, the steps may occur in the order specified, may be performed substantially simultaneously, or may be performed in reverse order.
- the disclosed technology can be embodied as computer readable code on a computer readable recording medium, and the computer readable recording medium includes all kinds of recording devices in which data can be read by a computer system.
- Examples of computer-readable recording media include ROM, RAM, CD-ROM, magnetic tape, floppy disks, and optical data storage devices; media implemented in the form of a carrier wave (for example, transmission over the Internet) are also included.
- the computer readable recording medium can also be distributed over network coupled computer systems so that the computer readable code is stored and executed in a distributed fashion.
- FIG. 1 is a reference diagram for explaining an online game providing system and a game client according to the disclosed technology.
- the online game providing system 100 may provide an online game to at least one user (game client) at the same time through an online network environment.
- the online game provided by game providing system 100 may be an online game based on multiple users.
- The online game provided by the game providing system 100 is not limited to a specific genre. Since the online game according to the disclosed technology provides character makeup, any online game whose content is based on a character may be used, regardless of genre. For example, the technology can be applied to games of various genres such as Multiplayer Online Role Playing Games (MORPG), Massive Multiplayer Online Role Playing Games (MMORPG), sports online games, First-Person Shooting (FPS) games, or Third-Person Shooting (TPS) games.
- MORPG: Multiplayer Online Role Playing Game
- MMORPG: Massive Multiplayer Online Role Playing Game
- FPS: First-Person Shooting
- TPS: Third-Person Shooting
- The game providing system 100 may provide not only player-versus-player (PVP) content between users but also player-versus-environment (PVE) content between a user and the AI. It can also provide PVP among multiple users, PVE between multiple users and the AI, and game content that mixes users with AI participants.
- the game providing system 100 may perform at least some real-time data exchange with at least one game client 200 in order to provide an online game.
- the game providing system 100 may provide a makeup providing means for the virtual character provided in the game. Such a game providing system will be described in more detail below with reference to FIG. 2.
- the game client 200 is a game providing means that can be driven in the user terminal.
- The user terminal may be any terminal having a central processing unit, a storage unit, and an input/output unit, such as a PC, a tablet PC, or a smartphone.
- the game client 200 may communicate with the game providing system 100 to provide an online game to the user.
- the game client 200 may load at least a part of a game engine for driving a game. For example, at least some of the predetermined operations required to provide the online game may be performed in the game client 200 for fast processing.
- FIG. 2 is a block diagram illustrating an embodiment of an online game providing system according to the disclosed technology.
- The game providing system 100 includes a communication unit 110, a game engine 120, a game providing control unit 130, a makeup tool database 140, a character database 150, an application area determination unit 170, a parameter calculation unit 180, and a makeup function control unit 160.
- the communication unit 110 may set or maintain a communication environment for providing a game with the game client 200 under the control of the game providing control unit 130.
- The communication unit 110 may also communicate directly with the game client 200 under the control of the makeup function control unit 160.
- the game engine 120 is a component for driving a game and may be linked with the game client 200 to provide an online game to a user.
- Although the game engine 120 has been described as a component separate from the game providing control unit 130, according to an embodiment the game engine 120 may be implemented as a function of the game providing control unit 130.
- At least a portion of the game engine 120 may be loaded on, or shared with, the game client 200.
- the game providing controller 130 may control other components of the game providing system 100 to provide an online game.
- The game providing control unit 130 may pass makeup-related requests or data to the makeup function control unit 160; that is, processes related to the makeup function are controlled by the makeup function control unit 160. For example, when a user creates a new character, or requests modification of the character's appearance according to a predetermined procedure, the makeup function may be provided as part of the character generation or modification function, and for the makeup function the game providing control unit 130 may hand over at least part of its control authority to the makeup function control unit 160.
- Although the game providing control unit 130 is shown as a component separate from the makeup function control unit 160, this is merely a functional distinction for convenience of description. The disclosed technology is not limited to configuring the two as independent components; the makeup function control unit 160 may be implemented as a function of the game providing control unit 130.
- Through the makeup tool database 140, the character database 150, the application area determination unit 170, the parameter calculator 180, and the makeup function control unit 160, the disclosed technology may provide a makeup function for the virtual character to the user (game client).
- FIG. 3 is a reference diagram illustrating one embodiment of a processing procedure for providing the makeup function according to the disclosed technology.
- The disclosed technology provides a plurality of makeup tools, and the attributes 320 set for each tool may differ. Because the application method for inputting makeup differs by tool, the movement of the pointer is interpreted using the user's input 310 together with the attributes 320 of the makeup tool.
- An area to which the actual makeup effect is applied (hereinafter, the "application area") is determined in consideration of the application range of the makeup tool, and a parameter to be adjusted according to the makeup effect in that area (hereinafter, the "effect parameter") is calculated and reflected to change the attributes of the texture, thereby providing the makeup.
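The flow just described — interpret the user's input 310 through the tool's attributes 320, determine the application area, calculate the effect parameter, and change the texture — can be sketched as follows. All names, data shapes, and the click-centred brush rule are illustrative assumptions, not taken from the patent:

```python
# Hypothetical sketch of the claimed makeup pipeline. The texture is modeled
# as a dict mapping (x, y) pixels to a color value.

def determine_application_area(tool, pointer_input):
    # Simplest case: a square brush area centred on the click point.
    r = tool["brush_radius"]
    x, y = pointer_input["click"]
    return [(x + dx, y + dy)
            for dx in range(-r, r + 1)
            for dy in range(-r, r + 1)]

def calculate_effect_parameter(tool):
    # Simplest case: the tool's preset color is the effect parameter.
    return tool["color"]

def apply_makeup(texture, tool, pointer_input):
    area = determine_application_area(tool, pointer_input)
    effect = calculate_effect_parameter(tool)
    for pixel in area:
        texture[pixel] = effect          # correct the face texture
    return texture

tex = apply_makeup({}, {"brush_radius": 1, "color": "pink"}, {"click": (5, 5)})
```

A real implementation would blend the effect into the existing texture rather than overwrite pixel values outright; this sketch only shows the three-stage structure.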
- the makeup tool database 140 may store information about the makeup tool applied to makeup the virtual character.
- The makeup tool database 140 may store information about at least one of: the adjustment parameter for each makeup tool, the user input element, the application method, the application range designation method, the application part, the brush area, and the variable values.
- FIG. 4 is a reference diagram illustrating an example of information about a makeup tool that may be stored in the makeup tool database 140.
- The makeup tool database 140 may store or maintain a predetermined table containing attribute information for each makeup tool.
- the attribute information for each makeup tool may include a fixed attribute and a variable attribute.
- Variable attributes refer to attributes that can be selected or adjusted by the user.
- The variable attribute may be set when the user selects a sub-menu after selecting the makeup tool in the makeup input interface. For example, if the user selects the foundation as the makeup tool, the size of the foundation's brush area is a variable attribute and may be selected by the user.
- the fixed property is a property that is predetermined for each makeup tool and is not changed by the user.
- the disclosed technique may interpret a user's input by using a fixed attribute for each makeup tool.
- The adjustment parameter refers to the property of the texture that the makeup tool changes. "Revert" among the adjustment parameters means removing the makeup effect. If the adjustment parameter is a color, the new color may be mixed with the existing color or may overwrite it.
- The user input element indicates which user input can change the adjustment parameter.
- For example, the gloss effect may be determined according to the drag length.
- The application method is an attribute indicating how the user's input is related to the attribute change for each makeup tool.
- the application range designation method means a method of determining an area to which the makeup effect is applied.
- the application part means at least a part of the appearance of the virtual character to which the effect of the makeup tool is applied.
- The brush area refers to the region covered by a single application of the tool (i.e., the brush size).
- The variable value is an attribute specifying the aforementioned variable attribute.
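The per-tool attributes listed above (and illustrated in FIG. 4) can be pictured as a record per makeup tool. The field names and the two sample tools below are hypothetical stand-ins, not values from the patent's table:

```python
# Hypothetical record mirroring the makeup tool database entries described
# above: adjustment parameter, user input element, application method,
# application part, brush area (fixed attribute), and variable attributes.
from dataclasses import dataclass, field

@dataclass
class MakeupTool:
    name: str
    adjustment_param: str     # e.g. "color", "brightness", "gloss", "revert"
    input_element: str        # e.g. "click", "drag_length"
    application_method: str   # "free_drawing", "fixed_area", "deformation_area"
    application_part: str     # e.g. "lips", "cheeks", "face"
    brush_area: int           # fixed attribute: base brush size
    variable_attrs: dict = field(default_factory=dict)  # user-adjustable values

lip_gloss = MakeupTool("lip gloss", "gloss", "drag_length",
                       "fixed_area", "lips", brush_area=0)
foundation = MakeupTool("foundation", "color", "click",
                        "free_drawing", "face", brush_area=8,
                        variable_attrs={"brush_area": [4, 8, 16]})
```

The `variable_attrs` dict holds the selectable values (here, the foundation's brush sizes), matching the description that variable attributes are chosen by the user through a sub-menu.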
- the character database 150 may store information about the virtual character.
- the character database 150 may store texture information about the virtual character.
- The character database 150 may store the texture information necessary for displaying a virtual character, and may provide the texture information of a specific character at the request of another component.
- the makeup function controller 160 may control the makeup tool database 140, the character database 150, the application area determiner 170, or the parameter calculator 180 to provide a makeup function for the virtual character.
- the makeup function controller 160 may provide a makeup input interface for inputting makeup to the game client 200.
- the game client 200 may display at least a part of the appearance of the virtual character (eg, the face of the character) through the makeup input interface.
- The game client 200 may further display a list of makeup tools together with at least part of the virtual character's appearance through the makeup input interface.
- the makeup function controller 160 may receive at least one of makeup tool information or pointer input information from the game client 200 that provides the makeup input interface.
- the makeup tool information may include identification information about the makeup tool for any one selected from a list of at least one makeup tool provided in the makeup input interface.
- The pointer input information is information about the pointer input entered by the user.
- the pointer input information may include at least one of click information, drag length information, and drag direction information.
- The makeup function control unit 160 may identify the makeup tool from the makeup tool information, and may interpret the pointer movement in the pointer input information by reflecting the attributes of the identified tool.
- the makeup function control unit 160 may determine the application method of the makeup effect using the makeup tool information. For example, the makeup function controller 160 may search the makeup tool database 140 by using identification information of the makeup tool included in the makeup tool information, and check the adjustment parameter among the properties of the makeup tool.
- the makeup function controller 160 may classify the application method of the makeup effect based on the application area.
- The application methods of the makeup effect include at least one of: (1) a free drawing method, in which the moving line of the pointer is designated as the application area; (2) a fixed area method, in which the application area is fixed; and (3) a deformation area method, in which the application area varies according to the pointer input information.
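The three application methods above amount to a dispatch on the tool's attribute. A minimal sketch, with illustrative region data and an assumed one-cell-per-unit drag rule:

```python
# Hypothetical dispatch over the three claimed application methods.
# FIXED_REGIONS and region_from_drag are illustrative stand-ins.
FIXED_REGIONS = {"lips": [(0, 0), (1, 0), (0, 1), (1, 1)]}

def region_from_drag(drag_length, direction):
    # Deformation area: extend one cell per unit of drag along the drag axis.
    dx, dy = direction
    return [(i * dx, i * dy) for i in range(1, int(drag_length) + 1)]

def determine_area(tool, pointer):
    method = tool["application_method"]
    if method == "free_drawing":
        # (1) the pointer's moving line itself becomes the application area
        return list(pointer["path"])
    if method == "fixed_area":
        # (2) a predefined region (e.g. the lips), regardless of exact click
        return FIXED_REGIONS[tool["application_part"]]
    if method == "deformation_area":
        # (3) the region varies with the drag input
        return region_from_drag(pointer["drag_length"], pointer["drag_direction"])
    raise ValueError(f"unknown application method: {method}")
```

For example, a fixed-area tool returns its whole preset region, while a deformation-area tool grows its region with the drag length.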
- the makeup function controller 160 may apply the makeup effect to the character using the application area determining unit 170 or the parameter calculator 180 based on the application method of the makeup effect.
- The makeup function controller 160 may provide the application area determiner 170 or the parameter calculator 180 with information about the classified application method of the makeup effect.
- The makeup function control unit 160 checks the makeup tool information, determines whether the makeup tool corresponds to the free drawing method, the fixed area method, or the deformation area method, and provides the identified information to the application area determination unit 170 or the parameter calculator 180.
- The makeup function control unit 160 may apply the effect parameter calculated by the parameter calculation unit 180 to the application area determined by the application area determination unit 170 to correct the character's appearance (that is, to change the texture values).
- For an area where makeup inputs overlap, the makeup function controller 160 may cover the existing makeup with the new input or mix the new makeup with the existing makeup.
- The makeup function controller 160 may determine whether to cover or blend the makeup effects by checking whether the makeup tools are the same and whether the effect parameters belong to the same category.
- The makeup function control unit 160 may cover the existing makeup effect if the makeup tool is the same and the effect parameters belong to the same category, and may mix the effects if the tools differ or the effect parameters differ.
- For example, if makeup is input twice with the same eye shadow and the same color (the effect parameters belong to the same category), the makeup function controller 160 may overwrite the overlapping part with the most recent input.
- If the makeup is input with different eye shadows, or the colors of the eye shadows differ, the makeup function control unit 160 may output a mix of the two colors for the overlapping part.
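The overwrite-versus-mix rule above can be sketched as a small function. The RGB-averaging mix is an assumption; the patent does not specify the blending formula:

```python
# Hypothetical overlap rule: the same tool with the same effect-parameter
# category overwrites the earlier input; otherwise the two effects are mixed
# (here: averaging RGB channels, an assumed blending rule).
def combine(prev, new):
    """prev/new: (tool_name, rgb_color) applied to the same texture region."""
    prev_tool, prev_color = prev
    new_tool, new_color = new
    if prev_tool == new_tool and prev_color == new_color:
        return new_color                      # overwrite with the recent input
    # different tool, or different color: blend the two effects
    return tuple((a + b) // 2 for a, b in zip(prev_color, new_color))

same = combine(("eye shadow", (200, 60, 60)), ("eye shadow", (200, 60, 60)))
mixed = combine(("eye shadow", (200, 60, 60)), ("eye shadow", (0, 0, 200)))
```

With the same eye shadow and color, `same` is simply the recent input's color; with differing colors, `mixed` is a blend of the two.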
- The makeup function control unit 160 may enlarge at least a partial area of the character's appearance using the makeup tool information and provide it to the game client 200. For example, if the makeup tool selected while a full-body shot of the character is displayed is a blusher, the character's face may be enlarged in the makeup input interface and provided to the game client 200. As another example, if an eyebrow razor is selected while the character's face is displayed, the eyebrow portion may be overlaid on the current face, or enlarged and displayed around the eyebrows.
- the makeup function controller 160 may provide the corrected character appearance to the game client 200 again to continuously provide the makeup function.
- the application area determination unit 170 may determine an application area to which the makeup effect is to be applied using at least one of the pointer input information or the makeup tool information according to the application method provided from the makeup function control unit 160.
- The application area determination unit 170 checks the brush area for the corresponding makeup tool, interprets the click information included in the pointer input information, and designates one brush area per click; in the case of multiple clicks, the application area is determined to include the plurality of brush areas. For example, in the case of the dotting tool in the example shown in FIG. 6, the point at which each click occurs can be determined as the application area. As another example, in the case of the foundation shown in FIG. 8, the application area can be determined to include the path traced by the pointer while clicked.
- The application area determination unit 170 may determine the application area using at least one of the makeup tool information and the click information. For example, when the makeup tool is a blusher and a click is made around the cheekbone, the application area may be determined by designating a brush area (the size of the blusher, etc.) at the click point. However, if the tool is a blusher and the click is made near the lower jaw, no application area may be determined.
- For a fixed area, the application area determination unit 170 may determine the internal area of a closed curve as the application area. For example, in the case of the lip gloss shown in FIG. 5, when a click is made in the lip region, the entire lip region may be determined as the application area.
- The application area determination unit 170 may also determine the application area using the pointer input information. In more detail, it may determine the application area to which the makeup effect is applied according to the user's drag input. For example, in the case of the mascara in the example shown in FIG. 9, the region to which the mascara is applied may be determined according to the drag input the user performs after placing the pointer near the eye.
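The click-based determination above — one brush area per valid click, no area for a click outside the tool's application part — can be sketched as follows. The cheek region and brush shape are illustrative stand-ins:

```python
# Hypothetical click-based area determination: each click stamps one square
# brush area, but only when it falls inside the tool's permitted application
# part (a blusher click on the cheek counts; one on the jaw yields no area).
CHEEK = {(x, y) for x in range(10) for y in range(10)}   # stand-in region

def areas_from_clicks(clicks, brush_radius, allowed=CHEEK):
    area = []
    for cx, cy in clicks:
        if (cx, cy) not in allowed:
            continue                  # outside the application part: skipped
        area += [(cx + dx, cy + dy)
                 for dx in range(-brush_radius, brush_radius + 1)
                 for dy in range(-brush_radius, brush_radius + 1)]
    return area
```

A click at (5, 5) inside the cheek region yields a full brush area, while a click at (50, 50) outside it contributes nothing, mirroring the blusher example above.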
- For the deformation area method, the application area determination unit 170 presets and maintains, for each makeup tool, a plurality of deformation areas divided differently according to horizontal and vertical inputs, and determines the application area by determining which deformation area the input drag information corresponds to.
- The application area determination unit 170 calculates the drag lengths along the X and Y axes using the drag length information and drag direction information included in the pointer input information, and determines the application area by dividing the calculated X-axis and Y-axis drag lengths by the predetermined unit length for each makeup tool to obtain the X-axis and Y-axis application extents.
- the parameter calculator 180 may calculate an effect parameter to be applied to the determined application area by using at least one of the makeup tool information and the pointer input information.
- the parameter calculator 180 may calculate the effect parameter to be applied to the application area by interpreting the pointer input information reflecting the application method.
- The parameter calculator 180 may calculate, as the effect parameter, a brushing effect preset for each makeup tool or selected by the user.
- The makeup function control unit 160 may apply the makeup effect by applying the calculated brushing effect to the application area including the at least one brush area. For example, in the case of the dotting tool in the example shown in FIG. 6, an effect corresponding to a dot may be calculated as the effect parameter for the point where the click occurs.
- detailed parameters such as dot size, color, etc. may be provided and selected through the makeup input interface.
- the parameter calculator 180 may calculate an effect parameter by using pointer input information performed on the determined application area.
- The parameter calculator 180 identifies the click information or drag length information performed on the determined application area, and calculates the effect value to be applied by weighting the number of clicks or the length of the drag identified therefrom.
- The parameter calculator 180 may determine the parameter item (adjustment parameter) to be adjusted according to the makeup tool, and calculate the effect parameter by reflecting the effect value calculated for that item. For example, in the case of the lip gloss shown in FIG. 5, a short drag, as shown in FIG. 5(a), yields a lower gloss value as the effect parameter.
- The parameter item (adjustment parameter) to be adjusted according to the makeup tool may be any one of color, brightness, gloss, and revert.
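The weighting described above — a longer drag (or more clicks) yields a stronger effect value, attached to the tool's adjustment parameter — can be sketched as follows. The linear weights and the cap at 1.0 are assumptions for illustration:

```python
# Hypothetical effect-parameter calculation: weight the effect value by drag
# length and click count, cap it, and attach it to the tool's adjustment
# parameter (e.g. a longer lip-gloss drag gives a higher gloss value).
def effect_parameter(tool, drag_length=0.0, clicks=0):
    value = min(1.0, 0.1 * drag_length + 0.2 * clicks)   # assumed weighting
    return {tool["adjustment_param"]: value}

short = effect_parameter({"adjustment_param": "gloss"}, drag_length=2.0)
long_ = effect_parameter({"adjustment_param": "gloss"}, drag_length=8.0)
```

With these assumed weights, the short drag produces a lower gloss value than the long drag, matching the FIG. 5 lip-gloss example.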
- the parameter calculator 180 may calculate an effect parameter by using an effect set according to the makeup tool for the determined application area.
- FIGS. 5 through 11 are reference diagrams for explaining various makeup tools and their effects that may be provided in the disclosed technology.
- FIG. 5 shows an example for a lip gloss, which may be a makeup tool targeting a fixed area (the lips) as described above. When the user drags as shown, the application area determination unit 170 may determine the lips as the application area, and the parameter calculator 180 may calculate the effect parameter to be applied according to the drag length. The makeup function control unit 160 may apply the effect parameter to the lips to change the corresponding texture and provide the game client 200 with information about the changed character.
- FIG. 6 illustrates an example of a dotting tool: the point where the user clicks becomes the application area, and the dot size selected by the user (or the default) serves as the effect parameter.
- FIG. 7 illustrates an example of a highlighter, which may be applied in the fixed-area designation manner. That is, as shown in the illustrated example, the regions to which the highlighter can be applied, such as the forehead, the bridge of the nose, the area around the eyes, and the cheeks, may be preset, and the effect (brightness) of the highlighter may be calculated using the click or drag information of the user's brush touch.
- the foundation may be applied in a free drawing manner. Accordingly, the brush area applied along the movement line of the pointer after the user's click becomes the application area, and the makeup effect may be applied using the color selected by the user (or the default color) as the effect parameter.
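One way to picture the free drawing manner is to stamp a circular brush area at each sampled pointer position along the movement line; this sketch is illustrative only, and the brush radius, sampling, and coordinates are assumed:

```python
# Hypothetical sketch of the free drawing method: the union of brush disks
# stamped along the pointer's movement line forms the application area.
# Brush radius and sample coordinates are made up for illustration.

def free_drawing_area(path, brush_radius=2):
    """Union of brush-radius disks centered on each pointer sample."""
    area = set()
    r2 = brush_radius * brush_radius
    for cx, cy in path:
        for dx in range(-brush_radius, brush_radius + 1):
            for dy in range(-brush_radius, brush_radius + 1):
                if dx * dx + dy * dy <= r2:
                    area.add((cx + dx, cy + dy))  # texture pixel covered
    return area

pointer_path = [(10, 10), (11, 10), (12, 11)]  # assumed drag samples
area = free_drawing_area(pointer_path, brush_radius=1)
print(len(area))  # pixels the foundation effect would cover
```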
- FIGS. 10 and 11 are examples showing that the disclosed technology can apply a makeup effect to parts of the virtual character's appearance other than the face.
- FIG. 10 illustrates an example of a body balm; a makeup effect may be given by taking the area over which the body balm is dragged on the character's appearance as the application area (free drawing), and using the color or gloss of the body balm as the effect parameter.
- FIG. 11 illustrates an example of a manicure (or pedicure); as shown in FIG. 11, the user's free drawing path is taken as the application area (free drawing), and a makeup effect may be provided using a preset or selected color or gloss effect as the parameter.
- FIG. 12 is a flowchart illustrating an embodiment of an online game providing method according to the disclosed technology.
- the game providing system 100 provides the game client 200 with a makeup input interface that displays a character's face and a list of at least one makeup tool (step S1210).
- the game providing system 100 receives, from the game client 200 through the makeup input interface, makeup tool information on the tool selected from the makeup tool list and pointer input information input for the displayed character's face (step S1220, YES), and determines the application method of the makeup effect using the makeup tool information (step S1230).
- the game providing system 100 may determine an application area to which the makeup effect is to be applied using the pointer input information according to the determined application method (step S1240), and calculate an effect parameter to be applied to the determined application area (step S1250).
- the game providing system 100 may apply the effect parameter to the application area to correct the face texture of the character (step S1260), and provide the face of the character whose texture is corrected through the makeup input interface (step S1270).
- the application method of the makeup effect may be classified based on the application area, and may include at least one of (1) a free drawing method that designates the moving line of the pointer as the application area, (2) a fixed area method in which the application area is fixed, and (3) a deformation area method in which the application area changes according to the pointer input information.
- the pointer input information may include at least one of click information, drag length information, and drag direction information.
- the game providing system 100 checks whether the makeup tool identified by the makeup tool information belongs to the free drawing method, and if so, checks the blushing area for that makeup tool.
- the game providing system 100 may change the texture of the corresponding area by applying the checked blushing area for each click according to the click information.
- the game providing system 100 checks whether the makeup tool identified by the makeup tool information belongs to the fixed area method, and if so, may determine the application area using at least one of the makeup tool information and the click information.
- the game providing system 100 may calculate the effect parameter using the pointer input information made for the determined application area.
- the game providing system 100 may determine an application area by using a closed curve.
- the game providing system 100 may check whether a click is performed within a predetermined closed curve using the click information, and if the click is performed, determine the inner region of the closed curve as the application area.
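The closed-curve check could be realized with a standard ray-casting point-in-polygon test; this is a hypothetical sketch, not the patent's implementation, and the lip polygon coordinates are made up for illustration:

```python
# Illustrative fixed-area check: a preset closed curve (here, a polygon
# roughly standing in for the lips) and a test of whether the click landed
# inside it; if so, the curve's interior becomes the application area.

def point_in_closed_curve(x, y, polygon):
    """Ray-casting test: is (x, y) inside the closed polygon?"""
    inside = False
    n = len(polygon)
    for i in range(n):
        x1, y1 = polygon[i]
        x2, y2 = polygon[(i + 1) % n]
        if (y1 > y) != (y2 > y):  # edge straddles the horizontal ray
            if x < (x2 - x1) * (y - y1) / (y2 - y1) + x1:
                inside = not inside
    return inside

LIP_CURVE = [(40, 70), (60, 65), (80, 70), (60, 80)]  # assumed coordinates

click = (60, 72)
if point_in_closed_curve(*click, LIP_CURVE):
    application_area = LIP_CURVE  # interior of the curve becomes the area
```

A click outside the curve simply leaves the application area undetermined, matching the "if the click is performed" condition above.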
- the game providing system 100 may calculate the effect parameter using the click or drag length.
- the game providing system 100 identifies the click information or drag length information performed on the determined application area, and calculates the effect value to be applied by assigning a weight according to the number of clicks or the length of the drag identified from that information.
- the game providing system 100 may determine a parameter item to be adjusted according to the makeup tool, and calculate an effect parameter by reflecting an effect value calculated for the determined parameter item.
- the parameter item may be any one of color, brightness, gloss and revert.
- the game providing system 100 checks whether the makeup tool identified by the makeup tool information belongs to the deformation area method, and if so, may determine the application area using the pointer input information.
- the game providing system 100 may calculate an effect parameter using an effect set according to the makeup tool for the determined application area.
- the game providing system 100 may calculate the variable area using the X-axis and Y-axis drag lengths.
- the game providing system 100 calculates the drag lengths along the X and Y axes using the drag length information and drag direction information included in the pointer input information, and may calculate the X-axis application area and the Y-axis application area by dividing the calculated X-axis and Y-axis drag lengths by the unit length preset for each makeup tool.
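The X-axis/Y-axis decomposition above might look like the following; the per-tool unit lengths and the angle-based decomposition of the drag are assumptions for illustration, not values from the patent:

```python
# Rough sketch of the deformation-area calculation: decompose the drag into
# X-axis and Y-axis lengths, then divide each by the unit length preset for
# the makeup tool to obtain the application area in whole units.
import math

UNIT_LENGTH = {"eyeliner": 5.0, "eyeshadow": 8.0}  # assumed per-tool presets

def deformation_area(tool, drag_length, drag_angle_deg):
    """Split a drag into X/Y components and scale by the tool's unit length."""
    rad = math.radians(drag_angle_deg)
    dx = drag_length * math.cos(rad)  # X-axis drag length
    dy = drag_length * math.sin(rad)  # Y-axis drag length
    unit = UNIT_LENGTH[tool]
    return int(abs(dx) // unit), int(abs(dy) // unit)  # X and Y areas

# A horizontal 30-unit drag with the eyeliner's 5-unit preset.
x_area, y_area = deformation_area("eyeliner", drag_length=30.0, drag_angle_deg=0.0)
print(x_area, y_area)
```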
- FIG. 13 is a flowchart illustrating another embodiment of an online game providing method according to the disclosed technology.
- the game providing system 100 provides the game client 200 with a makeup input interface that displays the appearance of a predetermined character (step S1310), and may receive the makeup tool information and the pointer input information through the provided makeup input interface (step S1320, YES).
- the game providing system 100 may select any one of application methods of the plurality of makeup effects according to the makeup tool information (step S1330).
- the game providing system 100 may determine an application area to which the makeup effect is to be applied using at least one of the makeup tool information and the pointer input information according to the selected application method (step S1340), and calculate the effect parameter to be applied to the determined application area (step S1350).
- the game providing system 100 may apply the effect parameter to the application area to correct the face texture of the character (step S1360), and provide the appearance of the character whose texture is corrected through the makeup input interface (step S1370).
- the game providing system 100 may enlarge at least a partial area of the character's appearance to which the makeup tool can be applied, and provide it to the game client 200.
- in step S1330, if the makeup tool is any one of a foundation, a dot, a concealer, and a manicure, the game providing system 100 may select, as the application method, the free drawing method that designates the effect according to the moving line of the pointer.
- in step S1330, if the makeup tool is any one of a blusher, a highlighter, a lipstick, and a lip gloss, the game providing system 100 may use, as the application method, the fixed area method in which the application area of the effect is fixed for each makeup tool.
- in step S1330, if the makeup tool is one of the eyeliner and the eyeshadow, the game providing system 100 may use, as the application method, the deformation area method in which the application area of the effect varies according to the pointer input information.
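The tool-to-method selection of step S1330 can be summarized as a simple dispatch; the set contents mirror the description above, while the identifiers themselves are illustrative:

```python
# Hedged sketch of the step-S1330 selection: each makeup tool maps to one of
# the three application methods named in the description.

FREE_DRAWING = {"foundation", "dot", "concealer", "manicure"}
FIXED_AREA = {"blusher", "highlighter", "lipstick", "lip_gloss"}
DEFORMATION_AREA = {"eyeliner", "eyeshadow"}

def select_application_method(tool):
    if tool in FREE_DRAWING:
        return "free_drawing"      # effect follows the pointer's moving line
    if tool in FIXED_AREA:
        return "fixed_area"        # application area fixed per tool
    if tool in DEFORMATION_AREA:
        return "deformation_area"  # area varies with pointer input
    raise ValueError(f"unknown makeup tool: {tool}")

print(select_application_method("eyeliner"))
```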
Landscapes
- Business, Economics & Management (AREA)
- Engineering & Computer Science (AREA)
- Tourism & Hospitality (AREA)
- Multimedia (AREA)
- General Health & Medical Sciences (AREA)
- Human Resources & Organizations (AREA)
- Marketing (AREA)
- Primary Health Care (AREA)
- Strategic Management (AREA)
- Physics & Mathematics (AREA)
- General Business, Economics & Management (AREA)
- General Physics & Mathematics (AREA)
- Economics (AREA)
- Theoretical Computer Science (AREA)
- Health & Medical Sciences (AREA)
- User Interface Of Digital Computer (AREA)
- Processing Or Creating Images (AREA)
Abstract
Description
Claims (15)
- In an online game providing method performed in an online game providing system that can be connected to at least one game client through a network and can provide the connected game client with an online game based on a virtual character, the method comprising: (a) providing a makeup input interface that displays the appearance of a predetermined character to the game client, or that displays to the game client the face of a character together with a list of at least one makeup tool; (b) receiving, from the game client, makeup tool information on any one tool selected from the makeup tool list and pointer input information input for the displayed face of the character; (c) determining an application method of a makeup effect using the makeup tool information, determining, according to the determined application method, an application area to which the makeup effect is to be applied using the pointer input information, and calculating an effect parameter to be applied to the determined application area; and (d) applying the effect parameter to the application area to correct the face texture of the character, and providing the face of the character with the corrected texture through the makeup input interface.
- The online game providing method for providing character makeup of claim 1, wherein the application method of the makeup effect is classified based on the application area, and includes at least one of a free drawing method in which a moving line of the pointer is designated as the application area, a fixed area method in which the application area is fixed, and a deformation area method in which the application area varies according to the pointer input information.
- The online game providing method for providing character makeup of claim 2, wherein the pointer input information includes at least one of click information, drag length information, and drag direction information.
- The online game providing method for providing character makeup of claim 2, wherein step (c) comprises: checking whether the makeup tool identified by the makeup tool information belongs to the free drawing method; if it belongs to the free drawing method, checking a blushing area for the makeup tool; and changing the texture of the corresponding area by applying the checked blushing area for each click according to the click information.
- The online game providing method for providing character makeup of claim 2, wherein step (c) comprises: (c-1) checking whether the makeup tool identified by the makeup tool information belongs to the fixed area method; (c-2) if it belongs to the fixed area method, determining the application area using at least one of the makeup tool information and the click information; and (c-3) calculating the effect parameter using pointer input information performed on the determined application area.
- The online game providing method for providing character makeup of claim 5, wherein step (c-2) comprises: checking, using the click information, whether a click has been performed within a predetermined closed curve; and if a click has been performed, determining the inner region of the closed curve as the application area.
- The online game providing method for providing character makeup of claim 5, wherein step (c-3) comprises: identifying click information or drag length information performed on the determined application area; calculating an effect value to be applied by assigning a weight according to the number of clicks or the length of the drag identified from that information; and determining a parameter item to be adjusted according to the makeup tool, and calculating the effect parameter by reflecting the calculated effect value for the determined parameter item.
- The online game providing method for providing character makeup of claim 2, wherein step (c) comprises: (c-1) checking whether the makeup tool identified by the makeup tool information belongs to the deformation area method; (c-2) if it belongs to the deformation area method, determining the application area using the pointer input information; and (c-3) calculating the effect parameter for the determined application area using an effect set according to the makeup tool.
- The online game providing method for providing character makeup of claim 8, wherein step (c-2) comprises: calculating drag lengths along the X axis and the Y axis using the drag length information and drag direction information included in the pointer input information; and calculating an X-axis application area and a Y-axis application area by dividing the calculated X-axis and Y-axis drag lengths by a unit length preset for each makeup tool.
- The online game providing method for providing character makeup of claim 1, wherein step (a) comprises: upon receiving the makeup tool information, enlarging at least a partial area of the character's appearance to which the makeup tool can be applied and providing it to the game client.
- The online game providing method for providing character makeup of claim 1, wherein step (c) comprises: if the makeup tool is any one of a foundation, a dot, a concealer, and a manicure, using a free drawing method that designates an effect according to the moving line of the pointer.
- The online game providing method for providing character makeup of claim 1, wherein step (c) comprises: if the makeup tool is any one of a blusher, a highlighter, a lipstick, and a lip gloss, using a fixed area method in which the application area of the effect is fixed for each makeup tool.
- An online game providing system that can be connected to at least one game client through a network and can provide the connected game client with an online game based on a virtual character, the system comprising: a makeup function controller that provides a makeup input interface displaying a makeup tool list and the appearance of a character, and re-provides the character's appearance corrected by applying an effect parameter to a designated application area; an application area determiner that determines an application area to which a makeup effect is to be applied, using at least one of pointer input information on the displayed appearance of the character and makeup tool information including a makeup tool selected from the tool list; and a parameter calculator that calculates the effect parameter to be applied to the determined application area, using at least one of the makeup tool information and the pointer input information.
- The online game providing system for providing character makeup of claim 13, wherein the makeup function controller determines an application method of the makeup effect using the makeup tool information, and the application area determiner determines the application area to which the makeup effect is to be applied using at least one of the pointer input information and the makeup tool information, based on the determined application method.
- A recording medium on which a program for executing an online game providing method is recorded, the program being runnable in an online game providing system that can be connected to at least one game client through a network and can provide the connected game client with an online game based on a virtual character, the program comprising: (a) a function of providing a makeup input interface that displays the appearance of a predetermined character to the game client, or that displays to the game client the face of a character together with a list of at least one makeup tool; (b) a function of receiving, from the game client, makeup tool information on any one tool selected from the makeup tool list and pointer input information input for the displayed face of the character; (c) a function of determining an application method of a makeup effect using the makeup tool information, determining an application area to which the makeup effect is to be applied using the pointer input information according to the determined application method, and calculating an effect parameter to be applied to the determined application area; and (d) a function of applying the effect parameter to the application area to correct the face texture of the character, and providing the face of the character with the corrected texture through the makeup input interface.
Priority Applications (5)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US14/385,177 US20150038225A1 (en) | 2012-03-13 | 2013-03-11 | Online game providing method for providing character makeup and system therefor |
EP13761812.0A EP2827296A4 (en) | 2012-03-13 | 2013-03-11 | ONLINE GAME PROVIDING METHOD FOR PROVIDING A CHARACTER COMPOSITION AND CORRESPONDING SYSTEM |
CN201380010575.5A CN104137140A (zh) | 2012-03-13 | 2013-03-11 | 用于提供角色化妆的在线游戏提供方法及其系统 |
JP2015500358A JP2015513924A (ja) | 2012-03-13 | 2013-03-11 | キャラクタ化粧を提供するオンラインゲーム提供方法及びそのシステム |
RU2014141047/08A RU2586834C2 (ru) | 2012-03-13 | 2013-03-11 | Способ предоставления онлайн-игр для предоставления гримирования персонажей и система для этого |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
KR10-2012-0025477 | 2012-03-13 | ||
KR1020120025477A KR101398188B1 (ko) | 2012-03-13 | 2012-03-13 | 캐릭터 화장을 제공하는 온라인 게임 제공 방법 및 그 시스템 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2013137609A1 true WO2013137609A1 (ko) | 2013-09-19 |
Family
ID=49161450
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/KR2013/001952 WO2013137609A1 (ko) | 2012-03-13 | 2013-03-11 | 캐릭터 화장을 제공하는 온라인 게임 제공 방법 및 그 시스템 |
Country Status (7)
Country | Link |
---|---|
US (1) | US20150038225A1 (ko) |
EP (1) | EP2827296A4 (ko) |
JP (1) | JP2015513924A (ko) |
KR (1) | KR101398188B1 (ko) |
CN (1) | CN104137140A (ko) |
RU (1) | RU2586834C2 (ko) |
WO (1) | WO2013137609A1 (ko) |
Families Citing this family (8)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
KR101697286B1 (ko) * | 2015-11-09 | 2017-01-18 | 경북대학교 산학협력단 | 사용자 스타일링을 위한 증강현실 제공 장치 및 방법 |
CN106251191A (zh) * | 2016-07-15 | 2016-12-21 | 深圳市金立通信设备有限公司 | 一种终端屏幕的显示控制方法及终端 |
CN107551549A (zh) * | 2017-08-09 | 2018-01-09 | 广东欧珀移动通信有限公司 | 游戏形象调整方法及其装置 |
CN110992248B (zh) * | 2019-11-27 | 2021-03-19 | 腾讯科技(深圳)有限公司 | 唇妆特效的显示方法、装置、设备及存储介质 |
CN111408129B (zh) * | 2020-02-28 | 2020-12-08 | 苏州叠纸网络科技股份有限公司 | 一种基于虚拟角色形象的交互方法、装置及存储介质 |
CN111861632B (zh) * | 2020-06-05 | 2023-06-30 | 北京旷视科技有限公司 | 虚拟试妆方法、装置、电子设备及可读存储介质 |
US20220202168A1 (en) * | 2020-12-30 | 2022-06-30 | L'oreal | Digital makeup palette |
CN112843702B (zh) * | 2021-03-11 | 2024-06-25 | 网易(杭州)网络有限公司 | 一种颜色调整方法、装置、电子设备和存储介质 |
Citations (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH06319613A (ja) * | 1993-04-30 | 1994-11-22 | Onishi Netsugaku:Kk | 顔のメークアップ支援装置 |
Family Cites Families (18)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP3912834B2 (ja) * | 1997-03-06 | 2007-05-09 | 有限会社開発顧問室 | 顔画像の修正方法、化粧シミュレーション方法、化粧方法、化粧サポート装置及びファンデーション転写膜 |
JP2003173451A (ja) * | 2001-12-05 | 2003-06-20 | Sony Communication Network Corp | キャラクタ表示方法およびその方法を利用可能な端末 |
US20050143174A1 (en) * | 2003-08-19 | 2005-06-30 | Goldman Daniel P. | Systems and methods for data mining via an on-line, interactive game |
JP2005092588A (ja) * | 2003-09-18 | 2005-04-07 | Hitachi Software Eng Co Ltd | 合成画像プリント装置及び画像編集方法 |
US20070019882A1 (en) * | 2004-01-30 | 2007-01-25 | Shoji Tanaka | Makeup simulation program, makeup simulation device, and makeup simulation method |
US7775885B2 (en) * | 2005-10-14 | 2010-08-17 | Leviathan Entertainment, Llc | Event-driven alteration of avatars |
CN101379460B (zh) * | 2006-01-31 | 2011-09-28 | 吉田健治 | 图像处理方法 |
WO2007128117A1 (en) * | 2006-05-05 | 2007-11-15 | Parham Aarabi | Method. system and computer program product for automatic and semi-automatic modification of digital images of faces |
KR20090002176A (ko) * | 2007-06-20 | 2009-01-09 | 엔에이치엔(주) | 네트워크 상에서 게임 아바타의 랭킹을 제공하는 시스템 및그 방법 |
US20090032100A1 (en) * | 2007-08-02 | 2009-02-05 | Eugene Oak | Position adjustable awning equipped with solar cell plates thereon |
WO2009021124A2 (en) * | 2007-08-07 | 2009-02-12 | Dna Digital Media Group | System and method for a motion sensing amusement device |
JP2009064423A (ja) * | 2007-08-10 | 2009-03-26 | Shiseido Co Ltd | メイクアップシミュレーションシステム、メイクアップシミュレーション装置、メイクアップシミュレーション方法およびメイクアップシミュレーションプログラム |
US20090312100A1 (en) * | 2008-06-12 | 2009-12-17 | Harris Scott C | Face Simulation in Networking |
JP5442966B2 (ja) * | 2008-07-10 | 2014-03-19 | 株式会社 資生堂 | ゲーム装置、ゲーム制御方法、ゲーム制御プログラム、及び、該プログラムを記録した記録媒体 |
JP5029852B2 (ja) * | 2010-01-07 | 2012-09-19 | 花王株式会社 | メイクアップシミュレーション方法 |
RU2438647C1 (ru) * | 2010-04-30 | 2012-01-10 | Елена Юрьевна Кутузова | Косметическое средство для формирования художественного образа (варианты) |
RU2475290C1 (ru) * | 2010-11-17 | 2013-02-20 | Общество С Ограниченной Ответственностью "Айтэм Мультимедиа" | Устройство для игр |
JP2012181688A (ja) * | 2011-03-01 | 2012-09-20 | Sony Corp | 情報処理装置、情報処理方法、情報処理システムおよびプログラム |
-
2012
- 2012-03-13 KR KR1020120025477A patent/KR101398188B1/ko active IP Right Grant
-
2013
- 2013-03-11 US US14/385,177 patent/US20150038225A1/en not_active Abandoned
- 2013-03-11 JP JP2015500358A patent/JP2015513924A/ja active Pending
- 2013-03-11 RU RU2014141047/08A patent/RU2586834C2/ru active
- 2013-03-11 EP EP13761812.0A patent/EP2827296A4/en not_active Withdrawn
- 2013-03-11 WO PCT/KR2013/001952 patent/WO2013137609A1/ko active Application Filing
- 2013-03-11 CN CN201380010575.5A patent/CN104137140A/zh active Pending
Patent Citations (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH06319613A (ja) * | 1993-04-30 | 1994-11-22 | Onishi Netsugaku:Kk | 顔のメークアップ支援装置 |
Non-Patent Citations (3)
Title |
---|
NAVER BLOG, MAKEUP METHOD-SITE FOR MAKEUP SIMULATION, TAAZ, 9 February 2009 (2009-02-09), XP055167023, Retrieved from the Internet <URL:http://blog.naver.com/happy2bean/30042556976> * |
NAVER BLOG: "Makeup! Hair style! Try computer simulation at TAAZ!", 7 January 2011 (2011-01-07), XP055167026, Retrieved from the Internet <URL:http://blog.naver.com/kachi6292/90103904651> * |
PANDORA TV: "Makeup Debut", 27 August 2009 (2009-08-27), Retrieved from the Internet <URL:http://channel.pandora.tv/channel/video.ptv?ch_userid=fkaqhrddwn&prgid=35971493> * |
Also Published As
Publication number | Publication date |
---|---|
KR20130104190A (ko) | 2013-09-25 |
KR101398188B1 (ko) | 2014-05-30 |
RU2586834C2 (ru) | 2016-06-10 |
JP2015513924A (ja) | 2015-05-18 |
CN104137140A (zh) | 2014-11-05 |
US20150038225A1 (en) | 2015-02-05 |
RU2014141047A (ru) | 2016-05-10 |
EP2827296A1 (en) | 2015-01-21 |
EP2827296A4 (en) | 2015-11-18 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
WO2013137609A1 (ko) | 캐릭터 화장을 제공하는 온라인 게임 제공 방법 및 그 시스템 | |
JP7482242B2 (ja) | 表情トランスファーモデルの訓練方法、表情トランスファー方法及び装置並びにコンピュータ装置及びプログラム | |
WO2020180134A1 (ko) | 이미지 수정 시스템 및 이의 이미지 수정 방법 | |
US9224248B2 (en) | Method of virtual makeup achieved by facial tracking | |
JP6778877B2 (ja) | メイクパーツ作成装置、メイクパーツ利用装置、メイクパーツ作成方法、メイクパーツ利用方法、メイクパーツ作成プログラム、およびメイクパーツ利用プログラム | |
CN104331164A (zh) | 一种基于手势识别的相似度阈值分析的手势运动平滑处理方法 | |
WO2021242005A1 (ko) | 전자 장치 및 사용자 아바타 기반의 이모지 스티커를 생성하는 방법 | |
WO2023138345A1 (zh) | 虚拟形象生成方法和系统 | |
WO2013141510A1 (ko) | 사용자에 의한 아이템 외형 변경을 지원하는 온라인 게임 제공 방법 및 그 시스템 | |
US10628984B2 (en) | Facial model editing method and apparatus | |
CN115346024A (zh) | 虚拟形象生成方法 | |
WO2022215823A1 (ko) | 영상 생성 방법 및 장치 | |
CN112083863A (zh) | 图像处理方法、装置、电子设备及可读存储介质 | |
CN104484034A (zh) | 一种基于手势识别的手势运动基元过渡帧定位方法 | |
Shin et al. | Incorporating real-world object into virtual reality: using mobile device input with augmented virtuality | |
CN111383343A (zh) | 一种面向家装设计的基于生成对抗网络技术的增强现实图像渲染上色方法 | |
WO2022097823A1 (ko) | 화장자의 영상 이미지 추출 및 얼굴 투영 시스템 | |
CN112446821A (zh) | 一种图像处理方法、装置及电子设备 | |
CN112037338A (zh) | Ar形象的创建方法、终端设备以及可读存储介质 | |
Yang et al. | Bimanual natural user interaction for 3D modelling application using stereo computer vision | |
CN115999156B (zh) | 角色控制方法、装置、设备及存储介质 | |
WO2023068571A1 (ko) | 전자 장치 및 전자 장치의 제어 방법 | |
KR102293108B1 (ko) | 화장자의 영상 이미지 추출 및 얼굴 투영 시스템 | |
WO2019083234A1 (ko) | 공간 콘텐츠 체험 정보의 시각화 및 보상 시스템 | |
WO2023172063A1 (ko) | 오픈 아바타 제공 서버 및 오픈 아바타 제공 프로그램 |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 13761812 Country of ref document: EP Kind code of ref document: A1 |
|
NENP | Non-entry into the national phase |
Ref country code: DE |
|
WWE | Wipo information: entry into national phase |
Ref document number: 14385177 Country of ref document: US |
|
ENP | Entry into the national phase |
Ref document number: 2015500358 Country of ref document: JP Kind code of ref document: A |
|
REEP | Request for entry into the european phase |
Ref document number: 2013761812 Country of ref document: EP |
|
WWE | Wipo information: entry into national phase |
Ref document number: 2013761812 Country of ref document: EP |
|
ENP | Entry into the national phase |
Ref document number: 2014141047 Country of ref document: RU Kind code of ref document: A |