WO2023083067A1 - Fluff rendering method, apparatus, device and medium - Google Patents

Fluff rendering method, apparatus, device and medium

Info

Publication number
WO2023083067A1
Authority
WO
WIPO (PCT)
Prior art keywords
fluff
rendering
parameter
Application number
PCT/CN2022/129194
Other languages
English (en)
French (fr)
Inventor
王泽�
尹豆
Original Assignee
北京字节跳动网络技术有限公司
Application filed by 北京字节跳动网络技术有限公司
Publication of WO2023083067A1


Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T15/00 - 3D [Three Dimensional] image rendering

Definitions

  • the present application relates to computer processing technology, and in particular to a fluff rendering method, apparatus, device, and medium.
  • the embodiments of the present application provide a fluff rendering method, device, equipment, and medium, so as to provide various fluff shapes and improve the fluff rendering efficiency.
  • a fluff rendering method includes:
  • the fluff rendering parameters include one or more modeling parameters for fluff, and different modeling parameters among the one or more modeling parameters are used to render fluff of different shapes;
  • the object to be rendered is an object that needs to be rendered with fluff;
  • a fluff rendering apparatus in the second aspect of the embodiments of the present application includes:
  • the first acquiring unit is configured to acquire the fluff rendering parameters of the object to be rendered; the fluff rendering parameters include one or more modeling parameters for fluff, and different modeling parameters among the one or more modeling parameters are used to render fluff of different shapes; the object to be rendered is an object that needs to be rendered with fluff;
  • the second acquiring unit is configured to render the object to be rendered according to the fluff rendering parameter, and obtain the fluff shape of the object to be rendered.
  • an electronic device in the third aspect of the embodiment of the present application, includes: a processor and a memory;
  • the memory is configured to store instructions or computer programs;
  • the processor is configured to execute the instructions or computer programs in the memory, so that the electronic device executes the fluff rendering method.
  • a computer-readable storage medium including instructions, which, when run on a computer, cause the computer to execute the above-mentioned fluff rendering method.
  • the fluff rendering parameters of the object to be rendered are acquired, and the fluff rendering parameters include one or more modeling parameters for the fluff.
  • each of the one or more modeling parameters corresponds to a fluff shaping effect.
  • Fig. 1 is a diagram of a multi-pass fluff rendering model;
  • FIG. 2 is a flow chart of a method for rendering fluff provided by an embodiment of the present application
  • Figure 3a is a schematic diagram of a fluffy spiral provided in the embodiment of the present application.
  • Figure 3b is an effect diagram of a spiral shape of fluff provided by the embodiment of the present application.
  • Fig. 3c is a bending effect diagram of fluff affected by noise waves provided by the embodiment of the present application.
  • Figure 3d is a UV offset effect diagram of fluff provided by the embodiment of the present application.
  • Fig. 4 is a schematic diagram of a fluff rendering device provided by an embodiment of the present application.
  • FIG. 5 is a schematic diagram of an electronic device provided by an embodiment of the present application.
  • an embodiment of the present application provides a fluff rendering method.
  • the fluff rendering parameters of the object to be rendered are obtained; the fluff rendering parameters include one or more modeling parameters for fluff, and each of the one or more modeling parameters renders a fluff shape.
  • the fluff rendering parameters are used to render the object to be rendered, so that fluff of different shapes can be rendered when the fluff rendering is performed on the object to be rendered, so as to meet the diverse needs of users and improve user experience.
  • the fluff simulation in the embodiment of the present application can be realized based on a multi-pass (Pass) fluff rendering model, that is, the rendering model is extruded in a specific direction multiple times, and the transparency is reduced layer by layer to achieve the effect of simulating fluff.
  • each Pass represents a layer, and when rendering each layer, the vertex position is moved out along the normal in the vertex shader.
  • generally, the more passes used, the better the rendering effect.
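  • The layer-by-layer extrusion described above can be sketched on the CPU as follows. This is only an illustrative approximation in Python; names such as `shell_layers` and `fur_length` are hypothetical, and a real implementation runs per-vertex in a vertex shader, once per Pass:

```python
def shell_layers(vertex, normal, fur_length=0.1, num_passes=8):
    """Return (offset_vertex, alpha) for each pass/layer of a shell-fur model.

    Each pass extrudes the vertex outward along the normal, and the
    transparency (alpha) decreases layer by layer to fake fluff.
    """
    layers = []
    for p in range(1, num_passes + 1):
        furlevel = p / num_passes  # coefficient of the current pass, in (0, 1]
        offset = [v + n * fur_length * furlevel for v, n in zip(vertex, normal)]
        alpha = 1.0 - furlevel     # outer layers are more transparent
        layers.append((offset, alpha))
    return layers
```

With 8 passes the outermost layer sits at the full fur length and is fully transparent, which is why using more passes smooths the gradient between layers.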
  • Referring to FIG. 2, the figure is a flow chart of a fluff rendering method provided by an embodiment of the present application.
  • the method can be executed by a fluff rendering device, which can be implemented by software and/or hardware and can generally be integrated into electronic equipment.
  • the method includes:
  • S201: Acquire fluff rendering parameters of an object to be rendered.
  • the fluff rendering parameters may include one or more modeling parameters, and different modeling parameters among the one or more modeling parameters are used to render fluff of different shapes.
  • the object to be rendered is an object that needs to be rendered with fluff.
  • the shape parameter can be the fluff spiral parameter that determines the fluff spiral shape.
  • the fluff spiral shape refers to the shape of the fluff spiraling around the normal (that is, the initial growth direction).
  • the effect of the fluff spiraling around the normal can be rendered by means of the fluff spiral parameter.
  • the modeling parameter can also be a parameter for determining the fluff tail shape, which refers to the shape of the tail of the fluff, so that different tail shapes reflect different states of the object to be rendered.
  • the modeling parameter may also be a parameter for determining the fluff bending shape, which refers to the bending of the fluff relative to its initial growth direction, so that the bending shape simulates changes in the fluff caused by external factors; for example, it can simulate fluff being bent by the wind.
  • the fluff bending shape can be determined by adding fluff noise, UV offset, flow direction parameters, and vertex color offset parameters.
  • the fluff rendering parameters may include one or more of fluff spiral parameters, fluff tail shape parameters, and fluff bending parameters.
  • the fluff bending parameters may include one or more of fluff noise parameters, fluff UV offset parameters, fluff flow direction parameters, or fluff vertex offset parameters.
  • the fluff noise parameter adds random bending along the initial growth direction of the fluff to improve the realism of the rendered fluff; in the fluff UV offset parameter, U indicates the offset of the fluff in the horizontal direction, and V indicates the offset in the vertical direction; the flow direction parameter controls the growth direction of the fluff and is used to render the aggregation effect of the fluff.
  • S202: Render the object to be rendered according to the fluff rendering parameters, and obtain the fluff shape of the object to be rendered.
  • the object to be rendered is rendered according to the fluff rendering parameter, so as to render various fluff shapes on the object to be rendered.
  • the number of spiral turns of the fluff is determined according to the fluff spiral parameter, and the object to be rendered is rendered according to the number of spiral turns to obtain the fluff spiral shape of the object to be rendered. That is, when the fluff rendering parameters include the fluff spiral parameter, the number of turns of the fluff around the normal is determined according to the fluff spiral parameter, and the spiral shape of the fluff is determined according to that number of turns.
  • Pass-x represents the xth layer in the multi-pass.
  • when determining the number of helical turns of the fluff, the helix angle of the fluff is first determined according to the fluff spiral parameter. Determining the helix angle of the fluff according to the fluff spiral parameter includes:
  • the normal vector of the rendering model is the vector of the initial growth direction of the fluff
  • the coefficient of the current pass in the rendering model refers to the coefficient of the pass currently being rendered in the multi-pass rendering model.
  • side represents the first vector;
  • varWorldNormal represents the normal vector, which is a three-dimensional vector;
  • vec3(0.0, 1.0, 0.0) represents the global up direction vector;
  • the cross function computes the vector (cross) product, and its output is a three-dimensional vector;
  • the normalize function normalizes its input.
  • up represents the second vector.
  • angle indicates the first angle
  • FURLEVEL indicates the coefficient corresponding to the current Pass layer
  • PI indicates the circular ratio
  • SpiralPow indicates the spiral parameter, which is a factor used to control the spiral frequency.
  • the value range is [0,1].
  • the dir on the right side of the equal sign represents the vector of the direction of the current fluff, and the dir on the left represents the vector of the direction of the fluff after adding the spiral, that is, the helix angle of the fluff.
  • the dir on the right side of the equal sign indicates the initial growth direction of the fluff, that is, the normal direction; after multiple iterations, the dir on the right side of the equal sign in step 4 indicates the current actual growth direction of the fluff.
  • the spiral shape of each Pass fluff can be realized by the above method.
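  • As a rough illustration, the side/up/angle construction above can be mirrored in plain Python. The patent does not spell out exactly how the vectors are combined into the final direction, so the combination below is an assumption, and `spiral_dir` and its argument names are hypothetical:

```python
import math

def cross(a, b):
    # vector (cross) product of two 3D vectors
    return [a[1]*b[2] - a[2]*b[1], a[2]*b[0] - a[0]*b[2], a[0]*b[1] - a[1]*b[0]]

def normalize(v):
    m = math.sqrt(sum(c * c for c in v))
    return [c / m for c in v]

def spiral_dir(normal, direction, furlevel, spiral_pow):
    side = normalize(cross(normal, [0.0, 1.0, 0.0]))  # first vector
    up = normalize(cross(side, normal))               # second vector
    angle = furlevel * math.pi * spiral_pow           # first angle (assumed form)
    # offset the current growth direction so the fluff winds around the normal
    offset = [s * math.cos(angle) + u * math.sin(angle) for s, u in zip(side, up)]
    return normalize([d + o for d, o in zip(direction, offset)])
```

Calling this once per pass, feeding each result back in as `direction`, yields the iterative behavior described for step 4.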
  • the fluff tail shape is determined according to the fluff tail shape parameter, and the object to be rendered is rendered according to the fluff tail shape to obtain the fluff tail shape of the object to be rendered.
  • when the fluff tail shape parameter is greater than zero, the tail of the fluff is spike-shaped; when the fluff tail shape parameter is less than zero, the tail of the fluff is shaped like a feather duster.
  • Determining the fluff tail shape according to the fluff tail shape parameter may include: taking the coefficient of the current pass in the rendering model as the base and the fluff tail shape parameter as the exponent to determine a first value; and determining the fluff tail shape according to the first value and the gray value of the fluff tail mask map.
  • the mask texture of the fluffy tail can be obtained by sampling the image of the fluffy tail.
  • the tail shape of the fluff can be determined by:
  • SP.transluency = pow(1.1 - FURLEVEL, 3.0 + FurTailFactor) - 1.0 + t_mask.r
  • SP.transluency represents the transparency of the pixel, which reflects the fluff tail shape; the pow(x, y) function computes x raised to the power y; FurTailFactor represents the fluff tail shape parameter; FURLEVEL represents the coefficient of the current Pass layer; t_mask.r indicates the gray value sampled from the tail mask map at the pixel.
  • after SP.transluency is calculated, it is also necessary to judge whether it is less than a preset threshold. If so, the transparency is too low and the value is discarded; for example, a value less than 0.00001 is discarded.
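  • A direct transcription of this formula and the discard test into Python (illustrative only; `tail_transparency` is a hypothetical name, and a fragment shader would discard the fragment rather than return None):

```python
def tail_transparency(furlevel, fur_tail_factor, mask_gray, threshold=0.00001):
    """Compute SP.transluency = pow(1.1 - FURLEVEL, 3.0 + FurTailFactor) - 1.0 + t_mask.r.

    Returns None when the value falls below the discard threshold.
    """
    t = pow(1.1 - furlevel, 3.0 + fur_tail_factor) - 1.0 + mask_gray
    return None if t < threshold else t
```

Since 1.1 - FURLEVEL is small for outer layers, a larger exponent (a positive FurTailFactor) discards more outer fragments, which matches the spike versus feather-duster behavior described above.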
  • Determining the fluff tail shape according to the fluff tail shape parameter may also include: taking the gray value of the fluff tail mask map as the base and the fluff tail shape parameter as the exponent to determine a second value; taking the coefficient of the current pass in the rendering model as the base and the fluff tail shape parameter as the exponent to determine a third value; and determining the fluff tail shape according to the difference between the second value and the third value.
  • the fluff bending direction is determined according to the fluff bending parameters, and the object to be rendered is rendered according to the fluff bending direction to obtain the fluff bending shape of the object to be rendered. That is, the fluff renderer can determine the bending direction of the fluff according to the fluff bending parameter and then render the fluff of the object to be rendered according to that bending direction, so that the rendered fluff has a curved shape.
  • the bending direction of the fluff refers to the direction that offsets the initial growth direction of the fluff, and the initial growth direction is the normal direction of the rendering model. That is, adding an offset to the initial growth direction of the fluff makes the fluff appear curved, making the rendering effect more realistic.
  • the bending effect of fluff can be achieved by adding fluff noise, UV offset, flow direction parameters, and vertex offset.
  • the bending direction can be determined as follows: perform a vector product calculation on the normal vector of the rendering model and the preset global direction vector to obtain a third vector; perform a vector product calculation on the third vector and the normal vector to obtain a fourth vector; determine a second angle according to the coefficient of the current pass in the rendering model and the vertex coordinates of the current pass; and determine the bending direction of the fluff according to the third vector, the fourth vector, the second angle, and the fluff noise parameter.
  • the vertex coordinates of the current pass represent the vertex coordinate values in the local coordinate system corresponding to the current pass. Specifically, this can be achieved as follows:
  • side represents the first vector
  • up represents the second vector
  • VB.local_position represents the vertex coordinate value in the local coordinate system
  • FURLEVEL represents the coefficient of the current pass
  • noise() represents the random function
  • NoisePow represents the fluff noise parameter. The larger the parameter, the more random the direction the fluff bends.
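  • The vector construction and the role of NoisePow can be sketched as follows. This is an assumption-laden Python illustration: the exact formula for the second angle and the way the noise is combined are not given verbatim in the text, so `noise_bend`, the angle formula, and the seeded random stand-in for noise() are all hypothetical:

```python
import math, random

def cross(a, b):
    return [a[1]*b[2] - a[2]*b[1], a[2]*b[0] - a[0]*b[2], a[0]*b[1] - a[1]*b[0]]

def normalize(v):
    m = math.sqrt(sum(c * c for c in v))
    return [c / m for c in v]

def noise_bend(normal, local_position, furlevel, noise_pow, seed=0):
    side = normalize(cross(normal, [0.0, 1.0, 0.0]))  # third vector
    up = normalize(cross(side, normal))               # fourth vector
    # second angle from the current pass coefficient and the vertex coordinates
    angle = furlevel * sum(local_position) * 2.0 * math.pi
    noise = random.Random(seed).random() * noise_pow  # stand-in for noise()
    offset = [(s * math.cos(angle) + u * math.sin(angle)) * noise
              for s, u in zip(side, up)]
    return normalize([n + o for n, o in zip(normal, offset)])
```

With noise_pow set to zero the growth direction stays on the normal; larger values push the direction randomly off it, matching the description of NoisePow.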
  • the fluff bending direction can be determined in the following manner: determine the fluff bending direction based on the UV offset parameter, the fluff mask map, the texture coordinates, and the scaling factor of the texture coordinates.
  • the fluff mask map can be obtained by sampling the fluff image. Specifically, it can be calculated by the following formula:
  • t_mask = Texture2DScale(FurStyleMask, FB.texcoord0 + uvoffset / FurMaskScale, FurMaskScale)
  • t_mask represents the gray value of the fluff pixel's mask map, which reflects the fluff bending direction
  • the Texture2DScale() function is used to sample the texture
  • FurStyleMask represents the fluff mask map, which determines the initial shape of the fluff
  • FB.texcoord0 represents the texture coordinates
  • uvoffset represents the UV offset parameter
  • FurMaskScale represents the scaling factor of the texture coordinates.
  • the flow direction parameter can be a flow direction map.
  • the R channel indicates the horizontal offset, with right as the positive direction;
  • the G channel indicates the vertical offset, with up as the positive direction.
  • the UV offset value can be obtained by sampling the flow direction map, and the fluff bending direction is opposite to the UV offset.
  • the UV offset is determined according to the flow direction parameter;
  • the bending direction of the fluff is determined according to the UV offset and the coefficient of the current pass in the rendering model.
  • the bending direction of the fluff can be determined in the following ways:
  • t_flowmap = Texture2DScale(FlowMapture, FB.texcoord0, FlowMapScale);
  • t_uvoffset = t_flowmap.xy * 2.0 - vec2(1.0, 1.0);
  • t_flowmap represents the flow direction map of the fluff, that is, the flow direction parameter; the Texture2DScale() function is used to sample the texture; FlowMapture represents the flow direction texture; FB.texcoord0 represents the texture coordinates; FlowMapScale represents the scaling factor of the texture coordinates; FlowMapPow represents the coefficient for adjusting the strength of the offset.
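  • The remapping of a flow-map sample to a signed UV offset, scaled by the current pass, might look like this in Python. This is illustrative: `flowmap_uv_offset` and the exact scaling by FURLEVEL and FlowMapPow are assumptions based on the description above:

```python
def flowmap_uv_offset(flow_rg, furlevel, flow_map_pow=1.0):
    """Remap a flow-map sample (R, G in [0, 1]) to a signed UV offset in [-1, 1],
    then scale by the current pass coefficient and the strength coefficient.

    R is the horizontal offset (right positive); G is the vertical offset
    (up positive). The fluff bends opposite to the resulting offset.
    """
    u = flow_rg[0] * 2.0 - 1.0  # mirrors t_flowmap.xy * 2.0 - vec2(1.0, 1.0)
    v = flow_rg[1] * 2.0 - 1.0
    return (u * furlevel * flow_map_pow, v * furlevel * flow_map_pow)
```

Mid-gray (0.5, 0.5) therefore encodes "no offset", and outer passes (larger FURLEVEL) receive a proportionally larger offset, which bends the fluff progressively along its length.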
  • the fluff bending direction can be determined in the following way: determine the fluff bending direction based on the vertex offset parameter, the normal direction of the rendering model, and the interpolation coefficient. Among them, the interpolation coefficient is used to reflect the degree of influence of the vertex offset parameter on the normal direction.
  • varWorldNormal represents the initial growth direction of the fluff
  • VB.model_color.xyz represents the increased offset direction on the vertex
  • u_ColorDirFactor is the interpolation coefficient between these two directions, indicating how strongly the offset direction added on the vertex influences the initial growth direction.
  • mix() represents a mixing function, which is used to mix the initial growth direction and the offset direction on the vertex according to the interpolation coefficient
  • dir represents the final bending direction of the mixed fluff.
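  • The mix() blend described above is an ordinary linear interpolation; a minimal Python equivalent follows (`vertex_offset_dir` is a hypothetical wrapper name):

```python
def mix(a, b, t):
    """GLSL-style mix: component-wise linear interpolation, a*(1-t) + b*t."""
    return [x * (1.0 - t) + y * t for x, y in zip(a, b)]

def vertex_offset_dir(world_normal, model_color_xyz, color_dir_factor):
    """Blend the initial growth direction (varWorldNormal) with the per-vertex
    offset direction (VB.model_color.xyz) using the interpolation coefficient
    (u_ColorDirFactor) to get the final bending direction dir."""
    return mix(world_normal, model_color_xyz, color_dir_factor)
```

A coefficient of 0 leaves the fluff growing along the normal; 1 replaces the growth direction entirely with the vertex-color direction.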
  • the fluff rendering parameters may also include basic parameters, for example, length, thickness, and density parameters of the fluff, so that the fluff is first rendered according to the basic parameters and the shape is then added on top of the rendered fluff.
  • the device 400 may include a first acquisition unit 401 and a second acquisition unit 402 .
  • the first acquisition unit 401 is configured to acquire the fluff rendering parameters of the object to be rendered; the fluff rendering parameters include one or more modeling parameters for fluff, and different modeling parameters among the one or more modeling parameters are used to render fluff of different shapes; the object to be rendered is an object that needs to be rendered with fluff.
  • for the first acquisition unit 401, reference may be made to the relevant description of S201 above.
  • the second acquiring unit 402 is configured to render the object to be rendered according to the fluff rendering parameter, and obtain the fluff shape of the object to be rendered.
  • for the second acquiring unit 402, reference may be made to the relevant description of S202 above.
  • the fluff rendering parameters include one or more modeling parameters among fluff spiral parameters, fluff tail shape parameters, and fluff bending parameters.
  • the second acquiring unit 402 is specifically configured to, in response to the fluff rendering parameters including a fluff spiral parameter, determine the number of spiral turns of the fluff according to the fluff spiral parameter, and render the object to be rendered according to the number of spiral turns to obtain the fluff shape of the object to be rendered.
  • the second acquiring unit 402 is specifically configured to, in response to the fluff rendering parameters including a fluff spiral parameter, determine the helix angle of the fluff according to the fluff spiral parameter, and determine the number of helical turns of the fluff according to the helix angle.
  • the second acquiring unit 402 is specifically configured to perform a vector product calculation on the normal vector of the rendering model and the preset global direction vector to obtain a first vector; perform a vector product calculation on the first vector and the normal vector to obtain a second vector; determine a first angle according to the fluff spiral parameter and the coefficient of the current pass in the rendering model; and determine the helix angle of the fluff according to the first vector, the second vector, and the first angle.
  • the second acquiring unit 402 is specifically configured to, in response to the fluff rendering parameters including a fluff tail shape parameter, determine the fluff tail shape according to the fluff tail shape parameter, and render the object to be rendered according to the fluff tail shape to obtain the fluff shape of the object to be rendered.
  • the second acquiring unit 402 is specifically configured to, when the fluff tail shape parameter is less than zero, take the coefficient of the current pass in the rendering model as the base and the fluff tail shape parameter as the exponent to determine a first value, and determine the fluff tail shape according to the first value and the gray value of the fluff tail mask map.
  • the second acquiring unit 402 is specifically configured to take the gray value of the fluff tail mask map as the base and the fluff tail shape parameter as the exponent to determine a second value; take the coefficient of the current pass in the rendering model as the base and the fluff tail shape parameter as the exponent to determine a third value; and determine the fluff tail shape according to the difference between the second value and the third value.
  • when the fluff tail shape parameter is greater than zero, the tail of the fluff presents a spike shape; when the fluff tail shape parameter is less than zero, the tail of the fluff presents a feather duster shape.
  • the second acquiring unit 402 is specifically configured to, in response to the fluff rendering parameters including a fluff bending parameter, determine the bending direction of the fluff according to the fluff bending parameter, and render the object to be rendered according to the bending direction to obtain the fluff shape of the object to be rendered.
  • the fluff bending parameters include one or more of fluff noise parameters, fluff UV offset parameters, fluff flow direction parameters, or fluff vertex offset parameters.
  • the second acquiring unit 402 is specifically configured to, in response to the fluff bending parameters including the fluff noise parameter, perform a vector product calculation on the normal vector of the rendering model and the preset global direction vector to obtain a third vector; perform a vector product calculation on the third vector and the normal vector to obtain a fourth vector; determine a second angle according to the coefficient of the current pass in the rendering model and the vertex coordinates of the current pass; and determine the bending direction of the fluff according to the third vector, the fourth vector, the second angle, and the fluff noise parameter.
  • the second acquiring unit 402 is specifically configured to, in response to the fluff bending parameters including the UV offset parameter, determine the bending direction of the fluff based on the UV offset parameter, the fluff mask map, the texture coordinates, and the scaling factor of the texture coordinates.
  • the second acquiring unit 402 is specifically configured to, in response to the fluff bending parameters including the flow direction parameter, determine the UV offset according to the flow direction parameter, and determine the bending direction of the fluff according to the UV offset and the coefficient of the current pass in the rendering model.
  • the second acquiring unit 402 is specifically configured to, in response to the fluff bending parameters including the vertex offset parameter, determine the bending direction of the fluff based on the vertex offset parameter, the normal direction of the rendering model, and the interpolation coefficient, where the interpolation coefficient reflects the degree of influence of the vertex offset parameter on the normal direction.
  • Referring to FIG. 5, it shows a schematic structural diagram of an electronic device 500 suitable for implementing the embodiments of the present application.
  • the terminal equipment in the embodiment of the present application may include but not limited to mobile phones, notebook computers, digital broadcast receivers, PDA (Personal Digital Assistant, personal digital assistant), PAD (portable android device, tablet computer), PMP (Portable Media Player, portable multimedia player), mobile terminals such as vehicle-mounted terminals (such as vehicle-mounted navigation terminals), and fixed terminals such as digital TVs (television, television sets), desktop computers, and the like.
  • the electronic device shown in FIG. 5 is only an example, and should not limit the functions and scope of use of this embodiment of the present application.
  • an electronic device 500 may include a processing device (such as a central processing unit, a graphics processing unit, etc.) 501, which may execute various appropriate actions and processes according to a program stored in a read-only memory (ROM) 502 or a program loaded from a storage device 508 into a random access memory (RAM) 503.
  • the RAM 503 also stores various programs and data necessary for the operation of the electronic device 500.
  • the processing device 501, ROM 502, and RAM 503 are connected to each other through a bus 504.
  • An input/output (I/O) interface 505 is also connected to the bus 504 .
  • the following devices can be connected to the I/O interface 505: an input device 506 including, for example, a touch screen, a touchpad, a keyboard, a mouse, a camera, a microphone, an accelerometer, a gyroscope, etc.; an output device 507 including, for example, a liquid crystal display (LCD), a speaker, a vibrator, etc.; a storage device 508 including, for example, a magnetic tape, a hard disk, etc.; and a communication device 509.
  • the communication means 509 may allow the electronic device 500 to perform wireless or wired communication with other devices to exchange data. While FIG. 5 shows electronic device 500 having various means, it is to be understood that implementing or having all of the means shown is not a requirement. More or fewer means may alternatively be implemented or provided.
  • the processes described above with reference to the flowcharts can be implemented as computer software programs.
  • the embodiments of the present application include a computer program product, which includes a computer program carried on a non-transitory computer readable medium, where the computer program includes program code for executing the method shown in the flowchart.
  • the computer program may be downloaded and installed from a network via communication means 509 , or from storage means 508 , or from ROM 502 .
  • when the computer program is executed by the processing device 501, the above-mentioned functions defined in the method of the embodiment of the present application are executed.
  • An embodiment of the present application provides a computer-readable medium on which a computer program is stored, wherein when the program is executed by a processor, the fluff rendering method as described in any one of the above-mentioned embodiments is implemented.
  • the computer-readable medium mentioned above in this application may be a computer-readable signal medium or a computer-readable storage medium, or any combination of the above two.
  • a computer readable storage medium may be, for example, but not limited to, an electrical, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any combination thereof. More specific examples of computer-readable storage media may include, but are not limited to: an electrical connection with one or more wires, a portable computer diskette, a hard disk, random access memory (RAM), read-only memory (ROM), erasable programmable read-only memory (EPROM or flash memory), optical fiber, portable compact disk read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the above.
  • a computer-readable storage medium may be any tangible medium that contains or stores a program that can be used by or in conjunction with an instruction execution system, apparatus, or device.
  • a computer-readable signal medium may include a data signal propagated in baseband or as part of a carrier wave, carrying computer-readable program code therein. Such propagated data signals may take many forms, including but not limited to electromagnetic signals, optical signals, or any suitable combination of the foregoing.
  • a computer-readable signal medium may also be any computer-readable medium other than a computer-readable storage medium, which can send, propagate, or transmit a program for use by or in conjunction with an instruction execution system, apparatus, or device.
  • Program code embodied on a computer readable medium may be transmitted by any appropriate medium, including but not limited to: wires, optical cables, RF (radio frequency), etc., or any suitable combination of the above.
  • the client and the server can communicate using any currently known or future-developed network protocol, such as HTTP (HyperText Transfer Protocol), and can be interconnected with digital data communication in any form or medium (for example, a communication network).
  • examples of communication networks include local area networks ("LANs"), wide area networks ("WANs"), internetworks (e.g., the Internet), and peer-to-peer networks (e.g., ad hoc peer-to-peer networks), as well as any currently known or future-developed network.
  • the above-mentioned computer-readable medium may be included in the above-mentioned electronic device, or may exist independently without being incorporated into the electronic device.
  • the above-mentioned computer-readable medium carries one or more programs, and when the above-mentioned one or more programs are executed by the electronic device, the electronic device is made to execute the above-mentioned fluff rendering method.
  • Computer program code for carrying out the operations of this application may be written in one or more programming languages, or combinations thereof, including object-oriented programming languages such as Java, Smalltalk, and C++, as well as conventional procedural programming languages such as the "C" language or similar programming languages.
  • the program code may execute entirely on the user's computer, partly on the user's computer, as a stand-alone software package, partly on the user's computer and partly on a remote computer or entirely on the remote computer or server.
  • the remote computer can be connected to the user's computer through any kind of network, including a local area network (LAN) or a wide area network (WAN), or it can be connected to an external computer (for example, through the Internet using an Internet service provider).
  • each block in a flowchart or block diagram may represent a module, program segment, or portion of code that contains one or more executable instructions for implementing specified logical functions.
  • the functions noted in the block may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or they may sometimes be executed in the reverse order, depending upon the functionality involved.
  • each block of the block diagrams and/or flowchart illustrations, and combinations of blocks in the block diagrams and/or flowchart illustrations, can be implemented by a dedicated hardware-based system that performs the specified functions or operations, or by a combination of dedicated hardware and computer instructions.
  • the units involved in the embodiments described in the present application may be implemented by means of software or by means of hardware.
  • the name of the unit/module does not constitute a limitation on the unit itself under certain circumstances, for example, the voice data collection module can also be described as a "data collection module”.
  • a machine-readable medium may be a tangible medium that may contain or store a program for use by or in conjunction with an instruction execution system, apparatus, or device.
  • a machine-readable medium may be a machine-readable signal medium or a machine-readable storage medium.
  • a machine-readable medium may include, but is not limited to, electronic, magnetic, optical, electromagnetic, infrared, or semiconductor systems, apparatus, or devices, or any suitable combination of the foregoing.
  • machine-readable storage media would include one or more wire-based electrical connections, portable computer discs, hard drives, random access memory (RAM), read only memory (ROM), erasable programmable read only memory (EPROM or flash memory), optical fiber, compact disk read only memory (CD-ROM), optical storage, magnetic storage, or any suitable combination of the foregoing.
  • each embodiment in this specification is described in a progressive manner; each embodiment focuses on its differences from the other embodiments, and for the same or similar parts of the embodiments, reference may be made to one another.
  • as for the system or apparatus disclosed in the embodiments, since it corresponds to the method disclosed in the embodiments, its description is relatively simple; for relevant details, refer to the description of the method part.
  • "At least one (item)" means one or more, and "multiple" means two or more.
  • "And/or" is used to describe the association relationship of associated objects and indicates that three relationships may exist; for example, "A and/or B" can mean: only A exists, only B exists, or both A and B exist, where A and B can be singular or plural.
  • the character "/" generally indicates that the associated objects before and after it are in an "or" relationship.
  • "At least one of the following" or similar expressions refer to any combination of these items, including any combination of single or plural items.
  • at least one item (piece) of a, b, or c can mean: a, b, c, "a and b", "a and c", "b and c", or "a and b and c", where a, b, and c can be single or multiple.

Landscapes

  • Engineering & Computer Science (AREA)
  • Computer Graphics (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Image Generation (AREA)

Abstract

The present application discloses a fluff rendering method, apparatus, device, and medium. When fluff rendering needs to be performed on an object to be rendered, fluff rendering parameters of the object to be rendered are acquired, the fluff rendering parameters including one or more shaping parameters for the fluff, where each of the one or more shaping parameters corresponds to one fluff shaping effect. After the fluff rendering parameters are acquired, the object to be rendered is rendered according to the fluff rendering parameters to obtain the fluff shape of the object to be rendered. It can be seen that the embodiments of the present application render fluff of multiple shapes by acquiring multiple fluff shaping parameters in advance, improving the diversity of fluff shapes.

Description

Fluff rendering method, apparatus, device, and medium
This application claims priority to Chinese Patent Application No. 202111329287.9, entitled "Fluff Rendering Method, Apparatus, Device, and Medium" and filed on November 10, 2021, the entire contents of which are incorporated herein by reference.
Technical Field
The present application relates to computer processing technology, and in particular to a fluff rendering method, apparatus, device, and medium.
Background
Fluff appears very frequently in daily life, on items such as hats, coats, and scarves. In some film and television productions, the fluff that appears needs to be rendered to guarantee image quality. The fluff shapes produced by current fluff rendering schemes are relatively simple and cannot meet diversified needs.
Summary
In view of this, the embodiments of the present application provide a fluff rendering method, apparatus, device, and medium, so as to provide diversified fluff shapes and improve fluff rendering efficiency.
To achieve the above objective, the technical solutions provided by the embodiments of the present application are as follows:
In a first aspect of the embodiments of the present application, a fluff rendering method is provided, the method including:
acquiring fluff rendering parameters of an object to be rendered, the fluff rendering parameters including one or more shaping parameters for fluff, different shaping parameters among the one or more shaping parameters being used to render fluff of different shapes, and the object to be rendered being an object on which fluff rendering needs to be performed;
rendering the object to be rendered according to the fluff rendering parameters to obtain a fluff shape of the object to be rendered.
In a second aspect of the embodiments of the present application, a fluff rendering apparatus is provided, the apparatus including:
a first acquiring unit, configured to acquire fluff rendering parameters of an object to be rendered, the fluff rendering parameters including one or more shaping parameters for fluff, different shaping parameters among the one or more shaping parameters being used to render fluff of different shapes, and the object to be rendered being an object on which fluff rendering needs to be performed;
a second acquiring unit, configured to render the object to be rendered according to the fluff rendering parameters to obtain a fluff shape of the object to be rendered.
In a third aspect of the embodiments of the present application, an electronic device is provided, the device including a processor and a memory;
the memory is configured to store instructions or a computer program;
the processor is configured to execute the instructions or computer program in the memory, so that the electronic device performs the fluff rendering method described above.
In a fourth aspect of the embodiments of the present application, a computer-readable storage medium is provided, including instructions that, when run on a computer, cause the computer to perform the fluff rendering method described above.
It can thus be seen that the embodiments of the present application have the following beneficial effects:
In the technical solutions provided by the embodiments of the present application, when fluff rendering needs to be performed on an object to be rendered, fluff rendering parameters of the object to be rendered are acquired, the fluff rendering parameters including one or more shaping parameters for the fluff, where each of the one or more shaping parameters corresponds to one fluff shaping effect. After the fluff rendering parameters are acquired, the object to be rendered is rendered according to the fluff rendering parameters to obtain the fluff shape of the object to be rendered. It can be seen that the embodiments of the present application render fluff of multiple shapes by acquiring multiple fluff shaping parameters in advance, improving the diversity of fluff shapes.
Brief Description of the Drawings
FIG. 1 is a diagram of a multi-pass fluff rendering model;
FIG. 2 is a flowchart of a fluff rendering method provided by an embodiment of the present application;
FIG. 3a is a schematic diagram of a fluff spiral provided by an embodiment of the present application;
FIG. 3b is an effect diagram of a fluff spiral shape provided by an embodiment of the present application;
FIG. 3c is an effect diagram of fluff bending under the influence of noise provided by an embodiment of the present application;
FIG. 3d is an effect diagram of fluff UV offset provided by an embodiment of the present application;
FIG. 4 is a schematic diagram of a fluff rendering apparatus provided by an embodiment of the present application;
FIG. 5 is a schematic diagram of an electronic device provided by an embodiment of the present application.
Detailed Description
To make the above objectives, features, and advantages of the present application more apparent and easier to understand, the embodiments of the present application are described in further detail below with reference to the accompanying drawings and specific implementations. It should be understood that the specific embodiments described here are merely used to explain the present application and do not limit it. In addition, it should be noted that, for ease of description, the accompanying drawings show only the parts related to the present application rather than the entire structure.
With the continuous development of multimedia technology, simulations of characters such as people or animals often appear in multimedia videos, and such simulation inevitably involves the simulation of fluff. However, the fluff shapes rendered by traditional fluff rendering methods are relatively uniform and cannot satisfy the diversified fluff shapes users need, which affects the user experience.
On this basis, an embodiment of the present application provides a fluff rendering method. When fluff rendering is performed on an object to be rendered, fluff rendering parameters of the object are acquired, the fluff rendering parameters including one or more shaping parameters for the fluff, each of which renders one fluff shape. After the fluff rendering parameters are acquired, the object to be rendered is rendered using them, so that fluff of different shapes can be rendered on the object, meeting users' diversified needs and improving the user experience.
It should be noted that in the embodiments of the present application, fluff simulation can be implemented based on a multi-pass fluff rendering model, that is, the effect of simulated fluff is achieved by extruding the rendering model multiple times along a specific direction and reducing the opacity layer by layer. As shown in FIG. 1, each pass represents one layer; when rendering each layer, the vertex positions are moved outward along the normal in the vertex shader. In general, the more passes used, the better the rendering effect.
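As a rough illustration of the multi-pass ("shell") idea described above, the following Python sketch extrudes a mesh along its vertex normals once per pass and fades the opacity layer by layer. All names and the linear opacity falloff are illustrative assumptions, not taken from the patent text:

```python
def shell_layers(vertices, normals, num_passes=8, max_length=0.1):
    """Extrude the mesh once per pass along the vertex normals and fade
    each successive shell; outer, more transparent shells read as fur tips."""
    layers = []
    for p in range(1, num_passes + 1):
        fur_level = p / num_passes        # per-pass coefficient (like FURLEVEL)
        offset = fur_level * max_length   # how far this shell is pushed out
        shell = [tuple(v[i] + n[i] * offset for i in range(3))
                 for v, n in zip(vertices, normals)]
        opacity = 1.0 - fur_level         # assumed linear opacity falloff
        layers.append((shell, opacity))
    return layers
```

In a real renderer each shell would be a separate draw call whose vertex shader performs the extrusion; more passes give smoother-looking fluff at the cost of fill rate.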
To facilitate understanding of the technical solutions provided by the embodiments of the present application, a description is given below with reference to the accompanying drawings.
Refer to FIG. 2, which is a flowchart of a fluff rendering method provided by an embodiment of the present application. The method may be executed by a fluff rendering apparatus, which may be implemented in software and/or hardware and may generally be integrated in an electronic device. As shown in FIG. 2, the method includes:
S201: Acquire fluff rendering parameters of an object to be rendered.
In this embodiment, when fluff rendering needs to be performed on an object to be rendered, the corresponding fluff rendering parameters are acquired. The fluff rendering parameters may include one or more shaping parameters, and different shaping parameters among the one or more shaping parameters are used to render fluff of different shapes. The object to be rendered is an object on which fluff rendering needs to be performed.
Optionally, a shaping parameter may be a fluff spiral parameter that determines a fluff spiral shape. The fluff spiral shape refers to a shape in which the fluff spirals around the normal (i.e., its initial growth direction); the fluff spiral parameter can be used to render the effect of the fluff spiraling around the normal. Alternatively, a shaping parameter may be a parameter that determines the tail shape of the fluff, i.e., the shape of the fluff's tip, so that different states of the object to be rendered can be reflected through differently shaped tails. For example, if the object to be rendered is an animal, an angry state can be expressed by rendering the fluff tails as spikes, and a normal state can be expressed by rendering the fluff tails in a feather-duster-like shape. Alternatively, a shaping parameter may be a parameter that determines a fluff bending shape. The fluff bending shape refers to the bending effect of the fluff relative to its initial growth direction, so that changes brought to the fluff by external factors can be simulated through the bending shape, for example, simulating fluff bending under the influence of wind.
The fluff bending shape may be achieved by adding fluff noise, a UV offset, a flow direction parameter, and a vertex-color offset parameter. That is, the fluff rendering parameters may include one or more of a fluff spiral parameter, a fluff tail shape parameter, and a fluff bending parameter, and the fluff bending parameter may include one or more of a fluff noise parameter, a fluff UV offset parameter, a fluff flow direction parameter, or a fluff vertex offset parameter. The fluff noise parameter adds random bending to the initial growth direction of the fluff, improving the realism of the rendered fluff; in the fluff UV offset parameter, U denotes the offset of the fluff in the horizontal direction and V denotes the offset in the vertical direction; the flow direction parameter controls the growth direction of the fluff and is used to render a clustering effect.
S202: Render the object to be rendered according to the fluff rendering parameters to obtain a fluff shape of the object to be rendered.
After the fluff rendering parameters are acquired, the object to be rendered is rendered according to them, so that diversified fluff shapes are rendered on the object.
Optionally, in response to the fluff rendering parameters including a fluff spiral parameter, the number of spiral turns of the fluff is determined according to the fluff spiral parameter, and the object to be rendered is rendered according to the number of spiral turns to obtain the fluff spiral shape of the object. That is, when the fluff rendering parameters include a fluff spiral parameter, the number of turns the fluff spirals around the normal is determined according to the fluff spiral parameter, and the spiral shape of the fluff is determined according to that number of turns. As shown in FIG. 3a, Pass-x denotes the x-th layer of the multi-pass model; normally the fluff grows along the normal direction of the x-th layer, and the added spiral effect means that the fluff spirals around the normal. It should be noted that FIG. 3a is only intended to illustrate the spiral effect; since fluff is relatively short, the actual effect of spiraling around the normal depends on the specific situation.
Specifically, when determining the number of spiral turns of the fluff, the spiral angle of the fluff is determined according to the fluff spiral parameter, and the number of spiral turns is then determined according to the spiral angle. Determining the spiral angle of the fluff according to the fluff spiral parameter includes:
computing a vector product of the normal vector of the rendering model and a preset global direction vector to obtain a first vector; computing a vector product of the first vector and the normal vector to obtain a second vector; determining a first angle according to the fluff spiral parameter and the coefficient of the current pass in the rendering model; and determining the spiral angle of the fluff according to the first vector, the second vector, and the first angle. Here, the normal vector of the rendering model is the vector of the initial growth direction of the fluff, and the coefficient of the current pass refers to the coefficient of the pass currently being rendered in the multi-pass rendering model. A specific implementation can be seen in the following formulas:
1. side=normalize(cross(varWorldNormal,vec3(0.0,1.0,0.0)));
2. up=normalize(cross(side,varWorldNormal));
3. angle=FURLEVEL*10.0*PI*SpiralPow;
4. dir=normalize(dir*FURLEVEL*10.0+normalize(side*cos(angle)+up*sin(angle))*0.8).
Here, side denotes the first vector; varWorldNormal denotes the normal vector, which is three-dimensional; vec3(0.0,1.0,0.0) denotes the global up direction vector; the cross function computes the vector product and outputs a three-dimensional vector; the normalize function performs normalization; up denotes the second vector; angle denotes the first angle; FURLEVEL denotes the coefficient corresponding to the current pass layer; PI denotes pi; and SpiralPow denotes the spiral parameter, a factor controlling the spiral frequency whose value usually ranges over [0, 1]. The dir on the right side of the equals sign denotes the current direction vector of the fluff, and the dir on the left side denotes the direction vector of the fluff after the spiral is added, i.e., the spiral angle of the fluff. When the computation in step 4 is executed for the first time, the dir on the right side denotes the initial growth direction of the fluff, i.e., the normal direction; after multiple iterations, the dir on the right side of step 4 denotes the current actual growth direction of the fluff.
It should be noted that for multi-pass fluff rendering, the spiral shape of the fluff in each pass can be implemented in the above manner. For example, in the fluff spiral shape effect diagrams shown in FIG. 3b, the left image corresponds to the spiral parameter SpiralPow=0.3 and the right image to SpiralPow=0.4.
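Steps 1-4 above can be mirrored in a short Python sketch (an illustrative CPU-side re-implementation of the shader formulas; the hand-rolled vector helpers and the test values are assumptions, not part of the patent):

```python
import math

def normalize(v):
    length = math.sqrt(sum(c * c for c in v))
    return tuple(c / length for c in v)

def cross(a, b):
    return (a[1] * b[2] - a[2] * b[1],
            a[2] * b[0] - a[0] * b[2],
            a[0] * b[1] - a[1] * b[0])

def spiral_dir(world_normal, fur_level, spiral_pow, cur_dir):
    """Follow steps 1-4: build a tangent frame around the normal, derive the
    spiral angle from the pass coefficient, and twist the growth direction."""
    side = normalize(cross(world_normal, (0.0, 1.0, 0.0)))   # step 1
    up = normalize(cross(side, world_normal))                # step 2
    angle = fur_level * 10.0 * math.pi * spiral_pow          # step 3
    swirl = normalize(tuple(side[i] * math.cos(angle) + up[i] * math.sin(angle)
                            for i in range(3)))
    return normalize(tuple(cur_dir[i] * fur_level * 10.0 + swirl[i] * 0.8
                           for i in range(3)))               # step 4
```

Iterating spiral_dir once per pass, feeding each result back in as cur_dir, reproduces the per-layer twist around the normal; larger SpiralPow values twist faster, matching the FIG. 3b comparison.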
Optionally, in response to the fluff rendering parameters including a fluff tail shape parameter, the tail shape of the fluff is determined according to the fluff tail shape parameter, and the object to be rendered is rendered according to the tail shape to obtain the fluff tail shape of the object. When the fluff tail shape parameter is greater than zero, the tail of the fluff takes a spiky shape; when the fluff tail shape parameter is less than zero, the tail of the fluff takes a feather-duster-like shape.
1. When the fluff tail shape parameter is less than zero
Determining the tail shape of the fluff according to the fluff tail shape parameter may include: determining a first value by taking the coefficient of the current pass in the rendering model as the base and the fluff tail shape parameter as the exponent; and determining the tail shape of the fluff according to the first value and the grayscale value of the tail mask texture of the fluff. The fluff tail mask texture can be obtained by sampling the fluff tail image. Specifically, the tail shape of the fluff can be determined as follows:
SP.transluency=pow(1.1-FURLEVEL,3.0+FurTailFactor)-1.0+t_mask.r
Here, SP.transluency denotes the transparency of the pixel, which reflects the tail shape of the fluff; the pow(x,y) function computes x raised to the power y; FurTailFactor denotes the fluff tail shape parameter; FURLEVEL denotes the coefficient of the current pass layer; and t_mask.r denotes the grayscale value of the mask texture sampled at the fluff tail pixel.
After SP.transluency is computed, it must also be judged whether it is less than a preset threshold; if so, the transparency is low and the SP.transluency is discarded. For example, it is discarded when it is less than 0.00001.
2. When the fluff tail shape parameter is greater than or equal to zero
Determining the tail shape of the fluff according to the fluff tail shape parameter may include: determining a second value by taking the grayscale value of the tail mask texture of the fluff as the base and the fluff tail shape parameter as the exponent; determining a third value by taking the coefficient of the current pass in the rendering model as the base and the fluff tail shape parameter as the exponent; and determining the tail shape of the fluff according to the difference between the second value and the third value. See the following formula for details:
SP.transluency=pow(t_mask.r,FurTailFactor)-pow(FURLEVEL,FurTailFactor)
After SP.transluency is computed, it must also be judged whether it is less than a preset threshold and whether FURLEVEL is less than zero; if both are true, the transparency is low and the SP.transluency is discarded.
When the fluff tail shape parameter is greater than zero, the larger the value of the parameter, the spikier the fluff tail.
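The two branches above (feather-duster-like tails for FurTailFactor < 0, spiky tails otherwise, each with its own discard test) can be sketched as one Python function. The variable names and the 0.00001 threshold follow the text; everything else is an illustrative assumption:

```python
def tail_transparency(fur_level, tail_factor, mask_gray, eps=1e-5):
    """Return the per-pixel transparency reflecting the tail shape, or None
    when the pixel should be discarded. mask_gray stands in for t_mask.r
    sampled from the tail mask texture."""
    if tail_factor < 0.0:
        # FurTailFactor < 0: feather-duster-like tail
        t = pow(1.1 - fur_level, 3.0 + tail_factor) - 1.0 + mask_gray
        if t < eps:
            return None           # transparency too low, discard
    else:
        # FurTailFactor >= 0: spiky tail
        t = pow(mask_gray, tail_factor) - pow(fur_level, tail_factor)
        if t < eps and fur_level < 0.0:
            return None           # both discard conditions hold
    return t
```

Evaluating this once per shell per pixel thins the outer layers into either soft or spiky tips, depending on the sign and magnitude of tail_factor.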
Optionally, in response to the fluff rendering parameters including a fluff bending parameter, the bending direction of the fluff is determined according to the fluff bending parameter, and the object to be rendered is rendered according to the bending direction to obtain the fluff bending shape of the object. That is, the fluff renderer may determine the bending direction of the fluff according to the fluff bending parameter and then render the fluff of the object according to that direction, so that the rendered fluff has a bent shape. The bending direction of the fluff refers to a direction offset from the initial growth direction of the fluff, which is the normal direction of the rendering model. In other words, an offset is added to the initial growth direction of the fluff so that the fluff bends, making the rendering effect more realistic.
The bending effect of the fluff can be achieved by adding fluff noise, a UV offset, a flow direction parameter, a vertex offset, and the like.
1. Specifically, when the fluff bending parameter includes a fluff noise parameter, the bending direction can be determined as follows: computing a vector product of the normal vector of the rendering model and a preset global direction vector to obtain a third vector; computing a vector product of the third vector and the normal vector to obtain a fourth vector; determining a second angle according to the coefficient of the current pass in the rendering model and the vertex coordinates of the current pass; and determining the bending direction of the fluff according to the third vector, the fourth vector, the second angle, and the fluff noise parameter. The vertex coordinates of the current pass denote the vertex coordinate values in the local coordinate system corresponding to the current pass. Specifically, this can be implemented as follows:
1. side=normalize(cross(varWorldNormal,vec3(0.0,1.0,0.0)));
2. up=normalize(cross(side,varWorldNormal));
3. angle=PI/noise(VB.local_position+FURLEVEL);
4. dir=normalize(dir+side*sin(angle)*NoisePow+up*cos(angle))*NoisePow
Here, side denotes the third vector; up denotes the fourth vector; VB.local_position denotes the vertex coordinate values in the local coordinate system; FURLEVEL denotes the coefficient of the current pass; noise() denotes a random function; and NoisePow denotes the fluff noise parameter: the larger this parameter, the greater the randomness of the fluff's bending direction. For the meanings of the other parameters, see the description of the fluff spiral parameter above.
For example, in the fluff bending effect diagrams shown in FIG. 3c, the noise parameter is NoisePow=0 in the first image, NoisePow=0.5 in the second image, and NoisePow=2.0 in the third image.
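A Python sketch of the noise-bend steps follows. The patent does not specify noise(), so it is passed in as a callable; the trailing NoisePow factor in step 4 of the text is read here as scaling the two tangent terms so the returned direction stays unit length. Both choices are assumptions for illustration:

```python
import math

def normalize(v):
    length = math.sqrt(sum(c * c for c in v))
    return tuple(c / length for c in v)

def cross(a, b):
    return (a[1] * b[2] - a[2] * b[1],
            a[2] * b[0] - a[0] * b[2],
            a[0] * b[1] - a[1] * b[0])

def noise_bend(world_normal, cur_dir, local_pos, fur_level, noise_pow, noise_fn):
    # Steps 1-2: the same tangent frame as for the spiral.
    side = normalize(cross(world_normal, (0.0, 1.0, 0.0)))
    up = normalize(cross(side, world_normal))
    # Step 3: a per-vertex pseudo-random angle (noise_fn stands in for noise()).
    angle = math.pi / noise_fn(tuple(c + fur_level for c in local_pos))
    # Step 4: push the growth direction sideways; noise_pow scales the offset.
    bent = tuple(cur_dir[i]
                 + side[i] * math.sin(angle) * noise_pow
                 + up[i] * math.cos(angle) * noise_pow
                 for i in range(3))
    return normalize(bent)
```

With noise_pow = 0 the fluff keeps its current direction, and larger values scatter it more, consistent with the FIG. 3c progression.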
2. When the fluff bending parameter includes a fluff UV offset parameter, the fluff bending direction can be determined as follows: determining the bending direction of the fluff based on the UV offset parameter, the mask texture of the fluff, the texture coordinates, and the scaling coefficient of the texture coordinates. The fluff mask texture can be determined by sampling the fluff image. Specifically, this can be computed by the following formula:
t_mask=Texture2DScale(FurStyleMask,FB.texcoord0+uvoffset/FurMaskScale,FurMaskScale)
Here, t_mask denotes the grayscale value of the fluff pixel's mask texture, which reflects the fluff bending direction; the Texture2DScale() function samples a texture; FurStyleMask denotes the fluff mask texture, which determines the initial form of the fluff; FB.texcoord0 denotes the texture coordinates; uvoffset denotes the UV offset parameter; and FurMaskScale denotes the scaling coefficient of the texture coordinates.
See the fluff UV offset effect diagrams shown in FIG. 3d: in the first image uvoffset=(0,0); in the second image uvoffset=(1.6,0) and the fluff is offset to the left; in the third image uvoffset=(-1.6,0) and the fluff is offset to the right.
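The Texture2DScale sampling above can be imitated on a tiny grayscale grid. The nearest-neighbor lookup and wrap-around addressing are illustrative assumptions standing in for the real texture sampler:

```python
def sample_mask(mask, uv, uvoffset, mask_scale):
    """Imitate Texture2DScale: divide the UV offset by the mask scale before
    adding it to the texture coordinate, scale, then index a small grayscale
    grid standing in for FurStyleMask. Modulo wrapping emulates tiling."""
    h = len(mask)
    w = len(mask[0])
    u = (uv[0] + uvoffset[0] / mask_scale) * mask_scale
    v = (uv[1] + uvoffset[1] / mask_scale) * mask_scale
    x = int(u * w) % w
    y = int(v * h) % h
    return mask[y][x]
```

Shifting uvoffset per pass slides each shell's mask sideways, which is what makes the strands appear to lean, as in FIG. 3d.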
3. When the fluff bending parameter includes a fluff flow direction parameter, the flow direction parameter may be a flow map. In the flow map, the R channel indicates the magnitude of the horizontal offset, with right as the positive direction, and the G channel indicates the magnitude of the vertical offset, with up as the positive direction. A UV offset value can be obtained by sampling the flow map, and the fluff bending direction is opposite to the direction of the UV offset value. Specifically, the UV offset is determined according to the flow direction parameter, and the bending direction of the fluff is determined according to the UV offset and the coefficient of the current pass in the rendering model. The bending direction of the fluff can be determined as follows:
t_flowmap=Texture2DScale(FlowMapture,FB.texcoord0,FlowMapScale);
t_uvoffset=t_flowmap.xy*2.0-vec2(1.0,1.0);
uvoffset=t_uvoffset*FURLEVEL*FlowMapPow*0.1+uvoffset
Here, t_flowmap denotes the fluff flow map, i.e., the flow direction parameter; the Texture2DScale() function samples a texture; FlowMapture denotes the flow texture; FB.texcoord0 denotes the texture coordinates; FlowMapScale denotes the scaling coefficient of the texture coordinates; t_uvoffset denotes the UV offset sampled from the flow map; and FlowMapPow denotes a coefficient adjusting the offset strength.
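The flow-map remapping above reduces to a few arithmetic steps once the RG sample is available; this sketch assumes the sample is already in [0, 1] per channel, as the text implies, and the function name is illustrative:

```python
def flowmap_uvoffset(flow_sample, fur_level, flow_pow, uvoffset):
    """Remap a flow-map RG sample from [0, 1] to [-1, 1], scale it by the pass
    coefficient and strength factor, and accumulate onto the running UV offset."""
    tx = flow_sample[0] * 2.0 - 1.0   # R channel: horizontal, right positive
    ty = flow_sample[1] * 2.0 - 1.0   # G channel: vertical, up positive
    return (tx * fur_level * flow_pow * 0.1 + uvoffset[0],
            ty * fur_level * flow_pow * 0.1 + uvoffset[1])
```

Because the offset grows with fur_level, outer shells shift further than inner ones, which is what bends the strands toward (or, per the text, opposite to) the painted flow direction.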
4. When the fluff bending parameter includes a fluff vertex offset parameter, the fluff bending direction can be determined as follows: determining the bending direction of the fluff based on the vertex offset parameter, the normal direction of the rendering model, and an interpolation coefficient, where the interpolation coefficient reflects the degree to which the vertex offset parameter influences the normal direction.
If(length(VB.model_color.xyz)>0.0)
dir=normalize(mix(varWorldNormal,normalize(VB.model_color.xyz),u_ColorDirFactor));
Here, varWorldNormal denotes the initial growth direction of the fluff; VB.model_color.xyz denotes the offset direction added on the vertex; and u_ColorDirFactor is the interpolation coefficient between these two directions. The larger the interpolation coefficient, the greater the influence of the offset direction added on the vertex on the initial growth direction. mix() denotes a mixing function that blends the initial growth direction and the vertex offset direction according to the interpolation coefficient, and dir denotes the final bending direction of the fluff after blending.
It should be noted that the purpose of the above judgment is to make sure that the offset direction added on the vertex is a valid direction.
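The mix()-based vertex-color bend above, including the validity check on the vertex color's length, can be sketched as follows (illustrative Python, with mix() expanded into an explicit linear interpolation):

```python
import math

def normalize(v):
    length = math.sqrt(sum(c * c for c in v))
    return tuple(c / length for c in v)

def vertex_color_dir(world_normal, model_color, color_dir_factor):
    """Blend the initial growth direction with the per-vertex offset direction.
    A zero-length vertex color is treated as 'no valid offset stored'."""
    if math.sqrt(sum(c * c for c in model_color)) <= 0.0:
        return world_normal                 # the validity check from the text
    target = normalize(model_color)
    # mix(a, b, t) expanded: a*(1-t) + b*t, then renormalized.
    mixed = tuple(world_normal[i] * (1.0 - color_dir_factor)
                  + target[i] * color_dir_factor for i in range(3))
    return normalize(mixed)
```

With color_dir_factor = 0 the fluff grows straight along the normal; with 1 it follows the direction painted into the vertex color, and intermediate values blend the two.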
In addition, when rendering the object to be rendered, besides determining the fluff shape, basic parameters for fluff rendering also need to be acquired; that is, the fluff rendering parameters may also include basic parameters, such as fluff length, thickness, and density parameters, so that the fluff is rendered according to these basic parameters and the shaping is added on top of the rendered fluff.
Based on the above method embodiment, the embodiments of the present application provide a fluff rendering apparatus and device, which are described below with reference to the accompanying drawings.
Refer to FIG. 4, which is a schematic diagram of a fluff rendering apparatus provided by an embodiment of the present application. As shown in FIG. 4, the apparatus 400 may include a first acquiring unit 401 and a second acquiring unit 402.
The first acquiring unit 401 is configured to acquire fluff rendering parameters of an object to be rendered, the fluff rendering parameters including one or more shaping parameters for fluff, different shaping parameters among the one or more shaping parameters being used to render fluff of different shapes, and the object to be rendered being an object on which fluff rendering needs to be performed. For the specific implementation of the first acquiring unit 401, see the description of S201 above.
The second acquiring unit 402 is configured to render the object to be rendered according to the fluff rendering parameters to obtain a fluff shape of the object to be rendered. For the specific implementation of the second acquiring unit 402, see the description of S202 above.
In a specific implementation, the fluff rendering parameters include one or more shaping parameters among a fluff spiral parameter, a fluff tail shape parameter, and a fluff bending parameter.
In a specific implementation, the second acquiring unit 402 is specifically configured to, in response to the fluff rendering parameters including a fluff spiral parameter, determine the number of spiral turns of the fluff according to the fluff spiral parameter, and render the object to be rendered according to the number of spiral turns to obtain the fluff shape of the object to be rendered.
In a specific implementation, the second acquiring unit 402 is specifically configured to, in response to the fluff rendering parameters including a fluff spiral parameter, determine the spiral angle of the fluff according to the fluff spiral parameter, and determine the number of spiral turns of the fluff according to the spiral angle.
In a specific implementation, the second acquiring unit 402 is specifically configured to compute a vector product of the normal vector of the rendering model and a preset global direction vector to obtain a first vector; compute a vector product of the first vector and the normal vector to obtain a second vector; determine a first angle according to the fluff spiral parameter and the coefficient of the current pass in the rendering model; and determine the spiral angle of the fluff according to the first vector, the second vector, and the first angle.
In a specific implementation, the second acquiring unit 402 is specifically configured to, in response to the fluff rendering parameters including a fluff tail shape parameter, determine the tail shape of the fluff according to the fluff tail shape parameter, and render the object to be rendered according to the tail shape to obtain the fluff shape of the object to be rendered.
In a specific implementation, the second acquiring unit 402 is specifically configured to, when the fluff tail shape parameter is less than zero, determine a first value by taking the coefficient of the current pass in the rendering model as the base and the fluff tail shape parameter as the exponent, and determine the tail shape of the fluff according to the first value and the grayscale value of the tail mask texture of the fluff.
In a specific implementation, the second acquiring unit 402 is specifically configured to, when the fluff tail shape parameter is greater than or equal to zero, determine a second value by taking the grayscale value of the tail mask texture of the fluff as the base and the fluff tail shape parameter as the exponent; determine a third value by taking the coefficient of the current pass in the rendering model as the base and the fluff tail shape parameter as the exponent; and determine the tail shape of the fluff according to the difference between the second value and the third value.
In a specific implementation, when the fluff tail shape parameter is greater than zero, the tail of the fluff takes a spiky shape; when the fluff tail shape parameter is less than zero, the tail of the fluff takes a feather-duster-like shape.
In a specific implementation, the second acquiring unit 402 is specifically configured to, in response to the fluff rendering parameters including a fluff bending parameter, determine the bending direction of the fluff according to the fluff bending parameter, and render the object to be rendered according to the bending direction to obtain the fluff shape of the object to be rendered.
In a specific implementation, the fluff bending parameter includes one or more of a fluff noise parameter, a fluff UV offset parameter, a fluff flow direction parameter, or a fluff vertex offset parameter.
In a specific implementation, the second acquiring unit 402 is specifically configured to, in response to the fluff bending parameter including the fluff noise parameter, compute a vector product of the normal vector of the rendering model and a preset global direction vector to obtain a third vector; compute a vector product of the third vector and the normal vector to obtain a fourth vector; determine a second angle according to the coefficient of the current pass in the rendering model and the vertex coordinates of the current pass; and determine the bending direction of the fluff according to the third vector, the fourth vector, the second angle, and the fluff noise parameter.
In a specific implementation, the second acquiring unit 402 is specifically configured to, in response to the fluff bending parameter including the UV offset parameter, determine the bending direction of the fluff based on the UV offset parameter, the mask texture of the fluff, the texture coordinates, and the scaling coefficient of the texture coordinates.
In a specific implementation, the second acquiring unit 402 is specifically configured to, in response to the fluff bending parameter including the flow direction parameter, determine a UV offset according to the flow direction parameter, and determine the bending direction of the fluff according to the UV offset and the coefficient of the current pass in the rendering model.
In a specific implementation, the second acquiring unit 402 is specifically configured to, in response to the fluff bending parameter including the vertex offset parameter, determine the bending direction of the fluff based on the vertex offset parameter, the normal direction of the rendering model, and an interpolation coefficient, where the interpolation coefficient reflects the degree to which the vertex offset parameter influences the normal direction.
It should be noted that, for the implementation of each unit in this embodiment, reference may be made to the relevant description in the above method embodiment, which is not repeated here.
Refer to FIG. 5, which shows a schematic structural diagram of an electronic device 500 suitable for implementing the embodiments of the present application. Terminal devices in the embodiments of the present application may include, but are not limited to, mobile terminals such as mobile phones, notebook computers, digital broadcast receivers, PDAs (Personal Digital Assistants), PADs (portable android devices, i.e., tablets), PMPs (Portable Media Players), and vehicle-mounted terminals (e.g., vehicle navigation terminals), as well as fixed terminals such as digital TVs (televisions) and desktop computers. The electronic device shown in FIG. 5 is merely an example and should not impose any limitation on the functions and scope of use of the embodiments of the present application.
As shown in FIG. 5, the electronic device 500 may include a processing apparatus (e.g., a central processing unit, a graphics processing unit, etc.) 501, which may perform various appropriate actions and processes according to a program stored in a read-only memory (ROM) 502 or a program loaded from a storage apparatus 508 into a random access memory (RAM) 503. Various programs and data required for the operation of the electronic device 500 are also stored in the RAM 503. The processing apparatus 501, the ROM 502, and the RAM 503 are connected to one another through a bus 504. An input/output (I/O) interface 505 is also connected to the bus 504.
Generally, the following apparatuses may be connected to the I/O interface 505: input apparatuses 506 including, for example, a touch screen, touch pad, keyboard, mouse, camera, microphone, accelerometer, and gyroscope; output apparatuses 507 including, for example, a liquid crystal display (LCD), speaker, and vibrator; storage apparatuses 508 including, for example, magnetic tape and hard disk; and a communication apparatus 509. The communication apparatus 509 may allow the electronic device 500 to communicate wirelessly or by wire with other devices to exchange data. Although FIG. 5 shows the electronic device 500 with various apparatuses, it should be understood that it is not required to implement or have all of the illustrated apparatuses; more or fewer apparatuses may alternatively be implemented or provided.
In particular, according to the embodiments of the present application, the processes described above with reference to the flowchart may be implemented as a computer software program. For example, the embodiments of the present application include a computer program product including a computer program carried on a non-transitory computer-readable medium, the computer program containing program code for performing the method shown in the flowchart. In such an embodiment, the computer program may be downloaded and installed from a network through the communication apparatus 509, or installed from the storage apparatus 508, or installed from the ROM 502. When the computer program is executed by the processing apparatus 501, the above functions defined in the method of the embodiments of the present application are performed.
The electronic device provided by this embodiment of the present application belongs to the same inventive concept as the fluff rendering method provided by the above embodiments; technical details not described in detail in this embodiment can be found in the above embodiments, and this embodiment has the same beneficial effects as the above embodiments.
An embodiment of the present application provides a computer-readable medium on which a computer program is stored, where the program, when executed by a processor, implements the fluff rendering method described in any of the above embodiments.
It should be noted that the computer-readable medium mentioned above in the present application may be a computer-readable signal medium or a computer-readable storage medium or any combination of the two. A computer-readable storage medium may be, for example, but not limited to, an electrical, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any combination of the above. More specific examples of computer-readable storage media may include, but are not limited to: an electrical connection with one or more wires, a portable computer diskette, a hard disk, a random access memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the above. In the present application, a computer-readable storage medium may be any tangible medium that contains or stores a program that can be used by or in combination with an instruction execution system, apparatus, or device. In the present application, a computer-readable signal medium may include a data signal propagated in baseband or as part of a carrier wave, carrying computer-readable program code. Such a propagated data signal may take many forms, including but not limited to an electromagnetic signal, an optical signal, or any suitable combination of the above. A computer-readable signal medium may also be any computer-readable medium other than a computer-readable storage medium; the computer-readable signal medium may send, propagate, or transmit a program for use by or in combination with an instruction execution system, apparatus, or device. The program code contained on the computer-readable medium may be transmitted by any appropriate medium, including but not limited to: wires, optical cables, RF (radio frequency), etc., or any suitable combination of the above.
In some implementations, the client and the server may communicate using any currently known or future-developed network protocol such as HTTP (HyperText Transfer Protocol), and may be interconnected with digital data communication in any form or medium (e.g., a communication network). Examples of communication networks include local area networks ("LANs"), wide area networks ("WANs"), internetworks (e.g., the Internet), and peer-to-peer networks (e.g., ad hoc peer-to-peer networks), as well as any currently known or future-developed networks.
The computer-readable medium may be included in the above electronic device, or may exist independently without being assembled into the electronic device.
The computer-readable medium carries one or more programs, and when the one or more programs are executed by the electronic device, the electronic device is caused to perform the above data processing method.
Computer program code for performing the operations of the present application may be written in one or more programming languages or combinations thereof, including but not limited to object-oriented programming languages such as Java, Smalltalk, and C++, as well as conventional procedural programming languages such as the "C" language or similar programming languages. The program code may execute entirely on the user's computer, partly on the user's computer, as a stand-alone software package, partly on the user's computer and partly on a remote computer, or entirely on a remote computer or server. In cases involving a remote computer, the remote computer may be connected to the user's computer through any kind of network, including a local area network (LAN) or a wide area network (WAN), or may be connected to an external computer (for example, through the Internet using an Internet service provider).
The flowcharts and block diagrams in the accompanying drawings illustrate the possible architectures, functions, and operations of systems, methods, and computer program products according to various embodiments of the present application. In this regard, each block in the flowcharts or block diagrams may represent a module, program segment, or portion of code that contains one or more executable instructions for implementing specified logical functions. It should also be noted that in some alternative implementations, the functions noted in the blocks may occur in an order different from that noted in the drawings. For example, two blocks shown in succession may in fact be executed substantially in parallel, or they may sometimes be executed in the reverse order, depending on the functions involved. It should also be noted that each block of the block diagrams and/or flowcharts, and combinations of blocks in the block diagrams and/or flowcharts, can be implemented by a dedicated hardware-based system that performs the specified functions or operations, or by a combination of dedicated hardware and computer instructions.
The units involved in the embodiments described in the present application may be implemented by software or by hardware. The name of a unit/module does not constitute a limitation on the unit itself in some cases; for example, a voice data collection module may also be described as a "data collection module".
The functions described herein above may be performed at least in part by one or more hardware logic components. For example, without limitation, exemplary types of hardware logic components that may be used include: field programmable gate arrays (FPGAs), application specific integrated circuits (ASICs), application specific standard products (ASSPs), systems on chip (SOCs), complex programmable logic devices (CPLDs), and so on.
In the context of the present application, a machine-readable medium may be a tangible medium that may contain or store a program for use by or in combination with an instruction execution system, apparatus, or device. A machine-readable medium may be a machine-readable signal medium or a machine-readable storage medium. A machine-readable medium may include, but is not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any suitable combination of the above. More specific examples of machine-readable storage media would include an electrical connection based on one or more wires, a portable computer diskette, a hard disk, a random access memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the above.
It should be noted that the embodiments in this specification are described in a progressive manner; each embodiment focuses on its differences from other embodiments, and for the same or similar parts of the embodiments, reference may be made to one another. As for the system or apparatus disclosed in the embodiments, since it corresponds to the method disclosed in the embodiments, its description is relatively simple, and for relevant details, reference may be made to the description of the method part.
It should be understood that in the present application, "at least one (item)" means one or more, and "multiple" means two or more. "And/or" is used to describe the association relationship of associated objects and indicates that three relationships may exist; for example, "A and/or B" may indicate: only A exists, only B exists, or both A and B exist, where A and B may be singular or plural. The character "/" generally indicates that the associated objects before and after it are in an "or" relationship. "At least one of the following" or similar expressions refer to any combination of these items, including any combination of a single item or plural items. For example, at least one of a, b, or c may indicate: a, b, c, "a and b", "a and c", "b and c", or "a and b and c", where a, b, and c may be single or multiple.
It should also be noted that, herein, relational terms such as first and second are used only to distinguish one entity or operation from another and do not necessarily require or imply any such actual relationship or order between these entities or operations. Moreover, the terms "include", "comprise", or any other variants thereof are intended to cover non-exclusive inclusion, so that a process, method, article, or device that includes a series of elements includes not only those elements but also other elements not explicitly listed, or elements inherent to such a process, method, article, or device. Without further limitation, an element defined by the statement "including a ..." does not exclude the existence of other identical elements in the process, method, article, or device that includes the element.
The steps of the methods or algorithms described in connection with the embodiments disclosed herein may be implemented directly in hardware, in a software module executed by a processor, or in a combination of the two. A software module may reside in random access memory (RAM), memory, read-only memory (ROM), electrically programmable ROM, electrically erasable programmable ROM, registers, a hard disk, a removable disk, a CD-ROM, or any other form of storage medium known in the technical field.
The above description of the disclosed embodiments enables those skilled in the art to implement or use the present application. Various modifications to these embodiments will be apparent to those skilled in the art, and the general principles defined herein may be implemented in other embodiments without departing from the spirit or scope of the present application. Therefore, the present application will not be limited to the embodiments shown herein but is to be accorded the widest scope consistent with the principles and novel features disclosed herein.

Claims (18)

  1. A fluff rendering method, characterized in that the method comprises:
    acquiring fluff rendering parameters of an object to be rendered, the fluff rendering parameters comprising one or more shaping parameters for fluff, different shaping parameters among the one or more shaping parameters being used to render fluff of different shapes, and the object to be rendered being an object on which fluff rendering needs to be performed;
    rendering the object to be rendered according to the fluff rendering parameters to obtain a fluff shape of the object to be rendered.
  2. The method according to claim 1, characterized in that the fluff rendering parameters comprise one or more shaping parameters among a fluff spiral parameter, a fluff tail shape parameter, and a fluff bending parameter.
  3. The method according to claim 2, characterized in that the rendering the object to be rendered according to the fluff rendering parameters to obtain a fluff shape of the object to be rendered comprises:
    in response to the fluff rendering parameters comprising a fluff spiral parameter, determining the number of spiral turns of the fluff according to the fluff spiral parameter, and rendering the object to be rendered according to the number of spiral turns of the fluff to obtain the fluff shape of the object to be rendered.
  4. The method according to claim 3, characterized in that the determining, in response to the fluff rendering parameters comprising a fluff spiral parameter, the number of spiral turns of the fluff according to the fluff spiral parameter comprises:
    in response to the fluff rendering parameters comprising a fluff spiral parameter, determining a spiral angle of the fluff according to the fluff spiral parameter;
    determining the number of spiral turns of the fluff according to the spiral angle of the fluff.
  5. The method according to claim 4, characterized in that the determining a spiral angle of the fluff according to the fluff spiral parameter comprises:
    computing a vector product of a normal vector of a rendering model and a preset global direction vector to obtain a first vector;
    computing a vector product of the first vector and the normal vector to obtain a second vector;
    determining a first angle according to the fluff spiral parameter and a coefficient of a current pass in the rendering model;
    determining the spiral angle of the fluff according to the first vector, the second vector, and the first angle.
  6. The method according to any one of claims 2-5, characterized in that the rendering the object to be rendered according to the fluff rendering parameters to obtain a fluff shape of the object to be rendered comprises:
    in response to the fluff rendering parameters comprising a fluff tail shape parameter, determining a tail shape of the fluff according to the fluff tail shape parameter, and rendering the object to be rendered according to the tail shape of the fluff to obtain the fluff shape of the object to be rendered.
  7. The method according to claim 6, characterized in that the determining a tail shape of the fluff according to the fluff tail shape parameter comprises:
    when the fluff tail shape parameter is less than zero, determining a first value by taking a coefficient of a current pass in a rendering model as a base and the fluff tail shape parameter as an exponent;
    determining the tail shape of the fluff according to the first value and a grayscale value of a tail mask texture of the fluff.
  8. The method according to claim 6, characterized in that the determining a tail shape of the fluff according to the fluff tail shape parameter comprises:
    when the fluff tail shape parameter is greater than or equal to zero, determining a second value by taking a grayscale value of a tail mask texture of the fluff as a base and the fluff tail shape parameter as an exponent;
    determining a third value by taking a coefficient of a current pass in a rendering model as a base and the fluff tail shape parameter as an exponent;
    determining the tail shape of the fluff according to a difference between the second value and the third value.
  9. The method according to claim 6, characterized in that when the fluff tail shape parameter is greater than zero, the tail of the fluff takes a spiky shape, and when the fluff tail shape parameter is less than zero, the tail of the fluff takes a feather-duster-like shape.
  10. The method according to any one of claims 2-9, characterized in that the rendering the object to be rendered according to the fluff rendering parameters to obtain a fluff shape of the object to be rendered comprises:
    in response to the fluff rendering parameters comprising a fluff bending parameter, determining a bending direction of the fluff according to the fluff bending parameter, and rendering the object to be rendered according to the bending direction of the fluff to obtain the fluff shape of the object to be rendered.
  11. The method according to claim 10, characterized in that the fluff bending parameter comprises one or more of a fluff noise parameter, a fluff UV offset parameter, a fluff flow direction parameter, or a fluff vertex offset parameter.
  12. The method according to claim 11, characterized in that the determining, in response to the fluff rendering parameters comprising a fluff bending parameter, a bending direction of the fluff according to the fluff bending parameter comprises:
    in response to the fluff bending parameter comprising the fluff noise parameter, computing a vector product of a normal vector of a rendering model and a preset global direction vector to obtain a third vector;
    computing a vector product of the third vector and the normal vector to obtain a fourth vector;
    determining a second angle according to a coefficient of a current pass in the rendering model and vertex coordinates of the current pass;
    determining the bending direction of the fluff according to the third vector, the fourth vector, the second angle, and the fluff noise parameter.
  13. The method according to claim 11, characterized in that the determining, in response to the fluff rendering parameters comprising a fluff bending parameter, a bending direction of the fluff according to the fluff bending parameter comprises:
    in response to the fluff bending parameter comprising the UV offset parameter, determining the bending direction of the fluff based on the UV offset parameter, a mask texture of the fluff, texture coordinates, and a scaling coefficient of the texture coordinates.
  14. The method according to claim 11, characterized in that the determining, in response to the fluff rendering parameters comprising a fluff bending parameter, a bending direction of the fluff according to the fluff bending parameter comprises:
    in response to the fluff bending parameter comprising the flow direction parameter, determining a UV offset according to the flow direction parameter;
    determining the bending direction of the fluff according to the UV offset and a coefficient of a current pass in a rendering model.
  15. The method according to claim 11, characterized in that the determining, in response to the fluff rendering parameters comprising a fluff bending parameter, a bending direction of the fluff according to the fluff bending parameter comprises:
    in response to the fluff bending parameter comprising the vertex offset parameter, determining the bending direction of the fluff based on the vertex offset parameter, a normal direction of a rendering model, and an interpolation coefficient, the interpolation coefficient being used to reflect the degree to which the vertex offset parameter influences the normal direction.
  16. A fluff rendering apparatus, characterized in that the apparatus comprises:
    a first acquiring unit, configured to acquire fluff rendering parameters of an object to be rendered, the fluff rendering parameters comprising one or more shaping parameters for fluff, different shaping parameters among the one or more shaping parameters being used to render fluff of different shapes, and the object to be rendered being an object on which fluff rendering needs to be performed;
    a second acquiring unit, configured to render the object to be rendered according to the fluff rendering parameters to obtain a fluff shape of the object to be rendered.
  17. An electronic device, characterized in that the device comprises a processor and a memory;
    the memory is configured to store instructions or a computer program;
    the processor is configured to execute the instructions or computer program in the memory, so that the electronic device performs the fluff rendering method according to any one of claims 1-15.
  18. A computer-readable storage medium, characterized by comprising instructions that, when run on a computer, cause the computer to perform the fluff rendering method according to any one of claims 1-15.
PCT/CN2022/129194 2021-11-10 2022-11-02 Fluff rendering method, apparatus, device, and medium WO2023083067A1 (zh)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN202111329287.9 2021-11-10
CN202111329287.9A CN116109744A (zh) 2021-11-10 2021-11-10 Fluff rendering method, apparatus, device, and medium

Publications (1)

Publication Number Publication Date
WO2023083067A1 true WO2023083067A1 (zh) 2023-05-19

Family

ID=86266093

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2022/129194 WO2023083067A1 (zh) 2021-11-10 2022-11-02 一种绒毛渲染方法、装置、设备及介质

Country Status (2)

Country Link
CN (1) CN116109744A (zh)
WO (1) WO2023083067A1 (zh)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN116883567A (zh) * 2023-07-07 2023-10-13 上海散爆信息技术有限公司 Fluff rendering method and apparatus

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20070291050A1 (en) * 1999-08-06 2007-12-20 Bruderlin Armin W Multiple instantiable effects in a hair/fur pipeline
US20100063789A1 (en) * 2006-11-21 2010-03-11 Toyota Tsusho Corporation Computer-readable recording medium which stores fabric model generation program, fabric model generation apparatus and fabric model generation method
CN111369658A (zh) * 2020-03-24 2020-07-03 北京畅游天下网络技术有限公司 Rendering method and apparatus
CN111462313A (zh) * 2020-04-02 2020-07-28 网易(杭州)网络有限公司 Method, apparatus, and terminal for implementing a fluff effect



Also Published As

Publication number Publication date
CN116109744A (zh) 2023-05-12

Similar Documents

Publication Publication Date Title
JP2023547917A (ja) Image segmentation method, apparatus, device, and storage medium
CN110069191B (zh) Terminal-based image drag deformation implementation method and apparatus
CN109800730B (zh) Method and apparatus for generating an avatar generation model
WO2022042290A1 (zh) Virtual model processing method and apparatus, electronic device, and storage medium
WO2023083067A1 (zh) Fluff rendering method, apparatus, device, and medium
WO2023160513A1 (zh) Rendering method and apparatus for 3D material, device, and storage medium
WO2023061169A1 (zh) Image style transfer and model training method, apparatus, device, and medium
WO2023029893A1 (zh) Texture mapping method, apparatus, device, and storage medium
US20230401764A1 (en) Image processing method and apparatus, electronic device and computer readable medium
WO2023116801A1 (zh) Particle effect rendering method, apparatus, device, and medium
WO2023138498A1 (zh) Method and apparatus for generating a stylized image, electronic device, and storage medium
CN109754464A (zh) Method and apparatus for generating information
WO2023193639A1 (zh) Image rendering method and apparatus, readable medium, and electronic device
CN111243085B (zh) Training method and apparatus for an image reconstruction network model, and electronic device
US20120086724A1 (en) Method for automatically adjusting the rendered fidelity of elements of a composition
CN111652675A (zh) Display method, apparatus, and electronic device
CN111127603A (zh) Animation generation method and apparatus, electronic device, and computer-readable storage medium
WO2024131532A1 (zh) Hair strand processing method, apparatus, device, and storage medium
CN114049674A (zh) Three-dimensional face reconstruction method, apparatus, and storage medium
CN110211017A (zh) Image processing method and apparatus, and electronic device
WO2023193613A1 (zh) Highlight rendering method and apparatus, medium, and electronic device
KR20230167746A (ko) Method and system for generating a polygon mesh approximating a surface using iteration and root finding on mesh vertex positions
CN110069195B (zh) Image drag deformation method and apparatus
WO2023103682A1 (zh) Image processing method and apparatus, device, and medium
CN116109721A (zh) Hair strand generation method, apparatus, device, and storage medium

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 22891863

Country of ref document: EP

Kind code of ref document: A1