CN112465942B - Color rendering method and device, electronic equipment and storage medium - Google Patents


Info

Publication number
CN112465942B
CN112465942B (granted from application CN202011410290.9A)
Authority
CN
China
Prior art keywords
rendered
hairline
hair
color
current state
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202011410290.9A
Other languages
Chinese (zh)
Other versions
CN112465942A (en)
Inventor
杨帆
史文浩
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shanghai Mihoyo Tianming Technology Co Ltd
Original Assignee
Shanghai Mihoyo Tianming Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shanghai Mihoyo Tianming Technology Co Ltd filed Critical Shanghai Mihoyo Tianming Technology Co Ltd
Priority to CN202011410290.9A priority Critical patent/CN112465942B/en
Publication of CN112465942A publication Critical patent/CN112465942A/en
Application granted granted Critical
Publication of CN112465942B publication Critical patent/CN112465942B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T15/00 3D [Three Dimensional] image rendering

Landscapes

  • Engineering & Computer Science (AREA)
  • Computer Graphics (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Processing Or Creating Images (AREA)
  • Image Generation (AREA)

Abstract

The invention discloses a color rendering method and device, electronic equipment and a storage medium. The method comprises the following steps: determining a reference hairline and hairlines to be rendered of a current object; projecting each hairline to be rendered into a measurement space of the reference hairline based on the current state data of each hairline to be rendered; aligning each hairline to be rendered according to the similarity between its current state data and that of the reference hairline; and performing color rendering on each hairline to be rendered based on the alignment relation between the hairline to be rendered and the reference hairline, and on the color of the reference hairline. Hair color rendering is thereby realized on a per-hairline basis, so that while the hair art style is kept under control, the dynamic loss of the hair is reduced, the display effect of the hair is improved, and the user's visual experience is further enhanced.

Description

Color rendering method and device, electronic equipment and storage medium
Technical Field
The embodiment of the invention relates to the technical field of computers, in particular to a color rendering method, a color rendering device, electronic equipment and a storage medium.
Background
Currently, stylized hair on the market is rendered on a Mesh basis. Stylized rendering falls broadly into the category of cartoon rendering: it requires control over the overall color of the hair and over the position and shape of highlight areas, and it demands more artistic freedom than fully realistic hair rendering. The widely used Mesh-based stylized rendering method places low demands on simulation calculation and can quickly determine the shape of the hair, but it incurs a large dynamic loss: the hair cannot disperse like real hair, so the effect looks unreal. Conversely, if real hair is used, some complex dispersion effects can be achieved, but the rendering cannot meet the requirements of stylized rendering.
Disclosure of Invention
The invention provides a hair color rendering method, a hair color rendering device, electronic equipment and a storage medium, which are used for realizing hair color rendering based on hair, so that the hair art style is controlled, the dynamic loss of the hair is reduced, the hair display effect is improved, and the visual experience of a user is further improved.
In a first aspect, an embodiment of the present invention provides a color rendering method, including:
determining a reference hairline and a hairline to be rendered of a current object;
projecting each hairline to be rendered to a measurement space of a reference hairline based on current state data of each hairline to be rendered;
aligning the hairline to be rendered according to the similarity of the current state data of the hairline to be rendered and the reference hairline;
and performing color rendering on each hairline to be rendered based on the alignment relation between the hairline to be rendered and the reference hairline and the color of the reference hairline.
In a second aspect, an embodiment of the present invention further provides a color rendering apparatus, including:
the determining module is used for determining the reference hairline and the hairline to be rendered of the current object;
the projection module is used for projecting each hairline to be rendered to a measurement space of a reference hairline based on the current state data of each hairline to be rendered;
the alignment module is used for aligning the hairline to be rendered according to the similarity of the current state data of the hairline to be rendered and the reference hairline;
and the rendering module is used for performing color rendering on each hairline to be rendered based on the alignment relation between the hairline to be rendered and the reference hairline and the color of the reference hairline.
In a third aspect, an embodiment of the present invention further provides an electronic device, where the electronic device includes:
one or more processors;
a storage device for storing one or more programs,
when the one or more programs are executed by the one or more processors, the one or more processors implement the hair color rendering method provided by the embodiment of the invention.
In a fourth aspect, an embodiment of the present invention further provides a computer-readable storage medium, on which a computer program is stored, where the computer program, when executed by a processor, implements the color rendering method provided in the embodiment of the present invention.
The embodiment of the invention has the following advantages or beneficial effects:
determining a reference hairline and hairlines to be rendered of a current object; projecting each hairline to be rendered into a measurement space of the reference hairline based on the current state data of each hairline to be rendered; aligning each hairline to be rendered according to the similarity between its current state data and that of the reference hairline; and performing color rendering on each hairline to be rendered based on the alignment relation between the hairline to be rendered and the reference hairline, and on the color of the reference hairline. Hair color rendering is thereby realized on a per-hairline basis, so that while the hair art style is kept under control, the dynamic loss of the hair is reduced, the display effect of the hair is improved, and the user's visual experience is further enhanced.
Drawings
In order to more clearly illustrate the technical solutions of the exemplary embodiments of the present invention, a brief description of the drawings used in describing the embodiments is given below. It should be clear that the described figures illustrate only some of the embodiments of the invention, not all of them, and that a person skilled in the art can derive other figures from them without inventive effort.
Fig. 1 is a schematic flowchart of a color rendering method according to the first embodiment of the present invention;
Fig. 2 is a schematic diagram of a measurement space according to the first embodiment of the present invention;
Fig. 3 is a schematic diagram of hairline segmentation according to the first embodiment of the present invention;
Fig. 4 is a schematic flowchart of a color rendering method according to the second embodiment of the present invention;
Fig. 5 is a schematic flowchart of a color rendering method according to the third embodiment of the present invention;
Fig. 6 is a schematic structural diagram of a color rendering apparatus according to the fourth embodiment of the present invention;
Fig. 7 is a schematic structural diagram of an electronic device according to the fifth embodiment of the present invention.
Detailed Description
The present invention will be described in further detail with reference to the accompanying drawings and examples. It is to be understood that the specific embodiments described herein are merely illustrative of the invention and are not to be construed as limiting the invention. It should be further noted that, for the convenience of description, only some of the structures related to the present invention are shown in the drawings, not all of the structures.
Example one
Fig. 1 is a schematic flowchart of a color rendering method according to the first embodiment of the present invention. This embodiment is applicable to the situation where hairlines to be rendered are aligned with a reference hairline and color rendering is performed on each hairline to be rendered based on its alignment relation with the reference hairline. The method may be executed by a color rendering apparatus, which may be implemented in hardware and/or software, and specifically includes the following steps:
and S110, determining the reference hairline and the hairline to be rendered of the current object.
The current object refers to a target object needing color rendering, such as a 3D cartoon character, a cartoon image, a 3D animal, a 3D character or an image character. Hair color rendering refers to performing color rendering on each hairline to be rendered so as to realize a gradual color change from hair root to hair tip. The reference hairline and the hairlines to be rendered in this embodiment may be head hair, facial or body hair such as a beard or eyebrows, or animal hair such as fur. In this embodiment, the current object may be, for example, a 3D cartoon character, and the reference hairline and the hairlines to be rendered may be hairlines of that character.
The reference hairline and the hairline to be rendered are both from the same local area of the current object, each local area of the current object comprises one reference hairline and a plurality of hairlines to be rendered, and the reference hairline can be a central hairline of each local area. Illustratively, the local region may be a bang region, a left region, a right region, a hindbrain region, or the like of the current subject. The division of the local area may be set according to the requirement of the rendering effect, for example, the requirement on the rendering effect is high, and each local area may be divided into a plurality of areas, so as to form a smaller local area to improve the rendering effect. The reference hairline refers to a hairline which has undergone color rendering in a local area, such as a gradient color, and is used for assisting color rendering of the hairline to be rendered, so that the rendered color of the hairline to be rendered is consistent with the color of the reference hairline.
Optionally, the hair of the current object includes a plurality of groups of pre-divided hair; wherein, determining the hair to be rendered of the current object comprises: and determining at least one group of displayed hairlines as to-be-rendered hairlines according to the station position of the current object.
The station position of the current object refers to the face orientation direction of the current object, and at least one group of displayed hairlines can be determined according to the face orientation direction of the current object. For example, if the standing position of the current subject is forward, that is, the face faces the front, the at least one group of hair strands displayed is the hair strands of the bang group; if the current object stands in the left lateral direction, that is, the face faces the left side, at least one group of hair displayed is the right hair. And determining at least one group of displayed hairlines as to-be-rendered hairlines so as to render each currently displayed hairline.
In the embodiment, the hair of the target object is divided into the multiple groups of hairlines in advance, and at least one group of the displayed hairlines is determined to be the hairlines to be rendered according to the station position of the target object, so that part of the hairlines which do not need to be displayed do not need to be rendered, and the hairline rendering amount and the calculation amount are reduced.
Optionally, the reference hairline is a hairline which is located in a hairline central area in each group of hairlines and meets a preset length condition.
The hair center area refers to the middle area of each group of hair, the width of the middle area can be adjusted according to requirements, and the application does not limit the width. The preset length condition may be that the length is maximum, or the length is greater than a preset proportion of other hair to be rendered, such as other hair to be rendered with a length greater than 95%. It can be understood that the reference hairline is defined as the hairline in the central area of the hairline to be rendered, so that each hairline to be rendered is rendered according to the color change rule of the central hairline, the color rendering effect of each hairline to be rendered can be improved, and the display effect of the hair is improved. In addition, the reference hairline is limited to the hairline meeting the preset length condition, so that the problem that the color of the part of the hairline to be rendered, which exceeds the length of the reference hairline, cannot be determined when the length of the reference hairline is smaller than that of most hairlines to be rendered is avoided, and the color rendering effect of each hairline to be rendered is improved.
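The selection rule above (a strand in the central area that satisfies the length condition) can be sketched in Python as follows. This is illustrative only, not the patent's implementation; the strand representation as `(center_offset, length)` pairs and the center-region fraction are assumptions:

```python
def pick_reference(strands, center_fraction=0.3):
    # strands: list of (center_offset, length) pairs, where center_offset is
    # the strand's normalized offset from the middle of its hair group.
    # Keep only strands inside the central area, then pick the longest one,
    # matching the "maximum length" variant of the preset length condition.
    in_center = [s for s in strands if abs(s[0]) <= center_fraction]
    return max(in_center, key=lambda s: s[1])

# Three strands; the longest overall (offset 0.5) lies outside the central
# area, so the longest central strand is chosen as the reference hairline.
ref = pick_reference([(0.1, 5.0), (0.5, 9.0), (-0.2, 7.0)])
# ref == (-0.2, 7.0)
```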
And S120, projecting each hair to be rendered to a measurement space of the reference hair based on the current state data of each hair to be rendered.
The current state data represents the curvature value of each hairline segment and the position of that segment on the hairline, or the curvature value of each hairline point and the position of that point on the hairline, or the overall curvature value of the hairline. The curvature value represents the degree of bending of the hairline at a hair point or hair segment; the larger the curvature value, the greater the bending. The measurement space of the reference hairline refers to a reference direction generated from the reference hairline, used for determining the coordinate values of the reference hairline and the hairlines to be rendered along that direction. Specifically, the measurement space may be the hair direction of the reference hairline, i.e. the direction determined by its two vertices: as shown in fig. 2, points A and B are the two vertices of the reference hairline, and its hair direction β may be determined from the straight line connecting A and B. The measurement space may also be the vertical direction, e.g. γ in fig. 2. The coordinate values of the measurement space range over [0,1], where the values 0 and 1 correspond to the two vertex coordinates of the reference hairline.
When each hair to be rendered is mapped to the measurement space of the reference hair, the coordinate values corresponding to the two vertex coordinates of the hair to be rendered are respectively 0 and 1, so that the coordinate value of each hair point or hair section in the measurement space can be determined based on the position of each hair point or hair section in the current state data of the hair to be rendered. For example, if there are 6 uniform hair points on a certain hair to be rendered, the coordinate values of the measurement space corresponding to each hair point are 0,0.2, 0.4, 0.6, 0.8, and 1, respectively.
Specifically, after coordinate values of each hair point or hair segment on the hair to be rendered in the measurement space are determined, according to the curvature values of each hair point or hair segment contained in the current state data of the hair to be rendered, the coordinate values on the measurement space of each hair point or hair segment and the corresponding curvature values are generated into an array, and therefore the projection of the hair to be rendered to the measurement space of the reference hair is achieved. Illustratively, if the values of the curvatures of the respective uniform hairline points are 0.05, 0.1, 0.2, 0.3, 0.1 and 0.05 as described above, the resulting arrays are [0,0.05], [0.2,0.1], [0.4,0.2], [0.6,0.3], [0.8,0.1] and [1,0.05], respectively. And updating the current state data of the hairline to be rendered based on the array generated after the hairline to be rendered is projected, namely, the position of each hairline point or hairline segment on the hairline to be rendered is represented by the coordinate value of the measurement space of the reference hairline.
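The projection step can be made concrete with a minimal Python sketch (illustrative only, not the patent's implementation; the function name is an assumption). It maps uniformly spaced hair points into the [0,1] measurement space and pairs each coordinate with its curvature, reproducing the arrays of the example above:

```python
def project_to_metric_space(curvatures):
    # Each of the n uniformly spaced hair points gets the coordinate
    # i / (n - 1) in the reference hairline's [0, 1] measurement space,
    # paired with that point's curvature value.
    n = len(curvatures)
    if n < 2:
        raise ValueError("a hairline needs at least two points")
    return [(i / (n - 1), c) for i, c in enumerate(curvatures)]

# The worked example: 6 uniform hair points with the curvatures given above.
pairs = project_to_metric_space([0.05, 0.1, 0.2, 0.3, 0.1, 0.05])
# pairs == [(0.0, 0.05), (0.2, 0.1), (0.4, 0.2), (0.6, 0.3), (0.8, 0.1), (1.0, 0.05)]
```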
Optionally, the projecting each hair to be rendered to the measurement space of the reference hair based on the current state data of each hair to be rendered includes: segmenting each hairline to be rendered according to a preset segmentation rule, calculating the curvature of each segment, and forming current state data based on the position of each segment on the hairline to be rendered and the corresponding curvature; and projecting the current state data of each hair to be rendered to the metric space of the reference hair.
The preset segmentation rule may be to divide each hairline to be rendered uniformly into a preset number of hairline segments, where the preset number may be a fixed value or may be determined according to the length of each hairline; or segmentation may be performed at the hair points whose curvature change values exceed a preset threshold, according to the curvature change between hair points; or the hairlines may be segmented according to a preset fixed length value. For example, fig. 3 shows each hairline to be rendered segmented according to a preset fixed length value; since the hairlines differ in length while the segment length is fixed, the number of segments may differ from hairline to hairline.
Specifically, the curvature of each segment may be represented by the curvature value calculated at the center hair point of each segment, may be represented by the average value of the curvature values calculated at the respective hair points, or may be represented by the maximum value of the curvature values calculated at the respective hair points on the segment. The position of each segment in the hair to be rendered may be represented by the position of an end point of each segment, the position of a midpoint of each segment, or the position of a hair point with the largest curvature value, which is not limited in the present application.
Specifically, the current state data of the hair to be rendered is projected into the metric space of the reference hair: and determining the coordinate value of each segment in the measurement space based on the position of each segment in the current state data of the hair to be rendered in the hair to be rendered, and generating an array comprising the coordinate value of the measurement space of each segment and the corresponding curvature value. In the embodiment, each hairline to be rendered is segmented according to a preset segmentation rule, the curvature of each segment is calculated, and current state data is formed on the basis of the position of each segment on the hairline to be rendered and the corresponding curvature; the current state data of each hairline to be rendered is projected to the measurement space of the reference hairline, so that the reference hairline and the hairline to be rendered are mapped to the same space, the color rendering of each hairline to be rendered is realized, and further, the visual experience of a user is improved.
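The segmentation step described above can be sketched in Python as follows. Assumed details for illustration: segments are a fixed number of points, and each segment's curvature is taken as the mean of its point curvatures (one of the aggregation choices mentioned above):

```python
def segment_strand(point_curvatures, points_per_segment):
    # Split the strand's point-curvature list into fixed-size chunks and
    # represent each segment by the mean curvature of its points.
    segments = []
    for start in range(0, len(point_curvatures), points_per_segment):
        chunk = point_curvatures[start:start + points_per_segment]
        segments.append(sum(chunk) / len(chunk))
    return segments

# Six hair points grouped into three segments of two points each.
segs = segment_strand([0.25, 0.75, 0.5, 0.5, 0.125, 0.375], 2)
# segs == [0.5, 0.5, 0.25]
```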
Optionally, before aligning the hair to be rendered according to the similarity between the hair to be rendered and the current state data of the reference hair, the method further includes: and segmenting the reference hairline according to a preset segmentation rule, and determining the curvature of each segment in the current state to form current state data of the reference hairline.
The preset segmentation rule of the reference hairline is consistent with that of the hairlines to be rendered. Illustratively, if the preset segmentation rule is to segment according to a preset fixed length value, then each segment length of the reference hairline is consistent with each segment length of the hairlines to be rendered. The curvature of each segment of the reference hairline may be represented by the curvature value calculated at the middle hair point of the segment, by the average of the curvature values calculated at its hair points, or by the maximum of those curvature values. The current state data of the reference hairline comprises the curvature of each segment and the position of the corresponding segment on the reference hairline, where the position of each segment is represented by its coordinate value in the measurement space of the reference hairline.
In this embodiment, the reference hairline is segmented according to the preset segmentation rule, and the curvature of each segment in the current state is determined to form the current state data of the reference hairline, so that the curvature data and the corresponding position of each segment on the reference hairline are obtained, and the alignment between the hairline to be rendered and the reference hairline based on the curvature data is realized.
And S130, aligning the hairline to be rendered according to the similarity of the current state data of the hairline to be rendered and the reference hairline.
The current state data of a hairline to be rendered comprises the coordinate value in the measurement space of each hair point or hair segment on that hairline together with the corresponding curvature value. Correspondingly, the current state data of the reference hairline comprises the coordinate value in the measurement space of each hair point or hair segment on the reference hairline together with the corresponding curvature value. The similarity between the current state data of a hairline to be rendered and that of the reference hairline refers to the similarity between the curvature values of their hair points or hair segments. Alignment pairs each hair point on the hairline to be rendered with a similar hair point on the reference hairline, or each hair segment on the hairline to be rendered with a similar hair segment on the reference hairline.
For example, take the curvature values of the hair segments on the hairline to be rendered and on the reference hairline. The similarity of a hair segment of the hairline to be rendered to a hair segment of the reference hairline can be judged from the difference of their curvature values. Specifically, if the difference between the curvature value of a segment of the hairline to be rendered and that of a segment of the reference hairline is smaller than a preset similarity threshold, the two segments are determined to be similar and to have an alignment relationship. For example, suppose the reference hairline comprises 3 hair segments with current state data [0,0.1], [0.5,0.3] and [1,0.2], and a certain hairline to be rendered also comprises 3 hair segments with current state data [0,0.2], [0.5,0.1] and [1,0.3]. Then [0,0.1] and [0.5,0.1] have an alignment relationship, [0.5,0.3] and [1,0.3] have an alignment relationship, and [1,0.2] and [0,0.2] have an alignment relationship.
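The similarity test can be sketched in Python as follows (illustrative only; the threshold value is an assumption, since the patent leaves the preset similarity threshold unspecified):

```python
def align_segments(render_segs, ref_segs, threshold=0.05):
    # Pair each (coordinate, curvature) segment of the hairline to be
    # rendered with every reference segment whose curvature differs by
    # less than the preset similarity threshold.
    pairs = []
    for seg in render_segs:
        for ref in ref_segs:
            if abs(seg[1] - ref[1]) < threshold:
                pairs.append((seg, ref))
    return pairs

# The worked example: each segment matches exactly one reference segment.
ref = [(0.0, 0.1), (0.5, 0.3), (1.0, 0.2)]
to_render = [(0.0, 0.2), (0.5, 0.1), (1.0, 0.3)]
matches = align_segments(to_render, ref)
# matches == [((0.0, 0.2), (1.0, 0.2)), ((0.5, 0.1), (0.0, 0.1)), ((1.0, 0.3), (0.5, 0.3))]
```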
And S140, performing color rendering on each hairline to be rendered based on the alignment relation between the hairline to be rendered and the reference hairline and the color of the reference hairline.
According to the corresponding color of each hair point or hair section on the reference hair and the alignment relationship between each hair to be rendered and the reference hair, the color of the hair point or the hair section aligned with the reference hair on each hair to be rendered can be determined. The colors of the hair to be rendered and the reference hair on the aligned hair points or hair sections should be the same. In one embodiment, the colors of all the hair points or the hair segments aligned with the reference hair on each hair to be rendered may be set based on the colors of all the hair points or the hair segments on the reference hair and the alignment relationship between each hair to be rendered and the reference hair, so as to realize the overall color rendering of each hair to be rendered.
In another embodiment, the colors of a subset of hair points or hair segments of each hairline to be rendered may be set to the colors of the corresponding coordinate points of the reference hairline, based on that subset of hair points or hair segments on the reference hairline and the alignment relationship between the reference hairline and each hairline to be rendered. It can be understood that, after the colors of these hair points or hair segments are obtained, the colors of the coordinate points between them may be filled in by smooth transition, so as to complete the color rendering of each hairline to be rendered. Illustratively, for a certain hairline to be rendered, suppose the colors corresponding to points b1, b2 and b3 are 10, 20 and 30 respectively; if there are 4 hair points between b1 and b2, the colors of those 4 points are 12, 14, 16 and 18 respectively.
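The smooth-transition filling in the worked example amounts to linear interpolation between two anchored colors. A minimal Python sketch (function name illustrative, not from the patent):

```python
def interpolate_colors(c0, c1, n_between):
    # Fill n_between intermediate hair points between two anchored points
    # whose colors are c0 and c1, by uniform (linear) transition.
    step = (c1 - c0) / (n_between + 1)
    return [c0 + step * (i + 1) for i in range(n_between)]

# The worked example: anchors 10 and 20 with 4 hair points between them.
interpolate_colors(10, 20, 4)
# == [12.0, 14.0, 16.0, 18.0]
```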
According to the technical scheme of this embodiment, a reference hairline and hairlines to be rendered of a current object are determined; each hairline to be rendered is projected into a measurement space of the reference hairline based on its current state data; each hairline to be rendered is aligned according to the similarity between its current state data and that of the reference hairline; and color rendering is performed on each hairline to be rendered based on its alignment relation with the reference hairline and on the color of the reference hairline. Hair color rendering is thereby realized on a per-hairline basis, so that while the hair art style is kept under control, the dynamic loss of the hair is reduced, the display effect of the hair is improved, and the user's visual experience is further enhanced.
Example two
Fig. 4 is a schematic flowchart of a color rendering method according to the second embodiment of the present invention. On the basis of the foregoing embodiment, this embodiment further optimizes the step of aligning the hairlines to be rendered according to the similarity between their current state data and that of the reference hairline. On this basis, the color rendering of each hairline to be rendered based on its alignment relation with the reference hairline and on the color of the reference hairline can also be further optimized. Explanations of terms identical or corresponding to those of the above embodiment are omitted here.
Referring to fig. 4, the color rendering method provided in this embodiment includes:
and S410, determining the reference hairline and the hairline to be rendered of the current object.
And S420, projecting each hair to be rendered to a measurement space of the reference hair based on the current state data of each hair to be rendered.
And S430, matching the curvatures of the segments in the hairlines to be rendered in the curvatures of the segments in the reference hairline, and determining segment identifiers in the reference hairline matched with the segments in the hairlines to be rendered according to the curvature matching result.
Each segment of the reference hairline has a corresponding segment identifier. Specifically, the curvature of each segment of the hairline to be rendered is compared with the curvatures of the segments of the reference hairline; if the curvatures are the same, or the difference between them is less than a preset error threshold, the segment identifier of the matching reference segment is determined. Illustratively, if the reference hairline has 3 segments with curvatures 0.2, 0.3 and 0.4 and corresponding segment identifiers a, b and c, then the segment with curvature 0.3 in the hairline to be rendered matches the second segment of the reference hairline, and the determined segment identifier is b.
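The curvature-to-identifier matching can be sketched as follows (Python; the error threshold value and the dictionary representation of reference segments are assumptions for illustration):

```python
def match_segment_id(curvature, ref_segments, err_threshold=0.05):
    # Return the identifier of the reference segment whose curvature is
    # closest to the given curvature, provided the difference stays under
    # the preset error threshold; otherwise return None.
    best_id, best_diff = None, err_threshold
    for seg_id, ref_curvature in ref_segments.items():
        diff = abs(curvature - ref_curvature)
        if diff < best_diff:
            best_id, best_diff = seg_id, diff
    return best_id

# The worked example: reference segments a, b, c with curvatures 0.2, 0.3, 0.4.
ref = {"a": 0.2, "b": 0.3, "c": 0.4}
match_segment_id(0.3, ref)  # == "b"
```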
S440, determining color values corresponding to the segments in the hairline to be rendered according to the segment identifiers in the reference hairline matched with the segments in the hairline to be rendered and the color values corresponding to the segment identifiers in the reference hairline; and rendering the hairline to be rendered based on the color value corresponding to each subsection in the hairline to be rendered.
The color value corresponding to a segment identifier may be the color mean of the segment; or the color value of an endpoint of the segment; or the color value of the midpoint of the segment; or the color value of a key point within the segment, where a key point may be a point whose degree of color change exceeds a preset threshold; or an array of the color values of several key points within the segment. The color of the reference hairline is represented by the color values corresponding to its segment identifiers. By determining the corresponding color value through the segment identifier of the reference hairline, the color of each segment of the hairline to be rendered is made equal to the color of the matching segment of the reference hairline, so that after rendering, the color change rule of the hairline to be rendered is consistent with that of the reference hairline. Illustratively, if the segment with curvature 0.3 in the hairline to be rendered in the example above matches segment identifier b of the reference hairline, and the color value corresponding to b is 40, then the color value of that segment is set to 40.
It can be understood that, if the color value corresponding to a segment identifier in the reference hair is the color mean of the segment, the color value of every hair point in the corresponding segment of the hair to be rendered may be set to that mean; alternatively, only the midpoint of the segment may be set to the mean, with the color values of the remaining points determined by uniform filling. If the color value corresponding to the segment identifier is the color value of an endpoint of the segment, the corresponding endpoint in the matching segment of the hair to be rendered may be set to that endpoint color value. If it is the color value of the midpoint of the segment, every hair point in the corresponding segment of the hair to be rendered may be set to that value, or only the midpoint may be set to it, with the color values of the remaining points determined by uniform filling. If it is the color value of one or more key points in the segment, the corresponding key points in the matching segment of the hair to be rendered may be set to those values, and the color values of the remaining hair points obtained by uniformly transitioning between adjacent key points.
According to the above technical scheme, the curvature of each segment in the hair to be rendered is matched against the curvatures of the segments in the reference hair; the segment identifiers in the reference hair that match the segments in the hair to be rendered are determined from the curvature matching result; and the color values of the segments in the hair to be rendered are determined from the color values corresponding to those segment identifiers. This achieves segment-wise color rendering of the hair to be rendered based on curvature matching and improves color rendering precision.
Optionally, rendering the hairline to be rendered based on the color value corresponding to each segment in the hairline to be rendered includes: and determining the color value corresponding to the segmentation as the color value of the segmentation node, and performing uniform transition rendering on each segmentation based on the color values of the adjacent segmentation.
The segment node refers to an endpoint of the segment, and may be the upper or lower endpoint of the segment, or may be a key point within the segment, which is not limited in this application. In this embodiment, the color of each hair point on a segment whose node colors have been determined may be refined, thereby achieving a uniform transition of hair color. Illustratively, let the segment node be the upper endpoint of the segment: if the color value corresponding to segment A is 15, the color value of hair point A1 at the upper endpoint of segment A is 15; if segment B is the segment immediately following segment A and its corresponding color value is 20, the color value of hair point B1 at the upper endpoint of segment B is 20. By the method of this embodiment, the colors of the 4 hair points between A1 and B1 can be set to 16, 17, 18, and 19. It can be understood that a finer uniform transition rendering can also be performed according to the accuracy requirement of color rendering, such as setting the colors of 9 hair points between A1 and B1 to 15.5, 16, 16.5, 17, 17.5, 18, 18.5, 19, and 19.5.
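The uniform transition between two segment-node colors is plain linear interpolation. A minimal sketch (function name is illustrative, not from the patent), using node colors 15 and 20 as in the example:

```python
def transition_colors(a, b, n_between):
    """Color values for n_between hair points lying strictly between two
    segment nodes whose colors are a and b (uniform transition rendering)."""
    step = (b - a) / (n_between + 1)
    return [a + step * (i + 1) for i in range(n_between)]

print(transition_colors(15, 20, 4))  # prints [16.0, 17.0, 18.0, 19.0]
```

Increasing `n_between` gives the finer transition mentioned above without changing the endpoint colors.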
In this embodiment, the color values corresponding to the segments are determined as the color values of the segment nodes, and uniform transition rendering is performed on each segment based on the color values of the adjacent segments, so that uniform rendering of the hair to be rendered is realized, and the color rendering effect is improved.
EXAMPLE III
Fig. 5 is a schematic flow chart of a color rendering method according to a third embodiment of the present invention. In this embodiment, on the basis of the foregoing embodiments, "aligning the hair to be rendered according to the similarity between the current state data of the hair to be rendered and the current state data of the reference hair" is further optimized. On this basis, the alignment relationship between the hair to be rendered and the reference hair, and the color rendering of the hair to be rendered by the reference hair, can also be further optimized. Explanations of terms that are the same as or correspond to those of the above embodiments are omitted here.
Referring to fig. 5, the color rendering method provided in this embodiment includes:
and S510, determining the reference hairline and the hairline to be rendered of the current object.
And S520, projecting each hair to be rendered to a measurement space of the reference hair based on the current state data of each hair to be rendered.
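The per-segment state data used in S520 (segment positions plus curvatures) might be computed along the following lines. Both the discrete-curvature estimate (turning angle per mean edge length, ignoring angle wrap-around) and the fixed-size chunking are assumptions for illustration; the patent only requires "a preset segmentation rule" and per-segment curvatures:

```python
import math

def discrete_curvature(p0, p1, p2):
    """Estimate curvature at p1 of a 2-D polyline as the turning angle
    between the two adjacent edges divided by their mean length."""
    v1 = (p1[0] - p0[0], p1[1] - p0[1])
    v2 = (p2[0] - p1[0], p2[1] - p1[1])
    turn = abs(math.atan2(v2[1], v2[0]) - math.atan2(v1[1], v1[0]))
    mean_len = (math.hypot(*v1) + math.hypot(*v2)) / 2
    return turn / mean_len

def strand_state(points, points_per_segment):
    """Current state data: (segment position, mean curvature) pairs,
    segmenting the strand's interior points into fixed-size chunks."""
    curvs = [discrete_curvature(points[i - 1], points[i], points[i + 1])
             for i in range(1, len(points) - 1)]
    return [(i, sum(chunk) / len(chunk))
            for i, chunk in enumerate(
                curvs[j:j + points_per_segment]
                for j in range(0, len(curvs), points_per_segment))]

straight = [(x, 0.0) for x in range(7)]  # a straight strand has zero curvature
print(strand_state(straight, 2))  # prints [(0, 0.0), (1, 0.0), (2, 0.0)]
```

Projecting these (position, curvature) pairs into the reference hair's metric space then reduces alignment to comparing curvature sequences, as in S530.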
S530, matching curvatures of head and tail sections in each hairline to be rendered in curvatures of each section in a reference hairline, and determining section identifications in the reference hairline respectively matched with the head and tail sections in the hairline to be rendered according to curvature matching results.
The head and tail segments refer to the segments at the two ends of the segmented hair to be rendered. In this embodiment, only the curvatures of these two segments are matched against the curvatures of the segments in the reference hair, yielding the segment identifiers in the reference hair matched with the head and tail segments. The matching principle may be: if the difference between the curvature of a head or tail segment in a hair to be rendered and the curvature of a segment in the reference hair is less than a preset error threshold, that head or tail segment matches that segment in the reference hair.
S540, determining color values respectively corresponding to head and tail subsections in the hairline to be rendered according to the subsection identification in the reference hairline respectively matched with the head and tail subsections in the hairline to be rendered and the color values corresponding to the subsection identification in the reference hairline; and performing uniform transition rendering on the hairline to be rendered of each subsection based on the color values of the head and the tail subsections.
Wherein, the color value corresponding to the segment identifier may be a color mean of the segment; or, a color value of an endpoint of the segment; or, a color value of a midpoint of the segment; or, the color value of the key point in the segment, where the key point may be a point whose color change degree exceeds a preset degree threshold; or, an array of color values for a plurality of keypoints within the segment. After the color values of the head and tail segments in the hair to be rendered are determined, the color values of all the segments in the middle of the head and tail segments are obtained in a uniform transition rendering mode. Illustratively, the color value of the first segment in the hair to be rendered is 20, and the color value of the last segment in the hair to be rendered is 100, then the color values of the segments in the middle of the first segment and the last segment are 40, 60, and 80, respectively. It can be understood that after the color values of the segments in the middle of the head and tail segments are obtained, the colors of the hair points on the segments can be subjected to uniform transition rendering.
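The middle-segment colors in the 20→100 example above follow from evenly spacing values between the head and tail colors. A minimal sketch, with illustrative names:

```python
def middle_segment_colors(head_color, tail_color, n_middle):
    """Color values for the n_middle segments lying between the head and
    tail segments, obtained by uniform transition rendering."""
    step = (tail_color - head_color) / (n_middle + 1)
    return [head_color + step * (i + 1) for i in range(n_middle)]

print(middle_segment_colors(20, 100, 3))  # prints [40.0, 60.0, 80.0]
```

Each segment's color can then be further refined into per-hair-point colors by the same uniform transition, as noted above.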
According to the technical scheme of this embodiment, the curvatures of the head and tail segments in each hair to be rendered are matched against the curvatures of the segments in the reference hair; the segment identifiers in the reference hair respectively matched with the head and tail segments are determined from the curvature matching result; and the color values respectively corresponding to the head and tail segments are determined from the color values corresponding to those segment identifiers. The segments of the hair to be rendered are then given uniform transition rendering based on the head and tail color values, realizing color rendering of the hair to be rendered from its head and tail segments and improving the speed of hair color rendering.
EXAMPLE IV
Fig. 6 is a schematic structural diagram of a hair color rendering device according to a fourth embodiment of the present invention, where this embodiment is applicable to a situation in which hairlines to be rendered are aligned with reference hairlines and color rendering is performed on each hairline to be rendered based on an alignment relationship between each hairline to be rendered and the reference hairline, and the device specifically includes: a determination module 610, a projection module 620, an alignment module 630, and a rendering module 640.
A determining module 610, configured to determine a reference hairline and a hairline to be rendered of a current object;
a projection module 620, configured to project each hair to be rendered to a measurement space of a reference hair based on current state data of each hair to be rendered;
an alignment module 630, configured to align the hairline to be rendered according to similarity between the current state data of the hairline to be rendered and the current state data of the reference hairline;
and the rendering module 640 is configured to perform color rendering on each hair to be rendered based on the alignment relationship between the hair to be rendered and the reference hair and the color of the reference hair.
In this embodiment, the determining module determines the reference hair and the hair to be rendered of the current object; the projection module projects each hair to be rendered to the metric space of the reference hair based on the current state data of each hair to be rendered; the alignment module aligns the hair to be rendered according to the similarity between the current state data of the hair to be rendered and that of the reference hair; and the rendering module performs color rendering on each hair to be rendered based on the alignment relationship between the hair to be rendered and the reference hair and on the color of the reference hair. This realizes strand-based hair color rendering, so that while the artistic style of the hair is controlled, the loss of hair dynamics is reduced, the display effect of the hair is improved, and the user's visual experience is further enhanced.
Optionally, the projection module includes a forming unit and a projection unit; the forming unit is used for segmenting each hairline to be rendered according to a preset segmentation rule, calculating the curvature of each segment, and forming current state data based on the position of each segment on the hairline to be rendered and the corresponding curvature; the projection unit is used for projecting the current state data of each hair to be rendered to the measurement space of the reference hair.
Optionally, the hair color rendering device further includes a forming module, configured to segment the reference hair according to a preset segmentation rule before aligning the hair to be rendered according to similarity between the hair to be rendered and current state data of the reference hair, and determine curvatures of segments in a current state to form current state data of the reference hair.
Optionally, the alignment module includes an identifier determining unit, configured to match curvatures of segments in the respective to-be-rendered hairlines with curvatures of the segments in the reference hairline, and determine, according to a curvature matching result, a segment identifier in the reference hairline that is matched with the segments in the to-be-rendered hairline.
Optionally, the rendering module includes a color determination unit and a color rendering unit; the color determining unit is used for determining color values corresponding to all the subsections in the hairline to be rendered according to subsection marks in reference hairlines matched with the subsections in the hairline to be rendered and color values corresponding to the subsection marks in the reference hairlines; the color rendering unit is used for rendering the hairline to be rendered based on the color value corresponding to each subsection in the hairline to be rendered.
Optionally, the color rendering unit is specifically configured to determine a color value corresponding to the segment as a color value of the segment node, and perform uniform transition rendering on each segment based on color values of adjacent segments.
Optionally, the alignment module includes an identifier determining unit, configured to match curvatures of head and tail segments in each of the to-be-rendered hairlines in curvatures of the segments in the reference hairline, and determine segment identifiers in the reference hairline respectively matched with the head and tail segments in the to-be-rendered hairline according to a curvature matching result.
Optionally, the rendering module includes an identifier rendering unit, configured to determine color values respectively corresponding to head and tail segments in the hairline to be rendered according to segment identifiers in the reference hairline, where the head and tail segments in the hairline to be rendered are respectively matched, and color values corresponding to the segment identifiers in the reference hairline; and performing uniform transition rendering on the hairline to be rendered of each subsection based on the color values of the head and the tail subsections.
Optionally, the hair of the current object includes a plurality of groups of pre-divided hairlines; correspondingly, the determining module is specifically configured to determine, according to the station position of the current object, at least one group of displayed hairlines as the hairlines to be rendered.
Optionally, the reference hair is a hair located in a central region of the hair in each group of the hair, and meeting a preset length condition.
The color rendering device provided by the embodiment of the present invention can execute the color rendering method provided by any embodiment of the present invention, and has the functional modules and beneficial effects corresponding to the executed method.
It should be noted that, the units and modules included in the system are merely divided according to functional logic, but are not limited to the above division as long as the corresponding functions can be implemented; in addition, specific names of the functional units are only for convenience of distinguishing from each other, and are not used for limiting the protection scope of the embodiment of the invention.
EXAMPLE V
Fig. 7 is a schematic structural diagram of an electronic device according to a fifth embodiment of the present invention. FIG. 7 illustrates a block diagram of an exemplary electronic device 70 suitable for use in implementing embodiments of the present invention. The electronic device 70 shown in fig. 7 is only an example, and should not bring any limitation to the functions and the scope of use of the embodiment of the present invention.
As shown in fig. 7, electronic device 70 is embodied in the form of a general purpose computing device. The components of the electronic device 70 may include, but are not limited to: one or more processors or processing units 701, a system memory 702, and a bus 703 that couples various system components including the system memory 702 and the processing unit 701.
Bus 703 represents one or more of any of several types of bus structures, including a memory bus or memory controller, a peripheral bus, an accelerated graphics port, a processor, or a local bus using any of a variety of bus architectures. By way of example, such architectures include, but are not limited to, Industry Standard Architecture (ISA) bus, Micro Channel Architecture (MCA) bus, Enhanced ISA bus, Video Electronics Standards Association (VESA) local bus, and Peripheral Component Interconnect (PCI) bus.
Electronic device 70 typically includes a variety of computer system readable media. Such media may be any available media that is accessible by electronic device 70 and includes both volatile and nonvolatile media, removable and non-removable media.
The system memory 702 may include computer system readable media in the form of volatile memory, such as Random Access Memory (RAM) 704 and/or cache memory 705. The electronic device 70 may further include other removable/non-removable, volatile/nonvolatile computer system storage media. By way of example only, the storage system 706 may be used to read from and write to non-removable, nonvolatile magnetic media (not shown in FIG. 7, commonly referred to as a "hard drive"). Although not shown in FIG. 7, a magnetic disk drive for reading from and writing to a removable, nonvolatile magnetic disk (e.g., a "floppy disk") and an optical disk drive for reading from or writing to a removable, nonvolatile optical disk (e.g., a CD-ROM, DVD-ROM, or other optical media) may be provided. In these cases, each drive may be connected to bus 703 via one or more data media interfaces. Memory 702 may include at least one program product having a set (e.g., at least one) of program modules that are configured to carry out the functions of embodiments of the invention.
A program/utility 708 having a set (at least one) of program modules 707 may be stored, for example, in memory 702, such program modules 707 including, but not limited to, an operating system, one or more application programs, other program modules, and program data, each of which examples or some combination thereof may comprise an implementation of a network environment. The program modules 707 generally perform the functions and/or methodologies of the described embodiments of the invention.
The electronic device 70 may also communicate with one or more external devices 709 (e.g., keyboard, pointing device, display 710, etc.), with one or more devices that enable a user to interact with the electronic device 70, and/or with any devices (e.g., network card, modem, etc.) that enable the electronic device 70 to communicate with one or more other computing devices. Such communication may occur via an input/output (I/O) interface 711. Also, the electronic device 70 may communicate with one or more networks (e.g., a Local Area Network (LAN), a Wide Area Network (WAN), and/or a public network, such as the internet) via the network adapter 712. As shown, the network adapter 712 communicates with the other modules of the electronic device 70 over a bus 703. It should be appreciated that although not shown in FIG. 7, other hardware and/or software modules may be used in conjunction with electronic device 70, including but not limited to: microcode, device drivers, redundant processing units, external disk drive arrays, RAID systems, tape drives, and data backup storage systems, among others.
The processing unit 701 executes various functional applications and data processing by running a program stored in the system memory 702, for example, to implement a color rendering method provided in this embodiment, the method includes:
determining a reference hairline and a hairline to be rendered of a current object;
projecting each hairline to be rendered to a measurement space of a reference hairline based on current state data of each hairline to be rendered;
aligning the hairline to be rendered according to the similarity of the current state data of the hairline to be rendered and the reference hairline;
and performing color rendering on each hairline to be rendered based on the alignment relation between the hairline to be rendered and the reference hairline and the color of the reference hairline.
Of course, those skilled in the art can understand that the processor can also implement the technical solution of the color rendering method provided in any embodiment of the present invention.
EXAMPLE VI
The present embodiments provide a computer-readable storage medium having stored thereon a computer program which, when executed by a processor, implements the method steps of a color rendering method as provided in any of the embodiments of the present invention, the method comprising:
determining a reference hairline and a hairline to be rendered of a current object;
projecting each hairline to be rendered to a measurement space of a reference hairline based on current state data of each hairline to be rendered;
aligning the hairline to be rendered according to the similarity of the current state data of the hairline to be rendered and the reference hairline;
and performing color rendering on each hairline to be rendered based on the alignment relation between the hairline to be rendered and the reference hairline and the color of the reference hairline.
Computer storage media for embodiments of the invention may employ any combination of one or more computer-readable media. The computer readable medium may be a computer readable signal medium or a computer readable storage medium. A computer readable storage medium may be, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any combination of the foregoing. More specific examples (a non-exhaustive list) of the computer readable storage medium would include the following: an electrical connection having one or more wires, a portable computer diskette, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing. In the context of this document, a computer readable storage medium may be any tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device.
A computer readable signal medium may include a propagated data signal with computer readable program code embodied therein, for example, in baseband or as part of a carrier wave. Such a propagated data signal may take any of a variety of forms, including, but not limited to, electro-magnetic, optical, or any suitable combination thereof. A computer readable signal medium may also be any computer readable medium that is not a computer readable storage medium and that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device.
Program code embodied on a computer readable medium may be transmitted using any appropriate medium, including but not limited to wireless, wireline, optical fiber cable, RF, etc., or any suitable combination of the foregoing.
Computer program code for carrying out operations for embodiments of the present invention may be written in any combination of one or more programming languages, including an object oriented programming language such as Java, Smalltalk, C++ or the like and conventional procedural programming languages, such as the "C" programming language or similar programming languages. The program code may execute entirely on the user's computer, partly on the user's computer, as a stand-alone software package, partly on the user's computer and partly on a remote computer, or entirely on the remote computer or server. In the case of a remote computer, the remote computer may be connected to the user's computer through any type of network, including a Local Area Network (LAN) or a Wide Area Network (WAN), or the connection may be made to an external computer (for example, through the Internet using an Internet service provider).
It is to be noted that the foregoing is only illustrative of the preferred embodiments of the present invention and the technical principles employed. Those skilled in the art will appreciate that the present invention is not limited to the particular embodiments described herein, and that various obvious changes, rearrangements and substitutions will now be apparent to those skilled in the art without departing from the scope of the invention. Therefore, although the present invention has been described in greater detail by the above embodiments, the present invention is not limited to the above embodiments, and may include other equivalent embodiments without departing from the spirit of the present invention, and the scope of the present invention is determined by the scope of the appended claims.

Claims (12)

1. A color rendering method, comprising:
determining a reference hairline and a hairline to be rendered of a current object;
based on the current state data of each hairline to be rendered, projecting each hairline to be rendered to a measurement space of the reference hairline;
aligning the hairline to be rendered according to the similarity of the current state data of the hairline to be rendered and the reference hairline;
performing color rendering on each hairline to be rendered based on the alignment relation between the hairline to be rendered and the reference hairline and the color of the reference hairline;
the projecting each hair to be rendered to the metric space of the reference hair based on the current state data of each hair to be rendered comprises:
segmenting the hairlines to be rendered respectively according to a preset segmentation rule, calculating the curvature of each segment, and forming current state data based on the positions and the corresponding curvatures of the segments on the hairlines to be rendered;
and projecting the current state data of each hairline to be rendered to the metric space of the reference hairline.
2. The method of claim 1, further comprising, prior to aligning the hair to be rendered according to similarity of current state data of the hair to be rendered and the reference hair:
and segmenting the reference hairline according to a preset segmentation rule, and determining the curvature of each segment in the current state to form current state data of the reference hairline.
3. The method according to claim 1, wherein the aligning the hair to be rendered according to the similarity of the current state data of the hair to be rendered and the reference hair comprises:
and matching the curvatures of the sections in the hairlines to be rendered in the curvatures of the sections in the reference hairlines, and determining the section identifications in the reference hairlines matched with the sections in the hairlines to be rendered according to the curvature matching result.
4. The method according to claim 3, wherein the color rendering of each hair to be rendered based on the alignment relationship between the hair to be rendered and the reference hair and the color of the reference hair comprises:
determining color values corresponding to the segments in the hairline to be rendered according to segment identifiers in reference hairlines matched with the segments in the hairline to be rendered and color values corresponding to the segment identifiers in the reference hairlines;
rendering the hairline to be rendered based on the color values corresponding to the segments in the hairline to be rendered.
5. The method according to claim 4, wherein the rendering the hair to be rendered based on the color values corresponding to the segments of the hair to be rendered comprises:
and determining the color value corresponding to the segmentation as the color value of the segmentation node, and performing uniform transition rendering on each segmentation based on the color values of adjacent segments.
6. The method according to claim 1, wherein the aligning the hair to be rendered according to the similarity of the current state data of the hair to be rendered and the reference hair comprises:
and matching the curvatures of the head and tail sections in the hairlines to be rendered in the curvatures of the sections in the reference hairlines, and determining the section identifiers in the reference hairlines respectively matched with the head and tail sections in the hairlines to be rendered according to the curvature matching result.
7. The method according to claim 6, wherein the color rendering of each hair to be rendered based on the alignment relationship between the hair to be rendered and the reference hair and the color of the reference hair comprises:
determining color values respectively corresponding to head and tail sections in the hairline to be rendered according to the section identifiers in the reference hairline, which are respectively matched with the head and tail sections in the hairline to be rendered, and the color values corresponding to the section identifiers in the reference hairline;
and performing uniform transition rendering on the hairline to be rendered of each subsection based on the color values of the head and the tail subsections.
8. The method according to claim 1, wherein the hair of the current subject comprises a plurality of pre-divided sets of hair strands;
wherein the determining the hair to be rendered of the current object comprises:
and determining at least one group of displayed hairlines as to-be-rendered hairlines according to the station position of the current object.
9. The method according to claim 8, wherein the reference hair is a hair satisfying a predetermined length condition in a central region of the hair in each group of the hair.
10. A color rendering device is characterized by comprising:
the determining module is used for determining the reference hairline and the hairline to be rendered of the current object;
the projection module is used for projecting each hairline to be rendered to the measurement space of the reference hairline based on the current state data of each hairline to be rendered;
the alignment module is used for aligning the hairline to be rendered according to the similarity of the current state data of the hairline to be rendered and the reference hairline;
the rendering module is used for performing color rendering on each hairline to be rendered based on the alignment relation between the hairline to be rendered and the reference hairline and the color of the reference hairline;
the projection module comprises a forming unit and a projection unit;
the forming unit is used for segmenting each hairline to be rendered according to a preset segmentation rule, calculating the curvature of each segment, and forming current state data based on the position of each segment on the hairline to be rendered and the corresponding curvature;
the projection unit is used for projecting the current state data of each hair to be rendered to the measurement space of the reference hair.
11. An electronic device, characterized in that the electronic device comprises:
one or more processors;
a storage device for storing one or more programs,
when executed by the one or more processors, cause the one or more processors to implement the hair color rendering method of any one of claims 1-9.
12. A computer-readable storage medium on which a computer program is stored, which program, when executed by a processor, implements a hair color rendering method as claimed in any one of claims 1 to 9.
CN202011410290.9A 2020-12-04 2020-12-04 Color rendering method and device, electronic equipment and storage medium Active CN112465942B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202011410290.9A CN112465942B (en) 2020-12-04 2020-12-04 Color rendering method and device, electronic equipment and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202011410290.9A CN112465942B (en) 2020-12-04 2020-12-04 Color rendering method and device, electronic equipment and storage medium

Publications (2)

Publication Number Publication Date
CN112465942A CN112465942A (en) 2021-03-09
CN112465942B true CN112465942B (en) 2023-03-24

Family

ID=74805899

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202011410290.9A Active CN112465942B (en) 2020-12-04 2020-12-04 Color rendering method and device, electronic equipment and storage medium

Country Status (1)

Country Link
CN (1) CN112465942B (en)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113763521B (en) * 2021-09-16 2023-06-13 网易(杭州)网络有限公司 Hair model rendering method and device, electronic equipment and storage medium
CN114187633B (en) * 2021-12-07 2023-06-16 北京百度网讯科技有限公司 Image processing method and device, and training method and device for image generation model

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102982575A (en) * 2012-11-29 2013-03-20 杭州挪云科技有限公司 Hair rendering method based on ray tracking
CN104574479A (en) * 2015-01-07 2015-04-29 北京科艺有容科技有限责任公司 Rapid generating method for bird single feathers in three-dimensional animation
CN104756156A (en) * 2013-03-14 2015-07-01 梦工厂动画公司 Compressing data representing computer animated hair
CN107204036A (en) * 2016-03-16 2017-09-26 腾讯科技(深圳)有限公司 The method and apparatus for generating hair image

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
A dynamic hair simulation method; Tang Yong et al.; Computer Simulation (《计算机仿真》); 2006-07-28 (Issue 07); full text *
Design and implementation of a 3D hair modeling system; Zhang Jianming; Journal of Jiangsu University of Science and Technology (《江苏理工大学学报》); 1998-11-15 (Issue 06); full text *

Also Published As

Publication number Publication date
CN112465942A (en) 2021-03-09

Similar Documents

Publication Publication Date Title
JP7227292B2 (en) Virtual avatar generation method and device, electronic device, storage medium and computer program
CN110838162B (en) Vegetation rendering method and device, storage medium and electronic equipment
CN112465942B (en) Color rendering method and device, electronic equipment and storage medium
CN114187633B (en) Image processing method and device, and training method and device for image generation model
CN111340928B (en) Ray tracing-combined real-time hybrid rendering method and device for Web end and computer equipment
US20220012930A1 (en) Artificial Intelligence-Based Animation Character Control and Drive Method and Apparatus
CN108830385A (en) deep learning model training method and device and computer readable storage medium
CN114723888B (en) Three-dimensional hair model generation method, device, equipment, storage medium and product
CN112347288B (en) Vectorization method of word graph
CN114202597B (en) Image processing method and apparatus, device, medium and product
KR20230145197A (en) Methods, devices, computer devices and storage media for determining spatial relationships
CN112927328A (en) Expression migration method and device, electronic equipment and storage medium
CN113127697B (en) Method and system for optimizing graph layout, electronic device and readable storage medium
CN114187405A (en) Method, apparatus, device, medium and product for determining an avatar
CN112419430A (en) Animation playing method and device and computer equipment
CN112465943B (en) Color rendering method and device, electronic equipment and storage medium
CN112528707A (en) Image processing method, device, equipment and storage medium
CN112465944A (en) Color rendering method and device, electronic equipment and storage medium
CN115761196A (en) Method, device, equipment and medium for generating expression of object
CN112396680B (en) Method and device for making hair flow diagram, storage medium and computer equipment
CN111488768B (en) Style conversion method and device for face image, electronic equipment and storage medium
CN116863062A (en) Virtual makeup rendering method, device, equipment and storage medium
CN113608615B (en) Object data processing method, processing device, electronic device, and storage medium
CN109460511A (en) A kind of method, apparatus, electronic equipment and storage medium obtaining user's portrait
WO2023179091A1 (en) Three-dimensional model rendering method and apparatus, and device, storage medium and program product

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant