CN114926832A - Feature extraction model training method, material map processing method, device, and electronic device - Google Patents

Feature extraction model training method, material map processing method, device, and electronic device

Info

Publication number
CN114926832A
CN114926832A (application number CN202210524910.4A)
Authority
CN
China
Prior art keywords
sample
clustering
texture
maps
map
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202210524910.4A
Other languages
Chinese (zh)
Inventor
唐立军 (Tang Lijun)
唐忠樑 (Tang Zhongliang)
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Meiping Meiwu Shanghai Technology Co ltd
Original Assignee
Meiping Meiwu Shanghai Technology Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Meiping Meiwu Shanghai Technology Co ltd filed Critical Meiping Meiwu Shanghai Technology Co ltd
Priority to CN202210524910.4A priority Critical patent/CN114926832A/en
Publication of CN114926832A publication Critical patent/CN114926832A/en
Pending legal-status Critical Current

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00 Pattern recognition
    • G06F18/20 Analysing
    • G06F18/23 Clustering techniques
    • G06F18/24 Classification techniques
    • G06F18/241 Classification techniques relating to the classification model, e.g. parametric or non-parametric approaches
    • G06F18/2415 Classification techniques relating to the classification model based on parametric or probabilistic models, e.g. based on likelihood ratio or false acceptance rate versus a false rejection rate
    • G06N COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00 Computing arrangements based on biological models
    • G06N3/02 Neural networks
    • G06N3/04 Architecture, e.g. interconnection topology
    • G06N3/045 Combinations of networks
    • G06N3/047 Probabilistic or stochastic networks
    • G06N3/08 Learning methods
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T15/00 3D [Three Dimensional] image rendering
    • G06T15/005 General purpose rendering architectures
    • G06T2200/00 Indexing scheme for image data processing or generation, in general
    • G06T2200/04 Indexing scheme for image data processing or generation involving 3D image data

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Data Mining & Analysis (AREA)
  • General Physics & Mathematics (AREA)
  • Evolutionary Computation (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Artificial Intelligence (AREA)
  • General Engineering & Computer Science (AREA)
  • Computing Systems (AREA)
  • Software Systems (AREA)
  • Molecular Biology (AREA)
  • Computational Linguistics (AREA)
  • Biophysics (AREA)
  • Biomedical Technology (AREA)
  • Mathematical Physics (AREA)
  • General Health & Medical Sciences (AREA)
  • Health & Medical Sciences (AREA)
  • Bioinformatics & Cheminformatics (AREA)
  • Bioinformatics & Computational Biology (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Evolutionary Biology (AREA)
  • Probability & Statistics with Applications (AREA)
  • Computer Graphics (AREA)
  • Image Generation (AREA)

Abstract

The application provides a feature extraction model training method, a material map processing method, a device, and an electronic device. The feature extraction model training method includes the following steps: obtaining an initial sample data set, where the initial sample data set includes at least one sample data, and each sample data includes a sample material map and a hard label representing the material type to which the sample material map belongs; converting the hard label of each sample material map in the initial sample data set into a soft label related to the sample hierarchical clustering result of that sample material map, to obtain a training sample data set, where the soft label represents the probability that the sample material map belongs to each material type, and the sample hierarchical clustering result includes a color clustering result and/or a texture clustering result; and training a neural network model with the training sample data set to obtain a feature extraction model used to extract the material features of an image. The method and device improve the efficiency and accuracy of material map processing.

Description

Feature extraction model training method, material map processing method, device, and electronic device
Technical Field
The present application relates to computer technologies, and in particular, to a feature extraction model training method, a material map processing method, a device, and an electronic device.
Background
Taking three-dimensional (3D) modeling as an example, an electronic device may construct a 3D model framework in response to a user operation triggered through modeling software, and then assign a target material map to the 3D model framework so that the 3D model has the visual appearance (color, texture, roughness, and so on) corresponding to the target material map. During modeling, before assigning a target material map to the 3D model framework, the electronic device must first determine the target material map.
In the related art, the target material map is determined mainly by the user visually identifying and searching for the required map among the massive number of material maps in a material map library. This approach suffers from low efficiency and poor accuracy.
Disclosure of Invention
The application provides a feature extraction model training method, a material map processing method, a device, and an electronic device, aiming to solve the low efficiency and poor accuracy of existing material map processing.
In a first aspect, the present application provides a method for training a feature extraction model, where the method includes:
obtaining an initial sample data set, where the initial sample data set includes at least one sample data, and each sample data includes a sample material map and a hard label of the sample material map; the hard label represents the material type to which the sample material map belongs;
converting the hard label of each sample material map in the initial sample data set into a soft label to obtain a training sample data set; the soft label represents the probability that the sample material map belongs to each material type; the soft label is associated with a sample hierarchical clustering result of the sample material map, and the sample hierarchical clustering result includes a color clustering result and/or a texture clustering result;
and training a neural network model with the training sample data set to obtain a feature extraction model, where the feature extraction model is used to extract the material features of an image.
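As an illustration of why soft labels matter during training, the loss against a soft target can be sketched in plain Python. This is an assumed cross-entropy formulation for illustration only; the patent does not specify the exact loss function:

```python
import math

def soft_cross_entropy(pred_probs, soft_label):
    """Cross-entropy between the model's predicted distribution and a soft
    label. Unlike a one-hot hard label, a soft target also rewards placing
    probability mass on visually similar material types."""
    eps = 1e-12  # guard against log(0)
    return -sum(t * math.log(p + eps) for t, p in zip(soft_label, pred_probs))

soft_label = [0.6, 0.2, 0.2]        # probabilities over 3 material types
good_pred  = [0.55, 0.25, 0.20]     # close to the soft target
bad_pred   = [0.05, 0.05, 0.90]     # confidently wrong
assert soft_cross_entropy(good_pred, soft_label) < soft_cross_entropy(bad_pred, soft_label)
```

A prediction that spreads its probability mass in proportion to the soft label incurs a lower loss than a confidently wrong one, which is the behavior the training step above relies on.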
Optionally, the sample hierarchical clustering result includes: a first sample clustering result obtained by clustering the sample material maps by color, a second sample clustering result obtained by clustering the sample material maps by texture, and a third sample clustering result obtained by clustering the sample material maps by both color and texture; the first and second sample clustering results are first-level results in the sample hierarchical clustering result, and the third sample clustering result is a second-level result in the sample hierarchical clustering result;
converting the hard label of each sample material map in the initial sample data set into a soft label to obtain a training sample data set includes:
obtaining the soft label of each sample material map according to the first, second, and third sample clustering results and the hard label of each sample material map;
and obtaining the training sample data set from each sample material map and its soft label.
Optionally, obtaining the soft label of each sample material map according to the first, second, and third sample clustering results and the hard label of each sample material map includes:
for each sample material map, determining an initial soft label of the sample material map according to its hard label and a preset soft-label probability distribution mode;
and adjusting the probabilities in the initial soft label of the sample material map according to the first, second, and third sample clustering results to obtain the soft label of the sample material map.
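The two-step conversion above (an initial smoothed soft label, then a clustering-based adjustment) could be sketched as follows. The smoothing value, the same-cluster bonus, and the type-to-cluster mapping are illustrative assumptions, not the patent's actual probability distribution mode:

```python
def make_soft_label(hard_idx, num_types, cluster_of, same_cluster_bonus=0.1, smoothing=0.1):
    """Sketch: smooth the one-hot hard label into an initial soft label,
    then shift probability toward material types whose maps fall in the
    same (color/texture) cluster as the labeled type, and renormalize."""
    base = smoothing / num_types
    soft = [base] * num_types                 # initial soft label (smoothed)
    soft[hard_idx] += 1.0 - smoothing
    for i in range(num_types):                # clustering-based adjustment
        if i != hard_idx and cluster_of[i] == cluster_of[hard_idx]:
            soft[i] += same_cluster_bonus
    total = sum(soft)
    return [p / total for p in soft]          # keep it a valid distribution

cluster_of = {0: "A", 1: "A", 2: "B"}         # type -> cluster id (assumed)
soft = make_soft_label(0, 3, cluster_of)
assert soft[1] > soft[2]                      # same-cluster type gets more mass
assert abs(sum(soft) - 1.0) < 1e-9
```

The labeled type keeps most of the mass, while a type sharing a cluster with it ends up with more probability than an unrelated type, matching the intent that visually similar materials receive nonzero probability.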
Optionally, the method further includes:
obtaining the first sample clustering result and the second sample clustering result;
and obtaining the third sample clustering result from the first sample clustering result and the second sample clustering result.
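Deriving the third (second-level) clustering from the two first-level clusterings can be illustrated by pairing each sample's color-cluster and texture-cluster assignments. This exact pairing rule is an assumption for illustration; the patent does not prescribe it:

```python
def second_level_clusters(color_cluster, texture_cluster):
    """Sketch: derive the second-level (color + texture) clustering by
    assigning one combined cluster id per distinct pair of first-level
    (color, texture) cluster assignments."""
    pairs = {}
    combined = []
    for c, t in zip(color_cluster, texture_cluster):
        key = (c, t)
        if key not in pairs:
            pairs[key] = len(pairs)   # new combined cluster id
        combined.append(pairs[key])
    return combined

color_cluster   = [0, 0, 1, 1]   # first sample clustering result (by color)
texture_cluster = [0, 1, 1, 1]   # second sample clustering result (by texture)
assert second_level_clusters(color_cluster, texture_cluster) == [0, 1, 2, 2]
```

Samples end up in the same second-level cluster only when they agree on both color and texture, which refines each first-level clustering.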
In a second aspect, the present application provides a material map processing method, including:
acquiring a target image;
inputting the target image into a feature extraction model to obtain the material features of the target image, where the feature extraction model is obtained by the method of any one of the first aspect;
and acquiring K material maps matching the material of the target image according to the material features of the target image, the material features of each material map in a material map library extracted with the feature extraction model, and the color- and texture-based hierarchical clustering result of the material maps in the library, where K is an integer greater than or equal to 1.
Optionally, acquiring the K material maps matching the material of the target image according to the material features of the target image, the material features of each material map in the material map library extracted with the feature extraction model, and the color- and texture-based hierarchical clustering result of the material maps in the library includes:
acquiring the similarity between the target image and each material map according to the material features of the target image and the material features of each material map in the library;
obtaining the first N material maps in descending order of similarity, where N is an integer greater than or equal to 2;
acquiring clustering scores of the N material maps according to the color- and texture-based hierarchical clustering result of the material maps in the library;
and taking the first K material maps in descending order of clustering score as the material maps matching the material of the target image.
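The similarity-ranking step can be sketched with cosine similarity over the extracted material features. Cosine similarity is an assumption for illustration; the patent does not fix the similarity measure:

```python
import math

def cosine(a, b):
    """Cosine similarity between two feature vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb)

def top_n_by_similarity(query_feat, library_feats, n):
    """Rank library material maps by feature similarity to the target
    image and keep the first N (before cluster-score re-ranking)."""
    sims = [(i, cosine(query_feat, f)) for i, f in enumerate(library_feats)]
    sims.sort(key=lambda t: t[1], reverse=True)
    return sims[:n]

query = [1.0, 0.0]                                  # target image features (toy)
library = [[0.9, 0.1], [0.0, 1.0], [1.0, 0.05]]     # library map features (toy)
top2 = top_n_by_similarity(query, library, 2)
assert [i for i, _ in top2] == [2, 0]               # most similar maps first
```

The N survivors of this ranking are then re-scored with the clustering results, as described in the following steps.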
Optionally, the hierarchical clustering result includes: a first clustering result obtained by clustering the material maps by color, a second clustering result obtained by clustering the material maps by texture, and a third clustering result obtained by clustering the material maps by both color and texture; the first and second clustering results are first-level results in the hierarchical clustering result, and the third clustering result is a second-level result in the hierarchical clustering result;
acquiring the clustering scores of the N material maps according to the color- and texture-based hierarchical clustering result of the material maps in the library includes:
obtaining initial clustering scores of the N material maps according to their similarity ranking;
obtaining first clustering scores of the N material maps from their initial clustering scores and the first clustering result;
obtaining second clustering scores of the N material maps from their initial clustering scores and the second clustering result;
obtaining third clustering scores of the N material maps from their initial clustering scores and the third clustering result;
and obtaining the clustering scores of the N material maps from the initial, first, second, and third clustering scores of the N material maps.
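One way to read the scoring steps above: each candidate's per-clustering score aggregates the initial scores of all candidates sharing its cluster, so maps from clusters that dominate the top-N ranking are promoted. A minimal sketch under that assumed aggregation rule (the patent does not specify the exact formula):

```python
def cluster_scores(initial, cluster_ids):
    """Sketch: give each candidate map a per-clustering score equal to the
    summed initial scores of all candidates in its cluster."""
    totals = {}
    for s, c in zip(initial, cluster_ids):
        totals[c] = totals.get(c, 0.0) + s
    return [totals[c] for c in cluster_ids]

# Initial scores from the similarity ranking (higher rank -> higher score).
initial    = [3.0, 2.0, 1.0]       # N = 3 candidate maps
color_cl   = [0, 0, 1]             # first clustering result (color)
texture_cl = [0, 1, 1]             # second clustering result (texture)
both_cl    = [0, 1, 2]             # third clustering result (color + texture)
final = [i + a + b + c for i, a, b, c in zip(
    initial,
    cluster_scores(initial, color_cl),
    cluster_scores(initial, texture_cl),
    cluster_scores(initial, both_cl))]
assert final.index(max(final)) == 0   # top-ranked map also wins after re-scoring
```

Here the final score is simply the sum of the initial score and the three per-clustering scores; any monotone combination would serve the same illustrative purpose.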
Optionally, after the K material maps matching the material of the target image are acquired, the method further includes:
outputting the K material maps.
Optionally, after the K material maps matching the material of the target image are acquired, the method further includes:
determining a target material map from the K material maps;
rendering a model framework of a target object with the target material map to obtain a model of the target object;
and outputting the model of the target object.
Optionally, before the model framework of the target object is rendered with the target material map to obtain the model of the target object, the method further includes:
constructing the model framework of the target object.
In a third aspect, the present application provides a feature extraction model training apparatus, including:
an obtaining module configured to obtain an initial sample data set, where the initial sample data set includes at least one sample data, each sample data including a sample material map and a hard label of the sample material map, the hard label representing the material type to which the sample material map belongs;
a processing module configured to convert the hard label of each sample material map in the initial sample data set into a soft label to obtain a training sample data set, where the soft label represents the probability that the sample material map belongs to each material type and is associated with a sample hierarchical clustering result of the sample material map, the sample hierarchical clustering result including a color clustering result and/or a texture clustering result;
and a training module configured to train a neural network model with the training sample data set to obtain a feature extraction model, where the feature extraction model is used to extract the material features of an image.
In a fourth aspect, the present application provides a material map processing apparatus, including:
an acquisition module configured to acquire a target image;
and a processing module configured to input the target image into a feature extraction model to obtain the material features of the target image, and to acquire K material maps matching the material of the target image according to the material features of the target image, the material features of each material map in a material map library extracted with the feature extraction model, and the color- and texture-based hierarchical clustering result of the material maps in the library, where the feature extraction model is obtained by the method of any one of the first aspect, and K is an integer greater than or equal to 1.
In a fifth aspect, the present application provides an electronic device including a memory and a processor, the memory storing a set of computer instructions;
the processor executes the set of computer instructions stored in the memory to perform the method of any one of the first or second aspects.
In a sixth aspect, the present application provides a computer-readable storage medium having stored thereon computer-executable instructions that, when executed by a processor, implement the method of any one of the first or second aspects.
In a seventh aspect, the present application provides a computer program product comprising a computer program that, when executed by a processor, implements the method of any of the first or second aspects.
In the feature extraction model training method, the material map processing method, the apparatus, and the electronic device provided by this application, the hard label of each sample material map is converted into a soft label related to the sample hierarchical clustering result of that map, and the neural network model is trained with these soft labels. Because the soft labels account for how small the differences between the material features of sample material maps can be, this training process is better suited to discriminating material features than training with hard labels, which improves the accuracy of the feature extraction model obtained by training on the training sample data set and, in turn, the accuracy of material map processing based on that model. Moreover, because the clustering is based on color and texture, the per-type probabilities represented by each soft label agree with human visual intuition, which improves the accuracy of the soft labels themselves and further improves the accuracy of the trained feature extraction model and of the material map processing based on it.
Drawings
In order to more clearly illustrate the technical solutions in the present application or the prior art, the drawings required for the embodiments or the description of the prior art are briefly introduced below, and it is obvious that the drawings in the following description are some embodiments of the present application, and it is obvious for those skilled in the art to obtain other drawings based on these drawings without inventive labor.
FIG. 1 is an exemplary diagram of a standard material map;
FIG. 2 is an exemplary diagram of a non-standard material map;
FIG. 3a is a schematic diagram of a 3D modeling scene;
FIG. 3b is a schematic diagram of an application scenario of a material map processing system provided by the present application;
FIG. 3c is a schematic diagram of an application scenario of another material map processing system provided by the present application;
FIG. 3d is a diagram illustrating a hardware configuration of an electronic device 10 in which a material map processing system is deployed;
FIG. 4 is a schematic flowchart of a feature extraction model training method provided by the present application;
FIG. 5 is a schematic flowchart of a method for obtaining a soft label for a sample material map provided by the present application;
FIG. 6 is a schematic flowchart of a method for obtaining a sample hierarchical clustering result provided by the present application;
FIG. 7 is a schematic diagram of a neural network model provided by the present application;
FIG. 8 is a schematic flowchart of a material map processing method provided by the present application;
FIG. 9 is a schematic flowchart of a method for obtaining K material maps matching the material of a target image provided by the present application;
FIG. 10 is a schematic interface diagram of a material map processing system provided by the present application;
FIG. 11 is a schematic structural diagram of a feature extraction model training apparatus provided by the present application;
FIG. 12 is a schematic structural diagram of a material map processing apparatus provided by the present application.
With the above figures, there are shown specific embodiments of the present application, which will be described in more detail below. These drawings and written description are not intended to limit the scope of the inventive concepts in any manner, but rather to illustrate the inventive concepts to those skilled in the art by reference to specific embodiments.
Detailed Description
To make the purpose, technical solutions and advantages of the present application clearer, the technical solutions in the present application will be clearly and completely described below with reference to the drawings in the present application, and it is obvious that the described embodiments are some, but not all embodiments of the present application. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present application.
The following first explains the terms involved in this application:
material mapping: a texture map is an image. The texture map is a visual representation of the properties of the object, such as color, transparency, reflectivity intensity, self-luminescence, roughness, and the like. The term "material" as used herein may refer to a combination of material and texture. By way of example, the materials may be, for example: wood, plastic, leather, and the like. Texture can be, for example, the visual appearance of various characteristics of an object, such as color, texture, transparency, retroreflectivity, roughness, and the like. The material and texture of the same material map are the same. And/or, the two different material maps exist in the texture are the two different material maps.
Material maps may include standard material maps and non-standard material maps. For example, fig. 1 is an exemplary diagram of standard material maps. As shown in FIG. 1, material map 1, material map 2, and material map 3 are standard material maps in three different presentation forms. Material map 3 and material map 4 share the same presentation form and correspond to standard material maps of two different material types. A standard material map here is a material map presented in the form of a material ball; the material ball is a relatively complete display of the material map.
For example, fig. 2 is an exemplary diagram of non-standard material maps. As shown in FIG. 2, material map 5, material map 6, material map 7, and material map 8 are non-standard material maps of four different materials. A non-standard material map may be obtained, for example, by taking a screenshot of an image that includes an object.
Material map library: a material map library may include material maps of multiple material types. In some embodiments, the material maps in the library are essentially the standard material maps described above. Alternatively, the library may include both standard and non-standard material maps. In some embodiments, the library may also include only non-standard material maps.
Hierarchical classification: hierarchical classification gradually divides the set of objects to be classified into several corresponding hierarchical categories according to selected attributes or characteristics used as the dividing reference or classification marks, organizing them into a step-by-step hierarchical classification system. It is typically expressed as large classes, medium classes, small classes, fine classes, and so on, with each level divided into several categories. Categories at the same level of the same branch are parallel to each other, while categories at different levels form subordinate relations.
Hard label: a hard label represents the uniquely determined material type of a material map, i.e., the probability that the map belongs to that type is 100%.
Soft label: a soft label represents the probability that a material map belongs to each material type. For example, the soft label of material map A may represent that map A belongs to material type 1 with 60% probability, to material type 2 with 20% probability, and to material type 3 with 20% probability.
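The difference between the two label types can be made concrete with a small sketch (illustrative only; the function and variable names are assumptions):

```python
def hard_to_one_hot(label_index, num_types):
    """A hard label says the map belongs to exactly one material type,
    i.e., a one-hot probability vector."""
    return [1.0 if i == label_index else 0.0 for i in range(num_types)]

# Hard label: "material type 1" out of 3 types.
hard = hard_to_one_hot(0, 3)            # [1.0, 0.0, 0.0]

# The soft label from the example above spreads probability mass across
# the types while still summing to 1.
soft = [0.60, 0.20, 0.20]
assert abs(sum(soft) - 1.0) < 1e-9      # a soft label is still a distribution
assert abs(sum(hard) - 1.0) < 1e-9      # and so is a hard label
```

Both labels are probability distributions over material types; the hard label is simply the degenerate case with all mass on one type.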
Illustratively, FIG. 3a is a schematic diagram of a 3D modeling scene. As shown in fig. 3a, an electronic device may build a 3D model framework (e.g., the cuboid shown in fig. 3a) through 3D modeling software in response to a user-triggered modeling operation. The electronic device can then assign a target material map to the 3D model framework through the modeling software, so that the 3D model has the visual appearance (color, texture, roughness, and so on) corresponding to the target material map. Taking material map 2 of fig. 1 as the target material map, assigning map 2 to the cuboid 3D model framework of fig. 3a gives the cuboid's surface the same texture, color, reflectivity, roughness, and other visual appearance as map 2.
During modeling, before assigning a target material map to the 3D model framework, the electronic device must first determine the target material map.
In the related art, the target material map is determined mainly by the user visually identifying and searching for the required map among the massive number of material maps in a material map library. Because the library contains a very large number of maps, the user may need a long time to find the desired target map. Moreover, many material maps differ only slightly; for example, two maps may share the same color and texture and differ only in the rendered roughness, reflectivity, and similar visual properties, which can lead the user to select the wrong target map. The approach therefore suffers from low efficiency and poor accuracy.
The related art also provides a method of searching a material map library by material map name. However, this requires the user to clearly know the names of all the material maps to find the desired target map. If the user knows little about material maps, the user may not know the names of various textures (e.g., litchi grain, moire, waterfall, hill) or materials (e.g., marble, leather, cotton, silk, wool). This method therefore also suffers from low efficiency and poor accuracy.
The root cause of the above problems with existing material map processing methods is reliance on the user's ability to distinguish among numerous material maps. The present application therefore provides a method that automatically retrieves the material maps a user needs, without requiring the user to distinguish among them, improving both the efficiency and the accuracy of obtaining the required material map.
It should be understood that this application does not limit the application scenario of the material map processing method. For example, the method may serve as part of a modeling workflow in a modeling service: a material map it determines may be used to render the 3D model framework described above to obtain a 3D model, or to render a two-dimensional model framework to obtain a two-dimensional model. Alternatively, the method may be applied to any scenario beyond modeling services that uses material maps, for example a material map providing service that outputs material maps matching the material in a user-supplied target image.
In addition, it should be understood that this application does not limit the execution subject of the material map processing method. Optionally, the execution subject may be a material map processing system. It should also be understood that this application does not preclude the material map processing system from providing other services (e.g., the aforementioned 3D modeling service) besides the material map processing service.
Fig. 3b is a schematic diagram of an application scenario of a material map processing system provided by the present application. As shown in fig. 3b, in an embodiment, the material map processing system may be fully deployed in a cloud environment, an entity that provides cloud services to users with basic resources in a cloud computing mode. A cloud environment includes a cloud data center, which holds a large number of infrastructure resources (computing, storage, and network resources) owned by a cloud service provider, and a cloud service platform; the computing resources in the cloud data center may be a large number of electronic devices (e.g., servers). For example, if the computing resources are servers running virtual machines, the material map processing system may be deployed independently on servers or virtual machines in the cloud data center, or deployed in a distributed manner across multiple servers, multiple virtual machines, or a combination of servers and virtual machines in the cloud data center.
As shown in FIG. 3b, the texture map processing system can be abstracted by the cloud service provider into a texture map processing service on the cloud service platform and provided to users. When a user uses the texture map processing service, a target image can be specified through an Application Programming Interface (API) or a Graphical User Interface (GUI); the texture map processing system in the cloud environment receives the target image input by the user, performs the texture map processing operation, and returns the automatically determined texture map matching the target image to the user through the API or GUI. The texture map may be downloaded by the user or used online to complete a specific task (e.g., rendering a 3D model frame to obtain a three-dimensional model).
FIG. 3c is a schematic view of an application scenario of another texture map processing system according to the present application. The texture map processing system provided by the present application can be deployed more flexibly: as shown in fig. 3c, in another embodiment, it may also be deployed in different environments in a distributed manner. The texture map processing system can be logically divided into multiple parts, each with a different function, and each part may be deployed in any two or three of a terminal electronic device (located on the user side), an edge environment, and a cloud environment. The terminal electronic device on the user side may include, for example, at least one of: a terminal server, a smartphone, a notebook computer, a tablet computer, a personal desktop computer, and the like. The edge environment is an environment comprising a set of edge electronic devices close to the terminal electronic devices; the edge electronic devices include edge servers, edge kiosks with computing power, and the like. The parts of the texture map processing system deployed in different environments or devices cooperate to provide users with the function of automatic texture map processing. It should be understood that the present application does not restrictively divide which parts of the texture map processing system are deployed in which environment; in actual application, the deployment may be adapted according to the computing capability of the terminal electronic device, the resource occupation of the edge environment and the cloud environment, or specific application requirements. FIG. 3c shows, as an example, an application scenario in which the texture map processing system is deployed partly in an edge environment and partly in a cloud environment.
The texture map processing system can also be deployed on an electronic device in any single environment (for example, on an edge server in an edge environment). Fig. 3d is a schematic diagram of a hardware structure of an electronic device 10 running the texture map processing system. As shown in fig. 3d, the electronic device 10 includes a memory 11, a processor 12, and a communication interface 13, which are communicatively connected to each other; for example, they may be connected by a network connection. Alternatively, the electronic device 10 may further include a bus 14, through which the memory 11, the processor 12, and the communication interface 13 are communicatively connected to each other, as shown in fig. 3d.
The memory 11 may be a Read-Only Memory (ROM), a static storage device, a dynamic storage device, or a Random Access Memory (RAM). The memory 11 may store a program; when the program stored in the memory 11 is executed by the processor 12, the processor 12 and the communication interface 13 are used to perform the method by which the texture map processing system automatically provides a texture map to a user.
The processor 12 may be a general-purpose Central Processing Unit (CPU), a microprocessor, an Application-Specific Integrated Circuit (ASIC), a Graphics Processing Unit (GPU), or one or more integrated circuits.
The processor 12 may also be an integrated circuit chip with signal processing capability. In implementation, the functions of the texture map processing system of the present application may be completed by integrated logic circuits of hardware or by instructions in the form of software in the processor 12. The processor 12 may also be a general-purpose processor, a Digital Signal Processor (DSP), an Application-Specific Integrated Circuit (ASIC), a Field Programmable Gate Array (FPGA) or other programmable logic device, a discrete gate or transistor logic device, or discrete hardware components, and may implement or perform the methods, steps, and logic blocks disclosed in the embodiments of the present application below. A general-purpose processor may be a microprocessor, or any conventional processor. The steps of the methods disclosed in connection with the embodiments described below may be embodied directly as being executed by a hardware decoding processor, or executed by a combination of hardware and software modules in a decoding processor. The software module may be located in a storage medium well established in the art, such as RAM, flash memory, ROM, PROM, EPROM, or registers. This storage medium is located in the memory 11; the processor 12 reads the information in the memory 11 and completes the functions of the texture map processing system of the present application in combination with its hardware.
The communication interface 13 enables communication between the electronic device 10 and other devices or communication networks using transceiver modules such as, but not limited to, transceivers. For example, the data set may be acquired through the communication interface 13.
When electronic device 10 includes bus 14, bus 14 may include a pathway for communicating information between various components of electronic device 10 (e.g., memory 11, processor 12, communication interface 13).
The texture map processing method provided by the application determines the texture map matching the material of a target image based on the material features of the target image and the material features of each texture map in the texture map library. Both the material features of the target image and those of each texture map in the texture map library are obtained by feature extraction based on the feature extraction model.
Therefore, the following first describes the feature extraction model training scheme provided in the present application in detail with reference to specific embodiments. It should be understood that the subject executing the feature extraction model training method may be the texture mapping processing system, or may be any electronic device with processing function independent of the texture mapping processing system. The following is a detailed description of the technical solution of the feature extraction model training, taking an execution subject of the feature extraction model training method as any electronic device with a processing function as an example. The following several specific embodiments may be combined with each other, and details of the same or similar concepts or processes may not be repeated in some embodiments.
Fig. 4 is a schematic flow chart of a feature extraction model training method provided in the present application. As shown in fig. 4, the method comprises the steps of:
s101, obtaining an initial sample data set.
The initial set of sample data may include: at least one sample data. Wherein each sample data may comprise: a sample texture map, and a hard label for the sample texture map. The hard label is used for representing the material type of the sample material mapping.
Optionally, a sample texture map may be a standard texture map or a non-standard texture map. For example, some of the sample texture maps included in the initial sample data set may be standard texture maps and the rest non-standard texture maps; it should be understood that the present application does not limit the numbers of standard and non-standard texture maps.
For example, taking the case in which there are five material types as an example, the hard labels of the sample texture maps may be as shown in Table 1 below:
TABLE 1
Sample texture map      Hard label
Class 1                 (1, 0, 0, 0, 0)
Class 2                 (0, 1, 0, 0, 0)
Class 3                 (0, 0, 1, 0, 0)
Class 4                 (0, 0, 0, 1, 0)
Class 5                 (0, 0, 0, 0, 1)
As shown in Table 1, taking the hard label (1, 0, 0, 0, 0) as an example, this hard label indicates that the class 1 sample texture map belongs to the class 1 material type.
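The one-hot hard labels of Table 1 can be sketched programmatically. The following is a minimal illustration (not part of the patent; the class count of five follows the example above):

```python
def hard_label(material_class: int, num_classes: int = 5) -> tuple:
    """Return the one-hot hard label for a 1-indexed material class,
    as in Table 1: a 1 at the class position, 0 elsewhere."""
    return tuple(1 if k == material_class - 1 else 0 for k in range(num_classes))

print(hard_label(1))  # (1, 0, 0, 0, 0)
print(hard_label(5))  # (0, 0, 0, 0, 1)
```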
Optionally, the electronic device may receive the initial sample data set input by the user through an API, a GUI, or the like, for example. Alternatively, the electronic device may obtain the initial sample data set from, for example, a server or a database in which the initial sample data set is stored. Still alternatively, the initial sample data set may also be pre-stored in the electronic device.
S102, converting the hard labels of the sample texture mapping in the initial sample data set into soft labels to obtain a training sample data set.
The soft label is used for representing the probability that the sample texture map belongs to each material type, and is related to the sample hierarchical clustering result of the sample texture maps. The sample hierarchical clustering result includes: color clustering results and/or texture clustering results. Each sample data in the training sample data set may include a sample texture map and the soft label of that sample texture map.
For example, and again taking the example of the above material types including five material types, the soft label of the sample material mapping may be as shown in table 2 below:
TABLE 2
Sample texture map      Soft label
Class 1                 (0.8, 0.1, 0.05, 0.05, 0)
Class 2                 (0.1, 0.9, 0, 0, 0)
Class 3                 (0, 0.1, 0.9, 0, 0)
Class 4                 (0.05, 0.1, 0.05, 0.8, 0)
Class 5                 (0, 0.1, 0.05, 0.05, 0.8)
As shown in Table 2, taking the soft label (0.8, 0.1, 0.05, 0.05, 0) as an example, this soft label indicates that the class 1 sample texture map belongs to the class 1 material type with probability 80%, to the class 2 material type with probability 10%, to the class 3 material type with probability 5%, to the class 4 material type with probability 5%, and to the class 5 material type with probability 0.
In some embodiments, the sample hierarchical clustering result may include a first-level sample clustering result and a second-level sample clustering result. The first sample clustering result, obtained by clustering the sample texture maps based on color, and the second sample clustering result, obtained by clustering the sample texture maps based on texture, can serve as the first-level sample clustering result. In the first sample clustering result, sample texture maps with the same color are grouped into one class; in the second sample clustering result, sample texture maps with the same texture are grouped into one class. The third sample clustering result, obtained by clustering the sample texture maps based on both color and texture, can serve as the second-level sample clustering result; in it, sample texture maps with the same color and texture are grouped into one class.
Optionally, the first-level sample clustering result in this application may include: a first sample clustering result and a second sample clustering result. Alternatively, in some embodiments, the first-level sample clustering result may further include only: one of the first sample clustering result and the second sample clustering result.
Optionally, the electronic device may, for example, adjust the hard label of each sample material mapping in the initial sample data set according to the sample hierarchical clustering result to obtain the soft label of each sample material mapping, so as to obtain the training sample data set.
S103, training the neural network model by using the training sample data set to obtain a feature extraction model.
The feature extraction model is used for extracting material features of an image. It should be understood that the present application does not limit the neural network model; as an example, it may be a Residual Network (ResNet).
It should be understood that the present application is not limited to how the electronic device performs the neural network model training using the training sample data set. Optionally, the neural network model may be trained by referring to an existing neural network model training method to obtain a feature extraction model.
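Although the patent does not fix a training loss, one common choice when training with soft labels is the cross-entropy between the soft label (as target distribution) and the model's predicted distribution. A minimal sketch under that assumption, using the Table 2 label as the target:

```python
import math

def soft_cross_entropy(soft_label, pred_probs, eps=1e-12):
    """Cross-entropy H(q, p) = -sum_k q_k * log(p_k) between a soft target
    distribution q and a predicted distribution p; eps guards log(0)."""
    return -sum(q * math.log(p + eps) for q, p in zip(soft_label, pred_probs))

target = (0.8, 0.1, 0.05, 0.05, 0)    # soft label of the class 1 map (Table 2)
good = (0.8, 0.1, 0.05, 0.05, 0)      # prediction matching the label
uniform = (0.2, 0.2, 0.2, 0.2, 0.2)   # uninformative prediction

# The loss is smallest when the prediction matches the soft label.
assert soft_cross_entropy(target, good) < soft_cross_entropy(target, uniform)
```

In an actual training loop this loss would be minimized over the training sample data set by gradient descent on the neural network's parameters.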
In this embodiment, the hard label of each sample texture map is converted into a soft label related to the sample hierarchical clustering result of the sample texture maps, and the neural network model is trained based on these soft labels. Because the soft labels account for the small differences in material characteristics between sample texture maps, the training process fits the actual distribution of material characteristics better than training with hard labels. This improves the accuracy of the feature extraction model obtained by training on the training sample data set, and in turn the accuracy of texture map processing based on that model. Moreover, because the soft labels are derived from clustering results based on color and texture, the probability that a sample texture map belongs to each material type, as represented by its soft label, is consistent with human artistic intuition. This improves the accuracy of the determined soft labels, which further improves the accuracy of the feature extraction model and of the texture map processing based on it.
How the electronic equipment converts the hard label of the sample material mapping in the initial sample data set into the soft label to obtain a training sample data set is described in detail below:
As one possible implementation, when the sample hierarchical clustering result includes the first sample clustering result, the second sample clustering result, and the third sample clustering result, the electronic device may, for example, obtain the soft label of each sample texture map according to the first sample clustering result, the second sample clustering result, the third sample clustering result, and the hard label of each sample texture map, and then obtain the training sample data set using each sample texture map and its soft label.
Optionally, after obtaining the soft label of each sample texture map, the electronic device may directly use each sample texture map together with its soft label as the training sample data set.
Alternatively, the electronic device may also use the preprocessed sample texture map and the soft label of each sample texture map to obtain a training sample data set, for example, after preprocessing the sample texture map. For example, the preprocessing may be, for example, performing processing operations such as translation and rotation on the sample texture map to improve the diversity of the training sample data set, and further improve the accuracy of the feature extraction model obtained by training based on the training sample data set, so as to improve the accuracy of performing texture map processing on the texture features of the extracted image.
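The translation and rotation preprocessing mentioned above can be sketched on a raw pixel grid. A minimal pure-Python illustration (a real system would use an image library; the grid values below are hypothetical):

```python
def rotate90(image):
    """Rotate a 2D pixel grid 90 degrees clockwise."""
    return [list(row) for row in zip(*image[::-1])]

def translate_right(image, dx, fill=0):
    """Shift every row of the grid dx pixels to the right, padding with fill."""
    return [[fill] * dx + row[:len(row) - dx] for row in image]

grid = [[1, 2],
        [3, 4]]
print(rotate90(grid))            # [[3, 1], [4, 2]]
print(translate_right(grid, 1))  # [[0, 1], [0, 3]]
```

Applying such transforms to each sample texture map yields additional training samples that share the original map's soft label, which is one way to realize the diversity improvement described above.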
Through this implementation, the electronic device can determine the soft label of each sample texture map based on the first sample clustering result corresponding to the first-level color clustering, the second sample clustering result corresponding to the first-level texture clustering, and the third sample clustering result corresponding to the second-level color-and-texture clustering. This fully accounts for the importance of the color and texture differences among the sample texture maps and improves the accuracy of determining their soft labels.
As another possible implementation, the sample hierarchical clustering result includes the first-level sample clustering result and the second-level sample clustering result, where the first-level sample clustering result includes only one of the first sample clustering result and the second sample clustering result. In this case, the specific implementation of obtaining the training sample data set according to the sample hierarchical clustering result and the hard label of each sample texture map may refer to the above embodiment, and details are not repeated here.
Taking the case in which the sample hierarchical clustering result includes the first sample clustering result, the second sample clustering result, and the third sample clustering result as an example, how the electronic device obtains the soft label of each sample texture map according to these three clustering results and the hard label of each sample texture map is explained in detail below:
as a first possible implementation manner, fig. 5 is a schematic flowchart of a method for obtaining a soft label of a sample material mapping provided in the present application. As shown in fig. 5, the method may include the steps of:
S201, for each sample texture map, determining an initial soft label of the sample texture map according to the hard label of the sample texture map and a preset soft label probability distribution mode.
Optionally, the preset soft label probability distribution mode may, for example, be pre-stored in the electronic device by the user. Optionally, the soft label probability distribution modes corresponding to different types of sample texture maps may be the same or different. For example, taking the hard labels of the sample texture maps shown in Table 1 as an example, assume that the soft label probability distribution modes corresponding to different types of sample texture maps are the same, namely: reduce the probability of 1 in the hard label to 0.8. The initial soft labels of the sample texture maps may then be, for example, as shown in Table 3 below:
TABLE 3
Sample texture map      Initial soft label
Class 1                 (0.8, 0, 0, 0, 0)
Class 2                 (0, 0.8, 0, 0, 0)
Class 3                 (0, 0, 0.8, 0, 0)
Class 4                 (0, 0, 0, 0.8, 0)
Class 5                 (0, 0, 0, 0, 0.8)
The soft label probability distribution modes corresponding to different types of sample texture maps may also differ. For example, assume the mode corresponding to the class 5 sample texture map shown in Table 3 differs from that of classes 1 to 4 and is: reduce the probability of 1 in the hard label to 0.7. The initial soft label of the class 5 sample texture map may then be (0, 0, 0, 0, 0.7), for example.
S202, according to the first sample clustering result, the second sample clustering result and the third sample clustering result, adjusting the probability in the initial soft label of the sample texture mapping to obtain the soft label of the sample texture mapping.
Optionally, the electronic device may, for example, adjust the probability in the initial soft label of the sample material map according to the first sample clustering result to obtain the first initial soft label of the sample material map. Then, the electronic device may adjust the probability in the first initial soft label of the sample material mapping according to the second sample clustering result to obtain a second initial soft label of the sample material mapping. Then, the electronic device may adjust the probability in the second initial soft label of the sample material mapping according to the third sample clustering result, so as to obtain a soft label of the sample material mapping. Or, the electronic device may further adjust the probability in the initial soft label of the sample material mapping according to the sequence of the second sample clustering result, the first sample clustering result, and the third sample clustering result, which is not described herein again.
In this implementation, the hard label of the sample texture map is processed according to the preset soft label probability distribution mode to obtain its initial soft label, and the probabilities in the initial soft label are then adjusted according to the first, second, and third sample clustering results. The resulting soft label thus reflects how materials are distinguished by color and texture, which improves the accuracy of determining the soft label of the sample texture map.
As a second possible implementation, for each sample texture map, the electronic device may, for example, adjust the probabilities in the hard label of the sample texture map according to the first sample clustering result, the second sample clustering result, and the third sample clustering result to obtain a first soft label of the sample texture map. If the electronic device determines that the sum of the probability values in the first soft label is greater than 1, it can scale the probability values down proportionally so that they sum to exactly 1, thereby obtaining the soft label of the sample texture map.
For example, taking the hard label (1, 0, 0, 0, 0) shown in Table 1 above as an example, assume that according to the first sample clustering result the electronic device determines that the sample texture maps having the same color as the class 1 sample texture map are the class 2 sample texture maps; according to the second sample clustering result, that the sample texture maps having the same texture are the class 2 and class 4 sample texture maps; and according to the third sample clustering result, that the sample texture maps having the same color and texture are the class 2 sample texture maps. Optionally, according to the first sample clustering result, the electronic device may, for example, increase the probability of belonging to the class 2 material type in the hard label (1, 0, 0, 0, 0) by a preset value, for example 0.1, to obtain (1, 0.1, 0, 0, 0).
Similarly, according to the second sample clustering result, the electronic device may increase the probabilities of belonging to the class 2 and class 4 material types in (1, 0.1, 0, 0, 0) by the preset value 0.1 to obtain (1, 0.2, 0, 0.1, 0), and, according to the third sample clustering result, increase the probability of belonging to the class 2 material type in (1, 0.2, 0, 0.1, 0) by 0.1 to obtain (1, 0.3, 0, 0.1, 0). Since 1 + 0.3 + 0.1 = 1.4, the electronic device may divide each probability in (1, 0.3, 0, 0.1, 0) by 1.4, scaling the probability values down proportionally so that they sum to 1 and thereby obtaining the soft label of the sample texture map.
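The increment-then-rescale procedure of this second implementation can be sketched as follows. This is a hedged illustration of the worked example above (the preset increment 0.1 and the class indices follow the text; the function names are our own):

```python
def bump(label, idxs, delta=0.1):
    """Increase the probabilities at the given 0-based indices by delta."""
    return tuple(p + delta if k in idxs else p for k, p in enumerate(label))

def renormalize(label):
    """If the probabilities sum to more than 1, scale them all down by the
    same factor (the proportional reduction) so they sum to exactly 1."""
    total = sum(label)
    return tuple(p / total for p in label) if total > 1 else tuple(label)

label = (1, 0, 0, 0, 0)      # hard label of the class 1 sample texture map
label = bump(label, {1})     # first result: same color as class 2
label = bump(label, {1, 3})  # second result: same texture as classes 2 and 4
label = bump(label, {1})     # third result: same color and texture as class 2
# label is now (1, 0.3, 0, 0.1, 0), summing to 1.4
soft = renormalize(label)
print([round(p, 3) for p in soft])  # [0.714, 0.214, 0.0, 0.071, 0.0]
```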
In the following, how the electronic device adjusts the probability in the initial soft label of the sample texture mapping according to the first sample clustering result, the second sample clustering result, and the third sample clustering result to obtain the soft label of the sample texture mapping is exemplarily explained:
optionally, the electronic device may first adjust the probability in the initial soft label of the sample material mapping according to the first sample clustering result, for example, to obtain the first soft label of the sample material mapping.
In this implementation, as a possible implementation, the electronic device may first determine, from the first sample clustering result, the class X sample texture maps having the same color as the class i sample texture map, where X and i are each integers greater than or equal to 1. The electronic device may then adjust the probability of the class X material type and the probability of the class i material type in the initial soft label of the class i sample texture map according to the preset soft label probability distribution mode, to obtain the first soft label of the class i sample texture map.
For example, taking the initial soft labels shown in Table 3 as an example, when i equals 1, assume the electronic device determines from the first sample clustering result that the sample texture maps having the same color as the class 1 sample texture map are the class 2 sample texture maps. The electronic device may then adjust the probability of the class 2 material type and the probability of the class 1 material type in the initial soft label of the class 1 sample texture map according to the preset soft label probability distribution mode, to obtain the first soft label of the class 1 sample texture map. In the example shown in Table 3, the preset soft label probability distribution mode reduces the probability of 1 in the hard label to 0.8, so the electronic device may determine that 1 - 0.8 = 0.2 remains to be distributed over the probability of belonging to the class 2 material type and the probability of belonging to the class 1 material type in the initial soft label. For example, the electronic device may assign this 0.2 equally to the two probabilities, obtaining the first soft label of the class 1 sample texture map as (0.8 + 0.1, 0 + 0.1, 0, 0, 0), that is, (0.9, 0.1, 0, 0, 0).
When i is equal to any one of the values 2 to 5, the first soft label of each type of sample material mapping can be determined by referring to the method described in the above embodiment, which is not described herein again.
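The equal assignment of the residual probability in the example above can be sketched as follows (a hedged illustration of the worked example, not a definitive implementation; the function name is our own):

```python
def split_residual(label, own_idx, related_idxs, residual):
    """Distribute the residual probability mass equally over the map's own
    class and the related (e.g. same-color) classes, 0-based indices."""
    share = residual / (1 + len(related_idxs))
    out = list(label)
    out[own_idx] += share
    for j in related_idxs:
        out[j] += share
    return tuple(out)

initial = (0.8, 0, 0, 0, 0)  # initial soft label of the class 1 sample texture map
residual = 1 - 0.8           # mass removed by the distribution mode
first_soft = split_residual(initial, own_idx=0, related_idxs=[1], residual=residual)
print([round(p, 2) for p in first_soft])  # [0.9, 0.1, 0, 0, 0]
```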
After adjusting the probabilities in the initial soft label of the sample texture map according to the first sample clustering result to obtain the first soft label, the electronic device adjusts the probabilities in the first soft label according to the second sample clustering result to obtain a second soft label of the sample texture map.
In this implementation, as a possible implementation, the electronic device may first determine, from the second sample clustering result, the class Y sample texture maps having the same texture as the class i sample texture map, where Y is an integer greater than or equal to 1. The electronic device may then adjust the probability of the class Y material type and the probability of the class i material type in the first soft label of the class i sample texture map according to the preset soft label probability distribution mode, to obtain the second soft label of the class i sample texture map.
For example, when i equals 1, taking the first soft label (0.9, 0.1, 0, 0, 0) of the class 1 sample texture map from the above embodiment as an example, assume the electronic device determines from the second sample clustering result that the sample texture maps having the same texture as the class 1 sample texture map are the class 2 and class 4 sample texture maps. The electronic device may then adjust the probability of belonging to the class 2 material type, the probability of belonging to the class 4 material type, and the probability of belonging to the class 1 material type in the first soft label according to the preset soft label probability distribution mode, to obtain the second soft label of the class 1 sample texture map. Assume the preset soft label probability distribution mode is: subtract 0.3 from the probability that the sample texture map belongs to the class 1 material type in the first soft label. The electronic device then distributes this 0.3 over the probabilities of belonging to the class 2, class 4, and class 1 material types. For example, assigning the 0.3 equally to the three probabilities yields the second soft label of the class 1 sample texture map as (0.6 + 0.1, 0.1 + 0.1, 0, 0 + 0.1, 0), that is, (0.7, 0.2, 0, 0.1, 0).
When i is equal to any one of the values 2-5, the second soft label of each type of sample material mapping can be determined by referring to the method described in the above embodiment, which is not described herein again.
After adjusting the probabilities in the first soft label of the sample texture map according to the second sample clustering result to obtain the second soft label, the electronic device adjusts the probabilities in the second soft label according to the third sample clustering result to obtain the soft label of the sample texture map.
In this implementation, as a possible implementation, the electronic device may determine, from the third sample clustering result, the class Z sample texture maps having the same color and texture as the class i sample texture map, where Z is an integer greater than or equal to 1. The electronic device may then adjust the probability of the class Z material type and the probability of the class i material type in the second soft label of the class i sample texture map according to the preset soft label probability distribution mode, to obtain the soft label of the class i sample texture map.
For example, when i equals 1, taking the second soft label (0.7, 0.2, 0, 0.1, 0) of the class 1 sample texture map from the above embodiment as an example, assume the electronic device determines from the third sample clustering result that the sample texture maps having the same color and texture as the class 1 sample texture map are the class 2 sample texture maps. The electronic device may then adjust the probability of the class 2 material type and the probability of the class 1 material type in the second soft label according to the preset soft label probability distribution mode, to obtain the soft label of the class 1 sample texture map. Assume the preset soft label probability distribution mode is: subtract 0.2 from the probability that the sample texture map belongs to the class 1 material type in the second soft label. The electronic device then distributes this 0.2 over the probabilities of belonging to the class 2 and class 1 material types. For example, assigning the 0.2 equally to the two probabilities yields the soft label of the class 1 sample texture map as (0.5 + 0.1, 0.2 + 0.1, 0, 0.1, 0), that is, (0.6, 0.3, 0, 0.1, 0).
When i is equal to any value from 2 to 5, the soft label of the corresponding type of sample material map can be determined with reference to the method described in the above embodiment, and details are not repeated here.
How to obtain the first sample clustering result, the second sample clustering result, and the third sample clustering result by the electronic device is described in detail below:
as a first possible implementation manner, for example, the electronic device may obtain a first sample clustering result and a second sample clustering result, and then obtain a third sample clustering result according to the first sample clustering result and the second sample clustering result.
Fig. 6 is a schematic flow chart of a method for obtaining a sample hierarchical clustering result according to the present application. As shown in fig. 6, for sample material maps of multiple material types, a user may manually determine whether the colors and textures of the sample material maps of the respective material types are the same. In response to the user's classification operation on the sample material maps, the electronic device may group the sample material maps of material types with the same color into one class of the first sample clustering result, and group the sample material maps of material types with the same texture into one class of the second sample clustering result, thereby obtaining the first sample clustering result and the second sample clustering result.
For example, in the first sample clustering result, the sample material maps of material types with the same color may be placed in the same folder, or may be given the same identifier. In the second sample clustering result, the sample material maps of material types with the same texture may likewise be placed in the same folder or given the same identifier. The identifier of a sample material map may be, for example, the name of the sample material map, or a field at a target location in that name.
The electronic device may then receive the first sample clustering result and the second sample clustering result input by the user. After obtaining them, if the electronic device determines, for example, that the sample material map of material type 1 is in the same folder as the sample material map of material type 2 both in the first sample clustering result and in the second sample clustering result, the electronic device may conclude that the two are the same in both color and texture, group the sample material maps of material types 1 and 2 into one class, and thereby obtain the third sample clustering result.
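The folder-intersection logic above can be sketched as follows; the folder names and the dictionary-based representation are hypothetical stand-ins for the two clustering results.

```python
from collections import defaultdict

def third_clustering(color_folder, texture_folder):
    """Group material types whose sample maps share BOTH a color folder
    (first clustering result) and a texture folder (second result)."""
    merged = defaultdict(list)
    for t in color_folder:
        merged[(color_folder[t], texture_folder[t])].append(t)
    return [sorted(group) for group in merged.values()]

# Material types 1 and 2 sit in the same folder in both results,
# so they form one class of the third sample clustering result.
color_folder   = {1: "red",  2: "red",  3: "blue", 4: "blue"}
texture_folder = {1: "wood", 2: "wood", 3: "wood", 4: "marble"}
print(third_clustering(color_folder, texture_folder))
# → [[1, 2], [3], [4]]
```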
As a second possible implementation manner, the third sample clustering result may also be obtained offline by the user, by calibrating the second sample clustering result against the first sample clustering result. The electronic device may receive the first sample clustering result, the second sample clustering result, and the third sample clustering result input by the user through an API, a GUI, or the like, for example.
Taking the above neural network model as a ResNet network model as an example, how the electronic device trains the neural network model using the training sample data set to obtain the feature extraction model is exemplarily explained as follows:
fig. 7 is a schematic structural diagram of a neural network model provided in the present application. The neural network model may include a material feature extraction portion, and a non-material feature extraction portion. The electronic equipment can extract the characteristics of the sample texture map through the texture characteristic extraction part; and determining the probability of each material type of the sample material chartlet predicted by the material characteristics through the non-material characteristic extraction part.
Illustratively, as shown in FIG. 7, the input to the ResNet network model is a sample texture map. The material characteristic extraction part of the ResNet network model can comprise three convolution layers, a pooling layer and two full-connection layers. The non-material feature extraction section may include a Softmax activation function layer. The electronic equipment can perform feature extraction on the sample material mapping through the convolution layer, the pooling layer and the full-connection layer to obtain material features of the sample material mapping, and outputs the material features to the Softmax activation function layer. The electronic equipment can output the probability p of each material type of the sample material mapping predicted by the neural network model according to the material characteristics of the sample material mapping through a Softmax activation function.
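The non-material feature extraction part described above is a standard Softmax over the material features; a minimal sketch follows, where the feature values are made-up stand-ins for the output of the convolution, pooling, and fully-connected layers.

```python
import numpy as np

def softmax(features):
    """Numerically stable Softmax: turns material features into the
    probability p of each material type."""
    e = np.exp(features - np.max(features))
    return e / e.sum()

# Hypothetical material features for one sample material map (5 types)
features = np.array([2.0, 1.0, 0.1, 0.5, -1.0])
p = softmax(features)
assert abs(p.sum() - 1.0) < 1e-9   # probabilities sum to 1
assert p.argmax() == 0             # largest feature → largest probability
```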
Alternatively, the loss function used to train the neural network model may be represented by the following equation (1), for example:
Loss = (1/N) · Σ_{j=1}^{N} CE(Label_j, Predict_j)    (1)
where CE(·) represents the cross-entropy loss function, Label_j represents the soft label of the j-th sample material map, N represents the number of sample material maps in a training batch, and Predict_j represents the probabilities, predicted by the neural network model, that the j-th sample material map belongs to each material type.
It should be understood that the cross-entropy loss function described above is only a possible implementation of the loss function provided herein. The loss function used to train the neural network model is not limited by the present application.
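The cross-entropy variant of equation (1) can be sketched as follows for soft labels; the eps term and the function name are illustrative assumptions.

```python
import numpy as np

def batch_loss(labels, predicts, eps=1e-12):
    """Equation (1): mean cross-entropy CE(Label_j, Predict_j) over the
    N sample material maps of one training batch."""
    labels = np.asarray(labels, dtype=float)
    predicts = np.asarray(predicts, dtype=float)
    ce = -(labels * np.log(predicts + eps)).sum(axis=1)   # CE per sample
    return float(ce.mean())                               # average over N

labels   = [[0.6, 0.3, 0.0, 0.1, 0.0]]     # soft label of sample j
predicts = [[0.5, 0.3, 0.05, 0.1, 0.05]]   # predicted probabilities p
assert batch_loss(labels, predicts) > 0.0
# the loss is smallest when the prediction matches the soft label
assert batch_loss(labels, labels) < batch_loss(labels, predicts)
```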
Optionally, after training of the neural network model is completed, the electronic device may, in response to a user operation, delete the non-material feature extraction part of the trained neural network model and retain the material feature extraction part as the feature extraction model. In this way, the feature extraction model can perform feature extraction on an image input into the model to obtain the material features of the input image.
In this embodiment, as described above, training the neural network model based on the soft labels of the sample material maps improves the accuracy of the resulting feature extraction model. Although methods exist for training neural networks based on soft labels of images, in the prior art the main purpose of soft-label-based training is to reduce the size of the neural network and thereby improve training efficiency. That purpose differs from the present application, where soft labels serve to improve training accuracy.
In addition, some image feature extraction methods based on deep learning also exist in the prior art, however, the existing image feature extraction methods mainly perform feature extraction on images including obvious targets (such as automobiles, people, animals and the like), and the texture maps of the present application do not have obvious targets, so the existing image feature extraction methods based on deep learning are also not suitable for the present application.
After the feature extraction model is obtained by the feature extraction model training method described in any of the foregoing embodiments, the texture mapping processing system may perform texture mapping processing based on the feature extraction model.
The following describes the technical scheme of texture mapping processing provided by the present application in detail with reference to specific embodiments. The following several specific embodiments may be combined with each other, and details of the same or similar concepts or processes may not be repeated in some embodiments.
Fig. 8 is a flowchart illustrating a texture map processing method according to the present application. As shown in fig. 8, the method comprises the steps of:
S301, acquiring a target image.
In some embodiments, a target image may include one material type, that is, one color and one texture. In this implementation, before acquiring the target image, the material map processing system may output prompt information prompting the user to upload a target image containing only one material type, which improves the accuracy and efficiency with which the system acquires the target image, and in turn the accuracy of the subsequent material map processing.
In some embodiments, the target image may also be an image after pre-processing, such as image enhancement, to further improve the accuracy of the texture mapping processing system performing the texture mapping processing based on the target image.
Alternatively, the texture map processing system may receive the target image input by the user through, for example, an API or a GUI.
S302, inputting the target image into the feature extraction model to obtain the material feature of the target image.
Wherein, the feature extraction model is obtained by the method as described in any of the foregoing embodiments.
S303, acquiring K material maps matched with the material of the target image according to the material characteristics of the target image, the material characteristics of each material map in the material map library extracted by using the characteristic extraction model and the hierarchical clustering result of the material maps in the material map library based on the color and the texture.
Wherein K is an integer greater than or equal to 1.
In some embodiments, the hierarchical clustering result may include a first-level clustering result and a second-level clustering result. A first clustering result, obtained by clustering the material maps based on color, and a second clustering result, obtained by clustering the material maps based on texture, may serve as the first-level clustering result. In the first clustering result, material maps with the same color are clustered into one cluster; in the second clustering result, material maps with the same texture are clustered into one cluster. A third clustering result, obtained by clustering the material maps based on both color and texture, may serve as the second-level clustering result; in it, material maps with the same color and texture are clustered into one cluster.
Optionally, the first-level clustering result in the present application may include: a first clustering result and a second clustering result. Alternatively, in some embodiments, the first-level clustering result may also include only: one of the first clustering result and the second clustering result.
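One possible in-memory layout of the hierarchical clustering result is sketched below; the key names and map identifiers are purely illustrative assumptions, and the third result here is the intersection of the first two.

```python
# Hypothetical representation: each level maps a result name to clusters,
# and each cluster is a list of material map identifiers.
hierarchical_result = {
    "level_1": {
        "first":  [["map_1", "map_4"], ["map_2", "map_3"]],  # same color
        "second": [["map_1", "map_2"], ["map_3", "map_4"]],  # same texture
    },
    "level_2": {
        # same color AND texture: intersection of the two results above
        "third":  [["map_1"], ["map_2"], ["map_3"], ["map_4"]],
    },
}
assert set(hierarchical_result) == {"level_1", "level_2"}
```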
The texture features of each texture map in the texture map library extracted by the feature extraction model and the hierarchical clustering result of the texture maps in the texture map library based on color and texture may be, for example, pre-stored in the texture map processing system. It should be understood that the execution subject for executing the operation of extracting the material feature of each material map in the material map library by using the feature extraction model may be the same as or different from the execution subject for executing the material map processing method, and the present application does not limit this.
In this embodiment, the feature extraction is performed on the target image through the feature extraction model obtained by the soft label training based on the sample material mapping, so that the accuracy of obtaining the material features of the target image is improved, and the accuracy of performing the material mapping processing based on the material features of the target image is further improved. K material maps matched with the target image material can be determined according to the material characteristics of the target image, the material characteristics of each material map in the material map library extracted by the characteristic extraction model, and the hierarchical clustering result of the material maps in the material map library based on the color and the texture. By the method, the automatic determination of the material chartlet matched with the target image material is realized, and compared with the existing method which needs to depend on the discrimination capability of a user on a plurality of material chartlets, the efficiency and the accuracy of determining the material chartlet required by the user are improved.
In addition, the method for determining the K material maps matched with the target image material based on the hierarchical clustering result of the material maps based on the color and the texture fully considers the importance of the texture and the color in the retrieval of the material maps, and further improves the accuracy of determining the material maps required by the user. In addition, the execution of the material mapping processing method provided by the application only needs to use one feature extraction model obtained based on the neural network model training, so that the efficiency of material mapping processing is further improved, and the training cost is reduced.
The following describes in detail how the material map processing system obtains the K material maps matching the material of the target image based on the material features of each material map in the material map library extracted by the feature extraction model and the hierarchical clustering result of the material maps in the material map library based on color and texture:
Fig. 9 is a flowchart illustrating a method for obtaining K material maps matching the material of a target image according to the present application. As shown in fig. 9, as a first possible implementation manner, the foregoing step S303 may include the following steps:
S401, obtaining the similarity between the target image and each material map according to the material features of the target image and the material features of each material map in the material map library.
Optionally, the material map processing system may calculate, for example, a similarity between the material feature of the target image and the material feature of each material map in the material map library as a similarity between the target image and each material map in any one of the existing similarity calculation methods, such as a euclidean distance similarity calculation method or a cosine similarity calculation method.
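As one of the similarity calculation methods mentioned above, cosine similarity can be sketched as follows; the feature vectors and map names are hypothetical.

```python
import numpy as np

def cosine_similarity(a, b):
    """Cosine similarity between two material feature vectors."""
    a, b = np.asarray(a, dtype=float), np.asarray(b, dtype=float)
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

target_features = [0.2, 0.9, 0.1]   # hypothetical features of the target image
library = {"map_1": [0.1, 0.8, 0.2], "map_2": [0.9, 0.1, 0.0]}
sims = {m: cosine_similarity(target_features, f) for m, f in library.items()}
assert sims["map_1"] > sims["map_2"]   # map_1 matches the target better
```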
S402, obtaining the first N material maps according to the sorting sequence of the similarity from large to small.
Wherein N is an integer greater than or equal to 2.
The greater the similarity between the target image and a material map, the more similar the material type included in the target image is to the material type corresponding to that material map; the smaller the similarity, the more the two material types differ. Therefore, the material map processing system can obtain the first N material maps from the material map library in descending order of similarity.
S403, obtaining clustering scores of the N material maps according to the hierarchical clustering result of the material maps in the material map library based on the color and the texture.
Because the clustering scores of the N material maps are obtained based on the color- and texture-based hierarchical clustering result of the material maps in the material map library, the clustering score of each material map takes into account the likelihood that material maps belonging to the same cluster also match the material of the target image.
When the first-level clustering result includes only one of the first clustering result and the second clustering result, the material map processing system may, for example, calculate the clustering scores of the N material maps based on that first-level clustering result and the third clustering result. Taking the hierarchical clustering result including the first clustering result, the second clustering result, and the third clustering result as an example, the material map processing system may calculate the clustering scores of the N material maps based on all three clustering results.
S404, acquiring the first K material maps as the material maps matched with the material of the target image according to the sequence of the clustering scores from large to small.
The larger the clustering score, the better the texture map matches the color and texture of the target image than other texture maps. Therefore, the material map processing system may obtain the top K material maps as the material maps matching the material of the target image in the order of the cluster scores from large to small.
In this embodiment, the similarity between the target image and each material map is determined according to the material characteristics of the target image and the material characteristics of each material map in the material map library, and the first N material maps with higher similarity are determined according to the similarity. By the method, the material mapping processing system does not need to calculate the clustering scores of all the material mappings, and the efficiency of searching the material mappings is further improved. And determining the clustering scores of the N material maps according to the hierarchical clustering result, and determining the K material maps according to the magnitude sequence of the clustering scores. By the method, the importance of the texture and the color in the retrieval of the texture map is fully considered, the first N texture maps with higher similarity are reordered according to the texture and the color characteristics of the texture map, and the accuracy of the retrieval of the texture map is further improved.
As a second possible implementation manner, after obtaining the similarity between the target image and each material map, the material map processing system may take the at least one material map whose similarity is greater than a preset similarity threshold as the initial material maps. The system may then obtain the clustering score of each initial material map based on the color- and texture-based hierarchical clustering result of the material maps in the material map library, and take the initial material maps whose clustering scores are greater than a preset clustering score threshold as the target material maps.
If the number of the initial material maps is less than or equal to K, optionally, the material map processing system may use the at least one initial material map as a material map matching the material of the target image. If the number of the initial material maps is greater than K and the number of the target material maps is less than or equal to K, optionally, the material map processing system may use the target material maps as material maps matched with the target image material. If the number of the target material maps is greater than K, optionally, the material map processing system may obtain, from the plurality of target material maps, the first K material maps as the material maps matched with the target image material according to the descending order of the clustering scores.
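The size checks above can be sketched as a single selection routine; the function and variable names are hypothetical, and scores are passed in as a precomputed dictionary.

```python
def select_matching_maps(initial_maps, cluster_score, score_threshold, k):
    """Second implementation: filter the initial maps by clustering score,
    falling back per the size checks described in the text."""
    if len(initial_maps) <= k:
        return list(initial_maps)            # few enough: keep all of them
    target_maps = [m for m in initial_maps
                   if cluster_score[m] > score_threshold]
    if len(target_maps) <= k:
        return target_maps                   # the filtered set fits in K
    # more than K targets: keep the K with the highest clustering scores
    return sorted(target_maps, key=lambda m: cluster_score[m],
                  reverse=True)[:k]

scores = {"a": 0.9, "b": 0.6, "c": 0.2, "d": 0.8}
print(select_matching_maps(["a", "b", "c", "d"], scores, 0.5, 2))
# → ['a', 'd']
```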
Taking the hierarchical clustering result including the first clustering result, the second clustering result, and the third clustering result as an example, how the material mapping processing system obtains the clustering scores of the N material mappings according to the hierarchical clustering results of the material mappings in the material mapping library based on the color and texture will be described in detail below:
as a possible implementation manner, the foregoing step S403 may include the following steps:
Step 1, the material map processing system may obtain the initial clustering scores of the N material maps according to the similarity ranking of the N material maps.
For example, the material map processing system may assign an initial clustering score of 1/r to the r-th material map in descending order of similarity. Taking N equal to 20 as an example, the initial clustering score of the 1st material map is 1, that of the 2nd material map is 1/2, that of the 3rd material map is 1/3, and so on, down to 1/20 for the 20th material map.
Step 2, the material map processing system may obtain the first clustering scores of the N material maps according to the initial clustering scores of the N material maps and the first clustering result.
For example, according to the first clustering result, the material map processing system may determine a plurality of material maps with the same color from the N material maps. The material map processing system may then determine a first cluster score for the plurality of material maps based on the initial cluster scores for the plurality of material maps. The first clustering score of each material map in the plurality of material maps with the same color may be the same. For example, for any of the plurality of material maps with the same color, the material map processing system may determine the sum of the initial cluster scores of the plurality of material maps with the same color as the first cluster score of the material map.
For example, still taking N equal to 20, assuming the material map processing system determines from the first clustering result that the 1st, 4th, 8th, and 16th material maps (in descending order of similarity) have the same color, it may set the first clustering score of each of those four maps to: 1/1 + 1/4 + 1/8 + 1/16 = 23/16. With reference to this method, the material map processing system may determine the first clustering scores of the other material maps among the 20.
Step 3, the material map processing system may obtain the second clustering scores of the N material maps according to the initial clustering scores of the N material maps and the second clustering result.
For example, according to the second clustering result, the material map processing system may determine a plurality of material maps having the same texture from the N material maps, and then determine their second clustering scores based on their initial clustering scores. The second clustering score of each material map in a group with the same texture may be the same. For example, for any material map in such a group, the material map processing system may take the sum of the initial clustering scores of the group as that map's second clustering score.
For example, still taking N equal to 20, assuming the material map processing system determines from the second clustering result that the 1st and 4th material maps (in descending order of similarity) have the same texture, it may set the second clustering score of both maps to: 1/1 + 1/4 = 5/4. With reference to this method, the material map processing system may determine the second clustering scores of the other material maps among the 20.
Step 4, obtaining the third clustering scores of the N material maps according to the initial clustering scores of the N material maps and the third clustering result.
For example, according to the third clustering result, the material map processing system may determine a plurality of material maps having the same color and texture from the N material maps, and then determine their third clustering scores based on their initial clustering scores. The third clustering score of each material map in a group with the same color and texture may be the same. For example, for any material map in such a group, the material map processing system may take the sum of the initial clustering scores of the group as that map's third clustering score.
For example, still taking N equal to 20, assuming the material map processing system determines from the third clustering result that the 1st and 4th material maps (in descending order of similarity) have the same color and texture, it may set the third clustering score of both maps to: 1/1 + 1/4 = 5/4. With reference to this method, the material map processing system may determine the third clustering scores of the other material maps among the 20.
It should be understood that the order in which the texture mapping system performs the steps 2, 3, and 4 is not limited in the present application. For example, the texture mapping system may perform step 2, then step 3, and then step 4. Alternatively, the texture mapping processing system may perform step 3, then step 4, and then step 2.
Step 5, obtaining the clustering scores of the N material maps according to the initial clustering scores, the first clustering scores, the second clustering scores, and the third clustering scores of the N material maps.
Optionally, for any material map in the N material maps, the material map processing system may directly use the sum of the initial clustering score, the first clustering score, the second clustering score, and the third clustering score as the clustering score of the material map.
Alternatively, for any of the N material maps, the material map processing system may further use a weighted sum of the initial clustering score, the first clustering score, the second clustering score, and the third clustering score as the clustering score of the material map.
The weights corresponding to the initial clustering score, the first clustering score, the second clustering score, and the third clustering score may, for example, be configured by the user and pre-stored in the material map processing system. The user can adjust each weight as needed. For example, if the user needs material maps that better match the target image in color, the user may increase the weight of the first clustering score to improve the color accuracy of the K material maps the system determines; if the user needs material maps that better match the target image in texture, the user may increase the weight of the second clustering score to improve their texture accuracy. This improves the flexibility and accuracy with which the material map processing system determines the K material maps, and improves the user experience.
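Steps 1 to 5 can be sketched end to end as follows. The 1/r initial score and the sum-over-cluster rule follow the text; the dictionary inputs and the weight handling are assumptions.

```python
def cluster_scores(ranked_maps, color_of, texture_of, both_of,
                   weights=(1.0, 1.0, 1.0, 1.0)):
    """ranked_maps is ordered by similarity, largest first. Returns the
    weighted sum of the initial, first, second, and third clustering
    scores for each map."""
    init = {m: 1.0 / (r + 1) for r, m in enumerate(ranked_maps)}  # step 1

    def group_score(cluster_of):
        # steps 2-4: every map in a cluster gets the sum of the initial
        # scores of all ranked maps belonging to that cluster
        totals = {}
        for m in ranked_maps:
            totals[cluster_of[m]] = totals.get(cluster_of[m], 0.0) + init[m]
        return {m: totals[cluster_of[m]] for m in ranked_maps}

    s1, s2, s3 = (group_score(c) for c in (color_of, texture_of, both_of))
    w0, w1, w2, w3 = weights                                      # step 5
    return {m: w0 * init[m] + w1 * s1[m] + w2 * s2[m] + w3 * s3[m]
            for m in ranked_maps}

# Maps 1 and 2 share a color; each map is alone in the other clusterings.
scores = cluster_scores([1, 2, 3],
                        color_of={1: "A", 2: "A", 3: "B"},
                        texture_of={1: "X", 2: "Y", 3: "Z"},
                        both_of={1: "p", 2: "q", 3: "r"})
assert scores[1] == 1.0 + 1.5 + 1.0 + 1.0   # init + first + second + third
```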
In this embodiment, the initial clustering score of each material map is determined according to the similarity ranking of the material maps, and the first, second, and third clustering scores are determined according to the first clustering result based on color clustering, the second clustering result based on texture clustering, and the third clustering result based on color-and-texture clustering, respectively. The clustering scores of the N material maps obtained from the initial, first, second, and third clustering scores consider not only the similarity between the target image and each material map but also the similarity in color and texture among the material maps themselves, which improves the accuracy of determining the clustering scores of the N material maps and, in turn, the accuracy of determining the K material maps based on those scores.
Taking the hierarchical clustering result including the first clustering result and the second clustering result as an example, how the material map processing system obtains the clustering scores of the N material maps according to the hierarchical clustering result may refer to the method described in the above embodiment, and details thereof are not described herein.
Further, as a possible implementation manner, after acquiring K material maps matched with the target image material, the material map processing system may further output the K material maps, so that the user may view the K material maps.
For example, fig. 10 is a schematic interface diagram of a texture mapping processing system according to the present application. As shown in FIG. 10, the texture map processing system may receive a target image input by a user, for example, through a model build interface. Then, the material mapping processing system may output the K material mappings in response to a user-triggered request to obtain the "K material mappings matching the target image material". For example, the user may trigger the request by clicking a "get material map" control of the model build interface shown in FIG. 10, and the material map processing system may output K material maps through the model build interface in response to the request.
It should be understood that the present application is not limited to whether the model building interface includes other content. Illustratively, as shown in FIG. 10, the model build interface may also include a model framework, for example.
As a possible implementation manner, after the material mapping processing system obtains K material mappings matched with the material of the target image, the material mapping processing system may further determine the target material mapping from the K material mappings, and then render the model frame of the target object by using the target material mapping to obtain the model of the target object. The material map processing system may then output a model of the target object.
Alternatively, the model frame of the target object may be, for example, a three-dimensional model frame or a two-dimensional model frame, which is not limited in this application. Alternatively, the target object may be any object, such as a sofa, a table, and the like.
In some embodiments, the texture map processing system may, for example, receive user input of a model framework for the target object.
In some embodiments, the material map processing system may construct the model frame of the target object before rendering it with the target material map to obtain the model of the target object. Optionally, the specific implementation of constructing the model frame may follow an existing model construction method, and is not described again here. In this way, the material map processing system can render the model frame with the target material map as soon as the frame is built, which shortens the time from constructing the model frame of the target object to obtaining its model and improves the user experience.
Optionally, the material map processing system may, for example, take the material map with the highest clustering score among the K material maps as the target material map. Alternatively, it may randomly pick one of the K material maps as the target material map. Or, after acquiring the K material maps matching the material of the target image, the material map processing system may output the K material maps and then determine the target material map from them according to a user selection instruction. For example, taking the model building interface shown in FIG. 10 as an example, the material map processing system may take the 1st material map as the target material map in response to the user clicking on it.
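The three selection strategies above can be sketched as follows. This is a minimal sketch; the function name and signature are ours, not the patent's:

```python
import random

def pick_target_map(maps, scores, user_index=None):
    """Pick the target material map from the K candidate maps.

    Strategies from the text: use the user's selection when given,
    otherwise the highest clustering score, otherwise a random pick.
    """
    if user_index is not None:            # user clicked the i-th map
        return maps[user_index]
    if scores:                            # highest clustering score wins
        best = max(range(len(maps)), key=lambda i: scores[i])
        return maps[best]
    return random.choice(maps)            # last resort: random choice

print(pick_target_map(["oak", "marble", "linen"], [0.2, 0.9, 0.5]))  # marble
```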
It should be understood that the present application does not limit how the material map processing system renders the model frame of the target object using the target material map. Optionally, any existing method of rendering a model with a material map may be used, and the details are not repeated here.
Optionally, the material map processing system may output the model of the target object through a display device. The display device may belong to the material map processing system or be connected to it. Alternatively, the material map processing system may send the model of the target object to another device.
As another possible implementation manner, taking K equal to 1 as an example, after acquiring the single material map matching the material of the target image, the material map processing system may directly take that material map as the target material map, render the model frame of the target object with it to obtain the model of the target object, and output the model.
In this embodiment, rendering the model frame of the target object with the target material map determined from the K material maps yields the model of the target object and improves the accuracy of the resulting model. By outputting the model of the target object, the user can view a model that is more accurate and better matches the required material, which improves the user experience.
Take the case where the hierarchical clustering results include first-level clustering results and a second-level clustering result, where the first-level clustering results include a first clustering result obtained by clustering the material maps based on color and a second clustering result obtained by clustering the material maps based on texture, and the second-level clustering result includes a third clustering result obtained by clustering the material maps based on both color and texture. As a possible implementation manner, the present application further provides a material map processing method, which may include the following steps:
Step 1: acquire a target image input by the user.
Step 2: input the target image into the feature extraction model to obtain the material features of the target image.
Step 3: acquire the similarity between the target image and each material map according to the material features of the target image and the material features of each material map in the material map library, the latter extracted by the feature extraction model.
Step 4: acquire initial clustering scores for the first N material maps taken in descending order of similarity.
Step 5: obtain first clustering scores of the N material maps from their initial clustering scores and the first clustering result; obtain second clustering scores from the initial clustering scores and the second clustering result; and obtain third clustering scores from the initial clustering scores and the third clustering result.
Step 6: for each of the N material maps, compute the weighted sum of its initial, first, second, and third clustering scores as the clustering score of that material map.
Step 7: take the first K material maps, in descending order of clustering score, as the material maps matching the material of the target image.
Step 8: output the K material maps.
Step 9: determine a target material map from the K material maps in response to the user's selection.
Step 10: render the model frame of the target object with the target material map to obtain the model of the target object.
Step 11: output the model of the target object.
Table 4 below shows the results of an experiment performed to test the accuracy of the material map processing method provided herein:
TABLE 4
Experiment     Basic model   + soft label loss function   + reordering   Total improvement (pct. points)
Top1 recall    62.33%        65.72%                       65.72%         3.39
Top2 recall    65.64%        68.24%                       72.53%         6.89
Top3 recall    66.58%        70.39%                       83.46%         16.88
Top4 recall    70.44%        74.77%                       87.31%         16.87
When testing the accuracy of the material map processing method provided herein, the material maps in the test data set were used as target images; for each material map, 5 pictures were taken under different illumination and from different angles. As shown in Table 4, the Top1, Top2, Top3, and Top4 recall rates denote the recall rate of the material map processing method when K equals 1, 2, 3, and 4, respectively.
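The Top-K recall figures in Table 4 can be computed as follows. This sketch assumes each query image has exactly one correct map in the library; the data below is invented for illustration:

```python
def top_k_recall(retrievals, ground_truth, k):
    """Fraction of queries whose correct map appears in the first k results.

    retrievals: list of ranked result-ID lists, one per query image.
    ground_truth: list of the single correct map ID for each query.
    """
    hits = sum(gt in ranked[:k] for ranked, gt in zip(retrievals, ground_truth))
    return hits / len(ground_truth)

# Example: 4 query images, each whose correct map is "a".
retrievals = [["a", "b", "c"], ["b", "a", "c"], ["c", "a", "b"], ["b", "c", "a"]]
truth = ["a", "a", "a", "a"]
print(top_k_recall(retrievals, truth, 1))  # 0.25
print(top_k_recall(retrievals, truth, 2))  # 0.75
```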
As shown in Table 4, for every recall row, the feature extraction model obtained by training the neural network with the soft label loss function provided herein is more accurate than the basic model (that is, a feature extraction model trained without soft labels). Further, after the N material maps are reordered based on the hierarchical clustering results, the accuracy of the returned K material maps improves again compared with the same method without reordering.
FIG. 11 is a schematic structural diagram of a feature extraction model training apparatus provided in the present application. As shown in FIG. 11, the apparatus includes: an obtaining module 51, a processing module 52, and a training module 53.
The obtaining module 51 is configured to acquire an initial sample data set, where the initial sample data set includes at least one sample data, and each sample data includes a sample texture map and a hard label of the sample texture map; the hard label represents the material type of the sample texture map.
The processing module 52 is configured to convert the hard label of each sample texture map in the initial sample data set into a soft label to obtain a training sample data set. The soft label represents the probability that the sample texture map belongs to each material type, and is associated with the sample hierarchical clustering results of the sample texture maps, which include color clustering results and/or texture clustering results.
And a training module 53, configured to train a neural network model using the training sample data set, to obtain a feature extraction model. The feature extraction model is used for extracting material features of the image.
Take the case where the sample hierarchical clustering results include a first sample clustering result obtained by clustering the sample texture maps based on color, a second sample clustering result obtained by clustering them based on texture, and a third sample clustering result obtained by clustering them based on both color and texture. Optionally, the processing module 52 is specifically configured to obtain the soft label of each sample texture map according to the first, second, and third sample clustering results and the hard label of each sample texture map, and to obtain the training sample data set from each sample texture map and its soft label. The first and second sample clustering results are both first-level sample clustering results in the sample hierarchical clustering results, and the third sample clustering result is a second-level sample clustering result.
Optionally, the processing module 52 is specifically configured to: for each sample texture map, determine an initial soft label of the sample texture map according to its hard label and a preset soft label probability distribution manner; and adjust the probabilities in the initial soft label according to the first, second, and third sample clustering results to obtain the soft label of the sample texture map.
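One plausible reading of this hard-to-soft conversion is label smoothing whose extra probability mass is shifted toward material types that share a cluster with the labelled type. The smoothing factor and the redistribution rule below are our assumptions, not the patent's exact scheme:

```python
import numpy as np

def soft_label(hard_class, num_classes, same_cluster, eps=0.1):
    """Convert a one-hot hard label into a soft label.

    hard_class:   index of the sample map's material type.
    same_cluster: indices of classes sharing a cluster with hard_class
                  (e.g. taken from the color/texture clustering results).
    """
    # Initial soft label: standard label smoothing over all classes.
    label = np.full(num_classes, eps / num_classes)
    label[hard_class] += 1.0 - eps
    # Adjustment: shift extra probability onto same-cluster classes.
    boost = eps / (2 * max(len(same_cluster), 1))
    for c in same_cluster:
        if c != hard_class:
            label[c] += boost
    return label / label.sum()  # renormalise to a valid distribution
```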
Optionally, the obtaining module 51 is further configured to obtain the first sample clustering result and the second sample clustering result, and to obtain the third sample clustering result from the first and second sample clustering results.
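One natural way to derive the third (second-level) sample clustering result from the first two, as this paragraph describes, is to intersect them: two maps share a second-level cluster only when they share both their color cluster and their texture cluster. A small sketch (the function name is ours):

```python
def joint_clusters(color_assign, texture_assign):
    """Second-level cluster ID = unique (color cluster, texture cluster) pair."""
    pairs = {}  # (color, texture) pair -> new cluster ID
    joint = []
    for c, t in zip(color_assign, texture_assign):
        joint.append(pairs.setdefault((c, t), len(pairs)))
    return joint

# Maps 0 and 1 share color and texture clusters; map 2 differs in texture.
print(joint_clusters([0, 0, 0], [1, 1, 2]))  # [0, 0, 1]
```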
The feature extraction model training device provided by the application is used for executing the embodiment of the feature extraction model training method, the implementation principle and the technical effect are similar, and the details are not repeated.
FIG. 12 is a schematic structural diagram of a material map processing apparatus according to the present application. As shown in FIG. 12, the apparatus includes: an acquisition module 61 and a processing module 62.
the acquiring module 61 is configured to acquire a target image.
The processing module 62 is configured to input the target image into a feature extraction model to obtain the material features of the target image, and to acquire K material maps matching the material of the target image according to the material features of the target image, the material features of each material map in the material map library extracted with the feature extraction model, and the hierarchical clustering results of the material maps in the library based on color and texture. The feature extraction model is obtained by any one of the feature extraction model training methods described above, and K is an integer greater than or equal to 1.
Optionally, the processing module 62 is specifically configured to: acquire the similarity between the target image and each material map according to the material features of the target image and of each material map in the material map library; obtain the first N material maps in descending order of similarity; acquire clustering scores of the N material maps according to the hierarchical clustering results of the material maps in the library based on color and texture; and take the first K material maps, in descending order of clustering score, as the material maps matching the material of the target image. N is an integer greater than or equal to 2.
Take the case where the hierarchical clustering results include a first clustering result obtained by clustering the material maps based on color, a second clustering result obtained by clustering them based on texture, and a third clustering result obtained by clustering them based on both color and texture. Optionally, the processing module 62 is specifically configured to: obtain initial clustering scores of the N material maps according to their similarity ranking; obtain first clustering scores of the N material maps from their initial clustering scores and the first clustering result; obtain second clustering scores from the initial clustering scores and the second clustering result; obtain third clustering scores from the initial clustering scores and the third clustering result; and obtain the clustering scores of the N material maps from their initial, first, second, and third clustering scores. The first and second clustering results are both first-level clustering results in the hierarchical clustering results, and the third clustering result is a second-level clustering result.
Optionally, the apparatus may further include an output module 63, configured to output the K material maps after the K material maps matched with the target image material are obtained.
Optionally, the processing module 62 is further configured to determine a target material map from the K material maps after the K material maps matched with the target image material are obtained; and rendering the model frame of the target object by using the target material chartlet to obtain the model of the target object. And the output module 63 is further configured to output the model of the target object.
Optionally, the processing module 62 is further configured to construct a model frame of the target object before the model frame of the target object is rendered by using the target material map to obtain the model of the target object.
The material mapping processing apparatus provided by the present application is used for executing the foregoing embodiment of the material mapping processing method, and the implementation principle and the technical effect thereof are similar and will not be described again.
The present application further provides an electronic device 10 as shown in FIG. 3d, in which a processor 12 reads a set of computer instructions stored in a memory 11 to execute the feature extraction model training method or the material map processing method described above.
The present application also provides a computer-readable storage medium, which may include various media capable of storing program code, such as a USB flash drive, a removable hard disk, a read-only memory (ROM), a random access memory (RAM), a magnetic disk, or an optical disk. In particular, the computer-readable storage medium stores program instructions that are used to implement the methods in the foregoing embodiments.
The present application further provides a program product including execution instructions stored in a readable storage medium. At least one processor of the electronic device may read the execution instructions from the readable storage medium, and executing them causes the electronic device to implement the feature extraction model training method or the material map processing method provided in the various embodiments described above.
Finally, it should be noted that: the above embodiments are only used for illustrating the technical solutions of the present application, and not for limiting the same; although the present application has been described in detail with reference to the foregoing embodiments, it should be understood by those of ordinary skill in the art that: the technical solutions described in the foregoing embodiments may still be modified, or some or all of the technical features may be equivalently replaced; and these modifications or substitutions do not depart from the scope of the technical solutions of the embodiments of the present application.

Claims (14)

1. A method for training a feature extraction model, the method comprising:
obtaining an initial sample data set, wherein the initial sample data set comprises: at least one sample data, each sample data comprising: a sample texture map, and a hard label of the sample texture map; the hard label is used for representing the material type of the sample texture map;
converting a hard label of a sample texture map in the initial sample data set into a soft label to obtain a training sample data set; the soft label is used for representing the probability that the sample texture map belongs to each material type; the soft label is associated with a sample hierarchical clustering result of the sample texture map, the sample hierarchical clustering result comprising: color clustering results and/or texture clustering results;
and training a neural network model by using the training sample data set to obtain a feature extraction model, wherein the feature extraction model is used for extracting the material characteristics of the image.
2. The method of claim 1, wherein the sample hierarchical clustering results comprise: a first sample clustering result obtained by clustering the sample texture maps based on color, a second sample clustering result obtained by clustering the sample texture maps based on texture, and a third sample clustering result obtained by clustering the sample texture maps based on both color and texture; the first sample clustering result and the second sample clustering result are both first-level sample clustering results in the sample hierarchical clustering results, and the third sample clustering result is a second-level sample clustering result in the sample hierarchical clustering results;
converting the hard label of the sample texture mapping in the initial sample data set into a soft label to obtain a training sample data set, including:
obtaining a soft label of each sample texture mapping according to the first sample clustering result, the second sample clustering result, the third sample clustering result and the hard label of each sample texture mapping;
and obtaining the training sample data set by using the texture mapping of each sample and the soft label of the texture mapping of each sample.
3. The method of claim 2, wherein obtaining the soft label of each sample texture map according to the first sample clustering result, the second sample clustering result, the third sample clustering result, and the hard label of each sample texture map comprises:
aiming at each sample material mapping, determining an initial soft label of the sample material mapping according to a hard label of the sample material mapping and a preset soft label probability distribution mode;
and adjusting the probability in the initial soft label of the sample material mapping according to the first sample clustering result, the second sample clustering result and the third sample clustering result to obtain the soft label of the sample material mapping.
4. A method according to claim 2 or 3, characterized in that the method further comprises:
obtaining the first sample clustering result and the second sample clustering result;
and obtaining the third sample clustering result according to the first sample clustering result and the second sample clustering result.
5. A material mapping processing method is characterized by comprising the following steps:
acquiring a target image;
inputting the target image into a feature extraction model to obtain the material feature of the target image; the feature extraction model is obtained by the method of any one of claims 1 to 4;
acquiring K material maps matched with the material of the target image according to the material characteristics of the target image, the material characteristics of each material map in a material map library extracted by using the characteristic extraction model and the hierarchical clustering result of the material maps in the material map library based on the color and the texture; and K is an integer greater than or equal to 1.
6. The method according to claim 5, wherein the acquiring K material maps matching the material of the target image according to the material features of the target image, the material features of each material map in the material map library extracted by using the feature extraction model, and the hierarchical clustering result of the material maps in the material map library based on color and texture comprises:
acquiring the similarity of the target image and each material map according to the material characteristics of the target image and the material characteristics of each material map in the material map library;
obtaining the first N material maps in descending order of the similarity; N is an integer greater than or equal to 2;
acquiring clustering scores of N material maps according to the hierarchical clustering result of the material maps in the material map library based on colors and textures;
and acquiring the first K material maps, in descending order of the clustering scores, as the material maps matching the material of the target image.
7. The method of claim 6, wherein the hierarchical clustering results comprise: clustering the texture maps based on colors to obtain a first clustering result, clustering the texture maps based on textures to obtain a second clustering result, and clustering the texture maps based on colors and textures to obtain a third clustering result; the first clustering result and the second clustering result are both first-level clustering results in the hierarchical clustering results, and the third clustering result is a second-level clustering result in the hierarchical clustering results;
the obtaining of the clustering scores of the N texture maps according to the hierarchical clustering results of the texture maps in the texture map library based on colors and textures comprises:
obtaining initial clustering scores of the N material maps according to the similarity ranking of the N material maps;
obtaining first clustering scores of the N material maps according to the initial clustering scores of the N material maps and the first clustering result;
obtaining second clustering scores of the N material maps according to the initial clustering scores of the N material maps and the second clustering result;
obtaining a third clustering score of the N material maps according to the initial clustering scores of the N material maps and the third clustering result;
and obtaining the clustering scores of the N material maps according to the initial clustering score, the first clustering score, the second clustering score and the third clustering score of the N material maps.
8. The method according to any one of claims 5-7, wherein after obtaining K material maps matching the target image material, the method further comprises:
and outputting the K material maps.
9. The method according to any one of claims 5-7, wherein after obtaining K material maps matching the target image material, the method further comprises:
determining a target material map from the K material maps;
rendering a model frame of a target object by using the target material chartlet to obtain a model of the target object;
and outputting the model of the target object.
10. The method of claim 9, wherein before the rendering the model frame of the target object using the target material map to obtain the model of the target object, the method further comprises:
and constructing a model framework of the target object.
11. A feature extraction model training apparatus, characterized in that the apparatus comprises:
an obtaining module, configured to obtain an initial sample data set, where the initial sample data set includes: at least one sample data, each sample data comprising: a sample texture map, and a hard label for the sample texture map; the hard label is used for representing the material type of the sample material chartlet;
the processing module is used for converting a hard label of the sample texture map in the initial sample data set into a soft label to obtain a training sample data set; the soft label is used for representing the probability that the sample texture map belongs to each material type; the soft label is associated with a sample hierarchical clustering result of the sample texture map, the sample hierarchical clustering result comprising: color clustering results and/or texture clustering results;
and the training module is used for training a neural network model by using the training sample data set to obtain a feature extraction model, and the feature extraction model is used for extracting the material characteristics of the image.
12. A material charting apparatus, comprising:
the acquisition module is used for acquiring a target image;
the processing module is used for inputting the target image into a feature extraction model to obtain the material features of the target image; acquiring K material maps matched with the material of the target image according to the material characteristics of the target image, the material characteristics of each material map in a material map library extracted by using the characteristic extraction model and the hierarchical clustering result of the material maps in the material map library based on the color and the texture; the feature extraction model is obtained by the method according to any one of claims 1 to 4; and K is an integer greater than or equal to 1.
13. An electronic device, comprising a memory and a processor, the memory configured to store a set of computer instructions;
the processor executes a set of computer instructions stored by the memory to perform the method of any of claims 1 to 10.
14. A computer-readable storage medium having computer-executable instructions stored thereon which, when executed by a processor, implement the method of any one of claims 1-10.
CN202210524910.4A 2022-05-13 2022-05-13 Feature extraction model training method, material chartlet processing method, device and electronic equipment Pending CN114926832A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210524910.4A CN114926832A (en) 2022-05-13 2022-05-13 Feature extraction model training method, material chartlet processing method, device and electronic equipment

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202210524910.4A CN114926832A (en) 2022-05-13 2022-05-13 Feature extraction model training method, material chartlet processing method, device and electronic equipment

Publications (1)

Publication Number Publication Date
CN114926832A true CN114926832A (en) 2022-08-19

Family

ID=82807757

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210524910.4A Pending CN114926832A (en) 2022-05-13 2022-05-13 Feature extraction model training method, material chartlet processing method, device and electronic equipment

Country Status (1)

Country Link
CN (1) CN114926832A (en)


Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN116434016A (en) * 2023-06-13 2023-07-14 苏州浪潮智能科技有限公司 Image information enhancement method, model training method, device, equipment and medium
CN116434016B (en) * 2023-06-13 2023-08-22 苏州浪潮智能科技有限公司 Image information enhancement method, model training method, device, equipment and medium

Similar Documents

Publication Publication Date Title
CN105354307B (en) Image content identification method and device
CN110569322A (en) Address information analysis method, device and system and data acquisition method
TW202139183A (en) Method of detecting object based on artificial intelligence, device, equipment and computer-readable storage medium
CN108875537B (en) Object detection method, device and system and storage medium
CN111159563B (en) Method, device, equipment and storage medium for determining user interest point information
CN112948951B (en) Building model creating method and device and processing server
CN112069319A (en) Text extraction method and device, computer equipment and readable storage medium
CN109582813A (en) A kind of search method, device, equipment and the storage medium of historical relic showpiece
CN113408570A (en) Image category identification method and device based on model distillation, storage medium and terminal
CN110197207A (en) To not sorting out the method and relevant apparatus that user group is sorted out
CN114127785A (en) Point cloud completion method, network training method, device, equipment and storage medium
CN110909817B (en) Distributed clustering method and system, processor, electronic device and storage medium
CN114926832A (en) Feature extraction model training method, material chartlet processing method, device and electronic equipment
CN111563207B (en) Search result sorting method and device, storage medium and computer equipment
CN110674388A (en) Mapping method and device for push item, storage medium and terminal equipment
CN112966756A (en) Visual access rule generation method and device, machine readable medium and equipment
CN110717405A (en) Face feature point positioning method, device, medium and electronic equipment
CN113591881B (en) Intention recognition method and device based on model fusion, electronic equipment and medium
CN116977668A (en) Image recognition method, device, computer equipment and computer storage medium
CN111177450B (en) Image retrieval cloud identification method and system and computer readable storage medium
CN111639260B (en) Content recommendation method, content recommendation device and storage medium
CN110909193B (en) Image ordering display method, system, device and storage medium
CN114443864A (en) Cross-modal data matching method and device and computer program product
JP2023510945A (en) Scene identification method and apparatus, intelligent device, storage medium and computer program
CN113704528A (en) Clustering center determination method, device and equipment and computer storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination