CN110378423A - Feature extracting method, device, computer equipment and storage medium - Google Patents
- Publication number
- CN110378423A (application CN201910663371.0A)
- Authority
- CN
- China
- Prior art keywords
- characteristic
- matrix
- data
- mode
- value
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F18/00—Pattern recognition
- G06F18/20—Analysing
- G06F18/21—Design or setup of recognition systems or techniques; Extraction of features in feature space; Blind source separation
- G06F18/213—Feature extraction, e.g. by transforming the feature space; Summarisation; Mappings, e.g. subspace methods
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F18/00—Pattern recognition
- G06F18/20—Analysing
- G06F18/23—Clustering techniques
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F18/00—Pattern recognition
- G06F18/20—Analysing
- G06F18/23—Clustering techniques
- G06F18/232—Non-hierarchical techniques
- G06F18/2321—Non-hierarchical techniques using statistics or function optimisation, e.g. modelling of probability density functions
- G06F18/23213—Non-hierarchical techniques using statistics or function optimisation, e.g. modelling of probability density functions with fixed number of clusters, e.g. K-means clustering
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F18/00—Pattern recognition
- G06F18/20—Analysing
- G06F18/25—Fusion techniques
- G06F18/253—Fusion techniques of extracted features
Landscapes
- Engineering & Computer Science (AREA)
- Data Mining & Analysis (AREA)
- Theoretical Computer Science (AREA)
- Physics & Mathematics (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Life Sciences & Earth Sciences (AREA)
- Bioinformatics & Computational Biology (AREA)
- Evolutionary Biology (AREA)
- Evolutionary Computation (AREA)
- Bioinformatics & Cheminformatics (AREA)
- General Engineering & Computer Science (AREA)
- General Physics & Mathematics (AREA)
- Artificial Intelligence (AREA)
- Probability & Statistics with Applications (AREA)
- Information Retrieval, Db Structures And Fs Structures Therefor (AREA)
Abstract
This application relates to a feature extraction method, apparatus, computer device and storage medium. The method includes: obtaining multi-modal original feature data of a target object; performing mapping dimensionality reduction on the original feature data of each modality, decomposing it into shared feature data represented by variables and private feature data of each modality represented by variables; reconstructing the feature data of each modality according to the shared feature data and each set of private feature data; determining the difference between the original feature data of each modality and the reconstructed feature data; minimizing the difference by adjusting the values of the variable-represented shared feature data and the variable-represented private feature data; and obtaining the value of the shared feature data when the difference is minimized, as the finally fused shared feature of the target object. The scheme of this application can improve the accuracy of feature extraction.
Description
Technical field
The present invention relates to the field of computer technology, and in particular to a feature extraction method, apparatus, computer device and storage medium.
Background
With the rapid development of science and technology, many advanced technologies continue to emerge, and feature extraction is one of the important ones. In practice, data are usually collected from multiple data sources or data channels and therefore appear in multiple modalities. In general, different modalities reflect different characteristics of the data from different aspects, so they can provide complementary information to one another and jointly support fused learning of the data. Multi-modal feature fusion has therefore received wide attention in feature extraction.
A conventional approach is to directly concatenate the feature vectors of the individual modalities to form the feature vector of an object. This approach ignores the different statistical properties of the data in different modalities, so the directly concatenated feature vector cannot accurately express the features of the target object.
Summary of the invention
Accordingly, in view of the insufficiently accurate features produced by conventional methods, it is necessary to provide a feature extraction method, apparatus, computer device and storage medium.
A feature extraction method, the method comprising:
obtaining multi-modal original feature data of a target object;
performing mapping dimensionality reduction on the original feature data of each modality, and decomposing it into shared feature data represented by variables and private feature data of each modality represented by variables;
reconstructing the feature data of each modality according to the shared feature data and each set of private feature data;
determining the difference between the original feature data of each modality and the reconstructed feature data;
minimizing the difference by adjusting the values of the variable-represented shared feature data and the variable-represented private feature data; and
obtaining the value of the shared feature data when the difference is minimized, as the finally fused shared feature of the target object.
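As an illustrative sketch (not part of the claims), the steps above can be written as a small NumPy optimization, assuming a linear mapping and a squared Frobenius norm as the difference; the patent leaves both open. Each modality's data X_v is approximated by a shared part U[v] @ S plus a private part V[v] @ P[v], and gradient steps on all variable matrices shrink the reconstruction difference. The names U, V, S, P and the gradient-descent solver are assumptions for illustration only.

```python
import numpy as np

def extract_shared_feature(X_list, k_shared=2, k_private=2,
                           n_iter=500, lr=1e-3, seed=0):
    """Sketch of the claimed steps: decompose each modality into a shared
    part U[v] @ S and a private part V[v] @ P[v], then minimize the total
    reconstruction difference by adjusting the variable matrices."""
    rng = np.random.default_rng(seed)
    n = X_list[0].shape[1]                            # number of samples
    U = [rng.random((X.shape[0], k_shared)) for X in X_list]   # shared mappings
    V = [rng.random((X.shape[0], k_private)) for X in X_list]  # private mappings
    S = rng.random((k_shared, n))                     # shared feature variable matrix
    P = [rng.random((k_private, n)) for _ in X_list]  # private feature matrices
    losses = []
    for _ in range(n_iter):
        for v, X in enumerate(X_list):
            R = U[v] @ S + V[v] @ P[v] - X            # reconstruction residual
            U[v] -= lr * R @ S.T                      # gradient step on each
            V[v] -= lr * R @ P[v].T                   # variable matrix in turn
            P[v] -= lr * V[v].T @ R
            S = S - lr * U[v].T @ R
        losses.append(sum(np.linalg.norm(U[v] @ S + V[v] @ P[v] - X, 'fro') ** 2
                          for v, X in enumerate(X_list)))
    return S, losses
```

When the loss stops decreasing, the value of S is taken as the finally fused shared feature of the target objects.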
In one embodiment, determining the difference between the original feature data of each modality and the reconstructed feature data includes: constructing a machine learning model whose objective function characterizes the difference between the original feature data of each modality and the reconstructed feature data. Minimizing the difference by adjusting the values of the variable-represented shared feature data and the variable-represented private feature data then includes: minimizing the value of the objective function of the machine learning model by iterative training, updating the values of the shared feature data and of each set of private feature data in every iteration round until an iteration stop condition is met; and obtaining the value of the shared feature data when the iteration stop condition is met, as the value of the shared feature data when the difference is minimized.
In one embodiment, the method further includes: determining a local invariance regularization factor of each modality. The local invariance regularization factor is used to adjust the consistency between a first similar structure and a second similar structure, where the first similar structure is the similarity structure among the original feature data of each modality, and the second similar structure is the similarity structure among the sub-shared feature data obtained after mapping dimensionality reduction of each piece of original feature data in each modality. Constructing the machine learning model then includes: constructing a basic machine learning model whose objective function characterizes the difference between the original feature data of each modality and the reconstructed feature data; and combining the basic machine learning model with the local invariance regularization factor of each modality to obtain the machine learning model.
In one embodiment, determining the local invariance regularization factor of each modality includes: for each modality, obtaining a data weight matrix of the modality, where each weight in the data weight matrix characterizes how far apart two pieces of the modality's original feature data are; determining the similarity between the sub-shared feature data obtained after mapping dimensionality reduction of each pair of original feature data; and coupling the data weight matrix with the similarity to obtain the local invariance regularization factor of each modality.
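One standard way to couple a data weight matrix with the pairwise similarity of the mapped sub-shared features, assumed here since the patent does not fix the formula, is the graph-Laplacian form tr(H L Hᵀ) = ½ Σᵢⱼ Wᵢⱼ ‖hᵢ − hⱼ‖², which is small exactly when samples that are close in the original space stay close after mapping:

```python
import numpy as np

def local_invariance_term(H, W):
    """Graph-Laplacian coupling of the data weight matrix W (symmetric,
    one entry per pair of original feature data) with the mapped
    sub-shared features H (one column per sample)."""
    L = np.diag(W.sum(axis=1)) - W      # graph Laplacian of the weight matrix
    return np.trace(H @ L @ H.T)        # = 0.5 * sum_ij W_ij * ||h_i - h_j||^2
```

Adding this scalar to the basic objective penalizes mappings that break the local similarity structure of each modality.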
In one embodiment, obtaining the data weight matrix of each modality includes: for each modality, constructing a nearest-neighbor graph with each piece of the modality's original feature data as a vertex; determining the weight of each edge in the nearest-neighbor graph according to the distance between the vertices of that edge; and constructing, from the weights of the edges, the data weight matrix of the modality in the original data space.
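A minimal sketch of this construction, assuming Euclidean distance and heat-kernel edge weights (the patent only requires that each weight characterize how far apart two pieces of data are):

```python
import numpy as np

def knn_weight_matrix(X, k=3, sigma=1.0):
    """Nearest-neighbor graph for one modality: each column of X (one
    piece of original feature data) is a vertex; each vertex is linked to
    its k nearest neighbors, and each edge is weighted by distance."""
    n = X.shape[1]
    diff = X[:, :, None] - X[:, None, :]
    dist = np.sqrt((diff ** 2).sum(axis=0))          # pairwise distances
    W = np.zeros((n, n))
    for i in range(n):
        nbrs = np.argsort(dist[i])[1:k + 1]          # skip the vertex itself
        W[i, nbrs] = np.exp(-dist[i, nbrs] ** 2 / (2 * sigma ** 2))
    return np.maximum(W, W.T)                        # symmetric weight matrix
```

Symmetrizing with the elementwise maximum keeps an edge whenever either endpoint counts the other among its k nearest neighbors.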
In one embodiment, performing mapping dimensionality reduction on the original feature data of each modality and decomposing it into variable-represented shared feature data and variable-represented private feature data of each modality includes: obtaining the shared mapping matrix and the private mapping matrix corresponding to each modality; mapping and decomposing the original feature data of each modality according to the corresponding shared mapping matrix to obtain a shared feature variable matrix; and mapping and decomposing the original feature data of each modality according to the corresponding private mapping matrix to obtain a private feature variable matrix.
In one embodiment, reconstructing the feature data of each modality according to the shared feature data and each set of private feature data includes: performing inverse mapping conversion of the shared feature variable matrix according to the shared mapping matrix corresponding to each modality to obtain a first inverse mapping result; performing inverse mapping conversion of each private feature variable matrix according to the corresponding private mapping matrix to obtain a second inverse mapping result; and reconstructing the feature data of each modality according to the first and second inverse mapping results.
In one embodiment, the shared mapping matrix is a shared mapping variable matrix and the private mapping matrix is a private mapping variable matrix. Updating the values of the shared feature data and of each set of private feature data in every iteration round until the iteration stop condition is met then includes: in every iteration round, updating the value of the shared feature variable matrix, the value of each private feature variable matrix, the value of each shared mapping variable matrix and the value of each private mapping variable matrix, until the iteration stop condition is met.
In one embodiment, updating these values in every iteration round includes: in every round, selecting a current variable matrix in turn from among the shared feature variable matrix, the private feature variable matrices, the shared mapping variable matrices and the private mapping variable matrices; keeping the values of the non-current variable matrices at their last updated values while optimizing and updating the value of the current variable matrix; and then selecting the next current variable matrix from among them to continue the optimization and update processing.
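The round-robin scheme above is ordinary block-coordinate optimization. A generic sketch, assuming a gradient step stands in for the "optimize and update" of the current variable matrix (the patent does not fix the per-block solver):

```python
import numpy as np

def alternating_update(mats, grad_fns, lr=0.1, n_rounds=100):
    """Each round, pick each variable matrix in turn as the current one,
    hold every other matrix at its last updated value, and take one
    optimization (here: gradient) step on the current matrix only."""
    for name in mats:
        assert name in grad_fns             # one gradient function per block
    for _ in range(n_rounds):
        for name in mats:                   # next current variable matrix
            mats[name] = mats[name] - lr * grad_fns[name](mats)
    return mats
```

For the model of this application, mats would hold the shared feature, private feature, shared mapping and private mapping variable matrices, and each grad_fns entry would differentiate the objective with the other blocks held fixed.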
In one embodiment, obtaining the multi-modal original feature data of the target object includes: obtaining multi-modal preset feature data of the target object, the preset feature data being nonnegative; and normalizing the preset feature data to obtain the multi-modal original feature data.
In one embodiment, the method further includes: obtaining the finally fused shared features of multiple target objects; clustering the multiple target objects according to the shared features; and processing the target objects accordingly based on the clustering result.
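A sketch of the clustering step, using a deliberately simplified, deterministic k-means (initial centers spread over the sample index range; a real implementation would use a library routine). Each row of F is one target object's finally fused shared feature:

```python
import numpy as np

def cluster_shared_features(F, k, n_iter=50):
    """Cluster target objects by their finally fused shared features.
    F: (n_objects, feature_dim); returns one cluster label per object."""
    idx = np.linspace(0, len(F) - 1, k).astype(int)
    centers = F[idx].astype(float)               # simple deterministic init
    labels = np.zeros(len(F), dtype=int)
    for _ in range(n_iter):
        dists = ((F[:, None, :] - centers[None]) ** 2).sum(axis=2)
        labels = dists.argmin(axis=1)            # assign to nearest center
        for j in range(k):
            if np.any(labels == j):              # recompute non-empty centers
                centers[j] = F[labels == j].mean(axis=0)
    return labels
```

The resulting labels can then drive whatever downstream processing the application performs on each group of target objects.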
A feature extraction apparatus, the apparatus comprising: an obtaining module for obtaining multi-modal original feature data of a target object; a mapping module for performing mapping dimensionality reduction on the original feature data of each modality and decomposing it into variable-represented shared feature data and variable-represented private feature data of each modality; a reconstruction module for reconstructing the feature data of each modality according to the shared feature data and each set of private feature data; and a fusion module for determining the difference between the original feature data of each modality and the reconstructed feature data, minimizing the difference by adjusting the values of the variable-represented shared feature data and the variable-represented private feature data, and obtaining the value of the shared feature data when the difference is minimized, as the finally fused shared feature of the target object.
A computer device, comprising a memory and a processor, the memory storing a computer program that, when executed by the processor, causes the processor to perform the following steps: obtaining multi-modal original feature data of a target object; performing mapping dimensionality reduction on the original feature data of each modality and decomposing it into variable-represented shared feature data and variable-represented private feature data of each modality; reconstructing the feature data of each modality according to the shared feature data and each set of private feature data; determining the difference between the original feature data of each modality and the reconstructed feature data; minimizing the difference by adjusting the values of the variable-represented shared feature data and the variable-represented private feature data; and obtaining the value of the shared feature data when the difference is minimized, as the finally fused shared feature of the target object.
A computer-readable storage medium, storing a computer program that, when executed by a processor, causes the processor to perform the following steps: obtaining multi-modal original feature data of a target object; performing mapping dimensionality reduction on the original feature data of each modality and decomposing it into variable-represented shared feature data and variable-represented private feature data of each modality; reconstructing the feature data of each modality according to the shared feature data and each set of private feature data; determining the difference between the original feature data of each modality and the reconstructed feature data; minimizing the difference by adjusting the values of the variable-represented shared feature data and the variable-represented private feature data; and obtaining the value of the shared feature data when the difference is minimized, as the finally fused shared feature of the target object.
With the above feature extraction method, apparatus, computer device and storage medium, the original feature data of each modality is first decomposed, through mapping dimensionality reduction, into shared feature data represented by variables and private feature data of each modality represented by variables. The feature data of each modality is then reconstructed from the shared feature data and each set of private feature data, and the difference between the original feature data of each modality and the reconstructed feature data is determined. This difference is minimized by adjusting the values of the variable-represented shared and private feature data. The smaller the difference, the more accurate the shared and private feature data obtained by mapping dimensionality reduction of each modality's original feature data. In effect, the shared and private feature data are gradually disentangled while the difference is being minimized, so the value of the shared feature data when the difference is minimized is the most accurate shared feature data: the shared feature from which the private feature data has largely been removed. This improves the accuracy with which the shared feature of the target object is extracted.
Brief description of the drawings
Fig. 1 is an application scenario diagram of the feature extraction method in one embodiment;
Fig. 2 is a flow diagram of the feature extraction method in one embodiment;
Fig. 3 is a structural diagram of constructing a machine learning model in one embodiment;
Fig. 4 is a flow diagram of the feature extraction method in another embodiment;
Fig. 5 to Fig. 7 are effect comparison diagrams in one embodiment;
Fig. 8 to Fig. 11 are diagrams of the selection of regularization control parameters in one embodiment;
Fig. 12 is a block diagram of the feature extraction apparatus in one embodiment;
Fig. 13 is a block diagram of the feature extraction apparatus in another embodiment;
Fig. 14 is a block diagram of the computer device in one embodiment;
Fig. 15 is a block diagram of the computer device in another embodiment.
Detailed description of the embodiments
In order to make the objectives, technical solutions and advantages of the present invention clearer, the present invention is further described below with reference to the accompanying drawings and embodiments. It should be understood that the specific embodiments described herein are merely illustrative of the present invention and are not intended to limit it.
Fig. 1 is an application scenario diagram of the feature extraction method in one embodiment. Referring to Fig. 1, the application scenario includes a terminal 110 and a server 120 connected through a network. The terminal 110 may be a smart television, a smart speaker, a desktop computer or a mobile terminal; the mobile terminal may include at least one of a mobile phone, a tablet computer, a laptop, a personal digital assistant and a wearable device. The server 120 may be implemented as an independent server or as a server cluster composed of multiple physical servers.
The terminal 110 may collect the multi-modal original feature data of a target object and upload it to the server 120. The server 120 may perform the following processing to fuse the multi-modal original feature data and obtain the finally fused shared feature of the target object.
Specifically, the server 120 may obtain the multi-modal original feature data of the target object; perform mapping dimensionality reduction on the original feature data of each modality and decompose it into variable-represented shared feature data and variable-represented private feature data of each modality; reconstruct the feature data of each modality according to the shared feature data and each set of private feature data; determine the difference between the original feature data of each modality and the reconstructed feature data; adjust the values of the shared feature data and the private feature data to minimize the difference; and obtain the value of the shared feature data when the difference is minimized, yielding the finally fused shared feature of the target object.
Fig. 2 is a flow diagram of the feature extraction method in one embodiment. The feature extraction method in this embodiment can be applied to a computer device, which may be a terminal or a server; the description below mainly takes the server 120 in Fig. 1 as the computer device. Referring to Fig. 2, the method specifically includes the following steps.
S202: obtain the multi-modal original feature data of a target object.
Here, the target object is the object whose data is to be fused to obtain a final shared feature. Multi-modal means from multiple fields or viewing angles. Original feature data is feature data that has not yet undergone mapping dimensionality reduction. Multi-modal original feature data refers to the set of original feature data obtained for the same target object from different fields or viewing angles.
For ease of understanding, multi-modal original feature data is illustrated here. For example, for the same person, frontal-face feature data and profile-face feature data belong to different modalities: the frontal-face feature data belongs to one modality and the profile-face feature data to another, so the multi-modal original feature data of this person may include both the frontal-face and the profile-face feature data. As another example, for a piece of social information containing both text and a picture, the text belongs to the text modality and the picture to the picture modality, so the multi-modal original feature data of this piece of social information may include the feature data of both the text modality and the picture modality.
It can be understood that the computer device may directly obtain the locally stored multi-modal original feature data of the target object, or may receive the multi-modal original feature data of the target object reported by a terminal.
In one embodiment, the original feature data may be a nonnegative data matrix. It can be understood that the original feature data may also take data forms other than a matrix.
In one embodiment, step S202 includes: obtaining a set of multi-modal preset feature data of the target object, the preset feature data being nonnegative; and normalizing the preset feature data to obtain the multi-modal original feature data.
Here, preset feature data is feature data set in advance. The set of multi-modal preset feature data refers to the set of preset feature data obtained for the same target object from different fields or viewing angles. It can be understood that the preset feature data of one modality may include at least one piece of preset feature data.
Specifically, the computer device may normalize the preset feature data of each modality, mapping the preset feature data into the interval [0, 1] to obtain the multi-modal original feature data.
For example, given multi-modal preset feature data {X^(v)}_{v=1}^{V}, where V is the total number of modalities, X^(v) ∈ R_+^{M_v × N} denotes the preset feature matrix of the v-th modality with M_v-dimensional features, N is the total number of preset feature data, and R_+ denotes real matrices with a nonnegativity constraint, the computer device may use the min-max standardization method to map the preset feature data of each modality into the interval [0, 1].
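A sketch of this normalization step, assuming min-max standardization is applied per feature dimension (row) of each modality's matrix X^(v); rows with zero range are mapped to 0 to avoid division by zero:

```python
import numpy as np

def minmax_normalize(X):
    """Map a nonnegative preset feature matrix X (features x samples)
    into the interval [0, 1] by per-row min-max standardization."""
    lo = X.min(axis=1, keepdims=True)
    hi = X.max(axis=1, keepdims=True)
    span = np.where(hi > lo, hi - lo, 1.0)   # guard constant feature rows
    return (X - lo) / span
```

Applying this to each X^(v) yields the multi-modal original feature data used by the subsequent steps.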
S204: perform mapping dimensionality reduction on the original feature data of each modality, and decompose it into variable-represented shared feature data and variable-represented private feature data of each modality.
Here, shared feature data refers to feature data that is correlated across modalities, i.e. feature data that all modalities possess, described from different aspects. Private feature data refers to feature data that is uncorrelated across modalities, i.e. feature data that only each modality itself possesses.
Specifically, the computer device may perform mapping dimensionality reduction on the original feature data of each modality so as to decompose and convert it, through nonnegative matrix factorization, into shared feature data represented by variables and private feature data represented by variables.
In one embodiment, since each modality contains multiple pieces of original feature data, the computer device may map each piece of original feature data of the same modality separately. Each piece of original feature data is then mapped to sub-shared feature data, and the final shared feature data of the modality is obtained by fusing the sub-shared feature data to which the multiple pieces of original feature data are mapped.
It can be understood that the variable-represented shared feature data is not a specific fixed datum but an unknown quantity represented by variables. Similarly, the variable-represented private feature data is not specific data but unknown data.
S206: reconstruct the feature data of each modality according to the shared feature data and the private feature data.
Specifically, the computer device may reconstruct the feature data of each modality from the variable-represented shared feature data and each modality's variable-represented private feature data.
It can be understood that, since the reconstructed feature data of each modality is obtained from the shared and private feature data, it is likewise not specific data but unknown data represented by variables.
S208: determine the difference between the original feature data of each modality and the reconstructed feature data.
Specifically, the computer device may determine a function characterizing the difference between the original feature data of each modality and the reconstructed feature data.
It can be understood that, since the reconstructed feature data is unknown data represented by variables, the difference between each modality's original feature data and the reconstructed feature data is not a specific value but a function represented by variables.
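Assuming a linear reconstruction U[v] @ S + V[v] @ P[v] for each modality v (the matrix names and the squared Frobenius norm are illustrative assumptions; the patent does not fix the functional form), the difference can be characterized as one scalar function of the variable matrices:

```python
import numpy as np

def reconstruction_difference(X_list, U, S, V, P):
    """Scalar difference between each modality's original feature matrix
    X_list[v] and its reconstruction U[v] @ S + V[v] @ P[v], summed over
    modalities as squared Frobenius norms."""
    return sum(np.linalg.norm(X_list[v] - U[v] @ S - V[v] @ P[v], 'fro') ** 2
               for v in range(len(X_list)))
```

The value is zero exactly when every modality is reconstructed perfectly, which is the quantity step S210 drives toward its minimum.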
S210: minimize the difference by adjusting the values of the variable-represented shared feature data and the variable-represented private feature data.
It can be understood that when the difference between each modality's original feature data and the reconstructed feature data is smaller, the reconstructed feature data of each modality is closer to the original feature data, which indicates that the shared and private feature data obtained by mapping dimensionality reduction are more accurate. Therefore, the values of the shared feature data and the private feature data can be adjusted so that the function characterizing the difference is minimized, thereby approaching more accurate shared and private feature data.
S212: obtain the value of the shared feature data when the difference is minimized, yielding the finally fused shared feature of the target object.
Here, the finally fused shared feature of the target object is the feature that the multiple modalities of the target object jointly possess, described from multiple dimensions.
It can be understood that, since a smaller difference indicates that the shared feature data obtained by mapping dimensionality reduction of each modality's original feature data is more accurate, the computer device can take the value of the shared feature data when the difference is minimized as the most accurate shared feature data, i.e. the finally fused shared feature of the target object.
Likewise, the value of each modality's private feature data when the difference is minimized is the most accurate private feature data. However, since the main purpose in the embodiments of this application is to obtain the finally fused shared feature of the target object, a step of obtaining the final private feature data of each modality is not required.
In the above feature extraction method, the original feature data of each modality is first decomposed, through mapping dimensionality reduction, into variable-represented shared feature data and variable-represented private feature data of each modality. The feature data of each modality is then reconstructed from the shared feature data and each set of private feature data, and the difference between each modality's original feature data and the reconstructed feature data is determined. This difference is minimized by adjusting the values of the variable-represented shared and private feature data. Since a smaller difference indicates that the shared and private feature data obtained by mapping dimensionality reduction are more accurate, the shared and private feature data are in effect gradually disentangled during the minimization, and the value of the shared feature data when the difference is minimized is the most accurate shared feature data: the shared feature from which the private feature data has largely been removed. This improves the accuracy with which the shared feature of the target object is extracted.
In one embodiment, step S208 includes: building machine learning model;The target letter of the machine learning model
Number, for characterizing the difference between the initial characteristic data of each mode and the characteristic of reconstruct.Step S210 includes: logical
The value that repetitive exercise minimizes the objective function of the machine learning model is crossed, and in every wheel iteration, updates the shared spy
The value of data and the value of each privately owned characteristic are levied, until meeting iteration stopping condition;It obtains when meeting iteration stopping condition
Sharing feature data value, obtain the difference minimize when sharing feature data value.
Specifically, the computer device may construct a machine learning model whose objective function characterizes the difference between the initial characteristic data of each mode and the reconstructed characteristic data.

It should be noted that the objective function of the machine learning model may consist only of the function characterizing the difference between the initial characteristic data of each mode and the reconstructed characteristic data, or it may further include other regularization terms in addition to that function, for example a function representing a local invariant regularization factor.
The computer device may minimize the value of the objective function of the machine learning model through iterative training, and in every round of iteration update the value of the shared characteristic data and the value of each piece of private characteristic data until the iteration stop condition is met. The computer device may then obtain the value of the shared characteristic data when the iteration stop condition is met, i.e., the value of the shared characteristic data when the difference is minimized.

It can be understood that, since both the value of the shared characteristic data and the value of each piece of private characteristic data are updated during the iterative training, the shared features and the private features are jointly learned and trained. The machine learning model is therefore equivalent to a joint learning model, i.e., a model that performs learning and training on the shared features and the private features together.
In one embodiment, in every round of iteration, the computer device may update the values of the variables in the shared characteristic data, thereby updating the value of the shared characteristic data, and update the values of the variables in the private characteristic data, thereby updating the value of the private characteristic data. It can be understood that, in this case, the variables in the shared characteristic data and the variables in the private characteristic data are equivalent to model parameters, and every round of iteration is equivalent to an update of the model parameters.
The iteration stop condition refers to the condition for stopping the iterative training. In one embodiment, the iteration stop condition includes the number of training iterations reaching a preset count threshold. In another embodiment, the iteration stop condition includes the difference between the value of the objective function in the current round and the value of the objective function in the previous round being less than a preset threshold. In one embodiment, the preset threshold may be 10⁻⁴. It can be understood that, in other embodiments, the preset threshold may also take other values, which is not limited here.
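The two stop conditions above can be sketched as a simple training loop. This is a minimal illustration with hypothetical function names: `update_fn` stands in for one round of updating the shared and private values, and `objective_fn` for evaluating the objective function.

```python
def train(update_fn, objective_fn, max_rounds=500, tol=1e-4):
    """Iterate until the round limit is reached or the objective
    changes by less than tol between two consecutive rounds."""
    prev = objective_fn()
    cur = prev
    round_no = 0
    for round_no in range(1, max_rounds + 1):
        update_fn()                     # one round: update shared and private values
        cur = objective_fn()
        if abs(prev - cur) < tol:       # objective barely changed between rounds
            break
        prev = cur
    return round_no, cur
```

Either condition alone ends the training: the round counter guards against non-convergence, while the objective-difference test stops early once further rounds no longer help.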
In the above embodiment, a machine learning model characterizing the difference between the initial characteristic data of each mode and the reconstructed characteristic data is constructed, the objective function of the machine learning model is minimized through iterative training, and the value of the shared characteristic data and the value of each piece of private characteristic data are updated in every round of iteration until the iteration stop condition is met. This converts the problem of minimizing the difference into a model convergence training process, so that the minimum value of the objective function can be determined accurately and rapidly, and hence the value of the shared characteristic data when the difference is minimized can be determined quickly. The finally fused shared feature of the target object can therefore be determined more quickly and accurately.
In one embodiment, the method further includes: determining a local invariant regularization factor of each mode. In this embodiment, constructing the machine learning model includes: constructing a basic machine learning model, where the objective function of the basic machine learning model characterizes the difference between the initial characteristic data of each mode and the reconstructed characteristic data; and combining the basic machine learning model with the local invariant regularization factor of each mode to obtain the machine learning model.
Specifically, the computer device may perform mapping and dimensionality reduction on the initial characteristic data of each mode to obtain the shared characteristic data.

The computer device may determine the local invariant regularization factor of each mode. The local invariant regularization factor is used to adjust the consistency between a first similar structure and a second similar structure. The first similar structure refers to the similar structure among the initial characteristic data of each mode; the second similar structure refers to the similar structure among the sub shared-characteristic data obtained after each piece of initial characteristic data of each mode is mapped and dimension-reduced. That is, the local invariant regularization factor ensures that the similar structure among the initial characteristic data of each mode is consistent with the similar structure among the sub shared-characteristic data obtained after the mapping and dimensionality reduction.
It can be understood that, assuming two initial characteristic data examples x_i^(v) and x_j^(v) in mode v are relatively close in the original data space, then their low-dimensional representations V_Ci and V_Cj in the mapped, dimension-reduced data space should also be as close as possible. Only on the premise that the geometric structure among the data in the mapped, dimension-reduced data space is consistent with the geometric structure among the data in the original data space can feature extraction be performed more accurately and the robustness of the fused feature be improved.
Therefore, the computer device may combine the basic machine learning model with the local invariant regularization factor of each mode to obtain the machine learning model. In this way, the obtained machine learning model can, through the local invariant regularization factor of each mode, adjust during the iterative training the consistency between the geometric structure among the data in the mapped, dimension-reduced data space and the geometric structure among the data in the original data space, thereby improving the robustness of the fused feature and making the finally fused shared feature of the target object more accurate.
In one embodiment, determining the local invariant regularization factor of each mode includes: for each mode, obtaining a data weight matrix of the mode, where each weight in the data weight matrix characterizes how far apart the initial characteristic data of the mode are; determining the similarity between the sub shared-characteristic data obtained after the initial characteristic data are mapped and dimension-reduced; and coupling the data weight matrix with the similarity to obtain the local invariant regularization factor of each mode.

The data weight matrix is a matrix whose elements are weights. Each weight in the data weight matrix of a mode characterizes how far apart the initial characteristic data of that mode are.
Specifically, for each mode, the computer device may determine the distance between every pair of initial characteristic data in the mode that satisfies a nearest-neighbor condition, and determine the weight between the two pieces of initial characteristic data according to this distance. In this way, the weights between all pairs of initial characteristic data satisfying the nearest-neighbor condition are determined, and the data weight matrix of the mode is then formed from the obtained weights.
It can be understood that, after each piece of initial characteristic data of the mode in the original data space is mapped and dimension-reduced, corresponding sub shared-characteristic data is obtained. The computer device may compute the similarity between the sub shared-characteristic data corresponding to each pair of initial characteristic data satisfying the nearest-neighbor condition. The similarity between sub shared-characteristic data reflects the closeness of the data in the mapped, dimension-reduced data space.

In one embodiment, the computer device may measure the similarity between two pieces of sub shared-characteristic data by the Euclidean distance between them.
Therefore, the computer device may couple the data weight matrix with the similarity to obtain the local invariant regularization factor of each mode. It should be noted that, since the sub shared-characteristic data are unknown data represented by the first parameter variable, the local invariant regularization factor of each mode obtained by coupling the data weight matrix with the similarity is also unknown data represented by variables, rather than a specific data value.
In one embodiment, the computer device may determine the local invariant regularization factor of each mode by the following formula:

L(G^(v)) = (1/2) Σ_{i,j=1}^{N} ||V_Ci − V_Cj||² W_ij^(v) = Tr(V_C D^(v) (V_C)^T) − Tr(V_C W^(v) (V_C)^T) = Tr(V_C L^(v) (V_C)^T)

where V_Ci denotes the sub shared-characteristic data obtained by mapping and dimension-reducing the i-th piece of initial characteristic data; V_Cj denotes the sub shared-characteristic data obtained by mapping and dimension-reducing the j-th piece of initial characteristic data; N denotes the total amount of data; W_ij^(v) is the value in the i-th row and j-th column of the data weight matrix W^(v); v denotes the v-th mode; W^(v) is the data weight matrix of the v-th mode; V_C denotes the shared characteristic data represented by variables; Tr(·) denotes the trace of a matrix, so Tr(V_C D^(v) (V_C)^T), Tr(V_C W^(v) (V_C)^T) and Tr(V_C L^(v) (V_C)^T) denote the traces of the matrices V_C D^(v) (V_C)^T, V_C W^(v) (V_C)^T and V_C L^(v) (V_C)^T, respectively; D^(v) is a diagonal matrix, each entry on whose diagonal is the sum of the corresponding row (or column) of W^(v); L^(v) = D^(v) − W^(v) is the graph Laplacian matrix of G^(v); and the superscript T denotes the transpose of a matrix.
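The identity above, i.e., that the pairwise weighted distances equal the trace form with the graph Laplacian, can be checked numerically. This is a small sketch under the assumption that V_C is a k×n matrix whose columns are the sub shared-characteristic data and W is a symmetric data weight matrix:

```python
import numpy as np

def local_invariant_factor(VC, W):
    """Tr(VC L VC^T) with L = D - W, where D is the diagonal
    matrix of row sums of W."""
    D = np.diag(W.sum(axis=1))
    L = D - W
    return np.trace(VC @ L @ VC.T)

def pairwise_form(VC, W):
    """(1/2) * sum_ij ||VC_i - VC_j||^2 * W_ij over all data pairs."""
    n = W.shape[0]
    return 0.5 * sum(W[i, j] * np.sum((VC[:, i] - VC[:, j]) ** 2)
                     for i in range(n) for j in range(n))
```

Both functions agree for symmetric W, which is why the regularization factor can be evaluated cheaply through the trace form instead of an explicit double sum over data pairs.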
In the above embodiment, by coupling the data weight matrix of each mode with the similarity between the sub shared-characteristic data obtained after the initial characteristic data are mapped and dimension-reduced, the local invariant regularization factor used to adjust the consistency between the first similar structure and the second similar structure can be described accurately and conveniently, improving the robustness of the fused feature.
In one embodiment, obtaining the data weight matrix of each mode includes: for each mode, constructing a nearest-neighbor graph with each piece of initial characteristic data of the mode as a vertex; determining the weight of each edge in the nearest-neighbor graph according to the distance between the vertices of that edge; and constructing, according to the weight of each edge, the data weight matrix of the mode in the original data space.

The nearest-neighbor graph refers to the graph consisting of the vertices satisfying a nearest-neighbor condition and the edges between those vertices. Nearest neighbor, i.e., nearest neighborhood. The nearest-neighbor condition refers to the condition that two vertices in a nearest-neighbor relationship should satisfy. The nearest-neighbor condition may include the distance between the two vertices being within a preset distance threshold range.
The computer device may perform the following processing on the initial characteristic data of each mode. Specifically, taking each piece of initial characteristic data of the mode as a vertex, the computer device may determine the vertices satisfying the nearest-neighbor condition according to the distances between the initial characteristic data, and build an edge between each pair of vertices satisfying the nearest-neighbor condition, thereby constructing the nearest-neighbor graph. The computer device may determine the weight of each edge of the nearest-neighbor graph according to the distance between the pair of vertices satisfying the nearest-neighbor condition, and then construct, with the weight of each edge as a matrix element, the data weight matrix of the mode in the original data space.
In one embodiment, the computer device may calculate the data weight matrix of each mode according to the following formula:

W_ij^(v) = exp(−||x_i^(v) − x_j^(v)||² / (2σ²)), if x_i^(v) ∈ N_p(x_j^(v)) or x_j^(v) ∈ N_p(x_i^(v)); otherwise W_ij^(v) = 0

where W_ij^(v) is the value in the i-th row and j-th column of the data weight matrix W^(v); ||x_i^(v) − x_j^(v)|| is the Euclidean distance between the data examples x_i^(v) and x_j^(v); N_p(x_i^(v)) denotes the set of the p nearest-neighbor examples of x_i^(v), and N_p(x_j^(v)) denotes the set of the p nearest-neighbor examples of x_j^(v); x_i^(v) is the i-th piece of initial characteristic data of the v-th mode; x_j^(v) is the j-th piece of initial characteristic data of the v-th mode; and σ is the standard deviation of the initial characteristic data.
In the above embodiment, the data weight matrix of each mode in the original data space is determined with the aid of the nearest-neighbor graph, converting matrix data processing into graph processing, which is extremely convenient and accurate.
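As a concrete sketch of building the weight matrix from a nearest-neighbor graph, the heat-kernel weight below follows the formula described above; the helper name, the neighborhood size `p`, and the brute-force distance computation are illustrative assumptions, not taken from the filing:

```python
import numpy as np

def data_weight_matrix(X, p=2, sigma=1.0):
    """X: d x n matrix, columns are one mode's initial characteristic data.
    Returns the n x n weight matrix of the p-nearest-neighbor graph."""
    n = X.shape[1]
    d2 = ((X[:, :, None] - X[:, None, :]) ** 2).sum(axis=0)  # squared Euclidean distances
    W = np.zeros((n, n))
    for i in range(n):
        for j in np.argsort(d2[i])[1:p + 1]:      # p nearest neighbors, skipping self
            w = np.exp(-d2[i, j] / (2 * sigma ** 2))
            W[i, j] = W[j, i] = w                 # edge whenever either endpoint is a neighbor
    return W
```

Setting both W[i, j] and W[j, i] makes the matrix symmetric, matching the "x_i in N_p(x_j) or x_j in N_p(x_i)" condition of the formula.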
In one embodiment, step S204 includes: obtaining a shared mapping matrix and a private mapping matrix corresponding to each mode; performing mapping decomposition on the initial characteristic data of each mode according to the corresponding shared mapping matrix to obtain the sub shared-characteristic variable matrix of each mode; and performing mapping decomposition on the initial characteristic data of each mode according to the corresponding private mapping matrix to obtain the private characteristic variable matrix.

The shared mapping matrix is used to decompose the shared characteristic data out of the initial characteristic data of the corresponding mode. The private mapping matrix is used to decompose the private characteristic data of a mode out of the initial characteristic data of that mode. It can be understood that the shared mapping matrix and the private mapping matrix are equivalent to basis matrices.

A variable matrix is a matrix whose elements include variables. The shared-characteristic variable matrix refers to shared characteristic data in matrix form that includes variables. The private characteristic variable matrix refers to private characteristic data in matrix form that includes variables.
It can be understood that each mode has a corresponding shared mapping matrix and private mapping matrix. The computer device may therefore perform mapping decomposition on the initial characteristic data of each mode according to the corresponding shared mapping matrix to obtain the shared-characteristic variable matrix of each mode, and perform mapping decomposition on the initial characteristic data of each mode according to the corresponding private mapping matrix to obtain the private characteristic variable matrix.

It should be noted that the shared mapping matrix and the private mapping matrix are also matrix data that are unknown and represented by parameter variables. Accordingly, the shared-characteristic variable matrix and the private characteristic variable matrix obtained by mapping decomposition with these matrices are also unknown data including parameter variables. The shared-characteristic variable matrix can then be obtained by fusing the sub shared-characteristic variable matrices.
In the above embodiment, the initial characteristic data is decomposed by mapping into shared characteristic data and private characteristic data through the shared mapping matrix and the private mapping matrix corresponding to each mode, so that unsupervised learning and training can subsequently be performed on both the shared features and the private features. The private features and the shared features are thereby distinguished from each other; that is, the finally obtained shared feature excludes the interference of the private features, improving the accuracy of shared feature extraction.
In one embodiment, step S206 includes: performing inverse mapping conversion on the shared-characteristic variable matrix according to the shared mapping matrix corresponding to each mode to obtain a first inverse mapping result; performing inverse mapping conversion on each private characteristic variable matrix according to the corresponding private mapping matrix to obtain a second inverse mapping result; and reconstructing the characteristic data of each mode according to the first inverse mapping result and the second inverse mapping result.

It can be understood that, since the shared-characteristic variable matrix and the private characteristic variable matrix are obtained by mapping and dimension-reducing the initial characteristic data of each mode according to the shared mapping matrix and the private mapping matrix respectively, combining the results of the inverse mapping conversions reconstructs data in the original data space. Ideally, the reconstructed characteristic data would be identical to the initial characteristic data before the mapping and dimensionality reduction. In general, however, they cannot be completely identical. Therefore, when the difference between the reconstructed characteristic data and the initial characteristic data is minimized, the characteristic data reconstructed from the shared-characteristic variable matrix and the private characteristic variable matrix obtained by the mapping dimension-reduction decomposition is closest to the initial characteristic data, and at that point the values of the shared-characteristic variable matrix and the private characteristic variable matrix are the most accurate.
In one embodiment, the objective function of the machine learning model may be expressed by the following formula:

min Σ_v ||X^(v) − U_C^(v) V_C − U_I^(v) V_I^(v)||_F²   s.t. U_C^(v) ≥ 0, U_I^(v) ≥ 0, V_C ≥ 0, V_I^(v) ≥ 0    (3)

where V_C and V_I^(v) are respectively the shared features and the private features of the modes in the common space, i.e., the correlated and uncorrelated features; U_C^(v) and U_I^(v) are the mapping matrices corresponding to each mode; ||·||_F denotes the Frobenius norm, and ||·||_F² denotes the square of the Frobenius norm; s.t. denotes the constraint condition, which indicates that U_C^(v), U_I^(v), V_C and V_I^(v) are all non-negative. U_C^(v) V_C + U_I^(v) V_I^(v) denotes the reconstructed characteristic data; it is equivalent to performing inverse mapping conversion on V_C according to U_C^(v) and on V_I^(v) according to U_I^(v), and combining the resulting first inverse mapping result and second inverse mapping result to reconstruct the characteristic data in the original data space.
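A minimal numerical sketch of the reconstruction term in this objective follows. The shapes are illustrative assumptions: X[v] is d_v×n, U_C[v] is d_v×k, V_C is k×n, and similarly for the private pair.

```python
import numpy as np

def reconstruction_objective(X, UC, VC, UI, VI):
    """Sum over modes of ||X^(v) - U_C^(v) V_C - U_I^(v) V_I^(v)||_F^2."""
    return sum(np.linalg.norm(X[v] - UC[v] @ VC - UI[v] @ VI[v], "fro") ** 2
               for v in range(len(X)))
```

When a mode's data is exactly the sum of its shared and private parts, that mode contributes zero to the objective; any mismatch between the reconstruction and the original data increases the value.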
In one embodiment, the computer device may construct a machine learning model whose objective function characterizes the difference between the initial characteristic data of each mode and the reconstructed characteristic data, take it as the basic machine learning model, and combine the basic machine learning model with the local invariant regularization factor of each mode to obtain the final machine learning model.

In one embodiment, the computer device may couple the local invariant regularization factor of each mode with a regularization control parameter, and combine the result with the basic machine learning model to obtain the final machine learning model.

Therefore, in one embodiment, the computer device may combine the objective function of the basic machine learning model represented by formula (3) with the local invariant regularization factors, obtaining the objective function of the final machine learning model as follows:

min Y = Σ_v ||X^(v) − U_C^(v) V_C − U_I^(v) V_I^(v)||_F² + Σ_v α^(v) L(G^(v))   s.t. U_C^(v) ≥ 0, U_I^(v) ≥ 0, V_C ≥ 0, V_I^(v) ≥ 0    (4)

where the parameter α^(v) ≥ 0 is the regularization control parameter of the v-th mode, and L(G^(v)) is the local invariant regularization factor of the v-th mode.

It can be understood that, in this embodiment, minimizing the objective function of the machine learning model is equivalent to minimizing the combination of the objective function of the basic machine learning model and the local invariant regularization factors, i.e., finding the minimum value of Y in formula (4).
Fig. 3 is a schematic structural diagram of constructing the machine learning model in one embodiment. Referring to Fig. 3, X^(1)~X^(v) denote the initial characteristic data of the v modes, U_C^(1)~U_C^(v) denote the shared mapping matrices of the v modes, and U_I^(1)~U_I^(v) denote the private mapping matrices of the v modes. By mapping and dimension-reducing the initial characteristic data of the multiple modes through the corresponding shared mapping matrices, the shared characteristic matrix V_C can be obtained; by mapping and dimension-reducing the initial characteristic data of the multiple modes through the corresponding private mapping matrices, the private characteristic matrix V_I^(v) of each mode can be obtained. By inverse-mapping the shared characteristic matrix V_C according to each shared mapping matrix U_C^(v) and inverse-mapping each private characteristic matrix V_I^(v) according to the corresponding private mapping matrix, the characteristic data can be reconstructed. The basic machine learning model can then be constructed based on the difference between the initial characteristic data and the reconstructed characteristic data. In addition, the data weight matrices W^(1)~W^(v) of the modes are determined through nearest-neighbor graphs, the local invariant regularization factors are determined based on the data weight matrices, and the final machine learning model is obtained by combining the basic machine learning model with the local invariant regularization factors.
In the above embodiment, the characteristic data of each mode is reconstructed by inverse-mapping the shared-characteristic variable matrix and the private characteristic variable matrix. When the decomposition is accurate, the reconstructed characteristic data of each mode is close to the initial characteristic data. This property can therefore be exploited by minimizing the difference between the initial characteristic data of each mode and the reconstructed characteristic data; the value of the shared characteristic data obtained when the difference is minimized is the most accurate shared feature, which improves the accuracy of shared feature extraction.
In one embodiment, the shared mapping matrix is a shared mapping variable matrix and the private mapping matrix is a private mapping variable matrix. In this embodiment, updating the value of the shared characteristic data and the value of each piece of private characteristic data in every round of iteration until the iteration stop condition is met includes: in every round of iteration, updating the value of the shared-characteristic variable matrix, the value of each private characteristic variable matrix, the value of each shared mapping variable matrix and the value of each private mapping variable matrix until the iteration stop condition is met.
It can be understood that the shared-characteristic variable matrix, each private characteristic variable matrix, each shared mapping variable matrix and each private mapping variable matrix are different variable matrices. In one embodiment, in every round of iteration, the computer device may iteratively optimize and update the value of each kind of variable matrix in turn; that is, while the value of one variable matrix is being updated, the values of the other variable matrices are held constant.
In one embodiment, updating, in every round of iteration, the value of the shared-characteristic variable matrix, the value of each private characteristic variable matrix, the value of each shared mapping variable matrix and the value of each private mapping variable matrix includes: in every round of iteration, successively selecting a current variable matrix from among the shared-characteristic variable matrix, the private characteristic variable matrices, the shared mapping variable matrices and the private mapping variable matrices; keeping the value of each non-current variable matrix at the value of its most recent update; optimizing and updating the value of the current variable matrix; and then selecting the next current variable matrix from among the shared-characteristic variable matrix, the private characteristic variable matrices, the shared mapping variable matrices and the private mapping variable matrices to continue the optimization and update processing.

The current variable matrix is the variable matrix currently to be optimized and updated. A non-current variable matrix is a variable matrix that is not being optimized and updated at present and whose value is held constant during the update of the current variable matrix.

For example, when the shared-characteristic variable matrix is being updated, the shared-characteristic variable matrix is the current variable matrix, while the private characteristic variable matrices, the shared mapping variable matrices and the private mapping variable matrices are non-current variable matrices.

It should be noted that, when a non-current variable matrix has not yet undergone its first update, the value of its most recent update is its initial default value. The initial default values of different non-current variable matrices may differ.
Specifically, in the first round of iteration, assume that the shared-characteristic variable matrix is selected as the current variable matrix. Then, with the values of the private characteristic variable matrix, the shared mapping variable matrix and the private mapping variable matrix held at their respective initial default values, the value of the shared-characteristic variable matrix is optimized and updated (for example, updated to a1). Next, the next current variable matrix is selected; for example, the private characteristic variable matrix is selected as the current variable matrix. Then, with the value of the shared-characteristic variable matrix held at the value just updated (i.e., a1), and the values of the shared mapping variable matrix and the private mapping variable matrix held at their respective initial default values, the value of the private characteristic variable matrix is optimized and updated (for example, updated to a2). Then the next current variable matrix is selected; for example, the shared mapping variable matrix is selected as the current variable matrix. Then, with the value of the shared-characteristic variable matrix held at its most recently updated value (i.e., a1), the value of the private characteristic variable matrix held at the value just updated (i.e., a2), and the value of the private mapping variable matrix held at its initial default value, the value of the shared mapping variable matrix is optimized and updated (for example, updated to a3). Similarly, the private mapping variable matrix is then selected as the current variable matrix. With the value of the shared-characteristic variable matrix held at its most recently updated value (i.e., a1), the value of the private characteristic variable matrix held at its most recently updated value (i.e., a2), and the value of the shared mapping variable matrix held at its most recently updated value (i.e., a3), the value of the private mapping variable matrix is optimized and updated (for example, updated to a4). This completes the optimization and update of all parameter variables of the current round. It can be understood that, in a new round of iteration, when the shared-characteristic variable matrix is selected as the current variable matrix, the values of the private characteristic variable matrix, the shared mapping variable matrix and the private mapping variable matrix are held at their most recently updated values (i.e., a2, a3 and a4) while the value of the shared-characteristic variable matrix is optimized and updated (for example, a1 is updated to a11). The parameter variables of the new round are updated on the same principle.
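The round-robin order in the walkthrough above can be sketched as follows; the matrix names and the shape of `update_rules` are hypothetical, and each rule stands in for one of the per-matrix optimization steps:

```python
def one_round(variables, update_rules, order=("VC", "VI", "UC", "UI")):
    """One round of iteration: update each variable matrix in turn while
    the others keep their most recently updated values."""
    for name in order:
        # the rule sees the current state of all matrices, including the
        # values produced earlier in this same round
        variables[name] = update_rules[name](variables)
    return variables
```

Because each rule reads `variables` at call time, a matrix updated earlier in the round (e.g. V_C updated to a1) is already visible when the later matrices are updated, which is exactly the behavior described above.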
For ease of understanding, the process of updating each variable matrix is now illustrated. Assume U_C^(v) is the shared mapping variable matrix of the v-th mode, U_I^(v) is the private mapping variable matrix of the v-th mode, V_I^(v) is the private characteristic variable matrix of the v-th mode, and V_C is the shared-characteristic variable matrix of the multiple modes. Then:

(1) Given U_C^(v), U_I^(v) and V_I^(v), update V_C:

First, the objective function of the machine learning model represented by formula (4) is expressed for optimization using a Lagrangian function L, in which a Lagrange multiplier enforces the constraint V_C ≥ 0. The partial derivative of L with respect to V_C is then taken, and the KKT (Karush-Kuhn-Tucker) conditions (the KKT conditions are necessary conditions for an optimal solution in nonlinear programming) are applied, yielding the update formula for V_C.
(2) Given U_I^(v), V_I^(v) and V_C, update U_C^(v):

From the objective function of the machine learning model represented by formula (4), it can be seen that the terms for the individual modes U_C^(v) are mutually independent, so the corresponding minimization objective for each mode can be simplified. Optimizing with the Lagrangian function and the KKT conditions as above yields the update formula for U_C^(v).

(3) Similarly to the solution procedure in (2), the update formulas for U_I^(v) and V_I^(v) can be obtained.
It can be understood that the iteration of the current round is completed after the above steps (1)~(3) have been executed. At this point, it can be judged whether the iteration stop condition is met after the current round of iteration. If it is met, the iteration stops, and the V_C obtained when the iteration stop condition is met is the finally fused shared feature of the target object. If it is not met, each variable matrix is updated to the result just calculated, and steps (1)~(3) are re-executed to carry out the next round of iteration.
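The update formulas themselves appear as images in the original filing and are not reproduced here. For V_C, a multiplicative rule of the usual KKT-derived form consistent with formula (4) would look like the following sketch; the exact rule is an assumption, not a reproduction of the filed formula (all matrices non-negative, W^(v) symmetric, D^(v) its diagonal row-sum matrix):

```python
import numpy as np

def update_VC(X, UC, UI, VC, VI, W, D, alpha, eps=1e-12):
    """Candidate multiplicative update for the shared matrix V_C: the
    numerator collects the negative-gradient terms of formula (4), the
    denominator the positive-gradient terms, so V_C stays non-negative."""
    num = sum(UC[v].T @ X[v] + alpha[v] * (VC @ W[v]) for v in range(len(X)))
    den = sum(UC[v].T @ (UC[v] @ VC + UI[v] @ VI[v]) + alpha[v] * (VC @ D[v])
              for v in range(len(X)))
    return VC * num / (den + eps)   # eps guards against division by zero
```

At a point where the data is reconstructed exactly and the regularization is switched off, the numerator equals the denominator, so V_C is (up to eps) a fixed point of the rule, as a stationary point of the objective should be.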
In the above embodiment, since multiple variable matrices are coupled, minimizing the objective function of the machine learning model is a non-convex problem, and finding the globally optimal solution is very difficult. Therefore, by iteratively optimizing each variable matrix in turn, a locally optimal solution of the objective function of the machine learning model is found, so that the finally fused shared feature can be determined more quickly and accurately.
In one embodiment, the method further includes: obtaining the finally fused shared features of multiple target objects; clustering the multiple target objects according to the shared features; and performing corresponding processing on the target objects according to the clustering result.

It can be understood that the finally fused shared feature of a target object obtained according to the methods of the embodiments of the present application is relatively accurate, which is equivalent to accurately determining the features of the target object. The finally fused shared features of multiple target objects can therefore be obtained according to the methods of the embodiments of the present application, and the multiple target objects can be clustered according to the obtained shared features. Since the obtained shared features are relatively accurate, the result of clustering the target objects based on these shared features is also relatively accurate; accordingly, performing corresponding processing on the target objects according to the clustering result can improve the accuracy of the processing of the target objects.
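As an illustrative sketch of this downstream clustering step, assume the fused shared features of the target objects are the columns of a matrix F; a tiny k-means routine (a hypothetical stand-in for whatever clustering algorithm is actually used) then groups the objects:

```python
import numpy as np

def cluster_by_shared_features(F, k, iters=20, seed=0):
    """F: k_feat x n matrix, one column of fused shared features per
    target object. Returns one cluster label per object via k-means."""
    rng = np.random.default_rng(seed)
    centers = F[:, rng.choice(F.shape[1], size=k, replace=False)].astype(float)
    labels = np.zeros(F.shape[1], dtype=int)
    for _ in range(iters):
        # squared distance of every object to every center, shape k x n
        d = ((F[:, None, :] - centers[:, :, None]) ** 2).sum(axis=0)
        labels = d.argmin(axis=0)
        for c in range(k):
            if np.any(labels == c):           # skip empty clusters
                centers[:, c] = F[:, labels == c].mean(axis=1)
    return labels
```

Objects whose fused shared features are close end up with the same label, which is the cluster result used for the subsequent processing of the target objects.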
As shown in figure 4, in one embodiment, providing another feature extracting method, this method specifically includes following
Step:
S402 obtains the multi-modal default characteristic of target object;Default characteristic is non-negative data;To default
Characteristic is normalized, and obtains multi-modal initial characteristic data.
S404: obtain, for each mode, the corresponding shared mapping variable matrix and private mapping variable matrix; perform mapping decomposition on the original feature data of each mode according to the corresponding shared mapping variable matrix to obtain a shared feature variable matrix; and perform mapping decomposition on the original feature data of each mode according to the corresponding private mapping variable matrix to obtain the private feature variable matrices.
S406: perform inverse mapping conversion on the shared feature variable matrix according to the shared mapping matrix corresponding to each mode to obtain a first inverse mapping result; perform inverse mapping conversion on each private feature variable matrix according to the corresponding private mapping matrix to obtain a second inverse mapping result; and reconstruct the feature data of each mode from the first and second inverse mapping results.
S408: construct a basic machine learning model, whose objective function characterizes the difference between the original feature data of each mode and the reconstructed feature data.
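A compact way to read steps S404 through S408: each mode v is reconstructed as U_v H + V_v G_v (shared mapping applied to the shared features, plus private mapping applied to that mode's private features), and the objective sums the squared Frobenius differences. The shapes, symbol names, and random data below are illustrative assumptions, not the application's exact formulation:

```python
import numpy as np

rng = np.random.default_rng(0)
n, k_s, k_p = 6, 2, 2                         # samples, shared dims, private dims
dims = [8, 5]                                 # feature dimension of two modes

H = rng.random((k_s, n))                      # shared feature variable matrix
G = [rng.random((k_p, n)) for _ in dims]      # private feature variable matrix per mode
U = [rng.random((d, k_s)) for d in dims]      # shared mapping variable matrix per mode
V = [rng.random((d, k_p)) for d in dims]      # private mapping variable matrix per mode

def reconstruct(v):
    # first inverse mapping result (shared part) + second (private part)
    return U[v] @ H + V[v] @ G[v]

def objective(X):
    # total squared Frobenius difference between original and reconstructed data
    return sum(np.linalg.norm(X[v] - reconstruct(v), "fro") ** 2
               for v in range(len(X)))

# toy "original feature data" built to be perfectly reconstructable
X = [reconstruct(v) for v in range(len(dims))]
```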
S410: for each mode, construct a nearest-neighbour graph with the original feature data of the mode as vertices; determine the weight of each edge in the nearest-neighbour graph according to the distance between the vertices it connects; construct, from the edge weights, the data weight matrix of the mode in the original data space; determine the pairwise similarities between the sub shared feature data obtained after mapping dimension reduction of the original feature data; and couple the data weight matrix with the similarities to obtain the local invariance regularization factor of each mode.
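Step S410 can be sketched as: build a k-nearest-neighbour graph on a mode's original feature data, turn it into a data weight matrix W, and couple W with the pairwise closeness of the mapped sub shared features — here via the common graph-Laplacian form Σ_ij W_ij‖h_i − h_j‖², which is one standard instance of such a local-invariance term. The binary weights, toy data, and names are illustrative assumptions:

```python
import numpy as np

def knn_weight_matrix(X, k=2):
    """Binary data weight matrix from the k-nearest-neighbour graph of rows of X."""
    n = len(X)
    D = np.linalg.norm(X[:, None] - X[None, :], axis=2)   # pairwise distances
    W = np.zeros((n, n))
    for i in range(n):
        W[i, np.argsort(D[i])[1:k + 1]] = 1.0             # skip the vertex itself
    return np.maximum(W, W.T)                             # symmetrise the graph

def local_invariance(H, W):
    """Sum_ij W_ij * ||h_i - h_j||^2 = 2 * tr(H^T L H), with L = D - W."""
    L = np.diag(W.sum(axis=1)) - W
    return 2.0 * np.trace(H.T @ L @ H)

X = np.array([[0.0, 0.0], [0.0, 1.0], [5.0, 5.0], [5.0, 6.0]])  # toy mode data
W = knn_weight_matrix(X, k=1)
H = np.array([[0.1, 0.2], [0.1, 0.3], [0.9, 0.8], [0.9, 0.7]])  # mapped sub shared features
reg = local_invariance(H, W)
```

The term is small when samples that are neighbours in the original data space stay close after the mapping dimension reduction, which is exactly the consistency the regularization factor enforces.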
S412: combine the basic machine learning model with the local invariance regularization factor of each mode to obtain the machine learning model.
S414: minimize the value of the objective function of the machine learning model through iterative training. In each iteration round, successively choose a current variable matrix from among the shared feature variable matrix, the private feature variable matrices, the shared mapping variable matrices, and the private mapping variable matrices; keep the values of the non-current variable matrices fixed at their most recently updated values while optimizing and updating the value of the current variable matrix; then choose the next current variable matrix from among the same matrices and continue the optimization update, until all variable matrices have been processed; then enter the next iteration round, until the iteration stopping condition is met.
S416: obtain the value of the shared feature data when the iteration stopping condition is met, thereby obtaining the finally fused shared feature of the target object.
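Step S414's block-wise alternation — optimize one variable matrix while freezing the others at their last updated values — can be sketched with projected-gradient updates that keep every factor non-negative. The learning rate, shapes, random data, and the particular gradient-step rule are illustrative assumptions; the application does not prescribe this exact update:

```python
import numpy as np

rng = np.random.default_rng(1)
n, k_s, k_p, lr = 6, 2, 2, 0.005
dims = [8, 5]                                  # feature dimension per mode
X = [rng.random((d, n)) for d in dims]         # original feature data per mode
H = rng.random((k_s, n))                       # shared feature variable matrix
G = [rng.random((k_p, n)) for _ in dims]       # private feature variable matrices
U = [rng.random((d, k_s)) for d in dims]       # shared mapping variable matrices
V = [rng.random((d, k_p)) for d in dims]       # private mapping variable matrices

def objective():
    # squared Frobenius difference between original and reconstructed data
    return sum(np.linalg.norm(X[v] - U[v] @ H - V[v] @ G[v], "fro") ** 2
               for v in range(len(X)))

def step(M, grad):
    # one projected-gradient update: move against the gradient, stay non-negative
    return np.maximum(M - lr * grad, 0.0)

f0 = objective()
for _ in range(40):                            # one sweep = one iteration round
    # update H with every other variable matrix frozen at its last value
    R = [X[v] - U[v] @ H - V[v] @ G[v] for v in range(len(X))]
    H = step(H, -2 * sum(U[v].T @ R[v] for v in range(len(X))))
    for v in range(len(X)):
        R_v = X[v] - U[v] @ H - V[v] @ G[v]
        G[v] = step(G[v], -2 * V[v].T @ R_v)   # private features next
        R_v = X[v] - U[v] @ H - V[v] @ G[v]
        U[v] = step(U[v], -2 * R_v @ H.T)      # then shared mapping
        R_v = X[v] - U[v] @ H - V[v] @ G[v]
        V[v] = step(V[v], -2 * R_v @ G[v].T)   # then private mapping
f1 = objective()                               # lower than f0 after the sweeps
```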
In one embodiment, the method further includes: obtaining the finally fused shared features of multiple target objects; clustering the multiple target objects according to the shared features; and performing corresponding processing on the target objects according to the clustering result.
Combining the schemes of the embodiments of this application, an experimental analysis is carried out as follows:

To verify the validity of the feature extraction proposed by this application, it is compared with the currently most representative multi-modal feature learning models: ConcatNMF (Concat Nonnegative Matrix Factorization, concatenated non-negative matrix factorization), MultiNMF (Multi Nonnegative Matrix Factorization), MultiGNMF (Multi Graph Nonnegative Matrix Factorization, multi-graph non-negative matrix factorization), MMNMF (Multi-Manifold Nonnegative Matrix Factorization, multi-manifold-aligned non-negative matrix factorization), and UMCFL (unsupervised multi-view correlated feature learning). The performance of each model is verified through k-means clustering (an iteratively solved cluster analysis algorithm) on three indices: accuracy, normalized mutual information, and purity. The experimental data sets are shown in Table 1.
Table 1: Description of the data sets

It can be understood that SensIT, MultipleFeatures, ALOI, and 3Sources denote different data sources. The instance number is the quantity of original feature data.
Figures 5 to 7 give the clustering comparison results, in accuracy, normalized mutual information, and purity, of the method of the embodiments of this application against the other comparison models on the four data sets. As can be seen from Figures 5 to 7, the experiments show that the machine learning model of this application is significantly better than the other models on all data sets. This is because, during the learning of the multi-modal fusion feature, the methods of the embodiments of this application learn the mode-shared feature and the private feature of each mode simultaneously during model training; by constantly isolating the mode-private information in this way, the accuracy of the shared feature can be effectively improved.
In the comparative experiments, all model parameter values are those giving the optimal performance. For example, in the methods proposed by the embodiments of this application, the optimal value of the parameter α is selected experimentally. As shown in Figures 8 to 11, the normalized mutual information value of the model on different data sets is obtained by setting different α values, and the parameter value with the optimal performance is chosen as the final value used by the model. For example, in Figure 8, when SensIT serves as the data source providing the multi-modal original feature data, α = 10 is best; and in Figure 9, when MultipleFeatures serves as the data source, α = 0.1 is best. That is, the optimal values of the model parameters differ for the multi-modal original feature data provided by different data sources. Carrying out model training with the model parameters at their optimal values improves the accuracy of shared feature extraction.
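The parameter-selection procedure above — train with each candidate α, score by normalized mutual information, keep the best — amounts to a small grid search. The score numbers below are made up for illustration (loosely echoing Figure 8, where α = 10 is best for SensIT); only the selection logic is the point:

```python
def select_alpha(candidate_alphas, score_fn):
    """Return the alpha whose trained model scores highest (e.g. NMI on one data set)."""
    return max(candidate_alphas, key=score_fn)

# hypothetical NMI profile over the alpha grid for one data source
nmi = {0.01: 0.31, 0.1: 0.35, 1: 0.40, 10: 0.47, 100: 0.33}
best_alpha = select_alpha(nmi.keys(), nmi.get)
```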
As shown in Figure 12, in one embodiment, a feature extraction apparatus 1200 is provided. The apparatus includes: an obtaining module 1202, a mapping module 1204, a reconstruction module 1206, and a fusion module 1208, in which:
The obtaining module 1202 is configured to obtain multi-modal original feature data of a target object.

The mapping module 1204 is configured to perform mapping dimension reduction on the original feature data of each mode, decomposing it into shared feature data represented by variables and private feature data of each mode represented by variables.

The reconstruction module 1206 is configured to reconstruct the feature data of each mode according to the shared feature data and the private feature data.

The fusion module 1208 is configured to determine the difference between the original feature data of each mode and the reconstructed feature data; minimize the difference by adjusting the values of the variable-represented shared feature data and the variable-represented private feature data; and obtain the value of the shared feature data when the difference is minimized, thereby obtaining the finally fused shared feature of the target object.
In one embodiment, the fusion module 1208 is further configured to construct a machine learning model whose objective function characterizes the difference between the original feature data of each mode and the reconstructed feature data; minimize the value of the objective function through iterative training, updating the value of the shared feature data and the values of the private feature data in each iteration round until the iteration stopping condition is met; and obtain the value of the shared feature data when the iteration stopping condition is met as the value of the shared feature data when the difference is minimized.
In one embodiment, the fusion module 1208 is further configured to determine the local invariance regularization factor of each mode, which adjusts the consistency between a first similar structure and a second similar structure, where the first similar structure is the similar structure between the original feature data of each mode, and the second similar structure is the similar structure between the sub shared feature data obtained after each original feature data in each mode is mapped to the lower dimension; construct a basic machine learning model whose objective function characterizes the difference between the original feature data of each mode and the reconstructed feature data; and combine the basic machine learning model with the local invariance regularization factor of each mode to obtain the machine learning model.
In one embodiment, the fusion module 1208 is further configured to obtain, for each mode, the data weight matrix of the mode, in which each weight characterizes how far apart a pair of original feature data of the mode are; determine the pairwise similarities between the sub shared feature data after the mapping dimension reduction; and couple the data weight matrix with the similarities to obtain the local invariance regularization factor of each mode.
In one embodiment, the fusion module 1208 is further configured to, for each mode, construct a nearest-neighbour graph with the original feature data of the mode as vertices; determine the weight of each edge in the nearest-neighbour graph according to the distance between the vertices it connects; and construct, from the edge weights, the data weight matrix of the mode in the original data space.
In one embodiment, the mapping module 1204 is further configured to obtain, for each mode, the corresponding shared mapping matrix and private mapping matrix; perform mapping decomposition on the original feature data of each mode according to the corresponding shared mapping matrix to obtain a shared feature variable matrix; and perform mapping decomposition on the original feature data of each mode according to the corresponding private mapping matrix to obtain the private feature variable matrices.
In one embodiment, the reconstruction module 1206 is further configured to perform inverse mapping conversion on the shared feature variable matrix according to the shared mapping matrix corresponding to each mode to obtain a first inverse mapping result; perform inverse mapping conversion on each private feature variable matrix according to the corresponding private mapping matrix to obtain a second inverse mapping result; and reconstruct the feature data of each mode from the first and second inverse mapping results.
In one embodiment, the shared mapping matrix is a shared mapping variable matrix and the private mapping matrix is a private mapping variable matrix; the fusion module 1208 is further configured to update, in each iteration round, the value of the shared feature variable matrix, the values of the private feature variable matrices, the values of the shared mapping variable matrices, and the values of the private mapping variable matrices, until the iteration stopping condition is met.
In one embodiment, the fusion module 1208 is further configured to, in each iteration round, successively choose a current variable matrix from among the shared feature variable matrix, the private feature variable matrices, the shared mapping variable matrices, and the private mapping variable matrices; keep the values of the non-current variable matrices fixed at their most recently updated values while optimizing and updating the value of the current variable matrix; and choose the next current variable matrix from among the same matrices, to continue the optimization update.
In one embodiment, the obtaining module 1202 is further configured to obtain multi-modal preset feature data of the target object, where the preset feature data are non-negative, and to normalize the preset feature data to obtain the multi-modal original feature data.
As shown in Figure 13, in one embodiment, the apparatus 1200 further includes: a clustering module 1210, configured to obtain the finally fused shared features of multiple target objects; cluster the multiple target objects according to the shared features; and perform corresponding processing on the target objects according to the clustering result.
Figure 14 is a schematic diagram of the internal structure of a computer device in one embodiment. Referring to Figure 14, the computer device may be the server 120 in Figure 1. The computer device includes a processor, a memory, and a network interface connected through a system bus. The memory includes a non-volatile storage medium and an internal memory. The non-volatile storage medium of the computer device can store an operating system and a computer program which, when executed, may cause the processor to perform a feature extraction method. The processor of the computer device provides computing and control capability and supports the operation of the entire computer device. A computer program may be stored in the internal memory; when executed by the processor, it may cause the processor to perform a feature extraction method. The network interface of the computer device is used for network communication.
Those skilled in the art will understand that the structure shown in Figure 14 is only a block diagram of the part of the structure relevant to the scheme of this application and does not limit the computer device to which the scheme is applied; a specific computer device may include more or fewer components than shown in the figure, combine certain components, or have a different component layout.
In one embodiment, the feature extraction apparatus provided by this application may be implemented in the form of a computer program that can run on the computer device shown in Figure 14. The non-volatile storage medium of the computer device can store the program modules composing the feature extraction apparatus, for example, the obtaining module 1202, the mapping module 1204, the reconstruction module 1206, and the fusion module 1208 shown in Figure 12. The computer program composed of these program modules causes the computer device to execute the steps of the feature extraction methods of the embodiments of this application described in this specification. For example, the computer device can obtain the multi-modal original feature data of the target object through the obtaining module 1202 of the feature extraction apparatus 1200 shown in Figure 12; perform mapping dimension reduction on the original feature data of each mode through the mapping module 1204, decomposing it into shared feature data represented by variables and private feature data of each mode represented by variables; reconstruct the feature data of each mode through the reconstruction module 1206 according to the shared feature data and the private feature data; and, through the fusion module 1208, determine the difference between the original feature data of each mode and the reconstructed feature data, minimize the difference by adjusting the values of the variable-represented shared feature data and the variable-represented private feature data, and obtain the value of the shared feature data when the difference is minimized, thereby obtaining the finally fused shared feature of the target object.
Figure 15 is a schematic diagram of the internal structure of a computer device in one embodiment. Referring to Figure 15, the computer device may be a terminal. The computer device includes a processor, a memory, a network interface, a display screen, and an input device connected through a system bus. The memory includes a non-volatile storage medium and an internal memory. The non-volatile storage medium of the computer device can store an operating system and a computer program which, when executed, may cause the processor to perform a feature extraction method. The processor of the computer device provides computing and control capability and supports the operation of the entire computer device. A computer program may be stored in the internal memory; when executed by the processor, it may cause the processor to perform a feature extraction method. The network interface of the computer device is used for network communication. The display screen of the computer device may be a liquid crystal display screen, an electronic ink display screen, or the like. The input device of the computer device may be a touch layer covering the display screen, a key, trackball, or trackpad arranged on the terminal housing, or an external keyboard, trackpad, or mouse. The computer device may be a personal computer, a smart speaker, a mobile terminal, or a vehicle-mounted device; the mobile terminal includes at least one of a mobile phone, a tablet computer, a personal digital assistant, a wearable device, or the like.
Those skilled in the art will understand that the structure shown in Figure 15 is only a block diagram of the part of the structure relevant to the scheme of this application and does not limit the computer device to which the scheme is applied; a specific computer device may include more or fewer components than shown in the figure, combine certain components, or have a different component layout.
In one embodiment, a computer device is provided, including a memory and a processor. The memory stores a computer program which, when executed by the processor, causes the processor to execute the steps of the above feature extraction method. Here, the steps of the feature extraction method may be the steps of the feature extraction methods of the above embodiments.
In one embodiment, a computer-readable storage medium is provided, storing a computer program which, when executed by a processor, causes the processor to execute the steps of the above feature extraction method. Here, the steps of the feature extraction method may be the steps of the feature extraction methods of the above embodiments.
It should be understood that, although the steps in the embodiments of this application are numbered, they are not necessarily executed successively in the order indicated by the step numbers. Unless expressly stated otherwise herein, there is no strict order limiting the execution of these steps, and they may be executed in other orders. Moreover, at least some of the steps in each embodiment may include multiple sub-steps or multiple stages, which are not necessarily completed at the same moment but may be executed at different times; their execution order is not necessarily successive, and they may be executed in turn or alternately with other steps, or with at least part of the sub-steps or stages of other steps.
Those of ordinary skill in the art will appreciate that all or part of the processes in the above embodiment methods can be completed by a computer program instructing relevant hardware. The program can be stored in a non-volatile computer-readable storage medium and, when executed, may include the processes of the embodiments of the above methods. Any reference to memory, storage, a database, or other media used in the embodiments provided herein may include non-volatile and/or volatile memory. Non-volatile memory may include read-only memory (ROM), programmable ROM (PROM), electrically programmable ROM (EPROM), electrically erasable programmable ROM (EEPROM), or flash memory. Volatile memory may include random access memory (RAM) or an external cache. By way of illustration and not limitation, RAM is available in many forms, such as static RAM (SRAM), dynamic RAM (DRAM), synchronous DRAM (SDRAM), double data rate SDRAM (DDR SDRAM), enhanced SDRAM (ESDRAM), synchlink DRAM (SLDRAM), Rambus direct RAM (RDRAM), direct Rambus dynamic RAM (DRDRAM), and Rambus dynamic RAM (RDRAM).
The technical features of the above embodiments can be combined arbitrarily. For simplicity of description, not all possible combinations of the technical features in the above embodiments are described; however, as long as the combinations of these technical features contain no contradiction, they should all be considered as within the scope of this specification.
The above embodiments express only several implementations of the present invention, and their descriptions are comparatively specific and detailed, but they should not therefore be construed as limiting the scope of the patent. It should be pointed out that, for those of ordinary skill in the art, various modifications and improvements can be made without departing from the inventive concept, and these all belong to the protection scope of the present invention. Therefore, the protection scope of this patent shall be subject to the appended claims.
Claims (15)
1. A feature extraction method, the method comprising:
obtaining multi-modal original feature data of a target object;
performing mapping dimension reduction on the original feature data of each mode, decomposing it into shared feature data represented by variables and private feature data of each mode represented by variables;
reconstructing the feature data of each mode according to the shared feature data and the private feature data;
determining the difference between the original feature data of each mode and the reconstructed feature data;
minimizing the difference by adjusting the values of the variable-represented shared feature data and the variable-represented private feature data; and
obtaining the value of the shared feature data when the difference is minimized, to obtain the finally fused shared feature of the target object.
2. The method according to claim 1, wherein determining the difference between the original feature data of each mode and the reconstructed feature data comprises:
constructing a machine learning model, wherein the objective function of the machine learning model characterizes the difference between the original feature data of each mode and the reconstructed feature data;
wherein minimizing the difference by adjusting the values of the variable-represented shared feature data and the variable-represented private feature data comprises:
minimizing the value of the objective function of the machine learning model through iterative training, and updating, in each iteration round, the value of the shared feature data and the values of the private feature data, until an iteration stopping condition is met; and
obtaining the value of the shared feature data when the iteration stopping condition is met, to obtain the value of the shared feature data when the difference is minimized.
3. The method according to claim 2, wherein the method further comprises:
determining a local invariance regularization factor of each mode, wherein the local invariance regularization factor adjusts the consistency between a first similar structure and a second similar structure, the first similar structure being the similar structure between the original feature data of each mode, and the second similar structure being the similar structure between the sub shared feature data obtained after each original feature data in each mode is mapped to the lower dimension;
wherein constructing the machine learning model comprises:
constructing a basic machine learning model, wherein the objective function of the basic machine learning model characterizes the difference between the original feature data of each mode and the reconstructed feature data; and
combining the basic machine learning model with the local invariance regularization factor of each mode to obtain the machine learning model.
4. The method according to claim 3, wherein determining the local invariance regularization factor of each mode comprises:
obtaining, for each mode, a data weight matrix of the mode, wherein each weight in the data weight matrix characterizes how far apart a pair of original feature data of the mode are;
determining pairwise similarities between the sub shared feature data after the mapping dimension reduction; and
coupling the data weight matrix with the similarities to obtain the local invariance regularization factor of each mode.
5. The method according to claim 4, wherein obtaining, for each mode, the data weight matrix of the mode comprises:
for each mode, constructing a nearest-neighbour graph with the original feature data of the mode as vertices;
determining the weight of each edge in the nearest-neighbour graph according to the distance between the vertices it connects; and
constructing, from the edge weights, the data weight matrix of the mode in the original data space.
6. The method according to claim 2, wherein performing mapping dimension reduction on the original feature data of each mode and decomposing it into shared feature data represented by variables and private feature data of each mode represented by variables comprises:
obtaining, for each mode, a corresponding shared mapping matrix and private mapping matrix;
performing mapping decomposition on the original feature data of each mode according to the corresponding shared mapping matrix to obtain a shared feature variable matrix; and
performing mapping decomposition on the original feature data of each mode according to the corresponding private mapping matrix to obtain private feature variable matrices.
7. The method according to claim 6, wherein reconstructing the feature data of each mode according to the shared feature data and the private feature data comprises:
performing inverse mapping conversion on the shared feature variable matrix according to the shared mapping matrix corresponding to each mode to obtain a first inverse mapping result;
performing inverse mapping conversion on each private feature variable matrix according to the corresponding private mapping matrix to obtain a second inverse mapping result; and
reconstructing the feature data of each mode from the first inverse mapping result and the second inverse mapping result.
8. The method according to claim 6, wherein the shared mapping matrix is a shared mapping variable matrix and the private mapping matrix is a private mapping variable matrix; and
wherein updating, in each iteration round, the value of the shared feature data and the values of the private feature data until the iteration stopping condition is met comprises:
updating, in each iteration round, the value of the shared feature variable matrix, the values of the private feature variable matrices, the values of the shared mapping variable matrices, and the values of the private mapping variable matrices, until the iteration stopping condition is met.
9. The method according to claim 8, wherein updating, in each iteration round, the value of the shared feature variable matrix, the values of the private feature variable matrices, the values of the shared mapping variable matrices, and the values of the private mapping variable matrices comprises:
in each iteration round, successively choosing a current variable matrix from among the shared feature variable matrix, the private feature variable matrices, the shared mapping variable matrices, and the private mapping variable matrices; keeping the values of the non-current variable matrices fixed at their most recently updated values while optimizing and updating the value of the current variable matrix; and choosing the next current variable matrix from among the same matrices, to continue the optimization update.
10. The method according to any one of claims 1 to 9, wherein obtaining the multi-modal original feature data of the target object comprises:
obtaining multi-modal preset feature data of the target object, wherein the preset feature data are non-negative; and
normalizing the preset feature data to obtain the multi-modal original feature data.
11. The method according to any one of claims 1 to 9, wherein the method further comprises:
obtaining the finally fused shared features of multiple target objects;
clustering the multiple target objects according to the shared features; and
performing corresponding processing on the target objects according to the clustering result.
12. A feature extraction apparatus, wherein the apparatus comprises:
an obtaining module, configured to obtain multi-modal original feature data of a target object;
a mapping module, configured to perform mapping dimension reduction on the original feature data of each mode, decomposing it into shared feature data represented by variables and private feature data of each mode represented by variables;
a reconstruction module, configured to reconstruct the feature data of each mode according to the shared feature data and the private feature data; and
a fusion module, configured to determine the difference between the original feature data of each mode and the reconstructed feature data; minimize the difference by adjusting the values of the variable-represented shared feature data and the variable-represented private feature data; and obtain the value of the shared feature data when the difference is minimized, to obtain the finally fused shared feature of the target object.
13. The apparatus according to claim 12, wherein the fusion module is further configured to construct a machine learning model, the objective function of which characterizes the difference between the original feature data of each mode and the reconstructed feature data; minimize the value of the objective function of the machine learning model through iterative training, updating, in each iteration round, the value of the shared feature data and the values of the private feature data, until an iteration stopping condition is met; and obtain the value of the shared feature data when the iteration stopping condition is met, to obtain the value of the shared feature data when the difference is minimized.
14. A computer device, characterized by comprising a memory and a processor, the memory storing a computer program that, when executed by the processor, causes the processor to perform the steps of the method according to any one of claims 1 to 11.
15. A computer-readable storage medium, characterized in that a computer program is stored on the computer-readable storage medium, and the computer program, when executed by a processor, causes the processor to perform the steps of the method according to any one of claims 1 to 11.
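The iterative training described in claim 13 hinges on an iteration stopping condition. The toy sketch below assumes one common choice (stop when the per-round improvement of the objective drops below a tolerance, capped by a maximum number of rounds); a scalar objective stands in for the model's reconstruction difference, and every name is illustrative rather than taken from the patent.

```python
# Toy stand-in for claim 13's loop: minimize an objective by gradient steps,
# updating the variable each round, until the stopping condition is met.
# (Assumed condition: improvement below `tol`, capped at `max_rounds`.)

def train(x0, lr=0.1, tol=1e-8, max_rounds=1000):
    x = float(x0)
    prev = (x - 3.0) ** 2              # objective value before training
    rounds = 0
    for rounds in range(1, max_rounds + 1):
        x -= lr * 2.0 * (x - 3.0)      # one update round (gradient of the objective)
        cur = (x - 3.0) ** 2
        if abs(prev - cur) < tol:      # iteration stopping condition met
            break
        prev = cur
    return x, rounds

x_star, rounds = train(0.0)
assert abs(x_star - 3.0) < 1e-3        # converged near the minimizer
assert rounds < 1000                   # stopped by the condition, not the round cap
```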
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201910663371.0A CN110378423A (en) | 2019-07-22 | 2019-07-22 | Feature extracting method, device, computer equipment and storage medium |
Publications (1)
Publication Number | Publication Date |
---|---|
CN110378423A true CN110378423A (en) | 2019-10-25 |
Family
ID=68254878
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201910663371.0A Pending CN110378423A (en) | 2019-07-22 | 2019-07-22 | Feature extracting method, device, computer equipment and storage medium |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN110378423A (en) |
Application Events
- 2019-07-22: CN CN201910663371.0A patent/CN110378423A/en, status: active, Pending

Cited By (11)
Publication number | Priority date | Publication date | Assignee | Title
---|---|---|---|---
CN111091475A (en) * | 2019-12-12 | 2020-05-01 | 华中科技大学 | Social network feature extraction method based on non-negative matrix factorization
CN111091475B (en) * | 2019-12-12 | 2022-08-02 | 华中科技大学 | Social network feature extraction method based on non-negative matrix factorization
CN110874650A (en) * | 2020-01-16 | 2020-03-10 | 支付宝(杭州)信息技术有限公司 | Alliance learning method, device and system fusing public domain data and private data
CN111309850A (en) * | 2020-02-10 | 2020-06-19 | 深圳云天励飞技术有限公司 | Data feature extraction method and device, terminal equipment and medium
CN111309850B (en) * | 2020-02-10 | 2022-03-25 | 深圳云天励飞技术股份有限公司 | Data feature extraction method and device, terminal equipment and medium
CN111325221A (en) * | 2020-02-25 | 2020-06-23 | 青岛海洋科学与技术国家实验室发展中心 | Image feature extraction method based on image depth information
CN111325221B (en) * | 2020-02-25 | 2023-06-23 | 青岛海洋科技中心 | Image feature extraction method based on image depth information
CN111450531A (en) * | 2020-03-30 | 2020-07-28 | 腾讯科技(深圳)有限公司 | Virtual character control method, virtual character control device, electronic equipment and storage medium
CN112100145A (en) * | 2020-09-02 | 2020-12-18 | 南京三眼精灵信息技术有限公司 | Digital model sharing learning system and method
CN112100145B (en) * | 2020-09-02 | 2023-07-04 | 南京三眼精灵信息技术有限公司 | Digital model sharing learning system and method
CN114510525A (en) * | 2022-04-18 | 2022-05-17 | 深圳丰尚智慧农牧科技有限公司 | Data format conversion method and device, computer equipment and storage medium
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN110378423A (en) | Feature extracting method, device, computer equipment and storage medium | |
US20200242822A1 (en) | Digital Media Environment for Style-Aware Patching in a Digital Image | |
CN106537422A (en) | Systems and methods for capture of relationships within information | |
CN109063742A (en) | Butterfly identifies network establishing method, device, computer equipment and storage medium | |
CN109614935A (en) | Car damage identification method and device, storage medium and electronic equipment | |
CN109493417A (en) | Three-dimension object method for reconstructing, device, equipment and storage medium | |
WO2022105117A1 (en) | Method and device for image quality assessment, computer device, and storage medium | |
CN109086697A (en) | A kind of human face data processing method, device and storage medium | |
CN111322716B (en) | Air conditioner temperature automatic setting method, air conditioner, equipment and storage medium | |
CN105550649A (en) | Extremely low resolution human face recognition method and system based on unity coupling local constraint expression | |
CN112508092A (en) | Sample screening method, system, equipment and medium | |
CN106971197A (en) | The Subspace clustering method of multi-view data based on otherness and consistency constraint | |
CN113177592B (en) | Image segmentation method and device, computer equipment and storage medium | |
Yang et al. | Joint learning of unsupervised dimensionality reduction and gaussian mixture model | |
CN116152544A (en) | Hyperspectral image classification method based on residual enhancement spatial spectrum fusion hypergraph neural network | |
CN113469091B (en) | Face recognition method, training method, electronic device and storage medium | |
Wang et al. | Generative image inpainting with enhanced gated convolution and Transformers | |
CN110378883A (en) | Picture appraisal model generating method, image processing method, device, computer equipment and storage medium | |
CN111507259B (en) | Face feature extraction method and device and electronic equipment | |
CN117315090A (en) | Cross-modal style learning-based image generation method and device | |
CN109934926B (en) | Model data processing method, device, readable storage medium and equipment | |
CN114638823B (en) | Full-slice image classification method and device based on attention mechanism sequence model | |
CN116258923A (en) | Image recognition model training method, device, computer equipment and storage medium | |
CN115758271A (en) | Data processing method, data processing device, computer equipment and storage medium | |
CN115393376A (en) | Medical image processing method, medical image processing device, computer equipment and storage medium |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||