CN112435343A - Point cloud data processing method and device, electronic equipment and storage medium


Info

Publication number
CN112435343A
CN112435343A (application number CN202011328046.8A)
Authority
CN
China
Prior art keywords
point cloud, matrix, sub-feature, cloud data
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202011328046.8A
Other languages
Chinese (zh)
Inventor
牛建伟 (Niu Jianwei)
朱智 (Zhu Zhi)
谷宁波 (Gu Ningbo)
李青锋 (Li Qingfeng)
任涛 (Ren Tao)
于晓龙 (Yu Xiaolong)
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Hangzhou Hejian Technology Partnership (L.P.)
Original Assignee
Hangzhou Weishi Technology Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Hangzhou Weishi Technology Co., Ltd.
Priority claimed from application CN202011328046.8A
Publication of CN112435343A
Legal status: Pending

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 17/00: Three-dimensional [3D] modelling, e.g. data description of 3D objects
    • G06T 17/20: Finite element generation, e.g. wire-frame surface description, tesselation
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F 18/00: Pattern recognition
    • G06F 18/20: Analysing
    • G06F 18/21: Design or setup of recognition systems or techniques; Extraction of features in feature space; Blind source separation
    • G06F 18/213: Feature extraction, e.g. by transforming the feature space; Summarisation; Mappings, e.g. subspace methods
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F 18/00: Pattern recognition
    • G06F 18/20: Analysing
    • G06F 18/23: Clustering techniques
    • G06F 18/232: Non-hierarchical techniques
    • G06F 18/2321: Non-hierarchical techniques using statistics or function optimisation, e.g. modelling of probability density functions
    • G06F 18/23213: Non-hierarchical techniques using statistics or function optimisation with fixed number of clusters, e.g. K-means clustering
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06N: COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N 3/00: Computing arrangements based on biological models
    • G06N 3/02: Neural networks
    • G06N 3/04: Architecture, e.g. interconnection topology
    • G06N 3/045: Combinations of networks
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06N: COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N 3/00: Computing arrangements based on biological models
    • G06N 3/02: Neural networks
    • G06N 3/08: Learning methods

Abstract

The application provides a point cloud data processing method and device, an electronic device and a storage medium, relating to the technical field of deep learning. In the present application, a point cloud matrix is first formed based on a plurality of point cloud data. The point cloud matrix is then serialized by a serialization network layer to obtain a plurality of sub-feature matrices, where the dimension of each sub-feature matrix is larger than that of the point cloud matrix and all sub-feature matrices have the same size. Next, the sub-feature matrices are spliced to obtain a first feature matrix, in which the number of point cloud data is greater than in the point cloud matrix. Finally, the first feature matrix is reconstructed by a reconstruction network layer to obtain a target feature matrix with the same dimension as the point cloud matrix. Based on this method, the prior-art problem that point cloud data is difficult to reconstruct effectively can be solved.

Description

Point cloud data processing method and device, electronic equipment and storage medium
Technical Field
The application relates to the technical field of deep learning, in particular to a point cloud data processing method and device, electronic equipment and a storage medium.
Background
Point cloud data is the most intuitive and basic way for a computer to represent the real world. However, for reasons such as scene occlusion and low resolution, the point cloud data captured and generated by a vision system may suffer large-area loss or loss of local detail, producing highly sparse, incomplete point clouds.
Existing point cloud reconstruction (completion) techniques generally reconstruct point cloud data based on prior information about the basic structure of an object, such as symmetry information. However, the inventors have found through research that effectively reconstructing point cloud data remains difficult in the prior art.
Disclosure of Invention
In view of the above, an object of the present application is to provide a method and an apparatus for processing point cloud data, an electronic device, and a storage medium, so as to solve the problem in the prior art that it is difficult to effectively reconstruct point cloud data.
In order to achieve the above purpose, the embodiment of the present application adopts the following technical solutions:
a point cloud data processing method, comprising:
forming a point cloud matrix based on the obtained plurality of point cloud data;
carrying out serialization processing on the point cloud matrix based on a serialization network layer in a point cloud reconstruction model obtained through pre-training to obtain a plurality of sub-feature matrices, wherein the dimension of each sub-feature matrix is larger than that of the point cloud matrix, and the dimension of each sub-feature matrix is the same;
splicing the sub-feature matrixes to obtain a first feature matrix, wherein the number of point cloud data in the first feature matrix is greater than that in the point cloud matrix;
and reconstructing the first characteristic matrix based on a reconstruction network layer in the point cloud reconstruction model to obtain a target characteristic matrix with the same dimensionality as the point cloud matrix.
In a preferred option of the embodiment of the present application, in the point cloud data processing method, the step of performing serialization processing on the point cloud matrix based on a serialization network layer in a point cloud reconstruction model obtained through pre-training to obtain a plurality of sub-feature matrices includes:
a, sampling the point cloud matrix based on a sampling sub-network layer to obtain a sampling data set, wherein the sampling data set comprises a plurality of point cloud sampling data;
b, respectively taking each point cloud sampling data as a central point, and carrying out clustering processing based on a clustering sub-network layer to obtain a plurality of sampling data clusters corresponding to the sampling data set;
c, performing feature extraction processing on the plurality of sampling data clusters based on a feature extraction sub-network layer to obtain a sub-feature matrix corresponding to the sampling data set, wherein the feature extraction sub-network layer, the sampling sub-network layer and the clustering sub-network layer form a serialization network layer;
d, sampling the sub-feature matrix based on the sampling sub-network layer to obtain a new sampling data set, wherein the new sampling data set comprises at least one point cloud sampling data;
e, respectively taking each point cloud sampling data in the new sampling data set as a central point, and carrying out clustering processing based on the clustering sub-network layer to obtain a plurality of corresponding new sampling data clusters;
f, performing feature extraction processing on the plurality of new sampled data clusters based on the feature extraction sub-network layer to obtain sub-feature matrixes corresponding to the new sampled data sets;
and g, performing data interpolation processing on each obtained sub-feature matrix to obtain a plurality of sub-feature matrices with the same size.
In a preferred option of the embodiment of the present application, in the point cloud data processing method, step d, step e and step f are performed in sequence at least once, until the sub-feature matrix obtained by the last execution of step f contains only one point cloud datum.
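The iteration schedule described above can be sketched as follows. This is a minimal illustration, assuming (as the detailed description later suggests) that each sampling pass keeps half of the points; the function name is hypothetical.

```python
def serialization_rounds(n_points: int) -> list:
    """Hypothetical sketch: repeat sample/cluster/extract (steps d, e, f),
    halving the point count each round, until one point remains."""
    sizes = []
    n = n_points
    while n > 1:
        n = max(1, n // 2)  # each sampling pass keeps half the points
        sizes.append(n)
    return sizes

# Starting from 1024 points, the rounds shrink down to the 1-point global feature.
print(serialization_rounds(1024))
```

Each entry in the returned list is the point count of one intermediate sub-feature matrix; the final entry, 1, corresponds to the global feature.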
In a preferred option of the embodiment of the present application, in the point cloud data processing method, the step of performing a stitching process based on the plurality of sub-feature matrices to obtain a first feature matrix includes:
for each sub-feature matrix, sorting the row features of the sub-feature matrix in descending order, and forming a new sub-feature matrix from the first number of top-ranked row features;
and selecting a second number of new sub-feature matrixes from the new sub-feature matrixes corresponding to the sub-feature matrixes to perform splicing processing to obtain a first feature matrix.
In a preferred option of the embodiment of the present application, in the point cloud data processing method, the step of performing descending order sorting processing on the row features in the sub-feature matrix includes:
respectively calculating Euclidean distances between each row of features in the sub-feature matrix and global features, wherein the global features are obtained by performing serialization processing on the point cloud matrix, and the number of point cloud data in the global features is 1;
and performing descending order sorting processing on the row characteristics in the sub-characteristic matrix based on the Euclidean distance.
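The descending sort just described can be sketched in a few lines. This is an illustrative sketch, not the patented implementation: the row features and global feature are plain lists of floats, and the function and variable names are assumptions.

```python
import math

def sort_rows_descending(sub_matrix, global_feature, first_number):
    """Sketch (assumed details): rank each row of a sub-feature matrix by its
    Euclidean distance to the 1-point global feature, in descending order,
    and keep the top `first_number` rows as the new sub-feature matrix."""
    def distance(row):
        return math.sqrt(sum((a - b) ** 2 for a, b in zip(row, global_feature)))
    ranked = sorted(sub_matrix, key=distance, reverse=True)
    return ranked[:first_number]

rows = [[0.0, 0.0], [3.0, 4.0], [1.0, 1.0]]
top = sort_rows_descending(rows, [0.0, 0.0], 2)
print(top)  # rows farthest from the global feature come first
```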
In a preferred option of the embodiment of the present application, in the point cloud data processing method, the step of reconstructing the first feature matrix based on a reconstruction network layer in the point cloud reconstruction model to obtain a target feature matrix having the same dimension as the point cloud matrix includes:
performing first convolution processing on the first characteristic matrix based on a first convolution layer to obtain a first characteristic matrix with reduced dimensionality, wherein an activation function of the first convolution layer is a linear rectification function, and the first characteristic matrix with reduced dimensionality and the point cloud matrix have the same dimensionality;
and performing second convolution processing on the dimensionality-reduced first feature matrix based on a second convolution layer to obtain a target feature matrix, wherein an activation function of the second convolution layer is a hyperbolic tangent function, and the second convolution layer and the first convolution layer form a reconstruction network layer in the point cloud reconstruction model.
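A 1x1 convolution over an N x D feature matrix is equivalent to applying the same linear map to every point's feature vector. The sketch below shows the two-layer structure described above (ReLU, then tanh) in that per-point form; the weights are illustrative stand-ins, not trained parameters.

```python
import math

def relu(x):
    return max(0.0, x)

def pointwise_conv(points, weights, activation):
    """Sketch of a 1x1 convolution: each point's feature vector (length D_in)
    is multiplied by a D_in x D_out weight matrix, then passed through the
    activation function. Weights here are illustrative only."""
    out = []
    for p in points:
        row = []
        for j in range(len(weights[0])):
            s = sum(p[i] * weights[i][j] for i in range(len(p)))
            row.append(activation(s))
        out.append(row)
    return out

feats = [[1.0, -2.0], [0.5, 0.5]]               # N=2 points, D_in=2
w = [[1.0, 0.0], [0.0, 1.0]]                    # identity weights, D_out=2
reduced = pointwise_conv(feats, w, relu)        # first conv layer: ReLU
target = pointwise_conv(reduced, w, math.tanh)  # second conv layer: tanh
print(target)
```

In the actual model the first layer's weight shape would reduce the feature dimension to that of the point cloud matrix, as the text specifies.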
In a preferred option of the embodiment of the present application, in the point cloud data processing method, the method further includes a step of training the point cloud reconstruction model, where the step includes:
obtaining a plurality of point cloud data sets corresponding to a plurality of target objects, wherein at least two target objects with different identification information exist in the plurality of target objects, and each point cloud data set is formed based on a plurality of point cloud data corresponding to the target objects;
forming a rule of staggered distribution based on the identification information of the target object, and sequencing the plurality of point cloud data sets to obtain a set sequence;
and in the set sequence, sequentially selecting a third number of point cloud data sets to train the constructed neural network model to obtain the point cloud reconstruction model.
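The "staggered distribution" ordering of the training sets can be sketched as a simple interleave across object identities. The data-set structure (a mapping from object ID to its point cloud data sets) is an assumption made for illustration.

```python
from itertools import zip_longest

def interleave_by_object(datasets_by_id):
    """Sketch (assumed structure): order point cloud data sets so that sets
    belonging to different object IDs alternate, per the staggered rule."""
    groups = list(datasets_by_id.values())
    ordered = []
    for batch in zip_longest(*groups):
        ordered.extend(d for d in batch if d is not None)
    return ordered

sets = {"chair": ["chair_0", "chair_1"], "table": ["table_0", "table_1"]}
print(interleave_by_object(sets))
```

Training then walks this sequence, taking the third number of sets at a time, so consecutive batches mix object identities rather than seeing one object class for many steps in a row.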
The embodiment of the present application further provides a point cloud data processing apparatus, including:
a point cloud matrix obtaining module for forming a point cloud matrix based on the obtained plurality of point cloud data;
the point cloud matrix serialization module is used for serializing the point cloud matrix based on a serialization network layer in a point cloud reconstruction model obtained through pre-training to obtain a plurality of sub-feature matrices, wherein the dimension of each sub-feature matrix is larger than that of the point cloud matrix, and the dimension of each sub-feature matrix is the same;
the sub-feature matrix splicing module is used for carrying out splicing processing based on the plurality of sub-feature matrices to obtain a first feature matrix, wherein the number of point cloud data in the first feature matrix is greater than that in the point cloud matrix;
and the characteristic matrix reconstruction module is used for reconstructing the first characteristic matrix based on a reconstruction network layer in the point cloud reconstruction model to obtain a target characteristic matrix with the same dimensionality as the point cloud matrix.
On the basis, an embodiment of the present application further provides an electronic device, including:
a memory for storing a computer program;
and the processor is connected with the memory and is used for executing the computer program stored in the memory so as to realize the point cloud data processing method.
On the basis of the above, an embodiment of the present application further provides a computer-readable storage medium, which stores a computer program, and when the computer program is executed, the method for processing point cloud data described above is implemented.
According to the point cloud data processing method and device, electronic device and storage medium described above, the serialization network layer serializes the point cloud matrix into a plurality of sub-feature matrices of the same size. These sub-feature matrices are then spliced to obtain the first feature matrix, which the reconstruction network layer reconstructs into the target feature matrix, thereby reconstructing the point cloud data. Reconstruction therefore does not depend on prior information about the basic structure of the object (such as symmetry information). On the one hand, no prior information needs to be acquired separately, which is more convenient; on the other hand, point cloud data can be reliably and effectively reconstructed even when such prior information is unavailable, solving the prior-art problem that point cloud data is difficult to reconstruct effectively.
In order to make the aforementioned objects, features and advantages of the present application more comprehensible, preferred embodiments accompanied with figures are described in detail below.
Drawings
Fig. 1 is a block diagram of an electronic device according to an embodiment of the present disclosure.
Fig. 2 is a schematic flow chart of a point cloud data processing method according to an embodiment of the present disclosure.
Fig. 3 is a flowchart illustrating sub-steps included in step S120 in fig. 2.
Fig. 4 is a flowchart illustrating sub-steps included in step S130 in fig. 2.
Fig. 5 is a flowchart illustrating the sub-steps included in step S131 in fig. 4.
Fig. 6 is a flowchart illustrating sub-steps included in step S140 in fig. 2.
Fig. 7 is a schematic flowchart of other steps (a step of training a neural network model) included in the point cloud data processing method according to the embodiment of the present application.
Fig. 8 is a schematic block diagram of a point cloud data processing apparatus according to an embodiment of the present disclosure.
Reference numerals: 10 - electronic device; 12 - memory; 14 - processor; 100 - point cloud data processing apparatus; 110 - point cloud matrix obtaining module; 120 - point cloud matrix serialization module; 130 - sub-feature matrix splicing module; 140 - feature matrix reconstruction module.
Detailed Description
In order to make the objects, technical solutions and advantages of the embodiments of the present application clearer, the technical solutions in the embodiments of the present application will be clearly and completely described below with reference to the drawings in the embodiments of the present application, and it is obvious that the described embodiments are only a part of the embodiments of the present application, and not all the embodiments. The components of the embodiments of the present application, generally described and illustrated in the figures herein, can be arranged and designed in a wide variety of different configurations.
Thus, the following detailed description of the embodiments of the present application, presented in the accompanying drawings, is not intended to limit the scope of the claimed application, but is merely representative of selected embodiments of the application. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present application.
As shown in fig. 1, an embodiment of the present application provides an electronic device 10, which may include a memory 12, a processor 14, and a point cloud data processing apparatus 100.
Wherein the memory 12 and the processor 14 are electrically connected directly or indirectly to realize data transmission or interaction. For example, they may be electrically connected to each other via one or more communication buses or signal lines. The point cloud data processing device 100 includes at least one software functional module which can be stored in the memory 12 in the form of software or firmware (firmware). The processor 14 is configured to execute an executable computer program stored in the memory 12, for example, a software functional module and a computer program included in the point cloud data processing apparatus 100, so as to implement the point cloud data processing method provided by the embodiment of the present application.
Alternatively, the memory 12 may be, but is not limited to, a Random Access Memory (RAM), a Read-Only Memory (ROM), a Programmable Read-Only Memory (PROM), an Erasable Programmable Read-Only Memory (EPROM), an Electrically Erasable Programmable Read-Only Memory (EEPROM), and the like.
The processor 14 may be a general-purpose processor including a Central Processing Unit (CPU), a Graphics Processing Unit (GPU), and the like.
It will be appreciated that the configuration shown in FIG. 1 is merely illustrative and that the electronic device 10 may include more or fewer components than shown in FIG. 1 or may have a different configuration than shown in FIG. 1. For example, the electronic device 10 may further include a communication unit for performing information interaction with other devices, such as for obtaining point cloud data and the like.
In an alternative example, the electronic device 10 may be a server with data processing capabilities, such as a vision processing computer.
With reference to fig. 2, an embodiment of the present application further provides a point cloud data processing method, which can be applied to the electronic device 10. The method steps defined by the flow of the point cloud data processing method may be implemented by the electronic device 10.
The specific process shown in FIG. 2 will be described in detail below.
Step S110, a point cloud matrix is formed based on the obtained plurality of point cloud data.
In this embodiment, after obtaining the plurality of point cloud data, the electronic device 10 may form a point cloud matrix based on the plurality of point cloud data.
And step S120, carrying out serialization processing on the point cloud matrix based on a serialization network layer in the point cloud reconstruction model obtained through pre-training to obtain a plurality of sub-feature matrices.
In this embodiment, after the point cloud matrix is formed based on step S110, the electronic device 10 may perform serialization processing on the point cloud matrix based on a serialization network layer in a point cloud reconstruction model obtained through pre-training, so that a plurality of sub-feature matrices may be obtained.
The dimension of each sub-feature matrix is larger than that of the point cloud matrix (thereby increasing the dimensionality of the point cloud data), and all sub-feature matrices have the same size.
And step S130, splicing the plurality of sub-feature matrixes to obtain a first feature matrix.
In this embodiment, after obtaining the plurality of sub-feature matrices based on step S120, the electronic device 10 may perform a stitching process based on the plurality of sub-feature matrices, so that a first feature matrix formed by stitching may be obtained.
And the number of point cloud data in the first characteristic matrix is greater than that of the point cloud data in the point cloud matrix (so that the point cloud data is supplemented or expanded).
Step S140, reconstructing the first feature matrix based on a reconstruction network layer in the point cloud reconstruction model to obtain a target feature matrix with the same dimensionality as the point cloud matrix.
In this embodiment, after obtaining the first feature matrix based on step S130, the electronic device 10 may perform reconstruction processing on the first feature matrix based on a reconstruction network layer in the point cloud reconstruction model, so as to obtain a target feature matrix with the same dimension as the point cloud matrix.
Based on the method, when the point cloud data is processed, the prior information (such as symmetrical information) of the basic structure of the object can be not relied on, and on one hand, the prior information does not need to be acquired independently, so that the convenience is higher; on the other hand, under the condition that the prior information is not available, the reconstruction of the point cloud data can be reliably and effectively realized, so that the problem that the point cloud data is difficult to effectively reconstruct in the prior art is solved.
In the first aspect, it should be noted that, in step S110, a specific manner of forming the point cloud matrix is not limited, and may be selected according to actual application requirements.
For example, in an alternative example, after obtaining the plurality of point cloud data, processing may be performed directly based on the plurality of point cloud data to form a point cloud matrix.
For another example, in another alternative example, after the plurality of point cloud data are obtained, they may first be preprocessed by aligning their number to a target number, such as 1024: when the number of point cloud data is greater than the target number, a target number of point cloud data may be selected based on a random sampling algorithm or another algorithm; when it is less than the target number, the number may be increased to the target number based on an algorithm such as neighboring-point interpolation. A point cloud matrix is then formed from the preprocessed point cloud data. The size of the point cloud matrix is N x D, where N is the number of point cloud data and D is the dimension of each point cloud datum, such as the three dimensions of three-dimensional space, or six dimensions when the three color-space dimensions R, G and B are added.
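The alignment-to-target-number preprocessing can be sketched as follows. This is a minimal illustration under stated assumptions: random subsampling for the over-count case, and simple duplication as a crude stand-in for neighboring-point interpolation in the under-count case.

```python
import random

def align_point_count(points, target=1024, seed=0):
    """Sketch of the preprocessing described above: randomly subsample when
    there are more than `target` points; pad by duplicating random points
    (a stand-in for neighboring-point interpolation) when there are fewer."""
    rng = random.Random(seed)
    if len(points) > target:
        return rng.sample(points, target)
    padded = list(points)
    while len(padded) < target:
        padded.append(rng.choice(points))  # crude stand-in for interpolation
    return padded

cloud = [(float(i), 0.0, 0.0) for i in range(1500)]
print(len(align_point_count(cloud, target=1024)))  # 1024
```

The resulting list of `target` points, each a D-tuple, is what gets stacked into the N x D point cloud matrix.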
Wherein a manner of obtaining the plurality of point cloud data is not limited.
For example, in one alternative example, the point cloud data may come from an industrial application scenario. An industrial robot arm (a mechatronic device with anthropomorphic arm, wrist and hand functions, which may include a body for controlling posture and a dexterous hand for grasping objects) may recognize the position of a target object, and an image capture device (e.g., a 3D camera) built into the dexterous hand may be aimed at the target object to capture a target image. Irrelevant information in the target image can then be removed, and the point cloud data containing the target object extracted with a point cloud segmentation algorithm.
In the second aspect, it should be noted that, in step S120, a specific manner of performing the serialization processing to obtain the plurality of sub-feature matrices with the same size is not limited, and may be selected according to actual application requirements.
For example, in an alternative example, in order to make the information content included in the point cloud data in the obtained multiple sub-feature matrices more, so as to improve the reliability of the reconstructed point cloud data, with reference to fig. 3, step S120 may include step S121, step S122, step S123, step S124, step S125, step S126, and step S127, which is specifically described below.
And step S121, sampling the point cloud matrix based on the sampling sub-network layer to obtain a sampling data set.
In this embodiment, after the point cloud matrix is formed based on step S110, the point cloud matrix may be sampled based on a sampling sub-network layer (which belongs to the serialization network layer), so as to obtain a sampling data set.
Wherein the sample data set comprises a plurality of point cloud sample data.
And S122, respectively taking each point cloud sampling data as a central point, and carrying out clustering processing based on a clustering sub-network layer to obtain a plurality of sampling data clusters corresponding to the sampling data set.
In this embodiment, after the sampling data set is obtained based on step S121, for each point cloud sampling data in the sampling data set, a clustering process may be performed based on a clustering sub-network layer (the clustering sub-network layer belongs to the serialization network layer) with the point cloud sampling data as a central point, so as to obtain a sampling data cluster corresponding to the point cloud sampling data. In this way, for the sample data set, a corresponding plurality of sample data clusters can be obtained.
And S123, performing feature extraction processing on the plurality of sampling data clusters based on the feature extraction sub-network layer to obtain a sub-feature matrix corresponding to the sampling data set.
In this embodiment, after obtaining the plurality of sample data clusters based on step S122, feature extraction processing may be performed on the plurality of sample data clusters based on a feature extraction sub-network layer (the feature extraction sub-network layer belongs to the serialization network layer), so that a sub-feature matrix corresponding to the sample data set may be obtained.
Wherein the feature extraction sub-network layer, the sampling sub-network layer and the clustering sub-network layer may constitute the serialization network layer in the point cloud reconstruction model.
And step S124, sampling the sub-feature matrix based on the sampling sub-network layer to obtain a new sampling data set.
In this embodiment, after obtaining the sub-feature matrix corresponding to the sampled data set based on step S123, the sub-feature matrix may be sampled based on the sampling sub-network layer, so that a new sampled data set may be obtained.
Wherein the new sample data set may include at least one point cloud sample data.
And step S125, respectively taking each point cloud sampling data in the new sampling data set as a central point, and carrying out clustering processing based on the clustering sub-network layer to obtain a plurality of corresponding new sampling data clusters.
In this embodiment, after obtaining the new sampling data set based on step S124, for each point cloud sampling data in the new sampling data set, a clustering process may be performed based on the clustering sub-network layer with the point cloud sampling data as a central point, so as to obtain a new sampling data cluster corresponding to the point cloud sampling data. In this way, a corresponding plurality of new clusters of sampled data may be obtained for the new set of sampled data.
And step S126, performing feature extraction processing on the plurality of new sampled data clusters based on the feature extraction sub-network layer to obtain a sub-feature matrix corresponding to the new sampled data set.
In this embodiment, after obtaining the plurality of new sample data clusters based on step S125, feature extraction processing may be performed on the plurality of new sample data clusters based on the feature extraction sub-network layer, so that a sub-feature matrix corresponding to the new sample data set may be obtained.
And step S127, performing data interpolation processing on each obtained sub-feature matrix to obtain a plurality of sub-feature matrices with the same size.
In this embodiment, after the corresponding sub-feature matrices are obtained based on step S123 and step S126, data interpolation processing may be performed on each obtained sub-feature matrix, so as to obtain a plurality of sub-feature matrices with the same size.
Alternatively, in the above example, the specific manner of performing the sampling process based on steps S121 and S124 is not limited, and the manner of performing the sampling process twice may be the same or different.
For example, in an alternative example, the sampling processes in steps S121 and S124 are performed in the same manner, such as based on the Farthest Point Sampling (FPS) algorithm, so that the sampled point cloud data better represent the original data. Each sampling pass may keep half of the point cloud data in the corresponding matrix (in other examples, another fraction, such as one third).
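Farthest Point Sampling itself is a standard greedy algorithm; a minimal pure-Python sketch follows (the deployed layer would run a vectorized version on larger clouds).

```python
def farthest_point_sampling(points, k):
    """Sketch of Farthest Point Sampling (FPS): greedily pick the point
    farthest from the already-selected set, so the samples cover the cloud."""
    def d2(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    selected = [0]                                # start from the first point
    dist = [d2(p, points[0]) for p in points]     # distance to selected set
    while len(selected) < k:
        idx = max(range(len(points)), key=lambda i: dist[i])
        selected.append(idx)
        for i, p in enumerate(points):
            dist[i] = min(dist[i], d2(p, points[idx]))
    return [points[i] for i in selected]

line = [(float(i), 0.0) for i in range(8)]
print(farthest_point_sampling(line, 4))  # endpoints first, then interior points
```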
Alternatively, in the above example, the specific manner of performing the clustering process based on step S122 and step S125 is not limited, and the manner of performing the clustering process twice may be the same or different.
For example, in an alternative example, the clustering processes in steps S122 and S125 are performed in the same manner, such as searching a range of the same size around each point cloud sampling datum, taken as the center point, and finding its K nearest points, so that each obtained sampling data cluster and new sampling data cluster contains K+1 point cloud data.
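The center-plus-K-neighbors grouping can be sketched directly (a brute-force nearest-neighbor search here; the function name is an assumption):

```python
def knn_cluster(points, center, k):
    """Sketch of the clustering step: take the center point plus its K
    nearest neighbors, giving a cluster of K+1 point cloud data."""
    def d2(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    neighbors = sorted((p for p in points if p != center),
                       key=lambda p: d2(p, center))
    return [center] + neighbors[:k]

pts = [(0.0, 0.0), (1.0, 0.0), (2.0, 0.0), (9.0, 0.0)]
print(knn_cluster(pts, (0.0, 0.0), 2))  # cluster of K+1 = 3 points
```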
Alternatively, in the above example, the specific manner of performing the feature extraction processing based on steps S123 and S126 is not limited, and the manner of the two feature extraction processes may be the same or different.
For example, in an alternative example, the feature extraction processes in steps S123 and S126 are performed in the same manner. For instance, each sampling data cluster and new sampling data cluster may be processed by a PointNet network to obtain a corresponding vector of size 1 x C; combining the vectors of the plurality of clusters then forms the corresponding sub-feature matrix. That is, local-global feature extraction is performed on each obtained sampling data cluster through the PointNet network.
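The core of a PointNet-style cluster feature is a shared per-point transform followed by a symmetric max-pool. The sketch below is a heavily simplified stand-in (one linear map with illustrative weights instead of PointNet's learned MLPs), shown only to make the 1 x C output shape concrete.

```python
def pointnet_cluster_feature(cluster, weights):
    """Simplified PointNet-style extraction: apply a shared linear map to
    every point in the cluster, then max-pool per channel to get a single
    1 x C feature vector for the cluster."""
    per_point = []
    for p in cluster:
        per_point.append([sum(p[i] * weights[i][j] for i in range(len(p)))
                          for j in range(len(weights[0]))])
    # symmetric max-pooling makes the feature order-invariant
    return [max(col) for col in zip(*per_point)]

cluster = [(1.0, 2.0), (3.0, 0.0), (0.0, 1.0)]
w = [[1.0, 0.0], [0.0, 1.0]]  # illustrative identity weights, C=2
print(pointnet_cluster_feature(cluster, w))  # per-channel maxima
```

Stacking one such 1 x C vector per cluster yields the sub-feature matrix for the sampling data set.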
Alternatively, in the above example, the number of times step S124, step S125 and step S126 are executed is not limited, and may be selected according to the actual application requirement.
For example, in an alternative example, steps S124, S125 and S126 may be performed in sequence at least once, until the sub-feature matrix obtained by the last execution of step S126 includes only 1 point cloud data.
The sub-feature matrix obtained by performing step S126 last time may be referred to as a Global feature or a Global (Global) vector, where the Global feature includes the least amount of point cloud data but has the largest receptive field, i.e., contains the most information.
Alternatively, in the above example, the specific manner of performing the data interpolation processing based on step S127 is not limited, and may be selected according to the actual application requirements.
For example, in an alternative example, the number of point cloud data in each sub-feature matrix after data interpolation processing may be the same as the number of point cloud data in the point cloud matrix, such as 1024.
The specific interpolation processing method is also not limited, and may be an algorithm such as neighboring point interpolation.
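One plausible reading of the neighbouring-point interpolation, sketched in NumPy (naming is ours; the patent does not fix the algorithm): each point of the full cloud copies the feature row of its nearest sampled point, bringing every sub-feature matrix back to the full point count such as 1024.

```python
import numpy as np

def nearest_point_interpolate(full_xyz, sampled_xyz, sampled_feat):
    # For each original point, find its closest sampled point and copy
    # that sampled point's feature row.
    d = np.linalg.norm(full_xyz[:, None, :] - sampled_xyz[None, :, :], axis=2)
    return sampled_feat[d.argmin(axis=1)]

rng = np.random.default_rng(1)
full = rng.random((1024, 3))             # original point cloud
sampled = full[::4]                      # 256 sampled points
feats = rng.random((256, 8))             # their sub-feature matrix
upsampled = nearest_point_interpolate(full, sampled, feats)
```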
In the third aspect, it should be noted that, in step S130, a specific manner of performing the splicing process is not limited, and may be selected according to actual application requirements.
For example, in an alternative example, the multiple sub-feature matrices may be directly spliced based on the generated sequence of the multiple sub-feature matrices, so that the first feature matrix formed by the multiple sub-feature matrices may be obtained.
For another example, in another alternative example, in order to ensure that the obtained first feature matrix is not too redundant and has more information, in conjunction with fig. 4, step S130 may include step S131 and step S132, which are described in detail below.
Step S131, for each sub-feature matrix, performing descending-order sorting on the row features in the sub-feature matrix, and obtaining a new sub-feature matrix based on the first number of top-ranked row features.
In this embodiment, after the plurality of sub-feature matrices with the same size are obtained based on step S120, the row features in each sub-feature matrix may be sorted in descending order, and a new sub-feature matrix may be obtained based on the first number of top-ranked row features (for example, in an alternative example, when the number of point cloud data included in the sub-feature matrix is 1024, the first number may be 512).
That is, a first number of row features in each of the sub-feature matrices may be selected to represent the sub-feature matrix for forming a new sub-feature matrix.
Step S132, selecting a second number of new sub-feature matrixes from the new sub-feature matrixes corresponding to the sub-feature matrixes to perform splicing processing, so as to obtain a first feature matrix.
In this embodiment, after the plurality of new sub-feature matrices are obtained based on step S131, a second number of them may be selected for the splicing process, so as to obtain the first feature matrix (for example, in an alternative example, when the number of point cloud data included in the point cloud matrix is 1024, 10 new sub-feature matrices may be obtained, and the second number may be 8).
Optionally, in the above example, the specific manner of performing the descending order processing based on step S131 is not limited, and may be selected according to the actual application requirement.
For example, in an alternative example, in order to make the new sub-feature matrix include more information, the descending order process may be performed based on the difference between the row feature in the new sub-feature matrix and the global feature (in an alternative example, the sub-feature matrix obtained by performing step S126 last time in the above example). Based on this, in conjunction with fig. 5, step S131 may include step S131a and step S131b, which are described in detail below.
Step S131a, respectively calculating the Euclidean distance between each row feature in the sub-feature matrix and the global feature.
In this embodiment, after the plurality of sub-feature matrices with the same size are obtained based on step S120, the Euclidean distance between each row feature in each sub-feature matrix and the global feature may be calculated.
The global feature is obtained by performing serialization processing on the point cloud matrix, and the number of point cloud data in the global feature is 1; as in the above example, it is the sub-feature matrix obtained by the last execution of step S126.
Step S131b, performing descending order processing on the row features in the sub-feature matrix based on the euclidean distance.
In this embodiment, after the Euclidean distance between each row feature in each sub-feature matrix and the global feature is obtained based on step S131a, the row features in the sub-feature matrix may be sorted in descending order based on the Euclidean distance.
That is, in a sub-feature matrix after the descending-order sorting, the Euclidean distances corresponding to the row features decrease monotonically from the first row to the last row.
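Steps S131a/S131b, together with the splicing of step S132, can be sketched as follows (NumPy, using the example numbers from the text; for brevity the same matrix is stacked eight times, whereas the method actually splices eight distinct new sub-feature matrices):

```python
import numpy as np

def top_rows_by_distance(sub_feat, global_feat, first_number):
    # Sort rows by descending Euclidean distance to the global feature
    # and keep the top `first_number` rows.
    d = np.linalg.norm(sub_feat - global_feat, axis=1)
    return sub_feat[np.argsort(-d)[:first_number]]

rng = np.random.default_rng(2)
global_feat = rng.random(8)              # 1-point global feature, dimension 8
sub = rng.random((1024, 8))              # one interpolated sub-feature matrix
new_sub = top_rows_by_distance(sub, global_feat, 512)   # first number = 512
first_feature = np.concatenate([new_sub] * 8, axis=0)   # second number = 8
```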
Optionally, in the above example, the specific manner of performing the splicing processing based on step S132 is not limited, and may be selected according to the actual application requirements.
For example, in an alternative example, the generation order of each sub-feature matrix may be determined based on step S126 in the above example, and thus, the second number of new sub-feature matrices may be subjected to the splicing process according to the order, so as to obtain the first feature matrix.
After the PointNet network, the number of channels in the PointNet network may be configured so that the dimension of the obtained sub-feature matrix is raised to a target value; for example, if the dimension of the point cloud matrix is 3, the dimension of each sub-feature matrix may be 8. Thus, if the first number is 512 and the second number is 8, the size of the obtained first feature matrix is 8 × 512 × 8, that is, 4096 × 8: the number of point cloud data is 4096 (a 4-fold supplementation or expansion of the original 1024), and the dimension is 8.
In the fourth aspect, it should be noted that, in step S140, a specific manner of performing the reconfiguration process is not limited, and may be selected according to actual application requirements.
For example, in an alternative example, on the basis of ensuring that the dimension of the target feature matrix is the same as the dimension of the point cloud matrix, in order to make the target feature matrix have more information, in conjunction with fig. 6, step S140 may include step S141 and step S142, which are described in detail below.
Step S141, performing a first convolution process on the first feature matrix based on the first convolution layer to obtain a first feature matrix with reduced dimensionality.
In this embodiment, after the first feature matrix is obtained based on step S130, a first convolution process may be performed on it based on a first convolution layer (which belongs to the reconstruction network layer), so that a dimensionality-reduced first feature matrix may be obtained.
Wherein the activation function of the first convolution layer is a linear rectification function (such as the ReLU function), and the dimensionality-reduced first feature matrix has the same dimension as the point cloud matrix.
And step S142, performing second convolution processing on the dimensionality-reduced first feature matrix based on the second convolution layer to obtain a target feature matrix.
In this embodiment, after obtaining the first feature matrix with reduced dimension based on step S141, the second convolution processing may be performed on the first feature matrix with reduced dimension based on the second convolution layer, so that the target feature matrix may be obtained.
The activation function of the second convolution layer is a hyperbolic tangent function (such as a tanh function), and may be used to normalize the corresponding coordinate data to a (-1, 1) interval, and the second convolution layer and the first convolution layer form a reconstruction network layer in the point cloud reconstruction model.
Based on the above example, as when the size of the first feature matrix is 4096 × 8, in a specific application example, the convolution kernel size of the first convolution layer may be 1 × 8, and the number of channels may be 3, so that the size of the first feature matrix with reduced dimension is 4096 × 3.
The convolution kernel size of the second convolution layer may be 1 × 1, and the number of channels may be 3, so that the size of the target feature matrix obtained is 4096 × 3. The convolution kernel of the second convolution layer is 1 x 1, so that convolution processing with smaller granularity can be performed in the convolution process, the precision of the convolution processing is higher, and each point cloud data can be ensured to be subjected to convolution processing.
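With those shapes, the two reconstruction layers reduce to simple per-row linear maps, sketched below with untrained placeholder weights (the 1 × 8 kernel with 3 channels acts as an 8 → 3 map on each row, and the 1 × 1 kernel with 3 channels as a 3 → 3 map):

```python
import numpy as np

rng = np.random.default_rng(3)
w1 = rng.standard_normal((8, 3))         # 1x8 conv kernel, 3 output channels
w2 = rng.standard_normal((3, 3))         # 1x1 conv kernel, 3 output channels

def reconstruct(first_feature):
    h = np.maximum(first_feature @ w1, 0)    # first conv + ReLU: (4096, 3)
    return np.tanh(h @ w2)                   # second conv + tanh: coords in (-1, 1)

target = reconstruct(rng.random((4096, 8)))  # target feature matrix, 4096 x 3
```

The tanh in the last layer is what normalises the output coordinates into the (-1, 1) interval mentioned above.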
That is, in the above example, for a point cloud matrix of size 1024 × 3, a target feature matrix of size 4096 × 3 can be obtained after the above processing, expanding the point cloud data by 4 times. If the point cloud data needs further supplementation and expansion, steps S120, S130 and S140 may be performed again in sequence on the target feature matrix (with the target feature matrix now taking the place of the point cloud matrix as the input of step S120), so that the number of point cloud data is increased by 4 times again, giving a new target feature matrix of size 16384 × 3.
On the basis of the above example, in order to obtain the point cloud reconstruction model, corresponding network model training processing is also required. Based on this, with reference to fig. 7, the point cloud data processing method may further include step S150, step S160, and step S170, which are described in detail below.
Step S150, a plurality of point cloud data sets corresponding to the plurality of target objects are obtained.
In this embodiment, when the point cloud reconstruction model needs to be trained, the electronic device 10 may obtain a plurality of point cloud data sets corresponding to a plurality of target objects.
At least two target objects with different identification information exist in the plurality of target objects, and each point cloud data set is formed based on a plurality of point cloud data of the corresponding target object. That is, there is a one-to-one correspondence between the plurality of target objects and the plurality of point cloud data sets.
Step S160, sorting the plurality of point cloud data sets according to a rule of interleaved distribution of the identification information of the target objects, so as to obtain a set sequence.
In this embodiment, after obtaining the plurality of point cloud data sets based on step S150, the electronic device 10 may sort them according to a rule of interleaved distribution of the identification information of the target objects (in this way, the trained point cloud reconstruction model has better generalization capability), so as to obtain a set sequence.
And S170, sequentially selecting a third number of point cloud data sets from the set sequence to train the constructed neural network model to obtain the point cloud reconstruction model.
In this embodiment, after obtaining the set sequence based on step S160, the electronic device 10 may sequentially select a third number (the third number may be a Batch number, that is, a parameter Batch in deep learning or neural network training) of point cloud data sets from the set sequence to train the constructed neural network model, so as to obtain the point cloud reconstruction model.
To aid understanding of the above steps S150, S160 and S170, consider an example in which the target objects are parts in an industrial scene and the identification information is the shape information of each part (a rectangular parallelepiped, a cylinder, a triangular pyramid, or the like).
First, 1000 parts may be selected and each part photographed by a 3D camera; a complete and dense initial point cloud data set corresponding to each part is then obtained based on a point cloud segmentation algorithm, and each initial set may be randomly sampled to obtain a corresponding sparse point cloud data set, for example of 1024 point cloud data;
secondly, the obtained point cloud data sets may be sorted in an interleaved manner to obtain a set sequence, such as cuboid 1, cylinder 2, triangular pyramid 3, cuboid 4, cylinder 5, triangular pyramid 6, cuboid 7, cylinder 8, triangular pyramid 9;
and thirdly, 32 point cloud data sets at a time may be selected sequentially from the set sequence (that is, the first 32 point cloud data sets are used in the first training iteration, the 33rd to 64th in the second, and so on, so that 32 × 1024 point cloud data are used per iteration) to train the constructed neural network model. During training, the loss value may be computed by comparison against the corresponding initial point cloud data sets, and the loss function may be the Chamfer distance, so that a reliable and effective loss value can be computed even when the number of point cloud data in an initial set differs from the number of point cloud data after reconstruction.
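The Chamfer distance used for the loss can be sketched as follows; because it averages nearest-neighbour distances in each direction, it stays well defined when the two clouds have different sizes (the shapes below are illustrative, not the patent's):

```python
import numpy as np

def chamfer_distance(a, b):
    # Squared pairwise distances, then the mean nearest-neighbour
    # distance from a to b plus the mean from b to a.
    d2 = ((a[:, None, :] - b[None, :, :]) ** 2).sum(axis=2)
    return d2.min(axis=1).mean() + d2.min(axis=0).mean()

rng = np.random.default_rng(4)
dense_gt = rng.random((512, 3))          # complete, dense initial set
recon = rng.random((1024, 3))            # reconstructed, expanded cloud
loss = chamfer_distance(recon, dense_gt)
```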
With reference to fig. 8, an embodiment of the present application further provides a point cloud data processing apparatus 100, which can be applied to the electronic device 10. The point cloud data processing apparatus 100 may include a point cloud matrix obtaining module 110, a point cloud matrix serialization module 120, a sub-feature matrix concatenation module 130, and a feature matrix reconstruction module 140.
The point cloud matrix obtaining module 110 may be configured to form a point cloud matrix based on the obtained plurality of point cloud data. In this embodiment, the point cloud matrix obtaining module 110 may be configured to perform step S110 shown in fig. 2, and reference may be made to the foregoing description of step S110 regarding relevant contents of the point cloud matrix obtaining module 110.
The point cloud matrix serialization module 120 may be configured to perform serialization processing on the point cloud matrix based on a serialization network layer in a point cloud reconstruction model obtained through pre-training, so as to obtain a plurality of sub-feature matrices, where a dimension of each sub-feature matrix is greater than a dimension of the point cloud matrix, and a size of each sub-feature matrix is the same. In this embodiment, the point cloud matrix serialization module 120 may be configured to perform step S120 shown in fig. 2, and reference may be made to the description of step S120 about relevant contents of the point cloud matrix serialization module 120.
The sub-feature matrix stitching module 130 may be configured to perform stitching processing based on the plurality of sub-feature matrices to obtain a first feature matrix, where the number of point cloud data in the first feature matrix is greater than the number of point cloud data in the point cloud matrix. In this embodiment, the sub-feature matrix splicing module 130 may be configured to perform step S130 shown in fig. 2, and reference may be made to the foregoing description of step S130 for relevant contents of the sub-feature matrix splicing module 130.
The feature matrix reconstruction module 140 may be configured to perform reconstruction processing on the first feature matrix based on a reconstruction network layer in the point cloud reconstruction model, so as to obtain a target feature matrix with dimensions the same as those of the point cloud matrix. In this embodiment, the feature matrix reconstruction module 140 may be configured to perform step S140 shown in fig. 2, and reference may be made to the foregoing description of step S140 regarding the relevant content of the feature matrix reconstruction module 140.
On the basis of the above example, the point cloud data processing apparatus 100 may include other modules having different roles based on different requirements.
For example, in an alternative example, other modules may be used to:
obtaining a plurality of point cloud data sets corresponding to a plurality of target objects, wherein at least two target objects with different identification information exist in the plurality of target objects, and each point cloud data set is formed based on a plurality of point cloud data corresponding to the target objects; forming a rule of staggered distribution based on the identification information of the target object, and sequencing the plurality of point cloud data sets to obtain a set sequence; and in the set sequence, sequentially selecting a third number of point cloud data sets to train the constructed neural network model to obtain the point cloud reconstruction model.
In an embodiment of the present application, corresponding to the point cloud data processing method, a computer-readable storage medium is further provided, in which a computer program is stored, and the computer program executes each step of the point cloud data processing method when running.
The steps executed when the computer program runs are not described in detail herein, and reference may be made to the explanation of the point cloud data processing method.
In summary, the point cloud data processing method and apparatus, electronic device, and storage medium provided by the present application serialize a point cloud matrix through a serialization network layer, so that the point cloud matrix can be converted into a plurality of sub-feature matrices of the same size; a splicing process based on the plurality of sub-feature matrices then yields a first feature matrix, which can be reconstructed by a reconstruction network layer to obtain a target feature matrix, thereby realizing reconstruction of the point cloud data. Therefore, the reconstruction does not depend on prior information about the object's underlying structure (such as symmetry information). On the one hand, no prior information needs to be acquired separately, which is more convenient; on the other hand, reconstruction of point cloud data can be carried out reliably and effectively even when no such prior information is available, thereby solving the problem in the prior art that point cloud data is difficult to reconstruct effectively.
In the embodiments provided in the present application, it should be understood that the disclosed apparatus and method can be implemented in other ways. The apparatus and method embodiments described above are illustrative only, as the flowcharts and block diagrams in the figures illustrate the architecture, functionality, and operation of possible implementations of apparatus, methods and computer program products according to various embodiments of the present application. In this regard, each block in the flowchart or block diagrams may represent a module, segment, or portion of code, which comprises one or more executable instructions for implementing the specified logical function(s). It should also be noted that, in some alternative implementations, the functions noted in the block may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved. It will also be noted that each block of the block diagrams and/or flowchart illustration, and combinations of blocks in the block diagrams and/or flowchart illustration, can be implemented by special purpose hardware-based systems which perform the specified functions or acts, or combinations of special purpose hardware and computer instructions.
In addition, functional modules in the embodiments of the present application may be integrated together to form an independent part, or each module may exist separately, or two or more modules may be integrated to form an independent part.
The functions, if implemented in the form of software functional modules and sold or used as a stand-alone product, may be stored in a computer readable storage medium. Based on such understanding, the technical solution of the present application or portions thereof that substantially contribute to the prior art may be embodied in the form of a software product stored in a storage medium and including instructions for causing a computer device (which may be a personal computer, an electronic device, or a network device) to perform all or part of the steps of the method according to the embodiments of the present application. And the aforementioned storage medium includes: a U-disk, a removable hard disk, a Read-Only Memory (ROM), a Random Access Memory (RAM), a magnetic disk or an optical disk, and other various media capable of storing program codes. It should be noted that, in this document, the terms "comprises," "comprising," or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising an … …" does not exclude the presence of other identical elements in a process, method, article, or apparatus that comprises the element.
The above description is only a preferred embodiment of the present application and is not intended to limit the present application, and various modifications and changes may be made by those skilled in the art. Any modification, equivalent replacement, improvement and the like made within the spirit and principle of the present application shall be included in the protection scope of the present application.

Claims (10)

1. A point cloud data processing method is characterized by comprising the following steps:
forming a point cloud matrix based on the obtained plurality of point cloud data;
carrying out serialization processing on the point cloud matrix based on a serialization network layer in a point cloud reconstruction model obtained through pre-training to obtain a plurality of sub-feature matrices, wherein the dimension of each sub-feature matrix is larger than that of the point cloud matrix, and the size of each sub-feature matrix is the same;
splicing the sub-feature matrixes to obtain a first feature matrix, wherein the number of point cloud data in the first feature matrix is greater than that in the point cloud matrix;
and reconstructing the first characteristic matrix based on a reconstruction network layer in the point cloud reconstruction model to obtain a target characteristic matrix with the same dimensionality as the point cloud matrix.
2. The point cloud data processing method of claim 1, wherein the step of performing serialization processing on the point cloud matrix based on a serialization network layer in a point cloud reconstruction model obtained through pre-training to obtain a plurality of sub-feature matrices comprises:
a, sampling the point cloud matrix based on a sampling sub-network layer to obtain a sampling data set, wherein the sampling data set comprises a plurality of point cloud sampling data;
b, respectively taking each point cloud sampling data as a central point, and carrying out clustering processing based on a clustering sub-network layer to obtain a plurality of sampling data clusters corresponding to the sampling data set;
c, performing feature extraction processing on the plurality of sampling data clusters based on a feature extraction sub-network layer to obtain a sub-feature matrix corresponding to the sampling data set, wherein the feature extraction sub-network layer, the sampling sub-network layer and the clustering sub-network layer form a serialization network layer;
d, sampling the sub-feature matrix based on the sampling sub-network layer to obtain a new sampling data set, wherein the new sampling data set comprises at least one point cloud sampling data;
e, respectively taking each point cloud sampling data in the new sampling data set as a central point, and carrying out clustering processing based on the clustering sub-network layer to obtain a plurality of corresponding new sampling data clusters;
f, performing feature extraction processing on the plurality of new sampled data clusters based on the feature extraction sub-network layer to obtain sub-feature matrixes corresponding to the new sampled data sets;
and g, performing data interpolation processing on each obtained sub-feature matrix to obtain a plurality of sub-feature matrices with the same size.
3. The point cloud data processing method of claim 2, wherein the steps d, e and f are performed at least once in sequence until the number of point cloud data included in the sub-feature matrix obtained by performing the step f last time is 1.
4. The point cloud data processing method of claim 1, wherein the step of performing stitching processing based on the plurality of sub-feature matrices to obtain a first feature matrix comprises:
for each sub-feature matrix, performing descending order sorting processing on the row features in the sub-feature matrix, and obtaining a new sub-feature matrix based on a first number of row features sorted in the front;
and selecting a second number of new sub-feature matrixes from the new sub-feature matrixes corresponding to the sub-feature matrixes to perform splicing processing to obtain a first feature matrix.
5. The point cloud data processing method of claim 4, wherein the step of sorting the row features in the sub-feature matrix in descending order comprises:
respectively calculating Euclidean distances between each row of features in the sub-feature matrix and global features, wherein the global features are obtained by performing serialization processing on the point cloud matrix, and the number of point cloud data in the global features is 1;
and performing descending order sorting processing on the row characteristics in the sub-characteristic matrix based on the Euclidean distance.
6. The point cloud data processing method of claim 1, wherein the step of reconstructing the first feature matrix based on a reconstruction network layer in the point cloud reconstruction model to obtain a target feature matrix having the same dimension as the point cloud matrix comprises:
performing first convolution processing on the first characteristic matrix based on a first convolution layer to obtain a first characteristic matrix with reduced dimensionality, wherein an activation function of the first convolution layer is a linear rectification function, and the first characteristic matrix with reduced dimensionality and the point cloud matrix have the same dimensionality;
and performing second convolution processing on the dimensionality-reduced first feature matrix based on a second convolution layer to obtain a target feature matrix, wherein an activation function of the second convolution layer is a hyperbolic tangent function, and the second convolution layer and the first convolution layer form a reconstruction network layer in the point cloud reconstruction model.
7. The point cloud data processing method of any one of claims 1-6, further comprising the step of training the point cloud reconstruction model, the step comprising:
obtaining a plurality of point cloud data sets corresponding to a plurality of target objects, wherein at least two target objects with different identification information exist in the plurality of target objects, and each point cloud data set is formed based on a plurality of point cloud data corresponding to the target objects;
forming a rule of staggered distribution based on the identification information of the target object, and sequencing the plurality of point cloud data sets to obtain a set sequence;
and in the set sequence, sequentially selecting a third number of point cloud data sets to train the constructed neural network model to obtain the point cloud reconstruction model.
8. A point cloud data processing apparatus, comprising:
a point cloud matrix obtaining module for forming a point cloud matrix based on the obtained plurality of point cloud data;
the point cloud matrix serialization module is used for serializing the point cloud matrix based on a serialization network layer in a point cloud reconstruction model obtained through pre-training to obtain a plurality of sub-feature matrices, wherein the dimension of each sub-feature matrix is larger than that of the point cloud matrix, and the size of each sub-feature matrix is the same;
the sub-feature matrix splicing module is used for carrying out splicing processing based on the plurality of sub-feature matrices to obtain a first feature matrix, wherein the number of point cloud data in the first feature matrix is greater than that in the point cloud matrix;
and the characteristic matrix reconstruction module is used for reconstructing the first characteristic matrix based on a reconstruction network layer in the point cloud reconstruction model to obtain a target characteristic matrix with the same dimensionality as the point cloud matrix.
9. An electronic device, comprising:
a memory for storing a computer program;
a processor connected to the memory for executing the computer program stored in the memory to implement the point cloud data processing method of any one of claims 1 to 7.
10. A computer-readable storage medium storing a computer program, wherein the computer program is configured to implement the point cloud data processing method according to any one of claims 1 to 7 when executed.
CN202011328046.8A 2020-11-24 2020-11-24 Point cloud data processing method and device, electronic equipment and storage medium Pending CN112435343A (en)

CN112435343A true CN112435343A (en) 2021-03-02


Cited By (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112990010A (en) * 2021-03-15 2021-06-18 深圳大学 Point cloud data processing method and device, computer equipment and storage medium
CN112990010B (en) * 2021-03-15 2023-08-18 深圳大学 Point cloud data processing method and device, computer equipment and storage medium
CN113052955A (en) * 2021-03-19 2021-06-29 西安电子科技大学 Point cloud completion method, system and application
CN113052955B (en) * 2021-03-19 2023-06-30 西安电子科技大学 Point cloud completion method, system and application
CN114119923A (en) * 2021-11-29 2022-03-01 浙江大学 Three-dimensional face reconstruction method and device and electronic equipment
CN114119923B (en) * 2021-11-29 2022-07-19 浙江大学 Three-dimensional face reconstruction method and device and electronic equipment
CN114283294A (en) * 2021-12-20 2022-04-05 平安普惠企业管理有限公司 Neural network point cloud feature extraction method, system, equipment and storage medium

Similar Documents

Publication Publication Date Title
CN112435343A (en) Point cloud data processing method and device, electronic equipment and storage medium
Riegler et al. OctNetFusion: Learning depth fusion from data
CN108710830B (en) Human body 3D posture estimation method combining dense connection attention pyramid residual error network and isometric limitation
Wechsler et al. Invariant object recognition using a distributed associative memory
Guo et al. A completed modeling of local binary pattern operator for texture classification
CN111445418A (en) Image defogging method and device and computer equipment
CN112912890A (en) Method and system for generating synthetic point cloud data using generative models
CN107329962B (en) Image retrieval database generation method, and method and device for enhancing reality
EP3074926A1 (en) Method and system for exacting face features from data of face images
CN113159232A (en) Three-dimensional target classification and segmentation method
CN112036381B (en) Visual tracking method, video monitoring method and terminal equipment
Rara et al. Model-based 3D shape recovery from single images of unknown pose and illumination using a small number of feature points
CN113312966B (en) Action recognition method and device based on first person viewing angle
CN110599588A (en) Particle reconstruction method and device in three-dimensional flow field, electronic device and storage medium
CN116797640A (en) Depth and 3D key point estimation method for intelligent companion line inspection device
CN112529897A (en) Image detection method and device, computer equipment and storage medium
CN115131384B (en) Bionic robot 3D printing method, device and medium based on edge preservation
Li et al. Real-time action recognition by feature-level fusion of depth and inertial sensor
WO2019233654A1 (en) Method for determining a type and a state of an object of interest
Zhu et al. Depth estimation for deformable object using a multi-layer neural network
CN111178299B (en) Image processing method, image processing device, electronic equipment and storage medium
CN113887289A (en) Monocular three-dimensional object detection method, device, equipment and product
CN112580442A (en) Behavior identification method based on multi-dimensional pyramid hierarchical model
Zheng et al. Object detection algorithm based on feature enhancement
Shafiq et al. More for less: Insights into convolutional nets for 3D point cloud recognition

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
TA01 Transfer of patent application right

Effective date of registration: 20220920

Address after: Room 1016, 10th Floor, Building 1, No. 768, Jianghong Road, Changhe Street, Binjiang District, Hangzhou, Zhejiang 310000

Applicant after: Hangzhou Hejian Technology Partnership (L.P.)

Address before: 310000 room 1014, 10th floor, building 1, no.768 Jianghong Road, Changhe street, Binjiang District, Hangzhou City, Zhejiang Province

Applicant before: Hangzhou Weishi Technology Co.,Ltd.