CN113780460A - Material identification method and device, robot, electronic equipment and storage medium - Google Patents

Material identification method and device, robot, electronic equipment and storage medium Download PDF

Info

Publication number
CN113780460A
CN113780460A (application CN202111111280.XA)
Authority
CN
China
Prior art keywords
data
detected
feature
pooling
dictionary
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202111111280.XA
Other languages
Chinese (zh)
Inventor
刘嘉瑞
蒿杰
梁俊
舒琳
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Xintiao Technology Guangzhou Co ltd
Institute of Automation of Chinese Academy of Science
Guangdong Institute of Artificial Intelligence and Advanced Computing
Original Assignee
Xintiao Technology Guangzhou Co ltd
Institute of Automation of Chinese Academy of Science
Guangdong Institute of Artificial Intelligence and Advanced Computing
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Xintiao Technology Guangzhou Co ltd, Institute of Automation of Chinese Academy of Science, Guangdong Institute of Artificial Intelligence and Advanced Computing
Priority to CN202111111280.XA
Publication of CN113780460A
Legal status: Pending

Links

Images

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00 Pattern recognition
    • G06F18/20 Analysing
    • G06F18/24 Classification techniques
    • G06F18/241 Classification techniques relating to the classification model, e.g. parametric or non-parametric approaches
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00 Pattern recognition
    • G06F18/20 Analysing
    • G06F18/21 Design or setup of recognition systems or techniques; Extraction of features in feature space; Blind source separation
    • G06F18/211 Selection of the most significant subset of features
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F40/00 Handling natural language data
    • G06F40/20 Natural language analysis
    • G06F40/237 Lexical tools
    • G06F40/242 Dictionaries
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F40/00 Handling natural language data
    • G06F40/20 Natural language analysis
    • G06F40/253 Grammatical analysis; Style critique

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Data Mining & Analysis (AREA)
  • Artificial Intelligence (AREA)
  • General Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • Bioinformatics & Cheminformatics (AREA)
  • Evolutionary Computation (AREA)
  • Evolutionary Biology (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Bioinformatics & Computational Biology (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Health & Medical Sciences (AREA)
  • Audiology, Speech & Language Pathology (AREA)
  • Computational Linguistics (AREA)
  • General Health & Medical Sciences (AREA)
  • Manipulator (AREA)

Abstract

The embodiments of the application provide a material identification method and device, a robot, an electronic device and a storage medium, relating to the technical field of artificial intelligence. The method acquires data to be detected corresponding to different materials, collected by a tactile sensing device; sparsely encodes the data to be detected using preset dictionary data to obtain encoded data; performs feature pooling on the encoded data to obtain feature data; and classifies the feature data with a preset classifier to obtain the classification result of the material. A material identification robot collects the data from the target object, so that adjective identification of the target object is achieved and the danger that direct contact identification poses in existing methods is avoided.

Description

Material identification method and device, robot, electronic equipment and storage medium
Technical Field
The application relates to the technical field of artificial intelligence, in particular to a material identification method, a material identification device, a robot, electronic equipment and a storage medium.
Background
In dangerous tasks such as fire fighting and disaster relief, workers often cannot directly touch and examine on-site objects because of hazards such as narrow spaces, high temperature and high pressure; such direct contact identification carries a certain danger.
Disclosure of Invention
An object of the embodiments of the present application is to provide a material identification method and device, a robot, an electronic device and a storage medium, in which a material identification robot collects data from a target object, thereby achieving adjective identification of the target object and solving the danger of direct contact identification by existing methods in certain special situations.
The embodiment of the application provides a material identification method, a material identification device, a robot, electronic equipment and a storage medium, wherein the method comprises the following steps:
acquiring to-be-detected data corresponding to different materials acquired by a touch sensing device;
carrying out sparse coding on the data to be detected by utilizing preset dictionary data to obtain coded data;
performing feature pooling on the encoded data to obtain feature data;
and classifying the characteristic data by using a preset classifier to obtain a classification result of the material.
In this implementation, the data to be detected collected by the tactile sensing device is analyzed with the material identification algorithm, finally achieving adjective identification of the target object.
Further, before the step of sparsely encoding the data to be detected by using the preset dictionary data, the method further includes:
dividing the data to be detected of each object into matrix data of the same width according to the atom length;
combining the matrix data of all objects into a two-dimensional matrix whose width is the atom length.
In this implementation, the preprocessing of the data to be detected is specified: data of different lengths is divided into matrix data of the same width according to the atom length.
Further, before the step of sparsely encoding the data to be detected by using the preset dictionary data, the method further includes:
acquiring training data corresponding to different materials collected by a tactile sensing device, wherein the training data comprises positive pressure data and 9-axis IMU data collected with a mechanical arm in the pressing, fast-sliding and slow-sliding contact modes respectively, as well as adjective data obtained by touching the material surface with fingers and label data of the material attributes corresponding to the adjectives;
and performing dictionary learning on the preprocessed training data to obtain the dictionary data.
In this implementation, dictionary learning is performed with the training data to obtain the dictionary data used for sparse coding in subsequent testing.
Further, the performing feature pooling on the encoded data to obtain feature data includes:
acquiring a coding coefficient corresponding to each data to be detected;
performing pooling operation on the coding coefficients to obtain a feature matrix;
and splicing the feature matrixes of all the data to be detected to generate feature data.
In this implementation, the feature data is obtained through a pooling operation and then used for adjective classification.
Further, before the step of classifying the feature data by using a preset classifier to obtain a classification result of the material, the method further includes:
and training the classifier by using the training data and the label data corresponding to each material.
In this implementation, the adjective classification result of each material is tested separately according to that material's label data.
The embodiment of the present application further provides a material identification device, the device includes:
the acquisition module is used for acquiring the data to be detected corresponding to different materials acquired by the touch sensing device;
the encoding module is used for carrying out sparse encoding on the data to be detected by utilizing preset dictionary data so as to obtain encoded data;
the pooling module is used for performing feature pooling on the coded data to obtain feature data;
and the classification module is used for classifying the characteristic data by utilizing a preset classifier so as to obtain a classification result of the material.
In this implementation, dictionary learning and a classifier are used to identify the data to be detected collected by the tactile sensing device, thereby classifying the adjectives of the target object's material.
Further, the pooling module comprises:
the coefficient acquisition module is used for acquiring a coding coefficient corresponding to each data to be detected;
the characteristic matrix acquisition module is used for carrying out pooling operation on the coding coefficients to obtain a characteristic matrix;
and the characteristic data acquisition module is used for splicing the characteristic matrixes of all the data to be detected to generate characteristic data.
In this implementation, the feature data obtained through the pooling operation is used as the input of the classifier.
The embodiment of the present application further provides a material identification robot, including the material identification device, still include:
the tactile sensing device is arranged at the end of a finger joint of the manipulator and used for collecting the data to be detected;
and the mechanical arm is connected with the manipulator and used for controlling the manipulator to move, so that the tactile sensing device contacts the surface of the target object in the pressing, fast-sliding and slow-sliding contact modes.
In this implementation, the mechanical arm and the manipulator control the moving path and pose of the tactile sensing device, so that it contacts the target object and collects the data to be detected in the pressing, fast-sliding and slow-sliding contact modes. This avoids the danger that direct contact identification poses in existing methods.
An embodiment of the present application further provides an electronic device, where the electronic device includes a memory and a processor, where the memory is used to store a computer program, and the processor runs the computer program to enable the electronic device to execute the material identification method described in any one of the foregoing descriptions.
An embodiment of the present application further provides a readable storage medium, where computer program instructions are stored, and when the computer program instructions are read and executed by a processor, the method for identifying a material quality is performed.
Drawings
In order to more clearly illustrate the technical solutions of the embodiments of the present application, the drawings required in the embodiments are briefly described below. It should be understood that the following drawings illustrate only some embodiments of the present application and therefore should not be considered limiting of the scope; those skilled in the art can obtain other related drawings from these drawings without inventive effort.
Fig. 1 is a schematic view of a mechanical arm of a material recognition robot according to an embodiment of the present disclosure;
FIG. 2 is a flowchart of a material identification method according to an embodiment of the present disclosure;
FIG. 3 is a flowchart of material identification provided in an embodiment of the present application;
FIG. 4 is a flow chart of a training process provided by an embodiment of the present application;
FIG. 5 is a schematic diagram of a process of training data provided in an embodiment of the present application;
fig. 6 is a schematic diagram of a two-dimensional matrix obtained after preprocessing provided in the embodiment of the present application;
fig. 7 is a schematic diagram of a coefficient matrix in dictionary learning according to an embodiment of the present application;
fig. 8 is a schematic diagram of data splicing after pooling provided in an embodiment of the present application;
FIG. 9 is a flow chart of a pooling process provided by an embodiment of the present application;
FIG. 10 is a block diagram illustrating the structure of a material identification device according to an embodiment of the present disclosure;
fig. 11 is a block diagram of another material identification device according to an embodiment of the present disclosure.
Reference numerals:
100-an acquisition module; 200-an encoding module; 300-a pooling module; 301-coefficient acquisition module; 302-a feature matrix acquisition module; 303-a characteristic data acquisition module; 400-a classification module; 500-a pre-processing module; 600-a dictionary learning module; 700-classifier training module; 801-tactile sensing means; 802-a robot arm; 803-robotic arm.
Detailed Description
The technical solutions in the embodiments of the present application will be described below with reference to the drawings in the embodiments of the present application.
It should be noted that: like reference numbers and letters refer to like items in the following figures, and thus, once an item is defined in one figure, it need not be further defined and explained in subsequent figures. Meanwhile, in the description of the present application, the terms "first", "second", and the like are used only for distinguishing the description, and are not to be construed as indicating or implying relative importance.
Referring to fig. 1, fig. 1 is a schematic view of a mechanical arm 803 of a material recognition robot according to an embodiment of the present disclosure. When objective factors such as environmental hazards or narrow spaces mean that no person can reach the scene, or environmental factors such as dense smoke, darkness or limited bandwidth leave the visual information insufficient or unobtainable, objects cannot be classified by vision; or classification by vision or other sensors is ambiguous, and touch is needed for further discrimination. In these situations a material identification robot is needed to perform material identification and so solve the above problems.
The robot controls the mechanical arm 803 and the manipulator 802 to collect data, and identifies the material of the target object using the collected data.
The tactile sensing device 801 is mounted at the end of the manipulator 802, and the manipulator 802 is mounted at the end of the mechanical arm 803. The mechanical arm 803 and the manipulator 802 can be controlled with ROS (Robot Operating System) so that the tactile sensing device 801 comes into contact with the target object. Specifically:
Assuming the approximate position of the target object is known, the pose of the end of the mechanical arm 803 can be determined from it, and the trajectory of the end of the mechanical arm 803 can be planned with an inverse-kinematics method, so that the mechanical arm 803, carrying the manipulator 802 and the tactile sensing device 801, reaches a specified position above the target object. Once above that position, the root joint of the index finger of the manipulator 802 is controlled independently to move downward until it contacts the surface of the target object; that is, with the position of the mechanical arm 803 fixed, one joint of the manipulator 802 slowly probes downward while the pressure data of the sensor is monitored in real time. When the maximum value of the pressure data exceeds a fixed threshold, the experimental condition is judged to have been reached.
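For illustration only, a minimal Python sketch of this probing loop; the interface functions read_pressure_array and step_finger_joint are hypothetical stand-ins, since the disclosure itself drives the joints through ROS:

import numpy as np

CONTACT_THRESHOLD = 0.5  # assumed value and units; tune for the real sensor

def lower_finger_until_contact(read_pressure_array, step_finger_joint,
                               step=-0.001, max_steps=5000):
    """Lower the index-finger root joint until the maximum of the tactile
    pressure array exceeds the threshold (the experimental condition)."""
    for _ in range(max_steps):
        pressure = np.asarray(read_pressure_array())  # positive-pressure array
        if pressure.max() > CONTACT_THRESHOLD:
            return True  # contact reached; start data collection
        step_finger_joint(step)  # small downward increment (assumed radians)
    return False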
After the experimental condition is met, the sensor data is collected in real time. The mechanical arm 803 interacts with the target object in three contact modes, namely pressing, fast sliding and slow sliding, and each interaction mode is coded, specifically: 0 (not started), 1 (started), 2 (exploring), 3 (pressing), 4 (fast sliding), 5 (slow sliding); a stretch of data with the same consecutive code is called an interaction phase.
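A minimal sketch (illustrative, not part of the disclosure) of how such interaction phases could be extracted from the recorded code stream:

from itertools import groupby

def split_into_phases(codes, samples):
    """Group consecutive samples that share the same interaction code
    (0-5 as listed above) into interaction phases."""
    phases, start = [], 0
    for code, run in groupby(codes):
        n = sum(1 for _ in run)
        phases.append((code, samples[start:start + n]))
        start += n
    return phases

# split_into_phases([0, 0, 3, 3, 3, 4, 4], samples) yields one phase each
# for codes 0, 3 and 4.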
The collected data can then be identified with the material identification method, so that the adjective category of the target object is judged and the danger of direct manual contact identification is avoided.
An embodiment of the present application further provides a material identification method, which may be applied to the above material identification robot. As shown in fig. 2, a flowchart of the material identification method, it specifically includes the following steps:
step S100: acquiring to-be-detected data corresponding to different materials acquired by a touch sensing device 801;
step S200: carrying out sparse coding on the data to be detected by utilizing preset dictionary data to obtain coded data;
step S300: performing feature pooling on the encoded data to obtain feature data;
step S400: and classifying the characteristic data by using a preset classifier to obtain a classification result of the material.
As shown in fig. 3, the material identification flow is: the single datum to be detected is preprocessed, sparsely coded with the dictionary obtained in the training process, the coding result is pooled to obtain the test feature data, and the trained classifier classifies it to obtain the material classification result of the data to be detected.
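Put together, the test-time flow could look like the following sketch (Python/NumPy); segment, the dictionaries, the classifiers and the sparsity level T come from the training sketches later in this text, and the (p, q) keying over interaction modes and sensors is an assumption of this illustration:

import numpy as np
from sklearn.linear_model import orthogonal_mp

def identify_material(raw_signals, dictionaries, classifiers, A, T):
    """Preprocess -> sparse-code -> max-pool -> classify one piece of data.
    raw_signals: {(p, q): 1-D signal} per interaction mode p and sensor q."""
    feats = []
    for key, signal in raw_signals.items():
        Y = segment(signal, A)                    # (L, A) overlapping windows
        X = orthogonal_mp(dictionaries[key], Y.T,
                          n_nonzero_coefs=T)      # (K, L) codes, L > 1 assumed
        feats.append(X.max(axis=1))               # max-pool over segments
    feature = np.concatenate(feats).reshape(1, -1)
    return [adj for adj, clf in classifiers.items()
            if clf.predict(feature)[0] == 1]      # adjectives that apply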
As shown in fig. 4, the flow of the training process is: each datum $y_i^{p,q}$ extracted from the same sub data set is preprocessed into a uniform format, and a dictionary is then learned for that sub data set independently. Dictionary learning uses the K-SVD (K-singular value decomposition) method; the data is coefficient-coded with the learned dictionary, the several groups of learned coefficients of each group of data are pooled as that group's features, and the features are trained with a classification algorithm to obtain the final result.
The dictionary and the classifier are trained as follows:
Training data corresponding to the different materials is acquired with the tactile sensing device 801. The training data comprises the positive pressure data and 9-axis IMU data collected with the mechanical arm 803 in the pressing, fast-sliding and slow-sliding contact modes, as well as the adjective data obtained by touching the material surfaces with fingers and the label data of the material attributes corresponding to those adjectives. Dictionary learning is then performed on the preprocessed training data to obtain the dictionary data.
First, the materials to be identified must be collected, for example more than 20 materials including brick, glass, plastic and sponge, and data is acquired multiple times (for example 20 times) with the contact modes above to obtain the collected data.
Several volunteers are blindfolded and their fingers guided to the object surfaces to feel them, yielding more than 20 perceived adjectives for the target object, such as hard/soft and rough/smooth. The adjectives are then labeled: an adjective judged to match the attribute of the corresponding material is labeled 1, and one that does not match is labeled -1. This forms the label data, which is combined with the collected data to generate the training data. The number of object types is denoted n.
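As an illustration of this labeling scheme (the adjective list and annotations below are made up for the example):

import numpy as np

ADJECTIVES = ["hard", "soft", "rough", "smooth"]  # the study uses 20+ of these

def build_label_data(annotations):
    """annotations: {material: set of adjectives judged to apply}.
    Returns {material: +1/-1 vector over ADJECTIVES}."""
    return {m: np.array([1 if a in adjs else -1 for a in ADJECTIVES])
            for m, adjs in annotations.items()}

labels = build_label_data({"brick": {"hard", "rough"}, "sponge": {"soft"}})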
The collected data includes the positive pressure array data and 9-axis IMU data of the tactile sensing device 801, together with the code of the interaction mode at each moment, output in time order. The specific data format is shown in the following table:
TABLE 1
(The data format table appears only as an image in the original publication and is not reproduced here.)
After all the data is obtained, the required data is extracted, namely the data collected in the fast-sliding, slow-sliding and pressing contact modes. The data collected from the different target objects over multiple acquisitions is organized by sensor dimension into different sub data sets, denoted $Y^{p,q}$, where p denotes the p-th interaction mode and q denotes the q-th sensor. Each sub data set contains all the data of the same sensor in the same interaction phase across the 20 acquisitions of the n target objects.
Before training, the training data must be preprocessed. It should be noted that the preprocessing of the training data is exactly the same as the preprocessing of the data to be detected, so the latter is not described again below.
Preprocessing divides training data of different lengths into matrices of the same width according to the atom length. A single basis vector of the dictionary is called an atom, and its length is denoted A. The time series is divided into several segments of equal length, with a 50% overlap between consecutive segments.
Specifically, training data of different lengths is converted into L × A matrix data with a 50% overlap ratio, where L is the number of segments obtained when the data is windowed with atom length A and 50% overlap, and A is the width of the matrix. Fig. 5 is a schematic diagram of this processing of the training data.
For each target object the individual sequences $y_i^{p,q}$ are extracted, where i denotes the i-th sequence in the sub data set. Each sequence is processed as above into two-dimensional matrix data $Y_i^{p,q}$, and after this processing the matrices are spliced into a single two-dimensional matrix $Y^{p,q}$ of length $\sum_i L_i$ and width A, as shown in fig. 6, a schematic diagram of the two-dimensional matrix obtained after preprocessing.
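A minimal sketch of this preprocessing, assuming each input sequence is a 1-D array at least A samples long:

import numpy as np

def segment(series, A):
    """Window a 1-D series into length-A segments with 50% overlap (L_i x A)."""
    hop = A // 2
    n = (len(series) - A) // hop + 1   # number of segments L_i
    return np.stack([series[i * hop : i * hop + A] for i in range(n)])

def build_Y(sequences, A):
    """Splice the per-sequence matrices Y_i^{p,q} into the (sum L_i) x A
    two-dimensional matrix Y^{p,q}."""
    return np.vstack([segment(np.asarray(s), A) for s in sequences])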
Dictionary learning is performed with the preprocessed training data to obtain the dictionary data, sparse coding is carried out with that dictionary, and the coding result serves as the data feature for classification. As shown in fig. 7, a schematic diagram of the coefficient matrix in dictionary learning: the dictionary contains K atoms, each of length A, so the dictionary matrix is A × K and the learned coefficient matrix is K × L. The specific process of dictionary learning is as follows:
The goal of dictionary learning is, for a set of observations of one class of data, to learn a set of bases D such that the data Y can be represented by linear combinations of those bases:

$\min_{D,X} \|Y - DX\|_F^2 \quad \text{s.t.} \quad \|x_l\|_0 \le T \ \ \forall l$

where X represents the projection of the data Y into the space of the dictionary D, i.e. the sparse code of Y, and $x_l$ denotes the l-th column of the coefficient matrix; the constraint $\|x_l\|_0 \le T$ guarantees the sparsity of the codes.
The dictionary learning specifically comprises the following steps:
First, with the dictionary fixed, the coefficients X are obtained using the orthogonal matching pursuit (OMP) algorithm.
Minimizing the objective over X exactly is an NP-hard problem, which is solved approximately with a pursuit algorithm. Using the OMP algorithm, the residual in the current state is computed at each step, the atom $d_k$ closest to the residual is found, and the coefficient $x_k$ corresponding to that atom is updated, minimizing the residual:

$\|Y - DX\|_F^2 = \Big\|\Big(Y - \sum_{j \ne k} d_j x_j\Big) - d_k x_k\Big\|_F^2 = \|E_k - d_k x_k\|_F^2$

where $y_i$ (short for $y_i^{p,q}$) is a data sample, $x_i$ is its coding coefficient under the dictionary D, $E_k$ denotes the residual of Y with the contribution of the k-th atom $d_k$ removed, and the subscript F denotes the norm, here taken to be 2, i.e. the Frobenius norm.
The coefficients in the above formula could be obtained by applying SVD directly to $E_k$, but the resulting $x_k$ would then not be sparse. The solution is to restrict attention to the samples actually related to the current atom:

$\omega_k = \{\, l \mid 1 \le l \le L,\ x_k(l) \ne 0 \,\}$
Applying SVD to the restricted residual $E_k^R$ (the columns of $E_k$ indexed by $\omega_k$) gives:

$E_k^R = U \Delta V^T$

The left singular vector corresponding to the largest singular value is taken as the new dictionary atom, $\tilde d_k = u_1$, and the corresponding sparse representation $\tilde x_k^R$ is updated to the product of the first column vector of V and the corresponding first singular value $\Delta(1,1)$, i.e. $\tilde x_k^R = \Delta(1,1)\, v_1$. These steps are repeated until every atom has found its sparse representation.
Second, with the coefficients fixed, the dictionary D is updated under the same loss function: for the k-th column atom $d_k$, the loss corresponding to atom k is separated out and optimized individually:

$\min_{d_k} \|E_k - d_k x_k\|_F^2$
The above two steps are iterated alternately until the error falls below a given threshold.
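A toy NumPy/scikit-learn rendering of the two alternating steps described above (a sketch, not the disclosed implementation; it assumes the signals are the columns of an A x L matrix with L >= K):

import numpy as np
from sklearn.linear_model import orthogonal_mp

def ksvd(Y, K, T, n_iter=20, seed=0):
    """K-SVD sketch: returns dictionary D (A x K) and sparse codes X (K x L)."""
    rng = np.random.default_rng(seed)
    D = Y[:, rng.choice(Y.shape[1], size=K, replace=False)].astype(float)
    D /= np.linalg.norm(D, axis=0)               # unit-norm initial atoms
    for _ in range(n_iter):
        # Step 1: fix D, code every signal with OMP (at most T nonzeros each).
        X = orthogonal_mp(D, Y, n_nonzero_coefs=T)
        # Step 2: fix the supports, update each atom on the signals using it.
        for k in range(K):
            used = np.flatnonzero(X[k])          # omega_k
            if used.size == 0:
                continue
            X[k, used] = 0.0
            E_k = Y[:, used] - D @ X[:, used]    # restricted residual E_k^R
            U, S, Vt = np.linalg.svd(E_k, full_matrices=False)
            D[:, k] = U[:, 0]                    # new atom: leading left vector
            X[k, used] = S[0] * Vt[0]            # delta(1,1) times v_1
    return D, X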
The learned dictionary is then used for sparse coding. For each sub data set, after dictionary learning is completed, the target dictionary matrix $D^{p,q}$ is obtained; a new datum is projected onto it to obtain the corresponding dictionary code $X^{p,q}$. The coding again uses orthogonal matching pursuit, as in the first step of dictionary learning.
The dictionary matrix $D^{p,q}$ of each stage of each sensor is trained separately, with 6 groups drawn at random from the 10 readings of each datum for dictionary training. The atoms learned by the dictionary are likewise vectors of length A. The atom length A and the dictionary size K are hyperparameters; the parameters giving the largest F1 score on the adjective classification task are used.
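Coding new data against the learned dictionary then reuses the same OMP routine, e.g.:

from sklearn.linear_model import orthogonal_mp

def sparse_code(D_pq, Y_new, T):
    """Project preprocessed data (A x L, segments as columns) onto the
    dictionary D^{p,q} (A x K); returns the K x L code X^{p,q}."""
    return orthogonal_mp(D_pq, Y_new, n_nonzero_coefs=T)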
The specific process of feature pooling is as follows:
After the coding coefficients $X^{p,q}$ of a sensor have been learned, the coefficient matrix $X_i^{p,q}$ corresponding to each datum is pooled: for the features $X_i^{p,q}$ of each datum, the maximum value along each dimension is recorded as $z_i^{p,q}$. The pooled data is then spliced back together: as shown in fig. 8, the maximum of each row of each matrix is taken, and all the maxima are spliced together to form the feature data.
The coding coefficients obtained for the different dimensions and different actions are then spliced into an overall feature: after pooling, the features $z_i^{p,q}$ of each datum across the different sub data sets are spliced together as the final feature $x_i$ of that datum. The matrix X formed by the final features of the different data is the feature data, i.e. the input to the subsequent adjective classification, as shown in fig. 8 (data splicing after pooling).
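In code, the pooling and splicing amount to one line per sub data set (sketch):

import numpy as np

def pool_and_splice(coef_matrices):
    """coef_matrices: the K x L_i coding-coefficient matrices of one datum,
    one per (interaction mode, sensor) sub data set. Max-pool each over its
    segments (columns) and splice the pooled vectors into the final x_i."""
    return np.concatenate([C.max(axis=1) for C in coef_matrices])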
At test time, the encoded data corresponding to the data to be detected is pooled exactly as in the training process. As shown in fig. 9, the flow of the pooling process specifically comprises the following steps:
step S301: acquiring a coding coefficient corresponding to each data to be detected;
step S302: performing pooling operation on the coding coefficients to obtain a feature matrix;
step S303: and splicing the feature matrixes of all the data to be detected to generate feature data.
For the adjective classification, the classification algorithm used may be, for example, a support vector machine or a multilayer perceptron; it is not limited here.
For example, when a binary support vector machine is trained for each adjective, each kind of adjective can be classified separately, with data matching the adjective attribute labeled 1 and data not matching it labeled -1, so that training is completed from the adjective's label data.
A support vector machine determines a hyperplane by finding the support vectors, and divides the data into two classes with maximum margin to the hyperplane. The optimization goal is to maximize the distance from the sample points closest to the hyperplane on both sides to the hyperplane.
The hyperplane is described as:

$\omega^T x + b = 0$

The distance from a point $x = (x_1, x_2, \ldots, x_n)$ to the plane $\omega^T x + b = 0$ is:

$\dfrac{|\omega^T x + b|}{\|\omega\|}$

The support vectors are the vectors closest to the hyperplane; if the distance from a support vector to the hyperplane is d, then every other vector lies at distance at least d, from which it follows that:

$\dfrac{\omega^T x + b}{\|\omega\|} \ge d \ \ (y = 1), \qquad \dfrac{\omega^T x + b}{\|\omega\|} \le -d \ \ (y = -1)$

Combining the two cases (and rescaling $\omega$ and b) gives:

$y(\omega^T x + b) \ge 1$

The objective function is to maximize the distance from the support vectors to the hyperplane, i.e.:

$\max_{\omega, b} \dfrac{2\, y(\omega^T x + b)}{\|\omega\|}$

Since $y(\omega^T x + b) = 1$ for the support vectors, the above formula is equivalent to:

$\max_{\omega, b} \dfrac{2}{\|\omega\|}$

For convenience of calculation this is rewritten as:

$\min_{\omega, b} \dfrac{1}{2}\|\omega\|^2 \quad \text{s.t.} \ \ y_i(\omega^T x_i + b) \ge 1$
The problem can be solved using Lagrange multipliers.
During training, for a given adjective, the labels of the objects corresponding to the data are divided into 1 and -1, and a support vector machine is used for that adjective alone. In the classification process, an RBF kernel is used as the kernel function of the support vector machine, and a regularization term is used to avoid overfitting.
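A sketch of this per-adjective training with scikit-learn (the dict-of-labels interface is an assumption of the example):

from sklearn.svm import SVC

def train_adjective_classifiers(features, labels_per_adjective, C=1.0):
    """features: (n_samples, d) pooled feature matrix; labels_per_adjective:
    {adjective: array of +1/-1 labels}. One binary RBF-kernel SVM per
    adjective; C is the regularization term mentioned above."""
    classifiers = {}
    for adjective, y in labels_per_adjective.items():
        clf = SVC(kernel="rbf", C=C)
        clf.fit(features, y)
        classifiers[adjective] = clf
    return classifiers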
When the data to be detected is tested, the classification for each adjective is tested separately, and the adjectives whose classifiers output 1 are displayed as the attributes of the material.
An embodiment of the present application further provides a material identification device, as shown in fig. 10, a structural block diagram of the material identification device, applied to the above material identification method; the device includes but is not limited to:
the acquisition module 100 is used for acquiring to-be-detected data corresponding to different materials acquired by the touch sensing device 801;
the encoding module 200 is configured to perform sparse encoding on the data to be detected by using preset dictionary data to obtain encoded data;
a pooling module 300 for feature pooling the encoded data to obtain feature data;
the classification module 400 is configured to classify the feature data by using a preset classifier to obtain a classification result of the material.
As shown in fig. 11, a structural block diagram of another material identification device, the pooling module 300 of this device includes:
a coefficient obtaining module 301, configured to obtain a coding coefficient corresponding to each piece of data to be detected;
a feature matrix obtaining module 302, configured to perform pooling operation on the coding coefficients to obtain a feature matrix;
the characteristic data obtaining module 303 is configured to splice characteristic matrices of all data to be detected to generate characteristic data.
The apparatus further comprises a pre-processing module 500 for:
dividing the data to be measured of each object into matrix data with the same width according to the atom length; the matrix data for all objects is combined into a two-dimensional matrix of atomic lengths.
The apparatus further comprises a dictionary learning module 600 and a classifier training module 700, specifically:
the dictionary learning module 600 is configured to acquire training data corresponding to different materials acquired by the tactile sensing device 801, where the training data includes positive pressure data and 9-axis imu data acquired by using the mechanical arm 803 in a pressing, fast-sliding and slow-sliding contact manner, and also includes adjective data acquired by sensing the material surface with a finger and label data of material attributes corresponding to the adjective data; and performing dictionary learning on the preprocessed training data to obtain the dictionary data.
The classifier training module 700 is configured to train a classifier using the training data and the label data corresponding to each material.
The specific training process is not described herein.
An embodiment of the present application further provides an electronic device, where the electronic device includes a memory and a processor, the memory is used for storing a computer program, and the processor runs the computer program to enable the electronic device to execute the material identification method in the foregoing.
An embodiment of the present application further provides a readable storage medium, where computer program instructions are stored, and when the computer program instructions are read and executed by a processor, the method for identifying a material quality is performed.
In the embodiments provided in the present application, it should be understood that the disclosed apparatus and method can be implemented in other ways. The apparatus embodiments described above are merely illustrative, and for example, the flowchart and block diagrams in the figures illustrate the architecture, functionality, and operation of possible implementations of apparatus, methods and computer program products according to various embodiments of the present application. In this regard, each block in the flowchart or block diagrams may represent a module, segment, or portion of code, which comprises one or more executable instructions for implementing the specified logical function(s). It should also be noted that, in some alternative implementations, the functions noted in the block may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved. It will also be noted that each block of the block diagrams and/or flowchart illustration, and combinations of blocks in the block diagrams and/or flowchart illustration, can be implemented by special purpose hardware-based systems which perform the specified functions or acts, or combinations of special purpose hardware and computer instructions.
In addition, functional modules in the embodiments of the present application may be integrated together to form an independent part, or each module may exist separately, or two or more modules may be integrated to form an independent part.
The functions, if implemented in the form of software functional modules and sold or used as a stand-alone product, may be stored in a computer readable storage medium. Based on such understanding, the technical solution of the present application or portions thereof that substantially contribute to the prior art may be embodied in the form of a software product stored in a storage medium and including instructions for causing a computer device (which may be a personal computer, a server, or a network device) to execute all or part of the steps of the method according to the embodiments of the present application. And the aforementioned storage medium includes: a U-disk, a removable hard disk, a Read-Only Memory (ROM), a Random Access Memory (RAM), a magnetic disk or an optical disk, and other various media capable of storing program codes.
The above description is only an example of the present application and is not intended to limit the scope of the present application, and various modifications and changes may be made by those skilled in the art. Any modification, equivalent replacement, improvement and the like made within the spirit and principle of the present application shall be included in the protection scope of the present application. It should be noted that: like reference numbers and letters refer to like items in the following figures, and thus, once an item is defined in one figure, it need not be further defined and explained in subsequent figures.
The above description is only for the specific embodiments of the present application, but the scope of the present application is not limited thereto, and any person skilled in the art can easily conceive of the changes or substitutions within the technical scope of the present application, and shall be covered by the scope of the present application. Therefore, the protection scope of the present application shall be subject to the protection scope of the claims.
It is noted that, herein, relational terms such as first and second, and the like may be used solely to distinguish one entity or action from another entity or action without necessarily requiring or implying any actual such relationship or order between such entities or actions. Also, the terms "comprises," "comprising," or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising an … …" does not exclude the presence of other identical elements in a process, method, article, or apparatus that comprises the element.

Claims (10)

1. A material identification method, characterized in that the method comprises:
acquiring to-be-detected data corresponding to different materials acquired by a touch sensing device;
carrying out sparse coding on the data to be detected by utilizing preset dictionary data to obtain coded data;
performing feature pooling on the encoded data to obtain feature data;
and classifying the characteristic data by using a preset classifier to obtain a classification result of the material.
2. The material quality identification method according to claim 1, wherein before the step of sparsely encoding the data to be detected using the preset dictionary data, the method further comprises:
dividing the data to be detected of each object into matrix data of the same width according to the atom length;
combining the matrix data of all objects into a two-dimensional matrix whose width is the atom length.
3. The material quality identification method according to claim 1, wherein before the step of sparsely encoding the data to be detected using the preset dictionary data, the method further comprises:
acquiring training data corresponding to different materials acquired by a touch sensing device, wherein the training data comprises positive pressure data and 9-axis imu data acquired by using a mechanical arm in a pressing, fast-sliding and slow-sliding contact mode respectively, and also comprises adjective data acquired by sensing the surface of the material by fingers and label data of material attributes corresponding to the adjective data;
and performing dictionary learning on the preprocessed training data to obtain the dictionary data.
4. The material recognition method of claim 1, wherein the pooling of features of the encoded data to obtain feature data comprises:
acquiring a coding coefficient corresponding to each data to be detected;
performing pooling operation on the coding coefficients to obtain a feature matrix;
and splicing the feature matrixes of all the data to be detected to generate feature data.
5. The method of claim 3, wherein before the step of classifying the feature data by using a preset classifier to obtain the classification result of the material, the method further comprises:
and training the classifier by using the training data and the label data corresponding to each material.
6. A material quality identifying device, the device comprising:
the acquisition module is used for acquiring the data to be detected corresponding to different materials acquired by the touch sensing device;
the encoding module is used for carrying out sparse encoding on the data to be detected by utilizing preset dictionary data so as to obtain encoded data;
the pooling module is used for performing feature pooling on the coded data to obtain feature data;
and the classification module is used for classifying the characteristic data by utilizing a preset classifier so as to obtain a classification result of the material.
7. The material recognition device of claim 6, wherein the pooling module comprises:
the coefficient acquisition module is used for acquiring a coding coefficient corresponding to each data to be detected;
the characteristic matrix acquisition module is used for carrying out pooling operation on the coding coefficients to obtain a characteristic matrix;
and the characteristic data acquisition module is used for splicing the characteristic matrixes of all the data to be detected to generate characteristic data.
8. A material quality recognition robot comprising the material quality recognition apparatus according to any one of claims 6 to 7, and further comprising:
the touch sensing device is arranged at the end of a finger joint of the manipulator and used for collecting the data to be detected;
and the mechanical arm is connected with the manipulator and used for controlling the manipulator to move so that the touch sensing device contacts the surface of the target object in the pressing, fast-sliding and slow-sliding contact modes.
9. An electronic device, comprising a memory for storing a computer program and a processor for executing the computer program to cause the electronic device to perform the material quality identification method according to any one of claims 1 to 5.
10. A readable storage medium having stored thereon computer program instructions which, when read and executed by a processor, perform the material identification method of any one of claims 1 to 5.
CN202111111280.XA 2021-09-18 2021-09-18 Material identification method and device, robot, electronic equipment and storage medium Pending CN113780460A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202111111280.XA CN113780460A (en) 2021-09-18 2021-09-18 Material identification method and device, robot, electronic equipment and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202111111280.XA CN113780460A (en) 2021-09-18 2021-09-18 Material identification method and device, robot, electronic equipment and storage medium

Publications (1)

Publication Number Publication Date
CN113780460A (en) 2021-12-10

Family

ID=78852695

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202111111280.XA Pending CN113780460A (en) 2021-09-18 2021-09-18 Material identification method and device, robot, electronic equipment and storage medium

Country Status (1)

Country Link
CN (1) CN113780460A (en)


Patent Citations (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20130184868A1 (en) * 2012-01-17 2013-07-18 Seiko Epson Corporation Robot controller, robot system, robot control method
CN105005787A (en) * 2015-06-24 2015-10-28 清华大学 Dexterous hand tactile information based material classification method based on joint sparse coding
CN106056082A (en) * 2016-05-31 2016-10-26 杭州电子科技大学 Video action recognition method based on sparse low-rank coding
US20200073482A1 (en) * 2017-03-21 2020-03-05 Pcms Holdings, Inc. Method and system for the detection and augmentation of tactile interactions in augmented reality
CN107024271A (en) * 2017-03-29 2017-08-08 兰州理工大学 Mechanical oscillation signal compression reconfiguration method and system
CN107463952A (en) * 2017-07-21 2017-12-12 清华大学 A kind of object material sorting technique based on multi-modal fusion deep learning
CN110084181A (en) * 2019-04-24 2019-08-02 哈尔滨工业大学 A kind of remote sensing images Ship Target Detection method based on sparse MobileNetV2 network
CN111204476A (en) * 2019-12-25 2020-05-29 上海航天控制技术研究所 Vision-touch fusion fine operation method based on reinforcement learning
CN112668607A (en) * 2020-12-04 2021-04-16 深圳先进技术研究院 Multi-label learning method for recognizing tactile attributes of target object

Non-Patent Citations (4)

* Cited by examiner, † Cited by third party
Title
NAWID J. et al.: "Majority Voting: Material Classification by Tactile Sensing Using Surface Texture", IEEE Transactions on Robotics *
HE Kongfei et al.: "A multimodal material perception and recognition method based on joint group kernel sparse coding", China Measurement & Test *
YU Fajun et al.: "Sparse feature extraction of early bearing faults based on dictionary learning", Journal of Vibration and Shock *
ZHANG Huajie: "Research on robust myoelectric pattern recognition methods under multiple confounding factors", China Master's Theses Full-text Database *

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN116150684A (en) * 2023-01-17 2023-05-23 中国科学院自动化研究所 Attention mechanism-based haptic attribute identification method and device
CN116383571A (en) * 2023-06-02 2023-07-04 帕西尼感知科技(张家港)有限公司 Touch data acquisition method, device and system

Similar Documents

Publication Publication Date Title
Cretu et al. Soft object deformation monitoring and learning for model-based robotic hand manipulation
CN113780460A (en) Material identification method and device, robot, electronic equipment and storage medium
US20150325046A1 (en) Evaluation of Three-Dimensional Scenes Using Two-Dimensional Representations
Kwiatkowski et al. Grasp stability assessment through the fusion of proprioception and tactile signals using convolutional neural networks
Wu et al. Pixel-attentive policy gradient for multi-fingered grasping in cluttered scenes
US9477909B2 (en) Object investigation and classification
Ogawara et al. Modeling manipulation interactions by hidden Markov models
Yin et al. Classification of eye tracking data using a convolutional neural network
KR20190139539A (en) A System of Searching the Channel Expansion Parameter for the Speed-up of Inverted Residual Block and the method thereof for low specification embedded system and the method thereof
CN112668607A (en) Multi-label learning method for recognizing tactile attributes of target object
Amiri et al. A probabilistic artificial neural network-based procedure for variance change point estimation
Ruppel et al. Simulation of the SynTouch BioTac sensor
CN111611395A (en) Entity relationship identification method and device
CN110781968B (en) Extensible class image identification method based on plastic convolution neural network
US20230086261A1 (en) Clustering device, clustering method, and clustering program
Rao et al. Object recall from natural-language descriptions for autonomous robotic grasping
Al-Behadili et al. Semi-supervised learning using incremental support vector machine and extreme value theory in gesture data
Haley et al. Low level entity state sequence mapping to high level behavior via a DeepLSTM model
Dudczyk et al. Data fusion in the decision-making process based on artificial neural networks
Becari et al. Comparative analysis of classification algorithms on tactile sensors
CN113033807B (en) Online data collection method, neural network training method, related device and storage medium
Vogt et al. Lyapunov-Guided Embedding for Hyperparameter Selection in Recurrent Neural Networks
Kicki et al. Learning Quasi-Static 3D Models of Markerless Deformable Linear Objects for Bimanual Robotic Manipulation
EP4287078A1 (en) Device for building model for estimating action interval, method for building model for estimating action interval, and program for building model for estimating action interval
US20230316050A1 (en) Blocking neural networks for high capacity

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication

Application publication date: 20211210