CN111709467A - Product and cloud data matching system and method based on machine vision - Google Patents

Product and cloud data matching system and method based on machine vision

Info

Publication number
CN111709467A
CN111709467A
Authority
CN
China
Prior art keywords
unit
network
file
parameters
neural network
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202010500817.0A
Other languages
Chinese (zh)
Inventor
闫纪红
梁赟
王鹏翔
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Harbin Institute of Technology
Original Assignee
Harbin Institute of Technology
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Harbin Institute of Technology filed Critical Harbin Institute of Technology
Priority to CN202010500817.0A priority Critical patent/CN111709467A/en
Publication of CN111709467A publication Critical patent/CN111709467A/en
Pending legal-status Critical Current

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00 Pattern recognition
    • G06F18/20 Analysing
    • G06F18/22 Matching criteria, e.g. proximity measures
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00 Pattern recognition
    • G06F18/20 Analysing
    • G06F18/21 Design or setup of recognition systems or techniques; Extraction of features in feature space; Blind source separation
    • G06F18/214 Generating training patterns; Bootstrap methods, e.g. bagging or boosting
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00 Pattern recognition
    • G06F18/20 Analysing
    • G06F18/23 Clustering techniques
    • G06F18/232 Non-hierarchical techniques
    • G06F18/2321 Non-hierarchical techniques using statistics or function optimisation, e.g. modelling of probability density functions
    • G06F18/23213 Non-hierarchical techniques using statistics or function optimisation, e.g. modelling of probability density functions with fixed number of clusters, e.g. K-means clustering
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06N COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00 Computing arrangements based on biological models
    • G06N3/02 Neural networks
    • G06N3/04 Architecture, e.g. interconnection topology
    • G06N3/045 Combinations of networks
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06N COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00 Computing arrangements based on biological models
    • G06N3/02 Neural networks
    • G06N3/08 Learning methods

Abstract

A machine-vision-based product and cloud data matching system and method, relating to the technical field of data analysis and addressing the inconvenient and inefficient ways of obtaining product design information in the prior art. The system comprises: a local terminal product identification module and a cloud server file storage module; the local terminal product identification module is used for identifying acquired product images; the cloud server file storage module is used for storing product design files and interacting with local terminal files; the local terminal product identification module comprises a clustering unit, a cfg configuration file unit, an analysis unit, a network layer unit, a convolutional neural network construction unit, a network weight file unit, an input/output unit and a functional unit.

Description

Product and cloud data matching system and method based on machine vision
Technical Field
The invention relates to the technical field of data analysis, in particular to a product and cloud data matching system and method based on machine vision.
Background
Under the rapidly changing and developing trend of intelligent manufacturing, small-batch, multi-variety and personalized products have gradually become the inevitable direction of intelligent manufacturing in order to meet the functional requirements of various application scenarios. However, designing a new product and a new manufacturing process from scratch for each individually customized customer requirement wastes considerable resources, which makes improving and upgrading on the basis of the original product design all the more important.
However, the following problems exist in the product upgrade process:
1. designing a new product and a new manufacturing process from scratch for each individually customized customer requirement wastes considerable resources;
2. technical design files stored locally are limited by the storage space of the hardware, so the capacity for large files is limited and their security is difficult to guarantee;
3. there is only a single method for retrieving a product's original technical files, which cannot meet the need to quickly and conveniently retrieve files for diverse, personalized and customized intelligent products.
Disclosure of Invention
The purpose of the invention is: aiming at the inconvenient and inefficient ways of obtaining product design information in the prior art, to provide a product and cloud data matching system and method based on machine vision.
The technical scheme adopted by the invention to solve the technical problems is as follows:
a machine vision-based product and cloud data matching system, comprising: the system comprises a local terminal product identification module and a cloud server file storage module;
the local terminal product identification module is used for identifying the acquired product image;
the cloud server file storage module is used for storing design files of products and interacting with local terminal files;
the local terminal product identification module comprises a clustering unit, a cfg configuration file unit, an analysis unit, a network layer unit, a convolutional neural network construction unit, a network weight file unit, an input/output unit and a function unit;
the clustering unit is connected with the local cfg configuration file unit, and is used for taking a label file of the training sample from the local, analyzing the label file and calculating to obtain the size of the target frame;
the cfg configuration file unit is connected with the clustering unit and the analyzing unit, and is used for storing main model parameters of the convolutional neural network and storing the target frame body size of the training sample as output layer parameters of the network;
the analysis unit is connected with the cfg configuration file unit, the network layer unit and the convolutional neural network construction unit, reads network parameters for constructing the neural network from the cfg configuration file unit and sends the parameters to the network layer unit;
the network layer unit is connected with the analysis unit and the convolutional neural network construction unit, receives network layer parameters from the analysis unit, instantiates each network layer by using the network layer parameters, and sends the instantiated network layers to the convolutional neural network construction unit;
the convolutional neural network construction unit is connected with the analysis unit, the network layer unit and the network weight file unit, receives the network structure parameters from the analysis unit, receives the instantiated network layers from the network layer unit, combines the network layers according to the sequence according to the network structure parameters, and is also used for reading the network weight parameters from the network weight file unit;
the network weight file unit is connected with the convolutional neural network construction unit and the functional unit, and acquires and stores network weight parameters from the functional unit;
the input and output unit is connected with the functional unit and is used for reading an image of a target to be detected from the local;
the function unit is connected with the input and output unit, the convolutional neural network construction unit, the network weight file unit and the local data, calls a training function to train the convolutional neural network to update the weight, stores the updated weight, predicts a target by calling a prediction function, and feeds back a recognition result to the cloud.
Further, parsing the annotation file extracts the target type, target position coordinates, width and height recorded in the file.
Further, the acquired product image comprises a data set required by the task to be identified, the data set comprises a product test set picture to be tested, a product training set picture to be trained and a product training set label file, and the product training set label file records labeling information of the target in the training set picture to be trained.
Furthermore, the labeling information of the targets in the to-be-trained picture of the training set comprises the names of the targets, the positions of the targets and position coordinates.
Furthermore, the clustering unit, the cfg configuration file unit, the analyzing unit, the network layer unit, the convolutional neural network constructing unit, the network weight file unit and the input and output unit are all packaged through functions.
Further, the label file is in an xml text format.
Further, the cloud server file storage module is realized based on a Hadoop platform.
A matching method of a product and cloud data matching system based on machine vision comprises the following steps:
the method comprises the following steps: the clustering unit reads the marked data file from the local, reads the marking information from the marking file, calculates the frame size and takes the frame size as the network output layer parameter;
step two: the analysis unit receives network parameters for constructing the convolutional neural network from the cfg configuration file unit, wherein the network parameters comprise network layer parameters and network structure parameters, and then defines a loss function to construct the neural network;
step three: training the neural network, continuously iteratively updating the network weight parameters, and storing the trained network weight parameters into a network weight file unit;
step four: the convolutional neural network construction unit reads the trained network weight parameters from the network weight file unit and gives the trained network weight parameters to the neural network, so that model building of the neural network is completed, and the model is stored and used for identifying products;
step five: and reading the image to be detected by the recognition model, retaining the recognition result with the highest confidence coefficient through a non-maximum suppression algorithm, deleting the candidate frames smaller than the set threshold value, and outputting the recognition result.
Furthermore, the clustering unit adopts a K-means clustering mode.
Further, the specific steps of the first step are as follows:
firstly, the clustering unit obtains the labeling information of each target from the annotation files of the training samples, takes the two-tuple (w_i, h_i) formed by the width and height of each box as an element, and forms a set G; the number of elements in G is N, where N is the number of boxes in the samples to be trained, w is the width, h is the height, and i is the index of a box, i ∈ [1, N];
the clustering unit sets k cluster centers, randomly selects an element (w_l, h_l), l ∈ [1, N], from G as the first cluster center, and denotes the set of cluster centers as C;
let the variables m = 1 and n = 1, and compute the distance d((w_m, h_m), (w_n, h_n)) between element (w_m, h_m) in G and element (w_n, h_n) in C:
d((w_m, h_m), (w_n, h_n)) = 1 − IOU((w_m, h_m), (w_n, h_n))
where x, y denote the width w_m and height h_m of a labeled box and a, b denote the width w_n and height h_n of a cluster center; for the clustering algorithm, the IOU of any element (x, y) in G and any element (a, b) in C is computed as:
if x ≥ a and y ≥ b, IOU = ab/xy;
if x ≥ a and y ≤ b, IOU = ay/(ab + (x − a)y);
if x ≤ a and y ≥ b, IOU = bx/(ab + (y − b)x);
if x ≤ a and y ≤ b, IOU = xy/ab;
the cluster centers are then updated iteratively to minimize the sum of minimum distances
D = Σ_{i=1}^{N} min_{j∈[1,k]} d((w_i, h_i), C_j)
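The four IOU cases above are exactly the overlap ratio of two boxes that share a common center; a minimal Python sketch of that distance (an illustration, not the patented implementation):

```python
def iou_wh(x, y, a, b):
    """IOU of two boxes of sizes (x, y) and (a, b), assumed to share a center.

    The min-based intersection reproduces all four piecewise cases from the
    text: e.g. for x >= a and y <= b, inter = a*y and union = xy + ab - ay,
    which equals ay/(ab + (x - a)y)."""
    inter = min(x, a) * min(y, b)
    union = x * y + a * b - inter
    return inter / union

def distance(box, center):
    """Clustering distance d = 1 - IOU between a labeled box and a cluster center."""
    (x, y), (a, b) = box, center
    return 1.0 - iou_wh(x, y, a, b)
```

Identical sizes give distance 0, and dissimilar aspect ratios give a distance close to 1, which is what makes this metric suitable for grouping box shapes.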
The beneficial effects of the invention are:
through the matching of visual identification with cloud data, the product design file is retrieved from the cloud, and the design files of diverse, personalized and customized products under intelligent manufacturing can be retrieved quickly, efficiently and intelligently through visual identification;
for a target product, the design file can be retrieved from the cloud simply by acquiring an image with the image acquisition equipment and running it through the recognition system, which improves the convenience and operability of product retrieval;
adopting cloud storage management realizes the automation and intelligence of resource storage: all storage resources are integrated together, storage efficiency is improved, wasted storage space is reclaimed, the utilization rate of storage space is raised, and the problem of mass file storage is solved;
therefore, from the perspective of industrial application, retrieving product design files from the cloud through visual identification and cloud data matching provides more intelligent, convenient and efficient technical support for product upgrading and personalized customization.
Drawings
FIG. 1 is a block diagram of a matching method according to the present invention;
FIG. 2 is a schematic diagram of a framework of the matching system of the present invention;
FIG. 3 is a schematic illustration of a labeling process;
FIG. 4 is a diagram illustrating the recognition result;
FIGS. 5(a) to 5(d) are graphs showing data enhancement results;
FIG. 6 is a schematic diagram of a WEB-side UI interface of Hadoop;
FIG. 7 is a schematic view of an interactive interface of Hive.
Detailed Description
The first embodiment is as follows: specifically describing the present embodiment with reference to fig. 1 and fig. 2, the product and cloud data matching system based on machine vision in the present embodiment includes: the system comprises a local terminal product identification module and a cloud server file storage module;
the local terminal product identification module is used for identifying the acquired product image;
the cloud server file storage module is used for storing design files of products and interacting with local terminal files;
the local terminal product identification module comprises a clustering unit, a cfg configuration file unit, an analysis unit, a network layer unit, a convolutional neural network construction unit, a network weight file unit, an input/output unit and a function unit;
the clustering unit is connected with the local cfg configuration file unit, and is used for taking a label file of the training sample from the local, analyzing the label file and calculating to obtain the size of the target frame;
the cfg configuration file unit is connected with the clustering unit and the analyzing unit, and is used for storing main model parameters of the convolutional neural network and storing the target frame body size of the training sample as output layer parameters of the network;
the analysis unit is connected with the cfg configuration file unit, the network layer unit and the convolutional neural network construction unit, reads network parameters for constructing the neural network from the cfg configuration file unit and sends the parameters to the network layer unit;
the network layer unit is connected with the analysis unit and the convolutional neural network construction unit, receives network layer parameters from the analysis unit, instantiates each network layer by using the network layer parameters, and sends the instantiated network layers to the convolutional neural network construction unit;
the convolutional neural network construction unit is connected with the analysis unit, the network layer unit and the network weight file unit, receives the network structure parameters from the analysis unit, receives the instantiated network layers from the network layer unit, combines the network layers according to the sequence according to the network structure parameters, and is also used for reading the network weight parameters from the network weight file unit;
the network weight file unit is connected with the convolutional neural network construction unit and the functional unit, and acquires and stores network weight parameters from the functional unit;
the input and output unit is connected with the functional unit and is used for reading an image of a target to be detected from the local;
the function unit is connected with the input and output unit, the convolutional neural network construction unit, the network weight file unit and the local data, calls a training function to train the convolutional neural network to update the weight, stores the updated weight, predicts a target by calling a prediction function, and feeds back a recognition result to the cloud.
The key point of the invention is identification: a local product is identified (for example, the id of the target product is recognized as product No. 001), and then the relevant data of product No. 001, such as its length, width, height and other dimensional data, is retrieved from the cloud. The cloud mainly plays the role of storage: as long as the product id is identified locally, the related data is retrieved from the cloud under the same id.
The technical solutions disclosed in the present invention are described more clearly and completely below through specific embodiments, with reference to the drawings. Obviously, the described embodiments are only some of the embodiments of the present invention and are not intended to limit its scope. In actual practice, the invention is described taking a WPA-type vertical reducer end cap as an example.
Firstly, constructing a main frame of a matching system of a reducer end cover based on machine vision and an end cover design file stored in a cloud, as shown in fig. 2;
The local identification module is responsible for identifying the acquired end-cap images and mainly stores the data set required by the task to be identified. The data set comprises end-cap test-set pictures to be tested, end-cap training-set pictures to be trained, and the end-cap training-set annotation files; the annotation files of the end-cap training set record the labeling information of the targets in the training-set pictures (such as the target name, target position, position coordinates and the like). In a specific implementation, as shown in fig. 3, the name of the end cap, the position coordinates of the end cap, and so on are stored in an xml text format as follows:
(the example xml annotation file appears as an image in the original and is not reproduced here)
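For illustration, a hypothetical VOC-style annotation of the kind described (target name plus bounding-box coordinates) can be parsed with the Python standard library; all tag names and coordinate values below are assumptions, since the actual file appears only as an image in the source:

```python
import xml.etree.ElementTree as ET

# Hypothetical VOC-style annotation; the real file in the patent is shown
# only as an image, so the tags and numbers here are illustrative.
SAMPLE = """<annotation>
  <object>
    <name>end_cover</name>
    <bndbox>
      <xmin>48</xmin><ymin>32</ymin><xmax>412</xmax><ymax>390</ymax>
    </bndbox>
  </object>
</annotation>"""

def parse_annotation(xml_text):
    """Return (name, width, height) tuples for each labeled target box."""
    root = ET.fromstring(xml_text)
    boxes = []
    for obj in root.iter("object"):
        name = obj.findtext("name")
        box = obj.find("bndbox")
        w = int(box.findtext("xmax")) - int(box.findtext("xmin"))
        h = int(box.findtext("ymax")) - int(box.findtext("ymin"))
        boxes.append((name, w, h))
    return boxes
```

The (width, height) pairs extracted this way are exactly what the clustering part consumes to compute box sizes.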
the cloud database mainly stores design files and the like of the end cover, the storage and reading of the cloud files are achieved based on the Hadoop platform, and the cloud files are interacted with files of the local terminal, so that the cloud product design files are locally and intelligently read. The distributed storage system in the Hadoop ecosystem can ensure the storage speed and the storage reliability of mass data, and the integrity of the data can be ensured through a redundancy backup mechanism.
Secondly, completing the establishment of a visual model of the local identification module;
the local terminal identification module mainly comprises a clustering part, a cfg configuration file, an analysis part, a network architecture part, a convolutional neural network construction part, a network weight file, an input and output part and the like;
setting the functions of each part of the local terminal identification module,
the clustering part is connected with the local cfg file, a marking file (xml file) of the training sample is called from the local, the target type, the target position coordinate, the width, the height and the like of the marking file are analyzed, and the size of the target frame body is obtained through calculation.
The cfg configuration file is connected with the clustering part and the analyzing part, and not only needs to store main model parameters of the convolutional neural network, but also needs to store the size of a target frame of the training sample as output layer parameters of the network.
The analysis part is connected with the cfg configuration file, the network layer part and the neural network construction part; the part reads network parameters for constructing the neural network from the cfg configuration file, and sends the parameters to the network layer.
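The cfg reading described here can be sketched as a parser for a Darknet-style configuration file; the actual cfg layout is not shown in the patent, so the [section]/key=value format below is an assumption:

```python
def parse_cfg(text):
    """Parse a Darknet-style cfg into a list of {'type': ..., key: value} blocks.

    The exact cfg layout is not given in the patent; a [section] header
    followed by key=value lines is assumed here for illustration."""
    blocks, current = [], None
    for line in text.splitlines():
        line = line.split("#", 1)[0].strip()   # drop comments and whitespace
        if not line:
            continue
        if line.startswith("[") and line.endswith("]"):
            current = {"type": line[1:-1]}     # start a new network-layer block
            blocks.append(current)
        else:
            key, _, value = line.partition("=")
            current[key.strip()] = value.strip()
    return blocks
```

Each resulting block corresponds to one network layer (or the global network section), which is what the analysis part would forward to the network layer part for instantiation.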
The network layer part is connected with the analysis part and the convolutional neural network construction part, receives network layer parameters from the analysis part, instantiates each network layer by using the network layer parameters, and sends the instantiated network layers to the neural network part.
The convolutional neural network construction part is connected with the analysis part, the network layer part and the network weight file; the convolutional neural network construction part receives the network structure parameters from the analysis part, receives the network layers from the network layer part, and combines the network layers according to the network structure parameters in sequence to construct a basic framework of the convolutional neural network; the convolutional neural network construction part also reads network weight parameters from the network weight file, gives the weight parameters to the basic architecture of the neural network, completes the construction of the convolutional neural network, and sends the neural network to the function part.
The network weight file is connected with the convolutional neural network construction part and the functional part, and the network weight file stores the network weight parameters received from the functional part and is read by the convolutional neural network construction part.
The input and output part is connected with the functional part, and the input and output part reads the image of the target to be measured from the local.
Connecting a functional part formed by a packaged function with an input/output part, a convolutional neural network construction part, a weight file and local data, calling a training function to train the convolutional neural network to update the weight, and storing the updated weight; the functional part predicts the target by calling a prediction function and feeds back the recognition result to the cloud.
Connecting all parts of the local terminal identification module to construct an identification model:
firstly, the clustering part reads the marked data file from the local, reads the marking information from the xml file, and calculates the frame size:
the clustering part adopts K-means clustering: it obtains the labeling information of each target from the training set, takes the two-tuple (w_i, h_i) formed by the width and height of each box as an element, and forms a set G; the number of elements in G is N, where N is the number of boxes in the samples to be trained, w is the width, h is the height, and i is the index of a box, i ∈ [1, N];
the clustering part sets k cluster centers, randomly selects an element (w_l, h_l), l ∈ [1, N], from G as the first cluster center, and denotes the set of cluster centers as C;
let the variables m = 1 and n = 1, and compute the distance d((w_m, h_m), (w_n, h_n)) between element (w_m, h_m) in G and element (w_n, h_n) in C:
d((w_m, h_m), (w_n, h_n)) = 1 − IOU((w_m, h_m), (w_n, h_n))
where x, y denote the width w_m and height h_m of a labeled box and a, b denote the width w_n and height h_n of a cluster center; for the clustering algorithm, the IOU of any element (x, y) in G and any element (a, b) in C is computed as:
if x ≥ a and y ≥ b, IOU = ab/xy;
if x ≥ a and y ≤ b, IOU = ay/(ab + (x − a)y);
if x ≤ a and y ≥ b, IOU = bx/(ab + (y − b)x);
if x ≤ a and y ≤ b, IOU = xy/ab;
the cluster centers are then updated iteratively to minimize the sum of minimum distances
D = Σ_{i=1}^{N} min_{j∈[1,k]} d((w_i, h_i), C_j)
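Putting the pieces together, the clustering procedure can be sketched as a standard K-means loop over (w, h) pairs with the 1 − IOU distance; the initialization and stopping rule below are assumptions, since the patent text does not fully specify them:

```python
import random

def iou_wh(x, y, a, b):
    """IOU of two center-aligned boxes of sizes (x, y) and (a, b)."""
    inter = min(x, a) * min(y, b)
    return inter / (x * y + a * b - inter)

def kmeans_boxes(boxes, k, iters=100, seed=0):
    """Cluster (w, h) tuples with distance d = 1 - IOU; return k centers.

    A plain Lloyd-style loop (random initial centers, mean update) is assumed
    here; the patent does not fully state its update and stopping rule."""
    rng = random.Random(seed)
    centers = rng.sample(boxes, k)
    for _ in range(iters):
        # assign each box to its nearest center under d = 1 - IOU
        groups = [[] for _ in range(k)]
        for w, h in boxes:
            j = min(range(k), key=lambda j: 1 - iou_wh(w, h, *centers[j]))
            groups[j].append((w, h))
        # update each center to the mean width/height of its group
        new = [
            (sum(w for w, _ in g) / len(g), sum(h for _, h in g) / len(g))
            if g else centers[j]
            for j, g in enumerate(groups)
        ]
        if new == centers:   # converged: assignments no longer change centers
            break
        centers = new
    return centers
```

With k set to the number of anchor boxes (the text later presets 9 prediction boxes), the resulting centers serve as the output-layer box sizes stored in the cfg file.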
The analysis part receives network parameters for constructing the convolutional neural network from the cfg configuration file, analyzes the network parameters into network layer parameters and network structure parameters, defines a loss function by the network, constructs a basic framework of the neural network, trains the convolutional neural network and completes construction of the convolutional neural network.
The input and output part, the function part and the convolutional neural network construction part jointly construct a basic framework of a training model, continuously iterate and update network weight parameters, and store the trained network weight parameters into a network weight file.
The convolutional neural network construction part reads the trained network weight parameters from the network weight file, gives the trained network weight parameters to a basic framework of the neural network, completes construction of the neural network, and stores the model for identifying the product;
the model reads an image to be detected, 9 prediction frames are preset, the recognition result with the highest confidence coefficient is reserved through a non-maximum suppression algorithm, candidate frames smaller than a set threshold value are deleted, and the recognition result is output, as shown in FIG. 4;
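The confidence filtering and non-maximum suppression step can be sketched generically as follows; the box format and thresholds are illustrative assumptions, not taken from the patent:

```python
def iou(b1, b2):
    """IOU of two (x1, y1, x2, y2) axis-aligned boxes."""
    ix = max(0, min(b1[2], b2[2]) - max(b1[0], b2[0]))
    iy = max(0, min(b1[3], b2[3]) - max(b1[1], b2[1]))
    inter = ix * iy
    a1 = (b1[2] - b1[0]) * (b1[3] - b1[1])
    a2 = (b2[2] - b2[0]) * (b2[3] - b2[1])
    return inter / (a1 + a2 - inter)

def nms(detections, conf_thresh=0.5, iou_thresh=0.45):
    """detections: list of (box, confidence) pairs.

    Drop candidates below the confidence threshold, then greedily keep the
    highest-confidence boxes, suppressing any candidate that overlaps an
    already-kept box too much."""
    dets = [d for d in detections if d[1] >= conf_thresh]
    dets.sort(key=lambda d: d[1], reverse=True)
    kept = []
    for box, conf in dets:
        if all(iou(box, kb) < iou_thresh for kb, _ in kept):
            kept.append((box, conf))
    return kept
```

This mirrors the description: the highest-confidence result per target survives, and candidate boxes below the set threshold are deleted.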
optimizing parameters of the model;
For different recognition objects, the model parameters need to be optimized to improve the generalization capability of the model. For the end-cap example, the training data set is preprocessed and data enhancement (mirroring, rotation) is performed, and noise and image brightness are adjusted according to factors such as illumination in the actual scene, as shown in figs. 5(a), 5(b), 5(c) and 5(d);
the activation function is adjusted to avoid overfitting across different product types; as the number of product types keeps increasing, Leaky ReLU can be adopted as the activation function, which keeps the neurons active and improves the model's learning capability.
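For reference, a minimal sketch of the Leaky ReLU activation mentioned above; it keeps a small gradient for negative inputs instead of zeroing them, which is what keeps neurons active (the 0.1 slope is an assumed default, not stated in the patent):

```python
def leaky_relu(x, slope=0.1):
    """Leaky ReLU: identity for positive inputs, a small linear slope
    otherwise, so neurons with negative pre-activations still learn."""
    return x if x > 0 else slope * x
```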
Thirdly, local and cloud data communication is achieved, the end cover identification result is matched with a cloud end cover design file, and an existing end cover design file is called;
In the process of building Hadoop, the number of nodes, the master-slave relationships, the system settings, the environment variables and the YARN configuration file need to be configured. After the file configuration on the cloud server, files can be transmitted to the local terminal through the SSH mutual-access mechanism. Finally, after Hadoop is started through a system command, the operating information of the worker machines can be seen through the WEB-side UI interface of the host, as shown in fig. 6.
According to the actual requirements and the performance characteristics of the Hadoop distributed data processing architecture, the addresses of the product file resources at the cloud are stored in the Hadoop Distributed File System (HDFS) for retrieval and use by the subsequent functional modules.
The address file of the end cap's design file resources in the cloud service is stored in the Hive database. Hive is oriented toward mass historical data, with the data itself stored in the HDFS of the Hadoop system, and is suitable for storing and processing structured data; storing the addresses of resources such as product technical files in the Hive database makes it convenient to quickly look up the location of a product file. The Hive interactive interface is shown in fig. 7.
The purpose of adding the Thrift component is to increase the extensibility of the Hive data warehouse: the service provides a universal programming call interface so that different programming languages can access the Hive data warehouse.
TCP/IP is selected as the communication protocol between the local part of the system and the cloud. The local part communicates with the cloud platform over TCP; an IP address, subnet mask, default gateway and host name are set on the local server, and the routing information is configured. To increase the data transmission speed between the local terminal and the cloud server, a gigabit switch is used to build the communication network between the servers; the IP address of the cloud server is 192.168.0.X and the IP address of the local terminal is 192.168.1.11. The system transmits the identification result of the local end cap to the cloud platform through the Socket technology; the cloud platform then realizes data communication between the local terminal and the cloud through the Web Socket technology, transmits the end cap's design file to the local terminal, and completes the file interaction between the cloud and the local terminal.
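The local-to-cloud Socket handoff can be sketched over a loopback connection with the standard library; the JSON message format and the design-file address below are illustrative assumptions, and the Web Socket layer on the cloud side is not reproduced:

```python
import json
import socket
import threading

def serve_once(server_sock, design_files):
    """Cloud side (sketch): receive one recognition result, reply with the
    matching design-file address from a lookup table."""
    conn, _ = server_sock.accept()
    with conn:
        product_id = json.loads(conn.recv(1024).decode("utf-8"))["id"]
        reply = design_files.get(product_id, "not found")
        conn.sendall(json.dumps({"design_file": reply}).encode("utf-8"))

def request_design_file(address, product_id):
    """Local side (sketch): send the recognized product id over TCP and
    read back the design-file address."""
    with socket.create_connection(address) as sock:
        sock.sendall(json.dumps({"id": product_id}).encode("utf-8"))
        return json.loads(sock.recv(1024).decode("utf-8"))["design_file"]
```

In the patent's arrangement the lookup table would be backed by the Hive/HDFS resource addresses rather than an in-memory dictionary.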
This makes the way of obtaining product design information more convenient and efficient, and provides more intelligent and efficient technical support for the personalized customization of products. Cloud storage provides a mass storage space for the user, solving the storage and security problems of large files, and the excellent performance of machine vision in the field of object identification makes it possible to identify existing products and retrieve their design files from the cloud.
it should be noted that the detailed description is only for explaining and explaining the technical solution of the present invention, and the scope of protection of the claims is not limited thereby. It is intended that all such modifications and variations be included within the scope of the invention as defined in the following claims and the description.

Claims (10)

1. A product and cloud data matching system based on machine vision is characterized by comprising: the system comprises a local terminal product identification module and a cloud server file storage module;
the local terminal product identification module is used for identifying the acquired product image;
the cloud server file storage module is used for storing design files of products and interacting with local terminal files;
the local terminal product identification module comprises a clustering unit, a cfg configuration file unit, an analysis unit, a network layer unit, a convolutional neural network construction unit, a network weight file unit, an input/output unit and a function unit;
the clustering unit is connected with the local cfg configuration file unit, and is used for reading the label files of the training samples from local storage, parsing the label files and calculating the sizes of the target frames;
the cfg configuration file unit is connected with the clustering unit and the analyzing unit, and is used for storing main model parameters of the convolutional neural network and storing the target frame body size of the training sample as output layer parameters of the network;
the analysis unit is connected with the cfg configuration file unit, the network layer unit and the convolutional neural network construction unit, reads network parameters for constructing the neural network from the cfg configuration file unit and sends the parameters to the network layer unit;
the network layer unit is connected with the analysis unit and the convolutional neural network construction unit, receives network layer parameters from the analysis unit, instantiates each network layer by using the network layer parameters, and sends the instantiated network layers to the convolutional neural network construction unit;
the convolutional neural network construction unit is connected with the analysis unit, the network layer unit and the network weight file unit, receives the network structure parameters from the analysis unit, receives the instantiated network layers from the network layer unit, combines the network layers according to the sequence according to the network structure parameters, and is also used for reading the network weight parameters from the network weight file unit;
the network weight file unit is connected with the convolutional neural network construction unit and the functional unit, and acquires and stores network weight parameters from the functional unit;
the input and output unit is connected with the functional unit and is used for reading an image of a target to be detected from the local;
the function unit is connected with the input and output unit, the convolutional neural network construction unit, the network weight file unit and the local data, calls a training function to train the convolutional neural network to update the weight, stores the updated weight, predicts a target by calling a prediction function, and feeds back a recognition result to the cloud.
2. The machine-vision-based product and cloud data matching system of claim 1, wherein the information parsed from the annotation files comprises the object category, the object position coordinates, and the object width and height.
3. The system of claim 1, wherein the acquired product image comprises a data set required by a task to be identified, the data set comprises a product test set picture to be tested, a product training set picture to be trained and a product training set label file, and the product training set label file records labeling information of targets in the training set picture to be trained.
4. The system according to claim 3, wherein the labeling information of the targets in the images to be trained in the training set comprises names of the targets, positions of the targets, and position coordinates of the targets.
5. The machine-vision-based product and cloud data matching system of claim 1, wherein the clustering unit, the cfg profile unit, the parsing unit, the network layer unit, the convolutional neural network building unit, the network weight file unit, and the input and output unit are all encapsulated by functions.
6. The machine-vision-based product and cloud data matching system of claim 1, wherein said annotation file is in xml text format.
7. The machine vision-based product and cloud data matching system of claim 1, wherein the cloud server file storage module is implemented based on a Hadoop platform.
8. The matching method of the machine vision-based product and cloud data matching system according to claim 1, comprising the following steps:
the method comprises the following steps: the clustering unit reads the label files from local storage, reads the labeling information from them, calculates the frame sizes, and uses the frame sizes as network output layer parameters;
step two: the analysis unit receives network parameters for constructing the convolutional neural network from the cfg configuration file unit, wherein the network parameters comprise network layer parameters and network structure parameters, and then defines a loss function to construct the neural network;
step three: training the neural network, continuously iteratively updating the network weight parameters, and storing the trained network weight parameters into a network weight file unit;
step four: the convolutional neural network construction unit reads the trained network weight parameters from the network weight file unit and gives the trained network weight parameters to the neural network, so that model building of the neural network is completed, and the model is stored and used for identifying products;
step five: the recognition model reads the image to be detected, retains the recognition result with the highest confidence through a non-maximum suppression algorithm, deletes the candidate frames whose confidence is below the set threshold, and outputs the recognition result.
9. The machine vision-based product and cloud data matching method of claim 8, wherein: the clustering unit adopts a K-means clustering mode.
10. The method of claim 9, wherein the step one comprises the following steps:
firstly, the clustering unit obtains the labeling information of the targets from the label files of the training samples and forms a set G whose elements are the pairs (w_i, h_i) of frame width and height, the number of elements in G being N, where N is the number of frames in the samples to be trained, w is the width, h is the height, and i is the frame index, i ∈ [1, N];
the clustering unit sets k cluster centers, randomly selects an element (w_l, h_l), l ∈ [1, N], from G as the first cluster center, and denotes the set of cluster centers by C;
for each element (w_m, h_m) in G and each cluster center (w_n, h_n) in C, the distance d((w_m, h_m), (w_n, h_n)) is calculated as

d((w_m, h_m), (w_n, h_n)) = 1 − IOU((w_m, h_m), (w_n, h_n))

wherein, writing x and y for the width w_m and height h_m of the labeling frame, and a and b for the width w_n and height h_n of the cluster center, the IOU between any element (x, y) in G and any element (a, b) in C is calculated as follows:
if x ≥ a and y ≥ b, then IOU = ab/(xy);
if x ≥ a and y ≤ b, then IOU = ay/(ab + (x − a)y);
if x ≤ a and y ≥ b, then IOU = xb/(ab + (y − b)x);
if x ≤ a and y ≤ b, then IOU = xy/(ab);
the cluster centers are then updated so as to minimize the sum of the minimum distances

S = Σ_{i=1}^{N} min_{(w_n, h_n) ∈ C} d((w_i, h_i), (w_n, h_n)).
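The 1 − IOU K-means of claim 10 can be sketched in a few lines of Python. This is an illustrative sketch under two simplifying assumptions: the IOU uses the claim's corner-aligned box model (intersection = min widths × min heights), and initialization is simplified to random sampling of k centers rather than the sequential seeding from a single first center described in the claim.

```python
import random

def iou(box, center):
    """IOU of two boxes given as (width, height) pairs that share a corner,
    as in claim 10: intersection = min(x, a) * min(y, b)."""
    x, y = box        # width/height of a labeled frame
    a, b = center     # width/height of a cluster center
    inter = min(x, a) * min(y, b)
    return inter / (x * y + a * b - inter)

def distance(box, center):
    """The clustering distance d = 1 - IOU from claim 10."""
    return 1.0 - iou(box, center)

def kmeans_boxes(boxes, k, iters=100, seed=0):
    """Pick k anchor sizes by K-means under the 1-IOU distance."""
    rng = random.Random(seed)
    centers = rng.sample(boxes, k)            # simplified initialization
    for _ in range(iters):
        clusters = [[] for _ in range(k)]
        for box in boxes:                     # assign each box to nearest center
            j = min(range(k), key=lambda n: distance(box, centers[n]))
            clusters[j].append(box)
        centers = [                           # recompute centers as cluster means
            (sum(w for w, _ in c) / len(c), sum(h for _, h in c) / len(c))
            if c else centers[j]
            for j, c in enumerate(clusters)
        ]
    return centers
```

The branch formulas in claim 10 (e.g. IOU = ay/(ab + (x − a)y) when x ≥ a, y ≤ b) all reduce to the single expression inter/(xy + ab − inter) used here, so one function covers all four cases.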
CN202010500817.0A 2020-06-04 2020-06-04 Product and cloud data matching system and method based on machine vision Pending CN111709467A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010500817.0A CN111709467A (en) 2020-06-04 2020-06-04 Product and cloud data matching system and method based on machine vision

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202010500817.0A CN111709467A (en) 2020-06-04 2020-06-04 Product and cloud data matching system and method based on machine vision

Publications (1)

Publication Number Publication Date
CN111709467A true CN111709467A (en) 2020-09-25

Family

ID=72539753

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010500817.0A Pending CN111709467A (en) 2020-06-04 2020-06-04 Product and cloud data matching system and method based on machine vision

Country Status (1)

Country Link
CN (1) CN111709467A (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112529106A (en) * 2020-12-28 2021-03-19 平安普惠企业管理有限公司 Method, device and equipment for generating visual design manuscript and storage medium


Similar Documents

Publication Publication Date Title
CN109218379B (en) Data processing method and system in Internet of things environment
US11431817B2 (en) Method and apparatus for management of network based media processing functions
JP7065498B2 (en) Data orchestration platform management
CN111464627B (en) Data processing method, edge server, center server and processing system
CN108683877B (en) Spark-based distributed massive video analysis system
JP7412847B2 (en) Image processing method, image processing device, server, and computer program
US20210390422A1 (en) Knowledge-Base Information Sensing Method And System For Operations And Maintenance Of Data Center
CN113176948B (en) Edge gateway, edge computing system and configuration method thereof
CN111797969A (en) Neural network model conversion method and related device
CN111314371A (en) Edge device access system and method based on intelligent gateway technology
CN113312957A (en) off-Shift identification method, device, equipment and storage medium based on video image
US20230342147A1 (en) Model processing method and apparatus
CN111813910A (en) Method, system, terminal device and computer storage medium for updating customer service problem
CN111709467A (en) Product and cloud data matching system and method based on machine vision
CN109818796B (en) Data center construction method and device, electronic equipment and medium
CN110991279B (en) Document Image Analysis and Recognition Method and System
CN116229188B (en) Image processing display method, classification model generation method and equipment thereof
CN116863116A (en) Image recognition method, device, equipment and medium based on artificial intelligence
US20200026701A1 (en) Dynamic visualization of application and infrastructure components with layers
CN109754319B (en) Credit score determination system, method, terminal and server
CN114564590A (en) Intelligent medical information processing method and system applied to big data and artificial intelligence
CN113297491A (en) Service subscription information pushing method and system based on social network
CN114494933A (en) Hydrology monitoring station image recognition monitoring system based on edge intelligence
KR20210128096A (en) Apparatus and method for interworking among internet of things platforms
CN112204525A (en) Distributed computing system with integrated data-as-a-service frameset package repository

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication

Application publication date: 20200925