CN115600941B - Material management system of assembled prefabricated part factory - Google Patents

Material management system of assembled prefabricated part factory

Info

Publication number
CN115600941B
Authority
CN
China
Prior art keywords
warehouse
prediction
image
cloud server
module
Prior art date
Legal status
Active
Application number
CN202211577533.7A
Other languages
Chinese (zh)
Other versions
CN115600941A
Inventor
李学俊
王恒新
周思宇
王华彬
Current Assignee
Green Industry Innovation Research Institute of Anhui University
Original Assignee
Green Industry Innovation Research Institute of Anhui University
Priority date
Filing date
Publication date
Application filed by Green Industry Innovation Research Institute of Anhui University
Priority to CN202211577533.7A
Publication of CN115600941A
Application granted
Publication of CN115600941B
Legal status: Active
Anticipated expiration

Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06Q - INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q10/00 - Administration; Management
    • G06Q10/08 - Logistics, e.g. warehousing, loading or distribution; Inventory or stock management
    • G06Q10/087 - Inventory or stock management, e.g. order filling, procurement or balancing against orders
    • G06Q10/0875 - Itemisation or classification of parts, supplies or services, e.g. bill of materials
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06K - GRAPHICAL DATA READING; PRESENTATION OF DATA; RECORD CARRIERS; HANDLING RECORD CARRIERS
    • G06K17/00 - Methods or arrangements for effecting co-operative working between equipments covered by two or more of main groups G06K1/00 - G06K15/00, e.g. automatic card files incorporating conveying and reading operations
    • G06K17/0022 - Methods or arrangements for effecting co-operative working between equipments covered by two or more of main groups G06K1/00 - G06K15/00, e.g. automatic card files incorporating conveying and reading operations, arrangements or provisions for transferring data to distant stations, e.g. from a sensing device
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06Q - INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q10/00 - Administration; Management
    • G06Q10/10 - Office automation; Time management
    • G06Q10/103 - Workflow collaboration or project management
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06V - IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00 - Arrangements for image or video recognition or understanding
    • G06V10/70 - Arrangements for image or video recognition or understanding using pattern recognition or machine learning
    • G06V10/766 - Arrangements for image or video recognition or understanding using pattern recognition or machine learning using regression, e.g. by projecting features on hyperplanes
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06V - IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00 - Arrangements for image or video recognition or understanding
    • G06V10/70 - Arrangements for image or video recognition or understanding using pattern recognition or machine learning
    • G06V10/77 - Processing image or video features in feature spaces; using data integration or data reduction, e.g. principal component analysis [PCA] or independent component analysis [ICA] or self-organising maps [SOM]; Blind source separation
    • G06V10/80 - Fusion, i.e. combining data from various sources at the sensor level, preprocessing level, feature extraction level or classification level
    • G06V10/806 - Fusion, i.e. combining data from various sources at the sensor level, preprocessing level, feature extraction level or classification level of extracted features
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06V - IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00 - Arrangements for image or video recognition or understanding
    • G06V10/70 - Arrangements for image or video recognition or understanding using pattern recognition or machine learning
    • G06V10/82 - Arrangements for image or video recognition or understanding using pattern recognition or machine learning using neural networks
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06V - IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V2201/00 - Indexing scheme relating to image or video recognition or understanding
    • G06V2201/07 - Target detection


Abstract

The invention belongs to the field of the building industry, and particularly relates to a material management system for an assembly-type prefabricated part factory. The system digitally manages the warehousing (warehouse-in) and ex-warehousing (warehouse-out) of materials in the prefabricated part factory. The material management system comprises electronic tags, net-weight measuring equipment, a cloud server, and a handheld terminal. The electronic tags include a first tag and a second tag mounted on each transport vehicle. The net-weight measuring equipment comprises a weighing mechanism and an electronic tag writing module. An inventory database and a machine-vision-based discrete material detection counting model run in the cloud server. The handheld terminal is in communication connection with the cloud server and exchanges data with it; it comprises an image acquisition module, an electronic tag read-write module and a display module. The system solves the problems that material management in existing assembly-type prefabricated part factories depends on manual work, inventory data is not updated in time, and data is prone to errors.

Description

Material management system of an assembly-type prefabricated part factory
Technical Field
The invention belongs to the field of the building industry, and particularly relates to a material management system for an assembly-type prefabricated part factory.
Background
The assembly-type (prefabricated) building is an important direction for the future development of the building industry. It is a new construction mode in which various building components are processed in advance in a factory, transported to the construction site, and assembled there through reliable connections. Compared with the conventional cast-in-place structure, it has the advantages of large-scale production, high construction speed and low construction cost.
With market changes, prefabricated component factories are gradually becoming more specialized, and to satisfy the assembly demands of different types of buildings, enterprises produce an ever wider variety of prefabricated components. This diversification of prefabricated part types also puts great pressure on the inventory management of the factory.
The production cost of building prefabricated parts is high, so a make-to-order mode is mostly adopted; producing in large quantities and building up stock would greatly increase the operating cost of the enterprise. Therefore, in order to save production cost, enterprises must accurately grasp, in real time, the stock of materials in the workshop, the processing status of the prefabricated products, and the manufacturing factors closely related to the materials during processing, so as to achieve ordered manufacturing, accurate production and fine management.
The premise of fine management of the prefabricated part manufacturing process is that the stock data in the factory can be updated in real time. However, in most current prefabricated part factories, material data acquisition and management mainly rely on manual checking, with the responsible managers periodically uploading the latest stock data from paper documents. Because the materials in the warehouse are of many types, the data volume is large and the state changes frequently, relying on manual entry leads to heavy workload, slow acquisition and frequent errors, all of which affect the accuracy and timeliness of the inventory data. Therefore, digital management of inventory data has become the key for fabricated prefabricated part factories to reduce cost and increase efficiency, but the prior art still fails to provide an effective solution.
Disclosure of Invention
The invention provides a material management system for an assembly-type prefabricated part factory, aiming to solve the problems that material management in existing assembly-type prefabricated part factories depends on manual work, inventory data is not updated in time, and data is prone to errors.
The invention is realized by adopting the following technical scheme:
a material management system of an assembly type prefabricated member factory is used for carrying out digital management on the warehousing and ex-warehouse processes of materials in the assembly type prefabricated member factory. The material management system includes: electronic tags, net weight measuring equipment, cloud server, and handheld terminal.
Wherein the electronic tags include a first tag and a second tag mounted on each transport vehicle. The first tag is used for storing a warehousing list or an ex-warehouse list during vehicle dispatch. The second tag is used for storing the weighing results of the vehicle.
The net weight measuring device comprises a weighing mechanism and an electronic tag writing module. The weighing mechanism is used for weighing the transport vehicles entering and leaving the factory, and the electronic tag writing module is used for writing the weighing result of the weighing mechanism into a second tag in the vehicle.
An inventory database and a machine-vision-based discrete material detection counting model run in the cloud server. The inventory database performs classified statistics on the inventory of each material in the factory according to the accepted factory-entry and factory-exit information. The discrete material detection counting model identifies and counts the quantity of piece-count nominal materials in the transported goods from input material images taken at different angles.
The handheld terminal is in communication connection with the cloud server and exchanges data with it. The handheld terminal comprises an image acquisition module, an electronic tag read-write module and a display module. The image acquisition module is used for acquiring images of materials entering and leaving the site; the electronic tag read-write module is used for reading or updating the tag information in any electronic tag.
The application process of the material management system is as follows:
1. vehicle warehousing stage:
the warehousing inventory or the ex-warehouse inventory is pre-recorded into the first label of the transport vehicle. And reading the inventory information in the first tag of the vehicle through the handheld terminal, and uploading the inventory information to the cloud server to finish the pre-registration of the warehouse entry and the warehouse exit. Meanwhile, the net weight measuring equipment writes the warehousing weighing result of the vehicle into a second label in the vehicle.
And the cloud server matches the received warehouse entry list or warehouse exit list with the inventory database, and sends the designated unloading or loading warehouse position to the handheld terminal.
2. Unloading or loading stage:
the vehicle is guided according to the designated warehouse location.
And collecting material images by adopting a handheld terminal in the loading and unloading process, and uploading the material images to a cloud server.
And the cloud server identifies and counts, through the discrete material detection counting model, the measured nominal quantity of the piece-count nominal materials.
3. And (3) vehicle delivery stage:
the net weight measuring device writes the out-of-warehouse weighing result of the vehicle into a second tag in the vehicle.
And the warehouse manager scans a second label of the vehicle by using the handheld terminal and synchronously sends the read warehousing weighing result and the ex-warehouse weighing result to the cloud server.
And the cloud server calculates the net weight of the goods from the warehousing weighing result and the ex-warehouse weighing result, and takes this net weight as the measured nominal amount of the weight-nominal materials. The cloud server then compares the nominal information of the materials in the warehousing list or ex-warehouse list with the measured nominal amount; if they match, the vehicle passes the ex-warehouse verification and is released. The inventory database is then modified according to the warehousing list or ex-warehouse list.
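As a concrete illustration of the verification above, the minimal Python sketch below derives the goods net weight from the two weighing results and compares it with the nominal weight declared in the list; the record layout, the field names and the tolerance are assumptions introduced for illustration and are not prescribed by the invention.

    from dataclasses import dataclass

    @dataclass
    class WeighRecord:
        """Weighing results stored in the vehicle's second tag (illustrative layout)."""
        entry_weight_kg: float    # warehousing (gate-in) weighing result
        exit_weight_kg: float     # ex-warehouse (gate-out) weighing result

    def measured_net_weight(rec: WeighRecord) -> float:
        """Net weight of the goods = difference between the two weighings."""
        return abs(rec.entry_weight_kg - rec.exit_weight_kg)

    def verify_exit(rec: WeighRecord, nominal_weight_kg: float,
                    tolerance_kg: float = 5.0) -> bool:
        """Release the vehicle only if the measured net weight matches the nominal
        weight declared in the warehousing / ex-warehouse list (assumed tolerance)."""
        return abs(measured_net_weight(rec) - nominal_weight_kg) <= tolerance_kg

    # A delivery truck weighed 24 t at the gate-in and 9 t at the gate-out, so about
    # 15 t of a weight-nominal material (e.g. sand) was actually unloaded.
    print(verify_exit(WeighRecord(24_000, 9_000), nominal_weight_kg=15_000))   # True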
As a further improvement of the present invention, the electronic tag further includes a third tag mounted on each of the warehouses. The third label is used for storing the material storage capacity data in each warehouse. After the vehicle is taken out of the warehouse, the handheld terminal receives the information in the modified inventory database in the cloud server and updates a third label corresponding to the warehouse;
when a production department in a factory needs to receive materials in any warehouse, a receiving request is sent to a cloud server through a handheld terminal; the cloud server modifies the inventory database after responding to the request; and after the materials are received, the manager updates the third label of the corresponding warehouse through the handheld terminal.
When a production department in a factory needs to store produced products in any warehouse, a storage request is sent to a cloud server through a handheld terminal; the cloud server modifies the inventory database after responding to the request; and after the product is stored and taken, the manager updates the third label of the corresponding warehouse through the handheld terminal.
As a further improvement of the invention, during the periodic inventory of a warehouse, the checking personnel count and inspect the materials in the warehouse. The material storage data recorded in the third tag of each warehouse are then compared with the actually counted data; when a difference is found between the two, it is reported to the system, and the inventory database in the cloud server is modified after verification by an authorized manager. The checking personnel then update the third tag of the corresponding warehouse through the handheld terminal.
As a further improvement of the invention, during vehicle loading and unloading, when the transported materials are weight-nominal materials, at most one type is allowed to be transported at a time; when the transported materials are piece-count nominal materials, several types may be transported simultaneously.
As a further improvement of the present invention, the information in the warehousing list includes: the supply unit of the current batch of materials, the transport vehicle information, the traceability information of the materials, the material types, the nominal amount of the materials to be accepted, the unit and department receiving the materials, and other remarks.
The information in the ex-warehouse list includes: the receiving unit of the current batch of materials, the transport vehicle information, the traceability information of the materials, the material types, the nominal amount of the materials to be accepted, the supplying unit and department of the materials, and other remarks.
As a further improvement of the present invention, the first tag, the second tag and the third tag are RFID tags; the handheld terminal is a dedicated electronic device with a network communication function, or a general-purpose electronic device with a communication function on which a specific management program is installed.
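To make the tag contents tangible, the following sketch models a warehousing or ex-warehouse list as a simple record that could be serialized into the first tag; the field names and the JSON encoding are illustrative assumptions, the invention does not prescribe a concrete payload format.

    import json
    from dataclasses import dataclass, asdict

    @dataclass
    class MaterialList:
        """Contents of a warehousing or ex-warehouse list written into the first tag."""
        list_type: str          # "warehousing" or "ex-warehouse"
        counterparty: str       # supply unit (warehousing) or receiving unit (ex-warehouse)
        vehicle_info: str       # vehicle model, license plate number, ...
        traceability: str       # circulation history of the material
        material_type: str      # material name plus nominal type: by weight or by count
        nominal_amount: float   # total weight (kg) or total piece count of the batch
        counterpart_dept: str   # receiving / supplying unit and department
        remarks: str = ""       # storage requirements, shelf life, ...

    def to_tag_payload(lst: MaterialList) -> bytes:
        """Serialize the list for writing into an RFID tag (illustrative encoding)."""
        return json.dumps(asdict(lst), ensure_ascii=False).encode("utf-8")

    payload = to_tag_payload(MaterialList(
        "warehousing", "XX Building Materials Co.", "truck, plate AB-12345",
        "manufacturer -> logistics center -> plant", "PVC pipe (by count)",
        1200, "Materials Dept.", "keep dry"))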
As a further improvement of the invention, the constructed discrete material detection counting model comprises an image optimization module, a target detection module designed based on YOLOv5 and an output module. The image optimization module is used for carrying out local brightness optimization on the input image so as to obtain an image with uniform pixel brightness distribution and outputting the image to the target detection module. The target detection module is used for identifying each prediction frame containing the identification target under different scale characteristic states and the corresponding confidence coefficient according to the input image. The output module processes the output of the target detection network by adopting an improved non-maximum suppression algorithm to obtain the number of target objects contained in the input image.
Wherein, the target detection module includes: the system comprises a feature extraction network, a feature fusion network and a prediction head. The feature extraction network is used for extracting features of the input image, and four feature maps with different scales of 160 × 160, 80 × 80, 40 × 40, and 20 × 20 are obtained. And the feature fusion network performs fusion reinforcement on the feature graphs of four different scales according to repeated bidirectional feature fusion paths to obtain fusion features of 80 × 80, 40 × 40 and 20 × 20 in three different scales. The prediction head is used for respectively carrying out object detection and prediction frame regression on the three fusion characteristics with different scales; and outputting the corresponding prediction box and the confidence level.
As a further improvement of the invention, the optimization process of the image optimization module is as follows:
(1) Convert the original image Image into a gray-scale image Grey.
(2) Expand the gray-scale image Grey to the number of channels of the original image and multiply it by a preset luminance weight coefficient α to obtain a weight matrix W, namely: W = α × Grey; here "×" denotes element-wise multiplication of the matrix by the coefficient. The luminance weight coefficient α ranges from 0 to 1, and a larger value indicates a stronger enhancement.
(3) Pixels whose value in the gray-scale image Grey is less than 0.5 are regarded as dark pixels; the dark pixels screened out of Grey are pixel-enhanced to obtain the corresponding exposure image Exposure.
(4) Output the enhanced image obtained by fusing the original image Image and the exposure image Exposure according to the weight matrix W.
as a further improvement of the invention, the target detection module is improved as follows:
(1) And replacing all CSP-blocks in the characteristic extraction network of YOLOv5 with newly designed CSP-DSC-blocks.
The CSP-DSC-Block splits the input features into two paths: one path is processed by a DSC-Block and the other by multiple Dense-Blocks; the two processing results are then merged and output after a 3×3 convolution.
(2) Replace all 3×3 convolution modules of the backbone network in YOLOv5 with Rep-Blocks, so that a multi-branch structure can be adopted for the Rep-Block during network training.
(3) An attention mechanism module is introduced into a feature extraction network and a feature fusion network of YOLOv5, and the attention mechanism module adopts a module for fusing channel attention and space attention.
The feature input dimension of the attention mechanism module is H × W × C, where H is the feature height, W the feature width and C the number of feature channels. First, global max pooling and global average pooling are applied to the input features respectively and the results are spliced into a 1 × 1 × C feature; the channel attention weight is then obtained through a 1×1 convolution kernel and a Sigmoid function, and the input features are multiplied (dot product) by the channel attention weight to obtain the H × W × C features after channel attention fusion. Next, global max pooling and global average pooling are applied to these features respectively and the resulting maps are spliced into an H × W × 2 feature map. Finally, after a 7×7 convolution kernel and a Sigmoid function, the H × W × C output features processed by the attention mechanism are obtained.
(4) The prediction head is adjusted; the simplified prediction head adopts a structure that decouples object confidence judgment from prediction box regression.
Specifically, the prediction head uses a 1×1 convolution to split the incoming feature map along the channel dimension into two feature maps of the same shape, then applies a 3×3 convolution to each, and finally uses a 1×1 convolution for dimensionality reduction to obtain the confidence and the corresponding prediction box result. The number of output channels of the object confidence branch is 1, and the number of output channels of the prediction box regression branch is 4.
(5) The output module adopts an improved non-maximum suppression algorithm: the prediction boxes with the top three scores are retained and given corresponding weights, and the weighted average of the three highest-scoring prediction boxes is taken as the prediction box closest to the real box.
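A minimal NumPy sketch of this modified suppression step follows. The clustering of boxes by an IoU threshold and the confidence-proportional weights are assumptions made to turn the verbal description into runnable code; the function names are illustrative only.

    import numpy as np

    def iou_one_to_many(box, boxes):
        """IoU between one box and an array of boxes; boxes are (x1, y1, x2, y2)."""
        x1 = np.maximum(box[0], boxes[:, 0]); y1 = np.maximum(box[1], boxes[:, 1])
        x2 = np.minimum(box[2], boxes[:, 2]); y2 = np.minimum(box[3], boxes[:, 3])
        inter = np.clip(x2 - x1, 0, None) * np.clip(y2 - y1, 0, None)
        area_a = (box[2] - box[0]) * (box[3] - box[1])
        area_b = (boxes[:, 2] - boxes[:, 0]) * (boxes[:, 3] - boxes[:, 1])
        return inter / (area_a + area_b - inter + 1e-9)

    def top3_weighted_nms(boxes, scores, iou_thr=0.5):
        """For each cluster of overlapping predictions, fuse the three highest-scoring
        boxes into one by a confidence-weighted average (assumed weighting scheme)."""
        order = np.argsort(scores)[::-1]          # indices sorted by descending score
        merged = []
        while order.size > 0:
            overlaps = iou_one_to_many(boxes[order[0]], boxes[order])
            cluster = order[overlaps >= iou_thr]  # boxes belonging to the same object
            top3 = cluster[:3]                    # already in descending-score order
            w = scores[top3] / scores[top3].sum() # weight proportional to confidence
            merged.append((w[:, None] * boxes[top3]).sum(axis=0))
            order = order[overlaps < iou_thr]     # drop the whole cluster and continue
        return np.array(merged)                   # len(merged) = number of detected objects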
As a further improvement of the invention, the discrete material detection counting model is trained respectively for different types of materials, and corresponding model parameters of the trained target detection module are stored;
the model training process of each target detection module is improved as follows:
(1) Amplifying the original data set by adopting a Mosaic enhancement strategy; the specific processing process comprises the steps of splicing any four images, and transforming the spliced images in a rotating, cutting and scaling mode.
(2) And in the network model training stage, the SimOTA label dynamic allocation method is adopted to optimize the positive sample.
(3) The loss function, improved and designed to fit the task, is calculated as

$$Loss_1 = \lambda_{coord}\sum_{i=1}^{S^2}\sum_{j=1}^{B} \mathbb{1}_{ij}^{obj}\big[(x_{ij}-\hat x_{ij})^2+(y_{ij}-\hat y_{ij})^2+(w_{ij}-\hat w_{ij})^2+(h_{ij}-\hat h_{ij})^2\big] + \lambda_{obj}\sum_{i=1}^{S^2}\sum_{j=1}^{B} \mathbb{1}_{ij}^{obj}(C_{ij}-\hat C_{ij})^2 + \lambda_{noobj}\sum_{i=1}^{S^2}\sum_{j=1}^{B} \mathbb{1}_{ij}^{noobj}(C_{ij}-\hat C_{ij})^2$$

In the above formula, $\lambda_{coord}$ denotes the weight of the prediction box regression loss, $\lambda_{obj}$ the weight of the positive-sample confidence prediction loss, and $\lambda_{noobj}$ the weight of the negative-sample confidence prediction loss. $S^2$ denotes the number of feature points, whose value is the product of the length and width of the feature map; $B$ denotes the number of prediction boxes per feature point, which can be preset according to the task. $C_{ij}$ denotes the predicted object confidence and $\hat C_{ij}$ the real confidence label, where the subscripts $i$ and $j$ refer to the $j$-th prediction box at the $i$-th feature point; the confidence indicates the proportion of the object outline contained in the feature point and ranges from 0 to 1. $x$, $y$, $w$ and $h$ denote the center coordinates and the width and height of the prediction box, and $\hat x$, $\hat y$, $\hat w$ and $\hat h$ the corresponding center coordinates and width and height of the real label, the subscripts again referring to the $j$-th prediction box at the $i$-th feature point. $\mathbb{1}_{ij}^{obj}$ and $\mathbb{1}_{ij}^{noobj}$ are a pair of indicators marking whether the feature point at $(i, j)$ contains a target object: when the corresponding feature point contains an object, $\mathbb{1}_{ij}^{obj} = 1$ and $\mathbb{1}_{ij}^{noobj} = 0$; when it does not, $\mathbb{1}_{ij}^{obj} = 0$ and $\mathbb{1}_{ij}^{noobj} = 1$.
(4) A knowledge distillation structure is adopted, and a teacher model is set up to help the model converge quickly. The prediction result output by the teacher model is used as a soft label, and the loss computed from it and the prediction result of the student model is called the soft loss $L_{soft}$; the loss computed from the real label and the prediction result of the student model is called the hard loss $L_{hard}$. Both losses are computed with the loss function $Loss_1$ defined in (3), and the total training loss is a weighted combination of $L_{soft}$ and $L_{hard}$. When the output of the student model is closer to the real label than the output of the teacher model, the knowledge distillation training is stopped and only the student model is trained, the training loss then reducing to $Loss_1$.
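The following PyTorch-style sketch shows one way the loss of (3) and the soft/hard combination of (4) could be wired together; the squared-error form, the tensor layout, the default weights and the coefficient alpha are illustrative assumptions rather than values taken from the patent.

    import torch

    def detection_loss(pred, target, obj_mask,
                       w_box=5.0, w_obj=1.0, w_noobj=0.5):
        """Sketch of Loss_1 from (3).  pred / target: (S*S, B, 5) tensors holding
        (x, y, w, h, confidence); obj_mask: (S*S, B) bool tensor that is True where
        a prediction box is responsible for an object."""
        box_err = ((pred[..., :4] - target[..., :4]) ** 2).sum(-1)   # coordinate term
        conf_err = (pred[..., 4] - target[..., 4]) ** 2              # confidence term
        return (w_box * (obj_mask * box_err).sum()
                + w_obj * (obj_mask * conf_err).sum()
                + w_noobj * ((~obj_mask) * conf_err).sum())

    def distillation_loss(student_out, teacher_out, target, obj_mask, alpha=0.5):
        """Sketch of (4): soft loss against the teacher output plus hard loss against
        the real label, both computed with the loss of (3); alpha is an assumed
        weighting coefficient."""
        soft = detection_loss(student_out, teacher_out.detach(), obj_mask)
        hard = detection_loss(student_out, target, obj_mask)
        return alpha * soft + (1.0 - alpha) * hard

Once the student's predictions are closer to the real labels than the teacher's, training would continue with detection_loss alone, as described above.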
as a further improvement of the invention, the discrete material detection counting model is obtained by training through a learning-based target detection network. The input of the material detection technical counting network model is a material image containing a material to be identified, and the material to be identified should be kept in a flat state or a regular stacking state in the material image. The output of the discrete material detection counting model is the type and the number of the materials to be identified contained in the material image.
As a further improvement of the invention, the training process of the discrete material detection counting model is as follows:
(1) Acquiring clear original images which meet the requirements of shooting angles and contain different materials, and preprocessing the original images to obtain an original data set; and the original data set is expanded through cutting and rotating operations.
(2) And manually labeling the images in the original data set. The marked object is a discrete material in the image, and the marked mark information comprises: type of material, location information and quantity information.
Meanwhile, each image in the original data set is stored together with its corresponding annotation information to obtain a new data set, which is randomly divided into a training set, a validation set and a test set in a data ratio of 8:1:1.
(3) The constructed discrete material detection counting model is trained for multiple rounds with the training set, and after each round of training it is validated with the validation set, yielding the loss values of the material detection counting network in the training stage and the validation stage respectively. The training process is stopped when the loss value on the training set still decreases in each round while the loss value on the validation set increases. The five network models whose loss values rank in the top five during the training stage are stored.
(4) And testing the five stored network models by using a test set, and then taking the network model with the highest mAP value in the test result as the final discrete material detection counting model.
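A schematic training driver reflecting steps (3) and (4) above (validation-based stopping, keeping the five lowest-loss checkpoints, selecting the final model by test-set mAP) might look as follows; train_one_epoch, evaluate_loss and evaluate_map are placeholders for routines the patent does not spell out, and the whole sketch is illustrative rather than the patented procedure.

    import copy
    import heapq

    def train_counting_model(model, train_one_epoch, evaluate_loss, evaluate_map,
                             train_set, val_set, test_set, max_epochs=300):
        """Illustrative driver for the training procedure in steps (3) and (4)."""
        best = []                                 # heap of (-train_loss, epoch, model copy)
        prev_train = prev_val = float("inf")
        for epoch in range(max_epochs):
            train_loss = train_one_epoch(model, train_set)
            val_loss = evaluate_loss(model, val_set)
            heapq.heappush(best, (-train_loss, epoch, copy.deepcopy(model)))
            if len(best) > 5:                     # keep only the five lowest-loss models
                heapq.heappop(best)
            if train_loss < prev_train and val_loss > prev_val:
                break                             # training loss falls, validation loss rises
            prev_train, prev_val = train_loss, val_loss
        # among the retained models, return the one with the highest mAP on the test set
        return max((m for _, _, m in best), key=lambda m: evaluate_map(m, test_set))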
As a further improvement of the invention, in the unloading stage the cloud server preferentially designates a warehouse that has a higher remaining capacity and is suitable for storing the current type of material as the unloading warehouse. In the loading stage, the cloud server preferentially designates the warehouse holding the largest stock of the material as the loading warehouse for the current materials.
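This selection rule can be pictured with the small sketch below; the Warehouse record and its fields are illustrative assumptions, since the invention only fixes the preference order.

    from dataclasses import dataclass, field

    @dataclass
    class Warehouse:
        name: str
        accepted_types: set                          # material types the warehouse may store
        remaining_capacity: float                    # free storage capacity
        stock: dict = field(default_factory=dict)    # material type -> stored amount

    def pick_unloading_warehouse(warehouses, material_type):
        """Unloading: prefer a suitable warehouse with the largest remaining capacity."""
        suitable = [w for w in warehouses if material_type in w.accepted_types]
        return max(suitable, key=lambda w: w.remaining_capacity, default=None)

    def pick_loading_warehouse(warehouses, material_type):
        """Loading: prefer the warehouse currently holding the most of the material."""
        return max(warehouses, key=lambda w: w.stock.get(material_type, 0.0), default=None)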
The technical scheme provided by the invention has the following beneficial effects:
the material management system provided by the invention can carry out digital management on the environments such as material inlet and outlet, material getting and new material adding and the like in the whole factory, and real-time recording and automatic uploading are carried out on data generated in the inventory management process by utilizing a machine vision technology and a radio frequency identification technology, so that the timeliness of the material data is improved.
The material management system provided by the invention realizes full-process supervision and verification on different material changes in the plant by using less electronic equipment, thereby ensuring digitization and paperless property in the material management process. The work efficiency of managers is improved, the workload of the managers is reduced, and the risks of data errors and data omission are greatly reduced.
Due to the advantages of the scheme, after the material management system provided by the invention is applied to an assembled prefabricated part factory, the management efficiency of order production is favorably improved, and the inventory rate of products is reduced; thereby effectively compressing the production cost of enterprises and improving the economic benefits of the enterprises.
Drawings
The accompanying drawings, which are included to provide a further understanding of the invention and are incorporated in and constitute a part of this specification, illustrate embodiments of the invention and together with the description serve to explain the principles of the invention and not to limit the invention. In the drawings:
fig. 1 is a system block diagram of a material management system of a prefabricated component factory according to embodiment 1 of the present invention.
Fig. 2 is a block diagram of a material management system of a prefabricated component factory constructed in example 1 of the present invention.
Fig. 3 is a schematic diagram of an operation process of a material management system of an assembly type preform factory according to embodiment 1 of the present invention.
Fig. 4 is a network framework diagram of the discrete material detection counting model constructed in embodiment 2 of the present invention.
Fig. 5 is a flowchart of the operation of the image optimization module performing the local image brightness enhancement processing in embodiment 2 of the present invention.
Fig. 6 is an overall network framework diagram of the object detection module in embodiment 2 of the present invention.
Fig. 7 is a network framework diagram of the improved feature fusion network part in embodiment 2 of the present invention.
Fig. 8 is a network framework diagram of the improved prediction header part in embodiment 2 of the present invention.
FIG. 9 is a Block diagram of a CSP-DSC-Block designed according to the present invention.
FIG. 10 is a schematic diagram of the attention mechanism of a module incorporating channel attention and spatial attention designed in accordance with the present invention.
Fig. 11 is a graph of accuracy versus recall of the network model in the testing process according to embodiment 2 of the present invention.
Detailed Description
In order to make the objects, technical solutions and advantages of the present invention more apparent, the present invention is described in further detail below with reference to the accompanying drawings and embodiments. It should be understood that the specific embodiments described herein are merely illustrative of the invention and are not intended to limit the invention.
Example 1
The embodiment provides a material management system of a fabricated part factory, which is used for digitally managing the warehousing and ex-warehousing processes of materials in the fabricated part factory. As shown in fig. 1 and 2, the material management system includes: electronic tags, net weight measuring equipment, cloud server, and handheld terminal.
Wherein the electronic tags include a first tag and a second tag mounted on each transport vehicle. The first tag is used for storing a warehousing list or an ex-warehouse list during vehicle dispatch. The second tag is used for storing the weighing results of the vehicle. The warehousing list and the ex-warehouse list are delivery-related material lists filled in by the transport unit or the material supplying/receiving unit; the list records the information of the materials delivered into the warehouse or the information of the products taken from the warehouse.
Specifically, when the material is delivered to a warehouse of a factory, the information in the warehousing list includes: the unit of supplying goods of the present batch of materials, the information of transport vehicles, the traceability information of the materials, the types of the materials, the standard weights of the materials to be checked and accepted, the unit and the department for receiving the materials, other remark information and the like.
The supply unit refers to the manufacturer or trading company of the current batch of materials, and the transport vehicle information records the vehicle model, license plate number and other details of the vehicle used by the material distribution unit. The traceability information records the circulation of the material before it arrives. The material type specifies the name of the material and its nominal type, of which there are two: weight nominal and piece-count nominal. For example, sand and similar materials are typically weight-nominal materials that are registered by weight when entering and leaving the warehouse, while cement, pipes, construction products and the like are typically piece-count nominal materials that are registered by quantity. The nominal amount of the materials to be accepted refers to the total amount of the current batch: for sand it is the total weight of the batch, while for cement it is the total quantity, i.e. the number of bags. The unit and department receiving the materials is the party responsible for verification when the batch is put into storage, usually the asset management section of the company. Other remark information beyond the above may also be recorded, such as the storage requirements and shelf life of the materials.
Accordingly, when a manufactured product or semi-finished product is shipped from the factory warehouse, the information in the ex-warehouse list includes: the receiving unit of the current batch of materials, the transport vehicle information, the traceability information of the materials, the material types, the nominal amount of the materials to be accepted, the supplying unit and department of the materials, and other remarks.
The net weight measuring device comprises a weighing mechanism and an electronic tag writing module. The weighing mechanism is used for weighing the transport vehicles entering and leaving the factory, and the electronic tag writing module is used for writing the weighing result of the weighing mechanism into a second tag in the vehicle.
An inventory database and a machine-vision-based discrete material detection counting model run in the cloud server. The inventory database performs classified statistics on the inventory of each material in the factory according to the accepted factory-entry and factory-exit information. The discrete material detection counting model identifies and counts the quantity of piece-count nominal materials in the transported goods from input material images taken at different angles.
The handheld terminal is in communication connection with the cloud server and exchanges data with it. The handheld terminal comprises an image acquisition module, an electronic tag read-write module and a display module. The image acquisition module is used for acquiring images of materials entering and leaving the site; the electronic tag read-write module is used for reading or updating the tag information in any electronic tag.
As shown in fig. 3, the application process of the material management system is as follows:
1. vehicle warehousing stage:
the warehousing inventory or the ex-warehouse inventory is pre-recorded into the first label of the transport vehicle. And reading the inventory information in the first tag of the vehicle through the handheld terminal, and uploading the inventory information to the cloud server to finish the pre-registration of the warehouse entry and the warehouse exit.
Meanwhile, the net weight measuring equipment writes the warehousing weighing result of the vehicle into a second label in the vehicle.
And the cloud server matches the received warehousing list or ex-warehouse list with the inventory database and sends the designated unloading or loading warehouse position to the handheld terminal. In the unloading stage, the cloud server preferentially designates a warehouse that has a higher remaining capacity and is suitable for storing the current type of material as the unloading warehouse. In the loading stage, the cloud server preferentially designates the warehouse holding the largest stock of the material as the loading warehouse for the current materials.
2. Unloading or loading stage:
the vehicle is guided according to the designated warehouse location.
And collecting material images by adopting a handheld terminal in the loading and unloading process, and uploading the material images to a cloud server.
And the cloud server identifies and counts, through the discrete material detection counting model, the measured nominal quantity of the piece-count nominal materials.
3. And (3) vehicle delivery stage:
the net weight measuring device writes the out-of-warehouse weighing result of the vehicle into a second tag in the vehicle.
And the warehouse manager scans a second label of the vehicle by using the handheld terminal and synchronously sends the read warehousing weighing result and the ex-warehouse weighing result to the cloud server.
And the cloud server calculates the net weight of the goods from the warehousing weighing result and the ex-warehouse weighing result, and takes this net weight as the measured nominal amount of the weight-nominal materials. The cloud server then compares the nominal information of the materials in the warehousing list or ex-warehouse list with the measured nominal amount; if they match, the vehicle passes the ex-warehouse verification and is released, and the inventory database is modified according to the warehousing list or ex-warehouse list.
In order to make the scheme provided by this embodiment clearer, the operation of the material management system provided by the present application is described in detail below, taking as examples the warehousing of PVC pipes (piece-count nominal) and building sand (weight nominal), the ex-warehouse delivery of cement culvert pipes, daily stock checking, and goods receiving and product storage:
1. warehousing and registering materials:
before entering a field, a supply unit or a logistics company needs to write a warehousing list into a first label of a vehicle. When the transport vehicle arrives at the factory entrance, the vehicle needs to be stopped at a weighing mechanism (usually a wagon balance) which measures the warehousing weighing result of the vehicle. The warehousing weighing result can be written into the second label by the net weight measuring equipment through the electronic label writing module.
Meanwhile, warehouse management personnel in a factory can scan the first label on the vehicle by using the handheld terminal, acquire information of a warehousing list and upload the information to the cloud server for pre-registration. The warehousing list is actually a material detail list which is informed to the factory by a supply unit or a transport company, and can replace the traditional paper logistics distribution list.
And then the cloud server can know the details of the delivered materials of the batch according to the pre-registration information, and reasonably distribute the unloading positions based on the storage capacity conditions of different warehouses. The warehouse manager may act as a pilot to bring the delivery vehicle to the unloading warehouse.
During the unloading process, different operations are performed according to the type of goods. For example, for materials of the piece-count nominal type such as PVC pipes, the warehouse manager can use the handheld terminal to photograph the pipes on the vehicle, or in the warehouse after they have been unloaded and stacked, and upload the photos to the cloud server, which identifies the specification and quantity of the PVC pipes transported in this batch from the images. The specification and quantity identified by the cloud server constitute the measured nominal amount for the current round of acceptance, i.e. the amount of goods actually received by the warehouse. Of course, if during unloading the warehouse management personnel find quality problems with the delivered materials, a notice can be sent to the cloud server through the handheld terminal and the measured nominal amount is verified and reduced accordingly.
When the vehicle arrives at the factory exit after unloading, it is weighed a second time by another group of weighing mechanisms; this weighing result is recorded as the ex-warehouse weighing result and is likewise written into the second tag of the vehicle. The warehouse manager then reads the data in the second tag again with the handheld terminal, thereby obtaining the latest warehousing weighing result and ex-warehouse weighing result at the same time, and uploads both to the cloud server.
It should be noted that, for piece-count nominal materials, the factory weighing results do not affect the acceptance of the delivered goods; the acceptance is completed once the cloud server produces the measured nominal quantity. If the measured nominal quantity agrees with the warehousing list, warehousing registration is performed directly on the basis of the warehousing list, the vehicle is released and the inventory database is modified. If the two do not agree, the supply company or logistics company is informed, and warehousing registration is based on the measured nominal quantity.
The purpose of weighing in the ex-warehouse stage is to accept weight-nominal goods. For construction sand, for example, the sand is dumped at the yard by the transport vehicle, and the cloud server determines the weight of the sand actually received from the difference between the warehousing weighing result and the ex-warehouse weighing result; this is taken as the measured nominal amount (weight). Similarly, the cloud server judges whether the warehousing list and the measured nominal amount match: if they match, warehousing registration is performed directly on the basis of the warehousing list, the vehicle is released and the inventory database is modified; if they do not match, the supply company or logistics company is informed and warehousing registration is based on the measured nominal amount.
2. Out-of-warehouse registration of materials
When a certain receiving unit or the unit sends out a vehicle (empty vehicle) to transport a product from the warehouse, the delivery list needs to be written into the first tag of the vehicle. When the transport vehicle arrives at a factory entrance, warehouse management personnel can scan a first label of the vehicle by using a handheld terminal, read the content of the ex-warehouse list in the first label and upload the content to a cloud server. Meanwhile, the weighing mechanism can weigh the transport vehicle for the first time, and records the warehousing weighing result into the second label.
The cloud server distributes a loading warehouse for the transport vehicle according to the received delivery list and sends the loading warehouse to the handheld terminal; the vehicle is guided by the warehouse manager to the designated loading bay. In the loading process, the warehouse management personnel can shoot images of products through the handheld terminal and send the images to the cloud server, and the cloud server determines whether the actual measured standard quantity (namely the loading quantity) is consistent with the ex-warehouse list or not according to the recognized types and the recognized quantity of the products, and if the actual measured standard quantity is consistent with the ex-warehouse list, the actual measured standard quantity passes the ex-warehouse verification request of the current vehicle.
Similarly, in the ex-warehouse stage, the vehicle still needs to be subjected to secondary weighing at the factory outlet, and the ex-warehouse weighing result and the in-warehouse weighing result are jointly used as the criterion for verifying the net weight of the ex-warehouse material. If the result is in accordance with the standard, the product is released, and if the result is not in accordance with the standard, the product needs to be checked.
It should be noted that, in this embodiment, the weight of a material is obtained mainly by weighing the vehicle before and after unloading; therefore, during vehicle loading and unloading, at most one type of weight-nominal material is allowed to be transported at a time. If several different weight-nominal materials must be transported together, they are loaded or unloaded separately and the vehicle is weighed after each step, so that the net weight of each material can be calculated from the change in vehicle weight between two consecutive weighings.
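A one-line helper illustrates the arithmetic: the net weight handled at each step is simply the difference between consecutive weighings. The function name and the kilogram units are illustrative assumptions.

    def net_weights_from_weighings(weighings):
        """Net weight handled at each separate loading/unloading step, computed from
        the vehicle weights recorded after each step (starting with the gate-in weight)."""
        return [abs(later - earlier) for earlier, later in zip(weighings, weighings[1:])]

    # A truck enters at 30 t, unloads sand (down to 22 t), then gravel (down to 15 t):
    print(net_weights_from_weighings([30_000, 22_000, 15_000]))   # [8000, 7000]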
In the embodiment, the machine vision technology is used for identifying and counting the quantity of the piece counting nominal type materials, and the machine identification technology can be used for analyzing the quantity of different materials from the same picture. Thus, when the material being transported is a piece-by-piece nominal type of material, it allows for the simultaneous transport of a plurality of different types of material. For example, segments, boxes, culverts, etc. can be loaded and unloaded simultaneously during a single transport. These different materials can be identified by the trained network model.
3. Inventory checking
The daily management task of the materials in the warehouse also comprises material counting, and in this embodiment, the electronic tag also comprises a third tag installed on each warehouse. The third label is used for storing the material storage capacity data in each warehouse. And after the vehicle is taken out of the warehouse, the handheld terminal receives the information in the modified inventory database in the cloud server and updates the third label corresponding to the warehouse. The third tag on each warehouse can accurately record the actual storage capacity information of the warehouse, including the occupied storage capacity and the remaining storage capacity of the acceptable materials. The data in the third tag may also be used as reference data for the warehouse to periodically count.
Specifically, during the periodic inventory of a warehouse, the checking personnel count and inspect the materials in the warehouse. The material storage data recorded in the third tag of each warehouse are then compared with the actually counted data; when a difference is found between the two, it is reported to the system, and the inventory database in the cloud server is modified after verification by an authorized manager. The checking personnel then update the third tag of the corresponding warehouse through the handheld terminal.
For example, when the inventory process finds that the amount of cement actually stored in a warehouse is 801 bags while the inventory database shows 805 bags, this must be reported to the cloud server, the data in the inventory database modified, and the data in the third tag updated. During reporting, the manager can view the theoretical data of the inventory database on the display module of the handheld terminal and modify them.
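The reconciliation step can be pictured with the following small helper, which returns the materials whose recorded and counted quantities differ; the data layout is an assumption made for illustration.

    def reconcile(counted, recorded):
        """Return the materials whose physically counted quantity differs from the
        quantity recorded in the warehouse's third tag, as {material: (recorded, counted)}.
        Both arguments are dicts of material -> quantity."""
        return {m: (recorded.get(m, 0), counted.get(m, 0))
                for m in set(recorded) | set(counted)
                if recorded.get(m, 0) != counted.get(m, 0)}

    # The cement example from the text: 805 bags on record, 801 bags actually counted.
    print(reconcile({"cement": 801}, {"cement": 805}))   # {'cement': (805, 801)}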
4. Goods reception and product storage
In the factory, warehouse materials are mainly used for daily production. The production department therefore needs to draw raw materials from the warehouse and, after the prefabricated parts are made, store the products in the warehouse. These in-plant production activities also change the factory's material inventory.
In the material management system of the embodiment, when a production department in a factory needs to receive materials in any warehouse, a receiving request is sent to a cloud server through a handheld terminal; the cloud server modifies the inventory database after responding to the request; and after the materials are received, the manager updates the third label of the corresponding warehouse through the handheld terminal.
When a production department in a factory needs to store produced products in any warehouse, a storage request is sent to a cloud server through a handheld terminal; the cloud server modifies the inventory database after responding to the request; and after the product is stored and taken, the manager updates the third label of the corresponding warehouse through the handheld terminal.
As can be seen from the above description, the material management system provided by this embodiment can handle all matters related to material management in the factory. Once the system is used, every link of goods entering and leaving the factory, as well as tasks such as material consumption and the addition of new materials, is handled online; during processing, the management system records the relevant data at any time and reports them to the cloud server to dynamically update the inventory database. This effectively solves the problems of low manual registration efficiency and complicated processing, and even with a large task volume it effectively avoids data omissions and errors.
The scheme provided by this embodiment offers a completely digital and paperless management method; the warehouse-in and warehouse-out processes of goods become more convenient, the efficiency of registration is greatly improved, and data acquisition is more timely and accurate. In addition, for counting the quantity of goods, a state-of-the-art machine vision technology is introduced and the network task is processed efficiently by the cloud server, greatly reducing the workload of the staff.
In the material management system provided by this embodiment, the first tag, the second tag, and the third tag may be RFID tags; and data reading and modification are carried out through a chip with an RFID reading and writing function. The handheld terminal adopts special electronic equipment with a network communication function or general electronic equipment with a communication function and a specific management program. The universal electronic equipment can be a mobile phone, a tablet personal computer, a smart band, a smart watch, smart glasses and other wearable equipment. The specific management program installed in the general electronic device refers to an application program which can log in an account and perform data interaction with a cloud server.
In addition, the third tag can also adopt a visual electronic tag with a communication function and a data modification function, the third electronic tag can be communicated with the cloud server, and the third tag regularly acquires the storage capacity information related to the third tag in the cloud server and autonomously updates the storage capacity recorded or displayed by the third tag.
Example 2
The embodiment provides a discrete material detection counting model; the overall framework of the network model is shown in fig. 4, and includes an image optimization module, a plurality of target detection modules designed based on YOLOv5 and performing corresponding training, and at least one output module.
The image optimization module is used for carrying out local brightness optimization on the input image to obtain an image with uniform pixel brightness distribution and outputting the image to the target detection module. The target detection module is used for identifying each prediction frame containing the identification target under different scale characteristic states and the corresponding confidence coefficient according to the input image. The output module processes the output of the target detection network by adopting a non-maximum suppression algorithm to obtain the number of target objects contained in the input image.
In actual processing, the image optimization module first performs a brightness analysis on the input image, then enhances the brightness of regions whose pixels are darker while leaving regions of suitable brightness untouched, thereby obtaining an image whose pixel brightness is uniformly distributed. This reduces the influence of the illumination environment at image-acquisition time on the result of the discrete material counting algorithm. The specific image processing flow is shown in Fig. 5 and comprises the following steps:
(1) Convert the original Image into a grey-scale image Grey.
(2) Expand the grey-scale image Grey to the channel count of the original image and multiply it by a preset brightness weight coefficient α to obtain the weight matrix W, namely: W = α*Grey.
Here "*" denotes element-wise multiplication of matrices; the brightness weight coefficient α ranges from 0 to 1, and a larger value means a stronger enhancement.
(3) Pixels whose value in the grey image Grey is less than 0.5 are judged to be dark pixels, and the dark pixels screened out of Grey are enhanced to obtain the corresponding Exposure image Exposure.
(4) The enhanced image Image′ is output as follows:
Image′ = Image*W + Exposure*(1−W)
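The following is a minimal NumPy/OpenCV sketch of steps (1)-(4), assuming images normalized to [0, 1]; the enhancement gain applied to dark pixels is an assumption, since the embodiment does not state how the Exposure image is produced.

```python
import cv2
import numpy as np

def optimize_brightness(image_bgr: np.ndarray, alpha: float = 0.6,
                        dark_threshold: float = 0.5, gain: float = 1.8) -> np.ndarray:
    """Local brightness optimization following steps (1)-(4) above.

    `gain` is an assumed enhancement factor for dark pixels; the embodiment only
    says that dark pixels are enhanced, not how.
    """
    img = image_bgr.astype(np.float32) / 255.0                              # normalize to [0, 1]
    grey = cv2.cvtColor(image_bgr, cv2.COLOR_BGR2GRAY).astype(np.float32) / 255.0

    # (2) expand the grey map to the original channel count and scale by alpha
    w = alpha * np.repeat(grey[:, :, None], img.shape[2], axis=2)

    # (3) enhance pixels whose grey value is below the dark threshold
    exposure = img.copy()
    dark = np.repeat((grey < dark_threshold)[:, :, None], img.shape[2], axis=2)
    exposure[dark] = np.clip(exposure[dark] * gain, 0.0, 1.0)

    # (4) blend: Image' = Image*W + Exposure*(1-W)
    out = img * w + exposure * (1.0 - w)
    return (np.clip(out, 0.0, 1.0) * 255).astype(np.uint8)
```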
In this embodiment, the framework of the target detection module is shown in Fig. 6 and comprises a feature extraction network, a feature fusion network and a prediction head. The feature extraction network extracts features from the input image to obtain feature maps at four different scales, namely 160 × 160, 80 × 80, 40 × 40 and 20 × 20. The feature fusion network fuses and reinforces the four feature maps along repeated bidirectional feature-fusion paths to obtain fused features at three different scales, 80 × 80, 40 × 40 and 20 × 20. The prediction head performs object detection and prediction-box regression separately on the three fused features and outputs the corresponding prediction boxes and confidences.
The network model of this embodiment uses the Focus module from the YOLOv5 network: the module samples every other pixel along each spatial dimension and stacks the resulting slices in the channel direction, finally outputting a feature map whose length and width are halved and whose channel count is quadrupled. This structure effectively improves processing speed on the GPU without compressing image information. In addition, the improved network model of this embodiment draws on the ideas of ResNet and DenseNet by adding multiple skip connections, and combines the RepVGG re-parameterization idea with depthwise separable convolution to optimize the CSPDarknet backbone. Experiments show that the backbone network converges quickly and has strong feature extraction capability.
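A minimal PyTorch sketch of this Focus-style slicing is given below; the 1*1 convolution that follows the slicing is an assumption borrowed from the public YOLOv5 implementation rather than a detail stated in this embodiment.

```python
import torch
import torch.nn as nn

class Focus(nn.Module):
    """Focus-style slicing: sample every other pixel in four phases and stack the
    slices along the channel axis, halving H and W and quadrupling C without
    discarding image information."""

    def __init__(self, in_channels: int, out_channels: int):
        super().__init__()
        self.conv = nn.Conv2d(in_channels * 4, out_channels, kernel_size=1)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # four interleaved sub-images, each (B, C, H/2, W/2)
        patches = [x[..., 0::2, 0::2], x[..., 1::2, 0::2],
                   x[..., 0::2, 1::2], x[..., 1::2, 1::2]]
        return self.conv(torch.cat(patches, dim=1))

# e.g. Focus(3, 32)(torch.randn(1, 3, 640, 640)).shape -> torch.Size([1, 32, 320, 320])
```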
The structure of the feature fusion network of this embodiment is shown in Fig. 7: it fuses and reinforces the four features of different sizes extracted by the feature extraction network and outputs three reinforced features of the corresponding sizes. In this embodiment, the large-scale feature maps, such as the 160 × 160 and 80 × 80 maps, contain more image detail information; this detail information lets the prediction head output more accurate object confidences, which helps the algorithm produce more accurate detection results. The small-scale feature map, such as the 20 × 20 map, contains more semantic information; this semantic information carries more object position information and high-level object features and helps the prediction head regress finer prediction boxes. Reinforced feature fusion therefore injects the image detail features of the large-size maps into the high-level semantics of the small-size maps, and likewise injects the high-level semantics of the small-size maps into the detail and texture information of the large-size maps.
In practical applications, the discrete material counting task involves targets that are dense, partially occluded, and generally small with weak features, while the large-size features contain more target information. A reinforced feature fusion network is therefore designed to make maximum use of the large-size features. The feature fusion network shown in Fig. 7 adds a newly designed repeated bidirectional feature-fusion path so that large-scale and small-scale features are passed to and fused with each other, which enlarges the receptive field of the large-scale features and gives the small-scale features more target texture detail. At the same time, skip connections between the original features and the reinforced features are added to prevent a layer from losing its own features when too many features from other sizes are fused in, and the features are fused by giving a higher weight to the large-scale features rather than by simple superposition.
The feature fusion network feeds the three fused and reinforced features into the prediction head, which splices them and outputs the result; the result comprises the object position information (four parameters) and the object confidence (one parameter). In this embodiment the prediction head is redesigned: as shown in Fig. 8, the object detection head and the prediction-box regression head are decoupled, and the two modules perform the object detection and prediction-box regression tasks independently.
The output of the prediction head contains many overlapping prediction boxes. The traditional non-maximum suppression (NMS) algorithm keeps only the box with the highest object confidence and classification score in each stacked region and directly deletes the remaining boxes; although this suppresses most overlapping boxes, it cannot guarantee that the box closest to the real box is kept. This embodiment therefore processes the prediction results with an improved non-maximum suppression algorithm: the three highest-scoring prediction boxes are kept, each is given a corresponding weight, and the weighted average of these three boxes is taken as the prediction box closest to the real box.
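Below is a minimal NumPy sketch of this modified suppression step; the IoU threshold and the confidence-proportional weighting are assumptions, since the embodiment only states that the top-three boxes are weighted and averaged.

```python
import numpy as np

def iou(box, boxes):
    """IoU between one box and an array of boxes, all in (x1, y1, x2, y2) form."""
    x1 = np.maximum(box[0], boxes[:, 0]); y1 = np.maximum(box[1], boxes[:, 1])
    x2 = np.minimum(box[2], boxes[:, 2]); y2 = np.minimum(box[3], boxes[:, 3])
    inter = np.clip(x2 - x1, 0, None) * np.clip(y2 - y1, 0, None)
    area = lambda b: (b[..., 2] - b[..., 0]) * (b[..., 3] - b[..., 1])
    return inter / (area(box) + area(boxes) - inter + 1e-9)

def top3_weighted_nms(boxes: np.ndarray, scores: np.ndarray, iou_thr: float = 0.5) -> list:
    """For each overlapping cluster keep the three highest-scoring boxes and
    output their score-weighted average instead of a single survivor."""
    order = np.argsort(scores)[::-1]
    kept = []
    while order.size > 0:
        head = order[0]
        overlaps = iou(boxes[head], boxes[order]) >= iou_thr
        cluster = order[overlaps][:3]                     # top-3 boxes of this stack
        w = scores[cluster] / scores[cluster].sum()       # weights from confidence
        kept.append((boxes[cluster] * w[:, None]).sum(axis=0))
        order = order[~overlaps]                          # drop the whole cluster
    return kept
```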
In general, the present embodiment improves upon the YOLOv 5-based target detection module as follows:
(1) Replace all CSP-Blocks in the feature extraction network of YOLOv5 with the newly designed CSP-DSC-Block.
As shown in Fig. 9, the CSP-DSC-Block splits the input features into two paths: one path is processed by a DSC-Block and the other by multiple Dense-Blocks; the two processing results are then passed through a 3*3 convolution and output.
(2) Replace all 3*3 convolution modules on the backbone network of YOLOv5 with Rep-Blocks, so that the Rep-Block can use a multi-branch structure during network training.
In this embodiment the 3*3 convolutions in the backbone network are replaced by RepBlocks. During training the RepBlock uses a multi-branch structure, which effectively avoids repeated gradients and lets the network converge quickly; at deployment time, a single-branch 3*3 convolution with re-parameterized weights is used for inference, which speeds up network inference while also gaining some accuracy.
(3) Introduce an attention mechanism module into the feature extraction network and the feature fusion network of YOLOv5; the attention mechanism module fuses channel attention and spatial attention.
The discrete material counting target detection module provided by this embodiment therefore includes an attention mechanism module. Because discrete materials such as steel bars and embedded parts in a prefabricated component factory are usually uniform in colour and stacked in dense piles, the image features extracted for the discrete material counting task in this scene are unevenly distributed both along the image channel direction and across the image plane. For this particular application scenario, the module designed in this embodiment to fuse channel attention and spatial attention is shown in Fig. 10.
The feature input dimension of the attention mechanism module is H × W × C, where H is the feature height, W the feature width and C the number of feature channels. First, global max pooling and global average pooling are applied to the input features and the results are spliced into a 1 × C dimensional feature; the channel attention weights are then obtained through a 1*1 convolution kernel and a Sigmoid function, and the dot product of the input features with the channel attention weights gives the H × W × C features after channel-attention fusion. Next, global max pooling and global average pooling are applied again and the resulting maps are spliced into an H × W × 2 feature map; finally, a 7*7 convolution and a Sigmoid function produce the H × W × C dimensional output features processed by the attention mechanism. A code sketch of this fused attention module is given after this list of improvements.
(4) Adjust the prediction head: the simplified prediction head decouples object-confidence judgement from prediction-box regression.
Specifically, the prediction head uses a 1*1 convolution to split the incoming feature map along the channel dimension into two feature maps of the same shape, applies a 3*3 convolution to each, and finally uses a 1*1 convolution for dimensionality reduction to obtain the confidence and the corresponding prediction-box result. The object-confidence branch outputs 1 feature channel, and the prediction-box regression branch outputs 4 feature channels.
(5) Since the output of the prediction head contains many overlapping prediction boxes, and the traditional non-maximum suppression (NMS) algorithm keeps only the highest object confidence and classification score in each stacked region while directly deleting the remaining boxes, the box closest to the real box may be lost even though most overlapping boxes are suppressed. The algorithm therefore adopts the improved non-maximum suppression described above: the three highest-scoring prediction boxes are kept, each is given a corresponding weight, and the weighted average of these three boxes yields the prediction box closest to the real box.
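The following PyTorch sketch follows the channel-then-spatial flow described for Fig. 10. Concatenating the two pooled channel vectors into a 2C-channel tensor before the 1*1 convolution is an interpretation of the splicing step; the 7*7 spatial kernel matches the text, and everything else is an assumption of this sketch.

```python
import torch
import torch.nn as nn

class FusedAttention(nn.Module):
    """Channel attention followed by spatial attention: pooled channel statistics
    -> 1x1 conv -> sigmoid, then pooled spatial maps -> 7x7 conv -> sigmoid."""

    def __init__(self, channels: int):
        super().__init__()
        # channel branch: max-pooled and avg-pooled vectors concatenated (2C -> C)
        self.channel_fc = nn.Conv2d(2 * channels, channels, kernel_size=1)
        # spatial branch: max map and mean map stacked into an H x W x 2 tensor
        self.spatial_conv = nn.Conv2d(2, 1, kernel_size=7, padding=3)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # --- channel attention ---
        pooled = torch.cat([torch.amax(x, dim=(2, 3), keepdim=True),
                            torch.mean(x, dim=(2, 3), keepdim=True)], dim=1)  # (B, 2C, 1, 1)
        ca = torch.sigmoid(self.channel_fc(pooled))                            # (B, C, 1, 1)
        x = x * ca
        # --- spatial attention ---
        spatial = torch.cat([torch.amax(x, dim=1, keepdim=True),
                             torch.mean(x, dim=1, keepdim=True)], dim=1)       # (B, 2, H, W)
        sa = torch.sigmoid(self.spatial_conv(spatial))                         # (B, 1, H, W)
        return x * sa
```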
The target detection module of this embodiment has to recognize several types of material, so the network model is trained separately for each material type and the corresponding network model parameters are maintained. In the application stage, the images to be recognized can be fed into the different network models for separate detection. Specifically, the training process of the network model is roughly as follows:
1. acquiring clear original images which meet the requirements of shooting angles and contain different materials, and preprocessing the original images to obtain an original data set; and the original data set is expanded through cutting and rotating operations.
2. Manually label the images in the original data set. The labelled objects are the discrete materials in the images, and the annotation information comprises: material type, location information and quantity information.
Meanwhile, each image in the original data set is stored together with its corresponding annotation information to obtain a new data set, which is randomly divided into a training set, a verification set and a test set according to a data proportion of 8:…
3. Perform multiple rounds of training on the constructed discrete material detection counting model with the training set, and verify the model with the verification set after each round. The loss values of the material detection counting network are obtained for the training stage and the verification stage respectively; the training process is stopped when the loss on the training set keeps decreasing from round to round while the loss on the verification set starts to increase. The five network models with the lowest loss values obtained during the training stage are saved.
4. Test the five saved network models with the test set, and take the network model with the highest mAP value in the test results as the final discrete material detection counting model.
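A schematic PyTorch training loop for steps 3-4 is given below; the data loaders, loss function, optimizer and epoch budget are assumptions, while the stopping rule and the retention of the five lowest-loss snapshots follow the description above.

```python
import heapq
import torch

def train_with_early_stop(model, train_loader, val_loader, loss_fn, optimizer,
                          max_epochs: int = 300, keep_best: int = 5):
    """Multi-round training: stop once training loss keeps falling while validation
    loss rises, and keep the five snapshots with the lowest training loss."""
    best = []                        # min-heap on negated loss: popping removes the worst snapshot
    prev_train, prev_val = float("inf"), float("inf")
    for epoch in range(max_epochs):
        model.train()
        train_loss = 0.0
        for images, targets in train_loader:
            optimizer.zero_grad()
            loss = loss_fn(model(images), targets)
            loss.backward()
            optimizer.step()
            train_loss += loss.item()
        model.eval()
        with torch.no_grad():
            val_loss = sum(loss_fn(model(i), t).item() for i, t in val_loader)

        snapshot = {k: v.clone() for k, v in model.state_dict().items()}
        heapq.heappush(best, (-train_loss, epoch, snapshot))
        if len(best) > keep_best:
            heapq.heappop(best)      # discard the snapshot with the highest loss

        if train_loss < prev_train and val_loss > prev_val:
            break                    # over-fitting signal described in step 3
        prev_train, prev_val = train_loss, val_loss
    # return snapshots ordered from lowest to highest training loss
    return [snap for _, _, snap in sorted(best, reverse=True)]
```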
In particular, in order to improve the training effect and the convergence rate of the network model, the present embodiment further adjusts the strategy in the training phase of the network model as follows:
(1) Adopting a Mosaic enhancement strategy to amplify the original data set; the specific processing process comprises the steps of splicing any four images, and transforming the spliced images in a rotating, cutting and scaling mode.
(2) In the network model training stage, the SimOTA dynamic label assignment method is adopted to select the positive samples.
If the same assignment scheme were used for all samples during training, some positive samples that are hard for the network to regress could be assigned, and such samples work against network training. SimOTA therefore uses a dynamic assignment scheme in which a cost matrix is computed after every assignment; the cost mainly considers three factors: a. the overlap between the real box and the feature point's prediction box; b. the agreement between the real box and the feature point's target confidence; c. whether the centre of the real box is too far from the feature point. Once the cost is obtained, the 10 samples with the lowest cost are selected as positive samples, and these 10 positive samples of the batch are used to regress the real box.
(3) The loss function redesigned to fit this task is calculated as follows (written out here in the standard YOLO form implied by the term definitions below); a code sketch of this loss, together with the distillation combination in (4), is given after this list:

Loss₁ = λ_coord · Σ_{i=1}^{S²} Σ_{j=1}^{B} I_ij^obj [(x_ij − x̂_ij)² + (y_ij − ŷ_ij)² + (w_ij − ŵ_ij)² + (h_ij − ĥ_ij)²]
      + λ_obj · Σ_{i=1}^{S²} Σ_{j=1}^{B} I_ij^obj (C_ij − Ĉ_ij)²
      + λ_noobj · Σ_{i=1}^{S²} Σ_{j=1}^{B} I_ij^noobj (C_ij − Ĉ_ij)²

In the above formula, λ_coord represents the weight of the prediction-box regression loss, λ_obj the weight of the positive-sample confidence prediction loss, and λ_noobj the weight of the negative-sample confidence prediction loss; S² represents the number of feature points, whose value is the product of the length and width of the feature map; B represents the number of prediction boxes and can be preset according to the task; C represents the predicted object confidence and Ĉ the real-label object confidence, where the subscripts i and j refer to the j-th prediction box at the i-th feature point and the confidence is the probability that an object is present, ranging from 0 to 1; x, y, w and h are the horizontal and vertical coordinates of the prediction-box centre and its width and height, and x̂, ŷ, ŵ and ĥ are the corresponding centre coordinates and width and height of the real label for the j-th prediction box at the i-th feature point; I_ij^obj and I_ij^noobj are a pair of flags indicating whether the feature point at (i, j) contains the target object: when the feature point in row i and column j contains an object, I_ij^obj is 1 and I_ij^noobj is 0; when the feature point in row i and column j does not contain an object, I_ij^obj is 0 and I_ij^noobj is 1.
(4) A knowledge distillation structure is adopted, with a teacher model set up to help the model converge quickly. The idea of this design is to use an existing, well-trained and high-performing network, called the "teacher model"; because the teacher's output is already close to that of an excellent neural network, the student model converges quickly. The prediction output of the teacher model is used as a soft label, and the loss computed between it and the student model's output is called the soft loss Loss_soft; the loss computed between the real label and the student model's output is called the hard loss Loss_hard. The overall distillation loss combines the two with their weights λ_hard and λ_soft:

Loss = λ_hard · Loss_hard + λ_soft · Loss_soft

where both Loss_hard and Loss_soft are calculated with the loss formula Loss₁ given in (3). When the student model's output becomes closer to the real label than the teacher model's output, knowledge distillation is stopped and only the student model is trained on its own; the training result of the student model is saved at the end, and only the student model is deployed for actual prediction. The loss for this student-only training stage reduces to the hard loss:

Loss = Loss_hard
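The sketch below implements the Loss₁ term and the distillation combination under the interpretation above; the tensor layout, the default λ values and the use of the same object mask for the soft loss are all assumptions of this sketch, not values fixed by the embodiment.

```python
import torch

def yolo_style_loss(pred: torch.Tensor, target: torch.Tensor, obj_mask: torch.Tensor,
                    lambda_coord: float = 5.0, lambda_obj: float = 1.0,
                    lambda_noobj: float = 0.5) -> torch.Tensor:
    """Loss_1 from item (3): box regression plus positive/negative confidence terms.

    pred, target: (S*S, B, 5) tensors holding (x, y, w, h, confidence);
    obj_mask:     (S*S, B) boolean tensor marking predictions matched to an object.
    The three lambda defaults are illustrative, not values given in the embodiment.
    """
    noobj_mask = ~obj_mask
    box_err = ((pred[..., :4] - target[..., :4]) ** 2).sum(dim=-1)  # (x, y, w, h) terms
    conf_err = (pred[..., 4] - target[..., 4]) ** 2                 # objectness term
    return (lambda_coord * (box_err * obj_mask).sum()
            + lambda_obj * (conf_err * obj_mask).sum()
            + lambda_noobj * (conf_err * noobj_mask).sum())

def distillation_loss(student_pred, teacher_pred, target, obj_mask,
                      lambda_hard: float = 0.7, lambda_soft: float = 0.3) -> torch.Tensor:
    """Loss = lambda_hard*Loss_hard + lambda_soft*Loss_soft, both computed with Loss_1.
    The two lambda values are assumptions; the embodiment leaves them unspecified."""
    hard = yolo_style_loss(student_pred, target, obj_mask)        # against real labels
    soft = yolo_style_loss(student_pred, teacher_pred, obj_mask)  # against teacher output
    return lambda_hard * hard + lambda_soft * soft
```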
performance testing
To demonstrate more fully the effectiveness of the discrete material counting model used in this embodiment, a simulation experiment on the performance of the algorithm model was also carried out. The experiment uses steel bars and steel pipes as the recognition objects, and a recognition scene of nested steel pipes (small-calibre pipes sleeved inside large-calibre pipes), which is considerably harder to recognize, was designed for the experiment.
1. The experiment compares how well various algorithms adapt to this recognition task, and the final algorithm gives the better results. Because the material images and the material distribution in this detection task are characterized by uniform appearance and dense aggregation, introducing an attention mechanism into the algorithm helps complete the detection task better. Specifically, the experimental parameters of the self-designed module fusing channel attention and spatial attention, and of the existing mainstream CBAM attention module, as applied to the network are shown in Table 1 below:
table 1: parameter setting of algorithm model
2. For the training stage of the network model, 486 pictures were collected: 200 steel bar end-face images from the DataFountain data set, 86 additional steel bar end-face images, and 200 images of other materials. Of these, 360 images form the training set (including the verification set) and 126 form the test set. The training process was completed on a test rig with a single RTX 3060 Ti.
3. Based on the same data set, this experiment compared the prediction accuracy (Precision) of this example with other typical algorithmic models, and the results are shown in table 2 below.
Table 2: prediction accuracy statistical table of algorithm model and other algorithm models in the example
In addition, the experiment compared how the prediction precision of the algorithm changes when the self-designed attention module and the existing channel-and-spatial attention module (CBAM) are placed at different positions in the algorithm; the test results are shown in Table 3.
Table 3: prediction accuracy change of algorithm
From the data in Tables 2 and 3, and based on the comprehensive experimental analysis, the best performance on this data set is obtained when the self-designed module is placed where the backbone network outputs the image features and the CBAM module is introduced at the position where the multi-scale image features are fused.
4. Based on the training and verification results of the constructed network model, the finally saved trained network model was also tested; the test result is shown as the PR curve of the algorithm in Fig. 11. The curve finally converges to 0, reflecting that the algorithm of this embodiment is highly applicable to the discrete material counting task. The trained network model can complete the discrete material counting task with high precision and recall; the high precision indicates that the algorithm effectively avoids false detections, and the high recall indicates that the network rarely misses detections.
From the above experiments it can be seen that the algorithm model finally trained in this experiment reaches an accuracy of 97.46% on the data set and can therefore meet the basic requirements of material detection in a production environment. This shows that the algorithm has accurate counting capability and can handle urgent material acceptance tasks. In addition, the prediction results also show that the algorithm model can solve the nested steel pipe detection problem that most material detection schemes cannot handle.
Example 3
This embodiment is a further optimization of the scheme in Embodiment 1. Specifically, the handheld terminal in this embodiment further includes a local identification module for nested pipe materials based on image recognition technology, which identifies and counts the pipes locally on the handheld terminal from the images acquired by the image acquisition module. The identification result of the local identification module is sent to the cloud server synchronously with the image acquired by the image acquisition module, and is used for cross-validation against the detection result of the discrete material detection counting model in the cloud server.
The pipe fitting detection method of the local identification module comprises the following steps:
(1) Detect and extract the edges in the image of the region corresponding to the pipe with the Canny algorithm; the edge extraction comprises four steps: a. apply Gaussian filtering to the image to smooth the edge contours and increase the edge width; b. compute the gradient magnitude and gradient direction of each pixel in the image; c. keep, by non-maximum suppression (NMS), only the points whose gradient magnitude is maximal along the pixel's gradient direction, so that every retained pixel is necessarily an edge point; d. smoothly extend the pixels retained in step c along the direction of the gradient normal vector, stopping the extension when the gradient of an extended point falls below a given threshold, to obtain the final contour.
(2) Compute the normal vectors of the edge tangents. After edge detection, the edges are smoothed and made coherent; then the inward normal vector of the tangent to the curve at each contour point is computed, and the normal-vector line is recorded.
(3) The intersection point of the normal-vector lines is the centre of a circular contour; normal-vector lines that do not intersect are judged to be noise from other shapes and are not counted.
(4) Taking each intersection point as a circle centre, line segments are radiated outward from the centre and matched against the curve contours; when a centre and a curve contour match, an object is considered to be detected.
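As a rough sketch of this local detector, the snippet below combines cv2.Canny edges with gradient normals and a Hough-style accumulator in which each edge pixel votes along its normal; the thresholds, vote radius and peak criterion are assumptions, and the contour-matching refinement of step (4) is omitted.

```python
import cv2
import numpy as np

def detect_pipe_centres(image_bgr: np.ndarray, canny_lo: int = 50, canny_hi: int = 150,
                        min_votes: int = 30):
    """Canny edges -> per-pixel gradient normals -> accumulator of normal-line votes.
    Peaks in the accumulator are returned as candidate circle (pipe) centres."""
    grey = cv2.cvtColor(image_bgr, cv2.COLOR_BGR2GRAY)
    grey = cv2.GaussianBlur(grey, (5, 5), 1.5)                 # step (1)a: smooth edges
    edges = cv2.Canny(grey, canny_lo, canny_hi)                # steps (1)b-d: edge map
    gx = cv2.Sobel(grey, cv2.CV_32F, 1, 0, ksize=3)            # step (2): gradient components
    gy = cv2.Sobel(grey, cv2.CV_32F, 0, 1, ksize=3)

    h, w = grey.shape
    acc = np.zeros((h, w), dtype=np.int32)
    ys, xs = np.nonzero(edges)
    for y, x in zip(ys, xs):
        norm = np.hypot(gx[y, x], gy[y, x])
        if norm < 1e-3:
            continue
        dx, dy = gx[y, x] / norm, gy[y, x] / norm              # unit normal of the edge tangent
        for r in range(5, 120):                                # march along the normal line
            cx, cy = int(round(x + dx * r)), int(round(y + dy * r))
            if 0 <= cx < w and 0 <= cy < h:
                acc[cy, cx] += 1                               # step (3): vote for the centre
    centres = np.argwhere(acc > min_votes)                     # step (4): strong intersections
    return [(int(c[1]), int(c[0])) for c in centres]
```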
The above description is only for the purpose of illustrating the preferred embodiments of the present invention and is not to be construed as limiting the invention, and any modifications, equivalents and improvements made within the spirit and principle of the present invention are intended to be included within the scope of the present invention.

Claims (10)

1. A material management system for a prefabricated component factory for digitally managing the warehousing and ex-warehouse of materials in the prefabricated component factory, the material management system comprising:
an electronic tag including a first tag and a second tag mounted on each transport vehicle; the first label is used for storing a warehousing list or a ex-warehouse list during vehicle distribution; the second label is used for storing the weighing result of the vehicle;
the net weight measuring device comprises a weighing mechanism and an electronic tag writing module; the electronic tag writing module is used for writing the weighing result of the weighing mechanism into a second tag in the vehicle;
the cloud server is internally provided with an inventory database and a discrete material detection counting model based on machine vision in operation; the inventory database is used for carrying out classified statistics on the inventory of each material in the factory according to the information of the factory after acceptance; the discrete material detection counting model is used for identifying and counting the number of the counting nominal materials in the conveyed materials according to the input material images at different angles; and
the handheld terminal is in communication connection with the cloud server and performs data interaction between the cloud server and the cloud server; the handheld terminal comprises an image acquisition module, an electronic tag reading and writing module and a display module; the image acquisition module is used for acquiring material images of materials entering and leaving a field; the electronic tag reading and writing module is used for reading tag information in any electronic tag or updating the tag information in any electronic tag;
the application process of the material management system is as follows:
1. vehicle warehousing stage:
the warehousing inventory or the ex-warehouse inventory is recorded into a first label of the transport vehicle in advance; reading the inventory information in the first tag of the vehicle through the handheld terminal, and uploading the inventory information to the cloud server to finish warehouse entry and exit pre-registration; meanwhile, the net weight measuring equipment writes the warehousing weighing result of the vehicle into a second tag in the vehicle; the cloud server matches the received warehousing list or ex-warehouse list with a warehousing database, and sends a designated unloading or loading warehouse position to the handheld terminal;
2. unloading or loading stage:
guiding the vehicle according to the designated warehouse location; collecting material images by adopting a handheld terminal in the loading and unloading process, and uploading the material images to a cloud server; the cloud server identifies and counts the actual measurement nominal weight of the counting nominal material through the discrete material detection counting model;
3. and (3) vehicle delivery stage:
the net weight measuring equipment writes the ex-warehouse weighing result of the vehicle into a second tag in the vehicle; the warehouse management personnel use the handheld terminal to scan a second label of the vehicle and synchronously send the read warehousing weighing result and the ex-warehouse weighing result to the cloud server; the cloud server calculates the net weight of the goods according to the warehousing weighing result and the ex-warehouse weighing result, and takes the net weight of the goods as the actual measurement standard weight of the weight nominal type material; and then the cloud server compares whether the nominal information of the materials in the warehousing list or the ex-warehouse list is matched with the actual measurement standard weight, if so, the vehicle is released through ex-warehouse verification, and the warehousing database is modified according to the warehousing list or the ex-warehouse list.
2. The material management system of a fabricated preform factory as set forth in claim 1, wherein: the electronic tags further comprise third tags mounted on respective warehouses; the third label is used for storing the material storage capacity data in each warehouse; after the vehicle is taken out of the warehouse, the handheld terminal receives the information in the modified inventory database in the cloud server and updates a third label corresponding to the warehouse;
when a production department in a factory needs to receive materials in any warehouse, a receiving request is sent to a cloud server through a handheld terminal; the cloud server modifies the inventory database after responding to the request; after the materials are received by the manager, updating a third label of the corresponding warehouse through the handheld terminal;
when a production department in a factory needs to store produced products in any warehouse, a storage request is sent to a cloud server through a handheld terminal; the cloud server modifies the inventory database after responding to the request; and after the product storage is finished, the manager updates the third label of the corresponding warehouse through the handheld terminal.
3. The material management system of a fabricated preform factory as set forth in claim 2, wherein: in the process of regularly checking the warehouse, checking personnel check and handle materials in the warehouse, then compare the material storage capacity data recorded in the third label in each warehouse with the actually checked data, report the data to the system when the difference occurs between the two, modify the inventory database in the cloud server after the management personnel with authority checks and sell the data, and then renew the third label in the corresponding warehouse through the handheld terminal.
4. The material management system of a fabricated preform factory as set forth in claim 1, wherein: the information in the warehousing list comprises: the method comprises the following steps of (1) supplying units of the materials in the current batch, information of transport vehicles, traceability information of the materials, material types, standard weights of the checked and accepted materials, units and departments for receiving the materials, and other remark information;
the information in the ex-warehouse list comprises: the receiving unit of the current batch of materials, the information of the transport vehicle, the traceability information of the materials, the types of the materials, the standard weight of the checked and accepted materials, the supply unit and department of the materials, and other remark information.
5. The material management system of an assembled preform factory as claimed in claim 1, wherein: the first tag, the second tag and the third tag are RFID tags; the handheld terminal adopts special electronic equipment with a network communication function or general electronic equipment with a communication function and a specific management program.
6. The material management system of a fabricated preform factory as set forth in claim 1, wherein: the constructed discrete material detection counting model comprises an image optimization module, a target detection module designed based on YOLOv5 and an output module; the image optimization module is used for carrying out local brightness optimization on the input image to obtain an image with uniform pixel brightness distribution and outputting the image to the target detection module; the target detection module is used for identifying each prediction frame containing an identification target and the corresponding confidence coefficient thereof under different scale characteristic states according to the input image; the output module processes the output of the target detection network by adopting an improved non-maximum suppression algorithm to obtain the number of target objects contained in the input image;
the target detection module comprises: the method comprises the following steps of (1) extracting a feature, fusing a feature and a measuring head; the feature extraction network is used for extracting features of an input image to obtain feature maps of four different scales, namely 160 × 160, 80 × 80, 40 × 40 and 20 × 20; the feature fusion network performs fusion reinforcement on feature graphs of four different scales according to repeated bidirectional feature fusion paths to obtain fusion features of three different scales, namely 80 × 80, 40 × 40 and 20 × 20; the prediction head is used for respectively carrying out object detection and prediction frame regression on the three fusion characteristics with different scales; and outputting the corresponding prediction box and the confidence level.
7. The material management system of a fabricated preform factory as set forth in claim 6, wherein: the optimization process of the image optimization module is as follows: (1) converting the original Image into a Grey-scale Image Grey;
(2) Expanding the gray level graph Grey to the number of original image channels, and multiplying by a preset brightness weight coefficient alpha to obtain a weight matrix W, namely: W = α*Grey;
wherein "*" represents element-wise multiplication of matrices; the value range of the brightness weight coefficient alpha is 0-1, and the larger the value is, the larger the enhancement degree is;
(3) Determining that the pixel value in the gray image Grey is less than 0.5 as a dark pixel, and performing pixel enhancement on the dark pixel screened from the gray image Grey to obtain a corresponding Exposure image Exposure;
(4) The enhanced Image' is output as follows:
Image′=Image*W+Exposure*(1-W)。
8. the material management system of a fabricated preform factory as set forth in claim 6, wherein: the improvement of the target detection module is as follows:
(1) Replacing all CSP-blocks in the characteristic extraction network of YOLOv5 with newly designed CSP-DSC-blocks;
the CSP-DSC-Block divides input characteristics into two paths, one path is subjected to DSC-Block processing, the other path is subjected to multiple Dense-Block processing, and then two paths of processing results are subjected to 3*3 convolution processing and then output;
(2) Replacing all 3*3 convolution modules on the backbone network in the YOLOv5 with Rep-Block; adopting a multi-branch structure for Rep-Block during network training;
(3) An attention mechanism module is introduced into a feature extraction network and a feature fusion network of YOLOv5, and the attention mechanism module adopts a function module for fusing channel attention and space attention;
the feature input dimension of the attention mechanism module is H x W x C, H is the feature height, W is the feature width, and C is the number of feature channels; firstly, global maximum pooling and global average pooling are respectively applied to the input features and the results are spliced into 1 × C dimensional features; then the channel attention weight is obtained through a 1*1 convolution kernel and a Sigmoid function; the dot product of the input features and the channel attention weight gives the H x W x C features after channel attention fusion; then global maximum pooling and global average pooling are applied again and the results are spliced into an H x W x 2 feature map, and finally the H x W x C dimensional output features processed by the attention mechanism are output through a 7*7 convolution and a sigmoid function;
(4) Adjusting the prediction head, wherein the simplified prediction head adopts a structure of decoupling object confidence judgment and prediction frame regression;
the method comprises the steps that a prediction head uses 1*1 convolution to split a feature map transmitted into the prediction head into two feature maps with the same shape on a channel, 3*3 convolution is superposed respectively, and finally 1*1 convolution is used for dimensionality reduction to obtain a confidence coefficient and a corresponding detection result of a prediction frame; the object confidence coefficient judgment feature output channel number is 1, and the prediction frame regression feature output channel number is 4;
(5) And (3) adopting an improved non-maximum inhibition algorithm, keeping the prediction frames with the first three scores, giving corresponding weights to the three prediction frames, and averaging the prediction frames closest to the real frame by the three prediction frames with the highest scores.
9. The material management system of a fabricated preform factory as set forth in claim 6, wherein: respectively training different types of materials in the discrete material detection counting model, and storing corresponding model parameters of the trained target detection module;
the model training process of each target detection module is improved as follows:
(1) Amplifying the original data set by adopting a Mosaic enhancement strategy; the specific processing process comprises the steps of splicing any four images, and transforming the spliced images in the modes of rotation, cutting and scaling;
(2) Selecting positive samples by adopting a SimOTA label dynamic allocation method in a network model training stage;
(3) The loss function calculation formula after the improved design of the adaptation task is as follows:
Loss₁ = λ_coord · Σ_{i=1}^{S²} Σ_{j=1}^{B} I_ij^obj [(x_ij − x̂_ij)² + (y_ij − ŷ_ij)² + (w_ij − ŵ_ij)² + (h_ij − ĥ_ij)²] + λ_obj · Σ_{i=1}^{S²} Σ_{j=1}^{B} I_ij^obj (C_ij − Ĉ_ij)² + λ_noobj · Σ_{i=1}^{S²} Σ_{j=1}^{B} I_ij^noobj (C_ij − Ĉ_ij)²;
in the above formula, λ_coord represents the weight of the prediction box regression loss, λ_obj represents the weight corresponding to the positive-sample confidence prediction loss, and λ_noobj represents the weight of the negative-sample confidence prediction loss; S² represents the number of feature points, whose value is the product of the length and width of the feature map; B represents the number of prediction boxes, which can be preset according to the task; x, y, w and h respectively represent the horizontal and vertical coordinates and the width and height of the centre of the prediction box, and x̂, ŷ, ŵ and ĥ respectively represent the horizontal and vertical coordinates and the width and height of the centre of the real label; C represents the object confidence prediction value and Ĉ represents the real label value of the object confidence; the subscripts i and j denote the j-th prediction box at the i-th feature point; the object confidence value represents the probability that the feature point contains an object, and ranges from 0 to 1; I_ij^obj and I_ij^noobj respectively represent a pair of flags indicating whether the feature point at (i, j) contains the target object: when the feature point in row i and column j contains an object, I_ij^obj is 1 and I_ij^noobj is 0; when the feature point in row i and column j does not contain an object, I_ij^obj is 0 and I_ij^noobj is 1;
(4) A knowledge distillation structure is adopted, and a teacher model is set to help the model converge quickly; the prediction result output by the teacher model is used as a soft label, the loss calculated from it and the student model output is called the soft loss Loss_soft, and the loss calculated from the real label and the student model output is called the hard loss Loss_hard; wherein the loss function is as follows:
Loss = λ_hard · Loss_hard + λ_soft · Loss_soft;
in the above loss function, Loss_hard and Loss_soft are both calculated with the loss function formula Loss₁ in (3), and λ_hard and λ_soft are the respective weights; when the output value of the student model is closer to the real label than the output value of the teacher model, the knowledge distillation training is stopped and only the student model is trained alone, with the training loss:
Loss = Loss_hard.
10. the material management system of a fabricated preform factory as set forth in claim 1, wherein: the handheld terminal also comprises a local identification module of the nested pipe fitting material based on the image identification technology, and the local identification module is used for identifying and counting the pipe fittings locally on the handheld terminal according to the image acquired by the image acquisition module; the identification result of the local identification module and the image acquired by the image acquisition module are synchronously sent to a cloud server; the identification result of the local identification module is used for carrying out cross validation on the detection result of the discrete material detection counting model in the cloud server;
the pipe fitting detection method of the local identification module comprises the following steps:
(1) Detecting and extracting the edge of an image of a corresponding area of the pipe fitting by adopting a Canny algorithm; the extraction process of the edge comprises four steps: a. performing Gaussian filtering on the image to enable the edge contour in the image to be smooth and increase the edge width; b. calculating gradient values and gradient directions of pixels in the image; c. only the point with the maximum gradient value along the gradient direction of the pixel is reserved through non-maximum suppression, and the reserved pixel is necessarily an edge; d. smoothly expanding the pixel points reserved in the step c along the direction of the normal vector of the gradient, and stopping the smooth expansion operation when the gradient of the expansion points is lower than a given threshold value to obtain a final contour;
(2) After edge detection, performing smooth coherent processing on the edge, calculating a tangent inward normal vector of a curve where the point on the contour is located, and recording a normal vector straight line;
(3) The intersection point of the normal vector straight lines is the center of a circle of the circular outline, and the normal vector straight lines which are not intersected are judged as noises with other shapes and are not calculated;
(4) Taking each intersection point as a circle centre, line segments are radiated outward from the centre and matched against the curve contours; when a centre and a curve contour match, an object is considered to be detected.
CN202211577533.7A 2022-12-09 2022-12-09 Material management system of assembled prefabricated part factory Active CN115600941B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202211577533.7A CN115600941B (en) 2022-12-09 2022-12-09 Material management system of assembled prefabricated part factory

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202211577533.7A CN115600941B (en) 2022-12-09 2022-12-09 Material management system of assembled prefabricated part factory

Publications (2)

Publication Number Publication Date
CN115600941A CN115600941A (en) 2023-01-13
CN115600941B true CN115600941B (en) 2023-03-21

Family

ID=84852574

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202211577533.7A Active CN115600941B (en) 2022-12-09 2022-12-09 Material management system of assembled prefabricated part factory

Country Status (1)

Country Link
CN (1) CN115600941B (en)

Families Citing this family (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN116958907B (en) * 2023-09-18 2023-12-26 四川泓宝润业工程技术有限公司 Method and system for inspecting surrounding hidden danger targets of gas pipeline
CN117456473B (en) * 2023-12-25 2024-03-29 杭州吉利汽车数字科技有限公司 Vehicle assembly detection method, device, equipment and storage medium
CN117853823B (en) * 2024-03-04 2024-05-14 朗峰新材料启东有限公司 Foreign matter detection method and system for assisting wireless charging of new energy automobile

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108921461A (en) * 2018-05-16 2018-11-30 天津科技大学 Warehousing system and control method based on RFID technique
CN111091320A (en) * 2019-11-26 2020-05-01 国网辽宁省电力有限公司朝阳供电公司 Intelligent unattended warehouse management system and method based on acousto-optic positioning

Family Cites Families (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8055377B2 (en) * 2004-09-22 2011-11-08 Sap Aktiengeselleschaft Online controlled picking in a warehouse
CA2565408C (en) * 2005-10-25 2013-01-08 Mckesson Automation Systems, Inc. Adaptive interface for product dispensing systems
CN105005886A (en) * 2015-08-18 2015-10-28 青岛华高软件科技有限公司 Grain storage intelligent electronic information management device
CN105775510A (en) * 2016-05-09 2016-07-20 徐洪军 Medical waste management system based on RFID technology
US10217074B1 (en) * 2017-08-29 2019-02-26 Amazon Technologies, Inc. Modular automated inventory sorting and retrieving
CN109712277B (en) * 2018-11-20 2021-06-29 中信梧桐港供应链管理有限公司 Intelligent warehouse storage and transportation management system based on Internet of things
CN114140049A (en) * 2021-11-29 2022-03-04 广州权馨信息科技有限公司 Data processing method and system for multi-user warehouse management
CN115439070A (en) * 2022-10-13 2022-12-06 南京维拓科技股份有限公司 Method for realizing self-service material receiving based on storage digitization

Patent Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108921461A (en) * 2018-05-16 2018-11-30 天津科技大学 Warehousing system and control method based on RFID technique
CN111091320A (en) * 2019-11-26 2020-05-01 国网辽宁省电力有限公司朝阳供电公司 Intelligent unattended warehouse management system and method based on acousto-optic positioning

Also Published As

Publication number Publication date
CN115600941A (en) 2023-01-13

Similar Documents

Publication Publication Date Title
CN115600941B (en) Material management system of assembled prefabricated part factory
US10878363B2 (en) Inland freight management
US20220084186A1 (en) Automated inspection system and associated method for assessing the condition of shipping containers
CN109711717A (en) Intelligent container port port management system
Kopytov et al. Multiple-criteria analysis and choice of transportation alternatives in multimodal freight transport system
AU2008282178B2 (en) Transportation management system
Dotoli et al. A decision support system for optimizing operations at intermodal railroad terminals
CN114331284A (en) Intelligent warehousing service management system based on cloud computing
Zolkin et al. Application of the modern information technologies for design and monitoring of business processes of transport and logistics system
CN109064085B (en) Construction site material management method
US8131584B2 (en) Gateway balancing
CN116523270B (en) Logistics transportation task automatic scheduling method, equipment, server and medium
US9626638B2 (en) Method and device for assigning surplus slabs in the slab yard before hot rolling process
CN114897359A (en) Finished automobile transport capacity scheduling method under multidimensional data
CN115545369A (en) Automated quayside container bridge resource planning decision-making method, terminal and medium
CN114066055A (en) Method, device and server for predicting late-stage approach of vehicle in logistics transportation
JP5434204B2 (en) Product transfer work amount prediction apparatus, product transfer work amount prediction method, and computer program
CN111626666A (en) Distributed storage yard mode
CN113984083B (en) Scrap steel warehouse-in navigation method and system
CN115380298A (en) System for determining the transport route and/or position of goods and triggering an automated operation
CN113869647A (en) Intelligent allocation and transportation system for coal yard vehicles
CN117764493A (en) Coal supervisory systems
Shahbudinbhai Design and Analysis of Iot Ecosystem for Freight Management System
CN116229131A (en) Goods sorting method, device, system and storage medium
CN116108964A (en) Intelligent logistics method and system for irregular industrial products based on shadow price

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant