CN114821015A - Goods placement control method and device, computer equipment and storage medium - Google Patents

Goods placement control method and device, computer equipment and storage medium

Info

Publication number
CN114821015A
CN114821015A (application CN202210581504.1A)
Authority
CN
China
Prior art keywords
goods
library
cargo
state
placement
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202210581504.1A
Other languages
Chinese (zh)
Inventor
李汉邦
李陆洋
方牧
鲁豫杰
杨秉川
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Visionnav Robotics Shenzhen Co Ltd
Original Assignee
Visionnav Robotics Shenzhen Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Visionnav Robotics Shenzhen Co Ltd filed Critical Visionnav Robotics Shenzhen Co Ltd
Priority to CN202210581504.1A priority Critical patent/CN114821015A/en
Publication of CN114821015A publication Critical patent/CN114821015A/en
Pending legal-status Critical Current

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00 Arrangements for image or video recognition or understanding
    • G06V10/10 Image acquisition
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06N COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00 Computing arrangements based on biological models
    • G06N3/02 Neural networks
    • G06N3/04 Architecture, e.g. interconnection topology
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06Q INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q10/00 Administration; Management
    • G06Q10/08 Logistics, e.g. warehousing, loading or distribution; Inventory or stock management
    • G06Q10/087 Inventory or stock management, e.g. order filling, procurement or balancing against orders
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/10 Segmentation; Edge detection
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/80 Analysis of captured images to determine intrinsic or extrinsic camera parameters, i.e. camera calibration
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00 Arrangements for image or video recognition or understanding
    • G06V10/20 Image preprocessing
    • G06V10/22 Image preprocessing by selection of a specific region containing or referencing a pattern; Locating or processing of specific regions to guide the detection or recognition
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00 Arrangements for image or video recognition or understanding
    • G06V10/70 Arrangements for image or video recognition or understanding using pattern recognition or machine learning
    • G06V10/764 Arrangements for image or video recognition or understanding using pattern recognition or machine learning using classification, e.g. of video objects
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00 Arrangements for image or video recognition or understanding
    • G06V10/70 Arrangements for image or video recognition or understanding using pattern recognition or machine learning
    • G06V10/77 Processing image or video features in feature spaces; using data integration or data reduction, e.g. principal component analysis [PCA] or independent component analysis [ICA] or self-organising maps [SOM]; Blind source separation
    • G06V10/774 Generating sets of training patterns; Bootstrap methods, e.g. bagging or boosting
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/20 Special algorithmic details
    • G06T2207/20081 Training; Learning
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/20 Special algorithmic details
    • G06T2207/20084 Artificial neural networks [ANN]

Abstract

The present application relates to a goods placement control method and device, a computer device, and a storage medium. The method includes: acquiring a storage location image obtained by capturing an image of one or more storage locations, the image containing at least one storage location object, where the storage location represented by a storage location object is used for storing goods; locating and detecting the storage location objects in the storage location image; performing goods placement state recognition on each storage location object to obtain the goods placement state of the storage location it represents, the goods placement state indicating whether goods are present on that storage location; and performing goods placement control processing on the handling equipment according to the goods placement state. By recognizing the goods placement state of the storage location objects in the image, whether goods are present on the corresponding storage locations can be determined automatically by a program, so that the handling equipment is controlled to place goods in time and goods placement efficiency is improved.

Description

Goods placement control method and device, computer equipment and storage medium
Technical Field
The present application relates to the field of logistics technologies, and in particular to a goods placement control method and device, a computer device, and a storage medium.
Background
With the development of the logistics industry, the quantity of goods keeps growing and the demand for goods placement grows with it. At present, handling equipment places goods on the storage locations of a loading platform, and the goods on those storage locations are then carried into the carriage manually. Only after all the goods placed on a storage location have been taken away can the handling equipment place the next goods on that storage location.
However, the handling equipment itself cannot determine whether all the goods on a storage location have been taken away, so it cannot place goods in time, which reduces goods placement efficiency.
Disclosure of Invention
In view of the above, it is necessary to provide a goods placement control method, device, computer device, computer-readable storage medium, and computer program product capable of improving goods placement efficiency.
In a first aspect, the present application provides a goods placement control method. The method includes:
acquiring a storage location image obtained by capturing an image of one or more storage locations; the storage location image contains at least one storage location object, and the storage location represented by a storage location object is used for storing goods;
locating and detecting the storage location object in the storage location image;
performing goods placement state recognition on the storage location object to obtain the goods placement state of the storage location represented by the storage location object; the goods placement state indicates whether goods are present on that storage location;
and performing goods placement control processing on the handling equipment according to the goods placement state.
In a second aspect, the present application further provides a goods placement control device. The device includes:
an image acquisition module, configured to acquire a storage location image obtained by image capture; the storage location image contains at least one storage location object, and the storage location represented by a storage location object is used for storing goods;
a locating and detection module, configured to locate and detect the storage location object in the storage location image;
a state recognition module, configured to perform goods placement state recognition on the storage location object to obtain the goods placement state of the storage location represented by the storage location object; the goods placement state indicates whether goods are present on that storage location;
and a placement control module, configured to perform goods placement control processing on the handling equipment according to the goods placement state.
In some embodiments, the storage location image is captured by an image acquisition device, and the goods placement control device further includes a device installation module configured to calculate the field of view of the image acquisition device according to the maximum cargo height, the device parameters of the image acquisition device, and a preset installation position, and to install the image acquisition device at the preset installation position if the goods on the target storage locations can be fully displayed within the field of view; the target storage locations are the storage locations represented by the storage location objects in the storage location image.
In some embodiments, the locating and detection module is further configured to obtain a pre-configuration file and read the storage location information of the storage location object from it; the storage location information is obtained by calibrating a sample storage location image, which was captured in advance by the image acquisition device for the storage locations represented by the storage location objects; locating and detection are then performed on the storage location image according to the storage location information to obtain the storage location object in the image.
In some embodiments, the goods placement state is obtained by performing goods placement state recognition on the storage location object through a goods state recognition engine; the goods state recognition engine is converted from a goods state recognition model trained by deep learning, and its detection and inference speed is higher than that of the goods state recognition model.
In some embodiments, the placement control module is further configured to generate a goods operation signal if the goods placement state is the no-goods state, so that the handling equipment places goods on the storage location represented by the storage location object according to the goods operation signal.
In some embodiments, the placement control module is further configured to generate a notification indicating that the storage location represented by the storage location object holds goods if the goods placement state is the goods-present state.
In some embodiments, the placement control module is further configured to, when the goods placement state of the storage locations represented by a plurality of storage location objects is the no-goods state, generate the next goods operation signal after the handling equipment has placed goods on the previous no-goods storage location, so as to control the handling equipment to place goods on the next no-goods storage location based on that signal.
In a third aspect, the present application further provides a computer device. The computer device includes a memory and a processor; the memory stores a computer program, and the processor implements the steps of the above goods placement control method when executing the computer program.
In a fourth aspect, the present application further provides a computer-readable storage medium on which a computer program is stored; when the computer program is executed by a processor, the steps of the above goods placement control method are implemented.
In a fifth aspect, the present application further provides a computer program product. The computer program product includes a computer program which, when executed by a processor, implements the steps of the above goods placement control method.
According to the goods placement control method and device, the computer device, the storage medium, and the computer program product, a storage location image obtained by capturing one or more storage locations is acquired; the image contains at least one storage location object, and the storage location represented by a storage location object is used for storing goods; the storage location object in the image is located and detected; goods placement state recognition is performed on the storage location object to obtain the goods placement state of the storage location it represents, the state indicating whether goods are present on that storage location; and goods placement control processing is performed on the handling equipment according to the goods placement state. By performing goods placement state recognition on the storage location objects in the image, whether goods are present on the corresponding storage locations can be determined automatically by a program, so that the handling equipment is controlled to place goods in time and goods placement efficiency is improved.
Drawings
FIG. 1 is a schematic diagram of an application environment of a goods placement control method in some embodiments;
FIG. 2 is a schematic flow chart of a goods placement control method in some embodiments;
FIG. 3 is a schematic view of the installation position of the image acquisition device in some embodiments;
FIG. 4 is a schematic flow chart of a goods placement control method in further embodiments;
FIG. 5 is a block diagram of a goods placement control device in some embodiments;
FIG. 6 is a diagram of the internal structure of a computer device in some embodiments.
Detailed Description
In order to make the objects, technical solutions and advantages of the present application more apparent, the present application is described in further detail below with reference to the accompanying drawings and embodiments. It should be understood that the specific embodiments described herein are merely illustrative of the present application and are not intended to limit the present application.
The goods placement control method provided by the embodiments of the present application can be applied in the application environment shown in FIG. 1. The image acquisition device 102 used for capturing images and the configured server 104 need to be connected to the same network so that the server 104 can access every image acquisition device 102, and the server 104 communicates with the image acquisition device 102 over that network. The image acquisition device 102 captures images of the storage locations used for storing goods to obtain a storage location image, and may send the captured image to the server 104 for goods placement control. Specifically, after obtaining the storage location image, the server 104 locates and detects the storage location objects in the image and performs goods placement state recognition on them to obtain the goods placement state of the storage locations they represent, the state indicating whether goods are present on those storage locations. The server 104 also performs goods placement control processing on the handling equipment according to the goods placement state. There may be one or more image acquisition devices 102, and the server 104 may be implemented as an independent server or as a server cluster composed of multiple servers.
In some embodiments, as shown in FIG. 2, a goods placement control method is provided. It may be executed by a server, or implemented through interaction between the server and an image acquisition device; the embodiments of the present application do not limit this. Taking the method applied to the server in FIG. 1 as an example, it includes the following steps:
step 202, a library bitmap obtained by collecting images for the library positions is obtained.
Wherein, the storage position refers to the specific storage position of the goods in the warehouse. It can be understood that the shape of the library site may be rectangular, or may be other shapes, and the specific shape of the library site is not limited in the present application.
The library bitmap is obtained by carrying out image acquisition on one or more library positions through image acquisition equipment, the number of library position objects in the library bitmap is at least one, and the library position objects are used for representing the library positions in the library bitmap. It is understood that the library space represented by the library space object is used for storing goods. It is understood that the library bitmap is not limited to include only the image content of the library bit, and may include the image content of other position areas, as long as the image content of the library bit is ensured in the library bitmap.
The image capturing device refers to a device with a photographing function, and may be, but is not limited to, at least one of various cameras and mobile devices.
The other position areas may include a safety area, where the safety area refers to an area around the storage space for warning, and for example, whether people or transportation equipment are included in the safety area may be identified, the number of people or transportation equipment in the safety area may also be identified, and if it is identified that people or transportation equipment enter the set safety area, a warning may also be generated. It should be noted that the shape of the safety region may be a polygon, and the number, area and position of the specific vertices of the safety region may be set according to actual requirements.
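As an illustration only, the following is a minimal sketch (Python with OpenCV) of how such a polygonal safety-area check could be implemented; the polygon vertices and detection coordinates are assumed values, not taken from the application.
    import cv2
    import numpy as np

    # Illustrative safety-area polygon given as pixel coordinates in the storage location image.
    safety_area = np.array([[100, 80], [620, 80], [640, 420], [90, 440]], dtype=np.int32)

    def in_safety_area(center_xy):
        # pointPolygonTest returns +1 inside, 0 on the edge, -1 outside when measureDist=False.
        return cv2.pointPolygonTest(safety_area, (float(center_xy[0]), float(center_xy[1])), False) >= 0

    # Example: centers of detected persons/forklifts from the recognition step (hypothetical values).
    detections = [(300, 200), (900, 500)]
    intruders = [c for c in detections if in_safety_area(c)]
    if intruders:
        print(f"warning: {len(intruders)} person(s)/vehicle(s) inside the safety area")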
Specifically, the image acquisition device captures an image of the storage locations used for storing goods to obtain the storage location image. The server then acquires this image for the subsequent locating and detection of the storage location objects.
Step 204: locate and detect the storage location object in the storage location image.
Specifically, the server can identify, from the storage location image, the storage location objects whose goods placement state needs to be recognized. It can be understood that these may be all or only part of the storage location objects in the image.
In some embodiments, the server may locate the storage location objects in the image by means of pre-configured storage location information obtained by calibrating the storage locations.
In other embodiments, the server may automatically perform image segmentation on the storage location image to locate and detect the storage location objects. It can be understood that the present application does not limit the specific implementation of how the storage location objects are located in the image.
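One possible, purely illustrative realization of the segmentation-based variant (Python with OpenCV), assuming the storage locations are marked by high-contrast floor lines; the file name, thresholds, and area filter below are assumptions rather than values from the application.
    import cv2

    # Read the storage location image and segment bright floor markings (illustrative thresholds).
    image = cv2.imread("storage_location_image.jpg")
    gray = cv2.cvtColor(image, cv2.COLOR_BGR2GRAY)
    _, mask = cv2.threshold(gray, 200, 255, cv2.THRESH_BINARY)
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)

    # Keep large regions as candidate storage location objects (heuristic area filter).
    slot_boxes = []
    for contour in contours:
        x, y, w, h = cv2.boundingRect(contour)
        if w * h > 5000:
            slot_boxes.append((x, y, w, h))
    print("candidate storage location objects:", slot_boxes)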
Step 206: perform goods placement state recognition on the storage location object to obtain the goods placement state of the storage location represented by the storage location object.
Goods placement state recognition means recognizing whether goods are present on the storage location represented by the storage location object, and the goods placement state indicates whether that storage location holds goods or not.
In some embodiments, the server may perform goods placement state recognition on the storage location object directly through a pre-trained goods state recognition model, so as to determine whether goods are present on the corresponding storage location.
In other embodiments, the server may first optimize the goods state recognition model for inference and then perform goods placement state recognition on the storage location object to determine whether goods are present on the corresponding storage location.
The goods state recognition model is a neural network model trained by a deep learning algorithm and used for recognizing the placement state of goods. It can be understood that the goods state recognition model may be a multi-class model; that is, it is not limited to detecting and recognizing the placement state of goods, but can also recognize objects such as people and forklifts.
Deep learning refers to learning the intrinsic rules and representation levels of sample data; through this learning process, data such as text, images, and sound can be recognized.
Step 208: perform goods placement control processing on the handling equipment according to the goods placement state.
The handling equipment is equipment used for transporting and placing goods, and may include at least one of an Automated Guided Vehicle (AGV) and a forklift.
Specifically, the server determines from the goods placement state whether the storage location represented by the storage location object holds goods, and performs goods placement control processing on the handling equipment for the goods-present and no-goods cases.
In some embodiments, if the server determines that the goods placement state of the storage location represented by the storage location object is the no-goods state, it controls the handling equipment to place goods onto that storage location.
In other embodiments, if the server determines that the goods placement state of the storage location represented by the storage location object is the goods-present state, it reports that the corresponding storage location holds goods.
In the goods placement control method above, a storage location image obtained by capturing one or more storage locations is acquired; the image contains at least one storage location object, and the storage location represented by a storage location object is used for storing goods; the storage location object in the image is located and detected; goods placement state recognition is performed on the storage location object to obtain the goods placement state of the storage location it represents, the state indicating whether goods are present on that storage location; and goods placement control processing is performed on the handling equipment according to the goods placement state. By performing goods placement state recognition on the storage location objects in the image, whether goods are present on the corresponding storage locations can be determined automatically by a program, so that the handling equipment is controlled to place goods in time and goods placement efficiency is improved.
In some embodiments, the storage location image is captured by an image acquisition device, and before step 202 the goods placement control method further includes a step of installing the image acquisition device. This step includes: calculating the field of view of the image acquisition device according to the maximum cargo height, the device parameters of the image acquisition device, and a preset installation position; and installing the image acquisition device at the preset installation position if the goods on the target storage locations can be fully displayed within the field of view.
The target storage locations are the storage locations represented by the storage location objects in the storage location image, i.e., the real-world storage locations used for storing goods.
The maximum cargo height is the maximum height to which goods can be stacked on a storage location.
The device parameters of the image acquisition device include, but are not limited to, its focal length, for example 2.8 mm, 4 mm, or 6 mm.
The preset installation position includes, but is not limited to, at least one of a preset installation height and a preset installation angle of the image acquisition device.
Specifically, the horizontal viewing angle and vertical viewing angle corresponding to a given focal length can be determined directly from the focal length of the image acquisition device; for example, an image acquisition device with a 4 mm focal length has a horizontal viewing angle of 70 degrees and a vertical viewing angle of 40 degrees. From the maximum cargo height, the horizontal viewing angle, the vertical viewing angle, and the preset installation height, the view length and view width of the image acquisition device at the preset installation height can be calculated, and the field of view at that height can then be calculated from the view length and view width. The view length and view width are given by formula (1) and formula (2):
L1=2*(H-h)*tan(A/2) (1)
L2=2*(H-h)*tan(B/2) (2)
where L1 is the view length, L2 is the view width, H is the preset installation height, h is the maximum cargo height, A is the horizontal viewing angle, and B is the vertical viewing angle.
After the view length and view width are calculated by formula (1) and formula (2), they are multiplied to obtain the field of view visible to the image acquisition device at the preset installation height.
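A small numerical sketch of formulas (1) and (2) in Python; the 5 m installation height and 1.8 m maximum cargo height used here are assumed values for illustration, while the 70/40 degree viewing angles follow the 4 mm example above.
    import math

    def field_of_view(install_height_m, max_cargo_height_m, horiz_angle_deg, vert_angle_deg):
        # Formula (1): L1 = 2*(H-h)*tan(A/2); formula (2): L2 = 2*(H-h)*tan(B/2)
        d = install_height_m - max_cargo_height_m
        view_length = 2 * d * math.tan(math.radians(horiz_angle_deg) / 2)
        view_width = 2 * d * math.tan(math.radians(vert_angle_deg) / 2)
        return view_length, view_width, view_length * view_width

    # Assumed example: 4 mm lens (70 x 40 degrees), mounted at 5 m, goods stacked up to 1.8 m.
    L1, L2, area = field_of_view(5.0, 1.8, 70.0, 40.0)
    print(f"view length {L1:.2f} m, view width {L2:.2f} m, field of view {area:.2f} m^2")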
In some embodiments, after the field of view is calculated, the image acquisition device is installed at the preset installation position if the goods on the target storage locations can be fully displayed within the field of view. Specifically, the field of view may be compared with a preset field of view; if it is greater than or equal to the preset field of view, the goods on the target storage locations are determined to be fully displayable within the field of view. The preset field of view can be determined from the spatial size of the target storage locations, specifically from their storage location area.
In practical application, as shown in FIG. 3, suppose the target storage location is a single storage location on which only two goods can be placed. The visible area of the image acquisition device at the preset installation height, i.e., the field of view, is calculated from the maximum cargo height h, the preset installation height H, and the horizontal and vertical viewing angles of the image acquisition device. If the preset field of view is set to the storage location area of the target storage location, and the calculated field of view equals the preset field of view, the image acquisition device can fully see all the goods on the target storage location, i.e., the two goods in FIG. 3, within its field of view. In that case the preset installation height is set reasonably, and the image acquisition device can be installed directly at the preset installation height H.
In some embodiments, if the goods on the target storage locations cannot be fully displayed within the field of view, the preset installation position is adjusted until the goods on the target storage locations can be fully displayed within the field of view obtained at the adjusted position, and the image acquisition device is installed at the adjusted preset installation position.
Specifically, if the goods on the target storage locations cannot be fully displayed within the field of view, the field of view of the image acquisition device is smaller than the preset field of view and the preset installation position is unreasonable. In this case the preset installation position needs to be adjusted: the preset installation height and preset installation angle of the image acquisition device can be changed, and the device is installed at the adjusted height and angle so that it can fully see all the goods on the target storage locations within its field of view.
It can be understood that, owing to the precision of lens manufacturing, the storage location images captured by the image acquisition device may be distorted to varying degrees. To improve the recognition effect, the image acquisition device can be de-distorted by calibrating its intrinsic parameters before it is installed.
Specifically, the image acquisition device captures an image of a calibration plate to obtain a calibration plate image. The calibration plate image is imported into calibration software, which calibrates it to obtain the intrinsic parameters and distortion parameters of the image acquisition device. These intrinsic and distortion parameters are written into the configuration file of the image acquisition device, completing the de-distortion process.
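For illustration, a minimal intrinsic-calibration sketch using OpenCV and a chessboard calibration plate; the 9x6 board, 25 mm square size, and file paths are assumptions and not part of the application.
    import glob
    import cv2
    import numpy as np

    # Prepare the 3D coordinates of the chessboard corners (assumed 9x6 board, 25 mm squares).
    board_size, square = (9, 6), 0.025
    objp = np.zeros((board_size[0] * board_size[1], 3), np.float32)
    objp[:, :2] = np.mgrid[0:board_size[0], 0:board_size[1]].T.reshape(-1, 2) * square

    obj_points, img_points, image_size = [], [], None
    for path in glob.glob("calibration/*.jpg"):
        gray = cv2.cvtColor(cv2.imread(path), cv2.COLOR_BGR2GRAY)
        image_size = gray.shape[::-1]
        found, corners = cv2.findChessboardCorners(gray, board_size)
        if found:
            obj_points.append(objp)
            img_points.append(corners)

    # The intrinsic matrix and distortion coefficients are then written into the camera configuration;
    # captured images can later be de-distorted with cv2.undistort before recognition.
    _, camera_matrix, dist_coeffs, _, _ = cv2.calibrateCamera(obj_points, img_points, image_size, None, None)
    print("intrinsics:\n", camera_matrix, "\ndistortion:", dist_coeffs.ravel())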
In practical application, the image acquisition device can be mounted with a fixture directly above or obliquely above the loading platform; the installation position must ensure that the area covering all the storage locations is within view and all goods can be seen. Once the coverage is determined, the number of image acquisition devices that need to be installed can be calculated. Each image acquisition device should cover as many storage locations as possible so as to reduce the number of devices installed; in this way the goods placement states of multiple storage locations can be recognized automatically from the storage location image captured by a single device, and the installation cost of the image acquisition devices is also reduced.
In some embodiments, step 204 specifically includes, but is not limited to: obtaining a pre-configuration file and reading the storage location information of the storage location object from it; and performing locating and detection on the storage location image according to the storage location information to obtain the storage location object in the image.
The storage location information is obtained by calibrating a sample storage location image, which was captured in advance by the image acquisition device for the storage locations represented by the storage location objects.
The pre-configuration file is used to set the storage location information of the storage location objects and to draw the storage location objects in the storage location image captured by the image acquisition device; it describes the specific positions of the storage location objects in the image.
It can be understood that the storage location information includes, but is not limited to, the spatial size of a storage location (e.g., its length and width), the storage location number, and the specific position of the storage location object in the image. The specific position refers to the pixel coordinates of each vertex of the storage location represented by the storage location object in the pixel coordinate system.
Specifically, the server obtains the pre-configuration file and reads from it the specific position of the storage location object in the image, i.e., the recorded pixel coordinates of each vertex of the storage location in the pixel coordinate system. The server then performs locating and detection on the storage location image according to these vertex pixel coordinates, so that the storage location object can be located in the image.
In practical application, the pre-configuration file can be produced as follows: several storage location images of multiple storage locations are captured by the image acquisition device, making sure that no image is duplicated, and a storage location area is set in the configuration file of the goods state recognition program. Specifically, the length, width, and storage location number of each storage location in the area (the numbers correspond one-to-one with the numbers of the actual storage locations) and the coordinate points of each storage location in the pixel coordinate system are entered into the configuration file to generate the pre-configuration file. It should be noted that the storage location area contains all the storage locations, and that the goods state recognition program is an executable program integrated with a TensorRT environment (i.e., with the goods state recognition engine deployed) that can detect and recognize target objects; for example, when goods are present on a storage location, the program highlights this with a detection box in its recognition interface.
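A hypothetical example of what such a pre-configuration file and its use could look like in Python; the field names, storage location numbers, and pixel coordinates are illustrative assumptions, not the actual file format of the application.
    import json
    import numpy as np

    # Hypothetical pre-configuration: number, size, and vertex pixel coordinates per storage location.
    preconfig = json.loads("""
    {
      "storage_locations": [
        {"number": "A-01", "length_m": 1.2, "width_m": 1.0,
         "pixel_vertices": [[120, 90], [600, 95], [605, 400], [115, 395]]},
        {"number": "A-02", "length_m": 1.2, "width_m": 1.0,
         "pixel_vertices": [[650, 95], [1130, 100], [1135, 405], [645, 400]]}
      ]
    }
    """)

    def locate_storage_location_objects(image, config):
        # Crop the configured region of each storage location out of the image for state recognition.
        objects = {}
        for slot in config["storage_locations"]:
            vertices = np.array(slot["pixel_vertices"])
            x0, y0 = vertices.min(axis=0)
            x1, y1 = vertices.max(axis=0)
            objects[slot["number"]] = image[y0:y1, x0:x1]
        return objects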
In some embodiments, the goods placement state is obtained by performing goods placement state recognition on the storage location object through a goods state recognition engine. The goods state recognition engine can be deployed on the server; it is converted from a goods state recognition model trained by deep learning, and its detection and inference speed is higher than that of the goods state recognition model. A goods state recognition model built with a deep learning algorithm can automatically learn accurate features, which improves the accuracy of goods state recognition; converting the trained model into a goods state recognition engine further optimizes inference and thus further improves goods state recognition.
The goods state recognition engine is an inference optimizer that optimizes the trained goods state recognition model; it optimizes inference, recognizes various goods more accurately, and speeds up deployment.
In some embodiments, the goods state recognition model is trained as follows. The installed image acquisition devices and the corresponding server are connected to the same network so that the server can access every device. The image acquisition function is started from the server, and the devices capture images of goods on different storage locations; non-duplicated images are selected as goods sample images to keep the training samples balanced. The goods sample images include, but are not limited to, images of single goods, images of goods stacked at different spacings, angles, and positions, and images of different types of goods. Each goods sample image is then labeled to obtain a labeled image set; the label types include, but are not limited to, at least one of person, forklift, and goods. For example, if a goods sample image contains a person, a forklift, and goods, a labeling tool is used to draw the object boxes for the person, the forklift, and the goods and to attach the corresponding labels, which tell the model during training what a person, a forklift, and goods are. Finally, the labeled image set is used to train the original neural network model to obtain the optimal model parameters, and the neural network model is updated with these parameters to obtain the trained goods state recognition model.
In some embodiments, the original neural network model may be a YOLOv5 model, i.e., a neural network model trained with a single-stage object detection algorithm. A goods state recognition model trained from the YOLOv5 model has a high inference speed, a small footprint, and relatively high accuracy.
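As an illustration of how a trained detector of this kind could be applied to a cropped storage location region, assuming the public YOLOv5 hub interface and a hypothetical weights file best.pt; this is a sketch, not the application's actual program.
    import torch

    # Load a custom-trained YOLOv5 model (hypothetical weights file) via the public hub interface.
    model = torch.hub.load("ultralytics/yolov5", "custom", path="best.pt")

    def goods_placement_state(slot_crop):
        # Run detection on one storage location crop and report whether any "goods" box was found.
        results = model(slot_crop)
        labels = results.pandas().xyxy[0]["name"].tolist()
        return "goods-present" if "goods" in labels else "no-goods"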
In some embodiments, the goods state recognition model is converted into the goods state recognition engine as follows: the model file of the trained goods state recognition model is converted into the file format of the goods state recognition engine to generate an engine file. Specifically, the goods state recognition model is imported into the goods state recognition engine to generate the engine file, which is serialized and stored so that it can later be loaded quickly to run accelerated inference of the goods state recognition model.
In practical application, TensorRT can be chosen as the goods state recognition engine of the present application; TensorRT is a high-performance deep learning inference optimizer and runtime engine.
It can be understood that, in some embodiments, the process of converting the goods state recognition model into the goods state recognition engine (i.e., TensorRT) is as follows: the .pt model file of the goods state recognition model is obtained and converted into a .wts model file; the .wts file is placed in the folder corresponding to TensorRT, and the engine generator in that folder is run to generate a new engine file; the new engine file is then copied over the original engine file to produce the goods state recognition engine.
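Loading a serialized engine file for inference could, for example, look like the following sketch with the TensorRT Python API; the file name goods.engine is an assumption, and buffer allocation and execution are omitted.
    import tensorrt as trt

    # Deserialize a previously generated and serialized engine file (hypothetical name goods.engine).
    logger = trt.Logger(trt.Logger.WARNING)
    runtime = trt.Runtime(logger)
    with open("goods.engine", "rb") as f:
        engine = runtime.deserialize_cuda_engine(f.read())

    # An execution context is then created for accelerated inference of the goods state recognition
    # model; input/output device buffers would be bound here (omitted in this sketch).
    context = engine.create_execution_context()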
In some embodiments, step 208 specifically includes, but is not limited to: if the goods placement state is the no-goods state, generating a goods operation signal so that the handling equipment places goods on the storage location represented by the storage location object according to the goods operation signal. Generating the goods operation signal automatically by program allows the handling equipment to be controlled to place goods in time, which improves goods placement efficiency.
In some embodiments, the goods placement control method specifically includes, but is not limited to: if the goods placement state is the goods-present state, generating a notification indicating that the storage location represented by the storage location object holds goods, meaning that the handling equipment does not need to place goods on that storage location at the moment.
In some embodiments, the goods placement control method specifically includes, but is not limited to: if the goods placement state of the storage locations represented by a plurality of storage location objects is the no-goods state, generating the next goods operation signal after the handling equipment has placed goods on the previous no-goods storage location, so as to control the handling equipment to place goods on the next no-goods storage location based on that signal; in this way the handling equipment can carry out continuous loading work.
In practical application, the goods state recognition program runs on the server and monitors all the storage locations in the storage location area. When goods enter the detection area (i.e., the storage location area) in the program interface, the goods state recognition program automatically draws a detection box around the goods object and displays the corresponding goods label. When goods are recognized on a storage location, the program reports that the corresponding storage location holds goods. When no goods are recognized on a storage location, data in a specific format (e.g., JSON) are generated and transmitted to the system program through a data interface (e.g., a JSON-RPC interface). The system program generates a goods operation signal from the received data and transmits it to the handling equipment, which receives the signal and places goods on the corresponding storage location.
The JSON data may use a key-value format, for example with goods-present represented as 1, no goods as 0, and the storage location number identifying a specific storage location. After receiving the key-value data, the system program generates a goods operation signal telling the handling equipment that a certain storage location has no goods, so that the handling equipment places goods on that storage location. In addition, after the handling equipment completes one placement, the system program sends the next goods operation signal so that the handling equipment moves on to the next storage location, until goods have been placed on all the storage locations.
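A toy sketch in Python of how such a key-value payload could be turned into goods operation signals; the payload content, storage location numbers, and the send_to_handling_equipment helper are hypothetical placeholders rather than the application's interface.
    import json

    # Hypothetical key-value payload from the goods state recognition program:
    # 1 = goods present, 0 = no goods, keyed by storage location number.
    payload = json.loads('{"A-01": 0, "A-02": 1, "A-03": 0}')

    def send_to_handling_equipment(signal):
        # Placeholder for the real JSON-RPC / fieldbus call to the handling equipment.
        print("goods operation signal:", signal)

    # The system program issues one goods operation signal per no-goods storage location;
    # in practice the next signal is sent only after the previous placement has completed.
    for slot_number, has_goods in payload.items():
        if has_goods == 0:
            send_to_handling_equipment({"command": "place_goods", "storage_location": slot_number})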
In some embodiments, as shown in FIG. 4, the goods placement control method of the present application further includes, but is not limited to, the following steps:
Step 402: determine the device parameters and installation information of the image acquisition device.
Step 404: perform the de-distortion operation on the image acquisition device.
Step 406: fixedly install the image acquisition device above the loading platform according to the installation information.
Step 408: capture the storage location image of the storage locations with the image acquisition device.
Step 410: perform storage location calibration according to the storage location image, and write the calibration into the configuration file of the goods state recognition program.
Step 412: run the goods state recognition program to collect sample storage location images.
Step 414: label the sample storage location images.
Step 416: perform model training and conversion with the labeled sample storage location images to obtain the goods state recognition model.
Step 418: generate the goods state recognition engine from the goods state recognition model.
Step 420: replace the source engine file with the goods state recognition engine.
Step 422: run the goods state recognition program again to monitor all the storage locations.
Step 424: the goods state recognition program recognizes whether goods are present on a storage location; if yes, step 426 is executed, and if no, step 428 is executed.
Step 426: the goods state recognition program reports that the storage location holds goods.
Step 428: the goods state recognition program transmits a goods release signal to the handling equipment.
Specifically, the device parameters and installation information of the image acquisition device are determined; the device parameters include the focal length, and the installation information includes the installation height and installation angle of the device. After the de-distortion operation, the image acquisition device is fixedly installed above the loading platform, where the loading platform is the area on which goods are placed and carries several trays (for example, two), each tray corresponding to one storage location. After installation, the image acquisition device captures the storage location image of one or more storage locations, storage location calibration is performed according to this image, and the calibrated storage locations are written into the configuration file of the goods state recognition program. After configuration, the goods state recognition program is run to collect sample storage location images, which are then labeled. Model training is performed with the labeled sample images, and the format of the trained model file is converted to obtain the goods state recognition model. The goods state recognition model is then converted to generate the goods state recognition engine, which replaces the source engine file. The goods state recognition program is run again to monitor all the storage locations and to recognize whether goods are present on them; if goods are present, it reports that the storage location holds goods, and if not, it transmits a goods release signal to the handling equipment so that the handling equipment performs the goods placement operation for that storage location.
It should be understood that although the steps in the flowcharts of the above embodiments are shown in the order indicated by the arrows, they are not necessarily performed in that order. Unless explicitly stated otherwise, the steps are not strictly limited to the order shown and may be performed in other orders. Moreover, at least some of the steps in these flowcharts may include multiple sub-steps or stages, which are not necessarily performed at the same time but may be performed at different times, and whose execution order is not necessarily sequential; they may be performed in turn or alternately with other steps or with at least part of the sub-steps or stages of other steps.
Based on the same inventive concept, an embodiment of the present application further provides a goods placement control device for implementing the above goods placement control method. The implementation solution provided by the device is similar to that described for the method, so for the specific limitations in one or more embodiments of the goods placement control device below, reference may be made to the limitations on the goods placement control method above; details are not repeated here.
In some embodiments, as shown in FIG. 5, a goods placement control device is provided, comprising an image acquisition module 502, a locating and detection module 504, a state recognition module 506, and a placement control module 508, wherein:
the image acquisition module 502 is configured to acquire a storage location image obtained by image capture; the storage location image contains at least one storage location object, and the storage location represented by a storage location object is used for storing goods;
the locating and detection module 504 is configured to locate and detect the storage location object in the storage location image;
the state recognition module 506 is configured to perform goods placement state recognition on the storage location object to obtain the goods placement state of the storage location represented by the storage location object; the goods placement state indicates whether goods are present on that storage location;
and the placement control module 508 is configured to perform goods placement control processing on the handling equipment according to the goods placement state.
With the above goods placement control device, a storage location image obtained by capturing one or more storage locations is acquired; the image contains at least one storage location object, and the storage location represented by a storage location object is used for storing goods; the storage location object in the image is located and detected; goods placement state recognition is performed on the storage location object to obtain the goods placement state of the storage location it represents, the state indicating whether goods are present on that storage location; and goods placement control processing is performed on the handling equipment according to the goods placement state. By performing goods placement state recognition on the storage location objects in the image, whether goods are present on the corresponding storage locations can be determined automatically by a program, so that the handling equipment is controlled to place goods in time and goods placement efficiency is improved.
In some embodiments, the storage location image is captured by an image acquisition device, and the goods placement control device further includes a device installation module configured to calculate the field of view of the image acquisition device according to the maximum cargo height, the device parameters of the image acquisition device, and a preset installation position, and to install the image acquisition device at the preset installation position if the goods on the target storage locations can be fully displayed within the field of view; the target storage locations are the storage locations represented by the storage location objects in the storage location image, i.e., the real-world storage locations used for storing goods.
In some embodiments, the locating and detection module 504 is further configured to obtain a pre-configuration file and read the storage location information of the storage location object from it; the storage location information is obtained by calibrating a sample storage location image, which was captured in advance by the image acquisition device for the storage locations represented by the storage location objects; locating and detection are then performed on the storage location image according to the storage location information to obtain the storage location object in the image.
In some embodiments, the goods placement state is obtained by performing goods placement state recognition on the storage location object through a goods state recognition engine; the goods state recognition engine is converted from a goods state recognition model trained by deep learning, and its detection and inference speed is higher than that of the goods state recognition model.
In some embodiments, the placement control module 508 is further configured to generate a goods operation signal if the goods placement state is the no-goods state, so that the handling equipment places goods on the storage location represented by the storage location object according to the goods operation signal.
In some embodiments, the placement control module 508 is further configured to generate a notification indicating that the storage location represented by the storage location object holds goods if the goods placement state is the goods-present state.
In some embodiments, the placement control module 508 is further configured to, when the goods placement state of the storage locations represented by a plurality of storage location objects is the no-goods state, generate the next goods operation signal after the handling equipment has placed goods on the previous no-goods storage location, so as to control the handling equipment to place goods on the next no-goods storage location based on that signal.
The division of modules in the above goods placement control device is only for illustration; in other embodiments, the goods placement control device may be divided into different modules as needed to complete all or part of its functions.
The modules in the above goods placement control device can be implemented in whole or in part by software, hardware, or combinations thereof. The modules can be embedded in hardware form in, or be independent of, a processor in the computer device, or be stored in software form in a memory of the computer device, so that the processor can call them and execute the operations corresponding to each module.
In some embodiments, a computer device is provided, which may be the server in FIG. 1; its internal structure may be as shown in FIG. 6. The computer device includes a processor, a memory, and a network interface connected by a system bus. The processor of the computer device provides computing and control capabilities. The memory of the computer device includes a non-volatile storage medium and an internal memory. The non-volatile storage medium stores an operating system, a computer program, and a database. The internal memory provides an environment for running the operating system and the computer program in the non-volatile storage medium. The database of the computer device is used for storing image data and storage location information. The network interface of the computer device is used for communicating with an external terminal through a network connection. The computer program, when executed by the processor, implements a goods placement control method.
Those skilled in the art will appreciate that the architecture shown in fig. 6 is merely a block diagram of some of the structures associated with the disclosed aspects and is not intended to limit the computing devices to which the disclosed aspects apply, as particular computing devices may include more or less components than those shown, or may combine certain components, or have a different arrangement of components.
In some embodiments, there is further provided a computer device comprising a memory and a processor, the memory having stored therein a computer program, the processor implementing the steps of the above method embodiments when executing the computer program.
In some embodiments, a computer-readable storage medium is provided, on which a computer program is stored, which computer program, when being executed by a processor, carries out the steps of the above-mentioned method embodiments.
In some embodiments, a computer program product is provided, comprising a computer program which, when executed by a processor, performs the steps in the above-described method embodiments.
It will be understood by those skilled in the art that all or part of the processes of the methods in the above embodiments can be implemented by a computer program instructing the relevant hardware. The computer program can be stored in a non-volatile computer-readable storage medium and, when executed, can include the processes of the embodiments of the methods described above. Any reference to memory, database, or other medium used in the embodiments provided herein may include at least one of non-volatile and volatile memory. The non-volatile memory may include read-only memory (ROM), magnetic tape, floppy disk, flash memory, optical memory, high-density embedded non-volatile memory, resistive random access memory (ReRAM), magnetoresistive random access memory (MRAM), ferroelectric random access memory (FRAM), phase change memory (PCM), graphene memory, and the like. The volatile memory may include random access memory (RAM), external cache memory, and the like. By way of illustration and not limitation, RAM can take many forms, such as static random access memory (SRAM) or dynamic random access memory (DRAM). The databases referred to in the various embodiments provided herein may include at least one of relational and non-relational databases. Non-relational databases may include, but are not limited to, blockchain-based distributed databases and the like. The processors referred to in the embodiments provided herein may be general-purpose processors, central processing units, graphics processing units, digital signal processors, programmable logic devices, data processing logic devices based on quantum computing, and the like, without limitation.
The technical features of the above embodiments may be combined arbitrarily. For brevity, not all possible combinations of the technical features in the above embodiments are described; however, as long as a combination of these technical features is not contradictory, it should be considered within the scope of this specification.
The above embodiments express only several implementations of the present application, and their description is relatively specific and detailed, but this should not be construed as limiting the scope of the application. It should be noted that a person skilled in the art may make several variations and improvements without departing from the concept of the present application, all of which fall within the protection scope of the present application. Therefore, the protection scope of the present application shall be subject to the appended claims.

Claims (10)

1. A goods placement control method, characterized in that the method comprises:
acquiring a storage location image obtained by image acquisition performed on a storage location; the storage location image contains at least one storage location object; the storage location represented by the storage location object is used for storing goods;
performing positioning detection on the storage location object in the storage location image;
recognizing the goods placement state of the storage location object to obtain the goods placement state of the storage location represented by the storage location object; the goods placement state is used for indicating whether goods are present on the storage location represented by the storage location object;
and performing goods placement control processing on a handling device according to the goods placement state.
2. The method of claim 1, wherein the storage location image is captured by an image acquisition device, and the method further comprises, before the acquiring of the storage location image obtained by the image acquisition performed on the storage location, a step of installing the image acquisition device; the step of installing the image acquisition device comprises:
calculating a field of view of the image acquisition device according to a maximum goods height, device parameters of the image acquisition device, and a preset installation position;
and if the goods on a target storage location can be completely displayed within the field of view, installing the image acquisition device at the preset installation position; the target storage location is the real-world storage location, represented by each storage location object in the storage location image, that is used for storing goods.
3. The method of claim 1, wherein the performing positioning detection on the storage location object in the storage location image comprises:
acquiring a pre-configuration file, and obtaining storage location information of the storage location object from the pre-configuration file; the storage location information is obtained by calibrating a sample storage location image; the sample storage location image is obtained by capturing, in advance, an image of the storage location represented by the storage location object with the image acquisition device;
and performing positioning detection on the storage location image according to the storage location information to obtain the storage location object in the storage location image.
4. The method of claim 1, wherein the goods placement state is obtained by performing goods placement state recognition on the storage location object with a goods state recognition engine; the goods state recognition engine is obtained by converting a goods state recognition model trained by deep learning; and the detection and inference speed of the goods state recognition engine is higher than that of the goods state recognition model.
5. The method according to any one of claims 1 to 4, wherein the performing goods placement control processing on the handling device according to the goods placement state comprises:
if the goods placement state is a no-goods state, generating a goods operation signal, so that the handling device places goods on the storage location represented by the storage location object according to the goods operation signal.
6. The method of claim 5, further comprising:
if the goods placement state is a goods-present state, generating a notification indicating that the storage location represented by the storage location object is occupied.
7. The method of claim 5, further comprising:
if the goods placement states of the storage locations represented by a plurality of storage location objects are all the no-goods state, generating, after the handling device has placed goods on the previous storage location in the no-goods state, a next goods operation signal, so as to control the handling device to place goods, based on the next goods operation signal, on the next storage location in the no-goods state.
8. A goods placement control device, the device comprising:
an image acquisition module, configured to acquire a storage location image obtained by image acquisition performed on a storage location; the storage location image contains at least one storage location object; the storage location represented by the storage location object is used for storing goods;
a positioning detection module, configured to perform positioning detection on the storage location object in the storage location image;
a state recognition module, configured to recognize the goods placement state of the storage location object to obtain the goods placement state of the storage location represented by the storage location object; the goods placement state is used for indicating whether goods are present on the storage location represented by the storage location object;
and a placement control module, configured to perform goods placement control processing on a handling device according to the goods placement state.
9. A computer device comprising a memory and a processor, the memory storing a computer program, characterized in that the processor, when executing the computer program, implements the steps of the method of any of claims 1 to 7.
10. A computer-readable storage medium on which a computer program is stored, wherein the computer program, when executed by a processor, implements the steps of the method of any one of claims 1 to 7.

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210581504.1A CN114821015A (en) 2022-05-26 2022-05-26 Goods placement control method and device, computer equipment and storage medium

Publications (1)

Publication Number Publication Date
CN114821015A true CN114821015A (en) 2022-07-29

Family

ID=82517321

Country Status (1)

Country Link
CN (1) CN114821015A (en)

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination