CN116188961A - Luggage soft and hard bag identification method and device and luggage sorting system - Google Patents
- Publication number
- CN116188961A (application number CN202310274377.5A)
- Authority
- CN
- China
- Prior art keywords
- target
- information
- point cloud
- luggage
- baggage
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V20/00—Scenes; Scene-specific elements
- G06V20/35—Categorising the entire scene, e.g. birthday party or wedding scene
- G06V20/36—Indoor scenes
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B07—SEPARATING SOLIDS FROM SOLIDS; SORTING
- B07C—POSTAL SORTING; SORTING INDIVIDUAL ARTICLES, OR BULK MATERIAL FIT TO BE SORTED PIECE-MEAL, e.g. BY PICKING
- B07C5/00—Sorting according to a characteristic or feature of the articles or material being sorted, e.g. by control effected by devices which detect or measure such characteristic or feature; Sorting by manually actuated devices, e.g. switches
- B07C5/04—Sorting according to size
- B07C5/10—Sorting according to size measured by light-responsive means
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B07—SEPARATING SOLIDS FROM SOLIDS; SORTING
- B07C—POSTAL SORTING; SORTING INDIVIDUAL ARTICLES, OR BULK MATERIAL FIT TO BE SORTED PIECE-MEAL, e.g. BY PICKING
- B07C5/00—Sorting according to a characteristic or feature of the articles or material being sorted, e.g. by control effected by devices which detect or measure such characteristic or feature; Sorting by manually actuated devices, e.g. switches
- B07C5/34—Sorting according to other particular properties
- B07C5/342—Sorting according to other particular properties according to optical properties, e.g. colour
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B07—SEPARATING SOLIDS FROM SOLIDS; SORTING
- B07C—POSTAL SORTING; SORTING INDIVIDUAL ARTICLES, OR BULK MATERIAL FIT TO BE SORTED PIECE-MEAL, e.g. BY PICKING
- B07C5/00—Sorting according to a characteristic or feature of the articles or material being sorted, e.g. by control effected by devices which detect or measure such characteristic or feature; Sorting by manually actuated devices, e.g. switches
- B07C5/36—Sorting apparatus characterised by the means used for distribution
- B07C5/361—Processing or control devices therefor, e.g. escort memory
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V10/00—Arrangements for image or video recognition or understanding
- G06V10/20—Image preprocessing
- G06V10/22—Image preprocessing by selection of a specific region containing or referencing a pattern; Locating or processing of specific regions to guide the detection or recognition
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V10/00—Arrangements for image or video recognition or understanding
- G06V10/20—Image preprocessing
- G06V10/32—Normalisation of the pattern dimensions
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V10/00—Arrangements for image or video recognition or understanding
- G06V10/70—Arrangements for image or video recognition or understanding using pattern recognition or machine learning
- G06V10/764—Arrangements for image or video recognition or understanding using pattern recognition or machine learning using classification, e.g. of video objects
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V10/00—Arrangements for image or video recognition or understanding
- G06V10/70—Arrangements for image or video recognition or understanding using pattern recognition or machine learning
- G06V10/77—Processing image or video features in feature spaces; using data integration or data reduction, e.g. principal component analysis [PCA] or independent component analysis [ICA] or self-organising maps [SOM]; Blind source separation
- G06V10/774—Generating sets of training patterns; Bootstrap methods, e.g. bagging or boosting
Landscapes
- Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Multimedia (AREA)
- Artificial Intelligence (AREA)
- Health & Medical Sciences (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Computing Systems (AREA)
- Databases & Information Systems (AREA)
- Evolutionary Computation (AREA)
- General Health & Medical Sciences (AREA)
- Medical Informatics (AREA)
- Software Systems (AREA)
- Sorting Of Articles (AREA)
Abstract
The application discloses a method and a device for identifying soft and hard bags of baggage and a baggage sorting system, and belongs to the technical field of baggage sorting. The method for identifying the soft and hard bags of the luggage comprises the following steps: acquiring a first RGB image and first point cloud information corresponding to the first RGB image, wherein the first RGB image comprises pixel information of target luggage to be identified; based on the first RGB image, resolving the first point cloud information to obtain target point cloud information of target baggage; determining target type information, target size information and target pose information of target baggage based on the first RGB image and the target point cloud information; and determining the soft and hard package type of the target luggage based on the target type information, the target size information and the target pose information. According to the method, the point cloud information is resolved through the RGB image, the size and the pose of the target luggage are accurately measured, the soft luggage and the hard luggage are accurately identified, the luggage sorting efficiency is improved, and the luggage breakage rate is reduced.
Description
Technical Field
The application belongs to the technical field of baggage sorting, and particularly relates to a baggage soft and hard bag identification method and device and a baggage sorting system.
Background
In recent years, as travel frequency has risen sharply, the quantity of transported baggage has grown year by year, and airport ground service departments have built dedicated baggage sorting systems. Most baggage sorting systems operate as follows: baggage falls onto the baggage transfer mechanism through a chute, rotates to a specific position with the turntable, and is then picked up by a dedicated sorter and carried onto the baggage conveyor.
Different baggage has different transport and sorting requirements: soft baggage and hard baggage need to be stacked in separate layers, with soft baggage placed on top of hard baggage to avoid crushing the soft baggage.
At present, baggage sorting systems mostly rely on sorting personnel to distinguish soft baggage from hard baggage by experience. This is costly in human resources and slow in recognition, which lowers baggage sorting efficiency, and misrecognition occurs easily, in turn causing baggage damage.
Disclosure of Invention
The present application aims to solve at least one of the technical problems existing in the prior art. Therefore, the application provides a method and a device for identifying soft and hard bags of baggage and a baggage sorting system, which can accurately identify soft baggage and hard baggage, reduce manpower resource cost, help to improve baggage sorting efficiency and reduce baggage breakage rate.
In a first aspect, the present application provides a method for identifying soft and hard baggage, the method comprising:
acquiring a first RGB image and first point cloud information corresponding to the first RGB image, wherein the first RGB image comprises pixel information of target luggage to be identified;
based on the first RGB image, resolving the first point cloud information to obtain target point cloud information of the target luggage;
determining target type information, target size information and target pose information of the target baggage based on the first RGB image and the target point cloud information;
and determining the soft and hard package type of the target luggage based on the target type information, the target size information and the target pose information.
According to the luggage soft and hard bag identification method, the first point cloud information is resolved using the first RGB image, the size and pose of the target luggage are accurately measured from the resulting target point cloud information, and the target type information, target size information and target pose information are combined, so that soft luggage and hard luggage can be accurately identified, human resource costs are reduced, luggage sorting efficiency is improved, and the luggage breakage rate is reduced.
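For orientation only, the sketch below strings the four steps together in Python. Every function name here (capture_rgbd, extract_target_cloud, estimate_attributes, classify_soft_hard) is a hypothetical placeholder standing in for the corresponding module of this application, and the shapes, detection box and threshold are invented for illustration.

```python
# Minimal end-to-end sketch of the four-step method; all helpers are
# hypothetical placeholders, not the application's implementation.
import numpy as np

def capture_rgbd():
    """Placeholder for acquisition: first RGB image + corresponding first point cloud."""
    rgb = np.zeros((480, 640, 3), dtype=np.uint8)          # first RGB image
    cloud = np.zeros((480, 640, 3), dtype=np.float32)      # organized point cloud (x, y, z per pixel)
    return rgb, cloud

def extract_target_cloud(rgb, cloud):
    """Placeholder: resolve the first point cloud with the RGB image to isolate the target baggage."""
    x1, y1, x2, y2 = 100, 100, 400, 300                    # assumed detection box
    return cloud[y1:y2, x1:x2].reshape(-1, 3)

def estimate_attributes(rgb, target_cloud):
    """Placeholder: derive type, size and pose from the image and target point cloud."""
    size = target_cloud.max(axis=0) - target_cloud.min(axis=0)
    return {"type": "cloth_case", "size": size, "pose": np.eye(4)}

def classify_soft_hard(attrs):
    """Placeholder soft/hard decision on type + size + pose (threshold is invented)."""
    return "hard" if attrs["size"][2] > 0.2 else "soft"

rgb, cloud = capture_rgbd()
target_cloud = extract_target_cloud(rgb, cloud)
attrs = estimate_attributes(rgb, target_cloud)
print(classify_soft_hard(attrs))
```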
According to an embodiment of the present application, the calculating the first point cloud information based on the first RGB image to obtain the target point cloud information of the target baggage includes:
performing target detection on the first RGB image to obtain the target type information of the target luggage and the target position information of the target luggage in the first RGB image;
and calculating the target point cloud information based on the target position information and the first point cloud information, wherein the target point cloud information is used for determining the target size information and the target pose information.
According to an embodiment of the present application, the calculating, based on the target location information and the first point cloud information, the target point cloud information includes:
performing foreground and background segmentation on the first point cloud information, and resolving to obtain second point cloud information of the target luggage;
based on the first point cloud information and the target position information, third point cloud information of the target luggage is obtained through calculation;
and carrying out fusion processing on the second point cloud information and the third point cloud information to obtain the target point cloud information.
According to an embodiment of the present application, the performing object detection on the first RGB image to obtain the object type information of the object baggage and the object position information of the object baggage in the first RGB image includes:
Inputting the first RGB image into a target detection model for target detection to obtain the target type information and the target position information output by the target detection model;
the target detection model is trained based on a sample luggage data set, and the sample luggage data set comprises a sample luggage image and label information corresponding to the sample luggage image.
According to one embodiment of the present application, the acquiring the first RGB image and the first point cloud information corresponding to the first RGB image includes:
acquiring a plurality of first RGB images and a plurality of first point cloud information corresponding to the first RGB images one by one;
the calculating the first point cloud information based on the first RGB image to obtain target point cloud information of the target baggage includes:
and resolving the plurality of first point cloud information in a one-to-one correspondence manner based on the plurality of first RGB images to obtain a plurality of target point cloud information.
According to one embodiment of the application, the target size information is determined based on the mean of the first size information corresponding to the plurality of target point cloud information, and the target pose information is determined based on the mean of the first pose information corresponding to the plurality of target point cloud information.
According to one embodiment of the application, the soft and hard packet class of the target baggage is determined based on a mode of a plurality of first soft and hard packet classes, and the first soft and hard packet classes are determined based on the first RGB image and the target point cloud information corresponding to the first RGB image.
According to one embodiment of the present application, the determining the soft and hard packet category of the target baggage based on the target type information, the target size information and the target pose information includes:
inputting the target type information, the target size information and the target pose information into a classification network model to obtain the soft and hard packet type of the target luggage output by the classification network model;
the classification network model is obtained through training based on a training sample set.
In a second aspect, the present application provides a luggage soft and hard bag identification device, the device comprising:
the acquisition module is used for acquiring a first RGB image and first point cloud information corresponding to the first RGB image, wherein the first RGB image comprises pixel information of target luggage to be identified;
the first processing module is used for resolving the first point cloud information based on the first RGB image to obtain target point cloud information of the target luggage;
The second processing module is used for determining target type information, target size information and target pose information of the target luggage based on the first RGB image and the target point cloud information;
and the third processing module is used for determining the soft and hard package type of the target luggage based on the target type information, the target size information and the target pose information.
According to the luggage soft and hard bag recognition device, the first point cloud information is resolved by using the first RGB image, the size and the pose of the target luggage are accurately measured based on the obtained target point cloud information, the data such as the target type information, the target size information and the target pose information are integrated, soft luggage and hard luggage can be accurately recognized, the manpower resource cost is reduced, the luggage sorting efficiency is improved, and the luggage breakage rate is reduced.
In a third aspect, the present application provides a baggage sorting system comprising:
a baggage transfer mechanism for transporting a target baggage;
the image acquisition module is arranged on the luggage conveying mechanism and is used for acquiring a first RGB image of the target luggage and first point cloud information corresponding to the first RGB image;
The soft and hard bag identification module is electrically connected with the image acquisition module and is used for determining the soft and hard bag type, the target size information and the target pose information of the target baggage based on the baggage soft and hard bag identification method in the first aspect;
and the baggage sorting mechanism is electrically connected with the soft and hard bag identification module and is used for performing sorting operation on the target baggage based on the soft and hard bag type, the target size information and the target pose information of the target baggage.
According to the luggage sorting system, the first point cloud information is resolved using the first RGB image, the size and pose of the target luggage are accurately measured from the resulting target point cloud information, and the target type information, target size information and target pose information are combined, so that soft luggage and hard luggage can be accurately identified, luggage sorting speed is improved, human resource costs are reduced, the breakage rate of passenger luggage is effectively reduced, the automated application of the luggage sorting system is enhanced, and energy is saved while efficiency is improved.
According to one embodiment of the present application, further comprising:
The sorting decision module is connected between the soft and hard package identification module and the luggage sorting mechanism, and is used for determining the soft and hard package type, the target size information and the target pose information of the target luggage under the condition that a plurality of first RGB images and a plurality of first point cloud information which are in one-to-one correspondence with the first RGB images are acquired.
According to one embodiment of the present application, further comprising:
the state monitoring module is connected with at least one of the baggage conveying mechanism, the image acquisition module, the soft and hard bag identification module, the baggage sorting mechanism and the sorting decision module.
In a fourth aspect, the present application provides an electronic device, including a memory, a processor, and a computer program stored in the memory and executable on the processor, where the processor implements the method for identifying a soft and hard luggage according to the first aspect when the processor executes the computer program.
In a fifth aspect, the present application provides a non-transitory computer readable storage medium having stored thereon a computer program which, when executed by a processor, implements a baggage soft and hard pack identification method according to the first aspect described above.
In a sixth aspect, the present application provides a computer program product comprising a computer program which, when executed by a processor, implements a method for identifying a hard or soft bag of luggage as described in the first aspect above.
Additional aspects and advantages of the application will be set forth in part in the description which follows, and in part will be obvious from the description, or may be learned by practice of the application.
Drawings
The foregoing and/or additional aspects and advantages of the present application will become apparent and readily appreciated from the following description of the embodiments, taken in conjunction with the accompanying drawings, wherein:
fig. 1 is a flow chart of a method for identifying soft and hard bags of luggage provided in an embodiment of the present application;
FIG. 2 is a schematic structural diagram of an object detection model according to an embodiment of the present disclosure;
FIG. 3 is a schematic flow chart of training detection of a target detection model according to an embodiment of the present application;
fig. 4 is a schematic structural diagram of a device for identifying soft and hard bags of luggage provided in an embodiment of the present application;
fig. 5 is one of schematic structural views of a baggage sorting system according to an embodiment of the present application;
FIG. 6 is a second schematic diagram of a baggage sorting system according to an embodiment of the present application;
fig. 7 is a schematic structural diagram of an electronic device according to an embodiment of the present application.
Reference numerals:
the system comprises target baggage 501, a baggage conveying mechanism 510, an image acquisition module 520, an image acquisition device 521, a photoelectric switch device 522, a soft and hard packet recognition module 530,2D, an image processing module 531,3D, a soft and hard packet classification module 533, a sorting decision module 540, a data transmission module 550 and a baggage sorting mechanism 560.
Detailed Description
Technical solutions in the embodiments of the present application will be clearly described below with reference to the drawings in the embodiments of the present application, and it is apparent that the described embodiments are some embodiments of the present application, but not all embodiments. All other embodiments obtained by a person of ordinary skill in the art based on the embodiments in the present application are within the scope of the protection of the present application.
The terms first, second and the like in the description and in the claims, are used for distinguishing between similar objects and not necessarily for describing a particular sequential or chronological order. It is to be understood that the data so used may be interchanged, as appropriate, such that embodiments of the present application may be implemented in sequences other than those illustrated or described herein, and that the objects identified by "first," "second," etc. are generally of a type and not limited to the number of objects, e.g., the first object may be one or more. Furthermore, in the description and claims, "and/or" means at least one of the connected objects, and the character "/", generally means that the associated object is an "or" relationship.
The amount of baggage in current civil aviation operations is rapidly increasing, and baggage types are highly varied. On the one hand, this increases the demand for human resources; on the other hand, when baggage is sorted manually, it is easily damaged through personnel negligence.
Image-based object size measurement technology is now widely applied and relatively mature, but for baggage size measurement, the diversity and irregularity of passenger baggage make traditional template-based measurement methods unsuitable for measuring the size and pose of airport baggage.
Baggage materials are diverse, broadly divided into soft bags and hard bags; if the same grasping method is used for baggage of different materials, the baggage is easily damaged.
At present, baggage sorting systems mostly rely on sorting personnel to distinguish soft baggage from hard baggage by experience. This is costly in human resources and slow in recognition, which lowers baggage sorting efficiency, and misrecognition occurs easily, in turn causing baggage damage.
The method for identifying the soft and hard bags of the baggage, the device for identifying the soft and hard bags of the baggage, the baggage sorting system, the electronic equipment and the readable storage medium provided by the embodiment of the application are described in detail below by means of specific embodiments and application scenes of the specific embodiments with reference to the accompanying drawings.
The luggage soft and hard packet identification method can be applied to the terminal, and can be specifically executed by hardware or software in the terminal.
The terminal includes, but is not limited to, a portable communication device such as a mobile phone or tablet having a touch sensitive surface (e.g., a touch screen display and/or a touch pad). It should also be appreciated that in some embodiments, the terminal may not be a portable communication device, but rather a desktop computer having a touch-sensitive surface (e.g., a touch screen display and/or a touch pad).
In the following various embodiments, a terminal including a display and a touch sensitive surface is described. However, it should be understood that the terminal may include one or more other physical user interface devices such as a physical keyboard, mouse, and joystick.
The execution main body of the method for identifying the soft and hard luggage can be an electronic device or a functional module or a functional entity capable of implementing the method for identifying the soft and hard luggage in the electronic device, and the electronic device in the embodiment of the application includes, but is not limited to, a mobile phone, a tablet computer, a camera, a wearable device and the like.
As shown in fig. 1, the method for identifying soft and hard baggage comprises steps 110 to 140.
Step 110: acquire a first RGB image and first point cloud information corresponding to the first RGB image, wherein the first RGB image comprises pixel information of the target baggage 501 to be identified.
In this embodiment, the target baggage 501 may be photographed by the image capturing device 521, and the first RGB image and the first point cloud information corresponding thereto may be obtained, and the image capturing device 521 may be an RGB-D camera or the like.
It will be appreciated that the first RGB image includes pixel information of the target baggage 501 and also includes pixel information of a background or other interfering object surrounding the target baggage 501.
For example, the target baggage 501 is placed on the baggage transfer mechanism 510 for transfer, the first RGB image includes pixel information of the target baggage 501 and the baggage transfer mechanism 510, and the corresponding first point cloud information includes point cloud information of the target baggage 501 and the baggage transfer mechanism 510.
It is understood that the first RGB image and the first point cloud information are corresponding, and that the object in the first RGB image has corresponding point cloud information in the first point cloud information.
In step 120, according to the region where the target baggage 501 is located in the first RGB image, the first point cloud information is resolved to obtain the target point cloud information of the target baggage 501, where the target point cloud information is used to characterize the three-dimensional features of the target baggage 501.
In this embodiment, the first RGB image may represent a color feature and a two-dimensional feature of the target baggage 501, the target point cloud information may represent a three-dimensional feature of the target baggage 501, and the target type information, the target size information, and the target pose information of the target baggage 501 are determined from the first RGB image and the target point cloud information.
The target type information characterizes the category to which the target baggage 501 belongs, e.g., a hard-shell suitcase, a cloth suitcase, a cardboard box, a luggage bag, or another irregular or special-material baggage category.
In this embodiment, the soft and hard packet tag of the target baggage 501 is accurately identified according to the target type information of the target baggage 501 in combination with the target size information and the target pose information, and the soft and hard packet type of the target baggage 501 is obtained.
In actual execution, the identified soft-hard bag category of the target baggage 501 includes a case where the target baggage 501 is a soft bag and a case where the target baggage 501 is a hard bag.
For example, if the target type information of the target baggage 501 is cloth-type baggage, and the target size information and target pose information are combined to determine that the cloth-type baggage contains mostly hard objects, the soft and hard bag tag of the target baggage 501 is recognized as a hard bag, and the target baggage 501 is sorted into the hard-bag baggage stack by the hard-bag sorting mechanism of the baggage sorting mechanism 560.
In the related art, a two-dimensional image is used to identify the surface material of the baggage, judging whether it is a hard-shell suitcase, a cloth suitcase, a cardboard box or a luggage bag, and thereby determining whether it belongs to a soft bag or a hard bag. This technique easily leads to misjudgment, especially when the surface material of the baggage differs greatly from the material of the objects inside it.
For example, when a luggage bag is filled with hard objects, recognizing only the surface material from a two-dimensional image classifies it as a soft bag; the bag is then difficult to grasp with the soft-bag grasping mechanism, and other soft bags are easily damaged when it is stacked in the soft-bag stack.
According to the target type information of the target baggage 501 and combining the target size information and the target pose information, the embodiment of the application can accurately identify the target baggage 501 such as a baggage bag filled with a hard object as a hard bag baggage.
In this embodiment of the present application, the first RGB image is used to perform a calculation optimization process on the first point cloud information, and based on the optimized target point cloud information, the size and pose of the target baggage 501 are accurately measured, so that the measurement accuracy is effectively improved, and based on the target type information, the target size information and the target pose information are combined, and the accurate soft and hard package classification is performed on the target baggage 501.
In actual execution, accurate soft and hard bag classification of the target baggage 501 allows manual classification and identification to be replaced, the baggage to be accurately measured, and accurate grasping and stacking strategy information to be provided to the baggage sorting mechanism 560, so that a fully automatic process of grasping the baggage from the baggage conveying mechanism 510 and transporting it to the baggage carrier can be realized, improving baggage sorting efficiency and reducing the baggage breakage rate.
According to the luggage soft and hard bag identification method provided by the embodiment of the application, the first point cloud information is resolved by using the first RGB image, the size and the pose of the target luggage 501 are accurately measured based on the obtained target point cloud information, and the data such as the target type information, the target size information and the target pose information are synthesized, so that soft luggage and hard luggage can be accurately identified, the manpower resource cost is reduced, the luggage sorting efficiency is improved, and the luggage breakage rate is reduced.
In some embodiments, the step 120 of resolving the first point cloud information based on the first RGB image to obtain the target point cloud information of the target baggage 501 may include:
performing target detection on the first RGB image to obtain target type information of the target baggage 501 and target position information of the target baggage 501 in the first RGB image;
and calculating to obtain target point cloud information based on the target position information and the first point cloud information, wherein the target point cloud information is used for determining target size information and target pose information.
Among them, object Detection (Object Detection) is the task of finding all objects (objects) of interest in an image, determining their category and location.
In this embodiment, the target detection is performed on the first RGB image, so that the target type information of the target baggage 501 and the target position information of the target baggage 501 in the first RGB image, which characterizes the position information of the region of the target baggage 501 on the first RGB image, can be obtained.
According to the target position information of the target baggage 501 in the first RGB image, the point cloud information of the corresponding region in the first point cloud information is resolved, so that the target point cloud information can be obtained.
It should be noted that, the target point cloud information characterizes the three-dimensional feature of the target baggage 501, and the category of the target baggage 501 may also be determined according to the target point cloud information, so as to verify the target type information obtained by performing the target detection on the first RGB image.
In some embodiments, performing object detection on the first RGB image to obtain object type information of the object baggage 501 and object position information of the object baggage 501 in the first RGB image may include:
inputting the first RGB image into a target detection model for target detection to obtain target type information and target position information output by the target detection model;
the target detection model is trained based on a sample luggage data set, and the sample luggage data set comprises a sample luggage image and label information corresponding to the sample luggage image.
In this embodiment, as shown in fig. 2, the object detection model may include an input layer, convolution layer 1, convolution layer 2, convolution layers a, b, …, n, and a fusion output layer.
As shown in fig. 3, the training process and the detection process of the target detection model may include the steps of:
step one, data set making and model training.
In this step, sample baggage images corresponding to sample baggage are collected to create a sample baggage data set comprising a plurality of sample baggage images, and corresponding label information is annotated for each sample baggage image.
In actual execution, the sample baggage dataset may be divided into a training set, a validation set, and a detection set, applied to different phases of the target detection model.
A SOTA (state-of-the-art) target detection network can be used: the network is pre-trained on the public COCO dataset and then trained on the sample baggage dataset, with a deep network extracting image features, and the final target detection model is obtained through multiple rounds of training and tuning.
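As an illustrative sketch under stated assumptions, the following fine-tunes a COCO-pretrained detector on a baggage dataset using torchvision's Faster R-CNN; the application does not name a specific network, and the class count, hyperparameters and the commented training loop are assumptions.

```python
# Hypothetical fine-tuning sketch: COCO-pretrained detector adapted to assumed
# baggage classes (hard-shell case, cloth case, cardboard box, luggage bag).
import torch
import torchvision
from torchvision.models.detection.faster_rcnn import FastRCNNPredictor

NUM_CLASSES = 1 + 4  # background + assumed baggage categories

model = torchvision.models.detection.fasterrcnn_resnet50_fpn(weights="DEFAULT")
in_features = model.roi_heads.box_predictor.cls_score.in_features
model.roi_heads.box_predictor = FastRCNNPredictor(in_features, NUM_CLASSES)

optimizer = torch.optim.SGD(model.parameters(), lr=5e-3, momentum=0.9)
model.train()
# images: list[Tensor[3,H,W]], targets: list[{"boxes": Tensor[N,4], "labels": Tensor[N]}]
# for images, targets in train_loader:          # train_loader over the sample baggage dataset
#     loss_dict = model(images, targets)        # dict of detection losses
#     loss = sum(loss_dict.values())
#     optimizer.zero_grad(); loss.backward(); optimizer.step()
```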
And step two, obtaining image detection result information.
And inputting the first RGB image into a target detection model to perform target detection to obtain detection result information such as target type information, target position information and the like.
And thirdly, fusion and solution are carried out to calculate target point cloud information.
Using the target position information of the candidate region where the target baggage 501 is located in the first RGB image, as identified by the target detection model, the target point cloud information corresponding to the target baggage 501 is calculated according to the camera parameter information and the camera coordinate conversion matrix.
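A minimal sketch of this back-projection, assuming a depth map aligned with the first RGB image; the intrinsic parameters, the detection box and the identity conversion matrix are invented for illustration.

```python
# Back-project the pixels inside the detection box into a point cloud using
# camera intrinsics and an extrinsic (coordinate conversion) matrix.
import numpy as np

fx, fy, cx, cy = 600.0, 600.0, 320.0, 240.0       # assumed camera intrinsics
T_world_cam = np.eye(4)                           # assumed camera coordinate conversion matrix

depth = np.full((480, 640), 1.2, dtype=np.float32)   # synthetic depth map in metres
x1, y1, x2, y2 = 100, 120, 380, 300                  # target position (detection box) from step two

us, vs = np.meshgrid(np.arange(x1, x2), np.arange(y1, y2))
zs = depth[y1:y2, x1:x2]
xs = (us - cx) * zs / fx
ys = (vs - cy) * zs / fy
cam_points = np.stack([xs, ys, zs, np.ones_like(zs)], axis=-1).reshape(-1, 4)

target_cloud = (cam_points @ T_world_cam.T)[:, :3]   # target point cloud of the baggage region
print(target_cloud.shape)
```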
In some embodiments, the calculating the target point cloud information based on the target location information and the first point cloud information may include:
performing foreground and background segmentation on the first point cloud information to obtain second point cloud information of the target baggage 501 through calculation;
third point cloud information of the target baggage 501 is obtained by calculation based on the first point cloud information and the target position information;
And carrying out fusion processing on the second point cloud information and the third point cloud information to obtain target point cloud information.
In this embodiment, a data processing algorithm for the three-dimensional point cloud may be configured to initially resolve the size information and pose information of the target baggage 501: foreground and background segmentation is performed on the first point cloud information, giving the second point cloud information after this first resolution.
Using the target position information from target detection on the first RGB image, the detection box corresponding to the target baggage 501 is determined and the third point cloud information of the target baggage 501 within that box is calculated. The third point cloud information obtained by the second resolution is fused with the second point cloud information obtained by the first resolution to obtain optimized target point cloud information, from which the category, size, pose and other information of the target baggage 501 are obtained.
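As a sketch only: the application does not fix the segmentation or fusion algorithms, so the example below assumes a simple belt-height threshold for foreground/background segmentation and an intersection of the two masks for fusion, on an organized point cloud aligned with the image.

```python
# Two-pass resolution and fusion on an organized point cloud (assumed layout).
import numpy as np

H, W = 480, 640
cloud = np.random.rand(H, W, 3).astype(np.float32)    # organized first point cloud (x, y, z)
cloud[..., 2] *= 0.01                                  # belt-level heights
cloud[200:320, 150:400, 2] += 0.30                     # a box-shaped baggage on the belt

# Second point cloud: foreground/background segmentation of the full scene.
fg_mask = cloud[..., 2] > 0.02                         # assumed belt-height threshold (m)

# Third point cloud: points inside the 2D detection box of the target baggage.
x1, y1, x2, y2 = 150, 200, 400, 320                    # target position information
box_mask = np.zeros((H, W), dtype=bool)
box_mask[y1:y2, x1:x2] = True

# Fusion: keep points supported by both passes, yielding the target point cloud.
target_cloud = cloud[fg_mask & box_mask]
print(target_cloud.shape)
```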
In some embodiments, step 110 of acquiring the first RGB image and the first point cloud information corresponding to the first RGB image may include:
acquiring a plurality of first RGB images and a plurality of first point cloud information corresponding to the first RGB images one by one;
Correspondingly, the plurality of first point cloud information is resolved in one-to-one correspondence based on the plurality of first RGB images to obtain a plurality of target point cloud information.
In this embodiment, a plurality of first RGB images and a plurality of first point cloud information may be acquired for the same target baggage 501, the first RGB images and the first point cloud information are in one-to-one correspondence, and the first RGB images and the corresponding first point cloud information are used for resolving, so that a plurality of target point cloud information in one-to-one correspondence with the plurality of first RGB images may be obtained.
For example, 5 sets of data are acquired for the target baggage 501, each set of data including a first RGB image and a first point cloud information, and each set of data is resolved to obtain 5 target point cloud information.
In some embodiments, the target size information is determined based on a mean of first size information corresponding to the plurality of target point cloud information, and the target pose information is determined based on a mean of first pose information corresponding to the plurality of target point cloud information.
It can be understood that the first RGB image, the first point cloud information and the target point cloud information corresponding to the first RGB image are taken as a set of data, and size information and pose information corresponding to the set of data, namely, the first size information and the first pose information, can be obtained through solving.
In this embodiment, the average value of the first size information of the plurality of sets of data is taken as the target size information in the basis of the soft and hard packet identification data of the target baggage 501; and taking the average value of the first pose information of the plurality of groups of data as the target pose information in the basis of the soft and hard packet identification data of the target baggage 501.
For example, 5 sets of data are collected for the target baggage 501, each set of data includes a first RGB image and a first point cloud information, each set of data is resolved to obtain 5 pieces of target point cloud information, and the 5 sets of data are correspondingly calculated to obtain 5 pieces of first size information and 5 pieces of first pose information.
In this embodiment, the detection results of 5 sets of size information and pose information acquired and processed by the target baggage 501 are fused and averaged to obtain an optimized detection result, so that the baggage detection accuracy can be effectively improved.
In some embodiments, the soft and hard packet classification of the target baggage 501 is determined based on the mode of a plurality of first soft and hard packet classifications, which are determined based on the first RGB image and the target point cloud information corresponding to the first RGB image.
In this embodiment, the first type information, the first size information and the first pose information may be obtained by calculation according to the first RGB image and the target point cloud information in the set of data, and the first soft and hard packet type of the target baggage 501 corresponding to each set of data may be determined; the first soft and hard packet type of the target baggage 501 corresponding to each group of data may be determined by calculating the target size information and the target pose information according to the average value of the plurality of groups of data and combining the target type information.
The first soft and hard packet categories include both soft and hard packets, and the soft and hard packet category of the target baggage 501 is determined based on the mode, i.e., the high frequency value, of the plurality of first soft and hard packet categories.
For example, 5 sets of data are collected for the target baggage 501, and 5 first soft and hard bag categories are obtained correspondingly, namely hard bag, soft bag, hard bag, hard bag and hard bag; the mode among the 5 categories is hard bag, so the soft and hard bag category of the target baggage 501 is determined to be a hard bag.
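A minimal sketch of this multi-acquisition fusion, with five invented sample values: the per-acquisition sizes and poses are averaged, and the mode of the per-acquisition soft/hard labels is taken.

```python
# Mean of per-acquisition size/pose and mode of per-acquisition soft/hard labels.
import numpy as np
from collections import Counter

first_sizes = np.array([[0.71, 0.45, 0.28],
                        [0.70, 0.46, 0.27],
                        [0.72, 0.44, 0.29],
                        [0.69, 0.45, 0.28],
                        [0.71, 0.46, 0.28]])            # (l, w, h) per acquisition, metres
first_yaws = np.array([12.0, 11.5, 12.4, 11.8, 12.1])   # pose reduced to a yaw angle for brevity
first_labels = ["hard", "soft", "hard", "hard", "hard"]

target_size = first_sizes.mean(axis=0)                  # target size information
target_yaw = first_yaws.mean()                          # target pose information
soft_hard = Counter(first_labels).most_common(1)[0][0]  # mode of the first soft/hard categories

print(target_size, target_yaw, soft_hard)               # -> hard
```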
In some embodiments, determining 140 the soft and hard packet category of the target baggage 501 based on the target type information, the target size information, and the target pose information may include:
inputting the target type information, the target size information and the target pose information into a classification network model to obtain the soft and hard packet type of the target luggage 501 output by the classification network model;
the classification network model is obtained through training based on a training sample set.
In this embodiment, a training sample set for soft and hard bag classification is constructed. The training sample set includes the type information obtained by performing object detection on RGB images, together with baggage feature data such as the size and pose obtained from point cloud processing and the basic foreground features of the point cloud. These data are fitted by modeling to obtain the final classification network model, which computes the soft and hard bag identification tag corresponding to the target baggage 501.
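As a sketch under stated assumptions, a small multi-layer perceptron stands in for the classification network model below; the feature layout (one-hot type, size, pose angles), layer widths and class ordering are assumptions, not the application's actual network.

```python
# Minimal classification-network sketch: type + size + pose features in, soft/hard out.
import torch
import torch.nn as nn

NUM_TYPES = 4                      # assumed baggage type vocabulary size
FEAT_DIM = NUM_TYPES + 3 + 3       # one-hot type + (l, w, h) + (roll, pitch, yaw)

classifier = nn.Sequential(
    nn.Linear(FEAT_DIM, 32),
    nn.ReLU(),
    nn.Linear(32, 2),              # logits for {soft, hard}
)

type_onehot = torch.tensor([0.0, 1.0, 0.0, 0.0])        # e.g. cloth case
size = torch.tensor([0.71, 0.45, 0.28])
pose = torch.tensor([0.0, 0.0, 0.21])
features = torch.cat([type_onehot, size, pose]).unsqueeze(0)

logits = classifier(features)                           # trained weights would come from the training sample set
print("soft" if logits.argmax(dim=1).item() == 0 else "hard")
```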
According to the luggage soft and hard bag identification method provided by the embodiment of the application, the execution main body can be a luggage soft and hard bag identification device. In this embodiment of the present application, a method for identifying a piece of luggage by using a device for identifying a piece of luggage is taken as an example, and the device for identifying a piece of luggage provided in this embodiment of the present application is described.
The embodiment of the application also provides a device for identifying the soft and hard bags of the luggage.
As shown in fig. 4, the baggage soft and hard bag recognition device includes:
an acquiring module 410, configured to acquire a first RGB image and first point cloud information corresponding to the first RGB image, where the first RGB image includes pixel information of a target baggage 501 to be identified;
the first processing module 420 is configured to calculate, based on the first RGB image, first point cloud information, to obtain target point cloud information of the target baggage 501;
a second processing module 430 for determining target type information, target size information, and target pose information of the target baggage 501 based on the first RGB image and the target point cloud information;
the third processing module 440 is configured to determine a soft and hard packet category of the target baggage 501 based on the target type information, the target size information and the target pose information.
According to the luggage soft and hard bag identification device provided by the embodiment of the application, the first point cloud information is resolved by using the first RGB image, the size and the pose of the target luggage 501 are accurately measured based on the obtained target point cloud information, the data such as the target type information, the target size information and the target pose information are synthesized, soft luggage and hard luggage can be accurately identified, the manpower resource cost is reduced, the luggage sorting efficiency is improved, and the luggage breakage rate is reduced.
In some embodiments, the first processing module 420 is configured to perform target detection on the first RGB image to obtain target type information of the target baggage 501 and target position information of the target baggage 501 in the first RGB image;
and calculating to obtain target point cloud information based on the target position information and the first point cloud information, wherein the target point cloud information is used for determining target size information and target pose information.
In some embodiments, the first processing module 420 is configured to perform foreground-background segmentation on the first point cloud information, and calculate second point cloud information of the target baggage 501;
third point cloud information of the target baggage 501 is obtained by calculation based on the first point cloud information and the target position information;
and carrying out fusion processing on the second point cloud information and the third point cloud information to obtain target point cloud information.
In some embodiments, the first processing module 420 is configured to input the first RGB image into the target detection model, perform target detection, and obtain target type information and target position information output by the target detection model;
the target detection model is trained based on a sample luggage data set, and the sample luggage data set comprises a sample luggage image and label information corresponding to the sample luggage image.
In some embodiments, the obtaining module 410 is further configured to obtain a plurality of first RGB images and a plurality of first point cloud information corresponding to the plurality of first RGB images one-to-one;
the first processing module 420 is further configured to calculate, based on the plurality of first RGB images, a plurality of first point cloud information in a one-to-one correspondence manner, to obtain a plurality of target point cloud information.
In some embodiments, the target size information is determined based on a mean of first size information corresponding to the plurality of target point cloud information, and the target pose information is determined based on a mean of first pose information corresponding to the plurality of target point cloud information.
In some embodiments, the soft and hard packet classification of the target baggage 501 is determined based on the mode of a plurality of first soft and hard packet classifications, which are determined based on the first RGB image and the target point cloud information corresponding to the first RGB image.
In some embodiments, the third processing module 440 is configured to input the target type information, the target size information, and the target pose information into the classification network model, and obtain the soft and hard packet class of the target baggage 501 output by the classification network model;
the classification network model is obtained through training based on a training sample set.
The luggage soft and hard bag identification device in the embodiment of the application can be electronic equipment, and can also be a component in the electronic equipment, such as an integrated circuit or a chip. The electronic device may be a terminal, or may be a device other than a terminal. By way of example, the electronic device may be a mobile phone, tablet computer, notebook computer, palmtop computer, vehicle-mounted electronic device, mobile internet device (MID), augmented reality (AR)/virtual reality (VR) device, robot, wearable device, ultra-mobile personal computer (UMPC), netbook or personal digital assistant (PDA), and may also be a server, network attached storage (NAS), personal computer (PC), television (TV), teller machine or self-service machine, etc.; the embodiments of the present application are not specifically limited in this respect.
The luggage soft and hard bag identification device in the embodiment of the application can be a device with an operating system. The operating system may be an Android operating system, an IOS operating system, or other possible operating systems, which is not specifically limited in the embodiments of the present application.
The device for identifying soft and hard bags of luggage provided in the embodiment of the present application can implement each process implemented by the embodiments of the methods of fig. 1 to 3, and in order to avoid repetition, a detailed description is omitted here.
The embodiment of the application also provides a baggage sorting system.
As shown in fig. 5 and 6, the baggage sorting system comprises: a baggage conveyor 510, an image acquisition module 520, a soft and hard bag recognition module 530, and a baggage sorting mechanism 560.
The baggage transferring mechanism 510 is configured to transport the target baggage 501, the image capturing module 520 is disposed in the baggage transferring mechanism 510, and the image capturing module 520 is configured to capture a first RGB image of the target baggage 501 and first point cloud information corresponding to the first RGB image.
In this embodiment, the image acquisition module 520 may include an image acquisition device 521 and a photoelectric switching device 522, and the photoelectric switching device 522 may be disposed at both sides of the baggage conveyor 510.
When the target baggage 501 reaches the installation position of the photoelectric switch device 522, the light beam is automatically blocked and a photoelectric signal change is generated; the baggage sorting system detects this signal change and controls the image acquisition device 521 to capture the first RGB image and the first point cloud information, realizing real-time acquisition of baggage data.
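For illustration only, the polling loop below captures a frame on the rising edge of the photoelectric signal; read_photoelectric_switch and capture_rgbd_frame are hypothetical placeholders for the photoelectric switch device 522 and the image acquisition device 521.

```python
# Trigger logic sketch: capture whenever the photoelectric beam becomes blocked.
import random
import time

def read_photoelectric_switch():
    """Placeholder: True while the light beam is blocked by a baggage item."""
    return random.random() < 0.1

def capture_rgbd_frame():
    """Placeholder for the RGB-D capture of the image acquisition device."""
    print("captured first RGB image + first point cloud")

blocked_prev = False
for _ in range(50):                       # polling loop, shortened for the sketch
    blocked = read_photoelectric_switch()
    if blocked and not blocked_prev:      # rising edge: baggage reached the switch position
        capture_rgbd_frame()
    blocked_prev = blocked
    time.sleep(0.01)
```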
The soft and hard bag recognition module 530 is electrically connected to the image acquisition module 520, and the soft and hard bag recognition module 530 is configured to determine the soft and hard bag type, the target size information, and the target pose information of the target baggage 501 based on the baggage soft and hard bag recognition method described above.
As shown in fig. 6, the soft and hard packet identification module 530 may include a 2D image processing module 531, a 3D point cloud processing module 532, and a soft and hard packet classification module 533.
In this embodiment, the 2D image processing module 531 may perform object detection on the first RGB image using the object detection model to obtain 2D object detection results such as object type information and object position information of the object baggage 501.
The 3D point cloud processing module 532 is connected to the 2D image processing module 531 and receives the 2D target detection results, such as the target position information, so as to realize baggage size and pose detection based on point cloud processing.
In this embodiment, the 3D point cloud processing module 532 may be designed to perform preliminary calculation on the size information and the pose information of the target baggage 501 based on a data processing algorithm of the 3D point cloud, so as to implement foreground and background segmentation of the first point cloud information of the target baggage 501 and obtain the second point cloud information.
The 3D point cloud processing module 532 may further use the 2D image processing module 531 to perform target detection to obtain target position information, calculate to obtain third point cloud information, perform fusion processing on the second point cloud information and the third point cloud information to obtain optimized target point cloud information, and obtain information such as the size and the pose of the target baggage 501, that is, the target size information and the target pose information, according to the target point cloud information.
The soft and hard packet classification module 533 is connected to the 3D point cloud processing module 532 and the 2D image processing module 531, respectively, and the soft and hard packet classification module 533 uses the target type information of the 2D image processing module 531 in combination with the target size information and the target pose information obtained by the processing of the 3D point cloud processing module 532 to identify whether the target baggage 501 belongs to a soft packet or a hard packet.
The baggage sorting mechanism 560 is electrically connected to the soft and hard bag recognition module 530, and the baggage sorting mechanism 560 is configured to perform a sorting operation on the target baggage 501 based on the soft and hard bag classification, the target size information, and the target pose information of the target baggage 501.
The baggage sorting mechanism 560 is the mechanism that actually grasps and stacks the baggage. In this embodiment, the target baggage 501 may be classified into two types, soft bags and hard bags, and the baggage sorting mechanism 560 may include two types of grippers for gripping hard bags and soft bags respectively.
Based on the target size information and the target pose information of the target baggage 501, the baggage sorting mechanism 560 may be controlled to grasp the target baggage 501 using different sized grippers in cooperation with different grasping poses.
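As a sketch only, the following shows how the recognition result could be turned into a grasp command for the sorting mechanism; the gripper names, the opening-width rule and the command format are assumptions.

```python
# Hypothetical mapping from recognition result to a grasp command.
from dataclasses import dataclass

@dataclass
class GraspCommand:
    gripper: str          # which end effector to use
    width_m: float        # gripper opening, derived from the target size information
    yaw_deg: float        # approach orientation, from the target pose information

def plan_grasp(soft_hard: str, size, yaw_deg: float) -> GraspCommand:
    gripper = "gripper_hard" if soft_hard == "hard" else "gripper_soft"   # assumed gripper types
    width = min(size) + 0.02                                              # open slightly wider than the shortest edge
    return GraspCommand(gripper, width, yaw_deg)

print(plan_grasp("hard", (0.71, 0.45, 0.28), 12.0))
```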
In this embodiment, target detection is performed on the first RGB image and the detection result is used to optimize the first point cloud information. The point cloud information is processed at multiple levels to obtain accurate measurement information such as the size and pose of the baggage, a baggage classification network model is constructed, and soft and hard bag classification is performed using the multi-level calculation results as input. A corresponding baggage sorting scheme is then generated according to the comprehensive detection results, such as the size information, pose information and soft and hard bag classification of the target baggage 501, and the baggage sorting mechanism 560 is controlled to perform the sorting operation.
According to the luggage sorting system provided by the embodiment of the application, the first point cloud information is resolved using the first RGB image, the size and pose of the target luggage 501 are accurately measured from the resulting target point cloud information, and the target type information, target size information and target pose information are combined, so that soft luggage and hard luggage can be accurately identified, luggage sorting speed is improved, human resource costs are reduced, the breakage rate of passenger luggage is effectively reduced, the automated application of the luggage sorting system is enhanced, and energy is saved while efficiency is improved.
In some embodiments, the baggage sorting system may further include a sorting decision module 540.
In this embodiment, the sorting decision module 540 is connected between the soft and hard packet identifying module 530 and the baggage sorting mechanism 560, and the sorting decision module 540 is configured to determine the soft and hard packet type, the target size information, and the target pose information of the target baggage 501 when acquiring the plurality of first RGB images and the plurality of first point cloud information corresponding to the plurality of first RGB images one to one.
In this embodiment, the image acquisition module 520 acquires multiple sets of data, where each set of data includes a first RGB image and first point cloud information corresponding to the first RGB image, and the 3D point cloud processing module 532 and the 2D image processing module 531 respectively process each set of data to obtain type information, size information and pose information corresponding to each set of data.
In this embodiment, the sorting decision module 540 averages the first size information of the plurality of sets of data to the target size information in the basis of the identification data of the soft and hard packets of the target baggage 501; the sorting decision module 540 takes the mean value of the first pose information of the plurality of sets of data as the target pose information in the basis of the soft and hard packet identification data of the target baggage 501.
In this embodiment, the first type information, the first size information and the first pose information may be calculated from the first RGB image and the target point cloud information of each set of data, and a first soft and hard bag category of the target baggage 501 may be determined for each set of data; alternatively, the soft and hard bag category may be determined by calculating the target size information and the target pose information as mean values over the plurality of sets of data and combining them with the target type information.
The first soft and hard bag categories take one of two values, soft bag or hard bag, and the sorting decision module 540 may determine the soft and hard bag category of the target baggage 501 based on the mode, i.e. the most frequently occurring value, of the plurality of first soft and hard bag categories.
For example, the image acquisition module 520 acquires 5 sets of data for the target baggage 501, each set containing a first RGB image and corresponding first point cloud information. The 5 sets of data are resolved to obtain 5 pieces of target point cloud information, from which 5 pieces of first size information and 5 pieces of first pose information are calculated.
The sorting decision module 540 takes the average of the 5 pieces of first size information as the target size information and the average of the 5 pieces of first pose information as the target pose information.
The 5 sets of data collected for the target baggage 501 likewise yield 5 first soft and hard bag categories, for example hard bag, soft bag, hard bag, hard bag and hard bag; since the mode is hard bag, the sorting decision module 540 determines that the soft and hard bag category of the target baggage 501 is hard bag.
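The fusion rule used by the sorting decision module 540 in this example, i.e. averaging the per-frame size and pose measurements and taking the mode of the per-frame categories, can be sketched as follows; the function name and the sample values are illustrative only.

```python
from statistics import mean, mode

def fuse_multi_frame_results(sizes, poses, categories):
    """Fuse the results of several acquisitions of the same bag: average the
    first size and first pose information, and take the mode (most frequent
    value) of the first soft/hard bag categories."""
    target_size = tuple(mean(dim) for dim in zip(*sizes))
    target_pose = tuple(mean(dim) for dim in zip(*poses))
    bag_category = mode(categories)
    return target_size, target_pose, bag_category

# Five-frame example matching the case above (numeric values are made up)
sizes = [(700, 450, 260), (702, 449, 258), (698, 452, 261), (701, 450, 259), (699, 451, 262)]
poses = [(0.10, 0.02, 0.00), (0.11, 0.02, 0.01), (0.09, 0.01, 0.00), (0.10, 0.02, 0.00), (0.10, 0.03, 0.01)]
categories = ["hard", "soft", "hard", "hard", "hard"]
print(fuse_multi_frame_results(sizes, poses, categories))  # category resolves to "hard"
```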
In this embodiment, multiple sets of data are collected for the same target baggage 501, the soft and hard bag identification module 530 processes each set of data, and the sorting decision module 540 determines the final soft and hard bag type, target size information and target pose information of the target baggage 501, which effectively improves the identification accuracy.
The sorting decision module 540 outputs the soft and hard bag type, the target size information and the target pose information of the target baggage 501 to the baggage sorting mechanism 560, and the baggage sorting mechanism 560 performs the corresponding sorting operation, effectively reducing the breakage rate of passenger baggage.
In some embodiments, the baggage sorting system may further include a data transmission module 550.
In this embodiment, the data transmission module 550 is connected between the sorting decision module 540 and the baggage sorting mechanism 560 and implements communication between them according to a network transmission protocol. After confirming that the communication is valid, the data transmission module 550 outputs the soft and hard bag type, the target size information and the target pose information of the target baggage 501 to the baggage sorting mechanism 560.
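The embodiment does not fix a particular network transmission protocol, so the following is only a minimal sketch assuming a simple JSON-over-TCP exchange in which the baggage sorting mechanism 560 acknowledges receipt with an "ACK" reply; the message format, the acknowledgement convention and the function name are assumptions.

```python
import json
import socket

def send_sorting_result(host, port, category, size, pose, timeout=2.0):
    """Send the soft/hard bag category, target size and target pose to the
    sorting mechanism and treat the communication as valid only if the peer
    answers with an ACK (hypothetical protocol, for illustration only)."""
    payload = json.dumps({"category": category, "size": size, "pose": pose}).encode("utf-8")
    with socket.create_connection((host, port), timeout=timeout) as conn:
        conn.sendall(payload + b"\n")
        reply = conn.recv(16)
    return reply.strip() == b"ACK"
```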
In some embodiments, the baggage sorting system may further include a status monitoring module.
In this embodiment, the status monitoring module is coupled to at least one of the baggage conveying mechanism 510, the image acquisition module 520, the soft and hard bag identification module 530, the baggage sorting mechanism 560 and the sorting decision module 540.
The status monitoring module can acquire heartbeat data of the external equipment in the baggage conveying mechanism 510, the image acquisition module 520 and the baggage sorting mechanism 560 in real time, monitor the running state of the baggage sorting system, obtain status monitoring information of the baggage sorting system, and report it to an upper application system.
In actual implementation, the data transmission module 550 of the baggage sorting system may periodically send the status monitoring information of the baggage sorting system to the upper control system according to the network transmission protocol, so that an alarm can be raised promptly when a fault occurs.
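A status monitoring loop of this kind could look roughly like the sketch below, where each monitored device exposes the time of its last heartbeat and a report callback forwards the status information (and any fault alarm) to the upper application system; the data structures, period and timeout are illustrative assumptions.

```python
import time

def monitor_heartbeats(devices, report, period_s=1.0, timeout_s=3.0):
    """devices: mapping of device name -> callable returning the timestamp of
    its last heartbeat; report: callable pushing status information upstream.
    A device whose heartbeat is older than timeout_s is flagged as faulty."""
    while True:
        now = time.time()
        status = {name: ("online" if now - last_heartbeat() <= timeout_s else "fault")
                  for name, last_heartbeat in devices.items()}
        report({"status": status})  # periodic status push to the upper control system
        faulty = [name for name, state in status.items() if state == "fault"]
        if faulty:
            report({"alarm": faulty})  # timely alarm when a fault occurs
        time.sleep(period_s)
```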
In some embodiments, as shown in Fig. 7, an embodiment of the present application further provides an electronic device 700 including a processor 701, a memory 702, and a computer program stored in the memory 702 and executable on the processor 701. When executed by the processor 701, the program implements each process of the above baggage soft and hard bag identification method embodiment and can achieve the same technical effects; to avoid repetition, details are not repeated here.
The electronic device in the embodiment of the application includes the mobile electronic device and the non-mobile electronic device described above.
An embodiment of the present application further provides a non-transitory computer-readable storage medium on which a computer program is stored. When executed by a processor, the computer program implements each process of the above baggage soft and hard bag identification method embodiment and can achieve the same technical effects; to avoid repetition, details are not repeated here.
The processor is the processor in the electronic device described in the above embodiments. The readable storage medium includes a computer-readable storage medium such as a read-only memory (ROM), a random access memory (RAM), a magnetic disk or an optical disk.
An embodiment of the present application further provides a computer program product including a computer program which, when executed by a processor, implements the above baggage soft and hard bag identification method.
The processor is the processor in the electronic device described in the above embodiments.
An embodiment of the present application further provides a chip including a processor and a communication interface coupled to the processor. The processor is configured to run a program or instructions to implement each process of the above baggage soft and hard bag identification method embodiment, and can achieve the same technical effects; to avoid repetition, details are not repeated here.
It should be understood that the chip mentioned in the embodiments of the present application may also be referred to as a system-level chip, a system chip, a chip system or a system-on-chip.
It should be noted that, in this document, the terms "comprises", "comprising" or any other variation thereof are intended to cover a non-exclusive inclusion, such that a process, method, article or apparatus that comprises a list of elements includes not only those elements but may also include other elements not expressly listed or inherent to such process, method, article or apparatus. Without further limitation, an element preceded by "comprising a ..." does not exclude the presence of other identical elements in the process, method, article or apparatus that comprises the element.

Furthermore, it should be noted that the scope of the methods and apparatus in the embodiments of the present application is not limited to performing the functions in the order shown or discussed; depending on the functions involved, the functions may also be performed substantially simultaneously or in the reverse order. For example, the described methods may be performed in an order different from that described, and steps may be added, omitted or combined. In addition, features described with reference to certain examples may be combined in other examples.
From the above description of the embodiments, it will be clear to those skilled in the art that the methods of the above embodiments may be implemented by software plus a necessary general-purpose hardware platform, or by hardware, although in many cases the former is the preferred implementation. Based on such an understanding, the technical solutions of the present application, in essence or in the part contributing to the prior art, may be embodied in the form of a computer software product stored in a storage medium (such as a ROM/RAM, a magnetic disk or an optical disk) and comprising several instructions for causing a terminal (which may be a mobile phone, a computer, a server, a network device or the like) to perform the methods described in the embodiments of the present application.
The embodiments of the present application have been described above with reference to the accompanying drawings, but the present application is not limited to the above embodiments, which are merely illustrative rather than restrictive. In light of the present application, those of ordinary skill in the art may devise many other forms without departing from the spirit of the present application and the scope of the claims, and these also fall within the protection of the present application.
In the description of the present specification, reference to the terms "one embodiment," "some embodiments," "illustrative embodiments," "examples," "specific examples," or "some examples," etc., means that a particular feature, structure, material, or characteristic described in connection with the embodiment or example is included in at least one embodiment or example of the present application. In this specification, schematic representations of the above terms do not necessarily refer to the same embodiments or examples. Furthermore, the particular features, structures, materials, or characteristics described may be combined in any suitable manner in any one or more embodiments or examples.
While embodiments of the present application have been shown and described, it will be understood by those of ordinary skill in the art that: many changes, modifications, substitutions and variations may be made to the embodiments without departing from the principles and spirit of the application, the scope of which is defined by the claims and their equivalents.
Claims (12)
1. A method for identifying soft and hard bags of luggage, which is characterized by comprising the following steps:
acquiring a first RGB image and first point cloud information corresponding to the first RGB image, wherein the first RGB image comprises pixel information of target luggage to be identified;
based on the first RGB image, resolving the first point cloud information to obtain target point cloud information of the target luggage;
determining target type information, target size information and target pose information of the target baggage based on the first RGB image and the target point cloud information;
and determining the soft and hard bag type of the target luggage based on the target type information, the target size information and the target pose information.
2. The method for identifying soft and hard bags of luggage according to claim 1, wherein the resolving the first point cloud information based on the first RGB image to obtain the target point cloud information of the target luggage comprises:
performing target detection on the first RGB image to obtain the target type information of the target luggage and the target position information of the target luggage in the first RGB image;
and calculating the target point cloud information based on the target position information and the first point cloud information, wherein the target point cloud information is used for determining the target size information and the target pose information.
3. The method for identifying soft and hard bags of luggage according to claim 2, wherein the calculating the target point cloud information based on the target position information and the first point cloud information comprises:
performing foreground and background segmentation on the first point cloud information and resolving to obtain second point cloud information of the target luggage;
calculating third point cloud information of the target luggage based on the first point cloud information and the target position information;
and carrying out fusion processing on the second point cloud information and the third point cloud information to obtain the target point cloud information.
4. The method for identifying soft and hard bags of luggage according to claim 2, wherein the performing target detection on the first RGB image to obtain the target type information of the target luggage and the target position information of the target luggage in the first RGB image comprises:
inputting the first RGB image into a target detection model for target detection to obtain the target type information and the target position information output by the target detection model;
the target detection model is trained based on a sample luggage data set, and the sample luggage data set comprises a sample luggage image and label information corresponding to the sample luggage image.
5. The method for identifying soft and hard bags of luggage according to claim 1, wherein the acquiring a first RGB image and first point cloud information corresponding to the first RGB image comprises:
acquiring a plurality of first RGB images and a plurality of pieces of first point cloud information corresponding to the first RGB images one to one;
the resolving the first point cloud information based on the first RGB image to obtain target point cloud information of the target luggage comprises:
and resolving the plurality of first point cloud information in a one-to-one correspondence manner based on the plurality of first RGB images to obtain a plurality of target point cloud information.
6. The method for identifying soft and hard bags of luggage according to claim 5, wherein the target size information is determined based on a mean value of first size information corresponding to the plurality of pieces of target point cloud information, and the target pose information is determined based on a mean value of first pose information corresponding to the plurality of pieces of target point cloud information.
7. The method for identifying soft and hard bags of luggage according to claim 5, wherein the soft and hard bag category of the target luggage is determined based on a mode of a plurality of first soft and hard bag categories, the first soft and hard bag categories being determined based on the first RGB image and the target point cloud information corresponding to the first RGB image.
8. The method for identifying soft and hard bags of luggage according to any one of claims 1 to 7, wherein the determining the soft and hard bag type of the target luggage based on the target type information, the target size information and the target pose information comprises:
inputting the target type information, the target size information and the target pose information into a classification network model to obtain the soft and hard bag type of the target luggage output by the classification network model;
the classification network model is obtained through training based on a training sample set.
9. A luggage soft and hard bag identification device, comprising:
the acquisition module is used for acquiring a first RGB image and first point cloud information corresponding to the first RGB image, wherein the first RGB image comprises pixel information of target luggage to be identified;
the first processing module is used for resolving the first point cloud information based on the first RGB image to obtain target point cloud information of the target luggage;
the second processing module is used for determining target type information, target size information and target pose information of the target luggage based on the first RGB image and the target point cloud information;
and the third processing module is used for determining the soft and hard bag type of the target luggage based on the target type information, the target size information and the target pose information.
10. A baggage sorting system comprising:
A baggage transfer mechanism for transporting a target baggage;
the image acquisition module is arranged on the luggage conveying mechanism and is used for acquiring a first RGB image of the target luggage and first point cloud information corresponding to the first RGB image;
the soft and hard bag identification module is electrically connected with the image acquisition module and is used for determining the soft and hard bag type, the target size information and the target pose information of the target baggage based on the baggage soft and hard bag identification method according to any one of claims 1-8;
and the baggage sorting mechanism is electrically connected with the soft and hard bag identification module and is used for performing sorting operation on the target baggage based on the soft and hard bag type, the target size information and the target pose information of the target baggage.
11. The baggage sorting system of claim 10, further comprising:
the sorting decision module is connected between the soft and hard package identification module and the luggage sorting mechanism, and is used for determining the soft and hard package type, the target size information and the target pose information of the target luggage under the condition that a plurality of first RGB images and a plurality of first point cloud information which are in one-to-one correspondence with the first RGB images are acquired.
12. The baggage sorting system of claim 11, further comprising:
the state monitoring module is connected with at least one of the baggage conveying mechanism, the image acquisition module, the soft and hard bag identification module, the baggage sorting mechanism and the sorting decision module.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202310274377.5A CN116188961A (en) | 2023-03-20 | 2023-03-20 | Luggage soft and hard bag identification method and device and luggage sorting system |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202310274377.5A CN116188961A (en) | 2023-03-20 | 2023-03-20 | Luggage soft and hard bag identification method and device and luggage sorting system |
Publications (1)
Publication Number | Publication Date |
---|---|
CN116188961A (en) | 2023-05-30 |
Family
ID=86438534
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202310274377.5A (Pending) CN116188961A (en) | Luggage soft and hard bag identification method and device and luggage sorting system | 2023-03-20 | 2023-03-20 |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN116188961A (en) |
Similar Documents
Publication | Title
---|---
Li et al. | Automatic detection and classification system of domestic waste via multimodel cascaded convolutional neural network | |
CN111368852A (en) | Article identification and pre-sorting system and method based on deep learning and robot | |
CN106504233A (en) | Image electric power widget recognition methodss and system are patrolled and examined based on the unmanned plane of Faster R CNN | |
CN110378235A (en) | A kind of fuzzy facial image recognition method, device and terminal device | |
CN111783569B (en) | Luggage specification detection and personal bag information binding method of self-service consignment system | |
CN103390164A (en) | Object detection method based on depth image and implementing device thereof | |
CN107808126A (en) | Vehicle retrieval method and device | |
CN111597857B (en) | Logistics package detection method, device, equipment and readable storage medium | |
CN109092696A (en) | Sorting system and method for sorting | |
CN113927601B (en) | Method and system for realizing precise picking of mechanical arm based on visual recognition | |
CN103279760A (en) | Real-time classifying method of plant quarantine larvae | |
CN112893159A (en) | Coal gangue sorting method based on image recognition | |
CN108108703A (en) | Deceleration strip missing detection method, device and electronic equipment | |
CN104063720A (en) | Method for detecting images of prohibited commodities of e-commerce websites based on deep Boltzmann machine | |
Shankar et al. | A framework to enhance object detection performance by using YOLO algorithm | |
CN114419428A (en) | Target detection method, target detection device and computer readable storage medium | |
CN117735244A (en) | Box-type cargo intelligent inspection integrated system and method | |
CN116188961A (en) | Luggage soft and hard bag identification method and device and luggage sorting system | |
CN114255435A (en) | Method and device for detecting abnormality of transport device, electronic apparatus, and storage medium | |
CN112884392A (en) | Logistics piece management method and device, computing equipment and storage medium | |
TWM632230U (en) | Automatic plastic cup recycling and collection system architecture | |
CN114972967A (en) | Airplane part identification and counting method and detection system | |
CN114581803A (en) | Article identification processing method and device | |
Bhuyan et al. | Structure‐aware multiple salient region detection and localization for autonomous robotic manipulation | |
CN113012136A (en) | Airport luggage counting method and counting system based on target detection |
Legal Events
Code | Title
---|---
PB01 | Publication
SE01 | Entry into force of request for substantive examination