WO2021218792A1 - Package processing device, package processing method, electronic device, and storage medium

Package processing device, package processing method, electronic device, and storage medium

Info

Publication number
WO2021218792A1
Authority
WO
WIPO (PCT)
Prior art keywords
package
image
target
conveying
confidence
Application number
PCT/CN2021/089167
Other languages
English (en)
French (fr)
Inventor
刘腾澳
周萧
陶鹏
王春涛
丛强滋
Original Assignee
山东新北洋信息技术股份有限公司
Application filed by 山东新北洋信息技术股份有限公司 (Shandong New Beiyang Information Technology Co., Ltd.)
Publication of WO2021218792A1

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/70Determining position or orientation of objects or cameras
    • G06T7/73Determining position or orientation of objects or cameras using feature-based methods
    • G06T7/74Determining position or orientation of objects or cameras using feature-based methods involving reference images or patches
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B65CONVEYING; PACKING; STORING; HANDLING THIN OR FILAMENTARY MATERIAL
    • B65GTRANSPORT OR STORAGE DEVICES, e.g. CONVEYORS FOR LOADING OR TIPPING, SHOP CONVEYOR SYSTEMS OR PNEUMATIC TUBE CONVEYORS
    • B65G43/00Control devices, e.g. for safety, warning or fault-correcting
    • B65G43/08Control devices operated by article or material being fed, conveyed or discharged
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B65CONVEYING; PACKING; STORING; HANDLING THIN OR FILAMENTARY MATERIAL
    • B65GTRANSPORT OR STORAGE DEVICES, e.g. CONVEYORS FOR LOADING OR TIPPING, SHOP CONVEYOR SYSTEMS OR PNEUMATIC TUBE CONVEYORS
    • B65G47/00Article or material-handling devices associated with conveyors; Methods employing such devices
    • B65G47/22Devices influencing the relative position or the attitude of articles during transit by conveyors
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00Scenes; Scene-specific elements
    • G06V20/20Scenes; Scene-specific elements in augmented reality scenes
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00Scenes; Scene-specific elements
    • G06V20/60Type of objects
    • G06V20/64Three-dimensional objects
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B65CONVEYING; PACKING; STORING; HANDLING THIN OR FILAMENTARY MATERIAL
    • B65GTRANSPORT OR STORAGE DEVICES, e.g. CONVEYORS FOR LOADING OR TIPPING, SHOP CONVEYOR SYSTEMS OR PNEUMATIC TUBE CONVEYORS
    • B65G2201/00Indexing codes relating to handling devices, e.g. conveyors, characterised by the type of product or load being conveyed or handled
    • B65G2201/02Articles
    • B65G2201/0285Postal items, e.g. letters, parcels
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B65CONVEYING; PACKING; STORING; HANDLING THIN OR FILAMENTARY MATERIAL
    • B65GTRANSPORT OR STORAGE DEVICES, e.g. CONVEYORS FOR LOADING OR TIPPING, SHOP CONVEYOR SYSTEMS OR PNEUMATIC TUBE CONVEYORS
    • B65G2203/00Indexing code relating to control or detection of the articles or the load carriers during conveying
    • B65G2203/02Control or detection
    • B65G2203/0208Control or detection relating to the transported articles
    • B65G2203/0233Position of the article
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B65CONVEYING; PACKING; STORING; HANDLING THIN OR FILAMENTARY MATERIAL
    • B65GTRANSPORT OR STORAGE DEVICES, e.g. CONVEYORS FOR LOADING OR TIPPING, SHOP CONVEYOR SYSTEMS OR PNEUMATIC TUBE CONVEYORS
    • B65G2203/00Indexing code relating to control or detection of the articles or the load carriers during conveying
    • B65G2203/04Detection means
    • B65G2203/041Camera
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/10Image acquisition modality
    • G06T2207/10028Range image; Depth image; 3D point clouds

Definitions

  • This application relates to the field of logistics technology, specifically, for example, to a package processing device, a package processing method, an electronic device, and a storage medium.
  • the express delivery system of a logistics company usually includes multi-level conveying equipment.
  • the multi-level conveying equipment includes parcel processing equipment for separating parallel parcels.
  • The parcel processing equipment receives multiple packages conveyed by the upper-level equipment (such as an evacuation device) and processes the received packages (for example, separates and conveys them so that the packages leave the parcel processing equipment one by one to enter the next stage of conveying equipment).
  • the related art discloses a package processing equipment.
  • the package processing equipment includes a control device, a conveying device, and a plurality of 3D cameras arranged above the conveying device.
  • the conveying device includes a plurality of parallel conveyor belts.
  • The control device controls the multiple 3D cameras to capture images continuously, stitches the images acquired by the multiple 3D cameras to obtain an image of the packages on the conveying device, determines the position information of each package on the conveying device from the package image, and adjusts the positions of the packages according to that position information.
  • the purpose of this application includes providing a package processing device, a package processing method, an electronic device, and a storage medium, which can improve the accuracy of package location detection, and thereby improve the accuracy of package processing by the package processing device.
  • In a first aspect, an embodiment of the present application provides package processing equipment including a control device, a conveying device for conveying packages, and a first detection device and a second detection device arranged on the path along which the conveying device conveys packages. The first detection device, the second detection device, and the conveying device are all electrically connected to the control device; the first detection device is configured to obtain a three-dimensional image of the packages and the second detection device is configured to obtain a two-dimensional image of the packages. The control device is configured to: control the conveying device to convey packages and obtain a first image of the packages through the first detection device and a second image of the packages through the second detection device; determine a target first image from the first images and a target second image from the second images; and calculate a first confidence of each package in the target first image and a second confidence of each package in the target second image, and determine the position information of each package on the conveying device based on those confidences.
  • In a second aspect, an embodiment of the present application provides a package processing method applied to package processing equipment that includes a conveying device for conveying packages and a first detection device and a second detection device arranged on the path along which the conveying device conveys packages, the first detection device being configured to obtain a three-dimensional image of the packages and the second detection device being configured to obtain a two-dimensional image of the packages. The package processing method includes: controlling the conveying device to convey packages and obtaining a first image of the packages through the first detection device and a second image of the packages through the second detection device; determining a target first image from the first images and a target second image from the second images; and calculating a first confidence of each package in the target first image and a second confidence of each package in the target second image, and determining the position information of each package on the conveying device based on those confidences.
  • In a third aspect, an embodiment of the present application provides an electronic device including:
  • a memory configured to store a computer program; and
  • a processor configured to run the computer program to execute the package processing method described in the second aspect of the embodiments of the present application.
  • In a fourth aspect, an embodiment of the present application provides a computer-readable storage medium that stores a computer program configured to execute the package processing method described in the second aspect of the embodiments of the present application.
  • Figure 1 is a top view of a package processing device in an embodiment of the application
  • Figure 2 is a block diagram of a package processing device in an embodiment of this application.
  • Figure 3 is a top view of a package processing device in another embodiment of the application.
  • Figure 4 is a block diagram of the package processing equipment in another embodiment of the application.
  • FIG. 5 is a flowchart of a package processing method in an embodiment of the application.
  • FIG. 6 is a schematic diagram of the structure of an electronic device in an embodiment of the application.
  • Reference numerals: 010 - package processing equipment; 100 - conveying device; 110 - conveyor belt; 200 - first detection device; 210 - 3D camera; 300 - second detection device; 310 - 2D camera; 320 - first detection component; 321 - first sensor; 330 - second detection component; 331 - second sensor; 400 - control device; 020 - upper-level conveying equipment; 030 - lower-level conveying equipment.
  • In the package processing equipment of the related art, multiple 3D cameras are used to detect the positions of packages on the conveying device.
  • The inventor found that, in actual use, because a 3D camera has a certain detection error, the 3D camera usually cannot detect that a package is present on the conveyor when the package is low in height.
  • When a related-art 3D camera detects objects in its field of view, it obtains the coordinate data of each pixel in its field of view in the three dimensions of height, length, and width.
  • The coordinate data of all pixels in its field of view is sent to the control device in the form of a point cloud.
  • By analysing the point cloud data, the control device determines whether any pixels show an abrupt height change relative to a reference surface, and thereby determines whether an object is present on the reference surface.
  • When an object is present, its height, length, and width are calculated from the coordinate values of the object's pixels.
  • Because of this detection error, which grows with the distance from the package's upper surface to the 3D camera (for example, at a distance of 2 m a camera with a maximum error of ±2% can be off by up to ±4 cm), the control device may find no pixels whose height changes abruptly relative to the reference plane at the location of a low package (such as an envelope less than 4 cm high) and therefore fail to detect that the package is on the conveying device. The package position information obtained by the related-art equipment is consequently not accurate enough, and when package positions are adjusted according to that information, the accuracy of the adjustment may suffer.
  • For example, when the equipment is a parallel separator, if a missed package is conveyed in parallel with another package, the conveying device cannot separate the two parcels based on the detection result of the 3D camera, so the separation is missed.
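  • The following minimal sketch illustrates the height-jump check described above and why a thin item can be missed. The height grid, the reference plane at zero height, and the ±4 cm error band (taken from the 2 m / ±2% example) are illustrative assumptions, not the patent's implementation.

```python
# Sketch of the related-art check: flag pixels whose height rises above the
# reference plane by more than the camera's error band; report an object only
# where such pixels exist.
import numpy as np

def detect_objects(height_map, reference_height=0.0, error_band=0.04):
    """Return a boolean mask of pixels showing an abrupt height change."""
    return (height_map - reference_height) > error_band

surface = np.zeros((50, 50))
surface[10:20, 10:20] = 0.03      # a 3 cm envelope: below the error band, so it is missed
surface[30:40, 30:40] = 0.15      # a 15 cm box: detected
mask = detect_objects(surface)
print(mask[15, 15], mask[35, 35])  # False True -> the thin package goes undetected
```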
  • the embodiments of the present application provide a package processing device, a package processing method, an electronic device, and a storage medium.
  • The package processing equipment of the embodiments of the present application uses two kinds of detection devices to detect, respectively, a three-dimensional image and a two-dimensional image of the packages on the conveying device, and determines the actual position of each package on the conveying device according to the confidence of the packages in the images collected by the two detection devices.
  • the package processing device, package processing method, electronic device, and storage medium of the embodiments of the present application can improve the accuracy of package location detection, thereby providing a reliable basis for the package processing device to subsequently process packages (such as separating and transporting packages).
  • Fig. 1 is a top view of a package processing device 010 in an embodiment of this application
  • Fig. 2 is a block diagram of a package processing device 010 in an embodiment of this application.
  • the embodiment of the present application provides a package processing device 010, which is set in an express mail processing system to transport packages and obtain the relative position of each package on the package processing device 010.
  • Upper-level conveying equipment 020 is arranged upstream of the parcel processing equipment 010 to convey packages onto it, and lower-level conveying equipment 030 is arranged downstream of the parcel processing equipment 010 to receive the parcels conveyed from it.
  • The upper-level conveying equipment 020 may be an evacuation device that includes a plurality of conveyor belts directed toward the package processing device 010.
  • The lower-level conveying equipment 030 may be an edge-aligning device (used to bring all packages close to one side of the conveying path), a centering device (used to bring all packages close to the middle of the conveying path), a pulling device, or the like.
  • The package processing equipment 010 includes a conveying device 100, a first detection device 200, a second detection device 300, and a control device 400 electrically connected to the conveying device 100, the first detection device 200, and the second detection device 300.
  • The first detection device 200 and the second detection device 300 are configured to collect images of packages as they are transported on the conveying device 100 and to send the image information to the control device 400; the control device 400 is configured to control the conveying device 100 to convey the packages, to determine the position of each package from the image information, and to control the conveying device 100 to adjust the package positions according to those positions.
  • In this embodiment, the control device 400 controls the conveying device 100 to adjust the positions of parcels conveyed in parallel so that the packages on the conveying device 100 can be conveyed one after another from the conveying device 100 to the lower-level conveying equipment 030.
  • the conveying device 100 includes a plurality of conveying belts 110 arranged in parallel, and each conveying belt 110 extends along the conveying direction (the direction indicated by the arrow ab in the figure).
  • The parallel conveyor belts 110 are configured so that they can run at different conveying speeds.
  • In this embodiment, the conveying device 100 includes conveyor belts 110 arranged in M rows along the conveying direction and N columns along the width direction (the direction indicated by arrow cd in the figure), where M and N are both positive integers.
  • Parcels conveyed in parallel are usually supported by conveyor belts 110 in different columns; by setting the belts in the different columns that support two (or more) parallel parcels to different conveying speeds, the parallel parcels can be separated so that the two parcels become spaced apart in the conveying direction.
  • With the conveying device 100 provided by this embodiment, the relative position between two packages on the conveying device 100 can be adjusted by adjusting the speeds of different conveyor belts 110, which not only separates parcels conveyed in parallel but also adjusts the spacing between packages distributed along the conveying direction; a simple speed-assignment sketch follows below.
  • the conveying device 100 may also only include a plurality of conveying belts 110 arranged in parallel in the width direction of the conveying device 100, and the parcels conveyed in parallel can also be separated.
  • Different conveyor belts 110 should be driven by different driving parts (such as motors), and each driving part is electrically connected to the control device 400.
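  • As a rough illustration of the speed-assignment idea referenced above (not the patent's control algorithm), the sketch below gives the belt columns that carry side-by-side parcels different speeds so the parcels drift apart along the conveying direction. The Package structure, the base speed, the speed step, and the overlap threshold are all assumptions.

```python
# Hypothetical sketch: assign different speeds to the belt columns that carry
# parcels travelling side by side.
from dataclasses import dataclass

@dataclass
class Package:
    package_id: int
    column: int        # index of the belt column currently supporting the package
    y: float           # position along the conveying direction (m)

BASE_SPEED = 1.0       # m/s, nominal belt speed (assumed value)
SPEED_STEP = 0.2       # m/s, speed offset between parallel parcels (assumed value)

def column_speeds(packages, n_columns):
    """Return a speed per belt column that separates side-by-side parcels."""
    speeds = {c: BASE_SPEED for c in range(n_columns)}
    packages = sorted(packages, key=lambda p: p.y)
    for i, pkg in enumerate(packages):
        # Packages that overlap along the conveying direction count as parallel.
        parallel = [q for q in packages if q is not pkg and abs(q.y - pkg.y) < 0.3]
        if parallel:
            # Alternate between slowing down and speeding up the supporting column
            # so neighbouring parallel parcels receive different speeds.
            speeds[pkg.column] = BASE_SPEED + SPEED_STEP * (i % 2 * 2 - 1)
    return speeds

print(column_speeds([Package(1, 0, 2.0), Package(2, 2, 2.1)], n_columns=3))
# {0: 0.8, 1: 1.0, 2: 1.2} -> the two parallel parcels will separate
```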
  • the first detection device 200 may include at least one 3D camera 210 arranged above the conveying device 100.
  • the total field of view of the at least one 3D camera 210 may cover the conveying area of the conveying device 100.
  • By stitching the images collected by all the 3D cameras 210 at the same moment, a three-dimensional image of the conveying surface of the conveying device 100 and of the packages conveyed on it can be obtained.
  • Each 3D camera 210 of the first detection device 200 may be fixed above the conveying device 100 by a bracket.
  • In the embodiment of FIG. 1, the first detection device 200 includes nine 3D cameras 210 arranged in an array of 3 rows and 3 columns; the three-dimensional images collected by the nine 3D cameras 210 can be stitched to obtain a three-dimensional image of the entire conveying surface of the conveying device 100 and of all the packages conveyed on it.
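  • A minimal sketch of this tile-stitching step is given below, assuming the nine cameras return non-overlapping, equally sized height maps; real stitching would need calibration and overlap handling that the text does not describe.

```python
# Sketch: stitch a 3 x 3 grid of per-camera height maps into one height map of
# the whole conveying surface.
import numpy as np

def stitch_tiles(tiles):
    """tiles: 3x3 nested list of 2D height arrays, tiles[r][c] covering row r, column c."""
    return np.block(tiles)

tiles = [[np.full((4, 4), r * 3 + c, dtype=float) for c in range(3)] for r in range(3)]
print(stitch_tiles(tiles).shape)   # (12, 12): one map covering the whole surface
```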
  • the second detection device 300 includes at least one 2D camera 310.
  • The total field of view of the 2D camera(s) 310 of the second detection device 300 may cover the conveying area of the conveying device 100; by stitching the images collected by all the 2D cameras 310 at the same moment, a two-dimensional image of the conveying surface of the conveying device 100 and of the packages conveyed on it can be obtained.
  • the second detection device 300 in this embodiment includes a 2D camera 310 that can collect a complete image of the entire conveying area of the conveying device 100 and obtain image information of each package in the conveying area.
  • FIG. 3 is a top view of the package processing device 010 in another embodiment of the application
  • FIG. 4 is a block diagram of the package processing device 010 in another embodiment of the application.
  • In the embodiment of FIG. 3, the second detection device 300 includes a sensor array whose sensors are spaced along the width direction of the conveying device 100; the array collects one line of image data at a time.
  • Because the conveyor belts 110 move relative to the sensor array along the conveying direction while the conveying device 100 runs, a two-dimensional image of the conveying surface and of the packages conveyed on it can be obtained by stitching the multiple lines of image data collected by the sensor array over a preset time period.
  • the second detection device 300 includes a first detection component 320
  • the first detection component 320 includes a plurality of first sensors 321 arranged at intervals along the width direction of the conveying device 100 to form a sensor array.
  • Each first sensor 321 includes a first light generator and a first light receiver; along the height direction of the conveying device 100, the first light generator and the first light receiver of each first sensor 321 are arranged on opposite sides of the conveying plane of the conveying device 100.
  • Optionally, a gap is provided between two adjacent rows of conveyor belts 110 of the conveying device 100.
  • The first light generator and the first light receiver of each first sensor 321 are arranged on opposite sides of that gap, so that the light emitted by the first light generator can pass through the gap when no package blocks it.
  • While the conveying device 100 conveys packages, each first sensor 321 outputs a corresponding signal according to whether its first light receiver receives the light emitted by its first light generator.
  • The first detection component 320 outputs one line of dot-matrix data per collection period; the control device 400 continuously acquires the dot-matrix data output by the first detection component 320 and stitches the dot-matrix data received at the current moment with the dot-matrix data received within a preset time period before the current moment to generate image data.
  • the second detection device 300 further includes a second detection component 330, and the second detection component 330 includes a plurality of second sensors 331 arranged at intervals along the width direction of the conveying device 100, thereby forming another sensor array .
  • Each second sensor 331 includes a second light generator and a second light receiver.
  • Along the height direction of the conveying device 100, the second light generator and the second light receiver of each second sensor 331 are arranged on opposite sides of the conveying plane of the conveying device 100.
  • Like the first detection component 320, the second detection component 330 can also output one line of dot-matrix data per collection period.
  • The control device 400 continuously acquires the dot-matrix data output by the second detection component 330.
  • The dot-matrix data received at the current moment is stitched with the dot-matrix data received within a preset time period before the current moment to generate image data.
  • Along the package conveying direction, the second detection component 330 is located downstream of the first detection component 320, and the second light generator and the second light receiver of each second sensor 331 are likewise located on opposite sides of the gap between two adjacent rows of conveyor belts 110, so that the light emitted by the second light generator of the second sensor 331 can pass through the gap when it is not blocked by a package.
  • the image data acquired by the second detection component 330 may be used to correct the image acquired by the first detection component 320 to obtain a more accurate package image.
  • the first detection device 200, the second detection device 300, and the conveying device 100 are all electrically connected to the control device 400.
  • The control device 400 is configured to: control the conveying device 100 to convey packages and obtain a first image of the packages through the first detection device 200 and a second image of the packages through the second detection device 300; determine a target first image from the first images and a target second image from the second images; calculate a first confidence of each package in the target first image and a second confidence of each package in the target second image; and determine the position information of each package on the conveying device 100 based on the first confidence of each package in the target first image and the second confidence of each package in the target second image.
  • the package processing equipment in the related art usually uses a 3D camera, a 2D camera or a photoelectric sensor array to detect the position of the package on the conveying device.
  • the above-mentioned various detection devices have certain defects.
  • Each of these detection devices has certain shortcomings: a 3D camera cannot detect thin items of low height (for example, below 5 cm), and a 2D camera is sensitive to the colour of the illumination and cannot accurately detect a package lying in shadow.
  • A photoelectric sensor array has poor timeliness: it can only collect image data line by line and cannot accurately obtain the information of all packages in the entire conveying area at the same moment.
  • With the embodiments of the present application, the package processing equipment can obtain images collected by two different detection devices and determine the position information of each package actually present on the conveying device based on the confidence of each package in the corresponding image. Because the position information of each package actually present on the conveying device 100 is determined from the confidences of the packages in the images acquired by two different detection devices, the accuracy of package position detection can be improved.
  • In an embodiment, after determining the position information of each package on the conveying device 100, the control device 400 is further configured to control the conveying device 100, according to that position information, to separate multiple packages conveyed in parallel. Because the embodiments of the present application improve the accuracy of package position detection and thereby provide a reliable basis for the subsequent processing of packages (such as separating parcels conveyed in parallel), they improve the accuracy of package processing.
  • In an optional implementation, the operation in which the control device 400 "calculates the first confidence of each package in the target first image, calculates the second confidence of each package in the target second image, and determines the position information of each package on the conveying device 100 based on the first confidence of each package in the target first image and the second confidence of each package in the target second image" can include the following (a code sketch of this decision logic follows the list):
  • determining which packages in the target first image and the target second image can be matched with each other and which cannot;
  • for a package in the target first image that cannot be matched with any package in the target second image, calculating the first confidence of the package in the target first image and judging whether it is higher than a first preset value; if it is higher, determining the position information of the package from the target first image, and if it is not higher, discarding the package's data;
  • for a package in the target second image that cannot be matched with any package in the target first image, calculating the second confidence of the package in the target second image and judging whether it is higher than a second preset value; if it is higher, determining the position information of the package from the target second image, and if it is not higher, discarding the package's data;
  • for packages that can be matched between the target first image and the target second image, calculating the package's first confidence in the target first image and second confidence in the target second image and comparing them; when the first confidence is higher, determining the position information from the target first image, and when the second confidence is higher, determining it from the target second image. When the first confidence equals the second confidence, the position information may be determined from either the target first image or the target second image.
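  • The sketch below shows the decision logic of the list above in code form; it assumes the matching step has already paired detections and that each unmatched detection carries the confidence described in the text. The thresholds and dictionary layout are illustrative assumptions.

```python
# Minimal sketch of the confidence-fusion decision logic.
FIRST_PRESET = 0.5    # threshold for unmatched packages in the target first image (assumed)
SECOND_PRESET = 0.5   # threshold for unmatched packages in the target second image (assumed)

def fuse_positions(matched, only_in_first, only_in_second):
    """matched: list of (pkg_3d, pkg_2d) pairs; only_in_*: unmatched detections.
    Each detection is a dict with at least 'position' and 'confidence'."""
    positions = []

    # Packages seen in both target images: keep the position from the image in
    # which the package has the higher confidence (either image if equal).
    for pkg_3d, pkg_2d in matched:
        best = pkg_3d if pkg_3d["confidence"] >= pkg_2d["confidence"] else pkg_2d
        positions.append(best["position"])

    # Packages seen only in the target first (3D) image: keep them only if the
    # first confidence exceeds the first preset value, otherwise discard.
    for pkg in only_in_first:
        if pkg["confidence"] > FIRST_PRESET:
            positions.append(pkg["position"])

    # Packages seen only in the target second (2D) image: same rule with the
    # second preset value.
    for pkg in only_in_second:
        if pkg["confidence"] > SECOND_PRESET:
            positions.append(pkg["position"])

    return positions
```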
  • In an optional implementation, the control device 400 selects one of the target first image and the target second image as a set image.
  • Here the target first image is taken as the set image by way of example: for every package in the target first image, the control device queries one by one whether the package can be matched with any package in the target second image.
  • For a given package, the matching confidence between that package and each package in the target second image is calculated; when the matching confidence of two packages is higher than a set value, the two packages are determined to be matchable and a matching relationship is established between them, for example by marking the two packages with the same serial number. When the matching confidence between the package and every package in the target second image is not higher than the set value, it is determined that the package cannot be matched with any package in the target second image.
  • After all packages in the target first image have been queried, the control device counts the packages in the target first image that cannot be matched with any package in the target second image, traverses the packages in the target second image, and counts the packages in the target second image that have not been matched with any package in the target first image; both groups are determined to be packages for which no matching relationship can be established.
  • In an optional implementation, the control device 400 obtains parameters such as the length, width, contour, and coordinate position of each package in the target first image and the target second image, and calculates the matching confidence of two packages from these parameters of the two packages.
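  • One possible form of such a matching confidence is sketched below; the exponential-decay formula, the scale constants, and the set value are assumptions, since the patent names the parameters but not the formula.

```python
# Sketch of a matching confidence between one detection from the target first
# image and one from the target second image, based on position and size.
import math

def matching_confidence(a, b, pos_scale=0.2, size_scale=0.1):
    """a, b: dicts with 'x', 'y' (centre coordinates, m) and 'length', 'width' (m).
    Returns a score in (0, 1]; 1 means identical position and size."""
    pos_err = math.hypot(a["x"] - b["x"], a["y"] - b["y"])
    size_err = abs(a["length"] - b["length"]) + abs(a["width"] - b["width"])
    # Exponential decay: small position/size differences give a score near 1.
    return math.exp(-(pos_err / pos_scale) - (size_err / size_scale))

SET_VALUE = 0.6  # assumed matching threshold
a = {"x": 1.00, "y": 2.50, "length": 0.30, "width": 0.20}
b = {"x": 1.02, "y": 2.48, "length": 0.31, "width": 0.21}
print(matching_confidence(a, b) > SET_VALUE)   # True for near-identical detections
```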
  • For a package in the target first image that cannot be matched with any package in the target second image, an optional implementation is as follows: the control device 400 sets the package in the target first image as the target package and assumes that a package capable of matching it exists in the target second image.
  • From the coordinate position of the target package in the target first image, the control device determines the coordinate position that the hypothetical package would occupy in the target second image.
  • It then calculates the confidence of the hypothetical package in the target second image from that coordinate position, the distribution of packages around the hypothetical package in the target second image, and the installation position of the second detection device 300, and uses the calculated confidence as the confidence of the target package in the target first image.
  • When the second detection device 300 includes a 2D camera 310 (see FIG. 1 and FIG. 2), the 2D camera 310 is sensitive to the colour of the illumination, so when one package falls in the shadow of another package it will be missing from the second image collected by the 2D camera 310.
  • Accordingly, when the control device 400 calculates the confidence of the hypothetical package in the target second image from its assumed coordinate position, the distribution of packages around it, and the installation position of the 2D camera 310 of the second detection device 300, a high calculated confidence (for example, higher than the first preset value) indicates that the package was missing from the target second image because it fell into another package's shadow.
  • In that case the control device 400 judges the target package in the target first image to be a real package, determines its position information from the data of the target first image, and supplements the package's information into the target second image data.
  • Conversely, if the calculated confidence is low (for example, lower than or equal to the first preset value), the control device 400 judges that a detection error caused one package in the target first image to be incorrectly identified as two packages; it then determines that the target package does not exist in the target first image and discards the package's information from the target first image data.
  • Likewise, for a package in the target second image that cannot be matched with any package in the target first image, the control device 400 sets the package in the target second image as the target package and assumes that a package capable of matching it exists in the target first image.
  • From the coordinate position of the target package in the target second image, it determines the coordinate position of the hypothetical package in the target first image, calculates the confidence of the hypothetical package in the target first image from that coordinate position and the data of the target first image, and uses the calculated confidence as the confidence of the target package in the target second image.
  • When the first detection device 200 includes a 3D camera 210 and a package is low in height, the control device 400 may wrongly conclude from the point cloud data returned by the 3D camera 210 that the package does not exist in the first image.
  • Therefore, when the calculated confidence of the hypothetical package in the target first image is high (for example, higher than the second preset value), this is usually because the control device 400 finds, from the point cloud data returned by the 3D camera 210, that the height coordinates of the pixels at the hypothetical package's coordinate position change only slightly relative to the reference plane (for example, within the error fluctuation range).
  • In that case the control device 400 judges that the package was missing from the target first image because of its small height, and therefore judges the target package in the target second image to be a real package; it determines the position information of the target package from the data of the target second image and supplements the package's information into the target first image data. Conversely, if the calculated confidence is low (for example, lower than or equal to the second preset value), the control device 400 judges that a detection error caused one package in the target second image to be incorrectly identified as two packages; it then determines that the target package does not exist in the target second image and discards the package's information from the target second image data.
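  • The sketch below illustrates the point-cloud check just described: if the height deviations at the hypothesised package location all stay within the error band, a missed thin package is plausible. The array layout and the numeric error band are assumptions.

```python
# Sketch: decide whether the 3D data is consistent with a low package that was
# missed, by inspecting the height map at the hypothesised footprint.
import numpy as np

def thin_package_plausible(height_map, region, reference_height, error_band=0.04):
    """height_map: 2D array of heights (m) from the stitched point cloud.
    region: (row_slice, col_slice) covering the hypothesised package footprint.
    Returns True when no point in the region rises clearly above the reference plane."""
    patch = height_map[region]
    deviation = np.abs(patch - reference_height)
    return bool(np.all(deviation <= error_band))

height_map = np.full((100, 100), 0.00)            # flat reference plane
height_map[40:60, 40:60] += 0.02                  # a 2 cm envelope: inside the error band
print(thin_package_plausible(height_map, (slice(40, 60), slice(40, 60)), 0.0))  # True
```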
  • In each of the above cases, the control device calculates the confidence of a package in an image from the position information of the package in that image, the distribution of the packages around it, and the imaging principle of the detection device that produced the image, thereby obtaining the first confidence of the package in the target first image data and the second confidence of the package in the target second image.
  • For packages that can be matched, calculating the confidence in an image likewise needs to take the distribution of the surrounding packages into account.
  • Therefore, the control device 400 preferentially processes the packages that cannot be matched: when it determines that a package is missing from one image for an abnormal reason, it supplements the package's information into that image's data, and when it determines that a package was wrongly generated in an image by a detection error, it deletes the package's information from that image's data. In this way, the accuracy of the detected package distribution is improved, which in turn improves the accuracy of the final position information of the packages that can be matched.
  • In an optional implementation, the control device 400 determines the target first image and the target second image in the following manner.
  • The control device 400 may determine the acquisition time of an image from the time at which it receives the image, the time the corresponding detection device takes to collect an image, the time it takes to process an image, and the time it takes to upload an image to the control device 400.
  • Alternatively, the image data uploaded by a detection device to the control device 400 includes a time stamp identifying the acquisition time of the image, and the control device 400 determines the acquisition time of the image from that time stamp.
  • The control device 400 may determine one of the multiple second images acquired by the second detection device 300 as the target second image, and determine, among the multiple first images acquired by the first detection device 200, the image whose acquisition time has the smallest interval from that of the target second image as the target first image.
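  • A minimal sketch of this pairing rule is given below, assuming each image record carries an acquisition time in seconds; choosing the most recent second image as the target is an illustrative choice, not specified by the text.

```python
# Sketch: pick a target second image and pair it with the first image whose
# acquisition time is closest to it.
def pick_target_pair(first_images, second_images):
    """first_images / second_images: lists of dicts with an 'acq_time' field (s)."""
    target_second = second_images[-1]                      # e.g. the most recent second image
    target_first = min(first_images,
                       key=lambda img: abs(img["acq_time"] - target_second["acq_time"]))
    return target_first, target_second

firsts = [{"acq_time": 10.00}, {"acq_time": 10.12}, {"acq_time": 10.25}]
seconds = [{"acq_time": 10.05}, {"acq_time": 10.17}]
print(pick_target_pair(firsts, seconds))   # pairs the 10.12 first image with the 10.17 second image
```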
  • In an embodiment, the control device 400 is further configured to correct the positions of the packages in whichever of the target first image and the target second image has the earlier acquisition time, so that both images reflect the same target time.
  • With the package processing device of this embodiment, the position information carried by the target first image and the target second image is thus determined with respect to the same acquisition time, which avoids the same package appearing at different positions in the two images merely because their acquisition times differ, and thereby improves the accuracy of the position information determined for each package on the conveying device 100.
  • In an optional implementation, the control device 400 calculates the interval between the acquisition times of the target first image and the target second image and, from the positions of the packages in the earlier image and the conveying speed of each conveyor belt of the conveying device 100 during that interval, tracks and calculates where those packages are located at the target time.
  • When the conveyor belts 110 of the conveying device 100 move at the same speed during the interval, the displacement of each package along the conveying direction is calculated from the interval and the belt speed to determine the package's position at the target time; when the belts move at different speeds during the interval, the displacement of each package along the conveying direction and along the width direction of the conveying device 100 is calculated from the interval and the motion of the belt carrying that package, thereby determining the package's position at the target time.
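  • The sketch below shows the simplest case of this correction, with one constant speed per belt over the interval; the data layout is an assumption.

```python
# Sketch: advance each package position from its image's acquisition time to a
# common target time using the speed of the belt that carries it.
def correct_to_target_time(packages, acq_time, target_time, belt_speed_of):
    """packages: list of dicts with 'x' (width direction), 'y' (conveying direction)
    and 'belt' (index of the supporting belt). belt_speed_of: belt index -> m/s."""
    dt = target_time - acq_time
    corrected = []
    for pkg in packages:
        corrected.append({**pkg, "y": pkg["y"] + belt_speed_of[pkg["belt"]] * dt})
    return corrected

belt_speed = {0: 1.0, 1: 1.2}
pkgs = [{"x": 0.1, "y": 2.0, "belt": 0}, {"x": 0.6, "y": 2.0, "belt": 1}]
print(correct_to_target_time(pkgs, acq_time=10.12, target_time=10.17, belt_speed_of=belt_speed))
# the belt-0 package advances about 0.05 m, the belt-1 package about 0.06 m
```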
  • In an embodiment, the operation in which the control device 400 "calculates the first confidence of each package in the target first image and the second confidence of each package in the target second image and determines the position information of each package on the conveying device 100 based on those confidences" further includes determining the position information of each package on the conveying device 100 according to the package group information stored this time.
  • Abnormally missing packages are packages that go missing for reasons other than leaving the package processing equipment 010 through normal conveying.
  • In an optional implementation, the control device 400 stores the position information of all packages determined from the target first image and the target second image as package group information, and compares the package group information stored this time with the package group information stored last time to determine which packages are missing from the currently stored group relative to the previously stored group. It then calculates the interval between the two most recent executions of "determining the position information of the packages from the target first image and the target second image" and, from the position information of each package in the previously stored group and the conveying speed of each conveyor belt 110 of the conveying device 100 during that interval, determines whether each missing package left the package processing equipment 010 through normal conveying.
  • For an abnormally missing package, the control device 400 tracks and calculates its position information from its position in the previously stored package group information, the interval between the two position determinations, and the speed of each conveyor belt 110 of the conveying device 100 during that interval; it adds the position information of the abnormally missing package to the package group information stored this time and determines the position information of each package on the conveying device 100 according to the package group information stored this time.
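  • The sketch below illustrates the package-group comparison with a single-speed model; the conveyor length, package IDs, and data layout are assumptions used only to make the example runnable.

```python
# Sketch: a package present in the previously stored group but absent from the
# current group is abnormally missing unless normal conveying over the elapsed
# interval would already have carried it off the end of the conveyor.
CONVEYOR_LENGTH = 6.0   # m, assumed length of the conveying device

def abnormally_missing(prev_group, curr_group, interval, belt_speed_of):
    """prev_group / curr_group: package_id -> {'y': ..., 'belt': ...}."""
    missing = {}
    for pid, pkg in prev_group.items():
        if pid in curr_group:
            continue
        predicted_y = pkg["y"] + belt_speed_of[pkg["belt"]] * interval
        if predicted_y < CONVEYOR_LENGTH:
            # It should still be on the conveyor: treat it as abnormally missing
            # and re-insert it at its tracked (predicted) position.
            missing[pid] = {**pkg, "y": predicted_y}
    return missing

prev = {7: {"y": 5.0, "belt": 0}, 8: {"y": 1.0, "belt": 0}}
curr = {}                                      # neither package was detected this time
print(abnormally_missing(prev, curr, interval=1.5, belt_speed_of={0: 1.0}))
# package 7 left normally (5.0 + 1.5 > 6.0); package 8 is abnormally missing at y = 2.5
```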
  • acquiring the second image of the package by the second detection device includes:
  • a second image of the package is acquired through images collected by at least one 2D camera.
  • In an embodiment, the control device 400 can obtain the second image of the packages through the images collected by the at least one 2D camera 310.
  • In an optional embodiment, the second detection device 300 includes one 2D camera 310 whose field of view covers the entire conveying area of the conveying device 100, and the control device 400 determines each image acquired by that 2D camera 310 to be a second image.
  • In other optional embodiments, the second detection device 300 includes multiple 2D cameras 310 whose total field of view covers the entire conveying area of the conveying device 100.
  • In that case, the control device 400 stitches the images collected by the multiple 2D cameras 310 at the same moment to obtain the second image of the packages.
  • In an embodiment, acquiring the second image of the packages through the second detection device includes:
  • stitching the line of image data collected by the first detection component at a given moment with the multiple lines of image data collected within a preset time period before that moment to obtain the second image of the packages.
  • In an embodiment, the control device 400 is configured to obtain the second image of the packages through the second detection device 300 in the following manner:
  • the line of image data collected by the first detection component 320 at a given moment and the multiple lines of image data collected within the preset time period before that moment are stitched together to obtain the second image of the packages.
  • In this embodiment, the second image of the packages on the conveying device 100 is acquired through the photoelectric sensor array.
  • While the conveying device 100 conveys packages, each first sensor 321 determines whether its first light receiver can receive the light emitted by its first light generator and outputs a corresponding signal.
  • The first detection component 320 outputs one line of dot-matrix data per collection period, and the control device 400 continuously acquires the dot-matrix data output by the first detection component 320.
  • The second image is generated by stitching the dot-matrix data received at the current moment with the dot-matrix data received within a preset time period before the current moment.
  • In an embodiment, the control device 400 is further configured to correct the lines of image data collected within the preset time period before the current moment and to generate the second image based on the corrected result.
  • According to the speed of each conveyor belt 110 of the conveying device 100 during the preset time period before that moment, the control device 400 tracks and calculates the displacement of the package pixels in each line of image data (each line lying along the width direction of the conveying device 100), and corrects each line of image data collected within that period according to the tracking result, so that the generated second image truly reflects the state of the packages at the current moment.
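  • A minimal sketch of this line-stitching and speed correction follows, assuming a uniform belt speed and a fixed pixel pitch; the real equipment corrects per belt, as described above.

```python
# Sketch: shift each stored line of 0/1 sensor data by how far the belts have
# moved since it was captured, and stack the shifted lines into a 2D image that
# reflects the package layout at the current moment.
import numpy as np

def stitch_line_scans(rows, row_times, now, belt_speed, pixels_per_metre):
    """rows: list of 1D 0/1 arrays (one per collection period, oldest first).
    row_times: acquisition time of each row (s). Returns a 2D array whose row
    index corresponds to position along the conveying direction at time `now`."""
    width = len(rows[0])
    travel = [round((now - t) * belt_speed * pixels_per_metre) for t in row_times]
    height = max(travel) + 1
    image = np.zeros((height, width), dtype=np.uint8)
    for row, shift in zip(rows, travel):
        image[shift, :] = row        # rows landing on the same output line overwrite
    return image

rows = [np.array([0, 1, 1, 0]), np.array([0, 1, 1, 0]), np.array([0, 0, 0, 0])]
times = [0.00, 0.01, 0.02]
print(stitch_line_scans(rows, times, now=0.02, belt_speed=1.0, pixels_per_metre=100))
```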
  • the second detection device 300 of the package processing equipment 010 shown in FIG. 3 and FIG. 4 further includes a second detection component 330.
  • In that embodiment, the control device 400 is further configured to: stitch the line of image data collected by the second detection component 330 at a given moment with the multiple lines of image data collected within the preset time period before that moment to obtain a third image of the packages, and correct the positions of the packages in the second image according to the third image. Because acquiring the image of the packages on the conveying device 100 through a sensor array has poor timeliness, in this embodiment the first detection component 320 and the second detection component 330 both collect dot-matrix image data at their respective positions.
  • The control device 400 generates the final second image based on the detection results of both components.
  • Fig. 5 is a flowchart of a package processing method in an embodiment of the application.
  • the package processing method provided in the embodiment of the present application can be applied to the package processing device 010 provided in the embodiment of the present application.
  • the package processing method provided by the embodiment of the present application includes:
  • Step S100 controlling the conveying device to transport the package, and acquiring a first image of the package through the first detection device, and acquiring a second image of the package through the second detection device;
  • Step S200 determining the target first image from the first image and determining the target second image from the second image
  • Step S300 Calculate the first confidence of each package in the target first image in the target first image, and calculate the second confidence of each package in the target second image in the target second image, based on the target The first confidence level of each package in the first image and the second confidence level of each package in the target second image determine the location information of each package on the conveying device.
  • In an embodiment, step S300 may include:
  • determining which packages in the target first image and the target second image can be matched with each other and which cannot;
  • for a package in the target first image that cannot be matched with any package in the target second image, calculating the first confidence of the package in the target first image and judging whether it is higher than the first preset value; based on the judgment result that it is higher, determining the position information of the package from the target first image, and based on the judgment result that it is not higher, discarding the package's data;
  • for a package in the target second image that cannot be matched with any package in the target first image, calculating the second confidence of the package in the target second image and judging whether it is higher than the second preset value; based on the judgment result that it is higher, determining the position information of the package from the target second image, and based on the judgment result that it is not higher, discarding the package's data;
  • for packages that can be matched between the two target images, calculating and comparing the first confidence and the second confidence; based on the comparison result that the first confidence is higher than the second confidence, determining the position information of the package from the target first image, and based on the comparison result that the second confidence is higher than the first confidence, determining it from the target second image. Based on the comparison result that the first confidence equals the second confidence, the position information of the package can be determined from either the target first image or the target second image.
  • In an embodiment, step S300 may further include:
  • determining the position information of each package on the conveying device according to the package group information stored this time (see the handling of package group information and abnormally missing packages described above).
  • the package processing method may further include: controlling the conveying device to separate multiple packages conveyed in parallel according to the position information of each package on the conveying device.
  • FIG. 6 is a schematic diagram of the hardware structure of an electronic device provided by an embodiment. As shown in FIG. 6, the electronic device includes: one or more processors 610 and a memory 620. One processor 610 is taken as an example in FIG. 6.
  • the electronic device may further include: an input device 630 and an output device 640.
  • the processor 610, the memory 620, the input device 630, and the output device 640 in the electronic device may be connected through a bus or other methods.
  • the connection through a bus is taken as an example.
  • the memory 620 can be configured to store software programs, computer-executable programs, and modules.
  • the processor 610 executes multiple functional applications and data processing by running software programs, instructions, and modules stored in the memory 620 to implement any one of the methods in the foregoing embodiments.
  • the memory 620 may include a program storage area and a data storage area, where the program storage area may store an operating system and an application program required by at least one function; the data storage area may store data created according to the use of the electronic device, and the like.
  • The memory may include volatile memory, such as random access memory (RAM), and may also include non-volatile memory, such as at least one magnetic disk storage device, flash memory device, or other non-transitory solid-state storage device.
  • The memory 620 may be a non-transitory computer storage medium or a transitory computer storage medium.
  • The non-transitory computer storage medium may be, for example, at least one magnetic disk storage device, flash memory device, or other non-volatile solid-state storage device.
  • the memory 620 may optionally include memories remotely provided with respect to the processor 610, and these remote memories may be connected to the electronic device through a network. Examples of the aforementioned network may include the Internet, an intranet, a local area network, a mobile communication network, and combinations thereof.
  • the input device 630 may be configured to receive input digital or character information, and generate key signal input related to user settings and function control of the electronic device.
  • the output device 640 may include a display device such as a display screen.
  • This embodiment also provides a computer-readable storage medium that stores a computer program, and the computer program is used to execute the above method.
  • All or part of the processes in the methods of the above embodiments may be implemented by a computer program instructing the relevant hardware.
  • The program may be stored in a non-transitory computer-readable storage medium, and when the program is executed it may include the processes of the methods described above.
  • the non-transitory computer-readable storage medium may be a magnetic disk, an optical disk, a read-only memory (Read-Only Memory, ROM), or RAM, etc.
  • The package processing equipment, package processing method, electronic device, and storage medium provided by the embodiments of the present application can collect, through two different detection devices, a three-dimensional image and a two-dimensional image of the packages on the conveying device.
  • The position information of each package actually present on the conveying device is determined based on the confidence of each package in the corresponding image among the images collected by the two different detection devices. Because the embodiments determine the position information of each package actually present on the conveying device from these confidences, the accuracy of package position detection can be improved, which provides a reliable basis for the subsequent processing of packages by the package processing equipment (such as separating and conveying packages).

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Multimedia (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Mechanical Engineering (AREA)
  • Length Measuring Devices By Optical Means (AREA)
  • Sorting Of Articles (AREA)

Abstract

Embodiments of this application provide a package processing device, a package processing method, an electronic device, and a storage medium, relating to the field of logistics technology. The package processing device provided by the embodiments can collect a three-dimensional image and a two-dimensional image of the packages on the conveying device through two different detection devices, and determines the position information of each package actually present on the conveying device based on the confidence of each package in the corresponding image collected by the two different detection devices.

Description

包裹处理设备、包裹处理方法、电子设备及存储介质
本公开要求在2020年04月30日提交中国专利局、申请号为202010366800.0的中国专利申请的优先权,以上申请的全部内容通过引用结合在本公开中。
技术领域
本申请涉及物流技术领域,具体而言,例如涉及一种包裹处理设备、包裹处理方法、电子设备及存储介质。
背景技术
在物流公司的快件处理系统中通常包括多级输送设备,多级输送设备中包括用于分离并行包裹的包裹处理设备,包裹处理设备接收上一级设备(例如疏散设备)输送的多个包裹,并对接收的包裹进行处理(比如分离输送,使多个包裹一个接一个地离开包裹处理设备以进入下一级输送设备)。
相关技术公开了一种包裹处理设备,包裹处理设备包括控制装置、输送装置和设置于输送装置上方的多个3D相机,输送装置包括多条并行排列的输送带,控制装置通过控制多个3D相机持续拍照,将多个3D相机获取的图像进行拼接得到输送装置上包裹的图像,并根据包裹的图像确定输送装置上各个包裹的位置信息,并根据包裹的位置信息对包裹的位置进行调整。
然而,申请人发现,相关技术的包裹处理设备会因为获取的包裹位置信息不够准确,而导致包裹位置调整的准确性较差。
发明内容
本申请的目的包括提供一种包裹处理设备、包裹处理方法、电子设备及存储介质,能够提高包裹位置检测的准确性,进而提高包裹处理设备处理包裹的准确性。
本申请的实施例可以这样实现:
第一方面,本申请实施例提供一种包裹处理设备,包括控制装置、用于输送包裹的输送装置以及设置在所述输送装置输送包裹的路径上的第一检测装置和第二检测装置,所述第一检测装置、所述第二检测装置、所述输送装置均与所述控制装置电连接,其中,所述第一检测装置设置为获取包裹的三维图像,所述第二检测装置设置为获取包裹的二维图像,所述控制装置被设置为:
控制所述输送装置输送包裹,并通过所述第一检测装置获取包裹的第一图 像,通过所述第二检测装置获取包裹的第二图像;
从所述第一图像中确定目标第一图像以及从所述第二图像中确定目标第二图像;
计算所述目标第一图像中的每个包裹在所述目标第一图像中的第一置信度,以及计算所述目标第二图像中的每个包裹在所述目标第二图像中的第二置信度,基于所述目标第一图像中每个包裹的所述第一置信度和所述目标第二图像中每个包裹的所述第二置信度确定所述输送装置上每个包裹的位置信息。
第二方面,本申请实施例提供一种包裹处理方法,应用于包裹处理设备,所述包裹处理设备包括用于输送包裹的输送装置以及设置在所述输送装置输送包裹的路径上的第一检测装置和第二检测装置,其中,所述第一检测装置设置为获取包裹的三维图像,所述第二检测装置设置为获取包裹的二维图像,所述包裹处理方法包括:
控制所述输送装置输送包裹,并通过所述第一检测装置获取包裹的第一图像,通过所述第二检测装置获取包裹的第二图像;
从所述第一图像中确定目标第一图像以及从所述第二图像中确定目标第二图像;
计算所述目标第一图像中的每个包裹在所述目标第一图像中的第一置信度,以及计算所述目标第二图像中的每个包裹在所述目标第二图像中的第二置信度,基于所述目标第一图像中每个包裹的所述第一置信度和所述目标第二图像中每个包裹的所述第二置信度确定所述输送装置上每个包裹的位置信息。
第三方面,本申请实施例提供一种电子设备,包括:
处理器;
存储器,设置为存储计算机程序,
所述处理器被设置为运行所述计算机程序以执行本申请实施例第二方面所述的包裹处理方法。
第四方面,本申请实施例提供一种计算机可读存储介质,所述计算机可读存储介质存储有计算机程序,所述计算机程序被设置为执行本申请实施例第二方面所述的包裹处理方法。
附图说明
为了说明本申请实施例的技术方案,下面将对实施例中所需要使用的附图进行介绍,应当理解的是,以下附图仅示出了本申请的某些实施例,对于本领 域普通技术人员来讲,在不付出创造性劳动的前提下,还可以根据这些附图获得其他相关的附图。
图1为本申请一种实施例中包裹处理设备的俯视图;
图2为本申请一种实施例中包裹处理设备的组成框图;
图3为本申请另一种实施例中包裹处理设备的俯视图;
图4为本申请另一种实施例中包裹处理设备的组成框图;
图5为本申请实施例中包裹处理方法的流程图;
图6为本申请一种实施例中电子设备的结构示意图。
图标:010-包裹处理设备;100-输送装置;110-输送带;200-第一检测装置;210-3D相机;300-第二检测装置;310-2D相机;320-第一检测组件;321-第一传感器;330-第二检测组件;331-第二传感器;400-控制装置;020-上级输送设备;030-下级输送设备。
具体实施方式
对于相关技术的包裹处理设备,通过多个3D相机检测输送装置上包裹的位置,然而,发明人发现,实际使用过程中,由于3D相机存在一定的检测误差,当包裹的高度较低时,3D相机通常检测不到输送装置上存在该包裹。例如,相关技术的3D相机在检测其视场内的物体时,获取其视场内每个像素点在高度、长度和宽度三个维度上的坐标数据,并将其视场内所有像素点的坐标数据以点云的形式发送给控制装置,控制装置通过解析点云数据确定是否存在相对于一个基准面发生高度突变的像素点,从而判定基准面上是否有物体,以及在判定基准面上有物体时通过该物体的各个像素点的坐标值计算该物体的高度、长度和宽度。发明人发现,在实际使用3D相机检测输送装置上的包裹时,由于3D相机存在一定的检测误差,因此当包裹的上表面到3D相机的第一距离的值较大时,3D相机检测误差的最大值就会较大,例如,当第一距离达到2m时,如果3D相机具有最大±2%的检测误差,则该检测误差的最大值就会达到±4cm,在这种情况下,如果包裹的高度较小,例如包裹为信封件,其高度小于4cm,则控制装置根据3D相机返回的点云数据,在该包裹所处的位置可能就检测不到存在相对于基准面发生高度突变的像素点,从而检测不到输送装置上存在该包裹,也即,出现包裹漏检,因此,相关技术的包裹处理设备存在获取的包裹位置信息不够准确的问题,在包裹处理设备根据所获取的包裹位置信息对包裹的位置进行调整的情况下,这就可能导致包裹位置调整的准确性较差的问题。例如, 在包裹处理设备为对并行输送的包裹进行分离的并行分离器的情况下,如果被漏检的包裹与其他包裹并行输送,基于3D相机的检测结果输送装置就不能够对这两个包裹进行分离处理,从而造成分离遗漏的情况。
因此,本申请实施例提供一种包裹处理设备、包裹处理方法、电子设备及存储介质。本申请实施例的包裹处理设备采用两种检测装置分别检测输送装置上包裹的三维图像和二维图像,并根据两种检测装置所采集的图像中包裹的置信度来确定每个包裹在输送装置上的实际位置。通过本申请实施例的包裹处理设备、包裹处理方法、电子设备及存储介质,能够提高包裹位置检测的准确性,从而为包裹处理设备后续处理包裹(比如分离并行输送的包裹)提供可靠的依据。
下面将结合本申请实施例中的附图,对本申请实施例中的技术方案进行描述。通常在此处附图中描述和示出的本申请实施例的组件可以以各种不同的配置来布置和设计。
应注意到:相似的标号和字母在下面的附图中表示类似项,因此,一旦某一项在一个附图中被定义,则在随后的附图中不需要对其进行进一步定义和解释。在本申请的描述中,需要说明的是,若出现术语“上”、“下”、“内”、“外”等指示的方位或位置关系为基于附图所示的方位或位置关系,或者是该发明产品使用时惯常摆放的方位或位置关系,仅是为了便于描述本申请和简化描述,而不是指示或暗示所指的装置或元件必须具有特定的方位、以特定的方位构造和操作。
此外,若出现术语“第一”、“第二”、“第三”等仅用于区分描述,而不能理解为指示或暗示相对重要性。
图1为本申请一种实施例中包裹处理设备010的俯视图;图2为本申请一种实施例中包裹处理设备010的组成框图。本申请实施例提供了一种包裹处理设备010,该包裹处理设备010设置于快件处理系统中,用于输送包裹,并获得各个包裹在包裹处理设备010上的相对位置。如图1和图2中所示,包裹处理设备010的上游设置有上级输送设备020,用于将包裹输送到包裹处理设备010上;包裹处理设备010的下游设置有下级输送设备030,用于承接包裹处理设备010输送而来的包裹。在本实施例中,上级输送设备020可以为疏散设备,疏散设备包括多条指向包裹处理设备010的输送带;下级输送设备030可以为靠边设备(用于使包裹全部靠近输送路径的一侧)、居中设备(用于使包裹全部靠近输送路径的中间)或者拉包设备等。
在本申请实施例中,包裹处理设备010包括输送装置100、第一检测装置200、第二检测装置300以及与输送装置100、第一检测装置200、第二检测装置300均电连接的控制装置400。第一检测装置200和第二检测装置300设置为采集包裹在输送装置100上被运输时的图像,并将图像信息发送给控制装置400,控制装置400设置为控制输送装置100输送包裹,以及根据包裹的图像信息确定各个包裹的位置,并根据包裹的位置,控制输送装置100对包裹的位置进行调整。
在本实施例中,控制装置400控制输送装置100对并行输送的包裹的位置进行调整,以使输送装置100上的各个包裹能够依次从输送装置100输送到下级输送设备030。
输送装置100包括多条并行排列的输送带110,每条输送带110沿输送方向(图中箭头ab所指示的方向)延伸。多条并行排列的输送带110被设置为能够具有不同的输送速度。在本实施例中,输送装置100包括在输送方向上呈M行,在宽度方向(图中箭头cd所指示的方向)上呈N列的多条输送带110,其中,M和N均为正整数。当存在并行输送的包裹时,并行输送的包裹往往被不同列的输送带110支撑。因此,通过将支撑并行输送的两个(或两个以上)包裹的不同列的输送带110设置为具有不同的输送速度,则可以对并行输送的包裹进行分离,使两个包裹具有在输送方向上的间隔。
另外,由于设置了在输送方向上排列的多行输送带110,通过将不同行的输送带110的输送速度设置为不同,便可以使在输送方向上间隔的包裹具有不同的速度,从而在输送方向上相互靠近或者远离。因此,采用本申请实施例提供的输送装置100,能够通过调节不同的输送带110的输送速度,来调整输送装置100上两个包裹之间的相对位置,不仅可以实现并行输送的包裹的分离,也能够调整在输送方向上分布的包裹之间的间距。应当理解,相邻两行输送带110以及相邻两列输送带110之间可能会具有间隙,该间隙应当以包裹不会从该间隙落下为准。在本申请其他实施例中,输送装置100也可以仅仅包括在输送装置100的宽度方向上并行排列的多个输送带110,也能够实现将并行输送的包裹进行分离。不同的输送带110应分别由不同的驱动件(比如电机)来驱动,各个驱动件与控制装置400电连接。
在本实施例中,第一检测装置200可以包括设置于输送装置100上方的至少一个3D相机210,至少一个3D相机210的总视场范围可以覆盖输送装置100的输送区域,所有3D相机210在同一时刻采集的图像进行拼接后能够得到输送 装置100的输送面以及输送面上输送的包裹的三维图像。第一检测装置200的每个3D相机210可以通过支架固定在输送装置100的上方。在图1的实施例中,第一检测装置200包括以3行3列的形式阵列设置的9个3D相机210,9个3D相机210所采集的三维图像可以拼接得到输送装置100的整个输送面以及输送面上输送的所有包裹的三维图像。
可选的,第二检测装置300包括至少一个2D相机310,第二检测装置300的2D相机310的总视场范围可以覆盖输送装置100的输送区域,所有2D相机310在同一时刻采集的图像进行拼接后能够得到输送装置100的输送面以及输送面上输送的包裹的二维图像。如图1所示,该实施例中第二检测装置300包括一个2D相机310,该2D相机310能够采集输送装置100的整个输送区域的完整图像,并获取输送区域中各个包裹的图像信息。当然,在第二检测装置300仅包括一个2D相机310的情况下,则无需通过拼接来获得输送装置100的整个输送区域的完整图像。
图3为本申请另一种实施例中包裹处理设备010的俯视图;图4为本申请另一种实施例中包裹处理设备010的组成框图。可选的,在图3所示的实施例中,第二检测装置300包括沿输送装置100的宽度方向间隔排布的传感器阵列,传感器阵列每次能够采集一点行的图像数据,在这种情况下,由于输送装置100运行时多条输送带110相对于传感器阵列沿包裹输送方向移动,因此,通过将传感器阵列在一预设的时间段内采集的多点行图像数据进行拼接后能够得到输送装置100的输送面以及输送面上输送的包裹的二维图像。
在图3实施例中,第二检测装置300包括第一检测组件320,第一检测组件320包括多个沿输送装置100的宽度方向间隔排布的第一传感器321,从而形成一个传感器阵列。每个第一传感器321包括第一光发生器和第一光接收器,且沿输送装置100的高度方向,每个第一传感器321的第一光发生器和第一光接收器分别设置于输送装置100的输送平面的两侧。可选的,输送装置100的相邻两行输送带110之间设置有间隙,沿输送装置100的高度方向,第一传感器321的第一光发生器和第一光接收器分别设置于间隙的两侧,在没有包裹遮挡的情况下,第一传感器321的第一光发生器发出的光可通过该间隙。在输送装置100输送包裹的过程中,每个第一传感器321根据其第一光接收器是否能接收到其第一光发生器发出的光而输出对应的信号,第一检测组件320在每个采集周期内输出一点行点阵数据,控制装置400持续获取第一检测组件320输出的点阵数据,并将当前时刻接收的点阵数据与在当前时刻之前的预设时间段内接收 的点阵数据进行拼接生成图像数据。
在图3实施例中,第二检测装置300还包括第二检测组件330,第二检测组件330包括多个沿输送装置100的宽度方向间隔排布的第二传感器331,从而形成另一个传感器阵列。每个第二传感器331包括第二光发生器和第二光接收器,沿输送装置100的高度方向,第二传感器331的第二光发生器和第二光接收器分别设置于输送装置100的输送平面的两侧。与第一检测组件320类似,第二检测组件330也能够在每个采集周期内输出一点行点阵数据,控制装置400持续获取第二检测组件330输出的点阵数据,并将当前时刻接收的点阵数据与在当前时刻之前的预设时间段内接收的点阵数据进行拼接生成图像数据。沿包裹的输送方向,第二检测组件330位于第一检测组件320的下游,并且第二传感器331的第二光发生器和第二光接收器也分别位于相邻的两行输送带110之间的间隙的两侧,在没有包裹遮挡的情况下,第二传感器331的第二光发生器发出的光可通过该间隙。第二检测组件330获取的图像数据,可以用于对第一检测组件320获取的图像进行修正,以得到更加精准的包裹图像。
在本申请各个实施例中,第一检测装置200、第二检测装置300、输送装置100均与控制装置400电连接。控制装置400被设置为:
控制输送装置100输送包裹,并通过第一检测装置200获取包裹的第一图像,通过第二检测装置300获取包裹的第二图像;从第一图像中确定目标第一图像以及从第二图像中确定目标第二图像;计算目标第一图像中的每个包裹在目标第一图像中的第一置信度,以及计算目标第二图像中的每个包裹在目标第二图像中的第二置信度,基于目标第一图像中每个包裹的第一置信度和目标第二图像中每个包裹的第二置信度确定输送装置100上每个包裹的位置信息。
相关技术中的包裹处理设备通常通过3D相机、2D相机或者光电传感器阵列检测输送装置上包裹的位置，上述各种检测装置均存在一定的缺陷，例如，3D相机不能检测到高度较低（例如低于5cm）的薄件，2D相机对于光照颜色较为敏感，不能准确检测到处于阴影中的包裹，光电传感器阵列的检测时效较差，其仅能逐点行采集图像数据，不能准确地获取同一时刻整个输送区域内包裹的信息。通过本申请实施例，包裹处理设备可以获取两种不同的检测装置采集的图像，基于两种不同的检测装置采集的图像中每个包裹在对应的图像中的置信度确定输送装置上实际存在的每个包裹的位置信息。由于本申请实施例中，包裹处理设备基于通过两种不同的检测装置获取的图像中每个包裹在对应的图像中的置信度确定输送装置100上实际存在的每个包裹的位置信息，因此能够提高包裹位置检测的准确性。
在一实施例中,在确定输送装置100上每个包裹的位置信息之后,控制装置400还被设置为:根据输送装置100上每个包裹的位置信息控制输送装置100对并行输送的多个包裹进行分离处理。由于本申请实施例能够提高包裹位置检测的准确性,从而为包裹处理设备后续处理包裹(比如分离并行输送的包裹)提供可靠的依据,因此,本申请实施例能够提高包裹处理的准确性。
在可选的实施方式中,控制装置400执行的“计算目标第一图像中的每个包裹在目标第一图像中的第一置信度,以及计算目标第二图像中的每个包裹在目标第二图像中的第二置信度,基于目标第一图像中每个包裹的第一置信度和目标第二图像中每个包裹的第二置信度确定输送装置100上每个包裹的位置信息”的操作,可以包括:
确定目标第一图像和目标第二图像中能够建立匹配关系的包裹和不能建立匹配关系的包裹;
对于目标第一图像中不能与目标第二图像中的任意一个包裹建立匹配关系的包裹,计算包裹在目标第一图像中的第一置信度,判断包裹在目标第一图像中的第一置信度是否高于第一预设值,基于包裹在目标第一图像中的第一置信度高于第一预设值的判断结果,根据目标第一图像确定包裹的位置信息,基于包裹在目标第一图像中的第一置信度不高于第一预设值的判断结果,丢弃包裹的数据;
对于目标第二图像中不能与目标第一图像中的任意一个包裹建立匹配关系的包裹,计算包裹在目标第二图像中的第二置信度,判断包裹在目标第二图像中的第二置信度是否高于第二预设值,基于包裹在目标第二图像中的第二置信度高于第二预设值的判断结果,根据目标第二图像确定包裹的位置信息,基于包裹在目标第二图像中的第二置信度不高于第二预设值的判断结果,丢弃包裹的数据;
对于目标第一图像和目标第二图像中能够建立匹配关系的包裹,计算包裹在目标第一图像中的第一置信度和包裹在目标第二图像中的第二置信度,比较包裹的第一置信度和第二置信度,基于第一置信度高于第二置信度的比较结果,根据目标第一图像确定包裹的位置信息,基于第二置信度高于第一置信度的比较结果,根据目标第二图像确定包裹的位置信息。
在一实施例中，基于第一置信度等于第二置信度的比较结果，可以根据目标第一图像确定包裹的位置信息，也可以根据目标第二图像确定包裹的位置信息。下面结合一个示意性代码草图对该判定流程作进一步说明。
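该判定流程的一个示意性Python代码草图如下，其中包裹的数据结构、第一预设值t1与第二预设值t2的取值均为便于说明而假设的，并非本申请实施例的实际实现：

```python
def resolve_positions(first_pkgs, second_pkgs, matches, t1=0.5, t2=0.5):
    """根据匹配关系与置信度确定各包裹最终采用哪幅图像中的位置（示意实现）。

    first_pkgs / second_pkgs: {包裹编号: {"pos": (x, y), "conf": 置信度}}
    matches: [(目标第一图像中的包裹编号, 目标第二图像中的包裹编号), ...]
    """
    results = {}
    matched_1 = {a for a, _ in matches}
    matched_2 = {b for _, b in matches}

    # 仅出现在目标第一图像中的包裹：第一置信度高于第一预设值t1时保留，否则丢弃
    for pid, info in first_pkgs.items():
        if pid not in matched_1 and info["conf"] > t1:
            results[("first", pid)] = info["pos"]

    # 仅出现在目标第二图像中的包裹：第二置信度高于第二预设值t2时保留，否则丢弃
    for pid, info in second_pkgs.items():
        if pid not in matched_2 and info["conf"] > t2:
            results[("second", pid)] = info["pos"]

    # 能建立匹配关系的包裹：取置信度较高的一幅图像中的位置
    for a, b in matches:
        if first_pkgs[a]["conf"] >= second_pkgs[b]["conf"]:
            results[("matched", a)] = first_pkgs[a]["pos"]
        else:
            results[("matched", a)] = second_pkgs[b]["pos"]
    return results

# 用法示例（数据均为虚构）：包裹2因置信度过低被丢弃，包裹1取第一图像中的位置
first = {1: {"pos": (1.00, 0.40), "conf": 0.9}, 2: {"pos": (2.00, 0.80), "conf": 0.3}}
second = {"a": {"pos": (1.02, 0.41), "conf": 0.7}, "b": {"pos": (3.00, 0.20), "conf": 0.8}}
print(resolve_positions(first, second, matches=[(1, "a")]))
```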
在一种可选的实施方式中,控制装置400选择目标第一图像和目标第二图像中的一幅设定的图像,例如,此处以设定的图像为目标第一图像为例进行描述,对于目标第一图像中的所有包裹,逐一查询每个包裹是否能够与目标第二图像中的任意一个包裹建立匹配关系。可选的,对于一个包裹,计算该包裹与目标第二图像中每个包裹的匹配置信度,当两个包裹的匹配置信度高于设定值时,确定这两个包裹为能够建立匹配关系的包裹,并建立这两个包裹之间的匹配关系,例如,通过将这两个包裹标记为相同的序号建立这两个包裹的匹配关系;当该包裹与目标第二图像中每个包裹的匹配置信度均不高于设定值时,确定该包裹不能与目标第二图像中的任意一个包裹建立匹配关系。当对目标第一图像中所有包裹查询结束后,统计目标第一图像中不能与目标第二图像中任意一个包裹建立匹配关系的包裹,以及遍历目标第二图像中的包裹,统计目标第二图像中没有与目标第一图像中任意一个包裹建立匹配关系的包裹,将目标第一图像中不能与目标第二图像中任意一个包裹建立匹配关系的包裹,以及目标第二图像中没有与目标第一图像中任意一个包裹建立匹配关系的包裹确定为不能建立匹配关系的包裹。
在一种可选的实施方式中,控制装置400获取目标第一图像和目标第二图像中每个包裹的长度、宽度、轮廓以及坐标位置等参数,根据两个包裹的上述参数计算这两个包裹的匹配置信度。
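下面给出一个根据两幅图像中包裹的坐标位置与长宽计算匹配置信度的示意性Python代码草图。其中的相似度计算方式、权重与尺度参数均为便于说明而假设的一种可能实现，轮廓等其他参数可按类似方式加入：

```python
def match_confidence(p, q, w_pos=0.5, w_size=0.5, pos_scale=0.3):
    """根据坐标位置与长宽计算两个包裹的匹配置信度（0~1，示意实现）。

    p、q: {"x":..., "y":..., "length":..., "width":...}，单位假设为米。
    """
    # 位置相似度：中心距离越小越接近1
    dist = ((p["x"] - q["x"]) ** 2 + (p["y"] - q["y"]) ** 2) ** 0.5
    pos_sim = max(0.0, 1.0 - dist / pos_scale)
    # 尺寸相似度：长、宽的相对偏差越小越接近1
    len_sim = 1.0 - abs(p["length"] - q["length"]) / max(p["length"], q["length"])
    wid_sim = 1.0 - abs(p["width"] - q["width"]) / max(p["width"], q["width"])
    size_sim = (len_sim + wid_sim) / 2
    return w_pos * pos_sim + w_size * size_sim

a = {"x": 1.00, "y": 0.40, "length": 0.30, "width": 0.20}
b = {"x": 1.03, "y": 0.41, "length": 0.29, "width": 0.21}
print(round(match_confidence(a, b), 3))  # 结果高于设定值（例如0.8）时建立匹配关系
```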
对于目标第一图像中不能与目标第二图像中的任意一个包裹建立匹配关系的包裹，计算该包裹在目标第一图像中的置信度。一种可选的实施方式中，控制装置400将目标第一图像中的该包裹设定为目标包裹，并假设目标第二图像中存在能够与该目标包裹建立匹配关系的包裹，并根据该目标包裹在目标第一图像中的坐标位置确定该假设的包裹在目标第二图像中的坐标位置，根据该假设的包裹的坐标位置、目标第二图像中该假设的包裹周围包裹的分布情况以及第二检测装置300的安装位置计算该假设的包裹在目标第二图像中的置信度，并将计算所得的置信度作为该目标包裹在目标第一图像中的置信度。可选的，当第二检测装置300包括2D相机310时（参考图1、图2），由于2D相机310对光照颜色比较敏感，因此，当一个包裹落在另一个包裹的阴影中时，该包裹会在2D相机310采集的第二图像中发生缺失，因此，控制装置400根据上述假设的包裹的坐标位置、该假设的包裹周围包裹的分布情况以及第二检测装置300的2D相机310的安装位置计算该假设的包裹在目标第二图像中的置信度时，如果计算所得的置信度较高（如高于第一预设值），则可以确定该包裹由于落在了另一个包裹的阴影中而在目标第二图像中发生了缺失，因此，控制装置400判定目标第一图像中的目标包裹为真实存在的包裹，此时，控制装置400根据目标第一图像的数据确定目标包裹的位置信息，并在目标第二图像数据中补入该包裹的信息，反之，在计算该假设的包裹在目标第二图像中的置信度时，如果计算所得的置信度较低（如低于或等于第一预设值），则控制装置400判定由于检测误差错误地将目标第一图像中的一个包裹判定为了两个包裹，此时，控制装置400确定目标包裹在目标第一图像中不存在，丢弃目标第一图像数据中该包裹的信息。
同理,对于目标第二图像中不能与目标第一图像中的任意一个包裹建立匹配关系的包裹,计算该包裹在目标第二图像中的置信度。一种可选的实施方式中,控制装置400将目标第二图像中的该包裹设定为目标包裹,并假设目标第一图像中存在能够与该目标包裹建立匹配关系的包裹,并根据该目标包裹在目标第二图像中的坐标位置确定该假设的包裹在目标第一图像中的坐标位置,根据该假设的包裹的坐标位置以及目标第一图像的数据计算该假设的包裹在目标第一图像中的置信度,并将计算所得的置信度作为该目标包裹在目标第二图像中的置信度。可选的,在第一检测装置200包括3D相机210的情况下,当包裹的高度较低时,控制装置400根据第一检测装置200的3D相机210返回的点云数据可能会错误地判定目标第一图像中不存在该包裹,因此,控制装置400根据该假设的包裹的坐标位置以及目标第一图像的数据计算该假设的包裹在目标第一图像中的置信度时,如果计算所得的置信度较高(如高于第二预设值),在这种情况下,往往是控制装置400根据3D相机210返回的点云数据确定该假设的包裹的坐标位置处的像素点的高度坐标相对于基准面的高度坐标变化值均较小(例如在误差波动范围之内),此时,控制装置400判定由于该包裹的高度较小而在目标第一图像中发生了缺失,因此,控制装置400判定目标第二图像中的目标包裹为真实存在的包裹,此时,控制装置400根据目标第二图像的数据确定目标包裹的位置信息,并在目标第一图像数据中补入该包裹的信息,反之,在计算该假设的包裹在目标第一图像中的置信度时,如果计算所得的置信度较低(如低于或等于第二预设值),则控制装置400判定由于检测误差错误地将目标第二图像中的一个包裹判定为了两个包裹,此时,控制装置400确定目标包裹在目标第二图像中不存在,丢弃目标第二图像数据中该包裹的信息。
对于能够建立匹配关系的包裹，控制装置400根据该包裹在一幅图像中的位置信息、该包裹周围包裹的分布情况以及该幅图像的检测装置的成像原理，计算该包裹在该幅图像的数据中的置信度，从而得到包裹在目标第一图像数据中的第一置信度和包裹在目标第二图像中的第二置信度。由于对于能够建立匹配关系的包裹，在计算其在一幅图像中的置信度时，需要基于该包裹周围包裹的分布情况，因此，优选的，控制装置400优先处理不能建立匹配关系的包裹，并在确定包裹由于非正常原因而在一幅图像中缺失时在该图像数据中补入该包裹的信息，在确定包裹由于错误检测而在一幅图像中错误生成时在该图像数据中删除该包裹信息，如此，能够提高包裹分布检测的准确性，从而可以提高最终得到的能够建立匹配关系的包裹的位置信息的准确性。
在可选的实施方式中,控制装置400通过以下方式来确定目标第一图像和目标第二图像:
将第一检测装置200获取的多幅第一图像中的一幅确定为目标第一图像,将第二检测装置300获取的多幅第二图像中与目标第一图像的采集时刻间隔最小的一幅图像确定为目标第二图像。
在一种可选的实施方式中,对于包括3D相机210或2D相机310的检测装置(例如本实施例中的第一检测装置200或第二检测装置300),控制装置400可以根据一幅图像的接收时刻、对应的检测装置采集一幅图像的时长、对应的检测装置处理一幅图像的时长以及由对应的检测装置向控制装置400上传一幅图像的时长确定该图像的采集时刻;在另一种可选的实施方式中,对于包括3D相机210或2D相机310的检测装置(例如本实施例中的第一检测装置200或第二检测装置300),检测装置向控制装置400上传的图像数据中包含用于标识该图像的采集时刻的时间戳,控制装置400根据一幅图像的时间戳确定该图像的采集时刻。
在其他可选的实施方式中,控制装置400还可以将第二检测装置300获取的多幅第二图像中的一幅确定为目标第二图像,将第一检测装置200获取的多幅第一图像中与目标第二图像的采集时刻间隔最小的一幅图像确定为目标第一图像。
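下面给出按采集时刻间隔最小的原则配对目标第一图像与目标第二图像的示意性Python代码草图。图像以（采集时刻, 图像数据）元组表示，采集时刻可来源于时间戳或由接收时刻推算，均为假设格式：

```python
def select_target_pair(first_images, second_images):
    """取一幅第一图像作为目标第一图像，并选取采集时刻间隔最小的第二图像作为目标第二图像。"""
    target_first = first_images[-1]        # 例如取最新采集的一幅第一图像
    t1 = target_first[0]
    target_second = min(second_images, key=lambda img: abs(img[0] - t1))
    return target_first, target_second

firsts = [(10.00, "F1"), (10.50, "F2")]
seconds = [(10.32, "S1"), (10.48, "S2"), (10.71, "S3")]
print(select_target_pair(firsts, seconds))  # ((10.5, 'F2'), (10.48, 'S2'))
```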
在可选的实施方式中,控制装置400还被设置为:
比较目标第一图像和目标第二图像的采集时刻，确定二者中采集时刻在前的图像和采集时刻在后的图像；将采集时刻在后的图像的采集时刻确定为目标时刻；追踪计算采集时刻在前的图像中的包裹在目标时刻的位置，并根据采集时刻在前的图像中的包裹在目标时刻的位置对采集时刻在前的图像中的包裹的位置进行修正。
通过本申请实施例的包裹处理设备，能够对目标第一图像和目标第二图像二者中采集时刻在前的图像中的包裹的位置进行修正，从而使目标第一图像和目标第二图像中所携带的包裹的位置信息基于同一个采集时刻而确定，避免由于目标第一图像和目标第二图像的采集时刻存在差异而导致的同一个包裹在两幅图像中的位置存在差异，从而提高确定输送装置100上每个包裹的位置信息的准确性。在一种可选的实施方式中，控制装置400计算目标第一图像的采集时刻和目标第二图像的采集时刻之间的间隔时长，并根据采集时刻在前的图像中的包裹的位置、该间隔时长内输送装置100的每条输送带110的输送速度，追踪计算采集时刻在前的图像中的包裹在目标时刻的位置。例如，当该间隔时长内输送装置100的多条输送带110均以相同速度匀速运动时，则根据该间隔时长以及输送带110的运动速度计算每个包裹在沿其输送方向上的位移，从而确定每个包裹在目标时刻的位置；当该间隔时长内输送装置100的多条输送带110以不同速度运动时，则根据该间隔时长、承载每个包裹的输送带110的具体运动情况，计算每个包裹在沿其输送方向上的位移以及在输送装置100宽度方向上的位移，从而确定每个包裹在目标时刻的位置。
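下面给出将采集时刻在前的图像中的包裹位置追踪推算到目标时刻的示意性Python代码草图。此处假设该间隔时长内各输送带匀速运动且仅考虑沿输送方向的位移，数据结构与数值均为示例，并非本申请实施例的实际实现：

```python
def correct_positions(packages, dt, belt_speed_of):
    """将采集时刻在前的图像中的包裹位置推算到目标时刻（示意实现）。

    packages: [{"id":..., "x":..., "col":...}]，x为沿输送方向的位置（米）；
    dt: 两幅图像采集时刻之间的间隔时长（秒）；
    belt_speed_of(col): 返回承载第col列包裹的输送带在该间隔内的平均速度（米/秒）。
    """
    corrected = []
    for p in packages:
        v = belt_speed_of(p["col"])
        corrected.append({**p, "x": p["x"] + v * dt})  # 沿输送方向平移 v*dt
    return corrected

pkgs = [{"id": 1, "x": 0.80, "col": 0}, {"id": 2, "x": 1.10, "col": 2}]
speeds = {0: 1.0, 2: 1.3}
print(correct_positions(pkgs, dt=0.2, belt_speed_of=lambda col: speeds[col]))
```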
可以理解的是,控制装置400执行的“计算目标第一图像中的每个包裹在目标第一图像中的第一置信度,以及计算目标第二图像中的每个包裹在目标第二图像中的第二置信度,基于目标第一图像中每个包裹的第一置信度和目标第二图像中每个包裹的第二置信度确定输送装置100上每个包裹的位置信息”的操作,还可以包括:
将根据目标第一图像和目标第二图像所确定的所有包裹的位置信息存储为包裹群信息;将本次存储的包裹群信息与上一次存储的包裹群信息进行对比,判断是否存在非正常缺失的包裹;如果判定存在非正常缺失的包裹,追踪计算非正常缺失的包裹的位置信息,并在本次存储的包裹群信息中加入非正常缺失的包裹的位置信息;根据本次所存储的包裹群信息确定输送装置100上每个包裹的位置信息。
非正常缺失的包裹是指除了由于正常输送而离开包裹处理设备010的包裹外其他缺失的包裹，例如，控制装置400将根据目标第一图像和目标第二图像所确定的所有包裹的位置信息存储为包裹群信息，并将本次存储的包裹群信息与上一次存储的包裹群信息进行对比，确定与上一次存储的包裹群信息相比，本次存储的包裹群信息中缺失的包裹，然后计算执行这两次“根据目标第一图像和目标第二图像确定包裹的位置信息”操作之间的间隔时长，根据上一次存储的包裹群信息中每个包裹的位置信息以及该间隔时长内输送装置100的每条输送带110的输送速度，确定缺失的包裹是否是由于正常输送而离开包裹处理设备010，如果不是，则确定包裹是由于其他原因导致非正常缺失，控制装置400根据非正常缺失的包裹在上一次存储的包裹群信息中的位置信息、两次根据目标第一图像和目标第二图像确定包裹的位置信息之间的间隔时长以及该间隔时长内输送装置100的每条输送带110的输送速度，追踪计算非正常缺失的包裹的位置信息，并在本次存储的包裹群信息中加入非正常缺失的包裹的位置信息，根据本次所存储的包裹群信息确定输送装置100上每个包裹的位置信息。通过在本次存储的包裹群信息中加入非正常缺失的包裹的位置信息，可以避免由于检测误差、设备异常而导致的包裹检测缺失，提高了所确定的输送装置100上每个包裹的位置信息的准确性。
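下面给出对比两次存储的包裹群信息、识别并补入非正常缺失包裹的示意性Python代码草图。其中包裹群以{包裹编号: 沿输送方向位置}的字典表示，输送带按匀速简化处理，下游端位置等参数均为假设：

```python
def find_abnormal_missing(prev_group, curr_group, dt, speed, exit_x=5.0):
    """对比两次包裹群信息，将非正常缺失的包裹按推算位置补入本次包裹群（示意实现）。

    prev_group / curr_group: {包裹编号: 沿输送方向的位置x（米）}；
    dt: 两次确定包裹位置之间的间隔时长（秒）；speed: 该间隔内输送带平均速度（米/秒）；
    exit_x: 输送装置下游端位置（米）。
    """
    supplemented = dict(curr_group)
    for pid, x_prev in prev_group.items():
        if pid in curr_group:
            continue
        x_now = x_prev + speed * dt
        if x_now < exit_x:                 # 尚未到达下游端却缺失，判定为非正常缺失
            supplemented[pid] = x_now      # 将追踪计算出的位置补入本次包裹群信息
    return supplemented

# 包裹1按速度推算已正常离开设备，不补入；包裹4为非正常缺失，按推算位置补入
prev = {1: 4.90, 2: 2.00, 3: 3.10, 4: 1.00}
curr = {2: 2.20, 3: 3.30}
print(find_abnormal_missing(prev, curr, dt=0.5, speed=1.0))  # {2: 2.2, 3: 3.3, 4: 1.5}
```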
在一实施例中,通过第二检测装置获取包裹的第二图像,包括:
通过至少一个2D相机采集的图像获取包裹的第二图像。
可选的,在第二检测装置300包括至少一个2D相机310时,在其中一个2D相机310的视场或者多个2D相机310的总视场能够覆盖整个输送装置100的输送区域的情况下,则控制装置400可以通过至少一个2D相机310采集的图像获取包裹的第二图像。
比如,在图1、图2的包裹处理设备010中,第二检测装置300包括一个2D相机310,该2D相机310的视场能够覆盖整个输送装置100的输送区域,控制装置400将该2D相机310获取的每幅图像确定为第二图像;在另一些可选的实施方式中,第二检测装置300包括多个2D相机310,多个2D相机310的总视场能够覆盖整个输送装置100的输送区域,控制装置400将多个2D相机310在同一时刻采集的图像进行拼接,得到包裹的第二图像。
在一实施例中,通过第二检测装置获取包裹的第二图像,包括:
将第一检测组件在一个时刻采集的一点行图像数据以及在时刻之前的预设时间段内采集的多点行图像数据进行拼接,得到包裹的第二图像。
在图3、图4的包裹处理设备010中,控制装置400被设置为通过以下方式来实现通过第二检测装置300获取包裹的第二图像:
将第一检测组件320在一个时刻采集的一点行图像数据以及在该时刻之前的预设时间段内采集的多点行图像数据进行拼接,得到包裹的第二图像。
在图3、图4的实施例中，通过光电传感器阵列获取输送装置100上包裹的第二图像，在输送装置100输送包裹的过程中，每个第一传感器321根据其第一光接收器是否能接收到其第一光发生器发出的光而输出对应的信号，第一检测组件320在每个采集周期内输出一点行点阵数据，控制装置400持续获取第一检测组件320输出的点阵数据，并将当前时刻接收的点阵数据与在当前时刻之前的预设时间段内接收的点阵数据进行拼接生成第二图像。
在可选的实施方式中,控制装置400还被设置为:
根据在该时刻之前的预设时间段内输送装置100的每条输送带110的速度对该时刻之前的预设时间段内采集的每点行图像数据进行修正,基于修正的结果生成第二图像。
在一实施例中,控制装置400根据在该时刻之前的预设时间段内输送装置100的每条输送带110的速度对每点行图像数据中的包裹像素点在沿输送装置100的宽度方向上的位移进行追踪计算,并根据追踪计算结果对该时刻之前的预设时间段内采集的每点行图像数据进行修正,从而可以使生成的第二图像能够真实反映当前时刻包裹的状态。
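下面给出根据各点行的采集时刻与对应输送带速度对点行数据进行修正、使其反映当前时刻包裹状态的一种可能思路的示意性Python代码草图。此处将修正简化为按各列输送带的速度推算像素点沿输送方向移动的距离，像素与输送带列的对应关系、速度等均为假设参数，具体修正维度以实际实现为准：

```python
def build_corrected_points(rows, now, belt_speed_of, pixel_to_col):
    """将各点行中被包裹遮挡的像素推算到当前时刻的位置（示意实现）。

    rows: [(采集时刻t, [0/1, ...]), ...]，按采集先后排列；
    belt_speed_of(col): 第col列输送带在该时间段内的平均速度（米/秒，假设匀速）；
    pixel_to_col(i): 点行中第i个像素对应的输送带列号（假设的映射）。
    返回 [(像素当前沿输送方向的位置, 宽度方向像素序号), ...]。
    """
    points = []
    for t, row in rows:
        for i, v in enumerate(row):
            if v:  # 该像素被包裹遮挡
                x = belt_speed_of(pixel_to_col(i)) * (now - t)  # 采集后随输送带移动的距离
                points.append((round(x, 2), i))
    return points

rows = [(0.0, [0, 1, 1, 0]), (0.1, [0, 1, 1, 0])]
print(build_corrected_points(rows, now=0.2,
                             belt_speed_of=lambda c: 1.0 if c == 0 else 1.2,
                             pixel_to_col=lambda i: 0 if i < 2 else 1))
```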
如图3、图4所示，包裹处理设备010的第二检测装置300还包括第二检测组件330，可选的，控制装置400还被设置为：将第二检测组件330在一个时刻采集的一点行图像数据以及在该时刻之前的预设时间段内采集的多点行图像数据进行拼接得到包裹的第三图像，根据第三图像对第二图像中包裹的位置进行修正。由于通过传感器阵列获取输送装置100上包裹的图像的时效性较差，因此，本实施例通过第一检测组件320和第二检测组件330分别采集对应位置的点阵图像数据，控制装置400基于二者的检测结果生成最终的第二图像。
图5为本申请一种实施例中包裹处理方法的流程图。如图5所示,本申请实施例提供的包裹处理方法,可以应用于本申请实施例提供的包裹处理设备010。如图5所示,本申请实施例提供的包裹处理方法包括:
步骤S100,控制输送装置输送包裹,并通过第一检测装置获取包裹的第一图像,通过第二检测装置获取包裹的第二图像;
步骤S200,从第一图像中确定目标第一图像以及从第二图像中确定目标第二图像;
步骤S300,计算目标第一图像中的每个包裹在目标第一图像中的第一置信度,以及计算目标第二图像中的每个包裹在目标第二图像中的第二置信度,基于目标第一图像中每个包裹的第一置信度和目标第二图像中每个包裹的第二置信度确定输送装置上每个包裹的位置信息。
进一步的,步骤S300可以包括:
确定目标第一图像和目标第二图像中能够建立匹配关系的包裹和不能建立匹配关系的包裹;
对于目标第一图像中不能与目标第二图像中的任意一个包裹建立匹配关系的包裹,计算该包裹在目标第一图像中的第一置信度,判断该包裹在目标第一图像中的第一置信度是否高于第一预设值,基于该包裹在目标第一图像中的第一置信度高于第一预设值的判断结果,根据目标第一图像确定该包裹的位置信息,基于该包裹在目标第一图像中的第一置信度不高于第一预设值的判断结果,丢弃该包裹的数据;
对于目标第二图像中不能与目标第一图像中的任意一个包裹建立匹配关系的包裹,计算该包裹在目标第二图像中的第二置信度,判断该包裹在目标第二图像中的第二置信度是否高于第二预设值,基于该包裹在目标第二图像中的第二置信度高于第二预设值的判断结果,根据目标第二图像确定该包裹的位置信息,基于该包裹在目标第二图像中的第二置信度不高于第二预设值的判断结果,丢弃该包裹的数据;
对于目标第一图像和目标第二图像中能够建立匹配关系的包裹,计算该包裹在目标第一图像中的第一置信度和该包裹在目标第二图像中的第二置信度,比较该包裹的第一置信度和第二置信度,基于第一置信度高于第二置信度的比较结果,根据目标第一图像确定该包裹的位置信息,基于第二置信度高于第一置信度的比较结果,根据目标第二图像确定该包裹的位置信息,基于第一置信度等于第二置信度的比较结果,可以根据目标第一图像确定包裹的位置信息,也可以根据目标第二图像确定包裹的位置信息。
在一实施例中,步骤S300还可以包括:
将根据目标第一图像和目标第二图像所确定的所有包裹的位置信息存储为包裹群信息;将本次存储的包裹群信息与上一次存储的包裹群信息进行对比,判断是否存在非正常缺失的包裹;如果判定存在非正常缺失的包裹,追踪计算非正常缺失的包裹的位置信息,并在本次存储的包裹群信息中加入非正常缺失的包裹的位置信息;根据本次所存储的包裹群信息确定输送装置上每个包裹的位置信息。
在一实施例中,在步骤S300之后,包裹处理方法还可以包括:根据输送装置上每个包裹的位置信息控制输送装置对并行输送的多个包裹进行分离处理。
以上各个步骤的具体实现方式，可以参考前文对本申请实施例的包裹处理设备010的结构介绍，以及对本申请实施例的包裹处理设备010的控制装置400的功能介绍，此处不再赘述。
图6是一实施例提供的一种电子设备的硬件结构示意图,如图6所示,该电子设备包括:一个或多个处理器610和存储器620。图6中以一个处理器610为例。
所述电子设备还可以包括:输入装置630和输出装置640。
所述电子设备中的处理器610、存储器620、输入装置630和输出装置640可以通过总线或者其他方式连接,图6中以通过总线连接为例。
存储器620作为一种计算机可读存储介质,可设置为存储软件程序、计算机可执行程序以及模块。处理器610通过运行存储在存储器620中的软件程序、指令以及模块,从而执行多种功能应用以及数据处理,以实现上述实施例中的任意一种方法。
存储器620可以包括存储程序区和存储数据区,其中,存储程序区可存储操作系统、至少一个功能所需要的应用程序;存储数据区可存储根据电子设备的使用所创建的数据等。此外,存储器可以包括随机存取存储器(Random Access Memory,RAM)等易失性存储器,还可以包括非易失性存储器,例如至少一个磁盘存储器件、闪存器件或者其他非暂态固态存储器件。
存储器620可以是非暂态计算机存储介质或暂态计算机存储介质。该非暂态计算机存储介质,例如至少一个磁盘存储器件、闪存器件、或其他非易失性固态存储器件。在一些实施例中,存储器620可选包括相对于处理器610远程设置的存储器,这些远程存储器可以通过网络连接至电子设备。上述网络的实例可以包括互联网、企业内部网、局域网、移动通信网及其组合。
输入装置630可设置为接收输入的数字或字符信息,以及产生与电子设备的用户设置以及功能控制有关的键信号输入。输出装置640可包括显示屏等显示设备。
本实施例还提供一种计算机可读存储介质,存储有计算机程序,所述计算机程序用于执行上述方法。
上述实施例方法中的全部或部分流程可以通过计算机程序指令相关的硬件来完成，该程序可存储于一个非暂态计算机可读存储介质中，该程序在执行时，可包括如上述各方法实施例的流程，其中，该非暂态计算机可读存储介质可以为磁碟、光盘、只读存储记忆体（Read-Only Memory，ROM）或RAM等。
综上所述,本申请实施例提供的包裹处理设备、包裹处理方法、电子设备及存储介质,可以通过两种不同的检测装置来分别采集输送装置上包裹的三维图像和二维图像,基于两种不同的检测装置采集的图像中每个包裹在对应的图像中的置信度确定输送装置上实际存在的每个包裹的位置信息。由于本申请实施例基于通过两种不同的检测装置获取的图像中每个包裹在对应图像中的置信度确定输送装置上实际存在的每个包裹的位置信息,因此能够提高包裹位置检测的准确性,从而为包裹处理设备后续处理包裹(比如分离并行输送的包裹)提供可靠的依据。

Claims (12)

  1. 一种包裹处理设备,包括控制装置、用于输送包裹的输送装置以及设置在所述输送装置输送包裹的路径上的第一检测装置和第二检测装置,所述第一检测装置、所述第二检测装置、所述输送装置均与所述控制装置电连接,其中,所述第一检测装置设置为获取包裹的三维图像,所述第二检测装置设置为获取包裹的二维图像,所述控制装置被设置为:
    控制所述输送装置输送包裹,并通过所述第一检测装置获取包裹的第一图像,通过所述第二检测装置获取包裹的第二图像;
    从所述第一图像中确定目标第一图像以及从所述第二图像中确定目标第二图像;
    计算所述目标第一图像中的每个包裹在所述目标第一图像中的第一置信度,以及计算所述目标第二图像中的每个包裹在所述目标第二图像中的第二置信度,基于所述目标第一图像中每个包裹的所述第一置信度和所述目标第二图像中每个包裹的所述第二置信度确定所述输送装置上每个包裹的位置信息。
  2. 根据权利要求1所述的包裹处理设备,其中,所述控制装置执行的计算所述目标第一图像中的每个包裹在所述目标第一图像中的第一置信度,以及计算所述目标第二图像中的每个包裹在所述目标第二图像中的第二置信度,基于所述目标第一图像中每个包裹的所述第一置信度和所述目标第二图像中每个包裹的所述第二置信度确定所述输送装置上每个包裹的位置信息的操作,包括:
    确定所述目标第一图像和所述目标第二图像中能够建立匹配关系的包裹和不能建立匹配关系的包裹;
    对于所述目标第一图像中不能与所述目标第二图像中的任意一个包裹建立匹配关系的包裹,计算所述包裹在所述目标第一图像中的第一置信度,判断所述包裹在所述目标第一图像中的所述第一置信度是否高于第一预设值,基于所述包裹在所述目标第一图像中的所述第一置信度高于所述第一预设值的判断结果,根据所述目标第一图像确定所述包裹的位置信息,基于所述包裹在所述目标第一图像中的所述第一置信度不高于所述第一预设值的判断结果,丢弃所述包裹的数据;
    对于所述目标第二图像中不能与所述目标第一图像中的任意一个包裹建立匹配关系的包裹，计算所述包裹在所述目标第二图像中的第二置信度，判断所述包裹在所述目标第二图像中的所述第二置信度是否高于第二预设值，基于所述包裹在所述目标第二图像中的所述第二置信度高于所述第二预设值的判断结果，根据所述目标第二图像确定所述包裹的位置信息，基于所述包裹在所述目标第二图像中的所述第二置信度不高于所述第二预设值的判断结果，丢弃所述包裹的数据；
    对于所述目标第一图像和所述目标第二图像中能够建立匹配关系的包裹,计算所述包裹在所述目标第一图像中的第一置信度和所述包裹在所述目标第二图像中的第二置信度,比较所述包裹的所述第一置信度和所述第二置信度,基于所述第一置信度高于所述第二置信度的比较结果,根据所述目标第一图像确定所述包裹的位置信息,基于所述第二置信度高于所述第一置信度的比较结果,根据所述目标第二图像确定所述包裹的位置信息。
  3. 根据权利要求1所述的包裹处理设备,其中,所述控制装置是设置为通过以下方式从所述第一图像中确定所述目标第一图像以及从所述第二图像中确定所述目标第二图像:
    将所述第一检测装置获取的多幅所述第一图像中的一幅确定为所述目标第一图像,将所述第二检测装置获取的多幅所述第二图像中与所述目标第一图像的采集时刻间隔最小的一幅图像确定为所述目标第二图像。
  4. 根据权利要求1或3所述的包裹处理设备,其中,所述控制装置还被设置为:
    比较所述目标第一图像和所述目标第二图像的采集时刻,确定所述目标第一图像和所述目标第二图像中采集时刻在前的图像和采集时刻在后的图像;
    将所述采集时刻在后的图像的采集时刻确定为目标时刻;
    追踪计算所述采集时刻在前的图像中的包裹在所述目标时刻的位置,并根据所述采集时刻在前的图像中的包裹在所述目标时刻的位置对所述采集时刻在前的图像中的包裹的位置进行修正。
  5. 根据权利要求2所述的包裹处理设备,其中,所述控制装置执行的计算所述目标第一图像中的每个包裹在所述目标第一图像中的第一置信度,以及计算所述目标第二图像中的每个包裹在所述目标第二图像中的第二置信度,基于所述目标第一图像中每个包裹的所述第一置信度和所述目标第二图像中每个包裹的所述第二置信度确定所述输送装置上每个包裹的位置信息的操作,还包括:
    将根据所述目标第一图像和所述目标第二图像所确定的所有包裹的位置信息存储为包裹群信息;
    将本次存储的包裹群信息与上一次存储的包裹群信息进行对比,判断是否存在非正常缺失的包裹;
    基于存在所述非正常缺失的包裹的判断结果，追踪计算所述非正常缺失的包裹的位置信息，并在本次存储的包裹群信息中加入所述非正常缺失的包裹的位置信息；
    根据本次存储的包裹群信息确定所述输送装置上每个包裹的位置信息。
  6. 根据权利要求1所述的包裹处理设备,其中,所述第二检测装置包括至少一个2D相机,所述至少一个2D相机的总视场能够覆盖整个所述输送装置的输送区域;
    所述控制装置是被设置为:
    通过所述至少一个2D相机采集的图像获取所述包裹的第二图像。
  7. 根据权利要求1所述的包裹处理设备,其中,所述第二检测装置包括第一检测组件,所述第一检测组件包括多个沿所述输送装置的宽度方向间隔排布的第一传感器,每个所述第一传感器包括第一光发生器和第一光接收器,且沿所述输送装置的高度方向,每个所述第一传感器的第一光发生器和第一光接收器分别设置于所述输送装置的输送平面的两侧;
    所述控制装置是被设置为:
    将所述第一检测组件在一个时刻采集的一点行图像数据以及在所述时刻之前的预设时间段内采集的多点行图像数据进行拼接,得到所述包裹的第二图像。
  8. 根据权利要求7所述的包裹处理设备,其中,所述控制装置还被设置为:
    根据在所述时刻之前的所述预设时间段内所述输送装置输送包裹的速度对所述时刻之前的所述预设时间段内采集的每点行图像数据进行修正,基于修正的结果生成第二图像。
  9. 根据权利要求1所述的包裹处理设备,其中,所述输送装置包括多条并行排列的输送带,所述多条并行排列的输送带被设置为能够具有不同的输送速度,所述控制装置还被设置为:
    在确定所述输送装置上每个包裹的位置信息后,根据所述输送装置上每个包裹的位置信息控制所述输送装置对并行输送的多个包裹进行分离处理。
  10. 一种包裹处理方法,应用于包裹处理设备,所述包裹处理设备包括用于输送包裹的输送装置以及设置在所述输送装置输送包裹的路径上的第一检测装置和第二检测装置,其中,所述第一检测装置设置为获取包裹的三维图像,所述第二检测装置设置为获取包裹的二维图像,所述包裹处理方法包括:
    控制所述输送装置输送包裹,并通过所述第一检测装置获取包裹的第一图像,通过所述第二检测装置获取包裹的第二图像;
    从所述第一图像中确定目标第一图像以及从所述第二图像中确定目标第二图像；
    计算所述目标第一图像中的每个包裹在所述目标第一图像中的第一置信度,以及计算所述目标第二图像中的每个包裹在所述目标第二图像中的第二置信度,基于所述目标第一图像中每个包裹的所述第一置信度和所述目标第二图像中每个包裹的所述第二置信度确定所述输送装置上每个包裹的位置信息。
  11. 一种电子设备,包括:
    处理器;
    存储器,设置为存储计算机程序,
    所述处理器被设置为运行所述计算机程序以执行如权利要求10所述的包裹处理方法。
  12. 一种计算机可读存储介质,所述计算机可读存储介质存储有计算机程序,所述计算机程序被设置为执行如权利要求10所述的包裹处理方法。
PCT/CN2021/089167 2020-04-30 2021-04-23 包裹处理设备、包裹处理方法、电子设备及存储介质 WO2021218792A1 (zh)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN202010366800.0 2020-04-30
CN202010366800.0A CN111553951B (zh) 2020-04-30 2020-04-30 包裹处理设备和包裹处理方法

Publications (1)

Publication Number Publication Date
WO2021218792A1 true WO2021218792A1 (zh) 2021-11-04

Family

ID=72004892

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2021/089167 WO2021218792A1 (zh) 2020-04-30 2021-04-23 包裹处理设备、包裹处理方法、电子设备及存储介质

Country Status (2)

Country Link
CN (1) CN111553951B (zh)
WO (1) WO2021218792A1 (zh)

Cited By (2)

Publication number Priority date Publication date Assignee Title
CN117387539A (zh) * 2023-12-11 2024-01-12 苏州朗信智能科技有限公司 货运车辆的车厢动态检测方法、检测装置及存储介质
WO2024067787A1 (zh) * 2022-09-29 2024-04-04 顺丰科技有限公司 异常包裹处理方法、装置、电子设备及存储介质

Families Citing this family (4)

Publication number Priority date Publication date Assignee Title
CN111553951B (zh) * 2020-04-30 2023-10-24 山东新北洋信息技术股份有限公司 包裹处理设备和包裹处理方法
CN114273240A (zh) * 2020-09-27 2022-04-05 深圳顺丰泰森控股(集团)有限公司 快递单件分离方法、装置、系统和存储介质
CN112547529A (zh) * 2020-11-18 2021-03-26 晶测自动化(深圳)有限公司 一种多路拉包智慧供包系统及其控制方法
CN115471730A (zh) * 2021-06-10 2022-12-13 未来机器人(深圳)有限公司 料笼堆叠的确认方法、装置、计算机设备和存储介质

Citations (7)

Publication number Priority date Publication date Assignee Title
US6614928B1 (en) * 1999-12-21 2003-09-02 Electronics And Telecommunications Research Institute Automatic parcel volume capture system and volume capture method using parcel image recognition
US20070237356A1 (en) * 2006-04-07 2007-10-11 John Dwinell Parcel imaging system and method
US9208621B1 (en) * 2014-05-20 2015-12-08 Fedex Corporate Services, Inc. Methods and systems for detecting a package type using an imaging logistics receptacle
DE102014114506A1 (de) * 2014-10-07 2016-04-07 Sick Ag Kamera zur Montage an einer Fördereinrichtung und Verfahren zur Inspektion oder Identifikation
CN109926342A (zh) * 2019-03-27 2019-06-25 杭州翼道智能科技有限公司 一种快递高效分拣系统及其分拣方法
CN110163225A (zh) * 2018-02-11 2019-08-23 顺丰科技有限公司 一种基于云平台的混杂包裹检测和识别方法、装置及系统
CN111553951A (zh) * 2020-04-30 2020-08-18 山东新北洋信息技术股份有限公司 包裹处理设备和包裹处理方法

Family Cites Families (3)

Publication number Priority date Publication date Assignee Title
US9805240B1 (en) * 2016-04-18 2017-10-31 Symbol Technologies, Llc Barcode scanning and dimensioning
CN107597600B (zh) * 2017-09-26 2019-08-30 北京京东振世信息技术有限公司 分拣系统和分拣方法
CN110349216A (zh) * 2019-07-18 2019-10-18 合肥泰禾光电科技股份有限公司 货箱位置检测方法及装置

Patent Citations (7)

Publication number Priority date Publication date Assignee Title
US6614928B1 (en) * 1999-12-21 2003-09-02 Electronics And Telecommunications Research Institute Automatic parcel volume capture system and volume capture method using parcel image recognition
US20070237356A1 (en) * 2006-04-07 2007-10-11 John Dwinell Parcel imaging system and method
US9208621B1 (en) * 2014-05-20 2015-12-08 Fedex Corporate Services, Inc. Methods and systems for detecting a package type using an imaging logistics receptacle
DE102014114506A1 (de) * 2014-10-07 2016-04-07 Sick Ag Kamera zur Montage an einer Fördereinrichtung und Verfahren zur Inspektion oder Identifikation
CN110163225A (zh) * 2018-02-11 2019-08-23 顺丰科技有限公司 一种基于云平台的混杂包裹检测和识别方法、装置及系统
CN109926342A (zh) * 2019-03-27 2019-06-25 杭州翼道智能科技有限公司 一种快递高效分拣系统及其分拣方法
CN111553951A (zh) * 2020-04-30 2020-08-18 山东新北洋信息技术股份有限公司 包裹处理设备和包裹处理方法

Cited By (3)

Publication number Priority date Publication date Assignee Title
WO2024067787A1 (zh) * 2022-09-29 2024-04-04 顺丰科技有限公司 异常包裹处理方法、装置、电子设备及存储介质
CN117387539A (zh) * 2023-12-11 2024-01-12 苏州朗信智能科技有限公司 货运车辆的车厢动态检测方法、检测装置及存储介质
CN117387539B (zh) * 2023-12-11 2024-03-26 苏州朗信智能科技有限公司 货运车辆的车厢动态检测方法、检测装置及存储介质

Also Published As

Publication number Publication date
CN111553951A (zh) 2020-08-18
CN111553951B (zh) 2023-10-24

Similar Documents

Publication Publication Date Title
WO2021218792A1 (zh) 包裹处理设备、包裹处理方法、电子设备及存储介质
WO2022105231A1 (zh) 入库管理方法、装置、仓库管理系统和电子系统
KR20150092578A (ko) 소포 구분 시스템 및 그 방법
KR101753279B1 (ko) 택배화물 자동 인식 시스템
WO2022206744A1 (zh) 信息关联方法、系统、装置、服务器及存储介质
CN107597600A (zh) 分拣系统和分拣方法
JP5930037B2 (ja) ロボットシステムおよび物品移送方法
CN111921873B (zh) 包裹分拣方法及系统
WO2021244360A1 (zh) 包裹处理设备、方法、电子设备及存储介质
JP2016016915A (ja) 物品をコンベヤに整列させる物品整列装置
KR20150142923A (ko) 소포 인입 시스템 및 그 방법
US20210370352A1 (en) Detecting non-handleable items
CN111311691A (zh) 拆垛机器人拆垛方法及系统
CN112099104A (zh) 一种安检方法及装置
US20240177260A1 (en) System and method for three-dimensional scan of moving objects longer than the field of view
KR102218894B1 (ko) 중복인식 물체를 처리하는 컨베이어 벨트 영상 처리장치 및 방법
CN112700179A (zh) 包裹追踪方法、装置、系统及包裹检测方法、装置
US20230294134A1 (en) Parcel singulation yield correcting system and method
CN113936051A (zh) 一种基于2d相机的物体分离机及物体分离方法
CN113869184A (zh) 一种物流面单识别方法、系统及装置
JP2021045835A (ja) ロボットシステム
CN216655386U (zh) 一种物体分离机
WO2022145236A1 (ja) 情報処理装置及びプログラム
CN112101498A (zh) 一种违禁品定位方法及装置
CN112200976B (zh) 药品检测方法、装置、电子终端及存储介质

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 21795278

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 21795278

Country of ref document: EP

Kind code of ref document: A1