CN116997423A - Removal of airbag modules from automotive waste - Google Patents

Removal of airbag modules from automotive waste

Info

Publication number
CN116997423A
Authority
CN
China
Prior art keywords
failed
sorting
airbag module
pieces
piece
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202280015959.5A
Other languages
Chinese (zh)
Inventor
N. Kumar
Manuel G. Garcia, Jr.
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Sotera Technology Co ltd
Original Assignee
Sotera Technology Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Priority claimed from US17/752,669 (published as US20220355342A1)
Application filed by Sotera Technology Co ltd
Priority claimed from PCT/US2022/039622 (published as WO2023015000A1)
Publication of CN116997423A
Legal status: Pending

Landscapes

  • Sorting Of Articles (AREA)

Abstract

A system for classifying materials uses a vision system implementing an artificial intelligence system to identify or classify automotive airbag modules and then remove them from a waste stream, which may be generated from the shredding of scrapped vehicles. The sorting process may be designed so that non-failed (live) airbag modules are not activated, since activation could damage equipment or injure personnel.

Description

Removal of airbag modules from automotive waste
Related patent and patent application
The present application claims priority from U.S. provisional patent application Ser. No. 63/229724. The present application is a continuation-in-part of U.S. patent application Ser. No. 17/752669, which is a continuation-in-part of U.S. patent application Ser. No. 17/667397, which is a continuation-in-part of U.S. patent application Ser. No. 17/495291, which is a continuation-in-part of U.S. patent application Ser. No. 17/491415 (issued as U.S. patent No. 11278937), which is a continuation-in-part of U.S. patent application Ser. No. 17/380428, which is a continuation-in-part of U.S. patent application Ser. No. 17/227245, which is a continuation-in-part of U.S. patent application Ser. No. 16/939011, which is a continuation of U.S. patent application Ser. No. 16/375675 (issued as U.S. patent No. 10722922), which is a continuation-in-part of U.S. patent application Ser. No. 15/963755 (issued as U.S. patent No. 10710119), which is a continuation-in-part of U.S. patent application Ser. No. 15/213129 (issued as U.S. patent No. 10207296), which claims priority to U.S. provisional patent application Ser. No. 62/193332, all of which are hereby incorporated by reference herein. U.S. patent application Ser. No. 17/491415 (issued as U.S. patent No. 11278937) is also a continuation-in-part of U.S. patent application Ser. No. 16/852514 (issued as U.S. patent No. 11260426), which is a divisional of U.S. patent application Ser. No. 16/358374 (issued as U.S. patent No. 10625304), which is a continuation-in-part of U.S. patent application Ser. No. 15/963755 (issued as U.S. patent No. 10710119), which claims priority to U.S. provisional patent application Ser. No. 62/490219, all of which are hereby incorporated by reference herein.
Government licensing rights
The present disclosure was made with U.S. government support under DE-AR0000422 awarded by the U.S. Department of Energy. The United States government may have certain rights in this disclosure.
Technical Field
The present invention relates to the recycling of automotive waste and, more particularly, to the removal of airbag modules from automotive waste.
Background
This section is intended to introduce various aspects of the art, which may be associated with exemplary embodiments of the present disclosure. This discussion is believed to be helpful in providing a framework to facilitate a better understanding of the particular aspects of the present disclosure. Accordingly, it should be understood that this section should be read from this perspective and not necessarily as an admission of prior art.
Recycling is the process of collecting and processing materials that would otherwise be discarded as waste and turning them into new products. Recycling benefits communities as well as the environment, since it reduces the amount of waste sent to landfills and incinerators, conserves natural resources, improves economic security by tapping domestic sources of materials, prevents pollution by reducing the need to collect new raw materials, and saves energy.
Scrap metal is typically shredded and therefore requires sorting to facilitate reuse of the metals. By sorting the scrap metal, metal that would otherwise be sent to a landfill can be reused. Additionally, using sorted scrap metal leads to reduced pollution and emissions in comparison to refining raw materials from ore. Scrap metal may be used by manufacturers in place of virgin raw material if the quality of the sorted metal meets certain standards. Scrap metals may include various types of ferrous and nonferrous metals, heavy metals, high value metals such as nickel or titanium, cast or wrought metals, and other various alloys.
In the United States, an estimated 15 million vehicles are shredded each year (commonly referred to as scrapped or end-of-life vehicles). Each vehicle may have several (e.g., 6-15) airbag modules; that amounts to more than 90 million airbag modules potentially entering the vehicle recycling stream each year. Airbag modules typically have three main components enclosed within a container or canister of some sort: an airbag, an inflator, and a propellant.
A problem associated with all of these airbag modules is that they contain sodium azide, a toxic compound, as the propellant for inflation. Additionally, not all of the airbag modules will inflate/explode as they pass through the vehicle shredder. These surviving airbag modules may therefore inflate/explode at different locations with different consequences: inflating/exploding on the conveyor system, thereby damaging the conveyor belt; inflating/exploding while being handled by a person, resulting in possible serious injury and/or loss of limb; and inflating/exploding after the material has been sold by the recycling facility, thereby damaging customer equipment.
There are technical challenges to be overcome in order to ensure satisfactory removal of such airbag modules from vehicle waste: the airbag modules can be small (e.g., an airbag module typically has the form factor of a cylinder approximately 1 inch in diameter and 1 inch in height); an airbag module in the mixed scrap metal stream after the shredding process looks similar to other scrap metal pieces; an airbag module after shredding is difficult to identify when mixed with other waste pieces; an airbag module may be partially obscured when mixed with other waste pieces being transported on a conveyor belt; and airbag modules come in different shapes, sizes, and colors.
Drawings
FIG. 1 illustrates a schematic diagram of a material handling system configured in accordance with an embodiment of the present disclosure.
Fig. 2A-2B illustrate exemplary representations of a control set of an airbag module for use during a training phase.
Fig. 2C illustrates an exemplary representation of a control set of airbag modules used during a training phase, wherein the algorithm has identified and classified the airbag modules.
Fig. 3 illustrates a flow chart configured in accordance with an embodiment of the present disclosure.
Fig. 4 illustrates a flow chart configured in accordance with an embodiment of the present disclosure.
FIG. 5 illustrates a block diagram of a data processing system configured in accordance with an embodiment of the present disclosure.
Fig. 6A-6B illustrate exemplary representations of heterogeneous mixtures of pieces of material comprising an airbag module.
FIG. 7 illustrates an exemplary representation of a heterogeneous mixture of pieces of material comprising an airbag module, wherein an artificial intelligence algorithm has identified and/or categorized the airbag module.
Detailed Description
Various detailed embodiments of the present disclosure are disclosed herein. However, it is to be understood that the disclosed embodiments are merely exemplary of the disclosure that may be embodied in various and alternative forms. The figures are not necessarily to scale; some features may be exaggerated or minimized to show details of particular components. Therefore, specific structural and functional details disclosed herein are not to be interpreted as limiting, but merely as a representative basis for teaching one skilled in the art to variously employ the present disclosure.
Embodiments of the present disclosure utilize artificial intelligence techniques for identifying/classifying airbag modules in a waste stream. According to certain embodiments of the present disclosure, the material pieces may be separated from one another on a conveyor belt, with spaces between the pieces, using any standard computer vision method. According to certain embodiments of the present disclosure, a region proposal neural network may be used to detect airbag modules and/or a deep neural network may be used to classify airbag modules. According to certain embodiments of the present disclosure, the two neural networks for detection and classification may be combined. Embodiments of the present disclosure may use semantic segmentation or object detection/localization. Alternatively, instance segmentation or panoptic segmentation may be utilized. Embodiments of the present disclosure may use pixel-level, neighborhood, region, and/or whole-image classification.
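For illustration only, the following is a minimal sketch, not taken from the present disclosure, of how a region proposal detection network such as Faster R-CNN (via the publicly available torchvision library named later in this description) could be used to locate candidate airbag modules in an image of the conveyor belt. The two-class setup, the score threshold, and the checkpoint file name "airbag_detector.pt" are assumptions for the example.

```python
# Sketch: region-proposal detection of candidate airbag modules in a belt image.
# Class indices, threshold, and checkpoint path are hypothetical placeholders.
import torch
import torchvision

# Two classes assumed: 0 = background, 1 = non-failed airbag module.
model = torchvision.models.detection.fasterrcnn_resnet50_fpn(weights=None, num_classes=2)
model.load_state_dict(torch.load("airbag_detector.pt"))  # hypothetical trained weights
model.eval()

def detect_airbag_modules(frame_rgb, score_threshold=0.5):
    """frame_rgb: HxWx3 uint8 image of the belt; returns boxes/scores of likely modules."""
    image = torch.from_numpy(frame_rgb).permute(2, 0, 1).float() / 255.0
    with torch.no_grad():
        output = model([image])[0]          # region proposals -> boxes, labels, scores
    keep = output["scores"] >= score_threshold
    return output["boxes"][keep], output["scores"][keep]
```

In a combined arrangement, as noted above, the boxes returned by such a detector could be cropped out and passed to a separate deep classification network.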
As used herein, "material" may include any item or object including, but not limited to: metals (black and colored), metal alloys, metal pieces embedded in another different material, plastics (including but not limited to any of the plastics disclosed herein, known in the art, or newly created in the future), rubber, foam, glass (including but not limited to borosilicate or soda lime glass, and various colored glasses), ceramics, paper, cardboard, polytetrafluoroethylene, polyethylene, bundled wires, insulated coated wires, rare earth elements, leaves, wood, plants, plant parts, textiles, biowaste, packaging, electronic waste, batteries and accumulators, automotive waste pieces from crushed vehicles, mining, construction, and demolition waste, crop waste, forest residues, specially planted grass, woody energy crops, microalgae, municipal food waste, hazardous chemicals and biomedical waste, construction waste, farm waste, biological items, non-biological items, objects having a specific carbon content, any other objects that can be found within municipal solid waste, and any other objects, objects or materials disclosed herein, including types of sensors that can be distinguished from one another by one or more other sensors (including any of the types of the other technologies disclosed herein) including sensors in any of the systems.
In a more general sense, a "material" may include any item or object composed of a chemical element, a compound or mixture of one or more chemical elements, or a compound or mixture of compounds or mixtures of chemical elements, wherein the complexity of a compound or mixture may range from simple to complex (all of which may also be referred to herein as a material having a particular "chemical composition"). "Chemical element" means a chemical element of the periodic table of chemical elements, including chemical elements that may be discovered after the filing date of the present application. Within this disclosure, the terms "scrap," "scrap piece," "material," and "material piece" may be used interchangeably. As used herein, a material piece or scrap piece referred to as having a metal alloy composition is a metal alloy having a particular chemical composition that distinguishes it from other metal alloys.
As used herein, the term "predetermined" refers to something that has been determined or decided in advance, including, but not limited to, by a user or operator of the sorting systems disclosed herein.
As used herein, "spectral imaging" is imaging that uses multiple bands across the electromagnetic spectrum. Although a typical camera captures light across three bands in the visible spectrum (red, green, and blue ("RGB")), spectral imaging may encompass a variety of technologies including and beyond RGB. For example, spectral imaging may use infrared, visible, ultraviolet, and/or x-ray spectra, or some combination of the above. The spectral data or spectral image data is a digital data representation of the spectral image. Spectral imaging may include simultaneous acquisition of spectral data in the visible and invisible bands, illumination from outside the visible range, or use of optical filters for capturing a particular spectral range. It is also possible to capture hundreds of bands for each pixel in the spectral image.
As used herein, the term "image data packet" refers to a digital data packet associated with a captured spectral image of each piece of material.
As used herein, the terms "identify" and "classify," the terms "identification" and "classification," and any derivatives of the foregoing may be used interchangeably. As used herein, to "classify" a piece of material is to determine (i.e., identify) the type or class of material to which the piece of material belongs. For example, in accordance with certain embodiments of the present disclosure, a sensor system (as further described herein) may be configured to collect and analyze any type of information for classifying materials, which classifications can then be utilized within a sorting system to selectively sort the material pieces as a function of a set of one or more physical and/or chemical characteristics (which may be user-defined, for example), including, but not limited to: color; texture; hue; shape; brightness; weight; density; chemical composition; size; uniformity; manufacturing type; chemical signature; predetermined fraction; radioactive signature; transmissivity to light, sound, or other signals; and reaction to stimuli such as various fields, including emitted and/or reflected electromagnetic radiation ("EM") of the material pieces.
The types or classes (i.e., classifications) of materials may be user-definable and are not limited to any known classifications of materials. The granularity of the types or classes may range from very coarse to very fine. For example, the types or classes may include: plastics, ceramics, glass, metals, and other materials, where the granularity of such types or classes is relatively coarse; different metals and metal alloys such as, for example, zinc, copper, brass, chrome plate, and aluminum, where the granularity of such types or classes is finer; or between specific types of plastics, where the granularity of such types or classes is relatively fine. Thus, the types or classes may be configured to distinguish between materials of significantly different chemical compositions, such as, for example, plastics and metal alloys, or to distinguish between materials of nearly identical chemical compositions, such as, for example, different types of metal alloys. It should be appreciated that the methods and systems discussed herein may be applied to accurately identify/sort material pieces for which the chemical composition is completely unknown before the material pieces are sorted.
As referred to herein, a "conveyor system" may be any known piece of mechanical handling equipment that moves material from one location to another, including, but not limited to: an aero-mechanical conveyor, automotive conveyor, belt-driven live roller conveyor, bucket conveyor, chain-driven live roller conveyor, drag conveyor, dust-proof conveyor, electric track vehicle system, flexible conveyor, gravity chute conveyor, lineshaft roller conveyor, motorized roller conveyor, overhead I-beam conveyor, overland conveyor, pharmaceutical conveyor, plastic belt conveyor, pneumatic conveyor, screw or auger conveyor, spiral conveyor, tubular gallery conveyor, vertical conveyor, free-fall conveyor, vibrating conveyor, wire mesh conveyor, and robotic manipulator.
According to certain embodiments of the present disclosure, the systems and methods described herein receive a heterogeneous mixture of a plurality of material pieces, wherein at least one material piece within the heterogeneous mixture has an elemental composition different from that of one or more other material pieces within the mixture, and/or is physically distinguishable from the other material pieces within the mixture, and/or belongs to a class or type of material different from that of the other material pieces within the mixture, and the systems and methods are configured to identify/classify/distinguish/sort this one material piece into a group separate from such other material pieces. Embodiments of the present disclosure may be used to sort any types or classes of materials as defined herein. In contrast, a homogeneous set or group of materials all fall within a single identifiable class or type of material (or even a specified plurality of identifiable classes or types of materials), such as non-failed (live) airbag modules.
Embodiments of the present disclosure may be described herein as sorting material pieces into separate groups by physically placing (e.g., transferring or discharging) the material pieces into separate containers or bins according to user-defined groupings (e.g., types or classifications of materials). As an example, in certain embodiments of the present disclosure, material pieces may be sorted into separate containers in order to separate material pieces classified as belonging to a certain class or type of material (e.g., non-failed airbag modules) from other material pieces (e.g., those classified as belonging to a different class or type of material).
It should be noted that the materials to be sorted may have irregular sizes and shapes. For example, such materials may have previously been run through some sort of shredding mechanism that chops the materials into such irregularly shaped and sized pieces (producing scrap pieces), which are then fed or deposited onto a conveyor system. According to embodiments of the present disclosure, the material pieces include automotive scrap pieces from a vehicle that has been passed through some sort of shredding mechanism, wherein the automotive scrap pieces include airbag modules that have not been activated (i.e., inflated or exploded), also referred to herein as "non-failed airbag modules."
Fig. 1 illustrates an example of a system 100 configured in accordance with various embodiments of the invention. A conveyor system 103 may be implemented to convey individual material pieces 101 through the system 100 so that each of the individual material pieces 101 can be tracked, classified, distinguished, and sorted into predetermined desired groups. Such a conveyor system 103 may be implemented with one or more conveyor belts on which the material pieces 101 travel at a generally predetermined constant speed. However, certain embodiments of the present disclosure may be implemented with other types of conveyor systems, including systems in which the material pieces free-fall past the various components of the system 100 (or any other type of vertical sorter), or vibrating conveyor systems. Hereinafter, where applicable, the conveyor system 103 may also be referred to as the conveyor belt 103. In one or more embodiments, some or all of the acts or functions of conveying, capturing, stimulating, detecting, classifying, distinguishing, and sorting may be performed automatically (i.e., without human intervention). For example, in the system 100, one or more cameras, one or more stimulus sources, one or more emission detectors, a classification module, sorting equipment/devices, and/or other system components may be configured to perform these and other operations automatically.
Furthermore, while fig. 1 illustrates a single stream of material pieces 101 on a conveyor system 103, embodiments of the present disclosure may be implemented in which a plurality of such streams of material pieces travel in parallel with each other through the various components of the system 100. For example, as further described in U.S. patent No. 10207296, the material pieces may be distributed into two or more parallel singulated streams traveling on a single conveyor belt, or onto a set of parallel conveyor belts. Thus, certain embodiments of the present disclosure are capable of simultaneously tracking, classifying, and sorting a plurality of such parallel traveling streams of material pieces. According to certain embodiments of the present disclosure, the incorporation or use of a separator is not required. Instead, the conveyor system (e.g., the conveyor belt 103) may simply convey a collection of material pieces that may have been deposited onto the conveyor system 103 in a random manner.
According to certain embodiments of the present disclosure, some suitable feeding mechanism (e.g., another conveyor system or hopper 102) may be utilized to feed the pieces of material 101 onto the conveyor system 103, whereby the conveyor system 103 conveys the pieces of material 101 through various components within the system 100. An optional drum/vibrator/separator 106 may be used to separate individual pieces of material from a large number of pieces of material after the pieces of material 101 are received by the conveyor system 103. In certain embodiments of the present disclosure, the conveyor system 103 is operated by the conveyor system motor 104 to travel at a predetermined speed. The predetermined speed may be programmable and/or adjustable by an operator in any known manner. The monitoring of the predetermined speed of the conveyor system 103 may alternatively be performed with the position detector 105. Within certain embodiments of the present disclosure, control of the conveyor system motor 104 and/or the position detector 105 may be performed by the automated control system 108. Such an automation control system 108 may be operated under control of the computer system 107 and/or the functions for performing the automation control may be implemented in software within the computer system 107.
The conveyor system 103 may be a conventional endless belt conveyor employing a conventional drive motor 104, the conventional drive motor 104 being adapted to move the belt conveyor at a predetermined speed. The position detector 105, which may be a conventional encoder, may be operably coupled to the conveyor system 103 and the automation control system 108 to provide information (e.g., speed) corresponding to the movement of the conveyor belt. Thus, by utilizing control of the conveyor system drive motor 104 and/or the automated control system 108 (and, alternatively, the position detector 105), as each of the pieces of material 101 traveling on the conveyor system 103 is identified, they may be tracked by position and time (relative to the various components of the system 100) such that the various components of the system 100 may be activated/deactivated as each piece of material 101 passes in proximity to the various components of the system 100. As a result, the automated control system 108 is able to track the position of each of the pieces 101 as each of the pieces 101 travels along the conveyor system 103.
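To make the position-and-time tracking described above concrete, the following sketch (not taken from the present disclosure) estimates when a tracked piece will arrive at a downstream sorting device from the belt speed reported by the position detector. The speed, distance, and classification name used here are assumed example values.

```python
# Sketch: schedule a downstream sorting device from belt speed and camera-to-device distance.
# All numeric values and the class string are illustrative assumptions.
from dataclasses import dataclass

BELT_SPEED_M_PER_S = 2.0            # predetermined conveyor speed (assumed)
CAMERA_TO_DIVERTER_M = 3.5          # distance from vision system to sorting device (assumed)

@dataclass
class TrackedPiece:
    piece_id: int
    seen_at_s: float                # timestamp when the vision system saw the piece
    classification: str             # e.g., "non_failed_airbag_module"

def activation_time(piece: TrackedPiece) -> float:
    """Time (on the same clock as seen_at_s) at which the sorting device should fire."""
    travel_time = CAMERA_TO_DIVERTER_M / BELT_SPEED_M_PER_S
    return piece.seen_at_s + travel_time

piece = TrackedPiece(piece_id=17, seen_at_s=120.00, classification="non_failed_airbag_module")
print(activation_time(piece))       # 121.75 s on the system clock
```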
Referring again to fig. 1, certain embodiments of the present disclosure may utilize a vision or optical recognition system 110 and/or a material piece tracking device 111 as a means to track each of the material pieces 101 as they travel on the conveyor system 103. The vision system 110 may utilize one or more still or live-action video cameras 109 to note the location (i.e., position and timing) of each of the material pieces 101 on the moving conveyor system 103. The vision system 110 may be further, or alternatively, configured to perform certain types of identification (e.g., classification) of all or a portion of the material pieces 101, as will be further described herein. For example, such a vision system 110 may be utilized to capture or acquire information about each of the material pieces 101. For example, the vision system 110 may be configured (e.g., with an artificial intelligence ("AI") system) to capture or collect any type of information from the material pieces that can be utilized within the system 100 to classify/distinguish the material pieces 101 and to selectively sort the material pieces 101 as a function of a set of one or more characteristics (e.g., physical and/or chemical and/or radioactive, etc.), as described herein. According to certain embodiments of the present disclosure, the vision system 110 may be configured to capture visual images of each of the material pieces 101 (including one-dimensional, two-dimensional, three-dimensional, or holographic imaging), for example, by using optical sensors as utilized in typical digital cameras and video equipment. Such visual images captured by the optical sensor are then stored in a memory device as spectral image data (e.g., formatted as image data packets). In accordance with certain embodiments of the present disclosure, such spectral image data may represent images captured within the optical wavelengths of light (i.e., the wavelengths of light that are observable by the typical human eye). However, alternative embodiments of the present disclosure may utilize sensor systems configured to capture images of the materials using wavelengths of light outside the visible range.
According to alternative embodiments of the present disclosure, the system 100 may be implemented with one or more sensor systems 120, which may be utilized alone or in combination with the vision system 110 to classify/identify/distinguish the material pieces 101. A sensor system 120 may be configured with any type of sensor technology, including sensors utilizing irradiated or reflected electromagnetic radiation (e.g., utilizing infrared ("IR"), Fourier transform IR ("FTIR"), forward-looking infrared ("FLIR"), near infrared ("NIR"), short wave infrared ("SWIR"), long wave infrared ("LWIR"), medium wave infrared ("MWIR" or "MIR"), X-ray transmission ("XRT"), gamma rays, ultraviolet ("UV"), X-ray fluorescence ("XRF"), laser induced breakdown spectroscopy ("LIBS"), Raman spectroscopy, anti-Stokes Raman spectroscopy, gamma spectroscopy, hyperspectral spectroscopy (e.g., any range beyond visible wavelengths), acoustic spectroscopy, NMR spectroscopy, microwave spectroscopy, or terahertz spectroscopy, including one-dimensional, two-dimensional, or three-dimensional imaging with any of the foregoing), or any other type of sensor technology, including but not limited to chemical or radioactive sensing. An implementation of an XRF system (e.g., for use as a sensor system 120 herein) is further described in U.S. patent No. 10207296. Note that in certain contexts of the description herein, references to a sensor system may thus refer to the vision system. However, any of the vision and sensor systems disclosed herein may be configured to collect or capture information (e.g., characteristics) specifically associated with each of the material pieces, whereby the captured information may then be used to identify/classify/distinguish certain ones of the material pieces.
According to certain embodiments of the present disclosure, more than one optical camera and/or sensor system may be used at different angles to assist in identifying a non-failed airbag module that is partially, or even substantially or completely, obscured by other materials on the conveyor system. According to certain embodiments of the present disclosure, multiple cameras and/or sensor systems may be used to create 3D information in order to generate more usable information than is possible with 2D data. The 2D or 3D data may be utilized with the AI systems disclosed herein.
According to certain embodiments of the present disclosure, lidar systems ("light detection and ranging" or "laser imaging, detection and ranging") may be used in place of camera and/or sensor systems. According to certain embodiments of the present disclosure, a scanning laser may be used to collect 3D data of the waste stream. The laser-based 3D data may then be used with a neural network to identify non-failed airbag modules.
It should be noted that although fig. 1 is illustrated with a combination of vision system 110 and one or more sensor systems 120, embodiments of the present disclosure may be implemented using any combination of sensor systems that utilize any of the sensor technologies disclosed herein or any other sensor technology currently available or developed in the future. Although fig. 1 is illustrated as including one or more sensor systems 120, in certain embodiments of the present disclosure, the implementation of such sensor system(s) is optional. Within certain embodiments of the present disclosure, a combination of both the vision system 110 and the one or more sensor systems 120 may be used to categorize the piece of material 101. Within certain embodiments of the present disclosure, any combination of one or more of the different sensor technologies disclosed herein may be used to classify the piece of material 101 without utilizing the vision system 110. Further, embodiments of the present disclosure may include any combination of one or more sensor systems and/or vision systems, wherein the output of such sensor/vision systems is processed within an AI system (as further disclosed herein) in order to classify/identify materials from a heterogeneous mixture of materials, which may then be sorted from one another.
Within certain embodiments of the present disclosure, the piece tracking device 111 and accompanying control system 112 may be utilized and configured to measure the size and/or shape of each of the pieces 101 as the pieces 101 pass in proximity to the piece tracking device 111 along with the position (i.e., location and timing) of each of the pieces 101 on the moving conveyor system 103. Exemplary operation of such a material piece tracking device 111 and control system 112 is further described in U.S. patent No. 10207296. Alternatively, as previously disclosed, the vision system 110 may be used to track the position (i.e., position and timing) of each of the pieces of material 101 as the pieces of material 101 are transported by the conveyor system 103. Thus, certain embodiments of the present disclosure may be implemented without a material piece tracking device (e.g., material piece tracking device 111) for tracking a material piece.
Such a distance measuring device 111 may be implemented with a well known visible light (e.g. laser) system that continuously measures the distance that light travels before being reflected back into the detector of the laser system. Thus, as each of the pieces of material 101 passes in proximity to the device 111, it outputs a signal indicative of such distance measurement to the control system 112. Thus, such a signal may essentially represent an intermittent series of pulses, whereby during those moments when the piece of material 101 is not in the vicinity of the device 111, a baseline of the signal is generated as a result of the distance measurement between the distance measuring device 111 and the conveyor belt 103, whereas each pulse provides a measurement of the distance between the distance measuring device 111 and the piece of material 101 passing on the conveyor belt 103.
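The pulse behavior described above can be illustrated with a short sketch; the baseline distance, tolerance, and sample values below are invented for the example and are not values from the present disclosure.

```python
# Sketch: interpreting the distance-sensor signal. The baseline is the distance to the
# empty belt; a shorter measured distance ("pulse") indicates a material piece passing by.
BASELINE_MM = 500.0      # distance from sensor to empty conveyor belt (assumed)
TOLERANCE_MM = 5.0       # noise margin (assumed)

def piece_heights(distance_samples_mm):
    """Yield the apparent height of a material piece for each sample belonging to a pulse."""
    for d in distance_samples_mm:
        if BASELINE_MM - d > TOLERANCE_MM:      # sample is part of a pulse
            yield BASELINE_MM - d               # piece height above the belt

samples = [500.2, 499.8, 471.0, 468.5, 470.2, 500.1]   # one piece roughly 30 mm tall
print(list(piece_heights(samples)))                     # [29.0, 31.5, 29.8]
```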
Within certain embodiments of the present disclosure implementing one or more sensor systems 120, the sensor system(s) 120 may be configured to assist the vision system 110 in identifying the chemical composition, relative chemical composition, and/or type of manufacture of each of the pieces of material 101 as the piece of material 101 passes in proximity to the sensor system(s) 120. The sensor system(s) 120 may include an energy emitting source 121, which energy emitting source 121 may be powered, for example, by a power source 122 to stimulate a response from each of the pieces 101.
In certain embodiments of the present disclosure, the sensor system 120 may emit an appropriate sensing signal toward the pieces of material 101 as each piece of material 101 passes near the emission source 121. The one or more detectors 124 may be positioned and configured to sense/detect one or more characteristics from the piece of material 101 in a form suitable for the type of sensor technology utilized. The one or more detectors 124 and the associated detector electronics 125 capture these received sensed characteristics, perform signal processing thereon, and generate digitized information (e.g., spectral data) representative of the sensed characteristics, which is then analyzed in accordance with certain embodiments of the present disclosure so as to be usable for classifying each of the pieces of material 101. This classification may be performed within the computer system 107 and may then be utilized by the automated control system 108 to activate one of the N (N≥1) sorting devices 126...129 of the sorting apparatus for sorting (e.g., removing/transferring/discharging) the pieces of material 101 into one or more of the N (N≥1) sorting containers 136...139 according to the determined classification. Four sorting devices 126...129 and four associated sorting containers 136...139 are illustrated in fig. 1 by way of non-limiting example only.
As described herein, embodiments of the present disclosure are configured for identifying non-failed airbag modules in a moving waste stream (e.g., distinguishing them from other automotive waste pieces), and for sorting these non-failed airbag modules so that they are removed/diverted/discharged from the conveyor system.
The sorting apparatus may include any known sorting mechanism for removing/transferring/discharging selected material pieces 101 identified as non-failed airbag modules toward a desired location, including, but not limited to, transferring the material pieces 101 from the conveyor system into one or more sorting containers. According to certain embodiments of the present disclosure, the sorting mechanism for removing/transferring/discharging a non-failed airbag module from the conveyor system may be configured to do so regardless of whether other material pieces in the vicinity of the non-failed airbag module are also removed/transferred/discharged from the conveyor system along with it, even though this means that one or more other material pieces are lost from the remaining waste stream, because removal of the non-failed airbag module may be more important. For example, referring to any of figs. 6A, 6B, or 7, it can be readily seen that there are other waste pieces in the vicinity of a non-failed airbag module. In such instances, the other waste pieces in the vicinity of the classified non-failed airbag module may be transferred from the conveyor belt into the designated container along with it, because removing the non-failed airbag module from the waste stream is more important than attempting to transfer only the classified non-failed airbag module and risking an inaccurate transfer action by the sorting mechanism that would leave the classified non-failed airbag module in the waste stream.
Mechanisms that may be used to remove/transfer/discharge the material pieces include robotically removing the pieces from the conveyor belt, pushing the pieces from the conveyor belt (e.g., with a paint-brush type plunger), creating an opening (e.g., a trap door) in the conveyor system 103 through which a piece may fall, or using air jets to deflect the pieces into separate containers as they fall from the edge of the conveyor belt. As the term is used herein, a pusher device may refer to any form of device that may be activated to dynamically displace an object on or from a conveyor system/device, employing pneumatic, mechanical, hydraulic, or vacuum actuators, or other means, such as any suitable type of mechanical pushing mechanism (e.g., an ACME screw drive), pneumatic pushing mechanism, or air jet pushing mechanism.
According to certain embodiments of the present disclosure, it may be desirable to remove/transfer/discharge the non-failed airbag modules from the conveyor system in a relatively "gentle" manner so that the non-failed airbag modules are not activated, which would cause them to inflate/explode. According to certain embodiments of the present disclosure, any technique for removing/transferring/discharging a non-failed airbag module from the conveyor system may be utilized, wherein the force applied during the removal/transfer/discharge is configured so that it does not activate the non-failed airbag module and cause it to inflate or explode. For example, the sorting may be performed by a sorting mechanism that transfers the non-failed airbag module into a container using a transfer force configured so as to not activate the non-failed airbag module. Thus, the sorting mechanism may be configured so that it transfers the non-failed airbag module away from the conveyor belt with sufficient force to move it, but with less force than is known to activate such non-failed airbag modules. This force may be determined, for example, by trial and error. According to certain embodiments of the present disclosure, such a sorting mechanism may be a paint-brush type plunger.
Robotic removal may be performed by some suitable robotic arm, such as a Stewart platform, delta robot, or multi-tipped gripper.
In addition to the N sorting containers 136...139 into which material pieces 101 (e.g., non-failed airbag modules) are removed/transferred/discharged, the system 100 may also include a container 140 that receives material pieces 101 (e.g., the remaining automotive waste pieces) that are not transferred/discharged from the conveyor system 103 into any of the aforementioned sorting containers 136...139.
Depending upon the desired classifications of the material pieces, multiple classifications may be mapped to a single sorting device and its associated sorting container. In other words, there need not be a one-to-one correlation between classifications and sorting containers. For example, it may be desired by the user to sort certain classified materials (e.g., non-failed airbag modules and another material type) into the same sorting container. To accomplish this sort, when a material piece 101 is classified as falling into a predetermined grouping of classifications, the same sorting device may be activated to sort the material piece 101 into the same sorting container. Such combination sorting may be applied to produce any desired combination of sorted material pieces. The mapping of classifications may be programmed by the user (e.g., using the algorithm(s) operated by the computer system 107) to produce such desired combinations. Additionally, the classifications of material pieces are user-definable and not limited to any particular known classifications of material pieces.
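The many-to-one mapping of classifications to sorting devices described above might be represented in software as simply as the following sketch. The class names are hypothetical, and the device/container numbers reuse the reference numerals of fig. 1 only for readability.

```python
# Sketch: several user-defined classifications can share one sorting device/container.
CLASS_TO_SORTING_DEVICE = {
    "non_failed_airbag_module": 126,   # divert live airbag modules via device 126
    "suspect_cylindrical_piece": 126,  # user chose to send look-alikes to the same container
    "copper_wire": 127,
    # anything not listed stays on the belt and ends up in container 140
}

def device_for(classification: str):
    return CLASS_TO_SORTING_DEVICE.get(classification)   # None -> no sorting device fires

print(device_for("non_failed_airbag_module"))  # 126
print(device_for("steel_fragment"))            # None
```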
The conveyor system 103 may include a circular conveyor (not shown) so that unclassified material pieces are returned to the beginning of the system 100 and pass through the system 100 again. Furthermore, because the system 100 is able to specifically track each material piece 101 as it travels on the conveyor system 103, a sorting apparatus (e.g., the sorting apparatus 129) may be implemented to remove/divert/discharge material pieces 101 that the system 100 has failed to classify (e.g., material pieces 101 that have not been classified as non-failed airbag modules in accordance with a predetermined threshold value) after a predetermined number of cycles through the system 100 (or such material pieces 101 may be collected in the container 140). In this manner, there is a higher probability that all, or substantially all, non-failed airbag modules are removed/diverted/discharged from the waste stream, including material pieces that the user would expect to be classified as non-failed airbag modules but that were assigned a value below the predetermined threshold.
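One possible reading of this recirculation behavior, offered only as an assumption rather than as the stated algorithm of the present disclosure, is sketched below; the threshold and cycle limit are arbitrary example values.

```python
# Sketch (assumed logic): recirculate low-confidence pieces, then divert them anyway
# after a maximum number of passes so that few live modules remain in the main stream.
SORT_THRESHOLD = 0.95     # divert immediately at or above this score (assumed)
MAX_CYCLES = 3            # after this many passes, divert regardless (assumed)

def decide(score: float, cycles_completed: int) -> str:
    if score >= SORT_THRESHOLD:
        return "divert_now"
    if cycles_completed >= MAX_CYCLES:
        return "divert_low_confidence"    # e.g., via a sorting apparatus such as 129
    return "recirculate"

print(decide(0.97, 0))   # divert_now
print(decide(0.60, 1))   # recirculate
print(decide(0.60, 3))   # divert_low_confidence
```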
As illustrated in fig. 2A-2C, the systems and methods described herein may be applied to sort and/or sort individual airbag modules having any of a variety of sizes.
As previously described, certain embodiments of the present disclosure may implement one or more vision systems (e.g., vision system 110) to identify, track, and/or classify pieces of material. According to embodiments of the present disclosure, such vision system(s) may operate alone to identify and/or sort and sort pieces of material, or may operate in combination with a sensor system (e.g., sensor system 120) to identify and/or sort and sort pieces of material. If a system (e.g., system 100) is configured to operate with only such vision system(s) 110, sensor system 120 may be omitted from system 100 (or simply disabled).
Such vision systems may be configured with one or more devices for capturing or acquiring images of the piece of material as it passes over the conveyor system. The device may be configured to capture or collect any desired range of wavelengths that are irradiated or reflected by the piece of material, including but not limited to visible light, infrared ("IR"), ultraviolet ("UV") light. For example, the vision system may be configured with one or more cameras (still and/or video, any of which may be configured to capture two-dimensional, three-dimensional, and/or holographic images) positioned near (e.g., above) the conveyor system such that an image of the piece of material is captured as the piece of material passes through the sensor system(s). According to alternative embodiments of the present disclosure, the data captured by the sensor system 120 may be processed (converted) into data to be used (either alone or in combination with image data captured by the vision system 110) for sorting/sorting pieces of material. Such implementations may replace classifying the pieces of material with the sensor system 120 or may be combined with classifying the pieces of material with the sensor system 120.
Regardless of the type(s) of sensed characteristics/information captured from the material pieces, the information may then be sent to a computer system (e.g., the computer system 107) for processing (e.g., by an AI system) in order to identify and/or classify the material pieces. The AI system may implement any known AI system (e.g., artificial narrow intelligence ("ANI"), artificial general intelligence ("AGI"), or artificial super intelligence ("ASI")), or derivatives thereof yet to be developed; machine learning systems, including those implementing a neural network (e.g., an artificial neural network, a deep neural network, a convolutional neural network, a recurrent neural network, an autoencoder, reinforcement learning, etc.); or machine learning systems implementing supervised learning, unsupervised learning, semi-supervised learning, reinforcement learning, self-learning, feature learning, sparse dictionary learning, anomaly detection, robot learning, association rule learning, fuzzy logic, deep learning algorithms, deep structured learning/hierarchical learning algorithms, extreme learning machines, support vector machines ("SVM") (e.g., linear SVM, nonlinear SVM, SVM regression, etc.), decision tree learning (e.g., classification and regression trees ("CART")), ensemble methods (e.g., ensemble learning, random forests, bagging and pasting, patches and subspaces, boosting, stacking, etc.), dimensionality reduction (e.g., projection, manifold learning, principal component analysis, etc.), and/or deep machine learning algorithms. Non-limiting examples of publicly available machine learning software and libraries that may be utilized within embodiments of the present disclosure include Python, OpenCV, Inception, Theano, Torch, PyTorch, Pylearn2, Numpy, Blocks, TensorFlow, MXNet, Caffe, Lasagne, Keras, Chainer, Matlab Deep Learning, CNTK, MatConvNet (a MATLAB toolbox implementing convolutional neural networks for computer vision applications), DeepLearnToolbox (a MATLAB toolbox for deep learning from Rasmus Berg Palm), BigDL, Cuda-Convnet (a fast C++/CUDA implementation of convolutional (or, more generally, feed-forward) neural networks), deep belief networks, RNNLM, RNNLIB-RNNLIB, matrbm, Deeplearning4j, Eblearn.lsh, deepmat, MShadow, Matplotlib, SciPy, CXXNET, Nengo, Eblearn, CUDAMat, Gnumpy, 3-way factored RBM and mcRBM, mPoT (Python code for training natural image models using CUDAMat and Gnumpy), ConvNet, Elektronn, OpenNN, NeuralDesigner, Theano Generalized Hebbian Learning, Apache Singa, Lightnet, and SimpleDNN.
According to embodiments of the present disclosure, identifying and/or classifying each of the material pieces 101 may be performed by an AI system implementing semantic segmentation. However, other implementations may be utilized, such as image segmentation (e.g., implemented with Python code), for example Mask R-CNN, panoptic segmentation, instance segmentation, block segmentation, or bounding box algorithms.
Image segmentation enables the identification/classification of material pieces that are partially obscured by other material pieces. Figs. 6A and 7 illustrate exemplary images of material pieces overlapping one another such that one or more non-failed airbag modules are partially obscured, yet these can still be identified/classified as non-failed airbag modules by embodiments of the present disclosure (as illustrated in fig. 7), and thus distinguished from the other automotive waste pieces, such as when the AI system implements some form of image segmentation algorithm.
The configuration of AI systems typically occurs in multiple phases. For example, first, training occurs, which may be performed offline, as the system 100 is not being used to perform actual sorting/sorting of pieces of material. The system 100 may be used to train an AI system because a homogeneous set of pieces of material (also referred to herein as control samples) (i.e., of the same type or class of material) may be passed through the system 100 (e.g., by the conveyor system 103); and all such pieces of material may not be sorted, but may be collected in a common container (e.g., container 140). Alternatively, training may be performed at another location remote from the system 100, including using some other mechanism for collecting sensed information (characteristics) of the control set of pieces of material. During this training phase, algorithms within the AI system extract features from the captured information (e.g., using image processing techniques well known in the art). Non-limiting examples of training algorithms include, but are not limited to: linear regression, gradient descent, feed forward, polynomial regression, learning curve, canonical learning model, and logistic regression. Additionally, training may include data supervision, data organization, data tagging, semi-synthetic data composition, synthetic data generation, data augmentation, and other activities surrounding the preparation of "courses" (e.g., training or control sets) taught to the AI system (e.g., off-board training on stand-alone equipment designed for this purpose, and "no-equipment" training (simulation, augmentation, etc.) performed entirely in computer memory). It is during this training phase that algorithms within the AI system learn the relationship between the material (e.g., captured by the vision system and/or sensor system (s)) and its characteristics/properties, creating a knowledge base for later classification of heterogeneous mixtures of pieces of material received by the system 100, which can then be sorted by desired classification. Such knowledge bases may include one or more libraries, where each library includes parameters (e.g., neural network parameters) for use by the AI system in classifying a piece of material. For example, one particular library may include parameters configured by the training phase to identify and classify airbag modules. According to certain embodiments of the present disclosure, such libraries may be input into the AI system, and then a user of the system 100 may be able to adjust certain of the parameters in order to adjust the operation of the system 100 (e.g., adjust the threshold effectiveness of the AI system to identify/classify and differentiate non-failed airbag modules from a mixture of materials (e.g., a moving automotive waste stream)).
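As an illustration of the training phase, the following hedged sketch fine-tunes a small convolutional network on a labeled control set to produce a parameter library. The directory layout ("control_set/airbag_module" and "control_set/other_scrap"), the hyperparameters, and the output file name are assumptions for the example; the present disclosure leaves the choice of AI framework open.

```python
# Sketch: offline training of a two-class classifier (airbag module vs. other scrap).
# Dataset layout, hyperparameters, and file names are hypothetical.
import torch
from torch import nn, optim
from torchvision import datasets, models, transforms

tfm = transforms.Compose([transforms.Resize((224, 224)), transforms.ToTensor()])
train_set = datasets.ImageFolder("control_set", transform=tfm)   # two class subfolders
loader = torch.utils.data.DataLoader(train_set, batch_size=16, shuffle=True)

model = models.resnet18(weights=None)
model.fc = nn.Linear(model.fc.in_features, 2)     # airbag module vs. other scrap
criterion = nn.CrossEntropyLoss()
optimizer = optim.Adam(model.parameters(), lr=1e-4)

for epoch in range(5):                            # small number of passes for the sketch
    for images, labels in loader:
        optimizer.zero_grad()
        loss = criterion(model(images), labels)
        loss.backward()
        optimizer.step()

torch.save(model.state_dict(), "airbag_library.pt")   # the "parameter library"
```

The saved parameters then play the role of the library loaded into the classification/sorting system, where a user may further adjust thresholds as described above.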
As shown by the example images in fig. 2A-2B, during a training phase, examples of one or more non-failed airbag modules (which may be referred to herein as one or more control sample sets) may be delivered (e.g., by a conveyor system) through a vision system and/or one or more sensor systems such that algorithms within the AI system detect, extract, and learn what features represent such types or classes of materials. For example, each of the non-failed airbag modules is passed through such a training phase so that the algorithm within the AI system "learns" (is trained) how to detect, identify, and classify the non-failed airbag module (see fig. 2C). In the case of training a vision system (e.g., vision system 110), the AI system is trained to visually discern between pieces of material. This creates a parameter library specific to the non-failed airbag module. The same process may then be performed, for example, for a particular class or type of metal alloy (or a mixture of automotive scrap pieces that are not non-spent airbag modules), creating a library of parameters specific to that class or type of metal alloy, and so on. For each class or type of material to be classified by the system, any number of exemplary pieces of material of that class or type of material may be communicated by the vision system and/or one or more sensor systems. Given a captured image or other captured characteristic as input data, the AI algorithm(s) may use N classifiers, where each classifier tests for one of N different material categories or types.
After the algorithm has been established and the AI system has sufficiently learned (trained) the differences (e.g., visually discernable differences) in material classifications (e.g., within a user-defined statistical confidence level), a library for different material classifications is then implemented into a material classification/sorting system (e.g., system 100) for identifying and/or sorting material pieces (e.g., non-fail airbag modules) from heterogeneous mixtures of material pieces (e.g., automotive waste streams), and then sorting such sorted material pieces.
Techniques for constructing, optimizing, and utilizing AI systems are known to those of ordinary skill in the art as found in the relevant literature. Examples of such literature include Krizhevsky et al., "ImageNet Classification with Deep Convolutional Networks," Proceedings of the 25th International Conference on Neural Information Processing Systems, December 3-6, 2012, Lake Tahoe, Nevada, and LeCun et al., "Gradient-Based Learning Applied to Document Recognition," Proceedings of the IEEE, November 1998, both of which are hereby incorporated by reference herein in their entirety.
In an example technique, data captured by a vision or sensor system regarding a particular material piece (e.g., a non-failed airbag module) may be processed as an array of data values within a data processing system implementing (configured with) an AI system (e.g., the data processing system 3400 of fig. 5). For example, the data may be spectral data captured by a digital camera or other type of sensor system with respect to a particular material piece and processed into an array of data values (e.g., an image data packet). Each data value may be represented by a single number, or by a series of numbers representing values. These values may be multiplied by neuron weight parameters (e.g., using a neural network), and may have a bias added. This may then be fed into a neuron nonlinearity. The resulting numbers output from the neurons may be treated in the same way as the original data values, with the outputs multiplied by subsequent neuron weight values, a bias optionally added, and once again fed into a neuron nonlinearity. Each such iteration of the process is known as a "layer" of the neural network. The final outputs of the final layer may be interpreted as probabilities that a material type is present or absent in the captured data pertaining to the material piece. Examples of such processes are described in detail in the previously noted references "ImageNet Classification with Deep Convolutional Networks" and "Gradient-Based Learning Applied to Document Recognition." According to embodiments of the present disclosure, the bias may be configured so that when a captured visual image of a piece of automotive waste contains visual characteristics similar to those of a non-failed airbag module, the AI system classifies that piece of automotive waste as a non-failed airbag module. In other words, the bias may be configured so that the ratio of false positives to false negatives is greater than a predetermined threshold (e.g., 95%). A false positive refers to an instance in which the classification identifies a piece of automotive waste as a non-failed airbag module when it is not (such as when the piece of automotive waste is physically similar to a non-failed airbag module). A false negative refers to an instance in which the classification fails to identify a non-failed airbag module. Since removal of non-failed airbag modules from the automotive waste stream is critical, it may be acceptable to configure a very high predetermined ratio of false positives to false negatives into the classification algorithm of the AI system, even though this may result in other automotive waste pieces also being removed from the waste stream.
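The layer arithmetic just described (inputs multiplied by neuron weights, a bias added, a nonlinearity applied, repeated over layers, with the final output read as a probability) can be illustrated in a few lines of NumPy. The weights below are random placeholders rather than trained values, and the deliberately low decision threshold merely illustrates biasing the system toward false positives, as discussed above.

```python
# Sketch: one hidden layer plus an output neuron, read as P(non-failed airbag module).
# Weights, sizes, and the decision threshold are illustrative placeholders.
import numpy as np

rng = np.random.default_rng(0)

def layer(x, w, b):
    return np.maximum(0.0, x @ w + b)        # weights, bias, then ReLU nonlinearity

x = rng.random(16)                           # flattened sensor/image data values
h = layer(x, rng.standard_normal((16, 8)), rng.standard_normal(8))   # hidden layer
logit = float(h @ rng.standard_normal(8) + 0.5)                      # final layer (+ bias)
probability = 1.0 / (1.0 + np.exp(-logit))   # interpreted as a class probability

# Biasing toward false positives: a low decision threshold means look-alike scrap
# pieces are also diverted rather than risking a missed live module.
DECISION_THRESHOLD = 0.2                      # assumed value, tuned per facility
print(probability, probability >= DECISION_THRESHOLD)
```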
According to certain embodiments of the present disclosure in which a neural network is implemented, in the final layer (the "classification layer"), the final set of neuron outputs is trained to represent the likelihood that a material piece (e.g., an airbag module) is associated with the captured data. During operation, if the likelihood that a material piece is associated with the captured data exceeds a user-specified threshold, then it is determined that the particular material piece is indeed associated with the captured data. These techniques can be extended to determine not only the presence of a type of material associated with particular captured data, but also whether sub-regions of the particular captured data belong to one type of material or another. This process is known as segmentation, and there are techniques in the literature that accomplish it using neural networks, such as what are known as "fully convolutional" neural networks, or networks that are at least partially convolutional, if not fully convolutional. This allows the location and size of the material to be determined. One example is Mask R-CNN, which implements image segmentation.
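Since Mask R-CNN is named above as an example of segmentation, the following sketch shows how a two-class Mask R-CNN from the publicly available torchvision library might be applied to a captured frame. The checkpoint name and thresholds are assumptions, and the trained weights would come from a training phase such as the one described earlier.

```python
# Sketch: per-pixel masks for detections classified as non-failed airbag modules.
# Two-class setup, checkpoint path, and thresholds are hypothetical.
import torch
from torchvision.models.detection import maskrcnn_resnet50_fpn

model = maskrcnn_resnet50_fpn(weights=None, num_classes=2)
model.load_state_dict(torch.load("airbag_maskrcnn.pt"))   # hypothetical trained weights
model.eval()

def module_masks(frame_rgb, score_threshold=0.5):
    """Return boolean per-pixel masks for detections above the user-set threshold."""
    image = torch.from_numpy(frame_rgb).permute(2, 0, 1).float() / 255.0
    with torch.no_grad():
        out = model([image])[0]
    keep = out["scores"] >= score_threshold
    return (out["masks"][keep, 0] > 0.5)       # one HxW boolean mask per kept detection
```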
It should be understood that the present disclosure is not limited exclusively to AI techniques. Other common techniques for material classification/identification may also be used. For example, the sensor system may provide a signal indicating the presence or absence of a certain type of material (e.g., containing one or more specific elements) by examining the spectral emissions of the material (i.e., spectral imaging) using optical spectrometry techniques with a multispectral or hyperspectral camera. Spectral images of a piece of material (e.g., an airbag module) may also be used in a template matching algorithm, wherein a database of spectral images is compared against the acquired spectral image to determine the presence or absence of certain types of materials from the database. The histogram of the captured spectral image may also be compared to a histogram database. Similarly, a bag-of-words model may be used with a feature extraction technique such as the scale-invariant feature transform ("SIFT") to compare extracted features between captured images and images in a database. According to certain embodiments of the present disclosure, instead of utilizing a training phase in which control samples of pieces of material are conveyed past the vision system and/or sensor system(s), training of the machine learning system may be performed utilizing a marking/annotation technique (or any other supervised learning technique), whereby as data/information for a piece of material is captured by the vision/sensor system, a user enters a marking or annotation identifying each piece of material (e.g., a non-failed airbag module), and those markings are then used to create a library used by the machine learning system to classify pieces of material within a heterogeneous mixture of pieces of material. In other words, a knowledge base of characteristics previously captured from one or more samples of a class of materials may be built by any of the techniques disclosed herein, and such a knowledge base is then used to automatically classify the materials.
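For instance, the histogram-comparison alternative mentioned above might look like the following OpenCV sketch; the file names and the 0.8 similarity cut-off are hypothetical, and a production system would compare against many database entries rather than one.

```python
import cv2

def histogram(path):
    img = cv2.imread(path, cv2.IMREAD_GRAYSCALE)
    hist = cv2.calcHist([img], [0], None, [64], [0, 256])   # 64-bin intensity histogram
    return cv2.normalize(hist, hist).flatten()

captured = histogram("captured_piece.png")                  # image of the piece under test
reference = histogram("airbag_module_reference.png")        # one entry from the histogram database

similarity = cv2.compareHist(captured, reference, cv2.HISTCMP_CORREL)
is_candidate_airbag = similarity > 0.8                      # assumed match threshold
```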
Thus, as disclosed herein, certain embodiments of the present disclosure provide for the identification/classification of one or more different types or classes of materials in order to determine which pieces of material (e.g., non-failed airbag modules) should be diverted from a conveyor system in a defined group. According to certain embodiments, AI techniques are utilized to train (i.e., configure) a neural network to identify one or more different classes or types of materials. A spectral image or other type of sensed information is captured for the material (e.g., traveling on the conveyor system), and based on the identification/classification of such material, the systems described herein can decide which pieces of material should be allowed to remain on the conveyor system and which pieces of material should be diverted/removed from the conveyor system (e.g., either into a collection container or onto another conveyor system).
It is noted that, according to certain embodiments of the present disclosure, the collected/captured/detected/extracted features/characteristics (e.g., spectral images) of a piece of material are not necessarily simply identifiable or discernible physical characteristics; they may be abstract formulations that can only be expressed mathematically, or that cannot be expressed at all; nevertheless, the AI system may be configured to parse the spectral data for patterns that allow the control samples to be classified during the training phase. Further, the machine learning system may take sub-portions of the captured information (e.g., spectral images) of the piece of material and attempt to find correlations with the predefined classifications.
According to certain embodiments of the present disclosure, instead of utilizing a training phase in which control samples of the pieces of material are conveyed by the vision system and/or sensor system(s), training of the AI system may be performed utilizing a marking/annotation technique (or any other supervised learning technique), whereby when data/information of the pieces of material (e.g., non-failed airbag modules) is captured by the vision/sensor system, a user enters a marking or annotation identifying each piece of material, which is then used to create a library for use by the AI system in classifying the pieces of material within a heterogeneous mixture of pieces of material.
According to certain embodiments of the present disclosure, any sensed characteristic output by any of the sensor systems 120 disclosed herein may be input into an AI system for classifying and/or sorting materials. For example, in an AI system implementing supervised learning, sensor system 120 output that uniquely characterizes a particular type or composition of material (e.g., a non-failed airbag module) may be used to train the AI system.
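A hedged sketch of this kind of supervised training is shown below using scikit-learn; the synthetic feature matrix, the labels, and the class weighting are stand-ins for real sensor system 120 output and are assumptions made only for illustration.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(0)
X = rng.random((500, 24))            # placeholder: one row of sensed characteristics per piece
y = rng.integers(0, 2, size=500)     # placeholder labels: 1 = non-failed airbag module, 0 = other

# Weighting the airbag class more heavily biases training against false negatives.
clf = RandomForestClassifier(n_estimators=200, class_weight={0: 1, 1: 10})
clf.fit(X, y)
print(clf.predict_proba(X[:1]))      # class probabilities for a newly sensed piece
```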
Fig. 3 illustrates a flow chart depicting an exemplary embodiment of a process 3500 for classifying/sorting pieces of material using a vision system and/or one or more sensor systems, in accordance with certain embodiments of the present disclosure. Process 3500 may be configured for operation within any of the embodiments of the present disclosure described herein, including system 100 of fig. 1. The operations of process 3500 may be performed by hardware and/or software included within a computer system (e.g., computer system 3400 of fig. 5) of a control system (e.g., computer system 107, vision system 110, and/or sensor system(s) 120 of fig. 1). In process block 3501, pieces of material (e.g., a mixture of pieces of automotive waste) may be placed onto a conveyor system, such as represented in figs. 6A and 6B. In process block 3502, the position of each piece of material on the conveyor system is detected so that each piece of material can be tracked as it travels through the system 100. This may be performed by the vision system 110 (e.g., by distinguishing pieces of material from the underlying conveyor system material) in communication with a conveyor system position detector (e.g., position detector 105). Alternatively, a material piece tracking device 111 may be used to track the pieces. As a further alternative, any system that can project a light source (including but not limited to visible light, UV, and IR) and that has a detector capable of locating the pieces may be used to track the pieces. In process block 3503, the sensed information/characteristics of a piece of material are captured/collected when the piece of material has traveled into proximity of the vision system and/or the sensor system(s). In process block 3504, a vision system such as previously disclosed (e.g., implemented within computer system 107) may perform preprocessing of the captured information, which may be used to detect (extract) the information for each piece of material from the background (e.g., the conveyor belt); in other words, the preprocessing may be used to identify the differences between the pieces of material and the background. Well-known image processing techniques such as dilation, thresholding, and contouring may be used to identify pieces of material as being distinct from the background. In process block 3505, segmentation may be performed. For example, the captured information may include information pertaining to more than one piece of material. Additionally, a particular piece of material may be located on a seam of the conveyor belt when its image is captured. In such instances, it may be desirable to isolate the image of each piece of material from the background of the image. In an exemplary technique for process block 3505, the first step is to apply a high contrast to the image; in this way, the background pixels are reduced to substantially all black pixels, and at least some of the pixels associated with the piece of material are brightened to substantially all white pixels. The white image pixels of the piece of material are then dilated to cover the entire extent of the piece of material. After this step, the location of the piece of material appears in the high-contrast image as all white pixels on a black background. A contouring algorithm may then be utilized to detect the boundary of the piece of material. The boundary information is saved, and the boundary position is then transferred to the original image.
Then, segmentation is performed on the original image over an area larger than the earlier defined boundary. In this way, the piece of material is identified and separated from the background.
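The thresholding, dilation, and contouring sequence of process block 3505 might be sketched with OpenCV as follows; the threshold value, kernel size, padding, and file name are illustrative assumptions.

```python
import cv2
import numpy as np

frame = cv2.imread("belt_frame.png")                       # hypothetical captured frame
gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)

# High-contrast step: belt background -> black, piece pixels -> white.
_, mask = cv2.threshold(gray, 60, 255, cv2.THRESH_BINARY)

# Dilate the white pixels so the mask covers the full extent of each piece.
mask = cv2.dilate(mask, np.ones((15, 15), np.uint8), iterations=2)

# Contouring detects the boundary of each piece against the black background.
contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)

pieces = []
for c in contours:
    x, y, w, h = cv2.boundingRect(c)
    pad = 10                                               # segment an area slightly larger than the boundary
    crop = frame[max(0, y - pad):y + h + pad, max(0, x - pad):x + w + pad]
    pieces.append(crop)                                    # isolated piece passed on for classification
```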
According to an embodiment of the present disclosure, process block 3505 may implement a semantic segmentation process that identifies an airbag module within a heterogeneous mixture of pieces of material, such as represented in fig. 7. Alternatively, instance segmentation (such as Mask R-CNN) or panoptic segmentation may be utilized.
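For instance segmentation, a Mask R-CNN model can be driven through torchvision as in the sketch below; the COCO-pretrained weights and the 0.5 score cut-off are placeholders, since an actual deployment would use a model trained on imagery of airbag modules and other automotive waste.

```python
import torch
import torchvision
from torchvision.transforms.functional import to_tensor
from PIL import Image

# COCO-pretrained weights stand in for a model trained on airbag-module imagery.
model = torchvision.models.detection.maskrcnn_resnet50_fpn(weights="DEFAULT")
model.eval()

img = to_tensor(Image.open("belt_frame.png").convert("RGB"))
with torch.no_grad():
    out = model([img])[0]            # dict with "boxes", "labels", "scores", "masks"

keep = out["scores"] > 0.5           # assumed confidence cut-off
masks = out["masks"][keep]           # one soft mask per detected instance
```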
In optional process block 3506, the pieces of material may be conveyed along the conveyor system within the vicinity of a piece tracking device and/or a sensor system to track each of the pieces of material and/or to determine the size and/or shape of each piece; this may be useful if an XRF system or some other spectral sensor is also implemented within the sorting system. In process block 3507, post-processing may be performed. Post-processing may involve resizing the captured information/data to prepare it for use by a neural network. This may also include modifying certain properties of the data (e.g., enhancing image contrast, changing the image background, or applying a filter) in a way that enhances the ability of the AI system to classify and distinguish between pieces of material. In process block 3509, the size of the data may be adjusted. In some cases, it may be desirable to resize the data to match the input requirements of certain AI systems (such as neural networks). For example, a neural network may require an image size (e.g., 225 x 255 pixels or 299 x 299 pixels) that is much smaller than the size of an image captured by a typical digital camera. Moreover, the smaller the input data size, the less processing time is required to perform the classification. Thus, smaller data sizes may ultimately increase the throughput of the system 100 and increase its value.
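The resize step of process block 3509 might be as simple as the following; the 299 x 299 target size and the interpolation choice are assumptions keyed to the example input sizes mentioned above.

```python
import cv2

NET_INPUT = (299, 299)                                     # target size expected by the network
crop = cv2.imread("piece_crop.png")                        # hypothetical segmented piece
resized = cv2.resize(crop, NET_INPUT, interpolation=cv2.INTER_AREA)
```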
In process blocks 3510 and 3511, each piece of material is identified/classified based on the sensed/detected characteristics. For example, process block 3510 may be configured with a neural network employing one or more algorithms that compare the extracted features to features stored in a previously generated knowledge base (e.g., generated during a training phase), and assign to each piece of material the classification with the highest match based on such a comparison. The algorithms may process the captured information/data in a hierarchical manner by using automatically trained filters. The filter responses are then successively combined in the next stage of the algorithm until probabilities are obtained in the final step. In process block 3511, these probabilities, one for each of the N classifications, may be used to determine into which of the N sorting containers the respective piece of material should be sorted. For example, each of the N classifications may be assigned to one sorting container, and the piece of material under consideration is sorted into the container corresponding to the highest returned probability that is greater than a predetermined threshold. Within embodiments of the present disclosure, such predetermined thresholds may be preset by a user (e.g., to ensure that false positive classifications substantially exceed false negative classifications in number). If none of the probabilities is greater than the predetermined threshold, the particular piece of material may be sorted into an exception container (e.g., sorting container 140).
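A minimal sketch of the probability-to-container mapping of process block 3511 is shown below; the class list, the 0.60 threshold, and the container names are hypothetical.

```python
import numpy as np

CONTAINERS = ["airbag_module", "aluminum", "copper", "steel"]   # hypothetical N classifications
EXCEPTION_CONTAINER = "exception"                                # e.g., sorting container 140
THRESHOLD = 0.60                                                 # user-preset value

def route(probabilities):
    best = int(np.argmax(probabilities))
    if probabilities[best] >= THRESHOLD:
        return CONTAINERS[best]          # container assigned to the winning classification
    return EXCEPTION_CONTAINER           # no classification was confident enough

print(route(np.array([0.15, 0.70, 0.10, 0.05])))   # -> "aluminum"
```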
Next, in process block 3512, a sorting device corresponding to the classification (or classifications) of the piece of material is activated (e.g., an instruction is sent to the sorting device to perform the sort). Between the time the image of the piece of material is captured and the time the sorting device is activated, the piece of material has moved (at the conveyance rate of the conveyor system) from the proximity of the vision system and/or sensor system(s) to a position farther downstream on the conveyor system. In embodiments of the present disclosure, the activation of the sorting device is timed such that, as the piece of material passes the sorting device mapped to its classification, the sorting device is activated and the piece of material is removed/diverted/ejected from the conveyor system (e.g., into its associated sorting container). Within embodiments of the present disclosure, the activation of the sorting device may be timed by a respective position detector that detects when a piece of material passes in front of the sorting device and sends a signal to enable the activation of the sorting device. In process block 3513, the sorting container corresponding to the activated sorting device receives the removed/diverted/ejected piece of material.
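The activation timing can be reduced to a distance-over-speed delay, as in the sketch below; the belt speed, the camera-to-sorter distance, and the device interface are assumptions for illustration.

```python
import threading

BELT_SPEED_M_S = 1.5          # conveyance rate of the conveyor system (assumed)
CAMERA_TO_SORTER_M = 2.4      # distance from the vision system to the mapped sorting device (assumed)

def schedule_activation(sorting_device, extra_offset_s=0.0):
    """Fire the sorting device when the classified piece reaches it."""
    delay = CAMERA_TO_SORTER_M / BELT_SPEED_M_S + extra_offset_s
    threading.Timer(delay, sorting_device.activate).start()   # 'activate' is a hypothetical device method
```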
Fig. 4 illustrates a flow chart depicting an exemplary embodiment of a process 400 of sorting pieces of material according to certain embodiments of the present disclosure. Process 400 may be configured for operation within any of the embodiments of the present disclosure described herein, including system 100 of fig. 1. Process 400 may be configured to operate with process 3500. For example, according to certain embodiments of the present disclosure, process blocks 403 and 404 may be incorporated in process 3500 (e.g., operate in series or in parallel with process blocks 3503-3510) to combine the operation of vision system 110 implemented with an AI system with a sensor system (e.g., sensor system 120) not implemented with an AI system to sort and/or sort pieces of material.
The operations of process 400 may be performed by hardware and/or software included within a computer system (e.g., computer system 3400 of fig. 5) of a control system (e.g., computer system 107 of fig. 1). In process block 401, a piece of material may be placed onto a conveyor system. Next, in optional process block 402, the pieces of material may be conveyed along a conveyor system in the vicinity of the piece tracking device and/or the optical imaging system to track each piece of material and/or determine the size and/or shape of the piece of material. In process block 403, the piece of material may be interrogated or stimulated with EM energy (waves) or some other type of stimulus suitable for the particular type of sensor technology utilized by the sensor system as the piece of material travels into proximity of the sensor system. In process block 404, a physical property of the piece of material is sensed/detected and captured by a sensor system. In process block 405, for at least some of the pieces of material, the type of material is identified/classified based (at least in part) on the captured characteristics, which may be combined with the classification by the AI system in conjunction with the vision system 110.
Next, if sorting of the pieces of material is to be performed, in process block 406, a sorting device corresponding to the classification (or classifications) of the piece of material is activated. Between the time the piece of material is sensed and the time the sorting device is activated, the piece of material has moved from the proximity of the sensor system to a position farther downstream on the conveyor system, at the conveying rate of the conveyor system. In certain embodiments of the present disclosure, the activation of the sorting device is timed such that, as the piece of material passes the sorting device mapped to its classification, the sorting device is activated and the piece of material is removed/diverted/ejected from the conveyor system into its associated sorting container. Within certain embodiments of the present disclosure, the activation of the sorting device may be timed by a respective position detector that detects when a piece of material passes in front of the sorting device and sends a signal to enable the activation of the sorting device. In process block 407, the sorting container corresponding to the activated sorting device receives the removed/diverted/ejected piece of material.
According to certain embodiments of the present disclosure, a plurality of the systems 100 may be linked together in series in order to perform multiple iterations or layers of sorting. For example, when two or more systems 100 are linked in this manner, the conveyor system may be implemented with a single conveyor belt or with multiple conveyor belts, such that the pieces of material are conveyed past a first vision system (and, according to certain embodiments, a sensor system) configured for sorting pieces of material of a first classification within the heterogeneous mixture into a first set of one or more sorting containers (e.g., sorting containers 136…) by a first sorter (e.g., first automation control system 108 and one or more associated sorting devices 126…129), and are then conveyed past a second vision system (and, according to certain embodiments, another sensor system) configured for sorting pieces of material of a second classification within the heterogeneous mixture into a second set of one or more sorting containers by a second sorter. For example, the first sorting system may sort out the non-failed airbag modules so that the non-failed airbag modules are safely removed from the automotive waste stream before the second sorting system sorts between two or more metal alloys. For further discussion of such multi-stage sorting, see U.S. published patent application No. 2022/0016675, which is incorporated herein by reference.
Such a series of systems 100 may comprise any number of systems linked together in this manner. According to certain embodiments of the present disclosure, each successive vision system may be configured to sort out materials of a different classification, or of different types, than the previous system(s).
According to various embodiments of the present disclosure, different types or classes of materials may be classified by different types of sensors, each for use with an AI system, and combined to classify pieces of material in a waste material or waste stream.
According to various embodiments of the present disclosure, data (e.g., spectral data) from two or more sensors may be combined using a single or multiple AI systems to perform classification of pieces of material.
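One simple form of such sensor fusion is to concatenate the per-piece feature vectors from two sensors before classification, as in the sketch below; the feature sources and dimensions are assumed purely for illustration.

```python
import numpy as np

camera_features = np.random.rand(128)   # placeholder: embedding derived from the vision system
xrf_counts = np.random.rand(32)         # placeholder: spectral counts from an XRF-type sensor

# A single AI system can then be trained and evaluated on the fused representation.
combined = np.concatenate([camera_features, xrf_counts])
```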
According to various embodiments of the present disclosure, multiple sensor systems may be installed on a single conveyor system, with each sensor system utilizing a different AI system. According to various embodiments of the present disclosure, multiple sensor systems may be installed on different conveyor systems, with each sensor system utilizing a different AI system.
Referring now to FIG. 5, a block diagram is depicted that illustrates a data processing ("computer") system 3400 in which aspects of embodiments of the present disclosure may be implemented. (the terms "computer," "system," "computer system," and "data processing system" may be used interchangeably herein.) computer system 107, automation control system 108, aspects of sensor system(s) 120, and/or vision system 110 may be configured similarly to computer system 3400. The computer system 3400 may employ a local bus 3405 (e.g., a peripheral component interconnect ("PCI") local bus architecture). Any suitable bus architecture may be utilized, such as an accelerated graphics port ("AGP"), industry standard architecture ("ISA"), and the like. One or more processors 3415, volatile memory 3420, and non-volatile memory 3435 may be connected to the local bus 3405 (e.g., through a PCI bridge (not shown)). An integrated memory controller and buffer memory may be coupled to the one or more processors 3415. The one or more processors 3415 may include one or more central processor units and/or one or more graphics processor units and/or one or more tensor processing units. Additional connections to the local bus 3405 may be made through direct component interconnection or through a card. In the depicted example, communications (e.g., network (LAN)) adapter 3425, I/O (e.g., small computer system interface ("SCSI") host bus) adapter 3430, and expansion bus interface (not shown) may be connected to local bus 3405 by direct component connection. An audio adapter (not shown), a graphics adapter (not shown), and a display adapter 3416 (coupled to the display 3440) may be connected to the local bus 3405 (e.g., through a card plugged into an expansion slot).
The user interface adapter 3412 may provide a connection for a keyboard 3413 and a mouse 3414, a modem/router (not shown), and additional memory (not shown). The I/O adapter 3430 may provide connections for a hard disk drive 3431, a tape drive 3432, and a CD-ROM drive (not shown).
One or more operating systems may run on the one or more processors 3415 and are used to coordinate and provide control of various components within the computer system 3400. In fig. 5, the operating system(s) may be commercially available operating systems. An object oriented programming system (e.g., java, python, etc.) may run in conjunction with the operating system and provide calls to the operating system from one or more programs (e.g., java, python, etc.) executing on system 3400. Instructions for the operating system, the object-oriented operating system, and programs may be located on non-volatile memory 3435 storage devices, such as hard disk drive 3431, and may be loaded into volatile memory 3420 for execution by processor 3415.
Those of ordinary skill in the art will appreciate that the hardware in FIG. 5 may vary depending on the implementation. Other internal hardware or peripheral devices, such as flash ROM (or equivalent nonvolatile memory) or optical disk drives and the like, may be used in addition to or in place of the hardware depicted in FIG. 5. Further, any of the processes of the present disclosure may be applied to a multiprocessor computer system, or performed by a plurality of such systems 3400. For example, training of vision system 110 may be performed by a first computer system 3400, while operations of vision system 110 for classification may be performed by a second computer system 3400.
As another example, computer system 3400 may be a stand-alone system configured to be bootable without relying on some type of network communication interface, whether or not computer system 3400 comprises some type of network communication interface. As a further example, the computer system 3400 may be an embedded controller configured with ROM and/or flash ROM that provides non-volatile memory that stores operating system files or user-generated data.
The depicted example in FIG. 5 and above-described examples are not meant to imply architectural limitations. Further, the computer program forms of aspects of the disclosure may reside on any computer readable storage medium (i.e., floppy disk, compact disk, hard disk, magnetic tape, ROM, RAM, etc.) used by a computer system.
As has been described herein, embodiments of the present disclosure may be implemented to perform the various functions described for identifying, tracking, sorting, and/or sorting pieces of material. Such functionality may be implemented within hardware and/or software, such as within one or more data processing systems (e.g., data processing system 3400 of fig. 5), such as the previously mentioned computer system 107, vision system 110, aspects of sensor system(s) 120, and/or automation control system 108. However, the functionality described herein is not limited to implementation in any particular hardware/software platform.
As will be appreciated by one of skill in the art, aspects of the present disclosure may be embodied as systems, processes, methods, and/or program products. Thus, aspects of the present disclosure may take the form of an entirely hardware embodiment, an entirely software embodiment (including firmware, resident software, micro-code, etc.), or an embodiment combining software and hardware aspects, which may all generally be referred to herein as a "circuit," "circuitry," "module," or "system." Furthermore, aspects of the present disclosure may take the form of a program product embodied in one or more computer-readable storage media, the program product having computer-readable program code embodied on the one or more computer-readable storage media. (However, any combination of one or more computer readable media may be utilized. The computer readable medium may be a computer readable signal medium or a computer readable storage medium.)
The computer readable storage medium may be, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, biological, atomic, or semiconductor system, apparatus, controller, or device, or any suitable combination of the foregoing, where the computer readable storage medium itself is not a transitory signal. More specific examples (a non-exhaustive list) of the computer-readable storage medium could include the following: an electrical connection having one or more wires, a portable computer diskette, a hard disk, a random access memory ("RAM") (e.g., RAM 3420 of fig. 5), a read-only memory ("ROM") (e.g., ROM 3435 of fig. 5), an erasable programmable read-only memory ("EPROM" or flash memory), an optical fiber, a portable compact disc read-only memory ("CD-ROM"), an optical storage device, a magnetic storage device (e.g., hard drive 3431 of fig. 5), or any suitable combination of the foregoing. In the context of this document, a computer readable storage medium may be any tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, controller, or device. Program code embodied on a computer readable signal medium may be transmitted using any appropriate medium, including but not limited to wireless, wireline, optical fiber cable, radio frequency, etc., or any suitable combination of the foregoing.
The computer readable signal medium may include a propagated data signal with computer readable program code embodied therein (e.g., in baseband or as part of a carrier wave). Such a propagated signal may take any of a variety of forms, including, but not limited to, electro-magnetic, optical, or any suitable combination thereof. A computer readable signal medium may be any computer readable medium that is not a computer readable storage medium and that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, controller, or device.
The flowcharts and block diagrams in the figures illustrate the architecture, functionality, and operation of possible implementations of systems, methods, processes and program products according to various embodiments of the present disclosure. In this regard, each block in the flowchart or block diagrams may represent a module, segment, or portion of code, which comprises one or more executable program instructions for implementing the specified logical function(s). It should also be noted that, in some alternative implementations, the functions noted in the block may occur out of the order noted in the figures. For example, two blocks shown in succession may in fact be executed substantially concurrently or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved.
Modules implemented in software for execution by various types of processors (e.g., GPU 3401, CPU 3415) may, for example, comprise physical or logical blocks of one or more computer instructions, which may, for example, be organized as objects, procedures, or functions. However, the executables of an identified module need not be physically located together, but may comprise disparate instructions stored in different locations which, when joined logically together, comprise the module and achieve the stated purpose for the module. Indeed, a module of executable code may be a single instruction, or many instructions, and may even be distributed over several different code segments, among different programs, and across several memory devices. Similarly, operational data (e.g., the materials taxonomy library described herein) may be identified and illustrated herein within modules, and may be embodied in any suitable form and organized within any suitable type of data structure. The operational data may be collected as a single data set, or may be distributed over different locations including over different storage devices. The data may provide an electronic signal on a system or network.
These program instructions may be provided to one or more processors and/or controller(s) of a general purpose computer, special purpose computer, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor(s) of the computer or other programmable data processing apparatus (e.g., GPU 3401, CPU 3415), create means or circuitry for implementing the functions/acts specified in the flowchart and/or block diagram block or blocks.
It will also be noted that each block of the block diagrams and/or flowchart illustration, and combinations of blocks in the block diagrams and/or flowchart illustration, can be implemented by special purpose hardware-based systems which perform the specified functions or acts, which can include, for example, one or more graphics processing units (e.g., GPU 3401), or combinations of special purpose hardware and computer instructions. For example, a module may be implemented as a hardware circuit comprising custom Very Large Scale Integration (VLSI) circuits or gate arrays, off-the-shelf semiconductors such as logic chips, transistors, controllers, or other discrete components. A module may also be implemented in programmable hardware devices such as field programmable gate arrays, programmable array logic, programmable logic devices or the like.
In the description herein, the techniques of the flowcharts may be described in terms of a series of sequential actions. The order of acts and the elements performing the acts may be varied freely without departing from the scope of the present teachings. Actions may be added, deleted, or altered in several ways. Similarly, actions may be reordered or looped. Further, although processes, methods, algorithms, etc. may be described in a sequential order, such processes, methods, algorithms, or any combination thereof, may be operable to be executed in alternate orders. Further, some acts within a process, method, or algorithm may be performed concurrently (e.g., acts are performed in parallel) during at least one point in time, and may also be performed in whole, in part, or any combination thereof.
Reference herein may be made to a device, circuit, circuitry, system, or module configured to perform one or more particular functions. It should be appreciated that this may include selecting predefined logic blocks and logically associating them so that they provide specific logic functions, including monitoring or control functions. It may also include programming computer software-based logic, wiring discrete hardware components, or a combination of any or all of the foregoing.
To the extent not described herein, many details of the processing acts and circuits are conventional with respect to the specific materials and may be found in textbooks and other sources within the computing, electronic and software arts.
Computer program code (i.e., instructions) for carrying out operations for aspects of the present disclosure may be written in any combination of one or more programming languages, including an object-oriented programming language such as Java, Smalltalk, Python, C++, or the like, a conventional procedural programming language such as the "C" programming language or similar programming languages, a programming language such as MATLAB or LabVIEW, or any of the AI software disclosed herein. The program code may execute entirely on the user's computer system, partly on the user's computer system (e.g., the computer system utilized for sorting) and partly on a remote computer system (e.g., the computer system utilized to train the AI system), or entirely on the remote computer system or server as a stand-alone software package. In the latter scenario, the remote computer system may be connected to the user's computer system through any type of network, including a local area network ("LAN") or a wide area network ("WAN"), or the connection may be made to an external computer system (for example, through the Internet using an Internet service provider). As an example of the foregoing, various aspects of the present disclosure may be configured to execute on one or more of the computer system 107, automation control system 108, vision system 110, and sensor system(s) 120.
These program instructions may also be stored in a machine-readable storage medium that can direct a computer system, other programmable data processing apparatus, controller, or other device to function in a particular manner, such that the instructions stored in the machine-readable medium produce an article of manufacture including instructions which implement the function/act specified in the flowchart and/or block diagram block or blocks.
The program instructions may also be loaded onto a computer, other programmable data processing apparatus, controller, or other devices to cause a series of operational steps to be performed on the computer, other programmable apparatus or other devices to produce a computer implemented process such that the instructions which execute on the computer or other programmable apparatus provide processes for implementing the functions/acts specified in the flowchart and/or block diagram block or blocks.
One or more databases can be included in the host for storing data and providing access to data for various implementations. Those skilled in the art will also appreciate that any database, system, or component of the present disclosure may include any combination of databases or components at a single location or multiple locations for security reasons, where each database or system may include any of a variety of suitable security features (such as firewalls, access codes, encryption, decryption, etc.). The database may be any type of database, such as a relational database, a hierarchical database, an object-oriented database, and the like. Common database products that may be used to implement a database include IBM's DB2, any of the database products available from Oracle corporation, microsoft Access from Microsoft corporation, or any other database product. The database may be organized in any suitable manner, including as a data table or a look-up table.
The association of certain data (e.g., for each of the pieces of material handled by the material handling system described herein) may be accomplished by any data association technique known and practiced in the art. For example, the association may be done either manually or automatically. Automatic association techniques may include, for example, database searches, database merging, GREP, AGREP, SQL, and the like. The association step may be accomplished by a database merge function, for example, using key fields in each of the manufacturer and retailer data tables. The key field partitions the database according to the high-level class of objects defined by the key field. For example, a certain category may be specified as a key field in both the first data table and the second data table, and then the two data tables may be merged based on category data in the key field. In these embodiments, the data corresponding to the key fields in each of the merged data tables is preferably the same. However, for example, data tables with similar but not identical data in the key fields may also be consolidated by using AGREP.
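As a small illustration of such a key-field merge (with hypothetical table and column names), two per-piece data tables can be joined as follows:

```python
import pandas as pd

captures = pd.DataFrame({"piece_id": [101, 102], "category": ["airbag_module", "aluminum"]})
sorts = pd.DataFrame({"piece_id": [101, 102], "container": [3, 1]})

# Merge the two data tables on the shared key field.
merged = captures.merge(sorts, on="piece_id", how="inner")
print(merged)
```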
Aspects of the present disclosure provide a method of sorting non-failed airbag modules from a moving automotive waste stream, wherein the method includes: conveying pieces of automotive waste past a vision system, wherein the pieces of automotive waste include a non-failed airbag module; capturing a visual image of the pieces of automotive waste; processing, by an artificial intelligence system, the captured visual image of the pieces of automotive waste to distinguish the non-failed airbag module from the other pieces of automotive waste; and sorting the non-failed airbag module from the moving automotive waste stream. The sorting may include transferring the non-failed airbag module into a container along with other pieces of automotive waste in the vicinity of the non-failed airbag module. The sorting may be performed without activating the non-failed airbag module. The sorting may be performed by a sorting mechanism that transfers the non-failed airbag module using a transfer force configured so as not to activate the non-failed airbag module. The sorting mechanism may be a paint brush plunger. The non-failed airbag module may be partially obscured by at least one other piece of automotive waste such that the vision system cannot collect spectral image data of the entirety of the non-failed airbag module. The artificial intelligence system may be configured to identify a partially obscured non-failed airbag module. The artificial intelligence system may be configured with a semantic segmentation algorithm for distinguishing between non-failed airbag modules and other pieces of automotive waste. The method may further include sorting the pieces of automotive waste into individual metal alloys after the non-failed airbag modules have been sorted from the automotive waste stream. The artificial intelligence system may be configured to classify a particular piece of automotive waste as a non-failed airbag module at a false positive to false negative ratio greater than a predetermined threshold.
Aspects of the present disclosure provide a system for sorting non-failed airbag modules from a moving automotive waste stream, wherein the system includes: a conveyor system for conveying pieces of automotive waste past a vision system, wherein the pieces of automotive waste include a non-failed airbag module; the vision system configured to capture a visual image of the pieces of automotive waste; a data processing system configured with an artificial intelligence system configured to process the captured visual image of the pieces of automotive waste to distinguish the non-failed airbag module from the other pieces of automotive waste; and a sorting device for sorting the non-failed airbag module from the moving automotive waste stream. The sorting may include transferring the non-failed airbag module into a container along with other pieces of automotive waste in the vicinity of the non-failed airbag module. The sorting may be performed without activating the non-failed airbag module. The sorting device may include a sorting mechanism that transfers the non-failed airbag module using a transfer force configured so as not to activate the non-failed airbag module. The sorting mechanism may be a paint brush plunger. The non-failed airbag module may be partially obscured by at least one other piece of automotive waste such that the vision system cannot collect spectral image data of the entirety of the non-failed airbag module. The artificial intelligence system may be configured to identify a partially obscured non-failed airbag module and to distinguish the partially obscured non-failed airbag module from other pieces of automotive waste. The artificial intelligence system may be configured with a Mask R-CNN algorithm for distinguishing between non-failed airbag modules and other pieces of automotive waste. The artificial intelligence system may be configured to classify a particular piece of automotive waste as a non-failed airbag module if the particular piece of automotive waste is sufficiently similar to a non-failed airbag module. The artificial intelligence system may be configured to classify a particular piece of automotive waste as a non-failed airbag module at a false positive to false negative ratio greater than a predetermined threshold.
In the description herein, numerous specific details are provided, such as examples of programming, software modules, user selections, network transactions, database queries, database structures, hardware modules, hardware circuits, hardware chips, controllers, etc., to provide a thorough understanding of embodiments of the disclosure. One skilled in the relevant art will recognize, however, that the disclosure may be practiced without one or more of the specific details, or with other methods, components, materials, etc. In other instances, well-known structures, materials, or operations are not shown or described in detail to avoid obscuring aspects of the invention.
Reference throughout this specification to "one embodiment" or "an embodiment" or similar language means that a particular feature, structure, or characteristic described in connection with the embodiment is included in at least one embodiment of the present disclosure. Thus, appearances of the phrases "in one embodiment," "in an embodiment," "embodiments," "certain embodiments," and similar language throughout this specification may, but do not necessarily, all refer to the same embodiment. Furthermore, the described features, structures, aspects, and/or characteristics of the disclosure may be combined in any suitable manner in one or more embodiments. Accordingly, even though features may be initially claimed as functioning in certain combinations, one or more features from a claimed combination may in some cases be excised from the combination, and the claimed combination may be directed to a subcombination or a variation of a subcombination.
Benefits, other advantages, and solutions to problems have been described above with regard to specific embodiments. The benefits, advantages, solutions to problems, and any element(s) that may cause any benefit, advantage, or solution to occur or become more pronounced, however, are not to be construed as a critical, required, or essential feature or element of any or all the claims. Further, unless explicitly described as being necessary or critical, the components described herein are not required for the practice of the present disclosure.
Herein, the term "or" is intended to be inclusive, wherein "A or B" includes A or B and also includes both A and B. As used herein, the term "and/or" when used in the context of a list of entities refers to the entities being present alone or in combination. Thus, for example, the phrase "A, B, C, and/or D" includes A, B, C, and D individually, but also includes any and all combinations and subcombinations of A, B, C, and D.
The terminology used herein is for the purpose of describing particular embodiments only and is not intended to be limiting of the disclosure. As used herein, the singular forms "a," "an," and "the" are intended to include the plural forms as well, unless the context clearly indicates otherwise.
The corresponding structures, materials, acts, and equivalents of all means or step plus function elements in the claims below may be intended to include any structure, material, or act for performing the function in combination with other claimed elements as specifically claimed.
As used herein, "substantially" with respect to an identified property or condition refers to a degree of deviation that is sufficiently small so as not to visually deviate from the identified property or condition. In some cases, the exact degree of allowable deviation may depend on the particular context.
As used herein, a plurality of items, structural elements, compositional elements, and/or materials may be presented in a common list for convenience. However, these lists should be understood as though each member of the list is individually identified as a separate and unique member. Thus, any individual component on such a list should not be considered to be virtually identical to any other component on the same list based solely on its presentation in a common group, without an indication to the contrary.
Unless defined otherwise, all technical and scientific terms used herein, such as abbreviations for chemical elements in the periodic table of elements, have the same meaning as commonly understood by one of ordinary skill in the art to which the presently disclosed subject matter pertains. Although any methods, devices, and materials similar or equivalent to those described herein can be used in the practice or testing of the presently disclosed subject matter, representative methods, devices, and materials are now described.
The term coupled, as used herein, is not intended to be limited to a direct coupling or a mechanical coupling. Unless otherwise indicated, terms such as "first" and "second" are used to arbitrarily distinguish between the elements such terms describe. Thus, these terms are not necessarily intended to indicate temporal or other prioritization of such elements.

Claims (20)

1. A method of sorting non-spent airbag modules from a moving automotive waste stream, comprising:
conveying a piece of automotive waste through a vision system, wherein the piece of automotive waste comprises a non-spent airbag module;
capturing a visual image of the automotive waste item;
processing, by an artificial intelligence system, the captured visual image of the automotive waste part to distinguish the non-spent airbag module from other automotive waste parts; and
sorting said non-spent airbag modules from said moving automotive waste stream.
2. The method of claim 1, wherein the sorting comprises transferring the non-failed airbag module into a container along with other automotive waste in an area proximate the non-failed airbag module.
3. The method of claim 1, wherein the sorting is performed without activating the non-failed airbag module.
4. The method of claim 1, wherein the sorting is performed by a sorting mechanism that transfers the non-failed airbag module using a transfer force configured so as not to activate the non-failed airbag module.
5. The method of claim 4, wherein the sorting mechanism is a paint brush plunger.
6. The method of claim 2, wherein the non-failed airbag module is partially obscured by at least one other automotive waste piece such that the vision system is unable to collect spectral image data of the entirety of the non-failed airbag module.
7. The method of claim 6, wherein the artificial intelligence system is configured to identify a partially occluded non-failed airbag module.
8. The method of claim 7, wherein the artificial intelligence system is configured with a semantic segmentation algorithm for distinguishing between non-failed airbag modules and other automotive waste components.
9. The method of claim 1, further comprising sorting automotive scrap pieces into individual metal alloys after sorting the non-spent airbag modules from the automotive scrap piece stream.
10. The method of claim 1, wherein the artificial intelligence system is configured to classify a particular piece of automotive waste as a non-failed airbag module at a false positive to false negative ratio greater than a predetermined threshold.
11. A system for sorting non-spent airbag modules from a moving automotive waste stream, comprising:
a conveyor system for conveying automotive waste through a vision system, wherein the automotive waste comprises a non-fail airbag module;
a vision system configured to capture a visual image of the automotive waste item;
a data processing system configured with an artificial intelligence system, the artificial intelligence system configured to process the captured visual image of the automotive waste item to distinguish the non-spent airbag module from other automotive waste items; and
sorting apparatus for sorting said non-spent airbag modules from said moving automotive waste stream.
12. The system of claim 11, wherein the sorting comprises transferring the non-failed airbag module into a container along with automotive waste in an area proximate the non-failed airbag module.
13. The system of claim 11, wherein the sorting is performed without activating the non-failed airbag module.
14. The system of claim 11, wherein the sorting apparatus comprises a sorting mechanism that transfers the non-failed airbag module using a transfer force configured so as not to activate the non-failed airbag module.
15. The system of claim 14, wherein the sorting mechanism is a paint brush plunger.
16. The system of claim 12, wherein the non-failed airbag module is partially obscured by at least one other automotive waste piece such that the vision system is unable to collect spectral image data of the entirety of the non-failed airbag module.
17. The system of claim 16, wherein the artificial intelligence system is configured to identify partially occluded non-failed airbag modules and to distinguish the partially occluded non-failed airbag modules from other automotive waste pieces.
18. The system of claim 17, wherein the artificial intelligence system is configured with a semantic segmentation algorithm for distinguishing between non-failed airbag modules and other automotive waste components.
19. The system of claim 11, wherein the artificial intelligence system is configured to classify a particular automotive waste part as a non-failed airbag module if the particular automotive waste part is sufficiently similar to the non-failed airbag module.
20. The system of claim 11, wherein the artificial intelligence system is configured to classify a particular piece of automotive waste as a non-failed airbag module at a false positive to false negative ratio greater than a predetermined threshold.
CN202280015959.5A 2021-08-05 2022-08-05 Removal of airbag modules from automotive waste Pending CN116997423A (en)

Applications Claiming Priority (7)

Application Number Priority Date Filing Date Title
US63/229,724 2021-08-05
US17/491,415 2021-09-30
US17/495,291 2021-10-06
US17/667,397 2022-02-08
US17/752,669 US20220355342A1 (en) 2015-07-16 2022-05-24 Sorting of contaminants
US17/752,669 2022-05-24
PCT/US2022/039622 WO2023015000A1 (en) 2021-08-05 2022-08-05 Removing airbag modules from automotive scrap

Publications (1)

Publication Number Publication Date
CN116997423A true CN116997423A (en) 2023-11-03

Family

ID=88534339

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202280015959.5A Pending CN116997423A (en) 2021-08-05 2022-08-05 Removal of airbag modules from automotive waste

Country Status (1)

Country Link
CN (1) CN116997423A (en)

Similar Documents

Publication Publication Date Title
US20210346916A1 (en) Material handling using machine learning system
US11975365B2 (en) Computer program product for classifying materials
US11964304B2 (en) Sorting between metal alloys
US11260426B2 (en) Identifying coins from scrap
CN112543680A (en) Recovery of coins from waste
US20220355342A1 (en) Sorting of contaminants
US20220203407A1 (en) Sorting based on chemical composition
CA3233146A1 (en) Multiple stage sorting
WO2023076186A1 (en) Metal separation in a scrap yard
US20220371057A1 (en) Removing airbag modules from automotive scrap
CN116997423A (en) Removal of airbag modules from automotive waste
Koganti et al. Deep Learning based Automated Waste Segregation System based on degradability
TWI829131B (en) Method and system for sorting materials, and computer program product stored on computer readable storage medium
EP4267319A1 (en) Removing airbag modules from automotive scrap
WO2023003670A1 (en) Material handling system
WO2023003669A1 (en) Material classification system
CN116917055A (en) Sorting based on chemical compositions
US20230053268A1 (en) Classification and sorting with single-board computers
CA3209464A1 (en) Sorting based on chemical composition
US20240132297A1 (en) Thin strip classification
US20240133830A1 (en) Correction techniques for material classification
US20230173543A1 (en) Mobile sorter
US20230044783A1 (en) Metal separation in a scrap yard
WO2024086836A1 (en) Thin strip classification
WO2022251373A1 (en) Sorting of contaminants

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination