WO2023286795A1 - Information processing method, information processing device, and information processing program - Google Patents
Information processing method, information processing device, and information processing program
- Publication number
- WO2023286795A1 (PCT/JP2022/027504)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- waste
- facility
- model
- image data
- pit
- Prior art date
Classifications
- F—MECHANICAL ENGINEERING; LIGHTING; HEATING; WEAPONS; BLASTING
- F23—COMBUSTION APPARATUS; COMBUSTION PROCESSES
- F23G—CREMATION FURNACES; CONSUMING WASTE PRODUCTS BY COMBUSTION
- F23G5/00—Incineration of waste; Incinerator constructions; Details, accessories or control therefor
- F23G5/50—Control or safety arrangements
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V20/00—Scenes; Scene-specific elements
- G06V20/50—Context or environment of the image
- G06V20/52—Surveillance or monitoring of activities, e.g. for recognising suspicious objects
Definitions
- The present disclosure relates to an information processing method, an information processing device, and an information processing program for generating an identification algorithm that identifies the types of waste.
- In a waste treatment facility, waste is brought into the waste pit, stored there, and then thrown into an incinerator for incineration. The quality of the waste that is put into the incinerator affects combustion. In addition, waste that could cause equipment trouble if thrown in as it is may be brought into the waste pit. For this reason, in conventional facilities, operators visually inspect the inside of the waste pit to identify the types of waste, and use a crane to stir the waste and transfer it to the incinerator, in order to stabilize combustion and prevent equipment trouble.
- The applicant has already proposed a technology for automatically identifying waste and operating a crane using an identification algorithm (learned model) that identifies the types of waste stored in a waste pit (see JP-A-2020-038058).
- The inventors of the present application have conducted extensive research into improving the waste identification method that uses the above identification algorithm and have obtained the following findings. Note that these findings are the starting point of the present technology and do not limit it.
- Machine learning (supervised learning) requires a large amount of correctly labeled data for learning.
- Accordingly, a large amount of image data and training data must be newly prepared for each waste treatment facility.
- Waste treatment facilities differ in the shape and size of their waste pits and in environmental conditions (for example, the installation position, angle, and lighting conditions of the image acquisition camera). Therefore, even if an identification algorithm created using image data and training data from one facility is used as it is at another facility, sufficient identification accuracy cannot be obtained.
- An information processing method according to the present disclosure includes: generating a first model, which is a waste type identification algorithm, by performing machine learning on first image data obtained by imaging the inside of the waste pits of a plurality of facilities in which waste is stored and on first teacher data in which the types of waste in the images are labeled; and generating a second model, which is an identification algorithm corresponding to the waste of a second facility different from the plurality of facilities, by additionally training the generated first model on second image data obtained by imaging the inside of the waste pit of the second facility and on second teacher data in which the types of waste in the images are labeled.
- An information processing device according to the present disclosure includes: a first model generation unit that generates a first model, which is a waste type identification algorithm, by performing machine learning on first image data obtained by imaging the inside of the waste pits of a plurality of facilities in which waste is stored and on first teacher data in which the types of waste in the images are labeled; and a second model generation unit that generates a second model, which is an identification algorithm corresponding to the waste of a second facility different from the plurality of facilities, by additionally training the generated first model on second image data obtained by imaging the inside of the waste pit of the second facility and on second teacher data in which the types of waste in the images are labeled.
- FIG. 1 is a schematic diagram showing the configuration of a waste disposal facility according to one embodiment.
- FIG. 2 is a block diagram showing the configuration of the information processing apparatus according to one embodiment.
- FIG. 3 is a flowchart illustrating an example of an information processing method by the information processing apparatus according to one embodiment.
- FIG. 4 is a diagram showing an example of image data obtained by capturing an image of the inside of the waste pit.
- FIG. 5 is a diagram showing an example of teacher data in which image data of the inside of the waste pit is labeled with the types of waste and with objects to be identified other than waste.
- FIG. 6 is a diagram showing an example of data displayed by superimposing the identification result of the type identification unit on image data of the inside of the waste pit.
- FIG. 7 is a map showing the ratio of types of waste in the waste pit for each area.
- FIG. 8 is a conceptual diagram showing a conventional identification algorithm generation method.
- FIG. 9 is a conceptual diagram showing a method of generating an identification algorithm according to one embodiment.
- FIG. 10 is a block diagram showing the configuration of an information processing apparatus according to one modification of one embodiment.
- An information processing method according to a first aspect includes: generating a first model, which is a waste type identification algorithm, by performing machine learning on first image data obtained by imaging the inside of the waste pits of a plurality of facilities in which waste is stored and on first teacher data in which the types of waste in the images are labeled; and generating a second model, which is an identification algorithm corresponding to the waste of a second facility different from the plurality of facilities, by additionally training the generated first model on second image data obtained by imaging the inside of the waste pit of the second facility and on second teacher data in which the types of waste in the images are labeled.
- According to this aspect, pre-learning can be performed using a large amount of teacher data collected and generated at a plurality of facilities, without being limited to a particular collection period.
- In addition, learning is possible using a variety of training data that reflects differences in the shape and size of the waste pits at each facility as well as various environmental conditions (such as the installation position, angle, and lighting conditions of the cameras used to acquire images). It is therefore possible to obtain an identification algorithm (first model) that is robust, that is, able to identify a wide variety of data stably and accurately.
- A second model is then generated by additionally training such a first model on teacher data collected and generated at the newly introduced facility (second facility).
- The amount of teacher data (second teacher data) used for this additional learning may be smaller than the amount of teacher data (first teacher data) used to generate the first model. Even with a small amount, the second model can adapt to the environmental conditions of the newly introduced facility (the installation position, angle, and lighting conditions of the camera for image acquisition, and so on) and to future environmental changes (changes in the shape and color of garbage bags, aging and soiling of the pit side walls, and so on).
- If a type of waste that does not exist in the first training data is present at the newly introduced facility, a second model that can identify it can be generated by including teacher data for that type of waste in the teacher data collected and generated at the newly introduced facility.
- In this way, a model (second model) with high identification accuracy corresponding to the newly introduced facility can be obtained while maintaining robustness to some extent.
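- As a concrete illustration of the two-stage training described above, the following is a minimal sketch in Python (PyTorch/torchvision), assuming a generic semantic-segmentation network; the model choice, the class count, the hyperparameters, and the data loaders (assumed to yield batches of pit images and per-pixel class labels) are hypothetical and are not prescribed by the present disclosure.

```python
import torch
from torch import nn, optim
from torchvision.models.segmentation import deeplabv3_resnet50

NUM_CLASSES = 12  # hypothetical number of waste / non-waste label types


def train(model, loader, epochs, lr):
    """Generic supervised training loop over (image, per-pixel label) batches."""
    opt = optim.Adam(model.parameters(), lr=lr)
    loss_fn = nn.CrossEntropyLoss()
    model.train()
    for _ in range(epochs):
        for images, masks in loader:          # masks: [B, H, W] class ids
            opt.zero_grad()
            logits = model(images)["out"]     # [B, NUM_CLASSES, H, W]
            loss_fn(logits, masks).backward()
            opt.step()
    return model


def build_models(multi_facility_loader, second_facility_loader):
    """Two-stage training: pre-train the first model on multi-facility data,
    then additionally train it on the much smaller second-facility data."""
    first_model = deeplabv3_resnet50(weights=None, num_classes=NUM_CLASSES)
    first_model = train(first_model, multi_facility_loader, epochs=50, lr=1e-3)
    # Additional learning with a lower learning rate and fewer epochs helps
    # retain the robustness obtained from the multi-facility pre-training.
    second_model = train(first_model, second_facility_loader, epochs=10, lr=1e-4)
    return first_model, second_model
```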
- An information processing method according to another aspect is the information processing method according to the first aspect, in which the second image data and the second teacher data include in the image a side wall of the waste pit of the second facility and/or a crane for stirring or transporting waste.
- In this way, additional learning can be performed that takes into account differences in the shape and size of the waste pit and of the crane at the newly introduced facility (second facility). The second model can therefore identify the side walls and/or the crane more accurately, and can also identify the types of waste more accurately, than when only waste-only data (that is, data whose images do not include the side walls of the waste pit and/or the crane) is used.
- An information processing method according to another aspect is the information processing method according to the first or second aspect, in which the amount of the second teacher data is 30% or less of the amount of the first teacher data.
- An information processing method according to another aspect is the information processing method according to the third aspect, in which the amount of the second teacher data is 15% or less of the amount of the first teacher data.
- An information processing method according to another aspect is the information processing method according to the fourth aspect, in which the amount of the second teacher data is 5% or less of the amount of the first teacher data.
- An information processing method according to another aspect is the information processing method according to any one of the first to fifth aspects, in which the identification algorithm includes one or more of linear regression, a Boltzmann machine, a neural network, a support vector machine, a Bayesian network, sparse regression, a decision tree, statistical inference using a random forest, reinforcement learning, and deep learning.
- An information processing method according to another aspect is the information processing method according to any one of the first to sixth aspects, in which the second training data includes an image in which the entirety of a captured image of the inside of the waste pit of the second facility is labeled.
- In this way, additional learning is performed in consideration of how the inside of the waste pit appears (field of view and angle of view) in images obtained by the imaging device installed at the newly introduced facility (second facility), so the accuracy with which the second model identifies the types of waste can be improved.
- An information processing method according to another aspect is the information processing method according to any one of the first to sixth aspects, in which the second training data includes an image obtained by cutting out a part of a captured image of the inside of the waste pit of the second facility and labeling only the cut-out part.
- the amount of teacher data and the number of man-hours for creating it can be significantly reduced.
- An information processing method according to another aspect is the information processing method according to any one of the first to eighth aspects, in which the second image data used in the step of generating the second model is obtained by inputting image data of the inside of the waste pit of the second facility into the first model to identify the types of waste, and then selecting and using, from among that image data, image data whose identification accuracy is lower than a predetermined standard (first standard).
- In this way, the types of waste for which the first model cannot achieve sufficient identification accuracy can be confirmed, and additional learning can be performed mainly on image data and teacher data that include those types, which enables efficient collection of learning data and efficient learning.
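- The selection step can be sketched, for example, as follows in Python, assuming the first model returns per-pixel class logits in a dictionary under the key "out" (as in the sketch above); the 0.7 threshold stands in for the predetermined first standard and is a hypothetical value.

```python
import torch


@torch.no_grad()
def select_low_confidence(model, images, threshold=0.7):
    """Return indices of pit images whose mean per-pixel confidence falls below
    the threshold; these are candidates for labeling and additional learning."""
    model.eval()
    selected = []
    for i, img in enumerate(images):                 # img: [3, H, W] tensor
        logits = model(img.unsqueeze(0))["out"]      # [1, C, H, W]
        confidence = torch.softmax(logits, dim=1).max(dim=1).values.mean().item()
        if confidence < threshold:
            selected.append(i)
    return selected
```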
- An information processing method according to another aspect is the information processing method according to any one of the first to tenth aspects, in which, in the step of generating the first model, information on the imaging conditions and/or the imaging environment at each facility is also learned.
- An information processing method according to another aspect is the information processing method according to any one of the first to tenth aspects, in which the first image data is image data of the inside of the waste pit of each facility to which at least one correction of color tone, brightness, or saturation has been applied with reference to an image of a color chart for image correction that is common among the plurality of facilities.
- An information processing method according to another aspect is the information processing method according to any one of the first to tenth aspects, in which the first image data is obtained by imaging the inside of the waste pit of each facility together with a color chart for image correction that is common among the plurality of facilities.
- An information processing method according to another aspect is the information processing method according to any one of the first to twelfth aspects, in which the second image data includes an image in which a rendered image of a side wall of a waste pit and/or of a crane that stirs or transports waste, created on the basis of the three-dimensional design data of the second facility, is composited with a captured image of the waste in the waste pits of the plurality of facilities or of the waste in the waste pit of another facility different from both the plurality of facilities and the second facility.
- In this way, image data showing how the pit, the crane, and so on will appear can be created and used for learning.
- An information processing method according to another aspect is the information processing method according to any one of the first to thirteenth aspects, further including a step of periodically monitoring the identification accuracy during operation after operation of the generated second model has started and, when the identification accuracy falls below a predetermined standard (second standard), updating the second model by additionally training it on image data of the waste at that time and on teacher data in which the types of waste in the images are labeled.
- An information processing method according to another aspect is the information processing method according to any one of the first to fourteenth aspects, further including a step of generating a third model, which is an identification algorithm corresponding to the waste of a third facility different from both the plurality of facilities and the second facility, by additionally training the generated second model on third image data obtained by imaging the inside of the waste pit of the third facility and on third teacher data in which the types of waste in the images are labeled.
- In this way, the model (second model) obtained by additionally training on the second teacher data can be used as the base model for additional training on the third teacher data from another facility (third facility), so the identification accuracy for waste types is expected to improve successively.
- An information processing device according to one aspect includes: a first model generation unit that generates a first model, which is a waste type identification algorithm, by performing machine learning on first image data obtained by imaging the inside of the waste pits of a plurality of facilities in which waste is stored and on first teacher data in which the types of waste in the images are labeled; and a second model generation unit that generates a second model, which is an identification algorithm corresponding to the waste of a second facility different from the plurality of facilities, by additionally training the generated first model on second image data obtained by imaging the inside of the waste pit of the second facility and on second teacher data in which the types of waste in the images are labeled.
- A computer-readable recording medium according to one aspect non-transitorily records an information processing program that causes a computer to execute: generating a first model, which is a waste type identification algorithm, by performing machine learning on first image data obtained by imaging the inside of the waste pits of a plurality of facilities in which waste is stored and on first teacher data in which the types of waste in the images are labeled; and generating a second model, which is an identification algorithm corresponding to the waste of a second facility different from the plurality of facilities, by additionally training the generated first model on second image data obtained by imaging the inside of the waste pit of the second facility and on second teacher data in which the types of waste in the images are labeled.
- An information processing method according to another aspect includes identifying the types of waste stored in the waste pit of a second facility by inputting, into a second model, new image data obtained by imaging the inside of the waste pit of the second facility. The second model is an identification algorithm corresponding to the waste of the second facility, generated by additionally training a first model on second image data of the inside of the waste pit of the second facility, which is different from the plurality of facilities, and on second teacher data in which the types of waste in the images are labeled. The first model is a waste type identification algorithm generated by performing machine learning on first image data obtained by imaging the inside of the waste pits of a plurality of facilities in which waste is stored and on first teacher data in which the types of waste in the images are labeled.
- An information processing device according to another aspect includes a type identification unit that identifies the types of waste stored in the waste pit of a second facility by inputting, into a second model, new image data obtained by imaging the inside of the waste pit of the second facility. The second model is an identification algorithm corresponding to the waste of the second facility, generated by additionally training a first model on second image data of the inside of the waste pit of the second facility, which is different from the plurality of facilities, and on second teacher data in which the types of waste in the images are labeled. The first model is a waste type identification algorithm generated by performing machine learning on first image data obtained by imaging the inside of the waste pits of a plurality of facilities in which waste is stored and on first teacher data in which the types of waste in the images are labeled.
- A computer-readable recording medium according to another aspect non-transitorily records an information processing program that causes a computer to execute a step of identifying the types of waste stored in the waste pit of a second facility by inputting, into a second model, new image data obtained by imaging the inside of the waste pit of the second facility. The second model is an identification algorithm corresponding to the waste of the second facility, generated by additionally training a first model on second image data of the inside of the waste pit of the second facility, which is different from the plurality of facilities, and on second teacher data in which the types of waste in the images are labeled. The first model is a waste type identification algorithm generated by performing machine learning on first image data obtained by imaging the inside of the waste pits of a plurality of facilities in which waste is stored and on first teacher data in which the types of waste in the images are labeled.
- FIG. 1 is a schematic diagram showing the configuration of a waste disposal facility 100 (hereinafter sometimes referred to as a second facility) according to one embodiment.
- As shown in FIG. 1, the waste disposal facility 100 includes a platform 21 at which a transport vehicle (garbage truck) 22 carrying waste stops, a waste pit 3 in which the waste thrown in from the platform 21 is stored, a crane 5 for stirring and transporting the waste stored in the waste pit 3, a hopper 4 into which the waste transported by the crane 5 is thrown, an incinerator 1 for incinerating the waste thrown in from the hopper 4, and an exhaust heat boiler 2 for recovering exhaust heat from the exhaust gas generated in the incinerator 1.
- The type of the incinerator 1 is not limited to the stoker furnace shown in FIG. 1 and may also be, for example, a fluidized bed furnace.
- The structure of the waste pit 3 is not limited to the one-stage pit shown in FIG. 1 and may also be a two-stage pit in which the pit is divided into an input section and a storage section.
- The waste disposal facility 100 is also provided with a crane control device 50 that controls the operation of the crane 5 and a combustion control device 20 that controls the combustion of waste within the incinerator 1.
- The waste loaded on the transport vehicle 22 is thrown into the waste pit 3 from the platform 21 and stored in the waste pit 3.
- The waste stored in the waste pit 3 is agitated by the crane 5, transported to the hopper 4 by the crane 5, thrown into the incinerator 1 via the hopper 4, and incinerated inside the incinerator 1.
- The waste disposal facility 100 further includes an imaging device 6 (hereinafter also referred to as a camera) for capturing images of the inside of the waste pit 3, and an information processing device 10 for identifying the types of waste inside the waste pit 3.
- FIG. 4 shows an example of image data obtained by capturing an image of the inside of the waste pit 3 with the imaging device 6.
- The imaging device 6 may be an RGB camera that outputs shape and color image data of the waste as the imaging result, a near-infrared camera that outputs near-infrared image data of the waste as the imaging result, a 3D camera or an RGB-D camera that outputs three-dimensional image data of the waste as the imaging result, or a combination of two or more of these.
- FIG. 2 is a block diagram showing the configuration of the information processing device 10. The information processing device 10 may be configured by one or more computers.
- The information processing device 10 has a control unit 11, a storage unit 12, and a communication unit 13. These units are communicably connected to one another via a bus.
- The communication unit 13 is a communication interface between the information processing device 10 and each of the imaging device 6, the crane control device 50, and the combustion control device 20, and transmits and receives information between them.
- The storage unit 12 is a fixed data storage device such as a hard disk. Various data handled by the control unit 11 are stored in the storage unit 12.
- The storage unit 12 also stores an identification algorithm 12a generated by a first model generation unit 11a1 and a second model generation unit 11a2, which will be described later, and image data 12b acquired by an image data acquisition unit 11b, which will also be described later.
- The control unit 11 is a control means for performing the various processes of the information processing device 10.
- The control unit 11 includes a first model generation unit 11a1, a second model generation unit 11a2, an image data acquisition unit 11b, a type identification unit 11c, a plant control unit 11d, a fall detection unit 11e, and a foreign object detection unit 11f.
- Each of these units may be realized by the processor in the information processing apparatus 10 executing a predetermined program, or may be implemented by hardware.
- The first model generation unit 11a1 generates a waste type identification algorithm 12a (first model) by performing machine learning on first image data obtained by imaging the inside of the waste pits of a plurality of facilities (waste treatment facilities different from the second facility 100) in which waste is stored and on first training data in which the waste in those images is labeled by type.
- Alternatively, the first model generation unit 11a1 may generate the identification algorithm 12a (first model) by performing machine learning on the first image data obtained by imaging the inside of the waste pits of the plurality of facilities and on first training data in which the waste in the images is labeled by type and the objects to be identified other than waste in the images are also labeled by type.
- The identification algorithm 12a may include, for example, one or more of linear regression, a Boltzmann machine, a neural network, a support vector machine, a Bayesian network, sparse regression, a decision tree, statistical estimation using a random forest, reinforcement learning, and deep learning.
- The first training data may be created, for example, by having a skilled operator of each facility visually identify the waste and the objects to be identified other than waste in the captured first image data and label them by type.
- The types of waste and the objects to be identified other than waste may be labeled, for example, as per-type layers overlaid on the first image data.
- The types of waste labeled in the first training data may include, for example, unbroken garbage bags, paper garbage, pruned branches, futons, sludge, oversized crushed garbage, cardboard, jute bags, paper bags, and bottom garbage (waste near the bottom of the waste pit whose moisture content has become high because it is compressed by the waste above it). The types of waste labeled in the first training data may also include unplanned waste (abnormal objects) that may enter the waste pit even though it is not desired.
- Abnormal objects include, for example, objects that should not be incinerated, specifically fluorescent lamps, mercury-containing garbage, and potentially explosive objects such as gas cylinders, cans, and oil tanks.
- The types of waste labeled in the first training data may further include one or more of wood waste, textile waste, clothing waste, plastic waste, animal residue, animal carcasses, kitchen waste, plants, soil, medical waste, incineration ash, bicycles, drawers, beds, shelves, desks, chairs, agricultural vinyl, PET bottles, polystyrene foam, meat-and-bone meal, crops, pottery, glass scraps, metal scraps, debris, concrete scraps, tatami mats, bamboo, straw, and activated carbon.
- The types of objects to be identified other than waste that are labeled in the first training data may include, for example, the beams of each facility, the side walls of the waste pit, the cliff face of the pile of waste stored in the waste pit, and a crane for agitating or transporting the waste.
- The types of objects to be identified other than waste labeled in the first training data may include one or both of workers and delivery vehicles.
- The types of objects to be identified other than waste that are labeled in the first training data may also include, for example, the walls, pillars, floors, windows, ceilings, doors, stairs, girders (the structures from which the crane is suspended and along which it moves), and corridors of each facility, as well as a waste pit partition wall, a waste input hopper, an access door, a worker, and a delivery vehicle.
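- The "layers by type" labeling mentioned above can be represented, for example, as one binary mask per label type; the following is a minimal sketch, assuming the teacher data is stored as a per-pixel class-id mask (the storage format itself is not specified in the present disclosure).

```python
import numpy as np


def to_layers(class_mask, num_classes):
    """Convert an (H, W) per-pixel class-id mask into a (num_classes, H, W)
    stack of binary layers, one layer per waste / non-waste label type."""
    return np.stack([(class_mask == k).astype(np.uint8) for k in range(num_classes)])
```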
- When the first model generation unit 11a1 performs machine learning on the first image data and the first teacher data to generate the first model, information on the imaging conditions at each facility and/or on the imaging environment (the amount of natural light depending on the time of day, the weather, whether the lighting is on or off, and so on) may also be learned. This makes it possible to continue operation even if the imaging conditions or imaging environment change due to trouble or the like.
- The first image data may be image data of the waste pit of each facility in which at least one of color tone, brightness, and saturation has been corrected with reference to an image of a color chart for image correction that is common among the plurality of facilities, or image data in which the inside of the waste pit of each facility is imaged together with such a common color chart.
- In this way, when the first model generation unit 11a1 performs machine learning on the first image data and the first teacher data to generate the first model, differences in color tone, lightness, and saturation among images captured at facilities with different illumination and natural light conditions can be corrected, which improves the identification accuracy of the identification algorithm 12a.
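- One simple way to apply such chart-based correction is a per-channel linear fit between the chart colors measured at each facility and the common reference chart; the sketch below is an illustrative assumption, not the correction method prescribed here.

```python
import numpy as np


def chart_correction(image, chart_measured, chart_reference):
    """Fit a gain/offset per RGB channel that maps the measured chart patches
    (N x 3) to the common reference chart (N x 3), then apply it to the whole
    facility image (H x W x 3, uint8)."""
    corrected = image.astype(np.float32)
    for c in range(3):
        A = np.stack([chart_measured[:, c], np.ones(len(chart_measured))], axis=1)
        gain, offset = np.linalg.lstsq(A, chart_reference[:, c], rcond=None)[0]
        corrected[..., c] = corrected[..., c] * gain + offset
    return np.clip(corrected, 0, 255).astype(np.uint8)
```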
- The second model generation unit 11a2 generates an identification algorithm 12a (second model) corresponding to the waste of the second facility by additionally training the first model on second image data of the inside of the waste pit of the second facility 100 and on second training data in which the waste in those images is labeled by type.
- Alternatively, the second model generation unit 11a2 may generate an identification algorithm 12a (second model) that identifies the types of objects to be identified other than waste in addition to the types of waste stored in the waste pit 3 of the second facility 100, by additionally training the first model on the second image data and on second teacher data in which the waste in the images is labeled by type and the objects to be identified other than waste are also labeled by type.
- The amount of teacher data used to create the second model (the amount of the second teacher data) may be smaller than the amount of teacher data required to create the first model (the amount of the first teacher data), preferably 30% or less, more preferably 15% or less, and still more preferably 5% or less of it. Even if the training data of the second facility 100 is small, the second model can adapt to the environmental conditions of the newly introduced facility (the installation position, angle, and lighting conditions of the image acquisition camera, and so on) and to future environmental changes (changes in the shape and color of garbage bags, soiling of the pit side walls over time, and so on).
- In this way, an identification algorithm 12a (second model) with high identification accuracy can be obtained, and the number of man-hours required to generate it can be reduced.
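- As a back-of-the-envelope illustration of these data-amount guidelines, assuming a hypothetical count of labeled images for the first model (the figure below is not taken from the present disclosure):

```python
first_teacher_images = 20000              # hypothetical first-teacher-data size
for pct in (0.30, 0.15, 0.05):            # preferable / more preferable / still more preferable
    print(f"<= {pct:.0%} of the first teacher data: "
          f"at most {int(first_teacher_images * pct)} labeled images at the second facility")
```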
- The second training data may be created, for example, by having a skilled operator of the second facility 100 visually identify the waste and the objects to be identified other than waste in the captured second image data and label them by type. The types of waste and the objects to be identified other than waste may be labeled, for example, as per-type layers overlaid on the second image data.
- The types of waste labeled in the second training data may include, for example, unbroken garbage bags, paper garbage, pruned branches, futons, sludge, oversized crushed garbage, cardboard, jute bags, paper bags, and bottom garbage (waste near the bottom of the waste pit 3 whose moisture content has become high because it is compressed by the waste above it). The types of waste labeled in the second training data may also include unplanned waste (abnormal objects) that may enter the waste pit 3 even though it is not desired.
- Abnormal objects include, for example, objects that should not be incinerated, specifically fluorescent lamps, mercury-containing garbage, and potentially explosive objects such as gas cylinders, cans, and oil tanks.
- The types of waste labeled in the second training data may further include one or more of wood waste, textile waste, clothing waste, plastic waste, animal residue, animal carcasses, kitchen waste, plants, soil, medical waste, incineration ash, bicycles, drawers, beds, shelves, desks, chairs, agricultural vinyl, PET bottles, polystyrene foam, meat-and-bone meal, crops, pottery, glass scraps, metal scraps, debris, concrete scraps, tatami mats, bamboo, straw, and activated carbon.
- The types of waste labeled in the second training data need not be the same as the types of waste labeled in the first training data.
- If a type of waste that does not exist in the first training data is present at the newly introduced facility, a second model that can identify it can be generated by including that type among the types of waste labeled in the second training data. Conversely, if a type of waste that exists in the first training data is not present at the newly introduced facility, the types of waste labeled in the second training data need not include that type.
- The types of objects to be identified other than waste that are labeled in the second training data may include, for example, the beams of the second facility 100, the side walls of the waste pit 3, the cliff face of the pile of waste stored in the waste pit 3 (a portion of the pile that is darkened to the extent that the type of waste cannot be visually identified), and the crane 5 for stirring or transporting the waste.
- The types of objects to be identified other than waste labeled in the second training data may include one or both of workers and delivery vehicles.
- The types of objects to be identified other than waste labeled in the second training data may also include, for example, the walls, pillars, floors, windows, ceilings, doors, stairs, and girders (the structure from which the crane 5 is suspended and along which it moves) of the second facility 100, and the like.
- The types of objects to be identified other than waste labeled in the second training data need not be the same as those labeled in the first training data. If an object to be identified other than waste of a type that does not exist in the first training data is present at the newly introduced facility, a second model that can identify it can be generated by including that type among the objects to be identified other than waste labeled in the second training data. Conversely, if a type of object to be identified other than waste that exists in the first training data is not present at the newly introduced facility, the objects to be identified other than waste labeled in the second training data need not include that type.
- FIG. 5 shows an example of second teacher data in which image data of the inside of the waste pit 3 of the second facility 100 is labeled with the types of waste and with the objects to be identified other than waste. In the example shown in FIG. 5, unbroken bag garbage, pruned branches, and futons are labeled by type as waste, and the crane 5, the cliff face of the waste pile, the side walls of the waste pit 3, and the floor of the second facility 100 are each labeled by type as objects to be identified other than waste.
- The second training data may include an image in which the entirety of a captured image of the inside of the waste pit 3 of the second facility 100 is labeled, or an image in which a part of such a captured image is cut out and only the cut-out part is labeled.
- By including training data in which the entire image is labeled, additional learning can be performed in consideration of how the inside of the waste pit appears (field of view and angle of view) in images obtained by the imaging device installed at the newly introduced facility (second facility), so the accuracy with which the second model identifies the types of waste can be improved.
- By including training data in which only a cut-out part of the image is labeled, the amount of teacher data and the number of man-hours required to create it can be greatly reduced compared to labeling the entire image.
- The second image data and the second teacher data may include in the image the side wall of the waste pit 3 of the second facility 100 and/or the crane 5.
- When the second training data includes the side wall of the waste pit of the second facility 100 and/or the crane for stirring or transporting waste in the image, additional learning can be performed that accounts for differences in the shape and size of the waste pit and of the crane at the newly introduced facility (second facility). The second model can therefore identify the side walls and/or the crane more accurately, and can also identify waste more accurately, than when only waste-only data (that is, data whose images do not include the pit side walls and/or the crane) is used.
- The second image data may be obtained by inputting image data of the inside of the waste pit 3 of the second facility 100 into the first model to identify the types of waste and then selecting and using image data whose identification accuracy is lower than a predetermined standard, or other data (that is, the image data obtained by imaging the inside of the waste pit 3 of the second facility 100 as it is) may be used.
- When the second image data is selected in the former way, the types of waste for which the first model cannot achieve sufficient identification accuracy can be confirmed, and image data and teacher data including those types can be used mainly for additional learning, which enables efficient collection of learning data and efficient learning.
- The second image data may also include an image in which a rendered image of the side wall of the waste pit 3 and/or of the crane 5, created on the basis of the three-dimensional design data of the second facility 100, is composited with a captured image of the waste in the waste pits of the plurality of facilities or of the waste in the waste pit of another facility different from both the plurality of facilities and the second facility. In this case, image data showing how the waste pit 3, the crane 5, and so on will appear can be created and used for learning before the pit construction and crane installation of the newly introduced facility (second facility 100) are completed.
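- The compositing idea can be sketched, for example, as a simple alpha blend, assuming the rendered side wall / crane image carries an alpha channel marking rendered pixels; the blend method is an illustrative assumption.

```python
import numpy as np


def composite(rendered_rgba, waste_rgb):
    """Overlay a rendered pit side wall / crane image (H x W x 4, uint8) onto
    a photograph of waste from another facility (H x W x 3, uint8)."""
    alpha = rendered_rgba[..., 3:4].astype(np.float32) / 255.0
    blended = (rendered_rgba[..., :3].astype(np.float32) * alpha
               + waste_rgb.astype(np.float32) * (1.0 - alpha))
    return blended.astype(np.uint8)
```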
- The image data acquisition unit 11b acquires new image data of the inside of the waste pit 3 of the second facility 100 from the imaging device 6.
- The new image data acquired by the image data acquisition unit 11b is stored in the storage unit 12.
- The type identification unit 11c takes the new image data acquired by the image data acquisition unit 11b as input and uses the identification algorithm 12a (second model) generated by the second model generation unit 11a2 to identify the types of waste stored in the waste pit 3 of the second facility 100.
- The type identification unit 11c may also use the identification algorithm 12a (second model), with the new image data acquired by the image data acquisition unit 11b as input, to identify the types of objects to be identified other than waste together with the types of waste stored in the waste pit 3 of the second facility 100.
- FIG. 6 shows an example of data in which the identification result of the type identification unit 11c is superimposed on image data of the inside of the waste pit 3 of the second facility 100. In the example shown in FIG. 6, the waste identified by the type identification unit 11c (unbroken bag waste, pruned branches) and the objects to be identified other than waste (the crane 5, the cliff face of the waste pile, the side walls of the waste pit 3, and the floor of the second facility 100) are superimposed on the image data by type and displayed.
- Based on the identification result, the type identification unit 11c generates a map displaying, for each area, the ratio of the types of waste stored in the waste pit 3 of the second facility 100.
- The inside of the waste pit 3 is divided into a 5 × 4 grid, and the ratio of the types of waste identified by the type identification unit 11c is displayed for each area.
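- Such a per-area ratio map can be computed, for example, as follows, assuming the identification result is available as a per-pixel class-id mask; the 5 × 4 grid matches the example of FIG. 7, while the mask representation is an assumption.

```python
import numpy as np


def ratio_map(class_mask, num_classes, rows=4, cols=5):
    """Return a (rows, cols, num_classes) array giving, for each grid cell of
    the pit image, the fraction of pixels assigned to each waste type."""
    h, w = class_mask.shape
    ratios = np.zeros((rows, cols, num_classes))
    for r in range(rows):
        for c in range(cols):
            cell = class_mask[r * h // rows:(r + 1) * h // rows,
                              c * w // cols:(c + 1) * w // cols]
            counts = np.bincount(cell.ravel(), minlength=num_classes)
            ratios[r, c] = counts / counts.sum()
    return ratios
```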
- The plant control unit 11d controls the waste treatment facility (second facility) 100 based on the identification result of the type identification unit 11c.
- Specifically, the plant control unit 11d includes a crane control unit 11d1 that transmits the identification result of the type identification unit 11c (that is, information on the types of waste identified from the image data) to the crane control device 50 that controls the crane 5 for stirring or transporting the waste, and a combustion control unit 11d2 that transmits the identification result of the type identification unit 11c to the combustion control device 20 that controls the combustion of the waste.
- The plant control unit 11d may include both the crane control unit 11d1 and the combustion control unit 11d2, or only one of them.
- The crane control unit 11d1 transmits to the crane control device 50, as the identification result of the type identification unit 11c, for example a map (see FIG. 7) displaying the ratio of the types of waste stored in the waste pit 3 for each area. Based on the map received from the crane control unit 11d1, the crane control device 50 operates the crane 5 to agitate the waste in the waste pit 3 so that the ratios of the types of waste become equal in all areas.
- The combustion control unit 11d2 transmits to the combustion control device 20, as the identification result of the type identification unit 11c, for example a map (see FIG. 7) displaying the ratio of the types of waste stored in the waste pit 3 for each area. Based on the map received from the combustion control unit 11d2, the combustion control device 20 grasps the ratio of the types of waste that are picked up by the crane 5 and conveyed together from the waste pit 3 to the hopper 4, and controls the combustion of the waste according to the ratio of the waste charged together into the incinerator 1 via the hopper 4 (for example, by controlling the feed speed of the stoker and the amount of air supplied).
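- One illustrative way for a crane controller to use such a map is to stir the area whose composition deviates most from the pit-wide average; this is only a sketch of a possible policy, not the control law used by the crane control device 50.

```python
import numpy as np


def pick_stir_target(ratios):
    """Given a (rows, cols, num_classes) ratio map, return the (row, col) of
    the grid cell whose waste-type mix deviates most from the pit average."""
    mean_mix = ratios.mean(axis=(0, 1))
    deviation = np.abs(ratios - mean_mix).sum(axis=2)
    return np.unravel_index(np.argmax(deviation), deviation.shape)
```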
- The fall detection unit 11e detects a fall of a worker or a transport vehicle from the platform 21 into the waste pit 3 based on the identification result of the type identification unit 11c (that is, information on a worker or transport vehicle identified from the image data).
- The fall detection unit 11e may transmit the identification result of the type identification unit 11c (that is, information on a worker or delivery vehicle identified from the image data) to a fall detection device (not shown) that detects the presence of a worker or a delivery vehicle in the waste pit 3 where waste is stored.
- The fall detection device (not shown) issues an alarm based on the identification result of the type identification unit 11c transmitted from the fall detection unit 11e, or operates the crane 5 and rescue equipment (for example, a gondola) to rescue the worker.
- The foreign object detection unit 11f detects abnormal objects thrown into the waste pit 3 based on the identification result of the type identification unit 11c (that is, information on the types of waste identified from the image data).
- Abnormal objects are unplanned waste that may enter the waste pit 3 even though it is not desired, for example objects that should not be incinerated, such as fluorescent lamps, garbage containing mercury, and potentially explosive objects such as gas cylinders, cans, and oil tanks.
- The foreign object detection unit 11f may transmit the identification result of the type identification unit 11c (that is, information on the types of waste identified from the image data) to a foreign object detection device (not shown) that detects abnormal objects thrown into the waste pit 3 where waste is stored.
- The foreign object detection device refers to a database that stores, together with time information, the name of the company or the vehicle that threw waste into the waste pit 3, and, based on the identification result of the type identification unit 11c transmitted from the foreign object detection unit 11f, identifies the company or vehicle that threw the foreign object into the waste pit 3.
- FIG. 3 is a flow chart showing an example of an information processing method.
- First, the first model generation unit 11a1 generates a waste type identification algorithm 12a (first model) by performing machine learning on first image data obtained by imaging the inside of the waste pits of a plurality of facilities (waste treatment facilities different from the second facility 100) where waste is stored and on first training data in which the waste in the images is labeled by type (step S11).
- In this way, the first model generation unit 11a1 can perform pre-learning using a large amount of teacher data collected and generated at a plurality of facilities without being limited to a particular collection period. In addition, it is possible to learn using a sufficient amount of training data even for waste that does not appear frequently at any one facility. Furthermore, learning is possible using a variety of training data that reflects differences in the shape and size of the waste pits at each facility and various environmental conditions (such as the installation position, angle, and lighting conditions of the cameras used to acquire images), resulting in a robust identification algorithm 12a (first model).
- Next, the second model generation unit 11a2 generates an identification algorithm 12a (second model) corresponding to the waste of the second facility by additionally training the identification algorithm 12a (first model) generated by the first model generation unit 11a1 on second image data of the inside of the waste pit of the second facility 100 and on second training data in which the waste in the images is labeled by type (step S12).
- The amount of teacher data (second teacher data) used for this additional learning may be smaller than the amount of teacher data (first teacher data) used when generating the first model. Even with a small amount, the second model can adapt to the environmental conditions of the newly introduced facility (the installation position, angle, and lighting conditions of the camera for image acquisition, and so on) and to future environmental changes (changes in the shape and color of garbage bags, aging and soiling of the pit side walls, and so on).
- Next, the image data acquisition unit 11b acquires new image data 12b (see, for example, FIG. 4) of the inside of the waste pit 3 of the second facility 100 from the imaging device 6 (step S13).
- The new image data 12b acquired by the image data acquisition unit 11b is stored in the storage unit 12.
- Next, the type identification unit 11c takes the new image data acquired by the image data acquisition unit 11b as input and uses the identification algorithm 12a (second model) generated by the second model generation unit 11a2 to identify the types of waste stored in the waste pit 3 of the second facility 100 (step S14).
- The type identification unit 11c may also take the new image data acquired by the image data acquisition unit 11b (see, for example, FIG. 4) as input and use the identification algorithm 12a (second model) generated by the second model generation unit 11a2 to identify the types of objects to be identified other than waste together with the types of waste stored in the waste pit 3 of the second facility 100 (see, for example, FIG. 6). Further, the type identification unit 11c may generate a map (see, for example, FIG. 7) that displays, for each area, the ratio of the types of waste stored in the waste pit 3, based on the identification result of the types of waste.
- Next, the plant control unit 11d controls the waste treatment facility based on the identification result of the type identification unit 11c.
- Specifically, the crane control unit 11d1 transmits to the crane control device 50, which controls the crane 5 for stirring or transporting the waste, a map displaying the ratio of the types of waste for each area (for example, as shown in FIG. 7) as the identification result of the type identification unit 11c (step S15). Based on the map received from the crane control unit 11d1, the crane control device 50 operates the crane 5 to agitate the waste in the waste pit 3 so that the ratios of the types of waste become equal in all areas.
- The combustion control unit 11d2 transmits to the combustion control device 20, which controls the combustion of the waste, a map displaying the ratio of the types of waste for each area as shown in FIG. 7 as the identification result of the type identification unit 11c (step S16).
- Based on the map received from the combustion control unit 11d2, the combustion control device 20 grasps the ratio of the types of waste that are picked up by the crane 5 and conveyed together from the waste pit 3 to the hopper 4, and controls the combustion of the waste according to the ratio of the waste charged together into the incinerator 1 via the hopper 4 (for example, by controlling the feed speed of the stoker and the amount of air supplied).
- Further, the fall detection unit 11e may detect a fall of a worker or a transport vehicle from the platform 21 into the waste pit 3 based on the identification result of the type identification unit 11c, and may transmit the identification result of the type identification unit 11c to a fall detection device (not shown).
- Further, the foreign object detection unit 11f may detect an abnormal object thrown into the waste pit 3 based on the identification result of the type identification unit 11c, and may transmit the identification result of the type identification unit 11c to a foreign object detection device (not shown).
- machine learning supervised learning
- a large amount of image data and training data that must be newly prepared for each material processing facility is required.
- waste disposal facilities have different shapes and sizes of waste pits, and various environmental conditions (for example, the installation position, angle, lighting conditions, etc. of the image acquisition camera). Even if an identification algorithm created using data and training data is used as it is at other facilities, sufficient identification accuracy cannot be obtained.
- a large amount of teacher data collected and generated at multiple facilities without a limited period before introducing the identification algorithm to the new facility It is also possible to learn in advance using a sufficient amount of teacher data, even for wastes that appear infrequently in one facility. Furthermore, learning is possible using a variety of training data that takes into account differences in the shape and size of garbage pits at each facility, as well as various environmental conditions (such as the installation position, angle, lighting conditions, etc. of cameras used to acquire images). can be done, resulting in a robust identification algorithm 12a (first model).
- next, training data collected and generated at the newly introduced facility is additionally learned on such a first model to generate the identification algorithm 12a (second model) corresponding to the waste at the newly introduced facility.
- the amount of training data used for additional learning (training data collected and generated at the newly introduced facility) is preferably small relative to the amount of training data used when generating the first model (training data collected and generated at the multiple facilities). If it is small, the model becomes less susceptible to the environmental conditions of the newly introduced facility (installation position, angle, and lighting conditions of the camera for image acquisition) and to future environmental changes (such as changes in the shape and color of garbage bags, or aging-related dirt on the pit side walls).
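- one common way to realize such additional learning with a small dataset is to freeze most of the pre-trained first model and update only its final layers, as in the PyTorch sketch below; the `head` attribute, the optimizer settings, and the freezing strategy are assumptions made for illustration and are not prescribed by this application.

```python
import torch
import torch.nn as nn

def fine_tune(first_model: nn.Module, loader, epochs=5, lr=1e-4):
    """Additional-learning sketch: freeze the pre-trained backbone and update
    only the classifier layers with the small second-facility dataset."""
    for p in first_model.parameters():
        p.requires_grad = False
    # Assumes the model exposes a `head` attribute holding its final layers.
    for p in first_model.head.parameters():
        p.requires_grad = True
    opt = torch.optim.Adam(first_model.head.parameters(), lr=lr)
    loss_fn = nn.CrossEntropyLoss()
    first_model.train()
    for _ in range(epochs):
        for images, labels in loader:   # small, facility-specific batches
            opt.zero_grad()
            loss = loss_fn(first_model(images), labels)
            loss.backward()
            opt.step()
    return first_model                  # now adapted as the "second model"
```

Keeping the backbone frozen is one way to prevent the small facility-specific dataset from washing out the robustness learned from the multi-facility data; other schemes (low learning rate on all layers, early stopping) would serve the same purpose.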
- although the control unit 11 includes the first model generation unit 11a1, the second model generation unit 11a2, the image data acquisition unit 11b, the type identification unit 11c, the plant control unit 11d, the fall detection unit 11e, and the foreign object detection unit 11f, the control unit 11 is not limited to this, and a part of the processing of the control unit 11 may be performed on another information processing device or a cloud server instead of the information processing apparatus 10. Also, a part of the storage unit 12 may be located on an information processing apparatus different from the information processing apparatus 10 or on a cloud server instead of the information processing apparatus 10.
- the processes of the first model generation unit 11a1 and the second model generation unit 11a2 may be executed on the external information processing device 101 (cloud server) to generate the identification algorithm 12a (the first model and the second model). Further, the processing of the type identification unit 11c may be executed on the external information processing device 101 (cloud server) using the identification algorithm 12a generated on the external information processing device 101 (cloud server). Alternatively, as shown in FIG. 10, the information processing device 10 introduced into the second facility 100 may obtain the identification algorithm 12a (second model) generated by the external information processing device 101 (cloud server) from the external information processing device 101 (cloud server) and use it in the information processing apparatus 10 to execute the processing of the type identification unit 11c. In this case, while the second model is in operation at the second facility 100, further learning can be performed separately by the external information processing device 101, and the storage capacity required of the information processing apparatus 10 can be reduced.
- after starting operation of the identification algorithm 12a (second model) generated by the second model generation unit 11a2, the control unit 11 may periodically monitor the identification accuracy during operation (that is, the identification results of the type identification unit 11c) and determine whether the identification algorithm 12a (second model) needs to be reviewed and updated. For example, the control unit 11 may use an edge server to determine whether the identification results of the type identification unit 11c are normal or abnormal and whether there is a problem in operation. If it is determined that there is a problem in operation, a skilled operator re-labels the image data in which the abnormality was detected by type of waste to prepare new training data, and the second model generation unit 11a2 may update the identification algorithm 12a (second model) by additionally learning the image data in which anomalies were detected and the newly prepared training data. This makes it possible to respond to changes in the types of waste in the second facility 100 and changes in the composition ratio of each type of waste.
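- a minimal sketch of such periodic accuracy monitoring is given below, assuming that operator verdicts on recent identification results are available; the threshold and the minimum sample count are illustrative values, not criteria defined in this application.

```python
def needs_update(recent_results, threshold=0.85, min_samples=200):
    """Called periodically with (prediction, operator_ok) pairs collected
    during operation; returns True when accuracy drops below the threshold,
    signalling that the second model should be reviewed and retrained."""
    if len(recent_results) < min_samples:
        return False
    correct = sum(1 for _pred, ok in recent_results if ok)
    return correct / len(recent_results) < threshold

# If True, the anomalous images would be relabeled by a skilled operator and
# passed to an additional-learning step such as fine_tune() above.
```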
- the control unit 11 may use the identification algorithm 12a (second model) as a base model and, by additionally learning third image data obtained by imaging the inside of the waste pit of a third facility and third training data in which the waste in the images is labeled by type, generate an identification algorithm (third model) corresponding to the waste of the third facility.
- since the second model can be used as a base model for additional learning with the third training data at another facility (third facility), the accuracy of identifying the type of waste can be expected to improve progressively.
- the combustion control unit 11d2 may, as the identification result of the type identification unit 11c, create a map (see FIG. 7) that displays the ratio of waste types stored in the waste pit 3 for each region, and may transmit to the combustion control device 20, for each region, a label obtained by converting the ratio of waste types into quality information (for example, charge OK, charge NG, or calorific value L (Low), M (Middle), H (High)).
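- the conversion from type ratios to such quality labels could, for example, look like the following sketch; the heating values and cut-off thresholds are invented placeholders used only to make the idea concrete.

```python
def quality_label(region_ratios: dict) -> dict:
    """Convert one region's waste-type ratios into coarse quality labels
    (charge OK/NG and calorie L/M/H). All numbers are illustrative."""
    lhv = {"unbroken_bag": 10.0, "paper": 13.0, "pruned_branches": 15.0,
           "sludge": 3.0, "other": 8.0}          # assumed MJ/kg per type
    est = sum(lhv.get(k, 8.0) * v for k, v in region_ratios.items())
    calorie = "L" if est < 6.0 else "M" if est < 11.0 else "H"
    charge_ok = region_ratios.get("unbroken_bag", 0.0) < 0.4
    return {"charge": "OK" if charge_ok else "NG", "calorie": calorie}
```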
- the combustion control unit 11d2 may transmit to the combustion control device 20, as the identification result of the type identification unit 11c, a label indicating waste with a large influence on the combustion state (for example, unbroken-bag garbage or bottom garbage). Similarly, the crane control unit 11d1 may transmit to the crane control device 50, as the identification result of the type identification unit 11c, a label indicating waste that greatly affects each device (for example, pruned branches or crushed oversized waste).
- when the type identification unit 11c generates, based on the identification result, a map that displays the ratio of waste types stored in the waste pit 3 for each region, the image data obtained by the image data acquisition unit 11b may simply be divided into regions to display the ratio of waste types, or the image data may be linked with the address assignment of the waste pit 3 to display the ratio of waste types for each address.
- for example, the crane 5, whose position relative to the waste pit 3 is measurable, is marked in advance, and the marked crane 5 is captured by the imaging device 6 in a large number of images. The type identification unit 11c then estimates the position and shooting direction of the imaging device 6 relative to the waste pit 3 based on the large number of images in which the marked crane 5 appears, and, from the estimated position and shooting direction of the imaging device 6, estimates at which address each pixel of the image data is located.
- alternatively, the crane 5, whose position relative to the waste pit 3 is measurable, may be photographed by the imaging device 6 in a number of images, and the type identification unit 11c may mark the crane 5 in those images, estimate the position and shooting direction of the imaging device 6 relative to the waste pit 3 based on the large number of images in which the crane 5 is marked, and estimate, from the estimated position and shooting direction of the imaging device 6, at which address each pixel of the image data is located.
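- one possible realization of this pixel-to-address estimation, assuming the waste surface is roughly planar, is a homography fitted from observations of the crane at known pit positions, as sketched below; the point correspondences and cell size are made-up placeholders, not calibration data from this application.

```python
import cv2
import numpy as np

# Crane grab positions known in pit (floor) coordinates, in metres, and the
# pixel where the marked crane appears in the corresponding images.
pit_xy = np.array([[2.0, 1.0], [10.0, 1.0], [10.0, 7.0], [2.0, 7.0]], np.float32)
pixel_xy = np.array([[310, 620], [1540, 600], [1380, 180], [420, 200]], np.float32)

H, _ = cv2.findHomography(pixel_xy, pit_xy)   # image plane -> pit floor plane

def pixel_to_address(u, v, cell=2.0):
    """Map an image pixel to a pit address cell (assumes a roughly planar
    waste surface; more crane observations give a more robust estimate)."""
    pt = cv2.perspectiveTransform(np.array([[[u, v]]], np.float32), H)[0, 0]
    return int(pt[0] // cell), int(pt[1] // cell)
```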
- when a charging request is received from the incinerator 1, the crane control device 50 may, based on the map received from the crane control unit 11d1, select from the waste pit 3 waste that satisfies the charging criteria based on the ratio threshold for each waste type, and operate the crane 5 to charge it into the hopper 4.
- at that time, the crane control device 50 may select the waste so that the difference from the ratio of each waste type in the waste previously charged into the hopper 4 becomes small.
- as the ratio thresholds, the crane control device 50 may use ratio thresholds for one or more of: unbroken-bag garbage, paper waste, pruned branches, futons, sludge, crushed oversized waste, cardboard, hemp sacks, paper bags, bottom waste, wood chips, fiber waste, clothing waste, plastic waste, animal residue, animal carcasses, kitchen waste, plants, soil, medical waste, incineration ash, agricultural vinyl, PET bottles, polystyrene foam, meat and bone meal, crops, pottery, glass waste, metal waste, rubble, lint, concrete debris, tatami mats, bamboo, straw, and activated carbon.
- the crane control device 50 may determine the ratio threshold for each waste type by using image data obtained by capturing the inside of the waste pit 3 in the past: a skilled operator classifies and labels the waste shown in the image data according to whether or not it can be charged, from the viewpoint of combustion stability and the impact on equipment, and those labels are compared with the ratio of each waste type estimated from the same images by the type identification unit 11c, thereby determining the ratio threshold at which a skilled operator judges whether or not to charge the waste.
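- such a threshold could, for instance, be fitted by sweeping candidate values and keeping the one that best reproduces the operator's charge OK/NG labels, as in the sketch below; the sweep resolution and the decision direction (lower ratio means chargeable) are assumptions for illustration.

```python
import numpy as np

def fit_threshold(ratios: np.ndarray, expert_ok: np.ndarray) -> float:
    """For one waste type, find the ratio threshold that best reproduces the
    skilled operator's charge OK / NG labels (simple 1-D sweep).
    ratios: estimated ratio of this type per historical image, shape (N,)
    expert_ok: operator's chargeable verdict per image, boolean shape (N,)"""
    best_t, best_acc = 0.0, 0.0
    for t in np.linspace(0.0, 1.0, 101):
        pred_ok = ratios < t            # below threshold -> chargeable
        acc = float((pred_ok == expert_ok).mean())
        if acc > best_acc:
            best_t, best_acc = t, acc
    return best_t
```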
- the crane control device 50 may determine the ratio threshold for each waste type based on the map received from the crane control unit 11d1, which indicates the ratio of waste types stored in the waste pit 3 for each region, and the types of waste actually charged into the hopper 4, or may dynamically change the ratio threshold by linking both sets of data (together with, for example, the process data of the incinerator 1) over time.
- the crane control device 50 may dynamically change the ratio threshold based not only on the process data but also on weather information. For example, if the weather information indicates rain on that day, the crane control device 50 changes the charging criteria based on the ratio threshold, for example by lowering the ratio threshold for unbroken-bag garbage or raising the ratio threshold for crushed oversized waste.
- the crane control device 50 may dynamically change the ratio threshold based on day-of-week information. For example, since the amount of waste in the waste pit 3 is small on Sundays, the crane control device 50 changes the charging criteria based on the ratio threshold, for example by raising the ratio threshold for unbroken-bag garbage, in order to suppress the amount of waste incinerated.
- the crane control device 50 may dynamically change the ratio threshold based on the furnace operating plan values of the waste treatment facility 100. For example, if the evaporation amount has fallen below the current evaporation amount set value, the crane control device 50 changes the charging criteria based on the ratio threshold, for example by lowering the ratio threshold for unbroken-bag garbage or raising the ratio threshold for crushed oversized waste.
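- the dynamic adjustments described in the last three paragraphs could be combined as in the following sketch; the offsets are illustrative placeholders, and only the directions of adjustment follow the description above.

```python
def adjust_thresholds(base: dict, weather: str, weekday: str,
                      evaporation: float, evaporation_setpoint: float) -> dict:
    """Illustrative dynamic adjustment of charging thresholds; the specific
    offsets (0.05) are placeholders, not values from this application."""
    t = dict(base)
    if weather == "rain":                        # wet waste burns poorly
        t["unbroken_bag"] = max(0.0, t["unbroken_bag"] - 0.05)
        t["crushed_oversized"] = min(1.0, t["crushed_oversized"] + 0.05)
    if weekday == "Sunday":                      # little waste in the pit
        t["unbroken_bag"] = min(1.0, t["unbroken_bag"] + 0.05)
    if evaporation < evaporation_setpoint:       # furnace output too low
        t["unbroken_bag"] = max(0.0, t["unbroken_bag"] - 0.05)
        t["crushed_oversized"] = min(1.0, t["crushed_oversized"] + 0.05)
    return t
```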
- when a charging request is received from the incinerator 1, if there is no waste in the map received from the crane control unit 11d1 that satisfies the charging criteria based on the ratio threshold for each waste type, the crane control device 50 may operate the crane 5 to charge waste close to the charging criteria into the hopper 4, or may agitate waste close to the charging criteria to create waste that meets the charging criteria.
- the crane control device 50 may operate the crane 5 to pile up only the waste that satisfies the charging criteria based on the ratio threshold for each waste type in the map received from the crane control unit 11d1. By doing so, waste that satisfies the charging criteria can be accumulated in the waste pit 3.
- based on the ratio threshold for each waste type, the crane control device 50 may identify, in the waste pit 3, waste that affects the combustion state (for example, sludge) and waste that can cause trouble with each device (for example, pruned branches), and may operate the crane 5 to store such waste at a specific location in the waste pit 3 or to scatter it at a specific location.
- if, in the map received from the crane control unit 11d1, there is waste in the waste pit 3 that does not satisfy the agitation criteria based on the ratio threshold for each waste type, the crane control device 50 may operate the crane 5 to agitate that waste.
- the agitation criteria may be the same as or different from the charging criteria.
- the crane control device 50 may dynamically change the agitation criteria using one or more of: the process data of the incinerator 1, weather information, day-of-week information, waste delivery company information, waste delivery amount (total amount and delivery amount by type) information, waste delivery speed, waste pit level (whole pit and specific areas) information, crane operation status (two cranes operable, only one crane operable, one crane currently in operation, two cranes currently in operation) information, and collection route and collection area information for garbage trucks.
- the crane control device 50 may judge the agitation state of the entire waste pit 3 from the ratio of each waste type in each region in the map received from the crane control unit 11d1, judge whether two cranes 5 need to be operated, and operate the cranes 5 to start operation of the second crane or to retract the second crane.
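- a minimal sketch of such a pit-wide unevenness check is shown below; the spread measure and the limit value are illustrative assumptions, not the judgment rule used by the crane control device 50.

```python
import numpy as np

def need_second_crane(ratios: np.ndarray, spread_limit=0.15) -> bool:
    """Judge whether the pit-wide composition is uneven enough that a second
    crane should be started (the spread limit is an illustrative value)."""
    flat = ratios.reshape(-1, ratios.shape[-1])
    spread = np.abs(flat - flat.mean(axis=0)).sum(axis=1).max()
    return spread > spread_limit
```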
- in the above, the crane control device 50 operates the crane 5; however, a crane operation determination device (not shown) may be provided upstream of the crane control device 50, the crane operation determination device may determine the operation content of the crane 5 and transmit a command of the operation content to the crane control device 50, and the crane control device 50 that received the command may operate the crane 5 based on the received command content.
- the crane operation determination device transmits and receives information to and from the information processing device 10 and the crane control device 50 .
- the crane operation determination device may be part of the information processing device 10, that is, the information processing device 10 may include the crane operation determination device.
- when a charging request is received from the incinerator 1, the crane operation determination device receives a charging request signal from the crane control device 50 and, based on the map received from the crane control unit 11d1, transmits to the crane control device 50 a command to select from the waste pit 3 waste that satisfies the charging criteria based on the ratio threshold for each waste type and to charge the waste into the hopper 4, and the crane control device 50 may operate the crane 5 based on the received command.
- at that time, the waste may be selected so that the difference from the ratio of each waste type in the waste previously charged into the hopper 4 becomes small.
- when a charging request is received from the incinerator 1, the crane operation determination device receives a charging request signal from the crane control device 50 and, if there is no waste in the map received from the crane control unit 11d1 that satisfies the charging criteria based on the ratio threshold for each waste type, transmits to the crane control device 50 a command to charge waste close to the charging criteria into the hopper 4, or to agitate waste close to the charging criteria to create waste that meets the charging criteria, and the crane control device 50 may operate the crane 5.
- the crane operation determination device may transmit to the crane control device 50 a command to pile up, at a specific location in the waste pit 3, only the waste that satisfies the charging criteria based on the ratio threshold for each waste type in the map received from the crane control unit 11d1, and the crane control device 50 may operate the crane 5. By doing so, waste that satisfies the charging criteria can be accumulated in the waste pit 3.
- based on the ratio threshold for each waste type, the crane operation determination device may identify, in the waste pit 3, waste that affects the combustion state (for example, sludge) and waste that can cause trouble with each device (for example, pruned branches), and may transmit to the crane control device 50 a command to store such waste at a specific location in the waste pit 3 or to scatter it at a specific location, and the crane control device 50 may operate the crane 5.
- if there is waste in the waste pit 3 that does not satisfy the agitation criteria based on the ratio threshold for each waste type, the crane operation determination device may instruct the crane control device 50 to agitate that waste, and the crane control device 50 may operate the crane 5.
- the agitation criteria may be the same as or different from the charging criteria.
- the crane operation determination device may judge the agitation state of the entire waste pit 3 from the ratio of each waste type in each region in the map received from the crane control unit 11d1, judge whether two cranes 5 need to be operated, and transmit a command to the crane control device 50, and the crane control device 50 may operate the crane 5, start operation of the second crane, or retract the second crane.
- in the above, the example in which the information processing device 10 for identifying the type of waste is used in the waste pit 3 of the waste disposal facility 100 has been described; however, the place where the information processing device 10 is used is not limited to the waste pit 3 of the waste disposal facility 100, and may be any waste storage place where waste is stored.
- The information processing apparatus 10 can be configured by one or more computers.
- the information processing program, and the recording medium on which it is recorded, are also subject to protection in this case.
Landscapes
- Engineering & Computer Science (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Multimedia (AREA)
- Theoretical Computer Science (AREA)
- Mechanical Engineering (AREA)
- General Engineering & Computer Science (AREA)
- Image Analysis (AREA)
- Incineration Of Waste (AREA)
- Management, Administration, Business Operations System, And Electronic Commerce (AREA)
Abstract
Description
The information processing method includes: a step of generating a first model, which is an identification algorithm for types of waste, by machine learning on first image data obtained by imaging the inside of the waste pits of a plurality of facilities in which waste is stored and first training data in which the waste in the images is labeled by type; and
a step of generating a second model, which is an identification algorithm corresponding to the waste of a second facility different from the plurality of facilities, by additionally training the generated first model on second image data obtained by imaging the inside of the waste pit of the second facility and second training data in which the waste in the images is labeled by type.
The information processing device includes: a first model generation unit that generates a first model, which is an identification algorithm for types of waste, by machine learning on first image data obtained by imaging the inside of the waste pits of a plurality of facilities in which waste is stored and first training data in which the waste in the images is labeled by type; and
a second model generation unit that generates a second model, which is an identification algorithm corresponding to the waste of a second facility different from the plurality of facilities, by additionally training the generated first model on second image data obtained by imaging the inside of the waste pit of the second facility and second training data in which the waste in the images is labeled by type.
The information processing method includes: a step of generating a first model, which is an identification algorithm for types of waste, by machine learning on first image data obtained by imaging the inside of the waste pits of a plurality of facilities in which waste is stored and first training data in which the waste in the images is labeled by type; and
a step of generating a second model, which is an identification algorithm corresponding to the waste of a second facility different from the plurality of facilities, by additionally training the generated first model on second image data obtained by imaging the inside of the waste pit of the second facility and second training data in which the waste in the images is labeled by type.
Next, the second model is generated by additionally training such a first model on training data collected and generated at the newly introduced facility (second facility). According to the inventors' findings, the amount of training data used for additional learning (training data collected and generated at the newly introduced facility) should be small relative to the amount of training data used when generating the first model (training data collected and generated at the plurality of facilities). If it is small, the model becomes less susceptible to the environmental conditions of the newly introduced facility (installation position, angle, and lighting conditions of the image acquisition camera) and to future environmental changes (such as changes in the shape and color of garbage bags, or aging-related dirt on the pit side walls). However, if a type of waste not present in the first training data exists at the newly introduced facility, including training data for that waste in the training data collected and generated at the newly introduced facility makes it possible to generate a second model capable of identifying it. With this configuration, a model (second model) with high identification accuracy adapted to the newly introduced facility can be obtained while maintaining a certain degree of robustness. In addition, the man-hours required to introduce an identification algorithm for identifying waste types at the newly introduced facility can be reduced.
The second image data and the second training data include, in their images, the side walls of the waste pit of the second facility and/or a crane that stirs or transports the waste.
According to this aspect, additional learning can be performed that takes into account differences in the shape and dimensions of the waste pit and of the crane at the newly introduced facility (second facility), so the accuracy with which the second model identifies the side walls and/or the crane improves, and the accuracy of identifying waste types can also be improved compared with using data of waste only (that is, data whose images do not include the waste pit side walls and/or the crane).
The amount of the second training data is 30% or less of the first training data.
The amount of the second training data is 15% or less of the first training data.
The amount of the second training data is 5% or less of the first training data.
The identification algorithm includes one or more of: linear regression, a Boltzmann machine, a neural network, a support vector machine, a Bayesian network, sparse regression, a decision tree, statistical estimation using a random forest, reinforcement learning, and deep learning.
The second training data includes data in which an entire image of the inside of the waste pit of the second facility is labeled.
According to this aspect, additional learning can be performed that takes into account information on how images of the inside of the waste pit obtained by the imaging device installed at the newly introduced facility (second facility) appear (field of view and angle of view), so the accuracy with which the second model identifies waste types can be improved.
The second training data includes data in which a portion of an image of the inside of the waste pit of the second facility is cut out and only the cut-out portion is labeled.
The second image data used in the step of generating the second model is obtained by inputting image data of the inside of the waste pit of the second facility into the first model to identify the types of waste, and then selecting, from that image data, image data whose identification accuracy is lower than a predetermined criterion (first criterion).
In the step of generating the first model, information on the imaging conditions and/or imaging environment at each facility is also learned.
The first image data is obtained by correcting at least one of the color tone, brightness, and saturation of image data of the inside of the waste pit of each facility, with reference to an image of an image-correction color chart common to the plurality of facilities.
The first image data is obtained by imaging the inside of the waste pit of each facility together with an image-correction color chart common to the plurality of facilities.
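As an illustration of this color correction, a 3x3 color matrix can be fitted that maps the chart patches as measured at one facility onto shared reference values and is then applied to the whole image; the patch arrays and the least-squares fit below are an assumed minimal realization, not the correction method specified in this application.

```python
import numpy as np

def color_correct(img: np.ndarray, measured: np.ndarray, reference: np.ndarray):
    """Fit a 3x3 color matrix mapping chart patch RGB values as measured at
    this facility (N, 3) onto the shared reference values (N, 3), then apply
    it to an RGB image (H, W, 3) of the waste pit."""
    M, _, _, _ = np.linalg.lstsq(measured.astype(float),
                                 reference.astype(float), rcond=None)
    out = img.reshape(-1, 3).astype(float) @ M
    return np.clip(out, 0, 255).reshape(img.shape).astype(np.uint8)
```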
The second image data includes composite images in which images of waste in the waste pits of the plurality of facilities, or of waste in the waste pit of another facility different from both the plurality of facilities and the second facility, are combined with rendered images, created based on the three-dimensional design data of the second facility, of the waste pit side walls and/or of a crane that stirs or transports the waste.
The method further includes a step of, after the generated second model is put into operation, periodically monitoring the identification accuracy during operation and, when the identification accuracy falls below a predetermined criterion (second criterion), updating the second model by additionally training it on the waste image data at that time and training data in which the waste in the images is labeled by type.
The method further includes a step of generating a third model, which is an identification algorithm corresponding to the waste of a third facility different from both the plurality of facilities and the second facility, by additionally training the generated second model on third image data obtained by imaging the inside of the waste pit of the third facility and third training data in which the waste in the images is labeled by type.
The information processing device includes: a first model generation unit that generates a first model, which is an identification algorithm for types of waste, by machine learning on first image data obtained by imaging the inside of the waste pits of a plurality of facilities in which waste is stored and first training data in which the waste in the images is labeled by type; and
a second model generation unit that generates a second model, which is an identification algorithm corresponding to the waste of a second facility different from the plurality of facilities, by additionally training the generated first model on second image data obtained by imaging the inside of the waste pit of the second facility and second training data in which the waste in the images is labeled by type.
The information processing program causes a computer to execute:
a step of generating a first model, which is an identification algorithm for types of waste, by machine learning on first image data obtained by imaging the inside of the waste pits of a plurality of facilities in which waste is stored and first training data in which the waste in the images is labeled by type; and
a step of generating a second model, which is an identification algorithm corresponding to the waste of a second facility different from the plurality of facilities, by additionally training the generated first model on second image data obtained by imaging the inside of the waste pit of the second facility and second training data in which the waste in the images is labeled by type.
The information processing method includes a step of identifying, with new image data obtained by imaging the inside of the waste pit of the second facility as input, the types of waste stored in that waste pit, using a second model that is an identification algorithm corresponding to the waste of the second facility and that is generated by additionally training a first model, which is an identification algorithm for types of waste generated by machine learning on first image data obtained by imaging the inside of the waste pits of a plurality of facilities in which waste is stored and first training data in which the waste in the images is labeled by type, on second image data obtained by imaging the inside of the waste pit of a second facility different from the plurality of facilities and second training data in which the waste in the images is labeled by type.
The information processing device includes a type identification unit that identifies, with new image data obtained by imaging the inside of the waste pit of the second facility as input, the types of waste stored in that waste pit, using a second model that is an identification algorithm corresponding to the waste of the second facility and that is generated by additionally training a first model, which is an identification algorithm for types of waste generated by machine learning on first image data obtained by imaging the inside of the waste pits of a plurality of facilities in which waste is stored and first training data in which the waste in the images is labeled by type, on second image data obtained by imaging the inside of the waste pit of a second facility different from the plurality of facilities and second training data in which the waste in the images is labeled by type.
The information processing program causes a computer to execute a step of identifying, with new image data obtained by imaging the inside of the waste pit of the second facility as input, the types of waste stored in that waste pit, using a second model that is an identification algorithm corresponding to the waste of the second facility and that is generated by additionally training a first model, which is an identification algorithm for types of waste generated by machine learning on first image data obtained by imaging the inside of the waste pits of a plurality of facilities in which waste is stored and first training data in which the waste in the images is labeled by type, on second image data obtained by imaging the inside of the waste pit of a second facility different from the plurality of facilities and second training data in which the waste in the images is labeled by type.
and a combustion control unit 11d2 that transmits the identification result of the type identification unit 11c (that is, information on the types of waste identified from the image data) to the combustion control device 20, which controls the combustion.
The crane 5 may be operated to agitate that waste. The agitation criteria may be the same as or different from the charging criteria.
Claims (20)
- 1. An information processing method comprising: a step of generating a first model, which is an identification algorithm for types of waste, by machine learning on first image data obtained by imaging the inside of waste pits of a plurality of facilities in which waste is stored and first training data in which the waste in the images is labeled by type; and a step of generating a second model, which is an identification algorithm corresponding to the waste of a second facility different from the plurality of facilities, by additionally training the generated first model on second image data obtained by imaging the inside of the waste pit of the second facility and second training data in which the waste in the images is labeled by type.
- 2. The information processing method according to claim 1, wherein the second image data and the second training data include, in their images, the side walls of the waste pit of the second facility and/or a crane that stirs or transports the waste.
- 3. The information processing method according to claim 1 or 2, wherein the amount of the second training data is 30% or less of the first training data.
- 4. The information processing method according to claim 3, wherein the amount of the second training data is 15% or less of the first training data.
- 5. The information processing method according to claim 4, wherein the amount of the second training data is 5% or less of the first training data.
- 6. The information processing method according to any one of claims 1 to 5, wherein the identification algorithm includes one or more of: linear regression, a Boltzmann machine, a neural network, a support vector machine, a Bayesian network, sparse regression, a decision tree, statistical estimation using a random forest, reinforcement learning, and deep learning.
- 7. The information processing method according to any one of claims 1 to 6, wherein the second training data includes data in which an entire image of the inside of the waste pit of the second facility is labeled.
- 8. The information processing method according to any one of claims 1 to 6, wherein the second training data includes data in which a portion of an image of the inside of the waste pit of the second facility is cut out and only the cut-out portion is labeled.
- 9. The information processing method according to any one of claims 1 to 8, wherein the second image data used in the step of generating the second model is obtained by inputting image data of the inside of the waste pit of the second facility into the first model to identify the types of waste, and then selecting, from that image data, image data whose identification accuracy is lower than a predetermined first criterion.
- 10. The information processing method according to any one of claims 1 to 9, wherein, in the step of generating the first model, information on the imaging conditions and/or imaging environment at each facility is also learned.
- 11. The information processing method according to any one of claims 1 to 10, wherein the first image data is obtained by correcting at least one of the color tone, brightness, and saturation of image data of the inside of the waste pit of each facility, with reference to an image of an image-correction color chart common to the plurality of facilities.
- 12. The information processing method according to any one of claims 1 to 10, wherein the first image data is obtained by imaging the inside of the waste pit of each facility together with an image-correction color chart common to the plurality of facilities.
- 13. The information processing method according to any one of claims 1 to 12, wherein the second image data includes composite images in which images of waste in the waste pits of the plurality of facilities, or of waste in the waste pit of another facility different from both the plurality of facilities and the second facility, are combined with rendered images, created based on three-dimensional design data of the second facility, of the waste pit side walls and/or of a crane that stirs or transports the waste.
- 14. The information processing method according to any one of claims 1 to 13, further comprising a step of, after the generated second model is put into operation, periodically monitoring the identification accuracy during operation and, when the identification accuracy falls below a predetermined second criterion, updating the second model by additionally training it on the waste image data at that time and training data in which the waste in the images is labeled by type.
- 15. The information processing method according to any one of claims 1 to 14, further comprising a step of generating a third model, which is an identification algorithm corresponding to the waste of a third facility different from both the plurality of facilities and the second facility, by additionally training the generated second model on third image data obtained by imaging the inside of the waste pit of the third facility and third training data in which the waste in the images is labeled by type.
- 16. An information processing device comprising: a first model generation unit that generates a first model, which is an identification algorithm for types of waste, by machine learning on first image data obtained by imaging the inside of waste pits of a plurality of facilities in which waste is stored and first training data in which the waste in the images is labeled by type; and a second model generation unit that generates a second model, which is an identification algorithm corresponding to the waste of a second facility different from the plurality of facilities, by additionally training the generated first model on second image data obtained by imaging the inside of the waste pit of the second facility and second training data in which the waste in the images is labeled by type.
- 17. An information processing program that causes a computer to execute: a step of generating a first model, which is an identification algorithm for types of waste, by machine learning on first image data obtained by imaging the inside of waste pits of a plurality of facilities in which waste is stored and first training data in which the waste in the images is labeled by type; and a step of generating a second model, which is an identification algorithm corresponding to the waste of a second facility different from the plurality of facilities, by additionally training the generated first model on second image data obtained by imaging the inside of the waste pit of the second facility and second training data in which the waste in the images is labeled by type.
- 18. An information processing method comprising a step of identifying, with new image data obtained by imaging the inside of the waste pit of a second facility as input, the types of waste stored in that waste pit, using a second model that is an identification algorithm corresponding to the waste of the second facility and that is generated by additionally training a first model, which is an identification algorithm for types of waste generated by machine learning on first image data obtained by imaging the inside of waste pits of a plurality of facilities in which waste is stored and first training data in which the waste in the images is labeled by type, on second image data obtained by imaging the inside of the waste pit of the second facility, which is different from the plurality of facilities, and second training data in which the waste in the images is labeled by type.
- 19. An information processing device comprising a type identification unit that identifies, with new image data obtained by imaging the inside of the waste pit of a second facility as input, the types of waste stored in that waste pit, using a second model that is an identification algorithm corresponding to the waste of the second facility and that is generated by additionally training a first model, which is an identification algorithm for types of waste generated by machine learning on first image data obtained by imaging the inside of waste pits of a plurality of facilities in which waste is stored and first training data in which the waste in the images is labeled by type, on second image data obtained by imaging the inside of the waste pit of the second facility, which is different from the plurality of facilities, and second training data in which the waste in the images is labeled by type.
- 20. An information processing program that causes a computer to execute a step of identifying, with new image data obtained by imaging the inside of the waste pit of a second facility as input, the types of waste stored in that waste pit, using a second model that is an identification algorithm corresponding to the waste of the second facility and that is generated by additionally training a first model, which is an identification algorithm for types of waste generated by machine learning on first image data obtained by imaging the inside of waste pits of a plurality of facilities in which waste is stored and first training data in which the waste in the images is labeled by type, on second image data obtained by imaging the inside of the waste pit of the second facility, which is different from the plurality of facilities, and second training data in which the waste in the images is labeled by type.
Priority Applications (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202280049403.8A CN117836560A (zh) | 2021-07-13 | 2022-07-13 | 信息处理方法、信息处理装置及信息处理程序 |
EP22842137.6A EP4372278A1 (en) | 2021-07-13 | 2022-07-13 | Information processing method, information processing device, and information processing program |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2021-115537 | 2021-07-13 | ||
JP2021115537A JP2023012094A (ja) | 2021-07-13 | 2021-07-13 | 情報処理方法、情報処理装置、および情報処理プログラム |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2023286795A1 true WO2023286795A1 (ja) | 2023-01-19 |
Family
ID=84919474
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2022/027504 WO2023286795A1 (ja) | 2021-07-13 | 2022-07-13 | 情報処理方法、情報処理装置、および情報処理プログラム |
Country Status (5)
Country | Link |
---|---|
EP (1) | EP4372278A1 (ja) |
JP (1) | JP2023012094A (ja) |
CN (1) | CN117836560A (ja) |
TW (1) | TW202309445A (ja) |
WO (1) | WO2023286795A1 (ja) |
Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2018181003A1 (ja) * | 2017-03-31 | 2018-10-04 | 三菱重工業株式会社 | 情報提供装置、情報提供システム、情報提供方法及びプログラム |
WO2020040110A1 (ja) * | 2018-08-23 | 2020-02-27 | 荏原環境プラント株式会社 | 情報処理装置、情報処理プログラム、および情報処理方法 |
JP2020038058A (ja) | 2018-08-23 | 2020-03-12 | 荏原環境プラント株式会社 | 情報処理装置、情報処理プログラム、および情報処理方法 |
WO2020194534A1 (ja) * | 2019-03-26 | 2020-10-01 | 東芝三菱電機産業システム株式会社 | 異常判定支援装置 |
-
2021
- 2021-07-13 JP JP2021115537A patent/JP2023012094A/ja active Pending
-
2022
- 2022-07-13 WO PCT/JP2022/027504 patent/WO2023286795A1/ja active Application Filing
- 2022-07-13 CN CN202280049403.8A patent/CN117836560A/zh active Pending
- 2022-07-13 EP EP22842137.6A patent/EP4372278A1/en active Pending
- 2022-07-13 TW TW111126220A patent/TW202309445A/zh unknown
Also Published As
Publication number | Publication date |
---|---|
TW202309445A (zh) | 2023-03-01 |
JP2023012094A (ja) | 2023-01-25 |
EP4372278A1 (en) | 2024-05-22 |
CN117836560A (zh) | 2024-04-05 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
JP7281768B2 (ja) | 情報処理装置、情報処理プログラム、および情報処理方法 | |
WO2020040110A1 (ja) | 情報処理装置、情報処理プログラム、および情報処理方法 | |
US11315085B2 (en) | Device, system and method for the monitoring, control and optimization of a waste pickup service | |
JP5969685B1 (ja) | 廃棄物選別システム及びその選別方法 | |
CN111065859A (zh) | 推定废弃物的组成的装置、系统、程序、方法及数据结构 | |
JP2017109197A (ja) | 廃棄物選別システム及びその選別方法 | |
JP2019119545A (ja) | 情報処理装置、制御装置、情報処理方法、および情報処理プログラム | |
JP2022132331A (ja) | 廃棄物の質を推定する装置、システム、プログラム、及び方法 | |
WO2023286795A1 (ja) | 情報処理方法、情報処理装置、および情報処理プログラム | |
CA3022583A1 (fr) | Automatic system for reliable and efficient interactive management of waste generated by an individual or by a group of individuals, method thereof | |
CN109353620A (zh) | 商业经营场所垃圾去污化预处理方法及系统 | |
CN113291665A (zh) | 一种垃圾回收系统、服务器及底盘 | |
CN111094851B (zh) | 热值推定方法、热值推定装置及垃圾贮存设备 | |
JP2004192268A (ja) | 廃棄物輸送の計画方法及びその装置 | |
JP7113930B1 (ja) | 転落者検出システム、転落者検出方法、および転落者検出プログラム | |
JP7387465B2 (ja) | 廃棄物撹拌状態評価装置および該方法 | |
JP7360287B2 (ja) | ごみクレーンの運転システムおよびこれを適用したごみ処理施設 | |
JP2002279056A (ja) | 被覆型廃棄物最終処分場における埋立て管理方法 | |
CN113947258A (zh) | 一种厨余垃圾净化智能化管理方法及系统 | |
Tan et al. | Automated waste segregation system using Arduino Uno R3 | |
Kumar et al. | Smart Garbage Collector Bot with Advanced Monitoring System | |
JP2020203249A (ja) | 廃棄物選別貯留装置、該廃棄物選別貯留装置を備えた廃棄物処理施設 | |
CN118786315A (zh) | 信息处理装置、信息处理方法以及信息处理程序 | |
Pawase Avinash et al. | Automated Waste Segregator for Efficient Recycling Using IoT | |
JP2003020104A (ja) | ごみおよびその焼却灰の処理設備およびそれらの処理方法 |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 22842137 Country of ref document: EP Kind code of ref document: A1 |
|
WWE | Wipo information: entry into national phase |
Ref document number: 202280049403.8 Country of ref document: CN |
|
WWE | Wipo information: entry into national phase |
Ref document number: 2022842137 Country of ref document: EP |
|
NENP | Non-entry into the national phase |
Ref country code: DE |
|
ENP | Entry into the national phase |
Ref document number: 2022842137 Country of ref document: EP Effective date: 20240213 |