CN112173497A - Control method and device of garbage collection equipment - Google Patents


Info

Publication number
CN112173497A
Authority
CN
China
Prior art keywords
target
action
motion
target object
garbage
Prior art date
Legal status
Pending
Application number
CN202011247599.0A
Other languages
Chinese (zh)
Inventor
李拔龙
Current Assignee
Gree Electric Appliances Inc of Zhuhai
Original Assignee
Gree Electric Appliances Inc of Zhuhai
Priority date
Filing date
Publication date
Application filed by Gree Electric Appliances Inc of Zhuhai
Priority to CN202011247599.0A
Publication of CN112173497A

Classifications

    • B PERFORMING OPERATIONS; TRANSPORTING
    • B65 CONVEYING; PACKING; STORING; HANDLING THIN OR FILAMENTARY MATERIAL
    • B65F GATHERING OR REMOVAL OF DOMESTIC OR LIKE REFUSE
    • B65F1/00 Refuse receptacles; Accessories therefor
    • B65F1/14 Other constructional features; Accessories
    • B65F1/1468 Means for facilitating the transport of the receptacle, e.g. wheels, rolls
    • B65F1/1473 Receptacles having wheels
    • B65F2210/00 Equipment of refuse receptacles
    • B65F2210/128 Data transmitting means
    • B65F2210/138 Identification means
    • B65F2210/165 Remote controls

Landscapes

  • Engineering & Computer Science (AREA)
  • Mechanical Engineering (AREA)
  • Image Analysis (AREA)

Abstract

The application provides a control method and device for garbage collection equipment, belonging to the technical field of big data. The method comprises the following steps: acquiring a target image sent by a shooting device, wherein the target image comprises environment information and the target position where a target object is located; and, in the case that the environment information identifies that garbage is present in the area where the target object is located, controlling the garbage collection equipment to move from its current position to the target position to collect the garbage in that area. In the application, once the server identifies garbage in the area where the target object is located, it controls the garbage collection equipment to move to the target position, so the target object does not need to walk to the equipment to dispose of the garbage, which makes daily life easier for the target object and improves its convenience.

Description

Control method and device of garbage collection equipment
Technical Field
The application relates to the technical field of big data, and in particular to a control method and device for garbage collection equipment.
Background
Smart home devices have entered deeply into people's daily lives and bring them great convenience. The smart trash can is one such device: when a person stands in front of it, it opens its lid automatically, with no need to open the lid by hand, which is both hygienic and convenient.
With existing disposal methods, however, a person still needs to walk to the trash can to throw garbage away, or to carry the can to wherever the garbage needs to be dumped. For people who have difficulty walking, this is inconvenient.
Disclosure of Invention
An object of the embodiments of the present application is to provide a method and an apparatus for controlling garbage collection equipment, so as to solve the problem of inconvenient garbage disposal. The specific technical scheme is as follows:
in a first aspect, a method for controlling a garbage collection apparatus is provided, the method comprising:
acquiring a target image sent by a shooting device, wherein the target image comprises environmental information and a target position where a target object is located;
and controlling the garbage collection device to move from the current position to the target position to collect the garbage in the position area under the condition that the environment information identifies that the position area of the target object has the garbage.
Optionally, the environment information includes an item held by the target object and a target action, and identifying through the environment information that garbage is present in the area where the target object is located includes:
determining the target action of the target object in the case that the held item is identified as not obviously being a garbage item;
and confirming that garbage is present in the area where the target object is located in the case that the target action is determined to be of the target action type.
Optionally, the target action includes a hand action, and confirming that garbage is present in the area where the target object is located, in the case that the target action is determined to be of the target action type, includes:
confirming that garbage is present in the area where the target object is located when the hand action is recognized as a first action type.
Optionally, the target action includes a face action, and confirming that garbage is present in the area where the target object is located, in the case that the target action is determined to be of the target action type, includes:
confirming that garbage is present in the area where the target object is located when the hand action is recognized as a second action type and the face action is recognized as a third action type, wherein the first action type, the second action type, and the third action type are different.
Optionally, identifying through the environment information that garbage is present in the area where the target object is located includes:
confirming that garbage is present in the area where the target object is located in the case that the target position is identified as falling within the area of a preset position and the shape of an item at the preset position is irregular.
Optionally, the identifying the hand action as the first action type comprises:
inputting a hand action image into a target action recognition model to obtain a recognition result output by the target action recognition model, wherein the target action recognition model is obtained by training an initial action recognition model with training image samples labeled with action types, the hand action image is an image containing the hand action extracted from the target image, and the recognition result indicates that the hand action is of the first action type.
Optionally, before inputting the hand action image into the action recognition model, the method further comprises:
acquiring the training image sample and an annotation result of the training image sample, wherein the annotation result is used for indicating the action type of the hand action in the training image sample;
inputting the training image sample into the initial motion recognition model to obtain a recognition result output by the initial motion recognition model, wherein the recognition result is used for indicating the motion type of the hand motion in the training image sample;
and under the condition that the labeling result is inconsistent with the identification result, adjusting model parameters of the initial action identification model to obtain the target action identification model, wherein the identification result output by the target action identification model is consistent with the labeling result.
In a second aspect, there is provided a control apparatus for a waste collection device, the apparatus comprising:
the device comprises an acquisition module, a processing module and a display module, wherein the acquisition module is used for acquiring a target image sent by a shooting device, and the target image comprises environmental information and a target position where a target object is located;
and the control module is used for controlling the garbage collection equipment to move from the current position to the target position to collect the garbage in the position area under the condition that the environment information identifies that the position area of the target object has the garbage.
In a third aspect, an electronic device is provided, which includes a processor, a communication interface, a memory and a communication bus, wherein the processor, the communication interface and the memory complete communication with each other through the communication bus;
a memory for storing a computer program;
a processor for implementing any of the method steps described herein when executing the program stored in the memory.
In a fourth aspect, a computer-readable storage medium is provided, having stored thereon a computer program which, when being executed by a processor, carries out any of the method steps.
The embodiment of the application has the following beneficial effects:
the embodiment of the application provides a control method of garbage collection equipment, which comprises the following steps: acquiring a target image sent by a shooting device, wherein the target image comprises environmental information and a target position where a target object is located; and controlling the garbage collection device to move from the current position to the target position to collect the garbage in the position area under the condition that the garbage exists in the position area of the target object is identified through the environment information. In the application, the server identifies that the garbage exists in the position area of the target object, the garbage collection device is controlled to move to the target position, the target object is not required to independently walk to the garbage collection device to throw the garbage, the life of the target object is facilitated, and the convenience of the life of the target object is improved.
Of course, not all of the above advantages need be achieved in the practice of any one product or method of the present application.
Drawings
In order to illustrate the embodiments of the present application or the technical solutions in the prior art more clearly, the drawings needed in the description of the embodiments or the prior art are briefly introduced below; other drawings can be obtained from them by those skilled in the art without inventive effort.
Fig. 1 is a flowchart of a method for controlling a garbage collection apparatus according to an embodiment of the present disclosure;
fig. 2 is a flowchart of a method for identifying that garbage is present in the area where a target object is located according to an embodiment of the present application;
fig. 3 is a schematic structural diagram of a control device of a garbage collection apparatus according to an embodiment of the present disclosure;
fig. 4 is a schematic structural diagram of an electronic device according to an embodiment of the present application.
Detailed Description
The technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are only a part of the embodiments of the present invention, and not all of the embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
The embodiment of the application provides a control method of a garbage collection device, which can be applied to a server and is used for controlling the garbage collection device to automatically move to a target position for garbage collection.
The following describes a control method of a garbage collection apparatus provided in an embodiment of the present application in detail with reference to a specific embodiment, as shown in fig. 1, the specific steps are as follows:
step 101: and acquiring the target image transmitted by the shooting device.
The target image comprises environment information and a target position where the target object is located.
In the embodiment of the application, a target object moves within a target area. Several shooting devices are arranged in the target area; they capture images and send each captured target image to a server. The target image contains environment information about the target area, which includes map information. After receiving the map information, the server builds a map model of the target area through modeling and identifies the position of each item on the map. Since the target object is also within the target area, the server can likewise identify the target position of the target object. Optionally, when the map information is updated, the server rebuilds the map model according to the updated map information.
Illustratively, the target area is a room of the target object, and a shooting device is installed in the room. The shooting device can capture the tables, chairs, users, and items on the tables in the room, and sends the captured images to the server, which builds a map model of the room from those images; the map model may be three-dimensional.
The environment information also includes the item held by the target object, the target object's hand and face actions, the items in the area where the target object is located, and so on. In short, the environment information covers everything within the target area that the shooting devices can capture.
Step 102: and controlling the garbage collection device to move from the current position to the target position to collect the garbage in the position area under the condition that the garbage exists in the position area of the target object is identified through the environment information.
After acquiring the target image, the server analyzes the environment information it contains. If the server identifies garbage in the area where the target object is located, this indicates that the target object intends to dispose of garbage, so the server controls the garbage collection equipment to move automatically from its current position to the target object's position. The garbage can then be put into the equipment without the target object having to move.
Wherein the garbage collection device can be a garbage can with wheels.
In this way, once the server identifies garbage in the area where the target object is located, it controls the garbage collection equipment to move to the target position; the target object does not need to walk to the equipment to dispose of the garbage, which makes daily life easier for the target object and improves its convenience.
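The two-step flow above (acquire the target image, then dispatch the bin when garbage is identified) can be sketched as follows. This is a minimal sketch under assumptions: the environment-information field names and the `move_to` callback are illustrative stand-ins, not taken from the patent.

```python
from dataclasses import dataclass

@dataclass
class TargetImage:
    environment_info: dict   # recognition results derived from the frame (assumed fields)
    target_position: tuple   # position of the target object in the room map

def garbage_identified(env: dict) -> bool:
    """Hypothetical summary of the checks the embodiments describe: obvious
    garbage in hand, a target-type action, or irregular items at a preset
    position such as a table."""
    return bool(env.get("holding_obvious_garbage")
                or env.get("target_action_detected")
                or env.get("irregular_item_at_preset_position"))

def control_collection_equipment(image: TargetImage, move_to) -> bool:
    """Steps 101/102: dispatch the bin to the target position only when
    garbage is identified; `move_to` stands in for the drive command."""
    if garbage_identified(image.environment_info):
        move_to(image.target_position)
        return True
    return False
```

A real controller would replace `move_to` with the equipment's navigation command and feed `environment_info` from the recognition models described later.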
As an alternative implementation, as shown in fig. 2, the environment information includes an item held by the target object and a target action, and identifying through the environment information that garbage is present in the area where the target object is located includes:
Step 201: determining the target action of the target object in the case that the held item is identified as not obviously being a garbage item.
The server analyzes the item held by the target object in the target image. If the server determines that the held item is obviously garbage, this indicates the target object is holding a garbage item, and the server controls the garbage collection equipment to move to the target position for collection. If the server determines that the held item is not obviously garbage, the server needs to analyze the target action of the target object to make a further judgment.
Optionally, the server uses the three-dimensional model to decide whether the held item is obviously garbage. The server builds the three-dimensional model and sets a size threshold for items in it, then analyzes the size of the held item. If the item's size exceeds the threshold, it is a large item and not obviously a garbage item. If its size does not exceed the threshold, it is a small item that may be garbage, and further judgment is needed.
For example, if the held item is an electric fan whose size exceeds the threshold, the server determines that the fan is not obviously a garbage item. If the held item is a tissue or a plastic bag whose size is below the threshold, the server determines that it may be a garbage item and goes on to analyze the target object's actions.
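The size-threshold check can be sketched as a small classifier. The size measure and the default threshold value are illustrative assumptions; the patent gives no concrete numbers.

```python
def classify_held_item(item_size: float, size_threshold: float = 5000.0) -> str:
    """Classify a held item by its size from the three-dimensional model.

    Items above the threshold (e.g. an electric fan) are treated as not
    obviously garbage; smaller items (a tissue, a plastic bag) are only
    *possible* garbage and trigger the follow-up action check. The units
    and the default threshold are hypothetical.
    """
    if item_size > size_threshold:
        return "not_obvious_garbage"
    return "possible_garbage_check_action"
```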
Step 202: confirming that garbage is present in the area where the target object is located if the target action is determined to be of the target action type.
The server obtains the target action of the target object and analyzes its action type. If the server determines that the action type belongs to the target action type, garbage is present in the area where the target object is located; if it does not, no garbage is present there. The target action type is a predefined action behavior of the target action.
As an optional implementation, the target action includes a hand action, and confirming that garbage is present in the area where the target object is located, in the case that the target action is determined to be of the target action type, includes: confirming that garbage is present in the area where the target object is located when the hand action is recognized as a first action type.
In this embodiment, the target action includes a hand action, and the server identifies whether the action type of the hand action is the first action type. If it is, the target object currently has garbage at hand; if it is not, the target object currently has no garbage at hand. The first action type is a peeling or cutting action.
Illustratively, the first action type covers actions such as peeling and cutting. If the server determines that the target object's hands are peeling or cutting something, garbage is present in the area where the target object is located; if they are not, no garbage is present there.
As an optional implementation, the target action includes a face action, and confirming that garbage is present in the area where the target object is located, in the case that the target action is determined to be of the target action type, includes: confirming that garbage is present in the area where the target object is located when the hand action is recognized as a second action type and the face action is recognized as a third action type, wherein the first action type, the second action type, and the third action type are different.
The server identifies whether the target object's hand action is of the second action type and whether the face action is of the third action type. If both hold, the server confirms that garbage is present in the area where the target object is located; if either does not, it confirms that no garbage is present there. The second action type is the hand repeatedly moving toward and away from the face; the third action type is the mouth opening and closing, or the facial muscles making a chewing motion.
For example, if the server recognizes that the target object's hand keeps moving toward and away from the face while the mouth keeps opening and closing, the target object is eating, and the probability that garbage is present in the area where the target object is located is high.
As an optional implementation, identifying through the environment information that garbage is present in the area where the target object is located includes: confirming that garbage is present in the area where the target object is located when the target position is identified as falling within the area of a preset position and the shape of an item at the preset position is irregular.
The server identifies the preset position and the target position of the target object. If the server determines that the target position falls within the area of the preset position and the shape of an item at the preset position is irregular, garbage is present in the area where the target object is located. The preset position is a place where items can be put down, such as a table or a tea table. Irregularly shaped items are scattered and have no definite shape, such as scattered fruit peels or crumpled tissues.
Illustratively, the full decision flow is as follows. If the server identifies the target object's held item as obviously garbage, it controls the garbage collection equipment to move to the target position. If the held item is not obviously garbage, the server checks whether the target object's hand action is a peeling or cutting action; if it is, the equipment is moved to the target position. If it is not, the server checks whether the target object's hand keeps moving toward and away from the face while the mouth keeps opening and closing; if so, the target object is eating and the equipment is moved to the target position; if not, the equipment stays where it is.
Separately, if the server identifies that the target object is located in the area of a table or tea table on which garbage items are present, it controls the garbage collection equipment to move to the target position.
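The decision cascade described in the two paragraphs above can be sketched as a single predicate. All observation field names and action labels below are hypothetical stand-ins for the recognizer's outputs.

```python
def should_dispatch_bin(obs: dict) -> bool:
    """Return True when the embodiments above would move the bin.

    `obs` is assumed to carry: 'holding_obvious_garbage' (bool),
    'hand_action' (a label such as 'peeling', 'cutting', 'near_face_cycle'),
    'mouth_opening_closing' (bool), 'at_preset_position' (bool), and
    'irregular_item_present' (bool).
    """
    if obs.get("holding_obvious_garbage"):
        return True                       # held item is obviously garbage
    if obs.get("hand_action") in ("peeling", "cutting"):
        return True                       # first action type
    if obs.get("hand_action") == "near_face_cycle" and obs.get("mouth_opening_closing"):
        return True                       # second + third action types: the person is eating
    if obs.get("at_preset_position") and obs.get("irregular_item_present"):
        return True                       # irregular items on a table or tea table
    return False
```

Note that the cascade is ordered cheapest-check-first, matching the order in which the embodiments fall through from held item to hand action to face action.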
As an alternative embodiment, recognizing the hand action as the first action type includes: inputting a hand action image into a target action recognition model to obtain a recognition result output by the model, wherein the target action recognition model is obtained by training an initial action recognition model with training image samples labeled with action types, the hand action image is an image containing the hand action extracted from the target image, and the recognition result indicates that the hand action is of the first action type.
After the server acquires the target image, it extracts from it a hand action image containing the hand action, then inputs the hand action image into the target action recognition model and obtains the recognition result, which indicates that the hand action is of the first action type.
The server likewise uses recognition models to identify whether the hand action is of the second action type and whether the face action is of the third action type; this is not repeated here.
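The extract-then-classify pipeline above reduces to two calls. `extract_hand_region` and `action_model` are hypothetical stand-ins for the real hand detector and trained recognition model, which the patent does not specify.

```python
def classify_hand_action(target_image, extract_hand_region, action_model) -> str:
    """Crop the hand-action region out of the full frame, then feed it to
    the trained action recognition model, which returns an action-type
    label (e.g. the first action type for a peeling motion)."""
    hand_image = extract_hand_region(target_image)   # image containing the hand action
    return action_model(hand_image)                  # recognition result: an action type
```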
As an optional implementation, before inputting the hand action image into the action recognition model, the method further comprises: acquiring training image samples and their annotation results, wherein an annotation result indicates the action type of the hand action in a training image sample; inputting the training image samples into an initial action recognition model to obtain the recognition results it outputs, wherein a recognition result indicates the action type of the hand action in a training image sample; and, in the case that an annotation result is inconsistent with the corresponding recognition result, adjusting the model parameters of the initial action recognition model to obtain the target action recognition model, whose recognition results are consistent with the annotation results.
That is, the server inputs the training image samples into the initial action recognition model to obtain its recognition results; whenever an annotation result disagrees with a recognition result, the model parameters are adjusted, until the recognition results agree with the annotation results and the target action recognition model is obtained.
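The training procedure above (predict, compare with the annotation, adjust on mismatch, stop when consistent) can be sketched generically. `predict` and `adjust` stand in for the real model's forward pass and parameter update; both are hypothetical, since the patent does not name a model architecture.

```python
def train_action_recognizer(samples, labels, predict, adjust, max_rounds=100) -> bool:
    """Adjust model parameters until every recognition result matches its
    annotation, mirroring the loop described in the embodiment. Returns
    True once the outputs are consistent with the labels."""
    for _ in range(max_rounds):
        any_mismatch = False
        for sample, label in zip(samples, labels):
            if predict(sample) != label:
                adjust(sample, label)     # annotation disagrees: update parameters
                any_mismatch = True
        if not any_mismatch:
            return True                   # the model is now the target recognition model
    return False
```

With a toy memorizing "model" this loop converges in two passes; a real implementation would replace `adjust` with gradient updates on a neural action classifier.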
Based on the same technical concept, an embodiment of the present application further provides a control device of a garbage collection apparatus, as shown in fig. 3, the control device includes:
a first obtaining module 301, configured to obtain a target image sent by a shooting device, where the target image includes environment information and a target position where a target object is located;
and the control module 302 is configured to control the garbage collection device to move from the current location to the target location to collect garbage in the location area when it is identified that the location area of the target object has garbage through the environment information.
Optionally, the environment information includes an item held by the target object and a target action, and the control module 302 includes:
a first determining unit, configured to determine the target action of the target object in the case that the held item is identified as not obviously being a garbage item;
and a first confirming unit, configured to confirm that garbage is present in the area where the target object is located if the target action is determined to be of the target action type.
Optionally, the target action comprises a hand action, and the first confirmation unit comprises:
and a first confirming subunit, configured to confirm that the location area of the target object has garbage, if the hand movement is recognized as the first action type.
Optionally, the target action comprises a facial action, and the first confirmation unit comprises:
and a second confirming subunit, configured to confirm that the position area of the target object has garbage when the hand movement is recognized as the second action type and the face movement is recognized as the third action type, where the first action type, the second action type, and the third action type are different.
Optionally, the control module 302 comprises:
and the second confirming unit is used for confirming that garbage exists in the position area of the target object under the condition that the target position is in the area of the preset position and the shape of the article in the preset position belongs to the irregular shape.
Optionally, the first acknowledgement subunit comprises:
and an input submodule, configured to input a hand action image into a target action recognition model to obtain the recognition result output by the model, wherein the target action recognition model is obtained by training an initial action recognition model with training image samples labeled with action types, the hand action image is an image containing the hand action extracted from the target image, and the recognition result indicates that the hand action is of the first action type.
Optionally, the apparatus further comprises:
the second acquisition module is used for acquiring the training image sample and the labeling result of the training image sample, wherein the labeling result is used for indicating the action type of the hand action in the training image sample;
the input module is used for inputting the training image sample into the initial action recognition model to obtain a recognition result output by the initial action recognition model, wherein the recognition result is used for indicating the action type of the hand action in the training image sample;
and the adjusting module is used for adjusting the model parameters of the initial action recognition model to obtain a target action recognition model under the condition that the labeling result is inconsistent with the recognition result, wherein the recognition result output by the target action recognition model is consistent with the labeling result.
Based on the same technical concept, the embodiment of the present invention further provides an electronic device, as shown in fig. 4, including a processor 401, a communication interface 402, a memory 403 and a communication bus 404, where the processor 401, the communication interface 402, and the memory 403 complete mutual communication through the communication bus 404,
the memory 403 is used for storing a computer program;
and the processor 401 is configured to implement the method steps described above when executing the program stored in the memory 403.
The communication bus of the electronic device may be a Peripheral Component Interconnect (PCI) bus, an Extended Industry Standard Architecture (EISA) bus, or the like. The communication bus may be divided into an address bus, a data bus, a control bus, and so on. For ease of illustration, only one thick line is shown in the figure, but this does not mean that there is only one bus or only one type of bus.
The communication interface is used for communication between the electronic device and other devices.
The memory may include a random access memory (RAM) or a non-volatile memory (NVM), for example at least one disk memory. Optionally, the memory may also be at least one storage device located remotely from the processor.
The processor may be a general-purpose processor, including a central processing unit (CPU), a network processor (NP), and the like; it may also be a digital signal processor (DSP), an application-specific integrated circuit (ASIC), a field-programmable gate array (FPGA) or other programmable logic device, a discrete gate or transistor logic device, or a discrete hardware component.
In a further embodiment provided by the present invention, there is also provided a computer readable storage medium having stored therein a computer program which, when executed by a processor, implements the steps of any of the methods described above.
In a further embodiment provided by the present invention, there is also provided a computer program product containing instructions which, when run on a computer, cause the computer to perform any of the methods of the above embodiments.
In the above embodiments, the implementation may be realized wholly or partially by software, hardware, firmware, or any combination thereof. When implemented in software, it may be realized wholly or partially in the form of a computer program product. The computer program product includes one or more computer instructions. When the computer instructions are loaded and executed on a computer, the processes or functions described in the embodiments of the invention are produced, in whole or in part. The computer may be a general-purpose computer, a special-purpose computer, a computer network, or another programmable device. The computer instructions may be stored in a computer-readable storage medium or transmitted from one computer-readable storage medium to another, for example, from one website, computer, server, or data center to another website, computer, server, or data center via a wired (e.g., coaxial cable, optical fiber, digital subscriber line (DSL)) or wireless (e.g., infrared, radio, microwave) connection. The computer-readable storage medium can be any available medium that can be accessed by a computer, or a data storage device, such as a server or data center, that integrates one or more available media. The available medium may be a magnetic medium (e.g., floppy disk, hard disk, magnetic tape), an optical medium (e.g., DVD), or a semiconductor medium (e.g., solid state disk (SSD)), among others.
It is noted that, in this document, relational terms such as "first" and "second" may be used solely to distinguish one entity or action from another, without necessarily requiring or implying any actual such relationship or order between those entities or actions. Also, the terms "comprises," "comprising," or any other variation thereof are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element introduced by the phrase "comprising a …" does not exclude the presence of other identical elements in the process, method, article, or apparatus that comprises the element.
The above description is merely exemplary of the present application and is presented to enable those skilled in the art to understand and practice the present application. Various modifications to these embodiments will be readily apparent to those skilled in the art, and the generic principles defined herein may be applied to other embodiments without departing from the spirit or scope of the application. Thus, the present application is not intended to be limited to the embodiments shown herein but is to be accorded the widest scope consistent with the principles and novel features disclosed herein.

Claims (10)

1. A method of controlling a garbage collection device, the method comprising:
acquiring a target image sent by a shooting device, wherein the target image comprises environment information and a target position where a target object is located;
and controlling the garbage collection device to move from its current position to the target position to collect the garbage in the location area of the target object, under the condition that it is identified through the environment information that garbage exists in the location area of the target object.
2. The method of claim 1, wherein the environment information comprises a handheld object and a target action of the target object, and wherein identifying through the environment information that garbage exists in the location area of the target object comprises:
determining the target action of the target object under the condition that the handheld object is identified as an object that is not obviously garbage;
and confirming that garbage exists in the location area of the target object under the condition that the target action is determined to be of a target action type.
3. The method of claim 2, wherein the target action comprises a hand action, and wherein confirming that garbage exists in the location area of the target object under the condition that the target action is determined to be of a target action type comprises:
confirming that garbage exists in the location area of the target object when the hand action is recognized as being of a first action type.
4. The method of claim 3, wherein the target action further comprises a facial action, and wherein confirming that garbage exists in the location area of the target object under the condition that the target action is determined to be of a target action type comprises:
confirming that garbage exists in the location area of the target object when the hand action is recognized as being of a second action type and the facial action is recognized as being of a third action type, wherein the first action type, the second action type and the third action type are different from one another.
5. The method of claim 1, wherein identifying through the environment information that garbage exists in the location area of the target object comprises:
confirming that garbage exists in the location area of the target object under the condition that the target position is identified as being within a preset area and the shape of the article at the preset position is irregular.
6. The method of claim 3, wherein recognizing the hand action as being of the first action type comprises:
inputting a hand action image into a target action recognition model to obtain a recognition result output by the target action recognition model, wherein the target action recognition model is obtained by training an initial action recognition model with training image samples labeled with action types, the hand action image is an image containing the hand action extracted from the target image, and the recognition result indicates that the hand action is of the first action type.
7. The method of claim 6, wherein before inputting the hand action image into the target action recognition model, the method further comprises:
acquiring the training image sample and a labeling result of the training image sample, wherein the labeling result indicates the action type of the hand action in the training image sample;
inputting the training image sample into the initial action recognition model to obtain a recognition result output by the initial action recognition model, wherein the recognition result indicates the action type of the hand action in the training image sample;
and under the condition that the labeling result is inconsistent with the recognition result, adjusting model parameters of the initial action recognition model to obtain the target action recognition model, wherein the recognition result output by the target action recognition model is consistent with the labeling result.
8. A control device for a garbage collection device, the control device comprising:
an acquisition module, used for acquiring a target image sent by a shooting device, wherein the target image comprises environment information and a target position where a target object is located;
and a control module, used for controlling the garbage collection device to move from its current position to the target position to collect the garbage in the location area of the target object, under the condition that it is identified through the environment information that garbage exists in the location area of the target object.
9. An electronic device, characterized by comprising a processor, a communication interface, a memory and a communication bus, wherein the processor, the communication interface and the memory communicate with one another through the communication bus;
a memory for storing a computer program;
a processor for implementing the method steps of any of claims 1 to 7 when executing a program stored in the memory.
10. A computer-readable storage medium, characterized in that a computer program is stored in the computer-readable storage medium, which computer program, when being executed by a processor, carries out the method steps of any one of claims 1 to 7.
CN202011247599.0A 2020-11-10 2020-11-10 Control method and device of garbage collection equipment Pending CN112173497A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202011247599.0A CN112173497A (en) 2020-11-10 2020-11-10 Control method and device of garbage collection equipment


Publications (1)

Publication Number Publication Date
CN112173497A 2021-01-05

Family

ID=73918155


Country Status (1)

Country Link
CN (1) CN112173497A (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2022174541A1 (en) * 2021-02-20 2022-08-25 北京市商汤科技开发有限公司 Garbage detection method and apparatus, device, storage medium, and program product

Citations (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104326195A (en) * 2014-11-10 2015-02-04 安徽省新方尊铸造科技有限公司 Intelligent garbage can with automatic demand judgment function
CN107618783A (en) * 2016-07-13 2018-01-23 深圳市朗驰欣创科技股份有限公司 The automatic control method and control system for receiving object
WO2018033154A1 (en) * 2016-08-19 2018-02-22 北京市商汤科技开发有限公司 Gesture control method, device, and electronic apparatus
CN108545359A (en) * 2018-05-15 2018-09-18 福建工程学院 A kind of intelligent garbage bin and its control method of view-based access control model identification
CN208412852U (en) * 2018-05-07 2019-01-22 北京三辰环卫机械有限公司 Mobile dustbin
CN109592256A (en) * 2018-12-18 2019-04-09 东莞市第三人民医院(东莞市石龙人民医院) A kind of clinical waste collection method with intelligent garbage bin
CN109625706A (en) * 2018-11-30 2019-04-16 湖南人文科技学院 A kind of intelligent garbage bin and its control method
CN110040394A (en) * 2019-04-10 2019-07-23 广州大学 A kind of interactive intelligent rubbish robot and its implementation
CN111382599A (en) * 2018-12-27 2020-07-07 北京搜狗科技发展有限公司 Image processing method and device and electronic equipment
CN111439503A (en) * 2020-04-26 2020-07-24 东风汽车集团有限公司 Automatic classification system and method for garbage in vehicle



Similar Documents

Publication Publication Date Title
CN110294236B (en) Garbage classification monitoring device and method and server system
CN110428019B (en) Intelligent garbage classification method and modularized intelligent garbage classification processing system
WO2021012761A1 (en) Garbage classification method, apparatus and device, and storage medium
CN111012261A (en) Sweeping method and system based on scene recognition, sweeping equipment and storage medium
CN110109878A (en) Photograph album management method, device, storage medium and electronic equipment
CN110334344A (en) A kind of semanteme intension recognizing method, device, equipment and storage medium
CN110428352B (en) Intelligent garbage classification method for wearable equipment and modularized garbage classification system
CN111814828B (en) Intelligent space planning method, device, equipment and storage medium
CN110570856A (en) intelligent garbage can based on voice interaction auxiliary classification putting and auxiliary putting method
CN111160186B (en) Intelligent garbage classification processing method and related products
CN111638651A (en) Intelligent household control panel, setting method thereof, server and storage medium
WO2022174541A1 (en) Garbage detection method and apparatus, device, storage medium, and program product
CN108597510A (en) a kind of data processing method and device
CN107784034A (en) The recognition methods of page classification and device, the device for the identification of page classification
CN112173497A (en) Control method and device of garbage collection equipment
CN110781805A (en) Target object detection method, device, computing equipment and medium
CN110298380A (en) Image processing method, device and electronic equipment
CN112651318A (en) Image recognition-based garbage classification method, device and system
CN113128397A (en) Garbage classification throwing monitoring method, system and device and storage medium
CN110929693A (en) Intelligent garbage classification method and system
CN112766096A (en) Recoverable garbage abnormal delivery identification method, system, terminal and throwing device
CN111832749B (en) Garbage bag identification method and related device
CN111046974B (en) Article classification method and device, storage medium and electronic equipment
CN111232483A (en) Garbage classification method and device and garbage can
CN107506407A (en) A kind of document classification, the method and device called

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication

Application publication date: 20210105