CN115885231B - Cleaning system - Google Patents
Cleaning system
- Publication number: CN115885231B (application number CN202080102335.8A)
- Authority: CN (China)
- Prior art keywords: cleaning, range, robot, autonomous traveling, area
- Prior art date: 2020-07-09
- Legal status: Active
Classifications
- G: PHYSICS
- G05: CONTROLLING; REGULATING
- G05D: SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
- G05D1/00: Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
- G05D1/02: Control of position or course in two dimensions
Abstract
Compared with the case where a relatively wide cleaning area is cleaned using only an autonomous traveling robot suited to cleaning wide places, cleaning omissions in the cleaning area are reduced. The cleaning monitoring device (10) comprises: an unclean range determination unit (14) that determines, from the actual cleaning range of the large cleaning robot (1), the range within the cleaning area (3) that the large cleaning robot (1) has not cleaned, the actual cleaning range being obtained by analyzing imaging data of the cleaning area (3) captured by the monitoring camera (5); a small instruction generation unit (15) that, based on the determined unclean range, generates a program for cleaning the range left uncleaned by the large cleaning robot (1); and a robot control unit (12) that downloads the program generated by the small instruction generation unit (15) to the small cleaning robot (2), thereby causing the small cleaning robot (2) to clean the range that the large cleaning robot (1) missed.
Description
Technical Field
The present invention relates to a cleaning system, and more particularly, to a cleaning operation using an autonomous traveling robot.
Background
To improve the efficiency of cleaning work in facilities with a large floor area, such as shopping malls and office buildings, relatively large autonomous traveling cleaning robots have been introduced. A management company providing maintenance services for such a facility concludes a cleaning service contract with the facility and undertakes the cleaning of the facility as part of those services.
When a facility is cleaned by a cleaning robot, the management company may have the cleaning performed at night, when no facility users are present. In this case, the cleaning robot is programmed to clean a predetermined cleaning range in the facility, and it starts the cleaning operation at a predetermined time in accordance with that program.
The contract amount for the cleaning business may be set according to the cleaning area in the facility. In this case, since the cleaning range is known in advance, the management company programs the cleaning robot so that it can clean the entire cleaning range.
Prior art literature
Patent literature
Patent document 1: Japanese Patent Laid-Open No. 2004-326692
Patent document 2: Japanese Patent Laid-Open No. 2018-043358
Patent document 3: Japanese Patent Laid-Open No. 2016-087106
Patent document 4: Japanese Patent Laid-Open No. 2009-165823
Disclosure of Invention
Problems to be solved by the invention
However, the cleaning robot does not necessarily clean exactly according to the programmed plan. For example, when the robot travels on motor-driven tires, an error may arise in the travel distance if the facility contains sloped areas, or the tires may slip on a wet floor and likewise produce a travel-distance error. Furthermore, if a shelf or the like is temporarily placed on the programmed travel path, for example for an event, it becomes an obstacle. The cleaning robot may have a function for avoiding an obstacle and returning to its original travel path, but an error may occur in the position at which it rejoins the path. In other words, even if the cleaning robot is programmed to clean a predetermined cleaning range, it cannot necessarily travel exactly as programmed, and therefore cannot necessarily clean the entire planned cleaning range reliably.
Furthermore, since the cleaning robot introduced into a facility with a relatively large floor area is itself relatively large, it can clean a large area at once, but it may be unable to clean narrow places or small corners within the cleaning area.
The purpose of the present invention is to reduce cleaning omissions in a cleaning area, compared with the case where a relatively wide cleaning area is cleaned using only an autonomous traveling robot suited to cleaning wide places.
Means for solving the problems
The cleaning system according to the present invention comprises: a 1st autonomous traveling robot that travels autonomously and performs cleaning work; an imaging unit that images the entire cleaning area while the 1st autonomous traveling robot cleans the cleaning area in accordance with cleaning instruction information created on the basis of layout information of the cleaning area; a generation unit that, after the 1st autonomous traveling robot has finished cleaning the cleaning area, analyzes the imaging data generated by the imaging unit and generates unclean range information specifying the range within the cleaning area that the 1st autonomous traveling robot has not actually cleaned; and an instruction unit that instructs the 1st autonomous traveling robot to clean the actually uncleaned range specified by the unclean range information. The generation unit does not wait for the 1st autonomous traveling robot to finish cleaning the entire cleaning area; instead, each time the 1st autonomous traveling robot finishes cleaning one of the predetermined ranges formed by dividing the cleaning area, it generates unclean range information corresponding to that predetermined range, and the instruction unit instructs the 1st autonomous traveling robot to clean the actually uncleaned range within the predetermined range each time the corresponding unclean range information is generated.
Further, the cleaning system includes a 2nd autonomous traveling robot that travels autonomously and performs cleaning work and is better suited than the 1st autonomous traveling robot to cleaning narrow places, and the instruction unit instructs the 2nd autonomous traveling robot to clean the range that the 1st autonomous traveling robot has not actually cleaned, out of the range planned as the cleaning range of the 1st autonomous traveling robot.
The cleaning system of the present invention comprises: a 1st autonomous traveling robot that travels autonomously and performs cleaning work; a 2nd autonomous traveling robot that travels autonomously and performs cleaning work and is better suited than the 1st autonomous traveling robot to cleaning narrow places; an imaging unit that images the entire cleaning area while the 1st autonomous traveling robot cleans the cleaning area in accordance with cleaning instruction information created on the basis of layout information of the cleaning area; a generation unit that, after the 1st autonomous traveling robot has finished cleaning the cleaning area, analyzes the imaging data generated by the imaging unit and generates unclean range information specifying the range within the cleaning area that the 1st autonomous traveling robot has not actually cleaned; and an instruction unit that instructs the 1st autonomous traveling robot to clean the actually uncleaned range specified by the unclean range information, and instructs the 2nd autonomous traveling robot to clean the range that the 1st autonomous traveling robot has not actually cleaned, out of the range planned as the cleaning range of the 1st autonomous traveling robot.
Advantageous Effects of Invention
According to the present invention, cleaning omissions in the cleaning area can be reduced compared with the case where a relatively wide cleaning area is cleaned using only an autonomous traveling robot suited to cleaning wide places.
Drawings
Fig. 1 is a diagram showing an overall configuration of an embodiment of a cleaning system according to the present invention and a block configuration of a cleaning monitoring device included in the cleaning system.
Fig. 2 is a diagram showing an example of layout information stored in the layout information storage unit in the present embodiment.
Fig. 3 is a diagram showing an example of the point table stored in the point table storage unit in the present embodiment.
Fig. 4 is a diagram showing a travel path for the large cleaning robot to perform a cleaning operation according to the created program in the present embodiment.
Fig. 5 is a flowchart showing the cleaning performance information generation process in the present embodiment.
Fig. 6 is a view showing the cleaning range in the case where the large cleaning robot normally performs the cleaning operation according to the created program in the present embodiment.
Fig. 7 is a view showing a cleaning range obtained when the large cleaning robot actually performs a cleaning operation in the present embodiment.
Fig. 8 is a view showing a cleaning range of the small cleaning robot in the present embodiment.
Fig. 9 is a view showing the small areas that serve as units of cleaning of the cleaning area by the large cleaning robot in the present embodiment.
Detailed Description
Hereinafter, preferred embodiments of the present invention will be described with reference to the drawings.
Fig. 1 is a diagram showing the overall configuration of an embodiment of the cleaning system according to the present invention and the block configuration of the cleaning monitoring device 10 included in the cleaning system. Fig. 1 also shows a schematic view of the cleaning area 3 to be cleaned by the autonomous traveling robots 1 and 2. The cleaning area 3 is, for example, a facility with a large floor area, such as a shopping mall or an office building, or a part of such a facility. The layout and area of the cleaning area 3 are known information. Within the cleaning area 3, building pillars, articles placed on the floor, and the like become obstacles 4a, 4b during cleaning. If the cleaning area 3 is, for example, a shopping mall, display racks and the like may become the obstacles 4a, 4b; if it is a tenant's room in an office building, desks, cabinets, and the like may become the obstacles 4a, 4b. The places where these obstacles 4a, 4b stand are ranges that the autonomous traveling robot 1 cannot clean. In the following description, the obstacles are collectively referred to as "obstacle 4" when there is no need to distinguish between the obstacles 4a and 4b.
The monitoring camera 5 is an imaging unit, one or more of which are installed so that the entire cleaning area 3 can be imaged. The monitoring system 6 monitors the facility using the monitoring camera 5, and stores and manages the imaging data generated by the monitoring camera 5.
The autonomous traveling robots 1 and 2 perform cleaning work while traveling autonomously in accordance with a program. Each of the autonomous traveling robots 1 and 2 includes an implement for cleaning, traveling means for moving, and a controller that controls cleaning and traveling in accordance with the program. The controller includes a processor, a storage unit such as a ROM, a RAM, and a memory storing the program, and a communication unit for communicating with the cleaning monitoring device 10. In addition, at least the autonomous traveling robot 1 carries a mark 1a so that its current position in the cleaning area 3 can easily be found in the imaging data of the monitoring camera 5. The mark 1a is preferably placed on the center line of the autonomous traveling robot 1 so that the range (width) cleaned by the autonomous traveling robot 1 as it travels can easily be determined.
In the present embodiment, two autonomous traveling robots 1 and 2 are used. The autonomous traveling robot 1 (hereinafter, "large cleaning robot") is a relatively large 1st autonomous traveling robot suited to cleaning relatively wide places. The autonomous traveling robot 2 (hereinafter, "small cleaning robot") is a 2nd autonomous traveling robot suited to cleaning relatively narrow places. The large cleaning robot 1 can clean a wide range at once, more specifically a wide range in the width direction relative to its traveling direction. On the other hand, it cannot clean narrow spaces or the areas along walls, and articles placed on the floor become obstacles whose surroundings it cannot clean. In contrast, the small cleaning robot 2 has a relatively narrow cleaning range (the width it cleans as it travels) by specification and is therefore not suited to cleaning a wide range efficiently. On the other hand, unlike the large cleaning robot 1, it is small and compact and can therefore clean along walls and around articles placed on the floor.
The cleaning monitoring device 10 can be realized by a general-purpose hardware configuration such as a personal computer (PC). That is, the cleaning monitoring device 10 includes a processor, a storage unit such as a ROM, a RAM, and a hard disk drive (HDD), a user interface such as a mouse, a keyboard, and a display, and a network interface for data communication with the cleaning robots 1 and 2 and the monitoring system 6.
The cleaning monitoring device 10 in the present embodiment is configured to obtain imaging data in which the entire cleaning area 3 is taken as an imaging range by the monitoring camera 5 included in the monitoring system 6, but the monitoring camera 5 may be connected to the cleaning monitoring device 10. In this case, the cleaning and monitoring device 10 needs to have an interface for connecting the monitoring camera 5.
As shown in Fig. 1, the cleaning monitoring device 10 includes a program creation unit 11, a robot control unit 12, an image acquisition unit 13, an unclean range determination unit 14, a small instruction generation unit 15, a layout information storage unit 16, and a point table storage unit 17. Components not used in this description are omitted from the drawings.
The program creation unit 11 refers to the layout information of the cleaning area 3 and creates a program by which the large cleaning robot 1 performs cleaning work while traveling autonomously. The robot control unit 12 controls the operation of the cleaning robots 1 and 2; for example, it wirelessly downloads the program created by the program creation unit 11 to the large cleaning robot 1 and thereby instructs it to perform the cleaning work. The robot control unit 12 likewise instructs the small cleaning robot 2 to perform cleaning work by wirelessly downloading the program created by the small instruction generation unit 15. The image acquisition unit 13 acquires, from the monitoring system 6, imaging data whose imaging range is the cleaning area 3; in the present embodiment it acquires at least the imaging data covering the time period of the cleaning work of the large cleaning robot 1. The unclean range determination unit 14 analyzes the imaging data acquired by the image acquisition unit 13 to determine the range that the large cleaning robot 1 has not actually cleaned (hereinafter, the "unclean range"). The "actually uncleaned range" includes ranges that cannot be cleaned because of the specifications of the large cleaning robot 1, and ranges missed because of travel errors of the large cleaning robot 1 or the presence of the obstacles 4 even when it travels according to the program. The small instruction generation unit 15 refers to the unclean range determined by the unclean range determination unit 14 and creates a program for causing the small cleaning robot 2 to clean that unclean range.
Fig. 2 is a diagram showing an example of layout information stored in the layout information storage unit 16. Specifically, a plan view showing the layout of the cleaning area 3 shown in fig. 1 is shown. By the layout of the cleaning area 3, the position of the obstacle 4 becomes clear.
Fig. 3 is a diagram showing an example of the point table stored in the point table storage unit 17. The point table 7 is an image representing the cleaning area 3; specifically, it is produced by scattering points (also called "dots") at predetermined intervals over a plan view of the cleaning area 3 like the one in Fig. 2. In the present embodiment, the point table 7 is also referred to as "point information". As will become clear from the following description, in order to obtain the unclean range of the large cleaning robot 1, it is preferable to mark a large number of points at predetermined intervals in the cleaning area 3, for example at 10 cm intervals. The greater the number of points, the more accurately the unclean range can be determined. The number of points contained in the point table 7 is known.
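As a rough illustration of the kind of data structure such a point table could map onto in software (a sketch only; the rectangle-based layout, the function names, and the reuse of the 10 cm spacing from the example above are assumptions, not part of the disclosure), a grid of points can be generated over the layout with points inside known obstacles omitted:

```python
# Illustrative sketch only: builds a "point table" as a set of grid points
# spaced 10 cm apart over a rectangular cleaning area, skipping points that
# fall inside known obstacles taken from the layout information.
from dataclasses import dataclass

@dataclass(frozen=True)
class Rect:
    x0: float; y0: float; x1: float; y1: float   # metres
    def contains(self, x: float, y: float) -> bool:
        return self.x0 <= x <= self.x1 and self.y0 <= y <= self.y1

def build_point_table(area: Rect, obstacles: list[Rect],
                      spacing: float = 0.10) -> set[tuple[float, float]]:
    points = set()
    y = area.y0
    while y <= area.y1:
        x = area.x0
        while x <= area.x1:
            if not any(o.contains(x, y) for o in obstacles):
                points.add((round(x, 2), round(y, 2)))
            x += spacing
        y += spacing
    return points

# Example: a 20 m x 10 m area with two shelf-like obstacles.
area = Rect(0, 0, 20, 10)
obstacles = [Rect(5, 3, 7, 6), Rect(12, 2, 14, 8)]
point_table = build_point_table(area, obstacles)
print(len(point_table), "points")   # the total number of points is known
```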
Each of the components 11 to 15 of the cleaning monitoring device 10 is realized by the cooperation of the computer forming the cleaning monitoring device 10 and a program executed by the CPU mounted on that computer. The storage units 16 and 17 are realized by the HDD mounted on the cleaning monitoring device 10, or alternatively by the RAM or by an external storage unit accessed via a network.
The program used in the present embodiment may be provided by communication means, or may be provided stored on a computer-readable recording medium such as a CD-ROM or a USB memory. The program provided via communication or on a recording medium is installed in the computer, and the CPU of the computer realizes the various processes by executing it.
Next, the operation of the present embodiment will be described.
In the present embodiment, the cleaning area 3 is cleaned using the large cleaning robot 1 and the small cleaning robot 2, with the large cleaning robot 1 cleaning first. To have the large cleaning robot 1 perform the cleaning, the program creation unit 11 refers to the layout information of the cleaning area 3 stored in the layout information storage unit 16 and creates a program by which the large cleaning robot 1 performs the cleaning work.
Fig. 4 is a diagram showing the travel path along which the large cleaning robot 1 moves according to the program created by the program creation unit 11. The program controls the large cleaning robot 1 so that it starts cleaning at the start point 8, cleans up to the end point 9, and finally returns to the start point 8. Fig. 4 shows the travel path 21 of the large cleaning robot 1. When cleaning is performed along the travel path 21, the entire predetermined cleaning area 3 can be cleaned uniformly.
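The patent does not disclose how the program creation unit 11 plans the travel path 21; purely as an assumed illustration, a simple back-and-forth sweep that starts and ends at the start point 8 and spaces its lanes by the robot's cleaning width would give uniform coverage of an open rectangular area:

```python
# Assumed illustration only (not the path-planning method of the patent):
# a back-and-forth sweep over a rectangular area, with lane spacing equal to
# the robot's cleaning width, starting and ending at the start point 8.
def plan_sweep_path(width_m: float, height_m: float,
                    cleaning_width: float) -> list[tuple[float, float]]:
    waypoints = [(0.0, 0.0)]                       # start point 8
    y, going_right = 0.0, True
    while y <= height_m:
        x_far = width_m if going_right else 0.0
        waypoints.append((x_far, y))               # sweep one lane
        y += cleaning_width
        if y <= height_m:
            waypoints.append((x_far, y))           # step over to the next lane
        going_right = not going_right
    waypoints.append((0.0, 0.0))                   # return to the start point
    return waypoints

path = plan_sweep_path(20.0, 10.0, cleaning_width=0.8)
```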
As described above, the robot control unit 12 downloads the program created by the program creation unit 11 to the large cleaning robot 1. Thus, the large cleaning robot 1 can autonomously clean the cleaning area 3 according to the downloaded program.
The program creation process in the program creation unit 11 described above may use the same technique as has conventionally been used to set a program in the large cleaning robot 1.
Next, the large cleaning robot 1 in the standby state at the starting point 8 starts the cleaning operation at a predetermined time in accordance with the program. The monitoring camera 5 photographs the entire cleaning area 3 from the start of cleaning to the end of cleaning by the large-sized cleaning robot 1. The photographing data generated by photographing by the monitoring camera 5 is accumulated in the monitoring system 6.
Next, a process of generating cleaning performance information by analyzing the captured data by the cleaning monitoring device 10 will be described with reference to a flowchart shown in fig. 5.
At the scheduled end time of the cleaning work of the large cleaning robot 1, or when a predetermined time has elapsed from that end time, the image acquisition unit 13 acquires from the monitoring system 6 the imaging data of the cleaning area 3 captured during the cleaning work period of the large cleaning robot 1 (step 110). Instead of acquiring the imaging data all at once after the cleaning work has ended, the image acquisition unit 13 may acquire the imaging data successively from the start to the end of the cleaning work.
When the imaging data has been acquired, the unclean range determination unit 14 determines the unclean range, that is, the range not cleaned by the large cleaning robot 1, as follows.
First, the unclean range determination unit 14 analyzes the imaging data to detect the travel path of the large cleaning robot 1 (step 120). Specifically, by detecting the mark 1a attached to the large cleaning robot 1 in the imaging data, the unclean range determination unit 14 can detect the trajectory along which the large cleaning robot 1 moved as its travel path during cleaning.
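A minimal sketch of the kind of image analysis step 120 implies, assuming the mark 1a has a distinctive colour and using OpenCV; the colour thresholds and the frame source are assumptions, and the mapping from image coordinates to floor coordinates (camera calibration) is omitted:

```python
# Sketch only: detect a distinctively coloured mark in each camera frame and
# record its centroid, yielding the robot's trajectory in image coordinates.
import cv2  # OpenCV
import numpy as np

LOWER, UPPER = np.array([0, 120, 120]), np.array([10, 255, 255])  # assumed HSV range

def detect_mark(frame_bgr: np.ndarray):
    hsv = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2HSV)
    mask = cv2.inRange(hsv, LOWER, UPPER)
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    if not contours:
        return None
    m = cv2.moments(max(contours, key=cv2.contourArea))
    if m["m00"] == 0:
        return None
    return (m["m10"] / m["m00"], m["m01"] / m["m00"])   # mark centroid (pixels)

def extract_trajectory(frames) -> list[tuple[float, float]]:
    # One detected position per frame, in time order.
    return [p for p in (detect_mark(f) for f in frames) if p is not None]
```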
So that the mark 1a can be detected in the imaging data, the mark 1a is given a color different from that of the top surface of the body of the large cleaning robot 1 and of the floor of the cleaning area 3, or is formed from a light-emitting member.
Next, on the basis of the detected travel path, the unclean range determination unit 14 determines the range the large cleaning robot 1 actually cleaned and, from it, the unclean range (step 130). As described above, the mark 1a is provided on the center line of the large cleaning robot 1. Since the width of the large cleaning robot 1, that is, its cleaning range in the width direction relative to its direction of movement, is known information, the unclean range determination unit 14 can determine the cleaning range in the width direction at each position of the large cleaning robot 1 from the position of the mark 1a detected in the imaging data and that known cleaning width. By connecting the cleaning ranges at the individual positions in time series, the actual cleaning range over the entire cleaning area 3 can be determined.
In the present embodiment, the cleaning range of the large cleaning robot 1 is determined using the mark 1a attached to it, but the method of determining the cleaning range is not limited to this. For example, instead of searching the imaging data for the mark 1a, the image of the body of the large cleaning robot 1 itself may be extracted and the width of the large cleaning robot 1 treated as the cleaning range in the width direction.
When the actual cleaning range over the entire cleaning area 3 has been determined, the unclean range determination unit 14 superimposes the determined cleaning range on the point table 7 shown in Fig. 3 and deletes from the point table 7 the points located within the determined cleaning range.
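A sketch of how the covered points might be removed from the point table, assuming the trajectory has already been converted to floor coordinates: a point is treated as cleaned if it lies within half the known cleaning width of any segment of the detected travel path, and the points that survive are the unclean range. The function and variable names are illustrative assumptions:

```python
# Sketch only: delete from the point table every point within half the robot's
# cleaning width of any segment of the detected travel path; the remaining
# points mark the unclean range 32.
import math

def dist_point_to_segment(p, a, b) -> float:
    (px, py), (ax, ay), (bx, by) = p, a, b
    dx, dy = bx - ax, by - ay
    if dx == dy == 0:
        return math.hypot(px - ax, py - ay)
    t = max(0.0, min(1.0, ((px - ax) * dx + (py - ay) * dy) / (dx * dx + dy * dy)))
    return math.hypot(px - (ax + t * dx), py - (ay + t * dy))

def remove_cleaned_points(point_table: set, trajectory: list,
                          cleaning_width: float) -> set:
    half = cleaning_width / 2.0
    segments = list(zip(trajectory, trajectory[1:]))
    cleaned = {p for p in point_table
               if any(dist_point_to_segment(p, a, b) <= half for a, b in segments)}
    return point_table - cleaned      # remaining points = unclean range

# unclean_points = remove_cleaned_points(point_table, trajectory_floor_coords, 0.8)
```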
Fig. 6 is a view showing the cleaning range in the case where the large cleaning robot 1 performs the cleaning work normally according to the created program, that is, the cleaning range obtained when the large cleaning robot 1 is able to travel along the travel path 21 shown in Fig. 4. In Fig. 6, the white area without points indicates the cleaned range (hereinafter, the "cleaning completion range") 31, while the areas in which points remain represent the unclean ranges 32a, 32b, 32c that were not cleaned. In the following description, these are collectively referred to as the "unclean range 32" when there is no need to distinguish between them. In the present embodiment, the range obtained by excluding from the cleaning area 3 the cleaning completion range 31 of the large cleaning robot 1, which is derived from the point table 7, becomes the unclean range 32.
Although the large cleaning robot 1 cleans the entire cleaning area 3 according to the program, because of its specifications the edges of the cleaning area 3, for example along the walls when the cleaning area 3 is a room, remain as the unclean range 32a, as shown in Fig. 6. The boundary portions around the obstacles 4 likewise remain as the unclean ranges 32b and 32c. The cleaning completion range 31 shown in Fig. 6 is the cleaning range obtained when the large cleaning robot 1 performs the cleaning work normally according to the program.
Fig. 7 is a view showing the cleaning range obtained when the large cleaning robot 1 actually performed a cleaning operation in the present embodiment, that is, the actual cleaning range of the large cleaning robot 1 obtained from the imaging data of the monitoring camera 5. Fig. 7 corresponds to Fig. 6: the white area indicates the cleaning completion range 31 and the areas in which points remain indicate the unclean range 32. Fig. 7 shows a case in which the large cleaning robot 1 cleaned according to the program so as to cover the cleaning completion range 31 of Fig. 6, but unclean ranges 32 that could not be cleaned arose because of obstacles not shown in the layout. Specifically, an unclean range 32d arose near a wall of the cleaning area 3, an unclean range 32e arose between the obstacles 4a and 4b, and an unclean range 32f arose near the obstacle 4a, each because of an article placed on the floor.
Next, the small instruction generation unit 15 refers to the unclean range information generated by the unclean range determination unit 14 and generates a program for causing the small cleaning robot 2 to clean the unclean range. To do so, the small instruction generation unit 15 refers to Figs. 6 and 7 and sets the cleaning range of the small cleaning robot 2 (step 140).
Fig. 8 is a view showing the cleaning range of the small cleaning robot 2. Fig. 7 shows the unclean range 32 of the large cleaning robot 1, and basically this unclean range 32, the result of the actual cleaning by the large cleaning robot 1, becomes the cleaning ranges 33a to 33e of the small cleaning robot 2. Strictly speaking, even the small cleaning robot 2 cannot clean the spots within the unclean range 32 where the obstacles 4 actually stand, so those spots should be excluded from its cleaning range; for convenience of explanation, however, Fig. 8 shows the unclean range 32 as the cleaning range 33 of the small cleaning robot 2. When there is no need to distinguish between the cleaning ranges 33a to 33e, they are collectively referred to as the "cleaning range 33".
When the cleaning range 33 of the small cleaning robot 2 has been determined, the small instruction generation unit 15 refers to the cleaning range 33 and creates a program by which the small cleaning robot 2 performs the cleaning work while traveling autonomously. The robot control unit 12 wirelessly downloads the program created by the small instruction generation unit 15 to the small cleaning robot 2. When the small cleaning robot 2 receives an instruction to start the cleaning work, it cleans the cleaning range 33 according to that instruction; here, receiving the program serves as the instruction to start the cleaning work. The small cleaning robot 2 can clean along the walls and around the obstacles 4 that the large cleaning robot 1 could not reach, and thus complements the cleaning work of the large cleaning robot 1.
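One possible way for the small instruction generation unit 15 to turn the remaining points into discrete cleaning ranges 33 is a simple connected-component grouping over the grid, with each cluster then wrapped into a cleaning program for the small cleaning robot 2; this is only an assumed sketch, not the method the patent prescribes:

```python
# Sketch only: group the remaining (unclean) grid points into connected
# clusters, each cluster standing in for one cleaning range 33, and emit the
# points of each cluster in a naive visiting order for the small robot.
def cluster_unclean_points(points: set, spacing: float = 0.10):
    tol = spacing * 1.5            # neighbours are points one grid step apart
    remaining = set(points)
    clusters = []
    while remaining:
        seed = remaining.pop()
        stack, cluster = [seed], [seed]
        while stack:
            cx, cy = stack.pop()
            neighbours = {p for p in remaining
                          if abs(p[0] - cx) <= tol and abs(p[1] - cy) <= tol}
            remaining -= neighbours
            stack.extend(neighbours)
            cluster.extend(neighbours)
        clusters.append(sorted(cluster))   # sorted order as a crude visiting path
    return clusters

# Each cluster would then be wrapped into a program downloaded to robot 2.
```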
If the large cleaning robot 1 could not clean the unclean ranges 32d, 32e, 32f because of articles placed on the floor, the small cleaning robot 2 likewise cannot clean the spots where those articles stand. In that case, the small instruction generation unit 15 creates a program that cleans the cleaning range 33 except for the spots where the articles are placed. Since the obstacles in the unclean ranges 32d, 32e, 32f appear in the imaging data of the monitoring camera 5, their positions can be determined by analyzing the imaging data, and the small instruction generation unit 15 creates a program that cleans while avoiding those positions.
According to the present embodiment, although the cleaning range is planned for the large cleaning robot 1, the range that it could not actually clean is additionally cleaned by the small cleaning robot 2, so that the entire planned cleaning area 3 can be cleaned.
As described above, the large cleaning robot 1 basically cleans the cleaning area 3, and the cleaning range 33 of the small cleaning robot 2 is determined from the resulting unclean range 32. However, when the cleaning work is performed at night, when no facility users are present, waiting for the large cleaning robot 1 to finish before instructing the small cleaning robot 2 may mean that, depending on the size of the cleaning area 3, the size of the cleaning range 33 of the small cleaning robot 2, and the performance of the small cleaning robot 2, the cleaning work of the small cleaning robot 2 cannot be finished by the desired time.
Therefore, the cleaning area 3 may be divided for the cleaning work. Fig. 9 corresponds to Fig. 4, but, as shown in Fig. 9, the cleaning area 3 is divided into a plurality of small areas 3a to 3d. Each time the large cleaning robot 1 finishes cleaning one small area, the image acquisition unit 13 acquires the imaging data corresponding to that small area from the monitoring system 6. Alternatively, the imaging data may be acquired successively from the start of the cleaning work, and the unclean range determination unit 14 may start processing at the point when the imaging data corresponding to a small area has been acquired.
Each time the image acquisition unit 13 acquires the imaging data corresponding to a small area, the unclean range determination unit 14 generates unclean range information specifying the range within that small area that the large cleaning robot 1 has not actually cleaned. The small instruction generation unit 15 then creates, from the unclean range information generated by the unclean range determination unit 14, a program for causing the small cleaning robot 2 to clean the unclean range of that small area, and the robot control unit 12 instructs the small cleaning robot 2 to clean the small area by downloading the created program.
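Put as pseudocode, the per-small-area flow described above is a pipeline in which analysis and dispatch for one small area run as soon as the large cleaning robot 1 reports finishing it. The sketch below reuses the illustrative functions from the earlier sketches, and the robot and monitoring-system methods it calls are assumed placeholders, not a real API:

```python
# Sketch only: pipelined handling of small areas 3a..3d. As soon as the large
# robot finishes one small area, its footage is analysed and the small robot is
# dispatched there, while the large robot moves on to the next small area.
def run_divided_cleaning(small_areas, monitoring_system, large_robot, small_robot):
    for area in small_areas:
        large_robot.wait_until_finished(area)            # assumed robot API
        frames = monitoring_system.get_footage(area)     # stands in for unit 13
        trajectory = extract_trajectory(frames)          # stands in for step 120
        unclean = remove_cleaned_points(area.point_table, trajectory,
                                        large_robot.cleaning_width)
        program = cluster_unclean_points(unclean)        # stands in for unit 15
        small_robot.download_and_start(program, area)    # stands in for unit 12
```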
By dividing the cleaning area 3 into a plurality of small areas and having the small cleaning robot 2 clean them sequentially, one small area at a time, the cleaning work of the small cleaning robot 2 can be started without waiting for the large cleaning robot 1 to finish cleaning the entire cleaning area 3. Specifically, at the moment the large cleaning robot 1 finishes the cleaning work for the small area 3a and moves on to the small area 3b, the small cleaning robot 2 can start cleaning the small area 3a without waiting for the large cleaning robot 1 to finish the remaining small areas 3b, 3c, 3d. This brings forward the time at which cleaning of the entire cleaning area 3 is finished.
Fig. 9 shows an example in which the cleaning area 3 is divided equally into four, but the number of divisions and the shape of each small area shown in Fig. 9 are merely examples and may be set appropriately according to the shape of the cleaning area 3, the positions of the obstacles 4, and the movement and cleaning performance of the small cleaning robot 2. A small cleaning robot 2 may also be prepared for each small area. Furthermore, whether or not the cleaning area 3 is divided, it may be cleaned by a plurality of small cleaning robots 2.
In the present embodiment, the unclean range determination unit 14 and the small instruction generation unit 15 are provided in the cleaning monitoring device 10, and the cleaning monitoring device 10 controls the cleaning work of the small cleaning robot 2, but this control may instead be performed by the large cleaning robot 1. That is, the large cleaning robot 1 may be provided with the image acquisition unit 13, the unclean range determination unit 14, the small instruction generation unit 15, and the function the robot control unit 12 has of downloading a program to the small cleaning robot 2, so that the large cleaning robot 1 creates the program for controlling the operation of the small cleaning robot 2 and downloads it. In this way, the large cleaning robot 1 may be configured so that it can instruct the small cleaning robot 2 to perform the cleaning work.
In the present embodiment, unclean range information indicating the unclean range 32 of the large cleaning robot 1 is generated and the small cleaning robot 2 is made to clean the cleaning range 33 specified from that information, but cleaning range information indicating the cleaning range 33 of the small cleaning robot 2 may also be provided to parties other than the small cleaning robot 2. For example, the cleaning monitoring device 10 may provide the cleaning range information so that a cleaning worker can refer to it, thereby instructing the cleaning worker to clean the cleaning range 33.
In addition, the monitoring camera 5 may capture the cleaning work of the small cleaning robot 2 in the same way as it captures the cleaning work of the large cleaning robot 1, and any range the small cleaning robot 2 failed to clean completely may be presented to a cleaning worker for final cleaning.
In the present embodiment, combining the large cleaning robot 1 and the small cleaning robot 2 makes the cleaning of the cleaning area 3 more reliable, so the human workload involved in cleaning the cleaning area 3 and the personnel cost of cleaning workers can be reduced significantly.
Description of the reference numerals
1 large cleaning robot, 1a mark, 2 small cleaning robot, 3 cleaning area, 4a, 4b obstacle, 5 monitoring camera, 6 monitoring system, 7 point table, 8 start point, 9 end point, 10 cleaning monitoring device, 11 program creation unit, 12 robot control unit, 13 image acquisition unit, 14 unclean range determination unit, 15 small instruction generation unit, 16 layout information storage unit, 17 point table storage unit, 21 travel path, 31 cleaning completion range, 32a, 32b, 32c, 32d, 32e, 32f unclean range, 33 cleaning range.
Claims (3)
1. A cleaning system, the cleaning system comprising:
a 1st autonomous traveling robot that autonomously travels and performs a cleaning operation;
an imaging unit that images the entire cleaning area during a period in which the 1st autonomous traveling robot cleans the cleaning area in accordance with cleaning instruction information created based on layout information of the cleaning area;
a generation unit that, after the cleaning of the cleaning area by the 1st autonomous traveling robot is completed, analyzes the imaging data generated by the imaging unit to generate unclean range information for specifying a range in the cleaning area that the 1st autonomous traveling robot has not actually cleaned; and
an instruction unit that instructs the 1st autonomous traveling robot to clean the actually uncleaned range specified by the unclean range information,
wherein the generation unit, without waiting for the 1st autonomous traveling robot to finish cleaning the entire cleaning area, generates unclean range information corresponding to a predetermined range formed by dividing the cleaning area each time the 1st autonomous traveling robot finishes cleaning that predetermined range, and
the instruction unit instructs the 1st autonomous traveling robot to clean the actually uncleaned range within the predetermined range each time the unclean range information corresponding to the predetermined range is generated.
2. The cleaning system according to claim 1, wherein
the cleaning system comprises a 2nd autonomous traveling robot that autonomously travels and performs a cleaning operation, the 2nd autonomous traveling robot being better suited than the 1st autonomous traveling robot to cleaning a narrow place, and
the instruction unit instructs the 2nd autonomous traveling robot to clean a range that the 1st autonomous traveling robot has not actually cleaned, out of the range planned as the cleaning range of the 1st autonomous traveling robot.
3. A cleaning system, the cleaning system comprising:
a 1st autonomous traveling robot that autonomously travels and performs a cleaning operation;
a 2nd autonomous traveling robot that autonomously travels and performs a cleaning operation and is better suited than the 1st autonomous traveling robot to cleaning a narrow place;
an imaging unit that images the entire cleaning area during a period in which the 1st autonomous traveling robot cleans the cleaning area in accordance with cleaning instruction information created based on layout information of the cleaning area;
a generation unit that, after the cleaning of the cleaning area by the 1st autonomous traveling robot is completed, analyzes the imaging data generated by the imaging unit to generate unclean range information for specifying a range in the cleaning area that the 1st autonomous traveling robot has not actually cleaned; and
an instruction unit that instructs the 1st autonomous traveling robot to clean the actually uncleaned range specified by the unclean range information,
wherein the instruction unit instructs the 2nd autonomous traveling robot to clean a range that the 1st autonomous traveling robot has not actually cleaned, out of the range planned as the cleaning range of the 1st autonomous traveling robot.
Applications Claiming Priority (1)
PCT/JP2020/026865 (WO2022009391A1), priority date 2020-07-09, filing date 2020-07-09: Cleaning system and program
Publications (2)
CN115885231A, published 2023-03-31
CN115885231B, published 2024-04-30
Family ID: 79552336
Family Applications (1)
CN202080102335.8A (granted as CN115885231B, Active), priority date 2020-07-09, filing date 2020-07-09: Cleaning system
Country Status (3)
JP: JP7019865B1
CN: CN115885231B
WO: WO2022009391A1
Citations (8)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2002222013A (en) * | 2001-01-26 | 2002-08-09 | Matsushita Electric Ind Co Ltd | Moving working robot |
JP2004199451A (en) * | 2002-12-19 | 2004-07-15 | Matsushita Electric Ind Co Ltd | Automatic traveling device and route management device |
CN1628274A (en) * | 2002-05-31 | 2005-06-15 | 富士通株式会社 | Remotely-operated robot, and robot self position identifying method |
JP2016087106A (en) * | 2014-11-05 | 2016-05-23 | シャープ株式会社 | Cleaning support device and cleaner |
CN106313046A (en) * | 2016-09-27 | 2017-01-11 | 成都普诺思博科技有限公司 | Multi-level obstacle avoidance system of mobile robot |
CN109358619A (en) * | 2018-09-28 | 2019-02-19 | 北京奇虎科技有限公司 | A kind of robot cleaning method, device and electronic equipment |
CN110313863A (en) * | 2018-03-29 | 2019-10-11 | 松下知识产权经营株式会社 | Autonomous scavenging machine, the cleaning method of autonomous scavenging machine and program |
CN110801180A (en) * | 2018-08-03 | 2020-02-18 | 速感科技(北京)有限公司 | Operation method and device of cleaning robot |
Also Published As
WO2022009391A1, published 2022-01-13
CN115885231A, published 2023-03-31
JPWO2022009391A1, published 2022-01-13
JP7019865B1, published 2022-02-15
Legal Events
PB01: Publication
SE01: Entry into force of request for substantive examination
GR01: Patent grant