WO2018087951A1 - Autonomous Traveling Body - Google Patents
Autonomous Traveling Body
- Publication number
- WO2018087951A1 (PCT/JP2017/021219)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- obstacle
- map
- unit
- detected
- traveling
- Prior art date
Classifications
- G05D1/02—Control of position or course in two dimensions
- G05D1/0274—Control of position or course in two dimensions specially adapted to land vehicles using internal positioning means using mapping information stored in a memory device
- G05D1/0238—Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means using obstacle or wall sensors
- G05D1/0251—Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means using a video camera in combination with image processing means extracting 3D information from a plurality of images taken from different locations, e.g. stereo vision
- A47L11/4061—Steering means; means for avoiding obstacles; details related to the place where the driver is accommodated
- A47L9/28—Installation of the electric equipment, e.g. adaptation or attachment to the suction cleaner; controlling suction cleaners by electric means
- A47L2201/04—Automatic control of the travelling movement; automatic obstacle detection
Description
- embodiments of this invention relate to an autonomous traveling body that creates a map describing information about the area in which it travels.
- the interior layout of a room is not always constant, and the arrangement of obstacles may change after the map is created. If the travel route is set based only on the stored map, there is a risk that travel will be hindered by obstacles not recorded on the map. Therefore, when an obstacle not described in the map is newly detected, it is conceivable to change the map in accordance with the detection and to set the next and subsequent travel routes in accordance with the changed map.
- however, if the map is changed every time an obstacle is detected, the next travel route may differ from the originally optimal route.
- the problem to be solved by the present invention is to provide an autonomous traveling body capable of realizing efficient and highly accurate autonomous traveling.
- the autonomous traveling body of the embodiment includes a main body case, drive wheels, self-position estimation means, obstacle detection means, map generation means, and control means.
- the drive wheels enable the main body case to travel.
- the self-position estimating means estimates the self-position of the main body case.
- the obstacle detection means detects an obstacle outside the main body case.
- the map generation means creates a map in which information on the area where the main body case has traveled is recorded based on the obstacle detection by the obstacle detection means while the main body case is traveling and the self-position estimated by the self-position estimation means.
- the control means causes the main body case to autonomously travel by controlling the operation of the drive wheels. Further, the control means includes a travel mode for controlling the operation of the drive wheels so that the main body case autonomously travels along the travel route set based on the map. Then, this control means determines whether or not to change the next travel route based on the obstacle detected by the obstacle detection means in the travel mode.
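As an illustrative sketch only (not part of the patent disclosure; the detection-count threshold, the grid-cell representation, and all names are assumptions), the decision of whether a newly detected obstacle should change the next travel route could be driven by a per-cell detection count:

```python
def record_detection(obstacle_counts, cell):
    """Increment the detection count for a map cell where an obstacle
    not on the stored map was observed during the travel mode."""
    obstacle_counts[cell] = obstacle_counts.get(cell, 0) + 1


def should_update_route(obstacle_counts, threshold=3):
    """Return the map cells whose detection count has reached the
    threshold, i.e. obstacles observed often enough to justify changing
    the next travel route (hypothetical policy, not the patent's)."""
    return {cell for cell, n in obstacle_counts.items() if n >= threshold}
```

Under this policy a transient obstacle seen once leaves the stored optimal route untouched, while a repeatedly detected one triggers a route change for the next run.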
- FIG. 6 is an explanatory diagram illustrating an example of a generated distance image.
- (a) is an explanatory drawing showing an example of the stored map;
- (b) is an explanatory drawing showing an example of operation.
- a flowchart showing control of the same autonomous traveling body.
- an explanatory drawing schematically showing a state in which the position of a detected obstacle is stored in the map.
- a table showing the relationship among the presence or absence of a change in time information, the detection frequency, and the travel route for the autonomous traveling body of the second embodiment.
- reference numeral 11 denotes a vacuum cleaner as an autonomous traveling body
- this vacuum cleaner 11, together with a charging device (charging stand) 12 (FIG. 5) as a base device that serves as a charging base for the vacuum cleaner 11, constitutes an electric cleaning device (electric cleaning system) as an autonomous traveling body device.
- the vacuum cleaner 11 is a so-called self-propelled robot cleaner (cleaning robot) that cleans the floor surface while autonomously traveling (self-propelled) on the floor surface to be cleaned as a traveling surface.
- the vacuum cleaner 11 can communicate, by wire or wirelessly, via an (external) network 15 such as the Internet, with a general-purpose server 16 as data storage means (data storage unit) and with a general-purpose external device 17 as display means (display unit).
- the vacuum cleaner 11 includes a hollow main body case 20 (FIG. 2).
- the vacuum cleaner 11 includes a traveling unit 21.
- the electric vacuum cleaner 11 includes a cleaning unit 22 that cleans dust such as a floor surface.
- the electric vacuum cleaner 11 includes a communication unit 23 that communicates with external devices including the charging device 12.
- the vacuum cleaner 11 also includes an imaging unit 25 that captures an image.
- the electric vacuum cleaner 11 includes a control unit 27 as a control means that is a controller.
- the vacuum cleaner 11 includes a secondary battery 28 that is a battery.
- in the following description, the direction along the traveling direction of the vacuum cleaner 11 (main body case 20) is defined as the front-rear direction (the directions of arrows FR and RR shown in FIG. 2), and the left-right direction intersecting (orthogonal to) the front-rear direction is defined as the width direction.
- the main body case 20 shown in FIGS. 2 to 4 is formed in a flat columnar shape (disk shape) or the like with, for example, a synthetic resin. That is, the main body case 20 includes a side surface portion 20a (FIG. 2), and an upper surface portion 20b (FIG. 2) and a lower surface portion 20c (FIG. 3) that are continuous with the upper and lower portions of the side surface portion 20a, respectively.
- the side surface portion 20a of the main body case 20 is formed in a substantially cylindrical surface shape, and for example, an imaging unit 25 is disposed on the side surface portion 20a.
- the upper surface portion 20b and the lower surface portion 20c of the main body case 20 are each formed in a substantially circular shape. As shown in FIG. 3, a suction port 31, which is a dust collection port, and an exhaust port 32 are opened in the lower surface portion 20c facing the floor surface.
- the traveling unit 21 causes the main body case 20 to travel on the floor surface.
- the traveling unit 21 includes drive wheels 34 and 34 as a plurality (a pair) of drive units. Further, the traveling unit 21 includes motors 35 and 35 (FIG. 1) which are driving means as an operation unit. Further, the traveling unit 21 may include a turning wheel 36 for turning.
- each drive wheel 34 causes the vacuum cleaner 11 (main body case 20) to travel (autonomously travel) in the forward and backward directions on the floor surface, that is, serves for traveling; each has a rotation axis (not shown) extending along the left-right width direction, and the drive wheels are arranged symmetrically in the width direction.
- Each motor 35 drives the drive wheels 34 and 34.
- Each motor 35 (FIG. 1) is arranged corresponding to each of the drive wheels 34, for example, and can drive each drive wheel 34 independently.
- the swivel wheel 36 is a driven wheel that is located at the front and substantially at the center in the width direction of the lower surface 20c of the main body case 20, and is capable of swiveling along the floor surface.
- the cleaning unit 22 includes, for example, an electric blower 41 located in the main body case 20 that sucks in dust together with air from the suction port 31 and exhausts it from the exhaust port 32, and a rotating brush 42 as a rotary cleaning body that is rotatably attached to the suction port 31 and scrapes up dust, together with a brush motor 43 (FIG. 1) that drives it;
- a side brush 44 that is an auxiliary cleaning unit;
- a side brush motor 45 that drives the side brush 44 (FIG. 1);
- and a dust collecting unit 46 that communicates with the suction port 31 and collects dust (FIG. 2).
- it suffices to provide at least one of the electric blower 41, the rotating brush 42 and brush motor 43 (FIG. 1), and the side brush 44 and side brush motor 45 (FIG. 1).
- the communication unit 23 shown in FIG. 1 includes, for example, a wireless LAN device 47 as wireless communication means (wireless communication unit) and vacuum cleaner signal receiving means (vacuum cleaner signal receiving unit) for wireless communication with the external device 17 via the home gateway 14 and the network 15.
- the wireless LAN device 47 may also serve as notification means (notification unit).
- the communication unit 23 may include a transmission unit (transmission unit) (not illustrated) such as an infrared light emitting element that transmits a radio signal (infrared signal) to the charging device 12 (FIG. 5), for example.
- the communication unit 23 may include a receiving unit (receiving unit) (not shown) such as a phototransistor that receives a radio signal (infrared signal) from the charging device 12 or a remote control (not shown), for example.
- the communication unit 23 may be equipped with an access point function to directly communicate with the external device 17 without using the home gateway 14.
- a web server function may be added to the communication unit 23.
- the wireless LAN device 47 transmits and receives various types of information to and from the network 15 via the home gateway 14 from the vacuum cleaner 11.
- the imaging unit 25 includes a plurality of cameras, for example cameras 51a and 51b as one and the other imaging unit bodies, and a lamp 53, such as an LED, as an illumination unit (illumination unit) that provides illumination for the cameras 51a and 51b.
- the cameras 51 a and 51 b are disposed on both sides of the front portion of the side surface portion 20 a of the main body case 20.
- the cameras 51a and 51b are arranged on the side surface portion 20a of the main body case 20 at positions inclined at a predetermined (acute) angle in the left-right direction with respect to the center line L in the width direction of the vacuum cleaner 11 (main body case 20).
- these cameras 51a and 51b are arranged substantially symmetrically in the width direction with respect to the main body case 20, and the center position between these cameras 51a and 51b substantially coincides with the center line L in the width direction intersecting the traveling direction of the vacuum cleaner 11 (main body case 20).
- these cameras 51a and 51b are respectively arranged at substantially equal positions in the vertical direction, that is, at substantially equal height positions. For this reason, these cameras 51a and 51b are set to have substantially the same height from the floor surface with the vacuum cleaner 11 placed on the floor surface. Therefore, the cameras 51a and 51b are arranged to be spaced apart from each other (positions displaced in the left-right direction).
- these cameras 51a and 51b are digital cameras that capture digital images of the area ahead of the main body case 20 in the traveling direction at a predetermined horizontal angle of view (for example, 105°) at predetermined intervals, for example every few tens of milliseconds or every few seconds. Furthermore, the imaging ranges (fields of view) Va and Vb of these cameras 51a and 51b overlap (FIG. 6), so the images P1 and P2 (one and the other) captured by these cameras 51a and 51b (FIG. 7) overlap in the left-right direction in a region including the forward position to which the center line L in the width direction of the vacuum cleaner 11 (main body case 20) is extended.
- these cameras 51a and 51b capture, for example, a color image in the visible light region. Note that images captured by these cameras 51a and 51b can be compressed into a predetermined data format by an image processing circuit (not shown), for example.
- the lamp 53 outputs illumination light when an image is captured by the cameras 51a and 51b, and is disposed at an intermediate position between the cameras 51a and 51b, that is, a position on the center line L of the side surface portion 20a of the main body case 20.
- the lamp 53 is disposed at a position substantially equal to the cameras 51a and 51b in the vertical direction, that is, at a substantially equal height. Accordingly, the lamp 53 is disposed at a substantially central portion in the width direction of the cameras 51a and 51b.
- the lamp 53 illuminates light including a visible light region.
- the lamp 53 may be set for each of the cameras 51a and 51b.
- the sensor unit 26 shown in FIG. 1 may include, for example, a step sensor (step detecting means (step detecting unit)) 56.
- the sensor unit 26 may include a temperature sensor (temperature detection means (temperature detection unit)) 57, for example.
- the sensor unit 26 may include, for example, a dust amount sensor (dust amount detecting means (dust amount detecting unit)) 58.
- the sensor unit 26 may include a rotation speed sensor, such as an optical encoder, that detects the turning angle and travel distance of the vacuum cleaner 11 (main body case 20 (FIG. 2)) by detecting, for example, the rotational speed of each drive wheel 34 (each motor 35).
- the sensor unit 26 may include a non-contact type obstacle sensor that detects an obstacle using ultrasonic waves, infrared rays, or the like. Further, the sensor unit 26 may include a contact type obstacle sensor that detects an obstacle in contact. A rotation speed sensor or an obstacle sensor is not an essential component.
- the step sensor 56 is, for example, a non-contact sensor such as an infrared sensor or an ultrasonic sensor; a distance sensor is used that outputs infrared rays or ultrasonic waves toward a detection target, here the floor surface, receives the reflected wave from the detection target, and detects the distance between the step sensor 56 and the detection target based on the time difference between transmission and reception. That is, the step sensor 56 detects a step in the floor surface by detecting the distance between the step sensor 56 (the position where the step sensor 56 is disposed) and the floor surface.
- the step sensor 56 is disposed on the lower surface portion 20 c of the main body case 20.
- the step sensor 56 is disposed, for example, in front of and behind the drive wheels 34, 34, and in the front part of the swivel wheel 36 (the lower front part of the main body case 20).
- as the temperature sensor 57 shown in FIG. 1, for example, a non-contact sensor that detects the temperature of a detection target by detecting infrared rays emitted from the detection target is used.
- the temperature sensor 57 is arranged, for example, on the side surface portion 20a or the upper surface portion 20b (FIG. 2) of the main body case 20, and detects the temperature of the detection target in front of the main body case 20 (FIG. 2).
- the temperature may be detected based on an infrared image captured by the cameras 51a and 51b.
- the dust amount sensor 58 includes, for example, a light emitting unit and a light receiving unit disposed inside the air passage communicating from the suction port 31 (FIG. 3) to the dust collecting unit 46 (FIG. 2); an optical sensor or the like is used that detects the amount of dust passing through the air passage from the increase or decrease in the amount of light received by the light receiving unit from the light emitting unit according to the amount of dust.
- alternatively, dust amount detection means (dust amount detection unit (dust amount sensor)) that detects fine dust visible on the floor surface from the images captured by the cameras 51a and 51b may be provided.
- the control unit 27 shown in FIG. 1 controls the traveling unit 21, the cleaning unit 22, the communication unit 23, the imaging unit 25, and the like.
- the control unit 27 is a microcomputer including, for example, a CPU as the control unit main body, a ROM as a storage unit that stores fixed data such as the programs read by the CPU, and a RAM (not shown) as an area storage unit that dynamically forms various memory areas such as a work area for data processing by the programs.
- the control unit 27 further includes a memory 61 as a storage unit (storage unit), for example.
- the control unit 27 includes an image processing unit 62, for example.
- the control unit 27 includes an image generation unit 63 as, for example, a distance image generation unit (distance image generation unit). Further, the control unit 27 includes a shape acquisition unit 64 that is, for example, a shape acquisition unit. Further, the control unit 27 includes an extraction unit 65 which is, for example, an extraction unit. Further, the control unit 27 includes, for example, a determination unit 66 as a determination unit. In addition, the control unit 27 includes a map generation unit 67 that is a map generation unit that generates, for example, a travel map. Further, the control unit 27 includes a route setting unit 68 that sets a travel route based on, for example, a map. Further, the control unit 27 may include, for example, a travel control unit 69.
- the control unit 27 may include a cleaning control unit 70, for example. Further, the control unit 27 may include an imaging control unit 71, for example. Further, the control unit 27 may include an illumination control unit 72, for example. The control unit 27 has a travel mode in which it drives the drive wheels 34, 34 (FIG. 3), that is, the motors 35, 35, to make the vacuum cleaner 11 (main body case 20 (FIG. 2)) travel autonomously. Further, the control unit 27 may have a charging mode for charging the secondary battery 28 via the charging device 12 (FIG. 5), and a standby mode during operation standby.
- the memory 61 can store data of images taken by the cameras 51a and 51b, for example.
- the memory 61 may store a threshold value used in the determination unit 66, for example.
- the memory 61 may store various data such as a map created by the map generation unit 67, for example.
- as the memory 61, a non-volatile memory such as a flash memory that retains the stored data regardless of whether the power of the vacuum cleaner 11 is on or off is used.
- the image processing unit 62 performs image processing such as lens distortion correction and contrast adjustment of images captured by the cameras 51a and 51b.
- the image processing unit 62 is not an essential configuration.
- the image generation unit 63 calculates, by a known method, the distance (depth) of an object (feature point) based on the images captured by the cameras 51a and 51b (in this embodiment, the images processed by the image processing unit 62) and the distance between the cameras 51a and 51b, and generates a distance image (parallax image) indicating the calculated distance of the object (feature point). That is, the image generation unit 63 applies triangulation based on the distance between the cameras 51a and 51b and the object (feature point) O (FIG. 6): it detects pixel dots indicating the same position in the respective images captured by the cameras 51a and 51b (images processed by the image processing unit 62), calculates the vertical and horizontal angles of those pixel dots, and calculates the distance of that position from the cameras 51a and 51b from these angles and the distance between the cameras 51a and 51b. Therefore, it is preferable that the images captured by the cameras 51a and 51b overlap (wrap) each other as much as possible.
- the image generation unit 63 generates the distance image by converting the calculated distance of each pixel dot into a gradation that can be identified visually, such as a brightness or color tone for each predetermined dot (for example, one dot), and displaying it.
- in this embodiment, the image generation unit 63 generates the distance image as a 256-level gray-scale image whose brightness decreases with distance, that is, an image that is blacker the greater the forward distance from the vacuum cleaner 11 (main body case 20) and whiter the smaller the distance.
- the image generation unit 63 may generate a distance image only for pixel dots within a predetermined image range in each image captured by the cameras 51a and 51b, or may generate a distance image of the entire image.
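A minimal sketch of the triangulation and gray-scale conversion described above, assuming a pinhole stereo model (the function names and the 5 m gradation ceiling are illustrative assumptions, not from the patent): depth follows from disparity, baseline, and focal length, and is then mapped onto the 256-level scale where nearer is whiter.

```python
def disparity_to_depth(disparity_px, baseline_m, focal_px):
    """Stereo depth from the pinhole model: Z = f * B / d."""
    if disparity_px <= 0:
        raise ValueError("disparity must be positive")
    return focal_px * baseline_m / disparity_px


def depth_to_gray(depth_m, max_depth_m=5.0):
    """Map a depth value to a 256-level gray value: nearer objects are
    whiter (255), objects at or beyond max_depth_m are black (0)."""
    depth_m = min(max(depth_m, 0.0), max_depth_m)
    return int(round(255 * (1.0 - depth_m / max_depth_m)))
```

For example, a 10-pixel disparity with a 0.1 m baseline and a 500-pixel focal length corresponds to a depth of 5 m.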
- the shape acquisition unit 64 acquires the shape information of objects in the images captured by the cameras 51a and 51b. That is, the shape acquisition unit 64 acquires the shape information of an object O located at a predetermined distance D (or in a predetermined distance range) from the distance image generated by the image generation unit 63 (FIG. 6). As an example, by detecting the pixel dots at the predetermined distance (or distance range) for an object O, such as an empty can, captured in the distance image PL, the shape acquisition unit 64 can detect the horizontal dimension (width dimension W) and the vertical dimension (height dimension H) of the object O (FIG. 7(c)). In addition, by acquiring the shape information (width and height dimensions) of objects, the shape acquisition unit 64 can indirectly acquire the shape information (width and height dimensions) of spaces and gaps where no object exists.
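A hedged sketch of how width and height dimensions might be read out of a distance image (the grid-of-depths representation, tolerance parameter, and names are assumptions for illustration): pixels whose depth matches the target distance are collected and their bounding box gives the object's extent in pixels.

```python
def object_extent(distance_img, target_depth, tol=0.05):
    """Return (width_px, height_px) of the bounding box of all pixels in
    a 2D grid of per-pixel depths that lie within tol of target_depth,
    or None if no pixel matches."""
    hits = [(r, c)
            for r, row in enumerate(distance_img)
            for c, d in enumerate(row)
            if d is not None and abs(d - target_depth) <= tol]
    if not hits:
        return None
    rows = [r for r, _ in hits]
    cols = [c for _, c in hits]
    return (max(cols) - min(cols) + 1, max(rows) - min(rows) + 1)
```

Given camera geometry, pixel extents would then be converted to physical width W and height H at the known distance D.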
- the extraction unit 65 extracts feature points based on images captured by the cameras 51a and 51b. That is, the extraction unit 65 extracts feature points in the distance image by performing feature detection (feature extraction) such as edge detection on the distance image generated by the image generation unit 63. This feature point is a reference point when the vacuum cleaner 11 estimates the self position in the cleaning region. Note that any known method can be used as the edge detection method.
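Since the patent leaves the edge-detection method open ("any known method"), here is a toy gradient-threshold detector only to illustrate the kind of feature extraction meant (a real system would use an established operator such as Sobel or Canny; names and the threshold value are illustrative):

```python
def edge_feature_points(gray, threshold=50):
    """Return (row, col) positions of interior pixels whose horizontal or
    vertical intensity gradient magnitude reaches the threshold: a crude
    stand-in for a proper edge detector."""
    pts = []
    for r in range(1, len(gray) - 1):
        for c in range(1, len(gray[0]) - 1):
            gx = gray[r][c + 1] - gray[r][c - 1]   # horizontal gradient
            gy = gray[r + 1][c] - gray[r - 1][c]   # vertical gradient
            if abs(gx) + abs(gy) >= threshold:
                pts.append((r, c))
    return pts
```

Such points, taken from the distance image, serve as the reference points against which the stored map is matched during self-position estimation.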
- the determination unit 66 determines the self-position of the vacuum cleaner 11 and the presence or absence of obstacles based on the information detected by the sensor unit 26 (step sensor 56, temperature sensor 57, dust amount sensor 58), the shape information of objects at a predetermined distance (or in a predetermined distance range) acquired by the shape acquisition unit 64, including the shape of narrow spaces located between objects, the feature points extracted by the extraction unit 65, and object information (height, material, color tone, etc.) in the images captured by the cameras 51a and 51b and processed by the image processing unit 62; based on these determinations, it determines the necessity of changing the travel control and cleaning control of the vacuum cleaner 11 (main body case 20 (FIG. 2)).
- the cameras 51a and 51b (image processing unit 62), the image generation unit 63, the extraction unit 65, the determination unit 66, and the like constitute a self-position estimation unit 73 as self-position estimation means for estimating the self-position of the vacuum cleaner 11.
- the sensor unit 26 (step sensor 56), the cameras 51a and 51b (image processing unit 62), the image generation unit 63, the shape acquisition unit 64, the determination unit 66, and the like constitute an obstacle detection unit 74 as obstacle detection means for detecting the presence or absence of obstacles.
- the sensor unit 26 (step sensor 56, temperature sensor 57, dust amount sensor 58), the cameras 51a and 51b (image processing unit 62), the image generation unit 63, the shape acquisition unit 64, the determination unit 66, and the like constitute an information acquisition unit 75 as information acquisition means for acquiring various types of information on the cleaning area.
- the self-position estimation unit 73 estimates the self-position by collating the feature points stored in the map with the feature points extracted by the extraction unit 65 from the distance image.
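As a simplified illustration of collating stored map feature points with freshly extracted ones (assuming known point correspondences and translation only; a real system would also recover rotation, for example with ICP or another registration method, and the names here are assumptions):

```python
def estimate_offset(map_points, observed_points):
    """Estimate the 2D translation (dx, dy) that best aligns observed
    feature points with the map's stored feature points, as the mean of
    the per-correspondence differences (least-squares for translation)."""
    assert map_points and len(map_points) == len(observed_points)
    n = len(map_points)
    dx = sum(m[0] - o[0] for m, o in zip(map_points, observed_points)) / n
    dy = sum(m[1] - o[1] for m, o in zip(map_points, observed_points)) / n
    return dx, dy
```

The recovered offset corrects the odometry-predicted self-position so subsequent obstacle positions can be written into the map in a consistent frame.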
- the obstacle detection unit 74 detects the presence or absence of an object (including a step) that becomes an obstacle depending on whether or not there is an object within a predetermined image range in the distance image.
- the information acquisition unit 75 acquires, for example, the floor step information detected by the step sensor 56, the temperature information of objects detected by the temperature sensor 57, the dust amount on the floor detected by the dust amount sensor 58, the shape (height and width dimensions) of objects acquired by the shape acquisition unit 64, the material information of the floor surface, the color tone of the floor surface, and the like.
- the map generation unit 67 uses the shape of the object acquired by the shape acquisition unit 64 and the position of the vacuum cleaner 11 (main body case 20 (FIG. 2)) estimated by the self-position estimation unit 73 to A map is created by calculating the positional relationship between the cleaning area in which the main body case 20 (FIG. 2) is disposed and the objects located in the cleaning area.
- the route setting unit 68 sets an optimal travel route using the map generated by the map generation unit 67, the self-position estimated by the self-position estimation unit 73, and the detection frequency of objects detected as obstacles by the obstacle detection unit 74.
- the optimal travel route to be generated is a route that allows efficient travel (cleaning) of the cleanable region of the map (the region excluding untravelable regions such as obstacles and steps): for example, the route with the shortest travel distance, a route on which the vacuum cleaner 11 (main body case 20 (FIG. 2)) travels as straight as possible (with the fewest direction changes), a route with the least contact with obstructing objects, or a route that minimizes the number of times the same portion is traveled.
- a plurality of relay points (subgoals) are set on this travel route.
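The route properties above (shortest coverage, fewest direction changes, relay points at each turn) can be illustrated with a minimal back-and-forth sweep planner over grid cells. This is only a sketch of the general idea, not the patent's actual algorithm; all function and variable names are hypothetical.

```python
def plan_route(travelable, width, height):
    """Generate a back-and-forth (boustrophedon) route over travelable
    grid cells, returning relay points (subgoals) at every turn.
    Sweep direction alternates per row so the body turns only at row
    ends, keeping straight runs long and direction changes few."""
    relay_points = []
    for y in range(height):
        xs = [x for x in range(width) if (x, y) in travelable]
        if not xs:
            continue
        if y % 2 == 0:
            relay_points += [(min(xs), y), (max(xs), y)]
        else:
            relay_points += [(max(xs), y), (min(xs), y)]
    return relay_points

# A 4x3 cleaning area with one non-travelable (obstacle) cell removed.
cells = {(x, y) for x in range(4) for y in range(3)} - {(2, 1)}
print(plan_route(cells, 4, 3))
```

The vehicle would then travel straight between consecutive relay points and turn 90° at each one, as described for the route RT in the text.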
- the travel control unit 69 controls the motors 35, 35 (drive wheels 34, 34 (FIG. 3)) of the traveling unit 21 so that the vacuum cleaner 11 (main body case 20) travels along the travel route set by the route setting unit 68.
- the travel control unit 69 controls the operation of the motors 35, 35 by controlling the magnitude and direction of the current flowing through them, rotating the motors 35, 35 forward or backward; by controlling the operation of these motors 35, 35, it controls the operation of the drive wheels 34, 34 (FIG. 3).
- the traveling control unit 69 is configured to control the traveling direction and / or traveling speed of the electric vacuum cleaner 11 (main body case 20) according to the determination of the determining unit 66.
- the cleaning control unit 70 controls the operation of the electric blower 41, the brush motor 43, and the side brush motor 45 of the cleaning unit 22. That is, the cleaning control unit 70 controls the energization amounts of the electric blower 41, the brush motor 43, and the side brush motor 45 separately, so that the electric blower 41, the brush motor 43 (the rotating brush 42 (FIG. 3 )) And the operation of the side brush motor 45 (side brush 44 (FIG. 3)).
- the cleaning control unit 70 is configured to control the operation of the cleaning unit 22 in accordance with the determination of the determination unit 66.
- a controller may be provided separately for each of the electric blower 41, the brush motor 43, and the side brush motor 45.
- the imaging control unit 71 controls the operations of the cameras 51a and 51b of the imaging unit 25. That is, the imaging control unit 71 includes a control circuit that controls the operation of the shutters of the cameras 51a and 51b, and operates the shutters at predetermined intervals so that the cameras 51a and 51b capture images at those intervals.
- the illumination control unit 72 controls the operation of the lamp 53 of the imaging unit 25. That is, the illumination control unit 72 controls on / off of the lamp 53 via a switch or the like.
- the illumination control unit 72 includes a sensor that detects the brightness around the vacuum cleaner 11, and turns on the lamp 53 when the brightness detected by the sensor is equal to or lower than a predetermined value; otherwise, the lamp 53 is left off.
- the imaging control unit 71 and the illumination control unit 72 may be configured as imaging control means (imaging control unit) separate from the control unit 27.
- the secondary battery 28 supplies power to the traveling unit 21, the cleaning unit 22, the communication unit 23, the imaging unit 25, the sensor unit 26, the control unit 27, and the like.
- the secondary battery 28 is electrically connected to charging terminals 77, 77, which are connection portions exposed on both sides of the rear portion of the lower surface portion 20c of the main body case 20 shown in FIG. 3; when these charging terminals 77, 77 are electrically and mechanically connected to the charging device 12 (FIG. 5), the secondary battery 28 is charged via the charging device 12.
- the home gateway 14 shown in FIG. 1 is also called an access point or the like, is installed in a building, and is connected to the network 15 by wire, for example.
- the server 16 is a computer (cloud server) connected to the network 15, and can store various data.
- the external device 17 is a general-purpose device, such as a tablet terminal (tablet PC) 17a or a smartphone (mobile phone) 17b, that can communicate with the network 15 by wire or wirelessly via the home gateway 14 inside the building, and can also communicate with the network 15 by wire or wirelessly outside the building.
- the external device 17 has at least a display function for displaying an image.
- the operation of the electric vacuum cleaner 11 is roughly classified into a cleaning operation, in which the vacuum cleaner 11 performs cleaning, and a charging operation, in which the charging device 12 charges the secondary battery 28. Since the charging operation uses a known method employing a charging circuit such as a constant current circuit built into the charging device 12, only the cleaning operation will be described.
- an imaging operation for imaging a predetermined object by at least one of the cameras 51a and 51b according to a command from the external device 17 or the like may be provided.
- when the vacuum cleaner 11 reaches a preset cleaning start time or receives a cleaning-start command signal transmitted from a remote controller or the external device 17, the control unit 27 switches from the standby mode to the travel mode, and the control unit 27 (travel control unit 69) drives the motors 35, 35 (drive wheels 34, 34) to move the vacuum cleaner 11 a predetermined distance away from the charging device 12.
- the vacuum cleaner 11 creates a map of the cleaning area by the map generation unit 67.
- in the map creation mode, the control unit 27 (travel control unit 69) causes the vacuum cleaner 11 (main body case 20) to travel along the outer wall of the cleaning area, or to turn at its current position, while the obstacle detection unit 74 detects the presence or absence of obstacle objects and the information acquisition unit 75 acquires various information, and a map is created based on the current position of the vacuum cleaner 11. If the control unit 27 determines that the entire cleaning area has been mapped, the map creation mode is terminated and the process proceeds to the cleaning mode described later.
- the map creation mode is selected not only when the vacuum cleaner 11 is activated in a state where the map generation unit 67 has not yet created a map of the cleaning area (no map is stored in the memory 61), but also, for example, when the user enters an instruction to newly create or change the map.
- in the map creation mode, the detection frequency of a detected object is ignored when the route setting unit 68 sets the travel route for the next cleaning; instead, at least part of the information on the detected object, such as its position (for example, all of it), is used to change the travel route for the next cleaning.
- the cleaning unit 22 may be operated to perform cleaning simultaneously with map creation.
- the created map MP has the cleaning area (room) divided into meshes M of a predetermined size and quadrangular (for example, square) shape.
- the object detected by the obstacle detection unit 74 (FIG. 1) and the information acquired by the information acquisition unit 75 (FIG. 1) are associated with each other and stored in the memory 61.
- the stored information includes the height, material, color tone, shape, temperature, feature point, detection frequency, and the like of an object in the mesh M.
- the presence/absence of an object is acquired by the obstacle detection unit 74 shown in FIG. 1, while the height and shape of the object, as well as its material and color tone, are acquired by the shape acquisition unit 64 based on images captured by the cameras 51a and 51b.
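The per-mesh storage described above (height, material, color tone, temperature, feature points, detection frequency per cell M of the map MP) can be sketched as a simple data structure. This is an illustrative assumption about how such a map could be organized, not the patent's implementation; all names are hypothetical.

```python
from dataclasses import dataclass, field

@dataclass
class Mesh:
    """One cell M of the map MP; fields mirror the attributes the text
    says are stored per mesh (names here are illustrative)."""
    height_mm: float = 0.0
    material: str = "unknown"
    color: str = "unknown"
    temperature_c: float = 0.0
    feature_points: list = field(default_factory=list)
    detection_count: int = 0  # how often an obstacle was detected here

class MapMP:
    def __init__(self):
        self.cells = {}  # (x, y) grid index -> Mesh

    def record_obstacle(self, x, y, **attrs):
        """Increment the detection frequency of the cell and merge any
        newly acquired attributes (height, material, ...)."""
        cell = self.cells.setdefault((x, y), Mesh())
        cell.detection_count += 1
        for name, value in attrs.items():
            setattr(cell, name, value)
        return cell

mp = MapMP()
mp.record_obstacle(3, 5, height_mm=500, material="wood")
mp.record_obstacle(3, 5)  # the same obstacle seen again at a later cleaning
print(mp.cells[(3, 5)].detection_count)  # → 2
```

Associating the detection count with the cell, as here, is what later allows the route setter to distinguish routinely placed objects from temporary ones.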
- This map MP is stored in, for example, the memory 61 at the time of generation, and is read from the memory 61 and used after the next cleaning. However, since the layout of the object may be changed even in the same cleaning area, in this embodiment, the map MP once created is updated as needed based on the distance measurement of the object in the cleaning mode described later.
- the map MP may be generated as appropriate according to, for example, a user command, or may be input in advance by the user without providing a map creation mode.
- the control unit 27 (route setting unit 68) generates an optimal travel route based on the map, and performs cleaning while autonomously traveling in the cleaning area along the travel route (cleaning mode).
- in the cleaning mode, the control unit 27 (cleaning control unit 70) drives the electric blower 41, the brush motor 43 (rotary brush 42 (FIG. 3)), or the side brush motor 45 (side brush 44 (FIG. 3)) of the cleaning unit 22, so that dust on the floor is collected into the dust collecting part 46 (FIG. 2) through the suction port 31 (FIG. 3).
- the vacuum cleaner 11 operates the cleaning unit 22 while moving toward a relay point along the travel route; during this movement, the obstacle detection unit 74 and the information acquisition unit 75 acquire the presence/absence of objects and various information, the self-position estimation unit 73 periodically estimates the self-position, and the operation of passing through a relay point is repeated. That is, the vacuum cleaner 11 travels so as to sequentially pass through the set relay points while cleaning. For example, when the map MP shown in FIG. 8(a) is stored, the travel route RT is set so that the vacuum cleaner 11 goes straight from a predetermined start position (for example, the upper left position in the figure) toward a relay point SG, turns 90°, travels straight toward the next relay point SG, and repeats this operation.
- the search operation is performed assuming that the cleaning area is different from the information recorded on the map.
- when objects O1 and O2 that are not on the map are detected, the vacuum cleaner 11 performs a search operation to investigate them. Specifically, the vacuum cleaner 11 sets a provisional travel route RT1 (relay point SG1) so as to travel along the periphery of the objects O1 and O2. That is, since the relay points SG are set on the travel route RT generated based on the map MP, if the state of the cleaning area matches the map MP, there should be no obstacle object on the travel route RT between relay points SG, SG; when such an object is detected there, the map MP and the actual state of the cleaning area differ.
- the vacuum cleaner 11 is travel-controlled by the control unit 27 so as to travel while grasping the difference by acquiring information by the information acquisition unit 75 shown in FIG. Accordingly, the map generation unit 67 can reflect the result on the map.
- by performing a search operation to investigate the surroundings of a position where an obstacle object is detected (or where a mapped object is no longer detected), the map generation unit 67 can reflect the result on the map with higher accuracy.
- the control unit 27 determines whether or not to change the travel route (step 1). Whether or not to change the travel route is determined based on, for example, the detection frequency of the object recorded on the map. That is, the control unit 27 (route setting unit 68) refers to the detection frequency of the object written on the map, and when this detection frequency is equal to or higher than a predetermined value (for example, when the number of detections is equal to or greater than a predetermined number), the object is assumed to be an obstacle routinely arranged in the cleaning area, the travel route is changed so that an optimal travel route avoiding the obstacle is set (step 2), and the process then proceeds to step 3.
- in step 1, when the detection frequency is less than the predetermined value (for example, when the number of detections is less than the predetermined number), the possibility that the object is a temporarily placed obstacle cannot be ruled out, so the process proceeds to step 3 below without changing the travel route. That is, when, for example, the number of the most recent predetermined multiple cleanings in which the obstacle object was detected is used as the detection frequency, an object detected a predetermined number of times or more, for example three or more times, is used for changing the travel route, whereas an object detected fewer than three times, that is, an object detected for the first or second time, is not used for changing the travel route.
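The threshold decision above (count detections over the most recent cleanings and change the route only at three or more) can be sketched in a few lines. This is a minimal illustration under the example values given in the text; the function name and the window of seven cleanings are assumptions.

```python
def should_change_route(detections, threshold=3, window=7):
    """Decide whether an obstacle should alter the next travel route.
    `detections` is the per-cleaning history (True = obstacle seen that
    cleaning).  Only the most recent `window` cleanings count, and the
    route is changed only when the count reaches `threshold` (three in
    the embodiment's example); below that, the object may be temporary."""
    recent = detections[-window:]
    return sum(recent) >= threshold

print(should_change_route([True, True]))        # seen only twice → False
print(should_change_route([True, True, True]))  # seen three times → True
```

An object seen once or twice therefore leaves the route untouched, exactly the behavior step 1 describes for low detection frequencies.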
- next, a travel command determined by the relationship between the set travel route and the self-position (for example, a command specifying the distance when traveling straight, or the turning direction and angle when turning (changing direction)) is output from the determination unit 66 to the travel control unit 69, and the travel control unit 69 operates the motors 35, 35 (drive wheels 34, 34 (FIG. 3)) according to the travel command (step 3).
- the front of the traveling direction is imaged by the cameras 51a and 51b driven by the control unit 27 (imaging control unit 71) (step 4). At least one of these captured images can be stored in the memory 61. Further, based on the images captured by the cameras 51a and 51b and the distance between the cameras 51a and 51b, the image generation unit 63 calculates the distance of an object (feature point) within a predetermined image range (step 5). Specifically, for example, when images P1 and P2 (for example, FIGS. 7(a) and 7(b)) are captured by the cameras 51a and 51b, the image generation unit 63 calculates the distance of each pixel dot in those images (in this embodiment, the images processed by the image processing unit 62).
- the image generation unit 63 generates a distance image based on the calculated distance (step 6). This distance image can also be stored in the memory 61, for example.
- An example of the distance image PL generated by the image generation unit 63 is shown in FIG.
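The distance calculation from two cameras and their separation (steps 5 and 6) is commonly done with the standard two-view relation Z = f·B/d, where d is the pixel disparity of a point between the two images, f the focal length in pixels, and B the camera baseline. The patent does not spell out this formula, so the sketch below is an assumption based on conventional stereo vision; all names and numbers are illustrative.

```python
def depth_from_disparity(focal_px, baseline_m, disparity_px):
    """Classic two-camera relation: a point imaged with disparity d
    (pixel offset between the images of cameras 51a and 51b) lies at
    depth Z = f * B / d, with f the focal length in pixels and B the
    distance between the cameras.  Applying this per pixel yields a
    distance (parallax) image like PL."""
    if disparity_px <= 0:
        raise ValueError("object at infinity or invalid match")
    return focal_px * baseline_m / disparity_px

# e.g. f = 500 px, baseline 0.05 m, disparity 25 px → depth 1.0 m
print(depth_from_disparity(500, 0.05, 25))
```

Nearer objects produce larger disparities, which is why the resulting distance image can directly drive the obstacle check that follows.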
- the shape acquisition unit 64 shown in FIG. 1 acquires shape information of an object at a predetermined distance (or a predetermined distance range) (step 7). At this time, by detecting a width dimension, a height dimension, and the like as the shape information of the object, the shape information of the narrow space can also be acquired.
- the extraction unit 65 extracts feature points from the generated distance image (step 8). Then, the self-position estimation unit 73 estimates the self-position by collating the feature points extracted by the extraction unit 65 with the feature points described in the map (step 9).
- the control unit 27 (determination unit 66) determines from the estimated self-position whether or not the vacuum cleaner 11 has reached the relay point (step 10). If it is determined in step 10 that the relay point has been reached, the control unit 27 (determination unit 66) determines whether or not the current position of the vacuum cleaner 11 is the final arrival point (step 11). If it is determined in step 11 that the current position of the vacuum cleaner 11 is not the final destination, the process returns to step 3; if it is determined that the current position is the final destination, cleaning is finished (step 12).
- control unit 27 controls the operation of the motors 35 and 35 (drive wheels 34 and 34) so that the vacuum cleaner 11 returns to the charging device 12, and the charging terminal 77 and 77 (FIG. 3) and the charging terminal of the charging device 12 (FIG. 5) are connected, and the control unit 27 shifts to the standby mode or the charging mode.
- the control unit 27 determines, based on the shape information of the object acquired by the shape acquisition unit 64, whether or not there is an obstacle object at a predetermined distance (or within a predetermined distance range) in front of the vacuum cleaner 11 (main body case 20 (FIG. 2)) (step 13). Specifically, the determination unit 66 determines, based on the width and height dimensions of objects and the horizontal or vertical intervals between objects acquired by the shape acquisition unit 64, whether or not at least part of an object is located within a predetermined image range of the distance image.
- this image range corresponds to the outer shape (top, bottom, left, and right dimensions) of the vacuum cleaner 11 (main body case 20 (FIG. 2)) when it is located at the predetermined distance D (FIG. 6) from the cameras 51a and 51b, or at a predetermined position within the predetermined distance range. Therefore, when an object is located within this image range at the predetermined distance D (FIG. 6) or within the predetermined distance range, it means that an obstacle not marked on the map is located on the travel route connecting the relay points.
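The check just described, scanning the image window that corresponds to the robot body's outline for any pixel closer than the predetermined distance D, can be sketched as follows. This is a minimal illustration, not the patent's code; the dict-based distance image and all names are assumptions.

```python
def obstacle_in_window(distance_image, window, max_range_m):
    """Return True when any pixel inside the image window that maps to
    the main body case's outline lies at or closer than max_range_m,
    i.e. an obstacle sits on the path ahead.  `distance_image` maps
    (u, v) pixel coordinates to depth in metres; `window` is the
    bounding box (u0, v0, u1, v1) of the body outline at distance D."""
    u0, v0, u1, v1 = window
    return any(
        depth <= max_range_m
        for (u, v), depth in distance_image.items()
        if u0 <= u <= u1 and v0 <= v <= v1
    )

img = {(10, 10): 2.5, (12, 11): 0.4}  # one near point inside the window
print(obstacle_in_window(img, (8, 8, 16, 16), 0.5))  # → True
```

A True result here corresponds to the step 13 branch that triggers the search operation of step 14.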
- when it is determined in step 13 that there is an object, the control unit 27 causes the vacuum cleaner 11 to perform a search operation (step 14). This search operation will be described later; during it, the cleaning unit 22 may be driven or stopped, but in this embodiment it is driven. Further, when it is determined in step 13 that there is no object, the control unit 27 determines whether an obstacle object described on the map is no longer detected, that is, whether an obstacle object has disappeared (step 15). If it is determined in step 15 that the object is detected (the object has not disappeared), the process returns to step 3; if it is determined that the object is not detected (the object has disappeared), the process proceeds to step 14 and the vacuum cleaner 11 is caused to perform the search operation.
- control unit 27 determines whether or not to end the search operation (step 16). Whether or not to end the search operation is determined by whether or not the object has been circulated. If it is determined not to end the search operation (continue the search operation), the process returns to step 14, and if it is determined to end the search operation, the process returns to step 3.
- in the search operation, the control unit 27 (travel control unit 69) shown in FIG. 1 controls the operation of the motors 35, 35 (drive wheels 34, 34 (FIG. 3)) so that the vacuum cleaner 11 (main body case 20 (FIG. 2)) travels at positions that differ from the map, such as around an object not shown on the map or in a region where no object exists despite one being recorded on the map, while the information acquisition unit 75 acquires information.
- at this time, the control unit 27 (travel control unit 69) causes the vacuum cleaner 11 (main body case 20 (FIG. 2)) to travel along the periphery of the object; if an object indicated on the map is not detected by the obstacle detection unit 74, the vacuum cleaner 11 (main body case 20 (FIG. 2)) travels around the position of that object as indicated on the map.
- the information acquisition unit 75 can acquire, for example, the arrangement location, the arrangement range, the width dimension, the height dimension, and the like of the object serving as an obstacle as the cleaning area information.
- the acquired information is reflected on the map by the map generation unit 67 and stored in the memory 61 in association with the detection frequency.
- the detection frequency and various kinds of information of these obstacles are stored in the memory 61 over a plurality of predetermined cleaning times, for example.
- the location of an obstacle object can be determined by the determination unit 66 based on a distance image (parallax image) generated by the image generation unit 63, which calculates the distance to an arbitrary object using the images captured by the cameras 51a and 51b (images processed by the image processing unit 62).
- the arrangement range of the object that becomes an obstacle can be acquired by traveling around the object while the vacuum cleaner 11 detects the object.
- for example, an object O1 having a height equal to or greater than a predetermined height (for example, 50 cm) is recognized by the determination unit 66 (FIG. 1), and the range having that height is recorded as a position different from the stored map. Also, an object O2 having a low shape of only about 1 cm in height but a wide area can be determined by the determination unit 66 (FIG. 1) to be a carpet or a rug; the mesh M indicated by 1 in FIG. 8(b) is likewise at a position different from the stored map.
- the shape such as the width dimension and the height dimension of the object serving as the obstacle is calculated by the shape acquisition unit 64 based on the distance image.
- further, when the obstacle detection unit 74 detects an object at a position within a predetermined distance (for example, 30 cm) from an object stored in the memory 61, that is, when the center distance DI between the detected object OR and the stored object OM is within the predetermined distance, the detected object (for example, a chair or a sofa) may be a semi-fixed object whose position shifts easily even though it is arranged on a daily basis; such an object is therefore preferably not determined to be an object different from the one stored in the memory 61. Instead, an object detected within the predetermined distance of the position of an object stored in the memory 61 is preferably stored in the memory 61 with its position on the map shifted accordingly by the map generation unit 67. The control unit 27 (route setting unit 68) then changes the travel route of the main body case 20 based on the map reflecting the arrangement and detection frequency of the obstacles.
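The same-object association above (treat a detection as the stored obstacle when the center distance DI is within 30 cm, shift its map position, and carry over its detection frequency) can be sketched as follows. This is an illustrative assumption; all names are hypothetical, and only the 30 cm threshold comes from the text.

```python
import math

def is_same_object(detected_xy, stored_xy, max_center_dist=0.30):
    """Treat a newly detected obstacle OR as the stored obstacle OM when
    their center distance DI is within the threshold (30 cm in the
    text's example): semi-fixed items such as chairs or sofas drift a
    little day to day without being new objects."""
    return math.dist(detected_xy, stored_xy) <= max_center_dist

def update_position(stored, detected):
    """Shift the stored map position to the new detection while keeping
    (and incrementing) the accumulated detection frequency."""
    if is_same_object(detected["xy"], stored["xy"]):
        stored["xy"] = detected["xy"]
        stored["count"] += 1
    return stored

om = {"xy": (1.0, 2.0), "count": 4}
om = update_position(om, {"xy": (1.2, 2.1)})  # moved ~22 cm → same object
print(om)  # → {'xy': (1.2, 2.1), 'count': 5}
```

Carrying the count across the position shift is what keeps the frequency-based route decision stable for furniture that is nudged between cleanings.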
- an obstacle detected on the travel route may be an object that has only been temporarily placed, or an object that is detected accidentally, such as a shopping bag, a pet, or a person.
- the control unit 27 determines whether or not to change the next travel route based on the obstacle object detected by the obstacle detection unit 74 in the travel mode. This eliminates disturbance elements such as objects that are likely to be temporarily installed or objects that are likely to be detected accidentally, and suppresses unintentional changes in the travel route to set the travel route more accurately.
- as a result, the cleaning area can be traveled and cleaned more efficiently and accurately.
- the control unit 27 determines whether or not to change the next travel route based on the detection frequency, stored in the memory 61, of obstacle objects detected by the obstacle detection unit 74. It is therefore possible to accurately predict the state of the area at the next cleaning from this detection frequency, leading to the formulation of an optimal travel route.
- when the detection frequency stored in the memory 61 is less than a predetermined value (for example, when the number of detections is 2 or less), it cannot be determined whether the object is temporarily or accidentally placed, so the control unit 27 (route setting unit 68) does not change the next travel route based on that object.
- an object having a detection frequency stored in the memory 61 of a predetermined value or more (for example, the detection frequency is three times or more), that is, an object that is repeatedly detected is fixed or semi-fixed. And the next travel route is changed based on the object, the travel route can be set with higher accuracy.
- optimization and efficiency of the cleaning control can also be achieved, realizing efficient automatic cleaning.
- when a ratio is used as the detection frequency of obstacle objects detected by the obstacle detection unit 74, it is possible to determine statistically, with higher probability, whether or not to change the travel route.
- alternatively, the control unit 27 (route setting unit 68) may, while causing the drive wheels 34, 34 (motors 35, 35) of the vacuum cleaner 11 (main body case 20) to travel autonomously as the map generation unit 67 generates a map, exceptionally set the next travel route based on objects not described in the map that are detected by the obstacle detection unit 74, regardless of their detection frequency. This can be applied when a new map is created or when the user instructs the map to be updated. In particular, when the user instructs a map update, it is highly likely that the user has tidied the area, so it can be determined that no temporarily or accidentally placed object exists; it is therefore preferable to treat a detected object as a fixed or semi-fixed object and use it for setting the travel route.
- in addition, when the control unit 27 (route setting unit 68) finds that the distance between a detected object and a stored object is within the predetermined range, it determines that they are the same object; the position recorded on the map can therefore be updated while carrying over the detection frequency of the object stored in the memory 61.
- in the travel mode, the control unit 27 determines, based on the obstacle shape information acquired by the shape acquisition unit 64, whether an obstacle detected by the obstacle detection unit 74 is the same as a stored obstacle; an object having the same shape information is determined to be the same object, so the detection frequency of the object stored in the memory 61 can be carried over.
- since the shape acquisition unit 64 acquires the shape of an object using distance images (parallax images) computed from the images captured by the plurality of cameras 51a and 51b, that is, using the cameras 51a and 51b, the image generation unit 63, and the shape acquisition unit 64, the shape can be acquired easily and accurately.
- in the travel mode, when the remaining capacity of the secondary battery 28 is less than a predetermined value (for example, 30% or less), the control unit 27 can also set the travel route so that the cleaning unit 22 cleans around the objects with the highest detection frequency first. Objects with a higher detection frequency are more likely to be fixed or semi-fixed objects placed on a daily basis, around which dust can be assumed to collect; in this way, the capacity of the secondary battery 28 can be used effectively for efficient cleaning.
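The battery-aware prioritization above (when charge drops below the threshold, visit the most frequently detected objects first, since dust tends to collect around them) can be sketched as a simple reordering. This is a minimal illustration of the stated policy, not the patent's implementation; names and data layout are assumptions.

```python
def battery_aware_targets(obstacles, battery_pct, low_threshold=30):
    """When the secondary battery falls below the threshold (30% in the
    text's example), reorder the remaining cleaning targets so cells
    around frequently detected (fixed or semi-fixed) objects are
    cleaned first; otherwise keep the planned order."""
    if battery_pct >= low_threshold:
        return obstacles
    return sorted(obstacles, key=lambda o: o["count"], reverse=True)

targets = [{"id": "rug", "count": 7}, {"id": "bag", "count": 1},
           {"id": "sofa", "count": 5}]
print([o["id"] for o in battery_aware_targets(targets, 25)])
# → ['rug', 'sofa', 'bag']
```

Sorting by detection count is one straightforward way to spend the remaining charge where dust is most likely.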
- when an object that is not placed on a daily basis is detected, the wireless LAN device 47 can transmit this information to the server 16 on the network 15, and the user can be notified, for example, by sending an e-mail via a mail server to the external device 17 carried by the user, by sending the information directly to the external device 17, or by providing a display unit on the vacuum cleaner 11 and displaying it there. In this case, the user can be prompted to tidy away objects that are not placed on a daily basis, improving the user's awareness of room tidiness and cleaning. Further, by making the images captured by the cameras 51a and 51b viewable, the user can be notified in an easy-to-understand manner, improving usability.
- information on the detection frequency of an object that becomes an obstacle can be set by an external operation.
- the user can arbitrarily set the obstacle detection frequency, and can set a more accurate travel route.
- it is also possible to allow information on an obstacle object detected by the obstacle detection unit 74, for example its detection frequency, to be deleted by an external operation.
- when an object placed in the area that should be tidied away is detected as an obstacle by the vacuum cleaner 11, its information can be deleted on the condition that the object has actually been tidied away.
- the area information can be efficiently transmitted to the vacuum cleaner 11.
- as the external operation, a display unit such as a display with a capacitive touch sensor may be provided on the vacuum cleaner 11 (main body case 20) so that the user can input directly to the vacuum cleaner 11, or the user can input to the vacuum cleaner 11 from the external device 17 by a wireless signal.
- the information acquisition unit 75 may also acquire, as cleaning area information, at least one of material information of the floor in the area, step information of the floor surface, temperature information of objects, the dust amount on the floor, and the like. That is, from the images captured by the cameras 51a and 51b (in this embodiment, images processed by the image processing unit 62), the determination unit 66 can acquire and judge material information, such as hard and flat materials like wood flooring, soft materials such as carpets and rugs with long pile, and tatami mats, as well as color tone information of the floor.
- the step information of the floor can be detected by the sensor unit 26 (step sensor 56), the temperature information of objects by the sensor unit 26 (temperature sensor 57), and the dust amount by the sensor unit 26 (dust amount sensor 58).
- the acquired information can be reflected in the map by the map generation unit 67 and stored in the memory 61.
- based on this information, the control unit 27 (travel control unit 69) can change the travel control by the drive wheels 34, 34 (motors 35, 35), or change the cleaning control, such as the operation of the cleaning unit 22 (electric blower 41, brush motor 43 (rotary brush 42 (FIG. 3)), side brush motors 45, 45 (side brushes 44, 44 (FIG. 3))), to clean more finely.
- together with the detection frequency, detection time information is stored in the memory 61.
- the time information in this embodiment is, for example, a day of the week and a time zone, the time zones being, for example, morning (6:00 to 12:00), noon (12:00 to 18:00), and night (18:00 to 6:00).
- the memory 61 stores a map of each day of the week and each time zone and a travel route. In each map, the detection frequency of an object that becomes an obstacle is stored in association with the day of the week and the time zone.
- the control unit 27 determines whether or not to change the next travel route based on the detection frequency and time information stored in the memory 61. That is, at the time of the next cleaning, only when the detection frequency of the object stored in association with the day of the week and the time zone corresponding to the cleaning time is equal to or higher than a predetermined value (for example, when the detection frequency is equal to or higher than the predetermined frequency). , Change the travel route.
- the reference values for example, a predetermined number of times) for determining the detection frequency may all be the same, or may be different values for each day of the week or for each time zone.
- an example of the relationship between the detection frequency of an obstacle object and whether the travel route is changed will now be described. Since the detection frequency of the object shown in FIG. 11 is 0 in the morning time zone on Monday, the next travel route is not changed when cleaning is performed in this time zone. On the other hand, the detection frequency is 3 in the noon time zone on Monday and 2 in the night time zone, so when cleaning is performed during these time zones, the next travel route is changed. Further, since the detection frequency is 3 in the morning time zone on Tuesday, the next travel route is changed when cleaning is performed during this time zone; since the detection frequency is 1 in the noon time zone on Tuesday and 0 in the night time zone, the next travel route is not changed when cleaning is performed during these time zones.
- in this way, the time zones in which a temporarily placed object is present can be grasped, and the optimum travel route can be set and maintained for each time zone and day of the week.
- in this embodiment the same time zones are set regardless of the day of the week, but different time zones may be set for each day of the week; for example, different time zones can be set for weekdays and holidays. These settings can be configured arbitrarily by the user.
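A user-configurable, per-day-type time-zone setting like the one described might be represented as follows. This is a hypothetical sketch: the patent does not specify a data format, and the holiday hours and all names are invented for illustration.

```python
# Hypothetical user-configurable time-zone boundaries per day type.
# Each zone is (start hour, end hour); a zone may wrap past midnight.
time_zones = {
    "weekday": {"morning": (6, 12), "noon": (12, 18), "night": (18, 6)},
    "holiday": {"morning": (8, 13), "noon": (13, 19), "night": (19, 8)},
}

def zone_for(day_type: str, hour: int) -> str:
    """Return the name of the configured time zone containing the given hour."""
    for name, (start, end) in time_zones[day_type].items():
        if start < end:
            if start <= hour < end:
                return name
        elif hour >= start or hour < end:  # zone wraps past midnight
            return name
    raise ValueError("hour not covered by any time zone")

print(zone_for("weekday", 9))   # -> morning
print(zone_for("holiday", 20))  # -> night
```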
- the information acquisition unit 75 may consist, for example, of only the cameras 51a and 51b and the image generation unit 63; the shape acquisition unit 64 and the sensor unit 26 are not essential components.
- the sensor unit 26 only needs to include at least one of the step sensor 56, the temperature sensor 57, and the dust amount sensor 58.
- the information acquisition unit 75 may be configured by an arbitrary sensor or the like that acquires the location and shape of the object.
- Whether or not the travel route needs to be changed may be determined immediately before the start of cleaning, or the next travel route may be determined when a cleaning run is completed.
- a traveling control method for an autonomous traveling body, comprising a traveling mode for traveling, wherein whether or not to change a next travel route is determined based on an obstacle detected during the traveling mode.
- the detected time information is stored in the memory together with the detection frequency of the obstacle, and whether or not to change the next travel route is determined based on the stored detection frequency and time information.
Claims (13)
- An autonomous traveling body comprising: a main body case; drive wheels that enable the main body case to travel; self-position estimation means for estimating a self-position; obstacle detection means for detecting an obstacle outside the main body case; map generation means for creating a map that records information on the area traveled by the main body case, based on detection of obstacles by the obstacle detection means while the main body case is traveling and on the self-position estimated by the self-position estimation means; and control means for causing the main body case to travel autonomously by controlling the operation of the drive wheels, wherein the control means has a traveling mode in which it controls the operation of the drive wheels so that the main body case travels autonomously along a travel route set based on the map, and determines, based on an obstacle detected by the obstacle detection means during this traveling mode, whether or not to change the next travel route.
- The autonomous traveling body according to claim 1, further comprising storage means for storing, when the obstacle detection means detects an obstacle not recorded on the map during the traveling mode of the control means, information indicating the detection frequency of that obstacle, wherein the control means determines whether or not to change the next travel route based on the detection frequency stored by the storage means.
- The autonomous traveling body according to claim 2, wherein, when the obstacle detection means detects an obstacle not recorded on the map for the first time during the traveling mode, the control means does not change the next travel route based on that obstacle.
- The autonomous traveling body according to claim 2 or 3, wherein, for an obstacle whose detection frequency stored by the storage means during the traveling mode is equal to or greater than a predetermined value, the control means changes the next travel route based on that obstacle.
- The autonomous traveling body according to any one of claims 2 to 4, wherein the control means has a map creation mode in which it controls the operation of the drive wheels so that the main body case travels autonomously while the map generation means generates the map, and, in the map creation mode, sets the next travel route based on an obstacle detected by the obstacle detection means that is not recorded on the map.
- The autonomous traveling body according to any one of claims 2 to 5, further comprising: a cleaning unit that cleans a portion to be cleaned; and a battery serving as a power source for driving the drive wheels, wherein, when the remaining battery level is equal to or less than a predetermined value during the traveling mode, the control means sets the travel route so that the cleaning unit cleans obstacles in descending order of detection frequency.
- The autonomous traveling body according to any one of claims 2 to 6, wherein the storage means stores, when the obstacle detection means detects an obstacle not recorded on the map during the traveling mode of the control means, the time information of the detection together with the detection frequency of that obstacle, and the control means determines whether or not to change the next travel route based on the detection frequency and time information stored by the storage means.
- The autonomous traveling body according to any one of claims 2 to 7, wherein, when there is a difference between the position of an obstacle recorded on the map and the position of an obstacle detected by the obstacle detection means during the traveling mode, the control means judges the two to be the same obstacle if the distance between the positions is within a predetermined range.
- The autonomous traveling body according to any one of claims 2 to 8, further comprising shape acquisition means for acquiring shape information of an obstacle detected by the obstacle detection means, wherein the control means determines, based on the shape information of the obstacle acquired by the shape acquisition means during the traveling mode, whether or not an obstacle recorded on the map and an obstacle detected by the obstacle detection means are the same obstacle.
- The autonomous traveling body according to claim 9, further comprising a plurality of imaging means for capturing images, wherein the shape acquisition means acquires the shape of an obstacle from a parallax image of the images captured by the imaging means.
- The autonomous traveling body according to any one of claims 2 to 10, which issues a notification when the obstacle detection means detects an obstacle not recorded on the map.
- The autonomous traveling body according to any one of claims 2 to 11, wherein information on the detection frequency of an obstacle can be set by external operation.
- The autonomous traveling body according to any one of claims 2 to 12, wherein information on an obstacle detected by the obstacle detection means can be deleted by external operation.
Priority Applications (3)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201780064439.2A CN109891348B (zh) | 2016-11-09 | 2017-06-07 | 自主行走体 |
US16/346,718 US11221629B2 (en) | 2016-11-09 | 2017-06-07 | Autonomous traveler and travel control method thereof |
GB1906014.4A GB2570239A (en) | 2016-11-09 | 2017-06-07 | Autonomous traveling body |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2016219123A JP6752118B2 (ja) | 2016-11-09 | 2016-11-09 | 自律走行体 |
JP2016-219123 | 2016-11-09 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2018087951A1 true WO2018087951A1 (ja) | 2018-05-17 |
Family
ID=62109755
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2017/021219 WO2018087951A1 (ja) | 2016-11-09 | 2017-06-07 | 自律走行体 |
Country Status (6)
Country | Link |
---|---|
US (1) | US11221629B2 (ja) |
JP (1) | JP6752118B2 (ja) |
CN (1) | CN109891348B (ja) |
GB (1) | GB2570239A (ja) |
TW (1) | TWI656423B (ja) |
WO (1) | WO2018087951A1 (ja) |
Cited By (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2022201785A1 (ja) * | 2021-03-22 | 2022-09-29 | ソニーグループ株式会社 | 情報処理装置および方法、並びにプログラム |
Families Citing this family (21)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
DE102017113286A1 (de) * | 2017-06-16 | 2018-12-20 | Vorwerk & Co. Interholding Gmbh | System mit mindestens zwei sich selbsttätig fortbewegenden Bodenbearbeitungsgeräten |
JPWO2019097626A1 (ja) * | 2017-11-16 | 2020-07-27 | 学校法人千葉工業大学 | 自走式掃除機 |
JP2019204336A (ja) * | 2018-05-24 | 2019-11-28 | 東芝ライフスタイル株式会社 | 自律走行体 |
JP7123656B2 (ja) * | 2018-06-22 | 2022-08-23 | 東芝ライフスタイル株式会社 | 自律型電気掃除機 |
US20210271257A1 (en) * | 2018-07-06 | 2021-09-02 | Sony Corporation | Information processing device, optimum time estimation method, self-position estimation method, and record medium recording computer program |
JP7065449B2 (ja) * | 2018-07-20 | 2022-05-12 | パナソニックIpマネジメント株式会社 | 自走式掃除機 |
JP7123430B2 (ja) * | 2018-07-30 | 2022-08-23 | 学校法人千葉工業大学 | 地図生成システムおよび移動体 |
GB2576494B (en) * | 2018-08-06 | 2022-03-23 | Dyson Technology Ltd | A mobile robot and method of controlling thereof |
JP6940461B2 (ja) * | 2018-09-10 | 2021-09-29 | 日立グローバルライフソリューションズ株式会社 | 自律走行型掃除機 |
WO2020065701A1 (ja) * | 2018-09-25 | 2020-04-02 | 学校法人 千葉工業大学 | 情報処理装置および移動ロボット |
TWI723330B (zh) | 2019-01-21 | 2021-04-01 | 瑞軒科技股份有限公司 | 機器人以及機器人控制方法 |
JP6764138B2 (ja) * | 2019-03-28 | 2020-09-30 | 日本電気株式会社 | 管理方法、管理装置、プログラム |
TWI716957B (zh) * | 2019-08-07 | 2021-01-21 | 陳水石 | 清潔機器人及其材質辨識方法 |
JP7345132B2 (ja) * | 2019-08-09 | 2023-09-15 | パナソニックIpマネジメント株式会社 | 自律走行型掃除機、自律走行型掃除機の制御方法、及び、プログラム |
KR20210099470A (ko) * | 2020-02-04 | 2021-08-12 | 엘지전자 주식회사 | 청소기 |
CN111625007A (zh) * | 2020-06-24 | 2020-09-04 | 深圳市银星智能科技股份有限公司 | 一种识别动态障碍物的方法和移动机器人 |
CN111700546B (zh) * | 2020-06-24 | 2022-09-13 | 深圳银星智能集团股份有限公司 | 一种移动机器人的清扫方法及移动机器人 |
KR20220012000A (ko) * | 2020-07-22 | 2022-02-03 | 엘지전자 주식회사 | 로봇 청소기 및 이의 제어방법 |
CN112220405A (zh) * | 2020-10-29 | 2021-01-15 | 久瓴(江苏)数字智能科技有限公司 | 自移动工具清扫路线更新方法、装置、计算机设备和介质 |
CN113786134B (zh) * | 2021-09-27 | 2023-02-03 | 汤恩智能科技(上海)有限公司 | 一种清洁方法、程序产品、可读介质和电子设备 |
CN114137966A (zh) * | 2021-11-22 | 2022-03-04 | 北京云迹科技有限公司 | 一种移动装置的控制方法、装置、系统及存储介质 |
Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH0373004A (ja) * | 1989-08-14 | 1991-03-28 | Honda Motor Co Ltd | 自走型作業ロボット |
JP2004033340A (ja) * | 2002-07-01 | 2004-02-05 | Hitachi Home & Life Solutions Inc | ロボット掃除機及びロボット掃除機制御プログラム |
US20050022273A1 (en) * | 2003-07-23 | 2005-01-27 | Hitachi, Ltd. | Location aware automata |
JP2007323402A (ja) * | 2006-06-01 | 2007-12-13 | Matsushita Electric Ind Co Ltd | 自走式機器およびそのプログラム |
JP2008129614A (ja) * | 2006-11-16 | 2008-06-05 | Toyota Motor Corp | 移動体システム |
Family Cites Families (35)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US5107946A (en) | 1989-07-26 | 1992-04-28 | Honda Giken Kogyo Kabushiki Kaisha | Steering control system for moving vehicle |
JPH0680203A (ja) * | 1992-03-24 | 1994-03-22 | East Japan Railway Co | 床面洗浄ロボットの制御方法 |
JPH1088625A (ja) * | 1996-09-13 | 1998-04-07 | Komatsu Ltd | 自動掘削機、自動掘削方法および自動積み込み方法 |
JP2004275468A (ja) * | 2003-03-17 | 2004-10-07 | Hitachi Home & Life Solutions Inc | 自走式掃除機およびその運転方法 |
TWI308487B (en) * | 2006-12-26 | 2009-04-11 | Ind Tech Res Inst | Position-detecting system and method |
JP2008242908A (ja) * | 2007-03-28 | 2008-10-09 | Matsushita Electric Ind Co Ltd | 自律走行装置およびこの装置を機能させるためのプログラム |
KR20090077547A (ko) | 2008-01-11 | 2009-07-15 | 삼성전자주식회사 | 이동 로봇의 경로 계획 방법 및 장치 |
CN104248395B (zh) | 2008-04-24 | 2018-06-22 | 艾罗伯特公司 | 用于机器人使能的移动产品的定位、位置控制和导航系统的应用 |
CN101920498A (zh) * | 2009-06-16 | 2010-12-22 | 泰怡凯电器(苏州)有限公司 | 实现室内服务机器人同时定位和地图创建的装置及机器人 |
CN102596517B (zh) | 2009-07-28 | 2015-06-17 | 悠进机器人股份公司 | 移动机器人定位和导航控制方法及使用该方法的移动机器人 |
DE102009041362A1 (de) | 2009-09-11 | 2011-03-24 | Vorwerk & Co. Interholding Gmbh | Verfahren zum Betreiben eines Reinigungsroboters |
CN102138769B (zh) * | 2010-01-28 | 2014-12-24 | 深圳先进技术研究院 | 清洁机器人及其清扫方法 |
KR101750340B1 (ko) | 2010-11-03 | 2017-06-26 | 엘지전자 주식회사 | 로봇 청소기 및 이의 제어 방법 |
KR101566207B1 (ko) * | 2011-06-28 | 2015-11-13 | 삼성전자 주식회사 | 로봇 청소기 및 그 제어방법 |
TW201305761A (zh) * | 2011-07-21 | 2013-02-01 | Ememe Robot Co Ltd | 自走機器人及其定位方法 |
JP5165784B1 (ja) | 2011-10-07 | 2013-03-21 | シャープ株式会社 | 自走式イオン発生機及び掃除ロボット |
EP2713232A1 (en) * | 2012-09-27 | 2014-04-02 | Koninklijke Philips N.V. | Autonomous mobile robot and method for operating the same |
US8949016B1 (en) * | 2012-09-28 | 2015-02-03 | Google Inc. | Systems and methods for determining whether a driving environment has changed |
JP2015535373A (ja) * | 2012-10-05 | 2015-12-10 | アイロボット コーポレイション | 移動ロボットを含むドッキングステーション姿勢を決定するためのロボット管理システムとこれを用いる方法 |
CN103054522B (zh) | 2012-12-31 | 2015-07-29 | 河海大学 | 一种清洁机器人系统及其测控方法 |
JP2014200449A (ja) * | 2013-04-04 | 2014-10-27 | シャープ株式会社 | 自走式掃除機 |
CN103431812B (zh) | 2013-08-02 | 2016-04-06 | 南京航空航天大学金城学院 | 一种基于超声雷达探测的清洁机器人及其行走控制方法 |
CN103353758B (zh) | 2013-08-05 | 2016-06-01 | 青岛海通机器人系统有限公司 | 一种室内机器人导航方法 |
JP5897517B2 (ja) * | 2013-08-21 | 2016-03-30 | シャープ株式会社 | 自律移動体 |
EP2839769B1 (en) * | 2013-08-23 | 2016-12-21 | LG Electronics Inc. | Robot cleaner and method for controlling the same |
EP3056453A4 (en) * | 2013-10-11 | 2017-07-26 | Hitachi, Ltd. | Transport vehicle control device and transport vehicle control method |
KR102093177B1 (ko) * | 2013-10-31 | 2020-03-25 | 엘지전자 주식회사 | 이동 로봇 및 그 동작방법 |
TWI502297B (zh) | 2013-11-18 | 2015-10-01 | Weistech Technology Co Ltd | Mobile device with route memory function |
CN104188598B (zh) * | 2014-09-15 | 2016-09-07 | 湖南格兰博智能科技有限责任公司 | 一种自动地面清洁机器人 |
CN104932493B (zh) * | 2015-04-01 | 2017-09-26 | 上海物景智能科技有限公司 | 一种自主导航的移动机器人及其自主导航的方法 |
CN204631616U (zh) * | 2015-05-13 | 2015-09-09 | 美的集团股份有限公司 | 机器人 |
CN105320140B (zh) * | 2015-12-01 | 2018-09-18 | 浙江宇视科技有限公司 | 一种扫地机器人及其清扫路径规划方法 |
CN105425801B (zh) * | 2015-12-10 | 2018-06-12 | 长安大学 | 基于先进路径规划技术的智能清洁机器人及其清洁方法 |
CN105425803B (zh) * | 2015-12-16 | 2020-05-19 | 纳恩博(北京)科技有限公司 | 自主避障方法、装置和系统 |
JP7058067B2 (ja) | 2016-02-16 | 2022-04-21 | 東芝ライフスタイル株式会社 | 自律走行体 |
- 2016
- 2016-11-09 JP JP2016219123A patent/JP6752118B2/ja active Active
- 2017
- 2017-06-07 US US16/346,718 patent/US11221629B2/en active Active
- 2017-06-07 CN CN201780064439.2A patent/CN109891348B/zh active Active
- 2017-06-07 WO PCT/JP2017/021219 patent/WO2018087951A1/ja active Application Filing
- 2017-06-07 GB GB1906014.4A patent/GB2570239A/en not_active Withdrawn
- 2017-07-03 TW TW106122167A patent/TWI656423B/zh not_active IP Right Cessation
Also Published As
Publication number | Publication date |
---|---|
US11221629B2 (en) | 2022-01-11 |
GB2570239A (en) | 2019-07-17 |
CN109891348B (zh) | 2022-04-29 |
CN109891348A (zh) | 2019-06-14 |
US20190302796A1 (en) | 2019-10-03 |
GB201906014D0 (en) | 2019-06-12 |
TWI656423B (zh) | 2019-04-11 |
JP6752118B2 (ja) | 2020-09-09 |
TW201818175A (zh) | 2018-05-16 |
JP2018077685A (ja) | 2018-05-17 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
WO2018087951A1 (ja) | 自律走行体 | |
WO2017141535A1 (ja) | 自律走行体 | |
WO2017141536A1 (ja) | 自律走行体 | |
EP3727122B1 (en) | Robot cleaners and controlling method thereof | |
WO2018087952A1 (ja) | 電気掃除機 | |
CN110636789B (zh) | 电动吸尘器 | |
JP7007078B2 (ja) | 電気掃除機 | |
CN110325938B (zh) | 电动吸尘器 | |
JP7141220B2 (ja) | 自走式電気掃除機 | |
CN109938642B (zh) | 电动吸尘器 | |
JP2018068896A (ja) | 電気掃除機 | |
US20200033878A1 (en) | Vacuum cleaner | |
JP2016185182A (ja) | 電気掃除機およびその情報表示方法 | |
JP6945144B2 (ja) | 掃除情報提供装置、および掃除機システム | |
CN111225592B (zh) | 自主行走吸尘器和扩展区域识别方法 | |
JP2020140248A (ja) | 自律走行体装置 |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
| 121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 17869149; Country of ref document: EP; Kind code of ref document: A1 |
| ENP | Entry into the national phase | Ref document number: 201906014; Country of ref document: GB; Kind code of ref document: A; Free format text: PCT FILING DATE = 20170607 |
| NENP | Non-entry into the national phase | Ref country code: DE |
| 122 | Ep: pct application non-entry in european phase | Ref document number: 17869149; Country of ref document: EP; Kind code of ref document: A1 |