CN110025260B - Autonomous traveling body and autonomous traveling body system - Google Patents


Publication number: CN110025260B (application CN201811554782.8A)
Authority: CN (China)
Other versions: CN110025260A (in Chinese)
Inventors: 井泽浩一, 丸谷裕树, 渡边浩太, 古田和浩
Assignee (original and current): Toshiba Lifestyle Products and Services Corp
Legal status: Active (granted)

Classifications

    • A47L11/24 Floor-sweeping machines, motor-driven
    • A47L11/40 Parts or details of machines not provided for in groups A47L11/02 - A47L11/38, or not restricted to one of these groups, e.g. handles, arrangements of switches, skirts, buffers, levers
    • A47L11/4002 Installations of electric equipment
    • A47L11/4011 Regulation of the cleaning machine by electric means; Control systems and remote control systems therefor
    • A47L2201/04 Automatic control of the travelling movement; Automatic obstacle detection

    (All classes fall under A47L: Domestic washing or cleaning; suction cleaners in general.)

Abstract

The invention provides an autonomous traveling body, and an autonomous traveling body system, with which a desired travel position can be indicated intuitively. A communication unit (25) can communicate with an external device (17) having a display function capable of displaying image data, can transmit image data to the external device (17), and can receive signals from the external device (17). The communication unit (25) receives, from the external device (17), coordinate data on the image data corresponding to a position indicated on the image data displayed on the external device (17). A CPU (61) performs travel control of the main body case. Based on map data of the place where the main body case travels and the coordinate data received by the communication unit (25), the CPU (61) sets a target travel position on the map data indicated by the coordinate data, and causes the main body case to travel to the target travel position.

Description

Autonomous traveling body and autonomous traveling body system
Technical Field
Embodiments of the present invention relate to an autonomous traveling body capable of self-traveling, and to an autonomous traveling body system including the autonomous traveling body.
Background
Conventionally, for autonomously drivable robots such as autonomous-traveling electric vacuum cleaners, systems capable of remote operation are known in which a position in a work area is designated on a map displayed on an external device such as a smartphone, and the robot, receiving the designated position information via a network such as the Internet, is guided to that position. Such a system enables, for example, cleaning a designated place in a house from a remote location, monitoring a designated place in the house with a camera while the user is away, or capturing images with the camera to observe a pet while the user is away.
When operating a robot by such remote operation, the user must correctly grasp the position of the robot and must be able to easily indicate the position to which it should be guided. However, when the map of the work area displayed on the external device is, for example, a two-dimensional map entered by the user or a two-dimensional map created by the robot, it is not easy to see intuitively where the robot will be guided. In addition, when an object that obstructs travel lies on the route from the robot's current position to the guidance position, accurate guidance is difficult.
Disclosure of Invention
The problem to be solved by the present invention is to provide an autonomous traveling body that can be guided intuitively to a desired travel position, and an autonomous traveling body system including the autonomous traveling body.
The autonomous traveling body of the embodiment includes a main body capable of self-traveling, a camera, an image input unit, a communication unit, and a travel control unit. The camera is provided on the main body. The image input unit acquires image data captured by the camera. The communication unit can communicate with an external device having a display function capable of displaying image data, can transmit image data to the external device, and can receive signals from the external device. The communication unit also receives, from the external device, coordinate data on the image data corresponding to a position indicated on the image data displayed on the external device. The travel control unit performs travel control of the main body. Based on map data of the place where the main body travels and the coordinate data received by the communication unit, the travel control unit sets a target travel position on the map data indicated by the coordinate data, and causes the main body to travel to the target travel position.
According to the above configuration, the main body can be guided intuitively to a desired travel position.
Drawings
Fig. 1 is a block diagram showing an internal structure of an autonomous traveling body according to an embodiment.
Fig. 2 is a perspective view showing the autonomous traveling body.
Fig. 3 is a plan view showing the autonomous traveling body from below.
Fig. 4 is an explanatory view schematically showing an autonomous traveling body system including the autonomous traveling body.
Fig. 5 is an explanatory diagram showing an example of display performed by the external device of the autonomous traveling system.
Fig. 6A is an explanatory diagram showing an example of setting of a travel route in the map data of the autonomous traveling body.
Fig. 6B is an explanatory diagram showing an example of a change in the travel route in the case where an obstacle exists in the travel route in the map data.
Fig. 7A is an explanatory diagram showing an example of switching of image data displayed by the external device.
Fig. 7B is an explanatory diagram showing another example of switching of image data displayed on the external device.
Fig. 7C is an explanatory diagram showing another example of switching of image data displayed on the external device.
Detailed Description
Hereinafter, a configuration of an embodiment will be described with reference to the drawings.
In fig. 1 to 4, reference numeral 11 denotes an autonomous traveling body. The autonomous traveling body 11 constitutes an autonomous traveling body device together with a charging device serving as a base device. In the present embodiment, the autonomous traveling body 11 is a robot cleaner, that is, a self-propelled electric vacuum cleaner that cleans a floor surface, which is both the traveling surface and the portion to be cleaned, while autonomously traveling on it. The self-propelled autonomous traveling body 11 is not limited to a device that travels completely autonomously, and includes a device that travels by remote operation of an external device such as a remote controller. By communicating with a home gateway 14, serving as a relay unit disposed in the cleaning area or the like, by wired communication or by wireless communication such as Wi-Fi or Bluetooth, the autonomous traveling body 11 can communicate, via a network 15 such as the Internet, with a general-purpose server 16 serving as a data storage unit, with a general-purpose or dedicated external device 17 having the function of a display unit, and the like. Further, if the autonomous traveling body 11 is inside the building, it can, for example, communicate wirelessly with the external device 17 via the home gateway 14. The autonomous traveling body 11 and the external device 17 thus constitute an autonomous traveling body system 18 together with the home gateway 14, the network 15, the server 16, and the like.
The autonomous traveling body 11 includes a main body case 20 as the main body, and driving wheels 21 as a traveling drive unit. The autonomous traveling body 11 may further include a cleaning unit 22 for removing dust from the floor surface. The autonomous traveling body 11 is also provided with a sensor unit 23, an imaging unit 24, and a communication unit 25 as communication means. The autonomous traveling body 11 may further include an input/output unit 26 that inputs and outputs signals to and from the external device 17 or the user. The autonomous traveling body 11 includes a controller, i.e., a control unit 27 as control means, and may further include a display unit for displaying images and a battery as a power supply unit for supplying power. Hereinafter, the directions indicated by arrows FR and RR in fig. 2, along the traveling direction of the main body case 20, are referred to as the front-rear direction, and the left-right direction orthogonal to the front-rear direction is referred to as the width direction.
The main body case 20 is formed in a shape capable of housing various devices and components. The main body case 20 may be provided with a suction port 31 or the like as a dust collection port at a lower portion or the like facing the floor surface.
The driving wheels 21 cause the main body case 20 to travel forward and backward on the floor surface. The driving wheels 21 of the present embodiment are provided as a left-right pair on the main body case 20, but the present invention is not limited to this arrangement. The driving wheels 21 are driven by motors 33 as a drive unit. Crawler belts or the like may be used instead of the driving wheels 21.
The cleaning unit 22 removes dust from the floor surface. The cleaning unit 22 has the function of collecting dust on the floor surface through the suction port 31 and of wiping the floor surface clean. The cleaning unit 22 may include at least one of: an electric blower 35 that sucks in dust together with air through the suction port 31; a rotary brush 36 as a rotary cleaning member rotatably attached at the suction port 31 to pick up dust, together with a brush motor that rotationally drives the rotary brush 36; and side brushes 38 as auxiliary cleaning units rotatably attached at the peripheral portion of the main body case 20 to gather dust, together with side brush motors that drive the side brushes 38. The cleaning unit 22 collects dust into a dust collecting unit 40 communicating with the suction port 31. The cleaning unit 22 is not an essential component.
The sensor unit 23 senses unevenness such as steps in the floor surface, wall portions and obstacles that obstruct the travel of the autonomous traveling body 11, the amount of dust on the floor surface, and the like.
The imaging unit 24 includes a camera 51 as imaging means. The imaging unit 24 may also include a lamp 53, such as an LED, as a detection-assisting illumination unit for illuminating the imaging range of the camera 51.
The camera 51 is a digital camera that faces the front, i.e., the traveling direction of the main body case 20, and captures digital images (moving images, still images, or both) at a predetermined horizontal angle of view, for example 105°, in a direction parallel to the floor surface on which the main body case 20 is placed. The camera 51 includes a lens, an aperture, a shutter, an imaging element such as a CCD, and an imaging control circuit. A plurality of cameras 51 may be provided.
The communication unit 25 can communicate with the external device 17 outside the building through the server 16 via the home gateway 14 and the network 15, and with the external device 17 inside the building via the home gateway 14. The communication unit 25 can transmit data to, and receive signals from, the external device 17. The communication unit 25 includes a wireless communication unit, such as a wireless LAN device, for wireless communication with the external device 17, a signal receiving unit for the cleaner, and the like. The communication unit 25 may be equipped with an access-point function so as to communicate wirelessly with the external device 17 directly, without passing through the home gateway 14. A web server function may also be added to the communication unit 25.
The input/output unit 26 obtains a control command transmitted from an external device such as a remote controller or a control command input from an input unit such as a switch or a touch panel provided in the main body case 20, and transmits a signal to the charging device or the like. The input/output unit 26 is not necessarily configured.
The control unit 27 may be a microcomputer including a CPU, a ROM, a RAM, and the like as a control unit main body. The control unit 27 is electrically connected to the cleaning unit 22, the sensor unit 23, the imaging unit 24, the communication unit 25, the input/output unit 26, and the like. The control unit 27 of the present embodiment includes a CPU 61. The control unit 27 includes an image processing processor 63 as an image processing unit. The control unit 27 includes a cleaning control unit as cleaning control means. The control unit 27 includes a memory as a storage means. The control unit 27 is electrically connected to the battery. The control unit 27 may include a charge control unit that controls charging of the battery.
The CPU 61 is electrically connected to the motors 33, the sensor unit 23, and the input/output unit 26, and is further electrically connected to the image processing processor 63. The CPU 61 functions as a travel control unit that controls the driving of the driving wheels 21 by controlling the driving of the motors 33. The CPU 61 also functions as a sensing control unit that obtains detection results from the sensor unit 23, and as a signal input/output unit that acquires signals received by the input/output unit 26 and generates signals to be output from it. The CPU 61 further has a travel mode in which a travel route is set based on map data indicating the travel area, i.e., the area in which the autonomous traveling body 11 is arranged and can travel, and on detection by the sensor unit 23, and in which the driving of the motors 33 is controlled so that the main body case 20 autonomously travels within the travel area along the travel route.
The travel route set by the CPU 61 is a route that allows efficient travel or cleaning: for example, a route that covers the travelable or cleanable area in the map data, i.e., the area excluding non-travelable regions such as obstacles and steps, with the shortest travel distance; a route in which the main body case 20 travels straight as much as possible and changes direction as little as possible; a route with little contact with obstructing objects; or a route that passes over the same position the fewest number of times. In the present embodiment, the area in which the autonomous traveling body 11 can travel is essentially the area to be cleaned by the cleaning unit 22, so the travel area coincides with the cleaning target area.
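One way to realize such a shortest route over grid-based map data is a breadth-first search over travelable cells. The patent does not specify a search algorithm, so the following Python sketch is illustrative: the `grid` encoding (0 = travelable, 1 = obstacle) and the `shortest_route` helper are assumptions, not the disclosed implementation.

```python
from collections import deque

def shortest_route(grid, start, goal):
    """Breadth-first search for a shortest route on a grid map.

    grid: 2D list where 0 = travelable cell, 1 = obstacle.
    start, goal: (row, col) tuples. Returns a list of cells from
    start to goal, or None when the goal is unreachable.
    """
    rows, cols = len(grid), len(grid[0])
    queue = deque([start])
    came_from = {start: None}  # also serves as the visited set
    while queue:
        cell = queue.popleft()
        if cell == goal:
            # Reconstruct the route by walking back to the start.
            route = []
            while cell is not None:
                route.append(cell)
                cell = came_from[cell]
            return route[::-1]
        r, c = cell
        for nr, nc in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            if (0 <= nr < rows and 0 <= nc < cols
                    and grid[nr][nc] == 0 and (nr, nc) not in came_from):
                came_from[(nr, nc)] = cell
                queue.append((nr, nc))
    return None  # no travelable route exists
```

Because breadth-first search expands cells in order of distance, the first time the goal is reached the route is guaranteed to be one of minimum length, matching the "shortest travel distance" criterion in the text.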
The image processing processor 63 functions as an image input unit that acquires image data from at least two cameras 51, 51, and as an image processing unit that processes the acquired image data. The image processing processor 63 also functions as a map creation unit and a self-position estimation unit: based on the image data captured by the cameras 51, it creates map data of the travel area or cleaning target area and estimates its own position. That is, the image processing processor 63 senses the surroundings to create an area map indicating the travelable area, and acquires the position of the autonomous traveling body 11 within that map. The image processing processor 63 of the present embodiment has a SLAM (Simultaneous Localization and Mapping) function that uses stereo camera images. By extracting a feature point, such as a corner of a piece of furniture, at the same position in the images captured by the two cameras 51, the image processing processor 63 can obtain the distance from the imaging position to the feature point using the binocular parallax of the cameras 51. By repeating such distance acquisition, the situation around the autonomous traveling body 11, such as the shape of wall portions and the positions and sizes of furniture, becomes known, and the image processing processor 63 can create a map indicating the area in which the main body case 20 can travel. The image processing processor 63 can then acquire its own position by associating the created map with distance measurement information based on the captured images.
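The distance-from-parallax step above follows the standard pinhole stereo model, Z = f·B/d: depth is focal length times camera baseline divided by the disparity of the feature point between the two images. The sketch below illustrates only that relation; the function name and parameter values are assumptions, not values from the patent.

```python
def depth_from_disparity(focal_px, baseline_m, disparity_px):
    """Distance to a feature point from the binocular parallax between
    two camera images, using the pinhole stereo model Z = f * B / d.

    focal_px:     focal length in pixels
    baseline_m:   distance between the two cameras, in metres
    disparity_px: horizontal shift of the same feature point between
                  the left and right images, in pixels
    """
    if disparity_px <= 0:
        raise ValueError("feature must have positive disparity")
    return focal_px * baseline_m / disparity_px
```

Note the inverse relation: a feature twice as far away shows half the disparity, which is why distant feature points are measured less precisely by a stereo pair.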
When the image processing processor 63 acquires its own position, it need not create the map itself; an externally obtained map may be used. Besides stereo camera images, known techniques using infrared sensing, or angular velocity sensing combined with acceleration sensing, can be used to realize the SLAM function, so a detailed description is omitted. The created map data M is laid out on a predetermined coordinate system, such as an orthogonal coordinate system. As shown in fig. 6A and 6B, the map data M of the present embodiment uses a grid set on this coordinate system as its basic unit, with the obstacle O, the outer wall W, and the like arranged to correspond to the actual travel area; the image processing processor 63 likewise associates the position of the main body case 20 with the coordinate system on the map. The image data and the map data created by the image processing processor 63 can each be stored in the memory, and the image data can also be transmitted from the communication unit 25 to the external device 17. The map data may be transmitted to the external device 17 for display, or may be stored only in the memory. When the shape or arrangement of an obstacle or the like in the created map data does not match the detected surroundings, the image processing processor 63 can correct the map data accordingly.
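A grid map of this kind, including the correction behavior where a fresh detection overrides a stale cell, can be sketched as a minimal occupancy grid. The class name, the three cell states, and the "latest observation wins" policy are illustrative assumptions; the patent only states that the map is grid-based and can be corrected.

```python
class GridMap:
    """Minimal grid map keyed on an orthogonal coordinate system.

    Cells start unknown and are marked free or occupied as the
    surroundings are sensed; re-marking a cell corrects the map when a
    later detection disagrees with what was recorded earlier.
    """
    FREE, OCCUPIED, UNKNOWN = 0, 1, -1

    def __init__(self, width, height):
        self.cells = [[self.UNKNOWN] * width for _ in range(height)]

    def mark(self, x, y, state):
        self.cells[y][x] = state  # latest observation wins

    def is_travelable(self, x, y):
        # Unknown cells are treated as non-travelable until observed.
        return self.cells[y][x] == self.FREE
```

Treating unknown cells as non-travelable is a conservative choice; a planner could instead treat them as travelable-but-unverified, at the cost of more avoidance maneuvers during travel.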
The cleaning control unit controls the operation of the cleaning unit 22. In the present embodiment, the cleaning control unit controls the driving of the electric blower 35, the brush motor, and the side brush motor, respectively.
The memory may be a nonvolatile memory such as a flash memory. The memory stores, together with the map data created by the image processing processor 63, the areas of the map data in which travel or cleaning has been completed.
The battery is used for supplying power to the cleaning unit 22, the sensor unit 23, the imaging unit 24, the communication unit 25, the input/output unit 26, the control unit 27, and the like. As the battery, in the present embodiment, a rechargeable secondary battery is used. Therefore, in the present embodiment, the charging terminal 71 for charging the battery is disposed so as to be exposed at the bottom of the main body case 20.
The charging device serves as a base portion to which the autonomous traveling body 11 is reset when traveling or cleaning is completed. The charging device may have a charging circuit such as a constant current circuit built therein. The charging device is provided with a charging terminal for charging the battery. The charging terminal is electrically connected to the charging circuit. The charging terminal is mechanically and electrically connected to the charging terminal 71 of the autonomous traveling body 11 returned to the charging device.
The home gateway 14 is also called an access point or the like, is provided in a building, and is connected to the network 15 by a wire, for example.
The server 16 is a computer such as a cloud server connected to the network 15, and can store various data.
The external device 17 is a general-purpose device, such as a PC, a tablet terminal, a tablet PC, a smartphone, or a mobile phone, that can communicate with the network 15 inside the building via, for example, the home gateway 14, and can communicate with the network 15 outside the building. The external device 17 includes a display 73, such as a liquid crystal display device, and functions as an external-device display unit that displays at least images on the display 73. The display 73 has a touch panel function through which the user inputs instructions by touching with a finger or a pointer; that is, the external device 17 also functions as an external-device input unit. The external device 17 may be provided with an application or program for displaying the map and the amount of dust, and for letting the user instruct the autonomous traveling body 11 by remote operation to start or stop cleaning, charge the battery, or move to a designated position. That is, the display 73 can show icons with which the user instructs the start and stop of cleaning, the charging of the battery, and so on, as well as the image data G captured by the camera 51 of the autonomous traveling body 11 as shown in fig. 5, and the user can indicate a position on the displayed image data G by touch. Indication icons AL and AR, buttons with which the user instructs switching of the displayed image data G, may also be shown on the display 73. The indication icons AL and AR may, for example, imitate arrows pointing left and right and be arranged on both sides of the image data G, so that switching from the currently displayed image data G to other image data in the direction the user wishes to observe can be instructed intuitively. The external device 17 may further be provided with a function of transmitting and receiving e-mail.
The external device 17 further includes a coordinate data acquisition unit 75, which acquires, from the image data G displayed on the display 73, the coordinate data of a position the user indicates by touching with a finger or a pointer. Working together with the touch panel function, the coordinate data acquisition unit 75 acquires the coordinate data on the image data G corresponding to the touched position on the display 73. The external device 17 can transmit the coordinate data acquired by the coordinate data acquisition unit 75 to the communication unit 25 of the autonomous traveling body 11.
Next, the operation of the embodiment will be described.
The autonomous traveling body 11 has an automatic cleaning mode, in which it cleans with the cleaning unit 22 while traveling autonomously according to a predetermined schedule or a user instruction, and a remote operation mode, in which it travels by remote operation to a position designated by the user. When the autonomous traveling body 11 has no cleaning function, the automatic cleaning mode is unnecessary.
In the automatic cleaning mode, the control unit 27 switches to the travel mode and starts cleaning when a preset cleaning start time arrives or when the input/output unit 26 receives a control command to start cleaning transmitted from a remote controller or an external device. When a room is cleaned for the first time, the memory holds no map data of the travel area. In this case, the autonomous traveling body performs a predetermined operation while the sensor unit 23, the camera 51, the image processing processor 63, and the like detect obstacles and other features around the main body case 20, and the image processing processor 63 creates new map data. The map data may instead be acquired from outside rather than created by the image processing processor 63 of the autonomous traveling body 11.
In the processing of the image processing processor 63, image data is first acquired from the at least two cameras 51, and primary image processing such as lens distortion correction is performed as necessary. The image processing processor 63 then performs SLAM processing on each set of two images, one from each camera 51, carrying out self-position estimation and map creation. The image processing processor 63 can also turn on the lamp 53 to obtain suitable images even in a dark travel area.
Next, based on the map data, the CPU61 creates a travel route. The method for creating the travel route may be a known method.
The CPU 61 then controls the motors 33 so that the main body case 20 autonomously travels along the set travel route, while the cleaning control unit operates the cleaning unit 22 to clean the floor surface of the cleaning target area, i.e., the travel area. In the cleaning unit 22, the electric blower 35, the brush motor, or the side brush motor is driven by the cleaning control unit, and dust on the floor is collected into the dust collecting unit 40 through the suction port 31. When, during autonomous travel, the sensor unit 23 and the image processing processor 63 detect the three-dimensional coordinates of an object, such as an obstacle, not recorded in the travel area of the map, the image processing processor 63 reflects those coordinates in the map data, which is stored in the memory.
When the cleaning is completed, the autonomous traveling body 11 returns to the charging device, and then the operation proceeds to the charging operation of the battery.
In the remote operation mode, on the other hand, the autonomous traveling body 11 stands by in the travel area and waits for an instruction from the user of the external device 17. The autonomous traveling body 11 transmits the image data G captured by at least one of the cameras 51 from the communication unit 25 to the external device 17, and the external device 17, having received the image data G, displays it on the display 73 as shown in fig. 5. The image data G displayed at this time may be captured from a predetermined fixed position such as the charging device, but an image captured from the current position of the autonomous traveling body 11 is preferable, as it allows the user to visually confirm the current state of the travel area.
By touching, the user indicates, in the image data G displayed on the display 73 of the external device 17, the position to which the user wishes to guide the autonomous traveling body 11.
The coordinate data acquisition unit 75 acquires the coordinate data of the touched position on the image data G. The external device 17 transmits this coordinate data, and the autonomous traveling body 11 receives it through the communication unit 25.
The CPU 61 of the autonomous traveling body 11 then refers to the coordinate data acquired by the communication unit 25 and the map data of the travel area stored in the memory, and calculates the target travel position in the travel area corresponding to the position indicated by the coordinate data.
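The patent does not disclose how a touched pixel is converted into a position in the travel area. As a hedged illustration of one possibility, a ground-plane projection works for the forward-facing camera described earlier, which is mounted parallel to the floor: a pixel below the horizon maps to a unique floor point by similar triangles. The function, the `cam` dictionary, and all numeric values below are assumptions for illustration; lens distortion and camera tilt are ignored.

```python
def image_point_to_floor(u, v, cam):
    """Project a touched pixel (u, v) onto the floor plane, assuming a
    forward-facing pinhole camera mounted parallel to the floor.

    cam: dict with focal length "f" (pixels), principal point
    ("u0", "v0") and mounting height "h" (metres). Only pixels below
    the horizon row v0 correspond to floor points. Returns
    (forward, lateral) offsets from the camera, in metres.
    """
    f, u0, v0, h = cam["f"], cam["u0"], cam["v0"], cam["h"]
    if v <= v0:
        raise ValueError("point is above the horizon; not on the floor")
    forward = f * h / (v - v0)        # similar triangles on the floor plane
    lateral = forward * (u - u0) / f  # same scale applied horizontally
    return forward, lateral
```

The resulting (forward, lateral) offset would then be rotated and translated by the body's estimated pose to obtain the target travel position on the map's coordinate system.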
Next, the CPU 61 calculates a travel route TR linking the self-position of the main body case 20 estimated by the image processing processor 63 with the calculated target travel position. As shown in fig. 6A, the travel route TR is set as a straight line connecting the self-position P1 and the target travel position P2, enabling travel over the shortest distance in the shortest time.
Here, the user may not be able to tell, from the image data G displayed on the display 73 of the external device 17, whether an obstacle lies on the travel route set by the CPU 61 toward the designated position. Therefore, when the calculated travel route TR has an obstacle O on it, the CPU 61 controls the motors 33 so that the main body case 20 approaches the obstacle O along the travel route TR to a position P3 a predetermined distance away, then performs a predetermined avoidance operation, and, as shown in fig. 6B, calculates a new travel route TR1 from the resulting position P4 to the target travel position P2. In approaching the position P3 at the predetermined distance from the obstacle, the distance between the main body case 20 and the obstacle may be detected either from the image captured by the camera 51 or from detection by the sensor unit 23. As the avoidance operation, for example, the main body case can rotate by a predetermined angle, such as 90°, to the right or left of the traveling direction and then travel a predetermined distance, such as 50 cm.
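The avoidance operation just described (turn by a fixed angle, advance a fixed distance, then re-plan from the new pose) can be sketched as a pose update. The `avoidance_step` helper and the pose tuple layout are assumptions for illustration; the 90° and 50 cm defaults are the example values given in the text.

```python
import math

def avoidance_step(pose, turn_deg=90.0, advance_m=0.5):
    """One avoidance maneuver: rotate by a fixed angle (e.g. 90 deg to
    the left of the traveling direction) and advance a fixed distance
    (e.g. 50 cm = 0.5 m).

    pose: (x, y, heading_rad). Returns the new pose P4, from which a
    new straight route to the target travel position can be computed.
    """
    x, y, heading = pose
    heading += math.radians(turn_deg)
    x += advance_m * math.cos(heading)
    y += advance_m * math.sin(heading)
    return (x, y, heading)
```

A right-hand avoidance is simply `avoidance_step(pose, turn_deg=-90.0)`; a planner could alternate sides when one direction stays blocked.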
However, if such calculation of a new travel route, travel control along it, and detection of an obstacle are repeated several times, the target travel position may not be reached for a long time. Therefore, it is preferable to set an upper limit on the number of obstacle detections in advance, for example 5 times, and to interrupt the travel of the autonomous traveling body 11 when the cycle of calculating a new travel route, traveling along it, and detecting an obstacle has been repeated the predetermined upper limit number of times or more. The CPU61 preferably controls the operation of the motor 33 so that the autonomous traveling body 11 whose travel has been interrupted travels back along the travel path in the reverse direction and returns to the start point. In this case, the autonomous traveling body 11 may capture an image with at least one of the cameras 51 before reversing along the travel route and transmit the image data from the communication unit 25 to the external device 17, or may transmit such image data whenever it fails to reach the target travel position. From the captured image, the user can grasp the position at which the autonomous traveling body 11 was prevented from traveling.
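The retry logic, with its upper limit on obstacle detections, can be sketched as a loop over plan/follow/avoid callbacks. The callback interface is a hypothetical simplification for illustration, not the patent's actual control flow:

```python
def drive_to_target(plan_route, follow_until_blocked, avoid, max_detections=5):
    """Travel loop: follow the route, and on each obstacle detection perform
    the avoidance and replan; give up after a preset number of detections
    (e.g. 5) and report the path travelled so that the caller can reverse
    along it back to the start point.

    `follow_until_blocked(route)` returns (reached, stop_position)."""
    route = plan_route()
    path = [route[0]]                # record the path for the reverse return
    for _ in range(max_detections):
        reached, pos = follow_until_blocked(route)
        path.append(pos)
        if reached:
            return True, path
        route = avoid(pos)           # replan from the avoidance position
    return False, path               # interrupted: reverse along `path`
```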
Then, the autonomous traveling body 11 stops when it has traveled to the target traveling position and performs necessary operations in accordance with the user's input. For example, the cleaning unit 22 can perform fixed-point cleaning of a predetermined area around the target traveling position, or an image can be captured with at least one of the cameras 51 at the target traveling position. The captured image can be transmitted from the communication unit 25 to the external device 17 and displayed on the display 73. When an image is captured, a panoramic image of the periphery of the main body case 20 can also be captured. In that case, the CPU61 controls the operation of the motor 33 to capture an image, for example a still image, each time the main body case 20 rotates by an angle equal to or less than the horizontal angle of view of the camera 51, preferably half that angle or less, and the image processing processor 63 joins the data of these images at their overlapping positions to form a panoramic image over a predetermined angle range.
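The rotation schedule for the panorama, capturing a still every half horizontal angle of view or less so that adjacent frames overlap enough to stitch, can be sketched as follows (the names and the 360° default are illustrative):

```python
def panorama_headings(hfov_deg, span_deg=360.0):
    """Capture headings for a panorama: rotate in steps of at most half the
    camera's horizontal angle of view so that adjacent stills overlap by at
    least 50% and can be joined where their contents coincide."""
    step = hfov_deg / 2.0
    headings, a = [], 0.0
    while a < span_deg:
        headings.append(a % 360.0)
        a += step
    return headings
```

With a 60° horizontal angle of view, for example, the robot would stop every 30° and capture twelve stills for a full 360° panorama.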
When the position to which the user desires to guide the autonomous traveling body 11 is not captured in the image data G currently displayed on the display 73, the user can slide the image data G displayed on the display 73 in the left-right direction by touching the instruction icons AL and AR. When the user touches either of the instruction icons AL and AR, an instruction signal corresponding to the touched icon is transmitted from the external device 17 to the autonomous traveling body 11. The CPU61 controls the operation of the motor 33 so that the autonomous traveling body 11 that has received the instruction signal via the communication unit 25 travels in the left or right direction indicated by the instruction signal, or rotates by a predetermined angle, and re-captures an image with at least one of the cameras 51 at that position. The data of the captured image is transmitted from the communication unit 25 to the external device 17, and the external device 17 acquires the transmitted image data and switches the image data G displayed on the display 73 as shown in fig. 7A to 7C. By switching the displayed image data G in this sliding manner, the user can find the desired guidance position from the image data G.
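The icon handling can be sketched as a small dispatch from the touched icon to a new capture heading; the turn angle and icon codes below are assumptions for illustration:

```python
def handle_slide(icon, heading, turn_deg=30.0):
    """Map a touched instruction icon (AL = slide left, AR = slide right)
    to the new heading at which the robot re-captures an image.
    Headings are in degrees, normalised to [0, 360)."""
    if icon == "AL":
        return (heading + turn_deg) % 360.0
    if icon == "AR":
        return (heading - turn_deg) % 360.0
    raise ValueError("unknown icon: %r" % (icon,))
```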
According to the above-described embodiment, the communication unit 25 receives from the external device 17 the coordinate data corresponding to the position indicated by the user on the image data G displayed on the external device 17, and the CPU61 sets the target traveling position indicated by the coordinate data, based on the map data of the traveling area of the main body case 20 and the received coordinate data, and causes the main body case 20 to travel to that position. The user can therefore intuitively guide the autonomous traveling body 11 to a desired position simply by looking at the image data G displayed on the external device 17 and specifying the position. The operation is thus very simple, and the arrival of the autonomous traveling body 11 can be confirmed from the image data G.
Since the image data G displayed on the external device 17 is based on image data captured by the camera 51 from the current position of the main body case 20, the user can observe the current state of the travel area from the image data G, easily estimate the position of the main body case 20, and more intuitively indicate the position to which the autonomous traveling body 11 should be guided.
The CPU61 calculates a travel path linking the self position of the main body case 20 and the target travel position, and performs travel control so that the main body case 20 travels along the travel path, the self position in the travel area being estimated based on the image data captured by the camera 51. This enables the autonomous traveling body 11 to travel easily and reliably to the target travel position.
At this time, when an obstacle is detected on the travel route, the CPU61 performs travel control so as to avoid the obstacle and calculates a new travel route from the avoidance position to the target travel position, so that even if an obstacle lies on the route to the target travel position specified by the user, the autonomous traveling body 11 can continue traveling to the target travel position without stopping.
Further, by interrupting the travel control of the main body case 20 when the calculation of a new travel route, travel control along the new route, and detection of an obstacle have been repeated more than a predetermined number of times, the CPU61 can prevent the autonomous traveling body 11 from endlessly attempting to reach the target traveling position in situations where reaching it would take a long time, such as when a door between the current position of the main body case 20 and the target traveling position is closed.
When the main body case 20 travels to the target travel position and the cleaning unit 22 cleans a predetermined area around it, the user can easily have a desired position cleaned, even from outside the house.
Further, when the main body case 20 travels to the target travel position and the camera 51 captures an image there, the autonomous traveling body 11 can be used for monitoring the home while the user is away, watching over a pet, and the like.
In this case, by capturing a panoramic image of the periphery of the main body case 20 with the camera 51, the user can visually check a wide range of the traveling area from the image data G, enabling more effective monitoring.
Specifically, the CPU61 performs travel control so that the main body case 20 rotates at the target travel position, and the panoramic image is produced by combining, using a known method, a plurality of images captured by the camera 51 during the rotation of the main body case 20.
In addition, when the desired position does not appear in the image data G displayed on the display 73 of the external device 17, the user transmits a predetermined instruction signal from the external device 17; when the instruction signal is received by the communication unit 25, the CPU61 captures an image with the camera 51 at a position where the main body case 20 has moved or rotated by a predetermined angle, so that the user can easily find the desired position from the image data G.
In the above-described embodiment, the image processing processor 63 integrally includes the functions of the image input unit, the image processing unit, the self-position estimating unit, the map creating unit, and the like, but these functions may instead be provided as separate processing units, or as a processing unit integrally including any plural of them.
An embodiment of the present invention has been described, but this embodiment is presented as an example only and is not intended to limit the scope of the invention. This novel embodiment can be implemented in various other forms, and various omissions, substitutions, and changes can be made without departing from the spirit of the invention. These embodiments and their modifications are included in the scope and gist of the invention, and in the invention described in the claims and their equivalents.

Claims (11)

1. An autonomous traveling body, comprising:
a body capable of self-advancing;
a camera provided to the main body;
an image input unit that acquires image data captured by the camera;
a communication unit capable of communicating with an external device having a display function capable of displaying image data, transmitting the image data captured by the camera to the external device, and receiving a signal from the external device; and
a travel control unit that performs travel control on the main body,
the communication unit receives, from the external apparatus, coordinate data on image data corresponding to a position indicated by a user among the image data captured by the camera and displayed on the external apparatus,
the travel control means sets a target travel position on the map data indicated by the coordinate data based on the map data of the place where the subject travels and the coordinate data received by the communication means, and causes the subject to travel to the target travel position.
2. The autonomous running body according to claim 1,
the image data displayed on the external device is image data obtained based on image data captured by the camera from the current position of the subject.
3. The autonomous running body according to claim 1 or 2,
further comprising a self-position estimating unit that calculates a self position of the subject in a traveling place based on image data captured by the camera,
the travel control means calculates a travel route that connects the self position estimated by the self position estimation means and the target travel position, and performs travel control so that the main body travels along the travel route.
4. The autonomous running body according to claim 3,
when an obstacle is detected in the travel route, the travel control means calculates a new travel route that avoids the obstacle and that extends from a position where the obstacle is avoided to the target travel position.
5. The autonomous running body according to claim 4,
the travel control means interrupts the travel control of the main body when the calculation of a new travel route, the travel control along the new travel route, and the detection of an obstacle are repeated a predetermined number of times or more.
6. The autonomous running body according to claim 1 or 2,
further comprising a cleaning unit that cleans a traveling surface,
when the main body travels to a target travel position, a predetermined area around the target travel position is cleaned by the cleaning unit.
7. The autonomous running body according to claim 1 or 2,
the camera captures an image at the target travel position when the subject travels to that position.
8. The autonomous running body according to claim 7,
the camera performs panoramic photographing around the subject.
9. The autonomous running body according to claim 8,
the travel control unit rotates the main body at the target travel position,
the camera performs panoramic photographing by synthesizing a plurality of images photographed while the subject is rotating.
10. The autonomous running body according to claim 1 or 2,
the travel control unit causes the subject to travel or to rotate by a predetermined angle when a predetermined instruction signal is received from the external device via the communication unit,
the camera captures an image at the position to which the subject has traveled or rotated by the predetermined angle.
11. An autonomous traveling system, comprising:
an autonomous traveling body including a body capable of traveling by itself, a camera provided in the body, an image input unit for acquiring image data captured by the camera, a communication unit, and a traveling control unit for controlling traveling of the body; and
an external device having a function of displaying image data and a function of indicating a position on the displayed image data,
the communication unit is capable of communicating with the external device, transmitting the image data captured by the camera to the external device, receiving a signal from the external device, and receiving from the external device coordinate data on image data corresponding to a position indicated by a user among the image data captured by the camera and displayed on the external device,
the travel control means sets a target travel position on the map data indicated by the coordinate data based on the map data of the place where the subject travels and the coordinate data received by the communication means, and causes the subject to travel to the target travel position.

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2017-244099 2017-12-20
JP2017244099A JP2019109853A (en) 2017-12-20 2017-12-20 Autonomous vehicle and autonomous vehicle system

Publications (2)

Publication Number Publication Date
CN110025260A CN110025260A (en) 2019-07-19
CN110025260B true CN110025260B (en) 2021-09-14

Family

ID=67179953

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201811554782.8A Active CN110025260B (en) 2017-12-20 2018-12-19 Autonomous traveling body and autonomous traveling body system

Country Status (2)

Country Link
JP (1) JP2019109853A (en)
CN (1) CN110025260B (en)


Also Published As

Publication number Publication date
JP2019109853A (en) 2019-07-04
CN110025260A (en) 2019-07-19

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant