WO2011074165A1 - Autonomous Mobile Device - Google Patents
Autonomous Mobile Device
- Publication number
- WO2011074165A1 (PCT/JP2010/006265)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- obstacle
- map
- mobile device
- area
- autonomous mobile
- Prior art date
Classifications
- G05D1/0214—Control of position or course in two dimensions specially adapted to land vehicles with means for defining a desired trajectory in accordance with safety or protection criteria, e.g. avoiding hazardous areas
- G05D1/0238—Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means using obstacle or wall sensors
- G05D1/024—Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means using obstacle or wall sensors in combination with a laser
- G05D1/0274—Control of position or course in two dimensions specially adapted to land vehicles using internal positioning means using mapping information stored in a memory device
- G09B29/00—Maps; Plans; Charts; Diagrams, e.g. route diagram
- G09B29/007—Representation of non-cartographic information on maps, e.g. population distribution, wind direction, radiation levels, air and sea routes, using computer methods
- G09B29/10—Map spot or coordinate position indicators; Map reading aids
- Y10S901/01—Mobile robot
- Y10S901/46—Sensing device
Definitions
- The present invention relates to an autonomous mobile device that autonomously moves to a destination.
- The areas an autonomous mobile device should avoid entering are not necessarily limited to areas where obstacles exist.
- Such areas include areas with steps, such as stairs, where the autonomous mobile device cannot move, and areas that are dangerous to move through, such as a patient's treatment room. In these cases, the device must be controlled so that it does not enter the area even though no obstacle is present there.
- The autonomous mobile device described in Patent Document 1 cannot detect an area whose entry should be avoided if no obstacle is present in it. Consequently, it cannot autonomously avoid such obstacle-free areas while moving.
- The present invention has been made to solve the above problems, and its object is to provide an autonomous mobile device that can move while autonomously avoiding areas whose entry should be avoided even when no obstacle exists in them.
- An autonomous mobile device according to the present invention includes: an obstacle sensor that acquires surrounding obstacle information; storage means for storing an environment map indicating an obstacle area where an obstacle exists and an entry prohibition area map indicating an entry prohibition area where entry is prohibited; estimation means for estimating the self-position using the obstacle information acquired by the obstacle sensor and the environment map; and control means for controlling autonomous movement based on the self-position estimated by the estimation means, the environment map, and the entry prohibition area map.
- The entry prohibition area map, which indicates the entry prohibition area the autonomous mobile device must not enter, is stored by the storage means.
- The self-position on the environment map is estimated by the estimation means.
- Autonomous movement is controlled by the control means based on the estimated self-position, the environment map, and the entry prohibition area map. Accordingly, the device can move while estimating its self-position and avoiding both the obstacle area and the entry prohibition area. In other words, even where no obstacle exists, the device can autonomously avoid areas whose entry should be avoided.
- Preferably, the autonomous mobile device includes reception means that receives an operation for setting the entry prohibition area, and the storage means stores an entry prohibition area map set based on the operation received by the reception means. In this case, the user can set the entry prohibition area arbitrarily.
- Preferably, the autonomous mobile device further includes synthesis means that generates a composite map in which the environment map and the entry prohibition area map are combined.
- Preferably, the synthesis means generates a composite map in which, for each mutually corresponding pixel of the environment map and the entry prohibition area map, both the obstacle area indicated by the environment map and the entry prohibition area indicated by the entry prohibition area map are reflected.
- Preferably, the autonomous mobile device further includes calculation means that calculates avoidance information based on the obstacle area and the entry prohibition area, using the self-position estimated by the estimation means and the composite map generated by the synthesis means.
- Preferably, the control means performs control for avoiding an obstacle using the avoidance information calculated by the calculation means and the obstacle information acquired by the obstacle sensor.
- In this configuration, the avoidance information based on the obstacle area and the avoidance information based on the entry prohibition area in the composite map are calculated from the self-position. Thus, even when obstacle information cannot be acquired by the obstacle sensor, avoidance information based on the obstacle area can still be obtained.
- Preferably, the calculation means generates a virtual sensor output by calculating avoidance information that is compatible with the output format of the obstacle information from the obstacle sensor.
- In this case, the avoidance information obtained by the calculation means can be processed with the same algorithms as the obstacle information obtained by the obstacle sensor.
- Preferably, the autonomous mobile device further includes integration means that integrates the obstacle information from the obstacle sensor and the avoidance information from the calculation means, and the control means performs control for avoiding obstacles using the obstacle information and avoidance information integrated by the integration means. In this case, even when the number of obstacle sensors changes, the control means still receives a single integrated stream of obstacle information, so software changes in the control means can be minimized.
- The obstacle sensor is preferably a laser range finder.
- FIG. 1 is a diagram for explaining the configuration of the autonomous mobile device 1 according to the present embodiment.
- The autonomous mobile device 1 is a device that autonomously moves to a destination while avoiding obstacles, such as people and objects, and set entry prohibition areas.
- The autonomous mobile device 1 includes a hollow cylindrical main body 10 made of metal, four omni wheels 11 provided on the lower side of the main body 10, and four electric motors 12 that drive the omni wheels 11.
- The autonomous mobile device 1 can move in any direction on the movement surface by having each electric motor 12 individually adjust the rotation direction and rotation speed of its omni wheel 11.
- The autonomous mobile device 1 further includes a laser range finder 13, a stereo camera 14, a touch screen 15, and an electronic control unit 20.
- The laser range finder 13 is a sensor that acquires obstacle information around the device and functions as the obstacle sensor described in the claims.
- The laser range finder 13 is attached to the front surface of the main body 10; it emits a laser in a horizontal fan shape and measures, at each emission angle, the propagation time of the reflected wave returned by an obstacle. The emission angle and the distance calculated from the propagation time of the reflected wave constitute the obstacle information output by the laser range finder 13.
- The stereo camera 14 calculates the distance and angle from the device to an obstacle from stereo images, based on the principle of triangulation. This distance and angle constitute the obstacle information output by the stereo camera 14.
- The touch screen 15 is an input device consisting of a liquid crystal display and a touch panel. When the user performs a touch operation on information displayed on the liquid crystal display, the touch panel detects the operation and the user's input is accepted.
- The electronic control unit 20 receives the obstacle information output from the laser range finder 13 and the stereo camera 14 and controls autonomous movement. To this end, the electronic control unit 20 comprises a microprocessor that performs calculations, a ROM storing the programs the microprocessor executes, a RAM that temporarily stores data such as calculation results, and a backup RAM, hard disk, or the like that retains stored contents.
- FIG. 2 is a block diagram showing a functional configuration of the electronic control unit 20.
- The electronic control unit 20 includes a storage unit 21, a setting unit 22, a synthesis unit 23, a route planning unit 24, a self-position estimation unit 25, a map sensor 26, a sensor data integration unit 27, and a movement control unit 28.
- The storage unit 21 consists of backup RAM or the like and stores an environment map 211 and an entry prohibition area map 212. That is, the storage unit 21 functions as the storage means described in the claims.
- The environment map 211 and the entry prohibition area map 212 are stored in different layers, so that a change to one does not affect the other.
- The environment map 211 is a map showing the obstacle area 31 where obstacles exist.
- A white area indicates the obstacle area 31 where obstacles exist,
- and a gray area indicates the area 32 where no obstacle exists.
- The obstacle area 31 shown in the environment map 211 is an area occupied by static, non-moving obstacles, for example walls or furniture.
- The obstacle area 31 is detected in advance by the laser range finder 13 and/or the stereo camera 14.
- The obstacle area 31 can also be created by adding data such as walls and furniture to CAD data of the building in which the autonomous mobile device 1 moves.
- The entry prohibition area map 212 is a map showing the entry prohibition area 33.
- A light gray area indicates the entry prohibition area 33,
- and a dark gray area indicates the area 34 other than the entry prohibition area.
- The entry prohibition area 33 is an area set so that the autonomous mobile device 1 is prohibited from entering it. The laser range finder 13 and the stereo camera 14 may be able to detect an obstacle inside the entry prohibition area 33, but they cannot detect the entry prohibition area 33 itself.
- The entry prohibition area 33 can be set arbitrarily by the user.
- For example, setting the entry prohibition area 33 in front of a staircase or step that the autonomous mobile device 1 cannot traverse prevents the device from entering the staircase or step.
- An area that is dangerous for the autonomous mobile device 1 to move through, such as a hospital treatment room, can also be set as the entry prohibition area 33.
- Likewise, an area where an obstacle exists but cannot be detected, or is difficult to detect, by the laser range finder 13 or the stereo camera 14 may be set as the entry prohibition area 33.
- The entry prohibition area 33 can be set arbitrarily by the user via the touch screen 15. Specifically, when the user designates an entry prohibition area by a touch operation on the environment map 211 displayed on the touch screen 15, the touch screen 15 receives the user's touch operation. That is, the touch screen 15 functions as the reception means described in the claims. The touch screen 15 outputs the received touch operation information to the setting unit 22.
- Based on the touch operation information output from the touch screen 15, the setting unit 22 sets the designated entry prohibition area on the entry prohibition area map 212 and outputs the updated entry prohibition area map 212 to the storage unit 21. In this way, the entry prohibition area set by the user is reflected in the entry prohibition area map 212 in the storage unit 21.
- The synthesis unit 23 combines the environment map 211 and the entry prohibition area map 212 to generate a composite map 213. That is, the synthesis unit 23 functions as the synthesis means described in the claims.
- In the composite map 213, for each mutually corresponding pixel of the environment map 211 and the entry prohibition area map 212, both the obstacle area 31 indicated by the environment map 211 and the entry prohibition area 33 indicated by the entry prohibition area map 212 are reflected.
- Here, a pixel refers to each rectangular cell of the grid in FIGS. 3A to 3C.
- The white area 35 in FIG. 3C reflects the obstacle area 31 of the environment map 211,
- the light gray area 36 reflects the entry prohibition area 33 of the entry prohibition area map 212,
- and the dark gray area 37 is the area other than the obstacle area 31 and the entry prohibition area 33.
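The per-pixel combination described above can be sketched as follows. This is an illustrative Python sketch, not the patent's implementation: the grid representation and the cell values (0 = free, 1 = obstacle, 2 = entry-prohibited) are assumptions.

```python
# Cell values are assumptions: 0 = free, 1 = obstacle cell of the environment
# map, 2 = entry-prohibited cell of the entry prohibition area map.
FREE, OBSTACLE, NO_ENTRY = 0, 1, 2

def synthesize(environment_map, no_entry_map):
    """Combine the two map layers cell by cell: a cell is marked in the
    composite map if it is marked in either layer (obstacle takes priority)."""
    composite = []
    for env_row, ban_row in zip(environment_map, no_entry_map):
        row = []
        for env_cell, ban_cell in zip(env_row, ban_row):
            if env_cell == OBSTACLE:
                row.append(OBSTACLE)
            elif ban_cell == NO_ENTRY:
                row.append(NO_ENTRY)
            else:
                row.append(FREE)
        composite.append(row)
    return composite

env = [[1, 0, 0],
       [0, 0, 0]]
ban = [[0, 0, 2],
       [0, 2, 0]]
print(synthesize(env, ban))  # [[1, 0, 2], [0, 2, 0]]
```

Because the layers are kept separate and combined on demand, editing the entry prohibition layer never alters the obstacle layer, mirroring the layered storage described above.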
- The route planning unit 24 plans a route to the destination using the composite map 213 generated by the synthesis unit 23. That is, the route planning unit 24 functions as the planning means described in the claims.
- Because it uses the composite map 213, the route planning unit 24 plans the route so as to avoid both the obstacle area 31 from the environment map 211 and the entry prohibition area 33 from the entry prohibition area map 212.
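Planning over the composite map can be illustrated with a minimal grid search. Breadth-first search here is only a stand-in, since the patent does not specify which planner the route planning unit 24 uses; the grid encoding (0 = free, any non-zero value = obstacle or entry-prohibited) is likewise an assumption.

```python
from collections import deque

def plan_route(composite, start, goal):
    """Return a shortest list of (row, col) cells from start to goal that
    never enters a marked cell of the composite map, or None if unreachable."""
    h, w = len(composite), len(composite[0])
    parents = {start: None}
    queue = deque([start])
    while queue:
        cell = queue.popleft()
        if cell == goal:                      # reconstruct the path
            path = []
            while cell is not None:
                path.append(cell)
                cell = parents[cell]
            return path[::-1]
        r, c = cell
        for nr, nc in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            if (0 <= nr < h and 0 <= nc < w and composite[nr][nc] == 0
                    and (nr, nc) not in parents):
                parents[(nr, nc)] = cell
                queue.append((nr, nc))
    return None

grid = [[0, 1, 0],
        [0, 2, 0],
        [0, 0, 0]]   # 1 = obstacle cell, 2 = entry-prohibited cell
print(plan_route(grid, (0, 0), (0, 2)))
```

Note that the planner treats obstacle and entry-prohibited cells identically: both are simply non-traversable, which is exactly the property the composite map provides.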
- The self-position estimation unit 25 estimates the self-position using the obstacle information output from the laser range finder 13 and the environment map 211. That is, the self-position estimation unit 25 functions as the estimation means described in the claims.
- A method for estimating the self-position will be described with reference to FIGS. 4A and 4B.
- FIGS. 4A and 4B are diagrams for explaining how the self-position estimation unit 25 estimates the self-position.
- The five arrows in FIG. 4A correspond to the five pieces of obstacle information 41 output from the laser range finder 13.
- Each arrow depicts the emission angle and distance contained in the corresponding obstacle information 41.
- The self-position estimation unit 25 searches the environment map 211 for the coordinates at which the obstacle area 31 best matches the obstacle information 41 output from the laser range finder 13, and takes the coordinates with the highest degree of coincidence as the estimated self-position.
- The self-position estimation unit 25 also estimates the heading of the device on the environment map 211. For example, it estimates the heading from the rotation amounts of the omni wheels 11 and the emission angles contained in the obstacle information 41.
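The matching step can be sketched as a search over candidate poses, scoring each by how many beam endpoints land on obstacle cells. The toy map, the candidate-pose list, and the scoring rule are all illustrative assumptions; the patent does not specify the matching algorithm.

```python
import math

# Toy occupancy grid standing in for the environment map 211 (1 = obstacle).
# One cell = 1 m; all values here are illustrative assumptions.
GRID = [[1, 1, 1, 1],
        [1, 0, 0, 1],
        [1, 0, 0, 1],
        [1, 1, 1, 1]]

def match_score(pose, scan):
    """Count beams of the scan whose endpoints, projected from the candidate
    pose (x, y, heading), fall on obstacle cells of the map."""
    x, y, th = pose
    score = 0
    for angle, dist in scan:
        ex = x + dist * math.cos(th + angle)
        ey = y + dist * math.sin(th + angle)
        ix, iy = int(round(ex)), int(round(ey))
        if 0 <= iy < len(GRID) and 0 <= ix < len(GRID[0]) and GRID[iy][ix] == 1:
            score += 1
    return score

def estimate_pose(scan, candidates):
    """Return the candidate pose with the highest degree of coincidence."""
    return max(candidates, key=lambda p: match_score(p, scan))

# Three beams (emission angle, distance), consistent with standing at (2, 2).
scan = [(0.0, 1.0), (math.pi / 2, 1.0), (math.pi, 2.0)]
print(estimate_pose(scan, [(1, 1, 0.0), (2, 2, 0.0), (2, 1, 0.0)]))
```

In a real system the candidate set would come from odometry plus a local search rather than a fixed list, but the "highest degree of coincidence wins" rule is the same.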
- For self-position estimation, the environment map 211 is used rather than the composite map 213. If the self-position were estimated using the composite map 213, the obstacle area 31, which the laser range finder 13 can detect, could not be distinguished from the entry prohibition area 33, which it cannot detect, and the likelihood of estimation errors would increase.
- The autonomous mobile device 1 therefore uses the environment map 211 when estimating its self-position,
- while for planning and avoidance it uses the composite map 213 obtained by combining the environment map 211 and the entry prohibition area map 212. For this reason, in the autonomous mobile device 1, the storage unit 21 stores the environment map 211 and the entry prohibition area map 212 in different layers, and the synthesis unit 23 combines them to generate the composite map 213.
- The map sensor 26 calculates virtual obstacle information corresponding to the obstacle area 31 and the entry prohibition area 33, using the estimated self-position and the composite map 213.
- The virtual obstacle information is avoidance information that is generated from the obstacle area 31 and the entry prohibition area 33 in the composite map 213 and is used for avoidance control. That is, the map sensor 26 functions as the calculation means described in the claims.
- The map sensor 26 will be described with reference to FIG. 5.
- The arrows shown in FIG. 5 indicate the virtual obstacle information 43 calculated by the map sensor 26.
- The map sensor 26 projects the estimated self-position 50 onto the composite map 213 and calculates a virtual sensor output as the obstacle information 43.
- The virtual sensor output is the output that a virtual sensor located at the self-position 50 would produce if the area 35 corresponding to the obstacle area 31 and the area 36 corresponding to the entry prohibition area 33 were regarded as areas where obstacles exist. That is, the obstacle information 43 is not information obtained by detecting an actual obstacle, but virtual obstacle information generated by calculation.
- The virtual obstacle information 43 has the same data format as the actual obstacle information output from the laser range finder 13 and the stereo camera 14.
- FIG. 6 is a table showing the data format of the virtual obstacle information 43.
- The virtual obstacle information 43 consists of an angle number (for example, 100) and the distance from the device to the obstacle (for example, 2525 mm). The angle number corresponds to the emission angle in the obstacle information output from the laser range finder 13.
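The map sensor's calculation can be sketched as ray casting over the composite map, emitting (angle number, distance in mm) records in the range-finder-compatible format of FIG. 6. The grid resolution, ray step, beam count, and maximum range below are assumed values, not taken from the patent.

```python
import math

# 0 = free; any non-zero cell (obstacle area or entry prohibition area of the
# composite map) blocks the ray, as both are treated as obstacles here.
COMPOSITE = [[0, 0, 1],
             [0, 0, 1],
             [0, 0, 1]]
CELL_MM = 1000          # assumed grid resolution: 1 cell = 1000 mm
MAX_RANGE_MM = 5000     # assumed maximum virtual sensor range

def virtual_scan(x, y, heading, n_beams=8):
    """Cast n_beams rays from the estimated pose (x, y in cells, heading in
    radians) and emit (angle_number, distance_mm) records, mimicking the
    output format of the laser range finder."""
    records = []
    for angle_no in range(n_beams):
        a = heading + angle_no * 2 * math.pi / n_beams
        d = 0
        while d < MAX_RANGE_MM:
            d += CELL_MM // 10                      # step the ray in 100 mm increments
            ix = int(round(x + d * math.cos(a) / CELL_MM))
            iy = int(round(y + d * math.sin(a) / CELL_MM))
            if not (0 <= iy < len(COMPOSITE) and 0 <= ix < len(COMPOSITE[0])):
                break                               # ray left the map: no return
            if COMPOSITE[iy][ix]:
                records.append((angle_no, d))       # virtual "hit" in LRF format
                break
    return records

print(virtual_scan(0, 0, 0.0, n_beams=4))
```

Because the records share the range finder's format, downstream code cannot tell a virtual reading from a real one, which is precisely the point of the virtual sensor output.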
- The sensor data integration unit 27 integrates the actual obstacle information acquired by the laser range finder 13 and the stereo camera 14 with the virtual obstacle information 43 calculated by the map sensor 26. That is, the sensor data integration unit 27 functions as the integration means described in the claims.
- For example, the distance at which obstacle detection is guaranteed may be 5 m for the laser range finder 13 and 10 m for the stereo camera 14.
- In such a case, the sensor data integration unit 27 restricts the input actual obstacle information to readings within 5 m, for example by deleting readings whose distance exceeds 5 m.
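The range-clipping integration described above can be sketched as follows. The record format and the 5 m threshold follow the example in the text; everything else (names, the sample readings) is an assumption.

```python
GUARANTEED_RANGE_MM = 5000  # range within which detection is guaranteed (5 m)

def integrate(real_records, virtual_records):
    """Each record is (angle_number, distance_mm). Drop real readings beyond
    the guaranteed range, then pool them with the map sensor's virtual
    readings into one stream for the movement control unit."""
    clipped = [(a, d) for a, d in real_records if d <= GUARANTEED_RANGE_MM]
    return sorted(clipped + virtual_records)

real = [(0, 1200), (1, 7800), (2, 4300)]   # (1, 7800) exceeds 5 m and is dropped
virtual = [(1, 2500), (3, 3100)]
print(integrate(real, virtual))
# [(0, 1200), (1, 2500), (2, 4300), (3, 3100)]
```

Since the controller only ever sees the merged list, adding or removing a sensor changes this function, not the avoidance logic, which matches the modularity argument made below.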
- FIG. 7A shows the actual obstacle information 42 acquired by the laser range finder 13 at a certain moment.
- A rectangular area 39 indicated by a broken line lies within the obstacle area 31 but was not detected by the laser range finder 13. Depending on the color or material of an obstacle, or on the reflection angle of the laser, the laser range finder 13 may fail to receive the reflected wave. As a result, so-called flicker can occur: an obstacle is detected at one moment and not detected at the next.
- The virtual obstacle information 43 from the map sensor 26, in contrast, is calculated from the area 35 corresponding to the obstacle area 31 stored in advance. It therefore includes virtual obstacle information 43 for the area 39 that the laser range finder 13 could not detect.
- In this way, obstacle information that could not be acquired by the laser range finder 13 or the stereo camera 14 can be complemented. This prevents so-called flicker and yields stable obstacle information.
- The movement control unit 28 controls the electric motors 12 so that the device moves along the route planned by the route planning unit 24, based on the self-position estimated by the self-position estimation unit 25.
- The movement control unit 28 includes an obstacle avoidance unit 29.
- When an obstacle is detected on the way to the destination along the route planned by the route planning unit 24, the obstacle avoidance unit 29 uses the obstacle information output from the sensor data integration unit 27 to compute the interference between the device and the obstacle and performs control for avoiding the obstacle.
- The control for avoiding the obstacle includes stopping so as not to contact the obstacle, detouring around it, and the like.
- FIG. 8 is a flowchart showing the processing procedure of obstacle avoidance control by the autonomous mobile device 1. This obstacle avoidance control is executed by the electronic control unit 20 while the autonomous mobile device 1 autonomously moves to the destination along the planned route.
- In step S101, the actual obstacle information acquired by the laser range finder 13 and the stereo camera 14 is read.
- In step S102, the self-position is estimated by the self-position estimation unit 25 from the environment map 211 and the actual obstacle information from the laser range finder 13. The self-position estimation method is as described above, so a detailed description is omitted here.
- In step S103, the map sensor 26 calculates the virtual obstacle information 43, a virtual sensor output based on the self-position, using the composite map 213 in which the environment map 211 and the entry prohibition area map 212 are combined. That is, the sensor output that a virtual sensor located at the self-position 50 would produce is generated by regarding the area 35 corresponding to the obstacle area 31 and the area 36 corresponding to the entry prohibition area 33 as areas where obstacles exist.
- In step S104, the actual obstacle information read in step S101 and the virtual obstacle information 43 calculated in step S103 are integrated by the sensor data integration unit 27. That is, the obstacle information acquired by the laser range finder 13 and the stereo camera 14 is integrated with the virtual obstacle information 43 corresponding to the obstacle area 31 and the entry prohibition area 33 produced by the map sensor 26.
- In step S105, the movement control unit 28 performs control for avoiding obstacles based on the obstacle information integrated in step S104.
- As a result, the autonomous mobile device 1 can move while avoiding the obstacle area 31 on the environment map 211, the entry prohibition area 33 on the entry prohibition area map 212, and dynamic obstacles detected by the laser range finder 13 and the stereo camera 14 during movement.
- The process shown in FIG. 8 is repeated every control period of the autonomous mobile device 1.
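The control cycle of FIG. 8 can be summarized in code. All functions below are stand-in stubs with illustrative return values; only the S101-S105 structure follows the flowchart.

```python
# Minimal sketch of one control cycle (steps S101-S105). Record format is
# (angle_number, distance_mm); every name and threshold is an assumption.

def read_real_obstacles():            # S101: read LRF / stereo camera
    return [(0, 1200), (1, 4300)]

def estimate_self_position(real):     # S102: match scan against environment map
    return (2.0, 3.0, 0.0)

def virtual_obstacles(pose):          # S103: map sensor output from composite map
    return [(2, 2500)]

def integrate(real, virtual):         # S104: pool real and virtual readings
    return sorted(real + virtual)

def avoid(pose, obstacles):           # S105: stop or keep following the route
    nearest = min(d for _, d in obstacles)
    return "stop" if nearest < 500 else "follow_route"

def control_cycle():
    real = read_real_obstacles()
    pose = estimate_self_position(real)
    merged = integrate(real, virtual_obstacles(pose))
    return avoid(pose, merged)

print(control_cycle())
```

In the actual device this cycle would run once per control period, with S105 additionally choosing between stopping and detouring rather than a single threshold test.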
- As described above, the storage unit 21 stores, in addition to the environment map 211 indicating the obstacle area 31, the entry prohibition area map 212 indicating the entry prohibition area 33. The self-position is estimated from the actual obstacle information and the obstacle area 31, so errors in self-position estimation can be suppressed.
- Autonomous movement is controlled based on the estimated self-position and the composite map 213 in which the environment map 211 and the entry prohibition area map 212 are combined. The device can thus move while avoiding both the obstacle area 31 and the entry prohibition area 33. In other words, even where no obstacle exists, the device can autonomously avoid areas whose entry should be avoided.
- An operation for setting the entry prohibition area 33 is accepted via the touch screen 15, and the set entry prohibition area 33 is reflected in the entry prohibition area map 212. The user can therefore set the entry prohibition area 33 arbitrarily according to the situation.
- The route can be planned so as to avoid both the obstacle area 31 and the entry prohibition area 33.
- The virtual obstacle information 43 corresponding to the obstacle area 31 and the virtual obstacle information 43 corresponding to the entry prohibition area 33 in the composite map 213 are calculated from the self-position. Thus, even when actual obstacle information cannot be acquired by the laser range finder 13, the virtual obstacle information 43 for the obstacle area 31 can still be obtained, and so-called flicker can be prevented. Further, by treating the entry prohibition area 33 as an area to be avoided in the same way as the obstacle area 31, virtual obstacle information 43 for the entry prohibition area 33 can be obtained. As a result, control for avoiding obstacles can be performed more reliably, and control for avoiding the entry prohibition area 33 becomes possible.
- Because the map sensor 26 calculates the virtual obstacle information 43 in the same data format as the actual obstacle information from the laser range finder 13 and the stereo camera 14, a virtual sensor output is generated.
- The actual obstacle information from the laser range finder 13 and the stereo camera 14 can therefore be integrated easily with the virtual obstacle information from the map sensor 26.
- Even if the map sensor 26 is added later to an autonomous mobile device already equipped with the laser range finder 13 and the stereo camera 14, the software changes needed to integrate the virtual obstacle information 43 output by the map sensor 26 can be minimized.
- The movement control unit 28 performs obstacle avoidance control using the integrated obstacle information. Even when the number of obstacle-detecting sensors changes, the movement control unit 28 still receives a single integrated stream of obstacle information, so software changes in the movement control unit 28 can be minimized and specification changes can be accommodated flexibly.
- the present invention is not limited to the above-described embodiments, and various modifications can be made.
- the laser range finder 13 and the stereo camera 14 are used as means for acquiring surrounding obstacle information, but the present invention is not limited to this. Either one of the laser range finder 13 and the stereo camera 14 may be used, or an ultrasonic sensor or the like may be combined.
- the obstacle information obtained by the laser range finder 13 is used for self-position estimation.
- the self-position may be estimated using obstacle information obtained by another stereo camera or an ultrasonic sensor.
- the map sensor 26 calculates the virtual obstacle information 43 in the same data format as the actual obstacle information from the laser range finder 13, but the present invention is not limited to this.
- the map sensor 26 may instead calculate virtual obstacle information in a data format that is merely compatible with the actual obstacle information obtained by the laser range finder 13.
- in this case, the sensor data integration unit 27 performs processing to unify the data formats.
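When the formats are compatible rather than identical, the integration unit's job is a small normalization step. As a hedged sketch (the polar scan format for the laser range finder and the function names are assumptions for illustration), the sensor data integration unit 27 might convert a polar range scan into the Cartesian point list that the map sensor already emits:

```python
import math
from typing import List, Tuple

def lrf_to_xy(scan: List[Tuple[float, float]]) -> List[Tuple[float, float]]:
    """Convert a laser-range-finder scan of (angle_rad, range_m) pairs
    into Cartesian (x, y) points."""
    return [(r * math.cos(a), r * math.sin(a)) for a, r in scan]

def unify(lrf_scan: List[Tuple[float, float]],
          map_points: List[Tuple[float, float]]) -> List[Tuple[float, float]]:
    """Unify compatible-but-different formats into one Cartesian point
    list, as the sensor data integration unit 27 would."""
    return lrf_to_xy(lrf_scan) + list(map_points)

# One obstacle straight ahead at 2 m, one to the left at 1 m,
# plus a virtual point from the map sensor:
points = unify([(0.0, 2.0), (math.pi / 2, 1.0)], [(3.0, 3.0)])
```

After this step the rest of the pipeline is identical to the same-format case: a single homogeneous point list flows to avoidance control.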
Landscapes
- Engineering & Computer Science (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Theoretical Computer Science (AREA)
- Radar, Positioning & Navigation (AREA)
- Aviation & Aerospace Engineering (AREA)
- Automation & Control Theory (AREA)
- Remote Sensing (AREA)
- Educational Administration (AREA)
- Educational Technology (AREA)
- Business, Economics & Management (AREA)
- Mathematical Physics (AREA)
- Electromagnetism (AREA)
- General Engineering & Computer Science (AREA)
- Life Sciences & Earth Sciences (AREA)
- Ecology (AREA)
- Computer Hardware Design (AREA)
- Optics & Photonics (AREA)
- Control Of Position, Course, Altitude, Or Attitude Of Moving Bodies (AREA)
- Instructional Devices (AREA)
Abstract
Description
13 Laser range finder
14 Stereo camera
15 Touch screen
20 Electronic control unit
21 Storage unit
211 Environmental map
212 Entry-prohibited area map
213 Composite map
22 Setting unit
23 Composition unit
24 Route planning unit
25 Self-position estimation unit
26 Map sensor
27 Sensor data integration unit
28 Movement control unit
29 Obstacle avoidance unit
Claims (13)
- An obstacle sensor for acquiring surrounding obstacle information;
storage means for storing an environmental map indicating obstacle areas where obstacles exist and an entry-prohibited area map indicating entry-prohibited areas into which entry is prohibited;
estimation means for estimating a self-position using the obstacle information acquired by the obstacle sensor and the environmental map; and
control means for controlling autonomous movement based on the self-position estimated by the estimation means, the environmental map, and the entry-prohibited area map,
wherein an autonomous mobile device is characterized by comprising the above. - Comprising reception means for receiving an operation for setting the entry-prohibited area,
wherein the storage means stores the entry-prohibited area map set based on the operation received by the reception means; the autonomous mobile device according to claim 1. - The autonomous mobile device according to claim 1, further comprising composition means for generating a composite map by combining the environmental map and the entry-prohibited area map.
- The autonomous mobile device according to claim 3, wherein the composition means generates, for each mutually corresponding pixel of the environmental map and the entry-prohibited area map, the composite map in which both the obstacle area indicated by the environmental map and the entry-prohibited area indicated by the entry-prohibited area map are reflected.
- Further comprising planning means for planning a route to a destination using the composite map generated by the composition means,
wherein the control means controls autonomous movement based on the route planned by the planning means; the autonomous mobile device according to claim 3. - Further comprising calculation means for calculating avoidance information based on the obstacle area and the entry-prohibited area, using the self-position estimated by the estimation means and the composite map generated by the composition means,
wherein the control means performs control to avoid an obstacle using the avoidance information calculated by the calculation means and the obstacle information acquired by the obstacle sensor; the autonomous mobile device according to claim 3. - The autonomous mobile device according to claim 6, wherein the calculation means generates a virtual sensor output by calculating the avoidance information so as to be compatible with the output format of the obstacle information from the obstacle sensor.
- Further comprising integration means for integrating the obstacle information from the obstacle sensor and the avoidance information from the calculation means,
wherein the control means performs control to avoid an obstacle using the obstacle information and the avoidance information integrated by the integration means; the autonomous mobile device according to claim 7. - The autonomous mobile device according to claim 1, wherein the obstacle sensor is a laser range finder.
- The autonomous mobile device according to claim 2, further comprising composition means for generating a composite map by combining the environmental map and the entry-prohibited area map.
- Further comprising planning means for planning a route to a destination using the composite map generated by the composition means,
wherein the control means controls autonomous movement based on the route planned by the planning means; the autonomous mobile device according to claim 4. - Further comprising calculation means for calculating avoidance information based on the obstacle area and the entry-prohibited area, using the self-position estimated by the estimation means and the composite map generated by the composition means,
wherein the control means performs control to avoid an obstacle using the avoidance information calculated by the calculation means and the obstacle information acquired by the obstacle sensor; the autonomous mobile device according to claim 4. - Further comprising calculation means for calculating avoidance information based on the obstacle area and the entry-prohibited area, using the self-position estimated by the estimation means and the composite map generated by the composition means,
wherein the control means performs control to avoid an obstacle using the avoidance information calculated by the calculation means and the obstacle information acquired by the obstacle sensor; the autonomous mobile device according to claim 5.
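The per-pixel map composition recited in claim 4 can be illustrated concretely: a cell of the composite map is blocked if it is blocked in either source map. The 0/1 occupancy-grid encoding below is an assumption for illustration; the patent does not fix a particular map representation:

```python
import numpy as np

# 0 = free, 1 = blocked. Both grids have identical shape and are
# pixel-aligned, so composition is element-wise.
environmental_map = np.array([[0, 1, 0],
                              [0, 0, 0],
                              [0, 0, 1]])   # obstacle areas
no_entry_map      = np.array([[0, 0, 0],
                              [1, 1, 0],
                              [0, 0, 0]])   # entry-prohibited areas

# Per-pixel composition: a cell is blocked if it is an obstacle cell OR
# an entry-prohibited cell, so both source maps are reflected.
composite_map = np.maximum(environmental_map, no_entry_map)
```

A route planner run on `composite_map` then treats entry-prohibited cells exactly like physical obstacles, which is what lets a single planning step honor both constraints.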
Priority Applications (3)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
EP10837203.8A EP2498158A4 (en) | 2009-12-17 | 2010-10-22 | AUTONOMOUS MOBILE DEVICE |
US13/514,004 US8897947B2 (en) | 2009-12-17 | 2010-10-22 | Autonomous mobile device |
KR1020127007777A KR101420201B1 (ko) | 2009-12-17 | 2010-10-22 | 자율 이동 장치 |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2009286846A JP2011128899A (ja) | 2009-12-17 | 2009-12-17 | 自律移動装置 |
JP2009-286846 | 2009-12-17 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2011074165A1 true WO2011074165A1 (ja) | 2011-06-23 |
Family
ID=44166940
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2010/006265 WO2011074165A1 (ja) | 2009-12-17 | 2010-10-22 | 自律移動装置 |
Country Status (5)
Country | Link |
---|---|
US (1) | US8897947B2 (ja) |
EP (1) | EP2498158A4 (ja) |
JP (1) | JP2011128899A (ja) |
KR (1) | KR101420201B1 (ja) |
WO (1) | WO2011074165A1 (ja) |
Cited By (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN105136155A (zh) * | 2015-09-24 | 2015-12-09 | 联想(北京)有限公司 | 一种导航方法及电子设备 |
JP2020509500A (ja) * | 2017-03-02 | 2020-03-26 | ロブアート ゲーエムベーハーROBART GmbH | 自律移動ロボットの制御方法 |
JP2021500688A (ja) * | 2017-10-30 | 2021-01-07 | 珠海市一微半導体有限公司Amicro Semiconductor Co., Ltd. | ロボットの走行予測及び制御方法 |
US11768494B2 (en) | 2015-11-11 | 2023-09-26 | RobArt GmbH | Subdivision of maps for robot navigation |
Families Citing this family (41)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
AP2011006008A0 (en) * | 2009-05-01 | 2011-12-31 | Univ Sydney | Integrated automation system with picture compilation system. |
JP5460413B2 (ja) * | 2010-03-26 | 2014-04-02 | ダイハツ工業株式会社 | 自車位置認識装置 |
WO2012166970A1 (en) * | 2011-05-31 | 2012-12-06 | John Bean Technologies Corporation | Deep lane navigation system for automatic guided vehicles |
US9098087B2 (en) * | 2013-02-04 | 2015-08-04 | Caterpillar Inc. | System and method for adjusting the operation of a machine |
US9141107B2 (en) | 2013-04-10 | 2015-09-22 | Google Inc. | Mapping active and inactive construction zones for autonomous driving |
AU2013350342B2 (en) | 2013-07-30 | 2015-08-13 | Komatsu Ltd. | Management system and management method of mining machine |
US9886036B2 (en) * | 2014-02-10 | 2018-02-06 | John Bean Technologies Corporation | Routing of automated guided vehicles |
US9535421B1 (en) * | 2014-02-28 | 2017-01-03 | Savioke, Inc. | Mobile delivery robot with interior cargo space |
US9605415B2 (en) | 2014-09-12 | 2017-03-28 | Caterpillar Inc. | System and method for monitoring a machine |
JP5902275B1 (ja) * | 2014-10-28 | 2016-04-13 | シャープ株式会社 | 自律移動装置 |
US20180099846A1 (en) | 2015-03-06 | 2018-04-12 | Wal-Mart Stores, Inc. | Method and apparatus for transporting a plurality of stacked motorized transport units |
WO2016142794A1 (en) | 2015-03-06 | 2016-09-15 | Wal-Mart Stores, Inc | Item monitoring system and method |
US9875502B2 (en) | 2015-03-06 | 2018-01-23 | Wal-Mart Stores, Inc. | Shopping facility assistance systems, devices, and methods to identify security and safety anomalies |
US9864371B2 (en) | 2015-03-10 | 2018-01-09 | John Bean Technologies Corporation | Automated guided vehicle system |
DE102015109775B3 (de) | 2015-06-18 | 2016-09-22 | RobArt GmbH | Optischer Triangulationssensor zur Entfernungsmessung |
GB2562835B (en) * | 2015-08-07 | 2019-10-16 | Walmart Apollo Llc | Shopping space mapping systems, devices and methods |
DE102015114883A1 (de) | 2015-09-04 | 2017-03-09 | RobArt GmbH | Identifizierung und Lokalisierung einer Basisstation eines autonomen mobilen Roboters |
WO2017050357A1 (en) * | 2015-09-22 | 2017-03-30 | Bluebotics Sa | Virtual line-following and retrofit method for autonomous vehicles |
DE102015119865B4 (de) | 2015-11-17 | 2023-12-21 | RobArt GmbH | Robotergestützte Bearbeitung einer Oberfläche mittels eines Roboters |
DE102015121666B3 (de) | 2015-12-11 | 2017-05-24 | RobArt GmbH | Fernsteuerung eines mobilen, autonomen Roboters |
DE102016102644A1 (de) | 2016-02-15 | 2017-08-17 | RobArt GmbH | Verfahren zur Steuerung eines autonomen mobilen Roboters |
CA2961938A1 (en) | 2016-04-01 | 2017-10-01 | Wal-Mart Stores, Inc. | Systems and methods for moving pallets via unmanned motorized unit-guided forklifts |
JP6696273B2 (ja) * | 2016-04-08 | 2020-05-20 | 株式会社デンソー | 地図データ提供装置 |
JP6745175B2 (ja) * | 2016-09-12 | 2020-08-26 | 株式会社ダイヘン | 移動属性設定装置 |
KR101868374B1 (ko) * | 2016-10-20 | 2018-06-18 | 엘지전자 주식회사 | 이동 로봇의 제어방법 |
US11238726B2 (en) * | 2016-12-02 | 2022-02-01 | International Business Machines Corporation | Control of driverless vehicles in construction zones |
JP6809913B2 (ja) * | 2017-01-26 | 2021-01-06 | パナソニック株式会社 | ロボット、ロボットの制御方法、および地図の生成方法 |
CN107791251A (zh) * | 2017-11-22 | 2018-03-13 | 深圳市沃特沃德股份有限公司 | 机器人移动控制方法和机器人 |
US12038756B2 (en) * | 2017-12-19 | 2024-07-16 | Carnegie Mellon University | Intelligent cleaning robot |
US11453123B2 (en) * | 2017-12-27 | 2022-09-27 | Stmicroelectronics, Inc. | Robotic device with time-of-flight proximity sensing system |
JP6960518B2 (ja) * | 2018-02-28 | 2021-11-05 | 本田技研工業株式会社 | 制御装置、作業機械、プログラム及び制御方法 |
JP7063131B2 (ja) * | 2018-06-11 | 2022-05-09 | 株式会社豊田自動織機 | 自律走行台車 |
DE102018114892B4 (de) * | 2018-06-20 | 2023-11-09 | RobArt GmbH | Autonomer mobiler Roboter und Verfahren zum Steuern eines autonomen mobilen Roboters |
JP7005794B2 (ja) | 2018-06-21 | 2022-01-24 | 北京極智嘉科技股▲ふん▼有限公司 | ロボットスケジューリング、ロボットの経路制御方法、サーバーおよび記憶媒体 |
US11613041B1 (en) | 2018-11-08 | 2023-03-28 | Scepaniak IP Holdings, LLC | Coping nailer |
US11214967B1 (en) | 2018-11-08 | 2022-01-04 | Scepaniak IP Holdings, LLC | Roof rock spreader |
CN111413960A (zh) * | 2018-12-19 | 2020-07-14 | 深圳市优必选科技有限公司 | 一种基于虚拟轨道的巡航方法、装置及终端设备 |
JP2020107116A (ja) * | 2018-12-27 | 2020-07-09 | 株式会社豊田自動織機 | 自律移動体 |
WO2020144849A1 (ja) * | 2019-01-11 | 2020-07-16 | 三菱電機株式会社 | 移動体測位装置および移動体管理装置 |
JP7340350B2 (ja) * | 2019-05-07 | 2023-09-07 | 東芝テック株式会社 | 情報処理装置及び情報処理方法 |
US11294385B2 (en) * | 2019-06-28 | 2022-04-05 | Robert Bosch Gmbh | System and method for generating a representation of an environment |
Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH0330003A (ja) * | 1989-06-28 | 1991-02-08 | Shinko Electric Co Ltd | 移動ロボットの走行制御方法 |
JP2000222563A (ja) * | 1999-02-04 | 2000-08-11 | Nec Corp | 障害物検出装置および障害物検出装置を搭載した移動体 |
JP2004299025A (ja) * | 2003-04-01 | 2004-10-28 | Honda Motor Co Ltd | 移動ロボット制御装置、移動ロボット制御方法及び移動ロボット制御プログラム |
JP2005050105A (ja) * | 2003-07-28 | 2005-02-24 | Matsushita Electric Works Ltd | 自律移動のための経路生成装置及び該装置を用いた自律移動装置 |
JP2007249632A (ja) * | 2006-03-16 | 2007-09-27 | Fujitsu Ltd | 障害物のある環境下で自律移動する移動ロボットおよび移動ロボットの制御方法。 |
Family Cites Families (28)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US7366522B2 (en) * | 2000-02-28 | 2008-04-29 | Thomas C Douglass | Method and system for location tracking |
US7571511B2 (en) * | 2002-01-03 | 2009-08-11 | Irobot Corporation | Autonomous floor-cleaning robot |
US7663333B2 (en) * | 2001-06-12 | 2010-02-16 | Irobot Corporation | Method and system for multi-mode coverage for an autonomous robot |
US9128486B2 (en) * | 2002-01-24 | 2015-09-08 | Irobot Corporation | Navigational control system for a robotic device |
JP3945279B2 (ja) * | 2002-03-15 | 2007-07-18 | ソニー株式会社 | 障害物認識装置、障害物認識方法、及び障害物認識プログラム並びに移動型ロボット装置 |
US7145478B2 (en) * | 2002-12-17 | 2006-12-05 | Evolution Robotics, Inc. | Systems and methods for controlling a density of visual landmarks in a visual simultaneous localization and mapping system |
US7010425B2 (en) * | 2003-03-31 | 2006-03-07 | Deere & Company | Path planner and a method for planning a path of a work vehicle |
JP4246041B2 (ja) | 2003-11-25 | 2009-04-02 | パナソニック電工株式会社 | 自律移動装置 |
US20100013615A1 (en) * | 2004-03-31 | 2010-01-21 | Carnegie Mellon University | Obstacle detection having enhanced classification |
JP4464893B2 (ja) * | 2004-09-13 | 2010-05-19 | パナソニック株式会社 | 移動ロボット |
US7389156B2 (en) * | 2005-02-18 | 2008-06-17 | Irobot Corporation | Autonomous surface cleaning robot for wet and dry cleaning |
US7620476B2 (en) * | 2005-02-18 | 2009-11-17 | Irobot Corporation | Autonomous surface cleaning robot for dry cleaning |
JP3931907B2 (ja) | 2005-03-14 | 2007-06-20 | 松下電工株式会社 | 自律移動装置 |
US7353034B2 (en) * | 2005-04-04 | 2008-04-01 | X One, Inc. | Location sharing and tracking using mobile phones or other wireless devices |
US20070028574A1 (en) * | 2005-08-02 | 2007-02-08 | Jason Yan | Dust collector for autonomous floor-cleaning device |
WO2008013568A2 (en) * | 2005-12-30 | 2008-01-31 | Irobot Corporation | Autonomous mobile robot |
KR100988736B1 (ko) * | 2006-03-15 | 2010-10-20 | 삼성전자주식회사 | 자율주행 이동로봇의 최단 경로 이동을 위한 홈 네트워크시스템 및 그 방법 |
US8139109B2 (en) * | 2006-06-19 | 2012-03-20 | Oshkosh Corporation | Vision system for an autonomous vehicle |
US8180486B2 (en) * | 2006-10-02 | 2012-05-15 | Honda Motor Co., Ltd. | Mobile robot and controller for same |
US8315789B2 (en) * | 2007-03-21 | 2012-11-20 | Commonwealth Scientific And Industrial Research Organisation | Method for planning and executing obstacle-free paths for rotating excavation machinery |
KR100809749B1 (ko) * | 2007-03-28 | 2008-03-04 | 엘지전자 주식회사 | 냉장고의 아이스메이커 어셈블리 |
JP2009025898A (ja) | 2007-07-17 | 2009-02-05 | Toyota Motor Corp | 経路計画装置、経路計画方法及び移動体 |
US8452450B2 (en) * | 2008-04-24 | 2013-05-28 | Evolution Robotics, Inc. | Application of localization, positioning and navigation systems for robotic enabled mobile products |
US8155811B2 (en) * | 2008-12-29 | 2012-04-10 | General Electric Company | System and method for optimizing a path for a marine vessel through a waterway |
KR101581415B1 (ko) * | 2009-02-23 | 2015-12-30 | 삼성전자주식회사 | 맵 빌딩 장치 및 방법 |
WO2010097921A1 (ja) * | 2009-02-26 | 2010-09-02 | 三菱電機株式会社 | 移動体撮像システム及び移動体及び地上局装置及び移動体撮像方法 |
US8744665B2 (en) * | 2009-07-28 | 2014-06-03 | Yujin Robot Co., Ltd. | Control method for localization and navigation of mobile robot and mobile robot using the same |
CN102782600B (zh) * | 2009-11-27 | 2015-06-24 | 丰田自动车株式会社 | 自动移动体及其控制方法 |
-
2009
- 2009-12-17 JP JP2009286846A patent/JP2011128899A/ja active Pending
-
2010
- 2010-10-22 WO PCT/JP2010/006265 patent/WO2011074165A1/ja active Application Filing
- 2010-10-22 KR KR1020127007777A patent/KR101420201B1/ko active IP Right Grant
- 2010-10-22 EP EP10837203.8A patent/EP2498158A4/en not_active Withdrawn
- 2010-10-22 US US13/514,004 patent/US8897947B2/en active Active
Cited By (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN105136155A (zh) * | 2015-09-24 | 2015-12-09 | 联想(北京)有限公司 | 一种导航方法及电子设备 |
CN105136155B (zh) * | 2015-09-24 | 2018-12-14 | 联想(北京)有限公司 | 一种导航方法及电子设备 |
US11768494B2 (en) | 2015-11-11 | 2023-09-26 | RobArt GmbH | Subdivision of maps for robot navigation |
JP2020509500A (ja) * | 2017-03-02 | 2020-03-26 | ロブアート ゲーエムベーハーROBART GmbH | 自律移動ロボットの制御方法 |
JP2021500688A (ja) * | 2017-10-30 | 2021-01-07 | 珠海市一微半導体有限公司Amicro Semiconductor Co., Ltd. | ロボットの走行予測及び制御方法 |
JP7075994B2 (ja) | 2017-10-30 | 2022-05-26 | 珠海一微半導体股▲ふん▼有限公司 | ロボットの走行予測及び制御方法 |
US11526170B2 (en) | 2017-10-30 | 2022-12-13 | Amicro Semiconductor Co., Ltd. | Method for detecting skidding of robot, mapping method and chip |
Also Published As
Publication number | Publication date |
---|---|
KR20120049927A (ko) | 2012-05-17 |
EP2498158A1 (en) | 2012-09-12 |
JP2011128899A (ja) | 2011-06-30 |
EP2498158A4 (en) | 2013-08-28 |
KR101420201B1 (ko) | 2014-07-17 |
US20120283905A1 (en) | 2012-11-08 |
US8897947B2 (en) | 2014-11-25 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
WO2011074165A1 (ja) | 自律移動装置 | |
US8679260B2 (en) | Methods and systems for movement of an automatic cleaning device using video signal | |
US8306684B2 (en) | Autonomous moving apparatus | |
US8423225B2 (en) | Methods and systems for movement of robotic device using video signal | |
JP6079415B2 (ja) | 自律移動体 | |
US20120182155A1 (en) | Danger presentation device, danger presentation system, danger presentation method and program | |
JP5276931B2 (ja) | 移動体および移動体の位置推定誤り状態からの復帰方法 | |
JP6481347B2 (ja) | 移動量推定装置、自律移動体、及び移動量の推定方法 | |
JP6052045B2 (ja) | 自律移動体 | |
JP6962007B2 (ja) | 自律走行台車の走行制御装置、自律走行台車 | |
JP2010072762A (ja) | 環境地図修正装置及び自律移動装置 | |
JP2007078476A (ja) | 物体位置検出装置、地図作成装置、自律移動装置、物体位置検出方法および物体位置検出プログラム | |
JP2011175393A (ja) | 経路計画装置、自律移動ロボット、及び移動経路の計画方法 | |
US20140297066A1 (en) | Remote control system | |
JP6074205B2 (ja) | 自律移動体 | |
JP2011096170A (ja) | 自律移動装置及びその制御方法 | |
JP2016206876A (ja) | 自律移動体の走行経路教示システムおよび走行経路教示方法 | |
JP6348971B2 (ja) | 移動体 | |
JP2009223757A (ja) | 自律移動体、その制御システム、自己位置推定方法 | |
JP2010026727A (ja) | 自律移動装置 | |
JP2010061483A (ja) | 自走移動体及び自走移動体の目的位置設定方法 | |
JP2008276731A (ja) | 自律移動体の経路設定装置 | |
KR100784125B1 (ko) | 단일 카메라를 이용한 이동 로봇의 랜드 마크의 좌표 추출방법 | |
JP2022075500A (ja) | 情報処理装置、情報処理方法およびプログラム | |
JP2011198173A (ja) | ロボットシステム |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 10837203 Country of ref document: EP Kind code of ref document: A1 |
ENP | Entry into the national phase |
Ref document number: 20127007777 Country of ref document: KR Kind code of ref document: A |
WWE | Wipo information: entry into national phase |
Ref document number: 2010837203 Country of ref document: EP |
WWE | Wipo information: entry into national phase |
Ref document number: 13514004 Country of ref document: US |
NENP | Non-entry into the national phase |
Ref country code: DE |