WO2010032381A1 - Environmental map correction device and autonomous mobile device - Google Patents
Environmental map correction device and autonomous mobile device
- Publication number
- WO2010032381A1 (PCT/JP2009/004079; JP2009004079W)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- environmental map
- map
- global map
- correction
- image
- Prior art date
Classifications
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05D—SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
- G05D1/00—Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
- G05D1/02—Control of position or course in two dimensions
- G05D1/021—Control of position or course in two dimensions specially adapted to land vehicles
- G05D1/0268—Control of position or course in two dimensions specially adapted to land vehicles using internal positioning means
- G05D1/0274—Control of position or course in two dimensions specially adapted to land vehicles using internal positioning means using mapping information stored in a memory device
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05D—SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
- G05D1/00—Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
- G05D1/02—Control of position or course in two dimensions
- G05D1/021—Control of position or course in two dimensions specially adapted to land vehicles
- G05D1/0231—Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means
- G05D1/0238—Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means using obstacle or wall sensors
- G05D1/024—Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means using obstacle or wall sensors in combination with a laser
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05D—SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
- G05D1/00—Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
- G05D1/02—Control of position or course in two dimensions
- G05D1/021—Control of position or course in two dimensions specially adapted to land vehicles
- G05D1/0231—Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means
- G05D1/0246—Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means using a video camera in combination with image processing means
- G—PHYSICS
- G09—EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
- G09B—EDUCATIONAL OR DEMONSTRATION APPLIANCES; APPLIANCES FOR TEACHING, OR COMMUNICATING WITH, THE BLIND, DEAF OR MUTE; MODELS; PLANETARIA; GLOBES; MAPS; DIAGRAMS
- G09B29/00—Maps; Plans; Charts; Diagrams, e.g. route diagram
- G09B29/10—Map spot or coordinate position indicators; Map reading aids
- G09B29/106—Map spot or coordinate position indicators; Map reading aids using electronic means
Definitions
- The present invention relates to an environmental map correction device and to an autonomous mobile device provided with the environmental map correction device.
- Patent Document 1 discloses a mobile robot that generates a topographic map (environmental map) from topographic data obtained by distance measurement with a laser range finder (or a camera).
- However, the detection result of the laser range finder or the like may contain noise. Moreover, if a person or other moving object passes through the detection range of the laser range finder while the environmental map is being created, an object that is not constantly present may be recorded on the environmental map.
- Likewise, when an environmental map is generated from CAD data, an object serving as an obstacle may be placed in the environment after the CAD data was created. The generated or acquired environmental map may therefore differ from the actual surrounding environment.
- When the generated or acquired environmental map differs from the actual surrounding environment, a passage that is actually passable may, for example, be judged impassable when a movement route is planned from the map, with the result that an unnecessary detour route is selected.
- The present invention has been made to solve the above problems, and aims to provide an environmental map correction device capable of acquiring an environmental map that matches the actual surrounding environment, and an autonomous mobile device provided with the environmental map correction device.
- An environmental map correction device according to the present invention comprises: display means for displaying an environmental map in which an object area where an object is present is shown; input means for receiving a correction operation from a user on the environmental map displayed by the display means; and correction means for correcting the environmental map displayed on the display means based on the correction operation received by the input means.
- With this configuration, the user can correct the environmental map on the display means by performing a correction operation through the input means. A difference between the environmental map and the actual surrounding environment can therefore be corrected by the user's manual operation, and an environmental map that matches the actual surrounding environment can be obtained.
- Preferably, the environmental map correction device comprises: conversion means for converting object presence probability information held by the plurality of grids constituting the environmental map into image information; and inverse conversion means for converting image information back into object presence probability information. The display means displays an environmental map image based on the image information produced by the conversion means, the correction means corrects the environmental map image displayed by the display means, and the inverse conversion means converts the image information of the environmental map image corrected by the correction means back into object presence probability information.
- In this case, conversion and inverse conversion are performed between the object presence probability information and the image information. The environmental map is thus converted into an environmental map image for display, and a corrected environmental map can be acquired simply by correcting the environmental map image on the display means, which makes correcting the environmental map easier.
- The image information is preferably gray-scale information of a black-and-white image.
- An autonomous mobile device according to the present invention comprises: environmental map acquisition means for acquiring an environmental map indicating an object region in which an object is present; any one of the above environmental map correction devices, which corrects the environmental map acquired by the environmental map acquisition means; movement route planning means for planning a movement route from the environmental map corrected by the environmental map correction device; and movement means for driving the device itself to move along the movement route planned by the movement route planning means.
- Since the autonomous mobile device of the present invention is provided with one of the above environmental map correction devices, it can acquire an environmental map that matches the actual surrounding environment. For example, when a movement route is planned from the environmental map, this prevents a passage that is actually passable from being judged impassable because of a spuriously recorded obstacle. As a result, the device can move along an optimal route that matches the surrounding environment as the user perceives it. Further, because the environmental map correction device is mounted on the autonomous mobile device, the series of operations of acquiring the environmental map, correcting it, and planning a route from it can be performed efficiently.
- Since the displayed environmental map is corrected based on correction operations from the user, an environmental map that better matches the actual surrounding environment can be acquired.
- FIG. 1 is a block diagram showing the configuration of the global map correction device 3.
- The global map correction device 3 corrects an environmental map (a map representing areas where objects (obstacles) are present and areas where they are absent; hereinafter referred to as a "global map") based on the user's operations.
- The global map correction device 3 comprises a microprocessor that performs calculations, a ROM that stores programs for causing the microprocessor to execute each process, a RAM that temporarily stores various data such as calculation results, a backup RAM that retains its stored contents, and the like.
- To correct the global map, the global map correction device 3 has a storage unit 30, a conversion unit 31, an inverse conversion unit 32, a temporary storage unit 33, a touch screen 34, and a correction unit 37. These units are realized by a combination of the above-described hardware and software. Each component is described in detail below.
- The storage unit 30 consists of, for example, the above-described backup RAM, and stores, among other things, a global map acquired by a global map acquisition unit 41 described later.
- The global map can be generated, for example, using known SLAM (Simultaneous Localization and Mapping) techniques.
- The global map may also be created from an architectural drawing, or a global map generated by another device may be imported.
- The global map is a grid map of the movement area of the autonomous mobile device, in which the positions of fixed objects such as wall surfaces are recorded.
- The grid map is a map obtained by dividing a horizontal plane into cells of a predetermined size (for example, 1 cm × 1 cm; hereinafter referred to as "unit grids" or simply "grids"). Each unit grid is given object presence probability information indicating whether an object is present there.
- Each unit grid holds area information of the form:
  - 1 > object presence probability P(n, m) > 0: occupied (object present)
  - -1 < object presence probability P(n, m) < 0: empty (no object)
  - object presence probability P(n, m) = 0: unknown
- That is, a grid in which an object (obstacle) is present is given a value between 0 and 1 according to its presence probability, a grid in which no object is present is given a value between 0 and -1, and "0" is given to a grid for which the presence or absence of an object is unknown. Therefore, as shown in FIG. 2, the grids where the wall surface is located (the areas shown by white lines in FIG. 2) have values of 0.8 to 1.0, the grids corresponding to the passage portions where no object is present (the areas painted black in FIG. 2) have values of -0.8 to -1.0, and the grids on the rear side of the walls (the areas shown in gray in FIG. 2), where the presence or absence of objects is unknown, have the value 0.
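- For illustration only, a minimal sketch of such a grid map follows (Python/NumPy; the array shape, the sample values, and the classify helper are assumptions for illustration and are not part of the disclosure):

```python
import numpy as np

# Occupancy grid: each cell (e.g., 1 cm x 1 cm) holds an object presence
# probability P(n, m) in [-1, 1]: P > 0 occupied, P < 0 empty, P == 0 unknown.
grid = np.zeros((200, 300), dtype=np.float32)   # all cells start as "unknown"

grid[50:53, :] = 0.9          # a wall: strongly occupied cells
grid[60:100, 10:290] = -0.9   # an open passage: strongly empty cells

def classify(p: float) -> str:
    """Classify one cell by the sign of its presence probability."""
    if p > 0.0:
        return "occupied"   # object present
    if p < 0.0:
        return "empty"      # no object
    return "unknown"        # presence of an object is not known
```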
- The conversion unit 31 reads the global map stored in the storage unit 30 and converts the object presence probability information of each grid constituting the global map (grid map) into image information, that is, gray-scale information of a black-and-white image, thereby generating a display map (global map image). The conversion unit 31 thus functions as the conversion means described in the claims. More specifically, the conversion unit 31 converts the object presence probability information (-1 to 0 to 1) of each grid into gray-scale information (0 to 128 (80H) to 255 (FFH)) to generate the display map. In the gray-scale information of the black-and-white image, pure white corresponds to "255" and pure black to "0".
- The display map shown in FIG. 3 is a conversion of the global map (grid map) shown in FIG. 2: the cells where the wall surface is located are converted to substantially white (230 to 255), the passage portions where no object is present are converted to substantially black (0 to 26), and the rear side of the walls, where the presence or absence of objects is unknown, is converted to gray (128).
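- The gray levels quoted above (230 to 255 for probabilities 0.8 to 1.0, 0 to 26 for -0.8 to -1.0, and 128 for probability 0) are consistent with a piecewise-linear mapping. A minimal sketch of such a conversion follows (assuming NumPy; the exact interpolation is an inference from the quoted values and is not stated in the disclosure):

```python
import numpy as np

def prob_to_gray(grid: np.ndarray) -> np.ndarray:
    """Map object presence probabilities in [-1, 1] to 8-bit gray levels.

    Assumed piecewise-linear mapping: -1 -> 0 (black), 0 -> 128 (gray),
    1 -> 255 (white); e.g., 0.8 -> 230 and -0.8 -> 26, matching the text.
    """
    gray = np.where(grid < 0.0,
                    (grid + 1.0) * 128.0,   # [-1, 0) -> [0, 128)
                    128.0 + grid * 127.0)   # [0, 1]  -> [128, 255]
    return np.clip(np.rint(gray), 0, 255).astype(np.uint8)
```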
- The display map generated by the conversion unit 31 is output to the temporary storage unit 33, which consists of, for example, the above-described RAM, and is held there temporarily while the correction work is performed.
- If the map data (object presence probability information) is held as 8-bit data (0 to 255 (FFH)) from the outset, no map conversion is necessary, because the data can be displayed as it is.
- The touch screen 34 is an input device comprising a display unit 35 formed of a liquid crystal display (LCD) or the like and an operation input unit 36 formed of a touch panel or the like that detects the user's touch operations (inputs).
- The display unit 35 displays a global map image (corresponding to the environmental map image described in the claims) based on the image information (gray-scale information) of the display map generated by the conversion unit 31.
- The display unit 35 also displays a menu bar, various tools, and the like for receiving correction operations on the global map image from the user. The display unit 35 thus functions as the display means described in the claims.
- The display screen of the display unit 35 includes, for example, a global map image display area 400, a menu bar display area 401, a tool box display area 402, and a palette display area 403.
- The global map image display area 400 displays the global map image being corrected.
- In the menu bar display area 401, a menu is displayed in which the available functions, such as the file dialog 410, are classified and summarized.
- In the tool box display area 402, a plurality of tools for selecting and drawing on the global map image are displayed, such as an eyedropper tool 411, a pencil tool 412, an area designation tool 413, and a fill tool 414.
- In the palette display area 403, a palette 415 for designating a drawing color is displayed.
- The operation input unit 36 is provided so as to cover the display screen of the display unit 35, and two-dimensional coordinates (X-Y coordinates) are virtually arranged on its surface.
- The operation input unit 36 receives correction operations from the user and outputs coordinate information corresponding to the touch position when the user performs a touch operation. The operation input unit 36 thus functions as the input means described in the claims.
- The operation input unit 36 determines the content of the user's operation based on the display positions of the various icons and the coordinate information indicating the user's touch position, and outputs the determined operation content to the correction unit 37.
- The correction unit 37 corrects the global map image displayed on the display unit 35 in accordance with the user's correction operations received by the operation input unit 36. The correction unit 37 thus functions as the correction means described in the claims.
- The operation input unit 36 and the correction unit 37 provide the user with, for example, the functions listed later in the description (loading the map, designating the overwrite color, designating and correcting areas, and saving the map). These functions are only one example of what the operation input unit 36 and the correction unit 37 can provide.
- Using the above functions, the user can delete an object (obstacle) from the global map image or add an object to it. For example, to erase an object (obstacle) from the global map, the object displayed in white is painted over in black; conversely, to add an object (obstacle) to the global map, it is drawn in white in accordance with its position and shape.
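- As an illustration of this style of correction on the 8-bit display map, a minimal sketch follows (the rectangular regions are an assumption made for simplicity; the area designation tool 413 and the fill tool 414 are not limited to rectangles):

```python
import numpy as np

def erase_object(display_map: np.ndarray,
                 top: int, left: int, bottom: int, right: int) -> None:
    """Erase an object (obstacle) by painting its area black (= empty)."""
    display_map[top:bottom, left:right] = 0     # black gray level

def add_object(display_map: np.ndarray,
               top: int, left: int, bottom: int, right: int) -> None:
    """Add an object (obstacle) by painting its area white (= occupied)."""
    display_map[top:bottom, left:right] = 255   # white gray level
```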
- The inverse conversion unit 32 converts the gray-scale information of the global map image corrected by the correction unit 37 back into the object presence probability information of the grid map, thereby inversely converting the display map (global map image) into the global map. The inverse conversion unit 32 thus functions as the inverse conversion means described in the claims. More specifically, the inverse conversion unit 32 converts the gray-scale information (0 to 128 (80H) to 255 (FFH)) of each grid into object presence probability information (-1 to 0 to 1) to obtain the global map (grid map). In other words, the inverse conversion unit 32 has the reverse function of the conversion unit 31 described above.
- By this inverse conversion, a grid map (global map) such as the one shown in FIG. 2 is obtained.
- The global map generated by the inverse conversion unit 32 is output to and stored in the storage unit 30.
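- A minimal sketch of such an inverse conversion follows, mirroring the mapping assumed for prob_to_gray above (0 maps back to -1, 128 to 0, 255 to 1); the quantization error of a full round trip is at most about half a gray step, i.e. roughly 0.004 in probability:

```python
import numpy as np

def gray_to_prob(gray: np.ndarray) -> np.ndarray:
    """Map 8-bit gray levels back to object presence probabilities.

    Assumed inverse mapping: 0 -> -1, 128 -> 0, 255 -> 1.
    """
    g = gray.astype(np.float32)
    return np.where(g < 128.0,
                    g / 128.0 - 1.0,        # [0, 128)   -> [-1, 0)
                    (g - 128.0) / 127.0)    # [128, 255] -> [0, 1]
```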
- FIG. 6 is a flowchart showing the procedure of the global map correction processing performed by the global map correction device 3. This processing is executed in response to user operations.
- In step S100, in response to a map loading operation from the user, the designated global map to be corrected is read and displayed on the display unit 35. More specifically, when the operation input unit 36 of the touch screen 34 receives a map loading operation from the user, the global map (grid map) stored in the storage unit 30 is converted by the conversion unit 31 into a global map image (display map) (see FIGS. 2 and 3), which is displayed on the display unit 35 of the touch screen 34.
- In the subsequent step S102, the global map image displayed on the display unit 35 is corrected in response to correction operations from the user. More specifically, a color designation operation by the user sets the color to overwrite with, and a correction area designation operation by the user overwrites or fills the designated area with the set color and gradation (see FIG. 4).
- The methods by which the user designates the overwrite color and the correction area are as listed later in this description, so a detailed explanation is omitted here.
- This correction processing is repeated until all the points to be corrected have been corrected.
- The corrected global map is then saved in step S104. More specifically, when the operation input unit 36 receives a map saving operation from the user, the corrected global map image (display map) held in the temporary storage unit 33 is inversely converted by the inverse conversion unit 32 into the corrected global map (grid map) (see FIGS. 3 and 2), which is stored in the storage unit 30.
- FIG. 5 shows an example of the global map before and after correction. In this example, the object (obstacle) present inside the dotted frame 500 before correction is erased, and the global map is thereby modified to match the real surrounding environment.
- As described above, the user can correct the global map on the display unit 35 by performing correction operations via the operation input unit 36 of the touch screen 34. A difference between the global map and the actual surrounding environment can therefore be corrected by the user's manual operation, and a global map that matches the actual surrounding environment can be obtained.
- Conversion and inverse conversion are performed between the object presence probability information and the image information (gray-scale information of a black-and-white image). The global map is thus converted into a global map image for display, and a corrected global map (object presence probability information) can be acquired by correcting the (gray-scale) global map image displayed on the display unit 35. This makes it easier to correct the global map.
- Since gray-scale information of a black-and-white image is used as the image information corresponding to the object presence probability information, a global map composed of a plurality of grids holding object presence probability information can be viewed as a black-and-white image (global map image) expressed in shades of gray. Moreover, because both the object presence probability information and the gray-scale information (image information) are one-dimensional data, conversion and inverse conversion between them can be performed easily.
- FIG. 7 is a block diagram showing the configuration of the autonomous mobile device 1 on which the global map correction device 3 is mounted.
- The autonomous mobile device 1 acquires a global map and outputs it to the global map correction device 3, plans, using the corrected global map obtained from the global map correction device 3, a movement route between a starting point (start position) and a destination (goal position) on the global map, and has the function of autonomously moving from the start position to the goal position along the planned movement route. To this end, the autonomous mobile device 1 comprises the above-described global map correction device 3; a main body 10 provided at its lower part with electric motors 12 and with omni wheels 13 driven by the electric motors 12; and a laser range finder 20 for measuring the distance to obstacles present in the surroundings. The autonomous mobile device 1 is further equipped with an electronic control device 40 that plans the movement route using the global map corrected by the global map correction device 3 and controls the electric motors 12 so that the device moves along the planned route. Each component is described in detail below.
- The main body 10 is, for example, a metal frame formed in a substantially bottomed cylindrical shape, to which the above-described laser range finder 20 and an electronic control device 40 including the global map correction device 3 are attached. The shape of the main body 10 is not limited to a substantially bottomed cylindrical shape.
- At the lower part of the main body 10, four electric motors 12 are arranged and attached in a cross shape. An omni wheel 13 is mounted on the drive shaft 12A of each of the four electric motors 12; that is, the four omni wheels 13 are attached on the same circumference at intervals of 90° along the circumferential direction.
- Each omni wheel 13 has two wheels 14 that rotate about the drive shaft 12A of the electric motor 12, and six free rollers 15 provided rotatably on the outer circumference of each wheel 14 about axes orthogonal to the drive shaft 12A, so that the wheel is movable in all directions. The two wheels 14 are attached with a phase shift of 30°.
- With this configuration, each omni wheel 13 can also move in the direction parallel to the rotation axis of its wheels 14. Therefore, by controlling the four electric motors 12 independently and individually adjusting the rotational direction and rotational speed of each of the four omni wheels 13, the autonomous mobile device 1 can be moved in any desired direction (all directions).
- An encoder 16 for detecting the rotation angle of the drive shaft 12A is attached to the drive shaft 12A of each of the four electric motors 12. Each encoder 16 is connected to the electronic control device 40 and outputs the detected rotation angle of its electric motor 12 to the electronic control device 40, which calculates the amount of movement of the autonomous mobile device 1 from the input rotation angles of the respective electric motors 12.
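- For illustration, a minimal sketch of the kinematics of such a four-omni-wheel base follows; the mounting radius R, the wheel angles, and the least-squares odometry are assumptions for illustration, as the disclosure does not specify the calculation:

```python
import numpy as np

R = 0.2                                          # assumed wheel mounting radius [m]
ANGLES = np.deg2rad([0.0, 90.0, 180.0, 270.0])   # wheels at 90-degree intervals
# Each row maps a body velocity (vx, vy, omega) to one wheel's rim speed.
J = np.stack([-np.sin(ANGLES), np.cos(ANGLES), np.full(4, R)], axis=1)

def wheel_speeds(vx: float, vy: float, omega: float) -> np.ndarray:
    """Rim speeds [m/s] of the four omni wheels for a desired body velocity
    (translation in any direction plus rotation)."""
    return J @ np.array([vx, vy, omega])

def body_displacement(wheel_deltas: np.ndarray) -> np.ndarray:
    """Odometry: least-squares body displacement (dx, dy, dtheta) from the
    four rim displacements derived from the encoder rotation angles."""
    return np.linalg.lstsq(J, wheel_deltas, rcond=None)[0]
```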
- The laser range finder 20 is attached to the front of the autonomous mobile device 1 so as to face the forward direction of the device.
- The laser range finder 20 emits a laser and reflects it with a rotating mirror, thereby scanning the surroundings of the autonomous mobile device 1 horizontally in a fan shape with a central angle of 240°.
- The laser range finder 20 detects, for example, the laser reflected back by an object such as a wall or an obstacle, and obtains the angle and distance to the object from the detection angle of the laser (reflected wave) and from the time between emission of the laser and its return after reflection by the object (the propagation time).
- The laser range finder 20 is connected to the electronic control device 40, and outputs the detected distance and angle information on surrounding objects to the electronic control device 40.
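- The distance follows from the propagation time t as d = c·t/2 (c being the speed of light), and each detected (distance, angle) pair corresponds to a point in the sensor frame. A minimal sketch, assuming the beams are evenly spaced over the 240° fan:

```python
import numpy as np

def scan_to_points(ranges: np.ndarray, fov_deg: float = 240.0) -> np.ndarray:
    """Convert one fan-shaped scan into 2-D points in the sensor frame,
    assuming the beams are evenly spaced over the field of view."""
    half = np.deg2rad(fov_deg) / 2.0
    angles = np.linspace(-half, half, len(ranges))  # 0 rad = straight ahead
    return np.stack([ranges * np.cos(angles),       # x: forward
                     ranges * np.sin(angles)],      # y: left
                    axis=1)
```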
- The electronic control device 40 comprehensively controls the autonomous mobile device 1. It includes a microprocessor that performs calculations, a ROM that stores programs for causing the microprocessor to execute each process, a RAM that temporarily stores various data such as calculation results, a backup RAM that retains its stored contents, and the like, as well as an interface circuit that electrically connects the laser range finder 20 to the microprocessor, a motor driver for driving the electric motors 12, and so on.
- The electronic control device 40 is configured to exchange data with the global map correction device 3: it generates a global map, outputs it to the global map correction device 3, and acquires the corrected global map from the global map correction device 3. The electronic control device 40 then plans the movement route from the corrected global map and controls the electric motors 12 so that the device moves along the planned route. To this end, the electronic control device 40 includes a global map acquisition unit 41, a sensor information acquisition unit 42, a self-position detection unit 43, a route planning unit 44, a travel control unit 45, an obstacle avoidance control unit 46, and the like. These units are realized by a combination of the above-described hardware and software.
- The electronic control device 40, the electric motors 12, and the omni wheels 13 function as the movement means described in the claims.
- The global map acquisition unit 41 generates a global map representing object areas (obstacle areas) in which objects (obstacles) are present and areas in which they are absent, using, for example, SLAM technology. The global map acquisition unit 41 thus functions as the environmental map acquisition means described in the claims. When generating a global map with SLAM technology, the global map acquisition unit 41 first generates a local map based on the distance and angle information on surrounding objects read from the laser range finder 20 via the sensor information acquisition unit 42, and acquires the self-position from the self-position detection unit 43.
- The self-position detection unit 43 estimates the self-position by collating the local map with the global map, taking into account the movement amount of the device calculated from the rotation angles of the electric motors 12 read from the encoders 16. The global map acquisition unit 41 then transforms the local map from the coordinate system having the laser range finder 20 at its origin into the coordinate system of the global map according to the self-position, and projects the local map onto the global map. By repeating this processing while moving and sequentially adding the local maps to the global map, the global map acquisition unit 41 generates a global map of the entire surrounding environment, which is output to the global map correction device 3.
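- The projection of the local map onto the global map amounts to a planar rigid-body transform by the estimated self-position. A minimal sketch, assuming the pose is given as (x, y, theta) in the global coordinate system:

```python
import numpy as np

def to_global(points_local: np.ndarray, pose: tuple) -> np.ndarray:
    """Transform sensor-frame points into the global map frame using the
    estimated self-position pose = (x, y, theta)."""
    x, y, theta = pose
    c, s = np.cos(theta), np.sin(theta)
    rotation = np.array([[c, -s],
                         [s,  c]])              # 2-D rotation by theta
    return points_local @ rotation.T + np.array([x, y])
```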
- The route planning unit 44 reads the global map corrected by the global map correction device 3 and plans the movement route of the autonomous mobile device 1 from the corrected global map. The route planning unit 44 thus functions as the movement route planning means described in the claims.
- More specifically, the route planning unit 44 first dilates the outline of the obstacle areas contained in the corrected global map by the radius of the device using a Minkowski sum to generate expanded obstacle areas, and extracts the area excluding the expanded obstacle areas as a movable area in which the device can move without contacting obstacles.
- Next, the route planning unit 44 thins the extracted movable area using Hilditch's thinning method.
- Then, the route planning unit 44 searches the thinned movable area for the shortest path connecting the start position and the goal position using the A* algorithm (A-star algorithm), and thereby plans the movement route.
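- For illustration, a minimal sketch of the obstacle inflation and the A* search on a boolean grid follows; the grid-based dilation is a stand-in for the Minkowski sum, the Hilditch thinning step is omitted for brevity, and the unit step costs with a Manhattan heuristic are assumptions:

```python
import heapq
import numpy as np

def inflate(occupied: np.ndarray, r: int) -> np.ndarray:
    """Expand occupied cells by r cells (a grid stand-in for the
    Minkowski-sum inflation by the device radius)."""
    padded = np.pad(occupied, r)
    out = np.zeros_like(occupied)
    for dy in range(2 * r + 1):
        for dx in range(2 * r + 1):
            if (dy - r) ** 2 + (dx - r) ** 2 <= r * r:
                out |= padded[dy:dy + occupied.shape[0],
                              dx:dx + occupied.shape[1]]
    return out

def astar(free: np.ndarray, start: tuple, goal: tuple) -> list:
    """Shortest 4-connected path over free cells using an A* search."""
    h = lambda p: abs(p[0] - goal[0]) + abs(p[1] - goal[1])  # Manhattan
    frontier = [(h(start), 0, start, None)]   # (f, g, cell, parent)
    parents, closed = {}, set()
    while frontier:
        _, g, cell, parent = heapq.heappop(frontier)
        if cell in closed:
            continue
        closed.add(cell)
        parents[cell] = parent
        if cell == goal:                      # walk back to reconstruct path
            path = []
            while cell is not None:
                path.append(cell)
                cell = parents[cell]
            return path[::-1]
        for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nxt = (cell[0] + dy, cell[1] + dx)
            if (0 <= nxt[0] < free.shape[0] and 0 <= nxt[1] < free.shape[1]
                    and free[nxt] and nxt not in closed):
                heapq.heappush(frontier, (g + 1 + h(nxt), g + 1, nxt, cell))
    return []                                 # no path exists
```

In use, the free-space mask could be obtained from the corrected grid map as, for example, ~inflate(grid > 0, r), with r the device radius in cells.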
- The travel control unit 45 controls the electric motors 12 so as to move the device to the goal position along the planned movement route while avoiding obstacles. In this embodiment, the virtual potential method is adopted as the control method for this purpose. The virtual potential method generates a virtual attractive potential field toward the goal position and a virtual repulsive potential field around the obstacles to be avoided, and superimposes them so that the device heads for the goal position while avoiding contact with the obstacles. More specifically, the travel control unit 45 first calculates a virtual attractive force toward the goal position based on the self-position.
- Meanwhile, the obstacle avoidance control unit 46 calculates a virtual repulsive force for avoiding each obstacle based on the self-position, the moving speed, and the position and speed of the obstacle. The travel control unit 45 then obtains a virtual force vector by vector synthesis of the calculated virtual attractive force and virtual repulsive force, and drives the electric motors 12 (omni wheels 13) according to this virtual force vector, thereby controlling travel so that the device moves to the goal position while avoiding the obstacles.
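- A minimal sketch of one step of such a control law follows; the gains, the influence distance d0, and the classical repulsive form are assumptions, since the disclosure does not specify the potential functions:

```python
import numpy as np

def virtual_force(pos, goal, obstacles,
                  k_att: float = 1.0, k_rep: float = 0.5,
                  d0: float = 1.0) -> np.ndarray:
    """Virtual force vector: attraction toward the goal superimposed with
    repulsion from every obstacle closer than the influence distance d0."""
    pos = np.asarray(pos, dtype=float)
    goal = np.asarray(goal, dtype=float)
    force = k_att * (goal - pos)                  # attractive component
    for obs in np.atleast_2d(np.asarray(obstacles, dtype=float)):
        diff = pos - obs
        d = np.linalg.norm(diff)
        if 0.0 < d < d0:                          # repel only within range
            force += k_rep * (1.0 / d - 1.0 / d0) * diff / d**3
    return force    # the motors are driven according to this vector
```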
- Since the autonomous mobile device 1 is provided with the above-described global map correction device 3, it can acquire a global map that matches the actual surrounding environment. Therefore, for example, when a movement route is planned from the global map, a passage that is actually passable is prevented from being judged impassable because of a spuriously recorded obstacle. As a result, the device can move along an optimal route that matches the actual surrounding environment.
- Moreover, since the global map correction device 3 is mounted on the autonomous mobile device 1, the series of operations of generating the global map, correcting it, and planning a route from it can be performed efficiently.
- In the above embodiment, the object presence probability information held by each grid of the global map is converted into gray-scale information of a black-and-white image, but the image information is not limited to this; the probability information may instead be converted into gradation information or color information other than black and white (monochrome). The correspondence between the object presence probability and the gray-scale information is also not limited to the above embodiment; for example, the association of white and black with the object presence probability may be reversed.
- In the above embodiment, the global map is corrected by correcting an image, but the numerical values of the object presence probability of each grid constituting the global map (grid map) may instead be corrected directly.
- An ordinary display may be used as the display unit, and a keyboard, a mouse, or the like may be used as the operation input unit.
- In the above embodiment, the global map correction device 3 and the electronic control device 40 are separate, but their configuration and division of functions are not limited to this; for example, the global map correction device 3 and the electronic control device 40 may be implemented on the same hardware.
Description
Reference signs list:
3 Global map correction device
10 Main body
12 Electric motor
13 Omni wheel
14 Wheel
15 Free roller
16 Encoder
20 Laser range finder
30 Storage unit
31 Conversion unit
32 Inverse conversion unit
33 Temporary storage unit
34 Touch screen
35 Display unit
36 Operation input unit
37 Correction unit
40 Electronic control device
41 Global map acquisition unit
42 Sensor information acquisition unit
43 Self-position detection unit
44 Route planning unit
45 Travel control unit
46 Obstacle avoidance control unit
1. Loading the global map to be corrected (global map designation / input)
(1) Open the file dialog 410 and designate the global map to be corrected. The file can be loaded either as the entire global map or as a global map divided into several areas.
2. Designating the color to overwrite with (global map correction)
(1) Designate a point in the global map with the eyedropper tool 411 and use the color of that point.
(2) Designate a color from the palette 415.
(3) Designate a color by entering a value from 0 to 255; black is designated in the range 0 to 127, gray as 128, and white in the range 129 to 255.
3. Designating and correcting the area to be corrected (global map correction)
(1) Designate a point in the global map with the pencil tool 412 and overwrite that point with the designated color.
(2) Designate an area with the area designation tool 413 and fill it with the designated color using the fill tool 414.
4. Saving the corrected global map (global map designation / saving)
(1) Open the file dialog 410 and designate the global map to be saved.
Claims (4)
1. An environmental map correction device comprising:
- display means for displaying an environmental map in which an object area where an object is present is shown;
- input means for receiving a correction operation from a user on the environmental map displayed by the display means; and
- correction means for correcting the environmental map displayed on the display means based on the correction operation received by the input means.
2. The environmental map correction device according to claim 1, further comprising:
- conversion means for converting object presence probability information held by a plurality of grids constituting the environmental map into image information; and
- inverse conversion means for converting the image information back into the object presence probability information,
wherein the display means displays an environmental map image based on the image information converted by the conversion means, the correction means corrects the environmental map image displayed by the display means, and the inverse conversion means converts the image information of the environmental map image corrected by the correction means back into object presence probability information.
3. The environmental map correction device according to claim 2, wherein the image information is gray-scale information of a black-and-white image.
4. An autonomous mobile device comprising:
- environmental map acquisition means for acquiring an environmental map in which an object area where an object is present is shown;
- the environmental map correction device according to any one of claims 1 to 3, which corrects the environmental map acquired by the environmental map acquisition means;
- movement route planning means for planning a movement route from the environmental map corrected by the environmental map correction device; and
- movement means for driving the device itself to move along the movement route planned by the movement route planning means.
Priority Applications (3)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US13/063,962 US8515613B2 (en) | 2008-09-16 | 2009-08-25 | Environmental map correction device and autonomous mobile device |
KR1020117004984A KR101228487B1 (ko) | 2008-09-16 | 2009-08-25 | Environmental map correction device and autonomous mobile device |
EP09814227.6A EP2348375A4 (en) | 2008-09-16 | 2009-08-25 | ENVIRONMENTAL MAP CORRECTION DEVICE AND AUTONOMOUS MIGRATION DEVICE |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2008237196A JP5304128B2 (ja) | 2008-09-16 | 2008-09-16 | Environmental map correction device and autonomous mobile device |
JP2008-237196 | 2008-09-16 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2010032381A1 (ja) | 2010-03-25 |
Family
ID=42039233
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2009/004079 WO2010032381A1 (ja) | 2008-09-16 | 2009-08-25 | 環境地図修正装置及び自律移動装置 |
Country Status (5)
Country | Link |
---|---|
US (1) | US8515613B2 (ja) |
EP (1) | EP2348375A4 (ja) |
JP (1) | JP5304128B2 (ja) |
KR (1) | KR101228487B1 (ja) |
WO (1) | WO2010032381A1 (ja) |
Families Citing this family (26)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
KR101782057B1 (ko) * | 2010-05-03 | 2017-09-26 | 삼성전자주식회사 | Map generating apparatus and method |
US8798840B2 (en) * | 2011-09-30 | 2014-08-05 | Irobot Corporation | Adaptive mapping with spatial summaries of sensor data |
TWM451103U (zh) * | 2012-10-30 | 2013-04-21 | Agait Technology Corp | Walking device |
JP6136435B2 (ja) * | 2013-03-25 | 2017-05-31 | 村田機械株式会社 | Autonomous mobile device |
JP6020326B2 (ja) * | 2013-04-16 | 2016-11-02 | 富士ゼロックス株式会社 | Route search device, self-propelled working device, program, and recording medium |
JP6136543B2 (ja) * | 2013-05-01 | 2017-05-31 | 村田機械株式会社 | Autonomous mobile body |
CN104161487B (zh) * | 2013-05-17 | 2018-09-04 | 恩斯迈电子(深圳)有限公司 | Mobile device |
KR101509566B1 | 2013-06-18 | 2015-04-06 | 박재삼 | Apparatus and method for inputting the travel route of a mobile robot |
ES2818922T3 (es) * | 2014-03-12 | 2021-04-14 | Chugoku Electric Power | Method of measuring a distance |
WO2015193941A1 (ja) * | 2014-06-16 | 2015-12-23 | 株式会社日立製作所 | Map generation system and map generation method |
JP5902275B1 (ja) * | 2014-10-28 | 2016-04-13 | シャープ株式会社 | Autonomous mobile device |
JP6539845B2 (ja) * | 2015-03-31 | 2019-07-10 | 株式会社日本総合研究所 | Self-propelled traveling device, management device, and walking-obstacle location determination system |
JP2017177328A (ja) * | 2016-03-28 | 2017-10-05 | 富士ゼロックス株式会社 | Print system, server device, printer selection program, and print instruction distribution program |
US11132611B2 (en) * | 2016-05-27 | 2021-09-28 | Kabushiki Kaisha Toshiba | Information processing apparatus and information processing method for determining presence probability of object |
US11222438B2 (en) * | 2016-05-27 | 2022-01-11 | Kabushiki Kaisha Toshiba | Information processing apparatus, vehicle, and information processing method for presence probability of object |
US11204610B2 (en) * | 2016-05-30 | 2021-12-21 | Kabushiki Kaisha Toshiba | Information processing apparatus, vehicle, and information processing method using correlation between attributes |
EP3252658B1 (en) | 2016-05-30 | 2021-08-11 | Kabushiki Kaisha Toshiba | Information processing apparatus and information processing method |
US11212684B2 (en) * | 2016-07-27 | 2021-12-28 | Ryan Robert Hadley | Systems and methods for the visual representation and simulation of electromagnetic radiation distribution in a volume of space |
JP6867120B2 (ja) * | 2016-07-28 | 2021-04-28 | シャープ株式会社 | Map creation method and map creation device |
IL250762B (en) | 2017-02-23 | 2020-09-30 | Appelman Dina | Method and system for unmanned vehicle navigation |
US10222215B2 (en) * | 2017-04-21 | 2019-03-05 | X Development Llc | Methods and systems for map generation and alignment |
US10921816B2 (en) * | 2017-04-21 | 2021-02-16 | Korea Advanced Institute Of Science And Technology | Method and apparatus for producing map based on hierarchical structure using 2D laser scanner |
EP3729225A1 (en) * | 2017-12-21 | 2020-10-28 | University of Zagreb Faculty of Electrical Engineering and Computing | Interactive computer-implemented method, graphical user interface and computer program product for building a high-accuracy environment map |
US10835096B2 (en) * | 2018-08-30 | 2020-11-17 | Irobot Corporation | Map based training and interface for mobile robots |
PL3799618T3 (pl) | 2018-08-30 | 2023-02-06 | Elta Systems Ltd. | Method of navigating a vehicle and system therefor |
KR102097722B1 (ko) * | 2019-03-25 | 2020-04-06 | 주식회사 트위니 | Method for estimating the pose of a moving object using a big-cell grid map, recording medium storing a program for implementing the same, and computer program stored in a medium for implementing the same |
Family Cites Families (11)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US5006988A (en) * | 1989-04-28 | 1991-04-09 | University Of Michigan | Obstacle-avoiding navigation system |
US6667592B2 (en) * | 2001-08-13 | 2003-12-23 | Intellibot, L.L.C. | Mapped robot system |
KR100703692B1 (ko) * | 2004-11-03 | 2007-04-05 | 삼성전자주식회사 | System, apparatus, and method for distinguishing objects existing in space |
JP4243594B2 (ja) * | 2005-01-31 | 2009-03-25 | パナソニック電工株式会社 | Cleaning robot |
JP2006205348A (ja) * | 2005-01-31 | 2006-08-10 | Sony Corp | Obstacle avoidance device, obstacle avoidance method, obstacle avoidance program, and mobile robot device |
KR100791382B1 (ko) | 2006-06-01 | 2008-01-07 | 삼성전자주식회사 | Method for collecting and classifying information on the characteristics of a predetermined area according to a robot's movement path, robot controlled according to the area characteristics, and UI configuration method and apparatus using the area characteristics |
US8271132B2 (en) * | 2008-03-13 | 2012-09-18 | Battelle Energy Alliance, Llc | System and method for seamless task-directed autonomy for robots |
KR101075615B1 (ko) * | 2006-07-06 | 2011-10-21 | 포항공과대학교 산학협력단 | 주행 차량의 운전자 보조 정보 생성 장치 및 방법 |
JP2008134904A (ja) * | 2006-11-29 | 2008-06-12 | Toyota Motor Corp | Autonomous mobile body |
TWI341779B (en) * | 2007-12-04 | 2011-05-11 | Ind Tech Res Inst | System and method for graphically arranging robot's working space |
KR101372482B1 (ko) * | 2007-12-11 | 2014-03-26 | 삼성전자주식회사 | Method and apparatus for path planning of a mobile robot |
2008
- 2008-09-16 JP JP2008237196A patent/JP5304128B2/ja active Active
2009
- 2009-08-25 WO PCT/JP2009/004079 patent/WO2010032381A1/ja active Application Filing
- 2009-08-25 US US13/063,962 patent/US8515613B2/en active Active
- 2009-08-25 KR KR1020117004984A patent/KR101228487B1/ko active IP Right Grant
- 2009-08-25 EP EP09814227.6A patent/EP2348375A4/en not_active Withdrawn
Patent Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH07129238 | 1993-11-01 | 1995-05-19 | Fujitsu Ltd | Obstacle avoidance route generation method |
JP2005326944A (ja) * | 2004-05-12 | 2005-11-24 | Hitachi Ltd | Apparatus and method for generating a map image by laser measurement |
JP2007323402A (ja) * | 2006-06-01 | 2007-12-13 | Matsushita Electric Ind Co Ltd | Self-propelled device and program therefor |
Non-Patent Citations (1)
Title |
---|
See also references of EP2348375A4 |
Cited By (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2017222309A (ja) * | 2016-06-17 | 2017-12-21 | 日産自動車株式会社 | Parking assistance method and device |
CN108803595A (zh) * | 2017-04-27 | 2018-11-13 | 丰田自动车株式会社 | Environment arrangement robot and computer-readable storage medium storing its control program |
CN108803595B (zh) * | 2017-04-27 | 2021-07-13 | 丰田自动车株式会社 | Environment arrangement robot and computer-readable storage medium storing its control program |
Also Published As
Publication number | Publication date |
---|---|
KR101228487B1 (ko) | 2013-01-31 |
EP2348375A1 (en) | 2011-07-27 |
US20110178668A1 (en) | 2011-07-21 |
EP2348375A4 (en) | 2015-04-01 |
KR20110036861A (ko) | 2011-04-11 |
JP2010072762A (ja) | 2010-04-02 |
JP5304128B2 (ja) | 2013-10-02 |
US8515613B2 (en) | 2013-08-20 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
WO2010032381A1 (ja) | Environmental map correction device and autonomous mobile device | |
JP5157803B2 (ja) | Autonomous mobile device | |
JP6949107B2 (ja) | Systems and methods for training a robot to autonomously travel a route | |
KR100520708B1 (ko) | Method for displaying a three-dimensional map | |
WO2010038353A1 (ja) | Autonomous mobile device | |
WO2011074165A1 (ja) | Autonomous mobile device | |
JP5287051B2 (ja) | Autonomous mobile device | |
US20200292319A1 (en) | Systems and methods for electronic mapping and localization within a facility | |
WO2014178273A1 (ja) | Movement control device for an autonomous mobile body, autonomous mobile body, and movement control method | |
Lavrenov et al. | Tool for 3D Gazebo map construction from arbitrary images and laser scans | |
Kim et al. | UAV-UGV cooperative 3D environmental mapping | |
Lavrenov et al. | Automatic mapping and filtering tool: From a sensor-based occupancy grid to a 3D Gazebo octomap | |
JP5361257B2 (ja) | Autonomous mobile device | |
JP2017198517A (ja) | Three-dimensional map generation system | |
JP5212939B2 (ja) | Autonomous mobile device | |
JP4462156B2 (ja) | Autonomous mobile device | |
US20100309227A1 (en) | Map display device | |
KR20190045104A (ko) | Position determination system and method, and computer-readable recording medium | |
KR101966396B1 (ko) | Position determination system and method, and computer-readable recording medium | |
WO2020179459A1 (ja) | Map creation device, map creation method, and program | |
Digor et al. | Exploration strategies for a robot with a continuously rotating 3d scanner | |
KR20220046304A (ko) | Robot capable of recognizing the environment of a logistics space and traveling autonomously, and environment recognition and autonomous traveling method of the robot | |
Kaneko et al. | Point cloud data map creation from factory design drawing for LiDAR localization of an autonomous mobile robot | |
Yu et al. | 3D virtual reality simulator for planetary rover operation and testing | |
JP7221897B2 (ja) | Estimation device, moving body, estimation method, and program |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 09814227 Country of ref document: EP Kind code of ref document: A1 |
ENP | Entry into the national phase |
Ref document number: 20117004984 Country of ref document: KR Kind code of ref document: A |
REEP | Request for entry into the european phase |
Ref document number: 2009814227 Country of ref document: EP |
WWE | Wipo information: entry into national phase |
Ref document number: 2009814227 Country of ref document: EP |
WWE | Wipo information: entry into national phase |
Ref document number: 13063962 Country of ref document: US |
NENP | Non-entry into the national phase |
Ref country code: DE |