WO2007113956A1 - Estimation device, estimation method and estimation program for position of mobile unit - Google Patents
Estimation device, estimation method and estimation program for position of mobile unit
- Publication number
- WO2007113956A1 (PCT/JP2007/053746)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- wide-angle camera
- cut
- self
- map
- Prior art date
Links
Classifications
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S5/00—Position-fixing by co-ordinating two or more direction or position line determinations; Position-fixing by co-ordinating two or more distance determinations
- G01S5/16—Position-fixing by co-ordinating two or more direction or position line determinations; Position-fixing by co-ordinating two or more distance determinations using electromagnetic waves other than radio waves
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01B—MEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
- G01B11/00—Measuring arrangements characterised by the use of optical techniques
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01C—MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
- G01C11/00—Photogrammetry or videogrammetry, e.g. stereogrammetry; Photographic surveying
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01C—MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
- G01C15/00—Surveying instruments or accessories not provided for in groups G01C1/00 - G01C13/00
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/70—Determining position or orientation of objects or cameras
Definitions
- the present invention relates to an apparatus, an estimation method, and an estimation program for estimating the position of a moving body such as a robot.
- the self-position for which the error is minimized is output, on the assumption that the landmark candidates and the landmarks on the map are correctly associated.
- as the error, for example, the viewing-angle error between each landmark candidate on the camera image and the corresponding landmark on the map is squared, and the median of these squared values is used. Instead of the median, an integrated (summed) value of the errors may also be used.
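As one concrete reading of this error measure, the squared viewing-angle errors can be combined by median or by sum. The following Python sketch is illustrative only (the function names and the angle wrap-around handling are not specified in the patent):

```python
import math

def angle_error(observed, predicted):
    """Smallest signed difference between two bearings, in radians."""
    d = (observed - predicted) % (2 * math.pi)
    return d - 2 * math.pi if d > math.pi else d

def matching_error(observed_bearings, predicted_bearings, use_median=True):
    """Square the viewing-angle error between each landmark candidate on
    the camera image and the corresponding landmark on the map, then
    combine with the median (robust to outliers) or with a plain sum."""
    sq = sorted(angle_error(o, p) ** 2
                for o, p in zip(observed_bearings, predicted_bearings))
    if use_median:
        mid = len(sq) // 2
        return sq[mid] if len(sq) % 2 else 0.5 * (sq[mid - 1] + sq[mid])
    return sum(sq)
```

The median variant ignores a single badly mismatched landmark, which is why the text offers it alongside the plain integrated value.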
- Patent Document 1 Japanese Patent Laid-Open No. 2004-34272 proposes that a fluorescent lamp on the ceiling be a landmark.
- fluorescent lamps are widely used in large rooms. When there are many of them, identical fluorescent lamps are arranged regularly, so it is not easy to recognize which fluorescent lamp is visible in the camera image.
- Non-patent Document 1: "Robust self-localization method of a soccer robot with an omnidirectional camera and dead reckoning function", Journal of the Robotics Society of Japan, Vol. 22, No. 3, pp. 343-352, April 2004. Patent Document 1: JP 2004-34272 A
- An object of the present invention is to make it possible to accurately estimate the position of a moving object even if, after the map is created, part of a landmark is hidden or something easily confused with a landmark appears.
- An additional object of the present invention is to make it possible to estimate the position of the moving object even when part of the break between the floor and the wall cannot be seen in the wide-angle camera image.
- the moving object position estimation apparatus of the present invention includes a wide-angle camera and a landmark map, and estimates the self-position of the moving object by comparing the landmarks obtained from the wide-angle camera image with the landmarks on the map.
- the map stores breaks between the floor and objects perpendicular to the floor, and, in order to evaluate the validity of the estimated self-position, a matching means is provided that evaluates the error between the break stored in the map, projected onto the wide-angle camera image based on the estimated self-position, and the break appearing in the wide-angle camera image.
- the method of the present invention for estimating the position of the moving body uses the wide-angle camera and the landmark map, and estimates the self-position by comparing the landmarks obtained from the wide-angle camera image with the landmarks on the map.
- the map stores breaks between the floor and objects perpendicular to the floor, and the validity of the estimated self-position is evaluated by evaluating the error between the break stored in the map, projected onto the wide-angle camera image based on the estimated self-position, and the break appearing in the wide-angle camera image.
- the moving object position estimation program of the present invention is a program for estimating the self-position of a moving object by referring to an image from a wide-angle camera and a landmark map, and comparing the landmarks obtained from the wide-angle camera image with the landmarks on the map.
- the program includes an instruction for reading the breaks, stored in the map, between the floor and objects perpendicular to the floor, an instruction for estimating the self-position of the moving object by collating the landmarks, and a matching instruction for evaluating, in order to verify the validity of the estimated self-position, the error between the break projected onto the wide-angle camera image based on the estimated self-position and the break appearing in the wide-angle camera image.
- preferably, the break stored in the map is also projected onto the wide-angle camera image based on the self-position obtained by the internal sensor of the moving object, and the error between this projection and the break projected based on the estimated self-position is evaluated.
- the description related to the estimation of the moving body position applies to any of the estimation device, the estimation method, and the estimation program.
- the description related to the estimation device also applies to the estimation method and the estimation program.
- the description also applies to estimators and programs.
- the landmarks extracted from the wide-angle camera image are collated with the landmarks on the map to estimate the self-position. If at least three landmarks are extracted from both the wide-angle camera image and the map and placed in 1:1 correspondence, a candidate for the self-position is obtained.
- the estimated self-position depends on the combination of landmarks extracted from the map or wide-angle camera image and the corresponding relationship between the landmarks. If these change, another self-position is estimated. Therefore, if the number of landmarks is large, a large number of self-position candidates are generated.
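The growth in the number of candidates can be made concrete with a small enumeration sketch (a hypothetical helper in Python, not part of the patent):

```python
from itertools import combinations, permutations

def correspondence_candidates(image_landmarks, map_landmarks):
    """Enumerate every 1:1 assignment of three landmarks extracted from
    the camera image to three landmarks on the map.  Each assignment
    yields one self-position candidate, so the number of candidates
    grows rapidly with the number of landmarks."""
    for img_triple in combinations(image_landmarks, 3):
        for map_triple in permutations(map_landmarks, 3):
            yield tuple(zip(img_triple, map_triple))
```

With only 6 image landmarks and 8 map landmarks there are already C(6,3) x 8*7*6 = 20 x 336 = 6720 candidate correspondences, which is why pruning wrong candidates cheaply matters.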
- a landmark is, for example, a pattern perpendicular to the floor surface, such as an edge of an object such as a wall, furniture arranged along the wall, a locker, a window, or a door.
- a landmark may be hidden, or a landmark that is not on the map may appear. Because there are many landmarks, some landmarks are invisible, and false landmarks arise, the amount of computation needed to estimate the correct self-position increases dramatically, and the possibility of outputting an incorrect self-position increases.
- the wide-angle image has a break at the boundary between the floor and the wall.
- Landmarks perpendicular to the floor appear, for example, as radial line segments in the wide-angle camera image, whereas breaks appear as circumferential curves and line segments. Since a break also occurs wherever an object perpendicular to the floor is placed away from the floor-wall boundary, there are many breaks, and there may be breaks that are not in the map. Therefore, there is a limit to finding, in the map, the break corresponding to a break on the wide-angle camera image.
- once the self-position is estimated, it is possible to determine, based on the estimated self-position, what breaks from the map should appear where on the wide-angle camera image.
- the validity of the self-position estimate can therefore be verified by matching the projected break against the break on the wide-angle camera image and evaluating the error. If the validity of the self-position estimate can be verified, the amount of calculation required for correct self-position estimation can be reduced, and erroneous estimation caused by invisible or false landmarks can be reduced.
- part of the break between the floor and the wall may not be visible in the wide-angle camera image.
- a moving body estimates its own position using an internal sensor provided on a traveling wheel or foot, on the rotating shaft of a traveling motor, on a steering device, or the like.
- the self-position obtained by the internal sensor is low in accuracy, but when a break is missing from the wide-angle camera image and cannot be matched against the break projected from the map, the break on the map projected based on this self-position can be used in place of the break that should appear on the wide-angle camera image.
- FIG. 1 is a block diagram showing a self-position recognition unit and a traveling system of a robot according to an embodiment.
- FIG. 2 is a diagram showing a configuration of an omnidirectional camera used in the embodiment.
- FIG. 5 is a flowchart showing an algorithm of the self-position estimation method of the embodiment.
- FIG. 7 Diagram showing the break projected from the map (solid line), the break in the camera image (broken line), and the break projected based on the dead-reckoning position (dashed line)
- Figs. 1 to 7 show an embodiment relating to the estimation of the self-position of a moving object.
- Fig. 1 shows the traveling system of a robot incorporating the self-position estimation device of the embodiment; 2, 2 are a pair of drive wheels.
- the drive wheels 2 are rotated by the traveling motor 4 via the gear head 5, and the number of rotations of the motor 4 is monitored by the encoder 7 and input to the dead reckoning unit 9 described later.
- the dead reckoning unit 9 also receives the gear reduction ratio of the gear head 5 and the direction of rotation (forward/reverse).
- 3 and 3 are a pair of caster wheels.
- the travel command generator 8 controls the travel motors 4, 4 and the gear heads 5, 5 independently of each other, thereby independently controlling the rotation amount and direction of the drive wheels 2, 2, and moves the robot to the target position. In the case of a walking robot, legs consisting of joints may be used instead of the drive wheels 2 and caster wheels 3. The moving body may be any mechanism, such as a transport cart or a transfer device, in addition to a robot.
- Encoders 7, 7 are examples of internal sensors; their signals are input to the dead reckoning unit 9, which integrates the movement distances to obtain the self-position. Determining the self-position from the signals of the internal sensors is called dead reckoning. The obtained position is stored in the position storage unit 10 and updated whenever a more accurate position is estimated by the position estimation unit 16, and the travel command generation unit 8 generates a travel command based on the data in the position storage unit 10.
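The dead reckoning described here, integrating wheel movement derived from the encoder counts, can be sketched for a differential-drive base as follows. This is a standard odometry update under assumed parameters (the patent does not give the formula; `wheel_base` and the midpoint-heading approximation are illustrative choices):

```python
import math

def dead_reckoning_step(x, y, theta, d_left, d_right, wheel_base):
    """Update the pose (x, y, theta) from the distances travelled by the
    left and right drive wheels (integrated encoder counts scaled by the
    gear reduction ratio).  wheel_base is the distance between the two
    drive wheels."""
    d = 0.5 * (d_left + d_right)              # distance of the robot centre
    d_theta = (d_right - d_left) / wheel_base  # change of heading
    # advance along the average heading over the step
    x += d * math.cos(theta + 0.5 * d_theta)
    y += d * math.sin(theta + 0.5 * d_theta)
    return x, y, theta + d_theta
```

Because each step adds a small error, the integrated position drifts, which is why the text treats dead reckoning as low-accuracy and corrects it with the camera-based estimate.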
- the map 12 is a map of a range in which a moving body such as a robot can move, and the movable range is indoors, buildings, or outdoors.
- edges perpendicular to the floor surface are used as landmarks, and the positions of the landmarks and of the breaks between horizontal and vertical surfaces, such as the boundary between the floor and the wall, are stored in the map 12.
- Edges perpendicular to the floor are, for example, edges of furniture and lockers, and vertical edges of doors and windows.
- the position of a landmark is expressed by its position (X, Y) in the horizontal plane.
- the Z-direction (vertical) coordinates may be added to this, namely the Z-axis coordinate Z0 of the start point of the landmark and the Z-axis coordinate Z1 of its end point.
- the break is not limited to the break between the floor and the wall, but a break occurs at the position where the object stands up with respect to the floor surface, that is, at the end of the floor surface. In addition, breaks occur at the boundary between the sidewalk and the roadway, the boundary between the sidewalk and the flower bed, and the boundary between the pedestrian path and the ditch.
- the breaks on map 12 are represented by curves and line segments. Not only breaks at floor level but also breaks at higher positions, for example the edge between the top and side surfaces of a step rising from the floor, may be used.
- the map 12 is created manually for the space in which the moving body moves, such as a room, a building, or the entrance of a building, or it may be created by recognizing wide-angle camera images captured at known positions. 13 is a mirror and 14 is a camera; together they are called the wide-angle camera.
- the output image of the wide-angle camera is stored in the camera image storage unit 15, and the position estimation unit 16 estimates the self position of the moving object using the data in the position storage unit 10, the map 12, and the camera image.
- the self-position data is, for example, three components: (X, Y) coordinates with respect to the origin and the direction of the moving object.
- FIG. 2 shows a state in which a landmark M perpendicular to the floor is captured by the camera 14; light from points at different heights on the vertical landmark M enters at different radial positions along a line segment extending from the center of the field of view of the camera 14.
- FIG. 3 schematically shows how the three landmarks M1 to M3 and the three cuts N1 to N3 appear in the wide-angle camera image.
- r is the position in the radial direction and θ represents the direction.
- a landmark perpendicular to the floor appears as a line segment extending in the radial direction with a constant direction θ, from which its X and Y coordinates can be obtained.
- the breaks N1 to N3 appear, for example, as circumferential curves with a substantially constant radial position.
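With an idealized equidistant mirror model (an assumption for illustration only; the patent does not specify the optics, and `h` and `f` are hypothetical parameters), the (r, θ) behavior described above can be sketched as:

```python
import math

def project_floor_point(px, py, pose, h=1.0, f=1.0):
    """Project a point on the floor into an idealized omnidirectional
    image centred on the camera axis.  Assumes an equidistant model with
    camera height h and scale f.  Returns (radius, azimuth): points of a
    vertical landmark share one azimuth, so the landmark is a radial
    line segment, while floor/wall break points at a roughly constant
    distance map to a roughly constant radius, i.e. a circumferential
    curve."""
    x, y, theta = pose
    dx, dy = px - x, py - y
    azimuth = math.atan2(dy, dx) - theta            # direction θ in the image
    radius = f * math.atan(math.hypot(dx, dy) / h)  # radial position r
    return radius, azimuth
```

Projecting sampled map points of a break through such a model is what the cut projection units described below compute.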
- FIG. 4 shows the structure of the position estimation unit 16.
- the landmark matching unit 20 places the three landmarks extracted from the camera image in 1:1 correspondence with three landmarks extracted from the map 12, and estimates the self-position based on this correspondence.
- the estimated self-location is based on the assumed correspondence between landmarks.
- the landmark error evaluation unit 22 evaluates the error based on the estimated self-position using the landmark that was not used for the self-position estimation. That is, when the self-position is estimated, it is possible to estimate in which direction the other landmarks on the map 12 appear in the wide-angle camera image, and an error from the actual camera image can be evaluated.
- the landmark error evaluation unit 22 need not be provided.
- the cut projection unit 24 projects the cut on the map 12 into a wide-angle camera image based on the self-position estimated by the landmark matching.
- the self-position is estimated, it is possible to estimate how the cut on the map 12 looks in the wide-angle camera image based on this, and this is called the cut projection.
- the cut projection unit 26 projects how the break on the map 12 looks on the wide-angle camera image based on the position stored in the position storage unit 10, in other words the self-position obtained by dead reckoning using the encoders 7.
- the cut projection unit 26 need not be provided.
- the break matching unit 28 matches the break projected by the projection unit 24 against the breaks in the wide-angle camera image. If no break on the camera image lies within a predetermined distance of the projected break on the wide-angle camera image, the break projected by the projection unit 24 is matched against the break projected by the projection unit 26; in other words, the distance between the two breaks is calculated. This distance is, for example, the sum of the errors at each part of the break, or the sum of values obtained by converting the error at each part of the break with an appropriate function.
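This distance computation, including the fallback to the dead-reckoning projection for break points missing from the camera image, might be sketched as follows (the sampling scheme, the `gate` threshold, and the 0.5 down-weight are illustrative assumptions, not values from the patent):

```python
import math

def break_matching_error(projected, observed, dr_projected, gate=0.2):
    """Sum point-wise distances between the break projected from the map
    (based on the estimated self-position) and the break observed in the
    camera image, over matching (r, theta) samples.  Where the observed
    break is missing (None) or farther than `gate`, the point projected
    from the dead-reckoning position stands in for it, down-weighted
    because dead reckoning is less reliable."""
    total = 0.0
    for p, o, d in zip(projected, observed, dr_projected):
        err = math.hypot(p[0] - o[0], p[1] - o[1]) if o is not None else None
        if err is None or err > gate:
            err = 0.5 * math.hypot(p[0] - d[0], p[1] - d[1])  # weight < 1
        total += err
    return total
```

A wrong self-position candidate shifts the whole projected break, so this total rises sharply and the candidate can be rejected.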
- the position estimation unit 30 verifies the validity of the estimated value of the self-position based on the obtained error, and outputs an estimated position where the error is less than a predetermined value or the error is sufficiently small as the self-position. In other cases, the combination of the landmark on the wide-angle camera image side and the landmark on the map side to be matched by the matching unit 20 is changed, and the processing up to the matching unit 28 is repeated.
- FIG. 5 shows a self-position estimation algorithm.
- An omnidirectional camera image is input, and, for example, three landmark candidates are extracted from the camera image. Extraction may be random, or landmarks that should be visible from the current position may be extracted based on data in the position storage unit 10. Similarly, three landmark candidates are extracted from the map and placed in 1:1 correspondence. This correspondence may be random, or the correspondence most plausible given the data in the position storage unit 10 may be given priority. Once three landmarks are associated between the wide-angle camera image and the map, one self-position candidate is obtained. The self-position estimation error is then evaluated, for example, using other landmark candidates and other landmarks. This step may be omitted.
- a cut on the map is projected onto the wide-angle camera image.
- a landmark perpendicular to the floor surface gives data on only one point on the floor surface, whereas a break gives data on straight lines and curves on the floor surface. Therefore, the error can be evaluated more accurately by matching the breaks.
- the break on the map is also projected onto the wide-angle camera image based on the position obtained by dead reckoning. This process may be omitted, or may be executed only for areas where the break projected based on the estimated self-position and the wide-angle camera image do not correspond.
- the error between the break on the map, projected based on the estimated self-position, and the break on the wide-angle camera image is evaluated.
- the break on the map projected based on the position obtained by dead reckoning is compared with the break projected based on the self-position estimated from the landmarks.
- FIG. 6 shows the self-position estimation program 60.
- the landmark matching instruction 62 extracts, for example, three landmark candidates of the wide-angle camera image and three landmarks on the map, and temporarily associates them.
- the self-position candidate calculation instruction 63 calculates a self-position candidate based on the above correspondence.
- the landmark error evaluation instruction 64 evaluates an error between another landmark on the map and another landmark candidate in the wide-angle camera image based on the calculated self-position.
- the landmark error evaluation instruction 64 may not be provided.
- the cut projection command 65 projects the break on the map onto the wide-angle camera image based on the estimated self-position candidate.
- the cut projection command 66 projects the break on the map onto the wide-angle camera image based on the position obtained by dead reckoning; the command 66 need not be provided.
- the break matching instruction 67 performs matching between breaks and evaluates errors.
- the self-position estimation instruction 68 evaluates the validity of the estimated self-position based on the matching error evaluated by the break matching instruction 67, and outputs the position if it is valid; otherwise the processing of instructions 62 to 67 is repeated. In general there are a plurality of breaks; for example, a room contains several breaks. Therefore, matching is performed for each break, and the error is the sum of the errors of the individual breaks, or a statistic computed over the errors of the breaks.
- Fig. 7 shows an example of the matching of cuts.
- 70 is the break obtained from the self-position estimate and the map: the break on the map is projected onto the wide-angle camera image based on the self-position estimated from the landmarks.
- 72 is the break in the camera image; part of it is hidden by another object, or part of it is lost due to lighting conditions.
- the break on the map projected based on the self-position obtained by dead reckoning is shown as break 74. The error between breaks 70 and 72 is evaluated, for example, as the distance indicated by the arrows in Fig. 7.
- the error between breaks 70 and 74 is a self-position estimation error, and is an ambiguous error because the reliability of position recognition by dead reckoning is low.
- breaks 70 and 74 are both projections from the map, so even in an area where break 70 and break 72 do not correspond, the break 74 corresponding to break 70 exists. Therefore, in the area where breaks 70 and 72 correspond, the error is evaluated by summing the arrow distances in Fig. 7; in the area where they do not correspond, the error between breaks 70 and 74 is summed instead.
- the matching error may be evaluated by simply integrating the errors or by statistically evaluating the errors.
- a function that attenuates when the error increases may be determined so that the matching result is not affected by an area where the error is extremely large, and the error may be multiplied and then added.
- An upper limit may be set for the error.
- the error between the breaks 70 and 74 may be multiplied by a weight smaller than 1.
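These three robustness measures (attenuating large errors, capping them, and down-weighting the less reliable terms) can be combined in a small sketch. The exponential attenuation is one possible choice of attenuating function, not one prescribed by the patent, and the parameter values are illustrative:

```python
import math

def robust_total_error(errors, cap=1.0, scale=0.5):
    """Combine per-point matching errors so that areas with extremely
    large error do not dominate the result: each error is first limited
    to an upper bound `cap`, then multiplied by a factor that attenuates
    as the error grows (exp(-e/scale)) before being added up."""
    total = 0.0
    for e in errors:
        e = min(e, cap)                     # upper limit on each error
        total += e * math.exp(-e / scale)   # attenuate large errors
    return total
```

Because of the cap, a grossly mismatched region contributes no more than a bounded amount, so one bad area cannot outvote many well-matched break points.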
- a vertical landmark is a single point on the XY plane.
- a break, on the other hand, is a line segment or curve on the XY plane and carries more information. Therefore, by using the break between the floor and the wall, it is easy to evaluate whether the correspondence between the wide-angle camera image and the landmarks is correct.
Abstract
Description
Claims
Priority Applications (3)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US12/295,088 US8204643B2 (en) | 2006-03-31 | 2007-02-28 | Estimation device, estimation method and estimation program for position of mobile unit |
JP2008508464A JP4753103B2 (ja) | 2006-03-31 | 2007-02-28 | 移動体位置の推定装置と推定方法及び推定プログラム |
EP07737500.4A EP2017573B1 (en) | 2006-03-31 | 2007-02-28 | Estimation device, estimation method and estimation program for estimating a position of mobile unit |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2006-097422 | 2006-03-31 | ||
JP2006097422 | 2006-03-31 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2007113956A1 true WO2007113956A1 (ja) | 2007-10-11 |
Family
ID=38563228
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2007/053746 WO2007113956A1 (ja) | 2006-03-31 | 2007-02-28 | 移動体位置の推定装置と推定方法及び推定プログラム |
Country Status (5)
Country | Link |
---|---|
US (1) | US8204643B2 (ja) |
EP (1) | EP2017573B1 (ja) |
JP (1) | JP4753103B2 (ja) |
KR (1) | KR101013392B1 (ja) |
WO (1) | WO2007113956A1 (ja) |
Cited By (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2015049717A1 (ja) * | 2013-10-01 | 2015-04-09 | 株式会社日立製作所 | 移動体位置推定装置および移動体位置推定方法 |
WO2016016955A1 (ja) * | 2014-07-30 | 2016-02-04 | 株式会社日立製作所 | 自律移動装置及び自己位置推定方法 |
JP2022035936A (ja) * | 2020-08-20 | 2022-03-04 | 上海姜歌机器人有限公司 | ロボットの再位置決め方法、装置及び機器 |
US11662738B2 (en) | 2018-03-09 | 2023-05-30 | Casio Computer Co., Ltd. | Autonomous mobile apparatus, autonomous move method, and recording medium that use a selected environment map |
Families Citing this family (11)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2008065598A2 (en) * | 2006-11-28 | 2008-06-05 | Koninklijke Philips Electronics N.V. | A method, an apparatus and a computer program for data processing |
KR101503904B1 (ko) * | 2008-07-07 | 2015-03-19 | 삼성전자 주식회사 | 이동 로봇의 지도 구성 장치 및 방법 |
US9930252B2 (en) | 2012-12-06 | 2018-03-27 | Toyota Motor Engineering & Manufacturing North America, Inc. | Methods, systems and robots for processing omni-directional image data |
US9536152B2 (en) * | 2013-02-14 | 2017-01-03 | Xerox Corporation | Methods and systems for multimedia trajectory annotation |
US20150293533A1 (en) * | 2014-04-13 | 2015-10-15 | Bobsweep Inc. | Scanned Code Instruction and Confinement Sytem for Mobile Electronic Devices |
DE102016217637A1 (de) | 2016-09-15 | 2018-03-15 | Volkswagen Aktiengesellschaft | Odometrie-Verfahren zum Ermitteln einer Position eines Kraftfahrzeugs, Steuervorrichtung und Kraftfahrzeug |
US10824923B1 (en) * | 2019-01-23 | 2020-11-03 | Facebook Technologies, Llc | System and method for improving localization and object tracking |
US11461971B1 (en) * | 2021-03-25 | 2022-10-04 | Cesium GS, Inc. | Systems and methods for interactively extrapolating breaklines over surfaces |
KR102556767B1 (ko) * | 2022-06-28 | 2023-07-18 | 주식회사 브이알크루 | 비주얼 로컬라이제이션을 위한 방법 및 장치 |
KR20240004102A (ko) * | 2022-07-04 | 2024-01-11 | 주식회사 브이알크루 | 비주얼 로컬라이제이션을 위한 방법 및 장치 |
KR102556765B1 (ko) * | 2022-07-04 | 2023-07-18 | 주식회사 브이알크루 | 비주얼 로컬라이제이션을 위한 방법 및 장치 |
Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH08247775A (ja) * | 1995-03-15 | 1996-09-27 | Toshiba Corp | 移動体の自己位置同定装置および自己位置同定方法 |
JPH0953939A (ja) * | 1995-08-18 | 1997-02-25 | Fujitsu Ltd | 自走車の自己位置測定装置および自己位置測定方法 |
JP2004034272A (ja) | 2002-07-08 | 2004-02-05 | Mitsubishi Heavy Ind Ltd | 移動体の自己位置同定装置 |
JP2005315746A (ja) * | 2004-04-28 | 2005-11-10 | Mitsubishi Heavy Ind Ltd | 自己位置同定方法及び該装置 |
Family Cites Families (13)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US4940925A (en) * | 1985-08-30 | 1990-07-10 | Texas Instruments Incorporated | Closed-loop navigation system for mobile robots |
US4933864A (en) * | 1988-10-04 | 1990-06-12 | Transitions Research Corporation | Mobile robot navigation employing ceiling light fixtures |
US5051906A (en) * | 1989-06-07 | 1991-09-24 | Transitions Research Corporation | Mobile robot navigation employing retroreflective ceiling features |
US5525883A (en) * | 1994-07-08 | 1996-06-11 | Sara Avitzour | Mobile robot location determination employing error-correcting distributed landmarks |
JP3833786B2 (ja) * | 1997-08-04 | 2006-10-18 | 富士重工業株式会社 | 移動体の3次元自己位置認識装置 |
CA2373669A1 (en) * | 2002-02-27 | 2003-08-27 | Indal Technologies Inc. | Imaging system for a passenger bridge of the like for docking automatically with an aircraft |
JP2003263104A (ja) * | 2002-03-11 | 2003-09-19 | Mitsubishi Electric Corp | 撮像情報認識システム |
US6825485B1 (en) * | 2002-05-08 | 2004-11-30 | Storage Technology Corporation | System and method for aligning a robot device in a data storage library |
KR100493159B1 (ko) * | 2002-10-01 | 2005-06-02 | 삼성전자주식회사 | 이동체의 효율적 자기 위치 인식을 위한 랜드마크 및 이를이용한 자기 위치 인식 장치 및 방법 |
US20050234679A1 (en) * | 2004-02-13 | 2005-10-20 | Evolution Robotics, Inc. | Sequential selective integration of sensor data |
JP2005325746A (ja) | 2004-05-13 | 2005-11-24 | Toyota Industries Corp | 車両用排熱回収システム |
KR100809342B1 (ko) * | 2004-10-05 | 2008-03-05 | 삼성전자주식회사 | 조도기반 네비게이션 장치 및 방법 |
US7191056B2 (en) * | 2005-01-04 | 2007-03-13 | The Boeing Company | Precision landmark-aided navigation |
-
2007
- 2007-02-28 JP JP2008508464A patent/JP4753103B2/ja not_active Expired - Fee Related
- 2007-02-28 US US12/295,088 patent/US8204643B2/en active Active
- 2007-02-28 KR KR1020087022761A patent/KR101013392B1/ko not_active IP Right Cessation
- 2007-02-28 WO PCT/JP2007/053746 patent/WO2007113956A1/ja active Application Filing
- 2007-02-28 EP EP07737500.4A patent/EP2017573B1/en not_active Not-in-force
Patent Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH08247775A (ja) * | 1995-03-15 | 1996-09-27 | Toshiba Corp | 移動体の自己位置同定装置および自己位置同定方法 |
JPH0953939A (ja) * | 1995-08-18 | 1997-02-25 | Fujitsu Ltd | 自走車の自己位置測定装置および自己位置測定方法 |
JP2004034272A (ja) | 2002-07-08 | 2004-02-05 | Mitsubishi Heavy Ind Ltd | 移動体の自己位置同定装置 |
JP2005315746A (ja) * | 2004-04-28 | 2005-11-10 | Mitsubishi Heavy Ind Ltd | 自己位置同定方法及び該装置 |
Non-Patent Citations (3)
Title |
---|
"Robust Self-Position Identification Method of Soccer Robot Having Omnidirectional Camera and Dead Reckoning Function", JOURNAL OF THE ROBOTICS SOCIETY OF JAPAN, vol. 22, no. 3, 2004, pages 343 - 352 |
ADACHI T. ET AL.: "Kichi Kankyo no Edge Joho o Mochiita Tangan Camera no Jiko Ichi Suitei (Self-Location Estimation of a Moving Camera Using Edge Information of Known Environment)", IEICE TECHNICAL REPORT, vol. 105, no. 295, 10 September 2005 (2005-09-10), pages 31 - 36, XP003018225 * |
See also references of EP2017573A4 |
Cited By (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2015049717A1 (ja) * | 2013-10-01 | 2015-04-09 | 株式会社日立製作所 | 移動体位置推定装置および移動体位置推定方法 |
JPWO2015049717A1 (ja) * | 2013-10-01 | 2017-03-09 | 株式会社日立製作所 | 移動体位置推定装置および移動体位置推定方法 |
WO2016016955A1 (ja) * | 2014-07-30 | 2016-02-04 | 株式会社日立製作所 | 自律移動装置及び自己位置推定方法 |
JPWO2016016955A1 (ja) * | 2014-07-30 | 2017-04-27 | 株式会社日立製作所 | 自律移動装置及び自己位置推定方法 |
US11662738B2 (en) | 2018-03-09 | 2023-05-30 | Casio Computer Co., Ltd. | Autonomous mobile apparatus, autonomous move method, and recording medium that use a selected environment map |
JP2022035936A (ja) * | 2020-08-20 | 2022-03-04 | 上海姜歌机器人有限公司 | ロボットの再位置決め方法、装置及び機器 |
Also Published As
Publication number | Publication date |
---|---|
US8204643B2 (en) | 2012-06-19 |
EP2017573A4 (en) | 2012-10-31 |
US20090248305A1 (en) | 2009-10-01 |
EP2017573A1 (en) | 2009-01-21 |
KR20080106930A (ko) | 2008-12-09 |
KR101013392B1 (ko) | 2011-02-14 |
EP2017573B1 (en) | 2013-11-20 |
JP4753103B2 (ja) | 2011-08-24 |
JPWO2007113956A1 (ja) | 2009-08-13 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
JP4753103B2 (ja) | 移動体位置の推定装置と推定方法及び推定プログラム | |
KR101782057B1 (ko) | 지도 생성 장치 및 방법 | |
US20090312871A1 (en) | System and method for calculating location using a combination of odometry and landmarks | |
US8825398B2 (en) | Device for assisting in the navigation of a person | |
JP4798450B2 (ja) | ナビゲーション装置とその制御方法 | |
JP4264380B2 (ja) | 自己位置同定方法及び該装置 | |
JP5130419B2 (ja) | 自己位置認識方法及び自己位置認識装置 | |
US20100121488A1 (en) | Method and system for creating indoor environment map | |
US20100161224A1 (en) | Apparatus and method for detecting position and orientation of mobile object | |
JP5298741B2 (ja) | 自律移動装置 | |
JP5800613B2 (ja) | 移動体の位置・姿勢推定システム | |
JP2012084149A (ja) | モバイル機器のナビゲーション | |
JP5016399B2 (ja) | 地図情報作成装置及びそれを備えた自律移動装置 | |
JP2007249735A (ja) | ロボット位置制御装置およびロボット自己位置回復方法 | |
EP3079031B1 (en) | Moving robot and method of recognizing location of moving robot | |
KR20090000500A (ko) | 이동 로봇의 리로케이션 방법 및 장치 | |
KR20090025822A (ko) | 표식과 근거리무선통신을 이용한 로봇의 자기위치인식방법, 그를 이용한 로봇의 위치데이터 발생장치 및그를 이용한 로봇 | |
Hoang et al. | Multi-sensor perceptual system for mobile robot and sensor fusion-based localization | |
US20190331496A1 (en) | Locating a vehicle | |
JP2006234453A (ja) | 自己位置標定用ランドマーク位置の登録方法 | |
JP2008158690A (ja) | 移動体検知装置、警備ロボット、移動体検知方法および移動体検知プログラム | |
GB2567144B (en) | Apparatus and method for localising a vehicle | |
Jung et al. | Simultaneous localization and mapping of a wheel-based autonomous vehicle with ultrasonic sensors | |
Rusdinar et al. | Vision-based indoor localization using artificial landmarks and natural features on the ceiling with optical flow and a kalman filter | |
Kim et al. | Indoor localization using laser scanner and vision marker for intelligent robot |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 07737500 Country of ref document: EP Kind code of ref document: A1 |
|
WWE | Wipo information: entry into national phase |
Ref document number: 2007737500 Country of ref document: EP |
|
WWE | Wipo information: entry into national phase |
Ref document number: 2008508464 Country of ref document: JP |
|
WWE | Wipo information: entry into national phase |
Ref document number: 1020087022761 Country of ref document: KR |
|
WWE | Wipo information: entry into national phase |
Ref document number: 12295088 Country of ref document: US |
|
NENP | Non-entry into the national phase |
Ref country code: DE |