KR101153221B1 - Computation of Robot Location and Orientation using Repeating Feature of Ceiling Textures and Optical Flow Vectors - Google Patents


Info

Publication number
KR101153221B1
Authority
KR
South Korea
Prior art keywords
robot
position
image
direction
ceiling
Prior art date
Application number
KR1020090067983A
Other languages
Korean (ko)
Other versions
KR20110010422A (en)
Inventor
김형석
양창주
진홍신
Original Assignee
전북대학교산학협력단
Priority date
Filing date
Publication date
Application filed by 전북대학교산학협력단 filed Critical 전북대학교산학협력단
Priority to KR1020090067983A
Publication of KR20110010422A
Application granted
Publication of KR101153221B1

Abstract

The present invention relates to a method of accurately calculating the direction and position of a robot, without accumulating errors during movement, when the ceiling or floor of the robot workplace has a grid-like pattern repeating at regular intervals. The direction of the robot is calculated using the angle between a straight line detected in the image and the robot, and the position of the robot is determined from the number of pattern repetitions from the origin of the work site together with the vector from the nearest intersection to the robot center. For image frames in which no straight line or intersection is detected, the negative of the sum of the accumulated optical flow vectors (motion vectors between pixels) over those frames is added to the previously known direction or position. When straight lines or intersections are again detected in the image, the direction and position of the robot are corrected by the detected line or intersection information, so errors do not accumulate. More specifically, a camera is installed facing the ceiling at the center of the mobile robot; a set of angles, referenced to the origin of the robot workplace, is prepared for the repeated linear components of the grid-like ceiling pattern, and a set of positions is prepared for all intersections in the grid. To find the direction of the moving robot, a straight line is detected in the image and the angle between that line and the current robot direction is computed; this angle is added to each of the prepared origin-referenced line angles, and the candidate closest to the robot direction in the previous image is taken as the current robot angle. To calculate the position of the robot, the coordinates of the intersections around the robot's previous position point are taken from the prepared set of workplace intersections.
In addition, by selecting the most distinct intersection point in the current image, the position of the image center is expressed as a relative vector with respect to that intersection and converted into coordinates of the reference origin; the resulting candidate closest to the robot position in the previous image is taken as the current position. For images in which no straight line or intersection is detected, the optical flow between pixels is calculated; while such images continue, the sum vector of the motion vectors is accumulated and used to predict the direction and position of the robot. When an image in which a straight line or intersection is detected is again obtained, the predicted direction and position of the robot are corrected. Because the present invention uses only the regular repeating pattern of the ceiling to detect the position and direction of the robot, no device other than the imaging device is required.
Ceiling repeating pattern, ceiling lattice point, ceiling line, pixel motion vector, direction of robot, position of robot

Description

BACKGROUND OF THE INVENTION. Field of the Invention [0001]: the present invention relates to a method and apparatus for calculating the position and direction of a robot using a repeating pattern of the ceiling and pixel motion vectors.

For image frames in which no straight line or intersection is detected, the position information can be predicted using the sum vector of the optical flow vectors (motion vectors between pixels) accumulated over those frames. The present invention thus relates to an effective method of finding the position and direction of a robot using only the repeating pattern of a ceiling.

In order to calculate the position of a mobile robot, dead reckoning has traditionally been used to compute the position relative to the robot's initial position. Although this method is simple, errors caused by wheel slip accumulate. To overcome this problem, ultrasonic sensors have been widely used. They measure distance from the time-of-flight (TOF) of ultrasound to an object, which is not very accurate and is strongly influenced by the environment.

Recently, many landmark-based methods have been studied. One approach uses RFID technology: RFID tags are placed in the work environment, and a robot equipped with an RFID reader locates itself by detecting the tags as it moves, since each tag encodes its own position. Another approach uses visual landmarks attached to the ceiling. Color-based ceiling landmarks were developed to pack a large amount of code information into small landmarks; the robot computes its position by accumulating the relative positions of the landmarks, each landmark expressing its own relative position. These methods have the advantage of exploiting otherwise unused ceiling space, but they require the effort of attaching landmarks to the ceiling and damage its aesthetics.

SUMMARY OF THE INVENTION. The present invention has been made to solve the above problems and provides a new and effective method for determining the direction and position of a robot using only the repeated ceiling pattern, without attaching landmarks to the ceiling. For image frames in which no straight line or intersection is detected, the location information is predicted using the sum of the optical flow vectors (motion vectors between pixels) accumulated over those frames. The present invention relates to a technique for achieving this goal.

In order to solve the above-described problems, the present invention aims at recognizing the position and direction of a robot using only a video image, without side devices such as landmarks, by exploiting the features of a ceiling with a grid pattern as shown in Fig. 1. To do this, a camera is installed facing the ceiling at the center of the mobile robot to obtain a continuous digital image of the ceiling.

The configuration of the present invention detects the direction from the straight lines of the image and recognizes the position from the intersections, as shown in Fig. 2. If no straight line or intersection is detected in the image, the direction and position of the robot are predicted using the motion vectors between image frames. More specifically, the direction of the robot is calculated using the angle between a straight line detected in the image and the robot, and the position of the robot is calculated from the number of pattern repetitions from the origin of the work site together with the vector from the nearest intersection to the robot center. For image frames in which no straight line or intersection is detected, the negative of the sum of the accumulated optical flow vectors over those frames is added to the previously known direction or position. When straight lines or intersections are again detected in the image, the direction and position of the robot are corrected by the detected information, so errors do not accumulate.

For this purpose, a camera is installed facing the ceiling at the center of the mobile robot; an angle set, referenced to the origin of the robot work site, is prepared for the repeated linear components of the grid-like ceiling pattern, and a location set is prepared for all intersections of the pattern.

Direction recognition of robot

As shown in FIG. 3, the direction of the moving robot is found by computing the angle between the detected straight line and the current robot direction, adding it to each of the individual angles of the workplace reference lines prepared in advance, and selecting the candidate closest to the robot direction from the previous image as the current robot angle.

There are two families of parallel straight lines intersecting in the ceiling image of the grid pattern. As shown in FIG. 4 (a), the direction of a straight line parallel to the x-axis with respect to the origin of the actual space is defined as -π/2, and a straight line parallel to the y-axis defines the direction 0. In the same way, the direction parallel to the -x axis is π/2, and the direction parallel to the -y axis is π.

In the spatial coordinate system shown in Fig. 4 (b), the direction of each straight line is therefore one of the set {-π/2, 0, π/2, π}. In the image coordinate system of the camera mounted on the robot, the upward direction of the image is defined as y', the front direction of the robot, and the angle between the y-axis and y' is the robot heading θ. When one straight line is obtained and the angle between that straight line and the front direction of the robot is α, the direction of the actual robot is θ = θ_l + α, where θ_l is the true direction of the detected line in the spatial coordinate system.

In other words, the straight line closest to the vertical line connecting the top and bottom of the image is obtained first, and the angle formed by the vertical line and that straight line is α. However, since the straight line obtained from the image may correspond to a line oriented at 0, π/2, π, or -π/2 with respect to the origin, it must still be distinguished which of these lines was detected.

Assuming that the rotation speed of the robot is sufficiently low, that is, that the change in heading between frames is very small, the current direction will be similar to the direction computed from the most recently captured image. Denoting that previous direction by θ_prev, the current direction is the element of the candidate set {θ_l + α : θ_l ∈ {-π/2, 0, π/2, π}} closest to θ_prev.

Thus, if θ is the current direction,

Equation 1

θ = θ_l* + α,  where θ_l* = argmin over θ_l ∈ {-π/2, 0, π/2, π} of |θ_l + α - θ_prev|.

However, it is not necessary to calculate the initial direction very accurately: if only the rough direction is known, it suffices to determine which of the four candidate angles is closest. Therefore, there is the great advantage that this error does not accumulate. The next section calculates the range of allowed errors.
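The direction-recognition step above can be sketched as follows. This is a minimal illustration under the four-angle convention given above, not the patented implementation; the function names (`wrap`, `robot_direction`) are my own.

```python
import math

# Grid-line directions with respect to the workplace origin (FIG. 4 (a)).
LINE_ANGLES = [-math.pi / 2, 0.0, math.pi / 2, math.pi]

def wrap(angle):
    """Wrap an angle into (-pi, pi]."""
    while angle <= -math.pi:
        angle += 2 * math.pi
    while angle > math.pi:
        angle -= 2 * math.pi
    return angle

def robot_direction(alpha, theta_prev):
    """Equation 1: from the candidate headings theta_l + alpha, pick the one
    closest to the previous heading theta_prev. alpha is the measured angle
    between the detected ceiling line and the robot front (image vertical)."""
    candidates = [wrap(t + alpha) for t in LINE_ANGLES]
    return min(candidates, key=lambda c: abs(wrap(c - theta_prev)))
```

For example, with α = 0.05 rad and a previous heading near π/2, the candidate π/2 + 0.05 is selected; because the result always snaps to one of four well-separated candidates, small per-frame angle errors cannot accumulate.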

 Robot position recognition

In order to calculate the position of the robot, as shown in FIG. 5, the position coordinates of the intersections around the robot's previous position point are taken from the set of intersection positions of the workplace. By selecting the most distinct intersection point in the current image, the position of the image center is expressed as a relative vector to that intersection and is rotated and converted into coordinates of the reference origin; the resulting candidate closest to the robot position in the previous image is taken as the current position.

As shown in Fig. 4, there are many intersections formed by the crossing of the two families of straight lines in the target ceiling pattern. Let d be the distance between two adjacent intersections of the ceiling pattern. The positions of these intersections are therefore known in advance and, once a coordinate axis is defined, their coordinates can be calculated.

If the distance from this intersection to the center of the image is calculated, the current position of the robot can also be calculated.

If the actual position of the intersection recognized in the image of the previous frame is (x_p, y_p), the actual position of the intersection in the current image frame is one of the following points:

Equation 2

(x_p + i·d_x, y_p + j·d_y),  i, j ∈ {-1, 0, 1}

Here d_x and d_y are the intersection spacings along the x and y axes.

The position of the robot can be obtained by calculating the position of the intersection with the method above and referring to FIG. 6. That is, since the position of the robot coincides with the center point of the image, call this point C; let the detected intersection be A and let B be an adjacent intersection, so that the direction of the vector from A to B coincides with a grid direction known from the robot heading.

Denoting the current robot direction by θ, the relative vector from the selected intersection to the image center is rotated by -θ and scaled by the magnitude ratio s between the ceiling plane and the image plane, so the position of point C can be calculated as in Equation 3.

Equation 3

C = A + v,  v = s·R(-θ)·(c_img - a_img)

Here a_img and c_img are the pixel coordinates of the intersection and the image center, R(-θ) is the rotation by the negative of the robot direction, and v is the transformed actual-distance relative position vector described in FIG. 7. By Equation 2, point A can have nine candidate positions, so point C likewise has nine candidate positions.

However, since the speed of the robot is limited, the current position point C lies close to the previous position point C_prev. Therefore, the current position point of the robot is

Equation 4

C = (x_p + i*·d_x, y_p + j*·d_y) + v

where the indices are chosen by

Equation 5

(i*, j*) = argmin over i, j ∈ {-1, 0, 1} of ||(x_p + i·d_x, y_p + j·d_y) + v - C_prev||

so the position point of the robot can be obtained. i* and j* are the values in {-1, 0, 1} that minimize Equation 5. From Equation 2, the position of the intersection A can then also be found.

In conclusion, given the transformed actual-distance relative position vector v and the robot position point in the previous frame, the current position of the robot and the position of the intersection appearing in the current image can be calculated using Equation 5 and Equation 2. Therefore, given the initial position of the robot, its position can be computed step by step using Equation 2, and since Equation 5 always snaps to the nearest grid candidate, there is the important feature that errors in the position do not accumulate.
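The candidate generation of Equation 2 and the nearest-candidate selection of Equations 4 and 5 can be sketched as follows. This is an illustrative reading of the procedure: the grid spacings `dx`, `dy` and the transformed actual-distance relative position vector `rel_vector` (claim 2) are assumed to be computed elsewhere, and the names are mine, not the patent's.

```python
def position_candidates(prev_intersection, dx, dy):
    """Equation 2: the nine possible actual positions of the intersection
    seen in the current frame, around the previously recognized one."""
    xp, yp = prev_intersection
    return {(i, j): (xp + i * dx, yp + j * dy)
            for i in (-1, 0, 1) for j in (-1, 0, 1)}

def update_position(prev_intersection, prev_robot_pos, rel_vector, dx, dy):
    """Equations 4 and 5: each intersection candidate plus the transformed
    actual-distance relative position vector gives a robot-position
    candidate; snap to the one closest to the previous robot position."""
    vx, vy = rel_vector
    best = None
    for (i, j), (ax, ay) in position_candidates(prev_intersection, dx, dy).items():
        cand = (ax + vx, ay + vy)
        d2 = (cand[0] - prev_robot_pos[0]) ** 2 + (cand[1] - prev_robot_pos[1]) ** 2
        if best is None or d2 < best[0]:
            best = (d2, cand, (ax, ay))
    _, robot_pos, intersection = best
    return robot_pos, intersection
```

Because the result is always snapped to the nearest grid candidate, per-frame measurement errors are absorbed by the known grid geometry rather than accumulated.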

Complementing robot direction and position using vector of optical flow

In the case of images in which no straight line or intersection is detected, the optical flow between pixels is calculated as shown in FIG. 8 or FIG. 9. While images without a straight line or intersection continue, the sum vector of the motion vectors is accumulated and used to predict the direction and position of the robot. When the robot again obtains an image in which a straight line or an intersection is detected, the predicted direction and position are corrected.

When the robot moves, the optical flow, i.e. the motion vector of pixels between adjacent image frames, points in the direction opposite to the robot motion. Since the motion vectors of the many pixels in an image may differ slightly, a representative motion vector between image frames is used.

Let v_{n,n+1} denote the representative motion vector between frames n and n+1. When the optical flows of k image frames starting from the m-th frame are accumulated, the accumulated optical flow V_{m,m+k} is

Equation 6

V_{m,m+k} = v_{m,m+1} + v_{m+1,m+2} + ... + v_{m+k-1,m+k}.

Therefore, when the direction angle of the robot at time m with respect to the origin is θ_m, the angle θ_{m+k} after k frames is

Equation 7

θ_{m+k} = θ_m - θ_V

where θ_V is the rotation angle implied by the accumulated optical flow V_{m,m+k}.

If an image containing a straight line is then obtained, the angle of the robot is derived from that line: the angle obtained from the line is added to each of the individual line angles to form the direction-angle candidate set, and the candidate with the smallest difference from the direction predicted by the accumulated motion vectors is determined as the current robot direction.

If there is no intersection in the image, the sum of the motion vectors is obtained by accumulating the optical flow vectors as in Equation 6, as shown in FIG. 9; its negative is added to the robot position obtained from the last image that contained an intersection, giving the predicted robot position.

If one or more intersection points are detected in the image, a candidate set for the actual robot position is obtained as in the robot position recognition method above, and the position point closest to the predicted robot position is selected from that candidate set as the corrected current robot position.
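The complementing step above amounts to dead reckoning on the accumulated flow of Equation 6, corrected when a feature reappears. A hedged sketch, assuming the representative per-frame flow vectors and the pixel-to-distance `scale` are supplied by other routines:

```python
def accumulate_flow(flow_vectors):
    """Equation 6: sum the representative per-frame motion vectors
    v_{m,m+1}, ..., v_{m+k-1,m+k} into one accumulated vector."""
    sx = sum(v[0] for v in flow_vectors)
    sy = sum(v[1] for v in flow_vectors)
    return (sx, sy)

def predict_position(last_known_pos, flow_vectors, scale=1.0):
    """While no intersection is visible, add the NEGATIVE of the accumulated
    flow (the flow opposes robot motion) to the last position that was
    computed from an intersection."""
    sx, sy = accumulate_flow(flow_vectors)
    return (last_known_pos[0] - scale * sx, last_known_pos[1] - scale * sy)

def correct_position(predicted_pos, candidate_positions):
    """When an intersection reappears, replace the prediction with the
    candidate actual position closest to it."""
    return min(candidate_positions,
               key=lambda c: (c[0] - predicted_pos[0]) ** 2
                           + (c[1] - predicted_pos[1]) ** 2)
```

The same accumulate-then-snap pattern applies to the heading (Equation 7): the prediction drifts only while no feature is visible, and the first detected line or intersection discards the drift entirely.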

The present invention is a technique for recognizing the position and direction of a robot using only a ceiling grid pattern such as that shown in FIG. 1. Given an initial direction and position, the direction and position of the robot are computed incrementally as it moves; the small errors that occur are corrected each time an intersection of the ceiling pattern is found, so movement can continue without error accumulation. In particular, since the present invention uses the existing ceiling pattern as it is, an accurate position can be measured without additional devices such as the landmarks or encoders that are essential in existing products. Since most offices have a lattice-pattern ceiling to which the present invention can be applied, the invention is especially useful for office robots.

In order to demonstrate the developed ceiling-pattern-based position and orientation recognition method, it was installed on an indoor mobile robot performing an errand function. The tasks the robot should perform are:

 a. At a certain location in an office where several people work, coffee is made and a person to deliver it to is designated at random;

 b. The robot delivers the coffee to the designated person while avoiding obstacles;

 c. Once delivery is complete, the robot returns to the original coffee table and waits for another delivery order.

For this purpose, a small indoor robot as shown in Fig. 10 was fitted with a camera facing the ceiling and recognized the grid pattern of the office ceiling shown in Fig. 4 (a).

Figure 11 shows the layout of the office and the people performing the above scenario. At the top of the figure is the table for making coffee, with destination desk 1, to which the coffee is delivered, to its left. To carry out the scenario task, the robot loads coffee at the coffee table position, follows the path to destination desk 1, rings a bell to notify that the coffee delivery has arrived, and, when the person sitting at desk 1 takes the coffee, returns to the coffee table.

Fig. 12 shows, in order, the robot performing the above tasks. In FIG. 12 (a) the robot receives coffee at the coffee table; in the images of FIGS. 12 (b) through (p) it follows the curved path to the destination, delivers the coffee, and travels back; and in FIG. 12 (r) it returns to the coffee table position, completing the delivery mission.

When the present invention was implemented on the office coffee delivery robot described above, the robot always reached its destination precisely along the correct path and accomplished the errand mission well. The measured position error was less than 2 cm and the direction error was less than 2 degrees.

FIG. 1 is a view illustrating ceilings having repeated grid patterns;

FIG. 2 is a main flow chart of the present invention for determining the direction and position of a robot using a repeated ceiling pattern;

FIG. 3 is a flowchart of the routine for determining the direction of the robot using the detected straight line;

FIG. 4 (a) is a diagram illustrating origin-based coordinates and robot coordinates for a straight line and an intersection point in a typical ceiling pattern;

FIG. 4 (b) shows the origin-based coordinate system (x-y) and the robot reference coordinate system (x'-y') superimposed;

FIG. 5 is a flowchart of the routine for calculating the position of the robot using an intersection of straight lines in the image;

FIG. 6 is an auxiliary figure used to calculate the actual position of the robot in the routine of FIG. 5;

FIG. 7 is a routine describing the method of converting to the actual-distance relative position vector according to claim 2;

FIG. 8 is a routine showing, for images in which no straight line is found, a method of predicting the robot direction using the pixel motion vectors (optical flow) between image frames or their accumulated vector, and of correcting the predicted direction once a straight line is found;

FIG. 9 is a routine showing, for images in which no intersection of straight lines is found, a method of predicting the robot position using the pixel motion vectors or their accumulated vector between image frames, and of correcting the predicted position using an intersection once one is found;

FIG. 10 is a view of the mobile robot used to apply the present invention to delivering coffee in an office;

FIG. 11 is a view of the indoor room in which the office coffee errand function is performed;

FIG. 12 shows scenes in which the robot performs the coffee delivery errand mission. Here, (a) is a scene where coffee is taken from the coffee table, (j), (k), and (l) are scenes where the coffee is delivered to the destination, and the remaining panels show the robot returning to the coffee table position.

Claims (4)

  1. A method comprising: installing a camera at the center of a mobile robot, facing the ceiling, to obtain continuous digital images of the ceiling;
    calculating, for the repeated linear components of the repeated ceiling pattern of the robot workplace, the angle of each component with respect to the y-axis at the reference origin of the workplace, and preparing this angle set at initialization;
    preparing, at initialization, an intersection set composed of the position information of all intersections in the ceiling pattern;
    in order to calculate the direction during autonomous movement of the robot, detecting a straight line in the image taken during movement, obtaining from the image the angle between the straight line and the current robot direction, and adding that angle to each individual angle in the prepared angle set to form the direction-angle candidate set of the robot;
    selecting, as the current robot angle, the angle in the direction-angle candidate set closest to the robot direction angle obtained from the previous image;
    in order to calculate the position during autonomous movement, taking from the intersection set prepared at initialization only the position coordinates of the intersections around the robot's previous image position, as the robot reference position candidate set;
    selecting the most distinct intersection point in the ceiling image photographed at the current position, expressing the pixel coordinates of that intersection as coordinates relative to the image center, and converting them into the transformed actual-distance relative position vector;
    adding the transformed actual-distance relative position vector to each element of the robot reference position candidate set to form the robot actual position candidate set; and
    determining, among the elements of the robot actual position candidate set, the position point closest to the robot position obtained from the previous image as the current robot position point,
    in a method of computing the position and orientation of a robot using the repeating pattern of the ceiling and pixel motion vectors.
  2. The method according to claim 1, wherein the transformed actual-distance relative position vector is computed by:
    obtaining the relative vector from the image center to the selected intersection;
    rotating that vector by the robot direction obtained in claim 1, in the opposite direction;
    scaling the rotated intersection vector by the actual distance per unit pixel interval, so that the position of the intersection becomes an actual-distance relative vector from the robot center; and
    multiplying that actual-distance relative vector by a negative sign, so that the robot center is expressed as an actual-distance relative vector with respect to the intersection, yielding the transformed actual-distance relative position vector,
    in the method of computing the position and orientation of a robot using the repeating pattern of the ceiling and pixel motion vectors.
  3. The method according to claim 1, wherein, when calculating the robot direction:
    if no straight-line component is found in the image, so that the direction of the robot cannot be calculated, the optical flow vectors are accumulated until a line component is found, and the negative of the motion-vector sum is added to the last computed robot direction to predict and use the current direction; and
    when an image containing a straight line is obtained, the angle of the robot is obtained from the line, a direction-angle candidate set is formed as in the direction-finding method of claim 1, and the candidate whose difference from the direction given by the accumulated motion vectors is smallest is determined as the current robot direction,
    in the method of computing the position and orientation of a robot using the repeating pattern of the ceiling and pixel motion vectors.
  4. The method according to claim 1 or claim 2, wherein, when calculating the robot position:
    if there is no intersection in the image, the optical flow vectors are accumulated until an image containing an intersection appears, the sum of the motion vectors is obtained, and its negative is added to the robot position last obtained from an image with an intersection, to obtain the robot predicted position;
    when an intersection is detected in the image, a robot actual position candidate set is obtained as in claim 1 or claim 2, with the range of the reference position candidate set enlarged by two to three times; and
    the position point closest to the robot predicted position is selected from the robot actual position candidate set as the corrected current robot position, completing the calculation of the robot position point,
    in the method of computing the position and orientation of a robot using the repeating pattern of the ceiling and pixel motion vectors.
KR1020090067983A 2009-07-24 2009-07-24 Computation of Robot Location and Orientation using Repeating Feature of Ceiling Textures and Optical Flow Vectors KR101153221B1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
KR1020090067983A KR101153221B1 (en) 2009-07-24 2009-07-24 Computation of Robot Location and Orientation using Repeating Feature of Ceiling Textures and Optical Flow Vectors


Publications (2)

Publication Number Publication Date
KR20110010422A KR20110010422A (en) 2011-02-01
KR101153221B1 true KR101153221B1 (en) 2012-06-05

Family

ID=43770964

Family Applications (1)

Application Number Title Priority Date Filing Date
KR1020090067983A KR101153221B1 (en) 2009-07-24 2009-07-24 Computation of Robot Location and Orientation using Repeating Feature of Ceiling Textures and Optical Flow Vectors

Country Status (1)

Country Link
KR (1) KR101153221B1 (en)

Families Citing this family (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR101122303B1 (en) * 2009-07-24 2012-03-21 삼성중공업 주식회사 Calibration unit and working system having the same
FR2979428B1 (en) 2011-08-24 2013-10-04 Commissariat Energie Atomique Method of locating an object using a reference grid
CN104540648B (en) * 2012-08-02 2018-10-12 株式会社富士 Have the working rig and electronic part mounting of articulated robot
CN106020190B (en) * 2016-05-26 2019-03-01 山东大学 Track learning controller, control system and method with initial state error correction
CN108888188A (en) * 2018-06-14 2018-11-27 深圳市沃特沃德股份有限公司 Sweeping robot position calibration method and system

Citations (1)

Publication number Priority date Publication date Assignee Title
KR100564236B1 (en) 2001-09-26 2006-03-29 현대중공업 주식회사 Self-localization apparatus and method of mobile robot


Also Published As

Publication number Publication date
KR20110010422A (en) 2011-02-01

Similar Documents

Publication Publication Date Title
Ohya et al. Vision-based navigation by a mobile robot with obstacle avoidance using single-camera vision and ultrasonic sensing
JP5881743B2 (en) Self-position estimation of mobile camera using depth map
CN103778635B (en) For the method and apparatus processing data
US8401242B2 (en) Real-time camera tracking using depth maps
Xu et al. Ceiling-based visual positioning for an indoor mobile robot with monocular vision
Surmann et al. An autonomous mobile robot with a 3D laser range finder for 3D exploration and digitalization of indoor environments
KR101003168B1 (en) Multidimensional Evidence Grids and System and Methods for Applying Same
JP2007316966A (en) Mobile robot, control method thereof and program
US6453223B1 (en) Infrastructure independent position determining system
Wu et al. Recovery of the 3-d location and motion of a rigid object through camera image (an Extended Kalman Filter approach)
Burschka et al. Vision-based control of mobile robots
JP2009193240A (en) Mobile robot and method for generating environment map
JP4375320B2 (en) Mobile robot
Mautz et al. Survey of optical indoor positioning systems
Park et al. Three-dimensional tracking of construction resources using an on-site camera system
Garcia et al. Positioning an underwater vehicle through image mosaicking
US9355453B2 (en) Three-dimensional measurement apparatus, model generation apparatus, processing method thereof, and non-transitory computer-readable storage medium
Dorfmüller Robust tracking for augmented reality using retroreflective markers
KR20110097140A (en) Apparatus for estimating location of moving robot and method thereof
KR20040029493A (en) Landmark, apparatus and method for determining position of autonomous vehicles effectively
Madsen et al. Optimal landmark selection for triangulation of robot position
WO2004044528A1 (en) Surveying instrument and electronic storage medium
US8588471B2 (en) Method and device of mapping and localization method using the same
Rekleitis et al. Simultaneous planning, localization, and mapping in a camera sensor network
Kitt et al. Monocular visual odometry using a planar road model to solve scale ambiguity

Legal Events

Date Code Title Description
A201 Request for examination
E902 Notification of reason for refusal
E90F Notification of reason for final refusal
E701 Decision to grant or registration of patent right
GRNT Written decision to grant
FPAY Annual fee payment: payment date 20150511, year of fee payment 4
FPAY Annual fee payment: payment date 20160517, year of fee payment 5
FPAY Annual fee payment: payment date 20170515, year of fee payment 6
FPAY Annual fee payment: payment date 20180418, year of fee payment 7
FPAY Annual fee payment: payment date 20190422, year of fee payment 8