KR20110010422A - Computation of robot location and orientation using repeating feature of ceiling textures and optical flow vectors - Google Patents

Computation of robot location and orientation using repeating feature of ceiling textures and optical flow vectors Download PDF

Info

Publication number
KR20110010422A
Authority
KR
South Korea
Prior art keywords
robot
position
image
direction
intersection
Prior art date
Application number
KR1020090067983A
Other languages
Korean (ko)
Other versions
KR101153221B1 (en)
Inventor
김형석
양창주
진홍신
Original Assignee
전북대학교산학협력단
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 전북대학교산학협력단 filed Critical 전북대학교산학협력단
Priority to KR1020090067983A priority Critical patent/KR101153221B1/en
Publication of KR20110010422A publication Critical patent/KR20110010422A/en
Application granted granted Critical
Publication of KR101153221B1 publication Critical patent/KR101153221B1/en

Links

Images

Classifications

    • BPERFORMING OPERATIONS; TRANSPORTING
    • B25HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25JMANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J13/00Controls for manipulators
    • B25J13/08Controls for manipulators by means of sensing devices, e.g. viewing or touching devices
    • B25J13/088Controls for manipulators by means of sensing devices, e.g. viewing or touching devices with position, velocity or acceleration sensors
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B25HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25JMANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J19/00Accessories fitted to manipulators, e.g. for monitoring, for viewing; Safety devices combined with or specially adapted for use in connection with manipulators
    • B25J19/02Sensing devices
    • B25J19/04Viewing devices
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B25HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25JMANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J9/00Programme-controlled manipulators
    • B25J9/16Programme controls
    • B25J9/1694Programme controls characterised by use of sensors other than normal servo-feedback from position, speed or acceleration sensors, perception control, multi-sensor controlled systems, sensor fusion
    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05DSYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00Control of position, course or altitude of land, water, air, or space vehicles, e.g. automatic pilot
    • G05D1/02Control of position or course in two dimensions
    • G05D1/021Control of position or course in two dimensions specially adapted to land vehicles
    • G05D1/0231Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means
    • G05D1/0246Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means using a video camera in combination with image processing means
    • YGENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y10TECHNICAL SUBJECTS COVERED BY FORMER USPC
    • Y10STECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y10S901/00Robots
    • Y10S901/01Mobile robot

Abstract

The present invention relates to a method of accurately calculating the direction and position of a robot, without accumulation of errors during movement, when the ceiling or floor of the robot workplace has a grid-like pattern that repeats at regular intervals. The direction of the robot is calculated from the angle between a straight line detected in the image and the robot, and the position of the robot is determined from the coordinates of the grid intersections, counted from the origin of the work site, together with the vector from the detected intersection to the robot center. In an image frame in which no straight line or intersection is detected, the negative of the sum vector of the optical flow vectors (the motion vectors between pixels), accumulated over the image frames, is added to the direction or position of the previous image. When straight lines or intersections are again detected, the direction and position of the robot are corrected using the detected straight-line or intersection information, so that errors do not accumulate. More specifically, a camera is installed at the center of the mobile robot, facing the ceiling. A set of angles, referenced to the origin of the robot workplace, is prepared for the repeated linear components of the grid-like ceiling pattern, and a set of locations is prepared for all intersections in the grid. To obtain the direction of the moving robot, a straight line is detected in the image and the angle between the straight line and the current direction of the robot is calculated; this angle is added to each of the reference angles prepared in advance, and the resulting candidate closest to the robot direction in the previous image is taken as the current robot angle. To calculate the position of the robot, the coordinates of the intersections around the robot's previous position are taken from the set of workplace intersections. The most distinct intersection in the current image is selected, the position of the image center is expressed as a relative vector with respect to that intersection and converted into coordinates of the reference origin, and the candidate position closest to the robot position in the previous image is chosen. For images in which no straight line or intersection is detected, the optical flow between pixels is calculated; if such images continue, the sum vector of the motion vectors is accumulated and used to predict the direction and position of the robot. When an image in which a straight line or an intersection is detected is obtained, the predicted direction and position of the robot are corrected. Because the present invention uses only the regular repeating pattern of the ceiling to detect the position and direction of the robot, no device other than the imaging device is required.

Description

BACKGROUND OF THE INVENTION Field of the Invention [0001] The present invention relates to a method and apparatus for calculating the position and direction of a robot using a repeating pattern of a ceiling and pixel motion vectors (optical flow).

For image frames in which no straight line or intersection is detected, the position information can be predicted using the sum vector of the optical flow vectors (the motion vectors between pixels) accumulated over the image frames. The present invention thus relates to an effective method of determining the position and direction of a robot using only the repeating pattern of a ceiling.

To calculate the position of a mobile robot, dead reckoning has traditionally been used, in which the position is computed incrementally from the robot's initial position. Although this method is simple, errors caused by wheel slip accumulate over time. To overcome this problem, ultrasonic sensors have been widely used. These measure distance from the time of flight (TOF) of ultrasound to an object, but the measurements are not very accurate and are strongly affected by the environment.

Recently, many landmark-based methods have been studied. One approach uses RFID technology: RFID tags are placed in the work environment, and the robot, equipped with an RFID reader, locates the tags as it moves. Because each RFID tag encodes its own position, the tags can be used to locate the robot. Another approach uses visual landmarks attached to the ceiling. Color-based ceiling landmarks have been developed so that a large amount of code information can be carried by small landmarks; each landmark expresses its own relative position, and the robot calculates its position by accumulating the relative positions of the landmarks it observes. Attaching landmarks to the otherwise unused ceiling has the advantage of not consuming workspace, but it requires the effort of installing the landmarks and detracts from the appearance of the ceiling.

SUMMARY OF THE INVENTION The present invention has been made to solve the above-mentioned problems, and provides a new and effective method for determining the direction and position of a robot using only the repeated ceiling pattern, without attaching landmarks to the ceiling. For image frames in which no straight line or intersection is detected, the location information is predicted using the sum vector of the optical flow vectors (the motion vectors between pixels) accumulated over the image frames. The present invention relates to a technique for achieving this goal.

To solve the above problems, the present invention recognizes the position and direction of a robot using only video images, without auxiliary devices such as landmarks, by exploiting the grid pattern of the ceiling shown in Fig. 1. To do this, a camera is installed at the center of the mobile robot, facing the ceiling, to obtain a continuous digital image of the ceiling.

The configuration of the present invention, shown in Fig. 2, detects the robot direction from the straight lines in the image and recognizes its position from the intersections. If no straight line or intersection is detected in the image, the direction and position of the robot are predicted using the motion vectors between image frames. More specifically, the direction of the robot is calculated from the angle between a straight line detected in the image and the robot, and the position of the robot is calculated from the coordinates of the grid intersections, referenced to the origin of the work site, together with the vector from the detected intersection to the center of the image. In an image frame in which no straight line or intersection is detected, the negative of the sum vector of the accumulated optical flow vectors (the motion vectors between pixels) is added to the direction or position of the previous image. When straight lines or intersections are again detected, the direction and position of the robot are corrected with the detected straight-line or intersection information, so that errors do not accumulate.

For this purpose, a camera is installed at the center of the mobile robot, facing the ceiling. An angle set is prepared, referenced to the origin of the robot work site, for the repeated linear components of the grid-like ceiling pattern, and a location set is prepared for all intersections of the pattern.
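As an illustration of this preparation step, the following minimal Python sketch builds the two reference sets; the helper name, the grid spacings d_x and d_y, and the grid extent are assumptions for illustration, not values taken from the patent.

```python
import numpy as np

def prepare_reference_sets(d_x=0.6, d_y=0.6, n_x=20, n_y=20):
    # Grid lines parallel to the y-axis are assigned direction 0, so the four
    # principal directions of the ceiling grid are 0, pi/2, pi and -pi/2.
    angle_set = np.array([0.0, np.pi / 2, np.pi, -np.pi / 2])

    # Location set: every intersection of the ceiling grid, expressed in
    # workplace coordinates measured from the reference origin.
    intersection_set = np.array([(i * d_x, j * d_y)
                                 for i in range(n_x)
                                 for j in range(n_y)])
    return angle_set, intersection_set
```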

Direction recognition of robot

As shown in FIG. 3, the direction of the moving robot is obtained as follows: the angle between a detected straight line and the current direction of the robot is calculated and added to each of the reference angles of the workplace straight lines prepared in advance; among the resulting candidates, the one closest to the robot direction in the previous image is taken as the current robot angle.

There are two families of mutually intersecting parallel straight lines in the ceiling image of the grid pattern. As shown in FIG. 4 (a), the direction of a straight line parallel to the x-axis, with respect to the origin of the actual space, is defined as −π/2, and the direction of a straight line parallel to the y-axis is defined as 0. Likewise, the direction parallel to the −x axis is π/2, and the direction parallel to the −y axis is π.

In the spatial coordinate system shown in Fig. 4 (b), the direction of each straight line is defined as above. In the image coordinate system of the robot and camera, the upward direction of the image, y', is taken as the front direction of the robot, and the angle between the y-axis and y' is the robot heading θ. When one straight line is obtained and the angle between that line and the front direction of the robot is φ, the direction of the actual robot is obtained by combining φ with the reference direction of the detected line.

In other words, the straight line closest to the vertical line connecting the top and bottom of the image is found first, and the angle between this vertical line and the straight line is φ. However, the straight line obtained from the image may correspond to the workplace line oriented at 0, π/2, π, or −π/2 with respect to the origin, and it must be determined which of these it is.

Assuming that the rotation speed of the robot is sufficiently low, i.e., that the change in the robot's direction between frames is very small, the current direction will be close to the direction obtained from the most recent captured image. Denoting that previous direction by θ_prev, the current direction is chosen, among the candidate directions obtained from φ and the four reference directions, as the one closest to θ_prev.

Accordingly, if θ denotes the current direction,

Equation 1

θ = argmin over the candidates θ_c of |θ_c − θ_prev|, where θ_c ranges over the angles obtained by adding φ to each of the reference directions {0, π/2, π, −π/2}.

However, the initial direction does not need to be calculated very accurately; if only a rough direction is known, it suffices to determine which candidate angle is closest to it. Consequently, this error does not accumulate, which is a great advantage. The next section calculates the range of allowed error.
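The candidate-selection rule just described can be illustrated with a short Python sketch; the function names and the sign convention used to combine the measured line angle with the reference directions are assumptions.

```python
import numpy as np

def wrap_angle(a):
    # Wrap an angle into [-pi, pi) so angular differences compare correctly.
    return (a + np.pi) % (2 * np.pi) - np.pi

def update_direction(phi, theta_prev, angle_set):
    # phi: angle between the detected straight line and the robot's front
    # direction (the vertical of the image). Adding phi to each reference
    # grid direction yields the candidate headings; the candidate closest
    # to the previous heading theta_prev is taken as the current heading.
    candidates = [wrap_angle(a + phi) for a in angle_set]
    return min(candidates, key=lambda c: abs(wrap_angle(c - theta_prev)))
```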

 Robot position recognition

To calculate the position of the robot, as shown in FIG. 5, the position coordinates of the intersections around the robot's previous position point are taken from the set of workplace intersections. The most distinct intersection in the current image is selected, the position of the image center is expressed as a relative vector with respect to that intersection, and the vector is rotated and converted into coordinates of the reference origin. The candidate position closest to the robot position in the previous image is then taken as the current position.

As shown in Fig. 4, the target ceiling pattern contains many intersections formed by the crossing of two straight lines. Let d be the distance between two adjacent intersections of the ceiling pattern. Once a coordinate axis is defined, the positions of all these intersections can therefore be calculated.

If the distance from this intersection to the center of the image is calculated, the current position of the robot can also be calculated.

If the actual position of the intersection recognized in the previous image frame is (x_p, y_p), the actual position of the intersection seen in the current image frame is one of the following points:

Equation 2

(x_p + i·d_x, y_p + j·d_y), with i, j ∈ {−1, 0, 1},

where d_x and d_y are the intersection spacings along the x and y axes.
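A minimal sketch of Equation (2), generating the nine candidate intersection positions; the helper name and argument layout are assumptions.

```python
def intersection_candidates(prev_intersection, d_x, d_y):
    # Equation (2): the intersection recognized in the current frame is one of
    # the nine grid points at, or adjacent to, the previously recognized one.
    x_p, y_p = prev_intersection
    return [(x_p + i * d_x, y_p + j * d_y)
            for i in (-1, 0, 1)
            for j in (-1, 0, 1)]
```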

The position of the robot is then obtained by calculating the position of the intersection as described above, referring to FIG. 6. The position of the robot coincides with the center point of the image, denoted C; let the detected intersection be A and let B be an adjacent intersection. Denoting the current robot direction by θ, the vector from A to C measured in the image is rotated by θ so that it is expressed in the workplace coordinate system. Using the magnitude ratio between the image plane and the ceiling plane, the position of point C can then be calculated as in Equation (3).

Equation 3

C = A + s·R(θ)·(c − a), where (c − a) is the pixel vector from the intersection to the image center, R(θ) is the rotation by the robot direction, and s is the actual distance corresponding to one pixel.

Since, by Equation (2), point A has nine candidate positions, point C also has nine candidate positions. This conversion procedure is described as the conversion to the actual-distance relative position vector in FIG. 7.
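The conversion from the pixel vector to the actual-distance relative position vector (the procedure of FIG. 7 and claim 2) might be sketched as follows; the rotation sign and the scale factor meters_per_pixel are assumptions chosen for illustration.

```python
import numpy as np

def pixel_vector_to_world(vec_px, theta, meters_per_pixel):
    # vec_px: vector from the selected intersection to the image centre, in pixels.
    # Rotating by the current robot heading theta expresses the vector in the
    # workplace frame; scaling by meters_per_pixel converts it to actual distance.
    c, s = np.cos(theta), np.sin(theta)
    rotation = np.array([[c, -s], [s, c]])
    return meters_per_pixel * (rotation @ np.asarray(vec_px, dtype=float))

def robot_position_from_intersection(intersection_world, vec_px, theta, meters_per_pixel):
    # Equation (3), as reconstructed above: the robot (image centre) position is
    # the intersection's workplace position plus the rotated, scaled vector.
    return np.asarray(intersection_world, dtype=float) + \
        pixel_vector_to_world(vec_px, theta, meters_per_pixel)
```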

However, since the speed of the robot is not high, the current position point C lies close to the previous position point of the robot. The candidate positions of C are therefore, from Equations (2) and (3),

Equation 4

C_{i,j} = A_{i,j} + s·R(θ)·(c − a), where A_{i,j} = (x_p + i·d_x, y_p + j·d_y) and i, j ∈ {−1, 0, 1},

and the current robot position is the candidate minimizing

Equation 5

‖C_{i,j} − C_prev‖, where C_prev is the robot position in the previous frame.

Here i and j are the values in {−1, 0, 1} at which expression (5) attains its minimum. With these values of i and j, the position of the intersection is also obtained from Equation (2).

In conclusion, given the intersection position recognized in the previous frame and the robot position point in the previous frame, the current position of the robot and the position of the intersection appearing in the current image can be calculated using Equations (5) and (2). Therefore, given the initial position of the robot, the position of the robot can be calculated incrementally using Equation (2), and because Equation (5) always selects the nearest grid candidate, errors in the position points do not accumulate.
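The nearest-candidate selection of Equations (4) and (5) could be implemented along these lines; the function name and the explicit loop over (i, j) are illustrative assumptions.

```python
import numpy as np

def update_position(prev_robot_pos, prev_intersection, rel_vec_world, d_x, d_y):
    # rel_vec_world: actual-distance vector from the detected intersection to the
    # robot centre (see the conversion sketch above).
    # Equations (4)-(5): among the nine candidate intersection positions given by
    # Equation (2), choose the (i, j) in {-1, 0, 1} that places the robot closest
    # to its position in the previous frame.
    prev_robot_pos = np.asarray(prev_robot_pos, dtype=float)
    prev_intersection = np.asarray(prev_intersection, dtype=float)
    rel_vec_world = np.asarray(rel_vec_world, dtype=float)
    best = None
    for i in (-1, 0, 1):
        for j in (-1, 0, 1):
            intersection = prev_intersection + np.array([i * d_x, j * d_y])
            robot = intersection + rel_vec_world
            dist = np.linalg.norm(robot - prev_robot_pos)
            if best is None or dist < best[0]:
                best = (dist, robot, intersection)
    _, robot_pos, intersection_pos = best
    return robot_pos, intersection_pos
```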

Complementing robot direction and position using vector of optical flow

For images in which no straight line or intersection is detected, the optical flow between pixels is calculated, as shown in FIG. 8 and FIG. 9. If images without a straight line or intersection continue, the sum vector of the motion vectors is calculated and used to predict the direction and position of the robot. When the robot again obtains an image in which a straight line or an intersection is detected, the predicted direction and position of the robot are corrected.

When the robot moves, the optical flow, i.e., the motion vectors of pixels between adjacent image frames, points in the direction opposite to the robot's motion. Since the motion vectors of the many pixels in an image may differ slightly, a representative motion vector between image frames is used.

Let v_{m,n} denote the representative motion vector between frames m and n. When the optical flows of k image frames starting from the m-th frame are accumulated, the accumulated optical flow V_{m,m+k} is

Equation 6

V_{m,m+k} = v_{m,m+1} + v_{m+1,m+2} + ... + v_{m+k−1,m+k}.

Therefore, if the direction angle of the robot at time point m, with respect to the origin, is θ_m, then the angle after k frames, θ_{m+k}, is

Equation 7

θ_{m+k} = θ_m − Θ_{m,m+k},

where Θ_{m,m+k} is the rotation accumulated from the optical flow over the k frames, as shown in FIG. 8.

If an image containing a straight line is obtained, the angle of the robot relative to that straight line is computed and added to each of the reference angles in the direction-angle candidate set; among these candidates, the one that differs least from the robot direction predicted from the accumulated motion vectors is taken as the current robot direction.

If there is no intersection in the image, the sum of the motion vectors is obtained by accumulating the optical flow vectors as in Equation (6), as shown in FIG. 9, and its negative is added to the robot position obtained at the last frame in which an intersection was detected, giving the predicted robot position.

If one or more intersection points are detected in the image, a candidate set for the actual robot position is obtained as in the robot position recognition method above, and the position point in that candidate set closest to the predicted robot position is taken as the corrected current robot position.
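The prediction-and-correction step based on accumulated optical flow might look like the following sketch; the representative flow vectors, the scale factor, and the helper names are assumptions.

```python
import numpy as np

def predict_position(last_known_pos, flow_vectors, meters_per_pixel):
    # Equation (6): sum the representative per-frame optical-flow vectors observed
    # since the last frame in which an intersection was detected.
    accumulated = np.sum(np.asarray(flow_vectors, dtype=float), axis=0)
    # The flow moves opposite to the robot, so the negative of the accumulated
    # vector, scaled to actual distance, is added to the last known position.
    return np.asarray(last_known_pos, dtype=float) - meters_per_pixel * accumulated

def correct_position(predicted_pos, candidate_positions):
    # Once an intersection is detected again, snap the prediction to the closest
    # member of the candidate set for the actual robot position.
    candidates = np.asarray(candidate_positions, dtype=float)
    diffs = candidates - np.asarray(predicted_pos, dtype=float)
    idx = int(np.argmin(np.linalg.norm(diffs, axis=1)))
    return candidates[idx]
```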

The present invention is a technique for recognizing the position and direction of a robot using only a ceiling with the grid pattern shown in FIG. 1. Given the initial direction and position, the direction and position of the robot are calculated incrementally as it moves, and the small errors that occur are corrected each time an intersection of the ceiling pattern is found, so movement can continue without error accumulation. In particular, since the present invention uses the existing pattern of the ceiling as it is, an accurate position can be measured without additional devices such as the landmarks or encoders that are essential in existing products. Since most offices have a lattice ceiling pattern to which the present invention can be applied, the invention is well suited to office robots.

To verify the developed method of recognizing position and orientation from the ceiling pattern, it was implemented on an indoor mobile robot performing an errand function. The tasks the robot should perform are:

 a. At a certain location in an office where several people work, coffee is made and a person is randomly designated to receive it;

 b. The robot delivers the coffee to the designated person while avoiding obstacles; and

 c. Once delivery is complete, the robot returns to the original coffee table and waits for another delivery order.

For this purpose, a small indoor robot as shown in Fig. 10 was installed with a camera toward the ceiling and recognized the grid pattern of the office ceiling as shown in Fig. 4 (a).

Figure 11 shows the layout of the office and the people performing the above scenario. At the top of the figure is the table where coffee is made, and desk 1, the destination to which coffee is delivered, is to the left. To carry out the task, the robot loads coffee at the coffee table, travels along the marked path to the destination desk 1, rings a bell to announce that the coffee has arrived, and, once the person sitting at desk 1 takes the coffee, returns along the path to the coffee table.

Fig. 12 shows, in order, the robot carrying out these tasks. As shown in FIG. 12 (a), the robot receives coffee at the coffee table; in FIGS. 12 (b) and (c) it turns along the curved section of the path; in FIGS. 12 (o) and (p) it travels back; and, after completing the delivery mission, it returns to the coffee table as shown in FIG. 12 (r).

As a result of implementing the present invention on the office coffee-delivery robot described above, the robot always reached its destination precisely along the correct path and accomplished the errand mission reliably. The position error was less than 2 cm and the direction error was less than 2 degrees.

FIG. 1 is a view illustrating ceilings having repeated grid patterns;

FIG. 2 is a main flow chart of the present invention for determining the direction and position of a robot using a repeated ceiling pattern;

FIG. 3 is a flowchart of a routine for determining the direction of the robot using the detected straight line;

FIG. 4 (a) is a diagram illustrating an origin-based coordinate and a robot coordinate for a straight line and an intersection point in a typical ceiling pattern;

FIG. 4 (b) shows the origin-based coordinate system (x-y) and the robot reference coordinate system (x'-y') superimposed;

FIG. 5 is a flowchart of a routine for calculating the position of the robot using an intersection of straight lines in the image;

FIG. 6 is an auxiliary figure used to calculate the actual position of the robot in the routine of FIG. 5;

FIG. 7 is a flowchart of the routine for converting to the actual-distance relative position vector described in claim 2;

FIG. 8 is a flowchart showing, for images in which no straight line is found, the method of predicting the robot direction using the pixel motion vectors (optical flow) between image frames or their accumulated vector, and the method of correcting the predicted robot direction once a straight line is found;

FIG. 9 is a flowchart showing, for images in which no intersection of straight lines is found, the method of predicting the robot position using the pixel motion vectors between image frames or their accumulated vector, and the method of correcting the predicted robot position using an intersection once one is found;

FIG. 10 is a view of the mobile robot used to apply the present invention to coffee delivery in an office;

FIG. 11 is a view of the indoor office in which the coffee errand function is performed;

FIG. 12 shows scenes in which the robot performs the coffee-delivery errand mission. Here, (a) is the scene where coffee is loaded at the coffee table, (j), (k), and (l) are scenes where coffee is delivered to the destination, and (r) is the scene where the robot returns to the coffee table.

Claims (4)

  1. A method comprising the steps of: installing a camera at the center of a mobile robot, facing the ceiling, to obtain continuous digital images of the ceiling;
    calculating, for the repeated linear components of the repeating ceiling pattern of the robot workplace, the angle of each component with respect to the y-axis at the reference origin of the workplace, and preparing this as an angle set;
    preparing and initializing an intersection set composed of position information on all intersections in the ceiling pattern;
    to calculate the direction during autonomous movement of the robot, detecting a straight line in the image taken during movement, obtaining from the image the angle between the straight line and the current direction of the robot, and adding this angle to each individual angle in the angle set to form a direction-angle candidate set of the robot;
    selecting, from the angles in the direction-angle candidate set, the angle closest to the robot direction angle obtained from the previous image as the current robot angle;
    to calculate the position during autonomous movement of the robot, taking from the intersection set prepared at initialization only the position coordinates of the intersections around the robot's previous image position, as a reference robot position candidate set;
    selecting the most distinct intersection point in the ceiling image taken at the current position, expressing the pixel coordinates of the intersection point as relative coordinates from the center of the image, and converting them into the converted actual-distance relative position vector described in claim 2;
    adding the converted actual-distance relative position vector to the reference robot position candidate set, to form a candidate set for the actual position of the robot; and
    determining, among the elements of the candidate set for the actual robot position, the position point closest to the robot position obtained from the previous image as the current robot position point:
    Method of calculating direction and position of robot using ceiling pattern.
  2. The converted actual-distance relative position vector according to claim 1, obtained by:
    obtaining the relative vector from the image center to the selected intersection;
    rotating the vector about the image center, in the sense opposite to the robot direction obtained in claim 1, so that it is expressed with respect to the workplace axes;
    multiplying the rotated intersection vector by the actual distance corresponding to one pixel interval, so that the position of the intersection is expressed as an actual-distance relative vector from the center of the robot; and
    multiplying this actual-distance relative vector by a negative sign, so that the center of the robot is expressed as an actual-distance relative vector from the intersection.
  3. In the method of finding the direction of the robot according to claim 1, when the direction of the robot cannot be calculated because no linear component is found in the image, accumulating the optical flow vectors until a linear component is found and adding the negative of the accumulated vector to the previous robot direction to predict the robot direction;
    when an image containing a straight line is obtained, obtaining the angle of the robot from the straight line, forming a direction-angle candidate set as in the direction-finding method of claim 1, and determining as the current robot direction the candidate whose difference from the direction given by the cumulative sum of the motion vectors is smallest:
    Direction correction method of a robot using optical flow.
  4. The method according to claim 1 or claim 2, wherein, when there is no intersection point in the image, the optical flow vectors are accumulated until the next image in which an intersection is found, and the negative of the sum of the motion vectors is added to the robot position to obtain a predicted robot position;
    when an intersection point is detected in the image, a candidate set for the actual robot position is obtained in the same way as in the robot position calculation method of claim 1 or claim 2, with the range of the reference robot position candidate set enlarged by a factor of two to three;
    the position point in the candidate set for the actual robot position closest to the predicted robot position is taken as the corrected current robot position:
    Robot position calculation and correction method using optical flow.
KR1020090067983A 2009-07-24 2009-07-24 Computation of Robot Location and Orientation using Repeating Feature of Ceiling Textures and Optical Flow Vectors KR101153221B1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
KR1020090067983A KR101153221B1 (en) 2009-07-24 2009-07-24 Computation of Robot Location and Orientation using Repeating Feature of Ceiling Textures and Optical Flow Vectors

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
KR1020090067983A KR101153221B1 (en) 2009-07-24 2009-07-24 Computation of Robot Location and Orientation using Repeating Feature of Ceiling Textures and Optical Flow Vectors

Publications (2)

Publication Number Publication Date
KR20110010422A true KR20110010422A (en) 2011-02-01
KR101153221B1 KR101153221B1 (en) 2012-06-05

Family

ID=43770964

Family Applications (1)

Application Number Title Priority Date Filing Date
KR1020090067983A KR101153221B1 (en) 2009-07-24 2009-07-24 Computation of Robot Location and Orientation using Repeating Feature of Ceiling Textures and Optical Flow Vectors

Country Status (1)

Country Link
KR (1) KR101153221B1 (en)


Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR100564236B1 (en) 2001-09-26 2006-03-29 현대중공업 주식회사 Self-localization apparatus and method of mobile robot

Cited By (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR101122303B1 (en) * 2009-07-24 2012-03-21 삼성중공업 주식회사 Calibration unit and working system having the same
WO2013026872A1 (en) 2011-08-24 2013-02-28 Commissariat à l'énergie atomique et aux énergies alternatives Method for locating an object using a reference grid
FR2979428A1 (en) * 2011-08-24 2013-03-01 Commissariat Energie Atomique Method of locating an object using a reference grid
US9317774B2 (en) 2011-08-24 2016-04-19 Commissariat à l'énergie atomique et aux énergies alternatives Method for locating an object using a reference grid
US20150158176A1 (en) * 2012-08-02 2015-06-11 Fuji Machine Mfg. Co., Ltd. Work machine provided with articulated robot and electric component mounting machine
US10099365B2 (en) * 2012-08-02 2018-10-16 Fuji Corporation Work machine provided with articulated robot and electric component mounting machine
CN106020190A (en) * 2016-05-26 2016-10-12 山东大学 Track learning controller, control system and method with initial state error correction
WO2019237434A1 (en) * 2018-06-14 2019-12-19 深圳市沃特沃德股份有限公司 Sweeping robot position calibration method and system

Also Published As

Publication number Publication date
KR101153221B1 (en) 2012-06-05

Similar Documents

Publication Publication Date Title
Zhang et al. Visual-lidar odometry and mapping: Low-drift, robust, and fast
CN104536445B (en) Mobile navigation method and system
US9242171B2 (en) Real-time camera tracking using depth maps
US9927222B2 (en) Position/orientation measurement apparatus, measurement processing method thereof, and non-transitory computer-readable storage medium
JP5881743B2 (en) Self-position estimation of mobile camera using depth map
US9134127B2 (en) Determining tilt angle and tilt direction using image processing
Park et al. Three-dimensional tracking of construction resources using an on-site camera system
CN104848858B (en) Quick Response Code and be used for robotic vision-inertia combined navigation system and method
US8705893B1 (en) Apparatus and method for creating floor plans
CN104811683B (en) Method and apparatus for estimated location
Alcantarilla et al. On combining visual SLAM and dense scene flow to increase the robustness of localization and mapping in dynamic environments
CN104204721B (en) Single camera distance estimations
Kitt et al. Monocular visual odometry using a planar road model to solve scale ambiguity
Ohya et al. Vision-based navigation by a mobile robot with obstacle avoidance using single-camera vision and ultrasonic sensing
Veľas et al. Calibration of rgb camera with velodyne lidar
Xu et al. Ceiling-based visual positioning for an indoor mobile robot with monocular vision
Robertson et al. An Image-Based System for Urban Navigation.
KR101003168B1 (en) Multidimensional Evidence Grids and System and Methods for Applying Same
Ellenberg et al. Use of unmanned aerial vehicle for quantitative infrastructure evaluation
Mautz et al. Survey of optical indoor positioning systems
JP2013186816A (en) Moving image processor, moving image processing method and program for moving image processing
JP6044005B2 (en) Method for camera localization and 3D reconstruction in a partially known environment
CN103941748A (en) Autonomous navigation method and system and map modeling method and system
Wu et al. Recovery of the 3-d location and motion of a rigid object through camera image (an Extended Kalman Filter approach)
Rekleitis et al. Simultaneous planning, localization, and mapping in a camera sensor network

Legal Events

Date Code Title Description
A201 Request for examination
E902 Notification of reason for refusal
E90F Notification of reason for final refusal
E701 Decision to grant or registration of patent right
GRNT Written decision to grant
FPAY Annual fee payment

Payment date: 20150511

Year of fee payment: 4

FPAY Annual fee payment

Payment date: 20160517

Year of fee payment: 5

FPAY Annual fee payment

Payment date: 20170515

Year of fee payment: 6

FPAY Annual fee payment

Payment date: 20180418

Year of fee payment: 7

FPAY Annual fee payment

Payment date: 20190422

Year of fee payment: 8