KR101725685B1 - Method and apparatus for detecting localization of mobile robot by ceiling outline detection


Info

Publication number
KR101725685B1
Authority
KR
South Korea
Prior art keywords
ceiling
image
mobile robot
area
origin
Prior art date
Application number
KR1020150170036A
Other languages
Korean (ko)
Inventor
박태형
김영규
Original Assignee
충북대학교 산학협력단
Priority date
Filing date
Publication date
Application filed by 충북대학교 산학협력단 filed Critical 충북대학교 산학협력단
Priority to KR1020150170036A
Application granted
Publication of KR101725685B1

Classifications

    • B PERFORMING OPERATIONS; TRANSPORTING
    • B25 HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25J MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J9/00 Programme-controlled manipulators
    • B25J9/16 Programme controls
    • B25J9/1656 Programme controls characterised by programming, planning systems for manipulators
    • B25J9/1664 Programme controls characterised by motion, path, trajectory planning
    • B25J9/1671 Programme controls characterised by simulation, either to verify existing program or to create and verify new program, CAD/CAM oriented, graphic oriented programming systems
    • B25J9/1679 Programme controls characterised by the tasks executed
    • B25J9/1692 Calibration of manipulator
    • B25J19/00 Accessories fitted to manipulators, e.g. for monitoring, for viewing; safety devices combined with or specially adapted for use in connection with manipulators
    • B25J19/02 Sensing devices
    • B25J19/021 Optical sensing devices
    • B25J19/023 Optical sensing devices including video camera means

Landscapes

  • Engineering & Computer Science (AREA)
  • Robotics (AREA)
  • Mechanical Engineering (AREA)
  • Multimedia (AREA)
  • Manipulator (AREA)
  • Image Analysis (AREA)
  • Control Of Position, Course, Altitude, Or Attitude Of Moving Bodies (AREA)

Abstract

The present invention relates to an apparatus for recognizing the position of a mobile robot, comprising: an image acquisition unit that acquires an image of the ceiling through a camera mounted on the mobile robot perpendicular to the ground; a ceiling extraction unit that preprocesses the image received from the image acquisition unit, detects the outline of the ceiling in the image, identifies ceiling candidate areas based on that outline, and extracts the ceiling area using an optical flow field that estimates image motion between the origin image and the current image, captured at a fixed time interval; and a position recognition unit that recognizes the position of the mobile robot by comparing the center coordinates and direction angle of the ceiling area in the origin image with the center coordinates and direction angle of the ceiling area in the current image. According to the present invention, position recognition is achieved without separate markers in an ordinary indoor environment, and its accuracy is higher than that of existing methods.

Description

Field of the Invention

[0001] The present invention relates to a method and apparatus for locating a mobile robot by detecting a ceiling outline.

More particularly, the present invention relates to a method and apparatus for recognizing the position of a robot capable of autonomous movement (hereinafter referred to as a "mobile robot") by detecting the ceiling outline in images from a camera fitted with a fisheye lens.

With the development of control and sensor technology, robots are used in a variety of fields such as precision control, medicine, services, and personal assistance. Among them, mobile robots, by virtue of their mobility, are being developed and used in fields that are difficult for humans to carry out directly, such as exploration, transportation of goods, and defense and disaster response.

For a mobile robot, position recognition is the most basic and essential technology: the robot's actual position is recognized by analyzing information about its external environment gathered through sensors such as ultrasonic sensors, lasers, RFID, and cameras. It is indispensable for precisely controlling the mobile robot.

Owing to the importance of position recognition, many studies have been carried out to date. Studies using ceiling images are particularly numerous, because ceiling images undergo little change in scale and contain less noise than images taken from other viewpoints.

However, conventional camera-based techniques require additional attachments and make accurate position recognition difficult. For example, position recognition methods that combine a camera with artificial landmarks incur extra cost because a separate installation is required, and recognition is impossible in environments where the landmarks are not attached.

As described above, various methods have been used to assist position recognition, but they either require artificial markers or suffer from low accuracy, which makes them inconvenient to use.

Korean Patent Publication No. 10-2010-0098999 (published September 10, 2010)
Korean Patent Publication No. 10-2006-0015163 (published February 16, 2006)

SUMMARY OF THE INVENTION The present invention has been made to solve the above problems, and its object is to provide an apparatus and method for recognizing the position of a mobile robot using the ceiling outline, which serves as a natural marker.

The objects of the present invention are not limited to the above-mentioned objects, and other objects not mentioned can be clearly understood by those skilled in the art from the following description.

In order to achieve the above object, there is provided an apparatus for recognizing the position of a mobile robot, comprising: an image acquisition unit for acquiring an image of the ceiling through a camera installed on the mobile robot perpendicular to the ground; a ceiling extraction unit for preprocessing the input image, recognizing the outline of the ceiling in the image, identifying ceiling candidate areas based on the ceiling outline, and extracting the ceiling area using an optical flow field that estimates the degree of image motion between the origin image and the current image captured at a predetermined time interval; and a position recognition unit for recognizing the position of the mobile robot by comparing the origin center coordinates and origin direction angle of the ceiling area in the origin image with the center coordinates and direction angle of the ceiling area in the current image.

The image acquiring unit may acquire a ceiling image through a fish-eye lens and a camera installed on the mobile robot perpendicularly to the ground.

In the present invention, the camera's intrinsic parameters, including the focal length and principal point, can be determined, and the distortion of the image can be corrected using these parameters.

The ceiling extraction unit can extract as the ceiling the candidate area whose average optical-flow value is smallest among the areas larger than a predetermined size.

The ceiling extraction unit calculates the average optical-flow value for each ceiling candidate area, extracts as the ceiling the area whose average is smallest among the areas larger than a predetermined size, and can then calculate, for that area, the origin center coordinates (x_o, y_o) and the origin direction angle θ_o using image moments.

When the current image is input, the position recognition unit extracts the ceiling outline, extracts the ceiling area using the optical flow field, calculates the center coordinates (x_c, y_c) and direction angle θ_c of the area using image moments, and can recognize the position of the mobile robot by comparing the origin center coordinates (x_o, y_o) and origin direction angle θ_o calculated from the origin image with the center coordinates (x_c, y_c) and direction angle θ_c calculated from the current image.

A method for recognizing the position of a mobile robot in a position recognition apparatus according to the present invention comprises: acquiring an image of the ceiling through a camera mounted on the mobile robot perpendicular to the ground; correcting the distortion caused by the lens in the acquired image; extracting the ceiling outline, which is the boundary line at the edge of the ceiling, from the corrected image; extracting ceiling candidate areas based on the extracted ceiling outline, calculating an optical flow field that estimates the degree of image motion between the origin image and the current image captured at a predetermined time interval, and extracting the ceiling area using the average optical-flow value of each candidate area; and recognizing the position of the mobile robot by comparing the origin center coordinates and origin direction angle of the ceiling area in the origin image with the center coordinates and direction angle of the ceiling area in the current image.

The lens may be a fish-eye lens.

In the step of correcting the distortion, the camera's intrinsic parameters, including the focal length and principal point, may be determined, and the distortion of the image may be corrected using these parameters.

In the step of extracting the ceiling area, the candidate area whose average optical-flow value is smallest among the areas larger than a predetermined size can be extracted as the ceiling area.

The step of extracting the ceiling area may comprise: calculating the average optical-flow value for each ceiling candidate area; extracting as the ceiling the area whose average is smallest among the areas larger than a predetermined size; and calculating, for that area, the origin center coordinates (x_o, y_o) and the origin direction angle θ_o using image moments.

The step of recognizing the position of the mobile robot may comprise: when the current image is input, extracting the ceiling outline and extracting the ceiling area using the optical flow field; calculating the center coordinates (x_c, y_c) and direction angle θ_c of the area using image moments; and recognizing the position of the mobile robot by comparing the origin center coordinates (x_o, y_o) and origin direction angle θ_o calculated from the origin image with the center coordinates (x_c, y_c) and direction angle θ_c calculated from the current image.

According to the position recognition system using ceiling images of the present invention, the position can be recognized without any additional markers in an ordinary indoor environment, and the recognition is more accurate than existing methods. The labor and cost of installing artificial markers can thus be avoided.

FIG. 1 is an explanatory view of a ceiling outline according to an embodiment of the present invention.
FIG. 2 is an explanatory diagram of the fisheye distortion effect according to an embodiment of the present invention.
FIG. 3 is an explanatory diagram of an optical flow field according to an embodiment of the present invention.
FIG. 4 is a flowchart of the entire system according to an embodiment of the present invention.
FIG. 5 is a detailed flowchart of camera distortion correction according to an exemplary embodiment of the present invention.
FIG. 6 is a detailed flowchart of extracting ceiling candidate areas by outline extraction according to an embodiment of the present invention.
FIG. 7 is a view showing extraction of a ceiling outline according to the embodiment of FIG. 6.
FIG. 8 is a detailed flowchart of ceiling extraction using an optical flow field according to an embodiment of the present invention.
FIG. 9 is an exemplary view explaining the ceiling extraction according to the embodiment of FIG. 8.
FIG. 10 is a detailed flowchart of position recognition using the current image according to an embodiment of the present invention.
FIG. 11 is a diagram illustrating an origin image and a current image explaining the position recognition process according to the embodiment of FIG. 10.
FIG. 12 is a block diagram showing the internal configuration of a mobile robot position recognition apparatus according to an embodiment of the present invention.

While the invention is susceptible to various modifications and alternative forms, specific embodiments thereof are shown by way of example in the drawings and will herein be described in detail. It is to be understood, however, that the invention is not to be limited to the specific embodiments, but includes all modifications, equivalents, and alternatives falling within the spirit and scope of the invention.

The terminology used in this application is used only to describe a specific embodiment and is not intended to limit the invention. The singular expressions include plural expressions unless the context clearly dictates otherwise. In the present application, the terms "comprises" or "having" and the like are used to specify that there is a feature, a number, a step, an operation, an element, a component or a combination thereof described in the specification, But do not preclude the presence or addition of one or more other features, integers, steps, operations, elements, components, or combinations thereof.

Unless defined otherwise, all terms used herein, including technical and scientific terms, have the same meaning as commonly understood by one of ordinary skill in the art to which this invention belongs. Terms such as those defined in commonly used dictionaries are to be interpreted as having a meaning consistent with their meaning in the context of the related art, and are not to be interpreted in an idealized or overly formal sense unless expressly so defined in the present application.

In the following description of the present invention with reference to the accompanying drawings, the same components are denoted by the same reference numerals, and redundant explanations are omitted. Hereinafter, exemplary embodiments of the present invention will be described in detail with reference to the accompanying drawings. Well-known functions or constructions are not described in detail where they would obscure the invention in unnecessary detail.

FIG. 1 is an explanatory view of a ceiling outline according to an embodiment of the present invention.

In the present invention, the ceiling outline refers to the line where a wall surface and the ceiling meet.

Referring to FIG. 1, the red circle marks a line segment where a wall surface and the ceiling meet; this line is referred to as the ceiling outline 11.

In the present invention, the image is segmented into candidate areas through ceiling outline extraction, and the ceiling area is selected from among them using the optical flow field. The center point 12 is determined for the ceiling area found in the origin image, and the origin center coordinates (x_o, y_o) (14) and origin direction angle θ_o (13) are compared with the center coordinates (x_c, y_c) (16) and direction angle θ_c (15) of the current image to recognize the position of the mobile robot.

FIG. 2 is an explanatory diagram of the fisheye distortion effect according to an embodiment of the present invention.

Referring to FIG. 2, the ceiling outline in the distorted portion 21 is curved into a circular shape, unlike the ceiling outline in the distortion-corrected portion 22. This is caused by the distortion of the fisheye lens used in the present invention. A fisheye lens has a viewing angle of 170 degrees or more, that is, a wide viewing angle and a short focal length, so it can capture a wider space than an ordinary lens in a single image. A fisheye lens is therefore indispensable for capturing the entire ceiling. Because a concave lens element is used inside the lens, a fisheye lens produces circular distortion as in the distorted portion 21. To reduce this effect, the distortion is corrected using a fisheye lens distortion model, as in the corrected portion 22.
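To make the correction concrete, here is a minimal sketch, not taken from the patent: it assumes the commonly used equidistant fisheye model, in which a ray at incidence angle θ lands at radius r = f·θ, whereas a pinhole camera would place it at r = f·tan θ. Undistorting a point then amounts to rescaling its radius about the principal point (the focal length and principal point values below are illustrative).

```python
import math

def undistort_radius(r_fish: float, f: float) -> float:
    """Map a radial distance in an equidistant fisheye image (r = f*theta)
    to the radius the same ray would have under a pinhole model (r = f*tan(theta))."""
    theta = r_fish / f          # incidence angle recovered from the fisheye radius
    return f * math.tan(theta)  # pinhole radius for the same angle

def undistort_point(x, y, cx, cy, f):
    """Undistort a single pixel (x, y) about the principal point (cx, cy)."""
    dx, dy = x - cx, y - cy
    r = math.hypot(dx, dy)
    if r == 0:
        return (x, y)           # the optical axis is unaffected by distortion
    scale = undistort_radius(r, f) / r
    return (cx + dx * scale, cy + dy * scale)
```

A point on the optical axis stays fixed, while points farther from the centre are pushed outward, which is what straightens the circularly bent ceiling outlines of portion 21.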

FIG. 3 is an explanatory diagram of an optical flow field according to an embodiment of the present invention. In the present invention, the optical flow field refers to the pattern of motion information representing the movement of objects over a given time.

Referring to FIG. 3, the origin image 31 and the current image 32 are captured at a fixed time interval. In the optical flow field image 33, obtained by comparing the two images, one can observe optical flow vectors 34 showing the same object moving in a certain direction over time. The set of these vectors forms the optical flow field. In the present invention, the ceiling area is determined using this optical flow field.
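A flow field like the one in image 33 can be estimated in many ways; the patent does not name a specific algorithm, so the following is only an illustrative sketch using integer block matching between the origin and current images (block size and search range are assumptions):

```python
import numpy as np

def block_flow(img0, img1, block=8, search=4):
    """Coarse optical-flow field by block matching: for each block of img0,
    find the integer shift (within +/- search pixels) whose patch in img1
    has the smallest sum of squared differences.
    Returns an array of (dy, dx) displacements, one per block."""
    h, w = img0.shape
    flows = []
    for by in range(0, h - block + 1, block):
        row = []
        for bx in range(0, w - block + 1, block):
            ref = img0[by:by + block, bx:bx + block].astype(float)
            best, best_err = (0, 0), np.inf
            for dy in range(-search, search + 1):
                for dx in range(-search, search + 1):
                    y0, x0 = by + dy, bx + dx
                    if y0 < 0 or x0 < 0 or y0 + block > h or x0 + block > w:
                        continue  # candidate patch falls outside the image
                    cand = img1[y0:y0 + block, x0:x0 + block].astype(float)
                    err = np.sum((ref - cand) ** 2)
                    if err < best_err:
                        best_err, best = err, (dy, dx)
            row.append(best)
        flows.append(row)
    return np.array(flows)  # shape (blocks_y, blocks_x, 2)
```

In a real system a dense method (e.g. a pyramidal flow estimator) would be preferable; block matching is shown only because it makes the "same object moves by some displacement" idea of FIG. 3 explicit.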

FIG. 4 is a flowchart of the entire system according to an embodiment of the present invention.

Referring to FIG. 4, the fisheye-lens position recognition system consists of four steps.

The first step is camera distortion correction (S41). As shown in FIG. 2, fisheye distortion arises from the use of a fisheye lens, and a correction step is necessary. For this purpose, a fisheye lens model is applied to correct the distortion.

The second step is outline extraction (S42). As shown in FIG. 1, the ceiling outline is the boundary at the edge of the ceiling and serves as the reference separating the walls from the ceiling. Separating them clearly makes the ceiling area extraction accurate. The outline is detected by edge extraction, and the image is divided into ceiling candidate areas based on the detected outline.

The third step is ceiling extraction using the optical flow field (S43).

In the present invention, the optical flow field is calculated from the origin image and the image obtained after the robot moves a fixed distance from the origin.

Then, using the average optical-flow value of each candidate area, the area whose average is smallest among the areas larger than a certain size is extracted as the ceiling area. The position of the robot can then be recognized from the center coordinates of this area and the direction angle obtained using image moments.

The fourth step is position recognition using the current image (S44).

In this step, the current position of the mobile robot is recognized by comparing the center coordinates and direction angle of the ceiling area in the origin image, obtained through the ceiling area extraction of step 3, with the center coordinates and direction angle of the ceiling area in the current image.

The present invention recognizes the position of the mobile robot through the steps 1 to 4 described above.

FIG. 5 is a detailed flowchart of camera distortion correction according to an exemplary embodiment of the present invention.

Referring to FIG. 5, the camera distortion correction proceeds as follows.

First, correcting the camera distortion requires the camera's intrinsic parameters, such as the focal length and the principal point. To extract them, several chessboard images are captured and used to estimate the parameters.

Thus, chessboard images are first photographed (S51).

For estimating the parameters, the more chessboard images the better; in the embodiment of FIG. 5, more than 20 chessboard images are used (S52).

The camera's intrinsic parameters are then extracted from these images (S53).

Afterwards, whenever an image is input (S54), the distortion produced by the camera is corrected using these parameters (S55).

FIG. 6 is a detailed flowchart of extracting ceiling candidate areas by outline extraction according to an exemplary embodiment of the present invention, and FIG. 7 is a diagram illustrating extraction of a ceiling outline according to the embodiment of FIG. 6.

Referring to FIGS. 6 and 7, when an image is input (S61), the ceiling outline is extracted (S62), and ceiling candidate areas are extracted (S63).

Ceiling outlines are relatively dark compared with the surrounding walls, because the joint between wall and ceiling is narrower than the surfaces around it and receives less light. As shown in FIG. 7, the ceiling outline can be extracted using this characteristic, and the ceiling candidate areas are then extracted based on the extracted outline.
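Steps S62 and S63 could be sketched as follows. This is a simplified assumption, not the patent's exact procedure: the outline is taken to be pixels well below the mean brightness, and the candidate areas are the connected components of the remaining pixels (4-connectivity flood fill).

```python
import numpy as np
from collections import deque

def dark_outline_mask(gray, dark_ratio=0.6):
    """Pixels noticeably darker than the image mean are treated as candidate
    ceiling-outline pixels (the wall/ceiling joint receives less light)."""
    return gray < gray.mean() * dark_ratio

def candidate_regions(gray, dark_ratio=0.6):
    """Split the image into ceiling-candidate areas: connected components
    of the pixels NOT on an outline, via BFS flood fill (4-connectivity).
    Returns (label map with -1 on outline pixels, number of regions)."""
    outline = dark_outline_mask(gray, dark_ratio)
    h, w = gray.shape
    labels = np.full((h, w), -1, dtype=int)
    next_label = 0
    for sy in range(h):
        for sx in range(w):
            if outline[sy, sx] or labels[sy, sx] != -1:
                continue
            queue = deque([(sy, sx)])
            labels[sy, sx] = next_label
            while queue:
                y, x = queue.popleft()
                for ny, nx in ((y - 1, x), (y + 1, x), (y, x - 1), (y, x + 1)):
                    if (0 <= ny < h and 0 <= nx < w
                            and not outline[ny, nx] and labels[ny, nx] == -1):
                        labels[ny, nx] = next_label
                        queue.append((ny, nx))
            next_label += 1
    return labels, next_label
```

A dark line between two bright surfaces thus splits the image into two labelled areas, mirroring how the ceiling outline separates walls from ceiling.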

FIG. 8 is a detailed flowchart of ceiling extraction using an optical flow field according to an embodiment of the present invention, and FIG. 9 is an exemplary view explaining the ceiling extraction according to the embodiment of FIG. 8.

Referring to FIGS. 8 and 9, to extract the ceiling using an optical flow field, the flow field is first calculated from the origin image and the image captured after moving (S81).

At least two images are required to calculate the optical flow field. In the embodiment of FIGS. 8 and 9, the flow field is calculated from the origin image, captured at the origin, and the current image captured after the robot moves 10 cm from the origin (S81).

Then, the average optical-flow value is calculated for each of the separated ceiling candidate areas (S82).

Then, the area whose average optical-flow value is smallest and whose size exceeds a certain threshold is extracted as the ceiling area (S83). When no obstacles are present, the ceiling is usually the surface farthest from the camera. For objects of the same size, the image formed on the camera sensor becomes smaller as the distance from the camera increases, so a smaller optical-flow magnitude indicates a greater distance from the camera. Accordingly, the area with the smallest average optical-flow value can be regarded as the ceiling area.
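Step S83 reduces to a small selection rule. A sketch, under the assumption that a per-pixel flow-magnitude map and a candidate-area label map (as produced in the earlier steps) are already available; the minimum-area threshold is illustrative:

```python
import numpy as np

def pick_ceiling_region(flow_mag, labels, n_labels, min_area=20):
    """Among candidate areas, choose the one whose mean optical-flow
    magnitude is smallest while its pixel count exceeds min_area
    (the ceiling, being farthest from the camera, appears to move least)."""
    best_label, best_mean = None, np.inf
    for lab in range(n_labels):
        mask = labels == lab
        area = mask.sum()
        if area < min_area:
            continue  # too small to be the ceiling; likely noise or a fixture
        mean_flow = flow_mag[mask].mean()
        if mean_flow < best_mean:
            best_mean, best_label = mean_flow, lab
    return best_label
```

The area threshold matters: a tiny far-away patch could otherwise win on flow magnitude alone.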

Next, after extracting the ceiling area, the origin center coordinates (x_o, y_o) (14) and the origin direction angle θ_o (13) of the area are calculated using image moments (S84, S85).
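The centroid and direction angle from image moments (S84, S85) follow the standard formulas: centroid (m10/m00, m01/m00) and orientation θ = 0.5·atan2(2·μ11, μ20 - μ02) from the second central moments. A sketch for a binary ceiling mask (the function name is illustrative):

```python
import numpy as np

def region_pose(mask):
    """Center coordinates and direction angle of a binary region via image
    moments: the centroid is the mean pixel position, and the orientation
    comes from the second central moments mu20, mu02, mu11."""
    ys, xs = np.nonzero(mask)
    cx, cy = xs.mean(), ys.mean()                 # centroid (m10/m00, m01/m00)
    mu20 = ((xs - cx) ** 2).mean()
    mu02 = ((ys - cy) ** 2).mean()
    mu11 = ((xs - cx) * (ys - cy)).mean()
    theta = 0.5 * np.arctan2(2 * mu11, mu20 - mu02)  # principal-axis angle
    return (cx, cy), theta
```

For an axis-aligned horizontal bar the principal axis is horizontal, so θ comes out as 0.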

Then, the center coordinates are extracted from the origin image and from the image obtained after the robot moves a fixed distance, and the actual moving distance per pixel is calculated from the known travel distance and the change in the center coordinates. This scale is later used to determine the actual travel distance when the position is recognized from the current image.

In the present invention, the origin center coordinates (x_o, y_o) (14) and the origin direction angle θ_o (13) obtained using image moments are used for position recognition.

FIG. 10 is a detailed flowchart of position recognition using the current image according to an embodiment of the present invention, and FIG. 11 is an exemplary view of an origin image and a current image explaining the position recognition process according to the embodiment of FIG. 10.

Referring to FIGS. 10 and 11, when the current image is input (S101), the ceiling area is extracted from it using the extracted ceiling outline and the optical flow field (S102).

Then, using the actual moving distance per pixel, the origin center coordinates (x_o, y_o) (14) and origin direction angle θ_o (13) calculated from the origin image are compared with the center coordinates (x_c, y_c) (16) and direction angle θ_c (15) calculated from the current image, and the position of the mobile robot is recognized.
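The final comparison can be sketched as follows. The function names, and the sign convention in which the ceiling centroid appears to shift opposite to the robot's motion, are illustrative assumptions; the known calibration move of 10 cm from the embodiment supplies the cm-per-pixel scale:

```python
import math

def cm_per_pixel(moved_cm, c_origin, c_moved):
    """Real-world scale from a known move: the robot drives moved_cm
    (10 cm in the embodiment) and the ceiling centroid shifts accordingly."""
    shift = math.hypot(c_moved[0] - c_origin[0], c_moved[1] - c_origin[1])
    return moved_cm / shift

def robot_pose(c_origin, theta_origin, c_current, theta_current, scale_cm):
    """Robot position relative to the origin: centroid displacement in pixels
    times the cm/pixel scale, plus the change in direction angle.
    The ceiling appears to move opposite to the robot, hence the sign flip."""
    dx_cm = -(c_current[0] - c_origin[0]) * scale_cm
    dy_cm = -(c_current[1] - c_origin[1]) * scale_cm
    dtheta = theta_current - theta_origin
    return dx_cm, dy_cm, dtheta
```

For example, if a 10 cm calibration move shifts the centroid by 20 pixels, the scale is 0.5 cm/pixel, and a later 10-pixel centroid shift corresponds to a 5 cm robot displacement.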

FIG. 12 is a block diagram showing an internal configuration of a mobile robot position recognition apparatus according to an embodiment of the present invention.

Referring to FIG. 12, the mobile robot position recognition apparatus of the present invention includes an image acquisition unit 110, a ceiling extraction unit 120, and a position recognition unit 130.

The image acquisition unit 110 acquires a ceiling image through a camera installed on the mobile robot perpendicular to the ground.

The ceiling extraction unit 120 preprocesses the image received from the image acquisition unit 110, recognizes the outline of the ceiling in the image, identifies ceiling candidate areas based on the ceiling outline, and extracts the ceiling area using an optical flow field that estimates the degree of image motion between the origin image and the current image captured at a predetermined time interval.

The position recognition unit 130 recognizes the position of the mobile robot by comparing the origin center coordinates and origin direction angle of the ceiling area in the origin image with the center coordinates and direction angle of the ceiling area in the current image.

In an embodiment of the present invention, the image obtaining unit 110 may obtain a ceiling image through a fish-eye lens and a camera installed on a mobile robot perpendicular to the ground.

In the present invention, the mobile robot position recognition apparatus can determine the camera's intrinsic parameters, including the focal length and principal point, and correct the distortion of the image using these parameters.

The ceiling extraction unit 120 extracts as the ceiling the candidate area whose average optical-flow value is smallest among the areas of at least a predetermined size.

In detail, the ceiling extraction unit 120 calculates the average optical-flow value for each ceiling candidate area, extracts as the ceiling the area whose average is smallest among the areas larger than a predetermined size, and calculates, for that area, the origin center coordinates (x_o, y_o) and the origin direction angle θ_o using image moments.

When the current image is input, the position recognition unit 130 extracts the ceiling outline, extracts the ceiling area using the optical flow field, calculates the center coordinates (x_c, y_c) and direction angle θ_c of the area using image moments, and recognizes the position of the mobile robot by comparing the origin center coordinates (x_o, y_o) and origin direction angle θ_o calculated from the origin image with the center coordinates (x_c, y_c) and direction angle θ_c calculated from the current image.

While the present invention has been described with reference to several preferred embodiments, these embodiments are illustrative and not restrictive. It will be understood by those skilled in the art that various changes and modifications may be made therein without departing from the spirit of the invention and the scope of the appended claims.

11: Ceiling outline
12: Center point
13: Origin direction angle
14: Origin center coordinates
15: Current image direction angle
16: Current image center coordinates
21: Distorted portion
22: After distortion correction
31: Origin image
32: Current image
33: Optical flow field image
34: Optical flow
81: Origin image
82: Current image
110: Image acquisition unit
120: Ceiling extraction unit
130: Position recognition unit

Claims (12)

In an apparatus for recognizing the position of a mobile robot,
An image acquisition unit for acquiring an image of a ceiling side through a camera installed on the mobile robot perpendicularly to the ground;
a ceiling extraction unit that preprocesses the image input from the image acquisition unit, recognizes the outline of the ceiling in the image, identifies ceiling candidate areas based on the ceiling outline, and extracts the ceiling area using an optical flow field that estimates the degree of image motion between the origin image and the current image captured at a predetermined time interval; and
a position recognition unit that recognizes the position of the mobile robot by comparing the origin center coordinates and origin direction angle of the ceiling area in the origin image with the center coordinates and direction angle of the ceiling area in the current image.
The apparatus according to claim 1,
wherein the image acquisition unit acquires the ceiling image through a fisheye lens and a camera installed on the mobile robot perpendicular to the ground.
The apparatus according to claim 1,
wherein the apparatus determines the camera's intrinsic parameters, including the focal length and principal point, and corrects the distortion of the image using these parameters.
The apparatus according to claim 1,
wherein the ceiling extraction unit extracts as the ceiling the candidate area whose average optical-flow value is smallest among the areas larger than a predetermined size.
The apparatus of claim 4, wherein the ceiling extraction unit calculates the average optical flow value for each ceiling candidate area, extracts the area having the smallest average value and a size larger than the predetermined size as the ceiling area, and calculates the origin center coordinates (Figure 112015117528789-pat00037) and the origin direction angle (Figure 112015117528789-pat00038) of the extracted ceiling area using image moments.
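The image-moment computation of the center coordinates and direction angle referred to by the formula images can be sketched as follows; the notation and names are illustrative assumptions, with the angle taken as the orientation of the region's principal axis:

```python
import numpy as np

def region_center_and_angle(mask):
    """Center coordinates and direction angle of a binary region
    via image moments.

    Returns (xc, yc, theta): the centroid from first-order moments
    and the principal-axis orientation from central second-order
    moments.
    """
    ys, xs = np.nonzero(mask)
    m00 = len(xs)
    xc, yc = xs.mean(), ys.mean()
    # Central second-order moments (normalized by the area m00).
    mu20 = ((xs - xc) ** 2).sum() / m00
    mu02 = ((ys - yc) ** 2).sum() / m00
    mu11 = ((xs - xc) * (ys - yc)).sum() / m00
    # Orientation of the principal axis.
    theta = 0.5 * np.arctan2(2 * mu11, mu20 - mu02)
    return xc, yc, theta
```

For a horizontal strip the centroid sits in its middle and the principal axis is horizontal, so the returned angle is zero.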
The apparatus of claim 5, wherein, when the current image is input, the position recognition unit extracts the ceiling outline, extracts the ceiling area using the optical flow map, calculates the center coordinates (Figure 112015117528789-pat00039) and direction angle (Figure 112015117528789-pat00040) of the ceiling area using image moments, and recognizes the position of the mobile robot by comparing the origin center coordinates (Figure 112015117528789-pat00041) and origin direction angle (Figure 112015117528789-pat00042) calculated from the origin image with the center coordinates (Figure 112015117528789-pat00043) and direction angle (Figure 112015117528789-pat00044) calculated from the current image.
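The comparison of origin and current center coordinates and direction angles can be sketched as follows. The sign convention (the ceiling is fixed, so the apparent image motion mirrors the robot's own motion with opposite sign) and all names are assumptions for illustration:

```python
import numpy as np

def relative_pose(origin_cx, origin_cy, origin_theta,
                  cur_cx, cur_cy, cur_theta):
    """Pose change of the robot inferred from the ceiling region.

    Takes the ceiling-area center and direction angle measured in the
    origin image and in the current image, and returns the robot's
    displacement and rotation in image coordinates.
    """
    dtheta = cur_theta - origin_theta
    # Wrap the angle difference to [-pi, pi).
    dtheta = (dtheta + np.pi) % (2 * np.pi) - np.pi
    dx = cur_cx - origin_cx
    dy = cur_cy - origin_cy
    # The ceiling is static: apparent motion is the negative of
    # the robot's motion.
    return -dx, -dy, -dtheta
```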
In a method for recognizing the position of a mobile robot in a position recognition apparatus of the mobile robot, the method comprising:
Acquiring an image of the ceiling through a camera installed on the mobile robot perpendicular to the ground;
Correcting the distortion caused by the lens in the acquired image;
Extracting the ceiling outline, which is the boundary line of the ceiling, from the corrected image;
Extracting ceiling candidate areas based on the extracted ceiling outline, computing an optical flow map that estimates the degree of movement between the origin image and the current image captured at regular time intervals, and extracting the ceiling area using the average optical flow value of each candidate area; and
Recognizing the current position of the mobile robot by comparing the origin center coordinates and origin direction angle of the ceiling area in the origin image, recognized through the ceiling area extraction, with the center coordinates and direction angle of the ceiling area in the current image.
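The ceiling-outline extraction step can be sketched with a minimal gradient-based edge detector. A real implementation would use a proper detector (e.g. Canny); the forward-difference gradient and the mean-plus-one-standard-deviation threshold here are illustrative assumptions:

```python
import numpy as np

def ceiling_outline(gray):
    """Binary boundary map standing in for the ceiling-outline step.

    Computes the gradient magnitude by forward differences in x and y,
    then thresholds it to keep only strong boundary pixels.
    """
    g = np.asarray(gray, dtype=float)
    # Forward differences; prepend the first row/column so the
    # output keeps the input shape.
    gx = np.abs(np.diff(g, axis=1, prepend=g[:, :1]))
    gy = np.abs(np.diff(g, axis=0, prepend=g[:1, :]))
    mag = gx + gy
    # Keep pixels whose gradient is well above the image average.
    return mag > (mag.mean() + mag.std())
```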
The method of claim 7, wherein the lens is a fish-eye lens.
The method of claim 7, wherein the step of correcting the distortion finds characteristic values unique to the camera, including the camera focal length and principal point, and corrects the distortion of the image using the camera characteristic values.
The method of claim 7, wherein the step of extracting the ceiling area computes the average optical flow value for each ceiling candidate area and extracts, as the ceiling area, the area that has the smallest average value and a size larger than a predetermined size.
The method of claim 10, wherein the step of extracting the ceiling area comprises calculating the average optical flow value for each ceiling candidate area, extracting the area having the smallest average value and a size larger than the predetermined size as the ceiling area, and calculating the origin center coordinates (Figure 112015117528789-pat00045) and the origin direction angle (Figure 112015117528789-pat00046) using image moments.
The method of claim 11, wherein the step of recognizing the position of the mobile robot comprises: extracting the ceiling outline and extracting the ceiling area using the optical flow map when the current image is input; calculating the center coordinates (Figure 112015117528789-pat00047) and direction angle (Figure 112015117528789-pat00048) using image moments; and recognizing the position of the mobile robot by comparing the origin center coordinates (Figure 112015117528789-pat00049) and origin direction angle (Figure 112015117528789-pat00050) calculated from the origin image with the center coordinates (Figure 112015117528789-pat00051) and direction angle (Figure 112015117528789-pat00052) calculated from the current image.
KR1020150170036A 2015-12-01 2015-12-01 Method and apparatus for detecting localization of mobile robot by ceiling outline detection KR101725685B1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
KR1020150170036A KR101725685B1 (en) 2015-12-01 2015-12-01 Method and apparatus for detecting localization of mobile robot by ceiling outline detection

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
KR1020150170036A KR101725685B1 (en) 2015-12-01 2015-12-01 Method and apparatus for detecting localization of mobile robot by ceiling outline detection

Publications (1)

Publication Number Publication Date
KR101725685B1 true KR101725685B1 (en) 2017-04-26

Family

ID=58705094

Family Applications (1)

Application Number Title Priority Date Filing Date
KR1020150170036A KR101725685B1 (en) 2015-12-01 2015-12-01 Method and apparatus for detecting localization of mobile robot by ceiling outline detection

Country Status (1)

Country Link
KR (1) KR101725685B1 (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107403151A (en) * 2017-07-18 2017-11-28 广州贰拾肆机器人科技有限公司 Method, apparatus, equipment and the computer-readable recording medium positioned by indoor ceiling

Non-Patent Citations (4)

* Cited by examiner, † Cited by third party
Title
Journal of Institute of Control, Robotics and Systems (제어로봇시스템학회 논문지), Jan. 2010, Vol. 16, No. 1, pp. 40-47
Journal of Institute of Control, Robotics and Systems (제어로봇시스템학회 논문지), Feb. 2011, Vol. 17, No. 2, pp. 164-170
Journal of Institute of Control Systems (제어시스템학회 논문지), Apr. 2013, Vol. 19, No. 4, pp. 379-384
Journal of Korean Institute of Intelligent Systems (한국지능시스템학회 논문지), Aug. 2011, Vol. 21, No. 4, pp. 442-448

Similar Documents

Publication Publication Date Title
Levinson et al. Automatic online calibration of cameras and lasers.
US8265425B2 (en) Rectangular table detection using hybrid RGB and depth camera sensors
KR102016549B1 (en) System and methof of detecting vehicle and lane position
KR100776215B1 (en) Apparatus and method for estimating location and generating map of mobile body, using upper image, computer-readable recording media storing computer program controlling the apparatus
KR101072876B1 (en) Method and apparatus for estimating position in a mobile robot
KR101329111B1 (en) System and method for indoor navigation
EP2476996B1 (en) Parallax calculation method and parallax calculation device
TW201715476A (en) Navigation system based on augmented reality technique analyzes direction of users' moving by analyzing optical flow through the planar images captured by the image unit
CN108038139B (en) Map construction method and device, robot positioning method and device, computer equipment and storage medium
CN108007456A (en) A kind of indoor navigation method, apparatus and system
WO2018142533A1 (en) Position/orientation estimating device and position/orientation estimating method
Tamjidi et al. 6-DOF pose estimation of a portable navigation aid for the visually impaired
CN111160280B (en) RGBD camera-based target object identification and positioning method and mobile robot
KR101725685B1 (en) Method and apparatus for detecting localization of mobile robot by ceiling outline detection
CN102044079B (en) Apparatus and method for tracking image patch in consideration of scale
KR20120108277A (en) Method for localizing intelligent mobile robot by using both natural landmark and artificial landmark
KR101305405B1 (en) Method for Localizing Intelligent Mobile Robot by using a lateral landmark
Mutka et al. A low cost vision based localization system using fiducial markers
KR20140053712A (en) The localization method for indoor mobile robots by sensor fusion
JP6580286B2 (en) Image database construction device, position and inclination estimation device, and image database construction method
KR20140032113A (en) Method for localizing intelligent mobile robot by using natural landmark, artificial landmark and encoder
KR101844328B1 (en) Occlusion and rotation invariant object recognition system and method in factory automation
KR100844640B1 (en) Method for object recognizing and distance measuring
KR101979003B1 (en) Method for Localizing Intelligent Mobile Robot by Using Natural Landmark, Artificial Landmark and Inertial Sensor
KR101002776B1 (en) Apparatus and method for estimating location and generating map of mobile body, using upper image, computer-readable recording media storing computer program controlling the apparatus

Legal Events

Date Code Title Description
E701 Decision to grant or registration of patent right
GRNT Written decision to grant