KR101652300B1 - Indoor navigation system for the visually impaired using wearable device and marker recognition - Google Patents
- Publication number
- KR101652300B1 (application number KR1020160023887A)
- Authority
- KR
- South Korea
- Prior art keywords
- marker
- wearable device
- image
- module
- visually impaired
- Prior art date
Classifications
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01C—MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
- G01C21/00—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
- G01C21/20—Instruments for performing navigational calculations
- G01C21/206—Instruments for performing navigational calculations specially adapted for indoor navigation
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01C—MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
- G01C21/00—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
- G01C21/26—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 specially adapted for navigation in a road network
- G01C21/34—Route searching; Route guidance
- G01C21/3407—Route searching; Route guidance specially adapted for specific applications
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01C—MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
- G01C21/00—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
- G01C21/26—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 specially adapted for navigation in a road network
- G01C21/34—Route searching; Route guidance
- G01C21/3453—Special cost functions, i.e. other than distance or default speed limit of road segments
- G01C21/3469—Fuel consumption; Energy use; Emission aspects
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01C—MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
- G01C21/00—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
- G01C21/26—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 specially adapted for navigation in a road network
- G01C21/34—Route searching; Route guidance
- G01C21/36—Input/output arrangements for on-board computers
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F1/00—Details not covered by groups G06F3/00 - G06F13/00 and G06F21/00
- G06F1/16—Constructional details or arrangements
- G06F1/1613—Constructional details or arrangements for portable computers
- G06F1/163—Wearable computers, e.g. on a belt
Abstract
The present invention relates to an indoor navigation system for the visually impaired and to an indoor route guidance method using the same, in which wearable devices and marker recognition are used to provide a route guidance service. According to the present invention, a wearable interface that a visually impaired person can wear and use easily and conveniently is provided, and markers that store indoor position information without any power supply and whose position information can be read through image processing are utilized. This makes installation and maintenance convenient, reduces cost, and provides more reliable indoor route guidance to the visually impaired.
Description
The present invention relates to an indoor navigation system and an indoor route guidance method using the same. More particularly, the present invention relates to an indoor navigation system that provides a route guidance service to the visually impaired by means of wearable devices and marker recognition.
Currently, there are about 2.5 million disabled people in Korea. Visually impaired people account for about 10% of that total, roughly 250,000 people, making visual impairment the third most common type of disability. In surveys, visually impaired people cite finding their way along an unfamiliar route as the most inconvenient aspect of daily life, which reflects the lack of facilities and equipment for the visually impaired.
To overcome these inconveniences, various devices for the visually impaired have been developed. Outdoors, many devices guide the visually impaired using GPS, and for indoor use, navigation technologies based on Bluetooth, WiFi, RFID, NFC, and beacons have been proposed. However, GPS cannot be used in an indoor environment, and the short-range wireless approaches require identification devices such as RFID tags, NFC tags, or beacons to be installed throughout the indoor space, which makes maintenance and management difficult.
To address these drawbacks, Korean Patent Laid-Open Publication Nos. 10-2014-0036543 and 10-2013-0091908, among others, have proposed indoor navigation systems using marker recognition. Marker recognition saves cost by eliminating wireless identification devices such as beacons, which have relatively high initial installation costs, and is easy to maintain because markers require no continuous electricity supply. However, most of these conventional systems are intended for the general public and do not take the characteristics and difficulties of the visually impaired into account, making them impossible or inconvenient for the visually impaired to actually use. In particular, the visually impaired already use various auxiliary devices when moving about, and because wearing and operating such devices is cumbersome, there is reported to be very large demand for an auxiliary device that can be worn and used easily and conveniently.
The present invention has been made to overcome the disadvantages of the conventional indoor navigation systems described above. It is an object of the present invention to provide an indoor navigation system that offers a wearable interface which can be easily worn and used by a visually impaired person, provides convenience and cost reduction in installation and maintenance, and delivers more reliable indoor route guidance.
According to an aspect of the present invention, there is provided an indoor navigation system for the visually impaired, comprising: a marker attached to the ceiling of a room and storing position information of a specific point; a first wearable device provided with a main control terminal, which recognizes the position information from a photographed image of the marker to determine the current position of the visually impaired person and calculates the shortest path to a destination, and with vibration means for guiding, by left and right vibration, the direction in which the visually impaired person should proceed; and a second wearable device that can be worn by the visually impaired person, provided with a camera module for photographing the marker attached to the ceiling and configured to transmit the marker image photographed by the camera module to the main control terminal of the first wearable device.
Here, the main control terminal comprises: a marker information recognition module that receives the marker image photographed by the second wearable device, recognizes the position information contained in the marker through image processing, and identifies the current position of the visually impaired person by matching the recognized position information against the indoor map information provided from the management server; a destination information providing module that provides indoor destinations in list form; a path calculating module that calculates the shortest path to the destination based on the current position recognized by the marker information recognition module, the destination selected from the list provided by the destination information providing module, and the indoor map information provided from the management server; a route guidance module that guides the visually impaired person along the shortest path from the path calculating module to the destination in real time; and a vibrating means control module that, based on the shortest path calculated by the path calculating module, selectively activates the vibration means of the first wearable device located in the direction in which the visually impaired person should proceed so as to generate vibration. The system further includes the second wearable device and a third wearable device.
The second wearable device includes: an eyeglass part comprising an eyeglass frame and a pair of eyeglass legs, with a Bluetooth earphone at the rear end of each eyeglass leg; a hair band portion that connects the left and right eyeglass legs along the circumference of the wearer's head, both ends of which are pivotally coupled by hinges to the outside of each eyeglass leg so that it can swing forward and backward; a camera module for photographing the markers attached to the ceiling; and a communication module for transmitting the marker image photographed by the camera module to the main control terminal of the first wearable device.
Here, a mounting protrusion that prevents the hair band portion from sagging downward is provided on each of the left and right sides, so that the hair band portion hooks onto the protrusions on the eyeglass legs when it is turned forward.
The third wearable device includes: an operation button for browsing and selecting from the destination list provided by the main control terminal of the first wearable device and for starting route guidance; a control circuit for controlling the overall operation of the third wearable device; a Bluetooth module for receiving voice information from the main control terminal of the first wearable device and transmitting control command signals generated by the operation button to the main control terminal; a speaker for outputting voice information; and a battery for supplying operating power.
The marker has a square outline, and its inside is divided into a total of nine square subregions. A square cell is placed in each subregion except the uppermost-left one, giving eight cells spaced apart from each other at predetermined intervals, and the inside of each cell is filled with either white or black.
Meanwhile, a guide line interconnecting adjacent markers is additionally displayed on the ceiling of the room. The second wearable device photographs the guide line through the camera module in real time and transmits the image to the main control terminal of the first wearable device, and the main control terminal senses the direction of the guide line in real time through image processing and selectively operates one of the vibration means attached to the left and right shoulders of the first wearable device to guide the visually impaired person.
The management server includes an indoor map DB storing indoor map information in which the coordinates of the markers are plotted on the drawing of a specific building and, for every marker, information on the other markers adjacent to it is set.
According to another aspect of the present invention, there is provided an indoor route guidance method comprising: capturing an image of a marker attached to a ceiling of a room to obtain an image; Recognizing positional information from a photographed image of the marker and determining a current position of the blind person; Receiving a destination to which the visually impaired person intends to go, and calculating a shortest path from the current position of the visually impaired person to the destination; And guiding the shortest path to the calculated destination to the visually impaired in real time.
The method further includes photographing, in real time, a guide line connecting the markers to each other, detecting the direction of the guide line from the photographed image, and guiding the visually impaired person in the direction in which he or she should turn.
The position information recognition from the photographed image of the marker includes: binarizing the photographed marker image; Extracting a label corresponding to a border of the marker from the binarized image; Extracting an outline of the marker from the extracted marker label; Extracting four corner coordinates of the marker from the outline of the extracted marker; Transforming the distorted marker into a square by projectively transforming four corner coordinates of the extracted marker; Recognizing a direction of the converted marker; And extracting data of the marker.
The recognition of the direction of the marker is performed by checking the number of pixels of the cell region in the marker and reading the position of the empty region where the cell is not displayed.
The extraction of the marker data is performed by equally dividing the marker converted into a square into 4 × 4, and reading the pixel values of the respective cells located at the intersection of the equal lines to obtain the sum of the weights.
The direction of the guide line is detected by converting the RGB image into an HSV image in order to recognize the color of the guide line in the captured image, binarizing the image according to the HSV range of the guide line color, and extracting straight lines from the binarized image.
According to the present invention, a wearable interface that can be worn and used easily and conveniently by a visually impaired person is provided, and markers that store indoor position information without any power supply and allow the position information to be read through image processing are utilized, making installation and maintenance easier and reducing costs. Furthermore, a visually impaired person can be guided in a more reliable way.
1 is a configuration diagram of an indoor navigation system according to the present invention;
2 is a detailed configuration diagram of a first wearable device of an indoor navigation system according to the present invention,
3 is a main body terminal configuration diagram of a first wearable device,
4 is a detailed configuration diagram of a second wearable device of an indoor navigation system according to the present invention,
Fig. 5 is a view showing the wear and use state of the second wearable device,
6 is a detailed configuration diagram of a third wearable device of the indoor navigation system according to the present invention,
FIG. 7 is a diagram illustrating a marker of an indoor navigation system according to the present invention;
8 is an exemplary screen of a marker data setting program provided in the management server,
Fig. 9 is a view for explaining the direction determination at the branch road,
FIG. 10 is an actual use state of the indoor navigation system according to the present invention,
11 is a flowchart showing a marker recognition algorithm,
FIG. 12 shows a result obtained by binarizing and inverting an image (a) having a non-uniform brightness under the influence of illumination through an Otsu algorithm (b) and an adaptive binarization algorithm (c)
13 is a view showing a state in which a label in a binarized image is extracted,
14 is a view for extracting an outline located at the outermost position of a marker to obtain four corner coordinates of the marker,
Fig. 15 is a view showing a distorted marker image (a) and a square marker image (b) generated by projectively transforming the distorted marker image,
16 is a diagram for explaining a method of extracting marker data,
17 is a photograph showing a state where a marker and a guide line are displayed on a ceiling,
18 is a flowchart showing a guide line recognition algorithm,
Figure 19 is a diagram showing the HSV cone model,
20 shows the result (b) of binarizing the HSV image (a) based on the color range of the guide line L,
Figure 21 shows the results of binarization from marker images taken under various lighting conditions,
22 is a result of converting an image of a guide line L taken in two different illumination environments into an HSV image and then binarizing it.
Hereinafter, an indoor navigation system and method according to the present invention will be described in detail with reference to the accompanying drawings and preferred embodiments.
As shown in FIG. 1, an indoor navigation system according to the present invention includes a user device U, a marker M, and a management server.
The user device U is configured in the form of wearable devices that are easily worn by a visually impaired person. It acquires position information from the markers M displayed in advance in the room so as to determine the current position of the visually impaired person, and guides the route to the desired destination in real time. The user device U includes a first wearable device, a second wearable device, and a third wearable device.
FIG. 2 is a block diagram of the first wearable device.
The first wearable device is worn on the upper body of the visually impaired person and includes the main control terminal and the vibration means attached to the left and right shoulders.
The vibration means generate left or right vibration to guide the direction in which the visually impaired person should proceed.
FIG. 3 shows a block diagram of the main control terminal of the first wearable device.
The marker information recognition module receives the marker image photographed by the second wearable device, recognizes the position information contained in the marker through image processing, and identifies the current position of the visually impaired person by matching it against the indoor map information provided from the management server.
The destination information providing module provides the indoor destinations in list form.
The path calculating module calculates the shortest path to the destination based on the current position recognized by the marker information recognition module, the selected destination, and the indoor map information provided from the management server.
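The patent does not disclose which shortest-path algorithm the path calculating module uses. As an illustrative sketch only, the calculation over a marker adjacency graph (each marker a node, adjacent markers connected with their corridor distance as the edge weight) could be done with Dijkstra's algorithm; the function and variable names below are all hypothetical:

```python
import heapq

def shortest_path(adjacency, start, goal):
    """Dijkstra's algorithm over a marker adjacency graph.
    adjacency: {marker_id: [(neighbor_id, distance), ...]}
    Returns the marker sequence from start to goal, or None."""
    dist = {start: 0.0}
    prev = {}
    heap = [(0.0, start)]
    visited = set()
    while heap:
        d, node = heapq.heappop(heap)
        if node in visited:
            continue
        visited.add(node)
        if node == goal:
            break
        for nb, w in adjacency.get(node, []):
            nd = d + w
            if nd < dist.get(nb, float("inf")):
                dist[nb] = nd
                prev[nb] = node
                heapq.heappush(heap, (nd, nb))
    if goal not in dist:
        return None  # destination unreachable from start
    # Reconstruct the marker sequence by walking back from the goal.
    path = [goal]
    while path[-1] != start:
        path.append(prev[path[-1]])
    return list(reversed(path))

# Toy corridor: markers 1-2-3-4 plus a longer direct link 1-4.
graph = {1: [(2, 5.0), (4, 20.0)], 2: [(3, 5.0)], 3: [(4, 5.0)], 4: []}
print(shortest_path(graph, 1, 4))  # -> [1, 2, 3, 4]
```

The guidance modules would then announce each marker on the returned sequence as it is recognized.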
The route guidance module guides the visually impaired person along the calculated shortest path to the destination in real time.
The vibrating means control module selectively activates the vibration means located in the direction in which the visually impaired person should proceed, based on the shortest path calculated by the path calculating module.
The
Fig. 4 is a perspective view showing the external configuration of the second wearable device.
The second wearable device includes the eyeglass part, the hair band portion, the camera module for photographing the markers attached to the ceiling, and the communication module for transmitting the photographed marker image to the main control terminal of the first wearable device.
The
In addition, a mounting protrusion is provided so that the hair band portion hooks onto it when turned forward and is prevented from sagging downward.
The
In Fig. 5, wear and use states of the second wearable device are shown.
5B is an indoor mode. When the visually impaired person is located indoors, the hair band portion is turned so that the camera module faces the ceiling and photographs the markers.
Fig. 6 shows a configuration diagram of the third wearable device.
The third wearable device includes the operation button, the control circuit, the Bluetooth module, the speaker, and the battery described above.
FIG. 7 is a block diagram of a marker used in an indoor navigation system according to the present invention. FIG. 7A is a diagram illustrating a design of a marker including specific positional information, and FIG. 7B is a diagram showing numerical values of weights of cells constituting the marker. Hereinafter, the configuration and action of the marker will be described with reference to FIG.
The marker M stores the position information of a specific point in the room and is attached to the ceiling at main facilities and branch points. If the marker M were attached to the floor, it could be hidden by obstacles such as objects or people located on it, and it would be easily damaged because it is stepped on; if attached to a wall, it could likewise be obscured by obstacles and damaged. Therefore, it is preferable to attach the marker M to the ceiling, which is least affected by obstacles and not easily damaged.
In the present invention, since the marker M attached to the ceiling must be recognized, the marker M should be designed in consideration of the influence of the surrounding illumination and of the change in recognition rate depending on the height of the ceiling. Representative markers that can represent a large amount of data with a high recognition rate include QRCode, ARTag, SCR, and HOM, and most of them share the common form of data coded into a grid-pattern structure.
The shape of the marker M proposed by the present invention is shown in Fig. 7 (a): it has a square outline, and the inside is divided into a total of nine square subregions. A square cell is placed in each subregion except the uppermost-left one, giving eight cells in total, spaced apart from each other. The inside of each cell is filled with white or black: a white cell carries no weight, while a black cell carries the weight corresponding to its position. The data represented by the marker M is defined as the sum of the weights of all black cells. The empty space at the upper left is used to recognize the direction in which the marker M is viewed, preventing the same data from being read differently depending on the viewing direction. For the 3 × 3 marker with eight cells shown in Fig. 7 (a), the number of data values that can be represented is 2^8 = 256, and by extending the marker to an N × N form, more data values can be represented.
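The encoding above can be sketched in a few lines. Note that the exact per-cell weights of FIG. 7 (b) are not reproduced in the text; the power-of-two assignment below is an assumption that yields the stated 2^8 = 256 distinct values:

```python
# Assumed weights for the eight cells (FIG. 7 (b) is not reproduced here;
# any assignment of the eight powers of two gives 256 distinct values).
WEIGHTS = [1, 2, 4, 8, 16, 32, 64, 128]

def marker_value(cells):
    """cells: list of 8 booleans in subregion order, True = black cell.
    The upper-left subregion holds no cell (it marks the direction),
    so only eight entries are needed."""
    return sum(w for w, black in zip(WEIGHTS, cells) if black)

# Example: cells with weights 1, 4 and 128 filled black -> 133.
print(marker_value([True, False, True, False, False, False, False, True]))
```

With all eight cells black the value is 255, and with none black it is 0, covering 256 codes in total.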
Such a marker can be used to represent data corresponding to branch locations. Meanwhile, as shown in FIG. 1, the indoor navigation system according to the present invention includes a management server.
In particular, the management server provides a marker data setting program for setting the coordinates of each marker and the information of its adjacent markers on the indoor map.
The coordinates of the markers set through the marker data setting program of the management server are stored in the indoor map DB together with the information of the markers adjacent to each marker.
FIG. 10 shows an actual state of use of the indoor navigation system according to the present invention, including the user device U, the markers, and the management server.
Up to now, the configuration of the indoor navigation system according to the present invention has been described. Hereinafter, a specific method of recognizing the marker M and a route guidance method will be described.
One. Marker recognition
In order to recognize the marker M, an image is first input through the camera module of the second wearable device, and the image processing steps described below are applied to it.
1-1. Adaptive binarization
Binarization, a key element of the image preprocessing process, divides the pixels of an image into two groups. Binarization algorithms fall into global binarization algorithms, which use a single threshold for the entire image, and adaptive binarization algorithms, which apply a different threshold to each pixel based on information about its neighboring pixels. A representative global binarization algorithm is the Otsu algorithm, which finds a threshold using the variance of the histogram. Adaptive binarization algorithms include the Niblack algorithm, which uses the mean and standard deviation of neighboring pixels, and the Sauvola algorithm, which complements the disadvantages of the Niblack algorithm. FIG. 12 shows the results of binarizing and inverting an image (a) with non-uniform brightness, caused by illumination, through the Otsu algorithm (b) and the adaptive binarization algorithm (c) provided by OpenCV. In the present invention, the marker M is attached to the ceiling and is easily affected by ambient light, so the brightness is unevenly distributed throughout the image. Since a global binarization method using a fixed threshold cannot handle images with uneven brightness, an adaptive binarization algorithm that applies a different threshold to each pixel must be used to binarize the marker image. The present invention uses OpenCV's adaptive threshold, which determines the threshold for each pixel as the arithmetic mean of its neighboring pixels.
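The mean-based adaptive thresholding described above (what OpenCV's cv2.adaptiveThreshold computes with ADAPTIVE_THRESH_MEAN_C) can be sketched in pure Python; the window size and offset below are illustrative, not the patent's values:

```python
def adaptive_binarize(img, block=3, c=0):
    """Adaptive mean thresholding: each pixel is compared against the
    arithmetic mean of its (2*block+1) x (2*block+1) neighbourhood,
    clipped at the image borders. img: 2D list of grayscale values.
    Returns 255 where the pixel exceeds (mean - c), else 0."""
    h, w = len(img), len(img[0])
    out = [[0] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            vals = [img[yy][xx]
                    for yy in range(max(0, y - block), min(h, y + block + 1))
                    for xx in range(max(0, x - block), min(w, x + block + 1))]
            mean = sum(vals) / len(vals)
            out[y][x] = 255 if img[y][x] > mean - c else 0
    return out
```

Because the threshold follows the local mean, a dark half and a bright half of the same marker are each binarized against their own surroundings rather than one global value.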
1-2. Marker Extract region label
A label is an object composed of connected pixels, and labeling is the process of extracting such labels from an image. There are many ways to implement labeling, but it is generally implemented through a flood-fill algorithm; in the present invention, labeling is implemented through a flood-fill algorithm using a queue. To separate the region corresponding to the marker from the binarized image, labeling is performed on the binarized image, and the characteristics of each detected label are examined to find the label corresponding to the border of the marker M. FIG. 13 shows the labels extracted from the binarized image drawn on the original image; it can be confirmed that a label surrounding the border of the marker M exists. In the present invention, labeling is then performed again inside each of the label areas detected in the first labeling process. If the number of labels located inside is 8, the width and height of each label agree within an error range of ±30%, and the aspect ratio is between 0.6 and 1.7, the region is judged to be a marker.
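The queue-based flood-fill labeling mentioned above can be sketched as follows (a minimal version with 4-connectivity; the patent does not state the connectivity used):

```python
from collections import deque

def label_components(binary):
    """Connected-component labeling via queue-based flood fill.
    binary: 2D list of 0/1. Returns (label image, number of labels);
    label 0 is background."""
    h, w = len(binary), len(binary[0])
    labels = [[0] * w for _ in range(h)]
    current = 0
    for sy in range(h):
        for sx in range(w):
            if binary[sy][sx] and not labels[sy][sx]:
                current += 1
                q = deque([(sy, sx)])
                labels[sy][sx] = current
                while q:
                    y, x = q.popleft()
                    # Visit the four edge-connected neighbours.
                    for ny, nx in ((y - 1, x), (y + 1, x), (y, x - 1), (y, x + 1)):
                        if 0 <= ny < h and 0 <= nx < w \
                                and binary[ny][nx] and not labels[ny][nx]:
                            labels[ny][nx] = current
                            q.append((ny, nx))
    return labels, current

img = [[1, 1, 0, 0],
       [0, 0, 0, 1],
       [0, 1, 0, 1]]
labels, n = label_components(img)
print(n)  # 3 separate components
```

Each resulting label's bounding box can then be tested against the width/height and aspect-ratio criteria described above.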
1-3. Extracting outlines
Because the camera does not look exactly perpendicularly at the ceiling marker M, the marker M in the image generally appears as a distorted quadrilateral rather than a square. Since it is difficult to extract data from the distorted form of the marker M, it must undergo a projection transformation that transforms it back into a square. Prior to the projection transformation, the outline is extracted from the marker label region. An outline consists of a set of consecutive points, and various outlines can be detected in an image. As shown in FIG. 14, in order to obtain the coordinates of the four corners of the marker, the outline located at the outermost position of the marker must be extracted; in general, it can be easily identified using the feature that the outermost outline is the longest.
1-4. Marker Corner Extraction
Once the outline of the marker M has been extracted, the four corner coordinates of the marker M can be obtained. The first coordinate is the point on the outline at the maximum distance from an arbitrarily selected point on the outline; the second coordinate is the point at the maximum distance from the first coordinate; and the third coordinate is the point at the maximum distance from the first and second coordinates. The fourth coordinate (x4, y4) is the coordinate for which d in the following Equation (1) is maximized with respect to the three previously determined coordinates (x1, y1), (x2, y2), (x3, y3).
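This corner search can be sketched directly. Equation (1) itself is not reproduced in the text, so the sketch below assumes d is the summed distance to the three known corners (a plausible reading, not the patent's exact formula); "maximum distance from the first and second coordinates" is likewise taken as the summed distance:

```python
from math import dist  # Euclidean distance, Python 3.8+

def extract_corners(outline):
    """outline: list of (x, y) points along the marker border.
    Follows the text's procedure: farthest point from an arbitrary
    start -> corner 1; farthest from corner 1 -> corner 2; maximal
    summed distance to corners 1 and 2 -> corner 3; maximal summed
    distance to all three (assumed form of Equation (1)) -> corner 4."""
    p0 = outline[0]
    c1 = max(outline, key=lambda p: dist(p, p0))
    c2 = max(outline, key=lambda p: dist(p, c1))
    c3 = max(outline, key=lambda p: dist(p, c1) + dist(p, c2))
    c4 = max(outline, key=lambda p: dist(p, c1) + dist(p, c2) + dist(p, c3))
    return [c1, c2, c3, c4]

# Points sampled along the border of a 10 x 10 square.
square = [(0, 0), (5, 0), (10, 0), (10, 5), (10, 10), (5, 10), (0, 10), (0, 5)]
print(extract_corners(square))
```

On this square outline the procedure recovers exactly the four true corners regardless of which point is chosen as the arbitrary start.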
1-5. Projection transformation
The projection transformation matrix is calculated to transform the distorted marker image as shown in Fig. 15 (a) into a square marker image as shown in Fig. 15 (b). In general, the equation for projection transformation is given by the following equation (2).
From Equation (2), the equations for x 'and y' can be written as Equation (3) and Equation (4).
It can be seen from Equations (3) and (4) that if four pairs of coordinates (x, y) and their corresponding post-transformation coordinates (x', y') are known, the eight unknowns of the projection matrix can be obtained. Therefore, the projection matrix is obtained by substituting the four corner coordinates extracted from the outline and the corresponding square coordinates, and the projection transformation is applied to all points of the distorted marker to restore the original square shape. In the present invention, the marker was transformed into a square of 200 × 200 size.
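The eight-unknown solve can be sketched without OpenCV (cv2.getPerspectiveTransform does the same job). Writing the transform as x' = (h0·x + h1·y + h2)/(h6·x + h7·y + 1) and y' = (h3·x + h4·y + h5)/(h6·x + h7·y + 1), each point pair contributes two linear equations; the example corner coordinates below are illustrative, not taken from the patent:

```python
def solve(A, b):
    """Gauss-Jordan elimination with partial pivoting (small systems)."""
    n = len(A)
    M = [row[:] + [b[i]] for i, row in enumerate(A)]
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(M[r][col]))
        M[col], M[piv] = M[piv], M[col]
        for r in range(n):
            if r != col and M[r][col]:
                f = M[r][col] / M[col][col]
                M[r] = [a - f * p for a, p in zip(M[r], M[col])]
    return [M[i][n] / M[i][i] for i in range(n)]

def homography(src, dst):
    """Projection transform h0..h7 from four point correspondences.
    Each pair gives two rows of the 8 x 8 linear system."""
    A, b = [], []
    for (x, y), (xp, yp) in zip(src, dst):
        A.append([x, y, 1, 0, 0, 0, -x * xp, -y * xp]); b.append(xp)
        A.append([0, 0, 0, x, y, 1, -x * yp, -y * yp]); b.append(yp)
    return solve(A, b)

def apply_h(h, p):
    """Apply the projection transform to one point."""
    x, y = p
    w = h[6] * x + h[7] * y + 1
    return ((h[0] * x + h[1] * y + h[2]) / w,
            (h[3] * x + h[4] * y + h[5]) / w)

# Map a distorted (trapezoidal) marker back onto a 200 x 200 square.
src = [(30, 40), (180, 55), (170, 190), (20, 175)]
dst = [(0, 0), (199, 0), (199, 199), (0, 199)]
H = homography(src, dst)
```

Applying apply_h(H, ·) to every pixel coordinate of the distorted marker (or, in practice, sampling the inverse transform) restores the square marker image.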
1-6. Marker direction recognition
Since the positions of the cells change according to the direction in which the marker M is photographed, the direction of the marker M must be recognized in order to extract the data correctly, and a rotational transformation must then be performed to fix the position of each cell in the image. Because the direction of the marker M is determined by the position of the empty subregion that contains no cell, the number of pixels in each candidate region is counted and the direction is determined by finding the empty region. In the present invention, for the restored image of FIG. 15 (b), the pixels of the four regions whose x and y values lie within the ranges [25, 75] and [125, 175] are checked, and the region with the smallest pixel count is judged to be the blank region.
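The blank-region search on the restored 200 × 200 image can be sketched as below. The mapping from blank-region position to rotation angle is an assumption here (the patent does not spell it out), but the core step of counting pixels in the four [25,75]/[125,175] regions follows the text:

```python
def marker_direction(img):
    """img: 200x200 binarized marker, 1 = dark pixel.
    Counts dark pixels in the four corner regions whose x and y ranges
    are [25,75] or [125,175] and returns the rotation (degrees) assumed
    to bring the emptiest region to the upper-left."""
    regions = {0: (25, 25), 90: (125, 25), 180: (125, 125), 270: (25, 125)}
    counts = {}
    for angle, (x0, y0) in regions.items():
        counts[angle] = sum(img[y][x]
                            for y in range(y0, y0 + 51)
                            for x in range(x0, x0 + 51))
    return min(counts, key=counts.get)  # emptiest region wins

# Synthetic marker: every region dark except the upper-right one.
img = [[1] * 200 for _ in range(200)]
for y in range(25, 76):
    for x in range(125, 176):
        img[y][x] = 0
print(marker_direction(img))  # -> 90
```

After the angle is known, a rotational transformation of the image (or simply a reindexing of the cell positions) fixes the cell layout.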
1-7. Data extract
In order to extract the marker data, the value corresponding to each cell must be read. In the present invention, the restored image is divided equally into 4 × 4 regions as shown in FIG. 16, the pixel values of the cells located at the intersections of the division lines are read, and the marker data is extracted as the sum of the corresponding weights. The weights corresponding to the respective cells are as shown in FIG. 7 (b). The extracted marker data represents a specific location in the building and is used as the key for searching, on the indoor map in which the marker positions are recorded, for a route to the destination.
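For a 200 × 200 restored image, the 4 × 4 division lines sit at x, y = 50, 100, 150, giving nine interior intersections; the upper-left one falls in the blank direction region. The per-cell weights of FIG. 7 (b) are not reproduced in the text, so the power-of-two assignment below is an assumption:

```python
# Assumed weights at the eight sampled intersections (FIG. 7 (b) not
# reproduced); (50, 50) is the blank direction region and carries none.
WEIGHTS = {(100, 50): 1, (150, 50): 2,
           (50, 100): 4, (100, 100): 8, (150, 100): 16,
           (50, 150): 32, (100, 150): 64, (150, 150): 128}

def extract_data(img):
    """img: 200x200 binarized, direction-corrected marker, 1 = black.
    Samples the pixel at each weighted intersection of the 4x4
    division lines and sums the weights of the black ones."""
    return sum(w for (x, y), w in WEIGHTS.items() if img[y][x])
```

Sampling single pixels at the intersections keeps the read cheap; a more robust variant could average a small patch around each intersection instead.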
2. Guide Line Recognition
It is difficult for the visually impaired to know in which direction the next marker is located from voice guidance alone. Therefore, as shown in FIG. 17, a guide line L connecting marker to marker is drawn on the ceiling, and the image photographed by the camera module is analyzed in real time to detect the direction of the guide line and guide the visually impaired person accordingly.
2-1. HSV conversion
In order to recognize the guide line L, its color must be recognized. However, the RGB model, which is the basic color model of an image, has the disadvantage that changes in brightness are difficult to express. On the other hand, the HSV color space, which expresses color by hue, saturation, and value, is useful for color recognition because it represents changes in brightness well. The formula for converting RGB to HSV is given in Equation (5), where R, G, and B take values in [0,1], H takes values in [0,360], and S and V take values in [0,1]. Here, H, S, and V are defined on the HSV cone model shown in FIG. 19.
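The conversion with the stated value ranges can be sketched with the standard library's colorsys module (scaling its [0,1) hue to degrees). Note one caveat: the hue range 40-60 quoted below for the guide line may well be on OpenCV's 0-179 hue scale rather than degrees; the predicate here takes the numbers at face value in degrees, which is an assumption:

```python
import colorsys

def rgb_to_hsv(r, g, b):
    """RGB in [0,1] -> (H in [0,360], S in [0,1], V in [0,1]),
    matching the ranges stated for Equation (5)."""
    h, s, v = colorsys.rgb_to_hsv(r, g, b)
    return h * 360.0, s, v

def in_guide_line_range(r, g, b, h_range=(40.0, 60.0),
                        s_min=0.5, v_min=0.1):
    """Binarization predicate for the guide-line colour.  The hue
    bounds, and the normalisation of the patent's saturation/value
    thresholds (50 and 10 on 0-255 scales) to [0,1], are assumptions."""
    h, s, v = rgb_to_hsv(r, g, b)
    return h_range[0] <= h <= h_range[1] and s >= s_min and v >= v_min
```

Applying the predicate per pixel yields the binarized guide-line mask corresponding to FIG. 20 (b); in OpenCV this is a single cv2.inRange call on the HSV image.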
In order to recognize the color of the guide line L in the image, the RGB image is converted into the HSV image and the image is binarized according to the HSV range of the color of the guide line (L). The range of the hue corresponding to the hue of the guide line L used in the present invention is 40-60, and the saturation and the value are set to 50 or more and 10 or more, respectively. 20 shows a result (b) obtained by binarizing the HSV image (a) with reference to the color range of the guide line (L).
2-2. Straight line extraction
To obtain the slope of the guide line L, straight lines are extracted from the image of FIG. 20 (b) and the angle is calculated from their slopes. The Hough transform is generally used to extract straight lines from an image: the two parameter values expressing each straight line passing through a point in the image are accumulated, and straight lines corresponding to high accumulated values are detected. Since there are innumerable straight lines passing through one point, the line is expressed for the Hough transform as in Equation (6) with θ and r as parameters, and the parameter values are recorded while incrementing θ at a constant step. The step size of θ therefore affects both the accuracy and the speed of straight-line detection.
In general, the Hough transform is performed on the edge image generated by the Canny edge detector. In the present invention, as shown in FIG. 19, the color corresponding to the guide line L is first separated in the HSV image, a Canny edge image is then generated, and the Hough transform is performed. However, due to the characteristics of the surrounding environment, regions other than the guide line L may be detected, and straight lines caused by noise may be detected in addition to the straight lines corresponding to the guide line L. Therefore, in the present invention, straight lines are detected using the probabilistic Hough transform and straight lines shorter than a predetermined critical length are discarded, thereby increasing the recognition rate of the guide line L. Two straight lines corresponding to the left and right boundary lines of the guide line L are detected from one guide line image. If the slopes of the two straight lines are m1 and m2, respectively, the angle θg of the guide line L is defined by Equation (7). Since the error range is ±10°, the left vibration means is activated when θg is in the range from -80° to 0°, and the right vibration means is activated when θg is in the range from 0° to 80°, guiding the visually impaired person.
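The final decision step can be sketched as follows. Equation (7) is not reproduced in the text, so the mean of the two boundary-line angles is assumed here; in OpenCV the boundary segments themselves would come from cv2.HoughLinesP with a minimum-length filter, as described above:

```python
from math import atan, degrees

def guide_angle(m1, m2):
    """Angle of the guide line from the slopes of its two boundary
    lines.  Assumed form of Equation (7): the mean of the two
    boundary-line angles, in degrees."""
    return (degrees(atan(m1)) + degrees(atan(m2))) / 2.0

def vibration_side(theta_g):
    """Which shoulder vibration means to trigger for angle theta_g.
    Angles within 10 degrees of vertical (|theta_g| > 80) are treated
    as 'straight ahead', matching the stated +/-10 degree error band."""
    if -80.0 <= theta_g < 0.0:
        return "left"
    if 0.0 <= theta_g <= 80.0:
        return "right"
    return None  # near-vertical guide line: keep walking straight

# Guide line leaning to the right: both boundary slopes positive.
print(vibration_side(guide_angle(1.0, 1.2)))  # -> "right"
```

The selected side is then forwarded to the vibrating means control module, which pulses the corresponding shoulder of the first wearable device.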
3. Experimental Results and Performance Analysis
3-1. Marker recognition rate measurement
In order to analyze the performance of the marker recognition algorithm implemented in the present invention, the marker recognition rate according to the gait, the marker recognition rate according to the distance, and the marker recognition rate according to illumination were measured on the
In this experiment,
Table 1. Marker recognition success rate
In Table 1, the per-second marker recognition success rate is the proportion of one-second intervals in which the marker was recognized at least once. Accurate marker recognition is therefore possible, because the marker does not leave the camera's angle of view within one second when moving at 0.4 m/s. This also indicates that the motion blur caused by the walking of the visually impaired person is small and does not significantly affect marker recognition.
Table 2. Success rate of marker recognition according to distance between marker and camera module
Table 2 shows the results of recognizing a 0.2 m × 0.2 m marker while varying the distance between the camera and the marker. According to the experimental results, the recognition rate was 100% within 1.5 m. Since the distance from the camera to a ceiling-mounted marker is about 1.5 m when assuming an adult chest height of 1.5 m, accurate marker recognition is possible in a building with a ceiling height of 3 m or less. Considering that most ceilings are lower than 3 m, this marker can be applied to buildings of typical ceiling height. In buildings with ceilings higher than 3 m, the recognition rate can be increased by using a marker larger than 0.2 m × 0.2 m.
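The ceiling-height argument above can be made explicit with a simple scaling rule. A pinhole-camera model with linear size-distance scaling is assumed here; the patent reports only the measured reference point (0.2 m marker readable at up to 1.5 m).

```python
def min_marker_size(distance_m, ref_size_m=0.2, ref_distance_m=1.5):
    """Marker side length needed for reliable recognition at a given
    camera-to-marker distance, scaling linearly from the measured
    reference (0.2 m marker recognized 100% within 1.5 m).
    Linear scaling is a pinhole-camera ASSUMPTION, not a measured fact."""
    return ref_size_m * distance_m / ref_distance_m

# a 4.5 m atrium ceiling viewed from chest height (1.5 m) is 3.0 m away
needed = min_marker_size(3.0)
```

Under this assumption, a 3.0 m viewing distance would call for a marker of roughly 0.4 m on a side, consistent with the text's suggestion to enlarge the marker for taller ceilings.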
Whether the marker recognition rate changes with illumination depends solely on the binarization algorithm. FIG. 21 shows the binarization results for marker images taken under various illuminations. In FIG. 21 (a), (b), and (c), the marker region was correctly binarized and the marker recognition rate was 100% for all frames. It follows that this marker recognition algorithm is robust to illumination, and accurate marker recognition is possible anywhere there is no light source strong enough to wash out the marker in the image.
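Otsu's method [4], which appears among the cited references, is one illumination-robust way to pick the binarization threshold. The sketch below is an assumption about the binarization step, since this excerpt does not name the exact algorithm used.

```python
def otsu_threshold(pixels):
    """Otsu's method: pick the threshold that maximizes the
    between-class variance of the gray-level histogram."""
    hist = [0] * 256
    for p in pixels:
        hist[p] += 1
    total = len(pixels)
    total_sum = sum(i * h for i, h in enumerate(hist))
    best_t, best_var = 0, -1.0
    w_b = 0    # background (dark class) pixel count so far
    sum_b = 0  # background gray-level sum so far
    for t in range(256):
        w_b += hist[t]
        if w_b == 0:
            continue
        w_f = total - w_b
        if w_f == 0:
            break
        sum_b += t * hist[t]
        mu_b = sum_b / w_b
        mu_f = (total_sum - sum_b) / w_f
        var_between = w_b * w_f * (mu_b - mu_f) ** 2
        if var_between > best_var:
            best_var, best_t = var_between, t
    return best_t

# bimodal histogram: dark marker cells vs. bright ceiling background
threshold = otsu_threshold([20] * 50 + [30] * 50 + [200] * 50 + [220] * 50)
```

Because the threshold is recomputed from each frame's histogram rather than fixed, the split between marker cells and ceiling background shifts with the overall brightness, which matches the illumination robustness observed in FIG. 21.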
3-2. Guide line recognition rate measurement
In order to analyze the performance of the guide line recognition algorithm implemented in the present invention, the guide line recognition rate under different illumination was measured. Recognition is counted as successful when the angle of the extracted guide line L falls within the expected range. FIG. 22 shows the result of converting images of the guide line L, taken in two different illumination environments, into HSV images and then binarizing them.
As a result of measuring the guide line recognition rate in FIG. 22 (a), the average recognition rate was 76% and the average number of recognitions per second was 8. In (b), on the other hand, the average recognition rate per frame was 99% and the average number of recognitions per second was 24.95. The reason for this difference is that in FIG. 22 (b) the noise, rather than the illumination itself, is removed, so that the Hough transform runs faster and only the area corresponding to the guide line L is binarized, making more reliable recognition possible. It is confirmed that the guide line L is detected smoothly overall, and the recognition rate is expected to increase further if additional noise removal is applied to the HSV-binarized image.
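The HSV color segmentation that precedes the Hough step can be sketched as below. The particular hue/saturation/value range is purely illustrative (the patent does not state the guide line's color here); the hue wrap-around handling is a standard detail, included because hue is a circular quantity.

```python
def in_hsv_range(pixel, lo, hi):
    """True if an (h, s, v) pixel lies in the inclusive range,
    handling hue wrap-around (e.g. red spans roughly 350-10 degrees)."""
    h, s, v = pixel
    h_lo, s_lo, v_lo = lo
    h_hi, s_hi, v_hi = hi
    if h_lo <= h_hi:
        h_ok = h_lo <= h <= h_hi
    else:  # hue interval wraps past zero
        h_ok = h >= h_lo or h <= h_hi
    return h_ok and s_lo <= s <= s_hi and v_lo <= v <= v_hi

def binarize(hsv_pixels, lo, hi):
    """1 where the pixel matches the guide-line color range, else 0."""
    return [1 if in_hsv_range(p, lo, hi) else 0 for p in hsv_pixels]

# a saturated "guide line" pixel vs. a dull background pixel
mask = binarize([(60, 200, 200), (10, 40, 40)],
                lo=(50, 100, 100), hi=(70, 255, 255))
```

Tightening the saturation and value bounds is one way to achieve the extra noise removal suggested at the end of this section, since dim or washed-out pixels are excluded before the Hough transform ever sees them.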
The specific embodiments of the present invention have been described above. It is to be understood, however, that the scope and spirit of the present invention are not limited to these specific embodiments, and that those skilled in the art may make various modifications and changes without departing from the spirit of the present invention. The above-described embodiments are provided so that those skilled in the art can fully understand the scope of the present invention; they are to be considered in all respects as illustrative and not restrictive. The invention is defined only by the scope of the claims.
U: user device 100: first wearable device
110: Main control terminal 111: Marker information recognition module
112: Destination information providing module 113: Path calculating module
114: path guide module 115: vibration means control module
116: communication module 120: auxiliary battery
130: vibration means 200: second wearable device
210: glasses section 212: eyeglass frame
214: Glasses bridge 216: Bluetooth earphone
220: Hair band part 222:
230: camera module 300: third wearable device
310: Operation button 320: Control circuit
330: Bluetooth module 340: Speaker
350: Battery part M: Marker
L: guide line 400: management server
410: indoor map DB
Claims (13)
The second wearable device includes: an eyeglass part including an eyeglass frame and a pair of eyeglass legs, with a Bluetooth earphone on the rear end side of each eyeglass leg; a hair band part that connects the left and right eyeglass legs of the eyeglass part along the head circumference of the visually impaired person, both left and right ends of the hair band part being pivotally connected to the outside of the eyeglass legs by hinges so as to rotate in the forward and backward directions; and a communication module for transmitting a marker image photographed by the camera module to the main control terminal of the first wearable device, wherein protrusions are formed on the left and right sides of the hair band part so that, when the hair band part is turned toward the front side, the hair band part hooks onto the eyeglass legs and is prevented from sagging downward,
The marker has a square outline; the inside is divided into a total of nine square sub-areas, in which eight square cells are arranged spaced apart from each other, excluding the uppermost-left sub-area, and the inside of each cell is filled with white or black,
On the ceiling of the room, guide lines interconnecting adjacent markers are displayed,
Wherein the management server includes an indoor map DB storing indoor map information in which the coordinates of each marker are plotted on a drawing of a specific building and, for every marker, information on the other markers adjacent to it is set,
The main control terminal includes: a marker information recognition module that receives the marker image photographed by the second wearable device, recognizes the position information contained in the marker through image processing, and identifies the current position of the visually impaired person by matching the recognized position information with the indoor map information provided from the management server; a destination information providing module for providing indoor destinations in list form; a path calculating module for calculating the shortest path to a destination based on the current position recognized by the marker information recognition module, the specific destination selected from the destination list provided by the destination information providing module, and the indoor map information provided from the management server; a route guidance module for guiding the visually impaired person in real time along the shortest path calculated by the path calculating module; a vibration means control module that, based on the shortest path to the destination calculated by the path calculating module, selectively activates the vibration means of the first wearable device located on the side toward which the visually impaired person should proceed; and a communication module for mutual communication with the management server, the second wearable device, and the third wearable device,
The main control terminal binarizes the photographed marker image, extracts the label corresponding to the border of the marker from the binarized image, extracts the outline of the marker from the extracted marker label, extracts the four corner points of the marker from the extracted outline, transforms the distorted marker into a square using the four corner points, reads the position of the empty area where no cell is displayed by checking the number of pixels in each cell area of the marker, recognizes the direction of the marker, rotates the marker image so that the position of each cell of the marker is fixed in the image, divides the square-transformed marker into 4x4, and extracts the data of the marker by reading the pixel value of each cell and obtaining the sum of the weights,
In addition, the main control terminal determines the direction in which the visually impaired person should move by identifying, from the indoor map information provided by the management server, the current marker located at the current position of the visually impaired person, the previous marker adjacent to the current marker among the markers already passed, and the next marker on the path, and by calculating the angle between the straight line passing through the previous marker and the current marker and the straight line passing through the current marker and the next marker,
The second wearable device captures the guide line image in real time through the camera module and transmits it to the main control terminal of the first wearable device. The main control terminal recognizes the color of the guide line in the captured image using the HSV image, binarizes the image according to the HSV range of the guide line color, detects the direction of the guide line by calculating the angle of the guide line from the binarized image, and selectively activates one of the vibration means attached to the left and right shoulders of the first wearable device to guide the visually impaired person to turn his or her body.
Wherein the third wearable device comprises:
An operation button for inquiring the destination list provided from the main control terminal of the first wearable device, selecting a destination, and starting route guidance;
A control circuit for controlling the overall operation of the third wearable device;
A Bluetooth module for receiving voice information from the main control terminal of the first wearable device and transmitting a control command signal generated through the operation button to the main control terminal;
A speaker for outputting audio information to the outside;
And a battery part for supplying operating power.
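The marker-decoding and path-direction steps recited in the claims above can be sketched as follows. All specifics here are assumptions for illustration: the claims say only that marker data is obtained as a "sum of the weights" and that a direction is derived from the angle between two marker-to-marker lines, so the power-of-two weighting, cell scan order, corner-to-rotation mapping, and sign convention below are invented.

```python
import math

def decode_marker(cells):
    """Decode the 8 data cells, read in a fixed scan order after the
    empty orientation sub-area has been rotated back to the top-left.
    Each black cell contributes a power-of-two weight (ASSUMED encoding)."""
    return sum(bit << i for i, bit in enumerate(cells))

def rotation_to_fix(empty_corner):
    """Degrees to rotate the rectified marker so the empty sub-area
    returns to the top-left (ASSUMED corner-to-rotation mapping)."""
    steps = {"top_left": 0, "bottom_left": 1, "bottom_right": 2, "top_right": 3}
    return steps[empty_corner] * 90

def turn_angle(prev_m, cur_m, next_m):
    """Signed angle in degrees between the line through the previous and
    current markers and the line through the current and next markers,
    normalized to (-180, 180]. The sign convention is an assumption."""
    a1 = math.atan2(cur_m[1] - prev_m[1], cur_m[0] - prev_m[0])
    a2 = math.atan2(next_m[1] - cur_m[1], next_m[0] - cur_m[0])
    return (math.degrees(a2 - a1) + 180.0) % 360.0 - 180.0
```

For instance, three markers forming a right-angle corner yield a turn angle of ±90°, which the vibration means control module would translate into activating the corresponding shoulder motor.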
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
KR1020160023887A KR101652300B1 (en) | 2016-02-29 | 2016-02-29 | Indoor navigation system for the visually impaired using wearable device and marker recognition |
Publications (1)
Publication Number | Publication Date |
---|---|
KR101652300B1 true KR101652300B1 (en) | 2016-09-01 |
Family
ID=56942673
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
KR1020160023887A KR101652300B1 (en) | 2016-02-29 | 2016-02-29 | Indoor navigation system for the visually impaired using wearable device and marker recognition |
Country Status (1)
Country | Link |
---|---|
KR (1) | KR101652300B1 (en) |
Citations (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH0576563A (en) * | 1991-06-06 | 1993-03-30 | Shinko Electric Co Ltd | Goggles for emergency refuge |
JP2009222682A (en) * | 2008-03-18 | 2009-10-01 | Saitama Univ | Navigation system |
KR20130071419A (en) * | 2010-03-10 | 2013-06-28 | 토마스 엠. 리카드 | Communication eyewear assembly |
KR20130091908A (en) | 2012-02-09 | 2013-08-20 | 한국전자통신연구원 | Apparatus and method for providing indoor navigation service |
KR20140036543A (en) | 2012-09-17 | 2014-03-26 | 서세원 | Indoor positioning system and method using marker and smart device |
JP5864800B1 (en) * | 2015-04-11 | 2016-02-17 | 治幸 岩田 | Wearable navigation system, ring, watch, bracelet, program |
Non-Patent Citations (6)
Title |
---|
[1] 이진현, 이해균, 송병섭, "실내 환경에 효율적인 모바일 내비게이션을 위한 마커인식", 시각장애연구, 제 22권 2호, pp.31-48, 2006. |
[2] 최태웅, 이현철, 허기택, 김은석, "마커 방식 실내 내비게이션을 위한 조명 변화에 강한 임계값 결정 방법", 한국콘텐츠학회 논문지, 제 12권 1호, pp.1-8, 2012.1. |
[3] 주재현, 오정수, "저화질 문서영상들을 위한 적응적 알고리즘", 한국통신학회 논문지, 제 37권 7호, pp.581-585, 2012.7. |
[4] N. Otsu, "A Thresholding Selection Method from Gray-level Histogram", IEEE Transactions on Systems, Man, and Cybernetics, Vol.9, No.1, pp.62-66, 1979. |
[5] 이권, 이철희, "LBP와 HSV 컬러 히스토그램을 이용한 내용 기반 검색", 방송공학회 논문지, 제 18권 3호, pp. 372-379, 2013.5. |
[6] N. Kiryati, Y. Elder, A. M. Bruckstein, "A probabilistic Hough transform", Pattern Recognition, Vol.24, pp.303-316, 1991. |
Cited By (9)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
KR102012452B1 (en) * | 2018-02-28 | 2019-08-20 | 한국기계연구원 | Wearable qr code recognition device and qr code recognition system for the blind |
WO2019235786A1 (en) * | 2018-06-08 | 2019-12-12 | 한국교통대학교 산학협력단 | Indoor direction guiding method and system therefor |
KR101999065B1 (en) * | 2018-07-31 | 2019-07-10 | 조선대학교산학협력단 | Method for measuring distance between the camera and the object using milliradian |
KR20210102708A (en) | 2020-02-12 | 2021-08-20 | 대한민국(행정안전부 국립재난안전연구원장) | Route marker device for pedestrian guidance and disaster response method using the same |
CN112985409A (en) * | 2021-02-26 | 2021-06-18 | 吉林建筑大学 | Navigation method and related device for vision-impaired person |
CN112985409B (en) * | 2021-02-26 | 2024-03-26 | 吉林建筑大学 | Navigation method and related device for vision disorder person |
KR20230052774A (en) * | 2021-10-13 | 2023-04-20 | (주)나임기술 | Method and system for controlling robot arm using obtaining re-size image by frame grabber |
KR102619814B1 (en) | 2021-10-13 | 2024-01-03 | (주)나임기술 | Method and system for controlling robot arm using obtaining re-size image by frame grabber |
KR102467141B1 (en) * | 2022-04-27 | 2022-11-16 | 정원재 | Directional beacon system and how it works |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
KR101652300B1 (en) | Indoor navigation system for the visually impaired using wearable device and marker recognition | |
US20210019854A1 (en) | Location Signaling with Respect to an Autonomous Vehicle and a Rider | |
US10655970B2 (en) | Beacon-based indoor wayfinding system with automated beacon placement | |
JP6525229B1 (en) | Digital search security system, method and program | |
Angin et al. | A mobile-cloud collaborative traffic lights detector for blind navigation | |
US9922236B2 (en) | Wearable eyeglasses for providing social and environmental awareness | |
US9224037B2 (en) | Apparatus and method for controlling presentation of information toward human object | |
JP5674406B2 (en) | Surveillance system, monitoring device, autonomous mobile body, monitoring method, and monitoring program using autonomous mobile body | |
US20180196417A1 (en) | Location Signaling with Respect to an Autonomous Vehicle and a Rider | |
US6690451B1 (en) | Locating object using stereo vision | |
Simôes et al. | Blind user wearable audio assistance for indoor navigation based on visual markers and ultrasonic obstacle detection | |
EP3175630A1 (en) | Wearable earpiece for providing social and environmental awareness | |
RU2019136741A (en) | LOCATION-BASED WIRELESS AUTHENTICATION | |
KR101054025B1 (en) | Visually impaired walking guidance method and system | |
JP2006251596A (en) | Support device for visually handicapped person | |
US20180196415A1 (en) | Location Signaling with Respect to an Autonomous Vehicle and a Rider | |
US20120327203A1 (en) | Apparatus and method for providing guiding service in portable terminal | |
Coughlan et al. | Functional assessment of a camera phone-based wayfinding system operated by blind and visually impaired users | |
KR20130086861A (en) | Guide device for blind people using electronic stick and smartphone | |
JP7052305B2 (en) | Relief systems and methods, as well as the servers and programs used for them. | |
JP2020053028A (en) | Object-tracking system | |
KR102336264B1 (en) | The method, the system and the program of In-store automatic payment | |
JPWO2018084191A1 (en) | Congestion situation analysis system | |
Lee et al. | Magnetic tensor sensor and way-finding method based on geomagnetic field effects with applications for visually impaired users | |
Badave et al. | Android based object detection system for visually impaired |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
E902 | Notification of reason for refusal | ||
E701 | Decision to grant or registration of patent right | ||
GRNT | Written decision to grant | ||
FPAY | Annual fee payment |
Payment date: 20190718 Year of fee payment: 4 |