KR101652300B1 - Indoor navigation system for the visually impaired using wearable device and marker recognition - Google Patents

Info

Publication number
KR101652300B1
Authority
KR
South Korea
Prior art keywords
marker
wearable device
image
module
visually impaired
Prior art date
Application number
KR1020160023887A
Other languages
Korean (ko)
Inventor
황준호
Original Assignee
황준호
Priority date
Filing date
Publication date
Application filed by 황준호 filed Critical 황준호
Priority to KR1020160023887A priority Critical patent/KR101652300B1/en
Application granted granted Critical
Publication of KR101652300B1 publication Critical patent/KR101652300B1/en

Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01C MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/20 Instruments for performing navigational calculations
    • G01C21/206 Instruments for performing navigational calculations specially adapted for indoor navigation
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01C MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/26 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 specially adapted for navigation in a road network
    • G01C21/34 Route searching; Route guidance
    • G01C21/3407 Route searching; Route guidance specially adapted for specific applications
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01C MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/26 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 specially adapted for navigation in a road network
    • G01C21/34 Route searching; Route guidance
    • G01C21/3453 Special cost functions, i.e. other than distance or default speed limit of road segments
    • G01C21/3469 Fuel consumption; Energy use; Emission aspects
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01C MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/26 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 specially adapted for navigation in a road network
    • G01C21/34 Route searching; Route guidance
    • G01C21/36 Input/output arrangements for on-board computers
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F1/00 Details not covered by groups G06F3/00 - G06F13/00 and G06F21/00
    • G06F1/16 Constructional details or arrangements
    • G06F1/1613 Constructional details or arrangements for portable computers
    • G06F1/163 Wearable computers, e.g. on a belt

Abstract

The present invention relates to an indoor navigation system for the visually impaired and to an indoor route guidance method using the same, which provide a route guidance service through a wearable interface and marker recognition. According to the present invention, a wearable interface that a visually impaired person can wear and use easily and conveniently is provided, and markers that store indoor position information without a power supply and whose position information can be read through image processing are utilized. This makes installation and maintenance convenient, reduces cost, and allows more reliable indoor route guidance to be provided to the visually impaired.

Description

INDOOR NAVIGATION SYSTEM FOR THE VISUALLY IMPAIRED USING WEARABLE DEVICE AND MARKER RECOGNITION

The present invention relates to an indoor navigation system for the visually impaired and to an indoor route guidance method using the same, and more particularly to an indoor navigation system capable of providing a route guidance service to the visually impaired through a wearable interface and marker recognition.

Currently, there are about 2.5 million people with disabilities in Korea. Visually impaired people account for about 10% of that total, roughly 250,000 people, making visual impairment the third most common type of disability. In surveys, visually impaired people cite finding their way along unfamiliar routes as the most uncomfortable aspect of daily life, which reflects the lack of facilities and equipment for the visually impaired.

To overcome these inconveniences, various devices for the visually impaired have been developed. Outdoors, many devices guide the visually impaired using GPS, and indoor navigation technologies using Bluetooth, Wi-Fi, RFID, NFC, and beacons have been proposed. However, GPS cannot be used in indoor environments, and short-range wireless identification devices such as RFID tags, NFC tags, and beacons must be installed throughout the entire indoor space, with the disadvantage that such installations are costly and their maintenance and management are not easy.

To address these drawbacks, Korean Patent Laid-Open Publication Nos. 10-2014-0036543 and 10-2013-0091908 have proposed indoor navigation systems using marker recognition. Marker recognition saves cost by eliminating wireless identification devices such as beacons, which require a relatively high initial installation cost, and is easy to maintain and manage because it requires no continuous electricity supply. However, most such conventionally proposed systems are intended for the general public and do not take into account the characteristics and difficulties of the visually impaired, making them impossible or inconvenient for the visually impaired to actually use. In particular, visually impaired users rely on various auxiliary devices when moving, and because wearing and using these devices is itself inconvenient, demand is reported to be very large for an auxiliary device for the visually impaired that can be worn and used easily and conveniently.

[1] Korean Patent Publication No. 10-2014-0036543
[2] Korean Patent Publication No. 10-2013-0091908

[1] Lee, Jinhyun, and Byung-Seop Song, "Marker recognition for efficient mobile navigation in indoor environments," Visual Disability Research, Vol. 22, No. 2, pp. 31-48, 2006.
[2] Choi, Tae-Woong, Hyun-Cheol Lee, Kee-Taek Hwang, and Eun-Seok Kim, "A Method for Determining Thresholds Robust to Lighting Changes for Marker-Based Indoor Navigation," The Journal of the Korea Contents Association, Vol. 1, No. 1, pp. 1-8, Jan. 2012.
[3] Ju, Jae-Hyun, and O, Jung-Soo, "Adaptive Algorithm for Low-Quality Document Images," The Journal of the Korean Institute of Communication Sciences, Vol. 77, No. 7, pp. 581-585, Jul. 2012.
[4] N. Otsu, "A Threshold Selection Method from Gray-Level Histograms," IEEE Transactions on Systems, Man, and Cybernetics, Vol. 9, No. 1, pp. 62-66, 1979.
[5] Lee, Chul-Hee, "Content-Based Retrieval Using LBP and HSV Color Histogram," Journal of Broadcasting Engineering, Vol. 18, No. 3, pp. 372-379, May 2013.
[6] N. Kiryati, Y. Eldar, and A. M. Bruckstein, "A probabilistic Hough transform," Pattern Recognition, Vol. 24, pp. 303-316, 1991.

The present invention has been made to overcome the disadvantages of the conventional indoor navigation systems described above. It is an object of the present invention to provide an indoor navigation system that offers a wearable interface which can be easily worn and used by a visually impaired person, that provides convenience and cost reduction in installation and maintenance, and that delivers more reliable indoor route guidance.

According to an aspect of the present invention, there is provided an indoor navigation system for the visually impaired, comprising: markers attached to the ceiling of a room, each storing position information of a specific point; a first wearable device provided with a main control terminal, which recognizes the position information from a photographed image of a marker to determine the current position of the visually impaired person and calculates the shortest path to the destination, and with vibration means for guiding the direction in which the visually impaired person should proceed by left and right vibration; a second wearable device configured to be worn on the head of the visually impaired person, provided with a camera module for photographing the markers attached to the ceiling, and configured to transmit the marker images photographed by the camera module to the main control terminal of the first wearable device; a third wearable device configured to be worn on the wrist of the visually impaired person, for operating the main control terminal of the first wearable device; and a management server connected to the main control terminal of the first wearable device via a network, storing in advance the indoor map of a specific building and providing the indoor map information at the request of the main control terminal of the first wearable device.

Here, the main control terminal comprises: a marker information recognition module which receives the marker image photographed by the second wearable device, recognizes the position information contained in the marker through image processing, and identifies the current position of the visually impaired person by matching the recognized position information against the indoor map information provided from the management server; a destination information providing module which provides indoor destinations in list form; a path calculation module which calculates the shortest path from the current position of the visually impaired person to the destination, based on the current position information recognized by the marker information recognition module, the specific destination selected from the destination list provided by the destination information providing module, and the indoor map information provided from the management server; a route guidance module which guides the visually impaired person in real time along the shortest path to the destination calculated by the path calculation module; a vibration means control module which, based on the shortest path to the destination calculated by the path calculation module, selectively operates the vibration means of the first wearable device located in the direction in which the visually impaired person should proceed so as to generate vibration; and a communication module for communicating with the management server, the second wearable device, and the third wearable device.

The second wearable device comprises: an eyeglass part including an eyeglass frame and a pair of eyeglass legs, with a Bluetooth earphone at the rear end of each eyeglass leg; a hair band part connecting the left and right eyeglass legs of the eyeglass part to each other along the head circumference of the visually impaired person, both left and right ends of which are pivotally coupled to the outside of each eyeglass leg by hinges so as to turn in the forward and backward directions; a camera module mounted on the hair band part for photographing the markers; and a communication module for transmitting the marker images photographed by the camera module to the main control terminal of the first wearable device.

Here, mounting protrusions protrude from the left and right sides of the hair band part; when the hair band part is turned forward, the protrusions hook onto the eyeglass legs and prevent the hair band part from sagging downward.

The third wearable device comprises: operation buttons for browsing and selecting from the destination list provided by the main control terminal of the first wearable device and for starting the route guidance; a control circuit for controlling the overall operation of the third wearable device; a Bluetooth module for receiving voice information from the main control terminal of the first wearable device and transmitting control command signals generated through the operation buttons to the main control terminal; a speaker for outputting voice information; and a battery for supplying operating power.

The marker has a square outline, and its inside is divided into a total of nine square subregions. Eight square cells, one in each subregion except the uppermost left one, are spaced apart from each other at predetermined intervals, and the inside of each cell is filled with white or black.

Meanwhile, guide lines interconnecting adjacent markers are additionally displayed on the ceiling of the room. The second wearable device photographs the guide line images through the camera module in real time and transmits them to the main control terminal of the first wearable device; the main control terminal detects the direction of the guide line in real time through image processing and selectively operates one of the vibration means attached to the left and right shoulders of the first wearable device to guide the visually impaired person in the direction to turn.

The management server includes an indoor map DB storing the indoor map information, in which the coordinates of the markers are plotted on the drawing of the specific building, and in which information on the other markers adjacent to each marker is set for all markers.

According to another aspect of the present invention, there is provided an indoor route guidance method comprising: photographing a marker attached to the ceiling of a room to obtain an image; recognizing position information from the photographed image of the marker and determining the current position of the visually impaired person; receiving a destination to which the visually impaired person intends to go, and calculating the shortest path from the current position of the visually impaired person to the destination; and guiding the visually impaired person in real time along the calculated shortest path to the destination.

The method further comprises photographing, in real time, the guide lines connecting the markers to each other, and guiding the visually impaired person in the direction to turn by detecting the direction of the guide line from the photographed image.

The recognition of position information from the photographed image of the marker comprises: binarizing the photographed marker image; extracting the label corresponding to the border of the marker from the binarized image; extracting the outline of the marker from the extracted marker label; extracting the four corner coordinates of the marker from the extracted outline; transforming the distorted marker into a square by projectively transforming the four extracted corner coordinates; recognizing the direction of the transformed marker; and extracting the data of the marker.
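The binarization step above can be illustrated with Otsu's global threshold (non-patent reference [4]); FIG. 12 compares Otsu and adaptive binarization under non-uniform lighting. The following is a minimal pure-NumPy sketch, not the patent's implementation, assuming an 8-bit grayscale input in which the marker's cells are darker than the background:

```python
import numpy as np

def otsu_threshold(gray):
    """Return the Otsu threshold for an 8-bit grayscale image.

    Picks the gray level that maximizes the between-class variance
    of the histogram (Otsu, 1979).
    """
    hist = np.bincount(gray.ravel(), minlength=256).astype(float)
    total = hist.sum()
    sum_all = np.dot(np.arange(256), hist)
    best_t, best_var = 0, -1.0
    w0 = 0.0    # pixel count of the background class
    sum0 = 0.0  # intensity sum of the background class
    for t in range(256):
        w0 += hist[t]
        if w0 == 0:
            continue
        w1 = total - w0
        if w1 == 0:
            break
        sum0 += t * hist[t]
        mu0 = sum0 / w0
        mu1 = (sum_all - sum0) / w1
        var_between = w0 * w1 * (mu0 - mu1) ** 2
        if var_between > best_var:
            best_var, best_t = var_between, t
    return best_t

def binarize(gray):
    """Binarize so that dark (marker) pixels become 1, background 0."""
    return (gray <= otsu_threshold(gray)).astype(np.uint8)
```

A global threshold like this suffers under the uneven illumination shown in FIG. 12, which is why the patent also considers an adaptive binarization algorithm.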

The recognition of the direction of the marker is performed by counting the pixels of each cell region in the marker and reading the position of the empty region where no cell is displayed.
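The direction check above can be sketched as follows, assuming the marker has already been binarized (black pixels = 1) and warped to a square. The 3 × 3 subregion layout follows FIG. 7(a); treating the corner subregion with the fewest black pixels as the empty direction mark is an illustrative assumption:

```python
import numpy as np

def cell_counts(marker, n=3):
    """Count black (1) pixels in each of the n x n subregions."""
    h, w = marker.shape
    counts = np.zeros((n, n), dtype=int)
    for r in range(n):
        for c in range(n):
            block = marker[r * h // n:(r + 1) * h // n,
                           c * w // n:(c + 1) * w // n]
            counts[r, c] = int(block.sum())
    return counts

def normalize_orientation(marker):
    """Rotate the marker so the empty corner subregion sits at the top left."""
    counts = cell_counts(marker)
    # np.rot90 rotates counter-clockwise k times; map each corner
    # to the number of rotations that brings it to the top left.
    corners = {0: counts[0, 0], 1: counts[0, -1],
               2: counts[-1, -1], 3: counts[-1, 0]}
    k = min(corners, key=corners.get)
    return np.rot90(marker, k)
```

After normalization, the same marker always decodes to the same value regardless of the direction from which the camera saw it.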

The extraction of the marker data is performed by equally dividing the marker transformed into a square into 4 × 4 parts, and reading the pixel values at the intersections of the dividing lines to obtain the sum of the weights of the cells.
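This sampling scheme can be sketched as follows, assuming a binarized, orientation-normalized square marker. The exact weight layout of FIG. 7(b) is not reproduced here; ascending powers of two in reading order are assumed for illustration:

```python
import numpy as np

def extract_data(marker):
    """Decode a binarized, orientation-normalized square marker.

    The marker is divided 4 x 4; the 3 x 3 grid-line intersections
    are sampled, the top-left point (the direction mark) is skipped,
    and each black sample contributes its power-of-two weight.
    """
    h, w = marker.shape
    value, bit = 0, 0
    for r in (1, 2, 3):
        for c in (1, 2, 3):
            if (r, c) == (1, 1):
                continue  # top-left intersection marks the direction
            if marker[r * h // 4, c * w // 4]:  # black cell -> weighted
                value += 1 << bit
            bit += 1
    return value
```

With eight binary cells this yields the 2^8 = 256 distinct values the description attributes to the 3 × 3 marker.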

The direction of the guide line is detected by converting the RGB image into an HSV image in order to recognize the color of the guide line in the captured image, binarizing the image according to the HSV range of the guide line color, and extracting the direction of the line region from the binarized image.
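The HSV-range binarization step can be sketched as below using the standard library's `colorsys` conversion; the hue, saturation, and value thresholds are illustrative placeholders, not the patent's values:

```python
import colorsys
import numpy as np

def guide_line_mask(rgb, hue_lo, hue_hi, sat_min=0.4, val_min=0.2):
    """Binarize an RGB image by the HSV range of the guide-line color.

    hue_lo/hue_hi are in [0, 1) as returned by colorsys; the
    saturation and value floors reject washed-out and dark pixels.
    """
    h, w, _ = rgb.shape
    mask = np.zeros((h, w), dtype=np.uint8)
    for y in range(h):
        for x in range(w):
            r, g, b = rgb[y, x] / 255.0
            hh, ss, vv = colorsys.rgb_to_hsv(r, g, b)
            if hue_lo <= hh <= hue_hi and ss >= sat_min and vv >= val_min:
                mask[y, x] = 1
    return mask
```

Working in hue rather than raw RGB is what gives the robustness to illumination changes shown in FIG. 22; a line-fitting step (e.g. the Hough transform of non-patent reference [6]) would then recover the direction from the mask.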

According to the present invention, a wearable interface that can be worn and used easily and conveniently by a visually impaired person is provided, and markers that store indoor position information without a power supply and whose position information can be read through image processing are utilized, making installation and maintenance easier and reducing cost. Further, more reliable route guidance can be provided to the visually impaired.

FIG. 1 is a configuration diagram of an indoor navigation system according to the present invention,
FIG. 2 is a detailed configuration diagram of the first wearable device of the indoor navigation system according to the present invention,
FIG. 3 is a configuration diagram of the main control terminal of the first wearable device,
FIG. 4 is a detailed configuration diagram of the second wearable device of the indoor navigation system according to the present invention,
FIG. 5 is a view showing the wearing and use states of the second wearable device,
FIG. 6 is a detailed configuration diagram of the third wearable device of the indoor navigation system according to the present invention,
FIG. 7 is a diagram illustrating a marker of the indoor navigation system according to the present invention,
FIG. 8 is an example screen of the marker data setting program built into the management server 400,
FIG. 9 is a view for explaining direction determination at a branch road,
FIG. 10 shows an actual use state of the indoor navigation system according to the present invention,
FIG. 11 is a flowchart showing the marker recognition algorithm,
FIG. 12 shows the results of binarizing and inverting an image (a) having non-uniform brightness under the influence of illumination, using the Otsu algorithm (b) and an adaptive binarization algorithm (c),
FIG. 13 is a view showing the extraction of labels from a binarized image,
FIG. 14 is a view showing the extraction of the outermost outline of a marker to obtain its four corner coordinates,
FIG. 15 is a view showing a distorted marker image (a) and the square marker image (b) generated by projectively transforming it,
FIG. 16 is a diagram for explaining the method of extracting marker data,
FIG. 17 is a photograph showing markers and guide lines displayed on a ceiling,
FIG. 18 is a flowchart showing the guide line recognition algorithm,
FIG. 19 shows the HSV cone model,
FIG. 20 shows the results (b) and (c) of binarizing an HSV image (a) based on the color range of the guide line L,
FIG. 21 shows the results of binarization of marker images taken under various lighting conditions,
FIG. 22 shows the results of converting images of a guide line L taken in two different illumination environments into HSV images and then binarizing them.

Hereinafter, an indoor navigation system and method according to the present invention will be described in detail with reference to the accompanying drawings and preferred embodiments.

As shown in FIG. 1, an indoor navigation system according to the present invention includes a user device U, a marker M, and a management server 400.

The user device U is configured in the form of wearable devices that a visually impaired person can wear easily. The user device U acquires position information from the markers M displayed in advance in the room, grasps the current position of the visually impaired person, and guides the route to the desired destination in real time. The user device U includes a first wearable device 100, a second wearable device 200, and a third wearable device 300, as shown in FIG. 1.

FIG. 2 is a configuration diagram of the first wearable device 100 constituting the indoor navigation system according to the present invention. As shown in the figure, the first wearable device is preferably formed in a vest shape so that the visually impaired person can easily wear it. A main control terminal 110 and an auxiliary battery 120 are installed in the vest-shaped first wearable device 100; they may be housed in an inner pocket or installed inside the vest itself.

The first wearable device 100 further includes vibration means 130, as shown in FIG. 2. The vibration means 130 guides, by left and right vibration, the direction in which the visually impaired person should proceed during the route guidance of the main control terminal 110, and comprises vibration motors attached to the left and right shoulders of the vest-shaped first wearable device 100. The vibration means 130 is connected to the main control terminal 110 wirelessly or by wire, and selectively generates right or left vibration under the control of the main control terminal 110 to inform the visually impaired person in which direction to move.

The main control terminal 110 is the main terminal that determines the current position of the visually impaired person by recognizing the position information of the markers M and performs route guidance in real time to the destination to which the visually impaired person wants to go. The main control terminal 110 preferably uses a Raspberry Pi, a small single-board computer, to reduce weight, volume, and power consumption. The Raspberry Pi's low cost and low power consumption reduce the manufacturing cost of the main control terminal 110 and allow a long operating time even with a small-capacity battery.

FIG. 3 shows a configuration diagram of the main control terminal 110. The main control terminal 110 includes a marker information recognition module 111, a destination information providing module 112, a path calculation module 113, a route guidance module 114, a vibration means control module 115, and a communication module 116.

The marker information recognition module 111 receives the marker images photographed by the second wearable device 200 described later, recognizes the position information contained in the marker M through image processing, and thereby identifies the current position of the visually impaired person.

The destination information providing module 112 provides, in list form, the indoor destinations to which the visually impaired person may want to go. The destinations are the locations of all the main facilities in the room to which markers M are attached. The destination list is presented by voice through the second wearable device 200 and the third wearable device 300, and the third wearable device 300 allows the visually impaired person to select a specific destination.

The path calculation module 113 calculates the shortest route from the current position of the visually impaired person to the destination, based on the current position information recognized by the marker information recognition module 111, the specific destination selected by the visually impaired person from the destination list provided by the destination information providing module 112, and the indoor map information provided from the management server 400. Here, the shortest path to the destination is calculated using Dijkstra's algorithm, and the calculated shortest path consists of the positions of consecutive markers M.

The route guidance module 114 guides the visually impaired person in real time, by voice, vibration, and the like, along the shortest path to the destination calculated by the path calculation module 113. The route guidance sequentially informs the visually impaired person of the direction from the current position to the position of the next marker M to move to.

The vibration means control module 115 controls the operation of the vibration means 130 mounted on the shoulder portions of the first wearable device 100 and, based on the shortest path to the destination from the path calculation module, selectively operates the vibration means 130 located in the direction in which the visually impaired person should proceed so as to generate vibration.

The communication module 116 accesses the management server 400 via the network to retrieve the indoor map information, receives the image information of the markers M obtained from the second wearable device 200 described later, receives control commands from the third wearable device 300, and provides a communication interface for transmitting voice guidance information to the second wearable device 200 and the third wearable device 300. To this end, the communication module 116 includes a communication protocol for communicating with the management server 400 using CDMA, WLAN, and the like, a protocol processing module for analyzing and generating the protocol packets exchanged with the management server 400, and a Bluetooth module, an RF module, or another wired or wireless communication module.

FIG. 4 is a perspective view showing the external configuration of the second wearable device 200. As shown in the figure, the second wearable device 200 is a spectacles-type device including an eyeglass part 210, a hair band part 220, a camera module 230, and a communication module (not shown).

The eyeglass part 210 is configured in the form of ordinary eyeglasses including an eyeglass frame 212, eyeglass legs 214, and lenses. To receive the voice guidance information transmitted from the main control terminal of the first wearable device 100, a Bluetooth earphone 216 is preferably formed on the rear end side of each eyeglass leg 214. To this end, a control circuit for operating the Bluetooth earphone 216, a charging module, and a communication module capable of receiving the voice guidance information of the main control terminal 110 are built into the eyeglass legs 214.

The hair band part 220 carries the camera module 230 for photographing the markers M attached to the ceiling of the room, as described later. As shown in FIG. 4, the hair band part 220 connects the left and right eyeglass legs 214 to each other along the head circumference of the visually impaired person, and both its left and right ends are preferably pivotally coupled to the outside of the eyeglass legs 214 by hinges so as to turn in the front-rear direction.

In addition, mounting protrusions 222 protruding by a predetermined length in the inner horizontal direction may be formed on the left and right upper ends of the hair band part 220. As described later, the mounting protrusions 222 hook onto the eyeglass legs 214 when the hair band part 220 is turned forward, preventing the hair band part 220 from sagging downward.

The camera module 230 is mounted on the upper center of the hair band part 220 and includes a webcam for photographing the markers M attached to the ceiling of the room in real time and delivering the captured images to the main control terminal 110 of the first wearable device 100. For this purpose, the hair band part 220 may include a communication module for transmitting the images photographed by the camera module 230 to the main control terminal 110, or may be electrically connected to the communication module built into the eyeglass part 210.

FIG. 5 shows the wearing and use states of the second wearable device 200. FIG. 5A shows the outdoor mode. As shown in FIG. 5A, when the visually impaired person is outdoors, the hair band part 220 is turned to the front so that the camera module 230 faces forward. At this time, as the mounting protrusions 222 protruding from the inside of the upper portion of the hair band part 220 are supported on the left and right eyeglass legs 214, the hair band part is held stably without sagging downward. In this mode, the camera module 230 captures a forward image in real time and transmits it to the main control terminal 110 of the first wearable device 100; when the marker information recognition module 111 of the main control terminal 110 detects an obstacle in the path along which the visually impaired person travels, an alarm is generated by voice or vibration through the route guidance module or the vibration means control module 115, helping the visually impaired person walk safely outdoors.

FIG. 5B shows the indoor mode. When the visually impaired person is indoors, the hair band part 220 is turned vertically upward so that the camera module 230 faces the ceiling of the room and photographs the markers M attached to the ceiling.

FIG. 6 shows a configuration diagram of the third wearable device 300. As shown in the figure, the third wearable device 300 is configured to be worn on the wrist like a watch, and includes operation buttons 310 for transmitting operation commands, a control circuit 320 for controlling the overall operation of the third wearable device 300, a Bluetooth module 330 for receiving voice information from the main control terminal 110 of the first wearable device 100 and transmitting control command signals generated through the operation buttons 310 to the main control terminal 110, a speaker 340 for outputting voice information, and a battery unit 350 for supplying operating power.

The operation buttons 310 transmit operation commands to the main control terminal 110 of the first wearable device 100 when pressed by the user, and consist of three buttons in total, assigned to 'Next', 'Previous', and 'OK'. The user can sequentially browse the destination list provided by the destination information providing module 112 of the main control terminal 110 using the 'Next' and 'Previous' buttons; each destination is presented by voice through the Bluetooth earphone 216 of the second wearable device 200 or the speaker 340 of the third wearable device 300. When the desired destination is presented, pressing the 'OK' button sets the destination and starts the route guidance. If the 'Next' or 'Previous' button is pressed simultaneously with the 'OK' button, the destination list for the current floor can be changed to the destination list for the floor above or below.

FIG. 7 illustrates the marker used in the indoor navigation system according to the present invention. FIG. 7A shows the design of a marker containing specific position information, and FIG. 7B shows the numerical weights of the cells constituting the marker. Hereinafter, the configuration and operation of the marker will be described with reference to FIG. 7.

The marker M stores the position information of a specific point in the room and is attached to the ceiling at main facilities, branch roads, and the like. If the marker M were attached to the floor, it could be hidden by obstacles such as objects or people located in between, and it would be easily damaged because it is a place where people's feet tread; if attached to a wall, it could likewise be obscured by obstacles and damaged. Therefore, it is preferable to attach the marker M to the ceiling, which is least affected by obstacles and is not easily damaged.

In the present invention, since the marker M attached to the ceiling is recognized, the marker must be designed in consideration of the influence of the surrounding illumination and of the change in recognition rate depending on the height of the ceiling. Representative markers that can represent a large amount of data with a high recognition rate include QR Code, ARTag, SCR, and HOM, and most of them share the common form of data encoded in a grid-pattern structure.

The marker M proposed by the present invention is shaped as shown in FIG. 7(a): it has a square outline, and its interior is divided into nine square sub-regions, in which a total of eight square cells are arranged spaced apart from one another, the upper-left sub-region being left empty. Each cell is filled with either white or black: a white cell carries no weight, while a black cell carries the weight assigned to its position. The data represented by the marker M is defined as the sum of the weights of all black cells. The empty upper-left sub-region is used to recognize the direction from which the marker M is viewed, preventing the same cell pattern from being read as different data depending on the viewing direction. For the 3 × 3 marker with eight cells shown in FIG. 7(a), the number of representable values is 2^8 = 256. Extended to an N × N form, the number of representable values becomes

2^(N² − 1)

so the design can be used as a marker representing data for a correspondingly larger number of locations.
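The weight-sum encoding described above can be sketched in Python. The exact per-cell weights of FIG. 7(b) are available only as an image, so powers of two in reading order are assumed here for illustration.

```python
# Sketch of the marker's weight-sum encoding; the per-cell weight
# assignment of Fig. 7(b) is assumed to be powers of two in reading order.

def marker_capacity(n):
    """Number of distinct values an n x n marker can represent
    (one corner sub-region is reserved for orientation)."""
    return 2 ** (n * n - 1)

def decode_marker(cells):
    """cells: 8 booleans (True = black) for the 3x3 grid minus the
    top-left orientation blank, in row-major reading order."""
    return sum(2 ** i for i, filled in enumerate(cells) if filled)
```

For the 3 × 3 marker of FIG. 7(a), `marker_capacity(3)` yields the 256 values stated in the text.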

Meanwhile, as shown in FIG. 1, the indoor navigation system according to the present invention includes a management server 400. The management server 400 is connected to the main control terminal 110 of the first wearable device 100 via a network, holds an indoor map DB 410 in which the indoor map of a specific building is stored in advance, and provides the indoor map information at the request of the main control terminal 110 of the first wearable device 100.

In particular, the management server 400 incorporates a marker data setting program. As shown in FIG. 8, the marker data setting program loads the drawing of a building as an image, allows the position of each marker M to be set on it, and allows the marker data to be edited directly. The program sets not only the position of each marker M but also its adjacent-marker information: to find a path from the current location of the visually impaired user to a destination, the system must know which neighboring markers are reachable from the current marker. Accordingly, an indoor map is created in which, for every marker, the information of its adjacent markers is entered. The route to the destination is found by computing the shortest distance with Dijkstra's path-finding algorithm on the graph obtained from the indoor map. Dijkstra's algorithm solves the shortest-path problem between given start and end points: when the positions of the markers in the building are taken as vertices and the adjacency between markers as edges, the path from the current marker to the destination marker can be found as the shortest distance.
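The shortest-distance computation on the marker graph can be sketched as follows; this is a minimal Dijkstra implementation, and the marker IDs and edge lengths in the example are illustrative, not taken from the patent's indoor map.

```python
import heapq

def shortest_path(graph, start, goal):
    """Dijkstra over a marker adjacency graph.
    graph: {marker_id: [(neighbor_id, distance), ...]}
    Returns (marker sequence from start to goal, total distance)."""
    dist = {start: 0.0}
    prev = {}
    heap = [(0.0, start)]
    while heap:
        d, u = heapq.heappop(heap)
        if u == goal:
            break
        if d > dist.get(u, float("inf")):
            continue  # stale queue entry
        for v, w in graph.get(u, []):
            nd = d + w
            if nd < dist.get(v, float("inf")):
                dist[v] = nd
                prev[v] = u
                heapq.heappush(heap, (nd, v))
    path, node = [goal], goal
    while node != start:           # walk predecessors back to the start
        node = prev[node]
        path.append(node)
    return path[::-1], dist[goal]

# Illustrative graph: markers 1-4; route 1->2->3->4 (10.0) beats 1->2->4 (14.0).
graph = {1: [(2, 5.0)],
         2: [(1, 5.0), (3, 2.0), (4, 9.0)],
         3: [(2, 2.0), (4, 3.0)],
         4: []}
```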

The marker coordinates set through the marker data setting program of the management server 400 are referenced when guiding the visually impaired user by voice. The direction in which the user should move is determined by the position of the next marker relative to the direction the user is facing. As a simple example, on a road shaped as in FIG. 9, a person at A heading for C must turn right at the junction, whereas a person at B must turn left. Therefore, to announce the next turn, the system needs both the direction the user is currently facing and the direction from the current position to the next position. Given the coordinates of the previous marker, the current marker, and the next marker, this information can be obtained from the straight line passing through the previous and current markers and the straight line passing through the current and next markers: the angle between the two lines determines which way the user should turn.
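The left/right decision from the three marker coordinates can be sketched with a cross product rather than by explicitly intersecting the two straight lines; this sketch assumes map coordinates with the y axis pointing up (in image coordinates with y down, the signs flip).

```python
def turn_direction(prev_m, cur_m, next_m):
    """Decide the turn at the current marker from three (x, y) marker
    coordinates.  The sign of the z-component of the cross product of
    (prev->cur) and (cur->next) tells whether the next marker lies to
    the left or right of the walking direction."""
    v1x, v1y = cur_m[0] - prev_m[0], cur_m[1] - prev_m[1]
    v2x, v2y = next_m[0] - cur_m[0], next_m[1] - cur_m[1]
    cross = v1x * v2y - v1y * v2x
    if cross > 0:
        return "left"
    if cross < 0:
        return "right"
    return "straight"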

FIG. 10 shows the indoor navigation system according to the present invention in actual use, including the user device U, the markers, and the management server 400 described above. As shown, the user, who is visually impaired, wears the vest-type first wearable device 100 including the main control terminal 110 and the vibration means 130, and wears on the head the second wearable device 200 comprising the glasses part 210 and the hair band part 220 on which the camera module 230 is mounted. The markers M are attached to the ceiling, and the camera module 230 of the hair band part 220 is aimed at the ceiling to recognize them. Further, to improve the accuracy of route guidance, a guide line L interconnecting adjacent markers M is preferably also marked on the ceiling.

Up to now, the configuration of the indoor navigation system according to the present invention has been described. Hereinafter, a specific method of recognizing the marker M and a route guidance method will be described.

1. Marker Recognition

To recognize the marker M, an image is first captured through the camera module 230, and image preprocessing and data extraction are performed according to the procedure shown in the figure. Image processing was implemented with the OpenCV library, and the original image was reduced to 320 × 240 to increase the processing speed on the Raspberry Pi.

1-1. Adaptive binarization

Binarization, the key step of image preprocessing, divides the pixels of an image into two groups. Binarization algorithms fall into global algorithms, which apply a single threshold to the entire image, and adaptive algorithms, which apply a different threshold to each pixel based on information from its neighboring pixels. A representative global algorithm is the Otsu algorithm, which finds the threshold from the variance of the histogram; representative adaptive algorithms include the Niblack algorithm, which uses the mean and standard deviation of neighboring pixels, and the Sauvola algorithm, which compensates for Niblack's weaknesses. FIG. 12 shows an image (a) with non-uniform brightness caused by illumination, binarized and inverted with the Otsu algorithm (b) and with the adaptive binarization algorithm provided by OpenCV (c). In the present invention, the marker M is attached to the ceiling, is easily affected by ambient light, and therefore appears in images whose brightness is distributed unevenly. Since a global method with a fixed threshold cannot handle such images, an adaptive algorithm that applies a different threshold to each pixel must be used to binarize the marker image. The present invention uses OpenCV's adaptiveThreshold, which determines each pixel's threshold as the arithmetic mean of its neighboring pixels.
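The mean-based adaptive thresholding attributed above to OpenCV's adaptiveThreshold can be sketched directly in Python as follows; the block size and the constant c are illustrative parameters, not values from the patent.

```python
def adaptive_binarize(img, block=3, c=0):
    """Mean adaptive thresholding: each pixel is compared against the
    average of its block x block neighbourhood minus a constant c.
    img: 2-D list of grey levels 0-255; returns a 0/255 image."""
    h, w = len(img), len(img[0])
    r = block // 2
    out = [[0] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            total, n = 0, 0
            # neighbourhood clipped at the image border
            for yy in range(max(0, y - r), min(h, y + r + 1)):
                for xx in range(max(0, x - r), min(w, x + r + 1)):
                    total += img[yy][xx]
                    n += 1
            out[y][x] = 255 if img[y][x] > total / n - c else 0
    return out
```

Because each threshold is local, a bright and a dark region of the same frame are separated correctly even when no single global threshold would work.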

1-2. Marker Region Label Extraction

A label is an object composed of connected pixels, and labeling is the process of extracting such labels from an image. Labeling can be implemented in many ways but is generally realized with a flood-fill algorithm; in the present invention it is implemented with a queue-based flood fill. To separate the region corresponding to the marker from the binarized image, labeling is performed on the binarized image, and the characteristics of each detected label are examined to find the label corresponding to the border of the marker M. FIG. 13 shows the labels extracted from the binarized image overlaid on the original image; a label surrounding the border of the marker M can be confirmed. In the present invention, labeling is then performed again inside each label area detected in the first labeling pass. A label is judged to be a marker if it contains exactly eight inner labels whose widths and heights agree within a ±30% error range and whose aspect ratios lie between 0.6 and 1.7.
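The queue-based flood-fill labeling can be sketched as follows; 4-connectivity is assumed, and the marker-shape tests on label count, size, and aspect ratio described above are omitted for brevity.

```python
from collections import deque

def label_components(img):
    """Connected-component labelling of a binary image via queue-based
    flood fill (4-connectivity).
    img: 2-D list of 0/1; returns (label map, number of labels)."""
    h, w = len(img), len(img[0])
    labels = [[0] * w for _ in range(h)]
    current = 0
    for sy in range(h):
        for sx in range(w):
            if img[sy][sx] == 1 and labels[sy][sx] == 0:
                current += 1                     # start a new label
                q = deque([(sy, sx)])
                labels[sy][sx] = current
                while q:                         # flood fill with a queue
                    y, x = q.popleft()
                    for ny, nx in ((y - 1, x), (y + 1, x), (y, x - 1), (y, x + 1)):
                        if 0 <= ny < h and 0 <= nx < w \
                                and img[ny][nx] == 1 and labels[ny][nx] == 0:
                            labels[ny][nx] = current
                            q.append((ny, nx))
    return labels, current
```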

1-3. Extracting outlines

Because the camera does not look exactly perpendicularly at the ceiling marker M, the marker M in the image generally appears not as a square but as a distorted quadrilateral. Since it is difficult to extract data from the distorted marker M, it must undergo a projection transformation that maps it back to a square. Before this transformation, the outline is extracted from the marker label area. An outline is a set of consecutive points, and many outlines may be detected in an image. As shown in FIG. 14, to obtain the coordinates of the four corners of the marker, the outermost outline of the marker must be extracted; in general it can be found easily by exploiting the fact that it is the longest outline.

1-4. Marker Corner Extraction

Once the outline of the marker M has been extracted, the four corner coordinates of the marker M can be obtained. An arbitrary point on the outline is selected, and the outline point at maximum distance from it is taken as the first corner; the point at maximum distance from the first corner is the second corner; and the point at maximum distance from the straight line through the first and second corners is the third corner. The fourth corner (x4, y4) is the point for which d in the following equation (1) is maximal, given the three previously determined corners (x1, y1), (x2, y2), and (x3, y3).

d = √((x − x1)² + (y − y1)²) + √((x − x2)² + (y − y2)²) + √((x − x3)² + (y − y3)²)   (1)
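The farthest-point corner search can be sketched as below. The criterion used here for the fourth corner (the maximal sum of distances to the first three corners) is an assumption, since equation (1) is given only as an image in the original publication.

```python
import math

def dist(a, b):
    return math.hypot(a[0] - b[0], a[1] - b[1])

def extract_corners(outline):
    """Farthest-point corner extraction on a list of (x, y) outline points.
    The fourth-corner criterion (sum of distances to the first three)
    is an assumption about equation (1)."""
    p0 = outline[0]                               # arbitrary start point
    c1 = max(outline, key=lambda p: dist(p, p0))  # farthest from start
    c2 = max(outline, key=lambda p: dist(p, c1))  # farthest from c1

    def line_dist(p):
        # perpendicular distance from p to the line through c1 and c2
        num = (c2[0] - c1[0]) * (c1[1] - p[1]) - (c1[0] - p[0]) * (c2[1] - c1[1])
        return abs(num) / dist(c1, c2)

    c3 = max(outline, key=line_dist)
    c4 = max(outline, key=lambda p: dist(p, c1) + dist(p, c2) + dist(p, c3))
    return c1, c2, c3, c4
```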

1-5. Projection transformation

  The projection transformation matrix is calculated to transform the distorted marker image as shown in Fig. 15 (a) into a square marker image as shown in Fig. 15 (b). In general, the equation for projection transformation is given by the following equation (2).

 ⎡x'⎤   ⎡h11 h12 h13⎤ ⎡x⎤
s⎢y'⎥ = ⎢h21 h22 h23⎥ ⎢y⎥   (2)
 ⎣1 ⎦   ⎣h31 h32  1 ⎦ ⎣1⎦

From Equation (2), the expressions for x' and y' can be written as Equations (3) and (4).

x' = (h11·x + h12·y + h13) / (h31·x + h32·y + 1)   (3)

y' = (h21·x + h22·y + h23) / (h31·x + h32·y + 1)   (4)

Equations (3) and (4) show that if four pairs of coordinates (x, y) and their corresponding transformed coordinates (x', y') are known, the eight unknowns of the projection matrix can be determined. The projection matrix is therefore obtained by substituting the four corner coordinates extracted from the outline together with the target square coordinates, and the projection transformation is then applied to every point of the distorted marker to restore the original square shape. In the present invention, the marker was transformed to square coordinates of 200 × 200 size.
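Solving for the eight unknowns amounts to a small linear system; a sketch with NumPy follows. The corner coordinates in the example are illustrative, not values from the patent.

```python
import numpy as np

def homography(src, dst):
    """Solve equations (3)/(4) for the eight unknowns h11..h32 given four
    corner correspondences src[i] -> dst[i]; returns the 3x3 matrix H."""
    A, b = [], []
    for (x, y), (xp, yp) in zip(src, dst):
        # x' = (h11 x + h12 y + h13) / (h31 x + h32 y + 1), rearranged
        A.append([x, y, 1, 0, 0, 0, -xp * x, -xp * y]); b.append(xp)
        A.append([0, 0, 0, x, y, 1, -yp * x, -yp * y]); b.append(yp)
    h = np.linalg.solve(np.array(A, float), np.array(b, float))
    return np.append(h, 1.0).reshape(3, 3)

def warp_point(H, p):
    """Apply the projection transformation to one point."""
    v = H @ np.array([p[0], p[1], 1.0])
    return v[0] / v[2], v[1] / v[2]
```

Substituting the four extracted corners as `src` and the 200 × 200 square corners as `dst` yields the matrix used to restore the marker.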

1-6. Marker direction recognition

Since the positions of the cells change with the direction from which the marker M is photographed, the direction of the marker M must be recognized before the data can be extracted correctly, and a rotation transformation must then be applied so that each cell occupies a fixed position in the image. Because the direction of the marker M is determined by the position of the empty sub-region containing no cell, the number of pixels in each cell region is counted and the direction is determined by locating the empty sub-region. In the present invention, for the restored image of FIG. 15(b), the pixels of the four corner regions whose x and y values lie within the ranges [25, 75] and [125, 175] are examined, and the region whose pixel count falls below a threshold is judged to be the empty sub-region.
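The orientation step can be sketched as follows, simplified to operate on pre-computed dark-pixel counts of the four corner regions; the exact pixel ranges and the threshold from the text are abstracted away.

```python
def marker_rotation(black_count):
    """black_count: dark-pixel counts of the four corner-cell regions of
    the restored marker image, ordered (top-left, top-right,
    bottom-right, bottom-left).  The region with the fewest dark pixels
    is taken as the orientation blank; returns the number of 90-degree
    clockwise rotations needed to move it back to the top-left."""
    blank = min(range(4), key=lambda i: black_count[i])
    return (4 - blank) % 4
```

For example, if the blank is found at the top-right, three clockwise quarter-turns bring it back to the top-left before the cells are read.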

1-7. Data Extraction

To extract the marker data, the value of each cell must be read. In the present invention, the restored image is divided into a 4 × 4 grid as shown in FIG. 16, the pixel values of the cells located at the intersections of the dividing lines are read, and the marker data is extracted as the sum of the weights of the black cells. The weight assigned to each cell is as shown in FIG. 7(b). The extracted marker data denotes a specific location in the building and is used to search for the route to the destination on the indoor map in which the marker positions are recorded.
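The cell-sampling step can be sketched as follows. Dividing the 200 × 200 restored image into a 4 × 4 grid places the interior grid intersections at x, y ∈ {50, 100, 150}, one per cell. The weight table is an assumption (FIG. 7(b) is available only as an image), with weight 0 at the orientation blank.

```python
def read_marker_data(img, weights):
    """Sample the restored 200x200 marker image at the nine interior
    grid intersections (x, y in 50, 100, 150) and sum the weights of
    the black cells.  img: 2-D list of grey levels (0 = black);
    weights: 9 values in row-major order, weights[0] = 0 for the
    top-left orientation blank (assumed table)."""
    total = 0
    idx = 0
    for cy in (50, 100, 150):
        for cx in (50, 100, 150):
            if img[cy][cx] == 0:       # black pixel -> cell is set
                total += weights[idx]
            idx += 1
    return total
```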

2. Guide Line Recognition

It is difficult for a visually impaired user to know in which direction the next marker lies from voice guidance alone. As shown in FIG. 17, a guide line L is therefore drawn on the ceiling to connect marker to marker. The image captured by the camera module 230 of the second wearable device 200 is transmitted to the first wearable device 100, where the marker information recognition module 111 of the main control terminal 110 senses the direction of the guide line L in real time through image processing and selectively drives one of the vibration means 130 on the left and right shoulders to guide the user. Recognition of the guide line L is performed by the procedure shown in the figure: the inclination angle of the guide line L is recognized from the image, and vibration is generated through the vibration means 130 attached to both shoulders of the first wearable device 100 to guide the user. In the present invention, when the angle between the y axis and the guide line L exceeds the error range of ±10°, vibration is generated so that the user perceives the direction.

2-1. HSV conversion

To recognize the guide line L, its color must be recognized. The RGB model, the basic color model of an image, has the disadvantage that changes in brightness are difficult to express. In contrast, the HSV color space, which expresses color by hue, saturation, and value, is well suited to color recognition because it represents brightness changes explicitly. The conversion from RGB to HSV is given in equation (5), where R, G, and B take values in [0, 1], H in [0, 360], and S and V in [0, 1]. H, S, and V are defined on the HSV cone model shown in FIG. 19.

V = max(R, G, B)
S = (V − min(R, G, B)) / V,  with S = 0 when V = 0
H = 60 × (G − B) / (V − min(R, G, B))          when V = R
H = 120 + 60 × (B − R) / (V − min(R, G, B))    when V = G
H = 240 + 60 × (R − G) / (V − min(R, G, B))    when V = B
(H + 360 is used when H is negative)   (5)

To recognize the color of the guide line L in the image, the RGB image is converted to an HSV image and binarized according to the HSV range of the guide line's color. The hue range corresponding to the color of the guide line L used in the present invention is 40-60, with saturation of 50 or more and value of 10 or more. FIG. 20 shows the result (b) of binarizing the HSV image (a) with respect to the color range of the guide line L.
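The HSV range test can be sketched with Python's standard colorsys module. The saturation and value thresholds from the text are interpreted here as percentages of full scale, which is an assumption about the scale used.

```python
import colorsys

def is_guide_color(r, g, b, h_range=(40, 60), s_min=0.5, v_min=0.1):
    """Check whether an RGB pixel (0-255 each) falls inside the guide
    line's HSV range: hue 40-60 degrees, saturation >= 50%, value >= 10%
    (thresholds from the text, assumed to be percentages)."""
    h, s, v = colorsys.rgb_to_hsv(r / 255.0, g / 255.0, b / 255.0)
    h *= 360.0  # colorsys returns hue in [0, 1]; rescale to degrees
    return h_range[0] <= h <= h_range[1] and s >= s_min and v >= v_min

def binarize_guide(img):
    """img: 2-D list of (r, g, b) tuples; returns a 0/255 mask of
    guide-line pixels."""
    return [[255 if is_guide_color(*px) else 0 for px in row] for row in img]
```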

2-2. Straight line extraction

To obtain the slope of the guide line L, a straight line is extracted from the image of FIG. 20(b) and its angle is calculated from the slope. The Hough transform is generally used to extract straight lines from an image: for every point in the image, the two parameter values describing each straight line through that point are accumulated, and lines with high accumulated counts are detected. Since infinitely many straight lines pass through a single point, the Hough transform expresses a line by equation (6) with θ and r as parameters, and records parameter values while incrementing θ in fixed steps; the step size of θ therefore affects both the accuracy and the speed of line detection.

r = x·cos θ + y·sin θ   (6)

In general, the Hough transform is applied to an edge image generated by the Canny edge detector. In the present invention, as shown in FIG. 19, the color corresponding to the guide line L is first segmented in the HSV image, after which a Canny edge image is generated and the Hough transform is applied. However, depending on the surroundings, regions other than the guide line L may be detected, and besides the straight lines corresponding to the guide line L, spurious lines caused by noise may be found. Therefore, the present invention detects straight lines with the probabilistic Hough transform and removes lines shorter than a predetermined critical length, which increases the recognition rate of the guide line L. Two straight lines corresponding to the left and right boundaries of the guide line L are detected from one guide line image; if their slopes are m1 and m2, the angle θg of the guide line L is defined by equation (7). Since the error range is ±10°, when θg lies between −80° and −10° the left vibration means is driven, and when it lies between 10° and 80° the right vibration means is driven to guide the user.

θg = (tan⁻¹ m1 + tan⁻¹ m2) / 2   (7)
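Reading equation (7) as the mean of the two boundary-line angles is an assumption (it is given only as an image in the original publication); with that reading, the angle and shoulder selection can be sketched as:

```python
import math

def guide_angle(m1, m2):
    """Angle of the guide line from the slopes of its two boundary
    lines, assumed to be the mean of the two line angles, in degrees."""
    return math.degrees((math.atan(m1) + math.atan(m2)) / 2.0)

def vibration_side(theta_g, dead_zone=10.0):
    """Select which shoulder vibrates: within +/-10 degrees no
    correction is needed; otherwise a negative angle drives the left
    motor and a positive angle the right motor."""
    if abs(theta_g) <= dead_zone:
        return None
    return "left" if theta_g < 0 else "right"
```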

3. Experimental Results and Performance Analysis

3-1. Marker recognition rate measurement

To analyze the performance of the marker recognition algorithm implemented in the present invention, the marker recognition rate was measured on the main control terminal 110 as a function of gait, of distance, and of illumination. The recognition rate is calculated as in equation (8).

Recognition rate (%) = (number of frames in which the marker is recognized / total number of frames) × 100   (8)

In this experiment, a Raspberry Pi 2 Model B with a 900 MHz ARM Cortex-A7 quad-core CPU and 1 GB of RAM was used as the main control terminal 110. Running the marker recognition algorithm on the Raspbian OS yielded an average processing speed of 10 frames per second. Depending on the processing speed, noise such as motion blur may appear in the image. Taking the average walking speed of a visually impaired person as 0.4 m/s, Table 1 shows the recognition results for a 0.2 m marker moved at 0.4 m/s at the working distance between the camera module 230 and the ceiling.

Table 1. Marker recognition success rate

Figure 112016019454611-pat00009

In Table 1, the per-second marker recognition success rate is the proportion of one-second intervals in which the marker was recognized at least once. Accurate marker recognition is therefore possible, because the marker does not leave the camera's angle of view within one second when moving at 0.4 m/s. This also means that the blur caused by the user's walking is small and does not significantly affect marker recognition.

Table 2. Success rate of marker recognition according to distance between marker and camera module

Figure 112016019454611-pat00010

Table 2 shows the results of recognizing a 0.2 m × 0.2 m marker while varying the distance between the camera and the marker. According to the experimental results, the recognition rate was 100% within 1.5 m. Since the camera module is carried on the user's body at a height of about 1.5 m for an adult, accurate marker recognition is possible in buildings with ceilings of 3 m or less. Considering that most ceilings are lower than 3 m, the marker is applicable to buildings of typical ceiling height; in buildings with ceilings higher than 3 m, the recognition rate can be increased by using a marker larger than 0.2 m × 0.2 m.

The variation of the marker recognition rate with illumination is determined solely by the binarization algorithm. FIG. 21 shows the binarization results for marker images taken under various illuminations. In FIG. 21(a), (b), and (c), the marker region was binarized correctly and the marker recognition rate was 100% for all frames. The marker recognition algorithm is therefore robust to illumination, and accurate marker recognition is possible anywhere that lacks a light source strong enough to wash out the marker in the image.

3-2. Guide line recognition rate measurement

To analyze the performance of the guide line recognition algorithm implemented in the present invention, the guide line recognition rate under different illuminations was measured. Recognition is counted as successful when the angle of the extracted guide line L falls within the expected range. FIG. 22 shows the result of converting images of a guide line L taken in two different illumination environments into HSV images and then binarizing them.

Measured on FIG. 22(a), the average guide line recognition rate was 76% and the average number of recognitions per second was 8. In contrast, for (b), the average per-frame recognition rate was 99% and the average number of recognitions per second was 24.95. The reason appears to be that in FIG. 22(b) the binarization isolates only the region corresponding to the guide line L, so the noise is removed and the Hough transform runs faster, allowing accurate recognition. Overall, the guide line L is detected smoothly, and the recognition rate is expected to increase further if additional noise removal is applied to the binarized HSV image.

The specific embodiments of the present invention have been described above. However, the scope and spirit of the present invention are not limited to these specific embodiments, and those skilled in the art will understand that various modifications and changes are possible without departing from the gist of the present invention. Accordingly, the embodiments described above are provided so that those skilled in the art can fully understand the scope of the invention; they are to be considered in all respects illustrative and not restrictive, and the invention is defined only by the scope of the claims.

U: user device 100: first wearable device
110: Main control terminal 111: Marker information recognition module
112: Destination information providing module 113: Path calculating module
114: path guide module 115: vibration means control module
116: communication module 120: auxiliary battery
130: vibration means 200: second wearable device
210: glasses section 212: eyeglass frame
214: Glasses bridge 216: Bluetooth earphone
220: Hair band part 222:
230: camera module 300: third wearable device
310: Operation button 320: Control circuit
330: Bluetooth module 340: Speaker
350: Battery part M: Marker
L: guide line 400: management server
410: indoor map DB

Claims (13)

A marker attached to the ceiling of a room to store position information of a specific point; a first wearable device provided with a main control terminal that recognizes the current position of the visually impaired person by recognizing the position information from a photographed image of the marker and calculates the shortest path to the destination the visually impaired person wants to reach, and with vibration means that indicate by left and right vibration the direction in which the visually impaired person should proceed; a second wearable device, configured to be worn on the head of the visually impaired person, having a camera module that photographs the marker attached to the ceiling in real time and transmits the photographed marker image to the main control terminal of the first wearable device; a third wearable device, configured to be worn on the wrist of the visually impaired person, for operating the main control terminal of the first wearable device; and a management server connected to the main control terminal of the first wearable device via a network, storing the indoor map of a specific building in advance, and providing the indoor map information at the request of the main control terminal of the first wearable device,
The second wearable device includes an eyeglass part comprising an eyeglass frame and a pair of eyeglass legs, with a Bluetooth earphone at the rear end side of each eyeglass leg; a hair band part whose left and right ends are pivotally connected by hinges to the outsides of the eyeglass legs so that it can rotate forward and backward, and which runs along the head circumference of the visually impaired person to connect the pair of eyeglass legs; and a communication module for transmitting the marker image photographed by the camera module to the main control terminal of the first wearable device, wherein protrusions are formed on the left and right sides of the hair band part so that, when the hair band part is rotated toward the front, it catches on the eyeglass legs and is prevented from sagging downward,
The marker has a square outline, the inside is divided into a total of nine square subareas, eight square cells are spaced apart from each other except for the uppermost left, and the inside of the cell is white or black Filled with color,
In the ceiling of the room, guides for interconnecting the adjacent markers are displayed,
Wherein the management server includes an indoor map DB for storing indoor map information in which coordinates of a marker are plotted on a drawing of a specific building and information of other markers adjacent to the marker is set for all the markers,
The main control terminal includes a marker information recognition module that receives the marker image photographed by the second wearable device, recognizes the position information contained in the marker through image processing, and identifies the current position of the visually impaired person by matching the recognized position information with the indoor map information provided by the management server; a destination information providing module that provides indoor destinations in list form; a path calculating module that calculates the shortest path to the destination based on the current position information of the visually impaired person recognized by the marker information recognition module, the specific destination selected from the destination list provided by the destination information providing module, and the indoor map information provided by the management server; a route guidance module that guides the visually impaired person in real time along the shortest path to the destination calculated by the path calculating module; a vibration means control module that selectively activates, so as to generate vibration, the vibration means of the first wearable device located on the side toward which the visually impaired person should proceed, based on the shortest path to the destination calculated by the path calculating module; and a communication module for mutual communication with the management server, the second wearable device, and the third wearable device,
The main control terminal binarizes the photographed marker image, extracts the label corresponding to the border of the marker from the binarized image, extracts the outline of the marker from the extracted marker label, extracts the four corner points of the marker from the extracted outline, transforms the distorted marker into a square using the four corner points, recognizes the direction of the marker by counting the number of pixels in each cell region to locate the empty region in which no cell is displayed, rotates the marker image so that the position of each cell of the marker is fixed in the image, divides the marker transformed into a square into a 4 × 4 grid, reads the pixel value of each cell, and extracts the data of the marker by obtaining the sum of the weights,
In addition, the main control terminal determines the direction in which the visually impaired person should move by calculating, from the indoor map information provided by the management server, the angle between the straight line passing through the previous marker and the current marker and the straight line passing through the current marker and the next marker, where the current marker is located at the current position of the visually impaired person and the previous marker is the marker adjacent to the current marker among those the visually impaired person has already passed,
The second wearable device captures the guide line image in real time through the camera module and transmits it to the main control terminal of the first wearable device; the main control terminal recognizes the color of the guide line in the captured guide line image using an HSV image, binarizes the image according to the HSV range of the guide line's color, detects the direction of the guide line by calculating its angle from the binarized image, and selectively activates one of the vibration means attached to the left and right shoulders of the first wearable device so as to guide the visually impaired person to turn his or her body.
delete delete delete The system according to claim 1,
Wherein the third wearable device comprises:
An operation button for controlling the route guidance to be started by inquiring and selecting a destination list provided from the main control terminal of the first wearable device;
A control circuit for controlling the overall operation of the third wearable device;
A Bluetooth module for receiving voice information from the main control terminal of the first wearable device and transmitting a control command signal generated through the operation button to the main control terminal;
A speaker for outputting audio information to the outside;
And a battery part for supplying operating power.
delete delete delete delete delete delete delete delete
KR1020160023887A 2016-02-29 2016-02-29 Indoor navigation system for the visually impaired using wearable device and marker recognition KR101652300B1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
KR1020160023887A KR101652300B1 (en) 2016-02-29 2016-02-29 Indoor navigation system for the visually impaired using wearable device and marker recognition

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
KR1020160023887A KR101652300B1 (en) 2016-02-29 2016-02-29 Indoor navigation system for the visually impaired using wearable device and marker recognition

Publications (1)

Publication Number Publication Date
KR101652300B1 true KR101652300B1 (en) 2016-09-01

Family

ID=56942673

Family Applications (1)

Application Number Title Priority Date Filing Date
KR1020160023887A KR101652300B1 (en) 2016-02-29 2016-02-29 Indoor navigation system for the visually impaired using wearable device and marker recognition

Country Status (1)

Country Link
KR (1) KR101652300B1 (en)

Cited By (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR101999065B1 (en) * 2018-07-31 2019-07-10 조선대학교산학협력단 Method for measuring distance between the camera and the object using milliradian
KR102012452B1 (en) * 2018-02-28 2019-08-20 한국기계연구원 Wearable qr code recognition device and qr code recognition system for the blind
WO2019235786A1 (en) * 2018-06-08 2019-12-12 한국교통대학교 산학협력단 Indoor direction guiding method and system therefor
CN112985409A (en) * 2021-02-26 2021-06-18 吉林建筑大学 Navigation method and related device for vision-impaired person
KR20210102708A (en) 2020-02-12 2021-08-20 대한민국(행정안전부 국립재난안전연구원장) Route marker device for pedestrian guidance and disaster response method using the same
KR102467141B1 (en) * 2022-04-27 2022-11-16 정원재 Directional beacon system and how it works
KR20230052774A (en) * 2021-10-13 2023-04-20 (주)나임기술 Method and system for controlling robot arm using obtaining re-size image by frame grabber

Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH0576563A (en) * 1991-06-06 1993-03-30 Shinko Electric Co Ltd Goggles for emergency refuge
JP2009222682A (en) * 2008-03-18 2009-10-01 Saitama Univ Navigation system
KR20130071419A * 2010-03-10 2013-06-28 Thomas M. Rickard Communication eyewear assembly
KR20130091908A (en) 2012-02-09 2013-08-20 한국전자통신연구원 Apparatus and method for providing indoor navigation service
KR20140036543A (en) 2012-09-17 2014-03-26 서세원 Indoor positioning system and method using marker and smart device
JP5864800B1 (en) * 2015-04-11 2016-02-17 治幸 岩田 Wearable navigation system, ring, watch, bracelet, program

Non-Patent Citations (6)

* Cited by examiner, † Cited by third party
Title
[1] J. H. Lee, H. G. Lee, B. S. Song, "Marker Recognition for Efficient Mobile Navigation in Indoor Environments", The Korean Journal of Visual Impairment, Vol.22, No.2, pp.31-48, 2006.
[2] T. W. Choi, H. C. Lee, K. T. Heo, E. S. Kim, "A Threshold Determination Method Robust to Illumination Changes for Marker-based Indoor Navigation", Journal of the Korea Contents Association, Vol.12, No.1, pp.1-8, Jan. 2012.
[3] J. H. Joo, J. S. Oh, "An Adaptive Algorithm for Low-quality Document Images", Journal of the Korean Institute of Communications and Information Sciences, Vol.37, No.7, pp.581-585, Jul. 2012.
[4] N. Otsu, "A Threshold Selection Method from Gray-Level Histograms", IEEE Transactions on Systems, Man, and Cybernetics, Vol.9, No.1, pp.62-66, 1979.
[5] K. Lee, C. H. Lee, "Content-based Retrieval Using LBP and HSV Color Histograms", Journal of Broadcast Engineering, Vol.18, No.3, pp.372-379, May 2013.
[6] N. Kiryati, Y. Eldar, A. M. Bruckstein, "A probabilistic Hough transform", Pattern Recognition, Vol.24, No.4, pp.303-316, 1991.
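
Reference [4] above is Otsu's classic global-thresholding method, which reference [2] builds on for binarizing marker images under changing illumination. As a minimal illustrative sketch (not code from the patent), the criterion selects the threshold that maximizes the between-class variance of a 256-bin grayscale histogram:

```python
def otsu_threshold(hist):
    """Return the threshold (0-255) maximizing between-class variance
    for a 256-bin grayscale histogram (Otsu, 1979)."""
    total = sum(hist)                              # total pixel count
    sum_all = sum(i * h for i, h in enumerate(hist))
    sum_bg = 0.0                                   # weighted sum of background bins
    w_bg = 0                                       # background pixel count
    best_t, best_var = 0, -1.0
    for t in range(256):
        w_bg += hist[t]
        if w_bg == 0:
            continue                               # no background yet
        w_fg = total - w_bg
        if w_fg == 0:
            break                                  # no foreground left
        sum_bg += t * hist[t]
        mean_bg = sum_bg / w_bg
        mean_fg = (sum_all - sum_bg) / w_fg
        # between-class variance (up to a constant factor of 1/total^2)
        var_between = w_bg * w_fg * (mean_bg - mean_fg) ** 2
        if var_between > best_var:
            best_var, best_t = var_between, t
    return best_t
```

For a bimodal histogram with peaks at intensities 10 and 200, the function returns 10: pixels above the threshold are then treated as foreground, splitting the two modes.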

Cited By (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR102012452B1 * 2018-02-28 2019-08-20 Korea Institute of Machinery and Materials Wearable QR code recognition device and QR code recognition system for the blind
WO2019235786A1 * 2018-06-08 2019-12-12 Korea National University of Transportation Industry-Academic Cooperation Foundation Indoor direction guiding method and system therefor
KR101999065B1 * 2018-07-31 2019-07-10 Industry-Academic Cooperation Foundation of Chosun University Method for measuring distance between the camera and the object using milliradians
KR20210102708 2020-02-12 2021-08-20 Republic of Korea (Director, National Disaster Management Research Institute, Ministry of the Interior and Safety) Route marker device for pedestrian guidance and disaster response method using the same
CN112985409A * 2021-02-26 2021-06-18 Jilin Jianzhu University Navigation method and related device for visually impaired persons
CN112985409B * 2021-02-26 2024-03-26 Jilin Jianzhu University Navigation method and related device for visually impaired persons
KR20230052774A * 2021-10-13 2023-04-20 Naim Technology Co., Ltd. Method and system for controlling a robot arm using resized images obtained by a frame grabber
KR102619814B1 2021-10-13 2024-01-03 Naim Technology Co., Ltd. Method and system for controlling a robot arm using resized images obtained by a frame grabber
KR102467141B1 * 2022-04-27 2022-11-16 Won-jae Jung Directional beacon system and operating method thereof

Similar Documents

Publication Publication Date Title
KR101652300B1 (en) Indoor navigation system for the visually impaired using wearable device and marker recognition
US20210019854A1 (en) Location Signaling with Respect to an Autonomous Vehicle and a Rider
US10655970B2 (en) Beacon-based indoor wayfinding system with automated beacon placement
JP6525229B1 (en) Digital search security system, method and program
Angin et al. A mobile-cloud collaborative traffic lights detector for blind navigation
US9922236B2 (en) Wearable eyeglasses for providing social and environmental awareness
US9224037B2 (en) Apparatus and method for controlling presentation of information toward human object
JP5674406B2 (en) Surveillance system, monitoring device, autonomous mobile body, monitoring method, and monitoring program using autonomous mobile body
US20180196417A1 (en) Location Signaling with Respect to an Autonomous Vehicle and a Rider
US6690451B1 (en) Locating object using stereo vision
Simões et al. Blind user wearable audio assistance for indoor navigation based on visual markers and ultrasonic obstacle detection
EP3175630A1 (en) Wearable earpiece for providing social and environmental awareness
RU2019136741A (en) LOCATION-BASED WIRELESS AUTHENTICATION
KR101054025B1 (en) Visually impaired walking guidance method and system
JP2006251596A (en) Support device for visually handicapped person
US20180196415A1 (en) Location Signaling with Respect to an Autonomous Vehicle and a Rider
US20120327203A1 (en) Apparatus and method for providing guiding service in portable terminal
Coughlan et al. Functional assessment of a camera phone-based wayfinding system operated by blind and visually impaired users
KR20130086861A (en) Guide device for blind people using electronic stick and smartphone
JP7052305B2 (en) Relief systems and methods, as well as the servers and programs used for them.
JP2020053028A (en) Object-tracking system
KR102336264B1 (en) The method, the system and the program of In-store automatic payment
JPWO2018084191A1 (en) Congestion situation analysis system
Lee et al. Magnetic tensor sensor and way-finding method based on geomagnetic field effects with applications for visually impaired users
Badave et al. Android based object detection system for visually impaired

Legal Events

Date Code Title Description
E902 Notification of reason for refusal
E701 Decision to grant or registration of patent right
GRNT Written decision to grant
FPAY Annual fee payment

Payment date: 20190718

Year of fee payment: 4