KR20170040652A - Device and method for indoor positioning based on sensor image - Google Patents


Info

Publication number
KR20170040652A
Authority
KR
South Korea
Prior art keywords
sensor
icon
image
indoor positioning
candidate
Prior art date
Application number
KR1020150139916A
Other languages
Korean (ko)
Other versions
KR101767743B1 (en)
Inventor
최원익
이충헌
Original Assignee
인하대학교 산학협력단 (Inha University Industry-Academic Cooperation Foundation)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 인하대학교 산학협력단
Priority to KR1020150139916A
Publication of KR20170040652A
Application granted
Publication of KR101767743B1

Classifications

    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S11/00Systems for determining distance or velocity not using reflection or reradiation
    • G01S11/02Systems for determining distance or velocity not using reflection or reradiation using radio waves
    • G01S11/06Systems for determining distance or velocity not using reflection or reradiation using radio waves using intensity measurements
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04WWIRELESS COMMUNICATION NETWORKS
    • H04W64/00Locating users or terminals or network equipment for network management purposes, e.g. mobility management

Abstract

The present invention relates to an indoor positioning device and an indoor positioning method based on a sensor image. The indoor positioning device based on a sensor image according to the present invention includes a mapping unit which visualizes the signal strength of a sensor measured at a user terminal as an icon and maps the icon to the position where the sensor is located to generate an image map, a control unit which selects n icons (n is a natural number of 3 or more) satisfying a predetermined criterion as candidate icons, and a detection unit which detects a center point of the polygon formed by the candidate icons as the coordinates of the user terminal. Accordingly, the present invention can improve the sensor data processing speed.

Description

TECHNICAL FIELD [0001] The present invention relates to an indoor positioning apparatus and an indoor positioning method based on a sensor image.

The present invention relates to an indoor positioning device based on a sensor image and an indoor positioning method based on the sensor image.

BACKGROUND ART With the advent of the Internet of Things (IoT), techniques have recently come into use that can provide useful information to a user through Internet access and information sharing (e.g., sensor values) between objects. Examples of such technology include an indoor positioning technique using the strength of a Wi-Fi signal and a technique that uses data such as air pressure, temperature, and humidity for weather forecasting.

Meanwhile, with improvements in computational performance, machine learning technologies such as artificial neural networks have advanced in the image processing field. In particular, pattern recognition technology that extracts various information from images and recognizes objects in an image has developed dramatically.

In the prior art, data acquired from a large number of sensors is generally stored in a relational database (DB). This has the advantage that all data can be stored without loss. However, when searching for similar data, it has the disadvantage that every search algorithm must be constructed to fit the data. The prior art is also difficult to apply to new search/pattern-recognition algorithms because of the restrictions of the data format.

Therefore, there is a need for an apparatus and method that search for similar data by simply converting values measured from sensors into image form and finding similar patterns through existing pattern recognition algorithms.

SUMMARY OF THE INVENTION It is an object of the present invention to improve the speed of processing sensor data by comparing image maps in which signal strength is visualized as icons.

In order to achieve the above object, an indoor positioning apparatus based on a sensor image includes a mapping unit for visualizing the signal strength of a sensor measured at a user terminal as an icon and generating an image map by mapping the icon to the position where the sensor is located, a control unit for selecting, in the image map, n icons (n is a natural number of 3 or more) that satisfy a predetermined criterion as candidate icons, and a detection unit for detecting a center point of the polygon formed by the candidate icons as the coordinates of the user terminal.

According to another aspect of the present invention, there is provided an indoor positioning method based on a sensor image, comprising the steps of: visualizing the signal strength of a sensor measured at a user terminal as an icon and generating an image map by mapping the icon to the position where the sensor is located; selecting, in the image map, n icons (n is a natural number of 3 or more) that satisfy a predetermined criterion as candidate icons; and detecting a center point of the polygon formed by the candidate icons as the coordinates of the user terminal.

According to an embodiment of the present invention, the speed at which sensor data is processed can be improved by comparing image maps in which the signal strength is visualized as icons.

In addition, the indoor positioning apparatus 100 based on the sensor image can visualize the sensor data itself as an image and process a plurality of sensor data received from a plurality of sensors using techniques developed in the image processing field, so the present invention can be applied to any field in which a desired pattern is to be found in sensor data and utilized.

FIG. 1 is a block diagram illustrating an indoor positioning apparatus based on a sensor image according to an embodiment of the present invention.
FIG. 2 is a view for explaining an image map and icons according to an embodiment of the present invention.
FIGS. 3 and 4 are views for explaining an application example using an indoor positioning device according to an embodiment of the present invention.
FIG. 5 is a diagram for explaining a process of generating an image map according to an embodiment of the present invention.
FIG. 6 is a diagram illustrating a process of detecting a location of a user terminal using an indoor positioning device according to an embodiment of the present invention.
FIG. 7 is a workflow diagram specifically illustrating an indoor positioning method based on a sensor image according to an embodiment of the present invention.

Hereinafter, embodiments according to the present invention will be described in detail with reference to the accompanying drawings. However, the present invention is not limited to or limited by the embodiments. Like reference symbols in the drawings denote like elements.

In the indoor positioning apparatus and the indoor positioning method based on a sensor image described in this specification, a representative image map including at least one icon is generated for each sensor, and the position of the user terminal can be detected by comparing a re-measured image map with the representative image map.

In the present specification, a sensor image may include an image map, a representative image map, and an image obtained by converting sensor data (sensor values).

FIG. 1 is a block diagram illustrating an indoor positioning apparatus based on a sensor image according to an embodiment of the present invention.

An indoor positioning device 100 based on a sensor image of the present invention may include a mapping unit 110, a control unit 120, and a detection unit 130.

The mapping unit 110 visualizes the signal strength of the sensor measured at the user terminal as an icon, and maps the icon to the position where the sensor is located to generate an image map. That is, the mapping unit 110 can mark, through the icon, the position of the sensor sensed through the user terminal. Here, the icon may be a visualizable image having a size, saturation, brightness, color, and shape. For example, the icon may be a green circle, a yellow square, a red triangle, or the like. The image map may be an image including icons that indicate the positions of sensors according to their signal strength. In the case of Wi-Fi, the position where the sensor is located may be generated using the unique number of an access point (AP) or a base station.

Further, the mapping unit 110 may determine at least one of the size, saturation, and brightness of the icon in proportion to the signal strength. For example, the mapping unit 110 may visualize a larger icon as the signal strength increases, or visualize an icon with lower saturation as the signal strength decreases, and may combine these attributes to visualize the signal strength.
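As an illustrative sketch only (not taken from the patent text), the proportional mapping described above could look like the following; the function name, the RSSI bounds of −100 dBm to −30 dBm, and the size/brightness ranges are all assumptions for the example:

```python
def icon_attributes(rssi_dbm, lo=-100.0, hi=-30.0):
    """Map a signal strength reading to icon size and brightness.

    lo/hi are assumed RSSI bounds; readings outside them are clamped.
    """
    v = (rssi_dbm - lo) / (hi - lo)   # normalize to [0, 1]
    v = min(max(v, 0.0), 1.0)
    size = 4 + round(12 * v)          # stronger signal -> larger icon
    brightness = round(255 * v)       # stronger signal -> brighter icon
    return size, brightness
```

With these assumed bounds, the strongest reading yields the largest, brightest icon, and anything at or below −100 dBm yields the smallest, darkest one.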

In addition, when the sensors are of a plurality of different types, the mapping unit 110 may visualize icons distinguished by color or shape according to the sensor. That is, the mapping unit 110 can visualize a different icon for each sensor. For example, the mapping unit 110 can visualize a green square icon for a geomagnetic sensor and a red triangle icon for Wi-Fi.

For a more detailed description of the icons and image maps, reference is made to FIG. 2 below.

FIG. 2 is a view for explaining an image map and icons according to an embodiment of the present invention.

FIG. 2 is an example of an image map 200 showing signal strength for Wi-Fi, a mobile network, and a geomagnetic sensor; the types of sensors are not limited thereto.

As shown in FIG. 2, the mapping unit 110 can generate the image map 200 by visualizing the signal strength of each sensor through an icon at the position where the sensor is located. The mapping unit 110 can visualize the geomagnetic sensor as a square icon 210, Wi-Fi as a triangle icon 220, and the mobile network as a circle icon 230.

Referring again to FIG. 1, the mapping unit 110 can visualize standard icons whose size, saturation, brightness, color, and shape are distinguished according to the sensor data, using an image transformation algorithm. That is, the mapping unit 110 can convert the sensor data acquired from the sensor into an image including the icon corresponding to that sensor data through the image transformation algorithm. Here, the sensor data may be a value measured from the sensor; for example, the sensor data may be data on the signal strength of the sensor. The image transformation algorithm may be any commonly used algorithm that can convert sensor data into an image. A process of generating an image map using an image transformation algorithm is described with reference to FIG. 5.

In addition, the mapping unit 110 can visualize the signal strength of the sensor according to sensor data associated with the sensor as a standard icon, and generate a representative image map by mapping the standard icon to the position where the sensor is located. That is, the mapping unit 110 may generate a representative image map by mapping a standard icon, which visualizes the signal strength of the sensor on the basis of previously measured sensor data, to the point where the sensor is located. The representative image map can be used to select candidate icons through the control unit 120, which is described below.

In the image map, the control unit 120 selects n icons (n is a natural number of 3 or more) that satisfy a predetermined criterion as candidate icons. That is, the control unit 120 may select at least three candidate icons satisfying a predetermined criterion on the visualized signal strength of the sensor (i.e., on at least one of the icon's size, saturation, brightness, color, and shape).

For example, referring to FIG. 2, when the icons having the lowest brightness among the shapes are selected as candidate icons in the image map 200, the control unit 120 can select the square 210, the triangle 220, and the circle 230 as candidate icons.

When the sensors are of a plurality of different types, the control unit 120 selects candidate icons matching the criterion in each of the image maps generated for the plurality of sensors, and can control the total number of selected candidate icons so as not to exceed n. For example, when image maps are generated for three different kinds of sensors, the control unit 120 can select three or fewer candidate icons. In this case, when three candidate icons are selected, the control unit 120 can select one icon per sensor as a candidate icon.

In addition, the control unit 120 may select, as candidate icons, n icons mapped to a designated map area or n icons having at least one of a specified shape, color, and saturation in the image map. That is, the control unit 120 can select candidate icons within a map area designated on the image map. For example, the control unit 120 may select candidate icons from among the icons mapped to the map area corresponding to the center of the image map.

In addition, the control unit 120 can compare the icons in the image map with the standard icons in the representative image map and select the n icons having the highest similarity as candidate icons. That is, the control unit 120 compares the visualized icons of the image map and the representative image map, and can select candidate icons according to the similarity of the icons' position, size, saturation, brightness, color, and shape. Here, the control unit 120 can select candidate icons using a general pattern recognition algorithm.
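The patent leaves the pattern recognition algorithm open; as one minimal sketch, a negative squared distance over the listed visual attributes can stand in for the similarity measure. The attribute set, function names, and scoring scheme below are illustrative assumptions, not the claimed method:

```python
def icon_similarity(a, b):
    # Negative squared distance over visual attributes:
    # a larger value means the two icons are more similar.
    keys = ("x", "y", "size", "saturation", "brightness")
    return -sum((a[k] - b[k]) ** 2 for k in keys)

def select_candidates(icons, standard_icons, n=3):
    # Score each measured icon by its best match among the standard
    # icons of the representative map, then keep the n highest scores.
    scored = [(max(icon_similarity(i, s) for s in standard_icons), i)
              for i in icons]
    scored.sort(key=lambda t: t[0], reverse=True)
    return [i for _, i in scored[:n]]
```

Icons far from every standard icon score poorly and are excluded, which mirrors the "n icons having the highest similarity" selection in the text.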

The detection unit 130 detects a center point of the polygon formed by the candidate icons as the coordinates of the user terminal. In other words, the detection unit 130 may detect, as the coordinates where the user terminal is located, the center point of the polygon formed by connecting the candidate icons.

For example, referring to FIG. 2, the detection unit 130 can detect the center point 240 of the triangle formed by the square 210, the triangle 220, and the circle 230 on the image map 200 as the position of the user terminal.
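As a sketch of this detection step, the center point can be taken as the arithmetic mean of the candidate icon positions (the centroid of the polygon's vertices); the patent does not specify which center is meant, so this choice is an assumption:

```python
def detect_position(candidates):
    # candidates: list of (x, y) positions of the candidate icons.
    # Returns the vertex centroid as the user terminal's coordinates.
    xs = [x for x, _ in candidates]
    ys = [y for _, y in candidates]
    return sum(xs) / len(xs), sum(ys) / len(ys)
```

For three candidates this is simply the centroid of the triangle they form, matching the FIG. 2 example.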

At this time, the mapping unit 110 may update the image map in consideration of the signal strength of the sensor re-measured through the user terminal at the detected coordinates. That is, the mapping unit 110 may update the previously generated representative image map with the image map obtained at the point where the coordinates of the user terminal were detected. The mapping unit 110 can thereby keep the representative image map up to date.
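One way to realize this update, consistent with the low-pass filters mentioned later in the description of FIG. 6, is a per-pixel exponential moving average. The smoothing factor and flattened-list representation are assumptions for the sketch:

```python
def update_representative(rep, new, alpha=0.2):
    # Blend a freshly measured map into the representative map with a
    # simple exponential moving average (a one-pole low-pass filter).
    # rep/new are flattened pixel-value lists; alpha is an assumed
    # smoothing factor (higher = faster adaptation to new readings).
    return [(1 - alpha) * r + alpha * x for r, x in zip(rep, new)]
```

Repeated calls let transient measurement noise average out while the representative map tracks slow changes in the signal environment.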

The indoor positioning device 100 based on the sensor image can improve the processing speed of the sensor data by comparing image maps in which the signal strength is visualized as icons.

In addition, the indoor positioning apparatus 100 based on the sensor image can visualize the sensor data itself as an image and process a plurality of sensor data received from a plurality of sensors using techniques developed in the image processing field, so it can be applied to any field in which a desired pattern is to be found in sensor data and utilized.

In addition to algorithms such as K-Nearest Neighbor, the indoor positioning device 100 based on the sensor image can use, without modification, complex algorithms developed for recognizing a desired object in an image, such as artificial neural networks and deep learning.

FIGS. 3 and 4 are views for explaining an application example using an indoor positioning device according to an embodiment of the present invention.

The plan view showing the corridor and the office in FIG. 3 is described as an example, but the present invention is not limited thereto. That is, the indoor positioning device 300 can be used anywhere indoors.

The indoor positioning device 300 may divide the corridor and the office into areas by dotted lines, like a virtual coordinate plane, and generate a plurality of image maps 310, 320, and 330 for each area. At this time, the indoor positioning device 300 may process the plurality of image maps 310, 320, and 330 to generate at least one representative image map. That is, the indoor positioning device 300 can generate at least one representative image map for each area. The indoor positioning device 300 may store the plurality of image maps (including the representative image maps) in a database (DB) 301.

When an image map 341 is generated according to the signal strength of the sensors measured by the user terminal 340, the indoor positioning device 300 can select candidate icons by comparing the image map 341 acquired from the user terminal 340 with the representative image maps stored in the database 301. At this time, the indoor positioning device 300 can compare (350, 351, 352) the representative image map generated for each area with the candidate icons on the image map 341 obtained from the user terminal 340 through a pattern recognition algorithm. If the representative image map generated from the plurality of image maps 310 has the highest similarity to the image map 341 obtained from the user terminal 340, the indoor positioning apparatus 300 may detect, as the position of the user terminal 340, the location where that representative image map was created.

FIG. 4 is a block diagram illustrating the process described in FIG. 3.

The indoor positioning device 400 can generate a representative image map in the learning phase, and in the use phase can compare an image map generated from sensor data measured at the user terminal with the representative image map.

In the learning phase, the indoor positioning device 400 may acquire sensor data from the plurality of sensors 410 and generate a plurality of image maps 430 through the image transformation algorithm 420. The image maps 430 may be stored in the database 440.

In the use phase, the indoor positioning device 400 can acquire sensor data from the plurality of sensors 411 and generate an image map 431 through the image transformation algorithm 421. The indoor positioning device 400 can then use the pattern recognition algorithm 450 to compare the image map 431 with the plurality of image maps 430 previously stored in the database 440 and detect the location of the user terminal. At this time, the indoor positioning device 400 may provide useful information 460 depending on the location of the user terminal, such as the location itself or a weather forecast.

For example, in the case of signal strength for Wi-Fi, the indoor positioning device 300 may visualize the unique number of the corresponding AP using a QR code, a barcode, a color, and the like. The indoor positioning device 300 can generate a plurality of image maps for each point, select the stored map most similar to the image map sent by the user terminal 340 using a pattern recognition algorithm, and detect the indoor position.

As another example, in the case of air pressure, temperature, and humidity, the indoor positioning device 300 can display icons using the brightness of the three primary colors (red, green, and blue). The indoor positioning device 300 may collect sensor data measured nationwide, represent the weather at each point in time using transparency channels, and provide weather forecasts based on past data as useful information.

In addition, by using the indoor positioning device 300, sensor values measured in the field by enterprises, research institutes, and the like can be converted into images, learned by a complex pattern recognition algorithm, and used for future prediction (useful information).

FIG. 5 is a diagram for explaining a process of generating an image map according to an embodiment of the present invention.

In FIG. 5, the sensor data used to generate the image map are an outdoor signal 510, an indoor signal 520, and a terminal sensor 530; however, the present invention is not limited thereto. Here, the outdoor signal 510 may be at least one of 2G, 3G, DMB, DGPS, LTE, and the like. The indoor signal 520 may be at least one of RSSI, RTT, EDP, and the like, in association with Wi-Fi. The terminal sensor 530 may be at least one sensor used in connection with the terminal, such as a geomagnetic sensor or an accelerometer.

The indoor positioning apparatus 100 can set the position of the outdoor signal 510 through the unique number of the base station and a hash function (511). The indoor positioning apparatus 100 may normalize the signal strength and determine the color and brightness of pixels according to the normalized value to visualize it (512). For example, the indoor positioning device 100 can assign the color red to the outdoor signal and visualize the normalized value through the brightness of the red (represented by circles in FIG. 5).
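Steps 511 and 512 can be sketched as follows. The patent does not name the hash function or the signal-strength range, so MD5 and the −100 dBm to −30 dBm bounds are stand-in assumptions, as are the function names:

```python
import hashlib

def place(station_id, width=100, height=100):
    # Step 511 sketch: derive a stable pixel position inside the image
    # map from a base station (or AP) unique number via a hash.
    # MD5 stands in for the unspecified hash function.
    h = int(hashlib.md5(station_id.encode()).hexdigest(), 16)
    return h % width, (h // width) % height

def normalize(rssi_dbm, lo=-100.0, hi=-30.0):
    # Step 512 sketch: normalize signal strength to [0, 1] so it can
    # drive pixel brightness; lo/hi are assumed RSSI bounds.
    return min(max((rssi_dbm - lo) / (hi - lo), 0.0), 1.0)
```

Because the hash is deterministic, the same base station always lands on the same pixel, which is what makes re-measured maps comparable to stored ones.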

Next, the indoor positioning apparatus 100 can set the position of the indoor signal 520 using the MAC address of the AP and a hash function (521). In addition, the indoor positioning apparatus 100 may normalize the signal strength of the indoor signal 520 to assign a color and set the brightness of the pixels (522). For example, the indoor positioning apparatus 100 can assign the color green and visualize the normalized value through the brightness of the green (represented by squares in FIG. 5).

Next, the indoor positioning apparatus 100 can allocate the data measured from the terminal sensor 530 to 4 pixels at the center coordinates of the image map (531), and convert the measured value into a color value to visualize it (532). For example, when the terminal sensor 530 is a geomagnetic sensor, the indoor positioning apparatus 100 can convert the magnetic field data into a color value through the RGB model (represented by a square in FIG. 5).

Next, the indoor positioning apparatus 100 can generate a single image map 540 by superimposing the visualized images for each sensor. At this time, the image map 540 may be generated with a size of 100 × 100 pixels. In addition, the indoor positioning apparatus 100 can generate 20 image maps, one every two seconds, for one point, but the present invention is not limited thereto. That is, the indoor positioning apparatus 100 may generate at least one image map 540 per point. The indoor positioning apparatus 100 can select at least one of the plurality of image maps 540 generated at one point as the representative image map.
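The superimposing step can be sketched as compositing sparse per-sensor layers into one image. The sparse-dict layer representation and draw-on-top rule are assumptions; the 100 × 100 default matches the size stated above:

```python
def compose(layers, width=100, height=100):
    # Superimpose per-sensor visualizations into one image map.
    # Each layer is a sparse dict mapping (x, y) -> (r, g, b);
    # later layers draw on top of earlier ones.
    img = [[(0, 0, 0)] * width for _ in range(height)]
    for layer in layers:
        for (x, y), rgb in layer.items():
            img[y][x] = rgb
    return img
```

One call with the outdoor-signal, indoor-signal, and terminal-sensor layers yields the single image map 540.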

FIG. 6 is a diagram illustrating a process of detecting a location of a user terminal using an indoor positioning device according to an embodiment of the present invention.

First, the indoor positioning apparatus 600 can receive a plurality of image maps 610, 611, and 612 measured for each location (coordinates) from the terminal. The indoor positioning apparatus 600 may pass the image maps 610, 611, and 612 through low-pass filters 620, 621, and 622 and store the resulting representative image maps in the database 630. At this time, the indoor positioning device 600 can store each representative image map by coordinates so as to correspond to its position.

Next, the indoor positioning device 600 can receive an image map 650, re-measured and generated at the position to be detected, from the user terminal 640. The indoor positioning apparatus 600 can compare the representative image maps in the database 630 with the image map 650 received from the user terminal 640 using a position determination algorithm 660 such as a pattern recognition algorithm. Accordingly, the indoor positioning apparatus 600 can select the image map having the highest similarity (670). The indoor positioning device 600 may then transmit the detected location to the user terminal 640 so that the user terminal 640 receives it (680). The indoor positioning device 600 can also pass the image map received from the user terminal 640 at the detected position through the low-pass filter 623, store it in the database 630, and thereby update the representative image map stored in the database 630.
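The comparison and selection steps (660, 670) can be sketched with mean squared error as a stand-in similarity measure; the patent only says "a pattern recognition algorithm", so the metric, function names, and flattened-map representation are assumptions:

```python
def mse(a, b):
    # Mean squared error between two flattened image maps:
    # smaller means more similar.
    return sum((x - y) ** 2 for x, y in zip(a, b)) / len(a)

def locate(query, rep_maps):
    # rep_maps: dict mapping stored coordinates -> representative map.
    # Return the coordinates whose representative map is most similar
    # to the re-measured map.
    return min(rep_maps, key=lambda pos: mse(query, rep_maps[pos]))
```

The returned coordinates are what the device would transmit back to the user terminal in step 680.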

FIG. 7 is a workflow diagram specifically illustrating an indoor positioning method based on a sensor image according to an embodiment of the present invention.

The indoor positioning method based on the sensor image according to the present embodiment can be performed by the indoor positioning device 100 based on the sensor image described above.

First, the indoor positioning device 100 visualizes the signal strength of the sensor measured at the user terminal as an icon, and maps the icon to the position where the sensor is located to generate an image map (710). That is, step 710 may be a process of marking, through the icon, the point where the sensor sensed through the user terminal is located. Here, the icon may be a visualizable image having a size, saturation, brightness, color, and shape; for example, a green circle, a yellow square, a red triangle, or the like. The image map may be an image including icons that indicate the positions of sensors according to their signal strength. In the case of Wi-Fi, the position where the sensor is located may be generated using the unique number of an access point (AP) or a base station.

Also, in the image map, the indoor positioning apparatus 100 selects n icons (n is a natural number of 3 or more) that satisfy a predetermined criterion as candidate icons (720). That is, in step 720, at least three candidate icons may be selected that satisfy a predetermined criterion on the visualized signal strength of the sensor (i.e., on at least one of the icon's size, saturation, brightness, color, and shape).

If the sensors are of a plurality of different types, step 720 selects candidate icons matching the criterion in each of the image maps generated for the plurality of sensors, and the total number of selected candidate icons may be controlled so as not to exceed n. For example, when image maps are generated for three different types of sensors, the indoor positioning apparatus 100 can select three or fewer candidate icons. In this case, when three candidate icons are selected, the indoor positioning device 100 can select one icon per sensor as a candidate icon.

According to an embodiment, the indoor positioning device 100 may visualize the signal strength of a sensor according to sensor data associated with the sensor as a standard icon, and generate a representative image map by mapping the standard icon to the position where the sensor is located.

At this time, in step 720, the icons in the image map may be compared with the standard icons in the representative image map, and the n icons having the highest similarity may be selected as candidate icons. That is, the indoor positioning device 100 can generate a representative image map by mapping a standard icon, which visualizes the signal strength of the sensor on the basis of previously measured sensor data, to the position where the sensor is located.

In addition, the indoor positioning apparatus 100 detects a center point of the polygon formed by the candidate icons as the coordinates of the user terminal (730). In other words, step 730 may be a process of detecting, as the coordinates where the user terminal is located, the center point of the polygon formed by connecting the candidate icons.

According to an embodiment, the indoor positioning apparatus 100 can determine at least one of the size, saturation, and brightness of the icon in proportion to the signal strength. For example, the indoor positioning apparatus 100 may visualize a larger icon as the signal strength becomes stronger, or visualize an icon with lower saturation as the signal strength becomes weaker, and may combine these attributes to visualize the signal strength.

The indoor positioning method based on the sensor image can improve the processing speed of the sensor data by comparing image maps in which the signal strength is visualized as icons.

The method according to an embodiment of the present invention may be implemented in the form of program instructions that can be executed through various computer means and recorded on a computer-readable medium. The computer-readable medium may include program instructions, data files, data structures, and the like, alone or in combination. The program instructions recorded on the medium may be those specially designed and configured for the embodiments or those known and available to persons skilled in computer software. Examples of computer-readable media include magnetic media such as hard disks, floppy disks, and magnetic tape; optical media such as CD-ROM and DVD; magneto-optical media such as floptical disks; and hardware devices specifically configured to store and execute program instructions, such as ROM, RAM, and flash memory. Examples of program instructions include machine language code such as that produced by a compiler, as well as high-level language code that can be executed by a computer using an interpreter or the like. The hardware devices described above may be configured to operate as one or more software modules to perform the operations of the embodiments, and vice versa.

While the present invention has been particularly shown and described with reference to exemplary embodiments thereof, it is to be understood that the invention is not limited to the disclosed embodiments. For example, the described techniques may be performed in a different order than described, and/or components of the described systems, structures, devices, and circuits may be combined in forms different from those described, or replaced or substituted by other components or their equivalents, while still achieving appropriate results.

Therefore, other implementations, other embodiments, and equivalents to the claims are also within the scope of the following claims.

100: Indoor positioning device based on sensor image
110: Mapping unit
120: Control unit
130: Detection unit

Claims (12)

A mapping unit for visualizing the signal strength of a sensor measured at a user terminal as an icon and generating an image map by mapping the icon to a position where the sensor is located;
A control unit for selecting, in the image map, n icons (n is a natural number of 3 or more) that satisfy a predetermined criterion as candidate icons; and
A detection unit for detecting a center point in the polygon formed by the candidate icon as coordinates of the user terminal,
The indoor positioning device based on the sensor image.
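By way of non-limiting illustration only (not part of the claims), the mapping, selection, and detection stages recited above might be sketched as follows; the sensor positions, RSSI values, size formula, and "largest icons" criterion are hypothetical examples, not the claimed implementation:

```python
# Illustrative sketch of the claimed pipeline: visualize signal strength
# as icons on an image map, select n candidate icons, and take the center
# point of the polygon they form as the terminal's coordinates.
# All names and values are hypothetical.

def build_image_map(measurements):
    """Map each sensor's measured signal strength (RSSI, dBm) to an icon
    placed at the sensor's known (x, y) position."""
    return [
        {"pos": pos, "size": max(1.0, 100.0 + rssi)}  # stronger RSSI -> larger icon
        for pos, rssi in measurements
    ]

def select_candidates(image_map, n=3):
    """Select the n icons best satisfying the criterion
    (here, hypothetically: the n largest icons, i.e. strongest signals)."""
    return sorted(image_map, key=lambda icon: icon["size"], reverse=True)[:n]

def detect_coordinates(candidates):
    """Vertex average of the polygon formed by the candidate icons."""
    xs = [c["pos"][0] for c in candidates]
    ys = [c["pos"][1] for c in candidates]
    return (sum(xs) / len(xs), sum(ys) / len(ys))

# Example: three sensors at known positions with measured RSSI (dBm)
measurements = [((0.0, 0.0), -40.0), ((4.0, 0.0), -50.0), ((2.0, 3.0), -45.0)]
candidates = select_candidates(build_image_map(measurements), n=3)
print(detect_coordinates(candidates))  # (2.0, 1.0)
```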
The device according to claim 1,
wherein the mapping unit determines at least one of the size, saturation, and brightness of the icon in proportion to the signal strength.
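As a non-limiting illustration of proportional icon attributes, one hypothetical scaling normalizes the measured dBm value into [0, 1] and maps it linearly; the dBm range and output ranges below are assumed for the example, not specified by the claims:

```python
def icon_attributes(rssi_dbm, rssi_min=-90.0, rssi_max=-30.0):
    """Scale icon size, saturation, and brightness in proportion to
    signal strength. The dBm calibration range is a hypothetical choice."""
    t = (rssi_dbm - rssi_min) / (rssi_max - rssi_min)
    t = min(1.0, max(0.0, t))  # clamp to [0, 1]
    return {
        "size": 8 + t * 24,           # pixels: 8 (weak) .. 32 (strong)
        "saturation": t,              # 0.0 .. 1.0
        "brightness": 0.3 + 0.7 * t,  # keep weak icons faintly visible
    }

print(icon_attributes(-30.0)["size"])  # 32.0 (strongest signal -> largest icon)
```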
The device according to claim 1,
wherein, when there are a plurality of sensors,
the mapping unit visualizes icons distinguished by color or shape for each of the plurality of sensors.
The device according to claim 1,
wherein the mapping unit updates the image map in consideration of the signal strength of the sensor re-measured through the user terminal at the detected coordinates.
The device according to claim 1,
wherein the mapping unit visualizes the signal strength of the sensor according to sensor data associated with the sensor as a standard icon and generates a representative image map by mapping the standard icon to the position where the sensor is located, and
the control unit compares the icons in the image map with the standard icons in the representative image map and selects the n icons having the highest similarity as the candidate icons.
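The similarity comparison above can be sketched, purely for illustration, by scoring each measured icon against the standard icon at the same position; the absolute-size-difference score is a hypothetical stand-in (an actual embodiment could compare rendered images):

```python
def similarity(icon, standard):
    """Hypothetical similarity score between a measured icon and a
    standard icon: negative absolute size difference (larger = more similar)."""
    return -abs(icon["size"] - standard["size"])

def select_by_similarity(image_map, representative_map, n=3):
    """Select the n measured icons most similar to the standard icons
    at the corresponding sensor positions in the representative map."""
    scored = [
        (similarity(icon, std), i)
        for i, (icon, std) in enumerate(zip(image_map, representative_map))
    ]
    scored.sort(reverse=True)  # best (highest) similarity first
    return [image_map[i] for _, i in scored[:n]]

image_map = [{"size": 30.0}, {"size": 10.0}, {"size": 20.0}, {"size": 25.0}]
representative = [{"size": 28.0}, {"size": 26.0}, {"size": 21.0}, {"size": 5.0}]
best = select_by_similarity(image_map, representative, n=3)
print([icon["size"] for icon in best])  # [20.0, 30.0, 10.0]
```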
The device according to claim 5,
wherein the mapping unit visualizes, using an image transformation algorithm, the standard icon with size, saturation, brightness, color, and shape distinguished according to the sensor data.
The device according to claim 1,
wherein, when there are a plurality of sensors,
the control unit selects candidate icons satisfying the criterion in each of the image maps generated for the plurality of sensors, and controls the total number of selected candidate icons not to exceed n.
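For illustration only, the cap on the total number of candidates across multiple per-sensor image maps might look as follows; the first-come selection order and the size-threshold criterion are hypothetical choices:

```python
def select_across_sensors(image_maps, criterion, n=3):
    """Collect candidate icons from each sensor's image map, keeping the
    total over all maps at no more than n. The scan order is a
    hypothetical choice; a real embodiment could rank candidates first."""
    selected = []
    for image_map in image_maps:
        for icon in image_map:
            if criterion(icon):
                selected.append(icon)
            if len(selected) >= n:  # total across all maps must not exceed n
                return selected
    return selected

maps = [[{"size": 30}, {"size": 5}], [{"size": 22}, {"size": 28}], [{"size": 40}]]
big_enough = lambda icon: icon["size"] >= 20  # hypothetical criterion
print(len(select_across_sensors(maps, big_enough, n=3)))  # 3
```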
The device according to claim 1,
wherein the control unit selects, as the candidate icons, n icons mapped to a designated map area in the image map, or n icons having at least one of a designated shape, color, and shade.
A method for indoor positioning based on a sensor image, the method comprising:
visualizing the signal strength of a sensor, measured at a user terminal, as an icon and generating an image map by mapping the icon to the position where the sensor is located;
selecting, as candidate icons, n icons (n being a natural number of 3 or more) that satisfy a predetermined criterion in the image map; and
detecting a center point of the polygon formed by the candidate icons as coordinates of the user terminal.
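The detecting step refers to a center point of the polygon formed by the candidate icons. Purely as an illustrative sketch (the claims do not prescribe a particular centroid formula), the area-weighted polygon centroid can be computed with the shoelace formula, which for a triangle of three candidates coincides with the vertex average:

```python
def polygon_centroid(points):
    """Area-weighted centroid (shoelace formula) of the polygon whose
    vertices are the candidate icon positions, given in boundary order."""
    a = cx = cy = 0.0
    m = len(points)
    for i in range(m):
        x0, y0 = points[i]
        x1, y1 = points[(i + 1) % m]
        cross = x0 * y1 - x1 * y0  # signed parallelogram area term
        a += cross
        cx += (x0 + x1) * cross
        cy += (y0 + y1) * cross
    a *= 0.5
    return (cx / (6.0 * a), cy / (6.0 * a))

# Three candidate icon positions forming a triangle
print(polygon_centroid([(0.0, 0.0), (4.0, 0.0), (2.0, 3.0)]))  # (2.0, 1.0)
```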
The method according to claim 9,
wherein the generating of the image map comprises determining at least one of the size, saturation, and brightness of the icon in proportion to the signal strength.
The method according to claim 9,
wherein, when there are a plurality of sensors,
the selecting of the candidate icons comprises selecting candidate icons that satisfy the criterion in each of the image maps generated for the plurality of sensors, and controlling the total number of candidate icons selected not to exceed n.
The method according to claim 9, further comprising:
visualizing the signal strength of the sensor according to sensor data associated with the sensor as a standard icon and generating a representative image map by mapping the standard icon to the position where the sensor is located,
wherein the selecting of the candidate icons comprises comparing the icons in the image map with the standard icons in the representative image map and selecting the n icons having the highest similarity as the candidate icons.
KR1020150139916A 2015-10-05 2015-10-05 Device and method for indoor positioning based on sensor image KR101767743B1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
KR1020150139916A KR101767743B1 (en) 2015-10-05 2015-10-05 Device and method for indoor positioning based on sensor image

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
KR1020150139916A KR101767743B1 (en) 2015-10-05 2015-10-05 Device and method for indoor positioning based on sensor image

Publications (2)

Publication Number Publication Date
KR20170040652A true KR20170040652A (en) 2017-04-13
KR101767743B1 KR101767743B1 (en) 2017-08-11

Family

Family ID: 58580055

Family Applications (1)

Application Number Title Priority Date Filing Date
KR1020150139916A KR101767743B1 (en) 2015-10-05 2015-10-05 Device and method for indoor positioning based on sensor image

Country Status (1)

Country Link
KR (1) KR101767743B1 (en)


Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20210048697A (en) 2019-10-24 2021-05-04 파파야 주식회사 Indoor positioning apparatus and method

Family Cites Families (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2973019B2 (en) * 1990-08-02 1999-11-08 株式会社光電製作所 Multiple direction display type direction finder
JP2003254761A (en) * 2002-02-28 2003-09-10 Fujitsu Ten Ltd Navigation device
JP2008298721A (en) * 2007-06-04 2008-12-11 Keio Gijuku Location estimation system and program
JP5009119B2 (en) * 2007-10-05 2012-08-22 クラリオン株式会社 Navigation device
KR100960533B1 (en) * 2008-02-05 2010-06-03 에스케이 텔레콤주식회사 Method, Apparatus and System for Mearsuring Location Based Terminal by Using U-pCell Database
US8686734B2 (en) * 2010-02-10 2014-04-01 Disney Enterprises, Inc. System and method for determining radio frequency identification (RFID) system performance
JP2013124885A (en) * 2011-12-13 2013-06-24 Information Services International Dentsu Ltd Positioning device, positioning method and program
KR101443675B1 (en) * 2013-01-16 2014-09-23 인팩일렉스 주식회사 Radio Signal Measuring System for Vehicle
KR101516769B1 (en) * 2013-12-11 2015-05-04 숭실대학교산학협력단 Indoor wireless positioning system and indoor wireless positioning method

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2023140711A1 (en) * 2022-01-24 2023-07-27 파파야 주식회사 Positioning method based on artificial intelligence neural network constructed on basis of sensor map image of multi-signal environment data, and device therefor
KR20230114216A (en) * 2022-01-24 2023-08-01 파파야 주식회사 A positioning method based on artificial intelligence neural network based on sensor map images of city data and an apparatus using it

Also Published As

Publication number Publication date
KR101767743B1 (en) 2017-08-11

Similar Documents

Publication Publication Date Title
US9405986B2 (en) Apparatus and method for recognizing objects using filter information
US9142011B2 (en) Shadow detection method and device
WO2019223608A1 (en) Service providing method and apparatus
US9472004B2 (en) Navigating using an indoor map representation
US9728009B2 (en) Augmented reality based management of a representation of a smart environment
CN106931945A (en) Robot navigation method and system
US9767365B2 (en) Monitoring system and method for queue
CN103162682A (en) Indoor path navigation method based on mixed reality
CN109948450A (en) A kind of user behavior detection method, device and storage medium based on image
KR102333520B1 (en) Method, device and system for detecting object on road
CN112232368B (en) Target recognition model training method, target recognition method and related devices thereof
KR101767743B1 (en) Device and method for indoor positioning based on sensor image
US20160085831A1 (en) Method and apparatus for map classification and restructuring
US9286689B2 (en) Method and device for detecting the gait of a pedestrian for a portable terminal
CN105574841B (en) A kind of image partition method and device based on color
KR101517538B1 (en) Apparatus and method for detecting importance region using centroid weight mask map and storage medium recording program therefor
CN108287845A (en) A kind of Automatic extraction method for road information and device and hybrid navigation system
US7120296B2 (en) Information processing method
US20230298309A1 (en) Multiscale object detection device and method
KR100543706B1 (en) Vision-based humanbeing detection method and apparatus
Wietrzykowski et al. Adopting the FAB-MAP algorithm for indoor localization with WiFi fingerprints
KR101199959B1 (en) System for reconnizaing road sign board of image
KR101091061B1 (en) Method for Measuring the Location Similarity of Spatial Object on Digital Maps and Map Matching using the same
CN107403151A (en) Method, apparatus, equipment and the computer-readable recording medium positioned by indoor ceiling
KR102161212B1 (en) System and method for motion detecting

Legal Events

Date Code Title Description
A201 Request for examination
E902 Notification of reason for refusal
E902 Notification of reason for refusal
E701 Decision to grant or registration of patent right
GRNT Written decision to grant