KR101780122B1 - Indoor Positioning Device Using a Single Image Sensor and Method Thereof - Google Patents

Indoor Positioning Device Using a Single Image Sensor and Method Thereof Download PDF

Info

Publication number
KR101780122B1
Authority
KR
South Korea
Prior art keywords
image sensor
lens
leds
led
coordinate information
Prior art date
Application number
KR1020150150904A
Other languages
Korean (ko)
Other versions
KR20170049953A (en)
Inventor
김기두
호쎈 사즈드
박영일
Original Assignee
국민대학교산학협력단
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 국민대학교산학협력단 filed Critical 국민대학교산학협력단
Priority to KR1020150150904A priority Critical patent/KR101780122B1/en
Publication of KR20170049953A publication Critical patent/KR20170049953A/en
Application granted granted Critical
Publication of KR101780122B1 publication Critical patent/KR101780122B1/en

Links

Images

Classifications

    • G: PHYSICS
    • G01: MEASURING; TESTING
    • G01S: RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S5/00: Position-fixing by co-ordinating two or more direction or position line determinations; Position-fixing by co-ordinating two or more distance determinations
    • G01S5/18: Position-fixing by co-ordinating two or more direction or position line determinations; Position-fixing by co-ordinating two or more distance determinations using ultrasonic, sonic, or infrasonic waves
    • G01S5/22: Position of source determined by co-ordinating a plurality of position lines defined by path-difference measurements
    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04N: PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N5/00: Details of television systems
    • H04N5/30: Transforming light or analogous information into electric information
    • H04N5/335
    • G: PHYSICS
    • G01: MEASURING; TESTING
    • G01J: MEASUREMENT OF INTENSITY, VELOCITY, SPECTRAL CONTENT, POLARISATION, PHASE OR PULSE CHARACTERISTICS OF INFRARED, VISIBLE OR ULTRAVIOLET LIGHT; COLORIMETRY; RADIATION PYROMETRY
    • G01J1/00: Photometry, e.g. photographic exposure meter
    • G01J1/42: Photometry, e.g. photographic exposure meter using electric radiation detectors
    • G01J1/44: Electric circuits
    • G01J2001/4446: Type of detector
    • G01J2001/446: Photodiode

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Length Measuring Devices By Optical Means (AREA)

Abstract

The present invention enables precise positioning with one image sensor in an indoor space where no GPS signal is available. To this end, the apparatus comprises: a single image sensor 20 for receiving, through a lens 10, visible light containing the coordinate information of a plurality of LEDs installed in a room; and a controller for detecting the coordinate information from the output signal of the image sensor 20 and determining the coordinates of the center point (P) of the lens 10, calculated on the basis of the coordinate information, as the current indoor position. An indoor position measuring method using the single image sensor is also provided.

Description

BACKGROUND OF THE INVENTION 1. Field of the Invention [0002] The present invention relates to indoor position measurement and, more particularly, to an apparatus and method for measuring an indoor position using a single image sensor.

Outdoor position measurement achieves high accuracy with the help of GPS signals. In indoor environments (for example, parking lots, subways, airports, government offices, factories, and shopping centers), however, GPS signals do not reach. Indoor navigation for humans and robots, interactive virtual games, resource development, item tracking, and location-based services (LBS) all require high precision. To measure position indoors with high precision, a variety of positioning algorithms are being studied, based on triangulation, scene analysis, and proximity methods, and using existing communication systems such as GPS, RF, wireless sensor networks, wireless LAN, Bluetooth, and VLC and OWC systems. These systems differ in precision, coverage, cost, complexity, sensitivity, and scalability.

Among these, indoor position measurement using visible light communication (VLC) involves complexity arising from angle measurement, changes in light intensity, the timing of transmission and reception over distance, and the like. These limitations ultimately increase positional error and degrade system accuracy, because small errors in such a system produce large error rates.

In particular, Korean Patent Registration No. 10-1174126 shows position error appearing in the calculation of the quadratic equations used for precise angle measurement, received signal strength measurement, and positioning. In that work, at least four LEDs in an LED array transmit three-dimensional coordinate information, which is received and demodulated by two image sensors near an unknown location. The unknown position was calculated from the geometric relationship of the LED images formed on the image sensor planes, using a combination of least squares and vector estimation, with a precision better than 10 cm. Because two image sensors are used, this method suffers from increased complexity and a combined quantization error from both sensors. It assumes that the center of each LED image lies at the center of a pixel, but the actual center of an LED image is not always at the pixel center, so a quantization error occurs. In addition, the positional accuracy depends on the spacing between the image sensors.

Therefore, a consistent algorithm needs to be developed that improves the positioning error of the system, reduces its complexity, and reduces the error in the system's dependent variables.

SUMMARY OF THE INVENTION It is therefore an object of the present invention to provide an indoor position measuring apparatus using a single image sensor, and a method thereof, that estimates an unknown position without requiring any angle measurement and without needing to consider the intensity of the received signal.

A second object of the present invention is to provide an indoor position measuring apparatus and method using a single image sensor that achieves high position measurement accuracy by transmitting the three-dimensional coordinate information of at least three LEDs via visible light communication.

A third object of the present invention is to provide an indoor position measuring apparatus using a single image sensor, and a method thereof, that is robust to the surrounding environment because the desired unknown position is calculated from the positional information of the reference LEDs and the geometric relationship of the LED images on the image sensor plane.

The above objects of the present invention are achieved by an indoor position measuring apparatus using a single image sensor, comprising: a single image sensor (20) for receiving, through a lens (10), visible light containing the coordinate information of a plurality of LEDs installed in a room; and a controller for detecting the coordinate information based on the output signal of the image sensor (20) and determining the coordinates of the center point (P) of the lens (10), calculated on the basis of the coordinate information, as the current indoor position.

Here, the LEDs are preferably fixed to the ceiling 30 of the room.

Further, the LEDs are more preferably at least three mutually spaced LEDs (LEDs A, B, and C), and each of the LEDs (LEDs A, B, and C) radiates its own coordinate information as visible light.

In addition, the controller calculates the coordinates of the center point (P) of the lens 10 as the current indoor position based on the three-dimensional (X-Y-Z) coordinate information.

The image sensor 20 may be an image pickup device or a plurality of photodiodes.

The controller can calculate the coordinates of the center point (P) of the lens 10 as the current indoor position based on the three-dimensional (X-Y-Z) coordinate information of each LED and the focal length f of the lens 10.

According to another aspect of the present invention, the above objects are achieved by an indoor position measuring method using a single image sensor, comprising the steps of: receiving, with a single image sensor (20) through a lens (10), visible light containing the coordinate information of a plurality of LEDs installed in a room; calculating the coordinate information based on the output signal of the image sensor (20); and determining the coordinates (x, y) of the center point (P) of the lens (10), calculated on the basis of the coordinate information, as the current indoor position.

The LEDs are at least three mutually spaced LEDs (LEDs A, B, and C), and each LED (LEDs A, B, and C) can radiate its own coordinate information A(x1, y1, z1), B(x2, y2, z2), and C(x3, y3, z3) as visible light.

The controller preferably calculates the coordinates of the center point (P) of the lens 10 as the current indoor position based on the three-dimensional (X-Y-Z) coordinate information.

The controller calculates the coordinates (x, y) of the center point (P) of the lens 10 as the current indoor position according to the following equation, based on the coordinate information and the focal length f of the lens 10.

[x, y]ᵀ = (1/2) · M⁻¹ · [K1, K2]ᵀ, where M = [[x1 − x2, y1 − y2], [x1 − x3, y1 − y3]]

Here, H is the vertical distance between the LEDs and the lens, z1 = z2 = z3 = H + z, i_k is the distance between the center of the image sensor 20 and the center of the image of the k-th LED, and f is the focal length of the lens 10, with

K1 = (x1² − x2²) + (y1² − y2²) − (H²/f²)(i1² − i2²),

K2 = (x1² − x3²) + (y1² − y3²) − (H²/f²)(i1² − i3²)
According to an embodiment of the present invention, there is a feature that an unknown position is predicted in a room without a GPS signal, and no angle measurement is required.

By transmitting three-dimensional coordinate information of at least three LEDs using visible light communication, it is possible to realize indoor position measurement with high position measurement accuracy. Therefore, there is an advantage that it is not necessary to use other auxiliary communication means such as RF communication and Bluetooth.

Algorithm complexity and quantization error effects are reduced compared to the prior art because of the use of a single image sensor. In addition, when using two conventional image sensors, the dependence on the distance between the two image sensors can be neglected.

In addition, since the desired unknown position is calculated from the positional information of the reference LED and the geometric relationship of the LED image in the image sensor plane, the calculation process is simple and robust from the surrounding environment.

BRIEF DESCRIPTION OF THE DRAWINGS The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate preferred embodiments of the invention and, together with the detailed description given below, serve to explain the technical idea of the invention; the invention should not be construed as being limited to them.
FIG. 1 is an overall system configuration diagram of an indoor position measuring apparatus using a single image sensor according to an embodiment of the present invention;
FIG. 2 is a schematic explanatory view of the geometrical relationship for one LED, LED A, in FIG. 1;
FIG. 3 is a system modeling configuration diagram for simulating an embodiment of the present invention;
FIG. 4 is a plan view of the configuration diagram shown in FIG. 3;
FIG. 5 is a side view of the configuration diagram shown in FIG. 3;
FIG. 6 is an explanatory diagram showing the illumination area of each LED in the configuration diagram of FIG. 3;
FIG. 7 is a perspective view showing the vertical FOV and the horizontal FOV of the camera equipped with the image sensor;
FIGS. 8 to 11 are graphs showing, as simulation results for an embodiment of the present invention, the position measurement error on each axis for a fixed unknown position P;
FIGS. 12 to 15 are graphs showing, as simulation results for an embodiment of the present invention, the 2D positioning error distribution when the number of pixels is 3,000;
FIGS. 16 to 19 are graphs showing, on each axis, the position measurement error of the simulation results for an embodiment of the present invention and that of the prior art.

Hereinafter, the configuration of the present invention will be described in detail with reference to the accompanying drawings. While the invention is susceptible to various modifications and alternative forms, specific embodiments thereof are shown by way of example in the drawings and are herein described in detail.

It is noted that the terms "comprises" or "having" in this application are intended to specify the presence of stated features, integers, steps, operations, elements, parts, or combinations thereof, but do not preclude, as a matter of principle, the presence or addition of one or more other features, integers, steps, operations, elements, parts, or combinations thereof.

Also, unless otherwise defined, all terms used herein, including technical and scientific terms, have the same meaning as commonly understood by one of ordinary skill in the art to which this invention belongs. Terms such as those defined in commonly used dictionaries are to be interpreted as having a meaning consistent with their meaning in the context of the related art, and are not to be interpreted in an idealized or overly formal sense unless expressly so defined in this application.

Example  Configuration

FIG. 1 is an overall system configuration diagram of an indoor position measuring apparatus using a single image sensor according to an embodiment of the present invention. As shown in FIG. 1, the ceiling 30 of the room is provided with at least three LEDs (LED A, LED B, and LED C), and each LED has fixed three-dimensional coordinate data as follows: LED A(x1, y1, z1), LED B(x2, y2, z2), and LED C(x3, y3, z3). The three LEDs (LED A, LED B, and LED C) radiate their modulated three-dimensional coordinate data through visible light communication (VLC).

The camera 5 may be the camera of a smartphone or a navigation device, and contains an image sensor 20 that is spaced apart from the lens 10 by the focal distance f for light reception.

The lens 10 collects the light 40 and forms an image on the image sensor 20. The center of the lens 10 is defined as P(x, y, z). In the present invention, P(x, y, z) is the unknown position to be determined, and is regarded as the current position of the user, vehicle, robot, or the like holding the camera 5.

The image sensor 20 is a single sensor (only one is used) and may be an image pickup device or a plurality of photodiodes. The image sensor 20 consists of a two-dimensional array of detector elements, so that each element, or pixel, can act as a separate photosensor, and multiple LED signals can be detected and demodulated simultaneously by the single image sensor.
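As an illustration of how such a pixel-array sensor yields the image centers used in the positioning algorithm, the centroid of a bright LED spot can be taken as the image center R. This is a minimal sketch on synthetic data (the frame size, pixel pitch, and spot location are assumed values, not from the patent); a practical receiver would additionally demodulate the VLC signal carried by each spot.

```python
import numpy as np

# Synthetic 12 x 12 frame with one bright LED spot (assumed data).
frame = np.zeros((12, 12))
frame[2:5, 6:9] = 1.0                  # 3 x 3 block of lit pixels

ys, xs = np.nonzero(frame > 0.5)       # pixels belonging to the spot
row_R, col_R = ys.mean(), xs.mean()    # centroid = center R of the LED image
assert (row_R, col_R) == (3.0, 7.0)

# Distance i_1 from the sensor center Q to R, in meters (assumed pixel pitch):
pitch = 1.6e-6
q_row = q_col = (frame.shape[0] - 1) / 2   # Q at the center of the array
i_1 = np.hypot(row_R - q_row, col_R - q_col) * pitch
```

The same centroiding is applied per spot when several LEDs are imaged at once; sub-pixel accuracy of the centroid is what the quantization-error discussion later in the document is about.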

FIG. 2 is a schematic explanatory view of the geometrical relationship for one LED, LED A, in FIG. 1. As shown in FIG. 2, the light radiated from LED A(x1, y1, z1) passes through the center point (point P) of the lens 10 and forms an image at point R on the image sensor 20. Here, H is the vertical distance from LED A(x1, y1, z1) to the horizontal plane of the lens 10, and d1 is the straight-line distance from LED A(x1, y1, z1) to the center point P. The distances from the three reference LEDs A, B, and C to the point P are denoted d1, d2, and d3, respectively, and the distance from a specific LED to the point P is calculated from the focal length f of the lens and the geometric relationship of the positions of the LED images (R). f, the focal length of the lens 10, is a predetermined, known value. Q is the center point of the image sensor 20 and is located directly below the center point (point P). i1 is the straight-line distance from Q to R, and c1 is the hypotenuse of the triangle PQR.

Although only LED A is shown in FIG. 2, this is solely for explanation and understanding; the same logic applies to LED B and LED C.
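The FIG. 2 geometry can be checked numerically. The values below are assumed purely for illustration; the point is that the ray from LED A through P to R is a straight line, so the LED-side triangle and the sensor-side triangle PQR are similar and d1/c1 = H/f.

```python
import math

f = 0.004     # focal length of the lens, 4 mm (assumed)
i1 = 0.001    # offset of the LED image center R from the sensor center Q (assumed)
H = 3.5       # vertical distance from LED A to the lens plane, meters (assumed)

c1 = math.hypot(f, i1)    # hypotenuse of triangle PQR
d1 = H * c1 / f           # similar triangles: d1 / c1 = H / f
rho1 = H * i1 / f         # horizontal distance from P to the point below LED A

# d1 must agree with the Pythagorean distance built from H and rho1:
assert abs(d1 - math.hypot(H, rho1)) < 1e-9
```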

Example  Position measurement Algorithm

Hereinafter, a position measurement algorithm for calculating the indoor position of the system according to the present invention will be described.

First, the geometric relationship of the proposed position measurement algorithm is shown in detail in FIG. 2. If the distance between the center R of the LED image formed on the image sensor 20 and the center Q of the image sensor is i_k (k = 1, 2, 3), then the distances d_k (k = 1, 2, 3) can be calculated through the following equations. The vertical distance H between the LEDs and the lens plane is also an unknown value. We assume that the reference LEDs, beginning with LED A(x1, y1, z1), are installed in the ceiling 30 of the room.

[Equation 1] d_k = (H/f) · √(f² + i_k²), k = 1, 2, 3

The vertical distance of the LED A is expressed by Equation (2).

[Equation 2] z1 = H + z

where z is the vertical coordinate of the unknown location P. The same holds for all the other reference LEDs; see Equations (3) and (4).

[Equation 3] z2 = H + z

[Equation 4] z3 = H + z

The distance between the center Q of the image sensor and the center R of the image of LED A is i1. The hypotenuse c1 of the triangle PQR is then expressed as shown in Equation (5).

[Equation 5] c1 = √(f² + i1²)

Here, f is the focal length of the lens. Since the triangle on the LED side (with vertical leg H and hypotenuse d1) and the triangle PQR are similar, Equation (6) holds.

[Equation 6] d1 = (H/f) · c1 = (H/f) · √(f² + i1²)

Considering the coordinates of LED A and the unknown point P, the quadratic equation shown in Equation (7) is established.

[Equation 7] (x1 − x)² + (y1 − y)² + (z1 − z)² = d1²

Substituting Equation (4) and Equation (6) into Equation (7) yields Equation (8).

[Equation 8] (x1 − x)² + (y1 − y)² + H² = (H²/f²) · (f² + i1²)

In the same way, for the three LEDs (LED A(x1, y1, z1), LED B(x2, y2, z2), and LED C(x3, y3, z3)), the three equations [Equation 9], [Equation 10], and [Equation 11] can be obtained.

[Equation 9] (x1 − x)² + (y1 − y)² = (H²/f²) · i1²

[Equation 10] (x2 − x)² + (y2 − y)² = (H²/f²) · i2²

[Equation 11] (x3 − x)² + (y3 − y)² = (H²/f²) · i3²

Expanding [Equation 9], [Equation 10], and [Equation 11] gives [Equation 12], [Equation 13], and [Equation 14], respectively.

[Equation 12] x1² − 2x1x + x² + y1² − 2y1y + y² = (H²/f²) · i1²

[Equation 13] x2² − 2x2x + x² + y2² − 2y2y + y² = (H²/f²) · i2²

[Equation 14] x3² − 2x3x + x² + y3² − 2y3y + y² = (H²/f²) · i3²

Subtracting [Equation 13] from [Equation 12] yields [Equation 15].

[Equation 15] (x1² − x2²) + (y1² − y2²) − 2x(x1 − x2) − 2y(y1 − y2) = (H²/f²)(i1² − i2²)

Subtracting Equation (14) from Equation (12) yields Equation (16).

[Equation 16] (x1² − x3²) + (y1² − y3²) − 2x(x1 − x3) − 2y(y1 − y3) = (H²/f²)(i1² − i3²)

[Equation 15] and [Equation 16] can be expressed by a matrix operation as shown in Equation 17 below.

[Equation 17]
[x, y]ᵀ = (1/2) · [[x1 − x2, y1 − y2], [x1 − x3, y1 − y3]]⁻¹ · [(x1² − x2²) + (y1² − y2²) − (H²/f²)(i1² − i2²), (x1² − x3²) + (y1² − y3²) − (H²/f²)(i1² − i3²)]ᵀ

A quadratic equation in H² can be obtained by substituting the expressions x(H²) and y(H²) from [Equation 17] for x and y in Equation (8). The value of H² can then be obtained by the quadratic formula. Finally, the unknown position P(x, y, z) is found by substituting H back into [Equation 2] and [Equation 17]. This problem can also be solved geometrically using the Apollonius circle.
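The algebra of this section can be sketched in code. The following is a minimal illustration, not the patented implementation: it solves the linear system of [Equation 17] for x and y as functions of H², substitutes them into [Equation 9] to obtain a quadratic in t = H², and recovers the candidate positions P. All numeric values (LED coordinates, focal length) are assumed for illustration, and the LED B coordinates in particular are hypothetical.

```python
import numpy as np

def locate(leds, i_meas, f):
    """Sketch of the single-image-sensor positioning algorithm.

    leds   : (3, 3) array of reference LED coordinates (x_k, y_k, z_k),
             all mounted at the same ceiling height
    i_meas : (3,) distances i_k from the sensor center Q to the LED image
             centers R_k on the sensor plane
    f      : focal length of the lens (same units as i_meas)

    Returns a list of candidate positions P = (x, y, z); the Apollonius
    construction can admit two roots, so the caller keeps the physical one.
    """
    (x1, y1, z1), (x2, y2, _), (x3, y3, _) = leds
    a = (np.asarray(i_meas) / f) ** 2            # a_k = i_k^2 / f^2

    # Equations (15)-(17): M @ [x, y] = (b - t * c) / 2, with t = H^2
    M = np.array([[x1 - x2, y1 - y2],
                  [x1 - x3, y1 - y3]], dtype=float)
    b = np.array([x1**2 - x2**2 + y1**2 - y2**2,
                  x1**2 - x3**2 + y1**2 - y3**2])
    c = np.array([a[0] - a[1], a[0] - a[2]])
    p = np.linalg.solve(M, b) / 2                # [x, y] = p + q * t
    q = -np.linalg.solve(M, c) / 2

    # Substitute into Equation (9): (x1-x)^2 + (y1-y)^2 = a_1 * t,
    # which gives a quadratic A*t^2 + B*t + C = 0 in t = H^2.
    dx, dy = x1 - p[0], y1 - p[1]
    A = q[0]**2 + q[1]**2
    B = -2 * (q[0] * dx + q[1] * dy) - a[0]
    C = dx**2 + dy**2
    roots = np.roots([A, B, C]) if A > 0 else np.array([-C / B])

    cands = []
    for t in roots:
        if abs(complex(t).imag) < 1e-12 and t.real > 0:
            H = np.sqrt(t.real)
            x, y = p + q * t.real
            cands.append((x, y, z1 - H))         # Equation (2): z = z1 - H
    return cands

# Forward model for a quick check (all values assumed; LED B is hypothetical):
leds = np.array([[3.0, 0.5, 5.0], [0.5, 4.0, 5.0], [4.0, 4.0, 5.0]])
P_true = np.array([3.2, 3.5, 1.5])
f = 0.004                                        # 4 mm focal length (assumed)
H = leds[0, 2] - P_true[2]
rho = np.hypot(leds[:, 0] - P_true[0], leds[:, 1] - P_true[1])
i_meas = f * rho / H                             # pinhole model: i = f * rho / H
cands = locate(leds, i_meas, f)
err = min(np.linalg.norm(np.array(cand) - P_true) for cand in cands)
assert err < 1e-6
```

With noise-free measurements the true position is recovered to machine precision; with quantized i_k the residual error behaves as described in the simulation sections below.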

Example  Simulation model

The system configuration for the simulation model is depicted in Figures 3-7. FIG. 3 is a system modeling configuration diagram for simulating an embodiment of the present invention, FIG. 4 is a plan view of the configuration diagram shown in FIG. 3, and FIG. 5 is a side view of the configuration diagram shown in FIG. FIG. 6 is an explanatory view showing an illumination area of each LED in the configuration diagram of FIG. 3, and FIG. 7 is a perspective view showing a vertical FOV in the vertical direction and a horizontal horizontal FOV of the camera equipped with the image sensor.

As shown in Figs. 3 to 7, three LEDs are located on the ceiling. The results of the proposed algorithm were obtained by simulation using MATLAB. The position measuring area 70 will depend on the field-of-view (FOV) of both the LED and the camera as shown in Figs. 5 and 7.

To calculate the size of the scene, the vertical and horizontal angular fields of view (FOV) of the camera must be taken into account. Today, every smartphone has a built-in camera consisting of a lens and an image sensor. The specific simulation parameters were selected as shown in [Table 1].

[Table 1: simulation parameters (image in the original)]

When the image sensor plane is parallel to the ceiling, the approximate position measurement area 70 appears in the central region of the illumination areas 60, as shown in FIG. 6. If the distance between two of the LEDs is less than r, this area will increase.

r = H · tan(φ_1/2)

where H is the vertical distance between the LED and the lens of the image sensor, and φ_1/2 is the half-angle of the FOV of the LED light.
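The radius of each LED's illumination area follows directly from this relation. The height and the LED FOV below are assumed, illustrative values:

```python
import math

H = 3.5               # vertical LED-to-lens distance in meters (assumed)
fov_led_deg = 60.0    # full FOV of the LED light in degrees (assumed)

phi_half = math.radians(fov_led_deg / 2)
r = H * math.tan(phi_half)     # radius of the illumination area 60

assert abs(r - 2.0207) < 1e-3  # 3.5 * tan(30 deg)
```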

The line-of-sight (LOS) path between the LEDs and the image sensor 20 must be maintained, and the three LEDs must also be located within the FOV of the image sensor 20.

Example  Analysis of simulation results

Assuming that the unknown position is located within the position measurement area 70 in the plan view of FIG. 4, the simulation of the present invention was performed for a fixed unknown position (3.2, 3.5, 1.5). With the other factors held constant, the number of pixels per line of the image sensor 20 was increased from 500 to 3,000 in steps of 20 pixels.

FIGS. 8 to 11 are graphs showing, as simulation results for an embodiment of the present invention, the position measurement error on each axis for the fixed unknown position P. In FIGS. 8 to 11, the horizontal axis represents the number of pixels per line, and the vertical axes represent the X-axis, Y-axis, Z-axis, and RMS position errors (in m), respectively.

As can be seen from FIGS. 8 to 11, the position measurement error decreases as the number of pixels increases. The error fluctuates because of the quantization error of the pixels: the calculation assumes that the center of the LED image lies at the center of a pixel, but the actual center of the LED image is not always at the pixel center, so a quantization error results. FIGS. 8 to 11 show that the position measurement error on all axes and the root mean square (RMS) error are less than 0.003 m (0.3 cm), and when the number of pixels exceeds 2,000 per line, the position measurement error on all axes and the RMS error are less than 0.001 m (0.1 cm).
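The pixel-quantization effect described above can be illustrated with a toy model. The sensor line width and the true image offset below are assumed values: the measured i_k snaps to the nearest pixel center, so the worst-case error is half a pixel pitch and shrinks as the number of pixels per line grows.

```python
sensor_width = 0.0048     # width of one sensor line in meters (assumed)

def quantize(i_true, pixels_per_line):
    """Snap a true image offset to the nearest pixel center."""
    pitch = sensor_width / pixels_per_line
    return round(i_true / pitch) * pitch

i_true = 0.00123          # true offset of an LED image center (assumed)
err_500 = abs(quantize(i_true, 500) - i_true)
err_3000 = abs(quantize(i_true, 3000) - i_true)

assert err_3000 <= err_500                      # finer pixels, smaller error
assert err_3000 <= sensor_width / 3000 / 2      # bounded by half a pixel pitch
```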

Next, it is assumed that the unknown position changes within the room while the number of pixels and the other simulation parameters remain constant. The distance between two adjacent positions is kept at 5 cm on the x and y axes, giving 101 × 101 = 10,201 experimental positions on the horizontal plane. FIGS. 12 to 15 show the two-dimensional position measurement error distribution of the present invention when the number of pixels is 3,000. In FIGS. 12 to 15, "*", "+", and "o" respectively indicate the positions of the three reference LEDs, LED A (3, 0.5, 5), LED B, and LED C (4, 4, 5). When the X-Y plane is defined as a 5 m × 5 m room, it can be seen on all axes that the position error is high near the edges of the room and decreases toward the central area beneath the three reference LEDs. This is due to the directionality between the image sensor 20 and the three reference LEDs (LEDs A, B, and C): when the user is outside the position measurement area 70 near an edge, the three reference LEDs (LEDs A, B, and C) do not all fall within the imaging area of the image sensor 20.

Based on the simulation results, the embodiment of the present invention was shown to provide a position measurement accuracy of 0.001 m within the position measurement area 70 when the number of pixels exceeds 3,000. The approximate position measurement area 70 is indicated by a red dotted line (a roughly triangular region with rounded sides) in FIG. 4. Although the simulation results show a position error of less than 0.001 m when the number of pixels exceeds 3,000, the settings can be changed to meet acceptable design criteria.

Example  Comparison of simulation results

Hereinafter, an embodiment of the present invention is compared with the prior art (Korean Patent Registration No. 10-1174126) to demonstrate its significant advantages. For this purpose, the simulation parameters were reset according to the prior work, as shown in [Table 2].

[Table 2: simulation parameters of the prior work (image in the original)]

The simulation was designed to address the major problems of the prior art: the complexity of combining the least squares and vector estimation methods, the combined quantization error caused by the two image sensors, and the algorithmic dependence on the distance between the two image sensors.

FIGS. 16 to 19 are graphs showing, on each axis, the position measurement error of the simulation results for the embodiment of the present invention together with that of the prior art. In FIGS. 16 to 19, the left and right vertical axes indicate the position measurement error of the algorithm of the prior art and that of the embodiment of the present invention, respectively. As can be seen in FIGS. 16 to 19, the position measurement error on every axis is substantially reduced, demonstrating that the algorithm of the present invention outperforms the prior art. This is due to the simplicity of the algorithm of the present invention: the proposed system consists of a single image sensor and only three reference LEDs. Because only a single image sensor 20 is used, the algorithmic dependence on the distance between two image sensors is eliminated. In addition, the combined quantization error effect caused by using two image sensors is sufficiently reduced with a single image sensor. To solve the various quadratic equations, a conventional method known to those skilled in the art was used, which also reduced the computational complexity.

From FIG. 19, when the number of pixels is 3,000, the RMS position measurement errors are 0.0097 m for the conventional technique and 0.0002823 m for the method according to the present invention. This indicates that the position measurement error of the present invention is reduced by about 97% compared to the prior art.
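The quoted 97% figure follows directly from the two RMS values:

```python
rms_prior = 0.0097        # prior art, two image sensors (FIG. 19)
rms_proposed = 0.0002823  # present method, single image sensor (FIG. 19)

reduction = 1 - rms_proposed / rms_prior
assert abs(reduction - 0.97) < 0.01   # about a 97% reduction
```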

Although the present invention has been described with reference to the preferred embodiments set forth above, it will be readily appreciated by those skilled in the art that various modifications and variations can be made without departing from the spirit and scope of the invention, and it is obvious that all such modifications fall within the scope of the appended claims.

5: Camera,
10: lens,
20: Image sensor,
30: ceiling,
40: visible light (light),
50: projection distance of i 1 ,
60: illumination area,
70: Position measuring area,
f: the focal length of the lens,
P: the center of the lens (the indoor position to be measured),
Q: The center of the image sensor,
R: Center of individual LED image,
Rx: Receiver (camera).

Claims (12)

A single image sensor (20) for receiving, through a lens (10), visible light containing the coordinate information of the LEDs from a plurality of LEDs installed in a room; and
a controller for detecting the coordinate information based on the output signal of the image sensor (20) and determining the coordinates of the center point (P) of the lens (10), calculated on the basis of the coordinate information, as the current indoor position,
wherein the LEDs are at least three mutually spaced LEDs (LEDs A, B, and C), and each of the LEDs (LEDs A, B, and C) radiates its own coordinate information as visible light,
wherein the controller calculates the coordinates of the center point (P) of the lens (10) as the current indoor position based on the three-dimensional (X-Y-Z) coordinate information of each LED and the focal length f of the lens (10), and
wherein the controller calculates the coordinates (x, y) of the center point (P) as the current indoor position according to the following equation:

[x, y]ᵀ = (1/2) · M⁻¹ · [K1, K2]ᵀ, where M = [[x1 − x2, y1 − y2], [x1 − x3, y1 − y3]]

Here, H is the vertical distance between the LEDs and the lens (10), z1 = z2 = z3 = H + z, i_k is the distance between the center of the image sensor (20) and the center of the image of the k-th LED, and f is the focal length of the lens (10), with

K1 = (x1² − x2²) + (y1² − y2²) − (H²/f²)(i1² − i2²),

K2 = (x1² − x3²) + (y1² − y3²) − (H²/f²)(i1² − i3²)
The apparatus according to claim 1, wherein the LEDs are fixed to the ceiling (30) of the room.
(Claims 3 and 4: deleted)

The apparatus according to claim 1, wherein the image sensor (20) is an image pickup device or a plurality of photodiodes.

(Claim 6: deleted)

Receiving, with a single image sensor (20) through a lens (10), visible light containing the coordinate information of the LEDs from a plurality of LEDs installed in a room;
the controller calculating the coordinate information based on the output signal of the image sensor (20); and
the controller determining the coordinates (x, y) of the center point (P) of the lens (10), calculated on the basis of the coordinate information, as the current indoor position,
wherein the LEDs are at least three mutually spaced LEDs (LEDs A, B, and C), and each LED (LEDs A, B, and C) radiates its own coordinate information A(x1, y1, z1), B(x2, y2, z2), and C(x3, y3, z3) as visible light,
wherein the controller calculates the center point (P) of the lens (10) as the current indoor position based on the three-dimensional (X-Y-Z) coordinate information and the focal length f of the lens (10), and
wherein the controller calculates the coordinates (x, y) of the center point (P) as the current indoor position by the following equation:

[x, y]ᵀ = (1/2) · M⁻¹ · [K1, K2]ᵀ, where M = [[x1 − x2, y1 − y2], [x1 − x3, y1 − y3]]

Here, H is the vertical distance between the LEDs and the lens (10), z1 = z2 = z3 = H + z, i_k is the distance between the center of the image sensor (20) and the center of the image of the k-th LED, and f is the focal length of the lens (10), with

K1 = (x1² − x2²) + (y1² − y2²) − (H²/f²)(i1² − i2²),

K2 = (x1² − x3²) + (y1² − y3²) − (H²/f²)(i1² − i3²)
The method of claim 7, wherein the LEDs are fixed to the ceiling (30) of the room.

(Claims 9 to 12: deleted)
KR1020150150904A 2015-10-29 2015-10-29 Indoor Positioning Device Using a Single Image Sensor and Method Thereof KR101780122B1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
KR1020150150904A KR101780122B1 (en) 2015-10-29 2015-10-29 Indoor Positioning Device Using a Single Image Sensor and Method Thereof

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
KR1020150150904A KR101780122B1 (en) 2015-10-29 2015-10-29 Indoor Positioning Device Using a Single Image Sensor and Method Thereof

Publications (2)

Publication Number Publication Date
KR20170049953A KR20170049953A (en) 2017-05-11
KR101780122B1 true KR101780122B1 (en) 2017-09-19

Family

ID=58741989

Country Status (1)

Country Link
KR (1) KR101780122B1 (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20230101650A (en) 2021-12-29 2023-07-06 주식회사 메타모스 System and method for calculating position of skater

Families Citing this family (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109188358B (en) * 2018-08-31 2023-03-17 中山大学 High-precision visible light positioning method based on imaging sensor
CN110133592B (en) * 2019-05-09 2022-11-25 哈尔滨师范大学 Indoor two-point positioning method based on visible light communication
KR102204191B1 (en) 2019-07-12 2021-01-18 성균관대학교산학협력단 Visible light based indoor positioning apparatus considering tilt of receiver and indoor positioning method using thereof
CN110441807A (en) * 2019-07-29 2019-11-12 阎祯祺 A kind of localization method and system of indoor user mobile terminal
CN111751784B (en) * 2020-06-23 2023-11-21 上海申核能源工程技术有限公司 Three-dimensional light positioning system of nuclear power station

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2009031216A (en) * 2007-07-30 2009-02-12 Nakagawa Kenkyusho:Kk Position detector and photographing device
JP2011141142A (en) 2010-01-05 2011-07-21 Sharp Corp Range finder and electronic equipment

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
후인탄팟 et al., "Image Sensor-Based Indoor Positioning Algorithm," The Journal of Korean Institute of Communications and Information Sciences (KICS), Oct. 2015, 40(10), pp. 2062-2064

Also Published As

Publication number Publication date
KR20170049953A (en) 2017-05-11

Similar Documents

Publication Publication Date Title
KR101780122B1 (en) Indoor Positioning Device Using a Single Image Sensor and Method Thereof
US10805535B2 (en) Systems and methods for multi-camera placement
US9109889B2 (en) Determining tilt angle and tilt direction using image processing
CN103090846B (en) A kind of range unit, range-measurement system and distance-finding method thereof
US20190053858A1 (en) Method and Apparatus for Wide Area Multi-Body 6D Pose Tracking System
Yamazato et al. Image sensor based visible light communication and its application to pose, position, and range estimations
US20140156219A1 (en) Determining tilt angle and tilt direction using image processing
US8542368B2 (en) Position measuring apparatus and method
CN106970354B (en) A kind of 3-D positioning method based on multiple light courcess and photosensor array
KR20120006306A (en) Indoor positioning apparatus and method
CN103782232A (en) Projector and control method thereof
Rahman et al. Indoor location estimation using visible light communication and image sensors
Hossen et al. Performance improvement of indoor positioning using light-emitting diodes and an image sensor for light-emitting diode communication
US20150247912A1 (en) Camera control for fast automatic object targeting
KR20190032791A (en) Real-Time Positioning System and Contents Providing Service System Using Real-Time Positioning System
Bergen et al. Design and implementation of an optical receiver for angle-of-arrival-based positioning
CN104865552A (en) Visible light positioning system and method based on two image sensors
JP2018004357A (en) Information processor, method for controlling information processor, and program
KR20170058612A (en) Indoor positioning method based on images and system thereof
US20230252666A1 (en) Systems and methods of measuring an object in a scene of a captured image
CN100582653C (en) System and method for determining position posture adopting multi- bundle light
CN109323691A (en) A kind of positioning system and localization method
Rodríguez-Navarro et al. Indoor positioning system based on PSD sensor
Zhang et al. Visual-inertial fusion based positioning systems
KR20120068668A (en) Apparatus and method for detecting a position

Legal Events

Date Code Title Description
A201 Request for examination
E902 Notification of reason for refusal
E701 Decision to grant or registration of patent right
GRNT Written decision to grant