KR101780122B1 - Indoor Positioning Device Using a Single Image Sensor and Method Thereof - Google Patents
- Publication number
- KR101780122B1 · KR1020150150904A · KR20150150904A
- Authority
- KR
- South Korea
- Prior art keywords
- image sensor
- lens
- leds
- led
- coordinate information
- Prior art date
Classifications
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S5/00—Position-fixing by co-ordinating two or more direction or position line determinations; Position-fixing by co-ordinating two or more distance determinations
- G01S5/18—Position-fixing by co-ordinating two or more direction or position line determinations; Position-fixing by co-ordinating two or more distance determinations using ultrasonic, sonic, or infrasonic waves
- G01S5/22—Position of source determined by co-ordinating a plurality of position lines defined by path-difference measurements
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N5/00—Details of television systems
- H04N5/30—Transforming light or analogous information into electric information
-
- H04N5/335—
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01J—MEASUREMENT OF INTENSITY, VELOCITY, SPECTRAL CONTENT, POLARISATION, PHASE OR PULSE CHARACTERISTICS OF INFRARED, VISIBLE OR ULTRAVIOLET LIGHT; COLORIMETRY; RADIATION PYROMETRY
- G01J1/00—Photometry, e.g. photographic exposure meter
- G01J1/42—Photometry, e.g. photographic exposure meter using electric radiation detectors
- G01J1/44—Electric circuits
- G01J2001/4446—Type of detector
- G01J2001/446—Photodiode
Landscapes
- Engineering & Computer Science (AREA)
- Multimedia (AREA)
- Signal Processing (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Radar, Positioning & Navigation (AREA)
- Remote Sensing (AREA)
- Length Measuring Devices By Optical Means (AREA)
Abstract
The present invention enables precise indoor positioning with one image sensor where no GPS signal is available. To this end, the device comprises a single image sensor (20) that receives, through a lens (10), visible light carrying the coordinate information of a plurality of LEDs installed in a room; and a controller that detects the coordinate information from the output signal of the image sensor (20) and calculates the coordinates of the center point (P) of the lens (10), derived from that information, as the current indoor position.
Description
BACKGROUND OF THE INVENTION
Outdoor position measurement achieves high accuracy with the help of GPS signals. In indoor environments, however (for example, parking lots, subways, airports, government offices, factories, and shopping centers), GPS is unavailable. Indoor navigation for humans and robots, interactive virtual games, resource development, item tracking, and location-based services (LBS) all require high precision. To measure position precisely indoors, a variety of positioning algorithms have been studied, based on triangulation, scene analysis, and proximity methods, using existing systems such as GPS, RF, wireless sensor networks, wireless LAN, Bluetooth, and VLC/OWC systems. These systems differ in precision, coverage, cost, complexity, sensitivity, and scalability.
Among these, indoor positioning using visible light communication (VLC) involves complexity that depends on angle measurement, changes in light intensity, the transit time between transmitter and receiver, and the like. These limitations ultimately increase the positional error and degrade system accuracy, because small errors in the system lead to large error rates.
In particular, Korean Patent Registration No. 10-1174126 exhibits position errors arising in the precise angle measurement, the received-signal-strength measurement, and the quadratic-equation calculation used for positioning. In that work, at least four LEDs of an LED array transmit three-dimensional coordinate information, which is received and demodulated by two image sensors near the unknown location. The unknown position is then computed from the geometric relationship of the LED images formed on the image-sensor planes, using a combination of least squares and vector estimation, with a precision better than 10 cm. Because it uses two image sensors, however, this method suffers from increased complexity and combined quantization error from both sensors. Such schemes assume that the center of each LED image falls at the center of a pixel, but the actual center of the LED image is not always at a pixel center, so a quantization error occurs. In addition, the positioning accuracy depends on the spacing between the two image sensors.
It is therefore necessary to develop a consistent algorithm that improves the positioning error of the system, reduces its complexity, and reduces the error of the system's dependent variables.
SUMMARY OF THE INVENTION It is therefore a first object of the present invention to provide an indoor position measuring device, and a method thereof, that uses a single image sensor to estimate an unknown position without requiring any angle measurement and without needing to consider the intensity of the received signal.
A second object of the present invention is to provide an indoor position measuring apparatus and method using a single image sensor that achieves high positioning accuracy by having at least three LEDs transmit their three-dimensional coordinate information over visible light communication.
A third object of the present invention is to provide an indoor position measuring device using a single image sensor, and a method thereof, that is robust to the surrounding environment by calculating the desired unknown position from the positional information of reference LEDs and the geometric relationship of the LED images in the image-sensor plane.
The above object of the present invention is achieved by an apparatus comprising: a single image sensor (20) for receiving, through a lens (10), visible light including the coordinate information of a plurality of LEDs installed in a room; and a controller for detecting the coordinate information from the output signal of the image sensor (20) and calculating the coordinates of the center point (P) of the lens (10), derived from that information, as the current indoor position.
Here, the LED is preferably fixed to the ceiling (30) of the room.
Further, it is more preferable that the LEDs are at least three mutually spaced LEDs (LED A, LED B, LED C), each of which irradiates its own coordinate information as visible light.
In addition, the controller calculates the center point (P) coordinates of the lens (10) as the current indoor position based on the three-dimensional coordinate information (XYZ coordinates) of each LED and the focal length f of the lens (10).
The image sensor (20) may be an image pickup device or a plurality of photodiodes.
The controller can calculate the center point (P) coordinates of the lens (10) as the current indoor position using the equations given below.
According to another aspect of the present invention, there is provided a method comprising the steps of: receiving, through a lens (10), visible light including the coordinate information of a plurality of LEDs installed in a room, using a single image sensor (20); calculating the coordinate information based on the output signal of the image sensor (20); and calculating the coordinates (x, y) of the center point (P) of the lens (10), derived from the coordinate information, as the current indoor position.
The LEDs are at least three mutually spaced LEDs (LED A, LED B, LED C), and each LED irradiates its own coordinate information A(x1, y1, z1), B(x2, y2, z2), and C(x3, y3, z3) as visible light.
The controller preferably calculates the center point (P) coordinates of the lens (10) as the current indoor position based on the three-dimensional coordinate information of the LEDs and the focal length f of the lens (10).
The controller calculates the center point P of the lens (10) as the current indoor position by the equation given below, where H is the vertical distance between the LEDs and the lens, z1 = z2 = z3 = H + z, ik is the distance between the center of the image sensor (20) and the center of the image of the k-th LED, and f is the focal length of the lens (10).
According to an embodiment of the present invention, an unknown position can be estimated in a room without a GPS signal, and no angle measurement is required.
By transmitting the three-dimensional coordinate information of at least three LEDs using visible light communication, indoor position measurement with high accuracy can be realized. There is therefore no need for auxiliary communication means such as RF communication or Bluetooth.
Because a single image sensor is used, algorithm complexity and the effect of quantization error are reduced compared to the prior art, and the dependence on the distance between two image sensors in the conventional two-sensor approach is eliminated.
In addition, since the desired unknown position is calculated from the positional information of the reference LEDs and the geometric relationship of the LED images in the image-sensor plane, the calculation process is simple and robust to the surrounding environment.
BRIEF DESCRIPTION OF THE DRAWINGS The accompanying drawings, which are incorporated in and constitute a part of the specification, illustrate preferred embodiments of the invention and, together with the description given below, serve to explain the technical idea of the invention; they should not be construed as limiting it.
1 is an overall system configuration diagram of an indoor position measuring apparatus using a single image sensor according to an embodiment of the present invention;
FIG. 2 is a schematic explanatory view for explaining the geometrical relationship with respect to one LED A in FIG. 1;
3 is a system modeling configuration diagram for simulating an embodiment of the present invention,
4 is a plan view of the configuration diagram shown in Fig. 3,
Fig. 5 is a side view of the configuration diagram shown in Fig. 3,
FIG. 6 is an explanatory diagram showing an illumination region of each LED in the configuration diagram of FIG. 3;
7 is a perspective view showing the vertical FOV and the horizontal FOV of the camera equipped with the image sensor.
8 to 11 are graphs showing position measurement errors for each axis with respect to a fixed unknown position P as a simulation result for an embodiment of the present invention.
12 to 15 are graphs showing a 2D positioning error distribution when the number of pixels is 3,000 as a simulation result of an embodiment of the present invention.
FIGS. 16 to 19 are graphs showing, for each axis, a position measurement error of the simulation result for the embodiment of the present invention and a position measurement error of the simulation result of the prior art.
Hereinafter, the configuration of the present invention will be described in detail with reference to the accompanying drawings. While the invention is susceptible to various modifications and alternative forms, specific embodiments thereof are shown by way of example in the drawings and are herein described in detail.
It is noted that the terms "comprises" or "having" in this application are intended to specify the presence of the stated features, integers, steps, operations, elements, parts, or combinations thereof, and do not, in principle, preclude the presence or addition of one or more other features, integers, steps, operations, components, parts, or combinations thereof.
Also, unless otherwise defined, all terms used herein, including technical and scientific terms, have the same meaning as commonly understood by one of ordinary skill in the art to which this invention belongs. Terms such as those defined in commonly used dictionaries are to be interpreted as having a meaning consistent with their meaning in the context of the related art, and are not to be interpreted in an idealized or overly formal sense unless expressly so defined in the present application.
Example Configuration
FIG. 1 is an overall system configuration diagram of an indoor position measuring apparatus using a single image sensor according to an embodiment of the present invention. Referring to FIG. 1, a plurality of LEDs fixed to the ceiling (30) irradiate visible light (40) carrying their coordinate information toward the receiver (Rx), a camera (5) comprising the lens (10) and the image sensor (20).
The lens (10) focuses the visible light (40) irradiated from the LEDs onto the image sensor (20).
The image sensor (20) converts the received light into an electrical output signal and may be an image pickup device or a plurality of photodiodes.
The controller detects the coordinate information from the output signal of the image sensor (20) and calculates the coordinates of the center point (P) of the lens (10) as the current indoor position.
FIG. 2 is a schematic explanatory view of the geometrical relationship for one LED, LED A. As shown in FIG. 2, the light irradiated from LED A (x1, y1, z1) passes through the center point P of the lens (10) and forms an image centered at point R on the image sensor (20).
Although only the LED A is shown in FIG. 2, it is only for the purpose of explanation and understanding, and the same logic is applied to the LED B and the LED C.
Example Position measurement Algorithm
Hereinafter, a position measurement algorithm for calculating the indoor position of the system according to the present invention will be described.
First, the geometric relationship of the proposed position measurement algorithm is shown in detail in FIG. 2. Taking the distance between the center R of the LED image formed on the image sensor (20) and the center Q of the image sensor, the geometry of FIG. 2 gives Equation (1).
The vertical distance of the LED A is expressed by Equation (2).
Here, z is the vertical distance of the unknown position P. The same holds for all the other reference LEDs; see Equations (3) and (4).
The distance between the center Q of the image sensor and the center R of the image of LED A is i1. The hypotenuse c1 of triangle PQR is then expressed as shown in Equation (5). Here, f is the focal length of the lens. Since triangle ABP and triangle PQR are similar, Equation (6) holds.
Considering the coordinates of the LED A and the unknown P point, a quadratic equation as shown in Equation (7) is established.
Substituting Equation (4) and Equation (6) into Equation (7) yields Equation (8).
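Since the equation images themselves are not reproduced in this text, the key relations can be reconstructed from the similar-triangle geometry described above. The following is a hedged reconstruction (the patent's exact equation numbering and notation may differ):

```latex
% Similar triangles (LED A at (x_1,y_1,z_1), lens centre P, image point R):
% horizontal scene distance : sensor-plane distance = H : f
\frac{\sqrt{(x_1-x)^2+(y_1-y)^2}}{H} = \frac{i_1}{f},
\qquad c_1=\sqrt{i_1^{2}+f^{2}},
\qquad z_1 = H + z .
% Squaring the similarity relation gives the circle equation for each LED k:
(x_k-x)^2+(y_k-y)^2=\frac{H^{2}\,i_k^{2}}{f^{2}},\qquad k=1,2,3 .
```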
Applying the same reasoning to all three LEDs, LED A (x1, y1, z1), LED B (x2, y2, z2), and LED C (x3, y3, z3), yields Equations (9), (10), and (11), which can be rearranged into Equations (12), (13), and (14), respectively.
Subtracting Equation (13) from Equation (12) yields Equation (15).
Subtracting Equation (14) from Equation (12) yields Equation (16).
Equations (15) and (16) can be expressed as a matrix operation, as shown in Equation (17).
A quadratic equation in H2 is obtained by substituting x(H2) and y(H2) from Equation (17) for x and y in Equation (8). The value of H2 is then obtained from the quadratic formula. Finally, the unknown position P (x, y, z) is found by substituting H back into Equations (2) and (17). Geometrically, this problem can also be solved using Apollonius circles.
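The derivation above can be implemented directly. The sketch below is a minimal reconstruction under stated assumptions (the LED coordinates, 4 mm focal length, and all names are illustrative, not from the patent): it forms the affine expressions x(H2) and y(H2) from the pairwise differences of the circle equations, solves the resulting quadratic in H2, and keeps the geometrically valid root.

```python
import math

def locate(leds, i_dist, f):
    """Recover the lens centre P = (x, y, z) from three ceiling LEDs.

    leds   : three (x_k, y_k, z_k) tuples, all LEDs at the same height
    i_dist : three i_k values -- distance on the sensor plane from the
             sensor centre Q to the centre R of the k-th LED image
    f      : focal length of the lens
    Returns every geometrically valid candidate (usually exactly one).
    """
    (x1, y1, z1), (x2, y2, z2), (x3, y3, z3) = leds
    g = [(i / f) ** 2 for i in i_dist]            # g_k = i_k^2 / f^2

    # Subtracting pairs of (x_k-x)^2 + (y_k-y)^2 = g_k * t  (t = H^2)
    # gives two equations linear in x and y with an affine term in t.
    rows = []
    for (xa, ya, ga), (xb, yb, gb) in (((x1, y1, g[0]), (x2, y2, g[1])),
                                       ((x1, y1, g[0]), (x3, y3, g[2]))):
        rows.append((2 * (xa - xb), 2 * (ya - yb),
                     xa * xa + ya * ya - xb * xb - yb * yb, -(ga - gb)))
    (a11, a12, c1, h1), (a21, a22, c2, h2) = rows
    det = a11 * a22 - a12 * a21
    p1 = (c1 * a22 - c2 * a12) / det              # x = p1 + q1 * t
    q1 = (h1 * a22 - h2 * a12) / det
    p2 = (a11 * c2 - a21 * c1) / det              # y = p2 + q2 * t
    q2 = (a11 * h2 - a21 * h1) / det

    # Substitute back into the first circle equation -> quadratic in t.
    u, v = x1 - p1, y1 - p2
    A = q1 * q1 + q2 * q2
    B = -(2 * (u * q1 + v * q2) + g[0])
    C = u * u + v * v
    disc = B * B - 4 * A * C
    if disc < 0:
        return []
    cands = []
    for t in ((-B + math.sqrt(disc)) / (2 * A),
              (-B - math.sqrt(disc)) / (2 * A)):
        if t > 0 and math.sqrt(t) <= z1:          # receiver above the floor
            H = math.sqrt(t)
            cands.append((p1 + q1 * t, p2 + q2 * t, z1 - H))
    return cands

# Forward check: project a known position through the lens, then recover it.
f = 0.004                                   # 4 mm focal length (assumed)
leds = [(1.0, 1.0, 3.0), (4.0, 1.0, 3.0), (2.5, 4.0, 3.0)]
true_p = (2.0, 2.0, 1.0)
H = leds[0][2] - true_p[2]
i_dist = [f * math.hypot(xk - true_p[0], yk - true_p[1]) / H
          for (xk, yk, zk) in leds]
candidates = locate(leds, i_dist, f)
```

The spurious second root of the quadratic corresponds to the other intersection of the Apollonius circles; it is rejected here by requiring the receiver to lie between the floor and the ceiling.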
Example Simulation model
The system configuration for the simulation model is depicted in Figures 3-7. FIG. 3 is a system modeling configuration diagram for simulating an embodiment of the present invention, FIG. 4 is a plan view of the configuration diagram shown in FIG. 3, and FIG. 5 is a side view of the configuration diagram shown in FIG. FIG. 6 is an explanatory view showing an illumination area of each LED in the configuration diagram of FIG. 3, and FIG. 7 is a perspective view showing a vertical FOV in the vertical direction and a horizontal horizontal FOV of the camera equipped with the image sensor.
As shown in FIGS. 3 to 7, three LEDs are located on the ceiling. The results of the proposed algorithm were obtained by simulation using MATLAB. The camera (5), comprising the lens (10) and the image sensor (20), is placed at the unknown position to be measured.
In order to calculate the size of the scene, the vertical and horizontal angular fields of view (FOV) of the camera must be taken into account. Today, every smartphone has a built-in camera consisting of a lens and an image sensor. The specific simulation parameters were selected as shown in [Table 1].
When the image-sensor plane is parallel to the ceiling scene, the size of the captured scene is determined by the camera geometry, where H is the vertical distance between the LED and the lens of the image sensor and the other parameter is half the FOV angle of the LED light.
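For a camera looking straight up at a parallel ceiling, the side of the captured scene grows linearly with the vertical distance H, following the standard pinhole/FOV relation. The sketch below illustrates this; the FOV angle and distance are assumed example values, not parameters from [Table 1]:

```python
import math

def scene_side(H, fov_deg):
    """Side length of the ceiling area seen by the camera:
    2 * H * tan(FOV/2), with the sensor plane parallel to the ceiling."""
    return 2.0 * H * math.tan(math.radians(fov_deg) / 2.0)

side = scene_side(3.0, 60.0)   # H = 3 m, 60-degree FOV (assumed values)
```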
The line-of-sight (LOS) path between each LED and the image sensor (20) is assumed to be maintained.
Example Analysis of simulation results
It is first assumed that the unknown position is located at a fixed point within the position measurement area (70).
8 to 11 are graphs showing position measurement errors for each axis with respect to a fixed unknown position P as a simulation result for an embodiment of the present invention. 8 to 11, the horizontal axis represents the number of pixels per line, and the vertical axis represents the X axis, Y axis, Z axis, and RMS position error (in units of m).
As can be seen from FIGS. 8 to 11, the position measurement error decreases as the number of pixels increases. The error fluctuates because of pixel quantization: the algorithm assumes the center of each LED image lies at the center of a pixel, but the actual center is not always at a pixel center, so a quantization error results. FIGS. 8 to 11 show that the position measurement error on all axes, and the root-mean-square (RMS) error, are less than 0.003 m (0.3 cm); when the number of pixels per line exceeds 2,000, they fall below 0.001 m (0.1 cm).
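The pixel-count dependence described above can be illustrated with a simple bound: if the LED-image centre can be off by at most half a pixel pitch, the similar-triangle magnification H/f maps that error onto the scene. The sensor width, H, and f below are assumed illustration values, not the patent's [Table 1] parameters:

```python
def quantization_bound(n_pixels, sensor_width=0.00565, H=3.0, f=0.004):
    """Worst-case horizontal scene error caused by assuming the LED-image
    centre lies at a pixel centre (half-pixel error magnified by H/f)."""
    half_pitch = sensor_width / n_pixels / 2.0
    return (H / f) * half_pitch

errs = [quantization_bound(n) for n in (1000, 2000, 3000)]
```

Under these assumed parameters the bound shrinks inversely with the pixel count, matching the trend in FIGS. 8 to 11.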
Next, it is assumed that the unknown position changes within the room while the number of pixels and the other simulation parameters remain constant. The distance between two adjacent test positions is kept at 5 cm on the x and y axes, giving 101 × 101 = 10,201 experimental positions on the horizontal plane. FIGS. 12 to 15 show the two-dimensional position measurement error distribution of the present invention when the number of pixels is 3,000. In FIGS. 12 to 15, "*", "+" and "o" indicate the positions of the three reference LEDs: LED A (3, 0.5, 5), LED B, and LED C (4, 4, 5). With the X-Y plane defined as a 5 m × 5 m room, the position error on all axes is high near the edges of the room and decreases toward the central area spanned by the three reference LEDs. This is due to the directionality between the LEDs and the image sensor (20).
Based on these simulation results, the embodiment of the present invention provides a position measurement accuracy of about 0.001 m near the center of the position measurement area (70).
Example Comparison of simulation results
Hereinafter, an embodiment of the present invention is compared with the prior art (Korean Patent Registration No. 10-1174126) to demonstrate its advantages. The simulation parameters were re-established to match the previous work, as shown in [Table 2].
The simulation addresses the major problems of the prior art: the complexity of combining least-squares and vector-estimation methods, the combined quantization error of two image sensors, and the algorithm's dependence on the distance between the two image sensors.
FIGS. 16 to 19 show, for each axis, the position measurement error of the simulation for the embodiment of the present invention alongside that of the prior art. In FIGS. 16 to 19, the left and right vertical axes indicate the position measurement error of the prior-art algorithm and of the embodiment of the present invention, respectively. As can be seen in FIGS. 16 to 19, the position measurement error is substantially reduced on every axis, demonstrating that the algorithm of the present invention outperforms the prior art. This follows from the simplicity of the proposed algorithm: the system consists of a single image sensor and only three reference LEDs, together with the definition of the position measurement area (70).
From FIG. 19, when the number of pixels is 3,000, the RMS position measurement errors are 0.0097 m for the conventional technique and 0.0002823 m for the method according to the present invention. The position measurement error of the present invention is thus reduced by about 97% compared to the prior art.
Although the present invention has been described in connection with the preferred embodiments set forth above, it will be readily appreciated by those skilled in the art that various modifications and variations can be made without departing from the spirit and scope of the invention, and that all such modifications fall within the scope of the appended claims.
5: Camera,
10: lens,
20: Image sensor,
30: ceiling,
40: visible light (light),
50: projection distance of i1,
60: illumination area,
70: Position measuring area,
f: the focal length of the lens,
P: the center of the lens (the indoor position to be measured),
Q: The center of the image sensor,
R: Center of individual LED image,
Rx: Receiver (camera).
Claims (12)
An indoor position measuring device comprising: a single image sensor (20) for receiving, through a lens (10), visible light including coordinate information of a plurality of LEDs installed in a room; and a controller for detecting the coordinate information based on the output signal of the image sensor (20) and calculating the coordinates of the center point P of the lens (10), calculated on the basis of the coordinate information, as the current indoor position,
The LEDs are at least three mutually spaced LEDs (LEDs A, B, and C), and each of the LEDs (LEDs A, B, and C) illuminates their coordinate information with visible light,
The controller calculates the center point P coordinates of the lens 10 as the current indoor position based on the respective three-dimensional coordinate information (XYZ coordinates) of the LED and the focal length f of the lens 10 ,
Wherein the controller calculates the coordinates (x, y) of the center point (P) as the current indoor position according to the following equation.
Here, H is the vertical distance between the LED and the lens (10), z1 = z2 = z3 = H + z, ik is the distance between the center of the image sensor (20) and the center of the image of the k-th LED, and f is the focal length of the lens (10).
Wherein the LED is fixed to the ceiling (30) of the room.
Wherein the image sensor (20) is an image pickup device or a plurality of photodiodes.
An indoor position measuring method comprising the steps of: receiving, through a lens (10), visible light including coordinate information of a plurality of LEDs installed in a room, using a single image sensor (20); the controller calculating the coordinate information based on the output signal of the image sensor (20); and calculating the coordinates (x, y) of the center point (P) of the lens (10), calculated on the basis of the coordinate information, as the current indoor position,
Wherein the LEDs are at least three mutually spaced LEDs (LED A, LED B, LED C), and each LED irradiates its own coordinate information A(x1, y1, z1), B(x2, y2, z2), and C(x3, y3, z3) as visible light,
The controller calculates the center point P of the lens 10 as the current indoor position based on the three-dimensional coordinate information (XYZ coordinate) of the coordinate information and the focal length f of the lens 10,
Wherein the controller calculates the coordinates (x, y) of the center point (P) as the current indoor position by the following equation.
Here, H is the vertical distance between the LED and the lens (10), z1 = z2 = z3 = H + z, ik is the distance between the center of the image sensor (20) and the center of the image of the k-th LED, and f is the focal length of the lens (10).
Wherein the LED is fixed to the ceiling (30) of the room.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
KR1020150150904A KR101780122B1 (en) | 2015-10-29 | 2015-10-29 | Indoor Positioning Device Using a Single Image Sensor and Method Thereof |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
KR1020150150904A KR101780122B1 (en) | 2015-10-29 | 2015-10-29 | Indoor Positioning Device Using a Single Image Sensor and Method Thereof |
Publications (2)
Publication Number | Publication Date |
---|---|
KR20170049953A KR20170049953A (en) | 2017-05-11 |
KR101780122B1 true KR101780122B1 (en) | 2017-09-19 |
Family
ID=58741989
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
KR1020150150904A KR101780122B1 (en) | 2015-10-29 | 2015-10-29 | Indoor Positioning Device Using a Single Image Sensor and Method Thereof |
Country Status (1)
Country | Link |
---|---|
KR (1) | KR101780122B1 (en) |
Cited By (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
KR20230101650A (en) | 2021-12-29 | 2023-07-06 | 주식회사 메타모스 | System and method for calculating position of skater |
Families Citing this family (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN109188358B (en) * | 2018-08-31 | 2023-03-17 | 中山大学 | High-precision visible light positioning method based on imaging sensor |
CN110133592B (en) * | 2019-05-09 | 2022-11-25 | 哈尔滨师范大学 | Indoor two-point positioning method based on visible light communication |
KR102204191B1 (en) | 2019-07-12 | 2021-01-18 | 성균관대학교산학협력단 | Visible light based indoor positioning apparatus considering tilt of receiver and indoor positioning method using thereof |
CN110441807A (en) * | 2019-07-29 | 2019-11-12 | 阎祯祺 | A kind of localization method and system of indoor user mobile terminal |
CN111751784B (en) * | 2020-06-23 | 2023-11-21 | 上海申核能源工程技术有限公司 | Three-dimensional light positioning system of nuclear power station |
Citations (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2009031216A (en) * | 2007-07-30 | 2009-02-12 | Nakagawa Kenkyusho:Kk | Position detector and photographing device |
JP2011141142A (en) | 2010-01-05 | 2011-07-21 | Sharp Corp | Range finder and electronic equipment |
-
2015
- 2015-10-29 KR KR1020150150904A patent/KR101780122B1/en active IP Right Grant
Patent Citations (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2009031216A (en) * | 2007-07-30 | 2009-02-12 | Nakagawa Kenkyusho:Kk | Position detector and photographing device |
JP2011141142A (en) | 2010-01-05 | 2011-07-21 | Sharp Corp | Range finder and electronic equipment |
Non-Patent Citations (1)
Title |
---|
후인탄팟 (Huynh) et al. "Image sensor-based indoor positioning algorithm." Journal of the Korean Institute of Communication Sciences, Oct. 2015, 40(10), pp. 2062-2064
Cited By (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
KR20230101650A (en) | 2021-12-29 | 2023-07-06 | 주식회사 메타모스 | System and method for calculating position of skater |
Also Published As
Publication number | Publication date |
---|---|
KR20170049953A (en) | 2017-05-11 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
KR101780122B1 (en) | Indoor Positioning Device Using a Single Image Sensor and Method Thereof | |
US10805535B2 (en) | Systems and methods for multi-camera placement | |
US9109889B2 (en) | Determining tilt angle and tilt direction using image processing | |
CN103090846B (en) | A kind of range unit, range-measurement system and distance-finding method thereof | |
US20190053858A1 (en) | Method and Apparatus for Wide Area Multi-Body 6D Pose Tracking System | |
Yamazato et al. | Image sensor based visible light communication and its application to pose, position, and range estimations | |
US20140156219A1 (en) | Determining tilt angle and tilt direction using image processing | |
US8542368B2 (en) | Position measuring apparatus and method | |
CN106970354B (en) | A kind of 3-D positioning method based on multiple light courcess and photosensor array | |
KR20120006306A (en) | Indoor positioning apparatus and method | |
CN103782232A (en) | Projector and control method thereof | |
Rahman et al. | Indoor location estimation using visible light communication and image sensors | |
Hossen et al. | Performance improvement of indoor positioning using light-emitting diodes and an image sensor for light-emitting diode communication | |
US20150247912A1 (en) | Camera control for fast automatic object targeting | |
KR20190032791A (en) | Real-Time Positioning System and Contents Providing Service System Using Real-Time Positioning System | |
Bergen et al. | Design and implementation of an optical receiver for angle-of-arrival-based positioning | |
CN104865552A (en) | Visible light positioning system and method based on two image sensors | |
JP2018004357A (en) | Information processor, method for controlling information processor, and program | |
KR20170058612A (en) | Indoor positioning method based on images and system thereof | |
US20230252666A1 (en) | Systems and methods of measuring an object in a scene of a captured image | |
CN100582653C (en) | System and method for determining position posture adopting multi- bundle light | |
CN109323691A (en) | A kind of positioning system and localization method | |
Rodríguez-Navarro et al. | Indoor positioning system based on PSD sensor | |
Zhang et al. | Visual-inertial fusion based positioning systems | |
KR20120068668A (en) | Apparatus and method for detecting a position |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
A201 | Request for examination | ||
E902 | Notification of reason for refusal | ||
E701 | Decision to grant or registration of patent right | ||
GRNT | Written decision to grant |