KR101038504B1 - Method and apparatus for estimation location by using a plurality of sensor - Google Patents

Method and apparatus for estimation location by using a plurality of sensor

Info

Publication number
KR101038504B1
Authority
KR
South Korea
Prior art keywords
sensors
line
determined
predetermined sensor
lines
Prior art date
Application number
KR1020100108359A
Other languages
Korean (ko)
Inventor
장택수
Original Assignee
LIG Nex1 Co., Ltd. (엘아이지넥스원 주식회사)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by LIG Nex1 Co., Ltd.
Priority to KR1020100108359A
Application granted granted Critical
Publication of KR101038504B1

Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01B MEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
    • G01B11/00 Measuring arrangements characterised by the use of optical techniques
    • G01B11/02 Measuring arrangements characterised by the use of optical techniques for measuring length, width or thickness
    • G01B11/028 Measuring arrangements characterised by the use of optical techniques for measuring length, width or thickness by measuring lateral position of a boundary of the object
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01B MEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
    • G01B7/00 Measuring arrangements characterised by the use of electric or magnetic techniques
    • G01B7/003 Measuring arrangements characterised by the use of electric or magnetic techniques for measuring position, not involving coordinate determination
    • H04N5/217

Abstract

Disclosed are a method and apparatus for estimating the position of an object detected by a plurality of sensors, in which bearing lines valid for estimating the position of the object are extracted from among the bearing lines of the plurality of sensors, and the position of the object is estimated based on the extracted valid bearing lines and the positions of the sensors corresponding to the extracted valid bearing lines.

Description

Method and apparatus for estimation location by using a plurality of sensors

The present invention relates to a method and apparatus for estimating a predetermined position, and more particularly, to a method and apparatus for estimating a position of a predetermined object detected by a plurality of sensors.

Various sensors for detecting movement within a detection range are widely used for various purposes, for example in automatic doors that open or close when they detect an approaching object, and in CCTV cameras that automatically detect the movement of an object within their shooting range. With a plurality of sensors detecting the movement of an object, the position of the object can be estimated based on the positions of the sensors and their detections.

The technical problem to be solved by the present invention is to provide a method and apparatus for estimating a position with higher accuracy when estimating the position using a plurality of sensors, and a computer-readable recording medium on which a program for executing the method is recorded.

According to an aspect of the present invention, there is provided a method of estimating a position using a plurality of sensors, the method comprising: extracting bearing lines valid for estimating a position of an object from among the bearing lines of the plurality of sensors; and estimating, as the position of the object, the position at which the sum of the mean squared distances is minimum, based on the extracted valid bearing lines and the positions of the sensors corresponding to the extracted valid bearing lines.

According to another embodiment of the present invention, the extracting of the valid bearing lines includes: (a) determining a position at which the sum of the mean squared distances is minimum based on the bearing lines of the plurality of sensors and the positions of the plurality of sensors; (b) determining whether the point at which the distance between the determined position and the bearing line of a predetermined sensor among the plurality of sensors is minimum lies in the direction of the bearing line of the predetermined sensor; and (c) determining the bearing line of the predetermined sensor to be an invalid bearing line if the determined position lies opposite to the direction of the bearing line of the predetermined sensor, and determining the bearing line of the predetermined sensor to be a valid bearing line if the determined position lies in the same direction.

According to another embodiment of the present invention, if the bearing line of the predetermined sensor is determined to be invalid, the method further comprises extracting bearing lines valid for estimating the position of the object by repeating steps (a) to (c) while excluding the bearing line determined to be invalid.

According to another embodiment of the present invention, the method further comprises repeating steps (a) to (c) for each of the plurality of sensors to extract the bearing lines valid for estimating the position of the object.

According to another embodiment of the present invention, the determining of whether the point lies in a direction different from the sensing direction of the predetermined sensor comprises: determining that the point lies in a direction different from the sensing direction of the predetermined sensor if, in a rectangular coordinate system centered on the predetermined sensor, the quadrant of the direction of the bearing line of the predetermined sensor differs from the quadrant of the point; and determining that the point lies in the same direction as the sensing direction of the predetermined sensor if the two quadrants are the same.

According to another aspect of the present invention, there is provided an apparatus for estimating a position using a plurality of sensors, the apparatus comprising: a correction unit configured to extract bearing lines valid for estimating the position of the object from among the bearing lines of the plurality of sensors; and a determination unit configured to estimate, as the position of the object, the position at which the sum of the mean squared distances is minimum, based on the extracted valid bearing lines and the positions of the sensors corresponding to the extracted valid bearing lines.

In order to solve the above technical problem, the present invention also provides a computer-readable recording medium on which a program for executing the above-described position estimation method is recorded.

According to the present invention, the position can be estimated using only the sensors that are valid for position estimation among the plurality of sensors, so the position can be estimated with higher accuracy.

FIG. 1 shows an apparatus for estimating a position according to an embodiment of the present invention.
FIGS. 2A to 2C illustrate a position estimation method using a plurality of sensors according to an embodiment of the present invention.
FIG. 3 is a flowchart illustrating a method of estimating a position according to an embodiment of the present invention.
FIG. 4 is a flowchart illustrating a method of extracting bearing lines valid for tracking a position from among the bearing lines of a plurality of sensors according to an exemplary embodiment of the present invention.
FIG. 5 is a view for explaining a method of determining the validity of a bearing line according to an embodiment of the present invention.
FIG. 6 is a flowchart illustrating a method of extracting bearing lines valid for tracking a position from among the bearing lines of a plurality of sensors according to another embodiment of the present invention.

Hereinafter, embodiments of the present invention will be described with reference to the drawings.

FIG. 1 shows an apparatus for estimating a position according to an embodiment of the present invention.

Referring to FIG. 1, an apparatus 100 for estimating a position using a plurality of sensors according to an embodiment of the present invention includes a correction unit 110 and a determination unit 120.

The correction unit 110 extracts only the sensors useful for estimating the position of the detected object from among the plurality of sensors. When an object is detected by a plurality of sensors simultaneously, the position of the detected object may be estimated based on the positions and directions of the plurality of sensors. A bearing line, that is, a virtual line coinciding with the sensing direction of a sensor, may be set for each of the plurality of sensors, and the position of the detected object may be estimated based on the plurality of bearing lines.

However, even if an object is detected by a plurality of sensors at the same time, not all of the bearing lines of those sensors can necessarily be used for position estimation. Therefore, the correction unit 110 extracts the bearing lines that are valid for estimating the position of the detected object from among the bearing lines of the plurality of sensors. This will be described in detail with reference to FIGS. 2A, 2B and 2C.

FIGS. 2A to 2C illustrate a position estimation method using a plurality of sensors according to an embodiment of the present invention.

FIG. 2A illustrates a method of estimating the position of a given object based on a plurality of sensors; the position of the object sensed by sensor #1 210, sensor #2 220, and sensor #3 230 is estimated. When sensor #1 210, sensor #2 220, and sensor #3 230 simultaneously detect an object, the position of the object may be estimated based on the bearing lines 212, 222, and 232. As described above, the bearing lines 212, 222, and 232 may be lines corresponding to the directions of the sensors 210, 220, and 230 at the time the object is detected.

Once the bearing lines 212, 222, and 232 are determined as shown in FIG. 2A, the position of the object is estimated based on the mean squared distance: the point 240 at which the sum of the mean squared distances from the bearing lines 212, 222, and 232 is minimum may be estimated as the position of the sensed object.

FIG. 2B illustrates a position estimation method using a mean squared distance algorithm.

Referring to FIG. 2B, let the position of the i-th sensor be $(x_i, y_i)$ in the global coordinate system, in which the positions of the plurality of sensors 210, 220, and 230 are expressed relative to a common origin. The coordinates $(x_T, y_T)$ of the point at which the sum of the mean squared distances from the bearing lines 212, 222, and 232 is minimum can then be obtained by calculating the following equations. First, the sum of the squared distances from the bearing lines 212, 222, and 232 to the estimated position is expressed by Equation 1 below.

$$d_i = (x_T - x_i)\sin\theta_i - (y_T - y_i)\cos\theta_i, \qquad J(x_T, y_T) = \sum_{i=1}^{N} d_i^{2} \tag{1}$$

In the above formula, $N$ denotes the total number of sensors and $\theta_i$ denotes the angle between the bearing line of the i-th sensor and the x-axis of the global coordinate system. Partially differentiating the sum of the squared distances in Equation 1 with respect to $x_T$ and $y_T$ and setting each partial derivative to zero yields Equation 2 below.

$$\frac{\partial J}{\partial x_T} = 2\sum_{i=1}^{N} \sin\theta_i \, d_i = 0, \qquad \frac{\partial J}{\partial y_T} = -2\sum_{i=1}^{N} \cos\theta_i \, d_i = 0 \tag{2}$$

Solving these two linear equations simultaneously gives the estimated position $(x_T, y_T)$.

As shown in FIG. 2A, when the plurality of bearing lines 212, 222, and 232 for the plurality of sensors 210, 220, and 230 are determined, the location 240 of the detected object may be estimated by solving Equations 1 and 2 based on the plurality of bearing lines.
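For illustration (this snippet is not part of the patent text), the following Python sketch solves Equations 1 and 2 as a linear least-squares problem. The function name, the use of NumPy, and the input convention of one position and one bearing angle per sensor are assumptions made for this example.

```python
# Sketch: least-squares position estimate from bearing lines (Equations 1 and 2).
# Assumes at least two sensors with non-parallel bearing lines.
import numpy as np

def estimate_position(positions, angles):
    """Return (x_T, y_T) minimizing the sum of squared distances to all bearing lines."""
    positions = np.asarray(positions, dtype=float)   # shape (N, 2): sensor positions
    angles = np.asarray(angles, dtype=float)         # shape (N,): bearing angles in radians
    s, c = np.sin(angles), np.cos(angles)
    # Signed distance from (x_T, y_T) to line i is (x_T - x_i)*sin(t_i) - (y_T - y_i)*cos(t_i),
    # so stacking one row per sensor gives the overdetermined linear system A @ [x_T, y_T] = b.
    A = np.column_stack([s, -c])
    b = positions[:, 0] * s - positions[:, 1] * c
    # The least-squares solution of A p = b is exactly the zero of the Equation 2 derivatives.
    p, *_ = np.linalg.lstsq(A, b, rcond=None)
    return p[0], p[1]
```

For example, sensors at (0, 0) and (10, 0) with bearing angles of 45 and 135 degrees (pi/4 and 3*pi/4 radians) yield the estimate (5, 5), the intersection of the two lines.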

However, as described above, not all of the bearing lines of the sensors that detected an object are valid for position estimation, because sensors that detect objects at the same point in time may not all be detecting the same object. In the example illustrated in FIG. 2A, the bearing line 232 of sensor #3 230 points in a direction different from the estimated position 240 because the object detected by sensor #3 230 is not the object detected by sensor #1 210 and sensor #2 220. Although sensor #1 210, sensor #2 220, and sensor #3 230 detect objects at the same time, sensor #3 230 senses an object different from the one sensed by sensor #1 210 and sensor #2 220, so the bearing line 232 of sensor #3 230 may not be used to estimate the position of the object.

Accordingly, as shown in FIG. 2C, by estimating the position anew considering only the bearing line 212 of sensor #1 210 and the bearing line 222 of sensor #2 220, without considering the bearing line 232 of sensor #3 230, a point 250 different from the point 240 estimated according to FIG. 2A may be estimated as the position of the detected object. A method of removing the bearing lines that are not valid for position estimation from among all the bearing lines will be described later with reference to FIGS. 4 to 6.

Referring back to FIG. 1, when the correction unit 110 removes the bearing lines that are not valid for position estimation as shown in FIG. 2C and extracts only the bearing lines that are valid, the determination unit 120 estimates the position of the detected object based on the extracted bearing lines. Equations 1 and 2 may be calculated using the extracted valid bearing lines and the positions of the sensors corresponding to them, and the position at which the sum of the mean squared distances is minimum is estimated as the position of the detected object. More specifically, the position of the object may be estimated based on the angles that the extracted valid bearing lines form with the x-axis of the coordinate system and the positions, in the same coordinate system, of the sensors corresponding to those bearing lines.

FIG. 3 is a flowchart illustrating a method of estimating a position according to an embodiment of the present invention.

Referring to FIG. 3, in operation 310, the position estimating apparatus 100 receives information from a plurality of sensors: information indicating that an object has been detected, the position of each sensor, and the direction of each sensor's bearing line. The position information of a sensor may be its x and y coordinates in the global coordinate system, and the information about the direction of the bearing line may be the angle that the bearing line forms with the x-axis of the global coordinate system.

In operation 312, the position estimating apparatus 100 estimates the position based on the information received in operation 310. The position at which the sum of the mean squared distances from all of the bearing lines for which information was received is minimum is estimated by calculating Equations 1 and 2 above.

In operation 314, the position estimating apparatus 100 removes the bearing lines that are not valid for position estimation from among the bearing lines used in operation 312, and extracts only the valid bearing lines. A method of extracting valid bearing lines is described below with reference to FIGS. 4 to 6.

In operation 316, the position estimating apparatus 100 determines whether any valid bearing line remains as a result of the extraction in operation 314. If all of the sensors sensed different objects, there may be no valid bearing line, in which case the position estimation fails. Even if there is exactly one valid bearing line, the position cannot be estimated from a single bearing line, so the position estimation is terminated. However, if there are two or more valid bearing lines, the position is re-estimated based on the valid bearing lines in step 318, and the estimated position is output.
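The flow of steps 310 to 318 can be outlined as the following sketch, which is an illustration rather than the patent's implementation; it reuses the estimate_position sketch above, and the helper extract_valid, assumed here to return one validity flag per bearing line, stands in for the procedures of FIGS. 4 and 6 described below.

```python
# Sketch of the FIG. 3 flow: estimate, filter invalid bearing lines, re-estimate.
def estimate_with_validation(positions, angles, extract_valid):
    valid = extract_valid(positions, angles)      # steps 312-314: estimate and filter
    if sum(valid) < 2:                            # step 316: at least two valid lines needed
        return None                               # position estimation fails
    kept = [(p, a) for p, a, v in zip(positions, angles, valid) if v]
    kept_positions, kept_angles = zip(*kept)
    return estimate_position(kept_positions, kept_angles)   # step 318: re-estimate
```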

FIG. 4 is a flowchart illustrating a method of extracting bearing lines valid for tracking a position from among the bearing lines of a plurality of sensors according to an exemplary embodiment of the present invention.

In operation 410, the position estimating apparatus 100 sets the index i of the loop for extracting valid bearing lines to 1.

In operation 412, the position estimating apparatus 100 estimates the position based on all of the bearing lines. As described above with respect to Equations 1 and 2, the position at which the sum of the mean squared distances from all the bearing lines is minimum is estimated.

In operation 414, the position estimating apparatus 100 calculates the intersection between the estimated position and the i-th bearing line (the bearing line of the i-th sensor). In the present invention, the intersection with a bearing line means the point on the i-th bearing line at which the distance to the estimated position is minimum, that is, the foot of the perpendicular from the estimated position onto the line. The index i identifies the sensor currently subject to the validity determination when all the sensors are numbered with positive integers in a predetermined order.
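A minimal sketch of this computation follows, under the foot-of-perpendicular reading given above; the function name and coordinate conventions are illustrative assumptions.

```python
# Sketch of step 414: project the estimated position onto the i-th bearing line.
import math

def foot_of_perpendicular(sensor_xy, theta, target_xy):
    """Point on the line through sensor_xy with direction theta nearest to target_xy."""
    dx = target_xy[0] - sensor_xy[0]
    dy = target_xy[1] - sensor_xy[1]
    # Scalar projection onto the unit direction (cos theta, sin theta);
    # t > 0 places the foot ahead of the sensor, t < 0 behind it.
    t = dx * math.cos(theta) + dy * math.sin(theta)
    return (sensor_xy[0] + t * math.cos(theta),
            sensor_xy[1] + t * math.sin(theta))
```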

In operation 416, the position estimating apparatus 100 determines whether the quadrant of the direction of the i-th bearing line and the quadrant of the intersection coincide with each other. This will be described in detail with reference to FIG. 5.

FIG. 5 is a view for explaining a method of determining the validity of a bearing line according to an embodiment of the present invention.

Referring to FIG. 5, the position estimating apparatus 100 determines the validity of a bearing line using a two-dimensional rectangular coordinate system with the i-th sensor 500 at the origin. In FIG. 5, the direction of the bearing line 510 of the i-th sensor 500 points into the first quadrant of this coordinate system. However, the intersection 530, at which the distance between the bearing line 510 and the position 520 estimated from all the bearing lines is minimum, lies in the third quadrant. In other words, relative to the origin, the direction of the bearing line 510 and the direction of the intersection 530 are different.

In this case, the bearing line of the i-th sensor 500 may be determined to be invalid for position estimation. As described above, the direction of a bearing line coincides with the sensing direction of its sensor. Since the intersection 530 shows that the estimated position 520 lies in a direction different from that of the bearing line 510, the sensor 500 may be determined to have detected an object other than the object corresponding to the estimated position 520. The bearing line of the sensor 500 is therefore not valid for estimating the position of the detected object.

Whether the two directions are identical may be determined from the signs of x and y in the rectangular coordinate system centered on the sensor 500: the x and y signs of the bearing-line direction are compared with the x and y signs of the intersection position. If even one of the x sign and the y sign disagrees, the direction of the bearing line and the direction of the intersection are determined to be different. In the embodiment shown in FIG. 5, the x and y signs of the direction of the bearing line 510 are both '+', whereas the x and y signs of the position of the intersection 530 are both '-', so the direction of the bearing line 510 and the direction of the intersection 530 are determined to be different.
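A minimal sketch of this sign test, assuming bearing angles measured from the global x-axis; treating a zero component as agreeing with either sign is an illustrative choice that the text does not specify.

```python
# Sketch of the FIG. 5 test: the bearing line is valid only if the intersection
# lies in the same quadrant (same x and y signs) as the bearing direction,
# in a coordinate system centered on the sensor.
import math

def is_bearing_valid(sensor_xy, theta, intersection_xy):
    ix = intersection_xy[0] - sensor_xy[0]
    iy = intersection_xy[1] - sensor_xy[1]
    same_x_sign = math.cos(theta) * ix >= 0.0   # x signs of direction and intersection agree
    same_y_sign = math.sin(theta) * iy >= 0.0   # y signs agree
    return same_x_sign and same_y_sign          # one mismatch already makes the line invalid
```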

Referring back to FIG. 4, if it is determined in step 416 that the quadrant of the i-th bearing line coincides with the quadrant of the intersection, in step 418 the position estimating apparatus 100 determines that the bearing line of the i-th sensor is valid for position estimation. In contrast, if it is determined in step 416 that the two quadrants do not coincide, the position estimating apparatus 100 determines in step 420 that the i-th bearing line is invalid.

In operation 422, the position estimating apparatus 100 determines whether the index i has reached the total number of bearing lines, that is, whether the validity of every bearing line has been determined, by comparing i with the total number of sensors that detected the object. If i does not match the total number of bearing lines, i is increased by 1 in step 424 and steps 414 to 420 are repeated. In this way, whether the bearing line is valid for position estimation is determined for every sensor that detected the object, and only the valid bearing lines are extracted by removing the invalid ones.
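Putting the steps of FIG. 4 together, a single-pass extraction loop might look like the sketch below, reusing the helper functions assumed earlier; note that every bearing line is tested against the one position estimated from the full set of lines.

```python
# Sketch of the FIG. 4 loop: one validity flag per bearing line, all tested
# against the position estimated once from all of the lines.
def extract_valid_single_pass(positions, angles):
    estimate = estimate_position(positions, angles)          # step 412
    valid = []
    for pos, ang in zip(positions, angles):                  # steps 414 to 424
        foot = foot_of_perpendicular(pos, ang, estimate)     # step 414
        valid.append(is_bearing_valid(pos, ang, foot))       # steps 416 to 420
    return valid
```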

FIG. 6 is a flowchart illustrating a method of extracting a bearing line effective for tracking a position among bearing lines of a plurality of sensors according to another embodiment of the present invention.

In operation 610, the position estimating apparatus 100 sets the index i of the loop for extracting valid bearing lines to 1. This corresponds to step 410 of FIG. 4.

In operation 612, the position estimating apparatus 100 estimates the position based on all of the bearing lines. As described above with respect to Equations 1 and 2, the position at which the sum of the mean squared distances from all the bearing lines is minimum is estimated. This corresponds to step 412 of FIG. 4.

In operation 614, the position estimating apparatus 100 calculates the intersection between the estimated position and the i-th bearing line (the bearing line of the i-th sensor). This corresponds to step 414 of FIG. 4.

However, according to the embodiment shown in FIG. 6, while the estimated position used in the first pass of the loop is the position estimated from all the bearing lines in step 612, the estimated position used in subsequent passes may be the position estimated in step 626 from the remaining bearing lines, excluding any invalid ones.

In operation 616, the position estimating apparatus 100 determines whether the quadrant of the direction of the i-th bearing line and the quadrant of the intersection coincide with each other. This corresponds to step 416 of FIG. 4.

If it is determined in step 616 that the quadrant of the i-th bearing line coincides with the quadrant of the intersection, in step 618 the position estimating apparatus 100 determines that the bearing line of the i-th sensor is valid for position estimation. On the contrary, if it is determined in step 616 that the two quadrants do not coincide, the position estimating apparatus 100 determines in step 620 that the i-th bearing line is invalid.

In operation 622, the position estimating apparatus 100 determines whether the index i has reached the total number of bearing lines. This corresponds to step 422 of FIG. 4.

If it is determined in step 622 that the index i does not match the total number of bearing lines, then in step 624 i is increased by '1'.

Also, in step 626, the position estimating apparatus 100 re-estimates the position based on the remaining bearing lines, excluding the invalid bearing line. The position estimated in step 612 is based on all the bearing lines, including any invalid ones, and may therefore be inaccurate. If the validity determinations of steps 614 to 620 were repeated based on this inaccurate position, the validity determinations themselves could also be incorrect.

Therefore, in the embodiment shown in FIG. 6, the position is re-estimated based on the remaining bearing lines, excluding those determined to be invalid, and the validity determinations of steps 614 to 620 are performed based on the re-estimated position.
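Under the same assumptions as the earlier snippets, the FIG. 6 variant can be sketched as follows: each time an invalid bearing line is found, the position is re-estimated from the remaining lines so that subsequent validity checks use the corrected estimate.

```python
# Sketch of the FIG. 6 loop: re-estimate the position after removing each
# invalid bearing line, before testing the next sensor.
def extract_valid_with_reestimation(positions, angles):
    valid = [True] * len(positions)
    estimate = estimate_position(positions, angles)              # step 612
    for i in range(len(positions)):                              # steps 614 to 624
        foot = foot_of_perpendicular(positions[i], angles[i], estimate)
        if not is_bearing_valid(positions[i], angles[i], foot):  # steps 616, 620
            valid[i] = False
            kept = [(p, a) for p, a, v in zip(positions, angles, valid) if v]
            if len(kept) >= 2:                                   # step 626: re-estimate
                kept_positions, kept_angles = zip(*kept)
                estimate = estimate_position(kept_positions, kept_angles)
    return valid
```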

While the present invention has been described with reference to limited embodiments and drawings, the present invention is not limited to the above-described embodiments, and various modifications can be made by those of ordinary skill in the art to which the present invention pertains. Accordingly, the scope of the invention should be defined only by the appended claims, and all equivalent modifications fall within the scope of the invention. In addition, the system according to the present invention can be embodied as computer-readable code on a computer-readable recording medium.

For example, the position estimation apparatus according to an exemplary embodiment of the present invention may include a bus coupled to the respective units of the apparatus shown in FIG. 1, and at least one processor coupled to the bus. It may also include a memory, coupled to the bus and to the at least one processor, for storing instructions and received or generated messages, the processor executing the instructions as described above.

The computer-readable recording medium also includes all kinds of recording devices in which data that can be read by a computer system is stored. Examples of the recording medium include ROM, RAM, CD-ROM, magnetic tape, floppy disk, optical data storage device and the like. The computer readable recording medium can also be distributed over network coupled computer systems so that the computer readable code is stored and executed in a distributed fashion.

Claims (10)

A method of estimating a position using a plurality of sensors, each sensing an object around the sensor, the method comprising:
extracting bearing lines valid for estimating a position of the object from among bearing lines corresponding to the sensing directions set in the plurality of sensors; and
estimating, as the position of the object, a position at which the sum of the mean squared distances is minimum, based on the extracted valid bearing lines and the positions of the sensors corresponding to the extracted valid bearing lines.
The method of claim 1, wherein the extracting of the valid bearing lines comprises:
(a) determining a position at which the sum of the mean squared distances is minimum based on the bearing lines of the plurality of sensors and the positions of the plurality of sensors;
(b) determining whether the intersection at which the distance between the determined position and the bearing line of a predetermined sensor among the plurality of sensors is minimum is located in the sensing direction of the predetermined sensor; and
(c) determining the bearing line of the predetermined sensor to be an invalid bearing line if the determined position is not located in the sensing direction of the predetermined sensor, and determining the bearing line of the predetermined sensor to be a valid bearing line if the determined position is located in the sensing direction of the predetermined sensor.
The method of claim 2, further comprising, if the bearing line of the predetermined sensor is determined to be invalid, extracting bearing lines valid for estimating the position of the object by repeating steps (a) to (c) while excluding the bearing line determined to be invalid.
The method of claim 2, further comprising repeating steps (a) to (c) for the plurality of sensors to extract the bearing lines valid for estimating the position of the object.
The method of claim 2, wherein step (b) comprises:
determining that the intersection, at which the distance between the determined position and the bearing line of the predetermined sensor among the plurality of sensors is minimum, is located in a direction different from the sensing direction of the predetermined sensor if, in a rectangular coordinate system centered on the predetermined sensor, the quadrant of the sensing direction of the predetermined sensor differs from the quadrant of the intersection; and
determining that the intersection is located in the same direction as the sensing direction of the predetermined sensor if, in the rectangular coordinate system centered on the predetermined sensor, the quadrant of the sensing direction of the predetermined sensor and the quadrant of the intersection are the same.
An apparatus for estimating a position using a plurality of sensors, each sensing an object around the sensor, the apparatus comprising:
a correction unit configured to extract bearing lines valid for estimating the position of the object from among bearing lines matching the sensing directions set in the plurality of sensors; and
a determination unit configured to estimate, as the position of the object, a position at which the sum of the mean squared distances is minimum, based on the extracted valid bearing lines and the positions of the sensors corresponding to the extracted valid bearing lines.
The apparatus of claim 6, wherein the correction unit determines a position at which the sum of the mean squared distances is minimum based on the bearing lines of the plurality of sensors and the positions of the plurality of sensors, determines whether the intersection at which the distance between the determined position and the bearing line of a predetermined sensor among the plurality of sensors is minimum is located in the sensing direction of the predetermined sensor, determines the bearing line of the predetermined sensor to be an invalid bearing line if the determined position is not located in the sensing direction of the predetermined sensor, and determines the bearing line of the predetermined sensor to be a valid bearing line if the determined position is located in the sensing direction of the predetermined sensor.
The apparatus of claim 7, wherein, if the bearing line of the predetermined sensor is determined to be invalid, the correction unit extracts bearing lines valid for estimating the position of the object by repeating the validity determination while excluding the bearing line determined to be invalid.
The apparatus of claim 7, wherein the correction unit extracts bearing lines valid for estimating the position of the object by repeating the validity determination for each of the plurality of sensors.
A computer-readable recording medium having recorded thereon a program for executing one of the methods of claims 1 to 5.
KR1020100108359A 2010-11-02 2010-11-02 Method and apparatus for estimation location by using a plurality of sensor KR101038504B1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
KR1020100108359A KR101038504B1 (en) 2010-11-02 2010-11-02 Method and apparatus for estimation location by using a plurality of sensor

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
KR1020100108359A KR101038504B1 (en) 2010-11-02 2010-11-02 Method and apparatus for estimation location by using a plurality of sensor

Publications (1)

Publication Number Publication Date
KR101038504B1 true KR101038504B1 (en) 2011-06-01

Family

ID=44404895

Family Applications (1)

Application Number Title Priority Date Filing Date
KR1020100108359A KR101038504B1 (en) 2010-11-02 2010-11-02 Method and apparatus for estimation location by using a plurality of sensor

Country Status (1)

Country Link
KR (1) KR101038504B1 (en)



Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR100571120B1 2005-09-29 2006-04-13 Hanjin Information Systems & Telecommunication Co., Ltd. Three dimentional survey system which use the laser apparatus
KR20070054557A * 2005-11-23 2007-05-29 Samsung Electronics Co., Ltd. Method and apparatus for reckoning position of moving robot
KR20070072305A * 2005-12-29 2007-07-04 Korea Institute of Industrial Technology Localization system for moving object using state group of moving object and method thereof
WO2009148644A1 (en) 2008-02-25 2009-12-10 Q-Track Corporation Multiple phase state near-field communication and location system

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2017155231A1 (en) * 2016-03-10 2017-09-14 삼성전자 주식회사 Position determination method and device
US10921438B2 (en) 2016-03-10 2021-02-16 Samsung Electronics Co., Ltd Position determination method and device

Similar Documents

Publication Publication Date Title
US10996062B2 (en) Information processing device, data management device, data management system, method, and program
CN110322500B (en) Optimization method and device for instant positioning and map construction, medium and electronic equipment
CN107292949B (en) Three-dimensional reconstruction method and device of scene and terminal equipment
JP5385105B2 (en) Image search method and system
EP3001384A1 (en) Three-dimensional coordinate computing apparatus, three-dimensional coordinate computing method, and program for three-dimensional coordinate computing
CN111445531B (en) Multi-view camera navigation method, device, equipment and storage medium
US20210056715A1 (en) Object tracking method, object tracking device, electronic device and storage medium
AU2018282347B2 (en) Method and apparatus for monitoring vortex-induced vibration of wind turbine
KR20200045522A (en) Methods and systems for use in performing localization
KR101822185B1 (en) Method and apparatus for poi detection in 3d point clouds
JP5774226B2 (en) Resolving ambiguity of homography decomposition based on orientation sensor
US10223804B2 (en) Estimation device and method
WO2012166293A1 (en) Planar mapping and tracking for mobile devices
KR20100104581A (en) Method and apparatus for estimating position in a mobile robot
US20220156973A1 (en) Information processing apparatus, information processing method, and program
Zhao et al. Prediction-based geometric feature extraction for 2D laser scanner
WO2011048497A2 (en) Computer vision based hybrid tracking for augmented reality in outdoor urban environments
CN111950370B (en) Dynamic environment offline visual milemeter expansion method
JP6922348B2 (en) Information processing equipment, methods, and programs
KR101038504B1 (en) Method and apparatus for estimation location by using a plurality of sensor
JP5928010B2 (en) Road marking detection apparatus and program
CN105590086A (en) Article antitheft detection method based on visual tag identification
JP5032415B2 (en) Motion estimation apparatus and program
JP5829155B2 (en) Pedestrian detection device and program
Kröger et al. Performance evaluation on contour extraction using Hough transform and RANSAC for multi-sensor data fusion applications in industrial food inspection

Legal Events

Date Code Title Description
A201 Request for examination
A302 Request for accelerated examination
E902 Notification of reason for refusal
E701 Decision to grant or registration of patent right
GRNT Written decision to grant
FPAY Annual fee payment (payment date: 20140502; year of fee payment: 4)
FPAY Annual fee payment (payment date: 20150504; year of fee payment: 5)
FPAY Annual fee payment (payment date: 20160504; year of fee payment: 6)
FPAY Annual fee payment (payment date: 20170504; year of fee payment: 7)
FPAY Annual fee payment (payment date: 20180503; year of fee payment: 8)
FPAY Annual fee payment (payment date: 20190503; year of fee payment: 9)