KR101038504B1 - Method and apparatus for estimation location by using a plurality of sensor - Google Patents
- Publication number: KR101038504B1
- Application number: KR1020100108359A
- Authority
- KR
- South Korea
- Prior art keywords
- sensors
- line
- determined
- predetermined sensor
- lines
- Prior art date
Classifications
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01B—MEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
- G01B11/00—Measuring arrangements characterised by the use of optical techniques
- G01B11/02—Measuring arrangements characterised by the use of optical techniques for measuring length, width or thickness
- G01B11/028—Measuring arrangements characterised by the use of optical techniques for measuring length, width or thickness by measuring lateral position of a boundary of the object
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01B—MEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
- G01B7/00—Measuring arrangements characterised by the use of electric or magnetic techniques
- G01B7/003—Measuring arrangements characterised by the use of electric or magnetic techniques for measuring position, not involving coordinate determination
-
- H04N5/217—
Abstract
A method and apparatus are disclosed in which azimuth lines valid for estimating the position of an object are extracted from the azimuth lines of a plurality of sensors, and the position of the object detected by the plurality of sensors is estimated based on the extracted valid azimuth lines and the positions of the sensors corresponding to the extracted valid azimuth lines.
Description
The present invention relates to a method and apparatus for estimating a predetermined position, and more particularly, to a method and apparatus for estimating a position of a predetermined object detected by a plurality of sensors.
Sensors that detect movement within a detection range are widely used for various purposes, for example in doors that open or close automatically when they detect an approaching object, and in CCTV systems that automatically detect the movement of an object within the shooting range. When a plurality of such sensors are used to detect the movement of an object, the position of the object can be estimated based on the positions of the sensors and their detection results.
The technical problem to be solved by the present invention is to provide a method and apparatus for estimating a position with higher accuracy when estimating the position using a plurality of sensors, and a computer-readable recording medium on which a program for executing the method is recorded.
According to an aspect of the present invention, there is provided a method of estimating a position using a plurality of sensors, the method comprising: extracting azimuth lines valid for estimating a position of an object from among the azimuth lines of the plurality of sensors; and estimating, as the position of the object, a position at which the sum of the mean squared distances is minimum, based on the extracted valid azimuth lines and the positions of the sensors corresponding to the extracted valid azimuth lines.
According to another embodiment of the present invention, the extracting of the valid azimuth lines includes: (a) determining a position at which the sum of the mean squared distances is minimum based on the azimuth lines of the plurality of sensors and the positions of the plurality of sensors; (b) determining whether the point at which the distance between the determined position and the azimuth line of a predetermined sensor among the plurality of sensors is minimum lies in the direction of the azimuth line of the predetermined sensor; and (c) determining the azimuth line of the predetermined sensor to be an invalid azimuth line if the point lies in the direction opposite to the azimuth line of the predetermined sensor, and determining it to be a valid azimuth line if the point lies in the same direction.
According to another embodiment of the present invention, if the azimuth line of the predetermined sensor is determined to be invalid, the method further includes repeating steps (a) to (c) while excluding the azimuth line determined to be invalid, thereby extracting the azimuth lines valid for estimating the position of the object.
According to another embodiment of the present invention, the method further includes repeating steps (a) to (c) for the plurality of sensors to extract the azimuth lines valid for estimating the position of the object.
According to another embodiment of the present invention, determining whether the point lies in a direction different from the direction of the predetermined sensor includes: determining that the point lies in a different direction if, in a rectangular coordinate system centered on the predetermined sensor, the quadrant containing the direction of the azimuth line of the predetermined sensor differs from the quadrant containing the point; and determining that the point lies in the same direction as the direction of the predetermined sensor if the two quadrants are the same.
According to another aspect of the present invention, there is provided an apparatus for estimating a position using a plurality of sensors, the apparatus comprising: a correction unit configured to extract azimuth lines valid for estimating a position of an object from among the azimuth lines of the plurality of sensors; and a determination unit configured to estimate, as the position of the object, a position at which the sum of the mean squared distances is minimum, based on the extracted valid azimuth lines and the positions of the sensors corresponding to the extracted valid azimuth lines.
In order to solve the above technical problem, the present invention provides a computer-readable recording medium having recorded thereon a program for executing the method for estimating the above-described position.
According to the present invention, the position can be estimated using only the sensors that are valid for position estimation among the plurality of sensors, so the position can be estimated with higher accuracy.
FIG. 1 shows an apparatus for estimating a position according to an embodiment of the present invention.
FIGS. 2A to 2C illustrate a position estimation method using a plurality of sensors according to an embodiment of the present invention.
FIG. 3 is a flowchart illustrating a method of estimating a position according to an embodiment of the present invention.
FIG. 4 is a flowchart illustrating a method of extracting azimuth lines valid for tracking a position from among the azimuth lines of a plurality of sensors according to an exemplary embodiment of the present invention.
FIG. 5 is a view for explaining a method of determining the validity of an azimuth line according to an embodiment of the present invention.
FIG. 6 is a flowchart illustrating a method of extracting azimuth lines valid for tracking a position from among the azimuth lines of a plurality of sensors according to another embodiment of the present invention.
Hereinafter, embodiments of the present invention will be described with reference to the drawings.
FIG. 1 shows an apparatus for estimating a position according to an embodiment of the present invention.
Referring to FIG. 1, an apparatus for estimating a position using a plurality of sensors according to an embodiment of the present invention includes a correction unit and a determination unit. The plurality of sensors detect an object within their detection ranges and provide azimuth lines corresponding to their sensing directions.
However, even if an object is detected by a plurality of sensors at the same time, some of the azimuth lines of the sensors that detected the object may not be usable for position estimation. Therefore, the correction unit extracts only the azimuth lines valid for position estimation, as described below.
FIGS. 2A to 2C illustrate a position estimation method using a plurality of sensors according to an embodiment of the present invention.
FIG. 2A illustrates a method of estimating the position of a given object based on a plurality of sensors. The position of an object sensed by the plurality of sensors can be estimated from the positions of the sensors and their azimuth lines. Once the plurality of sensors detect the object, each sensor provides an azimuth line indicating the direction from that sensor toward the detected object.
FIG. 2B illustrates a position estimation method using a mean squared distance algorithm.
Referring to FIG. 2B, let (x_i, y_i) be the position of the i-th sensor in the global coordinate system in which the positions of the plurality of sensors are defined, and let θ_i denote the angle between the azimuth line of the i-th sensor and the x-axis of the global coordinate system. The squared distance between a candidate position (x, y) and the azimuth line of the i-th sensor is then

d_i²(x, y) = ((y − y_i) cos θ_i − (x − x_i) sin θ_i)²,

and the sum of the mean squared distances over the sensors is

S(x, y) = (1/N) Σ_{i=1}^{N} d_i²(x, y),

where N denotes the total number of sensors. The position (x, y) at which S(x, y) is minimum is determined as the estimated position of the object.
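For illustration only (this sketch is not part of the original patent disclosure, and the function and variable names are my own), the minimizer of the sum of squared distances to the bearing lines can be computed in closed form, because each distance is linear in (x, y):

```python
import math

def estimate_position(sensors, thetas):
    """Return (x, y) minimizing the sum of squared distances to the
    azimuth lines of the sensors.

    sensors: list of sensor positions (x_i, y_i)
    thetas:  list of azimuth angles theta_i in radians, measured from
             the x-axis of the global coordinate system
    """
    # The azimuth line of sensor i satisfies a*x + b*y = c with
    # a = sin(theta_i), b = -cos(theta_i), c = a*x_i + b*y_i, and
    # |a*x + b*y - c| is the point-to-line distance (a^2 + b^2 = 1).
    Saa = Sab = Sbb = Sac = Sbc = 0.0
    for (xi, yi), th in zip(sensors, thetas):
        a, b = math.sin(th), -math.cos(th)
        c = a * xi + b * yi
        Saa += a * a
        Sab += a * b
        Sbb += b * b
        Sac += a * c
        Sbc += b * c
    # Normal equations of the least-squares problem, solved by Cramer's rule.
    det = Saa * Sbb - Sab * Sab
    x = (Sac * Sbb - Sab * Sbc) / det
    y = (Saa * Sbc - Sab * Sac) / det
    return x, y

# Two sensors at (0, 0) and (4, 0) whose azimuth lines point at 45 and
# 135 degrees intersect at (2, 2); the estimate lands very close to it.
print(estimate_position([(0, 0), (4, 0)], [math.pi / 4, 3 * math.pi / 4]))
```

Dividing by N does not change the minimizer, so the sketch omits the 1/N factor.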
As shown in FIG. 2A, when the plurality of sensors detect the object, the position at which the sum of the mean squared distances to the azimuth lines of the sensors is minimum can be estimated as the position of the object.
However, as described above, not all azimuth lines of the sensors that detected an object are valid for position estimation. This is because, even though an object is detected by a sensor, the azimuth line of that sensor may point away from the actual position of the object, and including such an azimuth line degrades the accuracy of the estimate.
Accordingly, as shown in FIG. 2C, only the azimuth lines valid for position estimation are extracted, and the position of the object is re-estimated based on the extracted valid azimuth lines and the positions of the sensors corresponding to them.
Referring back to FIG. 1, when the plurality of sensors detect an object, the correction unit extracts the azimuth lines valid for estimating the position of the object, and the determination unit estimates, as the position of the object, the position at which the sum of the mean squared distances is minimum based on the extracted valid azimuth lines and the positions of the corresponding sensors.
FIG. 3 is a flowchart illustrating a method of estimating a position according to an embodiment of the present invention.
Referring to FIG. 3, the plurality of sensors first detect the object, and the azimuth lines corresponding to the sensing directions of the sensors are obtained. Next, azimuth lines valid for estimating the position of the object are extracted from among the azimuth lines of the plurality of sensors. Finally, the position at which the sum of the mean squared distances is minimum is estimated as the position of the object, based on the extracted valid azimuth lines and the positions of the sensors corresponding to the extracted valid azimuth lines.
FIG. 4 is a flowchart illustrating a method of extracting azimuth lines valid for tracking a position from among the azimuth lines of a plurality of sensors according to an exemplary embodiment of the present invention.
Referring to FIG. 4, a position at which the sum of the mean squared distances is minimum is first determined based on the azimuth lines of the plurality of sensors and the positions of the plurality of sensors. Next, for a predetermined sensor among the plurality of sensors, it is determined whether the point at which the distance between the determined position and the azimuth line of that sensor is minimum lies in the sensing direction of that sensor. If the point lies in the sensing direction, the azimuth line of the predetermined sensor is determined to be a valid azimuth line; otherwise, it is determined to be an invalid azimuth line.
FIG. 5 is a view for explaining a method of determining the validity of an azimuth line according to an embodiment of the present invention.
Referring to FIG. 5, consider a rectangular coordinate system centered on the i-th sensor, and obtain the point on the azimuth line of the i-th sensor at which the distance to the estimated position is minimum. In this case, the azimuth line of the i-th sensor is valid only if this point lies in the same direction as the sensing direction of the i-th sensor. Whether the two directions are the same may be determined from the signs of the x and y coordinates in the rectangular coordinate system centered on the i-th sensor: if the quadrant containing the sensing direction and the quadrant containing the point are the same, the directions are the same and the azimuth line is valid; if they differ, the azimuth line is determined to be invalid.
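As an illustrative sketch (not part of the patent; names are my own), this sign test is easy to implement. Comparing the quadrant of the closest point with the quadrant of the sensing direction is equivalent to checking the sign of the projection of that point onto the bearing direction, because the closest point minus the sensor position is exactly t · (cos θ, sin θ):

```python
import math

def bearing_is_valid(sensor, theta, estimate):
    """Return True if the point on the sensor's azimuth line closest to
    the estimated position lies in the sensor's sensing direction.

    The closest point is sensor + t * (cos(theta), sin(theta)), so its
    offset from the sensor has the same component signs (quadrant) as
    the sensing direction exactly when t > 0.
    """
    sx, sy = sensor
    # Signed projection of (estimate - sensor) onto the unit bearing vector.
    t = (estimate[0] - sx) * math.cos(theta) + (estimate[1] - sy) * math.sin(theta)
    return t > 0.0

# Sensor at the origin looking along +x: an estimate at (3, 1) projects
# ahead of the sensor (valid); an estimate at (-2, 1) projects behind it.
print(bearing_is_valid((0, 0), 0.0, (3, 1)))   # True
print(bearing_is_valid((0, 0), 0.0, (-2, 1)))  # False
```

Using the projection sign avoids special-casing bearings that lie exactly on a coordinate axis, where a literal quadrant comparison is ambiguous.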
Referring back to FIG. 4, if the azimuth line of the predetermined sensor is determined to be invalid, the invalid azimuth line is excluded and the preceding steps are repeated for the remaining azimuth lines. By repeating these steps for the plurality of sensors, the azimuth lines valid for estimating the position of the object are extracted.
FIG. 6 is a flowchart illustrating a method of extracting a bearing line effective for tracking a position among bearing lines of a plurality of sensors according to another embodiment of the present invention.
Referring to FIG. 6, a position at which the sum of the mean squared distances is minimum is first determined based on the azimuth lines of all of the plurality of sensors and the positions of the plurality of sensors. Then, for each sensor, it is determined whether the point at which the distance between the determined position and the azimuth line of that sensor is minimum lies in the sensing direction of that sensor, as described with reference to FIG. 5.
However, according to the embodiment shown in FIG. 6, the estimated position used for the first loop is an estimated position based on all of the azimuth lines, including azimuth lines that may later be determined to be invalid, so the first validity determination may itself be inaccurate. If an azimuth line is determined to be invalid, it is excluded, the position is re-estimated based on the remaining azimuth lines, and the validity determination is repeated with respect to the re-estimated position.
Therefore, in the embodiment shown in FIG. 6, the loop of excluding the azimuth lines determined to be invalid, re-estimating the position based on the remaining azimuth lines, and repeating the validity determination continues until all remaining azimuth lines are determined to be valid, and the position estimated from the finally remaining valid azimuth lines is determined as the position of the object.
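As an illustrative sketch of this iterative loop (not part of the original disclosure; all function names are my own assumptions), one can estimate from all bearing lines, drop the lines whose closest point falls behind their sensor, re-estimate, and repeat:

```python
import math

def estimate_position(sensors, thetas):
    """Least-squares point minimizing summed squared distances to the lines."""
    Saa = Sab = Sbb = Sac = Sbc = 0.0
    for (xi, yi), th in zip(sensors, thetas):
        a, b = math.sin(th), -math.cos(th)  # line: a*x + b*y = c
        c = a * xi + b * yi
        Saa += a * a; Sab += a * b; Sbb += b * b
        Sac += a * c; Sbc += b * c
    det = Saa * Sbb - Sab * Sab
    return ((Sac * Sbb - Sab * Sbc) / det, (Saa * Sbc - Sab * Sac) / det)

def estimate_with_valid_bearings(sensors, thetas):
    """Iteratively discard invalid azimuth lines and re-estimate.

    Returns the final estimate and the indices of the sensors whose
    azimuth lines were kept as valid.
    """
    kept = list(range(len(sensors)))
    while True:
        pos = estimate_position([sensors[i] for i in kept],
                                [thetas[i] for i in kept])
        # A line is invalid if the closest point to the estimate lies
        # behind its sensor (projection onto the bearing direction <= 0).
        invalid = [i for i in kept
                   if (pos[0] - sensors[i][0]) * math.cos(thetas[i])
                    + (pos[1] - sensors[i][1]) * math.sin(thetas[i]) <= 0.0]
        # Stop when every remaining line is valid, or when dropping more
        # lines would leave too few to fix a 2-D position.
        if not invalid or len(kept) - len(invalid) < 2:
            return pos, kept
        kept = [i for i in kept if i not in invalid]

# Two consistent bearings toward (2, 2) plus one sensor at (2, 5) whose
# bearing points away from the object; the bad line is detected and dropped.
pos, kept = estimate_with_valid_bearings(
    [(0, 0), (4, 0), (2, 5)],
    [math.pi / 4, 3 * math.pi / 4, math.pi / 2])
print(pos, kept)  # position near (2, 2), kept == [0, 1]
```

The guard on the remaining line count is my own addition: with fewer than two bearing lines the 2-D least-squares problem is underdetermined.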
Although the present invention has been described above with reference to limited embodiments and drawings, the present invention is not limited to the above-described embodiments, and various modifications can be made by those skilled in the art to which the present invention pertains. Accordingly, the scope of the invention should be understood only by the claims set forth below, and all equivalent modifications will fall within the scope of the invention. In addition, the system according to the present invention can be embodied as computer-readable code on a computer-readable recording medium.
For example, the position estimation apparatus according to an exemplary embodiment of the present invention may include a bus coupled to the respective units of the apparatus shown in FIG. 1, at least one processor coupled to the bus, and a memory coupled to the bus and to the at least one processor for storing instructions and received or generated messages, the at least one processor executing the instructions as described above.
The computer-readable recording medium also includes all kinds of recording devices in which data that can be read by a computer system is stored. Examples of the recording medium include ROM, RAM, CD-ROM, magnetic tape, floppy disk, optical data storage device and the like. The computer readable recording medium can also be distributed over network coupled computer systems so that the computer readable code is stored and executed in a distributed fashion.
Claims (10)
Extracting azimuth lines valid for estimating a position of an object from among azimuth lines corresponding to the sensing directions set in the plurality of sensors; and
estimating, as the position of the object, a position at which the sum of the mean squared distances is minimum, based on the extracted valid azimuth lines and the positions of the sensors corresponding to the extracted valid azimuth lines. A position estimation method.
(a) determining a position where a sum of mean squared distances is minimum based on azimuth lines of the plurality of sensors and positions of the plurality of sensors;
(b) determining whether the intersection at which the distance between the determined position and the direction line of a predetermined sensor among the plurality of sensors is minimum is located in the sensing direction of the predetermined sensor; and
(c) determining the direction line of the predetermined sensor to be an invalid direction line if the determined position is not located in the sensing direction of the predetermined sensor, and determining the direction line of the predetermined sensor to be a valid direction line if the determined position is located in the sensing direction of the predetermined sensor.
The position estimation method further comprising: if the azimuth line of the predetermined sensor is determined to be invalid, repeating steps (a) to (c) while excluding the azimuth line determined to be invalid, thereby extracting azimuth lines valid for estimating the position of the object.
The position estimation method further comprising repeating steps (a) to (c) for the plurality of sensors to extract azimuth lines valid for estimating the position of the object.
Wherein determining whether the intersection at which the distance between the determined position and the direction line of a predetermined sensor among the plurality of sensors is minimum is located in the sensing direction of the predetermined sensor comprises: determining that the intersection is located in a direction different from the sensing direction of the predetermined sensor if, in a rectangular coordinate system centered on the predetermined sensor, the quadrant of the sensing direction of the predetermined sensor differs from the quadrant of the intersection; and determining that the intersection is located in the same direction as the sensing direction of the predetermined sensor if the quadrant of the sensing direction of the predetermined sensor and the quadrant of the intersection are the same. A position estimation method.
A correction unit for extracting azimuth lines valid for estimating a position of an object from among azimuth lines matching the sensing directions set in the plurality of sensors; and
a determination unit for estimating, as the position of the object, a position at which the sum of the mean squared distances is minimum, based on the extracted valid azimuth lines and the positions of the sensors corresponding to the extracted valid azimuth lines. A position estimation device.
The position estimation device determines a position at which the sum of the mean squared distances is minimum based on the azimuth lines of the plurality of sensors and the positions of the plurality of sensors, determines whether the intersection at which the distance between the determined position and the azimuth line of a predetermined sensor among the plurality of sensors is minimum is located in the sensing direction of the predetermined sensor, determines the azimuth line of the predetermined sensor to be an invalid azimuth line if the determined position is not located in the sensing direction of the predetermined sensor, and determines the azimuth line of the predetermined sensor to be a valid azimuth line if the determined position is located in the sensing direction of the predetermined sensor.
The position estimation device, if the azimuth line of the predetermined sensor is determined to be invalid, excludes the azimuth line determined to be invalid and repeats the validity determination, thereby extracting azimuth lines valid for estimating the position of the object.
The position estimation device extracts azimuth lines valid for estimating the position of the object by repeating the validity determination for the plurality of sensors.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
KR1020100108359A KR101038504B1 (en) | 2010-11-02 | 2010-11-02 | Method and apparatus for estimation location by using a plurality of sensor |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
KR1020100108359A KR101038504B1 (en) | 2010-11-02 | 2010-11-02 | Method and apparatus for estimation location by using a plurality of sensor |
Publications (1)
Publication Number | Publication Date |
---|---|
KR101038504B1 true KR101038504B1 (en) | 2011-06-01 |
Family
ID=44404895
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
KR1020100108359A KR101038504B1 (en) | 2010-11-02 | 2010-11-02 | Method and apparatus for estimation location by using a plurality of sensor |
Country Status (1)
Country | Link |
---|---|
KR (1) | KR101038504B1 (en) |
Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
KR100571120B1 (en) | 2005-09-29 | 2006-04-13 | 한진정보통신(주) | Three dimentional survey system which use the laser apparatus |
KR20070054557A (en) * | 2005-11-23 | 2007-05-29 | 삼성전자주식회사 | Method and apparatus for reckoning position of moving robot |
KR20070072305A (en) * | 2005-12-29 | 2007-07-04 | 한국생산기술연구원 | Localization system for moving object using state group of moving object and method thereof |
WO2009148644A1 (en) | 2008-02-25 | 2009-12-10 | Q-Track Corporation | Multiple phase state near-field communication and location system |
- 2010-11-02: KR application KR1020100108359A filed; patent KR101038504B1 active (IP Right Grant)
Cited By (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2017155231A1 (en) * | 2016-03-10 | 2017-09-14 | 삼성전자 주식회사 | Position determination method and device |
US10921438B2 (en) | 2016-03-10 | 2021-02-16 | Samsung Electronics Co., Ltd | Position determination method and device |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US10996062B2 (en) | Information processing device, data management device, data management system, method, and program | |
CN110322500B (en) | Optimization method and device for instant positioning and map construction, medium and electronic equipment | |
CN107292949B (en) | Three-dimensional reconstruction method and device of scene and terminal equipment | |
JP5385105B2 (en) | Image search method and system | |
EP3001384A1 (en) | Three-dimensional coordinate computing apparatus, three-dimensional coordinate computing method, and program for three-dimensional coordinate computing | |
CN111445531B (en) | Multi-view camera navigation method, device, equipment and storage medium | |
US20210056715A1 (en) | Object tracking method, object tracking device, electronic device and storage medium | |
AU2018282347B2 (en) | Method and apparatus for monitoring vortex-induced vibration of wind turbine | |
KR20200045522A (en) | Methods and systems for use in performing localization | |
KR101822185B1 (en) | Method and apparatus for poi detection in 3d point clouds | |
JP5774226B2 (en) | Resolving ambiguity of homography decomposition based on orientation sensor | |
US10223804B2 (en) | Estimation device and method | |
WO2012166293A1 (en) | Planar mapping and tracking for mobile devices | |
KR20100104581A (en) | Method and apparatus for estimating position in a mobile robot | |
US20220156973A1 (en) | Information processing apparatus, information processing method, and program | |
Zhao et al. | Prediction-based geometric feature extraction for 2D laser scanner | |
WO2011048497A2 (en) | Computer vision based hybrid tracking for augmented reality in outdoor urban environments | |
CN111950370B (en) | Dynamic environment offline visual milemeter expansion method | |
JP6922348B2 (en) | Information processing equipment, methods, and programs | |
KR101038504B1 (en) | Method and apparatus for estimation location by using a plurality of sensor | |
JP5928010B2 (en) | Road marking detection apparatus and program | |
CN105590086A (en) | Article antitheft detection method based on visual tag identification | |
JP5032415B2 (en) | Motion estimation apparatus and program | |
JP5829155B2 (en) | Pedestrian detection device and program | |
Kröger et al. | Performance evaluation on contour extraction using Hough transform and RANSAC for multi-sensor data fusion applications in industrial food inspection |
Legal Events
Date | Code | Title | Description
---|---|---|---
 | A201 | Request for examination |
 | A302 | Request for accelerated examination |
 | E902 | Notification of reason for refusal |
 | E701 | Decision to grant or registration of patent right |
 | GRNT | Written decision to grant |
20140502 | FPAY | Annual fee payment | Year of fee payment: 4
20150504 | FPAY | Annual fee payment | Year of fee payment: 5
20160504 | FPAY | Annual fee payment | Year of fee payment: 6
20170504 | FPAY | Annual fee payment | Year of fee payment: 7
20180503 | FPAY | Annual fee payment | Year of fee payment: 8
20190503 | FPAY | Annual fee payment | Year of fee payment: 9
Payment date: 20190503 Year of fee payment: 9 |