KR102012513B1 - Apparatus and method for automatic extraction of field information - Google Patents
- Publication number
- KR102012513B1 (application number KR1020170060556A)
- Authority
- KR
- South Korea
- Prior art keywords
- information
- stadium
- marker
- map image
- extracting
- Prior art date
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F16/00—Information retrieval; Database structures therefor; File system structures therefor
- G06F16/20—Information retrieval; Database structures therefor; File system structures therefor of structured data, e.g. relational data
- G06F16/29—Geographical information databases
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S19/00—Satellite radio beacon positioning systems; Determining position, velocity or attitude using signals transmitted by such systems
- G01S19/01—Satellite radio beacon positioning systems transmitting time-stamped messages, e.g. GPS [Global Positioning System], GLONASS [Global Orbiting Navigation Satellite System] or GALILEO
- G01S19/13—Receivers
- G01S19/14—Receivers specially adapted for specific applications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/10—Segmentation; Edge detection
- G06T7/11—Region-based segmentation
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/30—Subject of image; Context of image processing
- G06T2207/30204—Marker
Landscapes
- Engineering & Computer Science (AREA)
- Remote Sensing (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Theoretical Computer Science (AREA)
- Radar, Positioning & Navigation (AREA)
- Databases & Information Systems (AREA)
- Computer Networks & Wireless Communication (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Data Mining & Analysis (AREA)
- General Engineering & Computer Science (AREA)
- Instructional Devices (AREA)
- Navigation (AREA)
- Position Fixing By Use Of Radio Waves (AREA)
Abstract
The present invention relates to an apparatus and method for automatically extracting stadium information, which detect stadium information on a map based on a user's Global Positioning System (GPS) information. The apparatus includes a transceiver for transmitting and receiving data; a storage unit that stores data received through the transceiver; an analysis unit that detects the user's location information and measurement zone information from the user's GPS location information received from the user's external device, based on reference zone information received from a server, and stores the detected information in the storage unit; and a central processing unit that controls the overall operation of the transceiver, the storage unit, and the analysis unit.
Description
The present invention relates to an apparatus and method for automatically extracting stadium information and, more specifically, to an apparatus and method that automatically detect stadium information on a map based on a user's Global Positioning System (GPS) information.
When a place is searched for in an Internet-based map service, the place appears on the map and a placemark pin is placed at the location that represents it. However, the pin gives only the approximate location of a stadium; to identify the stadium's four vertices, the GPS coordinates of each vertex must be retrieved manually. Manual designation of the four vertices also introduces human error.
The prior art document discloses an invention relating to a map service method and system for providing location-based target content. To search a map provided on an electronic device, the user interface identifies a pointer-location-based query, using a location rather than a text search term as the subject of the query, and provides information including the location coordinates, name, address, and telephone number of target content within a certain radius.
For example, if the target content is a stadium, the prior art document provides the stadium's location coordinates, name, address, and telephone number. However, it does not provide the coordinates of the stadium's four vertices, so the user still has the inconvenience of looking up coordinate information by designating each vertex by hand.
The present invention has been devised to solve the above problems. Its purpose is to detect the stadium where the user is located, based on the user's GPS location information, and to automatically determine the GPS coordinates of the stadium's four vertices.
According to an aspect of the present invention, an apparatus for automatically extracting stadium information includes: a transceiver configured to transmit and receive data to and from a user's external device and a server; a storage unit that stores data received through the transceiver; an analysis unit that detects the user's location information and measurement zone information from the user's GPS location information received from the external device, based on reference zone information received from the server, and stores the detected information in the storage unit; and a central processing unit that controls the overall operation of the transceiver, the storage unit, and the analysis unit. The user's GPS location information may be GPS location information measured by the user's external device over a predetermined period of time.
In the automatic extraction apparatus, the analysis unit may set a reference position based on the received GPS location information of the user, detect a target zone including the set reference position, and compare the detected target zone with the reference zone to detect the measurement zone information.
The data transmitted to the server by the transceiver may be measurement zone information detected by the analyzer.
According to another aspect of the present invention, a method for automatically extracting stadium information includes: setting a reference position from GPS location information of a user received from the outside; detecting a target zone from the set reference position; a zone comparison step of comparing the detected target zone with a reference zone; and transmitting measurement zone information obtained as a result of the zone comparison step to a server.
In this method, the GPS location information received from the outside may be GPS location data measured for the user over a predetermined period, and the set reference position may be the average or mode of that data.
The detecting of the target zone may include displaying a marker at the set reference position on a map service, and extracting a map image on which the marker is displayed.
The method may further include detecting the position of the marker and a region including the marker by HSV-converting the extracted map image, wherein the target zone may be the region including the detected marker.
In the method, the zone comparison step may be a step of comparing the target zone with a preset figure having the same size and shape as the reference zone.
In addition, the measurement zone information may be GPS coordinate information obtained by adjusting the position of the preset figure and converting the vertex coordinates of the preset figure when the area overlapping with the target zone is the largest.
According to the apparatus and method for automatically extracting stadium information according to an embodiment of the present invention, the exact location of a stadium on the map can be obtained, rather than an approximate one.
In addition, since the process of detecting the stadium's location information is automatic, the user does not need to input additional information for the analysis.
BRIEF DESCRIPTION OF THE DRAWINGS: The accompanying drawings, included as part of the detailed description to provide a thorough understanding of the present invention, illustrate embodiments of the invention and, together with the description, explain its technical features.
FIG. 1 is a block diagram of an apparatus for automatically extracting stadium information according to an embodiment of the present invention.
FIG. 2 is a flowchart briefly illustrating a method for automatically extracting stadium information according to an exemplary embodiment of the present invention.
FIGS. 3 and 4 are diagrams illustrating an example of a process of detecting a marker in the apparatus and method for automatically extracting stadium information according to an embodiment of the present invention.
FIG. 5 is a diagram illustrating an example of a process of detecting a stadium in the apparatus and method for automatically extracting stadium information according to an embodiment of the present invention.
FIG. 6 is a diagram illustrating an example of a process of comparing the detected stadium and predetermined stadium information in the automatic stadium information extraction apparatus and method according to an embodiment of the present invention.
FIGS. 7 and 8 are diagrams showing results obtained with the apparatus and method for automatically extracting stadium information according to an embodiment of the present invention.
In this specification, terms such as "first" and "second" are used only to distinguish one component from another and are not intended to limit the components.
Where components, features, or steps are described as "comprising" or "including", this means that such components, features, and steps are present and does not exclude the presence or addition of one or more other components, features, steps, or equivalents thereof.
Unless otherwise specified, references in the singular include the plural.
Unless defined otherwise, all terms used herein, including technical and scientific terms, have the same meaning as commonly understood by one of ordinary skill in the art to which this invention belongs. Terms such as those defined in commonly used dictionaries should be construed as having meanings consistent with their meanings in the context of the related art and, unless expressly so defined herein, are not to be construed in an idealized or overly formal sense.
Hereinafter, an apparatus and method for automatically extracting stadium information according to an embodiment of the present invention will be described in detail with reference to the accompanying drawings.
FIG. 1 is a block diagram of an apparatus for automatically extracting stadium information according to an embodiment of the present invention.
Referring to FIG. 1, the apparatus 100 for automatically extracting stadium information according to an exemplary embodiment of the present invention may include a transceiver 101, a storage unit 103, an analysis unit 105, and a central processing unit 107.
The transceiver 101 transmits and receives data to and from the user's external device and the server.
The storage unit 103 stores the data received through the transceiver 101.
The analysis unit 105 sets a reference position based on the user's GPS location information received through the transceiver 101, detects a target zone including the set reference position, and compares the detected target zone with the reference zone received from the server to detect the measurement zone information, which it stores in the storage unit 103.
The central processing unit 107 controls the overall operation of the transceiver 101, the storage unit 103, and the analysis unit 105.
FIG. 2 is a flowchart briefly illustrating a method for automatically extracting stadium information according to an exemplary embodiment of the present invention.
Referring to FIG. 2, the method for automatically extracting stadium information according to an embodiment of the present invention may include a GPS data reception and reference position setting step (S201), a map search step (S203), a marker detection step (S205), a target zone detection step (S207), a figure matching step (S209), a measurement zone information detection step (S211), and a server storing step (S213).
The GPS data reception and reference position setting step (S201) sets the reference position from the user's GPS location information received through the transceiver 101. The received GPS location information may be GPS location data measured for the user over a predetermined period, and the reference position may be set to the average or mode of that data.
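The reference-position computation of step S201 can be sketched as follows. This is a minimal illustration under stated assumptions, not the patented implementation: the function name and the 5-decimal rounding used so that repeated GPS fixes can coincide for the mode are choices made here for the example.

```python
# Illustrative sketch of step S201: reduce the user's GPS samples,
# collected over a period of play, to a single reference position.
# Names and the rounding precision are assumptions, not from the patent.
from collections import Counter

def set_reference_position(samples, use_mode=False):
    """samples: list of (lat, lon) tuples. Returns one (lat, lon)."""
    if use_mode:
        # Mode: the most frequently measured position; samples are rounded
        # to 5 decimal places (roughly 1 m) so repeated fixes can coincide.
        rounded = [(round(lat, 5), round(lon, 5)) for lat, lon in samples]
        return Counter(rounded).most_common(1)[0][0]
    # Average of all samples (adequate over stadium-sized areas).
    lats, lons = zip(*samples)
    return (sum(lats) / len(lats), sum(lons) / len(lons))

samples = [(37.5665, 126.9780), (37.5667, 126.9782), (37.5666, 126.9781)]
ref = set_reference_position(samples)          # average position
ref_mode = set_reference_position(samples + [samples[0]], use_mode=True)
```

Either reduction places the reference position at a point inside the area where the user moved during the game, which is what the following map search step requires.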
The map search step (S203) displays, on an Internet-based map service, the location corresponding to the GPS data of the reference position; that is, it displays a marker at the set reference position on the map service.
For example, when the user plays a soccer game, the reference position set in step S201 will correspond to GPS data inside the soccer stadium. Since the user exercised on the soccer field, the average or most frequently measured position of the user's movement during the game will lie at a point inside the playing field.
The marker detection step (S205) detects the marker displayed on the map service in the map search step (S203). In this step, the map image on which the marker is displayed is extracted, converted from the RGB domain to the HSV domain, and the position of the marker on the map image is detected through the HUE, SATURATION, and VALUE values of the map image.
When the RGB image of the map on which the marker is displayed is converted to the HSV domain, the marker's HUE, SATURATION, and VALUE values appear as specific values common to the marker, and the marker is detected through those values. Detection of markers via HSV conversion is therefore consistent in any region.
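The HSV-based marker detection just described can be sketched as below. The pure-red marker colour, the tolerance band, and the dictionary image representation are assumptions made for the example, not details from the patent.

```python
# Minimal sketch of the HSV-based marker detection (S205). A pixel is
# treated as part of the marker when its hue/saturation/value fall within
# a band preset for the marker colour. The pure-red marker colour and the
# tolerances are illustrative assumptions.
import colorsys

def detect_marker(image, marker_hsv=(0.0, 1.0, 1.0), tol=(0.02, 0.2, 0.2)):
    """image: {(row, col): (R, G, B)}. Returns marker pixel positions."""
    hits = []
    for pos, (r, g, b) in image.items():
        h, s, v = colorsys.rgb_to_hsv(r / 255, g / 255, b / 255)
        if (abs(h - marker_hsv[0]) <= tol[0]
                and abs(s - marker_hsv[1]) <= tol[1]
                and abs(v - marker_hsv[2]) <= tol[2]):
            hits.append(pos)
    return hits

# Tiny 2x2 "map image": one pure-red marker pixel among green field pixels.
image = {(0, 0): (34, 139, 34), (0, 1): (255, 0, 0),
         (1, 0): (34, 139, 34), (1, 1): (40, 150, 40)}
marker_pixels = detect_marker(image)
```

Because the test is on HSV values preset for the marker colour rather than on raw RGB, the same band detects the marker regardless of which map region is being processed.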
In the target zone (stadium) detection step (S207), the target zone including the marker is detected on the map image through the HUE and SATURATION values around the marker in the HSV-converted map image, and the surrounding area outside the target zone can be filtered out.
By sampling the HUE and SATURATION values around the marker and calculating their average and error range, the area satisfying those HUE and SATURATION values can be detected as the target zone, and the area outside it can be filtered out. Unlike the VALUE value, the HUE and SATURATION values are not influenced by shadows, so relatively constant values can be obtained and used as feature points.
For example, if the user played a soccer game, the reference position will have been set to a point inside the stadium, so the marker on the map image will be displayed inside the stadium. The target zone including the marker is therefore the soccer field where the user played.
That is, the target zone (stadium) detection step (S207) detects the position of the marker and the target zone including the marker by HSV-converting the extracted map image.
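The hue/saturation averaging and filtering of step S207 can be sketched as follows. The neighbourhood radius sampled around the marker and the error-band multiplier `k` are assumptions for illustration; the patent only specifies that an average and error range of the surrounding HUE and SATURATION values are used.

```python
# Sketch of the target-zone detection (S207): sample HUE and SATURATION
# around the marker, compute their mean and error range, and keep every
# pixel inside the band (VALUE is ignored as shadow-sensitive). The
# neighbourhood radius and the band multiplier k are assumptions.
import colorsys

def detect_target_zone(image, marker, radius=1, k=3.0):
    """image: {(row, col): (R, G, B)}; marker: (row, col). Returns a set."""
    def hue_sat(rgb):
        h, s, _ = colorsys.rgb_to_hsv(*(c / 255 for c in rgb))
        return h, s
    # Sample the marker's neighbourhood, excluding the marker pixel itself.
    near = [hue_sat(image[p]) for p in image
            if p != marker
            and abs(p[0] - marker[0]) <= radius
            and abs(p[1] - marker[1]) <= radius]
    mean_h = sum(h for h, _ in near) / len(near)
    mean_s = sum(s for _, s in near) / len(near)
    err_h = max(abs(h - mean_h) for h, _ in near) + 1e-6
    err_s = max(abs(s - mean_s) for _, s in near) + 1e-6
    return {p for p in image
            if abs(hue_sat(image[p])[0] - mean_h) <= k * err_h
            and abs(hue_sat(image[p])[1] - mean_s) <= k * err_s}

# 3x3 grass patch with a red marker at the centre, plus one blue
# off-field pixel; only the grass should survive the filter.
image = {(r, c): ((34, 139, 34) if (r + c) % 2 else (40, 150, 40))
         for r in range(3) for c in range(3)}
image[(1, 1)] = (255, 0, 0)   # marker
image[(0, 3)] = (0, 0, 255)   # off-field pixel
zone = detect_target_zone(image, marker=(1, 1))
```

Note that the filter keys only on hue and saturation, mirroring the patent's observation that VALUE varies with shadows while the other two channels stay relatively constant.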
The figure matching step (S209) compares the target zone detected in step S207 with the reference zone received from the server through the transceiver 101. Specifically, a preset figure having the same size and shape as the reference zone is compared with the target zone.
After placing the center of the figure at the center of the target zone on the map, the figure is moved up, down, left, or right, and the area where the figure and the target zone overlap is checked. The larger the overlap, the smaller the Mean Square Error (hereinafter, MSE) between the target zone and the figure; if they match perfectly, the MSE may be zero.
When the MSE between the target zone and the figure becomes minimal, that is, when the overlap is greatest, the position of the figure on the map and the magnitude of its rotation angle can be extracted.
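The matching search can be sketched as a translation-only scan over binary masks; rotation, which the patent also searches over, is omitted here for brevity, and the exhaustive grid scan is an assumption made for the example.

```python
# Sketch of the figure-matching search (S209) over translations only;
# the patent additionally searches rotation angles. The two shapes are
# binary masks, and the mean square error (MSE) between them counts
# cells covered by exactly one mask, so the offset with the largest
# overlap has the smallest MSE (0 for a perfect match).

def best_offset(target, rect_h, rect_w, grid_h, grid_w):
    """target: set of (row, col) zone pixels. Returns (dr, dc, mse)."""
    best = None
    for dr in range(grid_h - rect_h + 1):
        for dc in range(grid_w - rect_w + 1):
            rect = {(dr + r, dc + c)
                    for r in range(rect_h) for c in range(rect_w)}
            # Symmetric difference = cells where the two masks disagree.
            mse = len(rect ^ target) / (grid_h * grid_w)
            if best is None or mse < best[2]:
                best = (dr, dc, mse)
    return best

# A 3x4 target zone detected at rows 2-4, columns 3-6 of an 8x10 map.
target = {(r, c) for r in range(2, 5) for c in range(3, 7)}
dr, dc, mse = best_offset(target, rect_h=3, rect_w=4, grid_h=8, grid_w=10)
```

When the rectangle lands exactly on the target zone the symmetric difference is empty and the MSE is zero, matching the perfect-overlap case described above.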
The measurement zone information detection step (S211) detects the GPS coordinates of the figure's vertices, which constitute the measurement zone information, through the detected position of the figure and the magnitude of its rotation angle. Since the scale of the map is known, and the position and rotation angle of the figure on the map image are known, the position of each vertex of the figure relative to the marker can be determined. Since the GPS coordinates of the marker are also known, each vertex's relative position can be converted into GPS coordinates. The GPS coordinates of the figure's vertices are the measurement zone information, i.e., the GPS coordinate information of the vertices of the reference zone.
That is, the measurement zone information is the GPS coordinate information obtained by adjusting the position of the figure and converting the figure's vertex coordinates when its overlap with the target zone is largest.
For example, when a user plays a soccer game, a rectangle having the same size and shape as the soccer stadium where the game was played may be compared with the target zone. After positioning the center of the rectangle at the center of the target zone, the rectangle is moved up, down, left, or right, or rotated, and the position and rotation angle of the rectangle on the map image are detected when the MSE between the rectangle and the target zone is minimal.
The relative positions from the marker to the four vertices of the rectangle can then be determined from the detected position and rotation angle. Since the map scale and the marker's GPS coordinates are known, the GPS coordinates of the four vertices can be obtained from these relative positions. The GPS coordinates of the rectangle's four vertices are the GPS coordinates of the soccer stadium's four vertices, which constitute the measurement zone information.
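The pixel-to-GPS conversion of step S211 can be sketched as below. The flat-earth approximation, the metres-per-degree constant, and all names and example numbers are assumptions made for illustration; over a stadium-sized area the approximation error is negligible.

```python
# Sketch of the vertex-to-GPS conversion (S211): a vertex's pixel offset
# from the marker is scaled by the map resolution (metres per pixel) and
# added to the marker's known GPS coordinates. The flat-earth conversion
# below is an approximation adequate over stadium-sized areas.
import math

METRES_PER_DEG_LAT = 111_320.0  # approximate metres per degree of latitude

def vertex_to_gps(marker_px, marker_gps, vertex_px, metres_per_pixel):
    """Pixel coords are (row, col); row grows southward. Returns (lat, lon)."""
    d_row = vertex_px[0] - marker_px[0]
    d_col = vertex_px[1] - marker_px[1]
    north_m = -d_row * metres_per_pixel   # up in the image is north
    east_m = d_col * metres_per_pixel
    lat0, lon0 = marker_gps
    lat = lat0 + north_m / METRES_PER_DEG_LAT
    # A degree of longitude shrinks with the cosine of the latitude.
    lon = lon0 + east_m / (METRES_PER_DEG_LAT * math.cos(math.radians(lat0)))
    return lat, lon

# Marker at pixel (100, 100) with known GPS; one stadium corner detected
# 50 px up and 50 px right on a 0.5 m/px map.
corner = vertex_to_gps((100, 100), (37.5665, 126.9780), (50, 150), 0.5)
```

Applying this to each of the rectangle's four vertices yields the four GPS coordinates that make up the measurement zone information.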
In the server storing step (S213), the measurement zone information detected in step S211, that is, the GPS coordinate information of the figure's vertices, is transmitted through the transceiver 101 and stored in the server.
FIGS. 3 and 4 are diagrams illustrating an example of a process of detecting a marker in the apparatus and method for automatically extracting stadium information according to an embodiment of the present invention.
Referring to FIGS. 3 and 4, when the map image 300 is converted to the HSV domain, a common portion of the marker's HUE, SATURATION, and VALUE conversion values 401 to 405 appears as specific values, and the marker position 407 can be detected through those values.
FIG. 5 is a diagram illustrating an example of a process of detecting a stadium in the apparatus and method for automatically extracting stadium information according to an embodiment of the present invention.
Referring to FIG. 5, the average and error range may be calculated from the HUE and SATURATION values around the marker. Unlike VALUE, the HUE and SATURATION values are not affected by shadows, so relatively constant values can be obtained and used as feature points. By comparing the HSV conversion values 501 to 505 of the map image against this band, the target zone 507 including the marker can be detected.
FIG. 6 is a diagram illustrating an example of a process of comparing the detected stadium and predetermined stadium information in the automatic stadium information extraction apparatus and method according to an embodiment of the present invention.
Referring to FIG. 6, a figure having the same size and shape as the reference zone is compared with the target zone, and the position and rotation angle of the figure at which the figure and the target zone overlap most, i.e., at which the MSE is minimized, are detected. By comparing the shape of the rectangle with the detected target zone 507, the position and rotation angle that minimize the MSE can be found.
FIGS. 7 and 8 are diagrams showing results of an embodiment of the apparatus and method for automatically extracting stadium information according to an embodiment of the present invention.
Referring to FIGS. 7 and 8, FIG. 7 shows a simulation result of the present invention for the Santiago Bernabeu Stadium, with the four vertices of the stadium marked as red dots on the map image, and FIG. 8 shows the result of a simulation at another soccer stadium.
Although the present invention has been shown herein in some illustrative aspects, various modifications or changes may be made without departing from the scope defined by the claims that follow, and the technical scope of protection of the invention must be determined by the following claims.
100: automatic stadium information extraction device
101: transceiver
103: storage unit
105: analysis unit
107: central processing unit
300: map image
401 to 405: HSV conversion value of the map image for the marker
407: marker position
501 to 505: HSV conversion value of the map image for the target area
507: detected target area
Claims (16)
Setting a reference position based on the received position information,
Acquire a map image in which the set reference position is marked with a marker,
Extracting images obtained by processing the acquired map image,
Detecting a target area including the set reference position based on the extracted images;
Generating a figure corresponding to the received reference zone information;
By adjusting the position and rotation angle of the generated figure to adjust the Mean Square Error (MSE) between the generated figure and the detected target area,
When the adjusted MSE is minimum, the coordinates corresponding to the vertices of the figure are extracted from the obtained map image,
It includes an analysis unit for extracting the position information corresponding to the coordinates of the vertices of the figure,
Stadium information automatic extraction device.
The analysis unit,
Detecting the position of the displayed marker in the acquired map image based on a color (Hue), saturation, and lightness (Value) preset to the displayed marker,
Detecting the target area including the position of the detected marker among the overlapping areas in the extracted images,
Stadium information automatic extraction device.
The analysis unit,
Extracting processed images by adjusting color and saturation of the obtained map image,
Stadium information automatic extraction device.
The analysis unit,
Based on the position of the marker displayed on the acquired map image, converting the extracted coordinates into the position information, and extracting measurement area information based on the converted position information,
Stadium information automatic extraction device.
The analysis unit,
Adjusting the MSE between the generated figure and the detected target zone while moving and rotating the generated figure after placing the center of the generated figure at the center of the detected target zone;
Stadium information automatic extraction device.
The analysis unit,
Setting the reference position to an average value or a mode of the received location information,
Stadium information automatic extraction device.
The transceiver unit,
Transmitting measurement area information on the location information corresponding to the extracted coordinates to the server,
Stadium information automatic extraction device.
The apparatus may further include a storage unit configured to store measurement area information on the received location information, the received reference area information, and location information corresponding to the extracted coordinates.
Stadium information automatic extraction device.
Setting a reference position in the analysis unit based on the received position information;
Obtaining, by the analysis unit, a map image in which the set reference position is indicated by a marker, and extracting images obtained by processing the acquired map image;
Detecting, by the analyzing unit, a target area including the set reference position based on the extracted images;
Generating, by the analysis unit, a figure corresponding to the received reference zone information;
Adjusting, by the analyzer, a mean square error (MSE) between the generated figure and the detected target area by adjusting a position and a rotation angle of the generated figure;
In the analyzing unit, extracting coordinates corresponding to vertices of the figure from the obtained map image when the adjusted MSE becomes minimum; And
Extracting location information corresponding to coordinates of vertices of the figure;
Including,
How to automatically extract stadium information.
Detecting the target area,
Detecting a position of the displayed marker in the obtained map image based on a color (Hue), a saturation, and a brightness value preset to the displayed marker; And
Detecting an area including the position of the detected marker among the overlapping areas in the extracted images as the target area;
Automatic stadium information extraction method.
Extracting the processed images,
Extracting processed images by adjusting color and saturation of the obtained map image, respectively.
Automatic stadium information extraction method.
Extracting the location information,
Converting the extracted coordinates into the location information based on the location of the marker displayed on the acquired map image; And
Extracting measurement area information based on the converted location information;
Automatic stadium information extraction method.
Adjusting the MSE,
Positioning a center of the generated figure at the center of the detected target zone, and adjusting an MSE between the generated figure and the detected target zone while moving and rotating the generated figure.
Automatic stadium information extraction method.
The setting of the reference position,
And setting the reference position to an average value or a mode of the received location information.
Automatic stadium information extraction method.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
KR1020170060556A KR102012513B1 (en) | 2017-05-16 | 2017-05-16 | Apparatus and method for automatic extraction of field information |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
KR1020170060556A KR102012513B1 (en) | 2017-05-16 | 2017-05-16 | Apparatus and method for automatic extraction of field information |
Publications (2)
Publication Number | Publication Date |
---|---|
KR20180125789A KR20180125789A (en) | 2018-11-26 |
KR102012513B1 (en) | 2019-08-22 |
Family
ID=64603122
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
KR1020170060556A KR102012513B1 (en) | 2017-05-16 | 2017-05-16 | Apparatus and method for automatic extraction of field information |
Country Status (1)
Country | Link |
---|---|
KR (1) | KR102012513B1 (en) |
Families Citing this family (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
KR101991081B1 (en) * | 2018-11-28 | 2019-06-19 | 이민호 | grass cutting service system |
Citations (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2010118019A (en) * | 2008-11-14 | 2010-05-27 | Sharp Corp | Terminal device, distribution device, control method of terminal device, control method of distribution device, control program, and recording medium |
JP4859699B2 (en) | 2007-02-07 | 2012-01-25 | ヤフー株式会社 | Deformation map position specifying method, deformed map position specifying system, measurement map position specifying method, and measurement map position specifying system |
Family Cites Families (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
KR101434888B1 (en) | 2012-11-19 | 2014-09-02 | 네이버 주식회사 | Map service method and system of providing target contents based on location |
- 2017-05-16: KR1020170060556A filed in KR; patent KR102012513B1 (en), status: active, IP Right Grant
Patent Citations (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP4859699B2 (en) | 2007-02-07 | 2012-01-25 | ヤフー株式会社 | Deformation map position specifying method, deformed map position specifying system, measurement map position specifying method, and measurement map position specifying system |
JP2010118019A (en) * | 2008-11-14 | 2010-05-27 | Sharp Corp | Terminal device, distribution device, control method of terminal device, control method of distribution device, control program, and recording medium |
Also Published As
Publication number | Publication date |
---|---|
KR20180125789A (en) | 2018-11-26 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US10083368B2 (en) | Incremental learning for dynamic feature database management in an object recognition system | |
Zheng et al. | A robust and automatic recognition system of analog instruments in power system by using computer vision | |
US10559081B2 (en) | Method and system for automated visual analysis of a dipstick using standard user equipment | |
CN106767810B (en) | Indoor positioning method and system based on WIFI and visual information of mobile terminal | |
US20180188033A1 (en) | Navigation method and device | |
US9940716B2 (en) | Method for processing local information | |
US10800550B2 (en) | Positioning enhancements to localization process for three-dimensional visualization | |
JPWO2014027522A1 (en) | Image processing apparatus, image processing method, program, and image processing system | |
TW201346216A (en) | Virtual ruler | |
CN110794955B (en) | Positioning tracking method, device, terminal equipment and computer readable storage medium | |
CN106845514B (en) | Deep learning-based reading judgment method and device for pointer type dial plate | |
KR20010013521A (en) | Image processing device and method, medium on which program for image processing is stored, and inspecting device | |
US10298780B2 (en) | Long range image calibration | |
CN109745014B (en) | Temperature measurement method and related product | |
TW201531868A (en) | Multiview pruning of feature database for object recognition system | |
KR102012513B1 (en) | Apparatus and method for automatic extraction of field information | |
CN110986889A (en) | High-voltage substation panoramic monitoring method based on remote sensing image technology | |
CN110189329A (en) | System and method for positioning the color block areas of colour atla | |
KR101010904B1 (en) | Apparatus for providing augmented space without using markers | |
CN115423804B (en) | Image calibration method and device and image processing method | |
CN110162362B (en) | Dynamic control position detection and test method, device, equipment and storage medium | |
US11842452B2 (en) | Portable display device with overlaid virtual information | |
JP6702118B2 (en) | Diagnosis support device, image processing method in the diagnosis support device, and program | |
CN115115619A (en) | Feature point extraction method, device and equipment based on circle fitting and storage medium | |
CN117373565A (en) | Library construction method, identification method and device for ion mobility spectrometry-mass spectrometry |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
A201 | Request for examination | ||
E902 | Notification of reason for refusal | ||
AMND | Amendment | ||
E601 | Decision to refuse application | ||
AMND | Amendment | ||
E90F | Notification of reason for final refusal | ||
AMND | Amendment | ||
X701 | Decision to grant (after re-examination) |