WO2013111938A1 - Method and system for real-time editing of digital maps, and associated server and recording medium - Google Patents


Info

Publication number
WO2013111938A1
Authority
WIPO (PCT)
Prior art keywords
information, editing, mobile client, augmented reality server
Application number
PCT/KR2012/007230
Other languages
English (en)
Korean (ko)
Inventor
성동권
전형섭
강재욱
이종현
Original Assignee
(주)올포랜드
Application filed by (주)올포랜드
Publication of WO2013111938A1 (fr)

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 17/00 Three-dimensional [3D] modelling, e.g. data description of 3D objects
    • G06T 17/05 Geographic models
    • G06Q INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES
    • G06Q 50/00 Information and communication technology [ICT] specially adapted for implementation of business processes of specific business sectors, e.g. utilities or tourism
    • G06Q 50/10 Services
    • G06T 19/00 Manipulating 3D models or images for computer graphics
    • G06T 19/006 Mixed reality
    • G06T 19/20 Editing of 3D images, e.g. changing shapes or colours, aligning objects or positioning parts

Definitions

  • The present invention relates to a method and system for editing a digital map in real time, and to a server and recording medium therefor. More particularly, it relates to a method and system that allow changes in digital map information to be quickly and easily recognized through the display screen of a mobile client based on augmented reality, and that allow the digital map information of an augmented reality server to be easily and quickly edited and updated, together with a server and recording medium therefor.
  • A digital map is produced by analyzing, in digital form, various topographic data obtained from map surveys, aerial photographs, satellite images, and the like.
  • When a paper map is digitized or scanned to form a digital map, it is converted into a real-world coordinate system, according to the user's purpose, through coordinate transformation.
  • Attribute data is then entered into the digital map, and a topological structure is established to capture the relative positions of and relationships between spatial objects.
  • A digital map produced in this way enables faster and more accurate map retrieval than a paper map.
  • Because the digital map excels in information management and usability, it can support various planning and decision-making tasks more effectively.
  • Korean Patent Registration No. 0878778 (registered 2009.01.08) discloses a "digital map editing system".
  • This system compares route information on the digital map with the movement route of an editing terminal measured during a field survey, then merges, divides, or modifies polygons on the digital map to edit the route information, and corrects the attribute information on any mismatched portions of the digital map.
  • Korean Patent Publication No. 2011-0070663 (published June 24, 2011) discloses a "facility information providing apparatus and method".
  • Although this apparatus and method precisely map site information to facility information, enabling accurate augmented reality, they cannot edit the stored digital map information.
  • An object of the present invention, addressing the problems of the prior art, is to provide a method and system for real-time editing of a digital map, and a server and recording medium therefor, that allow changes in the digital map information to be quickly and easily recognized and edited through the display screen of a mobile client based on augmented reality, and that allow the digital map information of the augmented reality server to be easily and quickly edited and updated.
  • A first aspect of the present invention is a real-time digital map editing method using an augmented reality server and a mobile client interconnected via a network, comprising: (a) the augmented reality server transmitting, to the mobile client, digital map information corresponding to location information transmitted from the mobile client; (b) the mobile client generating 3D objects based on the received digital map information, matching the real-time image acquired through a camera with the 3D objects, and displaying the result graphically on a display; (c) the mobile client providing the user with an editing function for editing the digital map information (the editing function comprising selection of a 3D object displayed on the display and modification of information about the selected 3D object) and, when editing information is input, transmitting the edited information to the augmented reality server; and (d) the augmented reality server updating the corresponding digital map information based on the received edited information.
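The four steps (a) through (d) can be sketched as a simple client-server exchange. The class and method names below (AugmentedRealityServer, MobileClient, get_map_info, apply_edit) and the sample data are illustrative assumptions, not part of the patent:

```python
# Minimal sketch of the (a)-(d) editing flow; names and data are hypothetical.

class AugmentedRealityServer:
    def __init__(self):
        # digital map DB keyed by object id
        self.map_db = {
            "manhole-7": {"coords": (127.001, 37.500, 12.0), "attrs": {"radius_m": 0.4}},
        }

    def get_map_info(self, location):
        # (a) return digital map info for the client's reported location
        return dict(self.map_db)

    def apply_edit(self, object_id, edit):
        # (d) update the stored digital map info with the received edit
        self.map_db[object_id].update(edit)


class MobileClient:
    def __init__(self, server, location):
        self.server = server
        self.location = location
        self.objects = {}

    def refresh(self):
        # (b) fetch map info; rendering over camera video is omitted here
        self.objects = self.server.get_map_info(self.location)

    def edit_object(self, object_id, edit):
        # (c) user selects a displayed 3D object and submits a modification
        self.server.apply_edit(object_id, edit)


server = AugmentedRealityServer()
client = MobileClient(server, location=(127.001, 37.500))
client.refresh()
client.edit_object("manhole-7", {"coords": (127.0012, 37.5001, 12.0)})
client.refresh()
print(client.objects["manhole-7"]["coords"])  # (127.0012, 37.5001, 12.0)
```

The key property of the flow is that the authoritative copy lives on the server; the client only renders what it last fetched, which is why step (e)/(f) below re-transmits the updated map for confirmation.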
  • A second aspect of the present invention is a real-time digital map editing method executed in an augmented reality server connected to a mobile client via a network, comprising: (a) receiving a request to transmit digital map information corresponding to location information transmitted from the mobile client; (b) transmitting the requested digital map information to the mobile client, so that the mobile client generates 3D objects based on the digital map information, matches the 3D objects with the real-time image obtained through a camera based on the location information and attitude information of the mobile client, displays the result graphically on a display, and provides the user with an editing function for editing the digital map information (the editing function comprising a selection function for selecting at least one point on a 3D object displayed on the display and a modification function for modifying information of the 3D object defined by the selected point); and (c) when edited information is transmitted from the mobile client, updating the digital map information based on that edited information.
  • Here, the 3D objects may be classified as at least one of point, linear, and planar objects.
  • The information about a 3D object includes at least one of spatial information and attribute information, and the editing may concern at least one of the two.
  • A point object has its spatial information defined by the coordinates of a single point, and editing a point object is performed by defining the coordinates of a new point as its new spatial information while the editing function for that object is active.
  • A linear object has its spatial information defined by the coordinates of the two end points of the line, and editing a linear object is performed by defining new coordinates for one or both end points as new spatial information while the editing function for that object is active.
  • A planar object has its spatial information defined by at least three vertex coordinates forming a face, and editing a planar object is performed by defining at least one new vertex coordinate as new spatial information while the editing function for that object is active.
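The point, linear, and planar spatial-information edits described above can be illustrated with a minimal data model; the dictionary layout and the edit_vertex helper are assumptions for illustration only:

```python
# Illustrative data model for the three 3D object classes and their
# spatial-information edits (editing mode is assumed to be active).

point = {"type": "point", "coords": [(10.0, 20.0, 5.0)]}               # one coordinate
line = {"type": "line", "coords": [(0.0, 0.0, 0.0), (4.0, 0.0, 0.0)]}  # two end points
plane = {"type": "plane", "coords": [(0.0, 0.0, 0.0), (4.0, 0.0, 0.0), (4.0, 3.0, 0.0)]}  # >= 3 vertices

def edit_vertex(obj, index, new_coord):
    """Redefine one vertex as new spatial information."""
    obj["coords"][index] = new_coord

edit_vertex(point, 0, (10.5, 20.5, 5.0))   # point: define a new point coordinate
edit_vertex(line, 1, (5.0, 1.0, 0.0))      # line: move one end point
plane["coords"].append((0.0, 3.0, 0.0))    # plane: add a new vertex
```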
  • The attribute information may include object rendering attribute information used to generate the 3D object and object identification attribute information used to identify the 3D object.
  • Editing of the object identification attribute information may comprise: (aa) the mobile client requesting, from the augmented reality server, the object identification attribute information for a 3D object displayed on the display; (ab) the augmented reality server transmitting the requested object identification attribute information to the mobile client; (ac) the mobile client displaying the received object identification attribute information on the display; and (ad) receiving edited object identification attribute information from the user and transmitting it to the augmented reality server.
  • The method of the first aspect may further comprise, after step (d): (e) the augmented reality server transmitting the updated digital map information to the mobile client; and (f) the mobile client generating 3D objects based on the updated digital map information, matching the real-time image acquired through the camera with the 3D objects, and updating the graphical display accordingly.
  • The digital map information transmitted to the mobile client includes, as information on terrain 3D objects, the central tile corresponding to the location information among a plurality of model tiles of a digital elevation model (DEM) tiled to a predetermined size, together with information on the outer tiles surrounding that central tile, and includes, as information on facility 3D objects, the 3D object information of facilities within a predetermined radius of the location.
  • The augmented reality server periodically receives the location information of the mobile client. While the received location changes but remains within the central tile, the server transmits to the mobile client only those facility 3D objects within the predetermined radius of the new location that have not already been transmitted. When the received location moves from the central tile into an outer tile, the server newly designates that outer tile as the central tile, and transmits the information on the new outer tiles corresponding to the direction of movement, together with those facility 3D objects within the predetermined radius of the new location that have not already been transmitted.
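The tile and radius logic above can be sketched as follows, assuming square tiles indexed by integer column and row and planar coordinates in metres; the tile size, coordinate system, and function names are illustrative assumptions:

```python
import math

TILE_SIZE = 1000.0  # assumed tile edge length in metres

def central_tile(pos):
    """Index of the DEM model tile containing the position."""
    return (int(pos[0] // TILE_SIZE), int(pos[1] // TILE_SIZE))

def tiles_to_send(pos):
    """Central tile plus the eight outer tiles surrounding it."""
    cx, cy = central_tile(pos)
    return [(cx + dx, cy + dy) for dx in (-1, 0, 1) for dy in (-1, 0, 1)]

def facilities_to_send(pos, radius, facilities, already_sent):
    """Facility 3D objects within the radius that were not already transmitted."""
    out = []
    for fid, (x, y) in facilities.items():
        if fid not in already_sent and math.hypot(x - pos[0], y - pos[1]) <= radius:
            out.append(fid)
    return out

facilities = {"hydrant-1": (120.0, 80.0), "manhole-2": (950.0, 40.0)}
sent = {"hydrant-1"}  # hydrant-1 was transmitted on a previous update
print(facilities_to_send((100.0, 100.0), 200.0, facilities, sent))  # []
print(len(tiles_to_send((100.0, 100.0))))  # 9
```

The deduplication against `already_sent` mirrors the claim's requirement that objects "not duplicated with the 3D object information of the facility immediately transmitted" are sent, keeping per-update traffic small as the client moves.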
  • A third aspect of the present invention is a real-time digital map editing system comprising an augmented reality server and a mobile client interconnected via a network. The augmented reality server includes: a digital map information module for managing digital map information in conjunction with a digital map information DB; a digital map information providing module for receiving a digital map information request for predetermined location information of the mobile client and transmitting the corresponding digital map information to the mobile client; and a digital map information updating module for updating the digital map information based on edited information transmitted from the mobile client. The mobile client includes: a location information module for providing location information; a digital map information receiving module for transmitting the location information to the augmented reality server and receiving the corresponding digital map information; a 3D object generation module for generating 3D objects based on the digital map information; a posture information module for providing posture information; an image acquisition module for obtaining real-time image information through a camera; a matching module for superimposing the 3D objects on the image information based on the position and posture information; a display module for graphically displaying the matched 3D objects and image information on a display; an editing module for providing the user with an editing function for editing the digital map information (the editing function comprising selection of a 3D object displayed on the display and modification of information about the selected 3D object); and an edit information transmission module for transmitting the edited information to the augmented reality server.
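The matching module's superposition of a 3D object onto the camera image, based on position and posture information, amounts to a camera projection. A deliberately simplified pinhole sketch with yaw-only posture follows; the function name, parameters, and camera model are assumptions, and a real matching module would use the full attitude (yaw, pitch, roll) from the device sensors:

```python
import math

def project_to_screen(obj_xyz, cam_xyz, yaw_deg, focal_px, cx, cy):
    """Project a world point into pixel coordinates for an AR overlay."""
    dx = obj_xyz[0] - cam_xyz[0]
    dy = obj_xyz[1] - cam_xyz[1]
    dz = obj_xyz[2] - cam_xyz[2]
    # rotate the world offset into the camera frame by the yaw angle
    yaw = math.radians(yaw_deg)
    right = dx * math.cos(yaw) - dy * math.sin(yaw)
    forward = dx * math.sin(yaw) + dy * math.cos(yaw)
    if forward <= 0:
        return None  # behind the camera: not drawn
    u = cx + focal_px * right / forward
    v = cy - focal_px * dz / forward
    return (u, v)

# an object straight ahead of a camera looking along +y lands at screen centre
print(project_to_screen((0, 10, 0), (0, 0, 0), 0, 800, 640, 360))  # (640.0, 360.0)
```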
  • Here too, the 3D objects may be classified as at least one of point, linear, and planar objects.
  • The information about a 3D object includes at least one of spatial information and attribute information, and the editing may concern at least one of the two.
  • A point object has its spatial information defined by the coordinates of a single point, and editing a point object is performed by defining the coordinates of a new point as its new spatial information while the editing function for that object is active.
  • A linear object has its spatial information defined by the coordinates of the two end points of the line, and editing a linear object is performed by defining new coordinates for one or both end points as new spatial information while the editing function for that object is active.
  • A planar object has its spatial information defined by at least three vertex coordinates forming a face, and editing a planar object is performed by defining at least one new vertex coordinate as new spatial information while the editing function for that object is active.
  • The attribute information may include object rendering attribute information used to generate the 3D object and object identification attribute information used to identify the 3D object.
  • Editing of the object identification attribute information may be performed as follows: the mobile client requests, from the augmented reality server, the object identification attribute information for a 3D object displayed on the display; the augmented reality server transmits the requested information to the mobile client; the mobile client displays the received information on the display; and edited object identification attribute information is received from the user and transmitted to the augmented reality server.
  • A fourth aspect of the present invention is an augmented reality server for real-time editing of a digital map, comprising: a digital map information module for managing digital map information in conjunction with a digital map information DB; a digital map information providing module for receiving a digital map information request for predetermined location information of a mobile client and transmitting the corresponding digital map information to the mobile client; and a digital map information update module for updating the digital map information based on edited information transmitted from the mobile client. The mobile client generates 3D objects based on the received digital map information, matches the real-time image acquired through the camera with the 3D objects, displays the result graphically on a display, provides the user with an editing function for editing the digital map information (the editing function comprising selection of a 3D object displayed on the display and modification of information about the selected 3D object), and, when editing information is input, transmits it to the augmented reality server.
  • Likewise, the 3D objects may be classified as at least one of point, linear, and planar objects.
  • The information about a 3D object includes at least one of spatial information and attribute information, and the editing may concern at least one of the two.
  • A point object has its spatial information defined by the coordinates of a single point, and editing a point object is performed by defining the coordinates of a new point as its new spatial information while the editing function for that object is active.
  • A linear object has its spatial information defined by the coordinates of the two end points of the line, and editing a linear object is performed by defining new coordinates for one or both end points as new spatial information while the editing function for that object is active.
  • A planar object has its spatial information defined by at least three vertex coordinates forming a face, and editing a planar object is performed by defining at least one new vertex coordinate as new spatial information while the editing function for that object is active.
  • The attribute information may include object rendering attribute information used to generate the 3D object and object identification attribute information used to identify the 3D object.
  • Editing of the object identification attribute information may be performed as follows: the mobile client requests, from the augmented reality server, the object identification attribute information for a 3D object displayed on the display; the augmented reality server transmits the requested information to the mobile client; the mobile client displays the received information on the display; and edited object identification attribute information is received from the user and transmitted to the augmented reality server.
  • A fifth aspect of the present invention provides a computer-readable recording medium recording a program for operating each module of the augmented reality server described above.
  • According to the present invention, changes in the digital map information can be easily and quickly recognized and edited through the display screen of the mobile client based on augmented reality, so that the digital map information of the augmented reality server can be easily and quickly edited and updated.
  • In addition, at least one of the spatial information and the attribute information of a 3D object can be edited by selecting any of the point, linear, and planar 3D objects graphically displayed on the mobile client's display based on the digital map information, and the selected 3D object is displayed distinctly from the other 3D objects, so that the object being edited can be visually confirmed.
  • Furthermore, the corresponding digital map information of the augmented reality server is updated based on the edited information transmitted from the mobile client, and the updated digital map information is transmitted back to the mobile client, so that the operator can confirm the update of the augmented reality server in real time.
  • In addition, because the VRS server supplies RTK (Real-Time Kinematic) or DGPS correction data over a conventional wireless communication network such as a CDMA, 3G, 4G, LTE, or WiBro network, the mobile client can be implemented with a conventional smartphone, mobile tablet PC, UMPC, or similar device having wireless communication and computing functions, without a dedicated base-station GPS or radio modem.
  • FIGS. 1A and 1B are diagrams illustrating the configuration of a real-time digital map editing system according to an embodiment of the present invention.
  • FIG. 2 is a block diagram showing the configuration of the augmented reality server according to an embodiment of the present invention.
  • FIG. 3 is a block diagram showing the configuration of the mobile client according to an embodiment of the present invention.
  • FIG. 4 is a block diagram illustrating the hardware configuration of the mobile client according to an embodiment of the present invention.
  • FIG. 5 is a block diagram showing the configuration of the VRS server according to an embodiment of the present invention.
  • FIG. 6 is a diagram illustrating the DEM of the real-time digital map editing system according to an embodiment of the present invention.
  • FIG. 7 is a diagram illustrating the case where the location information of the real-time digital map editing system changes within the central tile, according to an embodiment of the present invention.
  • FIG. 8 is a diagram illustrating the case where the location information of the real-time digital map editing system changes from the central tile to an outer tile, according to an embodiment of the present invention.
  • FIG. 9 is a diagram showing the radius around the position given by the location information of the real-time digital map editing system according to an embodiment of the present invention.
  • FIG. 10 is a diagram illustrating contour lines, for explaining the height map technique used in the 3D engine of the real-time digital map editing system according to an embodiment of the present invention.
  • FIG. 11 shows an example of a height map, for explaining the height map technique used in the 3D engine of the real-time digital map editing system according to an embodiment of the present invention.
  • FIG. 12 is a diagram showing 3D terrain generated from a height map, for explaining the height map technique used in the 3D engine of the real-time digital map editing system according to an embodiment of the present invention.
  • FIG. 13 is a photograph showing terrain rendered on the display screen of the mobile client of the real-time digital map editing system according to an embodiment of the present invention.
  • FIG. 14 is a photograph showing a screen on which a 3D object to be edited is selected while terrain is rendered on the display screen of the mobile client of the real-time digital map editing system according to an embodiment of the present invention.
  • FIG. 15A is a photograph showing property information of a building displayed while terrain information is rendered on the display screen of the mobile client of the real-time digital map editing system according to an embodiment of the present invention.
  • FIG. 15B is a photograph showing buildings and underground facilities rendered while terrain information is rendered on the display screen of the mobile client of the real-time digital map editing system according to an embodiment of the present invention.
  • FIG. 15C is a photograph showing a building rendered while terrain information is rendered on the display screen of the mobile client of the real-time digital map editing system according to an embodiment of the present invention.
  • FIG. 16 is a flowchart illustrating a process of editing digital map information in the real-time digital map editing system according to an embodiment of the present invention.
  • FIG. 17 is a flowchart illustrating a process of editing digital map information in the real-time digital map editing system according to an embodiment of the present invention.
  • FIG. 18 is a flowchart illustrating a process of receiving edited information in the real-time digital map editing system according to an embodiment of the present invention.
  • FIG. 19 is a flowchart showing the process performed by the augmented reality server according to the location information in the real-time digital map editing system according to an embodiment of the present invention.
  • FIG. 20 is a flowchart illustrating a process of editing object identification attribute information in the real-time digital map editing system according to an embodiment of the present invention.
  • Terms such as first and second are used only for the purpose of distinguishing one component from another; a first component may be referred to as a second component, and similarly a second component may be referred to as a first component.
  • As shown in FIG. 1A, the real-time digital map editing system according to an embodiment of the present invention comprises an augmented reality server 100 and a mobile client 200 interconnected via a network.
  • The augmented reality server 100 stores the digital map information in classified form and exchanges information with the mobile client 200.
  • The digital map information stored in the augmented reality server 100 may include 3D object information about terrain, 3D object information about facilities, and the like.
  • the 3D object information of the terrain may include spatial information and attribute information of the terrain
  • the 3D object information of the facility may include spatial information and attribute information of the facility.
  • The spatial information may be coordinate information about the position of a 3D object, and the attribute information may be divided into object rendering attribute information for rendering the 3D object and object identification attribute information for identifying the 3D object.
  • The 3D object is expressed on the basis of a mesh that takes the form of a virtual earth surface, and the coordinate information of each point defining the spatial information of the 3D object may be given by the x and y coordinates of the plane and the h coordinate of the altitude.
  • The spatial information about the terrain may include, for example, data carrying three-dimensional information about the terrain, such as a DEM, aerial photographs, and cadastral lines.
  • A DEM (Digital Elevation Model), also called a DTM (Digital Terrain Model), is a digital model that records the height values of points distributed at regular intervals on the earth's surface.
  • The DEM may be acquired using collection methods such as ground surveying, photogrammetry, digital maps, radar (Radio Detecting and Ranging), lidar (Light Detection and Ranging), and sonar (Sound Navigation and Ranging).
  • In addition, the augmented reality server 100 stores a plurality of DEM model tiles, the DEM being tiled into pieces of a predetermined size.
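Storing the DEM as model tiles of a predetermined size can be sketched as splitting a grid of height samples into blocks; the grid and tile sizes, and the function name, are illustrative assumptions:

```python
# Sketch of tiling a DEM height grid into fixed-size model tiles.

def tile_dem(heights, tile):
    """Split a 2D list of height samples into tile-indexed blocks."""
    rows, cols = len(heights), len(heights[0])
    tiles = {}
    for r0 in range(0, rows, tile):
        for c0 in range(0, cols, tile):
            block = [row[c0:c0 + tile] for row in heights[r0:r0 + tile]]
            tiles[(r0 // tile, c0 // tile)] = block
    return tiles

dem = [[r * 10 + c for c in range(4)] for r in range(4)]  # 4x4 height samples
tiles = tile_dem(dem, 2)
print(sorted(tiles))   # [(0, 0), (0, 1), (1, 0), (1, 1)]
print(tiles[(1, 1)])   # [[22, 23], [32, 33]]
```

Pre-tiling lets the server ship only the central tile and its neighbours to the client, rather than the whole elevation model.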
  • The attribute information about the terrain may include, for example, cadastral information, land information, and area information.
  • the spatial information about the facility may include, for example, 3D object information generated by a program such as 3D Max, 3D CAD, or the like.
  • The attribute information about the facility may include, for example, a managing institution, a management number, a date, an object name, an object unique number, the area or location of a building, a building name, the number of floors, a telephone number, a feature code, and the like.
  • the attribute information includes object rendering attribute information and object identification attribute information.
  • The object rendering attribute information is the attribute information necessary for rendering a 3D object when the mobile client 200 implements augmented reality.
  • For example, it may be numerical information about the radius of a manhole, numerical information about the diameter of a water or sewer pipe, or attribute information such as the height or number of floors of a building.
  • The object identification attribute information identifies each 3D object and comprises unique information such as the managing authority, management number, date, object name, object unique number, building name, telephone number, and feature code.
  • As shown in FIG. 2, the augmented reality server 100 comprises a digital map information module 110, a digital map information providing module 120, a digital map information update module 130, and a digital map information DB 140.
  • The digital map information module 110 manages digital map information in conjunction with the digital map information DB 140, including the spatial and attribute information on terrain and the spatial and attribute information on facilities stored in the digital map information DB 140.
  • Specifically, the digital map information module 110 may manage two- or three-dimensional spatial information about the terrain (visually represented as 'a' in FIG. 13), attribute information of the terrain ('b' in FIG. 13), 3D spatial information about the shape of facilities ('a' in FIG. 14, 'a' and 'b' in FIG. 15B, and 'a' in FIG. 15C), and attribute information of facilities ('a' and 'b' in FIG. 15A).
  • the digital map information providing module 120 is a module that receives the digital map information request for the predetermined position information of the mobile client 200 and transmits the corresponding digital map information to the mobile client 200.
  • the predetermined location information of the mobile client 200 may be "location information" generated by the mobile client 200, and the location information will be described in detail when describing the mobile client 200.
  • The digital map information transmitted by the digital map information providing module 120 to the mobile client 200 is digital map information for implementing augmented reality in the mobile client 200 and may contain 3D object information.
  • The digital map information for implementing augmented reality is information for rendering 3D objects of terrain and facilities, and may include spatial information of terrain 3D objects, spatial information of facility 3D objects, and object rendering attribute information for rendering the 3D objects.
  • The spatial information of the terrain 3D objects comprises the central tile corresponding to the location information, among a plurality of model tiles tiling a digital elevation model (DEM) to a predetermined size, and may include information on the outer tiles surrounding the central tile.
  • The spatial information of the facility 3D objects may include, for example, the 3D object information of facilities within a predetermined radius of the location, as shown in FIG. 9.
  • The object rendering attribute information may be, for example, information on the cadastral type, numerical information on the radius of a manhole, numerical information on the diameter of a water or sewer pipe, or the height or number of floors of a building.
  • the digital map information update module 130 is a module for updating the digital map information based on the edit information transmitted from the mobile client 200.
  • the editing information is information edited by the user in the mobile client 200 and transmitted to the augmented reality server 100.
  • the edit information may include edit information for the 3D object of the terrain (at least one of its spatial information and attribute information) and/or edit information for the 3D object of a facility (at least one of its spatial information and attribute information).
  • the mobile client 200 exchanges information with the augmented reality server 100 and, using the digital map information received from the augmented reality server 100, overlays a 3D virtual image on the real world based on augmented reality.
  • the mobile client 200 may be implemented through, for example, a smartphone having a wireless communication function and a computing function, a mobile tablet PC, a UMPC (Ultra Mobile Personal Computer), and the like.
  • the mobile client 200 includes a location information module 210, a digital map information receiving module 215, a 3D object generation module 220, a posture information module 225, an image acquisition module 230, a matching module 235, a display module 240, an editing module 245, and an edit information transmission module 250.
  • the location information module 210 is a module for providing location information in conjunction with a GPS (210a of FIG. 4); the GPS 210a is preferably one with a precision capable of implementing augmented reality.
  • the GPS 210a may be, for example, either a general Global Positioning System (GPS) or a Differential Global Positioning System (DGPS).
  • the GPS 210a may be configured to have a precision capable of implementing augmented reality by interworking with a virtual reference station (VRS) server.
  • the location information module 210 of the mobile client 200 transmits the location information provided from the general GPS or DGPS to the VRS server 300 through a conventional wireless communication network (CDMA, 3G, 4G, LTE, WiBro, etc.).
  • the location information module 210 may then generate and provide more accurate location information by receiving a correction value corresponding to that location information from the VRS server 300 through the same wireless communication method.
  • the VRS (Virtual Reference Station) server is a position measurement server that calculates a correction value corresponding to the position information received from the mobile client 200 in order to reduce the error of general GPS or DGPS.
  • the VRS server 300 may be, for example, a known server operated by the National Geographic Information Institute, or may be configured by operating separate VRS permanent observation stations 310 together with a server.
  • the VRS server 300 calculates the correction value using the permanent observation stations 310.
  • the permanent observation stations 310 are distributed in several places throughout the country; each is installed at a precisely surveyed reference position, receives satellite positioning signals 24 hours a day, and transmits the received results to the VRS server 300.
  • the mobile client 200 transmits the location information provided from the GPS 210a to the VRS server 300 through a conventional wireless communication network (CDMA, 3G, 4G, LTE, WiBro, etc.).
  • the VRS server 300 calculates a correction value corresponding to the point where the mobile client 200 is located, based on the received position information and the correction values of the individual permanent observation stations 310.
  • the VRS server 300 transmits the calculated correction value to the mobile client 200 through a conventional wireless communication scheme, so that the mobile client 200 receives the correction value transmitted by the VRS server 300.
  • the mobile client 200, having received the correction value, corrects its position information using the correction value to generate final position information.
  • the final position information thus generated is more precise than the position information provided from the GPS 210a alone.
  • the accuracy of the position information finally obtained using the VRS server 300 falls within an error range of ±1 to ±2 cm, so the position information thus obtained is far more accurate than that of general GPS or DGPS.
  • in short, the location information module 210 may use a GPS precise enough on its own to implement augmented reality, or may achieve a precision capable of implementing augmented reality by interworking a general GPS or DGPS with the VRS server.
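A minimal sketch of the client-side correction step: the mobile client sends its raw fix, the VRS server returns a correction value, and the client applies it to produce the final position. The additive (lat, lon) offset model and the numbers are assumptions; real VRS networks exchange standardized correction streams (e.g. RTCM) rather than plain offsets.

```python
def apply_vrs_correction(raw_fix, correction):
    """Generate the final position by applying the VRS correction value
    to the raw GPS fix (assumed additive lat/lon model)."""
    lat, lon = raw_fix
    dlat, dlon = correction
    return (lat + dlat, lon + dlon)

raw = (37.5665000, 126.9780000)   # fix reported by the GPS 210a (hypothetical)
corr = (-0.0000012, 0.0000009)    # correction value from the VRS server 300 (hypothetical)
final_fix = apply_vrs_correction(raw, corr)
```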
  • the digital map information receiving module 215 is a module that transmits the finally obtained position information to the augmented reality server 100 and receives the digital map information corresponding to that position information.
  • digital map information received by the digital map information receiving module 215 from the digital map information providing module 120 of the augmented reality server 100 may include 3D object information about the terrain and 3D object information about the facility.
  • the digital map information for implementing augmented reality is information for rendering a 3D object about a terrain and a facility.
  • the digital map information may include spatial information of a 3D object on a terrain, spatial information of a 3D object on a facility, and attribute information for object rendering for rendering of the 3D object.
  • the spatial information of the 3D object related to the terrain may include, among a plurality of model tiles obtained by tiling a digital elevation model (DEM) to a predetermined size, the central tile corresponding to the position information together with information on the outer tiles surrounding that central tile.
  • the spatial information of the 3D object related to the facility may include, for example, 3D object information of the facility corresponding to a predetermined radius around the location information, as shown in FIG. 9.
  • the object rendering attribute information may be, for example, information on the land category, numerical information on the radius of a manhole, numerical information on the diameter and thickness of a water pipe or sewer pipe, the height of a building, or the number of floors.
  • the 3D object generation module 220 is a module that, using the digital map information received from the digital map information providing module 120 of the augmented reality server 100, generates the 3D objects of the terrain and the 3D objects of the facilities by modeling them with a computing unit (220a of FIG. 4) running a 3D engine that applies at least one of a height map technique, a quadtree technique, and a frustum culling technique.
  • the computing unit 220a of the 3D object generation module 220 generates the height map using the height map technique.
  • the computing unit 220a of the 3D object generation module 220 generates a unit node by dividing the height map by a quad tree technique.
  • the computing unit 220a of the 3D object generation module 220 generates the 3D object of the terrain by culling and modeling the unit nodes using the frustum culling technique.
  • the height map technique applies the principle of contour lines to real-time 3D graphics: as shown in FIG. 10, where a contour map encodes height as the colour value of each contour line, the height map encodes it as a grey value between 0 and 255.
  • this means the 3D terrain to be built starts as a height map holding only 2D height information; as shown in FIG. 12, it is reconstructed into 3D terrain using the height map information.
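The reconstruction step can be sketched as follows: each grey value of the 2D height map becomes the z coordinate of a terrain vertex. The cell size and the maximum height scale are assumed parameters, not values from the patent.

```python
def heightmap_to_vertices(heightmap, cell_size=1.0, max_height=100.0):
    """Convert a 2D grid of 0-255 grey values into 3D terrain vertices
    (x, y from the grid position, z from the grey value)."""
    verts = []
    for row, line in enumerate(heightmap):
        for col, grey in enumerate(line):
            z = (grey / 255.0) * max_height
            verts.append((col * cell_size, row * cell_size, z))
    return verts

hm = [[0, 128],
      [255, 64]]  # tiny 2x2 height map (hypothetical)
vertices = heightmap_to_vertices(hm, cell_size=10.0, max_height=255.0)
```

Triangulating neighbouring vertices into a mesh would follow; that step is omitted here.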
  • a quadtree is a tree data structure in which each node has four children, used to recursively divide a space into four quadrants.
  • by processing only the relevant nodes of the quadtree, the amount of data to be processed by the computing unit 220a can be quickly reduced.
  • frustum culling is a technique for selecting and rendering only the visible parts by determining whether each object lies inside the view frustum shown on the screen.
  • that is, frustum culling renders only those polygons and objects actually included in the field of view of the camera (230a of FIG. 4), and does not render the others.
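A 2D sketch of the idea, testing only whether an object's position falls within the camera's horizontal field of view; a real engine tests bounding volumes against all six planes of the 3D frustum.

```python
import math

def in_view(cam_pos, cam_yaw_deg, fov_deg, point):
    """Return True if the point lies within the camera's horizontal
    field of view (a 2D frustum-culling sketch)."""
    dx = point[0] - cam_pos[0]
    dy = point[1] - cam_pos[1]
    angle = math.degrees(math.atan2(dy, dx))
    # signed angular difference in [-180, 180)
    diff = (angle - cam_yaw_deg + 180.0) % 360.0 - 180.0
    return abs(diff) <= fov_deg / 2.0

objects = [(10, 0), (0, 10), (-10, 0)]
# camera at the origin looking along +x with a 90-degree field of view
visible = [p for p in objects if in_view((0, 0), 0.0, 90.0, p)]
```

Only the objects in `visible` would be handed to the renderer; the rest are culled.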
  • the computing unit 220a of the 3D object generation module 220 uses the spatial information of the 3D object of the facility and the attribute information for object rendering of the facility, based on the 3D object of the terrain, regarding the facility. You can create 3D objects.
  • for example, a 3D object may be generated for a building by using spatial information about the building (e.g., coordinate information for its location) and attribute information for object rendering (e.g., the building's number of floors).
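As an illustration, a building's 3D block can be produced by extruding its 2D footprint (spatial information) to a height derived from its floor count (object rendering attribute information). The footprint, floor count, and 3 m floor height are hypothetical values, not taken from the patent.

```python
FLOOR_HEIGHT = 3.0  # assumed metres per storey

def extrude_building(footprint, floors, floor_height=FLOOR_HEIGHT):
    """Build the vertices of a 3D block for a facility from its 2D footprint
    and floor count: the base ring at z = 0 plus the roof ring at z = height."""
    height = floors * floor_height
    base = [(x, y, 0.0) for x, y in footprint]
    top = [(x, y, height) for x, y in footprint]
    return base + top

verts = extrude_building([(0, 0), (10, 0), (10, 10), (0, 10)], floors=5)
# 8 vertices: 4 at ground level and 4 on the roof at z == 15.0
```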
  • the posture information module 225 is a module that provides posture information in conjunction with an IMU (225a; inertial measurement unit) fixedly installed in the mobile client 200.
  • the posture information module 225 provides the posture information of the mobile client 200 by calculating the Euler angles of yaw, pitch, and roll, with the yaw referenced to magnetic north.
  • the image acquisition module 230 is a module for obtaining real-time image information through a camera 230a fixedly installed in the mobile client 200.
  • the matching module 235 is a module that matches the 3D object generated by the 3D object generation module 220 with the real time image information of the camera 230a based on the position information and the attitude information.
  • that is, the position of the 3D object and the position of the mobile client 200 are matched based on the position information, and the orientation of the 3D object and the orientation of the camera 230a are matched based on the posture information, so that the 3D object and the real-time image information of the camera 230a overlap and register with each other.
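The registration step can be sketched in 2D: from the client's position (position information) and heading (posture information), compute where a 3D object should appear relative to the camera axis. The north-referenced bearing convention and the coordinate model are assumptions for illustration.

```python
import math

def object_bearing(device_pos, device_yaw_deg, object_pos):
    """Bearing of a 3D object relative to the camera's viewing direction,
    used to decide where to overlay it on the real-time image (2D sketch)."""
    dx = object_pos[0] - device_pos[0]
    dy = object_pos[1] - device_pos[1]
    world_bearing = math.degrees(math.atan2(dx, dy))  # 0 deg = north (assumed)
    return (world_bearing - device_yaw_deg + 180.0) % 360.0 - 180.0

# object due north of the device while the camera also faces north
rel = object_bearing((0.0, 0.0), 0.0, (0.0, 5.0))
# rel == 0.0 -> the overlay is drawn at the horizontal centre of the image
```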
  • the display module 240 is a module for graphically displaying the mutually matched 3D object and image information through the display 240a as described above.
  • through the display module 240, an operator working in the field can easily check changes to the digital map information visually on the display 240a, such as a change of land category, a change in a building's number of floors, a change of manhole position, or a lane change.
  • the editing module 245 is a module that provides the user with an editing function for editing the digital map information and receives the editing information.
  • the editing module 245 includes a function for selecting a 3D object displayed on the display 240a and a function for correcting information on the selected 3D object.
  • the editing module 245 may provide a menu for editing the information of the 3D object to the user and receive the editing information.
  • the editing module 245 may receive editing information through a menu for selecting a 3D object displayed through the display 240a and a menu for modifying information of the selected 3D object.
  • the editing module 245 may include a 3D object type selection unit 245a, a 3D object selection start unit 245b, an edit object selection unit 245c, a 3D object selection completion unit 245d, a selection object display unit 245e, a selection object editing unit 245f, and a 3D object editing completion unit 245g.
  • the 3D object type selector 245a provides the user with a menu for selecting the type of 3D object to be edited before the object itself is selected; a 3D object can be classified as at least one of a point, line, or plane object.
  • the point object may be defined by spatial information as a coordinate of a point.
  • the linear object may have spatial information defined as coordinates of both ends of the line.
  • the planar object may have spatial information defined as at least three vertex coordinates for forming a plane.
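The three object kinds and their defining spatial information could be modelled as follows; the class and field names are illustrative, not from the patent.

```python
from dataclasses import dataclass
from typing import List, Tuple

Coord = Tuple[float, float, float]

@dataclass
class PointObject:      # e.g. a manhole: defined by one coordinate
    position: Coord

@dataclass
class LineObject:       # e.g. a water pipe: defined by the coordinates of both ends
    start: Coord
    end: Coord

@dataclass
class PlaneObject:      # e.g. a building footprint: at least three vertices
    vertices: List[Coord]

    def __post_init__(self):
        if len(self.vertices) < 3:
            raise ValueError("a planar object needs at least 3 vertices")
```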
  • the 3D object selection starter 245b provides the user with a menu specifying that selection of the 3D object to be edited is started after the type of 3D object to be edited is selected by the 3D object type selector 245a.
  • the editing object selecting unit 245c provides a user with a menu for selecting a 3D object to be edited.
  • the editing object selector 245c provides a menu for the user to select a 3D object of the terrain three-dimensionally displayed through the display 240a or a 3D object of the facility.
  • the point object may be selected by touching a point of the corresponding 3D object displayed on the display 240a.
  • the linear object may be selected by touching both ends of the corresponding 3D object displayed on the display 240a.
  • the planar object may be selected by touching each vertex of the corresponding 3D object displayed on the display 240a.
  • the 3D object selection completion unit 245d provides the user with a menu specifying that selection of the 3D object to be edited is completed.
  • the selection object display unit 245e classifies the selected 3D object and displays the selected 3D object on the display 240a.
  • the selection object display unit 245e may display the selected 3D object separately from other 3D objects by rendering the region or the edge of the selected 3D object in a specific color.
  • the selection object editing unit 245f provides the user with a menu for editing the digital map information of the selected 3D object.
  • the selection object editing unit 245f provides a menu for editing information about the shape of the selected 3D object or editing information about attributes.
  • Editing of the point object may be performed by defining new coordinates as new spatial information while the editing function for the point object is activated.
  • Editing of the linear object may be performed by defining new coordinates of one point or both ends as new spatial information while the editing function for the linear object is activated.
  • Editing of the planar object may be performed by defining coordinates of at least one or more new vertices as new spatial information while the editing function for the planar object is activated.
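A sketch of such an edit for a point object: while the editing function is active, the newly entered coordinates replace the old spatial information. The dictionary representation and field names are assumptions.

```python
def edit_point(obj, new_coord, editing_active):
    """Redefine a point object's spatial information while editing is active;
    returns an edited copy so the original record stays untouched."""
    if not editing_active:
        raise RuntimeError("editing function is not activated")
    edited = dict(obj)
    edited["position"] = new_coord
    return edited

manhole = {"id": "MH-001", "position": (10.0, 20.0, 0.0)}  # hypothetical record
moved = edit_point(manhole, (10.5, 20.0, 0.0), editing_active=True)
```

Linear and planar objects would be edited the same way, replacing one or more of their defining coordinates.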
  • the mobile client 200 may request the augmented reality server 100 for object identification attribute information on the 3D object displayed on the display 240a.
  • the augmented reality server 100 transmits the object identification attribute information on the requested 3D object to the mobile client 200.
  • the mobile client 200 may display the object identification attribute information transmitted by the augmented reality server 100 through the display 240a, and receive edit information on the object identification attribute information from the user.
  • the edit information for the object identification attribute information may be, for example, edits to the building name, telephone number, feature code, management number, managing institution, date, and the like.
  • the 3D object editing completion unit 245g provides a user with a menu specifying that the editing of the 3D object by the selection object editing unit 245f is completed.
  • the edit information transmitting module 250 transmits the edit information by the edit module 245 to the augmented reality server 100.
  • the augmented reality server 100 transmits numerical map information corresponding to the location information transmitted from the mobile client 200 to the mobile client 200. (S100)
  • the location information is generated as the mobile client 200 transmits the location information provided from the GPS 210a to the VRS server 300, receives a correction value corresponding to that location information from the VRS server 300, and corrects the location information using the received correction value to produce the final location information.
  • the location information generated by the mobile client 200 is transmitted to the augmented reality server 100 through the above-described process.
  • the augmented reality server 100 transmits the corresponding digital map information to the mobile client 200 in response to the digital map information request corresponding to the location information.
  • the digital map information transmitted to the mobile client 200 may include 3D object information on terrain and 3D object information on facilities as digital map information for implementing augmented reality in the mobile client 200.
  • the digital map information transmitted to the mobile client 200 may include spatial information of a 3D object on a terrain, spatial information of a 3D object on a facility, and attribute information for object rendering for rendering of the 3D object. Can be.
  • the mobile client 200 generates a 3D object based on the received digital map information, matches the real-time image acquired through the camera 230a with the 3D object, and displays the graphic graphically through the display 240a. (S200)
  • to implement tracking of the real-time image together with the 3D object displayed on the display 240a, the mobile client 200 repeats the process of transmitting the location information provided from the GPS 210a to the VRS server 300, receiving the correction value corresponding to that location information from the VRS server 300, and correcting the location information with the received correction value to generate precise location information.
  • the augmented reality server 100 periodically receives the location information repeatedly generated by the mobile client 200 and, as shown in FIG. 19, recognizes whether the periodically received location information is maintained within the central tile or, if not, in which direction it has changed.
  • while the location information is maintained within the central tile, the augmented reality server 100 transmits to the mobile client 200 only the 3D object information of the facilities within the predetermined radius around the location information that does not duplicate the facility 3D object information transmitted immediately before.
  • when the location information moves into a new central tile, the augmented reality server 100 transmits to the mobile client 200 the 3D object information of the newly recognized outer tiles corresponding to the direction of change of the location information that does not duplicate the tile information transmitted immediately before, together with the non-duplicated facility 3D object information within the predetermined radius around the location information.
  • the old central tile, now located outside the new central tile, is newly recognized as an outer tile.
  • corresponding to the three new outer tiles newly received from the augmented reality server 100, the mobile client 200 preferably deletes the three previous outer tiles opposite to the direction of change of its location information, to prevent memory load.
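The tile housekeeping above amounts to a sliding 3x3 window: when the client crosses into a neighbouring tile, three tiles are fetched on the leading edge and three are dropped on the trailing edge. The set-based bookkeeping below is an illustrative sketch, not the patent's implementation.

```python
def shift_tile_window(loaded, new_center):
    """Shift a 3x3 tile window to a new central tile: request only the tiles
    not already loaded and drop the ones opposite the movement direction."""
    wanted = {(new_center[0] + dx, new_center[1] + dy)
              for dx in (-1, 0, 1) for dy in (-1, 0, 1)}
    added = wanted - loaded      # new outer tiles to fetch from the server
    dropped = loaded - wanted    # old outer tiles deleted to limit memory use
    return wanted, added, dropped

old = {(dx, dy) for dx in (-1, 0, 1) for dy in (-1, 0, 1)}  # window around (0, 0)
window, added, dropped = shift_tile_window(old, (1, 0))     # client moved one tile east
# 3 tiles are fetched (the x == 2 column) and 3 are dropped (the x == -1 column)
```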
  • the mobile client 200 provides the user with an editing function for editing the digital map information and, when editing information is input, transmits it to the augmented reality server 100 (S300).
  • the editing function includes a selection function for the 3D object displayed through the display 240a and a correction function for the information on the selected 3D object.
  • the user is provided with a menu for selecting the type of 3D object to be edited (S310).
  • the 3D object may be classified into at least one of a point, a line, and a plane.
  • the display 240a provides a menu for the user to select a 3D object of the terrain displayed in three dimensions or a 3D object of the facility.
  • the selected 3D object may be displayed separately from other 3D objects by rendering and displaying a region or an edge of the selected 3D object in a specific color.
  • a menu may be provided to edit information about a shape of a selected 3D object or edit information about a property.
  • a single point object can be designated with one selection by the user, but when several point objects are to be specified together, the user needs to make two or more selections.
  • the object to be edited may be a point, a line, or a face.
  • the mobile client 200 requests the augmented reality server 100 for object identification attribute information on the 3D object displayed on the display 240a. (S310')
  • the augmented reality server 100 transmits the object identification attribute information on the requested 3D object to the mobile client 200. (S320')
  • the mobile client 200 displays the received object identification attribute information through the display 240a. (S330')
  • edit information about the object identification attribute information is received from the user. (S340')
  • the mobile client 200 transmits the edited information input through the above process to the augmented reality server 100.
  • the edit information may be transmitted as string (text) information.
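For illustration, the edit information could be serialized to a string as JSON before transmission; JSON is an assumed encoding, since the patent only states that the edits are sent as string information, and the object id and attribute names below are hypothetical.

```python
import json

def encode_edit_info(object_id, new_attrs):
    """Serialize edit information as a string for transmission to the server."""
    return json.dumps({"object_id": object_id, "attrs": new_attrs},
                      sort_keys=True)

def decode_edit_info(payload):
    """Parse the string back into edit information on the server side."""
    return json.loads(payload)

msg = encode_edit_info("BLDG-042", {"building_name": "City Hall", "floors": 12})
round_trip = decode_edit_info(msg)
```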
  • the augmented reality server 100 updates the digital map information based on the received edit information. (S400)
  • that is, the augmented reality server searches for the corresponding digital map information and then updates it based on the edit information.
  • the augmented reality server 100 transmits the updated digital map information to the mobile client 200; that is, the newly updated digital map information is transmitted to the mobile client 200.
  • the mobile client 200 generates a 3D object based on the updated digital map information, matches the real-time image obtained through the camera 230a with the 3D object, and displays it graphically through the display 240a.
  • the display is thus updated automatically, and the update status of the augmented reality server 100 can be confirmed in real time.
  • Embodiments of the invention include a computer readable medium containing program instructions for performing various computer-implemented operations.
  • the computer readable medium may include program instructions, data files, data structures, etc. alone or in combination.
  • the media may be those specially designed and constructed for the purposes of the present invention, or they may be of the kind well-known and available to those having skill in the computer software arts.
  • Examples of computer-readable recording media include magnetic media such as hard disks, floppy disks, and magnetic tape; optical recording media such as CD-ROMs and DVDs; magneto-optical media such as floptical disks; and hardware devices specially configured to store and execute program instructions, such as ROM, RAM, and flash memory.
  • the medium may be a transmission medium such as an optical or metal wire, a waveguide, or the like including a carrier wave for transmitting a signal specifying a program command, a data structure, or the like.
  • Examples of program instructions include not only machine code generated by a compiler, but also high-level language code that can be executed by a computer using an interpreter or the like.


Abstract

The present invention relates to a method and system for real-time editing of digital maps, and to a server and recording medium therefor. The real-time digital map editing method according to one aspect of the present invention, being a method for editing digital maps in real time using an augmented reality server and a mobile client connected to each other over a network, comprises the steps of: (a) the augmented reality server transmitting, to the mobile client, digital map information corresponding to the location information transmitted by the mobile client; (b) the mobile client generating a 3D object based on the transmitted digital map information, matching a real-time image captured by a camera with the 3D object, and graphically displaying the matched result through a display; (c) the mobile client providing the user with an editing function for editing the digital map information, and transmitting the editing information to the augmented reality server when the editing information is input, the editing function including a function for selecting the 3D object displayed through the display and a function for modifying the information on the selected 3D object; and (d) the augmented reality server updating the corresponding digital map information based on the received editing information.
PCT/KR2012/007230 2012-01-25 2012-09-07 Method and system for real-time editing of digital maps, and server and recording medium therefor WO2013111938A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
KR10-2012-0007429 2012-01-25
KR1020120007429A KR101176446B1 (ko) 2012-01-25 2012-01-25 Method and system for real-time editing of digital maps, and server and recording medium therefor

Publications (1)

Publication Number Publication Date
WO2013111938A1 true WO2013111938A1 (fr) 2013-08-01

Family

ID=47113339

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/KR2012/007230 WO2013111938A1 (fr) 2012-01-25 2012-09-07 Procédé et système d'édition en temps réel de cartes numériques, et serveur et support d'enregistrement associés

Country Status (2)

Country Link
KR (1) KR101176446B1 (fr)
WO (1) WO2013111938A1 (fr)

Families Citing this family (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR101690311B1 (ko) * 2015-12-30 2017-01-09 한국해양과학기술원 System and method for arranging, sharing, and exhibiting augmented reality objects
KR102195179B1 (ko) * 2019-03-05 2020-12-24 경북대학교 산학협력단 Method for constructing orthoimages using aerial photographs
KR102361735B1 (ko) * 2021-09-09 2022-02-14 (주)국토공간정보 Geodetic surveying system for mapping underground facilities

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR100452606B1 (ko) * 2004-07-08 2004-10-13 공간정보기술 주식회사 System for generating and editing 3D spatial information
KR20090029350A (ko) * 2007-09-18 2009-03-23 에스케이텔레콤 주식회사 Mobile virtual world service system and method
KR100997084B1 (ko) * 2010-06-22 2010-11-29 (주)올포랜드 Method and system for providing real-time information on underground facilities, and server and recording medium therefor
KR20110134660A (ko) * 2010-06-09 2011-12-15 엘지전자 주식회사 Mobile terminal and control method thereof


Also Published As

Publication number Publication date
KR101176446B1 (ko) 2012-09-13


Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 12866941

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 12866941

Country of ref document: EP

Kind code of ref document: A1