WO2009131361A2 - Apparatus and method for editing map data in a three-dimensional map service - Google Patents

Apparatus and method for editing map data in a three-dimensional map service

Info

Publication number
WO2009131361A2
Authority
WO
WIPO (PCT)
Prior art keywords
texture
contrast
modeling data
editing
map service
Prior art date
Application number
PCT/KR2009/002080
Other languages
English (en)
Korean (ko)
Other versions
WO2009131361A3 (fr)
Inventor
서정각
Original Assignee
팅크웨어(주)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Priority claimed from KR1020080037325A external-priority patent/KR100898262B1/ko
Priority claimed from KR1020080054223A external-priority patent/KR100896137B1/ko
Application filed by 팅크웨어(주) filed Critical 팅크웨어(주)
Publication of WO2009131361A2 publication Critical patent/WO2009131361A2/fr
Publication of WO2009131361A3 publication Critical patent/WO2009131361A3/fr

Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F30/00 - Computer-aided design [CAD]

Definitions

  • the present invention relates to a 3D map service, and more particularly, to an apparatus and method for editing map data in a 3D map service.
  • The 3D map service provides a map service that more closely resembles actual buildings by representing the buildings 110 to 140 three-dimensionally in the map data 100, as shown in FIG. 1.
  • However, because the conventional 3D map service merely renders buildings in 3D, there is a demand for a way to express how a building reacts to light at night, during the day, or in different weather, so that the map feels more realistic.
  • In addition, the amount of information required for contrast processing of a building is large, and there is a problem in that the contrast processing takes a long time because the contrast must be processed using so much information.
  • a texture suitable for the shape of 3D map data is mapped and displayed on the screen through a procedure as illustrated in FIG. 2.
  • The conventional texture processing apparatus produces a texture image (S210) and converts the produced texture image into a resource file according to a format rule (S220). The conventional texture processing apparatus then reads, in the display engine, the resource file mapped to the shape of the 3D map data (S230), and finally displays the read resource file on the 3D map display screen (S240).
  • In the conventional texture processing method, a producer changes and modifies a texture image several times in order to obtain an optimal texture image. That is, the conventional texture processing method has the problem that the entire texture processing process of FIG. 2 must be repeated every time in order to display the optimal texture image the producer wants.
  • A 3D map service generally expresses a building's shape as modeling data that matches the actual building.
  • The modeling data is data created by a designer, using a 3D modeling tool, to match the real building. That is, the 3D map service reads the modeling data in a prescribed format and expresses the shape of the building in 3D.
  • However, because the modeling data is produced in the three-dimensional space of the modeling tool, the actual location of the building, the angle of the building relative to true north, the size of the building, and so on may not exactly match the actual building.
  • Moreover, since many buildings exist, the conventional 3D map service has the problem that when the modeling data and the actual buildings do not match, a great deal of manpower and time is consumed manually editing all of the modeling data.
  • The present invention provides an apparatus and method for expressing contrast in a three-dimensional map service that read information about a contrast processing target from map data, apply colors to the contrast processing target, and process the contrast for the target according to the applied colors.
  • The present invention also provides a contrast representation apparatus and method in a three-dimensional map service that process the contrast between the upper and lower ends of the contrast processing target by applying different colors to its upper end and lower end using a gradient technique.
  • The present invention provides a texture editing apparatus and method that retrieve a texture ID using the file name of a texture, change the texture using the retrieved texture ID, and store it in a texture resource file, thereby editing the texture file.
  • The present invention provides an apparatus and method for texture editing in a 3D map service that classify and manage all textures used in the 3D map service by feature.
  • the present invention also provides an apparatus and method for texture editing in a 3D map service for changing a texture using a file name of a texture and a texture ID retrieved from a graphic library.
  • the present invention provides an apparatus and method for editing a texture file in a 3D map service for enabling the user to directly store the changed texture in the texture resource file.
  • The present invention further provides a texture editing apparatus and method in a 3D map service that check the texture ID in the graphic library using the file name of a texture input by the user, change the color and size of the texture, and allow the changed texture to be confirmed on the 3D map service execution screen.
  • The present invention also provides a texture editing apparatus and method in a 3D map service that confirm all of the size and color information of the changed texture in advance and store it as the final texture resource file.
  • The present invention provides an apparatus and method for editing modeling data in a 3D map service that receive from the user a selection of the ID of the modeling data to be edited in the map data, and accurately edit the modeling data by modifying the position, angle, or size of the modeling data corresponding to the selected ID on the three-dimensional map.
  • the present invention also provides an apparatus and method for editing modeling data in a 3D map service that allows a user to determine whether editing of modeling data has been completed.
  • the present invention also provides an apparatus and method for editing modeling data in a 3D map service that allows a user to check whether modeling data to be edited collides with other buildings or roads.
  • The present invention provides an apparatus and method for editing modeling data in a three-dimensional map service that provide, as editing information, the angle of the modeling data relative to the reference direction and the position and size of the modeling data on the three-dimensional map.
  • The present invention provides an apparatus and method for editing modeling data in a 3D map service that reflect the editing information for the modeling data in the map data and display the map data reflecting the editing information on the 3D map service execution screen, allowing the user to confirm the edited modeling data.
  • According to an aspect of the present invention, a contrast representation apparatus in a 3D map service includes an information reading unit that reads information about a contrast processing target from map data, a color applying unit that applies a first color to the upper end of the contrast processing target and a second color different from the first color to the lower end of the contrast processing target, and a contrast processing unit that processes the contrast for the contrast processing target according to the applied colors.
  • A contrast representation method in the three-dimensional map service includes reading information about a contrast processing target from map data, applying a first color to the upper end of the contrast processing target, applying a second color different from the first color to the lower end of the contrast processing target, and processing the contrast for the contrast processing target according to the applied colors.
  • The texture editing apparatus of the 3D map service includes an input unit that receives the file name of a texture from a user, a search unit that searches for a texture ID corresponding to the input file name, and a change unit that changes the texture using the retrieved texture ID.
  • The texture editing method includes receiving the file name of a texture from a user, searching for a texture ID corresponding to the input file name, and changing the texture using the retrieved texture ID.
  • An apparatus for editing modeling data in a 3D map service includes a selection unit that selects the ID of a target to be edited from map data and an editing unit that edits the modeling data corresponding to the selected ID on a 3D map.
  • A method of editing modeling data in a 3D map service includes selecting the ID of a target to be edited from map data and editing the modeling data corresponding to the selected ID on a 3D map.
  • Thus, an apparatus and method for expressing contrast in a 3D map service can be provided that read information about a contrast processing target from map data, apply colors to the contrast processing target, and process the contrast for the target according to the applied colors.
  • A texture editing apparatus and method can be provided that retrieve a texture ID corresponding to the file name of a texture, change the texture using the retrieved texture ID, and store it in a texture resource file, thereby editing the texture file.
  • A texture editing apparatus and method in the 3D map service can also be provided that confirm both the size and color information of the changed texture in advance and store it as the final texture resource file.
  • An apparatus and method for editing modeling data in a 3D map service can be provided in which the user selects the ID of the modeling data to be edited from the map data containing it, and the modeling data is edited accurately by modifying its position, angle, or size on the 3D map.
  • An apparatus and method for editing modeling data in a 3D map service can also be provided that reflect the editing information for the modeling data in the map data and display the map data reflecting the editing information on the 3D map service execution screen, so that the user can confirm the edited modeling data.
  • FIG. 1 is a diagram illustrating an example of a conventional three-dimensional map service.
  • FIG. 3 is a diagram illustrating a configuration of an apparatus for representing contrast in a 3D map service according to an exemplary embodiment of the present invention.
  • FIG. 4 is a diagram showing an example of the vertex information required when the contrast processing target is a hexahedron.
  • FIG. 5 is a diagram illustrating an example of vertex information in map data recorded in a database.
  • FIG. 6 is a view showing an example of the contrast representation for the building in the three-dimensional map service according to the present invention.
  • FIG. 7 is a flowchart illustrating a method of expressing contrast in a 3D map service according to an embodiment of the present invention.
  • FIG. 2 is a diagram illustrating a conventional texture processing procedure.
  • FIG. 8 is a diagram illustrating a configuration of a texture editing apparatus in a 3D map service according to an embodiment of the present invention.
  • FIG. 9 is a diagram illustrating an example of a graphic library that provides a texture ID corresponding to a file name of a texture classified by use.
  • FIG. 10 is a diagram illustrating an example of displaying a state before texture editing and a state after texture editing on a 3D map service execution screen.
  • FIG. 11 is a diagram illustrating a flow of a texture editing method in a 3D map service according to an embodiment of the present invention.
  • FIG. 12 is a diagram illustrating a configuration of an apparatus for editing modeling data in a 3D map service according to an embodiment of the present invention.
  • FIG. 13 is a diagram illustrating an example of an editing service screen for modeling data provided by the modeling data editing apparatus according to the present invention.
  • FIG. 14 is a diagram illustrating an example of a state before modeling data to be edited is edited in an edit service screen for modeling data according to the present invention.
  • FIG. 15 is a diagram illustrating an example of a state after modeling data, which is an editing target, is edited in an editing service screen for modeling data according to the present invention.
  • FIG. 16 is a flowchart illustrating a method for editing modeling data in a 3D map service according to an embodiment of the present invention.
  • The apparatus and method for editing map data in a three-dimensional map service may be implemented as an apparatus and method for expressing contrast in the three-dimensional map service, an apparatus and method for texture editing in the three-dimensional map service, or an apparatus and method for editing modeling data in the three-dimensional map service.
  • FIG. 3 is a diagram illustrating a configuration of an apparatus for representing contrast in a 3D map service according to an exemplary embodiment of the present invention.
  • Referring to FIG. 3, the contrast representation apparatus 300 in the 3D map service includes a database 310, an information reading unit 320, a color applying unit 330, a contrast processing unit 340, and a display unit 350.
  • the database 310 records and maintains various data and information necessary to provide a 3D map service according to the present invention. That is, the database 310 records and maintains map data used in the 3D map service according to the present invention, color information on the contrast processing target included in the map data, and location information.
  • The information reading unit 320 reads information about the contrast processing target from the map data recorded in the database 310. That is, the information reading unit 320 reads the color or position information of the contrast processing target included in the map data recorded in the database 310.
  • The information reading unit 320 reads information about the vertices of the upper end and information about the vertices of the lower end of the contrast processing target. For example, when the contrast processing target is a hexahedral building, the information reading unit 320 reads information about the four vertices located at the upper end of the hexahedral building and information about the four vertices located at its lower end.
  • The color applying unit 330 applies a first color to the upper end of the contrast processing target and a second color different from the first color to its lower end. For example, when the contrast processing target has vertices, the color applying unit 330 applies the first color to the vertices located at the upper end of the contrast processing target and the second color to the vertices located at its lower end. For example, when the time for which the contrast is represented is daytime, the color applying unit 330 applies the first color as a color lighter than the second color and the second color as a color darker than the first color. When the time for which the contrast is represented is night, the color applying unit 330 may apply the colors in the same manner as in the daytime case.
  • the contrast processing unit 340 processes the contrast for the contrast processing object according to the applied color.
  • The contrast processing unit 340 processes the contrast for the contrast processing target by changing the color of the portion located between the upper end and the lower end from the first color to the second color in stages. For example, when the time for which the contrast is represented is daytime, the first color applied to the upper end should be brighter than the second color applied to the lower end, so the contrast may be processed so that the upper end of the contrast processing target is brightest and the contrast gradually darkens toward its lower end. For example, when the time for which the contrast is represented is night, the contrast processing unit 340 may process the contrast darker than in the daytime case. In addition, the contrast processing unit 340 may process the contrast in a gradation method by overlapping it with the texture applied to the contrast processing target.
  • The display unit 350 displays the three-dimensional map data in which the contrast for the contrast processing target has been processed. For example, when the time for which the contrast is represented is daytime, the display unit 350 may display 3D map data contrast-processed so that the upper end of the contrast processing target is brightest and gradually darkens toward its lower end. For example, when the time for which the contrast is represented is night, the display unit 350 may display 3D map data processed with darker contrast than in the daytime case.
  • In this way, the contrast representation apparatus 300 applies different colors to the upper end and the lower end of the contrast processing target and can provide contrast-processed 3D map data using a color gradient between them, as illustrated in the sketch below.
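  • The following is a minimal sketch of the gradient technique in fixed-function OpenGL (the record cites an OpenGL text, but the patent does not mandate any particular API); the Vertex struct and the function name are illustrative, not taken from the patent.

```cpp
#include <GL/gl.h>

// Illustrative vertex record: position (XYZ) plus color (C), as in FIG. 5.
struct Vertex { float x, y, z, r, g, b; };

// Draw one wall of a hexahedral building. The two upper vertices carry the
// first (lighter, daytime) color and the two lower vertices the second
// (darker) color; GL_SMOOTH shading interpolates between them, producing
// the staged top-to-bottom gradient described above.
void drawShadedWall(const Vertex& tl, const Vertex& tr,
                    const Vertex& br, const Vertex& bl) {
    glShadeModel(GL_SMOOTH);  // per-vertex colors are interpolated
    glBegin(GL_QUADS);
    glColor3f(tl.r, tl.g, tl.b); glVertex3f(tl.x, tl.y, tl.z);
    glColor3f(tr.r, tr.g, tr.b); glVertex3f(tr.x, tr.y, tr.z);
    glColor3f(br.r, br.g, br.b); glVertex3f(br.x, br.y, br.z);
    glColor3f(bl.r, bl.g, bl.b); glVertex3f(bl.x, bl.y, bl.z);
    glEnd();
}
```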
  • FIG. 4 is a diagram showing an example of the vertex information required when the contrast processing target is a hexahedron.
  • As shown in FIG. 4, when the contrast processing target is a hexahedron, it has four vertices V11 to V14 located at its upper end and four vertices V21 to V24 located at its lower end.
  • FIG. 5 is a diagram illustrating an example of vertex information in map data recorded in a database.
  • When the contrast processing target is a hexahedral building as illustrated in FIG. 4, the first data Data1 includes color information and position information for each of the eight vertices.
  • The vertices V11, V12, V13, and V14 disposed at the upper end of the hexahedral building may have color information C11, C12, C13, and C14 and position information XYZ11, XYZ12, XYZ13, and XYZ14, respectively.
  • The first to fourth color information C11 to C14 may be the same color or different colors.
  • The vertices V21, V22, V23, and V24 disposed at the lower end of the hexahedral building may have color information C21, C22, C23, and C24 and position information XYZ21, XYZ22, XYZ23, and XYZ24, respectively.
  • The fifth to eighth color information C21 to C24 may be the same color or different colors, and may differ from the colors of the first to fourth color information C11 to C14.
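  • Read as a data layout, the record Data1 above might be held in memory as follows; this struct is a sketch of one plausible representation, not a format defined by the patent.

```cpp
// One vertex entry from FIG. 5: color information Cxx plus position XYZxx.
struct VertexRecord {
    float color[3];  // e.g. C11
    float pos[3];    // e.g. XYZ11
};

// One contrast processing target (Data1): the eight vertices of a hexahedron.
struct ContrastTarget {
    VertexRecord upper[4];  // V11..V14, upper end
    VertexRecord lower[4];  // V21..V24, lower end
};
```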
  • FIG. 6 is a view showing an example of the contrast representation for the building in the three-dimensional map service according to the present invention.
  • As shown in FIG. 6, the three-dimensional map service screen 600 provides buildings 610 to 640 whose contrast has been processed on the three-dimensional map according to the gradient technique.
  • That is, the 3D map service screen 600 according to the present invention may provide contrast-processed 3D map data in which the upper ends of the buildings 610 to 640 subject to contrast processing are brightest and the contrast gradually darkens toward their lower ends.
  • FIG. 7 is a flowchart illustrating a method of expressing contrast in a 3D map service according to an embodiment of the present invention.
  • In step S710, the contrast representation apparatus 300 reads information about the contrast processing target from the map data.
  • To this end, the contrast representation apparatus 300 records and maintains, in the database 310, the various map data and information necessary to provide the 3D map service according to the present invention.
  • That is, the database 310 can record and maintain the map data used in the 3D map service according to the present invention as shown in FIG. 5, including the vertex information V11 to V24 of the contrast processing target Data1 included in the map data, the color information C11 to C24, and the position information XYZ11 to XYZ24.
  • the contrast representation apparatus 300 reads color information or position information of the contrast process target from map data recorded in the database as illustrated in FIG. 5.
  • The contrast representation apparatus 300 reads information about the vertices of the upper end and information about the vertices of the lower end of the contrast processing target.
  • For example, the contrast representation apparatus 300 can read information about the four vertices V11 to V14 located at the upper end of the hexahedral building and information about the four vertices V21 to V24 located at its lower end.
  • In step S720, the contrast representation apparatus 300 applies the first color to the upper end of the contrast processing target. For example, if the contrast processing target has vertices, the contrast representation apparatus 300 may apply the first colors C11 to C14 to the vertices V11 to V14 located at the upper end of the contrast processing target in step S720.
  • For example, when the time for which the contrast is represented is daytime, the contrast representation apparatus 300 may apply the first color as a color lighter than the second color in step S720.
  • For example, when the time for which the contrast is represented is night, the contrast representation apparatus 300 may apply the colors in the same manner as in the daytime case.
  • In step S730, the contrast representation apparatus 300 applies a second color different from the first color to the lower end of the contrast processing target. For example, if the contrast processing target has vertices, the contrast representation apparatus 300 applies the second colors C21 to C24 to the vertices V21 to V24 located at the lower end of the contrast processing target. For example, when the time for which the contrast is represented is daytime, the apparatus may apply the second color as a color darker than the first color in step S730; when the time is night, it may apply the colors in the same manner as in the daytime case.
  • In step S740, the contrast representation apparatus 300 processes the contrast for the contrast processing target according to the applied colors. That is, in step S740, the apparatus may process the contrast by changing the color of the portion located between the upper end and the lower end of the contrast processing target from the first color to the second color in stages. For example, when the time for which the contrast is represented is daytime, the first color applied to the upper end should be brighter than the second color applied to the lower end, so the contrast may be processed so that the upper end of the contrast processing target is brightest and gradually darkens toward its lower end.
  • For example, when the time for which the contrast is represented is night, the contrast representation apparatus 300 may process the contrast darker than in the daytime case.
  • In addition, the contrast representation apparatus 300 may process the contrast using a gradation method by overlapping it with the texture applied to the contrast processing target.
  • In step S750, the contrast representation apparatus 300 displays the contrast-processed map data. For example, when the time for which the contrast is represented is daytime, the apparatus may display 3D map data contrast-processed so that the upper end of the contrast processing target is brightest and gradually darkens toward its lower end. For example, when the time for which the contrast is represented is night, the apparatus may display 3D map data processed with darker contrast than in the daytime case.
  • In this way, the contrast representation method according to the present invention applies different colors to the upper end and the lower end of the contrast processing target, and can provide contrast-processed three-dimensional map data by using the gradient technique between the upper end and the lower end; a small numeric sketch of this staged blend follows.
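  • The staged change from the first color to the second can be written as simple linear interpolation; the helpers below are a hedged sketch (the names and the night-dimming factor are assumptions, since the patent only states that night contrast is processed darker than daytime contrast).

```cpp
// Blend between the first color (upper end, t = 0) and the second color
// (lower end, t = 1); this is the staged color change of step S740.
void gradientColor(const float top[3], const float bottom[3],
                   float t, float out[3]) {
    for (int i = 0; i < 3; ++i)
        out[i] = (1.0f - t) * top[i] + t * bottom[i];
}

// Darken a color uniformly for night display (the factor is illustrative).
void nightColor(float color[3], float dim = 0.4f) {
    for (int i = 0; i < 3; ++i)
        color[i] *= dim;
}
```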
  • The contrast representation method in the three-dimensional map service may be implemented in the form of program instructions executable by various computer means and recorded on a computer-readable recording medium.
  • the computer readable medium may include program instructions, data files, data structures, etc. alone or in combination.
  • Program instructions recorded on the media may be those specially designed and constructed for the purposes of the present invention, or they may be of the kind well-known and available to those having skill in the computer software arts.
  • Examples of computer-readable recording media include magnetic media such as hard disks, floppy disks, and magnetic tape; optical media such as CD-ROMs and DVDs; magneto-optical media; and hardware devices specially configured to store and execute program instructions, such as ROM, RAM, and flash memory.
  • program instructions include not only machine code generated by a compiler, but also high-level language code that can be executed by a computer using an interpreter or the like.
  • the hardware device described above may be configured to operate as one or more software modules to perform the operations of the present invention, and vice versa.
  • FIG. 8 is a diagram illustrating a configuration of a texture editing apparatus in a 3D map service according to an embodiment of the present invention.
  • Referring to FIG. 8, the texture editing apparatus 800 of the 3D map service includes an input unit 810, a search unit 820, a graphic library 825, a change unit 830, a display unit 835, and a storage unit 840.
  • the input unit 810 receives a file name of a texture to be changed from a user of the texture editing apparatus 800.
  • the input unit 810 receives a file name of a texture for the building from the user.
  • For example, the input unit 810 may receive 'blue' from the user as the file name of the texture for the building.
  • the search unit 820 searches for a texture ID using the file name of the texture. That is, the search unit 820 searches for the texture ID designated by the graphic library 825 using the file name of the texture.
  • All textures used in the 3D map service according to the present invention are classified by feature. That is, the texture editing apparatus 800 according to the present invention may classify all textures used in the 3D map service by feature using a matching table between texture file names and texture IDs.
  • The graphic library 825 provides the user with the texture IDs corresponding to the file names of the classified textures. For example, if all textures used in the 3D map service are classified according to characteristics such as color, brightness, saturation, or use, the graphic library 825 may provide the texture ID corresponding to the file name of a texture classified by color, brightness, saturation, or use.
  • FIG. 9 is a diagram illustrating an example of a graphic library that provides texture IDs corresponding to the file names of textures classified by use.
  • the textures used in the 3D map service may be classified into buildings, roads, or the like, depending on the purpose.
  • the graphic library 825 may provide a texture ID corresponding to a file name of a texture classified according to the purpose.
  • That is, the graphic library 825 may include a building graphic library 910, a road graphic library 920, and an other graphic library 930, which provide the texture IDs corresponding to the file names of textures classified by use in the 3D map service.
  • the building graphic library 910 may provide texture IDs for the first to fifth building texture images 911 to 915 associated with the building among the texture images used in the 3D map service.
  • the first to fifth building texture images 911 to 915 may be configured with colors associated with buildings, and may be differently designated so that texture IDs may be distinguished according to each color.
  • For example, the texture ID for the first building texture image 911 may be designated as '911', the texture ID for the second building texture image 912 as '912', the texture ID for the third building texture image 913 as '913', the texture ID for the fourth building texture image 914 as '914', and the texture ID for the fifth building texture image 915 as '915'.
  • the road graphic library 920 may provide texture IDs of the first to fifth road texture images 921 to 925 associated with the road among the texture images used in the 3D map service.
  • the first to fifth road texture images 921 to 925 may include colors associated with roads, and may be differently designated so that texture IDs may be distinguished according to each color.
  • For example, the texture ID for the first road texture image 921 may be designated as '921', the texture ID for the second road texture image 922 as '922', the texture ID for the third road texture image 923 as '923', the texture ID for the fourth road texture image 924 as '924', and the texture ID for the fifth road texture image 925 as '925'.
  • the other graphic library 930 may provide texture IDs for the first to fifth other texture images 931 to 935 associated with other uses other than buildings or roads among the texture images used in the 3D map service.
  • The first to fifth other texture images 931 to 935 are configured with colors suitable for each purpose, and may be differently designated so that the texture IDs can be distinguished according to each color.
  • For example, the texture ID for the first other texture image 931 may be designated as '931', the texture ID for the second other texture image 932 as '932', the texture ID for the third other texture image 933 as '933', the texture ID for the fourth other texture image 934 as '934', and the texture ID for the fifth other texture image 935 as '935'.
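  • The matching table between file names and texture IDs could be represented as in the following sketch; the container choice and the -1 failure value are assumptions, since the patent describes the library only at the level of FIG. 9.

```cpp
#include <map>
#include <string>

// Hypothetical matching table grouped by use, mirroring FIG. 9:
// building (910), road (920), and other (930) libraries.
struct GraphicLibrary {
    std::map<std::string, int> building;  // e.g. {"blue", 915}
    std::map<std::string, int> road;
    std::map<std::string, int> other;

    // What the search unit 820 does: look up a texture ID by file name.
    int findTextureId(const std::string& fileName) const {
        for (const auto* table : {&building, &road, &other}) {
            auto it = table->find(fileName);
            if (it != table->end()) return it->second;
        }
        return -1;  // file name not registered (assumed failure behavior)
    }
};
```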
  • The change unit 830 changes the texture using the texture ID. That is, the change unit 830 changes the color and size of the texture using the retrieved texture ID. For example, when the retrieved texture ID is '915', the change unit 830 may change the color and size of the existing texture to the blue color corresponding to the retrieved texture ID '915' and a size that fits the building.
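  • If the texture ID doubles as an OpenGL texture object name (an assumption; the patent does not tie the ID to any specific graphics API), changing the texture's pixels and size could look like this sketch.

```cpp
#include <GL/gl.h>

// Sketch of the change unit 830: re-upload the image for the texture object
// identified by the retrieved ID. newPixels is assumed to hold RGBA data of
// the new width and height.
void changeTexture(unsigned int textureId, int width, int height,
                   const unsigned char* newPixels) {
    glBindTexture(GL_TEXTURE_2D, textureId);
    glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, width, height, 0,
                 GL_RGBA, GL_UNSIGNED_BYTE, newPixels);
}
```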
  • the display unit 835 displays the changed texture through the 3D map service execution screen.
  • FIG. 10 is a diagram illustrating an example of displaying a state before texture editing and a state after texture editing on a 3D map service execution screen.
  • As shown in FIG. 10, the display unit 835 displays the target building 1011 whose texture is to be edited, in the pre-editing state 1010, on the 3D map service execution screen.
  • The display unit 835 also displays the target building 1021 whose texture has been edited, in the post-editing state 1020, on the 3D map service execution screen.
  • The storage unit 840 confirms the changed texture and stores it in a texture resource file. That is, the storage unit 840 confirms the color and size information of the changed texture in advance and stores it as the final texture resource file.
  • For example, when the user who has confirmed the edited target building 1021 displayed on the 3D map service execution screen, as shown in FIG. 10, requests storage, the storage unit 840 may store the changed texture as the texture resource file; a sketch of such a save routine follows.
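  • Storing the confirmed texture in the resource file might be as simple as appending a fixed-size record; the binary layout below is invented for illustration, as the patent only states that the changed texture is stored as the final texture resource file.

```cpp
#include <cstdio>

// Sketch of the storage unit 840: append the texture ID and its final size
// and color to the resource file after the user confirms the change.
bool saveTextureResource(const char* path, unsigned int textureId,
                         int width, int height, const unsigned char rgba[4]) {
    std::FILE* f = std::fopen(path, "ab");  // append to the resource file
    if (!f) return false;
    std::fwrite(&textureId, sizeof textureId, 1, f);
    std::fwrite(&width, sizeof width, 1, f);
    std::fwrite(&height, sizeof height, 1, f);
    std::fwrite(rgba, 1, 4, f);
    return std::fclose(f) == 0;
}
```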
  • In this way, the texture editing apparatus 800 can edit the texture file by searching for the texture ID using the file name of the texture, changing the texture using the retrieved texture ID, and then saving it to the texture resource file.
  • FIG. 11 is a diagram illustrating a flow of a texture editing method in a 3D map service according to an embodiment of the present invention.
  • In step S1110, the texture editing apparatus 800 of the 3D map service receives the file name of a texture from a user.
  • the texture editing apparatus 800 receives a file name of a texture for the building from the user.
  • For example, the texture editing apparatus 800 may receive 'blue' from the user as the file name of the texture for the building.
  • In step S1120, the texture editing apparatus 800 searches for the texture ID corresponding to the file name of the texture. That is, in step S1120, the texture editing apparatus 800 retrieves the texture ID corresponding to the file name of the texture from the graphic library. For example, when the file name of the texture to be changed is 'blue' and the texture ID designated for 'blue' in the building graphic library 910 is '915', the texture editing apparatus 800 can look up 'blue' in the building graphic library 910 included in the graphic library 825 and retrieve the texture ID '915' designated for it, as in the lookup sketch below.
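  • In terms of the hypothetical matching table sketched earlier, step S1120 amounts to a dictionary lookup; the snippet below is self-contained and uses a minimal stand-in for the building graphic library 910.

```cpp
#include <cstdio>
#include <map>
#include <string>

int main() {
    // Minimal stand-in for the building graphic library 910.
    std::map<std::string, int> building{{"blue", 915}};

    // Step S1120: search the texture ID by the input file name 'blue'.
    auto it = building.find("blue");
    std::printf("texture ID: %d\n", it != building.end() ? it->second : -1);
    return 0;  // prints "texture ID: 915"
}
```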
  • The texture editing apparatus 800 may further include classifying all textures used in the 3D map service by feature and providing the texture IDs corresponding to the file names of the textures classified by feature through the graphic library. That is, in step S1120, the texture editing apparatus 800 classifies all textures used in the 3D map service by feature using a matching table between texture file names and texture IDs, and provides the texture IDs through the graphic library.
  • For example, the texture editing apparatus 800 may classify all textures used in the 3D map service according to characteristics such as color, brightness, saturation, or use, and provide the texture IDs corresponding to the file names of the classified textures through the graphic library.
  • As another example, the texture editing apparatus 800 may classify the textures used in the 3D map service into buildings, roads, and the like according to use, and provide the texture IDs corresponding to the file names of the textures classified by use through the graphic library.
  • In step S1130, the texture editing apparatus 800 changes the texture using the texture ID. That is, in step S1130, the texture editing apparatus 800 changes the color and size of the texture using the retrieved texture ID. For example, when the retrieved texture ID is '915', the texture editing apparatus 800 may change the color and size of the existing texture in step S1130 to the blue color corresponding to the retrieved texture ID '915' and a size that fits the building.
  • the texture editing apparatus 800 may further include displaying the changed texture on the 3D map service execution screen.
  • In step S1140, the texture editing apparatus 800 confirms the changed texture and stores it in a texture resource file. That is, in step S1140, the texture editing apparatus 800 confirms the color and size information of the changed texture and stores it as the final texture resource file. For example, in step S1140, the texture editing apparatus 800 may store the changed texture in the texture resource file when, after the changed texture displayed on the 3D map service execution screen has been confirmed, a storage request is received from the user.
  • In this way, the texture ID corresponding to the file name of a texture input by the user is retrieved, the texture is changed using the retrieved texture ID, and the result is then stored in the texture resource file, so the texture file can be edited.
  • the texture editing method in the 3D map service may be implemented in the form of program instructions that can be executed by various computer means and recorded in a computer readable recording medium.
  • the computer readable medium may include program instructions, data files, data structures, etc. alone or in combination.
  • Program instructions recorded on the media may be those specially designed and constructed for the purposes of the present invention, or they may be of the kind well-known and available to those having skill in the computer software arts.
  • Examples of computer-readable recording media include magnetic media such as hard disks, floppy disks, and magnetic tape; optical media such as CD-ROMs and DVDs; magneto-optical media; and hardware devices specially configured to store and execute program instructions, such as ROM, RAM, and flash memory.
  • program instructions include not only machine code generated by a compiler, but also high-level language code that can be executed by a computer using an interpreter or the like.
  • the hardware device described above may be configured to operate as one or more software modules to perform the operations of the present invention, and vice versa.
  • FIG. 12 is a diagram illustrating a configuration of an apparatus for editing modeling data in a 3D map service according to an embodiment of the present invention.
  • Referring to FIG. 12, the modeling data editing apparatus 1200 of the 3D map service includes a selecting unit 1210, an editing unit 1220, a storage unit 1230, a reflecting unit 1240, and a display unit 1250.
  • the selector 1210 selects an ID of a target to be edited from the map data.
  • That is, the selecting unit 1210 may receive from the user the ID of the modeling data that is the editing target in the map data.
  • FIG. 13 is a diagram illustrating an example of an editing service screen for modeling data provided by the modeling data editing apparatus according to the present invention.
  • As shown in FIG. 13, the modeling data editing service screen 1300 includes a map sheet information providing area 1310, a landmark information providing area 1320, a modeling data ID providing area 1330, an editing information providing area 1340, a modification complete menu 1350, a save menu 1360, and an edit execution screen 1370.
  • The map sheet information providing area 1310 includes a map sheet loading method, a map sheet number area, and a move menu.
  • For the map sheet loading method, the user of the modeling data editing apparatus 1200 may select one mesh or multiple meshes.
  • In the map sheet number area, the user may directly enter the map sheet number if the user knows the number for the desired area, or may enter the map sheet number found by searching for the area if the user does not know it.
  • The move menu requests a move to the map data for the map sheet number entered in the map sheet number area.
  • The landmark information providing area 1320 provides the IDs and names of the modeling data corresponding to landmarks, and displays landmarks in the pre-edit state (1321, 1322, 1323, 1324, and 1327) differently from landmarks in the edited state (1325 and 1326) so that they can be distinguished.
  • A landmark is a representative structure that stands out in the surrounding landscape and can easily be found by people; examples include Gwangjin Bridge, Songpa Culture & Arts Center, Charlotte Theatre, Gyrodrop, the Park Hyatt Seoul Hotel, the Korea General Trade Center, and Seoul Asan Hospital.
  • The modeling data ID providing area 1330 provides the IDs of the modeling data included in the template list, displays the ID 1331 of the modeling data selected for editing by the user, and displays the IDs 1332 of modeling data whose editing is complete differently from the IDs of modeling data whose editing is not yet complete.
  • Accordingly, through the landmark information providing area 1320 or the modeling data ID providing area 1330, the user can determine whether editing of the modeling data corresponding to a landmark or included in the template list has been completed.
  • The editing information providing area 1340 includes various editing information about the selected modeling data, such as the predicted number of collisions between the selected modeling data and other modeling data, the total number of landmarks/templates, the position coordinates of the selected modeling data (Pos X, Pos Y, Pos Z), the angle of the selected modeling data relative to the reference direction, and the size of the selected modeling data (Scale X, Scale Y, Scale Z).
  • The angle of the modeling data relative to the reference direction may be an angle based on true north when the modeling data is located in the northern hemisphere, and an angle based on true south when the modeling data is located in the southern hemisphere; the helper sketched below makes this rule concrete.
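  • A small helper can express the hemisphere rule; the signature and the convention that headings are stored relative to true north are assumptions made for illustration.

```cpp
#include <cmath>

// Angle of the modeling data relative to the reference direction:
// true north in the northern hemisphere, true south in the southern.
// latitudeDeg >= 0 means the northern hemisphere.
double referenceAngle(double latitudeDeg, double headingFromNorthDeg) {
    if (latitudeDeg >= 0.0)
        return headingFromNorthDeg;  // measured from true north
    // Re-express the same heading relative to true south.
    return std::fmod(headingFromNorthDeg + 180.0, 360.0);
}
```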
  • The modification complete menu 1350 is selected by the user when editing of the selected modeling data is finished, and the save menu 1360 is selected to store the editing information once modification of the selected modeling data is complete.
  • The edit execution screen 1370 displays the modeling data 1371 on the map data, and the editing-target modeling data 1372 selected by the user may be displayed in a pre-edit state or a post-edit state according to the editing information.
  • That is, the selecting unit 1210 receives from the user the selection of the editing-target modeling data 1372 in the map data provided through the edit execution screen 1370, and the modeling data ID providing area 1330 may then indicate that the ID 1331 corresponding to the selected editing-target modeling data 1372 is selected.
  • The editing unit 1220 edits the modeling data corresponding to the selected ID on the 3D map. For example, when the modeling data corresponding to the selected ID collides with other building data or road data, determined with reference to the road terrain data on the 3D map, the editing unit 1220 edits the modeling data by modifying its position, size, or angle.
  • In this way, the modeling data editing apparatus 1200 of the 3D map service receives the ID of the modeling data to be edited from the map data containing that modeling data, and can edit the modeling data accurately by modifying the position, angle, or size of modeling data that collides with other buildings or roads on the 3D map; one plausible collision test is sketched below.
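  • The patent does not specify how collisions are detected; a common, minimal way to predict them is an axis-aligned bounding-box overlap test on the 2D footprints, sketched here as an assumption.

```cpp
// 2D footprint bounding box of a building or road segment on the map.
struct Aabb { float minX, minY, maxX, maxY; };

// True when two footprints overlap: one way the collision prediction
// number shown in the editing information area 1340 could be computed.
bool collides(const Aabb& a, const Aabb& b) {
    return a.minX <= b.maxX && b.minX <= a.maxX &&
           a.minY <= b.maxY && b.minY <= a.maxY;
}
```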
  • The storage unit 1230 stores the editing information for each piece of modeling data. That is, the storage unit 1230 stores, as the editing information for each piece of modeling data, its ID, whether it has been modified, whether it collides with other modeling data, its angle relative to the reference direction, its position on the map, and its size.
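  • The editing information enumerated above maps naturally onto a per-item record; the field names follow the screen labels (Pos X/Y/Z, Scale X/Y/Z) and the layout is illustrative rather than defined by the patent.

```cpp
// Editing information kept by the storage unit 1230 per modeling data item.
struct EditInfo {
    int   id;                      // ID of the modeling data
    bool  modified;                // whether the data has been modified
    int   collisionCount;          // predicted collisions with other data
    float angle;                   // angle relative to the reference direction
    float posX, posY, posZ;        // position on the map (Pos X, Pos Y, Pos Z)
    float scaleX, scaleY, scaleZ;  // size (Scale X, Scale Y, Scale Z)
};
```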
  • The reflecting unit 1240 reflects the editing information in the map data. That is, the reflecting unit 1240 reflects the modeling data edited according to the editing information in the map data.
  • For example, the reflecting unit 1240 may reflect in the map data the modeling data whose angle, position on the map, or size has been modified and edited according to the editing information.
  • The display unit 1250 displays the map data reflecting the editing information on the 3D map service screen. That is, the display unit 1250 displays the map data including the modeling data before editing through the 3D map service screen, and displays the map data reflecting the editing information after the modeling data is edited.
  • In this way, the modeling data editing apparatus 1200 reflects the editing information for the modeling data in the map data and displays the map data reflecting the editing information on the 3D map service execution screen, so that the user can confirm the edited modeling data.
  • FIG. 14 is a diagram illustrating an example of a state before editing object modeling data is edited in an editing service screen for modeling data according to the present invention.
  • the editing service screen 1400 for modeling data provides various information related to the modeling data according to a state before modeling data, which is an editing target selected by the user, is edited.
  • the display unit 1250 may display various information related to the modeling data according to a state before the modeling data, which is an editing target selected by the user, is edited through the modeling data editing service screen 1400.
  • The modeling data ID providing area 1430 displays the ID 1431 of the modeling data selected for editing by the user, and displays the ID 1432 of modeling data whose editing is complete differently from the IDs of modeling data whose editing is not yet complete.
  • The editing information providing area 1440 provides pre-editing information about the selected modeling data, such as the predicted number of collisions between the modeling data and other modeling data, the total number of landmarks/templates, the position coordinates of the selected modeling data (Pos X, Pos Y, Pos Z), the angle of the selected modeling data relative to the reference direction, and the size of the selected modeling data (Scale X, Scale Y, Scale Z).
  • The pre-edit execution screen 1470 displays the editing-target modeling data 1471 selected by the user on the map data, together with the neighboring buildings 1472 and 1473 that collide with the modeling data 1471, the neighboring building 1474 that does not collide with it, and the road 1475 that does not collide with it.
  • Accordingly, through the pre-edit execution screen 1470, the user can determine whether the editing-target modeling data 1471 collides with the other buildings 1472 to 1474 or the road 1475.
  • Accordingly, the editing unit 1220 may edit the modeling data by modifying its position, size, or angle, with reference to the road terrain data on the 3D map, so that it does not collide with the other buildings 1472 and 1473 or the surrounding road 1475.
  • FIG. 15 is a diagram illustrating an example of a state after modeling data, which is an editing target, is edited in an editing service screen for modeling data according to the present invention.
  • the modeling data editing service screen 1500 represents a state after modeling data, which is an editing target selected by the user, is edited.
  • the display unit 1250 may display various information related to a state in which modeling data, which is an editing target selected by the user, is edited through the modeling data editing service screen 1500.
  • The modeling data ID providing area 1530 displays the ID 1531 of the modeling data selected and edited by the user, and displays the ID 1532 of modeling data whose editing is complete differently from the IDs of modeling data whose editing is not yet complete.
  • The editing information providing area 1540 provides various edited information about the edited modeling data, such as the predicted number of collisions between the modeling data and other modeling data, the total number of landmarks/templates, the position coordinates of the edited modeling data (Pos X, Pos Y, Pos Z), the angle of the edited modeling data relative to the reference direction, and the size of the edited modeling data (Scale X, Scale Y, Scale Z).
  • The post-edit execution screen 1570 displays the edited modeling data 1571 on the map data. That is, the post-edit execution screen 1570 shows the result of correcting the position, angle, and size of the editing-target modeling data 1471, which appeared in the pre-edit state on the pre-edit execution screen 1470 of FIG. 14, according to the edited information provided in the editing information providing area 1540, and displays the edited modeling data 1571 as the result of that correction.
  • FIG. 16 is a flowchart illustrating a method for editing modeling data in a 3D map service according to an embodiment of the present invention.
  • In step S1610, the modeling data editing apparatus 1200 selects the ID of the modeling data for the target to be edited from the map data.
  • In step S1620, the modeling data editing apparatus 1200 edits the modeling data corresponding to the selected ID on the 3D map. That is, in step S1620, when the modeling data corresponding to the selected ID collides with other building data or road data, determined with reference to the road terrain data on the 3D map, the modeling data editing apparatus 1200 edits the modeling data by modifying its position, size, or angle.
  • For example, in step S1620, the modeling data editing apparatus 1200 may edit the modeling data by modifying its position, size, or angle, with reference to the road terrain data on the 3D map, so that it does not collide with the other buildings 1472 and 1473 or the surrounding road 1475.
  • In this way, the ID of the modeling data to be edited is selected from the map data containing that modeling data, and the modeling data can be edited by modifying the position, angle, or size of modeling data that collides with other buildings or roads on the 3D map.
  • In step S1630, the modeling data editing apparatus 1200 stores the editing information for each piece of modeling data. That is, in step S1630, when the user selects the save menu 1360 to store the editing information for the modeling data, the modeling data editing apparatus 1200 stores the editing information.
  • For example, in step S1630, the modeling data editing apparatus 1200 may store, for each piece of modeling data, its ID, whether it has been modified, whether it collides with other buildings, its angle relative to the reference direction, its position on the map, and its size.
  • In step S1640, the modeling data editing apparatus 1200 reflects the editing information in the map data. That is, in step S1640, the modeling data editing apparatus 1200 reflects the modeling data edited according to the editing information in the map data. For example, in step S1640, the apparatus may reflect in the map data the modeling data whose angle, position on the 3D map, or size has been modified according to the editing information.
  • In step S1650, the modeling data editing apparatus 1200 displays the map data on the 3D map service screen. That is, in step S1650, the apparatus displays the map data reflecting the editing information on the 3D map service screen. For example, when the modeling data 1471 collides with the other buildings 1472 and 1473 or the road 1475 as illustrated in FIG. 14, the modeling data editing apparatus 1200 may, in step S1650, display the modeling data 1571 through the 3D map service screen 1500 as illustrated in FIG. 15, on map data reflecting the editing information in which the position, size, or angle of the modeling data has been modified.
  • In this way, the editing information for the modeling data is reflected in the map data, and the map data reflecting the editing information is displayed on the 3D map service execution screen, so the user can confirm the edited modeling data.
  • the method of editing modeling data in the 3D map service may be implemented in the form of program instructions that can be executed by various computer means and recorded in a computer readable recording medium.
  • the computer readable medium may include program instructions, data files, data structures, etc. alone or in combination.
  • Program instructions recorded on the media may be those specially designed and constructed for the purposes of the present invention, or they may be of the kind well-known and available to those having skill in the computer software arts.
  • Examples of computer-readable recording media include magnetic media such as hard disks, floppy disks, and magnetic tape; optical media such as CD-ROMs and DVDs; magneto-optical media; and hardware devices specially configured to store and execute program instructions, such as ROM, RAM, and flash memory.
  • program instructions include not only machine code generated by a compiler, but also high-level language code that can be executed by a computer using an interpreter or the like.
  • the hardware device described above may be configured to operate as one or more software modules to perform the operations of the present invention, and vice versa.

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Hardware Design (AREA)
  • Evolutionary Computation (AREA)
  • Geometry (AREA)
  • General Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Processing Or Creating Images (AREA)
  • Instructional Devices (AREA)

Abstract

An apparatus and method for editing map data in a three-dimensional map service are disclosed. According to one embodiment, a contrast representation apparatus includes an information reading unit that reads information about a contrast processing target, a color applying unit that applies a first color to the upper end of the contrast processing target and a second color, different from the first, to its lower end, and a processing unit that processes the contrast for the contrast processing target according to the applied colors. Another embodiment provides a texture editing apparatus and method for the three-dimensional map service; the texture editing apparatus includes an input unit that receives a texture file name entered by a user, a search unit that searches for a texture ID corresponding to the entered file name, and a change unit that changes the texture of the retrieved texture ID. A final embodiment provides a modeling data editing apparatus and method; the modeling data editing apparatus of a three-dimensional map service includes a selection unit that selects the ID of the item to be edited from the map data, and an editing unit that edits the modeling data corresponding to the selected ID on the three-dimensional map.
PCT/KR2009/002080 2008-04-22 2009-04-21 Apparatus and method for editing map data in a three-dimensional map service WO2009131361A2 (fr)

Applications Claiming Priority (6)

Application Number Priority Date Filing Date Title
KR1020080037325A KR100898262B1 (ko) 2008-04-22 2008-04-22 Apparatus and method for editing modeling data in a three-dimensional map service
KR10-2008-0037325 2008-04-22
KR1020080037324 2008-04-22
KR10-2008-0037324 2008-04-22
KR10-2008-0054223 2008-06-10
KR1020080054223A KR100896137B1 (ko) 2008-06-10 2008-06-10 Apparatus and method for expressing contrast in a three-dimensional map service

Publications (2)

Publication Number Publication Date
WO2009131361A2 true WO2009131361A2 (fr) 2009-10-29
WO2009131361A3 WO2009131361A3 (fr) 2010-01-21

Family

ID=41217254

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/KR2009/002080 WO2009131361A2 (fr) Apparatus and method for editing map data in a three-dimensional map service

Country Status (1)

Country Link
WO (1) WO2009131361A2 (fr)

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR100356017B1 (ko) * 1999-12-24 2002-10-18 한국전자통신연구원 Three-dimensional geographic information system capable of creating, editing, storing, and visualizing 3D geographic features and 3D geographic objects using XML, and method of operating the same
KR20020041387A (ko) * 2002-05-13 2002-06-01 (주)이지스 Automated method of constructing solid-model-type 3D spatial information using 2D spatial information, and method of operating the 3D spatial information
KR20050019833A (ko) * 2002-07-10 2005-03-03 하만 베커 오토모티브 시스템즈 게엠베하 System for texturizing electronic representations of objects
KR100657943B1 (ko) * 2005-01-07 2006-12-14 삼성전자주식회사 Method and apparatus for real-time 3D conversion of 2D building data, and method and apparatus for real-time 3D visualization of 2D building data using the same
KR100609786B1 (ko) * 2005-12-28 2006-08-09 공간정보기술 주식회사 Method of modeling a three-dimensional building using original drawing maps

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
KOO, JA YEONG (TRANSLATOR): 'Computer Graphics Using OpenGL, Second Edition', February 2002, YOUNGHAN PUBLISHING COMPANY, pages 279-285 *

Cited By (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107481296A (zh) * 2017-08-02 2017-12-15 长威信息科技发展股份有限公司 Method and device for displaying building height based on a two-dimensional map
CN107481296B (zh) * 2017-08-02 2020-10-09 长威信息科技发展股份有限公司 Method and device for displaying building height based on a two-dimensional map
CN117573795A (zh) * 2024-01-15 2024-02-20 北京山维科技股份有限公司 Method and device for distinguishing the display of map features
CN117724647A (zh) * 2024-02-07 2024-03-19 杭州海康威视数字技术股份有限公司 Information configuration display method and device, electronic device, and machine-readable storage medium
CN117724647B (zh) * 2024-02-07 2024-06-04 杭州海康威视数字技术股份有限公司 Information configuration display method and device, electronic device, and machine-readable storage medium

Also Published As

Publication number Publication date
WO2009131361A3 (fr) 2010-01-21

Similar Documents

Publication Publication Date Title
US5317678A (en) Method for changing color of displayed images by use of color components
WO2020180084A1 Method for completing the coloring of a target image, and device and computer program therefor
WO2022234876A1 Method for providing information on an article, and apparatus therefor
KR100995107B1 Image editing and processing system that improves the comprehensibility of aerial photography images
WO2021107204A1 Three-dimensional modeling method for clothing
WO2019132566A1 Method for automatically generating a multi-depth image
WO2020091207A1 Method, apparatus, and computer program for completing painting of an image, and method, apparatus, and computer program for training an artificial neural network
WO2009131361A2 Apparatus and method for editing map data in a three-dimensional map service
CN104111934A Display method for electronic map search results and electronic map client
WO2013111938A1 Method and system for real-time editing of digital maps, and server and recording medium therefor
WO2019132563A1 Method for creating an image panorama
WO2024005242A1 Method for processing data concerning a transaction for an artwork
WO2022131723A1 Method for providing a drawing reading and searching function, and device and system therefor
WO2021107202A1 Three-dimensional clothing modeling method
WO2016182357A1 Treemap visualization method and device using same
WO2018097361A1 Method for creating user-defined ERP functions and computing system for executing same
CN114964210A Map drawing method and apparatus, computer device, and storage medium
WO2023136549A1 Method and device for providing augmented content through an augmented reality view on the basis of a preset unit space
WO2024101629A1 Method for providing an NFT-based metaverse, and apparatus therefor
JP2003141194A Communication line design system, communication line management system, communication line design program, and communication line management program
WO2024158065A1 Augmented-reality-based display device for a vehicle and control method therefor
WO2024076169A1 Method for training an object recognition model using spatial information, and computing device for performing same
WO2022071730A1 Method and device for performing plane detection
WO2023074948A1 Service method for generating and providing a mission using positioning-based content
WO2024219563A1 Method and device for providing an interior decoration service

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 09735505

Country of ref document: EP

Kind code of ref document: A2

NENP Non-entry into the national phase

Ref country code: DE

32PN Ep: public notification in the ep bulletin as address of the addressee cannot be established

Free format text: NOTING OF LOSS OF RIGHTS PURSUANT TO RULE 112(1) EPC (EPO FORM 1205A DATED 04/01/2011)

122 Ep: pct application non-entry in european phase

Ref document number: 09735505

Country of ref document: EP

Kind code of ref document: A2