WO2019124818A1 - Mixed reality service providing method and system - Google Patents
Mixed reality service providing method and system
- Publication number
- WO2019124818A1 (PCT/KR2018/015197)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- terminal
- virtual image
- building
- image
- information
- Prior art date
Classifications
- G06T19/006—Mixed reality
- G06F3/011—Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
- G06F3/017—Gesture based interaction, e.g. based on a set of recognized hand gestures
- G06F3/0304—Detection arrangements using opto-electronic means
- G06F3/0346—Pointing devices displaced or positioned by the user with detection of the device orientation or free movement in a 3D space, e.g. 3D mice, 6-DOF [six degrees of freedom] pointers using gyroscopes, accelerometers or tilt-sensors
- G06Q50/08—Construction
- G06Q50/10—Services
- G06T17/10—Constructive solid geometry [CSG] using solid primitives, e.g. cylinders, cubes
- G06T19/20—Editing of 3D images, e.g. changing shapes or colours, aligning objects or positioning parts
- H04W4/029—Location-based management or tracking services
- G06F30/13—Architectural design, e.g. computer-aided architectural design [CAAD] related to design of buildings, bridges, landscapes, production plants or roads
- G06T2200/24—Indexing scheme for image data processing or generation involving graphical user interfaces [GUIs]
- G06T2210/04—Architectural design, interior design
- G06T2219/2021—Shape modification
Definitions
- The present invention relates to a mixed reality service providing method and system, and more particularly, to a method and system that display three-dimensional modeling data corresponding to a building site as a mixed reality image at actual building size, to facilitate verification, management, and maintenance/repair of the building.
- VR: virtual reality
- AR: augmented reality
- MR: mixed reality
- the present invention provides a mixed reality service providing method and system for displaying three-dimensional modeling data corresponding to a building site as mixed reality images of actual building size.
- The present invention also provides a mixed reality service providing method and system for editing, in real time, three-dimensional modeling data displayed in a mixed reality space and sharing it with other workers.
- According to an aspect of the present invention, a method for providing a mixed reality service in a server includes loading three-dimensional modeling data including design information on an arbitrary building, generating a virtual image for the three-dimensional modeling data and at least one user interface based on tracking information transmitted from a terminal, and transmitting the virtual image to the terminal.
- Here, the tracking information includes at least one of geographical location information of the terminal, identification information of the real space where the terminal is located, three-dimensional rotation information, and speed information; the virtual image is generated so that the building can be displayed at actual size, and is synthesized with a real image at the at least one terminal and displayed as a mixed reality image.
- The step of generating the virtual image may include acquiring an exterior or interior image of the building according to the geographical location information of the terminal, extracting one area corresponding to the display area of the terminal based on the tracking information of the terminal, and generating the virtual image from the extracted area.
- When the building comprises a plurality of floors, the generating of the virtual image may further include acquiring the interior image for any floor selected by user input or by the tracking information.
- the at least one user interface may include a user interface for at least one of design information addition, editing, deletion, and memo management for the building.
- The generating of the virtual image may include generating the virtual image by disposing a user interface for at least one object in the building on the corresponding object.
- The method may further include receiving a processing request for the three-dimensional modeling data through the at least one user interface, editing the three-dimensional modeling data in response to the processing request, regenerating the virtual image based on the edited data, and transmitting the regenerated virtual image to the terminal.
- According to another aspect of the present invention, a system for providing a mixed reality service includes a terminal for collecting and transmitting tracking information while using the mixed reality service, and a server for loading three-dimensional modeling data including design information on an arbitrary building, generating a virtual image for the three-dimensional modeling data and at least one user interface based on the tracking information transmitted from the terminal, and transmitting the generated virtual image to the terminal.
- Here, the tracking information includes at least one of geographical position information of the terminal, identification information of the real space in which the terminal is located, three-dimensional rotation information, and speed information; the virtual image is generated so that the building can be displayed at actual size, and is synthesized with a real image at the at least one terminal and displayed as a mixed reality image.
- The server may acquire an exterior or interior image of the building according to the geographical position information of the terminal, and generate the virtual image by extracting, based on the tracking information of the terminal, one area corresponding to the display area of the terminal.
- When the building comprises a plurality of floors, the server may acquire the interior image for an arbitrary floor selected by input received from the terminal or by the tracking information.
- The terminal transmits a processing request for the three-dimensional modeling data to the server based on a user input received through the at least one user interface, and the server edits the three-dimensional modeling data in response to the processing request, regenerates the virtual image based on the edited three-dimensional modeling data, and transmits the regenerated virtual image to the terminal.
- The method and system for providing a mixed reality service according to the present invention allow a building to be easily compared with three-dimensional modeling data at actual building size through a mixed reality image on the building site, facilitating verification, management, and maintenance/repair of the building.
- The mixed reality service providing method and system according to the present invention can also edit, in real time, three-dimensional modeling data displayed as a mixed reality image and share it with other workers.
- FIG. 1 is a diagram illustrating a network structure of a mixed reality service providing system according to the present invention.
- FIG. 2 is a diagram illustrating the structure of a server according to the present invention.
- FIG. 3 is a diagram illustrating a structure of a terminal according to the present invention.
- FIG. 4 is a diagram illustrating an example in which a terminal according to the present invention is implemented as a head-mounted display device.
- FIG. 5 is a flowchart illustrating a method of providing a mixed reality service according to the present invention.
- FIG. 6 is a flowchart illustrating a method for performing processing on 3D modeling data in a mixed reality service providing method according to the present invention.
- FIGS. 7 and 8 are views showing an example of a mixed reality image according to the present invention.
- The method and system for providing a mixed reality service according to the present invention can be provided for designing a building. More specifically, the method and system display the design data of a building as three-dimensional modeling data in mixed reality to a plurality of user terminals entering a mixed reality conference room, and allow each terminal to process the data (e.g., add, modify, or delete design information).
- FIG. 1 is a diagram illustrating a network structure of a mixed reality service providing system according to the present invention.
- a mixed reality service providing system 1 may include a server 10 for providing a mixed reality service and a terminal 20 for receiving a mixed reality service.
- the server 10 is provided for providing a mixed reality service according to the present invention, and may be a network server, an application server, a domain server, or the like, which is operated by a provider of a mixed reality service.
- the server 10 generates a virtual image from the 3D modeling data and provides the virtual image to the terminal 20 when the mixed reality service is requested from the terminal 20.
- the virtual image provided to the terminal 20 may be combined with the real image photographed by the terminal 20 and displayed as a mixed reality image.
- the server 10 may also provide the terminal 20 with a virtual image of various user interfaces for performing additional processing on the three-dimensional modeling data.
- the server 10 can modify and manage the three-dimensional modeling data according to a user command received through a user interface.
- the terminal 20 performs data communication with the server 10 and receives mixed reality service.
- The terminal 20 synthesizes a virtual image received from the server 10 with a real image photographed by the terminal 20 to generate a mixed reality image, and displays it as a left-eye image and a right-eye image, making the three-dimensional modeling data appear realistic.
- the three-dimensional modeling data may be modeling data for building design.
- The three-dimensional modeling data may be, but is not limited to, Industry Foundation Classes (IFC) data, a form of Building Information Modeling (BIM) data.
- Three-dimensional modeling data for building design can include information related to the structure, shape, dimensions, materials, colors, patterns, facilities, and the like of the building.
- the server 10 and the terminal 20 are connected to each other through a network to perform data communication and perform a control operation for mixed reality service. A more detailed description thereof will be described later.
- FIG. 2 is a diagram illustrating the structure of a server according to the present invention.
- the server 10 may include a communication unit 11, a control unit 12, and a storage unit 13.
- the communication unit 11 can transmit / receive data to / from the outside via a network.
- The communication unit 11 receives, from the terminal 20, a request corresponding to a user input and information necessary for generating a virtual image, and transmits the generated virtual image to the terminal 20 under the control of the control unit 12.
- the controller 12 is configured to control the components of the server 10 in order to provide the mixed reality service according to the present invention. More specifically, the control unit 12 may include a mixed reality service providing unit 121 and a virtual image generating unit 122.
- the mixed reality service providing unit 121 performs various control operations for providing the mixed reality service to the terminal 20.
- the mixed reality service provider 121 may create and manage a user account of the terminal 20 provided with the mixed reality service.
- the mixed reality service provider 121 can store and manage data generated or obtained through the mixed reality service in association with a user account.
- The mixed reality service provider 121 stores and manages an ID, a password, and the like as identification information for identifying a user account, and stores and manages modeling data related to the user account in a data directory associated with that account.
- As the mixed reality service is activated at the request of the terminal 20, the mixed reality service providing unit 121 receives information required for generating a virtual image, for example, tracking information, from the terminal 20 and transmits it to the virtual image generating unit 122. The mixed reality service providing unit 121 then transmits the virtual image generated by the virtual image generating unit 122 to the terminal 20, so that a mixed reality image can be generated by synthesizing the virtual image with the real image at the terminal 20.
- While the mixed reality service is being provided, the mixed reality service providing unit 121 may process the three-dimensional modeling data according to user input received from the terminal 20, for example, adding, editing, deleting, and managing the design information.
- the mixed reality service providing unit 121 transmits the edited three-dimensional modeling data to the virtual image generating unit 122 so that a virtual image based on the edited three-dimensional modeling data can be generated.
- When the three-dimensional modeling data is edited by an arbitrary terminal 20, the regenerated virtual image can be transmitted to that terminal 20 and to at least one other device. Accordingly, the devices provided with the mixed reality service can collaborate while sharing the editing status of the three-dimensional modeling data.
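The sharing flow described above can be sketched as follows. This is a minimal illustration, not the patent's implementation; the class and method names, the string-based "rendered image", and the in-memory terminal registry are all assumptions.

```python
# Hypothetical sketch: when one terminal edits the 3D modeling data, the
# server regenerates the virtual image and pushes it to every connected
# terminal so all collaborators see the same editing status.

class MixedRealitySession:
    def __init__(self):
        self.terminals = {}      # terminal_id -> list of received virtual images
        self.model_version = 0   # incremented on each edit of the modeling data

    def connect(self, terminal_id):
        self.terminals[terminal_id] = []

    def apply_edit(self, editor_id, edit):
        """Apply an edit from one terminal and broadcast the result."""
        self.model_version += 1
        virtual_image = f"render(v{self.model_version}, {edit})"
        # Send the regenerated virtual image to every terminal, including
        # the editor, so the shared mixed reality view stays in sync.
        for images in self.terminals.values():
            images.append(virtual_image)
        return virtual_image

session = MixedRealitySession()
session.connect("terminal-A")
session.connect("terminal-B")
session.apply_edit("terminal-A", "move_wall")
```

The point of the design choice sketched here is that the server, not the editing terminal, is the single source of truth for the modeling data, so every terminal receives the same regenerated view.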
- the virtual image generation unit 122 may generate a virtual image based on the tracking information received from the terminal 20.
- The tracking information may include, for example, identification information about the real space of the terminal 20 (e.g., mesh network information), geographical location information of the terminal 20, three-dimensional rotation information, speed information, and the like.
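A possible concrete shape for that tracking payload is sketched below. The field names and types are assumptions for illustration; the patent only lists the kinds of information, not a wire format.

```python
from dataclasses import dataclass

# Illustrative container for the tracking information the terminal sends
# to the server. Field names are hypothetical, not from the patent.
@dataclass
class TrackingInfo:
    latitude: float        # geographical location of the terminal
    longitude: float
    space_mesh_id: str     # identification of the real space (e.g., mesh info)
    rotation: tuple        # three-dimensional rotation (yaw, pitch, roll), degrees
    speed: float           # movement speed of the terminal, m/s

info = TrackingInfo(37.5665, 126.9780, "site-mesh-01", (90.0, 0.0, 0.0), 1.2)
```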
- The virtual image generating unit 122 may generate a virtual image from the three-dimensional modeling data based on the tracking information received from the terminal 20. The virtual image generating unit 122 may generate the virtual image at the actual size of the three-dimensional modeling data, and may determine the display area of the three-dimensional modeling data according to the tracking information.
- The virtual image generating unit 122 may generate, as a virtual image, a user interface for processing the three-dimensional modeling data, for example, for adding, editing, deleting, and managing the design information (for example, inserting a memo).
- The virtual image generating unit 122 may rotate or enlarge/reduce the user interface according to the three-dimensional shape of the corresponding object or the tracking information to generate the virtual image.
- The components of the control unit 12 described above may be implemented as physically separate devices, but the technical idea of the present invention is not limited thereto. That is, the components of the control unit 12 can be modularized or programmed in one physical processor. In addition, these components are merely divided according to the operational aspects of the control unit 12, and at least one or all of them can be integrated into one component.
- The storage unit 13 may store an operating system, programs, software, and the like necessary for the operation of the server 10. In various embodiments of the present invention, the storage unit 13 may store and manage at least one set of three-dimensional modeling data. In one embodiment, the three-dimensional modeling data may be stored in association with user accounts and geographical location information.
- FIG. 3 is a diagram illustrating a structure of a terminal according to the present invention.
- The terminal 20 includes a camera unit 21, a sensor unit 22, an input unit 23, a display unit 24, an output unit 25, a communication unit 26, a control unit 27, and a storage unit 28.
- The camera unit 21 includes at least one camera module and is configured to capture an image of the area in front of the user.
- The camera unit 21 may include a depth camera that can identify the shape and depth of the subject in the user's real space.
- The camera unit 21 may also include an infrared camera or the like for photographing the user's hand gestures.
- the sensor unit 22 may include at least one sensor capable of sensing various information related to the operation of the terminal 20.
- the sensor portion 22 may include a GPS sensor for sensing the geographic location of the user.
- the sensor unit 22 may include a gyro sensor, a velocity sensor, an acceleration sensor, and the like for sensing motion such as three-dimensional rotation, tilt, and speed of the terminal 20.
- the input unit 23 receives various inputs from the user.
- the input unit 23 may include a touch pad, a keypad, a jog dial, a dome switch, a button, and the like.
- When a user's hand gesture is received as input through the camera unit 21, the input unit 23 is configured to identify the hand gesture captured through the camera unit 21.
- the input unit 23 may further include a microphone for receiving sounds such as a user's voice and processing the received sounds as electrical voice data.
- the display unit 24 can visually display various information processed by the terminal 20.
- The display unit 24 may include a left-eye display unit for displaying a left-eye image and a right-eye display unit for displaying a right-eye image.
- the output unit 25 is configured to output information processed by the terminal 20 in various forms such as sound, vibration, and light.
- the output unit 25 may include a speaker, a haptic module, an LED lamp, and the like.
- the communication unit 26 can transmit / receive data to / from the outside via the network.
- The communication unit 26 transmits, under the control of the control unit 27, various requests and/or information for receiving the mixed reality service to the server 10, and receives from the server 10 a virtual image for the three-dimensional modeling data and/or various objects.
- the control unit 27 is configured to control the respective components of the terminal 20 in order to receive the mixed reality service according to the present invention. More specifically, the control unit 27 may include a mixed reality service management unit 271 and a mixed reality generation unit 272.
- the mixed reality service management unit 271 controls operations related to the mixed reality service provided through the server 10.
- the mixed reality service management unit 271 may transmit a request to the server 10 to create / manage a user account or log in to the user account in response to a user input.
- The mixed reality service management unit 271 can transmit a user ID, a password, and the like to the server 10 as identification information for creating a user account in the server 10 or logging into a user account.
- the mixed reality service management unit 271 may transmit a request for driving the mixed reality service to the server 10 in response to a user input.
- the mixed reality service management unit 271 may collect tracking information through the camera unit 21 and / or the sensor unit 22 and transmit the collected tracking information to the server 10 .
- The mixed reality service management unit 271 can receive, from the server 10, the virtual image generated based on the tracking information.
- The mixed reality service management unit 271 may transmit the received virtual image to the mixed reality generation unit 272, which synthesizes it with the real image captured through the camera unit 21 to generate a mixed reality image.
- The mixed reality service management unit 271 may receive user input, via at least one user interface received from the server 10 and displayed by the mixed reality generation unit 272, for adding, editing, deleting, or managing memos for any object in the three-dimensional modeling data.
- The mixed reality service management unit 271 processes such user input, transmits it to the server 10, and receives and processes the response from the server 10.
- The mixed reality generation unit 272 generates a mixed reality image by combining the real image photographed through the camera unit 21 and the virtual image received from the server 10. The generated mixed reality image can be displayed through the display unit 24.
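The combining step amounts to alpha blending the virtual image over the real camera image. The scalar sketch below shows the arithmetic for a single RGB pixel; a real renderer would do this per pixel on the GPU, and the alpha parameter is an assumed input, not something the patent specifies.

```python
def composite(real_pixel, virtual_pixel, alpha):
    """Blend one virtual RGB pixel over one real RGB pixel.

    alpha is the opacity of the virtual image: 0.0 keeps only the real
    image, 1.0 keeps only the virtual image.
    """
    return tuple(
        round(alpha * v + (1.0 - alpha) * r)
        for r, v in zip(real_pixel, virtual_pixel)
    )

# A fully opaque virtual pixel replaces the real one; 50% alpha mixes them.
opaque = composite((10, 20, 30), (200, 100, 50), 1.0)
mixed = composite((10, 20, 30), (200, 100, 50), 0.5)
```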
- the mixed reality generation unit 272 may generate the left eye image and the right eye image for the mixed reality image .
- the generated left-eye image and right-eye image can be displayed on the left-eye display unit and the right-eye display unit of the head-mounted display 30, respectively.
- the storage unit 28 may store an operating system, a program, software, and the like necessary for the operation of the terminal 20.
- The terminal 20 may be configured as a head-mounted display 30, as shown in FIG. 4.
- The head-mounted display 30 may include a frame 29.
- The frame 29 may be formed of a flexible material so as to be worn easily on the user's head, and may, for example, be shaped as a frame worn on the user's face.
- the frame 29 may be referred to as a body portion or a body portion.
- the frame 29 may be provided with the camera unit 21, the sensor unit 22, the input unit 23, the display unit 24, and the output unit 25 described above.
- The display unit 24 may include left-eye and right-eye display units positioned to correspond to the user's left and right eyes, respectively, when worn. Accordingly, the head-mounted display 30 allows the user to feel a sense of depth corresponding to the parallax between the left-eye image and the right-eye image, and to experience a more realistic mixed reality space.
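The parallax between the two eye images can be illustrated with a standard stereo disparity calculation. The sketch below is not from the patent; the interpupillary distance, depth, and focal length values are assumed defaults for illustration.

```python
def eye_offsets(ipd_m=0.064, depth_m=2.0, focal_px=800.0):
    """Horizontal pixel offsets for the left/right eye images.

    A point at depth_m metres is shifted by plus/minus half the total
    disparity in the two eye images; nearer points get larger shifts,
    which is what produces the viewer's sense of depth.
    """
    disparity = focal_px * ipd_m / depth_m   # total disparity in pixels
    return -disparity / 2.0, disparity / 2.0

left, right = eye_offsets()   # symmetric shifts for the two eye images
```

Note how halving `depth_m` doubles the disparity, so virtual objects rendered close to the user appear to pop out more strongly.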
- the structure of the head mount display 30 is not limited to that described above, and the head mount display may have various structures and shapes.
- the mixed reality service providing method described below may be performed through an application program, software, or the like installed in the terminal 20 or through an HTTP based web service.
- the technical idea of the present invention is not limited to this, and a mixed reality service providing method according to the present invention can be implemented through various methods.
- FIG. 5 is a flowchart illustrating a method of providing a mixed reality service according to the present invention.
- the terminal 20 may receive a user input for driving a mixed reality service (501).
- The user input for driving the mixed reality service may be received, for example, by launching an application, program, or software providing the mixed reality service, or by navigating to a web page providing the mixed reality service.
- the terminal 20 may transmit the mixed reality service driving request to the server 10 in response to the user input (502).
- the server 10 may transmit a driving response to the driving request of the terminal 20 to the terminal 20 (503).
- The server 10 may perform device authentication and/or security authentication on the terminal 20 to determine whether to provide the mixed reality service to the terminal 20, and may transmit the driving response accordingly.
- such an authentication process may not be performed separately.
- the server 10 and the terminal 20 may perform operations for user account creation and / or user account login.
- the terminal 20 may transmit an account creation request or an account login request to the server 10 according to user input.
- The account creation request or the account login request includes identification information of the user, for example, an ID, a password, and the like.
- the server 10 stores the identification information of the user included in the account creation request, and can set and load the data directory so that data related to the user can be stored in association with the identification information of the user.
- The server 10 may search the stored identification information for an entry matching the identification information included in the account login request, and if a match exists, load the stored data directory corresponding to that user's identification information.
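A minimal sketch of that lookup, assuming a simple in-memory account store. The storage layout and field names are illustrative; a real server would also hash passwords rather than compare them in plain text.

```python
# Hypothetical account store: identification info plus the per-user data
# directory the server loads on a successful login.
accounts = {
    "user01": {"password": "s3cret", "data_dir": "/data/user01"},
}

def login(user_id, password):
    """Return the user's data directory on a match, else None."""
    account = accounts.get(user_id)
    if account is None or account["password"] != password:
        return None                  # no matching identification info
    return account["data_dir"]       # load the stored data directory
```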
- When the server 10 does not separately provide user-account-related services, the above process may be omitted.
- the terminal 20 can collect tracking information in real time (504).
- The terminal 20 may collect tracking information through the camera unit 21 and/or the sensor unit 22.
- Tracking information collected at the terminal 20 may include geographical location information, identification information about a physical space, three-dimensional rotation information, speed information, and the like.
- the terminal 20 can determine the current geographical position of the terminal 20 through the GPS sensor.
- the terminal 20 can analyze the image of the real space photographed through the camera unit 21 to identify the shape, depth, and the like of the real space. Through the space identification, the terminal 20 can determine the shape, size, position, etc. of the ground, a building, etc., located in front of the user.
- the terminal 20 can sense movement such as three-dimensional rotation, tilting, and speed (movement) of the terminal 20 through the sensor unit 22.
- The tracking information collected by the terminal 20 in the present invention is not limited to the above-described space identification information and motion information, and may include various information required for generating a virtual image, for example, marker recognition information, hand gesture identification information, and the like.
- the terminal 20 can transmit the collected tracking information to the server 10 in real time (505).
- the server 10 loads the three-dimensional modeling data (506).
- the three-dimensional modeling data may be three-dimensional modeling data including a design for a building.
- The server 10 can load three-dimensional modeling data selected through user input or the like at the terminal 20. Alternatively, the server 10 can determine the geographical position of the terminal 20 from the received tracking information and load previously stored three-dimensional modeling data corresponding to that position. For example, when the terminal 20 is located at a building site at a specific geographical location, the server 10 can load the three-dimensional modeling data of the building designed for that site.
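The location-based loading can be sketched as a nearest-site lookup. The site registry, file names, and coordinates below are invented for illustration, and the planar distance approximation is only reasonable at city scale.

```python
import math

# Hypothetical registry: modeling data keyed by building-site coordinates.
SITES = {
    "tower-a.ifc": (37.5665, 126.9780),
    "plant-b.ifc": (35.1796, 129.0756),
}

def nearest_model(lat, lon):
    """Pick the stored modeling data whose site is closest to the
    terminal's reported position."""
    def dist(site):
        slat, slon = SITES[site]
        return math.hypot(slat - lat, slon - lon)  # planar approximation
    return min(SITES, key=dist)
```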
- the server 10 may generate a virtual image of the loaded three-dimensional modeling data based on the received tracking information (507).
- the server 10 can generate a virtual image so that a building created with three-dimensional modeling data on the real space identified through the tracking information can be displayed at an actual size according to the design information. At this time, the server 10 can arrange the virtual image so that the building can be displayed at the designed position based on the geographical location information.
- the virtual image can be composed of the exterior of the building or a region inside the building.
- the area inside the building may correspond to the front display area (display direction) of the terminal 20, extracted from the exterior or interior image of the building based on tracking information, for example, three-dimensional rotation information.
- the server 10 may create a virtual image as part of the interior image of the building.
- the server 10 can create a virtual image of an interior area of any floor selected by user input or the like.
- the user of the terminal 20 may be physically located in a particular floor of the building.
- the server 10 may acquire the height information of the terminal 20 from the tracking information and generate an interior area of the corresponding floor as a virtual image.
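The floor selection just described can be reduced to a simple mapping from the terminal's height to a floor index, as in this sketch. The base elevations would come from the three-dimensional modeling data; the function and its arguments are illustrative assumptions.

```python
def floor_from_height(terminal_height_m, floor_base_heights_m):
    """Map the terminal's height (from tracking information) to a floor
    index. floor_base_heights_m lists the base elevation of each floor,
    lowest first; heights below the first base default to floor 0."""
    floor = 0
    for i, base in enumerate(floor_base_heights_m):
        if terminal_height_m >= base:
            floor = i
    return floor
```

For a building whose floors start at 0 m, 3.5 m, and 7 m, a terminal at 4 m height would be placed on the second floor (index 1), and the server would render that floor's interior.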
- the server 10 may generate a virtual image as a user interface for performing processing on the three-dimensional modeling data.
- these user interfaces may include a user interface for adding, editing, and deleting attributes such as the structure (location), shape, dimensions, materials, colors, patterns, and equipment of any object (e.g., a wall, ceiling, floor, door, window, or the like) in the displayed building.
- the user interface may also include a user interface for inserting, editing, and deleting notes for any object in the displayed building.
- the user interface may also include a user interface for selecting a floor of the building to be displayed, or for controlling the display of a particular type of facility (e.g., electrical equipment, gas equipment, water supply, etc.).
- the server 10 can create a virtual image by placing a user interface on a corresponding object.
- the server 10 can create a virtual image by disposing the user interface at a specific position on the screen, for example, an upper end, a side, or a lower end.
- the server 10 may generate a virtual image by rotating or zooming the user interface based on tracking information and / or the placement state of the corresponding object.
- the server 10 may transmit the generated virtual image to the terminal 20 (508).
- the terminal 20 may generate a mixed reality image by synthesizing the received virtual image with a real image photographed through the camera unit 21 (509).
- the mixed reality image thus generated may be an image in which buildings corresponding to the three-dimensional modeling data are arranged at actual sizes on the real space where the terminal 20 is located.
- the generated mixed reality image may be an image in which at least one user interface capable of performing processing on any object of the displayed building is located at a position corresponding to the object and / or in a specific area on the screen.
- the terminal 20 can display the generated mixed reality image through the display unit 24 (510).
- while the mixed reality image is being displayed, the terminal 20 continuously collects the tracking information in real time and transmits it to the server 10. In turn, the server 10 generates a virtual image based on the continuously received tracking information and transmits the virtual image to the terminal 20.
- the terminal 20 can display the mixed reality image to the user by synthesizing the virtual image, which changes according to the real-time tracking information, with the real image.
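The steady-state behavior of steps 505 through 510 amounts to a per-frame loop between terminal and server. The sketch below shows one possible shape of that loop; the stub classes stand in for the devices described in the text, and their method names are assumptions, not part of the specification.

```python
class StubTerminal:
    """Illustrative stand-in for terminal 20."""
    def collect_tracking(self):
        return {"rotation_3d": (0, 0, 0)}          # step 505
    def capture_camera_frame(self):
        return "real-frame"
    def composite(self, real, virtual):
        return (real, virtual)                     # step 509

class StubServer:
    """Illustrative stand-in for server 10."""
    def render_virtual_image(self, tracking):
        return ("virtual", tracking["rotation_3d"])  # steps 506-508

def mixed_reality_loop(terminal, server, frames=3):
    """Stream tracking data, receive a virtual image per frame, and
    composite it with the camera image for display (step 510)."""
    shown = []
    for _ in range(frames):
        tracking = terminal.collect_tracking()
        virtual = server.render_virtual_image(tracking)
        real = terminal.capture_camera_frame()
        shown.append(terminal.composite(real, virtual))
    return shown
```

In a real deployment the loop would run at display frame rate, with the network round trip for the virtual image being the dominant latency.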
- An example of the mixed reality image thus displayed is shown in FIGS. 7 and 8.
- the three-dimensional modeling images 701 and 801 are displayed at actual sizes on corresponding building sites.
- the user can feel as if the actual size building is displayed at the designed position through the displayed mixed reality image.
- when the geographical location of the terminal 20 corresponds to the interior of the designed building location, the user may feel as if he or she is located inside the building constructed from the three-dimensional modeling data, as shown in FIG. 8.
- a user can thus realistically, if indirectly, experience how the building will appear once actually constructed on the building site, before construction begins.
- when the mixed reality service according to the present invention is used after the building has actually been constructed according to the design content of the three-dimensional modeling data, the user can compare the constructed building with the three-dimensional modeling data and check whether the building was built in conformity with the design.
- the constructed building can be compared with the 3D modeling data displayed as mixed reality, so that parts of the building damaged or changed after construction can be identified and maintenance or repair can be performed.
- At least one user interface 702, 802 may be displayed on the displayed three-dimensional modeling data to perform the control available for any object as described above.
- the 3D modeling data can be processed through the user interface. This will be described with reference to FIG.
- FIG. 6 is a flowchart illustrating a method for performing processing on 3D modeling data in a mixed reality service providing method according to the present invention.
- the terminal 20 can sense a user input entered through the camera unit 21, the sensor unit 22, and/or the input unit 23 (601).
- the user input is a processing operation on the design information in the three-dimensional modeling data, and may include information for editing the structure, form, dimensions, material, color, pattern, and equipment of an arbitrary object.
- the user input may be received via the displayed user interface.
- the terminal 20 may sense user input by identifying the user's hand gestures through an infrared camera.
- the terminal 20 may sense a user input for executing the user interface displayed at the location to which the finger is pointing.
- the terminal 20 may sense a user input to display a corresponding menu or list or the like.
- the terminal 20 can identify the number of fingers the user holds up and sense a user input to display the mixed reality image for the corresponding floor.
- the type or form of the user input corresponding to the hand gesture of the user is not particularly limited.
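Since the specification deliberately leaves the gesture vocabulary open, a concrete implementation would map recognized gestures to commands through a table such as the following. Every gesture name and command string here is a hypothetical example, not part of the claimed method.

```python
# Illustrative mapping from identified hand gestures to commands.
GESTURE_COMMANDS = {
    "point": "execute_ui_at_pointer",  # run the UI the finger points at
    "palm_open": "show_menu",          # display a menu or list
}

def interpret_gesture(gesture, finger_count=None):
    """Translate a gesture recognized by, e.g., an infrared camera into a
    user-input command. Holding up N fingers selects floor N, as the
    description suggests; unknown gestures are ignored."""
    if gesture == "fingers" and finger_count:
        return ("select_floor", finger_count)
    return (GESTURE_COMMANDS.get(gesture, "ignore"), None)
```

The terminal would then forward the resulting command to the server as part of a processing request, as described in step 602 below.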
- the terminal 20 may transmit a processing request for the three-dimensional modeling data to the server 10 in response to the user input (602).
- the server 10 may process the three-dimensional modeling data (603).
- the server 10 can change the size, change the arrangement, change the color, material, pattern, and the like of the object selected by the processing request in the three-dimensional modeling data according to the processing request.
- the server 10 may store the processed three-dimensional modeling data, or store the three-dimensional modeling data before the processing, for backup, and perform a management and storage operation on the three-dimensional modeling data.
- the server 10 may generate a virtual image based on the edited three-dimensional modeling data (604) and transmit the virtual image to the terminal (605).
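Server-side handling of a processing request (steps 603 and the backup behavior described above) can be sketched as follows. The dictionary-based model is an illustrative stand-in for real three-dimensional modeling data, and the request format is an assumption.

```python
def apply_processing_request(model, request):
    """Edit the selected object's attributes per the processing request
    (step 603), keeping a pre-edit copy for backup as the description
    suggests. Returns (edited_model, backup)."""
    backup = {obj_id: dict(attrs) for obj_id, attrs in model.items()}
    model[request["object_id"]].update(request["changes"])
    return model, backup

model = {"wall-1": {"color": "white", "material": "concrete"}}
edited, backup = apply_processing_request(
    model, {"object_id": "wall-1", "changes": {"color": "grey"}})
```

After such an edit, the server would regenerate the virtual image from the edited model (step 604) and transmit it to the terminal (step 605), while the backup enables the stored pre-edit data to be restored or audited later.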
- the terminal 20 generates (607) a mixed reality image based on the virtual image transmitted from the server 10 and displays it (608), thereby displaying the processed state of various objects in the building corresponding to the three-dimensional modeling data.
- the terminal 20 can store the three-dimensional modeling data indicated by the mixed reality image in the server 10, thereby enabling other terminals to check the processing state of the three-dimensional modeling data.
- the terminal 20 can share the 3D modeling data with other terminals in real time through a mixed reality conference or the like.
Abstract
Description
Claims (10)
- A method for providing a mixed reality service by a server, comprising: loading three-dimensional modeling data including design information for an arbitrary building; generating, based on tracking information transmitted from a terminal, a virtual image of the three-dimensional modeling data and at least one user interface; and transmitting the virtual image to the terminal, wherein the tracking information includes at least one of geographical location information of the terminal, identification information of the real space in which the terminal is located, three-dimensional rotation information, and speed information; wherein the virtual image is generated so that the building can be displayed at actual size according to the design information, and is synthesized with a real image at the at least one terminal to be displayed as a mixed reality image; wherein the virtual image is generated using an interior image of the building when the geographical location of the terminal corresponds to an interior position of the building displayed at actual size, and using an exterior image of the building when it corresponds to an exterior position; and wherein generating the virtual image comprises: when the building is composed of a plurality of floors, obtaining the interior image for an arbitrary floor selected by input or tracking information received from the terminal; and generating the virtual image based on the obtained interior image.
- The method of claim 1, wherein generating the virtual image comprises: obtaining an exterior or interior image of the building according to the geographical location information of the terminal; extracting, according to the tracking information of the terminal, an area within the obtained image corresponding to the front display area of the terminal; and generating the virtual image from the extracted area.
- The method of claim 1, wherein the at least one user interface includes a user interface for at least one of adding, editing, deleting, and managing notes on the design information for the building.
- The method of claim 3, wherein generating the virtual image comprises generating the virtual image by arranging, on at least one object in the building, a user interface for the at least one object.
- The method of claim 4, further comprising: receiving a processing request for the three-dimensional modeling data through the at least one user interface; editing the three-dimensional modeling data in response to the processing request; regenerating the virtual image based on the edited three-dimensional modeling data; and transmitting the regenerated virtual image to the terminal.
- The method of claim 1, further comprising, after transmitting the virtual image to the terminal: receiving changed tracking information from the terminal; regenerating the virtual image of the three-dimensional modeling data and the at least one user interface based on the changed tracking information; and transmitting the regenerated virtual image to the terminal.
- A system for providing a mixed reality service, comprising: a terminal that collects and transmits tracking information while the mixed reality service is in use; and a server that loads three-dimensional modeling data including design information for an arbitrary building, generates a virtual image of the three-dimensional modeling data and at least one user interface based on the tracking information transmitted from the terminal, and transmits the virtual image to the terminal, wherein the tracking information includes at least one of geographical location information of the terminal, identification information of the real space in which the terminal is located, three-dimensional rotation information, and speed information; wherein the virtual image is generated so that the building can be displayed at actual size according to the design information, and is synthesized with a real image at the at least one terminal to be displayed as a mixed reality image; wherein the virtual image is generated using an interior image of the building when the geographical location of the terminal corresponds to an interior position of the building displayed at actual size, and using an exterior image of the building when it corresponds to an exterior position; and wherein the server, when the building is composed of a plurality of floors, obtains the interior image for an arbitrary floor selected by input or tracking information received from the terminal, and generates the virtual image based on the obtained interior image.
- The system of claim 7, wherein the server obtains an exterior or interior image of the building according to the geographical location information of the terminal, extracts, according to the tracking information of the terminal, an area within the obtained image corresponding to the front display area of the terminal, and generates the virtual image from the extracted area.
- The system of claim 7, wherein the at least one user interface includes a user interface for at least one of adding, editing, deleting, and managing notes on the design information for the building.
- The system of claim 9, wherein the terminal transmits, based on user input received through the at least one user interface, a processing request for the three-dimensional modeling data to the server, and the server edits the three-dimensional modeling data in response to the processing request, regenerates the virtual image based on the edited three-dimensional modeling data, and transmits the regenerated virtual image to the terminal.
Priority Applications (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201880082551.3A CN111492396A (zh) | 2017-12-19 | 2018-12-03 | 混合现实服务提供方法及系统 |
US16/956,392 US11030359B2 (en) | 2017-12-19 | 2018-12-03 | Method and system for providing mixed reality service |
Applications Claiming Priority (4)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
KR10-2017-0175376 | 2017-12-19 | ||
KR1020170175376 | 2017-12-19 | ||
KR1020180096262A KR102010030B1 (ko) | 2018-08-17 | 2018-08-17 | 혼합 현실 서비스 제공 방법 및 시스템 |
KR10-2018-0096262 | 2018-08-17 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2019124818A1 true WO2019124818A1 (ko) | 2019-06-27 |
Family
ID=66994174
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/KR2018/015197 WO2019124818A1 (ko) | 2017-12-19 | 2018-12-03 | 혼합 현실 서비스 제공 방법 및 시스템 |
Country Status (3)
Country | Link |
---|---|
US (1) | US11030359B2 (ko) |
CN (1) | CN111492396A (ko) |
WO (1) | WO2019124818A1 (ko) |
Families Citing this family (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
FR3116629B1 (fr) * | 2020-11-25 | 2024-01-19 | Patrick Herbault | Outil de suivi de chantier par réalité augmentée |
FR3117241A1 (fr) | 2020-12-08 | 2022-06-10 | Patrick Herbault | Outil de visualisation 3D d’un bâtiment couplé à un planning |
CN113239443B (zh) * | 2021-06-03 | 2022-07-05 | 广州中硕建筑设计院有限公司 | 一种基于bim软件的智慧空间展示结构设计建模方法 |
Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
KR20090029350A (ko) * | 2007-09-18 | 2009-03-23 | 에스케이텔레콤 주식회사 | 모바일 가상세계 서비스 시스템 및 방법 |
KR20110107542A (ko) * | 2010-03-25 | 2011-10-04 | 에스케이텔레콤 주식회사 | 가상 사용자 인터페이스를 이용한 증강현실 시스템 및 그 방법 |
KR20150058617A (ko) * | 2013-11-18 | 2015-05-29 | 순천대학교 산학협력단 | 모바일 증강현실 기반의 설계도면 3차원 모델 시각화 시스템 및 그 방법 |
KR20150068088A (ko) * | 2013-12-11 | 2015-06-19 | 류지혁 | 3d모델 증강현실 서비스 시스템 |
KR20160033495A (ko) * | 2014-09-18 | 2016-03-28 | 서강대학교산학협력단 | 증강현실을 이용한 가구 배치 장치 및 방법 |
Family Cites Families (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US9418346B2 (en) * | 2013-03-28 | 2016-08-16 | Nokia Technologies Oy | Method and apparatus for providing a drawer-based user interface for content access or recommendation |
WO2018080552A1 (en) * | 2016-10-24 | 2018-05-03 | Carrington Charles C | System for generating virtual building plan data based upon stored and scanned building data and related methods |
CN107168532B (zh) * | 2017-05-05 | 2020-09-11 | 武汉秀宝软件有限公司 | 一种基于增强现实的虚拟同步显示方法及系统 |
CN107222468B (zh) * | 2017-05-22 | 2020-12-18 | 北京邮电大学 | 增强现实处理方法、终端、云端服务器和边缘服务器 |
Also Published As
Publication number | Publication date |
---|---|
CN111492396A (zh) | 2020-08-04 |
US11030359B2 (en) | 2021-06-08 |
US20200334396A1 (en) | 2020-10-22 |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 18891605 Country of ref document: EP Kind code of ref document: A1 |
|
NENP | Non-entry into the national phase |
Ref country code: DE |
|
122 | Ep: pct application non-entry in european phase |
Ref document number: 18891605 Country of ref document: EP Kind code of ref document: A1 |
|
32PN | Ep: public notification in the ep bulletin as address of the adressee cannot be established |
Free format text: NOTING OF LOSS OF RIGHTS PURSUANT TO RULE 112(1) EPC (EPO FORM 1205A DATED 02/02/2021) |
|