KR20160109828A - Augmented reality system - Google Patents
Augmented reality system
- Publication number
- KR20160109828A (publication) / KR1020150034951A (application)
- Authority
- KR
- South Korea
- Prior art keywords
- augmented reality
- content information
- user terminal
- information
- image
- Prior art date
Classifications
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T19/00—Manipulating 3D models or images for computer graphics
- G06T19/006—Mixed reality
Landscapes
- Engineering & Computer Science (AREA)
- Computer Graphics (AREA)
- Computer Hardware Design (AREA)
- General Engineering & Computer Science (AREA)
- Software Systems (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Theoretical Computer Science (AREA)
- Processing Or Creating Images (AREA)
- User Interface Of Digital Computer (AREA)
Abstract
The present invention provides an augmented reality system. The augmented reality system includes: a position sensor that provides its own position information to a user terminal; an augmented reality server that stores 3D (3-Dimensional) content information of an object existing in a specific area and provides the 3D content information to the user terminal when the user terminal approaches within a predetermined range of the specific area; and a user terminal that identifies an adjacent object using the position information provided from the position sensor, extracts the 3D content information about the identified object from the 3D content information provided from the augmented reality server, synthesizes the extracted 3D content information with an image photographed through a camera, and outputs the synthesized image.
Description
The present invention relates to augmented reality technology, and more particularly, to a system for providing a 3D image related to an object to a user accessing a specific object.
Augmented Reality (AR) technology is a technology that shows a virtual world overlaid on the reality that the user sees.
While conventional virtual reality (VR) technology targets only virtual spaces and objects, augmented reality technology overlays a virtual world carrying additional information onto the real world as a single image, and thus can provide supplementary information that is difficult to obtain from the real world alone. It is also called mixed reality technology.
However, the conventional augmented reality technique has a problem in that the positional error between the real image and the augmented image is large. Therefore, it is necessary to develop a technique for reducing the positional error between the real image and the augmented image.
In order to solve the above-described problems, the present invention identifies an object adjacent to the user using position information provided from a sensor, extracts an augmented image associated with the object from information provided from a server, synthesizes the augmented image with a real image captured through a camera, and outputs the synthesized image.
According to an aspect of the present invention, there is provided an augmented reality system comprising: a position sensor that provides position information to a user terminal; an augmented reality server that stores 3-dimensional (3D) content information of an object existing in a specific area and provides the 3D content information to the user terminal when the user terminal approaches within a predetermined range of the specific area; and a user terminal that identifies an adjacent object using the position information provided from the position sensor, extracts the 3D content information about the identified object from the 3D content information provided from the augmented reality server, and synthesizes the extracted 3D content information with an image photographed through a camera.
According to the present invention, an object adjacent to the user is identified using the position information provided from the sensor, and the augmented image of the adjacent object is synthesized with the real image captured through the camera, which provides the advantage of reducing the positional error between the real image and the augmented image.
FIG. 1 is a block diagram illustrating a configuration of an augmented reality system according to an embodiment of the present invention.
FIG. 2 is a flowchart illustrating an operation of an augmented reality system according to an exemplary embodiment of the present invention.
The above and other objects, advantages, and features of the present invention, and methods of achieving them, will become apparent from the following detailed description of embodiments taken in conjunction with the accompanying drawings.
The present invention may, however, be embodied in many different forms and should not be construed as being limited to the exemplary embodiments set forth herein. Rather, these embodiments are provided so that this disclosure will be thorough and complete and will fully convey the scope of the invention to those skilled in the art; the scope of the invention is defined by the appended claims.
It is to be understood that the terminology used herein is for the purpose of describing particular embodiments only and is not intended to be limiting of the invention. The singular forms used herein include plural forms unless the context clearly dictates otherwise. The terms "comprises" and/or "comprising", as used herein, specify the presence of stated components, steps, and/or operations, but do not preclude the presence or addition of one or more other components, steps, and/or operations.
Hereinafter, embodiments of the present invention will be described in detail with reference to the accompanying drawings.
FIG. 1 is a block diagram illustrating a configuration of an augmented reality system according to an embodiment of the present invention.
As shown in FIG. 1, an augmented reality system according to an embodiment of the present invention includes a position sensor 100, a user terminal 200, and an augmented reality server 300.
The position sensor 100 provides position information to the user terminal 200.
The augmented reality server 300 stores 3D (3-Dimensional) content information of objects existing in a specific area and provides the 3D content information to the user terminal 200 when the user terminal 200 approaches within a predetermined range of the specific area.
To this end, the augmented reality server 300 communicates with the user terminal 200 through a network.
The user terminal 200 identifies an adjacent object using the position information provided from the position sensor 100, extracts the 3D content information about the identified object from the 3D content information provided from the augmented reality server 300, synthesizes the extracted 3D content information with an image photographed through a camera, and outputs the synthesized image.
Hereinafter, the operation of the above-described devices will be described in detail.
The position sensor 100 is mounted in the specific area and provides its position information to the user terminal 200.
The 3D content information stored in the augmented reality server 300 includes 3D data of objects existing in the specific area.
In this case, the 3D data may be generated by modeling with an SfM (Structure from Motion) technique using feature point data of previously extracted objects and by combining basic solid figures such as a hexahedron and a cylinder, or it may be information generated by the augmented reality server 300 itself.
A specific area described in this specification may be a sightseeing spot, an experience spot, a cultural property, or the like, and the objects present in the specific area are buildings, monuments, places, and the like existing in the sightseeing spot, experience spot, or cultural property.
For example, the 3D content information about Cheongwoongyo in Bulguksa may include a scene reconstructing its construction, a scene of people of old crossing the bridge, and a related historical background.
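The patent does not specify how primitives are combined; as a rough, hypothetical illustration of the primitive-combination approach it mentions (hexahedra and cylinders assembled into an object model), the following Python sketch builds a crude gate-like structure from two cylindrical pillars and a box lintel. All dimensions and the `box_vertices`/`cylinder_vertices` helpers are invented for this example.

```python
import math

def box_vertices(cx, cy, cz, w, h, d):
    """Return the 8 corner vertices of an axis-aligned hexahedron."""
    return [(cx + sx * w / 2, cy + sy * h / 2, cz + sz * d / 2)
            for sx in (-1, 1) for sy in (-1, 1) for sz in (-1, 1)]

def cylinder_vertices(cx, cy, cz, r, h, segments=16):
    """Approximate a cylinder with two rings of vertices (bottom and top)."""
    verts = []
    for top in (0, 1):
        for i in range(segments):
            a = 2 * math.pi * i / segments
            verts.append((cx + r * math.cos(a),
                          cy + r * math.sin(a),
                          cz + top * h))
    return verts

# A crude "gate" model: two cylindrical pillars supporting a box lintel.
pillar_a = cylinder_vertices(-2.0, 0.0, 0.0, 0.5, 4.0)
pillar_b = cylinder_vertices(2.0, 0.0, 0.0, 0.5, 4.0)
lintel = box_vertices(0.0, 0.0, 4.5, 6.0, 1.5, 1.0)
model = pillar_a + pillar_b + lintel  # combined primitive model
```

In practice such primitive models would be refined against the SfM feature points; here the combination step is only the vertex-list concatenation.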
The user terminal 200 calculates its current position and movement direction and determines whether it has approached the specific area.
At this time, the user terminal 200 calculates the current position and direction using continuous image information captured through the camera together with the position information provided from the position sensor 100.
By calculating the position and direction from the continuous image information and the position information of the sensor, the positional error between the real image and the augmented image is reduced and the accuracy of the augmented image is improved.
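The patent does not give the fusion formula; one common way to combine a drift-free but noisy sensor fix with a smooth but drifting image-based estimate is a complementary (weighted) blend, and the movement direction can be derived from two consecutive fused positions. The sketch below is a minimal assumption-laden illustration; `alpha` and the 2D position tuples are invented for this example.

```python
import math

def fuse_position(sensor_pos, image_pos, alpha=0.7):
    """Blend an absolute sensor reading with a position estimated from
    consecutive camera images; alpha weights the sensor reading."""
    return tuple(alpha * s + (1 - alpha) * v
                 for s, v in zip(sensor_pos, image_pos))

def heading(prev_pos, cur_pos):
    """Movement direction in degrees, from two consecutive positions."""
    dx = cur_pos[0] - prev_pos[0]
    dy = cur_pos[1] - prev_pos[1]
    return math.degrees(math.atan2(dy, dx)) % 360
```

A real implementation would more likely use a Kalman filter over full 6-DoF poses, but the principle of weighting the two information sources is the same.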
If it is determined that the calculated position corresponds to an entrance of the specific area or to a 3D content information reception position provided in the specific area, or that the movement direction is directed toward an entrance of the specific area, the user terminal 200 receives the 3D content information from the augmented reality server 300 and stores it.
At this time, the 3D content information may be received through a wireless network such as WiFi, Zigbee, 3G (3rd Generation), 4G (4th Generation), or LTE (Long Term Evolution).
Then, the user terminal 200 continuously calculates its position and direction to determine whether it is adjacent to, or facing, a specific object existing in the specific area.
If it is determined to be adjacent to or facing a specific object, the user terminal 200 searches the stored 3D content information for the 3D content information about that object.
For example, if the current location is determined to be in front of Cheongwoongyo in Bulguksa, the user terminal 200 searches for the 3D content information related to Cheongwoongyo.
Then, the user terminal 200 synthesizes the retrieved 3D content information with the real image captured through the camera and outputs the synthesized image.
In addition, the user terminal 200 may reproduce the synthesized image to the user in 3D form.
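The proximity-and-facing test described above can be sketched as a distance threshold plus a bearing check against the movement direction. This is not the patent's algorithm, only a plausible minimal version; the object coordinates, thresholds, and the `OBJECTS` table are all invented for illustration.

```python
import math

OBJECTS = {  # hypothetical object database: name -> (x, y) position in meters
    "Cheongwoongyo": (120.0, 45.0),
    "Dabotap": (180.0, 60.0),
}

def adjacent_object(pos, heading_deg, max_dist=30.0, max_angle=45.0):
    """Return the name of an object the user is near and roughly facing,
    or None if no object qualifies."""
    for name, (ox, oy) in OBJECTS.items():
        dx, dy = ox - pos[0], oy - pos[1]
        dist = math.hypot(dx, dy)
        bearing = math.degrees(math.atan2(dy, dx)) % 360
        # smallest signed angle between bearing to object and heading
        off = abs((bearing - heading_deg + 180) % 360 - 180)
        if dist <= max_dist and off <= max_angle:
            return name
    return None
```

Once an object name is returned, the stored 3D content for that object would be looked up and composited over the camera frame.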
FIG. 2 is a flowchart illustrating an operation of an augmented reality system according to an exemplary embodiment of the present invention.
As shown in FIG. 2, the user terminal calculates its current location and direction of movement (S200) and determines whether it has approached a specific area (S210).
As a result of the determination, if the calculated position corresponds to an entrance of the specific area or to a 3D content information reception position provided in the specific area, or if the movement direction is directed toward an entrance of the specific area, the user terminal receives 3D content information from the augmented reality server and stores the received 3D content information (S220).
At this time, the user terminal calculates the current position and direction using continuous real images, the position information measured by the position sensor, and the like, and communicates with the augmented reality server through a wireless network such as WiFi, Zigbee, 3G (3rd Generation), 4G (4th Generation), or LTE (Long Term Evolution).
In addition, the augmented reality server stores unique ID information of a plurality of position sensors mounted in the specific area, position information of the plurality of position sensors, and 3D content information, such as 3D data of objects existing in the specific area, databased by region.
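The server-side records described above (sensor registry plus per-region 3D content) can be sketched as a small keyed data model. The layout, field names, and sample values below are hypothetical; the patent specifies only what is stored, not how.

```python
# Hypothetical server-side data model: sensor registry and content database.
SENSOR_REGISTRY = {
    "sensor-001": {"area": "Bulguksa", "position": (35.7900, 129.3320)},
    "sensor-002": {"area": "Bulguksa", "position": (35.7902, 129.3325)},
}

CONTENT_DB = {  # 3D content information databased by region
    "Bulguksa": {
        "Cheongwoongyo": {
            "mesh": "cheongwoongyo.obj",    # 3D data of the object
            "narration": "history_ko.txt",  # related historical background
        },
    },
}

def content_for_area(area):
    """Return all 3D content records for one specific area (possibly empty)."""
    return CONTENT_DB.get(area, {})
```

On approach (step S220), the server would serve `content_for_area(...)` for the area associated with the sensor whose ID the terminal reports.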
In this case, the 3D data may be generated by modeling with an SfM (Structure from Motion) technique using feature point data of previously extracted objects and by combining basic solid figures such as a hexahedron and a cylinder, or it may be information generated by the augmented reality server itself.
A specific area described in this specification may be a sightseeing spot, an experience spot, a cultural property, or the like, and the objects present in the specific area are buildings, monuments, places, and the like existing in the sightseeing spot, experience spot, or cultural property.
Then, the user terminal continuously calculates its position and direction to determine whether it is adjacent to, or facing, a specific object existing in the specific area (S230).
As a result of the determination, if the user terminal is located close to a specific object or is facing a specific object, it searches the stored 3D content information for the 3D content information about that object (S240).
Then, the user terminal synthesizes the 3D content information with the real image (S250) and displays the synthesized 3D image to the user (S260).
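Steps S200 through S260 form one iteration of the terminal's main loop. As a hedged sketch only, the function below strings the steps together against a hypothetical `terminal` object; every method name (`calculate_pose`, `entered_area`, `composite`, etc.) is an assumption, not an API defined by the patent.

```python
def ar_step(terminal):
    """One iteration of the flow in FIG. 2 (S200-S260).

    `terminal` is any object exposing the operations described in the
    text; returns the composited frame, or None if no object is nearby.
    """
    pos, direction = terminal.calculate_pose()                  # S200
    if terminal.entered_area(pos, direction):                   # S210
        terminal.store_content(terminal.fetch_content())        # S220
    obj = terminal.nearby_object(pos, direction)                # S230
    if obj is None:
        return None
    content = terminal.search_content(obj)                      # S240
    frame = terminal.composite(content, terminal.camera_frame())  # S250
    terminal.display(frame)                                     # S260
    return frame
```

In a running system this step would be called once per camera frame so that the overlay tracks the continuously recalculated position and direction.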
For example, if the current location of the user terminal is determined to be in front of Cheongwoongyo in Bulguksa, a scene reconstructing the construction of Cheongwoongyo, a scene of people of old crossing the bridge, and a historical background related to Cheongwoongyo are superimposed on the real image captured by the user and reproduced in 3D form.
While the present invention has been particularly shown and described with reference to exemplary embodiments thereof, it is to be understood that the invention is not limited to the disclosed exemplary embodiments, but, on the contrary, is intended to cover various modifications and equivalent arrangements included within the spirit and scope of the invention.
Therefore, the disclosed embodiments are illustrative rather than limiting, and the scope of the present invention is not limited by these embodiments. It is intended that the present invention cover the modifications and variations of this invention provided they come within the scope of the appended claims and their equivalents.
100: Position sensor 200: User terminal
300: augmented reality server
Claims (1)
An augmented reality system comprising:
an augmented reality server that stores 3D (3-Dimensional) content information of an object existing in a specific area and provides the 3D content information to a user terminal when the user terminal approaches within a predetermined range of the specific area; and
a user terminal that identifies an adjacent object using position information provided from a position sensor, extracts the 3D content information about the identified object from the 3D content information provided from the augmented reality server, synthesizes the extracted 3D content information with an image photographed through a camera, and outputs the synthesized image.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
KR1020150034951A KR20160109828A (en) | 2015-03-13 | 2015-03-13 | Augmented reality system |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
KR1020150034951A KR20160109828A (en) | 2015-03-13 | 2015-03-13 | Augmented reality system |
Publications (1)
Publication Number | Publication Date |
---|---|
KR20160109828A true KR20160109828A (en) | 2016-09-21 |
Family
ID=57080902
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
KR1020150034951A KR20160109828A (en) | 2015-03-13 | 2015-03-13 | Augmented reality system |
Country Status (1)
Country | Link |
---|---|
KR (1) | KR20160109828A (en) |
Cited By (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
KR20190106404A (en) * | 2018-03-09 | 2019-09-18 | (주)벨류데이터 | Selfie support Camera System using augmented reality |
US10930084B2 (en) | 2018-10-18 | 2021-02-23 | Samsung Display Co., Ltd. | Electronic device including display unit and method of operating the same |
US11215690B2 (en) | 2020-05-11 | 2022-01-04 | Ajou University Industry-Academic Cooperation Foundation | Object location measurement method and augmented reality service providing device using the same |
- 2015-03-13 KR KR1020150034951A patent/KR20160109828A/en unknown
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US10964108B2 (en) | Augmentation of captured 3D scenes with contextual information | |
US10462406B2 (en) | Information processing apparatus and information processing method | |
US8963999B1 (en) | Augmented reality with earth data | |
US9392248B2 (en) | Dynamic POV composite 3D video system | |
JP2020504872A (en) | System and method for controlling vehicle movement | |
EP2672455B1 (en) | Apparatus and method for providing 3D map showing area of interest in real time | |
US20220385721A1 (en) | 3d mesh generation on a server | |
CN110361005B (en) | Positioning method, positioning device, readable storage medium and electronic equipment | |
KR102197615B1 (en) | Method of providing augmented reality service and server for the providing augmented reality service | |
CN106289263A (en) | Indoor navigation method and device | |
KR102383567B1 (en) | Method and system for localization based on processing visual information | |
KR101996241B1 (en) | Device and method for providing 3d map representing positon of interest in real time | |
KR20160109828A (en) | Augmented reality system | |
US10957100B2 (en) | Method and apparatus for generating 3D map of indoor space | |
KR101700651B1 (en) | Apparatus for tracking object using common route date based on position information | |
CN113378605A (en) | Multi-source information fusion method and device, electronic equipment and storage medium | |
KR101611427B1 (en) | Image processing method and apparatus performing the same | |
CN109214482A (en) | A kind of indoor orientation method, device, terminal device and storage medium | |
US20220138979A1 (en) | Remote measurements from a live video stream | |
US11836942B2 (en) | Information integration method, information integration device, and information integration program | |
JP2014106602A (en) | Information terminal device | |
KR20210049527A (en) | Method for receiving map information from map generating server and user equipment performing method | |
KR20210048928A (en) | Method for determining location to implement virtual object and user equipment performing method | |
KR101316387B1 (en) | Method of object recognition using vision sensing and distance sensing | |
KR20210048798A (en) | Method for determining pose of camera provided in user equipment and location calculation server performing method |