KR20110072108A - Vision based augmented reality system using mobile terminal having tag - Google Patents
- Publication number
- KR20110072108A (application number KR1020090128910A)
- Authority
- KR
- South Korea
- Prior art keywords
- mobile terminal
- tag
- camera
- augmented reality
- image
- Prior art date
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T19/00—Manipulating 3D models or images for computer graphics
- G06T19/006—Mixed reality
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/70—Determining position or orientation of objects or cameras
- G06T7/73—Determining position or orientation of objects or cameras using feature-based methods
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N5/00—Details of television systems
- H04N5/30—Transforming light or analogous information into electric information
- H04N5/33—Transforming infrared radiation
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04W—WIRELESS COMMUNICATION NETWORKS
- H04W88/00—Devices specially adapted for wireless communication networks, e.g. terminals, base stations or access point devices
- H04W88/02—Terminal devices
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/10—Image acquisition modality
- G06T2207/10048—Infrared image
Abstract
The present invention relates to a vision-based augmented reality implementation system using a mobile terminal having a tag. The system comprises: at least one mobile terminal on which software capable of implementing augmented reality is installed, having a first camera and one or more tags attached to its outer surface; a second camera that photographs the mobile terminal with the tag attached to its outer surface; a computer, connected to the second camera, that receives the image of the tag captured within the second camera's shooting zone, analyzes the input tag image, calculates the relative position between the photographed mobile terminal and the second camera, and stores the resulting location information of the mobile terminal; and a communication means enabling wired and wireless communication between the mobile terminal and the computer, through which the stored location information is transmitted to the terminal. The mobile terminal extracts a virtual image stored on the terminal based on the location information received through the communication means, renders the extracted virtual image using that location information, synthesizes the rendered virtual image with the real image captured by the first camera, and outputs the resulting new image on its screen. According to the present invention, augmented reality can be implemented in a specific experience zone (near a DID, near exhibits, etc.) using a tag-attached mobile terminal in a way that the user never sees the tag, while supporting augmented reality as stably as tag-based methods.
Augmented Reality, Infrared, Camera, Tag, Mobile
Description
The present invention relates to a system for implementing augmented reality, and in particular to a vision-based augmented reality implementation system in which a tag is attached to a user's mobile terminal and augmented reality is output on that terminal whenever it is within the shooting range of a camera capable of capturing the tag.
Augmented reality is a modified form of virtual reality that aims to aid the user's perception of, and interaction with, the real world by adding virtual objects to the real environment.
Augmented reality of this kind has been applied in a variety of fields, including medicine, industry, entertainment, and the military.
A typical augmented reality system can be designed as either an optical-based or a video-based system. An optical-based system lets the user see the real environment directly through the translucent screen of a head-mounted display (HMD), and presents information by projecting virtual objects onto that screen.
In contrast, a video-based system obtains an image from a camera and then synthesizes virtual information into that image.
Here, a tag is used to classify the information; such a tag is also called a visual marker.
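The compositing step of a video-based system can be illustrated with a short sketch. The code below is not from the patent; it simply alpha-blends a rendered virtual layer over a captured camera frame, using nested Python lists as stand-in grayscale frame buffers:

```python
def composite(real_frame, virtual_frame, alpha_mask):
    """Blend a rendered virtual frame over a real camera frame.

    Each frame is a list of rows of grayscale pixel values (0-255);
    alpha_mask holds the per-pixel opacity of the virtual layer (0.0-1.0).
    """
    out = []
    for real_row, virt_row, a_row in zip(real_frame, virtual_frame, alpha_mask):
        out.append([
            round(a * v + (1.0 - a) * r)
            for r, v, a in zip(real_row, virt_row, a_row)
        ])
    return out

# A 2x2 real frame with a fully opaque virtual pixel at top-left only.
real = [[100, 100], [100, 100]]
virtual = [[255, 0], [0, 0]]
mask = [[1.0, 0.0], [0.0, 0.0]]
print(composite(real, virtual, mask))  # [[255, 100], [100, 100]]
```

A production system would do this per color channel on GPU-resident buffers; the per-pixel blend itself is the same operation.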
Vision-based augmented reality mainly uses one of two approaches: recognizing a specific tag, or recognizing the location by extracting feature points from the image.
Existing vision-based mobile augmented reality has likewise either placed a tag at the position to be augmented or relied on feature points.
Both approaches analyze the video from the mobile device and draw a 3D object at the corresponding position in order to augment the scene.
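As an illustration of the tag-based approach (not the patent's own algorithm), the sketch below recovers a coarse pose from the four detected corners of a square tag. It assumes a pinhole camera with known focal length and a tag held roughly parallel to the image plane; real systems such as ARToolKit instead decompose a full homography or solve the PnP problem:

```python
import math

def tag_pose_2d(corners, tag_size_m, focal_px):
    """Coarse tag pose from four image corners.

    corners: [(x, y)] pixel coordinates ordered top-left, top-right,
    bottom-right, bottom-left.  Assumes the tag plane is roughly
    parallel to the image plane -- a strong simplification; full
    systems decompose a homography or solve PnP instead.
    """
    # Tag centre in the image: mean of the four corners.
    cx = sum(x for x, _ in corners) / 4.0
    cy = sum(y for _, y in corners) / 4.0
    # Apparent side length in pixels gives depth via the pinhole model.
    (x0, y0), (x1, y1) = corners[0], corners[1]
    side_px = math.hypot(x1 - x0, y1 - y0)
    depth_m = focal_px * tag_size_m / side_px
    # In-plane rotation from the direction of the top edge.
    yaw_rad = math.atan2(y1 - y0, x1 - x0)
    return (cx, cy), depth_m, yaw_rad

# An axis-aligned 100 px tag centred at (320, 240); 5 cm tag, f = 800 px.
corners = [(270, 190), (370, 190), (370, 290), (270, 290)]
center, depth, yaw = tag_pose_2d(corners, 0.05, 800.0)
print(center, depth, yaw)  # (320.0, 240.0) 0.4 0.0
```

The recovered pose is exactly what the 3D object is then anchored to when it is drawn over the video frame.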
Each tag carries a unique ID that is arbitrarily assigned according to its type; data such as text, graphics, and sound are stored on a computer under that ID, and recognizing the tag causes the corresponding data to be output.
Tags are recognized as different IDs according to their orientation and shape. Conventional tag systems include the ARToolKit marker system and the HOM, IGD, and SCR marker systems.
In more detail, FIG. 1 attaches a
At this time, the
In this state, through the
A tag used in such an augmented reality system combines pictures, characters, or figures so that the tag can be recognized reliably in the images captured by the camera, and it is therefore visible to the naked eye.
Because the tag is visible to the naked eye, it raises aesthetic problems and restricts the places where it can be installed.
In short, the tag-based method has installation and aesthetics problems, while the feature-point method, although it shows no tag to the user, slows down markedly as the number of images to be recognized grows.
An object of the present invention, devised to solve the problems described above, is to provide a vision-based augmented reality implementation system that uses a tagged mobile terminal in a specific augmented reality experience zone (near a DID, near exhibits, etc.) and supports augmented reality as stably as a tag-based method, without ever showing the tag to the user.
The present invention devised to achieve this object comprises: at least one mobile terminal on which software for implementing augmented reality is installed, having a first camera and a tag attached to its outer surface; a second camera that photographs the mobile terminal with the attached tag; a computer, connected to the second camera, that receives the image of the tag captured within the second camera's photographing area, analyzes the input tag image, calculates the relative position between the photographed mobile terminal and the second camera, and stores the resulting location information of the mobile terminal; and a communication means enabling wired and wireless communication between the mobile terminal and the computer for transmitting the stored location information to the terminal. The mobile terminal extracts a virtual image stored on the terminal based on the location information received through the communication means, renders the extracted virtual image using that location information, synthesizes the rendered virtual image with the real image captured by the first camera, and outputs the resulting new image on its screen.
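The division of labour described above, where the computer tracks the tag and serves pose updates while the terminal renders and composites, can be sketched as follows. All class and method names here are illustrative assumptions, not an API defined by the patent:

```python
from dataclasses import dataclass

@dataclass
class Pose:
    """Position and in-plane rotation of the terminal relative to the second camera."""
    x: float
    y: float
    z: float
    yaw: float

class TrackingComputer:
    """Side of the second camera: detects the tag in its shooting zone and
    derives the terminal's pose relative to that camera."""
    def track(self, tag_observation):
        # Stand-in for real tag detection and pose estimation.
        return Pose(*tag_observation)

class MobileTerminal:
    """Runs the AR software: picks a stored virtual image for the received
    pose, then composites it over the first camera's real frame."""
    def __init__(self, virtual_assets):
        self.virtual_assets = virtual_assets

    def display(self, real_frame, pose):
        overlay = self.virtual_assets.get(round(pose.z), "none")
        # Stand-in for rendering + synthesis onto the terminal's screen.
        return f"{real_frame}+{overlay}@({pose.x},{pose.y})"

computer = TrackingComputer()
terminal = MobileTerminal({1: "exhibit_model"})
pose = computer.track((320.0, 240.0, 1.0, 0.0))  # delivered via the communication means
print(terminal.display("frame_001", pose))  # frame_001+exhibit_model@(320.0,240.0)
```

The essential point of the architecture is that the terminal never sees the tag or the tag image itself; it receives only the computed location information.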
The mobile terminal may further include an infrared reflecting film, in the shape of the tag image, attached to the tag surface, in which case the second camera is an infrared camera that photographs the infrared reflecting film.
The tag attached to an outer surface of the mobile terminal is an infrared tag represented by an infrared reflecting material, and the second camera is an infrared camera that photographs the infrared tag.
The mobile terminal may further comprise an infrared transmission plate that passes light in the infrared region so that the tag can be recognized.
The communication means is any one or more of RS232C, IEEE 1394, High-Definition Multimedia Interface (HDMI), Ultra-Wide-Band (UWB), Bluetooth, Infrared Data Association (IrDA), ZigBee, Near Field Communication (NFC), WiFi, and RF.
The present invention implements augmented reality with a tagged mobile terminal in a specific augmented reality experience zone (near a DID, near exhibits, etc.) and has the effect of supporting augmented reality as stably as a tag-based method, without showing the tag to the user.
Hereinafter, preferred embodiments of the vision-based augmented reality implementation system using a mobile terminal having a tag according to the present invention will be described in detail with reference to the accompanying drawings.
FIGS. 2 to 4 are conceptual diagrams showing a vision-based augmented reality implementation system using a mobile terminal having a tag according to an embodiment of the present invention. Referring to FIGS. 2 to 4, the system according to this embodiment is a
The
The
The computer is connected to the
And the position information of the
In this case, the shooting zone refers to the maximum field of view of the second camera.
The communication means enables wired and wireless communication between the mobile terminal and the computer.
In this case, the
The communication means may be any one or more of RS232C, IEEE 1394, High-Definition Multimedia Interface (HDMI), Ultra-Wide-Band (UWB), Bluetooth, Infrared Data Association (IrDA), ZigBee, Near Field Communication (NFC), WiFi, and RF.
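Whichever of the listed links is used, the computer has to push the stored location information to the terminal in some serialized form. As a hedged sketch (the patent specifies no wire format), the example below encodes a pose as newline-delimited JSON and sends it over a local socket pair standing in for the WiFi or Bluetooth link:

```python
import json
import socket

def encode_pose(x, y, z, yaw):
    """Serialize one pose update as a newline-delimited JSON message."""
    return (json.dumps({"x": x, "y": y, "z": z, "yaw": yaw}) + "\n").encode()

def decode_pose(data):
    """Parse a pose message back into a dict."""
    return json.loads(data.decode())

# A local socket pair stands in for the real link (WiFi, Bluetooth, ...).
computer_side, terminal_side = socket.socketpair()
computer_side.sendall(encode_pose(320.0, 240.0, 1.0, 0.0))
msg = decode_pose(terminal_side.recv(4096))
computer_side.close()
terminal_side.close()
print(msg["x"], msg["z"])  # 320.0 1.0
```

A text format like this trades a few bytes of overhead for easy debugging across the heterogeneous links the claim allows; a bandwidth-constrained link such as IrDA might instead use a fixed-size binary struct.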
And after rendering using the location information of the
That is, the 3D position of the
The recognized direction and location of the
Meanwhile, the
The tag attached to the outer surface of the
The
The tag attached to the external surface of the
The present invention has been described above on the basis of preferred embodiments, but its technical idea is not limited to them; it will be apparent to those skilled in the art that modifications or changes can be made within the scope of the claims, and such modifications and variations belong to the appended claims.
The drawings attached to this specification illustrate preferred embodiments of the present invention and, together with the detailed description, serve to further the understanding of its technical spirit; the present invention should therefore not be interpreted as limited to the matters shown in the drawings.
FIG. 1 conceptually illustrates the system configuration of a conventional augmented reality system;
FIGS. 2 to 4 are conceptual diagrams showing a vision-based augmented reality implementation system using a mobile terminal having a tag according to an embodiment of the present invention.
<Description of the symbols for the main parts of the drawings>
- 100: mobile terminal
- 110: first camera
- 200: second camera
- 300: computer
Claims (5)
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
KR1020090128910A | 2009-12-22 | 2009-12-22 | Vision based augmented reality system using mobile terminal having tag |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
KR1020090128910A | 2009-12-22 | 2009-12-22 | Vision based augmented reality system using mobile terminal having tag |
Publications (1)
Publication Number | Publication Date |
---|---|
KR20110072108A (en) | 2011-06-29 |
Family
ID=44403079
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
KR1020090128910A | Vision based augmented reality system using mobile terminal having tag | 2009-12-22 | 2009-12-22 |
Country Status (1)
Country | Link |
---|---|
KR (1) | KR20110072108A (en) |
Worldwide Applications (1)
- 2009-12-22: KR — application KR1020090128910A, published as KR20110072108A (en); status: not_active (Application Discontinuation)
Cited By (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
KR101342932B1 (en) * | 2011-10-13 | 2014-01-02 | (주)엔에이에스씨 | Apparatus for guiding |
KR101498636B1 (en) * | 2013-10-22 | 2015-03-04 | 김시원 | Smart pad |
CN104463055A (en) * | 2014-12-29 | 2015-03-25 | 重庆甲虫网络科技有限公司 | Augmented reality system based on wireless smart label and control method of augmented reality system |
WO2018045722A1 (en) * | 2016-09-06 | 2018-03-15 | 中兴通讯股份有限公司 | Terminal control method and apparatus |
CN107809755A (en) * | 2016-09-06 | 2018-03-16 | 中兴通讯股份有限公司 | Terminal control method and device |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
EP3039476B1 (en) | Head mounted display device and method for controlling the same | |
US9076033B1 (en) | Hand-triggered head-mounted photography | |
JP5967839B2 (en) | Display device using wearable glasses and operating method thereof | |
KR101239284B1 (en) | Control terminal and server for managing target devices using Augmented Reality Contents | |
US9298970B2 (en) | Method and apparatus for facilitating interaction with an object viewable via a display | |
TWI620098B (en) | Head mounted device and guiding method | |
CN111095364A (en) | Information processing apparatus, information processing method, and program | |
KR20160015972A (en) | The Apparatus and Method for Wearable Device | |
CN112614057A (en) | Image blurring processing method and electronic equipment | |
CN113545030B (en) | Method, user equipment and system for automatically generating full-focus image through mobile camera | |
US11709370B2 (en) | Presentation of an enriched view of a physical setting | |
JP4303087B2 (en) | Data signal transmission method and reception method and apparatus, system, program, and recording medium | |
CN108027707A (en) | Subscriber terminal equipment, electronic equipment and the method for control subscriber terminal equipment and electronic equipment | |
US20150172550A1 (en) | Display tiling for enhanced view modes | |
KR20110072108A (en) | Vision based augmented reality system using mobile terminal having tag | |
TWM482797U (en) | Augmented-reality system capable of displaying three-dimensional image | |
EP3038061A1 (en) | Apparatus and method to display augmented reality data | |
KR101767220B1 (en) | System and method for processing hand gesture commands using a smart glass | |
CN104239877A (en) | Image processing method and image acquisition device | |
KR20170002921A (en) | Apparatus and method for creating digital building instruction | |
US11624924B2 (en) | Image capturing system including head-mount type display device, and display device and method of controlling the same | |
US20230035360A1 (en) | Mapping networked devices | |
US20230400978A1 (en) | Eyewear device user interface | |
TWI784645B (en) | Augmented reality system and operation method thereof | |
US20170147177A1 (en) | Method of transmitting information via a video channel between two terminals |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
A201 | Request for examination | ||
E902 | Notification of reason for refusal | ||
E601 | Decision to refuse application |