KR20110072108A - Vision based augmented reality system using mobile terminal having tag - Google Patents

Vision based augmented reality system using mobile terminal having tag

Info

Publication number
KR20110072108A
Authority
KR
South Korea
Prior art keywords
mobile terminal
tag
camera
augmented reality
image
Prior art date
Application number
KR1020090128910A
Other languages
Korean (ko)
Inventor
임상민
Original Assignee
(주)포스트미디어
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by (주)포스트미디어 filed Critical (주)포스트미디어
Priority to KR1020090128910A priority Critical patent/KR20110072108A/en
Publication of KR20110072108A publication Critical patent/KR20110072108A/en

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 19/00 Manipulating 3D models or images for computer graphics
    • G06T 19/006 Mixed reality
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00 Image analysis
    • G06T 7/70 Determining position or orientation of objects or cameras
    • G06T 7/73 Determining position or orientation of objects or cameras using feature-based methods
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 5/00 Details of television systems
    • H04N 5/30 Transforming light or analogous information into electric information
    • H04N 5/33 Transforming infrared radiation
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04W WIRELESS COMMUNICATION NETWORKS
    • H04W 88/00 Devices specially adapted for wireless communication networks, e.g. terminals, base stations or access point devices
    • H04W 88/02 Terminal devices
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 2207/00 Indexing scheme for image analysis or image enhancement
    • G06T 2207/10 Image acquisition modality
    • G06T 2207/10048 Infrared image

Abstract

The present invention relates to a vision-based augmented reality system using a mobile terminal having a tag. The system comprises: at least one mobile terminal on which software capable of implementing augmented reality is installed, having a first camera and a tag attached to its outer surface; a second camera that photographs the mobile terminal with the tag attached; a computer, connected to the second camera, that receives the image of the tag photographed within the shooting zone of the second camera, analyzes the input tag image to calculate the relative position of the mobile terminal and the second camera, and stores the resulting location information of the mobile terminal; and a communication means enabling wired and wireless communication between the mobile terminal and the computer so that the stored location information can be transmitted to the terminal. The mobile terminal extracts a virtual image stored in the terminal based on the location information received through the communication means, renders the extracted virtual image using that location information, and outputs through its screen a new image generated by synthesizing the rendered virtual image with the real image captured by the first camera. According to the present invention, augmented reality is implemented with a tag-attached mobile terminal in a specific augmented reality experience zone (near a DID, near exhibits, etc.) in a way that keeps the tag invisible to the user, while supporting augmented reality as stably as a tag-based method.

Augmented Reality, Infrared, Camera, Tag, Mobile

Description

Vision Based Augmented Reality System Using Mobile Terminal Having Tag

The present invention relates to a system for implementing augmented reality, and in particular to a vision-based augmented reality system in which a tag is attached to a user's mobile terminal and augmented reality is output on that terminal whenever it lies within the shooting range of a camera capable of recording the tag.

Augmented Reality is a modified form of Virtual Reality that aims to aid users' perception of, and interaction with, reality by adding virtual objects to real environments.

Such augmented reality has been applied to a variety of medical, industrial, entertainment and military fields.

A general augmented reality system can be designed as either optical-based or video-based. An optical-based system lets the user see the real environment directly through the translucent screen of a head-mounted display (HMD) and presents information by projecting virtual objects onto that screen.

In contrast, an image-based system obtains an image from a camera and then synthesizes virtual information into the image.

At this time, a tag is used to classify information, and the tag is also called a context or a visual marker.

Vision-based augmented reality mainly uses one of two methods: recognizing a specific tag, or recognizing location by extracting feature points from an image.

Existing vision-based mobile augmented reality has likewise either placed a tag at the position to be augmented or relied on feature points.

In both methods, the video from the mobile device is analyzed and a 3D object is drawn at the corresponding position to augment the scene.

Each tag has a unique ID that is arbitrarily assigned according to its type. Data such as text, graphics, and sound are stored on a computer against this ID, and the stored data are output when the tag is recognized.
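The ID-to-content table described above can be sketched minimally as follows; the tag IDs and content names here are invented for illustration, since the patent does not specify a storage format.

```python
# Hypothetical registry mapping a tag's unique ID to the data the
# computer stores for it (text, graphics, sound).
TAG_CONTENT = {
    1: {"text": "Welcome", "graphic": "house.obj", "sound": "intro.wav"},
    2: {"text": "Exhibit A", "graphic": "vase.obj", "sound": None},
}

def lookup(tag_id):
    """Return the data stored for a recognised tag ID, or None if the
    tag is unknown to the system."""
    return TAG_CONTENT.get(tag_id)
```

Recognizing a tag then reduces to extracting its ID from the camera image and performing this lookup before output.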

Such tags are recognized as different IDs according to their orientation and shape. Conventional tag schemes include the ARToolKit marker system and the HOM, IGD, and SCR marker systems.

In more detail, FIG. 1 conceptually shows the configuration of a conventional augmented reality system. A tag 2 is attached to a real board 1, and image information 2' corresponding to the ID of the tag 2 is stored in a computer 4, a tablet computer 5, or a PDA 6.

The computer 4, tablet computer 5, or PDA 6 is provided with the calculation program necessary to implement augmented reality.

In this state, the real board 1 and the tag 2 attached to it are photographed through a camera 3, and the image is transmitted to the computer 4, tablet computer 5, or PDA 6. That device determines the unique ID of the tag 2 in the image captured by the camera 3, reads the image information 2' corresponding to the determined ID, and processes the information necessary for augmented reality. As shown in FIG. 1(b), it then outputs on a screen 7 a composite image in which the virtual house 2' appears to stand on the real board 1.

A tag used in such an augmented reality system combines pictures, characters, or figures so that it can be recognized reliably in the image captured by the camera, and it is therefore visible to the naked eye.

Because the tag is visible to the naked eye, it raises aesthetic problems and restricts the places and positions where it can be used.

In short, the tag-based method has installation and aesthetic problems, while the feature-point method, although it shows no tag to the user, slows down markedly as the number of images to be recognized increases.

An object of the present invention, devised to solve the above problems, is to provide a vision-based augmented reality system that implements augmented reality with a tag-attached mobile terminal in a specific augmented reality experience zone (near a DID, near an exhibit, etc.) without showing the tag to the user, while supporting augmented reality as stably as a tag-based method.

The present invention devised to achieve the above object comprises: at least one mobile terminal on which software for implementing augmented reality is installed, having a first camera and a tag attached to its outer surface; a second camera that photographs the mobile terminal with the tag attached to its outer surface; a computer, connected to the second camera, that receives an image of the tag photographed within the shooting zone of the second camera, analyzes the input tag image to calculate the relative position of the mobile terminal and the second camera, and stores the resulting location information of the mobile terminal; and a communication means enabling wired and wireless communication between the mobile terminal and the computer so that the location information stored in the computer can be transmitted to the terminal. The mobile terminal extracts a virtual image stored in the terminal based on the location information received through the communication means, renders the extracted virtual image using that location information, and outputs through its screen a new image generated by synthesizing the rendered virtual image with the real image captured by the first camera.

The mobile terminal may further include an infrared reflecting film in the shape of the tag image attached to the tag surface, in which case the second camera is an infrared camera that photographs the infrared reflecting film.

The tag attached to an outer surface of the mobile terminal is an infrared tag represented by an infrared reflecting material, and the second camera is an infrared camera that photographs the infrared tag.

The mobile terminal may further comprise an infrared transmission plate that passes light in the infrared region so that the tag can be recognized.

The communication means is any one or more of RS232C, IEEE 1394, High-Definition Multimedia Interface (HDMI), Ultra Wide-Band (UWB), Bluetooth, Infrared Data Association (IrDA), ZigBee, Near Field Communication (NFC), WiFi, and RF.

The present invention implements augmented reality with a tag-attached mobile terminal in a specific augmented reality experience zone (near a DID, near exhibits, etc.) without showing the tag to the user, and has the effect of supporting augmented reality as stably as a tag-based method.

Hereinafter, a preferred embodiment of the vision-based augmented reality system using a mobile terminal having a tag according to the present invention will be described in detail with reference to the accompanying drawings.

FIGS. 2 to 4 are conceptual diagrams showing a vision-based augmented reality system using a mobile terminal having a tag according to an embodiment of the present invention. Referring to FIGS. 2 to 4, the system comprises a mobile terminal 100, a second camera 200, a computer 300, and a communication means (not shown), which are described in detail as follows.

At least one mobile terminal 100 may be provided. Software for implementing augmented reality is installed on it, it is provided with a first camera 110 (a general visible-light camera), and a tag (not shown) is attached to its outer surface.

The second camera 200 photographs the mobile terminal 100 having the tag attached to its outer surface. The second camera 200 is preferably installed near a digital information display (DID) or an exhibit.

The computer 300 is connected to the second camera 200 and receives the image of the tag attached to the outer surface of the mobile terminal 100 photographed within the shooting zone of the second camera 200.

It analyzes the input tag image to calculate the relative position of the mobile terminal 100 and the second camera 200, and stores the resulting position information of the mobile terminal 100.
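One way the relative-position calculation could work, as a hedged sketch under a pinhole-camera assumption the patent does not state: if the tag's physical size is known in advance, its apparent size and location in the second camera's image yield the terminal's depth and lateral offset.

```python
# Assumed pinhole-camera geometry; the focal length, tag size, and
# pixel measurements below are illustrative, not from the patent.

def tag_distance(focal_px, tag_width_m, tag_width_px):
    """Depth along the optical axis by similar triangles: Z = f * W / w."""
    return focal_px * tag_width_m / tag_width_px

def tag_offset(focal_px, center_px, principal_px, depth_m):
    """Back-project the tag's image centre to lateral X/Y offsets
    (in metres) at the estimated depth."""
    u, v = center_px
    cx, cy = principal_px
    return ((u - cx) * depth_m / focal_px, (v - cy) * depth_m / focal_px)
```

A full implementation would also recover orientation from the tag's corner positions, but the depth-plus-offset estimate above already gives the 3D position the computer stores.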

Here the shooting zone refers to the maximum field of view of the second camera 200, that is, its maximum shooting range. This zone is called the augmented reality experience zone: augmented reality can be experienced only where the second camera 200 can photograph and recognize the tag.
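The experience-zone condition can be sketched as a membership test; the 60-degree field of view and 5-metre recognition range below are assumed values for illustration, not figures from the patent.

```python
import math

def in_experience_zone(point, fov_deg=60.0, max_range_m=5.0):
    """point is (x, y, z) in the second camera's coordinates, with +z
    along the optical axis. Returns True when the point lies inside the
    camera's viewing cone and close enough for the tag to be recognised."""
    x, y, z = point
    if z <= 0:                                   # behind the camera
        return False
    if math.sqrt(x*x + y*y + z*z) > max_range_m:  # too far to recognise
        return False
    off_axis = math.degrees(math.atan2(math.hypot(x, y), z))
    return off_axis <= fov_deg / 2.0
```

Only terminals for which this test holds can have their tag photographed, and hence only they receive location updates.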

The communication means enables wired and wireless communication between the mobile terminal 100 and the computer 300 so that the location information of the mobile terminal 100 stored in the computer 300 can be transmitted to the mobile terminal 100.

In this case, the mobile terminal 100 extracts a virtual image stored in the mobile terminal 100 based on the location information of the mobile terminal 100 received through the communication means.

The communication means may be any one or more of RS232C, IEEE 1394, High-Definition Multimedia Interface (HDMI), Ultra Wide-Band (UWB), Bluetooth, Infrared Data Association (IrDA), ZigBee, Near Field Communication (NFC), WiFi, and RF.
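The patent names the links but not the payload, so the wire format below is an assumption for illustration: a compact JSON pose message that any of the listed wired or wireless links could carry as bytes.

```python
import json

def encode_pose(terminal_id, position, yaw_deg):
    """Serialise a terminal's computed pose (run on the computer)."""
    return json.dumps(
        {"id": terminal_id, "pos": list(position), "yaw": yaw_deg}
    ).encode("utf-8")

def decode_pose(payload):
    """Inverse of encode_pose (run on the mobile terminal)."""
    msg = json.loads(payload.decode("utf-8"))
    return msg["id"], tuple(msg["pos"]), msg["yaw"]
```

The terminal ID lets one computer serve several tagged terminals in the same experience zone, each receiving only its own pose.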

The mobile terminal then renders the extracted virtual image using its location information, and outputs through its screen the new image generated by synthesizing the rendered virtual image with the real image captured by the first camera 110.
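The synthesis step is not specified in the patent; a minimal sketch is per-pixel alpha blending of the rendered virtual layer over the real frame from the first camera.

```python
import numpy as np

def composite(real, virtual, alpha):
    """real, virtual: HxWx3 float image arrays; alpha: HxW opacity of
    the virtual layer in [0, 1]. Where alpha is 0 the real camera
    image shows through untouched; where it is 1 only the rendered
    virtual image is visible."""
    a = alpha[:, :, np.newaxis]      # broadcast opacity over RGB channels
    return a * virtual + (1.0 - a) * real
```

The renderer would produce `virtual` and `alpha` from the 3D model positioned by the received pose; `real` is simply the latest first-camera frame.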

That is, the 3D position of the mobile terminal 100 is calculated by the computer 300 to which the second camera 200 is connected. Each mobile terminal 100 carries a tag, and each tag is used to calculate the direction and position of that terminal.

The recognized direction and location of the mobile terminal 100 are sent to the terminal, which can then use the location information to augment data in its own camera image.

Meanwhile, the mobile terminal 100 may further include an infrared reflecting film in the shape of the tag image attached to the tag surface, and the second camera 200 may be an infrared camera photographing the infrared reflecting film.

The tag attached to the outer surface of the mobile terminal 100 may be an infrared tag represented by an infrared reflecting material, and the second camera 200 may be an infrared camera photographing the infrared tag.

The mobile terminal 100 may further include an infrared transmission plate that passes light in the infrared region so that the tag can be recognized.

In other words, the tag attached to the outer surface of the mobile terminal 100 is an infrared reflecting film or an infrared reflecting material so that it is not visible to the user's eyes, and the second camera 200 that recognizes it can be an infrared camera.

While the present invention has been described above based on preferred embodiments, its technical idea is not limited thereto. It will be apparent to those skilled in the art that modifications and changes can be made within the scope of the claims, and such modifications and variations belong to the appended claims.

The drawings attached to this specification illustrate preferred embodiments of the present invention and, together with the detailed description, serve to further the understanding of its technical spirit; the present invention should therefore not be interpreted as limited to the matters shown in the drawings.

FIG. 1 conceptually illustrates the system configuration of a conventional augmented reality system;

FIGS. 2 to 4 are conceptual diagrams showing a vision-based augmented reality system using a mobile terminal having a tag according to an embodiment of the present invention.

<Description of the symbols for the main parts of the drawings>

100: mobile terminal 110: the first camera

200: second camera 300: computer

Claims (5)

1. A vision-based augmented reality system using a mobile terminal having a tag, comprising: at least one mobile terminal on which software for implementing augmented reality is installed, having a first camera and a tag attached to an outer surface thereof; a second camera photographing the mobile terminal having the tag attached to the outer surface thereof; a computer, connected to the second camera, which receives an image of the tag photographed within the shooting zone of the second camera, analyzes the input tag image to calculate the relative position of the mobile terminal and the second camera, and stores location information of the mobile terminal; and a communication means enabling wired and wireless communication between the mobile terminal and the computer so that the location information stored in the computer is transmitted to the mobile terminal, wherein the mobile terminal extracts a virtual image stored in the mobile terminal based on the location information received through the communication means, renders the extracted virtual image using the location information, and outputs through the screen of the mobile terminal a new image generated by synthesizing the rendered virtual image with the real image captured by the first camera.

2. The system of claim 1, wherein the mobile terminal further includes an infrared reflecting film in the shape of the tag image attached to the tag surface, and the second camera is an infrared camera that photographs the infrared reflecting film.

3. The system of claim 1, wherein the tag attached to the outer surface of the mobile terminal is an infrared tag formed of an infrared reflecting material, and the second camera is an infrared camera that photographs the infrared tag.

4. The system of claim 2 or 3, wherein the mobile terminal further comprises an infrared transmission plate that passes light in the infrared region so that the tag can be recognized.

5. The system of claim 1, wherein the communication means is any one or more of RS232C, IEEE 1394, High-Definition Multimedia Interface (HDMI), Ultra Wide-Band (UWB), Bluetooth, Infrared Data Association (IrDA), ZigBee, Near Field Communication (NFC), WiFi, and RF.
KR1020090128910A 2009-12-22 2009-12-22 Vision based augmented reality system using mobile terminal having tag KR20110072108A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
KR1020090128910A KR20110072108A (en) 2009-12-22 2009-12-22 Vision based augmented reality system using mobile terminal having tag

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
KR1020090128910A KR20110072108A (en) 2009-12-22 2009-12-22 Vision based augmented reality system using mobile terminal having tag

Publications (1)

Publication Number Publication Date
KR20110072108A true KR20110072108A (en) 2011-06-29

Family

ID=44403079

Family Applications (1)

Application Number Title Priority Date Filing Date
KR1020090128910A KR20110072108A (en) 2009-12-22 2009-12-22 Vision based augmented reality system using mobile terminal having tag

Country Status (1)

Country Link
KR (1) KR20110072108A (en)

Cited By (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR101342932B1 (en) * 2011-10-13 2014-01-02 (주)엔에이에스씨 Apparatus for guiding
KR101498636B1 (en) * 2013-10-22 2015-03-04 김시원 Smart pad
CN104463055A (en) * 2014-12-29 2015-03-25 重庆甲虫网络科技有限公司 Augmented reality system based on wireless smart label and control method of augmented reality system
WO2018045722A1 (en) * 2016-09-06 2018-03-15 中兴通讯股份有限公司 Terminal control method and apparatus
CN107809755A (en) * 2016-09-06 2018-03-16 中兴通讯股份有限公司 Terminal control method and device

Similar Documents

Publication Publication Date Title
EP3039476B1 (en) Head mounted display device and method for controlling the same
US9076033B1 (en) Hand-triggered head-mounted photography
JP5967839B2 (en) Display device using wearable glasses and operating method thereof
KR101239284B1 (en) Control terminal and server for managing target devices using Augmented Reality Contents
US9298970B2 (en) Method and apparatus for facilitating interaction with an object viewable via a display
TWI620098B (en) Head mounted device and guiding method
CN111095364A (en) Information processing apparatus, information processing method, and program
KR20160015972A (en) The Apparatus and Method for Wearable Device
CN112614057A (en) Image blurring processing method and electronic equipment
CN113545030B (en) Method, user equipment and system for automatically generating full-focus image through mobile camera
US11709370B2 (en) Presentation of an enriched view of a physical setting
JP4303087B2 (en) Data signal transmission method and reception method and apparatus, system, program, and recording medium
CN108027707A (en) Subscriber terminal equipment, electronic equipment and the method for control subscriber terminal equipment and electronic equipment
US20150172550A1 (en) Display tiling for enhanced view modes
KR20110072108A (en) Vision based augmented reality system using mobile terminal having tag
TWM482797U (en) Augmented-reality system capable of displaying three-dimensional image
EP3038061A1 (en) Apparatus and method to display augmented reality data
KR101767220B1 (en) System and method for processing hand gesture commands using a smart glass
CN104239877A (en) Image processing method and image acquisition device
KR20170002921A (en) Apparatus and method for creating digital building instruction
US11624924B2 (en) Image capturing system including head-mount type display device, and display device and method of controlling the same
US20230035360A1 (en) Mapping networked devices
US20230400978A1 (en) Eyewear device user interface
TWI784645B (en) Augmented reality system and operation method thereof
US20170147177A1 (en) Method of transmitting information via a video channel between two terminals

Legal Events

Date Code Title Description
A201 Request for examination
E902 Notification of reason for refusal
E601 Decision to refuse application