KR20090000777A - Augmented reality system using tangible object and method for providing augmented reality - Google Patents
Augmented reality system using tangible object and method for providing augmented reality
- Publication number
- KR20090000777A
- Authority
- KR
- South Korea
- Prior art keywords
- tracking information
- image
- sensory object
- augmented reality
- color
- Prior art date
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06Q—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
- G06Q50/00—Information and communication technology [ICT] specially adapted for implementation of business processes of specific business sectors, e.g. utilities or tourism
- G06Q50/10—Services
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T15/00—3D [Three Dimensional] image rendering
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/10—Segmentation; Edge detection
- G06T7/174—Segmentation; Edge detection involving the use of two or more images
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N1/00—Scanning, transmission or reproduction of documents or the like, e.g. facsimile transmission; Details thereof
- H04N1/00127—Connection or combination of a still picture apparatus with another apparatus, e.g. for storage, processing or transmission of still picture signals or of information associated with a still picture
- H04N1/00249—Connection or combination of a still picture apparatus with another apparatus, e.g. for storage, processing or transmission of still picture signals or of information associated with a still picture with a photographic apparatus, e.g. a photographic printer or a projector
Landscapes
- Engineering & Computer Science (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Theoretical Computer Science (AREA)
- Business, Economics & Management (AREA)
- Tourism & Hospitality (AREA)
- Economics (AREA)
- Health & Medical Sciences (AREA)
- Computer Vision & Pattern Recognition (AREA)
- General Health & Medical Sciences (AREA)
- Human Resources & Organizations (AREA)
- Marketing (AREA)
- Primary Health Care (AREA)
- Strategic Management (AREA)
- General Business, Economics & Management (AREA)
- Multimedia (AREA)
- Signal Processing (AREA)
- Computer Graphics (AREA)
- Processing Or Creating Images (AREA)
Abstract
Description
FIG. 1 is a block diagram showing the configuration of an augmented reality system using a sensory object according to an embodiment of the present invention.
FIG. 2 is a block diagram illustrating a detailed configuration of an apparatus for providing augmented reality of the augmented reality system shown in FIG. 1.
FIG. 3 is a flowchart illustrating a process of providing augmented reality according to an embodiment of the present invention.
FIG. 4 is a flowchart illustrating in detail the process of extracting the two-dimensional tracking information of the sensory object shown in FIG. 3.
Explanation of symbols on the main parts of the drawings
110: 2D tracking information extractor 120: 3D tracking information extractor
130: tracking information matching unit 141: color correction module
143: coordinate transformation module 150: image synthesis unit
160: storage unit
The present invention relates to an augmented reality system, and more particularly, to an augmented reality system and an augmented reality providing method using a sensory object, which can track the sensory object located on a table in a table-based augmented reality system.
Augmented reality is a fusion of physical and electronic spaces: it augments the real world by mixing virtual images, such as computer graphics, with real-world images, adding virtual information to the user's perception.
A tangible object is a metaphor for the way a user interacts in a virtual environment. It provides the user with an intuitive interface for controlling digital information in the virtual environment, enabling easy interaction between the user and the virtual content.
Conventional table-based augmented reality systems include a method using a magnetic tracker and a method using an infrared camera at the bottom of the table.
The method of tracking an object with a magnetic tracker uses a source, in which a coil emitting an electromagnetic field is installed, and a plurality of sensors that sense the field; from the sensed field, the position and orientation of the source are calculated.
However, in this method the source and the sensors are connected by wired cables to the system that calculates the position and orientation of the source, so a general object cannot serve as the tracked object, and the cables make the system inconvenient to use.
In addition, the method of tracking an object with an infrared camera requires an infrared illuminator, a camera fitted with a visible-light filter, and an infrared reflector attached to the object, so the system configuration is complicated.
Accordingly, a first object of the present invention is to provide an augmented reality system using a sensory object that can track the sensory object with a simple configuration.
A second object of the present invention is to provide an augmented reality providing method that can accurately track the sensory object.
An augmented reality system using a sensory object according to an aspect of the present invention, provided to achieve the first object, is a table-based system including a projector for projecting a background image onto the bottom surface of a table, a first camera for providing a first image by photographing the bottom surface of the table, and a second camera for providing a second image by photographing the sensory object located above the table. The system further includes an augmented reality providing apparatus that corrects the color error between the original background image and the first image, extracts two-dimensional (2D) tracking information and three-dimensional (3D) tracking information of the sensory object based on the first image and the second image, and generates tracking information of the sensory object by matching the 2D tracking information with the 3D tracking information.
When the sensory object does not contact the surface of the table, the augmented reality providing apparatus may generate the tracking information of the sensory object based on the 3D tracking information extracted from the second image. The apparatus may also project eight colors, including red, green, and blue, onto the bottom surface of the table as original background images, and generate a color error table by matching their color values with the color values of the first images in which the first camera captures the projected originals.
The augmented reality providing apparatus may include: a color correction module configured to generate the color error table for correcting color errors between the original background image and the first image photographed by the first camera; a 2D tracking information extractor that recognizes, based on the first image, a first marker installed on the bottom of the sensory object and extracts the 2D tracking information, including at least one of the position and orientation of the sensory object located on the surface of the table; a 3D tracking information extractor that recognizes, based on the second image, a second marker installed on the upper surface of the sensory object and extracts the 3D tracking information, including at least one of the 3D position, orientation, and attitude of the sensory object; and a tracking information matching unit that generates the tracking information of the sensory object by matching the 2D tracking information with the 3D tracking information.
The 2D tracking information extractor may receive a first image including the image of the first marker, correct the color of the original background image, compare the corrected background image with the color values of the first image, estimate an area having a large error displacement of the color values as the area in which the sensory object is located, recognize the first marker in the estimated area, and extract the 2D tracking information of the sensory object based on the recognized first marker.
The tracking information matching unit may compare at least one item of pattern information, such as the shape and size of the first marker and the second marker, with a predefined reference pattern to determine the similarity of each pattern, and may generate the tracking information of the sensory object by assigning different weights to the 2D tracking information and the 3D tracking information according to the similarity and matching them.
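The occluded-region estimation described above can be sketched in a few lines. The following is a minimal illustration, not the patented implementation: it assumes the color-corrected background image and the captured first image are NumPy RGB arrays, and it flags pixels whose per-pixel color displacement exceeds a threshold. The function name and threshold value are invented for this sketch.

```python
import numpy as np

def estimate_object_region(corrected_background, first_image, threshold=30.0):
    """Estimate where the tangible object occludes the projected background.

    Pixels whose color differs strongly from the (color-corrected) original
    background image are assumed to be covered by the object. Returns a
    boolean mask and the bounding box (row0, col0, row1, col1) of the
    flagged region, or None for the box if no pixel exceeds the threshold.
    """
    # Per-pixel color displacement between expected and observed images.
    diff = np.linalg.norm(
        corrected_background.astype(float) - first_image.astype(float), axis=-1
    )
    mask = diff > threshold
    if not mask.any():
        return mask, None
    rows, cols = np.nonzero(mask)
    bbox = (int(rows.min()), int(cols.min()), int(rows.max()) + 1, int(cols.max()) + 1)
    return mask, bbox

# Toy example: a flat gray background with a dark "object" patch.
background = np.full((8, 8, 3), 128, dtype=np.uint8)
observed = background.copy()
observed[2:5, 3:6] = (10, 10, 10)   # the object occludes these pixels

mask, bbox = estimate_object_region(background, observed)
print(bbox)  # → (2, 3, 5, 6)
```

In the patent's pipeline the marker recognizer would then search only inside this estimated region, after the corresponding area of the projected background is set to a uniform color.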
An augmented reality providing method according to an aspect of the present invention, provided to achieve the second object, includes generating a color error table for correcting the color error between the original background image and the first image photographing the bottom surface of the table, extracting the 2D tracking information and the 3D tracking information of the sensory object, and generating the tracking information of the sensory object by matching the extracted 2D tracking information and 3D tracking information.
Generating the color error table may include projecting eight colors, including red, green, and blue, onto the bottom surface of the table as original background images, and matching their color values with the color values of the first images in which the first camera captures the projected originals.
Extracting the 2D tracking information and the 3D tracking information of the sensory object may include: receiving a first image including an image of the first marker installed on the bottom of the sensory object; correcting the original background image using the color error table; comparing the corrected background image with the color values of the first image and estimating an area having a large error displacement of the color values as the area in which the sensory object is located; setting the estimated area to a uniform color; and recognizing the first marker in the area set to the uniform color and extracting the 2D tracking information based on the recognized first marker.
Generating the tracking information of the sensory object by matching the extracted 2D tracking information and 3D tracking information may include comparing at least one item of pattern information, such as the shape and size of the first marker installed on the bottom of the sensory object and the second marker installed on the upper surface of the sensory object, with a predefined reference pattern to determine the similarity of each pattern, and matching the tracking information by assigning different weights to the 2D tracking information and the 3D tracking information according to the similarity. When the sensory object does not contact the surface of the table, the tracking information of the sensory object may be generated based on the extracted 3D tracking information.
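The similarity-weighted matching step can be illustrated with a toy sketch. The weighting scheme below (normalizing the two marker similarities into convex weights) is an assumption made for illustration; the patent does not specify the exact weighting formula, only that weights differ with pattern similarity.

```python
def match_tracking_info(track_2d, track_3d, sim_2d, sim_3d):
    """Fuse 2D and 3D position estimates of the tangible object.

    sim_2d / sim_3d are pattern similarities (0..1) of the bottom and top
    markers against their reference patterns; the estimate backed by the
    better-recognized marker receives the larger weight.
    """
    total = sim_2d + sim_3d
    if total == 0:                      # neither marker recognized
        return None
    w2, w3 = sim_2d / total, sim_3d / total
    return tuple(w2 * a + w3 * b for a, b in zip(track_2d, track_3d))

# Bottom marker recognized cleanly (0.9), top marker partly occluded (0.3):
fused = match_tracking_info((10.0, 20.0), (12.0, 24.0), 0.9, 0.3)
print(fused)  # weighted toward the 2D estimate, ≈ (10.5, 21.0)
```

When the object is lifted off the table, the bottom marker disappears from the first image (sim_2d = 0), and the fused result degenerates to the 3D estimate alone, matching the behavior described above.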
As the present invention allows for various changes and numerous embodiments, particular embodiments will be illustrated in the drawings and described in detail in the written description.
However, this is not intended to limit the present invention to the specific embodiments; the invention should be understood to include all modifications, equivalents, and substitutes falling within its spirit and scope.
Terms such as "first" and "second" may be used to describe various components, but the components should not be limited by these terms; the terms serve only to distinguish one component from another. For example, without departing from the scope of the present invention, a first component may be referred to as a second component, and similarly, a second component may be referred to as a first component. The term "and/or" includes any combination of a plurality of related items, or any single item among them.
Hereinafter, exemplary embodiments of the present invention will be described in detail with reference to the accompanying drawings. Hereinafter, the same reference numerals are used for the same components in the drawings, and duplicate descriptions of the same components are omitted.
FIG. 1 is a block diagram showing the configuration of an augmented reality system using a sensory object according to an embodiment of the present invention.
Referring to FIG. 1, an augmented reality system using a sensory object according to an embodiment of the present invention includes a
The
The
The
As the
The table 40 is formed of a translucent material so that the background image projected from the
The
The
In addition, the
The
The
The augmented
In addition, the
The augmented
When the
In the augmented reality system according to an embodiment of the present invention shown in FIG. 1, the background image projected by the
In addition, in the exemplary embodiment of the present invention illustrated in FIG. 1, the
FIG. 2 is a block diagram illustrating a detailed configuration of an apparatus for providing augmented reality of the augmented reality system shown in FIG. 1.
Referring to FIG. 2, the augmented
The two-dimensional
Specifically, the 2D tracking
Then, the area corresponding to the estimated area of the original background image is set to a uniform color and is projected on the bottom surface of the table 40, and the projected background image is installed on the bottom of the
The 3D
The tracking
Here, the tracking
For example, the tracking
In addition, the tracking
The
The color error described above arises because the color is distorted by the mechanical and/or optical characteristics of the
For example, when a background image composed of red (R (Red) = 255, G (Green) = 0, B (Blue) = 0) is projected through the
Accordingly, the
The color error table projects a background image having a predetermined color value to the entire area of the table 40 and captures the projected background image through the
For example, in generating the color error table, red (R = 255, G = 0, B = 0) is projected as the original background image, and the color value of the captured first image is recorded against the color value of the original. The process is then repeated with green (R = 0, G = 255, B = 0) and blue (R = 0, G = 0, B = 255) as the original background image, and again for all eight combinations obtained by substituting 0 or 255 for each of the R, G, and B color elements of the original background image.
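The calibration loop above can be sketched as follows. The `project_and_capture` callback is a stand-in for the real projector/camera round trip, which the patent performs with hardware; here a simulated round trip that dims every channel by about 10% substitutes for it.

```python
from itertools import product

def build_color_error_table(project_and_capture):
    """Build a table mapping each of the eight calibration colors
    (every combination of 0/255 per R, G, B channel) to the color the
    camera actually observes when that color is projected on the table.
    """
    table = {}
    for rgb in product((0, 255), repeat=3):       # 8 calibration colors
        observed = project_and_capture(rgb)       # projector -> table -> camera
        table[rgb] = observed                     # match original to observed
    return table

# Simulated round trip: the projector/camera chain dims every channel ~10%.
simulate = lambda rgb: tuple(int(c * 0.9) for c in rgb)
table = build_color_error_table(simulate)
print(table[(255, 0, 0)])  # pure red observed as (229, 0, 0)
```

With such a table, the original background image can be mapped to its expected observed colors before being compared against the first image, so projector/camera distortion is no longer mistaken for an occluding object.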
The coordinate
The
The
In addition, the
The
The
FIG. 3 is a flowchart illustrating a process of providing augmented reality according to an embodiment of the present invention.
Referring to FIG. 3, in the method of providing augmented reality according to an embodiment of the present invention, first, when an augmented reality providing system is initialized, the
The color error correction may be performed before the augmented reality is performed, and the generated color error table may be stored in the
Then, when augmented reality is started, the two-dimensional
The tracking
The tracking
If it is determined in
The coordinate
The
The
Thereafter, the
FIG. 4 is a flowchart illustrating in detail the process of extracting the two-dimensional tracking information of the sensory object shown in FIG. 3.
First, the 2D tracking
Thereafter, the 2D tracking
Then, the area in which the
Thereafter, the 2D tracking
In
If it is determined in
According to the augmented reality system and augmented reality providing method described above, the color error between the original image projected on the table and the first image capturing the projected background is first corrected, and the 2D and 3D tracking information of the sensory object located on the upper surface of the translucent table is extracted from the first and second images, respectively. After the matched tracking information is converted into coordinates of the virtual space, a predetermined image serving as a metaphor for the sensory object is synthesized with the background image and displayed on the display device.
Therefore, two general-purpose cameras simultaneously capture the sensory object located on top of the translucent table, and the object is tracked by matching the tracking information extracted from each captured image. Because only general-purpose cameras are used, an augmented reality system with a simple configuration can be constructed.
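The conversion of the matched tracking information into virtual-space coordinates is not detailed in this text; as a deliberately simple stand-in for that step, a proportional mapping between the table image and the virtual space might look like:

```python
def table_to_virtual(pos, table_size, virtual_size):
    """Map a 2D position on the table surface (e.g., in pixels of the
    first image) into virtual-space coordinates by proportional scaling.
    The actual transform is not specified here; this is an illustrative
    assumption, ignoring rotation and lens distortion.
    """
    (tw, th), (vw, vh) = table_size, virtual_size
    x, y = pos
    return (x * vw / tw, y * vh / th)

# A marker found at (320, 240) in a 640x480 table image, mapped into a
# 100x100 virtual coordinate system:
print(table_to_virtual((320.0, 240.0), (640, 480), (100, 100)))  # → (50.0, 50.0)
```

A full implementation would apply the coordinate transformation module's calibration (and any perspective correction) before synthesizing the object's metaphor image into the background at the mapped position.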
Although the present invention has been described with reference to the embodiments above, those skilled in the art will understand that it may be variously modified and changed without departing from the spirit and scope of the invention as set forth in the claims below.
Claims (11)
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
KR1020070032538A KR20090000777A (en) | 2007-04-02 | 2007-04-02 | Augmented reality system using tangible object and method for providing augmented reality |
Publications (1)
Publication Number | Publication Date |
---|---|
KR20090000777A true KR20090000777A (en) | 2009-01-08 |
Family
ID=40483914
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
KR1020070032538A KR20090000777A (en) | 2007-04-02 | 2007-04-02 | Augmented reality system using tangible object and method for providing augmented reality |
Country Status (1)
Country | Link |
---|---|
KR (1) | KR20090000777A (en) |
- 2007-04-02: KR application KR1020070032538A published as KR20090000777A — not active (application discontinued)
Cited By (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
KR20110055868A (en) * | 2009-11-20 | 2011-05-26 | 엘지전자 주식회사 | Mobile terminal, digital television and method of controlling dtv |
US8988464B2 (en) | 2010-12-14 | 2015-03-24 | Samsung Electronics Co., Ltd. | System and method for multi-layered augmented reality |
WO2015187309A1 (en) * | 2014-06-03 | 2015-12-10 | Intel Corporation | Projecting a virtual image at a physical surface |
US9972131B2 (en) | 2014-06-03 | 2018-05-15 | Intel Corporation | Projecting a virtual image at a physical surface |
KR101496761B1 (en) * | 2014-08-25 | 2015-02-27 | 연세대학교 산학협력단 | 3D Model Control System and Method Based on multi-screen Projection |
KR101641672B1 (en) * | 2015-07-21 | 2016-07-21 | (주)디자인비아트 | The system for Augmented Reality of architecture model tracing using mobile terminal |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
A201 | Request for examination | ||
E902 | Notification of reason for refusal | ||
E601 | Decision to refuse application |