KR20100138193A - The augmented reality content providing system and equipment for the user interaction based on touchscreen - Google Patents
- Publication number
- KR20100138193A KR1020090056608A KR20090056608A
- Authority
- KR
- South Korea
- Prior art keywords
- user
- augmented reality
- content
- marker
- touch screen
- Prior art date
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/03—Arrangements for converting the position or the displacement of a member into a coded form
- G06F3/041—Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0481—Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0484—Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0487—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
- G06F3/0488—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/14—Digital output to display device ; Cooperation and interconnection of the display device with other functional units
Landscapes
- Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- General Engineering & Computer Science (AREA)
- Human Computer Interaction (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Processing Or Creating Images (AREA)
Abstract
The present invention relates to augmented reality technology, and more particularly to a system and device for providing augmented reality content that supports touch-screen-based user interaction, allowing the user to manipulate virtual images such as 3D content, video, audio, and still images through a touch screen.
The apparatus and method of the present invention for providing realistic content comprise an image input unit for receiving real-world images, a touch input unit through which the user interacts, AR software that receives these two data streams and generates the augmented reality content, and a display system that presents the result to the user.
Description
The present invention relates to augmented reality technology, which derives from virtual reality, and in particular to a system and device for providing augmented reality content that supports touch-screen-based user interaction, allowing users to manipulate virtual images such as 3D content, video, audio, and still images through the touch screen.
Augmented reality is a term generally derived from virtual environments and virtual reality, and refers to mixing real-world and virtual images by inserting computer-graphics imagery into the real environment. Real-world imagery may contain information the user does not need, and may sometimes lack information the user does need. A computer-generated virtual environment, by contrast, can simplify or hide unneeded information. In other words, an augmented reality system combines the real world and the virtual world so that the user can interact with both in real time.
The simplest applications of augmented reality are in entertainment and news. In a television weather forecast, the weather map behind the forecaster appears to change naturally. In reality, the forecaster stands in front of a blue screen, and computer-generated virtual images create a virtual studio environment; this composite is augmented reality.
In addition, 3D data of a patient can be acquired through indirect examinations such as magnetic resonance imaging, computed tomography, and ultrasound; the acquired data is then rendered in real time and overlaid on the patient, allowing the doctor to perform the operation more smoothly. Virtual images can also be displayed on a pilot's helmet visor or on the cockpit windshield to present much of the information the pilot needs in flight.
Such augmented reality is generally implemented in one of two ways.

First, feature points collected from objects in the real world are used for shape matching, and the virtual object is overlaid on top of the matched shape.

Second, a pre-made marker is inserted into the scene while the image of the real world is being captured.
A previously published paper, "Immersive Authoring of Tangible Augmented Reality Applications" (Gun A. Lee, Claudia Nelles, Mark Billinghurst, and Gerard Jounghyun Kim, Proceedings of the Third IEEE and ACM International Symposium on Mixed and Augmented Reality, 2004, pp. 172-181), uses geometric markers and digital channel-coding technology. Like existing systems, it is based on a vision recognition algorithm: it finds the four vertices of the marker and extracts the RT matrix, and to distinguish marker IDs it divides the inside of the marker into 36 areas, allocating 36 bits, of which 10 bits distinguish the marker ID and 26 bits are assigned to a CRC (Cyclic Redundancy Check) code, a kind of channel coding, to implement the marker-based augmented reality system.
The former approach is the direction in which the augmented reality field is heading, but when few feature points can be collected from the object, or when environmental conditions such as lighting are unstable, its performance degrades severely or it fails to operate altogether.

The latter, marker-based approach offers stable performance by comparison, but the marker must be produced artificially and presented to the image input unit without deformation. Producing custom markers is cumbersome, and the ways they can be distributed to users are limited and inconvenient to use. Because of these drawbacks, augmented reality programs are difficult to produce and commercialization of products is delayed.
An object of the present invention is to let the user control augmented reality content precisely and freely, improving accessibility to the content by adding a tactile dimension to the visual one.
Another object of the present invention, addressing the problems above, is to replace the marker and compensate for its shortcomings by means of the touch screen, providing a system and method with which the user can easily control the interactions and phenomena that occur between the user and the virtual content.
With the augmented reality content providing system and method supporting touch-screen-based user interaction according to the present invention, users can themselves apply various changes to the imagery of the content inside the system. Adding the tactile sense to visual augmentation can dramatically expand real-world applications and saves the effort and cost of creating new markers whenever the content changes.
This technology can promote the sale of electronic products and contribute to the vitalization of the electronic-parts industry, and by linking the currently online e-learning industry with existing offline industries it can help the online and offline sectors grow together. In addition, multi-touch technology, by recognizing multiple points simultaneously, lets users communicate with the system accurately and specifically. It can contribute to the development of simulations for educational and entertainment applications such as promotion, electronic publishing, e-learning, and multimedia video, and can be applied widely in defense fields such as flight training, as well as in medical surgery, exhibitions, advertising, broadcasting, and theme parks. This helps overcome the limits of social acceptance of mixed reality technology and suggests mixed-reality application services that ordinary users can employ in everyday life.
To achieve the above objects, the augmented reality content providing system of the present invention, which supports touch-screen-based user interaction (substituting for the marker), comprises an image input unit that receives images of the real world, a touch input unit through which the user interacts, AR software that receives these two data streams and generates the augmented reality content, and a display system that presents the result.
DETAILED DESCRIPTION Hereinafter, exemplary embodiments of the present invention will be described in detail with reference to the accompanying drawings.
The touch screen used in the touch input unit of the present invention can execute marker input and object control simultaneously on a multi-touch screen built with any of the techniques of FIGS. 1A, 1B, 1C, and 1D. With this input method, a stable marker input can be received without being affected by changes in the surrounding environment.
An existing marker is the reference for positioning virtual content in an MR (Mixed Reality) application that composites virtual content onto a real image; typically it is a printed square. The marker is detected through the camera to show augmented reality content or to perform defined operations (move, delete, enlarge, help, and so on).
Accordingly, the present invention provides two methods: matching a marker the user draws on the touch screen against a marker defined in the AR software, and letting the user define a marker of any desired shape and then binding content to it. In the first method, as shown in FIG. 2A, the user draws directly on the touch screen in place of the printed square marker. As shown in FIG. 3, coordinate values are extracted from the touch-screen input, the average of all coordinate values is computed, the stroke is converted into a preliminary context to form a rectangular box based on its maximum width and maximum height, and finally the four vertices and the RT matrix are extracted.
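The centroid-and-bounding-box step described above can be sketched as follows. This is a minimal illustration only, not the patent's implementation: touch input is assumed to arrive as (x, y) pixel coordinates, and the RT-matrix extraction, which would be done by the AR software, is omitted.

```python
def stroke_to_marker_box(points):
    """Reduce a touch stroke to a centroid and an axis-aligned
    rectangular box whose four vertices can stand in for the
    corners of a printed square marker."""
    if not points:
        raise ValueError("empty stroke")
    xs = [p[0] for p in points]
    ys = [p[1] for p in points]
    # Average of all coordinate values (the stroke centroid).
    cx = sum(xs) / len(xs)
    cy = sum(ys) / len(ys)
    # Rectangular box based on the maximum width and maximum height.
    half_w = (max(xs) - min(xs)) / 2.0
    half_h = (max(ys) - min(ys)) / 2.0
    vertices = [
        (cx - half_w, cy - half_h),  # top-left
        (cx + half_w, cy - half_h),  # top-right
        (cx + half_w, cy + half_h),  # bottom-right
        (cx - half_w, cy + half_h),  # bottom-left
    ]
    return (cx, cy), vertices
```

The four returned vertices are what a marker-based pipeline would feed into its pose (RT-matrix) estimation in place of the detected corners of a printed marker.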
For ID identification of the markers defined inside the AR software, the inside of the box is divided into 36 areas and 36 bits are allocated: 10 bits identify the ID of the user's touch-screen input (the marker replacement), and 26 bits are assigned to a CRC (Cyclic Redundancy Check) code, a kind of channel coding. The internal marker and the user's touch-screen input (marker replacement) are then compared by shape matching and mapped so that the content becomes available.
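The 36-bit layout above (10 ID bits plus 26 CRC check bits) can be sketched as follows. The patent does not specify the generator polynomial, so `GEN` below is an arbitrary degree-26 polynomial chosen purely for illustration; all function names are likewise hypothetical.

```python
GEN = (1 << 26) | 0x3  # hypothetical generator polynomial x^26 + x + 1

def crc26(ident):
    """Compute a 26-bit CRC of a 10-bit marker ID by polynomial
    division over GF(2)."""
    reg = ident << 26              # append 26 zero check bits
    for bit in range(35, 25, -1):  # divide out the 10 message bits
        if reg & (1 << bit):
            reg ^= GEN << (bit - 26)
    return reg & ((1 << 26) - 1)

def encode_marker(ident):
    """Pack a 10-bit ID and its 26-bit CRC into the 36 marker cells."""
    assert 0 <= ident < (1 << 10)
    return (ident << 26) | crc26(ident)

def check_marker(word):
    """A received 36-bit word is consistent iff recomputing the CRC
    of its ID field reproduces the stored check bits."""
    ident = word >> 26
    return crc26(ident) == (word & ((1 << 26) - 1))
```

With this layout, a single flipped cell in the 6x6 marker grid makes `check_marker` fail, which is what lets the system reject noisy touch-screen input before shape matching.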
In the second method, as shown in FIG. 4, coordinate values are extracted from the touch-screen input and averaged; the stroke is then converted into a preliminary context to form a rectangular box based on the maximum width and maximum height, and the four vertices and the RT matrix are extracted. For the ID of the object defined inside the AR software, the inside of the box is divided into 36 areas and 36 bits are allocated: 10 bits identify the ID of the user's touch-screen input (marker replacement), and 26 bits are assigned to a CRC (Cyclic Redundancy Check) code, a kind of channel coding. The user's touch-screen input is linked with the ID of the object defined by the user, and when that input is received the object is mapped so that the content can be used. As shown in FIG. 5, the user can control the translation, rotation, and scale of the mapped content, that is, its position, direction, and size, just as with a marker. FIG. 6 shows the overall configuration of the augmented reality content providing system and apparatus supporting touch-screen-based user interaction with these features.
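The translation, rotation, and scale controls on the mapped content can be sketched as a 2D similarity transform applied to the content's vertices. This is an illustrative assumption, not the patent's code; a real system would derive `dx`, `dy`, `angle_deg`, and `scale` from the multi-touch gestures.

```python
import math

def transform_content(points, dx=0.0, dy=0.0, angle_deg=0.0, scale=1.0):
    """Scale, rotate (about the origin), then translate a set of
    content vertices, mirroring the translate/rotate/scale controls."""
    a = math.radians(angle_deg)
    cos_a, sin_a = math.cos(a), math.sin(a)
    out = []
    for x, y in points:
        x, y = x * scale, y * scale                          # scale
        x, y = x * cos_a - y * sin_a, x * sin_a + y * cos_a  # rotate
        out.append((x + dx, y + dy))                         # translate
    return out
```

Applying scale before rotation and translation keeps the three gestures independent: pinching only changes size, twisting only changes direction, and dragging only changes position.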
1A is a diagram of the DI (Diffused Illumination) touch-screen method for inputting an interaction according to the present invention;
1B is a diagram of the LLP (Laser Light Plane) touch-screen method for inputting an interaction according to the present invention;
1C is a diagram of the DSI (Diffused Surface Illumination) touch-screen method for inputting an interaction according to the present invention;
1D is a diagram of the FTIR (Frustrated Total Internal Reflection) touch-screen method for inputting an interaction according to the present invention;
2A illustrates an example of generating a user interaction according to the present invention on a touch screen;
2B is an exemplary diagram of mapping 3D content to a defined position by recognizing a user interaction according to the present invention as a marker;
3 is a flowchart illustrating a method for replacing touch screen input data of a user input with a marker according to the present invention;
4 is a flowchart illustrating a method for integrating a user-created marker and a defined content according to the present invention;
5 is a flowchart illustrating a method of converting a form of an object by interaction of content provided at a defined location;
6 is a block diagram showing the overall configuration of the augmented reality content providing system supporting touch-screen-based user interaction in accordance with the present invention.
Claims (5)
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
KR1020090056608A KR20100138193A (en) | 2009-06-24 | 2009-06-24 | The augmented reality content providing system and equipment for the user interaction based on touchscreen |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
KR1020090056608A KR20100138193A (en) | 2009-06-24 | 2009-06-24 | The augmented reality content providing system and equipment for the user interaction based on touchscreen |
Publications (1)
Publication Number | Publication Date |
---|---|
KR20100138193A true KR20100138193A (en) | 2010-12-31 |
Family
ID=43511695
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
KR1020090056608A KR20100138193A (en) | 2009-06-24 | 2009-06-24 | The augmented reality content providing system and equipment for the user interaction based on touchscreen |
Country Status (1)
Country | Link |
---|---|
KR (1) | KR20100138193A (en) |
Cited By (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN102693347A (en) * | 2011-03-22 | 2012-09-26 | 王鹏勃 | Novel paper-media publication system based on augmented reality technology |
CN108305325A (en) * | 2017-01-25 | 2018-07-20 | 网易(杭州)网络有限公司 | The display methods and device of virtual objects |
KR102159721B1 (en) * | 2019-12-05 | 2020-09-24 | 에스피테크놀러지 주식회사 | Communication apparatus providing augmented reality service using amr space position information, and control method thereof |
- 2009-06-24: KR application KR1020090056608A filed, published as KR20100138193A (status: not active, Application Discontinuation)
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US10565796B2 (en) | Method and system for compositing an augmented reality scene | |
US10089794B2 (en) | System and method for defining an augmented reality view in a specific location | |
Amin et al. | Comparative study of augmented reality SDKs | |
Zollmann et al. | Visualization techniques in augmented reality: A taxonomy, methods and patterns | |
Wang | Augmented reality in architecture and design: potentials and challenges for application | |
Ha et al. | Digilog book for temple bell tolling experience based on interactive augmented reality | |
CN110163942B (en) | Image data processing method and device | |
CN107168534B (en) | Rendering optimization method and projection method based on CAVE system | |
CN108133454B (en) | Space geometric model image switching method, device and system and interaction equipment | |
CN117635815A (en) | Initial visual angle control and presentation method and system based on three-dimensional point cloud | |
CN112614234A (en) | Method for editing mixed reality three-dimensional scene and mixed reality equipment | |
Fadzli et al. | VoxAR: 3D modelling editor using real hands gesture for augmented reality | |
KR20100138193A (en) | The augmented reality content providing system and equipment for the user interaction based on touchscreen | |
KR101582225B1 (en) | System and method for providing interactive augmented reality service | |
Romli et al. | AR@ UNIMAP: A development of interactive map using augmented reality | |
CN114187426A (en) | Map augmented reality system | |
Hwang et al. | 3D building reconstruction by multiview images and the integrated application with augmented reality | |
WO2017147826A1 (en) | Image processing method for use in smart device, and device | |
Bocevska et al. | Implementation of interactive augmented reality in 3D assembly design presentation | |
Abdullah et al. | Virtual gasoline engine based on augment reality for mechanical engineering education | |
Yihang et al. | Virtual Design of Chinese Theater Arts in Augmented Reality Media System | |
Hagbi et al. | In-place augmented reality | |
Gunawan et al. | Acehnese Traditional Clothing Recognition Prototype System Design Based On Augmented Reality | |
Weerasinghe et al. | Playing with the Artworks: A Personalised Artwork Experience. | |
KR102622709B1 (en) | Method and Apparatus for generating 360 degree image including 3-dimensional virtual object based on 2-dimensional image |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
A201 | Request for examination | ||
E902 | Notification of reason for refusal | ||
E601 | Decision to refuse application |