KR101678994B1 - Interactive Media Wall System and Method for Displaying 3Dimentional Objects - Google Patents

Interactive Media Wall System and Method for Displaying 3Dimentional Objects Download PDF

Info

Publication number
KR101678994B1
KR1020150081416A KR20150081416A KR101678994B1
Authority
KR
South Korea
Prior art keywords
user
data
sketch
image
dimensional
Prior art date
Application number
KR1020150081416A
Other languages
Korean (ko)
Inventor
박홍규
Original Assignee
주식회사 미디어프론트
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 주식회사 미디어프론트 filed Critical 주식회사 미디어프론트
Priority to KR1020150081416A priority Critical patent/KR101678994B1/en
Application granted granted Critical
Publication of KR101678994B1 publication Critical patent/KR101678994B1/en

Links

Images

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T19/00 Manipulating 3D models or images for computer graphics
    • G06T19/006 Mixed reality
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/017 Gesture based interaction, e.g. based on a set of recognized hand gestures
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T13/00 Animation
    • G06T13/20 3D [Three Dimensional] animation
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T19/00 Manipulating 3D models or images for computer graphics
    • G06T19/20 Editing of 3D images, e.g. changing shapes or colours, aligning objects or positioning parts
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/30 Subject of image; Context of image processing
    • G06T2207/30204 Marker

Abstract

The present invention provides an interactive media wall system and a method for displaying three-dimensional objects. According to an embodiment of the present invention, the interactive media wall system displays content generated directly by a user, inducing the user's participation and thereby maximizing the user's interest, and displays pre-stored information together with that content, thereby creating a learning effect. In addition, the interactive media wall system converts content generated by the user into 3D content for display, thereby providing the user with a vivid virtual reality experience.

Description

BACKGROUND OF THE INVENTION 1. Field of the Invention The present invention relates to an interactive media wall system and a method for displaying three-dimensional objects.

The present invention relates to a media wall system for displaying three-dimensional objects, and more particularly to a method of displaying a 3D object on a media wall that interacts with a user, and to a sculpture apparatus that interacts with a user.

Generally, wall surfaces, such as those of passageways and stairwells, serve no function other than that of a wall. With the development of advertising, public relations technology, and the display industry, however, such walls are increasingly being put to use. In recent years, strategies have been disclosed for maximizing advertising effect by installing displays on walls and outputting images, thereby increasing visibility to passers-by. In these methods, an LCD or LED display is embedded in the wall so that the display surface is flush with the wall surface; the image then appears to be output from the wall itself, enhancing visibility.

In recent years, wall display devices have been applied to theme parks and various venues to maximize the display effect.

In particular, in theme parks and similar venues, there is growing demand for services and service-system technologies that display three-dimensional objects, respond to a user's location and touch, or display and interact with data generated by the user.

DISCLOSURE OF THE INVENTION The present invention provides a media wall service apparatus and media wall system that interact with a user and convert a two-dimensional object into a three-dimensional display, and a sculpture apparatus that interacts with a user.

According to an aspect of the present invention, there is provided a method of displaying three-dimensional objects on a media wall, the method comprising: receiving sketch data input by a user from a scanner, the names of objects included in the sketch, and the user images and moving-image data photographed by a camera; storing in a database the received data, the three-dimensional model data required for creating a three-dimensional object, the displayed object information, and animation data; tracking a movement of the user included in the photographed image; extracting a three-dimensional model corresponding to the input sketch data; generating a three-dimensional object to be displayed on the media wall by synthesizing the extracted three-dimensional model and the input sketch data; extracting information related to the generated three-dimensional object; and displaying in three dimensions at least one of the extracted three-dimensional object related information, the three-dimensional object, the received user image, the tracked user's movement, and the captured user image, together with the animation stored in the media wall.

In a preferred embodiment, receiving the data includes: recognizing, in the scanner, a marker included in the sketch board for recognizing the user's sketch; processing the image included in the sketch board inside the marker, based on the recognized marker; reading the processed image as data; and transmitting the read image data to the media wall; and the scanner is provided in an image input device in the form of a kiosk.

According to another aspect of the present invention, there is provided an interactive sculpture service method comprising: recognizing touch information of a user and real-time location information of the user through a sensor or a camera; mapping output information, including an output object, an output time, and an output sound, according to the recognized touch information and real-time location information of the user; and controlling the output time and output position of the mapped output information according to the user's location information and touch information.

According to another aspect of the present invention, there is provided an interactive media wall service apparatus including: an image scanning device including a kiosk for performing a communication function; a communication unit for receiving the user image and moving-image data captured by a camera and the sketch data; a database for storing the data necessary for converting the two-dimensional sketch data into a three-dimensional object, the object to be displayed, the display background and object related information, the user image and moving-image data, and the user's sketch scan data; a scan image processing unit for recognizing, through edge detection, the lines included in the received two-dimensional scan data in order to convert the received two-dimensional scan data into a three-dimensional object, and generating the coordinate data included in the two-dimensional scan data; a motion recognizing unit for recognizing a motion of the user included in the user image and moving-image data through edge or color-change sensing; a motion tracking unit for tracking the recognized motion; a rendering unit for converting the tracked motion and the 2D scan data into a 3D object; and an output control unit for extracting and mapping, from the database, information related to the object converted into three dimensions, and controlling the output time and output position of the object and the mapped information according to the user's real-time location information and touch information.

According to another aspect of the present invention, there is provided a sketch input apparatus including: a marker recognition unit for recognizing a marker included in a sketch board; an image processing unit for reading the lines and points of the image included inside the recognized marker and processing them as image data; and a communication unit for transmitting the processed image data to a display device including a media wall and an interactive sculpture device.

According to another aspect of the present invention, there is provided an interactive media wall system including: a media table for receiving a sketch from a user; a camera for photographing the user's photos and videos; a media wall for converting the objects included in the sketch input to the media table and in the photographed images and videos into three-dimensional form, and displaying the converted three-dimensional objects; and an interactive sculpture device for recognizing the user's touch information or real-time location information and outputting at least one of pre-stored sound, image, and illumination based on the recognized information.

According to another aspect of the present invention, a three-dimensional object display method in an interactive media wall system includes: receiving a sketch from a user at a media table and processing the sketch into image data, and capturing photographs and videos of the user with a camera and processing them into data; interlocking the media table, the camera, and the media wall; transmitting the processed sketch, photograph, and moving-picture data from the media table and the camera to the media wall; converting the objects included in the transmitted sketch, photograph, and moving-picture data into three dimensions; and outputting the converted three-dimensional objects according to the sensed position and touch information of the user.

In a preferred embodiment, the step of outputting the converted three-dimensional object according to the user's position and touch information includes: mapping the converted three-dimensional object and information related to the object; recognizing the user's touch information and real-time position information through a sensor and a camera; and outputting the mapped information and the three-dimensional object together, based on the recognized touch information and real-time position information.

In a preferred embodiment, converting the objects included in the transmitted sketch, photograph, and moving-picture data into three dimensions includes: extracting the coordinates and lines included in the sketch, photograph, and moving-picture data through edge detection; mapping pre-stored three-dimensional model data corresponding to the sketched image data, based on the coordinate and line data extracted from the sketch; converting the object included in the sketch into a three-dimensional object by synthesizing the transmitted sketch data with the three-dimensional model data; and recognizing a motion of an object included in the photograph and moving image based on the coordinate and line data extracted therefrom, and performing motion tracking based on the recognized motion.

The present invention provides a media wall system and method that interact with a user, thereby giving the user an opportunity to directly produce the content displayed on the media wall.

Also, by displaying the content generated by the user, the present invention induces user participation and maximizes the user's interest, and by displaying that content together with previously stored information, it creates a learning effect.

FIG. 1A is a view showing an example of a theme park in which a media wall service system according to an embodiment of the present invention is implemented.
FIG. 1B is a diagram illustrating a media wall system experience flow according to an embodiment of the present invention.
FIG. 2 is a diagram illustrating a configuration of a media wall system according to an embodiment of the present invention.
FIG. 3A is a front view, a side view, and a perspective view of a kiosk, an example of an input device that receives a sketch directly from a user according to an embodiment of the present invention; FIG. 3B is a block diagram illustrating a schematic configuration of the kiosk as an input device according to an embodiment of the present invention; and FIG. 3C is a diagram illustrating the markers included in a sketch board according to an embodiment of the present invention.
FIG. 4A is a view showing an embodiment in which a sketch (a) drawn directly by a user and a photograph (b) of the user are converted and displayed in three dimensions on a media wall according to an embodiment of the present invention; FIG. 4B is a block diagram illustrating a schematic configuration of the media wall according to an embodiment of the present invention; FIG. 4C is a diagram illustrating a process of converting an image in the media wall according to an exemplary embodiment of the present invention; and FIG. 4D is a view illustrating a three-dimensional object converted from a user's sketch, displayed together with a description related to the object, according to an exemplary embodiment of the present invention.
FIG. 5A is a view illustrating the functions of an interactive sculpture apparatus according to an exemplary embodiment of the present invention, and FIG. 5B is a schematic view illustrating the functional configuration of the interactive sculpture apparatus according to an exemplary embodiment of the present invention.
FIG. 6 is a flowchart illustrating a three-dimensional object display process in a media wall according to an embodiment of the present invention.
FIG. 7 is a diagram illustrating an output process in an interactive sculpture according to an embodiment of the present invention.
FIG. 8 is a flowchart illustrating an operation flow of an interactive media wall system according to an embodiment of the present invention.

According to an embodiment of the present invention, there is provided a media wall service apparatus and a media wall system that interact with a user and convert a two-dimensional object into three dimensions for display, and a sculpture apparatus that interacts with a user. Since the present invention provides a media wall service apparatus and method that interact with the user, the user has an opportunity to directly produce the content displayed on the media wall. In addition, through the present invention, the content generated by the user is displayed, inducing user participation and maximizing the user's interest, and the content generated by the user and the pre-stored information are displayed together to create a learning effect. Furthermore, the media wall service apparatus according to the embodiment converts the content generated by the user into 3D and displays it, thereby providing a more vivid virtual reality experience to the user.

BRIEF DESCRIPTION OF THE DRAWINGS The advantages and features of the present invention, and the manner of achieving them, will become apparent from the embodiments described hereinafter in conjunction with the accompanying drawings. The present invention may, however, be embodied in many different forms and should not be construed as being limited to the embodiments set forth herein. Rather, these embodiments are provided so that this disclosure will be thorough and complete and will fully convey the scope of the invention to those skilled in the art, and the invention is defined by the claims. The terminology used herein is for the purpose of describing particular embodiments only and is not intended to limit the invention. In this specification, the singular form includes the plural form unless otherwise specified. It is noted that "comprises" or "comprising," as used herein, specifies the presence of stated components, steps, and operations, but does not preclude the presence or addition of one or more other components, steps, or operations.

DETAILED DESCRIPTION OF THE PREFERRED EMBODIMENTS Reference will now be made in detail to the preferred embodiments of the present invention, examples of which are illustrated in the accompanying drawings.

FIG. 1A is a view showing an example of a theme park in which a media wall service system according to an embodiment of the present invention is implemented.

Referring to FIG. 1A, a media wall system according to an embodiment of the present invention includes a sketch input device 100, a media wall 200, a sculpture 300 that interacts with a viewer, and a camera (not shown).

For ease of understanding, the media wall system according to an embodiment of the present invention will be described with reference to FIG. 1A and to FIG. 1B, which shows the media wall system experience flow.

The sketch input device 100 is a device including a kiosk. The sketch input device 100 receives a sketch drawn directly by a user through a scanner or the like provided in the kiosk. To do this, the user can select a picture to sketch from the picture box and freely sketch using the selected picture. After completing the sketch, the user feeds the sketched picture into the input device 100. The sketch input device 100 images the input sketch, generates image data, and transmits the image data to the media wall 200.

The media wall 200 extracts the lines and point coordinates included in the received image data, converts the two-dimensional data into three-dimensional object data, and displays the converted three-dimensional object. That is, in the present invention, a sketch drawn by a user can be converted into a three-dimensional object and displayed, thereby interacting with the user.
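The line-and-coordinate extraction described above can be illustrated in code. The following is a minimal sketch, not the patent's actual implementation: a pure-Python gradient threshold that returns the coordinates of edge pixels in a 2-D grayscale image (the function name and threshold are hypothetical, as the patent discloses no concrete algorithm):

```python
def extract_edge_coords(image, threshold=50):
    """Return (row, col) pairs where the horizontal or vertical
    intensity difference exceeds `threshold` (a crude edge detector)."""
    coords = []
    for r in range(len(image)):
        for c in range(len(image[0])):
            dx = abs(image[r][c] - image[r][c - 1]) if c > 0 else 0
            dy = abs(image[r][c] - image[r - 1][c]) if r > 0 else 0
            if max(dx, dy) > threshold:
                coords.append((r, c))
    return coords

# A toy 4x4 "sketch": a dark stroke (0) on a light background (255).
sketch = [
    [255, 255, 255, 255],
    [255,   0,   0, 255],
    [255,   0,   0, 255],
    [255, 255, 255, 255],
]
edges = extract_edge_coords(sketch)
```

In a production system this step would more likely use a library edge detector (e.g. Canny) than a raw gradient threshold, but the coordinate list it produces plays the same role as the point coordinates the media wall 200 extracts.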

In addition, the media wall system according to the embodiment may further include a sculpture 300 that interacts with a viewer. The sculpture 300 recognizes the user's real-time location information and touch information, and outputs sounds, objects, lights, and the like suited to the recognized information, thereby interacting with the user and presenting various information more realistically.

FIG. 2 is a diagram illustrating a configuration of a media wall system according to an embodiment of the present invention.

The media wall system according to an embodiment may include a scanning zone 100, an interactive media wall 200, a photo gallery wall 210, and a 3D projection mapping sculpture 300; FIG. 2 focuses on the components constituting each zone of the system.

Referring to FIG. 2, the scanning zone 100 includes a touch monitor, a media PC, a scanner, and the like. The user scans and inputs a sketch drawn by himself or herself using the various input devices provided in the scanning zone 100.

On the interactive media wall 200, a sketch image drawn directly by the user is displayed in three dimensions. To this end, the interactive media wall 200 includes a projector 10 for projecting the objects displayed on the media wall, a media server 220 connected to the camera sensor, an amplifier 230 for outputting the sound mapped to the displayed objects, and a speaker 240.

The photo gallery wall 210 displays photographs and moving images of photographed users, or generates display objects using those photographs and moving images. To this end, the photo gallery wall 210 includes a projector 211 for projecting the photographs and moving pictures of the photographed users, a media PC 213 for storing and managing the photographs and images and controlling the display, and an amplifier 215 and a speaker 217 for outputting sound signals.

In addition, the 3D projection mapping sculpture 300 is a device that recognizes the user's touch and real-time location information and displays corresponding objects, sounds, and lights. The sculpture 300 includes a projector 310 for projecting display objects, a media PC 320 capable of storing and controlling the display, and an amplifier 330 and a speaker 340 for sound output.

FIG. 2 illustrates an example configuration of the media wall system provided by an embodiment of the present invention, but the media wall system according to the embodiment is not limited thereto. Hereinafter, embodiments of each component constituting the media wall system will be described in more detail.

FIG. 3A is a front view, a side view, and a perspective view of a kiosk, an example of an input device that receives a sketch directly from a user according to an embodiment of the present invention, and FIG. 3B is a block diagram illustrating a schematic configuration of the kiosk as an input device according to an embodiment of the present invention.

Referring to FIGS. 3A and 3B, a kiosk according to an embodiment of the present invention may include a marker recognition unit 31, a scanner 33, an image processing unit 35, and a communication unit 37.

The marker recognition unit 31 recognizes the markers formed at the corners of the sketch board for accurate scan-coordinate recognition, as shown in FIG. 3C. For example, the marker recognition unit 31 recognizes markers containing black-and-white dots or barcode images at each corner, thereby allowing the scanner 33 to scan only the image enclosed by the markers.

The scanner 33 receives the recognized marker coordinates and captures the image inside those coordinates through a camera or sensor provided in the scanner.
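The "scan only inside the markers" behaviour can be sketched as a crop bounded by the four recognized corner coordinates. This is a toy illustration with hypothetical names; the patent does not specify the cropping logic:

```python
def crop_inside_markers(image, markers):
    """Return the sub-image strictly inside the axis-aligned box spanned
    by the four corner-marker (row, col) positions, excluding the markers."""
    rows = [r for r, _ in markers]
    cols = [c for _, c in markers]
    top, bottom = min(rows) + 1, max(rows)   # exclude the marker rows
    left, right = min(cols) + 1, max(cols)   # exclude the marker columns
    return [row[left:right] for row in image[top:bottom]]

# Toy scan: 1 marks a corner marker, 5 the user's strokes, 0 background.
scan = [
    [1, 0, 0, 0, 1],
    [0, 5, 5, 5, 0],
    [0, 5, 0, 5, 0],
    [1, 0, 0, 0, 1],
]
markers = [(0, 0), (0, 4), (3, 0), (3, 4)]
inner = crop_inside_markers(scan, markers)
```

A real scanner would first rectify perspective from the detected marker corners; the crop above only conveys the idea of restricting the scan to the region the markers enclose.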

The image processing unit 35 reads the captured sketch image information and performs image processing.

Thereafter, the communication unit 37 transmits the processed image information to the media wall 200.

FIG. 4A is a view showing an embodiment in which a sketch (a) drawn directly by a user and a photograph (b) of the user are converted and displayed in three dimensions on a media wall according to an embodiment of the present invention, FIG. 4B is a block diagram illustrating a schematic configuration of the media wall, and FIG. 4C is a diagram illustrating a process of converting an image in the media wall according to an exemplary embodiment of the present invention.

Referring to FIGS. 4B and 4C, the media wall according to the embodiment includes a scan image processing unit 410, a motion recognition unit 420, a motion tracking unit 430, a rendering unit 440, a database 450, and an output control unit 460.

The scan image processing unit 410 receives the sketch image data drawn directly by the user from the kiosk, the input device according to the embodiment.

The motion recognition unit 420 recognizes the motion of the user included in the photograph or video captured by the camera. For example, the motion recognition unit 420 recognizes the motion of an object by detecting edges or color changes of the object included in the photograph or video.
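The color-change sensing described for the motion recognition unit 420 can be approximated by simple frame differencing. The sketch below uses hypothetical names and an illustrative threshold; it flags the pixels whose intensity changed between two frames:

```python
def detect_motion(prev_frame, curr_frame, threshold=30):
    """Return (row, col) coordinates whose intensity changed by more
    than `threshold` between two grayscale frames."""
    changed = []
    for r, (prev_row, curr_row) in enumerate(zip(prev_frame, curr_frame)):
        for c, (p, q) in enumerate(zip(prev_row, curr_row)):
            if abs(p - q) > threshold:
                changed.append((r, c))
    return changed

# Toy frames: one pixel lights up between frame 1 and frame 2.
prev = [[0, 0, 0], [0, 0, 0]]
curr = [[0, 0, 0], [0, 200, 0]]
moved = detect_motion(prev, curr)
```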

The motion tracking unit 430 tracks the motion of the recognized object and reproduces it by data streaming or the like. That is, the motion of the recognized object is tracked and stored so that it can be reproduced later.
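The track-and-replay behaviour of the motion tracking unit 430 can be sketched as reducing each frame's set of changed pixels to a centroid and storing the resulting time-ordered path, which can later be streamed back. All names here are illustrative, not from the patent:

```python
def track_centroids(changed_per_frame):
    """Reduce each frame's list of changed (row, col) pixels to a centroid,
    producing a time-ordered track that can be stored and replayed."""
    track = []
    for changed in changed_per_frame:
        if changed:  # skip frames with no detected change
            cy = sum(r for r, _ in changed) / len(changed)
            cx = sum(c for _, c in changed) / len(changed)
            track.append((cy, cx))
    return track

# Changed pixels of an object drifting right across three frame pairs.
stream = [[(1, 1), (1, 2)], [(1, 2), (1, 3)], [(1, 3), (1, 4)]]
path = track_centroids(stream)
```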

The rendering unit 440 converts the two-dimensional object data to be displayed on the media wall, received from the motion tracking unit 430 and the scan image processing unit 410, into three-dimensional data. For example, as shown in FIG. 4C, the rendering unit 440 extracts the 3D model corresponding to the received sketch data from the database 450 and synthesizes the sketch data with it, thereby converting the two-dimensional sketch into a three-dimensional object. In addition, the rendering unit 440 synthesizes an animation with the object converted to 3D and outputs the result on the media wall.
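The rendering unit's lookup-and-synthesis step can be sketched as a dictionary lookup keyed by the recognized object name, with the user's 2-D drawing attached as the model's texture. `MODEL_DB` and all names below are hypothetical stand-ins for the database 450 and its contents:

```python
# Hypothetical model database keyed by the object name recognized in a sketch.
MODEL_DB = {
    "fish": {"mesh": "fish.obj", "animation": "swim"},
    "bird": {"mesh": "bird.obj", "animation": "fly"},
}

def render_3d_object(sketch_label, sketch_texture):
    """Fetch the stored 3-D model matching the sketch and attach the
    user's drawing as its texture; return None if no model matches."""
    model = MODEL_DB.get(sketch_label)
    if model is None:
        return None
    return {
        "mesh": model["mesh"],
        "animation": model["animation"],  # animation synthesized with the object
        "texture": sketch_texture,        # the user's 2-D drawing wraps the model
    }
```

In an actual renderer the "synthesis" would be a UV-mapped texture projection onto the mesh; the dictionary merely conveys which pieces of stored and user data are combined.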

The output control unit 460 receives the 3D object information converted by the rendering unit 440, extracts the information related to that object from the database 450, and outputs the extracted related information and the 3D object together.

For example, as shown in FIG. 4D, the output control unit 460 can display various learning information together with the converted object, such as the name the user gave the image and the scientific name of the creature the user drew, thereby creating a learning effect. In addition, the output control unit 460 controls outputs such as sounds and voices corresponding to the objects converted into 3D.

FIG. 5A is a view illustrating the functions of an interactive sculpture apparatus according to an exemplary embodiment of the present invention, and FIG. 5B is a schematic view illustrating the functional configuration of the interactive sculpture apparatus according to an exemplary embodiment of the present invention.

Referring to FIG. 5B, an interactive sculpture apparatus according to an exemplary embodiment of the present invention may include a recognition unit 510, a mapping unit 520, a control unit 530, and an output unit 540.

The recognition unit 510 recognizes the user's touch information and real-time position information with the installed sensor or camera.

The mapping unit 520 maps output information corresponding to the recognized touch information and position information. For example, when it recognizes that the user has touched continuously for more than a predetermined time, the mapping unit 520 maps the corresponding illumination and sound for output.

The control unit 530 controls the output information mapped to the user's touch information and position information so that it is output at the appropriate position for the appropriate time. For example, the control unit 530 controls, according to the user's real-time position, the display position of the illumination and the like on the sculpture, as well as the duration of continuous display.
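The control unit's time-and-position logic can be sketched as a mapping from touch duration and position to the output to play, with a long press triggering the full effect. The threshold and effect names are illustrative, not disclosed in the patent:

```python
def map_touch_to_output(touch_duration, position, hold_threshold=2.0):
    """Map a touch event to an output: a press held past `hold_threshold`
    seconds triggers the full light-and-sound effect at the touched
    position, while a short tap triggers only a brief sound cue."""
    if touch_duration >= hold_threshold:
        return {"effect": "light_and_sound", "position": position}
    return {"effect": "sound_cue", "position": position}
```

Anchoring the output at `position` mirrors the patent's requirement that the display follow the user's real-time location on the sculpture.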

Hereinafter, a three-dimensional object display method using a media wall according to the present invention will be described step by step. Since this method is carried out by the media wall system described above, description overlapping with FIGS. 1A to 5B will be omitted.

FIG. 6 is a flowchart illustrating a three-dimensional object display process in a media wall according to an embodiment of the present invention.

In step S610, the sketch data input by the user is received from the scanner, and the user image and moving-image data captured by the camera are received. In an embodiment, step S610 may include: recognizing a marker included in the sketch board for recognizing the user's sketch; processing the image included in the sketch board inside the marker, based on the recognized marker; reading the processed image as data; and transmitting the read image data to the media wall. In an embodiment, the scanner is included in an image input device in the form of a kiosk.

In step S620, the user's motion included in the photographed image is tracked. Tracking follows the user's movement and prepares it for later data streaming and display on the media wall.

In step S630, a 3D model corresponding to the input sketch data is extracted.

In step S640, the extracted three-dimensional model and the input sketch data are combined to generate a three-dimensional object displayed in the media wall.

In step S650, information related to the generated three-dimensional object is extracted. In step S660, at least one of the extracted three-dimensional object related information, the three-dimensional object, the received user image, and the tracked user's motion is displayed in three dimensions together with the animation stored in the media wall.
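Steps S620 through S660 can be tied together in a toy end-to-end pipeline. All names, databases, and the motion stand-in below are hypothetical; the patent describes the flow, not an implementation:

```python
def display_pipeline(sketch_label, frames, model_db, info_db):
    """Toy end-to-end flow of steps S620-S660."""
    # S620: count frame pairs in which anything moved (stand-in for tracking).
    moved = sum(1 for a, b in zip(frames, frames[1:]) if a != b)
    # S630: extract the 3-D model corresponding to the sketch.
    model = model_db.get(sketch_label, "default.obj")
    # S640: synthesize the model and the sketch into one display object.
    obj = {"model": model, "texture": sketch_label}
    # S650: extract information related to the generated object.
    info = info_db.get(sketch_label, "")
    # S660: bundle everything for three-dimensional display on the wall.
    return {"object": obj, "info": info, "motion_frames": moved}

job = display_pipeline(
    "whale",
    frames=[[0], [1], [1]],
    model_db={"whale": "whale.obj"},
    info_db={"whale": "The largest marine mammal."},
)
```

The returned bundle corresponds to the display job of step S660: the synthesized object, its related learning information, and the tracked motion are output together.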

FIG. 7 is a diagram illustrating an output process in an interactive sculpture according to an embodiment of the present invention.

Referring to FIG. 7, in step S710, the user's touch information and real-time position information are recognized through the sensor and the camera installed in the interactive sculpture.

In step S720, the pre-stored output information, including the output object, output position, output time, and output sound, is mapped according to the recognized touch information and real-time position information of the user.

In step S730, the output time and output position of the mapped output information are controlled.

FIG. 8 is a flowchart illustrating an operation flow of an interactive media wall system according to an embodiment of the present invention.

At the media table, a sketch is received from the user and processed into image data; the camera captures photographs and videos of the user and processes them into data (S810).

Then, the media table, the camera, and the media wall are linked (S820).

The media table and the camera transmit the processed user's sketch, photograph, and moving-picture data to the media wall (S830). Then, the media wall converts the objects included in the transmitted sketch, photograph, and moving-picture data into three dimensions (S840).

At this time, converting the objects included in the transmitted sketch, photograph, and moving-picture data into three dimensions may include: extracting the coordinates and lines included in the sketch, photograph, and moving-picture data; mapping pre-stored three-dimensional data corresponding to the sketched image data, based on the coordinate and line data extracted from the sketch; synthesizing the transmitted sketch data with the mapped three-dimensional data to convert the object included in the sketch into a three-dimensional object; and recognizing the motion of an object included in the photograph and moving image based on the coordinate and line data extracted therefrom, and performing motion tracking on the recognized motion.

Then, the converted three-dimensional object is output according to the user's position and touch information (S850). In an exemplary embodiment of the present invention, the outputting may include: mapping the converted three-dimensional object and information related to the object; recognizing the user's touch information and real-time location information; and outputting the mapped information together with the three-dimensional object according to the recognized touch information and real-time position information. Through the above-described process, the sketch drawn by the user can be displayed as a three-dimensional object together with learning information, and the sculpture device and media wall service can interact with the user by outputting appropriate objects and sounds according to the user's touch information and real-time position.

Thus, by providing a media wall system that interacts with a user, user participation is induced and the user's interest is maximized, and the content generated by the user and the previously stored information are displayed together to create a learning effect. In addition, the media wall service apparatus according to the embodiment converts the content generated by the user into three dimensions and displays it, thereby providing the user with a more vivid virtual reality experience.

Meanwhile, the 3D object display method in the interactive media wall system according to the above-described embodiment of the present invention can be implemented as computer-readable code on a computer-readable recording medium. The computer-readable recording medium includes all kinds of recording media storing data that can be decoded by a computer system, for example, ROM (Read Only Memory), RAM (Random Access Memory), magnetic tape, magnetic disks, flash memory, and optical data storage devices. The computer-readable recording medium may also be distributed over computer systems connected by a computer network, and stored and executed as code readable in a distributed manner.

While the present invention has been particularly shown and described with reference to exemplary embodiments thereof, it is to be understood that the invention is not limited to the disclosed embodiments, but, on the contrary, is intended to cover various modifications and equivalent arrangements included within the spirit and scope of the appended claims. Therefore, the scope of the present invention should not be limited by the illustrated embodiments, but should be determined by the scope of the appended claims and equivalents thereof.

Claims (9)

delete
delete
delete
An interactive media wall service device comprising:
a communication unit for receiving sketch scan data from an image scanning device including a kiosk for performing a communication function, and user photograph and moving-image data taken by a camera;
a database for storing data necessary for converting the two-dimensional sketch data into a three-dimensional object, an object to be displayed and a display background, information related to the object, the user photograph and moving-image data, and the user's sketch scan data;
a scan image processing unit for recognizing lines included in the received two-dimensional scan data through edge detection and generating coordinate data included in the two-dimensional scan data, so as to convert the received two-dimensional scan data into a three-dimensional object;
an operation recognizing unit for recognizing a motion of the user included in the user photograph and moving-image data by sensing edges or color changes;
a motion tracking unit for tracking the recognized motion;
a rendering unit for converting the tracked motion and the two-dimensional scan data into a three-dimensional object; and
an output control unit for extracting, from the database, the object converted into three-dimensional space and the information related to the object, mapping them, and controlling an output time and an output position of the object and the mapped information according to the real-time position information and touch information of the user.
delete
An interactive media wall system comprising:
a media table for receiving a sketch from a user;
a camera for photographing the user's photographs and moving images;
a media wall for converting a sketch input to the media table and objects included in the photographed photographs and moving images into a three-dimensional image, and displaying the converted three-dimensional object; and
an interactive shaping device for recognizing the touch information of the user or the real-time location information of the user, and outputting at least one of a previously stored sound, image, and illumination based on the recognized information.
A method of displaying a three-dimensional object in an interactive media wall system, the method comprising:
receiving a sketch from a user through a media table and processing the sketch as image data, and capturing a photograph and a moving image of the user with a camera and processing them as image data;
interlocking the media table, the camera, and the media wall;
transmitting the sketch, photograph, and moving-picture data of the user processed in the media table and the camera to the media wall;
converting objects included in the transmitted sketch, photograph, and moving-picture data into three-dimensional objects; and
outputting the converted three-dimensional object according to the sensed position and touch information of the user.
The method as claimed in claim 7, wherein the outputting of the converted three-dimensional object according to the position and touch information of the user comprises:
mapping the converted three-dimensional object to information associated with the object;
recognizing the touch information and real-time position information of the user through a sensor and a camera; and
outputting the mapped information and the three-dimensional object based on the recognized touch information and real-time position information.
The method of claim 8, wherein the converting of objects included in the transmitted sketch, photograph, and moving-picture data into three-dimensional objects comprises:
extracting coordinates and lines included in the sketch, photograph, and moving-picture data through edge detection;
mapping pre-stored three-dimensional model data corresponding to the sketch image data, based on the coordinates and line data extracted from the sketch;
synthesizing the transmitted sketch data with the three-dimensional model data to convert the object included in the sketch into a three-dimensional object; and
recognizing a motion of an object included in the photograph and moving image based on the coordinate and line data extracted from the photograph and moving image, and performing motion tracking based on the recognized motion.
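The edge-detection step recited in the claims above can be illustrated with a minimal, self-contained Python sketch. This is only a toy stand-in under stated assumptions: a simple neighbor-difference test replaces a full edge detector (such as Canny), and the function name and the 4x4 "sketch" image are hypothetical, not part of the patent.

```python
# Illustrative sketch (not the patented implementation): extracting edge
# coordinates from a binary sketch image by checking intensity changes
# between each pixel and its right and lower neighbors.

def detect_edges(image):
    """Return (row, col) coordinates where the pixel value differs from
    the right or lower neighbor (a minimal gradient test)."""
    edges = []
    rows, cols = len(image), len(image[0])
    for r in range(rows):
        for c in range(cols):
            right = image[r][c + 1] if c + 1 < cols else image[r][c]
            below = image[r + 1][c] if r + 1 < rows else image[r][c]
            if image[r][c] != right or image[r][c] != below:
                edges.append((r, c))
    return edges

# A tiny 4x4 "sketch": a filled 2x2 square of ink (1) on white paper (0).
sketch = [
    [0, 0, 0, 0],
    [0, 1, 1, 0],
    [0, 1, 1, 0],
    [0, 0, 0, 0],
]
print(detect_edges(sketch))  # coordinates along the square's boundary
```

The coordinates returned by such a pass correspond to the "coordinates and lines" that the claimed method maps against pre-stored three-dimensional model data; a production system would use a proper gradient-based detector and contour extraction instead of this neighbor test.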
KR1020150081416A 2015-06-09 2015-06-09 Interactive Media Wall System and Method for Displaying 3Dimentional Objects KR101678994B1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
KR1020150081416A KR101678994B1 (en) 2015-06-09 2015-06-09 Interactive Media Wall System and Method for Displaying 3Dimentional Objects

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
KR1020150081416A KR101678994B1 (en) 2015-06-09 2015-06-09 Interactive Media Wall System and Method for Displaying 3Dimentional Objects

Publications (1)

Publication Number Publication Date
KR101678994B1 true KR101678994B1 (en) 2016-11-24

Family

ID=57705541

Family Applications (1)

Application Number Title Priority Date Filing Date
KR1020150081416A KR101678994B1 (en) 2015-06-09 2015-06-09 Interactive Media Wall System and Method for Displaying 3Dimentional Objects

Country Status (1)

Country Link
KR (1) KR101678994B1 (en)

Cited By (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR101923507B1 (en) * 2018-01-11 2018-11-29 김승은 Method for providing interactive story-telling contents based on augmented reality
KR101975150B1 (en) * 2018-10-12 2019-05-03 (주)셀빅 Digital contents temapark operating system
KR102289448B1 (en) * 2020-12-08 2021-08-12 믹스비전 주식회사 Apparatus for providing interactive content
KR102341294B1 (en) * 2020-12-08 2021-12-21 믹스비전 주식회사 Method and apparatus for providing interactive content
WO2023075125A1 (en) * 2021-10-28 2023-05-04 주식회사 스페이스엘비스 Content producing system on basis of extended reality
KR102539395B1 (en) * 2021-12-30 2023-06-05 (주)웅진씽크빅 Electronic device for implementing metaverse environment using drawing motion and method for operating the same
KR102573198B1 (en) 2023-02-24 2023-09-01 와우하우스 주식회사 Media wall service operating system based on user participation

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
https://www.team-lab.net/exhibitions/odaibayumetairiku (Hopscotch for Geniuses 2013, Sketch Town 2014) (2014.12.31.)*

Similar Documents

Publication Publication Date Title
KR101678994B1 (en) Interactive Media Wall System and Method for Displaying 3Dimentional Objects
US8644467B2 (en) Video conferencing system, method, and computer program storage device
KR100918392B1 (en) Personal-oriented multimedia studio platform for 3D contents authoring
US20150080072A1 (en) Karaoke and dance game
WO2016051366A2 (en) Switching between the real world and virtual reality
US11037321B2 (en) Determining size of virtual object
WO2010038693A1 (en) Information processing device, information processing method, program, and information storage medium
KR102186607B1 (en) System and method for ballet performance via augumented reality
KR101553273B1 (en) Method and Apparatus for Providing Augmented Reality Service
US10970932B2 (en) Provision of virtual reality content
US11062422B2 (en) Image processing apparatus, image communication system, image processing method, and recording medium
WO2017124870A1 (en) Method and device for processing multimedia information
KR101641672B1 (en) The system for Augmented Reality of architecture model tracing using mobile terminal
KR100901111B1 (en) Live-Image Providing System Using Contents of 3D Virtual Space
KR101177058B1 (en) System for 3D based marker
CN112686332A (en) AI image recognition-based text-based intelligence-creating reading method and system
KR101518696B1 (en) System for augmented reality contents and method of the same
JP6091850B2 (en) Telecommunications apparatus and telecommunications method
KR101860215B1 (en) Content Display System and Method based on Projector Position
KR101807813B1 (en) Motion Recognition Service Offering System and Method thereof
JP5066047B2 (en) Information processing apparatus, information processing method, program, and information storage medium
US11295531B1 (en) System and method for generating interactive virtual image frames in an augmented reality presentation
US10714146B2 (en) Recording device, recording method, reproducing device, reproducing method, and recording/reproducing device
CN108270978A (en) A kind of image processing method and device
WO2022075073A1 (en) Image capture device, server device, and 3d data generation method

Legal Events

Date Code Title Description
E701 Decision to grant or registration of patent right
GRNT Written decision to grant
FPAY Annual fee payment

Payment date: 20190918

Year of fee payment: 4