KR20130076131A - Apparatus for displaying interactive content - Google Patents

Apparatus for displaying interactive content

Info

Publication number
KR20130076131A
KR20130076131A
Authority
KR
South Korea
Prior art keywords
content
user
control device
image output
terminal
Prior art date
Application number
KR1020110144581A
Other languages
Korean (ko)
Inventor
고희애
성정환
Original Assignee
유니웹스 주식회사
Priority date
Filing date
Publication date
Application filed by 유니웹스 주식회사
Priority to KR1020110144581A
Publication of KR20130076131A


Classifications

    • G03B21/00 Projectors or projection-type viewers; Accessories therefor
    • G06F3/011 Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G06F3/0481 Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F3/04817 Interaction techniques based on graphical user interfaces [GUI] using icons
    • G06F3/0487 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/14 Digital output to display device; Cooperation and interconnection of the display device with other functional units

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Human Computer Interaction (AREA)
  • Controls And Circuits For Display Device (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The interactive content display device according to an embodiment of the present invention may include a position detection sensor capable of detecting the position of the user and a control device for controlling the image output unit to output the set content according to the detected position of the user.

Description

Interactive content display device {APPARATUS FOR DISPLAYING INTERACTIVE CONTENT}

The present invention relates to technology for variously changing the appearance of a wall or similar surface in a room such as a cafe, and for outputting content desired by the user onto that surface.

Interaction refers to the blurring of the distinction between subject and object through their mutual influence, and it leads to a transition from a mechanical mindset to a fuzzy one. As computers become more capable, interactive functionality will move ever closer to everyday human life.

Recently, devices such as mobile phones and TVs are being produced in which the displayed content changes based on the user's environment (emotion, motion, health status, and so on), rather than showing only the functions and content set by the manufacturer, and the technology for this is being developed. As a specific example, technology has been developed for detecting a user's location or action and executing a specific function based on it. Related art includes Korean Patent Publication No. 10-2009-0004207 (Applicant: Cyclone Soft; Title of the invention: 3D dance online game system) and Korean Patent Publication No. 10-2004-0101984 (Applicant: Korea Information and Communication University; Title of the invention: Apparatus and method for providing media entertainment using interoperability).

An object of the present invention is to provide an interactive content display device that can not only change the appearance of a wall surface in a room such as a cafe, but also output content desired by the user.

An interactive content display device according to an embodiment of the present invention may include a position detection sensor capable of detecting a position of a user and a control device for controlling the image output unit to output content set according to the detected position of the user.

The controller may control the image output unit to change and output the set content at each set period.

The interactive content display apparatus may further include an extractor configured to extract an output position corresponding to the sensed user's position, and the controller may control the image output unit to display the set content based on the extracted output position.

When the user inputs the first content through the user input unit, the controller may control the image output unit to output the input first content.

The controller may control the image output unit to output the content set on the side wall where the user is located according to the detected user's position.

The interactive content display apparatus may further include a terminal that transmits second content to the control apparatus when the user inputs the second content through the terminal, and the control apparatus may control the image output unit to output the second content received from the terminal.

The image output unit may include a projector capable of projecting content or a first display unit capable of displaying content.

The interactive content display device may further include a second display unit connected to the control device, and the control device may display a window on which the user can input content on the second display unit.

The terminal may display a window on which the user can input content on the third display unit of the terminal.

The window may include an area where the user can directly enter a picture.

The window may include at least one of: a create icon for creating a new file, an import icon for importing an image, a text insertion icon for inserting text, a save icon for storing information, a projection icon for projecting the displayed information through the projector, a drawing tool icon for selecting the type and thickness of a drawing tool, a color selection icon for selecting a color of the drawing tool, an emoticon insertion icon for inserting an emoticon, and a speech bubble insertion icon for inserting a speech bubble.

The image output unit may be installed on at least one of a wall, a floor, and a ceiling.

The control device may control the image output unit to output the set content when the detected position of the user is within a set distance from a table or from the position at which the content is output.

The content may be at least one of a character, a sign, an image, a video, an emoticon, and an icon.

In accordance with another aspect of the present invention, an interactive content display device may include a position detection sensor capable of detecting a location of a user, a distance detection sensor for detecting whether a terminal is located within a set distance from a control device, and a control device that, when the terminal is located within the set distance, automatically receives content stored in the terminal and controls the image output unit to output the received content according to the detected position of the user.

The interactive content display apparatus may further include an extractor configured to extract an output position corresponding to the sensed user's position, and the controller may control the image output unit to display the received content based on the extracted output position.

When the user inputs the first content through the user input unit, the controller may control the image output unit to output the input first content.

The interactive content display apparatus may further include a terminal that transmits second content to the control apparatus when the user inputs the second content through the terminal, and the control apparatus may control the image output unit to output the second content received from the terminal.

The controller may control the image output unit to output the received content when the sensed user's location is within a predetermined distance from the table or the location where the content is output.

According to the disclosed invention, not only can the appearance of a wall surface in a room such as a cafe be varied, but content desired by the user can also be output on the wall surface.

FIG. 1 is a block diagram of an interactive content display device according to an embodiment of the present invention.
FIG. 2 is a block diagram of an interactive content display device according to another embodiment of the present invention.
FIG. 3 is a diagram for describing a driving process of an interactive content display device according to an embodiment of the present invention.
FIG. 4 is a diagram for describing a driving process of an interactive content display device according to another embodiment of the present invention.
FIG. 5 is a diagram for describing a driving process of an interactive content display device according to another embodiment of the present invention.
FIGS. 6A and 6B are diagrams for describing a driving process of an interactive content display device according to another embodiment of the present invention.
FIG. 7 is a diagram for explaining a window that can be displayed on a display unit according to an embodiment of the present invention.

Hereinafter, the present invention will be described in detail with reference to the accompanying drawings.

FIG. 1 is a block diagram of an interactive content display device according to an embodiment of the present invention.

Referring to FIG. 1, the interactive content display device 100 may include a location sensor 110, a distance sensor 120, an image output unit 130, and a control device 140. Each component can be connected wirelessly or wired to transfer data.

The location sensor 110 may detect a location of a user. For example, the location sensor 110 may be various types of sensors capable of detecting a user's location such as an infrared sensor and an ultrasonic sensor. For example, the position sensor 110 may be installed near the image output unit 130 and may detect a position of a moving user.

The distance sensor 120 may detect whether the terminal 150 is located within a distance set from the controller 140. For example, the distance sensor 120 may detect a distance from the control device 140 to the terminal 150 based on the current location information of the terminal 150. When the detected distance is within the set distance, the distance sensor 120 may generate a distance detection signal and transmit it to the control device 140.
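
The distance check described above can be sketched as follows. This is a minimal illustration, not part of the disclosure: the planar coordinate representation and the threshold value are assumptions, and a real implementation would derive the terminal's position from its current location information as the description suggests.

```python
import math

SET_DISTANCE_M = 3.0  # illustrative "set distance"; the patent leaves the value open


def within_set_distance(control_pos, terminal_pos, threshold=SET_DISTANCE_M):
    """Return True if the terminal is within the set distance of the control device.

    Positions are assumed to be (x, y) coordinates in meters.
    """
    dx = terminal_pos[0] - control_pos[0]
    dy = terminal_pos[1] - control_pos[1]
    return math.hypot(dx, dy) <= threshold


# The distance sensor would generate a detection signal when this returns True.
print(within_set_distance((0.0, 0.0), (1.0, 2.0)))  # True: sqrt(5) <= 3.0
```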

The image output unit 130 is a device that can output various types of content. The content may be at least one of a character, a sign, an image, a video, an emoticon, and an icon. For example, the image output unit 130 may include a projector 131 that can project content or a first display unit 132 that can display content. The image output unit 130 may be installed on at least one of a wall, a floor, and a ceiling. The first display unit 132 may include at least one of a liquid crystal display (LCD), a thin-film-transistor liquid crystal display (TFT LCD), an organic light-emitting diode (OLED) display, a flexible display, a 3D display, and a touch screen in which a display and a sensor for detecting a touch operation (hereinafter, a "touch sensor") form a mutual layer structure.

The control device 140 may control the image output unit 130 to output the set content according to the detected position of the user. For example, the controller 140 may control the image output unit 130 to output the set contents when the detected user's position is within a set distance from the table or contents output position.

The controller 140 may control the image output unit 130 to change the set content and output it at every set period. For example, the controller 140 may control the image output unit 130 to cycle through A content, B content, C content, and D content every second.
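
The periodic content change described above can be sketched as follows. This is a minimal illustration assuming an abstract content list; the actual one-second timer and the rendering by the image output unit are left to the surrounding system.

```python
import itertools


def content_cycle(contents):
    """Return an iterator yielding the next content item for each set period
    (A, B, C, D, then back to A, and so on)."""
    return itertools.cycle(contents)


cycle = content_cycle(["A content", "B content", "C content", "D content"])
# In a real device, next(cycle) would be called once per set period
# (e.g. every second) and the result sent to the image output unit.
first_five = [next(cycle) for _ in range(5)]
print(first_five)
```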

When the user inputs the first content through the user input unit, the controller 140 may control the image output unit 130 to output the input first content. The user may input content such as a text, an image, a video, or the like through the user input unit. The content may be generated in real time by the user or may be pre-stored in a storage medium. Accordingly, the user may freely input the desired content so that the user outputs it through the image output unit 130.

The control device 140 may control the image output unit 130 to output the set content on the side wall where the user is located, according to the detected position of the user. For example, the image output unit 130 may include a driving unit (not shown), and the control device 140 may control the driving unit so that the image output unit 130 outputs onto the side wall where the user is located.

When the user inputs the second content through the terminal 150, the terminal 150 may transmit the input second content to the control device 140. The controller 140 may control the image output unit 130 to output the second content received from the terminal 150. Accordingly, the user may input the desired content through the terminal owned by the user, and the input content may be output through the control device 140 and the image output unit 130. In this manner, the user can conveniently input desired content.

The control device 140 may display a window on which the user can input content on the second display unit 141.

The terminal 150 may display a window on which the user can input content on the third display unit 151 of the terminal 150.

The window may include an area where the user can directly enter a picture. In addition, the window may include at least one of: a create icon for creating a new file, an import icon for importing an image, a text insertion icon for inserting text, a save icon for storing information, a projection icon for projecting the displayed information through the projector, a drawing tool icon for selecting the type and thickness of a drawing tool, a color selection icon for selecting a color of the drawing tool, an emoticon insertion icon for inserting an emoticon, and a speech bubble insertion icon for inserting a speech bubble. This is described in detail with reference to FIG. 7.

The controller 140 may control the image output unit 130 to output the set content when the detected position of the user is within a set distance from a table or from the position at which the content is output. For example, the controller 140 may control the image output unit 130 to output the set content when the detected position of the user is within 1 m (the "set distance") of the table. As another example, the controller 140 may control the image output unit 130 to output the set content when the detected position of the user is within 1 m (the "set distance") of the position where the content is output (for example, a wall surface).
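
The proximity-gated output described above can be sketched as follows. The anchor coordinates are hypothetical; the 1 m threshold comes from the description's example.

```python
import math

SET_DISTANCE_M = 1.0  # the description's example uses 1 m


def should_output(user_pos, anchor_positions, threshold=SET_DISTANCE_M):
    """Output the set content only when the sensed user position is within the
    set distance of any anchor (a table, or the wall position where content
    is output)."""
    return any(
        math.dist(user_pos, anchor) <= threshold for anchor in anchor_positions
    )


anchors = [(2.0, 0.0), (0.0, 3.0)]  # hypothetical table and wall positions
print(should_output((2.5, 0.0), anchors))  # True: within 1 m of the table
print(should_output((5.0, 5.0), anchors))  # False: out of range of both anchors
```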

When the terminal 150 is located within a set distance, the controller 140 automatically receives the content stored in the terminal 150 and outputs the received content by the image output unit 130 according to the detected user's position. Can be controlled.

The controller 140 may control the image output unit 130 to output the received content when the detected user's location is within a set distance from the table or the location where the content is output.

By using the interactive content display device, not only the shape of the wall surface or the like can be variously changed in a room such as a cafe, but also the user can output the content desired by the user to the wall surface or the like.

FIG. 2 is a block diagram of an interactive content display device according to another embodiment of the present invention.

Referring to FIG. 2, the interactive content display device 200 may include a location sensor 210, an extractor 215, a distance sensor 220, an image output unit 230, and a control device 240. Each component can be connected wirelessly or by wire to transfer data. Since the location sensor 210, the distance sensor 220, the image output unit 230, and the control device 240 perform the same functions as the location sensor 110, the distance sensor 120, the image output unit 130, and the control device 140 of FIG. 1, their description is omitted.

The location sensor 210 may detect a location of a user.

The extractor 215 may extract an output position corresponding to the position of the user sensed by the position sensor 210. For example, the extractor 215 may extract an output position corresponding to the detected user's position based on matching information where the user's position and the output position match. The controller 240 may control the image output unit 230 to display the set content based on the extracted output position. For example, the controller 240 may control the image output unit 230 to display the set content on the left, right, or top of the extracted output position.
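
The extraction step described above, matching a sensed user position to an output position, can be sketched as a lookup over matching information. The zone boundaries and coordinates here are illustrative assumptions; the patent only requires that matching information relate user positions to output positions.

```python
# Matching information: user zone -> output position on the wall surface.
MATCHING_INFO = {
    "zone_left": (0.5, 1.5),
    "zone_center": (2.0, 1.5),
    "zone_right": (3.5, 1.5),
}


def zone_of(user_x):
    """Quantize the sensed x coordinate into a zone (hypothetical rule)."""
    if user_x < 1.5:
        return "zone_left"
    if user_x < 3.0:
        return "zone_center"
    return "zone_right"


def extract_output_position(user_x):
    """Extract the output position matched to the sensed user position."""
    return MATCHING_INFO[zone_of(user_x)]


print(extract_output_position(2.2))  # (2.0, 1.5)
```

The controller would then place the set content relative to this extracted position, for example on its left, right, or upper side.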

When a terminal is located within a set distance, the controller 240 may automatically receive the content stored in the terminal and control the image output unit 230 to output the received content according to the detected position of the user.

The extractor 215 may extract an output position corresponding to the position of the user sensed by the position sensor 210. The controller 240 may control the image output unit 230 to display the received content based on the extracted output position.

By using the interactive content display device, not only the shape of the wall surface or the like can be variously changed in a room such as a cafe, but also the user can output the content desired by the user to the wall surface or the like.

FIG. 3 is a diagram for describing a driving process of an interactive content display device according to an embodiment of the present invention.

Referring to FIG. 3, the interactive content display device may include a location sensor 310, a projector 330, and a control device 340. Each component can be connected wirelessly or wired to transfer data.

The location sensor 310 may detect a location of a user.

The projector 330, which is an image output unit, is a device capable of projecting various kinds of contents. The content may be at least one of a character, a sign, an image, a video, an emoticon, and an icon.

The controller 340 may control the projector 330 to output the set content according to the detected user's position. For example, the controller 340 may control the image output unit 330 to output the set content when the detected user's position is within a set distance from the table or image output position. The location 320 at which the image is output may be a wall or a projector screen.

The controller 340 may control the projector 330 to change the set content and output it at every set period. For example, the controller 340 may control the projector 330 to cycle through A content, B content, C content, and D content every second.

The controller 340 may control the projector 330 to output the input first content when the user inputs the first content through the user input unit.

When the user inputs the second content through the terminal 350, the terminal 350 may transmit the input second content to the control device 340. The controller 340 may control the projector 330 to output the second content received from the terminal 350. Accordingly, the user may input desired content through a terminal owned by the user, and the input content may be output through the control device 340 and the image output unit 330. In this manner, the user can conveniently input desired content.

FIG. 4 is a diagram for describing a driving process of an interactive content display device according to another embodiment of the present invention.

Referring to FIG. 4, the interactive content display device may include a location sensor 410, a projector 430, a control device 440, and a distance sensor 450. Each component can be connected wirelessly or wired to transfer data.

The location sensor 410 may detect a user's location.

The distance sensor 450 may detect whether the terminal 460 is located within a distance set from the controller 440.

The projector 430, which is an image output unit, is a device capable of projecting various kinds of contents. The content may be at least one of a character, a sign, an image, a video, an emoticon, and an icon.

The controller 440 may automatically receive content stored in the terminal 460 when the terminal 460 is located within a set distance based on the signal sensed by the distance sensor 450. The controller 440 may control the projector 430 to output the received content according to the detected position of the user.
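
One pass of this driving process can be sketched end to end as follows. The component interfaces are hypothetical stubs standing in for the distance sensor, position sensor, terminal, and projector; the flow (detect terminal in range, automatically receive its content, then output at the sensed user position) follows the description.

```python
class StubDistanceSensor:
    def __init__(self, in_range):
        self._in_range = in_range

    def terminal_in_range(self):
        return self._in_range


class StubTerminal:
    def stored_content(self):
        return "photo.jpg"


class StubPositionSensor:
    def user_position(self):
        return (2.0, 1.0)


class StubProjector:
    def __init__(self):
        self.calls = []

    def output(self, content, at):
        self.calls.append((content, at))


def drive(distance_sensor, position_sensor, terminal, projector):
    """One pass of the driving process: when the terminal is within the set
    distance, automatically receive its stored content and project it
    according to the detected position of the user."""
    if not distance_sensor.terminal_in_range():
        return False
    content = terminal.stored_content()         # automatic receive
    user_pos = position_sensor.user_position()  # detected user position
    projector.output(content, at=user_pos)
    return True


proj = StubProjector()
print(drive(StubDistanceSensor(True), StubPositionSensor(), StubTerminal(), proj))
print(proj.calls)
```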

The controller 440 may control the projector 430 to output the received content when the detected user's location is within a set distance from the table or the location where the content is output. The location 420 where the image is output may be a wall or a projector screen.

FIG. 5 is a diagram for describing a driving process of an interactive content display device according to another embodiment of the present invention.

Referring to FIG. 5, the interactive content display device may include a position sensor 510, an extractor 515, a projector 530, and a control device 540. Each component can be connected wirelessly or wired to transfer data.

The location sensor 510 may detect a location of a user.

The extractor 515 may extract an output position 550 corresponding to the position of the user sensed by the position sensor 510. For example, the extractor 515 may extract an output position corresponding to the detected user's position based on matching information where the user's position and the output position match.

 The controller 540 may control the projector 530 to display the set content based on the extracted output position 550. For example, the controller 540 may control the projector 530 to display the set content or the content input from the user on the upper side 561 and the left side 562 of the extracted output position 550.

Using the interactive content display device, the position of the output content can be in harmony with the position of the user, thereby maximizing the aesthetic effect.

FIGS. 6A and 6B are diagrams for describing a driving process of an interactive content display device according to another embodiment of the present invention.

Referring to FIG. 6A, the interactive content display device may include a position sensor 610, a display unit 620, and a control device 630. Each component can be connected wirelessly or wired to transfer data.

The location sensor 610 may detect a location of a user.

The display unit 620 is a device capable of displaying various types of content. The content may be at least one of a character, a sign, an image, a video, an emoticon, and an icon. The display unit 620 may include at least one of a liquid crystal display (LCD), a thin-film-transistor liquid crystal display (TFT LCD), an organic light-emitting diode (OLED) display, a flexible display, a 3D display, and a touch screen in which a display and a sensor for detecting a touch operation (hereinafter, a "touch sensor") form a mutual layer structure.

The controller 630 may control the display unit 620 to output the set content according to the sensed user's position. For example, the controller 630 may control the display unit 620 to output the set content when the detected user location is within a set distance from the table or the display unit 620.

Referring to FIG. 6B, when the user inputs first content to the control device 630 through the user input unit, the control device 630 may control the display unit 620 to output the input first content.

Alternatively, when the user inputs the second content through the terminal (not shown), the terminal (not shown) may transmit the input second content to the control device 630. The controller 630 may control the display 620 to output the second content received from the terminal (not shown). Here, the first content and the second content may be content in the form of a speech bubble 640, but the type of content is not limited thereto.

By using the interactive content display device, not only the shape of the wall surface or the like can be variously changed in a room such as a cafe, but also the user can output the content desired by the user to the wall surface or the like.

FIG. 7 is a diagram for explaining a window that can be displayed on a display unit according to an embodiment of the present invention.

Referring to FIGS. 1 and 7, the control device 140 may display a window 700 on which the user can input content on the second display unit 141.

The terminal 150 may display a window 700 on which the user can input content on the third display unit 151 of the terminal 150. In the present embodiment, the control device 140 and the terminal 150 are described as directly displaying the window 700. However, a dedicated program capable of executing the window may instead be installed on the control device 140 (for example, a PC) and the terminal 150, and the user may display the window 700 on the display unit by executing the dedicated program.

The window 700 may include at least one of: a create icon 710 for creating a new file, a load icon 720 for importing an image, a text insertion icon 730 for inserting text, a save icon 740 for storing information, a projection icon 750 for projecting the displayed information through the projector, a drawing tool icon 760 for selecting the type and thickness of a drawing tool, a color selection icon 770 for selecting a color of the drawing tool, an emoticon insertion icon 780 for inserting an emoticon, and a speech bubble insertion icon 790 for inserting a speech bubble. When an icon is selected by the user, the control device 140 or the terminal 150 may execute the function corresponding to that icon.
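
The icon-to-function dispatch described above can be sketched as a simple mapping. The reference numerals come from FIG. 7, while the handler bodies and the editor-state representation are placeholder assumptions, not the disclosed implementation.

```python
# Map each icon's reference numeral to a handler that updates the editor state.
ICON_HANDLERS = {
    710: lambda s: {**s, "file": "new"},        # create a new file
    720: lambda s: {**s, "image": "imported"},  # import an image
    730: lambda s: {**s, "text": "inserted"},   # insert text
    740: lambda s: {**s, "saved": True},        # save information
    750: lambda s: {**s, "projected": True},    # project via the projector
}


def on_icon_selected(icon_id, state):
    """Execute the function corresponding to the selected icon; unknown icons
    leave the state unchanged."""
    handler = ICON_HANDLERS.get(icon_id)
    return handler(state) if handler else state


state = on_icon_selected(740, {"file": "new"})
print(state)
```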

The window 700 may include an area 800 in which a user may directly input a picture.

The user may conveniently input desired content to the control device 140 and the terminal 150 using the window 700.

The described embodiments may be variously modified by selectively combining all or part of each embodiment.

It should also be noted that the embodiments are for explanation purposes only, and not for the purpose of limitation. In addition, it will be understood by those of ordinary skill in the art that various embodiments are possible within the scope of the technical idea of the present invention.

Further, according to an embodiment of the present invention, the above-described method can be implemented as processor-readable code on a medium on which a program is recorded. Examples of processor-readable media include ROM, RAM, CD-ROM, magnetic tape, floppy disks, and optical data storage, and the code may also be implemented in the form of a carrier wave (e.g., transmission over the Internet).

Claims (19)

A position detection sensor capable of detecting a location of a user; And
And a controller for controlling the image output unit to output the content set according to the detected position of the user.
The method of claim 1,
The control device includes:
And controlling the image output unit to change and output the set content every set period.
The method of claim 1, wherein
Further comprising an extraction unit for extracting the output position corresponding to the detected user's position,
The control device includes:
And the image output unit controls to display the set content based on the extracted output position.
The method of claim 1,
The control device includes:
And when the user inputs the first content through a user input unit, the image output unit controls to output the input first content.
The method of claim 1,
The control device includes:
And displaying the content set in the image output unit on the side wall where the user is located according to the detected position of the user.
6. The device of claim 1, further comprising a terminal configured to transmit, when the user inputs second content through the terminal, the input second content to the control device,
wherein the control device controls the image output unit to output the second content received from the terminal.
7. The device of claim 1, wherein the image output unit includes a projector capable of projecting the content or a first display unit capable of displaying the content.
8. The device of claim 1, further comprising a second display unit connected to the control device,
wherein the control device displays, on the second display unit, a window through which the user can input content.
9. The device of claim 6, wherein the terminal displays, on a third display unit of the terminal, a window through which the user can input content.
10. The device of claim 8 or 9, wherein the window includes an area in which the user can directly draw a picture.
11. The device of claim 8 or 9, wherein the window includes a create icon for creating a new file, a load icon for importing an image, a text insertion icon for inserting text, a save icon for saving information, a projection icon for projecting the displayed information through the projector, a picture tool icon for selecting a type and a thickness of a drawing tool, a color selection icon for selecting a color of the drawing tool, an emoticon insertion icon for inserting an emoticon, and a speech bubble insertion icon for inserting a speech bubble.
12. The device of claim 1, wherein the image output unit is installed on at least one of a wall, a floor, and a ceiling.
13. The device of claim 1, wherein the control device controls the image output unit to output the set content when the detected position of the user is within a set distance from a table or a position at which the content is output.
14. The device of claim 1, wherein the content includes at least one of a character, a sign, an image, a video, an emoticon, and an icon.
15. An interactive content display device comprising:
a position detection sensor configured to detect a position of a user;
a distance detection sensor configured to detect whether a terminal is located within a set distance from a control device; and
the control device, configured to automatically receive content stored in the terminal when the terminal is located within the set distance, and to control an image output unit to output the received content according to the detected position of the user.
16. The device of claim 15, further comprising an extraction unit configured to extract an output position corresponding to the detected position of the user,
wherein the control device controls the image output unit to display the received content based on the extracted output position.
17. The device of claim 15, wherein, when the user inputs first content through a user input unit, the control device controls the image output unit to output the input first content.
18. The device of claim 15, further comprising a terminal configured to transmit, when the user inputs second content through the terminal, the input second content to the control device,
wherein the control device controls the image output unit to output the second content received from the terminal.
19. The device of claim 15, wherein the control device controls the image output unit to output the received content when the detected position of the user is within a set distance from a table or a position at which the content is output.
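As a rough engineering sketch, not part of the patent, the control loop recited in claims 1-3 and 13 (detect the user's position, gate output on a set distance, extract a corresponding output position, and cycle the set content each period) might be implemented as follows. All class and method names here are illustrative assumptions; the patent does not specify any implementation.

```python
class PositionSensor:
    """Stub position detection sensor; returns a fixed (x, y) user position."""
    def __init__(self, position):
        self.position = position

    def detect(self):
        return self.position


class ImageOutputUnit:
    """Stub image output unit (projector or display); records what it shows."""
    def __init__(self):
        self.shown = []

    def show(self, content, position):
        self.shown.append((content, position))


class ControlDevice:
    """Illustrative sketch of the claimed control device (claims 1-3, 13)."""
    def __init__(self, sensor, output, contents, set_distance=1.5):
        self.sensor = sensor
        self.output = output
        self.contents = contents          # cycled every set period (claim 2)
        self.set_distance = set_distance  # trigger distance (claim 13)
        self._index = 0

    def _extract_output_position(self, user_pos):
        # Extraction unit (claim 3): map the user's floor position to a
        # point on the nearest wall; here we simply keep the x coordinate.
        x, _y = user_pos
        return (x, 0.0)

    def tick(self):
        """One control cycle: detect the user, gate by distance, display."""
        x, y = self.sensor.detect()
        if (x * x + y * y) ** 0.5 > self.set_distance:
            return None  # user too far away: output nothing (claim 13)
        content = self.contents[self._index % len(self.contents)]
        self._index += 1  # the next period shows the next content (claim 2)
        self.output.show(content, self._extract_output_position((x, y)))
        return content


# Usage: a nearby user triggers output; successive ticks cycle the content.
output = ImageOutputUnit()
device = ControlDevice(PositionSensor((0.5, 0.5)), output, ["greeting", "advert"])
print(device.tick())  # -> greeting
print(device.tick())  # -> advert
```

The distance gate and the period-based content cycling are kept deliberately simple; a real device would poll the sensor on a timer and map floor coordinates to the wall, floor, or ceiling surface recited in claim 12.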
KR1020110144581A 2011-12-28 2011-12-28 Apparatus for displaying interactive content KR20130076131A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
KR1020110144581A KR20130076131A (en) 2011-12-28 2011-12-28 Apparatus for displaying interactive content

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
KR1020110144581A KR20130076131A (en) 2011-12-28 2011-12-28 Apparatus for displaying interactive content

Publications (1)

Publication Number Publication Date
KR20130076131A true KR20130076131A (en) 2013-07-08

Family

ID=48989797

Family Applications (1)

Application Number Title Priority Date Filing Date
KR1020110144581A KR20130076131A (en) 2011-12-28 2011-12-28 Apparatus for displaying interactive content

Country Status (1)

Country Link
KR (1) KR20130076131A (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2020022569A1 (en) * 2018-07-27 2020-01-30 (주)휴맥스 Smart projector and control method therefor
GB2607569A (en) * 2021-05-21 2022-12-14 Everseen Ltd A user interface system and method


Similar Documents

Publication Publication Date Title
US11494000B2 (en) Touch free interface for augmented reality systems
CN107810470B (en) Portable device and method for changing screen thereof
US9507424B2 (en) User location-based display method and apparatus
US20180292907A1 (en) Gesture control system and method for smart home
US9430041B2 (en) Method of controlling at least one function of device by using eye action and device for performing the method
RU2010153555A (en) PROVISION OF MULTI-LEVEL OF CONTEXT FOR CONTENT USED ON COMPUTERS AND PLAYERS OF MULTIMEDIA DATA
KR20110066901A (en) User interface device, user interface method, and recording medium
KR101413649B1 (en) Touch table top display apparatus for multi-user
US20150312559A1 (en) Display device, control method, and control program
US9691358B2 (en) Electronic apparatus and method for outputting content
KR20170057823A (en) Method and electronic apparatus for touch input via edge screen
KR20140116240A (en) Server apparatus, game control method 0f server apparatus, mobile apparatus, control method of mobile apparatus, display apparatus and game image display method of display apparatus
US8352267B2 (en) Information processing system and method for reading characters aloud
TW201604720A (en) Electronic device and recording medium
JP6297484B2 (en) Control elements or items based on gestures
CN109714647B (en) Information processing method and device
KR101181740B1 (en) Smart show window
KR20210005041A (en) Modal control initiation technique based on hand position
KR20130076131A (en) Apparatus for displaying interactive content
CN103752010A (en) Reality coverage enhancing method used for control equipment
KR102317619B1 (en) Electronic device and Method for controling the electronic device thereof
US20150262013A1 (en) Image processing apparatus, image processing method and program
JP4611416B2 (en) Teaching materials
KR20130083111A (en) Image display apparatus and method for operating the same
TWI430170B (en) Electronic billboard

Legal Events

Date Code Title Description
A201 Request for examination
E902 Notification of reason for refusal
E601 Decision to refuse application