EP2735164A1 - Content playing method and apparatus - Google Patents

Content playing method and apparatus

Info

Publication number
EP2735164A1
Authority
EP
European Patent Office
Prior art keywords
content
user
location
virtual viewpoint
space
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Withdrawn
Application number
EP12814483.9A
Other languages
English (en)
French (fr)
Other versions
EP2735164A4 (de)
Inventor
Sang Keun Jung
Hyun Cheol Park
Moon Sik Jeong
Kyung Sun Cho
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Samsung Electronics Co Ltd
Original Assignee
Samsung Electronics Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Samsung Electronics Co Ltd filed Critical Samsung Electronics Co Ltd
Publication of EP2735164A1
Publication of EP2735164A4
Status: Withdrawn

Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 13/00 Stereoscopic video systems; Multi-view video systems; Details thereof
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/011 Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 15/00 3D [Three Dimensional] image rendering
    • G06T 15/10 Geometric effects
    • G06T 15/20 Perspective computation
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 19/00 Manipulating 3D models or images for computer graphics
    • G06T 19/006 Mixed reality
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 13/00 Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N 13/20 Image signal generators
    • H04N 13/275 Image signal generators from 3D object models, e.g. computer-generated stereoscopic image signals
    • H04N 13/279 Image signal generators from 3D object models, e.g. computer-generated stereoscopic image signals, the virtual viewpoint locations being selected by the viewers or determined by tracking
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 13/00 Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N 13/30 Image reproducers
    • H04N 13/366 Image reproducers using viewer tracking
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04S STEREOPHONIC SYSTEMS
    • H04S 3/00 Systems employing more than two channels, e.g. quadraphonic

Definitions

  • the present invention relates generally to a content playing method and apparatus, and more particularly, to a method for playing content corresponding to a location of a user and an apparatus thereof.
  • the demand for Three-Dimensional (3D) image technology has increased and, with the more common use of digital broadcasting, the use of stereoscopic images in 3D TVs and 3D information terminals is being actively researched.
  • the stereoscopic image implemented through 3D technology is formed by the principle of stereo vision as experienced through two eyes. Because the two eyes are spaced apart from each other by approximately 65 mm, binocular parallax acts as a main factor of depth perception.
  • the left and right eyes view different stereoscopic images.
  • the two different stereoscopic images are transferred to the brain through the retinas, and the brain combines the two images such that the user experiences the depth of the stereoscopic image.
  • although a 3D TV is capable of showing a 3D image having a fixed viewpoint regardless of the location of a user, it cannot provide a realistic image in which the user feels present in the displayed scene.
  • the present invention has been made to solve the above-mentioned problems occurring in the prior art, and provides a method for playing realistic content stimulating at least one of the senses of a user, corresponding to the location of the user, and an apparatus thereof.
  • a content playing method including determining a first location of a user; mapping a content space displayed on a display unit to correspond with an actual space in which the user is present, based on the determined first location; determining a virtual viewpoint in the content space corresponding to a second location of the user; and playing content corresponding to the determined virtual viewpoint.
  • a content playing apparatus including a content collecting unit for collecting content stimulating senses of a user; a content processor for performing a processing operation such that content input from the content collecting unit is played; a content playing unit for playing content input from the content collecting unit; a sensor for collecting information associated with a location of the user such that the content is played corresponding to the location of the user; and a controller for determining a virtual viewpoint in a virtual content space corresponding to the location of the user based on information received from the sensor, and controlling such that content corresponding to the determined virtual viewpoint is played.
  • FIG. 1 is a block diagram illustrating a configuration of a content playing apparatus according to an embodiment of the present invention.
  • FIG. 2 is a diagram illustrating a three-dimensional coordinate system according to an embodiment of the present invention.
  • FIGS. 3A, 3B, and 3C are diagrams illustrating a perspective view, a plan view, and a side view, respectively, of an actual space according to an embodiment of the present invention.
  • FIGS. 4A and 4B are diagrams illustrating a perspective view and a plan view, respectively, of a virtual content space according to an embodiment of the present invention.
  • FIG. 5 is a block diagram illustrating a realistic type playing unit according to an embodiment of the present invention.
  • FIG. 6 is a diagram illustrating a space mapping method according to an embodiment of the present invention.
  • FIG. 7 is a diagram illustrating a space mapping method according to an embodiment of the present invention.
  • FIG. 8 is a diagram illustrating a virtual viewpoint determining method according to an embodiment of the present invention.
  • FIG. 9 is a diagram illustrating an angle control method of a virtual camera according to an embodiment of the present invention.
  • FIG. 10 is a diagram illustrating a stereoscopic image control method according to an embodiment of the present invention.
  • FIG. 11 is a diagram illustrating a stereoscopic sound control method according to an embodiment of the present invention.
  • FIG. 12 is a block diagram illustrating a network of a home network system to which a content playing apparatus is applied according to an embodiment of the present invention.
  • FIG. 13 is a flowchart illustrating a content playing method according to an embodiment of the present invention.
  • FIG. 14 is a flowchart illustrating a content playing method according to another embodiment of the present invention.
  • the term “content” refers to anything played to stimulate the senses of a user, such as the sense of sight, the sense of hearing, and the sense of touch.
  • the content may be, for example, an image, light, a voice, or wind.
  • realistic type playback refers to playing content corresponding to a location of the user. That is, the user may experience the same content differently at different locations. For example, when the content is a car displayed on a screen, either a front surface or a side surface of the car is viewed depending on the user's location.
  • the content playing method and apparatus are applicable to an electronic device having a function of playing content stimulating the senses of the user.
  • the content playing method and apparatus are applicable to a notebook computer, a desktop PC, a tablet PC, a smart phone, a High Definition TeleVision (HDTV), a smart TV, a 3-Dimensional (3D) TV, an Internet Protocol Television (IPTV), a stereoscopic sound system, a theater system, a home theater, a home network system and the like.
  • the content playing method and apparatus provide a function for tracking location variation of the user, and a function for realistically playing content corresponding to the tracked location of the user.
  • the content playing method and apparatus may provide a function that receives content, such as images, from a content provider through a Local Area Network (LAN), wireless LAN, or a 3rd Generation (3G) or 4th Generation (4G) wireless communication network, stores the received images in a database, and plays them in real time.
  • the images may include stereoscopic images.
  • the stereoscopic image may be a 3D movie, 3D animation, or 3D computer graphics. Further, the stereoscopic image may be multimedia combined with a stereoscopic sound.
  • FIG. 1 is a block diagram illustrating a configuration of a content playing apparatus according to an embodiment of the present invention. It is assumed that the content playing apparatus of FIG. 1 is a 3D TV which enables content to appear to exist in a space between a screen and the user.
  • a content playing apparatus 100 according to an embodiment of the present invention includes an input unit 110, a remote controller 120, a remote controller receiver 125, a sensor 130, a content collecting unit 140, a content processor 150, a sound output unit 161, an image display unit 162, a memory 170, an interface unit 180, and a controller 190.
  • the input unit 110 may include a plurality of input keys and function keys for receiving input of numeral or character information, and for setting various functions.
  • the function keys may include arrow keys, side keys, and hot keys set to perform predetermined functions.
  • the input unit 110 creates and transfers a key event associated with user setting and function control of the content playing apparatus 100.
  • the key event may include a power on/off event, a volume control event, a screen on/off event, etc.
  • the controller 190 controls the foregoing elements in response to the key event.
  • the remote controller 120 creates various key events for operating the content playing apparatus 100, and converts the created key event into a wireless signal, and transmits the wireless signal to the remote controller receiver 125.
  • the remote controller 120 of the present invention may create a start event for requesting realistic type playback and a termination event for terminating the realistic type playback.
  • the realistic type playback may be defined to play content corresponding to a location of the user.
  • the remote controller receiver 125 converts a received wireless signal into an original key event, and transfers the original key event to the controller 190.
  • the sensor 130 collects information associated with the location of the user such that the location of the user can be tracked, and transfers the collected information to the controller 190.
  • the sensor 130 may be implemented by an image sensor or an optical sensor for sensing light of a predetermined wavelength, such as infrared light.
  • the sensor 130 converts a sensed physical quantity into an electric signal, and an Analog to Digital Converter (ADC) converts the electric signal into data and transfers the data to the controller 190.
  • the content collecting unit 140 performs a function for collecting content stimulating the senses of the user. Specifically, the content collecting unit 140 performs a function for collecting images and sounds from a network or a peripheral device. That is, the content collecting unit 140 may include a broadcasting receiver 141 and an Internet communication unit 142. Specifically, the broadcasting receiver 141 selects one from a plurality of broadcasting channels, and demodulates a broadcasting signal of the selected broadcasting channel into the original broadcasting content.
  • the Internet communication unit 142 includes a wired modem or a wireless modem for receiving various information for home shopping, home banking, on-line gaming, and MP3 use, together with additional information with respect thereto.
  • the Internet communication unit 142 may include a mobile communication module (e.g., 3G mobile communication module, 3.5G mobile communication module, and 4G mobile communication module) and a near distance communication module (e.g., a Wi-Fi module).
  • the content processor 150 performs a processing function to play content from the content collecting unit 140. Specifically, the content processor 150 classifies input content into a stereoscopic image and a stereoscopic sound.
  • the content processor 150 may include a sound processor for decoding and outputting the classified stereoscopic sound to the sound output unit 161, and an image processor for decoding the classified stereoscopic image into a left image and a right image and outputting the left image and the right image to the image display unit 162. Further, the content processor 150 may compress and transfer input content to the controller 190 under the control of the controller 190. Accordingly, the controller 190 transfers the compressed content to the memory 170.
  • the sound processor may control a direction or a distance of a stereoscopic sound according to the location of the user.
  • the sound processor may change a type of the sound output from the sound output unit 161 according to the location of the user, or change the volume according to a type of the sound.
  • the image processor may control brightness, stereoscopic sensation, and depth according to the location of the user.
  • the content playing unit 160 performs a function for playing content processed by the content processor 150.
  • the content playing unit 160 may include a sound output unit 161 and an image display unit 162.
  • the sound output unit 161 outputs a decoded stereoscopic sound, and includes a plurality of speakers, for example, 5.1 channel speakers.
  • the image display unit 162 displays a stereoscopic image.
  • the image display unit 162 displays a stereoscopic image with depth, as if the stereoscopic image actually exists in a three-dimensional space between the image display unit 162 and the user, through a display unit for displaying a stereoscopic image and a 3D implementing unit for allowing the user to experience depth with respect to a displayed stereoscopic image.
  • the display unit may be implemented as a Liquid Crystal Display (LCD), Organic Light Emitting Diodes (OLED), or Active Matrix Organic Light Emitting Diodes (AMOLED).
  • the 3D implementing unit is a structural element, formed in a stacked structure with the display unit, that causes different images to be recognized by the left and right eyes.
  • the 3D implementing scheme is divided into a glass scheme and a glass-free scheme.
  • the glass scheme includes a color filter scheme, a polarizing filter scheme, and a shutter glass scheme.
  • the glass-free scheme includes a lenticular lens scheme and a parallax barrier scheme. Because the 3D implementing schemes are known in the art, a detailed description thereof is omitted.
  • the memory 170 stores programs and data necessary for an operation of the content playing apparatus 100.
  • the memory 170 may be configured by a volatile storage medium, a nonvolatile storage medium, or a combination thereof.
  • the volatile storage medium includes a semiconductor memory such as RAM, DRAM, or SRAM.
  • the non-volatile storage medium may include a hard disk.
  • the memory 170 may be divided into a data area and a program area. Specifically, the data area of the memory 170 may store data created by the controller 190 according to use of the content playing apparatus 100. The data area may store content compressed in a predetermined format provided from the controller 190.
  • the program area of the memory 170 may store an Operating System (OS) for booting the content playing apparatus 100 and operating its respective elements, and applications for supporting various user functions, for example, a web browser for accessing an Internet server, an MP3 playback function for playing sound sources, an image output function for playing photographs, and a moving picture playback function.
  • the program area of the present invention may store a realistic type playback program.
  • the realistic type playing program may include a routine for determining an initial location of a user, a routine for mapping an actual space to a content space based on the initial location, a routine for tracking location variation, a routine for determining a virtual viewpoint in the content space corresponding to the location of the user, and a routine for playing content corresponding to the virtual viewpoint in the content space.
  • the initial location is defined as a reference value for mapping a content space to an actual space.
  • the actual space is a 3D space in which the user and the display unit are located.
  • the content space is a virtual space in which content displayed through a display unit exists.
  • the virtual viewpoint is defined as the viewpoint of the user in a content space mapped to an actual space.
  • the interface unit 180 performs a function for connecting the content playing apparatus 100 with a peripheral device in a wired or wireless scheme.
  • the interface unit 180 may include a Zigbee® module, a Wi-Fi module, or a Bluetooth® module.
  • the interface unit 180 may receive and transfer a control signal for realistic type playback from the controller 190 to a peripheral device. That is, the controller 190 may control the peripheral device through the interface unit 180.
  • the peripheral device may be a home network device, a stereoscopic sound device, a lamp, an air conditioner, or a heater.
  • the controller 190 may control the peripheral device to play content stimulating the senses of the user, for example, the sense of touch, the sense of sight, and the sense of smell.
  • the controller 190 may control an overall operation of the content playing apparatus 100, and signal flow between internal structural elements of the content playing apparatus 100. Further, the controller 190 may control power supplied from a battery to the internal elements. Moreover, the controller 190 may execute various applications stored in the program area. Specifically, in the present invention, if a start event for realistic type playback is sensed, the controller 190 may execute the foregoing realistic type playback program. That is, if the realistic type playback program is executed, the controller 190 determines an initial location of the user and tracks location variation of the user. Further, the controller 190 maps a content space to an actual space based on the initial location, determines a virtual viewpoint corresponding to the tracked location, and controls the content processor 150 to play content corresponding to the virtual viewpoint. Furthermore, the controller 190 may control a peripheral device through the interface unit 180 to play content corresponding to the virtual viewpoint. The realistic type playback function of the controller 190 will be described in detail below.
  • FIG. 2 is a diagram illustrating a three dimensional coordinate system according to an embodiment of the present invention.
  • a method for expressing a 3D space according to the present invention may use a three-dimensional coordinate system.
  • in FIG. 2, a solid line expresses a positive value, and a dotted line expresses a negative value.
  • coordinates of the user in an actual space at time t are expressed as (x_{u,t}, y_{u,t}, z_{u,t}).
  • coordinates of a camera in a content space at time t are expressed as (x_{c,t}, y_{c,t}, z_{c,t}).
  • FIGS. 3A, 3B, and 3C are diagrams illustrating a perspective view, a plan view, and a side view, respectively, of an actual space according to an embodiment of the present invention.
  • a central point of a screen 301 of a display unit in an actual space is set to the origin (0,0,0) of the coordinate system.
  • a right direction and a left direction of the central point 302 become a positive direction and a negative direction of the X_u axis, respectively, with reference to the direction in which the user views the screen.
  • an upward direction and a downward direction of the central point 302 become a positive direction and a negative direction of the Y_u axis, respectively.
  • a direction from the screen 301 toward the user 303 becomes a positive direction of the Z_u axis, and the opposite direction becomes a negative direction of the Z_u axis.
  • the location of the user may be expressed as (x_u, y_u, z_u).
  • a horizontal length of the screen 301 may be expressed as Display Screen Width (DSW), a vertical length of the screen 301 as Display Screen Height (DSH), and a straight distance between the screen 301 and the user 303 as Watching Distance (WD).
  • FIGS. 4A and 4B are diagrams illustrating a perspective view and a plan view, respectively, of a virtual content space according to an embodiment of the present invention.
  • a virtual camera described herein is not a real camera, but represents the user's viewpoint in a content space corresponding to the user in an actual space.
  • a focus 402 on a focus surface 401 in a content space is set to the origin (0,0,0) of the coordinate system.
  • a right direction and a left direction of the focus 402 become a positive direction and a negative direction of the X_c axis, respectively, based on the direction in which the virtual camera 403 faces the focus surface 401.
  • an upward direction and a downward direction of the focus 402 become a positive direction and a negative direction of the Y_c axis, respectively.
  • a direction from the focus surface 401 toward the virtual camera 403 becomes a positive direction of the Z_c axis, and the opposite direction becomes a negative direction of the Z_c axis.
  • the location of the virtual camera 403 may be expressed as (x_c, y_c, z_c).
  • a horizontal length of the focus surface 401 may be expressed as Focal Width (FW), a vertical length of the focus surface 401 as Focal Height (FH), and a straight distance between the focus surface 401 and the virtual camera 403 as Focal Length (FL).
  • the FL may be set by the user.
  • the size of the focus surface 401 may be set by adjusting an angle of the camera. That is, because the virtual camera 403 is virtual, the focal length and the angle of the camera may be set as needed.
  • FIG. 5 is a block diagram illustrating a realistic playing unit according to an embodiment of the present invention.
  • the realistic type playing unit 500 may be configured inside the controller 190 or be configured separately. Hereinafter, it is assumed that the realistic playing unit 500 is configured inside the controller 190.
  • the realistic playing unit 500 of the present invention, configured inside the controller 190, may include a tracker 510, an initiator 520, a space mapper 530, a virtual viewpoint determinator 540, and a content processing controller 550.
  • the tracker 510 tracks a location of a user 303. That is, the tracker 510 tracks the coordinates (x_u, y_u, z_u) of the user using data received from the sensor 130.
  • the tracker 510 detects characteristic information, for example, a face of the user 303, from the received sensing information, and determines a central point of the detected face as the coordinates (x_u, y_u, z_u) of the user 303.
  • z_u, namely WD, may be computed using the size of the detected face.
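  • the patent does not reproduce the formula for this computation; the following is a minimal sketch of one standard approach, the pinhole-camera approximation, in which the reference face width, the sensor focal length, and the image resolution are assumed calibration constants, not values from the patent.

```python
# Sketch (not from the patent): estimating the watching distance WD (z_u)
# from the pixel width of a detected face via the pinhole-camera relation
# distance = focal_length_px * real_width_m / width_px.
REF_FACE_WIDTH_M = 0.15   # assumed average face width in meters
FOCAL_LENGTH_PX = 1000.0  # assumed sensor focal length in pixels

def estimate_watching_distance(face_width_px: float) -> float:
    """Approximate straight-line screen-to-user distance (WD) in meters."""
    return FOCAL_LENGTH_PX * REF_FACE_WIDTH_M / face_width_px

def face_to_user_coords(cx_px: float, cy_px: float, face_width_px: float,
                        img_w: int = 1920, img_h: int = 1080) -> tuple:
    """Map a detected face center to (x_u, y_u, z_u) in the screen-centered
    coordinate system of FIG. 3 (x right, y up, z toward the user)."""
    z_u = estimate_watching_distance(face_width_px)
    # Convert pixel offsets from the image center to meters at distance z_u.
    x_u = (cx_px - img_w / 2) * z_u / FOCAL_LENGTH_PX
    y_u = (img_h / 2 - cy_px) * z_u / FOCAL_LENGTH_PX
    return (x_u, y_u, z_u)
```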
  • the initiator 520 determines an initial location of the user as the reference value for mapping a content space to an actual space. That is, if a start event for realistic type playback is sensed, the initiator 520 determines the coordinates input from the tracker 510 as an initial location of the user. Specifically, if the location of the user 303 does not change beyond a preset error after watching of the content starts, the initiator 520 may determine the location of the user 303 as the initial location. Further, if a predetermined key value is input from the remote controller 120, the initiator 520 may determine the location of the user at the input time of the predetermined key value as the initial location.
  • similarly, the initiator 520 may determine the location of the user at the time a predetermined gesture is detected as the initial location. To do this, the tracker 510 may detect a predetermined gesture of the user, for example, an action of lowering a hand after lifting it, using a template matching scheme. If the predetermined gesture is detected, the tracker 510 informs the initiator 520 of this.
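  • as a rough illustration of the two triggers just described (a location that stays within the preset error, or an explicit key/gesture event), the sketch below shows one way the initiator logic could look; the stability radius and sample count are assumed values, not parameters from the patent.

```python
import math

STABILITY_RADIUS_M = 0.05  # assumed "preset error" for a stationary user
STABILITY_FRAMES = 90      # assumed number of consecutive stable samples

class Initiator:
    """Determines the initial (reference) location of the user."""

    def __init__(self):
        self.history = []
        self.initial_location = None

    def feed(self, user_xyz, key_event=False):
        # An explicit remote-controller key or detected gesture fixes the
        # initial location immediately.
        if key_event:
            self.initial_location = user_xyz
            return self.initial_location
        # Otherwise wait until the tracked location stays within the preset
        # error radius for a number of consecutive samples.
        self.history = (self.history + [user_xyz])[-STABILITY_FRAMES:]
        if len(self.history) == STABILITY_FRAMES:
            first = self.history[0]
            if all(math.dist(first, p) <= STABILITY_RADIUS_M
                   for p in self.history):
                self.initial_location = self.history[-1]
        return self.initial_location
```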
  • FIG. 6 is a diagram illustrating a space mapping method according to an embodiment of the present invention.
  • the following Equation (1) expresses a relative ratio of the content space coordinate system to the actual space coordinate system in a space mapping method according to an embodiment of the present invention.
  • t_0 represents the time point at which the initiator 520 determines the initial location.
  • the space mapper 530 maps the content space 603 to the actual space 602 based on the initial location of the user 601. That is, the space mapper 530 determines FW, FH, and FL, and then computes X_Ratio, Y_Ratio, and Z_Ratio (at t_0) as illustrated in Equation (1).
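  • Equation (1) itself is rendered as an image in the original publication and is not reproduced in this text. Given the definitions of DSW, DSH, WD, FW, FH, and FL above, one plausible reading is a per-axis ratio of the content-space extent to the actual-space extent, anchored at the initial watching distance; the sketch below is that assumed reconstruction, not the patent's verbatim formula.

```python
def compute_space_ratios(FW, FH, FL, DSW, DSH, WD_t0):
    """Assumed reconstruction of Equation (1): per-axis scale factors that
    map actual-space coordinates onto content-space coordinates.
    WD_t0 is the watching distance at the initial time t_0."""
    x_ratio = FW / DSW
    y_ratio = FH / DSH
    z_ratio = FL / WD_t0
    return (x_ratio, y_ratio, z_ratio)
```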
  • FIG. 7 is a diagram illustrating a space mapping method according to an embodiment of the present invention.
  • the following Equation (2) expresses a relative ratio of the content space coordinate system to the actual space coordinate system in a space mapping method according to an embodiment of the present invention.
  • DSW, DSH, and WD are values depending on the size of the display unit and the actual space.
  • the space mapper 530 may add or subtract a predetermined adjustment to or from the foregoing values to extend or shorten the actual space mapped to the content space.
  • the space mapper 530 may control the size of displayed content using the adjustment.
  • the space mapper 530 may receive the adjustment from the remote controller 120 through the remote controller receiver 125 at any time, before or during realistic type playback.
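  • Equation (2) is likewise not reproduced in this text; a plausible reading consistent with the description is Equation (1) with the adjustment applied to the actual-space extents, as sketched below (the per-axis split and the sign convention are assumptions).

```python
def compute_adjusted_ratios(FW, FH, FL, DSW, DSH, WD_t0,
                            adj_x=0.0, adj_y=0.0, adj_z=0.0):
    """Assumed reading of Equation (2): the ratios of Equation (1) with a
    user-supplied adjustment added to (or, if negative, subtracted from)
    the actual-space extents, extending or shortening the mapped space and
    thereby scaling the displayed content."""
    return (FW / (DSW + adj_x),
            FH / (DSH + adj_y),
            FL / (WD_t0 + adj_z))
```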
  • FIG. 8 is a diagram illustrating a virtual viewpoint determining method according to an embodiment of the present invention.
  • the following Equation (3) is the calculation equation of the virtual viewpoint determining method.
  • the virtual viewpoint determinator 540 receives the coordinates (x_{u,t+1}, y_{u,t+1}, z_{u,t+1}) of the user 801 from the tracker 510, and receives the coordinate transformation values, namely X_Ratio, Y_Ratio, and Z_Ratio, from the space mapper 530. As illustrated in Equation (3), the virtual viewpoint determinator 540 computes the coordinates (x_{c,t+1}, y_{c,t+1}, z_{c,t+1}) of the virtual camera 802 mapped to the coordinates of the user 801 using the received information.
  • a Minimum Transition Threshold (MTT) for moving a location of the camera may be set in advance in the present invention.
  • the MTT may be an option item that the user may directly set.
  • if the location variation of the user exceeds the MTT, the virtual viewpoint determinator 540 may compute the coordinates of the virtual camera 802.
  • the MTT may be differently set for X, Y, Z axes.
  • the MTT for the Z axis may be set to have the greatest value.
  • when the location variation for an axis exceeds the corresponding MTT, the coordinates of the virtual camera 802 are computed accordingly.
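  • Equation (3) is also not reproduced in this text; the sketch below assumes the straightforward reading that the virtual camera follows the user, with each axis of the user's displacement scaled by the corresponding ratio, and applies the per-axis MTT gate described above. The threshold values are illustrative only.

```python
# Assumed per-axis minimum transition thresholds in meters; the Z threshold
# is the largest, as the description suggests.
MTT = {"x": 0.02, "y": 0.02, "z": 0.10}

def update_virtual_camera(user_xyz, prev_user_xyz, prev_cam_xyz, ratios):
    """Assumed reconstruction of Equation (3): the camera displacement on
    each axis is the user displacement scaled by that axis's ratio; moves
    smaller than the MTT are ignored to suppress jitter."""
    cam = list(prev_cam_xyz)
    for i, axis in enumerate(("x", "y", "z")):
        delta = user_xyz[i] - prev_user_xyz[i]
        if abs(delta) > MTT[axis]:
            cam[i] = prev_cam_xyz[i] + delta * ratios[i]
    return tuple(cam)
```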
  • FIG. 9 is a diagram illustrating an angle control method of a virtual camera according to an embodiment of the present invention.
  • the virtual viewpoint determinator 540 calculates angle variation amounts θ (θ_X, θ_Y, θ_Z) of the user 902 based on a central point 901 of a screen.
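  • the exact angle formulas are not given in this text; a minimal sketch, assuming the angles are the user's angular offsets from the screen normal as seen from the central point, is shown below (θ_Z, presumably roll, is not reconstructed here).

```python
import math

def view_angles(user_xyz):
    """Sketch of the FIG. 9 angle computation: the user's angular offset
    around the Y axis (theta_x) and around the X axis (theta_y), measured
    from the screen center at the origin."""
    x, y, z = user_xyz
    theta_x = math.degrees(math.atan2(x, z))  # horizontal offset angle
    theta_y = math.degrees(math.atan2(y, z))  # vertical offset angle
    return (theta_x, theta_y)
```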
  • FIG. 10 is a diagram illustrating a stereoscopic image control method according to an embodiment of the present invention.
  • the content processing controller 550 receives a virtual viewpoint, namely, the coordinates (x_{c,t+1}, y_{c,t+1}, z_{c,t+1}) of a virtual camera, from the virtual viewpoint determinator 540. Further, the content processing controller 550 may receive an angle adjustment of the virtual viewpoint, namely, an angle control value θ of the virtual camera, from the virtual viewpoint determinator 540. Moreover, the content processing controller 550 controls the content processor 150 based on the received information to adjust brightness, stereoscopic sensation, and depth of a stereoscopic image. Referring to FIG. 10, if an initial location 1001 of a user is determined, the content processing controller 550 performs a control operation to display the object toward which the virtual camera is oriented at the initial camera location 1002.
  • as the location of the user moves 1001 -> 1003 -> 1005, the location of the virtual camera in the content space mapped to the actual space moves 1002 -> 1004 -> 1006. Accordingly, another part of the object is displayed according to the location variation of the virtual camera. If the user moves from 1005 to 1007, the camera moves from 1006 to 1008. When the camera is located at 1008, because the object falls outside the angle of view of the virtual camera, it is no longer displayed. However, the content processing controller 550 rotates the camera in the direction of the object 1009 by an angle control value θ to continuously display the object 1009.
  • FIG. 11 is a diagram illustrating a stereoscopic sound control method according to an embodiment of the present invention.
  • the content processing controller 550 controls the content processor 150 to adjust a direction and a distance of a stereoscopic sound.
  • for example, the content processing controller 550 may adjust the distance such that the sound of a car is gradually increased and the user experiences the car approaching.
  • conversely, the content processing controller 550 may adjust the distance such that the sound of the car is gradually reduced and the user experiences the car leaving.
  • according to the location of the car relative to the virtual viewpoint, the content processing controller 550 controls the direction of the stereoscopic sound such that the sound of the car is output from a front speaker and a center speaker, or from a rear speaker.
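  • the patent does not give the mixing rule; purely as an illustration, the sketch below scales the overall volume with the inverse of the source-to-viewpoint distance and weights the front or rear speakers by where the source lies relative to the virtual camera.

```python
import math

def stereo_sound_gains(source_xyz, cam_xyz):
    """Illustrative distance/direction control for a 5.1-style setup:
    a nearer source is louder, and a source ahead of the virtual camera
    favors the front and center speakers over the rear speakers."""
    dist = max(math.dist(source_xyz, cam_xyz), 0.1)  # avoid division by zero
    base = 1.0 / dist
    ahead = source_xyz[2] < cam_xyz[2]  # smaller z_c: nearer the focus surface
    return {
        "front": base if ahead else 0.2 * base,
        "center": base if ahead else 0.2 * base,
        "rear": 0.2 * base if ahead else base,
    }
```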
  • the foregoing content playing apparatus 100 may further include elements that are not mentioned above, such as a camera, a microphone, and a GPS receiving module. Because structural elements can be variously changed according to the convergence trend of digital devices, not every possible element can be listed here; however, the content playing apparatus 100 may include structural elements equivalent to the foregoing structural elements. Further, specific elements of the content playing apparatus 100 of the present invention may be substituted by other structures according to its provided form. This can be easily understood by those skilled in the art.
  • FIG. 12 is a block diagram illustrating a network of a home network system to which a content playing apparatus is applied according to an embodiment of the present invention.
  • a home network system may include a content playing apparatus 1200, a home network server 1210, and a plurality of home network devices.
  • the content playing apparatus may include the foregoing structural elements.
  • the content playing apparatus 1200 may communicate with a home network server 1210 in a communication scheme such as a ZigBee scheme, a Bluetooth® scheme, or a Wireless LAN scheme. If a start event for realistic type playback is sensed, the content playing apparatus 1200 executes a realistic type playing program.
  • the content playing apparatus 1200 may control the home network device to play the content corresponding to a location of the user.
  • the home network server 1210 controls home network devices.
  • the home network server 1210 may drive home network devices under the control of the content playing apparatus 1200.
  • the home network devices have unique addresses, respectively, and are controlled through the addresses by the home network server 1210.
  • the home network device plays content stimulating senses of the user under the control of the content playing apparatus 1200.
  • the home network devices may include an air conditioner 1220, a humidifier 1230, a heater 1240, and a lamp 1250.
  • the content playing apparatus 1200 may control the air conditioner 1220, the humidifier 1230, the heater 1240, and the lamp 1250 according to location variation of the user to adjust an intensity of wind, peripheral brightness, temperature, and humidity. For example, referring to FIG. 11, when the user is located at 1102, the content playing apparatus 1200 may increase the intensity of wind such that the user experiences the car being near.
  • accordingly, the user may be stimulated through the sense of touch and experience being present in a real content space.
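  • a hypothetical sketch of the wind-control path described above is shown below; `home_server` and its `send` method stand in for whatever command interface the home network server exposes and are assumptions, not an API from the patent.

```python
import math

def adjust_environment(user_xyz, car_xyz, home_server):
    """Hypothetical sketch: as the virtual car nears the user, raise the
    air conditioner's fan level so the user feels wind on the skin."""
    dist = math.dist(user_xyz, car_xyz)
    # Map distances in [0.5 m, 5.0 m] onto fan levels in [1.0, 0.0].
    level = max(0.0, min(1.0, (5.0 - dist) / 4.5))
    home_server.send(device="air_conditioner", command="fan_level",
                     value=level)
```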
  • FIG. 13 is a flowchart illustrating a content playing method according to an embodiment of the present invention.
  • the controller 190 may initially be in an idle state.
  • the idle state may be defined as a state of displaying an image before the realistic type playback step. If the user operates the remote controller 120 in the idle state, the controller 190 senses a start event for realistic playback. As described previously, if a start event is sensed, the initiator 520 determines the coordinates input from the tracker 510 as an initial location of the user in step 1301. The tracker 510 may track the location of the user even before the realistic type playing step.
  • a space mapper 530 maps a content space to an actual space based on the determined initial location in step 1302.
  • a virtual viewpoint determinator 540 computes location variation amounts (Δx_u, Δy_u, Δz_u) of the user in step 1303.
  • the virtual viewpoint determinator 540 compares the computed location variation amounts of the user with the MTT in step 1304. As a result of the comparison in step 1304, when a computed location variation amount of the user is greater than the MTT, the process proceeds to step 1305.
  • the virtual viewpoint determinator 540 determines the locations x_{c,t+1}, y_{c,t+1}, and z_{c,t+1} of a virtual camera, namely, a virtual viewpoint, using Equation (3), and transfers the virtual viewpoint to the content processing controller 550 in step 1305.
  • the content processing controller 550 controls the content processor 150 based on the received virtual viewpoint in step 1306. That is, the content processing controller 550 controls the content processor 150 to play content corresponding to the received virtual viewpoint. Further, the content processing controller 550 controls the content processor 150 based on the received virtual viewpoint to adjust a direction or a distance of a stereoscopic sound in step 1306.
  • the content processing controller 550 may control a peripheral device, such as home network devices based on the virtual viewpoint to adjust an intensity of wind, temperature, humidity, or brightness.
  • the content processing controller 550 determines whether a termination event of realistic type playback is sensed in step 1307. If the termination event of realistic type playback is sensed at step 1307, a process for realistic playback is terminated. Conversely, if the termination event is not sensed, the process returns to step 1303.
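  • pulling the steps of FIG. 13 together, a minimal sketch of the playback loop might look as follows, reusing the helper sketches above; `tracker`, `content`, and `stop_requested`, along with the screen and focus-surface sizes, are hypothetical stand-ins for the tracker 510, the content processor 150, the termination event, and values that would be configured in practice.

```python
def realistic_playback_loop(tracker, content, stop_requested):
    """Sketch of FIG. 13: initialize (1301), map spaces (1302), then track,
    gate, update the viewpoint, and render (1303-1306) until stopped (1307)."""
    init = Initiator()
    user = None
    while user is None:                              # step 1301
        user = init.feed(tracker.current_location())
    ratios = compute_space_ratios(FW=1.6, FH=0.9, FL=2.0,  # step 1302;
                                  DSW=1.2, DSH=0.7,        # sizes assumed
                                  WD_t0=user[2])
    cam = (0.0, 0.0, 2.0)  # assumed initial camera pose at (0, 0, FL)
    while not stop_requested():                      # step 1307
        new_user = tracker.current_location()        # step 1303
        cam = update_virtual_camera(new_user, user,  # steps 1304-1305
                                    cam, ratios)
        content.render_from(cam)                     # step 1306
        user = new_user
```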
  • FIG. 14 is a flowchart illustrating a content playing method according to another embodiment of the present invention.
  • a content playing method according to another embodiment of the present invention may include steps 1401 to 1407. Because steps 1401, 1402, 1404, 1405, and 1407 correspond to the foregoing steps 1301, 1302, 1304, 1305, and 1307, a description thereof is omitted.
  • Step 1403 is a step in which the virtual viewpoint determinator 540 calculates the angle variation amount θ of FIG. 9 together with the location variation amounts Δx_u, Δy_u, and Δz_u of the user.
  • the content processing controller 550 plays content corresponding to a received virtual viewpoint, rotates the direction of the virtual viewpoint by the computed angle variation amount θ, and plays content corresponding to the rotated virtual viewpoint.
  • the content playing method according to an embodiment of the present invention as described above may be implemented in the form of program commands executable through various computer means and recorded in a computer readable recording medium.
  • the computer readable recording medium may include a program command, a data file, and a data structure, individually or in combination.
  • the program command recorded in the recording medium may be specially designed or configured for the present invention, or may be known and available to a person having ordinary skill in the computer software field.
  • the computer readable recording medium includes Magnetic Media such as a hard disk, a floppy disk, or magnetic tape, Optical Media such as a Compact Disc Read Only Memory (CD-ROM) or a Digital Versatile Disc (DVD), Magneto-Optical Media such as a floptical disk, and hardware devices such as ROM, Random Access Memory (RAM), and flash memory that store and execute program commands.
  • the program command includes a machine language code created by a compiler and a high-level language code executable by a computer using an interpreter.
  • the foregoing hardware device may be configured to operate as at least one software module to perform an operation of the present invention, and vice versa.
  • a content playing method and an apparatus thereof according to the present invention have the effect that they may realistically play content stimulating the senses of the user.

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Signal Processing (AREA)
  • General Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Multimedia (AREA)
  • Computer Graphics (AREA)
  • Geometry (AREA)
  • Computer Hardware Design (AREA)
  • Computing Systems (AREA)
  • Software Systems (AREA)
  • Human Computer Interaction (AREA)
  • Acoustics & Sound (AREA)
  • Processing Or Creating Images (AREA)
  • User Interface Of Digital Computer (AREA)
EP12814483.9A 2011-07-18 2012-01-17 Content playing method and apparatus Withdrawn EP2735164A4 (de)

Applications Claiming Priority (3)

Application Number Priority Date Filing Date Title
KR20110070959 2011-07-18
KR1020110114883A KR101926477B1 (ko) 2011-11-07 Content playing method and apparatus
PCT/KR2012/000375 WO2013012146A1 (en) 2011-07-18 2012-01-17 Content playing method and apparatus

Publications (2)

Publication Number Publication Date
EP2735164A1 (de) 2014-05-28
EP2735164A4 EP2735164A4 (de) 2015-04-29

Family

ID=47839676

Family Applications (1)

Application Number Title Priority Date Filing Date
EP12814483.9A 2011-07-18 2012-01-17 Content playing method and apparatus

Country Status (5)

Country Link
US (1) US20130023342A1 (de)
EP (1) EP2735164A4 (de)
KR (1) KR101926477B1 (de)
CN (1) CN103703772A (de)
WO (1) WO2013012146A1 (de)

Families Citing this family (19)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9613461B2 (en) 2012-12-10 2017-04-04 Sony Corporation Display control apparatus, display control method, and program
KR102019125B1 (ko) * 2013-03-18 2019-09-06 엘지전자 주식회사 3D display device and control method
KR101462021B1 (ko) * 2013-05-23 2014-11-18 하수호 Method of providing a graphical user interface for sound source playback, and terminal therefor
KR101381396B1 (ko) * 2013-09-12 2014-04-04 하수호 Multi-viewer video and 3D stereophonic sound player system including a stereophonic sound controller, and method therefor
AU2013402725B2 (en) 2013-10-07 2018-04-12 Apple Inc. Method and system for providing position or movement information for controlling at least one function of a vehicle
US10937187B2 (en) 2013-10-07 2021-03-02 Apple Inc. Method and system for providing position or movement information for controlling at least one function of an environment
KR101669926B1 (ko) 2014-02-03 2016-11-09 (주)에프엑스기어 User-viewpoint-linked image processing apparatus and method
CN105094299B (zh) * 2014-05-14 2018-06-01 三星电子(中国)研发中心 Method and device for controlling an electronic device
CN104102349B (zh) * 2014-07-18 2018-04-27 北京智谷睿拓技术服务有限公司 Content sharing method and device
CN104123003B (zh) * 2014-07-18 2017-08-01 北京智谷睿拓技术服务有限公司 Content sharing method and device
EP3254455B1 (de) * 2015-02-03 2019-12-18 Dolby Laboratories Licensing Corporation Selective conference summary
CN104808946A (zh) * 2015-04-29 2015-07-29 天脉聚源(北京)传媒科技有限公司 Image playback control method and device
JP6739907B2 (ja) * 2015-06-18 2020-08-12 Panasonic Intellectual Property Corporation of America Device identification method, device identification apparatus, and program
JP6984416B2 (ja) * 2015-11-11 2021-12-22 ソニーグループ株式会社 Image processing apparatus and image processing method
US9851435B2 (en) * 2015-12-14 2017-12-26 Htc Corporation Electronic device and signal generating circuit
CN109963177A (zh) * 2017-12-26 2019-07-02 深圳Tcl新技术有限公司 Method for a television to automatically adjust its viewing angle, storage medium, and television
KR102059114B1 (ko) * 2018-01-31 2019-12-24 옵티머스시스템 주식회사 Image correction apparatus for matching a virtual space with a real space, and image matching system using the same
JP7140517B2 (ja) * 2018-03-09 2022-09-21 キヤノン株式会社 Generation apparatus, generation method performed by the generation apparatus, and program
CN111681467B (zh) * 2020-06-01 2022-09-23 广东小天才科技有限公司 Vocabulary learning method, electronic device, and storage medium

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20060038880A1 (en) * 2004-08-19 2006-02-23 Microsoft Corporation Stereoscopic image display
US20100225735A1 (en) * 2009-03-09 2010-09-09 Cisco Technology, Inc. System and method for providing three dimensional imaging in a network environment
US20110149043A1 (en) * 2009-12-18 2011-06-23 Electronics And Telecommunications Research Institute Device and method for displaying three-dimensional images using head tracking
US20110157327A1 (en) * 2009-12-31 2011-06-30 Broadcom Corporation 3d audio delivery accompanying 3d display supported by viewer/listener position and orientation tracking

Family Cites Families (13)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH1186038A (ja) * 1997-03-03 1999-03-30 Sega Enterp Ltd Image processing apparatus, image processing method, medium, and game machine
JP2006025281A (ja) * 2004-07-09 2006-01-26 Hitachi Ltd Information source selection system and method
JP4500632B2 (ja) * 2004-09-07 2010-07-14 キヤノン株式会社 Virtual reality presentation apparatus and information processing method
WO2006096776A2 (en) * 2005-03-07 2006-09-14 The University Of Georgia Research Foundation,Inc. Teleportation systems and methods in a virtual environment
US8564532B2 (en) * 2005-12-06 2013-10-22 Naturalpoint, Inc. System and methods for using a movable object to control a computer
US20080252596A1 (en) * 2007-04-10 2008-10-16 Matthew Bell Display Using a Three-Dimensional vision System
US20090141905A1 (en) * 2007-12-03 2009-06-04 David Warhol Navigable audio-based virtual environment
US20090222838A1 (en) * 2008-02-29 2009-09-03 Palm, Inc. Techniques for dynamic contact information
CA2747544C (en) * 2008-12-19 2016-06-21 Saab Ab System and method for mixing a scene with a virtual scenario
KR101324440B1 (ko) * 2009-02-11 2013-10-31 엘지디스플레이 주식회사 Method of controlling views of a stereoscopic image and stereoscopic image display using the same
CN102045429B (zh) * 2009-10-13 2015-01-21 华为终端有限公司 Method and device for adjusting display content
KR101046259B1 (ko) * 2010-10-04 2011-07-04 최규호 Stereoscopic image display apparatus for displaying stereoscopic images by tracking a gaze position
US9644989B2 (en) * 2011-06-29 2017-05-09 Telenav, Inc. Navigation system with notification and method of operation thereof

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20060038880A1 (en) * 2004-08-19 2006-02-23 Microsoft Corporation Stereoscopic image display
US20100225735A1 (en) * 2009-03-09 2010-09-09 Cisco Technology, Inc. System and method for providing three dimensional imaging in a network environment
US20110149043A1 (en) * 2009-12-18 2011-06-23 Electronics And Telecommunications Research Institute Device and method for displaying three-dimensional images using head tracking
US20110157327A1 (en) * 2009-12-31 2011-06-30 Broadcom Corporation 3d audio delivery accompanying 3d display supported by viewer/listener position and orientation tracking

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
See also references of WO2013012146A1 *

Also Published As

Publication number Publication date
US20130023342A1 (en) 2013-01-24
EP2735164A4 (de) 2015-04-29
WO2013012146A1 (en) 2013-01-24
CN103703772A (zh) 2014-04-02
KR20130010424A (ko) 2013-01-28
KR101926477B1 (ko) 2018-12-11

Similar Documents

Publication Publication Date Title
WO2013012146A1 (en) Content playing method and apparatus
WO2021208648A1 (zh) Virtual object adjustment method and apparatus, storage medium, and augmented reality device
WO2017148294A1 (zh) Device control method and apparatus based on a mobile terminal, and mobile terminal
WO2010074437A2 (en) Image processing method and apparatus therefor
CN107358007A (zh) Method and apparatus for controlling a smart home system, and computer-readable storage medium
WO2021136091A1 (zh) Fill-light method for a flash, and electronic device
US20130044129A1 (en) Location based skins for mixed reality displays
US9443457B2 (en) Display control device, display control method, and recording medium
KR20140128428A (ko) Method and system for providing interaction information
CN104243961A (zh) Display system and method for multi-view images
CN109920065A (zh) Information display method and apparatus, device, and storage medium
TWI508525B (zh) Mobile terminal and method of controlling the operation of the mobile terminal
JP6972474B2 (ja) Immersive audiovisual content projection system
US11635802B2 (en) Combined light intensity based CMOS and event detection sensor for high speed predictive tracking and latency compensation in virtual and augmented reality HMD systems
CN109407821A (zh) Collaborative interaction with virtual reality video
WO2021147921A1 (zh) Image processing method, electronic device, and computer-readable storage medium
CN107884942A (zh) Augmented reality display apparatus
WO2021095573A1 (ja) Information processing system, information processing method, and program
CN107113467A (zh) User terminal apparatus and system, and control method thereof
CN112333458B (zh) Live-streaming room display method and apparatus, device, and storage medium
US11323838B2 (en) Method and apparatus for providing audio content in immersive reality
CN114615556B (zh) Virtual live-streaming enhanced interaction method and apparatus, electronic device, and storage medium
JP2020530218A (ja) Method of projecting immersive audiovisual content
WO2021107595A1 (ko) Virtual content experience system and control method therefor
KR102574730B1 (ko) Method of providing an augmented reality screen and remote control using AR glasses, and apparatus and system therefor

Legal Events

Date Code Title Description
PUAI Public reference made under article 153(3) epc to a published international application that has entered the european phase

Free format text: ORIGINAL CODE: 0009012

17P Request for examination filed

Effective date: 20131129

AK Designated contracting states

Kind code of ref document: A1

Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR

DAX Request for extension of the european patent (deleted)
RA4 Supplementary search report drawn up and despatched (corrected)

Effective date: 20150327

RIC1 Information provided on ipc code assigned before grant

Ipc: G06T 19/00 20110101ALI20150323BHEP

Ipc: H04N 13/04 20060101AFI20150323BHEP

Ipc: G06T 15/20 20110101ALI20150323BHEP

Ipc: H04N 13/02 20060101ALI20150323BHEP

Ipc: G06F 3/01 20060101ALI20150323BHEP

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: THE APPLICATION HAS BEEN WITHDRAWN

18W Application withdrawn

Effective date: 20170125