CN103703772A - Content playing method and apparatus - Google Patents

Content playing method and apparatus

Info

Publication number
CN103703772A
CN103703772A (application CN201280035942.2A)
Authority
CN
China
Prior art keywords
content
user
virtual view
space
play
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN201280035942.2A
Other languages
Chinese (zh)
Inventor
郑相根
朴贤哲
郑文植
赵庆善
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Samsung Electronics Co Ltd
Original Assignee
Samsung Electronics Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Samsung Electronics Co Ltd filed Critical Samsung Electronics Co Ltd
Publication of CN103703772A
Legal status: Pending

Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 13/00 Stereoscopic video systems; Multi-view video systems; Details thereof
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/011 Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 15/00 3D [Three Dimensional] image rendering
    • G06T 15/10 Geometric effects
    • G06T 15/20 Perspective computation
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 19/00 Manipulating 3D models or images for computer graphics
    • G06T 19/006 Mixed reality
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 13/00 Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N 13/20 Image signal generators
    • H04N 13/275 Image signal generators from 3D object models, e.g. computer-generated stereoscopic image signals
    • H04N 13/279 Image signal generators from 3D object models, the virtual viewpoint locations being selected by the viewers or determined by tracking
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 13/00 Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N 13/30 Image reproducers
    • H04N 13/366 Image reproducers using viewer tracking
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04S STEREOPHONIC SYSTEMS
    • H04S 3/00 Systems employing more than two channels, e.g. quadraphonic

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Signal Processing (AREA)
  • General Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Multimedia (AREA)
  • Computer Graphics (AREA)
  • Human Computer Interaction (AREA)
  • Computer Hardware Design (AREA)
  • Software Systems (AREA)
  • Computing Systems (AREA)
  • Geometry (AREA)
  • Acoustics & Sound (AREA)
  • Processing Or Creating Images (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

A method and apparatus for realistically playing content that stimulates the user's senses of sight and hearing in correspondence with the user's location. The method determines a first location of the user; maps a content space displayed on a display unit to the actual space in which the user is positioned, based on the determined first location; determines a virtual viewpoint in the content space corresponding to a second location of the user; and plays content corresponding to the determined virtual viewpoint.

Description

Content playback method and device
Technical Field
The present invention relates generally to a content playback method and device, and more particularly, to a method and device for playing content corresponding to a user's location.
Background Art
In recent years, demand for three-dimensional (3D) image technology has increased, and with the wider adoption of digital broadcasting, the use of stereoscopic images in 3D TVs and 3D information terminals is being actively studied. In general, a stereoscopic image realized by 3D technology is formed according to the principle of stereoscopic vision perceived through the two eyes. Because the two eyes are spaced about 65 mm apart, binocular parallax serves as the principal factor of depth perception. When the left eye and the right eye view different stereoscopic images, the two images are transmitted to the brain through the retinas, and the brain combines them so that the user experiences the depth of the stereoscopic image. However, although a 3D TV can display a 3D image with a fixed viewpoint regardless of the user's location, it cannot provide a realistic image in which an object appears to be actually present inside the TV.
Summary of the Invention
Technical Problem
Accordingly, the present invention has been made to solve the above problems occurring in the prior art, and provides a method, and a device therefor, for realistically playing content that stimulates at least one sense of the user in correspondence with the user's location.
Solution
According to an aspect of the present invention, a content playback method is provided, including: determining a first location of a user; mapping a content space displayed on a display unit to the real space in which the user is positioned, based on the determined first location; determining a virtual viewpoint in the content space corresponding to a second location of the user; and playing content corresponding to the determined virtual viewpoint.
According to another aspect of the present invention, a content playback device is provided, including: a content collecting unit for collecting content that stimulates the user's senses; a content processor for performing processing operations to play the content input from the content collecting unit; a content play unit for playing the content input from the content collecting unit; a sensor for collecting information associated with the user's location so that content corresponding to the user's location can be played; and a controller for determining, based on the information received from the sensor, a virtual viewpoint in the virtual content space corresponding to the user's location, and for controlling playback of the content corresponding to the determined virtual viewpoint.
Advantageous Effects
The present invention provides a method, and a device therefor, for realistically playing content that stimulates at least one sense of the user in correspondence with the user's location.
Brief Description of the Drawings
The above and other aspects, features and advantages of the present invention will become more apparent from the following detailed description taken in conjunction with the accompanying drawings, in which:
Fig. 1 is a block diagram illustrating the configuration of a content playback device according to an embodiment of the present invention;
Fig. 2 is a diagram illustrating a three-dimensional coordinate system according to an embodiment of the present invention;
Figs. 3a, 3b and 3c are diagrams respectively illustrating a perspective view, a plan view and a side view of a real space according to an embodiment of the present invention;
Figs. 4a and 4b are diagrams respectively illustrating a perspective view and a plan view of a virtual content space according to an embodiment of the present invention;
Fig. 5 is a block diagram illustrating a realistic playback unit according to an embodiment of the present invention;
Fig. 6 is a diagram illustrating a space mapping method according to an embodiment of the present invention;
Fig. 7 is a diagram illustrating a space mapping method according to an embodiment of the present invention;
Fig. 8 is a diagram illustrating a virtual viewpoint determination method according to an embodiment of the present invention;
Fig. 9 is a diagram illustrating an angle control method of a virtual camera according to an exemplary embodiment of the present invention;
Fig. 10 is a diagram illustrating a stereoscopic image control method according to an embodiment of the present invention;
Fig. 11 is a diagram illustrating a stereo sound control method according to an embodiment of the present invention;
Fig. 12 is a block diagram illustrating the network of a home network system to which a content playback device is applied according to an embodiment of the present invention;
Fig. 13 is a flowchart illustrating a content playback method according to an embodiment of the present invention; and
Fig. 14 is a flowchart illustrating a content playback method according to an embodiment of the present invention.
Detailed Description of Embodiments
Hereinafter, various embodiments of the content playback method and device according to the present invention are described in detail with reference to the accompanying drawings. Throughout the drawings, the same reference numbers refer to the same or similar elements. Detailed descriptions of well-known functions and structures are omitted to avoid obscuring the subject matter of the present invention.
As used herein, the term "content" refers to anything that stimulates a user's senses, such as sight, hearing and touch. For example, content can be an image, light, voice or wind. In addition, realistic playback refers to playing content in correspondence with the user's location. That is, the user experiences the same content differently at different locations. For example, when the content is a car displayed on the screen, the user sees the front or the side of the car depending on the user's location. This content playback method and device are applicable to any electronic device having a function of playing content that stimulates the user's senses. In particular, they are applicable to notebook computers, desktop PCs, tablet PCs, smartphones, high-definition TVs (HDTVs), smart TVs, three-dimensional (3D) TVs, Internet protocol TVs (IPTVs), stereo systems, cinema systems, home theaters, home network systems, and the like.
The content playback method and device provide a function of tracking changes in the user's location and a function of realistically playing content in correspondence with the tracked location. The content playback method and device according to an embodiment of the present invention can also provide functions of receiving content (for example, images provided through a local area network (LAN), a wireless LAN, or a third-generation (3G) or fourth-generation (4G) wireless communication network), storing the received images in a database, and playing them in real time. The images may include stereoscopic images. A stereoscopic image may be a 3D film, 3D animation or 3D computer graphics. In addition, a stereoscopic image may be multimedia combined with stereo sound.
Fig. 1 is a block diagram illustrating the configuration of a content playback device according to an embodiment of the present invention. The content playback device of Fig. 1 is assumed to be a 3D TV, which makes content appear to exist in the space between the screen and the user. With reference to Fig. 1, the content playback device 100 according to an embodiment of the present invention includes an input unit 110, a remote controller 120, a remote controller receiver 125, a sensor 130, a content collecting unit 140, a content processor 150, a sound output unit 161, an image display unit 162, a memory 170, an interface unit 180 and a controller 190.
The input unit 110 can include a plurality of input keys and function buttons for receiving numeric or character input and for setting various functions. The function buttons can include arrow keys, side keys and hot keys provided to perform predetermined functions. In addition, the input unit 110 creates key events associated with user settings and with the functions of the content playback device 100, and transmits them to the controller 190. The key events can include a power on/off event, a volume control event, a screen on/off event, and the like. The controller 190 controls the above elements in response to the key events.
The remote controller 120 creates various key events for operating the content playback device 100, converts the created key events into wireless signals, and sends the wireless signals to the remote controller receiver 125. In particular, the remote controller 120 of the present invention can create a start event for requesting realistic playback and a termination event for stopping realistic playback. As indicated above, realistic playback is defined as playing content in correspondence with the user's location. The remote controller receiver 125 converts the received wireless signal back into the original key event and transmits it to the controller 190.
The sensor 130 collects information associated with the user's location so that the user's location can be tracked, and transmits the collected information to the controller 190. In particular, the sensor 130 can be realized as an image sensor, or as an optical sensor for sensing light with a predetermined wavelength, such as infrared light. The sensor 130 converts the sensed physical quantity into an electrical signal, an analog-to-digital converter (ADC) converts the electrical signal into data, and the data is transmitted to the controller 190.
The content collecting unit 140 performs a function of collecting content that stimulates the user's senses. In particular, the content collecting unit 140 collects images and sound from a network or from peripheral devices. That is, the content collecting unit 140 can include a broadcast receiver 141 and an Internet communication unit 142. The broadcast receiver 141 selects one of a plurality of broadcast channels and demodulates the broadcast signal of the selected channel into the original broadcast content. The Internet communication unit 142 includes a wired modem or a wireless modem for receiving various information, and additional information related thereto, for home shopping, home banking, online games and MP3 use. The Internet communication unit 142 can include a mobile communication module (for example, a 3G, 3.5G or 4G mobile communication module) and a short-range communication module (for example, a Wi-Fi module).
The content processor 150 performs processing functions to play the content from the content collecting unit 140. In particular, the content processor 150 classifies input content into stereoscopic images and stereo sound. The content processor 150 can include a sound processor for decoding the classified stereo sound and outputting it to the sound output unit 161, and an image processor for decoding the classified stereoscopic image into a left image and a right image and outputting them to the image display unit 162. In addition, the content processor 150 can compress input content and transmit it to the controller 190 under the control of the controller 190. The controller 190 then transmits the compressed content to the memory 170. In particular, the sound processor can control the direction or distance of the stereo sound according to the user's location. In other words, the sound processor can change the type of sound output from the sound output unit 161 according to the user's location, or change the volume according to the type of sound. The image processor can control the brightness, stereoscopic effect and depth according to the user's location.
The content play unit 160 performs a function of playing the content processed by the content processor 150. The content play unit 160 can include the sound output unit 161 and the image display unit 162. The sound output unit 161 outputs the decoded stereo sound and includes a plurality of loudspeakers, for example, 5.1-channel speakers. The image display unit 162 displays stereoscopic images. Realized as a display unit for showing stereoscopic images together with a 3D realization unit that allows the user to experience depth in the displayed stereoscopic image, the image display unit 162 displays a stereoscopic image with depth as if the image actually existed in the three-dimensional space between the screen and the user. The display unit may be implemented as a liquid crystal display (LCD), an organic light-emitting diode (OLED) display, or an active-matrix organic light-emitting diode (AMOLED) display. The 3D realization unit is a structural element, formed in the display unit or combined with it, that causes different images to be recognized at the right eye and the left eye. In general, 3D realization schemes are divided into glasses schemes and glasses-free schemes. Glasses schemes include color filter schemes, polarizing filter schemes and shutter glasses schemes. Glasses-free schemes include lenticular lens schemes and parallax barrier schemes. Because 3D realization schemes are well known in the field, a detailed description thereof is omitted.
The memory 170 stores the programs and data required for the operation of the content playback device 100. The memory 170 can be configured as a volatile storage medium, a non-volatile storage medium, or a combination thereof. Volatile storage media include semiconductor memories such as RAM, DRAM and SRAM. Non-volatile storage media can include hard disks. The memory 170 can also be divided into a data area and a program area. The data area of the memory 170 can store data created by the controller 190 according to the use of the content playback device 100, as well as content compressed in a predetermined format provided from the controller 190. The program area of the memory 170 can store an operating system (OS) for booting the content playback device 100 and operating its elements, and application programs supporting various user functions, for example, a web browser for accessing an Internet server, an MP3 user function for playing sound sources, an image viewer function for photographs, and a video playback function. In particular, the program area of the present invention can store a realistic playback program. The realistic playback program can include a routine for determining the user's initial location, a routine for mapping the real space and the content space based on the initial location, a routine for tracking location changes, a routine for determining the virtual viewpoint in the content space corresponding to the user's location, and a routine for playing the content corresponding to the viewpoint in the content space. The initial location is defined as the reference value for mapping the content space to the real space. The real space is the 3D space in which the user and the display unit are located. The content space is the virtual space in which the content shown through the display unit exists. In addition, the virtual viewpoint is defined as the user's viewpoint in the content space mapped to the real space.
The interface unit 180 performs a function of connecting the content playback device 100 with peripheral devices in a wired or wireless scheme. The interface unit 180 can include a Bluetooth module, a Wi-Fi module or a ZigBee module. In particular, the interface unit 180 can receive control signals for realistic playback from the controller 190 and transmit them to peripheral devices. That is, the controller 190 can control peripheral devices through the interface unit 180. The peripheral devices can be home network devices, stereo system equipment, lamps, air conditioners and heaters. In other words, the controller 190 can control the peripheral devices to play content that stimulates the user's senses (for example, touch, sight and taste).
The controller 190 can control the overall operation of the content playback device 100 and the signal flow between its internal structural elements. In addition, the controller 190 can control the power supplied from a battery to the internal elements. The controller 190 can also run the various application programs stored in the program area. In particular, in the present invention, if a start event for realistic playback is sensed, the controller 190 can run the above-described realistic playback program. That is, when the realistic playback program is run, the controller 190 determines the user's initial location and tracks changes in the user's location. In addition, the controller 190 maps the content space to the real space based on the initial location, determines the virtual viewpoint corresponding to the tracked location, and controls the content processor 150 to play the content corresponding to the virtual viewpoint. Furthermore, the controller 190 can control peripheral devices through the interface unit 180 to play content corresponding to the virtual viewpoint. The realistic playback function of the controller 190 is described in detail below.
Fig. 2 is a diagram illustrating a three-dimensional coordinate system according to an embodiment of the present invention. As shown in Fig. 2, a three-dimensional coordinate system can be used to express 3D space according to the present invention. Solid lines represent positive values, and dotted lines represent negative values. In addition, the coordinates of the user in the real space at time t are represented as (x_{u,t}, y_{u,t}, z_{u,t}), and the coordinates of the camera in the content space at time t are represented as (x_{c,t}, y_{c,t}, z_{c,t}).
Figs. 3a, 3b and 3c are diagrams illustrating a perspective view, a plan view and a side view of a real space according to an embodiment of the present invention. With reference to Fig. 3a, in the real space, the central point 302 of the screen 301 of the display unit is set to (0, 0, 0) of the coordinate system. With respect to the direction in which the user views the screen, the right and left of the central point 302 become the positive and negative directions of the X_u axis, respectively. The regions above and below the central point 302 become the positive and negative directions of the Y_u axis. The direction from the screen 301 toward the user 303 becomes the positive direction of the Z_u axis, and the opposite direction becomes the negative direction of the Z_u axis. The user's location can be represented as (x_u, y_u, z_u). With reference to Figs. 3b and 3c, the horizontal length of the screen 301 can be represented by the display screen width (DSW), the vertical length of the screen 301 by the display screen height (DSH), and the straight-line distance between the screen 301 and the user 303 by the watching distance (WD).
Figs. 4a and 4b are diagrams illustrating a perspective view and a plan view of a virtual content space. First, the virtual camera described herein is not a real camera, but represents the user in the content space corresponding to the user in the real space. With reference to Fig. 4a, the focal point 402 on the focal plane 401 in the content space is set to (0, 0, 0) of the coordinate system. With respect to the direction from the camera 403 toward the focal plane 401, the right and left of the focal point 402 become the positive and negative directions of the X_c axis, respectively. The regions above and below the focal point 402 become the positive and negative directions of the Y_c axis, respectively. The direction from the focal plane 401 toward the virtual camera 403 becomes the positive direction of the Z_c axis, and the opposite direction becomes the negative direction of the Z_c axis. The location of the virtual camera 403 can be represented as (x_c, y_c, z_c). The horizontal length of the focal plane 401 can be represented by the focal width (FW), the vertical length of the focal plane 401 by the focal height (FH), and the straight-line distance between the focal plane 401 and the virtual camera 403 by the focal length (FL). The FL can be set by the user. The size of the focal plane 401 can be set by adjusting the angle of the camera. That is, because the virtual camera 403 is virtual, the focal length and the angle of the camera can be set as needed.
Fig. 5 is a block diagram illustrating a realistic playback unit according to an embodiment of the present invention. The realistic playback unit 500 can be configured inside the controller 190 or configured separately; here it is assumed to be configured inside the controller 190. With reference to Fig. 5, the realistic playback unit 500 of the present invention, that is, the controller 190, can include a tracker 510, an initiator 520, a spatial mapper 530, a virtual viewpoint determiner 540 and a content processing controller 550. The tracker 510 tracks the location of the user 303. That is, the tracker 510 tracks the user's coordinates (x_u, y_u, z_u) using the data received from the sensor 130. In particular, the tracker 510 detects feature information, for example, the face of the user 303, from the received sensing information, and determines the central point of the detected face as the coordinates (x_u, y_u, z_u) of the user 303. In this case, z_u, that is, the WD, can be calculated from the size of the detected face.
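The patent does not specify how the watching distance is derived from the detected face size; a minimal sketch using a pinhole-camera model is shown below. The constants `REF_FACE_WIDTH_MM` and `FOCAL_PX`, and the function names, are illustrative assumptions, not part of the disclosure.

```python
# Hypothetical sketch: estimating the watching distance (WD, i.e. z_u) from
# the apparent pixel width of the detected face with a pinhole-camera model.
REF_FACE_WIDTH_MM = 150.0   # assumed average real face width
FOCAL_PX = 800.0            # assumed sensor focal length in pixels

def watching_distance_mm(face_width_px: float) -> float:
    """Pinhole model: distance = focal_length * real_width / pixel_width."""
    return FOCAL_PX * REF_FACE_WIDTH_MM / face_width_px

def user_position(face_box):
    """Face bounding box (x, y, w, h) in pixels -> (x_u, y_u, z_u):
    the face centre gives the on-screen coordinates, the face width the depth."""
    x, y, w, h = face_box
    cx, cy = x + w / 2, y + h / 2
    return (cx, cy, watching_distance_mm(w))
```

A face detected at 100 px width would place the user at 1.2 m under these assumed constants; in practice both constants would need per-camera calibration.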
The initiator 520 determines the user's initial location, which serves as the reference value for mapping the content space to the real space. That is, if a start event for realistic playback is sensed, the initiator 520 determines the coordinates input from the tracker 510 as the user's initial location. In particular, if the location of the user 303 does not change beyond a preset error margin after content viewing starts, the initiator 520 can determine the location of the user 303 as the initial location. In addition, if a predetermined key value is input from the remote controller 120, the initiator 520 can determine the user's location at the time of the key input as the initial location. Furthermore, if a start event is input from the tracker 510, the initiator 520 can determine the user's location at the time of that event as the initial location. For this purpose, the tracker 510 can detect a predetermined gesture of the user by template matching, for example, the action of lifting a hand and then lowering it. If the predetermined gesture is detected, the tracker 510 notifies the initiator 520.
Fig. 6 is a diagram illustrating a space mapping method according to an embodiment of the present invention. Formula (1) below expresses the relative ratios between the content space coordinate system and the real space coordinate system in the space mapping method according to an embodiment of the present invention. In this formula, t_0 denotes the time point at which the initiator 520 determines the initial location.
Formula (1)
== X-axis conversion ==
FW = X_Ratio * DSW, X_Ratio = FW / DSW
== Y-axis conversion ==
FH = Y_Ratio * DSH, Y_Ratio = FH / DSH
== Z-axis conversion ==
FL = Z_Ratio * WD(at t_0), Z_Ratio = FL / WD(at t_0)
As shown in Fig. 6, the spatial mapper 530 maps the content space 603 to the real space 602 based on the initial location of the user 601. That is, the spatial mapper 530 determines FW, FH and FL, and then calculates X_Ratio, Y_Ratio and Z_Ratio (at t_0) as shown in Formula (1).
Fig. 7 is a diagram illustrating a space mapping method according to an embodiment of the present invention. Formula (2) below expresses the relative ratios between the content space coordinate system and the real space coordinate system in the space mapping method according to an embodiment of the present invention.
Formula (2)
== X-axis conversion ==
FW = X_Ratio * (DSW + X_Adjustment), X_Ratio = FW / (DSW + X_Adjustment)
== Y-axis conversion ==
FH = Y_Ratio * (DSH + Y_Adjustment), Y_Ratio = FH / (DSH + Y_Adjustment)
== Z-axis conversion ==
FL = Z_Ratio * (WD(at t_0) + Z_Adjustment), Z_Ratio = FL / (WD(at t_0) + Z_Adjustment)
DSW, DSH and WD are values that depend on the size of the real space and the display unit. As shown in Formula (2), the spatial mapper 530 can add a predetermined adjustment amount to, or subtract one from, these values to extend or shorten the real space mapped to the content space. In other words, the spatial mapper 530 can use the adjustment amounts to control the size of the displayed content. The spatial mapper 530 can receive the adjustment amounts from the remote controller 120 through the remote controller receiver 125 at any time before realistic playback starts or during the initial period of realistic playback.
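The per-axis ratios of Formulas (1) and (2) can be sketched in a few lines. This is an illustrative rendering of the formulas only; the function name and the sample dimensions are assumptions, not values from the patent.

```python
# Sketch of Formulas (1) and (2): per-axis scale factors between the content
# space (FW, FH, FL) and the real space (DSW, DSH, WD at t0). The optional
# adjustment amounts stretch or shrink the real space mapped to the content
# space, as described for Fig. 7.
def mapping_ratios(fw, fh, fl, dsw, dsh, wd0,
                   x_adj=0.0, y_adj=0.0, z_adj=0.0):
    return {
        "X_Ratio": fw / (dsw + x_adj),
        "Y_Ratio": fh / (dsh + y_adj),
        "Z_Ratio": fl / (wd0 + z_adj),
    }

# With zero adjustments this reduces to Formula (1): a 3.2 m-wide focal plane
# mapped onto a 1.6 m-wide screen gives X_Ratio = 2.
r = mapping_ratios(fw=3.2, fh=1.8, fl=4.0, dsw=1.6, dsh=0.9, wd0=2.0)
```

A positive Z_Adjustment makes the same camera travel correspond to a longer real-space walk, which is how the adjustment amounts control the apparent size of the content.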
Fig. 8 is a diagram illustrating a virtual viewpoint determination method according to an embodiment of the present invention. Formula (3) below is the calculation formula of the virtual viewpoint determination method.
Formula (3)
Δx_u = x_{u,t+1} − x_{u,t}, Δx_c = X_Ratio * Δx_u, x_{c,t+1} = x_{c,t} + Δx_c
Δy_u = y_{u,t+1} − y_{u,t}, Δy_c = Y_Ratio * Δy_u, y_{c,t+1} = y_{c,t} + Δy_c
Δz_u = z_{u,t+1} − z_{u,t}, Δz_c = Z_Ratio * Δz_u, z_{c,t+1} = z_{c,t} + Δz_c
With reference to Fig. 8, the virtual viewpoint determiner 540 receives the coordinates (x_{u,t+1}, y_{u,t+1}, z_{u,t+1}) of the user 801 from the tracker 510, and receives the coordinate conversion values, that is, X_Ratio, Y_Ratio and Z_Ratio, from the spatial mapper 530. As shown in Formula (3), the virtual viewpoint determiner 540 uses the received information to calculate the coordinates (x_{c,t+1}, y_{c,t+1}, z_{c,t+1}) of the virtual camera 802 mapped to the user's coordinates. Even when the user is standing still, the user's detected location may change slightly, or to a certain extent, because of sensor or recognition errors. If the displayed content changed with every such variation, the user would experience inconvenience. Therefore, in the present invention, a minimum transition threshold (MTT) for moving the camera can be set in advance. The MTT can be an option the user sets directly. The virtual viewpoint determiner 540 calculates the coordinates of the virtual camera 802 only when a location variation of the user 801 (that is, Δx_u, Δy_u or Δz_u) is equal to or greater than the MTT. The MTT can be set differently for the X, Y and Z axes. For example, the MTT of the Z axis can be set to the largest value, so that the coordinates of the virtual camera 802 are recalculated, for example, when the user stands up from a seat while watching.
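Formula (3) together with the MTT gating can be sketched as follows. The numeric thresholds and sample coordinates are illustrative assumptions; the patent only states that an MTT can be set per axis.

```python
# Sketch of Formula (3) with the minimum transition threshold (MTT): the
# virtual camera moves by Ratio * (user displacement) on an axis, but only
# when that displacement reaches the axis's MTT, suppressing sensor jitter.
def update_camera(cam, prev_user, user, ratio, mtt):
    """cam, prev_user, user: (x, y, z); ratio, mtt: per-axis tuples."""
    new_cam = list(cam)
    for i in range(3):
        du = user[i] - prev_user[i]
        if abs(du) >= mtt[i]:                 # ignore sub-threshold jitter
            new_cam[i] = cam[i] + ratio[i] * du
    return tuple(new_cam)

# A 2 cm jitter on X is ignored (X MTT = 5 cm), while a 30 cm move on Z
# exceeds the larger Z threshold and shifts the camera back by 60 cm
# (Z_Ratio = 2).
cam = update_camera((0.0, 0.0, 4.0),
                    (0.0, 0.0, 2.0), (0.02, 0.0, 2.3),
                    ratio=(2.0, 2.0, 2.0), mtt=(0.05, 0.05, 0.2))
```

Setting the Z-axis MTT largest, as the text suggests, means the camera's depth only changes on gross movements such as standing up.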
Fig. 9 is a diagram illustrating an angle control method of a virtual camera according to an exemplary embodiment of the present invention. With reference to Fig. 9, the virtual viewpoint determiner 540 calculates the angle variation θ (θ_x, θ_y, θ_z) of the user 902 based on the central point 901 of the screen. That is, the virtual viewpoint determiner 540 calculates the location variations Δx_{u,0} (= x_{u,t} − x_{u,0}), Δy_{u,0} (= y_{u,t} − y_{u,0}) and Δz_{u,0} (= z_{u,t} − z_{u,0}) based on the initial location of the user 902, and applies the calculated location variations to a trigonometric function to calculate the angle variation. The calculated angle variation θ can be used as the angle control value of the virtual camera 903. The calculation of the angle variation can optionally be enabled by the user.
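The patent says only that the location variations are "applied to a trigonometric function"; a plausible sketch using `atan2` relative to the screen centre is shown below. The choice of `atan2` and the axis conventions are assumptions.

```python
import math

# Sketch of the angle control of Fig. 9: the displacement of the user from
# the initial location, seen from the screen centre at viewing distance wd0,
# is converted into camera rotation angles with a trigonometric function.
def camera_angles(initial, current, wd0):
    dx = current[0] - initial[0]   # lateral displacement
    dy = current[1] - initial[1]   # vertical displacement
    # Yaw (theta_y) from lateral movement, pitch (theta_x) from vertical
    # movement, both relative to the screen centre.
    theta_y = math.degrees(math.atan2(dx, wd0))
    theta_x = math.degrees(math.atan2(dy, wd0))
    return theta_x, theta_y

# Stepping 2 m sideways at a 2 m viewing distance yields a 45-degree yaw,
# so the virtual camera is rotated to keep the object in view.
tx, ty = camera_angles((0.0, 0.0, 2.0), (2.0, 0.0, 2.0), wd0=2.0)
```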
Fig. 10 is a diagram illustrating a stereoscopic image control method according to an embodiment of the present invention.
The content processing controller 550 receives the virtual viewpoint, that is, the coordinates (x_{c,t+1}, y_{c,t+1}, z_{c,t+1}) of the virtual camera, from the virtual viewpoint determiner 540. In addition, the content processing controller 550 can receive the angle adjustment amount of the virtual viewpoint, that is, the angle control value θ of the virtual camera, from the virtual viewpoint determiner 540. Based on the received information, the content processing controller 550 controls the content processor 150 to adjust the brightness, stereoscopic effect and depth of the stereoscopic image. With reference to Fig. 10, when the user's initial location 1001 is determined, the content processing controller 550 performs a control operation to display the object captured by the virtual camera at the corresponding initial position 1002. When the user's location moves as 1001 -> 1003 -> 1005, the location of the virtual camera in the content space mapped to the real space moves as 1002 -> 1004 -> 1006. Accordingly, a different part of the object is displayed according to the location change of the virtual camera. When the user moves from 1005 to 1007, the camera moves from 1006 to 1008. When the camera is located at 1008, the object leaves the camera's angle of view, so it would no longer be displayed. However, the content processing controller 550 rotates the camera by the angle control value θ in the direction of the object 1009 so that the object 1009 is displayed continuously.
Fig. 11 is a diagram illustrating a stereophonic sound control method according to an embodiment of the present invention.
The contents processing controller 550 controls the content processor 150 to adjust the direction and distance of stereophonic sound. Referring to Fig. 11, when the user is located at 1101, the contents processing controller 550 may adjust the distance cue such that the sound of the automobile gradually increases, so that the user experiences the automobile approaching. When the user is located at 1102, the contents processing controller 550 may adjust the distance cue such that the sound of the automobile gradually decreases, so that the user experiences the automobile moving away. In addition, when the user is located at 1101, the contents processing controller 550 controls the direction of the stereophonic sound so that the sound of the automobile is output from the front speakers and the center speaker. When the user is located at 1102, the contents processing controller 550 controls the direction of the stereophonic sound so that the sound of the automobile is output from the rear speakers.
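The distance and direction cues above might be computed as follows. The inverse-distance gain law and the simple front/rear split are assumptions; the patent specifies only that volume grows as the source approaches and that output moves between front and rear speakers.

```python
def stereo_params(user_pos, source_pos):
    """Distance and direction cues for a sound source, as in Fig. 11.

    user_pos, source_pos: (x, y, z) positions. Returns a gain that grows
    as the source nears the user (assumed inverse-distance law, clamped)
    and the speaker group the sound should come from (assumed rule:
    source ahead of the user -> front, behind -> rear).
    """
    dx = source_pos[0] - user_pos[0]
    dz = source_pos[2] - user_pos[2]
    distance = (dx * dx + dz * dz) ** 0.5
    gain = 1.0 / max(distance, 1.0)           # louder as the car approaches
    speakers = "front" if dz > 0 else "rear"  # ahead vs. behind the user
    return gain, speakers

# Car ahead of the user (position 1101): front speakers, lower gain.
gain_a, spk_a = stereo_params((0.0, 0.0, 0.0), (0.0, 0.0, 4.0))
# Car behind the user (position 1102): rear speakers, higher gain (closer).
gain_b, spk_b = stereo_params((0.0, 0.0, 0.0), (0.0, 0.0, -2.0))
```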
The foregoing content playing device 100 may further include elements not mentioned above, such as a camera, a microphone, and a GPS receiver module. Because the structural elements may vary widely according to the convergence trend of digital devices, not all possible elements can be listed here; however, the content playing device 100 may include structural elements equivalent to those described above. In addition, in the content playing device 100 of the present invention, particular elements may be excluded from the above configuration or replaced with other elements according to the form in which the device is provided, as will be understood by those skilled in the art.
Fig. 12 is a block diagram illustrating the network of a home network system to which a content playing device according to an embodiment of the present invention is applied. Referring to Fig. 12, the home network system according to the present invention may include a content playing device 1200, a home network server 1210, and a plurality of home network devices. The content playing device may include the above-described structural elements. The content playing device 1200 may communicate with the home network server 1210 through a communication scheme such as a ZigBee scheme or a WLAN scheme. If a start event for realistic playback is sensed, the content playing device 1200 runs a realistic playback program. That is, the content playing device 1200 may control the home network devices to play content corresponding to the user's position. The home network server 1210 controls the home network devices, and may drive them under the control of the content playing device 1200. Each home network device has a unique address and is controlled by address by the home network server 1210. The home network devices play content that stimulates the user's senses under the control of the content playing device 1200. For example, as shown in the figure, the home network devices may include an air conditioner 1220, a humidifier 1230, a heater 1240, and a lamp 1250. The content playing device 1200 may control the air conditioner 1220, the humidifier 1230, the heater 1240, and the lamp 1250 to adjust the wind intensity, ambient brightness, temperature, and humidity according to the change of the user's position. For example, referring to Fig. 11, when the user is located at 1102, the content playing device 1200 may increase the wind intensity so that the user experiences the automobile approaching the user.
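The address-based device control described above might be sketched as follows. The `HomeNetworkServer` class, the device address, and the wind-intensity rule are all hypothetical, standing in for whatever home network protocol (e.g. ZigBee) is actually used.

```python
class HomeNetworkServer:
    """Toy home-network server: devices have unique addresses and are
    driven by address under the control of the content playing device."""

    def __init__(self):
        self.devices = {}  # address -> state dict

    def register(self, address, kind):
        self.devices[address] = {"kind": kind, "level": 0}

    def set_level(self, address, level):
        self.devices[address]["level"] = level

def drive_for_scene(server, fan_address, car_distance):
    """Illustrative rule: the closer the on-screen car, the stronger the
    wind, so the user feels it approaching (cf. Fig. 11, position 1102)."""
    level = max(0, 10 - int(car_distance))
    server.set_level(fan_address, level)
    return level

server = HomeNetworkServer()
server.register("0x20", "air_conditioner")
level = drive_for_scene(server, "0x20", car_distance=3)
```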
That is, the user may be tactually stimulated, and may experience the sensation of being present in a real content space.
Fig. 13 is a flowchart illustrating a content playing method according to an embodiment of the present invention. Referring to Figs. 1 through 13, the controller 190 may initially be in an idle state. The idle state may be defined as a state before an image is displayed by the realistic playback procedure. If the user operates the remote controller 120 in the idle state, the controller 190 senses a start event for realistic playback. As described above, if the start event is sensed, the initiator 520 determines the coordinates input from the tracker 510 as the user's initial position in step 1301. The tracker 510 may track the user's position before the realistic playback step.
The space mapper 530 maps the content space to the real space based on the determined initial position in step 1302. The virtual view determiner 540 calculates the user's position variation (Δx_u, Δy_u, Δz_u) in step 1303. Next, the virtual view determiner 540 compares the calculated position variation of the user with the MTT in step 1304. As a result of the comparison in step 1304, when the calculated position variation of the user is greater than the MTT, the process proceeds to step 1305. In step 1305, the virtual view determiner 540 determines the position (x_c,t+1, y_c,t+1, z_c,t+1) of the virtual camera (that is, the virtual view) using formula (3), and transmits it to the contents processing controller 550.
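The mapping and threshold steps above can be sketched as follows. The linear per-axis scale factors stand in for the coordinate transform value of formula (3), whose exact form is given elsewhere in the specification, and the MTT value is illustrative; the check suppresses view jitter from small tracking movements.

```python
MTT = 0.05  # minimum transition threshold in metres (illustrative value)

def transform_scale(dsw, dsh, wd, fw, fh, fl):
    """Assumed per-axis scale between real space (display screen width,
    display screen height, viewing distance) and content space
    (focal width, focal height, focal length)."""
    return fw / dsw, fh / dsh, fl / wd

def next_view(user_delta, cam_pos, scale):
    """Move the virtual camera only when the user's position variation
    exceeds the MTT (steps 1304 and 1305)."""
    if max(abs(d) for d in user_delta) <= MTT:
        return cam_pos  # below threshold: keep the current view
    return tuple(c + s * d for c, s, d in zip(cam_pos, scale, user_delta))

scale = transform_scale(dsw=1.0, dsh=0.6, wd=2.0, fw=2.0, fh=1.2, fl=4.0)
view = next_view((0.5, 0.0, 0.0), (0.0, 0.0, 0.0), scale)
```

The resulting view coordinates would then be handed to the contents processing controller, as in step 1306.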
In step 1306, the contents processing controller 550 controls the content processor 150 based on the received virtual view. That is, the contents processing controller 550 controls the content processor 150 to play content corresponding to the received virtual view. In addition, in step 1306, the contents processing controller 550 adjusts the direction or distance of the stereophonic sound based on the received virtual view. The contents processing controller 550 may also control peripheral devices, such as home network devices, based on the virtual view to adjust the wind intensity, temperature, humidity, or brightness.
Next, the contents processing controller 550 determines whether a termination event of realistic playback is sensed in step 1307. If the termination event of realistic playback is sensed in step 1307, the realistic playback procedure is terminated. Otherwise, if the termination event is not sensed, the process returns to step 1303.
Fig. 14 is a flowchart illustrating a content playing method according to another embodiment of the present invention. Referring to Fig. 14, the content playing method according to another embodiment of the present invention may include steps 1401 to 1407. Because steps 1401, 1402, 1404, 1405, and 1407 correspond to the above-described steps 1301, 1302, 1304, 1305, and 1307, their description is omitted. In step 1403, the virtual view determiner 540 calculates the angle variation θ of Fig. 9 together with the user's position variations Δx_u, Δy_u, and Δz_u. In step 1406, the contents processing controller 550 plays content corresponding to the received virtual view, rotates the direction of the virtual view by the calculated angle variation θ, and plays content corresponding to the rotated virtual view.
The method according to an embodiment of the present invention as described above may be implemented in the form of program commands executable through various computer means and may be recorded on a computer-readable recording medium. The computer-readable recording medium may include program commands, data files, and data structures individually or in combination. The program commands recorded on the medium may be specially designed and configured for the present invention, or may be known to and usable by those of ordinary skill in the field of computer software.
The computer-readable recording medium includes magnetic media, such as a hard disk, a floppy disk, or a magnetic tape, that store and run program commands; optical media, such as a compact disc read-only memory (CD-ROM) or a digital versatile disc (DVD); magneto-optical media, such as a magneto-optical disk; and hardware devices, such as a ROM, a RAM, or a flash memory. In addition, the program commands include machine-language code created by a compiler and high-level language code executable by a computer using an interpreter. The above-described hardware device may be configured to operate as at least one software module for performing the operations of the present invention, and vice versa.
An effect of the content playing method and device according to the present invention is that content stimulating the user's senses can be played realistically.
Although various embodiments of the present invention have been described in detail herein, many variations and modifications may be made without departing from the spirit and scope of the present invention as defined by the appended claims.

Claims (18)

1. A content playing method, the method comprising:
determining a first position of a user;
mapping, based on the determined first position, a content space displayed on a display unit to a real space in which the user is located;
determining a virtual view in the content space corresponding to a second position of the user; and
playing content corresponding to the determined virtual view.
2. the method for claim 1, wherein the step of play content comprises: based on determined virtual view, show image.
3. method as claimed in claim 2, wherein shows that the step of image comprises: at least one in the brightness of control chart picture, third dimension and the degree of depth is to show the image of being controlled.
4. the method for claim 1, wherein the step of play content comprises: based on determined virtual view, carry out output sound.
5. method as claimed in claim 4, wherein the step of output sound comprises: control at least one sound controlled with output in sense of direction and distance perspective.
6. the method for claim 1, the content wherein play comprises at least stimulating user's vision and the image of the sense of hearing and sound.
7. the method for claim 1, wherein determine that the step of virtual view comprises:
Calculate user's location variation; And
When calculated location variation is greater than default minimum transition threshold value (MTT), determine the virtual view corresponding with the customer location changing.
8. the method for claim 1, wherein the step in mapping content space comprises: calculate the coordinate transform value between the coordinate system of real space and the coordinate system of content space.
9. method as claimed in claim 8, wherein determines that the step of virtual view comprises: by this coordinate transform value, determine the virtual view corresponding with user's the second place.
10. The method of claim 8, wherein mapping the content space comprises:
setting the coordinate system of the real space by using a horizontal length of a screen of the display unit, namely a display screen width (DSW), a vertical length of the screen, namely a display screen height (DSH), and a straight-line distance between the screen and the user, namely a viewing distance (WD);
setting the coordinate system of the content space by using a horizontal length of a focal plane, namely a focal width (FW), a vertical length of the focal plane, namely a focal height (FH), and a straight-line distance between the focal plane and the virtual view, namely a focal length (FL); and
calculating the coordinate transform value by using the first position when the first position is determined.
11. The method of claim 10, wherein setting the coordinate system of the real space comprises mapping a reduced or expanded real space to the virtual space by adding an adjustment amount to, or subtracting an adjustment amount from, at least one of the horizontal length, the vertical length, and the straight-line distance.
12. the method for claim 1, further comprise:
Primary importance based on user is calculated user's location variation;
Calculated location variation is applied to trigonometric function with calculation amount;
The angle variable quantity that the direction rotation of virtual view is calculated; And
Play content corresponding to virtual view being rotated with its direction.
13. the method for claim 1, wherein determine that the step of virtual view comprises:
Follow the tracks of user's position; And
When the position that the result of following the tracks of is user is fixed on while reaching Preset Time in default error, user's fixed position is defined as to the primary importance of mapping.
14. the method for claim 1, wherein determine that the step of virtual view comprises: when in the position of following the tracks of user, sense for true type playback beginning event time, the primary importance by followed the tracks of location positioning for mapping.
15. the method for claim 1, wherein determine that the step of virtual view comprises: when sensing user's prearranged gesture in the position of following the tracks of user, the primary importance by followed the tracks of location positioning for mapping.
16. A content playing device, comprising:
a content collecting unit for collecting content that stimulates a user's senses;
a content processor for performing a processing operation to play the content input from the content collecting unit;
a content playing unit for playing the content input from the content collecting unit;
a sensor for collecting information associated with the user's position in order to play content corresponding to the user's position; and
a controller for determining, based on the information received from the sensor, a virtual view in a virtual content space corresponding to the user's position, and for performing control such that content corresponding to the determined virtual view is played.
17. The device of claim 16, wherein the controller comprises:
an initiator for determining a first position of the user based on the information received from the sensor;
a space mapper for mapping the content space displayed on a display unit to the real space in which the user is located, based on the determined first position;
a virtual view determiner for determining the virtual view in the content space corresponding to a second position of the user; and
a contents processing controller for controlling the content processor to play the content corresponding to the determined virtual view.
18. The device of claim 17, further comprising an interface unit connected to a peripheral device having a content playing function, wherein the contents processing controller controls the peripheral device through the interface unit to play the content corresponding to the virtual view.
CN201280035942.2A 2011-07-18 2012-01-17 Content playing method and apparatus Pending CN103703772A (en)

Applications Claiming Priority (5)

Application Number Priority Date Filing Date Title
KR20110070959 2011-07-18
KR10-2011-0070959 2011-07-18
KR1020110114883A KR101926477B1 (en) 2011-07-18 2011-11-07 Contents play method and apparatus
KR10-2011-0114883 2011-11-07
PCT/KR2012/000375 WO2013012146A1 (en) 2011-07-18 2012-01-17 Content playing method and apparatus

Publications (1)

Publication Number Publication Date
CN103703772A true CN103703772A (en) 2014-04-02

Family

ID=47839676

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201280035942.2A Pending CN103703772A (en) 2011-07-18 2012-01-17 Content playing method and apparatus

Country Status (5)

Country Link
US (1) US20130023342A1 (en)
EP (1) EP2735164A4 (en)
KR (1) KR101926477B1 (en)
CN (1) CN103703772A (en)
WO (1) WO2013012146A1 (en)


Families Citing this family (16)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9613461B2 (en) * 2012-12-10 2017-04-04 Sony Corporation Display control apparatus, display control method, and program
KR102019125B1 (en) 2013-03-18 2019-09-06 엘지전자 주식회사 3D display device apparatus and controlling method thereof
KR101462021B1 (en) * 2013-05-23 2014-11-18 하수호 Method and terminal of providing graphical user interface for generating a sound source
KR101381396B1 (en) * 2013-09-12 2014-04-04 하수호 Multiple viewer video and 3d stereophonic sound player system including stereophonic sound controller and method thereof
US10937187B2 (en) 2013-10-07 2021-03-02 Apple Inc. Method and system for providing position or movement information for controlling at least one function of an environment
JP6683605B2 (en) * 2013-10-07 2020-04-22 アップル インコーポレイテッドApple Inc. Method and system for providing position or motion information for controlling at least one function of a vehicle
KR101669926B1 (en) 2014-02-03 2016-11-09 (주)에프엑스기어 User view point related image processing apparatus and method thereof
CN105094299B (en) * 2014-05-14 2018-06-01 三星电子(中国)研发中心 The method and apparatus for controlling electronic device
CN104123003B (en) * 2014-07-18 2017-08-01 北京智谷睿拓技术服务有限公司 Content share method and device
CN104102349B (en) 2014-07-18 2018-04-27 北京智谷睿拓技术服务有限公司 Content share method and device
CN104808946A (en) * 2015-04-29 2015-07-29 天脉聚源(北京)传媒科技有限公司 Image playing and controlling method and device
EP3376760A4 (en) * 2015-11-11 2019-04-03 Sony Corporation Image processing device and image processing method
US9851435B2 (en) 2015-12-14 2017-12-26 Htc Corporation Electronic device and signal generating circuit
CN109963177A (en) * 2017-12-26 2019-07-02 深圳Tcl新技术有限公司 A kind of method, storage medium and the television set of television set adjust automatically viewing angle
KR102059114B1 (en) * 2018-01-31 2019-12-24 옵티머스시스템 주식회사 Image compensation device for matching virtual space and real space and image matching system using the same
JP7140517B2 (en) * 2018-03-09 2022-09-21 キヤノン株式会社 Generation device, generation method performed by generation device, and program

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20060038880A1 (en) * 2004-08-19 2006-02-23 Microsoft Corporation Stereoscopic image display
CN101799584A (en) * 2009-02-11 2010-08-11 乐金显示有限公司 Method of controlling view of stereoscopic image and stereoscopic image display using the same
US20100225735A1 (en) * 2009-03-09 2010-09-09 Cisco Technology, Inc. System and method for providing three dimensional imaging in a network environment
CN102045429A (en) * 2009-10-13 2011-05-04 华为终端有限公司 Method and equipment for adjusting displayed content
US20110149043A1 (en) * 2009-12-18 2011-06-23 Electronics And Telecommunications Research Institute Device and method for displaying three-dimensional images using head tracking
US20110157327A1 (en) * 2009-12-31 2011-06-30 Broadcom Corporation 3d audio delivery accompanying 3d display supported by viewer/listener position and orientation tracking

Family Cites Families (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH1186038A (en) * 1997-03-03 1999-03-30 Sega Enterp Ltd Image processor, image processing method, medium and game machine
JP2006025281A (en) * 2004-07-09 2006-01-26 Hitachi Ltd Information source selection system, and method
JP4500632B2 (en) * 2004-09-07 2010-07-14 キヤノン株式会社 Virtual reality presentation apparatus and information processing method
US20080153591A1 (en) * 2005-03-07 2008-06-26 Leonidas Deligiannidis Teleportation Systems and Methods in a Virtual Environment
US8564532B2 (en) * 2005-12-06 2013-10-22 Naturalpoint, Inc. System and methods for using a movable object to control a computer
WO2008124820A1 (en) * 2007-04-10 2008-10-16 Reactrix Systems, Inc. Display using a three dimensional vision system
US20090141905A1 (en) * 2007-12-03 2009-06-04 David Warhol Navigable audio-based virtual environment
US20090222838A1 (en) * 2008-02-29 2009-09-03 Palm, Inc. Techniques for dynamic contact information
CA2747544C (en) * 2008-12-19 2016-06-21 Saab Ab System and method for mixing a scene with a virtual scenario
KR101046259B1 (en) * 2010-10-04 2011-07-04 최규호 Stereoscopic image display apparatus according to eye position
US9644989B2 (en) * 2011-06-29 2017-05-09 Telenav, Inc. Navigation system with notification and method of operation thereof


Cited By (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107210034A (en) * 2015-02-03 2017-09-26 杜比实验室特许公司 selective conference summary
US11076052B2 (en) 2015-02-03 2021-07-27 Dolby Laboratories Licensing Corporation Selective conference digest
CN106257355A (en) * 2015-06-18 2016-12-28 松下电器(美国)知识产权公司 Apparatus control method and controller
CN111681467A (en) * 2020-06-01 2020-09-18 广东小天才科技有限公司 Vocabulary learning method, electronic equipment and storage medium
CN111681467B (en) * 2020-06-01 2022-09-23 广东小天才科技有限公司 Vocabulary learning method, electronic equipment and storage medium

Also Published As

Publication number Publication date
WO2013012146A1 (en) 2013-01-24
KR101926477B1 (en) 2018-12-11
KR20130010424A (en) 2013-01-28
US20130023342A1 (en) 2013-01-24
EP2735164A1 (en) 2014-05-28
EP2735164A4 (en) 2015-04-29


Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication
RJ01 Rejection of invention patent application after publication

Application publication date: 20140402