WO2006011401A1 - Information processing device and method, recording medium, and program - Google Patents
Information processing device and method, recording medium, and program
- Publication number
- WO2006011401A1 (PCT/JP2005/013295)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- video
- audio
- user
- content data
- unit
- Prior art date
Classifications
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N7/00—Television systems
- H04N7/16—Analogue secrecy systems; Analogue subscription systems
- H04N7/173—Analogue secrecy systems; Analogue subscription systems with two-way working, e.g. subscriber sending a programme selection signal
- H04N7/17309—Transmission or handling of upstream communications
- H04N7/17318—Direct or substantially direct transmission and handling of requests
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/20—Servers specifically adapted for the distribution of content, e.g. VOD servers; Operations thereof
- H04N21/23—Processing of content or additional data; Elementary server operations; Server middleware
- H04N21/238—Interfacing the downstream path of the transmission network, e.g. adapting the transmission rate of a video stream to network bandwidth; Processing of multiplex streams
- H04N21/2387—Stream processing in response to a playback request from an end-user, e.g. for trick-play
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/20—Servers specifically adapted for the distribution of content, e.g. VOD servers; Operations thereof
- H04N21/23—Processing of content or additional data; Elementary server operations; Server middleware
- H04N21/242—Synchronization processes, e.g. processing of PCR [Program Clock References]
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/20—Servers specifically adapted for the distribution of content, e.g. VOD servers; Operations thereof
- H04N21/25—Management operations performed by the server for facilitating the content distribution or administrating data related to end-users or client devices, e.g. end-user or client device authentication, learning user preferences for recommending movies
- H04N21/258—Client or end-user data management, e.g. managing client capabilities, user preferences or demographics, processing of multiple end-users preferences to derive collaborative data
- H04N21/25866—Management of end-user data
- H04N21/25875—Management of end-user data involving end-user authentication
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/20—Servers specifically adapted for the distribution of content, e.g. VOD servers; Operations thereof
- H04N21/25—Management operations performed by the server for facilitating the content distribution or administrating data related to end-users or client devices, e.g. end-user or client device authentication, learning user preferences for recommending movies
- H04N21/262—Content or additional data distribution scheduling, e.g. sending additional data at off-peak times, updating software modules, calculating the carousel transmission frequency, delaying a video stream transmission, generating play-lists
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/40—Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
- H04N21/41—Structure of client; Structure of client peripherals
- H04N21/4104—Peripherals receiving signals from specially adapted client devices
- H04N21/4135—Peripherals receiving signals from specially adapted client devices external recorder
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/40—Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
- H04N21/41—Structure of client; Structure of client peripherals
- H04N21/422—Input-only peripherals, i.e. input devices connected to specially adapted client devices, e.g. global positioning system [GPS]
- H04N21/42203—Input-only peripherals, i.e. input devices connected to specially adapted client devices, e.g. global positioning system [GPS] sound input device, e.g. microphone
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/40—Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
- H04N21/41—Structure of client; Structure of client peripherals
- H04N21/422—Input-only peripherals, i.e. input devices connected to specially adapted client devices, e.g. global positioning system [GPS]
- H04N21/4223—Cameras
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/40—Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
- H04N21/43—Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
- H04N21/4302—Content synchronisation processes, e.g. decoder synchronisation
- H04N21/4307—Synchronising the rendering of multiple content streams or additional data on devices, e.g. synchronisation of audio on a mobile phone with the video output on the TV screen
- H04N21/43074—Synchronising the rendering of multiple content streams or additional data on devices, e.g. synchronisation of audio on a mobile phone with the video output on the TV screen of additional data with content streams on the same device, e.g. of EPG data or interactive icon with a TV program
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/40—Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
- H04N21/43—Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
- H04N21/431—Generation of visual interfaces for content selection or interaction; Content or additional data rendering
- H04N21/4312—Generation of visual interfaces for content selection or interaction; Content or additional data rendering involving specific graphical features, e.g. screen layout, special fonts or colors, blinking icons, highlights or animations
- H04N21/4316—Generation of visual interfaces for content selection or interaction; Content or additional data rendering involving specific graphical features, e.g. screen layout, special fonts or colors, blinking icons, highlights or animations for displaying supplemental content in a region of the screen, e.g. an advertisement in a separate window
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/40—Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
- H04N21/43—Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
- H04N21/432—Content retrieval operation from a local storage medium, e.g. hard-disk
- H04N21/4325—Content retrieval operation from a local storage medium, e.g. hard-disk by playing back content from the storage medium
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/40—Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
- H04N21/43—Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
- H04N21/44—Processing of video elementary streams, e.g. splicing a video clip retrieved from local storage with an incoming video stream or rendering scenes according to encoded video stream scene graphs
- H04N21/44012—Processing of video elementary streams, e.g. splicing a video clip retrieved from local storage with an incoming video stream or rendering scenes according to encoded video stream scene graphs involving rendering scenes according to scene graphs, e.g. MPEG-4 scene graphs
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/40—Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
- H04N21/43—Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
- H04N21/442—Monitoring of processes or resources, e.g. detecting the failure of a recording device, monitoring the downstream bandwidth, the number of times a movie has been viewed, the storage space available from the internal hard disk
- H04N21/44204—Monitoring of content usage, e.g. the number of times a movie has been viewed, copied or the amount which has been watched
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/40—Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
- H04N21/47—End-user applications
- H04N21/472—End-user interface for requesting content, additional data or services; End-user interface for interacting with content, e.g. for content reservation or setting reminders, for requesting event notification, for manipulating displayed content
- H04N21/47202—End-user interface for requesting content, additional data or services; End-user interface for interacting with content, e.g. for content reservation or setting reminders, for requesting event notification, for manipulating displayed content for requesting content on demand, e.g. video on demand
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/40—Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
- H04N21/47—End-user applications
- H04N21/478—Supplemental services, e.g. displaying phone caller identification, shopping application
- H04N21/4788—Supplemental services, e.g. displaying phone caller identification, shopping application communicating with other users, e.g. chatting
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/80—Generation or processing of content or additional data by content creator independently of the distribution process; Content per se
- H04N21/83—Generation or processing of protective or descriptive data associated with content; Content structuring
- H04N21/835—Generation of protective data, e.g. certificates
- H04N21/8355—Generation of protective data, e.g. certificates involving usage data, e.g. number of copies or viewings allowed
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N7/00—Television systems
- H04N7/14—Systems for two-way working
- H04N7/15—Conference systems
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N7/00—Television systems
- H04N7/14—Systems for two-way working
- H04N7/15—Conference systems
- H04N7/155—Conference systems involving storage of or access to video conference sessions
Definitions
- The present invention relates to an information processing device and method, a recording medium, and a program, and more particularly to an information processing device and method, a recording medium, and a program that communicate a user's voice and video with another information processing device connected via a network, play back the same content in synchronization with that device, and record the user's voice and video together with the played-back content.
- Telephones, so-called videophones, video conferencing systems, and the like exist as devices used for interaction between persons in remote places (hereinafter referred to as remote communication).
- There is also a method of remote communication in which users connect to the Internet using personal computers or the like and perform text chat, video chat with video and audio, and so on.
- It has also been proposed that persons who want to perform remote communication use personal computers or the like to share a virtual space or the same content via the Internet (see, for example, Patent Document 1).
- Patent Document 1 JP 2003-271530 A
- The present invention has been made in view of such a situation, and its purpose is to record the content simultaneously viewed by speakers in remote locations together with the video and audio of the speakers so that the composition at the time of recording can be reproduced, and so that they can also be synthesized and played back in a state different from that at the time of recording.
- An information processing apparatus of the present invention includes: reproduction means for reproducing the same content data in synchronization with another information processing apparatus; communication means for communicating a user's voice and video with the other information processing apparatus via a network; synthesis means for synthesizing the video and audio of the content data reproduced by the reproduction means with the video and audio of the user, based on settings from the user; storage means for storing the content data, the video and audio of the user, and synthesis information indicating the synthesis status of the video and audio of the content data and the video and audio of the user; and reproduction control means for controlling the reproduction means and the synthesis means, based on the synthesis information stored by the storage means, so as to reproduce the synthesis of the video and audio of the content data stored by the storage means and the video and audio of the user.
- Time information indicating the time of synthesis by the synthesis means can be added to the content data and the video and audio of the user stored by the storage means, and time information indicating the time when the setting of the synthesis status was changed can be added to the synthesis information.
- An information processing method of the present invention includes: a reproduction step of reproducing the same content data in synchronization with another information processing apparatus; a communication step of communicating a user's voice and video with the other information processing apparatus via a network; a synthesis step of synthesizing the video and audio of the content data reproduced in the reproduction step with the video and audio of the user, based on settings from the user; a storage step of storing the content data, the video and audio of the user, and synthesis information indicating the synthesis status of the video and audio of the content data and the video and audio of the user; and a reproduction control step of controlling the reproduction step and the synthesis step, based on the synthesis information stored in the storage step, so as to reproduce the synthesis of the video and audio of the content data stored in the storage step and the video and audio of the user.
- The program recorded on the recording medium of the present invention causes a computer to execute processing including: a reproduction step of reproducing the same content data in synchronization with another information processing apparatus; a communication step of communicating a user's voice and video with the other information processing apparatus via a network; a synthesis step of synthesizing the video and audio of the reproduced content data with the video and audio of the user, based on settings from the user; a storage step of storing the content data, the video and audio of the user, and synthesis information indicating the synthesis status; and a reproduction control step of controlling the reproduction step and the synthesis step, based on the stored synthesis information, so as to reproduce the synthesis of the video and audio of the content data stored in the storage step and the video and audio of the user.
- The program of the present invention causes a computer to execute processing including: a reproduction step of reproducing the same content data in synchronization with another information processing apparatus; a communication step of communicating a user's voice and video with the other information processing apparatus via a network; a synthesis step of synthesizing the video and audio of the content data reproduced in the reproduction step with the video and audio of the user, based on settings from the user; a storage step of storing the content data, the video and audio of the user, and synthesis information indicating the synthesis status of the video and audio of the content data and the video and audio of the user; and a reproduction control step of controlling the reproduction step and the synthesis step, based on the synthesis information stored in the storage step, so as to reproduce the synthesis of the video and audio of the content data stored in the storage step and the video and audio of the user.
- In the information processing apparatus and method and the program of the present invention, the user's audio and video are communicated with another information processing apparatus via the network, and the video and audio of the reproduced content data are synthesized with the video and audio of the user based on settings from the user. In addition, the content data, the user's video and audio, and synthesis information indicating the synthesis status of the content data's video and audio and the user's video and audio are stored, and, based on the stored synthesis information, the synthesis of the stored content data's video and audio and the user's video and audio is reproduced.
- FIG. 1 shows a configuration example of a communication system to which the present invention is applied.
- FIG. 2A is a diagram showing an example of content video and user video.
- FIG. 2B is a diagram showing an example of content video and user video.
- FIG. 2C is a diagram showing an example of content video and user video.
- FIG. 3A is a diagram showing an example of composition of content video and user video.
- FIG. 3B is a diagram showing a synthesis example of content video and user video.
- FIG. 3C is a diagram showing an example of composition of content video and user video.
- FIG. 4 is a block diagram illustrating a configuration example of the communication device in FIG. 1.
- FIG. 5 is a flowchart explaining remote communication processing by a communication device.
- FIG. 6 is a flowchart for explaining the remote communication recording process in step S5 of FIG.
- FIG. 7 is a flowchart illustrating remote communication reproduction processing.
- FIG. 8 is a block diagram illustrating a configuration example of a general-purpose personal computer.
- FIG. 1 shows a configuration example of a communication system to which the present invention is applied.
- The communication device 1-1 is connected to another communication device 1 (the communication device 1-2 in the case of Fig. 1) via the communication network 2 and, in the manner of a videophone, communicates the users' video and audio with it. In addition, the communication device 1-1 plays back common content (for example, program content obtained by receiving a television broadcast, or content such as a movie already acquired by downloading in advance) and private content (moving images, still images, and the like) in synchronization with the other communication device 1-2, thereby supporting remote communication between the users.
- the communication device 1 can be used simultaneously by a plurality of users. For example, in the case of FIG. 1, it is assumed that the communication device 1-1 is used by the users A and B, and the communication device 1-2 is used by the user X.
- For example, assume that the video of the common content is as shown in FIG. 2A, the video of user A captured by the communication device 1-1 is as shown in FIG. 2B, and the video of user X captured by the communication device 1-2 is as shown in FIG. 2C.
- On the display 22 (Fig. 4) of the communication device 1-1, the content video and the user video are superimposed and displayed by, for example, the picture-in-picture method shown in Fig. 3A, the cross-fade method shown in Fig. 3B, or the wipe method shown in Fig. 3C.
- In the picture-in-picture method shown in Fig. 3A, the video of the user is superimposed on the video of the content as a small screen.
- The display position and size of each small screen can be changed arbitrarily by the user. Small screens for both the user's own video (user A) and the communication partner's video (user X) may be displayed, or only one of them. Furthermore, the small screen of the user's video may be made translucent so that the video of the content shows through it.
- In the cross-fade method shown in Fig. 3B, the video of the user (user A or user X) is alpha-blended with the video of the content and displayed.
- This cross fade can be used, for example, when the user points to an arbitrary position or region on the content video.
- In the wipe method shown in Fig. 3C, the video of the user appears from a predetermined direction so as to cover the video of the content.
- A method other than those described above may also be applied to the composite display of the content video and the user video.
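- As an illustrative sketch only (the patent does not specify an implementation), the picture-in-picture and cross-fade compositions described above could be realized roughly as follows, assuming frames are NumPy uint8 arrays of identical shape; the function names and parameters are hypothetical.

```python
import numpy as np

def picture_in_picture(content, user, top_left=(20, 20), scale=0.25):
    """Overlay a shrunken user frame on the content frame (Fig. 3A style)."""
    out = content.copy()
    step = max(1, int(1 / scale))
    small = user[::step, ::step]                      # crude nearest-neighbour shrink
    y, x = top_left
    small = small[:out.shape[0] - y, :out.shape[1] - x]   # clamp to frame bounds
    out[y:y + small.shape[0], x:x + small.shape[1]] = small
    return out

def cross_fade(content, user, alpha=0.5):
    """Alpha-blend the user frame over the content frame (Fig. 3B style)."""
    blended = alpha * user.astype(np.float32) + (1.0 - alpha) * content.astype(np.float32)
    return blended.astype(np.uint8)
```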
- the volume and left / right balance of the content and the user's voice are synthesized based on the user's settings.
- the method of synthesizing content and user video and audio can be changed at any time.
- The synthesis status of the content and user video and audio (for example, the distinction between picture-in-picture, cross-fade, and wipe; the size and position of the small screen when picture-in-picture is adopted; the alpha-blending transparency when cross-fade is adopted; the volume ratio; and so on) can be set arbitrarily by the user, and the parameters related to these settings, together with information indicating the time when a setting was changed, are recorded as synthesis information 34 (Fig. 4).
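- A minimal sketch of what one entry of such synthesis information 34 might look like, assuming a JSON-lines file as the storage format; the SynthesisRecord class and its field names are illustrative assumptions, not definitions from the patent.

```python
from dataclasses import dataclass
import json

@dataclass
class SynthesisRecord:
    """Synthesis parameters in force from `standard_time` onward."""
    standard_time: float                  # standard time at which the setting was changed
    mode: str                             # "picture_in_picture", "cross_fade", or "wipe"
    child_screen_pos: tuple = (20, 20)    # used when mode == "picture_in_picture"
    child_screen_scale: float = 0.25
    alpha: float = 0.5                    # transparency used when mode == "cross_fade"
    volume_ratio: float = 0.5             # content audio vs. user audio

def append_record(path, record: SynthesisRecord):
    """Append one record so the full history of setting changes is preserved."""
    with open(path, "a", encoding="utf-8") as f:
        f.write(json.dumps(record.__dict__) + "\n")
```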
- the communication network 2 is a broadband data communication network represented by the Internet or the like, and the content supply server 3 supplies content to the communication device 1 via the communication network 2 in response to a request from the communication device 1.
- The authentication server 4 performs processing such as authenticating the user of the communication device 1 when the communication system is used, and billing.
- the broadcast device 5 transmits content as a program such as a television broadcast. Therefore, each communication device 1 can receive and reproduce the content broadcast from the broadcast device 5 in synchronization. Note that the content transmission from the broadcasting device 5 to the communication device 1 may be wireless or wired.
- the communication network 2 may also be used.
- The standard time information supply device 6 supplies standard time information to each communication device 1 for matching the clock built into the communication device 1 (the standard time counter 41 (Fig. 4)) to the standard time (world standard time, Japan standard time, or the like).
- The supply of the standard time information from the standard time information supply device 6 to the communication device 1 may be wireless or wired, and need not be performed via the communication network 2.
- The output unit 21 includes the display 22 and a speaker 23; it displays video corresponding to the video signal input from the video/audio synthesis unit 31 and outputs audio corresponding to the audio signal.
- The input unit 24 includes the camera 25 that captures the user's video (moving images or still images), the microphone 26 that collects the user's voice, and a sensor that detects the user's ambient environment information (brightness, temperature, humidity, etc.).
- the real-time (RT) data of the user including the acquired moving image, sound, and surrounding environment information is output to the communication unit 28 and the storage unit 32.
- the camera 25 has a function capable of measuring the distance to the subject (user).
- the input unit 24 outputs the acquired user video and audio to the video / audio synthesis unit 31. Further, the input unit 24 outputs the acquired video to the image analysis unit 35. Note that a plurality of input units 24 (two in the case of FIG. 4) may be provided, and each may be directed to a plurality of users (users A and B in FIG. 1).
- The communication unit 28 transmits the real-time data of user A and others input from the input unit 24 to the communication device 1-2 of the communication partner via the communication network 2, receives the real-time data of user X transmitted from the communication device 1-2, and outputs it to the video/audio synthesis unit 31, the storage unit 32, and the image analysis unit 35.
- The communication unit 28 also receives content supplied via the communication network 2 from the communication device 1-2 of the communication partner or from the content supply server 3, and outputs the content to the content reproduction unit 30 and the storage unit 32. Further, the communication unit 28 transmits the content 33 stored in the storage unit 32 and the operation information generated by the operation information output unit 50 to the communication device 1-2 via the communication network 2.
- the broadcast receiving unit 29 receives the television broadcast signal broadcast from the broadcast device 5 and outputs the content as the obtained broadcast program to the content reproduction unit 30.
- The content reproduction unit 30 reproduces the content of the broadcast program received by the broadcast receiving unit 29, the content received by the communication unit 28, or the content read from the storage unit 32, and outputs the video and audio of the reproduced content to the video/audio synthesis unit 31 and the image analysis unit 35.
- The video/audio synthesis unit 31 combines the content video input from the content playback unit 30, the user's video, and OSD (On Screen Display) video by alpha blending or the like, and outputs the resulting video signal to the output unit 21.
- the video / audio synthesis unit 31 synthesizes the audio of the content input from the content reproduction unit 30 and the audio of the user, and outputs the audio signal obtained as a result to the output unit 21.
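- For the audio side, a correspondingly simple sketch of mixing the content audio with the user's voice under a volume ratio and left/right balance setting might look like this; stereo float32 PCM is assumed and all names are illustrative, not taken from the patent.

```python
import numpy as np

def mix_audio(content_pcm, user_pcm, volume_ratio=0.5, balance=0.0):
    """Mix content audio with the user's voice.

    volume_ratio: 0.0 = content only, 1.0 = user voice only.
    balance:      -1.0 = user voice fully left, +1.0 = fully right.
    Both inputs are float32 arrays of shape (n_samples, 2).
    """
    left_gain = min(1.0, 1.0 - balance)
    right_gain = min(1.0, 1.0 + balance)
    user_panned = user_pcm * np.array([left_gain, right_gain], dtype=np.float32)
    mixed = (1.0 - volume_ratio) * content_pcm + volume_ratio * user_panned
    return np.clip(mixed, -1.0, 1.0)
```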
- The storage unit 32 stores the real-time data of the user (user A and others) supplied from the input unit 24, the real-time data of the communication partner (user X) supplied from the communication unit 28, the content of the broadcast program received by the broadcast receiving unit 29, and the content supplied from the communication unit 28, while periodically adding to them the standard time supplied from the standard time counter 41 via the control unit 43.
- the storage unit 32 also stores synthesis information 34 generated by the synthesis control unit 47.
- The image analysis unit 35 analyzes the brightness and luminance of the content video input from the content playback unit 30 and of the user video (including that from the communication device 1-2), and outputs the analysis result to the composition control unit 47.
- the mirror image generation unit 36 of the image analysis unit 35 generates a mirror image of the image of the user (including the one from the communication device 1-2).
- Based on the user's motion vector detected by the motion vector detection unit 38 and other information, the pointer detection unit 37 detects the user's wrist, fingertip, or the like as a pointer indicating a desired position from the user's video (including that from the communication device 1-2), and extracts its image.
- the motion vector detection unit 38 detects a motion vector indicating the user's motion from the user's video (including the one from the communication device 1-2), and identifies the generation point and the locus.
- The matching unit 39 determines which of the user's motions the detected motion vector of the user matches, and outputs the determination result to the control unit 43 as motion vector matching information.
- The communication environment detection unit 40 monitors the communication environment (communication rate, communication delay time, etc.) between the communication unit 28 and the communication device 1-2 via the communication network 2 and outputs the monitoring result to the control unit 43. Based on the standard time information supplied from the standard time information supply device 6, the standard time counter 41 adjusts the standard time it keeps and supplies the standard time to the control unit 43.
- The operation input unit 42 consists of, for example, a remote controller; it accepts a user operation and inputs a corresponding operation signal to the control unit 43.
- The control unit 43 controls each part constituting the communication device 1-1 based on the operation signal corresponding to a user operation input from the operation input unit 42, the motion vector matching information input from the image analysis unit 35, and the like.
- The control unit 43 includes a session management unit 44, a viewing/recording level setting unit 45, a playback synchronization unit 46, a composition control unit 47, a playback permission unit 48, a recording permission unit 49, an operation information output unit 50, and an electronic device control unit 51. In FIG. 4, the control lines from the control unit 43 to each unit constituting the communication device 1-1 are omitted.
- The session management unit 44 controls the process by which the communication unit 28 connects to the communication partner's communication device 1-2, the content supply server 3, the authentication server 4, and the like via the communication network 2.
- Based on a setting operation from the user, the viewing/recording level setting unit 45 sets whether the user's real-time data acquired by the input unit 24 may be reproduced on the communication partner's communication device 1-2, whether it may be recorded, the number of times it may be recorded, and so on, and the communication unit 28 notifies the communication device 1-2 of this setting information. Since the viewing/recording level setting unit 45 makes it possible to set viewing and recording levels for the user's video and audio, the leakage of private video and audio can be prevented.
- The reproduction synchronization unit 46 controls the broadcast receiving unit 29 and the content reproduction unit 30 so that the same content is reproduced in synchronization with the communication device 1-2 of the communication partner.
- The composition control unit 47 controls the video/audio synthesis unit 31, based on the analysis result of the image analysis unit 35 and the like, so that the content video and audio and the user video and audio are synthesized in accordance with the setting operation from the user.
- The composition control unit 47 also generates synthesis information 34 containing the parameters related to the settings for synthesizing the video and audio of the content with the video and audio of the user (for example, the distinction between picture-in-picture, cross-fade, and wipe; the size and position of the small screen when picture-in-picture is used; the alpha-blending transparency when cross-fade is used; the volume ratio; and so on) together with information indicating the standard time at which the settings were changed, and stores it in the storage unit 32.
- the reproduction permission unit 48 determines whether or not the content can be reproduced based on the license information added to the content, and controls the content reproduction unit 30 based on the determination result.
- The recording permission unit 49 determines whether the user's real-time data and the content can be recorded based on the settings of the communication partner and the license information added to the content, and controls the storage unit 32 based on the determination result. The playback permission unit 48 and the recording permission unit 49 thus make it possible to realize copy control of content viewing and recording.
- In response to a user's operation (a channel switching operation when receiving a television broadcast, a content playback start or end operation, a fast-forward playback operation, and the like), the operation information output unit 50 generates operation information including the content of the operation and its time (details will be described later), and the communication unit 28 notifies the communication partner's communication device 1-2 of it. This operation information is used for synchronized playback of the content.
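- One way such operation information could be represented for transmission to the partner device is sketched below; the message fields and the helper name are assumptions made for illustration, not definitions from the patent.

```python
import json

def make_operation_info(operation, standard_time, **details):
    """Build one operation-information message for the partner device.

    `operation` might be e.g. "channel_switch", "play_start", "play_end",
    or "fast_forward"; `details` carries operation-specific values such as
    a channel number or a content identifier.
    """
    return json.dumps({
        "operation": operation,
        "standard_time": standard_time,   # time at which the operation was made
        **details,
    })

# e.g. communication_unit.send(make_operation_info("channel_switch", now, channel=4))
```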
- The electronic device control unit 51 controls predetermined electronic devices connected to the communication device 1-1 (including wireless connections), for example lighting equipment, air-conditioning equipment, and the like (none of which are shown).
- Next, the remote communication processing performed by the communication device 1-1 with the communication device 1-2 will be described with reference to the flowchart of FIG. 5. This remote communication processing is started when an operation instructing the start of remote communication with the communication device 1-2 is input to the operation input unit 42 and an operation signal corresponding to this operation is input to the control unit 43.
- In step S1, based on the control of the session management unit 44, the communication unit 28 connects to the communication device 1-2 via the communication network 2 and notifies it of the start of remote communication. In response to this notification, the communication device 1-2 returns an acceptance of the start of remote communication.
- In step S2, based on the control of the control unit 43, the communication unit 28 starts transmitting the real-time data of user A and others input from the input unit 24 to the communication device 1-2 via the communication network 2, and at the same time starts receiving the real-time data of user X sent from the communication device 1-2.
- The video and audio included in the transmitted real-time data of user A and others and the video and audio included in the received real-time data of user X are input to the storage unit 32 and the video/audio synthesis unit 31.
- In step S3, the communication unit 28 connects to the authentication server 4 via the communication network 2 based on the control of the session management unit 44, and performs authentication processing for content acquisition. After this authentication processing, the communication unit 28 accesses the content supply server 3 via the communication network 2 and acquires the content specified by the user. It is assumed that the same processing is performed in the communication device 1-2 at this time, so that the same content is acquired.
- Note that when receiving content broadcast as a television program, or when playing back content that has already been acquired and stored in the storage unit 32, the processing of step S3 can be omitted.
- In step S4, the content playback unit 30 performs content playback processing synchronized with the communication device 1-2 (hereinafter referred to as synchronized content playback) based on the control of the playback synchronization unit 46.
- In this synchronized content playback, the communication device 1-2 is notified of operations from the user (a channel switching operation, a fast-forward playback start operation, and the like), and the communication device 1-2 is made to follow the communication device 1-1.
- In step S5, the storage unit 32 starts the remote communication recording process.
- In step S6, according to the control of the synthesis control unit 47, the video/audio synthesis unit 31 synthesizes the video and audio of the reproduced content, the video and audio included in the transmitted real-time data of user A and others, and the video and audio included in the received real-time data of user X, and supplies the resulting video signal and audio signal to the output unit 21.
- the output unit 21 displays video corresponding to the supplied video signal and outputs audio corresponding to the audio signal.
- With the above, video and audio communication between the users and synchronized playback of the content have started.
- Also in step S6, in parallel with the processing of the video/audio synthesis unit 31 and the like, the pointer detection unit 37 of the image analysis unit 35 detects the user's pointer (a wrist, a fingertip, or the like) based on the video included in the real-time data of user A and others, and displays it on the screen (pointing process).
- In step S7, the control unit 43 determines whether an operation instructing the end of the remote communication has been performed by the user, and waits until it is determined that such an operation has been performed. If it is determined that the user has performed an operation instructing the end of the remote communication, the process proceeds to step S8.
- In step S8, based on the control of the session management unit 44, the communication unit 28 connects to the communication device 1-2 via the communication network 2 and notifies it of the end of the remote communication. In response to this notification, the communication device 1-2 returns an acceptance of the end of the remote communication.
- In step S9, the storage unit 32 ends the remote communication recording process.
- The content recorded up to this point, the video and audio included in the real-time data of user A and others, the video and audio included in the received real-time data of user X, and the synthesis information 34 will be used when this remote communication is reproduced later.
- In the above description, the communication device 1-2 follows the communication device 1-1 in the synchronized playback, but conversely the communication device 1-1 may be made to follow another communication device 1.
- Next, the remote communication recording process in step S5 of FIG. 5 will be described with reference to the flowchart of FIG. 6. In step S11, the composition control unit 47 generates synthesis information 34 including the parameters indicating the synthesis status of the video and audio of the content and the video and audio of the user as currently set by the user, together with the current standard time, and stores it in the storage unit 32.
- In step S12, the recording permission unit 49 determines whether the currently reproduced content is recordable based on the attribute information (metadata) of that content. If it is determined that the currently reproduced content can be recorded, the process proceeds to step S13.
- In step S13, according to the control from the recording permission unit 49, the storage unit 32 starts the process of storing the currently playing content while periodically adding to it the standard time supplied from the standard time counter 41 via the control unit 43.
- If it is determined in step S12 that the currently playing content is not recordable, step S13 is skipped.
- In step S14, the recording permission unit 49 determines whether the real-time data of the user (user A and others) can be recorded based on the setting from the user. If it is determined that the real-time data of user A and others can be recorded, the process proceeds to step S15.
- In step S15, according to the control from the recording permission unit 49, the storage unit 32 starts the process of storing the real-time data of user A and others while periodically adding to it the standard time supplied from the standard time counter 41 via the control unit 43. If it is determined in step S14 that the real-time data of user A and others is not recordable, step S15 is skipped.
- In step S16, the recording permission unit 49 determines whether the real-time data of the user (user X) of the communication device 1-2 can be recorded based on the notification from the communication device 1-2. If it is determined that the real-time data of user X can be recorded, the process proceeds to step S17.
- In step S17, according to the control from the recording permission unit 49, the storage unit 32 starts the process of storing the real-time data of user X while periodically adding to it the standard time supplied from the standard time counter 41 via the control unit 43. If it is determined in step S16 that the real-time data of user X is not recordable, step S17 is skipped.
- Steps S12 and S13, steps S14 and S15, and steps S16 and S17 have been described in order for convenience, but in practice they are not necessarily performed in this order.
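- The periodic attachment of standard time in steps S13, S15, and S17 could be sketched as follows, assuming each stream arrives as an iterable of frames and that `get_standard_time()` returns the value of the standard time counter 41; the function and its parameters are purely illustrative.

```python
def record_stream(frames, get_standard_time, stamp_interval=1.0):
    """Store frames, periodically attaching the standard time so that streams
    recorded separately can be re-aligned during later reproduction."""
    stored = []
    next_stamp = get_standard_time()
    for frame in frames:
        now = get_standard_time()
        entry = {"frame": frame}
        if now >= next_stamp:
            entry["standard_time"] = now       # periodic time stamp
            next_stamp = now + stamp_interval
        stored.append(entry)
    return stored
```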
- In step S18, the composition control unit 47 determines whether the setting of the synthesis of the content video and audio and the user video and audio has been changed by the user, and waits until it is determined that the synthesis setting has been changed. If it is determined that the synthesis setting has been changed, the process proceeds to step S19.
- In step S19, the composition control unit 47 generates synthesis information 34 including the parameters indicating the synthesis status of the content video and audio and the user video and audio as changed by the user and information indicating the current standard time, and stores it in the storage unit 32. Thereafter, the process returns to step S18 and the subsequent processing is repeated. As described above, this remote communication recording process is continued until the user performs an operation instructing the end of the remote communication. This concludes the description of the remote communication recording process.
- Next, the processing by the communication device 1-1 for reproducing a remote communication based on the video and audio of the content, the video and audio of the users, and the synthesis information recorded by the remote communication recording process (hereinafter referred to as the remote communication reproduction process) will be described with reference to the flowchart of FIG. 7.
- This remote communication reproduction process is started, for example, in response to an operation from the user who instructs the reproduction of the content 33 stored in the storage unit 32.
- In step S31, the control unit 43 determines whether an operation instructing the reproduction of the remote communication has been performed by the user on the operation input unit 42. If it is determined that an operation instructing the reproduction of the remote communication has been performed, the process proceeds to step S32.
- In step S32, the composition control unit 47 acquires the synthesis information stored in the storage unit 32.
- In step S33, the composition control unit 47 starts playback while synchronizing the standard time included in the synthesis information, the time information added to the content stored in the storage unit 32, and the time information added to the user's real-time data. As a result, the video and audio of the content and the video and audio of the user are input to the video/audio synthesis unit 31.
- In step S34, in accordance with the control from the synthesis control unit 47 based on the synthesis information acquired in step S32, the video/audio synthesis unit 31 synthesizes the video and audio of the content with the video and audio of the user and outputs the result to the output unit 21. As a result, the remote communication as it was at the time of recording is faithfully reproduced.
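- A sketch of how the stored synthesis information could drive this faithful reproduction, assuming the content and user streams have already been aligned into parallel lists of (standard_time, frame) pairs and reusing the hypothetical SynthesisRecord shown earlier; `compose` stands for a compositing routine (such as the picture-in-picture or cross-fade sketches) that applies the parameters of the current record.

```python
def reproduce_communication(content_frames, user_frames, synthesis_records, compose):
    """Replay aligned content/user frames, switching composition parameters
    at the recorded setting-change times."""
    records = sorted(synthesis_records, key=lambda r: r.standard_time)
    if not records:
        return
    idx = 0
    for (t, c_frame), (_, u_frame) in zip(content_frames, user_frames):
        # advance to the synthesis record in force at standard time t
        while idx + 1 < len(records) and records[idx + 1].standard_time <= t:
            idx += 1
        yield compose(c_frame, u_frame, records[idx])
```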
- On the other hand, if it is determined in step S31 that an operation instructing the reproduction of the remote communication has not been performed, the process proceeds to step S35.
- In step S35, the operation input unit 42 accepts a setting operation from the user relating to the synthesis of the content video and audio and the user video and audio.
- In step S36, the composition control unit 47 starts playback in synchronization with the time information added to the content stored in the storage unit 32 and the time information added to the user's real-time data.
- As a result, the video and audio of the content and the video and audio of the user are input to the video/audio synthesis unit 31.
- In step S37, in accordance with the control from the synthesis control unit 47 corresponding to the settings made in step S35, the video/audio synthesis unit 31 synthesizes the video and audio of the content with the video and audio of the user and outputs the result to the output unit 21. As a result, the content video and audio and the user video and audio are reproduced in a composite state different from that of the remote communication at the time of recording. Of course, it is also possible to play back only the video and audio of the content.
- the processing of the communication device 1 described above can be executed by hardware, but can also be executed by software.
- When the processing is executed by software, the program constituting the software is installed from a recording medium into a computer built into dedicated hardware, or into, for example, a general-purpose personal computer as shown in FIG. 8 that can execute various functions by installing various programs.
- This personal computer 100 has a CPU (Central Processing Unit) 101 built therein.
- An input / output interface 105 is connected to the CPU 101 via the bus 104.
- A ROM (Read Only Memory) 102 and a RAM (Random Access Memory) 103 are also connected to the bus 104.
- To the input/output interface 105 are connected an input unit 106 including input devices such as a keyboard and a mouse with which the user inputs operation commands, an output unit 107 that displays video and outputs audio, a storage unit 108 including a hard disk drive that stores programs and various data, and a communication unit 109 that executes communication processing via a network typified by the Internet. Also connected is a drive 110 that reads and writes data from and to a recording medium 111 such as a magnetic disk (including a flexible disk), an optical disc (including a CD-ROM (Compact Disc-Read Only Memory) and a DVD (Digital Versatile Disc)), a magneto-optical disc (including an MD (Mini Disc)), or a semiconductor memory.
- A program for causing the personal computer 100 to execute the processing of the communication device 1 described above is supplied to the personal computer 100 in a state stored in the recording medium 111, read by the drive 110, and installed in the hard disk drive built into the storage unit 108. The program installed in the storage unit 108 is loaded from the storage unit 108 into the RAM 103 and executed in response to a command from the CPU 101 corresponding to a command from the user input to the input unit 106.
- The program may be processed by a single computer, or may be processed in a distributed manner by a plurality of computers. Furthermore, the program may be transferred to a remote computer and executed there.
- In this specification, the term "system" refers to an entire apparatus composed of a plurality of apparatuses.
Landscapes
- Engineering & Computer Science (AREA)
- Multimedia (AREA)
- Signal Processing (AREA)
- Databases & Information Systems (AREA)
- Computer Security & Cryptography (AREA)
- General Engineering & Computer Science (AREA)
- Computer Networks & Wireless Communication (AREA)
- Human Computer Interaction (AREA)
- Business, Economics & Management (AREA)
- Marketing (AREA)
- Computer Graphics (AREA)
- Television Signal Processing For Recording (AREA)
- Two-Way Televisions, Distribution Of Moving Picture Or The Like (AREA)
Abstract
Description
Claims
Priority Applications (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN2005800257202A CN1993990B (zh) | 2004-07-27 | 2005-07-20 | 信息处理设备和方法 |
US11/658,833 US8391671B2 (en) | 2004-07-27 | 2005-07-20 | Information processing device and method, recording medium, and program |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2004-218533 | 2004-07-27 | ||
JP2004218533A JP2006041888A (ja) | 2004-07-27 | 2004-07-27 | 情報処理装置および方法、記録媒体、並びにプログラム |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2006011401A1 true WO2006011401A1 (ja) | 2006-02-02 |
Family
ID=35786151
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2005/013295 WO2006011401A1 (ja) | 2004-07-27 | 2005-07-20 | 情報処理装置および方法、記録媒体、並びにプログラム |
Country Status (4)
Country | Link |
---|---|
US (1) | US8391671B2 (ja) |
JP (1) | JP2006041888A (ja) |
CN (1) | CN1993990B (ja) |
WO (1) | WO2006011401A1 (ja) |
Cited By (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2007093104A1 (fr) | 2006-02-14 | 2007-08-23 | Huawei Technologies Co., Ltd. | Procédé et système de mise en oeuvre d'enregistrement multimédia et dispositif de gestion de ressources multimédia |
EP1986431A3 (en) * | 2007-04-24 | 2011-07-27 | LG Electronics, Inc. | Video communication terminal and method of displaying images |
CN101674470B (zh) * | 2008-09-09 | 2011-11-16 | 华为技术有限公司 | 实现客户端录制的方法、系统及录制控制实体 |
Families Citing this family (14)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2006041885A (ja) * | 2004-07-27 | 2006-02-09 | Sony Corp | 情報処理装置および方法、記録媒体、並びにプログラム |
JP4644555B2 (ja) * | 2005-07-27 | 2011-03-02 | 日本放送協会 | 映像音声合成装置及び遠隔体験共有型映像視聴システム |
AU2007249650B2 (en) | 2006-03-17 | 2011-02-24 | Sony Corporation | System and method for organizing group content presentations and group communications during the same |
JP2008042785A (ja) * | 2006-08-10 | 2008-02-21 | Sharp Corp | 映像表示装置 |
WO2010143388A1 (ja) * | 2009-06-12 | 2010-12-16 | パナソニック株式会社 | コンテンツ再生装置、コンテンツ再生方法、プログラム、及び集積回路 |
JP2011160151A (ja) * | 2010-01-29 | 2011-08-18 | Toshiba Corp | 電子機器、動画再生システム、及び動画再生方法 |
JP2012222642A (ja) * | 2011-04-11 | 2012-11-12 | Sony Corp | データ配信装置、データ配信方法、及びプログラム |
WO2013095512A1 (en) | 2011-12-22 | 2013-06-27 | Intel Corporation | Collaborative entertainment platform |
US9241131B2 (en) | 2012-06-08 | 2016-01-19 | Samsung Electronics Co., Ltd. | Multiple channel communication using multiple cameras |
US9325889B2 (en) | 2012-06-08 | 2016-04-26 | Samsung Electronics Co., Ltd. | Continuous video capture during switch between video capture devices |
JP2015046028A (ja) * | 2013-08-28 | 2015-03-12 | ソニー株式会社 | 情報処理装置、及び情報処理方法 |
JP2015162117A (ja) * | 2014-02-27 | 2015-09-07 | ブラザー工業株式会社 | サーバ装置、プログラム、及び情報処理方法 |
JP7073702B2 (ja) * | 2017-12-11 | 2022-05-24 | 富士フイルムビジネスイノベーション株式会社 | 情報処理装置及び情報処理プログラム |
CN108965785B (zh) * | 2018-06-27 | 2020-12-29 | 苏州科达科技股份有限公司 | 一种视频会议录像方法、录像装置、控制中心和终端 |
Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2001148841A (ja) * | 1999-11-19 | 2001-05-29 | Nec Corp | テレビコミュニティの形成方法とそのシステム |
JP2002507027A (ja) * | 1998-03-13 | 2002-03-05 | シーメンス コーポレイト リサーチ インコーポレイテツド | 協調的ダイナミックビデオコメント作成装置および作成方法 |
JP2003150529A (ja) * | 2001-11-19 | 2003-05-23 | Hitachi Ltd | 情報交換方法、情報交換端末装置、情報交換サーバ装置、プログラム |
JP2004088327A (ja) * | 2002-08-26 | 2004-03-18 | Casio Comput Co Ltd | 通信端末、通信端末処理プログラム、および画像配信サーバ、画像配信処理プログラム |
Family Cites Families (14)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH0648458B2 (ja) | 1985-03-01 | 1994-06-22 | 日本電信電話株式会社 | 情報入力装置 |
JPH08111858A (ja) * | 1994-10-12 | 1996-04-30 | Hitachi Ltd | テレビ対話監視システム |
US5808662A (en) * | 1995-11-08 | 1998-09-15 | Silicon Graphics, Inc. | Synchronized, interactive playback of digital movies across a network |
JP3742167B2 (ja) * | 1996-12-18 | 2006-02-01 | 株式会社東芝 | 画像表示制御装置 |
JPH1144703A (ja) | 1997-07-25 | 1999-02-16 | Matsushita Electric Ind Co Ltd | 手振り入力装置 |
US6269122B1 (en) * | 1998-01-02 | 2001-07-31 | Intel Corporation | Synchronization of related audio and video streams |
JPH11203837A (ja) * | 1998-01-16 | 1999-07-30 | Sony Corp | 編集システムおよび編集方法 |
EP2237279A3 (en) * | 2000-11-29 | 2015-04-15 | Panasonic Intellectual Property Management Co., Ltd. | Recording apparatus, method and system |
JP2003271530A (ja) | 2002-03-18 | 2003-09-26 | Oki Electric Ind Co Ltd | 通信システム,システム間関連装置,プログラム,及び,記録媒体 |
CN1431827A (zh) * | 2003-02-28 | 2003-07-23 | 周健伟 | 双机组合电视摄像监视方法 |
JP2006041886A (ja) | 2004-07-27 | 2006-02-09 | Sony Corp | 情報処理装置および方法、記録媒体、並びにプログラム |
JP4716083B2 (ja) | 2004-07-27 | 2011-07-06 | ソニー株式会社 | 情報処理装置および方法、記録媒体、並びにプログラム |
JP4572615B2 (ja) | 2004-07-27 | 2010-11-04 | ソニー株式会社 | 情報処理装置および方法、記録媒体、並びにプログラム |
JP4655190B2 (ja) | 2004-08-06 | 2011-03-23 | ソニー株式会社 | 情報処理装置および方法、記録媒体、並びにプログラム |
-
2004
- 2004-07-27 JP JP2004218533A patent/JP2006041888A/ja active Pending
-
2005
- 2005-07-20 WO PCT/JP2005/013295 patent/WO2006011401A1/ja active Application Filing
- 2005-07-20 CN CN2005800257202A patent/CN1993990B/zh not_active Expired - Fee Related
- 2005-07-20 US US11/658,833 patent/US8391671B2/en active Active
Patent Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2002507027A (ja) * | 1998-03-13 | 2002-03-05 | シーメンス コーポレイト リサーチ インコーポレイテツド | 協調的ダイナミックビデオコメント作成装置および作成方法 |
JP2001148841A (ja) * | 1999-11-19 | 2001-05-29 | Nec Corp | テレビコミュニティの形成方法とそのシステム |
JP2003150529A (ja) * | 2001-11-19 | 2003-05-23 | Hitachi Ltd | 情報交換方法、情報交換端末装置、情報交換サーバ装置、プログラム |
JP2004088327A (ja) * | 2002-08-26 | 2004-03-18 | Casio Comput Co Ltd | 通信端末、通信端末処理プログラム、および画像配信サーバ、画像配信処理プログラム |
Cited By (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2007093104A1 (fr) | 2006-02-14 | 2007-08-23 | Huawei Technologies Co., Ltd. | Procédé et système de mise en oeuvre d'enregistrement multimédia et dispositif de gestion de ressources multimédia |
US8498517B2 (en) | 2006-02-14 | 2013-07-30 | Huawei Technologies Co., Ltd. | Method and system of implementing multimedia recording and media resource handling device |
EP1986431A3 (en) * | 2007-04-24 | 2011-07-27 | LG Electronics, Inc. | Video communication terminal and method of displaying images |
US8489149B2 (en) | 2007-04-24 | 2013-07-16 | Lg Electronics Inc. | Video communication terminal and method of displaying images |
EP2637401A1 (en) * | 2007-04-24 | 2013-09-11 | LG Electronics, Inc. | Video communication terminal and method of displaying images |
US9258520B2 (en) | 2007-04-24 | 2016-02-09 | Lg Electronics Inc. | Video communication terminal and method of displaying images |
CN101674470B (zh) * | 2008-09-09 | 2011-11-16 | 华为技术有限公司 | 实现客户端录制的方法、系统及录制控制实体 |
Also Published As
Publication number | Publication date |
---|---|
US20090202223A1 (en) | 2009-08-13 |
CN1993990B (zh) | 2010-05-26 |
JP2006041888A (ja) | 2006-02-09 |
CN1993990A (zh) | 2007-07-04 |
US8391671B2 (en) | 2013-03-05 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
WO2006011401A1 (ja) | 情報処理装置および方法、記録媒体、並びにプログラム | |
CN1981524B (zh) | 信息处理设备和方法 | |
JP4655190B2 (ja) | 情報処理装置および方法、記録媒体、並びにプログラム | |
WO2006011399A1 (ja) | 情報処理装置および方法、記録媒体、並びにプログラム | |
US20060025998A1 (en) | Information-processing apparatus, information-processing methods, recording mediums, and programs | |
US20060026207A1 (en) | Information-processing apparatus, information-processing methods, recording mediums, and programs | |
US20060023949A1 (en) | Information-processing apparatus, information-processing method, recording medium, and program | |
WO2006011398A1 (ja) | 情報処理装置および方法、記録媒体、並びにプログラム | |
WO2021083145A1 (zh) | 视频处理的方法、装置、终端及存储介质 | |
JP2001313915A (ja) | テレビ会議装置 | |
WO2005013618A1 (ja) | ライブストリーミング放送方法、ライブストリーミング放送装置、ライブストリーミング放送システム、プログラム、記録媒体、放送方法及び放送装置 | |
JP5359199B2 (ja) | コメント配信システム、端末、コメント出力方法及びプログラム | |
CN112004100B (zh) | 将多路音视频源集合成单路音视频源的驱动方法 | |
JP2023111906A (ja) | 記録情報作成システム、記録情報作成方法、プログラム | |
CN115086729A (zh) | 一种连麦展示方法、装置、电子设备、计算机可读介质 | |
JP6007098B2 (ja) | 歌唱動画生成システム | |
JP5111405B2 (ja) | コンテンツ制作システム及びコンテンツ制作プログラム | |
JP6063739B2 (ja) | 歌唱動画生成システム | |
JP2003140904A (ja) | 講義システム | |
JP2007199415A (ja) | カラオケ装置 | |
US20120065751A1 (en) | Signal processing apparatus and signal processing method | |
JP2014199282A (ja) | ユーザーカメラで撮影された静止画を利用可能な歌唱動画データ生成装置 | |
KR20230014522A (ko) | 크로마키를 이용한 영상제작 장치 및 그 방법 | |
CN114173147A (zh) | 将虚拟图像和3d模型与现实场景同步视频显示的系统 | |
JP2003324704A (ja) | 講義システム |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
AK | Designated states |
Kind code of ref document: A1 Designated state(s): AE AG AL AM AT AU AZ BA BB BG BR BW BY BZ CA CH CN CO CR CU CZ DE DK DM DZ EC EE EG ES FI GB GD GE GH GM HR HU ID IL IN IS KE KG KM KP KR KZ LC LK LR LS LT LU LV MA MD MG MK MN MW MX MZ NA NG NI NO NZ OM PG PH PL PT RO RU SC SD SE SG SK SL SM SY TJ TM TN TR TT TZ UA UG US UZ VC VN YU ZA ZM ZW |
|
AL | Designated countries for regional patents |
Kind code of ref document: A1 Designated state(s): BW GH GM KE LS MW MZ NA SD SL SZ TZ UG ZM ZW AM AZ BY KG KZ MD RU TJ TM AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HU IE IS IT LT LU LV MC NL PL PT RO SE SI SK TR BF BJ CF CG CI CM GA GN GQ GW ML MR NE SN TD TG |
|
121 | Ep: the epo has been informed by wipo that ep was designated in this application | ||
WWE | Wipo information: entry into national phase |
Ref document number: 200580025720.2 Country of ref document: CN |
|
NENP | Non-entry into the national phase |
Ref country code: DE |
|
122 | Ep: pct application non-entry in european phase | ||
WWE | Wipo information: entry into national phase |
Ref document number: 11658833 Country of ref document: US |