WO1998053443A1 - Graphic display mode, synchronized reproduction method, and synchronized audiovisual reproduction device - Google Patents


Info

Publication number
WO1998053443A1
Authority
WO
WIPO (PCT)
Prior art keywords
data
music
motion
character
frame
Prior art date
Application number
PCT/JP1998/002175
Other languages
English (en)
Japanese (ja)
Inventor
Seiichi Suzuki
Yutaka Shirai
Masashi Tokunaga
Haruyo Ohkubo
Kenjirou Tsuda
Tetsuya Imamura
Original Assignee
Matsushita Electric Industrial Co., Ltd.
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Priority claimed from JP12771797A external-priority patent/JP3481077B2/ja
Priority claimed from JP9131521A external-priority patent/JPH10320589A/ja
Priority claimed from JP9141927A external-priority patent/JPH10333673A/ja
Priority claimed from JP9167802A external-priority patent/JPH1116001A/ja
Priority claimed from JP9290026A external-priority patent/JPH11126066A/ja
Priority claimed from JP563298A external-priority patent/JP3475765B2/ja
Application filed by Matsushita Electric Industrial Co., Ltd. filed Critical Matsushita Electric Industrial Co., Ltd.
Priority to CA002250021A priority Critical patent/CA2250021C/fr
Priority to US09/142,813 priority patent/US6331851B1/en
Publication of WO1998053443A1 publication Critical patent/WO1998053443A1/fr


Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/20Servers specifically adapted for the distribution of content, e.g. VOD servers; Operations thereof
    • H04N21/23Processing of content or additional data; Elementary server operations; Server middleware
    • H04N21/234Processing of video elementary streams, e.g. splicing of video streams or manipulating encoded video stream scene graphs
    • H04N21/23412Processing of video elementary streams, e.g. splicing of video streams or manipulating encoded video stream scene graphs for generating or manipulating the scene composition of objects, e.g. MPEG-4 objects
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/43Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
    • H04N21/4302Content synchronisation processes, e.g. decoder synchronisation
    • H04N21/4307Synchronising the rendering of multiple content streams or additional data on devices, e.g. synchronisation of audio on a mobile phone with the video output on the TV screen
    • H04N21/43072Synchronising the rendering of multiple content streams or additional data on devices, e.g. synchronisation of audio on a mobile phone with the video output on the TV screen of multiple content streams on the same device
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F16/00Information retrieval; Database structures therefor; File system structures therefor
    • G06F16/40Information retrieval; Database structures therefor; File system structures therefor of multimedia data, e.g. slideshows comprising image and additional audio data
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/20Servers specifically adapted for the distribution of content, e.g. VOD servers; Operations thereof
    • H04N21/23Processing of content or additional data; Elementary server operations; Server middleware
    • H04N21/234Processing of video elementary streams, e.g. splicing of video streams or manipulating encoded video stream scene graphs
    • H04N21/2343Processing of video elementary streams, e.g. splicing of video streams or manipulating encoded video stream scene graphs involving reformatting operations of video signals for distribution or compliance with end-user requests or end-user device requirements
    • H04N21/234318Processing of video elementary streams, e.g. splicing of video streams or manipulating encoded video stream scene graphs involving reformatting operations of video signals for distribution or compliance with end-user requests or end-user device requirements by decomposing into objects, e.g. MPEG-4 objects
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/43Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
    • H04N21/44Processing of video elementary streams, e.g. splicing a video clip retrieved from local storage with an incoming video stream or rendering scenes according to encoded video stream scene graphs
    • H04N21/44012Processing of video elementary streams, e.g. splicing a video clip retrieved from local storage with an incoming video stream or rendering scenes according to encoded video stream scene graphs involving rendering scenes according to scene graphs, e.g. MPEG-4 scene graphs
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04LTRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L67/00Network arrangements or protocols for supporting network services or applications
    • H04L67/01Protocols
    • H04L67/131Protocols for games, networked simulations or virtual reality

Definitions

  • the present invention relates to an apparatus for displaying computer graphics (hereinafter, referred to as CG).
  • More particularly, it relates to a graphic display device in which a server and a terminal communicate so that audio data and video data are played back in synchronization on the terminal side, and to an AV synchronized playback device operating as a standalone unit.
  • CG is widely used in fields such as movies and video games.
  • In 3D CG, a three-dimensional character in many cases has a skeletal model composed of bones and joints.
  • Figure 40 shows an example of a skeletal model for a human body.
  • A solid character is displayed by fleshing out each part, such as a polygonal arm or leg, around the skeleton model and pasting a texture onto the surface of each polygon.
  • a combination of each part constituted by polygons and the texture is referred to as shape data.
  • The movement of the three-dimensional character is realized by giving instructions (that is, motion data) for moving the joints of the skeletal model.
  • One known technique uses VRML (Virtual Reality Modeling Language): the terminal uses a VRML browser to display the received data as a CG video.
  • However, when the shape data and the motion data are transferred from the server to the terminal, the data transfer amount increases, and there is a problem that the data transfer time becomes long.
  • Recently, there have been many CG works that play not only CG images but also music matched to them. As a technique for associating such CG with sound, there is the two-dimensional image processing apparatus disclosed in Japanese Patent Application Laid-Open No. Hei 8-212388.
  • In this apparatus, the start time and end time of the CG data are Cs and Ce, and the start time and end time of the sound are As and Ae, respectively. The tempo of the sound is then adjusted as: tempo = original tempo × (Ae − As) / (Ce − Cs). By adjusting the tempo of the sound based on the ratio of its playback time to that of the CG, the CG and the sound, which have different playback times, can be reproduced in synchronization.
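The following minimal sketch illustrates the tempo adjustment above; the variable names and example figures are illustrative, assuming As/Ae are the sound's start and end times and Cs/Ce the CG data's.

```python
def adjusted_tempo(original_tempo, a_start, a_end, c_start, c_end):
    """Scale the sound's tempo so that a sound of length (a_end - a_start)
    finishes exactly when the CG of length (c_end - c_start) finishes."""
    return original_tempo * (a_end - a_start) / (c_end - c_start)

# Example: a 40-second sound track matched to a 60-second CG sequence is
# slowed to 2/3 of its original tempo so that both end together.
print(adjusted_tempo(120.0, 0.0, 40.0, 0.0, 60.0))  # -> 80.0
```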
  • The present invention has been made in view of the above problems, and has as one object to provide a graphic display device that reduces the amount of CG data transferred from a server over a network and makes the movement of the three-dimensional character displayed on the terminal smooth.
  • Another object of the present invention is to provide an AV synchronized playback device that can play back a song and a video in synchronization even when the tempo changes in the middle of the song, when the tempo changes suddenly, or during special (trick) playback.
  • The graphic display device is characterized in that the server comprises data transmission means for transmitting, to the terminal via the network, scenario data describing a combination of motions, and that the terminal comprises data receiving means for receiving the scenario data transmitted by the data transmission means, a shape database required for displaying the three-dimensional character, and a motion database holding the motion data necessary for moving the three-dimensional character.
  • The terminal further comprises motion switching drawing means for displaying the three-dimensional character while switching motions in the order described in the scenario data received by the data receiving means, and automatic motion correction means for correcting the preceding and following motions so that the motion is displayed smoothly when the motion switching drawing means switches motions. Even if the individual motions do not share a home position, they can be connected naturally to display the CG video.
  • In the graphic display device according to claim 2, the server includes correction data transmitting means for transmitting, via the network, motion correction data that corrects the connection portions between the motions described in the scenario data sent by the data transmission means. In place of the terminal's automatic motion correction means, the terminal includes correction data receiving means and motion correction means that correct the motion, based on the motion correction data received by the correction data receiving means, so that the motion is displayed smoothly when the motion switching drawing means switches motions. Because the server sends precomputed correction data, the computer resources of the terminal are saved.
  • In the graphic display device according to claim 3, the server holds a motion database and calculates the correction data from that motion database in real time when transmitting, which saves memory resources on the server.
  • The graphic display method according to claim 4 of the present invention is characterized in that, when a three-dimensional character is displayed graphically at a terminal under instruction from a server, a plurality of motion data patterns describing patterns of the three-dimensional character's movement are prepared at the terminal, the server sends the terminal scenario data describing the time-series combination order of the motion patterns, and the terminal, operating on this scenario data, switches scenes by switching the motion pattern being executed at the timing of a home position common to the motion pattern being executed and the next pattern to be executed, or at a position substantially common to both.
  • The graphic display device according to claim 5 holds a motion group describing a plurality of motion patterns and character posture control means that operates the three-dimensional character based on which pattern of the motion group corresponds to each scene. The character posture control means is configured to switch scenes by switching the motion pattern being executed at the timing of a home position common to the pattern being executed and the next pattern to be executed, or at a position substantially common to both.
  • The graphic display device according to claim 6 is a graphic display device in which a server and a terminal are provided on a network and graphics are displayed at the terminal. The server is provided with a database of character data defining the shape of the three-dimensional character, a database of motions defining the movement of the character, and scenario data specifying a time-series combination of the character data and one or more of the motions. The terminal includes a character database for storing the character data, a motion database for storing the motions, data search means for searching whether the character data specified by the scenario data exists in the character database, and data request means for requesting the server to supply any character data that is not present.
  • The terminal requests the server only for character data that does not exist in the terminal's character database and creates the character movie on the terminal using the requested data together with the data it already holds. If character data necessary for drawing is missing at the terminal, only the missing character data is transferred over the network, so all the data necessary for displaying the character can be supplied to the terminal in a short communication time.
  • The graphic display device according to claim 7 comprises a server and a terminal provided on a network, with graphics displayed at the terminal. The server includes a database of character data defining the shape of a three-dimensional character, a motion database defining the movement of the character, and scenario data specifying a time-series combination of the character data and one or more of the motions. The terminal includes character data storage for storing the character data, motion data storage for storing the motions, data search means for searching whether a motion specified by the scenario data exists in the motion database, and data request means for requesting the server to supply any motion that does not exist in the motion database.
  • The terminal requests the server only for motion data that does not exist in the terminal's motion database and generates the character animation on the terminal from the requested data together with the data it already holds. If motion data necessary for drawing the three-dimensional character is missing at the terminal, only the missing motion data is transferred over the network, so all the data necessary for displaying the three-dimensional character can be supplied to the terminal in a short communication time.
  • The AV synchronized playback device according to claim 8 comprises: music performance means for playing a performance based on music data; a synchronization information table for temporarily storing, in association with one another, a music position specifying a position in the music, tempo information, and the time at which they were updated; synchronization information updating means for updating the synchronization information table based on the performance of the music performance means; music position calculating means for calculating the music position currently being played by the music performance means from the current time and the contents of the synchronization information table; a frame buffer for temporarily storing frame data; frame output means that, from the CG data associated with the music data and based on the music position calculated by the music position calculating means, calculates frame data synchronized with the performance and outputs it to the frame buffer; and video display means for displaying the frame data stored in the frame buffer as a moving image. Whereas conventionally video could be displayed in synchronization only with music data having a fixed tempo, the synchronization information updating means is configured to update the synchronization information table each time the music performance means changes either the music position or the tempo information, so that synchronization is kept even when the tempo changes.
  • Another AV synchronized playback device is characterized in that, in claim 8, the synchronization information updating means is configured to update the synchronization information table at a specific cycle; the same effect as in claim 8 is obtained while reducing the number of updates.
  • Another AV synchronized playback device is characterized in that, in claim 8, calculation time estimating means is added that estimates, from the amount of CG data, the calculation time required by the frame output means, and the frame output means outputs to the frame buffer frame data synchronized with the music position obtained by adding the time predicted by the calculation time estimating means to the music position calculated by the music position calculating means. Even when the frame data calculation takes time, the video can be played back in synchronization.
  • Another AV synchronized playback device is the device of claim 8 to which performance delay estimating means is added that estimates, from the music data, the time until the sound at the current music position is actually output as sound, and the synchronization information updating means is configured to output to the synchronization information table a music position, tempo information, and an update time delayed by the predicted performance delay. In addition to the effect of claim 8, the video can be played back in synchronization with the music without being affected by the performance delay.
  • In another AV synchronized playback device, video display delay estimating means is added that predicts the display delay time from when the CG data is reproduced until the video display means can actually display the data in the frame buffer, and the frame output means is configured to output to the frame buffer frame data synchronized with the music position obtained by adding the time predicted by the video display delay estimating means to the music position calculated by the music position calculating means.
  • Another AV synchronized playback device is the device of claim 8 to which are added special playback start notification means that generates a special playback start signal when the music performance means starts special (trick) playback, special playback end notification means that generates a special playback end signal when the music performance means ends the special playback, and special playback synchronization information updating means that outputs the music position to the synchronization information table in real time during special playback; while special playback is in progress, frame data is output to the frame buffer on the basis of the synchronization information table updated by the special playback synchronization information updating means. The video can therefore be kept synchronized with the music data even during special playback, whereas with the conventional technology the synchronization between the music data and the video could be lost partway through.
  • Another AV synchronized playback device reproduces music data and video data in synchronization with each other. It comprises: a beat generation circuit that, each time the music progresses by a specific note, outputs as a synchronization message the music position specifying the current position in the music and the tempo information on which the performance tempo is based; an AV synchronization instruction data generation circuit that generates AV synchronization instruction data associating the progress of the music data with the progress of the video data on the basis of the specific note; a frame buffer; and a display frame determination circuit that, when a synchronization message is input from the beat generation circuit, determines the video data to be written to the frame buffer on the basis of the tempo information contained in the synchronization message, the time interval ΔT elapsed since the synchronization message was input, the progress position of the frame of the video data written to the frame buffer when the synchronization message was input, and the frame progress position that the AV synchronization instruction data specifies for the time when the next synchronization message is input.
  • Another AV synchronized playback device reproduces music data and the motion data of a CG character in synchronization with each other. It comprises: a beat generation circuit that outputs, as a synchronization message, the music position specifying the current position in the music and the tempo information on which the performance tempo is based; an AV synchronization instruction data generation circuit that generates AV synchronization instruction data associating the progress of the music data with the progress of the motion data of the CG character on the basis of the specific note; a frame buffer; and a character posture calculation circuit that, when a synchronization message is input from the beat generation circuit, determines the posture of the CG character to be written into the frame buffer on the basis of the tempo information contained in the synchronization message, the time interval ΔT elapsed since the synchronization message was input, the frame progress position of the motion data written to the frame buffer when the synchronization message was input, and the frame progress position that the AV synchronization instruction data specifies for the time when the next synchronization message is input.
  • An AV synchronous reproducing apparatus is characterized in that, in claim 15 or claim 16, a tempo change input circuit for inputting a change in the tempo information is provided.
  • In the AV synchronized playback method, each time the music progresses by a specific number of notes, the music position, which specifies the position in the music based on the specific note, and the tempo information of the music are transmitted from the sound generation unit to the image generation unit.
  • The AV synchronized playback method according to claim 19 is characterized in that, when playing back music data, each time the music progresses by a specific note, the music position specifying the position in the music at that time and the tempo information on which the performance tempo is based are output as a synchronization message; AV synchronization instruction data associating the progress of the music data with the progress of the video data on the basis of the specific note is generated; and, when a synchronization message is input, the video data to be written into the frame buffer is determined on the basis of the tempo information contained in the synchronization message, the progress position of the frame of the video data written in the frame buffer when the synchronization message was input, the frame progress position that the AV synchronization instruction data specifies for the time when the next synchronization message is input, and the elapsed time interval ΔT.
  • The AV synchronized playback method according to claim 20 is characterized in that, when playing back music data, each time the music progresses by a specific note, the music position specifying the position in the music at that time and the tempo information on which the performance tempo is based are output as a synchronization message; AV synchronization instruction data associating the progress of the music data with the progress of the motion data of the CG character on the basis of the specific note is generated; and, when a synchronization message is input, the posture of the CG character to be written into the frame buffer is determined on the basis of the tempo information contained in the synchronization message, the frame progress position of the CG character's motion data written in the frame buffer when the synchronization message was input, the frame progress position that the AV synchronization instruction data specifies for the time when the next synchronization message is input, and the elapsed time interval ΔT.
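A hedged sketch of the frame-determination rule described in the two preceding paragraphs: between two synchronization messages the frame progress position is assumed to advance linearly from the position written at the current message toward the position the AV synchronization instruction data assigns to the next message, in proportion to the elapsed time ΔT relative to the tempo time. The function and variable names are illustrative, not taken from the patent.

```python
def frame_position_between_beats(f_at_message, f_at_next_message,
                                 tempo_time, delta_t):
    """Interpolate the frame progress position between two synchronization
    messages.  f_at_message: frame position written when the current message
    arrived; f_at_next_message: position the AV synchronization instruction
    data assigns to the next message; tempo_time: time until that next
    message, derived from the tempo information; delta_t: time elapsed since
    the current message."""
    fraction = min(delta_t / tempo_time, 1.0)   # clamp at the next beat
    return f_at_message + (f_at_next_message - f_at_message) * fraction

# Example: 0.25 s after a beat whose tempo time is 0.5 s, the frame position
# has advanced halfway from frame 30 toward frame 45, i.e. to frame 37.5.
print(frame_position_between_beats(30.0, 45.0, 0.5, 0.25))
```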
  • The AV synchronized playback method according to claim 21 of the present invention comprises, in addition to claim 19 or claim 20, a step of inputting a change in the tempo information and a step of changing the tempo information of the synchronization message accordingly.
  • A recording medium according to claim 22 is characterized in that a computer program for realizing the AV synchronized playback method according to any one of claims 19 to 21 is recorded on it.
  • FIG. 1 is a configuration diagram of (Embodiment 1) of the present invention.
  • FIG. 2 is an explanatory diagram of the shape data of the embodiment.
  • Fig. 3 is an explanatory diagram of a coordinate change over time of the motion Ma of the embodiment.
  • Fig. 4 is a motion graph of the motion Ma of the embodiment.
  • Fig. 5 is a motion graph of the motion Mb of the embodiment.
  • Fig. 6 is a motion graph of the motions Ma and Mb of the embodiment, and Fig. 7 is the corrected motion graph.
  • Fig. 8 shows the configuration of (Embodiment 2).
  • FIG. 9 is an explanatory diagram of correction data according to the embodiment.
  • FIG. 10 shows the configuration of the third embodiment.
  • Fig. 11 shows the configuration of (Embodiment 4).
  • FIG. 12 is an explanatory diagram showing an example of the motion of the embodiment.
  • Figure 13 shows the structure of scenario data according to the embodiment.
  • FIG. 14 is an explanatory diagram showing an example of motions connected according to the scenario data of the embodiment.
  • FIG. 15 is a flowchart of the motion switching drawing means of the embodiment.
  • FIG. 16 is a block diagram of (Embodiment 5) of the present invention.
  • Fig. 17 is a flowchart of the terminal shortage data processing of the embodiment.
  • Fig. 18 is a flowchart of the server's missing data processing according to the embodiment.
  • FIG. 19 is a block diagram of (Embodiment 6) of the present invention.
  • FIG. 20 is a flow chart of the embodiment.
  • FIG. 21 is a block diagram of (Embodiment 7) of the present invention.
  • FIG. 22 is a block diagram of (Embodiment 8) of the present invention.
  • FIG. 23 is a block diagram of (Embodiment 9) of the present invention.
  • FIG. 24 is a block diagram of (Embodiment 10) of the present invention.
  • FIG. 25 is a block diagram of (Embodiment 11) of the present invention.
  • FIG. 26 is a block diagram of (Embodiment 12) of the present invention.
  • FIG. 27 is a block diagram of (Embodiment 13) of the present invention.
  • FIG. 28 is an explanatory diagram showing the relationship between the number of beats of the music, the motion data, the scenario, and the AV synchronization instruction data in the embodiment.
  • FIG. 29 is a timing chart showing the temporal relationship between music data, CG character posture calculation, and rendering processing in the embodiment.
  • FIG. 30 is a flowchart showing a music performance reproduction process according to the embodiment.
  • FIG. 31 is a flowchart showing a process of generating AV synchronization instruction data according to the embodiment.
  • FIG. 32 is a flowchart showing a CG character reproduction process according to the embodiment.
  • FIG. 33 is a configuration diagram of (Embodiment 14) of the present invention.
  • Fig. 34 is a flow chart showing the playback process of music performance in this embodiment.
  • FIG. 35 is a block diagram of (Embodiment 15) of the present invention.
  • FIG. 36 is an explanatory diagram showing the relationship between the number of beats of music, video data, video scenario data, and AV synchronization instruction data in the embodiment.
  • FIG. 37 is a flowchart showing a moving image playback process in the embodiment.
  • FIG. 38 is a configuration diagram of (Embodiment 16) of the present invention.
  • Fig. 39 is a flowchart showing the playback process of music performance in the embodiment.
  • Figure 40 is an illustration of the skeleton data of a three-dimensional character
  • a server 51 and a terminal 52 are connected by a network 53.
  • the server 51 has a data transmission means 1 and a scenario database 74.
  • the terminal 52 is composed of the data receiving means 11, the motion switching drawing means 12, the automatic motion correction means 13, the shape database 14, the motion database 15, and the display means 16.
  • the data transmission means 1 transmits, via the network 53, the scenario data 61 to be displayed on the terminal 52 out of the plurality of scenario data stored in the scenario database 74.
  • the scenario data 61 specifies the three-dimensional character to be displayed and the combination order of the motions for moving it, and is a list of motion IDs specifying those motions.
  • the motion corresponding to each motion ID is supplied to the terminal in advance through a recording medium such as a CD-ROM or a floppy disk or through the network 53, and is stored in the motion database 15 of the terminal 52.
  • for example, when the motions corresponding to the motion IDs {Ma, Mb, Mc, Md, Me} are stored in the motion database 15, the scenario data [Ma, Mc, Ma, Me] (indicating that the motions are switched and displayed in order from the top of the list) can be specified. The scenario [Ma, Mf, Ma, Me] cannot be specified, because Mf is not stored in the motion database 15. Any combination of the motions stored in the motion database 15 can be defined as scenario data.
  • the data receiving means 11 receives the scenario transmitted by the data transmitting means 1 via the network 53.
  • the shape database 14 stores the shape data.
  • the shape data consists of a set of polygons, each composed of three or more points given by three-dimensional (x, y, z) coordinates.
  • the motion database 15 stores motions. Motion defines the amount of change over time in a time series.
  • for example, the shape of a triangular prism as shown in Fig. 2 consists of six vertices a, b, c, d, e, and f, and five polygons given as subsets of the vertices: {a, b, c}, {d, e, f}, {a, b, d, e}, {a, c, d, f}, {b, c, e, f} (this object is called object A).
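The sketch below shows one way such shape data could be held in memory for object A; the coordinate values are placeholders, since the patent does not give them.

```python
# Shape data for the triangular prism "object A": six 3D vertices and five
# polygons, each polygon listed as a subset of vertex names.
vertices = {
    "a": (0.0, 0.0, 0.0), "b": (1.0, 0.0, 0.0), "c": (0.5, 1.0, 0.0),
    "d": (0.0, 0.0, 2.0), "e": (1.0, 0.0, 2.0), "f": (0.5, 1.0, 2.0),
}
polygons = [
    ("a", "b", "c"),        # one triangular end cap
    ("d", "e", "f"),        # the opposite end cap
    ("a", "b", "d", "e"),   # side face
    ("a", "c", "d", "f"),   # side face
    ("b", "c", "e", "f"),   # side face
]
```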
  • the motion of rotating object A by 180 degrees around the x axis and 360 degrees around the y axis in 60 seconds is as shown in Fig. 3 (this motion is called motion Ma).
  • the rotation angle is defined for a specific time (this is called a key frame), and the rest is calculated by an interpolation algorithm.
  • several general algorithms have been proposed for key frame interpolation. Here, a one-dimensional linear interpolation algorithm is used and described below.
  • the motion database 15 stores each motion as what is generally called a key frame set. When the change of the motion is graphed, the motion graph is as shown in Fig. 4; the black circles and triangles in Fig. 4 indicate key frames.
  • the motion switching drawing means 12 displays the CG moving image while sequentially switching the motions based on the scenario data 61 received by the data receiving means 11.
  • Each motion is a set of keyframes within a certain time (motion time).
  • motion Ma is a motion having seven key frames in 60 seconds. By performing one-dimensional linear interpolation between key frames, the motion graph shown in Fig. 4 can be restored.
  • one-dimensional linear interpolation is used for key frame interpolation, but other known methods such as spline nonlinear interpolation exist.
  • the present invention does not particularly limit the interpolation method, since the same effect can be obtained by using any of these methods.
  • for a frame at an intermediate time, the x-axis rotation angle and the y-axis rotation angle are calculated by linear interpolation between the surrounding key frames; in the example above, the x-axis rotation angle comes to 100 degrees, and the y-axis rotation angle is obtained in the same way. A frame can be displayed by performing the rendering process after these coordinate conversion processes.
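A minimal sketch of the one-dimensional linear key frame interpolation used here; the key frame values are illustrative (the actual motion Ma holds seven key frames, which are not listed in the text).

```python
def interpolate(keyframes, t):
    """One-dimensional linear interpolation of a rotation angle.
    keyframes: list of (time, angle) pairs sorted by time."""
    for (t0, v0), (t1, v1) in zip(keyframes, keyframes[1:]):
        if t0 <= t <= t1:
            return v0 + (v1 - v0) * (t - t0) / (t1 - t0)
    return keyframes[-1][1]  # clamp past the last key frame

# Illustrative two-key-frame case of motion Ma: object A rotates 180 degrees
# about the x axis and 360 degrees about the y axis over 60 seconds.
x_rotation = [(0.0, 0.0), (60.0, 180.0)]
y_rotation = [(0.0, 0.0), (60.0, 360.0)]
print(interpolate(x_rotation, 30.0), interpolate(y_rotation, 30.0))  # 90.0 180.0
```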
  • the motion switching drawing means 12 performs this series of processes from the start to the end of each motion based on the frame rate (which specifies how many frames are drawn per second) and displays the CG moving image. When one motion finishes within its specified motion time, the CG video of the next motion specified in the scenario data is displayed. In this way, the CG moving image is displayed continuously by switching from one motion to the next.
  • the motion automatic correcting means 13 corrects the motion when the motion switching drawing means 12 switches the motion. If the motions are displayed one after another, the motions may be discontinuous at the joint.
  • FIG. 6 shows the combined motion graphs of Ma and Mb.
  • discontinuities occur at the 60-second frame of motion Ma and the 0-second frame of motion Mb.
  • therefore, the average of each value of the 60-second frame of motion Ma and the 0-second frame of motion Mb is set as the key frame at the time of switching, so that the connection between the motions does not become discontinuous; this is how the key frame for switching from motion Ma to motion Mb is calculated.
  • the motion automatic correction means 13 calculates a key frame from the preceding and following motions when switching motions, and automatically corrects the motion. There are a plurality of known correction methods. Since the same effect can be obtained by using any of these methods, the present invention does not particularly limit the correction method.
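A minimal sketch of this averaging correction, using the key frame values given later in the text for motions Ma and Mb; the dictionary keys are illustrative.

```python
def correction_keyframe(last_kf_of_prev, first_kf_of_next):
    """Average each value of the previous motion's final key frame and the
    next motion's first key frame to obtain the key frame used at the switch."""
    return {axis: (last_kf_of_prev[axis] + first_kf_of_next[axis]) / 2.0
            for axis in last_kf_of_prev}

# Ma's 60-second key frame and Mb's 0-second key frame (values from the
# example in the text) give x = 160 and y = 340 degrees at the joint.
ma_end = {"x_rot": 180.0, "y_rot": 360.0}
mb_start = {"x_rot": 140.0, "y_rot": 320.0}
print(correction_keyframe(ma_end, mb_start))  # {'x_rot': 160.0, 'y_rot': 340.0}
```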
  • since only the scenario data specifying the motion combination order is transmitted to the terminal 52 through the network 53, and the shape data and motion data of the three-dimensional character, which have a large data capacity, are not transmitted, the amount of data transferred is small and the burden on the network is small.
  • since the automatic motion correction means 13 of the terminal 52 corrects the motion when switching motions, the movement of the three-dimensional character displayed on the display means 16 of the terminal 52 is very smooth.
  • the server 51 and the terminal 52 are connected by a network 53.
  • the server 51 includes a data transmission means 1 and a correction data transmission means 21.
  • the terminal 52 is composed of the data receiving means 11, the correction data receiving means 22, the motion switching drawing means 12, the display means 16, the motion correction means 23, the shape database 14, and the motion database 15.
  • 1, 11, 12, 14, 15, and 16 are the same as those in the first embodiment, and a description thereof will be omitted.
  • the correction data transmission means 21 transmits correction data 62 for correcting the motions of the scenario data 61 transmitted by the data transmission means 1.
  • the correction data 62 are calculated in advance by the same method as the automatic motion correction means 13 described in (Embodiment 1), and are data for making the movement of the three-dimensional character smooth before and after a motion switch.
  • the correction data 62 specifies the motion and the time of the frame.
  • the correction data 62 for the motion Ma and the motion Mb are as shown in FIG. 9.
  • when motion Mb is displayed after motion Ma, the 60-second key frame of Ma (x-axis rotation angle 180 degrees, y-axis rotation angle 360 degrees) and the 0-second key frame of motion Mb (x-axis rotation angle 140 degrees, y-axis rotation angle 320 degrees) are corrected to an x-axis rotation angle of 160 degrees and a y-axis rotation angle of 340 degrees.
  • such correction data 62 is needed for each motion switch in the scenario data. For example, when the scenario data specifies the five motions {Ma, Mb, Mc, Md, Me}, correction data for the joints of the four switches (Ma, Mb), (Mb, Mc), (Mc, Md), and (Md, Me) is required.
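A small sketch of how such per-switch correction data could be precomputed for a whole scenario, reusing the averaging rule above; the data layout is an assumption.

```python
def build_correction_data(scenario, last_keyframe, first_keyframe):
    """Precompute one correction key frame (the average of the previous
    motion's last key frame and the next motion's first key frame) for every
    adjacent pair of motions in the scenario."""
    corrections = {}
    for prev, nxt in zip(scenario, scenario[1:]):
        kf_prev, kf_next = last_keyframe[prev], first_keyframe[nxt]
        corrections[(prev, nxt)] = {
            axis: (kf_prev[axis] + kf_next[axis]) / 2.0 for axis in kf_prev
        }
    return corrections

# A scenario of five motions needs correction data for four switches:
scenario = ["Ma", "Mb", "Mc", "Md", "Me"]
print(list(zip(scenario, scenario[1:])))
# [('Ma', 'Mb'), ('Mb', 'Mc'), ('Mc', 'Md'), ('Md', 'Me')]
```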
  • the correction data transmitting means 21 transmits the correction data 62 to the terminal 52.
  • the correction data receiving means 22 receives the correction data 62 transmitted from the server.
  • the motion correction means 23 corrects the motion based on the correction data 62 received by the correction data receiving means 22 when the motion switching drawing means 12 switches the motion, making the movement of the character smooth.
  • when the motion switching drawing means 12 switches from motion Ma to motion Mb, the key frames at 60 seconds and 0 seconds, respectively, are corrected as shown in the corrected motion graph of Fig. 7, and the correction data for Ma and Mb are as shown in Fig. 9.
  • only the scenario data 61 defining the combination order of the motions and the correction data 62 for the switching of each motion are transmitted; the shape data of the three-dimensional character, which has a large amount of data, is not transmitted, and not all of the motion data is transmitted, so the burden on the network 53 is small.
  • since the motion is corrected based on the transmitted correction data 62, the movement of the three-dimensional character displayed on the display means 16 of the terminal 52 is very smooth.
  • FIG. 10 shows (Embodiment 3).
  • the server 51 and the terminal 52 are connected by a network 53.
  • the server 51 comprises the data transmission means 1, a motion database 76, the correction data transmission means 21, and correction scenario calculation means 31.
  • the terminal 52 comprises the data receiving means 11, the correction data receiving means 22, the motion switching drawing means 12, the display means 16, the motion correction means 23, the shape database 14, and the motion database 15.
  • the correction scenario calculation means 31 calculates, based on the motion database 76, correction data so that each motion of the scenario data transmitted by the data transmission means 1 is displayed smoothly, and sends the correction data 62 to the terminal using the correction data transmission means 21.
  • the method of calculating the correction data is the same as that of the automatic motion correction means 13 described in (Embodiment 1), and thus the description thereof is omitted here.
  • since the motion is corrected based on the transmitted correction data, the movement of the three-dimensional character displayed on the display means 16 of the terminal 52 is very smooth.
  • FIGS. 11 to 15 show (Embodiment 4).
  • a server 91 and a terminal 92 are connected by a network 53.
  • the server 91 has a data transmission means 1 and a scenario database 74.
  • the terminal 92 includes a data receiving means 11, a motion switching drawing means 12, a display means 16, a shape database 14, and a motion data base 15.
  • 1, 11, 12, 14, 15, 16, and 74 are the same as those in the first embodiment.
  • the shape database 14 of the terminal 92 stores skeletal models of various solid characters, polygon data for parts such as the head, chest, waist, both arms, both hands, and both feet, and texture data to be attached to the surfaces of the polygons.
  • the motions M1, M2, and M3 for the respective scenes "scene S1", "scene S2", and "scene S3" are accumulated in the motion database 15.
  • motion M1 and motion M2 shown in FIG. 12 are provided with a movement home position HP1 at which they share a common posture.
  • the motion M2 and M3 are provided with a common home position HP2.
  • Motion M3 has the same home position at the beginning of movement (HP 2) and at the end of movement (HP 2).
  • the shape database 14 and the motion database 15 can be composed of, for example, a CD-ROM, a DVD (Digital Video Disc), a hard disk, a rewritable semiconductor RAM, or a rewritable optical disk.
  • Figure 13 shows the structure of the scenario data 61.
  • the scenario data 61 is composed of shape data identification information and motion data designation information.
  • the shape data identification information consists of skeleton model identification information for identifying the skeleton model of the three-dimensional character to be displayed, polygon identification information for identifying the polygons that flesh out the skeleton model, and texture identification information for specifying the textures to be attached to the surfaces of the polygons.
  • the three-dimensional character to be displayed is specified by the shape data identification information.
  • in the motion data designation information, the order of the motions used in the scenario data 61 and the time width indicating the time from the beginning to the end of each motion are recorded.
  • One motion data is assigned one scene number.
  • in the scenario data, the motion order is specified in advance so that the posture at the end of the movement of a motion (Mi) (that is, the home position at the end of the movement) and the posture at the start of the next motion (Mi+1) (that is, the home position at the start of the movement) match.
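A hypothetical sketch of how scenario data 61 could be laid out, following the structure just described; the identifiers and time widths are illustrative, not values from the patent.

```python
# Scenario data 61: shape data identification information naming the skeleton
# model, polygons and textures of the character, plus motion data designation
# information giving, per scene, the motion ID and its time width.
scenario_61 = {
    "shape_identification": {
        "skeleton_model": "character_skeleton_01",   # illustrative IDs
        "polygons": "character_polygons_01",
        "textures": "character_textures_01",
    },
    "motion_designation": [
        {"scene": "S1", "motion": "M1", "time_width": 4.0},  # ends at HP1
        {"scene": "S2", "motion": "M2", "time_width": 6.0},  # HP1 -> HP2
        {"scene": "S3", "motion": "M3", "time_width": 5.0},  # starts/ends at HP2
    ],
}
```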
  • Fig. 14 shows the movement of the three-dimensional character designated by the scenario data 61 in Fig. 13 as the movement of the skeleton data.
  • the scenario data 61 in Fig. 13 composes scenes S1, S2, and S3 by moving the three-dimensional character using the motions M1 to M3 of Figs. 12(a) to 12(c); it specifies the order of the motions in which the three-dimensional character is to be moved and the length of time from the start to the end of each motion (hereinafter referred to as the time width).
  • scene S1 is assigned the motion M1 of Fig. 12(a) for the time width T1, and the posture of the three-dimensional character moves from the initial posture of motion M1 to its last posture (HP1) over the time width T1. The posture at any required moment can be determined by a known method, for example one-dimensional linear interpolation or spline interpolation.
  • motion M2 is allocated to scene S2 for the time width T2, and the three-dimensional character moves from the first posture of motion M2 in Fig. 12(b) (HP1) to its last posture (HP2) over the time width T2.
  • for the motions M1 and M2 of the temporally adjacent scenes S1 and S2, the posture at the end of motion M1 and the posture at the start of the next motion M2 are set to be the same home position (HP1).
  • likewise, in the scenario data of Fig. 13, for the motions M2 and M3 of the temporally adjacent scenes S2 and S3, the posture at the end of motion M2 and the posture at the start of the next motion M3 are set to be the same home position (HP2); therefore, the movement of the three-dimensional character from scene S2 to scene S3 in Fig. 14 is smooth.
  • in this way, the scenario data matches the home positions of the motion (Mi) used in one scene and the motion (Mi+1) used in the next, temporally adjacent scene. If the posture at the start of the movement and the posture at the end of the movement of a motion are the same home position (HP3), the same motion M3 can be specified any number of times in succession in the scenario data; this can be used to repeat the same action, for example a step in which the left and right feet are raised alternately.
  • the motion switching drawing means 12 in Fig. 11 performs the processing shown in Fig. 15: in (step 1) the scenario data 61 is read in, and in (step 2) the motions specified by the scenario data are fetched from the motion database.
  • in (step 3), the time scale of each motion Mi is adjusted so that the movement from the beginning to the end of the motion is completed within the time width Ti; the adjustment means stretching or compressing the time scale of the motion.
  • in (step 4), the motions Mi whose time scales have been adjusted are arranged in order according to the scenario data. Since a motion originally holds only key frames as information, the data between one key frame and the next must be generated by interpolation; therefore, in (step 5), key frame interpolation is performed.
  • there are many known methods of key frame interpolation, such as one-dimensional linear interpolation and spline nonlinear interpolation; since the same effect can be obtained with any of them, the interpolation method is not limited.
  • in (step 6), the posture of the three-dimensional character is determined from the arranged motions at each frame interval given by the frame rate, rendered, and displayed on the display means 16.
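The sketch below strings steps 1 to 6 together under stated assumptions: the scenario layout from the earlier sketch, a motion database mapping motion IDs to key frame sets with a nominal duration, and caller-supplied render and display callables; all of these names are illustrative.

```python
def play_scenario(scenario, motion_db, frame_rate, render, display):
    """Steps 1-6 of FIG. 15: fetch each motion named in the scenario,
    stretch or compress it to its time width, interpolate key frames at the
    frame rate, then render and display each frame in order."""
    for entry in scenario["motion_designation"]:                  # steps 1-2
        motion = motion_db[entry["motion"]]
        scale = entry["time_width"] / motion["duration"]          # step 3
        keyframes = [(t * scale, pose) for t, pose in motion["keyframes"]]
        n_frames = int(entry["time_width"] * frame_rate)          # step 4
        for i in range(n_frames):
            pose = interpolate_pose(keyframes, i / frame_rate)    # step 5
            display(render(pose))                                 # step 6

def interpolate_pose(keyframes, t):
    """One-dimensional linear interpolation of every joint value."""
    for (t0, p0), (t1, p1) in zip(keyframes, keyframes[1:]):
        if t0 <= t <= t1:
            w = (t - t0) / (t1 - t0)
            return {k: p0[k] + (p1[k] - p0[k]) * w for k in p0}
    return keyframes[-1][1]
```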
  • the amount of data transmitted is small and the load on the network 53 is light.
  • since motion switching is performed at the timing of a common home position, at which the three-dimensional character takes the same posture, the graphic display is smooth.
  • in the above, scene switching was executed by switching the motion pattern being executed at the timing of a home position that exactly coincides between the motion pattern being executed and the next motion pattern to be executed; however, almost the same effect can be expected even if the terminal is configured so that the time of each scene is determined such that scene switching is executed by switching the motion pattern at the timing of a position that is substantially common to the motion pattern being executed and the one to be executed next.
  • FIGS. 16 to 18 show (Embodiment 5).
  • the terminal 32 is connected to the server 31 via the network 53.
  • 14 is the shape database of the terminal 32, 15 is its motion database, and 18 is a memory; 74 is the scenario database of the server 31, 75 is its shape database, and 76 is its motion database.
  • the motion database 15 and the shape database 14 of the terminal 32 can be composed of a CD-ROM, a DVD (Digital Video Disc), a rewritable semiconductor RAM, a hard disk, a rewritable optical disc, or the like.
  • the memory 18 is made of a rewritable semiconductor RAM, a hard disk, a rewritable optical disk, or the like.
  • the server 31 refers to a so-called dedicated server machine of the client/server type; a machine that combines both the server function and the client function of a peer-to-peer configuration in one unit is also called a server here.
  • the shape databases 14 and 75 store data on various types of three-dimensional characters; each data item is assigned a unique identification number.
  • skeleton models of the various 3D characters and the corresponding polygon data such as arms, legs, face, and body are managed and stored under those numbers.
  • the data stored in the shape databases 14 and 75 do not always match; some data stored in the shape database 75 of the server 31 is not stored in the shape database 14 of the terminal 32.
  • the motion databases 15 and 76 store motion data corresponding to each three-dimensional character. Each motion is assigned a unique identification number and is managed and stored under it. The data stored in these databases is not always the same; some motion data held in the motion database 76 on the server 31 is not stored in the motion database 15 of the terminal 32.
  • the missing data search means 71 checks whether all the shape data and motions specified in the scenario data 61 sent from the server 31 are contained in the shape database 14 and the motion database 15.
  • if any data is missing, the missing data search means 71 sends the identification information of that data to the data request means 72.
  • the data requesting means 72 sends the data request for missing data and the identification information of the data (or those data) to the data selecting means 73 of the server 31 via the network 53.
  • at the same time, the identification number (for example, an IP address) of the terminal 32 that issued the download request is transmitted.
  • the data selection means 73 finds the corresponding data in the shape database 75 or the motion database 76 from the download request and the data identification information, and passes it to the data transmission means 1 together with the identification number of the terminal 32 that issued the download request.
  • the data transmission means 1 transmits the received data via the network 53 to the terminal 32 of the received identification number.
  • the data receiving means 11 of the terminal 32 saves the received data in the memory 18.
  • the motion switching drawing means 12 uses the data stored in the shape database 14, the motion database 15, and the memory 18 to draw as specified in the scenario data 61, and displays the graphics on the display means 16.
  • FIG. 17 shows a flowchart of the processing of the terminal 32.
  • the scenario data 61 has already been transferred from the server 31 to the terminal 32 via the network 53.
  • in (step B1), the scenario data 61 is read into the terminal 32.
  • in (step B2), the missing data search means 71 searches the shape database 14 and the motion database 15 and checks, based on the shape data and motion identification information, whether all the shape data and motions specified by the scenario data 61 exist; the result is determined in (step B3).
  • if it is determined in (step B3) that some data does not exist, then in (step B4) the data request means 72 requests the server 31 to transfer the missing shape data and/or motion to the terminal 32. The request is transmitted to the server 31 together with the identification number of the requesting terminal 32.
  • FIG. 18 shows the processing of the server 31.
  • the data selection means 73 of the server 31 retrieves the motion with the identification number requested by the terminal 32 from the motion database 76 in the server 31 and sends it to the data transmission means 1.
  • the data transmission means 1 transfers the motion to the terminal 32 which has requested this motion. At this time, the transfer destination is specified by the identification number of the terminal 32.
  • the transferred motion (that is, the missing data) is received by the data receiving means 11 of the terminal 32 and stored in the memory 18.
  • in (step B6), the motion switching drawing means 12 combines the motions as specified in the scenario data 61, generates a continuous series of movement, and draws the three-dimensional character, which is displayed on the display means 16 in (step B7).
  • the rendering method and the like are the same as in (Embodiment 1).
  • when the character data and motions required by the terminal 32 are insufficient, only the missing character data or motions are transferred, so the communication time can be shortened compared with the conventional case in which the character data and motions are transferred every time.
  • in this example, the missing data is motion data.
  • the terminal 32 is connected to the server 31 similarly.
  • the server 31 retrieves the requested data from its database and sends it to the terminal 32 via the network 53, thereby compensating for the data lacking at the terminal and making the display possible.
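A terminal-side sketch of steps B1 to B5 under stated assumptions: the scenario layout from the earlier sketch, local databases represented as dictionaries keyed by identification number, and an assumed request_from_server callable that sends the data request together with the terminal's identification number and returns the transferred data.

```python
def ensure_data_available(scenario, shape_db, motion_db, request_from_server):
    """Check that every shape and motion named in the scenario is present
    locally, and ask the server only for the items that are missing."""
    missing = []
    for data_id in scenario["shape_identification"].values():
        if data_id not in shape_db:
            missing.append(("shape", data_id))
    for entry in scenario["motion_designation"]:
        if entry["motion"] not in motion_db:
            missing.append(("motion", entry["motion"]))
    memory = {}
    if missing:
        # Only the insufficient data crosses the network (steps B4-B5).
        memory = request_from_server(missing, terminal_id="terminal-32")
    return memory
```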
  • FIGS. 19 and 20 show (Embodiment 6).
  • FIG. 19 shows an AV synchronous reproducing apparatus according to the present invention.
  • the music playing means 101 reads music data and plays music based on it.
  • the music data defines all parameters necessary for music performance, such as the performance tempo, timbre, and pitch, as in the MIDI (Musical Instrument Digital Interface) standard.
  • the music playing means 101 updates the position of the music currently being played in real time.
  • the music position is specified by the total number of beats from the beginning of the music.
  • any method may be used as long as the music position can be uniquely specified.
  • the position of the song is specified by the number of beats from the beginning of the tune that is currently being played (hereinafter, this is defined as beat ID).
  • in the present embodiment, the tempo information on which the performance is based is defined as the unit time of one beat (hereinafter called the tempo time); any information that controls the performance tempo may be used instead.
  • the synchronization information table 102 temporarily stores, in association with one another, a beat ID as the music position, a tempo time as the tempo information, and the time at which they were updated (hereinafter called the synchronization information update time).
  • the synchronization information updating means 103 stores the beat ID being played by the music performance means 101, the tempo time, and the update time in the synchronization information table 102.
  • the music position calculating means 104 calculates, from the current time and the synchronization information table 102, the music position corresponding to the current time. The music position is calculated by the following formula (A): H(tc) = Ht + (tc − t) / Pt, where tc is the current time, Ht is the beat ID stored in the synchronization information table, t is the synchronization information update time, and Pt is the tempo time.
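A minimal sketch of formula (A); the table is represented as a dictionary and the field names are illustrative.

```python
import time

def current_music_position(sync_table, t_now=None):
    """Formula (A): H(tc) = Ht + (tc - t) / Pt, with Ht the stored beat ID,
    t the synchronization information update time and Pt the tempo time
    (seconds per beat)."""
    if t_now is None:
        t_now = time.time()
    return (sync_table["beat_id"]
            + (t_now - sync_table["update_time"]) / sync_table["tempo_time"])

# Example: the table was updated at beat 16 with a tempo time of 0.5 s/beat;
# two seconds later the performance is at beat 16 + 2 / 0.5 = 20.
sync_table = {"beat_id": 16, "tempo_time": 0.5, "update_time": 100.0}
print(current_music_position(sync_table, t_now=102.0))  # 20.0
```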
  • the frame output means 106 calculates, from the CG data associated with the music data being played by the music performance means 101 and based on the music position calculated by the music position calculating means 104, frame data synchronized with the performance, and outputs it to the frame buffer 105, which temporarily stores frame data.
  • the frame data corresponding to the music position can be calculated using the frame interpolation technique. Since there are many known techniques for such frame interpolation techniques, such as spline interpolation and linear interpolation, their description will be omitted.
  • from CG data defined by a start time, an end time, and key frame data, it is possible to calculate the frame data for any frame progress position Ft (start time ≤ Ft ≤ end time).
  • in the CG data related to the music data, the start time and the end time are associated with music positions.
  • the frame data synchronized with the performance of the musical performance means 101 can be always output to the frame buffer 105 by using the frame interpolation technique based on the progress position of the frame calculated as described above. .
  • the video display means 107 displays a moving image by sequentially updating the display with the frame data stored in the frame buffer 105. Fig. 20 shows a concrete flowchart of the AV synchronized playback device configured in this way.
  • the AV synchronous playback device repeats the following operation from the start beat ID to the end beat ID associated with CG data.
  • a process of reproducing in synchronization with the start beat ID (Hs) and the end beat ID (He) will be described.
  • in (step S101), the music position calculating means 104 calculates the music position (H(t)) corresponding to the current time (t) using equation (A).
  • in (step S102), if the music position (H(t)) is smaller than the start beat ID (Hs), nothing is done and the process ends; if it is equal to or greater than the start beat ID (Hs), the process proceeds to (step S103).
  • in (step S103), if the music position (H(t)) is larger than the end beat ID (He), nothing is done; if it is equal to or less than the end beat ID (He), the process proceeds to (step S104).
  • in (step S104), the frame output means 106 calculates frame data using the frame interpolation technique, based on the frame progress position Ft calculated from equation (B), and outputs it to the frame buffer 105.
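A sketch of steps S101 to S104, reusing the helpers from the earlier sketches. Equation (B) is not reproduced in this text, so the frame progress position Ft is assumed here to map the beat range [Hs, He] linearly onto the clip's [start time, end time]; that mapping and the field names are assumptions.

```python
def output_synchronized_frame(sync_table, cg_clip, frame_buffer, t_now):
    """Compute the current music position H(t) (S101); skip if it lies before
    Hs or after He (S102-S103); otherwise derive Ft and write the
    interpolated frame data to the frame buffer (S104)."""
    h = current_music_position(sync_table, t_now)             # step S101
    hs, he = cg_clip["start_beat"], cg_clip["end_beat"]
    if h < hs or h > he:                                      # steps S102-S103
        return
    start_t, end_t = cg_clip["start_time"], cg_clip["end_time"]
    ft = start_t + (h - hs) / (he - hs) * (end_t - start_t)   # assumed eq. (B)
    frame_buffer.append(interpolate_pose(cg_clip["keyframes"], ft))  # S104
```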
  • FIG. 21 shows (Embodiment 7).
  • the only difference is that synchronization information real-time updating means 201 is provided in place of the synchronization information updating means 103 of (Embodiment 6) shown in Fig. 19; the other components are the same as in (Embodiment 6).
  • the synchronization information real-time updating means 201 updates the synchronization information in the synchronization information table 102 only when the music performance means 101 updates its data, for example only when the tempo information or the music position changes; while the tempo information and the music position remain unchanged, the table is not updated.
  • FIG. 22 shows (Embodiment 8).
  • In this embodiment, a synchronization information periodic updating means 301 is provided in place of the synchronization information updating means 103 of (Embodiment 6) shown in FIG. 19; this is the only difference, and the other components are the same as in (Embodiment 6).
  • the synchronization information periodic updating means 301 updates the synchronization information table 102 at regular intervals.
  • The period may be a musical unit, for example once per beat, or a general time unit, for example once every 30 seconds.
(Embodiment 9)
  • FIG. 23 shows (Embodiment 9).
  • In this embodiment, a calculation time estimating means 401 is added to (Embodiment 6) shown in FIG. 19.
  • the other components are the same as in (Embodiment 6).
  • the calculation time estimating means 401 predicts the time required for the calculation from the number of polygons and the number of vertices in CG data.
  • the calculation time is proportional to the number of polygons and vertices to be calculated.
  • the operation time is always estimated from the number of polygons and vertices to be processed next.
  • The calculation time estimating means 401 makes its estimate in consideration of the processing capacity of the machine, such as the CPU processing capacity of the computer. Therefore, even when the number of polygons differs from frame to frame, the calculation time can be predicted according to that number.
  • The frame output means 106 adds the calculation time predicted by the calculation time estimating means 401 to the music position calculated by the music position calculating means 104, and outputs frame data synchronized with the resulting music position to the frame buffer 105.
  • Here, Hs is the start beat ID associated with the start time, and He is the end beat ID associated with the end time.
  • In this way, frame data synchronized with the performance of the music playing means 101 can always be output to the frame buffer 105 using the frame interpolation technique.
  • FIG. 24 shows (Embodiment 10).
  • In this embodiment, a performance delay estimating means 501 is added to (Embodiment 6) shown in FIG. 19.
  • the other components are the same as in (Embodiment 6).
  • The performance delay estimating means 501 predicts, based on the music data, the performance delay time until the data is actually output as sound from an output device such as a speaker.
  • the performance delay time is proportional to the number of simultaneous sounds at the music position.
  • The prediction is made in consideration of the processing capacity of the machine, such as the CPU processing capacity of the computer. Therefore, once a music position is specified, the performance delay time at that music position can be predicted in real time from the music data.
  • the synchronization information updating means 103 outputs to the synchronization information table 102 a value obtained by adding the performance delay time predicted by the performance delay estimating means 501 to the music position.
  • FIG. 25 shows (Embodiment 11).
  • In this embodiment, a display delay estimating means 601 is added to (Embodiment 6) shown in FIG. 19.
  • the other components are the same as in (Embodiment 6).
  • The display delay estimating means 601 predicts the display delay time until the video display means 107 actually displays the data in the frame buffer 105. This can be predicted from the performance of the video display means 107, such as its rendering capability.
  • The frame output means 106 calculates frame data for the value obtained by adding the display delay time estimated by the display delay estimating means 601 to the music position calculated by the music position calculating means 104.
  • The frame advance position value is calculated using the following equation (D), where Dt is the display delay time predicted by the display delay estimating means 601.
  • In this way, frame data synchronized with the performance of the music playing means 101 can always be output to the frame buffer 105 by using the frame interpolation technique.
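  • The delay-compensation embodiments above all amount to adding a predicted latency to the music position before frame data is generated. The sketch below assumes simple proportional cost models (calculation time proportional to polygon and vertex counts, performance delay proportional to the number of simultaneous sounds), with constants and names chosen for illustration only.

```python
def predicted_calc_time(num_polygons, num_vertices,
                        cpu_factor=1.0, k_poly=2e-6, k_vert=1e-6):
    """Embodiment 9: calculation time assumed proportional to the number of
    polygons and vertices, scaled by the machine's processing capacity."""
    return cpu_factor * (k_poly * num_polygons + k_vert * num_vertices)

def predicted_performance_delay(simultaneous_sounds,
                                cpu_factor=1.0, k_sound=5e-4):
    """Embodiment 10: performance delay assumed proportional to the number
    of simultaneously sounding notes at the music position."""
    return cpu_factor * k_sound * simultaneous_sounds

def compensated_music_position(h, tempo_time, calc_time=0.0,
                               performance_delay=0.0, display_delay=0.0):
    """Add the predicted latencies (in seconds) to the music position h,
    converting them to beats with the tempo time (seconds per beat)."""
    lead = calc_time + performance_delay + display_delay
    return h + lead / tempo_time

h_adj = compensated_music_position(
    h=18.5, tempo_time=0.5,
    calc_time=predicted_calc_time(12_000, 8_000),
    performance_delay=predicted_performance_delay(16),
    display_delay=1 / 30)   # e.g. one display refresh interval (Embodiment 11)
print(h_adj)
```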
  • FIG. 26 shows (Embodiment 12).
  • In this embodiment, a trick play start notifying means 701, a trick play end notifying means 702, and a trick play synchronization information updating means 703 are added to (Embodiment 6) shown in FIG. 19.
  • the other components are the same as in (Embodiment 6).
  • The trick play start notifying means 701 generates a trick play start signal when trick play is started by an operation.
  • The trick play end notifying means 702 generates a trick play end signal when trick play is ended by an operation.
  • During the period from when the trick play start notifying means 701 generates the trick play start signal until the trick play end notifying means 702 generates the trick play end signal, the trick play synchronization information updating means 703 updates the music position and the tempo time in the synchronization information table 102 according to the type of trick play.
  • For example, in the case of double-speed playback, the tempo time is reduced to 1/2 of the normal value, so that the beat ID advances at twice the normal speed.
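  • As an illustration of this trick-play update, the following sketch scales the tempo time by a playback-speed factor; with formula (A), a halved tempo time automatically makes the beat ID advance twice as fast. The speed-factor interface is an assumption, not the patent's exact data layout.

```python
def apply_trick_play(sync_table, speed_factor):
    """Scale the synchronization information for trick play: at double speed
    (speed_factor=2.0) the tempo time becomes half the normal value, so the
    music position computed from the table advances at twice the normal rate."""
    sync_table["tempo_time"] /= speed_factor
    return sync_table

table = {"beat_id": 32, "tempo_time": 0.5, "updated_time": 200.0}
apply_trick_play(table, speed_factor=2.0)   # tempo_time becomes 0.25 s/beat
print(table)
```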
  • CG graphics: a three-dimensional CG graphic image drawn by computer graphics
  • CG character: a three-dimensional CG graphic image drawn by computer graphics
  • The AV synchronous playback device shown in FIG. 27 includes a music data storage unit D1 that receives a performance start command from a user or an operator and stores music data, an output waveform generation unit D2 that generates waveform data of the performance sound based on the music data output from the music data storage unit D1, and a sound data buffer D3 that temporarily stores a certain amount of waveform data from the output waveform generation unit D2.
  • The AV synchronous playback device further includes a D/A converter D4 that converts the waveform data from the sound data buffer D3 into an analog sound signal, an amplifier D5 that amplifies the sound signal from the D/A converter D4, and a speaker D6 that sounds the amplified sound signal from the amplifier D5 as the performance.
  • The music data storage unit D1 is composed of a rewritable recording medium, for example a RAM, and acquires in advance, before a performance start command is input, the music data of the music to be played via a CD-ROM, DVD or similar recording medium, or via a communication line. Since the structure of the music data is the same as in (Embodiment 6), its description is omitted. Further, since the sound data buffer D3 stores a fixed amount of waveform data, the performance reproduced through the speaker D6 can be prevented from being interrupted.
  • A beat generation unit D7 that outputs a synchronization message is connected to the music data storage unit D1. Based on the tempo information included in the music data from the music data storage unit D1, the beat generation unit D7 generates, every time the music performance progresses by one specific note, for example one quarter note, a synchronization message consisting of the current performance position of the music (beat ID) and the tempo time Temp(i) (see FIG. 29). Since the beat ID and the tempo time are the same as in (Embodiment 6), their description is omitted.
  • The beat generation unit D7 is also connected to the sound data buffer D3, and each time the waveform data of the performance sound is output from the sound data buffer D3 to the D/A converter D4, the output timing is transmitted from the sound data buffer D3 to the beat generation unit D7. For example, the sound data buffer D3 informs the beat generation unit D7 that the waveform data has been output at a rate of 44.1 kHz.
  • In this way, the output timing from the sound data buffer D3 functions as a 44.1 kHz internal clock for the beat generation unit D7.
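  • A minimal sketch of this idea, assuming the sample-output notifications from the sound data buffer act as the clock: the beat generation unit counts samples and emits a synchronization message of (beat ID, tempo time) once one quarter note's worth of samples has been output. Class and method names are illustrative.

```python
class BeatGenerator:
    SAMPLE_RATE = 44_100  # the buffer's output notifications act as a 44.1 kHz clock

    def __init__(self, tempo_time, on_sync_message):
        self.tempo_time = tempo_time          # seconds per quarter note, Temp(i)
        self.on_sync_message = on_sync_message
        self.beat_id = 0
        self.samples_since_beat = 0

    def notify_samples_output(self, n):
        """Called by the sound data buffer each time n samples of waveform
        data are handed to the D/A converter."""
        samples_per_beat = int(self.tempo_time * self.SAMPLE_RATE)
        self.samples_since_beat += n
        while self.samples_since_beat >= samples_per_beat:
            self.samples_since_beat -= samples_per_beat
            self.beat_id += 1
            # Synchronization message: current performance position and tempo time.
            self.on_sync_message(self.beat_id, self.tempo_time)

gen = BeatGenerator(tempo_time=0.5,
                    on_sync_message=lambda i, t: print(f"beat {i}, Temp(i)={t} s"))
for _ in range(100):
    gen.notify_samples_output(512)   # e.g. the buffer outputs 512-sample blocks
```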
  • As devices for reproducing the CG character based on the scenario data, the AV synchronous playback device includes a scenario data storage unit D8 for storing scenario data, an AV synchronization instruction data generation unit D9 connected to the scenario data storage unit D8 to generate AV synchronization instruction data, a display frame determination unit D10 connected to the beat generation unit D7 and the AV synchronization instruction data generation unit D9, a CG rendering unit D11, a frame buffer D12, and a monitor D13.
  • To the display frame determination unit D10 are connected a motion data storage unit D14 that stores motion data specifying the motion (movement) of the CG character, and a CG character shape data storage unit D15 that stores CG character shape data indicating the shape of the CG character.
  • To the CG rendering unit D11 are connected a camera viewpoint information storage unit D16 and a light source information storage unit D17, which store camera viewpoint information and light source information, respectively, for rendering the CG character to be displayed.
  • The scenario data storage unit D8, the motion data storage unit D14, the CG character shape data storage unit D15, the camera viewpoint information storage unit D16, and the light source information storage unit D17 are each composed of a rewritable recording medium, for example a RAM.
  • Before the performance start command is input, the scenario data storage unit D8, the motion data storage unit D14, and the CG character shape data storage unit D15 receive and store the scenario data, the motion data, and the CG character shape data, respectively, via a CD-ROM, DVD or similar recording medium, or via a communication line.
  • The scenario data input to the scenario data storage unit D8 is an instruction for generating a series of movements of the CG character by combining a plurality of motion data in time series.
  • The scenario data includes camera viewpoint information and light source information, and specifies the motion data, the CG character shape data, the camera viewpoint information, and the light source information for each frame to be displayed.
  • The camera viewpoint information and the light source information are output to and stored in the camera viewpoint information storage unit D16 and the light source information storage unit D17 at the same time that the scenario data is stored in the scenario data storage unit D8.
  • the camera viewpoint information and the light source information are information indicating the shooting conditions of the virtual camera and the irradiation conditions of the virtual light source, respectively.
  • The camera viewpoint information is data designating the camera position, the shooting direction, and the zoom.
  • the light source information is composed of data specifying the position of the light source, the irradiation direction, and the effect.
  • The AV synchronization instruction data generation unit D9 generates the AV synchronization instruction data by associating, for each motion data segment divided by the scenario data, the progress of the music data with the progress of the motion of the CG character in units of one quarter note (the specific note).
  • That is, the AV synchronization instruction data generation unit D9 generates AV synchronization instruction data that specifies the frame to be displayed each time the music advances by one beat of the quarter note (specific note).
  • FIG. 28(a) shows the relationship between the number of beats of the music, the motion data, and the scenario data in the AV synchronous playback device shown in FIG. 27, and FIG. 28(b) shows the relationship between the number of beats of the music, the motion data, the scenario data, and the AV synchronization instruction data.
  • In FIGS. 28(a) and (b), the horizontal axis indicates the beat ID.
  • a quarter note is used as the specific note to be a reference for the number of beats.
  • FIG. 28(b) shows an enlarged view of the portion of FIG. 28(a) from the first beat to the H1-th beat.
  • The scenario data associates the number of beats of the music in the music data stored in the music data storage unit D1 with the motion data stored in the motion data storage unit D14.
  • The beat IDs 1 to H1 of the music are associated with the motion data M1 by the scenario data: the first beat specifies the N1-1st original frame, and the H1-th beat specifies the N2-1st original frame.
  • The beat IDs (H1+1) to H2 of the music are associated with the motion data M2 by the scenario data: the (H1+1)-th beat specifies the N1-2nd original frame, and the H2-th beat specifies the N2-2nd original frame.
  • Likewise, the beat IDs (H2+1) to H3 are associated with the motion data M3 by the scenario data: the (H2+1)-th beat specifies the N1-3rd original frame, and the H3-th beat specifies the N2-3rd original frame.
  • These frames are called original frames because the frames actually displayed on the monitor D13 are determined by the display frame determination unit D10 on the basis of them.
  • For example, motion data of 100 frames can be made to progress over six beats for one piece of music and over eight beats for another, so that a single motion can be used for movements of various speeds. Since this reduces the storage capacity required of the motion data storage unit D14, it is economical.
  • The AV synchronization instruction data is data that associates the progress of the music data with the progress of the motion of the CG character, and is generated by dividing the original frames assigned to each motion data equally, one beat of the quarter note (specific note) at a time.
  • As shown in FIG. 28(b), the original frames N1-1 to N2-1 allocated to the motion data M1 are divided so that the frames n2-1, n3-1, n4-1, ..., which indicate the progress positions of the frames to be displayed on the monitor D13, correspond to the second, third, fourth, ... beats.
  • Some of the frames n2-1, n3-1, n4-1, ... are not stored in the motion data storage unit D14. However, this causes no problem, because in the subsequent reproduction processing of the CG character the display frame determination unit D10 obtains the frames that are not stored from the frames that are stored in the motion data storage unit D14, using a well-known data interpolation method such as spline interpolation.
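  • One way the AV synchronization instruction data could be generated from a scenario entry is sketched below: the original frames assigned to a motion are divided equally among the beats allotted to it, so each beat maps to a frame progress position (positions falling between stored original frames are later filled in by interpolation). Data structures and names are illustrative only.

```python
def av_sync_instruction(first_beat, last_beat, first_frame, last_frame):
    """Map each beat in [first_beat, last_beat] to a frame progress position by
    dividing the original frames [first_frame, last_frame] into equal steps,
    one step per beat of the specific note (the quarter note)."""
    beats = last_beat - first_beat
    step = (last_frame - first_frame) / beats
    return {first_beat + k: first_frame + k * step for k in range(beats + 1)}

# A motion spanning original frames 11..21 allotted to beats 1..9:
table = av_sync_instruction(first_beat=1, last_beat=9,
                            first_frame=11.0, last_frame=21.0)
print(table[1], table[2], table[9])   # 11.0, 12.25, 21.0
```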
  • The display frame determination unit D10 shown in FIG. 27 calculates and determines, for each frame of the CG character to be displayed, the posture of the CG character based on the AV synchronization instruction data and the synchronization message.
  • More specifically, when the time interval at which the frame buffer D12 outputs an image to the monitor D13 (the image display device) is ΔT, the display frame determination unit D10 determines the posture of the CG character to be written to the frame buffer D12 from the synchronization message input from the beat generation unit D7, the motion of the CG character and the frame progress position specified by the AV synchronization instruction data, and the time interval ΔT.
  • As a specific value of the time interval ΔT, for example, when the monitor D13 displays the CG character 30 times per second, ΔT is 1/30 second.
  • The display frame determination unit D10 compares the performance position of the music (the i-th beat) in the synchronization message input from the beat generation unit D7 with the AV synchronization instruction data, and obtains the frame progress position fi of the CG character's motion written to the frame buffer D12 at the time this synchronization message is input, and the frame progress position Fi+1 specified by the AV synchronization instruction data for the timing at which the next synchronization message will be sent (the (i+1)-th beat).
  • Then, using the time interval from the i-th beat to the (i+1)-th beat indicated by the tempo time Temp(i) of the synchronization message input this time, the display frame determination unit D10 calculates, by the following equation (1), the frame progress position frame(j) of the CG character to be displayed and advanced every ΔT seconds.
  • Here, j is a count value that counts the number of frames written to the frame buffer D12 between the time the display frame determination unit D10 receives one synchronization message and the time it receives the next one.
  • The value of j is set to 0 when a synchronization message is input, and then increases one by one, up to (Temp(i)/ΔT), as the frames of the CG character progress.
  • The values of i and Temp(i) are kept the same until the next synchronization message at the (i+1)-th beat is input; when the synchronization message at the (i+1)-th beat is input, they are updated to the values of that synchronization message.
  • The display frame determination unit D10 calculates and determines the posture of the CG character for each frame to be displayed by using the function P(frame progress position) of the motion data, which is obtained by applying spline interpolation to the original frames of the motion data, with the frame progress position frame(j) as its variable.
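  • Equation (1) itself is not legible in this text, but from the surrounding description (frame progress positions fi and Fi+1 at consecutive beats, the display interval ΔT, and a count j reset to 0 at each synchronization message) it is plausibly the linear interpolation sketched below; treat this as a reconstruction under that assumption, not the patent's verbatim formula.

```python
def frame_progress(f_i, f_next, temp_i, delta_t, j):
    """Plausible form of equation (1): interpolate the frame progress position
    between f_i (at the i-th beat) and f_next (at the (i+1)-th beat), where
    delta_t is the display interval and j counts frames output since the last
    synchronization message (so j * delta_t <= temp_i)."""
    return f_i + (f_next - f_i) * (j * delta_t) / temp_i

# 30 fps display (delta_t = 1/30 s) and a tempo time of 0.5 s per beat:
# j runs from 0 to 14 between two beats, moving the frame from 12.0 toward 13.0.
for j in range(3):
    print(frame_progress(12.0, 13.0, temp_i=0.5, delta_t=1/30, j=j))
```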
  • Next, the display frame determination unit D10 calculates the vertex coordinate data of each polygon of the CG character from the determined posture of the CG character. Further, the display frame determination unit D10 reads out the CG character shape data designated by the scenario data from the CG character shape data storage unit D15, and creates the image data of the CG character using that shape data and the calculated vertex coordinate data. Thereafter, the display frame determination unit D10 outputs the created image data to the CG rendering unit D11 together with the scenario data. The display frame determination unit D10 also records the progress position fi of the frame sent to the CG rendering unit D11 in order to determine whether the scenario data has been completed.
  • The CG rendering unit D11 renders the CG character for each frame based on the camera viewpoint information and the light source information included in the scenario data. That is, from the image data supplied by the display frame determination unit D10, the CG rendering unit D11 creates image data of the CG character shot under the specified camera conditions and illuminated under the specified light source conditions, such as a spotlight or sunlight.
  • The CG rendering unit D11 outputs the rendered CG character image data to the frame buffer D12 for writing. Then, the frame buffer D12 outputs the image data of the CG character to the monitor D13, and the monitor D13 displays the CG character.
  • Next, the temporal relationship between the posture calculation performed for each frame by the display frame determination unit D10 and the rendering processing in the CG rendering unit D11 will be described.
  • FIG. 29 is a timing chart showing a temporal relationship between the music data reproduction, the CG character posture calculation, and the rendering processing in the AV synchronous reproduction device shown in FIG. 27.
  • In FIG. 29, the arrow "T" indicates the passage of time, and the vertical lines "A", "B", and "C" indicate, respectively, the progress of the music performance of the music data, the progress of the posture calculation of the CG character in the display frame determination unit D10, and the progress of the rendering processing in the CG rendering unit D11.
  • this synchronous message includes the (i) beat, which is the current playing position of the music, and the tempo time Temp (i) of the music.
  • Using the synchronization message, the display frame determination unit D10 calculates, by equation (1), the frame progress position frame(j) of the CG character to be advanced every ΔT seconds. The display frame determination unit D10 then determines the posture of the CG character using the function P(frame progress position) of the motion data, and calculates the vertex coordinate data of each polygon constituting the CG character. In the figure, the time required for this posture calculation (including the calculation of the vertex coordinate data) is represented as the posture calculation at the i-th beat, the posture calculation at the (i + ΔT/Temp(i))-th beat, and so on. After that, the display frame determination unit D10 creates image data based on the calculated vertex coordinate data and the CG character shape data designated by the scenario data.
  • the created image data is output from the display frame determination unit D10 to the CG rendering unit D11 each time, and the rendering process of the CG character is started.
  • FIG. 30 shows a flowchart of the music performance playback processing.
  • First, the music data storage unit D1 and the scenario data storage unit D8 acquire the music data and the scenario data, respectively, via a recording medium or a communication line (step S1).
  • Next, the AV synchronization instruction data generation unit D9 generates the AV synchronization instruction data based on the scenario data from the scenario data storage unit D8 (step S2).
  • Fig. 31 shows the procedure for creating the AV synchronization instruction data in the AV synchronization instruction data generation unit D9.
  • The AV synchronization instruction data generation unit D9 receives the scenario data from the scenario data storage unit D8 (step S13), and generates the AV synchronization instruction data based on that scenario data (step S14).
  • In step S3, when a performance start command is input, the music data storage unit D1 sequentially outputs the stored music data to the output waveform generation unit D2.
  • The output waveform generation unit D2 generates a digital waveform of the performance sound based on the music data (step S4), and sequentially outputs it to the sound data buffer D3.
  • the sound buffer D3 temporarily stores a certain amount of waveform data (step S5). After that, the waveform data is output from the sound data buffer D3 to the D / A converter D4.
  • the D / A converter D4 converts the input waveform data into an analog sound signal (step S6). Then, the D / A converter D4 outputs the sound signal to the amplifier D5, and the amplifier D5 amplifies the sound signal (step S7). Subsequently, the sound signal from the amplifier D5 is generated as a performance by the speaker D6 (step S8).
  • the sound buffer D3 transmits the timing to the beat generator D7 every time the waveform data is output to the D / A converter D4 (step S9).
  • Every time the music progresses by one quarter note, the beat generation unit D7 generates a synchronization message consisting of the current music position (the i-th beat) and the tempo time Temp(i), and outputs it to the display frame determination unit D10 (step S10).
  • the beat generating section D7 determines whether or not the music is at the end (step S11). If it is not the end of the music, the process returns to the process shown in (Step S4) again. If it is the end of the music, the performance ends (step S12).
  • Figure 32 shows the playback process of the CG character.
  • When the display frame determination unit D10 receives a synchronization message from the beat generation unit D7 (step S15), it sets the value of j to 0 (step S16). As described above, j is a count of the number of frames written to the frame buffer D12 between the input of one synchronization message and the input of the next.
  • The display frame determination unit D10 compares the music performance position in the synchronization message input from the beat generation unit D7 with the AV synchronization instruction data, and obtains the frame progress positions fi and Fi+1 of the CG character's motion data at the time of the current synchronization message (the i-th beat) and at the time of the next synchronization message (the (i+1)-th beat).
  • Using the time length from the i-th beat to the (i+1)-th beat indicated by the tempo time Temp(i) of the synchronization message input this time, the display frame determination unit D10 calculates, by equation (1), the frame progress position frame(j) of the CG character to be displayed and advanced every ΔT seconds (step S17).
  • The display frame determination unit D10 then calculates the posture of the CG character for each frame to be displayed, using the function P(frame progress position) of the CG character's motion data with the frame progress position frame(j) as its variable (step S18).
  • The display frame determination unit D10 generates the image data at the frame progress position frame(j) and outputs it to the CG rendering unit D11 (step S19).
  • the CG rendering unit D11 renders the input image data based on the camera viewpoint information and the light source information included in the scenario data (step S20).
  • The CG rendering unit D11 writes the rendered image data into the frame buffer D12 (step S21). Then, the monitor D13 (FIG. 27) displays the image data from the frame buffer D12 (step S22).
  • When performing the processing of step S19, the display frame determination unit D10 records the progress position fi of the frame output to the CG rendering unit D11 (step S23). Based on the recorded progress position fi, the display frame determination unit D10 then determines whether the scenario data has been displayed to its end (step S24).
  • If the display has not been completed, the display frame determination unit D10 increases the value of j by one (step S25) and returns to the processing of step S17.
  • If the display has been completed, the drawing of the CG character is ended (step S26).
  • FIGS. 33 and 34 show (Embodiment 14).
  • In this embodiment, a tempo change information input section D18, which receives a tempo change instruction for the music and changes the tempo time of the synchronization message output from the beat generation unit D7, is added to FIG. 27 showing (Embodiment 13).
  • the other components are the same as those in (Embodiment 13).
  • The tempo change information input section D18 is connected to the beat generation unit D7.
  • When a tempo change instruction for the music is input from the user or an external device during reproduction of the music (at the i-th beat), the tempo change information input section D18 obtains, as shown in the following equation (2), a new tempo time Temp(i) by multiplying the original tempo time Temp(i) included in the music data stored in the music data storage unit D1 by the proportionality constant Cs.
  • the new tempo time Temp (i) is output from the tempo change information input section D18 to the beat generation section D7, and is used as the tempo time of the synchronous message output from the beat generation section D7.
  • The tempo change information input section D18 obtains the original tempo time Temp(i) via the recording medium or the communication line at the same time that the music data storage unit D1 acquires the music data to be played back.
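  • A one-line sketch of the tempo change described by equation (2), with names chosen for illustration:

```python
def changed_tempo_time(original_temp_i, cs):
    """Equation (2): the new tempo time is the original Temp(i) from the music
    data multiplied by the proportionality constant Cs supplied with the tempo
    change instruction (Cs > 1 lengthens the beat, slowing the performance)."""
    return cs * original_temp_i

print(changed_tempo_time(0.5, cs=1.2))   # 0.6 s per beat, i.e. a slower tempo
```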
  • FIG. 34 shows the music performance playback processing in the AV synchronous playback device shown in FIG. 33.
  • First, the music data storage unit D1 and the scenario data storage unit D8 acquire the music data and the scenario data, respectively, via a recording medium or a communication line.
  • the AV synchronization instruction data generation unit D9 generates AV synchronization instruction data based on the scenario data from the scenario data storage unit D8.
  • The tempo change information input section D18 checks whether a tempo change instruction has been input. When a tempo change instruction is input, the tempo change information input section D18 changes the tempo time of the synchronization message output from the beat generation unit D7 based on the input tempo change instruction (step S34).
  • the music data storage unit D1 sequentially outputs the stored music data to the output waveform generation unit D2.
  • The output waveform generation unit D2 generates a digital waveform of the performance sound based on the music data (step S35), and sequentially outputs it to the sound data buffer D3.
  • The sound data buffer D3 temporarily stores a fixed amount of waveform data (step S36). After that, the waveform data is output from the sound data buffer D3 to the D/A converter D4.
  • the D / A converter D4 converts the input waveform data into an analog sound signal (step S37). Then, the D / A converter D4 outputs the sound signal to the amplifier D5, and the amplifier D5 amplifies the sound signal (step S38). Subsequently, the sound signal from the amplifier D5 is generated as a performance by the speaker D6 (step S39).
  • Every time the sound data buffer D3 outputs waveform data to the D/A converter D4, it transmits the timing to the beat generation unit D7 (step S40). Then, based on the tempo time included in the music data, every time the music progresses by one quarter note, the beat generation unit D7 generates a synchronization message consisting of the music performance position at that time (the i-th beat) and the tempo time Temp(i), and outputs it to the display frame determination unit D10 (step S41).
  • the beat generating section D7 determines whether or not the music is at the end (step S42). If it is not the end of the music, the process returns to the process shown in (Step S34) again. If it is the end of the music, the performance ends (step S43).
  • the display frame determining unit generates the image data of the moving image synchronized with the music data by using the frame interpolation method. This makes it possible to automatically match the playback process of the moving image with the playback process of the music performance, and always synchronize the playback processes.
  • FIGS. 35 to 37 show the AV synchronous reproducing apparatus according to the (Embodiment 15).
  • In this embodiment, instead of the reproduction of the CG character shown in FIG. 27 showing (Embodiment 13), a series of moving image data whose frame length is not constant is reproduced in synchronization with the music performance.
  • the other parts are the same as those in (Embodiment 13), and the duplicate description thereof will be omitted.
  • Examples of such moving image data include moving image data compressed by a non-frame-independent compression method such as the MPEG (Moving Picture Experts Group) standard, and moving image data whose frames are not of fixed length.
  • As devices for performing the moving image playback processing, the apparatus of FIG. 35 includes a moving image scenario data storage unit D8' for storing moving image scenario data, an AV synchronization instruction data generation unit D9' connected to the moving image scenario data storage unit D8' to generate AV synchronization instruction data based on that data, and a display frame determination unit D10' connected to the beat generation unit D7 and the AV synchronization instruction data generation unit D9'.
  • a moving image data storage unit D 14 ′ for storing moving image data is connected to the display frame determining unit D 10 ′.
  • The display frame determination unit D10' determines, for each frame to be displayed, image data synchronized with the music data based on the moving image data, and outputs it to the frame buffer D12.
  • The moving image scenario data storage unit D8' and the moving image data storage unit D14' are each composed of a rewritable recording medium, for example a RAM, and receive and store the moving image scenario data and the moving image data via a CD-ROM, DVD or similar recording medium, or via a communication line.
  • FIG. 36(a) is an explanatory diagram showing the relationship between the number of beats of the music, the moving image data, and the moving image scenario data in the AV synchronous playback device shown in FIG. 35, and FIG. 36(b) shows the relationship between the number of beats of the music, the moving image data, the moving image scenario data, and the AV synchronization instruction data.
  • In FIGS. 36(a) and (b), the horizontal axis indicates the number of beats of the specific note from the start of the music performance.
  • FIG. 36(b) is an enlarged view of the part of FIG. 36(a) from the first beat to the H1-th beat.
  • The moving image scenario data associates the number of beats of the music in the music data stored in the music data storage unit D1 with the original frames of the moving image data stored in the moving image data storage unit D14'.
  • The first to H1-th beats of the music are associated with the moving image data B1 by the moving image scenario data: the first beat specifies the N1-1st original frame, and the H1-th beat specifies the N2-1st original frame.
  • The (H1+1)-th to H2-th beats are associated with the moving image data B2 by the moving image scenario data: the (H1+1)-th beat specifies the N1-2nd original frame, and the H2-th beat specifies the N2-2nd original frame.
  • Likewise, the (H2+1)-th to H3-th beats are associated with the moving image data B3 by the moving image scenario data: the (H2+1)-th beat specifies the N1-3rd original frame, and the H3-th beat specifies the N2-3rd original frame.
  • These frames are called original frames because the frames actually displayed on the monitor D13 are determined by the display frame determination unit D10' on the basis of them.
  • For example, moving image data of 100 frames can be made to progress over six beats for one piece of music and over eight beats for another, so that a single set of moving image data can be used at various playback speeds.
  • The AV synchronization instruction data is data that associates the progress of the music data with the progress of the moving image data in units of one quarter note (the specific note) for each moving image data segment divided by the moving image scenario data, and is generated by dividing the original frames assigned to each moving image data equally, one beat of the quarter note (specific note) at a time.
  • The frames n2-1, n3-1, n4-1, ... indicate the progress positions of the frames displayed on the monitor D13.
  • Some of the frames n2-1, n3-1, n4-1, ... are not stored in the moving image data storage unit D14'. However, this causes no problem, because in the subsequent reproduction processing of the moving image the display frame determination unit D10' obtains the frames that are not stored from the frames that are stored in the moving image data storage unit D14', using a well-known data interpolation method such as spline interpolation.
  • The moving image data B2 is likewise divided into frames n2-2, n3-2, n4-2, ... that indicate the frame progress positions corresponding to the respective beats of the music data.
  • The display frame determination unit D10' compares the performance position of the music (the i-th beat) in the synchronization message input from the beat generation unit D7 with the AV synchronization instruction data, and obtains the frame progress position fi written to the frame buffer D12 at the time this synchronization message is input, and the frame progress position Fi+1 specified by the AV synchronization instruction data for the timing at which the next synchronization message will be sent (the (i+1)-th beat).
  • Using the time interval from the i-th beat to the (i+1)-th beat indicated by the tempo time Temp(i) of the synchronization message input this time, the display frame determination unit D10' calculates, by the following equation (3), the frame progress position frame(j') of the moving image data to be displayed and advanced every ΔT seconds.
  • Here, j' is a count value that counts the number of frames written to the frame buffer D12 between the time the display frame determination unit D10' receives one synchronization message and the time it receives the next one. In other words, the value of j' is set to 0 when a synchronization message is input, and then increases one by one, up to (Temp(i)/ΔT), as the frames of the moving image data progress. In equation (3), the values of i and Temp(i) are kept the same until the next synchronization message at the (i+1)-th beat is input; when the synchronization message at the (i+1)-th beat is input, they are updated to the values of that synchronization message.
  • The display frame determination unit D10' calculates and determines the image data of each frame to be displayed by using the function D(frame progress position) of the moving image data, which is obtained by applying spline interpolation to the original frames of the moving image data, with the frame progress position frame(j') as its variable. After that, the display frame determination unit D10' outputs the created image data to the frame buffer D12 and writes it there. The frame buffer D12 then outputs the image data to the monitor D13, and the monitor D13 displays the moving image.
  • the display frame determination unit D 10 ′ records the progress position fi of the frame sent to the frame buffer D 12 in order to determine whether or not the moving image scenario has been completed.
  • FIG. 37 shows the moving image playback processing in the AV synchronous playback device shown in FIG. 35.
  • When the display frame determination unit D10' receives a synchronization message from the beat generation unit D7 (step S45), it sets the value of j' to 0 (step S46). As described above, j' is a count of the number of frames written to the frame buffer D12 between the input of one synchronization message and the input of the next.
  • The display frame determination unit D10' compares the music performance position in the synchronization message input from the beat generation unit D7 with the AV synchronization instruction data, obtains the frame progress positions fi and Fi+1 at the current (i-th) and next ((i+1)-th) beats, and then calculates, by equation (3), the frame progress position frame(j') of the moving image data to be displayed and advanced every ΔT seconds (step S47).
  • For each frame to be displayed, the display frame determination unit D10' calculates the image data using the function D(frame progress position) of the moving image data with the frame progress position frame(j') as its variable (step S48).
  • The display frame determination unit D10' outputs the calculated image data to the frame buffer D12 (step S49), and the frame buffer D12 writes the image data to be displayed (step S50). Then, the monitor D13 displays the image data from the frame buffer D12 (step S51).
  • When performing the processing of step S49, the display frame determination unit D10' records the progress position fi of the frame output to the frame buffer D12 (step S52). Based on the recorded progress position fi, the display frame determination unit D10' determines whether the moving image scenario data has been displayed to its end (step S53). If the display has not been completed, the display frame determination unit D10' increases the value of j' by one (step S54) and returns to the processing of step S47. On the other hand, if the display has been completed, the reproduction of the moving image is ended (step S55).
  • In the embodiment described next, a tempo change information input section is provided which receives a tempo change instruction for the music and changes the tempo information of the synchronization message output from the beat generation unit.
  • FIGS. 38 and 39 show (Embodiment 16) of the present invention.
  • In this embodiment, a tempo change information input section D18, which receives a tempo change instruction for the music and changes the tempo time of the synchronization message output from the beat generation unit D7, is added.
  • the other parts are the same as those shown in FIG. 35 showing (Embodiment 15), and thus redundant description thereof will be omitted.
  • a tempo change information input section D 18 is connected to a beat generation section D 7.
  • When a tempo change instruction for the music is input from the user or an external device during reproduction of the music (at the i-th beat), the tempo change information input section D18 obtains, as shown in the following equation (4), a new tempo time Temp(i) by multiplying the original tempo time Temp(i) included in the music data stored in the music data storage unit D1 by the proportionality constant Cs'.
  • The new tempo time Temp(i) is output from the tempo change information input section D18 to the beat generation unit D7, and is used as the tempo time of the synchronization message output from the beat generation unit D7.
  • The tempo change information input section D18 obtains the original tempo time Temp(i) via the recording medium or the communication line at the same time that the music data to be played back by the music data storage unit D1 is obtained.
  • FIG. 39 shows the music performance playback processing in the AV synchronous playback device shown in FIG. 38.
  • As shown in FIG. 39, before the start of the music performance (step S61), the music data storage unit D1 and the moving image scenario data storage unit D8' acquire the music data and the moving image scenario data, respectively, via a recording medium or a communication line.
  • The AV synchronization instruction data generation unit D9' generates the AV synchronization instruction data based on the moving image scenario data from the moving image scenario data storage unit D8'.
  • The tempo change information input section D18 checks whether a tempo change instruction has been input.
  • When a tempo change instruction is input, the tempo change information input section D18 changes the tempo time of the synchronization message output from the beat generation unit D7 based on the input tempo change instruction (step S64).
  • the music data storage unit D1 sequentially outputs the stored music data to the output waveform generation unit D2.
  • The output waveform generation unit D2 generates a digital waveform of the performance sound based on the music data (step S65).
  • The sound data buffer D3 temporarily stores a fixed amount of waveform data (step S66). Thereafter, the waveform data is output from the sound data buffer D3 to the D/A converter D4.
  • The D/A converter D4 converts the input waveform data into an analog sound signal (step S67). Then, the D/A converter D4 outputs the sound signal to the amplifier D5, and the amplifier D5 amplifies the sound signal (step S68). Subsequently, the sound signal from the amplifier D5 is sounded as the performance by the speaker D6 (step S69).
  • Every time the waveform data is output to the D/A converter D4, the sound data buffer D3 informs the beat generation unit D7 of the timing (step S70).
  • Every time the music progresses by one quarter note, the beat generation unit D7 generates a synchronization message consisting of the music position at that time (the i-th beat) and the tempo time Temp(i), and outputs it to the display frame determination unit D10' (step S71).
  • The AV synchronous playback device checks whether the music has ended (step S72). If it is not the end of the music, the process returns to step S64. If it is the end of the music, the performance ends (step S73).
  • As described above, the AV synchronous playback device of this embodiment is provided with a tempo change information input section D18 that receives a music tempo change instruction and changes the tempo time of the synchronization message output from the beat generation unit D7.
  • The reproduction processing of the AV synchronous playback device of each of the above embodiments can be implemented as a computer program, and the AV synchronization method of the present invention can also be provided on a computer-readable recording medium.
  • the recording medium mentioned here is a floppy disk, a CD-ROM, a DVD (digital video disk), a magneto-optical disk, a removable hard disk, or the like.

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Theoretical Computer Science (AREA)
  • Data Mining & Analysis (AREA)
  • Databases & Information Systems (AREA)
  • Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Processing Or Creating Images (AREA)

Abstract

The invention concerns a graphic display method in which the number of transfers of graphic presentation data distributed by a server over a network is reduced, and the movement of a three-dimensional character displayed on a terminal is smoothed. The invention also concerns a synchronized audiovisual reproduction device in which music whose tempo is changed partway through and the image are reproduced in synchronization. In order to graphically display a three-dimensional character on the terminal under the control of the server, a plurality of motion patterns are prepared on the terminal side, scenario data is transmitted from the server to the terminal, and the terminal switches the scene according to the scenario data and performs the graphic display. The scene is switched at an initial position common to the motion pattern being displayed and the motion pattern to be displayed next, or at a position approximately common to the motion pattern being displayed and the motion pattern to be displayed next, so that the graphic display is performed.
PCT/JP1998/002175 1997-05-19 1998-05-15 Mode d'affichage graphique, procede de reproduction synchronisee, et dispositif de reproduction audiovisuelle synchronisee WO1998053443A1 (fr)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CA002250021A CA2250021C (fr) 1997-05-19 1998-05-15 Appareil d'affichage graphique, methode de reproduction synchrone et appareil de reproduction audiovisuelle
US09/142,813 US6331851B1 (en) 1997-05-19 1998-05-15 Graphic display apparatus, synchronous reproduction method, and AV synchronous reproduction apparatus

Applications Claiming Priority (12)

Application Number Priority Date Filing Date Title
JP9/127717 1997-05-19
JP12771797A JP3481077B2 (ja) 1997-05-19 1997-05-19 グラフィック表示方法と装置
JP9/131521 1997-05-22
JP9131521A JPH10320589A (ja) 1997-05-22 1997-05-22 3次元グラフィック表示装置
JP9/141927 1997-05-30
JP9141927A JPH10333673A (ja) 1997-05-30 1997-05-30 同期再生方法
JP9/167802 1997-06-25
JP9167802A JPH1116001A (ja) 1997-06-25 1997-06-25 3次元グラフィック表示装置
JP9290026A JPH11126066A (ja) 1997-10-22 1997-10-22 Av同期装置、及びav同期方法、並びにav同期プログラムを記録した記録媒体
JP9/290026 1997-10-22
JP10/5632 1998-01-14
JP563298A JP3475765B2 (ja) 1998-01-14 1998-01-14 グラフィック表示装置

Publications (1)

Publication Number Publication Date
WO1998053443A1 true WO1998053443A1 (fr) 1998-11-26

Family

ID=27547923

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP1998/002175 WO1998053443A1 (fr) 1997-05-19 1998-05-15 Mode d'affichage graphique, procede de reproduction synchronisee, et dispositif de reproduction audiovisuelle synchronisee

Country Status (2)

Country Link
CN (1) CN1152364C (fr)
WO (1) WO1998053443A1 (fr)

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN1311382C (zh) * 1999-03-09 2007-04-18 索尼公司 信息分配系统
US7339589B2 (en) 2002-10-24 2008-03-04 Sony Computer Entertainment America Inc. System and method for video choreography
US10786736B2 (en) 2010-05-11 2020-09-29 Sony Interactive Entertainment LLC Placement of user information in a game space

Families Citing this family (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2002093497A1 (fr) * 2001-05-14 2002-11-21 Netdimension Corporation Systeme de distribution d'informations et procede de distribution d'informations
CN1331359C (zh) * 2005-06-28 2007-08-08 清华大学 交互式多视点视频系统中视频流的传输方法
CN101523909B (zh) * 2006-08-11 2012-05-23 夏普株式会社 显示装置、数据提供装置、显示系统、显示系统的控制方法、控制程序以及记录介质
CN113491083A (zh) * 2019-02-26 2021-10-08 松下知识产权经营株式会社 摄像机影像传输重放系统、构成该系统的摄像机及查看器

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPS6426274A (en) * 1987-07-22 1989-01-27 Nec Corp System and device for controlling object operation
JPH05298422A (ja) * 1992-04-16 1993-11-12 Hitachi Ltd 多関節構造体の動作生成方法
JPH06274596A (ja) * 1993-03-19 1994-09-30 Internatl Business Mach Corp <Ibm> 仮想会議システム用端末装置及び仮想会議システム
JPH07325568A (ja) * 1994-06-01 1995-12-12 Casio Comput Co Ltd 映像出力機能付電子楽器
JPH0830807A (ja) * 1994-07-18 1996-02-02 Fuji Television:Kk 演奏連動型動画生成装置、音声連動型動画生成装置及びこれを利用したカラオケ装置
JPH08180208A (ja) * 1994-01-07 1996-07-12 Fujitsu Ltd 映像生成装置

Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPS6426274A (en) * 1987-07-22 1989-01-27 Nec Corp System and device for controlling object operation
JPH05298422A (ja) * 1992-04-16 1993-11-12 Hitachi Ltd 多関節構造体の動作生成方法
JPH06274596A (ja) * 1993-03-19 1994-09-30 Internatl Business Mach Corp <Ibm> 仮想会議システム用端末装置及び仮想会議システム
JPH08180208A (ja) * 1994-01-07 1996-07-12 Fujitsu Ltd 映像生成装置
JPH07325568A (ja) * 1994-06-01 1995-12-12 Casio Comput Co Ltd 映像出力機能付電子楽器
JPH0830807A (ja) * 1994-07-18 1996-02-02 Fuji Television:Kk 演奏連動型動画生成装置、音声連動型動画生成装置及びこれを利用したカラオケ装置

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
NIKKEI CG, No. 134, November 1997, (Tokyo), TOSHIYA NAKA, "Moving CG Character on the INTERNET Just as You Want (in Japanese)", p. 188-192. *

Cited By (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN1311382C (zh) * 1999-03-09 2007-04-18 索尼公司 信息分配系统
US7339589B2 (en) 2002-10-24 2008-03-04 Sony Computer Entertainment America Inc. System and method for video choreography
US7777746B2 (en) 2002-10-24 2010-08-17 Sony Computer Entertainment America Llc System and method for video choreography
US8184122B2 (en) 2002-10-24 2012-05-22 Sony Computer Entertainment America Llc System and method for video choreography
US9114320B2 (en) 2002-10-24 2015-08-25 Sony Computer Entertainment America Llc System and method for video choreography
US10786736B2 (en) 2010-05-11 2020-09-29 Sony Interactive Entertainment LLC Placement of user information in a game space
US11478706B2 (en) 2010-05-11 2022-10-25 Sony Interactive Entertainment LLC Placement of user information in a game space

Also Published As

Publication number Publication date
CN1225734A (zh) 1999-08-11
CN1152364C (zh) 2004-06-02

Similar Documents

Publication Publication Date Title
CA2250021C (fr) Appareil d&#39;affichage graphique, methode de reproduction synchrone et appareil de reproduction audiovisuelle
JP5554677B2 (ja) 映像コンテンツ生成システム、映像コンテンツ生成装置及びコンピュータプログラム
US6958751B2 (en) Image processing device, image processing method and recording medium
JP3601350B2 (ja) 演奏画像情報作成装置および再生装置
JP4746640B2 (ja) 動的イメージのモーション遷移処理方法とそのシステム、及び、そのプログラムを記録したコンピュータ読み取り可能な記録媒体
JP2010044484A (ja) 映像コンテンツ生成装置及びコンピュータプログラム
WO1998053443A1 (fr) Mode d&#39;affichage graphique, procede de reproduction synchronisee, et dispositif de reproduction audiovisuelle synchronisee
JP3895014B2 (ja) 映像再生装置およびカラオケ装置
JP2012198380A (ja) 表示制御装置
JP6248943B2 (ja) 情報処理装置、情報処理方法、およびプログラム、並びに情報処理システム
WO2021085105A1 (fr) Dispositif de traitement d&#39;informations, dispositif de proposition, procédé de traitement d&#39;informations et procédé de proposition
JP6110731B2 (ja) ジェスチャーによるコマンド入力識別システム
JPH11126066A (ja) Av同期装置、及びav同期方法、並びにav同期プログラムを記録した記録媒体
JP5939217B2 (ja) 情報処理装置、情報処理システム及びプログラム
JP5949688B2 (ja) 情報処理装置及びプログラム
JP5954288B2 (ja) 情報処理装置及びプログラム
JP5510822B2 (ja) 歌唱情報処理システム
JP5928361B2 (ja) 情報処理装置、情報処理方法及びプログラム
JP2003271162A (ja) カラオケ装置およびカラオケ装置を実現するためのプログラム
JP6048136B2 (ja) 情報処理装置、情報処理方法及びプログラム
JP6361430B2 (ja) 情報処理装置及びプログラム
JP5958389B2 (ja) 情報処理装置及びプログラム
JP5928257B2 (ja) 情報処理装置、情報処理方法及びプログラム
JP2011204113A (ja) 映像コンテンツ生成システム、メタデータ構築装置、映像コンテンツ生成装置、携帯端末、映像コンテンツ配信装置及びコンピュータプログラム
JPH10333673A (ja) 同期再生方法

Legal Events

Date Code Title Description
WWE Wipo information: entry into national phase

Ref document number: 98800587.5

Country of ref document: CN

ENP Entry into the national phase

Ref document number: 2250021

Country of ref document: CA

WWE Wipo information: entry into national phase

Ref document number: 09142813

Country of ref document: US

AK Designated states

Kind code of ref document: A1

Designated state(s): CA CN US

CFP Corrected version of a pamphlet front page
CR1 Correction of entry in section i

Free format text: PAT. BUL. 47/98 UNDER (51) REPLACE "G10H 1/00" BY "G06T 15/70, G10H 1/00"