WO2008098419A1 - System and method for producing and playing a real-time 3D game/movie based on role playing - Google Patents

System and method for producing and playing a real-time 3D game/movie based on role playing

Info

Publication number
WO2008098419A1
WO2008098419A1 · PCT/CN2007/000548
Authority
WO
WIPO (PCT)
Prior art keywords
virtual
character
user
movie
playing
Prior art date
Application number
PCT/CN2007/000548
Other languages
English (en)
Chinese (zh)
Inventor
Xizhi Li
Original Assignee
Xizhi Li
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Xizhi Li filed Critical Xizhi Li
Priority to PCT/CN2007/000548 priority Critical patent/WO2008098419A1/fr
Publication of WO2008098419A1 publication Critical patent/WO2008098419A1/fr

Links

Classifications

    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F 13/00 Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F 13/60 Generating or modifying game content before or while executing the game program, e.g. authoring tools specially adapted for game development or game-integrated level editor
    • A63F 13/63 Generating or modifying game content by the player, e.g. authoring using a level editor
    • A63F 2300/00 Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
    • A63F 2300/60 Methods for processing data by generating or executing the game program
    • A63F 2300/66 Methods for processing data by generating or executing the game program for rendering three dimensional images
    • A63F 2300/6661 Methods for processing data for rendering three dimensional images, for changing the position of the virtual camera
    • A63F 2300/6676 Methods for processing data for rendering three dimensional images, for changing the position of the virtual camera by dedicated player input
    • A63F 2300/80 Features of games specially adapted for executing a specific type of game
    • A63F 2300/807 Role playing or strategy games

Definitions

  • The present invention relates to a three-dimensional (3D) movie/game production and playback system and method, and more particularly to a role-playing-based real-time 3D movie/game system and method, especially suited to the production and playback of personal 3D movies/games.

Background art
  • Role playing means that the user, from a first- or third-person perspective, uses input devices such as a mouse and keyboard to operate a virtual character and perform that character's role in a 3D movie/game.
  • Role playing, as a method of operating virtual characters, is widely used in 3D computer games, especially in FPS (first-person shooter) and RPG (role-playing) games.
  • A conventional 3D movie is non-real-time: it is typically produced by long offline rendering that outputs a compressed still-image sequence file, which is then viewed by decompressing and playing back the still-image sequence.
  • The traditional 3D movie production process is as follows: separately create independent 3D scenes, models, characters, and actions; in wireframe mode, add the 3D models, characters, and animations to the scene from the camera's perspective; after long rendering, output an effect image or image sequence; based on the rendered result, return to wireframe mode, adjust the original scene, and output the effect image or image sequence again; repeat until a satisfactory result is obtained.
  • Well-known 3D movie production software such as Autodesk 3ds Max and Maya may be consulted for reference.
  • the production of traditional 3D games is generally divided into a production development environment (game editor) and a game running environment.
  • the production and development environment of the game is similar to the production environment of 3D movies.
  • The production process is as follows: create separate 3D scenes, models, characters, and actions; in the game editor, add the 3D models, characters, and animations to the scene from the camera's perspective; in the game editor, select objects in the scene and change their properties in the properties window; when editing is complete, switch to the game running environment to check the effect; repeat until a satisfactory game experience is obtained.
  • you can refer to the famous 3D game development software such as Unreal Engine.
  • The technical problem solved by the present invention is how to simplify the 3D movie/game production process so that even a person without 3D movie/game production experience can produce a good 3D movie/game.
  • the present invention provides a system for realizing the production and playback of a real-time 3D movie/game through virtual role playing, including:
  • a role playing system that allows the user to operate a virtual character from a first- or third-person perspective to perform that character's role in a 3D movie/game;
  • a virtual environment simulation system that automatically and in real time generates a common behavior of the virtual character according to the virtual physical environment in which the virtual character is located;
  • a camera system that allows the user to film shots in the virtual scene by acting as a director and operating the camera in first person;
  • a real-time movie playing system that concatenates a group of shots to play back the 3D movie in real time.
  • The system of the present invention further includes: a character management system that enables the user to operate and play different virtual characters at different times and/or locations to produce a multi-character storyline in a 3D movie/game; and a network synchronization system that allows users in a multi-person network environment to jointly play and record all virtual characters in the same 3D movie/game.
  • The role playing system, virtual environment simulation system, character behavior system, character management system, and camera system may be implemented with the ParaEngine distributed game engine; the scene editing system, character behavior recording and playback system, movie real-time playing system, and network synchronization system may be implemented with the NPL Neuron Parallel Computer Language. They can, of course, also be implemented in other ways.
  • the present invention also provides a method for realizing a real-time 3D movie/game by virtual role playing, which includes the following steps:
  • In step (1) of the method of the present invention, the normal behavior of the virtual character is generated automatically and in real time by the system according to the virtual physical environment in which the character is located; the user can insert pre-made character action sequences or intelligent behavior templates; the user can create a multi-character storyline in a 3D movie/game by operating and playing different virtual characters at different times and/or locations; the behavior of any virtual character played by the user can be independently recorded, edited, and played back in real time by the system in the virtual scene; the character behavior being recorded, edited, and played back can be superimposed on and synchronized with the behavior of other virtual characters in the virtual scene; users can change or create new virtual characters or modify the virtual scene according to the needs of the 3D movie/game, and new scene changes can affect the behavior of existing virtual characters.
  • The user controls the direction of the virtual camera or the virtual character's view with the mouse, and moves the virtual character through the virtual scene with the keyboard arrow keys; in the third-person character control mode, the virtual camera is locked behind the character.
  • In step (1) of the method of the present invention, the system automatically and in real time generates one or more common behaviors of the virtual character according to the virtual physical environment in which it is located: walking, running, swimming, jumping, flying, talking, going up and down stairs, crossing obstacles, sliding along the edges of impassable obstacles, turning to face other characters, driving vehicles, and the natural transitions between these behaviors.
  • While controlling the virtual character, the user may instruct it to perform a corresponding pre-made character action sequence by selecting the item from a text list or an illustrated list.
  • The character action sequences come from a motion capture device or another third-party motion production system;
  • The intelligent behavior templates are implemented by a computer program or a scripting-language module, and provide intelligent dialogue, random walk, follow, patrol, and chained action sequences.
  • In step (1) of the method of the present invention, the user switches to a selected virtual character and controls it by clicking that character in the field of view with the mouse and clicking the "enhance" button; alternatively, the user switches to and controls the virtual character closest to the one currently being played, using the mouse or keyboard.
  • In step (1) of the method of the present invention, the user records, edits, and plays back the behavior of the currently played virtual character through a conventional animation production interface with recording, pause, and play buttons and a timeline slider.
  • In step (1) of the method of the present invention, the action sequences of all characters generated by virtual role playing in the virtual scene share the same time axis and the same time starting point; when the user plays the current virtual character, other virtual characters already present in the virtual scene move in synchronization with it.
  • The user creates and modifies the virtual scene by using the currently played virtual character as a 3D locator, including the following:
  • each user uses one computer, and each person can only manipulate one character at a time; the virtual scene and the character's behavior are synchronized through a network connection;
  • the user's operating environment shares the same time starting point through the network connection;
  • The users work together through an instant text and voice communication system; one user acts as the director, controlling the movie's recording, pause, play, timeline adjustment, camera shooting, and so on, while the other users act as actors, each controlling one virtual character.
  • Compared with conventional 3D movie/game production technology, the present invention has the following advantages:
  • Figure 1 is a screenshot of a display screen in a preferred embodiment of the present invention.
  • FIG. 2 is a schematic view showing the overall structure of a system in a preferred embodiment of the present invention.
  • FIG. 3 is a schematic structural diagram of the character behavior system in a preferred embodiment of the present invention;
  • FIG. 4 shows the usage interfaces of the character behavior recording and playback system and of the movie real-time playing system in a preferred embodiment of the present invention;
  • FIG. 5 is a schematic diagram of the related interface of the scene editing system in a preferred embodiment of the present invention.

Detailed description
  • The role-playing real-time 3D movie/game system of the present invention includes the following parts: 1) a role playing system; 2) a virtual environment simulation system; 3) a character behavior system; 4) a character management system; 5) a character behavior recording and playback system; 6) a scene editing system; 7) a camera system; 8) a movie real-time playing system; 9) a network synchronization system. Each part is described separately below.
  • The role playing system of the present invention allows the user to perform a character's role in a 3D movie/game by operating a virtual character from a first- or third-person perspective.
  • the user controls the direction of the camera or virtual character's perspective through the mouse, and controls the movement of the virtual character in the virtual scene through the keyboard direction keys.
  • The camera is generally locked behind the character so that the user can see the scene in front of the character and operate it conveniently.
  • The normal behavior of the virtual character operated by the user is generated automatically and in real time by the system according to the virtual physical environment in which the character is located.
  • The common behaviors that a virtual character can generate automatically and in real time according to its virtual physical environment include: walking, running, swimming, jumping, flying, talking, going up and down stairs, crossing obstacles, sliding along the edges of impassable obstacles, turning to face other characters, driving vehicles, and the natural transitions between these behaviors.
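The automatic selection among these behaviors can be sketched as a simple decision over samples of the virtual environment. The following is an illustrative sketch only; the names `EnvironmentSample` and `choose_locomotion` are assumptions and not part of the patent or of ParaEngine's actual API.

```python
# Hypothetical sketch: pick a common behavior automatically from the
# character's virtual physical environment, as the text above describes.
from dataclasses import dataclass

@dataclass
class EnvironmentSample:
    ground_height: float   # terrain height under the character
    water_level: float     # sea/ocean surface height at the character
    obstacle_ahead: bool   # result of a forward collision probe

def choose_locomotion(char_height: float, env: EnvironmentSample,
                      wants_to_fly: bool = False) -> str:
    """Return a behavior name (walk, swim, fly, jump) chosen automatically."""
    if wants_to_fly:
        return "fly"
    if char_height < env.water_level - 0.5:   # mostly submerged: swim
        return "swim"
    if env.obstacle_ahead:
        return "jump"   # or slide along the obstacle edge
    return "walk"

print(choose_locomotion(1.0, EnvironmentSample(0.0, 5.0, False)))  # swim
```

A real engine would add transition blending between behaviors, which is what the text calls the "natural transitions" between them.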
  • the character behavior system in the present invention allows a user to insert a previously prepared character action sequence or an intelligent behavior template in the process of operating a virtual character.
  • The user can instruct the virtual character to perform a corresponding pre-made character action sequence or intelligent behavior template by selecting the item from a text list or an illustrated list.
  • The character action sequences come from a motion capture device or another third-party motion production system.
  • The intelligent behavior templates are implemented by a computer program or a scripting-language module, and provide intelligent dialogue, random walk, follow, patrol, and chained action sequences. Users can create and add richer intelligent behavior templates using the scripting language mentioned in the system of the present invention.
  • The intelligent behavior templates are mainly used to specify how the virtual character handles various sensing events.
  • The character management system of the present invention allows the user to operate and play different virtual characters at different times and/or locations to create a multi-character storyline in a 3D movie/game.
  • The user switches to a selected character and controls (i.e. plays) it by clicking the virtual character in the field of view with the mouse and clicking the "enhance" button.
  • The user can also switch to and control the character closest to the one the user is currently playing, by means of the mouse or keyboard.
  • the character management system should be able to efficiently simulate each character.
  • The character behavior recording and playback system of the invention allows the user to record, edit, and play back the behavior of the currently played virtual character through a conventional animation production interface with recording, pause, and play buttons and a timeline slider; while the user plays the current character, virtual characters already present in the virtual scene move in synchronization with it.
  • The action sequences of all characters produced by role playing in the scene share the same timeline and the same time starting point.
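The shared-timeline recording idea can be sketched with a minimal data model (the class `SceneRecorder` and its methods are assumptions for illustration, not the patent's implementation): every recorded track starts at t = 0, so playing back one character automatically stays in sync with tracks recorded earlier.

```python
# Minimal sketch of recording character behavior on one shared time axis.
class SceneRecorder:
    def __init__(self):
        self.tracks = {}   # character name -> [(time, action), ...]

    def record(self, character: str, time: float, action: str):
        """Append an action to the character's track; all tracks share t = 0."""
        self.tracks.setdefault(character, []).append((time, action))

    def actions_at(self, time: float):
        """Each character's most recent action at `time` (synchronized playback)."""
        state = {}
        for name, track in self.tracks.items():
            past = [a for t, a in track if t <= time]
            if past:
                state[name] = past[-1]
        return state

rec = SceneRecorder()
rec.record("hero", 0.0, "walk")     # recorded in a first take
rec.record("hero", 2.0, "jump")
rec.record("guard", 0.0, "patrol")  # recorded in a later take, same time origin
print(rec.actions_at(1.0))          # {'hero': 'walk', 'guard': 'patrol'}
```

Because both takes share the same time starting point, superimposed playback needs no offset bookkeeping.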
  • the scene editing system in the present invention allows the user to use the currently playing virtual character as a 3D locator to create and modify a virtual scene, including the following:
  • The camera system of the present invention allows the user to film shots in the virtual scene by acting as a director and operating the camera in first person.
  • The real-time movie playing system of the present invention concatenates a group of shots to play back 3D movies/games in real time.
  • Viewers of a 3D movie can watch it from any custom perspective, including viewing from a specified character's point of view, zooming in and out, rotating the camera, and so on.
  • The network synchronization system of the present invention allows users in a multi-person network environment to jointly play and record all virtual characters in the same 3D movie/game.
  • Each user uses one computer, and each person can operate only one character at a time; the virtual scene and the characters' behaviors are synchronized through the network connection; all users' operating environments share the same time starting point through the network connection; the users work together through an instant text and voice communication system, with one user usually acting as the director to control recording, pause, play, timeline adjustment, camera shooting, and so on, while the other users act as actors and each plays (controls) a virtual character.
  • the aforementioned system and method are implemented in a project called "Children's Game Animation Creation Platform".
  • Figure 1 shows a screenshot of the display of the "Children's Game Animation Creation Platform".
  • the "Children's Game Animation Creation Platform” is a computer software developed by the inventor.
  • The required computer operating environment is: Windows XP/2000/2003 Server, DirectX 9.0c, a 1 GHz or faster CPU, 512 MB or more of memory, and 64 MB or more of video memory; the development environment is: C++, Visual Studio .NET 2003/2005.
  • The "Children's Game Animation Creation Platform" is a 3D movie/game software product developed on top of the "ParaEngine Distributed Game Engine" software and the "NPL Neuron Parallel Computer Language" software.
  • On the "Children's Game Animation Creation Platform", children can use their imagination to create a beautiful and dynamic 3D virtual world and produce 3D movies/games in it.
  • the " Para Engi ne Distributed Game Engine” software is a 3D game engine developed by the inventor; the inventor submitted 22 computer software copyright registrations for the ParaEngine Distributed Game Engine software.
  • the "Pa ra Engi ne distributed game engine” software is the running platform and development platform of the Internet 3D application; it is compatible with the traditional online game development platform; trying to bring the interactive virtual world to the open platform of the Internet through the game technology on.
  • the "NPL Neuron Parallel Computer Language” software is a language technique developed by the inventor to describe, transfer, share and synchronize the content and logic of the 3D world in virtual reality; the inventors filed Three computer software copyright registrations of "NPL Neuron Parallel Computer Language".
  • FIG. 2 is a schematic diagram showing the overall structure of the system in a preferred embodiment of the present invention.
  • The virtual environment simulation system 15, role playing system 16, character behavior system 17, character management system 18, and camera system 19 described in the present invention are realized by the "ParaEngine Distributed Game Engine" software 14.
  • the network synchronization system 111 described in the present invention is implemented by the "NPL Neuron Parallel Computer Language” software 110.
  • the scene editing system 11, the character behavior recording and playing system 12, and the movie real-time playing system 13 described in the present invention are realized by the NPL computer language interface provided by the "NPL Neuron Parallel Computer Language” software 110.
  • The NPL computer language interface provided by the "NPL Neuron Parallel Computer Language" software 110 can interact with all subsystems (including 15, 16, 17, 18, and 19) of the "ParaEngine Distributed Game Engine" software 14.
  • Systems 11 and 15 exchange data with each other at runtime and trigger program response events in each other;
  • system 12 exchanges data with systems 15, 16, and 17 at runtime and triggers program response events with them;
  • systems 13 and 19 exchange data with each other at runtime and trigger program response events in each other.
  • The virtual environment simulation system 15 can be implemented with the default settings provided in system 14; system 15 only needs to simulate the terrain, sky, ocean, and static physical environment on the terrain within the virtual character's active range, using basic collision detection and response together with rigid-body physics.
  • The virtual environment simulation system 15 can also be implemented using existing physics simulation engine middleware such as Havok, ODE, and Ageia.
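The "basic collision detection and response" plus ground clamping mentioned above can be sketched in one dimension as follows. This is a toy illustration under stated assumptions: the function names, the obstacle representation as intervals, and the time step are invented for the example, not taken from ParaEngine or from any physics middleware.

```python
# Minimal sketch of environment simulation: move the character, refuse to
# enter static obstacles, and keep the character clamped to the terrain.
def step_character(pos, velocity, terrain_height, obstacles, dt=0.033):
    """One simulation step along the x axis.

    pos: (x, y) tuple; velocity: horizontal speed;
    terrain_height: function x -> ground height; obstacles: [(lo, hi), ...].
    """
    new_x = pos[0] + velocity * dt
    # Collision response: stop (slide) at the edge of any blocking interval.
    for lo, hi in obstacles:
        if lo <= new_x <= hi:
            new_x = pos[0]
            break
    return (new_x, terrain_height(new_x))   # ground clamp

flat = lambda x: 0.0                        # flat terrain for the example
print(step_character((0.0, 0.0), 1.0, flat, obstacles=[(5.0, 6.0)]))
```

A full engine would probe in 3D and resolve penetration vectors, but the control flow (integrate, test, respond, clamp) is the same.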
  • The character behavior system 17 can be implemented with the default settings provided in system 14; its principle is shown in FIG. 3.
  • the events 21, 22, 23, 24, 25, 26 are the trigger events of the character in the virtual environment.
  • The trigger events are defined as follows: event 21 is called when the character is first loaded; event 22 is called when the character first perceives its surrounding environment; event 23 is called when the character last perceives its surrounding environment; event 24 is called when the character is clicked by the user; event 25 is called when the character perceives other characters around it; event 26 is called every few frames between event 22 and event 23. Editing the handling of these events can make the virtual character perform a corresponding pre-made character action sequence or implement the functions of the described intelligent behavior templates, as shown in FIG. 3.
  • Editing event 21 can produce the action-sequence group 27; editing events 21 and 26 can produce the intelligent behavior template patrol 28; editing event 24 can produce the intelligent behavior template intelligent dialogue 29; editing event 25 can produce the intelligent behavior template follow 210; editing event 26 can produce the intelligent behavior template random walk 211.
  • the processing of events 21, 22, 23, 24, 25, 26 can be implemented by a computer program or the described "NPL Neuron Parallel Computer Language” software 110.
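The event model of FIG. 3 amounts to registering behavior templates as handlers on named character events. The sketch below is an assumption-laden paraphrase: the event names, the `Character` class, and the handler signatures are invented for illustration and are not the NPL API.

```python
# Hedged sketch of FIG. 3: behavior templates as event handlers.
import random

class Character:
    def __init__(self, name):
        self.name = name
        self.handlers = {}   # event name -> list of callbacks

    def on(self, event, handler):
        """Register a behavior template on a trigger event (cf. events 21-26)."""
        self.handlers.setdefault(event, []).append(handler)

    def fire(self, event, *args):
        """Invoke all templates bound to the event; collect their results."""
        return [h(self, *args) for h in self.handlers.get(event, [])]

def random_walk(char):
    """Template bound to the per-frame event (event 26): wander one step."""
    return (char.name, random.choice(["N", "S", "E", "W"]))

def follow(char, target):
    """Template bound to the perceive-character event (event 25)."""
    return (char.name, "follow", target)

guard = Character("guard")
guard.on("frame_tick", random_walk)
guard.on("perceive_character", follow)
print(guard.fire("perceive_character", "hero"))  # [('guard', 'follow', 'hero')]
```

Richer templates (patrol, intelligent dialogue) would be further handlers registered on the same events, which matches the text's claim that users can add templates via the scripting language.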
  • the user can use the "NPL Neuron Parallel Computer Language” software 110 to create and add a richer intelligent behavior template.
  • the character management system 18 can be implemented using default settings provided in the system 14.
  • the character management system 18 allows the user to create a multi-character storyline in a 3D movie/game by manipulating and playing different virtual characters at different times and/or locations.
  • The user switches to a selected character and controls (i.e. plays) it by clicking the virtual character in the field of view with the mouse and clicking the "enhance" button.
  • The user can also switch to and control the character closest to the one the user is currently playing, by mouse or keyboard.
  • The character management system 18 should be able to simulate every character efficiently; only virtual characters whose activity needs to be simulated, within the active range, are actually simulated.
  • The computational complexity of this algorithm is O(N), where N is the number of active virtual characters in the scene at a given moment (rather than the total number of virtual characters, because at any given moment a large number of characters in the scene are stationary or irrelevant to the current simulation).
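The O(N)-in-active-characters idea can be sketched as a per-frame filter. The activity test below (moving, or within a radius of the player) is an assumption chosen for illustration; the patent only states that inactive characters are excluded from simulation.

```python
# Sketch: simulate only active characters, so per-frame cost is O(N) in the
# number of active characters, not in the scene's total character count.
def simulate_frame(characters, player_pos, radius=50.0, dt=0.033):
    """Advance active characters one step; return how many were simulated."""
    active = [c for c in characters
              if c["moving"] or abs(c["pos"] - player_pos) <= radius]
    for c in active:
        c["pos"] += c.get("velocity", 0.0) * dt
    return len(active)

scene = [
    {"pos": 0.0,   "moving": True,  "velocity": 1.0},  # active: it is moving
    {"pos": 10.0,  "moving": False},                   # active: near the player
    {"pos": 500.0, "moving": False},                   # idle and far: skipped
]
print(simulate_frame(scene, player_pos=0.0))  # 2
```

With thousands of idle background characters, the loop body still runs only for the few active ones, which is the point of the complexity claim.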
  • FIG. 4 is a schematic diagram of the usage interface of the described character behavior recording and playback system 12.
  • System 12 allows the user to record, edit, and play back the behavior of the currently played virtual character through a conventional animation interface with recording 33, pause 31, and play 32 buttons and timeline slider 34; while the user plays the current character, virtual characters already present in the virtual scene move in synchronization with it.
  • The action sequences of all characters produced by role playing in the scene share the same timeline and the same time starting point.
  • FIG. 5 is a schematic diagram of the related interface of the scene editing system in a preferred embodiment of the present invention (four images on the left of the figure and one on the right). It lets the user employ the currently played virtual character as a 3D locator to create and modify the virtual scene, including the following: creating an instance of a pre-made 3D character or scene model at the feet of the currently played character (interface button 419); translating the selected object to the foot position of the currently played character (interface button 45); rotating 42, scaling 43, zooming 44, shifting 41, resetting 46, or deleting the selected instance of a 3D character or scene model; centered on the foot position of the currently played character and within a given radius (radius control panel 415), raising 412, lowering 410, leveling 411, smoothing 413, or sharpening 414 the ground surface, thereby creating undulating 3D terrain; and, within the range of the currently played virtual character's foot position …
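The "character as 3D locator" editing operations above can be sketched against a heightmap: each terrain edit acts within a radius of the character's foot position. The grid representation and the function name below are assumptions for illustration, not the engine's data model.

```python
# Sketch: raise the terrain within a radius of the played character's feet
# (cf. raise 412 and radius control panel 415 in FIG. 5).
def raise_terrain(heightmap, center, radius, amount):
    """Add `amount` to every heightmap cell within `radius` of `center`."""
    cx, cy = center
    for x in range(len(heightmap)):
        for y in range(len(heightmap[0])):
            if (x - cx) ** 2 + (y - cy) ** 2 <= radius ** 2:
                heightmap[x][y] += amount
    return heightmap

hm = [[0.0] * 5 for _ in range(5)]          # flat 5x5 terrain patch
raise_terrain(hm, center=(2, 2), radius=1, amount=2.0)
print(hm[2][2], hm[0][0])  # 2.0 0.0
```

Lowering, leveling, and smoothing are the same loop with a different per-cell update, which is why a single radius widget can drive all of them.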
  • the camera system 19 can be implemented using default settings provided in the system 14.
  • The camera system 19 allows the user to film shots in the virtual scene by acting as a director and operating the camera in first person.
  • System 19 can be implemented as follows: the Home key moves the lens forward; the End key moves the lens backward; the mouse wheel controls the lens movement speed; dragging the mouse left and right rotates the camera left and right; dragging the mouse up and down rotates the camera up and down; the W, S, A, and D keys pan the lens; the Q key lowers the lens and the E key raises it.
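The control scheme above is essentially a key-to-action table. The sketch below records those bindings and a trivial dispatcher; the action names, the inclusion of the W key (the source text is garbled at that point), and the dispatch code are assumptions for illustration.

```python
# Illustrative binding table for the first-person director camera (system 19).
CAMERA_BINDINGS = {
    "home": "dolly_in",     # Home key: lens moves forward
    "end": "dolly_out",     # End key: lens moves backward
    "wheel": "set_speed",   # mouse wheel: lens movement speed
    "drag_x": "yaw",        # drag left/right: rotate camera horizontally
    "drag_y": "pitch",      # drag up/down: rotate camera vertically
    "w": "forward", "s": "back", "a": "left", "d": "right",  # lens panning
    "q": "descend",         # Q key: lens falls
    "e": "ascend",          # E key: lens rises
}

def handle_input(key: str) -> str:
    """Map a raw input event to a camera action, ignoring unbound keys."""
    return CAMERA_BINDINGS.get(key, "ignore")

print(handle_input("q"), handle_input("home"))  # descend dolly_in
```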
  • The movie real-time playing system 13 concatenates a group of shots to play back 3D movies/games in real time.
  • A viewer of a 3D movie can watch it from any custom angle, including viewing from the perspective of a viewer-designated character, zooming in and out, rotating the camera, and so on.
  • Interface 38 in FIG. 4 is a schematic diagram of the related interface of the movie real-time playing system: button 35 plays the movie, button 36 pauses it, and timeline slider 37 adjusts the time axis of the currently playing movie segment. All virtual characters in the movie segment share the same time starting point, so in FIG. 4 timeline 37 has the same time starting point as timeline 34.
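Concatenating shots over the shared timeline can be sketched as follows; the `Shot` class and `active_camera` helper are invented names for illustration, not part of the described system.

```python
# Sketch: a movie is a list of shots (camera segments) over the shared
# timeline; real-time playback renders from whichever shot covers time t.
class Shot:
    def __init__(self, start, end, camera):
        self.start, self.end, self.camera = start, end, camera

def active_camera(shots, t):
    """Return the camera of the shot covering time t, or None between shots."""
    for s in shots:
        if s.start <= t < s.end:
            return s.camera
    return None

movie = [Shot(0.0, 4.0, "wide"), Shot(4.0, 7.0, "close_up")]
print(active_camera(movie, 5.0))  # close_up
```

Because the character tracks and the shot list share one time origin, the viewer can also substitute any custom camera for the shot's camera at the same t, which is how "watch from any perspective" falls out of the design.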
  • the network synchronization system 111 can be implemented by using the default settings provided in the system 110.
  • System 111 allows users in a multi-person network environment to jointly play and record all virtual characters in the same 3D movie/game.
  • Each user uses one computer, and each person can operate only one character at a time; the virtual scene and the characters' behaviors are synchronized through the network connection; all users' operating environments share the same time starting point through the network connection; the users work together through an instant text and voice communication system. Usually one user acts as the director to control the movie's recording, pause, play, timeline adjustment, and camera shooting, while the other users act as actors and each plays (controls) a virtual character.
  • The network synchronization system 111 can also be used to enable multiple users to work together;
  • The present invention treats the playing of virtual characters as the primary means of 3D movie/game production, instead of editing in the traditional free-camera mode, which makes 3D movie/game production immersive.
  • Multi-character storylines are produced through time-sharing, so that the development environment and the running environment are combined in the same well-rendered and well-simulated real-time virtual environment (such as the real-time shooting environment in FIG. 4), rather than being split into a semi-real-time editing environment and a non-real-time effect preview environment. This yields a new and streamlined 3D movie production process, especially suited to personal 3D movies/games, with which even people without 3D movie/game production experience can make good 3D movies/games.

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Processing Or Creating Images (AREA)

Abstract

The present invention relates to a system and method for producing and playing a real-time 3D game/movie based on virtual role playing. A user plays the virtual character from a first-person or third-person visual angle to perform the role in the 3D game/movie, then acts as the camera operator to film shots in the virtual scene from the user's own visual angle, and finally concatenates a group of shots to play the 3D game/movie in real time. The system comprises a role playing system, a virtual environment simulation system, and others.
PCT/CN2007/000548 2007-02-15 2007-02-15 System and method for producing and playing a real-time 3D game/movie based on role playing WO2008098419A1 (fr)

Priority Applications (1)

Application Number Priority Date Filing Date Title
PCT/CN2007/000548 WO2008098419A1 (fr) 2007-02-15 2007-02-15 System and method for producing and playing a real-time 3D game/movie based on role playing

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/CN2007/000548 WO2008098419A1 (fr) 2007-02-15 2007-02-15 Système et procédé de production et de jeu à un jeu/film 3d en temps réel basé sur le jeu de rôle

Publications (1)

Publication Number Publication Date
WO2008098419A1 true WO2008098419A1 (fr) 2008-08-21

Family

ID=39689615

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2007/000548 WO2008098419A1 (fr) 2007-02-15 2007-02-15 Système et procédé de production et de jeu à un jeu/film 3d en temps réel basé sur le jeu de rôle

Country Status (1)

Country Link
WO (1) WO2008098419A1 (fr)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114247143A (zh) * 2021-12-21 2022-03-29 北京蔚领时代科技有限公司 基于云服务器的数字人互动方法、装置、设备及存储介质

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN1194592A (zh) * 1996-06-05 1998-09-30 世雅企业股份有限公司 游戏用图象处理装置
JP2006263122A (ja) * 2005-03-24 2006-10-05 Sega Corp ゲーム装置、ゲームシステム、ゲームデータの処理方法及びこのゲームデータの処理方法ためのプログラム並びに記憶媒体


Similar Documents

Publication Publication Date Title
US9782678B2 (en) Methods and systems for computer video game streaming, highlight, and replay
US9381429B2 (en) Compositing multiple scene shots into a video game clip
CN101247481A (zh) 基于角色扮演的实时三维电影/游戏的制作及播放的系统和方法
US20090046097A1 (en) Method of making animated video
Stapleton et al. Applying mixed reality to entertainment
US9566517B2 (en) System and method for visualizing synthetic objects within real-world video clip
Manovich Image future
US20120021828A1 (en) Graphical user interface for modification of animation data using preset animation samples
US20120028707A1 (en) Game animations with multi-dimensional video game data
US11638871B2 (en) Method, system and apparatus of recording and playing back an experience in a virtual worlds system
US20110181601A1 (en) Capturing views and movements of actors performing within generated scenes
JP7320672B2 (ja) 人工知能(ai)制御のカメラパースペクティブジェネレータ及びaiブロードキャスタ
Naimark Realness and interactivity
Nitsche Machinima as media
WO2018106461A1 (fr) Procédés et systèmes de diffusion en continu, de mise en évidence et de relecture de jeu vidéo informatique
Moody An ‘amuse-bouche at best': 360 vr storytelling in full perspective
US20120021827A1 (en) Multi-dimensional video game world data recorder
Zhen et al. Physical World to Virtual Reality–Motion Capture Technology in Dance Creation
WO2008098419A1 (fr) Système et procédé de production et de jeu à un jeu/film 3d en temps réel basé sur le jeu de rôle
WO2022198971A1 (fr) Procédé et appareil de commutation d'action de personnage virtuel, et support de stockage
Wolf Video Games, cinema, Bazin, and the myth of simulated lived experience
Peng et al. Analysis of artistic language in the virtual reality design
Geigel et al. Adapting a virtual world for theatrical performance
US20240037877A1 (en) Augmented reality system for enhancing the experience of playing with toys
Iseli Double Trouble. Digital Avatars on Stage

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 07710969

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 07710969

Country of ref document: EP

Kind code of ref document: A1