WO2011142084A1 - Image processing apparatus, image processing method, and image processing program - Google Patents
Image processing apparatus, image processing method, and image processing program
- Publication number
- WO2011142084A1 (PCT/JP2011/002280)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- motion
- character
- virtual
- virtual sphere
- motion information
- Prior art date
Images
Classifications
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63F—CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
- A63F13/00—Video games, i.e. games using an electronically generated display having two or more dimensions
- A63F13/55—Controlling game characters or game objects based on the game progress
- A63F13/56—Computing the motion of game characters with respect to other game characters, game objects or elements of the game scene, e.g. for simulating the behaviour of a group of virtual soldiers or for path finding
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/20—Analysis of motion
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63F—CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
- A63F13/00—Video games, i.e. games using an electronically generated display having two or more dimensions
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63F—CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
- A63F13/00—Video games, i.e. games using an electronically generated display having two or more dimensions
- A63F13/40—Processing input control signals of video game devices, e.g. signals generated by the player or derived from the environment
- A63F13/42—Processing input control signals of video game devices, e.g. signals generated by the player or derived from the environment by mapping the input signals into game commands, e.g. mapping the displacement of a stylus on a touch screen to the steering angle of a virtual vehicle
- A63F13/422—Processing input control signals of video game devices, e.g. signals generated by the player or derived from the environment by mapping the input signals into game commands, e.g. mapping the displacement of a stylus on a touch screen to the steering angle of a virtual vehicle automatically for the purpose of assisting the player, e.g. automatic braking in a driving game
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T13/00—Animation
- G06T13/80—2D [Two Dimensional] animation, e.g. using sprites
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T15/00—3D [Three Dimensional] image rendering
- G06T15/50—Lighting effects
- G06T15/503—Blending, e.g. for anti-aliasing
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T17/00—Three dimensional [3D] modelling, e.g. data description of 3D objects
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63F—CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
- A63F2300/00—Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
- A63F2300/60—Methods for processing data by generating or executing the game program
- A63F2300/64—Methods for processing data by generating or executing the game program for computing dynamical parameters of game objects, e.g. motion determination or computation of frictional forces for a virtual car
- A63F2300/643—Methods for processing data by generating or executing the game program for computing dynamical parameters of game objects, e.g. motion determination or computation of frictional forces for a virtual car by determining the impact between objects, e.g. collision detection
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63F—CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
- A63F2300/00—Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
- A63F2300/60—Methods for processing data by generating or executing the game program
- A63F2300/66—Methods for processing data by generating or executing the game program for rendering three dimensional images
- A63F2300/6607—Methods for processing data by generating or executing the game program for rendering three dimensional images for animating game characters, e.g. skeleton kinematics
Definitions
- An image processing apparatus according to the present invention generates a motion of a character by controlling the action of the character in a virtual space. It comprises a sample motion information storage unit that stores sample motion information indicating the sample motions on which the character's motion is based, and a motion information storage unit that stores motion information indicating the character's action.
- Motion blending means generates motion information by mixing, at an arbitrary mixing ratio, a plurality of pieces of sample motion information stored in the sample motion information storage unit; motion information registration means registers the generated motion information in the motion information storage unit.
- Reachable position detection means detects the reachable position of a predetermined part of the character at the end of an action performed on the basis of the generated motion information, and mixing ratio association means associates coordinate information indicating the detected reachable position with the mixing ratio.
- Virtual sphere arrangement means arranges a plurality of spheres in the range of the virtual space in which the reachable positions exist, and virtual sphere association means associates each arranged sphere (virtual sphere) with a reachable position; virtual sphere information registration means registers information about the virtual spheres with which reachable positions have been associated in a virtual sphere information storage unit.
- Character control means controls the character's action in the virtual space on the basis of the stored motion information. It includes output means for outputting the virtual spheres corresponding to the character's state, contact determination means for determining whether any output virtual sphere is in contact with a contact-allowable object (an object the character can touch), and selection acceptance means for accepting selection of a virtual sphere determined to be in contact with the contact-allowable object.
- The character's action is then controlled on the basis of the motion information corresponding to the mixing ratio associated with the coordinate information of the reachable position associated with the selected virtual sphere.
- In this way, the motion of the character can be expressed appropriately while the processing load of the image processing is reduced.
- The character control means may further include: mixing ratio specification means for specifying the mixing ratio corresponding to the selected virtual sphere; mixing ratio correction means for correcting that mixing ratio, on the basis of the associated coordinate information and the position of the contact-allowable object in the virtual space, so that the corresponding motion information satisfies a predetermined condition; and motion information specification means for specifying the motion information corresponding to the corrected mixing ratio.
- In that case, the character control means controls the action of the character on the basis of the motion information specified by the motion information specification means.
- The image processing method of the present invention generates a motion of a character by controlling the action of the character in a virtual space. It includes: a motion blend process that generates motion information by mixing, at an arbitrary mixing ratio, a plurality of pieces of sample motion information stored in a sample motion information storage unit; a motion information registration process that registers the generated motion information in a motion information storage unit storing motion information indicating the character's action; a reachable position detection process that detects the reachable position of a predetermined part of the character at the end of an action performed on the basis of the generated motion information; a mixing ratio association process that associates coordinate information indicating the detected reachable position with the mixing ratio; and a virtual sphere arrangement process that arranges a plurality of spheres in the range of the virtual space in which the reachable positions exist.
- The virtual sphere information storage unit 12c is a storage medium that stores virtual sphere information: information about the virtual spheres used to select, from a plurality of pieces of motion information, the motion information suited to the character's state (for example, "stopped" or "moving"), the character's surrounding environment, an operation input from the user, and so on.
- the virtual sphere means a predetermined sphere (or a group of spheres) virtually arranged in the virtual space according to the character state.
- the virtual sphere and the virtual sphere information will be described in detail in the description regarding the virtual sphere generation process and the motion generation process (see FIGS. 3 and 7) described later.
- FIG. 2 is an explanatory diagram illustrating an example of a storage state of virtual sphere information in the virtual sphere information storage unit 12c.
- The virtual sphere information includes: a sphere number uniquely identifying the virtual sphere; the character associated with the virtual sphere; the type of motion associated with the virtual sphere; center coordinates indicating the center position of the virtual sphere relative to the character's position in the virtual space; the radius of the virtual sphere; the mixing ratio of the sample motions constituting the motion associated with the virtual sphere; and a contact flag.
- the environment information storage unit 12d is a storage medium that stores information about the virtual space where the character exists.
- the environment information storage unit 12d includes information (for example, ground geometry) indicating a contact-allowed object (for example, the ground) that is an object that the character can touch in the virtual space.
- the display unit 13 is a display device that displays a game screen according to a user operation in accordance with the control of the control unit 11.
- the display unit 13 is configured by a liquid crystal display device, for example.
- the sound output unit 14 outputs sound in accordance with the user's operation and the character's action according to the control of the control unit 11.
- the operation accepting unit 15 accepts an operation signal corresponding to a user operation from a controller composed of a plurality of buttons, a mouse, and the like, and notifies the control unit 11 of the result.
- the motion processing unit 16 has a function of mixing a plurality of sample motion information at an arbitrary mixing ratio, and a function for executing a mixing ratio correction process in a motion generation process described later (see FIG. 7).
- the motion processing unit 16 includes a motion blending unit 16a, a virtual sphere information generation unit 16b, a contact determination unit 16c, and a mixing ratio correction unit 16d.
- the motion blend unit 16a has a function of generating motion information by mixing a plurality of sample motions at an arbitrary mixing ratio, and registering the generated motion information in the motion information storage unit 12b.
- the sample motion mixing process uses a known motion blending method in which a plurality of motions are synchronized to interpolate the posture of the operation for each time frame, and thus detailed description thereof is omitted here.
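The "known motion blending method" referred to above (time-synchronizing a plurality of motions and interpolating the pose for each time frame) can be sketched roughly as follows. This is a minimal illustrative sketch, not the patent's implementation: it reduces the method to resampling both motions to a common frame count plus per-frame linear pose interpolation, and the array layout ([frames, joint_params]) is an assumption.

```python
import numpy as np

def blend_motions(motion_a, motion_b, ratio_a):
    """Mix two sample motions at an arbitrary mixing ratio.

    Each motion is an array of shape [frames, joint_params]. The motions
    are first time-synchronized (resampled to a common frame count), then
    each frame's pose is linearly interpolated at the given ratio.
    """
    a = np.asarray(motion_a, float)
    b = np.asarray(motion_b, float)
    frames = max(len(a), len(b))

    def resample(m):
        # Map both motions onto a shared normalized time axis [0, 1].
        src = np.linspace(0.0, 1.0, len(m))
        dst = np.linspace(0.0, 1.0, frames)
        return np.stack([np.interp(dst, src, m[:, j])
                         for j in range(m.shape[1])], axis=1)

    # ratio_a = 1.0 reproduces motion_a; ratio_a = 0.0 reproduces motion_b.
    return ratio_a * resample(a) + (1.0 - ratio_a) * resample(b)
```

In a real system the per-joint parameters would be rotations blended with quaternion interpolation rather than plain linear interpolation; the structure of the computation is the same.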
- the virtual sphere information generation unit 16b has a function of executing virtual sphere information generation processing for generating virtual sphere information.
- the virtual sphere information generation process will be described later in detail (see FIG. 3).
- The contact determination unit 16c has a function of determining whether a virtual sphere arranged in the virtual space is in contact with the boundary (for example, the ground geometry) between the region where the character can move and the region where it cannot.
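For a flat ground plane, the contact determination described above reduces to a sphere-plane distance test: a sphere touches the boundary when its center lies within one radius of the plane. A minimal sketch under that assumption (real ground geometry would require a mesh test); the function names and the (id, center, radius) record layout are illustrative, not from the patent:

```python
import numpy as np

def sphere_contacts_ground(center, radius, ground_point, ground_normal):
    """Return True if a virtual sphere touches or penetrates a ground plane.

    The plane is given by a point on it and its normal; this is a
    simplified stand-in for testing against full ground geometry.
    """
    n = np.asarray(ground_normal, float)
    n = n / np.linalg.norm(n)
    d = np.asarray(center, float) - np.asarray(ground_point, float)
    signed_dist = float(np.dot(d, n))       # distance of center above the plane
    return signed_dist <= radius

def contacting_spheres(spheres, ground_point, ground_normal):
    """Filter (sphere_id, center, radius) records, keeping those in contact
    with the ground plane (i.e. those whose 'contact flag' would be set)."""
    return [sid for sid, c, r in spheres
            if sphere_contacts_ground(c, r, ground_point, ground_normal)]
```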
- the mixing ratio correction unit 16d has a function of executing a mixing ratio correction process in a motion generation process described later (see FIG. 7).
- the mixing ratio correction process is a process of correcting the mixing ratio so that, for example, the toe of the character C is accurately grounded to the ground when the motion generated according to the mixing ratio is executed.
- Since the mixing ratio correction uses a known kriging-based method, detailed description thereof is omitted here.
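As a rough illustration of how a corrected mixing ratio could be interpolated from the precomputed samples: the sketch below substitutes simple inverse-distance weighting for the kriging method actually cited, since it has the same inputs and outputs (sample positions with known mixing ratios, plus a query position). This is an assumption for illustration only; all names are hypothetical.

```python
import numpy as np

def estimate_mixing_ratio(query_pos, sample_positions, sample_ratios, power=2.0):
    """Estimate the mixing ratio at an arbitrary reachable position.

    Inverse-distance weighting over precomputed (position, ratio) samples;
    a simplified stand-in for the kriging interpolation the text cites.
    """
    pts = np.asarray(sample_positions, float)
    ratios = np.asarray(sample_ratios, float)
    d = np.linalg.norm(pts - np.asarray(query_pos, float), axis=1)
    if np.any(d < 1e-12):              # query coincides with a sample point
        return ratios[int(np.argmin(d))]
    w = 1.0 / d ** power
    w /= w.sum()
    return w @ ratios                  # weighted average of the ratio vectors
```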
- The image processing apparatus 100 in this example executes processing that generates a motion for a 3DCG character moving on a slope or stairs in the virtual space by mixing, at an arbitrary ratio, the sample motion information stored in advance in the sample motion information storage unit 12a.
- FIG. 3 is an explanatory diagram illustrating an example of virtual sphere information generation processing executed by the image processing apparatus 100 as pre-computation.
- In the virtual sphere information generation process, virtual sphere information corresponding to the type of motion of the character C is generated.
- To generate the virtual sphere information, the image processing apparatus 100, as shown in FIG. 4, mixes a plurality of sample motions corresponding to the walking motion at various ratios (that is, mixing ratios) starting from the initial posture FC of the character C, and derives the reachable positions P1, P2, and P3 of the character C's foot at the respective mixing ratios (postures AC1, AC2, and AC3 of the character C in FIG. 4).
- FIG. 5 shows a plurality of reachable positions P (including P1, P2, and P3) derived according to the rules stored in the storage unit 12 in advance.
- the image processing apparatus 100 virtually arranges a plurality of spheres in a range where each reachable position P of the foot of the character C exists, and associates a mixing ratio with each of the arranged spheres (virtual spheres).
- This allows the mixing ratio that places the toe at a given position to be looked up in constant time.
- The control unit 11 causes the motion blend unit 16a to mix sample motions according to rules stored in advance in the storage unit 12 (for example, "mix the walking sample motion with the largest stride and the one with the smallest stride at ratios of 10:0, 9:1, 8:2, 7:3, ..."), generating a plurality of pieces of motion information (step S101).
- Alternatively, the control unit 11 may select the motion information used for the virtual sphere information generation process from motion information stored in advance.
- The control unit 11 may also cause the motion blend unit 16a to accept the user's selection of a plurality of sample motions.
- In that case, the control unit 11 may be configured to assist the selection, for example by displaying on the screen the sample motions that can be blended, based on a priority set in advance for each sample motion.
- the control unit 11 associates information (coordinate information) indicating the reachable position P with respect to the position of the initial posture FC of the character C with the mixing ratio by the virtual sphere information generation unit 16b.
- a plurality of virtual spheres are arranged in a range 101 (see FIG. 6) that is stored in the storage unit 12 and in which the reachable position P exists (step S103).
- the virtual sphere information generation unit 16b uniformly arranges virtual spheres having a predetermined radius in a virtual space where the character's feet can reach.
- the product of the radius of the virtual sphere and the total number of virtual spheres is constant.
- Since the amount of data generated by the pre-calculation increases linearly with the total number of virtual spheres, the total number must be reduced by increasing the sphere radius.
- the virtual sphere B in which the reachable position P is not located inside is a virtual sphere that is not associated with a mixing ratio and is not selected in a motion generation process (see FIG. 7) described later.
- When a plurality of reachable positions P fall inside one virtual sphere, the virtual sphere information generation unit 16b may calculate and store the average of the mixing ratios associated with those reachable positions.
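The precomputation described above, uniformly arranging spheres over the range where reachable positions exist and averaging the mixing ratios of the reachable positions falling inside each sphere, might look like the following 2D sketch. The grid spacing of twice the radius and the dictionary layout are illustrative assumptions, not details from the patent:

```python
import numpy as np

def build_virtual_spheres(reachable, grid_min, grid_max, radius):
    """Uniformly arrange virtual spheres over the reachable range.

    `reachable` is a list of (position, mixing_ratio) pairs. Each sphere
    stores the average mixing ratio of the reachable positions inside it;
    spheres containing no reachable position get ratio=None and would
    never be selected during motion generation.
    """
    step = 2.0 * radius                 # touching, non-overlapping grid
    centers = [np.array([x, y])
               for x in np.arange(grid_min[0], grid_max[0] + 1e-9, step)
               for y in np.arange(grid_min[1], grid_max[1] + 1e-9, step)]
    spheres = []
    for i, c in enumerate(centers):
        inside = [np.asarray(r, float) for p, r in reachable
                  if np.linalg.norm(np.asarray(p, float) - c) <= radius]
        ratio = np.mean(inside, axis=0) if inside else None
        spheres.append({"number": i, "center": c,
                        "radius": radius, "ratio": ratio})
    return spheres
```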
- the motion generation process executed by the image processing apparatus 100 will be described.
- the motion generation process will also be described by taking the walking motion of the character C as an example.
- the image processing apparatus 100 arranges a plurality of virtual spheres B in which the mixing ratio is associated according to the position of the character C in the virtual space, as shown in FIG.
- Here, a case in which virtual spheres B associated with mixing ratios for the walking motion are arranged is described as an example.
- The arranged virtual spheres are not limited to this; any virtual sphere group associated in the pre-calculation with mixing ratios corresponding to various motions may be used.
- That is, motion information is generated by mixing a plurality of pieces of sample motion information stored in the sample motion information storage unit 12a at a mixing ratio; the generated motion information is registered in the motion information storage unit 12b; the reachable position P of a predetermined part (for example, a foot or hand) of the character C at the end of an action based on the generated motion information is detected; coordinate information indicating the detected reachable position P is associated with the mixing ratio; and a plurality of spheres are arranged in the range of the virtual space where the reachable positions P exist.
- Each arranged sphere (virtual sphere B) is associated with a reachable position P, and virtual sphere information, information about the virtual spheres B with which reachable positions P have been associated, is stored in the virtual sphere information storage unit 12c.
- A sample motion not included in the information may also be identified, and the mixing ratio corrected to include the ratio of the identified sample motion so that the motion information satisfies a predetermined condition (for example, that the foot of the character C is accurately grounded on the ground).
- the image processing apparatus 100 calculates a plurality of reachable positions P according to the rules stored in advance in the storage unit 12, associates coordinate information with the calculated reachable positions P, and then A case has been described in which a plurality of virtual spheres B are uniformly arranged, and a mixing ratio is associated with each virtual sphere B based on the positional relationship between the arranged virtual spheres B and the reachable positions P.
- However, the embodiment is not limited to this.
- The control unit 11 may cause the virtual sphere information generation unit 16b to calculate, with the toe position of the initial posture FC of the character C (see FIG. 4) as the origin, the posture of the character C at each mixing ratio (for example, posture AC1 of the character C in FIG. 4).
- Alternatively, the image processing apparatus 100 may associate mixing ratios with a plurality of virtual spheres by blending a plurality of sample motions at various mixing ratios, and may select and arrange, in the motion generation process, the virtual sphere group closest to a uniform arrangement.
- the meaning of the word “character” in the present application is not limited to a character as an object appearing in a video game.
- It includes various objects that appear in the simulation of physical phenomena or in the construction of virtual spaces. That is, the image processing apparatus (for example, the image processing apparatus 100) can be used not only for realizing video games but throughout the technical field of computer graphics.
- The processing load on the image processing apparatus can be reduced compared with realizing the same level of image processing by conventional methods.
- The blend rate is automatically calculated from a small number of explicit control parameters such as moving direction, speed, and ground inclination angle. Since this method requires little calculation, a walking animation can be generated interactively.
- Interpolation-based algorithms also do not necessarily produce high-quality animation. For example, because interpolation involves error, the motion must be corrected by post-processing such as inverse kinematics, and in most cases the generation calculation does not take dynamics into account. Blending-based techniques thus obtain many desirable properties in exchange for some loss of animation quality.
- Since the conventional technique requires redundant post-processing, there is room for improvement in calculation efficiency and robustness.
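The inverse-kinematics post-correction mentioned above is, for a leg or arm, typically an analytic two-link solve: given the limb segment lengths and a target toe position, compute the hip and knee angles that reach it. A self-contained 2D sketch under that assumption; the joint naming and angle conventions are illustrative, not from the patent:

```python
import numpy as np

def two_link_ik(l1, l2, target):
    """Analytic two-link IK in 2D.

    Returns (hip, knee) angles such that a chain of segment lengths l1, l2
    rooted at the origin places its end effector (toe) at `target`.
    Assumes the target is within reach (|l1 - l2| <= |target| <= l1 + l2).
    """
    x, y = target
    d2 = x * x + y * y
    # Law of cosines for the knee angle.
    cos_knee = (d2 - l1 * l1 - l2 * l2) / (2.0 * l1 * l2)
    knee = np.arccos(np.clip(cos_knee, -1.0, 1.0))
    # Hip angle: direction to target minus the interior offset angle.
    hip = np.arctan2(y, x) - np.arctan2(l2 * np.sin(knee),
                                        l1 + l2 * np.cos(knee))
    return hip, knee

def forward(l1, l2, hip, knee):
    """Forward kinematics, for checking the IK solution."""
    x = l1 * np.cos(hip) + l2 * np.cos(hip + knee)
    y = l1 * np.sin(hip) + l2 * np.sin(hip + knee)
    return x, y
```

In the pipeline described here, such a solve would nudge the blended pose so that the toe lands exactly on the ground contact point.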
Abstract
Description
11 control unit
12 storage unit
13 display unit
14 audio output unit
15 operation acceptance unit
16 motion processing unit
100 image processing apparatus
Claims (5)
- An image processing apparatus that generates a motion of a character by controlling the action of the character in a virtual space, comprising:
a sample motion information storage unit that stores sample motion information indicating sample motions on which the motion of the character is based; and
a motion information storage unit that stores motion information indicating the action of the character;
motion blending means for generating motion information by mixing, at an arbitrary mixing ratio, a plurality of pieces of sample motion information stored in the sample motion information storage unit;
motion information registration means for registering the motion information generated by the motion blending means in the motion information storage unit;
reachable position detection means for detecting a reachable position of a predetermined part of the character at the end of an action when the character is operated on the basis of the motion information generated by the motion blending means;
mixing ratio association means for associating coordinate information indicating the reachable position detected by the reachable position detection means with the mixing ratio;
virtual sphere arrangement means for arranging a plurality of spheres in a range of the virtual space in which the reachable positions exist;
virtual sphere association means for associating each virtual sphere, being a sphere arranged by the virtual sphere arrangement means, with a reachable position;
virtual sphere information registration means for registering virtual sphere information, being information about the virtual spheres with which reachable positions have been associated by the virtual sphere association means, in a virtual sphere information storage unit that stores virtual sphere information; and
character control means for controlling the action of the character in the virtual space on the basis of the motion information stored in the motion information storage unit,
wherein the character control means includes:
output means for outputting, on the basis of the virtual sphere information stored in the virtual sphere information storage unit, the virtual spheres corresponding to the state of the character in the virtual space;
contact determination means for determining whether, among the virtual spheres output by the output means, there is a virtual sphere in contact with a contact-allowable object, being an object that the character can touch; and
selection acceptance means for accepting selection of a virtual sphere determined by the contact determination means to be in contact with the contact-allowable object,
and controls the action of the character on the basis of the motion information corresponding to the mixing ratio associated with the coordinate information indicating the reachable position associated with the virtual sphere whose selection was accepted by the selection acceptance means,
an image processing apparatus characterized by the above. - The character control means comprises:
mixing ratio specification means for specifying the mixing ratio corresponding to the virtual sphere whose selection was accepted by the selection acceptance means;
mixing ratio correction means for correcting the mixing ratio, on the basis of the coordinate information associated with the mixing ratio specified by the mixing ratio specification means and the position of the contact-allowable object in the virtual space, so that the motion information corresponding to the mixing ratio satisfies a predetermined condition; and
motion information specification means for specifying the motion information corresponding to the mixing ratio corrected by the mixing ratio correction means,
wherein the character control means controls the action of the character on the basis of the motion information specified by the motion information specification means.
The image processing apparatus according to claim 1. - The character control means further comprises:
virtual distance specification means for specifying a virtual distance from the reachable position indicated by the coordinate information associated with the mixing ratio specified by the mixing ratio specification means to the position of the contact-allowable object in the virtual space; and
correction amount specification means for specifying, on the basis of the virtual distance specified by the virtual distance specification means, a correction amount of the mixing ratio required to make the reachable position coincide with the position of the contact-allowable object,
wherein the mixing ratio correction means corrects the mixing ratio in accordance with the correction amount specified by the correction amount specification means.
The image processing apparatus according to claim 2. - An image processing method for generating a motion of a character by controlling the action of the character in a virtual space, the method comprising:
a motion blend process of generating motion information by mixing, at an arbitrary mixing ratio, a plurality of pieces of sample motion information stored in a sample motion information storage unit that stores sample motion information indicating sample motions on which the motion of the character is based;
a motion information registration process of registering the motion information generated in the motion blend process in a motion information storage unit that stores motion information indicating the action of the character; a reachable position detection process of detecting a reachable position of a predetermined part of the character at the end of an action when the character is operated on the basis of the motion information generated in the motion blend process;
a mixing ratio association process of associating coordinate information indicating the reachable position detected in the reachable position detection process with the mixing ratio;
a virtual sphere arrangement process of arranging a plurality of spheres in a range of the virtual space in which the reachable positions exist;
a virtual sphere association process of associating each virtual sphere, being a sphere arranged in the virtual sphere arrangement process, with a reachable position;
a virtual sphere information registration process of registering virtual sphere information, being information about the virtual spheres with which reachable positions have been associated in the virtual sphere association process, in a virtual sphere information storage unit that stores virtual sphere information; and
a character control process of controlling the action of the character in the virtual space on the basis of the motion information stored in the motion information storage unit,
wherein the character control process includes:
an output process of outputting, on the basis of the virtual sphere information stored in the virtual sphere information storage unit, the virtual spheres corresponding to the state of the character in the virtual space;
a contact determination process of determining whether, among the virtual spheres output in the output process, there is a virtual sphere in contact with a contact-allowable object, being an object that the character can touch; and
a selection acceptance process of accepting selection of a virtual sphere determined in the contact determination process to be in contact with the contact-allowable object,
and controls the action of the character on the basis of the motion information corresponding to the mixing ratio associated with the coordinate information indicating the reachable position associated with the virtual sphere whose selection was accepted in the selection acceptance process,
an image processing method characterized by the above. - An image processing program for generating a motion of a character by controlling the action of the character in a virtual space,
the program causing a computer to execute:
a motion blend process of generating motion information by mixing, at an arbitrary mixing ratio, a plurality of pieces of sample motion information stored in a sample motion information storage unit that stores sample motion information indicating sample motions on which the motion of the character is based;
a motion information registration process of registering the motion information generated in the motion blend process in a motion information storage unit that stores motion information indicating the action of the character; a reachable position detection process of detecting a reachable position of a predetermined part of the character at the end of an action when the character is operated on the basis of the motion information generated in the motion blend process;
a mixing ratio association process of associating coordinate information indicating the reachable position detected in the reachable position detection process with the mixing ratio;
a virtual sphere arrangement process of arranging a plurality of spheres in a range of the virtual space in which the reachable positions exist;
a virtual sphere association process of associating each virtual sphere, being a sphere arranged in the virtual sphere arrangement process, with a reachable position;
a virtual sphere information registration process of registering virtual sphere information, being information about the virtual spheres with which reachable positions have been associated in the virtual sphere association process, in a virtual sphere information storage unit that stores virtual sphere information; and
a character control process of controlling the action of the character in the virtual space on the basis of the motion information stored in the motion information storage unit,
and further causing the computer to execute, in the character control process:
an output process of outputting, on the basis of the virtual sphere information stored in the virtual sphere information storage unit, the virtual spheres corresponding to the state of the character in the virtual space;
a contact determination process of determining whether, among the virtual spheres output in the output process, there is a virtual sphere in contact with a contact-allowable object, being an object that the character can touch; and
a selection acceptance process of accepting selection of a virtual sphere determined in the contact determination process to be in contact with the contact-allowable object,
and a process of controlling the action of the character on the basis of the motion information corresponding to the mixing ratio associated with the coordinate information indicating the reachable position associated with the virtual sphere whose selection was accepted in the selection acceptance process,
an image processing program for causing the execution of the above.
Priority Applications (6)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
EP11731621.6A EP2544151B1 (en) | 2010-05-10 | 2011-04-19 | Image processing device, image processing method, and image processing program |
CA2798954A CA2798954C (en) | 2010-05-10 | 2011-04-19 | Image processing apparatus, image processing method, and image processing program |
JP2012514695A JP5303068B2 (ja) | 2010-05-10 | 2011-04-19 | 画像処理装置、画像処理方法、および画像処理プログラム |
CN201180002332.8A CN102934145B (zh) | 2010-05-10 | 2011-04-19 | 图像处理装置及图像处理方法 |
KR1020117015422A KR101247930B1 (ko) | 2010-05-10 | 2011-04-19 | 화상처리장치, 화상처리방법 및 화상처리 프로그램 |
US13/145,648 US8432401B2 (en) | 2010-05-10 | 2011-04-19 | Image processing apparatus, image processing method, and image processing program |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2010-108844 | 2010-05-10 | ||
JP2010108844 | 2010-05-10 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2011142084A1 true WO2011142084A1 (ja) | 2011-11-17 |
Family
ID=44914145
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2011/002280 WO2011142084A1 (ja) | 2010-05-10 | 2011-04-19 | 画像処理装置、画像処理方法、および画像処理プログラム |
Country Status (8)
Country | Link |
---|---|
US (1) | US8432401B2 (ja) |
EP (1) | EP2544151B1 (ja) |
JP (1) | JP5303068B2 (ja) |
KR (1) | KR101247930B1 (ja) |
CN (1) | CN102934145B (ja) |
CA (1) | CA2798954C (ja) |
TW (1) | TWI448982B (ja) |
WO (1) | WO2011142084A1 (ja) |
Cited By (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2017046951A (ja) * | 2015-09-02 | 2017-03-09 | 株式会社コーエーテクモゲームス | 情報処理装置、動作制御方法及び動作制御プログラム |
JP2022503776A (ja) * | 2018-09-21 | 2022-01-12 | ピナンブラ、インク | 視覚ディスプレイの補完的なデータを生成するためのシステム及び方法 |
Families Citing this family (10)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP5118220B2 (ja) | 2011-02-16 | 2013-01-16 | 株式会社スクウェア・エニックス | 動作モデリング装置及び方法、並びにプログラム |
JP5190524B2 (ja) | 2011-02-16 | 2013-04-24 | 株式会社スクウェア・エニックス | オブジェクト動作装置及び方法、並びにプログラム |
US9786097B2 (en) * | 2012-06-22 | 2017-10-10 | Matterport, Inc. | Multi-modal method for interacting with 3D models |
US10163261B2 (en) | 2014-03-19 | 2018-12-25 | Matterport, Inc. | Selecting two-dimensional imagery data for display within a three-dimensional model |
JP5734361B2 (ja) | 2012-08-01 | 2015-06-17 | Square Enix Co., Ltd. | Object display device |
JP5480347B2 (ja) | 2012-08-31 | 2014-04-23 | Square Enix Co., Ltd. | Video game processing apparatus and video game processing program |
JP6110704B2 (ja) * | 2013-03-29 | 2017-04-05 | Nintendo Co., Ltd. | Program, information processing apparatus, information processing method, and information processing system |
JP6576544B2 (ja) * | 2015-08-20 | 2019-09-18 | Square Enix Co., Ltd. | Information processing apparatus, information processing method, and computer-readable storage medium |
JP6457603B1 (ja) * | 2017-09-20 | 2019-01-23 | Square Enix Co., Ltd. | Image processing program, image processing apparatus, and image processing method |
CN111291674B (zh) * | 2020-02-04 | 2023-07-14 | Tsinghua Pearl River Delta Research Institute | Method, system, apparatus, and medium for extracting facial expressions and motions of a virtual character |
Family Cites Families (13)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US6191798B1 (en) | 1997-03-31 | 2001-02-20 | Katrix, Inc. | Limb coordination system for interactive computer animation of articulated characters |
US6559845B1 (en) * | 1999-06-11 | 2003-05-06 | Pulse Entertainment | Three dimensional animation system and method |
JP3618298B2 (ja) | 2000-01-28 | 2005-02-09 | Square Enix Co., Ltd. | Motion display method, game apparatus, and recording medium |
EP1365359A1 (en) * | 2002-05-24 | 2003-11-26 | BRITISH TELECOMMUNICATIONS public limited company | Image processing method and system |
JP3974136B2 (ja) * | 2005-01-25 | 2007-09-12 | Konami Digital Entertainment Co., Ltd. | Program, light placement method, and image generation apparatus |
JP4445449B2 (ja) | 2005-10-04 | 2010-04-07 | Square Enix Co., Ltd. | Image generation apparatus |
JP2007143684A (ja) * | 2005-11-25 | 2007-06-14 | Konami Digital Entertainment Co., Ltd. | Game device, game image processing method, and program |
US8144148B2 (en) * | 2007-02-08 | 2012-03-27 | Edge 3 Technologies Llc | Method and system for vision-based interaction in a virtual environment |
JP4519883B2 (ja) | 2007-06-01 | 2010-08-04 | Konami Digital Entertainment Co., Ltd. | Character display device, character display method, and program |
JP4339908B2 (ja) * | 2007-11-28 | 2009-10-07 | Konami Digital Entertainment Co., Ltd. | Game device, image generation method, and program |
US8363057B2 (en) | 2008-05-28 | 2013-01-29 | Autodesk, Inc. | Real-time goal-directed performed motion alignment for computer animated characters |
US8942428B2 (en) * | 2009-05-01 | 2015-01-27 | Microsoft Corporation | Isolate extraneous motions |
US20120049450A1 (en) * | 2010-08-27 | 2012-03-01 | Mosen Agamawi | Cube puzzle game |
2011
- 2011-04-19 US US13/145,648 patent/US8432401B2/en active Active
- 2011-04-19 EP EP11731621.6A patent/EP2544151B1/en active Active
- 2011-04-19 WO PCT/JP2011/002280 patent/WO2011142084A1/ja active Application Filing
- 2011-04-19 CN CN201180002332.8A patent/CN102934145B/zh active Active
- 2011-04-19 CA CA2798954A patent/CA2798954C/en active Active
- 2011-04-19 JP JP2012514695A patent/JP5303068B2/ja active Active
- 2011-04-19 KR KR1020117015422A patent/KR101247930B1/ko active IP Right Grant
- 2011-04-21 TW TW100113876A patent/TWI448982B/zh active
Patent Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH1021419A (ja) * | 1996-07-05 | 1998-01-23 | Hudson Soft Co Ltd | Virtual articulated moving object and method for deforming the same |
JP2003062326A (ja) | 2001-08-23 | 2003-03-04 | Namco Ltd | Image generation system, program, and information storage medium |
JP2003067773A (ja) | 2001-08-23 | 2003-03-07 | Namco Ltd | Image generation system, program, and information storage medium |
JP2007102503A (ja) * | 2005-10-04 | 2007-04-19 | Square Enix Co Ltd | Image generation apparatus and method, program, and recording medium |
Non-Patent Citations (1)
Title |
---|
See also references of EP2544151A4 |
Cited By (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2017046951A (ja) * | 2015-09-02 | 2017-03-09 | Koei Tecmo Games Co., Ltd. | Information processing apparatus, motion control method, and motion control program |
JP2022503776A (ja) * | 2018-09-21 | 2022-01-12 | Penumbra, Inc. | System and method for generating complementary data for a visual display |
US11586276B2 (en) | 2018-09-21 | 2023-02-21 | Penumbra, Inc. | Systems and methods for generating complementary data for visual display |
Also Published As
Publication number | Publication date |
---|---|
US20110316860A1 (en) | 2011-12-29 |
EP2544151A1 (en) | 2013-01-09 |
CA2798954C (en) | 2016-05-24 |
JPWO2011142084A1 (ja) | 2013-07-22 |
KR101247930B1 (ko) | 2013-03-26 |
KR20110128794A (ko) | 2011-11-30 |
TW201207766A (en) | 2012-02-16 |
EP2544151A4 (en) | 2013-03-13 |
US8432401B2 (en) | 2013-04-30 |
CN102934145A (zh) | 2013-02-13 |
JP5303068B2 (ja) | 2013-10-02 |
CN102934145B (zh) | 2015-08-05 |
CA2798954A1 (en) | 2011-11-17 |
EP2544151B1 (en) | 2014-05-21 |
TWI448982B (zh) | 2014-08-11 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
JP5303068B2 (ja) | Image processing apparatus, image processing method, and image processing program | |
US9050538B2 (en) | Collision detection and motion simulation in game virtual space | |
CN102221975B (zh) | Item navigation using motion capture data | |
Zhao et al. | Achieving good connectivity in motion graphs | |
CN107847799A (zh) | Terminal device | |
JP2011508290A (ja) | Motion animation method and apparatus | |
US8909506B2 (en) | Program, information storage medium, information processing system, and information processing method for controlling a movement of an object placed in a virtual space | |
CN109758760B (zh) | Shot control method and apparatus in a soccer game, computer device, and storage medium | |
CN111135556A (zh) | Virtual camera control method and apparatus, electronic device, and storage medium | |
CN107213636A (zh) | Camera movement method and apparatus, storage medium, and processor | |
WO2010016189A1 (ja) | Game program and game device | |
US9652879B2 (en) | Animation of a virtual object | |
Beacco et al. | Footstep parameterized motion blending using barycentric coordinates | |
JP5088972B2 (ja) | Image processing apparatus, image processing method, computer program, recording medium, and semiconductor device | |
CN112090076B (zh) | Game character motion control method, apparatus, device, and medium | |
JP5089147B2 (ja) | Camera control method and CG game device incorporating the camera control | |
JP2018128851A (ja) | Program, object placement system, and object placement method | |
Laszlo et al. | Predictive feedback for interactive control of physics-based characters | |
JP2009251887A (ja) | Image generation system, program, and information storage medium | |
JP2005122558A (ja) | Image display device, image display method, image display program, and computer-readable recording medium storing the image display program | |
US11138783B2 (en) | Image processing apparatus, image processing method, and image processing program for aligning a polygon model with a texture model | |
JP5378027B2 (ja) | Game program, storage medium, and computer device | |
JP5957278B2 (ja) | Image generation apparatus, image generation method, program, and computer-readable storage medium | |
CN117101138A (zh) | Virtual character control method and apparatus, electronic device, and storage medium | |
JP5328841B2 (ja) | Program, information storage medium, information processing system, and information processing method | |
Legal Events
Code | Title | Description |
---|---|---|
WWE | WIPO information: entry into national phase | Ref document number: 201180002332.8; Country of ref document: CN |
ENP | Entry into the national phase | Ref document number: 20117015422; Country of ref document: KR; Kind code of ref document: A |
WWE | WIPO information: entry into national phase | Ref document number: 2011731621; Country of ref document: EP |
WWE | WIPO information: entry into national phase | Ref document number: 13145648; Country of ref document: US |
121 | EP: the EPO has been informed by WIPO that EP was designated in this application | Ref document number: 11731621; Country of ref document: EP; Kind code of ref document: A1 |
WWE | WIPO information: entry into national phase | Ref document number: 2012514695; Country of ref document: JP |
ENP | Entry into the national phase | Ref document number: 2798954; Country of ref document: CA |
NENP | Non-entry into the national phase | Ref country code: DE |