WO2009119264A1 - 画像処理装置、画像処理装置の制御方法、プログラム及び情報記憶媒体 - Google Patents
- Publication number
- WO2009119264A1 (PCT/JP2009/054023)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- texture image
- image
- auxiliary lines
- auxiliary
- image processing
- Prior art date
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T15/00—3D [Three Dimensional] image rendering
- G06T15/04—Texture mapping
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T19/00—Manipulating 3D models or images for computer graphics
- G06T19/20—Editing of 3D images, e.g. changing shapes or colours, aligning objects or positioning parts
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63F—CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
- A63F2300/00—Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
- A63F2300/60—Methods for processing data by generating or executing the game program
- A63F2300/66—Methods for processing data by generating or executing the game program for rendering three dimensional images
- A63F2300/6692—Methods for processing data by generating or executing the game program for rendering three dimensional images using special effects, generally involving post-processing, e.g. blooming
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63F—CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
- A63F2300/00—Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
- A63F2300/80—Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game specially adapted for executing a specific type of game
- A63F2300/8011—Ball
Definitions
- the present invention relates to an image processing apparatus, a control method for the image processing apparatus, a program, and an information storage medium.
- an image processing apparatus that displays an image representing a state in which an object arranged in a virtual three-dimensional space is viewed from a given viewpoint.
- for example, a game device is known as one such image processing device.
- a game screen representing a state in which a virtual three-dimensional space in which player objects representing soccer players and the like are arranged is viewed from a viewpoint is displayed.
- the present invention has been made in view of the above problems, and an object of the present invention is to provide an image processing apparatus, a control method for the image processing apparatus, a program, and an information storage medium that enable the user to easily grasp the unevenness of an object.
- an image processing apparatus is an image processing apparatus that displays an image representing a state in which an object placed in a virtual three-dimensional space is viewed from a given viewpoint.
- the image processing apparatus includes original texture image storage means for storing an original texture image, and display control means for displaying, on display means, an image representing a state in which an object is viewed from the viewpoint, the object having mapped onto it a texture image with auxiliary lines in which a plurality of auxiliary lines representing a mesh, or a plurality of mutually parallel auxiliary lines, are drawn on the original texture image.
- the control method for the image processing apparatus according to the present invention is a control method for an image processing apparatus that displays an image representing a state in which an object placed in a virtual three-dimensional space is viewed from a given viewpoint.
- the program according to the present invention is a program for causing a computer to function as an image display device that displays an image representing a state in which an object placed in a virtual three-dimensional space is viewed from a given viewpoint.
- the program causes the computer to function as original texture image storage means for storing the original texture image, and as display control means for displaying, on display means, an image representing a state in which an object to which a texture image with auxiliary lines (the original texture image on which a plurality of auxiliary lines representing a mesh, or a plurality of mutually parallel auxiliary lines, are drawn) is mapped is viewed from the viewpoint.
- the information storage medium according to the present invention is a computer-readable information storage medium recording the above program.
- a program distribution apparatus according to the present invention is a program distribution apparatus that includes an information storage medium that records the program, reads the program from the information storage medium, and distributes the program.
- the program distribution method according to the present invention is a program distribution method for reading and distributing the program from an information storage medium storing the program.
- the present invention relates to an image processing apparatus that displays an image representing a state in which an object arranged in a virtual three-dimensional space is viewed from a given viewpoint.
- the original texture image of the object is stored.
- an image representing a state in which an object is viewed from the viewpoint is displayed on the display means, the object having mapped onto it a texture image with auxiliary lines in which a plurality of auxiliary lines representing a mesh, or a plurality of mutually parallel auxiliary lines, are drawn on the original texture image. According to the present invention, the user can easily grasp the unevenness of the object.
- the display control unit may include a texture image acquisition unit with auxiliary lines for acquiring the texture image with auxiliary lines, and may display, on the display means, an image representing a state in which the object to which the acquired texture image with auxiliary lines is mapped is viewed from the viewpoint.
- the texture image acquisition unit with auxiliary lines may generate the texture image with auxiliary lines based on the original texture image.
- the texture image acquisition unit with auxiliary lines may generate the texture image with auxiliary lines by drawing the plurality of auxiliary lines representing the mesh, or the plurality of mutually parallel auxiliary lines, on the original texture image.
- the texture image acquisition unit with auxiliary lines may generate the texture image with auxiliary lines by drawing, on the original texture image, at least a plurality of first auxiliary lines parallel to each other and a plurality of second auxiliary lines that are parallel to each other and intersect the plurality of first auxiliary lines.
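The drawing of mutually intersecting first and second auxiliary lines described above might be sketched as follows. This is an illustrative assumption, not the disclosed implementation: the function name `draw_auxiliary_lines`, the texture representation as a 2D list of pixel tuples, and the line color are all invented for the example.

```python
# Hypothetical sketch of generating a texture image with auxiliary lines.
# The name draw_auxiliary_lines, the pixel-list texture representation,
# and the line color are assumptions, not taken from the disclosure.

def draw_auxiliary_lines(texture, interval, color=(0, 0, 0)):
    """Return a copy of `texture` (a 2D list of pixels) with a mesh of
    mutually parallel horizontal and vertical lines drawn every
    `interval` pixels (corresponding to auxiliary lines 76a and 76b)."""
    height = len(texture)
    width = len(texture[0])
    result = [row[:] for row in texture]   # keep the original texture intact
    for y in range(0, height, interval):   # horizontal lines (76b)
        for x in range(width):
            result[y][x] = color
    for x in range(0, width, interval):    # vertical lines (76a)
        for y in range(height):
            result[y][x] = color
    return result
```

Copying the texture first mirrors the description: the original texture image is stored unchanged, and the version with auxiliary lines is generated from it on demand.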
- the display control means may include means for controlling the fineness of the mesh or the interval between the plurality of auxiliary lines for each of a plurality of areas set in the texture image with auxiliary lines.
- the display control means may include means for controlling the fineness of the mesh or the interval between the plurality of auxiliary lines based on the position of the viewpoint.
- the image processing apparatus may include means for controlling the colors of the plurality of auxiliary lines representing the mesh or the plurality of auxiliary lines parallel to each other based on the original texture image.
- the game device according to the embodiment of the present invention is realized by, for example, a home game machine (stationary game machine), a portable game machine, a mobile phone, a personal digital assistant (PDA), a personal computer, or the like.
- a case where the game device according to the embodiment of the present invention is realized by a consumer game machine will be described.
- the present invention can also be applied to other image processing apparatuses (for example, personal computers).
- FIG. 1 shows the overall configuration of a game device according to an embodiment of the present invention.
- a game apparatus 10 shown in FIG. 1 includes a consumer game machine 11, a monitor 32, a speaker 34, and an optical disk 36 (information storage medium).
- the monitor 32 and the speaker 34 are connected to the consumer game machine 11.
- the monitor 32 for example, a home television receiver is used
- the speaker 34 for example, a speaker built in the home television receiver is used.
- the home game machine 11 is a known computer game system.
- the home game machine 11 includes a bus 12, a microprocessor 14, a main memory 16, an image processing unit 18, an input/output processing unit 20, an audio processing unit 22, an optical disc reading unit 24, a hard disk 26, a communication interface 28, and a controller 30. Components other than the controller 30 are accommodated in the housing of the consumer game machine 11.
- the microprocessor 14 controls each part of the consumer game machine 11 based on an operating system stored in a ROM (not shown), a program read from the optical disk 36 or the hard disk 26.
- the main memory 16 includes a RAM, for example. Programs and data read from the optical disk 36 or the hard disk 26 are written in the main memory 16 as necessary.
- the main memory 16 is also used as a working memory for the microprocessor 14.
- the bus 12 is for exchanging addresses and data among the units of the consumer game machine 11.
- the microprocessor 14, the main memory 16, the image processing unit 18, and the input / output processing unit 20 are connected by the bus 12 so that mutual data communication is possible.
- the image processing unit 18 includes a VRAM, and draws a game screen on the VRAM based on image data sent from the microprocessor 14.
- the image processing unit 18 converts the game screen drawn on the VRAM into a video signal and outputs the video signal to the monitor 32 at a predetermined timing.
- the input / output processing unit 20 is an interface for the microprocessor 14 to access the audio processing unit 22, the optical disk reading unit 24, the hard disk 26, the communication interface 28, and the controller 30.
- the sound processing unit 22 includes a sound buffer, and reproduces various sound data such as game music, game sound effects, and messages read out from the optical disk 36 or the hard disk 26 to the sound buffer and outputs them from the speaker 34.
- the communication interface 28 is an interface for connecting the consumer game machine 11 to a communication network such as the Internet by wire or wireless.
- the optical disk reading unit 24 reads programs and data recorded on the optical disk 36.
- the optical disc 36 is used to supply the program and data to the consumer game machine 11, but other information storage media such as a memory card may be used. Further, for example, a program or data may be supplied to the consumer game machine 11 from a remote place via a communication network such as the Internet.
- the hard disk 26 is a general hard disk device (auxiliary storage device). Note that the game apparatus 10 may be provided with a memory card slot for reading data from the memory card and writing data to the memory card.
- the controller 30 is a general-purpose operation input means for the user to input various game operations.
- a plurality of controllers 30 can be connected to the consumer game machine 11.
- the input / output processing unit 20 scans the state of the controller 30 at regular intervals (for example, every 1/60 seconds), and passes an operation signal representing the scan result to the microprocessor 14 via the bus 12.
- the microprocessor 14 determines the player's game operation based on the operation signal.
- the controller 30 may be connected to the consumer game machine 11 by wire or wirelessly.
- a soccer game is executed.
- This soccer game is realized by executing a program read from the optical disc 36.
- FIG. 2 shows an example of a virtual three-dimensional space.
- a field object 42 representing a soccer field is arranged in the virtual three-dimensional space 40.
- a goal object 44 representing a goal
- a player object 46 representing a soccer player
- a ball object 48 representing a soccer ball
- 22 player objects 46 are arranged on the field object 42.
- each object is simplified.
- the object such as the player object 46 includes a plurality of polygons.
- a texture image is mapped to an object such as the player object 46.
- a point on the object (such as a polygon vertex) is associated with a point (pixel) on the texture image, and the color of each point of the object is controlled based on the color of the associated point on the texture image.
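The vertex-to-texel association above can be illustrated with a minimal lookup. Normalized UV coordinates in the range [0, 1] are assumed here for the example; the disclosure only states that object points are associated with texture pixels, and the name `sample_texture` is invented.

```python
# Minimal sketch of looking up the texture color for an object point,
# assuming normalized (u, v) coordinates in [0, 1] (the coordinate
# convention is an assumption, not taken from the disclosure).

def sample_texture(texture, u, v):
    """Return the texel color of `texture` (a 2D list of pixels)
    for normalized coordinates (u, v)."""
    height = len(texture)
    width = len(texture[0])
    # Clamp so that u == 1.0 or v == 1.0 maps to the last texel.
    x = min(int(u * width), width - 1)
    y = min(int(v * height), height - 1)
    return texture[y][x]
```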
- FIG. 3 is a diagram showing an example of the appearance of the head 47 of the player object 46.
- FIG. 4 is a diagram showing a wire frame of the head 47 (face 50) of the player object 46, illustrating an example of the polygons that form the head 47 (face 50).
- irregularities such as eyes 52, nose 54, mouth 56, jaw 58, cheek 59 are formed by a plurality of polygons.
- a texture image (hereinafter referred to as a “face texture image”) representing a soccer player's face (eyes, nose, mouth, skin, etc.) is mapped to the polygon of the face 50.
- FIG. 5 shows an example of a face texture image.
- in the face texture image 60 shown in FIG. 5, for example, eyes 62, a nose 64, a mouth 66, and the like are drawn. Although omitted in FIG. 5, the face texture image 60 also depicts, for example, the soccer player's ears.
- the eye 62 portion of the face texture image 60 is associated with the eye 52 polygon of the player object 46 and mapped to the eye 52 polygon of the player object 46.
- a virtual camera 49 (viewpoint) is also set in the virtual three-dimensional space 40.
- the virtual camera 49 moves in the virtual three-dimensional space 40 based on the movement of the ball object 48, for example.
- a game screen (hereinafter referred to as “main game screen”) showing the virtual three-dimensional space 40 viewed from the virtual camera 49 is displayed on the monitor 32. The user operates the player object 46 while viewing the main game screen, and aims to generate a scoring event for his team.
- the soccer game according to the present embodiment has a face deformation function for the user to change the face 50 of the player object 46 to his / her preference.
- FIG. 6 shows an example of a face deformation screen.
- the face deformation screen 70 shown in FIG. 6 includes a deformation parameter field 72 and a deformation result field 74.
- the deformation parameter column 72 is a column for the user to set parameters relating to the deformation of the face 50 of the player object 46 (hereinafter referred to as “deformation parameters”).
- deformation parameters include “eye”, “nose”, “mouth”, “chin”, and “cheek” parameters.
- the “eye”, “nose”, “mouth”, and “cheek” parameters are parameters for controlling the size and shape of the eyes 52, nose 54, mouth 56, and cheek 59 of the player object 46, respectively.
- the “jaw” parameter is a parameter for controlling the length and the like of the jaw 58 of the player object 46.
- the “eye” parameter will be mainly described in detail.
- the “nose”, “mouth”, “chin”, and “cheek” parameters are the same as the “eye” parameters.
- the “eye” parameter is a numerical value indicating how much larger or smaller the size of the eye 52 of the player object 46 is from the initial state.
- the “eye” parameter takes an integer value of, for example, -3 to +3.
- based on the value of the “eye” parameter, the positions of the vertices of the polygons of the eye 52 of the player object 46 are set. More specifically, the vertex positions of the polygons of the eye 52 corresponding to each integer value from -3 to +3 are determined in advance.
- when the value of the “eye” parameter is 0, the positions of the vertices of the polygons of the eye 52 are set so that the size of the eye 52 is in the initial state.
- when the value of the “eye” parameter is positive, the vertex positions are set so that the eye 52 is larger than in the initial state. In this case, the eye 52 becomes larger as the value of the “eye” parameter increases.
- when the value of the “eye” parameter is negative, the vertex positions are set so that the eye 52 is smaller than in the initial state. In this case, the eye 52 becomes smaller as the value of the “eye” parameter decreases.
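As an illustration of how such a parameter might select vertex positions, the sketch below scales eye vertices linearly. The linear-scaling rule, the `step` value, and the function name are assumptions made for the example; the disclosure only says that vertex positions are predetermined for each integer value from -3 to +3.

```python
# Hedged sketch of an "eye" parameter in the range -3..+3 selecting
# vertex positions. The linear scaling below is an illustrative
# assumption; the disclosure only states that positions are
# predetermined per integer value.

def eye_vertex_position(base_position, parameter, step=0.05):
    """Scale a vertex of the eye polygon relative to the eye origin.
    parameter = 0 keeps the initial shape; positive values enlarge
    the eye, negative values shrink it."""
    if not -3 <= parameter <= 3:
        raise ValueError("eye parameter must be an integer in -3..+3")
    scale = 1.0 + step * parameter
    return tuple(coord * scale for coord in base_position)
```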
- the user selects a deformation parameter to be changed by performing an operation for instructing an upward or downward direction.
- the deformation parameter selected as the change target is displayed in a distinguishable manner. In the example shown in FIG. 6, the “mouth” parameter is highlighted as the change target.
- the user increases or decreases the value of the deformation parameter to be changed by performing an operation for instructing the right or left direction.
- the deformation result column 74 displays an image of the head 47 (face 50) of the player object 46 corresponding to the change result of each deformation parameter.
- the deformation result column 74 displays an image showing the shape of the head 47 of the player object 46 when the value of each deformation parameter is set to the value displayed in the deformation parameter column 72.
- when a deformation parameter value is changed, the image of the head 47 of the player object 46 displayed in the deformation result column 74 is updated. Further, the user can arbitrarily enlarge or reduce the head 47 of the player object 46 displayed in the deformation result column 74 by performing an operation for instructing enlargement or reduction.
- the user can confirm the change result of the face 50 of the player object 46 by referring to the deformation result column 74.
- an auxiliary line 76 that assists the user to easily grasp the unevenness is displayed on the face 50 of the player object 46.
- a line corresponding to the vertical direction of the face 50 of the player object 46 and a line corresponding to the horizontal direction of the face 50 of the player object 46 are displayed as auxiliary lines 76.
- These auxiliary lines 76 represent a mesh on the face 50 of the player object 46.
- the user can easily grasp the unevenness of the face 50 of the player object 46 by referring to the state of the mesh (auxiliary line 76). For example, the user can grasp at a glance the change in the unevenness of the face 50 when the deformation parameter is changed by referring to the change in the mesh shape.
- the deformation parameter data is data indicating the setting result of the deformation parameter, and is data indicating the value displayed in the deformation parameter column 72 when the determination button is pressed.
- the post-deformation shape data is data indicating the shape of the head 47 (face 50) of the player object 46 after deformation by the user, that is, the position coordinates of the polygon vertices of the head 47 after deformation. For example, when displaying the main game screen, the post-deformation shape data (or deformation parameter data) is read.
- the shape of the head 47 (face 50) of the player object 46 arranged in the virtual three-dimensional space 40 is controlled based on the post-deformation shape data (or deformation parameter data).
- a player object 46 having a face 50 deformed by the user is displayed on the main game screen.
- FIG. 7 is a functional block diagram mainly showing functional blocks related to the face deformation function among the functional blocks realized by the game apparatus 10.
- the game apparatus 10 includes a game data storage unit 80 and a display control unit 84. These functional blocks are realized by the microprocessor 14 executing a program.
- the game data storage unit 80 is realized by, for example, the main memory 16, the hard disk 26, or the optical disk 36.
- the game data storage unit 80 stores various data for executing the soccer game. For example, data indicating each object placed in the virtual three-dimensional space 40 and the state (position, posture, etc.) of the virtual camera 49 is stored in the game data storage unit 80. Further, for example, data indicating the shape of each object is stored in the game data storage unit 80.
- the game data storage unit 80 includes an original texture image storage unit 82.
- the original texture image storage unit 82 stores the texture image of the object.
- the face texture image 60 (see FIG. 5) of the player object 46 is stored in the original texture image storage unit 82.
- the face texture image 60 and the like stored in the original texture image storage unit 82 are hereinafter referred to as “original texture image”.
- the display control unit 84 is realized mainly by the microprocessor 14 and the image processing unit 18.
- the display control unit 84 displays various screens on the monitor 32 based on various data stored in the game data storage unit 80.
- the display control unit 84 includes a first display control unit 86.
- the first display control unit 86 displays on the monitor 32 an image representing a state in which an object to which the original texture image is mapped as-is is viewed from a given viewpoint.
- the first display control unit 86 displays on the monitor 32 a main game screen representing a state in which the virtual three-dimensional space 40 is viewed from the virtual camera 49.
- a player object 46 to which the face texture image 60 is mapped as-is is displayed on the main game screen.
- the display control unit 84 includes a second display control unit 88.
- the second display control unit 88 displays on the monitor 32 an image representing a state in which the object to which the texture image with auxiliary lines is mapped is viewed from a given viewpoint.
- the texture image with auxiliary lines is a texture image in which auxiliary lines 76 for assisting the user to easily grasp the unevenness of the object are represented on the original texture image. Details will be described later.
- the second display control unit 88 displays the face deformation screen 70 on the monitor 32.
- a player object 46 to which a face texture image with auxiliary lines is mapped is displayed on the face deformation screen 70 (deformation result column 74).
- the face texture image with auxiliary lines is a texture image in which auxiliary lines 76 that help the user grasp the unevenness of the face 50 of the player object 46 are drawn on the face texture image 60.
- FIG. 8 is a diagram showing an example of a face texture image with auxiliary lines.
- the face texture image 90 with auxiliary lines shown in FIG. 8 is a texture image in which a plurality of auxiliary lines 76a and 76b representing a mesh are drawn on the face texture image 60.
- the auxiliary line 76a is a straight line parallel to the vertical direction (Y direction shown in FIG. 5) of the face texture image 60, and is a straight line from the upper end to the lower end of the face texture image 60.
- the auxiliary line 76b is a straight line parallel to the horizontal direction (X direction shown in FIG. 5) of the face texture image 60, and is a straight line from the left end to the right end of the face texture image 60.
- the auxiliary lines 76a are drawn at equal intervals, and the auxiliary lines 76b are also drawn at equal intervals.
- the auxiliary line 76a and the auxiliary line 76b are orthogonal to each other, and as a result, a rectangular mesh is represented on the face texture image 90 with the auxiliary line.
- the interval between the auxiliary lines 76a and the interval between the auxiliary lines 76b may be different. Further, the interval between the auxiliary lines 76a and 76b may not be constant.
- instead of the auxiliary lines 76a and 76b, diagonal lines sloping down to the right or up to the right may be drawn as the auxiliary lines 76.
- for example, a plurality of straight lines parallel to the straight line connecting the upper-left vertex 60a and the lower-right vertex 60d of the face texture image 60, and a plurality of straight lines parallel to the straight line connecting the lower-left vertex 60c and the upper-right vertex 60b, may be drawn in the face texture image 90 as the auxiliary lines 76.
- alternatively, a plurality of straight lines parallel to the straight line connecting the upper-left vertex 60a and the lower-right vertex 60d of the face texture image 60, and a plurality of straight lines parallel to the horizontal direction (the X direction shown in FIG. 5) of the face texture image 60, may be drawn as the auxiliary lines 76.
- the second display control unit 88 includes a texture image acquisition unit 89 with auxiliary lines.
- the texture image acquisition unit 89 with auxiliary lines acquires a texture image with auxiliary lines.
- the texture image acquisition unit 89 with auxiliary lines generates a texture image with auxiliary lines based on the original texture image. More specifically, it generates the texture image with auxiliary lines by drawing a plurality of auxiliary lines 76 representing a mesh on the original texture image.
- the auxiliary textured face texture image 90 shown in FIG. 8 is generated as follows. First, the texture image acquisition unit 89 with auxiliary lines reads the face texture image 60 from the original texture image storage unit 82.
- the texture image acquisition unit 89 with auxiliary lines then draws, on the face texture image 60, a plurality of mutually parallel auxiliary lines 76a and a plurality of auxiliary lines 76b that are parallel to each other and intersect the auxiliary lines 76a, thereby generating the face texture image 90 with auxiliary lines.
- FIG. 9 is a diagram illustrating an example of a virtual three-dimensional space for the face deformation screen 70.
- a head 47a of the player object 46 and a virtual camera 49a are arranged in the virtual three-dimensional space 40a for the face deformation screen 70.
- the shape of the head 47a of the player object 46 is a shape based on post-deformation shape data (or deformation parameter data).
- a face texture image 90 with auxiliary lines is mapped to the head 47a of the player object 46.
- the second display control unit 88 displays an image representing the state of the head 47 a of the player object 46 as viewed from the virtual camera 49 a in the deformation result column 74.
- the second display control unit 88 changes the position of the virtual camera 49a in accordance with a user operation.
- the distance between the head 47a of the player object 46 and the virtual camera 49a changes according to the user's operation.
- for example, the position of the head 47a of the player object 46 is fixed, and the virtual camera 49a moves away from or toward the head 47a according to the user's operation, so that the distance between the head 47a and the virtual camera 49a changes.
- for example, when the distance between the head 47a and the virtual camera 49a becomes shorter, the head 47a (face 50) of the player object 46 is displayed enlarged in the deformation result column 74; when the distance becomes longer, it is displayed reduced.
- the texture image acquisition unit 89 with auxiliary lines may control the interval (mesh fineness) of the auxiliary lines 76 represented in the texture image with auxiliary lines based on the position of the virtual camera 49a.
- the configuration for controlling the interval (mesh fineness) of the auxiliary lines 76 based on the position of the virtual camera 49a is as follows.
- the auxiliary line-attached texture image acquisition unit 89 stores interval control data for determining the interval of the auxiliary line 76 based on the position of the virtual camera 49a.
- the interval control data is data in which the position of the virtual camera 49a and the interval of the auxiliary line 76 are associated with each other.
- the interval control data is data in which a condition relating to the position of the virtual camera 49a is associated with the interval of the auxiliary line 76.
- the “condition regarding the position of the virtual camera 49a” is a condition regarding the distance between the player object 46 and the virtual camera 49a, for example.
- the “condition regarding the position of the virtual camera 49a” when the position of the head 47a of the player object 46 is fixed as in the present embodiment is, for example, a plurality of areas set in the virtual three-dimensional space 40a. It may be a condition that the virtual camera 49a is included in any of the regions.
- the interval control data is set so that when the distance between the head 47a of the player object 46 and the virtual camera 49a is relatively long, the interval between the auxiliary lines 76 is relatively wide (the mesh is relatively coarse), and when the distance is relatively short, the interval between the auxiliary lines 76 is relatively narrow (the mesh is relatively fine).
- the interval control data may be table format data or arithmetic expression format data.
- the interval control data may be stored as part of the program.
- FIG. 10 shows an example of interval control data.
- the interval control data shown in FIG. 10 is data in which the distance between the head 47a of the player object 46 and the virtual camera 49a is associated with the interval of the auxiliary line 76.
- D1 to D5 in FIG. 10 have a relationship of D1 < D2 < D3 < D4 < D5.
- as the distance between the head 47a of the player object 46 and the virtual camera 49a becomes shorter, the number of auxiliary lines 76a and 76b drawn in the face texture image 90 increases, and the interval between them becomes narrower (that is, the mesh becomes finer).
- the texture image acquisition unit 89 with auxiliary lines acquires the interval corresponding to the current position of the virtual camera 49a based on the interval control data, and generates a texture image with auxiliary lines in which the auxiliary lines 76 are drawn at that interval.
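A table-format version of the interval control data could look like the following sketch. The distance thresholds and the concrete pixel intervals are placeholder values: the disclosure leaves D1 to D5 and the associated intervals unspecified.

```python
# Sketch of table-format interval control data. The thresholds and the
# pixel intervals below are placeholder values (the disclosure leaves
# D1..D5 and the concrete intervals unspecified).

# (distance upper bound, auxiliary-line interval in pixels):
# smaller camera distances map to narrower intervals, i.e. a finer mesh.
INTERVAL_CONTROL_DATA = [
    (10.0, 8),           # camera very close -> fine mesh
    (20.0, 16),
    (40.0, 32),
    (float("inf"), 64),  # camera far away -> coarse mesh
]

def interval_for_distance(distance):
    """Return the auxiliary-line interval for the current distance
    between the head 47a and the virtual camera 49a."""
    for upper_bound, interval in INTERVAL_CONTROL_DATA:
        if distance < upper_bound:
            return interval
    raise ValueError("unreachable: the table covers all distances")
```

As the description notes, the same mapping could instead be expressed as an arithmetic expression rather than a table.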
- FIG. 11 is a flowchart showing a process executed by the game apparatus 10 to display the face deformation screen 70.
- the microprocessor 14 executes the process shown in FIG. 11 according to the program stored in the optical disk 36.
- the microprocessor 14 (texture image acquisition unit 89 with auxiliary lines) reads the face texture image 60 from the optical disk 36 onto the VRAM (S101). Further, the microprocessor 14 (texture image acquisition unit 89 with auxiliary lines) determines the interval between the auxiliary lines 76a and 76b based on the current position of the virtual camera 49a (S102). For example, the interval control data (see FIG. 10) is read from the optical disc 36, and the interval corresponding to the current position of the virtual camera 49a is acquired based on that data. That is, the interval corresponding to the distance between the head 47a of the player object 46 and the virtual camera 49a is acquired from the interval control data.
- the microprocessor 14 draws the auxiliary lines 76a and 76b on the face texture image 60 read onto the VRAM (S103). That is, a plurality of auxiliary lines 76a parallel to the vertical direction (Y direction shown in FIG. 5) of the face texture image 60 are drawn at the interval determined in S102. Further, a plurality of auxiliary lines 76b parallel to the horizontal direction (X direction shown in FIG. 5) of the face texture image 60 are drawn at the interval determined in S102.
- a face texture image 90 with auxiliary lines is generated on the VRAM.
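The drawing step S103 can be sketched minimally as below, assuming the texture is represented as a height × width grid of RGB tuples; the function name and the representation are assumptions for illustration, not from the patent:

```python
def draw_auxiliary_lines(texture, interval, line_color=(0, 0, 0)):
    """Draw vertical auxiliary lines 76a (every `interval`-th column,
    parallel to the Y direction) and horizontal auxiliary lines 76b
    (every `interval`-th row, parallel to the X direction) onto a
    texture given as a height x width grid of RGB tuples.

    The texture is modified in place, yielding the counterpart of the
    face texture image 90 with auxiliary lines.
    """
    for y, row in enumerate(texture):
        for x in range(len(row)):
            if x % interval == 0 or y % interval == 0:
                row[x] = line_color
    return texture
```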
- the microprocessor 14 and the image processing unit 18 display the face deformation screen 70 on the monitor 32 (S104). For example, first, a portion other than the deformation result column 74 of the face deformation screen 70 is drawn on the VRAM. Thereafter, an image representing a state in which the virtual three-dimensional space 40a for the face deformation screen 70 is viewed from the virtual camera 49a is generated, and the image is drawn in the deformation result column 74 of the face deformation screen 70 drawn on the VRAM.
- when deformed shape data is stored in the hard disk 26, the shape of the head 47a of the player object 46 arranged in the virtual three-dimensional space 40a is set to the shape indicated by the deformed shape data.
- otherwise, the shape of the head 47a of the player object 46 is set to the basic shape (initial state). Further, the face texture image 90 with auxiliary lines generated by the processing of S101 to S103 is mapped onto the head 47a of the player object 46.
- the face deformation screen 70 generated on the VRAM as described above is displayed on the monitor 32.
- the microprocessor 14 determines whether or not a deformation parameter selection operation has been performed (S105). In the present embodiment, it is determined whether or not an operation instructing the upward or downward direction has been performed. If it is determined that a deformation parameter selection operation has been performed, the microprocessor 14 updates the face deformation screen 70 (S104). In this case, the deformation parameter to be changed is switched to another deformation parameter in accordance with the user's instruction, and the new deformation parameter to be changed is displayed in a distinguished manner in the deformation parameter column 72.
- the microprocessor 14 determines whether or not a deformation parameter value increase/decrease operation has been performed (S106). In the present embodiment, it is determined whether or not an operation instructing the right or left direction has been performed. If it is determined that a deformation parameter value increase/decrease operation has been performed, the microprocessor 14 updates the face deformation screen 70 (S104). In this case, the value of the deformation parameter to be changed is increased or decreased in accordance with the user's instruction, and the value of that deformation parameter displayed in the deformation parameter column 72 is updated.
- the shape of the head 47a of the player object 46 is updated based on the value of each deformation parameter displayed in the deformation parameter column 72. Then, an image representing a state in which the virtual three-dimensional space 40a is viewed from the virtual camera 49a is regenerated, and the image is displayed in the deformation result column 74. In this case, the face texture image 90 with auxiliary lines generated by the processing of S101 to S103 and held in the VRAM is mapped onto the head 47a of the player object 46.
- the microprocessor 14 determines whether or not the movement operation of the virtual camera 49a has been performed (S107). When it is determined that the moving operation of the virtual camera 49a has been performed, the position of the virtual camera 49a is updated according to a user instruction. Thereafter, the microprocessor 14 re-executes from the processing of S101, and regenerates the face texture image 90 with auxiliary lines. That is, the face texture image 60 is read again from the optical disk 36 onto the VRAM (S101). Further, the interval between the auxiliary lines 76a and 76b is determined again based on the updated position of the virtual camera 49a (S102).
- auxiliary lines 76a and 76b are drawn on the face texture image 60 at the re-determined interval (S103), and a face texture image 90 with auxiliary lines is generated on the VRAM. Thereafter, the face deformation screen 70 is updated based on the updated position of the virtual camera 49a and the face texture image 90 with auxiliary lines regenerated on the VRAM (S104).
- the microprocessor 14 determines whether or not the enter button or the cancel button has been pressed (S108). If neither the enter button nor the cancel button has been pressed, the microprocessor 14 re-executes the process from S105. On the other hand, if it is determined that the enter button or the cancel button has been pressed, the microprocessor 14 stores the deformation parameter data and the deformed shape data in the hard disk 26 (S109). These data are referred to when the main game screen is generated.
- as described above, the user can change the face 50 of the player object 46 to his/her preference using the face deformation function (face deformation screen 70).
- on the face deformation screen 70, the user can grasp the unevenness of the face 50 of the player object 46 relatively easily using the mesh (auxiliary lines 76a and 76b). That is, the technical problem on the user interface that it is difficult to grasp the unevenness of the face 50 of the player object 46 is addressed.
- in the present embodiment, a mesh consisting simply of lines is represented on the face 50 of the player object 46 so that the user can more easily grasp the unevenness of the face 50 of the player object 46.
- an image of the head 47 of the player object 46 is displayed in the deformation result column 74.
- a method of displaying the wire frame of the head 47 on the image is also conceivable.
- the use of this method has the following disadvantages. That is, when the player object 46 includes a large number of polygons, the processing load for displaying the wire frame becomes relatively large. Moreover, every time the user changes the value of a deformation parameter, the wire frame of the head 47 must be displayed again.
- in addition, the lines representing the wire frame may become dense, which may make it difficult for the user to grasp the unevenness of the face 50 of the player object 46.
- according to the game apparatus 10, it is possible to prevent the above-described inconveniences from occurring.
- in the game apparatus 10, a relatively simple process is executed: the face texture image 90 with auxiliary lines, in which the auxiliary lines 76a and 76b are drawn on the original face texture image 60, is mapped onto the player object 46. Even when the user changes the value of a deformation parameter, it is not necessary to regenerate the face texture image 90 with auxiliary lines (see S106 in FIG. 11). That is, according to the game apparatus 10, it is possible to reduce the processing load.
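The load-saving reuse can be illustrated with a small cache sketch: the texture with auxiliary lines is regenerated only when the required interval changes (camera movement, S107), while deformation-parameter changes (S105/S106) reuse the cached image. All names in this sketch are hypothetical:

```python
class AuxiliaryLineTextureCache:
    """Cache for the face texture image with auxiliary lines.

    Regeneration happens only when the requested line interval changes
    (the virtual camera moved far enough that S101-S103 must run again);
    other screen updates reuse the cached image, keeping the load low.
    """

    def __init__(self, generate):
        self.generate = generate       # function: interval -> texture image
        self.cached_interval = None
        self.cached_texture = None
        self.regenerations = 0         # counts S101-S103 executions

    def get(self, interval):
        if interval != self.cached_interval:
            self.cached_texture = self.generate(interval)
            self.cached_interval = interval
            self.regenerations += 1
        return self.cached_texture
```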
- even when the player object 46 includes a large number of polygons, the game creator can, for example, set the interval between the auxiliary lines 76a and 76b appropriately so that the auxiliary lines do not become too dense.
- in the game apparatus 10, because the face texture image 90 with auxiliary lines is mapped onto the player object 46, shading caused by the light source appears on the auxiliary lines 76a and 76b just as it does on the eyes 52 and the nose 54 of the player object 46. As a result, it becomes easier for the user to grasp the unevenness of the face 50 of the player object 46.
- in the game apparatus 10, the interval between the auxiliary lines 76a and 76b is adjusted based on the position of the virtual camera 49a.
- if the interval were fixed, the interval might become too wide, or the distance between the auxiliary lines 76a and 76b displayed in the deformation result column 74 might become too small, depending on the position of the virtual camera 49a.
- in such cases, the user might have difficulty grasping the unevenness of the face 50 of the player object 46.
- the face texture image 90 with auxiliary lines is generated based on the original face texture image 60, so it is not necessary to store the face texture image 90 with auxiliary lines in advance. For example, even when the interval between the auxiliary lines 76a and 76b is changed based on the position of the virtual camera 49a, there is no need to store in advance a plurality of face texture images 90 with auxiliary lines having different intervals between the auxiliary lines 76a and 76b. Thus, according to the game apparatus 10, it becomes possible to reduce the amount of data.
- the line represented as the auxiliary line 76 in the texture image with the auxiliary line may be a line other than a straight line.
- a curve, a wavy line, or a broken line may be represented as the auxiliary line 76 as long as it is possible to assist the user to easily grasp the unevenness of the object.
- the mesh represented in the texture image with auxiliary lines may have a shape other than a rectangle. The mesh may have any shape as long as it can assist the user to easily grasp the unevenness of the object.
- the shape of the mesh represented in the texture image with auxiliary lines may not be uniform. That is, the shape may be different for each mesh.
- the second display control unit 88 may change the color of the auxiliary line 76 based on the original texture image.
- a configuration for changing the color of the auxiliary line 76 based on the original texture image will be described.
- the auxiliary line-attached texture image acquisition unit 89 stores color control data for determining the color of the auxiliary line 76 based on the original texture image.
- the color control data is data that associates the condition relating to the original texture image with the color information relating to the color of the auxiliary line 76.
- the “condition regarding the original texture image” may be a condition regarding the identification information of the original texture image or a condition regarding the color of the original texture image, for example.
- the “condition regarding the color of the original texture image” is, for example, a condition regarding a statistical value (for example, an average value) of color values of each pixel of the original texture image.
- the color control data is referred to, and color information corresponding to a condition that the original texture image satisfies is acquired. Then, a plurality of auxiliary lines 76 are drawn on the original texture image in a color based on the color information, thereby generating the texture image with auxiliary lines. In this way, the color of the auxiliary lines 76 can be set in consideration of the original texture image. As a result, it becomes possible to make the auxiliary lines 76 easier for the user to see.
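One conceivable form of such color control data is a rule keyed on a statistical value of the original texture's pixel colors, as the description suggests. In this sketch the brightness threshold of 128 and the two line colors are assumptions:

```python
def auxiliary_line_color(texture):
    """Pick an auxiliary-line color from the original texture image.

    The 'condition regarding the color of the original texture image'
    is modeled here as a hypothetical average-brightness test: dark
    lines on a bright texture, light lines on a dark texture, so that
    the mesh stays visible.
    """
    pixels = [p for row in texture for p in row]
    avg_brightness = sum(sum(p) for p in pixels) / (3 * len(pixels))
    return (0, 0, 0) if avg_brightness >= 128 else (255, 255, 255)
```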
- the user may be able to specify the reference color of the original texture image.
- the user may be able to specify the skin color (reference color) of the player object 46 on the face deformation screen 70.
- a plurality of face texture images 60 having different skin colors may be stored in advance, and the face texture image 60 corresponding to the color designated by the user may be used.
- the color (skin color) of the face texture image 60 may be updated based on the color designated by the user, and the updated face texture image 60 may be used.
- the color of the auxiliary line 76 may be changed based on the color designated by the user.
- color control data in which the face texture image 60 is associated with the color information regarding the color of the auxiliary line 76 may be stored.
- color control data in which a color that can be designated as a skin color by the user and color information related to the color of the auxiliary line 76 are associated with each other may be stored.
- the color information associated with the face texture image 60 corresponding to the color designated by the user, or the color information associated with the color designated by the user, is acquired, and the auxiliary lines 76a and 76b may be drawn on the face texture image 60 in a color based on that color information.
- in this way, even when the user can specify the skin color of the player object 46 (that is, when the user can specify the reference color of the original texture image), it becomes possible to prevent the auxiliary lines 76 from becoming difficult to see.
- the texture image acquisition unit 89 with auxiliary lines may change the interval (mesh fineness) of the auxiliary lines 76 for each of a plurality of regions set in the original texture image (texture image with auxiliary lines).
- the interval between the auxiliary lines 76a and / or the auxiliary lines 76b may be changed for each of a plurality of regions set in the face texture image 60 (face texture image 90 with auxiliary lines).
- here, a configuration for changing the interval between the auxiliary lines (mesh fineness) for each region of the face texture image 90 with auxiliary lines will be described.
- the game creator sets important areas and non-important areas in the face texture image 60 in advance.
- the “important area” is an area that the game creator thinks that the unevenness should be clarified in the face 50 of the player object 46.
- an area whose shape can be changed in the face 50 of the player object 46 is set as an important area.
- an area related to each deformation parameter is set as an important area.
- a region related to the “eye” parameter (region near the eye 62), a region related to the “nose” parameter (region near the nose 64), and the like are set as important regions.
- the user may be able to specify an important area.
- Information for specifying the important area is stored in the optical disk 36 or the hard disk 26.
- FIG. 12 shows an example of a face texture image 90 with an auxiliary line when a region related to the “mouth” parameter, that is, a region near the mouth 66 is set as the important region 92.
- in the important region 92, the interval between the auxiliary lines 76 (auxiliary lines 76a to 76d) is narrower and the mesh is finer than in other regions (non-important regions).
- This auxiliary line-added face texture image 90 is generated as follows, for example. That is, first, the auxiliary lines 76a and 76b are drawn at equal intervals over the entire area of the face texture image 60.
- an auxiliary line 76c is further added between the auxiliary lines 76a in the important area 92
- an auxiliary line 76d is further added between the auxiliary lines 76b in the important area 92.
- the auxiliary line 76c is a straight line parallel to the auxiliary line 76a
- the auxiliary line 76d is a straight line parallel to the auxiliary line 76b.
- the auxiliary lines 76c and 76d drawn only in the important area 92 may be drawn first, and then the auxiliary lines 76a and 76b drawn in the entire area of the face texture image 60 may be drawn.
- lines other than lines parallel to the auxiliary lines 76a and 76b may be added to the important area 92.
- the important region 92 may have a shape other than a rectangle. According to the face texture image 90 with auxiliary lines shown in FIG. 12, the user can more easily grasp the unevenness in the area near the mouth 66.
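The denser mesh inside the important region 92 can be sketched as follows, with the region given as a texel rectangle (x0, y0, x1, y1). The half-interval rule for the extra lines 76c/76d, and all names, are illustrative assumptions:

```python
def auxiliary_line_texels(width, height, interval, region):
    """Return the set of (x, y) texel coordinates covered by auxiliary
    lines: lines 76a/76b at `interval` over the whole image, plus extra
    lines 76c/76d at half the interval inside the important region 92
    given as (x0, y0, x1, y1)."""
    x0, y0, x1, y1 = region
    texels = set()
    for y in range(height):
        for x in range(width):
            # Base grid drawn over the entire face texture image.
            on_base = x % interval == 0 or y % interval == 0
            # Extra, finer grid drawn only inside the important region.
            in_region = x0 <= x < x1 and y0 <= y < y1
            on_extra = in_region and (
                x % (interval // 2) == 0 or y % (interval // 2) == 0
            )
            if on_base or on_extra:
                texels.add((x, y))
    return texels
```

As the description notes, the drawing order is interchangeable: the extra lines 76c/76d could be drawn before the base lines 76a/76b with the same result.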
- the interval (mesh fineness) of the auxiliary lines 76 in each region may be changed based on the position of the virtual camera 49a.
- instead of drawing the auxiliary lines 76 (mesh) on the original texture image each time, an auxiliary line texture image in which only the auxiliary lines 76 are drawn may be stored in advance.
- the second display control unit 88 may display on the monitor 32 an image representing a state in which an object in which the original texture image and the auxiliary line texture image are mapped in an overlapping manner is viewed from the viewpoint.
- alternatively, the monitor 32 may display an image representing a state in which an object mapped with a texture image with auxiliary lines formed by combining the original texture image and the auxiliary line texture image is viewed from the viewpoint.
- the auxiliary line-attached texture image acquisition unit 89 may generate the texture image with auxiliary lines by semi-transparently combining the auxiliary line texture image, in which only the auxiliary lines 76a and 76b (or the auxiliary lines 76a to 76d) are drawn, with the face texture image 60.
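A per-texel sketch of this semi-transparent combination, assuming the auxiliary line texture stores None for fully transparent texels; the representation and the alpha value are assumptions:

```python
def blend_textures(base, overlay, alpha=0.5):
    """Semi-transparently combine an auxiliary line texture (overlay)
    with the face texture image 60 (base), texel by texel.

    Both inputs are height x width grids of RGB tuples; overlay texels
    equal to None are treated as fully transparent, so only the drawn
    auxiliary lines affect the result.
    """
    out = []
    for base_row, over_row in zip(base, overlay):
        row = []
        for b, o in zip(base_row, over_row):
            if o is None:
                row.append(b)  # no auxiliary line here: keep the face texel
            else:
                row.append(tuple(
                    round(alpha * oc + (1 - alpha) * bc)
                    for oc, bc in zip(o, b)
                ))
        out.append(row)
    return out
```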
- a texture image with auxiliary lines may be stored in advance in the game data storage unit 80. Then, the texture image with auxiliary line acquisition unit 89 may acquire the texture image with auxiliary line by reading it from the game data storage unit 80.
- the interval between the auxiliary lines 76 may be changed based on the position of the viewpoint (virtual camera 49a).
- a plurality of auxiliary line texture images (or texture images with auxiliary lines) having different intervals (mesh fineness) of the auxiliary lines 76 may be stored in advance.
- a condition relating to the position of the viewpoint may be stored in association with each auxiliary line texture image (or texture image with auxiliary lines). Then, an auxiliary line texture image (or a texture image with auxiliary lines) associated with a condition that satisfies the current position of the viewpoint may be used.
- the color of the auxiliary line 76 may be changed based on the original texture image.
- a plurality of auxiliary line texture images (or texture images with auxiliary lines) having different auxiliary line 76 (mesh) colors are stored in advance.
- Each auxiliary line texture image (or texture image with auxiliary line) is associated with a condition related to the original texture image.
- an auxiliary line texture image (or a texture image with an auxiliary line) associated with a condition that the original texture image satisfies is used.
- the color of the auxiliary line 76 (mesh) may be changed based on the skin color designated by the user.
- the face texture image 60 is associated with each auxiliary line texture image (or texture image with auxiliary lines). Then, an auxiliary line texture image (or a texture image with auxiliary lines) associated with the face texture image 60 corresponding to the color designated by the user is used.
- each auxiliary line texture image (or texture image with auxiliary lines) is associated with a color that the user can specify as the skin color. And the auxiliary line texture image (or texture image with an auxiliary line) matched with the color designated by the user is used.
- the texture image with auxiliary lines may be an image in which a plurality of auxiliary lines 76 parallel to each other are represented on the original texture image.
- that is, either the auxiliary lines 76a or the auxiliary lines 76b may be omitted. Even in this way, it is possible to make it easier for the user to grasp the unevenness of the face 50 of the player object 46.
- the present invention can be applied to games other than soccer games.
- the present invention can be applied to a golf game.
- the present invention can also be applied to image processing apparatuses other than the game apparatus 10.
- the present invention can be applied to a case where it is necessary for the user to easily grasp the unevenness of the object.
- the present invention can be applied to a modeling device (modeling software) for modeling an object.
- FIG. 13 is a diagram showing an overall configuration of a program distribution system using a communication network.
- the program distribution system 100 includes a game apparatus 10, a communication network 106, and a program distribution apparatus 108.
- the communication network 106 includes, for example, the Internet and a cable television network.
- the program distribution device 108 includes a database 102 and a server 104. In this system, a database (information storage medium) 102 stores a program similar to the program stored on the optical disc 36.
- the server 104 reads the program from the database 102 in response to the game distribution request and transmits it to the game apparatus 10.
- the game is distributed in response to the game distribution request, but the server 104 may transmit the game unilaterally. Further, it is not always necessary to distribute (collectively distribute) all programs necessary for realizing the game at a time, and a necessary portion may be distributed (divided distribution) according to the situation of the game. If the game is distributed via the communication network 106 in this way, the consumer can easily obtain the program.
Claims (11)
- An image processing device that displays an image representing a state in which an object arranged in a virtual three-dimensional space is viewed from a given viewpoint, the image processing device comprising: original texture image storage means for storing an original texture image of the object; and display control means for displaying, on display means, an image representing a state in which the object, having mapped thereon a texture image with auxiliary lines in which a plurality of auxiliary lines representing a mesh or a plurality of mutually parallel auxiliary lines are represented on the original texture image, is viewed from the viewpoint.
- The image processing device according to claim 1, wherein the display control means includes auxiliary-line-attached texture image acquisition means for acquiring the texture image with auxiliary lines, and displays on the display means an image representing a state in which the object, having mapped thereon the texture image with auxiliary lines acquired by the auxiliary-line-attached texture image acquisition means, is viewed from the viewpoint.
- The image processing device according to claim 2, wherein the auxiliary-line-attached texture image acquisition means generates the texture image with auxiliary lines based on the original texture image.
- The image processing device according to claim 3, wherein the auxiliary-line-attached texture image acquisition means generates the texture image with auxiliary lines by drawing the plurality of auxiliary lines representing the mesh or the plurality of mutually parallel auxiliary lines on the original texture image.
- The image processing device according to claim 4, wherein the auxiliary-line-attached texture image acquisition means generates the texture image with auxiliary lines by drawing, on the original texture image, at least a plurality of mutually parallel first auxiliary lines and a plurality of mutually parallel second auxiliary lines that intersect the plurality of first auxiliary lines.
- The image processing device according to claim 1, wherein the display control means includes means for controlling the fineness of the mesh or the interval between the plurality of auxiliary lines for each of a plurality of regions set in the texture image with auxiliary lines.
- The image processing device according to claim 1, wherein the display control means includes means for controlling the fineness of the mesh or the interval between the plurality of auxiliary lines based on the position of the viewpoint.
- The image processing device according to claim 1, wherein the display control means includes means for controlling the color of the plurality of auxiliary lines representing the mesh or the plurality of mutually parallel auxiliary lines based on the original texture image.
- A control method for an image display device that displays an image representing a state in which an object arranged in a virtual three-dimensional space is viewed from a given viewpoint, the control method comprising: a step of reading stored contents of original texture image storage means that stores an original texture image of the object; and a display control step of displaying, on display means, an image representing a state in which the object, having mapped thereon a texture image with auxiliary lines in which a plurality of auxiliary lines representing a mesh or a plurality of mutually parallel auxiliary lines are represented on the original texture image, is viewed from the viewpoint.
- A program for causing a computer to function as an image display device that displays an image representing a state in which an object arranged in a virtual three-dimensional space is viewed from a given viewpoint, the program causing the computer to function as: original texture image storage means for storing an original texture image of the object; and display control means for displaying, on display means, an image representing a state in which the object, having mapped thereon a texture image with auxiliary lines in which a plurality of auxiliary lines representing a mesh or a plurality of mutually parallel auxiliary lines are represented on the original texture image, is viewed from the viewpoint.
- A computer-readable information storage medium storing a program for causing a computer to function as an image display device that displays an image representing a state in which an object arranged in a virtual three-dimensional space is viewed from a given viewpoint, the program causing the computer to function as: original texture image storage means for storing an original texture image of the object; and display control means for displaying, on display means, an image representing a state in which the object, having mapped thereon a texture image with auxiliary lines in which a plurality of auxiliary lines representing a mesh or a plurality of mutually parallel auxiliary lines are represented on the original texture image, is viewed from the viewpoint.
Priority Applications (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
KR1020107006310A KR101135908B1 (ko) | 2008-03-24 | 2009-03-04 | 화상 처리 장치, 화상 처리 장치의 제어 방법, 및 정보 기억 매체 |
US12/933,771 US20110018875A1 (en) | 2008-03-24 | 2009-03-04 | Image processing device, image processing device control method, program, and information storage medium |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2008076348A JP5089453B2 (ja) | 2008-03-24 | 2008-03-24 | 画像処理装置、画像処理装置の制御方法及びプログラム |
JP2008-076348 | 2008-03-24 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2009119264A1 true WO2009119264A1 (ja) | 2009-10-01 |
Family
ID=41113469
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2009/054023 WO2009119264A1 (ja) | 2008-03-24 | 2009-03-04 | 画像処理装置、画像処理装置の制御方法、プログラム及び情報記憶媒体 |
Country Status (5)
Country | Link |
---|---|
US (1) | US20110018875A1 (ja) |
JP (1) | JP5089453B2 (ja) |
KR (1) | KR101135908B1 (ja) |
TW (1) | TW201002399A (ja) |
WO (1) | WO2009119264A1 (ja) |
Cited By (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN107358649A (zh) * | 2017-06-07 | 2017-11-17 | 腾讯科技(深圳)有限公司 | 地形文件的处理方法和装置 |
Families Citing this family (10)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2010176170A (ja) * | 2009-01-27 | 2010-08-12 | Sony Ericsson Mobilecommunications Japan Inc | 表示装置、表示制御方法および表示制御プログラム |
JP5463866B2 (ja) * | 2009-11-16 | 2014-04-09 | ソニー株式会社 | 画像処理装置および画像処理方法、並びにプログラム |
US8982157B2 (en) * | 2010-07-27 | 2015-03-17 | Dreamworks Animation Llc | Collision free construction of animated feathers |
JP5258857B2 (ja) * | 2010-09-09 | 2013-08-07 | 株式会社コナミデジタルエンタテインメント | 画像処理装置、画像処理装置の制御方法、及びプログラム |
JP5145391B2 (ja) * | 2010-09-14 | 2013-02-13 | 株式会社コナミデジタルエンタテインメント | 画像処理装置、画像処理装置の制御方法、及びプログラム |
US20120200667A1 (en) * | 2011-02-08 | 2012-08-09 | Gay Michael F | Systems and methods to facilitate interactions with virtual content |
JP2013050883A (ja) * | 2011-08-31 | 2013-03-14 | Nintendo Co Ltd | 情報処理プログラム、情報処理システム、情報処理装置および情報処理方法 |
US10586570B2 (en) * | 2014-02-05 | 2020-03-10 | Snap Inc. | Real time video processing for changing proportions of an object in the video |
US10116901B2 (en) | 2015-03-18 | 2018-10-30 | Avatar Merger Sub II, LLC | Background modification in video conferencing |
US9918128B2 (en) * | 2016-04-08 | 2018-03-13 | Orange | Content categorization using facial expression recognition, with improved detection of moments of interest |
Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH0636013A (ja) * | 1992-07-14 | 1994-02-10 | Hitachi Ltd | 地形データの作成方法および装置 |
JP2004178589A (ja) * | 2002-11-22 | 2004-06-24 | Thales | 3次元相互視認画像の合成方法 |
JP2005038298A (ja) * | 2003-07-17 | 2005-02-10 | Nintendo Co Ltd | 画像処理装置および画像処理プログラム |
JP2005332395A (ja) * | 2004-05-14 | 2005-12-02 | Microsoft Corp | ネストされた正規グリッドを使用した地形レンダリング |
JP2006059176A (ja) * | 2004-08-20 | 2006-03-02 | Shima Seiki Mfg Ltd | マッピング装置とマッピング方法及びそのプログラム |
Family Cites Families (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2763481B2 (ja) * | 1992-08-26 | 1998-06-11 | 株式会社ナムコ | 画像合成装置及び画像合成方法 |
JPH07271999A (ja) * | 1994-03-31 | 1995-10-20 | Oki Electric Ind Co Ltd | 3次元地形出力方法 |
JPH1125281A (ja) * | 1997-06-30 | 1999-01-29 | Seiren Syst Service:Kk | テクスチャマッピング方法 |
US7606392B2 (en) * | 2005-08-26 | 2009-10-20 | Sony Corporation | Capturing and processing facial motion data |
US8059917B2 (en) * | 2007-04-30 | 2011-11-15 | Texas Instruments Incorporated | 3-D modeling |
- 2008-03-24 JP JP2008076348A patent/JP5089453B2/ja active Active
- 2009-03-04 KR KR1020107006310A patent/KR101135908B1/ko active IP Right Grant
- 2009-03-04 WO PCT/JP2009/054023 patent/WO2009119264A1/ja active Application Filing
- 2009-03-04 US US12/933,771 patent/US20110018875A1/en not_active Abandoned
- 2009-03-18 TW TW098108728A patent/TW201002399A/zh not_active IP Right Cessation
Also Published As
Publication number | Publication date |
---|---|
KR101135908B1 (ko) | 2012-04-13 |
JP2009230543A (ja) | 2009-10-08 |
KR20100055509A (ko) | 2010-05-26 |
TW201002399A (en) | 2010-01-16 |
TWI378812B (ja) | 2012-12-11 |
JP5089453B2 (ja) | 2012-12-05 |
US20110018875A1 (en) | 2011-01-27 |
Legal Events

Code | Title | Description
---|---|---
121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 09725194; Country of ref document: EP; Kind code of ref document: A1
ENP | Entry into the national phase | Ref document number: 20107006310; Country of ref document: KR; Kind code of ref document: A
WWE | Wipo information: entry into national phase | Ref document number: 12933771; Country of ref document: US
NENP | Non-entry into the national phase | Ref country code: DE
122 | Ep: pct application non-entry in european phase | Ref document number: 09725194; Country of ref document: EP; Kind code of ref document: A1