WO2008059614A1 - Program executed in a game device using a direct pointing device for controlling the movement of a displayed object - Google Patents

Program executed in a game device using a direct pointing device for controlling the movement of a displayed object

Info

Publication number
WO2008059614A1
WO2008059614A1 (PCT/JP2007/001234)
Authority
WO
WIPO (PCT)
Prior art keywords
pointing device
distance
real space
coordinate
predetermined
Prior art date
Application number
PCT/JP2007/001234
Other languages
English (en)
Japanese (ja)
Inventor
Youichi Ishikawa
Original Assignee
Kabushiki Kaisha Sega Doing Business As Sega Corporation
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Kabushiki Kaisha Sega Doing Business As Sega Corporation filed Critical Kabushiki Kaisha Sega Doing Business As Sega Corporation
Priority to JP2008544070A priority Critical patent/JPWO2008059614A1/ja
Publication of WO2008059614A1 publication Critical patent/WO2008059614A1/fr

Links

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03 Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/033 Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
    • G06F3/0346 Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor with detection of the device orientation or free movement in a 3D space, e.g. 3D mice, 6-DOF [six degrees of freedom] pointers using gyroscopes, accelerometers or tilt-sensors
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00 Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/50 Controlling the output signals based on the game progress
    • A63F13/52 Controlling the output signals based on the game progress involving aspects of the displayed game scene
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F2300/00 Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
    • A63F2300/10 Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game characterized by input arrangements for converting player-generated signals into game device control signals
    • A63F2300/1012 Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game characterized by input arrangements for converting player-generated signals into game device control signals involving biosensors worn by the player, e.g. for measuring heart beat, limb activity
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F2300/00 Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
    • A63F2300/60 Methods for processing data by generating or executing the game program
    • A63F2300/6045 Methods for processing data by generating or executing the game program for mapping control signals received from the input arrangement into game commands

Definitions

  • The present invention relates to a program that is executed in a game device and controls the movement of a displayed object. More specifically, the present invention relates to a program that is executed in a game device which runs a computer game using a direct pointing device in real space, and that controls the movement of the displayed object.
  • Conventionally, a pointing device is used, and an object (also called a character, as appropriate) is moved and displayed on the display device by making the moving distance of the pointing device in real space correspond to the moving distance of the object in the virtual space.
  • As such pointing devices, there are devices that are moved in two dimensions, such as a mouse and a mouse pad, and devices that move objects in three dimensions, such as three-axis joysticks with an analog lever. All of these pointing devices capture movement distance as a relative value.
  • Hereinafter, a direct pointing device is also referred to as a DPD.
  • As a prior art game apparatus using a direct pointing device, there is the invention described in Patent Document 1.
  • In Patent Document 1, a sword-shaped operation tool corresponding to the type of game is used as the direct pointing device (DPD).
  • A protection curtain is hung on the left and right, so that the range of real space in which the player can substantially operate the DPD is limited, and the relationship between the three-dimensional coordinates representing this limited real space and the virtual three-dimensional space of the game generated by the main device is an absolute correspondence.
  • Patent Document 1 JP 2002-292123 A
  • However, as the distance of the DPD in the Z-axis direction increases, the numerical range of the DPD position information widens due to hand shake and error, and if this information is reflected in the virtual space without correction, the object on the display screen moves in small, jittery steps that make the player feel uncomfortable.
  • Accordingly, an object of the present invention is to solve the problem that, in a game device using a direct pointing device wirelessly connected to the main device, the operability of moving an object is degraded, and to provide a game device using a direct pointing device, an object movement control method, and a program for performing object movement control which avoid the influence on the DPD detection signal of errors caused by the distance from the position reference display device.
  • A first aspect of the present invention for solving the above-described problem is a program that causes a computer system to execute: a step of associating coordinates in a virtual three-dimensional space with coordinates in real space; a step of acquiring the position coordinates of a pointing device moved by the player in real space; a step of moving an object in the virtual three-dimensional space in correspondence with the position coordinates of the pointing device; and a step of transforming the object moving in the virtual three-dimensional space onto a two-dimensional plane and displaying it on a display screen of a display device.
  • In this aspect, the pointing device and the computer system main body are wirelessly connected.
  • The real space coordinates are initially set at the start of processing, the coordinate position of the pointing device in the initially set real space is set as the reference coordinate position, and a predetermined range in the X, Y, and Z directions based on the reference coordinate position is set as a predetermined region of the real space.
  • The pointing device has an infrared imaging function and images a pair of infrared elements placed in the vicinity of the display screen of the display device, and the coordinate position of the pointing device in real space may be obtained from the positions of the imaged pair of infrared elements on the imaging screen.
  • Furthermore, the distance between the coordinate position in real space of the pointing device moved under the player's control and a predetermined position is obtained, and control may be performed so that, according to the obtained distance from the predetermined position, the larger the distance, the smaller the ratio of change in the coordinates of the virtual three-dimensional space corresponding to the coordinate position of the pointing device in real space and the lower the followability to the coordinate position of the pointing device in real space.
  • The obtained distances between the coordinate position of the pointing device in real space and the predetermined position are stored in a buffer, the average of a predetermined number of the distance values stored in the buffer is taken as the current distance, and, based on the obtained current distance from the predetermined position, the larger the distance, the more the ratio of change in the coordinates of the virtual three-dimensional space displayed on the display device corresponding to the coordinate position of the pointing device and the followability to the coordinate position of the pointing device may be reduced.
  • The predetermined position is a position where a position reference display device having at least a pair of infrared light emitting elements is placed; the pointing device has an infrared imaging function and images at least the pair of infrared light emitting elements of the position reference display device, and the distance of the pointing device from the position reference display device may be obtained based on the distance between the at least one pair of infrared light emitting elements of the position reference display device on the imaging screen imaged by the pointing device.
  • the position reference display device can be placed in the vicinity of the display screen of the display device.
  • A second aspect of the present invention for solving the above-described problem is an object movement control method in which the coordinates of a virtual space are associated with the coordinates of real space and an object is moved based on movement control performed by the player using a pointing device in real space, wherein the distance between the coordinate position in real space of the pointing device moved under the player's control and a predetermined position is obtained, and, according to the obtained distance from the predetermined position, the larger the distance, the more the ratio of change in the coordinates of the virtual three-dimensional space corresponding to the coordinate position of the pointing device in real space and the followability to the coordinate position of the pointing device in real space are reduced.
  • The obtained distances between the coordinate position of the pointing device in real space and the predetermined position are stored in a buffer; the maximum value and the minimum value of the stored distances are obtained, and the intermediate value between the maximum value and the minimum value may be used as the current distance, or the average of a predetermined number of the distance values stored in the buffer may be used as the current distance. Based on the obtained current distance from the predetermined position, the larger the distance, the more the ratio of change in the coordinates of the virtual three-dimensional space corresponding to the coordinate position of the pointing device and the followability to the coordinate position of the pointing device can be reduced.
  • The predetermined position is a position where a position reference display device having at least a pair of infrared light emitting elements is placed; the pointing device has an infrared imaging function and images at least the pair of infrared light emitting elements of the position reference display device, and the distance of the pointing device from the position reference display device can be obtained based on the size of the distance between the at least one pair of infrared light emitting elements of the position reference display device on the imaging screen imaged by the pointing device.
  • the position reference display device is preferably placed in the vicinity of the display screen of the display device.
  • Another aspect of the present invention is a program that causes a computer system to execute the following: a predetermined area having a reference coordinate position is set in real space; the position coordinates of the pointing device moved under the player's control within the set predetermined area are obtained; the obtained position coordinates of the pointing device are transmitted to the computer system body; the computer system body moves an object in the virtual three-dimensional space in correspondence with the position coordinates of the pointing device, converts the coordinates of the object moving in the virtual three-dimensional space onto a two-dimensional plane, and displays it on the display screen of the display device; and, when the position coordinates of the pointing device exceed the set predetermined area, the reference coordinate position of the set predetermined area is moved by a magnitude corresponding to the coordinate movement exceeding the predetermined area.
  • the pointing device and the computer system main body may be connected wirelessly.
  • The real space coordinates are initially set at the start of processing, the coordinate position of the pointing device in the initially set real space is set as the reference coordinate position, and a predetermined range in the X, Y, and Z directions based on the reference coordinate position can be set as the predetermined region of the real space.
  • FIG. 1 is a functional block diagram of a configuration example of a game apparatus main body applied to the present invention.
  • FIG. 2 is a diagram for explaining a relationship between a position reference display device, a wireless receiver, and a direct pointing device operated by a player.
  • FIG. 3 is an image corresponding to the change of the direct pointing device in the X direction or the Y direction.
  • FIG. 4 is an image corresponding to the change of the direct pointing device in the Z direction.
  • FIG. 5 is a conceptual diagram of the relationship shown in FIG. 2.
  • FIG. 6 shows, corresponding to FIG. 5, the state in which the object (character) is displayed on the display screen.
  • FIG. 7 is a diagram for explaining a case where the direct pointing device operated by the player exceeds the real space area.
  • FIG. 8 is a diagram for explaining the disappearance of an object on the screen corresponding to FIG. 7.
  • FIG. 9 is an example flow of object movement control according to the present invention for eliminating the inconvenience in operability associated with the deviation between the real space and the display area on the screen.
  • FIG. 10 is a diagram for explaining, in comparison with FIG. 7, the movement of the direct pointing device and the movement display of the object when the control flow of FIG. 9 is executed by applying the present invention.
  • FIG. 11 is a diagram for explaining, in comparison with FIG. 8, the movement of the direct pointing device and the movement display of the object when the control flow of FIG. 9 is executed by applying the present invention.
  • FIG. 12 is a diagram for explaining an error caused by a distance in the Z direction from the screen of the direct pointing device and an inconvenience of camera shake corresponding to the second embodiment.
  • FIG. 13 is a diagram for explaining a camera shake state caused by the distance in the Z direction from the screen of the direct pointing device, corresponding to the second embodiment.
  • FIG. 14 is an example flow of object movement control according to the present invention to achieve the second object.
  • FIG. 15 is a diagram showing the distance of the direct pointing device from the screen and the corresponding range of play.
  • FIG. 1 is a functional block diagram of a configuration example of a game apparatus body applied to the present invention. Each functional unit exchanges data through the bus 10.
  • The development of the game stored in the ROM 2, shown as an external memory or internal memory, is controlled by the main CPU 1 in accordance with the execution of the game main program.
  • Under the control of the main CPU 1 as the control means, object data in the virtual three-dimensional space is perspective-converted into two-dimensional coordinates to be displayed on the display device 5 (a minimal sketch of such a conversion is given after this paragraph).
  • According to the game program, the rendering processor 4 reads texture data from the texture memory 6, pastes the read texture onto the object data that has been coordinate-converted to two-dimensional coordinates, and draws the resulting image data in the video memory 7.
  • The image data drawn in the video memory 7 is read out at a predetermined cycle and sent to the display device 5, so that a game image including the object whose movement is controlled according to the game program is displayed on the display device 5.
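  • As a purely illustrative aid, the following is a minimal sketch of how a virtual three-dimensional coordinate might be perspective-converted to two-dimensional display coordinates, the operation attributed above to the main CPU 1. It is not the patent's implementation; the structure names, the focal length, and the screen size are assumptions for illustration.

```cpp
#include <cstdio>

// Illustrative sketch only: a simple pinhole-style perspective conversion of a
// point in the virtual 3D space to 2D screen coordinates. The structure names,
// the focal length and the screen size are assumptions, not values from the patent.
struct Vec3 { float x, y, z; };
struct Vec2 { float x, y; };

Vec2 perspectiveToScreen(const Vec3& p, float focal, float screenW, float screenH) {
    // Guard against points at or behind the virtual camera.
    float z = (p.z > 0.001f) ? p.z : 0.001f;
    // Perspective division: farther objects map closer to the screen centre.
    float px = (p.x * focal) / z;
    float py = (p.y * focal) / z;
    // Shift from camera-centred coordinates to pixel coordinates.
    return { px + screenW * 0.5f, screenH * 0.5f - py };
}

int main() {
    Vec3 object{1.0f, 0.5f, 4.0f};                  // object position in virtual space
    Vec2 s = perspectiveToScreen(object, 800.0f, 1280.0f, 720.0f);
    std::printf("screen position: (%.1f, %.1f)\n", s.x, s.y);
    return 0;
}
```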
  • the present invention is characterized by object movement control that is processed in accordance with the execution of a game program stored in ROM 2.
  • the invention targeted by the present application resides in a game program for performing movement control of such an object.
  • the position reference display device 8 and the wireless receiver 9 are connected to the bus 10.
  • These functional units are provided for the direct pointing device (DPD) 20 described below.
  • FIG. 2 is a view for explaining the relationship between the position reference display device 8, the wireless receiver 9, and the direct pointing device 20 operated by the player.
  • The position reference display device 8 is connected to the game apparatus main body 30, and the wireless receiver 9 is provided in the game apparatus main body 30; therefore, in FIG. 2, the wireless receiver 9 itself is not shown.
  • a remote control system generally used in a game device can be used.
  • the direct pointing device 20 has a three-axis acceleration sensor, and detects an acceleration given to the direct pointing device 20, that is, a movement. Further, the infrared light emitting element of the position reference display device 8 is imaged, and the position is specified in the direct pointing device 20 as will be described later.
  • Based on these, the direct pointing device 20 calculates what movement is being given to it and at what position and in what direction it is with respect to the screen.
  • The calculated information can be transmitted from the direct pointing device 20 to the game apparatus body 30 through the wireless receiver 9.
  • Alternatively, the direct pointing device 20 images the infrared light emitting elements, and the image data, together with the acceleration data detected by the three-axis acceleration sensor, is sent to the game apparatus body 30 through the wireless receiver 9. The game apparatus body 30 can then be configured to calculate the position and movement of the direct pointing device 20 based on the received imaging data.
  • In the above description, acceleration is measured by the triaxial acceleration sensor, but the motion detection of the direct pointing device 20 is not limited to such triaxial acceleration measurement.
  • The position reference display device 8 includes a pair of infrared light emitting elements 80, 81; its position relative to the display device 5 can be changed as appropriate, but it is preferably located near the display screen 50 of the display device 5.
  • the direct pointing device 20 incorporates an infrared sensing image sensor and images the infrared light emitting elements 80 and 81.
  • An image picked up by the direct pointing device 20 is received by the wireless receiver 9 of the game main unit 30 by means of wireless communication technology using electromagnetic waves, light, or sound waves; for example, existing Bluetooth (registered trademark) technology can be used.
  • FIG. 3 is an image corresponding to the change of the direct pointing device 20 in the X direction or the Y direction.
  • The image shown in FIG. 3B is the image obtained when the direct pointing device 20 moves in the +Y direction and to the left (-X direction): the positions of the markers, namely the infrared light emitting elements 80, 81, move to the lower right on the imaging screen.
  • The image shown in FIG. 4 corresponds to the change of the direct pointing device 20 in the Z direction. It can be seen that the distance between the pair of infrared light emitting elements 80 and 81 in the image decreases as the pointing device 20 moves away from the position reference display device 8 (FIG. 4A to FIG. 4C).
  • The positions and the interval of the infrared light emitting elements 80, 81 in the captured image are detected by the processing function of the analysis program included in the direct pointing device 20, and based on this detection information the position (real-space three-dimensional coordinates) of the direct pointing device 20 in the real space region 40 can easily be determined (a rough illustrative sketch of such an estimation is given below).
  • The position information of the direct pointing device 20 analyzed and determined by the analysis program processing in the direct pointing device 20 is transmitted to the game main body device 30 through the wireless receiver 9.
  • By using infrared light emitting elements as markers, the system can be made less susceptible to noise light from illumination, window light, and the like.
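  • As a rough illustration of how the position of the direct pointing device 20 might be recovered from the imaged pair of markers, here is a hedged sketch: the Z distance is inferred from the apparent spacing of the two markers, which shrinks as the device moves away, and X/Y are inferred from where the midpoint of the markers falls on the image sensor. The sensor resolution, field of view, and physical marker spacing are illustrative assumptions, not values from the patent.

```cpp
#include <cmath>
#include <cstdio>

// Hedged sketch: derive an approximate real-space position of the pointing
// device from the two infrared markers seen on its image sensor. Sensor
// resolution, field of view and the physical spacing of the markers are
// illustrative assumptions, not figures from the patent.
struct MarkerImage { float x0, y0, x1, y1; };   // marker pixel positions on the sensor
struct Position   { float x, y, z; };           // estimated device position in metres

Position estimatePosition(const MarkerImage& m,
                          float sensorW = 1024.0f, float sensorH = 768.0f,
                          float horizontalFovRad = 0.58f,   // ~33 degrees, assumed
                          float markerSpacingM = 0.20f) {   // physical LED spacing, assumed
    // Apparent spacing of the two markers in pixels.
    float dx = m.x1 - m.x0, dy = m.y1 - m.y0;
    float pixelSpacing = std::sqrt(dx * dx + dy * dy);
    if (pixelSpacing < 1.0f) pixelSpacing = 1.0f;            // avoid division by zero

    // Pinhole model: the pixel spacing shrinks in proportion to the distance.
    float focalPx = (sensorW * 0.5f) / std::tan(horizontalFovRad * 0.5f);
    float z = markerSpacingM * focalPx / pixelSpacing;

    // The midpoint of the markers gives the lateral offset: if the markers
    // appear to the lower right, the device has moved up and to the left
    // (consistent with the FIG. 3B description above).
    float midX = (m.x0 + m.x1) * 0.5f - sensorW * 0.5f;
    float midY = (m.y0 + m.y1) * 0.5f - sensorH * 0.5f;
    float x = -(midX / focalPx) * z;
    float y =  (midY / focalPx) * z;
    return {x, y, z};
}

int main() {
    MarkerImage m{480.0f, 380.0f, 560.0f, 382.0f};
    Position p = estimatePosition(m);
    std::printf("estimated position: x=%.2f m, y=%.2f m, z=%.2f m\n", p.x, p.y, p.z);
    return 0;
}
```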
  • FIG. 5 is a conceptual diagram of the relationship shown in FIG. 2, viewed as a plane in the X and Y directions. It shows the state in which the direct pointing device 20 is located at the center (origin) 41 of the real space area 40. As shown in FIG. 2, the position reference display device 8 is placed in the vicinity of the display screen 50 (at its front face).
  • FIG. 6 shows the state in which the object (character) 52 is displayed on the display screen 50, corresponding to FIG. 5.
  • The player associates the direct pointing device 20 with the position in real space corresponding to the position of the object 52 on the screen, and moves the object by swinging the direct pointing device 20 in the direction in which the object is to be moved.
  • The image picked up by the direct pointing device 20 using the infrared light emitting elements 80 and 81 changes as described above with reference to FIGS. 3 and 4, and from it the main CPU 1 ascertains the position of the direct pointing device 20 in the real space region 40.
  • the object 52 is moved and displayed on the screen 50 corresponding to the position in the real space area 40 of the direct pointing device 20.
  • a real space area 40 of a predetermined range is set corresponding to the display area of the screen 50.
  • However, the direct pointing device 20 operated by the player may move beyond the real space area 40 of the predetermined range.
  • The object 52, which is moved and displayed on the screen 50 as the pointing device 20 moves, disappears from the display screen 50 when the direct pointing device 20 exceeds the real space area 40 of the predetermined range.
  • In other words, from the time the direct pointing device 20 exceeds the predetermined range of the real space region 40 until it returns into that range, there is a period during which the display of the object 52 disappears.
  • Moreover, when the direct pointing device 20 leaves the operation area 40 and then returns to it, if the location at which it left differs from the location at which it returns, the object 52 on the screen is not displayed as a continuous image; its display location changes abruptly as if it had warped, which makes the screen feel unnatural and gives the player a sense of incongruity.
  • An object of the present invention is to eliminate inconvenience in operability associated with a shift between the real space and the display area on the screen.
  • FIG. 9 is an example flow of object movement control according to the present invention for achieving such an object.
  • FIGS. 10 and 11 correspond to FIGS. 7 and 8, respectively, and are diagrams for explaining the movement of the direct pointing device 20 and the movement display of the object 52 when the control flow of FIG. 9 is executed by applying the present invention.
  • First, a default real space area 40 is set. Based on this real space region, the main CPU 1 reads the current position coordinates of the direct pointing device 20 from the image sent by the direct pointing device 20 (step S1).
  • If the read current position coordinates are, for example, (x, y, z), that is, shifted by x, y and z in the X, Y and Z directions from the reference point (0, 0, 0) that is the origin of the default real space area, the current position of the direct pointing device 20 is reset as the reference point (0, 0, 0) (step S2).
  • Next, a certain real space area in each of the X, Y and Z directions around this reference point is set as the movable range of the direct pointing device 20 (step S3). This new movable range of the direct pointing device 20 is set as the real space area 40 described above.
  • The main CPU 1 then determines whether or not the game is over (step S4). If the game is over (step S4, Yes), the game end process is performed.
  • If the game is not over (step S4, No), the position information (position coordinates) of the direct pointing device 20 is read every frame (1/60 seconds) (step S5).
  • If the position of the direct pointing device 20 is within the real space area of the movable range (the real space area 40 indicated by the broken line in FIG. 10) (step S6, Yes), the normal game processing according to the game program is performed next (step S7).
  • When the position of the direct pointing device 20 exceeds the real space area of the movable range (the real space area 40 shown by the broken line in FIG. 10) (step S6, No), the process of resetting the reference point is performed (step S8), and then the normal game processing according to the game program is performed (step S7).
  • The process of resetting the reference point (step S8) is as follows. In FIG. 10, when the direct pointing device 20 moves to a position (x, y) beyond the movable range indicated by the preset broken-line real space region 40, the movable range is converted into a new movable range, the region 40N surrounded by the thick line in FIG. 10. With respect to the new origin O' (0, 0), the coordinate value of the point A' is (x, y). A code sketch of this region-reset flow is given after this paragraph.
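  • Purely as an illustrative sketch of the FIG. 9 flow (steps S1 to S8) described above, the fragment below reads the device position each frame, keeps it relative to a reference point, and, when the position leaves the movable range, shifts the reference point (and hence the region 40) by the amount of the overshoot. The structure names and the size of the range are assumptions, not values from the patent.

```cpp
#include <cstdio>

// Illustrative sketch of the FIG. 9 flow: when the direct pointing device
// leaves the movable real-space region, the reference point is shifted by the
// overshoot so that the region follows the device (40 -> 40N). Names and the
// half-size of the region are assumptions, not taken from the patent.
struct Vec3 { float x, y, z; };

struct MovableRegion {
    Vec3  reference;   // reference coordinate position (origin of the region)
    float halfRange;   // +/- extent of the region in each axis

    // Position of the device expressed relative to the current reference point.
    Vec3 relative(const Vec3& devicePos) const {
        return { devicePos.x - reference.x,
                 devicePos.y - reference.y,
                 devicePos.z - reference.z };
    }

    // Step S6/S8: if an axis leaves the range, move the reference point by the
    // amount of the overshoot so the device sits on the edge of the new region.
    void clampAndShift(const Vec3& devicePos) {
        Vec3 r = relative(devicePos);
        auto shiftAxis = [&](float rel, float& refAxis) {
            if (rel >  halfRange) refAxis += rel - halfRange;
            if (rel < -halfRange) refAxis += rel + halfRange;
        };
        shiftAxis(r.x, reference.x);
        shiftAxis(r.y, reference.y);
        shiftAxis(r.z, reference.z);
    }
};

int main() {
    // Steps S1/S2: the first reading of the device becomes the reference point.
    Vec3 firstReading{0.10f, -0.05f, 1.50f};
    MovableRegion region{firstReading, 0.30f};   // step S3: set the movable range

    // Step S5 onwards: per-frame readings; the last one overshoots in +X.
    Vec3 frames[] = {{0.20f, 0.00f, 1.50f}, {0.55f, 0.00f, 1.50f}};
    for (const Vec3& pos : frames) {
        region.clampAndShift(pos);               // steps S6/S8
        Vec3 rel = region.relative(pos);         // coordinates used for the object
        std::printf("relative: (%.2f, %.2f, %.2f)\n", rel.x, rel.y, rel.z);
    }
    return 0;
}
```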
  • The position information (position coordinates) of the direct pointing device 20 described above has the property that the imaged interval between the infrared light emitting elements 80 and 81, which serve as markers, becomes narrower as the distance in the Z direction increases. Even a slight movement by the operator is therefore judged to be a large movement by the processing program that analyzes motion from the movement of the two markers. Consequently, the larger the distance in the Z-axis direction, the more the object on the screen moves in a way that makes the operator feel uncomfortable.
  • FIGS. 12 and 13 are diagrams for explaining this problem: if the movement is reflected in the virtual space without correction, the object on the display screen jitters in small steps and makes the player feel uncomfortable.
  • FIG. 12 shows an example in which an object 52 represented by an arrow moves on a screen extending in the Z-axis direction of the virtual three-dimensional space (appearing in the depth direction on the screen 50 of the display device 5); by moving in the direction of the arrow, a specific position is determined, and the menu item at the position corresponding to the object 52 is selected from a plurality of displayed menu items 53.
  • In this situation, the numerical range of the position information of the direct pointing device 20 becomes wide due to hand shake and error.
  • FIG. 13 shows an enlarged view of the object on the screen when the numerical range of the position information of the direct pointing device 20 has widened in this way. The position information vibrates, as indicated by the broken line, around the true object image 52; if this vibrating position information is reflected directly as the movement of the object image, the object vibrates as in the enlarged view, providing an image that is undesirable for the player.
  • The second object of the present invention, corresponding to the second embodiment, is to eliminate the inconvenience that the object on the screen is displayed as if it vibrates, as in the enlarged view of FIG. 13, as a result of the wide numerical range of the Z-axis position information of the direct pointing device 20 caused by hand shake and error.
  • FIG. 14 is an example flow of object movement control according to the present invention for achieving the second object.
  • FIG. 15 is a diagram illustrating the distance of the direct pointing device 20 from the position reference display device 8 and the corresponding range of play.
  • the control flow shown in FIG. 14 is executed and controlled by the main CPU 1 in accordance with the game program stored in the ROM 2 as in the control flow shown in FIG.
  • When the game is started, during the period in which the game is not over (step S20, No), the direct pointing device 20 captures an image at a predetermined cycle (1/60 seconds) according to the detection principle described above, and from the acquired image the main CPU 1 obtains the Z-direction distance from the screen 50 and stores it in the buffer (RAM 3) (step S21).
  • Next, the maximum distance and the minimum distance stored in the buffer are obtained so as to absorb the dispersion of the distance values produced by the analysis program (step S22), and the intermediate value between the obtained maximum distance and minimum distance is taken as the approximate current distance (step S23).
  • Alternatively, processing such as obtaining the average of a certain number of distance values stored in the buffer may be used.
  • If the obtained current distance is shorter than a predetermined reference distance, for example 1.4 m (step S24, Yes), the direct pointing device 20 is closer to the position reference display device 8 than the reference distance; therefore the play radius is set to the minimum and the followability to the movement of the direct pointing device 20 is set to the maximum (FIG. 15, D3).
  • Here, the "play radius" is the amount of play applied to the output of the direct pointing device 20: the larger the play radius, the lower the sensitivity to that output.
  • "Followability" means how smoothly the object 52 on the screen moves with respect to the movement of the direct pointing device 20 as the operation means. The higher the followability, the more faithfully the object follows the movement that the operator gives to the direct pointing device 20.
  • If the calculated current distance Z1 is longer than a predetermined maximum distance, for example 2.0 m (step S26, Yes), the direct pointing device 20 is located farthest from the position reference display device 8, and the effects of hand shake and error are the greatest. Therefore, the play radius is maximized, that is, the followability is set to the minimum (FIG. 15, D1).
  • In this case, the sensitivity to the output of the direct pointing device 20 is low and the followability is also small, so that the effects of hand shake and error are absorbed.
  • The play width set according to the division range of each distance is applied (step S30), and the normal game processing is performed (step S31). A code sketch of this distance-dependent control is given below.
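  • A minimal sketch of the FIG. 14 idea follows: per-frame Z distances are buffered, the midpoint of the buffered maximum and minimum is taken as the current distance, and that distance selects a play radius (dead band) and a followability (smoothing) factor. The 1.4 m and 2.0 m thresholds come from the description above; the buffer length, the concrete radii, the smoothing factors, and all names are assumptions for illustration, not the patent's implementation.

```cpp
#include <algorithm>
#include <cmath>
#include <cstddef>
#include <cstdio>
#include <deque>

// Illustrative sketch of the FIG. 14 flow: the farther the pointing device is
// from the position reference display device, the larger the play radius and
// the lower the followability, so that hand shake and error are absorbed.
// The 1.4 m / 2.0 m thresholds are from the description; buffer length,
// concrete radii and smoothing factors are assumptions.
struct PlaySetting { float playRadius; float followability; };

class DistanceFilter {
public:
    // Steps S21-S23: buffer the measured Z distance and take the midpoint of
    // the buffered maximum and minimum as the approximate current distance.
    float update(float measuredZ) {
        buffer_.push_back(measuredZ);
        if (buffer_.size() > kBufferLen) buffer_.pop_front();
        float lo = *std::min_element(buffer_.begin(), buffer_.end());
        float hi = *std::max_element(buffer_.begin(), buffer_.end());
        return 0.5f * (lo + hi);
    }

    // Steps S24-S30: choose play radius and followability from the distance.
    static PlaySetting settingFor(float currentZ) {
        if (currentZ < 1.4f) return {0.01f, 1.0f};   // D3: minimum play, maximum followability
        if (currentZ > 2.0f) return {0.05f, 0.3f};   // D1: maximum play, minimum followability
        return {0.03f, 0.6f};                        // in between: intermediate values (assumed)
    }

private:
    static constexpr std::size_t kBufferLen = 30;    // assumed buffer length
    std::deque<float> buffer_;
};

// Apply the setting to one pointer sample: changes smaller than the play
// radius are ignored, larger ones are followed only partially.
float applyPlay(float displayed, float target, const PlaySetting& s) {
    float delta = target - displayed;
    if (std::fabs(delta) < s.playRadius) return displayed;
    return displayed + delta * s.followability;
}

int main() {
    DistanceFilter filter;
    float displayedX = 0.0f;
    const float samplesZ[] = {1.9f, 2.1f, 2.2f, 2.0f};
    const float samplesX[] = {0.00f, 0.02f, 0.10f, 0.08f};
    for (int i = 0; i < 4; ++i) {
        float z = filter.update(samplesZ[i]);
        PlaySetting s = DistanceFilter::settingFor(z);
        displayedX = applyPlay(displayedX, samplesX[i], s);
        std::printf("z=%.2f play=%.2f follow=%.2f displayedX=%.3f\n",
                    z, s.playRadius, s.followability, displayedX);
    }
    return 0;
}
```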

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Position Input By Displaying (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

In a game device using a pointing device wirelessly connected to the game device body, the problem that the movement operability of an object is degraded and that errors are caused by the distance from a position reference display device can be solved. A program causes a computer system to execute: a step of setting a predetermined region having a reference coordinate position in real space; a step of acquiring a position coordinate of the pointing device within the predetermined region; a step of transmitting the position coordinate to the computer system body; a step of moving the object in a virtual 3D space according to the position coordinate in the computer system body; a step of converting the coordinates of the object moving in the virtual 3D space onto a 2D plane and displaying it on a display screen of the display device; and a step of moving the reference coordinate position of the predetermined region by the amount of the movement exceeding the predetermined region when the position coordinate exceeds the predetermined region.
PCT/JP2007/001234 2006-11-15 2007-11-12 Programme pour commander le mouvement d'un objet affiché exécuté dans un dispositif de jeu à l'aide d'un dispositif de pointage direct WO2008059614A1 (fr)

Priority Applications (1)

Application Number Priority Date Filing Date Title
JP2008544070A JPWO2008059614A1 (ja) 2006-11-15 2007-11-12 ダイレクトポインティングデバイスを使用するゲーム装置において実行され、表示されるオブジェクトの移動制御を行うプログラム

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
JP2006309094 2006-11-15
JP2006-309093 2006-11-15
JP2006-309094 2006-11-15
JP2006309093 2006-11-15

Publications (1)

Publication Number Publication Date
WO2008059614A1 true WO2008059614A1 (fr) 2008-05-22

Family

ID=39401428

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2007/001234 WO2008059614A1 (fr) 2006-11-15 2007-11-12 Programme pour commander le mouvement d'un objet affiché exécuté dans un dispositif de jeu à l'aide d'un dispositif de pointage direct

Country Status (2)

Country Link
JP (1) JPWO2008059614A1 (fr)
WO (1) WO2008059614A1 (fr)

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20110082664A (ko) * 2010-01-12 2011-07-20 엘지전자 주식회사 디스플레이 장치 및 그 제어방법
JP2013134714A (ja) * 2011-12-27 2013-07-08 Lenovo Singapore Pte Ltd 情報処理装置、ポインティングデバイスの誤操作検出方法、及びコンピュータが実行可能なプログラム
CN103529994A (zh) * 2013-11-04 2014-01-22 中国联合网络通信集团有限公司 虚拟触摸输入方法及定位采集设备
US9870119B2 (en) 2014-11-25 2018-01-16 Samsung Electronics Co., Ltd. Computing apparatus and method for providing three-dimensional (3D) interaction

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2002222043A (ja) * 2001-01-29 2002-08-09 Nissan Motor Co Ltd カーソル制御装置
JP2005070996A (ja) * 2003-08-22 2005-03-17 Fuji Xerox Co Ltd 指示入力装置、指示入力システム、指示入力方法、及びプログラム
JP2007080002A (ja) * 2005-09-14 2007-03-29 Nintendo Co Ltd 仮想位置決定プログラム

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2002222043A (ja) * 2001-01-29 2002-08-09 Nissan Motor Co Ltd カーソル制御装置
JP2005070996A (ja) * 2003-08-22 2005-03-17 Fuji Xerox Co Ltd 指示入力装置、指示入力システム、指示入力方法、及びプログラム
JP2007080002A (ja) * 2005-09-14 2007-03-29 Nintendo Co Ltd 仮想位置決定プログラム

Cited By (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20110082664A (ko) * 2010-01-12 2011-07-20 엘지전자 주식회사 디스플레이 장치 및 그 제어방법
KR101646953B1 (ko) * 2010-01-12 2016-08-09 엘지전자 주식회사 디스플레이 장치 및 그 제어방법
JP2013134714A (ja) * 2011-12-27 2013-07-08 Lenovo Singapore Pte Ltd 情報処理装置、ポインティングデバイスの誤操作検出方法、及びコンピュータが実行可能なプログラム
CN103529994A (zh) * 2013-11-04 2014-01-22 中国联合网络通信集团有限公司 虚拟触摸输入方法及定位采集设备
CN103529994B (zh) * 2013-11-04 2016-07-06 中国联合网络通信集团有限公司 虚拟触摸输入方法及定位采集设备
US9870119B2 (en) 2014-11-25 2018-01-16 Samsung Electronics Co., Ltd. Computing apparatus and method for providing three-dimensional (3D) interaction

Also Published As

Publication number Publication date
JPWO2008059614A1 (ja) 2010-02-25

Similar Documents

Publication Publication Date Title
US9393487B2 (en) Method for mapping movements of a hand-held controller to game commands
US8313380B2 (en) Scheme for translating movements of a hand-held controller into inputs for a system
US10220302B2 (en) Method and apparatus for tracking three-dimensional movements of an object using a depth sensing camera
JP5412227B2 (ja) 映像表示装置、および、その表示制御方法
US8614672B2 (en) Information processing apparatus, storage medium having information processing program stored therein, information processing system, and display range control method
US9250799B2 (en) Control method for information input device, information input device, program therefor, and information storage medium therefor
US20060256081A1 (en) Scheme for detecting and tracking user manipulation of a game controller body
US8994729B2 (en) Method for simulating operation of object and apparatus for the same
US20060264260A1 (en) Detectable and trackable hand-held controller
WO2014141504A1 (fr) Dispositif d'interface utilisateur tridimensionnelle et procédé de traitement d'opération tridimensionnelle
JP4677281B2 (ja) 画像処理方法、画像処理装置
US9079102B2 (en) Calculation of coordinates indicated by a handheld pointing device
US20080204406A1 (en) Computer-readable storage medium having stored therein information processing program and information processing apparatus
WO2020110659A1 (fr) Dispositif de traitement d'informations, procédé de traitement d'informations et programme
EP2022039B1 (fr) Plan pour detecter et suivre une manipulation d'utilisateur d'un corps de controleurde jeu et pour traduire les mouvements de celui-ci en des entrees et des commandes de jeu
WO2008059614A1 (fr) Programme pour commander le mouvement d'un objet affiché exécuté dans un dispositif de jeu à l'aide d'un dispositif de pointage direct
US11029753B2 (en) Human computer interaction system and human computer interaction method
WO2014054317A1 (fr) Dispositif d'interface utilisateur et procédé d'interface utilisateur
US20110208494A1 (en) Method and system for simulating a handle's motion
JP7300436B2 (ja) 情報処理装置、システム、情報処理方法および情報処理プログラム
JP2009061159A (ja) プログラム、情報記憶媒体、及び、ゲームシステム
KR100636094B1 (ko) 3차원 사용자 입력 장치 및 그 입력 처리 방법
JP7394046B2 (ja) システム、撮像装置、情報処理装置、情報処理方法および情報処理プログラム
JP6363350B2 (ja) 情報処理プログラム、情報処理装置、情報処理システム、情報処理方法
WO2021256310A1 (fr) Dispositif de traitement d'informations, dispositif terminal, système de traitement d'informations, procédé de traitement d'informations et programme

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 07828012

Country of ref document: EP

Kind code of ref document: A1

WWE Wipo information: entry into national phase

Ref document number: 2008544070

Country of ref document: JP

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 07828012

Country of ref document: EP

Kind code of ref document: A1