WO2014208689A1 - Information processing device, control method for an information processing device, program, and information storage medium - Google Patents
Information processing device, control method for an information processing device, program, and information storage medium
- Publication number
- WO2014208689A1 (PCT/JP2014/067049)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- information processing
- transmission
- data
- sensor
- destination device
- Prior art date
Classifications
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63F—CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
- A63F13/00—Video games, i.e. games using an electronically generated display having two or more dimensions
- A63F13/20—Input arrangements for video game devices
- A63F13/21—Input arrangements for video game devices characterised by their sensors, purposes or types
- A63F13/211—Input arrangements for video game devices characterised by their sensors, purposes or types using inertial sensors, e.g. accelerometers or gyroscopes
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63F—CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
- A63F13/00—Video games, i.e. games using an electronically generated display having two or more dimensions
- A63F13/40—Processing input control signals of video game devices, e.g. signals generated by the player or derived from the environment
- A63F13/42—Processing input control signals of video game devices, e.g. signals generated by the player or derived from the environment by mapping the input signals into game commands, e.g. mapping the displacement of a stylus on a touch screen to the steering angle of a virtual vehicle
- A63F13/428—Processing input control signals of video game devices, e.g. signals generated by the player or derived from the environment by mapping the input signals into game commands, e.g. mapping the displacement of a stylus on a touch screen to the steering angle of a virtual vehicle involving motion or position input signals, e.g. signals representing the rotation of an input controller or a player's arm motions sensed by accelerometers or gyroscopes
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63F—CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
- A63F13/00—Video games, i.e. games using an electronically generated display having two or more dimensions
- A63F13/20—Input arrangements for video game devices
- A63F13/21—Input arrangements for video game devices characterised by their sensors, purposes or types
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63F—CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
- A63F13/00—Video games, i.e. games using an electronically generated display having two or more dimensions
- A63F13/20—Input arrangements for video game devices
- A63F13/21—Input arrangements for video game devices characterised by their sensors, purposes or types
- A63F13/213—Input arrangements for video game devices characterised by their sensors, purposes or types comprising photodetecting means, e.g. cameras, photodiodes or infrared cells
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63F—CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
- A63F13/00—Video games, i.e. games using an electronically generated display having two or more dimensions
- A63F13/30—Interconnection arrangements between game servers and game devices; Interconnection arrangements between game devices; Interconnection arrangements between game servers
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63F—CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
- A63F13/00—Video games, i.e. games using an electronically generated display having two or more dimensions
- A63F13/40—Processing input control signals of video game devices, e.g. signals generated by the player or derived from the environment
- A63F13/42—Processing input control signals of video game devices, e.g. signals generated by the player or derived from the environment by mapping the input signals into game commands, e.g. mapping the displacement of a stylus on a touch screen to the steering angle of a virtual vehicle
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63F—CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
- A63F13/00—Video games, i.e. games using an electronically generated display having two or more dimensions
- A63F13/80—Special adaptations for executing a specific game genre or game mode
- A63F13/837—Shooting of targets
Definitions
- the present invention relates to an information processing apparatus, a control method for the information processing apparatus, a program, and an information storage medium.
- portable information processing apparatuses including sensors for detecting posture, such as gyro sensors, motion sensors (acceleration sensors), and electronic compasses.
- the device that is the data transmission destination can be determined based on the detection result of the sensor, but the content of the data cannot be determined. Therefore, the content of the data to be transmitted has to be input by the user via an operation key or a touch panel. This is troublesome for the user.
- if the content of the transmitted data can also be determined based on the attitude of the information processing device, the user can seamlessly determine both the destination device and the content of the transmitted data, which improves user convenience.
- the present invention has been made in view of the above problems, and one of its objects is to provide an information processing apparatus, a control method for the information processing apparatus, a program, and an information storage medium capable of determining both the content of data to be transmitted and the destination device of the data, based on the attitude of the information processing apparatus specified from the detection result of a sensor that detects the attitude.
- an information processing apparatus according to the present invention is an information processing apparatus including a sensor that detects an attitude, and includes: a destination device determining unit that determines a destination device based on the attitude of the information processing apparatus specified based on the detection result of the sensor; a transmission data determining unit that determines the content of data to be transmitted to the destination device based on the attitude of the information processing apparatus specified based on the detection result of the sensor at a timing different from the timing at which the destination device was determined; and a transmission unit that transmits the data to the destination device.
- a control method for an information processing apparatus according to the present invention is a control method for an information processing apparatus including a sensor that detects an attitude, and includes: a step of determining a destination device based on the attitude of the information processing apparatus specified based on the detection result of the sensor; a step of determining the content of data to be transmitted to the destination device based on the attitude of the information processing apparatus specified based on the detection result of the sensor at a timing different from the timing at which the destination device was determined; and a step of transmitting the data to the destination device.
- a program according to the present invention is a program executed by a computer including a sensor that detects an attitude, and causes the computer to determine a transmission destination device based on the attitude of the computer specified based on the detection result of the sensor, determine the content of data to be transmitted to the transmission destination device based on the attitude of the computer specified based on the detection result of the sensor at a timing different from the timing at which the transmission destination device was determined, and transmit the data to the transmission destination device.
- another information processing apparatus according to the present invention is an information processing apparatus including a sensor that detects an attitude, and includes: a receiving unit that receives data from a transmission device, the transmission device including a transmission destination device determining unit that determines a transmission destination device based on the attitude of the transmission device specified based on the detection result of a sensor that is included in the transmission device and detects the attitude of the transmission device, a transmission data determining unit that determines the content of the data transmitted to the transmission destination device based on the attitude of the transmission device specified based on the detection result of that sensor at a timing different from the timing at which the transmission destination device was determined, and a transmission unit that transmits the data to the transmission destination device; an attitude specifying unit that specifies the attitude of the information processing apparatus based on the detection result of the sensor; and a processing executing unit that executes processing corresponding to the received data and the specified attitude of the information processing apparatus.
- another control method for an information processing apparatus according to the present invention is a control method for an information processing apparatus including a sensor that detects an attitude, and includes: a step of receiving data from a transmission device, the transmission device including a transmission destination device determining unit that determines a transmission destination device based on the attitude of the transmission device specified based on the detection result of a sensor that is included in the transmission device and detects the attitude of the transmission device, a transmission data determining unit that determines the content of the data transmitted to the transmission destination device based on the attitude of the transmission device specified based on the detection result of that sensor at a timing different from the timing at which the transmission destination device was determined, and a transmission unit that transmits the data to the transmission destination device; a step of specifying the attitude of the information processing apparatus based on the detection result of the sensor; and a step of executing processing corresponding to the received data and the specified attitude of the information processing apparatus.
- another program according to the present invention is a program executed by a computer including a sensor that detects an attitude, and causes the computer to execute: a procedure for receiving data from a transmission device, the transmission device including a transmission destination device determining unit that determines a transmission destination device based on the attitude of the transmission device specified based on the detection result of a sensor that is included in the transmission device and detects the attitude of the transmission device, a transmission data determining unit that determines the content of the data transmitted to the transmission destination device based on the attitude of the transmission device specified based on the detection result of that sensor at a timing different from the timing at which the transmission destination device was determined, and a transmission unit that transmits the data to the transmission destination device; a procedure for specifying the attitude of the computer based on the detection result of the sensor; and a procedure for executing processing corresponding to the received data and the specified attitude of the computer.
- the above program can also be stored in a computer-readable information storage medium.
- according to the present invention, the transmission destination device is determined based on the attitude of the information processing apparatus specified based on the detection result of the sensor included in the information processing apparatus. Then, the content of the data transmitted to the transmission destination device is determined based on the attitude of the information processing apparatus specified based on the detection result of the sensor at a timing different from the timing at which the transmission destination device was determined.
- in this way, both the content of the data to be transmitted and the transmission destination device of the data can be determined based on the attitude of the information processing apparatus.
- the transmission data determining unit may determine the content of the data transmitted to the transmission destination device based on an attitude specified based on a detection result of the sensor after the timing at which the transmission destination device is determined.
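The two-timing determination described above can be sketched as a small state machine: the attitude at one moment fixes the destination, and the attitude at a later moment fixes the content of the data sent there. All class, method, and payload names below are illustrative assumptions, not taken from the patent:

```python
class AttitudeSender:
    """Minimal sketch of the two-step flow: destination first, data content later."""

    def __init__(self):
        self.destination = None  # no destination determined yet

    def on_attitude(self, attitude, confirm_pressed):
        """Called each time the sensor reports a new attitude."""
        if self.destination is None:
            if confirm_pressed:  # e.g. a lock-on style operation
                self.destination = self.pick_destination(attitude)
            return None
        if confirm_pressed:  # e.g. a launch style operation, at a later timing
            payload = self.build_payload(attitude)  # content from the *later* attitude
            dest, self.destination = self.destination, None
            return dest, payload
        return None

    def pick_destination(self, attitude):
        return f"device-facing-{attitude}"  # placeholder for direction matching

    def build_payload(self, attitude):
        return {"attitude": attitude}  # placeholder for the transmitted content
```

A transmission unit would then send the returned payload to the returned destination device.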
- the information processing apparatus may further include a positioning unit, and the transmission destination device determining unit may determine the destination device from among a plurality of devices based on the relationship between the position specified for each of the plurality of devices, the position of the information processing apparatus specified based on the positioning result of the positioning unit, and the attitude specified based on the detection result of the sensor.
- the transmission destination device determining unit may determine the transmission destination device from among the plurality of devices based on the direction from the position of the information processing apparatus toward the position specified for each of the plurality of devices, and the direction corresponding to the attitude specified based on the detection result of the sensor.
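As a rough illustration of this direction-based selection, the sketch below picks, from candidate devices with known latitude/longitude positions, the one whose bearing from the apparatus best matches the direction the apparatus is facing. The function name, the angular tolerance, and the use of the great-circle bearing formula are assumptions, not part of the patent:

```python
import math

def choose_destination(own_pos, own_heading_deg, candidates, tolerance_deg=15.0):
    """Return the id of the candidate whose bearing from own_pos best matches
    own_heading_deg (degrees clockwise from north), or None if none is close."""
    best_id, best_diff = None, tolerance_deg
    for dev_id, (lat, lon) in candidates.items():
        # Initial great-circle bearing from our position to the candidate.
        d_lon = math.radians(lon - own_pos[1])
        lat1, lat2 = math.radians(own_pos[0]), math.radians(lat)
        y = math.sin(d_lon) * math.cos(lat2)
        x = (math.cos(lat1) * math.sin(lat2)
             - math.sin(lat1) * math.cos(lat2) * math.cos(d_lon))
        bearing = math.degrees(math.atan2(y, x)) % 360.0
        # Smallest angular difference between our heading and that bearing.
        diff = abs((own_heading_deg - bearing + 180.0) % 360.0 - 180.0)
        if diff < best_diff:
            best_id, best_diff = dev_id, diff
    return best_id
```

For example, a device due north of the apparatus is selected only while the apparatus faces roughly north.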
- the information processing apparatus may further include an imaging unit, and the transmission destination device determining unit may determine the destination device based on the imaging direction of the imaging unit, which is specified based on the detection result of the sensor.
- FIG. 1 is a diagram illustrating an example of the overall configuration of an information processing system 10 according to an embodiment of the present invention.
- the information processing system 10 includes, for example, an information processing apparatus 12 (12-1, 12-2,..., 12-n) and a server 14.
- the information processing apparatus 12 and the server 14 are connected to a computer network 16 such as the Internet and can communicate with each other.
- the server 14 is, for example, a server computer that relays data exchange between the information processing apparatuses 12.
- FIG. 2A is a front view showing an example of the appearance of the information processing apparatus 12 according to the present embodiment.
- FIG. 2B is a rear view illustrating an example of the appearance of the information processing apparatus 12 illustrated in FIG. 2A.
- FIG. 3 is a configuration diagram illustrating an example of a hardware configuration of the information processing apparatus 12 illustrated in FIGS. 2A and 2B.
- the information processing apparatus 12 according to the present embodiment is assumed to be a portable device such as a portable game machine.
- the casing of the information processing apparatus 12 has a flat plate shape as a whole.
- the horizontal direction (width direction) of the housing is defined as the X1 axis direction
- the vertical direction (height direction) is defined as the Y1 axis direction
- the thickness direction (depth direction) is defined as the Z1 axis direction.
- the direction from the left to the right when viewed from the front of the housing is the X1-axis positive direction
- the direction from the bottom to the top when viewed from the front of the housing is the Y1-axis positive direction
- the direction toward the back is the Z1-axis positive direction.
- the information processing apparatus 12 includes a control unit 20, a storage unit 22, a communication unit 24, an image processing unit 26, a display unit 28, a touch sensor 30, an operation key 32, an imaging unit 34, a positioning unit 36, a sensor unit 38, and the like. These elements are connected via a bus.
- the control unit 20 is, for example, a CPU or the like, and executes various types of information processing according to programs stored in the storage unit 22.
- the storage unit 22 is a memory element such as a RAM or a ROM, and stores programs executed by the control unit 20 and various data.
- the communication unit 24 transmits information to another information processing apparatus 12, the server 14 on the Internet, or the like according to an instruction input from the control unit 20. In addition, the communication unit 24 outputs received information to the control unit 20.
- the communication unit 24 according to the present embodiment includes, for example, a mobile phone communication module for performing data communication using a mobile phone network and a wireless LAN module for performing data communication using a wireless LAN.
- the image processing unit 26 includes, for example, a GPU and a frame buffer memory, and draws an image to be displayed on the display unit 28 in accordance with an instruction output from the control unit 20.
- the image processing unit 26 includes a frame buffer memory corresponding to the display area of the display unit 28, and the GPU writes an image to the frame buffer memory every predetermined time in accordance with an instruction from the control unit 20. Then, the image written in the frame buffer memory is converted into a video signal at a predetermined timing and displayed on the display unit 28.
- the display unit 28 is various image display devices such as a liquid crystal display panel and an organic EL display panel.
- the touch sensor 30 is a sensor that sequentially detects contact of an object (for example, a finger or the like) on the detection surface at a predetermined time interval.
- the information processing apparatus 12 according to the present embodiment includes two touch sensors 30 (a front touch sensor 30a and a back touch sensor 30b).
- a touch panel 40 in which the display unit 28 and the front touch sensor 30a are integrated is provided on the front surface of the housing of the information processing apparatus 12 according to the present embodiment.
- the touch sensor 30 according to the present embodiment may be of any type, such as a capacitive type, a pressure-sensitive type, or an optical type, as long as it can detect the position of an object on the detection surface.
- the operation key 32 is a kind of operation unit used for operation input performed by the user on the information processing apparatus 12.
- the operation keys 32 according to the present embodiment include, for example, a direction key 32a, a button group 32b, an analog stick 32c, an L button 32d, an R button 32e, and the like.
- the direction key 32a is arranged on the left side of the front surface of the housing.
- the button group 32b is disposed on the right side of the front surface of the housing.
- Two analog sticks 32c are arranged on the left and right sides of the front of the housing.
- the L button 32d is disposed on the left side of the upper side surface of the casing as viewed from the front of the casing.
- the R button 32e is arranged on the right side of the upper side surface of the housing as viewed from the front of the housing.
- the imaging unit 34 is a camera that outputs an image obtained by imaging a subject to the information processing apparatus 12.
- the information processing apparatus 12 according to the present embodiment is provided with two imaging units 34.
- the imaging unit 34 provided on the front surface of the housing is referred to as a front imaging unit 34a
- the imaging unit 34 provided on the back surface of the housing is referred to as a back imaging unit 34b.
- the positioning unit 36 is a device for measuring the position (latitude and longitude) of the information processing apparatus 12 using GPS (Global Positioning System).
- the sensor unit 38 is a device that detects the attitude of the information processing apparatus 12.
- the sensor unit 38 according to the present embodiment includes, for example, a 3-axis gyro sensor, a 3-axis motion sensor (3-axis acceleration sensor), and an electronic compass.
- the sensor unit 38 according to the present embodiment can specify not only the change in the attitude of the information processing apparatus 12 but also the direction in which the information processing apparatus 12 faces.
- various types of data are transmitted and received between the information processing apparatuses 12 via the server 14.
- the data to be transmitted and the information processing apparatus 12 that is the transmission destination of the data are determined based on the attitude of the information processing apparatus 12 specified based on the detection result of the sensor unit 38.
- a play image 50-1 illustrated in FIG. 4 is displayed on the display unit 28 of the information processing apparatus 12 of the player participating in the shooting game.
- FIG. 4 is a diagram illustrating an example of the play image 50-1 displayed on the display unit 28 of the first information processing apparatus 12-1.
- the play image 50-1 displayed on the display unit 28 is updated at predetermined time intervals (for example, 1/60 second intervals).
- the play image 50-1 includes a captured image 52 illustrated in FIG. 5 captured by the rear imaging unit 34b as a background.
- the rear imaging unit 34b generates the captured image 52 at a predetermined time interval (for example, 1/60 second interval).
- the update interval of the play image 50-1 matches the generation interval of the captured image 52, and the latest play image 50-1 includes the captured image 52 with the latest generation date and time.
- the play image 50-1 includes an image representing a state in which the viewing direction Vd is viewed from the viewpoint Vp arranged in the three-dimensional virtual space associated with the first player (see FIG. 6).
- FIG. 6 is a diagram illustrating an example of a virtual space associated with the first player. The direction parallel to the X2-Y2 plane in the virtual space according to the present embodiment is associated with the direction in the real space.
- the X2 axis positive direction corresponds to the north direction in the real space
- the X2 axis negative direction corresponds to the south direction in the real space.
- the Y2-axis positive direction corresponds to the east direction in the real space
- the Y2-axis negative direction corresponds to the west direction in the real space
- the Z2 axis positive direction corresponds to the vertically upward direction in the real space
- the Z2 axis negative direction corresponds to the vertically downward direction in the real space.
- positions on the X2-Y2 plane corresponding to each of the players participating in the shooting game are set in the virtual space.
- FIG. 6 shows a position P1 corresponding to the first player and a position P2 corresponding to the second player.
- the position in the virtual space corresponding to the player is associated with the position in the real space of the information processing apparatus 12 used by the player.
- when the position of the first information processing apparatus 12-1 and the position of the second information processing apparatus 12-2 in the real space are connected by the shortest path, the direction from the position of the first information processing apparatus 12-1 toward the position of the second information processing apparatus 12-2 is specified. The position P1 and the position P2 are set so that this direction substantially matches the direction associated with the direction from the position P1 toward the position P2 in the virtual space.
- the position of the information processing apparatus 12 in the real space is specified based on the positioning result of the positioning unit 36 included in the information processing apparatus 12.
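One way to realize this correspondence between real-space and virtual-space directions is to convert each device's measured latitude and longitude into local north/east offsets from the first apparatus and use them as virtual X2-Y2 coordinates (X2 = north, Y2 = east, per the embodiment). The equirectangular approximation and the function name are illustrative assumptions:

```python
import math

EARTH_RADIUS_M = 6_371_000.0  # mean Earth radius, metres

def real_to_virtual(origin_latlon, target_latlon):
    """Offset of target from origin in metres, as (north, east) components,
    so the virtual direction P1->P2 matches the real-space direction."""
    lat0, lon0 = map(math.radians, origin_latlon)
    lat1, lon1 = map(math.radians, target_latlon)
    north_m = (lat1 - lat0) * EARTH_RADIUS_M                   # X2 axis: north
    east_m = (lon1 - lon0) * math.cos(lat0) * EARTH_RADIUS_M   # Y2 axis: east
    return (north_m, east_m)
```

A device slightly to the north of the first apparatus thus maps to a point in the X2-axis positive direction from P1.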
- the viewpoint Vp is disposed at the position corresponding to the player, and the target player object TPO is disposed at the position corresponding to another player. Since FIG. 6 shows the virtual space associated with the first player, the viewpoint Vp is placed at the position P1 corresponding to the first player, and the target player object TPO corresponding to the second player is arranged at the position P2 corresponding to the second player.
- the target player object TPO is a flat polygon model on which an icon image of the player is pasted.
- the target player object TPO is arranged so that its front faces the viewpoint Vp.
- the target player object TPO of the second player is arranged at a position away from the position P2 corresponding to the second player by a predetermined distance along the positive Z2 axis.
- the line-of-sight direction Vd is determined based on the attitude of the information processing apparatus 12 in the real space.
- the posture is specified based on the detection result of the sensor unit 38 included in the information processing apparatus 12.
- the line-of-sight direction Vd shown in FIG. 6 is determined based on the attitude of the first information processing apparatus 12-1.
- the sensor unit 38 detects the posture of the first information processing apparatus 12-1 at a predetermined time interval (for example, at an interval of 1/60 seconds).
- the line-of-sight direction Vd is updated to a value corresponding to the detected posture every time the posture of the first information processing apparatus 12-1 is detected.
- FIG. 7 shows an example of the relationship between the posture of the information processing apparatus 12 according to the present embodiment and the line-of-sight direction Vd of the viewpoint Vp arranged in the virtual space.
- the line-of-sight direction Vd is expressed by a combination of angles formed by the reference direction and the line-of-sight direction Vd when viewed from each of three orthogonal directions.
- the line-of-sight direction Vd is expressed by a combination of the yaw angle θy, the pitch angle θp, and the roll angle θr, for example.
- the yaw angle θy of the line-of-sight direction Vd corresponds to the angle formed by the X2-axis positive direction and the line-of-sight direction Vd when viewed along the Z2-axis negative direction.
- the counterclockwise rotation is positive with reference to the X2-axis positive direction when viewed along the Z2-axis negative direction.
- the pitch angle θp of the line-of-sight direction Vd corresponds to the angle formed by the X2-axis positive direction and the line-of-sight direction Vd when viewed along the Y2-axis negative direction.
- the counterclockwise rotation is positive with reference to the X2-axis positive direction when viewed along the Y2-axis negative direction.
- the roll angle θr of the line-of-sight direction Vd corresponds to the angle formed by the Z2-axis positive direction and the line-of-sight direction Vd when viewed along the X2-axis positive direction.
- the counterclockwise rotation is positive with reference to the Z2-axis positive direction when viewed along the X2-axis positive direction.
- the combination of the yaw angle θy, the pitch angle θp, and the roll angle θr is uniquely determined based on the attitude of the information processing apparatus 12.
- the angles formed by the respective reference directions and the Z1-axis positive direction of the information processing apparatus 12 (the imaging direction of the back imaging unit 34b) when viewed from each of three mutually orthogonal directions are set as the yaw angle θy, the pitch angle θp, and the roll angle θr, respectively.
- the angle formed by the horizontal north direction and the Z1-axis positive direction of the information processing apparatus 12 when viewed along the vertically downward direction is set as the yaw angle θy of the line-of-sight direction Vd.
- the counterclockwise direction is positive with respect to the horizontal north direction when viewed along the vertically downward direction.
- the angle formed by the horizontal north direction and the Z1-axis positive direction of the information processing apparatus 12 when viewed along the direction from east to west is set as the pitch angle θp of the line-of-sight direction Vd.
- the counterclockwise direction is positive with respect to the horizontal north direction when viewed along the direction from east to west.
- the angle formed by the vertically upward direction and the Z1-axis positive direction of the information processing apparatus 12 when viewed along the direction from south to north is set as the roll angle θr of the line-of-sight direction Vd.
- the counterclockwise direction is positive with respect to the vertically upward direction when viewed along the direction from south to north.
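The yaw and pitch of the line-of-sight direction Vd can be derived from the device's forward vector (the Z1-axis positive direction, i.e. the back camera's imaging direction) as sketched below. Expressing the vector in (north, east, up) components and the exact sign handling are assumptions chosen to be consistent with the counterclockwise-positive conventions described here:

```python
import math

def yaw_pitch_from_forward(forward_neu):
    """Yaw and pitch (degrees) of a forward vector given as (north, east, up).

    Yaw: angle from north viewed along the vertically downward direction,
    counterclockwise positive (so a vector pointing east yields -90).
    Pitch: elevation of the vector above the horizontal plane.
    """
    n, e, u = forward_neu
    yaw = math.degrees(math.atan2(-e, n))
    pitch = math.degrees(math.atan2(u, math.hypot(n, e)))
    return yaw, pitch
```

A device held level and facing north therefore gives yaw 0 and pitch 0.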
- the direction of the imaging direction of the rear imaging unit 34b substantially matches the direction associated with the line-of-sight direction Vd in the virtual space.
- for example, when the Z1-axis positive direction of the information processing apparatus 12 is the horizontal north direction, the line-of-sight direction Vd in the virtual space is the X2-axis positive direction.
- the visual field range Vv, which has the shape of a quadrangular pyramid without a bottom, is determined based on the position of the viewpoint Vp and the line-of-sight direction Vd.
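Whether the target player object falls inside the visual field range Vv can be approximated with a simple angular test. The cone approximation and the half-angle value below are assumptions (the patent describes a quadrangular pyramid, not a cone):

```python
import math

def in_view(viewpoint, view_dir, target, half_fov_deg=30.0):
    """True if the direction from viewpoint to target lies within
    half_fov_deg of view_dir (assumed to be a unit vector)."""
    to_target = [t - v for t, v in zip(target, viewpoint)]
    norm = math.sqrt(sum(c * c for c in to_target))
    if norm == 0.0:
        return False
    dot = sum(a * b / norm for a, b in zip(to_target, view_dir))
    return dot >= math.cos(math.radians(half_fov_deg))
```

An object straight ahead of the viewpoint passes the test; one behind it does not.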
- the play image 50-1 includes an image of the target player object TPO.
- information indicating the name of the player corresponding to the target player object TPO and the distance to the information processing device 12 used by the player corresponding to the target player object TPO is also displayed.
- the update interval of the play image 50-1 and the update interval of the line-of-sight direction Vd are the same.
- the latest play image 50-1 includes an image representing a state in which the virtual space is viewed from the latest line-of-sight direction Vd.
- as described above, the direction from the position of the first information processing apparatus 12-1 toward the position of the second information processing apparatus 12-2 in the real space substantially coincides with the direction associated with the direction from the position P1 toward the position P2 in the virtual space, and the imaging direction of the rear imaging unit 34b substantially matches the direction associated with the line-of-sight direction Vd in the virtual space. Therefore, the play image 50-1 that includes the captured image 52 captured by the rear imaging unit 34b facing the direction from the position of the first information processing apparatus 12-1 toward the position of the second information processing apparatus 12-2 also includes an image of the target player object TPO of the second player.
- the play image 50-1 includes a missile image MP, a scope image SP, and a launch pad image LP.
- Each of the missile images MP includes a number indicating the attack power.
- the first player can select a missile to be fired by operating the operation key 32 or the touch sensor 30.
- the play image 50-1 shown in FIG. 4 shows a state in which an image of a missile with an attack power of 3 is selected. Further, the position of the scope image SP in the play image 50-1 does not change even when the posture of the information processing apparatus 12 changes.
- a circular area inside the scope image SP is referred to as a lock-on area.
- when at least a part of the image of the target player object TPO corresponding to the second player is displayed in the lock-on area and the first player performs a predetermined lock-on operation, the target player object TPO is locked on.
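The lock-on condition might be tested in screen space roughly as follows. Testing only the object's center point, and the names used, are simplifications of the "at least a part of the image" condition described here:

```python
def is_locked_on(target_screen_xy, scope_center_xy, scope_radius):
    """True if the target's screen-space position lies inside the
    circular lock-on area of the scope image SP."""
    dx = target_screen_xy[0] - scope_center_xy[0]
    dy = target_screen_xy[1] - scope_center_xy[1]
    return dx * dx + dy * dy <= scope_radius * scope_radius
```

The game would evaluate this each frame while the player holds the lock-on operation.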
- FIG. 8 shows an example of a play image 50-1 displayed when the target player object TPO corresponding to the second player is locked on.
- FIG. 9 shows an example of the virtual space associated with the first player when the target player object TPO corresponding to the second player is locked on.
- as shown in FIG. 9, a translucent trajectory object TO representing the trajectory of the missile to be launched is arranged in the virtual space.
- the position P1 is set as the position of the start point of the trajectory
- the position P2 is set as the position of the end point of the trajectory.
- while the L button 32d is pressed, the target player object TPO is kept locked on. Then, when the first player changes the attitude of the first information processing apparatus 12-1 with the target player object TPO locked on, the trajectory of the missile represented by the trajectory object TO changes while the start point position and the end point position are maintained.
- FIG. 10 shows an example of the play image 50-1 after the missile trajectory has changed.
- the line-of-sight direction Vd specified based on the attitude of the information processing apparatus 12 is the missile launch direction.
- the trajectory of the missile is uniquely determined based on the combination of the value of the yaw angle θy and the value of the pitch angle θp specified based on the attitude of the information processing apparatus 12.
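One way to make "uniquely determined by the combination of yaw and pitch" concrete is a curve whose endpoints stay fixed at P1 and P2 while its shape follows the launch direction. The quadratic Bezier construction below is an illustrative assumption, not the description's actual formula.

```python
import math

def launch_direction(yaw, pitch):
    """Unit vector for a line-of-sight given yaw (about vertical) and pitch."""
    return (math.cos(pitch) * math.sin(yaw),
            math.sin(pitch),
            math.cos(pitch) * math.cos(yaw))

def trajectory_point(p1, p2, yaw, pitch, t, reach=5.0):
    """Point at parameter t in [0, 1] on a quadratic Bezier from p1 to p2.

    The control point lies 'reach' units from p1 along the launch direction,
    so the curve's shape changes with yaw/pitch while its endpoints are fixed.
    """
    d = launch_direction(yaw, pitch)
    ctrl = tuple(a + reach * b for a, b in zip(p1, d))
    return tuple((1 - t) ** 2 * a + 2 * (1 - t) * t * c + t ** 2 * b
                 for a, b, c in zip(p1, p2, ctrl))
```

With this construction, pitching the device up bends the curve upward while the start point P1 and end point P2 stay fixed.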
- the first information processing apparatus 12-1 obtains the missile data associated with the missile.
- the data is transmitted to the second information processing apparatus 12-2 (see FIG. 11).
- FIG. 11 is a diagram illustrating an example of the data structure of missile data.
- the missile data is transmitted to the second information processing apparatus 12-2 via the server 14.
- the missile data includes a launch player ID, a target player ID, attack power data, and orbit data.
- the launch player ID is identification information of the player who performed the missile launch operation.
- the identification information of the first player is set as the launch player ID.
- the target player ID is player identification information associated with a target player object TPO that is a missile arrival target.
- the identification information of the second player is set as the target player ID.
- the attack power data is data indicating the attack power of the missile. For example, when a missile with an attack power of 3 is launched, 3 is set as the value of the attack power data.
- the trajectory data is data associated with the trajectory of the missile, which is determined based on the attitude of the information processing apparatus 12 at the time of the launch operation.
- as the value of the trajectory data included in the missile data, the combination of the value of the yaw angle θy and the value of the pitch angle θp of the line-of-sight direction Vd, specified based on the attitude of the information processing apparatus 12 at the time of the missile launch operation, is set.
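The missile data record described in FIG. 11 can be sketched as a small data structure. The field names below are illustrative; the description only specifies what each field holds.

```python
from dataclasses import dataclass

@dataclass
class MissileData:
    """Sketch of the missile data described above (field names assumed)."""
    launch_player_id: str   # player who performed the launch operation
    target_player_id: str   # player of the locked-on target player object TPO
    attack_power: int       # e.g. 3 when a missile of attack power 3 is launched
    yaw: float              # yaw angle θy at the time of the launch operation
    pitch: float            # pitch angle θp at the time of the launch operation

# Example record: first player fires an attack-power-3 missile at the second player.
data = MissileData("player-1", "player-2", 3, 0.2, -0.1)
```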
- the display unit 28 of the first information processing apparatus 12-1 displays a state in which a missile is launched from the launch pad.
- the missile object MO associated with the missile is arranged at a position along the trajectory determined based on the combination of the value of the yaw angle θy and the value of the pitch angle θp set as the value of the trajectory data.
- FIG. 12 is a diagram illustrating an example of a virtual space in which the missile object MO is arranged and associated with the first player.
- the missile object MO is arranged at a position separated from the start point by a predetermined length along the trajectory of the missile determined at the time of the launch operation.
- FIG. 13 is a diagram illustrating an example of a play image 50-1 displayed on the display unit 28 of the first information processing apparatus 12-1 after a missile is launched.
- an image of the missile object MO is shown.
- the missile object MO arranged in the virtual space moves in the virtual space toward the position P2 at a predetermined speed along the determined trajectory.
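The "moves at a predetermined speed along the determined trajectory" behaviour can be sketched as arc-length interpolation over a sampled path. This is a minimal sketch under the assumption that the trajectory has been sampled into a polyline; it is not the description's actual implementation.

```python
import math

def point_at_distance(path, dist):
    """Position 'dist' units from the start along a polyline trajectory.

    Clamps to the last sample (the end point, position P2) once the path
    length is exhausted, mirroring the missile reaching its end point.
    """
    for a, b in zip(path, path[1:]):
        seg = math.dist(a, b)
        if dist <= seg:
            t = dist / seg
            return tuple(x + t * (y - x) for x, y in zip(a, b))
        dist -= seg
    return path[-1]
```

Per frame, the game would call `point_at_distance(path, speed * elapsed_frames)` to reposition the missile object MO.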
- the information processing apparatus 12 that is the transmission destination of missile data is determined based on the attitude of the information processing apparatus 12 during the lock-on operation. Then, the value of the trajectory data included in the missile data is determined based on the attitude of the information processing device 12 during the subsequent launch operation. Then, the missile data is transmitted to the information processing apparatus 12 of the determined transmission destination.
- in this way, the value of the trajectory data included in the transmitted missile data and the information processing apparatus 12 that is the transmission destination of the missile data are determined based on the attitude of the information processing apparatus 12.
- the trajectory of the missile object MO associated with the missile data is determined based on the received missile data. Then, the missile object MO, coming toward the viewpoint Vp, is arranged in the three-dimensional virtual space associated with the second player shown in FIG. 14.
- the missile object MO will be referred to as a target missile object TMO.
- the bearing of the direction from the position of the second information processing apparatus 12-2 toward the position of the first information processing apparatus 12-1 is specified.
- the position P2 and the position P1 are set so that the direction substantially coincides with the direction associated with the direction from the position P2 toward the position P1 in the virtual space shown in FIG.
- the coordinate values of the position P1 and the position P2 set in the virtual space associated with the second player shown in FIG. 14 are the same as the coordinate values of the position P1 and the position P2 set in the virtual space associated with the first player shown in FIG. 6.
- since FIG. 14 shows the virtual space associated with the second player, the viewpoint Vp is arranged at the position P2 corresponding to the second player, and a target player object TPO corresponding to the first player is arranged at the position P1 corresponding to the first player.
- the position (here, position P1) corresponding to the launch player ID included in the received missile data is set as the position of the start point of the target missile object TMO.
- a position (here, position P2) corresponding to the target player ID included in the missile data is set as the position of the end point of the target missile object TMO.
- the trajectory of the target missile object TMO is determined based on the value of the trajectory data included in the received missile data (a combination of the value of the yaw angle θy and the value of the pitch angle θp).
- the target missile object TMO is arranged along the determined trajectory at a position away from the starting point by a predetermined length. Then, the target missile object TMO moves in the virtual space toward the position P2 at a predetermined speed along the determined trajectory.
- the direction in which the missile object MO is launched when viewing the position P2 from the position P1 and the direction from which the missile object MO appears when viewing the position P1 from the position P2 are horizontally reversed. For example, when the missile object MO is launched from the position P1 toward the upper left, the missile object MO appears to come from the upper right when viewed from the viewpoint Vp arranged at the position P2 in the virtual space associated with the second player.
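The horizontal reversal just described amounts to negating the yaw component while preserving pitch when switching viewpoints. A minimal sketch of that assumption:

```python
def apparent_direction_from_target(yaw, pitch):
    """Direction as seen from position P2: horizontally mirrored (yaw negated),
    with the vertical (pitch) component preserved."""
    return (-yaw, pitch)
```

So a missile launched toward the upper left (positive yaw, positive pitch, under this sign convention) appears from the upper right (negative yaw, same pitch) at the receiving side.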
- the line-of-sight direction Vd of the viewpoint Vp arranged in the virtual space shown in FIG. 14 is determined based on the attitude of the second information processing apparatus 12-2 in the real space.
- the visual field range Vv is determined based on the position of the viewpoint Vp and the line-of-sight direction Vd.
- in this case, the play image 50-2 displayed on the display unit 28 of the second information processing apparatus 12-2 includes the image of the target missile object TMO.
- FIG. 15 is a diagram showing an example of a play image 50-2 displayed on the display unit 28 of the second information processing apparatus 12-2 after the target missile object TMO is arranged in the virtual space.
- the play image 50-2 shown in FIG. 15 includes an image of the target missile object TMO. In this case, information indicating the attack power of the target missile object TMO is also displayed.
- the play image 50-2 shown in FIG. 15 includes a missile image MP ', a scope image SP', and a launch pad image LP '.
- FIG. 16 is a diagram illustrating an example of the play image 50-2 when the target missile object TMO is locked on.
- in order to lock on the target missile object TMO, the second player needs to adjust the attitude of the second information processing apparatus 12-2 so that the line-of-sight direction Vd faces the target missile object TMO.
- FIG. 17 shows an example of a virtual space associated with the second player in which the target missile object TMO and the interceptor missile object IMO are arranged.
- the intercepting missile object IMO is arranged along the trajectory of the target missile object TMO at a position separated from the end point by a predetermined length. Further, the intercepting missile object IMO is arranged in the opposite direction to the target missile object TMO. In the present embodiment, the intercepting missile object IMO moves along the trajectory of the target missile object TMO in the virtual space at a predetermined speed in a direction opposite to the traveling direction of the target missile object TMO.
- the target missile object TMO and the intercepting missile object IMO collide.
- when the attack power of the intercepting missile object IMO is larger than the attack power of the target missile object TMO, both missile objects are erased from the virtual space. That is, in this case, the second player has intercepted the missile object MO fired by the first player.
- when the attack power of the intercepting missile object IMO is smaller than the attack power of the target missile object TMO, the intercepting missile object IMO is erased from the virtual space, and the attack power of the target missile object TMO is reduced by the attack power of the intercepting missile object IMO.
- when the attack power of the target missile object TMO and the attack power of the intercepting missile object IMO are the same, the target missile object TMO is erased from the virtual space, and the attack power of the intercepting missile object IMO is set to a value obtained by adding 1 to the attack power of the target missile object TMO.
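The three collision rules above can be collected into one resolution function. This is a sketch of the rules as stated (including my reading of the equal-power case, whose original wording is garbled); `None` stands for a missile object erased from the virtual space.

```python
def resolve_collision(target_power, interceptor_power):
    """Apply the collision rules to the target missile object TMO and the
    intercepting missile object IMO.

    Returns (remaining TMO power, remaining IMO power); None means that
    missile object is erased from the virtual space.
    """
    if interceptor_power > target_power:
        # interception succeeds: both missile objects are erased
        return None, None
    if interceptor_power < target_power:
        # interceptor erased; target weakened by the interceptor's power
        return target_power - interceptor_power, None
    # equal powers: target erased; interceptor's power raised by 1 (assumed reading)
    return None, target_power + 1
```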
- the score of the player corresponding to the target player object TPO is reduced by a value obtained by multiplying the attack power of the missile object MO by 10. The player who has the highest score when a predetermined time has elapsed since the game started is the winner.
- the player can experience the sensation that a game played in a virtual space is taking place in the real space.
- the position P1 and the position P2 are set in the virtual space, but positions corresponding to other players participating in the shooting game are also set in the same manner.
- the information processing apparatus 12 used by another player is referred to as another information processing apparatus 12, and a position corresponding to the other player is referred to as a position Px.
- first, the case where the position Px is set in the virtual space illustrated in FIG. 6 will be described.
- the bearing of the direction from the position of the first information processing apparatus 12-1 toward the position of the other information processing apparatus 12, when the two positions in the real space are connected by the shortest distance, is specified.
- the position Px in the virtual space shown in FIG. 6 is set so that this bearing substantially coincides with the direction from the position P1 toward the position Px in the virtual space shown in FIG. 6.
- next, the case where the position Px is set in the virtual space shown in FIG. 14 will be described.
- similarly, the bearing of the direction from the position of the second information processing apparatus 12-2 toward the position of the other information processing apparatus 12, when the two positions in the real space are connected by the shortest distance, is specified.
- the position Px in the virtual space shown in FIG. 14 is set so that this bearing substantially coincides with the direction from the position P2 toward the position Px in the virtual space shown in FIG. 14.
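The mapping just described, from a real-space bearing between two devices to a virtual-space position Px, can be sketched as follows. The flat-earth bearing approximation and the fixed placement distance are illustrative assumptions; the description only requires that the two directions substantially coincide.

```python
import math

def bearing(lat1, lon1, lat2, lon2):
    """Approximate compass bearing (radians, 0 = north) from device A to device B,
    using a local flat-earth approximation adequate over short distances."""
    d_north = lat2 - lat1
    d_east = (lon2 - lon1) * math.cos(math.radians(lat1))
    return math.atan2(d_east, d_north)

def virtual_position(p_self, b, distance=10.0):
    """Place the other player's position Px 'distance' units from p_self (an
    (x, z) ground-plane position) so the virtual direction matches bearing b."""
    x, z = p_self
    return (x + distance * math.sin(b), z + distance * math.cos(b))
```

A device directly north of the player in the real space would thus get a Px directly "ahead" along the virtual space's corresponding axis.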
- the value of the trajectory data included in the missile data and the information processing device 12 that is the transmission destination of the missile data are determined.
- the missile data generation process and the transmission / reception process performed by the information processing apparatus 12 according to the present embodiment will be mainly described.
- FIG. 18 is a functional block diagram illustrating an example of functions implemented by the information processing apparatus 12 according to the present embodiment. Note that the information processing apparatus 12 according to the present embodiment does not have to include all the functions illustrated in FIG. 18, and functions other than the functions illustrated in FIG. 18 may be mounted.
- the information processing apparatus 12 functionally includes, for example, a positioning result acquisition unit 60, a position information transmission unit 62, a target position information reception unit 64, a captured image acquisition unit 66, A posture specifying unit 68, an operation receiving unit 70, a game-related data storage unit 72, a game process execution unit 74, a missile data generation unit 76, a missile data transmission unit 78, and a missile data reception unit 80 are included.
- the game related data storage unit 72 is mainly implemented by the storage unit 22 of the information processing apparatus 12.
- the position information transmission unit 62, the target position information reception unit 64, the missile data transmission unit 78, and the missile data reception unit 80 are implemented mainly by the communication unit 24 of the information processing apparatus 12. The other functions are implemented mainly by the control unit 20 of the information processing apparatus 12.
- the above functions are implemented by causing the control unit 20 of the information processing apparatus 12 to execute a program installed in the information processing apparatus 12 that is a computer and including a command corresponding to the above functions.
- This program is supplied to the information processing apparatus 12 via a computer-readable information storage medium such as an optical disk, a magnetic disk, a magnetic tape, a magneto-optical disk, or a flash memory, or via a computer network such as the Internet.
- the positioning result acquisition unit 60 acquires the positioning result of the positioning unit 36.
- the positioning result acquisition unit 60 acquires the position coordinate data generated by the positioning unit 36, for example.
- the positioning unit 36 receives satellite signals (GPS signals) and, based on the satellite signals, generates position coordinate data indicating position coordinates (latitude and longitude in the present embodiment). The positioning result acquisition unit 60 then acquires the position coordinate data.
- the location information transmission unit 62 transmits the location information acquired by the positioning result acquisition unit 60 to the server 14.
- the position information transmission unit 62 transmits position coordinate data indicating the position coordinates of the information processing apparatus 12 associated with the identification information of the player who uses the information processing apparatus 12 to the server 14.
- the server 14 receives and stores the position coordinate data associated with the identification information of all the players participating in the shooting game.
- the target position information receiving unit 64 receives position coordinate data associated with the identification information of the information processing apparatus 12 used by other players participating in the shooting game from the server 14.
- the target position information receiving unit 64 transmits a transmission request for position coordinate data for other players participating in the shooting game to the server 14.
- the server 14 transmits the position coordinate data to the information processing device 12 in response to the transmission request. Then, the target position information receiving unit 64 receives the position coordinate data.
- the captured image acquisition unit 66 acquires the captured image 52 generated by the imaging unit 34.
- the captured image acquisition unit 66 acquires the captured image 52 at a predetermined time interval (for example, 1/60 second interval).
- the posture specifying unit 68 specifies the posture of the information processing apparatus 12 detected by the sensor unit 38.
- the posture specifying unit 68 specifies the yaw angle θy, the pitch angle θp, and the roll angle θr based on the specified posture.
- the yaw angle θy, the pitch angle θp, and the roll angle θr are specified at predetermined time intervals (for example, 1/60-second intervals).
- the operation reception unit 70 receives an operation signal from the operation key 32 and identifies the input operation key 32.
- the operation receiving unit 70 specifies the input operation key 32 at a predetermined time interval (for example, 1/60 second interval).
- the game related data storage unit 72 stores various data related to the shooting game.
- the game-related data storage unit 72 stores, for example, data representing the virtual space associated with the player using the information processing apparatus 12, data on virtual objects placed in the virtual space, images included in the play image 50, the attack power of the currently selected missile, and so on.
- the game process execution unit 74 executes various processes related to the play of the shooting game as described above. Hereinafter, this process is referred to as a game process.
- the game process execution unit 74 executes processes such as construction of a virtual space, arrangement of the viewpoint Vp and virtual objects in the virtual space, for example.
- the viewpoint Vp is arranged at a position corresponding to the position coordinate indicated by the position coordinate data acquired by the positioning result acquisition unit 60.
- the target player object TPO of another player is arranged at a position corresponding to the position coordinates indicated by the position coordinate data, received by the target position information receiving unit 64, that is associated with the identification information of that player.
- the game process execution unit 74 specifies the target player object TPO associated with the information processing apparatus 12 that is to be the transmission destination of missile data, based on the attitude of the information processing apparatus 12.
- the game process execution unit 74 also performs other processes such as movement and deletion of the missile object MO in the virtual space, score deduction, generation of the play image 50, and display of the play image 50 on the display unit 28.
- the missile data generation unit 76 generates missile data when the operation receiving unit 70 receives a missile launch operation in a state where the target player object TPO is locked on.
- the missile data generation unit 76 determines the information processing apparatus 12 used by the locked-on player as a missile data transmission destination apparatus. Further, the missile data generation unit 76 determines the value of the trajectory data included in the missile data based on the attitude specified by the attitude specifying unit 68 at the timing when the R button 32e is pressed. That is, in the present embodiment, the value of the trajectory data is determined based on the posture specified at a timing later than the timing at which the target player object TPO is locked on.
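The two-timing rule described here, destination fixed at the lock-on timing, trajectory data sampled later at the launch timing, can be sketched as a small state machine. All names and the dictionary shape are illustrative assumptions, not the actual implementation.

```python
class MissileDataGenerator:
    """Sketch: destination chosen at lock-on (L button), trajectory at launch (R button)."""

    def __init__(self):
        self.destination = None

    def on_lock_on(self, target_device_id):
        # timing 1: the attitude at the L-button press has selected this destination
        self.destination = target_device_id

    def on_launch(self, yaw, pitch, attack_power):
        # timing 2: the attitude at the R-button press fixes the trajectory data
        if self.destination is None:
            return None  # no locked-on target: nothing to transmit
        return {"to": self.destination, "yaw": yaw, "pitch": pitch,
                "attack_power": attack_power}
```

The key point the claims turn on is that `on_lock_on` and `on_launch` read the sensor at different timings.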
- the missile data transmission unit 78 transmits the missile data generated by the missile data generation unit 76 to the server 14.
- the server 14 specifies a target player ID included in the missile data. Then, the missile data is transmitted to the information processing device 12 of the player corresponding to the target player ID.
- the missile data receiving unit 80 receives missile data transmitted from the server 14.
- the captured image acquisition unit 66 acquires the captured image 52 generated by the imaging unit 34 (S101). Then, the posture specifying unit 68 acquires the detection result of the sensor unit 38 and specifies the yaw angle θy, the pitch angle θp, and the roll angle θr based on the detection result (S102). Then, the operation receiving unit 70 specifies the input operation key 32 (S103).
- the game process execution unit 74 checks whether there is new missile data received by the missile data receiving unit 80 (S104). If there is (S104: Y), the game process execution unit 74 sets a trajectory in the virtual space based on the missile data (S105). Then, the game process execution unit 74 places the target missile object TMO at a position along the trajectory (S106).
- the missile data generation unit 76 checks whether or not the operation reception unit 70 has received a new missile launch operation in a state where the target player object TPO is locked on (S107). When it is confirmed in the process shown in S104 that the missile data received by the missile data receiving unit 80 does not exist (S104: N), the process shown in S107 is also executed. If it is confirmed that a new missile launch operation has been accepted (S107: Y), the missile data generation unit 76 generates missile data (S108).
- the identification information of the player who uses the information processing apparatus 12 is set as the value of the launch player ID included in the missile data. Then, as the value of the target player ID included in the missile data, player identification information corresponding to the locked-on target player object TPO is set.
- as the value of the attack power data included in the missile data, the attack power value of the currently selected missile, which is stored in the game-related data storage unit 72, is set.
- as the value of the trajectory data included in the missile data, the combination of the value of the yaw angle θy and the value of the pitch angle θp specified by the posture specifying unit 68 in the process shown in S102 is set.
- then, the missile data transmission unit 78 transmits the generated missile data to the server 14 (S109).
- the game process execution unit 74 executes a game process specified based on the result of the process shown in S101 to S109 (S110). Even when it is confirmed in the process shown in S107 that a new missile launch operation has not been accepted (S107: N), the process shown in S110 is executed. In the processing shown in S110, for example, placement of a virtual object in the virtual space, generation of the play image 50, and display of the play image 50 on the display unit 28 are performed.
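The per-frame flow S101 to S110 above can be sketched as a single function that takes each step as a callable. All parameter names are illustrative stand-ins for the functional units in FIG. 18, not the actual code.

```python
def run_frame(acquire_image, specify_attitude, read_keys,
              poll_missile_data, place_target_missile,
              launch_requested, generate_and_send, run_game_step):
    """One iteration of the loop S101-S110, each step reduced to a callable."""
    image = acquire_image()        # S101: acquire the captured image 52
    attitude = specify_attitude()  # S102: specify yaw, pitch, roll
    keys = read_keys()             # S103: specify the input operation key 32
    data = poll_missile_data()     # S104: check for newly received missile data
    if data is not None:
        place_target_missile(data)        # S105-S106: set trajectory, place TMO
    if launch_requested(keys):            # S107: launch operation while locked on?
        generate_and_send(attitude)       # S108-S109: generate and transmit
    run_game_step(image, attitude, keys)  # S110: game process and display
```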
- an example of the flow of processing performed in the information processing system 10 according to the present embodiment when missile data is transmitted from the first information processing apparatus 12-1 in the process shown in S109 will be described with reference to the flow diagram of FIG. 20.
- the server 14 receives missile data transmitted from the first information processing apparatus 12-1 (S201). Then, the server 14 specifies the information processing apparatus 12 used by the player corresponding to the target player ID included in the missile data (S202). Here, it is assumed that the second information processing apparatus 12-2 is specified. Then, the server 14 transmits the missile data received in the process shown in S201 to the information processing apparatus 12 (here, the second information processing apparatus 12-2) identified in the process shown in S202. The information processing device 12 (here, the second information processing device 12-2) identified in the process shown in S202 receives the missile data (S203).
- the information processing apparatus 12 that has received the missile data in the process shown in S203 executes the trajectory setting process shown in S105 and the target missile object TMO placement process shown in S106.
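The server-side relay in S201 to S203, look up the device of the player matching the target player ID and forward the missile data, can be sketched as follows. The registry and inbox shapes are illustrative assumptions.

```python
class Server:
    """Sketch of the relay role of the server 14 in S201-S203."""

    def __init__(self):
        self.devices = {}  # player ID -> that player's device inbox (a list)

    def register(self, player_id, inbox):
        self.devices[player_id] = inbox

    def relay(self, missile_data):
        # S201: receive the missile data
        # S202: identify the device used by the player of the target player ID
        # S203: forward the missile data to that device
        self.devices[missile_data["target_player_id"]].append(missile_data)
```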
- data transmitted / received between the information processing apparatuses 12 is not limited to missile data.
- the present embodiment may be applied to the transmission and reception of messages between the information processing apparatuses 12. For example, the information processing apparatus 12 existing in the imaging direction of the rear imaging unit 34b may be specified as the message transmission destination based on the attitude of the information processing apparatus 12 when the L button 32d is pressed. Then, when the R button 32e is subsequently pressed, a message corresponding to the attitude of the information processing apparatus 12 at that time may be transmitted to the transmission destination apparatus. For example, a "Long time no see" message may be transmitted when the apparatus is pointed upward, and a "Hello" message may be transmitted when it is pointed downward.
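The message variant above again applies the two-timing rule: destination at the L-button timing, content from the pitch angle at the R-button timing. A minimal sketch of the content selection; the threshold and strings are illustrative.

```python
def message_for_pitch(pitch):
    """Choose the message content from the pitch angle at the R-button timing
    (positive pitch = pointed upward, under this sign convention)."""
    if pitch > 0:
        return "Long time no see"  # apparatus pointed upward
    if pitch < 0:
        return "Hello"             # apparatus pointed downward
    return ""                      # level: no message in this sketch
```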
- the division of roles between the information processing apparatus 12 and the server 14 is not limited to the above. Further, the information processing apparatus 12 and the server 14 may each be composed of a plurality of housings.
- the specific character strings described above and the specific character strings in the drawings are examples, and are not limited to these character strings.
Landscapes
- Engineering & Computer Science (AREA)
- Multimedia (AREA)
- Human Computer Interaction (AREA)
- User Interface Of Digital Computer (AREA)
- Processing Or Creating Images (AREA)
Claims (11)
- An information processing apparatus comprising a sensor that detects an attitude, the information processing apparatus comprising:
a transmission destination device determination unit that determines a transmission destination device based on the attitude of the information processing apparatus specified based on a detection result of the sensor;
a transmission data determination unit that determines the content of data to be transmitted to the transmission destination device based on the attitude of the information processing apparatus specified based on a detection result of the sensor at a timing different from the timing at which the transmission destination device was determined; and
a transmission unit that transmits the data to the transmission destination device.
- The information processing apparatus according to claim 1, wherein the transmission data determination unit determines the content of the data to be transmitted to the transmission destination device based on an attitude specified based on a detection result of the sensor after the timing at which the transmission destination device was determined.
- The information processing apparatus according to claim 1 or 2, wherein the information processing apparatus further comprises a positioning unit, and the transmission destination device determination unit determines the transmission destination device from among a plurality of devices based on a relationship, specified for each of the plurality of devices, between the position of that device and the position of the information processing apparatus specified based on a positioning result of the positioning unit, and on an attitude specified based on a detection result of the sensor.
- The information processing apparatus according to claim 3, wherein the transmission destination device determination unit determines the transmission destination device from among the plurality of devices based on a bearing, specified for each of the plurality of devices, of the direction from the position of the information processing apparatus toward the position of that device, and on a bearing corresponding to an attitude specified based on a detection result of the sensor.
- The information processing apparatus according to any one of claims 1 to 4, wherein the information processing apparatus further comprises an imaging unit, and the transmission destination device determination unit determines the transmission destination device based on an imaging direction of the imaging unit specified based on a detection result of the sensor.
- An information processing apparatus comprising a sensor that detects an attitude, the information processing apparatus comprising:
a reception unit that receives data from a transmission apparatus, the transmission apparatus comprising a transmission destination device determination unit that determines a transmission destination device based on an attitude of the transmission apparatus specified based on a detection result of a sensor, included in the transmission apparatus, that detects the attitude of the transmission apparatus, a transmission data determination unit that determines the content of the data to be transmitted to the transmission destination device based on an attitude of the transmission apparatus specified based on a detection result of that sensor at a timing different from the timing at which the transmission destination device was determined, and a transmission unit that transmits the data to the transmission destination device;
an attitude specifying unit that specifies the attitude of the information processing apparatus based on a detection result of the sensor; and
a process execution unit that executes processing according to the received data and the specified attitude of the information processing apparatus.
- A control method for an information processing apparatus comprising a sensor that detects an attitude, the method comprising:
a step of determining a transmission destination device based on the attitude of the information processing apparatus specified based on a detection result of the sensor;
a step of determining the content of data to be transmitted to the transmission destination device based on the attitude of the information processing apparatus specified based on a detection result of the sensor at a timing different from the timing at which the transmission destination device was determined; and
a step of transmitting the data to the transmission destination device.
- A control method for an information processing apparatus comprising a sensor that detects an attitude, the method comprising:
a step of receiving data from a transmission apparatus, the transmission apparatus comprising a transmission destination device determination unit that determines a transmission destination device based on an attitude of the transmission apparatus specified based on a detection result of a sensor, included in the transmission apparatus, that detects the attitude of the transmission apparatus, a transmission data determination unit that determines the content of the data to be transmitted to the transmission destination device based on an attitude of the transmission apparatus specified based on a detection result of that sensor at a timing different from the timing at which the transmission destination device was determined, and a transmission unit that transmits the data to the transmission destination device;
a step of specifying the attitude of the information processing apparatus based on a detection result of the sensor; and
a step of executing processing according to the received data and the specified attitude of the information processing apparatus.
- A program that causes a computer comprising a sensor that detects an attitude to execute:
a procedure of determining a transmission destination device based on the attitude of the computer specified based on a detection result of the sensor;
a procedure of determining the content of data to be transmitted to the transmission destination device based on the attitude of the computer specified based on a detection result of the sensor at a timing different from the timing at which the transmission destination device was determined; and
a procedure of transmitting the data to the transmission destination device.
- A program that causes a computer comprising a sensor that detects an attitude to execute:
a procedure of receiving data from a transmission apparatus, the transmission apparatus comprising a transmission destination device determination unit that determines a transmission destination device based on an attitude of the transmission apparatus specified based on a detection result of a sensor, included in the transmission apparatus, that detects the attitude of the transmission apparatus, a transmission data determination unit that determines the content of the data to be transmitted to the transmission destination device based on an attitude of the transmission apparatus specified based on a detection result of that sensor at a timing different from the timing at which the transmission destination device was determined, and a transmission unit that transmits the data to the transmission destination device;
a procedure of specifying the attitude of the computer based on a detection result of the sensor; and
a procedure of executing processing according to the received data and the specified attitude of the computer.
- A computer-readable information storage medium storing the program according to claim 9 or 10.
Priority Applications (4)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201480034873.2A CN105324737B (zh) | 2013-06-26 | 2014-06-26 | 信息处理器、信息处理器的控制方法、程序和信息贮存介质 |
EP14817896.5A EP3015954B1 (en) | 2013-06-26 | 2014-06-26 | Information processing device, control method for information processing device, program, and information storage medium |
US14/898,611 US10376777B2 (en) | 2013-06-26 | 2014-06-26 | Information processor, control method of information processor, program, and information storage medium |
JP2015524117A JP5921015B2 (ja) | 2013-06-26 | 2014-06-26 | 情報処理装置、情報処理装置の制御方法、プログラム及び情報記憶媒体 |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2013134188 | 2013-06-26 | ||
JP2013-134188 | 2013-06-26 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2014208689A1 true WO2014208689A1 (ja) | 2014-12-31 |
Family
ID=52142015
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2014/067049 WO2014208689A1 (ja) | 2013-06-26 | 2014-06-26 | 情報処理装置、情報処理装置の制御方法、プログラム及び情報記憶媒体 |
Country Status (5)
Country | Link |
---|---|
US (1) | US10376777B2 (ja) |
EP (1) | EP3015954B1 (ja) |
JP (1) | JP5921015B2 (ja) |
CN (1) | CN105324737B (ja) |
WO (1) | WO2014208689A1 (ja) |
Cited By (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2019517049A (ja) * | 2016-03-31 | 2019-06-20 | マジック リープ, インコーポレイテッドMagic Leap,Inc. | 姿勢および複数のdofコントローラを用いた3d仮想オブジェクトとの相互作用 |
Families Citing this family (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US9420251B2 (en) * | 2010-02-08 | 2016-08-16 | Nikon Corporation | Imaging device and information acquisition system in which an acquired image and associated information are held on a display |
CN107913515B (zh) * | 2017-10-25 | 2019-01-08 | 网易(杭州)网络有限公司 | 信息处理方法及装置、存储介质、电子设备 |
CN110448891B (zh) * | 2019-08-08 | 2021-06-25 | 腾讯科技(深圳)有限公司 | 控制虚拟对象操作远程虚拟道具的方法、装置及存储介质 |
Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2008272123A (ja) * | 2007-04-26 | 2008-11-13 | Namco Bandai Games Inc | プログラム、情報記憶媒体及びゲーム装置 |
JP2010063616A (ja) * | 2008-09-10 | 2010-03-25 | Konami Digital Entertainment Co Ltd | ゲーム装置、オブジェクト選択方法、および、プログラム |
WO2011019049A1 (ja) * | 2009-08-13 | 2011-02-17 | 株式会社コナミデジタルエンタテインメント | 端末装置、端末方法、情報記憶媒体、ならびに、プログラム |
Family Cites Families (15)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20020091003A1 (en) * | 2001-01-11 | 2002-07-11 | Beken Robert A. | Multi-player electronic entertainment system |
US20020111201A1 (en) * | 2001-02-13 | 2002-08-15 | Lang Brook W. | Location-based game system |
US8469824B1 (en) * | 2004-09-27 | 2013-06-25 | Hasbro, Inc. | Device and method for an electronic tag game |
US20070037625A1 (en) * | 2005-06-28 | 2007-02-15 | Samsung Electronics Co., Ltd. | Multiplayer video gaming system and method |
JP4805633B2 (ja) | 2005-08-22 | 2011-11-02 | 任天堂株式会社 | ゲーム用操作装置 |
US20110151955A1 (en) * | 2009-12-23 | 2011-06-23 | Exent Technologies, Ltd. | Multi-player augmented reality combat |
JP5840385B2 (ja) * | 2010-08-30 | 2016-01-06 | Nintendo Co., Ltd. | Game system, game device, game program, and game processing method |
US8907983B2 (en) * | 2010-10-07 | 2014-12-09 | Aria Glassworks, Inc. | System and method for transitioning between interface modes in virtual and augmented reality applications |
JP5256269B2 (ja) * | 2010-10-28 | 2013-08-07 | Konami Digital Entertainment Co., Ltd. | Data generation device, control method of data generation device, and program |
US9561443B2 (en) | 2011-03-08 | 2017-02-07 | Nintendo Co., Ltd. | Computer-readable storage medium, information processing system, and information processing method |
EP2497543A3 (en) * | 2011-03-08 | 2012-10-03 | Nintendo Co., Ltd. | Information processing program, information processing system, and information processing method |
US9155964B2 (en) | 2011-09-14 | 2015-10-13 | Steelseries Aps | Apparatus for adapting virtual gaming with real world information |
CN202427456U (zh) * | 2011-12-22 | 2012-09-12 | Dongguan Hongzhan Instrument Co., Ltd. | Walk-in environmental test chamber |
CN202427156U (zh) | 2012-01-18 | 2012-09-12 | Shenzhen Hezhichuangying Electronics Co., Ltd. | Multi-purpose game controller and system with attitude sensing |
US20180043263A1 (en) * | 2016-08-15 | 2018-02-15 | Emmanuel Brian Cao | Augmented Reality method and system for line-of-sight interactions with people and objects online |
- 2014-06-26 EP EP14817896.5A patent/EP3015954B1/en active Active
- 2014-06-26 CN CN201480034873.2A patent/CN105324737B/zh active Active
- 2014-06-26 JP JP2015524117A patent/JP5921015B2/ja active Active
- 2014-06-26 WO PCT/JP2014/067049 patent/WO2014208689A1/ja active Application Filing
- 2014-06-26 US US14/898,611 patent/US10376777B2/en active Active
Non-Patent Citations (1)
Title |
---|
See also references of EP3015954A4 * |
Cited By (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2019517049A (ja) * | 2016-03-31 | 2019-06-20 | Magic Leap, Inc. | Interactions with 3D virtual objects using poses and multiple-DOF controllers |
US11657579B2 (en) | 2016-03-31 | 2023-05-23 | Magic Leap, Inc. | Interactions with 3D virtual objects using poses and multiple-DOF controllers |
Also Published As
Publication number | Publication date |
---|---|
CN105324737A (zh) | 2016-02-10 |
JPWO2014208689A1 (ja) | 2017-02-23 |
EP3015954A1 (en) | 2016-05-04 |
EP3015954B1 (en) | 2022-05-25 |
JP5921015B2 (ja) | 2016-05-24 |
EP3015954A4 (en) | 2017-02-15 |
US10376777B2 (en) | 2019-08-13 |
CN105324737B (zh) | 2019-06-21 |
US20160129344A1 (en) | 2016-05-12 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US11703993B2 (en) | Method, apparatus and device for view switching of virtual environment, and storage medium | |
US8246467B2 (en) | Interactive gaming with co-located, networked direction and location aware devices | |
US9086724B2 (en) | Display control system, display control method, computer-readable storage medium having stored thereon display control program, and display control apparatus | |
KR102680606B1 (ko) | Method and apparatus for driving a vehicle in a virtual environment, and terminal and storage medium | |
WO2021203856A1 (zh) | Data synchronization method and apparatus, terminal, server, and storage medium | |
CN108671543A (zh) | Method for displaying marker elements in a virtual scene, computer device, and storage medium | |
US9904982B2 (en) | System and methods for displaying panoramic content | |
JP5396620B2 (ja) | Information processing program and information processing device | |
JP5921015B2 (ja) | Information processing device, control method for information processing device, program, and information storage medium | |
AU2020428058B2 (en) | Method and apparatus for skill aiming in three-dimensional virtual environment, device and storage medium | |
JP6581341B2 (ja) | Information processing device, information processing program, information processing method, and information processing system | |
US20230046750A1 (en) | Virtual prop control method and apparatus, computer device, and storage medium | |
CN110738738B (zh) | Virtual object marking method, device, and storage medium in a three-dimensional virtual scene | |
CN111265857A (zh) | Ballistic trajectory control method, apparatus, device, and storage medium in a virtual scene | |
CN111589127A (zh) | Virtual character control method, apparatus, device, and storage medium | |
JP2022524802A (ja) | Method and apparatus for applying a scope in a virtual environment, computer device, and program | |
CN112330823B (zh) | Virtual prop display method, apparatus, device, and readable storage medium | |
JP2024509064A (ja) | Position mark display method and apparatus, device, and computer program | |
JP2024029117A (ja) | Virtual object control method, apparatus, device, and computer program | |
US8708818B2 (en) | Display control system, display control method, computer-readable storage medium having stored thereon display control program, and display control apparatus | |
TW202217539A (zh) | Method and apparatus for determining a selected target, computer device, non-transitory computer-readable storage medium, and computer program product | |
CN113134232A (zh) | Virtual object control method, apparatus, device, and computer-readable storage medium | |
US9126110B2 (en) | Control device, control method, and program for moving position of target in response to input operation | |
CN109806594B (zh) | Ballistic trajectory display method, apparatus, and device in a virtual environment | |
CN114011073A (zh) | Method, apparatus, device, and computer-readable storage medium for controlling a vehicle | |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
WWE | Wipo information: entry into national phase |
Ref document number: 201480034873.2 Country of ref document: CN |
|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 14817896 Country of ref document: EP Kind code of ref document: A1 |
|
ENP | Entry into the national phase |
Ref document number: 2015524117 Country of ref document: JP Kind code of ref document: A |
|
WWE | Wipo information: entry into national phase |
Ref document number: 2014817896 Country of ref document: EP |
|
WWE | Wipo information: entry into national phase |
Ref document number: 14898611 Country of ref document: US |
|
NENP | Non-entry into the national phase |
Ref country code: DE |