WO2015159561A1 - Information processing device, information processing system, and information processing method - Google Patents
- Publication number
- WO2015159561A1 (PCT/JP2015/050682)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- information processing
- real object
- real
- virtual
- processing unit
- Prior art date
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T19/00—Manipulating 3D models or images for computer graphics
- G06T19/006—Mixed reality
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63F—CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
- A63F13/00—Video games, i.e. games using an electronically generated display having two or more dimensions
- A63F13/20—Input arrangements for video game devices
- A63F13/21—Input arrangements for video game devices characterised by their sensors, purposes or types
- A63F13/213—Input arrangements for video game devices characterised by their sensors, purposes or types comprising photodetecting means, e.g. cameras, photodiodes or infrared cells
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63F—CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
- A63F13/00—Video games, i.e. games using an electronically generated display having two or more dimensions
- A63F13/40—Processing input control signals of video game devices, e.g. signals generated by the player or derived from the environment
- A63F13/42—Processing input control signals of video game devices, e.g. signals generated by the player or derived from the environment by mapping the input signals into game commands, e.g. mapping the displacement of a stylus on a touch screen to the steering angle of a virtual vehicle
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63F—CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
- A63F13/00—Video games, i.e. games using an electronically generated display having two or more dimensions
- A63F13/50—Controlling the output signals based on the game progress
- A63F13/52—Controlling the output signals based on the game progress involving aspects of the displayed game scene
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63F—CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
- A63F13/00—Video games, i.e. games using an electronically generated display having two or more dimensions
- A63F13/55—Controlling game characters or game objects based on the game progress
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63F—CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
- A63F13/00—Video games, i.e. games using an electronically generated display having two or more dimensions
- A63F13/60—Generating or modifying game content before or while executing the game program, e.g. authoring tools specially adapted for game development or game-integrated level editor
- A63F13/65—Generating or modifying game content before or while executing the game program, e.g. authoring tools specially adapted for game development or game-integrated level editor automatically by game devices or servers from real world data, e.g. measurement in live racing competition
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63H—TOYS, e.g. TOPS, DOLLS, HOOPS OR BUILDING BLOCKS
- A63H11/00—Self-movable toy figures
- A63H11/10—Figure toys with single- or multiple-axle undercarriages, by which the figures perform a realistic running motion when the toy is moving over the floor
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63H—TOYS, e.g. TOPS, DOLLS, HOOPS OR BUILDING BLOCKS
- A63H17/00—Toy vehicles, e.g. with self-drive; Cranes, winches or the like; Accessories therefor
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/011—Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63H—TOYS, e.g. TOPS, DOLLS, HOOPS OR BUILDING BLOCKS
- A63H2200/00—Computerized interactive toys, e.g. dolls
Definitions
- The present invention relates to information processing technology using an object in real space.
- The above-described conventional technology is based on relatively simple image display, in which an object in real space is shown as it is, or in which part of the object is replaced by an object drawn with computer graphics.
- The present invention has been made in view of this problem, and its object is to provide a novel technique in which the movement of an object in the real world and image display are coordinated.
- An aspect of the present invention relates to an information processing apparatus.
- The information processing apparatus includes an information processing unit that performs information processing according to at least one of a user operation that moves a real object existing in the real world and a user operation using a connected input device, and a display processing unit that draws a virtual space in which a virtual object exists as a result of the information processing and displays it on a display device.
- The information processing unit links the real object and the virtual object by at least one of: real object control processing, which generates a control signal for making the real object follow when the virtual object is moved by a user operation using the input device and transmits the signal to the real object; and operation processing on the virtual space, which makes the virtual object follow when a user operation moves the real object. When a situation matching a preset situation in which linkage is difficult is detected, the information processing unit continues subsequent processing by executing a corresponding process selected according to the situation.
- In another aspect, the information processing apparatus includes an information processing unit that performs information processing according to a user operation that moves a real object existing in the real world, and a display processing unit that draws a virtual space in which a virtual object exists as a result of the information processing and displays it on a display device.
- When a user operation moves the real object, the information processing unit links the real object and the virtual object by operation processing on the virtual space that makes the virtual object follow.
- A prohibited area in which the real object cannot be placed is determined in the real world, and the display processing unit further generates an image that graphically represents the position of the prohibited area on the plane where the real object is placed, and either projects it onto that plane with a projector or displays it on the screen of a tablet computer that constitutes the plane.
- Still another aspect of the present invention also relates to an information processing apparatus.
- This information processing apparatus includes an information processing unit that performs information processing according to a user operation that moves a real object existing in the real world and a user operation using a connected input device, and a display processing unit that draws a virtual space in which a virtual object exists and displays it on a display device.
- When a user operation moves the real object, the information processing unit performs operation processing on the virtual space that makes the virtual object follow.
- The display processing unit switches, according to the operation means used by the user, between an image generated as a result of the operation processing on the virtual space and an image generated as a result of information processing different from that processing, and displays it on the display device.
- Still another aspect of the present invention relates to an information processing system.
- This information processing system includes an information processing device and a real object that can be moved by a user or controlled by a control signal from the information processing device.
- The information processing device includes an information processing unit that performs information processing according to at least one of a user operation that moves the real object and a user operation using a connected input device, and a display processing unit that draws a virtual space in which a virtual object exists as a result of the information processing and displays it on a display device.
- The information processing unit links the real object and the virtual object by at least one of: real object control processing, which generates a control signal for making the real object follow when the virtual object is moved by a user operation using the input device and transmits the signal to the real object; and operation processing on the virtual space, which makes the virtual object follow when a user operation moves the real object. When a situation matching a preset situation in which linkage is difficult is detected, the information processing unit continues subsequent information processing by executing a corresponding process selected according to the situation.
- Still another aspect of the present invention relates to an information processing method.
- The information processing method includes a step in which an information processing apparatus performs information processing according to at least one of a user operation that moves a real object existing in the real world and a user operation using a connected input device, and a step of drawing a virtual space in which a virtual object exists as a result and displaying it on a display device.
- The step of performing information processing includes a step of linking the real object and the virtual object by at least one of: real object control processing, which generates a control signal for making the real object follow when the virtual object is moved by a user operation using the input device and transmits the signal to the real object; and operation processing on the virtual space, which makes the virtual object follow when a user operation moves the real object; and a step of, when a situation matching a preset situation in which linkage is difficult is detected, continuing subsequent processing by executing a corresponding process selected according to the situation.
- FIG. 1 shows a configuration example of an information processing system to which this embodiment can be applied.
- The information processing system 1 includes real objects 120a and 120b placed on the play field 19, a camera 122 that captures the real objects 120a and 120b, an information processing apparatus 10 that performs predetermined information processing, an input device 14 that receives user operations, and a display device 16 that displays the data output from the information processing apparatus 10 as an image.
- The information processing apparatus 10 may be, for example, a game device or a personal computer, and may implement the information processing function by loading a necessary application program.
- The information processing apparatus 10 may establish communication with another information processing apparatus or a server via the network 18 as necessary, and send and receive necessary information.
- The display device 16 may be a general display such as a liquid crystal display, a plasma display, or an organic EL display, or a television provided with such a display and a speaker. The display device 16 may also be a projector that projects an image; in that case, as described later, the projection plane may be the play field 19.
- A tablet PC or the like may also be used as the display device 16; by laying the tablet PC flat, its screen may serve as the play field 19.
- The display device 16 may be a combination of a plurality of general displays, projectors, and tablet PCs. The display device 16 need not be a television provided with a speaker; a sound reproducing device (not shown) for outputting predetermined sound may be provided separately.
- The camera 122 is a digital video camera including an image sensor such as a CCD (Charge-Coupled Device) or CMOS (Complementary Metal-Oxide-Semiconductor), and captures a moving image of a space that includes at least the play field 19 and the real objects 120a and 120b on it.
- The input device 14 may be any one or a combination of general input devices such as a game controller, a keyboard, a mouse, a joystick, and a touch pad provided on the screen of the display device 16. Connections between the information processing apparatus 10 and the camera 122, the input device 14, and the display device 16 may be wired or wireless, and may pass through various networks. Alternatively, any two or more of the camera 122, the information processing apparatus 10, the input device 14, and the display device 16, or all of them, may be combined and provided integrally. The camera 122 does not necessarily have to be mounted on the display device 16.
- The play field 19 is a plane that defines the area in which the real objects 120a and 120b are placed, so that the information processing apparatus 10 recognizes them as processing targets and defines their position coordinates; it may be a board, cloth, desktop, game board, or the like.
- The real objects 120a and 120b may have a simple shape as shown in the figure, or a more complicated shape such as a miniature of a real-world object, e.g. a doll or a minicar, a part of one, or a game piece. The size, material, color, and number of the real objects 120a and 120b are not limited. Each may be a structure assembled by the user or a finished product.
- The real objects 120a and 120b establish communication with the information processing apparatus 10 as necessary.
- The connection may be made using a wireless interface such as Bluetooth (registered trademark) or IEEE 802.11, or via a cable.
- The real objects 120a and 120b do not necessarily need to have a communication function.
- Each of the real objects 120a and 120b includes wheels 123a and 123b for moving in response to a request from the information processing apparatus 10.
- The type and mechanism of the operation are not limited to this: a change in joint angle, vibration, light emission, sound output, or the like may be used, and a plurality of types of operations may be combined.
- Hereinafter, the real objects 120a and 120b may be collectively referred to as the real object 120.
- A real object 120 that is associated in advance and an object in the display screen are basically linked.
- In the example of the figure, the real object 120a is associated with the first character 202a on the screen, and the real object 120b is associated with the second character 202b on the screen.
- For example, when the first character 202a moves to the right on the screen, the real object 120a also moves to the right on the play field 19; when the real object 120a moves forward, the first character 202a in the screen also moves forward.
- Which movement is reflected in the other is determined by the program creator or the like according to the contents and functions of the information processing.
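The two linkage directions described above can be sketched as follows. This is a minimal illustration, not from the patent text: the class names and the `SCALE` factor relating play-field coordinates to screen coordinates are assumptions.

```python
SCALE = 10.0  # virtual-world units per play-field unit (assumed for illustration)

class Character:
    """Object in the display screen."""
    def __init__(self):
        self.pos = (0.0, 0.0)

class RealObject:
    """Object on the play field."""
    def __init__(self):
        self.pos = (0.0, 0.0)

def reflect_real_to_screen(real, char):
    """The user moved the real object: the associated character follows."""
    char.pos = tuple(c * SCALE for c in real.pos)

def reflect_screen_to_real(char, real):
    """The character was moved via the input device: the real object follows."""
    real.pos = tuple(c / SCALE for c in char.pos)
```

Either function may drive the other side on each update, depending on which operation means the user is currently using.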
- FIG. 2 shows the configuration of the real object 120 and the information processing apparatus 10 in detail.
- In FIG. 2, each element described as a functional block for performing various processes can be configured, in hardware, with a CPU (Central Processing Unit), a memory, a microprocessor, other LSIs, actuators, sensors, and the like, and, in software, is realized by a program loaded into a memory. Therefore, it is understood by those skilled in the art that these functional blocks can be realized in various forms by hardware only, software only, or a combination thereof, and they are not limited to any one of these.
- The real object 120 includes a state information acquisition unit 143 that acquires state information obtainable inside the real object itself, a drive unit 146 that operates according to a control signal from the information processing apparatus 10, and a communication unit 148 that transmits and receives necessary information to and from the information processing apparatus 10.
- The real object 120 may instead be a flat card, a doll with no internal structure, a game piece, or a block. Even in this case, the image of the real object can be accurately identified in the captured image by varying the exterior, for example by changing the color, pattern, or shape, or by printing a two-dimensional barcode on it.
- The state information acquisition unit 143 is a sensor that internally measures the state of the real object 120 itself; a sensor from which the necessary information can be obtained is selected as appropriate, in view of the form of the real object 120 and what kind of its movement is to be reflected in the object on the screen. For example, as shown in FIG. 1, when the movement of the real object 120 accompanied by wheel rotation is to be reflected in the movement of a character on the screen, the movement amount and movement direction are identified by providing a rotary encoder and a steering angle sensor on the wheels.
- The state information acquisition unit 143 may be a position sensor that acquires the absolute position of the real object 120, or a motion sensor such as an acceleration sensor, a gyro sensor, or a geomagnetic sensor.
- A potentiometer that specifies a joint angle may also be introduced.
- The more varied the sensors introduced, the smaller the movements and shape changes of the real object 120 that can be captured and reflected in the object on the screen.
- The following description focuses on a mode in which the movement of the real object 120 and the movement of the object on the screen are linked. In this case, the position of the real object 120 can be specified sequentially by analyzing the image captured by the camera 122, without providing the state information acquisition unit 143 in the real object 120. Whether the state information acquisition unit 143 is necessary may therefore be decided in view of the required position detection accuracy and the manufacturing cost.
- The drive unit 146 includes an actuator that moves the real object 120 according to a control signal from the information processing apparatus 10. When the real object 120 is moved by wheel rotation as shown in FIG. 1, the actuator rotates the axle or changes the steering angle, so that the real object 120 moves in the direction and by the distance indicated by the control signal from the information processing apparatus 10.
- The drive unit 146 may also include an actuator that generates motion other than movement, a vibrator that vibrates the real object 120, a light emitter such as a light bulb, image data and a display for displaying an image, and audio data and a speaker for outputting sound. These mechanisms are likewise operated by control signals from the information processing apparatus 10.
- The communication unit 148 acquires the state information of the real object 120 from the state information acquisition unit 143 and transmits it sequentially to the information processing unit 30. It also receives the control signal for operating the real object 120 transmitted from the information processing apparatus 10 and notifies the drive unit 146 of it.
- The communication unit 148 holds the individual identification information of its own real object 120 in an internal memory. When transmitting information to the information processing apparatus 10, it adds the individual identification information so that the information processing apparatus 10 can determine the transmission source. It also determines, based on the individual identification information transmitted together with a control signal, whether a control signal transmitted from the information processing apparatus 10 is addressed to its own real object 120.
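The ID handling just described can be sketched as follows. The JSON message format and the ID value `"0001"` are assumptions made for illustration; the patent does not specify a wire format.

```python
import json

MY_ID = "0001"  # individual identification information held by this real object (assumed value)

def make_state_message(state):
    """Attach the individual ID so the apparatus can tell who sent the state."""
    return json.dumps({"id": MY_ID, "state": state})

def accept_control_signal(raw):
    """Act only on control signals addressed to this real object; ignore the rest."""
    msg = json.loads(raw)
    return msg["signal"] if msg.get("id") == MY_ID else None
```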
- The information processing apparatus 10 includes a communication unit 20 that transmits and receives necessary information to and from the real object 120, a real object state specifying unit 22 that specifies the state of the real object 120 based on the image captured by the camera 122 and the information transmitted from the real object 120, an information processing unit 30 that performs predetermined information processing in response to a user operation on the real object 120 or the input device 14, and a display processing unit 32 that generates an image to be displayed as a result of the information processing and outputs it to the display device 16.
- The information processing apparatus 10 further includes a real object information storage unit 24 that stores information related to each real object 120, and a linkage scenario storage unit 26 that stores scenarios for realizing the linkage between the real object and the object on the screen.
- The communication unit 20 receives the state information and supplies it to the real object state specifying unit 22. When the real object 120 is to be operated by a control signal from the information processing apparatus 10, the communication unit 20 acquires the control signal from the information processing unit 30 and transmits it to the real object 120.
- The real object state specifying unit 22 acquires image frames of the captured image from the camera 122 in real time and analyzes them to specify the position of the real object 120 at predetermined time intervals.
- When the camera 122 is a stereo camera, the absolute position of a real object can be acquired in a three-dimensional space composed of the depth direction with respect to the camera 122 and the viewing plane of the camera.
- A technique for acquiring the position of a target object in three-dimensional space based on the principle of triangulation, using the parallax between images taken by a stereo camera from different left and right viewpoints, is widely known.
- Depth or three-dimensional information acquisition means other than binocular stereoscopic vision may be used.
- For example, a viewpoint-moving camera may be used, or the position of the real object may be specified by a TOF (Time Of Flight) method using an infrared irradiation mechanism and an infrared sensor that detects the reflected light.
- Alternatively, a touch panel may be provided on the top surface of the table on which the real object 120 is placed, and the position at which it is placed may be detected by the touch panel.
- The real object 120 may also be tracked over time using an existing visual tracking technique.
- The movement, position, shape, posture, and the like may be specified in detail using the state information transmitted from the real object 120.
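For the widely known triangulation principle mentioned above, the depth of a point seen by a rectified stereo pair follows Z = f·B/d, where f is the focal length in pixels, B the baseline between the two cameras, and d the disparity. The numbers below are illustrative only, not from the patent.

```python
def depth_from_disparity(focal_px, baseline, disparity_px):
    """Triangulation for a rectified stereo pair: Z = f * B / d.

    focal_px     -- focal length in pixels
    baseline     -- distance between the two camera centers
    disparity_px -- horizontal shift of the same point between left and right images
    """
    if disparity_px <= 0:
        raise ValueError("point must be visible in both views with positive disparity")
    return focal_px * baseline / disparity_px
```

For example, with a 700-pixel focal length, a 0.1 m baseline, and a disparity of 35 pixels, the point lies 2 m from the cameras.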
- The real object information storage unit 24 stores information that associates the appearance characteristics of each real object 120 with its individual identification information.
- The appearance characteristics include the color, shape, pattern, size, and the like of the real object 120.
- A two-dimensional barcode may be used, or a marker that emits light in a specific color may be attached. By comparing the individual identification information added to the information transmitted from the real object 120 with the individual identification information corresponding to the appearance characteristics of an image detected in the captured image, the image in the captured image is associated with the transmitted state information.
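A hypothetical sketch of the matching step just described: features detected in the captured image are compared against the registered appearance characteristics to recover the individual identification information. The feature values here are invented for illustration.

```python
# Registered appearance characteristics keyed by individual identification
# information (values are assumptions for illustration).
REGISTERED = {
    "0001": {"shape": "cylinder", "color": "red"},
    "0002": {"shape": "cube", "color": "blue"},
}

def identify(detected_features):
    """Return the individual ID whose registered appearance matches the
    features detected in the captured image, or None if nothing matches."""
    for obj_id, feats in REGISTERED.items():
        if all(detected_features.get(k) == v for k, v in feats.items()):
            return obj_id
    return None
```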
- The information processing unit 30 executes processing according to the movement of the real object 120 moved by the user, or according to user operations via the input device 14.
- In the present embodiment, various games are given as examples, but the processing performed by the information processing unit 30 is not limited to them.
- The information processing unit 30 executes processing involving linkage between the real object 120 and the object on the screen. Therefore, the correspondence between the real object 120 and the object on the screen is also stored in the real object information storage unit 24.
- When the object on the screen is a character appearing in a game executed by the information processing unit 30, model data for each object, its role in the game, and the like are set separately, as in a general computer game.
- The linkage scenario storage unit 26 stores scenarios necessary for linking the real object 120 and the object on the screen.
- A scenario defines a workaround that makes the real object and the object appear linked when a situation arises in which linking them is physically difficult, or difficult in terms of the game settings.
- The information processing unit 30 selects an appropriate scenario from the linkage scenario storage unit 26 and executes it according to the situation at the time, such as the game settings, the progress, and the position of the real object 120. Specific examples will be described later.
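The scenario selection can be pictured as a simple lookup from a detected situation to a corresponding process. The situation names and responses below are invented for illustration; the patent only states that a process is selected according to the situation.

```python
# Hypothetical scenario table: preset difficult situations -> corresponding process.
LINKAGE_SCENARIOS = {
    "real_object_blocked": "move the virtual object only and pause linkage",
    "prohibited_area": "present an alternative route to the user",
}

def select_scenario(situation):
    """Return the corresponding process for a preset difficult situation,
    or None when linkage can simply continue."""
    return LINKAGE_SCENARIOS.get(situation)
```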
- When operating the real object 120, the information processing unit 30 generates a necessary control signal and transmits it to the real object 120 via the communication unit 20.
- By adding the individual identification information of the destination, the destination real object 120 can determine that the signal is addressed to itself.
- The signal to be transmitted varies depending on the control method, and a technique generally used in fields such as robot engineering may be employed as appropriate.
- The display processing unit 32 creates, at a predetermined rate, image data such as a game screen including objects linked with the real object 120, setting screens necessary for the game, and menu screens, and outputs it to the display device 16 as a video signal.
- An image captured by the camera 122 may be used as part of the game screen.
- The display processing unit 32 may further generate an image to be projected onto the play field 19 and output the data to a projector included in the display device 16.
- When the play field 19 is constituted by the screen of a tablet PC, an image to be displayed on it may be generated and output.
- FIG. 3 is a diagram explaining an example of a technique by which the information processing unit 30 links the real object 120 and the object on the screen.
- The upper part of the figure is the virtual space 3 constructed by the information processing unit 30, and the lower part is the real space 5 in which the real objects 120a and 120b exist.
- The information processing unit 30 defines a three-dimensional coordinate system of the real space, with a predetermined position in the real space 5 as the origin.
- In the example of the figure, one corner of the play field 19 is the origin, the plane of the play field 19 is the xy plane, and the vertically upward direction is the z axis.
- Correspondingly, a world coordinate system (XYZ coordinates) of the virtual space expressed on the screen is set.
- The position coordinates of the real objects 120a and 120b in the coordinate system of the real space 5 are determined based on the positions of their images in the captured image. The object of the first character 202a and the object of the second character 202b are then arranged at the corresponding positions in the world coordinate system of the virtual space, and objects that should exist around them are also arranged as necessary. By setting a virtual viewpoint for the virtual space 3 constructed in this way and projecting each object viewed from it onto a predetermined plane, a screen can be displayed in which the first character 202a and the second character 202b are arranged at positions corresponding to the positions of the real objects 120a and 120b.
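The placement step above can be sketched as a coordinate mapping. Assuming, for illustration, the simplest relationship between the two systems — a uniform scale with aligned axes — a measured real-space position maps to world coordinates as follows; the `SCALE` value and function names are assumptions.

```python
SCALE = 20.0  # world-coordinate units per real-space unit (assumed)

def real_to_world(p):
    """Map a real-space position (x, y, z), origin at one corner of the
    play field, to virtual-space world coordinates (X, Y, Z)."""
    x, y, z = p
    return (x * SCALE, y * SCALE, z * SCALE)

def place_character(character, real_pos):
    """Arrange a character object (anything with a .pos attribute) at the
    position corresponding to the measured real-object position."""
    character.pos = real_to_world(real_pos)
```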
- When the first character 202a is moved by a user operation using the input device, its moving velocity vector in the world coordinate system is converted into a velocity vector in the three-dimensional coordinate system of the real space 5, and the direction and amount by which the real object 120a should move are obtained sequentially. By moving the real object 120a accordingly, the movement of the first character 202a can be reflected in the movement of the real object 120a.
- Conversely, when the user moves the real object 120a, its moving velocity vector in the three-dimensional coordinate system of the real space 5 is converted into a velocity vector in the world coordinate system of the virtual space 3, and the direction and amount by which the first character 202a should move are obtained sequentially. By moving the first character 202a accordingly and drawing the result as the display screen, the movement of the real object 120a can be reflected in the movement of the first character 202a on the screen.
- The scale ratio and orientation relationship between the three-dimensional coordinate system of the real space 5 and the world coordinate system of the virtual space 3 are determined as appropriate from the viewpoint of expression, depending on the ratio between the area of the play field 19 and the area of the virtual world to be represented on the screen, and on whether the virtual world is to be displayed in the same orientation as the real space or left-right reversed.
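The velocity conversions described above can be sketched under the same simplifying assumptions: a uniform scale between the two systems and an optional left-right mirror. Both parameters are illustrative choices, not values from the patent.

```python
SCALE = 20.0      # world-coordinate units per real-space unit (assumed)
MIRROR_X = False  # True if the virtual world is displayed left-right reversed

def world_to_real_velocity(v):
    """Character velocity in the world system -> real-object velocity."""
    x, y, z = v
    x = -x if MIRROR_X else x
    return (x / SCALE, y / SCALE, z / SCALE)

def real_to_world_velocity(v):
    """Real-object velocity -> character velocity in the world system."""
    x, y, z = v
    x = -x if MIRROR_X else x
    return (x * SCALE, y * SCALE, z * SCALE)
```

With these two functions, either operation means can drive the other side by sequentially converting the measured velocity vector each frame.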
- The method shown in FIG. 3 assumes that a complete virtual world is constructed on the screen using three-dimensional graphics, but if the captured image is used as the background of the display image, the processing becomes simpler. That is, the contour of the real object 120 in the captured image may be specified for each image frame by visual tracking processing or the like, and the object to be linked may be displayed superimposed at the position of the contour. In this case as well, objects that should exist in the surrounding area may be superimposed.
- FIG. 4 shows an example of the data structure of information related to a real object stored in the real object information storage unit 24.
- The real object information 50 includes an individual identification information column 52, a shape column 54, a color column 56, a size column 58, and a corresponding object column 60.
- The individual identification information column 52 stores the identification information given to each real object 120. This identification information is shared with the identification information held inside each real object 120.
- The shape column 54, the color column 56, and the size column 58 store the shape, color, and size of each real object 120 in a predetermined format.
- The real object state specifying unit 22 identifies the individual identification information of each real object 120 by collating an image in the captured image with these pieces of information, or specifies detailed features that are difficult to identify from the captured image. Since the correspondence between the individual identification information and the shape, color, and size is known when the individual identification information is given to the real object, it is set when the hardware is manufactured or when the game software is created.
- in the corresponding object column 60, identification information of the objects in the screen to be linked with the real objects 120 is stored. In the illustrated example, object names are stored: the real object with individual identification information "0001" is associated with the character object "Prince", and the real object "0002" with the character object "Monster 1".
- each object is separately associated with an object model and incorporated in the game program. When a dedicated real object is manufactured for each object, the correspondence between the individual identification information and the object is fixed, and is therefore set when the hardware is manufactured or when the game software is created.
- alternatively, the user may set the correspondence when the game starts.
- in that case, the user places the real object 120 to be set on the play field 19 and has the camera 122 capture it so that the information processing apparatus 10 recognizes it, and then selects the corresponding character (object) on a selection screen displayed on the display device 16.
- the information processing unit 30 associates them and stores them in the real object information storage unit 24 as shown in the figure.
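The real-object information of FIG. 4 can be pictured as one record per real object, keyed by its individual identification information. This is an illustrative sketch only; the field names and values are assumptions, not from the patent.

```python
# Sketch of the real-object information table of FIG. 4: one record per
# real object 120, keyed by individual identification information
# (column 52). Field names and example values are assumptions.

from dataclasses import dataclass

@dataclass
class RealObjectInfo:
    object_id: str      # individual identification information (column 52)
    shape: str          # shape column 54
    color: str          # color column 56
    size_mm: int        # size column 58
    linked_object: str  # corresponding in-screen object (column 60)

real_object_info = {
    "0001": RealObjectInfo("0001", "cylinder", "red", 60, "Prince"),
    "0002": RealObjectInfo("0002", "cube", "green", 80, "Monster 1"),
}

# Looking up which on-screen character a detected real object drives:
print(real_object_info["0001"].linked_object)  # → Prince
```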
- FIG. 5 schematically shows the information transmission system in the present embodiment.
- the user performs an operation on a game being executed on the information processing apparatus 10 via the input device 14 (arrows A and B).
- the game progresses, and a game screen with a change corresponding to the game is displayed on the display device 16 (arrow C).
- the information processing apparatus 10 moves the real object 120 in the same manner as the object in the game screen (arrow D).
- in the second pattern, when the user moves the real object 120 directly, the information processing apparatus 10 recognizes the movement (arrows E and F). The information processing apparatus 10 then moves the corresponding object in the game screen displayed on the display device 16 in the same manner as the real object 120 (arrow C). Because both patterns are distinctive modes in which the real object and the object in the screen are linked, inconsistencies may arise that cannot occur in a general computer game or in ordinary toy play.
- FIG. 6 illustrates the relationship between the display screen and the real object in the first pattern.
- This example assumes a battle game, and on the screen 200, it is assumed that the first character 202a and the second character 202b are in a battle.
- the first character 202a and the second character 202b are associated with the real object 120a and the real object 120b, respectively.
- the user operates the first character 202a with the input device 14, and the information processing apparatus 10 moves the second character 202b as an enemy character according to the program. Alternatively, two users may move the two characters using two input devices 14.
- when the first character 202a moves in the screen, the information processing apparatus 10 needs to move the real object 120a in the same manner. For that purpose, the character's velocity vector is coordinate-converted as described above into a movement in the real world, or the movement direction and movement amount may be calculated sequentially.
- by continuously transmitting a control signal to the real object 120a based on the calculation result, the information processing apparatus 10 moves the real object 120a as indicated by arrow b, in conjunction with the first character 202a.
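The velocity-vector conversion described above can be sketched as applying the inverse of the virtual-to-real scale each frame. The function name, parameters, and the fixed frame time are illustrative assumptions, not the patent's implementation.

```python
# Sketch: converting the on-screen character's velocity vector into a
# per-frame movement command for the real object by applying the inverse
# of the virtual-to-real scale. FRAME_DT and all names are assumptions.

FRAME_DT = 1.0 / 60.0  # seconds per frame (assumed update rate)

def real_move_command(virtual_velocity, virtual_size, field_size):
    """Return the (dx, dy) the real object should move during this frame."""
    vvx, vvy = virtual_velocity            # virtual units per second
    sx = field_size[0] / virtual_size[0]   # virtual -> real scale, x axis
    sy = field_size[1] / virtual_size[1]   # virtual -> real scale, y axis
    return (vvx * sx * FRAME_DT, vvy * sy * FRAME_DT)

# Character moving 30 virtual units/s along x in a 100-unit world over a 1 m field:
dx, dy = real_move_command((30.0, 0.0), (100, 100), (1.0, 1.0))
print(round(dx, 4), dy)  # → 0.005 0.0
```

The control signal of arrow b would then be a stream of such small displacement commands, one per time step.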
- item (1) above means that the real object 120a is moved along the shortest path as long as there is no obstacle in the real world, even if there is an obstacle, or no road at all, in the virtual world. For example, even when, in the virtual world shown on the screen 200 of FIG. 6, the first character 202a must be moved around what lies between the first character 202a and the second character 202b, in the real world the real object 120a is moved linearly toward the real object 120b. The shortest route to the destination is derived by an existing shortest-route search method. This reduces the time the game stalls while the real object moves.
- item (2) above means causing changes in the display screen that are specific to the period in which the real object is moving, such as darkening the game screen, displaying an animation indicating that movement is under way, or producing an effect such as fog. In this way, the game screen need not appear frozen while waiting for the real object to finish moving, which reduces the user's stress. Since (1) and (2) are not exclusive, they may be performed at the same time, or only one of them may be performed depending on the case.
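The shortest-path idea in (1) can be sketched with a plain breadth-first search over a grid of real-world obstacles; the grid, the obstacle set, and BFS itself stand in for the unspecified "existing shortest route search method" and are assumptions for illustration.

```python
# Sketch of scenario (1): the real object's route is searched only against
# real-world obstacles, ignoring whatever blocks the path in the virtual
# world. BFS on a small grid stands in for "an existing shortest route
# search method"; grid size and obstacle set are assumptions.

from collections import deque

def shortest_path(start, goal, obstacles, size):
    """BFS over a size x size grid; returns the list of cells, or None."""
    queue = deque([[start]])
    seen = {start}
    while queue:
        path = queue.popleft()
        x, y = path[-1]
        if (x, y) == goal:
            return path
        for nx, ny in ((x + 1, y), (x - 1, y), (x, y + 1), (x, y - 1)):
            if (0 <= nx < size and 0 <= ny < size
                    and (nx, ny) not in obstacles and (nx, ny) not in seen):
                seen.add((nx, ny))
                queue.append(path + [(nx, ny)])
    return None

# With no real-world obstacle, the route is the straight line, regardless
# of what stands between the characters in the virtual world.
path = shortest_path((0, 0), (3, 0), obstacles=set(), size=5)
print(len(path) - 1)  # → 3
```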
- FIG. 7 is a flowchart showing a processing procedure for linking real objects when an object in the screen is moved using the input device 14. This flowchart is started at any time in a situation where the information processing apparatus 10 is executing a game.
- when a user operation to move an object is performed, the information processing unit 30 confirms whether or not it is a warp operation designating a destination (S12). If it is not a warp operation, that is, if it is an operation to move gradually (N in S12), the object in the screen is moved every minute time step (S14), and at the same time the real object is moved with the corresponding movement direction and movement amount (S16). This linkage at every minute time step is repeated until the moving operation stops (N in S18), and when it stops, the process exits (Y in S18).
- if the operation is a warp operation (Y in S12), the information processing unit 30 instantaneously moves the object on the screen to the destination (S20), possibly with effect processing indicating warping. At the same time, the information processing unit 30 determines the movement path of the real object 120 by specifying the real-world position corresponding to the destination in the virtual world and searching for the shortest path to it (S22). Then, while the real object 120 moves along the route, a predetermined moving image is displayed or an effect is produced on the screen, and when the movement is completed the process ends (S24). As described above, when the real object is rearranged, the process also starts from Y in S12. Alternatively, as described above, an effect such as giving some regularity to the movement of the real object itself may be produced. In S22 and S24, the information processing unit 30 executes a process selected according to the actual situation from the scenarios stored in the linked scenario storage unit 26.
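The branching of FIG. 7 can be sketched as a single dispatch function: a warp operation moves the on-screen object at once and sends the real object along a searched route while an effect plays, whereas a gradual operation links both at every small time step. The callback names and data layout are assumptions for illustration.

```python
# Sketch of the FIG. 7 procedure. S12 branches on whether the operation is
# a warp; S20/S22/S24 handle the warp case, S14/S16 the gradual case.
# All callback names are illustrative assumptions, not the patent's API.

def handle_move_operation(op, move_screen_object, move_real_object,
                          search_real_route, play_effect):
    if op["warp"]:                             # S12: destination designated
        move_screen_object(op["dest"])         # S20: instant on-screen move
        route = search_real_route(op["dest"])  # S22: shortest real-world route
        play_effect("warp")                    # S24: effect while the object travels
        for step in route:
            move_real_object(step)
    else:
        for step in op["steps"]:               # S14/S16: linkage per time step
            move_screen_object(step)
            move_real_object(step)

log = []
handle_move_operation(
    {"warp": True, "dest": (3, 0)},
    move_screen_object=lambda d: log.append(("screen", d)),
    move_real_object=lambda s: log.append(("real", s)),
    search_real_route=lambda d: [(1, 0), (2, 0), (3, 0)],
    play_effect=lambda name: log.append(("effect", name)),
)
print(log[0], log[-1])  # → ('screen', (3, 0)) ('real', (3, 0))
```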
- FIG. 8 illustrates the relationship between the display image and the real object in the second pattern described with reference to FIG.
- a battle game similar to that in FIG. 6 is assumed, and on the screen 204, the first character 202a and the second character 202b are in a battle.
- in the real world there is a real object 120a associated with the first character 202a, but there is no real object corresponding to the second character 202b, which differs from FIG. 6. That is, the second character 202b exists as an enemy character only in the screen and is moved by the information processing apparatus 10 according to the program.
- the following scenarios are prepared for the case where a real object moved by the user reaches the prohibited area or enters a predetermined range around it.
- (1) An error message is output
- (2) The real object is moved out of the prohibited area by the information processing apparatus
- (3) The virtual world is changed so that the area is no longer prohibited
- (4) A spare real object is moved and placed in the prohibited area
- item (1) above notifies the user, on the screen or by voice, that the real object 120a has entered the prohibited area, prompting the user to move it out. Since the user is touching the real object 120a, the user may also be alerted by vibrating a vibrator built into the real object 120a. Alternatively, a warning may be issued by causing a light emitter built into the real object 120a to emit light, by displaying a predetermined image on a display mounted on it, or by generating a predetermined sound from a speaker. Further, a predetermined image such as a message may be displayed on the display device 16, or a predetermined sound may be generated from a speaker built into the display device 16. Two or more of these warning means may be combined. In response, the user can easily move the real object 120a out of the prohibited area while viewing the screen 204.
- the above (2) means that the real object 120a is forcibly moved out of the prohibited area by the control signal from the information processing apparatus 10.
- in item (3), if the prohibited area is caused by the presence of the enemy character 202b as shown in the figure, the prohibited area itself is moved by moving the enemy character 202b. Any moving object other than an enemy character can be moved in the same way. If the cause is not a moving object, such as a mountain or a building, the prohibited area is released by, for example, having it collapse or lower in height so that the character can stand on it.
- item (4) above assumes that a spare real object representing the prohibited area is prepared around the play field 19. When the real object 120a moved by the user comes within the predetermined range of the prohibited area, the spare real object is moved by a control signal from the information processing apparatus 10 and placed in the prohibited area. This physically prevents the real object 120a from entering the prohibited area.
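Scenarios (1) through (4) amount to a lookup of handling processes, such as might be held in the linked scenario storage unit 26. The sketch below is an assumption about how such a table could be organized; the handler names and the context dictionary are illustrative, not from the patent.

```python
# Sketch: dispatching one or more of scenarios (1)-(4) when a real object
# enters the prohibited area. The SCENARIOS table stands in for the linked
# scenario storage unit 26; handler names and ctx keys are assumptions.

def warn_user(ctx):             return f"warn: {ctx['object_id']} in prohibited area"
def force_move_out(ctx):        return f"move {ctx['object_id']} out of area"
def reshape_virtual_world(ctx): return f"clear area {ctx['area']} in virtual world"
def place_spare_object(ctx):    return f"place spare object at {ctx['area']}"

SCENARIOS = {1: warn_user, 2: force_move_out,
             3: reshape_virtual_world, 4: place_spare_object}

def on_prohibited_entry(ctx, chosen):
    """Run the scenarios selected for the current game situation."""
    return [SCENARIOS[n](ctx) for n in chosen]

ctx = {"object_id": "0001", "area": (4, 2)}
print(on_prohibited_entry(ctx, chosen=[1, 3]))
```

Since the text notes the scenarios are not exclusive, `chosen` may list several of them to run in combination.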
- if a situation arises that could be confused with the second pattern, such as the user moving a real object by hand while a game is in progress in the first pattern, the user may be warned by (1) above. That is, the user directly moves a real object, such as the real object 120a, that the information processing apparatus 10 should be moving gradually in response to user operations via the input device 14. In this case, inconsistencies between the display screen and the real object occur, particularly in games that do not assume warping. When the real object state specifying unit 22 detects such an illegal movement of the real object, the user is warned of the violation by the means described above. A penalty, such as a deduction in the game, may also be imposed for the illegal act. This leads the user to comply with the rules and prevents inconsistencies caused by confusion between the first pattern and the second pattern.
- FIG. 9 is a flowchart showing a processing procedure for linking objects in the screen while considering forbidden areas when the user moves a real object. This flowchart is started at any time in a situation where the information processing apparatus 10 is executing a game.
- the information processing unit 30 monitors whether or not the real object enters the prohibited area (S32).
- the information processing unit 30 determines the position and size of the prohibited area as needed according to the movement of surrounding objects and enemy characters in the virtual world.
- while the real object is outside the prohibited area (N in S32), the object on the screen is moved with the movement direction and movement amount corresponding to the real object at every minute time step (S34). This linkage at every minute time step is repeated until the movement of the real object stops (N in S38).
- if the real object enters the prohibited area during this movement (Y in S32), the real object is controlled or the virtual world is changed by any one, or a combination, of processes (1) to (4) above (S36). Thereafter, the process according to the selected scenario is repeated (S32 to S36), and the procedure exits when the movement of the real object is completed (Y in S38).
- the information processing unit 30 executes a process selected in accordance with the actual situation from the specific scenarios stored in the linked scenario storage unit 26 in S36.
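The monitoring loop of FIG. 9 can be sketched as a per-time-step check of the real object's position against the prohibited area. The rectangle test and all names below are illustrative assumptions; the patent does not specify the geometry of the area.

```python
# Sketch of the FIG. 9 loop: at every minute time step the real object's
# position is checked against the prohibited area (S32). Inside the area
# a handling scenario runs (S36); otherwise the on-screen object is moved
# in step with the real object (S34). The rectangular area is an assumption.

def in_prohibited_area(pos, area):
    (x0, y0), (x1, y1) = area
    return x0 <= pos[0] <= x1 and y0 <= pos[1] <= y1

def track_real_object(positions, area):
    events = []
    for pos in positions:                      # one entry per time step
        if in_prohibited_area(pos, area):
            events.append(("scenario", pos))   # S36: run handling scenario
        else:
            events.append(("link", pos))       # S34: link on-screen object
    return events

area = ((2, 0), (3, 1))                        # prohibited rectangle (assumed)
print(track_real_object([(0, 0), (1, 0), (2, 0)], area))
# → [('link', (0, 0)), ('link', (1, 0)), ('scenario', (2, 0))]
```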
- further, the prohibited area may be indicated in the real world by separately introducing a projector or a tablet PC as the display device 16 and projecting or displaying an image on the play field 19.
- the image may be, for example, one in which the prohibited area is filled with a predetermined color, like the area 206 in FIG. 8, or one representing a mountain or a building according to its shape.
- FIG. 10 schematically shows an information transmission system in a mode in which a battle game is played using two information processing apparatuses 10 connected via a network.
- an information processing apparatus 10 a used by a first user and an information processing apparatus 10 b used by a second user are connected via a network 18.
- each user operates the input devices 14a and 14b and the real objects 120a and 120b as in FIG. 5, causing the information processing apparatuses 10a and 10b to perform information processing, whereby the display devices 16a and 16b are updated and the real objects 120a and 120b are moved accordingly.
- the information transmission path used only in the first pattern is indicated by a broken line.
- each information processing apparatus 10a, 10b sequentially acquires, via the network 18, the position information of the opponent's real object as specified by the opponent's information processing apparatus 10b, 10a.
- the information processing unit 30 of each information processing apparatus 10a, 10b processes the same program according to the movements of both real objects, so that the battle game progresses in the same way on both sides.
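The position exchange just described can be sketched as each apparatus serializing its real object's position and applying the updates it receives. JSON messages stand in for the actual protocol, which the patent does not specify; all names here are assumptions.

```python
# Sketch: the two apparatuses exchange the positions specified by their
# real object state identification units so both run the same program on
# the same state. JSON is an assumed stand-in for the unspecified protocol.

import json

def encode_update(object_id, pos):
    """Serialize one real object's position for transmission."""
    return json.dumps({"id": object_id, "x": pos[0], "y": pos[1]})

def apply_update(world, message):
    """Update the local copy of the opponent's real-object position."""
    m = json.loads(message)
    world[m["id"]] = (m["x"], m["y"])
    return world

world_a = {}                                       # apparatus 10a's view
msg = encode_update("opponent-0002", (0.4, 0.7))   # sent by apparatus 10b
print(apply_update(world_a, msg))  # → {'opponent-0002': (0.4, 0.7)}
```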
- FIG. 11 exemplifies the relationship between a display image and a real object in a battle game via a network.
- in this example, a board game such as chess is assumed: the player's own pieces 208a, 208b, and 208c and the opponent's pieces 210a, 210b, and 210c are placed on the virtual-world game board 212.
- An image of the game board 212 viewed from the opposite side is displayed on the other user's screen.
- the own pieces 208a, 208b, and 208c are operated by moving the real objects 120a, 120b, and 120c placed on the game board 214 in the real world.
- an image is projected by the projector onto the game board 214 so that filled figures 216a, 216b, and 216c are displayed in the squares of the real-world game board 214 corresponding to the squares on which the opponent's pieces are placed. Alternatively, the image is displayed on the screen of a tablet PC constituting the game board 214.
- in this example, the real objects 120a, 120b, and 120c and the virtual-world pieces 208a, 208b, and 208c all have the same shape, but if the pieces have different roles, as in chess, the pieces may be given different shapes to reflect this. In that case, the figures projected by the projector or displayed on the tablet PC may also be changed according to the type of the placed piece.
- alternatively, a light-transmissive head-mounted display may be used so that the opponent's pieces appear to actually exist on the game board 214 in the real world. In that case, the display of the screen 207 on the display device may be omitted.
- further, an information processing apparatus (not shown) of a third user who watches the battle may be connected, and the pieces of both the first user and the second user may be projected by a projector onto a game board placed in front of the third user.
- in this case, the information processing apparatuses 10a and 10b of the first and second users sequentially transmit, via the network 18, the position information of both real objects specified by their real object state specifying units 22 to the information processing apparatus 10 of the third user.
- the information processing apparatus 10 of the third user sequentially generates an image representing the position of each piece with a graphic that can distinguish the two pieces, and projects or displays the image on the game board in the real world.
- the battle game over the network is not limited to the board game shown in the figure, but can similarly be applied to fighting games, car racing games, various sports games, and the like. The number of users participating in the battle is also not limited to two. In such embodiments as well, when a real object reaches the prohibited area or enters a predetermined range around it, any of scenarios (1) to (4) described above may be executed.
- FIG. 12 exemplifies the relationship between the display image and the real objects when real objects corresponding to the opponent's pieces are also arranged on the real-world game board, as a modification of the mode described with FIG. 11. That is, in addition to the real objects 120a, 120b, and 120c corresponding to the own pieces 208a, 208b, and 208c displayed on the screen 207, real objects 218a, 218b, and 218c corresponding to the opponent's pieces 210a, 210b, and 210c are also placed on the real-world game board 214.
- when each information processing apparatus 10a, 10b acquires the movement of the opponent's real objects via the network 18, it transmits control signals to the real objects 218a, 218b, and 218c in front of the user in order to reflect that movement. In this case, it is necessary to prevent the user from moving the real objects 218a, 218b, and 218c corresponding to the opponent's pieces, even though they are real objects in front of the user.
- for example, a mechanism for detecting movement or touch by an external force is provided in the real objects 218a, 218b, and 218c, and when such a situation is detected, the built-in vibrator is vibrated or the light emitter emits light to warn the user.
- alternatively, the information processing apparatus 10 may control the real object or the display device 16 to output an error message on the screen of the display device 16 or sound from them, or may return the real object to its original position even if it is moved.
- FIG. 13 illustrates the relationship between a display image and a real object in a situation where an object in the virtual world disappears. This example assumes the battle game illustrated in FIGS. 6 and 8, and on the screen 220, the second character 202b disappears due to the attack of the first character 202a (it is represented as fragments in the figure).
- since the real object 120b corresponding to the second character 202b cannot be made to vanish in the same way, the real object 120b remains in a place where nothing exists in the virtual world, and an inconsistency between the virtual world and the real world occurs. Therefore, the following scenarios are prepared for the case where the on-screen object corresponding to the real object 120b disappears or is destroyed. (1) A message is output to the user (2) The real object corresponding to the object is moved out of the play field by the information processing apparatus (3) Wreckage is represented as remaining in the virtual world
- item (3) above means leaving the real object 120b as it is while leaving the wreckage of the character 202b in the virtual world. In the real world, the real object 120b obstructs the movement of the real object 120a corresponding to the remaining first character 202a; correspondingly, in the virtual world, the wreckage obstructs the movement of the first character 202a, so that the movable ranges remain consistent. In this case, if an image indicating wreckage is projected onto the real object 120b by the projector, it can be intuitively understood that the real object 120b is no longer a moving object.
- in this way, a scenario such as (3) is selected as appropriate according to the content and situation of the game.
- the modes described so far have the aspect of keeping the real world, changed through the real object 120, and the virtual world, changed through the input device 14, consistent through the mediation of the information processing apparatus 10.
- on the other hand, the range affected by operation of the real object 120 and the range affected by operation of the input device 14 may be separated, and only the necessary information integrated. For example, scenes in which an on-screen object moves by operating the real object 120 are separated from scenes that accept operations via the input device 14, the latter being scenes that do not involve linkage of the real object 120.
- FIG. 14 schematically shows a configuration of an information processing apparatus and an example of a display screen in a mode in which scenes are divided by operation means.
- the information processing apparatus 10 includes a real object correspondence processing unit 230 that performs information processing according to the movement of the real object 120 and generates a display image, and an input device correspondence processing unit 232 that performs information processing according to user operations of the input device 14 and generates a display image.
- the real object correspondence processing unit 230 includes, from the configuration of the information processing apparatus 10 described above, the communication unit 20, the real object state specifying unit 22, the information processing unit 30, the display processing unit 32, and the real object information storage unit 24.
- the input device correspondence processing unit 232 likewise includes an information processing unit 30 and a display processing unit 32, although its information processing unit 30 need not generate control signals for operating a real object.
- the real object correspondence processing unit 230 generates a virtual-world screen 234 in which an object moves as the user moves the real object 120.
- the input device correspondence processing unit 232 generates a screen 236 that accepts a user operation by the input device 14.
- the screen 234 is a screen during the battle of the battle game shown in FIG. 6 and the like
- the screen 236 is a screen for selecting a weapon used by the player character in the battle.
- the screens are switched so that a virtual-world screen such as the screen 234 is normally displayed, and the screen 236 is displayed only when the user calls up the weapon selection screen using the input device 14 or the like.
- both screens may be displayed at the same time so that an active screen can be selected using the input device 14.
- since the character in the screen 234 is not moved using the input device 14, the real object 120 need not be moved accordingly, and as a result consistency between the real world and the virtual world is always maintained.
- conversely, the input device 14 can be used for anything that does not directly affect the movement of the characters, such as selection operations on a selection screen as shown in the figure and screen switching operations.
- FIG. 15 illustrates the relationship between a display image and a real object when the same object is expressed separately for each scene.
- This example assumes a soccer game, and on the main screen 244 displayed on the entire screen 240, the players' objects 242a, 242b, 242c,... Are playing in the soccer field.
- this image is generated by the input device correspondence processing unit 232: the user moves the players of his own team using the input device 14, while the information processing apparatus 10 moves the players of the enemy team according to the program.
- a sub-screen 246 showing an initial formation is displayed at the lower right of the screen 240.
- the sub screen 246 shows the arrangement of each player at the time of starting the game as a drawing overlooking the soccer field, and the white circles and black circles represent individual players of the own team and the enemy team, respectively.
- the arrangement of the players of the user's team on the sub screen 246 reflects the positions of the real objects 120c, 120d, 120e, and so on placed on the play field; accordingly, the image of the sub screen 246 is generated by the real object correspondence processing unit 230. The arrangement of the players of the enemy team is determined by the real object correspondence processing unit 230 according to the program.
- the position of a real object placed by another user at a remote location may be acquired via the network 18 and reflected in the placement of the enemy team. The user adjusts the formation of the players of his team by moving the real objects 120c, 120d, 120e,... While looking at the sub screen 246, or according to the movement of the player corresponding to the real object 120d approaching the enemy team.
- the sub screen 246 may not be displayed simultaneously with the main screen 244, and may be displayed only when the screen is switched or the user calls it.
- on the other hand, the movements of the player objects 242a, 242b, 242c, and so on during the game on the main screen 244 are not reflected in the real objects 120c, 120d, 120e, and so on. That is, even for the same players, the real objects 120c, 120d, 120e, and so on are used only for setting the initial formation, and movement during the game is expressed only in the virtual world.
- in a game where a sense of speed is important, such as a soccer game, it is often convenient for the user to operate with the input device 14 alone while watching the screen.
- according to the present embodiment described above, a real object and an object in the screen are associated with each other, and a mode is realized in which one moves in accordance with the movement of the other.
- the occurrence of situations where the same movement cannot be realized, physically or under the game's settings, is monitored, and a scenario prepared in advance is selected and executed according to the situation, so that consistency is maintained. Specifically, when a user operation moves a character by designating a destination in the virtual world, the information processing apparatus searches for the shortest path in the real world and moves the real object along it, minimizing the travel time, while producing effects on the image and in the real world that can be enjoyed during the travel time, thereby reducing the user's stress.
- further, when the user moves a real object into a prohibited area corresponding to an opponent character or obstacle that exists in the virtual world but not in the real world, the information processing apparatus outputs an error message or moves the real object to another location. Alternatively, the opponent character is moved in the virtual world, or the obstacle is deformed, so that the player character can occupy the position corresponding to the real object. In addition, an image may be projected or displayed using a projector or a tablet PC so that a recognizable figure appears in the prohibited area in the real world.
- when an object disappears or appears in the virtual world, the information processing apparatus moves the corresponding real object out of the play field, or moves a spare real object to the corresponding position in the play field. Alternatively, a prohibited area is generated by leaving the wreckage of the object in the virtual world.
- further, if another user's operation information is captured over the network and reflected in the movement of real objects or in the image projected on the play field, the user can enjoy a game with a sense of physical presence even though it is a network game.
- this configuration can be applied not only to games that decide victory or defeat, but also to new games or art in which a plurality of users jointly create a single work.
- information processing system 10 information processing device, 14 input device, 16 display device, 18 network, 19 play field, 20 communication unit, 22 real object state identification unit, 24 real object information storage unit, 26 linked scenario storage unit, 30 Information processing unit, 32 display processing unit, 50 real object information, 120 real object, 122 camera, 143 status information acquisition unit, 146 drive unit, 148 communication unit.
- the present invention is applicable to information processing devices such as computers, game devices, and content display devices, toys, and systems including them.
Landscapes
- Engineering & Computer Science (AREA)
- Multimedia (AREA)
- Human Computer Interaction (AREA)
- General Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Computer Graphics (AREA)
- Computer Hardware Design (AREA)
- Software Systems (AREA)
- Processing Or Creating Images (AREA)
Claims (17)
- 実世界に存在する実物体を動かすユーザ操作と、接続された入力装置を用いたユーザ操作の少なくともいずれかに応じて情報処理を行う情報処理部と、
前記情報処理の結果として仮想オブジェクトが存在する仮想空間を描画し表示装置に表示させる表示処理部と、
を備え、
前記情報処理部は、前記入力装置を用いたユーザ操作により前記仮想オブジェクトが動かされたとき、前記実物体を連動させるための制御信号を生成して当該実物体に送信する実物体制御処理と、前記実物体を動かすユーザ操作がなされたとき、前記仮想オブジェクトを連動させるための仮想空間に対する操作処理、の少なくともいずれかにより、前記実物体と前記仮想オブジェクトを連動させるとともに、あらかじめ設定された、連動が困難な状況に該当する状況が検知された際、当該状況に応じて選択した対応処理を実行することにより、以後の情報処理を継続することを特徴とする情報処理装置。 - 前記情報処理部は前記対応処理として、前記実物体に所定の動作を行わせるための制御信号を生成し、当該実物体に送信することを特徴とする請求項1に記載の情報処理装置。
- 前記情報処理部は、仮想世界の状況上、実世界において前記実物体を置くことのできない禁止領域を決定し、当該禁止領域またはその周囲の所定領域に前記実物体が入ったことを検知したら、前記実物体を別の領域へ移動させるための制御信号を生成し、当該実物体に送信することを特徴とする請求項1または2に記載の情報処理装置。
- 前記情報処理部は、仮想世界の状況上、実世界において前記実物体を置くことのできない禁止領域を決定し、当該禁止領域またはその周囲の所定領域に前記実物体が入ったことを検知したら、前記実物体の振動、画像表示、音声出力の少なくともいずれかによりユーザに警告するための制御信号を生成し、当該実物体に送信することを特徴とする請求項1または2に記載の情報処理装置。
- 前記情報処理部は、前記入力装置を用いたユーザ操作により、仮想世界における到達点を指定して前記仮想オブジェクトを即時に移動させる操作がなされたとき、当該到達点に対応する実世界での到達点を決定したうえ、それに至る最短経路を探索することにより、前記実物体を当該最短経路で移動させるための制御信号を生成し、当該実物体に送信することを特徴とする請求項1または2に記載の情報処理装置。
- 前記情報処理部は、前記入力装置を用いたユーザ操作により、仮想世界における到達点を指定して前記仮想オブジェクトを即時に移動させる操作がなされたとき、当該到達点に対応する実世界での到達点へ、あらかじめ定めた規則に従った経路で移動させるための制御信号を生成し、当該実物体に送信することを特徴とする請求項1または2に記載の情報処理装置。
- 前記情報処理部は、仮想世界の状況上、実世界において前記実物体を置くことのできない禁止領域を決定し、当該禁止領域の周囲の所定領域に前記実物体が入ったことを検知したら、禁止領域を示すための予備の実物体を移動させて当該禁止領域へ移動させるための制御信号を生成し、当該予備の実物体に送信することを特徴とする請求項1または2に記載の情報処理装置。
- 前記情報処理部は、仮想世界において前記仮想オブジェクトが消滅したとき、前記実物体を実世界の所定の領域外へ移動させるための制御信号を生成し、仮想世界において前記仮想オブジェクトが出現したとき、予備の実物体を移動させて前記所定の領域内へ移動させるための制御信号を生成し、それぞれ対象の実物体に送信することを特徴とする請求項1または2に記載の情報処理装置。
- The information processing apparatus according to claim 1, wherein
the information processing unit further determines a prohibited region in which the real object cannot be placed in the real world owing to the state of the virtual world, and
the display processing unit further generates an image graphically representing the position of the prohibited region on the plane on which the real object is placed in the real world, and causes the image to be projected onto the plane by a projector or displayed on the screen of a tablet computer constituting the plane.
- The information processing apparatus according to claim 1, wherein the information processing unit determines a prohibited region in which the real object cannot be placed in the real world owing to the state of the virtual world and, upon detecting that the real object has entered the prohibited region or a predetermined area around it, eliminates the prohibited region by operating on the virtual world.
- The information processing apparatus according to claim 1, wherein, when the virtual object disappears in the virtual world, the information processing unit leaves its remains in the same place, thereby maintaining consistency with the presence of the real object in the real world.
- An information processing apparatus comprising:
an information processing unit that performs information processing in response to a user operation of moving a real object existing in the real world; and
a display processing unit that renders a virtual space in which a virtual object exists as a result of the information processing and causes a display device to display the virtual space,
wherein, when a user operation of moving the real object is performed, the information processing unit links the real object and the virtual object through an operation process on the virtual space for making the virtual object follow the real object, and determines a prohibited region in which the real object cannot be placed in the real world owing to the state of the virtual world, and
the display processing unit further generates an image graphically representing the position of the prohibited region on the plane on which the real object is placed in the real world, and causes the image to be projected onto the plane by a projector or displayed on the screen of a tablet computer constituting the plane.
- An information processing apparatus comprising:
an information processing unit that performs information processing in response to a user operation of moving a real object existing in the real world and to a user operation using a connected input device; and
a display processing unit that renders a virtual space in which at least a virtual object exists as a result of the information processing and causes a display device to display the virtual space,
wherein the information processing unit performs an operation process on the virtual space for making the virtual object follow the real object when a user operation of moving the real object is performed, and performs information processing different from the operation process on the virtual space when a user operation using the input device is performed, and
the display processing unit causes the display device to switch, according to the operation means the user employs, between displaying an image generated as a result of the operation process on the virtual space and an image generated as a result of the information processing different from the operation process.
- An information processing system comprising an information processing apparatus and a real object capable of at least one of being moved by a user and being moved by a control signal from the information processing apparatus, wherein
the information processing apparatus includes:
an information processing unit that performs information processing in response to at least one of a user operation of moving the real object and a user operation using a connected input device; and
a display processing unit that renders a virtual space in which a virtual object exists as a result of the information processing and causes a display device to display the virtual space, and
the information processing unit links the real object and the virtual object through at least one of a real object control process of generating a control signal for making the real object follow the virtual object and transmitting the control signal to the real object when the virtual object is moved by a user operation using the input device, and an operation process on the virtual space for making the virtual object follow the real object when a user operation of moving the real object is performed, and, when a situation corresponding to a preset situation in which such linkage is difficult is detected, continues subsequent information processing by executing a handling process selected according to the situation.
- An information processing method performed by an information processing apparatus, comprising:
a step of performing information processing in response to at least one of a user operation of moving a real object existing in the real world and a user operation using a connected input device; and
a step of rendering a virtual space in which a virtual object exists as a result of the information processing and causing a display device to display the virtual space,
wherein the step of performing the information processing includes:
a step of linking the real object and the virtual object through at least one of a real object control process of generating a control signal for making the real object follow the virtual object and transmitting the control signal to the real object when the virtual object is moved by a user operation using the input device, and an operation process on the virtual space for making the virtual object follow the real object when a user operation of moving the real object is performed; and
a step of continuing subsequent information processing, when a situation corresponding to a preset situation in which such linkage is difficult is detected, by executing a handling process selected according to the situation.
- A computer program for causing a computer to implement:
a function of performing information processing in response to at least one of a user operation of moving a real object existing in the real world and a user operation using a connected input device; and
a function of rendering a virtual space in which a virtual object exists as a result of the information processing and causing a display device to display the virtual space,
wherein the function of performing the information processing includes:
a function of linking the real object and the virtual object through at least one of a real object control process of generating a control signal for making the real object follow the virtual object and transmitting the control signal to the real object when the virtual object is moved by a user operation using the input device, and an operation process on the virtual space for making the virtual object follow the real object when a user operation of moving the real object is performed; and
a function of continuing subsequent information processing, when a situation corresponding to a preset situation in which such linkage is difficult is detected, by executing a handling process selected according to the situation.
- A computer-readable recording medium having recorded thereon a computer program for causing a computer to implement:
a function of performing information processing in response to at least one of a user operation of moving a real object existing in the real world and a user operation using a connected input device; and
a function of rendering a virtual space in which a virtual object exists as a result of the information processing and causing a display device to display the virtual space,
wherein the function of performing the information processing includes:
a function of linking the real object and the virtual object through at least one of a real object control process of generating a control signal for making the real object follow the virtual object and transmitting the control signal to the real object when the virtual object is moved by a user operation using the input device, and an operation process on the virtual space for making the virtual object follow the real object when a user operation of moving the real object is performed; and
a function of continuing subsequent information processing, when a situation corresponding to a preset situation in which such linkage is difficult is detected, by executing a handling process selected according to the situation.
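The claims above recite a bidirectional linkage loop: when the user moves the real object, an operation on the virtual space makes the virtual object follow; when the virtual object is moved via the input device, a control signal makes the real object follow; and when a preset "linkage difficult" situation is detected, a handling process selected for that situation keeps processing going. The following is a minimal illustrative sketch of that control flow only; every name in it (`LinkageController`, `send_control_signal`, the situation labels, etc.) is hypothetical and does not appear in the patent.

```python
# Hypothetical sketch of the claimed linkage loop; names are illustrative,
# not taken from the patent.

class LinkageController:
    def __init__(self):
        # Shared 2D pose for the linked real object and virtual object.
        self.virtual_pose = (0.0, 0.0)
        self.real_pose = (0.0, 0.0)
        self.log = []

    def on_real_object_moved(self, new_pose):
        """User moved the real object: operate on the virtual space so the
        virtual object follows (the claims' 'operation process')."""
        self.real_pose = new_pose
        self.virtual_pose = new_pose
        self.log.append(("virtual_follows_real", new_pose))

    def on_virtual_object_moved(self, new_pose):
        """Virtual object moved via the input device: generate and send a
        control signal so the real object follows ('real object control')."""
        self.virtual_pose = new_pose
        self.send_control_signal(new_pose)

    def send_control_signal(self, pose):
        # Stand-in for transmitting a drive command to the physical device.
        self.real_pose = pose
        self.log.append(("real_follows_virtual", pose))

    def on_linkage_difficulty(self, situation):
        """A preset 'linkage difficult' situation was detected: execute a
        handling process selected for that situation, then continue."""
        handlers = {
            "real_object_blocked": lambda: self.log.append(("halt_virtual",)),
            "in_prohibited_region": lambda: self.log.append(("clear_region",)),
        }
        handlers.get(situation, lambda: self.log.append(("ignore",)))()


ctrl = LinkageController()
ctrl.on_real_object_moved((1.0, 2.0))      # virtual object follows real
ctrl.on_virtual_object_moved((3.0, 4.0))   # real object follows virtual
ctrl.on_linkage_difficulty("in_prohibited_region")
```

The dictionary dispatch in `on_linkage_difficulty` mirrors the claim language of "executing a handling process selected according to the situation" from a preset set of situations.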
Priority Applications (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2016513652A JP6340414B2 (ja) | 2014-04-16 | 2015-01-13 | Information processing apparatus, information processing system, and information processing method |
US15/302,344 US10510189B2 (en) | 2014-04-16 | 2015-01-13 | Information processing apparatus, information processing system, and information processing method |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2014-084945 | 2014-04-16 | ||
JP2014084945 | 2014-04-16 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2015159561A1 (ja) | 2015-10-22 |
Family
ID=54323780
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2015/050682 WO2015159561A1 (ja) | Information processing apparatus, information processing system, and information processing method | 2014-04-16 | 2015-01-13 |
Country Status (3)
Country | Link |
---|---|
US (1) | US10510189B2 (ja) |
JP (1) | JP6340414B2 (ja) |
WO (1) | WO2015159561A1 (ja) |
Families Citing this family (9)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US9420251B2 (en) | 2010-02-08 | 2016-08-16 | Nikon Corporation | Imaging device and information acquisition system in which an acquired image and associated information are held on a display |
JP7131542B2 (ja) * | 2017-03-15 | 2022-09-06 | Sony Group Corporation | Information processing apparatus, information processing method, and program |
US10688378B2 (en) * | 2017-07-04 | 2020-06-23 | James Andrew Aman | Physical-virtual game board and content delivery system |
CN108543309B (zh) * | 2018-04-03 | 2020-03-10 | NetEase (Hangzhou) Network Co., Ltd. | Method, device, and terminal for controlling movement of a virtual control object in augmented reality |
KR20200035461A (ko) * | 2018-05-02 | 2020-04-03 | SZ DJI Technology Co., Ltd. | Optically supported object navigation |
JP7341674B2 (ja) * | 2019-02-27 | 2023-09-11 | Canon Inc. | Information processing apparatus, information processing method, and program |
US20220365588A1 (en) * | 2019-09-04 | 2022-11-17 | Sony Group Corporation | Information processing apparatus, information processing method, and program |
US11517812B2 (en) | 2021-02-19 | 2022-12-06 | Blok Party, Inc. | Application of RFID gamepieces for a gaming console |
US12028507B2 (en) * | 2021-03-11 | 2024-07-02 | Quintar, Inc. | Augmented reality system with remote presentation including 3D graphics extending beyond frame |
Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH11312039A (ja) * | 1998-04-28 | 1999-11-09 | Nippon Telegr & Teleph Corp <Ntt> | Self-propelled physical icon device |
JP2005317032A (ja) * | 2004-04-29 | 2005-11-10 | Microsoft Corp | Method and system for enabling interaction between a virtual environment and a physical object |
WO2012023573A1 (ja) * | 2010-08-16 | 2012-02-23 | Sega Toys Co., Ltd. | Toy set |
JP2013149106A (ja) * | 2012-01-19 | 2013-08-01 | Kaiyodo:Kk | i-figure and image processing system using the i-figure |
Family Cites Families (11)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
DE20203586U1 (de) * | 2002-03-07 | 2002-07-04 | A. Schweizer GmbH Optische Fabrik, 91301 Forchheim | Optical vision aid |
CN100344416C (zh) * | 2003-03-23 | 2007-10-24 | Sony Corporation | Robot apparatus and method for controlling the apparatus |
US8062126B2 (en) | 2004-01-16 | 2011-11-22 | Sony Computer Entertainment Inc. | System and method for interfacing with a computer program |
US10026177B2 (en) * | 2006-02-28 | 2018-07-17 | Microsoft Technology Licensing, Llc | Compact interactive tabletop with projection-vision |
US8730156B2 (en) * | 2010-03-05 | 2014-05-20 | Sony Computer Entertainment America Llc | Maintaining multiple views on a shared stable virtual space |
JP4331190B2 (ja) | 2006-09-21 | 2009-09-16 | Sony Computer Entertainment Inc. | Game device, game control method, and game control program |
GB0711052D0 (en) * | 2007-06-08 | 2007-07-18 | Ici Plc | Thermal transfer printing |
FR2918477A1 (fr) | 2007-07-04 | 2009-01-09 | Aldebaran Robotics Soc Par Act | Method for editing the movements of a robot |
US8602857B2 (en) * | 2008-06-03 | 2013-12-10 | Tweedletech, Llc | Intelligent board game system with visual marker based game object tracking and identification |
WO2012033863A1 (en) * | 2010-09-09 | 2012-03-15 | Tweedletech, Llc | A board game with dynamic characteristic tracking |
US20110256927A1 (en) * | 2009-03-25 | 2011-10-20 | MEP Games Inc. | Projection of interactive game environment |
- 2015
- 2015-01-13 US US15/302,344 patent/US10510189B2/en active Active
- 2015-01-13 WO PCT/JP2015/050682 patent/WO2015159561A1/ja active Application Filing
- 2015-01-13 JP JP2016513652A patent/JP6340414B2/ja active Active
Cited By (36)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2017086258A (ja) * | 2015-11-05 | 2017-05-25 | Nippon Telegraph and Telephone Corp. | Information presentation device, information presentation method, and program |
JP2017144217A (ja) * | 2016-02-19 | 2017-08-24 | Koei Tecmo Games Co., Ltd. | Game system, game device, information processing device, and control program |
JP2017164319A (ja) * | 2016-03-16 | 2017-09-21 | The University of Electro-Communications | Video chat robot system, hand-delivery play control method, and hand-delivery play control program |
JP2017199238A (ja) * | 2016-04-28 | 2017-11-02 | Capcom Co., Ltd. | Virtual space display system |
EP3327544A1 (en) * | 2016-11-25 | 2018-05-30 | Nokia Technologies OY | An apparatus, associated method and associated computer readable medium |
WO2018096207A1 (en) * | 2016-11-25 | 2018-05-31 | Nokia Technologies Oy | An apparatus, associated method and associated computer readable medium |
US11169602B2 (en) | 2016-11-25 | 2021-11-09 | Nokia Technologies Oy | Apparatus, associated method and associated computer readable medium |
KR20190126377A (ko) * | 2017-07-25 | 2019-11-11 | Tencent Technology (Shenzhen) Company Limited | Method and device for controlling placement of a virtual character, and storage medium |
KR102574170B1 (ko) * | 2017-07-25 | 2023-09-05 | Tencent Technology (Shenzhen) Company Limited | Method and device for controlling placement of a virtual character, and storage medium |
US11527052B2 (en) | 2017-07-25 | 2022-12-13 | Tencent Technology (Shenzhen) Company Limited | Method and apparatus for controlling placement of virtual character and storage medium |
JP2020527262A (ja) * | 2017-07-25 | 2020-09-03 | Tencent Technology (Shenzhen) Company Limited | Method, apparatus, and storage medium for controlling displacement of a virtual character |
US11049329B2 (en) | 2017-07-25 | 2021-06-29 | Tencent Technology (Shenzhen) Company Limited | Method and apparatus for controlling placement of virtual character and storage medium |
JP7023991B2 (ja) | 2017-07-25 | 2022-02-22 | Tencent Technology (Shenzhen) Company Limited | Method, apparatus, and storage medium for controlling displacement of a virtual character |
US12026847B2 (en) | 2017-07-25 | 2024-07-02 | Tencent Technology (Shenzhen) Company Limited | Method and apparatus for controlling placement of virtual character and storage medium |
JPWO2019116521A1 (ja) * | 2017-12-14 | 2021-01-07 | Sony Interactive Entertainment Inc. | Entertainment system, robot device, and server device |
US11498206B2 (en) | 2017-12-14 | 2022-11-15 | Sony Interactive Entertainment Inc. | Entertainment system, robot device, and server device |
JP7128842B2 (ja) | 2017-12-14 | 2022-08-31 | Sony Interactive Entertainment Inc. | Entertainment system, robot device, and server device |
WO2019116521A1 (ja) * | 2017-12-14 | 2019-06-20 | Sony Interactive Entertainment Inc. | Entertainment system, robot device, and server device |
WO2019142228A1 (ja) * | 2018-01-16 | 2019-07-25 | Sony Interactive Entertainment Inc. | Information processing apparatus and image generation method |
US11780084B2 (en) | 2018-01-16 | 2023-10-10 | Sony Interactive Entertainment Inc. | Robotic device, control method for robotic device, and program |
JPWO2019142228A1 (ja) * | 2018-01-16 | 2021-01-14 | Sony Interactive Entertainment Inc. | Information processing apparatus and image generation method |
US11648672B2 (en) | 2018-01-16 | 2023-05-16 | Sony Interactive Entertainment Inc. | Information processing device and image generation method |
US11733705B2 (en) | 2018-01-16 | 2023-08-22 | Sony Interactive Entertainment Inc. | Moving body and moving body control method |
JP2020022597A (ja) * | 2018-08-07 | 2020-02-13 | Square Enix Co., Ltd. | Program, projection system, and computer device |
JP7559026B2 (ja) | 2018-12-28 | 2024-10-01 | Itoki Corp. | Virtual space providing system, virtual space providing method, and program |
JP2022186819A (ja) * | 2018-12-28 | 2022-12-15 | Itoki Corp. | Virtual space providing system, virtual space providing method, and program |
JP7476513B2 (ja) | 2019-10-28 | 2024-05-01 | Fujifilm Business Innovation Corp. | Information processing device and information processing program |
JP2021068383A (ja) * | 2019-10-28 | 2021-04-30 | Fuji Xerox Co., Ltd. | Information processing device and information processing program |
JP2022039059A (ja) * | 2020-08-27 | 2022-03-10 | Bandai Co., Ltd. | Game watching system, program, watching terminal, and connection device |
US11717754B2 (en) | 2020-08-27 | 2023-08-08 | Bandai Co., Ltd. | Game support system, program, information communication terminal, and connection device |
JP7071454B2 (ja) | 2020-08-27 | 2022-05-19 | Bandai Co., Ltd. | Game support system, program, and information communication terminal |
US11819766B2 (en) | 2020-08-27 | 2023-11-21 | Bandai Co., Ltd. | Game watching system, program, watching terminal and connection device |
KR102605262B1 (ko) * | 2020-08-27 | 2023-11-24 | Bandai Co., Ltd. | Game support system, program, and information communication terminal |
JP7061649B2 (ja) | 2020-08-27 | 2022-04-28 | Bandai Co., Ltd. | Game watching system, program, watching terminal, and connection device |
JP2022039058A (ja) * | 2020-08-27 | 2022-03-10 | Bandai Co., Ltd. | Game support system, program, and information communication terminal |
KR20220027754A (ko) * | 2020-08-27 | 2022-03-08 | Bandai Co., Ltd. | Game support system, program, information communication terminal, and connection device |
Also Published As
Publication number | Publication date |
---|---|
JP6340414B2 (ja) | 2018-06-06 |
JPWO2015159561A1 (ja) | 2017-04-13 |
US20170024934A1 (en) | 2017-01-26 |
US10510189B2 (en) | 2019-12-17 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
JP6340414B2 (ja) | Information processing apparatus, information processing system, and information processing method | |
US10864433B2 (en) | Using a portable device to interact with a virtual space | |
US10671239B2 (en) | Three dimensional digital content editing in virtual reality | |
JP7150921B2 (ja) | Information processing program, information processing method, information processing system, and information processing apparatus | |
US9789391B2 (en) | Method and apparatus for using a common pointing input to control 3D viewpoint and object targeting | |
EP2627420B1 (en) | System for enabling a handheld device to capture video of an interactive application | |
JP6598522B2 (ja) | Information processing apparatus, information processing system, information processing method, and information processing program | |
JP6039594B2 (ja) | Information processing apparatus and information processing method | |
KR101929826B1 (ko) | Game system | |
JP6581341B2 (ja) | Information processing apparatus, information processing program, information processing method, and information processing system | |
JP5498690B2 (ja) | Game program and game device | |
JP2019049987A (ja) | Information processing method, apparatus, and program for causing a computer to execute the information processing method | |
JP2014039862A (ja) | Game device and game program | |
JP2019020832A (ja) | Information processing method, apparatus, and program for causing a computer to execute the information processing method | |
JP5816435B2 (ja) | Display control program, display control device, display control system, and display control method | |
JP6046220B2 (ja) | Game device and game program | |
JP7325100B2 (ja) | Three-dimensional game image generation program, three-dimensional game image generation device, and three-dimensional game image generation method | |
JP4388985B2 (ja) | Game device, game control method, and game control program | |
TW201524563A (zh) | Method and system for interaction between viewpoint and cursor | |
JP4388984B2 (ja) | Game device, game control method, and game control program | |
JP2018207517A (ja) | Computer-executed method for controlling display on a head-mounted device, program for causing a computer to execute the method, and information processing apparatus | |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 15780601 Country of ref document: EP Kind code of ref document: A1 |
|
ENP | Entry into the national phase |
Ref document number: 2016513652 Country of ref document: JP Kind code of ref document: A |
|
WWE | Wipo information: entry into national phase |
Ref document number: 15302344 Country of ref document: US |
|
NENP | Non-entry into the national phase |
Ref country code: DE |
|
122 | Ep: pct application non-entry in european phase |
Ref document number: 15780601 Country of ref document: EP Kind code of ref document: A1 |