GB2556894A - Apparatus and method of interactive control - Google Patents

Apparatus and method of interactive control

Info

Publication number
GB2556894A
GB2556894A
Authority
GB
United Kingdom
Prior art keywords
controller
functions
real world
analysis
video image
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Withdrawn
Application number
GB1619811.1A
Other versions
GB201619811D0 (en)
Inventor
Sharwin Winesh Raghoebardajal
Jeremy Ashforth
Simon Benson
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Sony Interactive Entertainment Inc
Original Assignee
Sony Interactive Entertainment Inc
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Sony Interactive Entertainment Inc filed Critical Sony Interactive Entertainment Inc
Priority to GB1619811.1A priority Critical patent/GB2556894A/en
Publication of GB201619811D0 publication Critical patent/GB201619811D0/en
Publication of GB2556894A publication Critical patent/GB2556894A/en
Withdrawn legal-status Critical Current

Classifications

    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00 Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/40 Processing input control signals of video game devices, e.g. signals generated by the player or derived from the environment
    • A63F13/42 Processing input control signals of video game devices, e.g. signals generated by the player or derived from the environment by mapping the input signals into game commands, e.g. mapping the displacement of a stylus on a touch screen to the steering angle of a virtual vehicle
    • A63F13/428 Processing input control signals of video game devices by mapping the input signals into game commands involving motion or position input signals, e.g. signals representing the rotation of an input controller or a player's arm motions sensed by accelerometers or gyroscopes
    • A63F13/20 Input arrangements for video game devices
    • A63F13/21 Input arrangements for video game devices characterised by their sensors, purposes or types
    • A63F13/211 Input arrangements for video game devices characterised by their sensors, purposes or types using inertial sensors, e.g. accelerometers or gyroscopes
    • A63F13/213 Input arrangements for video game devices characterised by their sensors, purposes or types comprising photodetecting means, e.g. cameras, photodiodes or infrared cells

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Human Computer Interaction (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

In a method of interactive control for a VR controller, a plurality of functions are associated with at least a first user input of the controller. A notification that the first user input has been selected is received, along with data indicating the controller's physical position in a real world environment. One of the functions associated with the input is selected based upon analysis of this data to modify the state of a virtual environment. The position of the controller may be detected using a video image or transmitted from the controller as telemetry. Different functions may be selected based on the absolute or relative position of the controller, or according to the controller's acceleration, velocity, position or orientation. Also disclosed are a computer program adapted to cause a computer system to perform the method and an apparatus featuring components which perform each step of the method.

Description

(54) Title of the Invention: Apparatus and method of interactive control
(57) Abstract Title: Method of interactive control, and computer program and apparatus using the method
[Drawings: Figures 1 to 3]
Figure 3 (flowchart of the method):
s310: Associate a plurality of functions with at least a first user input of the controller.
s320: Receive a notification that the first user input has been selected.
s330: Receive data indicative of the controller's physical position in a real world environment.
s340: Select one of the plurality of functions in dependence upon the outcome of analysis of the received data.
s350: Modify the state of a virtual environment responsive to the selected function.
APPARATUS AND METHOD OF INTERACTIVE CONTROL
The present invention relates to an apparatus and method of interactive control.
Modern video games can provide a rich and immersive experience in which the user is able to perform a wide variety of behaviours, both in terms of navigation (moving and looking in different directions, and moving at different speeds and in different ways, e.g. jumping and crouching) and also interactions (fighting with multiple weapons or weapons with multiple modes, using various vehicles and objects, eating, healing and other interactions). As a result, many modern games use a combination of a mouse and a large number of keys on a keyboard to control the game, or similarly a number of joysticks, joypads and/or keys on a handheld controller.
A new form of immersive game is provided by virtual reality or VR gaming, where a user wears a head-mounted display or HMD, which substantially blocks the user's view of the real world during gameplay. This form of gaming often encourages the user to stand and move, or at least look around, since they are subjectively placed within a virtual world where their in-game representative avatar will move and/or look around with them.
However, this makes the use of a mouse and keyboard difficult, and similarly makes two-handed use of a controller restrictive, and also potentially difficult if the user cannot see the various inputs on the controller.
Consequently an alternative mode of interaction would be preferable.
The present invention seeks to alleviate or mitigate this problem.
In a first aspect, a method of interactive control for a VR controller is provided in accordance with claim 1.
In another aspect, an apparatus for generating an interactive virtual environment is provided in accordance with claim 9.
Further respective aspects and features of the invention are defined in the appended claims.
Embodiments of the present invention will now be described by way of example with reference to the accompanying drawings, in which:
- Figure 1 is a schematic diagram of an entertainment device in accordance with an embodiment of the present invention.
- Figure 2 is a schematic diagram of a virtual reality controller in accordance with an embodiment of the present invention.
- Figure 3 is a flowchart of a method of interactive control for a VR controller in accordance with an embodiment of the present invention.
An apparatus and method of interactive control are disclosed. In the following description, a number of specific details are presented in order to provide a thorough understanding of the embodiments of the present invention. It will be apparent, however, to a person skilled in the art that these specific details need not be employed to practice the present invention. Conversely, specific details known to the person skilled in the art are omitted for the purposes of clarity where appropriate.
Figure 1 schematically illustrates the overall system architecture of a Sony® PlayStation 4® entertainment device, which is a suitable example of a videogame console or apparatus in accordance with embodiments of the present invention. A system unit 10 is provided, with various peripheral devices connectable to the system unit.
The system unit 10 comprises an accelerated processing unit (APU) 20 being a single chip that in turn comprises a central processing unit (CPU) 20A and a graphics processing unit (GPU) 20B. The APU 20 has access to a random access memory (RAM) unit 22.
The APU 20 communicates with a bus 40, optionally via an I/O bridge 24, which may be a discrete component or part of the APU 20.
Connected to the bus 40 are data storage components such as a hard disk drive 37, and a Blu-ray ® drive 36 operable to access data on compatible optical discs 36A. Additionally the RAM unit 22 may communicate with the bus 40.
Optionally also connected to the bus 40 is an auxiliary processor 38. The auxiliary processor 38 may be provided to run or support the operating system.
The system unit 10 communicates with peripheral devices as appropriate via an audio/visual input port 31, an Ethernet ® port 32, a Bluetooth ® wireless link 33, a Wi-Fi ® wireless link 34, or one or more universal serial bus (USB) ports 35. Audio and video may be output via an AV output 39, such as an HDMI port.
The peripheral devices may include a monoscopic or stereoscopic video camera 41 such as the
PlayStation Eye ®; wand-style videogame controllers 42 such as the PlayStation Move ® and conventional handheld videogame controllers 43 such as the Dual Shock 4 ®; portable entertainment devices 44 such as the PlayStation Portable ® and PlayStation Vita ®; a keyboard 45 and/or a mouse 46; a media controller 47, for example in the form of a remote control; and a headset 48. Other peripheral devices may similarly be considered such as a printer, or a 3D printer (not shown).
The GPU 20B, optionally in conjunction with the CPU 20A, generates video images and audio for output via the AV output 39. Optionally, the audio may be generated in conjunction with, or instead by, an audio processor (not shown).
The video and optionally the audio may be presented to a television 51. Where supported by the television, the video may be stereoscopic. The audio may be presented to a home cinema system 52 in one of a number of formats such as stereo, 5.1 surround sound or 7.1 surround sound. Video and audio may likewise be presented to a head mounted display unit 53 worn by a user 60.
In operation, the entertainment device defaults to an operating system such as a variant of FreeBSD 9.0. The operating system may run on the CPU 20A, the auxiliary processor 38, or a mixture of the two. The operating system provides the user with a graphical user interface such as the PlayStation Dynamic Menu. The menu allows the user to access operating system features and to select games and optionally other content.
Referring now to Figure 2, the PlayStation Move ® controller 42 (or simply ‘VR controller’ hereafter) may be considered a non-limiting example of a controller that may be used with a head-mounted device when playing a VR game. Other examples may be the Oculus Rift ® Touch controllers, or the HTC Vive ® controllers.
These VR controllers are characterised by each being held in one hand and tracking movements of the user’s respective hand, to assist with representing the user within the virtual environment. To assist with this, the controller typically comprises one or more tracking means.
The illustrated VR controller comprises a tracking object 420 such as an illuminated ball, which may be used to optically track the controller's position in space by a camera connected to a host apparatus hosting the virtual reality environment (typically a videogame console such as the PlayStation 4® or a PC, but potentially a server providing a streamed gaming experience). Other VR controllers may use a different configuration of optical tracking objects, or not include these at all and optionally instead use means such as magnetism, ultrasound, GPS, picocell / WiFi® radio triangulation, laser or light interferometry and/or accelerometer / gyroscope sensors (e.g. MEMS devices), or other suitable motion tracking techniques localised in the controller, the console (or a separate sensor peripheral), or any combination of the above.
Hence although not shown, the VR controller may also comprise for example one or more accelerometers and/or gyroscopes to track changes in acceleration and hence optionally also velocity and position, these being integrated either within the controller or by the host device.
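By way of illustration only, the following minimal sketch shows how such integration might be performed, assuming gravity-compensated accelerometer samples arriving at a fixed rate; the class name, sample rate and values are assumptions, not details from the patent:

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class MotionState:
    velocity: List[float] = field(default_factory=lambda: [0.0, 0.0, 0.0])
    position: List[float] = field(default_factory=lambda: [0.0, 0.0, 0.0])

    def integrate(self, accel: List[float], dt: float) -> None:
        # Naive Euler integration; a real tracker would also subtract
        # gravity and correct accumulated drift (e.g. against camera data).
        for axis in range(3):
            self.velocity[axis] += accel[axis] * dt
            self.position[axis] += self.velocity[axis] * dt

state = MotionState()
state.integrate([0.0, 2.0, 0.0], dt=1 / 120)  # one hypothetical 120 Hz sample
print(state.velocity, state.position)
```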
The VR controller also comprises a handle 424 that enables it to be held in a single hand, or otherwise attached to the hand, wrist or fingers, so that gross gestures that serve to change the position and orientation of the user's hand induce a corresponding change in position and orientation of the controller to a greater or lesser extent.
The VR controller then comprises a number of inputs 422, here labelled A-G. The position and number of controls in the figures are purely exemplary. These controls may be arranged in any manner suitable to the functioning and ergonomics of the controller. In the illustrated example, there are three basic groups; buttons A-D may correspond to buttons on a standard videogame controller and have corresponding functions, and similarly button E may correspond to a button on the standard controller such as a trigger or action button, but transposed relative to buttons A to D so as to be easily accessible by the user’s thumb. Button F is a trigger typically used with an index finger. Finally, button G is physically separate from the other buttons to avoid accidental use and may be used to trigger or select out-of-gameplay functions such as game menus or operating system menus.
However it will be appreciated that the total number of controls is small, compared for example to a full size keyboard or the PlayStation 4 ® controller, which has 15 buttons and two joysticks available. This lack of controls could potentially hamper the degree of interactivity and functionality available within an immersive VR game.
Accordingly, in an embodiment of the present invention buttons on the VR controller may be associated with one or more additional functions, thereby enabling a similar total number of functions on the VR controller to that available using other standard console controllers such as the PlayStation 4 ® controller.
In principle, these functions could be accessed using button E or button G for example acting like a ‘shift’ or ‘alt’ key to change the mode of other keys on the device, but this may require significant dexterity, compounded by the fact that when wearing an HMD, the user cannot see the controller in their hand. Furthermore, typically keys A-G are intended to be controlled with just the user’s thumb, making simultaneous specific keypresses difficult.
Consequently, in an embodiment of the present invention, a method of selecting which of a plurality of functions associated with a button is to be selected when the button is pressed comprises detecting the physical position of the controller where, as applicable, ‘position’ can encompass gross position (i.e. overall position), local position (i.e. orientation) and/or motion (i.e. change in position), in absolute terms and/or with reference to another physical object, and selecting one of the plurality of functions responsive to the detected physical position/state.
In this way, the user can change the effective function of a button/key either consciously - for example by tilting the VR controller up or down before pressing a button in order to obtain different functions, or unconsciously - for example where the user performs a natural action, such as swinging the wand controller like a bat, or holding it like a shield or sword, and the position (position, orientation and/or motion) resulting from this action causes the selection of a function for a button that is relevant to that action - for example to add top-spin when hitting the ball, or parrying with the shield or sword. When the controller is being held in another, neutral position, the same button may have a different function, such as throwing the ball in the air to serve, or raising the shield / drawing the sword.
Notably this is distinct from assigning functions to detected gestures of the wand controller itself; drawing specific patterns in the air or the like (for example to pretend to cast different spells) may separately be detected if desired.
Hence referring now to Figure 3, in an embodiment of the present invention, a method of interactive control for a VR controller comprises:
In a first step S310, associating a plurality of (distinct) functions with at least a first user input of the controller.
This is typically done at the videogame console apparatus. For example, an application running on the apparatus and generating a virtual environment may associate a button ‘B’ with plural functions, relying on other received data to guide selection. However, potentially a VR controller could provide distinct selections based upon detected button presses and internal telemetry such as that obtained from an accelerometer or the like.
In a second step S320, receiving a notification that the first user input has been selected. Typically this occurs at the videogame console via an input port such as a USB 35, Wi-Fi® 34, or Bluetooth® 33 port, which receives an input signal from the VR controller either wirelessly or through a wired connection which indicates the user has pressed a button on the controller, or operated some other control interface such as a joystick, joypad or touch sensitive surface.
In a third step S330, receiving data indicative of the controller's physical position in a real world environment. As noted previously, such data could be a video image of a real-world scene, typically encompassing the user and in particular a characteristic visible component of the controller, captured by the video camera 41 and received at the videogame console apparatus via an input port such as a USB 35, Wi-Fi® 34, or Bluetooth® 33 port. Alternatively or in addition, such data could be telemetry from motion sensors within the VR controller, either transmitted by the VR controller in raw form (for example as acceleration values) or already converted, for example to velocity or position data, and received as described previously above.
In a fourth step S340, selecting one of the plurality of functions in dependence upon the outcome of analysis of the received data. As noted above in relation to games of tennis and sword fighting, the respective functions are likely to be dependent upon the virtual environment/game being controlled. However it will be appreciated that different functions may be ascribed to a button according to different positional states of the controller, as discussed elsewhere herein. Typically the selection will be implemented by the central processor 20A operating under suitable software instruction.
In a fifth step S350, modifying the state of a virtual environment responsive to the selected function. In this regard, the system behaves much as it would when a button or other input is used on a conventional controller having a one-to-one relationship between functions and inputs. Again, typically the modification will be implemented by the central processor 20A, optionally in conjunction with the graphics processor 20B, operating under suitable software instruction.
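To make the flow of steps s310 to s350 concrete, here is an illustrative sketch of how a host application might wire them together; the button name, the predicates and the game functions are hypothetical, not taken from the patent:

```python
# s310: associate plural (hypothetical) functions with input 'B', each
# guarded by a predicate over the controller's positional state.
def swing(env): env["ball"] = "top-spin"   # invented game functions
def serve(env): env["ball"] = "tossed"

bindings = {
    "B": [
        (lambda s: s["speed"] > 2.0, swing),   # controller moving fast
        (lambda s: True, serve),               # neutral position: default
    ],
}

def on_button(button, state, env):
    # s320/s330: the press notification and positional data have arrived.
    for predicate, function in bindings[button]:
        if predicate(state):        # s340: select by analysis of the data
            function(env)           # s350: modify the virtual environment
            return

env = {}
on_button("B", {"speed": 3.1}, env)
print(env)  # {'ball': 'top-spin'}
```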
In an instance of this embodiment, the data indicative of the controller’s physical position in a real world environment comprises a captured video image of the real world environment that includes the controller. As noted above, this data may for example be captured by a video camera 41 and received at an input port of the videogame console.
Given such video data, the analysis conducted during the step of selecting a function may comprise detecting the controller’s absolute position in the captured video image; this may for example enable a threshold differentiation in behaviour depending on absolute position so that for example in a volleyball game the controller has to be above a notional height in the captured image in order to slam the ball over the net, whereas below this height the same button implements a pass to another player.
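As a hedged sketch of this height-threshold idea, the controller's tracked vertical coordinate in the captured image (normalised here so 0 is the bottom of the frame and 1 the top) can pick between the two functions; the 0.7 threshold and function names are invented values:

```python
SLAM_HEIGHT = 0.7  # hypothetical normalised image height

def volleyball_action(image_y: float) -> str:
    # Above the notional height: slam over the net; below: pass to a player.
    return "slam" if image_y >= SLAM_HEIGHT else "pass"

assert volleyball_action(0.85) == "slam"
assert volleyball_action(0.40) == "pass"
```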
Similarly the analysis may comprise detecting the controller's position in the captured video image relative to another controller (or another reference object, such as the user's torso). Hence for example the function of a selected button may change if the user's hands are brought together whilst both holding a controller, or may change between when a user clutches a controller to the chest and when they extend their arm out. Similarly the function of a selected button may change if the controller is held above the user's chest or below it, or at shoulder or head height, or above head height, as appropriate. Other relative positions salient to particular games or functionality would be apparent to the skilled person.
Similarly the analysis may comprise detecting the controller's position in the captured video image relative to a head mounted display worn by a user. Typically an HMD will have similar tracking functions to the hand-held controller, namely characteristic visual features and/or accelerometer/gyroscopic telemetry, and hence can also be tracked with a similar degree of accuracy. Hence for example a function of a selected button may be to take an object from the virtual environment, but this may change to eating the object if the controller is held close to the user's face when the button is pressed.
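A minimal sketch of such a proximity test follows, assuming the tracked controller and HMD positions are available in a common coordinate frame; the 25 cm 'near face' radius and the function names are assumptions:

```python
import math

def pick_function(controller_pos, hmd_pos, near_face_m: float = 0.25) -> str:
    # Euclidean distance between controller and HMD selects the function.
    distance = math.dist(controller_pos, hmd_pos)
    return "eat_object" if distance < near_face_m else "take_object"

print(pick_function((0.10, 1.50, 0.30), (0.10, 1.60, 0.20)))  # eat_object
print(pick_function((0.60, 1.10, 0.30), (0.10, 1.60, 0.20)))  # take_object
```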
The above principles can also apply to the acceleration, velocity and orientation of the controller. Hence for example if the controller is moving slowly, then a button press associated with sword fighting may cause a blocking action, whereas if the controller is moving above a threshold speed, then, all else being equal, pressing the same button may result in a parrying action. Similarly in a boxing game, pressing a button when the controller has little or no acceleration may result in a feint, whereas pressing the same button when the controller is above a threshold level of acceleration may result in a punch. Meanwhile, orientation in one or more of three or six axes (depending on the sensors available) may also change the function of the button.
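The speed and acceleration gating described above might be sketched as follows; the threshold values and the function names are illustrative assumptions, not values from the patent:

```python
def sword_button(speed_m_s: float) -> str:
    # Slow movement blocks; movement above a threshold speed parries.
    return "parry" if speed_m_s > 1.5 else "block"

def boxing_button(accel_m_s2: float) -> str:
    # Little or no acceleration feints; strong acceleration punches.
    return "punch" if accel_m_s2 > 8.0 else "feint"

print(sword_button(0.4), sword_button(2.2))   # block parry
print(boxing_button(1.0), boxing_button(12.0))  # feint punch
```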
In another instance of the embodiment, the data indicative of the controller's physical position in a real world environment comprises telemetry transmitted by the controller. As noted above, this is typically based on signals from accelerometers and/or gyroscopic MEMS devices, which may or may not be integrated to provide velocity or position data before transmission from the controller to the console.
In a similar manner to the video data, the analysis conducted during the step of selecting a function may comprise detecting one or more of the acceleration of the controller, the velocity of the controller, the position of the controller, and the orientation of the controller, and the two sources of data may be used together to provide complementary information or to improve or verify the accuracy of data from one or other of the sources.
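One simple way the two sources might be combined is a complementary blend, in which the smooth but drifting telemetry estimate is continually nudged toward the noisier but drift-free camera fix; this sketch, including the blend factor, is an assumption rather than the patent's method:

```python
def fuse(telemetry_pos, camera_pos, alpha: float = 0.98):
    """Blend per axis: mostly telemetry, corrected toward the camera fix."""
    return tuple(alpha * t + (1.0 - alpha) * c
                 for t, c in zip(telemetry_pos, camera_pos))

print(fuse((1.00, 1.20, 0.50), (1.05, 1.18, 0.52)))
```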
In either case, irrespective of the source of the data, it will be appreciated that not every button needs to have a plurality of functions associated with it, and similarly that the selection of one of a plurality of functions need not be responsive to every aspect of positional state that might be analysed; hence for example the selection between the function of picking up an object to store it in an inventory and the function of picking up an object to eat it may only be responsive to the relative position of the controller to the user's head, and take no account of speed, acceleration or orientation of the controller, or the absolute position of the controller or user.
In an instance of the present embodiment, the selecting step comprises selecting one of the plurality of functions in dependence upon the outcome of analysis of the received data in a first mode, and selecting only a default function independent of the outcome of analysis of the received data in a second mode.
This allows the videogame console apparatus and/or the virtual reality application running thereon to suspend access to plural functions for a single button where this would be inappropriate. Notably this is distinct from the case where, for example, switching from main gameplay to a menu might cause a button that previously had multiple functions to now only have one, different function (e.g. select menu option); rather, it may relate to when a user has to perform a frenetic activity where highly variable changes in position, velocity or acceleration could result in a button potentially passing through trigger states for multiple functions very quickly and with limited controller awareness from the user.
Examples may include when the user is serving during a tennis match, or where the VR controller is also used to enact gestures by sweeping the controller through predetermined paths in space; during these actions, the controller may potentially be at positions which have separate significance for other buttons, but selection of these functions would not be appropriate at the time.
Hence more generally, secondary or additional functions of a first button may be selectively suspended when the controller is in a position (including in motion) likely relevant to a function of a second button that is determined by the game designer not to be compatible or desirable to enact at the same time.
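A sketch of this two-mode selection follows; the convention that the last binding holds the default function, and all names used, are assumptions made for illustration:

```python
def resolve(button, state, bindings, analysis_enabled: bool):
    """Return the function to enact for 'button' given positional 'state'."""
    candidates = bindings[button]
    if not analysis_enabled:              # second mode: analysis suspended,
        return candidates[-1][1]          # only the default function fires
    for predicate, function in candidates:
        if predicate(state):              # first mode: positional analysis
            return function
    return candidates[-1][1]

bindings = {"B": [(lambda s: s["speed"] > 2.0, "swing"),
                  (lambda s: True, "serve")]}
print(resolve("B", {"speed": 3.0}, bindings, analysis_enabled=True))   # swing
print(resolve("B", {"speed": 3.0}, bindings, analysis_enabled=False))  # serve
```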
In another instance of the embodiment, in a related fashion, where different functions are associated with different characteristics of the controller’s position in the real world (such that more than one function might be applicable for the same button), the step of selecting one of the plurality of functions comprises detecting which functions correspond to a detected characteristic of the controller’s position in the real world, and if more than one function is detected, selecting one of the functions in accordance with a predetermined priority ranking.
Hence if for example a button has one additional function with a trigger state corresponding to a certain orientation range, and another additional function with a trigger state corresponding to a threshold level of motion, it may be possible for the user to move the controller about that threshold whilst holding it within the relevant orientation range. In this case, which of the additional functions should be selected may be decided according to a priority list (of course, if the functions are not mutually exclusive, then both may be enacted by the button press).
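Such priority resolution might be sketched as below, where the ranking list is a design-time choice as the text notes; all names and the ordering are invented for illustration:

```python
def select_by_priority(matched, priority):
    """Return the matched function name that ranks highest (earlier in
    'priority' means higher priority); all matches are assumed ranked."""
    return min(matched, key=lambda name: priority.index(name))

priority = ["orientation_spell", "motion_dash"]   # earlier = higher priority
matched = ["motion_dash", "orientation_spell"]    # both trigger states met
print(select_by_priority(matched, priority))      # orientation_spell
```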
It will be appreciated that the above methods may be carried out on conventional hardware suitably adapted as applicable by software instruction or by the inclusion or substitution of dedicated hardware.
Thus the required adaptation to existing parts of a conventional equivalent device may be implemented in the form of a computer program product comprising processor implementable instructions stored on a non-transitory machine-readable medium such as a floppy disk, optical disk, hard disk, PROM, RAM, flash memory or any combination of these or other storage media, or realised in hardware as an ASIC (application specific integrated circuit) or an FPGA (field programmable gate array) or other configurable circuit suitable to use in adapting the conventional equivalent device. Separately, such a computer program may be transmitted via data signals on a network such as an Ethernet, a wireless network, the Internet, or any combination of these or other networks.
As noted previously herein, a PlayStation 4 is an example of a suitable apparatus.
Hence in an embodiment of the present invention, an apparatus (such as a PlayStation 4 (10)) for generating an interactive virtual environment comprises an association processor (such as for example CPU 20A) adapted (by suitable software instruction) to associate a plurality of functions with at least a first user input (422A-G) of a controller (42); a receiver (such as one of the USB 35, Wi-Fi® 34, and Bluetooth® 33 ports) adapted to receive a notification that the first user input has been selected; a receiver (such as one of the USB 35, Wi-Fi® 34, and Bluetooth® 33 ports) adapted to receive data indicative of the controller's physical position in a real world environment; a selection processor (such as for example CPU 20A) adapted (by suitable software instruction) to select one of the plurality of functions in dependence upon the outcome of analysis of the received data; and a modification processor (again such as for example CPU 20A) adapted (by suitable software instruction) to modify the state of the interactive virtual environment responsive to the selected function.
As noted above, this apparatus may implement the above-described methods and techniques using suitable software instruction, and hence likewise in an instance of the embodiment, the data indicative of the controller’s physical position in a real world environment may comprise a captured video image of the real world environment that includes the controller, and consequently the selection processor may be adapted to detect one or more from the list consisting of the controller’s absolute position in the captured video image, the controller’s position in the captured video image relative to another controller, the controller’s position in the captured video image relative to a head mounted display worn by a user, the acceleration of the controller, the velocity of the controller; and the orientation of the controller.
Similarly, in an instance of the embodiment, the data indicative of the controller’s physical position in a real world environment may comprise telemetry transmitted by the controller, and consequently the selection processor may be adapted to detect one or more from the list consisting of the acceleration of the controller; the velocity of the controller; the position of the controller; and the orientation of the controller.
Meanwhile in an instance of the embodiment, the selection processor may be adapted to select one of the plurality of functions in dependence upon the outcome of analysis of the received data in a first mode, and to select only a default function independent of the outcome of analysis of the received data in a second mode.
Again in an instance of the embodiment, the association processor may be adapted to associate different functions with different characteristics of the controller’s position in the real world, and the selection processor is adapted to detect which functions correspond to a detected characteristic of the controller’s position in the real world, and if more than one function is detected, select one of the functions in accordance with a predetermined priority ranking.
Hence the apparatus may implement the methods and techniques described herein.

Claims (16)

1. A method of interactive control for a VR controller, comprising the steps of:
associating a plurality of functions with at least a first user input of the controller;
receiving a notification that the first user input has been selected;
receiving data indicative of the controller's physical position in a real world environment;
selecting one of the plurality of functions in dependence upon the outcome of analysis of the received data; and
modifying the state of a virtual environment responsive to the selected function.
2. The method of claim 1, in which the data indicative of the controller’s physical position in a real world environment comprises a captured video image of the real world environment that includes the controller.
3. The method of claim 2, in which the analysis during the selecting step comprises detecting one or more from the list consisting of:
i. the controller's absolute position in the captured video image;
ii. the controller's position in the captured video image relative to another controller;
iii. the controller's position in the captured video image relative to a head mounted display worn by a user;
iv. the acceleration of the controller;
v. the velocity of the controller; and
vi. the orientation of the controller.
4. The method of any one of the preceding claims, in which the data indicative of the controller’s physical position in a real world environment comprises telemetry transmitted by the controller.
5. The method of claim 4, in which the analysis step comprises detecting one or more from the list consisting of:
i. the acceleration of the controller;
ii. the velocity of the controller;
iii. the position of the controller; and
iv. the orientation of the controller.
6. The method according to any one of the preceding claims, in which the selecting step comprises selecting one of the plurality of functions in dependence upon the outcome of analysis of the received data in a first mode, and selecting only a default function independent of the outcome of analysis of the received data in a second mode.
7. The method according to any one of the preceding claims, where different functions are associated with different characteristics of the controller's position in the real world, and the step of selecting one of the plurality of functions comprises:
detecting which functions correspond to a detected characteristic of the controller's position in the real world; and,
if more than one function is detected, selecting one of the functions in accordance with a predetermined priority ranking.
8. A computer program adapted to cause a computer system to perform the method of any one of the preceding claims.
9. An apparatus for generating an interactive virtual environment, comprising:
an association processor adapted to associate a plurality of functions with at least a first user input of a controller;
a receiver adapted to receive a notification that the first user input has been selected;
a receiver adapted to receive data indicative of the controller's physical position in a real world environment;
a selection processor adapted to select one of the plurality of functions in dependence upon the outcome of analysis of the received data; and
a modification processor adapted to modify the state of the interactive virtual environment responsive to the selected function.
10. The apparatus of claim 9, in which the data indicative of the controller’s physical position in a real world environment comprises a captured video image of the real world environment that includes the controller.
11. The apparatus of claim 10, in which the selection processor is adapted to detect one or more from the list consisting of:
i. the controller's absolute position in the captured video image;
ii. the controller's position in the captured video image relative to another controller;
iii. the controller's position in the captured video image relative to a head mounted display worn by a user;
iv. the acceleration of the controller;
v. the velocity of the controller; and
vi. the orientation of the controller.
12. The apparatus of any one of claims 9 to 11, in which the data indicative of the controller’s physical position in a real world environment comprises telemetry transmitted by the controller.
13. The apparatus of claim 12, in which the selection processor is adapted to detect one or more from the list consisting of:
i. the acceleration of the controller;
ii. the velocity of the controller;
iii. the position of the controller; and
iv. the orientation of the controller.
14. The apparatus according to any one of claims 9 to 13, in which the selection processor is adapted to select one of the plurality of functions in dependence upon the outcome of analysis of the received data in a first mode, and to select only a default function independent of the outcome of analysis of the received data in a second mode.
15. The apparatus according to any one of claims 9 to 14, where the association processor is adapted to associate different functions with different characteristics of the controller's position in the real world, and the selection processor is adapted to:
detect which functions correspond to a detected characteristic of the controller's position in the real world; and,
if more than one function is detected, select one of the functions in accordance with a predetermined priority ranking.
Intellectual Property Office
Application No: GB1619811.1 Examiner: Mr Patrick Lucas
GB1619811.1A 2016-11-23 2016-11-23 Apparatus and method of interactive control Withdrawn GB2556894A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
GB1619811.1A GB2556894A (en) 2016-11-23 2016-11-23 Apparatus and method of interactive control

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
GB1619811.1A GB2556894A (en) 2016-11-23 2016-11-23 Apparatus and method of interactive control

Publications (2)

Publication Number Publication Date
GB201619811D0 GB201619811D0 (en) 2017-01-04
GB2556894A true GB2556894A (en) 2018-06-13

Family

ID=57993921

Family Applications (1)

Application Number Title Priority Date Filing Date
GB1619811.1A Withdrawn GB2556894A (en) 2016-11-23 2016-11-23 Apparatus and method of interactive control

Country Status (1)

Country Link
GB (1) GB2556894A (en)


Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2008276636A (en) * 2007-05-02 2008-11-13 Nintendo Co Ltd Information processor and information processing program
US20110195782A1 (en) * 2010-02-05 2011-08-11 Sony Computer Entertainment Inc. Systems and methods for determining controller functionality based on position, orientation or motion
EP2390760A2 (en) * 2010-05-25 2011-11-30 Nintendo Co., Ltd. Information processing of attitude or motion data for object selection
US20120287043A1 (en) * 2011-05-11 2012-11-15 Nintendo Co., Ltd. Computer-readable storage medium having music performance program stored therein, music performance apparatus, music performance system, and music performance method
JP2015231545A (en) * 2015-07-28 2015-12-24 株式会社カプコン Game program and game system

Also Published As

Publication number Publication date
GB201619811D0 (en) 2017-01-04


Legal Events

Date Code Title Description
WAP Application withdrawn, taken to be withdrawn or refused ** after publication under section 16(1)