GB2588807A - Data processing apparatus, control device and method - Google Patents

Data processing apparatus, control device and method

Info

Publication number
GB2588807A
GB2588807A GB1916272.6A GB201916272A
Authority
GB
United Kingdom
Prior art keywords
control device
mode
determination
video game
user
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
GB1916272.6A
Other versions
GB201916272D0 (en)
Inventor
Lee Jones Michael
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Sony Interactive Entertainment Inc
Original Assignee
Sony Interactive Entertainment Inc
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Sony Interactive Entertainment Inc filed Critical Sony Interactive Entertainment Inc
Priority to GB1916272.6A priority Critical patent/GB2588807A/en
Publication of GB201916272D0 publication Critical patent/GB201916272D0/en
Publication of GB2588807A publication Critical patent/GB2588807A/en
Pending legal-status Critical Current

Classifications

    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00 Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/45 Controlling the progress of the video game
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00 Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/20 Input arrangements for video game devices
    • A63F13/21 Input arrangements for video game devices characterised by their sensors, purposes or types
    • A63F13/214 Input arrangements for video game devices characterised by their sensors, purposes or types for locating contacts on a surface, e.g. floor mats or touch pads
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00 Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/20 Input arrangements for video game devices
    • A63F13/21 Input arrangements for video game devices characterised by their sensors, purposes or types
    • A63F13/218 Input arrangements for video game devices characterised by their sensors, purposes or types using pressure sensors, e.g. generating a signal proportional to the pressure applied by the player
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00 Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/20 Input arrangements for video game devices
    • A63F13/23 Input arrangements for video game devices for interfacing with the game device, e.g. specific interfaces between game controller and console
    • A63F13/235 Input arrangements for video game devices for interfacing with the game device, e.g. specific interfaces between game controller and console using a wireless connection, e.g. infrared or piconet
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011 Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/016 Input arrangements with force or tactile feedback as computer generated output to the user

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Human Computer Interaction (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

A video game system 400 comprising a video game processor 401 and a controller 403 provided with a control processor 405 configured to receive data from grip sensors 404 in the controller to determine whether the controller is being held by the user. The video game system is configured to execute a video game in a first mode of operation when the controller is being held, or in a second mode of operation when it is not being held. The second mode of operation may comprise a stand-by, rest or sleep mode in order to save power when the user is not interacting with the video game system. A corresponding control device, a method for controlling a control device and a method for controlling a video game machine are also provided.

Description

DATA PROCESSING APPARATUS, CONTROL DEVICE AND METHOD
Technical Field
The present disclosure relates to an apparatus, control device, method for controlling an output of a control device and method for controlling an output of a video game machine that is in communication with a control device.
Background
The "background" description provided herein is for the purpose of generally presenting the context of the disclosure. Work of the presently named inventors, to the extent it is described in this background section, as well as aspects of the description which may not otherwise qualify as prior art at the time of filing, are neither expressly nor impliedly admitted as prior art against the present disclosure.
It is known in the art for electronic devices to be provided with 'sleep' or 'rest' modes in which less power is consumed by the devices, compared with normal operation. Typically, for such modes, the brightness of a screen associated with the device is reduced, and applications are suspended or terminated. As an example, the PS4® offers a 'rest mode' in which any video game applications running at the device are suspended and an associated display is switched off.
Conventionally, a device is detected as being idle if no user inputs are detected as being received at the device after a predetermined time interval. As will be appreciated, detecting idleness of a device in this manner requires some level of guessing, as a user may well still be present, despite not providing inputs to the device. Moreover, the reduction in power consumption achieved in this manner will usually be sub-optimal, since there is no real-time detection of a lack of engagement between user and device.
As more advanced iterations of games consoles are released on the market, the complexity and power consumption of the consoles and associated peripheral devices also tends to increase. However, with these advancements in technology comes new opportunities for detecting when a device is idle. In any case, there is a need in the art to ensure that the power consumed by such devices is handled in an efficient and economic manner. The present invention seeks to address or at least alleviate this problem.
Summary
The present disclosure is defined by the appended claims.
Brief Description of the Drawings
To assist understanding of the present disclosure and to show how embodiments may be put into effect, reference is made by way of example to the accompanying drawings in which:
Figure 1 shows schematically an example of a video game machine;
Figure 2A shows schematically a front view of a hand-holdable control device having left and right hold portions;
Figure 2B shows schematically a side view of the hand-holdable control device shown in Figure 2A;
Figure 3 shows schematically an example of a hand-holdable control device having an elongate body with a camera-trackable portion disposed at a distal end;
Figure 4 shows schematically an example of a data processing apparatus in accordance with the present disclosure;
Figures 5A and 5B show respective schematic views of a control device that employs a proximity detector;
Figure 6 shows schematically an example of a control device that employs a pressure sensor;
Figure 7 shows schematically an example of a motion controller that employs two pressure bands;
Figures 8A and 8B show respective schematic views of a control device employing a plurality of sensors;
Figure 9 shows schematically an example of an apparatus that employs a machine learning model;
Figure 10 shows schematically an example of a control device that employs one or more microphones as a sensor;
Figure 11 shows schematically an example of an apparatus that employs a camera as a sensor;
Figure 12 shows schematically an example of an apparatus that employs an object recognition unit and / or a facial recognition unit;
Figure 13 shows schematically an example of a control device that comprises a camera;
Figure 14 shows schematically an example of an apparatus comprising a plurality of output devices;
Figure 15 shows schematically an example of a control device in accordance with the present disclosure;
Figure 16 shows schematically an example of a method for controlling an output of a control device in accordance with the present disclosure; and
Figure 17 shows schematically an example of a method for controlling the operation of a video game machine in accordance with the present disclosure.
Detailed Description
The present disclosure relates to a processing apparatus, control device, method for controlling an output of a control device and method for controlling an output of a video game machine that is in communication with the control device. In some examples, the video game machine comprises one or more video game processors operable to execute a video game application at the video game machine. The control device may comprise a hand-holdable video games controller, examples of which will be described below. The video game application may be configured to generate video frames for outputting as part of the video game and / or to receive such frames from a cloud-based device. In the latter case, the video game application need not be installed at a conventional games console, but may form part of e.g. a display device, streaming device, smartphone, PC, tablet, laptop, dongle, etc. In the present disclosure, the terms 'control device' and 'hand-holdable control device' are used interchangeably.
As an example of a videogame playing device, Figure 1 schematically illustrates the overall system architecture of a Sony® PlayStation 4® entertainment device. It will be appreciated that the device shown in Figure 1 is just an illustrative example.
A system unit 10 is provided, with various peripheral devices connectable to the system unit. The output of the peripheral devices may be controlled in dependence upon whether a user is detected as holding a control device, as will be described later.
The system unit 10 comprises an accelerated processing unit (APU) 20 being a single chip that in turn comprises a central processing unit (CPU) 20A and a graphics processing unit (GPU) 20B. The APU 20 has access to a random access memory (RAM) unit 22. The APU 20 communicates with a bus 40, optionally via an I/O bridge 24, which may be a discrete component or part of the APU 20.
Connected to the bus 40 are data storage components such as a hard disk drive 37, and a Blu-ray ® drive 36 operable to access data on compatible optical discs 36A. Additionally the RAM unit 22 may communicate with the bus 40.
Optionally also connected to the bus 40 is an auxiliary processor 38. The auxiliary processor 38 may be provided to run or support the operating system.
The system unit 10 communicates with peripheral devices as appropriate via an audio/visual input port 31, an Ethernet ® port 32, a Bluetooth ® wireless link 33, a Wi-Fi wireless link 34, or one or more universal serial bus (USB) ports 35. Audio and video may be output via an AV output 39, such as an HDMI port.
The peripheral devices may include a monoscopic or stereoscopic video camera 41 such as the PlayStation Eye ®; wand-style videogame controllers 42 such as the PlayStation Move and conventional handheld videogame controllers 43 such as the DualShock 4 ®; portable entertainment devices 44 such as the PlayStation Portable ® and PlayStation Vita ®; a keyboard 45 and/or a mouse 46; a media controller 47, for example in the form of a remote control; and a headset 48. Other peripheral devices may similarly be considered such as a microphone, speakers, mobile phone, printer, or a 3D printer (not shown). The output of any (and any combination) of these peripheral devices may be controlled by a control processor and video game processor working alone or in cooperation (not shown), as will be described later.
The GPU 20B, optionally in conjunction with the CPU 20A, generates video images and audio for output via the AV output 39. Optionally the audio may be generated in conjunction with or instead by an audio processor (not shown).
The video and optionally the audio may be presented to a television 51. Where supported by the television, the video may be stereoscopic. The audio may be presented to a home cinema system 52 in one of a number of formats such as stereo, 5.1 surround sound or 7.1 surround sound. Video and audio may likewise be presented to a head mounted display unit 53 worn by a user 60. The output of the television may be controlled by a control processor and / or video game processor, as will be described later.
In operation, the entertainment device defaults to an operating system such as a variant of FreeBSD 9.0. The operating system may run on the CPU 20A, the auxiliary processor 38, or a mixture of the two.
Figure 2A shows schematically an example of a hand-holdable control device 200 that may be used in accordance with the present disclosure. In Figure 2A, the hand-holdable control device 200 is shown face-on, i.e. a front face of the device 200 is shown. The hand-holdable control device 200 comprises left and right hold portions, 202L, 202R to be held by a user. The left and right hold portions, 202L, 202R are located spaced apart from each other in the left-right direction and are interconnected by a central section 203. The hold portions 202L, 202R each comprise a plurality of operating members 204 for receiving user input. In Figure 2A, the operating members 204 are shown as being disposed at an upper portion of the hold portions 202L, 202R. The operating members 204 are arranged such that, in use, a user can reach the operating members 204 of a respective hold portion with the thumb of the hand gripping that hold portion.
The control device 200 also comprises additional operating members 205 that are reachable by a user's index and middle fingers when gripping the left and right hold portions 202L, 202R of the device 200. In Figure 2A, the additional operating members 205 are shown as being disposed at an upper part of each hold portion 202L, 202R.
In some examples, the additional operating members 205 may have a variable (and programmable) resistance, such that the degree of force required to press down a given operating member 205 varies depending on the context within a video game. For example, if a player is using an additional operating member 205 to drive a vehicle through rocky terrain, the operating member 205 may provide more resistance to a pressing motion relative to the resistance provided when the vehicle is being driven on a tarmacked road. The resistance to such pressing motions may be provided by controlling e.g. vibration that is applied to the additional operating members 205.
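As an illustrative sketch only (the context labels and the normalised resistance scale are assumptions, not part of the disclosure), the context-dependent resistance of such an operating member might be selected as:

```python
# Map video game context to a normalised resistance level in [0.0, 1.0]
# for an adaptive operating member (e.g. a trigger). Values are illustrative.
RESISTANCE_BY_CONTEXT = {
    "rocky_terrain": 0.9,  # driving off-road: the trigger resists pressing
    "tarmac_road": 0.2,    # smooth road: much lighter press required
}

def trigger_resistance(context: str) -> float:
    """Return the resistance level for the current video game context."""
    return RESISTANCE_BY_CONTEXT.get(context, 0.5)  # neutral default
```

The returned level would then be converted to an actuator setting (e.g. a vibration magnitude) by the control device.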
In some examples, one or more haptic transducers may be located within the housing of each hold portion 202L, 202R. The haptic transducers enable vibrational feedback to be output by the control device 200. The output of the one or more haptic transducers may be controlled in dependence on vibration data received from a video game machine, said data indicating when a vibration is to be output by the transducers, and optionally, a magnitude of that vibration.
Although the left and right hold portions 202L, 202R are shown in Figure 2A as being connected by a central section 203, in some embodiments, the left and right hold portions 202L, 202R may be releasably attachable to the central section 203. For example, the left and right hold portions 202L, 202R may be separable from one another and may be useable as individual control devices. Each hold portion 202L, 202R may comprise a plug (or groove) for receiving a corresponding groove (or plug) on an outer edge of the central section 203, for example. The central section 203 may comprise a display device that is securable to opposing inner edges of the left and right hold portions 202L, 202R, for example.
In Figure 2A, the central section 203 is shown as comprising two analogue sticks 206 vertically disposed at a lower portion of the central section 203. The two analogue sticks 206 allow a user to input a directional control, such as controlling the direction of movement of a character and / or pose of a virtual camera. It will be appreciated that Figure 2A is an illustrative example, and that in other examples, the analogue sticks 206 may be arranged differently on the device 200 and / or there may be a different number of them. For example, there may be just one analogue stick disposed at one of the hold portions. In some examples, a scroll wheel may be provided at an upper portion of one of the hold portions for providing a directional input.
In Figure 2A, the hand-holdable control device 200 is also shown as comprising operating members 207 and a touch pad 208. The operating members 207 may correspond to a 'share' button for capturing video of gameplay and an 'options' button for accessing game or system options. The touchpad 208 may comprise a touch sensor for detecting a touch input. In some examples, the touch sensor may be configured to detect a position on the touchpad 208 at which the touch input is received, such that a user can provide different inputs by touching different locations on the touch pad.
In Figure 2A, the hand-holdable control device 200 is also shown as comprising a speaker 209 for outputting audio. The speaker 209 may be configured to output audio in response to an audio signal received from a video game playing device. In Figure 2A, the speaker is shown as being provided at a front face of the control device 200.
The hand-holdable control device 200 may also comprise an inertial detector (not shown) operable to detect changes in position and / or orientation of the hand-holdable device. The inertial detector may comprise an Inertial Measurement Unit (IMU). Inertial measurements are considered, for the purposes of this description, to encompass one or more of accelerometer detections and gyroscope detections. The inertial detector may provide an additional means through which a player can provide a directional user input to a video game machine. The inertial detector may be equivalent to a motion detector.
It will be appreciated that the control device 200 may comprise additional or fewer features than those shown in Figure 2A. In some examples, the control device 200 may comprise a biometric sensor (not shown) for identifying the user. The biometric sensor may be located at a central or lower position on the front face of the central section 203, for example.
Moreover, the control device 200 may comprise a microphone (not shown) for detecting audio.
Alternatively, or in addition, the control device 200 may comprise a display for displaying visual information to the user, e.g. in the form of an LCD display. The touchpad 208 may comprise the display in the form of a touchscreen.
Although not shown, it will be appreciated that the control device 200 will comprise a communication interface for enabling bidirectional communication with a video game machine via a wired or wireless connection. The communication interface may include one or more of e.g. a Bluetooth interface, a USB socket, Micro-USB socket, etc. The control device may relay user inputs received at the control device 200 to the video game machine via the communication interface. The control device may also receive data from the video game machine so as to control an output of the control device (e.g. vibration, light, audio, etc.).
Figure 2B shows schematically an example of the control device 200 shown in Figure 2A, but from a side view. In Figure 2B, the control device 200 is shown as comprising a tracking panel 211. The tracking panel 211 comprises a light source for illuminating the tracking panel 211. The tracking panel 211 may comprise a diffuser cover arranged in front of the light source so as to diffuse light projected onto it from the light source. The intensity of light output by the light source may be controllable, with the intensity varying in dependence on e.g. whether the control device is detected as being connected to the video game machine, context information associated with a video game currently being played, etc. The tracking panel 211 provides a user with a visual indication of the player / object that is under their control in a video game, the player / object being controllable via inputs receivable at the game controller. In some examples the colour with which the tracking panel 211 is illuminated may correspond to the colour of a visual indicator displayed at a screen in association with the player / object under the player's control. The colour of light output by the light source may be controllable in dependence on player identification information received from a video game machine, for example.
Returning to Figure 2A, it can be seen that the control device 200 also comprises a sensor 210. The sensor 210 is operable to detect whether the control device 200 is being held by the user. Different types of sensor 210 that may be used for this purpose will be described later. In the present disclosure, the control device 200 is said to be operated in a first mode when the control device 200 is determined as being hand-held by the user. Conversely, the control device 200 is said to be operated in a second mode when the control device 200 is determined as not being hand-held by the user. How these modes are determined will become apparent from the embodiments described herein.
It will be appreciated that Figures 2A and 2B are illustrative examples, and that a different number of sensors 210, arranged differently within the control device 200, or external to the control device 200 may be used. In any case, the sensor 210 is in communication with a control processor (not shown), wherein the control processor is configured to determine which of the first and second modes the control device is being operated in, as will be described further later.
Figure 3 shows schematically another example of a hand-holdable control device 300 that may be used in accordance with the present disclosure. The Move™ controller provides an example of the control device 300 shown in Figure 3. More generally, such types of control device 300 may be known in the art as motion controllers.
In Figure 3, the hand-holdable control device 300 comprises an elongate body 301 for hand-holding by the user (i.e. a handle). At a distal end of the elongate body 301 a camera-trackable portion 302 is provided. The camera-trackable portion 302 comprises a light source for illuminating the camera-trackable portion 302. The illumination of the camera-trackable portion 302 facilitates tracking of the camera-trackable portion 302 by an external camera (not shown). However, tracking of the camera-trackable portion 302 may be facilitated by the shape of the camera-trackable portion 302 and the light source need not necessarily be in a light emitting state for the camera-trackable portion 302 to be tracked (ultimately, this will depend on the lighting conditions in which the device is used). The intensity of light output by the light source may be controllable, e.g. by varying the voltage that is supplied to that light source. The intensity of light may be varied so as to indicate that the control device 300 is connected to the video game machine and / or to indicate an action required by a user (e.g. by flashing), for example.
In Figure 3, the control device 300 is shown as comprising an inertial detector 302 for detecting changes in position and / or orientation of the control device 300. The inertial detector 302 may correspond to an Inertial Measurement Unit (IMU) as described previously in relation to Figure 2A. The control device 300 also comprises a communication interface 303 for receiving inertial data from the IMU and relaying this data to a video game machine such that the display of the video game can be updated in accordance with the change in pose (position and / or orientation) of the control device 300. The communication interface 303 may correspond to a wired and / or wireless interface.
In Figure 3, the IMU may correspond to the sensor 302 that is configured to obtain sensor data, such that it can be determined which of the first and second modes the control device is being operated in. However, a different sensor and / or one or more additional sensors (not shown) may also be used for this purpose, as will be described further below.
In Figure 3, the control device 300 is also shown as comprising a processor 304. This processor 304 may correspond to a control processor for determining whether the control device 300 is being hand-held by the user. Alternatively or in addition, the processor 304 may correspond to a data processor, wherein the data processor is configured to control an output of the control device and / or the video game machine, in dependence on whether the control device 300 is determined as being hand-held or not. The operation of the control processor and data processor will be described further, later. Although not shown, it will be appreciated that the control device 300 of Figure 3 may be provided with one or more operating members (e.g. buttons) at a given face or surface portion of the device 300. The operating members may be arranged such that when a user is holding the handle of the control device 300 in their palm, the one or more operating members are accessible to the user's thumb.
Figure 4 shows schematically an example of a data processing apparatus 400 in accordance with the present disclosure.
The data processing apparatus 400 comprises a video game processor 401. The video game processor 401 is configured to execute a video game programme at a video game machine 402 and to update the execution of the video game responsive to user inputs made at or via a hand-holdable control device 403.
The data processing apparatus 400 also comprises a control device 403 that is hand-holdable by a user, the control device 403 being configured to provide control signals to the video game processor 401 indicative of user inputs at the control device. In Figure 4, the hand-holdable control device 403 corresponds to a DualShock 4™. However, the hand-holdable control device 403 may correspond to any of the hand-holdable control devices described previously in relation to Figures 2A, 2B and 3. The hand-holdable control device 403 may comprise a number of further features, as will be described further, later.
The data processing apparatus 400 also comprises at least one sensor 404 for obtaining sensor data. In Figure 4, two sensors are shown, corresponding to a pair of sensors located at the respective handle portions of the control device. It will be appreciated that Figure 4 is an illustrative example and that a different number of sensors 404, arranged differently on, in and / or external to the control device, may be provided. Examples of different sensors and their arrangements will be described further, later.
The data processing apparatus 400 also comprises a control processor 405 configured to receive sensor data from the at least one sensor 404 and to determine, in dependence on the received sensor data, whether the control device 403 is being operated in a first or second mode of operation. In Figure 4, the control processor 405 is shown in dashed outline to indicate that the control processor 405 may be located at either the control device 403 or the video game machine 402, or distributed across both devices. As mentioned previously, a determination of the first mode of operation corresponds to a determination that the control device 403 is being hand-held by the user and a determination of the second mode of operation corresponds to a determination that the control device 403 is not being hand-held by the user.
In Figure 4, the control device 403 and video game machine 402 are both shown as comprising communication interfaces 406A and 406B respectively. These communication interfaces enable the control device 403 to exchange data (in a bi-directional manner) with the video game machine 402. The control device 403 may connect to the video game machine 402 in any of the manners described previously.
The video game processor 401 and the control processor 405 are configured to cooperate to execute a video game program and to vary a dependency of the execution of the video game program on user inputs at the control device 403 according to the determination by the control processor 405 of whether the control device 403 is being operated in the first or second mode of operation.
In some embodiments, the control device 403 is configured to vary the provision of the control signals in response to the determination (of mode) by the control processor 405. As mentioned previously, the control device 403 may comprise one or more user-operable controls, such as e.g. operating members, thumb sticks, a share button, etc. The control device 403 may be configured to provide control signals in respect of a set of candidate operations of the user-operable controls when the control device 403 is determined to be operated in the first mode of operation. This may correspond to transmitting control signals indicative of e.g. button presses, thumbstick motions, motion of the controller itself, audio received at any microphones, etc. to the video game machine 402 via the relevant communication interfaces 406A, 406B.
The control device 403 may also be configured to inhibit the provision of control signals associated with at least a subset of the set of candidate operations of the user-operable controls in response to a determination of the second mode of operation. This may involve, for example, reducing or ceasing the transmission of control signals (or a subset thereof) from the control device 403 to the video game machine 402, responsive to the control device 403 being determined as being operated in the second mode of operation.
Generally speaking, varying the provision of control signals may involve, for example, inhibiting (reducing and / or stopping) the transmission of control signals from the control device 403 to the video game machine 402, or modifying the control signals that are sent to the video game machine 402, such that the behaviour of the video game machine 402 varies in dependence on the determined mode of operation. Modification of the behaviour of the video game machine 402 may in turn result in a modification of the output of the control device.
Further examples of how the behaviour of the control device 403 and / or video game machine 402 may be varied will become apparent from the examples of embodiments described below.
As mentioned above, the data processing apparatus 400 may comprise one or more sensors 404 for obtaining sensor data (which in turn is used to estimate a likelihood as to whether the control device 403 is being hand-held by the user).
The sensor 404 may comprise an inertial detector configured to detect an acceleration of the control device 403. The inertial detector may correspond to the IMU described previously.
That is, the IMU may comprise one or more of an accelerometer and gyroscope for detecting changes in pose of the control device 403.
The control processor 405 may be configured to determine the first or second mode of operation in dependence on a comparison of the detected acceleration with a threshold acceleration. For example, it may be determined that the control device 403 is likely not being held by a user when the detected acceleration is below a threshold amount. When a player is holding the control device 403 it is unlikely that they will hold the device 403 completely stationary (or level) and so a detection of an acceleration exceeding a threshold value may be indicative that the control device 403 is being held by the user.
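By way of an illustrative sketch of the threshold comparison described above (the mode labels, function name and threshold value are assumptions made for the purposes of this example only, and do not form part of the present disclosure):

```python
# Hypothetical sketch of the acceleration-based mode determination.
MODE_HELD = 1       # first mode: control device hand-held by the user
MODE_PUT_DOWN = 2   # second mode: control device not hand-held

ACCEL_THRESHOLD = 0.05  # assumed threshold (in g, above the gravity baseline)

def determine_mode_from_accel(accel_magnitudes, threshold=ACCEL_THRESHOLD):
    """Classify the mode from a window of IMU acceleration magnitudes.

    A held controller is rarely perfectly still, so any sample exceeding
    the threshold is treated as evidence of the first (hand-held) mode.
    """
    if any(a > threshold for a in accel_magnitudes):
        return MODE_HELD
    return MODE_PUT_DOWN
```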
The inertial detector may be located within the housing of the control device 403. It may be preferable to use an inertial detector for detecting the mode of operation of the control device 403, since this detector may already be in use for playing of a video game and thus not result in any unexpected additional power being consumed by the control device 403.
In some examples, the at least one sensor 404 may comprise a proximity detector operable to detect the proximity of the control device 403 to an object other than the user. If the control device 403 is detected as being proximal to an object or surface other than the user, then this will generally be indicative that the user is no longer holding or engaging with the control device 403. The proximity detector may be configured to determine a distance of a surface from the proximity detector, for example.
Figure 5A shows schematically an example of a control device 403 that employs a proximity detector 502 for this purpose. The proximity detector 502 may be located at a position on the control device 403 that a user does not typically rest their fingers at when gripping the left and right hold portions. In the example of Figure 5A, this is shown as corresponding to the proximity detector 502 being positioned at a base (i.e. rear) portion of the central section 203 of the control device 403. In Figure 5A, the control device 403 is shown as comprising a control processor 405 for receiving an input from the proximity detector 502.
Figure 5B shows schematically a rear view of the same controller 403 shown in Figure 5A. In Figure 5B the rear surface of the central section 203 is shown as being a flat surface; however, in some examples, the rear surface may be angled with respect to the horizontal.
The proximity detector 502 may comprise a depth sensor, such as e.g. a time-of-flight sensor, structured light sensor, etc. The control processor 405 may be configured to determine the first or second mode of operation (i.e. whether the control device is likely being held or not held by the user) in dependence on a comparison of the detected proximity of the control device 403 to the object other than the user, with a threshold distance. The threshold distance may correspond to a distance that is expected to be between the rear surface of the central section and a flat surface (e.g. floor, table) upon which the control device 403 is likely to be placed. For example, for the arrangement shown in Figures 5A and 5B, the control device 403 may be detected as having been put down if the control device 403 is detected as being e.g. 1-6 mm from a surface. The proximity detector 502 may be configured to detect the distance of the control device 403 from a given surface / object at regular periodic intervals, for example.
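A minimal sketch of this proximity check, using the upper end of the 1-6 mm range mentioned above as an illustrative threshold (the function name and default value are assumptions):

```python
def is_put_down_by_proximity(distance_mm, threshold_mm=6.0):
    """Infer the second mode when the rear-facing depth sensor reports a
    surface within the threshold distance (e.g. the table the controller
    has been placed on)."""
    return distance_mm <= threshold_mm
```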
It will be appreciated that the control device 300 shown in Figure 3 may also comprise a proximity detector 502 (not shown). In such a device 300, the proximity detector 502 may be located in a base portion of the handle 301, being oriented as facing downwards such that the distance between a base of the controller 300 and an underlying object / surface (e.g. the floor) can be determined.
In some examples, the at least one sensor 404 may comprise a temperature sensor. The temperature sensor may be arranged to detect a surface temperature at a portion of the surface of the control device 403. The temperature sensor may be configured to detect that the device is not being held by the user in dependence upon the temperature of a surface portion of the control device 403 being below a threshold value and / or not within a range of threshold temperatures.
The temperature sensor may be integral to, and flush with, a surface of the control device, for example. Generally, the temperature sensor may be arranged at a portion of the control device 403 that is typically gripped during use, such that the corresponding portion of the device is warmed by the hands of the user. In the examples shown in Figures 2A and 2B, the temperature sensor may be located at a surface of one or both left and right hold portions. Examples of suitable temperature sensors include, for example, Negative Temperature Coefficient (NTC) thermistors, Resistance Temperature Detector (RTD) sensors, thermocouples, semiconductor-based temperature sensors, etc. As will be appreciated, the temperature of the control device 403 may increase during use due to heating of internal electronic components. It may therefore be desirable to provide a relatively small temperature sensor (smaller than the front and / or rear surface of the hold portion) such that the measured temperature is local to the part of the device 403 that is being gripped by the user. Moreover, it may be desirable to configure the temperature sensor to detect a small range of temperatures that are expected for the portion of the control device 403 that is gripped by the user (i.e. threshold temperatures). If the measured temperature exceeds (or drops below) this small range, this may be indicative of overheating (or abandonment) of the control device 403 rather than gripping by the user.
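The range check described above may be sketched as follows; the particular temperature range is an assumption chosen for illustration (hand-warmed skin contact), not a value taken from the present disclosure:

```python
def is_held_by_temperature(temp_c, grip_range=(28.0, 36.0)):
    """Infer the first mode when the grip-surface temperature falls within
    the expected hand-warmed range; a reading below the range suggests
    abandonment, while a reading above it suggests internal component
    heating rather than gripping by the user."""
    lo, hi = grip_range
    return lo <= temp_c <= hi
```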
In alternative or additional examples, the at least one sensor 404 may comprise one or more pressure sensors for detecting a pressure applied to the hand-holdable control device 403. The one or more pressure sensors may fully or at least partially cover a front and / or rear portion of the hand-holdable portion of the control device 403.
Each pressure sensor may comprise one or more pressure-sensitive regions. Suitable pressure sensors include, but are not limited to, piezo-resistive strain gauges, piezoelectric pressure sensors, and inductive/reluctive pressure sensors. Generally, the one or more pressure sensors are arranged such that contact between the user and the control device 403 can be detected.
For example, for the control device shown in Figures 2A and 2B, the pressure sensors may cover a lower front and / or rear portion of the left and / or right hold portion. An example of this is shown in Figure 6, where the pressure sensor 602 is shown as covering the gripping portions of the control device 600 (corresponding to the control device 403 described previously).
For the control device 300 shown in Figure 3, the pressure sensor(s) may fully or at least partially cover a front and / or rear surface of the device 300. An example of this is shown in Figure 7, which shows schematically a control device 700 having two respective pressure detecting bands 702A, 702B wrapped around a portion of the handle. In Figure 7, the pressure detecting bands 702A, 702B are shown as being positioned above and below operating member 706. In Figure 7, the camera-trackable portion 302 corresponds to that described previously in relation to Figure 3. The control device 700 shown in Figure 7 is an example of the more generic control device 300 illustrated in Figure 3.
It will be appreciated that a pressure sensor is different to a touch sensor, such as may be found on a mobile phone screen, or laptop; the user will typically hold (and thus touch) the device and pressure sensor, but needs to apply additional pressure over and above mere touch in order to safely hold the controller 403 in normal use. Similarly, it will be appreciated that a pressure sensor is different to a push switch or button, which physically moves from an off position to an on position when sufficient pressure is applied and hence requires a visible actuating movement on the part of the user.
Each pressure sensor may be configured to detect a variable pressure applied to the pressure sensor. The control device 403 may be detected as not being held by the user when the pressure detected at the one or more pressure sensors drops below a threshold pressure value. The amount of pressure that is conventionally applied by a user holding a controller may be determined empirically and used to determine the threshold value that corresponds to a user holding the control device 403.
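An illustrative sketch of this threshold test (the function name, units and default threshold are assumptions; as noted above, a real threshold would be determined empirically):

```python
def is_held_by_pressure(pressure_readings, threshold=1.5):
    """Treat the device as held while at least one pressure sensor
    reports a pressure at or above the empirically derived grip
    threshold (arbitrary units for this sketch)."""
    return any(p >= threshold for p in pressure_readings)
```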
In some examples, the at least one sensor may comprise an optical heart rate sensor for detecting a pulse of a user that is holding the controller 403. As is known in the art, the pulse of a person can be determined by illuminating the skin of that person, and measuring light reflected (and / or refracted) off of their blood vessels. The control device 403 may therefore comprise an optical heart sensor that is arranged to emit light at a skin surface of the user, and to detect the reflected light. The optical heart rate sensor may be located at, for example, a surface of the control device 403 that a user is likely to grip most tightly (e.g. a rear surface of a hold portion).
It will be appreciated that, in some examples, any of the sensors 404 described above may be separate from the hand-holdable control device 403 but releasably attachable to a surface of the control device 403. For example, the sensors 404 may comprise an adhesive backing, or other attachment means for securing the sensor 404 to an appropriate location on the surface of the control device 403. A user may position the sensor 404 at a location on the device 403 that they expect to be in contact with (or not, in the case of e.g. the proximity detector).
It will be further appreciated that in embodiments where the at least one sensor 404 comprises a sensor for detecting (or at least inferring) contact between the control device 403 and the user, a plurality of such sensors may be used and distributed across the surface of the control device 403. As above, these types of sensors may correspond to temperature sensors, pressure sensors, optical heart rate sensors and the like.
Alternatively or in addition, the proximity detector(s) described above may be positioned at surface locations of the control device 403 that are likely to be gripped by the user, such that a detected proximity of the user at those locations can be inferred as the control device 403 being hand-held by the user. Again, this may involve determining whether the distance of the detected surface (e.g. the user's skin) is less than a threshold distance.
Figures 8A and 8B show schematically an example of how a plurality of sensors 802 may be spatially distributed at the surface of the control device 403 so as to detect whether the user is likely holding the control device 403. As can be seen in Figures 8A and 8B, these sensors 802 are distributed across the surface of the gripping portions.
The use of a greater number of sensors 802, arranged so as to be facing different directions, may provide greater certainty that the control device 403 is being hand-held by a user. For example, if the number of sensors 802 for which contact is detected exceeds a threshold number, it may be said with some confidence that the user is likely gripping that portion of the control device 403. As will be appreciated, the sensors 802 may correspond to one or more of proximity detectors, temperature sensors, optical heart rate sensors, pressure sensors and the like. Although not shown in relation to control devices of the form shown in Figure 3, it will be appreciated that a plurality of such sensors may be arranged in a corresponding manner (i.e. across a surface that the user is likely to grip during use).
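The multi-sensor vote described above may be sketched as follows (the function name and the threshold count are illustrative assumptions):

```python
def grip_detected(contact_flags, min_count=3):
    """Multi-sensor vote: infer a grip only when the number of sensors
    reporting contact meets or exceeds the threshold count, giving
    greater certainty than any single sensor alone."""
    return sum(1 for c in contact_flags if c) >= min_count
```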
As mentioned previously, the control device 403 may comprise one or more physical buttons (i.e. operating members) for receiving a user input. In such embodiments, the control processor 405 may be configured to detect that the control device 403 is not being held by the user based on the number of user inputs received at the controller 403 (via the buttons) being less than a threshold number over a pre-determined time period.
More generally speaking, the control processor 405 may be configured to determine the first or second mode of operation in dependence on a rate at which user inputs are received at the control device 403 via the one or more user-operable controls. If the rate at which user inputs are received at the control device 403 is less than a threshold rate, it may be determined that the control device 403 is being operated in the second mode. Conversely, if the rate at which user inputs are received is equal to or exceeds a threshold rate, it may be determined that the control device 403 is being operated in the first mode of operation.
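This rate comparison may be sketched as follows; the mode labels and threshold rate are assumptions made for this example:

```python
def mode_from_input_rate(num_inputs, window_s, threshold_rate=0.2):
    """Classify the mode from the rate of user inputs over a window.

    A rate at or above the threshold (inputs per second) indicates the
    first (hand-held) mode; a lower rate indicates the second mode.
    """
    rate = num_inputs / window_s
    return "first" if rate >= threshold_rate else "second"
```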
For some video games there may be points in time for which no player input is expected, for example during the display of cut-scenes. It may therefore be desirable to distinguish whether the number of player inputs received via the physical buttons within a predetermined time period is less than a threshold number, and whether the video being output (at a display) during this time period corresponds to a cut-scene or gameplay. More generally, it may be desirable to detect whether player input is expected, given a current context of the video game.
Hence, in some examples, the apparatus 400 may comprise a machine learning model trained to detect a type of scene that the video frames generated during playing of a video game correspond to. The machine learning model may be trained to detect whether video frames output by the video game machine 402 correspond to e.g. gameplay, cut-scenes, maps, menus, etc. An example of a machine learning model that may be used for this purpose, and how it might be trained, is disclosed in patent application no. GB1819865.5, the contents of which are incorporated into this description by reference. The machine learning model may serve a primary purpose of capturing highlight events, as described in GB1819865.5.
The control processor 405 may be configured to receive an input from the machine learning model and determine in dependence thereon, a scene type (or types) for the video output during the time period for which no player inputs were received. If it is determined that the scene type corresponds to a scene type for which no player input was expected (e.g. cut-scene, loading screen, etc.) then the control processor 405 may infer that no action should be taken. In such examples, the control processor 405 may only determine the second mode of operation if the corresponding scene type corresponds to a scene-type for which player input was expected (e.g. gameplay, inventory menu, etc.). That is, the second mode of operation may only be determined if user inputs are expected but none have been received.
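The scene-gated logic described above may be sketched as follows; the scene labels and function name are assumptions (a real classifier, such as the one referenced above, would supply the scene type):

```python
# Assumed labels for scene types during which no player input is expected.
NO_INPUT_EXPECTED_SCENES = {"cut-scene", "loading-screen"}

def second_mode_from_scene(scene_type, inputs_in_window):
    """An absence of inputs only implies the second mode when the
    classified scene is one for which player input was expected."""
    if scene_type in NO_INPUT_EXPECTED_SCENES:
        return False  # no input expected; infer that no action is needed
    return inputs_in_window == 0
```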
In some examples, the machine learning model may receive both the audio and video being generated during playing of the video game. In such examples, the machine learning model may be configured to detect the scene-type based on both of these inputs. As above, the control processor 405 may determine the second mode responsive to the video and audio being detected as gameplay (e.g. a character in an open-world map, with the corresponding background sounds), and no player inputs being received within a threshold time period. Again, GB1819865.5 discloses an example of a machine learning model that may be employed for such purposes.
In some examples, the machine learning model may be located at the control device, although in most embodiments it is expected that a CPU or Tensor Processing Unit (TPU) at the video game machine will be better suited to executing the trained machine learning model. Generally, the machine learning model may be executed by a machine learning processor, located at the control device and / or the video game machine.
Figure 9 shows schematically an example of an apparatus 900 that employs a machine learning model as described above. In Figure 9, the apparatus 900 is shown as comprising a trained machine learning model 901 that receives video and / or audio generated during playing of the video game. The output of the machine learning model may correspond to a detected type of scene or in-game event which is provided to the control processor 405. In Figure 9, the control processor 405 is shown as receiving inputs from both the machine learning model 901 and the sensor(s) 404. The control processor 405 may use these inputs to determine whether the control device is likely being operated in the second mode and provide an indication of this determination to the video game processor 401. It will be appreciated that the apparatus 900 shown in Figure 9 may include any of the other previously described components (despite not being shown).
In some embodiments, the hand-holdable control device 403 may comprise one or more microphones for detecting audio. In the present disclosure, a microphone is considered an example of a sensor 404 for obtaining sensor data. The microphone(s) may be arranged such that the user's speech can be detected during normal operation of the control device 403. For example, the microphones may be arranged at a front surface of the control device 403. An example of such an arrangement of microphones is shown schematically in Figure 10, where an array of microphones is indicated at 1002. Alternatively, or in addition, the microphone(s) may be arranged to detect pressing of the physical buttons by the player, creaking of the housing of the control device 403, etc. The control processor 405 may be configured to determine whether the control device 403 is being operated in the first or second mode of operation in dependence on the audio signal detected at the one or more microphones. This may involve, for example, determining whether an average amplitude of the detected audio over a pre-determined period is less than a threshold average.
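The average-amplitude comparison mentioned above may be sketched as follows (the function name and threshold are assumptions; samples are taken to be normalised to the range -1.0 to 1.0):

```python
def second_mode_from_audio(samples, threshold=0.01):
    """Infer the second mode when the average absolute amplitude of the
    detected audio over the window falls below the threshold (i.e. the
    controller's surroundings are quiet)."""
    avg = sum(abs(s) for s in samples) / len(samples)
    return avg < threshold
```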
In some examples, the apparatus may comprise a machine learning model trained to detect whether one or more characteristic frequencies are present in the detected audio signal.
The control processor 405 may be configured to determine whether the control device 403 is being operated in the first or second mode of operation based on a detected presence or absence of the one or more characteristic frequencies in the detected audio signal.
The one or more characteristic frequencies may correspond to the pressing of the physical buttons; similarly, creaks from the squeezing together of corresponding housing portions of the control device 403 may result in noise having a characteristic frequency. A lack of detection of this characteristic frequency (or frequencies) may be indicative that the control device 403 is no longer being held by the user. A machine learning model may be trained to recognize the presence of this noise, or lack thereof. In some examples, the machine learning model may need to be trained on a per-video game basis, as the nature of player input may vary considerably between different video games.
In alternative or additional embodiments, the at least one sensor 404 may be external to the hand-holdable control device 403.
In some examples, the at least one sensor 404 may comprise a camera configured to capture images. An example of an apparatus 1100 comprising such a camera 1101 is shown in Figure 11. In Figure 11, the camera 1101 is shown as providing an input to the control processor 405 (described previously), which in turn provides an input to the video game processor 401 (described previously).
In these examples, the control processor 405 may be configured to determine whether the control device 403 is being operated in the first or second mode based on the images captured by the camera 1101. The camera 1101 may comprise a stereoscopic camera, such as the PS Camera™, with the camera 1101 being configured to obtain colour and depth images.
The camera 1101 may comprise a communication interface for transmitting image data to the video game machine 402. The video game machine 402 may have a corresponding communication interface for establishing a wired or wireless connection with the camera 1101, such that image data captured by the camera 1101 can be received at the video game machine 402. In some examples, the camera 1101 is operable to capture images of the user; for example, by being arranged so as to face the user.
In alternative or additional examples, the apparatus 400 may comprise an object recognition unit 1201 operable to detect the control device 403 in the captured images and one or more surfaces in the environment in which the control device 403 is being used. An example of an apparatus 1200 comprising an object recognition unit 1201 is illustrated schematically in Figure 12.
The object recognition unit 1201 may be further configured to determine whether the control device 403 is adjacent one or more of the detected surfaces and to provide an indication of this to the control processor 405. In turn, the control processor 405 may be configured to determine whether the control device 403 is being operated in the first or second mode of operation. A determination of the second mode may correspond to a determination that the control device 403 has been placed upon a surface and thus is not or is not likely being held by the user.
The control device 403 and one or more surfaces may be detected in the captured images by the same or different means. For example, both the control device 403 and one or more surfaces may be detected via machine learning. However, if the control device 403 comprises a camera-trackable portion (such as e.g. light bar 211 or camera-trackable portion 302), it may be that e.g. blob and / or contour detection is used by the object recognition unit to track the controller 403. The one or more surfaces in the environment may be detected by inputting the images captured by the camera 1101 to a trained deep network, for example, such as that described in 'Designing Deep Networks for Surface Normal Estimation', Xiaolong Wang et al., Carnegie Mellon University, pp. 1-9, the contents of which are incorporated in this description by reference. The deep network may be trained to detect surface normals and occlusion boundaries in RGB images, for example. Once the control device 403 has been detected, it can then be determined whether the location of the control device 403 in the captured images coincides with a detected surface. If there is such a co-location, it can be inferred that the control device 403 is likely resting on the detected surface.
If the camera 1101 comprises a stereoscopic camera, then the RGB-D images captured by the camera 1101 may be input to a deep network, which then detects surface normals and occlusion boundaries in the images (an example of such a network is outlined in 'Physically-based Rendering for Indoor Scene Understanding Using Convolutional Neural Networks', Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition), the contents of which are incorporated in this description by reference. If the control device 403 is detected as being at a position and depth that corresponds to (e.g. is on top of) a detected surface, it can then be determined that the device 403 is resting on that surface. As will be appreciated, depth data may improve the accuracy with which the position of the controller 403 relative to a surface can be determined.
As will be appreciated, for some types of control device 403, such as those described previously, it may be unusual for a user to operate the device 403 while it is resting on a surface; hence, detection of the control device 403 on a surface (or within a threshold distance of one) may be indicative that the control device 403 is not being held by the user. The control processor 405 may therefore be configured to use such a detection to determine whether or not the control device 403 is being operated in the second mode. This determination may be further improved by determining whether the control device 403 has been placed on a given surface for longer than a threshold time period.
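The position-and-depth co-location test described above may be sketched as follows; the function signature, the image-plane surface region representation and the depth tolerance are all assumptions for illustration (a real system would obtain these from the object recognition unit and deep network described above):

```python
def resting_on_surface(controller_depth, surface_depth,
                       controller_xy, surface_region, tol=0.02):
    """Deem the controller to be resting on a detected surface when its
    image-plane position falls inside the surface's bounding region and
    its depth agrees with the surface depth to within a tolerance
    (depths in metres for this sketch)."""
    (x, y) = controller_xy
    ((x0, y0), (x1, y1)) = surface_region
    inside = x0 <= x <= x1 and y0 <= y <= y1
    return inside and abs(controller_depth - surface_depth) <= tol
```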
In some examples, the object recognition unit 1201 may be configured to detect only the control device 403 in the captured images. In such examples, the object recognition unit 1201 may be configured to detect the pose of the control device 403 from the appearance of the control device 403 in the captured images. This may be in addition or instead of using an IMU within the device 403 to detect the pose of the device 403. In such examples, the images of the control device 403 may be provided to the object recognition unit 1201, which then detects the control device 403 and its pose in the captured images.
The pose of the control device 403 may be determined using e.g. contour detection and / or machine learning. An example of a machine learning model that may be used for this purpose is the 'PoseCNN' model (see: 'PoseCNN: A Convolutional Neural Network for 6D Object Pose Estimation in Cluttered Scenes', Y. Xiang et al., 26 May 2018, pp. 1-10), the contents of which are incorporated in this description by reference. The control processor 405 may then determine which of the modes the control device 403 is being operated in, based on whether the detected pose of the control device 403 is within a range of poses that are typical for the control device 403 during normal use (which may be determined empirically, for example). If it is determined that the control device 403 is outside of this range of typical poses for a pre-determined time period, it may be determined that the control device 403 is being operated in the second mode.
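The typical-pose range check may be sketched as follows; the angle ranges are illustrative assumptions standing in for the empirically determined ranges mentioned above:

```python
def pose_outside_typical(roll_deg, pitch_deg,
                         roll_range=(-60.0, 60.0),
                         pitch_range=(-45.0, 45.0)):
    """Return True when the detected orientation falls outside the range
    of poses typical for the control device during normal use, which is
    evidence for the second mode if sustained over time."""
    roll_ok = roll_range[0] <= roll_deg <= roll_range[1]
    pitch_ok = pitch_range[0] <= pitch_deg <= pitch_range[1]
    return not (roll_ok and pitch_ok)
```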
In some examples, the apparatus 1200 may comprise a facial recognition unit 1202 operable to detect the user's face in the images captured by the camera. In Figure 12 the object recognition unit 1201 and facial recognition unit 1202 are shown in dashed to indicate that the apparatus 1200 may comprise either or both of these components.
A lack of the user's face in a threshold number of successively captured images (corresponding to a threshold time period) may be indicative that the user is no longer holding the hand-holdable control device 403. The facial recognition unit 1202 may be located at, for example, the video game machine 402 as e.g. image processing software. The facial recognition unit 1202 may be trained to recognise the user's face in captured images via machine learning, as is known in the art. The control processor 405 may be configured to determine whether the control device 403 is being operated in the first or second mode of operation in dependence on whether the user's face is detected in a threshold number of successively captured images.
In alternative or additional examples, the camera may be located at the control device 403. The camera may be arranged so as to capture images of the user's face when the control device 403 is being hand-held by the user. As will be appreciated, in such examples the camera may not necessarily be 'external' to the control device in the sense that it is physically separated from the control device 403.
Figure 13 shows schematically an example of a camera device 1302 affixed to the control device 403 in such a way. In Figure 13, the camera 1302 is shown as being secured to the top edge of the central section interconnecting the left and right hold portions of the control device 403. As will be appreciated, the camera 1302 may be tilted with respect to the front face of the control device 403 so as to be oriented towards the user's face during normal use of the control device 403. It will be further appreciated that the camera device 1302 may be separable from the control device 403; which is to say, releasably securable to the control device 403 by way of an attachment means such as e.g. a clip, plug and socket-type connection, adhesive backing, etc. It will be appreciated that any of the above-described sensors (and components associated therewith) may be used in accordance with the present disclosure; either alone, or in combination with one or more others of the described sensors.
It will be further appreciated that a determination by the control processor 405 of the second mode may correspond to a determination that the control device 403 has not been hand-held by the user for longer than a pre-determined time interval. That is, the control processor 405 may act with hysteresis, with the second mode only being verified as a detection responsive to the control device 403 having been determined as switching from the first mode to the second mode and having remained in the second mode for longer than a pre-determined time period. In this way, actions of the control device 403 and / or video game machine 402 can be prevented from being varied rapidly in response to momentary actions performed by the user. For example, a user may find it frustrating if the control device 403 were to switch to a lower power mode every time they e.g. put their control device 403 down to check their phone.
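This hysteresis may be sketched as a simple debouncer; the class name, default hold-off interval and mode labels are assumptions for illustration:

```python
import time

class ModeDebouncer:
    """Confirm the second mode only after the raw determination has
    persisted for the hold-off interval; momentary put-downs are
    reported as the first mode."""

    def __init__(self, hold_off_s=10.0):
        self.hold_off_s = hold_off_s
        self._second_since = None  # time the raw second mode was first seen

    def update(self, raw_second_mode, now=None):
        now = time.monotonic() if now is None else now
        if not raw_second_mode:
            self._second_since = None
            return "first"
        if self._second_since is None:
            self._second_since = now
        if now - self._second_since >= self.hold_off_s:
            return "second"
        return "first"
```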
Having determined whether the control device 403 is being operated in the first or second mode of operation (in any of the manners described above), the behaviour of the control device 403 and / or video game machine 402 can be adapted accordingly. Examples of how this behaviour may be adapted will be described below.
Figure 14 shows schematically an example of a data processing apparatus 1400 as described above. In Figure 14, the data processing apparatus 1400 is shown as comprising a control device 1401, video game machine 1402, and a plurality of output devices that may be further controlled in dependence on whether the control device 1401 is determined as being operated in the first or second mode of operation. In Figure 14, the control processor 405 is not shown; as will be appreciated the control processor 405 may be located at one or more of the control device 1401 and the video game machine 1402.
The control device 1401 may correspond to any of the control devices described previously, such as those shown in Figures 2A, 2B, 3 and 7.
In Figure 14, the control device 1401 is shown as comprising sensors 1403 at the respective gripping portions. As will be appreciated, this is just an illustrative example and different types, number and spatial arrangements of sensors may be employed, as described previously.
In Figure 14, the apparatus is shown as comprising a display device 1404, external camera 1405 (such as the PS Camera™), speaker 1406 and streaming device 1407, having corresponding communication interfaces 1408, 1409, 1410 and 1411 respectively. Each communication interface connects the video game processor 1412 to a corresponding output device. These communication interfaces 1408, 1409, 1410, 1411 may correspond to any of the AV In, Ethernet, Bluetooth, WiFi, USB, HDMI, AV Out connections described previously, for example. The video game processor 1412 may form part of the CPU and GPU of the video game machine 1402, for example.
In Figure 14, the video game machine 1402 is also shown as comprising memory 1413. Memory 1413 may correspond to electronic memory at which video game saves are stored, for example.
It will be appreciated that Figure 14 is an illustrative example; in other examples, a different number of, and different types of, output devices may be provided. A person skilled in the art will understand that the data processing apparatus as claimed is not limited to the illustrative example shown in Figure 14.
As mentioned previously, in some embodiments the control device 403, 1401 may comprise a light source. In these embodiments, the control device 403, 1401 may be configured to modify an intensity of light that is output by the light source in dependence on the determination by the control processor 405. The control device 403, 1401 may be configured to cause light to be output by the light source at a first intensity responsive to a determination of the first mode of operation. In addition, the control device 403, 1401 may be configured to cause light to be output at a second intensity responsive to a determination of the second mode of operation. The second intensity may be lower than the first intensity. In some examples, it may be that, responsive to a determination of the second mode of operation, the light source is dimmed and / or switched off. The light source may be progressively dimmed to an off state or immediately switched off, for example. In any case, the output of the light source is inhibited in the second mode compared with the first mode.
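As a non-limiting sketch of the progressive dimming described above (the fade duration and the first and second intensity values are assumptions for illustration, not values from the disclosure):

```python
def light_intensity(mode, t_since_switch, first=1.0, second=0.0, fade_s=2.0):
    """Return the light-source intensity for the given debounced mode.

    In the first ("hand-held") mode the nominal first intensity is used;
    in the second mode the output is progressively faded towards the
    lower second intensity over `fade_s` seconds. All parameter values
    are illustrative assumptions.
    """
    if mode == "hand-held":
        return first
    # Linear fade from the first intensity towards the second intensity.
    frac = min(t_since_switch / fade_s, 1.0)
    return first + (second - first) * frac
```

An immediate switch-off, as also contemplated in the text, corresponds to setting `fade_s` close to zero.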
As will be appreciated, dimming of the light source will reduce the power consumed by the control device 403, 1401. In response to a determination that the mode of operation has switched from the second mode to the first mode, the intensity of light may be increased back up to the first intensity.
As also mentioned previously, the control device 403, 1401 may comprise one or more haptic transducers for generating a haptic output detectable at a surface of the control device 403, 1401. As discussed previously, this haptic output may be generated responsive to vibration data received at the control device 403, 1401 from the video game machine 402, 1402. The vibration data may provide an indication of the amplitude and frequency at which a haptic output is to be generated at the control device 403, 1401.
In embodiments involving such haptic feedback, the control device 403, 1401 may be configured to modify a haptic output generated by the one or more haptic transducers in dependence on the determination by the control processor 405. The control device 403, 1401 may be configured to generate a haptic output with a first magnitude responsive to a determination of the first mode of operation, and to generate a haptic output with a second magnitude in response to a determination of the second mode of operation. The second magnitude of vibration may be lower than the first magnitude of vibration; which is to say, the haptic output at the second magnitude may be less perceptible to a user who is holding the control device 403, 1401.
In some examples, the control device 403, 1401 may be configured to reduce and / or cease the generation of a haptic output at the control device 403, 1401, responsive to the determination by the control processor 405 that the control device 403, 1401 is being operated in the second mode. As will be appreciated, this will reduce the amount of power consumed by the control device 403, 1401. The inhibition of haptic output may be gradual or immediate.
Responsive to a determination that the mode of operation has switched from the second to the first mode, the control device 403, 1401 may be configured to resume generating a haptic output at the first magnitude.
It will be appreciated that the intensity of the haptic output will generally depend on vibration data received from the video game machine 402, 1402. The intensity of vibration may increase as e.g. a player approaches a target object or loses health. Hence, generating a haptic output at a first magnitude may not necessarily mean that each haptic output is generated with the same vibrational intensity. Rather, it is to say that, in the first mode of operation, the haptic output is not reduced or dimmed in any way; the haptic output is generated in accordance with the received vibration data. In the second mode of operation, the haptic output is generated with a lower intensity, if at all.
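The mode-dependent handling of received vibration data described above may be sketched as follows; the tuple format of amplitude and frequency is an assumption for illustration, not the actual vibration data format:

```python
def haptic_output(vibration_data, mode, second_mode_scale=0.0):
    """Apply the mode-dependent inhibition to received vibration data.

    First mode: pass the (amplitude, frequency) data through unchanged,
    so the output follows the vibration data exactly. Second mode:
    inhibit the output by scaling the amplitude down (here to zero).
    The data format and scale factor are illustrative assumptions.
    """
    amplitude, frequency = vibration_data
    if mode == "hand-held":
        return (amplitude, frequency)
    return (amplitude * second_mode_scale, frequency)
```

This reflects the point made above: the first mode does not fix a single intensity, it merely leaves the received vibration data unmodified.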
The decision to modify the haptic output responsive to the determination of the second mode may be performed at the control device 403, 1401 or the video game machine 402, 1402.
In the former case, the control device 403, 1401 may choose to not act on received vibration data. In the latter case, the video game machine 402, 1402 may elect to cease transmitting vibration data to the control device 403, 1401 until the control device 403, 1401 is detected as being operated in the first mode of operation. As will be appreciated, it may be more energy efficient to cease transmitting vibration data to the control device 403, 1401.
As mentioned above, in some examples, the apparatus 400, 900, 1100, 1200, 1400 may comprise a display 1404 for displaying notifications to a user. The notifications may provide information such as e.g. friendship requests received from other players, the online presence of other players, messages received from other players, the completion of system updates, installation of a video game, etc. The notifications may be generated in response to inputs received from other applications installed at the video game device (e.g. video game applications, PS Network application, PS Store application, etc.).
In these embodiments, the video game processor 401, 1412 and control processor 405 may be configured to cooperate to generate notifications for displaying to the user and to modify the display of these notifications in dependence on whether the control device 403, 1401 is determined as being operated in the first or second mode. For example, the video game processor 401, 1412 and control processor 405 may cooperate so as to limit or prevent the display of generated notifications at the display 1404, responsive to the determination that the control device 403, 1401 is being operated in the second mode. This may involve, for example, queueing any notifications received after the point in time at which the control device 403, 1401 was determined as not being held by the user. These queued notifications may then be presented to the user in response to the control device 403, 1401 being subsequently determined as being operated in the first mode of operation. For example, a user may be presented with a single 'See what you missed' notification.
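The queueing of notifications described above may be sketched as follows; the class and method names are hypothetical, and the digest wording is merely the example given in the text:

```python
class NotificationGate:
    """Queue notifications while the controller is in the second mode;
    present a single 'See what you missed' digest when the first mode
    is next detected. Illustrative names, not the claimed API."""

    def __init__(self):
        self.mode = "hand-held"
        self.queued = []

    def set_mode(self, mode):
        """Update the detected mode; return any digest to display now."""
        flushed = []
        if mode == "hand-held" and self.queued:
            flushed = [f"See what you missed ({len(self.queued)})"]
            self.queued.clear()
        self.mode = mode
        return flushed

    def notify(self, text):
        """Return notifications to display immediately, deferring the
        rest until the user picks the controller back up."""
        if self.mode == "hand-held":
            return [text]
        self.queued.append(text)
        return []
```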
In some examples, the video game processor 401, 1412 and control processor 405 may be configured to cooperate so as to switch off or dim the display 1404 associated with the video game machine 402, 1402, responsive to a determination that the control device 403, 1401 is being operated in the second mode. The display device 1404 may then be switched back on, or brought back to a nominal intensity, in response to a determination that the mode of operation has switched back from the second to the first mode.
In some embodiments, the video game processor 401, 1412 is configured to execute a video game in response to the control signals, the video game processor 401, 1412 being configured to vary the effect of the control signals on the execution of the video game in dependence on whether the control device 403, 1401 is determined as being operated in the first mode of operation or the second mode of operation.
As mentioned previously, in some examples, the control device 403, 1401 comprises one or more user-operable controls. In such examples, the control device 403, 1401 may be configured to provide control signals in respect of a set of candidate operations of the user-operable controls. The video game processor 401, 1412 may be configured to inhibit the effect on execution of the video game of control signals associated with at least a subset of the set of candidate operations of the user-operable controls in response to a determination of the second mode of operation.
In alternative or additional embodiments, the video game processor 401, 1412 may be configured to execute a game-state-save operation responsive to a determination of the second mode of operation. This may ensure, for example, that if a player has abandoned their video game session for whatever reason, they will be able to resume playing from where they left off. This also ensures that, should the video game machine 402, 1402 be switched off or operated in a lower power mode (responsive to the determination of the second mode), in-game progress will not be lost.
Alternatively or in addition, in some embodiments, the video game processor 401, 1412 may be configured to pause or adapt a video game being executed at the video game machine 402, 1402, responsive to a determination by the control processor 405 that the control device 403, 1401 is being operated in the second mode. The video game may be adapted by at least one of: being rendered at a lower resolution, a reduction or termination in the generation and rendering of non-player characters (NPCs), a modification to NPC behaviour (e.g. all NPCs becoming stationary), etc. The video game may revert to being rendered in the normal way, responsive to a determination by the control processor 405 that the control device 403, 1401 is once again being operated in the first mode.
In some embodiments, it may be that a player is recording their gameplay via a video recording application (e.g. accessible via the 'Share' button). In such embodiments, the video game processor 401, 1412 may be configured to generate a pause recording instruction responsive to a determination by the control processor 405 that the control device 403, 1401 is being operated in the second mode of operation.
The pause recording instruction may be transmitted to the video recording application being executed at the video game machine 402, 1402. If a user is not holding the hand-holdable control device 403, 1401, it is unlikely that any interesting gameplay will be occurring, and so the video game processor 401, 1412 can instruct the recording application to pause (and optionally, terminate) a current recording. This may involve pausing or terminating a streaming of video where the recording is being streamed to another device. Once the control device 403, 1401 is determined as being operated in the first mode of operation again, recording may resume.
In some embodiments, the control processor 405 and video game processor 401, 1412 may be configured to cooperate so as to control a 'power mode' that the control device 403, 1401 and / or video game machine 402, 1402 operate in, in dependence on the determined mode of operation. In such embodiments, the control device 403, 1401 and / or video game machine 402, 1402 may be instructed to operate in a low power mode responsive to a determination of the second mode of operation. This may result in less (or no) power being drawn from the video game machine 402, 1402 by the control device 403, 1401, where e.g. the control device 403, 1401 is being powered via a wired or wireless connection to the video game machine 402, 1402.
In some examples, the video game machine 402, 1402 may be configured to eject a connector that is connecting the control device 403, 1401 to the video game machine 402, 1402, responsive to the determination that the control device 403, 1401 is being operated in the second mode. As a result of this ejection, the control device 403, 1401 may no longer receive power from the video game machine.
In further or alternative embodiments, the video game machine 402, 1402 and / or control device 403, 1401 may be connected to a cloud-gaming service. The connection to the cloud-gaming service may be facilitated by a streaming device, such as e.g. a USB-powered device. In Figure 14, an example of a streaming device is shown as device 1407.
The cloud-gaming service may comprise one or more cloud devices, such as servers, or a P2P network, that are configured to transmit video frames, or objects (e.g. mesh and texture data) for rendering, to the video game machine 402, 1402. In the former case, it may be that the video game processor 401, 1412 and control processor 405 cooperate so as to cease outputting of the received video frames, responsive to a determination of the second mode of operation. In the latter case, it may be that the video game processor 401, 1412 and control processor 405 cooperate so as to cease rendering of the mesh and texture data received from the cloud device. These operations may be undone in response to a subsequent detection that the control device is being operated in the first mode of operation.
In some examples, the video game processor 401, 1412 and control processor 405 may be configured to cooperate so as to vary a resolution at which video and / or object data is requested from the cloud device. For example, if the video game machine 402, 1402 is streaming 4K video, this may be switched to e.g. 1080p or 720p, responsive to a determination that the control device 403, 1401 is being operated in the second mode of operation. Moreover, in some examples, the video game machine 402, 1402 may request updated video frames or object data at less regular intervals, responsive to a determination that the control device 403, 1401 is being operated in the second mode. The video game machine 402, 1402 may revert to streaming and / or rendering the video game in the conventional manner, responsive to a determination that the control device 403, 1401 is now being operated in the first mode of operation.
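A minimal sketch of this mode-dependent streaming request follows; the 4K-to-1080p switch is the example given in the text, while the request intervals are illustrative assumptions:

```python
def stream_settings(mode):
    """Pick cloud-streaming resolution and request interval by mode.

    First mode: full 4K (2160p) at a nominal per-frame interval.
    Second mode: lower resolution and less frequent frame requests,
    as described in the text. Interval values are assumptions.
    """
    if mode == "hand-held":
        return {"resolution": "2160p", "request_interval_ms": 16}
    # Second mode: reduce bandwidth and rendering load while the
    # controller is not being held.
    return {"resolution": "1080p", "request_interval_ms": 100}
```

On a subsequent first-mode determination, calling the function again with the new mode simply restores the conventional settings.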
In some embodiments, the video game processor 401, 1412 and control processor 405 may cooperate so as to vary the outputting of an audio signal at a speaker (such as speaker 1406 shown in Figure 14), in dependence on the determined mode of operation. This may involve, for example, ceasing the outputting of audio at the speaker 1406, or outputting the audio at a lower volume, responsive to a determination that the control device 403, 1401 is being operated in the second mode of operation. In response to the mode of operation having been determined as switching back to the first mode from the second mode, the video game processor 401, 1412 and control processor 405 may cooperate so as to resume normal output of the video game audio.
It will be appreciated that, in some embodiments, the apparatus 400, 1400 comprises a video game console comprising the control processor 405 and the video game processor 401, 1412 (which may act and operate in any of the above described manners). It will be further appreciated that in some embodiments, the control device 403, 1401 may comprise the control processor 405, and optionally, the video game processor 401, 1412.
It will also be appreciated that the video game processor 401, 1412 and control processor 405 may cooperate so as to perform any of the above described operations; i.e. these operations need not (necessarily) be exclusively performed by one processor or the other. As described above, a data processing apparatus 400, 900, 1100, 1200, 1400 is provided for performing any of the above mentioned operations in dependence on the mode of operation that the control device 403, 1401 is determined as being operated in. As will be appreciated, in some embodiments, the control device 403, 1401 may be primarily responsible for controlling the behaviour of the control device 403, 1401.
Figure 15 shows schematically an example of a control device 1500 in accordance with a second aspect of the present disclosure. As can be seen in Figure 15, the hand-holdable control device 1500 comprises: a sensor 1501 operable to generate sensor data; a control processor 1502 configured to receive said sensor data and determine, in dependence on the received sensor data, whether the control device 1500 is being operated in a first or a second mode of operation; a communication interface 1504 for exchanging data with a video game machine via a wired or wireless connection; and a data processor 1503 operable to receive an input from the control processor 1502 and, in response to said input, modify an output of the control device 1500. As before, the determination of the first mode of operation corresponds to a determination that the control device 1500 is being hand-held by the user; the determination of the second mode of operation corresponds to a determination that the control device 1500 is not being hand-held by the user.
Functionally speaking, the control processor 1502 is configured to determine which of the two modes the control device 1500 is being operated in. The data processor 1503 is configured to generate an output in dependence on this determination of mode, so as to control an output of the control device 1500 and / or a video game machine (not shown) that is being operated via the control device 1500.
Generally, the determination of mode may be achieved in any of the previously described manners (i.e. in relation to Figures 2A -14); although, as will be appreciated, in some examples, the control device 1500 may be limited to receiving sensor data from sensors 1501 located at the control device 1500.
It will be further appreciated that the control device 1500 may have a form factor akin to that described previously in relation to Figures 3 and 7, i.e. a motion controller, despite being shown as a control device 1500 for gripping with both hands.
In the second aspect, the sensor 1501 may comprise one or more of: i. an inertial detector for detecting an acceleration of the control device 1500; ii. a proximity detector for detecting the proximity of the control device 1500 to an object that is not the user (or conversely, a surface that is likely to be the user); iii. a pressure sensor for detecting a force applied to the control device 1500 by the user; iv. a temperature sensor for detecting a surface temperature at a portion of the control device 1500 that is to be hand-held by the user; v. an optical heart rate sensor for detecting a user's pulse when the user is gripping the control device 1500; vi. one or more microphones for detecting audio.
The configuration of these sensors 1501 with respect to the control device 1500 may be as described previously in relation to Figures 2A -14.
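As a purely illustrative sketch of how readings from several of the sensors listed above might be fused into a hand-held determination (the thresholds and field names are assumptions for illustration, not values from the disclosure):

```python
def is_hand_held(readings, pressure_min=0.05, skin_temp_c=30.0):
    """Naive fusion of the sensor list above: treat the device as
    hand-held if any one sensor indicates a grip. The pressure and
    temperature thresholds, and the dictionary keys, are illustrative
    assumptions only."""
    return (
        readings.get("grip_pressure", 0.0) >= pressure_min   # iii.
        or readings.get("surface_temp_c", 0.0) >= skin_temp_c  # iv.
        or readings.get("pulse_detected", False)              # v.
    )
```

In practice the determination may of course weigh the sensors differently, or use only one of them, as in any of the manners described previously.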
As described previously, the control device 1500 may comprise one or more haptic transducers for generating a haptic output. As also described previously, the communication interface 1504 of the control device 1500 may be configured to receive vibration data from a connected video game machine and to generate a haptic output in accordance with the received vibration data. In such embodiments, the data processor 1503 may be operable to modify the haptic output of these one or more transducers in dependence on whether the control device 1500 is determined as being operated in the first or second mode of operation.
The data processor 1503 may be configured to cause the haptic output to be generated at a first magnitude responsive to the determination that the control device 1500 is being operated in the first mode of operation. Conversely, the data processor 1503 may be configured to cause the haptic output to be generated at a second magnitude, responsive to the determination that the control device 1500 is being operated in the second mode of operation.
As described previously, the second magnitude may be lower than the first, and correspond to an inhibition of the haptic output relative to the first magnitude.
It will be appreciated that the data processor 1503 may not output the haptic output directly, but rather, be in communication with the one or more transducers so as to control the haptic output generated by those transducers.
As described previously, in some examples, the control device 1500 may comprise a light source for enabling at least one of tracking of the control device 1500 by an external camera and identifying a character within a video game that a user of the control device 1500 corresponds to. In such examples, the data processor 1503 may be configured to modify an output of the control device 1500 by causing the light source to output light at a first intensity responsive to the determination that the control device 1500 is being operated in the first mode of operation. Responsive to a subsequent determination that the control device 1500 is being operated in the second mode, the data processor 1503 may be configured to cause the light source to output light at a second intensity, the second intensity being lower than the first intensity. As described previously, this may correspond to a dimming and / or switching off of the light source at the control device 1500.
In some embodiments, the data processor 1503 is configured to generate a command for transmitting to the video game machine via the communication interface 1504, with the command varying in dependence on whether the control device 1500 is determined as being operated in the first or second mode of operation. The video game machine may be configured to modify its behaviour in dependence on the command received from the control device 1500, as will be described below.
It will be appreciated that, in some examples, the command may simply provide an indication of whether the control device 1500 has been detected as operating in the first or second mode. The video game machine may then act on this information accordingly.
In some examples, the command generated by the data processor 1503 may comprise a command to inhibit the transmission of vibration data to the control device 1500. This command may be generated by the data processor 1503 responsive to the determination by the control processor 1502 that the control device 1500 is being operated in the second mode of operation. This command may then be transmitted to the video game machine, which then inhibits the transmission of vibration data to the control device 1500 accordingly. If it is determined that the mode of operation has switched back from the second mode to the first mode, the data processor 1503 may generate a command to resume transmission of vibration data to the control device 1500. This command may then be transmitted to the video game machine via the communication interface. The video game machine may then resume transmitting vibration data to the control device 1500. Examples of different types of command that may be generated by the data processor 1503 are provided below.
In additional or alternative examples, the command generated by the data processor 1503 may comprise a command to save a current state of a video game being executed at the video game machine. Such a command may be generated and transmitted to the video game machine, responsive to a determination that the control device 1500 is being operated in the second mode of operation. As mentioned previously, it may be useful to save a current state of a video game, especially if the video game machine and / or any output devices are to be operated in a lower power mode.
In yet further additional or alternative examples, the command may comprise a command to vary an execution of a video game at the video game machine. Such a command may be issued responsive to a determination that the control device 1500 is being operated in the second mode. If it is subsequently detected (by the control processor 1502) that the control device is being operated in the first mode, the data processor 1503 may issue a subsequent command to resume normal execution of the video game.
Alternatively or in addition, the command generated by the data processor 1503 may comprise a command to modify a display of notifications at a display associated with the video game machine. Such a command may be generated responsive to a determination that the control device 1500 is being operated in the second mode of operation. The data processor 1503 may generate a command to resume normal display of notifications responsive to the determination that the mode of operation has switched from the second mode to the first mode. The modification of these notifications may be achieved in any of the manners described previously.
In further or alternative examples, the command generated by the data processor 1503 may comprise a command to render frames at one or more of (i) a lower resolution and (ii) a lower frame rate. This command may be generated responsive to a determination that the control device 1500 is being operated in the second mode of operation (as discussed previously). The data processor 1503 may be further configured to generate a command to render frames at one or more of (i) an increased resolution and (ii) a higher frame rate, responsive to a determination that the control device 1500 has switched from the second mode to the first mode. The video game machine may act on the commands received from the control device 1500 accordingly.
Alternatively, or in addition, the command generated by the data processor 1503 may comprise a command to inhibit the display of video at a display. As discussed previously, such a command may be generated responsive to a determination that the control device 1500 is being operated in the second mode. Conversely, the data processor 1503 may be configured to generate a command to resume display responsive to the determination that the mode of operation has switched from the second mode to the first mode.
In further or alternative examples, the command generated by the data processor 1503 may comprise a command to inhibit the outputting of audio at a speaker. Again, such a command may be generated responsive to a determination that the control device 1500 is being operated in the second mode of operation. The data processor 1503 may be configured to generate a command to resume outputting of audio at the speaker responsive to a determination that the mode of operation has switched from the second mode to the first mode.
Alternatively or in addition, the command may comprise a command to disconnect the control device 1500 from the video game machine and / or any output devices connected to the video game machine, responsive to the determination that the control device 1500 is being operated in the second mode. This may be achieved as described previously in relation to Figure 14.
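The various commands described above may be summarised in a sketch such as the following, where the command names are hypothetical labels for the behaviours described in the text, not identifiers from the disclosure:

```python
# Commands the data processor might transmit on entering the second
# ("not-hand-held") mode, per the examples above.
SECOND_MODE_COMMANDS = [
    "inhibit_vibration_data",
    "save_game_state",
    "vary_game_execution",
    "inhibit_notifications",
    "reduce_resolution_and_frame_rate",
    "inhibit_video_display",
    "inhibit_audio_output",
]

# Counterpart commands on returning to the first ("hand-held") mode.
FIRST_MODE_COMMANDS = [
    "resume_vibration_data",
    "resume_game_execution",
    "resume_notifications",
    "restore_resolution_and_frame_rate",
    "resume_video_display",
    "resume_audio_output",
]

def commands_for(mode):
    """Return the commands to transmit for a newly determined mode."""
    if mode == "not-hand-held":
        return SECOND_MODE_COMMANDS
    return FIRST_MODE_COMMANDS
```

Equally, as noted in the text, a single command carrying only the detected mode would suffice where the video game machine holds the mapping itself.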
As noted above, in some examples, the command may simply provide an indication of whether the control device 1500 has been detected as operating in the first or second mode. The video game machine may then act on this information accordingly, based on a known mapping of detected modes and corresponding actions to be performed by the video game machine.
It will be appreciated that the control device 1500 may be configured to perform any of the operations described previously in relation to the data processing apparatus. However, in some examples, the CPU and / or GPU available at the control device may be limited, and so the operations that the control device is capable of may be limited in terms of the processing power and memory required.
Figure 16 shows schematically an example of a method for controlling an output of a hand-holdable control device in dependence on whether the control device is determined as likely being held by a user or not. As before, the control device is suitable for providing player inputs (i.e. in the form of control signals) to a video game programme being executed at a video game machine. The sensors may correspond to any of the sensors described previously in relation to Figures 2A -14.
At a first step 1601, sensor data is obtained from at least one sensor. The at least one sensor may correspond to any of the previously described sensors and be arranged in any of the previously described manners.
At a second step 1602, it is determined, based on the obtained sensor data, whether the hand-holdable control device is being operated in a first or second mode of operation. As described previously, the first mode of operation corresponds to a determination that the control device is being hand-held by the user and the second mode of operation corresponds to a determination that the control device is not being hand-held by the user. The determination of which of the modes the control device is being operated in may be achieved in any of the previously described manners.
At a third step 1603, an output of the control device is varied in dependence on the determination of whether the control device is being operated in the first or the second mode of operation. The output of the control device may be varied in accordance with any of the previously described embodiments. For example, it may be that (i) the intensity and / or (ii) the colour of light output by a light source associated with the control device is varied, or (iii) the intensity of a haptic output generated by one or more transducers located at the control device is varied, and so on.
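The three steps of Figure 16 may be sketched as a single pass of the following function, where the three callables stand in for the hardware- and implementation-specific details (their names and signatures are assumptions for illustration):

```python
def control_device_method(read_sensor, determine_mode, vary_output):
    """One pass of the Figure 16 method.

    read_sensor: obtains sensor data (step 1601).
    determine_mode: maps sensor data to the first or second mode (step 1602).
    vary_output: varies the control device output for that mode (step 1603).
    """
    sensor_data = read_sensor()          # step 1601: obtain sensor data
    mode = determine_mode(sensor_data)   # step 1602: determine the mode
    return vary_output(mode)             # step 1603: vary the output
```

In a running device this pass would simply be repeated, feeding each determination through hysteresis as described earlier.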
Figure 17 shows schematically an example of a method for controlling an output of a video game machine in dependence on whether an associated control device is determined as being held by the user or not. As before, the video game machine is operable to execute a video game programme in dependence on player inputs received at the video game machine (i.e. as control signals) via the control device.
At a first step 1701, sensor data is obtained from at least one sensor. The sensor(s) may correspond to any of the sensors described previously.
At a second step 1702, it is determined, based on the obtained sensor data, whether the hand-holdable control device is being operated in a first or second mode of operation. As before, the first mode of operation corresponds to a determination that the control device is being hand-held by the user and the second mode of operation corresponds to a determination that the control device is not being hand-held by the user. The determination of which of the modes the control device is being operated in may be achieved as described previously.
At a third step 1703, an output of the video game machine is varied in dependence on the determination of whether the control device is being operated in the first or the second mode of operation. The output of the video game machine may be varied as described previously in relation to Figure 14.
It will be appreciated that, whilst the term 'output' has been used in relation to Figure 17, in some examples, it may be that an internal state of the video game machine is modified. As mentioned previously, it may be that, responsive to a determination of the second mode of operation, a save-game operation is performed or that e.g. recording of game play is paused. In such cases, it may be that there is no discernible 'output' to indicate that such an event has taken place.
However, as described previously, in some examples, it may be that the output is discernible; for example, one or more of a haptic, luminous (e.g. lightbar output), video and / or audio output is modified responsive to the determined modes.
In some embodiments, there is provided computer software which, when executed by one or more computers, causes the one or more computers to perform the previously described methods. This computer software may be stored at a non-transitory machine-readable storage medium.
It is noted that the term "based on" is used throughout the present disclosure. The skilled person will appreciate that this term can imply "in dependence upon", "in response to" and the like, such that data A being based on data B indicates that a change in data B will lead to a resulting change in data A. Data B may be an input to a function that calculates data A based on data B, for example.
It will be appreciated that the method(s) described herein may be carried out on conventional hardware suitably adapted as applicable by software instruction or by the inclusion or substitution of dedicated hardware. Thus the required adaptation to existing parts of a conventional equivalent device may be implemented in the form of a computer program product comprising processor implementable instructions stored on a non-transitory machine-readable medium such as a floppy disk, optical disk, hard disk, PROM, RAM, flash memory or any combination of these or other storage media, or realised in hardware as an ASIC (application specific integrated circuit) or an FPGA (field programmable gate array) or other configurable circuit suitable to use in adapting the conventional equivalent device. Separately, such a computer program may be transmitted via data signals on a network such as an Ethernet, a wireless network, the Internet, or any combination of these or other networks.

Claims (48)

  1. Data processing apparatus comprising: a video game processor; a control device hand-holdable by a user, the control device being configured to provide control signals to the video game processor indicative of user inputs at the control device; at least one sensor; and a control processor configured to receive sensor data from the at least one sensor and to determine, in dependence on the received sensor data, whether the control device is being operated in a first or a second mode of operation; wherein a determination of the first mode of operation corresponds to a determination that the control device is being hand-held by the user and a determination of the second mode of operation corresponds to a determination that the control device is not being hand-held by the user; the video game processor and the control processor being configured to cooperate to execute a video game program and to vary a dependency of the execution of the video game program on user inputs at the control device according to the determination by the control processor of whether the control device is being operated in the first or the second mode of operation.
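The arrangement of claim 1 can be sketched as follows. This is a non-authoritative illustration, not the claimed implementation: the `grip` sensor reading and the threshold value are invented for the example, and the "dependency" varied is simply whether an input takes effect.

```python
FIRST_MODE = "hand-held"       # first mode: controller determined to be hand-held
SECOND_MODE = "not-hand-held"  # second mode: controller determined not to be hand-held


def determine_mode(sensor_data: dict, hold_threshold: float = 0.5) -> str:
    """Control processor: classify the mode of operation from received
    sensor data (here a hypothetical 'grip' score in [0, 1])."""
    return FIRST_MODE if sensor_data.get("grip", 0.0) >= hold_threshold else SECOND_MODE


def apply_user_input(game_state: dict, control_signal: str, mode: str) -> dict:
    """Video game processor: vary the dependency of execution on user
    inputs according to the determined mode."""
    if mode == FIRST_MODE:
        game_state["last_input"] = control_signal  # input takes effect
    # in the second mode the control signal is ignored (inhibited)
    return game_state


state = apply_user_input({}, "jump", determine_mode({"grip": 0.9}))
```

The same control signal thus has an effect in the first mode and none in the second, which is the varied dependency the claim recites.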
  2. Apparatus according to claim 1, in which: the control device is configured to vary the provision of the control signals in response to the determination by the control processor.
  3. Apparatus according to claim 2, in which: the control device comprises one or more user-operable controls; the control device is configured to provide control signals in respect of a set of candidate operations of the user-operable controls when the control device is determined to be operated in the first mode of operation; and the control device is configured to inhibit the provision of control signals associated with at least a subset of the set of candidate operations of the user-operable controls in response to a determination of the second mode of operation.
  4. Apparatus according to claim 3, wherein the control device comprises: left and right hold portions to be held by a user and a central section interconnecting the left and right hold portions; and wherein at least one of the user-operable controls comprises a joystick disposed at a respective one of the hold portions and positioned so as to be accessible by a thumb of a hand that is holding that hold portion.
  5. Apparatus according to any of claims 2 to 4, wherein the control device comprises a light source and a panel, the light source being arranged to illuminate the panel, the intensity of light output by the light source being controllable.
  6. Apparatus according to claim 2 or claim 3, wherein the control device comprises an elongate body for hand-holding by the user; and a camera-trackable portion disposed at a distal end of the elongate body when handheld by the user, the camera-trackable portion comprising a light source.
  7. Apparatus according to claim 5 or claim 6, wherein the control device is configured to modify an intensity of light output by the light source in dependence on the determination by the control processor; the control device being configured to cause light to be output by the light source at a first intensity responsive to a determination of the first mode of operation; and to output light at a second intensity responsive to a determination of the second mode of operation, the second intensity being lower than the first intensity.
  8. Apparatus according to any preceding claim, wherein the control device comprises one or more haptic transducers for generating a haptic output detectable at a surface of the control device; wherein the control device is configured to modify a haptic output generated by the one or more haptic transducers in dependence on the determination by the control processor; the control device being configured to generate a haptic output with a first magnitude responsive to a determination of the first mode of operation, and to generate a haptic output with a second magnitude in response to a determination of the second mode of operation, the second magnitude being lower than the first magnitude.
  9. Apparatus according to any preceding claim, comprising: a display to display notifications to a user; wherein the video game processor and the control processor are configured to cooperate to generate notifications for displaying to the user and to modify the display of notifications at the display in dependence on a determination of the first mode of operation or the second mode of operation.
  10. Apparatus according to claim 1, wherein the video game processor is configured to execute a video game in response to the control signals, the video game processor being configured to vary the effect of the control signals on the execution of the video game in dependence on whether the control device is determined as being operated in the first mode of operation or the second mode of operation.
  11. Apparatus according to claim 10, in which: the control device comprises one or more user-operable controls; the control device is configured to provide control signals in respect of a set of candidate operations of the user-operable controls; and the video game processor is configured to inhibit the effect on execution of the video game of control signals associated with at least a subset of the set of candidate operations of the user-operable controls in response to a determination of the second mode of operation.
  12. Apparatus according to any preceding claim, wherein the video game processor is configured to execute a game-state-save operation in response to a determination of the second mode of operation.
  13. Apparatus according to any preceding claim, wherein the sensor comprises an inertial detector configured to detect an acceleration of the control device; wherein the control processor is configured to determine the first or second mode of operation in dependence on a comparison of the detected acceleration with a threshold acceleration.
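The comparison recited in claim 13 can be sketched as a single threshold test; the 0.2 m/s² default is an assumed example value, not taken from the disclosure. The intuition is that a hand-held controller exhibits small, continual tremor accelerations, whereas one resting on a surface does not:

```python
def mode_from_acceleration(accel_ms2: float, threshold_ms2: float = 0.2) -> str:
    """Compare the detected acceleration with a threshold acceleration:
    'first' (hand-held) mode if at or above the threshold, else 'second'."""
    return "first" if abs(accel_ms2) >= threshold_ms2 else "second"
```

A real implementation would typically smooth or window the inertial readings before the comparison; this sketch shows only the claimed threshold dependency.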
  14. Apparatus according to any preceding claim, wherein the at least one sensor comprises a proximity detector configured to detect the proximity of the control device to an object other than the user; and wherein the control processor is configured to determine the first or second mode of operation in dependence on a comparison of the detected proximity of the control device to the object other than the user with a threshold distance.
  15. Apparatus according to any preceding claim, wherein the at least one sensor comprises a temperature sensor arranged so as to detect the temperature at a surface of the control device; and wherein the control processor is configured to determine the first or second mode of operation in dependence on a comparison of the temperature detected at the temperature sensor with a threshold temperature.
  16. Apparatus according to any preceding claim, wherein the at least one sensor comprises one or more microphones; wherein the control processor is configured to determine the first or second mode of operation in dependence on an audio signal detected at the one or more microphones; the control processor comprising a machine learning model trained to detect whether one or more characteristic frequencies are present in the detected audio signal, the first or second mode of operation being determined based on a presence or absence of the one or more characteristic frequencies in the detected audio signal.
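Claim 16 recites a trained machine learning model; as a much simpler stand-in (not the claimed model), the presence or absence of a characteristic frequency can be checked with a naive discrete Fourier scan. The sample rate, tolerance and power-ratio values below are invented for the example:

```python
import math


def has_characteristic_frequency(samples, sample_rate, target_hz,
                                 tol_hz=2.0, power_ratio=0.1):
    """Naive DFT scan: report whether bins within tol_hz of target_hz
    carry at least power_ratio of the signal's total spectral power."""
    n = len(samples)
    best, total = 0.0, 1e-12
    for k in range(1, n // 2):
        re = sum(s * math.cos(2 * math.pi * k * i / n) for i, s in enumerate(samples))
        im = sum(-s * math.sin(2 * math.pi * k * i / n) for i, s in enumerate(samples))
        p = re * re + im * im
        total += p
        if abs(k * sample_rate / n - target_hz) <= tol_hz:
            best = max(best, p)
    return best / total >= power_ratio


rate = 100
tone = [math.sin(2 * math.pi * 10 * i / rate) for i in range(rate)]  # 10 Hz test tone
```

In practice an FFT library and a trained classifier over spectral features would replace this O(n²) scan; the sketch only illustrates the "characteristic frequency present or absent" decision.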
  17. Apparatus according to any preceding claim, further comprising a camera configured to capture images; and wherein the control processor is configured to determine the first or second mode of operation in dependence on the images captured by the camera.
  18. Apparatus according to claim 17, further comprising: an object recognition unit configured to detect the control device in the captured images and one or more surfaces in the environment in which the control device is being used; and wherein the control processor is configured to determine the first or second mode of operation in dependence on whether the control device is detected as being adjacent one or more of the detected surfaces.
  19. Apparatus according to claim 17 or 18, further comprising: a facial recognition unit configured to detect the user's face in the captured images; and wherein the control processor is configured to determine the first or second mode of operation based on whether the user's face is detected in a threshold number of successively captured images.
  20. Apparatus according to claim 19, wherein the camera is located at the control device and is arranged so as to capture images of the user's face when the control device is being handheld by the user.
  21. Apparatus according to claim 3 or claim 11, wherein the control processor is configured to determine the first or second mode of operation in dependence on a rate at which user inputs are received at the control device via the one or more user-operable controls.
  22. Apparatus according to any preceding claim, wherein a determination of the second mode of operation corresponds to a determination that the hand-holdable control device has not been held for longer than a pre-determined time interval.
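The time-interval condition of claim 22 can be sketched as a comparison of elapsed time since the device was last held against a pre-determined interval; the 5-second default is a hypothetical value chosen for the example:

```python
def mode_from_hold_history(last_held_s: float, now_s: float,
                           interval_s: float = 5.0) -> str:
    """'second' mode once the device has not been held for longer than a
    pre-determined interval; otherwise remain in the 'first' mode."""
    return "second" if (now_s - last_held_s) > interval_s else "first"
```

The `last_held_s` timestamp would be refreshed whenever any of the hold-detecting sensors described above indicates the device is hand-held.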
  23. Apparatus according to any preceding claim, comprising a video game console comprising the control processor and the video game processor.
  24. Apparatus according to any one of claims 1 to 23, in which the control device comprises the control processor.
  25. A control device hand-holdable by a user, the hand-holdable control device comprising: a sensor operable to generate sensor data; a control processor configured to receive said sensor data and determine, in dependence on the received sensor data, whether the control device is being operated in a first or second mode of operation; wherein a determination of the first mode of operation corresponds to a determination that the control device is being hand-held by the user and a determination of the second mode of operation corresponds to a determination that the control device is not being hand-held by the user; a communication interface for exchanging data with a video game machine via a wired or wireless connection; and a data processor operable to receive an input from the control processor, and in response to said input, modify an output of the control device.
  26. A control device according to claim 25, wherein the sensor comprises an inertial detector for detecting an acceleration of the control device.
  27. A control device according to claim 25 or claim 26, wherein the sensor comprises a proximity detector for detecting the proximity of the control device to an object that is not the user.
  28. A control device according to any of claims 25 to 27, wherein the sensor comprises a pressure sensor for detecting a force applied to the control device by the user.
  29. A control device according to any of claims 25 to 28, wherein the sensor comprises a temperature sensor for detecting a surface temperature at a portion of the control device that is to be hand-held by the user.
  30. A control device according to any of claims 25 to 29, wherein the sensor comprises an optical heart rate sensor for detecting a user's pulse when the user is gripping the control device.
  31. A control device according to any of claims 25 to 30, wherein the sensor comprises one or more microphones for detecting audio.
  32. A control device according to any of claims 25 to 31, wherein the communication interface is configured to receive vibration data from the video game machine, the control device further comprising: one or more haptic transducers for generating a haptic output, the one or more haptic transducers being configured to generate the haptic output in accordance with the received vibration data; wherein the data processor is operable to modify the haptic output in dependence on whether the control device is determined as being operated in the first or second mode of operation; the data processor being configured to output the haptic output at a first magnitude responsive to the determination that the control device is being operated in the first mode of operation, and to output the haptic output at a second magnitude responsive to the determination that the control device is being operated in the second mode of operation, the second magnitude corresponding to an inhibition of the haptic output relative to the first magnitude.
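The haptic inhibition of claim 32 amounts to scaling the received vibration data by a mode-dependent gain; the 0.2 inhibition gain below is an assumed example value, not specified by the disclosure:

```python
def haptic_output(vibration_data: list, mode: str,
                  inhibit_gain: float = 0.2) -> list:
    """Reproduce received vibration data at a first (full) magnitude in
    the hand-held mode, and at a second, inhibited magnitude otherwise."""
    gain = 1.0 if mode == "first" else inhibit_gain
    return [v * gain for v in vibration_data]
```

An `inhibit_gain` of 0.0 would correspond to suppressing the haptic output entirely while the controller is set down.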
  33. A control device according to any of claims 25 to 32, wherein the data processor is configured to transmit a command to the video game machine via the communication interface, the command varying in dependence on whether the control device is being operated in the first or second mode of operation.
  34. A control device according to claim 33, wherein the command comprises a command to inhibit the transmission of vibration data to the control device responsive to the determination that the control device is being operated in the second mode of operation; and wherein the command comprises a command to resume transmission of vibration data to the control device responsive to the determination that the mode of operation has switched from the second mode to the first mode.
  35. A control device according to any of claims 26 to 34, wherein the control device comprises a light source for enabling at least one of tracking of the control device by an external camera and identifying a character within a video game that a user of the control device corresponds to; wherein the data processor is configured to modify an output of the control device by causing the light source to output light at a first intensity responsive to the determination that the control device is being operated in the first mode of operation, and wherein the data processor is configured to modify an output of the control device by causing the light source to output light at a second intensity responsive to the determination that the control device is being operated in the second mode, the second intensity being lower than the first intensity.
  36. A control device according to any of claims 33 to 35, wherein the command comprises a command to save a current state of a video game being executed at the video game machine responsive to the determination that the control device is being operated in the second mode of operation.
  37. A control device according to any of claims 33 to 36, wherein the command comprises a command to vary an execution of a video game at the video game machine responsive to a determination that the control device is being operated in the second mode; and wherein the command comprises a command to resume normal execution of the video game responsive to the determination that the mode of operation has switched from the second mode to the first mode.
  38. A control device according to any of claims 33 to 37, wherein the command comprises a command to modify a display of notifications at a display associated with the video game machine, responsive to the determination that the control device is being operated in the second mode of operation; and wherein the command comprises a command to resume normal display of notifications responsive to the determination that the mode of operation has switched from the second mode to the first mode.
  39. A control device according to any of claims 33 to 38, wherein the command comprises a command to render frames at one or more of (i) a lower resolution and (ii) a lower frame rate responsive to the determination that the control device is being operated in the second mode of operation; and wherein the command comprises a command to render frames at one or more of (i) an increased resolution and (ii) a higher frame rate, responsive to a determination that the control device has switched from the second mode to the first mode.
  40. A control device according to any of claims 33 to 39, wherein the command comprises a command to inhibit the display of video at a display responsive to the determination that the control device is being operated in the second mode; and wherein the command comprises a command to resume display of video at a display responsive to the determination that the mode of operation has switched from the second mode to the first mode.
  41. A control device according to any of claims 33 to 40, wherein the command comprises a command to inhibit the outputting of audio at a speaker responsive to the determination that the control device is being operated in the second mode of operation; and wherein the command comprises a command to resume outputting of audio at the speaker responsive to the determination that the mode of operation has switched from the second mode to the first mode.
  42. A control device according to any of claims 33 to 41, wherein the command comprises a command to disconnect the control device from the video game machine responsive to the determination that the control device is being operated in the second mode.
  43. A control device according to any of claims 33 to 42, wherein the command comprises a command to disconnect one or more output devices from the video game machine responsive to the determination that the control device is being operated in the second mode.
  44. A method for controlling an output of a hand-holdable control device, the hand-holdable control device being for providing player inputs to a video game programme being executed at a video game machine, the method comprising: obtaining sensor data from a sensor; determining, based on the obtained sensor data, whether the hand-holdable control device is being operated in a first or second mode of operation; wherein the first mode of operation corresponds to a determination that the control device is being hand-held by the user and the second mode of operation corresponds to a determination that the control device is not being hand-held by the user; and varying an output of the control device in dependence on the determination of whether the control device is being operated in the first or the second mode of operation.
  45. A method for controlling an output of a video game machine, the video game machine comprising a video game programme for executing in dependence on player inputs received at the video game machine via a hand-holdable control device, the method comprising: obtaining sensor data from a sensor; determining, based on the obtained sensor data, whether the hand-holdable control device is being operated in a first or second mode of operation; wherein the first mode of operation corresponds to a determination that the control device is being hand-held by the user and the second mode of operation corresponds to a determination that the control device is not being hand-held by the user; and varying an output of the video game machine in dependence on the determination of whether the control device is being operated in the first or the second mode of operation.
  46. Computer software which, when executed by one or more computers, causes the one or more computers to perform the method of at least one of claims 44 and 45.
  47. A non-transitory machine-readable storage medium which stores computer software according to claim 46.
  48. A non-transitory machine-readable storage medium which stores computer software according to claim 47.
GB1916272.6A 2019-11-08 2019-11-08 Data processing apparatus, control device and method Pending GB2588807A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
GB1916272.6A GB2588807A (en) 2019-11-08 2019-11-08 Data processing apparatus, control device and method


Publications (2)

Publication Number Publication Date
GB201916272D0 GB201916272D0 (en) 2019-12-25
GB2588807A true GB2588807A (en) 2021-05-12

Family

ID=69062278



Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20100248822A1 (en) * 2009-03-27 2010-09-30 Microsoft Corporation Personalization using a hand-pressure signature
US20110118026A1 (en) * 2009-11-16 2011-05-19 Broadcom Corporation Hand-held gaming device that identifies user based upon input from touch sensitive panel
US20120122576A1 (en) * 2010-11-17 2012-05-17 Sony Computer Entertainment Inc. Smart shell to a game controller
WO2016140924A1 (en) * 2015-03-01 2016-09-09 Tactical Haptics Embedded grasp sensing devices, systems, and methods
US20170262045A1 (en) * 2016-03-13 2017-09-14 Logitech Europe S.A. Transition between virtual and augmented reality


