GB2562245B - System and method of locating a controller - Google Patents

System and method of locating a controller

Info

Publication number
GB2562245B
Authority
GB
United Kingdom
Prior art keywords
controller
hmd
location
user
lost
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
GB1707394.1A
Other versions
GB2562245A (en)
GB201707394D0 (en)
Inventor
Winesh Raghoebardajal Sharwin
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Sony Interactive Entertainment Europe Ltd
Original Assignee
Sony Interactive Entertainment Europe Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Sony Interactive Entertainment Europe Ltd filed Critical Sony Interactive Entertainment Europe Ltd
Priority to GB1707394.1A priority Critical patent/GB2562245B/en
Publication of GB201707394D0 publication Critical patent/GB201707394D0/en
Publication of GB2562245A publication Critical patent/GB2562245A/en
Application granted granted Critical
Publication of GB2562245B publication Critical patent/GB2562245B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03 Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/033 Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
    • G06F3/0346 Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor with detection of the device orientation or free movement in a 3D space, e.g. 3D mice, 6-DOF [six degrees of freedom] pointers using gyroscopes, accelerometers or tilt-sensors
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011 Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G PHYSICS
    • G02 OPTICS
    • G02B OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00 Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/01 Head-up displays
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F1/00 Details not covered by groups G06F3/00 - G06F13/00 and G06F21/00
    • G06F1/16 Constructional details or arrangements
    • G06F1/1613 Constructional details or arrangements for portable computers
    • G06F1/163 Wearable computers, e.g. on a belt
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011 Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G06F3/012 Head tracking input arrangements

Description

SYSTEM AND METHOD OF LOCATING A CONTROLLER
The present invention relates to a system and method of locating a controller.
The recent advent of virtual reality using head mounted displays (HMDs) provides a new and immersive experience for users in which their field of view is fully occupied by a (typically stereoscopic) representation of a virtual environment or other content such as a pre-recorded movie. However, because of this, the user is unable to see the real world environment, including control devices that they may normally use to interact with the source of this content (for example, a keyboard, mouse or videogame controller).
Consequently, there has been a tendency to provide the user with a simple hand-held controller that they can hold for the duration of their VR session, thereby avoiding the need to relocate the device for successive interactions. An example of such a handheld controller is the Sony Move ® controller.
However, there will be occasions where the user has either not picked up the controller, or has dropped the controller during a VR session, which may inconvenience the user by requiring them to locate the controller again and pick it up.
The present invention seeks to address or mitigate this problem.
In a first aspect, there is provided a system for guiding a user of a head mounted display (HMD) to the location of a lost controller in accordance with claim 1.
In another aspect, there is provided a method of guiding a user of a head mounted display (HMD) to the location of a lost controller in accordance with claim 10.
Further respective aspects and features of the invention are defined in the appended claims.
Embodiments of the present invention will now be described by way of example with reference to the accompanying drawings, in which:
Figure 1 is a schematic diagram of apparatus comprising an entertainment device and head mounted display in accordance with an embodiment of the present invention.
Figure 2 is a flow diagram of a method of guiding a user of a head mounted display (HMD) to the location of a lost controller in accordance with an embodiment of the present invention.
A system and method of locating a controller are disclosed. In the following description, a number of specific details are presented in order to provide a thorough understanding of the embodiments of the present invention. It will be apparent, however, to a person skilled in the art that these specific details need not be employed to practice the present invention. Conversely, specific details known to the person skilled in the art are omitted for the purposes of clarity where appropriate.
In an embodiment of the present invention, a system for guiding a user of a head mounted display (HMD) to the location of a lost controller, as described later herein, may be implemented by an HMD operating in conjunction with a general purpose computer (e.g. a videogame console such as the Sony PlayStation 4®), or by an HMD with an integral processor (such as the Samsung Gear VR, where the processor, display and sensor functions are provided by a smart phone inserted into a head mounted unit comprising suitable optics to project respective halves of the phone display to the user’s eyes).
However, for the purposes of explanation only and as a non-limiting example, the present description will refer to a Sony PS VR head mounted display operating in conjunction with a Sony PlayStation 4.
Hence figure 1 schematically illustrates the overall system architecture of a Sony® PlayStation 4® entertainment device. A system unit 10 is provided, with various peripheral devices connectable to the system unit.
The system unit 10 comprises an accelerated processing unit (APU) 20 being a single chip that in turn comprises a central processing unit (CPU) 20A and a graphics processing unit (GPU) 20B. The APU 20 has access to a random access memory (RAM) unit 22.
The APU 20 communicates with a bus 40, optionally via an I/O bridge 24, which may be a discrete component or part of the APU 20.
Connected to the bus 40 are data storage components such as a hard disk drive 37, and a Blu-ray ® drive 36 operable to access data on compatible optical discs 36A. Additionally the RAM unit 22 may communicate with the bus 40.
Optionally also connected to the bus 40 is an auxiliary processor 38. The auxiliary processor 38 may be provided to run or support the operating system.
The system unit 10 communicates with peripheral devices as appropriate via an audio/visual input port 31, an Ethernet ® port 32, a Bluetooth ® wireless link 33, a Wi-Fi ® wireless link 34, or one or more universal serial bus (USB) ports 35. Audio and video may be output via an AV output 39, such as an HDMI port.
The peripheral devices may include a monoscopic or stereoscopic video camera 41 such as the PlayStation Eye ®; wand-style videogame controllers 42 such as the PlayStation Move ® and conventional handheld videogame controllers 43 such as the Dual Shock 4 ®; portable entertainment devices 44 such as the PlayStation Portable ® and PlayStation Vita ®; a keyboard 45 and/or a mouse 46; a media controller 47, for example in the form of a remote control; and a headset 48. Other peripheral devices may similarly be considered such as a printer, or a 3D printer (not shown).
The GPU 20B, optionally in conjunction with the CPU 20A, generates video images and audio for output via the AV output 39. Optionally the audio may be generated in conjunction with, or instead by, an audio processor (not shown).
The video and optionally the audio may be presented to a television 51. Where supported by the television, the video may be stereoscopic. The audio may be presented to a home cinema system 52 in one of a number of formats such as stereo, 5.1 surround sound or 7.1 surround sound. Video and audio may likewise be presented to a head mounted display unit 53 worn by a user 60.
The user may also interact with the system unit using a video camera 41 such as the PlayStation Eye ®. This may provide monoscopic or stereoscopic video images to the system unit 10 via for example AV input 31. Where these images capture some or all of the user, the user may enact gestures, facial expressions or speech as appropriate to interact with the currently presented user interface.
Alternatively or in addition, a controller designed to assist with camera-based user interaction, such as the PlayStation Move ® 42, may be provided. This controller has a wand form factor and an illuminated region that facilitates detection of the controller within a captured video image. Illuminated regions may similarly be provided on other controllers 43, such as on the Dual Shock 4 ®. Both kinds of controller comprise motion sensors to detect transverse movement along three axes and rotational movement around three axes, and wireless communication means (such as Bluetooth®) to convey movement data to the system unit. Optionally such controllers can also receive control data from the system unit to enact functions such as a rumble effect, or to change the colour or brightness of the illuminated region, where these are supported by the controller.
The system unit may also communicate with a portable entertainment device 44. The portable entertainment device 44 will comprise its own set of control inputs and audio/visual outputs. Consequently, in a ‘remote play’ mode some or all of the portable entertainment device’s inputs may be relayed as inputs to the system unit 10, whilst video and/or audio outputs from the system unit 10 may be relayed to the portable entertainment device for use with its own audio/visual outputs. Communication may be wireless (e.g. via Bluetooth ® or Wi-Fi ®) or via a USB cable.
Other peripherals that may interact with the system unit 10, via either wired or wireless means, include a keyboard 45, a mouse 46, a media controller 47, and a headset 48. The headset may comprise one or two speakers, and optionally a microphone.
Finally, the video and optionally audio may be conveyed to a head mounted display 53 such as the Sony PSVR ® display mentioned previously. The head mounted display typically comprises two small display units respectively mounted in front of the user’s eyes, optionally in conjunction with suitable optics to enable the user to focus on the display units. Alternatively one or more display sources may be mounted to the side of the user’s head and operably coupled to a light guide to respectively present the or each displayed image to the user’s eyes. Alternatively, one or more display sources may be mounted above the user’s eyes and presented to the user via mirrors or half mirrors. In this latter case the display source may be a mobile phone or portable entertainment device 44, optionally displaying a split screen output with left and right portions of the screen displaying respective imagery for the left and right eyes of the user. The head mounted display may comprise integrated headphones, or provide connectivity to headphones. Similarly the head mounted display may comprise an integrated microphone or provide connectivity to a microphone.
In operation, the entertainment device defaults to an operating system such as a variant of FreeBSD 9.0. The operating system may run on the CPU 20A, the auxiliary processor 38, or a mixture of the two. The operating system provides the user with a graphical user interface such as the PlayStation Dynamic Menu. The menu allows the user to access operating system features and to select games and optionally other content.
Referring now again to Figure 1, in an embodiment of the present invention a system for guiding a user of a head mounted display 53 (such as the PS VR mentioned previously) to the location of a lost controller 42 (such as the PlayStation Move mentioned previously, or another peripheral capable of control functions, such as the portable entertainment device when operating as a second screen, for example) is provided.
The system comprises a first location processor adapted to obtain information indicating the location of the controller. This first location processor may for example be the CPU 20A or APU 20 of the PlayStation 4 operating under suitable software instruction, or may be a CPU of the HMD (for example of a mobile phone mounted in the HMD as discussed previously) similarly operating under suitable software instruction.
The operation of the first location processor is discussed later herein.
The system comprises a second location processor adapted to obtain information indicating the location of the HMD. This second location processor may for example be the CPU 20A or APU 20 of the PlayStation 4 operating under suitable software instruction, or may be a CPU of the HMD (for example of a mobile phone mounted in the HMD as discussed previously) similarly operating under suitable software instruction.
The operation of the second location processor is discussed later herein.
The system also comprises a difference calculation processor adapted to calculate the relative difference of the obtained location of the controller to the obtained location of the HMD. Again, this difference calculation processor may for example be the CPU 20A or APU 20 of the PlayStation 4 operating under suitable software instruction, or may be a CPU of the HMD (for example of a mobile phone mounted in the HMD as discussed previously) similarly operating under suitable software instruction.
The operation of the difference calculation processor is discussed later herein.
The system also comprises an indication positioning processor adapted to position an indicator in a virtual space relative to a virtual viewpoint of the user corresponding to the calculated relative difference. This indication positioning processor may for example be the CPU 20A or APU 20 of the PlayStation 4 operating under suitable software instruction, or may be a CPU of the HMD (for example of a mobile phone mounted in the HMD as discussed previously) similarly operating under suitable software instruction.
The operation of the indication positioning processor is discussed later herein.
The system also comprises an image processor adapted to output to the HMD at least a first image comprising the indicator within the virtual space. This image processor may for example be the GPU 20B or APU 20 of the PlayStation 4 operating under suitable software instruction, or may be a GPU of the HMD itself (for example of a mobile phone mounted in the HMD as discussed previously) similarly operating under suitable software instruction.
With regards to the first location processor, in embodiments of the present invention, the location of the controller may be obtained by processing data of an image comprising the controller received from a video camera, and/or by processing data of telemetry received from the controller.
Hence for example, the image data may comprise features characteristic of the controller, such as a known shape, colour or brightness or combination of these that may be matched to a template for the controller held by the first location processor, in order to identify the controller’s position within the image.
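By way of illustration only, and not forming part of the claimed invention, the following Python sketch shows one way an illuminated controller might be located within a captured frame by thresholding on a known colour and taking the centroid of the matching region. The OpenCV and NumPy libraries, the HSV colour bounds and the magenta glow are assumptions made for the sake of the example rather than details taken from this publication.

    import cv2
    import numpy as np

    # Hypothetical HSV bounds for the controller's illuminated region (e.g. a magenta glow).
    LOWER_HSV = np.array([140, 100, 100])
    UPPER_HSV = np.array([170, 255, 255])

    def find_controller_pixel(frame_bgr):
        """Return the (x, y) image position of the controller, or None if it is not visible."""
        hsv = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2HSV)
        mask = cv2.inRange(hsv, LOWER_HSV, UPPER_HSV)   # keep only pixels matching the template colour
        moments = cv2.moments(mask)
        if moments["m00"] < 1e-3:                       # no matching region found in this frame
            return None
        return (moments["m10"] / moments["m00"],        # centroid x of the matching region
                moments["m01"] / moments["m00"])        # centroid y of the matching region
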
Meanwhile the telemetry data may comprise accelerometer data or the like that may be integrated to obtain velocity and position update information.
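As a loose illustration of the double integration described above (again not taken from this publication), accelerometer samples might be accumulated into velocity and position estimates as follows; gravity compensation and drift correction are deliberately omitted, and the function and parameter names are hypothetical.

    import numpy as np

    def integrate_telemetry(position, velocity, accel_samples, dt):
        """Dead-reckon new (position, velocity) estimates from a burst of accelerometer samples.

        position, velocity: 3-vectors (metres, metres per second)
        accel_samples: iterable of 3-vectors (metres per second squared), assumed gravity-compensated
        dt: sample interval in seconds
        """
        position = np.asarray(position, dtype=float)
        velocity = np.asarray(velocity, dtype=float)
        for a in accel_samples:
            velocity = velocity + np.asarray(a, dtype=float) * dt   # first integration: acceleration to velocity
            position = position + velocity * dt                     # second integration: velocity to position
        return position, velocity
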
Image data received from a remote camera may for example be received via any suitable input port of the entertainment device, such as via USB, Bluetooth ® or Wifi ® ports (35, 33, 34). Meanwhile in the case of an HMD using a mobile phone, this may use data from an integrated camera.
The remote camera may be mounted on the HMD, or may be a camera positioned elsewhere within the environment.
If the remote camera is mounted on the HMD, then its position and orientation may be determined from motion sensors incorporated in the HMD and/or analysis of an image from a further camera capturing the HMD.
If the remote camera is located elsewhere in the environment it should preferably encompass both the HMD and the controller in its field of view during normal use.
Telemetry data from a controller may be similarly received at the entertainment device for example via USB, Bluetooth ® or Wifi ® ports (35, 33, 34). Meanwhile in the case of an HMD using a mobile phone, this may use data from an integrated accelerometer or gyroscope.
With regards to the second location processor, in embodiments of the present invention, the location of the HMD may be obtained by processing data of an image comprising the HMD received from a video camera, and/or by processing data of telemetry received from the HMD.
Hence for example, the image data may comprise features characteristic of the HMD, such as a known shape, colour or brightness or combination of these that may be matched to a template for the HMD held by the second location processor, in order to identify the HMD’s position within the image.
Meanwhile the telemetry data may comprise accelerometer data or the like that may be integrated to obtain velocity and position update information.
Image data received from a remote camera may for example be received via any suitable input port of the entertainment device, such as via USB, Bluetooth ® or Wifi ® ports (35, 33, 34). Meanwhile in the case of an HMD using a mobile phone, this may likewise use data from such a remote camera.
The remote camera may be located within the environment at a position where its field of view preferably encompasses both the HMD and the controller during normal use.
Telemetry data from the HMD may be similarly received at the entertainment device for example via USB, Bluetooth ® or Wifi ® ports (35, 33, 34). Meanwhile in the case of an HMD using a mobile phone, this may use data from an integrated accelerometer or gyroscope.
In an embodiment of the present invention, the system comprises a receiver (such as a Bluetooth, Wifi or USB port 33, 34, 35) for receiving state information from the controller. This state information indicates whether a touch sensor associated with the controller detects that it is in contact with the user.
The touch sensor may be any suitable sensor, such as a pressure sensor, capacitance sensor, galvanic skin conductance sensor or the like. Alternatively, any of the existing buttons on the controller may be used as a proxy for contact with the user. Hence for example, if any button is activated within a predetermined period of time (as a non-limiting example, 1 second), then the controller can be inferred to be currently held by the user. Optionally, if a button remains depressed for a second, longer predetermined period of time, its use as a proxy for contact with the user can be discounted (thereby removing scope for a false positive detection if the controller is in fact lying face down on the floor, for example).
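A minimal sketch of this button-press proxy is given below; the one-second and ten-second windows and the function names are illustrative assumptions, not values specified by this publication.

    import time

    HELD_WINDOW_S = 1.0      # a press this recent implies the controller is currently in hand
    STUCK_BUTTON_S = 10.0    # a press held this long is discounted (e.g. controller lying face down)

    def inferred_held(last_press_time, press_duration, now=None):
        """Infer from button activity alone whether the controller is currently held."""
        now = time.monotonic() if now is None else now
        if press_duration >= STUCK_BUTTON_S:
            return False                                # long continuous press: treat as a false positive
        return (now - last_press_time) <= HELD_WINDOW_S
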
In conjunction with the receiver, the system comprises a controller loss detection processor (again for example the CPU 20A operating under suitable software instruction) adapted to detect that a controller is lost if the state information indicates that the controller is not being held by the user.
In an embodiment of the present invention, as noted previously herein the system may comprise a receiver (again such as a Bluetooth, Wifi or USB port 33, 34, 35) for receiving telemetry from the controller. Again as noted previously herein, this telemetry may comprise accelerometer data or the like, from which velocity, position changes and jerk values may also be derived.
In conjunction with the receiver, the system comprises a controller loss detection processor adapted to detect that a controller is lost if the telemetry (optionally after suitable processing) indicates that the distance of the controller from the HMD is more than a predetermined distance. Hence for example the controller loss detection processor may compare the positional change of telemetry data from the controller with the positional change of telemetry data from the HMD (or positional change data for the HMD obtained from video capture) to determine if the distance exceeds a predetermined amount indicative that the controller is no longer being held.
Hence for example if the controller is more than 1 m from the HMD this may be assumed to be greater than the possible distance of a user’s hand from their head, and hence the controller cannot be in their hand.
The threshold distance may be a matter of design choice or based on empirical data. For example, an average distance may be inferred from the user’s age as recorded in their account details, and a proportional additional distance may be added (for example to exclude most users of that age up to the Nth standard deviation, where N is a designer’s choice). Alternatively, a calibration process may measure the maximum distance of the controller from the HMD for the particular user, either during an explicit calibration phase or by measuring relative distances when other indicators that the controller is being held are present (for example the current activation of a button).
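Purely as an illustration of the distance test and the calibrated threshold, under the assumption that 3D positions for both devices are available, a sketch might be:

    import numpy as np

    def calibrate_reach(held_distances_m, margin_m=0.2):
        """Derive a per-user threshold from HMD-to-controller distances observed while the controller was known to be held."""
        return max(held_distances_m) + margin_m          # margin value is an illustrative assumption

    def controller_out_of_reach(controller_pos, hmd_pos, threshold_m=1.0):
        """True if the controller is further from the HMD than a plausible arm's reach (default 1 m)."""
        distance = np.linalg.norm(np.asarray(controller_pos, dtype=float) - np.asarray(hmd_pos, dtype=float))
        return distance > threshold_m
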
Alternatively or in addition, the controller loss detection processor may detect that the telemetry indicates that the controller has undergone a change of acceleration above a threshold amount. This may occur when the controller is dropped and then rapidly decelerates when it hits the floor, or rapidly changes direction when it bounces on a hard or resilient surface.
Again the threshold amount may be a designer choice or empirically determined. Again optionally a maximum change in acceleration attributable to the user in normal use may be obtained during game play or an explicit calibration stage when other indicators that the controller is being held are present, such as the activation of a button. The threshold change in acceleration may then be a predetermined amount or proportion higher than this maximum attributable change.
It will also be appreciated that in principle such a change of acceleration will occur after a predominantly downward motion due to the controller being dropped. Hence other changes of acceleration (for example due to hitting an imaginary tennis ball or swinging an imaginary sword) can be discounted due to the preceding direction of travel regardless of the associated change of acceleration.
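One way to express this drop heuristic (a large change of acceleration preceded by predominantly downward travel) is sketched below. The jerk threshold, the axis convention and the function name are assumptions made for illustration only.

    import numpy as np

    JERK_THRESHOLD = 30.0   # change of acceleration (m/s^2) treated as an impact; illustrative value

    def looks_dropped(accel_prev, accel_now, velocity_before):
        """True if a sharp change of acceleration follows predominantly downward travel.

        Uses a y-up axis convention, so downward motion has a negative y velocity component.
        """
        jerk = np.linalg.norm(np.asarray(accel_now, dtype=float) - np.asarray(accel_prev, dtype=float))
        vx, vy, vz = (float(v) for v in velocity_before)
        moving_down = vy < 0 and abs(vy) > abs(vx) and abs(vy) > abs(vz)
        return jerk > JERK_THRESHOLD and moving_down
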
Alternatively or in addition, the controller loss detection processor may detect that the telemetry indicates that the controller has been static for more than a predetermined period of time.
Alternatively or in addition to telemetry, the controller loss detection processor may use a received video image comprising the controller and optionally but preferably the HMD.
The controller loss detection processor (again under suitable software instruction) may be adapted to detect that a controller is lost if the video image indicates that the controller is not in the user’s hand.
For example, the controller loss detection processor may maintain a skeletal model of the user using known techniques that provides an indication of the position of the user’s hand or hands. The controller loss detection processor may then use a detected position of the controller (using image or telemetry processing as described previously herein) to determine if this substantially coincides with the predicted or detected position of one of the user’s hands. If the controller and the user’s hands are calculated to be more than a threshold distance apart, then the controller loss detection processor may signal that the controller is lost.
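Purely as an illustration (the skeletal-tracking mechanism itself is not specified here), the coincidence test between the detected controller position and the tracked hand positions might be expressed as follows; the tolerance value is a hypothetical choice.

    import numpy as np

    HAND_MATCH_M = 0.25   # illustrative tolerance: controller within 25 cm of a tracked hand counts as held

    def controller_in_hand(controller_pos, hand_positions, tolerance=HAND_MATCH_M):
        """True if the detected controller position substantially coincides with either tracked hand."""
        c = np.asarray(controller_pos, dtype=float)
        return any(np.linalg.norm(c - np.asarray(h, dtype=float)) <= tolerance
                   for h in hand_positions)
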
Alternatively or in addition, in a similar manner to the analysis of the telemetry, the controller loss detection processor may detect that the distance of the controller from the HMD is more than a predetermined distance but using the captured video image instead of (or optionally in conjunction with) the telemetry. Hence if the controller and the HMD as detected in the image (for example using the techniques described previously herein) are detected to be more than a threshold distance apart, then the controller loss detection processor may signal that the controller is lost. As with the telemetry implementation, the threshold distance may be assumed or determined through a calibration process.
It will be appreciated that any suitable combination of the above loss detection techniques may be used.
Hence for example, if the controller is detected as being stationary, but at the same time a touch sensor detects that the controller is being held, then the controller loss detection processor may signal that the controller is being held (even if it is being held very still).
Similarly, if the telemetry indicates a sudden change of acceleration, but the video image indicates that the controller is still within a predetermined distance from the HMD and that the user is standing, then the controller loss detection processor may signal that the controller is being held (even though it has experienced a sudden change of acceleration).
Hence different sources of detection of loss may outrank or take priority over each other, where available.
Alternatively or in addition, indicators from plural sources of detection of loss may contribute additively, such that for example a score may be generated for the current change of acceleration, the current relative position of the controller to the HMD and how recently a button on the controller was pressed to generate a total score indicative of whether the controller has been lost.
The controller loss detection processor may then compare this score to a threshold and signal that the controller is lost if the score exceeds this threshold.
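The additive combination might be sketched as below; the individual sub-scores, normalisation constants and the overall threshold are all illustrative assumptions rather than values defined by this publication.

    def loss_score(jerk, distance_m, seconds_since_press,
                   jerk_norm=30.0, reach_m=1.0, press_norm=5.0):
        """Combine several weak indicators into a single 'lost controller' score in the range [0, 3]."""
        s_jerk = min(jerk / jerk_norm, 1.0)                   # sharp impacts push the score up
        s_dist = min(distance_m / reach_m, 1.0)               # being far from the HMD pushes the score up
        s_idle = min(seconds_since_press / press_norm, 1.0)   # long-unpressed buttons push the score up
        return s_jerk + s_dist + s_idle

    def controller_lost(jerk, distance_m, seconds_since_press, threshold=2.0):
        """Signal loss when the combined score exceeds an (illustrative) threshold."""
        return loss_score(jerk, distance_m, seconds_since_press) > threshold
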
Hence the above techniques provide a means of determining if the controller is lost, to the extent that the user is unlikely to be holding the controller whilst wearing an HMD that prevents direct view of the controller in the real world.
In principle, the user may for example simply have put the controller down on a table momentarily, but have adequate proprioceptive and kinaesthetic awareness to be able to reach for the controller again without assistance.
Hence in an embodiment of the present invention, the system may not immediately initiate a visible guide to locate the controller when it is detected to no longer be held by the user.
Similarly, the system may not immediately initiate a visible guide to locate the controller when it is detected that the user placed the controller down deliberately.
For example, if the controller was moved at a velocity below a threshold value, or experienced changes in acceleration below a threshold value, before coming to rest at a given position, then it may be assumed that the controller was placed there by a deliberate action of the user.
Alternatively or in addition, where video image data is available, a correlation between the user’s hand (for example by skeletal analysis) and the controller prior to the controller becoming static also indicates a deliberate act.
By contrast, if the system has just started and the controller is at rest, but there has been no scope to detect that this is due to a deliberate act, it may be assumed that the user forgot to pick the controller up and so the system may provide immediate guidance.
Similarly, if the controller experienced rapid movement or a large change in acceleration before coming to rest, it may be assumed that the loss of contact is accidental and so the system may provide immediate guidance.
Hence more generally the system may implement a delay in the provision of guidance responsive to a detection of whether the loss of the controller was deliberate instead of accidental.
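A possible expression of this delay logic is sketched below; the speed threshold and the delay values are illustrative assumptions only.

    SLOW_PLACEMENT_MPS = 0.5          # motion slower than this before coming to rest suggests a deliberate act
    GUIDE_DELAY_DELIBERATE_S = 30.0   # wait before guiding when the controller was put down on purpose
    GUIDE_DELAY_ACCIDENTAL_S = 0.0    # guide immediately after a drop, or when the controller was never picked up

    def placement_was_deliberate(peak_speed_before_rest, hand_was_near_controller):
        """Slow motion before rest, or a tracked hand next to the controller, implies a deliberate act."""
        return peak_speed_before_rest < SLOW_PLACEMENT_MPS or hand_was_near_controller

    def guidance_delay_s(was_deliberate):
        """Delay before showing the locating indicator, based on how contact with the controller was lost."""
        return GUIDE_DELAY_DELIBERATE_S if was_deliberate else GUIDE_DELAY_ACCIDENTAL_S
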
With regards to the visible guide to locate the controller, in an embodiment of the present invention, as described previously herein, the indication positioning processor is adapted to position an indicator in a virtual space relative to a virtual viewpoint of the user corresponding to a calculated relative difference between the obtained location of the controller and the obtained location of the HMD.
Hence the indication positioning processor transposes the relative position of the controller with respect to the HMD from the real world into the virtual world. Where necessary, an offset may be applied to account for any difference between the notional position of the HMD as calculated by the system and the user’s actual point of view (hence for example if the position of the HMD is calculated for a centroid of the HMD, this may differ from the position of the HMD screen, as may a position of a light on the HMD used as a proxy for the HMD position).
It may be assumed that the virtual environment is displayed at a scale that is substantially the same as the real world, and hence the transposed difference in position can be a direct 1:1 mapping. However if the scale is different, then the mapping ratio can be altered appropriately.
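The transposition described in the two preceding paragraphs might be sketched as follows; the parameter names, the eye offset and the scale factor are assumptions for illustration.

    import numpy as np

    def place_indicator(controller_pos, hmd_pos, virtual_viewpoint,
                        eye_offset=(0.0, 0.0, 0.0), world_to_virtual_scale=1.0):
        """Position the indicator in virtual space at the real-world offset of the controller from the HMD.

        eye_offset corrects for any difference between the tracked HMD reference point (for example its
        centroid or a marker light) and the user's actual viewpoint; the scale is 1:1 unless the virtual
        environment is rendered at a different scale to the real world.
        """
        relative = np.asarray(controller_pos, dtype=float) - (np.asarray(hmd_pos, dtype=float)
                                                              + np.asarray(eye_offset, dtype=float))
        return np.asarray(virtual_viewpoint, dtype=float) + relative * world_to_virtual_scale
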
Meanwhile it will be appreciated that the user’s viewpoint within a virtual environment may not correlate exactly with their corresponding real-world viewpoint; for example, their in-game character may be taller or shorter than they are in real life. Consequently if the controller falls on the floor (for example), its position, when transposed into the virtual environment, may appear to float above or below a virtual floor in that environment.
Consequently, if the virtual environment is still displayed when indicating the location of a lost controller, then the indicator for the controller should not be subject to z-culling or clipping so that if it is on the other side of a virtual obstacle it remains visible.
Alternatively the virtual environment may be replaced with a separate virtual space, or may be alpha blended with such a space in which the indicator is positioned. In this way the user can navigate toward the controller in a space that is consistent with the real world relationships between the user and the lost controller.
The indicator itself may be any suitable graphical indicator, such as an icon, that is placed within the virtual environment/space. Alternatively it may be a virtual model of the controller itself. If orientation telemetry from the controller is available, its orientation at rest may thus be indicated to the user so that they can more easily pick it up whilst otherwise unable to see it.
Optionally, the system may also comprise an audio processor adapted to output an audio signal to assist with locating the controller, responsive to one or both of: the calculated relative difference of the obtained location of the controller to the obtained location of the HMD; and the position of the indicator within the at least first image (where this image has been obtained from an HMD mounted camera). This provides an intuitive cold-to-hot style of audio feedback based on the distance from the controller and also on whether the user is looking directly at the controller.
This may be of assistance if the controller is at the periphery of the user’s vision, which may be the case where the HMD’s own field of view is narrower than what the user normally experiences.
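One way such a cold-to-hot cue might be derived is sketched below: volume rises as the user approaches the controller, and pitch rises as they look toward it. The range, gain curve and pitch mapping are all hypothetical choices.

    import numpy as np

    def audio_cue(controller_pos, hmd_pos, view_dir, max_range_m=3.0):
        """Return (volume, pitch_scale): louder when close to the controller, higher pitch when looking at it."""
        offset = np.asarray(controller_pos, dtype=float) - np.asarray(hmd_pos, dtype=float)
        distance = np.linalg.norm(offset)
        volume = float(np.clip(1.0 - distance / max_range_m, 0.1, 1.0))       # never fully silent
        view = np.asarray(view_dir, dtype=float)
        alignment = float(np.dot(offset / max(distance, 1e-6), view / np.linalg.norm(view)))
        pitch_scale = 1.0 + 0.5 * max(alignment, 0.0)                         # rises as gaze aligns with the controller
        return volume, pitch_scale
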
It will be appreciated that there will be circumstances in which the system is not able to locate the controller. For example, if the system relies solely or predominantly upon image processing to locate the controller, and the controller comes to rest outside the field of view of the available camera(s), then it cannot be located with reference to the HMD. Similarly if the system relies solely or predominantly on telemetry, but the user forgets to pick the controller up and hence no reference position can be established, then it cannot easily be located with reference to the HMD. Similarly if the HMD cannot be detected it becomes difficult to locate it with reference to the controller. Hence more generally if the location of either the controller or the HMD cannot be found, then it is difficult to estimate the difference in position between the two objects and hence present a visual guide to the user through the HMD display. In these circumstances, a failure mode may optionally be used.
Accordingly, in an embodiment of the present invention the system may comprise a failure processor (for example CPU 20A operating under suitable software instruction) adapted to detect if either the first location processor fails to obtain information indicating the location of the controller or the second location processor fails to obtain information indicating the location of the HMD, and a transmitter adapted to transmit a command to the controller to emit an audible signal. The audible signal may be output through a loudspeaker or beeper of the controller, and/or by causing a rumble function of the controller to activate.
In this way, the user may be able to locate the position of the controller using audio cues whilst still wearing the HMD.
Clearly, when instructing the controller to generate sounds, the system may optionally suspend audio output from the HMD itself to assist the user in listening for the controller.
Optionally the failure processor is also adapted to notify the user of either a failure by the first location processor to obtain information indicating the location of the controller or a failure by the second location processor to obtain information indicating the location of the HMD. This may take the form of a general failure notification, or an indication of which of the controller or HMD the system has failed to locate, as this may assist the user in taking remedial action to bring the relevant object back into view or reception range, etc., of the system.
Turning now to figure 2, a method of guiding a user of a head mounted display (HMD) to the location of a lost controller comprises: a first step s210 of obtaining information indicating the location of the controller; a second step s220 of obtaining information indicating the location of the HMD; a third step s230 of calculating the relative difference of the obtained location of the controller to the obtained location of the HMD; a fourth step s240 of positioning an indicator in a virtual space relative to a virtual viewpoint of the user corresponding to the calculated relative difference; and a fifth step s250 of outputting to the HMD at least a first image comprising the indicator within the virtual space.
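For illustration only, the five steps s210 to s250 can be read as a simple pipeline, expressed here as a Python sketch in which the location, placement and rendering functions are hypothetical callbacks standing in for the processors described above.

    def guide_user_to_controller(locate_controller, locate_hmd, place_indicator, render_to_hmd):
        """One pass of the method of Figure 2, steps s210 to s250."""
        controller_pos = locate_controller()                                   # s210: obtain controller location
        hmd_pos = locate_hmd()                                                 # s220: obtain HMD location
        relative = tuple(c - h for c, h in zip(controller_pos, hmd_pos))       # s230: calculate relative difference
        indicator = place_indicator(relative)                                  # s240: position indicator in virtual space
        return render_to_hmd(indicator)                                        # s250: output image comprising the indicator
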
It will be apparent to a person skilled in the art that variations in the above method corresponding to operation of the various embodiments of the system as described and claimed herein are considered within the scope of the present invention, including but not limited to: receiving state information from the controller, and detecting that a controller is lost if the state information indicates that the controller is not being held by the user; receiving telemetry from the controller, and detecting that a controller is lost if the telemetry indicates one or more of: that the distance of the controller from the HMD is more than a predetermined distance, that the controller has undergone a change of acceleration above a threshold amount, and/or that the controller has been static for more than a predetermined period of time; receiving a video image comprising the controller, and detecting that a controller is lost if the video image indicates that the controller is not in the user’s hand and/or that the distance of the controller from the HMD is more than a predetermined distance; and detecting either failure to obtain information indicating the location of the controller or failure to obtain information indicating the location of the HMD, and transmitting a command to the controller to emit an audible signal.
Hence it will be appreciated that the above methods may be carried out on conventional hardware (such as that described previously herein) suitably adapted as applicable by software instruction or by the inclusion or substitution of dedicated hardware.
Thus the required adaptation to existing parts of a conventional equivalent device may be implemented in the form of a computer program product comprising processor implementable instructions stored on a non-transitory machine-readable medium such as a floppy disk, optical disk, hard disk, PROM, RAM, flash memory or any combination of these or other storage media, or realised in hardware as an ASIC (application specific integrated circuit) or an FPGA (field programmable gate array) or other configurable circuit suitable for use in adapting the conventional equivalent device. Separately, such a computer program may be transmitted via data signals on a network such as an Ethernet, a wireless network, the Internet, or any combination of these or other networks.

Claims (15)

1. A system for guiding a user of a head mounted display (HMD) to the location of a lost controller, the system comprising: a receiver for receiving telemetry from the controller; a controller loss detection processor adapted to detect that the controller is lost if the telemetry indicates one or more selected from the list consisting of: i) that the controller has undergone a change of acceleration above a threshold amount; and ii) that the controller has been static for more than a predetermined period of time; a first location processor adapted to obtain information indicating the location of the controller; a second location processor adapted to obtain information indicating the location of the HMD; a difference calculation processor adapted to calculate the relative difference of the obtained location of the controller to the obtained location of the HMD; an indication positioning processor adapted, responsive to detection that the controller is lost, to position an indicator in a virtual space relative to a virtual viewpoint of the user corresponding to the calculated relative difference; and an image processor adapted to output to the HMD at least a first image comprising the indicator within the virtual space.
2. The system of claim 1, in which the location of the HMD is obtained from one or more selected from the list consisting of: i. an image comprising the HMD received from a video camera; and ii. telemetry received from the HMD.
3. The system of claim 1 or claim 2, in which the location of the controller is obtained from one or more selected from the list consisting of: i. an image comprising the controller received from a video camera; and ii. telemetry received from the controller.
4. The system of any preceding claim, comprising: a receiver for receiving state information from the controller; and a controller loss detection processor adapted to detect that a controller is lost if the state information indicates that the controller is not being held by the user.
5. The system of claim 1, wherein the controller loss detection processor is adapted to detect that the controller is lost if the telemetry indicates that the distance of the controller from the HMD is more than a predetermined distance.
6. The system of any preceding claim, comprising: a receiver for receiving a video image comprising the controller; and a controller loss detection processor adapted to detect that a controller is lost if the video image indicates one or more selected from the list consisting of: i. that the controller is not in the user’s hand; and ii. that the distance of the controller from the HMD is more than a predetermined distance.
7. The system of any preceding claim, comprising: an audio processor adapted to output an audio signal responsive to one or more selected from the list consisting of: i. the calculated relative difference of the obtained location of the controller to the obtained location of the HMD; and ii. the position of the indicator within the at least first image.
8. The system of any preceding claim, comprising: a failure processor adapted to detect if either the first location processor fails to obtain information indicating the location of the controller or the second location processor fails to obtain information indicating the location of the HMD; and a transmitter adapted to transmit a command to the controller to emit an audible signal.
9. The system of claim 8, in which the failure processor is adapted to notify the user of either a failure by the first location processor to obtain information indicating the location of the controller or a failure by the second location processor to obtain information indicating the location of the HMD.
10. A method of guiding a user of a head mounted display (HMD) to the location of a lost controller, the method comprising the steps of: receiving telemetry from the controller; detecting that the controller is lost if the telemetry indicates one or more selected from the list consisting of: i) that the controller has undergone a change of acceleration above a threshold amount; and ii) that the controller has been static for more than a predetermined period of time; obtaining information indicating the location of the controller; obtaining information indicating the location of the HMD; calculating the relative difference of the obtained location of the controller to the obtained location of the HMD; positioning, responsive to detection that the controller is lost, an indicator in a virtual space relative to a virtual viewpoint of the user corresponding to the calculated relative difference; and outputting to the HMD at least a first image comprising the indicator within the virtual space.
11. The method of claim 10, comprising the steps of: receiving state information from the controller; and detecting that a controller is lost if the state information indicates that the controller is not being held by the user.
12. The method of claim 10, comprising the step of: detecting that a controller is lost if the telemetry indicates that the distance of the controller from the HMD is more than a predetermined distance.
13. The method of any one of claims 10 to 12, comprising the steps of: receiving a video image comprising the controller; and detecting that a controller is lost if the video image indicates one or more selected from the list consisting of: i. that the controller is not in the user’s hand; and ii. that the distance of the controller from the HMD is more than a predetermined distance.
14. The method of any one of claims 10 to 13, comprising the steps of: detecting either failure to obtain information indicating the location of the controller or failure to obtain information indicating the location of the HMD; and transmitting a command to the controller to emit an audible signal.
15. A computer readable medium having computer executable instructions adapted to cause a computer system to perform the method of any one of claims 10 to 14.
GB1707394.1A 2017-05-09 2017-05-09 System and method of locating a controller Active GB2562245B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
GB1707394.1A GB2562245B (en) 2017-05-09 2017-05-09 System and method of locating a controller

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
GB1707394.1A GB2562245B (en) 2017-05-09 2017-05-09 System and method of locating a controller

Publications (3)

Publication Number Publication Date
GB201707394D0 GB201707394D0 (en) 2017-06-21
GB2562245A GB2562245A (en) 2018-11-14
GB2562245B true GB2562245B (en) 2019-09-25

Family

ID=59065526

Family Applications (1)

Application Number Title Priority Date Filing Date
GB1707394.1A Active GB2562245B (en) 2017-05-09 2017-05-09 System and method of locating a controller

Country Status (1)

Country Link
GB (1) GB2562245B (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
TWI798037B (en) * 2021-08-31 2023-04-01 宏達國際電子股份有限公司 Virtual image display system and calibration method for pointing direction of controller thereof

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20120075167A1 (en) * 2010-09-29 2012-03-29 Eastman Kodak Company Head-mounted display with wireless controller
US20150352437A1 (en) * 2014-06-09 2015-12-10 Bandai Namco Games Inc. Display control method for head mounted display (hmd) and image generation device
US20160171770A1 (en) * 2014-12-10 2016-06-16 Sixense Entertainment, Inc. System and Method for Assisting a User in Locating Physical Objects While the User is in a Virtual Reality Environment

Also Published As

Publication number Publication date
GB2562245A (en) 2018-11-14
GB201707394D0 (en) 2017-06-21

Legal Events

Date Code Title Description
732E Amendments to the register in respect of changes of name or changes affecting rights (sect. 32/1977)

Free format text: REGISTERED BETWEEN 20200723 AND 20200729