GB2587368A - Tactile output device and system - Google Patents

Tactile output device and system

Info

Publication number
GB2587368A
Authority
GB
United Kingdom
Prior art keywords
output device
tactile output
touch
video game
input
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
GB1913794.2A
Other versions
GB201913794D0 (en)
Inventor
Michael James Hollingsworth William
Mark Collingwood Thomas
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Sony Interactive Entertainment Inc
Original Assignee
Sony Interactive Entertainment Inc
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Sony Interactive Entertainment Inc filed Critical Sony Interactive Entertainment Inc
Priority to GB1913794.2A priority Critical patent/GB2587368A/en
Publication of GB201913794D0 publication Critical patent/GB201913794D0/en
Publication of GB2587368A publication Critical patent/GB2587368A/en
Pending legal-status Critical Current

Classifications

    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/016 Input arrangements with force or tactile feedback as computer generated output to the user
    • G06F3/0219 Special purpose keyboards
    • G06F3/0354 Pointing devices displaced or positioned by the user, with detection of 2D relative movements between the device, or an operating part thereof, and a plane or surface, e.g. 2D mice, trackballs, pens or pucks
    • G06F3/03547 Touch pads, in which fingers can move on a surface
    • G06F3/0362 Pointing devices with detection of 1D translations or rotations of an operating part of the device, e.g. scroll wheels, sliders, knobs, rollers or belts
    • G06F3/04815 Interaction with a metaphor-based environment or interaction object displayed as three-dimensional, e.g. changing the user viewpoint with respect to the environment or object
    • G06F3/0484 Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F3/04842 Selection of displayed objects or displayed text elements
    • G06F3/04845 Interaction techniques for image manipulation, e.g. dragging, rotation, expansion or change of colour
    • G06F3/0488 Interaction techniques using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • A63F13/214 Input arrangements for video game devices characterised by their sensors, purposes or types, for locating contacts on a surface, e.g. floor mats or touch pads
    • A63F13/285 Generating tactile feedback signals via the game input device, e.g. force feedback
    • A63F13/5378 Controlling the output signals based on the game progress, using indicators for displaying an additional top view, e.g. radar screens or maps
    • A63F2009/0004 Games specially adapted for blind or partially sighted people using BRAILLE
    • G06F2203/013 Force feedback applied to a game
    • G06F2203/014 Force feedback applied to GUI
    • G06F2203/04803 Split screen, i.e. subdividing the display area or the window area into separate subareas
    • G06F2203/04806 Zoom, i.e. interaction techniques or interactors for controlling the zooming operation

Abstract

A tactile output device is provided. The tactile output device comprises a communication interface operable to receive video game state information indicating at least one of (i) a topography of a virtual environment and (ii) the locations of one or more objects in a virtual environment. The device also comprises a touchpad comprising a plurality of touch elements for generating a respective tactile output. The device further comprises a controller operable to receive an input from the communication interface, and in response to said input control the tactile output of at least some of the touch elements so as to generate a tactile representation of at least one of a topography of the virtual environment and the location of one or more objects in the virtual environment, each touch element corresponding to a different respective location in the virtual environment. A system comprising the tactile output device is also provided.

Description

TACTILE OUTPUT DEVICE AND SYSTEM
Technical Field
The present disclosure relates to a tactile output device and system.
Background
The "background" description provided herein is for the purpose of generally presenting the context of the disclosure. Work of the presently named inventors, to the extent it is described in this background section, as well as aspects of the description which may not otherwise qualify as prior art at the time of filing, are neither expressly or impliedly admitted as prior art against the present invention.
Currently, the accessibility of video games to the blind or visually impaired is limited. In known systems, accessibility features include: text-to-speech, zooming functionality, text size and boldness controls, contrast settings and colour inversion. However, these features tend to result in an interruption or modification of the content that would normally be displayed during the playing of a video game. For example, it may be that various displayed elements need to be re-configured and re-sized in order to accommodate enlarged text, or that background audio is interrupted for e.g. a description of the scene.
Some video games, such as open-world video games, provide players with diverse and complex virtual environments to explore and interact with. These virtual environments tend to include a variety of geographies, such as, e.g. mountainous terrain, rolling hills, urban metropolises, etc. Moreover, there are typically a variety of different objects within these environments for a player to view and interact with. However, for visually impaired players, the scale, motion and relative position of objects within the virtual environment, as well as physical features of the environment (e.g. waterfalls, geysers, etc.) may be less apparent. A player may attempt to mitigate this by e.g. zooming in on a particular part of the screen or using e.g. text-to-speech to get an idea of what a current scene corresponds to. However, as will be appreciated, this will typically result in at least some of the visual, and possibly audio, information of the video game being interrupted. Overall, this can result in a loss of immersion for the visually impaired player. The present disclosure seeks to alleviate these problems.
Summary
The present disclosure is defined by the appended claims.
Brief Description of the Drawings
To assist understanding of the present disclosure and to show how embodiments may be put into effect, reference is made by way of example to the accompanying drawings in which: Figure 1 shows schematically an example of a tactile output device in accordance with the present disclosure; Figure 2 shows schematically an example of a tactile representation of a topography of a virtual environment when the tactile output device is operating in a top-down mode; Figure 3 shows schematically an example of a tactile representation of one or more objects in a virtual environment when the tactile output device is operating in a top-down mode; Figure 4 shows schematically an example of a tactile representation of a virtual environment at a lower magnification; Figure 5 shows schematically an example of a forward-view mode of the tactile output device; Figure 6 shows schematically an example of a screen-mode of the tactile output device; and Figure 7 shows schematically an example of a system that includes the tactile output device of the present disclosure.
Detailed Description
While several video game consoles provide users with accessibility features, the amount of information that a visually impaired player can obtain about a video game world is often limited. Typically, the scale, motion and position of objects in a virtual environment, as well as physical features of the virtual environment, are not readily apparent to a visually impaired player. In turn, this can reduce the immersion that is experienced by the visually impaired player. The present disclosure provides a device and system that is designed to mitigate this loss of immersion for visually impaired players.
Figure 1 shows schematically an example of a tactile output device 100 for generating a tactile representation of a virtual environment. The tactile output device 100 comprises a communication interface 101 operable to receive video game state information. The video game state information may be received from an information processing apparatus, such as a video game machine. The video game state information may be generated as a result of the information processing apparatus executing a game program.
The video game state information may include information pertaining to a topography of a virtual environment that is rendered as part of the video game. The topography of the virtual environment may be defined by a map in the form of e.g. a 3D mesh, that represents the physical landscape and any physical structures present in the virtual environment. In some examples, topography information may pertain to physical features of the virtual environment that delimit the navigable extent of the virtual environment. That is, the topography information may provide an indication of blocked or impassable areas of the virtual environment, and/or conversely an indication of navigable paths.
In some examples, the topography information may indicate the topography of the virtual environment within a threshold distance of the player's location. The threshold distance may correspond to the part of the virtual environment that is within a player's current field of view, for example. In other examples, the threshold distance may correspond to some arbitrary radius centred on the player. It is typical in some video games for only parts of the scene within a threshold distance of the player to be rendered, and the topography information may relate to such parts of the game scene.
As will be appreciated, it may be inefficient for a high resolution mesh of an entire virtual world to be received at the tactile output device 100. The tactile output device 100 may lack a sufficient number of tactile elements (described below) for representing the virtual world at a corresponding high tactile resolution. Thus, the amount, and also the granularity, of the video game state information that is processed by or for the tactile output device 100 may be dependent on a mode in which a user is operating the device, as will be described later.
Alternatively or in addition, the video game state information may include information indicating the locations of one or more objects relative to the position of a player in the virtual environment. The objects may correspond to collectable items, characters, enemies, etc. that a player is able to interact with in the virtual environment. In most video games a player is in control of a playable character or object, and the location of this character / object corresponds to the position of the player in the virtual environment. In some examples, the video game state information may include the absolute location of the player and one or more other objects in the virtual environment.
In some examples, the video game state information may further comprise attribute information associated with the one or more objects identified in the video game state information. The attribute information may indicate, for each object, at least one of (i) a dimension of the object, such as e.g. the height of the object; (ii) a motion of the object; and (iii) an interactivity associated with the object. It will be appreciated that the dimension and / or motion associated with a given object need not be confined to the vertical direction; the dimension that is represented may depend on the mode in which the tactile output device 100 is being operated, as will be described later.
In some examples, the video game state information may comprise a configuration file that includes metadata associated with each of the objects. That is, the one or more attributes of the objects may be defined as metadata that is received at the communication interface from e.g. a video game machine.
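By way of a non-limiting illustration, the Python sketch below shows one way the video game state information and the per-object metadata described above might be structured on receipt at the communication interface. All field names and types here are assumptions made for illustration; the disclosure does not define a particular wire format.

```python
from dataclasses import dataclass, field
from typing import Dict, List, Tuple

@dataclass
class ObjectAttributes:
    # Attribute metadata as described above: (i) a dimension,
    # (ii) a motion, and (iii) an interactivity of the object.
    height: float = 0.0
    velocity: Tuple[float, float, float] = (0.0, 0.0, 0.0)
    interactive: bool = False
    priority: int = 0  # higher for objects tied to a game objective

@dataclass
class GameStateMessage:
    # Low-resolution height grid derived from the environment's 3D mesh,
    # covering the area within a threshold distance of the player.
    height_grid: List[List[float]] = field(default_factory=list)
    # Object positions relative to the player, keyed by object id.
    object_positions: Dict[str, Tuple[float, float, float]] = field(default_factory=dict)
    # Per-object metadata, as in the configuration file described above.
    object_attributes: Dict[str, ObjectAttributes] = field(default_factory=dict)

msg = GameStateMessage(
    height_grid=[[0.0, 1.2], [0.4, 2.5]],
    object_positions={"chest_01": (3.0, -1.0, 0.0)},
    object_attributes={"chest_01": ObjectAttributes(height=0.8, interactive=True)},
)
```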
In some examples, obtaining the video game state information may involve determining the pose of a player's head and / or the position of their eyes. This may be the case, for example, where a player is using a head-mountable display (HMD) to view the virtual environment. The pose of the player's head may be tracked by e.g. an accelerometer or gyroscope within the HMD. The position of the player's eyes may be tracked by a gaze tracking unit (e.g. an infra-red camera) within the HMD. The pose of the player's head and / or position of the player's eyes may be used to determine a view frustum of the player, which may then be used to determine the parts of the virtual environment and / or any objects that are visible to the player. The video game machine may then transmit the corresponding video game state information to the tactile output device for rendering. It will be appreciated that, for non-HMD examples, the pose (and optionally, intrinsics) of a virtual camera within the virtual environment may be used to determine which parts of the virtual environment are within a viewing frustum of the player. It will also be appreciated that a user of an HMD (or any conventional visual display mechanism) may optionally be separate from a user of the tactile output device, for example operating in a cooperative manner to play a game, with inputs optionally from the tactile output device (as described later herein) and optionally from one or more conventional means of control, as shared between the plurality of users.

The tactile output device 100 further comprises a touchpad 104. The touchpad 104 comprises a plurality of touch elements 103 for generating a respective tactile output. Each touch element may correspond to a protrusion, such as a pin, having a controllable displacement and / or size. Although the term 'pin' is used in the present disclosure, it will be appreciated that the touch elements 103 need not have a rounded or pointed tip. For example, in Figure 1, each pin is shown as having a square-shaped upper end. Generally, the pin may have an upper end of any shape, provided that the user can perceive a tactile sensation by touching the pin or moving their finger over it.
The tactile output of each touch element is selectively controllable. This means that the tactile output at different locations on the touchpad 104 can be individually controlled. The pins may be movable between a 'rest' position and an 'active' position. A pin may be said to be in an 'active' position if it has actively been moved (or increased in size) so as to generate a tactile output relative to the other pins. The 'rest' position may correspond to a default position that pins remain in, or are returned to, when not being used to generate a tactile output. A pin may be said to be in the rest position if it is substantially flush with an upper surface of the touchpad, for example. Alternatively, a pin may be in a rest position if it is at a default height with respect to the upper surface of the touchpad, the default height being lower than the height of the pin when in an active state.
In some examples, each pin may have a plurality of active positions. That is, each actuator may exert a variable force on the corresponding pin or pins, such that each pin can take a plurality of different heights or offsets relative to the default 'rest' position. The variable height of each pin may be used to generate a tactile representation of different heights of objects or topography, for example.
In some examples, the pins may be arranged in a planar matrix. An example of this is shown in Figure 1, where the touchpad is shown as comprising a 6 (columns) x 3 (rows) array of pins. It will be appreciated that this is just an example and that in other examples, a different number of pins, with closer spacing, may be provided, such as for example a 4x10 or 5x14 pin array or similar (corresponding approximately to the number of keys on a QWERTY keyboard, or equivalent typing keyboard layout). In the present disclosure, pins are primarily described as being in an active state based on a displacement of the pin. It will be appreciated that other methods for increasing the relative height or length of the pins may be used in accordance with the present disclosure.
The displacement and / or size of the pins may be controlled using actuators. Each pin or subset of pins may be associated with a corresponding actuator, with the actuator being operable to control a displacement of the pin (or group of pins) associated with that actuator. The actuators may be used to move the pins in the vertical direction, which may involve moving the whole of the pin, or an extendable portion of the pin. In the latter case, each pin may consist of an inner portion and an outer portion, with the inner portion being movable relative to the outer portion. The inner portion may be moved up and out of the outer portion so as to extend the apparent length of the pin. This may involve using the actuators to push the inner portion out of the outer portion, for example. In any case, the actuators may comprise MEMS devices to which one or more pins, or portions thereof, are attached. The actuators may comprise linear actuators, for example. The actuators may be provided in the lower layer 105, arranged below the corresponding touch elements 103.
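As a minimal illustrative sketch, the following shows how a controller might drive per-pin actuators to render a grid of target displacements. The PinActuator class and its normalised 0 to 1 travel range are assumptions made for illustration; in practice this layer would wrap the MEMS or linear-actuator drivers described above.

```python
class PinActuator:
    """Hypothetical wrapper around one MEMS / linear actuator."""

    def __init__(self) -> None:
        self.displacement = 0.0  # 0.0 = rest, 1.0 = fully extended

    def set_displacement(self, value: float) -> None:
        # Clamp the requested value to the actuator's travel range.
        self.displacement = max(0.0, min(1.0, value))

class TouchpadController:
    """Drives a rows x cols grid of pin actuators."""

    def __init__(self, rows: int, cols: int) -> None:
        self.rows, self.cols = rows, cols
        self.grid = [[PinActuator() for _ in range(cols)] for _ in range(rows)]

    def render(self, heights) -> None:
        # heights is a rows x cols grid of normalised target pin heights.
        for r, row in enumerate(heights):
            for c, h in enumerate(row):
                self.grid[r][c].set_displacement(h)

# e.g. the 6 x 3 array of Figure 1, with two pins raised:
pad = TouchpadController(rows=3, cols=6)
pad.render([[0.0] * 6,
            [0.5, 0.0, 0.0, 1.0, 0.0, 0.0],
            [0.0] * 6])
```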
In some embodiments, the touchpad 104 may further comprise an upper layer 106 (dashed) for receiving the tactile output generated by the touch elements 103. The upper layer 106 may comprise e.g. a flexible synthetic material, such as rubber, that is located above the plurality of pins. The flexible layer may be relatively thin so as to allow the user to feel deformations created by pins extending into the flexible layer. In this case, typically the pins will have rounded ends to mitigate wear or tearing due to corners, or pinching between pins. In alternative examples, the upper layer 106 may comprise a plurality of holes, one for each pin, through which each pin can be raised and lowered. In other examples, there may be no upper layer 106; hence the upper layer 106 is shown dashed in Figure 1.

In Figure 1, five pins are shown as being in an 'active' position, corresponding to the relative locations of five objects 107 in a virtual environment viewable at a display 108. In Figure 1, some of the objects are shown with different shading, to indicate that they represent different objects.
The tactile output device 100 also comprises a controller 102 operable to receive an input from the communication interface 101, and in response to said input, control the tactile output of at least some of the touch elements 103 so as to generate a tactile representation of the virtual environment. In Figure 1, the controller 102 is shown as receiving this input from the communication interface 101. Each touch element may correspond to a different respective location in the virtual environment, as will be described below. The tactile representation of the virtual environment may correspond to a representation of the physical features of the virtual environment (i.e. topography) and / or one or more objects present in the virtual environment. The input received at the controller may include information indicating which of the pins are to be activated, and optionally a relative height of those pins.
The controller 102 is configured to control the touch elements 103 so as to represent at least one of the topography of the virtual environment and the presence of any objects located in the virtual environment. This may involve, for example, controlling a size and / or displacement of one or more touch elements 103 based on the received video game state information. The controller 102 may be configured to assign each pin (or a subset of pins) to a respective location in the virtual environment, such that each pin can be used to represent the topography and / or any objects at that location. Alternatively, the video game state information may include a mapping between virtual environment locations and corresponding pins, and an indication of which pins are to be activated. In such cases, the controller may be configured to simply render the tactile information received from the video game machine.
It will be appreciated that whilst the controller and communication interface may be integral to the tactile output device, or part of an intermediate control box, optionally they are provided by a host entertainment device or similar as a driver or drivers of the tactile output device, which may then receive control instructions for example via a USB or Bluetooth® link.
The tactile output device 100 may be configured to operate in a number of different render modes, with the controller 102 controlling the touch elements 103 differently depending on the mode of operation. A user may cycle between the different render modes using one or more physical buttons located on the device. For example, there may be a 'forward' and 'backward' button for selecting each render mode. The render modes may include a 'terrain mode', wherein the physical features of the virtual environment are represented by the touch elements 103, and / or an 'object mode' wherein the locations (and any attributes) of interactive objects in the virtual environment are represented by the touch elements 103. The device may also operate in a number of different 'view' modes, as will be described below.
Top-down view - Terrain Mode
In some embodiments, the tactile representation of the virtual environment may correspond to a top-down view of the virtual environment. This may correspond to a so-called 'top-down' mode. In such a mode, the horizontal plane (x-y) may correspond to a map view of the virtual environment, with the height of various features of the virtual environment being defined by z-coordinates.
In the 'top-down' mode, the controller may be configured to control a displacement of at least some of the pins so as to correspond to a top-down view of the topography of the virtual environment. This may involve, for example, controlling the displacement of the pins so as to represent the height of various physical features of the virtual environment. The physical features may include e.g. hills, mountains, caves, trees, rivers, lakes, walls, buildings, furniture, etc. The top-down view may provide players with a quick way of seeing where their character is and what is around them.
As mentioned previously, the physical features of the virtual environment may be received as topography information, which may include one or more 3D meshes defining the physical features of the virtual environment. The height of the physical features may be determined from the z-values of the mesh defining those features. Each touch element of the tactile output device may correspond to a different region of the horizontal plane (x-y) defining the map view (with this mapping being known by the controller 102). The controller 102 may be configured to determine for a given region in the x-y plane of the video game, a corresponding z-value that is to be represented by the corresponding touch element. This may involve determining, for each touch element, a maximum or e.g. average of the z-values of the one or more meshes defining the corresponding region of the virtual environment. In some examples, the video game machine may be configured to perform this calculation, and to transmit the relevant information to the tactile output device.
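The following is a minimal sketch of the per-touch-element height calculation described above, assuming the topography arrives as loose mesh vertices and that each touch element covers one cell of the x-y map view. Whether the maximum or the average z-value is used per cell is a design choice, as noted; the grid dimensions and vertex format are illustrative assumptions.

```python
def terrain_heights(vertices, bounds, rows, cols, mode="max"):
    (x0, y0, x1, y1) = bounds  # x-y extent of the map view around the player
    # Bucket each mesh vertex into the cell covered by one touch element.
    cells = [[[] for _ in range(cols)] for _ in range(rows)]
    for (x, y, z) in vertices:  # mesh vertices as (x, y, z) tuples
        if x0 <= x < x1 and y0 <= y < y1:
            c = int((x - x0) / (x1 - x0) * cols)
            r = int((y - y0) / (y1 - y0) * rows)
            cells[r][c].append(z)
    # Reduce each cell's z-values to a single height for its pin.
    out = [[0.0] * cols for _ in range(rows)]
    for r in range(rows):
        for c in range(cols):
            zs = cells[r][c]
            if zs:
                out[r][c] = max(zs) if mode == "max" else sum(zs) / len(zs)
    return out

# e.g. a 3 x 6 grid over a 60 x 30 unit area around the player:
grid = terrain_heights([(5.0, 5.0, 2.0), (40.0, 20.0, 7.5)],
                       bounds=(0.0, 0.0, 60.0, 30.0), rows=3, cols=6)
```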
It will be appreciated that, for tactile output devices 100 having a lower granularity of touch elements, the topography information may be of a relatively low resolution. In some examples, it may be that the 3D mesh representing the environment is divided into regions, and a bounding cube is fitted to each region. The height of the bounding cube for a given region may then be used to determine the relative height or offset of a pin that corresponds with that region.
In some examples, a player may simply wish to gauge which areas of the virtual environment are blocked or impassable. In such examples, the pins may be raised so as to indicate such areas, with the heights of the pins not necessarily corresponding to the heights of the physical features of the virtual environment. It may be, for example, that each blocked or impassable region of the virtual environment is indicated by a pin raised to a given height, with each pin being raised to substantially the same height. In such cases, parts of the terrain that are elevated but not impenetrable may be represented by a pin at 'rest' (i.e. flat). A region of the virtual environment may be identified as being blocked or impenetrable based on metadata associated with that region. Alternatively, or in addition, the tactile output device, or game machine, may be configured to determine whether a given region is impenetrable based on a geometry of the corresponding physical features located therein.
Figure 2 shows schematically an example of a map view 200 of a virtual environment. In Figure 2, the view is a sub-region of a map and may correspond to e.g. the part of the virtual environment that is within the immediate vicinity of a player. In Figure 2, the player 201 is shown at a central location on the map, with physical features in the form of a wall 202A and trees 203A positioned around the player. The map is shown as being flat in Figure 2, but it will be appreciated that in some cases, the ground may be elevated at different locations. The map-view shown in Figure 2 may correspond to a region of the map that is within a threshold distance of the player.
Figure 2 also shows schematically the touchpad 204 of the tactile output device and a tactile representation of the virtual environment that has been generated by the touch elements. In Figure 2 it can be seen that the relative height of the touch elements (e.g. pins) corresponds to the relative height of the physical features of the virtual environment. The touch elements 202B correspond to the wall 202A in the virtual environment; the touch elements 203B correspond to the trees 203A in the virtual environment. In Figure 2, it can be seen that the touch elements representing the wall have an increased height with respect to the touch elements representing the trees. This may be because the z-values associated with the wall have been identified as higher compared with the z-values of the trees. It will be appreciated, however, that optionally any non-navigable region of an environment may be given a minimum tactile height, regardless of its in-game height; hence, for example, a lava lake may be represented as a raised region despite being flat in-game.
In Figure 2, a central region 205 of the touchpad is also shown. This may correspond to a location representative of the player's location in the virtual environment. That is, the player's position may be static (on the touchpad), with the relative heights of the touch elements being updated as the player moves to new positions in the virtual environment. The central location 205 may correspond to a region in which one or more touch elements always remain in a 'rest' position. Alternatively or in addition, the central region 205 of the touchpad 204 may have a different material or texture at the upper surface of the touchpad, such that this region can clearly be distinguished from other regions on the touchpad. For example, the central region 205 of the touchpad may comprise a material of increased surface friction, relative to the upper surface of the other touch elements.
In alternative or additional examples, the two touch elements either side of the central region 205 may be maintained in a raised position. In these examples, a player may position their finger between the two raised touch elements.
In some examples, it may be that only the terrain in front of the player is represented by the touch elements and so the location of the player may not be represented at a central location 205 on the touchpad. In such examples, it may be that the location corresponding to the player is located towards the front end of the touchpad (i.e. the end closest to the user during use), which may itself be a central location. In some video games, such as e.g. racing games, platform games, etc. it may be that a player is required to travel in a given direction with limited ability to turn back and so it may not be useful to represent terrain that is behind the player.
In some embodiments, the position on the touchpad 204 that is to be used to represent the player position may be determined dynamically. For example, the video game state information may include an indication of the genre of video game, which the controller 102 may then use to determine where on the touchpad the player location is to be represented. In such examples, the region of the touchpad 204 corresponding to the player location may provide tactile feedback so as to indicate that the region corresponds to the player position. For example, a vibration may be generated (as described later) at a location on the touchpad corresponding to the player position, or two touch elements may be maintained in a raised position, with the lowered touch element located between them being used to indicate the player position.
In alternative or additional examples, the player location may be indicated at the touchpad 204 by one or more light sources arranged to project light onto the region of the touchpad 204 corresponding to the player location. These light sources may be located at the lower layer, for example. In some examples, each touch element may be associated with a corresponding light source that is arranged beneath that touch element. The location of the player may be indicated by causing the corresponding light source to output light. For example, the touch elements may be formed of a partially transparent material, such that the light output by each corresponding light source can be received and output by the corresponding touch element. The light sources may comprise LEDs, for example. The switching on and off of the LEDs may be under the control of the controller 102.
As will be appreciated, the presence of any objects (including other players) in the virtual environment may also be indicated in this way. In some examples, each touch element may be associated with a plurality of light sources, each light source being operable to emit light of a different colour (e.g. red, green or blue). In these examples, the video game state information may include colour information about the physical features of the environment and / or any objects located therein. The controller 102 may be configured to control the colour output by the light sources in accordance with the corresponding colour information indicated in the video game state information. This may further assist a user in identifying which parts of the touchpad 204 correspond with the parts of the virtual environment viewable at a display.
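By way of illustration, the sketch below shows how per-touch-element light sources might be driven from colour information in the video game state, and how the LED beneath the player's location might be lit. The LED interface (set_colour) is an assumption for illustration only.

```python
def update_leds(leds, colour_grid):
    # leds and colour_grid are parallel rows x cols structures; each
    # colour is an (r, g, b) tuple taken from the video game state.
    for row_leds, row_colours in zip(leds, colour_grid):
        for led, rgb in zip(row_leds, row_colours):
            led.set_colour(*rgb)

def highlight_player(leds, player_cell, colour=(255, 255, 255)):
    # Light the LED beneath the touch element corresponding to the
    # player's location, as described above.
    r, c = player_cell
    leds[r][c].set_colour(*colour)
```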
As mentioned previously, in some embodiments, the video game state information may include attribute information that defines, for one or more objects, a motion associated with that object. It will be appreciated that, in some cases, there may be parts of the virtual environment that are associated with motion, e.g. waterfalls, weather such as wind or rain, flowing rivers, etc. Hence, in some examples, the video game state information may provide an indication of motion associated with physical features of the virtual environment. The controller 102 may be configured to use this information to control a motion and / or vibration of the corresponding touch elements being used to represent those parts of the virtual environment.
Top-down view - Object Mode
In some embodiments, the tactile representation of the virtual environment may correspond to a tactile representation of one or more objects in the virtual environment. This may be the case where the video game state information received at the communication interface provides an indication of the locations of one or more objects in the virtual environment relative to the player. In such embodiments, the controller 102 may be configured to control the height of at least some of the pins so as to indicate the presence of one or more objects in the virtual environment relative to the player. As mentioned previously, these objects may correspond to objects that are within a threshold distance of the player. If only the locations of the objects are being represented at the tactile output device, then each pin may be raised to the same height. As will be explained herein, objects represented in the object mode are typically qualitatively different to objects (such as walls and trees) that represent features of the environment in the terrain mode.
Figure 3 shows schematically an example of such an embodiment. In Figure 3, a plurality of objects 302A, 303A and 304A are shown positioned around a player 301 in a virtual environment. The view shown in Figure 3 corresponds to the map-view described previously in relation to Figure 2, but with the plurality of objects 302A, 303A, 304A added. The corresponding tactile representation generated at the touchpad 305 is also shown.
In order to determine a mapping between object locations and touch elements, each object may be represented by a bounding cube that bounds the extremities of that object. The bounding cubes may be defined as part of the video game state information received at the communication interface 101, for example. Alternatively, the video game state information may include the meshes of the objects and the bounding cubes for each object may be calculated at the tactile output device 100. Alternatively, the video game machine may perform this calculation and transmit the bounding cube information to the tactile output device 100. The locations of the bounding cubes relative to the location of the player may be used by the controller 102 to assign one or more touch elements to each respective object. Similarly, the volume of the bounding cubes may be used to determine the number of touch elements that are to be used for representing the object.
In Figure 3, the locations of the touch elements in an 'active' state correspond to the locations of the objects in the virtual environment. It will be appreciated that, in order to represent the location of a given object in the virtual environment, only one touch element may be needed. However, in some examples, it may be useful to provide a tactile representation of the size of an object in two dimensions. An example of this is shown in Figure 3, where four touch elements 302B are shown as providing a tactile representation of object 302A. This is because object 302A occupies a larger area of the x-y plane compared with the other objects shown (objects 303A and 304A), and hence a larger number of touch elements are used to represent that object. The selection of touch elements may follow rules similar to those used in graphical rendering, where if a polygon intersects a pixel by more than a threshold amount, then that pixel is rendered as part of the polygon. Similarly, if an object (or its bounding box, if used) intersects a region of the game environment corresponding to a given touch element by more than a threshold amount, then that touch element is activated, optionally to an extent based upon the degree of intersection / overlap.
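A minimal sketch of this rendering-style activation rule follows, assuming axis-aligned bounding boxes in the x-y plane and a configurable overlap threshold; the 25% default threshold is an illustrative assumption.

```python
def overlap_fraction(cell, box):
    # cell and box are axis-aligned rectangles (x0, y0, x1, y1) in the x-y plane.
    ix = max(0.0, min(cell[2], box[2]) - max(cell[0], box[0]))
    iy = max(0.0, min(cell[3], box[3]) - max(cell[1], box[1]))
    cell_area = (cell[2] - cell[0]) * (cell[3] - cell[1])
    return (ix * iy) / cell_area

def activate_pins(cells, boxes, threshold=0.25):
    # A pin is activated when any object's bounding box covers more than
    # the threshold fraction of its cell, to an extent based on the overlap.
    heights = []
    for cell in cells:
        frac = max((overlap_fraction(cell, b) for b in boxes), default=0.0)
        heights.append(frac if frac >= threshold else 0.0)
    return heights

# e.g. one object straddling two unit cells:
print(activate_pins([(0, 0, 1, 1), (1, 0, 2, 1)], [(0.5, 0.0, 1.2, 1.0)]))
```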
The central region 306 shown in Figure 3 may correspond to the central region 205 described previously in relation to Figure 2.
In Figure 3, each touch element in the active state is shown as having the same height relative to the other touch elements in the active state. However, as mentioned previously, the video game state information may include attribute information defining a dimension of the one or more objects that are to be represented by the touch elements (e.g. in the form of a height map). In such embodiments, the displacement of the touch elements may be controlled in accordance with the height of the one or more objects that are to be represented. For example, in Figure 3, it may be that object 304A is taller than object 303A, and so the pin representing object 304A is extended further upwards relative to the pin representing object 303A.
As mentioned previously, in some embodiments, the video game state information may include attribute information that defines, for one or more objects, a motion associated with that object and / or an interactivity associated with the object. The motion information may include an indication of the speed and direction of a movement that is being performed by a given object. In such embodiments, the controller 102 may be configured to represent a motion of the object by controlling a displacement of one or more pins at corresponding locations on the touchpad.
In some examples, motion of the touch elements may be restricted to the vertical direction, for example where linear actuators are used. If an object in the virtual environment is moving up and down (e.g. an elevator) then the touch elements at the corresponding location may be moved up and down in a corresponding manner. However, if the motion is in the x- and / or y-directions, then the relative height of the corresponding pins may be adjusted in sequence in a corresponding direction. This may involve, for example, lowering the height of each pin in sequence relative to an initial height that is used to indicate the presence of the object at a given location. It will be appreciated that, in order to represent motion in the x- and y-directions, it may be necessary to use two or more pins, arranged in a corresponding direction, to represent that object and its motion.
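The following sketch illustrates sequencing pin heights to convey lateral motion, reusing the hypothetical TouchpadController from the earlier sketch; the frame timing and single-pin animation are assumptions made for illustration.

```python
import time

def animate_motion(pad, path, base_height=1.0, step_s=0.2):
    # path is an ordered list of (row, col) pin positions along the
    # object's direction of travel across the touchpad.
    for (r, c) in path:
        frame = [[0.0] * pad.cols for _ in range(pad.rows)]
        frame[r][c] = base_height  # raise only the pin at the object's current cell
        pad.render(frame)          # the previously raised pin returns to rest
        time.sleep(step_s)

# e.g. an object crossing the middle row from left to right:
# animate_motion(pad, [(1, 0), (1, 1), (1, 2), (1, 3)])
```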
The attribute information may also include an interactivity associated with the objects that are to be represented by the touch elements. An object may have an associated interactivity if a player can e.g. collect, hold, wear, damage, destroy, repair, etc. that object. To convey this interactivity, at least some of the pins may be associated with respective vibrators (not shown in Figure 1) for generating a vibration at the corresponding pins. Each vibrator (i.e. haptic transducer) may be configured to receive an input from the controller 102. The controller 102 may be configured to selectively control a vibration of at least one pin based on an interactivity associated with the object at the corresponding pin location. This may involve, for example, determining which of the pins correspond to the location of the interactive object, and generating a signal for transmitting to the corresponding vibrators.
Alternatively, or in addition, the interactivity associated with a given object may be conveyed by raising and lowering the corresponding pins in quick succession. In such a case, the tactile output device need not comprise additional vibrators; the actuators may provide the same or similar functionality. In some cases, some objects may be more important, or closer, than others, and the magnitude of the vibration may be greater so as to signify the importance or proximity of the object. In such examples, the attribute information may include a priority value associated with each object, where priority values are higher for objects associated with a game objective. The controller 102 may be configured to control the vibration of one or more pins based on the proximity and / or priority of the corresponding object.

It will be appreciated that, while Figure 3 shows the objects being represented separately from the physical features of the virtual environment, in some embodiments, this need not be the case. For example, the video game state information may include both topographic and object information, and the controller 102 may be configured to use this information to generate a tactile representation of the physical features of the environment and any objects located therein. In such examples, the controller 102 may be configured to generate a tactile output that is different for objects compared with the tactile output that is generated for physical features of the virtual environment. This may involve, for example, generating a vibration at each touch element that is being used to indicate the presence of an object (and not at the touch elements being used to represent the physical features of the virtual environment). In some examples, objects having an associated priority exceeding a threshold value may be rendered, with only those parts of the virtual environment defining a boundary also being rendered.
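As an illustration of the priority- and proximity-weighted vibration described above, the sketch below computes a normalised vibration magnitude per pin; the equal weighting of the two terms and the vibrator interface (set_amplitude) are assumptions.

```python
def vibration_magnitude(priority, distance, max_priority=10, max_distance=50.0):
    # Closer and higher-priority objects vibrate more strongly (0..1);
    # the equal weighting of the two terms is an assumption.
    p = min(priority, max_priority) / max_priority
    d = 1.0 - min(distance, max_distance) / max_distance
    return 0.5 * p + 0.5 * d

def drive_vibrators(vibrators, targets):
    # targets maps pin index -> (priority, distance); other pins stay off.
    for idx, (priority, distance) in targets.items():
        vibrators[idx].set_amplitude(vibration_magnitude(priority, distance))

# e.g. a high-priority objective 5 m away vibrates near full strength:
assert abs(vibration_magnitude(9, 5.0) - 0.9) < 1e-9
```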
In some embodiments, at least some of the pins may be associated with respective touch sensors. Each touch sensor may be configured to detect a touch input received at the one or more corresponding pins. The pins may be arranged such that at least some movement of the pin in the vertical direction is possible when the pins are in an active state. Each touch sensor may be located below the corresponding pin(s) and be configured to detect when a pin in an active state has been depressed below a threshold height.
In some examples, the pins may be operable to be depressed when in a rest state. That is, a user may provide a touch input to pins in a rest and / or active state.
In some examples, the touch sensors may be configured to detect a variable force imparted on the pin by the user. That is, the touch sensors may correspond to pressure sensors. In such examples, the pressure imparted on a given pin may be used to determine a corresponding touch input that was intended by the user. For example, a tap gesture may correspond to the selection of the corresponding object that the pin represents, whereas a hard push may correspond to a 'reset' operation.

In embodiments involving the use of touch sensors, the controller 102 may be configured to receive an input from at least one touch sensor, and in response to said input, generate a zoomed-in tactile representation of the location in the virtual environment that corresponds to the location at which the touch input was received. An example of this is shown in Figure 4, which shows a zoomed-out view 400 of the virtual environment shown in views 200, 300 described previously in relation to Figures 2 and 3. The views 200, 300 shown in Figures 2 and 3 correspond to a zoomed-in view of the view shown in Figure 4, with the tactile representation shown in Figures 2 and 3 corresponding to a zoomed-in tactile representation of the virtual environment. In Figure 4, it can be seen that, compared with Figures 2 and 3, a lake 402A, building 403A, and longer wall 404A are visible in the displayed view, and that these features are represented by corresponding touch elements 402B, 403B and 404B respectively.
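By way of illustration, the following sketch computes the new region of the virtual environment to request when a zoom-in input is received at a given cell of the touchpad; the 2x zoom factor and the bounds representation are assumptions.

```python
def zoom_in(bounds, touched_cell, rows, cols, factor=2.0):
    # bounds is the current (x0, y0, x1, y1) extent of the map view;
    # touched_cell is the (row, col) of the pin that received the input.
    (x0, y0, x1, y1) = bounds
    cell_w = (x1 - x0) / cols
    cell_h = (y1 - y0) / rows
    # Centre of the touched cell in virtual-environment coordinates.
    cx = x0 + (touched_cell[1] + 0.5) * cell_w
    cy = y0 + (touched_cell[0] + 0.5) * cell_h
    half_w = (x1 - x0) / (2 * factor)
    half_h = (y1 - y0) / (2 * factor)
    return (cx - half_w, cy - half_h, cx + half_w, cy + half_h)

# The returned bounds could then be used to request higher-granularity
# topography / object information from the video game machine.
new_bounds = zoom_in((0.0, 0.0, 60.0, 30.0), touched_cell=(1, 4), rows=3, cols=6)
```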
It will be appreciated that, in some examples, a user may wish to zoom in on different locations of the virtual environment that are not necessarily centred on the player position. For example, in a first zoomed out view, it may be apparent that a given object is located north east of the player. The player may thus tap on the one or more pins representing that object so as to generate a zoomed in representation of the virtual environment, centred on that object. In such examples, the position on the touchpad representative of the player may not necessarily correspond to the central location on the touchpad. For example, if the object is located towards the north east of the player, the touch element at the bottom left hand corner may be used to represent the player location relative to the zoomed in portion of the virtual environment. The player location may be conveyed to the user in any of the manners described previously.
In some examples, it may be that a zoomed-out view of the virtual environment is initially presented to the user, with the user being able to zoom in on various parts of, or objects within, the virtual environment by pressing down on the pins representing those parts or objects. In such examples, it may be that the video game state information corresponds to a zoomed-out version of the 3D map, with higher granularity information being provided to the tactile device in response to a user input at a location on the touchpad. For example, a touch input received at one or more pins may be relayed to a video games machine, which then supplies higher granularity topography and / or object information, corresponding to the region of the virtual environment that the player has selected to zoom in on. Alternatively or in addition, a user may control the magnification of the virtual environment displayed at their display via e.g. their games controller, and the video game machine may transmit the corresponding topography or object information to the tactile output device.
In some examples, the touch sensors may be configured to detect a variable force imparted on the touch elements. In such embodiments, the magnitude of the force imparted on a given touch element may be used to determine whether a zoom in or zoom out operation is intended by the user. In some cases, it may be that a 'tap' input corresponds to a zoom-in operation and a 'push' gesture may correspond to zooming out to a default view of the virtual environment.
In simpler examples, a physical button for performing a zoom out operation may be provided on the device. A pressing of the physical button may result in a reset operation, with the tactile output device generating a tactile representation of the virtual environment at a default magnification (e.g. 100%). In some examples, the physical button may comprise a scroll wheel that can be used for zooming in and out, e.g. by performing a scroll motion in a corresponding direction.
It will be appreciated that, as the magnification decreases (i.e. the player's view is zoomed out), there will likely be an increasing number of objects and physical features surrounding the user.
However, it will be further appreciated that representing all of the objects and physical features on the map may not be feasible or particularly meaningful for the user. Hence, in some examples, the controller 102 may be configured to determine which of the objects or physical features are largest, tallest and / or of a highest priority. The tactile output generated at the touchpad may then be controlled so as to only correspond to such objects or physical features. The size and / or priority associated with the physical features or objects may be determined from the corresponding video game state information received at the tactile output device, for example.
It will be noted that, as the magnification decreases, the threshold distance or radius of the virtual environment relative to the player that is to be represented by the tactile output device increases. However, the video game state information received at the tactile output device may be filtered so as to represent only the largest or most important features and objects of the virtual environment, as described above. This filtering may be performed by the video game machine, prior to the video game state information being transmitted to the tactile output device.
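For illustration, the size / priority filtering described above might be sketched as follows, assuming each object in the video game state information carries 'size' and 'priority' fields (field names are assumptions made for this sketch):

```python
# Minimal sketch: keep only the largest / highest-priority features when the
# zoomed-out region contains more objects than can usefully be represented.
def filter_for_display(objects, max_objects):
    """objects: iterable of dicts with 'size' and 'priority' keys taken from
    the video game state information; returns the subset to render."""
    ranked = sorted(objects, key=lambda o: (o["priority"], o["size"]), reverse=True)
    return ranked[:max_objects]

features = [
    {"name": "lake", "size": 40, "priority": 1},
    {"name": "wall", "size": 25, "priority": 2},
    {"name": "pebble", "size": 1, "priority": 0},
]
visible = filter_for_display(features, max_objects=2)  # wall and lake survive
```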
In embodiments where the tactile device is configured to generate a tactile representation of objects in the virtual environment, the touch input received at a given touch element may be interpreted by the device as an interaction with the corresponding object. An interaction may include e.g. the selection of the object for use. In these embodiments, the tactile output device may further comprise a user input unit configured to receive an input from at least one touch sensor, and in response to said input, identify at least one object that corresponds to the location at which the touch input was received.
The user input unit may be further configured to determine a type of interaction that the user input corresponds to. In some examples, it may be that the input corresponds to a default input such as e.g. pressing of the button for a PS4 game. Alternatively or in addition, it may be that only one interaction is possible with a given object, and so the user input may provide an indication of the object that the user has selected to the video games machine, which then determines what the corresponding input is. Any suitable grammar may be envisaged, such as a single tap for an inspection of the object (e.g. to receive an audio description) and a double tap to initiate an interaction. Alternatively or in addition, one or more dedicated buttons may be provided that act like shift keys, so that when pressed in conjunction with a pin, the nature of the input for that pin is changed.
Optionally, a user may interact with an object by touching a pin corresponding to the object in order to obtain an audio cue from that object; for environmental elements, some may have a noise associated with them, such as the waterfall mentioned previously. Where the game is generating a soundscape using a plurality of such sounds, optionally this may be suspended so that only the selected contributing element is played; this allows the user to explore the audio environment by selectively muting sounds other than those attributable to the selected object. Similarly for interactive objects, a user may touch a pin representing that object to hear an audio description of the object, or in the case of a character, hear their dialogue delivered, or redelivered; this may be of particular use when multiple in-game characters are talking and the user wants to attribute the dialogue to the correct character. Similarly, for in-game characters, a pin corresponding to a character may vibrate whenever that character is speaking.
The tactile device may further comprise an output unit operable to transmit an indication of the at least one identified object to a video game machine, so as to indicate that the object has been interacted with by the user. The video game machine may then update the display of the video game, in accordance with the interaction simulated by pressing of the one or more corresponding pins at the tactile output device.

As mentioned previously, each touch element may be associated with a corresponding touch sensor, and so a given touch element may be detected as having been interacted with, based on a touch input being received at the corresponding touch sensor. Any objects being represented at the touchpad will have been assigned to corresponding touch elements, and so the detection of a touch input at any of the corresponding touch sensors can be identified as corresponding to a user selection of that object. The user selection may correspond to a user selecting the object, e.g. to consume it or to find out more information about it.
In some embodiments, the tactile device may further comprise a physical button operable to control whether a user input received at one or more pins is interpreted as corresponding to a zoom operation or an object selection operation. This button may correspond to a toggle, where e.g. if the button is in a left position, touch inputs correspond to 'zoom', whereas if the button is in a right position, touch inputs correspond to an object selection. As noted above, there may also be a button for simultaneous use with a touch input, acting in a similar manner to a shift key. In these embodiments, the physical button is configured to provide an input to the controller 102, such that the controller 102 can generate an appropriate tactile representation of the virtual environment. If the button is toggled to correspond to an 'object selection' setting, the physical button may provide an input to the user input unit. The user input unit can then relay any object selections made by the user to one or more other devices, such as a video games machine.
In some examples, at least some of the pins may be associated with a touch sensor such as a button, or other depressible input, so that when pushed down upon, the button or input is activated. The button may sit under the pin actuator, for example. Similarly, the touch elements or pins may form keys, for example where each one has a surface area similar to that of a fingertip or keyboard key, or may be coupled to a shell mimicking such a keyboard key. The pins may then move the keys up or down to operate as touch / haptic elements (in any viewing mode), whilst a user may press on a key to create an input (optionally only when the pin/touch element/key is in the rest position). Where the array of pins/touch elements/keys is of a sufficient number (for example a 4x10 or 5x14 array), this may enable a QWERTY input or similar. Each pin/touch element/key may be embossed with a corresponding braille code indicative of its function. The tactile output device may then have a dual use as a tactile output device and potentially controller / interaction selector within a game, and as a braille keyboard, either within a game if appropriate, or for other uses. Hence in an example the touchpad comprises a plurality of touch elements arranged in an array corresponding to an array of keys on a typing keyboard, and these touch elements are associated with respective touch sensors, with each touch sensor being configured to detect a touch input received at the corresponding touch element; the communication interface is then operable in a keyboard mode to transmit signals identifying respective key presses on the touch elements, so that the device appears, to an external system such as the videogame device, to be operating as a keyboard.
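As an illustration of the keyboard mode described above, a sketch follows in which a 4 x 10 pin array is given a QWERTY-style layout and a press on a pin is reported as the corresponding key; the particular layout and the report format are assumptions for illustration only:

```python
# Minimal sketch: mapping pin positions in a 4x10 array onto keys, so that a
# press can be reported to the host as an ordinary key press.
LAYOUT = [
    list("1234567890"),
    list("qwertyuiop"),
    list("asdfghjkl;"),
    list("zxcvbnm,./"),
]

def key_for_pin(row: int, col: int) -> str:
    """Key character assigned to the pin at (row, col)."""
    return LAYOUT[row][col]

def report_key_press(row: int, col: int) -> dict:
    """Payload the device might send so it appears to be a keyboard."""
    return {"type": "key_press", "key": key_for_pin(row, col)}
```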
Forward-view

The tactile output device may provide a further mode corresponding to a 'forward-view' mode. In the forward-view mode, the controller 102 is configured to control the tactile output of at least some of the touch elements so as to generate a tactile representation of the locations of the objects in the virtual environment, relative to the player's current viewpoint of the virtual environment. The forward-view mode may also be used to generate a tactile representation of the physical features of the virtual environment. However, generally, it is expected that a user will be more interested in the objects that are within their field of view.
In the forward-view mode, the video game state information may comprise locations of one or more objects relative to the player's current viewpoint of the virtual environment. Generally, the video game state information may include the 3D coordinates of the objects (or representative bounding cubes) within a threshold radius of the player, and so the controller 102 may be configured to determine the object coordinates that are to be used when generating a tactile representation of the virtual environment. An example of a tactile representation of a plurality of objects in a player's field of view is shown in Figure 5. In Figure 5, the player 501 is shown as occupying a lower central region of the display 500, with two objects 502A, 503A being located to the right of the player. The corresponding tactile representation at the touchpad 504 is shown below the display area 500. As can be seen in Figure 5, the objects 502A, 503A are shown as being represented by corresponding touch elements 502B, 503B. It is worth noting that Figure 5 is a schematic diagram and that, although the outlines of the touch elements in a rest state are not shown, these may be visible in reality. It will be appreciated that representing objects in the x-z plane may be of limited utility. For example, a player may need to switch to the top-down view to gauge how far a given object is from the player.
Screen-mode

In some embodiments, each touch element may correspond to a region of screen-space. This may correspond to a so-called 'screen' mode. An example of the screen mode is shown in Figure 6, which shows a screen 600 divided into a plurality of sections 601A, 602A, 603A, and the corresponding touch elements 601B, 602B, 603B that have been assigned to those screen sections. If, for example, the screen size is 1920 x 1080 pixels, and the pin matrix is a small 5 x 4 matrix, each pin will represent a subsection of the screen corresponding to 384 x 270 pixels. More generally, the number of pixels forming the width (x) of a subsection that a given pin represents can be expressed as: screen width ÷ number of columns in the pin matrix, and the number of pixels forming the height (z) of a subsection that a pin represents can be expressed as: screen height ÷ number of rows in the pin matrix. Each screen section mapped to a corresponding pin may represent a bounding box. Although reference is made herein to a 'screen section', it will be appreciated that this is equivalent to a section of a video image that has been re-sized to fit the screen of a display device.
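The arithmetic above can be made concrete with a short sketch that returns the pixel bounding box of the screen section represented by a given pin, using the 1920 x 1080 screen and 5 x 4 pin matrix from the example:

```python
# Worked version of the arithmetic above: with a 1920x1080 screen and a
# 5x4 pin matrix (5 columns, 4 rows), each section is 384x270 pixels.
def screen_section(row, col, screen_w=1920, screen_h=1080, cols=5, rows=4):
    """Return (left, top, right, bottom) pixel bounds for pin (row, col)."""
    section_w = screen_w // cols   # 1920 / 5 = 384
    section_h = screen_h // rows   # 1080 / 4 = 270
    left, top = col * section_w, row * section_h
    return (left, top, left + section_w, top + section_h)

assert screen_section(0, 0) == (0, 0, 384, 270)
assert screen_section(3, 4) == (1536, 810, 1920, 1080)
```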
In Figure 6, each touch element is at approximately the same height relative to the other touch elements. In the screen mode, the relative heights or offsets of the pins may not correspond to the relative heights of different objects in the scene. However, generally, the touch elements will be spaced apart such that a user can provide a touch input at individual touch elements.
To facilitate the 'screen' mode, the tactile output device may comprise a mapping unit operable to obtain a mapping between touch element locations and respective sections of a video image of the virtual environment. The respective sections of the video image may correspond to the screen sections described above. The mapping may be determined as above, e.g. based on knowledge of the resolution of the display device. The controller 102 may be configured to receive an input from at least one touch sensor, corresponding to at least one touch element, and in response to the input received from the touch sensor, obtain object information for the objects that are within or primarily within the portion of the virtual environment that the selected screen section corresponds to. This object information may be received at the communication interface 101, for example.
In some examples, the tactile output device 100 may comprise an object identification unit operable to identify the objects within a selected screen section. In alternative examples, it may be that the object identification unit is located at the video game machine, and the tactile output device 100 simply provides an indication of the section of the screen that the user has selected, which the video game machine then uses to determine which of the objects are located within that screen section (in any of the manners described above). As will be appreciated, it may be preferable for the video game machine to perform this calculation, to reduce either the overhead on the tactile output device, and thus any potential delay in rendering the tactile output, or simply the complexity of the tactile output device. In any case, the object identification unit may be configured to determine a frustum (with respect to the position of a player's eyes or pose of a virtual camera) that a selected screen section corresponds to, and to determine the objects that are within the world space that is intersected by that frustum.
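A simplified sketch of such object identification follows; rather than a full frustum intersection test against bounding volumes, it projects each object's position into screen space and tests it against the selected section's pixel bounds, which is a simplifying assumption (as are the matrix and object structures):

```python
# Minimal sketch: identify objects whose projected positions fall inside a
# selected screen section. view and proj are assumed 4x4 view / projection
# matrices; a fuller implementation would intersect the section's frustum
# with object bounding boxes instead.
import numpy as np

def project(point_world, view, proj, screen_w, screen_h):
    """World-space point -> (x, y) pixel coordinates, or None if behind camera."""
    p = proj @ view @ np.append(point_world, 1.0)
    if p[3] <= 0:
        return None
    ndc = p[:2] / p[3]                               # normalised device coords, -1..1
    return ((ndc[0] + 1) * 0.5 * screen_w,
            (1 - (ndc[1] + 1) * 0.5) * screen_h)     # y flipped: origin top-left

def objects_in_section(objects, bounds, view, proj, screen_w, screen_h):
    """objects: dicts with 'name' and 'position' (3-vector); bounds: pixel box."""
    left, top, right, bottom = bounds
    hits = []
    for obj in objects:
        xy = project(obj["position"], view, proj, screen_w, screen_h)
        if xy and left <= xy[0] < right and top <= xy[1] < bottom:
            hits.append(obj["name"])
    return hits
```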
In alternative or additional examples, it may be that the image data corresponding to the selected screen section is input to a machine learning or computer vision algorithm, which then identifies objects visible in that screen section and generates a corresponding audio or text description. The user input received at the tactile device may be used to crop the video image according to the selected screen section. Once the objects within the screen section have been identified, text and / or audio information may be generated. However, as will be appreciated, the information obtained in this way may not relate to the function of the object. For example, it may be possible to identify an object as a 'star' in a video image, but not as providing a user with an extra life in the video game.
In the screen-mode, when a pin is pressed, a user may be provided with information about the one or more objects identified as being within the bounds of the corresponding screen section. The information may include a description of the object and its characteristics. For example, if the object corresponds to a star, the description may include 'Object: 'Star', collect to gain a life'. The object information may correspond to text and / or audio information that is to be output at an appropriate output device. For example, the audio information may be output at one or more speakers; the text information may be output at a display, or a braille device such as that described in application no. GB1814843.7. Alternatively or in addition, the tactile output device could temporarily switch to a braille mode in which it replicates braille patterns sequentially using the pin array to provide the information. It will be appreciated that in some examples, the video game machine may include one or more speakers, and / or a display for outputting the information pertaining to the object(s) selected by the user.
In some examples, the information about the one or more objects within a given screen section may be obtained from the video game playing device. For example, each object may have metadata associated therewith, defining what the object is and one or more corresponding characteristics. This metadata may be defined as part of the video game. For example, most video games include information about the different objects a player will encounter and what that object is or does (e.g. in a help menu). In such examples, the video game machine may be configured to cause the information to be output at an appropriate output device, in response to an input being received from the tactile device whilst operating in the screen mode.
In some examples, the order in which information about objects in a given screen section is output may correspond to a pre-determined order. For example, objects may be 'read' from left to right, top to bottom (as when reading a book in Western countries), or from right to left, top to bottom.
Alternatively, or in addition, the order in which the objects are 'read' may be determined based on their relative locations within the corresponding screen section. For example, if a majority of the objects are detected as being located to the left of the screen section, the objects may be 'read' left to right. In some examples, the order in which this information is provided may correspond to an order of distance and / or priority, with the closest and / or most important objects being 'read' first, relative to the objects within the corresponding screen section. The order in which the object information for a given screen section is provided to the user may be configurable by the user.
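The configurable reading orders described above might be sketched as follows, assuming each object record carries screen coordinates, a distance and a priority (fields assumed for this sketch):

```python
# Minimal sketch: ordering the objects in a screen section before their
# descriptions are read out, under a user-configurable mode.
def reading_order(objects, mode="left_to_right"):
    if mode == "left_to_right":          # top-to-bottom, then left-to-right
        return sorted(objects, key=lambda o: (o["y"], o["x"]))
    if mode == "right_to_left":          # top-to-bottom, then right-to-left
        return sorted(objects, key=lambda o: (o["y"], -o["x"]))
    if mode == "nearest_first":
        return sorted(objects, key=lambda o: o["distance"])
    if mode == "priority":
        return sorted(objects, key=lambda o: o["priority"], reverse=True)
    raise ValueError(f"unknown reading order: {mode}")
```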
As will be appreciated, an object need not be completely contained within a given screen section in order for that object to be identified. However, generally, this will depend on how the identification is performed; for example, a cropped image containing just part of an object may not be identifiable by a machine learning model.
Figure 7 shows schematically an example of a system 700 that includes the tactile output device 701 of the present disclosure. The tactile output device 701 may correspond to any of the tactile output devices described previously. In Figure 7, the system 700 includes a tactile output device 701 that is in communication with a video game machine 702. The tactile output device 701 comprises a touchpad 703 (as described previously). The tactile output device 701 may be in communication with the video game machine 702 via a wired or wireless connection (the latter may include e.g. WiFi, Bluetooth, etc.). In Figure 7, the video game machine 702 is shown as comprising a communication interface 704 for transmitting data to, and receiving data from, the tactile output device 701. Although not shown in Figure 7, the tactile output device 701 will also comprise a communication interface for receiving video game state information, as described previously. In Figure 7 the previously described controller is not shown, since this will generally be an internal component of the tactile output device 701.
It will be appreciated that, in some examples, the tactile output device 701 may be used in combination with a cloud gaming service. In such cases, the video game machine 702 may not necessarily correspond to a games console. The video game machine 702 may comprise e.g. a dongle, router or client device that receives video game data from a cloud gaming service. In cloud-gaming embodiments, the video game machine 702 may be configured to forward video game data received from the cloud gaming service to the tactile output device 701.
In Figure 7, the system 700 is also shown as comprising a display device 705. The display device 705 is configured to output image data (i.e. video) corresponding to the video game being played at the video game machine 702. The display device 705 may be connected to the video game machine 702 by e.g. an HDMI cable. In Figure 7, the tactile output device 701 is shown in a dormant state, i.e. there is no tactile representation of the image displayed at the display device being output by the tactile output device 701. This may correspond to the device being switched off, for example. It will be appreciated that, in examples where the video game is provided by a cloud gaming service, the display device 705 may correspond to the client device that receives the video game data for outputting at the tactile output device 701. It will be further appreciated that in some examples, the display device 705 may be integral to the video game machine 702. In Figure 7, the tactile output device 701 is not necessarily shown to scale.
In Figure 7, the system 700 is also shown as comprising a games controller 706 for providing player inputs to the video game machine 702. The games controller 706 may be connected to the video game machine 702 via a wired or wireless connection. In Figure 7, the games controller 706 is shown as being connected to the video game machine 702 at port 707. Although port 707 is shown as being different to communication interface 704, it will be appreciated that these ports may be the same and / or of the same type. For example, both the games controller 706 and tactile output device 701 may connect to the video game machine 702 via a USB port. In cloud-gaming embodiments, the games controller 706 may connect directly to e.g. a dongle or router, which is in communication with the tactile output device 701 and the games controller 706.
In Figure 7, the games controller 706 is shown as being provided with an additional braille device 708. The braille device 708 may be configured to receive text or symbol information from the video game machine 702 and to generate a tactile representation of the received text or symbol information. The braille device 708 may have a touchpad corresponding to that described previously, but operable to render braille as opposed to physical features of, and objects in, a virtual environment.
As described previously, the touchpad 703 of the tactile output device 701 may be used to select certain objects (object mode) or screen sections (screen-mode). This user selection may be received at the video game machine 702, which then transmits the corresponding text or symbol information to the braille pad. The braille pad may then render this information at its corresponding touchpad, such that the user can obtain more information about the objects they have selected or that are contained within a given screen section. An example of a braille device that may be used in this way is disclosed in GB1814843.7.
In Figure 7, the system 700 is also shown as comprising a speaker 709. As described previously, audio information may be output in response to a touch input received at a location on the touchpad 703 of the tactile output device 701. The touch input may correspond to a selection of, or interaction with, an object, or selection of a given section of the screen of the display device 705. The video game machine 702 may be configured to transmit the audio information to the speaker 709, in response to receiving an input from the tactile output device 701. The video game machine 702 may be in communication with the speaker 709 via a Bluetooth connection, for example. The speaker(s) 709 may include text-to-speech functionality. For example, the video game machine 702 may transmit text relating to the selected objects or screen section, which the speaker 709 may then convert to speech and output said speech. In some examples, the speaker 709 may be integral to the video game machine 702 and / or the tactile output device 701.
In Figure 7, the tactile output device 701 is also shown as comprising a physical button 710. The physical button may correspond to a toggleable switch that is used to control whether a touch input received at the touchpad is interpreted as a zoom or object interaction operation (as described above). In Figure 7, additional physical buttons 711 are also shown. These buttons 711 may be used to cycle between different modes of operation, for example. In Figure 7, a further physical button 712 is also shown. This button 712 may be used to perform a reset operation, such as zooming out to a default magnification when the toggleable switch 710 is engaged in the 'zoom' position.
In some examples, each mode may have a corresponding respective physical button (not shown).
The buttons may be provided with an indication of what they correspond to in the form of e.g. a label. Alternatively or in addition, the tactile output device 701 may be configured to output audio indicating what a currently selected mode corresponds to (or cause such audio to be output at e.g. speakers 709). In some examples, the tactile output device 701 may comprise one or more light sources, or a display, for providing a visual indication of a currently selected mode.
In some examples, the tactile output device 701 may also allow users to provide a directional input for traversing a virtual environment. For example, if a central location on the touchpad is used to represent a player's location, then a user may provide a directional input at this location by pressing or moving their finger in a corresponding direction. It may be, for example, that at least some (if not all) of the touch elements are associated with touch sensors operable to detect a directional input, and an indication of this directional input may be provided to the video game machine 702 via the wired or wireless connection. The player's location in the virtual environment may then be updated in accordance with the directional input, with the controller 102 reconfiguring the touch elements so as to represent the topography or objects relative to the player's new position.
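A minimal sketch of the directional input described above follows, assuming four compass directions and an illustrative message format for the connection to the video game machine:

```python
# Minimal sketch: a swipe detected at the central touch element is relayed
# to the video game machine as a move request; direction names and the
# message format are illustrative assumptions.
DIRECTIONS = {"north": (0, 1), "south": (0, -1), "east": (1, 0), "west": (-1, 0)}

def directional_message(direction: str) -> dict:
    """Payload the tactile output device might relay over its connection."""
    dx, dz = DIRECTIONS[direction]
    return {"type": "move", "dx": dx, "dz": dz}
```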
In Figure 7, the display device 705, braille device 708 and speaker 709 are all examples of output devices at which the above-described text and / or audio information may be output. Despite being shown as separate, external devices, any of these devices (or any combination of these devices) may be located at the tactile output device 701 or video game machine 702, or distributed across both. For example, the video game machine 702 may include a display 705 and the tactile output device 701 may include a speaker 709, and optionally, a further mode for rendering braille.
It will be appreciated that the controller 102 referred to in the present disclosure may correspond to one or more processors at the tactile output device that are suitably programmed to perform the above-described functions. It will be further appreciated that the shape and size of the tactile output device, as well as the arrangement of the touch elements, may differ from that shown in the Figures. The tactile resolution that the device is able to achieve will generally depend on the size of the tactile output device and the number of touch elements used to provide a tactile output. The size and number of touch elements will ultimately be the choice of a designer of the tactile output device. In some embodiments, the tactile output device will comprise at least 3 x 3 touch elements for generating a tactile output.
It will be further appreciated that in some embodiments, it may be desirable to perform any complex calculations at the video game machine and not at the tactile output device. For example, the CPU of a video game machine may be used to determine the bounding boxes for all objects in a scene, and these bounding boxes may be provided to the tactile output device pre-calculated, although this may impact game performance if calculated on a per-video-frame basis. The data sent from the video game to the tactile output device may be dependent on the mode of operation. For example, if the device is being operated in "top-down terrain mode", the raw mesh data may be sent for the terrain, whereas if the tactile device is operating in the "forward-view mode", the bounding boxes may be sent to the tactile output device instead.
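A sketch of such mode-dependent payload selection follows, with the payload structure assumed purely for illustration:

```python
# Minimal sketch: the video game machine sends raw terrain mesh data in
# top-down terrain mode, but only pre-computed bounding boxes in
# forward-view mode. Mode names and structures are assumptions.
def build_payload(mode, terrain_mesh=None, bounding_boxes=None):
    if mode == "top_down_terrain":
        return {"mode": mode, "terrain": terrain_mesh}    # heightfield / mesh data
    if mode == "forward_view":
        return {"mode": mode, "objects": bounding_boxes}  # per-object AABBs
    raise ValueError(f"unsupported mode: {mode}")
```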
It will also be appreciated that the modes described herein may be used in isolation or combination. For example, a user may operate the tactile output device in a 'terrain' and / or 'object' mode in any of the top-down or forward-view modes (the former modes may correspond to sub-modes of the latter). A user may switch between the terrain modes and object modes depending on whether they are interested in cosmetic features of the virtual environment (topography) or objects for interacting with.
The examples described herein are to be understood as illustrative examples of embodiments of the invention. Further embodiments and examples are envisaged. Any feature described in relation to any one example or embodiment may be used alone or in combination with other features. In addition, any feature described in relation to any one example or embodiment may also be used in combination with one or more features of any other of the examples or embodiments, or any combination of any other of the examples or embodiments. Furthermore, equivalents and modifications not described herein may also be employed within the scope of the invention, which is defined in the claims.

Claims (6)

1. A tactile output device comprising: a communication interface operable to receive video game state information indicating at least one of (i) a topography of a virtual environment and (ii) the locations of one or more objects in a virtual environment; a touchpad comprising a plurality of touch elements for generating a respective tactile output; and a controller operable to receive an input from the communication interface, and in response to said input, control the tactile output of at least some of the touch elements so as to generate a tactile representation of at least one of a topography of the virtual environment and the location of one or more objects in the virtual environment, each touch element corresponding to a different respective location in the virtual environment.
2. A tactile output device according to claim 1, comprising: a plurality of pins; a plurality of actuators, each actuator being operable to control a displacement of one or more respective pins; and wherein respective touch elements comprise a respective pin, the controller being configured to control the displacement of the pins based on the received video game state information.
3. A tactile output device according to claim 2, wherein the video game state information indicates a topography of the virtual environment within a threshold distance of the player; and wherein the controller is configured to control the height of at least some of the pins so as to correspond to the topography of the virtual environment within the threshold distance of the player.
4. A tactile output device according to claim 2 or claim 3, wherein the video game state information indicates the locations of one or more objects in a virtual environment relative to a player within a threshold distance of the player; and wherein the controller is configured to control the relative height of at least some of the pins so as to indicate the presence of one or more objects relative to the player, within the threshold distance of the player.
5. A tactile output device according to claim 4, wherein the video game state information defines one or more attributes associated with at least one object within the threshold distance of the player, the attributes corresponding to one or more of: i. a dimension of the at least one object; ii. a motion of the at least one object; iii. an interactivity associated with the at least one object; and wherein the controller is configured to control the displacement of at least some of the plurality of pins so as to represent the one or more attributes of the at least one object.
6. A tactile output device according to claim 5, wherein the video game state information defines an interactivity associated with an object within a threshold distance of the player; and wherein the controller is configured to control a vibration of at least one pin representing the location of the interactive object.

7. A tactile output device according to claim 6, comprising: a plurality of vibrators, each vibrator being in communication with one or more respective pins, the controller being configured to selectively transmit a signal to the vibrators; and wherein the controller is configured to control a vibration of at least one pin by transmitting a signal to the vibrator or vibrators in communication with the pin representing the location of the interactive object.

8. A tactile output device according to any preceding claim, wherein at least some of the pins are associated with respective touch sensors, each touch sensor being configured to detect a touch input received at the corresponding pin; and wherein the controller is configured to receive an input from at least one touch sensor, and in response to said input, generate a zoomed in tactile representation of the location in the virtual environment that corresponds to the location at which the touch input was received.

9. A tactile output device according to any preceding claim, wherein at least some of the pins are associated with respective touch sensors, each touch sensor being configured to detect a touch input received at the corresponding pin; wherein the tactile output device further comprises: a user input unit configured to receive an input from at least one touch sensor, and in response to said input, identify at least one object that corresponds to the location at which a touch input was received; and an output unit operable to transmit an indication of the at least one identified object to a video games machine so as to indicate that the identified object has been interacted with by the user.

10. A tactile output device according to claim 8 and claim 9, comprising a physical button operable to control whether a user input received at one or more pins is interpreted as corresponding to a first operation or a second operation; and wherein the controller is configured to control the tactile representation generated at the touchpad in dependence on an input received from the physical button.

11. A tactile output device according to any preceding claim, comprising: a mapping unit operable to obtain a mapping between touch element locations and respective sections of a video image of the virtual environment, wherein at least some of the touch elements are associated with respective touch sensors; wherein the controller is configured to receive an input from at least one touch sensor, and in response to said input, identify a corresponding section of the video image that has been selected by the user; and wherein the device further comprises an output unit operable to transmit an indication of the section of the video image that has been selected by the user to a video game machine.

12. A tactile output device according to any preceding claim, wherein the controller is configured to control the tactile output of at least some of the touch elements around a central location on the touchpad, the central location being representative of a player's location within the virtual environment; wherein the communication interface is operable to receive updated video game state information corresponding to an updated position of the player within the virtual environment; and wherein the controller is configured to control at least some of the touch elements so as to represent the virtual environment within a threshold distance of the player's updated position.

13. A tactile output device according to claim 12, comprising a touch sensor at the central location on the touchpad, the touch sensor being operable to receive a directional input; and an output unit operable to receive an input from the touch sensor at the central location on the touchpad, and in response thereto, transmit an indication of a directional input to a video games machine.

14. A system comprising: a tactile output device according to any of claims 1 to 13; a video game machine operable to communicate with the tactile output device, the video game machine being configured to execute a video game program; and an output device operable to communicate with at least one of the tactile output device and the video game machine.

15. A system according to claim 14, wherein the tactile output device comprises the tactile output device of claim 9, the video game machine being configured to receive an input from the output unit of the tactile output device, and in response to said input, obtain at least one of text and audio information associated with the object that the user has interacted with; and wherein the video game machine is configured to transmit the obtained text and / or audio information to the output device.

16. A system according to claim 14 or 15, comprising the tactile output device of claim 11, the video game machine being configured to receive an input from the output unit of the tactile output device, and in response to said input, identify one or more objects within the corresponding section of video image that has been selected by the user; the video game machine being further configured to obtain at least one of text and audio information associated with the identified objects and transmit the obtained text and / or audio information to the output device.

17. A system according to claim 16, wherein the output device is configured to output the obtained text and / or audio information received from the video game machine.

18. A system according to any of claims 14 to 17, wherein the output device is operable to render a tactile representation of text information received at the output device from the video game machine; and wherein the video game machine is operable to receive an input from the tactile output device and in response to said input, transmit text information to the output device.

19. A system according to any preceding claim, in which the touchpad comprises a plurality of touch elements arranged in an array corresponding to an array of keys on a typing keyboard; and these touch elements are associated with respective touch sensors, each touch sensor being configured to detect a touch input received at the corresponding touch element; and the communication interface is operable in a keyboard mode to transmit signals identifying respective key presses on the touch elements.
GB1913794.2A 2019-09-25 2019-09-25 Tactile output device and system Pending GB2587368A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
GB1913794.2A GB2587368A (en) 2019-09-25 2019-09-25 Tactile output device and system

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
GB1913794.2A GB2587368A (en) 2019-09-25 2019-09-25 Tactile output device and system

Publications (2)

Publication Number Publication Date
GB201913794D0 GB201913794D0 (en) 2019-11-06
GB2587368A true GB2587368A (en) 2021-03-31

Family

ID=68425625

Family Applications (1)

Application Number Title Priority Date Filing Date
GB1913794.2A Pending GB2587368A (en) 2019-09-25 2019-09-25 Tactile output device and system

Country Status (1)

Country Link
GB (1) GB2587368A (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111870947A (en) * 2020-08-10 2020-11-03 网易(杭州)网络有限公司 Game interaction method and device, electronic equipment and storage medium

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20110234502A1 (en) * 2010-03-25 2011-09-29 Yun Tiffany Physically reconfigurable input and output systems and methods
US20160189427A1 (en) * 2014-12-31 2016-06-30 Immersion Corporation Systems and methods for generating haptically enhanced objects for augmented and virtual reality applications
US20160202761A1 (en) * 2015-01-12 2016-07-14 International Business Machines Corporation Microfluidics Three-Dimensional Touch Screen Display

Also Published As

Publication number Publication date
GB201913794D0 (en) 2019-11-06

Similar Documents

Publication Publication Date Title
US11221730B2 (en) Input device for VR/AR applications
KR102104463B1 (en) Systems and methods for multi-pressure interaction on touch-sensitive surfaces
US8146018B2 (en) Gesture-based control of multiple game characters and other animated objects
US8172681B2 (en) Storage medium having stored therein game program and game device
JP3637031B2 (en) GAME DEVICE AND GAME PROGRAM
EP1808210B1 (en) Storage medium having game program stored thereon and game apparatus
JP4319156B2 (en) Information processing program and information processing apparatus
JP2003502699A (en) Haptic interface system for electronic data display system
JP2000504450A (en) Cursor control by user feedback mechanism
US8487749B2 (en) Tactile virtual world
US8292710B2 (en) Game program and game apparatus
US7695367B2 (en) Storage medium having game program stored thereon and game apparatus
US20060258444A1 (en) Storage medium having game program stored thereon and game apparatus
JP2018147002A (en) Image processing program, image processing system, image processing apparatus and image processing method
JP6581639B2 (en) Game program and game system
KR20140043522A (en) Apparatus and method for controlling of transparent both-sided display
KR20190122581A (en) Systems and methods for multi-user shared virtual and augmented reality-based haptics
KR102218967B1 (en) System and control method of 3d air touch display
GB2587368A (en) Tactile output device and system
JP2004195210A (en) Game sound control program, game sound control method, and game device
JP4326585B2 (en) Information processing program and information processing apparatus
CN202548819U (en) Information processing apparatus
KR102201678B1 (en) Systems and methods for integrating haptics overlay in augmented reality
JP2019185363A (en) Display control device, display control method, and program
JP6892262B2 (en) Game program, recording medium, game processing method