GB2610591A - Apparatus, systems and methods for haptics - Google Patents

Apparatus, systems and methods for haptics

Info

Publication number
GB2610591A
GB2610591A GB2112852.5A GB202112852A
Authority
GB
United Kingdom
Prior art keywords
audio
haptic
file
information
stream
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
GB2112852.5A
Other versions
GB202112852D0 (en)
Inventor
Schembri Danjeli
William Walker Andrew
Thomas Downey Richard
George Martin
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Sony Interactive Entertainment Inc
Original Assignee
Sony Interactive Entertainment Inc
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Sony Interactive Entertainment Inc filed Critical Sony Interactive Entertainment Inc
Priority to GB2112852.5A priority Critical patent/GB2610591A/en
Publication of GB202112852D0 publication Critical patent/GB202112852D0/en
Publication of GB2610591A publication Critical patent/GB2610591A/en
Pending legal-status Critical Current


Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/016Input arrangements with force or tactile feedback as computer generated output to the user
    • AHUMAN NECESSITIES
    • A63SPORTS; GAMES; AMUSEMENTS
    • A63FCARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/25Output arrangements for video game devices
    • A63F13/28Output arrangements for video game devices responding to control signals received from the game device for affecting ambient conditions, e.g. for vibrating players' seats, activating scent dispensers or affecting temperature or light
    • A63F13/285Generating tactile feedback signals via the game input device, e.g. force feedback
    • AHUMAN NECESSITIES
    • A63SPORTS; GAMES; AMUSEMENTS
    • A63FCARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/30Interconnection arrangements between game servers and game devices; Interconnection arrangements between game devices; Interconnection arrangements between game servers
    • A63F13/31Communication aspects specific to video games, e.g. between several handheld game devices at close range
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/16Sound input; Sound output
    • AHUMAN NECESSITIES
    • A63SPORTS; GAMES; AMUSEMENTS
    • A63FCARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F2300/00Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
    • A63F2300/80Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game specially adapted for executing a specific type of game
    • A63F2300/8082Virtual reality

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Multimedia (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Health & Medical Sciences (AREA)
  • Audiology, Speech & Language Pathology (AREA)
  • General Health & Medical Sciences (AREA)
  • Stereophonic System (AREA)

Abstract

An apparatus comprises receiving circuitry to receive a file or stream comprising a plurality of audio channels, in which at least one of the audio channels comprises haptic information, processing circuitry to generate a haptic interaction signal in dependence upon the haptic information, and a haptic interface comprising one or more actuators to provide a physical interaction with a user in response to the haptic interaction signal. Also disclosed is the method of receiving a file or stream comprising at least one audio channel, encoding haptic information in an audio file format, modifying the file or stream to designate an audio channel for haptic data, inserting the encoded haptic information into said channel and outputting the file or stream. The file or stream may comprise a sequence of audio frames each having a frame duration, wherein some of the audio frames comprise a first audio channel comprising audio information and a second audio channel comprising haptic information. The haptic information may be in Pulse Code Modulation (PCM) format, Waveform Audio File Format (WAV) or an Audio Interchange File Format (AIFF).

Description

APPARATUS, SYSTEMS AND METHODS FOR HAPTICS
Field of the Disclosure
The present disclosure relates to apparatus, systems and methods for providing haptic interactions.
Description of the Prior Art
The "background" description provided herein is for the purpose of generally presenting the context of the disclosure. Work of the presently named inventors, to the extent it is described in this background section, as well as aspects of the description which may not otherwise qualify as prior art at the time of filing, are neither expressly nor impliedly admitted as prior art against the present disclosure.
Haptic technology is included in a range of electronic devices to provide forces and vibrations that allow a user to interact with the device using the sense of touch. Examples of such electronic devices include smartphones, tablets, laptop trackpads, wearable devices, gaming devices and simulation devices. It has been proposed to provide so-called haptic feedback or interaction to a user, such as a user viewing a virtual world (for example via an HMD). Virtual reality and mixed reality systems may use one or more haptic interfaces to provide a more immersive experience by allowing users to interact with a virtual environment using their sense of touch to supplement traditional audio-visual sensory interaction. In this way, a user's interaction with a virtual environment, whether for playing a game or completing a simulation, can be enhanced by haptic interactions.
Whereas techniques for improving communication of video and audio information have been the focus of extensive research, solutions addressing communication of haptic information have received less attention. In one technique, haptic information for controlling a haptic interface is compressed by a dedicated haptic codec and communicated for decompression at a haptic interface. As haptic interfaces become more sophisticated with greater levels of control, the increasing data size of the haptic information means there is a growing need to communicate haptic information efficiently.
It is in the context of the above arrangements that the present disclosure arises.
Various aspects and features of the present disclosure are defined in the appended claims and within the text of the accompanying description. Example embodiments include at least a system, a method, a computer program and a machine-readable, non-transitory storage medium which stores such a computer program.
BRIEF DESCRIPTION OF THE DRAWINGS
A more complete appreciation of the disclosure and many of the attendant advantages thereof will be readily obtained as the same becomes better understood by reference to the following detailed description when considered in connection with the accompanying drawings, wherein: Figure 1 schematically illustrates a data processing apparatus for modifying a file or stream; Figure 2 schematically illustrates a user wearing a head-mountable display (HMD) connected to a games console; Figure 3 schematically illustrates a data processing apparatus for receiving a file or stream comprising a plurality of audio channels; Figure 4 schematically illustrates another data processing apparatus for receiving a file or stream comprising a plurality of audio channels; and Figures 5 and 6 schematically illustrate respective data processing methods.
DESCRIPTION OF THE EMBODIMENTS
Figure 1 schematically illustrates a data processing apparatus 100 for receiving and modifying a file or stream comprising at least one audio channel. In embodiments of the disclosure, the data processing apparatus 100 comprises: receiving circuitry 110 to receive a file or data stream comprising at least one audio channel; processing circuitry 120 to: encode haptic information in an audio file format; modify the file or data stream to include an additional audio channel; and insert the encoded haptic information in the additional audio channel; and output circuitry 130 to output the modified file or data stream. The receiving circuitry 110 can receive a file such as an audio file or a media file including audio and optionally video, in which the file comprises at least one audio channel including digital audio data in an audio file format which may be either a compressed audio file format or an uncompressed audio file format.
Similarly, the receiving circuitry 110 can receive a data stream comprising at least one audio channel including digital audio data in an audio file format which may be either a compressed audio file format or an uncompressed audio file format. The file or the data stream can be received by the receiving circuitry 110 via a wired (e.g. HDMI) or wireless communication (e.g. Bluetooth®). The receiving circuitry 110 is thus configured to receive a file (e.g. by downloading the file from another device) or to receive a data stream corresponding to either a streaming file (e.g. streaming media file) or a live stream such as a live video game streamed by a service such as Twitch®.
Examples of suitable uncompressed audio file formats include the Waveform Audio File Format (WAV or WAVE), the Audio Interchange File Format (AIFF) and AU. In some cases, the file or stream may comprise raw audio in the form of pulse code modulated (PCM) audio (e.g. a .raw file). Examples of suitable compressed audio file formats include MPEG-2 Advanced Audio Coding (AAC), MPEG-1 Audio Layer III or MPEG-2 Audio Layer II (MP3), AC-3 and Enhanced AC-3. In some cases, the file or stream received by the receiving circuitry 110 may be an MP4 (MPEG-4 Part 14) file including video data and/or audio data (in some examples, the file may be an audio-only MP4 file) in which the audio data is in a compatible audio file format such as AAC or MP3.
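As an illustration of how haptic samples can share an uncompressed container with audio, the following sketch (an assumption for illustration, not the patented implementation; the function names are hypothetical) packs 16-bit PCM audio into channel 0 and 16-bit PCM haptic data into channel 1 of a WAV file using Python's standard `wave` module:

```python
import io
import struct
import wave

def write_audio_plus_haptics_wav(audio_samples, haptic_samples, sample_rate=48000):
    """Pack 16-bit PCM audio (channel 0) and haptic data (channel 1)
    into a single two-channel WAV container, returned as bytes."""
    assert len(audio_samples) == len(haptic_samples)
    buf = io.BytesIO()
    with wave.open(buf, "wb") as wav:
        wav.setnchannels(2)   # channel 0: audio, channel 1: haptics
        wav.setsampwidth(2)   # 16-bit PCM
        wav.setframerate(sample_rate)
        frames = bytearray()
        for a, h in zip(audio_samples, haptic_samples):
            frames += struct.pack("<hh", a, h)  # interleave per frame
        wav.writeframes(bytes(frames))
    return buf.getvalue()

def read_haptic_channel(wav_bytes):
    """Recover the haptic channel (channel 1) from the two-channel WAV."""
    with wave.open(io.BytesIO(wav_bytes), "rb") as wav:
        raw = wav.readframes(wav.getnframes())
    samples = struct.unpack("<%dh" % (len(raw) // 2), raw)
    return list(samples[1::2])  # every second sample is the haptic channel
```

Because the haptic data rides in an ordinary audio channel, any WAV-capable reader can recover it with the same decode path used for the audio.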
The data processing apparatus 100 optionally comprises storage circuitry (not shown in Figure 1) for storing one or more audio files and/or one or more media files so that the receiving circuitry 110 obtains the file from the storage circuitry. Alternatively or in addition, the receiving circuitry 110 can receive the file or stream via a wired or wireless communication (e.g. Bluetooth® or WiFi®) from an external device.
The receiving circuitry 110 can be provided as part of a device such as a general purpose computing device or a games console (e.g. PS5®). In some cases, the receiving circuitry 110 may be provided as part of a server. For example, in the case of streaming an online video game, the receiving circuitry 110 can be provided as part of a server to receive a data stream from a broadcast service (e.g. Twitch®) and modify the data stream including video and audio content by supplementing the content with additional haptic information, so that the modified stream allows a haptic interface to output one or more haptic interactions for the streamed content.
The processing circuitry 120 is configured to encode haptic information in an audio file format and to insert the haptic information into the file or stream received by the receiving circuitry 110 by inserting this information into a respective audio channel. The processing circuitry 120 thus modifies the file or stream received by the receiving circuitry 110 so that the modified file/stream includes at least one additional audio channel comprising haptic information that is encoded in an audio file format. The haptic information may comprise information for specifying one or more actuator properties for an actuator of a haptic interface, such as frequency and/or amplitude to thereby allow an actuator to be controlled to provide a physical interaction with a user in dependence upon the haptic information.
Whilst modifying the stream to include at least one additional audio channel has the advantage of ensuring that the original audio channels are unaffected, in principle an existing audio channel (whether used or spare, i.e. containing no audio data) could be re-purposed and designated for the haptic data. Typically, in this case the choice of audio channel to re-purpose may follow a ranking, such as replacing the sub-woofer channel for preference, or the two rear channels. Optionally the audio channel selected for use as a haptic channel may change dynamically depending on recent audio history (e.g. selecting the quietest channel for repurposing). Hence the following description recites in detail the approach of adding one or more audio channels for haptic data, but encompasses the approach of re-purposing existing audio channels for haptic data as appropriate.
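The re-purposing strategy described above can be sketched as follows (a hedged illustration; the channel names, ranking and function are hypothetical rather than taken from the claims). Spare channels are preferred outright, then a fixed ranking such as sub-woofer first, with the quietest channel as the dynamic fallback:

```python
def pick_channel_to_repurpose(channels, ranking=("lfe", "rear_left", "rear_right")):
    """Choose an existing audio channel to re-purpose for haptic data.

    `channels` maps channel names to lists of recent PCM samples.
    Preference order: a spare channel containing no audio, then the
    fixed ranking (sub-woofer, then rear channels), then the quietest
    channel by mean-square energy over the recent audio history.
    """
    for name, samples in channels.items():
        if not any(samples):  # spare channel: contains no audio data
            return name
    for name in ranking:
        if name in channels:
            return name
    # Dynamic fallback: select the quietest channel for re-purposing.
    return min(channels,
               key=lambda n: sum(s * s for s in channels[n]) / len(channels[n]))
```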
In some examples, the data processing apparatus 100 may comprise a database storing a plurality of respective instances of haptic information corresponding to different haptic interactions (e.g. first haptic information may be stored corresponding to a first haptic interaction for a first type of in-game event and second haptic information may be stored corresponding to a second haptic interaction for a second type of in-game event). As such, one or more instances of haptic information can be selected from the database (e.g. by a developer) as appropriate for use by the processing circuitry 120 to be inserted into an audio channel of a file or data stream (this represents an example of offline content generation). Alternatively or in addition, the data processing apparatus 100 comprises a haptic information generator (not shown in Figure 1) configured to perform content analysis of audio and/or video content in a file or stream to select an instance of haptic information from the database based on one or more audio properties and/or one or more video properties. Techniques for obtaining haptic information corresponding to audio and/or video content are discussed in more detail later.
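A minimal sketch of such a database lookup follows; the event names, field names and values here are invented for illustration, not drawn from the disclosure:

```python
# Hypothetical haptic database: each entry describes one haptic interaction
# (an instance of haptic information) keyed by the type of in-game event.
HAPTIC_DATABASE = {
    "explosion": {"frequency_hz": 60,  "amplitude": 1.0, "duration_ms": 400},
    "gunshot":   {"frequency_hz": 180, "amplitude": 0.8, "duration_ms": 80},
}

def select_haptic_for_event(event_type, default=None):
    """Select the stored haptic instance matching a detected in-game event,
    e.g. one flagged by content analysis of the audio and/or video."""
    return HAPTIC_DATABASE.get(event_type, default)
```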
In some examples, the acquired file or stream comprises a first audio channel comprising encoded audio information and the processing circuitry 120 performs the modification to add at least one further audio channel comprising the encoded haptic information, in which the encoded audio information and the encoded haptic information are encoded in a same audio file format. For example, the received file may be an AAC file, and the processing circuitry 120 can be configured to encode the haptic information using the AAC coding format and insert the encoded haptic information into an additional audio channel for the file. It will be appreciated that any audio file format can be suitably used by the processing circuitry 120.
The output circuitry 130 is configured to output the modified file (updated file) or modified data stream (updated data stream) comprising the additional audio channel to an external device for use by the external device or another device that communicates with the external device. More generally, the modified file/stream is output for use by a haptic interface which can receive the file or data stream to control an actuator in dependence upon the haptic information to thereby provide a physical interaction with a user. Alternatively, the output circuitry 130 can output the modified file or stream to an intermediary device which communicates with a haptic interface.
Therefore, the data processing apparatus 100 acquires a file or stream and modifies the file or stream to include an additional audio channel including encoded haptic information so that the modified file/stream can be output. In some examples, the file/stream modification may be performed in substantially real-time using the haptic information generator to analyse audio and/or video in substantially real-time and detect an in-game event for which haptic information from a database can be selected and used to populate an audio channel.
By modifying the received data to include an additional audio channel and populating the additional audio channel with encoded haptic information, a single file or data stream can be used as a carrier for both audio information and haptic information. In this way, a recipient device that receives the file/stream can obtain both the audio information and the haptic information from the file/stream using the same processing operations thereby potentially removing the need for a dedicated haptic stream or haptic codec. For example, in the case where the haptic information and audio information are encoded in an audio file format such as AAC, then the recipient device can obtain the haptic information and the audio information from the received file/stream by performing the reverse of the coding operations performed by the processing circuitry 120 by using an AAC codec. It will be appreciated that depending on the audio coding format used for encoding the haptic information, any suitable audio codec can be used for decoding the haptic information.
The modified file/stream therefore allows haptic information to be communicated in a file or stream comprising audio and optionally video. The techniques of the present disclosure can allow an existing file or existing data stream including audio data to be adapted so as to allow haptic information to be provided in an additional audio channel so that the haptic information can be communicated together with the audio and optionally video.
In embodiments of the disclosure, the processing circuitry 120 is configured to modify the file or stream to include metadata indicative of the additional audio channel comprising the haptic information. For example, the file may comprise a header such that the header can be updated to include metadata. The metadata indicates the one or more audio channels that have been populated with the haptic information by the processing circuitry 120. For example, a flag may be included in the metadata for indicating an audio channel that comprises the haptic information. In the case where the processing circuitry 120 updates the file or stream to include a plurality of additional audio channels each comprising haptic information, the file or stream can be updated to include metadata indicating each of the plurality of audio channels comprising the haptic information. Consequently, a recipient device that receives the modified file/stream and comprises processing circuitry compatible with the audio file format (encoding format) of the haptic information can acquire the file/stream and distinguish, on the basis of the metadata, between the audio channel(s) including audio information for use by an audio element (e.g. a speaker) and the audio channel(s) including haptic information for use by a haptic interface. The use of metadata for distinguishing between the audio channels is discussed in more detail later.
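One way such a metadata flag could work, sketched under the assumption that the header is representable as a simple key/value mapping (the key name `haptic_channels` and both function names are hypothetical):

```python
def tag_haptic_channels(metadata, haptic_channel_indices):
    """Return updated file/stream metadata flagging which audio channels
    carry haptic information rather than audio."""
    updated = dict(metadata)
    updated["haptic_channels"] = sorted(set(haptic_channel_indices))
    return updated

def split_channels(metadata, channels):
    """Partition decoded channels into audio and haptic sets using the flag,
    as a recipient device would before routing data to speakers/actuators."""
    haptic = set(metadata.get("haptic_channels", []))
    audio_out = {i: c for i, c in enumerate(channels) if i not in haptic}
    haptic_out = {i: c for i, c in enumerate(channels) if i in haptic}
    return audio_out, haptic_out
```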
In some examples, the processing circuitry 120 is configured to modify the file/stream to include an additional audio channel for the haptic information by adding a predetermined audio channel. For example, the received file/stream may use the 2.0 audio channel format (stereo audio with two distinct audio channels, one audio channel for left and another for right). The processing circuitry 120 can be configured to update the received file/stream by adding an additional audio channel and inserting the encoded haptic information into the additional channel so as to obtain a 2.1 audio channel format in which the bass channel (also known as the subwoofer channel or low-frequency effects channel) comprises the encoded haptic information. Similarly, the received file/stream may use the 5.1 audio channel format for surround sound (five audio channels for speakers and one bass channel). In this case, the processing circuitry 120 can be configured to update the received file/stream by adding an additional audio channel to obtain the 6.1 audio channel format in which the 6th channel (the back surround channel) includes the encoded haptic information. In some examples, the received file/stream may use the 5.1 audio channel format and the processing circuitry 120 can be configured to update the received file/stream by adding two or more additional audio channels to obtain the 7.1 audio channel format (or an N.1 audio channel format where N is 7 or greater).
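Adding the predetermined channel amounts to widening each interleaved audio frame by one sample slot, e.g. 2.0 stereo frames becoming 2.1 frames whose new last slot carries the haptic sample. A sketch under the assumption that frames are represented as per-frame sample tuples:

```python
def add_haptic_channel(frames, haptic_samples):
    """Extend per-frame sample tuples (e.g. (left, right) for 2.0 stereo)
    with one extra channel carrying haptic samples, so the new last
    channel plays the role of the 2.1 layout's LFE/haptic slot."""
    assert len(frames) == len(haptic_samples)
    return [frame + (h,) for frame, h in zip(frames, haptic_samples)]
```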
Hence in a system in which the data processing apparatus 100 is provided as part of a games console and outputs the updated file/stream to a haptic interface (e.g. a handheld controller, such as the Sony® DualSense® controller, or an HMD, such as the PSVR®) the haptic interface can be programmed in advance so that for a given audio format that is received from the data processing apparatus 100, the haptic interface automatically identifies a predetermined audio channel that comprises the haptic information. For example, in the system as shown in Figure 2, in which the HMD 20 communicates with the games console 300 via the connection 82, 84 (note that whilst Figure 2 shows a wired connection using one or more wires 82, 84, a wireless connection may instead be used between the HMD 20 and the games console 300), the HMD 20 can be programmed so that when receiving a 2.1 audio channel format from the games console, the HMD 20 automatically identifies the low-frequency effects channel as including the haptic information. Similarly, the HMD 20 can be programmed so that when receiving a 7.1 audio channel format from the games console, the HMD 20 automatically identifies the 7th channel (or in some cases the 6th and 7th channels) as including the haptic information.
Hence more generally, a haptic interface (such as the HMD 20 or a handheld controller 330) may comprise storage circuitry to store a lookup table for indicating, for each of a plurality of predetermined audio channel formats, which of the audio channels comprise the haptic information.
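Such a lookup table might look like the following sketch, using one-based channel numbers matching the description above (the layout keys and table contents are illustrative assumptions, not the stored table of any actual device):

```python
# Hypothetical lookup table stored by a haptic interface: for each known
# audio channel format, which (one-based) channels carry haptic information.
HAPTIC_CHANNEL_TABLE = {
    "2.1": [3],  # the bass / low-frequency effects channel
    "6.1": [6],  # the back surround channel
    "7.1": [7],  # the 7th channel (some arrangements may also use the 6th)
}

def haptic_channels_for_format(layout):
    """Return the channel numbers to treat as haptic for a given layout;
    an unrecognised layout yields no haptic channels."""
    return HAPTIC_CHANNEL_TABLE.get(layout, [])
```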
Referring now to Figure 2, an example virtual reality system is shown in which a user is wearing an HMD 20 connected to a games console 300 (e.g. PS5®). The games console 300 is connected to a mains power supply 310 and (optionally) to a display screen 305. One or more cables 82, 84 link the HMD 20 to the games console 300 and may, for example, be plugged into a USB socket 320 on the console 300.
The video displays in the HMD 20 are arranged to display images generated by the games console 300, and the earpieces 60 in the HMD 20 are arranged to reproduce audio signals generated by the games console 300. Note that if a USB type cable is used, these signals will be in digital form when they reach the HMD 20, such that the HMD 20 comprises a digital to analogue converter (DAC) to convert at least the audio signals back into an analogue form for reproduction.
Images from a camera 122 mounted on the HMD 20 can be passed back to the games console 300 via the cable. Similarly, if motion or other sensors are provided at the HMD 20, signals from those sensors may be at least partially processed at the HMD 20 and/or may be at least partially processed at the games console 300. The use and processing of such signals will be described further below.
The USB connection from the games console 300 can also provide power to the HMD 20, according to the USB standard. In some cases the HMD 20 may comprise its own power source in the form of one or more batteries. Figure 2 also shows a separate display 305 such as a television or other openly viewable display (by which it is meant that viewers other than the HMD wearer may see images displayed by the display 305) and a camera 315, which may be (for example) directed towards the user (such as the HMD wearer) during operation of the apparatus. An example of a suitable camera is the PlayStation® Eye camera, although more generally a generic "webcam", connected to the console 300 by a wired (such as a USB) or wireless (such as WiFi® or Bluetooth®) connection, may be used.
The display 305 may be arranged (under the control of the games console) to provide the function of a so-called "social screen". It is noted that playing a computer game using an HMD can be very engaging for the wearer of the HMD but less so for other people in the vicinity (particularly if they are not themselves also wearing HMDs). To provide an improved experience for a group of users, where the number of HMDs in operation is fewer than the number of users, images can be displayed on a social screen. The images displayed on the social screen may be substantially similar to those displayed to the user wearing the HMD, so that viewers of the social screen see the virtual environment (or a subset, version or representation of it) as seen by the HMD wearer. In other examples, the social screen could display other material such as information relating to the HMD wearer's current progress through the ongoing computer game. For example, the HMD wearer could see the game environment from a first person viewpoint whereas the social screen could provide a third person view of activities and movement of the HMD wearer's avatar, or an overview of a larger portion of the virtual environment. In these examples, an image generator (for example, a part of the functionality of the games console) is configured to generate some of the virtual environment images for display by a display separate to the head mountable display.
In Figure 2 the user is shown holding a pair of handheld controllers 330 which may, for example, be Sony® Move® controllers which communicate wirelessly with the games console 300 and/or the HMD 20 to control (or to contribute to the control of) game operations relating to a currently executed game program. In this case each of the respective handheld controllers 330 can be considered a haptic interface. The two respective handheld controllers each comprise one or more actuators which may be controlled so that the user senses the same haptic interaction from both controllers, or the two respective handheld controllers may be controlled to provide different haptic interactions, differing in at least one of their timing, frequency and amplitude. Instead of the handheld controllers shown in Figure 2, the user may instead hold a single handheld controller with both hands, such as the Sony® DualShock®4 or the Sony® DualSense® controller.
Note that other haptic interfaces can be used providing one or more actuators. For example, a so-called haptics suit may be worn by the user and/or one or two so-called haptic gloves can be worn on the user's hands. Haptic shoes may include one or more actuators and one or more sensors. Or the user could stand on or hold a haptic interface device. The one or more actuators associated with each of these devices may have different respective frequency ranges and/or available amplitudes of vibration. In example arrangements to be discussed below, haptic interfaces may have different capabilities such as different frequency responses and/or different maximum amplitudes.
The data processing apparatus 100 may for example be provided as part of the games console 300 to output the file/stream including at least one audio channel comprising haptic information to the HMD 20 and/or one or both of the handheld controllers 330 via a wired or wireless communication (e.g. Bluetooth® or WiFi®). As such, the data processing apparatus 100 can output a compressed or uncompressed audio stream or media stream comprising at least one audio channel comprising haptic information in an encoded audio format (e.g. PCM format or AAC format). A recipient device, such as the HMD 20, can receive the file or stream and generate a haptic interaction signal in dependence upon the haptic information so that an actuator of the HMD 20 can be controlled responsive to the haptic interaction signal to provide a physical interaction with the user wearing the HMD 20. For example, the haptic information included in the audio channel is indicative of at least one of a frequency, maximum amplitude and minimum amplitude for controlling an actuator. By receiving the haptic information, the HMD 20 can generate a haptic interaction signal in dependence upon the haptic information for controlling the actuator to perform a physical movement with a controlled frequency and/or amplitude. In some examples, the haptic information may be indicative of a headset rumble or controller rumble, or more generally a haptic interface rumble, such that an actuator of a haptic interface that receives the haptic information included in the audio channel in the file/stream can be controlled to output a rumble that is sensed by the user. Moreover, the frequency, amplitude and duration of the rumble can each be specified by the haptic information.
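A haptic interaction signal of the kind described, specified by frequency, amplitude and duration, could be rendered into an actuator drive waveform roughly as follows (a sketch, not the patented signal generation; the function name and sample rate are assumptions):

```python
import math

def render_haptic_signal(frequency_hz, amplitude, duration_ms, sample_rate=48000):
    """Render a rumble description (frequency, amplitude, duration) into a
    PCM-style actuator drive signal: floats in [-amplitude, amplitude]."""
    n = int(sample_rate * duration_ms / 1000)
    return [amplitude * math.sin(2 * math.pi * frequency_hz * i / sample_rate)
            for i in range(n)]
```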
In some examples where the user wears the HMD 20 and holds a handheld controller 330, both the HMD 20 and the controller 330 may be caused to rumble simultaneously by controlling an actuator in the HMD 20 and an actuator in the controller 330, and as discussed in more detail below the properties of the two respective rumbles may be the same or may be different depending on the haptic information.
Figure 3 schematically illustrates a data processing apparatus 200 for receiving a file or data stream comprising a plurality of audio channels and generating a haptic interaction signal in dependence upon haptic information that is included in at least one of the audio channels in an audio file format. In embodiments of the disclosure, the data processing apparatus 200 comprises receiving circuitry 210 to receive a file or stream comprising a plurality of audio channels, in which at least one of the audio channels comprises haptic information; processing circuitry 220 to generate a haptic interaction signal in dependence upon the haptic information; and a haptic interface 230 comprising one or more actuators to provide a physical interaction with a user in response to the haptic interaction signal.
The acquired file or data stream may for example have been created by the apparatus described with respect to Figure 1, such that the receiving circuitry 210 can be configured to receive data output by the output circuitry 130, optionally via one or more intermediate devices.
In some examples, the acquired file may have been originally created by a developer to include a plurality of audio channels in which at least one audio channel comprises haptic information. For example, a game developer when creating a sound file or media file may include haptic information in a respective audio channel of the file so that the audio information and haptic information can be transported together in the same file. Such a file may be received by the receiving circuitry 210 using known file streaming techniques so that the receiving circuitry 210 receives the audio file or media file as a data stream. Hence, a developer can create sounds, such as speech for characters in a game, or sounds for in-game events, such as gunshots and/or explosions, and also create corresponding haptic information which can be stored in a respective audio channel for the file or select instances of haptic information from a database which complement these sounds. Alternatively, a developer may initially create sounds for a content such as a video game or a movie and create a sound file (or media file), and then subsequently modify the file at a later stage to include haptic information and the haptic information can be provided in a dedicated audio channel.
The receiving circuitry 210 receives a file or stream comprising an audio channel comprising haptic information and the manner in which the file or data stream is created is not particularly limited. The received file or stream may comprise any number of audio channels, in which at least one audio channel comprises haptic information in an audio file format.
In embodiments of the disclosure, the functionality of the data processing apparatus 200 is performed in a distributed manner. In the example system shown in Figure 2, the receiving circuitry 210 may be provided as part of the games console 300 and/or another device, such as the HMD 20 or the handheld controller 330. For example, the game console 300 may receive the file or stream and communicate the file/stream to the HMD 20 and/or controller 330 in an unchanged form such that the HMD 20 and/or controller 330 receives the file/stream, or the game console 300 may comprise the processing circuitry 220 to generate a haptic interaction signal in dependence upon the haptic information in the file/stream and communicate the haptic interaction signal to the HMD 20 and/or controller 330. As such, the processing circuitry 220 may be provided as part of the game console 300 and/or a device such as the HMD 20 or the handheld controller 330. The haptic interface 230 comprising at least one actuator is provided as part of a device such as the HMD 20 or the handheld controller 330.
Hence the locations of the receiving circuitry 210, the processing circuitry 220 and the haptic interface 230 are not particularly limited and may be provided at respective locations as part of one or more respective data processing devices.
Embodiments of the disclosure include a virtual reality system (such as the virtual reality system shown in Figure 2), comprising the receiving circuitry 210, the processing circuitry 220 and the haptic interface 230.
In some cases, the receiving circuitry 210, the processing circuitry 220 and the haptic interface 230 are each provided as part of a peripheral device for providing a haptic interaction with a user. For example, the receiving circuitry 210, the processing circuitry 220 and the haptic interface 230 can each be provided as part of the HMD 20 or the handheld controller 330 or a haptic glove or a smartphone. In this way, the file or stream can be received by the haptic device via a wired or wireless communication and processing can be performed at the haptic device to generate a haptic interaction signal in response to a received audio file / media file / audio stream / media stream and thereby provide a physical interaction with the user.
In some examples, the data processing apparatus 200 comprises storage circuitry (not shown in Figure 3) configured to store one or more audio files and/or one or more media files so that the receiving circuitry 210 obtains the file from the storage circuitry.
In conventional audio processing, audio information included in one or more audio channels of a file or a stream is processed to generate an audio signal for driving a transducer element of a speaker to thereby cause movement of the transducer element for outputting audio. The audio transducer thus physically moves in dependence upon the properties of the audio signal to cause sound waves to be emitted that can be perceived by a user. Similarly, the processing circuitry 220 can optionally be configured to generate an audio signal in dependence upon audio information included in one or more audio channels of the received file/stream so that the audio signal can be used for controlling a transducer element of an audio speaker, such as an audio speaker of an HMD or an audio speaker of a handheld controller, to thereby emit sound waves.
The processing circuitry 220 is configured to generate a haptic interaction signal for controlling an actuator of a haptic interface in dependence upon the haptic information included in at least one audio channel of the received file/stream. The processing circuitry 220 can generate a haptic interaction signal using the haptic information defined in an audio channel of the received file or stream, and the actuator of the haptic interface 230 can be controlled to physically move in dependence upon the haptic interaction signal to thereby cause physical movement (such as a vibration) that can be sensed by the user via the sense of touch. As explained previously, the processing circuitry 220 may comprise any suitable codec for decoding an audio file format to obtain the haptic information from the file/stream. In the case where the haptic information is encoded using the PCM format, then the processing circuitry 220 can use a Pulse Code Modulation (PCM) type codec to obtain the haptic information. Hence, one or more actuators configured to stimulate the user's sense of touch can be controlled in dependence upon the haptic interaction signal obtained on the basis of the haptic information in the received file/stream.
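As a minimal sketch of the decoding side, assuming the haptic channel carries a 16-bit PCM waveform at 48 kHz, approximate actuator parameters could be recovered from the samples themselves, e.g. amplitude from the peak sample and frequency from the zero-crossing rate. The function below is illustrative only and not a description of any particular codec.

```python
import math
import struct

def decode_haptic_pcm(pcm_bytes, sample_rate=48000):
    """Recover approximate (frequency, amplitude) actuator parameters from a
    PCM-encoded haptic channel: amplitude from the peak sample, frequency
    from the zero-crossing rate."""
    n = len(pcm_bytes) // 2
    samples = struct.unpack("<%dh" % n, pcm_bytes)
    amplitude = max(abs(s) for s in samples) / 32767.0
    crossings = sum(1 for a, b in zip(samples, samples[1:]) if (a < 0) != (b < 0))
    freq_hz = crossings * sample_rate / (2.0 * n)
    return freq_hz, amplitude

# a 0.5 s, 100 Hz sine at 0.8 amplitude round-trips approximately
tone = struct.pack("<24000h", *(int(0.8 * 32767 * math.sin(2 * math.pi * 100 * i / 48000))
                                for i in range(24000)))
freq, amp = decode_haptic_pcm(tone)
```

In practice a haptic driver might instead pass the decoded waveform directly to a DAC-driven actuator rather than re-estimating parameters.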
Typically, a device capable of emitting sounds via a speaker element comprises a digital to analogue converter (DAC) to convert at least one audio signal to an analogue form for reproduction. In some cases, a plurality of audio signals may be converted for reproduction by a plurality of speaker elements (e.g. left and right audio signals). Similarly, a DAC can be provided as part of a haptic interface for converting at least one haptic interaction signal into an analogue form for reproduction. For example, the haptic interface 230 may comprise one or more DAC-driven actuators.
The haptic interface comprises one or more actuators that are controlled to provide a physical movement sensed by a user when touching a portion of the haptic interface. Signals are generated for driving one or more motors according to the haptic information to thereby provide controlled physical movement.
Hence, the haptic interface 230 comprises one or more actuators which can be actuated responsive to the haptic interaction signal. The one or more actuators may comprise any suitable actuator technology for providing linear and/or rotary motion (e.g. electric linear motors, rotary electric motors). For example, an eccentric rotating mass vibration motor (ERM) or a linear resonant actuator (LRA) may be used for generating vibrations. In some cases, the haptic interface 230 comprises a plurality of actuators, in which each actuator is controlled responsive to a same haptic interaction signal. Alternatively, the haptic interface 230 comprises a plurality of actuators in which one or more first actuators are controlled responsive to a first haptic interaction signal and one or more second actuators are controlled responsive to a second haptic interaction signal different from the first haptic interaction signal. The plurality of actuators may each have the same capability or one or more of the actuators may differ in their capability. Actuator properties that may differ include a frequency range and a maximum amplitude.
The haptic interaction signal can thus be generated by the processing circuitry 220 to indicate at least one of a frequency and an amplitude for movement of the actuator. As explained previously, the haptic interaction signal can be generated in dependence upon the haptic information to cause one or more actuators to produce a force corresponding to a rumble, in which at least one of the frequency and the amplitude of the rumble is dependent upon the haptic interaction signal.
In embodiments of the disclosure, the haptic interface is included in one or more from the list consisting of: a handheld controller (e.g. a game controller or a mouse pointer); a head-mountable display; a smartphone; a data glove; a haptic vest; and a touchpad (such as a touchpad of a laptop).
In embodiments of the disclosure, the haptic information is included in an audio channel in an uncompressed audio file format. The haptic information can be digitally represented in an audio channel in an uncompressed format using the pulse code modulation (PCM) format. The WAV, AIFF and Broadcast Wave Format (BWF) file formats are examples of uncompressed audio file formats in which the haptic information (and optionally audio information) can be encoded based on the PCM format. The PCM format is widely used for uncompressed audio, and embodiments of the disclosure provide a file or stream comprising a plurality of audio channels, in which at least one of the audio channels comprises haptic information (e.g. for specifying an amplitude and/or frequency for an actuator of a haptic interface) that is encoded in the PCM format.
For example, the receiving circuitry 210 may receive a multichannel WAV file in which at least one channel comprises PCM encoded haptic information and at least one channel comprises PCM encoded audio information. The multichannel WAV file may for example be a 3 channel WAV file in which two audio channels include audio information encoded in PCM (left and right audio) and another audio channel includes haptic information encoded in PCM. As explained above, the bass channel of the 2.1 audio format may be used for carrying the PCM encoded haptic information in some examples. Whilst the above refers to a multichannel WAV file comprising a single audio channel holding the haptic information, it will be appreciated that haptic information can be encoded in the PCM format in any of the audio channels and in some cases in each of the audio channels.
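The multichannel arrangement described above can be sketched with Python's standard `wave` module: a 3-channel PCM WAV in which the first two channels carry left/right audio and the third carries a low-frequency haptic waveform. The tone parameters below are illustrative assumptions.

```python
import io
import math
import struct
import wave

SAMPLE_RATE = 48000
N = 4800  # 0.1 s of samples per channel

def tone(freq_hz, amplitude):
    # 16-bit PCM sine, used here for both audio and haptic channels
    return [int(amplitude * 32767 * math.sin(2 * math.pi * freq_hz * i / SAMPLE_RATE))
            for i in range(N)]

left = tone(440, 0.3)    # audio information, left channel
right = tone(440, 0.3)   # audio information, right channel
haptic = tone(60, 0.9)   # haptic information: low-frequency rumble

# interleave one sample per channel per frame: [L, R, H, L, R, H, ...]
interleaved = [s for frame in zip(left, right, haptic) for s in frame]

buf = io.BytesIO()
with wave.open(buf, "wb") as w:
    w.setnchannels(3)
    w.setsampwidth(2)            # 16-bit PCM
    w.setframerate(SAMPLE_RATE)
    w.writeframes(struct.pack("<%dh" % len(interleaved), *interleaved))

buf.seek(0)
with wave.open(buf, "rb") as r:
    channels, nframes = r.getnchannels(), r.getnframes()
```

A recipient only needs the channel designation (e.g. "third channel is haptic") to de-interleave and route the haptic waveform to an actuator.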
As such, the processing circuitry 220 may comprise a PCM codec for decoding the PCM format to obtain the haptic information from an audio channel and optionally audio information from another audio channel. Moreover, the processing circuitry 120 in Figure 1 can perform encoding operations to encode the haptic information in the PCM format and corresponding decoding operations can be performed by the processing circuitry 220 in Figure 3.
In embodiments of the disclosure, the haptic information is included in an audio channel in a compressed audio file format. Examples of a compressed audio format include Advanced Audio Coding (AAC), MPEG-1 Audio Layer III or MPEG-2 Audio Layer III (MP3), AC-3 and Enhanced AC-3. Other known compressed audio file formats may also be used and the type of compressed audio file format is not particularly limited. As such, the processing circuitry 220 may comprise any suitable audio codec for decoding a compressed audio file format. Similarly, the processing circuitry 120 may comprise any suitable audio codec for encoding a compressed audio file format. For example, the receiving circuitry 210 can receive an MPEG audio file (or MPEG media file) either by downloading the file or by streaming the file. The MPEG audio file comprises audio frames which can be streamed in sequence, in which an audio frame comprises at least one audio channel comprising haptic information in a compressed audio file format. In a similar manner, the receiving circuitry 210 can receive a live stream such as a live video game streamed by a remote server, in which the live stream comprises audio frames streamed in sequence, in which an audio frame comprises at least one channel comprising haptic information in a compressed audio file format. The received file or stream may comprise any number of audio channels.
In embodiments of the disclosure, the file or stream received by the receiving circuitry 210 comprises a sequence of audio frames each having a frame duration, and wherein at least some of the audio frames comprise a first audio channel comprising audio information and a second audio channel comprising the haptic information. For example, an audio frame comprising a 2.0 audio channel format may be received in which one audio channel comprises digitally encoded audio information (which may be in a compressed or uncompressed audio file format) and the other channel comprises digitally encoded haptic information (which may be in a compressed or uncompressed audio file format). An audio frame may comprise any number of audio channels. As such, the processing circuitry 220 can be configured to: generate an audio signal in dependence upon the audio information included in one or more audio channels of the received audio frame for controlling a transducer element of an audio speaker in dependence upon the audio signal, and generate a haptic interaction signal in dependence upon the haptic information included in one or more audio channels of the received audio frame for controlling an actuator of the haptic interface 230 in dependence upon the haptic interaction signal.
Hence more generally, data can be extracted from respective audio channels in a single audio frame to obtain a haptic interaction signal and an audio signal for controlling a haptic interface and an audio output unit, respectively. For example, in the case of the Sony® DualSense® handheld controller, which is an example of a haptic interface comprising a speaker, the speaker can be configured to output audio in dependence upon the audio information included in a first audio channel of the audio frame and the haptic interface can be configured to output a haptic interaction (e.g. a controller rumble) in dependence upon the haptic information included in a second audio channel of the audio frame.
In a sequence of successive audio frames in a data stream, each audio frame comprises at least one audio channel comprising audio information and at least some of the audio frames are multichannel audio frames comprising a first audio channel for audio information and a second audio channel for haptic information. For example, the successive audio frames may digitally represent an audio track comprising acoustic sounds to be output by a speaker element, and at least some of the successive audio frames comprise haptic information so that a haptic interaction can be provided in accordance with the audio track at certain points in the audio track.
The audio frames may each comprise a timestamp indicating a timing at which an actuator (and optionally a speaker) is to be controlled to output the haptic information (and optionally the audio) included in the audio frame. Therefore, the timing at which the audio and the haptic interaction can be provided to the user can be controlled on the basis of the timestamp information for an audio frame.
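The timestamped frame structure could be modelled as follows; the field names and the millisecond resolution are assumptions for illustration.

```python
from dataclasses import dataclass

@dataclass
class AudioFrame:
    timestamp_ms: int   # when this frame's audio/haptics should be output
    audio_pcm: bytes    # channel(s) carrying audio information
    haptic_pcm: bytes   # channel carrying haptic information

def schedule(frames):
    """Order received frames by timestamp so that the speaker and the
    actuator are driven at the timings the frames specify."""
    return sorted(frames, key=lambda f: f.timestamp_ms)

# frames may arrive out of order over a network; playback follows timestamps
received = [AudioFrame(40, b"", b""), AudioFrame(0, b"", b""), AudioFrame(20, b"", b"")]
ordered = schedule(received)
```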
Referring now to Figure 4, in embodiments of the disclosure, the data processing apparatus 200 comprises an audio output unit 240 to output audio in dependence upon an audio signal, wherein at least one of the plurality of audio channels of the file or stream comprises audio information and the processing circuitry 220 is configured to generate the audio signal in dependence upon the audio information. As explained above, the receiving circuitry 210 can receive a single audio file, media file, audio stream or media stream and obtain both an audio signal and a haptic interaction signal therefrom. The haptic interface 230 is controlled responsive to the haptic interaction signal and the audio output unit 240 is controlled responsive to the audio signal, and as such the audio signal and the haptic interaction signal can be synchronised with respect to each other to allow sounds and haptics to be output in a commensurate fashion so that the user's experience of the content is enhanced by the provision of appropriately timed haptics. The haptic interface 230 and audio output unit 240 may be provided as part of a haptic device (e.g. an HMD or a handheld controller) or in some cases the audio output unit 240 may be provided as part of another device such as a games console or a television, such as those shown in Figure 2 so that the user perceives the emitted sounds.
In embodiments of the disclosure, the file or stream received by the receiving circuitry 210 comprises one or more audio channels each comprising respective audio information and the at least one audio channel comprising the haptic information, and wherein the processing circuitry 220 is configured to: generate the haptic interaction signal in dependence upon the haptic information; update the file or stream to remove the at least one audio channel comprising the haptic information; and output the updated file or stream comprising the one or more audio channels each comprising respective audio information. The receiving circuitry 210 can receive a file/stream comprising N audio channels each comprising respective audio information and M audio channels each comprising respective haptic information (where N and M are both integers that are greater than or equal to 1 and which may or may not be the same for a given file/stream). The processing circuitry 220 can generate the haptic interaction signal in dependence upon the haptic information. The processing circuitry 220 may identify an audio channel comprising the haptic information on the basis of metadata (discussed in more detail later) or by extracting the haptic information from a predetermined audio channel (for example using a look-up table as described previously). The processing circuitry 220 can thus generate a haptic interaction signal using the received file/stream. As explained above with reference to Figure 4, the processing circuitry 220 can be configured to generate an audio signal in dependence upon the audio information in an audio channel such that the audio output unit 240 outputs audio in dependence upon the generated audio signal. 
Alternatively or in addition to generating an audio signal in dependence upon the audio information in an audio channel, the processing circuitry 220 can be configured to update the file or stream to remove the at least one audio channel comprising the haptic information; and output the updated file or stream comprising the one or more audio channels. For example, in the case where the data processing apparatus 200 does not include the audio output unit 240, the data processing apparatus 200 may output the updated file/stream without generating an audio signal. Hence, the processing circuitry 220 can receive the file/stream, generate the haptic interaction signal, optionally generate the audio signal, and update the file/stream to remove the one or more audio channels comprising haptic information so that the updated file/stream comprises the audio channels comprising the audio information, and the updated file/stream can be output to another device (e.g. a television comprising one or more speakers) so that the updated file/stream can be used to output audio at the another device.
For example, the received file/stream may have a 7.1 audio channel format, in which the 6th and 7th audio channels comprise haptic information. In this case, the processing circuitry 220 can generate the haptic interaction signal in dependence upon the haptic information and remove the 6th and 7th audio channels so as to output an updated file/stream having a 5.1 audio channel format in which each audio channel comprises audio information. A similar approach may be used for any such audio channel format; hence for example a received file/stream with a 2.1 audio channel format may have haptic information in the bass channel (the '.1' channel); this may be used to drive haptics whilst the remaining audio channels are output as conventional stereo. Hence more generally, the data processing apparatus 200 can be configured to: receive a file/stream comprising one or more audio channels each comprising audio information and one or more audio channels each comprising haptic information; update the file/stream by removing the audio channels comprising haptic information; and transmit the updated file/stream including just the audio channels comprising audio information to an external device. In this way, the data processing apparatus 200 extracts the haptic information and forwards just the audio information for use by another device.
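For interleaved 16-bit PCM, stripping the haptic channels before forwarding the audio could look like the sketch below. The channel indices and buffer layout are illustrative assumptions.

```python
import struct

def strip_haptic_channels(pcm_bytes, n_channels, haptic_channels):
    """Split an interleaved 16-bit PCM buffer into (audio_pcm, haptic_pcm),
    removing the designated haptic channels from the forwarded audio."""
    n = len(pcm_bytes) // 2
    samples = struct.unpack("<%dh" % n, pcm_bytes)
    keep = [c for c in range(n_channels) if c not in haptic_channels]
    audio, haptic = [], []
    for f in range(n // n_channels):
        frame = samples[f * n_channels:(f + 1) * n_channels]
        audio.extend(frame[c] for c in keep)
        haptic.extend(frame[c] for c in haptic_channels)
    return (struct.pack("<%dh" % len(audio), *audio),
            struct.pack("<%dh" % len(haptic), *haptic))

# two 8-channel (7.1-style) sample frames; channels 6 and 7 carry haptics
pcm = struct.pack("<16h", *range(16))
audio_pcm, haptic_pcm = strip_haptic_channels(pcm, 8, (6, 7))
audio_samples = struct.unpack("<12h", audio_pcm)
```

The haptic buffer drives the actuators locally while the six-channel audio buffer is forwarded unchanged to the external device.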
In embodiments of the disclosure, the received file or stream comprises a first audio channel comprising first haptic information and a second audio channel comprising second haptic information different from the first haptic information. For example, in the case of a 5.1 audio channel format, two or more of the audio channels may comprise respective haptic information such that first haptic information is included in one channel, second haptic information is included in another channel and at least some of the remaining channels include audio information. The first haptic information is indicative of one or more first properties and the second haptic information is indicative of one or more second properties so that a given actuator is controlled differently to provide a physical interaction having different properties when using a haptic interaction signal obtained on the basis of the first haptic information compared to when using a haptic interaction signal obtained on the basis of the second haptic information. For example, the first haptic information may be indicative of at least one of a first frequency and first amplitude, and the second haptic information may be indicative of at least one of a second frequency and second amplitude for an actuator. Alternatively or in addition, the first and second haptic information may specify different durations for a haptic interaction. The haptic information included in the audio channel can be generated to provide a haptic interaction that complements a user's experience when watching a film or playing a video game or listening to music. The haptic information can be generated for inclusion in the audio channel in a number of different ways. In some examples, a media content creator or more specifically a game developer can manually define the properties of the haptic information to accompany the audio. 
For example, for a scene in a video game having an accompanying soundtrack, a developer can define haptic information to provide a haptic interaction corresponding to: a sound of a gun fire; a sound of an explosion; a sound of a wrench striking a given surface; a sound of a car crashing; and a sound of raindrops. In some examples, a database of instances of haptic information may be used by a developer so as to select an instance of haptic information from the database that is appropriate for the audio. For example, predefined instances of haptic information may be stored in a database with corresponding tags indicating a type of sound that the haptic information is suitable for accompanying and instances of haptic information can be selected from the database as needed. The audio information representing one or more sounds can thus be provided in one audio channel and the haptic information representing one or more haptic interactions corresponding to the one or more sounds can be provided in another audio channel such that the audio and haptics can be synchronised.
Rather than a developer manually defining or selecting the haptic information for an audio or a media content, a haptic information generator can perform content analysis of audio and/or video content to generate haptic information for a content. Video and or audio for a content can be analysed to detect in-game events and a detected in-game event can be mapped to a corresponding instance of haptic information in a database so as to automatically select an instance of haptic information that is to accompany the audio and/or video by inserting the selected instance of haptic information into an additional audio channel. In some examples, audio information included in a given audio frame and/or video information included in a corresponding video frame (video frame and audio frame having corresponding timestamps) can be analysed so as to detect an in-game event on the basis of properties of the audio data and/or video data, and an instance of haptic information can be selected for inclusion in an additional audio channel of the audio frame on the basis of the information in the audio frame and/or video frame. For example, the haptic information generator can analyse an audio frame to detect an in-game event such as an explosion event using known techniques for detection and classification of audio events, and select an instance of haptic information having associated tag information indicative of an explosion event.
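The tag-based lookup described above can be sketched as a simple mapping from a detected event tag to a stored haptic instance; the tags and parameter values below are purely illustrative, not contents of any actual database.

```python
# hypothetical database of predefined haptic instances keyed by event tag
HAPTIC_DB = {
    "explosion": {"freq_hz": 40,  "amplitude": 1.0, "duration_s": 0.6},
    "gunshot":   {"freq_hz": 120, "amplitude": 0.8, "duration_s": 0.1},
    "raindrops": {"freq_hz": 200, "amplitude": 0.2, "duration_s": 2.0},
}

def select_haptic_for_event(event_tag):
    """Map a detected in-game event to the stored haptic instance whose tag
    matches, or None when no suitable instance exists."""
    return HAPTIC_DB.get(event_tag)
```

A haptic information generator would call this after classifying an audio/video frame, then encode the selected instance into the additional audio channel of the corresponding audio frame.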
In embodiments of the disclosure, the haptic interface 230 comprises a first actuator to provide a physical interaction with the user in response to a first haptic interaction signal generated in dependence upon the first haptic information and a second actuator to provide a physical interaction with the user in response to a second haptic interaction signal generated in dependence upon the second haptic information. The haptic interface 230 can comprise any number of actuators that can be controlled responsive to the haptic interaction signal generated by the processing circuitry 220. In some cases, the haptic interface may comprise one or more first actuators that are responsive to the first haptic interaction signal and one or more second actuators that are responsive to the second haptic interaction signal so that a first actuator is controlled independently of a second actuator to provide a different haptic interaction. The first haptic interaction signal may differ from the second haptic interaction signal by frequency and/or amplitude. The actuators can be provided at different respective locations within a haptic device to provide a range of possible haptic interactions. For example, a plurality of actuators can be distributed within the body of a handheld controller to provide various haptic interactions, such as the falling of raindrops or the crunching of sand when a virtual character walks over sand in a virtual environment.
For a file or stream comprising a first audio channel comprising first haptic information and a second audio channel comprising second haptic information, the processing circuitry 220 is operable to generate a first haptic interaction signal depending on the first haptic information in the first channel and to generate a second haptic interaction signal depending on the second haptic information in the second channel. As such, a plurality of respective haptic interaction signals can be generated based on haptic information obtained from a plurality of audio channels for a received file or stream, and a first actuator of the haptic interface 230 can be controlled responsive to a first haptic interaction signal and a second actuator of the haptic interface 230 can be controlled responsive to a second haptic interaction signal so that the first and second actuators can provide either a same haptic interaction (when the first and second haptic information is substantially the same) or different haptic interactions (when the first and second haptic information differ from each other).
In some cases, the first actuator may be a first type of actuator and the second actuator may be a second type of actuator such that the first and second actuators have different capabilities due to, for example, having different sizes or different motor technologies. For example, a first actuator may have a first amplitude limit and a second actuator may have a second amplitude limit different from the first amplitude limit such that one of the actuators is capable of providing a higher intensity haptic interaction. Alternatively or in addition, the first actuator may be operable in a first frequency range and the second actuator may be operable in a second frequency range such that one actuator can be used for lower frequency haptic interactions and the other actuator may be used for higher frequency haptic interactions.
In embodiments of the disclosure, the first actuator is configured to provide a physical interaction with one hand of the user and the second actuator is configured to provide a physical interaction with the other hand of the user. One or more first actuators controlled responsive to the first haptic interaction signal can be provided in a first portion of the haptic interface 230 to provide a physical interaction with a first hand when the first actuator is actuated. Similarly, one or more second actuators controlled responsive to the second haptic interaction signal can be provided in a second portion of the haptic interface 230 to provide a physical interaction with the other hand when the second actuator is actuated. For example, the haptic interface 230 may take the form of a handheld controller (e.g. Sony® DualSense®) in which the first portion is held by one hand and the second portion is held by the user's other hand. In some examples, the haptic interface may comprise two haptic gloves in which the gloves are worn on respective hands and one or more first actuators are provided in a first glove and one or more second actuators are provided in a second glove.
For a given audio file (or stream) comprising a plurality of audio channels, one of the audio channels may be designated as a first audio channel for carrying first haptic information for a left hand and one of the audio channels may be designated as a second audio channel for carrying second haptic information for a right hand. The processing circuitry 220 can thus generate a left-hand haptic interaction signal in dependence upon the haptic information included in the audio channel designated for the left hand and a right-hand haptic interaction signal in dependence upon the haptic information included in the audio channel designated for the right hand. More generally, a first audio channel can be designated for use in controlling actuators on a first side of the haptic interface and a second audio channel can be designated for use in controlling actuators on another side of the haptic interface opposite the first side (e.g. front side and back side, or left side and right side).
In embodiments of the disclosure, the file or stream comprises metadata indicative of each audio channel comprising the haptic information. Metadata indicative of whether an audio channel comprises haptic information can optionally be included in a file or a data stream so that the data processing apparatus 200 uses the metadata for identifying one or more audio channels comprising haptic information. The metadata may indicate just the one or more audio channels comprising haptic information so that the one or more audio channels which are to be used for generating one or more haptic interaction signals can be reliably identified and other audio channels not indicated by the metadata are assumed to include audio information.
Alternatively, the metadata may indicate for each audio channel whether the audio channel comprises haptic information or audio information. As explained previously, in some cases the metadata may comprise one or more flags for indicating whether a given audio channel comprises haptic information. For example, flag data may be provided for each of the plurality of audio channels in a file/stream in which a value of 1 indicates presence of haptic information in an audio channel and a value of 0 indicates absence of haptic information in an audio channel.
On the basis of the metadata, the processing circuitry 220 generates a signal for each respective audio channel of the received file/stream and designates each generated signal as being either an audio signal for controlling an audio output element or a haptic interaction signal for controlling an actuator of the haptic interface 230. The one or more audio signals and one or more haptic interaction signals can thus be selectively routed by the processing circuitry 220 for controlling either an audio output element or an actuator of the haptic interface 230. For example, in the case of an audio file comprising the 7.1 audio format, two respective haptic interaction signals can be generated using haptic information obtained from two of the audio channels, and up to six respective audio signals can be generated using audio information obtained from the remaining audio channels. The two haptic interaction signals can thus be routed to the haptic interface 230 for use in providing two respective haptic interactions by respectively controlling at least two actuators. Similarly, the up to six audio signals can thus be routed to the haptic interface 230, or to another device comprising audio output capabilities, for use in providing an audio output (note that in the case where the haptic interface is capable of outputting 5.1 audio then each of the audio signals can be used to drive a respective speaker element and when the haptic interface has only stereo audio capability, for example, then only some of the audio signals are used).
In embodiments of the disclosure, the processing circuitry 220 is configured to extract the haptic information in dependence upon the metadata. The metadata may be provided in a header of an audio file or media file for use by the processing circuitry 220. For example, in the case of a WAV file, the header of the WAV file can include metadata for indicating which of the audio channels comprises PCM encoded haptic information, or for indicating for each audio channel whether the audio channel comprises haptic information. The processing circuitry 220 can thus extract the haptic information from one or more respective audio channels of the WAV file in dependence upon the metadata and generate one or more haptic interaction signals in dependence upon the extracted haptic information. In some examples in which the data processing apparatus 200 is provided as part of a device such as a handheld controller with no audio capability, the data processing apparatus 200 extracts only the haptic information in dependence upon the metadata.
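A minimal sketch of metadata extraction from a WAV header follows. It assumes a hypothetical custom `hapt` RIFF chunk listing haptic channel indices as little-endian 16-bit values; the chunk name and layout are illustrative assumptions, not part of the WAV standard:

```python
import struct

def read_haptic_channel_metadata(path):
    """Walk the RIFF chunks of a WAV file and return the channel indices
    listed in a hypothetical 'hapt' metadata chunk, or [] if absent."""
    with open(path, "rb") as f:
        riff, _size, wave = struct.unpack("<4sI4s", f.read(12))
        if riff != b"RIFF" or wave != b"WAVE":
            raise ValueError("not a WAV file")
        channels = []
        while True:
            header = f.read(8)
            if len(header) < 8:
                break
            chunk_id, chunk_size = struct.unpack("<4sI", header)
            data = f.read(chunk_size + (chunk_size & 1))  # chunks are word-aligned
            if chunk_id == b"hapt":
                # Assumed payload: a list of little-endian uint16 channel indices.
                channels = list(struct.unpack("<%dH" % (chunk_size // 2), data[:chunk_size]))
        return channels
```

Because unknown chunks are skipped by conventional WAV readers, this style of metadata would not disturb legacy playback of the audio channels.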
For a file or stream in which the haptic information is encoded in an MPEG audio file format (e.g. MP3), the file/stream may comprise one or more tags storing metadata indicating which of the audio channels comprises haptic information in a compressed audio format, or indicating for each audio channel whether the audio channel comprises haptic information in a compressed audio format.
In some examples, the metadata further comprises timing information for an audio channel comprising the haptic information. As explained above, an audio frame can be received which comprises: at least one audio channel comprising audio information; at least one audio channel comprising haptic information; and a timestamp associated with the audio frame which can be used for controlling a time at which an audio output and a haptic output are generated for the user. In addition to indicating an audio channel comprising haptic information (e.g. either by using a flag or by indicating a channel number), the metadata optionally comprises timing information in the form of one or more timestamps for an audio channel comprising haptic information so as to indicate a time at which the actuator is to be actuated to output the haptic information included in the audio channel. Therefore, a timestamp associated with an audio frame can be used for controlling the timing at which the information included in the audio frame is to be output, so that the respective channels are synchronised and aligned, and the metadata can indicate one or more timestamps for a respective audio channel to control the timing of the output of the haptic information within the period of time corresponding to the duration of the audio frame.
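The timestamp-to-output mapping described above can be sketched as a simple sample-offset calculation (the function name and the use of millisecond timestamps are assumptions for illustration):

```python
def timestamp_to_sample_offset(timestamp_ms, sample_rate_hz):
    """Map a per-channel timestamp (milliseconds from the start of an audio
    frame) to the sample offset at which the actuator output should begin."""
    return round(timestamp_ms * sample_rate_hz / 1000)

# e.g. a haptic burst stamped at 250 ms into a frame sampled at 48 kHz
offset = timestamp_to_sample_offset(250, 48000)  # 12000 samples into the frame
```

Scheduling the actuator drive signal at this offset keeps the haptic output aligned with the audio channels of the same frame.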
Figure 5 is a schematic flowchart illustrating a data processing method for generating a haptic interaction signal in dependence upon haptic information in an audio channel, the method comprising: receiving (at a step 510) a file or stream comprising a plurality of audio channels, in which at least one of the audio channels comprises haptic information; generating (at a step 520) a haptic interaction signal in dependence upon the haptic information; and providing (at a step 530), by a haptic interface, a physical interaction with a user in response to the haptic interaction signal.
Figure 6 is a schematic flowchart illustrating a data processing method for modifying a file or a data stream to include an additional audio channel and inserting haptic information into the additional audio channel, the method comprising: receiving (at a step 610) a file or stream comprising at least one audio channel; encoding (at a step 620) haptic information in an audio file format; modifying (at a step 630) the file or stream to either include an additional audio channel, or repurpose an existing (used or spare) audio channel, collectively referred to as designating an audio channel for haptic data, as described elsewhere herein; inserting (at a step 640) the encoded haptic information in the designated audio channel; and outputting (at a step 650) the modified file or stream.
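Steps 620 to 640 can be sketched for interleaved 16-bit PCM as follows (a toy illustration with assumed names; a real implementation would also rewrite the container's channel-count and block-align fields):

```python
import array

def add_haptic_channel(interleaved, num_channels, haptic_samples):
    """Append one designated channel to interleaved 16-bit PCM samples.

    `interleaved` holds num_channels samples per frame; the returned data
    holds num_channels + 1 samples per frame, with the haptic sample last.
    """
    frames = len(interleaved) // num_channels
    out = array.array("h")
    for i in range(frames):
        # Copy the existing audio samples for this frame...
        out.extend(interleaved[i * num_channels:(i + 1) * num_channels])
        # ...then interleave the PCM-encoded haptic sample as an extra channel.
        out.append(haptic_samples[i])
    return out
```

For example, stereo data `[L0, R0, L1, R1]` with haptic samples `[H0, H1]` becomes `[L0, R0, H0, L1, R1, H1]`, which a legacy device would simply treat as three-channel audio.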
It will be recognised that the designated audio channel is a channel that would be treated as an audio channel by a legacy device that does not implement the invention, even if, upon attempting to decode the data therein (haptic data), the device was unable to do so, or the resulting non-audio output was unplayable. Hence, whilst it is referred to as an audio channel, it will be appreciated that it is a haptic channel encoded in the same way as an audio channel.
Hence alternatively or in addition, a data processing method for modifying a file or a data stream to insert haptic information may comprise the steps of: intercepting audio and haptics generated by an interactive experience (e.g. in pulse code modulation) as described elsewhere herein; encoding multiple or single audio channels and multiple or single haptic channels, and combining (mux) with video content e.g. with a time stamp; storing the encoded information in a file format and/or transmitting in a stream; receiving that file or stream; decoding the file or the stream for example back into pulse code modulation; and rendering the haptics on the haptics device, as described elsewhere herein.
In an example embodiment, a data processing apparatus for generating a file or data stream to include an audio channel comprising haptic information comprises: generating circuitry to generate a file or stream comprising at least one audio channel; processing circuitry to: encode haptic information in an audio file format; modify the file or stream to include an additional audio channel; and insert the encoded haptic information in the additional audio channel; and output circuitry to output the modified file or stream.
Therefore, in this example embodiment, rather than receiving a file or stream and then modifying the file or stream, the data processing apparatus generates the file or stream comprising at least one audio channel comprising audio information and supplements the audio information by adding another audio channel and inserting haptic information encoded in an audio file format into the additional audio channel.
It will be appreciated that example embodiments can be implemented by computer software operating on a general purpose computing system such as a games machine. In these examples, computer software, which when executed by a computer, causes the computer to carry out any of the methods discussed above is considered as an embodiment of the present disclosure. Similarly, embodiments of the disclosure are provided by a non-transitory, machine-readable storage medium which stores such computer software.
It will also be apparent that numerous modifications and variations of the present disclosure are possible in light of the above teachings. It is therefore to be understood that within the scope of the appended claims, the disclosure may be practised otherwise than as specifically described herein.

Claims (18)

CLAIMS
  1. A data processing apparatus, comprising: receiving circuitry to receive a file or stream comprising a plurality of audio channels, in which at least one of the audio channels comprises haptic information; processing circuitry to generate a haptic interaction signal in dependence upon the haptic information; and a haptic interface comprising one or more actuators to provide a physical interaction with a user in response to the haptic interaction signal.
  2. The data processing apparatus according to claim 1, wherein the haptic information is in an uncompressed audio file format.
  3. The data processing apparatus according to claim 1 or claim 2, wherein the haptic information is in a Pulse Code Modulation (PCM) format, a Waveform Audio File Format (WAV) or an Audio Interchange File Format (AIFF).
  4. The data processing apparatus according to claim 1, wherein the haptic information is in a compressed audio file format.
  5. The data processing apparatus according to any preceding claim, wherein the file or stream comprises a sequence of audio frames each having a frame duration, and wherein at least some of the audio frames comprise a first audio channel comprising audio information and a second audio channel comprising the haptic information.
  6. The data processing apparatus according to any preceding claim, comprising an audio output unit to output audio in dependence upon an audio signal, wherein at least one of the plurality of audio channels comprises audio information and the processing circuitry is configured to generate the audio signal in dependence upon the audio information.
  7. The data processing apparatus according to any preceding claim, wherein the file or stream comprises one or more audio channels each comprising respective audio information and the at least one audio channel comprising the haptic information, and wherein the processing circuitry is configured to: generate the haptic interaction signal in dependence upon the haptic information; update the file or stream to remove the at least one audio channel comprising the haptic information; and output the updated file or stream comprising the one or more audio channels each comprising respective audio information.
  8. The data processing apparatus according to any preceding claim, wherein the haptic interface is included in one or more from the list consisting of: a handheld controller; a head-mountable display; a smartphone; a data glove; a haptic vest; and a touchpad.
  9. The data processing apparatus according to any preceding claim, wherein the file or stream comprises a first audio channel comprising first haptic information and a second audio channel comprising second haptic information different from the first haptic information.
  10. The data processing apparatus according to claim 9, wherein the haptic interface comprises a first actuator to provide a physical interaction with the user in response to a first haptic interaction signal generated in dependence upon the first haptic information and a second actuator to provide a physical interaction with the user in response to a second haptic interaction signal generated in dependence upon the second haptic information.
  11. The data processing apparatus according to claim 10, wherein the first actuator is configured to provide a physical interaction with a first hand of the user and the second actuator is configured to provide a physical interaction with a second hand of the user.
  12. The data processing apparatus according to any preceding claim, wherein the file or stream comprises metadata indicative of each audio channel comprising the haptic information.
  13. The data processing apparatus according to claim 12, wherein the processing circuitry is configured to extract the haptic information in dependence upon the metadata.
  14. A data processing method, comprising: receiving a file or stream comprising a plurality of audio channels, in which at least one of the audio channels comprises haptic information; generating a haptic interaction signal in dependence upon the haptic information; and providing, by a haptic interface, a physical interaction with a user in response to the haptic interaction signal.
  15. A data processing apparatus, comprising: receiving circuitry to receive a file or stream comprising at least one audio channel; processing circuitry to: encode haptic information in an audio file format; modify the file or stream to include an additional audio channel; and insert the encoded haptic information in the additional audio channel; and output circuitry to output the modified file or stream.
  16. The data processing apparatus according to claim 15, wherein the processing circuitry is configured to modify the file or stream to include metadata indicative of the additional audio channel comprising the haptic information.
  17. A data processing method, comprising: receiving a file or stream comprising at least one audio channel; encoding haptic information in an audio file format; modifying the file or stream to designate an audio channel for haptic data; inserting the encoded haptic information in the designated audio channel; and outputting the modified file or stream.
  18. Computer software which, when executed by a computer, causes the computer to carry out the method of claim 14 or claim 17.
GB2112852.5A 2021-09-09 2021-09-09 Apparatus, systems and methods for haptics Pending GB2610591A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
GB2112852.5A GB2610591A (en) 2021-09-09 2021-09-09 Apparatus, systems and methods for haptics

Publications (2)

Publication Number Publication Date
GB202112852D0 GB202112852D0 (en) 2021-10-27
GB2610591A (en) 2023-03-15


Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP2881945A1 (en) * 2013-11-19 2015-06-10 Dolby Laboratories Licensing Corporation Haptic signal synthesis and transport in a bit stream
EP2955609A1 (en) * 2014-06-09 2015-12-16 Immersion Corporation Haptic devices and methods for providing haptic effects via audio tracks
US20190235640A1 (en) * 2010-12-03 2019-08-01 Razer (Asia-Pacific) Pte. Ltd. Haptic ecosystem
US20200057502A1 (en) * 2018-08-14 2020-02-20 Cirrus Logic International Semiconductor Ltd. Haptic output systems
US20200153602A1 (en) * 2019-12-27 2020-05-14 Satyajit Siddharay Kamat System for syncrhonizing haptic actuators with displayed content
WO2021131767A1 (en) * 2019-12-25 2021-07-01 ソニーグループ株式会社 Sending device, sending method, receiving device, and receiving method
