CN116601587A - Representation format of haptic objects - Google Patents


Info

Publication number
CN116601587A
Authority
CN
China
Prior art keywords
haptic
scene
effect
haptic effect
volume
Legal status
Pending
Application number
CN202180085310.6A
Other languages
Chinese (zh)
Inventor
F. Danieau
Q. Galvane
P. Guillotel
Current Assignee
InterDigital CE Patent Holdings SAS
Original Assignee
InterDigital CE Patent Holdings SAS
Application filed by InterDigital CE Patent Holdings SAS
Priority claimed from PCT/EP2021/079400 (WO2022100985A1)
Publication of CN116601587A


Landscapes

  • User Interface Of Digital Computer (AREA)

Abstract

A haptic rendering device and corresponding rendering method allow rendering haptic effects defined in a haptic signal comprising information representative of an immersive scene description. The immersive scene includes information representing at least one element of the scene and information representing a haptic object, the information comprising a type of haptic effect, at least one parameter of the haptic effect, and a haptic volume or surface on which the haptic effect applies. The parameter of the haptic effect may be a haptic texture map. A corresponding syntax is presented.

Description

Representation format of haptic objects
Technical Field
At least one of the present embodiments relates generally to haptics, and more particularly to the definition of a representation format for haptic objects in glTF™ (Graphics Language Transmission Format) based immersive scenes.
Background
Fully immersive user experiences are provided to a user through an immersive system based on feedback and interaction. The interaction may use conventional control means that meet the user's needs. Current visual and auditory feedback provides a satisfactory level of realistic immersion. Additional feedback may be provided by haptic effects that allow a human user to perceive the virtual environment with his senses and thus obtain a better experience of complete immersion with improved realism. However, haptics remains an area of possible progress for improving the overall user experience in immersive systems.
Traditionally, an immersive system may comprise a 3D scene representing a virtual environment, with virtual objects located within the 3D scene. To improve the user's interaction with the elements of the virtual environment, haptic feedback may be provided through the actuation of haptic actuators. Such interaction is based on the notion of "haptic objects" that correspond to physical phenomena to be transmitted to the user. In the context of an immersive scene, a haptic object allows providing a haptic effect by defining the stimulation of appropriate haptic actuators to mimic the physical phenomenon on the haptic rendering device. Different types of haptic actuators allow restituting different types of haptic feedback.
One example of a haptic object is an explosion. An explosion may be rendered through vibration and heat, combining these different haptic effects on the user to improve realism. An immersive scene typically comprises multiple haptic objects, for example using a first haptic object associated with a global effect and a second haptic object associated with a local effect.
The principles described herein are applicable to any immersive environment using haptic sensations, such as, for example, augmented reality, virtual reality, mixed reality, or haptic augmented video (or omnidirectional/360 ° video) rendering, and more generally to any haptic-based user experience. Such an example scene of an immersive environment is thus considered an immersive scene.
Haptics refers to the sense of touch and includes two dimensions: tactile and kinesthetic sensations. The first dimension relates to tactile feelings such as friction, roughness, hardness and temperature, which are felt through the mechanoreceptors of the skin (Merkel cells, Ruffini endings, Meissner corpuscles, Pacinian corpuscles). The second dimension relates to the sensation of force/torque, position and movement/velocity provided by the mechanoreceptors located in the muscles, tendons and joints. Haptics also relates to the perception of self-movement, since it contributes to the proprioceptive system (i.e., the perception of one's own body). Thus, the perception of acceleration, velocity, or of any body model may be assimilated to a haptic effect. The frequency range is approximately 0 to 1 kHz, depending on the type of modality. Most existing devices able to render haptic signals generate vibrations. Examples of such haptic actuators are the Linear Resonant Actuator (LRA), the Eccentric Rotating Mass (ERM), and voice-coil linear motors. These actuators may be integrated into haptic rendering devices such as haptic suits, smartphones or game controllers.
To encode haptic signals, several formats have been defined, related either to high-level descriptions using XML-like formats (e.g., MPEG-V), to parametric representations using JSON-like formats such as the Apple Haptic Audio Pattern (AHAP) or the Immersion Corporation HAPT format, or to waveform encoding (IEEE 1918.1.1 ongoing standardization for tactile and kinesthetic signals). The HAPT format has recently been included in the MPEG ISOBMFF file format specification (ISO/IEC 14496 part 12).
In addition, the GL Transmission Format (glTF™) is a royalty-free specification for the efficient transmission and loading of 3D scenes and models by applications. This format defines an extensible, common publication format for 3D content tools and services that streamlines authoring workflows and enables interoperable use of content across the industry.
In addition, modern 3D engines are able to map textures onto 3D objects. These textures contain information about parameters related to various aspects of appearance, such as the color of the object, but also about the geometry, such as normal or bump maps that assist modern visual rendering algorithms during rendering, as well as more complex parameters such as diffuse, emissive or glossiness, which also determine how the object is rendered.
The embodiments described below have been designed with the foregoing in mind.
Disclosure of Invention
Embodiments are directed to haptic rendering devices and corresponding rendering methods that allow rendering haptic effects defined in a haptic signal comprising information representative of an immersive scene description. A corresponding syntax is provided, defined as an extension of the glTF™ format.
A first aspect of at least one embodiment relates to a signal for rendering an immersive scene including information representing a scene description including: at least one information representing at least one element of the scene; and information representing the haptic object, the information including a type of haptic effect, at least one parameter of the haptic effect, and a haptic volume or surface on which the haptic effect acts.
A second aspect of at least one embodiment relates to an apparatus comprising a processor configured to: obtaining information representing a scene description, the scene description comprising: at least one information representing at least one element of the scene; and information representing a haptic object, the information including a type of haptic effect, at least one parameter of the haptic effect, and a haptic volume or surface on which the haptic effect acts; detecting a collision between a position of a user or a position of a body part of the user and the tactile volume; and preparing data for rendering the immersive scene, wherein the data is generated based on at least one parameter of the haptic effect.
A third aspect of at least one embodiment relates to a method comprising: obtaining information representing a scene description, the scene description comprising: at least one information representing at least one element of the scene; and information representing a haptic object, the information including a type of haptic effect, at least one parameter of the haptic effect, and a haptic volume or surface on which the haptic effect acts; detecting a collision between a position of a user or a position of a body part of the user and the tactile volume; and preparing data for rendering the immersive scene, wherein the data is generated based on at least one parameter of the haptic effect.
According to a fourth aspect of at least one embodiment, a computer program is presented, comprising program code instructions executable by a processor, the computer program implementing at least the steps of the method according to the third aspect.
According to a fifth aspect of at least one embodiment, a computer program product is presented, stored on a non-transitory computer readable medium and comprising program code instructions executable by a processor, the computer program product implementing at least the steps of the method according to the third aspect.
In a variant embodiment, the at least one parameter of the haptic effect is a haptic texture map.
Drawings
FIG. 1 illustrates a block diagram of an example of a system in which various aspects and embodiments are implemented.
FIG. 2 illustrates an exemplary flow diagram of a process for rendering a haptic feedback description file in accordance with at least one embodiment.
Fig. 3 illustrates an example of a data structure of an immersive scene description file including haptic objects in accordance with at least one embodiment.
Fig. 4 shows an example of a 3D scene with haptic objects.
FIG. 5 illustrates a glTF™-based data structure of the scene corresponding to FIG. 4 in accordance with at least one embodiment.
Fig. 6A illustrates an example of a 3D object according to an embodiment using haptic texture mapping.
FIG. 6B illustrates an example of a temperature haptic effect according to an embodiment using a haptic texture map.
FIG. 6C illustrates an example of a rate-hardness haptic effect according to an embodiment using a haptic texture map.
Fig. 7A and 7B illustrate examples of haptic objects including haptic texture maps in accordance with at least one embodiment.
Fig. 8A and 8B illustrate examples of scene descriptions representing haptic objects including haptic texture maps in accordance with at least one embodiment.
FIG. 9 illustrates various haptic effect attributes of a continuous effect.
Detailed Description
A haptic object may be related to the global environment (such as a breeze) or to a local effect (such as a punch in the chest). In the first case, the haptic effect is rendered for the complete immersive scene, whereas in the latter case the haptic effect is only active in a determined subspace of the immersive scene (hereinafter referred to as the haptic volume) in which the effect applies. The haptic volume may be limited to a 2D surface, typically the surface of an object or a simple 2D plane (e.g., the floor). In addition, some haptic rendering devices (such as haptic suits) are capable of providing localized haptic effects (e.g., vibrations on the chest) at precise locations on the user's body.
FIG. 1 illustrates a block diagram of an example of a system in which various aspects and embodiments are implemented. In the depicted immersive system, a user, Alice, uses the haptic rendering device 100 to interact with a server 180 hosting an immersive scene 190 through a communication network 170. The immersive scene 190 may comprise various data and/or files representing the different elements (scene description 191, audio data, video data, 3D models, and haptic objects 192) required for its rendering.
The haptic rendering device includes a processor 101. The processor 101 may be a general purpose processor, a special purpose processor, a conventional processor, a Digital Signal Processor (DSP), a plurality of microprocessors, one or more microprocessors in association with a DSP core, a controller, a microcontroller, application Specific Integrated Circuits (ASICs), field Programmable Gate Arrays (FPGAs) circuits, any other type of Integrated Circuit (IC), a state machine, or the like. The processor may perform data processing such as haptic signal decoding, input/output processing, and/or any other function that enables the device to operate in an immersive system.
The processor 101 may be coupled to an input unit 102 configured to convey user interactions. Various types of inputs and modalities may be used for this purpose. A physical keypad or a touch-sensitive surface are typical examples of inputs adapted to this usage, although voice control may also be used. In addition, the input unit may comprise a digital camera able to capture still pictures or video. The processor 101 may be coupled to a display unit 103 configured to output visual data to be displayed on a screen. Various types of displays may be used for this purpose, such as a Liquid Crystal Display (LCD) or an Organic Light Emitting Diode (OLED) display unit. The processor 101 may also be coupled to an audio unit 104 configured to render sound data to be converted into audio waves through an adapted transducer such as a loudspeaker. The processor 101 may be coupled to a communication interface 105 configured to exchange data with external devices. The communication preferably uses a wireless communication standard to provide mobility of the haptic rendering device, such as cellular (e.g., LTE) communication, Wi-Fi communication, and the like. The processor 101 may access information in, and store data in, the memory 106, which may comprise various types of memory including Random Access Memory (RAM), Read-Only Memory (ROM), a hard disk, a Subscriber Identity Module (SIM) card, a memory stick, a Secure Digital (SD) memory card, or any other type of memory storage device. In embodiments, the processor 101 may access information from, and store data in, memory that is not physically located on the device, such as on a server, a home computer, or another device.
The processor 101 may be coupled to a haptic unit 107 configured to provide haptic feedback to the user, the haptic feedback being described in a haptic object 192 that is part of the scene description 191 of the immersive scene 190. The haptic object 192 describes the feedback to be provided according to the syntax described further below. Such a description file is typically conveyed from the server 180 to the haptic rendering device 100. The haptic unit 107 may comprise a single haptic actuator or multiple haptic actuators located at multiple positions on the haptic rendering device. Different haptic units may have a different number of actuators, and/or the actuators may be positioned differently on the haptic rendering device.
The processor 101 may receive power from the power source 108 and may be configured to distribute and/or control the power to other components in the device 100. The power source may be any suitable device for powering the device. For example, the power source may include one or more dry battery cells (e.g., nickel cadmium (NiCd), nickel zinc (NiZn), nickel metal hydride (NiMH), lithium ion (Li-ion), etc.), solar cells, fuel cells, and the like.
Although the processor 101 and the other elements 102 to 108 are depicted as separate components in the figure, it will be appreciated that these elements may be integrated together in an electronic package or chip. It is understood that the haptic rendering device 100 may include any sub-combination of the elements described herein while remaining consistent with an embodiment. The processor 101 may further be coupled to other peripherals or units not depicted in FIG. 1, which may include one or more software and/or hardware modules that provide additional features, functionality and/or wired or wireless connectivity. For example, the peripherals may include sensors, a Universal Serial Bus (USB) port, a vibration device, a television transceiver, a hands-free headset, a Bluetooth® module, a Frequency Modulation (FM) radio unit, a digital music player, a media player, a video game player module, an Internet browser, and the like. For example, the processor 101 may be coupled to a localization unit configured to localize the haptic rendering device within its environment. The localization unit may integrate a GPS chipset providing the longitude and latitude of the current location of the haptic rendering device, as well as other motion sensors providing localization services, such as an accelerometer and/or an electronic compass.
Typical examples of the haptic rendering device 100 are haptic suits, smartphones, game controllers, haptic gloves, haptic chairs, haptic props, motion platforms, etc. However, any device or combination of devices that provides similar functionality may be used as the haptic rendering device 100 while still conforming to the principles of the disclosure.
In at least one embodiment, the device does not comprise a display unit but comprises a haptic unit. In such an embodiment, the device does not render the scene visually but only renders the haptic effects. However, the device may prepare data for display so that another device, such as a screen, may perform the display. Examples of such devices are haptic suits or motion platforms.
In at least one embodiment, the device does not include a haptic unit, but includes a display unit. In such an embodiment, the device does not render the haptic effect, only visually renders the scene. However, the device may prepare data for rendering the haptic effect so that another device (such as a haptic prop) may perform haptic rendering. Examples of such devices are smartphones, head mounted displays or laptop computers.
In at least one embodiment, the device includes neither a display unit nor a haptic unit. In such embodiments, the device does not visually render the scene and does not render the haptic effect. However, the device may prepare data for display so that another device (such as a screen) may perform display, and may prepare data for rendering haptic effects so that another device (such as a haptic prop) may perform haptic rendering. Examples of such devices are desktop computers, optical media players or set-top boxes.
In at least one embodiment, the immersive scene 190 and associated elements are directly hosted in the memory 106 of the haptic rendering device 100 so that local rendering and interaction can take place.
Although the different elements of immersive scene 190 are depicted as separate elements in fig. 1, the principles described herein also apply to the case where these elements are directly integrated in the scene description rather than the separate elements. Any mix between the two alternatives is also possible, where some elements are integrated in the scene description and other elements are separate files.
FIG. 2 illustrates an exemplary flowchart of a process for rendering a haptic feedback description file in accordance with at least one embodiment. Such a process 200 is typically implemented in a haptic rendering device 100 and executed by the processor 101 of such a device. In step 201, the processor obtains a description of the immersive scene (191 in FIG. 1). This step may be done, for example, by receiving the description file from a server through a communication network, by reading it from an external storage device or a local memory, or by any other means. The processor analyses the scene description file to extract the haptic objects (192 in FIG. 1), which allow determining the parameters related to the haptic effects and more particularly the haptic volumes associated with the haptic effects. In step 202, the processor monitors the position of the user within the immersive scene (or the position of a body part of the user for a more accurate detection) to detect, during the interactions, an intersection with a haptic volume (object collision). The collision detection may, for example, be performed by a dedicated physics engine devoted to this task. When such an intersection is detected, in step 203 the processor extracts the parameters from the haptic object, allowing the selection of which haptic signal needs to be applied to which actuator or group of actuators. In step 204, the processor controls the haptic unit so that it applies the selected haptic signal to the haptic actuator or group of actuators and thus renders the haptic feedback according to the information of the haptic object.
Although the haptic effect is described above as being triggered by a collision, it may also be triggered by an event. Such an event may, for example, relate to the entire immersive scene, such as the sun rising (increasing the ambient temperature), or an explosion (a vibration may simulate the shock wave), or an incoming communication, or other situations.
As discussed above, some devices do not perform rendering themselves, but delegate the task to other devices. In this case, data is prepared for rendering of visual elements and/or rendering of haptic effects and transmitted to a device performing the rendering.
In a first example, the immersive scene 191 may comprise a virtual environment of an outdoor campsite in which the user can move an avatar representing him. A first haptic feedback may be a breeze that occurs anywhere in the virtual environment and is rendered by a fan. A second haptic feedback may be a temperature of 30°C when the avatar approaches a campfire. This effect will be rendered by the heating elements of the haptic suit worn by the user while performing process 200. However, this second feedback is only active when the user's position is detected to be within the haptic volume of the second haptic object. In this case, the haptic volume represents the distance from the fire at which the user perceives the temperature.
In another example, the immersive scene 191 may comprise a video of a fight between two boxers, with the user wearing a haptic suit; the haptic effect may be a strong vibration on the user's chest when one of the fighters receives a punch.
Fig. 3 illustrates an example of a data structure of an immersive scene in accordance with at least one embodiment. This embodiment is based on the glTF™ file format. The core of glTF™ is a JSON file describing the structure and composition of a scene containing 3D models. The figure shows the relationships between the elements constituting this structure. In this case, the scene 300 is the top-level element that aggregates all other elements. Most notably, it comprises an array of nodes. Each node 301 may contain child nodes, allowing a hierarchy to be created. A node may refer to a mesh, a camera or a skin, and a local geometric transformation may be associated with the node. The mesh 310 corresponds to the geometry data required to render the mesh. The skin 320 is used to perform vertex skinning, i.e., to deform the vertices of a mesh according to the pose of a skeleton. The camera 325 determines the projection matrix. Animations 340 may be applied to the properties of a node. The buffer 355 contains the data used for the geometry, animations and skins of the 3D models. The buffer view 350 adds structural information to the buffer data, while the accessor 345 defines the exact type and layout of the buffer view. The material 360 determines how an object should be rendered based on physical material properties. The texture 365 allows defining the appearance of an object. The image 370 defines the image data used for a texture, while the sampler 380 describes the wrapping and scaling of textures. All these elements of a glTF™ file allow defining a customized immersive scene, but without any haptic feedback.
Therefore, in at least one embodiment, the glTF™ file also comprises haptic objects 330 that describe the haptic feedback to be rendered. In a variant embodiment, a haptic object comprises a haptic texture map 335 whose data may be stored with the other textures 365. Such haptic objects are described hereafter.
Fig. 4 shows an example of a 3D scene with haptic objects. Each volume is a region in which the user can feel the corresponding effect. In one example, the sphere corresponds to a vibration at 378 Hz and the cube corresponds to a pressure effect of 10 newtons. This is the kind of information that needs to be stored in a haptic object as part of the immersive scene description. At runtime, the user navigates within the immersive scene through his device 100 of Fig. 1. Depending on the kind of immersive application, this navigation may correspond to different types of navigation. In a virtual reality example, the navigation relates to the movement of an avatar (e.g., a 3D object) representing the user within the immersive scene and under the user's control. In an augmented reality example, the navigation relates to the physical movement of the user in the real world, tracked by a positioning system to determine the corresponding position of the user within the immersive scene. In an omnidirectional video example, the navigation relates to the viewpoint of the user within the 360° space.
During navigation within an immersive scene, a collision with a haptic object may occur when the position of the user (or avatar thereof) collides with the haptic volume of the haptic object, or in other words, when the position of the user is within the boundaries of the haptic volume. In such cases, the corresponding haptic effect is triggered on the haptic rendering device. In the example of fig. 4, when the user begins to collide with the sphere, the haptic rendering device will render vibrations at a frequency of 378Hz until there are no more collisions.
However, a haptic object does not necessarily correspond to a visible 3D object. Thus, it may be associated with a volume (the haptic volume) without any visual representation, such that a collision occurs when the position of the user is "inside" this volume.
Thus, in at least one embodiment, the immersive scene comprises at least one haptic object characterized by a type of haptic effect, information characterizing the haptic signal to be applied, and information representing a volume within the scene in which the haptic effect is valid and in which the haptic rendering device should apply the haptic effect. In a variant embodiment, the information characterizing the haptic signal is a reference to a file comprising the haptic signal. In one variant embodiment, the haptic volume is the entire immersive scene such that the haptic effect is global and independent of the position of the user. In one variant embodiment, the haptic volume corresponds to the geometry of a virtual object associated with the haptic object.
FIG. 5 illustrates a glTF™-based data structure of the scene corresponding to FIG. 4 in accordance with at least one embodiment. The scene 500 comprises a top-level node 501 that comprises three child nodes 510, 520 and 530. The first child node 510 corresponds to the sphere object at the bottom left of FIG. 4. Node 510 comprises transformation parameters 511 defining the translation t1, rotation r1 and scaling s1 of the node, and a mesh 512 comprising the complete geometry (i.e., the set of vertices and faces) of the sphere object. Node 510 also comprises a haptic object 513 whose type 514 is determined to be a vibration with a frequency 515 of 378 Hz and an intensity 516 of 0.5. The shape 517 of the haptic object refers to the shape of the node, so the geometry defined by the mesh 512 defining the sphere will be used. The haptic effect of haptic object 513 will therefore be active within the volume of the sphere. The second child node 520 corresponds to the cube object at the top of FIG. 4. Node 520 comprises transformation parameters 521 defining the translation t2, rotation r2 and scaling s2 of the node, and a mesh 522 comprising the complete geometry (i.e., the set of vertices and faces) of the cube object. Node 520 also comprises a haptic object 523 whose type 524 is determined to be a pressure with a value 525 of 10. The shape 526 of the haptic object refers to the shape of node 520, so the geometry defined by the mesh 522 defining the cube will be used. The haptic effect of haptic object 523 will therefore be active within the volume of the cube. The third child node 530 corresponds to the cylinder object at the bottom right of FIG. 4. This node does not contain any haptic object and therefore has no associated haptic effect.
Table 1 shows an example of syntax for defining a haptic object in accordance with at least one embodiment. More specifically, it shows a JSON syntax based on the glTF™ extension mechanism. The extension for haptic objects designed according to at least one embodiment is identified as "IDCC_Hapts" in this example of syntax. The list of haptic effects in this example includes vibration, pressure and temperature effects, but is not exhaustive. Other types of haptic effects may be defined based on the same principles (wind, rain, snow, electricity, or any combination of effects). Tables 11 and 12 describe the syntax of the pressure effect and of the temperature effect, respectively. The syntax describing a haptic effect is defined in a specific JSON schema (some examples are given below) that is then instantiated in the properties of the nodes of the scene description file, as described below. A "shapeType" is also associated with the haptic object and allows describing the haptic volume. It may be a primitive volume (sphere or cube) scaled according to the scaling properties of the node (thus allowing ellipsoidal or parallelepipedic volumes), or it may be defined as a custom mesh. In the latter case, the custom mesh is defined by the existing mesh property of the node and corresponds to the geometry of the visible object. Defining one of the primitive volumes allows determining the haptic volume independently of any visible element.
TABLE 1
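Since the table content itself is not reproduced in this text, the following is a minimal illustrative sketch of a node carrying such an extension, placed under the node's "extensions" property following the glTF™ extension mechanism. The property names and the enumeration value assumed to select a primitive sphere volume are illustrative assumptions based on the description above, not the normative schema of Table 1.

    {
      "nodes": [
        {
          "name": "heat_source",
          "translation": [ 0.0, 1.0, 0.0 ],
          "scale": [ 2.0, 2.0, 2.0 ],
          "extensions": {
            "IDCC_Hapts": {
              "temperature": { "value": 35.0 },
              "shapeType": 0
            }
          }
        }
      ]
    }

Here "shapeType" 0 is assumed to denote the sphere primitive, scaled by the node's scaling property into an ellipsoid if needed; only the value 2 (mesh of the node) is explicitly confirmed later in the text.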
In at least one embodiment, in addition to the sphere and cube primitive volumes, additional volumes commonly used by 3D physics engines may be used, such as, for example, a 2D plane, an ellipsoid, a parallelepiped, or a capsule (a capsule consisting of two hemispheres joined by a cylinder). This syntax is not shown in the table, but would include additional enumerated values for these additional primitive volumes and additional parameters used to define these conventional shapes.
Typically, a single effect is defined, but multiple effects may exist and be combined. For example, pressure and temperature may be combined to indicate weather conditions (cold rain).
Table 2 shows an example of glTF™-based syntax for defining a vibrotactile effect in accordance with at least one embodiment.
TABLE 2
The vibrotactile effect may be defined by parameters comprising a frequency (vibration at the constant frequency of a sinusoidal signal) and an intensity (amplitude of the vibration), or by a haptic signal (similar to the waveform of an audio signal) when more complex effects are desired. In the first case, the parameters of the effect may be defined directly within the vibration syntax, as shown, and are carried by the "frequency" and "intensity" syntax elements. In the second case, the effect is determined by a signal defined by data embedded in a glTF™ buffer and referenced through the index of the corresponding accessor. Such data is typically loaded from an external file such as a Waveform Audio file (".wav" format), a haptic file format ("OHM" format), or any other file format suitable for carrying a vibration signal. When a syntax element is not present, a default value may be determined and should be used. For example, with the definition of Table 2, the default vibration effect is a vibration at a frequency of 250 Hz at half intensity.
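For the second case, a minimal sketch of the extension payload of a node might look as follows; the "accessor" property name is an assumption, since the exact element of Table 2 carrying the accessor index is not reproduced here.

    {
      "IDCC_Hapts": {
        "vibration": { "accessor": 3, "intensity": 1.0 },
        "shapeType": 2
      }
    }

The accessor at index 3 would point, through a buffer view, to vibration samples loaded from an external ".wav" or ".ohm" resource, in the same way regular glTF™ geometry data is referenced.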
Table 3 shows an example of a definition of a scene including a vibrating teapot in accordance with at least one embodiment. This example shows how a simple scene comprising 3D objects associated with vibration effects is defined.
TABLE 3
In this example, the scene comprises a single node named "teapot" representing the unique 3D object of the scene. The geometry of the node is loaded from a "teapot.bin" file through a set of buffer views. The material defines how the mesh is rendered and the translation defines the position of the object in the virtual environment. A haptic object is also associated with the node. This haptic object corresponds to a vibration effect ("vibration" syntax element) at a frequency of 250 Hz ("frequency" syntax element) and an intensity of 70% ("intensity" syntax element). The haptic volume is defined as the mesh of the node ("shapeType" syntax element = 2) and is therefore the mesh of the teapot. Thus, when such an object is present in an immersive scene, the vibration will be rendered when the position of the user collides with the geometry of the teapot, in other words when the user "touches" the teapot.
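As the table content is not reproduced here, the sketch below condenses such a scene description using the elements named in the paragraph above; the buffer, accessor and material entries are abridged, and the numeric values other than the haptic parameters are illustrative.

    {
      "asset": { "version": "2.0" },
      "extensionsUsed": [ "IDCC_Hapts" ],
      "scene": 0,
      "scenes": [ { "nodes": [ 0 ] } ],
      "nodes": [
        {
          "name": "teapot",
          "mesh": 0,
          "translation": [ 0.0, 0.0, -3.0 ],
          "extensions": {
            "IDCC_Hapts": {
              "vibration": { "frequency": 250, "intensity": 0.7 },
              "shapeType": 2
            }
          }
        }
      ],
      "meshes": [ { "primitives": [ { "attributes": { "POSITION": 0 } } ] } ],
      "buffers": [ { "uri": "teapot.bin", "byteLength": 12288 } ]
    }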
Table 4 shows an example of a definition of a scene including a haptic object and an associated haptic volume in accordance with at least one embodiment.
TABLE 4
As previously mentioned, the haptic volume is not necessarily visible. For the sake of brevity and simplicity, this example does not include the definition of other nodes comprising other objects unrelated to the haptic volumes, and only includes two haptic effects with invisible haptic volumes. The scene comprises a single node named "graphic_example". It indicates that the "IDCC_graphics" extension and version 2.0 of the glTF™ specification are used, and that no buffer is used to load resources. The first haptic effect is a vibrotactile object configured to vibrate at a frequency of 378 Hz and half (0.5) intensity. The effect is not associated with a visible object but with an invisible haptic volume, which is a cube located at position P = (-1.8, 0.7, -0.7) and of size 1.2. The second haptic effect is a pressure haptic object configured to apply a force of 10 newtons. The effect is not associated with a visible object but with an invisible haptic volume, which is a sphere located at position P' = (-2.9, 0.0, 0.0) and of size 1.0 (the default value, since unspecified).
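A sketch of the extension payload for these two effects, reusing the positions and sizes quoted above, could look as follows; the "hapticObjects" array, the "position" and "size" properties, and the shapeType enumeration values assumed for the cube and sphere primitives are assumptions, since Table 4 is not reproduced here.

    {
      "extensions": {
        "IDCC_Hapts": {
          "hapticObjects": [
            {
              "vibration": { "frequency": 378, "intensity": 0.5 },
              "shapeType": 1,
              "position": [ -1.8, 0.7, -0.7 ],
              "size": 1.2
            },
            {
              "pressure": { "value": 10.0 },
              "shapeType": 0,
              "position": [ -2.9, 0.0, 0.0 ]
            }
          ]
        }
      }
    }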
In one embodiment, the haptic object is associated with a mesh-based virtual object but is configured with a volume larger than the volume defined by the mesh. For example, the virtual object may correspond to a fireplace represented by a mesh with textures and animations, and the haptic object may comprise a temperature haptic effect configured with a haptic volume, such as a sphere, larger than the bounding box of the fireplace mesh. With such a configuration, a user approaching the virtual fireplace is able to feel the heat before touching (colliding with) the fireplace.
In one embodiment, a scene comprises a plurality of overlapping haptic objects with concentric volumes and different haptic effect values. For example, a set of haptic objects may use concentric spheres around a fireplace, the size of the volumes decreasing as the temperature value increases. With this technique, the user perceives a progressive increase of the heat when approaching the fireplace. Since the user will collide with multiple spheres, the smallest one (i.e., the one closest to the fire) is selected.
Table 5 shows an example of syntax allowing a progressive vibrotactile effect in accordance with at least one embodiment. Indeed, in one embodiment, instead of defining multiple overlapping haptic objects, it is proposed to interpolate between a minimum and a maximum value over the span of the haptic volume. The table only shows the syntax elements added to the definition of the vibrotactile effect of Table 2. First, an "interpolate" flag syntax element is added to the definition of the vibrotactile effect. This flag allows requesting that the value of the haptic effect be interpolated and determines how the interpolation is performed. The interpolation may be any linear, exponential or non-linear function. Second, the "min" and "max" syntax elements allow defining the range of the interpolation by defining the scale factors to be applied to the desired value.
TABLE 5
Table 6 shows an example of vibrotactile effects using interpolation.
TABLE 6
In this example, the haptic volume is a cube of size 2.0 positioned at the origin of the virtual environment. The haptic effect is a linear interpolation between 0.4 and 1.0, or more precisely between 0.4 × 1.0 (the first value being the "min" scale value of the "interpolate" element and the second value being the "intensity") and 1.0 × 1.0 (the first value being the default "max" scale value of the "interpolate" element and the second value being the "intensity"). The interpolation is based on the distance to the center of the haptic volume. Thus, at the origin, the intensity of the haptic effect will be 0.4. At the corner of the cube, at a position equal to (1.0, 1.0, 1.0), the intensity will be 1.0. At a position equal to (0.5, 0.5, 0.5), the intensity will be 0.7.
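A sketch of the haptic object of this example, using the "interpolate", "min" and "max" elements introduced with Table 5; the string value used to request a linear interpolation, the frequency, and the way the cube primitive and its size are expressed are assumptions.

    {
      "vibration": {
        "frequency": 250,
        "intensity": 1.0,
        "interpolate": "linear",
        "min": 0.4,
        "max": 1.0
      },
      "shapeType": 1,
      "size": 2.0
    }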
In at least one embodiment, the type of interpolation is defined by a parameter of the haptic object, allowing a selection between at least linear and custom. In the latter case, the function is determined in an additional parameter.
Table 7 shows an example of glTF™-based syntax for a vibrotactile haptic effect using a haptic signal stored in a file, in accordance with at least one embodiment.
TABLE 7
This embodiment builds on the example syntax for the vibrotactile effect shown in Table 2 and adds a reference to a file storing the haptic signal to be applied to render the effect. This allows defining haptic effects that are more complex than simply using a sinusoidal signal of fixed frequency. In addition, the intensity parameter may also be applied to the haptic signal. This allows sharing a unique haptic signal file and applying it to different haptic objects at different intensity levels. In the example using haptic objects with concentric volumes, the haptic objects may share the same haptic signal file and have increasing intensities to provide a progressive effect. Any file format adapted to store a haptic signal may be used. Examples of such formats are Waveform Audio (WAV), Object Haptic Metadata (OHM), Apple Haptic Audio Pattern (AHAP), or the Immersion Corporation HAPT format. The same principle applies similarly to other types of haptic effects.
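Under the assumption that the element added by Table 7 is a simple file reference (named "file" here) on the vibration effect, a sketch could look like this, with a hypothetical file name:

    {
      "vibration": {
        "file": "explosion.wav",
        "intensity": 0.8
      },
      "shapeType": 2
    }

Two concentric haptic objects could then reference the same "explosion.wav" file with, for instance, intensities of 0.4 and 0.8 to obtain the progressive effect mentioned above.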
Table 8 shows an example of glTF™-based syntax for a vibrotactile effect defining the location of the effect, in accordance with at least one embodiment. This embodiment builds on the Object Haptic Metadata (OHM) format and aims at applying the haptic effect at a defined location on the user's body.
TABLE 8
Thus, in at least one embodiment, it is proposed to add syntax elements to the syntax presented above, allowing the location where an effect should be applied to be specified. This may be done in two steps: first, by determining a geometric model (in other words, a body model) that represents the spatial acuity of the haptic sensation, and second, by determining the location on this body model where the haptic effect should be applied. The geometric model may be selected as a generic model chosen from a set of standard predetermined models. In that case, the model is based on a mesh of the human body. The geometric model may also be determined as a custom geometric model by specifying its geometry. This may be adapted to non-standard haptic rendering devices such as a haptic chair. In that case, the spatial acuity of the haptic perception is limited by the exact positions of the actuators on the rendering device. In the proposed syntax, the geometric model is identified by "avatar_id". The location where the effect should be applied is selected either by using a "body_part_mask" syntax element, corresponding to a binary mask specifying body parts with an associated set of vertices, or by using a "vertices" syntax element specifying the vertices that should be actuated.
Table 9 shows an example of glTF™-based syntax for defining the geometric model used when a vibrotactile effect defines the location of the effect, in accordance with at least one embodiment. The syntax defines an identifier "id" of the geometric model, a "lod" value specifying the level of detail (and thus the resolution) of the geometric model, and the "type" of haptic effect to be rendered. It thus allows specifying the exact location where the haptic effect is applied.
TABLE 9
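Combining the elements named for Tables 8 and 9, a sketch of a vibration localized on a generic body model could look as follows; the "avatars" container, the mask value and the exact way the effect references the model are assumptions for illustration only.

    {
      "avatars": [
        { "id": 0, "lod": 1, "type": "vibration" }
      ],
      "vibration": {
        "frequency": 250,
        "intensity": 0.6,
        "avatar_id": 0,
        "body_part_mask": 4
      },
      "shapeType": 2
    }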
Table 10 shows an example of glTF™-based syntax for a vibrotactile effect using a channel carrying the haptic signal, in accordance with at least one embodiment. This embodiment adds the notion of channel to the syntax presented above. Indeed, a Waveform Audio or OHM file may comprise multiple channels carrying multiple haptic signals associated with multiple haptic objects. In that case, the syntax further comprises information indicating the channel to be used.
Table 10
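Assuming the added element is simply named "channel", the sketch below selects the second channel of a multi-channel haptic file (the file name is hypothetical):

    {
      "vibration": {
        "file": "effects.ohm",
        "channel": 1,
        "intensity": 1.0
      },
      "shapeType": 2
    }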
Table 11 shows an example of glTF™-based syntax for a pressure haptic effect in accordance with at least one embodiment. The pressure haptic effect may simply be defined by a numeric pressure value expressed in newtons in a "value" syntax element. If the "value" syntax element is not present, a default value of "0.0", corresponding to no pressure, should be used. All the embodiments presented above in the context of the vibrotactile effect apply similarly to the pressure haptic effect.
TABLE 11
Table 12 shows an example of glTF™-based syntax for a temperature haptic effect in accordance with at least one embodiment. The temperature haptic effect may simply be defined by a numeric temperature value expressed in degrees Celsius in a "value" syntax element. If the "value" syntax element is not present, a default value of "20.0" should be used. All the embodiments presented above in the context of the vibrotactile effect apply similarly to the temperature haptic effect.
Table 12
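A sketch combining the pressure and temperature effects of Tables 11 and 12 on a single haptic object, each carried by its "value" element, in line with the earlier remark that several effects may be combined (the sphere shapeType value is an assumption):

    {
      "pressure": { "value": 10.0 },
      "temperature": { "value": 30.0 },
      "shapeType": 0
    }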
Fig. 6A illustrates an example of a 3D object according to an embodiment using a haptic texture map. The 3D object 1700 represents a metal bottle 1710 with a black soft rubber holder 1720 isolating the user's hand from the temperature of the bottle. Conventionally, texture files may be used to describe the color, diffuse, emissive, normal, occlusion, roughness, metallic and specular-glossiness properties of the object's material, and allow a proper (physically based) rendering by a rendering engine based on these texture files.
In addition to displaying a representation of the 3D bottle, the rendering may also benefit from a force-feedback device allowing the user to perceive the shape of the bottle and of its different components based on the geometric description.
According to one embodiment, the rendering is enhanced by using a haptic texture map to describe the haptic properties of the object. The haptic texture map allows for simulating different roughness and temperatures of the bottle of fig. 6A by defining different parameters for different haptic properties of a particular region of a 3D object. For example, metal bottles 1710 are rendered as hard and cold metal bottles with softer and hotter rubber holders 1720 by using additional texture information encoded based on similar principles of texture mapping. The temperature texture map shown in fig. 6B determines the temperature on the surface of the object (the metal portion is colder than the plastic portion), and the rate hardness texture map shown in fig. 6C indicates that the metal portion is rigid and the rubber holder is soft.
Using these haptic texture maps, once the user touches the object, the location on the haptic texture is determined, the related haptic information is obtained, and the corresponding haptic effect is rendered. This mechanism allows defining 3D objects with complex surfaces carrying heterogeneous haptic data for different types of haptic features.
According to one embodiment, the glTF™-based syntax for defining haptic effects comprises a haptic texture map to define the haptic effect. Different haptic characteristics may be considered and need to be distinguished. In one embodiment, the haptic characteristics listed in Table 13 are considered.
TABLE 13
Dynamic stiffness, stroke spectral response and stick-slip do not directly encode a haptic value but use an index into a table. The ID corresponds to a file in which the coefficients of an autoregressive filter are stored. These filters model the vibrations measured on the material during a brief contact (dynamic stiffness) or a stroke (stroke spectral response or stick-slip transients), as shown in Table 14.
TABLE 14
Table 15 shows an example of syntax for defining the haptic texture map properties of a haptic object in accordance with at least one embodiment. This syntax allows defining the parameters for different types of haptic properties:
- "rate-hardness" allows determining the hardness of the surface; in other words, it is defined as the initial rate of change of force versus penetration velocity, and is used to simulate both stiffness and damping behavior with better stability. This value is stored in an 8-bit texture and covers values from 0 N·s⁻¹/m·s⁻¹ to 10000 N·s⁻¹/m·s⁻¹, with a resolution of 40 N·s⁻¹/m·s⁻¹,
- "contact-area-spread-rate" is defined as the rate at which the contact area spreads over the finger surface when the finger is pressed against the surface. This value is stored in an 8-bit texture and covers values from 0 N.cm² to 25.6 N.cm², with a resolution of 0.1 N.cm²,
- "local-surface-orientation" allows determining the curvature of the shape. This value is stored in a 24-bit texture (3 × 8 bits for the x, y, z directions) and covers values from 0 degrees to 180 degrees, with a resolution of 0.002 degrees,
- "local-registration" allows determining the relief or the micro-details of the surface. This value is stored in an 8-bit texture and covers values from -5 mm to +5 mm, with a resolution of 0.04 mm,
- "kinetic-friction" allows determining the kinetic friction coefficient, i.e., the force due to friction when objects slide against each other. This value is stored in an 8-bit texture and covers values from -5 to +5, with a resolution of 0.04,
- "static-friction" allows determining the static friction coefficient, i.e., the force necessary to make objects slide on each other. This value is stored in an 8-bit texture and covers values from -5 to +5, with a resolution of 0.04,
- "temperature" allows determining the absolute temperature of the object. This value is stored in an 8-bit texture and covers values from -50°C to +75°C, with a resolution of 0.5°C,
- "relative-temperature" allows determining the temperature relative to the user (for example 37.5°C). This value is stored in an 8-bit texture and covers values from -25.4°C to +25.4°C, with a resolution of 0.2°C,
- "dynamic-stiffness" allows determining the compliance of the object from a vibration point of view, i.e., the transient vibration felt when the user taps on the object. This value is stored in an 8-bit texture and covers values from 0 to 255, which is an id in the index table,
- "stroke-spectral-response" allows determining the vibrations caused by the friction between two objects. This value is stored in an 8-bit texture and covers values from 0 to 255, which is an id in the index table,
- "stick-slip" allows determining the vibration phenomenon that may be observed in the transition between static friction and slipping. This value is stored in an 8-bit texture and covers values from 0 to 255, which is an id in the index table.
TABLE 15
Figs. 7A and 7B illustrate examples of a haptic object comprising a haptic texture map in accordance with at least one embodiment. The object represents a teapot and the haptic effect relates to the temperature of the teapot. The geometry of the teapot is defined by the corresponding mesh. The temperature haptic effect is defined by a haptic texture map applied to the geometry of the object; in this example it defines the bottom of the teapot as hot, the lid of the teapot as cold, and the sides of the teapot as transitioning from hot to cold. In these figures, regions with a high temperature are represented with a light gray shade and regions with a low temperature with a dark gray shade: in other words, the lighter the hotter, the darker the colder. However, the values shown in the figures do not reflect the temperatures defined in Table 1 but are arbitrarily chosen to obtain an understandable illustration.
Figs. 8A and 8B illustrate an example of a scene description representing a haptic object comprising a haptic texture map in accordance with at least one embodiment. This scene corresponds to the teapot with the temperature-related haptic map described in Figs. 7A and 7B. The scene description syntax spans Figs. 8A and 8B. Starting from the end of the glTF™ description file, in Fig. 8B, the scene description 2001 comprises a single node named teapot. The set of nodes 2010 thus contains a single child node named teapot. The geometry is defined in 2020 as using the first mesh, with a translation to position the object within the scene. The single child node also comprises a haptic object 2030 that includes two effects: a vibration effect 2031 and a temperature haptic map 2032. The haptic map 2032 is defined to provide a haptic effect related to "temperature" and uses the texture image at index "0" (and thus the first one) in the list of texture files 2040. The vibration effect is directly defined by its parameters: since the ShapeType parameter is equal to 2, a vibration at a frequency of 250 Hz and an intensity of 0.7 is applied on the mesh of the object. The other parts of the scene description file relate to the mesh defining the geometry of the object 2050 and, in Fig. 8A, to the visual aspect of the surface defined by the defaultMat material 2060, the buffer view 2070, the buffer 2075 storing the data, the version number 2080, the description 2085 of the buffer, and the list of extensions 2090 used.
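A sketch of the haptic part of this teapot node, consistent with the elements called out above (a vibration plus a temperature haptic map pointing to the texture at index 0 of the texture list); the property names used for the haptic map ("hapticTexture", "type", "index") are assumptions:

    {
      "extensions": {
        "IDCC_Hapts": {
          "vibration": { "frequency": 250, "intensity": 0.7 },
          "hapticTexture": { "type": "temperature", "index": 0 },
          "shapeType": 2
        }
      }
    }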
While a first example of syntax for carrying haptic objects has been described above, a second example of syntax in accordance with at least one embodiment is described below. This second example of syntax allows describing the signal more precisely and in a more optimal way. For example, a haptic effect may be defined once and then referenced multiple times to create a haptic signal, optionally with some variations. It also contains more signal parameters, providing a more complete solution for generating any type of signal.
Table 16 shows an example of syntax describing the first level of the extension for a global haptic experience in accordance with at least one embodiment. It provides a description of the haptic object, lists the different avatars (i.e., body representations) and defines the required signals. A shape attribute is also added.
Table 16
The syntax shown in Table 16 is based on the following elements (an illustrative sketch is given after the list):
- description: string description of the signal.
- avatar: list of all the avatars used in the haptic experience. It references the avatar schema described below.
- signals: list of all the signals attached to the haptic object. The array references the signal schema described below.
- trigger: a keyword that may be used to specify the event triggering the haptic object.
- shape: defines the shape of the haptic object.
- accessors: array of information and references to buffer views. It references the glTF accessor schema defined in the official glTF 2.0 specification.
- bufferViews: portions of a buffer. It references the glTF buffer view schema defined in the official glTF 2.0 specification.
- buffers: references to the raw data. It references the glTF buffer schema defined in the official glTF 2.0 specification.
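A condensed sketch of such a first-level description, assuming the element names listed above are used verbatim; the trigger keyword, the shape value, and the abridged avatar and signal entries are illustrative only and are expanded in the sketches accompanying Tables 17 to 19.

    {
      "description": "Haptic experience for an explosion",
      "avatar": [ { "id": 0, "lod": 1, "type": "vibration" } ],
      "signals": [ { "description": "body vibration", "signal_type": "vibration" } ],
      "trigger": "collision",
      "shape": "mesh",
      "accessors": [],
      "bufferViews": [],
      "buffers": []
    }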
In addition to the previous syntax, the haptic signal may be described as shown in the syntax of Table 17. This syntax contains a string description of the signal, some metadata information (e.g., signal type, encoder type, sampling rate, etc.), references to the avatars, and the data of the signal. If the signal contains PCM data, the signal may be accessed through a reference to a file or through an accessor to a buffer. For descriptive content, a list of all the necessary effects is defined at this level. A list of channels finally completes the signal.
TABLE 17
The syntax shown in Table 17 is based on the following elements (an illustrative sketch follows the list):
- description: string description of the signal.
- signal_type: specifies the type of haptic stimulus (vibration, temperature, force, ...).
- encoder: specifies the type of encoder used to store the signal. "raw" means that the signal file is referenced without any encoding. "descriptive" is used when the signal is described using only the glTF extension (e.g., translated from an IVS or AHAP file). "pcm_lossy" and "pcm_lossless" mean that the signal is encoded with dedicated encoders (here an AAC codec and an ALS codec, respectively).
- sampling_rate: sampling rate of the signal.
- bit_depth: bit depth of the referenced data.
- nb_channels: number of channels of the signal.
- nb_samples_per_channel: number of samples in each channel.
- nb_reduced_samples_per_channel: number of samples per channel after downsampling.
- avatar_id: id of an avatar previously defined using the avatar schema described below.
- signal_file: path to a file containing the haptic data. It may be any type of file, including wav, ahap, ivs, aac or other formats.
- signal_processor: id of the accessor to the data in the buffer.
- effect_list: list of all the haptic effects used in the signal. It references the haptic effect schema described below.
- channels: list of the channels of the signal. It references the haptic channel schema described below.
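A sketch of one entry of the "signals" array under these assumptions; the values are illustrative, and the effect and channel entries, left empty here, are expanded in the following sketches.

    {
      "description": "Vibration pattern for an explosion",
      "signal_type": "vibration",
      "encoder": "descriptive",
      "sampling_rate": 8000,
      "bit_depth": 16,
      "nb_channels": 1,
      "nb_samples_per_channel": 16000,
      "avatar_id": 0,
      "effect_list": [],
      "channels": []
    }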
The haptic effects may be described as shown in the syntax of Table 18. This syntax defines basic effects that can be referenced in the timeline of a haptic channel. It allows describing an effect only once and then referencing it several times in different channels. Different properties may be used to describe an effect. Five types of effects are defined: continuous, periodic, transient, PCM, and timeline. Continuous and periodic effects may be defined using one or several properties. For example, the intensity, attack time, fade time, attack level and decay level may be used to define a simple effect (similar to IVS). Higher-level effects may be described using the envelope property, which allows defining a curve by specifying key points. A transient effect may be defined by intensity and sharpness values only. A PCM effect may simply reference raw data stored in a buffer. Properties such as intensity, attack_time, fade_time and envelope can then be used as multipliers for these effects. A timeline effect is a simple timed reference to previously defined basic effects.
TABLE 18
The syntax shown in table 18 is based on the following elements:
-id: the id of the effect.
-effect_type: the type of haptic effect is specified. The types of effects include continuous effects (e.g., non-periodic effects for force feedback), periodic effects (e.g., sinusoidal effects for vibration), transient (e.g., transient and compact vibration effects that feel like tapping), PCM (i.e., raw signal data), or timeline effects that reference other existing effects.
Pcm_data: accessor to the original data of the effect.
Intensity: intensity of effect. If the effect uses PCM data, envelope data or a timeline, this attribute may be used as a multiplier.
Sharpness: the sharpness of the effect is defined.
-duration: duration of effect.
-attack_time: duration of the attack phase of the effect.
-fade_time: duration of the fade-out phase of the effect.
Release_time: the amount of time it takes for the duration intensity envelope to reach zero after the event ends.
-attack_level: intensity at the beginning of the signal.
-decay_level: intensity at the end of the signal.
-envelope: an array of key frames defining a signal envelope.
Wave_frequency: frequency of periodic effects.
-waveform: waveform of periodic effect.
-timeline: time line of effects.
Various haptic effect attributes of the continuous effect are shown in FIG. 9.
The haptic channel extension provides specific metadata for each channel of the signal. As shown in the syntax of Table 19, it includes a description, a gain, a mixing weight (to eventually merge the signals together), a body part mask (locating the effect using the same convention as OHM), and an accessor to a list of vertices (to provide a more precise body location). For descriptive content, an effect timeline is used to reference and organize in time the effects defined at the signal level. Finally, a timeline of attributes may be used as an additional way to adjust the intensity and sharpness parameters over time.
Table 19
The syntax shown in table 19 is based on the following elements:
-id: the id of the channel.
-description: description of the channel.
-gain: the gain applied to each effect in the channel.
-mixing_weight: weights for mixing the channels together are optionally specified.
Body_part_mask: a body mask specifying the location on the body where the effect of the channel should be applied.
-veritics: the vertex of the avatar defining the location on the body where the effect of the channel should be applied.
-effect_timeline: time line of channel effects. It uses the haptic reference pattern defined below.
-properties_timeline: a timeline of attributes. The properties defined herein act as multipliers to change the amplitude or sharpness of the signal path over time.
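A possible channel entry combining these elements is sketched below; the body_part_mask value, the referenced effect ids, and the overall nesting are assumptions made only for illustration.

{
  "id": 0,
  "description": "Left palm",
  "gain": 1.0,
  "mixing_weight": 0.5,
  "body_part_mask": 4,
  "effect_timeline": [
    { "id": 0, "starting_time": 0.0 },
    { "id": 1, "starting_time": 1.5 }
  ],
  "properties_timeline": []
}

Each entry of the effect timeline is a haptic reference, described next, that points to an effect defined at the signal level.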
As shown in the syntax of Table 20, haptic references can be used within a timeline to reference haptic effects defined at the signal level. Only the id and the start time of the effect are required. A haptic reference also provides the possibility to override the properties of the referenced effect; with this feature, the same effect can be used multiple times in different channels with slight variations.
Table 20
The syntax shown in table 20 is based on the following elements:
-id: the id of the effect referenced.
Starting_time: the start time of the referenced effect on the timeline.
-effect_type
-wave_frequency
-waveform
-intensity
-sharpness
-duration
-attack_time
-fade_time
-release_time
-attack_level
-decay_level
All these parameters except "id" and "starting_time" are optional and can be used to override the properties of the referenced effect. They are the same parameters as those defined for the effect schema in Table 18. One example is to reuse a haptic effect with a lower intensity. As already mentioned, this allows the definition of the entire scene to be optimized by reusing parameters.
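For instance, a reference entry reusing effect 2 at a lower intensity and a different frequency could be written as in the sketch below (all values are illustrative):

{ "id": 2, "starting_time": 3.0, "intensity": 0.5, "wave_frequency": 90 }

Only id and starting_time are mandatory; the other fields override the corresponding properties of the referenced effect for this occurrence only.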
This extension may be used within the attribute timeline of a channel to adjust the intensity or sharpness parameters, as shown in the syntax of Table 21. It acts as a multiplier. These attributes may be defined either as a single value or as a curve defined by keypoints.
Table 21
The syntax shown in table 21 is based on the following elements:
property_type: the type of attribute. It specifies whether the attribute is a single value or a curve, and whether it should be applied to intensity or sharpness.
Value: the value of the attribute.
-cut: an array of key frames defined by values and time stamps.
The haptic avatar is used as a body representation for haptic effects. As shown in the syntax of Table 22, different types of avatars may be defined, and a custom mesh from a buffer may be referenced to specify a particular geometry.
Table 22
The syntax shown in table 22 is based on the following elements:
-id: the id of the avatar.
-lod: level of detail of the avatar.
Type: predefined types of avatars include vibration, pressure, and temperature. Other avatars may be described using a "Custom" type and grid.
-mesh: and (3) a grid of avatars.
The syntax shown in Table 23 defines a first example of a haptic object, using the signal given in the companion file "isolation.wav".
Table 23
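A signal referencing the companion file might, for example, resemble the following sketch; apart from the element names of Table 17 and the file name "isolation.wav", every detail (values, nesting, and the minimal channel) is a hypothetical reconstruction rather than the exact content of Table 23.

{
  "description": "Vibration signal stored in a companion file",
  "signal_type": "vibration",
  "encoder": "original",
  "sampling_rate": 8000,
  "bit_depth": 16,
  "nb_channels": 1,
  "avatar_id": 0,
  "signal_file": "isolation.wav",
  "channels": [ { "id": 0, "gain": 1.0 } ]
}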
The syntax shown in Table 24 defines a second example of a haptic object, which includes a fully descriptive signal.
Table 24
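Combining the element lists of Tables 17 to 20, a fully descriptive signal might be sketched as follows; the nesting and all literal values are illustrative assumptions rather than the exact content of Table 24.

{
  "description": "Short double tap",
  "signal_type": "vibration",
  "encoder": "descriptive",
  "nb_channels": 1,
  "avatar_id": 0,
  "effect_list": [
    { "id": 0, "effect_type": "transient", "intensity": 1.0, "sharpness": 0.7 }
  ],
  "channels": [
    {
      "id": 0,
      "description": "Right hand",
      "gain": 1.0,
      "body_part_mask": 2,
      "effect_timeline": [
        { "id": 0, "starting_time": 0.0 },
        { "id": 0, "starting_time": 0.2, "intensity": 0.6 }
      ]
    }
  ]
}

The second timeline entry reuses effect 0 but overrides its intensity, which is the kind of reuse that the descriptive representation is intended to make compact.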
The term "user" is used throughout the document. This means that not only human users are included, but also animals. One example of a use case is to inform the dog that it has entered a restricted area. To this end, the haptic rendering device may take the form of a vibrating dog collar. When the dog enters the restricted area, vibration is provided. In this case, the body model uses an appropriate mesh.
While different embodiments have been described separately, any combination of embodiments may be made while adhering to the principles of the present disclosure.
Although the embodiments relate to haptic effects, those skilled in the art will appreciate that the same principles may be applied to other sensory effects, such as smell and taste. An appropriate syntax would then define the parameters associated with these effects.
Reference to "one embodiment" or "an embodiment" or "one embodiment" or "an embodiment" and other variations thereof means that a particular feature, structure, characteristic, etc., described in connection with the embodiment is included in at least one embodiment. Thus, the appearances of the phrase "in one embodiment" or "in an embodiment" or "in one embodiment" or "in an embodiment" in various places throughout this specification are not necessarily all referring to the same embodiment.
In addition, the present application or its claims may refer to "determining" various pieces of information. Determining the information may include, for example, one or more of estimating the information, calculating the information, predicting the information, or retrieving the information from memory.
In addition, the present application or its claims may refer to "obtaining" various pieces of information. As with "accessing", "obtaining" is intended to be a broad term. Obtaining the information may include, for example, one or more of receiving the information (e.g., from memory or optical media storage), accessing the information, or retrieving the information. Further, "obtaining" is typically involved, in one way or another, during operations such as storing the information, processing the information, transmitting the information, moving the information, copying the information, erasing the information, calculating the information, determining the information, predicting the information, or estimating the information.
It should be understood that the use of any of "/", "and/or", and "at least one of", for example in the cases of "A/B", "A and/or B", and "at least one of A and B", is intended to cover the selection of only the first listed option (A), or only the second listed option (B), or both options (A and B). As a further example, in the cases of "A, B and/or C" and "at least one of A, B and C", such phrasing is intended to cover the selection of only the first listed option (A), or only the second listed option (B), or only the third listed option (C), or only the first and second listed options (A and B), or only the first and third listed options (A and C), or only the second and third listed options (B and C), or all three options (A and B and C). As will be clear to one of ordinary skill in this and related arts, this extends to as many items as are listed.

Claims (30)

1. A method, the method comprising:
obtaining (201) information representing:
a scene description (500), the scene description comprising:
at least one information item representing at least one element of the scene, and
information representative of a haptic object, the information comprising:
the type of haptic effect (514,524),
at least one parameter (515,516,525) of the haptic effect, and
a haptic volume or surface (517,512,526,522) upon which the haptic effect is to be activated,
detecting a collision (202) between a position of a user or a position of a body part of the user and the haptic volume or surface, and
preparing data for rendering an immersive scene, wherein the data is generated based on the at least one parameter of the haptic effect.
2. The method of claim 1, wherein the type of haptic effect is selected from a set comprising vibration, pressure, temperature, and movement.
3. The method of claim 1 or 2, wherein the parameter of the haptic effect describes a signal to be applied to a haptic actuator to render the haptic effect.
4. A method according to any one of claims 1 to 3, wherein the parameter of the haptic effect comprises an identification of a file comprising a haptic signal to be applied.
5. The method of any of claims 1-4, wherein the haptic volume refers to the at least one element of the scene and is determined by a volume of a geometry of the at least one element of the scene.
6. The method of any of claims 1-4, wherein the haptic volume refers to the at least one element of the scene and is determined by a surface of a geometry of the at least one element of the scene.
7. The method of any one of claims 1 to 4, wherein the haptic volume is selected from the group consisting of a 2D plane, sphere, ellipsoid, cube, parallelepiped, and capsule.
8. The method of any of claims 1 to 7, wherein the element of the scene is selected from a set comprising a 3D object, a 2D or 3D video, and an omnidirectional video.
9. The method of any of claims 1-8, wherein the at least one parameter of the haptic effect is a texture map.
10. An apparatus, the apparatus comprising a processor (101) configured to:
obtain information (190) representing:
a scene description (191), the scene description comprising:
at least one information item representing at least one element of the scene, and
information representative of a haptic object (192), the information comprising:
a type of haptic effect,
at least one parameter of the haptic effect, and
a haptic volume or surface on which the haptic effect acts,
detect a collision between a position of a user or a position of a body part of the user and the haptic volume or surface, and
prepare data for rendering an immersive scene, wherein the data is generated based on the at least one parameter of the haptic effect.
11. The device of claim 10, wherein the type of haptic effect is selected from a set comprising vibration, pressure, temperature, movement.
12. The apparatus of claim 10 or 11, wherein the parameter of the haptic effect describes a signal to be applied to a haptic actuator to render the effect.
13. The device of any of claims 10 to 12, wherein the parameter of the haptic effect comprises an identification of a file comprising a haptic signal to be applied.
14. The apparatus of any of claims 10 to 13, wherein the haptic volume refers to the at least one element of the scene and is determined by a volume of a geometry of the at least one element of the scene.
15. The apparatus of any of claims 10 to 13, wherein the haptic volume refers to the at least one element of the scene and is determined by a surface of a geometry of the at least one element of the scene.
16. The device of any one of claims 10 to 13, wherein the haptic volume is selected from the group consisting of a 2D plane, a sphere, an ellipsoid, a cube, a parallelepiped, and a capsule.
17. The apparatus of any of claims 10 to 16, wherein the element of the scene is selected from a set comprising a 3D object, a 2D or 3D video, and an omnidirectional video.
18. The apparatus of any of claims 10-17, wherein the at least one parameter of the haptic effect is a texture map.
19. The device of any of claims 10 to 18, wherein the device is further configured to render the haptic effect by applying a haptic signal to a haptic actuator in accordance with the at least one parameter of the haptic effect.
20. A signal for rendering an immersive scene comprising information representing a scene description (500), the scene description comprising:
at least one information item representing at least one element of the scene, and
information representative of a haptic object, the information comprising:
a type of haptic effect,
at least one parameter of the haptic effect, and
a haptic volume or surface on which the haptic effect acts.
21. The signal of claim 20, wherein the type of haptic effect is selected from a set comprising vibration, pressure, temperature, movement.
22. The signal of claim 20 or 21, wherein the parameter of the haptic effect describes the signal to be applied to a haptic actuator to render the effect.
23. The signal according to any of claims 20 to 22, wherein the parameter of the haptic effect comprises an identification of a file comprising the haptic signal to be applied.
24. The signal according to any of claims 20 to 23, wherein the haptic volume refers to the at least one element of the scene and is determined by the volume of the geometry of the at least one element of the scene.
25. The signal according to any one of claims 20 to 23, wherein the haptic volume refers to the at least one element of the scene and is determined by a surface of a geometry of the at least one element of the scene.
26. The signal according to any one of claims 20 to 23, wherein the haptic volume is selected from the group consisting of a 2D plane, a sphere, an ellipsoid, a cube, a parallelepiped, and a capsule.
27. The signal according to any of claims 20 to 26, wherein the element of the scene is selected from a set comprising a 3D object, a 2D or 3D video, and an omnidirectional video.
28. The signal according to any one of claims 20 to 27, wherein the at least one parameter of the haptic effect is a texture map.
29. A computer program comprising program code instructions which, when executed by a processor, implement the method according to any one of claims 1 to 9.
30. A non-transitory computer readable medium comprising program code instructions which, when executed by a processor, implement the method of any one of claims 1 to 9.
CN202180085310.6A 2020-11-12 2021-10-22 Representation format of haptic objects Pending CN116601587A (en)

Applications Claiming Priority (5)

Application Number Priority Date Filing Date Title
EP20306362.3 2020-11-12
EP21306048.6 2021-07-28
EP21306241.7 2021-09-10
EP21306241 2021-09-10
PCT/EP2021/079400 WO2022100985A1 (en) 2020-11-12 2021-10-22 Representation format for haptic object

Publications (1)

Publication Number Publication Date
CN116601587A true CN116601587A (en) 2023-08-15

Family

ID=77998917

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202180085310.6A Pending CN116601587A (en) 2020-11-12 2021-10-22 Representation format of haptic objects

Country Status (1)

Country Link
CN (1) CN116601587A (en)


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination