WO2011071351A2 - Method for expressing haptic information, and haptic information transmission system using a classification of sensory information - Google Patents


Info

Publication number
WO2011071351A2
WO2011071351A2 (PCT/KR2010/008906)
Authority
WO
WIPO (PCT)
Prior art keywords
force
information
tactile
effect
data
Prior art date
Application number
PCT/KR2010/008906
Other languages
English (en)
Korean (ko)
Other versions
WO2011071351A9 (fr)
WO2011071351A3 (fr)
Inventor
류제하
김영미
Original Assignee
광주과학기술원
Priority date
Filing date
Publication date
Application filed by 광주과학기술원 filed Critical 광주과학기술원
Publication of WO2011071351A2 publication Critical patent/WO2011071351A2/fr
Publication of WO2011071351A9 publication Critical patent/WO2011071351A9/fr
Publication of WO2011071351A3 publication Critical patent/WO2011071351A3/fr

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011 Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G06F3/014 Hand-worn input/output arrangements, e.g. data gloves
    • G06F3/016 Input arrangements with force or tactile feedback as computer generated output to the user
    • G06F3/017 Gesture based interaction, e.g. based on a set of recognized hand gestures

Definitions

  • The present invention relates to haptic technology and, more particularly, to a tactile information presentation method and transmission system that add tactile sensation to audio-video multimedia so that the user experiences greater realism.
  • The present invention has been made in an effort to provide a tactile information presentation method that offers the user more realistic multimedia by appropriately applying haptic effects to an audio-video stream.
  • Another object of the present invention is to provide a tactile information transmission system that offers the user more realistic multimedia by appropriately applying haptic effects to an audio-video stream.
  • According to one aspect of the present invention, there is provided a tactile information presentation method through classification of sensory information, in which object data comprising multimedia information (audio and video) and the corresponding tactile and force information is generated, the generated object data is encoded, the encoded information is multiplexed into a single stream file, and the result is used by the tactile device and the force-feedback device.
  • According to another aspect, a method of representing force information includes: defining force data relating to the effects of passive motion and force data relating to the effects of active motion; describing the components of the force data relating to the passive-motion effects; describing the components of the force data relating to the active-motion effects; providing the passive-motion effect based on the described components of the passive-motion force data; and providing the active-motion effect based on the described components of the active-motion force data.
  • According to another aspect, there is provided a tactile information transmission system through classification of sensory information, including: means for generating object data comprising multimedia information (audio and video) and the corresponding tactile and force information, and for generating a scene descriptor that sets the temporal position of the tactile information; means for encoding the generated object data; means for multiplexing the encoded information into a single stream file; a tactile device; and a force-feedback device. The force-information side of the system includes: means for defining force data relating to the effects of passive motion and force data relating to the effects of active motion; means for describing the components of the passive-motion force data; means for describing the components of the active-motion force data; means for providing the passive-motion effect based on the described passive-motion components; and means for providing the active-motion effect based on the described active-motion components.
  • FIG. 1 is a view showing a tactile device including a driver according to an embodiment of the present invention.
  • FIG. 2 is a perspective view of an apparatus for providing a sense of force according to an embodiment of the present invention.
  • FIG. 3 is a diagram illustrating a driver array and a tactile video corresponding thereto according to an embodiment of the present invention.
  • FIG. 4 is a diagram illustrating an example of generating a tactile video based on a video.
  • FIG. 5 is a diagram illustrating an example of a MovieTexture node of a scene descriptor in MPEG-4.
  • FIG. 6 is a diagram illustrating a TactileDisplay node for representing tactile information.
  • FIG. 7 illustrates a TactileDisplay node connected to a MovieTexture node to define a tactile video object.
  • FIG. 8 is a diagram illustrating a TactileDisplayTexture node for representing tactile information.
  • FIGS. 9 and 10 are diagrams illustrating a Kinesthetic node for expressing force information.
  • FIG. 11 is a block diagram of a tactile information transmission system according to an embodiment of the present invention.
  • FIG. 12 is a flowchart of a tactile information transmission method according to an embodiment of the present invention.
  • FIG. 13 is a flowchart of a process in which active force data is provided to a user.
  • References to an element or layer being "on" another element or layer include cases where it is directly on the other element and cases where another layer or element lies in between. In contrast, an element referred to as "directly on" another has no intervening element or layer. Like reference numerals refer to like elements throughout. "And/or" includes each of the mentioned items and every combination of one or more of them.
  • Although the terms first, second, and so on are used to describe various components, the components are of course not limited by these terms; the terms serve only to distinguish one component from another. Therefore, a first component mentioned below may equally be a second component within the technical spirit of the present invention.
  • Spatially relative terms such as "below", "beneath", "lower", "above", and "upper" may be used to describe a component's relationship to other components easily. Spatially relative terms should be understood to include the different orientations a component may take in use or operation in addition to the orientation shown in the figures. For example, if a component shown in the drawings is flipped, a component described as "below" or "beneath" another component may be placed "above" it; the exemplary term "below" can therefore encompass both an orientation of above and below. Components may be oriented in other directions as well, and spatially relative terms are interpreted accordingly.
  • FIG. 1 is a view showing a haptic device including drivers.
  • FIG. 2 is a diagram illustrating a force-feedback device.
  • The tactile device 100 includes tactile presentation units 120 and 130, drivers 200, a device transceiver 350, and a device specification DB 400.
  • The force-feedback device 150 may be composed of a plurality of actuators (not shown) for presenting force information to the user.
  • The haptic sense may be roughly classified into tactile information, including vibration, heat, and current, and force (kinesthetic) information, including force, torque, and stiffness.
  • Information about both touch and force is referred to as sensory information.
  • The device that presents tactile information is a tactile device, and the device that presents force information is a force-feedback device; collectively, the tactile device and the force-feedback device may be referred to as haptic devices.
  • The sense of force may be classified into force data relating to the effects of passive motion and force data relating to the effects of active motion. This classification is defined from the user's side: the passive-motion effect means that, when the user wears the haptic device, pre-generated force information is provided to the user through the haptic device.
  • The active-motion effect means that, when a user wearing the haptic device touches an object of his or her own will, the force information set on that object is provided through the haptic device.
  • the tactile presentation units 120 and 130 include left and right tactile presentation units, and the tactile presentation units 120 and 130 may include a plurality of drivers 200.
  • the tactile presentation units 120 and 130 may be implemented in a glove form and may be worn by a user. However, the present invention is not limited thereto and may be provided in various forms. Depending on the form for the tactile presentation, the tactile presentation units 120 and 130 may be implemented to be worn on the head in the form of a hat in addition to the form of gloves and shoes, or to be attached to the arms or legs or the back or the waist.
  • The drivers 200 may be disposed in the tactile presentation units 120 and 130 and may operate by a vibrotactile stimulation method or a pneumatic tactile stimulation method.
  • In the vibrotactile case, a driver may consist of an eccentric motor, a linear motor, a voice coil, an ultrasonic motor, or a piezoelectric element; in the pneumatic case, it may take the form of a nozzle or a pneumatic membrane that supplies air.
  • the device controller 300 controls the driver 200.
  • It receives the driving signal generated by the main controller (not shown) in the compositor 776 and controls the operation of the drivers 200 accordingly.
  • the device transceiver 350 transmits and receives a control signal for controlling the device and transmits the control signal to the device controller 300.
  • The device specification DB 400 stores information about the tactile and force-feedback devices 100 and 150. The haptic devices comprise the force-feedback device 150, which presents force, torque, and the like, and the tactile device 100, which presents vibration, heat, current, and the like; the device specification DB 400 stores information about both.
  • For the tactile device, the device specification DB 400 includes the type of haptic device, the unit of the stimulus the tactile device presents, the maximum/minimum intensity the device can present, and the driver arrangement of the tactile device in the horizontal and vertical directions.
  • For the force-feedback device 150, the DB includes the degrees of freedom, the unit and maximum/minimum values of the force/torque/stiffness the device provides on each axis, and the workspace. The form of the degrees of freedom indicates whether independent translational and rotational motion is allowed in the X/Y/Z directions, and the workspace means the range over which the force-feedback device can perform translational and rotational motion.
  • The range over which the device can translate may be defined in mm as the maximum travel along the X/Y/Z axes.
  • The maximum range of rotation about each axis may be defined in degrees of roll/pitch/yaw angle.
  • the unit is not limited to the above-mentioned unit.
  • The force information handled by the force-feedback device may be defined separately as force information for passive motion and force information for active motion.
  • The passive force information generated by the passive force data generator 717 together with the multimedia information is provided to the user through the force-feedback device 774 at the scheduled points in time, regardless of the user's intention, based on the components of the passive-motion force data described in the passive force data description unit 450.
  • In contrast, for the active case, the user wearing the force-feedback device 774 must touch an object of his or her own will; the active force information is then provided to the user through the force-feedback device 774, based on the components of the active-motion force data described in the active force data description unit 450.
  • The components of the force data relating to the effects of passive motion include the trajectory of the force-feedback device, the data update rate of the device, and the force and torque provided by the device.
  • The trajectory of the force-feedback device refers to the position and orientation in which the device moves.
  • The trajectory comprises three position components (Px, Py, Pz) and three orientation components (Ox, Oy, Oz); position can be expressed in millimeters (mm) and orientation in degrees.
  • The position and orientation data are updated at the same update rate.
  • The data update rate of the force-feedback device refers to the number of force-data updates per second; for example, an update rate of 20 means the force data is updated 20 times per second.
  • The force/torque provided by the force-feedback device is the force/torque the device applies to the user.
  • The components of the force data relating to the effects of active motion include the data update rate of the force-feedback device and the force and torque provided by the device.
  • The data update rate of the force-feedback device refers to the number of updates per second of the force information.
  • The force/torque provided by the force-feedback device refers to the three forces (Fx, Fy, Fz) and three torques (Tx, Ty, Tz) about each axis; force can be expressed in newtons (N) and torque in newton-millimeters (N·mm).
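The force-data components above can be collected into one record per sample. The following Python sketch is purely illustrative (the class and field names are not part of the patent); it shows how the trajectory, force, torque, and update rate described above fit together.

```python
from dataclasses import dataclass

# Hypothetical container for one sample of passive-motion force data.
@dataclass
class PassiveForceSample:
    position_mm: tuple       # (Px, Py, Pz) in millimeters
    orientation_deg: tuple   # (Ox, Oy, Oz) in degrees
    force_n: tuple           # (Fx, Fy, Fz) in newtons
    torque_nmm: tuple        # (Tx, Ty, Tz) in newton-millimeters

# An update rate of 20 means the force data is refreshed 20 times per second.
UPDATE_RATE_HZ = 20
SAMPLE_PERIOD_S = 1.0 / UPDATE_RATE_HZ  # 0.05 s between samples

sample = PassiveForceSample(
    position_mm=(10.0, 0.0, -5.0),
    orientation_deg=(0.0, 90.0, 0.0),
    force_n=(1.5, 0.0, 0.0),
    torque_nmm=(0.0, 0.0, 120.0),
)
```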
  • Next, the process of providing the active force information will be described.
  • When the user wears the force-feedback device 774 and touches an object, the device controller 300 provides an active force to the user through the force-feedback device 774, based on the components of the active-motion force data written in the active force data description unit 450.
  • In this way, the force-feedback device can provide a sense of force to the user.
  • The tactile video 600, the force data, and the scene descriptor information mapped to the media (video, audio) information are resized in consideration of the device specifications stored in the device specification DB 400 before the tactile and force sensations are presented. This will be described later.
  • The compositor 776 includes a main controller (not shown) that generates the signals controlling the drivers 200 of the tactile device 100, and a main transceiver that transmits those control signals to the device transceiver 350 of the tactile device 100.
  • The main controller generates a control signal for each driver 200 and transmits it to the device controller 300 through the main transceiver and the device transceiver 350; the device controller 300 then controls the driving of each driver 200 based on the control signal.
  • the main transceiver and the device transceiver 350 may be connected by wire or wireless communication.
  • The driving of each driver 200 can be controlled by specifying its driving strength; tactile information is therefore presented to the user by transmitting the driving-strength information for each driver 200 to the device controller 300.
  • In the present invention, the driving-intensity information for the drivers 200 is carried to the main controller in the form of a tactile video: each time the frame changes, each pixel value is converted and transmitted to the device controller 300.
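The per-frame conversion described above can be sketched as follows. This is only an illustration (the function name and command format are hypothetical): each pixel of a tactile-video frame becomes the driving strength of one driver in the array.

```python
def frame_to_driver_commands(frame):
    """Flatten one tactile-video frame (rows of 0-255 intensities) into
    (driver_index, driving_strength) pairs, one pair per driver."""
    commands = []
    index = 0
    for row in frame:
        for intensity in row:
            commands.append((index, intensity))
            index += 1
    return commands

# A tiny 2x3 frame: 255 (white) drives strongly, 0 (black) weakly.
frame = [[0, 128, 255],
         [255, 128, 0]]
commands = frame_to_driver_commands(frame)
```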
  • FIG. 3 is a diagram illustrating a driver array and a tactile video corresponding thereto.
  • The left tactile presentation unit 120 and the right tactile presentation unit 130 are each provided with a 4×5 array of drivers, which together can be expressed as a 4×10 driver array 500. That is, as shown in FIG. 3, the combination of drivers may be represented in the form of a rectangular array.
  • the tactile video 600 includes pixels corresponding to each driver.
  • Each pixel of the tactile video 600 includes intensity information of the pixel, and the intensity information corresponds to the driving intensity of the driver corresponding to the pixel.
  • Since the tactile video 600 is represented as a grayscale video, each pixel has intensity information from 0 to 255.
  • The drivers 200 are driven based on this information: for example, a driver corresponding to a white pixel vibrates strongly, while a driver corresponding to a black pixel vibrates weakly.
  • If the dimensions of the tactile video 600 match those of the driver array 500, the intensity of each pixel and the driving intensity of the corresponding driver 200 may be matched 1:1. However, if the tactile video 600 is larger than the driver array 500, it is resized according to the ratio. That is, when there is a difference between the requested tactile information and the presentable tactile information, the device controller 300 may perform resizing.
  • the resizing is performed by the device controller 300 using a specification of a device stored in the device specification DB 400.
  • Control information is a concept that includes the device specifications and user preferences; resizing means adjusting the tactile information with reference to the control information before providing it.
  • For example, if the tactile video 600 has dimensions of 320×240 and the driver array 500 of the tactile device 100 has dimensions of 10×4, the 320×240-pixel tactile video is resized to 10×4 so as to correspond 1:1 with the driver array 500.
  • The intensity of each pixel in the scaled tactile video may be obtained by averaging the intensities of the corresponding pixels before scaling.
  • Since the tactile video 600 has the same format as a general color or monochrome video, it can be transmitted using general video encoding and decoding methods. The tactile video 600 is composed of a plurality of frames, and the pixel intensities in each frame correspond to the driving strengths of the drivers 200 of the tactile presentation device.
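The averaging-based resize described above can be sketched in a few lines of Python. This is a minimal illustration (the function name is hypothetical, and it assumes the source dimensions divide evenly by the target dimensions): each output pixel is the average of one block of source pixels.

```python
def resize_tactile_frame(frame, out_rows, out_cols):
    """Downscale a tactile frame to the driver-array dimensions by
    averaging each block of source pixels (integer average)."""
    in_rows, in_cols = len(frame), len(frame[0])
    rh, cw = in_rows // out_rows, in_cols // out_cols  # block size
    out = []
    for r in range(out_rows):
        row = []
        for c in range(out_cols):
            block = [frame[r * rh + i][c * cw + j]
                     for i in range(rh) for j in range(cw)]
            row.append(sum(block) // len(block))
        out.append(row)
    return out

# A 4x4 frame reduced to a 2x2 driver array.
small = resize_tactile_frame(
    [[0, 0, 255, 255],
     [0, 0, 255, 255],
     [100, 100, 50, 50],
     [100, 100, 50, 50]], 2, 2)
```

The same routine applied with 10 and 4 as targets would carry a 320×240 tactile video down to the 10×4 driver array of the example.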
  • To generate passive force data, a skilled person's three- or six-degree-of-freedom movement of a tool manipulated in the workspace is recorded.
  • A robot arm carrying the tool used by the skilled person is employed; an encoder at each joint of the robot arm yields the position data of the tool, and torque sensors measure the force/torque the skilled person applies to the tool.
  • The force data for force reproduction is then a series of position and force samples, including the time at which each sample was taken.
  • The device controller 300 may also perform resizing using the information about the force-feedback device 150 stored in the device specification DB 400. That is, when there is a difference between the required force information and the presentable force information, the device controller 300 may perform resizing. For example, if the motion information calls for moving 1 m in the X-axis direction but the user's actual device has a workspace half that of the required device, all motion information can be scaled by half along the X, Y, and Z axes before being delivered to the user.
  • Likewise, if the transmitted force information requires the force-feedback device 150 to provide 10 N in the X-axis direction but the actual device 150 can provide only 5 N, the force provided by the device 150 is adjusted according to that ratio.
  • A user may also dislike temperatures above 30 degrees, currents above 0.5 mA, or forces greater than 5 N.
  • Such user preferences take priority over the device specification information. Therefore, even if transmitted force information of 10 N is adjusted to 5 N in consideration of the device specification, a user preference of 3 N means the force actually provided by the haptic device is 3 N; the 10 N information has thus been resized to 3 N.
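The two-stage adjustment above (device capability first, then the higher-priority user preference) reduces to a simple clamp. A minimal sketch, with a hypothetical function name:

```python
def resize_force(requested_n, device_max_n, user_pref_max_n):
    """Clamp a requested force to the device capability and then to the
    user preference, which has priority (10 N -> 5 N -> 3 N example)."""
    return min(requested_n, device_max_n, user_pref_max_n)

# Transmitted 10 N, device limit 5 N, user preference 3 N -> 3.0 N provided.
provided = resize_force(10.0, 5.0, 3.0)
```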
  • FIG. 4 is a diagram illustrating an example of generating a tactile video based on a video.
  • FIG. 4 shows a movie-adapted tactile video 850 generated by a tactile editing/authoring tool; to its right is the corresponding driver array 900.
  • The tactile editing/authoring tool generates the tactile video 850 based on the video/audio information, applying a tactile intensity (0 to 255) to each frame so that the number of tactile frames exactly matches the number of video frames.
  • The generated tactile video is reproduced in accordance with the device specification: it is converted according to both the size of the device and the number of tactile intensity levels the device can provide. For example, if the tactile video is authored with 256 intensity levels but the device can reproduce only 8, the 256 levels are divided into 8 levels for playback.
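Dividing the 256 authored levels into the 8 levels a device can reproduce is a straightforward integer rescaling; the sketch below (function name hypothetical) maps an authored intensity onto the device's coarser scale.

```python
def quantize_intensity(value, device_levels=8, source_levels=256):
    """Map an authored 0..(source_levels-1) intensity onto the smaller
    number of levels the device can reproduce, e.g. 256 -> 8 steps."""
    return value * device_levels // source_levels

# 0..31 -> level 0, ..., 224..255 -> level 7 on an 8-level device.
levels = [quantize_intensity(v) for v in (0, 31, 128, 255)]
```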
  • The tactile video may be authored not only from the third-person observer's point of view but also from the first-person protagonist's viewpoint or as a tactile background effect.
  • Where a weak tactile effect is intended, the tactile video is mapped toward black, and the driver array presents a weak-intensity stimulus corresponding to the mapped color.
  • Where a strong effect is intended, the tactile video is mapped toward white, and the driver array correspondingly presents a strong-intensity stimulus.
  • FIG. 5 is a diagram illustrating an example of a MovieTexture node of a scene descriptor in MPEG-4.
  • The tactile information is transmitted along with general media (audio and video) information.
  • Hereinafter, a node structure, a transmission method, and a system for transmitting tactile information expressed in the form of tactile video together with media information will be described.
  • In MPEG-4, the information for representing an object is transmitted through a plurality of elementary streams (ES).
  • The correlation and link-configuration information between the elementary streams (ES) is transmitted in object descriptors defined in MPEG-4.
  • To compose a scene based on MPEG-4, an Initial Object Descriptor (IOD), a scene descriptor (BIFS, Binary Format for Scenes), object descriptors, and media data are generally required.
  • The Initial Object Descriptor (IOD) is the first information transmitted to compose an MPEG-4 scene; it describes the profile and level of each medium and contains ES descriptors for the scene descriptor (BIFS) stream and the object descriptor stream.
  • the object descriptor is a set of elementary stream descriptors that describe information about each media data constituting the scene, and provides a connection between the elementary stream (ES) of each media data and the scene description.
  • the scene descriptor (BIFS) is information describing how each object has a relationship in space and time.
  • the Scene Descriptor (BIFS) is provided with a MovieTexture node that defines a video object.
  • startTime represents the time at which the video starts playing, and stopTime represents the time at which playback stops; this allows the video to be synchronized with other objects.
  • The url field sets the location of the video.
  • A TactileDisplay node is defined in order to transmit tactile video using the MovieTexture node of this scene descriptor.
  • FIG. 6 is a diagram illustrating a TactileDisplay node for representing tactile information.
  • FIG. 7 illustrates a TactileDisplay node connected to a MovieTexture node to define a tactile video object.
  • the TactileDisplay node is a kind of texture node.
  • The "url" field indicates the location of the tactile video, the "startTime" field its start time, and the "stopTime" field its end time.
  • A tactile video object is defined by connecting a MovieTexture node to the texture field of the TactileDisplay node.
  • In the example of FIG. 7, the tactile video set to "tactile_video.avi" is played on the tactile presentation device from 3 seconds to 7 seconds after playback begins.
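The node wiring in this example can be pictured as a small data structure. The field names (url, startTime, stopTime, texture) follow the node description above, but the Python representation itself is only illustrative, not an MPEG-4 encoding.

```python
# Hypothetical in-memory picture of a TactileDisplay node whose texture
# field carries a MovieTexture node, as in the FIG. 7 example.
movie_texture = {"url": "tactile_video.avi", "startTime": 3.0, "stopTime": 7.0}
tactile_display = {"texture": movie_texture}

def is_active(node, t_seconds):
    """The tactile video attached to the texture field is presented only
    between startTime and stopTime on the media timeline."""
    tex = node["texture"]
    return tex["startTime"] <= t_seconds < tex["stopTime"]
```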
  • FIG. 8 is a diagram illustrating a TactileDisplayTexture node for representing tactile information.
  • In FIG. 8, the MPEG-4 scene descriptor (BIFS) newly defines a TactileDisplayTexture node for transmitting a tactile video.
  • TactileDisplayTexture defines the play start time and stop time of the tactile video file, and the "url" field indicates the location of the tactile video file.
  • FIGS. 9 and 10 are diagrams illustrating a Kinesthetic (force) node. Like the tactile nodes mentioned above, these figures define an object for the force data.
  • FIG. 11 is a block diagram of a tactile information transmission system according to another embodiment of the present invention.
  • The tactile information transmission system includes an object data generator 710, an encoder 720, a multiplexer (MUX) 730, a transmission channel 740, a demultiplexer (DEMUX) 750, a decoder 760, and a playback unit 770.
  • The object data generator 710 generates the media (audio, video) and generates, edits, or authors the tactile video and the force data corresponding to the media.
  • the audio generator 712 stores or generates audio
  • the video generator 714 stores or generates video.
  • the tactile video generator 716 generates a tactile video indicating the driving strength of the driver array based on the audio or video.
  • The tactile video in the tactile video generator 716 may be generated automatically according to the type of audio or video information, or may be generated directly by a user based on the audio or video.
  • The tactile video generated by the tactile video generator 716 and the force data generated by the force data generator 717 are edited and authored together with the media (audio and video) information in the editing/authoring unit and are arranged along their respective time axes. The editing/authoring unit then generates a scene descriptor according to the spatiotemporal positions of the audio, video, tactile video, and force data.
  • The encoder 720 encodes the audio, video, tactile video, force data, and scene descriptor. Audio is encoded by the audio encoder 722 and video by the video encoder 724. Since the tactile video is a kind of monochrome video, it can be encoded by a general video encoding method; this is done by the tactile video encoder. The force data is encoded by the force data encoder 727.
  • the scene descriptor is encoded in the BIFS encoder 728.
  • This encoding is done by MPEG-4 audio and video encoding methods.
  • The information encoded by the encoder 720 is multiplexed by the multiplexer 730 into a single MP4 file, which is transmitted over the transmission channel 740.
  • the encoding method is not limited to the MPEG-4 audio and video encoding method.
  • the transmission channel 740 should be understood as a concept encompassing wired and wireless communication networks, and may be an IP network, a DMB communication network, or an Internet network.
  • the MP4 file transmitted through the transmission channel 740 is demultiplexed by the demultiplexer 750 and decoded for each piece of information by the decoder 760.
  • the audio decoder 762 decodes the audio
  • the video decoder 764 decodes the video
  • the tactile video decoder 766 decodes the tactile video
  • the force data decoder 767 decodes the force data
  • the BIFS decoder 768 decodes the scene descriptor.
  • the information decoded by the decoder 760 is reproduced by the reproducing unit 770.
  • the reproducing unit 770 includes a compositor 776, an audio-video output device 772, and a tactile presenting device.
  • The compositor 776 organizes the transmitted objects, such as the audio, video, tactile video, and force data, in time and space using the scene descriptor (BIFS) information.
  • the audio-video output device 772 outputs audio and video information
  • the tactile device 774 presents tactile information through the driver array.
  • The force information is provided through the force-feedback device 774.
  • The tactile presentation device includes the tactile/force-feedback device 774, the device specification DB 400, the active force data description unit 450, and the device controller 300.
  • The device specification DB 400 stores the specification information of the tactile/force-feedback device 774, and the device controller 300 adjusts the transmitted tactile video information and force data based on the device specification so that the tactile/force-feedback device 774 provides the tactile and force sensations.
  • The information comprising the characteristics and specifications of the device may be stored in the device specification DB manually and/or automatically through the control unit.
  • the active force data description unit 450 describes the components of the force data relating to the effects of active movement; when the user touches an object, the device control unit 300, based on the described components, provides force information corresponding to the user's touch through the force-feedback device 774.
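The demultiplex-and-decode stage described above can be sketched as a small dispatcher that splits the interleaved stream by type and routes each elementary stream to the matching decoder. The stream-type tags and handler functions below are illustrative assumptions; only the decoder reference numerals (762, 764, 766, 767, 768) come from the description.

```python
# Minimal sketch of the demultiplexer/decoder stage. Payloads are stand-in
# byte strings; real MP4 demultiplexing and codec work is out of scope.

def demultiplex(mp4_packets):
    """Split an interleaved packet sequence into per-type elementary streams."""
    streams = {}
    for stream_type, payload in mp4_packets:
        streams.setdefault(stream_type, []).append(payload)
    return streams

# One (stand-in) decoder per stream type, as in the description above.
DECODERS = {
    "audio": lambda es: ("pcm", es),                      # audio decoder 762
    "video": lambda es: ("frames", es),                   # video decoder 764
    "tactile_video": lambda es: ("intensity_maps", es),   # tactile video decoder 766
    "force": lambda es: ("force_samples", es),            # force data decoder 767
    "bifs": lambda es: ("scene_graph", es),               # BIFS decoder 768
}

def decode_all(streams):
    """Run each known elementary stream through its decoder."""
    return {t: DECODERS[t](es) for t, es in streams.items() if t in DECODERS}

packets = [("audio", b"a0"), ("video", b"v0"), ("tactile_video", b"t0"),
           ("force", b"f0"), ("bifs", b"s0"), ("audio", b"a1")]
decoded = decode_all(demultiplex(packets))
```

The decoded streams would then be handed to the compositor 776, which places them in time and space according to the decoded scene descriptor.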
  • FIG. 12 is a flowchart of a tactile information transmission method. This flowchart may also be viewed as a flowchart of a method of transmitting passive force data.
  • the tactile video generation unit generates a tactile video based on media information such as audio and video, and the force data generation unit generates force data (S100).
  • Each pixel of the tactile video includes an intensity value that represents the driving intensity for each driver of the driver array of the tactile device.
  • such a tactile video can be generated automatically or manually based on the audio or video.
  • the force data may be classified into an active mode and a passive mode from the viewpoint of the user (or viewer).
  • an expert records the three- or six-degree-of-freedom movement of a tool manipulated in the work space, and force data including a series of position data and force data is generated for force-sensation reproduction. Thereafter, when the user wears the force-feedback device, the generated force information is delivered to the user.
  • haptic characteristics of a virtual object (e.g., the stiffness and texture of a sofa) may be generated automatically or manually.
  • the tactile video and the force data are arranged along the time axis in accordance with the media information, and the editing/authoring unit generates a scene descriptor including information on the spatiotemporal positions of the media information, the tactile video, and the force data (S200).
  • the scene descriptor includes texture nodes for the tactile video and the force data; each such node includes a startTime field and a stopTime field for controlling when the tactile video and force data are output, and a url field indicating the location of the tactile video and force data.
  • the media information, the tactile video information, the force data, and the scene descriptor information are encoded by the encoder unit, and a stream file is then generated through a multiplexer (S300).
  • the generated stream file is transmitted through a transport channel (S400).
  • the transport channel should be understood as a concept encompassing wired and wireless communication networks, and may be an IP network, a DMB communication network, or an Internet network.
  • the transmitted stream file is demultiplexed by the demultiplexer and then decoded by the decoder (S500).
  • the compositor uses the scene descriptor information to compose the audio, video, tactile video, and force data in time and space; the audio and video are output by the audio-video output device, and the tactile information is presented by the tactile device.
  • the force data is presented by the force-feedback device.
  • each driver of the driver array is driven through a resizing process that takes into account the intensity value of each pixel of the tactile video and the device specification.
  • similarly, the force-feedback device is driven through a rendering process that takes into account the force information and the device specification.
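The resizing step described above can be sketched in a few lines: a tactile-video intensity frame is mapped onto a (usually smaller) actuator array, and pixel intensities are rescaled to the device's own intensity range taken from the device specification. This is only an illustrative sketch; the nearest-neighbour policy, the 0-255 pixel range, and all dimensions are assumptions, not details from the patent.

```python
# Illustrative sketch of the resizing process: a tactile-video frame
# (pixel intensities assumed 0-255) is resampled to the driver-array
# dimensions and rescaled to the device's intensity levels, both of
# which would come from the device specification DB.

def resize_frame(frame, out_rows, out_cols, max_level):
    """Nearest-neighbour resample plus intensity rescaling (assumed policy)."""
    in_rows, in_cols = len(frame), len(frame[0])
    out = []
    for r in range(out_rows):
        src_r = r * in_rows // out_rows          # nearest source row
        row = []
        for c in range(out_cols):
            src_c = c * in_cols // out_cols      # nearest source column
            # Rescale the 0-255 pixel value to the device's 0..max_level range.
            row.append(frame[src_r][src_c] * max_level // 255)
        out.append(row)
    return out

frame = [[0, 64, 128, 255],
         [255, 128, 64, 0]]                 # 2x4 tactile-video frame
drivers = resize_frame(frame, 2, 2, 10)     # 2x2 driver array, 10 levels
```

Here the left column of the driver array samples pixel column 0 and the right column samples pixel column 2, with each value scaled down to the 0-10 device range.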
  • FIG. 13 is a flowchart of a process in which active force data is provided to a user.
  • the components of the force data relating to the effects of active movement are described in the active force data description unit 450 (S700).
  • the user wears the force-feedback device and touches an object (S800).
  • the device control unit 300 provides active force information corresponding to the user's touch, based on the described components of the force data relating to the effects of active movement (S900).
  • passive force data is generated by the passive force data generation unit 717 (S1000), and is then multiplexed and transmitted to the user in stream form.
  • in the passive force data description unit 450, the components of the force data relating to the effects of passive movement are described (S1100); based on these components, the transmitted passive force sensation is provided to the user through the force-feedback device (S1200).
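The active-mode loop above, in which the controller computes a force from described object properties when the user touches an object, can be sketched with a simple spring model. The model, the variable names, and all numeric values are illustrative assumptions; the patent does not specify a particular force law.

```python
# Illustrative sketch of the active mode: force-data components describing
# an object's haptic properties (here only a stiffness coefficient) are
# stored in advance; when the user's probe penetrates the object surface,
# the device controller computes a restoring force for the force-feedback
# device. The Hooke's-law model and the values below are assumptions.

active_force_data = {"surface_y": 0.0, "stiffness": 300.0}  # N/m, assumed

def controller_force(probe_y, data):
    """Restoring force for a probe at height probe_y against a flat surface."""
    penetration = data["surface_y"] - probe_y   # depth of probe inside object
    if penetration <= 0.0:
        return 0.0                              # no contact: no force output
    return data["stiffness"] * penetration      # spring-like restoring force

# Probe above the surface, exactly on it, then 5 mm inside it.
forces = [controller_force(y, active_force_data) for y in (0.01, 0.0, -0.005)]
```

Only the penetrating position produces a nonzero force, mirroring the description: the described force-data components are consulted only once the user actually touches the object.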

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • User Interface Of Digital Computer (AREA)
  • Mechanical Engineering (AREA)
  • Two-Way Televisions, Distribution Of Moving Picture Or The Like (AREA)

Abstract

Disclosed are a method for expressing haptic information and a haptic information transmission system using sensory information classification. The method for expressing haptic information using sensory information comprises: generating object data so as to generate a scene descriptor defining the temporal position of media information containing audio and video information, and of haptic information containing tactile information and force information corresponding to the media information; encoding the object data thus generated; multiplexing the object data thus generated to produce a single communication file; and expressing the haptic information through a tactile device and a force-presentation device. The method further comprises the steps of: defining force data on the effect of passive movement and force data on the effect of active movement, the force data being represented by said force information; describing the components of the force data on the effect of active movement; storing control information containing the characteristics and size of the haptic device comprising said tactile device and said force-presentation device; expressing the information of said scene descriptor by means of said haptic device while referring to said control information; and producing the effect of active movement on the basis of the previously described components of the force data on the effect of active movement.
PCT/KR2010/008906 2009-12-11 2010-12-13 Procédé pour exprimer des informations haptiques et système de transmission d'informations haptiques au moyen d'une classification d'informations sensorielles WO2011071351A2 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
KR10-2009-0123534 2009-12-11
KR20090123534 2009-12-11

Publications (3)

Publication Number Publication Date
WO2011071351A2 true WO2011071351A2 (fr) 2011-06-16
WO2011071351A9 WO2011071351A9 (fr) 2011-08-04
WO2011071351A3 WO2011071351A3 (fr) 2011-11-10

Family

ID=44146079

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/KR2010/008906 WO2011071351A2 (fr) 2009-12-11 2010-12-13 Procédé pour exprimer des informations haptiques et système de transmission d'informations haptiques au moyen d'une classification d'informations sensorielles

Country Status (2)

Country Link
KR (1) KR101239368B1 (fr)
WO (1) WO2011071351A2 (fr)

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR100835297B1 (ko) * 2007-03-02 2008-06-05 광주과학기술원 촉감 정보 표현을 위한 노드 구조 및 이를 이용한 촉감정보 전송 방법과 시스템
KR20080080777A (ko) * 2007-03-02 2008-09-05 광주과학기술원 촉감 정보 저작 방법과 장치, 및 컴퓨터로 판독가능한 기록매체

Family Cites Families (1)

Publication number Priority date Publication date Assignee Title
KR100860412B1 (ko) * 2007-02-02 2008-09-26 한국전자통신연구원 촉각체험 서비스 방법 및 그 시스템

Non-Patent Citations (2)

Title
JUN-SEOK PARK ET AL.: 'Development of Smart Haptic Interface Device', ETRI, February 2007, pages 8-206 *
LEE, BEOM CHAN ET AL.: 'Development of K-touch API for kinesthetic/tactile haptic interaction', 2006, pages 1-8 *

Also Published As

Publication number Publication date
WO2011071351A9 (fr) 2011-08-04
KR20110066893A (ko) 2011-06-17
KR101239368B1 (ko) 2013-03-05
WO2011071351A3 (fr) 2011-11-10

Similar Documents

Publication Publication Date Title
WO2011071275A2 (fr) Procédé d'expression d'informations haptiques à l'aide d'informations de commande, et système de transmission d'informations haptiques
EP2132618B1 (fr) Système pour transmettre des informations tactiles
WO2010095835A2 (fr) Procédé et appareil de traitement d'une image vidéo
US9197840B2 (en) Head mount display and method for controlling head mount display
WO2018182321A1 (fr) Procédé et appareil de restitution de texte et de graphiques synchronisés dans une vidéo de réalité virtuelle
WO2018169367A1 (fr) Procédé et appareil de conditionnement et de diffusion en continu de contenu multimédia de réalité virtuelle
EP2441269A2 (fr) Procédé de traitement de signal et appareil correspondant utilisant la taille de l'écran d'un dispositif d'affichage
WO2018021707A1 (fr) Système de publicité vidéo vr et système de production de publicité vr
KR101239370B1 (ko) 가상 환경의 햅틱 속성의 정의를 통한 촉각 정보 표현 방법 및 촉각 정보 전송 시스템
WO2018182190A1 (fr) Utilisation de carillons pour l'identification de roi dans une vidéo à 360 degrés
WO2018088730A1 (fr) Appareil d'affichage, et procédé de commande correspondant
WO2011071352A2 (fr) Procédé pour exprimer des informations haptiques et système de transmission d'informations haptiques au moyen de définition de format de données
WO2019035581A1 (fr) Serveur, dispositif d'affichage et procédé de commande s'y rapportant
US20090309827A1 (en) Method and apparatus for authoring tactile information, and computer readable medium including the method
WO2020226233A1 (fr) Dispositif de commande de réalité virtuelle (rv) ou de jeu utilisant une roue haptique, son procédé de commande et système de rv le comprenant
WO2021167252A1 (fr) Système et procédé pour fournir un contenu de réalité virtuelle (rv) pour une réduction du mal des transports
WO2011071351A2 (fr) Procédé pour exprimer des informations haptiques et système de transmission d'informations haptiques au moyen d'une classification d'informations sensorielles
WO2018159981A1 (fr) Procédé de reproduction d'image de réalité virtuelle et programme d'utilisation associé
EP2319247A2 (fr) Procédés et appareil permettant de traiter et d'afficher une image
WO2018194320A1 (fr) Dispositif de commande audio spatial selon le suivi du regard et procédé associé
WO2019066591A1 (fr) Procédé permettant de fournir une image de réalité virtuelle et programme l'utilisant
WO2021060019A1 (fr) Dispositif de traitement d'informations, dispositif de traitement d'informations, procédé de traitement d'informations, dispositif serveur et programme
KR101250503B1 (ko) Mpeg-v의 아키텍쳐를 통한 촉각 정보 표현 방법 및 촉각 정보 전송 시스템
JP2006166362A (ja) 音響装置
CN206517592U (zh) 一种交互式3d音频系统

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 10836246

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase in:

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 10836246

Country of ref document: EP

Kind code of ref document: A2