KR101239830B1 - Method for representing haptic information and system for transmitting haptic information through defining data formats - Google Patents


Info

Publication number
KR101239830B1
Authority
KR
South Korea
Prior art keywords
information
tactile
tactile information
virtual environment
actuator
Prior art date
Application number
KR20100127100A
Other languages
Korean (ko)
Other versions
KR20110066895A (en)
Inventor
류제하
김영미
Original Assignee
광주과학기술원
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 광주과학기술원
Publication of KR20110066895A
Application granted
Publication of KR101239830B1

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/017Gesture based interaction, e.g. based on a set of recognized hand gestures
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G06F3/014Hand-worn input/output arrangements, e.g. data gloves
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/016Input arrangements with force or tactile feedback as computer generated output to the user

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • User Interface Of Digital Computer (AREA)
  • Mechanical Engineering (AREA)

Abstract

Provided are a method for representing haptic information and a system for transmitting haptic information through the definition of data formats. In the method, haptic information comprising a virtual environment together with the tactile and kinesthetic information corresponding to that environment is transmitted and presented through a haptic device that includes a tactile device and a kinesthetic (force-feedback) device. The method comprises: detecting whether the virtual environment is touched; generating sensing information calculated from the touch; generating feedback haptic information from relational expressions set in the virtual environment using the generated sensing information; storing control information including the characteristics of the haptic device (the tactile device and the kinesthetic device), the specifications of the haptic device, and user preferences; resizing the feedback haptic information with reference to the control information; and transmitting a device command to the haptic device using the resized feedback haptic information.

Description

Method for representing haptic information and system for transmitting haptic information through defining data formats

The present invention relates to haptic technology, and more particularly, to a haptic information representation method and transmission system that add a sense of touch to audio-video multimedia so that the user experiences greater realism.

Conventional audio-video based standards such as MPEG convey only visual and auditory information and provide no effect that lets a user feel immersion through touch. Although haptic technology is partially employed in mobile phones, medical devices, and games, many technical problems must be solved before it can be applied to audio/video streams and to various virtual environments.

The present invention has been made in an effort to provide a haptic information representation method that delivers more realistic multimedia to a user by clearly defining the data format of the interaction device used to implement the multimedia.

Another object of the present invention is to provide a haptic information representation system that delivers more realistic multimedia to a user by clearly defining the data format of the interaction device used to implement the multimedia.

The technical objects of the present invention are not limited to those mentioned above; other objects not mentioned will be clearly understood by those skilled in the art from the following description.

According to an aspect of the present invention, there is provided a haptic information representation method through the definition of a data format. The method represents haptic information through a haptic device including a tactile device and a kinesthetic device by transmitting haptic information that includes a virtual environment together with the tactile and kinesthetic information corresponding to the virtual environment. The method comprises: storing control information including the characteristics of the haptic device (including the tactile device and the kinesthetic device), the specifications of the haptic device, and user preferences; resizing the haptic information with reference to the control information; and transmitting a device command to the haptic device using the resized haptic information.

According to another aspect of the present invention, there is provided a haptic information transmission system through the definition of a data format. The system represents haptic information using a haptic device including a tactile device and a kinesthetic device, and comprises: means for transmitting haptic information that includes a virtual environment together with the tactile and kinesthetic information corresponding to the virtual environment; means for storing control information including the characteristics of the haptic device (including the tactile device and the kinesthetic device), the specifications of the haptic device, and user preferences; means for resizing the haptic information with reference to the control information; and means for transmitting a device command to the haptic device using the resized haptic information.

According to another aspect of the present invention, there is provided a haptic information transmission method through the definition of a data format. The method represents haptic information using a haptic device including a tactile device and a kinesthetic device by transmitting haptic information that includes a virtual environment together with the tactile and kinesthetic information corresponding to the virtual environment, and comprises: detecting whether the virtual environment is touched; generating sensing information calculated from the touch; generating feedback haptic information from relational expressions set in the virtual environment using the generated sensing information; storing control information including the characteristics of the haptic device (including the tactile device and the kinesthetic device), the specifications of the haptic device, and user preferences; resizing the feedback haptic information with reference to the control information; and transmitting a device command to the haptic device using the resized feedback haptic information.

According to another aspect of the present invention, there is provided a haptic information transmission system through the definition of a data format. The system represents haptic information using a haptic device including a tactile device and a kinesthetic device, and comprises: means for transmitting haptic information that includes a virtual environment together with the tactile and kinesthetic information corresponding to the virtual environment; means for detecting whether the virtual environment is touched; means for generating sensing information calculated from the touch; means for generating feedback haptic information from a relational expression set in the virtual environment using the generated sensing information; means for storing control information including the characteristics of the haptic device (including the tactile device and the kinesthetic device), the specifications of the haptic device, and user preferences; means for resizing the feedback haptic information with reference to the control information; and means for transmitting a device command to the haptic device using the resized feedback haptic information.

By clearly defining the data format of the interaction device used to implement the multimedia, a haptic information representation method that provides more realistic multimedia to the user can be achieved.

Another effect is that, by clearly defining the data format of the interaction device used to implement the multimedia, a haptic information representation system that provides more realistic multimedia to the user can be achieved.

1 is a view showing a tactile device including a driver according to an embodiment of the present invention.
2 is a perspective view of an apparatus for providing a sense of force according to an embodiment of the present invention.
3 is a diagram illustrating a driver array and a tactile video corresponding thereto according to an embodiment of the present invention.
4 is a diagram illustrating an example of generating a tactile video based on a video.
5 is a diagram illustrating an example of a MovieTexture node of a scene descriptor in MPEG-4.
6 is a diagram illustrating a TactileDisplay node for representing tactile information.
FIG. 7 illustrates a TactileDisplay node connected to a MovieTexture node to define a tactile video object.
8 is a diagram illustrating a TactileDisplayTexture node for representing tactile information.
9 and 10 are diagrams illustrating a Kinesthetic node for representing kinesthetic information.
11 is a block diagram of a tactile information transmission system according to an embodiment of the present invention.
12 is a flowchart illustrating a process in which resized tactile information is transmitted to a haptic device as a device command according to an embodiment of the present invention.
13 is a flowchart of a process in which feedback tactile information is transmitted to a haptic device as a device command.

The advantages and features of the present invention, and the manner of achieving them, will become apparent with reference to the embodiments described in detail below together with the accompanying drawings. The present invention may, however, be embodied in many different forms and should not be construed as limited to the embodiments set forth herein; rather, these embodiments are provided so that this disclosure will be thorough and complete and will fully convey the scope of the invention to those skilled in the art, and the invention is defined only by the scope of the claims. Like reference numerals refer to like elements throughout.

When an element or layer is referred to as being "on" another element or layer, it can be directly on the other element or layer, or intervening elements or layers may be present. In contrast, an element referred to as being "directly on" another element has no intervening elements or layers. Like reference numerals refer to like elements throughout. "And/or" includes each of the mentioned items and every combination of one or more of them.

Although the terms first, second, and so on are used to describe various components, these components are, of course, not limited by such terms, which serve only to distinguish one component from another. Accordingly, a first component mentioned below could be a second component within the technical scope of the present invention.

The terminology used herein is for the purpose of describing particular embodiments only and is not intended to limit the invention. In this specification, singular forms include plural forms unless the context indicates otherwise. The terms "comprises" and/or "comprising" as used in this specification do not exclude the presence or addition of one or more elements other than those stated.

Unless otherwise defined, all terms (including technical and scientific terms) used in this specification have meanings that can be commonly understood by those skilled in the art. Commonly used, predefined terms are not to be interpreted ideally or excessively unless explicitly defined otherwise.

Spatially relative terms such as "below", "beneath", "lower", "above", and "upper" can be used to conveniently describe the relationship between one element and other elements. Spatially relative terms should be understood as encompassing, in addition to the orientation shown in the drawings, the different orientations of components in use or operation. For example, if an element shown in the figures is turned over, an element described as "below" or "beneath" another element would then be "above" that element; thus, the exemplary term "below" can encompass both downward and upward orientations. Components may also be oriented in other directions, and the spatially relative terms are interpreted according to orientation.

1 is a view showing a tactile device including drivers. 2 is a diagram illustrating a kinesthetic (force-feedback) device.

Referring to FIGS. 1 and 2, the tactile device 100 includes tactile presentation units 120 and 130, drivers 200, a device transceiver 350, and a device specification DB 400. In addition, the kinesthetic device 150 may consist of a plurality of actuators (not shown) for presenting kinesthetic information to the user.

The haptic sense may be roughly classified into tactile information, which includes vibration, heat, and current, and kinesthetic information, which includes force, torque, and stiffness. The concept that encompasses both tactile information and kinesthetic information is haptic information. Likewise, the device for presenting the tactile sense is a tactile device and the device for presenting the kinesthetic sense is a kinesthetic device; together they may be referred to as a haptic device.

In addition, kinesthetic data may be classified into data for passive-movement effects and data for active-movement effects. This division is defined from the user's point of view: a passive-movement effect means that, when the user wears the haptic device, pre-generated haptic information is provided to the user through the device. An active-movement effect, on the other hand, means that when a user wearing the haptic device touches an object of his or her own volition, the haptic information set on that object is provided through the device.

The tactile presentation units 120 and 130 comprise left and right units, and each may include a plurality of drivers 200. The tactile presentation units 120 and 130 may be implemented in the form of a glove to be worn by a user; however, the present invention is not limited thereto, and they may be provided in various forms. Depending on the form of tactile presentation, the units may also be implemented as a hat worn on the head or as shoes, or be attached to the arms, legs, back, or waist.

The drivers 200 may be disposed in the tactile presentation units 120 and 130 and may operate by vibrotactile stimulation or pneumatic tactile stimulation. In the vibrotactile case, a driver may consist of an eccentric motor, a linear motor, a voice coil, an ultrasonic motor, or a piezoelectric element; in the pneumatic case, it may take the form of a nozzle that supplies air or of a pneumatic membrane.

The device controller 300 controls the drivers 200: it receives the driving signal generated by the main controller (not shown) in the compositor 776 and controls the operation of the drivers 200 accordingly.

In addition, in the active haptic interaction mode, when the user wearing the haptic device touches an object in the virtual environment, the device controller 300 receives notice of the touch from the touch sensing unit 440. The device controller 300 then generates feedback haptic information through a dynamic relation set on the object.

For example, assume that the feedback haptic information is obtained using the relation F = kx. This relational expression is stored in the device controller 300, and its coefficients are set differently in the device controller 300 for each object: for a cat, k = 0.2, and for a dog, k = 0.8. If the user penetrates a depth x = 3 into the object, the feedback force is 0.6 for the cat and 2.4 for the dog.

Of course, in addition to the relation F = kx above, other haptic rendering methods and force-calculation equations may be applied in the device controller to generate feedback haptic information.
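The spring-law example above can be sketched in code. The per-object stiffness table, function name, and structure below are illustrative assumptions, not part of the patent:

```python
# Illustrative sketch of the F = k * x feedback step, with the cat/dog
# stiffness coefficients from the example (k = 0.2 and k = 0.8).
OBJECT_STIFFNESS = {"cat": 0.2, "dog": 0.8}

def feedback_force(obj: str, penetration_depth: float) -> float:
    """Spring-model haptic rendering: F = k * x for the touched object."""
    k = OBJECT_STIFFNESS[obj]
    return k * penetration_depth

# Reproducing the example with x = 3: ~0.6 for the cat, ~2.4 for the dog
print(feedback_force("cat", 3.0), feedback_force("dog", 3.0))
```

A different rendering model would simply replace the body of `feedback_force`, with the coefficient lookup per object kept as-is.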

The device transceiver 350 transmits and receives control signals for controlling the device and passes them to the device controller 300.

The device specification DB 400 stores information about the tactile and kinesthetic devices 100 and 150. The kinesthetic device 150 presents force, torque, and the like, and the tactile device 100 presents vibration, heat, current, and the like; together they constitute the haptic device. That is, the device specification DB 400 stores information about the haptic devices 100 and 150.

For the tactile device 100, the device specification DB 400 stores the type of tactile device, the unit of the quantity the device presents, the maximum/minimum intensity the device can present, the numbers of drivers arranged in the device in the horizontal and vertical directions, the horizontal and vertical gaps between the drivers, and the number of intensity levels the device can present.

For the kinesthetic device 150, the DB stores the units and maximum/minimum values of the force/torque/stiffness the device provides on each axis, its degrees of freedom, and its workspace. The degrees of freedom indicate whether independent translational/rotational motion is allowed in the X/Y/Z directions, and the workspace means the range over which the kinesthetic device can perform translational and rotational motion. The translational range may be defined in mm as the maximum range over which the device can translate along the X/Y/Z axes, and the rotational range as the maximum roll/pitch/yaw angles in degrees. However, the units are not limited to those mentioned above.
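The specification fields listed in the two paragraphs above might be collected in structures such as the following sketch. All class and field names are assumptions made for illustration; the patent does not prescribe this layout:

```python
# Illustrative sketch of what the device specification DB might hold.
from dataclasses import dataclass

@dataclass
class TactileDeviceSpec:
    device_type: str        # e.g. "vibrotactile glove"
    unit: str               # unit of the presented quantity
    min_intensity: float
    max_intensity: float
    drivers_x: int          # horizontal driver count
    drivers_y: int          # vertical driver count
    gap_x_mm: float         # horizontal gap between drivers
    gap_y_mm: float         # vertical gap between drivers
    intensity_levels: int   # number of presentable intensity steps

@dataclass
class KinestheticDeviceSpec:
    dof_translation: tuple  # independent translation allowed per X/Y/Z axis
    dof_rotation: tuple     # independent rotation allowed per roll/pitch/yaw
    max_force_n: float
    max_torque_nm: float
    max_stiffness: float
    workspace_mm: tuple     # max translation range per axis, in mm
    workspace_deg: tuple    # max rotation range per axis, in degrees

# Example: the 4 x 10 glove array described later in the document
glove = TactileDeviceSpec("vibrotactile glove", "level",
                          0.0, 255.0, 10, 4, 10.0, 10.0, 8)
```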

The virtual environment is generated by the virtual environment generator 712 and includes virtual objects (e.g., avatars), video, and audio information. The tactile video 600 mapped to the media (video, audio) information in the virtual environment, the kinesthetic data, and the scene descriptor information are resized in consideration of the device specifications stored in the device specification DB 400, and the haptic sensation is provided accordingly. This will be described later.

In addition, the compositor 776 is provided with a main controller (not shown) that generates signals for controlling the drivers 200 of the tactile device 100, and with a main transceiver that transmits the control signals to the device transceiver 350 of the tactile device 100.

The main controller generates a control signal for each driver 200 and transmits it to the device controller 300 through the main transceiver and the device transceiver 350, and the device controller 300 controls the driving of each driver 200 based on the received control signal. Here, the main transceiver and the device transceiver 350 may be connected by wired or wireless communication.

The driving of each driver 200 can be controlled by specifying its driving strength, so tactile information can be presented to the user by transmitting driving-strength information for each driver 200 to the device controller 300. The main controller transmits the driving-intensity information of each driver to the device controller. In the present invention, the intensity information for driving each driver 200 is handled by the main controller in the form of a tactile video; each time the frame changes, each pixel value may be converted and transmitted to the device controller 300.

3 is a diagram illustrating a driver array and a tactile video corresponding thereto.

Referring to FIG. 3, the left tactile presentation unit 120 and the right tactile presentation unit 130 are each provided with 4 × 5 drivers, which together can be expressed as a 4 × 10 driver array 500. That is, the combination of drivers may be represented in the form of a rectangular array, and the tactile video 600 includes a pixel corresponding to each driver.

Each pixel of the tactile video 600 carries intensity information, and that intensity corresponds to the driving intensity of the driver corresponding to the pixel. When the tactile video 600 is represented as a gray-scale (black-and-white) video, each pixel has an intensity of 0 to 255, and the drivers 200 are driven based on this information: for example, a driver corresponding to a white pixel vibrates strongly, and a driver corresponding to a black pixel vibrates weakly.

When the driver array 500 of the tactile device 100 and the pixels of the tactile video 600 correspond 1:1, the intensity of each pixel and the driving intensity of each driver 200 can be matched 1:1. However, if the dimensions of the tactile video 600 are larger than those of the driver array 500, the video must be resized according to the ratio. That is, when there is a difference between the requested tactile information and the presentable tactile information, the device controller 300 may perform resizing.

The resizing is performed by the device controller 300 using the device specifications stored in the device specification DB 400. Control information is a concept that includes the device specifications and the user preferences; that is, resizing refers to adjusting the haptic information with reference to the control information before providing it. The user preferences may be set in the device controller.

For example, if the tactile video 600 has dimensions of 320 × 240 and the driver array 500 of the tactile device 100 has dimensions of 10 × 4, then the 320 × 240-pixel tactile video is scaled down to 10 × 4 pixels so as to correspond 1:1 with the driver array 500. In this case, the intensity of each pixel of the scaled tactile video may be obtained by averaging the intensities of the corresponding pixels before scaling.
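The averaging step above amounts to a block average over the pixel regions that map to each driver. The following pure-Python sketch illustrates this; the function name and integer-mean policy are illustrative choices, not the patent's normative algorithm:

```python
# Block-average a large tactile-video frame down to the driver-array size.
def resize_frame(frame, out_rows, out_cols):
    """Average a 2-D list of 0-255 intensities into out_rows x out_cols."""
    in_rows, in_cols = len(frame), len(frame[0])
    bh, bw = in_rows // out_rows, in_cols // out_cols  # block height/width
    out = []
    for r in range(out_rows):
        row = []
        for c in range(out_cols):
            block = [frame[r * bh + i][c * bw + j]
                     for i in range(bh) for j in range(bw)]
            row.append(sum(block) // len(block))  # mean intensity of block
        out.append(row)
    return out

# A 240-row x 320-column frame of uniform intensity 100 maps to a
# 4 x 10 driver array whose every entry is still 100.
frame = [[100] * 320 for _ in range(240)]
resized = resize_frame(frame, 4, 10)
```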

Since the tactile video 600 has the same format as a general color or monochrome video, it can be transmitted using general video encoding and decoding methods. The tactile video 600 consists of a plurality of frames, and the intensity information of the pixels in each frame corresponds to the driving strength of each driver 200 of the tactile presentation device.

In the case of kinesthetic data, the three- or six-degree-of-freedom movement of a tool manipulated by an expert in the workspace is recorded. To record the expert's movement and force, a robot arm fitted at its end with the tool used by the expert is employed: an encoder at each joint of the robot arm yields the position data of the tool, and torque sensors determine the force/torque the expert applies to the tool. The kinesthetic data for force reproduction is thus a series of position data and force data, and includes the time at which each sample was taken.
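The recorded kinesthetic data described above is, in essence, a time-stamped series of position and force/torque samples. The record layout below is an illustrative assumption; field names and units are not prescribed by the patent:

```python
# Illustrative record for one sample of recorded kinesthetic data.
from dataclasses import dataclass

@dataclass
class KinestheticSample:
    t: float            # sample time, in seconds
    position: tuple     # (x, y, z) tool position, e.g. in mm
    orientation: tuple  # (roll, pitch, yaw), e.g. in degrees
    force: tuple        # (fx, fy, fz) applied force, in N
    torque: tuple       # (tx, ty, tz) applied torque, in N*m

# A recording is simply a time-ordered list of such samples,
# here three samples at a nominal 100 Hz:
trajectory = [KinestheticSample(t=i * 0.01,
                                position=(0.0, 0.0, float(i)),
                                orientation=(0.0, 0.0, 0.0),
                                force=(0.0, 0.0, 1.0),
                                torque=(0.0, 0.0, 0.0))
              for i in range(3)]
```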

Similarly, the device controller 300 may perform resizing using the information about the kinesthetic device 150 stored in the device specification DB 400; that is, when there is a difference between the required kinesthetic information and the presentable kinesthetic information, the device controller 300 may resize it. For example, if motion information that moves 1 m in the X-axis direction is to be provided but the actual user's device has a workspace half that of the required device, all motion information can be scaled by one half along the X, Y, and Z axes before being delivered to the user. Likewise, if the content requires the kinesthetic device 150 to provide a force of 10 N in the X-axis direction but the actual device can provide only 5 N, the force provided by the device 150 may be adjusted according to the ratio.
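The workspace example above (1 m of authored motion on a half-size workspace) can be sketched as a uniform scaling of position data; the function name and tuple layout are illustrative assumptions:

```python
# Scale authored motion to the actual device's workspace.
def scale_motion(position_mm, workspace_ratio):
    """Scale an (x, y, z) position by the workspace ratio on every axis."""
    return tuple(p * workspace_ratio for p in position_mm)

# 1 m (1000 mm) of authored X-axis motion, device workspace half as large:
scaled = scale_motion((1000.0, 0.0, 0.0), 0.5)  # -> (500.0, 0.0, 0.0)
```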

Turning to user preferences: for example, a user may dislike temperatures above 30 degrees, may not want a current of more than 0.5 mA to flow, or may dislike a provided force greater than 5 N.

User preferences take priority over the device specification information. Therefore, even if transmitted haptic information of 10 N has been adjusted to 5 N in consideration of the device specification, if the user preference is 3 N, the force the haptic device provides to the user may be 3 N; as a result, the 10 N haptic information is resized to 3 N.

As mentioned earlier, in addition to resizing considering device specifications, additional resizing may be performed in consideration of user preference.
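The two-stage resizing described above, device limit first and then user preference with priority, can be sketched as follows. The simple `min()` clamping policy and the names are illustrative assumptions:

```python
# Two-stage force resizing: device specification, then user preference.
def resize_force(requested_n, device_max_n, user_pref_max_n=None):
    force = min(requested_n, device_max_n)   # clamp to device specification
    if user_pref_max_n is not None:
        force = min(force, user_pref_max_n)  # user preference has priority
    return force

# Reproducing the example: 10 N requested, 5 N device limit, 3 N preference
final = resize_force(10.0, 5.0, 3.0)  # -> 3.0
```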

4 is a diagram illustrating an example of generating a tactile video based on a video.

Referring to FIG. 4, positioned to the right of an image from a movie is a movie-adapted tactile video 850 generated by a tactile editing/authoring tool, and shown to the right of that tactile video is the corresponding driver array 900. The tactile video 850 is generated by the tactile editing/authoring tool from the video/audio information, applying a different tactile intensity (0 to 255) to each frame so as to exactly match the number of video frames.

The generated tactile video is reproduced in accordance with the device specification: it is converted according to the size of the device and the number of tactile intensity levels the device can provide. For example, if the tactile video is authored with 256 intensity steps but the device can reproduce only 8, the 256 steps are divided into 8 steps for playback.
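The 256-to-8 step conversion above amounts to a quantization of the authored intensity scale onto the device's scale; the uniform-bucket mapping below is an illustrative choice, not the patent's normative formula:

```python
# Map an authored intensity onto the coarser scale a device can present.
def quantize_intensity(value, source_levels=256, device_levels=8):
    """Map value in [0, source_levels) to a level in [0, device_levels)."""
    return value * device_levels // source_levels

# black -> weakest level, white -> strongest level
lo, mid, hi = quantize_intensity(0), quantize_intensity(128), quantize_intensity(255)
```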

The drawing shows a scene in which an actor jumps from right to left, illustrating an example in which the actor's movement is conveyed to the user by touch from the viewpoint of a third-person observer. A tactile video may be authored not only from the third-person observer's viewpoint but also from the first-person protagonist's viewpoint or as a tactile background effect. In the first frame, as the actor starts to jump, the tactile video is mapped to black, and the driver array presents a weak tactile intensity corresponding to the mapped color. In the last frame, as the actor lands on the left after a strong jump, the tactile video is mapped to white, and the driver array correspondingly presents a strong tactile intensity.

5 is a diagram illustrating an example of a MovieTexture node of a scene descriptor in MPEG-4. In the present invention, tactile information is transmitted along with general media (audio and video) information. Hereinafter, a node structure, a transmission method, and a system for transmitting tactile information expressed in the form of tactile video together with media information will be described.

In MPEG-4, information for representing an object is transmitted through a plurality of elementary streams (ES), and the correlation and link-configuration information between the elementary streams is transmitted by object descriptors defined in MPEG-4.

To compose a scene based on MPEG-4, an Initial Object Descriptor (IOD), a scene descriptor (BIFS, Binary Format for Scenes), object descriptors, and media data are generally required. The IOD is the first information transmitted to compose an MPEG-4 scene; it describes the profile and level of each medium and contains the ES descriptors for the scene descriptor (BIFS) stream and the object descriptor stream.

An object descriptor is a set of elementary stream descriptors that describe the media data constituting the scene, and it provides the connection between the elementary stream (ES) of each media item and the scene description. The scene descriptor (BIFS) is the information describing the spatial and temporal relationships among the objects.

In MPEG-4, the Scene Descriptor (BIFS) is provided with a MovieTexture node that defines a video object.

Referring to FIG. 5, in the MovieTexture node, startTime indicates the time at which the video starts playing and stopTime indicates the time at which playback stops; these allow the video to be synchronized with other objects. The url field sets the location of the video.

The TactileDisplay node is defined to transmit tactile video using the MovieTexture node of this scene descriptor.

6 is a diagram illustrating a TactileDisplay node for representing tactile information. FIG. 7 illustrates a TactileDisplay node connected to a MovieTexture node to define a tactile video object.

Referring to FIGS. 6 and 7, the TactileDisplay node is a kind of texture node. In FIG. 6, the "url" field indicates the location of the tactile video, the "startTime" field indicates the start time, and the "stopTime" field indicates the end time. In other words, a tactile video object is defined by connecting a MovieTexture node to the texture field of the TactileDisplay node.

In the example of FIG. 7, the tactile video set to "tactile_video.avi" is played on the tactile presentation device from 3 seconds to 7 seconds after playback begins.
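As a rough sketch, the node wiring of FIG. 7 can be mirrored as nested mappings. BIFS nodes are binary-coded in practice, so this is only an illustration of the fields involved; the file name is the example name from the figure:

```python
# Illustrative representation of a TactileDisplay node whose texture field
# holds a MovieTexture node, as described for FIGS. 6 and 7.
tactile_display = {
    "node": "TactileDisplay",
    "texture": {                      # MovieTexture node in the texture field
        "node": "MovieTexture",
        "url": "tactile_video.avi",   # location of the tactile video
        "startTime": 3,               # seconds: playback begins
        "stopTime": 7,                # seconds: playback stops
    },
}
```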

FIG. 8 is a diagram illustrating a TactileDisplayTexture node for representing tactile information.

Referring to FIG. 8, the MPEG-4 scene descriptor (BIFS) newly defines a TactileDisplayTexture node for transmitting tactile video. TactileDisplayTexture defines the playback start and stop times of the tactile video file, and the "url" field indicates the location of the tactile video file.

FIGS. 9 and 10 are diagrams illustrating force-feedback nodes. Like the tactile nodes described above, these nodes may define an object for the force-feedback data.

FIG. 11 is a block diagram of a tactile information transmission system according to another embodiment of the present invention.

Referring to FIG. 11, the tactile information transmission system includes an object data generator 710, an encoder unit 720, a multiplexer (MUX) 730, a transmission channel 740, a demultiplexer (DEMUX) 750, a decoder unit 760, and a playback unit 770.

The object data generator 710 generates, edits, or authors a virtual environment, a tactile video corresponding to the virtual environment, and force-feedback data. The virtual environment includes virtual objects (e.g., avatars and other virtual items) and media (audio, video).

The tactile video in the tactile video generator 716 may be generated automatically according to the type of the audio or video information, or may be authored directly by a user based on the audio or video.

The tactile video generated by the tactile video generator 716 and the force-feedback data generated by the force-feedback data generator 717 are edited and authored together with the media (audio and video) information of the virtual environment by the editing/authoring unit, and are positioned along a time axis or an event axis. The editing/authoring unit 718 then generates a scene descriptor according to the spatiotemporal positions of the virtual environment, the tactile video, and the force-feedback data.

The encoder unit 720 encodes the virtual environment (including audio and video), the tactile video, the force-feedback data, and the scene descriptor. The virtual environment is encoded in the virtual environment encoder 722. Because the tactile video corresponds to a kind of black-and-white video, it can be encoded by a general video encoding method; it is encoded in the tactile video encoder 726. The force-feedback data are encoded in the force-feedback data encoder 727.

In addition, the scene descriptor is encoded in the BIFS encoder 728. Such encoding can be performed with MPEG-4 audio and video encoding methods, although the encoding method is not limited to them. The information encoded by the encoder unit 720 is multiplexed by the multiplexer into a single MP4 file, which is transmitted through the transmission channel 740.

In the present invention, the transmission channel 740 should be understood as a concept encompassing both wired and wireless communication networks, and may be an IP network, a DMB broadcast network, or the Internet.

The MP4 file transmitted through the transmission channel 740 is demultiplexed by the demultiplexer 750 and decoded, item by item, by the decoder unit 760. The virtual environment decoder 762 decodes the virtual environment, the video decoder 764 decodes the video, the tactile video decoder 766 decodes the tactile video, the force-feedback data decoder 767 decodes the force-feedback data, and the BIFS decoder 768 decodes the scene descriptor.

The information decoded by the decoder unit 760 is reproduced by the playback unit 770. The playback unit 770 includes a compositor 776, a virtual environment output device 772, and a tactile presentation device. The compositor 776 arranges the transmitted objects, such as the virtual environment, the tactile video, and the force-feedback data, in time and space using the scene descriptor (BIFS) information. Based on this information, the virtual environment output device 772 outputs the virtual objects and the audio and video information, the tactile device 774 presents tactile information through its driver array, and the force-feedback data are presented as force-feedback information through the force-feedback device 774.

The tactile presentation device includes the tactile/force-feedback device 774, a device specification DB 400, a device command unit 420, a touch sensing unit 440, and a device control unit 300.

As mentioned above, the device specification DB 400 stores information on the specification of the tactile/force-feedback device 774, and the device control unit 300 resizes the transmitted tactile video information and force-feedback data based on the device specification so that the tactile/force-feedback device 774 can present the tactile and force sensations. The information including the characteristics and specification of the device may be stored in the device specification DB manually and/or automatically through the control unit.
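The device specification stored in the DB could be modeled as follows. The field names mirror the items enumerated in the claims (driver counts and gaps for the tactile driver array; per-axis maximum force, torque, and stiffness, degrees of freedom, and workspace for the force-feedback actuator), but the structure itself and the example numbers are illustrative assumptions:

```python
from dataclasses import dataclass
from typing import Tuple

@dataclass
class TactileDeviceSpec:
    drivers_horizontal: int        # number of drivers across the driver array
    drivers_vertical: int          # number of drivers down the driver array
    gap_horizontal_mm: float       # horizontal gap between drivers
    gap_vertical_mm: float         # vertical gap between drivers

@dataclass
class ForceFeedbackDeviceSpec:
    max_force_n: Tuple[float, ...]      # max force the actuator can provide per axis
    max_torque_nm: Tuple[float, ...]    # max torque per axis
    max_stiffness: Tuple[float, ...]    # max stiffness per axis
    degrees_of_freedom: int
    workspace_mm: Tuple[float, ...]     # workspace extent per axis

# Hypothetical entry as it might be stored in the device specification DB.
spec = ForceFeedbackDeviceSpec(
    max_force_n=(5.0, 5.0, 5.0),
    max_torque_nm=(0.5, 0.5, 0.5),
    max_stiffness=(1.0, 1.0, 1.0),
    degrees_of_freedom=3,
    workspace_mm=(160.0, 120.0, 70.0),
)
```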

The touch sensing unit 440 determines whether a user wearing the tactile device 774 has touched a virtual object in the virtual environment, and generates sensed information.

The criterion for determining whether the user has touched a virtual object is whether the user's location lies within the volume of the virtual object. For this determination, the user's location may be treated as a single point.
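The point-in-volume test described above can be sketched as follows. The patent does not specify how the object volume is represented; an axis-aligned bounding box is assumed here, and the helper name is illustrative:

```python
def is_touching(user_pos, box_min, box_max):
    """Return True when the user's point position lies inside the
    axis-aligned bounding box assumed for a virtual object's volume."""
    return all(lo <= p <= hi for p, lo, hi in zip(user_pos, box_min, box_max))

# The user's location is treated as a single point, as in the text.
inside = is_touching((1.0, 1.0, 1.0), (0.0, 0.0, 0.0), (2.0, 2.0, 2.0))
```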

The sensed information may include the user's location in the virtual space, the distance the user has penetrated into a virtual object after colliding with it, and the magnitude of the force the user applies to the virtual environment.

The device command unit 420 converts the resized tactile information into percentage (%) form and transmits the resulting device command to the tactile device 774. Device commands can take two forms.

In the first form, the tactile information generated by the object data generator 710 is resized by the device control unit 300 with reference to the control information, which includes the device specification and the user preference, and is then delivered to the haptic device 774 as a device command in percentage form. In the second form, when the user touches the virtual environment, feedback tactile information is generated from the sensed information produced by the touch, the generated feedback tactile information is resized, and a device command is then transmitted.

For example, in the first case, force-feedback information of 10 N generated by the force-feedback data generator 717 is transmitted through the transmission channel 740. If, after consulting the device specification, the device control unit 300 finds that the maximum force the device can provide is 5 N, it resizes the transmitted 10 N to 5 N.

However, if a user preference indicating that the user does not want to receive more than 3 N of force is set in the device control unit 300, the 5 N obtained by resizing against the device specification is readjusted to 3 N, because the user preference is control information with a higher priority than the device specification.

Thereafter, since the device specification is 5 N and the force to be provided is 3 N, the device command unit 420 computes 3/5 = 0.6 and transmits a device command of 60% to the force-feedback device 774.
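The first-case calculation (10 N transmitted, 5 N device maximum, 3 N user preference, 60% command) can be sketched as below. The function names are illustrative, not from the patent:

```python
def resize(value, device_max, user_pref=None):
    """Clamp the transmitted force to the device maximum, then to the
    user preference, which has higher priority than the device spec."""
    value = min(value, device_max)
    if user_pref is not None:
        value = min(value, user_pref)
    return value

def device_command_percent(value, device_max):
    """Express the force to be presented as a percentage of the device maximum."""
    return 100.0 * value / device_max

force = resize(10.0, device_max=5.0, user_pref=3.0)    # 10 N -> 5 N -> 3 N
percent = device_command_percent(force, device_max=5.0)  # 3/5 = 0.6 -> 60%
```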

In the second case, for example, when a user wearing the tactile device 774 touches a cat, which is a virtual object in the virtual environment, the touch sensing unit 440 determines the user's location and judges that the user has penetrated x = 10 mm into the object. Meanwhile, the relational expression F = kx is set in the device control unit 300, and for the virtual cat the coefficient of the expression is set to k = 2 N/mm. The device control unit 300 therefore calculates feedback tactile information of 20 N, and the calculated 20 N undergoes the resizing process.

If the maximum force the device can provide is 10 N and the user preference is 6 N, the user preference takes priority, and the 20 N of feedback tactile information is resized to 6 N. Thereafter, (resized value) / (maximum force the device can provide) = 6/10 = 0.6, and a device command of 60% is sent to the haptic device.
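The second-case feedback computation can be sketched as below. The numbers follow the reading that makes the worked arithmetic consistent (a coefficient of k = 2 N/mm so that F = 20 N, clamped to the 10 N device maximum and then to the 6 N user preference, giving 6/10 = 60%):

```python
def feedback_force(k, x):
    """Spring-like relational expression F = k * x set for a virtual object."""
    return k * x

f = feedback_force(k=2.0, x=10.0)   # 2 N/mm * 10 mm = 20 N
f = min(f, 10.0)                    # clamp to the device maximum
f = min(f, 6.0)                     # user preference has higher priority
percent = 100.0 * f / 10.0          # 6/10 = 0.6 -> 60% device command
```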

If the user exerts a force on the virtual object, the sensed information may take the form of a force, and feedback tactile information may then be provided to the user through a predetermined relational expression.

The present invention may also aim at defining a data format for an interaction device (e.g., a haptic device), which comprises a device command and sensed information.

A device command is a control command (in percentage form) sent to the device so that the transmitted sensory information can be provided to the user.

Sensed information refers to the values measured by various sensors (including position, velocity, acceleration, angle, angular velocity, angular acceleration, force, pressure, and torque sensors) when the user touches a virtual object.

Hereinafter, the process of applying this data format will be described with reference to flowcharts.

FIG. 12 is a flowchart illustrating a process in which the tactile information generated by the object data generator is converted into a device command and transmitted to the tactile device.

Referring to FIG. 12, the object data generator 710 generates tactile information corresponding to the virtual environment, and transmits the generated tactile information to the user through the transmission channel 740 (S700). The device controller 300 resizes the transmitted tactile information in consideration of device specifications and user preferences (S800). Thereafter, the device command unit 420 changes the resized tactile information into a percentage form to generate a device command, and transmits a device command to the haptic device 774 (S900).

FIG. 13 is a flowchart illustrating a process of converting feedback tactile information calculated based on sensing information generated by a user's touch into a device command and transmitting the same.

Referring to FIG. 13, the touch sensing unit 440 determines the user's location and detects whether the user has touched the virtual object (S1000). The touch sensing unit 440 then generates sensed information from the user's touch (S1100). A relational expression for the specific virtual object is set in the device control unit 300, and the device control unit generates feedback tactile information by substituting the sensed information into this expression (S1200).

Thereafter, the device control unit resizes the feedback tactile information with reference to the control information (S1400). The device command unit 420 transmits the resized feedback tactile information to the haptic device 774 as a device command in percentage form (S1500).

Of course, the feedback tactile information may also be converted directly into a device command and transmitted to the haptic device 774 without being resized.

Although embodiments of the present invention have been described above with reference to the accompanying drawings, those skilled in the art to which the present invention pertains will understand that the present invention may be implemented in other specific forms without changing its technical spirit or essential features. The above-described embodiments are therefore to be understood as illustrative in all aspects and not restrictive.

100: tactile device 120,130: tactile display
150: force-feedback device 200: driver
300: device controller 350: device transceiver
400: Device specification DB 420: Device command unit
440: touch sensing unit 500: driver array
600: tactile video 700: tactile information transmission system
710: object data generation unit 712: virtual environment generation unit
716: tactile video generation unit 717: force-feedback data generation unit
718: Editing / authoring unit 720: Encoder unit
722: virtual environment encoder 726: tactile video encoder
727: force-feedback data encoder 728: BIFS encoder
730: multiplexer 740: transport channel
750: demultiplexer 760: decoder unit
762: virtual environment decoder 766: tactile video decoder
767: force-feedback data decoder 768: BIFS decoder
770: playback unit 772: virtual environment output device
774: tactile device (tactile/force-feedback device) 776: compositor
800: video frame 850: tactile video corresponding to the movie
900: tactile video driver array

Claims (40)

In a tactile information presentation method through a haptic device including a tactile device and a force-feedback device, performed by transmitting tactile information including a virtual environment and tactile and force-feedback information corresponding to the virtual environment,
Storing control information including characteristics of the haptic device, which includes the tactile device and the force-feedback device, a specification of the haptic device, and a user preference;
Resizing the tactile information with reference to the control information; And
And transmitting a device command to the haptic device by using the resized tactile information.
The method of claim 1,
The haptic device includes a driver array for presenting tactile sensations,
The control information including the characteristics and specifications of the haptic device,
The number of drivers in the horizontal direction of the driver array, the number of drivers in the vertical direction of the driver array, the horizontal gap between the drivers, and the vertical gap between the drivers; a tactile information presentation method using control information.
3. The method according to claim 1 or 2,
The force-feedback device includes an actuator for presenting force feedback,
The control information including the characteristics and specifications of the force-feedback device,
The maximum force that the actuator can provide on each axis, the maximum torque that the actuator can provide on each axis, the maximum stiffness that the actuator can provide on each axis, the degree of freedom of the actuator, and the workspace of the actuator; a tactile information presentation method using control information.
3. The method according to claim 1 or 2,
The resizing is,
And when there is a difference between the transmitted tactile information and the tactile information that can be presented by the haptic device, adjusting the transmitted tactile information to the presentable tactile information.
5. The method of claim 4,
And re-adjust the adjusted tactile information in consideration of the user preference.
6. The method of claim 5,
And the device command is transmitted to the haptic device in the form of a percentage.
3. The method according to claim 1 or 2,
The resizing is,
And when there is a difference between the transmitted force-feedback information and the force-feedback information that can be presented by the force-feedback device, adjusting the transmitted force-feedback information to the presentable force-feedback information.
8. The method of claim 7,
And re-adjust the adjusted force-feedback information in consideration of the user preference.
The method of claim 8,
And the device command is transmitted to the force-feedback device in percentage form.
A tactile information presentation system through a haptic device including a tactile device and a force-feedback device, comprising means for transmitting tactile information including a virtual environment and tactile and force-feedback information corresponding to the virtual environment,
Means for storing control information including characteristics of the haptic device, which includes the tactile device and the force-feedback device, a specification of the haptic device, and a user preference;
Means for resizing the tactile information with reference to the control information; And
Means for communicating a device command to the haptic device using the resized tactile information.
The method of claim 10,
The haptic device includes a driver array for presenting tactile sensations,
The control information including the characteristics and specifications of the haptic device,
The number of drivers in the horizontal direction of the driver array, the number of drivers in the vertical direction of the driver array, the horizontal gap between the drivers, and the vertical gap between the drivers; a tactile information presentation system using control information.
The method according to claim 10 or 11,
The force-feedback device includes an actuator for presenting force feedback,
The control information including the characteristics and specifications of the force-feedback device,
The maximum force that the actuator can provide on each axis, the maximum torque that the actuator can provide on each axis, the maximum stiffness that the actuator can provide on each axis, the degree of freedom of the actuator, and the workspace of the actuator; a tactile information presentation system using control information.
The method according to claim 10 or 11,
The resizing is,
And when there is a difference between the transmitted tactile information and the tactile information that can be presented by the haptic device, adjusting the transmitted tactile information to the presentable tactile information.
The method of claim 13,
And re-adjust the adjusted tactile information in consideration of the user preference.
The method of claim 14,
And the device command is transmitted to the haptic device in percentage form.
The method according to claim 10 or 11,
The resizing is,
And if there is a difference between the transmitted force-feedback information and the force-feedback information that can be presented by the force-feedback device, the transmitted force-feedback information is adjusted to the presentable force-feedback information.
17. The method of claim 16,
And re-adjust the adjusted force-feedback information in consideration of the user preference.
18. The method of claim 17,
And the device command is transmitted to the force-feedback device in percentage form.
In a tactile information presentation method through a haptic device including a tactile device and a force-feedback device, performed by transmitting tactile information including a virtual environment and tactile and force-feedback information corresponding to the virtual environment,
Detecting whether the virtual environment is touched;
Generating sensing information calculated from the touch;
Generating feedback tactile information from relational expressions set in the virtual environment using the generated sensing information;
Storing control information including characteristics of the haptic device, which includes the tactile device and the force-feedback device, a specification of the haptic device, and a user preference;
Resizing the feedback tactile information with reference to the control information; And
And transmitting a device command to the haptic device by using the resized feedback tactile information.
20. The method of claim 19,
Wherein the detecting of the touch on the virtual environment includes
And determining whether a user's location is within a volume of the virtual environment.
The method of claim 20,
Wherein the sensed information includes a distance a user has entered the virtual environment, and a magnitude of force exerted by the user on the virtual environment.
20. The method of claim 19,
The haptic device includes a driver array for presenting tactile sensations,
The control information including the characteristics and specifications of the haptic device,
The number of drivers in the horizontal direction of the driver array, the number of drivers in the vertical direction of the driver array, the horizontal gap between the drivers, and the vertical gap between the drivers; a tactile information presentation method using control information.
20. The method of claim 19,
The force-feedback device includes an actuator for presenting force feedback,
The control information including the characteristics and specifications of the force-feedback device,
The maximum force that the actuator can provide on each axis, the maximum torque that the actuator can provide on each axis, the maximum stiffness that the actuator can provide on each axis, the degree of freedom of the actuator, and the workspace of the actuator; a tactile information presentation method using control information.
23. The method of claim 22,
The resizing is,
And when there is a difference between the feedback tactile information and the tactile information presentable by the haptic device, adjusting the feedback tactile information to the presentable tactile information.
25. The method of claim 24,
And re-adjust the adjusted feedback tactile information in consideration of the user preference.
26. The method of claim 25,
And the device command is transmitted to the haptic device in the form of a percentage.
24. The method of claim 23,
The resizing is,
And when there is a difference between the feedback tactile information and the force-feedback information that can be presented by the force-feedback device, adjusting the feedback tactile information to the presentable force-feedback information.
28. The method of claim 27,
And re-adjust the adjusted force-feedback information in consideration of the user preference.
The method of claim 28,
And the device command is transmitted to the force-feedback device in percentage form.
A tactile information presentation system through a haptic device including a tactile device and a force-feedback device, comprising means for transmitting tactile information including a virtual environment and tactile and force-feedback information corresponding to the virtual environment,
Means for detecting a touch on the virtual environment;
Means for generating sensing information calculated from the touch;
Means for generating feedback tactile information from relational expressions set in the virtual environment using the generated sensed information;
Means for storing control information including characteristics of the haptic device, which includes the tactile device and the force-feedback device, a specification of the haptic device, and a user preference;
Means for resizing the feedback tactile information with reference to the control information; And
Means for communicating a device command to the haptic device using the resized feedback tactile information.
31. The method of claim 30,
Wherein the detecting of the touch on the virtual environment includes
And determine whether a user's location is within a volume of the virtual environment.
31. The method of claim 30,
Wherein the sensed information includes a distance a user has entered the virtual environment, and a magnitude of force exerted by the user on the virtual environment.
31. The method of claim 30,
The haptic device includes a driver array for presenting tactile sensations,
The control information including the characteristics and specifications of the haptic device,
The number of drivers in the horizontal direction of the driver array, the number of drivers in the vertical direction of the driver array, the horizontal gap between the drivers, and the vertical gap between the drivers; a tactile information presentation system using control information.
31. The method of claim 30,
The force-feedback device includes an actuator for presenting force feedback,
The control information including the characteristics and specifications of the force-feedback device,
The maximum force that the actuator can provide on each axis, the maximum torque that the actuator can provide on each axis, the maximum stiffness that the actuator can provide on each axis, the degree of freedom of the actuator, and the workspace of the actuator; a tactile information presentation system using control information.
34. The method of claim 33,
The resizing is,
And when there is a difference between the feedback tactile information and the tactile information that can be presented by the haptic device, adjusting the feedback tactile information to the presentable tactile information.
36. The method of claim 35,
And re-adjust the adjusted feedback tactile information in consideration of the user preference.
37. The method of claim 36,
And the device command is transmitted to the haptic device in percentage form.
35. The method of claim 34,
The resizing is,
And when there is a difference between the feedback tactile information and the force-feedback information that can be presented by the force-feedback device, adjusting the feedback tactile information to the presentable force-feedback information.
The method of claim 38,
And re-adjust the adjusted force-feedback information in consideration of the user preference.
40. The method of claim 39,
And the device command is transmitted to the force-feedback device in percentage form.
KR20100127100A 2009-12-11 2010-12-13 Method for representing haptic information and system for transmitting haptic information through defining data formats KR101239830B1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
KR1020090123529 2009-12-11
KR20090123529 2009-12-11

Publications (2)

Publication Number Publication Date
KR20110066895A KR20110066895A (en) 2011-06-17
KR101239830B1 true KR101239830B1 (en) 2013-03-06

Family

ID=44146080

Family Applications (1)

Application Number Title Priority Date Filing Date
KR20100127100A KR101239830B1 (en) 2009-12-11 2010-12-13 Method for representing haptic information and system for transmitting haptic information through defining data formats

Country Status (2)

Country Link
KR (1) KR101239830B1 (en)
WO (1) WO2011071352A2 (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP3599540A1 (en) * 2018-07-24 2020-01-29 Sony Interactive Entertainment Inc. Robot interaction system and method

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
DE102016215481A1 (en) * 2016-08-18 2018-02-22 Technische Universität Dresden System and method for haptic interaction with virtual objects
EP4268053A4 (en) * 2020-12-22 2024-01-31 Ericsson Telefon Ab L M Moderating a user's sensory experience with respect to an extended reality

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20060126452A * 2003-09-25 2006-12-07 British Telecommunications public limited company Haptics transmission system
KR20080072332A * 2007-02-02 2008-08-06 Electronics and Telecommunications Research Institute System and method for haptic experience service
KR20100033954A * 2008-09-22 2010-03-31 Electronics and Telecommunications Research Institute Method and apparatus for representation of sensory effects
KR20100067005A * 2008-12-10 2010-06-18 POSTECH Academy-Industry Foundation Apparatus and method for providing haptic augmented reality

Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR100835297B1 * 2007-03-02 2008-06-05 Gwangju Institute of Science and Technology Node structure for representing tactile information, method and system for transmitting tactile information using the same
KR100860547B1 * 2007-03-02 2008-09-26 Gwangju Institute of Science and Technology Method and Apparatus for Authoring Tactile Information, and Computer Readable Medium Including the Method


Also Published As

Publication number Publication date
KR20110066895A (en) 2011-06-17
WO2011071352A3 (en) 2011-11-10
WO2011071352A2 (en) 2011-06-16


Legal Events

Date Code Title Description
A201 Request for examination
E902 Notification of reason for refusal
E701 Decision to grant or registration of patent right
GRNT Written decision to grant
FPAY Annual fee payment

Payment date: 20151217

Year of fee payment: 4

FPAY Annual fee payment

Payment date: 20161219

Year of fee payment: 5

FPAY Annual fee payment

Payment date: 20180201

Year of fee payment: 6

LAPS Lapse due to unpaid annual fee