CN107566847B - Method for encoding touch data into video stream for storage and transmission - Google Patents

Method for encoding touch data into video stream for storage and transmission

Info

Publication number
CN107566847B
CN107566847B
Authority
CN
China
Prior art keywords
data
dimensional
touch
torque
tactile
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201710840076.9A
Other languages
Chinese (zh)
Other versions
CN107566847A (en)
Inventor
陈超
刘振宇
裘辿
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Zhejiang University ZJU
North China Institute of Science and Technology
Original Assignee
Zhejiang University ZJU
North China Institute of Science and Technology
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Zhejiang University ZJU, North China Institute of Science and Technology filed Critical Zhejiang University ZJU
Priority to CN201710840076.9A
Publication of CN107566847A
Application granted
Publication of CN107566847B
Legal status: Active

Landscapes

  • User Interface Of Digital Computer (AREA)
  • Position Input By Displaying (AREA)

Abstract

The invention relates to a method for storing and transmitting haptic data by encoding it into a video stream, characterized in that it comprises the following steps: 1) converting the haptic data recorded at each acquisition point of a three-dimensional human body at different moments into a time-ordered sequence of two-dimensional tactile image data; 2) encoding the obtained time-ordered two-dimensional tactile image data with a conventional video compression method to obtain haptic stream data, which is used to store and transmit the haptic data or to share it among different systems; 3) decoding the obtained haptic stream data with the corresponding conventional video decompression method to recover the time-ordered sequence of two-dimensional tactile image data; 4) performing haptic reproduction on the human body based on the recovered time-ordered sequence of two-dimensional tactile image data. The invention fundamentally solves the problems of data compression and real-time streaming of haptic data during network transmission, and can therefore be widely applied to the storage and transmission of haptic data.

Description

Method for encoding touch data into video stream for storage and transmission
Technical Field
The present invention relates to a data storage and transmission method, and more particularly, to a method for storing and transmitting haptic data encoded into a video stream.
Background
Touch is the sense through which humans perceive the shape and material of objects. When watching immersive video or virtual reality content, people instinctively reach out to touch what they see, which shows that touch is a very important sense for human beings.
Haptic reproduction is therefore receiving increasing attention in video playback and virtual reality: through force-feedback gloves or force-feedback suits, viewers can perceive the shape and material of an object while watching. Haptic reproduction works by recording, for each moment, the haptic data that should be applied to the human body according to the picture content; this data can be edited manually or acquired with sensors, and is stored moment by moment to form a haptic data stream. When the video or virtual reality content is played back, the haptic data corresponding to the current moment is retrieved and used to drive the corresponding haptic actuation units on the force-feedback glove or suit, stimulating the sense of touch together with the visual and auditory channels and giving the user a more complete sense of immersion. However, most current systems store and transmit haptic data in custom formats, so the data cannot easily be shared between systems, and these custom formats do not address problems that must be solved for network transmission, such as data compression and real-time streaming.
Disclosure of Invention
In view of the above problems, an object of the present invention is to provide a method for storing and transmitting haptic data encoded into a video stream. By reusing the existing algorithms and mechanisms of video streaming, the method fundamentally solves the problems of data compression and real-time streaming of haptic data during network transmission, and at the same time enables haptic data to be shared between systems.
In order to achieve this purpose, the invention adopts the following technical scheme: a method of storing and transmitting haptic data encoded as a video stream, comprising the following steps: 1) converting the haptic data at each acquisition point of a three-dimensional human body at different moments into a time-ordered sequence of two-dimensional tactile image data; 2) encoding the obtained time-ordered two-dimensional tactile image data sequence to obtain haptic stream data; 3) decompressing the obtained haptic stream data to recover the time-ordered two-dimensional tactile image data sequence; 4) performing haptic reproduction on the human body based on the recovered time-ordered two-dimensional tactile image data sequence.
In step 1), the haptic data is converted into a two-dimensional tactile image data sequence as follows: 1.1) perform UV unwrapping of the acquisition points of the three-dimensional human body, mapping the body surface onto a two-dimensional tactile image in a two-dimensional plane; 1.2) acquire the haptic data acting on each acquisition point of the three-dimensional human body at different moments and decompose the data at each acquisition point into multi-channel data; 1.3) according to the positions on the two-dimensional tactile image that correspond to the acquisition points of the three-dimensional human body in step 1.1), map the multi-channel data of the acquisition points at different moments obtained in step 1.2) into the RGB color space, obtaining a time-ordered two-dimensional tactile image data sequence.
In step 1.3), the time-ordered two-dimensional tactile image data sequence is generated as follows: 1.3.1) acquire the multi-channel data of an acquisition point of the three-dimensional human body at the current moment, the multi-channel data comprising pressure, X-direction torque and Y-direction torque; 1.3.2) calculate the position of the acquisition point on the two-dimensional tactile image according to the mapping of step 1.1); 1.3.3) normalize the multi-channel data of the acquisition point and convert it into the RGB data at the corresponding position on the two-dimensional tactile image; 1.3.4) repeat steps 1.3.1) to 1.3.3) until the multi-channel data of all acquisition points on the three-dimensional human body at all moments has been converted into RGB data on the two-dimensional tactile images, yielding the time-ordered two-dimensional tactile image data sequence.
In step 1.3.3), the correspondence between the multi-channel data and the RGB data at each acquisition point is: the pressure channel corresponds to the R channel, the X-direction torque channel to the G channel, and the Y-direction torque channel to the B channel.
The pressure, X-direction torque and Y-direction torque channels of the multi-channel data are normalized according to the following formulas:
Y_pressure = (X_pressure - MinValue_pressure) / (MaxValue_pressure - MinValue_pressure);
Y_X-torque = (X_X-torque - MinValue_X-torque) / (MaxValue_X-torque - MinValue_X-torque);
Y_Y-torque = (X_Y-torque - MinValue_Y-torque) / (MaxValue_Y-torque - MinValue_Y-torque);
where X and Y are the channel values before and after conversion, and MaxValue and MinValue are the maximum and minimum possible values of the pressure, X-direction torque and Y-direction torque channel data, respectively.
In step 4), haptic reproduction is performed as follows: 4.1) for each haptic reproduction unit on the force-feedback device, find its corresponding point on the two-dimensional tactile image according to the mapping of step 1.1) and the unit's position on the three-dimensional human body; 4.2) read the RGB data at the corresponding point on the two-dimensional tactile image from the obtained time-ordered two-dimensional tactile image data; 4.3) convert the RGB data at the corresponding point back into the multi-channel haptic data and use it to drive the corresponding haptic reproduction unit of the force-feedback device, which acts on the human body to achieve haptic reproduction.
Owing to the above technical scheme, the invention has the following advantages: 1. The haptic data at each acquisition point on the three-dimensional human body is converted into multi-channel data, which can correspond to the haptic reproduction units of various haptic acquisition devices, force-feedback gloves or force-feedback suits; this unifies the different haptic data encoding and transmission schemes of different manufacturers and gives the method a wide range of applicability. 2. The positions of all acquisition points on the three-dimensional human body are mapped onto a two-dimensional tactile image and the corresponding multi-channel data is converted into RGB data, producing time-ordered two-dimensional tactile image data that can be encoded and decoded with conventional video compression methods, so that haptic data can be shared between different systems. 3. The generated two-dimensional tactile image data can reuse the mature compression algorithms and streaming methods of video coding, enabling compression and streaming of the haptic data; the haptic stream can even be carried as a channel inside the video stream so that haptics, picture and sound are transmitted synchronously. 4. Because a common video compression scheme is used to encode and decode the two-dimensional tactile image data, the file size is greatly reduced while the distortion on reproduction remains extremely small. The method is simple to apply and can be widely used for the storage and transmission of haptic data.
Drawings
FIG. 1 is a data diagram of the present invention dividing tactile data into three channels;
FIG. 2 is a flow chart of the present invention for generating a two-dimensional tactile image;
FIG. 3 is a flow chart of the haptic data codec of the present invention;
FIG. 4 is a schematic diagram of the sharing of tactile data between different systems in accordance with the present invention;
FIG. 5(a) and FIG. 5(b) are schematic diagrams of haptic reproduction according to the present invention.
Detailed Description
The invention is described in detail below with reference to the figures and examples.
The invention provides a method for encoding touch data into a video stream for storage and transmission, which comprises the following steps:
1) Convert the haptic data at each acquisition point of the three-dimensional human body at different moments into a time-ordered sequence of two-dimensional tactile image data.
The haptic data at each acquisition point of the three-dimensional human body at different moments is converted into time-ordered two-dimensional tactile image data as follows:
1.1) Perform UV unwrapping of the acquisition points of the three-dimensional human body, mapping the surface of the human body onto a two-dimensional tactile image in a two-dimensional plane.
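As an illustration of this mapping (a sketch only: the point names, UV values and image resolution below are hypothetical and not taken from the patent), the UV unwrap can be held as a lookup table from acquisition-point identifiers to pixel coordinates on the two-dimensional tactile image:

```python
# Hypothetical sketch: represent the UV unwrap of the body's acquisition points
# as a lookup table from point ID to a pixel on the 2-D tactile image.
# Point IDs, UV values and the image resolution are illustrative assumptions.
IMAGE_WIDTH, IMAGE_HEIGHT = 256, 256   # assumed resolution of the tactile image

uv_map = {                             # UV coordinates in [0, 1] from the unwrap
    "index_finger_tip": (0.12, 0.80),
    "palm_center":      (0.30, 0.55),
    "thumb_tip":        (0.05, 0.70),
}

def uv_to_pixel(point_id):
    """Map an acquisition point to its (column, row) pixel on the tactile image."""
    u, v = uv_map[point_id]
    col = int(round(u * (IMAGE_WIDTH - 1)))
    row = int(round((1.0 - v) * (IMAGE_HEIGHT - 1)))   # image rows grow downwards
    return col, row
```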
1.2) acquiring touch data acting on each acquisition point of the three-dimensional human body at different moments, and decomposing the touch data on each acquisition point into multi-channel data.
As shown in fig. 1, the haptic data acting on each acquisition point of the three-dimensional human body is recorded for the current picture, either collected by sensors or edited manually. Depending on the design of the haptic acquisition device, force-feedback glove or force-feedback suit, the haptic data at each point is decomposed into at most three channels: pressure, X-direction torque and Y-direction torque; in practice one or more of these channels can be selected for recording and reproduction.
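A minimal way to hold one such decomposed sample in code (the field names and units are assumptions, not taken from the patent) is a small record with one field per channel:

```python
# Hypothetical sketch: one haptic sample at one acquisition point, decomposed
# into the three channels named above. Field names and units are assumptions.
from dataclasses import dataclass

@dataclass
class TactileSample:
    pressure: float   # pressure channel, e.g. in newtons
    torque_x: float   # X-direction torque channel
    torque_y: float   # Y-direction torque channel
```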
1.3) mapping the multi-channel data of the acquisition points at different moments obtained in the step 1.2) to an RGB color space according to the positions of the two-dimensional tactile images corresponding to the acquisition points of the three-dimensional human body in the step 1.1) to obtain a two-dimensional tactile image data sequence arranged according to a time sequence.
As shown in fig. 2, the method for generating two-dimensional tactile image data arranged in time series includes the steps of:
1.3.1) acquiring multi-channel data of a certain acquisition point of the three-dimensional human body at the current moment, wherein the multi-channel data comprises pressure, X-direction torque and Y-direction torque data.
1.3.2) calculating the position of the acquisition point on the two-dimensional touch image according to the mapping relation in the step 1.1).
1.3.3) carrying out normalization processing on the multi-channel data of the acquisition point, and converting the multi-channel data into RGB data of a corresponding position on a two-dimensional touch image.
The multi-channel haptic data of the acquisition point is mapped into the RGB color space: pressure channel data becomes R channel data, X-direction torque channel data becomes G channel data, and Y-direction torque channel data becomes B channel data. Since R, G and B values range from 0 to 255, the pressure, X-direction torque and Y-direction torque channel data must first be normalized using the following formulas:
Y_pressure = (X_pressure - MinValue_pressure) / (MaxValue_pressure - MinValue_pressure)
Y_X-torque = (X_X-torque - MinValue_X-torque) / (MaxValue_X-torque - MinValue_X-torque)
Y_Y-torque = (X_Y-torque - MinValue_Y-torque) / (MaxValue_Y-torque - MinValue_Y-torque)
where X and Y are the channel values before and after conversion, and MaxValue and MinValue are the maximum and minimum possible values of the pressure, X-direction torque and Y-direction torque channel data, respectively.
The normalized values Y_pressure, Y_X-torque and Y_Y-torque are each multiplied by 255 and rounded, giving the R, G and B values that are finally stored at the corresponding point of the two-dimensional tactile image file.
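A minimal sketch of this normalization and conversion, assuming illustrative sensor ranges (the ranges, function names and example values are assumptions, not values from the patent), is:

```python
# Hypothetical sketch of step 1.3.3): normalize one acquisition point's channels
# and convert them to the R, G, B bytes stored in the two-dimensional tactile image.
def channel_to_byte(x, min_value, max_value):
    """Normalize x over its possible range to [0, 1], scale to 0..255 and round."""
    y = (x - min_value) / (max_value - min_value)
    return int(round(y * 255))

PRESSURE_RANGE = (0.0, 10.0)   # assumed pressure range of the sensing hardware
TORQUE_X_RANGE = (-1.0, 1.0)   # assumed X-direction torque range
TORQUE_Y_RANGE = (-1.0, 1.0)   # assumed Y-direction torque range

def sample_to_rgb(pressure, torque_x, torque_y):
    """Pressure -> R, X-direction torque -> G, Y-direction torque -> B."""
    r = channel_to_byte(pressure, *PRESSURE_RANGE)
    g = channel_to_byte(torque_x, *TORQUE_X_RANGE)
    b = channel_to_byte(torque_y, *TORQUE_Y_RANGE)
    return r, g, b

# Pressure at 1/2 of its range, minimum X torque, Y torque at 1/2 of its range:
print(sample_to_rgb(5.0, -1.0, 0.0))   # -> (128, 0, 128), as in the example below
```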
For example, if a point in the two-dimensional tactile image file has R = 0, G = 0 and B = 0, then the pressure, X-direction torque and Y-direction torque at the corresponding position all take their minimum possible values, i.e. no force is applied. If another point has R = 128, G = 0 and B = 128, then the pressure at the corresponding position is at 1/2 of the pressure range, the X-direction torque is at its minimum possible value, and the Y-direction torque is at 1/2 of the Y-direction torque range.
1.3.4) repeating the steps 1.3.1) to 1.3.3), converting the multi-channel data of all acquisition points on the three-dimensional human body at different moments into RGB data on the two-dimensional touch image, and obtaining a two-dimensional touch image data sequence arranged according to time sequence.
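Under the same assumptions as the sketches above (a 256×256 image and per-point pixel and RGB lookups), one frame of the sequence can be assembled per time step as follows; repeating this for every moment yields the time-ordered sequence:

```python
# Hypothetical sketch of step 1.3.4): write every acquisition point's RGB triple
# into its pixel for one time step, producing one frame of the tactile sequence.
import numpy as np

IMAGE_WIDTH, IMAGE_HEIGHT = 256, 256   # assumed resolution of the tactile image

def build_frame(point_pixels, point_rgb):
    """point_pixels: {point_id: (col, row)}; point_rgb: {point_id: (r, g, b)}."""
    frame = np.zeros((IMAGE_HEIGHT, IMAGE_WIDTH, 3), dtype=np.uint8)
    for point_id, (col, row) in point_pixels.items():
        frame[row, col] = point_rgb[point_id]
    return frame

# frames = [build_frame(pixel_map, rgb_at_t) for rgb_at_t in rgb_per_time_step]
```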
2) Encode the obtained time-ordered two-dimensional tactile image data sequence with a conventional video compression method to obtain haptic stream data, which is used to store and transmit the haptic data or to share it among different systems.
As shown in fig. 3, the time-ordered two-dimensional tactile image data sequence obtained in step 1) is encoded with a commonly used video compression method such as MPEG-4, H.264 or H.265 and stored as a haptic stream file. Compared with a sequence of haptic files in a custom format, the file size is greatly reduced because a video compression scheme refined over many years is used, while the distortion on reproduction remains extremely small. When the haptic stream file is used for haptic reproduction, the haptic stream data is decoded with the same video compression scheme (MPEG-4, H.264, H.265, etc.) to recover the time-ordered two-dimensional tactile image data sequence, after which the subsequent operations are performed.
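As a sketch of this encoding step (the codec, container, frame rate and file name are assumptions; any codec supported by the local OpenCV/FFmpeg build could be substituted), the frames can be written out with a standard video encoder:

```python
# Hypothetical sketch of step 2): encode the time-ordered tactile frames with a
# conventional video codec via OpenCV. Codec, frame rate and file name are assumed.
import cv2
import numpy as np

FPS = 30
frames = [np.zeros((256, 256, 3), dtype=np.uint8) for _ in range(90)]   # placeholder frames

fourcc = cv2.VideoWriter_fourcc(*"mp4v")            # MPEG-4; "avc1" for H.264 where available
writer = cv2.VideoWriter("tactile_stream.mp4", fourcc, FPS, (256, 256))
for frame in frames:
    # OpenCV expects BGR channel order, so swap from the RGB layout used above.
    writer.write(cv2.cvtColor(frame, cv2.COLOR_RGB2BGR))
writer.release()
```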
As shown in fig. 4, the streaming capabilities of video compression schemes such as MPEG-4, H.264 and H.265 make it easy to share haptic data between different systems: the sending end encodes the haptic data in real time, transmits it to the receiving end as a stream, and the receiving end decodes it in real time, thereby realizing haptic data sharing between different systems.
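One possible real-time sharing pipeline (a sketch only: the FFmpeg options, address and port are assumptions and require a local FFmpeg installation) pipes the raw tactile frames into an H.264 encoder that streams them to the receiving end:

```python
# Hypothetical sketch: stream tactile frames from the sending end in real time by
# piping raw RGB frames into FFmpeg, which encodes and transmits them over UDP.
import subprocess
import numpy as np

WIDTH, HEIGHT, FPS = 256, 256, 30
cmd = [
    "ffmpeg", "-f", "rawvideo", "-pix_fmt", "rgb24",
    "-s", f"{WIDTH}x{HEIGHT}", "-r", str(FPS), "-i", "-",
    "-c:v", "libx264", "-preset", "ultrafast", "-tune", "zerolatency",
    "-f", "mpegts", "udp://127.0.0.1:5000",            # assumed receiver address
]
sender = subprocess.Popen(cmd, stdin=subprocess.PIPE)

frame = np.zeros((HEIGHT, WIDTH, 3), dtype=np.uint8)   # placeholder tactile frame
sender.stdin.write(frame.tobytes())                     # repeat once per time step
```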
3) Decode the obtained haptic stream data with the corresponding conventional video decompression method to recover the time-ordered two-dimensional tactile image data.
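Mirroring the encoder sketch above, decoding can be done with the same toolchain (the file name is assumed):

```python
# Hypothetical sketch of step 3): decode the haptic stream file back into the
# time-ordered sequence of two-dimensional tactile images.
import cv2

capture = cv2.VideoCapture("tactile_stream.mp4")
frames = []
while True:
    ok, bgr = capture.read()
    if not ok:
        break
    frames.append(cv2.cvtColor(bgr, cv2.COLOR_BGR2RGB))   # back to the RGB layout
capture.release()
```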
4) Based on the obtained two-dimensional tactile image data arranged in time series, tactile reproduction is performed on the human body.
Haptic reproduction on the human body based on the time-ordered two-dimensional tactile image data proceeds as follows:
4.1) finding the corresponding point of each tactile sensation reproduction unit on the two-dimensional tactile sensation image according to the mapping relation in the step 1.1) and the position of each tactile sensation reproduction unit on the three-dimensional human body on the force feedback equipment.
4.2) find the RGB data at the corresponding point on the two-dimensional tactile image from the obtained time-ordered two-dimensional tactile image data sequence; in particular, if no value was recorded at the corresponding point, interpolate the RGB data of nearby recorded points to obtain it.
4.3) converting the RGB data of the point into multi-channel data of the tactile data, and controlling the corresponding tactile reproduction unit of the force feedback device to act and then apply the tactile reproduction to the human body to realize the tactile reproduction.
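A minimal sketch of these three sub-steps for a single reproduction unit (the channel ranges and the drive_unit callback are assumptions standing in for the actual device driver) is:

```python
# Hypothetical sketch of steps 4.1)-4.3): read the RGB value at the pixel mapped to
# one haptic reproduction unit, convert it back to channel values, drive the unit.
PRESSURE_RANGE = (0.0, 10.0)   # assumed ranges, matching the encoding sketch above
TORQUE_X_RANGE = (-1.0, 1.0)
TORQUE_Y_RANGE = (-1.0, 1.0)

def byte_to_channel(byte_value, min_value, max_value):
    """Invert the normalization: map a 0..255 byte back to the physical range."""
    return min_value + (byte_value / 255.0) * (max_value - min_value)

def reproduce(frame, col, row, drive_unit):
    """frame: HxWx3 uint8 tactile image; (col, row): pixel of this reproduction unit."""
    r, g, b = frame[row, col]
    pressure = byte_to_channel(int(r), *PRESSURE_RANGE)
    torque_x = byte_to_channel(int(g), *TORQUE_X_RANGE)
    torque_y = byte_to_channel(int(b), *TORQUE_Y_RANGE)
    drive_unit(pressure, torque_x, torque_y)   # hypothetical force-feedback driver call
```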
As shown in fig. 5(a) and fig. 5(b), the invention is illustrated with a force-feedback glove. First, the point a on the two-dimensional tactile image corresponding to haptic reproduction unit A on the glove is located and its RGB data is read. The RGB data of point a is then converted back into multi-channel data: the R channel value becomes pressure channel data, the G channel value becomes X-direction torque channel data, and the B channel value becomes Y-direction torque channel data. One or more of these channels are used to drive haptic reproduction unit A, which acts on the human body and produces a lifelike tactile sensation.
For haptic reproduction unit B on the glove, the corresponding point b is located first; since no RGB data was recorded at point b, its RGB value is obtained by interpolating the RGB data of the three nearby points c, d and e. The RGB data at point b is then converted into multi-channel data, which drives haptic reproduction unit B to act on the human body and achieve haptic reproduction.
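One simple way to realize this interpolation (a sketch; inverse-distance weighting is an assumed choice, and the pixel coordinates and RGB values below are made up) is:

```python
# Hypothetical sketch: estimate the RGB value at an unrecorded pixel from nearby
# recorded pixels by inverse-distance weighting.
def interpolate_rgb(target, neighbours):
    """target: (col, row); neighbours: list of ((col, row), (r, g, b)) recorded nearby."""
    weights, acc = [], [0.0, 0.0, 0.0]
    for (col, row), rgb in neighbours:
        dist = ((col - target[0]) ** 2 + (row - target[1]) ** 2) ** 0.5
        w = 1.0 / max(dist, 1e-6)          # avoid division by zero for coincident points
        weights.append(w)
        for i in range(3):
            acc[i] += w * rgb[i]
    total = sum(weights)
    return tuple(int(round(c / total)) for c in acc)

# e.g. unit B's pixel b interpolated from the three nearby recorded points c, d, e:
rgb_b = interpolate_rgb((40, 60), [((39, 60), (120, 0, 130)),
                                   ((41, 61), (126, 0, 124)),
                                   ((40, 58), (118, 2, 128))])
```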
The above embodiments are only used for illustrating the present invention, and the structure, connection mode, manufacturing process, etc. of the components may be changed, and all equivalent changes and modifications performed on the basis of the technical solution of the present invention should not be excluded from the protection scope of the present invention.

Claims (4)

1. A method of storing and transmitting haptic data encoded as a video stream, comprising the steps of:
1) converting touch data on each acquisition point of a three-dimensional human body at different moments into a two-dimensional touch image data sequence arranged according to a time sequence;
method for converting tactile data into a sequence of two-dimensional tactile image data, comprising the steps of:
1.1) carrying out UV unfolding on each acquisition point of a three-dimensional human body, and mapping the surface of the human body to a two-dimensional tactile image in a two-dimensional plane;
1.2) acquiring touch data acting on each acquisition point of a three-dimensional human body at different moments, and decomposing the touch data on each acquisition point into multi-channel data;
1.3) mapping the multi-channel data of the acquisition points at different moments obtained in the step 1.2) to an RGB color space according to the positions of the two-dimensional tactile images corresponding to the acquisition points of the three-dimensional human body in the step 1.1) to obtain a two-dimensional tactile image data sequence arranged according to a time sequence;
2) coding the obtained two-dimensional touch image data sequence arranged according to the time sequence to obtain touch flow data;
3) decompressing the obtained tactile streaming data to obtain a two-dimensional tactile image data sequence arranged according to a time sequence;
4) performing touch reappearance to the human body according to the obtained two-dimensional touch image data sequence arranged according to the time sequence;
a method of performing tactile reproduction comprising the steps of:
4.1) finding corresponding points of the touch reappearing units on the two-dimensional touch image according to the mapping relation in the step 1.1) and the positions of the touch reappearing units on the three-dimensional human body on the force feedback equipment;
4.2) finding RGB data on the corresponding point on the two-dimensional touch image according to the obtained two-dimensional touch image data arranged according to the time sequence; if the corresponding point is not recorded, interpolating the RGB data of the point near the corresponding point to obtain the RGB data of the corresponding point;
4.3) converting the RGB data of the corresponding point into multi-channel data of the tactile data, and controlling the corresponding tactile reproduction unit of the force feedback device to act and then applying the tactile reproduction to the human body to realize the tactile reproduction.
2. A method of storing and transmitting haptic data encoded as a video stream as claimed in claim 1, wherein: in the step 1.3), the method for generating the two-dimensional tactile image data sequence arranged in time series includes the following steps:
1.3.1) acquiring multi-channel data of any acquisition point of a three-dimensional human body at the current moment, wherein the multi-channel data comprises pressure, X-direction torque and Y-direction torque data;
1.3.2) calculating the position of the acquisition point on the two-dimensional touch image according to the mapping relation in the step 1.1);
1.3.3) carrying out normalization processing on the multi-channel data of the acquisition point, and converting the multi-channel data into RGB data of a corresponding position on a two-dimensional touch image;
1.3.4) repeating the steps 1.3.1) to 1.3.3), converting the multi-channel data of all acquisition points on the three-dimensional human body at different moments into RGB data on the two-dimensional touch image, and obtaining a two-dimensional touch image data sequence arranged according to time sequence.
3. A method of storing and transmitting haptic data encoded as a video stream as claimed in claim 2, wherein: in the step 1.3.3), the corresponding relationship between the multichannel data and the RGB data of each collection point is: the pressure channel data corresponds to R channel data, the X-direction torque channel data corresponds to G channel data, and the Y-direction torque channel data corresponds to B channel data.
4. A method of storing and transmitting haptic data encoded as a video stream as claimed in claim 2, wherein: the calculation formulas for normalization processing of pressure, X-direction torque and Y-direction torque channel data in the multichannel data are respectively as follows:
Y_pressure = (X_pressure - MinValue_pressure) / (MaxValue_pressure - MinValue_pressure);
Y_X-torque = (X_X-torque - MinValue_X-torque) / (MaxValue_X-torque - MinValue_X-torque);
Y_Y-torque = (X_Y-torque - MinValue_Y-torque) / (MaxValue_Y-torque - MinValue_Y-torque);
where X and Y are the channel values before and after conversion, and MaxValue and MinValue are the maximum and minimum possible values of the pressure, X-direction torque and Y-direction torque channel data, respectively.
CN201710840076.9A 2017-09-18 2017-09-18 Method for encoding touch data into video stream for storage and transmission Active CN107566847B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201710840076.9A CN107566847B (en) 2017-09-18 2017-09-18 Method for encoding touch data into video stream for storage and transmission

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201710840076.9A CN107566847B (en) 2017-09-18 2017-09-18 Method for encoding touch data into video stream for storage and transmission

Publications (2)

Publication Number Publication Date
CN107566847A CN107566847A (en) 2018-01-09
CN107566847B (en) 2020-02-14

Family

ID=60980160

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201710840076.9A Active CN107566847B (en) 2017-09-18 2017-09-18 Method for encoding touch data into video stream for storage and transmission

Country Status (1)

Country Link
CN (1) CN107566847B (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111782033A (en) * 2020-05-28 2020-10-16 深圳和而泰智能控制股份有限公司 Data processing method, device, equipment and medium

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN1659551A * 2001-10-04 2005-08-24 Novint Technologies, Inc. Coordinating haptics with visual images in a human-computer interface
CN101996413A * 2009-08-10 2011-03-30 Electronics and Telecommunications Research Institute Method of encoding haptic information on image, method of decoding haptic information from image and apparatus for processing haptic information for the same
CN103809960A * 2012-11-02 2014-05-21 Immersion Corporation Encoding dynamic haptic effects
CN104157002A * 2014-08-14 2014-11-19 Southeast University Color image texture force tactile reproduction method based on color transform space
CN104184721A * 2013-05-24 2014-12-03 Immersion Corporation Method and a system used for touch data encoding and stream transfer
CN104636099A * 2014-10-20 2015-05-20 Southeast University Vision and touch file format conversion device and method
CN106919257A * 2017-02-28 2017-07-04 Nanjing University of Information Science and Technology Texture force reproduction method for force-haptic interaction based on image luminance information

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9981182B2 (en) * 2016-02-12 2018-05-29 Disney Enterprises, Inc. Systems and methods for providing immersive game feedback using haptic effects

Also Published As

Publication number Publication date
CN107566847A (en) 2018-01-09

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant