GB2519848A - Method and system for interactively matching motion platform with 3D video data - Google Patents

Method and system for interactively matching motion platform with 3D video data

Info

Publication number
GB2519848A
GB2519848A GB1415309.2A GB201415309A GB2519848A GB 2519848 A GB2519848 A GB 2519848A GB 201415309 A GB201415309 A GB 201415309A GB 2519848 A GB2519848 A GB 2519848A
Authority
GB
United Kingdom
Prior art keywords
data
motion platform
rotation
video
piecewise
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
GB1415309.2A
Other versions
GB2519848B (en)
GB201415309D0 (en)
Inventor
Zhenhua Huang
Fei Huang
Minzhong Jiang
Xunting He
Hao Fang
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shenzhen Playfun Culture & Technology Co Ltd
Original Assignee
Shenzhen Playfun Culture & Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shenzhen Playfun Culture & Technology Co Ltd filed Critical Shenzhen Playfun Culture & Technology Co Ltd
Publication of GB201415309D0 publication Critical patent/GB201415309D0/en
Publication of GB2519848A publication Critical patent/GB2519848A/en
Application granted granted Critical
Publication of GB2519848B publication Critical patent/GB2519848B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Links

Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00 Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/20 Image signal generators
    • H04N13/204 Image signal generators using stereoscopic image cameras
    • H04N13/239 Image signal generators using stereoscopic image cameras using two 2D image sensors having a relative position equal to or related to the interocular distance
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/20 Analysis of motion
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/80 Analysis of captured images to determine intrinsic or extrinsic camera parameters, i.e. camera calibration
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00 Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/20 Image signal generators
    • H04N13/296 Synchronisation thereof; Control thereof
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F2300/00 Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
    • A63F2300/60 Methods for processing data by generating or executing the game program
    • A63F2300/6009 Methods for processing data by generating or executing the game program for importing or creating game content, e.g. authoring tools during game development, adapting content to different platforms, use of a scripting language to create content
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T19/00 Manipulating 3D models or images for computer graphics
    • G06T19/006 Mixed reality

Abstract

The present disclosure provides a method for interactively matching a motion platform with 3D video data. The method includes following steps: a step A of obtaining a first 3D video lens attitude data including a first rotation data; a step B of obtaining a second rotation data by filtering out glitch data from the first rotation data; a step C of piecewise compressing the second rotation data so that compressed data obtained from the piecewise compressing is within a range supported by the motion platform; a step D of establishing a reference coordinate system to simulate motions of the motion platform according to rotation information in the compressed data; a step E of computing an expansion and contraction quantity of the control rod of the motion platform in the reference coordinate system by using a 3D rotation formula; and a step F of controlling the motion platform to move accordingly according to the expansion and contraction quantity of the control rod of the motion platform. The method for interactively matching a motion platform with 3D video data provided by the present disclosure can ensure that the motion platform moves synchronously with contents of games or movies, thereby improving accuracy of a final control data.

Description

METHOD AND SYSTEM FOR INTERACTIVELY MATCHING MOTION
PLATFORM WITH 3D VIDEO DATA
Technical Field
[0001] The present disclosure relates to 3D video conversion technologies, in particular to a method and a system for interactively matching a motion platform with 3D video data.
Technical Background
[0002] Currently, in order to enhance the user experience of many interactive 3D games and 3D movies, a movable platform is placed in front of a screen. When a player sits on a chair on the platform, the platform can move synchronously with the contents of the game images, so that the player can immersively experience the reality of the game world or the movie world. However, the control data of most motion platforms is adjusted manually, frame by frame, according to the images of the games or movies, resulting in a huge workload for a game or movie lasting tens of minutes or even longer (at a frame rate of, for example, 24 frames per second, 30 minutes already amounts to 43,200 frames).
Moreover, it is difficult for the platform to move synchronously with the games or movies, or to ensure accuracy of the data.
Summary
[0003] In order to solve problems in the prior art, the present disclosure provides a method for interactively matching a motion platform with 3D video data.
[0004] The present disclosure provides a method for interactively matching a motion platform with 3D video data, the method including:
[0005] a step A of obtaining a first 3D video lens attitude data including a first rotation data;
[0006] a step B of obtaining a second rotation data by filtering out glitch data from the first rotation data;
[0007] a step C of piecewise compressing the second rotation data so that compressed data obtained from the piecewise compressing is within a motion range supported by the motion platform;
[0008] a step D of establishing a reference coordinate system to simulate motions of the motion platform according to rotation information in the compressed data;
[0009] a step E of computing an expansion and contraction quantity of the control rod of the motion platform in the reference coordinate system by using a 3D rotation formula; and
[0010] a step F of controlling the motion platform to move accordingly according to the expansion and contraction quantity data of the control rod of the motion platform.
[0011] As a further improvement of the present disclosure, the method further includes, between the step C and the step D, a step of smoothing the glitch data existing in the compressed data by utilizing a smooth filtering method.
[0012] As a further improvement of the present disclosure, the piecewise compressing in the step C includes: dividing the second rotation data into a plurality of sections according to data size, and respectively scaling the plurality of sections with different scales.
[0013] As a further improvement of the present disclosure, the smooth filtering method includes: filtering out glitch points in the compressed data with a five-spot triple filtering algorithm for discrete signals.
[0014] As a further improvement of the present disclosure, the motion platform is a motion platform with multiple degrees of freedom.
[0015] As a further improvement of the present disclosure, the 3D video includes 3D games and 3D movies.
[0016] The present disclosure further provides a system for interactively matching a motion platform with 3D video data, including:
[0017] an obtaining module, which is configured for obtaining a first 3D video lens attitude data including a first rotation data;
[0018] a suppression module, which is configured for obtaining a second rotation data by filtering out glitch data from the first rotation data;
[0019] a compression module, which is configured for piecewise compressing the second rotation data so that compressed data obtained from the piecewise compressing is within a range supported by the motion platform;
[0020] a generation module, which is configured for establishing a reference coordinate system to simulate motions of the motion platform according to rotation information in the compressed data;
[0021] a computation module, which is configured for computing an expansion and contraction quantity of a control rod of the motion platform in the reference coordinate system by using a 3D rotation formula; and
[0022] a control module, which is configured for controlling the motion platform to move accordingly according to the expansion and contraction quantity data of the control rod of the motion platform.
[0023] As a further improvement of the present disclosure, the system further includes a filtering module, which is configured for smoothing the glitch data existing in the compressed data by utilizing a smooth filtering method.
[0024] As a further improvement of the present disclosure, the piecewise compressing includes: dividing the second rotation data into a plurality of sections according to data size, and respectively scaling the plurality of sections with different scales.
[0025] As a further improvement of the present disclosure, the smooth filtering method includes: filtering out glitch points in the compressed data with a five-spot triple filtering algorithm for discrete signals, and the motion platform is a motion platform with multiple degrees of freedom.
[0026] The present disclosure has the following beneficial effects: the method for interactively matching a motion platform with 3D video data provided by the present disclosure can substantially decrease the time and difficulty of manually adjusting the control rod, and ensure that the motion platform moves synchronously with the contents of games or movies, thereby improving the accuracy of the final control data.
Description of Drawings
[0027] Figure 1 is a flowchart illustrating a method for interactively matching a motion platform with 3D video data according to the present disclosure;
[0028] Figure 2 is a schematic diagram illustrating the structure of a motion platform having 6 degrees of freedom (i.e. a 6-DOF motion platform);
[0029] Figure 3 is a data graph illustrating a curve formed by connecting lens attitude data obtained from a 3D game;
[0030] Figure 4 is a schematic structural diagram illustrating the 6-DOF motion platform in another state;
[0031] Figure 5 illustrates a data graph after suppressing glitch points in Figure 3 according to the present disclosure;
[0032] Figure 6 is a data graph obtained from piecewise compressing the attitude data in Figure 5 according to the present disclosure;
[0033] Figure 7 is a partial enlarged data graph of Figure 6;
[0034] Figure 8 is a data graph of Figure 7 after five-spot triple smoothing processing;
[0035] Figure 9 is a schematic diagram of a 3D coordinate system in embodiments of the present disclosure;
[0036] Figure 10 is a schematic diagram of a 3D rectangular coordinate system of the 6-DOF motion platform according to the present disclosure; and
[0037] Figure 11 shows the 3D rectangular coordinate system of Figure 10 from another angle.
Detailed Description of Preferred Embodiments
[0038] As shown in Figure 1, the present disclosure discloses a method for interactively matching a motion platform with 3D video data, the method including the following steps: in step S1, firstly obtaining a first 3D video lens attitude data including a first rotation data; in step S2, obtaining a second rotation data by filtering out glitch data from the first rotation data; in step S3, piecewise compressing the second rotation data so that the compressed data obtained from the piecewise compressing is within a range supported by the motion platform; in step S4, smoothing the glitch data existing in the compressed data by utilizing a smooth filtering method; in step S5, establishing a reference coordinate system to simulate motions of the motion platform according to rotation information in the compressed data; in step S6, computing an expansion and contraction quantity of the control rod of the motion platform in the reference coordinate system by using a 3D rotation formula; and in step S7, controlling the motion platform to move accordingly according to the expansion and contraction quantity data of the control rod of the motion platform.
[0039] The piecewise compressing in the step S3 includes: dividing the second rotation data into a plurality of sections according to data size, and respectively scaling the plurality of sections with different scales.
[0040] The smooth filtering method is a five-spot triple filtering algorithm.
[0041] The motion platform is a 6-DOF motion platform.
[0042] The 3D video includes 3D games and 3D movies.
[0043] The present disclosure further discloses a system for interactively matching a motion platform with 3D video data, including:
[0044] an obtaining module, which is configured for obtaining a first 3D video lens attitude data including a first rotation data;
[0045] a suppression module, which is configured for obtaining a second rotation data by filtering out glitch data from the first rotation data;
[0046] a compression module, which is configured for piecewise compressing the second rotation data so that the compressed data obtained from the piecewise compressing is within a motion range supported by the motion platform;
[0047] a generation module, which is configured for establishing a reference coordinate system to simulate motions of the motion platform according to rotation information in the compressed data;
[0048] a computation module, which is configured for computing an expansion and contraction quantity of a control rod of the motion platform in the reference coordinate system by using a 3D rotation formula; and
[0049] a control module, which is configured for controlling the motion platform to move accordingly according to the expansion and contraction quantity data of the control rod of the motion platform.
[0050] The method for interactively matching a motion platform with 3D video data provided by the present disclosure can directly convert the lens data in 3D games or 3D movies into an expansion and contraction quantity of a control rod of the motion platform, and then utilize the expansion and contraction length data of the control rod to control the motion of the motion platform, so as to achieve the effect of synchronously simulating the motions in the game images, so that a person sitting on the motion platform can more realistically experience the motions of the game world or the movie world.
[0051] As shown in Figure 2, a 6-DOF motion platform is taken as an example in the present embodiment. The lower platform is fixed immovably, and the motion of the upper platform is controlled by six telescopic rods so as to simulate the motions, including rotation and translation, of an object in the 3D world. In a 3D game scene, changes of the images on the screen are implemented by changing the position and attitude of the virtual lens of the game in the 3D game world coordinate system, where the position is the translation quantity of the lens relative to the origin of the 3D game world coordinate system, and the attitude is the rotation quantity of the lens relative to the three coordinate axes of the 3D game world coordinate system (the 3D game world coordinate system is generally fixed). The translation quantities in the three directions x, y and z and the rotation quantities about the three axial directions x, y and z add up to six degrees of freedom, which uniquely determine the position and attitude of the object in the 3D coordinate system.
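By way of a non-limiting illustration, each video frame can therefore be described by six numbers. A minimal record in Python might look as follows (the class and field names are illustrative only and are not taken from the patent):

    from dataclasses import dataclass

    @dataclass
    class LensPose:
        """Per-frame pose of the virtual lens in the fixed 3D game world coordinate system."""
        tx: float  # translation along the X axis (position)
        ty: float  # translation along the Y axis
        tz: float  # translation along the Z axis
        rx: float  # rotation about the X axis (attitude), in degrees
        ry: float  # rotation about the Y axis, in degrees
        rz: float  # rotation about the Z axis, in degrees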
[0052] Figure 3 shows a section of the first 3D video lens attitude data obtained from a 3D game. The first 3D video lens attitude data includes a first rotation data, i.e. the rotation quantity occurring about one coordinate axis. It can be seen that the attitude of the lens is defined artificially by the game developers according to the requirements of visual effects in the 3D game and is not limited by outside conditions (the lens can rotate 360° in any direction), and the curve connecting these first 3D video lens attitude data may have multiple discontinuous points (this is related to the game engine). However, in an actual application the control rods of a 6-DOF motion platform are telescopic mechanical devices, each of which has an allowable stroke range that limits the motion amplitude of the motion platform, so the unconstrained 360° rotation possible in the 3D game cannot be achieved by the motion platform. There is therefore a constraint difference between the rotation of the motion platform and the rotation of the 3D game lens, and the lens rotation data may be discontinuous; these special situations must be noted and handled reasonably before the 6-DOF motion platform simulates the motion.
[0053] When the 6-DOF motion platform is in the initial state, the lower platform and the upper platform are parallel, the radius of the circumcircle on which the hinge points of the lower platform or the upper platform are located is R, and the spacing distance between the lower platform and the upper platform is D (R and D are fixed values for a given motion platform, but may vary between motion platforms of different designs). Since each control rod is in its initial state, its expansion and contraction quantity is 0 (the expansion and contraction quantity refers to the delta value Δ of the length of the rod relative to its length in the initial state when the control rod expands or contracts; when the control rod is contracted, the expansion and contraction quantity is a negative value, and when the control rod is expanded, the expansion and contraction quantity is a positive value).
[0054] Assuming that the control rod has an expansion and contraction range of (-L0, +L0), this range can merely support the motion platform in achieving a maximum inclination angle of θ in one direction, as shown in Figure 4, where θ is generally a rather small angle (for example, when R = 500, D = 464 and L0 = 45, θ is merely in a range of about 4° to 6.5°). Hence the lens attitude data in the 3D game must be converted into a range that the motion platform can accept.
[0055] The first rotation data of the virtual lens obtained from the 3D game includes: RotateX[n]; RotateY[n]; RotateZ[n];
[0056] where n = 0, 1, 2, ... (n represents the number of frames), and RotateX[n] represents the rotation quantity of the nth frame about the X axis.
[0057] Firstly, the first rotation data is processed to suppress possible glitch points in the data so as to ensure that its curve is smooth and continuous:

    Δ = RotateX[i] - RotateX[i - 1]
    if fabs(Δ) > 270°:
        RotateX[i] = RotateX[i] - (Δ / fabs(Δ)) * 360°

[0058] where fabs represents the operation of obtaining an absolute value.
[0059] RotateY[n] and RotateZ[n] are processed according to the above operations, so that the possible glitch points in the first rotation data are suppressed and a second rotation data is obtained.
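Read as pseudo-code, the rule in paragraphs [0057]-[0058] is an unwrapping pass over consecutive frames. The following Python sketch is one possible reading of it; the function name suppress_glitches, the threshold parameter and the sample values are illustrative and not taken from the patent:

    import math

    def suppress_glitches(angles, threshold=270.0):
        """Remove 360-degree jumps ("glitch points") from a per-frame rotation
        sequence (values in degrees), following the rule described above."""
        out = list(angles)
        for i in range(1, len(out)):
            delta = out[i] - out[i - 1]
            if math.fabs(delta) > threshold:
                # Pull the current frame back by one full turn in the
                # direction of the jump so the curve stays continuous.
                out[i] -= math.copysign(360.0, delta)
        return out

    # Example: a camera crossing the 0/360 boundary between frames.
    print(suppress_glitches([350.0, 355.0, 2.0, 8.0]))
    # -> [350.0, 355.0, 362.0, 368.0]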
[0060] A curve connected by the second rotation data is as shown in Figure 5.
[0061] Then the second rotation data needs to be converted into the motion range supported by the motion platform, so RotateX[n], RotateY[n] and RotateZ[n] are processed sequentially by a piecewise compressing method, the steps of which are set forth as follows:
a) finding the maximum value max in RotateX[n], and defining maxItem = factor * max;
b) taking the absolute values of RotateX[n], counting them piecewise from small to large into a statistic array Statistic[m] (from 0 to 720, one section per degree), and accumulating the counts:

    sum = Statistic[0], i = 0;
    sum = sum + Statistic[i]; i = i + 1;
    if sum > factor * n, then maxItem = i - 1;

c) compressing each value:

    if fabs(RotateX[i]) <= maxItem:
        RotateX[i] = RotateX[i] / maxItem * 0.9 * θ
    else if fabs(RotateX[i]) > maxItem:
        RotateX[i] = (0.9 * θ + (fabs(RotateX[i]) - maxItem) / ((1 - factor) * max) * 0.1 * θ) * RotateX[i] / fabs(RotateX[i])

[0062] where factor is an adjustable parameter whose value lies in (0, 1). The smaller the factor, the greater the variation of the compressed data curve (which can enhance the game experience); however, the smaller the factor, the less smooth the compressed data curve, so the factor needs to be adjusted according to requirements.
[0063] The same compressing processing is done to RotateY[n] and RotateZ[n] to obtain the compressed data, of which a data curve is as shown in Figure 6 (unit: radian).
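One possible reading of steps a) to c) is sketched below in Python. It takes maxItem = factor * max directly from step a) and leaves out the statistic-array bookkeeping of step b), so it is an interpretation rather than the exact procedure of the patent; the function name and the numeric example are illustrative:

    import math

    def piecewise_compress(angles, theta_max, factor=0.8):
        """Compress per-frame rotation angles (degrees) into the platform's
        +/- theta_max range.  Magnitudes up to max_item = factor * max are
        scaled linearly into +/- 0.9 * theta_max; the larger, rarer values
        share the remaining 0.1 * theta_max band, so occasional extremes
        cannot flatten the rest of the curve."""
        max_val = max((abs(a) for a in angles), default=0.0)
        if max_val == 0.0:
            return list(angles)
        max_item = factor * max_val
        out = []
        for a in angles:
            m = abs(a)
            if m <= max_item:
                out.append(a / max_item * 0.9 * theta_max)
            else:
                scaled = (0.9 * theta_max
                          + (m - max_item) / ((1.0 - factor) * max_val) * 0.1 * theta_max)
                out.append(math.copysign(scaled, a))
        return out

    # Example: compress wide camera rotations into a +/- 6 degree platform range.
    print(piecewise_compress([1.0, 5.0, 30.0, -170.0, 180.0], theta_max=6.0))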
[0064] Then the compressed data is smoothed. As shown in Figure 7, it can be discovered by partially enlarging the compressed data curve obtained from the piecewise compressing that saltations still exist in the curve. These abrupt rotation data will make the motion platform shake when simulating motions, thus making the whole motion process disharmonious, so all the compressed data need to be processed further to suppress these saltations as far as possible without damaging the overall shape of the curve. The present disclosure uses a five-spot triple filtering algorithm for discrete signals to suppress the unsmooth data. The specific steps of the algorithm are set forth below:
obtaining the first two frames of data with formulas (1) and (2):

    RotateX[0] = (69 * RotateX[0] + 4 * RotateX[1] - 6 * RotateX[2] + 4 * RotateX[3] - RotateX[4]) / 70    (1)
    RotateX[1] = (2 * RotateX[0] + 27 * RotateX[1] + 12 * RotateX[2] - 8 * RotateX[3] + 2 * RotateX[4]) / 35    (2)

obtaining a middle, ith, frame of data with formula (3):

    RotateX[i] = (-3 * RotateX[i - 2] + 12 * RotateX[i - 1] + 17 * RotateX[i] + 12 * RotateX[i + 1] - 3 * RotateX[i + 2]) / 35    (3)

obtaining the last two frames of data with formulas (4) and (5):

    RotateX[n - 1] = (2 * RotateX[n - 4] - 8 * RotateX[n - 3] + 12 * RotateX[n - 2] + 27 * RotateX[n - 1] + 2 * RotateX[n]) / 35    (4)
    RotateX[n] = (-RotateX[n - 4] + 4 * RotateX[n - 3] - 6 * RotateX[n - 2] + 4 * RotateX[n - 1] + 69 * RotateX[n]) / 70    (5)

[0065] The same processing is done to RotateY[n] and RotateZ[n], and the finally obtained result and the pre-processing result are as shown in Figures 8 and 7, respectively. It can be seen that the algorithm effectively suppresses saltations without damaging the overall shape of the curve.
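Formulas (1) to (5) match the classical five-point cubic smoothing coefficients for discrete sequences, so a direct transcription into Python (function name and example values illustrative) could read:

    def five_point_cubic_smooth(x):
        """Five-spot triple (five-point cubic) smoothing of a discrete sequence
        using the endpoint and interior formulas (1)-(5).  Sequences shorter
        than five samples are returned unchanged."""
        n = len(x)
        if n < 5:
            return list(x)
        y = [0.0] * n
        y[0] = (69 * x[0] + 4 * x[1] - 6 * x[2] + 4 * x[3] - x[4]) / 70.0
        y[1] = (2 * x[0] + 27 * x[1] + 12 * x[2] - 8 * x[3] + 2 * x[4]) / 35.0
        for i in range(2, n - 2):
            y[i] = (-3 * x[i - 2] + 12 * x[i - 1] + 17 * x[i]
                    + 12 * x[i + 1] - 3 * x[i + 2]) / 35.0
        y[n - 2] = (2 * x[n - 5] - 8 * x[n - 4] + 12 * x[n - 3] + 27 * x[n - 2] + 2 * x[n - 1]) / 35.0
        y[n - 1] = (-x[n - 5] + 4 * x[n - 4] - 6 * x[n - 3] + 4 * x[n - 2] + 69 * x[n - 1]) / 70.0
        return y

    # Example: the isolated spike at index 3 is pulled toward its neighbours.
    print(five_point_cubic_smooth([0.0, 0.1, 0.2, 1.0, 0.4, 0.5, 0.6]))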
[0066] Then the compressed data after smoothing is used for simulating motions of the motion platform, and the expansion and contraction quantity of the control rod is obtained.
[0067] Before the simulation, the 3D rotation process in a fixed coordinate system and the formula derivation thereof must be illustrated first.
[0068] As shown in Figure 9, a point (x1, y1, z1) is reached by rotating a point (x0, y0, z0) by θ around the z axis, and obviously z1 = z0. At the initial position, the coordinates of the point may be represented as:

    x0 = L * sinα,  y0 = L * cosα,

and the new coordinates of the point after the rotation may be represented as:

    x1 = L * sin(α + θ) = L * sinα * cosθ + L * cosα * sinθ
    y1 = L * cos(α + θ) = L * cosα * cosθ - L * sinα * sinθ
    z1 = z0

When described with vectors and matrices, the above process may be written as:

    [x1]   [ cosθ   sinθ   0 ] [x0]        [x0]
    [y1] = [ -sinθ  cosθ   0 ] [y0] = Mz * [y0]    (6)
    [z1]   [  0      0     1 ] [z0]        [z0]

Similarly, rotating a point by α around the x axis may be represented as:

    [x1]   [ 1    0      0    ] [x0]        [x0]
    [y1] = [ 0   cosα   sinα  ] [y0] = Mx * [y0]    (7)
    [z1]   [ 0  -sinα   cosα  ] [z0]        [z0]

Similarly, rotating a point by β around the y axis may be represented as:

    [x1]   [ cosβ   0   -sinβ ] [x0]        [x0]
    [y1] = [  0     1     0   ] [y0] = My * [y0]    (8)
    [z1]   [ sinβ   0    cosβ ] [z0]        [z0]

Therefore, in the fixed 3D coordinate system, the final new coordinates of a point after successively rotating it by the angles α, β and θ around the axes x, y and z are represented as:

    [x1]                 [x0]
    [y1] = Mz * My * Mx  [y0]    (9)
    [z1]                 [z0]
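Formulas (6) to (9) can be transcribed directly into code. The sketch below assumes angles in radians and uses NumPy for the matrix products; the function names are illustrative:

    import math
    import numpy as np

    def rotation_matrices(alpha, beta, theta):
        """Rotation matrices about the fixed X, Y and Z axes, in the sign
        convention of formulas (6)-(8); angles are in radians."""
        mx = np.array([[1.0, 0.0, 0.0],
                       [0.0, math.cos(alpha), math.sin(alpha)],
                       [0.0, -math.sin(alpha), math.cos(alpha)]])
        my = np.array([[math.cos(beta), 0.0, -math.sin(beta)],
                       [0.0, 1.0, 0.0],
                       [math.sin(beta), 0.0, math.cos(beta)]])
        mz = np.array([[math.cos(theta), math.sin(theta), 0.0],
                       [-math.sin(theta), math.cos(theta), 0.0],
                       [0.0, 0.0, 1.0]])
        return mx, my, mz

    def rotate_point(p, alpha, beta, theta):
        """Apply formula (9): rotate the column vector p successively about X, Y and Z."""
        mx, my, mz = rotation_matrices(alpha, beta, theta)
        return mz @ my @ mx @ np.asarray(p, dtype=float)

    # Example: rotating (1, 0, 0) by 90 degrees about Z.
    print(rotate_point([1.0, 0.0, 0.0], 0.0, 0.0, math.pi / 2))
    # -> approximately [0, -1, 0] under this sign convention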
[0069] It is significant for any simulation of motion to use a coordinate system as a reference. As shown in Figures 10 and 11, a 3D rectangular coordinate system is established on the 6-DOF motion platform as the reference coordinate system, with the center of the upper platform as the coordinate origin (the established coordinate system needs to be consistent with the coordinate system of the 3D game: both coordinate systems are either left-handed or right-handed, or both are converted into the same coordinate system).
[0070] For a specific motion platform in an actual situation, some parameters of the coordinate system of the motion platform in its initial state are fixed values. For example, the radius R of the circumcircle on which the three hinge points of the lower platform or of the upper platform (A, B and C for the upper platform) are located is fixed (assuming that the lower platform and the upper platform are consistent; they may also be inconsistent, in which case the method is similar and the subsequent derivation is not affected); all the arcs between the three hinge points on the circumcircle are 120°; and the distance D between the lower platform and the upper platform (O'O = D) is fixed, and so on.
[0071] It is easy to compute the coordinates of the six hinge points A, B, C, D, E and F of the lower platform and the upper platform in the initial state according to these initial fixed parameters; the coordinates are respectively denoted by (XA, YA, ZA), (XB, YB, ZB), (XC, YC, ZC), (XD, YD, ZD), (XE, YE, ZE) and (XF, YF, ZF). It is then not difficult to compute the initial lengths L1, L2, L3, L4, L5 and L6 of the six control rods (i.e. the Euclidean distances between pairs of points in 3D space) from these six initial coordinates.
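By way of a non-limiting illustration, the initial geometry and rod lengths can be set up as in the Python sketch below. It assumes three hinge points spaced 120° apart on each platform, a 60° phase offset between the upper and lower circles, equal radii, the origin at the center of the upper platform, and a 3-3 rod pairing in which each upper hinge connects to its two nearest lower hinges; a real platform would substitute its own measured hinge layout:

    import math

    def hinge_points(radius, z, phase_deg=0.0):
        """Three hinge points spaced 120 degrees apart on a circle of the given
        radius, at height z in the platform coordinate system."""
        return [(radius * math.cos(math.radians(phase_deg + 120.0 * k)),
                 radius * math.sin(math.radians(phase_deg + 120.0 * k)),
                 z) for k in range(3)]

    def six_rod_lengths(upper, lower):
        """Lengths of the six control rods for the assumed 3-3 layout: each
        upper hinge is connected to its two nearest lower hinges."""
        return [math.dist(upper[i], lower[j])
                for i in range(3) for j in (i, (i - 1) % 3)]

    # Illustrative values echoing the example above: R = 500, D = 464.
    R, D = 500.0, 464.0
    upper = hinge_points(R, 0.0)                  # A, B, C on the upper platform
    lower = hinge_points(R, -D, phase_deg=60.0)   # D, E, F on the lower platform
    L_init = six_rod_lengths(upper, lower)        # initial lengths L1..L6
    print([round(v, 1) for v in L_init])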
[0072] When the 6-DOF motion platform simulates motions, the lower platform is fixed immovably, and the upper platform moves (including rotation and translation) within the allowable range. Assuming that the upper platform successively rotates by the angles α, β and θ around the axes X, Y and Z, then according to formula (9) the coordinates of the three points A, B and C in the 3D space will change into (XA', YA', ZA'), (XB', YB', ZB') and (XC', YC', ZC'), where

    [XA']                 [XA]
    [YA'] = Mz * My * Mx  [YA]
    [ZA']                 [ZA]

[for Mx, My and Mz, see formulas (6), (7) and (8)].
[0073] Similarly, the new coordinates of the point B and the point C are computed successively.
[0074] As such, the coordinates of the three hinge points (A, B and C) of the upper platform among the six hinge points are changed by the motion of the motion platform, and the coordinates of the six hinge points A, B, C, D, E and F after the change are successively represented as (XA', YA', ZA'), (XB', YB', ZB'), (XC', YC', ZC'), (XD, YD, ZD), (XE, YE, ZE) and (XF, YF, ZF). It is then not difficult to compute the lengths L1', L2', L3', L4', L5' and L6' of the six control rods again according to the new coordinates. Then, in comparison with the lengths of the control rods in the initial state, the expansion and contraction quantity of each control rod can be computed as:

    Δ1 = L1' - L1
    Δ2 = L2' - L2
    Δ3 = L3' - L3
    Δ4 = L4' - L4
    Δ5 = L5' - L5
    Δ6 = L6' - L6    (10)

[0075] The whole process is illustrated above by taking a single frame as an example.
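Continuing the same illustrative geometry as in the previous sketch, formula (10) can be computed per frame as follows; only the rotation of the upper platform is considered here, as in the text above, and the 3-3 rod pairing remains an assumption:

    import math
    import numpy as np

    def rotation(alpha, beta, theta):
        """Combined rotation Mz * My * Mx of formula (9); angles in radians."""
        ca, sa = math.cos(alpha), math.sin(alpha)
        cb, sb = math.cos(beta), math.sin(beta)
        ct, st = math.cos(theta), math.sin(theta)
        mx = np.array([[1, 0, 0], [0, ca, sa], [0, -sa, ca]], dtype=float)
        my = np.array([[cb, 0, -sb], [0, 1, 0], [sb, 0, cb]], dtype=float)
        mz = np.array([[ct, st, 0], [-st, ct, 0], [0, 0, 1]], dtype=float)
        return mz @ my @ mx

    def rod_deltas(upper, lower, alpha, beta, theta, initial_lengths):
        """Formula (10): rotate the upper hinge points A, B, C, recompute the six
        rod lengths and return the expansion/contraction quantities."""
        m = rotation(alpha, beta, theta)
        moved = [m @ np.asarray(p, dtype=float) for p in upper]
        new_lengths = [float(np.linalg.norm(moved[i] - np.asarray(lower[j], dtype=float)))
                       for i in range(3) for j in (i, (i - 1) % 3)]
        return [new - old for new, old in zip(new_lengths, initial_lengths)]

    # Example with the same illustrative hinge layout as before (R = 500, D = 464).
    R, D = 500.0, 464.0
    upper = [(R * math.cos(math.radians(120 * k)), R * math.sin(math.radians(120 * k)), 0.0)
             for k in range(3)]
    lower = [(R * math.cos(math.radians(60 + 120 * k)), R * math.sin(math.radians(60 + 120 * k)), -D)
             for k in range(3)]
    L_init = [float(np.linalg.norm(np.asarray(u) - np.asarray(q)))
              for i, u in enumerate(upper) for q in (lower[i], lower[(i - 1) % 3])]
    deltas = rod_deltas(upper, lower, math.radians(2), math.radians(-1), math.radians(3), L_init)
    print([round(d, 2) for d in deltas])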
[0076] For the multi-frame data of a 3D game (or a 3D movie), a 3D model of the motion platform is established in a DirectX3D environment. When the 3D model simulates motions, the model of the lower platform is fixed immovably, and the rotation data RotateX[n], RotateY[n] and RotateZ[n] obtained in the step S4 (successively corresponding to the angles α, β and θ above) are successively substituted into formulas (6), (7) and (8) to compute the Mx, My and Mz corresponding to each frame; the attitude of the model of the upper platform is then simulated with formula (9), and meanwhile the model of the control rods expands and contracts with the changes of the attitude of the upper platform. Finally, the expansion and contraction quantities Δ1, Δ2, Δ3, Δ4, Δ5 and Δ6 of the control rods corresponding to the lens data of each frame can be computed with formula (10) in the DirectX3D world coordinate system.
[0077] The expansion and contraction quantity data of each frame are saved in order, and when required, the data are used for controlling the motion platform, so that the motion platform moves synchronously with the contents of the game, thus completing the conversion of the data and the control of the motion platform.
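As one possible realisation of this step, the per-frame stroke data could be written to a simple file and replayed at the video frame rate. The CSV format, the send_to_platform callback and the sleep-based pacing in the sketch below are illustrative assumptions and not part of the patent:

    import csv
    import time

    def save_strokes(path, strokes_per_frame):
        """Save per-frame rod strokes (six delta values per frame) in frame order."""
        with open(path, "w", newline="") as f:
            csv.writer(f).writerows(strokes_per_frame)

    def play_strokes(path, frame_rate, send_to_platform):
        """Replay saved strokes at the video frame rate; send_to_platform stands
        in for whatever interface actually drives the six control rods."""
        period = 1.0 / frame_rate
        with open(path, newline="") as f:
            for row in csv.reader(f):
                send_to_platform([float(v) for v in row])
                time.sleep(period)  # crude pacing; a real player would sync to the video clock

    # Example with two dummy frames at 24 frames per second.
    save_strokes("strokes.csv", [[0.0] * 6, [1.5, -0.4, 0.2, 0.0, 0.7, -1.1]])
    play_strokes("strokes.csv", 24, print)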
[0078] The methods of the present disclosure may be implemented by computer-executable instructions. The computer-executable instructions may be provided on a computer-readable medium such as a removable computer memory stick, or a magnetic or optical disc such as a CD-ROM, CD-R/W or DVD. The computer-readable medium may be a computer memory, or may be a transmission signal or medium for data transmission.
[0079] The present disclosure is further illustrated in detail above with reference to specific preferred embodiments, but the specific implementations of the present disclosure shall not be deemed to be limited to these illustrations. For those of ordinary skill in the art, simple derivations or substitutions made without departing from the concept of the present disclosure should be regarded as falling within the protection scope of the present disclosure.

Claims (9)

  1. A method for interactively matching a motion platform with 3D video data, wherein the motion platform comprises a control rod, and the method comprises: a step A of obtaining a first 3D video lens attitude data including a first rotation data; a step B of obtaining a second rotation data by filtering out glitch data from the first rotation data; a step C of piecewise compressing the second rotation data so that compressed data obtained from the piecewise compressing is within a motion range supported by the motion platform; a step D of establishing a reference coordinate system to simulate motions of the motion platform according to rotation information in the compressed data; a step E of computing an expansion and contraction quantity of the control rod of the motion platform in the reference coordinate system by using a 3D rotation formula; and a step F of controlling the motion platform to move accordingly according to the expansion and contraction quantity of the control rod of the motion platform.
  2. The method for interactively matching a motion platform with 3D video data of claim 1, wherein between the step C and the step D, further comprising a step of: smoothing the glitch data existing in the compressed data by utilizing a smooth filtering method.
  3. The method for interactively matching a motion platform with 3D video data of claim 1 or claim 2, wherein the piecewise compressing in the step C comprises: dividing the second rotation data into a plurality of sections according to data size, and respectively scaling the plurality of sections with different scales.
  4. The method for interactively matching a motion platform with 3D video data of claim 2 or claim 3 when dependent on claim 2, wherein the smooth filtering method comprises: filtering out glitch points in the compressed data with a five-spot triple filtering algorithm for discrete signals.
  5. The method for interactively matching a motion platform with 3D video data of any of claims 1 to 4, wherein the motion platform is a motion platform with multiple degrees of freedom.
  6. The method for interactively matching a motion platform with 3D video data of any of claims 1 to 5, wherein the 3D video comprises 3D games and 3D movies.
  7. A system for interactively matching a motion platform with 3D video data, comprising: an obtaining module, which is configured for obtaining a first 3D video lens attitude data including a first rotation data; a suppression module, which is configured for obtaining a second rotation data by filtering out glitch data from the first rotation data; a compression module, which is configured for piecewise compressing the second rotation data so that compressed data obtained from the piecewise compressing is within a motion range supported by the motion platform; a generation module, which is configured for establishing a reference coordinate system to simulate motions of the motion platform according to rotation information in the compressed data; a computation module, which is configured for computing an expansion and contraction quantity of a control rod of the motion platform in the reference coordinate system by using a 3D rotation formula; and a control module, which is configured for controlling the motion platform to move accordingly according to the expansion and contraction quantity of the control rod of the motion platform.
  8. The system for interactively matching a motion platform with 3D video data of claim 7, further comprising: a filtering module, which is configured for smoothing the glitch data existing in the compressed data by utilizing a smooth filtering method.
  9. The system for interactively matching a motion platform with 3D video data of claim 8, wherein the piecewise compressing comprises: dividing the second rotation data into a plurality of sections according to data size, and scaling the plurality of sections with different scales.
  10. The system for interactively matching a motion platform with 3D video data of claim 9, wherein the smooth filtering method comprises: filtering out glitch points in the compressed data with a five-spot triple filtering algorithm for discrete signals, and the motion platform is a motion platform with multiple degrees of freedom.
  11. A computer readable medium having computer-executable instructions to perform the method of any of claims 1 to 6.
GB1415309.2A 2013-08-30 2014-08-29 Method and system for interactively matching motion platform with 3D video data Active GB2519848B (en)

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201310389104.1A CN103413329B (en) 2013-08-30 2013-08-30 A kind of motion platform mates interactive approach and system with 3D video data

Publications (3)

Publication Number Publication Date
GB201415309D0 GB201415309D0 (en) 2014-10-15
GB2519848A true GB2519848A (en) 2015-05-06
GB2519848B GB2519848B (en) 2017-06-28

Family

ID=49606334

Family Applications (1)

Application Number Title Priority Date Filing Date
GB1415309.2A Active GB2519848B (en) 2013-08-30 2014-08-29 Method and system for interactively matching motion platform with 3D video data

Country Status (2)

Country Link
CN (1) CN103413329B (en)
GB (1) GB2519848B (en)

Families Citing this family (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103885465A (en) * 2014-04-02 2014-06-25 中国电影器材有限责任公司 Method for generating dynamic data of dynamic seat based on video processing
CN104091360A (en) * 2014-07-28 2014-10-08 周立刚 Method and device for generating movement data through dynamic cinema
CN106582035A (en) * 2016-11-18 2017-04-26 深圳市远望淦拓科技有限公司 Simulation method of MDOF (Multi-degree of Freedom) motion platform
CN107272901B (en) * 2017-06-23 2020-05-19 歌尔科技有限公司 Motion seat control method and system
CN107376344B (en) * 2017-07-26 2020-11-06 歌尔光学科技有限公司 Method, device and system for matching game output data with virtual reality seat
CN108011884B (en) * 2017-12-07 2022-07-01 指挥家(厦门)科技有限公司 Attitude data transmission optimization method and device
CN108031118B (en) * 2017-12-12 2020-09-01 苏州蜗牛数字科技股份有限公司 Method for establishing surface model interactive somatosensory interface
CN108525331B (en) * 2018-04-12 2020-02-07 大连博跃科技发展有限公司 Method for making dynamic experience data
CN110163087B (en) * 2019-04-09 2022-03-25 江西高创保安服务技术有限公司 Face gesture recognition method and system
CN110430270B (en) * 2019-08-08 2022-03-25 网易(杭州)网络有限公司 Carrier data synchronization method and device

Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20090115232A1 (en) * 2007-11-07 2009-05-07 Montecito Research, Inc. Motion simulation chair

Family Cites Families (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR101107537B1 (en) * 2006-03-15 2012-02-06 퀄컴 인코포레이티드 Sensor-based orientation system
CN102591347B (en) * 2012-01-19 2014-07-30 河海大学常州校区 Multi-leg mobile platform and attitude and height control method thereof
CN103009376B (en) * 2012-12-04 2015-01-14 天津大学 Spatial three-dimensional rotation parallel mechanism

Patent Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20090115232A1 (en) * 2007-11-07 2009-05-07 Montecito Research, Inc. Motion simulation chair

Also Published As

Publication number Publication date
GB2519848B (en) 2017-06-28
CN103413329A (en) 2013-11-27
GB201415309D0 (en) 2014-10-15
CN103413329B (en) 2016-08-31

Similar Documents

Publication Publication Date Title
GB2519848A (en) Method and system for interactively matching motion platform with 3D video data
CN104932677B (en) Interactive more driver's virtual realities drive system
JP2021193599A (en) Virtual object figure synthesizing method, device, electronic apparatus, and storage medium
US20160225188A1 (en) Virtual-reality presentation volume within which human participants freely move while experiencing a virtual environment
CN110850977B (en) Stereoscopic image interaction method based on 6DOF head-mounted display
US9171402B1 (en) View-dependent textures for interactive geographic information system
WO2019023397A1 (en) Systems and methods for real-time complex character animations and interactivity
JP2018523326A (en) Full spherical capture method
CN111161395B (en) Facial expression tracking method and device and electronic equipment
CN103327217B (en) A kind of method for processing video frequency and device
CN110506419B (en) Rendering extended video in virtual reality
US11574416B2 (en) Generating body pose information
CN106774930A (en) A kind of data processing method, device and collecting device
CN108170940A (en) A kind of computational methods of the fundamental physical quantity of hull
US11243606B2 (en) Method and apparatus for controlling deformation of flexible virtual reality interaction controller, and virtual reality interaction system
CN111508080B (en) Method for realizing adhesive tape winding simulation based on UE4 engine and related equipment
CN105915877A (en) Free film watching method and device of three-dimensional video
CN108379841B (en) Game special effect processing method and device and terminal
WO2020043105A1 (en) Image display method, device, and system
US20210209808A1 (en) Compression of dynamic unstructured point clouds
JP6852224B2 (en) Sphere light field rendering method in all viewing angles
WO2014131733A1 (en) Method for reproducing an item of audiovisual content having haptic actuator control parameters and device implementing the method
CN111445568A (en) Character expression editing method and device, computer storage medium and terminal
WO2023240999A1 (en) Virtual reality scene determination method and apparatus, and system
CN109426332A (en) A kind of information processing method, device and virtual reality device