CN114405004A - VR recreation intelligent management system based on big data feature recognition - Google Patents
- Publication number
- CN114405004A (application number CN202210071775.2A)
- Authority
- CN
- China
- Prior art keywords
- game
- user
- action
- role
- limb
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63F—CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
- A63F13/00—Video games, i.e. games using an electronically generated display having two or more dimensions
- A63F13/70—Game security or game management aspects
- A63F13/79—Game security or game management aspects involving player-related data, e.g. identities, accounts, preferences or play histories
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63F—CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
- A63F13/00—Video games, i.e. games using an electronically generated display having two or more dimensions
- A63F13/50—Controlling the output signals based on the game progress
- A63F13/52—Controlling the output signals based on the game progress involving aspects of the displayed game scene
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63F—CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
- A63F13/00—Video games, i.e. games using an electronically generated display having two or more dimensions
- A63F13/70—Game security or game management aspects
- A63F13/77—Game security or game management aspects involving data related to game devices or game servers, e.g. configuration data, software version or amount of memory
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/011—Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/017—Gesture based interaction, e.g. based on a set of recognized hand gestures
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63F—CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
- A63F2300/00—Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
- A63F2300/80—Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game specially adapted for executing a specific type of game
- A63F2300/8082—Virtual reality
Landscapes
- Engineering & Computer Science (AREA)
- Multimedia (AREA)
- General Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- Business, Economics & Management (AREA)
- Computer Security & Cryptography (AREA)
- General Business, Economics & Management (AREA)
- Human Computer Interaction (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Measurement Of The Respiration, Hearing Ability, Form, And Blood Characteristics Of Living Organisms (AREA)
Abstract
The invention discloses a VR game intelligent management system based on big data feature recognition, which comprises a user head posture parameter acquisition module, a role visual field switching parameter comparison module, a role visual field operation sensitivity analysis module, a user operation gesture acquisition module, a role action conformity degree analysis module, a limb action parameter detection module, a limb action synchronization analysis module and a game operation test experience analysis module. The invention calculates a comprehensive operation test experience coefficient of a user for a VR game by acquiring the visual field operation sensitivity coefficient, the hand action conformity degree and the limb action synchronous matching degree coefficient of the game role in the VR game picture operated by the user; if the comprehensive operation test experience coefficient is smaller than a preset VR game operation test experience coefficient threshold value, corresponding VR game optimization processing measures are carried out. This improves the user's experience and sense of immersion in VR game operation and increases the user's stickiness to the VR game, thereby providing a cornerstone for the healthy development of VR games.
Description
Technical Field
The invention relates to the field of VR game intelligent management, in particular to a VR game intelligent management system based on big data feature recognition.
Background
A VR game is a virtual reality game, a product of the development of virtual reality technology. Simply by putting on virtual reality equipment, a user can enter an interactive virtual game scene; not only can present-day scenes be simulated, but also the past and the future, giving the user an immersive game experience. VR games are therefore deeply favored by game enthusiasts.
However, in current VR game operation experience management systems, the visual field pictures of game roles are switched mainly according to the fixed swing direction of the VR game equipment, so the switching of the game role's visual field picture is delayed. This gives the user a poor operation visual field experience of the VR game, may even cause the experiencing user a certain dizziness, and thus affects the user's immersive game experience and can make the experiencing user develop an aversion to the VR game;
in current VR game operation experience, the user basically specifies a position coordinate in the VR game picture and, once the coordinate is determined, is moved to it instantly. This movement mode is too abrupt and cannot move synchronously with the user's limb actions, which reduces the user's experience and sense of immersion in VR game operation, affects the user's stickiness to the VR game, and in turn causes the loss of VR game experience users;
in order to solve the problems, a VR game intelligent management system based on big data feature recognition is designed.
Disclosure of Invention
In view of the problems existing in the prior art, the invention provides an intelligent VR game management system based on big data feature recognition, which is used for solving the technical problems.
In order to achieve the above objects and other objects, the present invention adopts the following technical solutions:
a VR game intelligent management system based on big data feature recognition comprises a user head posture parameter acquisition module, a role visual field switching parameter comparison module, a role visual field operation sensitivity analysis module, a user operation gesture acquisition module, a role action conformity degree analysis module, a limb action parameter detection module, a limb action synchronous analysis module and a game operation test experience analysis module;
the user head posture parameter acquisition module is used for acquiring head posture parameters when a user operates a VR game, and comprises the following specific steps:
S11, acquiring, through the VR game equipment, the head swing horizontal angle in the head posture parameters when the user operates the VR game, and marking it as wa1;
S12, acquiring the head swing vertical angle in the head posture parameters when the user operates the VR game, and marking it as wa2;
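As a minimal sketch of steps S11 and S12, the head swing angles reported by the headset can be recorded as a pair of posture parameters. The `HeadPose` container and the yaw/pitch inputs are illustrative assumptions, since the patent does not specify the device API:

```python
from dataclasses import dataclass

@dataclass
class HeadPose:
    # Head posture parameters when the user operates the VR game
    wa1: float  # S11: head swing horizontal angle, degrees
    wa2: float  # S12: head swing vertical angle, degrees

def sample_head_pose(yaw_deg: float, pitch_deg: float) -> HeadPose:
    """Record the horizontal (yaw) and vertical (pitch) swing angles
    reported by the VR game equipment (input names assumed)."""
    return HeadPose(wa1=yaw_deg, wa2=pitch_deg)
```

The two recorded angles are the inputs later compared against the game role's view switching angles.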
The character view switching parameter comparison module is used for comparing and analyzing game character view switching parameters when a user operates a VR game, and specifically comprises the following steps:
S21, extracting the switching horizontal angle of the game role visual field in the VR game picture operated by the user, comparing it with the head swing horizontal angle when the user operates the VR game, and obtaining the switching horizontal angle difference Δw'b1 of the game role visual field in the VR game picture operated by the user;
S22, extracting the switching vertical angle of the game role visual field in the VR game picture operated by the user, comparing it with the head swing vertical angle when the user operates the VR game, and obtaining the switching vertical angle difference Δw'b2 of the game role visual field in the VR game picture operated by the user;
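Steps S21 and S22 reduce to comparing two angle pairs. A minimal sketch, assuming the differences are taken as absolute values (the patent does not state the sign convention):

```python
def view_switch_angle_differences(wa1: float, wa2: float,
                                  wb1: float, wb2: float) -> tuple[float, float]:
    """Compare the head swing angles (wa1, wa2) with the game role's
    visual field switching angles (wb1, wb2) extracted from the VR game
    picture, giving the differences for S21 and S22."""
    dwb1 = abs(wa1 - wb1)  # switching horizontal angle difference Δw'b1
    dwb2 = abs(wa2 - wb2)  # switching vertical angle difference Δw'b2
    return dwb1, dwb2
```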
The character visual field operation sensitivity analysis module is used for analyzing the visual field operation sensitivity coefficient of a game character in a VR game picture operated by a user, and the method specifically comprises the following steps:
S31, obtaining the delay time of the visual field switching of the game role in the VR game picture operated by the user;
S32, comprehensively analyzing the visual field operation sensitivity coefficient of the game role in the VR game picture operated by the user;
the user operation gesture obtaining module is used for obtaining a hand operation gesture when a user operates the VR game, and obtaining an operation instruction action corresponding to the hand operation gesture when the user operates the VR game through comparison;
the role action conformity degree analysis module is used for acquiring the operation instruction action of the game role in the VR game picture operated by the user and analyzing the hand action conformity degree of the game role in the VR game picture operated by the user;
the limb action parameter detection module is used for detecting limb action parameters in each limb action state when a user operates the VR game;
the limb action synchronous analysis module is used for analyzing a limb action synchronous matching degree coefficient of a game role in a VR game picture operated by a user, and the method specifically comprises the following steps:
S51, detecting the limb vibration data when the user operates the VR game, and marking it as Fa;
S52, obtaining the action movement rate of the game role in the VR game picture operated by the user, and marking it as Vb;
S53, analyzing the limb action synchronous matching degree coefficient of the game role in the VR game picture operated by the user;
the game operation testing experience analysis module is used for calculating a comprehensive operation testing experience coefficient of the user to the VR game, comparing the comprehensive operation testing experience coefficient with a preset VR game operation testing experience coefficient threshold value, and if the comprehensive operation testing experience coefficient of the user to the VR game is smaller than the preset VR game operation testing experience coefficient threshold value, performing corresponding VR game optimization treatment measures.
Further, the step of obtaining the delay time for switching the view of the game character in the VR game picture operated by the user in the character view operation sensitivity analysis module includes:
acquiring the start time of the head swing when the user operates the VR game, and marking it as t_start_a;
acquiring the start time of the visual field switching of the game role in the VR game picture operated by the user, and marking it as t'_start_b;
substituting the head swing start time t_start_a and the visual field switching start time t'_start_b into the formula ΔT_delay = t'_start_b − t_start_a to obtain the delay time ΔT_delay of the visual field switching of the game role in the VR game picture operated by the user.
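The delay formula is a plain difference of two timestamps on the same clock. A one-line sketch:

```python
def view_switch_delay(t_start_a: float, t_start_b: float) -> float:
    """ΔT_delay = t'_start_b − t_start_a: time between the start of the
    user's head swing and the start of the game role's visual field
    switching (both timestamps on the same clock, e.g. milliseconds)."""
    return t_start_b - t_start_a
```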
Further, the character view operation sensitivity analysis module comprehensively analyzes a view operation sensitivity coefficient of a game character in a VR game screen operated by a user, and includes:
substituting the switching horizontal angle difference Δw'b1 of the game role visual field in the VR game picture operated by the user, the switching vertical angle difference Δw'b2 of the game role visual field in the VR game picture operated by the user, and the delay time ΔT_delay of the visual field switching of the game role in the VR game picture operated by the user into the role visual field operation sensitivity analysis model, obtaining the visual field operation sensitivity coefficient ξr1 of the game role in the VR game picture operated by the user, wherein α and β are respectively expressed as the operation sensitivity influence proportionality coefficients corresponding to the angle and the delay time of the game role visual field switching, W_err_b1 and W_err_b2 are respectively expressed as the allowable error of the switching horizontal angle and the allowable error of the switching vertical angle, and T'_set is expressed as the set delay time threshold for the visual field switching of the role in the VR game picture.
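The specification names the model's inputs and constants, but the formula itself appears only as an image that is not reproduced in the text. The sketch below is therefore a hypothetical reconstruction consistent with the named variables: angle differences beyond the allowable errors and delay beyond the set threshold each lower ξr1. All default constant values are assumptions:

```python
def view_sensitivity_coefficient(dwb1: float, dwb2: float, dt_delay: float,
                                 alpha: float = 0.5, beta: float = 0.5,
                                 w_err_b1: float = 2.0, w_err_b2: float = 2.0,
                                 t_set: float = 0.02) -> float:
    """Hypothetical visual field operation sensitivity model for ξr1.
    alpha/beta: influence proportionality coefficients for angle and delay;
    w_err_b1/w_err_b2: allowable switching angle errors (W_err_b1, W_err_b2);
    t_set: the set delay time threshold T'_set."""
    # Only error beyond the allowable bounds is penalized
    angle_excess = max(0.0, dwb1 - w_err_b1) + max(0.0, dwb2 - w_err_b2)
    delay_excess = max(0.0, dt_delay - t_set)
    # 1.0 means fully responsive; the coefficient shrinks as excess grows
    return 1.0 / (1.0 + alpha * angle_excess + beta * delay_excess)
```

With this shape, a view that tracks the head within tolerance scores exactly 1.0, matching the intuition that ξr1 degrades only once switching error or delay is perceptible.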
Further, the user operation gesture obtaining module is used for obtaining a hand operation gesture when the user operates the VR game, and includes:
establishing a hand space coordinate system of a user, and acquiring position coordinates of each hand positioning sensor when the user operates a VR game;
constructing hand gesture outlines of the users when the users operate the VR games according to the position coordinates of the hand positioning sensors when the users operate the VR games;
comparing the hand gesture outline when the user operates the VR game with each set standard operation gesture, counting the similarity between the hand gesture outline and each standard operation gesture when the user operates the VR game, and screening the standard operation gesture with the highest similarity corresponding to the hand gesture outline when the user operates the VR game.
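The gesture screening above (build a contour from sensor coordinates, score similarity against each standard gesture, pick the best) can be sketched as follows. The similarity metric is an illustrative choice, since the patent does not specify one; gesture outlines are taken as matched lists of 3-D sensor coordinates:

```python
import math

def gesture_similarity(contour, standard) -> float:
    """Similarity in (0, 1] between two hand gesture outlines, using the
    mean point-to-point distance (assumed metric) between matched 3-D
    positioning-sensor coordinates."""
    mean_dist = sum(math.dist(p, q) for p, q in zip(contour, standard)) / len(contour)
    return 1.0 / (1.0 + mean_dist)

def match_gesture(contour, standard_gestures: dict) -> str:
    """Screen out the set standard operation gesture with the highest
    similarity to the user's hand gesture outline."""
    return max(standard_gestures,
               key=lambda name: gesture_similarity(contour, standard_gestures[name]))
```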
Further, the role action conformity analysis module is used for acquiring the operation instruction action of the user for operating the game role in the VR game picture, and analyzing the hand action conformity of the user for operating the game role in the VR game picture, and specifically includes:
acquiring an operation instruction action of a user for operating a game role in a VR game picture;
comparing the operation instruction action of the user for operating the game role in the VR game picture with the operation instruction action of the user hand;
obtaining the hand action conformity degree of the game role in the VR game picture operated by the user, and marking it as ε'r2.
Further, the limb action parameter detection module is configured to detect a limb action parameter in each limb action state when the user operates the VR game, and specifically includes:
detecting the upper limb action amplitude in each limb action state when the user operates the VR game, and marking it as p_ia1, wherein i = 1, 2, ..., n;
detecting the lower limb action distance in each limb action state when the user operates the VR game, and marking it as p_ia2;
detecting the ground clearance of the lower limb in each limb action state when the user operates the VR game, and marking it as p_ia3.
Further, the limb action parameter detection module further obtains limb action parameters of the game role in each limb action state in the VR game screen operated by the user, and specifically includes the following steps:
acquiring the upper limb action amplitude of the game role in each limb action state in the VR game picture operated by the user, and marking it as p'_ib1;
acquiring the lower limb action distance of the game role in each limb action state in the VR game picture operated by the user, and marking it as p'_ib2;
acquiring the ground clearance of the lower limb of the game role in each limb action state in the VR game picture operated by the user, and marking it as p'_ib3.
Further, in the limb movement synchronization analysis module, the analysis mode of the limb movement synchronization matching degree coefficient of the game role in the VR game screen operated by the user is as follows:
substituting limb vibration data Fa when a user operates the VR game and action movement rate Vb of game roles in a picture of the VR game operated by the user into a limb action synchronous matching analysis modelObtaining the limb action synchronous matching degree coefficient ξ r of the game role in the VR game picture operated by the user3Wherein k is the ratio of the user's body motion parameter to the game character's body motion parameter, λ1、λ2、λ3Respectively expressed as the influence factors of the limb action parameters on the action synchronous matching, mu expressed as the compensation influence coefficient of the limb action running speed, VSign boardRepresents the motion movement rate V 'of the game character in the unit limb vibration data'Error of the measurementExpressed as the allowable error of the motion rate of the action of the game character.
Further, the game operation test experience analysis module calculates the comprehensive operation test experience coefficient of the user on the VR game in the following manner:
substituting the visual field operation sensitivity coefficient ξr1 of the game role in the VR game picture operated by the user, the hand action conformity degree ε'r2 of the game role in the VR game picture operated by the user, and the limb action synchronous matching degree coefficient ξr3 of the game role in the VR game picture operated by the user into the combination formula, obtaining the comprehensive operation test experience coefficient ξ_total of the user for the VR game, wherein γ1, γ2 and γ3 are respectively expressed as the influence weight factors of the VR game operation test experience, and a further coefficient is expressed as the influence coefficient of the hand action conformity degree of the game role in the VR game picture operated by the user.
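The combination formula is likewise not reproduced in the text. A plausible weighted-sum sketch using the named quantities, with `phi` as an assumed symbol for the hand-conformity influence coefficient and all weight values illustrative:

```python
def overall_experience_coefficient(xi_r1: float, eps_r2: float, xi_r3: float,
                                   gammas=(0.4, 0.3, 0.3),
                                   phi: float = 1.0) -> float:
    """Hypothetical combination giving ξ_total from ξr1, ε'r2 and ξr3.
    gammas: the influence weight factors γ1..γ3; phi: assumed symbol for
    the hand action conformity influence coefficient."""
    g1, g2, g3 = gammas
    return g1 * xi_r1 + g2 * phi * eps_r2 + g3 * xi_r3

def needs_optimization(xi_total: float, threshold: float = 0.75) -> bool:
    """Compare ξ_total against the preset VR game operation test experience
    coefficient threshold (threshold value assumed); True means the
    corresponding VR game optimization processing measures should run."""
    return xi_total < threshold
```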
As described above, the VR game intelligent management system based on big data feature recognition provided by the present invention has at least the following beneficial effects:
the invention provides a VR game intelligent management system based on big data characteristic identification, which obtains head posture parameters when a user operates a VR game, compares the head posture parameters to obtain a head posture parameter difference value of a game role visual field in a picture of the VR game operated by the user, obtains delay time of switching the game role visual field in the picture of the VR game operated by the user, analyzes a visual field operation sensitivity coefficient of a game role in the picture of the VR game operated by the user, thereby effectively analyzing the influence of switching delay of the picture of the VR game role visual field on the visual field operation sensitivity, improving the operation visual field experience of the later-stage user on the VR game, reducing the operation dizzy feeling of the later-stage experience user on the VR game, simultaneously obtaining operation instruction actions corresponding to hand operation gestures when the user operates the VR game, analyzing the hand action degree of the user operating the game role in the picture of the VR game, and further providing the experience feeling of the user on-the-scene game, the possibility of experiencing the user to have a feeling of disgust on the VR game itself is reduced.
According to the VR game intelligent management system based on big data feature recognition, the limb action parameters in each limb action state when a user operates a VR game are detected, and the limb action synchronous matching degree coefficient of the game role in the VR game picture operated by the user is analyzed, so that synchronous movement is realized by combining the limb actions of the user, and the experience and substitution feeling of the user on VR game operation are improved.
According to the VR game intelligent management system based on big data feature recognition, the comprehensive operation test experience coefficient of a user for a VR game is calculated; if the comprehensive operation test experience coefficient is smaller than the preset VR game operation test experience coefficient threshold value, corresponding VR game optimization processing measures are carried out. This improves the playability of the VR game, increases the user's stickiness to the VR game, alleviates the loss of VR game experience users, and thus provides a cornerstone for the healthy development of VR games and a better foundation for the progress of VR game enterprises.
Drawings
In order to more clearly illustrate the technical solutions of the embodiments of the present invention, the drawings used in the description of the embodiments will be briefly introduced below, and it is obvious that the drawings in the following description are only some embodiments of the present invention, and it is obvious for those skilled in the art that other drawings can be obtained according to the drawings without creative efforts.
FIG. 1 is a schematic diagram of the present invention.
Detailed Description
The technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are only a part of the embodiments of the present invention, and not all of the embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
Fig. 1 shows a VR game intelligent management system based on big data feature recognition provided in an embodiment of the present application, where the system includes a user head posture parameter acquisition module, a role view switching parameter comparison module, a role view operation sensitivity analysis module, a user operation gesture acquisition module, a role action conformity analysis module, a limb action parameter detection module, a limb action synchronization analysis module, and a game operation test experience analysis module.
The character visual field operation sensitivity analysis module is respectively connected with the character visual field switching parameter comparison module and the game operation testing experience analysis module, the character action conformity degree analysis module is respectively connected with the user operation gesture acquisition module and the game operation testing experience analysis module, and the limb action synchronous analysis module is respectively connected with the limb action parameter detection module and the game operation testing experience analysis module.
The user head posture parameter acquisition module is used for acquiring head posture parameters when a user operates a VR game, and comprises the following specific steps:
S11, acquiring, through the VR game equipment, the head swing horizontal angle in the head posture parameters when the user operates the VR game, and marking it as wa1;
S12, acquiring the head swing vertical angle in the head posture parameters when the user operates the VR game, and marking it as wa2.
The character view switching parameter comparison module is used for comparing and analyzing game character view switching parameters when a user operates a VR game, and specifically comprises the following steps:
S21, extracting the switching horizontal angle of the game role visual field in the VR game picture operated by the user, comparing it with the head swing horizontal angle when the user operates the VR game, and obtaining the switching horizontal angle difference Δw'b1 of the game role visual field in the VR game picture operated by the user;
S22, extracting the switching vertical angle of the game role visual field in the VR game picture operated by the user, comparing it with the head swing vertical angle when the user operates the VR game, and obtaining the switching vertical angle difference Δw'b2 of the game role visual field in the VR game picture operated by the user.
The character visual field operation sensitivity analysis module is used for analyzing the visual field operation sensitivity coefficient of a game character in a VR game picture operated by a user, and the method specifically comprises the following steps:
S31, obtaining the delay time of the visual field switching of the game role in the VR game picture operated by the user;
and S32, comprehensively analyzing the visual field operation sensitivity coefficient of the game character in the VR game picture operated by the user.
In a preferred embodiment of the present application, the obtaining, in the role view operation sensitivity analysis module, a delay time for switching the game role view in the VR game screen operated by the user includes:
acquiring the start time of the head swing when the user operates the VR game, and marking it as t_start_a;
acquiring the start time of the visual field switching of the game role in the VR game picture operated by the user, and marking it as t'_start_b;
substituting the head swing start time t_start_a and the visual field switching start time t'_start_b into the formula ΔT_delay = t'_start_b − t_start_a to obtain the delay time ΔT_delay of the visual field switching of the game role in the VR game picture operated by the user.
In a preferred embodiment of the present application, the method for analyzing the view operation sensitivity coefficient of the game character in the VR game screen by the user comprehensively in the character view operation sensitivity analysis module includes:
substituting the switching horizontal angle difference Δw'b1 of the game role visual field in the VR game picture operated by the user, the switching vertical angle difference Δw'b2 of the game role visual field in the VR game picture operated by the user, and the delay time ΔT_delay of the visual field switching of the game role in the VR game picture operated by the user into the role visual field operation sensitivity analysis model, obtaining the visual field operation sensitivity coefficient ξr1 of the game role in the VR game picture operated by the user, wherein α and β are respectively expressed as the operation sensitivity influence proportionality coefficients corresponding to the angle and the delay time of the game role visual field switching, W_err_b1 and W_err_b2 are respectively expressed as the allowable error of the switching horizontal angle and the allowable error of the switching vertical angle, and T'_set is expressed as the set delay time threshold for the visual field switching of the role in the VR game picture.
It should be noted that the invention obtains the head posture parameters when the user operates the VR game, obtains through comparison the head posture parameter differences of the game role visual field in the VR game picture operated by the user, obtains the delay time of the game role visual field switching in that picture, and analyzes the visual field operation sensitivity coefficient of the game role in the picture. The influence of the switching delay of the VR game role visual field picture on the visual field operation sensitivity is thereby effectively analyzed, improving the later user's operation visual field experience of the VR game and reducing the later experiencing user's operation dizziness in the VR game.
The user operation gesture obtaining module is used for obtaining the hand operation gesture when the user operates the VR game and for obtaining, through comparison, the operation instruction action corresponding to the hand operation gesture when the user operates the VR game.
In a preferred technical solution of the present application, the user operation gesture obtaining module is used for obtaining the hand operation gesture when the user operates the VR game, and includes:
establishing a hand space coordinate system of a user, and acquiring position coordinates of each hand positioning sensor when the user operates a VR game;
constructing hand gesture outlines of the users when the users operate the VR games according to the position coordinates of the hand positioning sensors when the users operate the VR games;
comparing the hand gesture outline when the user operates the VR game with each set standard operation gesture, counting the similarity between the hand gesture outline and each standard operation gesture when the user operates the VR game, and screening the standard operation gesture with the highest similarity corresponding to the hand gesture outline when the user operates the VR game.
The hand positioning sensors are respectively worn on the finger joints of the user's hand, with each hand positioning sensor corresponding one-to-one to a finger joint of the user's hand.
The role action conformity degree analysis module is used for acquiring the operation instruction action of the game character in the VR game picture operated by the user and analyzing the hand action conformity of that game character.
In a preferred technical solution of the present application, the method specifically includes:
acquiring an operation instruction action of a user for operating a game role in a VR game picture;
comparing the operation instruction action of the user for operating the game role in the VR game picture with the operation instruction action of the user hand;
obtaining the hand action conformity of the game character in the VR game picture operated by the user, and marking it as ε'_r2.
If the operation instruction action of the game character in the VR game picture is completely consistent with the operation instruction action of the user's hand, the hand action conformity is recorded as 1; if the two are inconsistent, it is recorded as 0.
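The binary conformity rule just stated can be expressed directly in code; a trivial Python sketch (function and argument names assumed):

```python
def hand_action_conformity(character_action: str, hand_action: str) -> float:
    # Conformity per the scheme: 1 when the on-screen character's action is
    # fully consistent with the user's hand instruction action, else 0.
    return 1.0 if character_action == hand_action else 0.0
```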
It should be noted that, by acquiring the operation instruction action corresponding to the hand operation gesture and analyzing the hand action conformity of the game character in the VR game picture, the invention gives the user an immersive game experience and reduces the possibility that the user develops an aversion to the VR game itself.
The limb action parameter detection module is used for detecting limb action parameters in each limb action state when a user operates the VR game.
In a preferred technical solution of the present application, the specific steps include:
detecting the upper limb action amplitude in each limb action state when the user operates the VR game, marked as p_ia1, where i = 1, 2, …, n;
detecting the lower limb action distance in each limb action state when the user operates the VR game, marked as p_ia2;
detecting the height of the lower limb above the ground in each limb action state when the user operates the VR game, marked as p_ia3.
The limb action states in the limb action parameter detection module comprise a walking action state, a running action state, a high jump action state, a jumping action state and a standing action state.
In a preferred technical solution of the present application, the limb action parameter detection module further comprises a limb action parameter acquisition module configured to acquire the limb action parameters of the game character in each limb action state in the VR game picture operated by the user, specifically as follows:
acquiring the upper limb action amplitude of the game character in each limb action state in the VR game picture operated by the user, marked as p'_ib1;
acquiring the lower limb action distance of the game character in each limb action state, marked as p'_ib2;
acquiring the height of the game character's lower limb above the ground in each limb action state, marked as p'_ib3.
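The user-side parameters p_ia1..p_ia3 and the character-side parameters p'_ib1..p'_ib3 can be paired per limb action state; the sketch below computes the per-parameter user-to-character ratio that the later matching model refers to as k. The dataclass and field names are hypothetical, introduced only for illustration.

```python
from dataclasses import dataclass

@dataclass
class LimbParams:
    # Hypothetical field names; the text labels these p_ia1/p'_ib1,
    # p_ia2/p'_ib2 and p_ia3/p'_ib3 for a given limb action state i.
    upper_amplitude: float   # upper limb action amplitude
    lower_distance: float    # lower limb action distance
    lower_clearance: float   # lower limb height above the ground

def param_ratios(user: LimbParams, character: LimbParams) -> tuple:
    # Per-parameter user-to-character ratio (the matching model calls this k).
    return (user.upper_amplitude / character.upper_amplitude,
            user.lower_distance / character.lower_distance,
            user.lower_clearance / character.lower_clearance)
```

One such triple would be computed for each of the walking, running, high jump, jumping and standing states listed above.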
The limb action synchronous analysis module is used for analyzing a limb action synchronous matching degree coefficient of a game role in a VR game picture operated by a user, and the method specifically comprises the following steps:
S51, detecting the limb vibration data when the user operates the VR game, marked as Fa;
S52, obtaining the action movement rate of the game character in the VR game picture operated by the user, marked as Vb;
S53, analyzing the limb action synchronous matching degree coefficient of the game character in the VR game picture operated by the user.
In a preferred technical solution of the present application, an analysis manner of the limb movement synchronization matching degree coefficient of the game role in the VR game screen operated by the user in the limb movement synchronization analysis module is as follows:
Substituting the limb vibration data Fa when the user operates the VR game and the action movement rate Vb of the game character into the limb action synchronous matching analysis model (given in the original as a formula image, not reproduced here) yields the limb action synchronous matching degree coefficient ξ_r3 of the game character in the VR game picture operated by the user, where k is the ratio of the user's limb action parameter to the game character's limb action parameter; λ1, λ2 and λ3 are the influence factors of the respective limb action parameters on action synchronous matching; μ is the compensation influence coefficient of the limb action running speed; V_std is the standard action movement rate of the game character per unit of limb vibration data; and V'_err is the allowable error of the game character's action movement rate.
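Because the matching model itself is reproduced only as an image, the sketch below uses an assumed form that respects the parameters the text names: the character's movement rate per unit of limb vibration (Vb / Fa) is checked against the standard rate V_std within the tolerance V'_err, scaled by the parameter ratio k, the factors λ1..λ3 and the compensation coefficient μ. It is a hypothetical reading, not the patent's formula.

```python
def sync_match_coefficient(fa, vb, k, lambdas, mu, v_std, v_err):
    # Assumed reading of the (unreproduced) model: the character's movement
    # rate per unit of limb vibration (vb / fa) is compared with the standard
    # rate v_std within the allowable error v_err; the parameter ratio k, the
    # influence factors lambdas and the compensation coefficient mu scale it.
    rate_per_vibration = vb / fa if fa else 0.0
    deviation = max(abs(rate_per_vibration - v_std) - v_err, 0.0)
    return mu * k * sum(lambdas) / (1.0 + deviation)
```

Under this reading the coefficient is largest when the character moves at exactly the standard rate per unit of user limb vibration, i.e. when on-screen motion is synchronized with the user's physical motion.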
It should be noted that detecting the limb action parameters in each limb action state when the user operates the VR game and analyzing the limb action synchronous matching degree coefficient of the game character allow the character's motion to be synchronized with the user's limb actions, improving the user's sense of experience and immersion in VR game operation.
The game operation test experience analysis module is used for calculating the user's comprehensive operation test experience coefficient for the VR game and comparing it with a preset VR game operation test experience coefficient threshold; if the coefficient is smaller than the threshold, corresponding VR game optimization measures are taken.
In a preferred technical solution of the present application, the game operation test experience analysis module calculates a comprehensive operation test experience coefficient of the user for the VR game in the following manner:
Substituting the visual field operation sensitivity coefficient ξ_r1, the hand action conformity ε'_r2 and the limb action synchronous matching degree coefficient ξ_r3 of the game character in the VR game picture operated by the user into the stated formula (given in the original as an image, not reproduced here) yields the user's comprehensive operation test experience coefficient ξ_total for the VR game, where γ1, γ2 and γ3 are the influence weight factors of the VR game operation test experience, and a further coefficient (symbol not reproduced) is the influence coefficient of the hand action conformity of the game character.
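Since the combination formula is likewise shown only as an image, the following sketch assumes a weighted-sum form over the three coefficients, with an extra influence coefficient applied to the conformity term as the text implies. The functional form and all names are assumptions.

```python
def comprehensive_experience(xi_r1, eps_r2, xi_r3, gammas, conformity_coef=1.0):
    # Assumed weighted-sum form: the three per-module coefficients combined
    # with weights gamma1..gamma3; conformity_coef stands in for the extra
    # influence coefficient the text attaches to the conformity term.
    g1, g2, g3 = gammas
    return g1 * xi_r1 + g2 * conformity_coef * eps_r2 + g3 * xi_r3

def needs_optimization(xi_total, threshold):
    # Trigger VR game optimization measures when below the preset threshold.
    return xi_total < threshold
```

The threshold comparison is the management decision the module makes: games scoring below the preset coefficient threshold are flagged for optimization.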
It should be noted that if the user's comprehensive operation test experience coefficient for the VR game is smaller than the preset threshold, corresponding VR game optimization measures are taken. This improves the playability of the VR game, increases user stickiness, and avoids the loss of experiencing users, thereby laying a foundation for the sound development of VR games and the progress of VR game enterprises.
The foregoing is merely exemplary and illustrative of the principles of the present invention and various modifications, additions and substitutions of the specific embodiments described herein may be made by those skilled in the art without departing from the principles of the present invention or exceeding the scope of the claims set forth herein.
Claims (9)
1. A VR game intelligent management system based on big data feature recognition, characterized in that: the system comprises a user head posture parameter acquisition module, a role visual field switching parameter comparison module, a role visual field operation sensitivity analysis module, a user operation gesture acquisition module, a role action conformity degree analysis module, a limb action parameter detection module, a limb action synchronization analysis module and a game operation test experience analysis module;
the user head posture parameter acquisition module is used for acquiring head posture parameters when a user operates a VR game, and comprises the following specific steps:
S11, acquiring, through the VR game equipment, the head swing horizontal angle in the head posture parameters when the user operates the VR game, marked as w_a1;
S12, acquiring the head swing vertical angle in the head posture parameters when the user operates the VR game, marked as w_a2;
The character view switching parameter comparison module is used for comparing and analyzing game character view switching parameters when a user operates a VR game, and specifically comprises the following steps:
S21, extracting the switching horizontal angle of the game character's visual field in the VR game picture operated by the user, and comparing the head swing horizontal angle with it to obtain the switching horizontal angle difference Δw'_b1 of the game character's visual field;
S22, extracting the switching vertical angle of the game character's visual field in the VR game picture operated by the user, and comparing the head swing vertical angle with it to obtain the switching vertical angle difference Δw'_b2 of the game character's visual field;
The character visual field operation sensitivity analysis module is used for analyzing the visual field operation sensitivity coefficient of a game character in a VR game picture operated by a user, and the method specifically comprises the following steps:
S31, obtaining the delay time of the visual field switching of the game character in the VR game picture operated by the user;
S32, comprehensively analyzing the visual field operation sensitivity coefficient of the game character in the VR game picture operated by the user;
the user operation gesture obtaining module is used for obtaining a hand operation gesture when a user operates the VR game, and obtaining an operation instruction action corresponding to the hand operation gesture when the user operates the VR game through comparison;
the role action conformity degree analysis module is used for acquiring the operation instruction action of the game role in the VR game picture operated by the user and analyzing the hand action conformity degree of the game role in the VR game picture operated by the user;
the limb action parameter detection module is used for detecting limb action parameters in each limb action state when a user operates the VR game;
the limb action synchronous analysis module is used for analyzing a limb action synchronous matching degree coefficient of a game role in a VR game picture operated by a user, and the method specifically comprises the following steps:
S51, detecting the limb vibration data when the user operates the VR game, marked as Fa;
S52, obtaining the action movement rate of the game character in the VR game picture operated by the user, marked as Vb;
S53, analyzing the limb action synchronous matching degree coefficient of the game character in the VR game picture operated by the user;
the game operation testing experience analysis module is used for calculating a comprehensive operation testing experience coefficient of the user to the VR game, comparing the comprehensive operation testing experience coefficient with a preset VR game operation testing experience coefficient threshold value, and if the comprehensive operation testing experience coefficient of the user to the VR game is smaller than the preset VR game operation testing experience coefficient threshold value, performing corresponding VR game optimization treatment measures.
2. The big data feature recognition based VR gaming intelligence management system of claim 1, wherein the character visual field operation sensitivity analysis module obtains the delay time of the game character's visual field switching in the VR game picture operated by the user as follows:
acquiring the start time of the head swing when the user operates the VR game, marked as t_start_a;
acquiring the start time of the game character's visual field switching in the VR game picture operated by the user, marked as t'_start_b;
substituting t_start_a and t'_start_b into the formula ΔT_delay = t'_start_b − t_start_a to obtain the delay time ΔT_delay of the game character's visual field switching in the VR game picture operated by the user.
3. The big data feature recognition based VR gaming intelligence management system of claim 1, wherein: the character visual field operation sensitivity analysis module comprehensively analyzes the visual field operation sensitivity coefficient of the game character in the VR game picture operated by the user, and comprises the following steps:
Substituting the switching horizontal angle difference Δw'_b1, the switching vertical angle difference Δw'_b2 and the delay time ΔT_delay of the game character's visual field switching into the character visual field operation sensitivity analysis model (given in the original as a formula image, not reproduced here) yields the visual field operation sensitivity coefficient ξ_r1 of the game character in the VR game picture operated by the user, where α and β are the operation sensitivity influence proportionality coefficients corresponding respectively to the angle and the delay time of the visual field switching, W_err_b1 and W_err_b2 are respectively the allowable errors of the switching horizontal angle and the switching vertical angle, and T'_set is the set delay time threshold for the character's visual field switching in the VR game picture.
4. The big data feature recognition based VR gaming intelligence management system of claim 1, wherein the user operation gesture acquisition module acquires the hand operation gesture when the user operates the VR game as follows:
establishing a hand space coordinate system of a user, and acquiring position coordinates of each hand positioning sensor when the user operates a VR game;
constructing hand gesture outlines of the users when the users operate the VR games according to the position coordinates of the hand positioning sensors when the users operate the VR games;
comparing the hand gesture outline when the user operates the VR game with each set standard operation gesture, calculating the similarity between the hand gesture outline and each standard operation gesture, and selecting the standard operation gesture with the highest similarity as the one corresponding to the hand gesture outline.
5. The big data feature recognition based VR gaming intelligence management system of claim 1, wherein the role action conformity degree analysis module acquires the operation instruction action of the game character in the VR game picture operated by the user and analyzes the hand action conformity of the game character, specifically including:
acquiring an operation instruction action of a user for operating a game role in a VR game picture;
comparing the operation instruction action of the user for operating the game role in the VR game picture with the operation instruction action of the user hand;
obtaining the hand action conformity of the game character in the VR game picture operated by the user, and marking it as ε'_r2.
6. The big data feature recognition based VR gaming intelligence management system of claim 1, wherein: the limb action parameter detection module is used for detecting limb action parameters in each limb action state when a user operates a VR game, and the method specifically comprises the following steps:
detecting the upper limb action amplitude in each limb action state when the user operates the VR game, marked as p_ia1, where i = 1, 2, …, n;
detecting the lower limb action distance in each limb action state when the user operates the VR game, marked as p_ia2;
detecting the height of the lower limb above the ground in each limb action state when the user operates the VR game, marked as p_ia3.
7. The big data feature recognition based VR gaming intelligence management system of claim 1, wherein the limb action parameter detection module further comprises a limb action parameter acquisition module for acquiring the limb action parameters of the game character in each limb action state in the VR game picture operated by the user, specifically as follows:
acquiring the upper limb action amplitude of the game character in each limb action state in the VR game picture operated by the user, marked as p'_ib1;
acquiring the lower limb action distance of the game character in each limb action state, marked as p'_ib2;
acquiring the height of the game character's lower limb above the ground in each limb action state, marked as p'_ib3.
8. The big data feature recognition based VR gaming intelligence management system of claim 1, wherein: the analysis mode of the limb action synchronous matching degree coefficient of the game role in the VR game picture operated by the user in the limb action synchronous analysis module is as follows:
Substituting the limb vibration data Fa when the user operates the VR game and the action movement rate Vb of the game character into the limb action synchronous matching analysis model (given in the original as a formula image, not reproduced here) yields the limb action synchronous matching degree coefficient ξ_r3 of the game character in the VR game picture operated by the user, where k is the ratio of the user's limb action parameter to the game character's limb action parameter; λ1, λ2 and λ3 are the influence factors of the respective limb action parameters on action synchronous matching; μ is the compensation influence coefficient of the limb action running speed; V_std is the standard action movement rate of the game character per unit of limb vibration data; and V'_err is the allowable error of the game character's action movement rate.
9. The big data feature recognition based VR gaming intelligence management system of claim 1, wherein: the game operation test experience analysis module calculates the comprehensive operation test experience coefficient of the user to the VR game in the following mode:
Substituting the visual field operation sensitivity coefficient ξ_r1, the hand action conformity ε'_r2 and the limb action synchronous matching degree coefficient ξ_r3 of the game character in the VR game picture operated by the user into the stated formula (given in the original as an image, not reproduced here) yields the user's comprehensive operation test experience coefficient ξ_total for the VR game, where γ1, γ2 and γ3 are the influence weight factors of the VR game operation test experience, and a further coefficient (symbol not reproduced) is the influence coefficient of the hand action conformity of the game character.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202210071775.2A CN114405004A (en) | 2022-01-21 | 2022-01-21 | VR recreation intelligent management system based on big data feature recognition |
Publications (1)
Publication Number | Publication Date |
---|---|
CN114405004A true CN114405004A (en) | 2022-04-29 |
Family
ID=81275165
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202210071775.2A Pending CN114405004A (en) | 2022-01-21 | 2022-01-21 | VR recreation intelligent management system based on big data feature recognition |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN114405004A (en) |
Cited By (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN116650950A (en) * | 2023-06-08 | 2023-08-29 | 廊坊市珍圭谷科技有限公司 | Control system and method for VR game |
CN116650950B (en) * | 2023-06-08 | 2024-02-06 | 廊坊市珍圭谷科技有限公司 | Control system and method for VR game |
CN116758109A (en) * | 2023-06-20 | 2023-09-15 | 杭州光线数字科技有限公司 | Action appearance state synchronicity monitoring system based on intelligent equipment |
CN116758109B (en) * | 2023-06-20 | 2023-11-14 | 杭州光线数字科技有限公司 | Action appearance state synchronicity monitoring system based on intelligent equipment |
Legal Events

Date | Code | Title | Description
---|---|---|---
| PB01 | Publication | |
| SE01 | Entry into force of request for substantive examination | |