CN116966577A - VR game optimization system and method based on facial emotion recognition - Google Patents
- Publication number
- CN116966577A (application number CN202311089926.8A)
- Authority
- CN
- China
- Prior art keywords
- game
- facial emotion
- data
- glasses
- facial
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63F—CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
- A63F13/00—Video games, i.e. games using an electronically generated display having two or more dimensions
- A63F13/55—Controlling game characters or game objects based on the game progress
- A63F13/56—Computing the motion of game characters with respect to other game characters, game objects or elements of the game scene, e.g. for simulating the behaviour of a group of virtual soldiers or for path finding
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V40/00—Recognition of biometric, human-related or animal-related patterns in image or video data
- G06V40/10—Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
- G06V40/16—Human faces, e.g. facial parts, sketches or expressions
- G06V40/174—Facial expression recognition
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63F—CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
- A63F2300/00—Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
- A63F2300/60—Methods for processing data by generating or executing the game program
- A63F2300/65—Methods for processing data by generating or executing the game program for computing the condition of a game character
Abstract
The application discloses a VR game optimization system and method based on facial emotion recognition. Facial emotion parameters are recorded while a player plays, the facial emotion is rated using edge cloud computing, the game data of the periods whose facial emotion is rated as negative are screened out and fed back to the game manufacturer for professional analysis, and the game scenes are optimized accordingly. The system comprises VR glasses, a VR game host, a data management cloud center, edge clouds, and a game manufacturer terminal. The VR glasses have a facial emotion recognition function and establish signal interaction with the VR game host; the facial emotion recognition module of the VR glasses collects facial emotion parameters during play and sends them to the data collection module of the VR game host. The data management cloud center consists of a data storage module, a data scheduling module, and a data integration module.
Description
Technical Field
The application discloses a VR game optimization system and method based on facial emotion recognition, in which game data are screened using facial emotion recognition, and belongs to the technical field of virtual reality.
Background
Virtual reality technology simulates a virtual environment by computer, giving people a sense of immersion in that environment; VR games have become one of the most popular forms of entertainment because of their novelty and immersive experience. The existing game optimization method is to collect players' game-experience comments and opinions manually through an Internet platform and, after collation, feed them back to the background developers for game optimization. However, this method has a cumbersome workflow and a long feedback time, so the game optimization and update cycle is long and players' needs cannot be met quickly enough to improve the game experience. Moreover, because of the special nature of the interaction between players and virtual scenes, VR games are highly diverse, and it is difficult for background developers to accurately locate the game scene that needs optimization, and to optimize it in a targeted way, relying only on the verbal descriptions given in comments.
Publication CN113426132A discloses a game optimization method, apparatus, device, and storage medium, the method comprising: screening the comments of a game in the corresponding comment area to obtain target comments; and determining the functional module of the game each comment concerns. The method automatically associates each target comment with the corresponding functional module of the game and feeds it back, which improves the feedback speed of comments in game forums and game communities and thereby accelerates game optimization and updating.
Disclosure of Invention
To improve this situation, the VR game optimization system and method based on facial emotion recognition provided by the application record the facial emotion parameters of a player during play, rate the facial emotion using edge cloud computing, screen out the game data of the periods whose facial emotion is rated as negative, and feed those data back to the game manufacturer for professional analysis, so that the game scenes can be optimized.
The VR game optimization system and method based on facial emotion recognition are realized as follows. The application relates to a VR game optimization system based on facial emotion recognition, comprising VR glasses, a VR game host, a data management cloud center, edge clouds, and a game manufacturer terminal, wherein the VR glasses have a facial emotion recognition function and establish signal interaction with the VR game host.
The facial emotion recognition module of the VR glasses collects facial emotion parameters while the player plays and sends them to the data collection module of the VR game host.
Preferably, the facial emotion parameters include the number of blinks, the degree of pupil constriction, and eye micro-expressions.
The data management cloud center consists of a data storage module, a data scheduling module, and a data integration module.
The data collection module of the VR game host collects the game data and facial emotion parameters generated during the VR game and sends them to the data storage module of the data management cloud center.
Preferably, the game data collected by the VR game host include the game name, real-time scene interaction audio and video, and the like.
The data storage module of the data management cloud center passes the received facial emotion parameters and game data to the data scheduling module, which forwards them to the corresponding edge cloud according to the game name.
The edge cloud is provided with a facial emotion assessment database, and rates the facial emotion of different periods by comparing the facial emotion parameters with the data in this database.
Preferably, the facial emotion rating is obtained by accumulating the weighted scores of three evaluation indexes, namely the number of blinks, the degree of pupil constriction, and the eye micro-expressions, into a composite score, which is then graded; the facial emotion rating comprises three grades: negative, flat, and excited.
According to the facial emotion rated as negative, the edge cloud selects the real-time scene interaction audio and video of the corresponding periods from the game data and sends the selection to the data integration module of the data management cloud center.
The data integration module of the data management cloud center performs visual integration on the received real-time scene interaction audio and video and feeds the integrated result back to the corresponding game manufacturer terminal according to the game name.
The game manufacturer receives the fed-back real-time scene interaction audio and video, analyzes it professionally, and optimizes the defects of the interaction scene.
the application discloses a VR game optimization method based on facial emotion recognition, which is characterized by comprising the following steps of: the VR game optimization system based on facial emotion recognition is utilized for optimization, and specifically comprises the following steps:
1. the VR glasses collect facial emotion parameters in the game process of the player and send the facial emotion parameters to the VR game host,
2. the VR game host collects game data and facial emotion parameters generated in the VR game process and sends the game data and facial emotion parameters to the data management cloud center,
3. the data storage module of the data management cloud center uniformly transmits the received facial emotion parameters and game data to the data scheduling module of the data management cloud center, the data scheduling module transmits the facial emotion parameters and the game data to the corresponding edge cloud according to the game name,
4. the edge cloud performs rating division on the facial emotion of different periods by comparing the facial emotion parameters with the data in the facial emotion assessment database,
5. the edge cloud selects real-time scene interaction audios and videos of corresponding time periods from game data according to the facial emotion which is rated as negative, and sends the selected real-time scene interaction audios and videos to a data integration module of a data management cloud center,
6. the data integration module performs data visual integration on the received real-time scene interaction audio and video, and feeds the integrated real-time scene interaction audio and video back to the corresponding game manufacturer terminal according to name,
7. after receiving the feedback real-time scene interaction audio and video, the game manufacturer performs professional analysis on the feedback real-time scene interaction audio and video, and optimizes the defects of the interaction scene;
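The seven steps above can be sketched as a small program. This is an illustrative sketch only: the weight values, grade thresholds, field names, and example data below are assumptions, not values disclosed in the patent.

```python
# Hypothetical sketch of the optimization loop: rate each play period from its
# facial emotion parameters, then keep the scene audio/video of negative periods.

WEIGHTS = {"blink": 0.4, "pupil": 0.35, "micro": 0.25}  # assumed index weights

def rate_emotion(params):
    """Step 4: composite score over blink count, pupil constriction, micro-expressions."""
    score = sum(params[k] * w for k, w in WEIGHTS.items())
    if score < 40:        # assumed threshold
        return "negative"
    if score < 70:        # assumed threshold
        return "flat"
    return "excited"

def select_negative_scenes(periods):
    """Steps 5-6: select the scene A/V of every period rated negative for feedback."""
    return [p["scene_av"] for p in periods
            if rate_emotion(p["emotion_params"]) == "negative"]

periods = [  # step 2: game data plus emotion parameters per period (assumed values)
    {"emotion_params": {"blink": 20, "pupil": 30, "micro": 25}, "scene_av": "boss_fight.mp4"},
    {"emotion_params": {"blink": 90, "pupil": 80, "micro": 85}, "scene_av": "lobby.mp4"},
]
print(select_negative_scenes(periods))
```

The low-scoring first period is flagged for manufacturer review; the high-scoring second period is not.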
the application also relates to VR glasses with the facial emotion recognition function, which consists of a VR glasses shell, a soft cushion, an elastic band, a ventilation groove, a camera, lenses, a first shading sheet and a second shading sheet, wherein the VR glasses shell is of a rectangular structure, two lenses are symmetrically arranged on the VR glasses shell, the camera is flush with the two lenses, the camera is arranged between the two lenses, two side walls of the VR glasses shell are correspondingly connected with two ends of the elastic band, a circle of soft cushion is arranged at the edge of the VR glasses shell, two ventilation grooves are respectively correspondingly formed at two ends of the soft cushion, two first shading sheets are equidistantly arranged on the inner wall of one side of the ventilation groove, a second shading sheet is arranged on the inner wall of the other side of the ventilation groove, and the second shading sheet is positioned between the two first shading sheets.
Advantageous effects
1. The facial emotion parameters of a player during play are recorded, the facial emotion is rated, and the game data of the periods rated as negative are screened out and fed back to the game manufacturer for professional analysis, so that the game scenes are optimized; this shortens the time background developers spend finding scenes that need optimization and improves the efficiency of game optimization and updating.
2. Processing and analyzing the facial emotion parameters and game data on the edge clouds improves data-processing efficiency and reduces the computational load on the data management cloud center.
Drawings
FIG. 1 is a schematic diagram of a VR game optimization system based on facial emotion recognition in accordance with the present application;
FIG. 2 is a flow chart of a VR game optimization method based on facial emotion recognition in accordance with the present application;
fig. 3 is a schematic left view structure diagram of VR glasses with facial emotion recognition function according to the present application;
fig. 4 is a schematic structural diagram of VR glasses with facial emotion recognition function according to the present application.
Reference numerals in the drawings: VR glasses shell (1), soft cushion (2), elastic band (3), ventilation groove (4), camera (5), lens (6), first light-shielding sheet (7), second light-shielding sheet (8).
Detailed Description
The VR game optimization system and method based on facial emotion recognition are realized as follows. The application relates to a VR game optimization system based on facial emotion recognition, comprising VR glasses, a VR game host, a data management cloud center, edge clouds, and a game manufacturer terminal, wherein the VR glasses have a facial emotion recognition function and establish signal interaction with the VR game host.
The facial emotion recognition module of the VR glasses collects facial emotion parameters while the player plays and sends them to the data collection module of the VR game host.
Preferably, the facial emotion parameters include the number of blinks, the degree of pupil constriction, and eye micro-expressions.
The data management cloud center consists of a data storage module, a data scheduling module, and a data integration module.
The data collection module of the VR game host collects the game data and facial emotion parameters generated during the VR game and sends them to the data storage module of the data management cloud center.
Preferably, the game data collected by the VR game host include the game name, real-time scene interaction audio and video, and the like.
The data storage module of the data management cloud center passes the received facial emotion parameters and game data to the data scheduling module, which forwards them to the corresponding edge cloud according to the game name.
The edge cloud is provided with a facial emotion assessment database, and rates the facial emotion of different periods by comparing the facial emotion parameters with the data in this database.
Preferably, the facial emotion rating is obtained by accumulating the weighted scores of three evaluation indexes, namely the number of blinks, the degree of pupil constriction, and the eye micro-expressions, into a composite score, which is then graded; the facial emotion rating comprises three grades: negative, flat, and excited.
According to the facial emotion rated as negative, the edge cloud selects the real-time scene interaction audio and video of the corresponding periods from the game data and sends the selection to the data integration module of the data management cloud center.
The data integration module of the data management cloud center performs visual integration on the received real-time scene interaction audio and video and feeds the integrated result back to the corresponding game manufacturer terminal according to the game name.
The game manufacturer receives the fed-back real-time scene interaction audio and video, analyzes it professionally, and optimizes the defects of the interaction scene.
the application discloses a VR game optimization method based on facial emotion recognition, which comprises the following steps:
the VR glasses collect facial emotion parameters in the game process of the player and send the facial emotion parameters to the VR game host,
preferably, the facial emotion parameters include blink times, pupil constriction degree and eye microexpressions,
the VR game host collects game data and facial emotion parameters generated in the VR game process and sends the game data and facial emotion parameters to the data management cloud center,
preferably, the game data collected by the VR game host includes game names, real-time scene interaction audio and video, etc.,
the data storage module of the data management cloud center uniformly transmits the received facial emotion parameters and game data to the data scheduling module of the data management cloud center, the data scheduling module transmits the facial emotion parameters and the game data to the corresponding edge cloud according to the game name,
preferably, the data scheduling module can also respectively send the facial emotion parameters and the game data of the same game to different edge clouds, and after the edge clouds process the facial emotion parameter data, the facial emotion parameter data is sent to the edge clouds which process the game data of the same game for analysis,
4. the edge cloud performs rating division on the facial emotion of different periods by comparing the facial emotion parameters with the data in the facial emotion assessment database,
5. the edge cloud selects real-time scene interaction audios and videos of corresponding time periods from game data according to the facial emotion which is rated as negative, and sends the selected real-time scene interaction audios and videos to a data integration module of a data management cloud center,
preferably, the selected real-time scene interaction audio and video can be screened again through the edge cloud,
6. the data integration module performs data visual integration on the received real-time scene interaction audio and video, and feeds the integrated real-time scene interaction audio and video back to the corresponding game manufacturer terminal according to name,
7. after receiving the feedback real-time scene interaction audio and video, the game manufacturer performs professional analysis on the feedback real-time scene interaction audio and video, and optimizes the defects of the interaction scene;
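The scheduling in step 3, where the data scheduling module routes data to an edge cloud keyed by the game name, could be sketched as follows. The hash-based sharding scheme and the edge-cloud names are assumptions for illustration; the patent does not specify a routing algorithm.

```python
# Hypothetical sketch: deterministically map each game name to one edge cloud,
# so a game's facial emotion parameters and game data land on the same node.
import hashlib

EDGE_CLOUDS = ["edge-cloud-1", "edge-cloud-2", "edge-cloud-3"]  # assumed deployment

def route_to_edge(game_name: str) -> str:
    """Pick an edge cloud by hashing the game name (stable across calls)."""
    digest = int(hashlib.sha256(game_name.encode("utf-8")).hexdigest(), 16)
    return EDGE_CLOUDS[digest % len(EDGE_CLOUDS)]

print(route_to_edge("DemoGameVR"))
```

Because the mapping is deterministic, repeated submissions for the same game always reach the same edge cloud, matching the per-game analysis described above.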
the index weight algorithm of the evaluation index is specifically described as follows:
1. sequentially numbering 3 evaluation indexes as A1-A3, determining the relative importance degree of each evaluation index, and constructing a hierarchical structure model and a judgment matrix;
2. carrying out normalization processing on each column of elements of the judgment matrix, wherein the general term of the elements is aij, and aij represents the elements of the ith row and the jth column of the judgment matrix;
3. performing row-by-row addition on the judgment matrix after normalization of each column, and performing normalization processing to obtain a feature vector W of the judgment matrix, wherein the feature vector W is represented by the following formula;
4. obtaining the maximum characteristic root of the judgment matrix through the calculation of the judgment matrix and the characteristic vector, wherein (AmW) i represents the ith element of the vector AmW, m is [1,3], and Am represents the judgment matrix with corresponding number;
5. and carrying out consistency test on the judgment matrix to obtain index weights of all evaluation indexes, wherein CR=CI/RI, wherein CI represents a consistency index, and RI represents a random consistency index.
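The five steps above follow the standard analytic hierarchy process (AHP), which can be sketched in plain Python. The entries of the pairwise judgment matrix below are assumed example values, not values from the patent.

```python
# AHP-style index weights for the 3 evaluation indexes A1-A3.
A = [
    [1.0, 2.0, 3.0],   # A1 compared with A1, A2, A3 (assumed importances)
    [1/2, 1.0, 2.0],   # A2 compared with A1, A2, A3
    [1/3, 1/2, 1.0],   # A3 compared with A1, A2, A3
]
n = len(A)

# Step 2: normalize each column (a'ij = aij / column sum)
col_sums = [sum(A[i][j] for i in range(n)) for j in range(n)]
norm = [[A[i][j] / col_sums[j] for j in range(n)] for i in range(n)]

# Step 3: sum each row of the normalized matrix, then normalize -> feature vector W
row_sums = [sum(row) for row in norm]
W = [r / sum(row_sums) for r in row_sums]

# Step 4: approximate the maximum eigenvalue lambda_max = (1/n) * sum_i (A W)_i / W_i
AW = [sum(A[i][j] * W[j] for j in range(n)) for i in range(n)]
lam_max = sum(AW[i] / W[i] for i in range(n)) / n

# Step 5: consistency test CR = CI / RI, with RI = 0.58 for a 3x3 matrix
CI = (lam_max - n) / (n - 1)
CR = CI / 0.58
print([round(w, 3) for w in W], "CR =", round(CR, 4))  # CR < 0.1 -> weights acceptable
```

With this example matrix the weights come out ordered W1 > W2 > W3 and the consistency ratio is well below 0.1, so the weights would be accepted.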
The facial emotion rating model is specifically described as follows:
According to the calculated index weights, let the score of the player's blink count during the game be C1, with corresponding index weight c1; let the score of the player's pupil constriction degree during the game be C2, with corresponding index weight c2; and let the score of the player's eye micro-expressions during the game be C3, with corresponding index weight c3. The facial emotion rating model is then G = C1×c1 + C2×c2 + C3×c3.
the application also relates to VR glasses with a facial emotion recognition function, which consists of a VR glasses shell (1), a soft cushion (2), an elastic band (3), a ventilation groove (4), a camera (5), lenses (6), a first light shielding piece (7) and a second light shielding piece (8), wherein the VR glasses shell (1) is of a rectangular structure, two lenses (6) are symmetrically arranged on the VR glasses shell (1), the camera (5) is flush with the two lenses (6), the camera (5) is arranged between the two lenses, two side walls of the VR glasses shell (1) are correspondingly connected with the two ends of the elastic band (3), a circle of soft cushion (2) is arranged at the edge of the VR glasses shell, the two ends of the soft cushion (2) are respectively correspondingly provided with the ventilation groove (4), two first light shielding pieces (7) are equidistantly arranged on the inner wall of one side of the first light shielding piece (4), the inner wall (4) is positioned between the two ventilation grooves (8), and the second light shielding piece (8) is positioned between the two ventilation grooves (8);
when the novel VR glasses are used, the VR glasses are worn on the head by the aid of the elastic bands (3), the soft cushion (2) is attached to the periphery of the eyes, a player watches VR game scenes and starts a game through the lenses (6), meanwhile, the camera (5) is opened and records facial emotion parameters of the player, the facial emotion parameters are sent to the data collection module of the VR game host through the facial emotion parameter storage module in the VR glasses shell (1), ventilation grooves (4) at the two ends of the soft cushion (2) can ventilate the soft cushion (2), camera (5) lens atomization is prevented, the recording process of facial emotion is influenced, meanwhile, the first light shielding sheet (7) and the second light shielding sheet (8) in the ventilation grooves (4) are mutually shielded, light leakage of the ventilation grooves (4) is prevented, and the viewing experience of the player during game is influenced.
By recording the facial emotion parameters of a player during play, rating the facial emotion using edge cloud computing, screening out the game data of the periods rated as negative, and feeding those data back to the game manufacturer for professional analysis, the game scenes are optimized.
Other similar embodiments of the application will readily suggest themselves to such skilled persons having the benefit of this disclosure and the practice of the application disclosed herein, and are intended to cover any adaptations or variations of the application following, in general, the principles of the application and including such departures from the present disclosure as come within known or customary practice within the art to which the application pertains.
Claims (10)
1. A VR game optimization method based on facial emotion recognition is characterized in that: the VR game optimization system based on facial emotion recognition is utilized for optimization, and specifically comprises the following steps:
1) The VR glasses collect facial emotion parameters in the game process of the player and send the facial emotion parameters to the VR game host;
2) The VR game host collects game data and facial emotion parameters generated in the VR game process and sends the game data and facial emotion parameters to the data management cloud center;
3) The data storage module of the data management cloud center uniformly transmits the received facial emotion parameters and game data to the data scheduling module of the data management cloud center, and the data scheduling module transmits the facial emotion parameters and the game data to the corresponding edge cloud according to the game name;
4) The edge cloud performs grading division on the facial emotions in different periods by comparing the facial emotion parameters with data in a facial emotion evaluation database;
5) The edge cloud selects real-time scene interaction audios and videos corresponding to the time period from game data according to the facial emotion rated as negative, and sends the selected real-time scene interaction audios and videos to a data integration module of a data management cloud center;
6) The data integration module performs data visual integration on the received real-time scene interaction audio and video, and feeds the integrated real-time scene interaction audio and video back to the corresponding game manufacturer terminal according to the name;
7) And after receiving the feedback real-time scene interaction audio and video, the game manufacturer performs professional analysis on the feedback real-time scene interaction audio and video, and optimizes the defects of the interaction scene.
2. The VR game optimization method based on facial emotion recognition according to claim 1, wherein the VR game optimization system based on facial emotion recognition comprises VR glasses, a VR game host, a data management cloud center, an edge cloud and a game manufacturer terminal, wherein the VR glasses have facial emotion recognition functions, signal interaction is established between the VR glasses and the VR game host, a facial emotion recognition module of the VR glasses collects facial emotion parameters in the game process of a player and sends the facial emotion parameters to a data collection module of the VR game host, the data management cloud center consists of a data storage module, a data scheduling module and a data integration module, the data collection module of the VR game host collects game data and facial emotion parameters generated in the game process of the VR game, the data storage module of the data management cloud center uniformly transmits the received facial emotion parameters and game data to the data scheduling module of the data management cloud center, the data scheduling module transmits the facial emotion parameters and the game data to the corresponding edge cloud according to the game name, the edge cloud is provided with a facial emotion assessment database, the facial emotion in different time periods is graded and divided by comparing the facial emotion parameters with the data in the facial emotion assessment database, the edge cloud selects real-time scene interaction audios and videos in corresponding time periods from the game data according to the facial emotion graded as negative, and transmits the selected real-time scene interaction audios and videos to the data integration module of the data management cloud center, the data integration module of the data management cloud center performs data visual integration on the received real-time scene interaction audio and video, feeds the integrated real-time scene interaction audio and video back to the corresponding game manufacturer terminal according to the game name, and performs professional analysis on the real-time scene interaction audio and video after the game manufacturer receives the feedback real-time scene interaction audio and video, so as to optimize the defects of the interaction scene.
3. The VR game optimization method based on facial emotion recognition according to claim 2, characterized in that the VR glasses with facial emotion recognition function comprise VR glasses shell, soft pad, elastic band, ventilation groove, camera, lens, first anti-dazzling screen and second anti-dazzling screen, two lenses are symmetrically placed on the VR glasses shell, the camera is placed on the VR glasses shell, two side walls of the VR glasses shell are correspondingly connected with two ends of the elastic band, a circle of soft pad is placed at the edge of the VR glasses shell, ventilation grooves are respectively correspondingly formed at two ends of the soft pad, two first anti-dazzling screens are equidistantly placed on the inner wall of one side of each ventilation groove, and second anti-dazzling screens are placed on the inner wall of the other side of each ventilation groove.
4. The VR game optimization method based on facial emotion recognition of claim 2, wherein the facial emotion parameters include blink times, pupil constriction degree, and eye microexpressions.
5. The VR game optimization method based on facial emotion recognition of claim 2, wherein the game data collected by the VR game host include, among other items, the game name and the real-time scene interaction audio and video.
6. The VR game optimization method based on facial emotion recognition as set forth in claim 2, wherein the facial emotion rating is calculated by weighting three evaluation indexes, namely the blink times, the pupil constriction degree, and the eye micro-expressions, and the result is classified into three grades: negative, flat, and excited.
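The weighted rating rule of claim 6 can be illustrated with a short sketch. The weights, the 0-to-1 index scale, and the two grade thresholds below are illustrative assumptions; the patent does not disclose concrete values.

```python
# Hypothetical sketch of the rating rule in claim 6: combine the three
# evaluation indexes by weight, then map the composite score to one of the
# three grades (negative / flat / excited). All numbers are assumptions.

# Assumed weights for the three evaluation indexes (sum to 1.0).
WEIGHTS = {"blink_times": 0.3, "pupil_constriction": 0.3, "eye_microexpression": 0.4}

# Assumed score thresholds splitting the composite score into three grades.
NEGATIVE_MAX = 0.35
FLAT_MAX = 0.65

def grade_emotion(indexes: dict) -> str:
    """Combine three normalized indexes (each in 0..1) into one grade."""
    score = sum(WEIGHTS[key] * indexes[key] for key in WEIGHTS)
    if score <= NEGATIVE_MAX:
        return "negative"
    if score <= FLAT_MAX:
        return "flat"
    return "excited"

sample = {"blink_times": 0.2, "pupil_constriction": 0.1, "eye_microexpression": 0.3}
print(grade_emotion(sample))  # negative (composite score 0.21)
```

Keeping the grade boundaries as named constants makes it straightforward for the edge cloud to recalibrate them per game without touching the scoring logic.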
7. The VR game optimization method based on facial emotion recognition of claim 3, wherein the VR glasses shell has a rectangular structure.
8. The VR game optimization method based on facial emotion recognition as set forth in claim 3, wherein the camera is flush with the two lenses.
9. The VR game optimization method based on facial emotion recognition as set forth in claim 3, wherein the camera is disposed between the two lenses.
10. The VR game optimization method based on facial emotion recognition of claim 3, wherein each second anti-dazzling screen is located between the two first anti-dazzling screens.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202311089926.8A CN116966577A (en) | 2023-08-28 | 2023-08-28 | VR game optimization system and method based on facial emotion recognition |
Publications (1)
Publication Number | Publication Date |
---|---|
CN116966577A true CN116966577A (en) | 2023-10-31 |
Family
ID=88479696
Country Status (1)
Country | Link |
---|---|
CN (1) | CN116966577A (en) |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||