CN114949840A - VR game optimization system and method based on facial emotion recognition - Google Patents
- Publication number
- CN114949840A (application CN202210557768.3A)
- Authority
- CN
- China
- Prior art keywords
- game
- data
- facial emotion
- facial
- real
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63F—CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
- A63F13/00—Video games, i.e. games using an electronically generated display having two or more dimensions
- A63F13/20—Input arrangements for video game devices
- A63F13/21—Input arrangements for video game devices characterised by their sensors, purposes or types
- A63F13/212—Input arrangements for video game devices characterised by their sensors, purposes or types using sensors worn by the player, e.g. for measuring heart beat or leg activity
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F16/00—Information retrieval; Database structures therefor; File system structures therefor
- G06F16/50—Information retrieval; Database structures therefor; File system structures therefor of still image data
- G06F16/53—Querying
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/011—Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
- G06F3/013—Eye tracking input arrangements
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V40/00—Recognition of biometric, human-related or animal-related patterns in image or video data
- G06V40/10—Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
- G06V40/16—Human faces, e.g. facial parts, sketches or expressions
- G06V40/174—Facial expression recognition
Landscapes
- Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- Human Computer Interaction (AREA)
- Multimedia (AREA)
- General Engineering & Computer Science (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Health & Medical Sciences (AREA)
- General Health & Medical Sciences (AREA)
- Oral & Maxillofacial Surgery (AREA)
- Life Sciences & Earth Sciences (AREA)
- Biophysics (AREA)
- Cardiology (AREA)
- Heart & Thoracic Surgery (AREA)
- Data Mining & Analysis (AREA)
- Databases & Information Systems (AREA)
- Management, Administration, Business Operations System, And Electronic Commerce (AREA)
Abstract
The invention relates to a VR game optimization system and method based on facial emotion recognition. The facial emotion parameters of a player during the game are recorded, the facial emotion is rated by edge cloud computing, and the game data of the time periods rated as negative are screened out and fed back to the game manufacturer for expert analysis, after which the game scene is optimized. This improves data-processing efficiency and reduces the computational load on the data management cloud center.
Description
Technical Field
The invention discloses a VR game optimization system and method based on facial emotion recognition, in which game data are screened by means of facial emotion recognition, and belongs to the technical field of virtual reality.
Background
Virtual reality technology simulates a virtual environment with a computer, giving people a sense of environmental immersion. Thanks to their novelty and immersive experience, VR games have become one of the most popular forms of entertainment. The existing game optimization approach collects players' comments and opinions about the game experience manually through internet platforms, collates and aggregates them, and feeds them back to background developers for game optimization. However, this approach has a cumbersome workflow and a long feedback loop, so the optimization and update cycle is long and players' needs cannot be met quickly enough to improve the game experience. Moreover, because VR games involve distinctive interactions between the player and the virtual scene, the criteria for rating a game are more diverse, and background developers find it difficult, from textual comments and opinions alone, to accurately locate the game scenes that need optimization and to optimize them in a targeted way.
Publication No. CN113426132A discloses a game optimization method, apparatus, device and storage medium. The method comprises: screening the comments in the comment area of a game to obtain target comments; splitting the game into function modules; and associating the target comments with the corresponding function module so that the server can optimize the game based on them. That method automatically attaches target comments to the relevant function module of the game and feeds them back, which speeds up the handling of comments from game forums and game communities and thus accelerates the game's optimization and update cycle.
Disclosure of Invention
To improve this situation, the invention provides a VR game optimization system and method based on facial emotion recognition, in which the facial emotion parameters of a player during the game are recorded, the facial emotion is rated by edge cloud computing, the game data of the time periods rated as negative are screened out and fed back to the game manufacturer for expert analysis, and the game scene is then optimized.
The VR game optimization system based on facial emotion recognition of the invention is realized as follows. The system comprises VR glasses, a VR game host, a data management cloud center, edge clouds and a game manufacturer terminal, wherein the VR glasses have a facial emotion recognition function and a signal connection is established between the VR glasses and the VR game host.
the facial emotion recognition module of the VR glasses collects facial emotion parameters in the game process of a player and sends the facial emotion parameters to the data collection module of the VR game host,
Preferably, the facial emotion parameters comprise the number of blinks, the degree of pupil contraction and the eye micro-expression.
the data management cloud center consists of a data storage module, a data scheduling module and a data integration module,
the data collection module of the VR game host collects game data and facial emotion parameters generated in the VR game process and sends the game data and the facial emotion parameters to the data storage module of the data management cloud center,
Preferably, the game data collected by the VR game host include the game name, the real-time scene interaction audio and video, and the like. The data storage module of the data management cloud center forwards the received facial emotion parameters and game data to the data scheduling module, which sends them to the corresponding edge cloud according to the game name. Each edge cloud is provided with a facial emotion assessment database, and by comparing the facial emotion parameters with the data in this database, the facial emotion in each time period is rated.
Preferably, the facial emotion rating is a composite score accumulated from the weighted values of three evaluation indexes, namely the number of blinks, the degree of pupil contraction and the eye micro-expression, and the score is then divided into three grades: negative, flat and excited.
the edge cloud selects real-time scene interactive audio and video of corresponding time periods from the game data according to the facial emotion rated as negative, and sends the selected real-time scene interactive audio and video to a data integration module of the data management cloud center,
The data integration module of the data management cloud center visually integrates the received real-time scene interaction audio and video and feeds the integrated audio and video back to the corresponding game manufacturer terminal according to the game name.
After receiving the fed-back real-time scene interaction audio and video, the game manufacturer analyses it professionally and optimizes the defects of the interactive scene.
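The scheduling step described above, in which the cloud center routes each player's facial emotion parameters and game data to an edge cloud according to the game name, can be sketched as follows. The game names, the `EDGE_CLOUDS` table and the `dispatch` helper are illustrative assumptions, not elements defined in the patent.

```python
# Hypothetical sketch of the data-scheduling step: records are routed to an
# edge cloud keyed by game name. All names here are illustrative.

EDGE_CLOUDS = {"SwordQuest VR": [], "SkyRacer VR": []}  # per-game edge cloud inboxes

def dispatch(record):
    """Send one record (facial emotion parameters + game data) to its game's edge cloud."""
    game = record["game_name"]
    EDGE_CLOUDS.setdefault(game, []).append(record)  # unknown games get a new inbox
    return game

dispatch({"game_name": "SwordQuest VR", "emotion": {"blinks": 14}, "av": "clip_001"})
dispatch({"game_name": "SkyRacer VR", "emotion": {"blinks": 9}, "av": "clip_002"})
```

Keying the route on the game name keeps all data for one title on one edge cloud, which is what lets that cloud rate emotions and cut the matching audio/video locally.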
the invention relates to a game optimization method of a VR game optimization system applying facial emotion recognition, which comprises the following steps:
1) the VR glasses collect the facial emotion parameters of the player during the game and send the parameters to the VR game host,
2) the VR game host collects game data and facial emotion parameters generated in the VR game process and sends the game data and the facial emotion parameters to the data management cloud center,
3) the data storage module of the data management cloud center uniformly transmits the received facial emotion parameters and game data to the data scheduling module of the data management cloud center, the data scheduling module transmits the facial emotion parameters and the game data to corresponding edge clouds according to game names,
4) the edge cloud carries out rating division on the facial emotions in different periods by comparing the facial emotion parameters with the data in the facial emotion assessment database,
5) the edge cloud selects real-time scene interactive audio and video of corresponding time periods from the game data according to the facial emotion rated as negative, and sends the selected real-time scene interactive audio and video to a data integration module of the data management cloud center,
6) the data integration module visually integrates the received real-time scene interaction audio and video and feeds the integrated audio and video back to the corresponding game manufacturer terminal according to the game name,
7) after receiving the fed back real-time scene interactive audio and video, game manufacturers carry out professional analysis on the real-time scene interactive audio and video, and optimize the defects of the interactive scene;
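Steps 1)-7) above can be condensed into a toy pipeline. The rating rule, the blink threshold and every name below are illustrative stand-ins, not the scoring method defined in the patent.

```python
# Toy end-to-end sketch of steps 1)-7): per-period records are rated, the
# negatively rated periods are kept, and their A/V clips are returned for
# vendor analysis. All names and thresholds are illustrative assumptions.

def rate_emotion(params):
    # placeholder rating rule: frequent blinking in a period -> "negative"
    return "negative" if params["blinks"] > 20 else "flat"

def optimize_pipeline(samples):
    """samples: per-period records collected by the glasses and game host (steps 1-2)."""
    flagged = []
    for s in samples:                      # steps 3)-4): edge cloud rates each period
        if rate_emotion(s) == "negative":  # step 5): keep negatively rated periods
            flagged.append(s["av_clip"])
    return flagged                         # steps 6)-7): clips fed back to the vendor

clips = optimize_pipeline([
    {"period": "00:00-00:30", "blinks": 25, "av_clip": "scene_A"},
    {"period": "00:30-01:00", "blinks": 8,  "av_clip": "scene_B"},
])
```

The point of the design is visible even in this sketch: only the clips from negatively rated periods travel onward, so the cloud center and the vendor never handle the full session recording.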
the invention also relates to VR glasses with a facial emotion recognition function, which consists of a VR glasses shell, a soft cushion, an elastic band, a ventilation groove, a camera, a lens, a first shading sheet and a second shading sheet, the VR glasses shell is of a rectangular structure, two lenses are symmetrically arranged on the VR glasses shell, the camera is arranged on the VR glasses shell, the camera is flush with the two lenses and is arranged between the two lenses, the two side walls of the VR glasses shell are correspondingly connected with the two ends of the elastic band, a circle of soft pad is arranged at the edge of the VR glasses shell, two ends of the soft pad are respectively provided with a ventilation groove correspondingly, two first anti-dazzling screens have been put to ventilation groove one side inner wall equidistance, ventilation groove opposite side inner wall has been put the second anti-dazzling screen, the second anti-dazzling screen is located between two first anti-dazzling screens.
Advantageous effects
First, by recording the player's facial emotion parameters during the game, rating the facial emotion, screening out the game data of the corresponding time periods according to the negative ratings and feeding them back to the game manufacturer for expert analysis, the game scene is optimized, the time a background developer needs to find the scenes requiring optimization is shortened, and the efficiency of game optimization and updating is improved.
Secondly, the facial emotion parameters and game data are processed and analysed quickly by the edge clouds, which improves data-processing efficiency and reduces the computational load on the data management cloud center.
Drawings
FIG. 1 is a schematic diagram of a VR game optimization system based on facial emotion recognition in accordance with the present invention;
FIG. 2 is a flow chart of a VR game optimization method based on facial emotion recognition in accordance with the present invention;
FIG. 3 is a schematic diagram of a left-view structure of VR glasses with facial emotion recognition function according to the present invention;
fig. 4 is a schematic structural diagram of VR glasses with a facial emotion recognition function according to the present invention.
In the drawings: VR glasses shell (1), soft cushion (2), elastic band (3), ventilation groove (4), camera (5), lens (6), first light-shielding sheet (7), second light-shielding sheet (8).
The specific implementation mode is as follows:
the invention discloses a VR game optimization system based on facial emotion recognition, which is realized as follows: the invention relates to a VR game optimization system based on facial emotion recognition, which comprises VR glasses, a VR game host, a data management cloud center, an edge cloud and a game manufacturer terminal, wherein the VR glasses have a facial emotion recognition function, signal interaction is established between the VR glasses and the VR game host,
the facial emotion recognition module of the VR glasses collects facial emotion parameters in the game process of a player and sends the facial emotion parameters to the data collection module of the VR game host,
Preferably, the facial emotion parameters comprise the number of blinks, the degree of pupil contraction and the eye micro-expression.
the data management cloud center consists of a data storage module, a data scheduling module and a data integration module,
the data collection module of the VR game host collects game data and facial emotion parameters generated in the VR game process and sends the game data and the facial emotion parameters to the data storage module of the data management cloud center,
Preferably, the game data collected by the VR game host include the game name, the real-time scene interaction audio and video, and the like. The data storage module of the data management cloud center forwards the received facial emotion parameters and game data to the data scheduling module, which sends them to the corresponding edge cloud according to the game name. Each edge cloud is provided with a facial emotion assessment database, and by comparing the facial emotion parameters with the data in this database, the facial emotion in each time period is rated.
Preferably, the facial emotion rating is a composite score accumulated from the weighted values of three evaluation indexes, namely the number of blinks, the degree of pupil contraction and the eye micro-expression, and the score is then divided into three grades: negative, flat and excited.
the edge cloud selects real-time scene interactive audio and video of corresponding time periods from the game data according to the facial emotion rated as negative, and sends the selected real-time scene interactive audio and video to a data integration module of the data management cloud center,
The data integration module of the data management cloud center visually integrates the received real-time scene interaction audio and video and feeds the integrated audio and video back to the corresponding game manufacturer terminal according to the game name.
After receiving the fed-back real-time scene interaction audio and video, the game manufacturer analyses it professionally and optimizes the defects of the interactive scene.
the invention relates to a game optimization method of a VR game optimization system applying facial emotion recognition, which comprises the following steps:
1) the VR glasses collect facial emotion parameters in the game process of the player and send the facial emotion parameters to the VR game host, preferably, the facial emotion parameters comprise blinking times, pupil contraction degree and eye micro expression,
2) the VR game host collects game data and facial emotion parameters generated in the VR game process and sends the game data and the facial emotion parameters to the data management cloud center,
Preferably, the game data collected by the VR game host comprise the game name and the real-time scene interaction audio and video,
3) the data storage module of the data management cloud center uniformly transmits the received facial emotion parameters and game data to the data scheduling module of the data management cloud center, the data scheduling module transmits the facial emotion parameters and the game data to corresponding edge clouds according to game names,
Preferably, the data scheduling module may also send the facial emotion parameters and the game data of the same game to different edge clouds; after an edge cloud has processed the facial emotion parameters, it sends the result to the edge cloud that processes the game data of the same game for analysis,
4) the edge cloud carries out rating division on the facial emotions in different periods by comparing the facial emotion parameters with the data in the facial emotion assessment database,
5) the edge cloud selects real-time scene interactive audio and video of corresponding time periods from the game data according to the facial emotion rated as negative, and sends the selected real-time scene interactive audio and video to a data integration module of the data management cloud center,
Preferably, the selected real-time scene interaction audio and video can be screened once more by the edge cloud,
6) the data integration module visually integrates the received real-time scene interaction audio and video and feeds the integrated audio and video back to the corresponding game manufacturer terminal according to the game name,
7) after receiving the fed back real-time scene interactive audio and video, game manufacturers carry out professional analysis on the real-time scene interactive audio and video, and optimize the defects of the interactive scene;
the index weight algorithm of the evaluation index is specifically described as follows:
1) the 3 evaluation indexes are numbered A1-A3 in sequence, the relative importance degree of each evaluation index is determined, and a hierarchical structure model and a judgment matrix are constructed;
2) each column of the judgment matrix is normalized, the general element being aij, where aij denotes the element in row i and column j of the judgment matrix;
3) the column-normalized judgment matrix is summed row by row, and the row sums are normalized to obtain the eigenvector W of the judgment matrix;
4) the maximum eigenvalue of the judgment matrix is calculated from the judgment matrix and the eigenvector, where (AmW)i denotes the i-th element of the vector AmW, m ∈ [1, 3], and Am denotes the judgment matrix with the corresponding number;
5) a consistency check is performed on the judgment matrix to obtain the index weight of each evaluation index, with CR = CI/RI, where CI denotes the consistency index and RI the random consistency index.
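Steps 1)-5) describe a standard analytic-hierarchy-process (AHP) weight computation. The sketch below implements it in pure Python for the n = 3 evaluation indexes; the judgment matrix is an illustrative example, not taken from the patent (RI = 0.58 is the usual random consistency index for a 3x3 matrix).

```python
# Pure-Python AHP sketch for the three evaluation indexes A1-A3.
# The judgment matrix A below is an illustrative assumption.

A = [  # pairwise relative importance of A1 (blinks), A2 (pupil), A3 (micro-expression)
    [1.0, 2.0, 0.5],
    [0.5, 1.0, 0.25],
    [2.0, 4.0, 1.0],
]
n = len(A)

# steps 2)-3): normalise each column, then average across rows -> eigenvector W
col_sums = [sum(A[i][j] for i in range(n)) for j in range(n)]
W = [sum(A[i][j] / col_sums[j] for j in range(n)) / n for i in range(n)]

# step 4): largest-eigenvalue estimate  lambda_max = (1/n) * sum_i (A W)_i / W_i
AW = [sum(A[i][j] * W[j] for j in range(n)) for i in range(n)]
lam_max = sum(AW[i] / W[i] for i in range(n)) / n

# step 5): consistency ratio CR = CI / RI with CI = (lambda_max - n) / (n - 1);
# the weights W are accepted only when CR < 0.1
CI = (lam_max - n) / (n - 1)
CR = CI / 0.58
```

With this example matrix the weights come out proportional to (2, 1, 4)/7 and the matrix is perfectly consistent (CR ≈ 0), so the weights would be accepted.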
The facial emotion rating model is obtained by the following method:
According to the calculated index weights, let the score for the player's number of blinks during the game be C1 with index weight c1, the score for the degree of pupil contraction be C2 with index weight c2, and the score for the eye micro-expression be C3 with index weight c3; the evaluation model for the facial emotion rating is then G = c1 × C1 + c2 × C2 + c3 × C3;
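The evaluation model G = c1 × C1 + c2 × C2 + c3 × C3 and the three-grade division might be combined as in the sketch below. The weights and the grade thresholds are illustrative assumptions, since the patent gives no numeric values for them.

```python
# Sketch of the facial emotion rating model: a weighted composite score G
# mapped onto the three grades. Weights and thresholds are assumed values.

def facial_emotion_grade(C1, C2, C3, weights=(0.29, 0.14, 0.57)):
    """C1, C2, C3: scores for blink count, pupil contraction, eye micro-expression."""
    c1, c2, c3 = weights
    G = c1 * C1 + c2 * C2 + c3 * C3   # composite score G = c1*C1 + c2*C2 + c3*C3
    if G < 40:                        # assumed band for "negative"
        return G, "negative"
    if G < 70:                        # assumed band for "flat"
        return G, "flat"
    return G, "excited"

G, grade = facial_emotion_grade(30, 50, 20)  # a session with low engagement scores
```

A period graded "negative" by this function is exactly the kind of period whose audio/video the edge cloud would cut out and forward to the manufacturer.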
the invention also relates to VR glasses with a facial emotion recognition function, which comprise a VR glasses shell (1), a soft cushion (2), an elastic band (3), a ventilation groove (4), a camera (5), lenses (6), a first light shading sheet (7) and a second light shading sheet (8), wherein the VR glasses shell (1) is of a rectangular structure, two lenses (6) are symmetrically arranged on the VR glasses shell (1), the camera (5) is flush with the two lenses (6), the camera (5) is arranged between the two lenses, two side walls of the VR glasses shell (1) are correspondingly connected with two ends of the elastic band (3), a circle of soft cushion (2) is arranged on the edge of the VR glasses shell (1), two ends of the soft cushion (2) are respectively and correspondingly provided with the ventilation groove (4), two first light-shielding sheets (7) are arranged on the inner wall of one side of the ventilation groove (4) at equal intervals, a second light-shielding sheet (8) is arranged on the inner wall of the other side of the ventilation groove (4), and the second light-shielding sheet (8) is positioned between the two first light-shielding sheets (7);
In use, the VR glasses are worn on the head by means of the elastic band (3) so that the soft cushion (2) fits against the area around the eyes. The player watches the VR game scene through the lenses (6) and starts the game; at the same time the camera (5) is switched on to record the player's facial emotion parameters, which are sent via the facial emotion parameter storage module in the VR glasses shell (1) to the data collection module of the VR game host. The ventilation grooves (4) at the two ends of the soft cushion (2) ventilate the cushion and prevent the lens of the camera (5) from fogging, which would interfere with the recording of facial emotion. At the same time, the first light-shielding sheets (7) and the second light-shielding sheet (8) inside the ventilation groove (4) shield each other, so that the ventilation groove (4) does not affect the player's immersion during the game.
The invention thus records the player's facial emotion parameters during the game, rates the facial emotion using edge cloud computing, screens out the game data of the corresponding time periods according to the negative ratings, and feeds them back to the game manufacturer for expert analysis, thereby optimizing the game scene.
Other similar embodiments of the invention will be apparent to those skilled in the art from consideration of the specification and practice of the invention disclosed herein, which application is intended to cover any adaptations or variations of the invention following, in general, the principles of the invention and including such departures from the present disclosure as come within known or customary practice within the art to which the invention pertains.
Claims (9)
1. A VR game optimization system based on facial emotion recognition, comprising VR glasses, a VR game host, a data management cloud center, edge clouds and a game manufacturer terminal, wherein the VR glasses have a facial emotion recognition function and a signal connection is established between the VR glasses and the VR game host; the facial emotion recognition module of the VR glasses collects facial emotion parameters during the player's game session and sends them to the data collection module of the VR game host; the data collection module of the VR game host collects the game data and facial emotion parameters generated during the VR game and sends them to the data storage module of the data management cloud center; the data management cloud center consists of a data storage module, a data scheduling module and a data integration module; the data storage module forwards the received facial emotion parameters and game data to the data scheduling module, which sends them to the corresponding edge cloud according to the game name; the edge cloud selects, according to the facial emotion rated as negative, the real-time scene interaction audio and video of the corresponding time periods from the game data and sends them to the data integration module of the data management cloud center; the data integration module visually integrates the received real-time scene interaction audio and video and feeds the integrated audio and video back to the corresponding game manufacturer terminal according to the game name; and after receiving the fed-back real-time scene interaction audio and video, the game manufacturer analyses it professionally and optimizes the defects of the interactive scene.
2. The VR game optimization system of claim 1, wherein the facial emotion parameters include the number of blinks, the degree of pupil contraction and the eye micro-expression.
3. The VR game optimization system based on facial emotion recognition of claim 1, wherein the edge cloud has a facial emotion assessment database, and facial emotions of different time periods are graded by comparing facial emotion parameters with data in the facial emotion assessment database.
4. The VR game optimization system based on facial emotion recognition of claim 1, wherein the facial emotion rating is a composite score accumulated from the weighted values of three evaluation indexes, namely the number of blinks, the degree of pupil contraction and the eye micro-expression, and the rating comprises three grades: negative, flat and excited.
5. A game optimization method of a VR game optimization system applying facial emotion recognition is characterized by comprising the following steps:
1) the VR glasses collect the facial emotion parameters of the player during the game and send the parameters to the VR game host,
2) the VR game host collects game data and facial emotion parameters generated in the VR game process and sends the game data and the facial emotion parameters to the data management cloud center,
3) the data storage module of the data management cloud center uniformly transmits the received facial emotion parameters and game data to the data scheduling module of the data management cloud center, the data scheduling module transmits the facial emotion parameters and the game data to corresponding edge clouds according to game names,
4) the edge cloud carries out rating division on the facial emotions in different periods by comparing the facial emotion parameters with the data in the facial emotion assessment database,
5) the edge cloud selects real-time scene interactive audios and videos in corresponding time periods from game data according to the facial emotion rated as negative, and sends the selected real-time scene interactive audios and videos to a data integration module of a data management cloud center,
6) the data integration module visually integrates the received real-time scene interaction audio and video and feeds the integrated audio and video back to the corresponding game manufacturer terminal according to the game name,
7) after receiving the fed back real-time scene interactive audio and video, game manufacturers carry out professional analysis on the real-time scene interactive audio and video, and optimize the defects of the interactive scene.
6. The VR game optimization method based on facial emotion recognition of claim 5, wherein the data scheduling module is further configured to send facial emotion parameters and game data of the same game to different edge clouds respectively, and the edge clouds process the facial emotion parameter data and then send the processed facial emotion parameter data to the edge clouds processing the same game data for analysis.
7. The VR game optimization method based on facial emotion recognition of claim 5, wherein the selected real-time scene interactive audios and videos can be screened again through an edge cloud.
8. The VR game optimization method of claim 5, wherein the facial emotion rating model is obtained by setting, according to the calculated index weights, the score for the player's number of blinks during the game to C1 with index weight c1, the score for the degree of pupil contraction to C2 with index weight c2, and the score for the eye micro-expression to C3 with index weight c3, the evaluation model for the facial emotion rating then being G = c1 × C1 + c2 × C2 + c3 × C3.
9. VR glasses with a facial emotion recognition function, characterized by consisting of a VR glasses shell, a soft cushion, an elastic band, ventilation grooves, a camera, lenses, first light-shielding sheets and a second light-shielding sheet, wherein the VR glasses shell has a rectangular structure; two lenses are arranged symmetrically on the shell; the camera is arranged on the shell, flush with the two lenses and placed between them; the two side walls of the shell are connected to the two ends of the elastic band; a ring of soft cushion is arranged around the edge of the shell; a ventilation groove is correspondingly opened at each of the two ends of the cushion; two first light-shielding sheets are arranged at equal intervals on the inner wall of one side of the ventilation groove; a second light-shielding sheet is arranged on the inner wall of the other side; and the second light-shielding sheet is located between the two first light-shielding sheets.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202210557768.3A CN114949840A (en) | 2022-05-19 | 2022-05-19 | VR game optimization system and method based on facial emotion recognition |
Publications (1)
Publication Number | Publication Date |
---|---|
CN114949840A true CN114949840A (en) | 2022-08-30 |
Family
ID=82986045
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202210557768.3A Pending CN114949840A (en) | 2022-05-19 | 2022-05-19 | VR game optimization system and method based on facial emotion recognition |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN114949840A (en) |
Cited By (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN116993405A (en) * | 2023-09-25 | 2023-11-03 | 深圳市火星人互动娱乐有限公司 | Method, device and system for implanting advertisements into VR game |
CN116993405B (en) * | 2023-09-25 | 2023-12-05 | 深圳市火星人互动娱乐有限公司 | Method, device and system for implanting advertisements into VR game |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN109544524B (en) | Attention mechanism-based multi-attribute image aesthetic evaluation system | |
CN111915148B (en) | Classroom teaching evaluation method and system based on information technology | |
CN108874959B (en) | User dynamic interest model building method based on big data technology | |
CN109491915B (en) | Data processing method and device, medium and computing equipment | |
CN114949840A (en) | VR game optimization system and method based on facial emotion recognition | |
CN110531849A (en) | Intelligent teaching system based on 5G communication and capable of enhancing reality | |
CN116090065B (en) | Digital twinning-based smart city greening design method and device | |
CN107993170A (en) | A kind of psychological health education system based on virtual reality technology | |
CN110321409A (en) | Secondary surface method for testing, device, equipment and storage medium based on artificial intelligence | |
CN110198453A (en) | Live content filter method, storage medium, equipment and system based on barrage | |
CN116630106A (en) | Intelligent training interactive teaching management method and system | |
CN117745494A (en) | Multi-terminal-fusion 3D video digital OSCE examination station system | |
CN109214448A (en) | Non- good performance staff training method, system, terminal and computer readable storage medium | |
CN111026267A (en) | VR electroencephalogram idea control interface system | |
CN117251057A (en) | AIGC-based method and system for constructing AI number wisdom | |
CN116484051A (en) | Course assessment method based on knowledge training platform | |
CN111062074A (en) | Building space quality virtual simulation and intelligent evaluation method | |
CN116966577A (en) | VR game optimization system and method based on facial emotion recognition | |
CN116681613A (en) | Illumination-imitating enhancement method, device, medium and equipment for face key point detection | |
KR102643159B1 (en) | A matching method that finds empty space in lcl containers in real time during container import and export | |
CN113435116B (en) | Sound quality self-adaptive design method and device based on virtual driving stand | |
CN115776446A (en) | Service quality monitoring system based on big data | |
CN109408638A (en) | Calibrate set update method and device | |
CN113592765A (en) | Image processing method, device, equipment and storage medium | |
CN106682204B (en) | Semantic extraction method based on crowdsourcing |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||