CN116597118A - Multi-scene VR interactive system - Google Patents


Info

Publication number
CN116597118A
Authority
CN
China
Prior art keywords
interactive
scene
user
module
interaction
Prior art date
Legal status
Withdrawn
Application number
CN202211633657.2A
Other languages
Chinese (zh)
Inventor
吴忠利 (Wu Zhongli)
Current Assignee
Hangzhou Sanshang Technology Co., Ltd.
Original Assignee
Hangzhou Sanshang Technology Co., Ltd.
Priority date
Filing date
Publication date
Application filed by Hangzhou Sanshang Technology Co., Ltd.
Priority to CN202211633657.2A
Publication of CN116597118A


Classifications

    • G06T 19/00 — Manipulating 3D models or images for computer graphics
    • G06F 16/2457 — Query processing with adaptation to user needs
    • G06F 21/32 — User authentication using biometric data, e.g. fingerprints, iris scans or voiceprints
    • Y02D 10/00 — Energy efficient computing, e.g. low power processors, power management or thermal management


Abstract

The application relates to the technical field of VR and, in particular, discloses a multi-scene VR interactive system. The system acquires a user's history data and performs character analysis on it to generate user character information; it combines the user's VR interaction requirements with that character information to perform scene analysis and construct a VR interaction scene; it matches and marks a target role according to the character information; it adjusts the target role's standard interaction perception parameters according to the VR interaction requirements to generate optimized interaction perception parameters; and it carries out VR interaction in the VR interaction scene based on those optimized parameters. Because the VR scene, the interaction role, and the perception parameters are constructed, matched, and adjusted according to multiple factors such as the user's character and requirements, the system can meet the actual needs of different users.

Description

Multi-scene VR interactive system
Technical Field
The application belongs to the technical field of VR, and particularly relates to a multi-scene VR interactive system.
Background
VR technology, also known as virtual reality technology, is a practical technology developed in the twentieth century that combines computing, electronic information, and simulation. Its basic approach takes computer technology as the core and integrates the latest achievements of three-dimensional graphics, multimedia, simulation, display, servo, and other high technologies to generate, by means of computers and related equipment, a realistic virtual world offering three-dimensional visual, tactile, olfactory, and other sensory experiences, so that a person in the virtual world feels fully immersed.
A multi-scene VR interactive system is one concrete application of VR technology. However, existing multi-scene VR interactive systems generally provide only a fixed set of VR interaction scenes for the user to choose among; they cannot construct and match VR scenes and interactive roles according to factors such as the user's character and requirements. VR interaction is therefore monotonous, and as VR technology develops and spreads, such a rigid system can no longer meet users' needs.
Disclosure of Invention
The embodiment of the application aims to provide a multi-scene VR interactive system, which aims to solve the problems in the background technology.
In order to achieve the above object, the embodiment of the present application provides the following technical solutions:
A multi-scene VR interactive system comprises a user character analysis unit, an interactive scene construction unit, a target role matching unit, a parameter optimization and adjustment unit, and a VR scene interaction unit, wherein:
the user character analysis unit is used for acquiring historical record data of a user, carrying out character analysis on the historical record data and generating user character information of the user;
the interactive scene construction unit is used for acquiring VR interactive requirements of users, and carrying out scene analysis by integrating the VR interactive requirements and the user character information to construct VR interactive scenes;
the target role matching unit is used for acquiring a plurality of interactive roles in the VR interactive scene, and matching and marking the target roles according to the user character information;
the parameter optimization and adjustment unit is used for acquiring the standard interaction perception parameters of the target role, and carrying out demand adjustment according to the VR interaction demand to generate optimized interaction perception parameters;
and the VR scene interaction unit is used for carrying out VR interaction in the VR interaction scene by taking the user as the first view angle of the target role based on the optimized interaction perception parameter.
As a further limitation of the technical solution of the embodiment of the present application, the user personality analysis unit specifically includes:
the identity verification module is used for carrying out user identity verification and judging whether the verification is successful or not;
the information acquisition module is used for acquiring the identity information of the user after the verification is successful;
the history matching module is used for matching and acquiring history record data of the user according to the identity information;
and the character analysis module is used for carrying out character analysis on the historical record data and generating user character information of the user.
As a further limitation of the technical solution of the embodiment of the present application, the personality analysis module specifically includes:
the correlation extraction sub-module is used for extracting character correlation data in the history record data;
the character analysis sub-module is used for carrying out character analysis on the character related data to obtain a plurality of character labels;
and the label synthesis sub-module is used for synthesizing a plurality of character labels and generating user character information of the user.
As a further limitation of the technical solution of the embodiment of the present application, the interactive scene construction unit specifically includes:
the window display module is used for generating and displaying an interactive demand input window;
the demand receiving module is used for receiving VR interaction demands input by a user through the interaction demand input window;
the information extraction module is used for extracting key requirement information in the VR interaction requirement;
and the scene construction module is used for carrying out scene analysis by integrating the key demand information and the user character information to construct a VR interactive scene.
As further defined by the technical solution of the embodiment of the present application, the scene construction module specifically includes:
the scene matching sub-module is used for matching a plurality of related demand scenes according to the key demand information;
the correlation analysis sub-module is used for carrying out character correlation analysis on a plurality of related demand scenes according to the character information of the user and generating correlation analysis results;
and the scene selection sub-module is used for selecting a VR interactive scene from a plurality of related demand scenes according to the association analysis result.
As further defined by the technical solution of the embodiment of the present application, the target role matching unit specifically includes:
the role acquisition module is used for acquiring a plurality of interactive roles in the VR interactive scene;
the requirement extraction module is used for extracting role requirement information from the VR interaction requirement;
and the role matching module is used for integrating the role demand information and the user character information, and matching and marking target roles from a plurality of interactive roles.
As further defined by the technical solution of the embodiment of the present application, the role matching module specifically includes:
the standard generation sub-module is used for synthesizing the role demand information and the user character information to generate a role matching standard;
the matching evaluation sub-module is used for carrying out matching evaluation on a plurality of interactive roles according to the role matching standard to generate matching evaluation information;
and the role matching sub-module is used for matching and marking target roles from the interactive roles according to the matching evaluation information.
As a further limitation of the technical solution of the embodiment of the present application, the parameter optimization adjusting unit specifically includes:
the related acquisition module is used for acquiring role related information of the target role;
the standard acquisition module is used for acquiring standard interactive perception parameters from the role related information;
and the parameter adjusting module is used for carrying out demand adjustment on the standard interactive perception parameters according to the VR interactive demand to generate optimized interactive perception parameters.
As a further limitation of the technical solution of the embodiment of the present application, the parameter adjusting module specifically includes:
the interval acquisition sub-module is used for analyzing the VR interaction requirement and acquiring a parameter requirement interval;
the adjustment planning sub-module is used for integrating the standard interactive perception parameters and the parameter demand interval to carry out adjustment planning and generating adjustment planning data;
and the parameter adjustment sub-module is used for carrying out demand adjustment on the standard interactive perception parameters according to the adjustment planning data to generate optimized interactive perception parameters.
As further defined by the technical solution of the embodiment of the present application, the VR scene interaction unit specifically includes:
the view angle creating module is used for creating a first view angle of the VR interactive scene for the user according to the target role;
the perception acquisition module is used for acquiring interactive perception data in the VR interactive scene in real time;
and the interactive transmission module is used for carrying out VR interactive transmission on the interactive perception data to the user based on the optimized interactive perception parameters.
Compared with the prior art, the application has the beneficial effects that:
according to the embodiment of the application, character analysis is carried out by acquiring the historical record data of the user, so that character information of the user is generated; performing scene analysis by combining VR interaction requirements and user character information to construct VR interaction scenes; matching and marking target roles according to the character information of the user; performing demand adjustment according to VR interaction demands to generate optimized interaction perception parameters; and carrying out VR interaction in the VR interaction scene based on the optimized interaction perception parameters. The VR interaction requirement and the user character information of the user can be synthesized to carry out scene analysis, a VR interaction scene is built, target roles are matched and marked according to the user character information, and the interaction perception parameters are adjusted to generate optimized interaction perception parameters, so that the VR scene, the interaction roles and the perception parameters are built, matched and adjusted according to factors in multiple aspects such as the user character and the requirement, and the actual requirements of different users are met.
Drawings
To illustrate the technical solutions of the embodiments of the present application more clearly, the following description briefly introduces the drawings needed for the embodiments or for the description of the prior art; the drawings described below obviously show only some embodiments of the present application.
Fig. 1 shows an application architecture diagram of a system provided by an embodiment of the present application.
Fig. 2 is a block diagram illustrating a configuration of a user personality analysis unit in the system according to an embodiment of the present application.
Fig. 3 is a block diagram illustrating a configuration of a personality analysis module in the system according to an embodiment of the present application.
Fig. 4 shows a block diagram of an interactive scene construction unit in the system according to an embodiment of the present application.
Fig. 5 shows a block diagram of a scene building module in the system according to an embodiment of the present application.
Fig. 6 is a block diagram showing the structure of a target character matching unit in the system according to the embodiment of the present application.
Fig. 7 is a block diagram illustrating a configuration of a role matching module in a system according to an embodiment of the present application.
Fig. 8 shows a block diagram of a parameter optimization adjusting unit in the system according to an embodiment of the present application.
Fig. 9 shows a block diagram of a parameter adjustment module in a system according to an embodiment of the present application.
Fig. 10 shows a block diagram of a VR scene interaction unit in a system according to an embodiment of the present application.
Detailed Description
The present application will be described in further detail with reference to the drawings and examples, in order to make the objects, technical solutions and advantages of the present application more apparent. It should be understood that the specific embodiments described herein are for purposes of illustration only and are not intended to limit the scope of the application.
It can be appreciated that, in the prior art, a multi-scene VR interactive system generally provides only a fixed set of VR interaction scenes for the user to choose among; it cannot construct and match VR scenes and interactive roles according to factors such as the user's character and requirements. VR interaction is therefore monotonous, and as VR technology develops and spreads, such a rigid system can no longer meet users' needs.
To solve these problems, the embodiment of the application performs character analysis on the user's history data to generate user character information; performs scene analysis combining the VR interaction requirements with the user character information to construct a VR interaction scene; matches and marks a target role according to the user character information; adjusts the standard interaction perception parameters according to the VR interaction requirements to generate optimized interaction perception parameters; and carries out VR interaction in the VR interaction scene based on the optimized parameters. Because the VR scene, the interaction role, and the perception parameters are constructed, matched, and adjusted according to multiple factors such as the user's character and requirements, the system can meet the actual needs of different users.
Fig. 1 shows an application architecture diagram of a system provided by an embodiment of the present application.
Specifically, in another preferred embodiment of the present application, a multi-scene VR interactive system includes:
the user personality analysis unit 101 is configured to obtain historical record data of a user, perform personality analysis on the historical record data, and generate user personality information of the user.
In the embodiment of the application, before the user performs VR scene interaction, the user character analysis unit 101 collects the user's verification information and uses it to verify the user's identity, judging whether verification succeeds. If verification fails, the user is refused any subsequent operation; if it succeeds, the unit acquires the user's identity information, matches the user's history data from a database according to that identity information, analyzes the history data to extract character-related data, performs character analysis on the character-related data to obtain a plurality of character labels, and synthesizes the character labels to generate the user character information.
It will be appreciated that identity verification can take a variety of forms, specifically including facial recognition, voice verification, fingerprint verification, and account/password verification; character analysis includes summarizing both the user's personality traits and the user's hobby traits.
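As a minimal sketch of how the identity verification step described above might dispatch among the listed verification methods, consider the following. All function, field, and dictionary names are illustrative assumptions, not taken from the patent; only the account/password path is fleshed out, with the biometric paths left as stubs.

```python
# Hypothetical sketch of the identity verification module: dispatch a
# credential to one of several verification methods and report success.
# All names here are illustrative assumptions, not from the patent.

def verify_account_password(credential, store):
    """Account/password check against a simple user store (dict)."""
    stored_password = store.get(credential.get("account"))
    return stored_password is not None and stored_password == credential.get("password")

VERIFIERS = {
    # Facial, voice, and fingerprint checks would call biometric backends;
    # only the account/password path is implemented for illustration.
    "password": verify_account_password,
}

def authenticate(credential, store):
    """Return True only if the requested verification method succeeds."""
    verifier = VERIFIERS.get(credential.get("method"))
    if verifier is None:
        return False  # unknown method: refuse, as the unit refuses failed users
    return verifier(credential, store)
```

On success the unit would go on to fetch the user's identity information; on failure all subsequent operations are refused.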
Specifically, fig. 2 shows a block diagram of a user personality analysis unit 101 in the system according to the embodiment of the present application.
In a preferred embodiment of the present application, the user personality analysis unit 101 specifically includes:
the authentication module 1011 is configured to perform user authentication and determine whether the authentication is successful.
The information acquisition module 1012 is configured to acquire identity information of the user after the authentication is successful.
And the history matching module 1013 is configured to match and obtain history record data of the user according to the identity information.
And the character analysis module 1014 is used for carrying out character analysis on the history data and generating user character information of the user.
Specifically, fig. 3 shows a block diagram of the personality analysis module 1014 in the system according to an embodiment of the present application.
In a preferred embodiment of the present application, the personality analysis module 1014 specifically includes:
a correlation extraction submodule 10141 is used for extracting character correlation data in the history data.
And the character analysis submodule 10142 is used for carrying out character analysis on the character related data to obtain a plurality of character labels.
The tag synthesis submodule 10143 is used for synthesizing a plurality of character tags and generating user character information of a user.
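The three sub-modules above form a small pipeline: extract character-related records, map them to character labels, and synthesize the labels into a profile. A hedged sketch of that pipeline follows; the keyword-to-label table and all names are invented illustrations, since the patent does not specify how labels are derived.

```python
# Hypothetical sketch of the personality analysis pipeline: extract
# character-related records from history data, map them to character
# labels, and synthesize the labels into user character information.
# The TAG_RULES table is an invented illustration.
from collections import Counter

TAG_RULES = {            # history keyword -> character label (assumed)
    "puzzle": "analytical",
    "racing": "thrill-seeking",
    "chat": "sociable",
}

def extract_relevant(history):
    """Correlation extraction: keep only records matching a known keyword."""
    return [record for record in history if record in TAG_RULES]

def analyze_labels(records):
    """Character analysis: map each relevant record to its label."""
    return [TAG_RULES[record] for record in records]

def synthesize(labels):
    """Label synthesis: rank labels by frequency to form the profile."""
    counts = Counter(labels)
    return [label for label, _ in counts.most_common()]

def character_info(history):
    return synthesize(analyze_labels(extract_relevant(history)))
```

Ranking by frequency is one plausible way to "synthesize a plurality of character labels"; a real system might weight labels by recency or confidence instead.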
Further, the multi-scene VR interactive system further includes:
the interaction scene construction unit 102 is configured to obtain VR interaction requirements of a user, perform scene analysis by integrating the VR interaction requirements and the user character information, and construct a VR interaction scene.
In the embodiment of the present application, after identity verification and character analysis are complete, the interactive scene construction unit 102 generates and displays an interactive demand input window in which the user can enter information about the desired interaction. The unit receives the VR interaction demand entered by the user, analyzes it, discards redundant information, and extracts the key demand information. It then matches a plurality of relevant demand scenes that satisfy the key demand information, performs character correlation analysis on those scenes based on the user character information to generate a correlation analysis result, and finally, according to that result, selects from the candidates the demand scene best adapted to the user's character and marks it as the VR interaction scene.
Specifically, fig. 4 shows a block diagram of the structure of the interactive scene construction unit 102 in the system according to the embodiment of the present application.
In a preferred embodiment of the present application, the interactive scene construction unit 102 specifically includes:
the window display module 1021 is configured to generate and display an interactive requirement input window.
The request receiving module 1022 is configured to receive, through the interaction request input window, a VR interaction request input by a user.
The information extraction module 1023 is configured to extract key requirement information in the VR interaction requirement.
The scene construction module 1024 is configured to integrate the key requirement information and the user personality information to perform scene analysis, and construct a VR interactive scene.
Specifically, fig. 5 shows a block diagram of a scene building module 1024 in the system according to an embodiment of the present application.
In a preferred embodiment of the present application, the scene building module 1024 specifically includes:
the scene matching submodule 10241 is used for matching a plurality of related demand scenes according to the key demand information.
And the correlation analysis submodule 10242 is used for carrying out character correlation analysis on a plurality of related demand scenes according to the character information of the user and generating correlation analysis results.
And the scene selection submodule 10243 is used for selecting the VR interaction scene from the plurality of related demand scenes according to the association analysis result.
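The match-then-rank flow of these three sub-modules can be sketched as follows. The scene catalogue, its keyword and label sets, and the overlap-count scoring are invented illustrations under stated assumptions, not the patent's actual matching or correlation method.

```python
# Hypothetical sketch of the scene construction module: match candidate
# scenes against key demand keywords, score each candidate's character
# correlation with the user's labels, and select the best candidate.
# The SCENES catalogue and the scoring rule are invented illustrations.

SCENES = {
    "space_station": {"keywords": {"space", "exploration"}, "labels": {"analytical"}},
    "race_track":    {"keywords": {"speed", "driving"},     "labels": {"thrill-seeking"}},
    "tea_house":     {"keywords": {"relax", "chat"},        "labels": {"sociable"}},
}

def match_scenes(key_demands):
    """Scene matching: keep scenes sharing at least one demand keyword."""
    return [name for name, scene in SCENES.items()
            if scene["keywords"] & set(key_demands)]

def correlation(scene_name, character_labels):
    """Correlation analysis: overlap between scene labels and user labels."""
    return len(SCENES[scene_name]["labels"] & set(character_labels))

def select_scene(key_demands, character_labels):
    """Scene selection: pick the matched scene most correlated with the user."""
    candidates = match_scenes(key_demands)
    if not candidates:
        return None
    return max(candidates, key=lambda name: correlation(name, character_labels))
```

A production system would presumably use richer similarity measures than set overlap, but the match/score/select structure mirrors the three sub-modules above.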
Further, the multi-scene VR interactive system further includes:
and the target role matching unit 103 is configured to obtain a plurality of interaction roles in the VR interaction scene, and match and mark the target roles according to the user character information.
In the embodiment of the present application, the target role matching unit 103 obtains a plurality of interactive roles in the VR interaction scene and extracts, from the VR interaction demand, the role demand information describing the user's requirements for a role. By jointly analyzing the role demand information and the user character information it generates a role matching standard, i.e. a standard for a role that both satisfies the user's requirements and suits the user's character. It then evaluates each interactive role against this standard to generate matching evaluation information, ranks and compares the interactive roles according to that information, and matches and marks the target role among them.
Specifically, fig. 6 shows a block diagram of the structure of the target character matching unit 103 in the system according to the embodiment of the present application.
In a preferred embodiment of the present application, the target role matching unit 103 specifically includes:
the role obtaining module 1031 is configured to obtain a plurality of interactive roles in the VR interactive scene.
The requirement extraction module 1032 is configured to extract role requirement information from the VR interaction requirement.
And a role matching module 1033, configured to integrate the role requirement information and the user character information, and match and mark a target role from the multiple interactive roles.
Specifically, fig. 7 shows a block diagram of a role matching module 1033 in the system according to an embodiment of the present application.
In a preferred embodiment of the present application, the role matching module 1033 specifically includes:
and a standard generation submodule 10331, configured to synthesize the role requirement information and the user character information and generate a role matching standard.
And the matching evaluation submodule 10332 is used for performing matching evaluation on the interactive roles according to the role matching standard to generate matching evaluation information.
And the role matching submodule 10333 is used for matching and marking target roles from the interactive roles according to the matching evaluation information.
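The standard-generation, evaluation, and ranking steps above might look like the following sketch. The trait weights (demands weighted above character labels) and the role attribute sets are invented assumptions for illustration only.

```python
# Hypothetical sketch of the role matching module: build a matching
# standard from role demand information and user character information,
# score every interactive role against it, and mark the best-scoring
# role as the target. Weights and role traits are invented illustrations.

def matching_standard(role_demands, character_labels):
    """Standard generation: weight demanded traits above character traits."""
    standard = {trait: 2.0 for trait in role_demands}   # demands weigh more
    for trait in character_labels:
        standard.setdefault(trait, 1.0)
    return standard

def evaluate(role_traits, standard):
    """Matching evaluation: sum the weights of traits the role possesses."""
    return sum(weight for trait, weight in standard.items() if trait in role_traits)

def target_role(roles, role_demands, character_labels):
    """Role matching: rank roles by score and return the best match."""
    standard = matching_standard(role_demands, character_labels)
    scores = {name: evaluate(traits, standard) for name, traits in roles.items()}
    return max(scores, key=scores.get)
```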
Further, the multi-scene VR interactive system further includes:
and the parameter optimization and adjustment unit 104 is configured to obtain the standard interactive perception parameters of the target character, perform demand adjustment according to the VR interactive demand, and generate optimized interactive perception parameters.
In the embodiment of the present application, the parameter optimization and adjustment unit 104 obtains the role-related information of the target role, performs parameter identification on that role-related information, and, according to the identification result, obtains the standard interaction perception parameters from it. It then analyzes the user's perception requirements from the VR interaction demand to obtain the user's parameter requirement interval, integrates the standard interaction perception parameters with the parameter requirement interval to perform adjustment planning and generate adjustment planning data, and finally adjusts the standard interaction perception parameters according to the planning data to generate the optimized interaction perception parameters.
It will be appreciated that the perception parameters are the intensity parameters of the sensory channels, including hearing, vision, touch, taste, and smell; examples are illumination intensity and volume.
Specifically, fig. 8 shows a block diagram of the parameter optimization adjusting unit 104 in the system according to the embodiment of the present application.
In a preferred embodiment of the present application, the parameter optimization adjustment unit 104 specifically includes:
the related acquiring module 1041 is configured to acquire role related information of the target role.
The standard obtaining module 1042 is configured to obtain standard interactive perception parameters from the character related information.
And a parameter adjustment module 1043, configured to perform a requirement adjustment on the standard interactive sensing parameter according to the VR interactive requirement, so as to generate an optimized interactive sensing parameter.
Specifically, fig. 9 shows a block diagram of a parameter adjustment module 1043 in the system according to an embodiment of the present application.
In a preferred embodiment of the present application, the parameter adjustment module 1043 specifically includes:
and the interval obtaining submodule 10431 is configured to analyze the VR interaction requirement and obtain a parameter requirement interval.
And the adjustment planning submodule 10432 is used for integrating the standard interactive perception parameters and the parameter demand interval to carry out adjustment planning and generate adjustment planning data.
And the parameter adjustment submodule 10433 is configured to perform demand adjustment on the standard interactive sensing parameters according to the adjustment planning data, so as to generate optimized interactive sensing parameters.
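One simple reading of the interval/planning/adjustment sub-modules above is to clamp each standard parameter into its requirement interval, with the planning data recording the delta to apply. This is a hedged sketch under that assumption; the parameter names and the clamping rule are illustrative, not the patent's actual planning method.

```python
# Hypothetical sketch of the parameter adjustment module: given each
# standard perception parameter and a per-parameter requirement interval
# derived from the VR interaction demand, plan the delta that brings the
# parameter into its interval, then apply it. Names are illustrative.

def plan_adjustment(standard, intervals):
    """Adjustment planning: per-parameter delta needed to enter the interval."""
    plan = {}
    for name, value in standard.items():
        low, high = intervals.get(name, (value, value))  # no interval: keep as-is
        plan[name] = min(max(value, low), high) - value  # clamp, record the delta
    return plan

def adjust(standard, intervals):
    """Demand adjustment: standard value plus its planned delta."""
    plan = plan_adjustment(standard, intervals)
    return {name: value + plan[name] for name, value in standard.items()}
```

For example, a standard illumination intensity above the user's comfortable interval would be planned downward to the interval's upper bound, while a volume already inside its interval is left unchanged.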
Further, the multi-scene VR interactive system further includes:
and the VR scene interaction unit 105 is configured to perform VR interaction in the VR interaction scene with the user as the first view angle of the target character based on the optimized interaction perception parameter.
In the embodiment of the present application, the VR scene interaction unit 105 takes the target role as the user's interactive experience character in the VR interaction scene and creates the user's first-person view in that scene. Based on the target role's experience scenario in the scene, it acquires the interaction perception data in real time, adjusts that data according to the optimized interaction perception parameters, and transmits the VR interaction to the user.
Specifically, fig. 10 shows a block diagram of the VR scene interaction unit 105 in the system according to the embodiment of the present application.
In a preferred embodiment of the present application, the VR scene interaction unit 105 specifically includes:
The view creation module 1051 is configured to create, for the user, a first view angle of the VR interactive scene from the target character.
The perception acquisition module 1052 is configured to acquire interactive perception data in the VR interactive scene in real time.
The interactive transmission module 1053 is configured to transmit the interactive perception data to the user in a VR interactive manner based on the optimized interactive perception parameters.
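The view creation, perception acquisition, and interactive transmission modules can be read as a three-stage pipeline: bind the user to the target character's viewpoint, sample perception data from the scene, and deliver it after applying the optimized parameters. A minimal sketch, in which every class, field, and channel name is a hypothetical stand-in (the application does not specify data formats):

```python
# Hypothetical three-stage sketch of the VR scene interaction unit
# (view creation -> perception acquisition -> adjusted transmission).
from dataclasses import dataclass

@dataclass
class FirstPersonView:
    user_id: str
    character_id: str
    scene_id: str

def create_first_person_view(user_id, target_character, scene_id):
    """Bind the user to the target character's first-person viewpoint."""
    return FirstPersonView(user_id, target_character, scene_id)

def acquire_perception_data(scene_state):
    """Collect raw interactive perception samples from the scene (stubbed here)."""
    return scene_state["samples"]

def transmit(view, samples, optimized_params):
    """Scale each perception sample by its optimized per-channel parameter before delivery."""
    return [{"user": view.user_id,
             "channel": s["channel"],
             "value": s["value"] * optimized_params.get(s["channel"], 1.0)}
            for s in samples]

# Illustrative run: one haptic and one audio sample, haptic intensity halved
# by the optimized parameters before it reaches the user.
view = create_first_person_view("user-1", "guide-npc", "museum-scene")
scene_state = {"samples": [{"channel": "haptic", "value": 0.8},
                           {"channel": "audio", "value": 0.5}]}
out = transmit(view, acquire_perception_data(scene_state), {"haptic": 0.5})
print(out)
```

In a real system the acquisition stage would poll device and scene state each frame; the stub keeps the sketch self-contained.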
It should be understood that, although the steps in the flowcharts of the embodiments of the present application are shown in the order indicated by the arrows, these steps are not necessarily performed in that order. Unless explicitly stated herein, the order of execution of these steps is not strictly limited, and they may be performed in other orders. Moreover, at least some of the steps in the various embodiments may include multiple sub-steps or stages; these are not necessarily completed at the same moment but may be performed at different times, and they need not be executed in sequence: they may be performed in turn, or in alternation with at least a portion of the sub-steps or stages of other steps.
Those skilled in the art will appreciate that all or part of the processes in the methods of the above embodiments may be implemented by a computer program instructing relevant hardware; the program may be stored in a non-volatile computer-readable storage medium and, when executed, may include the processes of the method embodiments described above. Any reference to memory, storage, a database, or another medium used in the embodiments provided herein may include non-volatile and/or volatile memory. Non-volatile memory can include Read-Only Memory (ROM), Programmable ROM (PROM), Electrically Programmable ROM (EPROM), Electrically Erasable Programmable ROM (EEPROM), or flash memory. Volatile memory can include Random Access Memory (RAM) or external cache memory. By way of illustration and not limitation, RAM is available in many forms, such as Static RAM (SRAM), Dynamic RAM (DRAM), Synchronous DRAM (SDRAM), Double Data Rate SDRAM (DDR SDRAM), Enhanced SDRAM (ESDRAM), Synchlink DRAM (SLDRAM), Rambus Direct RAM (RDRAM), and Direct Rambus Dynamic RAM (DRDRAM).
The technical features of the above embodiments may be combined arbitrarily. For brevity, not all possible combinations of these technical features are described; nevertheless, as long as a combination of technical features contains no contradiction, it should be considered within the scope of this description.
The foregoing examples illustrate only a few embodiments of the application and are described in some detail, but they are not thereby to be construed as limiting the scope of the application. It should be noted that those skilled in the art may make several variations and modifications without departing from the spirit of the application, and all such variations and modifications fall within the scope of protection of the application. Accordingly, the scope of protection of the present application shall be determined by the appended claims.
The foregoing description of the preferred embodiments of the application is not intended to be limiting; rather, it is intended to cover all modifications, equivalents, and alternatives falling within the spirit and principles of the application.

Claims (10)

1. A multi-scene VR interactive system, characterized by comprising a user character analysis unit, an interactive scene construction unit, a target character matching unit, a parameter optimization and adjustment unit, and a VR scene interaction unit, wherein:
the user character analysis unit is used for acquiring historical record data of a user, carrying out character analysis on the historical record data and generating user character information of the user;
the interactive scene construction unit is used for acquiring VR interactive requirements of users, and carrying out scene analysis by integrating the VR interactive requirements and the user character information to construct VR interactive scenes;
the target character matching unit is used for acquiring a plurality of interactive characters in the VR interactive scene, and matching and marking the target character according to the user character information;
the parameter optimization and adjustment unit is used for acquiring the standard interaction perception parameters of the target role, and carrying out demand adjustment according to the VR interaction demand to generate optimized interaction perception parameters;
and the VR scene interaction unit is used for carrying out VR interaction in the VR interaction scene, with the user taking the first view angle of the target character, based on the optimized interaction perception parameters.
2. The multi-scene VR interactive system of claim 1, wherein said user character analysis unit specifically comprises:
the identity verification module is used for carrying out user identity verification and judging whether the verification is successful or not;
the information acquisition module is used for acquiring the identity information of the user after the verification is successful;
the history matching module is used for matching and acquiring history record data of the user according to the identity information;
and the character analysis module is used for carrying out character analysis on the historical record data and generating user character information of the user.
3. The multi-scene VR interactive system of claim 2, wherein said character analysis module specifically comprises:
the correlation extraction sub-module is used for extracting character correlation data in the history record data;
the character analysis sub-module is used for carrying out character analysis on the character related data to obtain a plurality of character labels;
and the label synthesis sub-module is used for synthesizing a plurality of character labels and generating user character information of the user.
4. The multi-scene VR interactive system of claim 1, wherein said interactive scene construction unit specifically comprises:
the window display module is used for generating and displaying an interactive demand input window;
the demand receiving module is used for receiving VR interaction demands input by a user through the interaction demand input window;
the information extraction module is used for extracting key requirement information in the VR interaction requirement;
and the scene construction module is used for carrying out scene analysis by integrating the key demand information and the user character information to construct a VR interactive scene.
5. The multi-scene VR interactive system of claim 4, wherein said scene building module specifically comprises:
the scene matching sub-module is used for matching a plurality of related demand scenes according to the key demand information;
the correlation analysis sub-module is used for carrying out character correlation analysis on a plurality of related demand scenes according to the character information of the user and generating correlation analysis results;
and the scene selection sub-module is used for selecting a VR interactive scene from a plurality of related demand scenes according to the association analysis result.
6. The multi-scene VR interactive system of claim 1, wherein said target character matching unit specifically comprises:
the character acquisition module is used for acquiring a plurality of interactive characters in the VR interactive scene;
the requirement extraction module is used for extracting character requirement information from the VR interaction requirement;
and the character matching module is used for integrating the character requirement information and the user character information, and matching and marking the target character from the plurality of interactive characters.
7. The multi-scene VR interactive system of claim 6, wherein said character matching module specifically comprises:
the standard generation sub-module is used for synthesizing the character requirement information and the user character information to generate a character matching standard;
the matching evaluation sub-module is used for carrying out matching evaluation on the plurality of interactive characters according to the character matching standard to generate matching evaluation information;
and the character matching sub-module is used for matching and marking the target character from the interactive characters according to the matching evaluation information.
8. The multi-scene VR interactive system of claim 1, wherein said parameter optimization adjustment unit specifically comprises:
the related acquisition module is used for acquiring role related information of the target role;
the standard acquisition module is used for acquiring standard interactive perception parameters from the role related information;
and the parameter adjusting module is used for carrying out demand adjustment on the standard interactive perception parameters according to the VR interactive demand to generate optimized interactive perception parameters.
9. The multi-scene VR interactive system of claim 8, wherein said parameter adjustment module specifically comprises:
the interval acquisition sub-module is used for analyzing the VR interaction requirement and acquiring a parameter requirement interval;
the adjustment planning sub-module is used for integrating the standard interactive perception parameters and the parameter demand interval to carry out adjustment planning and generating adjustment planning data;
and the parameter adjustment sub-module is used for carrying out demand adjustment on the standard interactive perception parameters according to the adjustment planning data to generate optimized interactive perception parameters.
10. The multi-scene VR interactive system of claim 1, wherein the VR scene interactive unit specifically comprises:
the view creation module is used for creating, for the user, a first view angle of the VR interactive scene from the target character;
the perception acquisition module is used for acquiring interactive perception data in the VR interactive scene in real time;
and the interactive transmission module is used for carrying out VR interactive transmission on the interactive perception data to the user based on the optimized interactive perception parameters.
CN202211633657.2A 2022-12-19 2022-12-19 Multi-scene VR interactive system Withdrawn CN116597118A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202211633657.2A CN116597118A (en) 2022-12-19 2022-12-19 Multi-scene VR interactive system


Publications (1)

Publication Number Publication Date
CN116597118A true CN116597118A (en) 2023-08-15

Family

ID=87610438

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202211633657.2A Withdrawn CN116597118A (en) 2022-12-19 2022-12-19 Multi-scene VR interactive system

Country Status (1)

Country Link
CN (1) CN116597118A (en)


Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN117590951A (en) * 2024-01-18 2024-02-23 江西科技学院 Multi-scene VR interaction method, system and storage medium
CN117590951B (en) * 2024-01-18 2024-04-05 江西科技学院 Multi-scene VR interaction method, system and storage medium


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
WW01 Invention patent application withdrawn after publication

Application publication date: 20230815