CN116850575A - Entertainment system capable of realizing virtual-real interaction - Google Patents


Info

Publication number
CN116850575A
CN116850575A (application CN202311045088.4A)
Authority
CN
China
Prior art keywords
virtual
real
interaction
park
live
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202311045088.4A
Other languages
Chinese (zh)
Inventor
陈涛
钟芳
曾繁景
刘喜旺
张潇思
陈学锋
陈日光
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Guangzhou Jinma Smart Technology Co ltd
Original Assignee
Guangzhou Jinma Smart Technology Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Guangzhou Jinma Smart Technology Co ltd filed Critical Guangzhou Jinma Smart Technology Co ltd
Publication of CN116850575A

Classifications

    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00 Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/20 Input arrangements for video game devices
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00 Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/30 Interconnection arrangements between game servers and game devices; Interconnection arrangements between game devices; Interconnection arrangements between game servers
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00 Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/80 Special adaptations for executing a specific game genre or game mode
    • A63F13/803 Driving vehicles or craft, e.g. cars, airplanes, ships, robots or tanks
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F2300/00 Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
    • A63F2300/80 Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game specially adapted for executing a specific type of game
    • A63F2300/8017 Driving on land or water; Flying
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F2300/00 Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
    • A63F2300/80 Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game specially adapted for executing a specific type of game
    • A63F2300/8082 Virtual reality

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Human Computer Interaction (AREA)
  • Processing Or Creating Images (AREA)

Abstract

The invention discloses an entertainment system capable of realizing virtual-real interaction, which comprises a live-action park, a meta-play system server and a terminal. The live-action park comprises interaction equipment for players to entertain, play or experience, installed on or running on an amusement-ride field; live-action data acquisition devices; player information acquisition devices; interaction equipment data acquisition devices; and communication devices for communication among the interaction equipment and between the live-action park and the meta-play system server. The meta-play system server constructs a virtual park scene, associates players' virtual identities, and synchronously maps the live-action interaction equipment onto corresponding virtual interaction equipment. The terminal displays the virtual park scene, and the virtual identities and virtual interaction equipment within it, from the meta-play system server. The live-action park is in communication connection with the meta-play system server, and the meta-play system server with the terminal.

Description

Entertainment system capable of realizing virtual-real interaction
[ field of technology ]
The invention relates to an entertainment system capable of realizing virtual-real interaction.
[ background Art ]
Besides being more engaging, networked games allow many players to compete in the same arena. Players compete under a given set of rules to generate a ranking, and are then ranked accordingly. Competitive networked games heighten the sense of contest, so players participate with greater investment; this strong competitiveness has made them the main direction of networked-game development.
Existing networked games all take place in a virtual environment provided by a game platform; the game scenes, characters and so on exist only in the virtual world, so the games lack realism and players cannot experience the on-site atmosphere of a game.
[ invention ]
The invention aims to overcome the defects of the prior art and provides an entertainment system capable of realizing virtual-real interaction, in which virtual-world players and real-world players can play or experience rides together, interactive amusement items can be realized, and virtual entertainment playing methods are enriched with real, participatory interaction.
This object of the present invention is achieved as follows.
An entertainment system capable of realizing virtual-real interaction, characterized by comprising a live-action park, a meta-play system server and a terminal;
the live-action park comprises interaction equipment, used for players to entertain, play or experience, which is installed on or runs on an amusement-ride field; a live-action data acquisition device for acquiring live-action park data; a player information acquisition device for acquiring player information; an interaction equipment data acquisition device for acquiring interaction equipment data; and a communication device for communication among the interaction equipment and between the live-action park and the meta-play system server; the information acquired by the live-action data acquisition device, the player information acquisition device and the interaction equipment data acquisition device is transmitted to the meta-play system server through the communication device;
the meta-play system server comprises a virtual park constructed from the live-action park data and an interaction system; the interaction system comprises an amusement service module, which synchronously displays in the virtual park scene the running state of the virtual interaction equipment onto which each player's live-action interaction equipment is mapped, so as to realize synchronous interaction between real and virtual information, and a player management module, which associates each player's real identity with a virtual identity and presents the real and virtual information;
the terminal is used for displaying the virtual park scene, and the virtual identities and virtual interaction equipment within it, from the meta-play system server;
the live-action park is in communication connection with the meta-play system server, and the meta-play system server with the terminal.
The entertainment system capable of realizing virtual-real interaction as described above, wherein the live-action data acquisition device and the interaction equipment data acquisition device may be integrated.
The entertainment system capable of realizing virtual-real interaction as described above, wherein the terminal may be one or more of a mobile terminal, a PC, a display terminal or a physical display device.
The entertainment system capable of realizing virtual-real interaction as described above, wherein the terminal is installed on the interaction equipment of the live-action park.
The entertainment system capable of realizing virtual-real interaction as described above, wherein the interaction system further comprises an amusement item management module.
The entertainment system capable of realizing virtual-real interaction as described above, wherein the interaction system further comprises a task management module.
The entertainment system capable of realizing virtual-real interaction as described above, wherein the interaction system further comprises a digital asset management module.
The entertainment system capable of realizing virtual-real interaction as described above, wherein the virtual park is a digital scene obtained by taking part or all of the live-action park as a prototype and superimposing a theme activity.
The entertainment system capable of realizing virtual-real interaction as described above, wherein the virtual park comprises a virtual scene, virtual equipment and a virtual character library.
The entertainment system capable of realizing virtual-real interaction as described above, wherein the virtual character library contains a plurality of virtual characters, constructed from real player information or created by the system; a virtual character may also be a non-player character set according to the amusement experience of the virtual park.
The entertainment system capable of realizing virtual-real interaction as described above, wherein the virtual park further comprises a basic material library, which may contain one or more of virtual theme elements, virtual props or virtual interaction equipment.
The entertainment system capable of realizing virtual-real interaction as described above, wherein an acquisition device may be one or more of an image acquisition device, a position acquisition device, a motion state acquisition device, a sound acquisition device or a physiological information acquisition device.
The entertainment system capable of realizing virtual-real interaction as described above, wherein the terminal is a display mechanism capable of displaying a mixed scene of virtual elements and reality and interacting with players in the amusement park.
The entertainment system capable of realizing virtual-real interaction as described above, wherein there are a plurality of live-action parks and a plurality of virtual parks, the live-action parks and virtual parks being connected to one another through the interaction system.
The entertainment system capable of realizing virtual-real interaction as described above, wherein the plurality of live-action parks form a real world and the plurality of virtual parks form a virtual world, the real world and the virtual world being connected through the interaction system.
An interaction method of an entertainment system capable of realizing virtual-real interaction, characterized by comprising the following steps:
S1, the live-action data acquisition device acquires live-action park data and transmits it to the meta-play system server through the communication device;
S2, the meta-play system server constructs a virtual park from the acquired live-action park data;
S3, the player information acquisition device acquires player information and transmits it to the meta-play system server through the communication device;
S4, the player management module associates each player's real identity with a virtual identity according to the acquired player information and enters the corresponding amusement service module;
S5, the interaction equipment data acquisition device acquires the running-state data of the interaction equipment and transmits it to the meta-play system server through the communication device;
S6, the amusement service module synchronously displays in the virtual park scene, according to the acquired data, the running state of the virtual interaction equipment onto which the player's live-action interaction equipment is mapped, while the player interacts through the terminal with the virtual park scene, virtual identities and virtual interaction equipment, so as to realize synchronous interaction of real and virtual information.
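The S1-S6 loop can be sketched as a minimal data model, purely illustrative and not from the patent: the class and field names (`DeviceState`, `VirtualPark`, `run_cycle`) are assumptions, and the sketch only shows the core idea of mirroring each real device's reported state onto its virtual counterpart.

```python
from dataclasses import dataclass, field

@dataclass
class DeviceState:
    """Running state reported by an interaction device (step S5)."""
    device_id: str
    speed: float
    heading: float

@dataclass
class VirtualPark:
    """Virtual park built from live-action park data (step S2)."""
    scene: dict
    # real device id -> mirrored virtual device state (step S6)
    virtual_devices: dict = field(default_factory=dict)

    def sync(self, state: DeviceState) -> None:
        """Mirror a real device's state onto its virtual counterpart."""
        self.virtual_devices[state.device_id] = state

def run_cycle(scene_data: dict, states: list) -> VirtualPark:
    """One pass of the S1-S6 loop: build the park, then sync each device."""
    park = VirtualPark(scene=scene_data)   # S1-S2: construct from scene data
    for state in states:                   # S5: collected device states
        park.sync(state)                   # S6: synchronous display
    return park

park = run_cycle({"site": "bumper-car field"},
                 [DeviceState("car-7", speed=2.5, heading=90.0)])
print(park.virtual_devices["car-7"].speed)  # 2.5
```

A real implementation would run this continuously over a network connection; the sketch collapses one update cycle into a single function call.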
The beneficial effects of the invention are as follows.
The playing modes of the invention include:
1. players can enter the virtual park;
2. players can enter the virtual park to participate in games;
3. players can play in the live-action park;
4. players can interact outside the scene of the interaction equipment in the live-action park;
5. different players can play simultaneously on the virtual equipment of the virtual park and the interaction equipment of the live-action park;
6. players can interact between the virtual park and the live-action park.
The playing methods are thus varied, and the system is more entertaining and engaging.
[ description of the drawings ]
FIG. 1 is a schematic diagram of the structure of the present invention;
FIG. 2 is a schematic diagram of the connection structure of the interaction system and the terminal of the present invention;
FIG. 3 is a block diagram of a virtual park of the present invention;
FIG. 4 is a schematic representation of a live-action park of the present invention;
FIG. 5 is a schematic illustration of a virtual park of the present invention;
FIG. 6 is a first schematic illustration of play of the bumper car amusement item of the present invention;
FIG. 7 is a schematic diagram of the amusement item management module of the present invention;
FIG. 8 is a schematic view of sparse point cloud reconstruction in the three-dimensional reconstruction algorithm of the present invention;
FIG. 9 is a schematic illustration of feature decomposition in the three-dimensional reconstruction algorithm of the present invention;
FIG. 10 is a rendering flow diagram of the three-dimensional reconstruction algorithm of the present invention;
FIG. 11 is a block diagram of a bumper car in the bumper car amusement item of the present invention;
FIG. 12 is an electrical control block diagram of a bumper car of the present invention;
FIG. 13 is a second schematic illustration of play of the bumper car amusement item of the present invention;
FIG. 14 is a third schematic illustration of play of the bumper car amusement item of the present invention;
FIG. 15 is a fourth schematic illustration of play of the bumper car amusement item of the present invention;
FIG. 16 is a flow chart of play of the bumper car amusement item of the present invention.
[ detailed description ]
The invention is further described below with reference to the accompanying drawings:
As shown in figs. 1-3, an entertainment system capable of realizing virtual-real interaction comprises a live-action park 1, a meta-play system server 2 and a terminal 3.
The live-action park comprises interaction equipment 11 for players to entertain, play or experience, installed on or running on an amusement-ride field; a live-action data acquisition device 12 for acquiring live-action park data; a player information acquisition device 13 for acquiring player information; an interaction equipment data acquisition device 14 for acquiring interaction equipment data; and a communication device 15 for communication among the interaction equipment and between the live-action park and the meta-play system server. The information acquired by the live-action data acquisition device, the player information acquisition device and the interaction equipment data acquisition device is transmitted to the meta-play system server through the communication device. Specifically, as shown in fig. 4, the live-action park comprises a plurality of amusement-ride fields; the live-action data acquisition devices acquire live-action data for each field, and amusement interaction equipment is arranged in each field. As shown in fig. 6, bumper car interaction equipment is arranged on the bumper car ride field, and the player information acquisition device and the interaction equipment data acquisition device are mounted on the bumper car interaction equipment.
The meta-play system server constructs a virtual park scene from the live-action park data, associates each player's real identity with a virtual identity according to the player information, enters the amusement item, and synchronously displays in the virtual park scene, according to the interaction equipment data, the running state of the virtual interaction equipment onto which the live-action interaction equipment is mapped, realizing synchronous interaction of real and virtual information: the real-time running state (such as speed and steering) of the live-action interaction equipment in the live-action park and of the virtual interaction equipment in the virtual park are the same. Specifically, a meta-play system is built in the meta-play system server, comprising a virtual park 21 and an interaction system 22. As shown in fig. 5, the virtual park 21 is constructed by adding virtual elements on the basis of the live-action park environment. The interaction system comprises an amusement service module 23, which realizes player interaction with each amusement item, and a player management module 24, which associates each player's real identity with a virtual identity and presents the real and virtual information.
The terminal is used for displaying the virtual park scene, and the virtual character identities and virtual interaction equipment within it, from the meta-play system server.
The live-action park is connected with the meta-play system server, and the meta-play system server with the terminal, through network communication.
The live-action data acquisition device and the interaction equipment data acquisition device may be integrated. For example, when the interaction equipment is unpowered, its real-time motion state can be mapped to the corresponding virtual interaction equipment by acquiring motion image data of the unpowered equipment, realizing synchronous interaction of real and virtual information.
The terminal of the present invention includes a mobile terminal 31, a PC, a display terminal 32 and a physical display device. The mobile terminal may be a mobile phone, tablet, remote controller, smart watch, smart glasses and the like; the PC may be a desktop computer, a self-assembled computer, a notebook, a tablet computer, an all-in-one computer, an ultrabook and the like. Display terminals include various displays, televisions, rear-projection screens and the like. The physical display device may be, for example, the rising water level in a water column, where a higher water level means a higher score; it may also be the rising of sand in a sand column, where a larger sand pile means a higher score.
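The water-column physical display maps a score onto a physical height. A minimal sketch of that mapping, with an assumed column height and clamping behavior not specified in the patent:

```python
def water_level(score: int, max_score: int,
                column_height_cm: float = 100.0) -> float:
    """Map a player's score to a water-column height:
    the higher the score, the higher the water level."""
    score = max(0, min(score, max_score))  # clamp to the valid score range
    return column_height_cm * score / max_score

print(water_level(50, 100))   # 50.0 (half-full column)
print(water_level(100, 100))  # 100.0 (full column)
```

The sand-column display would use the same mapping with sand height in place of water level.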
As shown in fig. 6, the terminal of the present invention may be installed on the interaction equipment of the live-action park, or a display mechanism 33 may be provided in the live-action park that displays a mixed scene of virtual elements and reality and interacts with players in the amusement park.
The interaction system 22 of the invention comprises an amusement service module 23 and a player management module 24. The amusement service module handles the docking and processing of amusement items that combine the virtual and the real; one amusement service module can host one game. The player management module associates each player's real identity with a virtual identity and associates and presents the real and virtual information of the same player.
The amusement service module consists of a series of amusement services, each corresponding to one amusement item and implemented on the skynet game server engine to provide back-end amusement services for that item.
A player logs into the meta-play system with its client on a mobile phone or other terminal to take part in amusement items. The meta-play system client is implemented with a game engine; its game scene is a virtual park constructed from the physical park by 2D/3D modeling.
After a player logs in, the client establishes a persistent real-time connection with the amusement service. The player's input is sent over the network to the amusement service, which forwards it to the amusement item's equipment, so that the player can control that equipment. Meanwhile, the item's interaction equipment collects field data through sensors, cameras and other devices and sends it to the amusement service; the service converts the data of the actual scene and sends it to the meta-play system client, which on receipt updates the state, position and other information of elements such as the interaction equipment of the amusement item in the virtual park.
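The two directions of that exchange — player input forwarded to the ride, field data converted for the client — can be sketched as below. This is an illustrative sketch only: the message shapes, field names (`device_id`, `pos`, `state`) and the in-process queue standing in for the network connection are all assumptions, not from the patent.

```python
import json
import queue

def relay_player_input(client_msg: str, device_queue: queue.Queue) -> None:
    """Forward a player's control input from the client to the ride device."""
    cmd = json.loads(client_msg)
    device_queue.put(cmd)  # the service would send this over the network

def convert_field_data(raw: dict) -> dict:
    """Convert raw sensor/camera data into the update the client renders."""
    return {
        "device_id": raw["id"],
        "position": raw["pos"],          # e.g. [x, y] on the ride field
        "state": raw.get("state", "running"),
    }

dq = queue.Queue()
relay_player_input('{"device_id": "car-7", "steer": "left"}', dq)
update = convert_field_data({"id": "car-7", "pos": [3, 4]})
print(update["position"])  # [3, 4]
```

In the described system both directions run over the persistent connection between client and amusement service rather than an in-memory queue.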
The player management module maintains each player's registration information, which includes real identity information such as name, WeChat ID and mobile phone number, as well as virtual information set by the player in the virtual park such as nickname, profession, role and avatar. The module binds the real and virtual identities through information that uniquely identifies an individual in the real world, such as the mobile phone number or WeChat ID. When playing in the virtual park, players are presented with their virtual identities; when a player needs to transact with the real world, for example to purchase props, the player's real identity is used.
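The identity-binding rule can be sketched as follows. The names and sample data are hypothetical placeholders, and the single dictionary registry is a stand-in for whatever store the module actually uses; only the rule itself (bind on the phone number, show the virtual identity in play and the real one in transactions) comes from the description.

```python
from dataclasses import dataclass

@dataclass
class Player:
    phone: str      # real-world unique key used for binding
    name: str       # real identity, used for transactions
    nickname: str   # virtual identity, shown inside the park
    role: str

REGISTRY = {}  # phone number -> Player

def register(player: Player) -> None:
    """Bind the real and virtual identities under the phone number."""
    REGISTRY[player.phone] = player

def display_identity(phone: str, in_transaction: bool) -> str:
    """Show the virtual identity in the park, the real one when trading."""
    p = REGISTRY[phone]
    return p.name if in_transaction else p.nickname

# Hypothetical sample player, not from the patent.
register(Player("13800000000", "Zhang San", "SkyRider", "pilot"))
print(display_identity("13800000000", in_transaction=False))  # SkyRider
print(display_identity("13800000000", in_transaction=True))   # Zhang San
```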
The interaction system in the meta-play system further comprises an amusement item management module 25 for maintaining and upgrading the live-action amusement items; it manages the items' access information, software upgrades and content upgrades.
As shown in fig. 7, the amusement item management module maintains access credentials for live-action amusement items. Before an amusement item is connected to the meta-play system, an administrator enters the item's information in the module, which generates an access credential containing the item's organization information, permission information, key, the access address of the interaction system, and so on. The credential is manually imported into the item's access device. When that device connects to the meta-play system, it must carry the credential information, and the platform authenticates it against the credential.
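One common way to realize such a credential check is an HMAC-signed token; the patent does not specify the mechanism, so the scheme below (payload of organization plus permissions, signed with the key) is purely an assumed sketch, and the field values are hypothetical.

```python
import hashlib
import hmac

def sign_credential(org: str, permissions: str, secret_key: bytes) -> str:
    """Issue an access credential: a payload plus an HMAC over it.
    Assumes '|' does not appear in org or permissions."""
    payload = f"{org}|{permissions}"
    sig = hmac.new(secret_key, payload.encode(), hashlib.sha256).hexdigest()
    return f"{payload}|{sig}"

def authenticate(credential: str, secret_key: bytes) -> bool:
    """Verify the credential carried by a ride's access device."""
    org, permissions, sig = credential.rsplit("|", 2)
    expected = hmac.new(secret_key, f"{org}|{permissions}".encode(),
                        hashlib.sha256).hexdigest()
    return hmac.compare_digest(sig, expected)

key = b"park-secret"  # hypothetical key held by the platform
cred = sign_credential("GZ-Jinma", "bumper-car:rw", key)
print(authenticate(cred, key))      # True
print(authenticate(cred, b"wrong")) # False
```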
The flow for an amusement item to access the interaction system and update software and content is as follows: after the device starts, it reads the local access credential, actively establishes a connection with the interaction system, and reports the information required for authentication. The amusement item management module authenticates the reported information; if authentication fails, the connection is closed after the result is returned. If it succeeds, the module returns success together with the system's current software and content versions, and the connection is kept open. On receiving these versions, the access device compares them with its local software and content versions; if they differ, it requests the latest versions from the amusement item management module, downloads them, and on success updates the local software and content.
The interaction system in the meta-play system further comprises a task management module 26, with which the operating side creates amusement task information and from which clients obtain it; the module associates the amusement items of the live-action park so that players can play under various theme-activity modes.
The task management module provides: 1. a visual editing interface in which operators configure, for each task, the amusement items to be played, the connection relations between them, the requirements for completing them, and the rewards; 2. generation of a theme-activity route map from the relations between items, which the meta-play client obtains from the task management module together with the item information and presents to the player.
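A task produced by that editing interface might serialize to a structure like the one below and flatten into the route map the client presents. The schema, keys and sample theme are assumptions for illustration, not the patent's actual data format.

```python
# Hypothetical task record: rides, per-ride completion requirements
# and rewards, plus the connection relation between rides.
task = {
    "theme": "Dragon Boat Festival",
    "rides": [
        {"id": "bumper-car", "requirement": "score >= 100",
         "reward": "virtual badge"},
        {"id": "roller-coaster", "requirement": "complete one lap",
         "reward": "gift coupon"},
    ],
    # connection relation: finish one ride to unlock the next
    "route": ["bumper-car", "roller-coaster"],
}

def route_map(t: dict) -> list:
    """Flatten the task into the route map the client shows players."""
    return [f"{i + 1}. {ride_id}" for i, ride_id in enumerate(t["route"])]

print(route_map(task))  # ['1. bumper-car', '2. roller-coaster']
```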
The interaction system in the meta-play system of the present invention further comprises a digital asset management module 27 that stores the digital asset data of virtual character identities; it manages the digital content players create online and in the live-action park and supports player transactions within the platform. The module creates a property account and a digital content library for each player, storing the player's virtual property, digital content and transaction records respectively, and implements an online mall in which digital content is issued and players transact.
The virtual park is a digital scene that takes the live-action park as a prototype and superimposes a theme activity. The park scene, amusement items, service points, merchants, hotels and the like of the virtual park are obtained from the real park by 2D or 3D modeling, rendered with a superimposed theme atmosphere, theme elements or other elements, so that the scene a player experiences in the virtual park corresponds to the real scene while being more vivid and interesting than it.
The virtual park of the present invention, as shown in fig. 3, includes a virtual scene 211, virtual equipment 212 and a virtual character library 213. At least one virtual scene plus at least one piece of virtual equipment may constitute a playground.
The virtual character library of the present invention includes a plurality of virtual characters 2131, which may be player characters constructed from player information.
A virtual character of the present invention may also be a non-player character (NPC) set according to the amusement experience of the virtual park.
The task management module unifies the content of theme activities in the virtual park and the real park, including theme elements, props, IP, landscapes, sound effects, lighting effects, story lines, NPCs, prizes and the like.
The virtual park of the present invention further includes a basic material library 214 for constructing a virtual park, including virtual theme elements, virtual props, virtual equipment elements, virtual scene elements, virtual character elements and the like. For example, when the theme activity is a Dragon Boat Festival activity, the virtual theme elements include virtual rice dumplings, virtual rice dumpling leaves and virtual food materials, and the virtual props include the paddles of a rowing boat and the like.
The task management module covers theme activities, which may be festival activities, cultural promotion activities and the like; the IP may be the park's own IP or an IP in cooperation with other companies; and the story line may follow the IP while having its own plot progression, embodied simultaneously in the virtual and real scenes.
The embodiments include:
1. Landscape: the theme elements conveyed by the live scene's landscaping, lighting, music and so on are fused with the theme elements of scene rendering, picture and music in the virtual park, each echoing and supporting the other.
2. Story-line tasks: the amusement item tasks assigned to players in the live-action park are connected with the event or game tasks presented to them in the virtual park; they may share similar theme content and playing methods, or complement each other and advance layer by layer.
3. NPC guidance/interaction: in the live-action park, the elements that can express NPC functions are staff, digital humans, actors, and certain equipment or landscapes, which serve amusement-park service, task advancement, theme-atmosphere rendering, interactive fun and so on. NPCs in the virtual park take the same forms and functions and map onto the NPCs of the live-action park.
4. Gifts, prizes and the like: prizes players obtain by taking part in games, activities or competitions, or gifts given to players by the park, may have intercommunicating live-action and virtual forms, such as coexisting virtual and physical badges, coexisting virtual and physical props, and live-action park gift-redemption coupons issued in the virtual park.
The live-action data acquisition device may be one or more of an image acquisition device or a sound acquisition device; the player information acquisition device may be one or more of a physiological information acquisition device or a smart-device information acquisition device; the interaction equipment data acquisition device may be one or more of a position acquisition device or a motion state acquisition device. The image acquisition device includes cameras, video cameras and the like; the position acquisition device includes a locator; the motion state acquisition device may be a speed sensor, direction sensor, acceleration sensor and the like, used to acquire a player's state such as body movements, stretching of hands and feet, and forward, backward, left, right, up and down motion. The sound acquisition device may be a microphone, a recording device and the like; the physiological information acquisition device may be a sphygmomanometer or a device that measures height and weight.
The interactive equipment comprises recreation facilities, recreation machines, game machines and props. The amusement ride may be a bumper car, a roller coaster, or the like.
The player management module includes an identity recognition sub-module for recognizing the virtual identity information and real identity information of the player. The identity recognition sub-module recognizes the player's identity information through password authentication, face recognition, fingerprint recognition, iris recognition, or an electronic tag containing identity information, such as a bracelet.
The display terminal of the invention is a display device for displaying a real paradise or a virtual and real mixed scene or a virtual paradise. The display device may be a display screen, a cell phone, a tablet, a PC, AR glasses, etc.
There may be a plurality of live-action parks and a plurality of virtual parks; the live-action parks and/or the virtual parks are connected with one another through the interaction system.
The plurality of live-action parks of the present invention form a real world, and the plurality of virtual parks form a virtual world; the real world and the virtual world are coupled through the interaction system.
The interaction system 22 completes the connection of the live-action park to the virtual park as follows: equipment such as cameras and sensors arranged in the live-action park acquires the people flow, temperature, light and sound of the live-action park and uploads them to the interaction system through the access equipment of the live-action park. The interaction system cleans and filters the field data of the live-action park and stores them in a real-time live-action-park scene database. A meta-stream system client accesses the interaction system through the network to acquire the real-time status data of the live-action park, and then updates the corresponding scene effects in the virtual park. The amusement equipment of the live-action park reports acquired data such as equipment position and speed to the interaction system through the access equipment; the interaction system forwards the data in real time to the meta-stream clients playing the corresponding amusement items in the virtual park, and each meta-stream client updates the states, positions and the like of the equipment in the virtual-park scene according to the received data.
The working principle of the invention is as follows: first, a virtual park is constructed by fully or partially referring to the live-action park, and the virtual park model may be 2D or 3D; then a correspondence between virtual coordinates and real coordinates is established; then the player's current virtual coordinates and viewing angle are determined, or the player's real coordinates and viewing angle are determined through a positioner and converted; finally, the amusement service in the interaction system computes the view in the virtual park model in real time according to the player's current virtual coordinates and viewing angle, and the view is presented, or fused and presented, to the player through the terminal.
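The real-to-virtual coordinate correspondence mentioned above can be sketched as follows. This is a minimal illustration only, assuming the correspondence is a similarity transform (scale, rotation about the vertical axis, offset); the actual calibration procedure is not specified in this disclosure, and all numeric values are invented.

```python
import numpy as np

def make_real_to_virtual(s, yaw, T):
    """Map real-world coordinates [x, y, z] to virtual-park coordinates.

    Assumed form: scale s, rotation by yaw about the vertical axis, offset T."""
    c, n = np.cos(yaw), np.sin(yaw)
    R = np.array([[c, -n, 0.0],
                  [n,  c, 0.0],
                  [0.0, 0.0, 1.0]])
    T = np.asarray(T, dtype=float)
    return lambda p: s * (R @ np.asarray(p, dtype=float)) + T

# A player located by the positioner at real coordinates (2, 3, 0) is placed
# at the corresponding point of the virtual park model:
real_to_virtual = make_real_to_virtual(s=1.0, yaw=0.0, T=[10.0, 20.0, 0.0])
print(real_to_virtual([2.0, 3.0, 0.0]))  # [12. 23.  0.]
```

The inverse transform (virtual to real) is obtained by undoing the offset, rotation and scale in reverse order.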
The three-dimensional reconstruction algorithm principle of the scheme is as follows:
1. Multi-view images: there are two situations for the captured images:
1.1, Independent object: when photographing an independent object, multiple viewing angles should surround the photographed object, preferably covering the full 360 degrees, and the view-angle centre lines of all images should intersect on the photographed object as far as possible;
1.2, Scene environment: when photographing a scene environment, surrounding is not needed, but both close-up and distant shots are required; close-up shots must be taken continuously, i.e. adjacent images must overlap sufficiently, and each distant shot should cover the content of several close-up shots.
2. Sparse point cloud reconstruction is shown in fig. 8.
2.1, extracting image features:
2.1.1, Searching image positions: potential feature points invariant to scaling, rotation and displacement are identified by a Gaussian derivative function; feature points typically lie where the image turns from dark to bright or from bright to dark, or at inflection points of edges.
2.1.2, Key feature point selection: the choice of key feature points depends on their degree of stability; the more stable a feature point, i.e. the more viewing angles in which it appears, the more critical it is;
2.1.3, Direction determination: each key feature point is assigned one or more directions based on the local gradient direction of the image.
2.1.4, Key feature point description: in the area around each key feature point, the local gradients of the image are measured; these gradients form the description of the key feature point, which remains largely unchanged under fairly large shape and illumination changes.
2.2, Feature matching: the descriptions of key feature points are matched between images; all matched feature points are projections of the same three-dimensional feature point on different two-dimensional images. The specific matching method depends on the description method of the feature points; for example, matching may be performed by bit comparison, i.e. the more identical bits two descriptors share, the greater the possibility that the feature points of the different images match.
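The bit-comparison matching described above can be sketched as follows: binary descriptors are compared by Hamming distance, so fewer differing bits means more identical bits and hence a likelier match. The descriptor values are illustrative, not taken from the actual system.

```python
def hamming(a: int, b: int) -> int:
    """Number of differing bits between two binary descriptors."""
    return bin(a ^ b).count("1")

def best_match(desc, candidates):
    """Index of the candidate descriptor sharing the most identical bits with desc."""
    return min(range(len(candidates)), key=lambda i: hamming(desc, candidates[i]))

# Descriptor 0b10110111 differs from candidate 0b10110110 by a single bit,
# so that candidate (index 1) is the best match.
print(best_match(0b10110111, [0b00001111, 0b10110110, 0b01001000]))  # 1
```

Real systems (e.g. ORB descriptors) apply the same principle over 256-bit descriptors with a distance-ratio test to reject ambiguous matches.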
2.3, Obtaining the rotation matrix R and the translation vector t by eigen-decomposition
As shown in fig. 9, suppose a pair of matched feature points p1 and p2 is obtained from the two images, and the three-dimensional space coordinates of the corresponding three-dimensional feature point P are:
P = [X, Y, Z]^T
Then:
p1 = K·P,  p2 = K·(R·P + t)
where K is the camera intrinsic matrix, and R and t are the rotation matrix and translation vector between the camera coordinate systems in which the two images were shot. Since the camera coordinate system is three-dimensional, take:
x1 = K^(-1)·p1,  x2 = K^(-1)·p2
where x1 and x2 are the homogeneous (normalized) coordinates of p1 and p2 on the two image planes. Substituting into the formulas above gives, up to scale:
x2 = R·x1 + t
Multiplying this on the left first by t^ (noting t^·t = 0) and then by x2^T (noting x2^T·(t^·x2) = 0) yields the epipolar constraint:
x2^T·t^·R·x1 = 0,  i.e.  p2^T·K^(-T)·t^·R·K^(-1)·p1 = 0
where the symbol t^ is defined as follows: for the vector t = [t1, t2, t3]^T,
t^ = [  0   -t3   t2 ]
     [  t3   0   -t1 ]
     [ -t2   t1   0  ]
is called the antisymmetric matrix of the vector t.
Taking:
E = t^·R,  F = K^(-T)·E·K^(-1)
the constraints become:
x2^T·E·x1 = 0,  p2^T·F·p1 = 0
Consider a pair of matched points whose homogeneous coordinates are x1 = [u1, v1, 1]^T and x2 = [u2, v2, 1]^T, and write the nine elements of E as e1 ~ e9; each matched pair then yields one linear equation in e1 ~ e9. Because u1, v1, u2 and v2 are known, e1 ~ e9 can be found using several pairs of matched points, and F can be obtained similarly.
R and t can be solved from E and F; after the camera position of one image is set as the origin, the position and rotation angle of the camera at the moment each image was shot are determined.
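The relations E = t^·R, F = K^(-T)·E·K^(-1) and the two epipolar constraints can be checked numerically, as in the following sketch. The intrinsics, rotation and translation are illustrative values, not parameters of the actual system.

```python
import numpy as np

def skew(t):
    """The antisymmetric matrix t^ of a 3-vector t, so that skew(t) @ v = t x v."""
    return np.array([[0.0, -t[2], t[1]],
                     [t[2], 0.0, -t[0]],
                     [-t[1], t[0], 0.0]])

# Hypothetical intrinsics and relative pose (illustrative values only).
K = np.array([[800.0, 0.0, 320.0],
              [0.0, 800.0, 240.0],
              [0.0, 0.0, 1.0]])
a = 0.1                                   # small rotation about the y axis
R = np.array([[np.cos(a), 0.0, np.sin(a)],
              [0.0, 1.0, 0.0],
              [-np.sin(a), 0.0, np.cos(a)]])
t = np.array([1.0, 0.2, 0.1])

# Project one 3D point into both views: p1 = K P, p2 = K (R P + t).
P = np.array([0.5, -0.3, 4.0])
p1, p2 = K @ P, K @ (R @ P + t)

# Essential and fundamental matrices: E = t^ R, F = K^-T E K^-1.
E = skew(t) @ R
F = np.linalg.inv(K).T @ E @ np.linalg.inv(K)

# Both epipolar constraints vanish: x2^T E x1 = 0 and p2^T F p1 = 0.
x1, x2 = np.linalg.inv(K) @ p1, np.linalg.inv(K) @ p2
print(abs(x2 @ E @ x1) < 1e-8, abs(p2 @ F @ p1) < 1e-6)  # True True
```

In practice E is estimated from matched point pairs (e.g. the eight-point algorithm) and decomposed by SVD to recover R and t.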
2.4, Triangulation
As shown in fig. 9, since p1 = K·P and p2 = K·(R·P + t), where K, R, t, p1 and p2 are all known or already computed, and since p1 and p2 are each two-dimensional, the two projection equations provide four equations, while P has three unknown coordinates; the three coordinates of P can therefore be solved from the linear equations. In practice, the rays O1P and O2P may fail to intersect because of error, in which case the optimal solution is obtained by the least-squares method, i.e. the three-dimensional coordinates of point P are estimated. All the computed P points form a sparse point cloud, which serves as a map. A point cloud is a set of three-dimensional points in three-dimensional space, whose coordinate attribute is (X, Y, Z).
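The linear least-squares triangulation described above can be sketched as follows: the four equations from the two projections are stacked and solved via SVD (the null vector gives the homogeneous coordinates of P). All numeric values are illustrative.

```python
import numpy as np

def triangulate(K, R, t, p1, p2):
    """Least-squares triangulation of a 3D point from two pixel observations.

    p1 = K P and p2 = K (R P + t) are homogeneous pixel coordinates."""
    M1 = K @ np.hstack([np.eye(3), np.zeros((3, 1))])   # first camera at origin
    M2 = K @ np.hstack([R, t.reshape(3, 1)])
    u1, v1 = p1[0] / p1[2], p1[1] / p1[2]
    u2, v2 = p2[0] / p2[2], p2[1] / p2[2]
    # Four linear equations in the homogeneous coordinates of P.
    A = np.vstack([u1 * M1[2] - M1[0], v1 * M1[2] - M1[1],
                   u2 * M2[2] - M2[0], v2 * M2[2] - M2[1]])
    X = np.linalg.svd(A)[2][-1]          # null vector of A (least squares)
    return X[:3] / X[3]

K = np.array([[800.0, 0, 320], [0, 800.0, 240], [0, 0, 1]])
R, t = np.eye(3), np.array([1.0, 0, 0])
P = np.array([0.5, -0.3, 4.0])
p1, p2 = K @ P, K @ (R @ P + t)
print(triangulate(K, R, t, p1, p2))   # recovers P ≈ [0.5, -0.3, 4.0]
```

With noisy observations the same SVD solution minimizes the algebraic error, matching the least-squares treatment described in the text.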
3. Dense point cloud reconstruction
As shown in fig. 9, for a three-dimensional feature point P with projections p1 and p2 in two images, the absolute value of the difference between the abscissas of p1 and p2 is called the parallax (disparity) of P between the two images. The two-dimensional map composed of the parallaxes corresponding to all pixels of one image is called a parallax map. The Z coordinate value of point P in the camera coordinate system of image I1 is the depth of the pixel point p1; the two-dimensional map composed of the depths of all pixels of one image is called a depth map.
As known from the preceding computation, the parallax of the projections of all three-dimensional feature points (i.e. the points in the sparse point cloud) in any two images can be calculated (provided both images contain projections of the same three-dimensional points). By expanding outward from these feature points, the parallax of all pixels can be obtained using the PatchMatch Stereo algorithm [Bleyer M, Rhemann C, Rother C, PatchMatch Stereo - Stereo Matching with Slanted Support Windows, British Machine Vision Conference 2011], and the depth map can then be obtained according to the following equations:
Z = D = f·B / d
X = (x − x0)·D / f,  Y = (y − y0)·D / f
where D is the depth, d is the parallax, B is the baseline length (the baseline is O1O2), f is the focal length (in pixel units), and (x0, y0) are the pixel coordinates of the image centre point; the image coordinate system is two-dimensional with its origin at the upper-left corner.
Thus the coordinates (X, Y, Z) of all three-dimensional points in the camera coordinate system are obtained, and the three-dimensional world coordinates (Xw, Yw, Zw) of all the points are calculated using the R and t of each image computed in section 2.3, i.e. a dense point cloud.
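The disparity-to-depth back-projection equations Z = f·B/d, X = (x − x0)·Z/f, Y = (y − y0)·Z/f can be sketched for a whole disparity map as follows; the focal length, baseline and disparity values are illustrative.

```python
import numpy as np

def backproject(disparity, f, B, x0, y0):
    """Convert a disparity map to camera-frame 3D coordinates (X, Y, Z)."""
    h, w = disparity.shape
    Z = f * B / disparity                        # depth from disparity: Z = f*B/d
    x, y = np.meshgrid(np.arange(w), np.arange(h))
    X = (x - x0) * Z / f
    Y = (y - y0) * Z / f
    return np.stack([X, Y, Z], axis=-1)

# A pixel at the principal point with disparity 8 px, f = 800 px, B = 0.1 m:
pts = backproject(np.full((2, 2), 8.0), f=800.0, B=0.1, x0=0.0, y0=0.0)
print(pts[0, 0])   # depth f*B/d = 800*0.1/8 = 10.0 at the principal point
```

Pixels with zero disparity (points at infinity) would divide by zero and are masked out in practice.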
4. Virtual-to-real mapping
4.1 sparse Point cloud for mapping between real space and virtual space
An image is shot in the actual space, and the rotation matrix R and translation vector t corresponding to the image are calculated by the same method as in sections 2.1-2.3, thereby acquiring the virtual position and posture, in the virtual space, of the camera that shot the image in the actual space, and completing the mapping between the actual space and the virtual space.
4.2, dense point clouds are used for digital twinning. And displaying the dense point cloud in the virtual space.
5. Three-dimensional spatial positioning
An image of the actual space is acquired in real time through a camera, the rotation matrix R and translation vector t of the camera are calculated in real time by the method of sections 2.1-2.3, and the three-dimensional coordinates and three-dimensional angular posture of the camera in the virtual space are acquired in real time, thereby realizing real-time positioning of the camera, including its posture, in three-dimensional space (both the actual space and the virtual space).
6. Fusion of real and virtual scenes
When AR glasses are used, after the system positions the AR glasses, it displays the virtual scene on the glasses according to their position; at this moment the virtual scene is aligned with the real scene, and the real and virtual scenes are fused through the optical see-through of the glasses.
When an MR head-mounted display device is used, the system acquires an image of the real scene in real time through the camera and obtains the depth map of the real scene by the method of section 3. After the system positions the MR head-mounted display, the virtual scene is displayed on it according to its position, at which point the virtual scene is aligned with the real scene; the system then compares the actual depth information of the depth map with the content of the virtual scene, and displays the actual image where the actual depth is smaller than the virtual depth, and the virtual content otherwise.
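The per-pixel depth-comparison rule for the MR display can be sketched as follows: where the real depth is smaller than the virtual depth, the real pixel is shown, otherwise the virtual pixel. The image and depth arrays are illustrative placeholders for the camera feed and the rendered scene.

```python
import numpy as np

def fuse(real_rgb, real_depth, virt_rgb, virt_depth):
    """Per-pixel occlusion: the nearer surface (real or virtual) wins."""
    mask = (real_depth < virt_depth)[..., None]      # broadcast over RGB channels
    return np.where(mask, real_rgb, virt_rgb)

real_rgb = np.zeros((2, 2, 3), dtype=np.uint8)           # black camera image
virt_rgb = np.full((2, 2, 3), 255, dtype=np.uint8)       # white virtual layer
real_depth = np.array([[1.0, 5.0], [1.0, 5.0]])          # metres, illustrative
virt_depth = np.full((2, 2), 3.0)
out = fuse(real_rgb, real_depth, virt_rgb, virt_depth)
print(out[0, 0], out[0, 1])   # real pixel (nearer), then virtual pixel
```

This is the standard occlusion test used in mixed-reality compositing; real engines apply it per fragment with the depth buffer.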
7. Virtual scene display principle
3D refers to the representation of shapes in a three-dimensional space whose positions can be calculated using a coordinate system. A virtual scene (also referred to as a model) is made up of virtual objects, which are described using vertices. A vertex is a point in the 3D coordinate system that has a coordinate position and some optional additional information. Each vertex may contain the following attributes:
Position: the (X, Y, Z) location in 3D space.
Color: an RGBA value, where R, G and B are red, green and blue, and A is the alpha channel controlling transparency.
Normal: describes the orientation of the vertex.
Texture: a 2D picture used to decorate the model surface; in the simplest case just a colour.
3D rendering flow: the vertex descriptions of the virtual scene are accepted, fragments (fragments) thereof are calculated, and then the fragments are rendered into pixels (pixels) and output to the screen, as shown in fig. 10.
Vertex processing is divided into four steps: the first step places the object in world coordinates, also called the model transformation; the second step is the view transformation, which handles the position and orientation of the camera; the third step, the projection transformation, also called the perspective transformation, defines the camera settings, including field of view, aspect ratio and optional near and far clipping parameters; the fourth step is the viewport transformation, which determines the two-dimensional display content. Subsequent rasterization converts the 3D primitives into a series of fragments. The fragments correspond to a grid of pixels; fragment processing handles texture and illumination and computes the final colour values from the given parameters. A texture is a 2D picture that makes the model look more realistic in 3D space; a texture is a combination of individual texels, on the same principle as the combination of pixels. The colours we see on the screen are the final result of the interaction between illumination and the model's colours and textures. In the output-merging stage, the processed fragments are converted into a 2D grid of pixels and printed onto the screen pixels.
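The four vertex-processing steps (model, view, projection, viewport) can be sketched for a single vertex as follows. The perspective matrix is a standard symmetric frustum; the matrices and screen size are illustrative assumptions, not the engine's actual settings.

```python
import numpy as np

def perspective(fov_y, aspect, near, far):
    """Symmetric perspective projection (field of view, aspect ratio, clip planes)."""
    f = 1.0 / np.tan(fov_y / 2.0)
    return np.array([
        [f / aspect, 0, 0, 0],
        [0, f, 0, 0],
        [0, 0, (far + near) / (near - far), 2 * far * near / (near - far)],
        [0, 0, -1, 0]])

def to_screen(vertex, model, view, proj, width, height):
    """Model -> view -> projection -> viewport, returning pixel coordinates."""
    clip = proj @ view @ model @ np.append(vertex, 1.0)
    ndc = clip[:3] / clip[3]                     # perspective divide
    return (float((ndc[0] + 1) * width / 2), float((1 - ndc[1]) * height / 2))

model = view = np.eye(4)                          # object and camera untransformed
proj = perspective(np.pi / 2, 1.0, 0.1, 100.0)
# A vertex straight ahead of the camera lands at the screen centre:
print(to_screen(np.array([0.0, 0.0, -5.0]), model, view, proj, 640, 480))  # (320.0, 240.0)
```

Rasterization then interpolates the remaining vertex attributes (colour, normal, texture coordinates) across each fragment.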
8. Taking AR glasses bumper car as an example to illustrate the practical application of the principle
8.1, establishing a sparse point cloud of a scene of a parking lot
First, a camera is used to photograph the actual environment of the bumper-car yard (hundreds of photos, of which the first determines the origin of the point-cloud coordinate system); the photos must be clear, the contents of any two adjacent photos must partially overlap, and the environment must have fixed patterns (wall surfaces must not be blank white). Then a sparse point cloud is generated according to the steps of sections 2.1-2.4 above (in preparation for later positioning with this point cloud).
8.2 establishing virtual scenes
A virtual scene is established using 3D modeling software, containing virtual objects (such as virtual gold coins, virtual bombs, virtual runways, virtual starry skies, virtual dinosaurs, virtual walls and the like); the sparse point cloud established in section 8.1 is then also placed into the virtual scene, so that the fixed objects in the virtual scene have a three-dimensional relative position relationship with the sparse point cloud.
8.3, realizing AR playing method of the dodgem
1. Topology structure: a number of bumper cars (2-40) are arranged in a bumper-car yard; each bumper car carries 1-2 tourists, each tourist wears a pair of AR glasses, and each pair of AR glasses is provided with a computing device, which may be mounted directly on the AR glasses or placed on the bumper car and connected to the glasses by wire; the yard is provided with a server, a client program runs in the computing device of each pair of AR glasses, and all the clients are connected to the server through a wireless network.
2. Acquiring an image in real time through a camera on the AR glasses, and positioning the three-dimensional position and the three-dimensional posture of the AR glasses in real time by using the three-dimensional space positioning method of the 5 th section and the sparse point cloud established in the 8.1 section; the positioning program runs in the computing device of the AR glasses;
3. After acquiring its correct position and posture, the game 3D engine in the AR glasses' computing device correctly displays the virtual scene established in section 8.2 (including the virtual objects) on the AR glasses in real time;
4. the server may distribute the virtual scene to the computing devices of each AR glasses over a wireless network. The server also synchronizes the state (e.g., vanishing, rebuilding, sound effects start and stop, etc.) of virtual objects associated with the virtual scene in the AR glasses computing device in real time.
5. The game management program running in all the AR glasses' computing devices is the same and manages the game play together with the server, handling for example the collision of a bumper car with a virtual object and the collisions between bumper cars (e.g. who consumed gold coins, who hit a bomb, and the corresponding scores).
The following description will be given by taking a bumper car as an example:
In this embodiment, as shown in fig. 6, the live-action park includes an interactive device, namely a bumper car, and the field is a bumper-car yard. The bumper car has a structure similar to an existing bumper car, comprising a driving device 111 for driving the bumper car forward, a braking device 112 for stopping the bumper car, and a steering device 113 for controlling the direction of the bumper car, as shown in figs. 11-12. The system further comprises a remote control device 114, a steering motor 115, a position acquisition device 116, a vehicle-mounted control device 117, a driving-device mounting seat 118, a sound acquisition device 119, a loudspeaker 1110, an image camera 1111 serving as the live-action data acquisition device, a vehicle-mounted reader 1112 serving as the player information acquisition device, a motion-state acquisition device 1113 serving as the interaction-device data acquisition device, a display screen 1114 serving as the display terminal, a power management module 1115 for supplying power, and a processor 1116 for receiving data and controlling the work of each component. The driving device comprises a speed-control motor 1117 for driving the bumper car forward and a motor driving circuit 1118 for driving the speed-control motor; the motor driving circuit is connected with the steering motor, and the braking device is connected with a braking control coil circuit 1119. The control devices are mainly remote control devices additionally arranged on the steering device and the driving device.
The terminal in this embodiment is a vehicle-mounted display device intended to provide an immersive mixed-reality experience; virtual scenes in the field can be displayed through it. The vehicle-mounted display screen is a mixed-reality display device arranged in front of the bumper car, and may also be AR glasses; it simultaneously displays the real environment of the field and the virtual game elements, the virtual game elements being correctly mapped to positions in the real environment through mixed-reality technology, which enhances the player's sense of immersion during interaction.
During vehicle operation in this embodiment, the acquisition devices can collect parameters such as each vehicle's ID, position and posture, so as to complete a series of interaction events in the interaction system, such as picking up virtual gold coins and props in the field. Through the associated collision sensors, the bumper-car amusement service in the interaction system can record information such as collision position, force and direction during interactive collisions between vehicles, so as to calculate the different scores obtained by different players; virtual-reality superposition of the bumper cars is thus realized, achieving virtual-real combination and interaction in the real scene. The communication device may be a wireless module (a Bluetooth module, a WIFI module or a mobile communication module) arranged on a bumper car or in the bumper-car yard; the terminal is a mobile phone, and the display terminal may be a display screen or AR glasses arranged on the bumper car. The player uses the mobile phone to log in to the server through the network, enters the virtual park, connects with a bumper car in the live-action park through the bumper-car amusement service of the interaction system, and controls the bumper car through the driving device that moves it forward, the braking device that stops it, and the steering device that controls its direction.
The specific playing interaction implementation manner of the dodgem in different scenes is further described below:
1. play downstream of live-action park midline
As shown in figs. 6 and 16, the game interaction of the bumper cars in the live-action park is implemented by developing a virtual scene and combining vehicle-mounted display, MR glasses, AR glasses or VR glasses technology with software and hardware development to construct a virtual-real fused competition environment, so that players can play immersively, through head-mounted displays, in other places or in virtual three-dimensional environments created by designers, such as lava, space or the seabed. When a player drives a vehicle wearing MR glasses, a mixed-reality scene is displayed; the position and orientation information of each bumper car is transmitted to the bumper-car amusement service of the interaction system through the positioning system, so that the glasses can correctly display the positions and postures of the bumper cars, realizing additional virtual playing methods for the player in the real scene while the real collisions of the vehicles are still experienced.
The playing flow of the in-field playing method is as follows: (1) After a player wearing a bracelet containing an RFID chip boards a bumper car in the bumper-car yard, the player's identity is recognized through the vehicle-mounted RFID reader. After the player's identity information is transferred to the player management module of the interaction system through the network, the module returns the player's relevant information, such as player profile and prop numbers. (2) After the devices are started, players compete with one another in the bumper-car yard by driving the bumper cars, including real collisions that knock gold coins off other players' cars; dropped gold coins can be picked up by other players. Virtual props such as gold coins are presented to the players through display terminals, which may be vehicle-mounted displays, transparent HUD windscreens, AR/MR glasses and the like. (3) Besides virtual props, the game may also contain virtual vehicles controlled by an AI; after a player interacts with such a vehicle, feedback is given through the vehicle-mounted control system; for example, when colliding with a virtual vehicle, the real bumper car is braked by the braking control device to simulate the virtual-real vehicle collision. (4) End of play: after the prescribed time is reached, the game service calculates the results and records them into each player's account.
The out-of-field interaction playing method: as shown in fig. 6, an off-site player interacts, through the display mechanism, with the players playing bumper cars on site, sees the virtual elements in the field fused with the real scene, and can use terminal devices such as the off-site display mechanism to "throw" virtual props into the field; the corresponding virtual props are displayed at their throwing positions through the various display terminals, which enhances the attractiveness of the attraction.
The system enables the player to "throw" virtual props into the venue by:
1. the off-site terminal presents optional virtual props of the player;
2. the player selects the prop by pressing a button or touch screen on the off-site terminal and controls the direction key or directly drags and drops the position of the prop "throw" on the touch screen.
3. And the game engine of the off-site terminal calculates the position of the prop in the virtual scene according to the position set by the player and then sends the position to the interaction system.
4. The interactive system displays the player-selected prop in the virtual scene.
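The prop-"throwing" flow of steps 1-4 can be sketched as a message the off-site terminal sends to the interaction system once the game engine has computed the virtual-scene position. The field names below are invented for illustration; the disclosure does not specify the actual protocol.

```python
import json

def make_throw_event(player_id, prop_id, virtual_pos):
    """Package the chosen prop and its computed virtual-scene position (step 3)."""
    return json.dumps({
        "type": "prop_throw",                 # hypothetical event name
        "player": player_id,
        "prop": prop_id,
        "position": {"x": virtual_pos[0], "y": virtual_pos[1], "z": virtual_pos[2]},
    })

# The off-site terminal sends this to the interaction system, which then
# displays the prop at that position on all display terminals (step 4).
event = make_throw_event("offsite-07", "gold_coin", (3.2, 0.0, 8.5))
print(json.loads(event)["prop"])   # gold_coin
```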
An implementation of in-field and out-of-field interactive play: display screens, which may be transparent, translucent or opaque, are arranged around the yard and can be used by off-site players to interact with on-site players. The off-site player operates the controls provided on the off-site display screen, for example zooming the picture so that the screen shows a panorama or a local close-up, throwing props such as virtual gold coins, bombs or virtual walls, or adding an acceleration card, a deceleration card or armour to a selected bumper car so that it accelerates, decelerates or is protected from attack. When nobody is operating it, the display screen can serve as a large screen showing ranking lists, real-time battle rebroadcasts and scenes of the virtual interaction, or give operation prompts to attract audience participation.
2. Play on virtual park midline
As shown in fig. 16, the online mobile mini-game: an online player can log in to the meta-game system through a terminal such as a mobile phone and use it to participate in the bumper-car mini-game in the virtual scene, where the player can see virtual bumper cars controlled by various other players. The game scenes are varied and changeable, and the playing methods can approach, or even exceed, the offline playing methods, forming an independent game ecology.
3. Online virtual park and offline live-action park play together
As shown in fig. 13 and 16, in the virtual park and the live-action park, the two play modes can be combined by software and hardware, so that an online player can remotely control an offline real vehicle or an online virtual vehicle to perform the same-platform play with the offline real dodgem player.
The off-line player can see that the virtual image of the on-line player drives the real dodgem or can see the virtual dodgem and the virtual image through the display terminal, and the on-line player can see the picture of the fusion of the real scene and the virtual scene or the picture of the pure virtual. The playing method solves the problem that offline players lack accompanying playing, and connects various terminals with a plurality of space-time players.
Before the game starts, the online player logs in the platform in the terminal authorization, enters the amusement hall, selects the dodgem to be controlled, and the offline player binds the selected dodgem through the bracelet.
After an on-site instruction to start the game is issued, the online player can control the selected bumper car through the bumper-car amusement service of the interaction system, just as if playing a game on the terminal, and sees through the terminal the virtual-real combined picture in which the real images transmitted by the vehicle-mounted camera are fused with virtual elements. Alternatively, no real bumper car is remotely controlled and only a purely virtual picture is displayed, in which case the online player sees a play picture composed of virtual bumper cars, displayed according to the other players' positions, together with other virtual elements.
The off-line player can see the virtual images of the real dodgem which is remotely controlled and the remote player on the car through various display terminals, or the pure virtual vehicles and roles are fused in the real field.
The online game playing method has the advantages that online and offline play is achieved, online players can remotely experience stimulation and happiness of the online games, and the online players can obtain distinctive play experience because the remotely controlled unmanned dodgem participates in the games.
In this playing method, the online player's game client sends data such as control instructions, positions and postures of the bumper car to the bumper-car amusement service through the network; the service forwards the data to the offline player's bumper car, which renders the online player's bumper-car position and posture in real time on its display terminal. Conversely, the bumper car controlled by the online player reports the speed, position and posture data from its sensors and positioning device to the bumper-car amusement service, which forwards them to the online player's client, and the online player's game client renders the position and state of the offline bumper car, as shown in fig. 14.
4. Bumper-car matches between different sites: as shown in fig. 15, this mode requires at least two real bumper-car yards with the online-offline playing function. The two yards automatically match an intersection area to guarantee a consistent playing area, and both yards must be in online-play matching mode at the same time; they are hereinafter referred to as yard A and yard B, and the total number of players in the two yards must not exceed the player capacity of the smaller yard.
This playing method requires that, when a bumper-car yard is connected to the platform, the shape, side lengths and direction of the yard be drawn through an editor provided by the system. When the system automatically matches the intersection area, the two yard polygons are placed in a coordinate system in the same direction, the centre of gravity of each yard shape is calculated, the centre of gravity of either yard polygon is translated to coincide with that of the other, and the intersection of the two polygons is then taken as the range of the playing area.
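The centre-of-gravity alignment step described above can be sketched as follows: the area-weighted centroid of each yard polygon is computed, and one polygon is translated so the two centroids coincide before the intersection is taken. The yard vertices are illustrative.

```python
def centroid(poly):
    """Area-weighted centroid of a simple polygon [(x1, y1), ...] in vertex order."""
    a = cx = cy = 0.0
    k = len(poly)
    for i in range(k):
        x1, y1 = poly[i]
        x2, y2 = poly[(i + 1) % k]
        cross = x1 * y2 - x2 * y1
        a += cross
        cx += (x1 + x2) * cross
        cy += (y1 + y2) * cross
    a *= 0.5
    return (cx / (6 * a), cy / (6 * a))

def align(poly, target):
    """Translate poly so its centre of gravity coincides with that of target."""
    (gx, gy), (tx, ty) = centroid(poly), centroid(target)
    return [(x + tx - gx, y + ty - gy) for (x, y) in poly]

yard_a = [(0, 0), (8, 0), (8, 6), (0, 6)]          # 8 x 6 yard, illustrative
yard_b = [(10, 10), (16, 10), (16, 14), (10, 14)]  # 6 x 4 yard, illustrative
print(centroid(align(yard_b, yard_a)))  # coincides with centroid of yard_a: (4.0, 3.0)
```

The intersection of the aligned polygons is then computed by the clipping steps listed below in the text.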
A calculation step of calculating an intersection area of polygons:
1) Establish vertex tables of the main polygon and the auxiliary polygon;
2) Find the intersection points of the main polygon and the auxiliary polygon, insert them in order into the two polygon tables, and establish bidirectional pointers;
3) While there is an entry point that has not been traced, perform the following operations:
3.1) Select any untraced entry point from the main polygon table as the starting point, and trace along the polygon boundary, outputting each point into the intersection table; whenever an intersection point is encountered, switch the tracing direction through the bidirectional pointer, until the starting point is reached again.
3.2) Calculate the intersection area from the intersection table using the following formula:
Let M = M1M2…Mk be a polygon with vertices Mi = (xi, yi), i = 1, 2, …, k, and set Mk+1 = M1; the area of the polygon M is then
S = (1/2)·| Σ(i=1..k) (xi·y(i+1) − x(i+1)·yi) |
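The polygon-area formula above (the shoelace formula) can be sketched directly; it is applied to the intersection table to obtain the area of the playing region. The rectangle used here is illustrative.

```python
def polygon_area(vertices):
    """Area of polygon M1...Mk with Mi = (xi, yi) given in boundary order."""
    k = len(vertices)
    s = 0.0
    for i in range(k):
        x1, y1 = vertices[i]
        x2, y2 = vertices[(i + 1) % k]   # wraps around: Mk+1 = M1
        s += x1 * y2 - x2 * y1
    return abs(s) / 2.0

print(polygon_area([(0, 0), (4, 0), (4, 3), (0, 3)]))  # 12.0 (4 x 3 rectangle)
```

Taking the absolute value makes the result independent of whether the vertices are listed clockwise or counter-clockwise.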
As shown in figs. 15-16, after matching succeeds the initialization process starts: players at both venues must drive to a designated area and stop. This step unifies the vehicle positions across the two venues, so that during play each local manned car and the remote driverless bumper car that mirrors it stay synchronized in position and action.
After initialization, the game begins and the game states of the two venues stay synchronized: when a player in venue A drives a car, a driverless bumper car in venue B moves with the same driving pose, and conversely a driverless bumper car in venue A mirrors each manned car in venue B. Players can therefore collide with each other and use virtual props, and because physical cars are synchronized on both sides, the somatosensory feedback is consistent for both players.
In this play mode, the position, speed and pose of each manned bumper car are reported in real time to the bumper car amusement service of the interaction system; the interaction system forwards the data to the corresponding driverless bumper car, whose control system drives the vehicle according to the data received from the interaction system.
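The report-and-forward path above can be sketched as follows. This is an illustrative sketch only: the message fields, class names and pairing scheme are assumptions, not the patent's actual protocol.

```python
import json
from dataclasses import dataclass, asdict

@dataclass
class CarTelemetry:
    """One real-time report from a manned bumper car (field names illustrative)."""
    car_id: str
    x: float          # position in the shared play-area coordinate system
    y: float
    speed: float      # m/s
    heading: float    # degrees; the car's pose

class RideService:
    """Relays telemetry from a manned car to its paired driverless car."""

    def __init__(self, pairing: dict):
        self.pairing = pairing   # manned car id -> remote driverless car id
        self.outbox = []         # messages queued for the remote venue

    def report(self, t: CarTelemetry):
        remote_id = self.pairing[t.car_id]
        msg = {"target": remote_id, "state": asdict(t)}
        self.outbox.append(json.dumps(msg))  # would be sent over the network
```

In a deployment, the outbox would be delivered to the remote venue's driverless car control system, which replays the received position, speed and heading.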
5. Online and offline continuous play
Each account has a persistent store: points earned online can be used offline, and points earned offline can be used online. A player can create an account, log in to the meta-play system through a terminal such as a mobile phone, take part in the virtual park's attractions through the terminal, and earn points.
When playing offline, the player enters the live-action park with a wristband issued by the park. Each time the wristband is scanned, the meta-play system retrieves and updates the player's points, rankings and other information, so that points and rankings earned online can be used in offline play.
Players entering the queuing area of the park's mine car ride watch one or two prefabricated virtual-reality gameplay previews of the ride on a large screen installed there; this builds anticipation for the novel gameplay about to be experienced and teaches the controls and skills needed in individual stages of the ride.
When the safety bar is fastened in the boarding area, the player is prompted by an announcement to pair with the interactive armrest, and whether pairing has succeeded is confirmed by the armrest's indicator and similar cues.
After the mine car departs, the player sees a series of virtual story-line pictures along the track and, following audio prompts from the ride, can perform the specified actions to interact with, and pass through, those pictures, creating active interaction and moments of surprise for the player.
The player performs the prescribed actions as indicated by the interactive armrest. Certain devices of the ride can then be controlled according to an action index aggregated over all players in the car, for example: gem lighting, animatronic motion, volcanic eruption, and the amount of the water curtain.
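The aggregation step can be sketched as a simple threshold scheme. The patent does not define how the action index is computed or what the thresholds are; the mean score and all threshold values below are assumptions for illustration only.

```python
def triggered_effects(action_scores, thresholds):
    """Aggregate per-rider action scores into a car-wide index and
    return the ride effects whose (illustrative) thresholds are met."""
    index = sum(action_scores) / max(len(action_scores), 1)  # mean score
    return [effect for effect, t in thresholds.items() if index >= t]

# Hypothetical effect thresholds on a 0-100 action index
THRESHOLDS = {"gem_lighting": 20, "animatronic_motion": 40,
              "water_curtain": 60, "volcanic_eruption": 80}
```

For example, a car of three riders scoring 70, 50 and 60 yields an index of 60, enough to trigger the first three effects but not the volcanic eruption.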
Based on each player's in-ride performance, the meta-play system scores every visitor and converts the score into growth value in the system's point-reward scheme; these points can be used both in the online meta-play system and in the offline live-action park.
Points earned in the meta-play system can be exchanged, while playing the mine car ride in the live-action park, for special props that grant different gameplay privileges, or redeemed for ticket discounts issued by the park.
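The shared online/offline point balance can be sketched as a single ledger. This is a sketch under assumptions: the patent does not specify the ledger's data model, and the class and method names are illustrative.

```python
class PointAccount:
    """One balance shared by online play and offline park visits."""

    def __init__(self):
        self.balance = 0
        self.history = []   # (venue, points) entries, e.g. ("online", 100)

    def earn(self, points: int, venue: str):
        """Credit points earned either online or in the live-action park."""
        self.balance += points
        self.history.append((venue, points))

    def redeem(self, cost: int) -> bool:
        """Spend points on a prop or a ticket discount; False if insufficient."""
        if self.balance >= cost:
            self.balance -= cost
            return True
        return False
```

Because both channels debit and credit the same balance, points earned online remain available when the wristband is scanned offline, and vice versa.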

Claims (16)

1. An entertainment system capable of realizing virtual-real interaction, characterized by comprising a live-action park (1), a meta-play system server (2) and a terminal (3);
The live-action park comprises interactive devices (11) for the entertainment or play or experience of players placed on or running on a play field; a live-action data acquisition device (12) for acquiring live-action park data; a player information collection device (13) for collecting player information; an interactive device data acquisition means (14) for acquiring interactive device data; and communication means (15) for communication between the interaction devices or between the live-action park and the meta-play system server; the information acquired by the live-action data acquisition device, the player information acquisition device and the interactive equipment data acquisition device is transmitted to the meta-play system server through the communication device;
the meta-play system server comprises a virtual park (21) and an interaction system (22) constructed according to the live-action park data; the interaction system comprises an amusement service module (23) for synchronously displaying, in the virtual park scene, the running state of the virtual interaction device mapped from the player's live-action interaction device, so as to realize synchronous interaction between real and virtual information; and a player management module (24) for associating the player's real identity with the virtual identity and for presenting real and virtual information;
the terminal is used for displaying the virtual park scene, the virtual identities and the virtual interaction devices in the meta-play system server;
The live-action park is in communication connection with the meta-play system server, and the meta-play system server is in communication connection with the terminal.
2. The entertainment system capable of realizing virtual-real interaction according to claim 1, wherein the live-action data acquisition device and the interaction equipment data acquisition device can be integrated.
3. An entertainment system for enabling virtual-real interaction according to claim 1, characterized in that the terminal is one or more of: a mobile terminal (31), a PC, a display terminal (32), or a physical display device.
4. An entertainment system for realizing virtual-real interaction according to claim 3, wherein the terminal is installed on an interaction device of a live-action park.
5. An entertainment system for enabling virtual-real interaction according to claim 1, characterized in that the interaction system further comprises an amusement park management module (25).
6. An entertainment system for enabling virtual-real interactions as claimed in claim 4 wherein the interaction system further comprises a task management module (26).
7. An entertainment system for enabling virtual-real interactions as claimed in claim 5, characterized in that the interaction system further comprises a digital asset management module (27).
8. The entertainment system for realizing virtual-real interaction according to claim 1, wherein the virtual park is a digital scene that mirrors part or all of the live-action park and is displayed with theme activities superimposed.
9. An entertainment system for enabling virtual-real interaction according to claim 1, characterized in that the virtual park comprises a virtual scene (211), a virtual device (212), a virtual character library (213).
10. The entertainment system for realizing virtual-real interaction according to claim 9, characterized in that the virtual character library comprises a plurality of virtual characters (2131), wherein the virtual characters are constructed based on real player information or are created by the system, and the virtual characters can be non-player characters set according to the amusement experience of the virtual park.
11. An entertainment system for enabling virtual-real interactions according to claim 9, characterized in that the virtual park further comprises a base material library (214), which may contain one or more of: virtual theme elements, virtual props, or virtual interaction devices.
12. An entertainment system capable of realizing virtual-real interaction according to claim 1, wherein the acquisition device can be one or more of an image acquisition device, a position acquisition device, a motion state acquisition device, a sound acquisition device, or a physiological information acquisition device.
13. A virtual-real interactive entertainment system according to claim 3, characterized in that the terminal is a display means (33) for displaying a mixed scene of virtual elements and reality and for interacting with players of the amusement park.
14. The entertainment system capable of realizing virtual-real interaction according to claim 1, characterized in that there are a plurality of live-action parks and a plurality of virtual parks, and the plurality of live-action parks and the plurality of virtual parks are connected through the interaction system.
15. An entertainment system for realizing virtual-real interaction according to claim 14, wherein the plurality of live-action parks form a real world, the plurality of virtual parks form a virtual world, and the real world and the virtual world are connected through the interaction system.
16. An interaction method of an entertainment system capable of realizing virtual-real interaction according to any one of claims 1-15, characterized by comprising the following steps:
S1, the live-action data acquisition device acquires live-action park data and transmits it to the meta-play system server through the communication device;
S2, the meta-play system server constructs a virtual park according to the collected live-action park data;
S3, the player information acquisition device acquires player information and transmits it to the meta-play system server through the communication device;
S4, the player management module associates the player's real identity with the virtual identity according to the collected player information and enters the corresponding amusement service module;
S5, the interaction equipment data acquisition device acquires the running state data of the interaction equipment and transmits it to the meta-play system server through the communication device;
S6, according to the collected data, the amusement service module synchronously displays in the virtual park scene the running state of the virtual interaction device mapped from the player's live-action interaction device, while the player interacts with the virtual park scene, the virtual identity and the virtual interaction device through the terminal, so as to realize synchronous interaction between real and virtual information.
CN202311045088.4A 2022-08-23 2023-08-18 Entertainment system capable of realizing virtual-real interaction Pending CN116850575A (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN202211013555.0A CN115337626A (en) 2022-08-23 2022-08-23 Entertainment system capable of realizing virtual-real interaction
CN2022110135550 2022-08-23

Publications (1)

Publication Number Publication Date
CN116850575A true CN116850575A (en) 2023-10-10

Family

ID=83953986

Family Applications (2)

Application Number Title Priority Date Filing Date
CN202211013555.0A Pending CN115337626A (en) 2022-08-23 2022-08-23 Entertainment system capable of realizing virtual-real interaction
CN202311045088.4A Pending CN116850575A (en) 2022-08-23 2023-08-18 Entertainment system capable of realizing virtual-real interaction


Country Status (1)

Country Link
CN (2) CN115337626A (en)

Also Published As

Publication number Publication date
CN115337626A (en) 2022-11-15


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination