CN113398577B - Multi-person AR interaction method and system for offline space - Google Patents

Multi-person AR interaction method and system for offline space

Info

Publication number
CN113398577B
CN113398577B · CN202110522697.9A
Authority
CN
China
Prior art keywords
experience
content
local server
dimensional map
interaction
Prior art date
Legal status
Active
Application number
CN202110522697.9A
Other languages
Chinese (zh)
Other versions
CN113398577A
Inventor
邹礼见
吴文斌
虞崇军
Current Assignee
Hangzhou Yixian Advanced Technology Co ltd
Original Assignee
Hangzhou Yixian Advanced Technology Co ltd
Priority date
Filing date
Publication date
Application filed by Hangzhou Yixian Advanced Technology Co ltd filed Critical Hangzhou Yixian Advanced Technology Co ltd
Priority to CN202110522697.9A
Publication of CN113398577A
Application granted
Publication of CN113398577B
Active legal status
Anticipated expiration


Classifications

    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00 Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/50 Controlling the output signals based on the game progress
    • A63F13/52 Controlling the output signals based on the game progress involving aspects of the displayed game scene
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00 Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/30 Interconnection arrangements between game servers and game devices; Interconnection arrangements between game devices; Interconnection arrangements between game servers
    • A63F13/35 Details of game servers
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00 Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/40 Processing input control signals of video game devices, e.g. signals generated by the player or derived from the environment
    • A63F13/42 Processing input control signals of video game devices, e.g. signals generated by the player or derived from the environment by mapping the input signals into game commands, e.g. mapping the displacement of a stylus on a touch screen to the steering angle of a virtual vehicle
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T19/00 Manipulating 3D models or images for computer graphics
    • G06T19/006 Mixed reality
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F2300/00 Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
    • A63F2300/60 Methods for processing data by generating or executing the game program
    • A63F2300/66 Methods for processing data by generating or executing the game program for rendering three dimensional images

Abstract

The application relates to a multi-person AR interaction method for an offline space, comprising the following steps: an AR experience device captures real pictures of the offline space through an internal camera unit and uploads them to a local server; the local server downloads a three-dimensional map and virtual experience content from a global server and provides a positioning service for the AR experience device based on the three-dimensional map; the AR experience device obtains positioning information and visual field information through the positioning service, and according to them obtains from the local server the virtual experience content corresponding to that positioning information and visual field information; and the AR experience device receives an interaction instruction of the user and updates and displays the virtual experience content in the display interface according to the interaction instruction. The method and system solve the problem in the related art that AR interaction among multiple users cannot be carried out in an offline space, and improve the interactivity and user experience of players in the offline space.

Description

Multi-person AR interaction method and system for offline space
Technical Field
The application relates to the technical field of augmented reality, in particular to a multi-person AR interaction method and system for offline space.
Background
Augmented reality (Augmented Reality, abbreviated as AR) technology is a technology that ingeniously merges virtual information with the real world. Currently, with the update and development of technology, AR technology is increasingly applied to entertainment scenes such as amusement parks and game halls.
In the related art, the AR experience devices in conventional amusement parks or game halls are single-point devices or desktop devices: with a single-point device the user mainly watches AR content on the device itself, while a desktop device offers AR interaction within a preset range of space (such as a desktop). Neither approach can provide AR interaction among multiple users throughout an offline space.
At present, no effective solution has been proposed for the problem in the related art that AR interaction among multiple users cannot be carried out in an offline space.
Disclosure of Invention
The embodiments of the present application provide a multi-person AR interaction method, system, computer device, and computer-readable storage medium for offline space, to solve at least the problem in the related art that AR interaction among multiple users cannot be carried out in an offline space.
In a first aspect, an embodiment of the present application provides a multi-person AR interaction method for offline space, where the method includes:
the AR experience equipment collects real pictures of the offline space through an internal camera unit and uploads the real pictures to a local server;
the local server downloads a three-dimensional map and virtual experience content from a global server and provides positioning service for the AR experience equipment based on the three-dimensional map;
the AR experience equipment acquires positioning information and visual field information through the positioning service, and acquires virtual experience content corresponding to the positioning information and the visual field information from the local server according to the positioning information and the visual field information and displays the virtual experience content;
the AR experience device receives an interaction instruction of a user, updates the virtual experience content in a display interface according to the interaction instruction and displays the virtual experience content, wherein the interaction instruction is generated by the user based on the display interface of the virtual experience content.
In some embodiments, before the AR experience device acquires a real picture of the offline space through the internal camera unit, the method includes:
acquiring scene data of a target scene through a camera and laser scanning equipment, and constructing a three-dimensional map of the target scene according to the scene data, wherein the target scene comprises the offline space;
and uploading the three-dimensional map to the global server.
In some of these embodiments, the local server providing location services to the AR experience device based on the three-dimensional map comprises:
the local server receives the real picture uploaded by the AR experience equipment, traverses the three-dimensional map through a preset positioning algorithm, acquires the positioning information and the visual field information corresponding to the real picture in the three-dimensional map, and transmits the positioning information and the visual field information to the AR experience equipment.
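The positioning step described above can be illustrated with a deliberately simplified sketch. All names here (MapKeyframe, localize, the toy descriptor) are illustrative assumptions and not the patent's actual algorithm; a production visual positioning service would match local image features against the point-cloud map and solve a perspective-n-point problem rather than compare toy descriptors:

```python
import math

# Hypothetical sketch: the three-dimensional map is reduced to keyframes with
# known camera poses, and a query frame (the "real picture") is localized by
# finding the keyframe whose toy global descriptor is closest.

class MapKeyframe:
    def __init__(self, descriptor, position, yaw_deg):
        self.descriptor = descriptor   # toy global image descriptor
        self.position = position       # (x, y, z) in map coordinates
        self.yaw_deg = yaw_deg         # camera heading in degrees

def localize(query_descriptor, keyframes):
    """Return the (position, heading) of the best-matching map keyframe."""
    def dist(a, b):
        return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))
    best = min(keyframes, key=lambda kf: dist(kf.descriptor, query_descriptor))
    return best.position, best.yaw_deg

keyframes = [
    MapKeyframe([0.1, 0.9], (0.0, 0.0, 1.5), 0.0),
    MapKeyframe([0.8, 0.2], (4.0, 2.0, 1.5), 90.0),
]
pos, yaw = localize([0.75, 0.25], keyframes)
print(pos, yaw)   # the nearest keyframe's pose is returned
```

The returned pose and heading stand in for the positioning information and visual field information sent back to the AR experience device.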
In some embodiments, the local server is disposed in the offline space, and the global server is disposed in a remote operation center, and in the case that the three-dimensional map needs to be updated, the global server receives an operation instruction of an operator, updates the three-dimensional map, and pushes the updated three-dimensional map to the local server.
In a second aspect, embodiments of the present application provide a multi-person AR interactive system for offline space, the system comprising: the system comprises a global server, a local server and AR experience equipment, wherein the global server is in communication connection with the local server through a network, and the local server is in communication connection with the AR experience equipment through a local area network;
the global server comprises a map management module and an AR content management module, wherein,
the map management module is used for receiving scene data of the offline space acquired by the camera and the laser scanning equipment, constructing a three-dimensional map of the offline space according to the scene data and storing the three-dimensional map;
the content management platform is used for receiving virtual experience content and providing storage, wherein the virtual experience content is AR experience content overlapped in the offline space by combining scene distribution conditions of the offline space by a designer;
the local server comprises a positioning service module and a content service module, wherein the positioning service module is used for downloading a three-dimensional map from the global server and providing positioning services for the AR experience equipment based on the three-dimensional map, and the content service module is used for receiving the virtual experience content from the global server and providing storage and pushing the virtual experience content to the AR experience equipment;
the AR experience equipment is used for acquiring positioning information and visual field information through the positioning service, and acquiring virtual experience content corresponding to the positioning information and the visual field information from the local server according to the positioning information and the visual field information and displaying the virtual experience content;
the AR experience device is used for receiving an interaction instruction of a user, updating the virtual experience content in a display interface according to the interaction instruction and displaying the virtual experience content, wherein the interaction instruction is generated by the user based on the display interface of the virtual experience content.
In some embodiments, the positioning service module is configured to receive the real picture uploaded by the AR experience device, traverse the three-dimensional map through a preset positioning algorithm, obtain the positioning information and the visual field information corresponding to the real picture in the three-dimensional map, and send them to the AR experience device.
In a third aspect, embodiments of the present application provide an AR interactive cart, the cart comprising: a running unit, a visual unit, a display unit and a communication operation unit;
the running unit is used for providing power to enable the trolley to run in an on-line lower scene;
the visual unit is used for acquiring a real picture of the off-line scene and uploading the real picture to a local server to acquire positioning information and visual field information of the trolley;
the communication operation unit is used for acquiring virtual experience content corresponding to the positioning information and the visual field information from the local server according to the positioning information and the visual field information, wherein the virtual experience content is AR experience content added in the offline space, and the three-dimensional map is constructed according to the scene data after scene data of the offline scene are acquired through a camera and a laser scanning device;
the display unit is used for displaying the virtual experience content, wherein the display unit presents the virtual experience content updated according to the interaction instruction under the condition that the communication operation unit receives the interaction instruction of the user, and the interaction instruction is generated by the user based on a display interface of the virtual experience content.
In some embodiments, the communication operation unit is further configured to establish a network connection with the local server, through which interactive communication among a plurality of carts is established, enabling the carts to interact with one another.
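The cart-to-cart communication described above can be sketched as a toy in-process relay. The class and method names (LocalServerRelay, register, broadcast) are assumptions for illustration only; a real deployment would use LAN sockets or a game-networking layer rather than Python lists:

```python
class LocalServerRelay:
    """Toy stand-in for the local server relaying messages among carts."""

    def __init__(self):
        self.carts = {}   # cart_id -> inbox (list of received messages)

    def register(self, cart_id):
        self.carts[cart_id] = []
        return self.carts[cart_id]

    def broadcast(self, sender_id, message):
        # Deliver the sender's interaction message to every other cart so
        # that all carts see the same updated virtual experience content.
        for cart_id, inbox in self.carts.items():
            if cart_id != sender_id:
                inbox.append((sender_id, message))

relay = LocalServerRelay()
inbox_a = relay.register("cart-A")
inbox_b = relay.register("cart-B")
relay.broadcast("cart-A", {"type": "drag", "target": "whale", "delta": (1, 0, 0)})
print(inbox_b)   # cart-B receives cart-A's instruction; cart-A gets no echo
```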
In a fourth aspect, an embodiment of the present application provides a computer device, including a memory, a processor, and a computer program stored on the memory and capable of running on the processor, where the processor implements a multi-person AR interaction method for offline space according to the first aspect when executing the computer program.
In a fifth aspect, embodiments of the present application provide a computer readable storage medium having a computer program stored thereon, which when executed by a processor implements a multi-person AR interaction method for offline space as described in the first aspect above.
Compared with the related art, the multi-person AR interaction method and system for offline space provided by the embodiments of the present application work in two stages. In the first stage, data of the target scene are acquired by a laser scanning device and a color camera, and a three-dimensional map of the target scene is constructed; the three-dimensional map is distributed by the global server to a local server, which provides a positioning service for the AR experience devices. In the second stage, an AR experience device deployed in the target scene captures a real picture, the local server determines the device's position information and visual field information, and the corresponding AR experience content is then obtained from the local server, superimposed on the real picture of the offline scene, and finally displayed to the user. The user can issue various interaction instructions to interact with his or her own AR experience content or with that of other users. The embodiments thereby solve the problem in the related art that AR interaction among multiple users cannot be carried out in an offline space, and improve the interactivity and user experience of players in the offline space.
Drawings
The accompanying drawings, which are included to provide a further understanding of the application and are incorporated in and constitute a part of this application, illustrate embodiments of the application and together with the description serve to explain the application and do not constitute an undue limitation to the application. In the drawings:
FIG. 1 is a schematic view of an application environment of a multi-person AR interaction method for offline space according to an embodiment of the present application;
FIG. 2 is a flow chart of a method of multi-person AR interaction in offline space according to an embodiment of the present application;
FIG. 3 is a block diagram of a multi-person AR interaction system in offline space according to an embodiment of the present application;
FIG. 4 is a schematic diagram of a multi-person AR interactive system in offline space according to an embodiment of the present application;
FIG. 5 is a schematic diagram of the internal structure of an AR interaction cart according to an embodiment of the present application;
FIG. 6 is a communication schematic diagram of an AR interaction cart according to an embodiment of the present application;
fig. 7 is a schematic diagram of an internal structure of an electronic device according to an embodiment of the present application.
Detailed Description
In order to make the objects, technical solutions and advantages of the present application more apparent, the present application is described and illustrated below with reference to the accompanying drawings and examples. It should be understood that the specific embodiments described herein are for purposes of illustration only and are not intended to limit the present application. All other embodiments, which can be made by one of ordinary skill in the art without undue burden on the person of ordinary skill in the art based on the embodiments provided herein, are intended to be within the scope of the present application.
It is apparent that the drawings in the following description are only some examples or embodiments of the present application, and those of ordinary skill in the art may apply the present application to other similar situations according to these drawings without inventive effort. Moreover, it should be appreciated that while such a development effort might be complex and lengthy, it would nevertheless be a routine undertaking of design, fabrication, or manufacture for those of ordinary skill having the benefit of this disclosure, and thus is not to be construed as going beyond the scope of this disclosure.
Reference in the specification to "an embodiment" means that a particular feature, structure, or characteristic described in connection with the embodiment may be included in at least one embodiment of the application. The appearances of such phrases in various places in the specification are not necessarily all referring to the same embodiment, nor are separate or alternative embodiments mutually exclusive of other embodiments. It is to be expressly and implicitly understood by those of ordinary skill in the art that the embodiments described herein can be combined with other embodiments without conflict.
Unless defined otherwise, technical or scientific terms used herein should be given the ordinary meaning as understood by one of ordinary skill in the art to which this application belongs. Reference to "a," "an," "the," and similar terms herein do not denote a limitation of quantity, but rather denote the singular or plural. The terms "comprising," "including," "having," and any variations thereof, are intended to cover a non-exclusive inclusion; for example, a process, method, system, article, or apparatus that comprises a list of steps or modules (elements) is not limited to only those steps or elements but may include other steps or elements not expressly listed or inherent to such process, method, article, or apparatus. The terms "connected," "coupled," and the like in this application are not limited to physical or mechanical connections, but may include electrical connections, whether direct or indirect. The term "plurality" as used herein refers to two or more. "and/or" describes an association relationship of an association object, meaning that there may be three relationships, e.g., "a and/or B" may mean: a exists alone, A and B exist together, and B exists alone. The character "/" generally indicates that the context-dependent object is an "or" relationship. The terms "first," "second," "third," and the like, as used herein, are merely distinguishing between similar objects and not representing a particular ordering of objects.
The multi-person AR interaction method for offline space provided in the embodiments of the present application may be applied in an application environment as shown in fig. 1. Fig. 1 is a schematic diagram of the application environment of the multi-person AR interaction method for offline space according to an embodiment of the present application; as shown in fig. 1, a terminal 10 communicates with a server 11 through a network. The user sends a real picture of the photographed offline scene to the server 11 through the terminal 10, and the server 11 determines the positioning information and visual field information of the terminal 10 from the real picture; the terminal 10 then obtains and displays the virtual experience content corresponding to that positioning information and visual field information from the server 11. During this process, the user may issue various types of interaction instructions to the terminal 10, and the terminal 10 updates the original virtual experience content accordingly, for example moving a three-dimensional virtual object in the AR experience content according to the user's drag instruction. It should be noted that the server 11 may be further divided into a local server and a global server, where the local server is deployed in the offline space and the global server is deployed in a remote operation center; the server 11 may be an independent server or a server cluster formed by multiple servers, and the terminal 10 may be a smart phone, a tablet, AR glasses, or any other device that can provide an AR experience.
The embodiment provides a multi-person AR interaction method for an offline space, and fig. 2 is a flowchart of a multi-person AR interaction method for an offline space according to an embodiment of the present application, as shown in fig. 2, where the flowchart includes the following steps:
step S201, scene data of a target scene is acquired through a camera and laser scanning equipment, and a three-dimensional map of the target scene is constructed according to the scene data; the target scene can be entertainment scenes such as a recreation ground, a recreation hall and the like in an offline space, and the three-dimensional map can reflect entity distribution conditions in the target scene, including but not limited to a carrier form of point cloud and a model;
step S202, the AR experience equipment collects real pictures of the offline space through an internal camera unit and uploads the real pictures to a local server; the hardware part of the AR experience device comprises, but is not limited to, a camera, a processor, a display and a signal receiver, and the software part comprises, but is not limited to, a camera picture processing module, a visual positioning module, an AR rendering engine and the like. In this embodiment, the real picture is a picture reflecting the real situation of the offline space in the form of video or picture shot by the camera. Furthermore, the local server is in communication connection with the AR experience device through a local area network, and the uploading of the real picture to the local server is realized through the local area network;
step S203, the local server downloads the three-dimensional map and the virtual experience content from the global server, and provides positioning service for the AR experience device based on the three-dimensional map; the positioning service is realized by a positioning module on a local server, and the positioning information and the orientation information of the real picture in the three-dimensional map are obtained by putting the real picture into the three-dimensional map for comparison analysis; correspondingly, for the AR experience device, the positioning information and the orientation information are the position of the AR experience device in the online space and the visual field range of the camera thereof. It should be noted that, because the operation requirement of the visual positioning service on the hardware in the subsequent steps is higher, and the corresponding hardware price is also higher, the hardware for providing the visual positioning service needs to be deployed on each AR experience device with a larger cost, so that the local server is selectively deployed in the offline scene where the AR experience device is located, the positioning service is provided for the AR experience device through the local server, and the AR interaction of multiple people can be quickly and cost-effectively built. However, if only the local server is used to provide services for the AR experience device, there is a problem that the maintenance cost is high, for example, in the case that the three-dimensional scene map or the virtual experience content needs to be updated, a person needs to go to the corresponding offline space to perform update maintenance, which consumes a large labor cost. 
Therefore, network communication is established between the global server and the local server to enable remote management of the three-dimensional map and the virtual experience content; when an update is needed, it only has to be configured and pushed on the remote global server;
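The remote-update path described above can be sketched as follows. The class and method names (GlobalServer, LocalServer, attach, update_map) are illustrative assumptions, and a version counter stands in for the actual map and content payloads:

```python
class LocalServer:
    """Toy local server at one offline venue, holding the current map version."""
    def __init__(self):
        self.map_version = None

    def receive_map(self, version):
        self.map_version = version

class GlobalServer:
    """Toy global server pushing map updates to every attached local server."""
    def __init__(self):
        self.map_version = 1
        self.local_servers = []

    def attach(self, local_server):
        self.local_servers.append(local_server)
        local_server.receive_map(self.map_version)

    def update_map(self):
        # The operator updates the map once, remotely; every offline venue's
        # local server receives the new version without an on-site visit.
        self.map_version += 1
        for ls in self.local_servers:
            ls.receive_map(self.map_version)

g = GlobalServer()
venue1, venue2 = LocalServer(), LocalServer()
g.attach(venue1)
g.attach(venue2)
g.update_map()
print(venue1.map_version, venue2.map_version)   # both venues updated remotely
```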
step S204, the AR experience device obtains positioning information and visual field information through a positioning service, and obtains virtual experience content corresponding to the positioning information and the visual field information from a local server according to the positioning information and the visual field information and displays the virtual experience content; the virtual experience content is AR experience content associated with three-dimensional coordinates of objects in the offline space, is an actual scene design of the offline space combined by a designer, is stored in a global server and is pushed to a local server by the global server according to a certain rule or strategy; further, after the local server obtains the positioning information and the visual field information of the AR experience device, virtual experience content to be sent can be determined according to the positioning information and the visual field information and sent to the AR experience device for display; the virtual experience content includes, but is not limited to, 3D models, 3D animations, 3D dynamic effects, etc. For example, when the online space contains a real pool, the virtual experience content obtained by the AR experience device may be a whale jumping out of the pool;
in step S205, the AR experience device receives an interaction instruction of the user, updates the virtual experience content in the display interface according to the interaction instruction, and displays the virtual experience content, where the interaction instruction is generated by the user based on the display interface of the virtual experience content. The interactive instructions include, but are not limited to, clicking, triggering, dragging, rotating, collecting, or a combination thereof. The clicking refers to a specific position in space converted in a certain way, and a subsequent logic process of responding based on the position, and the clicking instruction can be a click of a mobile phone touch screen, a handle click of an AR eye, or a gesture click of an AR glasses. The trigger instruction is used for triggering the appearance sequence logic of the virtual content, for example, the click trigger is an action instruction for triggering new virtual content after clicking a certain preset position, the position trigger is an instruction for triggering new virtual content when the AR experience device moves to a certain specific position, and the range trigger is an instruction for triggering new virtual content when the AR experience device enters (or exits) a certain space range. Rotation refers to an instruction to perform a rotation operation on a virtual three-dimensional object. The instruction collection means that a certain preset condition is completed, and an instruction corresponding to the prop is obtained. After the user outputs the interactive instructions of various types, the AR experience equipment can acquire rich AR effects linked with the offline space, so that the immersion feeling and the use interestingness of the user during use are improved. 
It should be noted that, multiple AR experience devices in the same offline space may send operation instructions to the virtual experience content in the offline space at the same time, so as to implement multi-person and real-time AR interaction.
Through steps S201 to S205, compared with the single-point and desktop AR interaction methods in the related art, the embodiment of the present application positions the AR experience device based on the real picture captured by the device and the three-dimensional map on the local server, then determines the corresponding virtual experience content from the positioning information and pushes it to the AR experience device for display. Meanwhile, the user can send various interaction instructions to the AR experience device to update the virtual experience content. This embodiment solves the problem in the related art that AR interaction among multiple users cannot be carried out in the offline space, and achieves the technical effect of AR interaction among multiple users in the offline space.
In some embodiments, the local server is deployed in the offline space and the global server is deployed in a remote operation center; when the three-dimensional map needs to be updated, the global server receives an operation instruction of an operator, updates the three-dimensional map, and pushes it to the local server. It should be noted that, thanks to the local-service mode, operators at the offline scene can start the corresponding services according to actual demand, opening them when an experience is needed and closing them otherwise, so that resources are used reasonably. Further, the local server includes a computing unit and a storage unit, and may be a general-purpose computer device or a device customized for the AR experience. In addition, the global server effectively reduces operating costs: when multiple offline spaces are operated simultaneously, each map or virtual-content update would otherwise require travel to the corresponding offline space for maintenance, at considerable travel and labor cost; in this scheme the global server manages the map and the virtual experience content, and an update only needs to be configured and pushed remotely. Optionally, an account management subsystem may be provided on the local server and the global server.
It should be noted that the steps illustrated in the above-described flow or flow diagrams of the figures may be performed in a computer system, such as a set of computer-executable instructions, and that, although a logical order is illustrated in the flow diagrams, in some cases, the steps illustrated or described may be performed in an order other than that illustrated herein.
The embodiment also provides a multi-person AR interaction system for an offline space, which is used to implement the foregoing embodiments and preferred implementations; details already described are not repeated. As used below, the terms "module," "unit," "sub-unit," and the like may refer to a combination of software and/or hardware that implements a predetermined function. Although the apparatus described in the following embodiments is preferably implemented in software, an implementation in hardware, or in a combination of software and hardware, is also possible and contemplated.
Fig. 3 is a block diagram of a multi-person AR interactive system in offline space according to an embodiment of the present application, and as shown in fig. 3, the system includes a global server 31, a local server 32, and an AR experience device 33, where the global server 31 is communicatively connected to the local server 32 through a wireless or wired network, and the local server 32 is communicatively connected to the AR experience device 33 through a local area network;
the global server 31 includes a map management module and an AR content management module, wherein,
the map management module is used for receiving scene data of the offline space obtained through the camera and laser scanning, constructing a three-dimensional map of the offline space according to the scene data and storing the three-dimensional map;
the AR content management module is used for receiving virtual experience content and providing storage, wherein the virtual experience content is AR experience content superimposed in the offline space by a designer based on the scene distribution of the offline space;
the local server 32 includes a location service module for downloading a three-dimensional map from the global server 31 and providing location services to the AR experience device 33 based on the three-dimensional map, and a content service module for receiving virtual experience content from the global server and providing storage, and pushing the virtual experience content to the AR experience device;
the AR experience device 33 is configured to acquire positioning information and visual field information through a positioning service, and acquire virtual experience content corresponding to the positioning information and visual field information from the local server 32 according to the positioning information and the visual field information and display the virtual experience content;
the AR experience device 33 is configured to receive an interaction instruction of a user, update virtual experience content in a display interface according to the interaction instruction, and display the virtual experience content, where the interaction instruction is generated by the user based on the display interface of the virtual experience content.
In some of these embodiments, fig. 4 is a schematic diagram of a multi-person AR interactive system for offline space according to an embodiment of the present application. As shown in fig. 4, the global server 31 is connected to a plurality of local servers 32 for storing and managing all three-dimensional maps and virtual experience contents, and pushing the three-dimensional maps and virtual experience contents to the local servers 32. The local server 32 is connected to the multiple AR experience devices 33, and provides a positioning service and a content service for the AR experience devices 33, where the positioning service module is configured to receive the real image uploaded by the AR experience devices 33, traverse the three-dimensional map through a preset positioning algorithm, obtain positioning information and view information of the real image in the three-dimensional map, and send the positioning information and the view information to the AR experience devices 33.
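How the positioning service might resolve a real picture into positioning and view information can be sketched as a nearest-keyframe lookup. This is only an illustrative stand-in for the "preset positioning algorithm"; the descriptors and poses below are invented:

```python
import math

# Hypothetical map: each keyframe pairs a feature descriptor with a known pose.
keyframes = [
    ((0.1, 0.9), {"position": (0, 0), "view": "north"}),
    ((0.8, 0.2), {"position": (5, 3), "view": "east"}),
    ((0.5, 0.5), {"position": (2, 7), "view": "south"}),
]

def localize(query_descriptor):
    """Return the pose of the keyframe most similar to the query image.

    Stands in for the positioning algorithm that traverses the
    three-dimensional map; real systems use robust feature matching
    (e.g. local descriptors plus PnP pose estimation), not a plain
    nearest-neighbour search in two dimensions.
    """
    best = min(keyframes, key=lambda kf: math.dist(kf[0], query_descriptor))
    return best[1]

pose = localize((0.75, 0.25))
print(pose)  # {'position': (5, 3), 'view': 'east'}
```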
The application further provides an AR interaction trolley. Fig. 5 is a schematic diagram of the internal structure of the AR interaction trolley according to an embodiment of the application; as shown in fig. 5, the trolley includes: an operation unit 51, a vision unit 52, a display unit 53, and a communication operation unit 54;
the operation unit 51 is used for providing power to run the trolley in the offline scene;
the vision unit 52 is used for collecting a real picture of the offline scene and uploading the real picture to the local server to obtain positioning information and visual field information of the trolley;
the communication operation unit 54 is configured to obtain, from the local server according to the positioning information and the visual field information, virtual experience content corresponding to the positioning information and the visual field information, where the virtual experience content is AR experience content added to the offline scene, and the three-dimensional map is constructed from scene data of the offline scene after that data is collected by a camera and a laser scanning device;
the display unit 53 is configured to display the virtual experience content, where, when the communication operation unit 54 receives an interaction instruction from a user, the display unit 53 presents the virtual experience content updated according to the interaction instruction, and the interaction instruction is generated by the user based on the display interface of the virtual experience content; optionally, the display unit 53 may be disposed at the front-windshield position of the trolley.
In the above embodiment, data is first collected in the physical amusement scene by a camera and a laser scanning device, and a three-dimensional map of the scene is then constructed from the collected data by an algorithm. Next, a local server is installed in the physical amusement scene, and a wireless local area network is set up for subsequent communication with the AR trolley. It should be noted that, because the visual positioning algorithm for the AR interaction trolley requires substantial computation, a computing unit with high computing capability is selected for the local server. Finally, the AR interaction trolley is placed in the physical amusement park and connected to the local server through the local area network: the trolley acquires a real picture of the amusement scene through the vision unit 52 and sends it to the local server; the local server computes the positioning information of the trolley from the three-dimensional map and the real picture and returns it to the trolley; the trolley then displays virtual AR content superimposed on the real picture through the display unit 53 for the user to experience, and the user can further send various interaction instructions through the display unit 53 to interact with the AR experience content. It should be noted that positioning of the AR interaction trolley in the amusement scene is accomplished jointly by the trolley's own lightweight computation and the high computing power of the local server, and once positioning is complete, the trolleys communicate with one another through the local server, realizing interactive amusement functions between different trolley terminals.
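The division of labour described above, where lightweight on-board computation is combined with occasional high-compute fixes from the local server, can be sketched as follows (all class and function names are hypothetical):

```python
class CartClient:
    """Illustrative split of the localization work: the trolley does
    lightweight dead-reckoning between the heavier server-side visual
    fixes computed over the LAN."""

    def __init__(self, server_localize):
        self.server_localize = server_localize  # high-compute, on the server
        self.pose = (0.0, 0.0)

    def on_server_fix(self, frame):
        # Occasional accurate fix computed by the local server from a frame.
        self.pose = self.server_localize(frame)

    def on_wheel_odometry(self, dx, dy):
        # Lightweight on-board update between server fixes.
        x, y = self.pose
        self.pose = (x + dx, y + dy)

def fake_server_localize(frame):
    return frame  # stand-in: pretend the frame encodes its own pose

cart = CartClient(fake_server_localize)
cart.on_server_fix((10.0, 5.0))
cart.on_wheel_odometry(0.5, -0.5)
print(cart.pose)  # (10.5, 4.5)
```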
The above scheme solves the problem that an AR interaction experience cannot be shared among multiple users in the offline space, and improves the users' experience.
In some embodiments, fig. 6 is a communication schematic diagram of the AR interaction trolley according to an embodiment of the present application. As shown in fig. 6, the global server is set up in a remote center and connected to the local server through the public network; personnel at the remote center upload the designed virtual experience content and the three-dimensional scene map to the global server, the global server pushes them to the local server according to a preset policy, and the local server can also dynamically fetch the virtual experience content and the three-dimensional scene map from the global server. It should be noted that the global server mainly manages the map and content resources of multiple offline spaces centrally and provides global services, such as account management and network management. Further, the local server sends the specific virtual experience content over the local area network to the AR interaction trolley driven by the user; various kinds of virtual experience content superimposed on the real scene are displayed on the trolley's front windshield, and at the same time the user can engage in many types of interaction with multiple other users, which greatly improves the immersion of the user experience.
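Multi-user interaction through the local server amounts to relaying each trolley's interaction instruction to the other trolleys in the same offline space. A toy relay, with invented names, might look like:

```python
class LocalRelay:
    """Illustrative relay: the local server forwards each user's
    interaction instruction to every other AR device on the LAN, so
    all participants see the same updated content."""

    def __init__(self):
        self.devices = []

    def join(self, device):
        self.devices.append(device)

    def broadcast(self, sender, instruction):
        # Forward the instruction to everyone except the sender.
        for d in self.devices:
            if d is not sender:
                d.inbox.append(instruction)

class Device:
    def __init__(self, relay):
        self.inbox = []
        self.relay = relay
        relay.join(self)

    def send(self, instruction):
        self.relay.broadcast(self, instruction)

relay = LocalRelay()
a, b, c = Device(relay), Device(relay), Device(relay)
a.send("spawn firework")
print(b.inbox, c.inbox)  # ['spawn firework'] ['spawn firework']
```

A real deployment would carry these instructions over the wireless local area network and resolve ordering conflicts; the in-process lists above only illustrate the fan-out.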
The above-described respective modules may be functional modules or program modules, and may be implemented by software or hardware. For modules implemented in hardware, the various modules described above may be located in the same processor; or the above modules may be located in different processors in any combination.
In addition, in combination with the multi-person AR interaction method in offline space in the above embodiment, the embodiments of the present application may provide a storage medium for implementation. The storage medium has a computer program stored thereon; the computer program, when executed by a processor, implements the multi-person AR interaction method of any of the offline spaces of the above embodiments.
In one embodiment, a computer device is provided, which may be a terminal. The computer device includes a processor, a memory, a network interface, a display screen, and an input device connected by a system bus. The processor of the computer device provides computing and control capabilities. The memory of the computer device includes a non-volatile storage medium and an internal memory. The non-volatile storage medium stores an operating system and a computer program. The internal memory provides an environment for the operating system and the computer program in the non-volatile storage medium to run. The network interface of the computer device is used to communicate with an external terminal through a network connection. The computer program, when executed by a processor, implements the multi-person AR interaction method for an offline space. The display screen of the computer device may be a liquid crystal display or an electronic-ink display, and the input device may be a touch layer covering the display screen, keys, a trackball, or a touchpad provided on the housing of the computer device, or an external keyboard, touchpad, mouse, or the like.
In one embodiment, an electronic device, which may be a server, is provided; fig. 7 is a schematic diagram of its internal structure according to an embodiment of the present application. As shown in fig. 7, the electronic device includes a processor, a network interface, an internal memory, and a non-volatile memory connected by an internal bus, where the non-volatile memory stores an operating system, computer programs, and a database. The processor provides computing and control capability, the network interface communicates with external terminals through a network connection, the internal memory provides an environment for the operating system and computer programs to run, the computer program is executed by the processor to implement the multi-person AR interaction method for an offline space, and the database stores data.
It will be appreciated by those skilled in the art that the structure shown in fig. 7 is merely a block diagram of a portion of the structure associated with the present application and is not limiting of the electronic device to which the present application is applied, and that a particular electronic device may include more or fewer components than shown, or may combine certain components, or have a different arrangement of components.
Those skilled in the art will appreciate that implementing all or part of the above-described methods may be accomplished by a computer program stored on a non-transitory computer-readable storage medium which, when executed, may include the steps of the method embodiments described above. Any reference to memory, storage, database, or other medium used in the various embodiments provided herein may include non-volatile and/or volatile memory. Non-volatile memory can include read-only memory (ROM), programmable ROM (PROM), electrically programmable ROM (EPROM), electrically erasable programmable ROM (EEPROM), or flash memory. Volatile memory can include random access memory (RAM) or external cache memory. By way of illustration and not limitation, RAM is available in a variety of forms, such as static RAM (SRAM), dynamic RAM (DRAM), synchronous DRAM (SDRAM), double data rate SDRAM (DDR SDRAM), enhanced SDRAM (ESDRAM), synchlink DRAM (SLDRAM), Rambus direct RAM (RDRAM), direct Rambus dynamic RAM (DRDRAM), and Rambus dynamic RAM (RDRAM), among others.
It should be understood by those skilled in the art that the technical features of the above embodiments may be combined in any manner, and for brevity, all of the possible combinations of the technical features of the above embodiments are not described, however, they should be considered as being within the scope of the description provided herein, as long as there is no contradiction between the combinations of the technical features.
The foregoing embodiments express only several implementations of the present application, and although their description is relatively specific and detailed, they should not therefore be construed as limiting the scope of the patent. It should be noted that those of ordinary skill in the art can make various modifications and improvements without departing from the concept of the present application, and these all fall within the scope of protection of the present application. Accordingly, the scope of protection of this patent shall be determined by the appended claims.

Claims (10)

1. A method of multi-person AR interaction in an off-line space, the method comprising:
the AR experience equipment collects real pictures of the offline space through an internal camera unit and uploads the real pictures to a local server;
the local server downloads a three-dimensional map and virtual experience content from a global server and provides positioning service for the AR experience equipment based on the three-dimensional map, wherein the local server is deployed in the offline space, and the global server is deployed in a remote operation center;
the AR experience equipment acquires positioning information and visual field information through the positioning service, and acquires virtual experience content corresponding to the positioning information and the visual field information from the local server according to the positioning information and the visual field information and displays the virtual experience content;
and the plurality of AR experience devices in the offline space respectively receive interaction instructions of users, and update and display the virtual experience content in a display interface according to the interaction instructions, wherein the interaction instructions are generated by the users based on the display interface of the virtual experience content.
2. The method according to claim 1, wherein before the AR experience device acquires a real picture of the offline space through an internal camera unit, the method comprises:
acquiring scene data of a target scene through a camera and laser scanning equipment, and constructing a three-dimensional map of the target scene according to the scene data, wherein the target scene comprises the offline space;
and uploading the three-dimensional map to the global server.
3. The method of claim 1, wherein the local server providing location services to the AR experience device based on the three-dimensional map comprises:
the local server receives the real picture uploaded by the AR experience equipment, traverses the three-dimensional map through a preset positioning algorithm, acquires the positioning information and the visual field information corresponding to the real picture in the three-dimensional map, and transmits the positioning information and the visual field information to the AR experience equipment.
4. The method according to claim 1, wherein in the case that the three-dimensional map needs to be updated, the global server receives an operation instruction of an operator, updates the three-dimensional map, and pushes the updated three-dimensional map to the local server.
5. A multi-person AR interactive system for an offline space, the system comprising: the system comprises a global server, a local server and AR experience equipment, wherein the global server is in communication connection with the local server through a network, and the local server is in communication connection with the AR experience equipment through a local area network;
the global server comprises a map management module and an AR content management module, wherein,
the map management module is used for receiving scene data of the offline space acquired by the camera and the laser scanning equipment, constructing a three-dimensional map of the offline space according to the scene data and storing the three-dimensional map;
the AR content management module is used for receiving virtual experience content and providing storage, wherein the virtual experience content is AR experience content superimposed in the offline space by a designer based on the scene distribution of the offline space;
the local server comprises a positioning service module and a content service module, wherein the positioning service module is used for downloading a three-dimensional map from the global server and providing positioning services for the AR experience equipment based on the three-dimensional map, the content service module is used for receiving the virtual experience content from the global server and providing storage and pushing the virtual experience content to the AR experience equipment, the local server is deployed in the offline space, and the global server is deployed in a remote operation center;
the AR experience equipment is used for acquiring positioning information and visual field information through the positioning service, and acquiring virtual experience content corresponding to the positioning information and the visual field information from the local server according to the positioning information and the visual field information and displaying the virtual experience content;
the plurality of AR experience devices in the offline space are used for respectively receiving interaction instructions of users, updating the virtual experience content in the display interface according to the interaction instructions and displaying the virtual experience content, wherein the interaction instructions are generated by the users based on the display interface of the virtual experience content.
6. The system of claim 5, wherein the positioning service module is configured to receive a real picture uploaded by the AR experience device, traverse the three-dimensional map through a preset positioning algorithm, obtain the positioning information and the view information corresponding to the real picture in the three-dimensional map, and send the positioning information and the view information to the AR experience device.
7. An AR interaction cart for multi-person AR interaction according to any one of claims 1 to 6, wherein the cart comprises: an operation unit, a visual unit, a display unit, and a communication operation unit;
the operation unit is used for providing power to run the cart in the offline scene;
the visual unit is used for collecting a real picture of the offline scene and uploading the real picture to a local server to obtain positioning information and visual field information of the cart;
the communication operation unit is used for acquiring, from the local server according to the positioning information and the visual field information, virtual experience content corresponding to the positioning information and the visual field information, wherein the virtual experience content is AR experience content added in the offline space, and the three-dimensional map is constructed from scene data of the offline scene after the scene data is collected by a camera and a laser scanning device;
the display unit is used for displaying the virtual experience content, wherein the display unit presents the virtual experience content updated according to the interaction instruction under the condition that the communication operation unit receives the interaction instruction of the user, and the interaction instruction is generated by the user based on a display interface of the virtual experience content.
8. The cart of claim 7, wherein the communication operation unit is further configured to establish a network connection with the local server and, via the local server, establish interactive communication among a plurality of the carts, so as to enable communication interactions between the plurality of carts.
9. A computer device comprising a memory, a processor and a computer program stored on the memory and executable on the processor, characterized in that the processor implements a multi-person AR interaction method of an offline space according to any of claims 1 to 4 when executing the computer program.
10. A computer readable storage medium, on which a computer program is stored, characterized in that the program, when being executed by a processor, implements a multi-person AR interaction method of an offline space according to any of claims 1 to 4.
CN202110522697.9A 2021-05-13 2021-05-13 Multi-person AR interaction method and system for offline space Active CN113398577B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110522697.9A CN113398577B (en) 2021-05-13 2021-05-13 Multi-person AR interaction method and system for offline space


Publications (2)

Publication Number Publication Date
CN113398577A CN113398577A (en) 2021-09-17
CN113398577B true CN113398577B (en) 2024-04-09

Family

ID=77678553

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110522697.9A Active CN113398577B (en) 2021-05-13 2021-05-13 Multi-person AR interaction method and system for offline space

Country Status (1)

Country Link
CN (1) CN113398577B (en)

Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106648115A (en) * 2017-01-13 2017-05-10 广州大学 Trolley Kinect fighting device and control method based on AR virtual control
CN106984043A (en) * 2017-03-24 2017-07-28 武汉秀宝软件有限公司 The method of data synchronization and system of a kind of many people's battle games
CN108114471A (en) * 2017-12-04 2018-06-05 广州市动景计算机科技有限公司 AR method for processing business, device, server and mobile terminal
WO2018098744A1 (en) * 2016-11-30 2018-06-07 深圳益强信息科技有限公司 Data processing method and system based on virtual driving
CN110892410A (en) * 2017-07-07 2020-03-17 奈安蒂克公司 Cloud-enabled augmented reality
CN111970557A (en) * 2020-09-01 2020-11-20 深圳市慧鲤科技有限公司 Image display method, image display device, electronic device, and storage medium
CN112148197A (en) * 2020-09-23 2020-12-29 北京市商汤科技开发有限公司 Augmented reality AR interaction method and device, electronic equipment and storage medium
CN112785700A (en) * 2019-11-08 2021-05-11 华为技术有限公司 Virtual object display method, global map updating method and device

Family Cites Families (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP5471626B2 (en) * 2010-03-09 2014-04-16 ソニー株式会社 Information processing apparatus, map update method, program, and information processing system
US20140267234A1 (en) * 2013-03-15 2014-09-18 Anselm Hook Generation and Sharing Coordinate System Between Users on Mobile
US9607437B2 (en) * 2013-10-04 2017-03-28 Qualcomm Incorporated Generating augmented reality content for unknown objects




Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant