CN108540542B - Mobile augmented reality system and display method - Google Patents


Info

Publication number
CN108540542B
Authority
CN
China
Prior art keywords
digital
mobile terminal
image
identification information
cloud server
Prior art date
Legal status
Active
Application number
CN201810253121.5A
Other languages
Chinese (zh)
Other versions
CN108540542A (en)
Inventor
余日季
张立明
Current Assignee
Hubei University
Original Assignee
Hubei University
Priority date
Filing date
Publication date
Application filed by Hubei University filed Critical Hubei University
Priority to CN201810253121.5A priority Critical patent/CN108540542B/en
Publication of CN108540542A publication Critical patent/CN108540542A/en
Application granted granted Critical
Publication of CN108540542B publication Critical patent/CN108540542B/en

Classifications

    • H - ELECTRICITY
    • H04 - ELECTRIC COMMUNICATION TECHNIQUE
    • H04L - TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L 67/00 - Network arrangements or protocols for supporting network services or applications
    • H04L 67/01 - Protocols
    • H04L 67/12 - Protocols specially adapted for proprietary or special-purpose networking environments, e.g. medical networks, sensor networks, networks in vehicles or remote metering networks
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 19/00 - Manipulating 3D models or images for computer graphics
    • G06T 19/006 - Mixed reality
    • H - ELECTRICITY
    • H04 - ELECTRIC COMMUNICATION TECHNIQUE
    • H04N - PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 5/00 - Details of television systems
    • H04N 5/222 - Studio circuitry; Studio devices; Studio equipment
    • H04N 5/262 - Studio circuits, e.g. for mixing, switching-over, change of character of image, other special effects; Cameras specially adapted for the electronic generation of special effects
    • H04N 5/265 - Mixing

Landscapes

  • Engineering & Computer Science (AREA)
  • Signal Processing (AREA)
  • Computer Hardware Design (AREA)
  • General Engineering & Computer Science (AREA)
  • Medical Informatics (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Computing Systems (AREA)
  • Computer Graphics (AREA)
  • Health & Medical Sciences (AREA)
  • General Health & Medical Sciences (AREA)
  • Software Systems (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Multimedia (AREA)
  • Processing Or Creating Images (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The invention discloses a mobile augmented reality system comprising a mobile terminal, a wireless sensor, one or more wireless network nodes and a cloud server. The wireless sensor is used for receiving signals sent by the one or more wireless network nodes and forwarding them to the mobile terminal; the mobile terminal is used for capturing images and sending the captured images and the signals to the cloud server; and the cloud server is used for constructing digital culture content corresponding to identification information, fusing and superimposing the digital culture content corresponding to the identification information in the signal onto the image after receiving the image and the signal from the mobile terminal, and returning the fused, superimposed image to the mobile terminal. Correspondingly, the invention also provides an augmented reality display method. The system and method address the low utilization rate and poor user experience of displaying digital culture content with augmented reality technology in the prior art.

Description

Mobile augmented reality system and display method
Technical Field
The invention belongs to the technical field of Augmented Reality (AR), and particularly relates to a mobile Augmented Reality system and a display method.
Background
Augmented Reality (AR) is a research hotspot in computer graphics that has developed in recent years. It superimposes computer-generated virtual digital information onto a real environment, integrates the two, and presents the virtual-real fused scene through a display device, so that the real environment appears information-enhanced. With this enhanced presentation, an experiencer perceives not only the information of the real world but also virtual digital information beyond it, which effectively strengthens the experiencer's perception of the real-world environment. With the development of computer science and technology, AR research has also shifted from early applications in high-end fields such as the military and aviation to application development in many fields such as education, entertainment, culture and tourism.
In the prior art, AR applications for tourism and culture keep emerging, such as AR tour guides for scenic spots and AR three-dimensional displays for smart exhibition areas, delivered in helmet-mounted and mobile-terminal forms, so that users experience the reproduction and storytelling of digital culture content. However, most existing AR applications perform AR display by superimposing three-dimensional digital culture content on the basis of a Marker. This implementation has clear limitations: it is difficult to perform AR recognition and display for the specific material culture carriers of each locality, so the utilization rate is low and the experience is not high.
Disclosure of Invention
The invention provides a Mobile Augmented Reality (MAR) system, which solves the problems of low utilization rate and poor user experience when displaying digital culture content with AR in the prior art.
In order to achieve the above object, an embodiment of the present invention provides a mobile augmented reality MAR system, which includes a mobile terminal, a wireless sensor, one or more wireless network nodes, and a cloud server, wherein,
the wireless sensor is used for receiving signals sent by the one or more wireless network nodes and sending the signals to the mobile terminal, and the signals comprise identification information corresponding to the material culture carrier;
the mobile terminal is used for acquiring the image of the material culture carrier, receiving the signal sent by the wireless sensor and sending the acquired image and the signal to the cloud server;
the cloud server is used for constructing digital culture contents corresponding to the identification information, fusing and superposing the digital culture contents corresponding to the identification information in the signal on the image after receiving the image and the signal sent by the mobile terminal, and sending the fused and superposed image to the mobile terminal;
and the mobile terminal is also used for receiving and displaying the fused and superposed images sent by the cloud server.
Optionally, the cloud server is configured to fuse and overlay digital culture content corresponding to the identification information in the signal onto the image, and includes:
carrying out virtual-real registration processing on the digital culture content and the acquired image by adopting a three-dimensional tracking registration technology;
rendering the digital culture content in real time, superposing and fusing the digital culture content and the material culture carrier image according to the mapping relation of the digital culture content in a real scene after the virtual-real registration processing,
the digital culture content comprises a digital three-dimensional model and one or more of digital animation, digital film and television, digital sound, digital pictures and digital texts.
Optionally, the cloud server is further configured to:
and manufacturing a digital three-dimensional model corresponding to the identification by adopting a computer graphics image CG technology.
Optionally, the material culture carrier includes a cultural creative product, a museum exhibit, or a sight spot, and the identification information includes one or more of location information of an area where the mobile terminal is located, identification information of the cultural creative product, identification information of the museum exhibit, or identification information of the sight spot.
Optionally, the cloud server is further configured to:
when the material culture carrier is a museum exhibit and the identification information is the information of the exhibit in the museum, constructing an identification database, and performing dynamic identification switching, identification and matching on the image according to the identification database;
acquiring a posture positioning coordinate in an image acquired by the mobile terminal, and performing matching detection on the exhibit;
and determining the posture and position information of the mobile terminal, and performing virtual-real fusion on the digital culture content corresponding to the exhibit and the exhibit image.
Optionally, the wireless sensor is a Zigbee wireless transceiver module, and the wireless sensor is independent or built in the mobile terminal.
The embodiment of the invention also provides a method for displaying the mobile augmented reality, which comprises the following steps:
the method comprises the steps that a wireless sensor receives signals sent by one or more wireless network nodes and sends the signals to a mobile terminal, wherein the signals comprise identification information corresponding to a material culture carrier;
the mobile terminal collects the image of the material culture carrier, receives the signal sent by the wireless sensor and sends the collected image and the signal to a cloud server;
the cloud server constructs digital culture contents corresponding to the identification information, fuses and superposes the digital culture contents corresponding to the identification information in the signal on the image after receiving the image and the signal sent by the mobile terminal, and sends the fused and superposed image to the mobile terminal;
and the mobile terminal receives and displays the fused and superposed image sent by the cloud server.
Optionally, the fusing and superimposing, by the cloud server, digital culture content corresponding to the identification information in the signal onto the image includes:
carrying out virtual-real registration processing on the digital culture content and the acquired image by adopting a three-dimensional tracking registration technology;
rendering the digital culture content in real time, superposing and fusing the digital culture content and the material culture carrier image according to the mapping relation of the digital culture content in a real scene after the virtual-real registration processing,
the digital culture content comprises a digital three-dimensional model and one or more of digital animation, digital film and television, digital sound, digital pictures and digital texts.
Optionally, the method further comprises:
and the cloud server adopts a computer graphics image CG technology to manufacture a digital three-dimensional model corresponding to the identifier.
Optionally, the material culture carrier includes a cultural creative product, a museum exhibit, or a sight spot, and the identification information includes one or more of location information of an area where the mobile terminal is located, identification information of the cultural creative product, identification information of the museum exhibit, or identification information of the sight spot.
The method and the system of the embodiment of the invention have the following advantages:
by combining the AR technology and the wireless sensing communication technology, the wireless network node sends the identification in the current area range to the wireless sensor and forwards the identification to the mobile terminal, and when a user uses the mobile terminal to collect the image of the material culture carrier, the cloud server superposes the digital culture content corresponding to the identification information at the specific position of the image according to the identified image and the identification information, so that AR virtual-real superposition is realized. By adopting the scheme provided by the embodiment of the invention, not only can AR virtual and real superposition be more accurately carried out, but also higher user experience is brought, and commercial success is easy to realize.
Drawings
FIG. 1 is a block diagram of the MAR system components in an embodiment of the present invention;
FIG. 2 is a flow chart of a three-dimensional registration tracking technique according to an embodiment of the present invention;
fig. 3 is a flowchart illustrating a method for displaying by the mobile augmented reality system according to an embodiment of the present invention.
Detailed Description
In order to make the objects, technical solutions and advantages of the present invention clearer, the present invention is described in further detail below with reference to the accompanying drawings and embodiments. It should be understood that the specific embodiments described herein are merely illustrative of the invention and are not intended to limit the invention. In addition, the technical features involved in the embodiments of the present invention described below may be combined with each other as long as they do not conflict with each other.
Example one
To achieve the above object, as shown in fig. 1, the present invention provides a mobile augmented reality MAR system 11, the MAR system 11 comprising a mobile terminal 12, a wireless sensor 13, one or more wireless network nodes 14 and a cloud server 15, wherein,
the wireless sensor 13 is configured to receive a signal sent by the one or more wireless network nodes 14, and send the signal to the mobile terminal 12, where the signal includes identification information corresponding to a material culture carrier;
it should be noted that the wireless sensor includes, but is not limited to, Zigbee, WIFI, bluetooth, NFC, and other wireless sensors. In the embodiment of the present invention, the wireless sensors used in the embodiments are different according to different scenes, for example, in a short-distance (within 10 meters) communication range, NFC and bluetooth may be used to complete data transmission and reception, and in a medium-distance (10-100 meters) communication range, Zigbee, WIFI, and other communication methods are used.
In the embodiment of the present invention, a Zigbee module is preferably used as the wireless sensor. The Zigbee module may be integrated inside the mobile terminal, or may be independent of the mobile terminal and exchange signaling with it through a communication protocol. ZigBee is a low-power-consumption local area network protocol based on the IEEE 802.15.4 standard. According to the international standard, ZigBee is a short-range, low-power wireless communication technology characterized by short distance, low complexity, self-organization, low power consumption and low data rate. It is mainly suitable for the fields of automatic control and remote control and can be embedded in various devices.
The wireless network node 14 can be fixedly placed at a specific position, the wireless network node 14 is used for communicating with a wireless sensor, taking Zigbee as an example, the wireless network node 14 acquires Zigbee information within a communication range, adopts a Zigbee transmission protocol to handshake with a Zigbee module, completes signaling interaction, and sends identification information stored inside and information of a material culture carrier at the current wireless network node position to the Zigbee module.
The identification information comprises position information of the current area of the mobile terminal, after the wireless network node sends the position information to the wireless sensor, the wireless sensor sends the position information to the mobile terminal and forwards the position information to the cloud server, and at the moment, the cloud server can determine digital culture content corresponding to the position information according to the corresponding relation between the position information and the digital culture content and superimpose the digital culture content in the image. Different from the traditional combination mode of GPS and AR, the invention has the advantages of higher position precision and higher recognition rate by combining the wireless sensing and AR.
Alternatively, the identifier may be a label, specifically one indicating intangible cultural heritage (ICH) or material cultural heritage. Material cultural heritage refers to actual objects such as natural landscapes, historic sites, souvenirs, exhibits, cultural relics, calligraphy and paintings. Intangible cultural heritage, by contrast, is an extremely important form of cultural heritage and a cultural gene representing the diversity of traditional culture with national characteristics; it is intangible and invisible, and its content can be expressed through digital culture content. For ICH, the label determines the kind, content and presentation form of the ICH. The identifier may also act as a trigger mechanism: once it is received, the corresponding AR element (for example, the digital culture content mentioned later) is loaded onto the captured image. In addition, the identifier can be a Marker, which falls into two categories: a black-and-white artificial marker (Marker) and a natural-image-feature identifier without an artificial mark (Markerless). Here the Marker carries patterns and graphics of traditional culture element symbols designed according to the culture content and serves as the target object for subsequent three-dimensional tracking registration. In the embodiment of the invention, the identification information can be one or more of identification information of a cultural creative product, identification information of an exhibit inside a museum, or identification information of a scenic spot. The Marker identification information can also be a string of digital codes in a specific format stored in the wireless network nodes; when the wireless sensor interacts with a wireless network node, the node packages the digital code in a signal format and sends it to the wireless sensor in message form, and the wireless sensor decodes the signal, extracts the identification information and forwards it to the mobile terminal. Optionally, the identifier may also be a QR code or the like; the specific representation of the identifier is not limited by the invention.
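For illustration only, the following minimal sketch (Python, with an assumed payload layout that the patent does not specify) shows how a wireless sensor might decode such a message and extract the identification digital code before forwarding it to the mobile terminal; the field layout and names are assumptions, not the patent's protocol.

```python
# Hypothetical sketch: unpack an identification digital code from a node's message.
# Assumed layout: 2-byte node id, 1-byte code length, then an ASCII identification code.
import struct

def decode_node_message(payload: bytes) -> dict:
    """Extract the node id and identification digital code from a raw payload."""
    node_id, code_len = struct.unpack_from(">HB", payload, 0)
    code = payload[3:3 + code_len].decode("ascii")
    return {"node_id": node_id, "identification_code": code}

# Example: node 0x0012 announcing a hypothetical identification code "ICH-0425"
raw = struct.pack(">HB", 0x0012, 8) + b"ICH-0425"
print(decode_node_message(raw))  # {'node_id': 18, 'identification_code': 'ICH-0425'}
```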
The mobile terminal 12 is configured to collect an image of the material culture carrier, receive a signal sent by the wireless sensor 13, and send the collected image and the signal to the cloud server 15;
the mobile terminal 12 may be a conventional mobile terminal, a personal computer, a smart phone, a tablet computer, or the like. The mobile terminal 12 is provided with a single camera or dual cameras for capturing images of the material culture carrier.
In the embodiment of the invention, the material culture carriers can be divided into three types: first, handicraft products embodying traditional skills, traditional arts and crafts, folk literature and the like, as well as souvenirs and ethnic-feature products; second, museum exhibits, including various cultural relics, calligraphy and paintings, models of classical architecture, and the like; and third, cultural tourist attractions such as eco-museums, folk villages and ethnic-style gardens, together with the ethnic or regional feature products within them, including ethnic costumes, local specialties and various garden landscapes (flowers, plants, trees, marble, parks and the like). A material culture carrier is visible and tangible, which is essentially different from intangible culture content; the technical scheme of the embodiment of the invention loads digital culture content that normally cannot be seen or heard onto the image of this visible, tangible carrier, so that the user can truly feel its existence.
The cloud server 15 is configured to construct digital culture content corresponding to the identification information, fuse and superimpose the digital culture content corresponding to the identification information in the signal onto the image after receiving the image and the signal sent by the mobile terminal, and send the fused and superimposed image to the mobile terminal 12;
the cloud server 15 fuses and superimposes the digital culture content corresponding to the identification information in the signal on the image, and specifically, the method may include:
carrying out virtual-real registration processing on the digital culture content and the acquired image by adopting a three-dimensional tracking registration technology;
rendering the digital culture content in real time, superposing and fusing the digital culture content and the material culture carrier image according to the mapping relation of the digital culture content in a real scene after the virtual-real registration processing,
the digital culture content comprises a digital three-dimensional model and one or more of digital animation, digital film and television, digital sound, digital pictures and digital text. Optionally, the cloud server 15 may use computer graphics (CG) technology to produce the digital three-dimensional model corresponding to the identifier; this technology belongs to the prior art and is not described again here.
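As a rough illustration of how such content might be organized on the cloud server, the sketch below uses an assumed data model; the field names and the registry are hypothetical, not the patent's implementation.

```python
# Assumed data model: map an identification code to digital culture content assets.
from dataclasses import dataclass, field
from typing import List, Optional

@dataclass
class DigitalCultureContent:
    identification: str                      # Marker code / exhibit id / location id
    model_uri: str                           # digital three-dimensional model (e.g. glTF)
    animation_uris: List[str] = field(default_factory=list)
    video_uri: Optional[str] = None
    audio_uri: Optional[str] = None
    image_uris: List[str] = field(default_factory=list)
    text: Optional[str] = None

# Hypothetical registry the server could query with the identification code
CONTENT_DB = {
    "ICH-0425": DigitalCultureContent(
        identification="ICH-0425",
        model_uri="models/opera_mask.glb",
        audio_uri="audio/opera_aria.ogg",
        text="Intangible heritage: regional opera performance",
    )
}
```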
The three-dimensional tracking registration means that in the operation process of a system, image characteristics of a real environment need to be detected and identified in real time, and the position and the direction of a camera in a three-dimensional space relative to the real environment are determined according to the acquired characteristics of a video image frame of the real environment, so that a virtual object is accurately placed at the correct position in the real scene. To realize seamless fusion of a virtual object and a real scene, a three-dimensional tracking registration technology is the most core key technology of an AR system.
The implementation of three-dimensional tracking Registration technology (3D Registration) can be generally divided into three approaches: computer vision based three-dimensional tracking registration techniques, hardware device based three-dimensional tracking registration techniques, and hybrid three-dimensional tracking registration techniques.
The three-dimensional tracking registration technology based on computer vision: the position and posture of the camera are determined mainly by recognizing image features in the real scene, and the registration methods can be divided into those based on an artificial marker (Marker) and those based on natural image features. The system acquires the internal and external parameters of the camera by identifying, in real time, parallel lines, vertical lines, planar objects, corner points, texture features and the like of markers in the real scene. The Marker-based method offers distinct scene features, strong robustness and fast operation, but arranging black-and-white markers in the scene in advance affects the appearance of the scene; the method based on natural image features overcomes this drawback of artificial markers.
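A minimal sketch of the natural-image-feature branch follows, using OpenCV ORB features and homography estimation as one possible realization; the patent does not prescribe a particular library or feature detector, so this is an assumption for illustration.

```python
# Sketch: locate a reference (target object) image inside a camera frame
# using ORB features, brute-force matching and RANSAC homography.
import cv2
import numpy as np

orb = cv2.ORB_create(nfeatures=1000)
bf = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)

def find_target(reference_gray, frame_gray):
    """Return the homography mapping the reference image into the frame, or None."""
    kp1, des1 = orb.detectAndCompute(reference_gray, None)
    kp2, des2 = orb.detectAndCompute(frame_gray, None)
    if des1 is None or des2 is None:
        return None
    matches = sorted(bf.match(des1, des2), key=lambda m: m.distance)[:60]
    if len(matches) < 12:
        return None  # too few correspondences to register reliably
    src = np.float32([kp1[m.queryIdx].pt for m in matches]).reshape(-1, 1, 2)
    dst = np.float32([kp2[m.trainIdx].pt for m in matches]).reshape(-1, 1, 2)
    H, _mask = cv2.findHomography(src, dst, cv2.RANSAC, 5.0)
    return H
```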
The three-dimensional tracking registration technology based on hardware equipment comprises the following steps: the method mainly utilizes a signal transmitting device and a sensing device to obtain related data and calculate position and attitude information required by three-dimensional registration. Hardware devices such as electromagnetic tracking devices, optical tracking devices, ultrasonic tracking devices, GPS positioning devices, inertial navigation devices, and electronic compasses are commonly used. The electromagnetic tracking device utilizes a coil signal transmitter to determine the relative position and attitude information of a target object in a real scene according to the coupling interaction relationship between a magnetic transmitting signal and a magnetic induction signal. The optical tracking device determines the six-degree-of-freedom information of a target object through an image pickup device or a photosensitive element according to received light source information or light emitted by a reflecting ball and through acquired images, light source information and three-dimensional space position information of a sensor. The ultrasonic tracking device performs three-dimensional registration based on a time difference, a phase difference, and a sound pressure generated when ultrasonic waves emitted from different sound sources reach a target object specified in a real scene. Determining the relative position information of a target object in a real scene through hardware devices such as GPS positioning, inertial navigation, electronic compass and the like is commonly used in an augmented reality information service system based on position service.
Hybrid three-dimensional tracking registration technology: in some specific applications, the position information of the virtual object relative to the real scene is determined by combining the multiple three-dimensional tracking registration methods, and the three-dimensional tracking registration function of the system is realized.
In the embodiment of the invention, the superposition of digital culture content on the real scene is realized with the three-dimensional tracking registration technology: the world coordinate system, camera coordinate system, imaging-plane coordinate system and pixel coordinate system are converted using the coordinate information in the video image, so that the digital culture content object is accurately superimposed on the real scene. To achieve perfect fusion and registration of the virtual object with the real scene, the MAR system must detect and calculate the position and direction of the mobile terminal's camera in real time and, from the camera's external parameters, compute in real time the mapping relation between the digital culture content and the real scene, so as to place the digital culture content object at the correct position in the video image of the real scene where the target object is located. In addition, the MAR system must render digital three-dimensional models, digital animations, digital movies, sounds, pictures, digital texts and other digital culture content objects in real time, and fuse the computer-generated virtual images with the real-scene video images according to the mapping relation of the virtual objects acquired in real time in the real scene, thereby enhancing the real-scene information, that is, obtaining video images that add digital culture information on top of the real-scene video images.
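The coordinate-system chain above corresponds to the standard pinhole projection s [u, v, 1]^T = K [R | t] [Xw, Yw, Zw, 1]^T, where K holds the internal parameters and R, t the external ones. The sketch below illustrates it with OpenCV; the intrinsic matrix, pose values and model points are placeholders for illustration, not data from the patent.

```python
# Sketch: project anchor points of a virtual object into pixel coordinates
# given assumed camera intrinsics K and an assumed pose (rvec, tvec).
import cv2
import numpy as np

K = np.array([[800.0, 0.0, 320.0],
              [0.0, 800.0, 240.0],
              [0.0, 0.0, 1.0]])            # intrinsic matrix (placeholder values)
dist = np.zeros(5)                          # assume no lens distortion
rvec = np.array([0.1, -0.2, 0.05])          # camera rotation from tracking (Rodrigues)
tvec = np.array([0.0, 0.0, 0.5])            # camera translation in metres

# Corners of a 10 cm virtual panel lying on the target object's plane (world frame)
object_points = np.array([[-0.05, -0.05, 0.0],
                          [ 0.05, -0.05, 0.0],
                          [ 0.05,  0.05, 0.0],
                          [-0.05,  0.05, 0.0]])

pixel_points, _ = cv2.projectPoints(object_points, rvec, tvec, K, dist)
print(pixel_points.reshape(-1, 2))          # where to composite the virtual content
```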
Specifically, in the embodiment of the invention, a system captures a video image sequence frame of a real environment through a camera on a mobile terminal, identifies a target object in the real environment, further determines the pose relationship of an intelligent terminal camera relative to the target object in the real environment, determines the accurate position of a virtual object (namely one or more digital culture contents such as a digital three-dimensional model, a digital animation, a video, an audio frequency, a graph, a digital text and the like of a digital culture corresponding to an identifier) in the real environment according to the acquired camera pose data information and positioning data information acquired from the real scene, finally completes real-time drawing and generation of the virtual object in the real environment, fuses and displays the virtual object and the real environment on the mobile terminal, forms a virtual-real fused new scene, completes a three-dimensional tracking and registering process, and achieves the purpose of enhancing culture information of the real environment.
In the running process of the MAR system, on one hand, the pose information of the camera of the mobile terminal is in a dynamic change state, the system must accurately acquire the position information of the camera relative to a target object in a real environment in real time to complete a subsequent accurate three-dimensional registration task, on the other hand, the system must detect, identify and track the identification information in a real environment in real time to quickly acquire the pose data information of the camera, and detect, identify and track the target object (identification) in real time in a three-dimensional space environment to complete virtual and real scene registration in real time. The three-dimensional tracking registration technology based on computer vision is characterized in that a system takes an image frame containing a target object as a positioning reference in a video image sequence frame of a real scene, obtains the pose relation between a camera and the real environment in real time by a method of detecting, identifying, matching and tracking the characteristic points of the image frame of the target object, updates the coordinate conversion relation from a virtual object to the target object in the real environment in real time, further updates a scene with fusion of virtual and real in real time, and completes the task of three-dimensional registration. The three-dimensional tracking registration process based on computer vision mainly comprises four links of image feature point detection, identification matching, real-time tracking and virtual-real fusion of a target object.
In the MAR system, three-dimensional tracking registration is the key technology for ensuring that virtual objects are accurately superimposed and fused in the real environment. In this process, the system obtains data about the real scene from the video images captured by the camera, so the registration data come directly from the image information captured by the camera; if the camera's internal parameters are inaccurate, the registration accuracy is severely degraded and registration may even fail. The augmented reality system therefore needs Camera Calibration to determine the relevant camera parameters. These include internal parameters, which relate to the geometric and optical characteristics of the camera itself, and external parameters, which describe the camera's three-dimensional position and orientation relative to a given world coordinate system; camera calibration in the MAR system is precisely the process of determining these internal and external parameters. The three-dimensional tracking registration flow of the augmented reality system is shown in fig. 2. The essence of the coordinate transformation in three-dimensional tracking registration is the conversion among the world coordinate system, the camera coordinate system, the imaging-plane coordinate system and the pixel coordinate system; once the transformation matrices among these four coordinate systems are determined, the MAR system can accurately superimpose virtual objects on the real-environment scene. Camera calibration helps the system determine the internal and external parameters of the camera, obtain the conversion relations among the real-world, camera, imaging-plane and pixel coordinate systems, and provide the data needed to accurately superimpose subsequent virtual objects on the real scene. The real-time tracking process involves detecting and tracking the target object in the real scene; through real-time detection, identification matching and tracking of the target object, the position and direction of the camera relative to the real-world coordinate system are obtained, determining the transformation matrix between the real-world coordinate system and the camera coordinate system. Three-dimensional scene drawing and fusion then uses the obtained conversion relations among the world, camera, imaging-plane and pixel coordinate systems, together with computer graphics rendering, to superimpose the virtual object on the target-object region of the real scene and form a new virtual-real fused scene.
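A minimal camera-calibration sketch follows, using the common chessboard-based OpenCV workflow as an assumed procedure; the patent only states that calibration determines the internal and external parameters, so the board size and image paths are placeholders.

```python
# Sketch: estimate camera intrinsics K and distortion from chessboard images.
import glob
import cv2
import numpy as np

pattern = (9, 6)                                    # inner corners of the chessboard
objp = np.zeros((pattern[0] * pattern[1], 3), np.float32)
objp[:, :2] = np.mgrid[0:pattern[0], 0:pattern[1]].T.reshape(-1, 2)

obj_points, img_points, image_size = [], [], None
for path in glob.glob("calib/*.jpg"):               # hypothetical calibration shots
    gray = cv2.imread(path, cv2.IMREAD_GRAYSCALE)
    found, corners = cv2.findChessboardCorners(gray, pattern)
    if found:
        obj_points.append(objp)
        img_points.append(corners)
        image_size = gray.shape[::-1]

if not obj_points:
    raise SystemExit("no calibration images found")

rms, K, dist, rvecs, tvecs = cv2.calibrateCamera(
    obj_points, img_points, image_size, None, None)
print("re-projection error:", rms)                  # K and dist feed the registration step
```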
The mobile terminal 12 is further configured to receive and display the fused and superimposed image sent by the cloud server 15.
It should be noted that, in the case of a museum exhibit, that is, when the material culture carrier is a museum exhibit and the identification information is museum internal exhibit information, the cloud server 15 is further configured to: construct an identification database and perform dynamic identification switching, recognition and matching on the image according to the identification database; acquire posture positioning coordinates in the image captured by the mobile terminal and perform matching detection on the exhibit; and determine the posture and position information of the mobile terminal and perform virtual-real fusion of the digital culture content corresponding to the exhibit with the exhibit image. In this way, not only can cultural relics be reconstructed and reproduced, but the digital culture content can also be seamlessly fused with historic sites.
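One way to picture the dynamic identification switching is the sketch below, where the identification database is narrowed to the exhibits announced for the visitor's current area before matching; the structure and field names are assumptions, not the patent's schema.

```python
# Sketch: restrict matching to the markers registered for the current area,
# then try each candidate exhibit against the incoming frame.
def select_active_markers(identification_db, location_id):
    """Return only the exhibit markers registered for the current area."""
    return [m for m in identification_db if m["location_id"] == location_id]

def match_exhibit(frame_gray, identification_db, location_id, matcher):
    """matcher could be a function like find_target sketched earlier."""
    for marker in select_active_markers(identification_db, location_id):
        H = matcher(marker["reference_image"], frame_gray)
        if H is not None:
            return marker["exhibit_id"], H       # matched exhibit plus its pose cue
    return None, None
```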
There are two main modes for developing and applying mobile augmented reality (MAR). One is the MAR system based on Location Based Services (LBS), which determines the position and posture of the intelligent terminal mainly through devices such as its Global Positioning System (GPS), electronic compass and gravity accelerometer, and superimposes information about Points of Interest (POI) in the user's surroundings, such as labels for nearby hotels, navigation and major public facilities, on the user's intelligent terminal. The other is the MAR application based on a computer-vision tracking registration method using natural image features, which determines the position and posture of the intelligent terminal relative to a target object mainly by recognizing the features of target objects around the user through the terminal's camera, and superimposes information related to the target object on the user's terminal. Of these two current MAR application modes, the former mainly labels POI information within a certain range around the user, and the superimposed information is generally two-dimensional text and pictures, so it is not suitable for detailed interpretation and display of a target object through richer, vividly presented and contextualized digital information. The embodiment of the invention therefore creatively proposes a scheme combining wireless sensing technology with AR technology: the wireless sensor receives from the wireless network node the identifier for the current area and forwards it to the mobile terminal; when the user uses the mobile terminal to capture an image of the material culture carrier, the cloud server superimposes the digital culture content corresponding to the identification information at the specific position of the image according to the recognized image and the identification information, realizing AR virtual-real superposition. With the scheme of the embodiment of the invention, AR virtual-real superposition can be performed more accurately, a better user experience is delivered, and industrial application, and thus commercial success, can be achieved.
Example two
As shown in fig. 3, an embodiment of the present invention discloses a method for displaying a mobile augmented reality, including:
s201, a wireless sensor receives signals sent by one or more wireless network nodes and sends the signals to a mobile terminal, wherein the signals comprise identification information corresponding to a material culture carrier;
it should be noted that the wireless sensor includes, but is not limited to, Zigbee, WIFI, bluetooth, NFC, and other wireless sensors. In the embodiment of the present invention, the wireless sensors used in the embodiments are different according to different scenes, for example, in a short-distance (within 10 meters) communication range, NFC and bluetooth may be used to complete data transmission and reception, and in a medium-distance (10-100 meters) communication range, Zigbee, WIFI, and other communication methods are used.
In the embodiment of the present invention, a Zigbee module is preferably used as the wireless sensor. The Zigbee module may be integrated inside the mobile terminal, or may be independent of the mobile terminal and exchange signaling with it through a communication protocol. ZigBee is a low-power-consumption local area network protocol based on the IEEE 802.15.4 standard. According to the international standard, ZigBee is a short-range, low-power wireless communication technology characterized by short distance, low complexity, self-organization, low power consumption and low data rate. It is mainly suitable for the fields of automatic control and remote control and can be embedded in various devices.
The wireless network node can be fixedly placed at a specific position and is used for communicating with the wireless sensor, taking Zigbee as an example, the wireless network node acquires Zigbee information in a communication range, adopts a Zigbee transmission protocol to handshake with the Zigbee module, completes signaling interaction, and sends identification information stored inside and information of a material culture carrier at the current position of the wireless network node to the Zigbee module. The radio network node may be understood as a Zigbee "base station".
S202, the mobile terminal collects the image of the material culture carrier, receives a signal sent by the wireless sensor, and sends the collected image and the signal to a cloud server;
the identification information comprises position information of the current area of the mobile terminal, after the wireless network node sends the position information to the wireless sensor, the wireless sensor sends the position information to the mobile terminal and forwards the position information to the cloud server, and at the moment, the cloud server can determine digital culture content corresponding to the position information according to the corresponding relation between the position information and the digital culture content and superimpose the digital culture content in the image. Different from the traditional mode of combining the GPS and the AR, the invention has the advantages of higher position precision and higher recognition rate by combining the wireless sensing and the AR.
Alternatively, the identifier may be a label, specifically one indicating intangible cultural heritage (ICH) or material cultural heritage. Material cultural heritage refers to actual objects such as natural landscapes, historic sites, souvenirs, exhibits, cultural relics, calligraphy and paintings. Intangible cultural heritage, by contrast, is an extremely important form of cultural heritage and a cultural gene representing the diversity of traditional culture with national characteristics; it is intangible and invisible, and its content can be expressed through digital culture content. For ICH, the label determines the kind, content and presentation form of the ICH. The identifier may also act as a trigger mechanism: once it is received, the corresponding AR element (for example, the digital culture content mentioned later) is loaded onto the captured image. In addition, the identifier can be a Marker, which falls into two categories: a black-and-white artificial marker (Marker) and a natural-image-feature identifier without an artificial mark (Markerless). Here the Marker carries patterns and graphics of traditional culture element symbols designed according to the culture content and serves as the target object for subsequent three-dimensional tracking registration. In the embodiment of the invention, the identification information can be one or more of identification information of a cultural creative product, identification information of an exhibit inside a museum, or identification information of a scenic spot. The identification information can also be a string of digital codes in a specific format stored in the wireless network node; when the wireless sensor interacts with the wireless network node, the node packages the digital code in a signal format and sends it to the wireless sensor in message form, and the wireless sensor decodes the signal, extracts the identification information and forwards it to the mobile terminal. Optionally, the identifier may also be a QR code or the like; the specific representation of the identifier is not limited by the invention.
The mobile terminal can be a commonly used mobile terminal, a personal computer, a smart phone, a tablet computer or the like. The mobile terminal is provided with a single camera or dual cameras for capturing images of the material culture carrier.
In the embodiment of the invention, the material culture carriers can be divided into three types: first, handicraft products embodying traditional skills, traditional arts and crafts, folk literature and the like, as well as souvenirs and ethnic-feature products; second, museum exhibits, including various cultural relics, calligraphy and paintings, models of classical architecture, and the like; and third, cultural tourist attractions such as eco-museums, folk villages and ethnic-style gardens, together with the ethnic or regional feature products within them, including ethnic costumes, local specialties and various garden landscapes (flowers, plants, trees, marble, parks and the like). A material culture carrier is visible and tangible, which is essentially different from intangible culture content; the technical scheme of the embodiment of the invention loads digital culture content that normally cannot be seen or heard onto the image of this visible, tangible carrier, so that the user can truly feel its existence.
S203, the cloud server constructs digital culture contents corresponding to the identification information, fuses and superposes the digital culture contents corresponding to the identification information in the signal on the image after receiving the image and the signal sent by the mobile terminal, and sends the fused and superposed image to the mobile terminal;
and S204, the mobile terminal receives and displays the fused and overlapped image sent by the cloud server.
The cloud server fuses and superposes the digital culture content corresponding to the identification information in the signal onto the image; specifically, this may include:
carrying out virtual-real registration processing on the digital culture content and the acquired image by adopting a three-dimensional tracking registration technology;
rendering the digital culture content in real time, superposing and fusing the digital culture content and the material culture carrier image according to the mapping relation of the digital culture content in a real scene after the virtual-real registration processing,
the digital culture content comprises a digital three-dimensional model and one or more of digital animation, digital film and television, digital sound, digital pictures and digital texts. That is, the digital culture content includes a three-dimensional digital model and other contents capable of showing the digital culture content, such as a piece of video, a piece of text, a piece of voice or picture, etc.
The three-dimensional tracking registration means that in the operation process of a system, image characteristics of a real environment need to be detected and identified in real time, and the position and the direction of a camera in a three-dimensional space relative to the real environment are determined according to the acquired characteristics of a video image frame of the real environment, so that a virtual object is accurately placed at the correct position in the real scene. To realize seamless fusion of a virtual object and a real scene, a three-dimensional tracking registration technology is the most core key technology of an AR system. For details of the related art, please refer to the related part of the embodiment, which is omitted here.
Wherein the method further comprises:
and the cloud server adopts a computer graphics image CG technology to manufacture a digital three-dimensional model corresponding to the identifier.
It should be noted that, in the case of a museum exhibit, that is, when the material culture carrier is a museum exhibit and the identification information is museum internal exhibit information, the cloud server is further configured to: construct an identification database and perform dynamic identification switching, recognition and matching on the image according to the identification database; acquire posture positioning coordinates in the image captured by the mobile terminal and perform matching detection on the exhibit; and determine the posture and position information of the mobile terminal and perform virtual-real fusion of the digital culture content corresponding to the exhibit with the exhibit image. In this way, not only can cultural relics be reconstructed and reproduced, but the digital culture content can also be seamlessly fused with historic sites.
The embodiment of the invention creatively proposes a scheme combining wireless sensing technology with AR technology: the wireless sensor receives from the wireless network node the identifier for the current area and forwards it to the mobile terminal; when the user uses the mobile terminal to capture an image of the material culture carrier, the cloud server superimposes the digital culture content corresponding to the identification information at the specific position of the image according to the recognized image and the identification information, realizing AR virtual-real superposition. With the scheme of the embodiment of the invention, AR virtual-real superposition can be performed more accurately, a better user experience is delivered, and industrial application, and thus commercial success, can be achieved.
It should be understood that, in the various embodiments of the present application, the size of the serial number of each process does not mean the execution sequence, and the execution sequence of each process should be determined by its function and inherent logic, and should not constitute any limitation to the implementation process of the embodiments of the present application.
Those of ordinary skill in the art will appreciate that the various illustrative modules and method steps described in connection with the embodiments disclosed herein may be implemented as electronic hardware, or combinations of computer software and electronic hardware. Whether such functionality is implemented as hardware or software depends upon the particular application and design constraints imposed on the implementation. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present application.
It can be clearly understood by those skilled in the art that, for convenience and brevity of description, the specific working processes of the system, the apparatus and the module described above may refer to the corresponding processes in the foregoing method embodiments, and are not described herein again.
All parts of the specification are described in a progressive mode, the same and similar parts of all embodiments can be referred to each other, and each embodiment is mainly introduced to be different from other embodiments. In particular, as to the apparatus and system embodiments, since they are substantially similar to the method embodiments, the description is relatively simple and reference may be made to the description of the method embodiments in relevant places.
Finally, it is to be noted that: the above description is only a preferred embodiment of the present disclosure, and is not intended to limit the scope of the present disclosure. It will be apparent to those skilled in the art that various changes and modifications may be made in the present application without departing from the scope of the application. To the extent that such modifications and variations of the present application fall within the scope of the claims and their equivalents, they are intended to be included within the scope of the present application.

Claims (10)

1. A Mobile Augmented Reality (MAR) system, the system comprising a mobile terminal, a wireless sensor, one or more wireless network nodes and a cloud server, wherein,
the wireless sensor is used for receiving signals sent by the one or more wireless network nodes and sending the signals to the mobile terminal, wherein the signals comprise identification information corresponding to a material culture carrier, the identification information is a digital code and is stored in the wireless network nodes, when the wireless sensor interacts with the wireless network nodes, the wireless network nodes package the digital code in a signal format, and the digital code packaged in the signal format is used as the signals and is sent to the wireless sensor in a message form;
the receiving the signals sent by the one or more wireless network nodes and sending the signals to the mobile terminal comprises: receiving the signal sent by the wireless network node in a message form, decoding the received signal to extract the identification information, and sending the extracted identification information to the mobile terminal;
the mobile terminal is used for acquiring the image of the material culture carrier, receiving the signal sent by the wireless sensor and sending the acquired image and the signal to the cloud server;
the cloud server is used for constructing digital culture contents corresponding to the identification information, fusing and superposing the digital culture contents corresponding to the identification information in the signal on the image after receiving the image and the signal sent by the mobile terminal, and sending the fused and superposed image to the mobile terminal;
the mobile terminal is also used for receiving and displaying the fused and superposed image sent by the cloud server;
the identification information also comprises position information of the current area of the mobile terminal, and after the wireless network node sends the position information to the wireless sensor, the wireless sensor sends the position information to the mobile terminal and forwards the position information to the cloud server.
2. The system of claim 1, wherein the cloud server is configured to fuse and superimpose the digital culture content corresponding to the identification information in the signal onto the image, which comprises:
carrying out virtual-real registration processing on the digital culture content and the acquired image by adopting a three-dimensional tracking registration technology;
rendering the digital culture content in real time, superposing and fusing the digital culture content and the material culture carrier image according to the mapping relation of the digital culture content in a real scene after the virtual-real registration processing,
the digital culture content comprises a digital three-dimensional model and one or more of digital animation, digital film and television, digital sound, digital pictures and digital texts.
3. The system of claim 2, wherein the cloud server is further configured to:
and manufacturing a digital three-dimensional model corresponding to the identification by adopting a computer graphics image CG technology.
4. The system of any one of claims 1-3, wherein the material culture carrier comprises a cultural creative product, a museum exhibit, or an attraction, and the identification information comprises one or more of location information of an area where the mobile terminal is located, identification information of the cultural creative product, identification information of the museum interior exhibit, or attraction identification information.
5. The system of claim 4, wherein the cloud server is further configured to:
when the material culture carrier is a museum exhibit and the identification information is the exhibit information in the museum, constructing an identification database, and performing dynamic identification switching, identification and matching on the image according to the identification database;
acquiring a posture positioning coordinate in an image acquired by the mobile terminal, and performing matching detection on the exhibit;
and determining the posture and position information of the mobile terminal, and performing virtual-real fusion on the digital culture content corresponding to the exhibit and the exhibit image.
6. The system of claim 1, wherein the wireless sensor is a Zigbee wireless transceiver module, and the wireless sensor is independent or built in the mobile terminal.
7. A method of mobile augmented reality display, comprising:
the method comprises the steps that a wireless sensor receives signals sent by one or more wireless network nodes in a message form, decodes the received signals to extract identification information, and sends the extracted identification information to a mobile terminal, wherein the signals comprise identification information corresponding to a material culture carrier, the identification information is digital codes and is stored in the wireless network nodes, and when the wireless sensor interacts with the wireless network nodes, the digital codes are packaged in a signal format by the wireless network nodes, the digital codes packaged in the signal format are used as the signals, and the signals are sent to the wireless sensor in the message form;
the mobile terminal collects an image of the material culture carrier, receives the signal sent by the wireless sensor, and sends the collected image and the signal to a cloud server;
the cloud server constructs digital culture content corresponding to the identification information; after receiving the image and the signal sent by the mobile terminal, the cloud server fuses and superimposes the digital culture content corresponding to the identification information in the signal onto the image, and sends the fused and superimposed image to the mobile terminal;
the mobile terminal receives and displays the fused and superimposed image sent by the cloud server;
the identification information further comprises position information of the area where the mobile terminal is currently located, and after the wireless network node sends the position information to the wireless sensor, the wireless sensor sends the position information to the mobile terminal, which forwards it to the cloud server.
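A schematic, non-normative walk-through of the data hand-offs in the method of claim 7 is sketched below, with the network transport, registration, and rendering stubbed out; all class and field names are invented for illustration and do not appear in the patent.

```python
# Schematic only: the data hand-offs of claim 7 with transport, registration,
# and rendering stubbed out. All class and field names are invented.
from dataclasses import dataclass

@dataclass
class Signal:
    identification: int    # digital code extracted by the wireless sensor
    location: str          # area where the mobile terminal currently is

@dataclass
class FusedFrame:
    image: bytes           # image with digital culture content superimposed
    content_id: int

class CloudServer:
    def fuse(self, image: bytes, signal: Signal) -> FusedFrame:
        # Stand-in for constructing, registering, rendering and superimposing
        # the digital culture content that corresponds to the identification.
        return FusedFrame(image=image, content_id=signal.identification)

class MobileTerminal:
    def __init__(self, server: CloudServer):
        self.server = server

    def capture_and_display(self, carrier_image: bytes, signal: Signal) -> FusedFrame:
        fused = self.server.fuse(carrier_image, signal)  # send image + signal, get result
        return fused                                     # would be shown on the display

terminal = MobileTerminal(CloudServer())
result = terminal.capture_and_display(b"jpeg-bytes", Signal(identification=0xA2B3, location="hall 3"))
print(result.content_id)
```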
8. The method of claim 7, wherein fusing and superimposing, by the cloud server, the digital culture content corresponding to the identification information in the signal onto the image comprises:
performing virtual-real registration of the digital culture content with the acquired image by using a three-dimensional tracking registration technique; and
rendering the digital culture content in real time, and superimposing and fusing the digital culture content with the image of the material culture carrier according to the mapping relationship of the digital culture content in the real scene obtained from the virtual-real registration,
wherein the digital culture content comprises a digital three-dimensional model and one or more of digital animation, digital film and video, digital audio, digital pictures and digital text.
9. The method of claim 8, further comprising:
the cloud server creates a digital three-dimensional model corresponding to the identification information by using computer graphics image (CG) technology.
10. The method of any one of claims 7-9, wherein the material culture carrier comprises a cultural creative product, a museum exhibit, or an attraction, and the identification information comprises one or more of: location information of the area where the mobile terminal is located, identification information of the cultural creative product, identification information of the exhibit inside the museum, or identification information of the attraction.
CN201810253121.5A 2018-03-26 2018-03-26 Mobile augmented reality system and display method Active CN108540542B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201810253121.5A CN108540542B (en) 2018-03-26 2018-03-26 Mobile augmented reality system and display method

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201810253121.5A CN108540542B (en) 2018-03-26 2018-03-26 Mobile augmented reality system and display method

Publications (2)

Publication Number Publication Date
CN108540542A CN108540542A (en) 2018-09-14
CN108540542B true CN108540542B (en) 2021-12-21

Family

ID=63484746

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201810253121.5A Active CN108540542B (en) 2018-03-26 2018-03-26 Mobile augmented reality system and display method

Country Status (1)

Country Link
CN (1) CN108540542B (en)

Families Citing this family (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109345632B (en) * 2018-09-17 2023-04-07 深圳达闼科技控股有限公司 Method for acquiring image, related device and readable storage medium
CN109584378A (en) * 2018-12-29 2019-04-05 广州欧科信息技术股份有限公司 History culture ancient building object based on AR leads reward method, apparatus and system
CN111829546A (en) * 2019-04-18 2020-10-27 阿里巴巴集团控股有限公司 AR navigation system and equipment, and clock correction method
CN112053451A (en) * 2019-06-05 2020-12-08 北京外号信息技术有限公司 Method for superimposing virtual objects based on optical communication means and corresponding electronic device
CN112053444A (en) * 2019-06-05 2020-12-08 北京外号信息技术有限公司 Method for superimposing virtual objects based on optical communication means and corresponding electronic device
CN111504308A (en) * 2020-04-21 2020-08-07 南京比特互动创意科技有限公司 Intelligent exhibition hall supporting system based on AR technology
CN112070901A (en) * 2020-07-21 2020-12-11 马小淞 AR scene construction method and device for garden, storage medium and terminal
CN113140046A (en) * 2021-04-21 2021-07-20 上海电机学院 AR (augmented reality) cross-over control method and system based on three-dimensional reconstruction and computer readable medium
CN114268784A (en) * 2021-12-31 2022-04-01 东莞仲天电子科技有限公司 Method for improving near-to-eye display experience effect of AR (augmented reality) equipment
CN114489344A (en) * 2022-02-16 2022-05-13 海南热带海洋学院 Augmented reality digital culture content display device and method based on Internet of things
CN115345808B (en) * 2022-08-18 2023-07-21 北京拙河科技有限公司 Picture generation method and device based on multi-element information acquisition

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103703758A (en) * 2011-07-01 2014-04-02 英特尔公司 Mobile augmented reality system
CN103500452A (en) * 2013-10-12 2014-01-08 杭州师范大学 Scenic spot scenery moving augmented reality method based on space relationship and image analysis
CN107800749A (en) * 2016-09-07 2018-03-13 北京嘀嘀无限科技发展有限公司 Data sending processing method and device, route planning method and server

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
余日季; Design and Application Research of a Museum Culture Education Experience System Based on Mobile Terminals and AR Technology; 《中国电化教育》 (China Educational Technology); 2017-03-07; caption*, abstract, sections 3-5 *

Also Published As

Publication number Publication date
CN108540542A (en) 2018-09-14

Similar Documents

Publication Publication Date Title
CN108540542B (en) Mobile augmented reality system and display method
CN102741797B (en) Method and apparatus for transforming three-dimensional map objects to present navigation information
US10677596B2 (en) Image processing device, image processing method, and program
US20200349350A1 (en) Methods and apparatus for venue based augmented reality
CA2949543C (en) Platform for constructing and consuming realm and object feature clouds
US9558559B2 (en) Method and apparatus for determining camera location information and/or camera pose information according to a global coordinate system
CN104024984B (en) Portable set, virtual reality system and method
US9495783B1 (en) Augmented reality vision system for tracking and geolocating objects of interest
CN110017841A (en) Vision positioning method and its air navigation aid
CN108564662A (en) The method and device that augmented reality digital culture content is shown is carried out under a kind of remote scene
EP2974509B1 (en) Personal information communicator
US9041714B2 (en) Apparatus and method for compass intelligent lighting for user interfaces
JP2013517579A (en) Augmented reality system
US11734898B2 (en) Program, information processing method, and information processing terminal
US11769306B2 (en) User-exhibit distance based collaborative interaction method and system for augmented reality museum
JP2006105640A (en) Navigation system
JP2004102835A (en) Information providing method and system therefor, mobile terminal device, head-wearable device, and program
JP6665402B2 (en) Content display terminal, content providing system, content providing method, and content display program
TWM580186U (en) 360 degree surround orientation and position sensing object information acquisition system
WO2019016820A1 (en) A METHOD FOR PLACING, TRACKING AND PRESENTING IMMERSIVE REALITY-VIRTUALITY CONTINUUM-BASED ENVIRONMENT WITH IoT AND/OR OTHER SENSORS INSTEAD OF CAMERA OR VISUAL PROCCESING AND METHODS THEREOF
CN108615260A (en) The method and device that shows of augmented reality digital culture content is carried out under a kind of exception actual environment
KR20150077607A (en) Dinosaur Heritage Experience Service System Using Augmented Reality and Method therefor
CN104166929A (en) Information pushing system and method based on space-time scenes
Trivedi et al. A Systematic Review of Tools Available in the Field of Augmented Reality
KR20120048888A (en) 3d advertising method and system

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant