CN114025133A - Augmented reality projection method and system - Google Patents

Augmented reality projection method and system

Info

Publication number
CN114025133A
CN114025133A (application number CN202111286425.XA)
Authority
CN
China
Prior art keywords
augmented reality
cloud platform
terminal
video data
recorded video
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202111286425.XA
Other languages
Chinese (zh)
Inventor
俞一帆 (Yu Yifan)
Current Assignee
Shenzhen Ailing Network Co ltd
Original Assignee
Shenzhen Ailing Network Co ltd
Priority date
Filing date
Publication date
Application filed by Shenzhen Ailing Network Co ltd filed Critical Shenzhen Ailing Network Co ltd
Priority to CN202111286425.XA priority Critical patent/CN114025133A/en
Publication of CN114025133A publication Critical patent/CN114025133A/en
Pending legal-status Critical Current

Classifications

    • H (Electricity) / H04 (Electric communication technique) / H04N (Pictorial communication, e.g. television)
    • H04N 7/00 Television systems; H04N 7/18 Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast
    • H04N 13/00 Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N 13/20 Image signal generators; H04N 13/275 Image signal generators from 3D object models, e.g. computer-generated stereoscopic image signals
    • H04N 13/30 Image reproducers; H04N 13/363 Image reproducers using image projection screens

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Processing Or Creating Images (AREA)

Abstract

The application provides an augmented reality projection method and system, relating to the technical field of computer vision. The augmented reality projection method applied to a terminal includes the following steps: receiving real-world-based recorded video data sent by an augmented reality device; sending the recorded video data to a cloud platform through a network system, so that the cloud platform generates a virtual image from the recorded video data; receiving the virtual image sent by the cloud platform through the network system; and sending the virtual image to the augmented reality device. The augmented reality device projects the virtual image onto an optical virtual-real synthesis module through a display, so that the user views an augmented reality image through the augmented reality device, the augmented reality image being obtained by synthesizing the virtual image and the real-world image in the optical virtual-real synthesis module. By having the cloud platform generate the virtual image from the recorded video data, the method and system reduce the performance requirements on the terminal.

Description

Augmented reality projection method and system
Technical Field
The invention relates to the technical field of computer vision, in particular to an augmented reality projection method and system.
Background
An Augmented Reality (AR) system uses a computer vision method to establish a mapping relationship between the real world and a screen.
Fig. 1 is a schematic diagram of a conventional augmented reality projection, and as shown in fig. 1, an AR system synthesizes a real world image with virtual information generated by a graphic system through an optical virtual-real synthesizer, and displays the synthesized image to a user.
The existing graphic system usually runs on the terminal, so it must rely on the processing capability of the terminal to detect head position and movement information and to quickly generate the virtual object image from that information. When the graphic system needs to convert a real-object video into a virtual-object image, the performance requirements on the terminal are high and complicated real-time calculation is needed, which increases the battery consumption of the terminal and reduces the working endurance of the AR system.
Disclosure of Invention
In view of the above defects in the prior art, the present invention provides an augmented reality projection method to reduce the performance requirements on the terminal and improve the user experience.
In order to achieve the above purpose, the technical solutions adopted in the embodiments of the present application are as follows:
in a first aspect, an embodiment of the present application provides an augmented reality projection method, which is applied to a terminal, and the method includes:
receiving real-world-based recorded video data sent by augmented reality equipment;
sending the real-world-based recorded video data to a cloud platform through a network system, so that the cloud platform generates a virtual image according to the recorded video data;
receiving the virtual image sent by the cloud platform through the network system;
and sending the virtual image to the augmented reality device, wherein the augmented reality device projects the virtual image onto an optical virtual-real synthesis module through a display so that a user can view an augmented reality image through the augmented reality device, the augmented reality image being obtained by synthesizing the virtual image and the real-world image in the optical virtual-real synthesis module.
Optionally, after receiving real-world-based recorded video data sent by an augmented reality device, the method further includes:
and carrying out video coding on the recorded video data by adopting a preset coding algorithm to obtain a key data frame and a general data frame.
Optionally, the network system between the terminal and the cloud platform includes: high speed uplink and normal uplink;
the sending the real-world-based recorded video data to a cloud platform through a network system includes:
and sending the key data frame to the cloud platform through the high-speed uplink, and sending the general data frame to the cloud platform through the common uplink.
In a second aspect, an embodiment of the present application further provides an augmented reality projection method, which is applied to a cloud platform, and the method includes:
receiving real-world-based recorded video data forwarded by a terminal through a network system, wherein the real-world-based recorded video data is recorded by augmented reality equipment and transmitted to the terminal;
performing video decoding on the recorded video data, and generating a virtual image based on the recorded video data;
and forwarding the virtual image to the terminal through the network system so that the terminal transmits the virtual image to the augmented reality equipment for display, wherein the augmented reality equipment projects the virtual image to an optical virtual-real synthesis module through a display.
Optionally, the network system between the terminal and the cloud platform includes: high speed uplink and normal uplink; the recorded video data comprises a coded key data frame and a general data frame;
the real-world-based recorded video data forwarded by the receiving terminal through the network system comprises:
receiving the key data frame transmitted through the high speed uplink and the general data frame transmitted through the normal uplink.
Optionally, the performing video decoding on the recorded video data and generating a virtual image based on the recorded video data includes:
and respectively carrying out video decoding on the key data frame and the general data frame, and generating the virtual image based on a preset processing algorithm.
In a third aspect, an embodiment of the present application further provides an augmented reality projection system, including: augmented reality equipment, a terminal, a network system and a cloud platform;
the augmented reality equipment is in communication connection with the terminal, and the terminal is in communication connection with the cloud platform through the network system;
the terminal is configured to execute the method applied to any one of the terminals in the above embodiments, and the cloud platform is configured to execute the method applied to any one of the cloud platforms in the above embodiments.
Optionally, the augmented reality device is a wearable augmented reality device, comprising: a camera module, a display, and an optical virtual-real synthesis module.
Optionally, the network system between the terminal and the cloud platform includes: high speed uplink and normal uplink.
Optionally, the network system is a 5G network system.
In a fourth aspect, an embodiment of the present application further provides an augmented reality projection apparatus, which is applied to a terminal, the apparatus includes:
the first video receiving module is used for receiving real-world-based recorded video data sent by augmented reality equipment;
the video forwarding module is used for sending the real-world-based recorded video data to a cloud platform through a network system so that the cloud platform generates a virtual image according to the recorded video data;
the image receiving module is used for receiving the virtual image sent by the cloud platform through the network system;
the first image forwarding module is used for sending the virtual image to the augmented reality device, wherein the augmented reality device projects the virtual image onto the optical virtual-real synthesis module through a display so that a user can view an augmented reality image through the augmented reality device, the augmented reality image being obtained by synthesizing the virtual image and the real-world image in the optical virtual-real synthesis module.
Optionally, after the first video receiving module, the apparatus further includes:
and the video coding module is used for carrying out video coding on the recorded video data by adopting a preset coding algorithm to obtain a key data frame and a general data frame.
Optionally, the network system between the terminal and the cloud platform includes: high speed uplink and normal uplink; the video forwarding module is specifically configured to send the key data frame to the cloud platform through the high-speed uplink, and send the general data frame to the cloud platform through the normal uplink.
In a fifth aspect, an embodiment of the present application further provides an augmented reality projection apparatus, which is applied to a cloud platform, the apparatus includes:
the second video receiving module is used for receiving real-world-based recorded video data forwarded by the terminal through the network system, and the real-world-based recorded video data is recorded by the augmented reality equipment and transmitted to the terminal;
the video decoding module is used for carrying out video decoding on the recorded video data and generating a virtual image based on the recorded video data;
and the second image forwarding module is used for forwarding the virtual image to the terminal through the network system so that the terminal transmits the virtual image to the augmented reality device for displaying, wherein the augmented reality device projects the virtual image to the optical virtual-real synthesis module through a display.
Optionally, the network system between the terminal and the cloud platform includes: high speed uplink and normal uplink; the recorded video data comprises a coded key data frame and a general data frame; the second video receiving module is specifically configured to receive the key data frame sent through the high-speed uplink and the general data frame sent through the normal uplink.
Optionally, the video decoding module is specifically configured to perform video decoding on the key data frame and the general data frame, respectively, and generate the virtual image based on a preset processing algorithm.
In a sixth aspect, an embodiment of the present application further provides a terminal, including: a processor, a storage medium and a bus, wherein the storage medium stores program instructions executable by the processor, when the terminal runs, the processor communicates with the storage medium through the bus, and the processor executes the program instructions to execute the steps of any of the augmented reality projection methods applied to the terminal.
In a seventh aspect, this application embodiment further provides a computer-readable storage medium, where the storage medium stores a computer program, and the computer program is executed by a processor to perform the steps of the augmented reality projection method as described in any one of the above embodiments applied to a terminal.
In an eighth aspect, an embodiment of the present application further provides a cloud platform, including: the system comprises a processor, a storage medium and a bus, wherein the storage medium stores program instructions executable by the processor, when the cloud platform runs, the processor and the storage medium are communicated through the bus, and the processor executes the program instructions to execute the steps of any augmented reality projection method applied to the cloud platform.
In a ninth aspect, embodiments of the present application further provide a computer-readable storage medium, on which a computer program is stored, where the computer program is executed by a processor to perform the steps of the augmented reality projection method as described in any one of the above embodiments applied to the cloud platform.
The beneficial effects of this application are as follows:
the application provides an augmented reality projection method and system, wherein the augmented reality projection method applied to a terminal comprises the following steps: receiving real-world-based recorded video data sent by augmented reality equipment; the method comprises the steps of sending real-world-based recorded video data to a cloud platform through a network system, enabling the cloud platform to generate virtual images according to the recorded video data, receiving the virtual images sent by the cloud platform through the network system, and sending the virtual images to augmented reality equipment, wherein the augmented reality equipment projects the virtual images to an optical virtual-real synthesis module through a display, so that a user watches the augmented reality images through the augmented reality equipment, and the augmented reality images are obtained by synthesizing the virtual images and the real-world images through the optical virtual-real synthesis module. According to the method and the device, the terminal sends the recorded video data to the cloud platform through the network system, and the cloud platform generates the virtual image by recording the video data, so that the performance requirement on the terminal is reduced, the workload of the terminal is reduced, the endurance time of the terminal is improved, and the experience effect of a user is enhanced.
Drawings
In order to more clearly illustrate the technical solutions of the embodiments of the present invention, the drawings needed to be used in the embodiments will be briefly described below, it should be understood that the following drawings only illustrate some embodiments of the present invention and therefore should not be considered as limiting the scope, and for those skilled in the art, other related drawings can be obtained according to the drawings without inventive efforts.
FIG. 1 is a schematic diagram of a conventional augmented reality projection;
fig. 2 is a schematic structural diagram of an augmented reality projection system according to an embodiment of the present disclosure;
fig. 3 is a schematic flowchart of a first augmented reality projection method according to an embodiment of the present disclosure;
fig. 4 is a schematic flowchart of a second augmented reality projection method according to an embodiment of the present application;
fig. 5 is a schematic structural diagram of a first augmented reality projection apparatus according to an embodiment of the present disclosure;
fig. 6 is a schematic structural diagram of a second augmented reality projection apparatus according to an embodiment of the present disclosure;
fig. 7 is a schematic structural diagram of a terminal according to an embodiment of the present application;
fig. 8 is a schematic structural diagram of a cloud platform provided in an embodiment of the present application.
Detailed Description
In order to make the objects, technical solutions and advantages of the embodiments of the present invention clearer, the technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are some, but not all, embodiments of the present invention.
Thus, the following detailed description of the embodiments of the present application, presented in the accompanying drawings, is not intended to limit the scope of the claimed application, but is merely representative of selected embodiments of the application. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present application.
In the description of the present application, it should be noted that terms such as "upper" and "lower", when used to indicate an orientation or positional relationship, are based on the orientation or positional relationship shown in the drawings or the orientation in which the product of the application is usually placed when used. They are used only for convenience of describing the application and simplifying the description, and do not indicate or imply that the referred device or element must have a specific orientation or be constructed and operated in a specific orientation; they therefore cannot be understood as limiting the application.
Furthermore, the terms "first," "second," and the like in the description and in the claims, as well as in the drawings, are used for distinguishing between similar elements and not necessarily for describing a particular sequential or chronological order. It is to be understood that the data so used is interchangeable under appropriate circumstances such that the embodiments of the invention described herein are capable of operation in sequences other than those illustrated or described herein. Furthermore, the terms "comprises," "comprising," and "having," and any variations thereof, are intended to cover a non-exclusive inclusion, such that a process, method, system, article, or apparatus that comprises a list of steps or elements is not necessarily limited to those steps or elements expressly listed, but may include other steps or elements not expressly listed or inherent to such process, method, article, or apparatus.
It should be noted that the features of the embodiments of the present application may be combined with each other without conflict.
The augmented reality projection method provided by the embodiment of the application is applied to an augmented reality projection system, and fig. 2 is a schematic structural diagram of the augmented reality projection system provided by the embodiment of the application; as shown in fig. 2, the augmented reality projection system includes: augmented reality device 10, terminal 20, network system 30 and cloud platform 40.
The augmented reality device 10 is in communication connection with the terminal 20, and the terminal 20 is in communication connection with the cloud platform 40 through the network system 30.
Specifically, the augmented reality device 10, on the one hand, records video of the real world that the user is facing and transmits the recorded real-world video data to the terminal 20; on the other hand, it receives the virtual image transmitted by the terminal 20 in order to present to the user the augmented reality image obtained by synthesizing the virtual image and the real-world image. For example, the augmented reality device 10 and the terminal 20 may communicate over a data cable or over a wireless connection; compared with a cable, a wireless connection extends the range within which the user can move, and may be established by connecting the augmented reality device 10 and the terminal 20 to the same local area network.
The terminal 20 and the cloud platform 40 are in communication connection through the network system 30. In the uplink direction, the terminal 20 video-encodes the recorded video data and sends it to the cloud platform 40 through the network system 30; in the downlink direction, the cloud platform 40 sends the virtual image generated from the recorded video data to the terminal 20 through the network system 30. In the embodiment of the application, the terminal is an intelligent terminal, such as a smartphone, a tablet computer, or a computer.
In an alternative embodiment, an image forwarding module is run on the terminal 20 to forward the recorded video data to the cloud platform 40 in the uplink direction and forward the virtual image to the augmented reality device 10 in the downlink direction.
For example, the cloud platform 40 may adopt an edge cloud platform, and a graphic system is run on the cloud platform 40 and is used for analyzing and recognizing the recorded video data and generating a virtual image.
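The cloud-side pipeline described above (decode the uploaded recording, analyze and recognize it, generate a virtual image) can be sketched as follows. This is a toy stand-in and not the patent's implementation: frames are plain byte labels, "recognition" is a frequency count, and the "virtual image" is a dict describing the overlay — all illustrative assumptions.

```python
class CloudGraphicSystem:
    """Toy sketch of the edge-cloud graphic system: decode the uploaded
    recording, recognize its content, and produce a virtual image for
    the terminal to forward to the AR device."""

    def generate_virtual_image(self, encoded_frames):
        # "Decode" each uploaded frame (here: bytes -> text label).
        decoded = [frame.decode("utf-8") for frame in encoded_frames]
        # "Recognize" the recording (here: count object labels).
        counts = {}
        for label in decoded:
            counts[label] = counts.get(label, 0) + 1
        # "Render" a virtual image for the most frequent object.
        target = max(counts, key=counts.get)
        return {"overlay_for": target, "frames_seen": len(decoded)}
```

For example, `CloudGraphicSystem().generate_virtual_image([b"car", b"car", b"tree"])` yields an overlay description for `"car"`; a real graphic system would run video decoding and computer-vision models in place of these stubs.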
The augmented reality projection system provided by the embodiment of the application comprises augmented reality equipment, a terminal, a network system and a cloud platform, wherein the terminal is used for executing an augmented reality projection method applied to the terminal, and the cloud platform is used for executing an augmented reality projection method applied to the cloud platform. According to the embodiment of the application, the cloud platform executes the generation of the virtual image according to the recorded video data, so that the configuration performance requirement on the terminal and the power consumption of the terminal are reduced, and the working duration of the augmented reality projection system is prolonged.
In an optional embodiment, the augmented reality device is a wearable augmented reality device. As shown in fig. 2, the wearable augmented reality device includes: a camera module 11, a display 12, and an optical virtual-real synthesis module 13.
Specifically, the camera module 11 is in communication connection with the terminal 20, and the camera module 11 records a video of a real world faced by a user and transmits the recorded video data to the terminal 20. The display 12 is in communication connection with the terminal 20 to receive the virtual image sent by the terminal 20, the display 12 projects the virtual image onto the optical virtual-real synthesizing module 13, the optical virtual-real synthesizing module 13 generates an augmented reality image by optically fusing the virtual image and the real world image, and a user can view the augmented reality image through a screen of the wearable augmented reality device. For example, the wearable augmented reality device may be a head-mounted display that the user wears on his or her head so that the augmented reality image can be viewed through the screen of the head-mounted display.
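The fusion performed by the optical virtual-real synthesis module 13 happens in optical hardware, but a software analogue helps make the idea concrete: each output pixel mixes light arriving from the real world through the combiner with light projected by the display. The `transmittance` parameter and the per-pixel blend below are illustrative assumptions, not the patent's optics.

```python
def optically_combine(real_pixels, virtual_pixels, transmittance=0.5):
    """Software analogue of the optical virtual-real synthesis module:
    blend real-world light (attenuated by the combiner's transmittance)
    with projected virtual light. Pixels are (r, g, b) tuples in 0..255."""
    def mix(r, v):
        # Weighted mix of real and virtual light, clamped to 8-bit range.
        return min(255, round(transmittance * r + (1 - transmittance) * v))
    return [tuple(mix(rc, vc) for rc, vc in zip(rp, vp))
            for rp, vp in zip(real_pixels, virtual_pixels)]
```

A half-transmissive combiner (`transmittance=0.5`) mixing a grey real pixel with a red virtual pixel produces a reddish blend, which is the visual effect the wearer perceives through the screen.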
The embodiment of the application provides an augmented reality projection system in which the augmented reality device is a wearable augmented reality device comprising a camera module, a display, and an optical virtual-real synthesis module. The camera module records video data of the real world, so that the virtual image generated by the cloud platform based on the recorded video data can be better fused with the real world, improving the projection effect of the augmented reality device.
In an alternative embodiment, the network system 30 between the terminal 20 and the cloud platform 40 includes: high speed uplink and normal uplink.
Specifically, the transmission rate of the high-speed uplink is higher than that of the normal uplink. The high-speed uplink is used for transmitting key data frames and the normal uplink for general data frames: the terminal 20 transmits the key data frames over the high-speed uplink and the general data frames over the normal uplink to the network system 30, which forwards them to the cloud platform 40 over the corresponding links. The key data frames and general data frames are obtained by the terminal 20 encoding the recorded video data.
It should be noted that the key data frames include, but are not limited to: a frame related to an action in the recorded video data, a frame summarizing a video scene or an event in the recorded video data, or one of a plurality of frames whose similarity is within a preset range.
In an alternative embodiment, the network system is a 5G network system.
Specifically, the 5G network system is a high-speed, low-latency mobile communication technology. The network system used between the terminal 20 and the cloud platform 40 may be selected according to the transmission requirements and may be adjusted as mobile communication technology develops; this application does not limit it.
In the augmented reality projection system provided by the embodiment of the application, the network system between the terminal and the cloud platform includes a high-speed uplink and a normal uplink. By dividing the uplink into a high-speed uplink and a normal uplink, the embodiment improves transmission efficiency and ensures that the key data frames can be accurately transmitted to the cloud platform, which helps improve the effect of the augmented reality projection system.
On the basis of the embodiment of the augmented reality projection system, an augmented reality projection method is further provided in the embodiment of the present application, and is applied to the terminal 20, fig. 3 is a schematic flow diagram of a first augmented reality projection method provided in the embodiment of the present application, and as shown in fig. 3, the method includes:
s21: real-world based recorded video data transmitted by an augmented reality device is received.
Specifically, the terminal is in communication connection with the augmented reality device, and the augmented reality device is worn by the user to record video of the real world the user is facing and to send the recorded video data to the terminal.
S22: and sending the real-world-based recorded video data to the cloud platform through the network system, so that the cloud platform generates a virtual image according to the recorded video data.
Specifically, the terminal is in communication connection with the cloud platform through the network system. The terminal sends the recorded video data to the cloud platform over an uplink established through the network system, so that the cloud platform analyzes and recognizes the recorded video data to generate a virtual image.
S23: and receiving the virtual image sent by the cloud platform through the network system.
Specifically, the terminal receives the virtual image sent by the cloud platform over a downlink established through the network system.
S24: the virtual image is sent to an augmented reality device.
Specifically, the terminal sends the virtual image to the augmented reality device, and the augmented reality device projects the virtual image onto the optical virtual-real synthesis module through the built-in display, so that the user views the augmented reality image through the augmented reality device, the augmented reality image being obtained by synthesizing the virtual image and the real-world image in the optical virtual-real synthesis module.
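Steps S21 to S24 cast the terminal as a thin relay. A minimal sketch follows, with stand-in `cloud` and `ar_device` objects whose interfaces are assumptions made for illustration; the patent does not specify concrete APIs.

```python
class TerminalForwarder:
    """Sketch of the terminal-side relay (S21-S24): the terminal only
    forwards data, while image generation runs on the cloud platform."""

    def __init__(self, cloud, ar_device):
        self.cloud = cloud          # stand-in for the network-system link
        self.ar_device = ar_device  # stand-in for the AR-device link

    def relay_once(self, recorded_video):
        # S21: recorded video arrives from the AR device (passed in here).
        # S22: upload the recording so the cloud generates the virtual image.
        virtual_image = self.cloud.generate_virtual_image(recorded_video)
        # S23/S24: receive the virtual image and hand it to the AR device,
        # whose display projects it onto the optical virtual-real module.
        self.ar_device.display(virtual_image)
        return virtual_image


class StubCloud:
    """Hypothetical cloud stand-in returning a placeholder virtual image."""
    def generate_virtual_image(self, recorded_video):
        return {"virtual_image_for": len(recorded_video)}


class StubARDevice:
    """Hypothetical AR-device stand-in recording what was displayed."""
    def __init__(self):
        self.shown = []

    def display(self, image):
        self.shown.append(image)
```

The point of the sketch is the division of labor: `relay_once` contains no rendering or recognition logic, which is exactly why the terminal's performance requirements drop.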
The augmented reality projection method provided by the embodiment of the application is applied to a terminal and comprises: receiving real-world-based recorded video data sent by an augmented reality device; sending the recorded video data to a cloud platform through a network system, so that the cloud platform generates a virtual image from the recorded video data; receiving the virtual image sent by the cloud platform through the network system; and sending the virtual image to the augmented reality device, where the augmented reality device projects the virtual image onto an optical virtual-real synthesis module through a display and the user views it through the augmented reality device. Because the terminal sends the recorded video data to the cloud platform through the network system and the cloud platform generates the virtual image from the recorded video data, the performance requirements on the terminal are lowered, the workload of the terminal is reduced, the endurance time of the terminal is prolonged, and the user experience is enhanced.
In an alternative embodiment, after the above S21, the method further includes:
and carrying out video coding on the recorded video data by adopting a preset coding algorithm to obtain a key data frame and a general data frame.
Specifically, the preset encoding algorithm is a video encoding algorithm. The recorded video data is video-encoded with this algorithm to obtain the coding features of each frame. Within a run of consecutive adjacent frames whose coding-feature similarity is within a preset range, the identifier of one frame is stored as a key-frame identifier and the identifiers of the other frames are stored as general-frame identifiers; the identifier of the first frame of the recorded video data is stored as 1. Key data frames and general data frames can then be extracted according to these identifiers. The video encoding operates on the pixels and colors of the frames of the recorded video data, so as to select representative frames as key data frames and to treat frames whose content is highly similar to a key data frame, or redundant frames in the recorded video data, as general data frames.
The key data frames include, but are not limited to: frames related to an action in the recorded video data, frames summarizing a scene or an event in the recorded video data, or one frame out of a plurality of frames whose similarity is within a preset range.
Exemplary video coding algorithms include, but are not limited to, the H.26x series, the MPEG (Moving Picture Experts Group) series, and AVS (Audio Video Coding Standard); this application is not limited in this respect.
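The key/general split described above can be sketched in a few lines. The sketch below is illustrative only: it stands in for a real codec by reducing each frame's "coding features" to a coarse brightness histogram, and the function and parameter names (`classify_frames`, `threshold`) are invented here, not taken from this application. The first frame (identifier 1) always becomes a key data frame, and any frame whose similarity to the current key frame falls outside the preset range starts a new group:

```python
def feature(frame):
    """Reduce a frame (list of pixel values 0-255) to a coarse 4-bin histogram."""
    hist = [0, 0, 0, 0]
    for px in frame:
        hist[min(px // 64, 3)] += 1
    return [h / len(frame) for h in hist]

def similarity(f1, f2):
    """1.0 for identical histograms, down to 0.0 for disjoint ones."""
    return 1.0 - 0.5 * sum(abs(a - b) for a, b in zip(f1, f2))

def classify_frames(frames, threshold=0.9):
    """Return (key_ids, general_ids); frame identifiers start at 1,
    and the first frame is always stored as a key data frame."""
    key_ids, general_ids = [], []
    key_feature = None
    for idx, frame in enumerate(frames, start=1):
        f = feature(frame)
        if key_feature is None or similarity(f, key_feature) < threshold:
            key_ids.append(idx)        # representative frame of a new group
            key_feature = f
        else:
            general_ids.append(idx)    # content-similar to the current key frame
    return key_ids, general_ids
```

For example, three frames where the third differs sharply from the first two yield key frames [1, 3] and general frame [2]; a production system would of course derive features from the codec itself rather than a histogram.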
In the augmented reality projection method provided by the embodiment of the application, a preset coding algorithm is used to perform video coding on the recorded video data to obtain key data frames and general data frames. By dividing the recorded video data into key data frames and general data frames, the embodiment reduces the amount of data to be transmitted, improves transmission efficiency, and improves the real-time performance of augmented reality projection.
On the basis of the foregoing embodiment, an embodiment of the present application further provides an augmented reality projection method, where the network system between the terminal and the cloud platform includes a high-speed uplink and a normal uplink, and S22 includes:
sending the key data frames to the cloud platform through the high-speed uplink, and sending the general data frames to the cloud platform through the normal uplink.
Specifically, at least two uplink transmission links are established between the terminal and the cloud platform via the network system, and the transmission rate of the high-speed uplink is higher than that of the normal uplink.
When the network state of the network system is poor, the terminal discards the general data frames and sends only the key data frames to the cloud platform through the high-speed uplink. When the network state is good, the terminal sends the key data frames through the high-speed uplink and the general data frames through the normal uplink. In this way, the key data frames are always delivered to the cloud platform and are never discarded.
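As a rough software illustration of this uplink selection policy (the `Frame` type and `route_frames` function are assumptions of this sketch, not part of the application), key data frames always go to the high-speed uplink, while general data frames are carried on the normal uplink only when the network state is good:

```python
from dataclasses import dataclass

@dataclass
class Frame:
    frame_id: int
    is_key: bool

def route_frames(frames, network_good):
    """Split frames into (high_speed, normal, discarded) id lists."""
    high_speed, normal, discarded = [], [], []
    for f in frames:
        if f.is_key:
            high_speed.append(f.frame_id)    # key frames are never dropped
        elif network_good:
            normal.append(f.frame_id)        # normal uplink carries general frames
        else:
            discarded.append(f.frame_id)     # poor network: general frames dropped
    return high_speed, normal, discarded
```

The design point the sketch captures is that link selection depends only on the frame class and the network state, so the key-frame path never competes with general traffic.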
In the augmented reality projection method provided by the embodiment of the present application, the network system between the terminal and the cloud platform includes a high-speed uplink and a normal uplink; the key data frames are transmitted to the cloud platform through the high-speed uplink, and the general data frames through the normal uplink. By establishing two uplinks with different transmission rates, the terminal can send the key data frames to the cloud platform quickly and efficiently, which improves transmission efficiency and the real-time performance of augmented reality projection.
On the basis of the foregoing embodiments, an embodiment of the present application further provides an augmented reality projection method applied to a cloud platform 40. Fig. 4 is a schematic flow diagram of a second augmented reality projection method provided in an embodiment of the present application. As shown in fig. 4, the method includes:
s41: and receiving real-world-based recorded video data forwarded by the terminal through the network system.
Specifically, the real-world-based recorded video data is recorded by the augmented reality equipment and transmitted to the terminal. The cloud platform is in communication connection with the terminal through the network system, and receives the recorded video data through an uplink transmission link established between itself and the terminal via the network system.
S42: and performing video decoding on the recorded video data, and generating a virtual image based on the recorded video data.
Specifically, according to the coding algorithm used by the terminal to encode the recorded video data, the corresponding decoding algorithm is used to decode the recorded video data. Key objects are then recognized from the recorded video data, the virtual objects corresponding to the key objects are determined from a pre-established correspondence library of real objects and virtual objects, and virtual images of the virtual objects are generated.
The virtual image may decorate the recorded video data, add elements to it, animate it, and so on; this is not limited here. For example, a decorative image may be added to the real-world image: if the recognized key object is a big tree, the virtual object may be chosen as the sun, the moon, or stars according to the current time. Further, the cloud platform may choose the virtual object to be rain, snow, hail, or the like according to the weather of the area where the user is located. After determining the virtual object, the cloud platform generates a two-dimensional or three-dimensional virtual image of it as required.
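The time- and weather-dependent selection of a virtual object can be pictured as a small lookup over a correspondence library. Everything in this sketch — the function name, the toy correspondence table, and the fallback value — is invented for illustration and is not the application's actual library:

```python
def pick_virtual_object(key_object, hour, weather=None):
    """Choose a virtual object for a recognized key object from a toy
    correspondence library, preferring weather effects when present."""
    if weather in ("rain", "snow", "hail"):
        return weather                       # weather of the user's area wins
    correspondence = {"tree": {"day": "sun", "night": "moon"}}
    period = "day" if 6 <= hour < 18 else "night"
    # fall back to an invented default when the object is unknown
    return correspondence.get(key_object, {}).get(period, "sparkle")
```

The point of the structure is that context (time, weather) selects among several virtual candidates for the same real object, rather than the library being a fixed one-to-one map.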
S43: and forwarding the virtual image to the terminal through a network system so that the terminal transmits the virtual image to the augmented reality equipment for display.
Specifically, the cloud platform sends the virtual image to the terminal through a downlink transmission link established between itself and the terminal via the network system, and the terminal forwards the virtual image to the augmented reality equipment. The augmented reality equipment projects the virtual image onto the optical virtual-real synthesis module through the display; the optical virtual-real synthesis module superimposes the virtual image on the real-world image, adjusts light and shadow, and displays the synthesized augmented reality image to the user.
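Although the virtual-real synthesis is performed optically in hardware, the superposition step can be pictured as an alpha blend of the virtual image over the real-world image. This software analogue is purely illustrative; the pixel representation and the `blend` function are assumptions of the sketch:

```python
def blend(real, virtual, alpha=0.5):
    """Overlay the virtual image on the real image; None marks a transparent
    virtual pixel, so the real-world pixel shows through unchanged."""
    out = []
    for real_row, virtual_row in zip(real, virtual):
        row = []
        for r, v in zip(real_row, virtual_row):
            row.append(r if v is None else round(alpha * v + (1 - alpha) * r))
        out.append(row)
    return out
```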
The augmented reality projection method provided by the embodiment of the application is applied to a cloud platform and comprises: receiving real-world-based recorded video data forwarded by the terminal through the network system; performing video decoding on the recorded video data and generating a virtual image based on it; and forwarding the virtual image to the terminal through the network system, so that the terminal transmits it to the augmented reality equipment for display. Because the cloud platform decodes the recorded video data and generates the virtual image, the performance requirement on the terminal is lowered, the terminal's workload is reduced, its endurance time is prolonged, and the user experience is improved.
On the basis of the foregoing embodiment, an embodiment of the present application further provides an augmented reality projection method, where the network system between the terminal and the cloud platform includes a high-speed uplink and a normal uplink, the recorded video data includes encoded key data frames and general data frames, and S41 includes:
the method includes receiving a key data frame transmitted through a high-speed uplink and a general data frame transmitted through a normal uplink.
The receiving process mirrors the terminal's sending process: reference may be made to the process in which the terminal sends the key data frames to the cloud platform through the high-speed uplink and sends the general data frames to the cloud platform through the normal uplink, which is not repeated here.
In the augmented reality projection method provided by the embodiment of the present application, the network system between the terminal and the cloud platform includes a high-speed uplink and a normal uplink; the recorded video data includes encoded key data frames and general data frames, and the key data frames transmitted through the high-speed uplink and the general data frames transmitted through the normal uplink are received. By establishing two uplinks with different transmission rates, the cloud platform can efficiently receive the key data frames sent by the terminal, which improves transmission efficiency and the real-time performance of augmented reality projection.
On the basis of the foregoing embodiment, an embodiment of the present application further provides an augmented reality projection method, where S42 includes:
performing video decoding on the key data frames and the general data frames respectively, and generating a virtual image based on a preset processing algorithm.
Specifically, a decoding algorithm corresponding to the coding algorithm is used to decode the key data frames and the general data frames. The preset processing algorithm comprises a recognition algorithm and a lookup method. First, key objects and general objects are recognized from the key data frames and the general data frames respectively, and a first location area of each key object in its key data frame and a second location area of each general object in its general data frame are determined. Then, based on the pre-established correspondence library of real objects and virtual objects, the virtual objects corresponding to the key objects and the general objects are determined. The virtual object corresponding to a key object is rendered in the first location area of the key data frame to generate a key virtual image corresponding to that key data frame, and the virtual object corresponding to a general object is rendered in the second location area of the general data frame to generate a general virtual image corresponding to that general data frame.
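The per-frame processing of S42 can be sketched as follows. Frames are modeled as 2-D character grids, recognition is stubbed out as a precomputed list of (object, region) pairs, and the correspondence library is a toy dictionary — all of these are assumptions of the sketch, not the application's actual algorithm:

```python
CORRESPONDENCE = {"tree": "moon"}            # toy real-to-virtual library

def render_virtual(frame, region, virtual_object):
    """Return a copy of the frame with the located region overwritten by the
    virtual object's marker (its first letter stands in for rendered pixels)."""
    (r0, c0), (r1, c1) = region
    out = [row[:] for row in frame]
    for r in range(r0, r1):
        for c in range(c0, c1):
            out[r][c] = virtual_object[0]
    return out

def process_frame(frame, detected):
    """detected: (object_name, region) pairs from the recognition step."""
    for name, region in detected:
        virtual = CORRESPONDENCE.get(name)
        if virtual is not None:
            frame = render_virtual(frame, region, virtual)
    return frame
```

The same pipeline runs for both key and general data frames; only the recognized objects and location areas differ.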
In the augmented reality projection method provided by the embodiment of the application, video decoding is performed on the key data frames and the general data frames respectively, and the virtual image is generated based on a preset processing algorithm. Because the key data frames and the general data frames are decoded separately, virtual images are generated for each: the virtual images corresponding to the key data frames guarantee the basic augmented reality projection effect, while those corresponding to the general data frames enrich it, giving the user a stronger sense of immersion.
On the basis of the foregoing embodiments, an embodiment of the present application further provides an augmented reality projection apparatus applied to a terminal. Fig. 5 is a schematic structural diagram of a first augmented reality projection apparatus provided in an embodiment of the present application. As shown in fig. 5, the apparatus includes:
the first video receiving module 101 is configured to receive real-world-based recorded video data sent by an augmented reality device.
The video forwarding module 102 is configured to send real-world-based recorded video data to the cloud platform through the network system, so that the cloud platform generates a virtual image according to the recorded video data.
the image receiving module 103 is configured to receive a virtual image sent by the cloud platform through the network system.
The first image forwarding module 104 is configured to send a virtual image to the augmented reality device, where the augmented reality device projects the virtual image to the optical virtual-real synthesizing module through the display, so that the user views the augmented reality image through the augmented reality device, and the augmented reality image is obtained by synthesizing the virtual image and the real world image by the optical virtual-real synthesizing module.
Optionally, following the first video receiving module 101, the apparatus further includes:
and the video coding module is used for carrying out video coding on the recorded video data by adopting a preset coding algorithm to acquire a key data frame and a general data frame.
Optionally, the network system between the terminal and the cloud platform includes: a high-speed uplink and a normal uplink; the video forwarding module 102 is specifically configured to send the key data frame to the cloud platform through the high-speed uplink, and to send the general data frame to the cloud platform through the normal uplink.
On the basis of the foregoing embodiments, an augmented reality projection apparatus applied to a cloud platform is further provided in the embodiments of the present application, fig. 6 is a schematic structural diagram of a second augmented reality projection apparatus provided in the embodiments of the present application, and as shown in fig. 6, the apparatus includes:
and the second video receiving module 201 is configured to receive real-world-based recorded video data forwarded by the terminal through the network system, and the real-world-based recorded video data is recorded by the augmented reality device and transmitted to the terminal.
And the video decoding module 202 is configured to perform video decoding on the recorded video data and generate a virtual image based on the recorded video data.
And a second image forwarding module 203, configured to forward the virtual image to the terminal through the network system, so that the terminal transmits the virtual image to the augmented reality device for display, where the augmented reality device projects the virtual image to the optical virtual-real synthesizing module through the display.
Optionally, the network system between the terminal and the cloud platform includes: a high-speed uplink and a normal uplink; the recorded video data includes encoded key data frames and general data frames; the second video receiving module 201 is specifically configured to receive a key data frame transmitted through the high-speed uplink and a general data frame transmitted through the normal uplink.
Optionally, the video decoding module 202 is specifically configured to perform video decoding on the key data frame and the general data frame, and generate a virtual image based on a preset processing algorithm.
The above-mentioned apparatus is used for executing the method provided by the foregoing embodiment, and the implementation principle and technical effect are similar, which are not described herein again.
The above modules may be one or more integrated circuits configured to implement the above methods, for example: one or more application-specific integrated circuits (ASICs), one or more microprocessors, or one or more field-programmable gate arrays (FPGAs). As another example, when one of the above modules is implemented as program code scheduled by a processing element, the processing element may be a general-purpose processor, such as a central processing unit (CPU) or another processor capable of calling program code. As yet another example, these modules may be integrated together and implemented as a system-on-a-chip (SoC).
On the basis of the foregoing embodiments, an embodiment of the present application further provides a terminal. Fig. 7 is a schematic structural diagram of the terminal provided in the embodiment of the present application. As shown in fig. 7, the terminal 20 includes: a processor 21, a storage medium 22, and a bus, where the storage medium 22 stores program instructions executable by the processor 21. When the terminal 20 runs, the processor 21 communicates with the storage medium 22 through the bus and executes the program instructions to perform the above method embodiment applied to the terminal.
Optionally, an embodiment of the present application further provides a computer-readable storage medium, where a computer program is stored on the storage medium, and when the computer program is executed by a processor, the method embodiment applied to the terminal is executed, where specific implementation and technical effects are similar, and are not described herein again.
On the basis of the foregoing embodiments, an embodiment of the present application further provides a cloud platform. Fig. 8 is a schematic structural diagram of the cloud platform provided in the embodiment of the present application. As shown in fig. 8, the cloud platform 40 includes: a processor 41, a storage medium 42, and a bus, where the storage medium 42 stores program instructions executable by the processor 41. When the cloud platform 40 runs, the processor 41 communicates with the storage medium 42 through the bus and executes the program instructions to perform the above method embodiment applied to the cloud platform.
Optionally, an embodiment of the present application further provides a computer-readable storage medium, where a computer program is stored on the storage medium, and when the computer program is executed by a processor, the method embodiment applied to a cloud platform is executed, where specific implementation manners and technical effects are similar, and are not described herein again.
In the embodiments provided in the present invention, it should be understood that the disclosed apparatus and method may be implemented in other ways. For example, the above-described apparatus embodiments are merely illustrative, and for example, the division of the units is only one logical division, and other divisions may be realized in practice, for example, a plurality of units or components may be combined or integrated into another system, or some features may be omitted, or not executed. In addition, the shown or discussed mutual coupling or direct coupling or communication connection may be an indirect coupling or communication connection through some interfaces, devices or units, and may be in an electrical, mechanical or other form.
The units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the units can be selected according to actual needs to achieve the purpose of the solution of the embodiment.
In addition, functional units in the embodiments of the present invention may be integrated into one processing unit, or each unit may exist alone physically, or two or more units are integrated into one unit. The integrated unit can be realized in a form of hardware, or in a form of hardware plus a software functional unit.
The integrated unit implemented in the form of a software functional unit may be stored in a computer readable storage medium. The software functional unit is stored in a storage medium and includes several instructions to enable a computer device (which may be a personal computer, a server, or a network device) or a processor (processor) to execute some steps of the methods according to the embodiments of the present invention. And the aforementioned storage medium includes: a U disk, a removable hard disk, a Read-Only Memory (ROM), a Random Access Memory (RAM), a magnetic disk or an optical disk, and other various media capable of storing program codes.
The above description covers only specific embodiments of the present invention, but the protection scope of the present invention is not limited thereto. Any change or substitution that a person skilled in the art could readily conceive of within the technical scope disclosed by the present invention shall fall within the protection scope of the present invention. Therefore, the protection scope of the present invention shall be subject to the protection scope of the claims.

Claims (10)

1. An augmented reality projection method is applied to a terminal, and the method comprises the following steps:
receiving real-world-based recorded video data sent by augmented reality equipment;
sending the real-world-based recorded video data to a cloud platform through a network system, so that the cloud platform generates a virtual image according to the recorded video data;
receiving the virtual image sent by the cloud platform through the network system;
and sending the virtual image to the augmented reality equipment, wherein the augmented reality equipment projects the virtual image to an optical virtual-real synthesis module through a display so that a user can watch the augmented reality image through the augmented reality equipment, and the augmented reality image is obtained by synthesizing the virtual image and the real world image through the optical virtual-real synthesis module.
2. The method of claim 1, wherein after receiving real-world based recorded video data transmitted by an augmented reality device, the method further comprises:
performing video coding on the recorded video data by adopting a preset coding algorithm to obtain a key data frame and a general data frame.
3. The method of claim 2, wherein the network system between the terminal and the cloud platform comprises: a high-speed uplink and a normal uplink;
the sending the real-world-based recorded video data to a cloud platform through a network system comprises:
sending the key data frame to the cloud platform through the high-speed uplink, and sending the general data frame to the cloud platform through the normal uplink.
4. An augmented reality projection method applied to a cloud platform, the method comprising:
receiving real-world-based recorded video data forwarded by a terminal through a network system, wherein the real-world-based recorded video data is recorded by augmented reality equipment and transmitted to the terminal;
performing video decoding on the recorded video data, and generating a virtual image based on the recorded video data;
and forwarding the virtual image to the terminal through the network system so that the terminal transmits the virtual image to the augmented reality equipment for display, wherein the augmented reality equipment projects the virtual image to an optical virtual-real synthesis module through a display.
5. The method of claim 4, wherein the network system between the terminal and the cloud platform comprises: a high-speed uplink and a normal uplink; the recorded video data comprises a coded key data frame and a general data frame;
the receiving real-world-based recorded video data forwarded by a terminal through a network system comprises:
receiving the key data frame transmitted through the high-speed uplink and the general data frame transmitted through the normal uplink.
6. The method of claim 5, wherein said video decoding said recorded video data and generating a virtual image based on said recorded video data comprises:
performing video decoding on the key data frame and the general data frame respectively, and generating the virtual image based on a preset processing algorithm.
7. An augmented reality projection system, comprising: augmented reality equipment, a terminal, a network system and a cloud platform;
the augmented reality equipment is in communication connection with the terminal, and the terminal is in communication connection with the cloud platform through the network system;
the terminal is used for executing the method of any one of claims 1 to 3, and the cloud platform is used for executing the method of any one of claims 4 to 6.
8. The augmented reality projection system of claim 7, wherein the augmented reality device is a wearable augmented reality device comprising: a camera module, a display, and an optical virtual-real synthesis module.
9. The augmented reality projection system of claim 7, wherein the network system between the terminal and the cloud platform comprises: a high-speed uplink and a normal uplink.
10. The augmented reality projection system of claim 9, wherein the network system is a 5G network system.
CN202111286425.XA 2021-11-02 2021-11-02 Augmented reality projection method and system Pending CN114025133A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202111286425.XA CN114025133A (en) 2021-11-02 2021-11-02 Augmented reality projection method and system


Publications (1)

Publication Number Publication Date
CN114025133A true CN114025133A (en) 2022-02-08

Family

ID=80059635

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202111286425.XA Pending CN114025133A (en) 2021-11-02 2021-11-02 Augmented reality projection method and system

Country Status (1)

Country Link
CN (1) CN114025133A (en)

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103580773A (en) * 2012-07-18 2014-02-12 中兴通讯股份有限公司 Method and device for transmitting data frame
CN104076513A (en) * 2013-03-26 2014-10-01 精工爱普生株式会社 Head-mounted display device, control method of head-mounted display device, and display system
CN107222529A (en) * 2017-05-22 2017-09-29 北京邮电大学 Augmented reality processing method, WEB modules, terminal and cloud server
CN107332977A (en) * 2017-06-07 2017-11-07 安徽华米信息科技有限公司 The method and augmented reality equipment of augmented reality
CN111443814A (en) * 2020-04-09 2020-07-24 深圳市瑞云科技有限公司 AR glasses system and method based on cloud rendering
CN213423602U (en) * 2020-08-07 2021-06-11 北京亮亮视野科技有限公司 Binocular optical waveguide augmented reality intelligent glasses and assembly



Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination