CN107247928B - Method and system for constructing AR scene based on horizontal angle of recognition graph - Google Patents

Method and system for constructing AR scene based on horizontal angle of recognition graph

Info

Publication number
CN107247928B
CN107247928B
Authority
CN
China
Prior art keywords
horizontal
angle
graph
normal
mobile terminal
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201710369723.2A
Other languages
Chinese (zh)
Other versions
CN107247928A (en)
Inventor
胡德志
孙碧亮
万厚亮
谢为杨
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Wuhan Show Baby Software Co ltd
Original Assignee
Wuhan Show Baby Software Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Wuhan Show Baby Software Co ltd filed Critical Wuhan Show Baby Software Co ltd
Priority to CN201710369723.2A priority Critical patent/CN107247928B/en
Publication of CN107247928A publication Critical patent/CN107247928A/en
Application granted granted Critical
Publication of CN107247928B publication Critical patent/CN107247928B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical


Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06V - IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V 20/00 - Scenes; Scene-specific elements
    • G06V 20/20 - Scenes; Scene-specific elements in augmented reality scenes
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 - Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/011 - Arrangements for interaction with the human body, e.g. for user immersion in virtual reality

Abstract

The invention relates to a method and a system for constructing an AR scene based on the horizontal angle of a recognition graph. The method comprises the following steps: S1, setting different matched models for the same recognition graph at different horizontal angles in the horizontal plane; S2, determining a plurality of horizontal angles of the recognition graph; and S3, dynamically loading the corresponding matched model for the recognition graph according to each of the plurality of horizontal angles, thereby generating different AR scenes. According to the method for constructing an AR scene based on the horizontal angle of a recognition graph of the invention, the horizontal angle of the recognition graph is recognized, and different models, together with matched special effects, sounds and the like, are loaded according to the real-time horizontal angle of the recognition graph, so that a user can experience the enjoyment brought by AR technology more deeply; this greatly innovates on the previously rigid recognition-graph scanning mode and gives the user a brand-new experience.

Description

Method and system for constructing AR scene based on horizontal angle of recognition graph
Technical Field
The invention relates to the technical field of augmented reality, and in particular to a method and a system for constructing an AR scene based on the horizontal angle of a recognition graph.
Background
In today's mobile AR applications, an AR scene is constructed by using the camera of a mobile terminal to recognize a recognition graph in the real world and then generating the corresponding model on the mobile terminal, as in the highly popular early-education cards on the market. However, this card-scanning mode of generating models is very rigid: only one fixed model can be scanned out of the same recognition graph, and no richer interaction with the real-world card is possible. This is a difficult problem in urgent need of a solution.
Disclosure of Invention
The technical problem to be solved by the invention is to provide a method and a system for constructing an AR scene based on the horizontal angle of a recognition graph, capable of recognizing a variety of different AR scenes from the same recognition graph.
The technical solution adopted by the invention to solve the above technical problem is as follows: a method for constructing an AR scene based on the horizontal angle of a recognition graph comprises the following steps:
S1, setting different matched models for the same recognition graph at different horizontal angles in the horizontal plane;
S2, determining a plurality of horizontal angles of the recognition graph;
S3, dynamically loading the corresponding matched model for the recognition graph according to each of the plurality of horizontal angles, thereby generating different AR scenes.
The invention has the beneficial effects that: according to the method for constructing an AR scene based on the horizontal angle of a recognition graph of the invention, the horizontal angle of the recognition graph is recognized, and different models, together with matched special effects, sounds and the like, are loaded according to the real-time horizontal angle of the recognition graph, so that a user can experience the enjoyment brought by AR technology more deeply; this greatly innovates on the previously rigid recognition-graph scanning mode and gives the user a brand-new experience.
On the basis of the technical scheme, the invention can be further improved as follows.
Further, in S2, the horizontal angle of the recognition graph is determined by using a mobile terminal with a camera.
Further, S2 comprises the following steps:
S21, acquiring the included angle A between the camera-bearing face of the mobile terminal and the horizontal plane;
S22, establishing a normal l perpendicular to the camera-bearing face of the mobile terminal and a normal m perpendicular to the recognition graph, and acquiring the included angle D between the normal l and the normal m;
S23, calculating the horizontal angle B of the recognition graph from the included angle A and the included angle D, where B = A - 180° + D.
Further, the mobile terminal is a mobile terminal provided with a gravity sensing module and the Unity tool, and the Unity tool calls the gravity sensing module of the mobile terminal to obtain the included angle A between the camera-bearing face of the mobile terminal and the horizontal plane.
Further, the normal l is established in the Unity tool from the camera, the normal m is emitted from the recognition graph itself, and the included angle D between the normal l and the normal m is obtained through the Unity tool.
The beneficial effect of adopting the above further solution is that the current horizontal angle of the recognition graph can be recognized more intelligently through the gravity sensing module of the mobile terminal, and the recognition method is simple.
Based on the above method for constructing an AR scene based on the horizontal angle of a recognition graph, the invention also provides a system for constructing an AR scene based on the horizontal angle of a recognition graph.
A system for constructing an AR scene based on the horizontal angle of a recognition graph
comprises a matched model setting module, a horizontal angle recognition module and a matched model loading module, wherein
the matched model setting module is used for setting different matched models for the same recognition graph at different horizontal angles in the horizontal plane;
the horizontal angle recognition module is used for determining a plurality of horizontal angles of the recognition graph;
the matched model loading module is used for dynamically loading the corresponding matched model for the recognition graph according to each of the plurality of horizontal angles, and generating different AR scenes.
The invention has the beneficial effects that: according to the system for constructing an AR scene based on the horizontal angle of a recognition graph of the invention, the horizontal angle of the recognition graph is recognized, and different models, together with matched special effects, sounds and the like, are loaded according to the real-time horizontal angle of the recognition graph, so that a user can experience the enjoyment brought by AR technology more deeply; this greatly innovates on the previously rigid recognition-graph scanning mode and gives the user a brand-new experience.
On the basis of the technical scheme, the invention can be further improved as follows.
Further, in the horizontal angle recognition module, a mobile terminal with a camera is used to determine the horizontal angle of the recognition graph.
Further, the process by which the horizontal angle recognition module determines the horizontal angle of the recognition graph is specifically as follows:
acquiring the included angle A between the camera-bearing face of the mobile terminal and the horizontal plane;
establishing a normal l perpendicular to the camera-bearing face of the mobile terminal and a normal m perpendicular to the recognition graph, and acquiring the included angle D between the normal l and the normal m;
calculating the horizontal angle B of the recognition graph from the included angle A and the included angle D, where B = A - 180° + D.
Further, the mobile terminal is a mobile terminal provided with a gravity sensing module and the Unity tool, and the Unity tool calls the gravity sensing module of the mobile terminal to obtain the included angle A between the camera-bearing face of the mobile terminal and the horizontal plane.
Further, the normal l is established in the Unity tool from the camera, the normal m is emitted from the recognition graph itself, and the included angle D between the normal l and the normal m is obtained through the Unity tool.
The beneficial effect of adopting the above further solution is that the current horizontal angle of the recognition graph can be recognized more intelligently through the gravity sensing module of the mobile terminal, and the recognition method is simple.
Drawings
FIG. 1 is a flow chart of a method for constructing an AR scene based on the horizontal angle of a recognition graph according to the present invention;
FIG. 2 is a flow chart of determining the horizontal angle of the recognition graph in the method for constructing an AR scene based on the horizontal angle of the recognition graph according to the present invention;
FIG. 3 is a schematic diagram of the distribution of the x, y and z axes of the gravity sensor of the mobile terminal in the method for constructing an AR scene based on the horizontal angle of the recognition graph according to the present invention;
FIG. 4 is a side-view model structure diagram for determining the horizontal angle of the recognition graph in the method for constructing an AR scene based on the horizontal angle of the recognition graph according to the present invention;
FIG. 5 is another side-view model structure diagram for determining the horizontal angle of the recognition graph in the method for constructing an AR scene based on the horizontal angle of the recognition graph according to the present invention;
FIG. 6 is a structural block diagram of a system for constructing an AR scene based on the horizontal angle of the recognition graph according to the present invention.
Detailed Description
The principles and features of this invention are described below in conjunction with the following drawings, which are set forth by way of illustration only and are not intended to limit the scope of the invention.
As shown in fig. 1, a method for constructing an AR scene based on the horizontal angle of a recognition graph comprises the following steps (an illustrative code sketch of the angle-to-model mapping of S1 follows these steps):
S1, setting different matched models for the same recognition graph at different horizontal angles in the horizontal plane;
S2, determining a plurality of horizontal angles of the recognition graph;
S3, dynamically loading the corresponding matched model for the recognition graph according to each of the plurality of horizontal angles, thereby generating different AR scenes.
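As an illustration of S1 only, the following C# sketch (for the Unity environment named in this patent) shows one possible way to associate angle ranges of a single recognition graph with different matched models. The class name AngleModelTable, the AngleRange fields and the use of explicit angle ranges are assumptions of this sketch, not details taken from the patent.

```csharp
using System.Collections.Generic;
using UnityEngine;

// Minimal sketch of step S1: associating angle ranges of one recognition graph
// with different matched models (prefabs). All names and range boundaries here
// are illustrative assumptions, not values defined by the patent.
public class AngleModelTable : MonoBehaviour
{
    [System.Serializable]
    public struct AngleRange
    {
        public float minDegrees;   // inclusive lower bound of the horizontal angle B
        public float maxDegrees;   // exclusive upper bound of the horizontal angle B
        public GameObject prefab;  // matched model (its prefab may carry effects and audio)
    }

    // Filled in the Unity inspector, e.g. 0-30 degrees -> "mouse" prefab, 60-120 degrees -> "tiger" prefab.
    public List<AngleRange> ranges = new List<AngleRange>();

    // Returns the prefab matched to the given horizontal angle B, or null if no range matches.
    public GameObject Lookup(float horizontalAngleB)
    {
        foreach (AngleRange r in ranges)
        {
            if (horizontalAngleB >= r.minDegrees && horizontalAngleB < r.maxDegrees)
                return r.prefab;
        }
        return null;
    }
}
```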
In S2, a mobile terminal with a camera is used to determine the horizontal angle of the recognition graph; as shown in fig. 2, the determination process comprises the following steps:
S21, acquiring the included angle A between the camera-bearing face of the mobile terminal and the horizontal plane;
S22, establishing a normal l perpendicular to the camera-bearing face of the mobile terminal and a normal m perpendicular to the recognition graph, and acquiring the included angle D between the normal l and the normal m;
S23, calculating the horizontal angle B of the recognition graph from the included angle A and the included angle D, where B = A - 180° + D.
In this embodiment, the mobile terminal is a mobile terminal provided with a gravity sensing module and the Unity tool, and the Unity tool is used to call the gravity sensing module of the mobile terminal to obtain the included angle A between the camera-bearing face of the mobile terminal and the horizontal plane. The normal l is established in the Unity tool from the camera, the normal m is emitted from the recognition graph itself, and the included angle D between the normal l and the normal m is obtained through the Unity tool.
The principle of determining the horizontal angle of the recognition graph in the method for constructing an AR scene based on the horizontal angle of the recognition graph according to the present invention is described below, taking as an example a mobile terminal (specifically, a smartphone) with a camera, a gravity sensing module and the Unity tool.
Fig. 3 is a schematic diagram of the distribution of the x, y and z axes of the gravity sensor of the mobile terminal in the method for constructing an AR scene based on the horizontal angle of the recognition graph according to the present invention, wherein (a code sketch of reading these axis components to obtain the angle A follows the list below):
X axis: with the home button at the bottom and the mobile terminal facing the sky, the gravity component is +1.0 when the terminal is rotated 90 degrees to the right and -1.0 when it is rotated 90 degrees to the left;
Y axis: the gravity component is +1.0 when the home button end of the mobile terminal points toward the sky (terminal held upside down) and -1.0 when the home button end points toward the ground (terminal held upright);
Z axis: the gravity component is +1.0 when the mobile terminal faces the ground and -1.0 when it faces the sky.
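As a hedged illustration only, the following C# sketch reads the gravity sensor through Unity's Input.acceleration and derives the included angle A between the camera-bearing face of the device and the horizontal plane. The class name DeviceTilt and the assumption that the device's +z axis points out of the screen (consistent with the axis readings listed above) are this sketch's own, not the patent's.

```csharp
using UnityEngine;

// Minimal sketch of obtaining the included angle A between the camera-bearing
// face of the device and the horizontal plane from the gravity reading.
// Assumption: Unity's Input.acceleration is used as the gravity reading, and
// the device's +z axis points out of the screen.
public class DeviceTilt : MonoBehaviour
{
    // Angle A in degrees: roughly 0 when the device lies flat with the screen up,
    // and roughly 90 when the device is held upright.
    public static float AngleA()
    {
        // At rest, Input.acceleration points approximately toward the ground.
        Vector3 worldUpInDeviceCoords = -Input.acceleration.normalized;
        Vector3 screenNormal = Vector3.forward; // device +z, pointing out of the screen
        return Vector3.Angle(screenNormal, worldUpInDeviceCoords);
    }

    void Update()
    {
        Debug.Log("Included angle A: " + AngleA());
    }
}
```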
In the Unity tool, the gravity sensing module of the mobile terminal can be called to obtain the included angle A between the camera-bearing face of the mobile terminal and the horizontal plane at that moment. Let the included angle between the recognition graph and the horizontal plane at that moment (also called the horizontal angle) be B. The camera of the mobile terminal establishes a normal ray l in Unity, and the recognition graph likewise emits its own normal ray m. If the included angle B between the recognition graph and the horizontal plane is smaller than 90 degrees, the side-view model of the horizontal angle of the recognition graph at that moment is as shown in fig. 4. The included angle D between the normal l and the normal m can be obtained with Vector3.Angle in the Unity tool. The included angle C between the plane of the mobile terminal and the plane of the recognition graph is the supplement of the included angle between the normals, i.e. C = 180° - D; since B = A - C, it follows that B = A - 180° + D, which gives the included angle B between the recognition graph and the horizontal direction at that moment.
If the included angle B between the recognition graph and the horizontal plane is greater than 90 degrees, the side-view model of the horizontal angle of the recognition graph at that moment is as shown in fig. 5. In this case the included angle between the plane of the mobile terminal and the plane of the recognition graph is again the supplement of the included angle between the normals, so C = 180° - D, B = A - C, and after conversion B = A - 180° + D. This shows that the formula B = A - 180° + D holds regardless of whether the included angle between the recognition graph and the horizontal direction is greater than 90 degrees.
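The following C# sketch puts the relation B = A - 180° + D into Unity code using Vector3.Angle. The way the marker's pose is exposed (here as a Transform whose up axis is taken as the normal m) depends on the tracking SDK and is an assumption of this sketch, as are the class and field names.

```csharp
using UnityEngine;

// Minimal sketch of B = A - 180 + D in Unity. The recognized graph's pose is
// assumed to be exposed by the tracking SDK as a Transform whose up axis is
// the normal m; that assumption, and all names below, are this sketch's own.
public class MarkerHorizontalAngle : MonoBehaviour
{
    public Transform markerTransform;  // pose of the recognized graph, supplied by the tracking SDK
    public Camera arCamera;            // the AR camera rendering the scene

    // Computes the horizontal angle B of the recognition graph from the device tilt A.
    public float ComputeB(float angleA)
    {
        Vector3 normalL = arCamera.transform.forward;    // normal ray l cast from the camera
        Vector3 normalM = markerTransform.up;            // normal ray m emitted by the graph
        float angleD = Vector3.Angle(normalL, normalM);  // included angle D in degrees (0..180)
        return angleA - 180f + angleD;                   // B = A - 180 + D
    }
}
```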
The method for constructing an AR scene based on the horizontal angle of a recognition graph judges the size of the horizontal angle of the recognition graph and loads different models, together with matched special effects and sounds, according to that real-time angle. For example, when the recognition graph is placed horizontally, scanning it shows a mouse digging its way out of a hole, matched with digging special effects and sounds; when the recognition graph is placed vertically, scanning it with the mobile terminal shows a tiger flying straight toward the user, matched with flying-dust special effects and flying sounds. The same recognition graph can thus bring completely different AR effects, which greatly innovates on the previously rigid recognition-graph scanning mode and gives users a brand-new experience.
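To illustrate S3 in code, the sketch below ties the previous sketches together: it recomputes B each frame and swaps in the prefab matched to the current angle range (the prefab itself can carry the particle effects and audio mentioned above). The frame-by-frame polling strategy and the component names are assumptions of this sketch, not details prescribed by the patent.

```csharp
using UnityEngine;

// Illustrative sketch of step S3: recompute B every frame and swap in the model
// matched to the current angle range.
public class AngleDrivenLoader : MonoBehaviour
{
    public MarkerHorizontalAngle angleSource;  // from the previous sketch
    public AngleModelTable table;              // angle-range -> prefab mapping from the earlier sketch

    private GameObject currentInstance;
    private GameObject currentPrefab;

    void Update()
    {
        float b = angleSource.ComputeB(DeviceTilt.AngleA());
        GameObject prefab = table.Lookup(b);

        if (prefab != null && prefab != currentPrefab)
        {
            if (currentInstance != null)
                Destroy(currentInstance);

            // The instantiated prefab can carry its own particle effects and audio source,
            // e.g. digging effects for the "mouse" model or dust and flying sounds for the "tiger" model.
            currentInstance = Instantiate(prefab,
                                          angleSource.markerTransform.position,
                                          angleSource.markerTransform.rotation);
            currentPrefab = prefab;
        }
    }
}
```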
Based on the above method for constructing an AR scene based on the horizontal angle of a recognition graph, the invention also provides a system for constructing an AR scene based on the horizontal angle of a recognition graph.
As shown in fig. 6, a system for constructing an AR scene based on the horizontal angle of a recognition graph comprises a matched model setting module, a horizontal angle recognition module and a matched model loading module, wherein
the matched model setting module is used for setting different matched models for the same recognition graph at different horizontal angles in the horizontal plane;
the horizontal angle recognition module is used for determining a plurality of horizontal angles of the recognition graph;
the matched model loading module is used for dynamically loading the corresponding matched model for the recognition graph according to each of the plurality of horizontal angles, and generating different AR scenes.
In the horizontal angle recognition module, a mobile terminal with a camera is used to determine the horizontal angle of the recognition graph; the determination process is specifically as follows:
acquiring the included angle A between the camera-bearing face of the mobile terminal and the horizontal plane;
establishing a normal l perpendicular to the camera-bearing face of the mobile terminal and a normal m perpendicular to the recognition graph, and acquiring the included angle D between the normal l and the normal m;
calculating the horizontal angle B of the recognition graph from the included angle A and the included angle D, where B = A - 180° + D.
In a specific embodiment of the present invention, the mobile terminal is a mobile terminal provided with a gravity sensing module and the Unity tool, and the Unity tool is used to call the gravity sensing module of the mobile terminal to obtain the included angle A between the camera-bearing face of the mobile terminal and the horizontal plane. The normal l is established in the Unity tool from the camera, the normal m is emitted from the recognition graph itself, and the included angle D between the normal l and the normal m is obtained through the Unity tool.
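For orientation only, the three modules of the system described above can be pictured as the following C# interfaces; the interface and method names are illustrative assumptions and are not defined by the patent.

```csharp
using UnityEngine;

// Illustrative decomposition of the claimed system into three interfaces.
public interface IMatchedModelSetting
{
    // Returns the model matched to a given horizontal angle B of the recognition graph.
    GameObject ModelFor(float horizontalAngleB);
}

public interface IHorizontalAngleRecognition
{
    // Determines the current horizontal angle B of the recognition graph.
    float CurrentHorizontalAngle();
}

public interface IMatchedModelLoading
{
    // Dynamically loads the matched model at the recognized graph's pose to build the AR scene.
    void Load(GameObject matchedModel, Transform graphPose);
}
```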
According to the system for constructing an AR scene based on the horizontal angle of a recognition graph of the invention, the horizontal angle of the recognition graph is recognized, and different models, together with matched special effects, sounds and the like, are loaded according to the real-time horizontal angle of the recognition graph, so that a user can experience the enjoyment brought by AR technology more deeply; this greatly innovates on the previously rigid recognition-graph scanning mode and gives the user a brand-new experience.
The above description is only for the purpose of illustrating the preferred embodiments of the present invention and is not to be construed as limiting the invention, and any modifications, equivalents, improvements and the like that fall within the spirit and principle of the present invention are intended to be included therein.

Claims (6)

1. A method for constructing an AR scene based on the horizontal angle of a recognition graph, characterized by comprising the following steps:
S1, setting different matched models for the same recognition graph at different horizontal angles in the horizontal plane;
S2, determining a plurality of horizontal angles of the recognition graph;
S3, dynamically loading the corresponding matched model for the recognition graph according to each of the plurality of horizontal angles, and generating different AR scenes;
wherein in S2, the horizontal angle of the recognition graph is determined by using a mobile terminal with a camera;
and S2 comprises the following steps:
S21, acquiring the included angle A between the camera-bearing face of the mobile terminal and the horizontal plane;
S22, establishing a normal l perpendicular to the camera-bearing face of the mobile terminal and a normal m perpendicular to the recognition graph, and acquiring the included angle D between the normal l and the normal m;
S23, calculating the horizontal angle B of the recognition graph from the included angle A and the included angle D, where B = A - 180° + D.
2. The method for constructing an AR scene based on the horizontal angle of a recognition graph according to claim 1, characterized in that: the mobile terminal is a mobile terminal provided with a gravity sensing module and the Unity tool, and the Unity tool is used to call the gravity sensing module of the mobile terminal to obtain the included angle A between the camera-bearing face of the mobile terminal and the horizontal plane.
3. The method for constructing an AR scene based on the horizontal angle of a recognition graph according to claim 2, characterized in that: the normal l is established in the Unity tool from the camera, the normal m is emitted from the recognition graph itself, and the included angle D between the normal l and the normal m is obtained through the Unity tool.
4. A system for constructing an AR scene based on the horizontal angle of a recognition graph, characterized by comprising a matched model setting module, a horizontal angle recognition module and a matched model loading module, wherein
the matched model setting module is used for setting different matched models for the same recognition graph at different horizontal angles in the horizontal plane;
the horizontal angle recognition module is used for determining a plurality of horizontal angles of the recognition graph;
the matched model loading module is used for dynamically loading the corresponding matched model for the recognition graph according to each of the plurality of horizontal angles to generate different AR scenes;
in the horizontal angle recognition module, a mobile terminal with a camera is used to determine the horizontal angle of the recognition graph;
the process by which the horizontal angle recognition module determines the horizontal angle of the recognition graph is specifically as follows:
acquiring the included angle A between the camera-bearing face of the mobile terminal and the horizontal plane;
establishing a normal l perpendicular to the camera-bearing face of the mobile terminal and a normal m perpendicular to the recognition graph, and acquiring the included angle D between the normal l and the normal m;
calculating the horizontal angle B of the recognition graph from the included angle A and the included angle D, where B = A - 180° + D.
5. The system for constructing an AR scene based on the horizontal angle of a recognition graph according to claim 4, characterized in that: the mobile terminal is a mobile terminal provided with a gravity sensing module and the Unity tool, and the Unity tool is used to call the gravity sensing module of the mobile terminal to obtain the included angle A between the camera-bearing face of the mobile terminal and the horizontal plane.
6. The system for constructing an AR scene based on the horizontal angle of a recognition graph according to claim 5, characterized in that: the normal l is established in the Unity tool from the camera, the normal m is emitted from the recognition graph itself, and the included angle D between the normal l and the normal m is obtained through the Unity tool.
CN201710369723.2A 2017-05-23 2017-05-23 Method and system for constructing AR scene based on horizontal angle of recognition graph Active CN107247928B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201710369723.2A CN107247928B (en) 2017-05-23 2017-05-23 Method and system for constructing AR scene based on horizontal angle of recognition graph

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201710369723.2A CN107247928B (en) 2017-05-23 2017-05-23 Method and system for constructing AR scene based on horizontal angle of recognition graph

Publications (2)

Publication Number Publication Date
CN107247928A CN107247928A (en) 2017-10-13
CN107247928B true CN107247928B (en) 2020-06-23

Family

ID=60016661

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201710369723.2A Active CN107247928B (en) 2017-05-23 2017-05-23 Method and system for constructing AR scene based on horizontal angle of recognition graph

Country Status (1)

Country Link
CN (1) CN107247928B (en)

Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103365596A (en) * 2013-07-01 2013-10-23 天脉聚源(北京)传媒科技有限公司 Method and device for controlling virtual world
CN103456301A (en) * 2012-05-28 2013-12-18 中兴通讯股份有限公司 Ambient sound based scene recognition method and device and mobile terminal
CN103514446A (en) * 2013-10-16 2014-01-15 北京理工大学 Outdoor scene recognition method fused with sensor information
CN104575130A (en) * 2014-11-07 2015-04-29 马振轩 Multi-picture combining and recognizing cartoon education application system for augmented reality technology
CN105023294A (en) * 2015-07-13 2015-11-04 中国传媒大学 Fixed point movement augmented reality method combining sensors and Unity3D
CN105448292A (en) * 2014-08-19 2016-03-30 北京羽扇智信息科技有限公司 Scene-based real-time voice recognition system and method
CN106293058A (en) * 2016-07-20 2017-01-04 广东小天才科技有限公司 The method for changing scenes of virtual reality device and device for changing scenes
CN106571072A (en) * 2015-10-26 2017-04-19 苏州梦想人软件科技有限公司 Method for realizing children education card based on AR

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP6107276B2 (en) * 2013-03-22 2017-04-05 セイコーエプソン株式会社 Head-mounted display device and method for controlling head-mounted display device

Patent Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103456301A (en) * 2012-05-28 2013-12-18 中兴通讯股份有限公司 Ambient sound based scene recognition method and device and mobile terminal
CN103365596A (en) * 2013-07-01 2013-10-23 天脉聚源(北京)传媒科技有限公司 Method and device for controlling virtual world
CN103514446A (en) * 2013-10-16 2014-01-15 北京理工大学 Outdoor scene recognition method fused with sensor information
CN105448292A (en) * 2014-08-19 2016-03-30 北京羽扇智信息科技有限公司 Scene-based real-time voice recognition system and method
CN104575130A (en) * 2014-11-07 2015-04-29 马振轩 Multi-picture combining and recognizing cartoon education application system for augmented reality technology
CN105023294A (en) * 2015-07-13 2015-11-04 中国传媒大学 Fixed point movement augmented reality method combining sensors and Unity3D
CN106571072A (en) * 2015-10-26 2017-04-19 苏州梦想人软件科技有限公司 Method for realizing children education card based on AR
CN106293058A (en) * 2016-07-20 2017-01-04 广东小天才科技有限公司 The method for changing scenes of virtual reality device and device for changing scenes

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
Mobile augmented reality optical experiment platform based on Unity3D; 陈泽婵 et al.; Journal of Computer Applications; 2015-12-15; Vol. 35 (S2); 194-199 *

Also Published As

Publication number Publication date
CN107247928A (en) 2017-10-13

Similar Documents

Publication Publication Date Title
CN111445583B (en) Augmented reality processing method and device, storage medium and electronic equipment
US20090257730A1 (en) Video server, video client device and video processing method thereof
US9392248B2 (en) Dynamic POV composite 3D video system
CN111556278A (en) Video processing method, video display device and storage medium
US10115127B2 (en) Information processing system, information processing method, communications terminals and control method and control program thereof
CN109409244B (en) Output method of object placement scheme and mobile terminal
CN107566749B (en) Shooting method and mobile terminal
CN111311756B (en) Augmented reality AR display method and related device
CN112991553B (en) Information display method and device, electronic equipment and storage medium
CN103914876A (en) Method and apparatus for displaying video on 3D map
CN108320263A (en) A kind of method, device and mobile terminal of image procossing
CN112272311B (en) Method, device, terminal, server and medium for repairing splash screen
CN108668108A (en) A kind of method, apparatus and electronic equipment of video monitoring
EP4186033A2 (en) Map for augmented reality
CN110740285A (en) telematics method and device
CN112581571A (en) Control method and device of virtual image model, electronic equipment and storage medium
CN107203961B (en) Expression migration method and electronic equipment
WO2019000464A1 (en) Image display method and device, storage medium, and terminal
CN114638885A (en) Intelligent space labeling method and system, electronic equipment and storage medium
CN108537878B (en) Environment model generation method and device, storage medium and electronic equipment
CN107247928B (en) Method and system for constructing AR scene based on horizontal angle of recognition graph
CN113014960B (en) Method, device and storage medium for online video production
CN113160270A (en) Visual map generation method, device, terminal and storage medium
CN112070901A (en) AR scene construction method and device for garden, storage medium and terminal
US10068147B2 (en) System and method for insertion of photograph taker into a photograph

Legal Events

Date Code Title Description
PB01 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
SE01 Entry into force of request for substantive examination
GR01 Patent grant
GR01 Patent grant