CN109118592B - AR presentation compensation effect realization method and system - Google Patents


Info

Publication number
CN109118592B
CN109118592B (application CN201810986370.5A)
Authority
CN
China
Prior art keywords
coordinate system
standard
space coordinate
virtual
camera
Prior art date
Legal status
Active
Application number
CN201810986370.5A
Other languages
Chinese (zh)
Other versions
CN109118592A (en)
Inventor
周伟胜
陈国镇
罗龙
Current Assignee
Guangzhou Yingxin Education Technology Co ltd
Original Assignee
Guangzhou Yingxin Education Technology Co ltd
Priority date
Filing date
Publication date
Application filed by Guangzhou Yingxin Education Technology Co ltd filed Critical Guangzhou Yingxin Education Technology Co ltd
Priority to CN201810986370.5A priority Critical patent/CN109118592B/en
Publication of CN109118592A publication Critical patent/CN109118592A/en
Application granted granted Critical
Publication of CN109118592B publication Critical patent/CN109118592B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • G — PHYSICS
    • G06 — COMPUTING; CALCULATING OR COUNTING
    • G06T — IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T19/00 — Manipulating 3D models or images for computer graphics
    • G06T19/006 — Mixed reality

Abstract

The invention discloses a method for realizing a compensation effect in AR presentation, comprising the following steps: S1, constructing a standard space coordinate system and positioning the recognition object in it; S2, tracking the recognition object in real time and, if it is not lost, positioning it through the standard space coordinate system; S3, tracking the recognition object in real time and, if it is lost, constructing a virtual space coordinate system and positioning the recognition object through the virtual space coordinate system; when the recognition object reappears, returning to step S1. The invention also discloses a system for realizing the compensation effect in AR presentation. With this method, compensation is achieved by recording the standard coordinate vector before the recognition object is lost and constructing the virtual space coordinate system from that vector; when information about the recognition object is acquired again, the system returns to the default state and the experience is re-created. The virtual space coordinate system is therefore no longer fixed, the change during switching between the virtual and standard space coordinate systems is small, and the switching experience is good.

Description

AR presentation compensation effect realization method and system
Technical Field
The invention relates to the technical field of augmented reality, in particular to a method and a system for realizing a compensation effect in AR presentation.
Background
With the popularization of electronic devices, their performance has steadily improved while prices have fallen, and AR (Augmented Reality)-capable devices have become affordable to ordinary consumers. An electronic device needs only a camera to perform spatial positioning and generate an augmented 3D effect. However, the recognizable image or object may be affected by many factors (e.g., the camera's imaging capability, whether a complete recognizable image is available, and lighting stability), resulting in jittery AR positioning or no AR effect at all.
At present, when the data processing of a real object is physically obstructed, it cannot be compensated by an algorithm and no AR effect can be generated; such approaches cannot adapt to the low-end devices and varied environments found in the market, so the unstable state makes the AR effect, and with it the user experience, very poor. In the prior art, AR positioning is often handled by switching directly to a fixed coordinate system; when a new real object is triggered again, the coordinate-system switch produces a large jump, and the switching experience is very poor.
Disclosure of Invention
The technical problem to be solved by the invention is to provide a method and a system for realizing a compensation effect in AR presentation, in which a non-fixed virtual space coordinate system is constructed by recording the standard coordinate vector before the recognition object is lost, so that the change during switching between the virtual space coordinate system and the standard space coordinate system is small and the switching experience is good.
In order to solve the above technical problems, the present invention provides a method for implementing compensation effect of AR presentation, including:
S1, constructing a standard space coordinate system and positioning the recognition object in the standard space coordinate system;
S2, tracking the recognition object in real time and, if it is not lost, positioning it through the standard space coordinate system;
S3, tracking the recognition object in real time and, if it is lost, constructing a virtual space coordinate system and positioning the recognition object through the virtual space coordinate system; when the recognition object reappears, returning to step S1.
As an improvement of the above solution, in step S1, the method for positioning the recognition object in the standard space coordinate system includes: setting the center point of the recognition object as the coordinate origin of the standard space coordinate system.
As an improvement of the above solution, in step S2, the method for positioning the recognition object through the standard space coordinate system includes: adjusting the coordinate origin of the standard space coordinate system in real time according to the center point of the recognition object; and acquiring the standard coordinate vector of the AR camera in real time according to the standard space coordinate system.
As an improvement of the above solution, in step S3, the method for constructing the virtual space coordinate system and positioning the recognition object through it includes: constructing the virtual space coordinate system from the latest standard coordinate vector of the AR camera in the standard space coordinate system; acquiring rotation data of the AR camera in real time; and changing the imaging angle of the AR camera according to the rotation data to acquire the spatial features of the virtual space coordinate system.
As an improvement of the above solution, the method for constructing the virtual space coordinate system from the latest standard coordinate vector of the AR camera in the standard space coordinate system includes: extracting the latest standard coordinate vector of the AR camera in the standard space coordinate system; setting the latest standard coordinate vector as the virtual coordinate vector; and constructing the virtual space coordinate system from the virtual coordinate vector.
Correspondingly, the invention also provides an AR presentation compensation effect realization system, comprising: a standard space coordinate system module for constructing a standard space coordinate system and positioning the recognition object in it; a standard positioning module for tracking the recognition object in real time and, if it is not lost, positioning it through the standard space coordinate system; and a virtual positioning module for tracking the recognition object in real time and, if it is lost, constructing a virtual space coordinate system and positioning the recognition object through it.
As an improvement of the above solution, the standard space coordinate system module includes a positioning unit for setting the center point of the recognition object as the coordinate origin of the standard space coordinate system.
As an improvement of the above solution, the standard positioning module includes: an adjusting unit for adjusting the coordinate origin of the standard space coordinate system in real time according to the center point of the recognition object; and a vector acquisition unit for acquiring the standard coordinate vector of the AR camera in real time according to the standard space coordinate system.
As an improvement of the above solution, the virtual positioning module includes: a virtual coordinate system construction unit for constructing a virtual space coordinate system from the latest standard coordinate vector of the AR camera in the standard space coordinate system; a data acquisition unit for acquiring rotation data of the AR camera in real time; and a feature acquisition unit for changing the imaging angle of the AR camera according to the rotation data to acquire the spatial features of the virtual space coordinate system.
As an improvement of the above solution, the virtual coordinate system construction unit includes: an extraction unit for extracting the latest standard coordinate vector of the AR camera in the standard space coordinate system; a setting unit for setting the latest standard coordinate vector as the virtual coordinate vector; and a construction unit for constructing the virtual space coordinate system from the virtual coordinate vector.
The implementation of the invention has the following beneficial effects:
the invention realizes compensation by recording the standard coordinate vector before the recognition object is lost and constructing the virtual space coordinate system from that vector; when information about the recognition object is acquired again, the system returns to the default state and the experience is re-created. The virtual space coordinate system is therefore no longer fixed, the change during switching between the virtual and standard space coordinate systems is small, and the switching experience is good.
Drawings
FIG. 1 is a flow chart of the AR presentation compensation effect realization method of the present invention;
FIG. 2 is a schematic diagram of the standard space coordinate system of the present invention;
FIG. 3 is a schematic diagram of the virtual space coordinate system of the present invention;
FIG. 4 is a schematic diagram of the rotation of the AR camera of the present invention;
FIG. 5 is a schematic structural diagram of the AR presentation compensation effect realization system of the present invention;
FIG. 6 is a schematic structural diagram of the standard space coordinate system module of the present invention;
FIG. 7 is a schematic structural diagram of the standard positioning module of the present invention;
FIG. 8 is a schematic structural diagram of the virtual positioning module of the present invention.
Detailed Description
The present invention will be described in further detail below with reference to the accompanying drawings, in order to make the objects, technical solutions and advantages of the present invention clearer. It should be noted that orientation terms such as up, down, left, right, front, back, inner and outer used herein refer only to the drawings of the present invention and are not intended to be limiting.
Referring to fig. 1, fig. 1 shows the AR presentation compensation effect realization method of the present invention, comprising:
S1, constructing a standard space coordinate system and positioning the recognition object in the standard space coordinate system;
the standard space coordinate system is: the real recognition object parameters are recognized by the AR recognition map technique to create a spatial coordinate system XYZ of the 3D art motion in U3D.
Specifically, in the step S1, the method for positioning the identifier to the standard space coordinate system includes: the center point of the recognition object is set as the origin (0, 0) of coordinates of the standard space coordinate system.
It should be noted that, the relationship between the standard space coordinate system and the object to be identified is adjusted by the 3D art preform in the U3D for each 3D scene identified by the AR, so as to ensure that the visual effect is that the object to be identified is the ground of the 3D model, and meanwhile, the center of the identification map is the origin (0, 0) of coordinates in the standard space coordinate system.
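The origin binding described above can be sketched as follows. This is an illustrative Python sketch, not the patent's U3D implementation: the function name and the tuple-based point representation are assumptions, and a full implementation would also carry the recognition object's orientation, which the prefab handles in U3D.

```python
def build_standard_frame(marker_center):
    """Return a function mapping world points into the standard space
    coordinate system XYZ, whose origin is the recognition object's center.
    Illustrative sketch; orientation handling is omitted."""
    cx, cy, cz = marker_center

    def world_to_standard(point):
        px, py, pz = point
        # The marker center becomes the coordinate origin (0, 0, 0).
        return (px - cx, py - cy, pz - cz)

    return world_to_standard

to_std = build_standard_frame((1.0, 2.0, 3.0))
origin = to_std((1.0, 2.0, 3.0))   # the marker center maps to (0.0, 0.0, 0.0)
```

Any other world point then receives coordinates relative to the recognition object, which is exactly the property step S2 maintains as the object moves.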
S2, tracking the recognition object in real time and, if it is not lost, positioning it through the standard space coordinate system (see fig. 2);
specifically, in the step S2, the method for positioning the identifier by using the standard space coordinate system includes:
S21, adjusting the coordinate origin of the standard space coordinate system in real time according to the center point of the recognition object;
if the recognition object is not lost, the original standard space coordinate system XYZ is always used to position the 3D model; that is, when the center point of the recognition object moves, the position of the coordinate origin of the standard space coordinate system is updated in real time, so that the origin follows the center point of the recognition object.
S22, acquiring a standard coordinate vector of the AR camera in real time according to a standard space coordinate system.
Since the center point of the recognition object may move, the coordinate vector of the AR camera relative to the recognition object also changes; therefore the standard coordinate vector (X, Y, Z) of the AR camera in the current standard space coordinate system XYZ needs to be recorded in real time.
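The per-frame recording of the camera's standard coordinate vector can be sketched as a small tracker. The class and method names are hypothetical; the patent only requires that the latest vector (X, Y, Z) be recorded in real time so that it survives the moment the recognition object is lost.

```python
class StandardTracker:
    """Records the AR camera's standard coordinate vector (X, Y, Z),
    i.e. its position relative to the recognition object's center,
    on every frame. Illustrative sketch with assumed names."""

    def __init__(self):
        self.last_vector = None   # kept after marker loss, for step S3

    def update(self, camera_pos, marker_center):
        # The origin follows the marker center, so the camera's standard
        # coordinate vector is simply its offset from that center.
        self.last_vector = tuple(c - m for c, m in zip(camera_pos, marker_center))
        return self.last_vector

tracker = StandardTracker()
vec = tracker.update((0.5, 1.8, -2.0), (0.0, 0.0, 0.0))   # -> (0.5, 1.8, -2.0)
```

`last_vector` is what step S3 reads when the recognition object disappears.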
S3, tracking the recognition object in real time and, if it is lost, constructing a virtual space coordinate system and positioning the recognition object through the virtual space coordinate system (see fig. 3); when the recognition object reappears, returning to step S1.
It should be noted that when the recognition object is lost, the standard space coordinate system can no longer be established because the recognition object cannot be recognized; that is, the standard space coordinate system is lost.
In step S3, the method for constructing the virtual space coordinate system and positioning the recognition object through it includes:
s31, constructing a virtual space coordinate system according to the latest standard coordinate vector of the AR camera in the standard space coordinate system;
With no spatial basis available, the virtual space coordinate system is reverse-generated from the latest standard coordinate vector of the AR camera in the standard space coordinate system before it disappeared.
Specifically, the step S31 includes:
s311, extracting the latest standard coordinate vector of the AR camera in a standard space coordinate system;
s312, setting the latest standard coordinate vector as a virtual coordinate vector;
s313, constructing a virtual space coordinate system according to the virtual coordinate vector.
It should be noted that the present invention reverse-forms the virtual space coordinate system X'Y'Z' by building a coordinate system around the recorded coordinate vector (X, Y, Z) of the AR camera.
The reverse creation proceeds as follows:
extract the vector (X, Y, Z) of the AR camera in the standard space coordinate system XYZ;
when the recognition object disappears because it can no longer be recognized, the standard space coordinate system XYZ required for spatial operations based on the real recognition object cannot be established;
the coordinate vector (X, Y, Z) at this moment is the coordinate vector data recorded before the standard space coordinate system XYZ disappeared, and is taken as the virtual coordinate vector (X', Y', Z') in the virtual space coordinate system X'Y'Z' to be created next;
let (X', Y', Z') = (X, Y, Z) to obtain the specific virtual coordinate vector (X', Y', Z'); by spatial geometry, all spatial data of the virtual space coordinate system X'Y'Z' containing the virtual coordinate vector (X', Y', Z') are then obtained as the reference data for new spatial operations.
Accordingly, the virtual space coordinate system X'Y'Z' does not change due to any movement of the device.
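The reverse creation can be sketched as anchoring the virtual origin so that the camera keeps its last standard coordinates, i.e. (X', Y', Z') = (X, Y, Z). This is a minimal sketch under the assumption that only translation needs to be fixed; orientation is handled by the gyroscope step, and all names are illustrative.

```python
def create_virtual_frame(camera_pos_world, last_standard_vector):
    """Reverse-create the virtual space coordinate system X'Y'Z': place its
    origin so that the camera's virtual coordinates equal its last recorded
    standard coordinates, (X', Y', Z') = (X, Y, Z). Illustrative sketch."""
    # Virtual origin in world space: step back from the camera by (X, Y, Z).
    origin = tuple(c - v for c, v in zip(camera_pos_world, last_standard_vector))

    def world_to_virtual(point):
        return tuple(p - o for p, o in zip(point, origin))

    return world_to_virtual

to_virtual = create_virtual_frame((2.0, 3.0, 4.0), (1.0, 1.0, 1.0))
cam = to_virtual((2.0, 3.0, 4.0))   # camera keeps coordinates (1.0, 1.0, 1.0)
```

Because the origin is fixed in world space at creation time, the frame does not change under any subsequent motion, matching the statement above.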
S32, acquiring rotation data of the AR camera in real time;
s33, changing the imaging angle of the AR camera according to the rotation data to acquire the spatial characteristics of the virtual spatial coordinate system.
As shown in fig. 4, as in conventional AR operation, the user can rotate the mobile phone (and with it the AR camera) and, through the gyroscope, view the 3D model in the virtual space coordinate system X'Y'Z' at any time. In the virtual space coordinate system X'Y'Z', which is detached from recognition-based positioning of the physical object (the recognition object), the real-world coordinates of the AR camera cannot be determined; that is, it is not known where the AR camera is in the world. However, because the AR camera has a gyroscope, its orientation can be known, so by acquiring the orientation data of the AR camera, rotation-based observation can still be performed in the virtual space coordinate system X'Y'Z' detached from the physical object. Since the virtual space coordinate system X'Y'Z' is based on the coordinates of the AR camera, translating the AR camera in any direction does not change the virtual space coordinate system at all; by rotating the AR camera, however, its rotation data can be obtained from the gyroscope, and every part of the virtual space coordinate system X'Y'Z' can be observed.
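Turning the camera's rotation data into a new imaging angle can be sketched by integrating a gyroscope sample into a rotation matrix. The patent does not prescribe an integration scheme; Rodrigues' formula below is one common choice, and the function name and axis conventions are assumptions.

```python
import math

def rotation_from_gyro(omega, dt):
    """Integrate one gyroscope sample omega (rad/s, body axes) over dt into
    a 3x3 rotation matrix via Rodrigues' formula, a generic way to update
    the AR camera's imaging angle in the virtual frame X'Y'Z'."""
    wx, wy, wz = omega
    norm = math.sqrt(wx * wx + wy * wy + wz * wz)
    if norm < 1e-12:
        return [[1.0, 0.0, 0.0], [0.0, 1.0, 0.0], [0.0, 0.0, 1.0]]
    angle = norm * dt
    x, y, z = wx / norm, wy / norm, wz / norm
    c, s = math.cos(angle), math.sin(angle)
    t = 1.0 - c
    return [[t*x*x + c,   t*x*y - s*z, t*x*z + s*y],
            [t*x*y + s*z, t*y*y + c,   t*y*z - s*x],
            [t*x*z - s*y, t*y*z + s*x, t*z*z + c]]

# Rotating 90 degrees about Z carries the view direction (1, 0, 0) to (0, 1, 0):
R = rotation_from_gyro((0.0, 0.0, math.pi / 2), 1.0)
view = [sum(R[i][j] * v for j, v in enumerate((1.0, 0.0, 0.0))) for i in range(3)]
```

Applying such rotations while keeping the camera's position fixed reproduces the behavior described above: translation leaves X'Y'Z' unchanged, rotation changes only what is observed.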
In addition, the standard space coordinate system XYZ is re-established after the AR camera reacquires the recognition object, and the coordinates of the 3D model are reset to the default position relative to the standard space coordinate system XYZ.
Once the user reacquires the recognition object, the virtual space coordinate system X'Y'Z' is no longer needed and can be destroyed in favor of the standard space coordinate system XYZ. Step S1 is then repeated: the movement of the 3D art assets is linked with the real object again, and through the real-object data the system can again monitor all movements of the AR camera in the world, generating a true AR effect, i.e., an augmented-reality effect. Cycling in this way achieves the compensation effect.
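The cycle S1 → S2 → S3 → S1 amounts to a two-state machine keyed on marker visibility. The sketch below is illustrative (the enum and function names are assumptions); the patent describes the cycle, not this API.

```python
from enum import Enum

class Frame(Enum):
    STANDARD = "XYZ"     # recognition object tracked (steps S1/S2)
    VIRTUAL = "X'Y'Z'"   # recognition object lost (step S3)

def next_frame(marker_visible):
    """One tick of the compensation cycle: use the standard frame while the
    recognition object is tracked, fall back to the virtual frame when it
    is lost, and return to step S1 when it reappears."""
    if marker_visible:
        return Frame.STANDARD   # (re)build the standard space coordinate system
    return Frame.VIRTUAL        # compensate via the virtual space coordinate system

# Marker lost mid-session, then reacquired:
trace = [next_frame(v) for v in (True, True, False, False, True)]
```

The final `True` tick models the recognition object reappearing: the virtual frame is discarded and the standard frame is rebuilt, which is the "circulating" compensation described above.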
From the above, the invention realizes compensation by recording the standard coordinate vector before the recognition object is lost and constructing the virtual space coordinate system from that vector; when information about the recognition object is acquired again, the system returns to the default state and the experience is re-created. The virtual space coordinate system is therefore no longer fixed, the change during switching between the virtual and standard space coordinate systems is small, and the switching experience is good.
Referring to fig. 5, fig. 5 shows a specific structure of the compensation effect implementation system 100 presented by the AR of the present invention, which includes:
The standard space coordinate system module 1 is used for constructing a standard space coordinate system and positioning the recognition object in it; the standard space coordinate system is the spatial coordinate system XYZ in which the 3D art assets move in U3D, created by recognizing the parameters of the real recognition object through AR image-recognition technology.
The standard positioning module 2 is used for tracking the recognition object in real time and, if it is not lost, positioning it through the standard space coordinate system XYZ (see fig. 2).
The virtual positioning module 3 is used for tracking the recognition object in real time and, if it is lost, constructing a virtual space coordinate system and positioning the recognition object through the virtual space coordinate system X'Y'Z'. It should be noted that when the recognition object is lost, the standard space coordinate system can no longer be established because the recognition object cannot be recognized; that is, the standard space coordinate system is lost.
In addition, the standard space coordinate system XYZ is re-established after the AR camera reacquires the recognition object, and the coordinates of the 3D model are reset to the default position relative to the standard space coordinate system XYZ.
Once the user reacquires the recognition object, the virtual space coordinate system X'Y'Z' is no longer needed and can be destroyed in favor of the standard space coordinate system XYZ. Step S1 is then repeated: the movement of the 3D art assets is linked with the real object again, and through the real-object data the system can again monitor all movements of the AR camera in the world, generating a true AR effect, i.e., an augmented-reality effect. Cycling in this way achieves the compensation effect.
As shown in fig. 6, the standard space coordinate system module 1 includes a positioning unit 11 for setting a center point of the recognition object as a coordinate origin of the standard space coordinate system.
It should be noted that, for each 3D scene recognized by AR, the relationship between the standard space coordinate system and the recognition object is adjusted through the 3D art prefab in U3D, so that visually the recognition object serves as the ground plane of the 3D model, while the center of the recognition image, i.e., the center point of the recognition object, is set as the coordinate origin (0, 0, 0) of the standard space coordinate system.
As shown in fig. 7, the standard positioning module 2 includes:
The adjusting unit 21 is configured to adjust the coordinate origin of the standard space coordinate system in real time according to the center point of the recognition object; if the recognition object is not lost, the original standard space coordinate system XYZ is always used to position the 3D model, i.e., when the center point of the recognition object moves, the position of the coordinate origin is updated in real time so that the origin follows the center point of the recognition object.
The vector acquisition unit 22 is configured to acquire the standard coordinate vector of the AR camera in real time according to the standard space coordinate system. Since the center point of the recognition object may move, the coordinate vector of the AR camera relative to the recognition object also changes; therefore the standard coordinate vector (X, Y, Z) of the AR camera in the current standard space coordinate system XYZ needs to be recorded in real time.
As shown in fig. 8, the virtual positioning module 3 includes:
A virtual coordinate system construction unit 31 for constructing the virtual space coordinate system from the latest standard coordinate vector of the AR camera in the standard space coordinate system; with no spatial basis available, the virtual space coordinate system is reverse-generated from the latest standard coordinate vector recorded before the standard space coordinate system disappeared.
A data acquisition unit 32 for acquiring rotation data of the AR camera in real time;
a feature acquisition unit 33 for changing the imaging angle of the AR camera according to the rotation data to acquire the spatial feature of the virtual spatial coordinate system.
As shown in fig. 4, as in conventional AR operation, the user can rotate the mobile phone (and with it the AR camera) and, through the gyroscope, view the 3D model in the virtual space coordinate system X'Y'Z' at any time. In the virtual space coordinate system X'Y'Z', which is detached from recognition-based positioning of the physical object (the recognition object), the real-world coordinates of the AR camera cannot be determined; that is, it is not known where the AR camera is in the world. However, because the AR camera has a gyroscope, its orientation can be known, so by acquiring the orientation data of the AR camera, rotation-based observation can still be performed in the virtual space coordinate system X'Y'Z' detached from the physical object. Since the virtual space coordinate system X'Y'Z' is based on the coordinates of the AR camera, translating the AR camera in any direction does not change the virtual space coordinate system at all; by rotating the AR camera, however, its rotation data can be obtained from the gyroscope, and every part of the virtual space coordinate system X'Y'Z' can be observed.
Specifically, the virtual coordinate system construction unit 31 includes:
an extracting unit 311, configured to extract the latest standard coordinate vector of the AR camera in the standard space coordinate system;
a setting unit 312 that sets the latest standard coordinate vector as a virtual coordinate vector;
the construction unit 313 constructs a virtual space coordinate system from the virtual coordinate vectors.
It should be noted that the present invention reverse-forms the virtual space coordinate system X'Y'Z' by building a coordinate system around the recorded coordinate vector (X, Y, Z) of the AR camera.
The reverse creation proceeds as follows:
extract the vector (X, Y, Z) of the AR camera in the standard space coordinate system XYZ;
when the recognition object disappears because it can no longer be recognized, the standard space coordinate system XYZ required for spatial operations based on the real recognition object cannot be established;
the coordinate vector (X, Y, Z) at this moment is the coordinate vector data recorded before the standard space coordinate system XYZ disappeared, and is taken as the virtual coordinate vector (X', Y', Z') in the virtual space coordinate system X'Y'Z' to be created next;
let (X', Y', Z') = (X, Y, Z) to obtain the specific virtual coordinate vector (X', Y', Z'); by spatial geometry, all spatial data of the virtual space coordinate system X'Y'Z' containing the virtual coordinate vector (X', Y', Z') are then obtained as the reference data for new spatial operations.
Accordingly, the virtual space coordinate system X'Y'Z' does not change due to any movement of the device.
From the above, the invention realizes compensation by recording the standard coordinate vector before the recognition object is lost and constructing the virtual space coordinate system from that vector; when information about the recognition object is acquired again, the system returns to the default state and the experience is re-created. The virtual space coordinate system is therefore no longer fixed, the change during switching between the virtual and standard space coordinate systems is small, and the switching experience is good.
While the foregoing describes preferred embodiments of the present invention, those skilled in the art will appreciate that changes and modifications may be made without departing from the principles of the invention, and such changes and modifications are also intended to fall within the scope of the invention.

Claims (6)

1. A method for implementing a compensation effect for AR presentation, comprising:
s1, constructing a standard space coordinate system, and positioning an identification object to the standard space coordinate system;
s2, tracking the identified object in real time, and if the identified object is not lost, positioning the identified object through a standard space coordinate system; the specific method comprises the following steps: adjusting the origin of coordinates of a standard space coordinate system in real time according to the center point of the identified object; acquiring a standard coordinate vector of the AR camera in real time according to a standard space coordinate system;
s3, tracking the identified object in real time, if the identified object is lost, constructing a virtual space coordinate system, positioning the identified object through the virtual space coordinate system, and entering the step S1 when the identified object appears again; the specific method comprises the following steps: constructing a virtual space coordinate system according to the latest standard coordinate vector of the AR camera in the standard space coordinate system; acquiring rotation data of an AR camera in real time; the imaging angle of the AR camera is changed according to the rotation data to acquire the spatial features of the virtual spatial coordinate system.
2. The method for implementing the compensation effect of AR presentation according to claim 1, wherein in the step S1, the method for locating the recognition object to the standard space coordinate system includes: the center point of the recognition object is set as the origin of coordinates of the standard space coordinate system.
3. The method for implementing the compensation effect of AR presentation according to claim 1, wherein the method for constructing a virtual space coordinate system from the latest standard coordinate vector of the AR camera in the standard space coordinate system comprises:
extracting the latest standard coordinate vector of the AR camera in a standard space coordinate system;
setting the latest standard coordinate vector as a virtual coordinate vector;
and constructing a virtual space coordinate system according to the virtual coordinate vector.
4. A system for implementing a compensation effect of AR presentation, comprising:
the standard space coordinate system module, used for constructing a standard space coordinate system and positioning the recognition object in the standard space coordinate system;
the standard positioning module, used for tracking the recognition object in real time and, if the recognition object is not lost, positioning it through the standard space coordinate system; the standard positioning module comprises: an adjusting unit for adjusting the coordinate origin of the standard space coordinate system in real time according to the center point of the recognition object; and a vector acquisition unit for acquiring the standard coordinate vector of the AR camera in real time according to the standard space coordinate system;
the virtual positioning module, used for tracking the recognition object in real time and, if the recognition object is lost, constructing a virtual space coordinate system and positioning the recognition object through the virtual space coordinate system; the virtual positioning module comprises: a virtual coordinate system construction unit for constructing a virtual space coordinate system according to the latest standard coordinate vector of the AR camera in the standard space coordinate system; a data acquisition unit for acquiring the rotation data of the AR camera in real time; and a feature acquisition unit for changing the imaging angle of the AR camera according to the rotation data to acquire the spatial features of the virtual space coordinate system.
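The feature acquisition unit changes the imaging angle of the AR camera according to rotation data. A minimal sketch, reducing the rotation data to a single yaw angle about the vertical axis (the patent does not fix the rotation parameterisation, so this reduction is an assumption):

```python
import math

def rotate_view(view, yaw_rad):
    """Rotate the camera's view vector by yaw_rad about the Y (up) axis,
    a one-axis stand-in for the feature acquisition unit's use of
    gyroscope rotation data."""
    c, s = math.cos(yaw_rad), math.sin(yaw_rad)
    x, y, z = view
    # standard rotation matrix about the Y axis applied to (x, y, z)
    return (c * x + s * z, y, -s * x + c * z)
```

A full implementation would use the device's 3-axis rotation vector (e.g. a quaternion from the gyroscope) rather than a single yaw angle.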
5. The system for implementing the compensation effect of AR presentation according to claim 4, wherein the standard space coordinate system module comprises a positioning unit for setting the center point of the recognition object as the coordinate origin of the standard space coordinate system.
6. The system for implementing the compensation effect of AR presentation according to claim 4, wherein the virtual coordinate system construction unit comprises:
an extraction unit for extracting the latest standard coordinate vector of the AR camera in the standard space coordinate system;
a setting unit for setting the latest standard coordinate vector as the virtual coordinate vector;
and a construction unit for constructing a virtual space coordinate system according to the virtual coordinate vector.
CN201810986370.5A 2018-08-28 2018-08-28 AR presentation compensation effect realization method and system Active CN109118592B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201810986370.5A CN109118592B (en) 2018-08-28 2018-08-28 AR presentation compensation effect realization method and system

Publications (2)

Publication Number Publication Date
CN109118592A CN109118592A (en) 2019-01-01
CN109118592B true CN109118592B (en) 2023-06-16

Family

ID=64861247

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201810986370.5A Active CN109118592B (en) 2018-08-28 2018-08-28 AR presentation compensation effect realization method and system

Country Status (1)

Country Link
CN (1) CN109118592B (en)

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2009138069A1 (en) * 2008-05-14 2009-11-19 Christian-Albrechts-Universität Zu Kiel Augmented reality binoculars as a navigational aid
WO2016115872A1 (en) * 2015-01-21 2016-07-28 成都理想境界科技有限公司 Binocular ar head-mounted display device and information display method thereof

Family Cites Families (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
TWI419081B (en) * 2009-12-29 2013-12-11 Univ Nat Taiwan Science Tech Method and system for providing augmented reality based on marker tracing, and computer program product thereof
TW201126451A (en) * 2011-03-29 2011-08-01 Yuan-Hong Li Augmented-reality system having initial orientation in space and time and method
JP5474899B2 (en) * 2011-09-14 2014-04-16 株式会社バンダイナムコゲームス PROGRAM, INFORMATION STORAGE MEDIUM, AND GAME DEVICE
CN106856013A (en) * 2017-01-12 2017-06-16 深圳市彬讯科技有限公司 The method and system that a kind of augmented reality identification figure off card shows

Also Published As

Publication number Publication date
CN109118592A (en) 2019-01-01

Similar Documents

Publication Publication Date Title
EP3579192B1 (en) Method, apparatus and device for determining camera posture information, and storage medium
JP6768156B2 (en) Virtually enhanced visual simultaneous positioning and mapping systems and methods
US11481982B2 (en) In situ creation of planar natural feature targets
EP3234806B1 (en) Scalable 3d mapping system
US8797353B2 (en) Augmented media message
US9674507B2 (en) Monocular visual SLAM with general and panorama camera movements
US9367961B2 (en) Method, device and storage medium for implementing augmented reality
WO2014169692A1 (en) Method,device and storage medium for implementing augmented reality
KR102398478B1 (en) Feature data management for environment mapping on electronic devices
CN111651051B (en) Virtual sand table display method and device
CN111127524A (en) Method, system and device for tracking trajectory and reconstructing three-dimensional image
US20140192055A1 (en) Method and apparatus for displaying video on 3d map
CN110361005B (en) Positioning method, positioning device, readable storage medium and electronic equipment
CN112207821B (en) Target searching method of visual robot and robot
CN112308977A (en) Video processing method, video processing apparatus, and storage medium
JP2016066187A (en) Image processor
CN109448105B (en) Three-dimensional human body skeleton generation method and system based on multi-depth image sensor
CN109118592B (en) AR presentation compensation effect realization method and system
CN111385481A (en) Image processing method and device, electronic device and storage medium
CN114882106A (en) Pose determination method and device, equipment and medium
JP2016071496A (en) Information terminal device, method, and program
CN114268771A (en) Video viewing method, mobile terminal and computer readable storage medium
CN117115244A (en) Cloud repositioning method, device and storage medium
CN117635717A (en) Information processing method, information processing device, electronic equipment and storage medium
CN117346650A (en) Pose determination method and device for visual positioning and electronic equipment

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
TA01 Transfer of patent application right

Effective date of registration: 20191008

Address after: 510000 room 2418, floor 24, no.102-2, Xianlie Middle Road, Yuexiu District, Guangzhou City, Guangdong Province

Applicant after: GUANGZHOU YINGXIN EDUCATION TECHNOLOGY Co.,Ltd.

Address before: Room 206, room 1, No. 20, Taihe Gang Road, Yuexiu District, Guangzhou, Guangdong

Applicant before: SUNMNET TECHNOLOGY CO.,LTD.

Applicant before: GUANGZHOU YINGXIN EDUCATION TECHNOLOGY Co.,Ltd.

GR01 Patent grant