CN105786166B - Augmented reality method and system - Google Patents

Augmented reality method and system

Info

Publication number
CN105786166B
CN105786166B (application CN201410826020.4A)
Authority
CN
China
Prior art keywords
mobile device
augmented reality
entity
reference object
angle
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201410826020.4A
Other languages
Chinese (zh)
Other versions
CN105786166A (en)
Inventor
张腾文
梁哲玮
陈思玮
陈毅承
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Industrial Technology Research Institute ITRI
Original Assignee
Industrial Technology Research Institute ITRI
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Industrial Technology Research Institute ITRI filed Critical Industrial Technology Research Institute ITRI
Publication of CN105786166A publication Critical patent/CN105786166A/en
Application granted granted Critical
Publication of CN105786166B publication Critical patent/CN105786166B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Landscapes

  • Processing Or Creating Images (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The invention provides an augmented reality method and system. The augmented reality method is applied to an augmented reality system and comprises the following steps: detecting a physical reference object, a plurality of environmental features, and an angle parameter and a displacement parameter of a mobile device in a physical space; tracking the physical reference object to estimate the physical position and physical angle of the mobile device in the physical space, in order to perform augmented reality display; when the physical reference object cannot be tracked, tracking the environmental features to estimate the physical position and physical angle of the mobile device in the physical space, in order to perform augmented reality display; and when neither the physical reference object nor the environmental features can be tracked, estimating the physical position and physical angle of the mobile device in the physical space according to the angle parameter and displacement parameter of the mobile device, in order to perform augmented reality display.

Description

Augmented reality method and system
Technical field
The invention relates to an augmented reality method and system.
Background technique
Staging a physical residence with furnishings lets prospective buyers see how furniture would be arranged indoors, which raises the probability of a sale and also reveals consumers' preferences. Since augmented reality (Augmented Reality, AR) was proposed, designers and manufacturers have uploaded design styles and virtual furniture to AR development platforms, so that consumers can see the combination of virtual furniture and physical house objects on a mobile device.
Therefore, how to track the user's position and angle no matter where the user is in the physical space, and to carry out appropriate AR display accordingly, is one of the key issues in this field.
The present invention proposes an augmented reality method and system that can track the position and angle of a mobile device in several ways, in order to carry out AR display.
Summary of the invention
The invention relates to an augmented reality method and system that estimate the physical position and physical angle of a mobile device in a physical space according to a detected physical reference object, and/or multiple environmental features, and/or the angle parameter and displacement parameter of the mobile device, in order to carry out augmented reality display.
According to an embodiment of the present invention, an augmented reality method applied to an augmented reality system is proposed. The augmented reality method includes: in a physical space, detecting a physical reference object, multiple environmental features, and an angle parameter and a displacement parameter of a mobile device; tracking the physical reference object to estimate the physical position and physical angle of the mobile device in the physical space, in order to carry out augmented reality display; when the physical reference object cannot be tracked, tracking the environmental features to estimate the physical position and physical angle of the mobile device in the physical space, in order to carry out augmented reality display; and when neither the physical reference object nor the environmental features can be tracked, estimating the physical position and physical angle of the mobile device in the physical space according to the angle parameter and displacement parameter of the mobile device, in order to carry out augmented reality display.
According to an embodiment of the present invention, an augmented reality system is proposed, comprising a mobile device and an augmented reality platform coupled to the mobile device. In a physical space, the mobile device detects a physical reference object, multiple environmental features, and an angle parameter and a displacement parameter of the mobile device. The mobile device tracks the physical reference object to estimate the physical position and physical angle of the mobile device in the physical space, in order to carry out augmented reality display. When the mobile device cannot track the physical reference object, the mobile device tracks the environmental features to estimate the physical position and physical angle of the mobile device in the physical space, in order to carry out augmented reality display. When the mobile device can track neither the physical reference object nor the environmental features, the mobile device estimates the physical position and physical angle of the mobile device in the physical space according to the angle parameter and displacement parameter of the mobile device, in order to carry out augmented reality display.
In order that the above and other aspects of this case may be better understood, embodiments are described in detail below with reference to the accompanying drawings:
Detailed description of the invention
Fig. 1 shows a schematic diagram of an augmented reality system according to an embodiment of the invention.
Fig. 2A to Fig. 2E are schematic diagrams of estimating the position and angle of the mobile device to carry out AR display, according to an embodiment of the invention.
Fig. 3A and Fig. 3B are schematic diagrams of editing an AR object according to an embodiment of the invention.
Fig. 4 shows a flowchart of an augmented reality method according to an embodiment of the invention.
[Symbol description]
Augmented reality system: 100; Mobile device: 110
AR platform: 150; Network: 170
Camera unit: 111; Rotation sensor: 112
Acceleration sensor: 113; Screen: 114
AR application program: 115; AR tracking module: 121
Environmental feature tracking module: 122; Feature database: 124
Movement estimation module: 123; AR content size module: 152
AR editing module: 151; Environmental features: 220A-220D
Physical reference object: 210; AR furniture: 320, 320'
AR furniture: 230; Physical reference object: 310
Specific embodiment
The technical terms in this specification are the customary terms of the art; where this specification explains or defines some of those terms, the explanation or definition given in this specification governs their interpretation. Each embodiment of the invention has one or more technical features. Where implementation allows, a person of ordinary skill in the art may selectively implement some or all of the technical features of any embodiment, or selectively combine some or all of the technical features of these embodiments.
Referring now to Fig. 1, a schematic diagram of an augmented reality system according to an embodiment of the invention is shown. As shown in Fig. 1, the augmented reality system 100 includes a mobile device 110 and an AR platform 150. The mobile device 110 and the AR platform 150 can communicate with each other through a network 170. The network 170 may include a wired/wireless network.
The mobile device 110 includes a camera unit 111, a rotation sensor 112, an acceleration sensor 113, a screen 114, and an AR application program 115. The AR application program 115 is installed in the memory (not shown) of the mobile device 110. The AR application program 115 includes an AR tracking module 121, an environmental feature tracking module 122, a movement estimation module 123, and a feature database 124. The AR platform 150 includes an AR editing module 151 and an AR content size module 152.
The camera unit 111 is used to photograph physical articles in the physical space. The rotation sensor 112 senses angle parameters of the mobile device 110, such as angular acceleration and angular momentum, in order to sense the angle of the mobile device 110. The rotation sensor 112 may be, for example, a gyroscope. The acceleration sensor 113 senses the acceleration parameter/displacement parameter of the mobile device 110, in order to sense the position of the mobile device 110. The screen 114 presents images, for example the images captured by the camera unit 111, or the related images of the AR display transmitted by the AR application program 115.
The AR application program 115 presents AR on the screen 114 in real time. That is, while watching the screen 114, the user simultaneously sees the physical images captured by the camera unit 111 (for example, physical objects in the physical space, such as people, the physical reference object, and walls) together with the related images of the AR display transmitted by the AR application program 115.
If the mobile device 110 moves/rotates, the AR configuration displayed on the screen 114 of the mobile device 110 also moves/rotates (under the control of the AR application program 115), so that the user's visual experience stays close to the current situation.
When the camera unit 111 of the mobile device 110 can capture the physical reference object, the AR tracking module 121 can accordingly calculate the physical position of the mobile device 110 in the physical space. The physical reference object is usually a planar object placed on the ground. At initialization (for example, when the AR application program 115 is executed and detects this physical reference object for the first time), the AR application program 115 can calculate the initial physical position of the mobile device 110 in this physical space through this physical reference object and, optionally, other environmental features. Afterwards, as long as the camera unit 111 can keep capturing/detecting the physical reference object, the AR tracking module 121 can accordingly track the physical position and physical angle of the mobile device 110 in the physical space.
When the camera unit 111 of the mobile device 110 can capture environmental features in the physical space (for example, a picture hanging on a wall), the environmental feature tracking module 122 can accordingly calculate/track the physical position of the mobile device 110 in the physical space. In detail, after the AR tracking module 121 has calculated the initial physical position of the mobile device 110 in this physical space, the environmental feature tracking module 122 can continue to calculate/track the physical position of the mobile device 110 in the physical space according to the environmental features.
Furthermore, the environmental feature tracking module 122 can find environmental features by computer vision (CV). It then continuously builds an environmental feature map (stored in the feature database 124). As long as the physical reference object can be detected, the environmental feature tracking module 122 repeats the above operations. However, if the physical reference object cannot be detected, the environmental feature tracking module 122 can use an algorithm (such as the RANdom SAmple Consensus (RANSAC) algorithm) to find those environmental features located close to the physical reference object. It can then use another algorithm (such as the Iterative Closest Point (ICP) algorithm) to remove, from the environmental features found close to the physical reference object, one or more environmental features that may be noise. Thereby, the environmental feature tracking module 122 can estimate the position and angle of the mobile device 110 from the environmental features.
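The RANSAC-style consensus step mentioned above can be sketched as follows. This is a minimal illustration only, not the patent's implementation: it assumes the environmental features are 3-D points, it uses a plane as the consensus model (the patent does not specify the model), it omits the subsequent ICP refinement, and the function names `fit_plane` and `ransac_plane_inliers` are invented for this sketch.

```python
import math
import random

def fit_plane(p1, p2, p3):
    """Plane through three 3-D points, returned as (unit_normal, d),
    so a point q lies on the plane when dot(normal, q) + d == 0."""
    u = [p2[i] - p1[i] for i in range(3)]
    v = [p3[i] - p1[i] for i in range(3)]
    n = [u[1] * v[2] - u[2] * v[1],        # cross product u x v
         u[2] * v[0] - u[0] * v[2],
         u[0] * v[1] - u[1] * v[0]]
    norm = math.sqrt(sum(c * c for c in n))
    if norm < 1e-9:                        # degenerate (collinear) sample
        return None
    n = [c / norm for c in n]
    d = -sum(n[i] * p1[i] for i in range(3))
    return n, d

def ransac_plane_inliers(points, iterations=200, threshold=0.05, seed=0):
    """Repeatedly sample 3 points, fit a plane, and keep the largest set of
    points lying within `threshold` of it; points outside are treated as noise."""
    rng = random.Random(seed)
    best = []
    for _ in range(iterations):
        plane = fit_plane(*rng.sample(points, 3))
        if plane is None:
            continue
        n, d = plane
        inliers = [p for p in points
                   if abs(sum(n[i] * p[i] for i in range(3)) + d) < threshold]
        if len(inliers) > len(best):
            best = inliers
    return best

# Features near the ground plane (z = 0) plus two noise points off the plane.
features = [(0.0, 0.0, 0.0), (1.0, 0.2, 0.0), (0.3, 1.1, 0.0), (2.0, 0.5, 0.0),
            (0.7, 2.2, 0.0), (1.5, 1.5, 0.0), (2.2, 2.0, 0.0), (0.1, 1.7, 0.0),
            (0.5, 0.5, 1.0), (1.2, 0.8, -0.9)]
kept = ransac_plane_inliers(features)
```

In the sketch, `kept` retains the eight coplanar features and drops the two off-plane ones; a real tracker would feed the surviving features into pose estimation.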
After the AR tracking module 121 has calculated the initial physical position of the mobile device 110 in this physical space, the movement estimation module 123 can keep tracking/estimating the position and angle of the mobile device 110 according to the rotation sensor 112 and the acceleration sensor 113.
Furthermore, the movement estimation module 123 can estimate the angle parameter of the mobile device 110 from the angular velocity of the mobile device 110 detected by the rotation sensor 112, and can estimate the displacement parameter and position parameter of the mobile device 110 from the readings of the acceleration sensor 113. Therefore, the movement estimation module 123 can continuously track/estimate the position and angle of the mobile device 110.
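The integration implied above can be illustrated with a one-dimensional dead-reckoning sketch, assuming fixed-interval sensor samples. This is a deliberate simplification: a real device integrates 3-axis gyroscope and accelerometer readings with gravity compensation and drift correction, all omitted here, and `integrate_imu` is a hypothetical helper, not an API of the patent's system.

```python
def integrate_imu(samples, dt):
    """Dead-reckon a 1-D angle and displacement from sensor samples.

    samples: list of (angular_velocity, acceleration) pairs, one per step.
    dt:      time between samples, in seconds.
    Returns (angle, displacement) after simple Euler integration:
    gyro rate integrates to angle; acceleration integrates twice to position.
    """
    angle = 0.0
    velocity = 0.0
    displacement = 0.0
    for omega, accel in samples:
        angle += omega * dt            # angular velocity -> angle
        velocity += accel * dt         # acceleration -> velocity
        displacement += velocity * dt  # velocity -> displacement
    return angle, displacement

# 10 samples at 10 Hz: constant 0.5 rad/s rotation, constant 1.0 m/s^2 acceleration.
angle, disp = integrate_imu([(0.5, 1.0)] * 10, dt=0.1)
```

After one second this gives an angle of 0.5 rad and a displacement of 0.55 m (the discrete double integration slightly overshoots the analytic 0.5 m, which is a known property of this Euler scheme).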
In terms of data priority, the AR tracking module 121 ranks higher than the environmental feature tracking module 122, which in turn ranks higher than the movement estimation module 123. That is, in an embodiment of the present invention, if the mobile device 110 can keep capturing the physical reference object, the physical position of the mobile device 110 estimated/tracked by the AR tracking module 121 is preferentially adopted, because its accuracy is higher than that of the other two modules 122/123. Once the mobile device 110 cannot capture the physical reference object but can still capture enough environmental features, the second-best choice is adopted: the physical position of the mobile device 110 estimated/tracked by the environmental feature tracking module 122. Once the mobile device 110 cannot capture the physical reference object and the captured environmental features are insufficient for the environmental feature tracking module 122 to estimate/track the physical position of the mobile device 110, the next choice is adopted: the physical position of the mobile device 110 estimated/tracked by the movement estimation module 123.
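The priority order among the three modules can be sketched as a simple selection function. The representation is an assumption made for illustration: each tracker is modeled as returning a `(position, angle)` tuple, or `None` when it currently has nothing usable to report, and `select_pose` is an invented name.

```python
def select_pose(marker_pose, feature_pose, imu_pose):
    """Pick the pose estimate from the highest-priority source available:
    marker tracking > environmental-feature tracking > inertial dead reckoning.

    Each argument is a (position, angle) tuple, or None when that tracker
    has no usable estimate. Returns (pose, source_name)."""
    for pose, source in ((marker_pose, "marker"),
                         (feature_pose, "features"),
                         (imu_pose, "imu")):
        if pose is not None:
            return pose, source
    raise ValueError("no pose estimate available")

# Marker visible: the marker-based estimate wins.
pose_a, src_a = select_pose(((0.0, 0.0, 0.0), 0.0),
                            ((0.1, 0.0, 0.0), 0.1),
                            ((0.3, 0.2, 0.0), 0.4))
# Marker lost, features available: fall back to feature tracking.
pose_b, src_b = select_pose(None, ((1.0, 1.0, 0.0), 5.0), ((9.0, 9.0, 9.0), 9.0))
# Everything lost except the IMU: fall back to dead reckoning.
pose_c, src_c = select_pose(None, None, ((9.0, 9.0, 9.0), 9.0))
```

The ordering mirrors the accuracy ranking stated in the text: each fallback is only consulted when every higher-priority tracker has failed.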
The feature database 124 stores the captured/detected feature points (including the physical reference object and the environmental features).
The AR editing module 151 can edit the AR display according to the data returned by the AR application program 115 of the mobile device 110 (for example, the size of the physical space). For example, if the user thinks that the current AR display does not meet his or her needs (for example, the floor area corresponding to the AR display does not match the floor area of the physical space where the user currently is, or the user wants to change the AR style), the mobile device 110 can report back to the AR platform 150, letting the AR editing module 151 perform AR editing (for example, adjusting the placement position, rotation angle, and size of the AR virtual articles), in order to achieve customization.
The AR content size module 152 calculates the area that an AR configuration occupies in the physical space. For example, suppose the physical area of the physical reference object is A, the AR area of the AR configuration is B, and the AR area of the AR reference object in the AR configuration is C; then the physical area occupied by this AR configuration in the physical space is Y = A*B/C. That is, when performing area conversion, the AR reference object is mapped to the physical reference object, in order to find the physical area occupied by the AR configuration after conversion.
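The formula above is a direct proportion and can be written as a one-line helper. The function name is invented for illustration; only the relation Y = A*B/C comes from the text.

```python
def ar_area_in_physical_space(marker_area, ar_config_area, ar_marker_area):
    """Convert an AR configuration's area into physical floor area, using the
    reference object as a common scale:  Y = A * B / C, where
      A (marker_area)    = physical area of the physical reference object,
      B (ar_config_area) = area of the AR configuration in AR units,
      C (ar_marker_area) = area of the AR reference object in AR units."""
    return marker_area * ar_config_area / ar_marker_area

# A marker of 2.0 m^2, an AR configuration of 30 AR-units^2, and an AR
# reference object of 3 AR-units^2 give a physical footprint of 20.0 m^2.
area = ar_area_in_physical_space(2.0, 30.0, 3.0)
```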
As an example, but without limitation, if an object of A0 size serves as the physical reference object, then A = 1189 (mm) * 841 (mm). The physical reference object is a reference object placed in the physical space; when the AR application program 115 detects this physical reference object, it treats it as a reference anchor point. Typically, the physical reference object is placed at the central point of the physical space. For example, if the physical space is a living room, the physical reference object can be placed at the central point of the living room.
A so-called AR configuration generally includes the type, size, shape, placement position, etc. of the AR furniture to be placed. For example, when designing an AR configuration suitable for a living room, the AR configuration generally includes common AR furniture, such as a tea table, a TV cabinet, and a sofa. When designing an AR configuration, the AR area corresponding to this AR configuration is also designed together. That is, if a house includes a living room (5 square meters), a dining room (3 square meters), a master bedroom (5 square meters), and an ordinary bedroom (4 square meters), separate AR configurations suitable for the living room/dining room/bedrooms can be designed/generated.
When designing an AR configuration, the AR reference object, whose placement position defaults to a preset virtual position, is centered at the central point of the AR space, so as to correspond to the physical reference object.
Therefore, when performing area conversion, the AR content size module 152 maps the position of the AR reference object to the position of the physical reference object, scales the area of the AR reference object to be identical to that of the physical reference object, and then scales the AR area of the AR configuration by the same ratio, to obtain the area that the AR configuration occupies in the physical space.
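The mapping and scaling can be sketched for the 2-D case. One inference is used beyond the text: since areas scale with the square of the linear scale, matching the two reference-object areas implies a linear scale factor of sqrt(A/C). The function name and the flat 2-D coordinate model are assumptions of this sketch.

```python
import math

def ar_to_physical(point, marker_area, ar_marker_area, marker_center=(0.0, 0.0)):
    """Map a 2-D point from AR coordinates (origin at the AR reference
    object's center) into physical coordinates (origin at the physical
    reference object), scaling so the AR reference object's area matches
    the physical marker's area. Linear scale = sqrt(area ratio)."""
    scale = math.sqrt(marker_area / ar_marker_area)
    return (marker_center[0] + point[0] * scale,
            marker_center[1] + point[1] * scale)

# Marker of 4.0 m^2 vs. AR reference object of 1.0 AR-unit^2 -> linear scale 2.
corner = ar_to_physical((1.0, 2.0), marker_area=4.0, ar_marker_area=1.0)
```

With this scale, every AR furniture footprint grows by the same factor, so the whole configuration lands in the physical space at a consistent size around the marker.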
In an embodiment of the present invention, after the AR virtual articles are adjusted/edited by the AR editing module 151, the AR platform 150 can send the result back to the AR application program 115 of the mobile device 110. Accordingly, the AR application program 115 can obtain the adjusted AR configuration without performing 3D registration again.
How an embodiment of the invention tracks/estimates the position and angle of the mobile device 110 is now illustrated. As shown in Fig. 2A, the user holds the mobile device 110 and aims it at the physical reference object (marker) 210 to photograph it, whereby the AR tracking module 121 can calculate the initial physical position of the mobile device 110 in this physical space. If the mobile device 110 is moved to a position where the physical reference object 210 cannot be photographed, as shown in Fig. 2B, the environmental feature tracking module 122 can detect the currently capturable environmental features 220A-220D to track/estimate the position and angle of the mobile device 110, so that the AR application program 115 can accordingly show on the screen 114 the AR display corresponding to the current position and angle (such as the AR furniture 230), as shown in Fig. 2C.
If the mobile device 110 is moved to a position where neither the physical reference object 210 nor the environmental features 220A-220D can be captured (or where the captured environmental features are insufficient for the environmental feature tracking module 122 to estimate/track the physical position of the mobile device 110), as shown in Fig. 2D, the movement estimation module 123 can learn the displacement and angle of the mobile device 110 at each time point from the rotation sensor 112 and the acceleration sensor 113, and can thereby obtain the physical coordinates of the mobile device 110 at each moment. Thus the movement estimation module 123 can also track/estimate the position and angle of the mobile device 110, so that the AR application program 115 can accordingly show on the screen 114 the AR display corresponding to the current position and angle, as shown in Fig. 2E.
In addition, in an embodiment of the present invention, the AR tracking module 121, the environmental feature tracking module 122, and the movement estimation module 123 run continuously, to keep estimating the position and angle of the mobile device 110.
In addition, during the AR display, if the user is dissatisfied with the AR furniture in the AR configuration (for example, the size of the AR furniture is unsuitable), the user can report parameters back to the AR platform 150 through the AR application program 115, letting a designer modify/adjust the AR furniture in the AR configuration through the AR editing module 151 and/or the AR content size module 152 of the AR platform 150, and then return it to the user. Taking Fig. 3A as an example, when carrying out AR display, the AR application program 115 of the mobile device 110 can measure the area of this physical space by photographing, in order to pick, from multiple pre-designed AR configurations, an AR style of suitable area that the user likes. Here, a so-called AR style refers to a pre-designed combination of multiple pieces of AR furniture. The AR application program 115 shows the AR configuration chosen by the user on the screen, as shown in Fig. 3A. The physical reference object 310 and the AR furniture 320 are shown on the screen. If the user finds the size/color of the AR furniture 320 unsuitable or not to his or her liking, the user can report back to the AR platform 150 through the AR application program 115. The AR platform 150 modifies the AR furniture 320 into AR furniture 320' and returns it to the AR application program 115, which shows it on the screen 114, as shown in Fig. 3B. In Fig. 3B, the color/size of the AR furniture 320' has been modified.
Referring now to Fig. 4, a flowchart of an augmented reality method according to an embodiment of the invention is shown. As shown in Fig. 4, in step 410, in a physical space, a physical reference object, multiple environmental features, and an angle parameter and a displacement parameter of a mobile device are detected. In step 420, the physical reference object is tracked to estimate the physical position and physical angle of the mobile device in the physical space, in order to carry out augmented reality display. In step 430, when the physical reference object cannot be tracked, the environmental features are tracked to estimate the physical position and physical angle of the mobile device in the physical space, in order to carry out augmented reality display. In step 440, when neither the physical reference object nor the environmental features can be tracked, the physical position and physical angle of the mobile device in the physical space are estimated according to the angle parameter and displacement parameter of the mobile device, in order to carry out augmented reality display.
In conclusion in embodiments of the present invention, the position and angle of mobile device are estimated since various ways can be utilized Therefore degree when that is, convenient mobile device is moved to detecting less than entity reference object (being typically placed in room centre), still may be used According to environmental characteristic, and/or, acceleration/rotation sensor estimates position and the angle of mobile device, so that AR is shown not It is distorted because of the movement of mobile device.
In conclusion although the present invention has been disclosed by way of example above, it is not intended to limit the present invention..Institute of the present invention Belong to those of ordinary skill in technical field, without departing from the spirit and scope of the present invention, when various change and modification can be made. Therefore, protection scope of the present invention is subject to view as defined in claim.

Claims (12)

1. An augmented reality method, applied to an augmented reality system, the augmented reality method comprising:
in a physical space, detecting a physical reference object, multiple environmental features, and an angle parameter and a displacement parameter of a mobile device;
tracking the physical reference object to estimate a physical position and a physical angle of the mobile device in the physical space, in order to carry out augmented reality display;
when the physical reference object cannot be tracked, tracking the environmental features to estimate the physical position and the physical angle of the mobile device in the physical space, in order to carry out augmented reality display; and
when neither the physical reference object nor the environmental features can be tracked, estimating the physical position and the physical angle of the mobile device in the physical space according to the angle parameter and the displacement parameter of the mobile device, in order to carry out augmented reality display.
2. The augmented reality method according to claim 1, further comprising:
calculating an initial physical position of the mobile device in the physical space when the physical reference object is detected for the first time; and
if the physical reference object is continuously detected, accordingly tracking the physical position and the physical angle of the mobile device in the physical space, in order to carry out augmented reality display.
3. The augmented reality method according to claim 1, wherein the step of detecting the environmental features comprises:
finding the environmental features by computer vision;
continuously building and storing an environmental feature map;
if the physical reference object cannot be detected, finding the environmental features located close to the physical reference object;
removing, from the environmental features found, one or more environmental features that may be noise; and
estimating the physical position and the physical angle of the mobile device according to the remaining environmental features, in order to carry out augmented reality display.
4. The augmented reality method according to claim 1, wherein the step of detecting the angle parameter and the displacement parameter of the mobile device comprises:
detecting the angle parameter of the mobile device by a rotation sensor; and
detecting the displacement parameter of the mobile device by an acceleration sensor.
5. The augmented reality method according to claim 1, further comprising:
reporting back from the mobile device to an augmented reality platform, to edit the augmented reality display; and
after editing by the augmented reality platform, returning the result to the mobile device to carry out augmented reality display.
6. The augmented reality method according to claim 1, further comprising:
mapping an augmented reality reference object in an augmented reality configuration to the physical reference object;
scaling an area of the augmented reality reference object to be identical in size to the physical reference object; and
scaling an augmented reality area of the augmented reality configuration by the same ratio, to obtain an area that the augmented reality configuration occupies in the physical space.
7. An augmented reality system, comprising:
a mobile device, and
an augmented reality platform, coupled to the mobile device,
wherein,
in a physical space, the mobile device detects a physical reference object, multiple environmental features, and an angle parameter and a displacement parameter of the mobile device;
the mobile device tracks the physical reference object to estimate a physical position and a physical angle of the mobile device in the physical space, in order to carry out augmented reality display;
when the mobile device cannot track the physical reference object, the mobile device tracks the environmental features to estimate the physical position and the physical angle of the mobile device in the physical space, in order to carry out augmented reality display; and
when the mobile device can track neither the physical reference object nor the environmental features, the mobile device estimates the physical position and the physical angle of the mobile device in the physical space according to the angle parameter and the displacement parameter of the mobile device, in order to carry out augmented reality display.
8. The augmented reality system according to claim 7, wherein:
the mobile device calculates an initial physical position of the mobile device in the physical space when the physical reference object is detected for the first time; and
if the mobile device continuously detects the physical reference object, the mobile device accordingly tracks the physical position and the physical angle of the mobile device in the physical space, in order to carry out augmented reality display.
9. The augmented reality system according to claim 7, wherein:
the mobile device finds the environmental features by computer vision;
the mobile device continuously builds and stores an environmental feature map;
if the mobile device cannot detect the physical reference object, the mobile device finds the environmental features located close to the physical reference object;
the mobile device removes, from the environmental features found, one or more environmental features that may be noise; and
according to the remaining environmental features, the mobile device estimates the physical position and the physical angle, in order to carry out augmented reality display.
10. The augmented reality system according to claim 7, wherein:
the mobile device detects the angle parameter of the mobile device by a rotation sensor; and
the mobile device detects the displacement parameter of the mobile device by an acceleration sensor.
11. The augmented reality system according to claim 7, wherein:
the mobile device reports back to the augmented reality platform, to edit the augmented reality display; and
after editing, the augmented reality platform returns the result to the mobile device to carry out augmented reality display.
12. The augmented reality system according to claim 7, wherein:
the mobile device maps an augmented reality reference object in an augmented reality configuration to the physical reference object;
the mobile device scales an area of the augmented reality reference object to be identical in size to the physical reference object; and
the mobile device scales an augmented reality area of the augmented reality configuration by the same ratio, to obtain an area that the augmented reality configuration occupies in the physical space.
CN201410826020.4A 2014-12-16 2014-12-26 Augmented reality method and system Active CN105786166B (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
TW103143813A TWI518634B (en) 2014-12-16 2014-12-16 Augmented reality method and system
TW103143813 2014-12-16

Publications (2)

Publication Number Publication Date
CN105786166A CN105786166A (en) 2016-07-20
CN105786166B true CN105786166B (en) 2019-01-29

Family

ID=55640453

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201410826020.4A Active CN105786166B (en) 2014-12-16 2014-12-26 Augmented reality method and system

Country Status (2)

Country Link
CN (1) CN105786166B (en)
TW (1) TWI518634B (en)

Families Citing this family (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
TWI586936B (en) * 2016-05-20 2017-06-11 國立交通大學 A transform method between a physical image and a virtual image and a system thereof
TWI603227B (en) * 2016-12-23 2017-10-21 李雨暹 Method and system for remote management of virtual message for a moving object
CN106843493B (en) * 2017-02-10 2019-11-12 成都弥知科技有限公司 A kind of picture charge pattern method and the augmented reality implementation method using this method
CN114935994A (en) * 2022-05-10 2022-08-23 阿里巴巴(中国)有限公司 Article data processing method, device and storage medium

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
TW201224995A (en) * 2010-12-15 2012-06-16 Univ Chaoyang Technology Augmented reality system requiring no object marker
CN102770843A (en) * 2010-01-29 2012-11-07 Olaworks株式会社 Method for providing information on object which is not included in visual field of terminal device, terminal device and computer readable recording medium
TW201429242A (en) * 2013-01-07 2014-07-16 Ind Tech Res Inst System and method for determining individualized depth information in augmented reality scene

Family Cites Families (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8688443B2 (en) * 2009-12-23 2014-04-01 At&T Intellectual Property I, L.P. Multimodal augmented reality for location mobile information service
US8933986B2 (en) * 2010-05-28 2015-01-13 Qualcomm Incorporated North centered orientation tracking in uninformed environments
KR101303948B1 (en) * 2010-08-13 2013-09-05 주식회사 팬택 Apparatus and Method for Providing Augmented Reality Information of invisible Reality Object
KR101330805B1 (en) * 2010-08-18 2013-11-18 주식회사 팬택 Apparatus and Method for Providing Augmented Reality
US9214137B2 (en) * 2012-06-18 2015-12-15 Xerox Corporation Methods and systems for realistic rendering of digital objects in augmented reality

Also Published As

Publication number Publication date
CN105786166A (en) 2016-07-20
TWI518634B (en) 2016-01-21
TW201624424A (en) 2016-07-01

Similar Documents

Publication Publication Date Title
AU2023200411B2 (en) Automated control of image acquisition via acquisition location determination
US11010965B2 (en) Virtual object placement for augmented reality
US11367250B2 (en) Virtual interaction with three-dimensional indoor room imagery
US20200334913A1 (en) In situ creation of planar natural feature targets
Sankar et al. Capturing indoor scenes with smartphones
JP6258953B2 (en) Fast initialization for monocular visual SLAM
CN103181157B (en) Unmarked augmented reality system and its operating method based on flatness of the response
CN105786166B (en) Augmented reality method and system
CN110505463A (en) Based on the real-time automatic 3D modeling method taken pictures
WO2019233347A1 (en) Systems and methods for filling holes in virtual reality models
CN108765498A (en) Monocular vision tracking, device and storage medium
KR20200020937A (en) Cloud-based augmented reality
JP2016534461A (en) Method and apparatus for representing a physical scene
KR20160130217A (en) Methods and systems for generating a map including sparse and dense mapping information
ES2626301T3 (en) Procedure and image processing equipment to determine the parameters of a camera
CN105912121A (en) Method and system enhancing reality
CN108846899B (en) Method and system for improving area perception of user for each function in house source
CN113298946A (en) House three-dimensional reconstruction and ground identification method, device, equipment and storage medium
CN111383340A (en) Background filtering method, device and system based on 3D image
Yang et al. 3D Scene Reconstruction Using Multi-Sources Data

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
GR01 Patent grant