WO2015062164A1 - Method for optimizing localization of augmented reality-based location system - Google Patents

Method for optimizing localization of augmented reality-based location system

Info

Publication number
WO2015062164A1
Authority
WO
WIPO (PCT)
Application number
PCT/CN2014/000788
Other languages
French (fr)
Inventor
Yangsheng Xu
Wing Kwong Chung
Long HAN
Xu Chen
Huihuan Qian
Original Assignee
The Chinese University Of Hong Kong
Application filed by The Chinese University Of Hong Kong filed Critical The Chinese University Of Hong Kong
Priority to CN201480055667.XA priority Critical patent/CN105637560B/en
Publication of WO2015062164A1 publication Critical patent/WO2015062164A1/en

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/70Determining position or orientation of objects or cameras
    • G06T7/73Determining position or orientation of objects or cameras using feature-based methods
    • G06T7/74Determining position or orientation of objects or cameras using feature-based methods involving reference images or patches
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/30Subject of image; Context of image processing
    • G06T2207/30204Marker
    • G06T2207/30208Marker matrix


Abstract

A method for optimizing localization of an Augmented Reality (AR)-based location system is disclosed, in which the AR-based location system comprises a recording device and a pattern with a plurality of frames and at least one figure inside the frames. The method comprises identifying each of the plurality of frames appearing in the view of the recording device; selecting any one of the identified frames as an outside frame of the pattern; determining the frames inside the outside frame and the at least one figure as a whole inside figure of the pattern; and recognizing the pattern based on the whole inside figure so as to enlarge the location range of the system. In addition, an AR-based location system is disclosed.

Description

METHOD FOR OPTIMIZING LOCALIZATION OF AUGMENTED REALITY-BASED LOCATION SYSTEM
Technical Field
[0001] The present application generally relates to the field of Augmented Reality (AR), and more specifically to a method for optimizing localization of an AR-based location system, and to an AR-based location system.
Background
[0002] Augmented Reality (AR) is a computer-mediated perception that mixes the real and virtual worlds. It is a technology that enhances a user's perception of the real world via information provided through computer systems. At present, AR technology is widely used in location systems such as GPS. Generally, in a conventional AR-based location system, patterns designed for the AR location system consist of an outside frame and an inside figure on a plane. The outside frame is used for calculating the position and orientation of the pattern relative to the camera; the inside figure is used for recognizing the pattern and locating the recording device.
[0003] Current work on AR-based localization optimization mainly focuses on improving image processing methods or making the system more practical through information sharing. With the methods above, although the system performance can be improved, the range of localization is still limited by the size of the patterns. With traditional AR patterns, which have only one frame and one figure drawn inside the frame, the location range is limited to within a few meters. When the distance between the camera and the patterns increases, to 3 or 4 meters for example, only large patterns can be applied; in turn, only small patterns can be applied when the distance is short, such as 0.5 m or closer, leaving the system with an undesirable distance range.
Summary
[0004] The present application proposes a solution to enlarge the location range of the AR-based location system and to locate the pattern relative to the recording device more precisely.
[0005] According to an aspect of the present application, disclosed is an Augmented Reality (AR)-based location system. The system comprises a recording device and a pattern with a plurality of frames and at least one figure inside the frames. The recording device comprises a processor configured to identify each of the plurality of frames appearing within a viewing range of the recording device; to select any one of the identified frames as an outside frame of the pattern; to determine the frames inside the selected outside frame and the at least one figure as a whole inside figure of the pattern; and to recognize the pattern based on the whole inside figure so as to enlarge the location range of the system.
[0006] According to a further aspect of the present application, disclosed is a method for optimizing localization of an Augmented Reality (AR)-based location system, in which the AR-based location system comprises a recording device and a pattern with a plurality of frames and at least one figure inside the frames. The method comprises: identifying each of the plurality of frames appearing in a viewing range of the recording device; selecting any one of the identified frames as an outside frame of the pattern; determining the frames inside the outside frame and the at least one figure as a whole inside figure of the pattern; and recognizing the pattern based on the whole inside figure so as to enlarge the location range of the system.
[0007] According to another aspect of the present application, disclosed is an Augmented Reality (AR)-based location system. The system comprises a recording device and a pattern with a plurality of frames and at least one figure inside the frames. The recording device comprises an identifying module configured to identify each of the plurality of frames appearing within a viewing range of the recording device; a selecting module configured to select any one of the identified frames as an outside frame of the pattern; a determining module configured to determine the frames inside the selected outside frame and the at least one figure as a whole inside figure of the pattern; and a recognizing module configured to recognize the pattern based on the whole inside figure so as to enlarge the location range of the system.
[0008] The method according to the present application addresses the problem that the patterns may not be recognized by the recording device (for example, a camera) when the distance between the recording device and the patterns increases or decreases too much. Hence the distance range can be enlarged and more data can be collected, which allows this kind of AR-based location system to be applied more widely and practically.
Brief Description of the Drawing
[0009] Exemplary non-limiting embodiments of the invention are described below with reference to the attached figures. The drawings are illustrative and generally not to an exact scale.
[0010] Fig. 1 is a schematic diagram illustrating an AR-based location system consistent with some disclosed embodiments.
[0011] Fig. 2 is a flowchart illustrating a method for optimizing localization of an AR-based location system consistent with some disclosed embodiments.
[0012] Fig. 3 is a schematic diagram illustrating a recording device consistent with some disclosed embodiments.
[0013] Figs. 4(a)-4(e) are schematic examples of patterns of the AR-based location system according to some disclosed embodiments.
Detailed Description
[0014] Reference will now be made in detail to exemplary embodiments, examples of which are illustrated in the accompanying drawings. When appropriate, the same reference numbers are used throughout the drawings to refer to the same or like parts.
[0015] Fig. 1 is a schematic diagram illustrating an AR-based location system 1000 consistent with some disclosed embodiments. As shown in Fig. 1, the system 1000 may comprise a pattern 100 and a recording device 200.
[0016] The pattern 100 may have a plurality of frames 101 and at least one figure 102 inside the frames, as shown in Fig. 1. The frames 101 are used for calculating the position and orientation of the pattern relative to the camera, and the figure is used for recognizing the pattern and locating the recording device. This kind of pattern may also be used in a zoom vision system, which improves the system performance. The pattern 100 may be pre-stored in the recording device 200 or in a system database external to the device 200. According to an embodiment of the present application, the frames 101 may have the same or different widths, and there shall be a certain space between every two of the plurality of frames 101. According to a further illustrative example of the present application, as shown in Fig. 4(b), the pattern may be configured as an asymmetrical pattern. In an embodiment, as shown in Fig. 4(c), the frames of the pattern are configured to be rotated by predetermined angles relative to each other. According to another example of the present application, as shown in Fig. 4(d), the plurality of frames of the pattern are configured to have overlapping corners 107.
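The nested geometry described above is easy to visualize. The following minimal Python sketch renders a two-frame pattern of the kind shown in Fig. 4(a); the canvas size, frame widths and gap are illustrative assumptions, not values taken from the patent.

```python
# Illustrative rendering of a two-frame pattern (assumed geometry).
import numpy as np


def make_two_frame_pattern(size=200, outer_width=10, gap=15, inner_width=10):
    """Render a white canvas carrying two nested black square frames."""
    img = np.full((size, size), 255, dtype=np.uint8)

    def draw_frame(margin, width):
        # Paint a filled black square, then hollow out its centre,
        # leaving a square ring of the requested width.
        img[margin:size - margin, margin:size - margin] = 0
        inner = margin + width
        img[inner:size - inner, inner:size - inner] = 255

    draw_frame(0, outer_width)                  # outermost frame (cf. 101b)
    draw_frame(outer_width + gap, inner_width)  # inner frame (cf. 101a)
    # The region inside the inner frame would carry the figure 102.
    return img
```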
[0017] The recording device 200 is a device that records images that can be stored directly, transmitted to another location, or both. These images may be still photographs or moving images such as videos or movies. In an embodiment of the present application, the recording device 200 may be a camera, a mobile device with a camera such as a smartphone, a general-purpose computer, a computer cluster, a mainframe computer, or another computing device with a camera. Intrinsic parameters of the recording device 200 are pre-determined.
[0018] In one embodiment of the present application, the device 200 may at least comprise a processor (not shown) which is configured to carry out the computer program instructions stored in a memory so as to implement the process 2000 as shown in Fig. 2.
[0019] At step S201, the processor may identify each of the plurality of frames appearing in a viewing range of the recording device 200 (for example, the range enclosed in the dashed box 104). In an embodiment, as shown in Fig. 4(a), when a pattern with two frames is applied, both of the two frames 101a and 101b can be used to calculate the orientation and position of the pattern relative to the recording device. For example, when the distance between the recording device and the pattern is large, the whole pattern with the two frames appears integrally in the viewing range of the recording device, and both frames 101a and 101b can be identified to locate the pattern 100 relative to the recording device 200. However, when the distance between the recording device 200 and the pattern 100 is short, the outermost frame 101b does not appear in the viewing range of the recording device, and only the inside frame 101a can be identified as the reference to calculate the orientation and position of the pattern relative to the recording device.
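Step S201 can be approximated with standard contour analysis. The sketch below is one plausible OpenCV implementation, not the patented method; the Otsu thresholding, the four-corner convexity test and the area cutoff are all assumptions.

```python
# A hedged sketch of step S201: find candidate square frames in the image.
import cv2
import numpy as np


def identify_frames(gray_image, min_area=1000):
    """Return 4-corner outlines of candidate frames, largest first."""
    _, binary = cv2.threshold(gray_image, 0, 255,
                              cv2.THRESH_BINARY_INV + cv2.THRESH_OTSU)
    contours, _ = cv2.findContours(binary, cv2.RETR_TREE,
                                   cv2.CHAIN_APPROX_SIMPLE)
    frames = []
    for contour in contours:
        peri = cv2.arcLength(contour, True)
        approx = cv2.approxPolyDP(contour, 0.02 * peri, True)
        # Keep large convex quadrilaterals; each physical frame yields an
        # outer and an inner boundary, which a full system would deduplicate.
        if len(approx) == 4 and cv2.isContourConvex(approx) \
                and cv2.contourArea(approx) > min_area:
            frames.append(approx.reshape(4, 2).astype(np.float32))
    frames.sort(key=cv2.contourArea, reverse=True)  # outermost first
    return frames
```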
[0020] According to another embodiment of the present application, as shown in Fig. 4(e), a few more frames which are not nested may be added inside the outermost frame. In this situation, when the distance between the recording device and the pattern becomes large enough for the multiple frames to appear in the viewing range, the whole pattern with the multiple frames appears integrally in the viewing range of the recording device, and all the frames can be identified. However, in the present embodiment, when the distance between the recording device and the pattern is short, the outermost frame does not appear in the viewing range of the recording device, and only the respective inside frame is identified as the reference to calculate the orientation and position of the pattern relative to the recording device.
[0021] At step S202, the processor may select any one of the identified frames 101 as an outside frame of the pattern 100. For example, in the embodiment shown in Fig. 4(a), when the distance between the recording device and the pattern is large, either of the identified frames 101a and 101b can be selected as the outside frame of the pattern 100. But when the distance between the recording device and the pattern is short, only the identified frame 101a can be selected as the outside frame of the pattern 100.
[0022] Then, at step S203, the processor may determine the frames inside the selected outside frame and the at least one figure as a whole inside figure of the pattern 100. As shown in Fig. 4(a), when the distance between the recording device and the pattern is large, the members enclosed in the dashed box 103 may be determined as the figure as a whole. But when the distance between the recording device and the pattern is short, the members enclosed in the dashed box 105 may be determined as the figure as a whole; that is, the pattern 100 can be recognized by the inside figure 102 alone. In one embodiment, the processor may repeat the selecting (step S202) and determining (step S203) processes for each frame 101 of the identified frames until every identified frame has been selected as the outside frame once.
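Taken together, steps S202-S203 amount to treating each identified frame in turn as the outside frame and rectifying everything inside it into a canonical square, the whole inside figure. A minimal sketch, assuming corners ordered consistently (e.g., clockwise from the top-left) and an arbitrary 64x64 output size:

```python
# Hedged sketch of steps S202-S203: normalize the content inside each
# candidate outside frame into a canonical square image.
import cv2
import numpy as np


def whole_inside_figures(gray_image, frames, size=64):
    """Yield (outside_frame, normalized_inside_figure) for each candidate."""
    target = np.array([[0, 0], [size - 1, 0],
                       [size - 1, size - 1], [0, size - 1]], dtype=np.float32)
    for outside in frames:  # each identified frame is tried once (step S202)
        H = cv2.getPerspectiveTransform(outside, target)
        figure = cv2.warpPerspective(gray_image, H, (size, size))  # step S203
        yield outside, figure
```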
[0023] At step S204, the processor may recognize the pattern based on the whole inside figure so as to enlarge the location range of the system.
[0024] In addition, the processor may determine the position and orientation of the pattern 100 relative to the recording device 200 based on the selected outside frame. For example, image processing techniques well known in the art are available to extract the corners of the selected outside frame. The region bounded by the extracted corners is normalized and compared with the pre-stored patterns to find the pre-stored pattern corresponding to the recognized one. Then, a transformation matrix between the coordinate frame of the recording device 200 and the located pattern is estimated based on the region bounded by the extracted corners.
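The comparison with the pre-stored patterns can be done in many ways, and the patent does not fix one. The sketch below uses normalized cross-correlation as an assumed similarity measure with a hypothetical 0.8 acceptance threshold; grayscale templates of the pre-stored patterns are also an assumption.

```python
# Hedged sketch of matching a normalized region against pre-stored patterns.
import cv2


def match_pattern(normalized_figure, stored_templates, threshold=0.8):
    """Return the index of the best-matching stored pattern, or None."""
    best_index, best_score = None, threshold
    for index, template in enumerate(stored_templates):
        # Resize each template to the normalized region's (width, height).
        resized = cv2.resize(template, normalized_figure.shape[::-1])
        # Equal-size inputs make matchTemplate return a single score.
        score = cv2.matchTemplate(normalized_figure, resized,
                                  cv2.TM_CCOEFF_NORMED)[0, 0]
        if score > best_score:
            best_index, best_score = index, score
    return best_index
```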
[0025] For example, the identified pattern and the pre-stored pattern can be regarded as two different images of the same planar surface in space. In fact, the two images are related by the transformation matrix. Here, the transformation matrix may be a homography matrix, which represents the motion of the recording device (the rotation and translation relating the two images). Using this concept, the computation of the extrinsic parameters of the recording device 200 reduces to solving the homography matrix, for example a 3x3 matrix with eight independent variables. Since four corners may be identified from the identified pattern and each corner contributes two independent equations, the four corners can be utilized to solve the homography matrix. In fact, the estimated transformation matrix represents the extrinsic parameters of the recording device with respect to the pattern, and so it can be used for localization.
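The counting argument above translates directly into a linear system: fixing the bottom-right entry of the homography to 1 leaves eight unknowns, and each corner correspondence (x, y) -> (u, v) contributes the two equations of the direct linear transform. A plain-numpy sketch (cv2.getPerspectiveTransform computes the same matrix):

```python
# Worked example of the eight-unknown, four-corner homography solution.
import numpy as np


def homography_from_corners(src, dst):
    """Solve the 3x3 homography H (with h33 = 1) from four correspondences."""
    A, b = [], []
    for (x, y), (u, v) in zip(src, dst):
        # u = (h11*x + h12*y + h13) / (h31*x + h32*y + 1), similarly for v.
        A.append([x, y, 1, 0, 0, 0, -u * x, -u * y])
        A.append([0, 0, 0, x, y, 1, -v * x, -v * y])
        b.extend([u, v])
    h = np.linalg.solve(np.array(A, dtype=float), np.array(b, dtype=float))
    return np.append(h, 1.0).reshape(3, 3)
```

Since the intrinsic parameters of the recording device 200 are pre-determined (see paragraph [0017]), the solved homography can then be decomposed into the rotation and translation that constitute the extrinsic parameters used for localization.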
[0026] In the above, the process 2000 is carried out by the processor running the computer program instructions. Alternatively, the process 2000 may be carried out by hardware circuitry, or by hardware combined with software. As shown in Fig. 3, the recording device 200 may include an identifying module 301, a selecting module 302, a determining module 303 and a recognizing module 304.
[0027] According to an embodiment of the present application, the identifying module 301 may be configured to identify each of the plurality of frames 101 appearing in a viewing range of the recording device 200. The selecting module 302 may be configured to select any one of the identified frames 101 as an outside frame of the pattern 100. The determining module 303 may be configured to determine the frames inside the outside frame and the at least one figure 102 as a whole inside figure of the pattern 100. The recognizing module 304 may be configured to recognize the pattern 100 based on the whole inside figure to locate the pattern 100 relative to the recording device 200.
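One plausible object-oriented arrangement of the Fig. 3 modules is sketched below; the class name and the wiring are illustrative assumptions that merely mirror the module names in the text, consistent with paragraph [0026]'s point that the decomposition may be realized in software or hardware.

```python
# Hedged sketch of the Fig. 3 decomposition of the recording device 200.
class RecordingDevice:
    def __init__(self, identify, select, determine, recognize):
        self.identify = identify    # identifying module 301
        self.select = select        # selecting module 302
        self.determine = determine  # determining module 303
        self.recognize = recognize  # recognizing module 304

    def locate(self, image, stored_patterns):
        """Try each identified frame as the outside frame until one matches."""
        frames = self.identify(image)
        for outside in self.select(frames):  # each frame selected once
            figure = self.determine(image, outside)
            match = self.recognize(figure, stored_patterns)
            if match is not None:
                return outside, match
        return None
```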
[0028] In an embodiment of the present application, the determining module 303 may be further configured to determine position and orientation of the pattern 100 relative to the recording device 200 based on the selected outside frame.
[0029] In an embodiment of the present application, the selecting module 302 and the determining module 303 may repeat the selecting and determining processes for each frame 101 of the identified frames until all the identified frames have been selected as the outside frame once.
[0030] The embodiments of the present invention may be implemented using certain hardware, software, or a combination thereof. In addition, the embodiments of the present invention may be embodied in a computer program product on one or more computer readable storage media (comprising but not limited to disk storage, CD-ROM, optical memory and the like) containing computer program code.
[0031] In the foregoing descriptions, various aspects, steps, or components are grouped together in a single embodiment for purposes of illustration. The disclosure is not to be interpreted as requiring all of the disclosed variations for the claimed subject matter. The following claims are incorporated into this Detailed Description, with each claim standing on its own as a separate embodiment of the disclosure.
[0032] Moreover, it will be apparent to those skilled in the art from consideration of the specification and practice of the present disclosure that various modifications and variations can be made to the disclosed systems and methods without departing from the scope of the disclosure, as claimed. Thus, it is intended that the specification and examples be considered as exemplary only, with a true scope of the present disclosure being indicated by the following claims and their equivalents.

Claims

What is claimed is:
1. An Augmented Reality (AR)-based location system, comprising:
a pattern with a plurality of frames and at least one figure inside the frames; and a recording device comprising a processor configured to
identify each of the plurality of frames appearing within a viewing range of the recording device;
select any one of the identified frames as an outside frame of the pattern;
determine frames inside the selected outside frame and the at least one figure as a whole inside figure of the pattern; and
recognize the pattern based on the whole inside figure so as to enlarge the location range of the system.
2. The system according to claim 1, wherein the processor is further configured to determine position and orientation of the pattern relative to the recording device based on the selected outside frame.
3. The system according to claim 1, wherein the processor is further configured to repeat the selecting and determining processes for each frame of the identified frames until all the identified frames have been selected as the outside frame once.
4. The system according to claim 2, wherein each of the frames has the same width.
5. The system according to claim 2, wherein each of the frames has a different width from the others.
6. The system according to claim 1, wherein the plurality of frames are configured to rotate predetermined angles relative to each other.
7. The system according to claim 1, wherein the plurality of frames are configured to have corners overlapped.
8. The system according to claim 1, wherein the pattern is configured as an asymmetrical pattern.
9. A computer-implemented method for optimizing localization of an Augmented Reality (AR)-based location system, wherein the AR-based location system comprises a recording device and a pattern with a plurality of frames and at least one figure inside the frames, and the method comprises:
identifying each of the plurality of frames appearing within a viewing range of the recording device;
selecting any one of the identified frames as an outside frame of the pattern;
determining frames inside the selected outside frame and the at least one figure as a whole inside figure of the pattern; and
recognizing the pattern based on the whole inside figure so as to enlarge the location range of the system.
10. The method according to claim 9, further comprising:
determining position and orientation of the pattern relative to the recording device based on the selected outside frame.
11. The method according to claim 10, wherein the selecting and determining processes are repeated for each frame of the identified frames until all the identified frames have been selected as the outside frame once.
12. The method according to claim 10, wherein each of the frames has the same width.
13. The method according to claim 10, wherein each of the frames has a different width from the others.
14. The method according to claim 10, wherein the plurality of frames are configured to rotate predetermined angles relative to each other.
15. The method according to claim 10, wherein the plurality of frames are configured to have corners overlapped.
16. The method according to claim 10, wherein the pattern is configured as an asymmetrical pattern.
17. An Augmented Reality (AR)-based location system, comprising:
a pattern with a plurality of frames and at least one figure inside the frames; and a recording device comprising:
an identifying module configured to identify each of the plurality of frames appearing within a viewing range of the recording device;
a selecting module configured to select any one of the identified frames as an outside frame of the pattern;
a determining module configured to determine frames inside the selected outside frame and the at least one figure as a whole inside figure of the pattern; and
a recognizing module configured to recognize the pattern based on the whole inside figure so as to enlarge the location range of the system.
18. The system according to claim 17, wherein the determining module is further configured to determine position and orientation of the pattern relative to the recording device based on the selected outside frame.
19. The system according to claim 18, wherein the selecting module and the determining module are configured to repeat the selecting and determining processes for each frame of the identified frames until all the identified frames have been selected as the outside frame once.
PCT/CN2014/000788 2013-10-31 2014-08-22 Method for optimizing localization of augmented reality-based location system WO2015062164A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201480055667.XA CN105637560B (en) 2013-10-31 2014-08-22 Method for optimizing localization of an augmented reality-based location system

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US201361898032P 2013-10-31 2013-10-31
US61/898,032 2013-10-31

Publications (1)

Publication Number Publication Date
WO2015062164A1 (en) 2015-05-07

Family

ID: 53003219

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2014/000788 WO2015062164A1 (en) 2013-10-31 2014-08-22 Method for optimizing localization of augmented reality-based location system

Country Status (2)

Country Link
CN (1) CN105637560B (en)
WO (1) WO2015062164A1 (en)

Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN100470452C (en) * 2006-07-07 2009-03-18 华为技术有限公司 Method and system for implementing three-dimensional enhanced reality
JP2013535047A (en) * 2010-05-28 2013-09-09 クゥアルコム・インコーポレイテッド Creating a data set to track targets with dynamically changing parts

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2011048497A2 (en) * 2009-10-19 2011-04-28 National University Of Singapore Computer vision based hybrid tracking for augmented reality in outdoor urban environments
US20120214590A1 (en) * 2010-11-24 2012-08-23 Benjamin Zeis Newhouse System and method for acquiring virtual and augmented reality scenes by a user
US20120154638A1 (en) * 2010-12-21 2012-06-21 Cyberlink Corp. Systems and Methods for Implementing Augmented Reality
US20120307075A1 (en) * 2011-06-01 2012-12-06 Empire Technology Development, Llc Structured light projection for motion detection in augmented reality
CN102509104A (en) * 2011-09-30 2012-06-20 北京航空航天大学 Confidence map-based method for distinguishing and detecting virtual object of augmented reality scene

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US11133993B2 (en) 2019-02-28 2021-09-28 At&T Intellectual Property I, L.P. Augmented/mixed reality virtual venue pipeline optimization
US11528198B2 (en) 2019-02-28 2022-12-13 At&T Intellectual Property I, L.P. Augmented/mixed reality virtual venue pipeline optimization

Also Published As

Publication number Publication date
CN105637560B (en) 2019-02-15
CN105637560A (en) 2016-06-01

Similar Documents

Publication Publication Date Title
US10362296B2 (en) Localized depth map generation
EP3039655B1 (en) System and method for determining the extent of a plane in an augmented reality environment
US20180261005A1 (en) Method and Apparatus for Constructing Three-Dimensional Map
US20120075433A1 (en) Efficient information presentation for augmented reality
US20150310617A1 (en) Display control device and display control method
US11551388B2 (en) Image modification using detected symmetry
EP3711025A1 (en) Graphical coordinate system transform for video frames
US10789762B2 (en) Method and apparatus for estimating parameter of virtual screen
JP2022048963A (en) Acquisition method for obstacle three-dimensional position to be used for roadside calculation device, apparatus, electronic device, computer readable storage medium, and computer program
CN111295667B (en) Method for stereo matching of images and auxiliary driving device
CN106570482B (en) Human motion recognition method and device
JP2022529367A (en) Peripheral estimation from postural monocular video
CN108122280A (en) The method for reconstructing and device of a kind of three-dimensional point cloud
CN104952056A (en) Object detecting method and system based on stereoscopic vision
CN111583381A (en) Rendering method and device of game resource map and electronic equipment
JP2024008869A (en) Method and device for multi-target multi-camera head tracking
Jung et al. Object Detection and Tracking‐Based Camera Calibration for Normalized Human Height Estimation
CN114972689A (en) Method and apparatus for performing augmented reality pose determination
Yang et al. Robust and real-time pose tracking for augmented reality on mobile devices
US11189053B2 (en) Information processing apparatus, method of controlling information processing apparatus, and non-transitory computer-readable storage medium
US20150154736A1 (en) Linking Together Scene Scans
CN115601672A (en) VR intelligent shop patrol method and device based on deep learning
CN113344957B (en) Image processing method, image processing apparatus, and non-transitory storage medium
WO2015062164A1 (en) Method for optimizing localization of augmented reality-based location system
WO2018220824A1 (en) Image discrimination device

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 14857586

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 14857586

Country of ref document: EP

Kind code of ref document: A1