GB2502946A - Maintaining augmented reality image when marker leaves field of view - Google Patents


Info

Publication number
GB2502946A
Authority
GB
United Kingdom
Prior art keywords
image
view
representation
modified representation
overlay
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Withdrawn
Application number
GB201206570A
Other versions
GB201206570D0 (en)
Inventor
Omar Tayeb
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
BLIPPAR COM Ltd
Original Assignee
BLIPPAR COM Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by BLIPPAR COM Ltd filed Critical BLIPPAR COM Ltd
Priority to GB201206570A
Publication of GB201206570D0
Publication of GB2502946A
Legal status: Withdrawn


Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T19/00 Manipulating 3D models or images for computer graphics
    • G06T19/006 Mixed reality

Landscapes

  • Engineering & Computer Science (AREA)
  • Computer Graphics (AREA)
  • Computer Hardware Design (AREA)
  • General Engineering & Computer Science (AREA)
  • Software Systems (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Processing Or Creating Images (AREA)

Abstract

A method comprises displaying an augmented reality (AR) or modified representation of a target element, such as a marker 104, in the field of view of an image capture device and transitioning to a stored modified representation of the element 106, such as a 3D overlay, upon detecting that it has left the field of view. A device, such as a hand-held device, may calculate or retrieve the transformation matrix of the overlay at a point before the target was lost, for example when the camera last detected the target element, and provide a smooth visual transition from the last calculated transformation matrix to an identity matrix, rendering the overlay in the plane of, and fitted to, the screen of the device.

Description

A METHOD OF PROCESSING A DISPLAY IMAGE

The invention relates to a method of processing a display image, for example in an augmented reality (AR) system.
Augmented (or more generally, mediated) reality is a method for overlaying digital content over a display image of real world content so that, ideally, the user will not be able to recognise the difference between real world objects and computer generated objects. For example on a device capturing and displaying a video stream, content can be added to and can maintain its position in relation to the displayed stream allowing presentation of additional information or user manipulation of the content.
However, existing AR systems are limited to a comparatively narrow range of applications.
The invention is set out in the claims.
Embodiments of the invention will now be described, by way of example, with reference to the drawings, of which: Fig. 1 shows an initial AR triggering step; Fig. 2 shows an AR tracking step; Fig. 3 shows a conventional AR system when an element leaves the field of view of an image capture device; Fig. 4 shows operation of the device according to the invention; Fig. 5 shows steps performed by the device according to the invention; and Fig. 6 shows a component of an AR supporting device according to the invention.
In overview, the invention recognises that in known AR systems, an augmented or modified representation of an element is only shown when it is in the field of view of an image capture device; when the object or image is lost from the camera view, so is the AR experience. The invention provides an improved user experience for presentation of, and/or interaction with, AR content in the real world, even when targets may not be stationary or the user is unable to use their device within the confines of having the target object or image always present in the camera view. In particular, even when the reference or target image or object is lost, the AR modified representation is maintained by smoothly "peeling off" the target image and attaching itself to the screen of the device, for example by transitioning to a stored representation of the image.
Operation of a conventional augmented reality system, for example as implemented on a handheld device including a display and an image capture element, will be well known to the skilled person, such that detailed description is not required here.
However, for purposes of background, the principal components are outlined.
Traditional operation of an augmented reality system can be understood with reference to Figs. 1 to 3. Referring first to Fig. 1, an image or object 100 in the real world enters the field of view of a device such as a handheld device 102. The image is recognised from the video stream using any appropriate approach, for example feature recognition as discussed at http://en.wikipedia.org/wiki/Feature_detection_(computer_vision).
When the image is recognised it is then tracked whilst in the field of view, such that the position of the image in the real world relative to the image capture device is derived. This can be done, for example, by recognising the orientation of a predefined marker 104 in the image, or by other known techniques as described, for example, at http://en.wikipedia.org/wiki/Homography.
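By way of illustration only, the recognition and tracking steps could be realised with off-the-shelf feature matching and RANSAC homography estimation, for example using OpenCV. This is a minimal sketch under assumptions, not the patent's implementation; the function name, image variables and match threshold are illustrative:

```python
import cv2
import numpy as np

orb = cv2.ORB_create()  # ORB feature detector/descriptor

def track_target(reference_image, frame):
    """Estimate the 3x3 homography mapping the target (marker) image into
    the current camera frame, or return None if the target is not found."""
    kp_ref, des_ref = orb.detectAndCompute(reference_image, None)
    kp_frm, des_frm = orb.detectAndCompute(frame, None)
    if des_ref is None or des_frm is None:
        return None
    matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
    matches = matcher.match(des_ref, des_frm)
    if len(matches) < 4:  # a homography needs at least four correspondences
        return None
    src = np.float32([kp_ref[m.queryIdx].pt for m in matches]).reshape(-1, 1, 2)
    dst = np.float32([kp_frm[m.trainIdx].pt for m in matches]).reshape(-1, 1, 2)
    H, inliers = cv2.findHomography(src, dst, cv2.RANSAC, 5.0)
    return H  # pose of the marker plane relative to the camera view
```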
The orientation and location of the target image relative to the image capture device can then be represented in the form of a set of coordinates in a matrix M, for example:
$$M = \begin{pmatrix} 0.291 & 0.9 & 0.096 & 0 \\ -0.575 & 0.224 & -0.428 & 0 \\ 0.386 & -0.208 & -0.299 & 0 \\ 58.728 & 141.222 & 879.157 & 1 \end{pmatrix} \quad \text{(Equation 1)}$$

Once the target element has been identified and tracked, an augmented reality process can place a 3D or other overlay over or around the image/object, giving the experience that the object exists in the real world including the modified representation. The overlay can be selected based on the identification of the element, for example. Referring to Figs. 2 and 3, as the camera 102 moves, the 3D overlay 106 on the display maintains its apparent position in the real world by being moved accordingly (see Fig. 3) and rendered relative to the movement of the camera.
This can be done in any appropriate manner as will be understood by the skilled person. According to one approach, the 3D overlay layer comprises the video stream plus content added on top, and the content is manipulated by identifying a transformation that transforms it to correspond to the position, scale and orientation of the target image relative to the camera. For example, the 3D layer can be manipulated by calculating a 4x4 transform matrix M_A whose x, y and z vectors represent the three orthogonal axes defining the direction in which the overlay is pointing, and whose t values define its translation (so, for example, moving it left or right along the x, y, z axes); h is the homogeneous coordinate:

$$M_A = \begin{pmatrix} x_1 & x_2 & x_3 & 0 \\ y_1 & y_2 & y_3 & 0 \\ z_1 & z_2 & z_3 & 0 \\ t_1 & t_2 & t_3 & h \end{pmatrix} \quad \text{(Equation 2)}$$

As indicated above, however, known AR systems only function when the image or element is within the field of view, as conventional systems are designed to augment what the camera is currently looking at.
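To make the layout of Equation 2 concrete, the following is a minimal numpy sketch of composing such a transform. The row-major convention follows Equation 2 as reconstructed above; the function name and the example values are assumptions for illustration only:

```python
import numpy as np

def make_overlay_transform(x_axis, y_axis, z_axis, t, h=1.0):
    """Compose the 4x4 transform of Equation 2 (row-major): the first three
    rows hold the overlay's orthogonal axis vectors, the last row holds the
    translation t with homogeneous coordinate h."""
    M = np.zeros((4, 4))
    M[0, :3] = x_axis  # direction of the overlay's local x axis
    M[1, :3] = y_axis  # local y axis
    M[2, :3] = z_axis  # local z axis
    M[3, :3] = t       # translation along the world x, y, z axes
    M[3, 3] = h        # homogeneous coordinate, typically 1
    return M

# Example: an overlay aligned with the world axes, translated as in Equation 1
M_A = make_overlay_transform([1, 0, 0], [0, 1, 0], [0, 0, 1], [58.7, 141.2, 879.2])
```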
However, it has been recognised according to the invention that in some instances it is desirable to maintain the augmented reality representation even when the target element is no longer in the field of view of the image capture device. This can be understood with reference to Fig. 4, where it can be seen that the overlay 108 is still shown on the display 106 even though the target element and marker 104 are no longer in the field of view.
The process for achieving this can be understood from Fig. 5, which shows a flow diagram representing the principal steps. At step 500 the image is recognised and at step 502 the image is tracked. At step 504 the overlay is added, all in the manner described above. However, at step 506, as the device and element move away from one another, the device recognises that the image is being lost from the field of view, for example by detecting that the feature(s) previously recognised are no longer in the video stream. At this stage the device must transition to a representation which can, for example, be a stored representation on the device corresponding to the modified image, such that the AR experience is maintained.
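As a sketch only, the "target lost" test of step 506 might be expressed as follows, assuming a tracker that reports how many of the previously recognised features were re-found in the current frame; the threshold and the callback are hypothetical names, not taken from the patent:

```python
MIN_FEATURES = 12  # hypothetical threshold; tune per target image

def check_target_lost(refound_feature_count, last_transform, begin_transition):
    """Step 506: if too few of the previously recognised features remain in
    the video stream, treat the target as lost and start the transition,
    handing over the last transform computed while the target was visible."""
    if refound_feature_count < MIN_FEATURES:
        begin_transition(last_transform)
        return True
    return False
```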
Accordingly, at step 508 the device calculates or retrieves the transformation matrix M_A of the overlay (Equation 2) at a point before the target was lost, for example when the camera last detected the target element. This matrix is then compared with the identity matrix, that is, the transformation matrix that would render the overlay in the plane of, and fitted to, the screen (which can be dependent upon the orientation of the screen). A value s can be provided to define the size of the overlay, representing how far away (along the z axis) the 3D experience is from the camera; the higher the value, the further away the 3D experience is along the z axis, and so the smaller it appears on the screen. Hence at step 510 the transformation matrix is subtracted from the identity matrix, and at step 512 the difference is divided into smaller vector elements, for example 10 elements, or as many as are required to provide a smooth visual transition from the last calculated transformation matrix to the identity matrix. In particular, at step 512, transition vectors are generated by dividing the difference matrix into vectors Δp which are added onto the vector of the overlay at appropriate intervals, for example every 100 milliseconds, so that after a period of time, for example 1 second, the overlay will be "attached to", that is in the plane of, and scaled to, the screen.
Computation of Δp can be seen from Equation 3:

$$\Delta p = \frac{1}{p}\left(\begin{pmatrix} 1 & 0 & 0 & 0 \\ 0 & 1 & 0 & 0 \\ 0 & 0 & 1 & 0 \\ 0 & 0 & s & 1 \end{pmatrix} - M_A\right) \quad \text{(Equation 3)}$$

This effectively provides a sequence of frames transitioning from the last (or a late or representative) image detected whilst the target element was in view to the modified augmented reality image attached to the screen, corresponding to step 514, after which the augmented reality image is attached at step 516, providing a smooth animated transition that can rotate, translate and scale the overlay to fit on screen. At this stage the user can interact with the image in a similar way to when the image was being tracked, for example by manipulating elements that are displayed.
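A minimal numpy sketch of steps 508 to 516 follows, implementing the incremental scheme described above under stated assumptions: M_A is the last 4x4 transform computed while the target was visible, `render` is a hypothetical callback that redraws the overlay, and the default scale, step count and interval echo the example figures in the text (10 steps at 100 ms, i.e. a one-second transition) rather than fixed values:

```python
import time
import numpy as np

def transition_to_screen(M_A, render, s=0.0, p=10, interval=0.1):
    """Animate the overlay from its last tracked pose M_A to the identity
    matrix, i.e. into the plane of, and scaled to, the screen."""
    I_s = np.identity(4)
    I_s[3, 2] = s                 # z translation: apparent distance from the camera
    delta_p = (I_s - M_A) / p     # Equation 3: split the difference into p increments
    M = M_A.astype(float)
    for _ in range(p):
        M += delta_p              # add one increment per interval (e.g. every 100 ms)
        render(M)                 # re-render the overlay with the interpolated transform
        time.sleep(interval)
    render(I_s)                   # step 516: overlay now attached to the screen
```

Reversing the sign of the increments (iterating from I_s back towards M_A) would give the reverse transition described below, for when the target re-enters the field of view.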
It will be further appreciated that the reverse takes place when the target element returns to the field of view, transitioning from the attached image back to the captured modified image by reversing the transition steps.
It will be appreciated that the approach described herein can be implemented on any appropriate device, such as a handheld device, either as an integral device including both image capture and display elements or as separable devices for each function, as appropriate. One possible device is shown schematically in Fig. 6, which can be a handheld device including an image capture element 602 and a display element 604.
A processor 606 is connected to each element. The processor 606 performs image recognition on the video stream from the image capture element 602, and image tracking thereof. The processor 606 further includes a 3D engine to render an augmented layer and present this to the display 604. Where user interaction is possible, for example by use of a touch screen display 604, the processor 606 allows detection and processing of user interaction. Further, the processor 606 is arranged to detect when a target element is leaving the field of view and to perform the transition steps described above in relation to transitioning to an attached image.
The approach allows the user to continue to manipulate an image after it has left the field of view, such that initial capture effectively acts as a trigger permitting augmented reality processes to be applied to a desired target element without any need for normal capture or storage steps.
It will be further appreciated that the approaches described above can be performed using any appropriate algorithms and in relation to any appropriate device. The steps can be performed in software, firmware or hardware, and can be implemented, for example, in the form of a downloadable application interacting with image capture and display hardware components on a device such as a computer, laptop, tablet or telephone, or head mounted apparatus such as augmented reality glasses that recognise objects as the wearer moves around.

Claims (13)

CLAIMS:

  1. A method, performed by a processor, of processing a display image comprising displaying a modified representation of a target element in a field of view of an image capture device and transitioning to a stored modified representation of the element upon detecting that it has left the field of view.
  2. A method as claimed in claim 1 in which the image capture and display are performed on a common device.
  3. A method as claimed in claim 1 or claim 2 in which the target element comprises an object or an image.
  4. A method as claimed in any preceding claim in which the modified representation comprises an augmented reality representation.
  5. A method as claimed in any preceding claim in which the modified representation includes modification of display content, addition of display content, replacement of display content, overlay of display content or inclusion of manipulable display content.
  6. A method as claimed in any preceding claim in which the step of displaying a modified representation of an element comprises identifying a target element to be displayed, and modifying the representation of the identified element.
  7. A method as claimed in any preceding claim in which the modified representation of an element is maintained by transitioning from a modified captured image of an element to a stored image.
  8. A method as claimed in any preceding claim wherein the transitioning further comprises comparing a displayed image with the stored modified representation, generating one or more intermediate images and displaying the intermediate images sequentially prior to displaying the stored modified representation.
  9. A computer readable medium comprising a sequence of instructions for a processor to perform the method of any of claims 1 to 8.
  10. A computer readable medium as claimed in claim 9 comprising a downloadable application.
  11. A processor configured to implement the instructions stored on the computer readable medium of claim 9 or 10.
  12. A device arranged to operate according to the instructions of the computer readable medium of claim 9 or 10.
  13. A device as claimed in claim 12 comprising a handheld device.
GB201206570A 2012-04-13 2012-04-13 Maintaining augmented reality image when marker leaves field of view Withdrawn GB2502946A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
GB201206570A GB2502946A (en) 2012-04-13 2012-04-13 Maintaining augmented reality image when marker leaves field of view

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
GB201206570A GB2502946A (en) 2012-04-13 2012-04-13 Maintaining augmented reality image when marker leaves field of view

Publications (2)

Publication Number Publication Date
GB201206570D0 GB201206570D0 (en) 2012-05-30
GB2502946A true GB2502946A (en) 2013-12-18

Family

ID=46209057

Family Applications (1)

Application Number Title Priority Date Filing Date
GB201206570A Withdrawn GB2502946A (en) 2012-04-13 2012-04-13 Maintaining augmented reality image when marker leaves field of view

Country Status (1)

Country Link
GB (1) GB2502946A (en)

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2010094065A1 (en) * 2009-02-17 2010-08-26 Jumbuck Entertainment Limited Augmented reality system and method
US20110157017A1 (en) * 2009-12-31 2011-06-30 Sony Computer Entertainment Europe Limited Portable data processing apparatus
US20110169861A1 (en) * 2010-01-08 2011-07-14 Sony Corporation Information processing apparatus, information processing system, and information processing method
KR101056418B1 (en) * 2011-03-31 2011-08-11 주식회사 맥스트 Apparatus and method for tracking augmented reality contents using mobile sensors
KR20110091126A (en) * 2010-02-05 2011-08-11 에스케이텔레콤 주식회사 Augmented reality book station based augmented reality system and method, augmented reality processing apparatus for realizing the same
WO2012009789A2 (en) * 2010-07-19 2012-01-26 Smart Technologies Ulc Interactive input system having a 3d input space
WO2013061504A1 (en) * 2011-10-27 2013-05-02 Sony Corporation Image processing apparatus, image processing method, and program

Patent Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2010094065A1 (en) * 2009-02-17 2010-08-26 Jumbuck Entertainment Limited Augmented reality system and method
US20110157017A1 (en) * 2009-12-31 2011-06-30 Sony Computer Entertainment Europe Limited Portable data processing apparatus
US20110169861A1 (en) * 2010-01-08 2011-07-14 Sony Corporation Information processing apparatus, information processing system, and information processing method
KR20110091126A (en) * 2010-02-05 2011-08-11 에스케이텔레콤 주식회사 Augmented reality book station based augmented reality system and method, augmented reality processing apparatus for realizing the same
WO2012009789A2 (en) * 2010-07-19 2012-01-26 Smart Technologies Ulc Interactive input system having a 3d input space
KR101056418B1 (en) * 2011-03-31 2011-08-11 주식회사 맥스트 Apparatus and method for tracking augmented reality contents using mobile sensors
US20120249528A1 (en) * 2011-03-31 2012-10-04 Maxst Co., Ltd. Apparatus and method for tracking augmented reality content
WO2013061504A1 (en) * 2011-10-27 2013-05-02 Sony Corporation Image processing apparatus, image processing method, and program

Also Published As

Publication number Publication date
GB201206570D0 (en) 2012-05-30

Similar Documents

Publication Publication Date Title
US9972136B2 (en) Method, system and device for navigating in a virtual reality environment
EP3218781B1 (en) Spatial interaction in augmented reality
US10890983B2 (en) Artificial reality system having a sliding menu
US9881423B2 (en) Augmented reality-based hand interaction apparatus and method using image information
KR101171660B1 (en) Pointing device of augmented reality
US20140240225A1 (en) Method for touchless control of a device
US20080266323A1 (en) Augmented reality user interaction system
US20120113223A1 (en) User Interaction in Augmented Reality
WO2016109409A1 (en) Virtual lasers for interacting with augmented reality environments
TW201142745A (en) Information processing apparatus, information processing system, and information processing method
KR20170031733A (en) Technologies for adjusting a perspective of a captured image for display
CN105229720A (en) Display control unit, display control method and recording medium
US11249556B1 (en) Single-handed microgesture inputs
KR102147430B1 (en) virtual multi-touch interaction apparatus and method
US11054896B1 (en) Displaying virtual interaction objects to a user on a reference plane
WO2020054760A1 (en) Image display control device and program for controlling image display
US11212501B2 (en) Portable device and operation method for tracking user's viewpoint and adjusting viewport
CN112088348A (en) Method, system and computer program for remote control of a display device via head gestures
WO2014111947A1 (en) Gesture control in augmented reality
KR20140030444A (en) Apparatus for providing marker-less augmented reality service and photographing position estimating method therefor
Hakoda et al. Eye tracking using built-in camera for smartphone-based HMD
JP2018063567A (en) Image processing device, image processing method and program
KR101519589B1 (en) Electronic learning apparatus and method for controlling contents by hand avatar
GB2502946A (en) Maintaining augmented reality image when marker leaves field of view
CN105787971B (en) Information processing method and electronic equipment

Legal Events

Date Code Title Description
WAP Application withdrawn, taken to be withdrawn or refused ** after publication under section 16(1)