WO2013036233A1 - Augmented reality based on imaged object characteristics - Google Patents

Augmented reality based on imaged object characteristics

Info

Publication number
WO2013036233A1
Authority
WO
WIPO (PCT)
Prior art keywords
image
marker
characteristic
processor
storing instructions
Prior art date
2011-09-08
Application number
PCT/US2011/050879
Other languages
French (fr)
Inventor
Lucas B. AINSWORTH
James P. Melican
Tondra J. SCHLIESKI
Original Assignee
Intel Corporation
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
2011-09-08
Filing date
2011-09-08
Publication date
2013-03-14
Application filed by Intel Corporation
Priority to PCT/US2011/050879 (WO2013036233A1)
Priority to KR1020147006043A (KR20140045574A)
Priority to EP11871960.8A (EP2754289A4)
Priority to JP2014529651A (JP2014531644A)
Priority to CN201180073313.4A (CN103765867A)
Priority to US13/993,220 (US20130265333A1)
Priority to KR1020157013675A (KR101773018B1)
Publication of WO2013036233A1

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T19/00Manipulating 3D models or images for computer graphics
    • G06T19/006Mixed reality
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N5/00Details of television systems
    • H04N5/222Studio circuitry; Studio devices; Studio equipment
    • H04N5/262Studio circuits, e.g. for mixing, switching-over, change of character of image, other special effects ; Cameras specially adapted for the electronic generation of special effects
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/017Gesture based interaction, e.g. based on a set of recognized hand gestures
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T17/00Three dimensional [3D] modelling, e.g. data description of 3D objects
    • GPHYSICS
    • G09EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09GARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G2340/00Aspects of display data processing
    • G09G2340/12Overlay of images, i.e. displayed pixel being the result of switching between the corresponding input pixels
    • GPHYSICS
    • G09EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09GARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G2340/00Aspects of display data processing
    • G09G2340/12Overlay of images, i.e. displayed pixel being the result of switching between the corresponding input pixels
    • G09G2340/125Overlay of images, i.e. displayed pixel being the result of switching between the corresponding input pixels wherein one of the images is motion video
    • GPHYSICS
    • G09EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09GARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G2340/00Aspects of display data processing
    • G09G2340/14Solving problems related to the presentation of information to be displayed
    • GPHYSICS
    • G09EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09GARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G2354/00Aspects of interface with display user

Abstract

Augmented reality may be enabled by adding computer generated images to images of real world occurrences. The computer generated images may be inserted automatically based on a characteristic of an imaged object in said image.

Description

AUGMENTED REALITY BASED ON IMAGED OBJECT CHARACTERISTICS
Background
This relates generally to computers and, particularly, to augmented reality applications.
Augmented reality is the process of adding computer supplied content, including images, video, text, and other data, as layers on computer displayed images of the real world. For example, when a mobile device, such as a cellular telephone, captures an image of a scene including different buildings, there are applications that can add information about the buildings based on their global positioning system coordinates. For instance, the address of a building and a link to a real estate listing for the building may be provided.
Brief Description of the Drawings
Figure 1 is a depiction of an imaged scene with an overlaid marker in accordance with one embodiment of the present invention;
Figure 2 is a depiction of an imaged scene with the imaged object having moved (relative to Figure 1) relative to the overlaid marker in accordance with one embodiment of the present invention;
Figure 3 corresponds to Figure 2 with augmented reality in accordance with one embodiment of the present invention;
Figure 4 is a depiction of an imaged scene using augmented reality in accordance with another embodiment of the present invention;
Figure 5 is a flow chart for one embodiment of the present invention;
Figure 6 is a flow chart for another embodiment of the present invention; and
Figure 7 is a schematic depiction of one embodiment of the present invention.
Detailed Description
In some embodiments, augmented reality may guide human capture and playback of specific collections of digital media. These embodiments may leverage a combination of physical geometry of the space, human behavior, and programmed activities in order to create new and novel experiences. Embodiments may be applicable in gaming, community action, education, and photography, as examples.
In accordance with some embodiments of the present invention, based on a characteristic of an imaged object, augmented reality may be selectively applied to an image scene. For example, based on a characteristic of the image scene, such as the location of an object within the scene, recognition of the object, or recognition of a particular movement of the object, an augmented reality audio/visual object may be added to the scene. In this way, a computer supplied object may be overlaid on a real world image to augment the depiction.
In another embodiment of the present invention, a computer may place one or more markers on an imaged scene. Then the person capturing the image of the scene may encourage a person in the scene to interact with those markers, knowing that augmented reality will be applied based on the location of the markers.
Referring to Figure 1, an image object U, in this case a person, has an image arm A. The marker M is overlaid on the image by a computer. The overlaying of the marker may be done by applying an additional layer onto the image, which layer may be largely transparent so that the underlying image may be seen. The marker M may be a guide to indicate to the person capturing the image that an augmented reality object may be overlaid on the ultimate image at that location. The image object U may be a still or moving image.
The image may be captured by any device with still or moving image capture capabilities, including a camera, a video camera, a cellular telephone, a mobile Internet device, a television, or a laptop computer, to mention a few examples.
Referring next to Figure 2, the person capturing the image may encourage the user to extend the user's arm so his or her arm image A interacts with the overlaid marker M. The person capturing the image may encourage the arm movement, knowing that the marker M (that only the person capturing the image sees in this embodiment) marks the position where an overlaid augmented reality image will be inserted.
This insertion of an augmented reality image is shown in Figure 3, where the image of a butterfly O is overlaid ultimately at the position of the marker M. In this embodiment, the marker M is overlaid on the image as it is being captured. In other words, the marker M is applied to the image being captured in real time. Then it appears as if the butterfly magically landed on the user's hand.
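As an editorial illustration of this marker flow (Figures 1-3), the sketch below shows one plausible realization in Python with OpenCV: the guide marker is blended into the live preview that only the person capturing the image sees, and on capture the augmented reality object is composited at the marker's position. The file names, marker coordinates, key bindings, and sprites are assumptions for the sketch, not part of the disclosure; the sprites are assumed to be PNGs with an alpha channel that fit inside the frame.

```python
import cv2
import numpy as np

def overlay_sprite(frame, sprite_bgra, x, y):
    """Alpha-blend a BGRA sprite onto a BGR frame at (x, y).
    Assumes the sprite fits entirely inside the frame."""
    h, w = sprite_bgra.shape[:2]
    roi = frame[y:y + h, x:x + w].astype(np.float32)
    alpha = sprite_bgra[:, :, 3:4].astype(np.float32) / 255.0
    blended = alpha * sprite_bgra[:, :, :3] + (1.0 - alpha) * roi
    frame[y:y + h, x:x + w] = blended.astype(np.uint8)
    return frame

cap = cv2.VideoCapture(0)                                      # capture device
marker = cv2.imread("marker.png", cv2.IMREAD_UNCHANGED)        # guide marker M
butterfly = cv2.imread("butterfly.png", cv2.IMREAD_UNCHANGED)  # AR object O
mx, my = 400, 200                                              # assumed marker position

while True:
    ok, frame = cap.read()
    if not ok:
        break
    # The preview shown to the person capturing the image carries the
    # marker on an otherwise transparent layer; the subject does not see it.
    preview = overlay_sprite(frame.copy(), marker, mx, my)
    cv2.imshow("capture", preview)
    key = cv2.waitKey(1) & 0xFF
    if key == ord(" "):
        # On capture, the AR object is inserted at the marker position,
        # so the butterfly appears to land on the subject's hand.
        cv2.imwrite("augmented.png", overlay_sprite(frame, butterfly, mx, my))
    elif key == ord("q"):
        break

cap.release()
cv2.destroyAllWindows()
```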
In accordance with another embodiment of the present invention, a computer may recognize a characteristic of an imaged object using digital image based pattern recognition or image analysis. The characteristic may be, for example, shape, color, orientation, a gestural movement, or speed, to mention a few examples. Digital image based pattern recognition or analysis identifies the characteristic by analyzing the content of the digital image, in contrast to simply comparing the image to other known images of the exact same object to identify an unknown image. In one embodiment, the digital image based pattern recognition or analysis identifies a human form. A human form is any part of a human being, including the entire body, the face, or any appendage, as examples.
For example, the object itself may be recognized using digital image based pattern recognition or analysis to determine what the object is. Recognition of a predefined characteristic may be used to initiate the generation of augmented reality by overlaying another audio/visual object on the image scene.
Thus, in the case of Figure 4, a captured image of a girl wearing a magic cap is depicted. A computer system may detect the image of the cap and, based on that detection (using pattern recognition, for example), may automatically display an image of a fairy F on the hand of the depicted image of the girl.
As another example, the computer, again using video image analysis, can recognize the girl's outstretched arm. Recognition of the outstretched arm (effectively, a gestural command) may be the trigger to generate the fairy image F. As still another example, the computer may recognize a movement to outstretch the left arm and, based on this recognized movement, may generate the fairy image F.
In each case, a characteristic of the image of the object, such as its shape or gestural motion, is used to automatically overlay an audio/visual image object at a desired location within the display.
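A minimal sketch of this trigger, assuming OpenCV's stock Haar face detector stands in for the human-form recognition described above: when the predefined characteristic (here, a face) is found, a placeholder for the fairy image F is overlaid beside it. The input file, the placeholder graphic, and the placement offset are assumptions.

```python
import cv2

# Stock detector bundled with opencv-python; a stand-in for the patent's
# "digital image based pattern recognition or analysis".
cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")

frame = cv2.imread("scene.jpg")
gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
faces = cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)

for (x, y, w, h) in faces:
    # Recognition of the predefined characteristic initiates the overlay;
    # here a labeled box marks where the fairy image F would be drawn.
    cv2.rectangle(frame, (x, y), (x + w, y + h), (0, 255, 0), 2)
    cv2.putText(frame, "F", (x + w + 10, y + h // 2),
                cv2.FONT_HERSHEY_SIMPLEX, 1.0, (255, 0, 255), 2)

cv2.imwrite("scene_augmented.jpg", frame)
```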
In other embodiments, a given characteristic of an image object may be used to generate audio. For example, when the imaged object is recognized as a conductor directing an orchestra, the sound of an orchestra may be automatically added to the audio/visual media.
As an additional example, an image scene from a fixed camera may be analyzed to recognize a vehicle moving within an intersection at the time when a red light is visible. The computer may automatically overlay the word "violation" on the image to assist an officer in implementing a red light camera traffic enforcement system. As another traffic application, a fixed camera on a roadside may image cars going by. The captured image of a car going faster than the speed limit may be overlaid with the word "violation."
As still another example, a security camera may detect a person at an unauthorized location and may overlay the object with the word "violation" or may, by speech synthesis, say the word "intruder."
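The enforcement examples reduce to the same pattern: detect a characteristic (motion in a monitored region while a condition holds) and overlay text. Below is a rough sketch using frame differencing; the zone, thresholds, input clip, and the red-light flag are all assumptions standing in for real signal-state inputs.

```python
import cv2

cap = cv2.VideoCapture("intersection.mp4")
ok, prev = cap.read()
zone = (100, 100, 300, 200)   # x, y, w, h of the monitored region (assumed)
red_light_on = True           # stand-in for the actual signal state

while ok:
    ok, frame = cap.read()
    if not ok:
        break
    x, y, w, h = zone
    # Crude motion test: pixel difference inside the zone between frames.
    a = cv2.cvtColor(frame[y:y + h, x:x + w], cv2.COLOR_BGR2GRAY)
    b = cv2.cvtColor(prev[y:y + h, x:x + w], cv2.COLOR_BGR2GRAY)
    diff = cv2.threshold(cv2.absdiff(a, b), 25, 255, cv2.THRESH_BINARY)[1]
    if red_light_on and cv2.countNonZero(diff) > 500:
        # Overlay the word "violation" to assist the reviewing officer.
        cv2.putText(frame, "violation", (x, y - 10),
                    cv2.FONT_HERSHEY_SIMPLEX, 1.2, (0, 0, 255), 3)
    prev = frame
    cv2.imshow("enforcement", frame)
    if cv2.waitKey(1) & 0xFF == ord("q"):
        break

cap.release()
cv2.destroyAllWindows()
```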
In many cases, a characteristic of the imaged object (other than its global positioning system (GPS) coordinates, which are not a characteristic of the imaged object) may be used to generate augmented reality. In some embodiments, global positioning system coordinates may also be used in addition to non-GPS based characteristics.
Augmented reality overlays may be provided in real time at the time of image capture or may be overlaid later using digital image based content recognition of the captured scene or series of frames. For example, an extended moving picture file may be analyzed to search for particularly shaped objects and, when those objects are found, augmented reality may be added to enhance the depiction.
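For the post-hoc path, a sketch of scanning a saved clip for "particularly shaped objects" might look like the following, using contour circularity as the shape test; the file name, area threshold, and circularity cutoff are illustrative assumptions.

```python
import cv2
import math

cap = cv2.VideoCapture("archive.mp4")
hits, frame_index = [], 0

while True:
    ok, frame = cap.read()
    if not ok:
        break
    edges = cv2.Canny(cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY), 50, 150)
    contours, _ = cv2.findContours(edges, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)
    for c in contours:
        area = cv2.contourArea(c)
        if area < 500:
            continue
        perimeter = cv2.arcLength(c, True)
        circularity = 4 * math.pi * area / (perimeter * perimeter)
        if circularity > 0.8:          # roughly circular object found
            hits.append(frame_index)   # flag the frame for later augmentation
            break
    frame_index += 1

cap.release()
print("frames containing the target shape:", hits)
```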
Referring to Figure 5, the sequence 10 may be used in an embodiment such as the one depicted in Figures 1-3. The sequence 10 may be implemented in software, hardware, and/or firmware. In software or firmware based embodiments, the sequence may be implemented by computer readable instructions stored on a non-transitory computer readable medium, such as a semiconductor, magnetic, or optical memory.
At block 12, guide markers are automatically overlaid on an imaged object as the depiction is being captured as a still or moving picture. In some embodiments, the overlaid marker or markers may be overlaid as a layer that overlays the imaged picture, the marker being non-transparent, but the rest of the overlay being transparent.
At block 14, the user capturing the images may be prompted to prompt the subject to move in a desired way to interact with the marker so that the desired effect may be achieved through the application of augmented reality.
Then, the augmented reality audio/visual object may be automatically applied over the existing scene, as depicted in block 16, in some embodiments. The application of augmented reality may be the result of a user input command in one embodiment. In another embodiment, it may occur after the marker has been displayed for a time period. In one embodiment, the marker and the object may be the same.
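A compact sketch of blocks 12-16 follows, assuming hypothetical helpers get_frame, draw_marker, draw_object, and show supplied by the host capture application: the guide marker is displayed first, and the AR object is applied automatically once the marker has been on screen for a fixed interval (the timed trigger mentioned above); block 14, the prompt to the subject, is left to the operator.

```python
import time

MARKER_HOLD_SECONDS = 3.0  # assumed interval before the AR object appears

def run_sequence(get_frame, draw_marker, draw_object, show):
    """get_frame/draw_marker/draw_object/show are hypothetical callables
    provided by the capture application; get_frame returns None at end."""
    started = time.monotonic()
    while True:
        frame = get_frame()
        if frame is None:
            break
        if time.monotonic() - started < MARKER_HOLD_SECONDS:
            show(draw_marker(frame))   # block 12: overlay the guide marker
        else:
            show(draw_object(frame))   # block 16: apply the AR object instead
```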
The sequence 20, shown in Figure 6, may be used, for example, to implement embodiments such as the one depicted in Figure 4. Again, the sequence 20 may be implemented in software, firmware, and/or hardware. In software and firmware embodiments, the sequence may be implemented by computer readable instructions stored on a non-transitory computer readable medium, such as a semiconductor, optical, or magnetic memory.
The sequence may begin by receiving an image file that may be composed of a still image or a series of frames of a moving image, as indicated in block 22. Then, a given characteristic of an imaged object is detected (block 24). As described above, a variety of image characteristics of the image itself, not the real world object (i.e., not its GPS coordinate), may be used to trigger the generation of augmented reality. Again, examples of such characteristics of the image include shape recognition, movement, speed, gestural commands, color, and position within the imaged scene relative to one or more other depicted objects.
For example, in a computer animation, two players may be driving race cars and when the system detects that the race cars come together, the system may generate a crash image or a crash sound, overlaid on the ongoing depiction. Such an embodiment may be described as augmented virtual reality, but since the race car image was generated in the real world, this is actually another example of augmented reality. Finally, in block 26, the augmented reality overlay is overlaid over the existing captured or computer generated image.
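Putting the Figure 6 blocks together, a sketch of a file-based pipeline might read as below: receive a clip (block 22), detect a characteristic per frame (block 24, reusing a stock face detector as a stand-in), and overlay AR content before writing the result (block 26). File names and codec choices are assumptions.

```python
import cv2

cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")

video_in = cv2.VideoCapture("clip.mp4")                  # block 22: receive file
fps = video_in.get(cv2.CAP_PROP_FPS) or 30.0
size = (int(video_in.get(cv2.CAP_PROP_FRAME_WIDTH)),
        int(video_in.get(cv2.CAP_PROP_FRAME_HEIGHT)))
video_out = cv2.VideoWriter("clip_ar.mp4",
                            cv2.VideoWriter_fourcc(*"mp4v"), fps, size)

while True:
    ok, frame = video_in.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    for (x, y, w, h) in cascade.detectMultiScale(gray, 1.1, 5):  # block 24
        # Block 26: overlay the AR content near the detected characteristic.
        cv2.putText(frame, "AR", (x, max(y - 10, 0)),
                    cv2.FONT_HERSHEY_SIMPLEX, 1.0, (255, 0, 255), 2)
    video_out.write(frame)

video_in.release()
video_out.release()
```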
Referring to Figure 7, in accordance with one embodiment, a computer 30 for implementing embodiments of the present invention may include a display screen 32 with an integrated video camera 34, in some embodiments. Of course, the video camera 34 may be separate from the computer system 30 and/or the display screen 32. The display screen 32 is coupled to a bus 38 by a display interface 36.
The bus 38 may be conventionally coupled to a processor 40 and a system memory 42. The processor may be any controller, including a central processing unit or a graphics processing unit. In some embodiments, the system memory 42 may store the computer readable instructions implementing the sequences 10 and/or 20, in the case where the sequences 10 and/or 20 are implemented by firmware or software.
The embedded augmented reality layer may have the following characteristics, in some embodiments:
- the layer may be "free form" - i.e., it responds to real world real time events, not just to pre-programmed or pre-loaded events;
- the layer may be transitory (visible during capture as a guide, but not transferred to the media output) or integrated (i.e., visible during capture and integrated into the media output);
- the guidance provided by the layer may be context aware, and may reflect one or more of the following variables: the location of the subject, the geometry of the space, the movement within the frame, the RGB image content of the frame, and/or other sensor data, such as noise, heat, electrical charge, or wireless signal; and/or
- the augmented reality layer may interact with the human subject capturing media, to direct that capture toward a programmed objective.
An embodiment may leverage human behavior. A user at a theme park, waiting in line for an attraction, can play with or tell stories with characters from the theme park, and create a take-away "movie" of his or her experience:
- user A launches the fairy story application and points an image capture device at user B;
- user A sees characters on the screen, which respond to the movement and interaction of user B;
- the interaction is captured (integrated with the augmented digital media) and can be played back on the capture device, displayed real time on a screen in line, or sent home as a movie;
- user A (with the image capture device) can leverage his or her augmented reality application to direct the scene to other players within the space (i.e., by changing the focus of the camera, user A can send the fairy to another person in line). The reactions of user C continue to inform the augmented reality behavior of the fairy.
This embodiment also illustrates how visible real time playback can be used to influence capture, specifically:
- the application in this situation is programmed to allow users to share their capture on the screens provided in line;
- both the human subject (as determined by the view finder) and the animation (digital overlay) play in real time on the screen. User A then "directs" a scene in which the fairies visit and interact with different people in line. The subject's gestures and reactions (laughter, annoyance) are all recognized by the system, and the digital animation layer changes its behavior based on the subject's reaction.
The graphics processing techniques described herein may be implemented in various hardware architectures. For example, graphics functionality may be integrated within a chipset. Alternatively, a discrete graphics processor may be used. As still another embodiment, the graphics functions may be implemented by a general purpose processor, including a multicore processor.
References throughout this specification to "one embodiment" or "an embodiment" mean that a particular feature, structure, or characteristic described in connection with the embodiment is included in at least one implementation encompassed within the present invention. Thus, appearances of the phrase "one embodiment" or "in an embodiment" are not necessarily referring to the same embodiment. Furthermore, the particular features, structures, or characteristics may be instituted in other suitable forms other than the particular embodiment illustrated and all such forms may be encompassed within the claims of the present application.
While the present invention has been described with respect to a limited number of embodiments, those skilled in the art will appreciate numerous modifications and variations therefrom. It is intended that the appended claims cover all such modifications and variations as fall within the true spirit and scope of this present invention.

Claims

What is claimed is:
1. A method comprising:
using digital image based image analysis to detect a characteristic of an imaged object displayed on a display screen; and
based on said characteristic, overlaying an audio or visual object on said display screen.
2. The method of claim 1 wherein using digital image based image analysis includes detecting a human form.
3. The method of claim 1 including analyzing an image in association with said human form.
4. The method of claim 1 including recognizing a characteristic that is a gestural command.
5. The method of claim 1 including recognizing a shape, color, orientation, or speed of the imaged object.
6. A method comprising:
overlaying a marker on the display of an imaged scene; and
using the marker to augment reality.
7. The method of claim 6 wherein using includes applying a computer generated image to said display.
8. The method of claim 7 including using a marker that is the same as the image.
9. The method of claim 7 including replacing said marker with a computer generated image.
10. A non-transitory computer readable medium storing instructions to enable a processor-based device to:
use digital image based image analysis to identify a characteristic of an imaged object and, based on that characteristic, overlay an audio or visual object on the display screen.
11. The medium of claim 10 further storing instructions to use said digital image based image analysis to detect a human form.
12. The medium of claim 10 further storing instructions to analyze an image in association with the human form.
13. The medium of claim 10 further storing instructions to recognize a characteristic in the form of a gestural command.
14. The medium of claim 10 further storing instructions to recognize shape, color, orientation, or speed of an imaged object.
15. The medium of claim 10 further storing instructions to overlay an indicator on an uncaptured image and to use the indicator as a marker to position an augmented reality depiction.
16. The medium of claim 15 further storing instructions to use the marker to apply a computer generated image to the display at the position of the marker.
17. The medium of claim 16 further storing instructions to use the marker that is the same as the image.
18. The medium of claim 16 further storing instructions to replace the marker with a computer generated image.
19. An apparatus comprising:
an image capture device;
a processor coupled to said image capture device; and
said processor to overlay a marker on an image display and to use said marker for augmented reality.
20. The apparatus of claim 19, said processor to substitute the augmented reality image for said marker in a captured depiction.
21. The apparatus of claim 19, said processor to overlay said marker in a depiction of a scene in said image capture device before an image of the scene is captured.
22. The apparatus of claim 19 including a display screen coupled to said processor.
23. The apparatus of claim 22, said processor to use digital image based image analysis to identify a characteristic of an imaged object and, based on that characteristic, overlay an audio or visual object on the display.
24. The apparatus of claim 23, said processor to use said digital image based image analysis to detect a human form.
25. The apparatus of claim 24, said processor to analyze an image in association with the human form.
26. The apparatus of claim 24, said processor to recognize a characteristic in the form of a gestural command.
PCT/US2011/050879 2011-09-08 2011-09-08 Augmented reality based on imaged object characteristics WO2013036233A1 (en)

Priority Applications (7)

Application Number Priority Date Filing Date Title
PCT/US2011/050879 WO2013036233A1 (en) 2011-09-08 2011-09-08 Augmented reality based on imaged object characteristics
KR1020147006043A KR20140045574A (en) 2011-09-08 2011-09-08 Augmented reality based on imaged object characteristics
EP11871960.8A EP2754289A4 (en) 2011-09-08 2011-09-08 Augmented reality based on imaged object characteristics
JP2014529651A JP2014531644A (en) 2011-09-08 2011-09-08 Augmented reality based on the characteristics of the object being imaged
CN201180073313.4A CN103765867A (en) 2011-09-08 2011-09-08 Augmented reality based on imaged object characteristics
US13/993,220 US20130265333A1 (en) 2011-09-08 2011-09-08 Augmented Reality Based on Imaged Object Characteristics
KR1020157013675A KR101773018B1 (en) 2011-09-08 2011-09-08 Augmented reality based on imaged object characteristics

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/US2011/050879 WO2013036233A1 (en) 2011-09-08 2011-09-08 Augmented reality based on imaged object characteristics

Publications (1)

Publication Number Publication Date
WO2013036233A1 (en) 2013-03-14

Family

ID=47832472

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/US2011/050879 WO2013036233A1 (en) 2011-09-08 2011-09-08 Augmented reality based on imaged object characteristics

Country Status (6)

Country Link
US (1) US20130265333A1 (en)
EP (1) EP2754289A4 (en)
JP (1) JP2014531644A (en)
KR (2) KR101773018B1 (en)
CN (1) CN103765867A (en)
WO (1) WO2013036233A1 (en)

Families Citing this family (19)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9286711B2 (en) 2011-09-30 2016-03-15 Microsoft Technology Licensing, Llc Representing a location at a previous time period using an augmented reality display
US9345957B2 (en) 2011-09-30 2016-05-24 Microsoft Technology Licensing, Llc Enhancing a sport using an augmented reality display
US9285871B2 (en) * 2011-09-30 2016-03-15 Microsoft Technology Licensing, Llc Personal audio/visual system for providing an adaptable augmented reality environment
US20140015826A1 (en) * 2012-07-13 2014-01-16 Nokia Corporation Method and apparatus for synchronizing an image with a rendered overlay
US11042607B2 (en) * 2013-08-23 2021-06-22 Nant Holdings Ip, Llc Recognition-based content management, systems and methods
US9615177B2 (en) * 2014-03-06 2017-04-04 Sphere Optics Company, Llc Wireless immersive experience capture and viewing
US9826164B2 (en) 2014-05-30 2017-11-21 Furuno Electric Co., Ltd. Marine environment display device
US10134187B2 (en) 2014-08-07 2018-11-20 Somo Innvoations Ltd. Augmented reality with graphics rendering controlled by mobile device position
US20160171739A1 (en) * 2014-12-11 2016-06-16 Intel Corporation Augmentation of stop-motion content
US20170124890A1 (en) * 2015-10-30 2017-05-04 Robert W. Soderstrom Interactive table
US9996978B2 (en) 2016-02-08 2018-06-12 Disney Enterprises, Inc. System and method of simulating first-person control of remote-controlled vehicles
US9922465B2 (en) 2016-05-17 2018-03-20 Disney Enterprises, Inc. Systems and methods for changing a perceived speed of motion associated with a user
US10169918B2 (en) 2016-06-24 2019-01-01 Microsoft Technology Licensing, Llc Relational rendering of holographic objects
US10074205B2 (en) 2016-08-30 2018-09-11 Intel Corporation Machine creation of program with frame analysis method and apparatus
DE102016121281A1 (en) 2016-11-08 2018-05-09 3Dqr Gmbh Method and device for superimposing an image of a real scene with virtual image and audio data and a mobile device
US11487988B2 (en) 2017-08-31 2022-11-01 Ford Global Technologies, Llc Augmenting real sensor recordings with simulated sensor data
US11455565B2 (en) 2017-08-31 2022-09-27 Ford Global Technologies, Llc Augmenting real sensor recordings with simulated sensor data
KR102614048B1 (en) 2017-12-22 2023-12-15 삼성전자주식회사 Electronic device and method for displaying object for augmented reality
US11393282B2 (en) 2019-10-09 2022-07-19 Sg Gaming, Inc. Systems and devices for identification of a feature associated with a user in a gaming establishment and related methods

Family Cites Families (26)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CA2227361A1 (en) * 1998-01-19 1999-07-19 Taarna Studios Inc. Method and apparatus for providing real-time animation utilizing a database of expressions
GB0031016D0 (en) * 2000-12-20 2001-01-31 Alphafox Systems Ltd Security systems
US6857746B2 (en) * 2002-07-01 2005-02-22 Io2 Technology, Llc Method and system for free-space imaging display and interface
US8100552B2 (en) * 2002-07-12 2012-01-24 Yechezkal Evan Spero Multiple light-source illuminating system
US8896725B2 (en) * 2007-06-21 2014-11-25 Fotonation Limited Image capture device with contemporaneous reference image capture mechanism
JP4473754B2 (en) * 2005-03-11 2010-06-02 株式会社東芝 Virtual fitting device
JP2007235642A (en) * 2006-03-02 2007-09-13 Hitachi Ltd Obstruction detecting system
EP1840798A1 (en) * 2006-03-27 2007-10-03 Sony Deutschland Gmbh Method for classifying digital image data
AU2006352758A1 (en) * 2006-04-10 2008-12-24 Avaworks Incorporated Talking Head Creation System and Method
US9052294B2 (en) * 2006-05-31 2015-06-09 The Boeing Company Method and system for two-dimensional and three-dimensional inspection of a workpiece
JP5012373B2 (en) * 2007-09-28 2012-08-29 カシオ計算機株式会社 Composite image output apparatus and composite image output processing program
DE102007059478B4 (en) * 2007-12-11 2014-06-26 Kuka Laboratories Gmbh Method and system for aligning a virtual model with a real object
US9269090B2 (en) * 2008-08-18 2016-02-23 Nokia Technologies Oy Method, apparatus and computer program product for providing indications regarding recommended content
US8023160B2 (en) * 2008-09-10 2011-09-20 Xerox Corporation Encoding message data in a cover contone image via halftone dot orientation
JP5210820B2 (en) * 2008-11-17 2013-06-12 株式会社東芝 Status notification device
EP2539759A1 (en) * 2010-02-28 2013-01-02 Osterhout Group, Inc. Local advertising content on an interactive head-mounted eyepiece
US20110214082A1 (en) * 2010-02-28 2011-09-01 Osterhout Group, Inc. Projection triggering through an external marker in an augmented reality eyepiece
US20140063055A1 (en) * 2010-02-28 2014-03-06 Osterhout Group, Inc. Ar glasses specific user interface and control interface based on a connected external device type
US20110313779A1 (en) * 2010-06-17 2011-12-22 Microsoft Corporation Augmentation and correction of location based data through user feedback
KR101286866B1 (en) * 2010-10-13 2013-07-17 주식회사 팬택 User Equipment and Method for generating AR tag information, and system
KR101260576B1 (en) * 2010-10-13 2013-05-06 주식회사 팬택 User Equipment and Method for providing AR service
US20120147246A1 (en) * 2010-12-13 2012-06-14 Research In Motion Limited Methods And Apparatus For Use In Enabling An Efficient Review Of Photographic Images Which May Contain Irregularities
CN102110379A (en) * 2011-02-22 2011-06-29 黄振强 Multimedia reading matter giving readers enhanced feeling of reality
US9147014B2 (en) * 2011-08-31 2015-09-29 Woodtech Measurement Solutions System and method for image selection of bundled objects
CN104582622B (en) * 2012-04-16 2017-10-13 儿童国家医疗中心 For the tracking in surgery and intervention medical procedure and the bimodulus stereo imaging system of control
US8873818B1 (en) * 2013-01-11 2014-10-28 E. Theodore Ostermann System and method for image analysis with characteristic curves

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20090087807A (en) * 2008-02-13 2009-08-18 세종대학교산학협력단 Method for implementing augmented reality
KR20110084748A (en) * 2010-01-18 2011-07-26 (주)엔시드코프 Augmented reality apparatus and method for supporting interactive mode
KR20110088778A (en) * 2010-01-29 2011-08-04 주식회사 팬택 Terminal and method for providing augmented reality

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
See also references of EP2754289A4 *

Cited By (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2014229090A (en) * 2013-05-23 2014-12-08 株式会社電通 Image sharing system
CN107079139A (en) * 2014-04-30 2017-08-18 图片动态有限公司 There is no the augmented reality of physical trigger
EP3138284A4 (en) * 2014-04-30 2017-11-29 Aurasma Limited Augmented reality without a physical trigger
WO2015175730A1 (en) * 2014-05-13 2015-11-19 Nant Vision, Inc. Augmented reality content rendering via albedo models, systems and methods
US9805510B2 (en) 2014-05-13 2017-10-31 Nant Holdings Ip, Llc Augmented reality content rendering via albedo models, systems and methods
US10192365B2 (en) 2014-05-13 2019-01-29 Nant Holdings Ip, Llc Augmented reality content rendering via albedo models, systems and methods
US10685498B2 (en) 2014-05-13 2020-06-16 Nant Holdings Ip, Llc Augmented reality content rendering via albedo models, systems and methods
US11176754B2 (en) 2014-05-13 2021-11-16 Nant Holdings Ip, Llc Augmented reality content rendering via albedo models, systems and methods
US11710282B2 (en) 2014-05-13 2023-07-25 Nant Holdings Ip, Llc Augmented reality content rendering via Albedo models, systems and methods
CN110832525A (en) * 2017-06-28 2020-02-21 三星电子株式会社 Augmented reality advertising on objects
US11526935B1 (en) * 2018-06-13 2022-12-13 Wells Fargo Bank, N.A. Facilitating audit related activities
US11823262B1 (en) 2018-06-13 2023-11-21 Wells Fargo Bank, N.A. Facilitating audit related activities

Also Published As

Publication number Publication date
JP2014531644A (en) 2014-11-27
EP2754289A4 (en) 2016-05-18
KR20140045574A (en) 2014-04-16
KR20150068489A (en) 2015-06-19
CN103765867A (en) 2014-04-30
EP2754289A1 (en) 2014-07-16
US20130265333A1 (en) 2013-10-10
KR101773018B1 (en) 2017-08-30

Similar Documents

Publication Publication Date Title
US20130265333A1 (en) Augmented Reality Based on Imaged Object Characteristics
US10536661B2 (en) Tracking object of interest in an omnidirectional video
WO2018072652A1 (en) Video processing method, video processing device, and storage medium
US9349218B2 (en) Method and apparatus for controlling augmented reality
TWI534654B (en) Method and computer-readable media for selecting an augmented reality (ar) object on a head mounted device (hmd) and head mounted device (hmd)for selecting an augmented reality (ar) object
US10255690B2 (en) System and method to modify display of augmented reality content
JP6630665B2 (en) Correlation display of biometric ID, feedback and user interaction status
CN106464773B (en) Augmented reality device and method
CN109416562B (en) Apparatus, method and computer readable medium for virtual reality
US10771707B2 (en) Information processing device and information processing method
EP3236336B1 (en) Virtual reality causal summary content
WO2016151956A1 (en) Information processing system and information processing method
CN116710878A (en) Context aware augmented reality system
WO2023075973A1 (en) Tracking a handheld device
WO2022040574A1 (en) Integrating overlaid digital content into displayed data via graphics processing circuitry
JP2017126899A (en) Image processing device and image processing method
KR102635477B1 (en) Device for providing performance content based on augmented reality and method therefor
JP2004287004A (en) Display system
AU2015264917A1 (en) Methods for video annotation

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application (Ref document number: 11871960; Country of ref document: EP; Kind code of ref document: A1)
WWE Wipo information: entry into national phase (Ref document number: 13993220; Country of ref document: US)
WWE Wipo information: entry into national phase (Ref document number: 2011871960; Country of ref document: EP)
ENP Entry into the national phase (Ref document number: 2014529651; Country of ref document: JP; Kind code of ref document: A)
ENP Entry into the national phase (Ref document number: 20147006043; Country of ref document: KR; Kind code of ref document: A)
NENP Non-entry into the national phase (Ref country code: DE)