WO2020181086A1 - Virtual and augmented reality system and kit - Google Patents

Virtual and augmented reality system and kit

Info

Publication number
WO2020181086A1
Authority
WO
WIPO (PCT)
Prior art keywords
augmented reality
reality content
mobile device
linked
printed media
Prior art date
Application number
PCT/US2020/021190
Other languages
English (en)
Inventor
Steve RAD
Original Assignee
Rad Steve
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Priority claimed from US16/294,706 external-priority patent/US10430658B2/en
Application filed by Rad Steve filed Critical Rad Steve
Publication of WO2020181086A1 publication Critical patent/WO2020181086A1/fr

Classifications

    • G: PHYSICS
    • G09: EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09B: EDUCATIONAL OR DEMONSTRATION APPLIANCES; APPLIANCES FOR TEACHING, OR COMMUNICATING WITH, THE BLIND, DEAF OR MUTE; MODELS; PLANETARIA; GLOBES; MAPS; DIAGRAMS
    • G09B 5/00: Electrically-operated educational appliances
    • G09B 5/06: Electrically-operated educational appliances with both visual and audible presentation of the material to be studied
    • G09B 5/062: Combinations of audio and printed presentations, e.g. magnetically striped cards, talking books, magnetic tapes with printed texts thereon
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/14: Digital output to display device; Cooperation and interconnection of the display device with other functional units
    • G06F 3/147: Digital output to display device; Cooperation and interconnection of the display device with other functional units using display panels
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06V: IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V 20/00: Scenes; Scene-specific elements
    • G06V 20/20: Scenes; Scene-specific elements in augmented reality scenes
    • G: PHYSICS
    • G09: EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09G: ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G 2340/00: Aspects of display data processing
    • G09G 2340/12: Overlay of images, i.e. displayed pixel being the result of switching between the corresponding input pixels
    • G: PHYSICS
    • G09: EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09G: ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G 2354/00: Aspects of interface with display user

Definitions

  • the present invention relates generally to an augmented and virtual reality system and kit, and, more particularly, to an augmented and virtual reality system that allows a user to perform a series of actions using a printed media member and associated instruments.
  • Virtual reality and augmented reality systems have become increasingly popular in recent years.
  • prior art AR utilizes a tablet or mobile device in full screen mode, allowing the user to augment something in front of them, e.g., Snapchat face filters.
  • with face filters and the like, the user typically has their phone in one hand.
  • the smartphone is placed in the goggles with the camera lens exposed so that the user can be hands-free.
  • the app shows the video in split screen mode so that it can be viewed properly through the goggles.
  • a method of participating in an augmented reality experience includes obtaining or providing a printed media member that includes at least a first image target that includes first linked augmented reality content, obtaining or providing a head mounted device that includes a mobile device securing member, obtaining or providing a mobile device with a camera that has a camera lens that includes software running thereon that is in communication with a target database, securing the mobile device in the head mounted device using the mobile device securing member, and orienting the mobile device such that the camera lens is directed toward the printed media member.
  • when the first image target is recognized by the software, the first linked augmented reality content is displayed on the mobile device.
  • the method also includes viewing the first linked augmented reality content through the head mounted device.
  • the first linked augmented reality content is displayed in first and second screen frames on the screen of the mobile device.
  • the head mounted device includes first and second lenses.
  • the viewing step of the method includes viewing the first and second screen frames through the first and second lenses.
  • the printed media member includes at least first and second pages where the first image target is disposed on the first page and the second page includes a second image target that includes second linked augmented reality content.
  • the printed media member can include any number of pages and any number of image targets.
  • the printed media member and head mounted device are part of a kit and the method includes the step of obtaining at least a first instrument that is part of the kit.
  • the first linked augmented reality content includes audio that directs a user to utilize the first instrument.
  • the method also includes obtaining or providing at least a second instrument that is not part of the kit and the first linked augmented reality content includes audio that directs a user to utilize the second instrument.
  • the method includes the step of obtaining at least a first instrument that is part of the kit and the first linked augmented reality content includes audio that directs a user to utilize the first instrument.
  • the method includes obtaining or providing at least a second instrument that is not part of the kit and the second linked augmented reality content includes audio that directs a user to utilize the second instrument.
  • the use of the instruments whether they are part of the kit or not can be associated with one or more image targets and the augmented reality content associated therewith.
  • the first linked augmented reality content is one of three-dimensional type augmented reality content or two-dimensional type augmented reality content.
  • the two-dimensional type augmented reality content may augment within a frame located on the printed media member.
  • the frame may not be an actual visible frame. However, the video appears to play within a certain space on the page.
  • the first linked augmented reality content is three-dimensional type augmented reality content and the second linked augmented reality content is two-dimensional type augmented reality content that augments within a frame located on the printed media member.
  • augmented reality system that includes a printed media member that has at least a first image target, a head mounted device that includes a mobile device securing member, and software that is configured to run on a mobile device that includes a camera that has a camera lens.
  • the software is in communication with a target database (either on a remote server or within the software) that includes information related to the first image target, and wherein the software includes first augmented reality content that is associated with the first image target.
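As a rough illustration of the relationship described above, the sketch below models a target database as a simple mapping from image-target identifiers to linked augmented reality content, checked first in the copy bundled with the software and then in an optional remote copy. The class names, identifiers, and asset paths are hypothetical; the publication does not specify a data model.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class ArContent:
    """Augmented reality content linked to one image target."""
    content_id: str
    kind: str                         # "3d" (animated character) or "2d" (framed video)
    media_path: str                   # animation or video asset
    audio_path: Optional[str] = None  # optional narration track

# Target database bundled with the software: image-target id -> linked content.
LOCAL_TARGET_DB = {
    "page1_intro": ArContent("intro", "3d", "assets/professor_intro.glb",
                             "assets/professor_intro.mp3"),
    "page2_step1": ArContent("step1", "2d", "assets/step1_video.mp4"),
}

def lookup_content(target_id: str,
                   remote_db: Optional[dict] = None) -> Optional[ArContent]:
    """Return the content linked to a recognized image target.

    The database packaged with the app is checked first; a remote database
    (e.g., fetched from a server over the network) is used as a fallback.
    """
    content = LOCAL_TARGET_DB.get(target_id)
    if content is None and remote_db is not None:
        content = remote_db.get(target_id)
    return content

# Example: the first page's target maps to the Professor's 3D introduction.
print(lookup_content("page1_intro").kind)  # "3d"
```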
  • the printed media member and head mounted device are part of a kit that also includes at least first and second instruments.
  • the first augmented reality content includes audio instructions regarding a first task that includes use of the first instrument.
  • the first augmented reality content also includes audio instructions regarding a second task that includes the use of a third instrument that is not part of the kit.
  • the printed media member includes a second image target, the target database includes information related to the second image target, and second augmented reality content that is associated with the second image target is part of the software.
  • the second augmented reality content includes audio instructions regarding a third task that includes use of the second instrument.
  • the first augmented reality content includes video related to the first task that augments within a frame located on the printed media member.
  • the video includes a demonstration using the first instrument.
  • the first instrument is not the exact same instrument that came with the kit, but is the same type of instrument. For example, if the first instrument is a measuring cup that came with the kit, the first instrument in the video is a similar measuring cup.
  • the present invention includes a kit and system that
  • the goggles are not actually VR goggles, meaning they do not include VR capability built in to the goggles.
  • the goggles, helmet, or other head mounted device
  • the app can also have the capability of not showing the video in split screen mode so that it can be viewed directly on the screen of the phone and without having to use goggles.
  • the AR device is a mobile device, which may be a smart phone (e.g., iPhone, Google phone, or other phones running Android, Windows Mobile, or other operating systems), a tablet computer (e.g., iPad, Galaxy), personal digital assistant (PDA), a notebook computer, or various other types of wireless or wired computing devices, that includes associated software running thereon (typically in the form of an app).
  • the AR device is coupled or paired with the goggles or other head mounted device so that the user can view the content on the screen of the AR device and be provided a simultaneous real-world view.
  • the mobile device (AR device) together with the goggles (head mounted device) are referred to herein as the AR assembly.
  • the VR goggles are used for an AR experience and can also be used for a VR experience.
  • the AR experience involves an educational chemistry or science lesson or lessons that takes place on the pages of the book.
  • this is not a limitation on the present invention.
  • upon launch of the Professor Maxwell's 4D Lab app, the user chooses the program for the proper product (e.g., Professor Maxwell's 4D Chemistry). The user then places the smartphone into the goggles. Next, once the book is opened to the first page and the user "looks at" the page through the goggles and the app, via the camera, the app recognizes by scanning an AR target on the page and the app renders or augments the AR content on the screen of the smartphone (or other AR device).
  • the AR content rendered by the first target may be Professor Maxwell giving an introduction to the lesson. As different AR targets are scanned and recognized, different AR content is rendered or augmented.
  • the second AR content may be two glasses with the professor telling the user to fill the physical two glasses (which are part of the instruments) as part of the experiment and lesson.
  • the app scans/reads the page in front of the user and triggers the professor to "come alive" on the page and begin his introduction about the specific project and then walk through the steps of the lesson.
  • the passive scanning relieves users of the need to actively search for the AR targets. Instead, as soon as a target is scanned, the AR content is augmented and displayed on the smartphone screen and viewed by the user through the goggles. Augmented reality target detection is taught in U.S. Patent No. 9,401,048, the entirety of which is incorporated herein by reference.
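The publication incorporates target detection by reference rather than specifying an algorithm; the sketch below shows one conventional way such passive scanning could be implemented, using OpenCV ORB feature matching against precomputed descriptors for each printed image target. The matching threshold, callback, and function names are assumptions, not details from the patent.

```python
import cv2

ORB = cv2.ORB_create(nfeatures=1000)
MATCHER = cv2.BFMatcher(cv2.NORM_HAMMING)
MIN_GOOD_MATCHES = 25  # assumed threshold; tune per printed target

def build_target_index(target_images):
    """Precompute ORB descriptors for each image target printed in the book."""
    index = {}
    for target_id, path in target_images.items():
        img = cv2.imread(path, cv2.IMREAD_GRAYSCALE)
        _, descriptors = ORB.detectAndCompute(img, None)
        index[target_id] = descriptors
    return index

def recognize_target(frame, index):
    """Return the id of the first target whose features match the camera frame."""
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    _, frame_desc = ORB.detectAndCompute(gray, None)
    if frame_desc is None:
        return None
    for target_id, target_desc in index.items():
        if target_desc is None:
            continue
        pairs = MATCHER.knnMatch(target_desc, frame_desc, k=2)
        good = [p[0] for p in pairs
                if len(p) == 2 and p[0].distance < 0.75 * p[1].distance]
        if len(good) >= MIN_GOOD_MATCHES:
            return target_id
    return None

def passive_scan_loop(index, on_target):
    """Scan camera frames continuously; fire the callback when a target appears."""
    cap = cv2.VideoCapture(0)
    current = None
    while cap.isOpened():
        ok, frame = cap.read()
        if not ok:
            break
        target_id = recognize_target(frame, index)
        if target_id and target_id != current:
            current = target_id
            on_target(target_id)  # e.g., look up and augment the linked content
    cap.release()
```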
  • each book includes one or more lesson plans that are each a unique, educational and lesson based product, in which the professor has a specific agenda per project.
  • Each separate project or lesson can be on a single or multiple pages.
  • the user learns about static electricity, as it is explained by the professor.
  • Each step by step instruction by the professor is in AR, so once the user starts the project, the user watches the professor bring each of the steps "to life," showing the user the step by step instructions as he is animated and walks around the page explaining to the user the steps of the process.
  • the professor may instruct the user to use the instruments that come in the kit.
  • Some of the instruments necessary for the project can be provided by the user.
  • the other instruments can be, for example, a set of science materials, e.g., beakers, test tubes, baking soda, magnifying glass, etc.
  • Some or all of the instruments are preferably related to a separate project or lesson within the set, and are used with the book, app and goggles.
  • the goggles include a hollow main body portion and a holder, clamp or support member that is spaced from the main body portion to define a phone slot.
  • the goggles also include lenses, and a strap, or other component for securing the goggles to the user’s head.
  • the present invention is a lesson based teaching tool used in a hands-free augmented reality framework that is achieved by a set of VR goggles that exposes the camera of a smartphone, allowing the split-screen function to immerse the user into an augmented reality environment.
  • the book, the app, the step-by step AR, the instruments, etc. all combine to provide, in an exemplary embodiment, a professor that appears in the room with the user, walking around and teaching the user.
  • FIG. 1 is a schematic view of a number of components of a kit for the system of the present invention
  • FIG. 2 is an elevational view of a mobile device that includes an app icon thereon that is associated with the augmented reality system
  • FIG. 3A is an exploded front elevational view of a head mounted device
  • FIG. 3B is an exploded side elevational view of a head mounted device and mobile device
  • FIG. 4 is a front elevational view of the mobile device secured in the head mounted device
  • FIG. 5 is a view of the mobile device screen showing the first and second screen frames
  • FIG. 6 is a plan view of exemplary pages of the printed media member;
  • FIG. 7 is block diagram showing the augmented reality system in communication through a network with a remote target database;
  • FIG. 8 is an exemplary view that a user would see when viewing the printed media member through the head mounted device and mobile device;
  • FIG. 9 is a plan view of exemplary pages of a printed media member in accordance with another preferred embodiment of the present invention.
  • FIG. 10 is a block diagram of a computer system suitable for implementing one or more components discussed herein;
  • FIG. 11 is a plan view of exemplary pages of a printed media member in accordance with another preferred embodiment of the present invention
  • FIG. 12 is a view of the mobile device screen showing a virtual reality video
  • references in this specification to "one embodiment" or "an embodiment" mean that a particular feature, structure, or characteristic described in connection with the embodiment is included in at least one embodiment of the disclosure.
  • the appearances of the phrase “in one embodiment” in various places in the specification are not necessarily all referring to the same embodiment, nor are separate or alternative embodiments mutually exclusive of other embodiments.
  • various features are described which may be exhibited by some embodiments and not by others.
  • various requirements are described which may be requirements for some embodiments but not others
  • the kit 10 includes a printed media member 12, a head mounted device 14 and a plurality of instruments 16 that can be used during the augmented reality experience described herein.
  • the printed media member 12 can be, for example, a book, poster, brochure, etc.
  • the printed media member 12 can also be a .pdf or other electronic file that is viewable on a computer screen.
  • the head mounted device 14 can be any device that is capable of being mounted on a user's head and securing a mobile device thereto.
  • the head mounted device 14 is a pair of goggles, but could also be a helmet, headband or other head mounted device.
  • the kit 10 can be provided in a box that includes the goggles, book and plurality of instruments, such as beakers, a cup and an eye dropper, as shown in FIG. 1.
  • the present invention includes software that can be downloaded to a mobile device 18 in the form of an app.
  • FIG. 2 shows a mobile device 18 with an exemplary app icon 20 thereon that is associated with the software program.
  • the mobile device 18 is secured to the head mounted device 14 so that the screen 22 of the mobile device 18 is viewable by the user when the head mounted device 14 is placed on the user's head.
  • the head mounted device 14 includes a main body portion 24, a securing member 26, first and second lenses 28 and a strap 30.
  • a securing space 32 is defined between the securing member 26 and the main body portion 24.
  • the securing member can be any component that secures the mobile device to the head mounted device.
  • the securing member can be a strap, elastic band or the like and can include Velcro, snaps, tabs or other components for securing the mobile device.
  • as shown in FIG. 4, when properly positioned in the head mounted device 14, the camera lens 34 of the mobile device 18 is exposed so that it can view the printed media member 12, as discussed below.
  • the software includes the capability of displaying the video in split screen mode, which includes a first screen frame 36 and a second screen frame 38.
  • the first screen frame 36 is viewable by the first lens 28 and the second screen frame 38 is viewable by the second lens 28 for stereoscopic viewing to provide a 3D image.
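A minimal sketch of the split-screen composition described above: the rendered AR view is drawn into a first and a second screen frame side by side, with a small horizontal shift between the two copies standing in for true stereoscopic rendering (a real app would render two views from offset virtual cameras). Function and parameter names are illustrative, not from the publication.

```python
import numpy as np

def compose_split_screen(eye_frame: np.ndarray, parallax_px: int = 8) -> np.ndarray:
    """Build the phone-screen image from one rendered AR view.

    eye_frame is the AR view rendered at half the screen width.  It is drawn
    twice, side by side (the first and second screen frames), with each copy
    shifted a few pixels in opposite directions so that the left and right
    lenses see slightly different images.
    """
    h, w = eye_frame.shape[:2]
    screen = np.zeros((h, 2 * w, eye_frame.shape[2]), dtype=eye_frame.dtype)
    screen[:, :w] = np.roll(eye_frame, -parallax_px, axis=1)  # first screen frame
    screen[:, w:] = np.roll(eye_frame, parallax_px, axis=1)   # second screen frame
    return screen

# Example: a 640x360 rendered view becomes a 640x720 split-screen image.
view = np.full((640, 360, 3), 128, dtype=np.uint8)
assert compose_split_screen(view).shape == (640, 720, 3)
```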
  • the printed media member 12 includes at least a first image target 40 and preferably a plurality of image targets 40.
  • the image targets 40 are recognized by the software and, as a result of being recognized, prompt or cause certain actions to take place.
  • the system includes a target database 42.
  • the target database is part of the software program downloaded to the mobile device 18.
  • the target database 42 can be stored on a remote server 44 or the like that is accessible via a network 46, such as the internet.
  • Each image target 40 includes augmented reality content or virtual reality content linked thereto or associated therewith.
  • the image targets can be any of the printed material that is recognized by the software.
  • the image targets can be a particular picture, drawing, word or set of words.
  • FIG. 8 includes two image targets 40 that cause linked augmented reality content to be augmented, displayed, activated or sounded.
  • the associated augmented reality content is displayed on the screen.
  • see FIG. 8, which shows an exemplary Professor Maxwell 48 as the displayed augmented reality content.
  • the augmented reality content may also include audio.
  • each image target 40 may cause Professor Maxwell 48 to make a series of moves and say a particular set of words.
  • the kit 10 includes instruments 60.
  • the instruments 60 are associated with the theme of the kit, e.g., science, chemistry, cooking, etc. An example will be provided using the Professor Maxwell’s 4D Chemistry.
  • the book 12 is opened to the first page that includes an image target 40.
  • when the image target 40 is recognized by the software via the camera and camera lens on the mobile device, the Professor Maxwell related augmented reality content is rendered.
  • Professor Maxwell may tell the user to take one of the instruments 16 and perform a task.
  • the exemplary instruments shown in FIG. 1 are a pH test paper strip, a cup, tweezers, a set of test tubes and a dropper.
  • one of the image targets 40 includes associated augmented reality content where Professor Maxwell 48 tells the user to use scissors to cut a few squares of the pH test paper. The user then takes the test paper strip that was provided with the kit and does as instructed by Professor Maxwell. This can be done while still wearing the goggles.
  • the image targets 40 have one or more tasks or projects associated with them and/or have one or more instruments for performing the task(s) associated with them.
  • the software may recognize the second image target 40 shown in FIG. 6, and the linked augmented reality content can be displayed and Professor Maxwell continues to explain the next task or step in the experiment.
  • the next image target is recognized and the associated content is animated or rendered.
  • if the first augmented reality content has not finished playing, the second augmented reality content begins playing as soon as the second image target is recognized.
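One way to realize the behavior just described is a small player object that stops whatever linked content is still playing as soon as a different image target is recognized. This is an assumed structure for illustration only, not the publication's implementation.

```python
class ContentPlayer:
    """Keeps track of which linked augmented reality content is playing."""

    def __init__(self, content_db):
        self.content_db = content_db
        self.current_target = None

    def on_target_recognized(self, target_id):
        if target_id == self.current_target:
            return                                    # same target: keep playing
        if self.current_target is not None:
            print(f"stopping content for {self.current_target}")
        self.current_target = target_id
        # Stand-in for the real renderer/audio player.
        print(f"augmenting: {self.content_db[target_id]}")

player = ContentPlayer({"page1": "Professor's introduction",
                        "page2": "Step 1 demonstration video"})
player.on_target_recognized("page1")
player.on_target_recognized("page2")  # interrupts the page1 content immediately
```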
  • the instruments can be at least one of test tubes, a funnel, tweezers, a dropper, beaker, string, a test tube rack, test tube holder and stopper, magnet, pH test chart, a cup, a measuring cup, a measuring spoon, bottle(s) of food coloring, pH test strips, Styrofoam, a magnifying glass and other chemistry related instruments and the tasks or projects can be one or more of testing copper coins, making egg tattoos, making a compass, making perfume, mixing oil and water, performing soapy tricks, making rock candy, making an erupting rainbow, making sticky ice and testing pH.
  • the instruments can be at least one of a plastic volcano mold, a dropper, a cup, a measuring cup, a measuring spoon, beaker, bottle(s) of food coloring, a marker, a paint brush, a glue, mirror, filter paper, salt, washers, string, a rubber band, balloons and other science related instruments and the tasks or projects can be one or more of making sticky slime, making a volcano eruption, making a lava lamp, making dancing paper, extracting color from water, making a colored flower, creating a magic message, making a rainbow, making radical crystals and trapping gas in a balloon.
  • FIG. 9 shows an exemplary printed media member 12 in accordance with another preferred embodiment of the present invention.
  • the subject matter is baking/cooking and not chemistry.
  • the printed media member 12 includes augmented reality content that looks 3D when viewed, similar to Professor Maxwell 48 shown in FIG. 8, and augmented reality content that appears to play within a frame 41.
  • the image target 40 on the left page augments Professor Maxwell to teach the user about oxidizing food enzymes.
  • the image targets 40 on the right page augment such that they appear to be playing within the frame 41.
  • the image target 40 for each video that is recognized can be the first frame or image of the video.
  • the videos are related to the steps for making an apple-berry crumble.
  • FIG. 9 includes eight videos showing eight steps for making the apple-berry crumble.
  • the video is animated or augmented and appears to play within the frame 41 associated with step 1.
  • a different video is associated with and played for each of the steps as the associated image target is recognized.
  • the type of augmented reality associated with Professor Maxwell and the like is referred to as three-dimensional type augmented reality content because the content appears three-dimensional to the user, as shown in FIG. 8.
  • the type of augmented reality associated with the frames 41 and the like is referred to as two-dimensional type augmented reality content because the content appears two-dimensional to the viewer as it plays within the frame located on the page of the printed media member, as shown in FIG. 9.
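To make the two-dimensional type content concrete, the sketch below warps a video frame into the frame region detected on the printed page using a planar homography, so the video appears to play flat within the frame as described. How the frame corners are detected, and the function names, are assumptions; OpenCV is used only as one possible toolkit.

```python
import cv2
import numpy as np

def overlay_in_frame(camera_frame, video_frame, frame_corners):
    """Warp a video frame into the frame region detected on the printed page.

    frame_corners: four (x, y) pixel positions of the frame's corners in the
    camera image, ordered top-left, top-right, bottom-right, bottom-left
    (e.g., derived from the pose of the recognized image target).
    """
    h, w = video_frame.shape[:2]
    src = np.float32([[0, 0], [w, 0], [w, h], [0, h]])
    dst = np.float32(frame_corners)
    homography = cv2.getPerspectiveTransform(src, dst)
    warped = cv2.warpPerspective(
        video_frame, homography,
        (camera_frame.shape[1], camera_frame.shape[0]))
    # Replace only the pixels inside the frame region on the page.
    mask = np.zeros(camera_frame.shape[:2], dtype=np.uint8)
    cv2.fillConvexPoly(mask, dst.astype(np.int32), 255)
    out = camera_frame.copy()
    out[mask == 255] = warped[mask == 255]
    return out
```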
  • the videos associated with each step may demonstrate to the user how to use the instruments associated with the task.
  • if the instrument is a measuring cup (which may be an instrument that comes with the kit), the subject video can show someone measuring out an amount of an ingredient in the measuring cup and then pouring it into a pot (which may be an instrument that does not come with the kit).
  • the two-dimensional type augmented reality content can also include audio explaining the steps, as can the three-dimensional type augmented reality content (e.g., Professor Maxwell).
  • FIGS. 11 and 12 show another aspect of the present invention that includes a virtual reality component.
  • the augmented reality content can be viewed by the user (through the goggles or on their mobile device), but the user can also see the printed media member (and other things therearound) as captured by the camera of the mobile device. Therefore, the augmented reality content appears to play on the printed media member.
  • the camera of the mobile device is not activated and the user cannot see the printed media member any longer. Instead, the content that is rendered is a virtual reality video where the user is immersed therein.
  • FIG. 11 shows another exemplary printed media member 12 that includes the lesson steps via virtual reality content.
  • the printed media member 12 includes augmented reality content that looks 3D when viewed, similar to Professor Maxwell 48 shown in FIG. 8, and virtual reality content that plays on the user's mobile device.
  • the image target 40 on the left page augments Professor Maxwell to teach the user about osmosis.
  • each of the image targets 40 associated with the ten steps in FIG. 11 causes virtual reality content to be augmented, displayed, activated or sounded and played on the screen of the mobile device (see FIG. 12).
  • the virtual reality content is related to the steps for cooking sweet potato fries.
  • FIG. 11 includes ten steps for making the sweet potato fries.
  • Each of the steps includes a drawing representing the step and an image target 40.
  • a video is activated and plays on the phone.
  • the video shows all ten steps in a single video, as opposed to having to look at each step and the related image target separately for a separate video.
  • each step can include a different virtual reality video that is associated with each image target.
  • the video shows all ten of the steps in the process for making the sweet potato fries.
  • the video(s) can be an animated video or an actual video of people performing the steps.
  • the video is filmed or created as a virtual reality video, so when the user turns their head to "look around" (if wearing the goggles) or moves their phone (if just viewing on the screen of the phone), the user sees a different perspective within the site where the video takes place (e.g., the kitchen).
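A crude sketch of the look-around behavior described above, assuming the virtual reality video is stored as an equirectangular projection: the viewport is selected from head yaw and pitch by a rectangular crop with horizontal wrap-around. A production viewer would do a proper spherical reprojection; the field-of-view values and function names are illustrative.

```python
import numpy as np

def viewport_from_orientation(equirect_frame, yaw_deg, pitch_deg,
                              fov_h_deg=90, fov_v_deg=60):
    """Return the part of an equirectangular VR frame the user is facing.

    yaw_deg turns the view left/right (wrapping all the way around);
    pitch_deg tilts it up/down (clamped at the top and bottom of the frame).
    """
    h, w = equirect_frame.shape[:2]
    vp_w = int(w * fov_h_deg / 360.0)
    vp_h = int(h * fov_v_deg / 180.0)
    x0 = int((yaw_deg % 360) / 360.0 * w) - vp_w // 2
    rolled = np.roll(equirect_frame, -x0, axis=1)     # horizontal wrap-around
    y_center = int((90 - pitch_deg) / 180.0 * h)
    y0 = min(max(y_center - vp_h // 2, 0), h - vp_h)  # clamp vertically
    return rolled[y0:y0 + vp_h, :vp_w]

# Example: a 1920x960 equirectangular frame, looking 30 degrees right and up.
frame = np.zeros((960, 1920, 3), dtype=np.uint8)
print(viewport_from_orientation(frame, yaw_deg=30, pitch_deg=10).shape)  # (320, 480, 3)
```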
  • FIG. 12 shows an exemplary screen 22 of a mobile device 18 with a video playing thereon.
  • the video demonstrates to the user how to perform each of the steps and how to use the instruments associated with the tasks.
  • the virtual reality content video may show two kids in a kitchen performing all of steps one through ten to make and cook the sweet potato fries.
  • the virtual reality content can also include audio explaining the steps. Therefore, when using the book, the user can direct the camera towards the top left of the page and Professor Maxwell is activated to explain about osmosis. Then the user can direct the camera toward one of the steps in the process and a virtual reality video appears on the screen and walks the user through all of the steps for making the sweet potato fries.
  • the type of augmented reality associated with Professor Maxwell and the like is referred to as three-dimensional type augmented reality content because the content appears three-dimensional to the user, as shown in FIG. 8, the type of augmented reality associated with the frames 41 and the like is referred to as two-dimensional type augmented reality content because the content appears two-dimensional to the viewer as it plays within the frame located on the page of the printed media member, as shown in FIG. 9, and the content associated with the image targets that activate a full screen video is referred to as virtual reality content.
  • Network 46 may be implemented as a single network or a combination of multiple networks.
  • network 46 may include the Internet and/or one or more intranets, wireless networks (e.g., cellular, wide area network (WAN), WiFi hot spot, WiMax, personal area network (PAN), Bluetooth, etc.), landline networks and/or other appropriate types of communication networks.
  • computing device 18 may be associated with a particular link (e.g., a link, such as a URL (Uniform Resource Locator) to an IP (Internet Protocol) address).
  • mobile device 18 may use the remote target database 42 or may transmit the images to a remote server for image target identification.
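A hedged sketch of the remote identification path mentioned above: the mobile device JPEG-encodes a camera frame and posts it to a server that holds the target database, falling back to local recognition if the request fails. The endpoint URL and the JSON response shape are hypothetical, not details from the publication.

```python
import cv2
import requests

TARGET_SERVICE_URL = "https://example.com/api/identify-target"  # hypothetical endpoint

def identify_target_remotely(frame, timeout_s=2.0):
    """Send a camera frame to a remote server holding the target database.

    Returns the recognized target id, or None if nothing matched or the
    request failed (the app can then fall back to its on-device database).
    """
    ok, jpeg = cv2.imencode(".jpg", frame, [cv2.IMWRITE_JPEG_QUALITY, 80])
    if not ok:
        return None
    try:
        response = requests.post(
            TARGET_SERVICE_URL,
            files={"image": ("frame.jpg", jpeg.tobytes(), "image/jpeg")},
            timeout=timeout_s,
        )
        response.raise_for_status()
        return response.json().get("target_id")  # assumed response shape
    except requests.RequestException:
        return None
```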
  • FIG. 10 is a block diagram of a computer system 60 (e.g., mobile device 18 or a head mounted device that includes the computer system built in) suitable for implementing one or more components discussed herein according to one embodiment of the subject matter of the present disclosure.
  • mobile device 18 of the user may comprise a personal computing device (e.g., smart phone, a computing tablet, a personal computer, laptop, PDA, Bluetooth device, key FOB, badge, etc.) capable of
  • HMD 14 may comprise a personal computing device incorporated into a pair of glasses or a helmet.
  • HMD 14 may comprise or implement a plurality of hardware components and/or software components that operate to perform various methodologies in accordance with the described embodiments.
  • Exemplary HMD 14 may include, for example, stand-alone and networked computers running mobile OS.
  • Computer system 60 includes a bus 62 or other communication mechanism for communicating information data, signals, and information between various components of computer system 60.
  • Components include an input/output (I/O) component 64 that processes a user action, such as selecting keys from a virtual keypad/keyboard, selecting one or more buttons or links, etc., and sends a corresponding signal to bus 62.
  • I/O component 64 may also include an output component such as a display medium 70 mounted a short distance in front of the user's eyes, and an input control such as a cursor control 74 (such as a virtual keyboard, virtual keypad, virtual mouse, etc.).
  • An optional audio input/output component 66 may also be included to allow a user to use voice for inputting information by converting audio signals into information signals. Audio I/O component 66 may allow the user to hear audio.
  • a transceiver or network interface 68 transmits and receives signals between computer system 60 and other devices, such as another user device, or another network computing device via a communication link to a network. In one embodiment, the transmission is wireless, although other transmission mediums and methods may also be suitable.
  • a processor 72 which can be a micro controller, digital signal processor (DSP), or other processing component, processes these various signals, such as for display on computer system 60 or transmission to other devices via communication link. Processor 72 may also control transmission of information, such as cookies or IP addresses, to other devices.
  • Components of computer system 60 also include a system memory component 76 (e.g., RAM), a static storage component 78 (e.g., ROM), and/or a disk drive 80.
  • Computer system 60 performs specific operations by processor 72 and other components by executing one or more sequences of instructions contained in system memory component 76.
  • Logic may be encoded in a computer readable medium, which may refer to any medium that participates in providing instructions to processor 72 for execution. Such a medium may take many forms, including but not limited to, non-volatile media, volatile media, and transmission media.
  • non-volatile media includes optical or magnetic disks, or solid-state drives; volatile media includes dynamic memory, such as system memory component 76; and transmission media includes coaxial cables, copper wire, and fiber optics, including wires that comprise bus 62.
  • the logic is encoded in non-transitory computer readable medium.
  • transmission media may take the form of acoustic or light waves, such as those generated during radio wave, optical, and infrared data communications.
  • execution of instruction sequences to practice the present disclosure may be performed by computer system 60.
  • a plurality of computer systems 60 may be coupled by the communication link to the network (e.g., a LAN, WLAN, PSTN, and/or various other wired or wireless networks, including telecommunications, mobile, and cellular phone networks)
  • various embodiments provided by the present disclosure may be implemented using hardware, software, firmware, or combinations thereof.
  • the various hardware components, software components, and/or firmware components set forth herein may be combined into composite components comprising software, firmware, hardware, and/or all without departing from the spirit of the present disclosure.
  • the various hardware components, software components, and/or firmware components set forth herein may be separated into sub-components comprising software, firmware, hardware, or all without departing from the spirit of the present disclosure.
  • software components may be implemented as hardware components, and vice-versa.
  • the ordering of various steps described herein may be changed, combined into composite steps, and/or separated into sub-steps to provide features described herein.

Abstract

The invention relates to an augmented reality system that includes a printed media member having at least a first image target, a head mounted device that includes a mobile device securing member, and software configured to run on a mobile device that includes a camera having a camera lens. The software is in communication with a target database (either on a remote server or within the software) that includes information related to the first image target, the software including first augmented reality content that is associated with the first image target.
PCT/US2020/021190 2019-03-06 2020-03-05 Virtual and augmented reality system and kit WO2020181086A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US16/294,706 2019-03-06
US16/294,706 US10430658B2 (en) 2017-10-06 2019-03-06 Augmented reality system and kit

Publications (1)

Publication Number Publication Date
WO2020181086A1 true WO2020181086A1 (fr) 2020-09-10

Family

ID=72337134

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/US2020/021190 WO2020181086A1 (fr) Virtual and augmented reality system and kit

Country Status (1)

Country Link
WO (1) WO2020181086A1 (fr)

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20130113943A1 (en) * 2011-08-05 2013-05-09 Research In Motion Limited System and Method for Searching for Text and Displaying Found Text in Augmented Reality
US20130278635A1 (en) * 2011-08-25 2013-10-24 Sartorius Stedim Biotech Gmbh Assembling method, monitoring method, communication method, augmented reality system and computer program product
US20150253574A1 (en) * 2014-03-10 2015-09-10 Ion Virtual Technology Corporation Modular and Convertible Virtual Reality Headset System
US20160313902A1 (en) * 2015-04-27 2016-10-27 David M. Hill Mixed environment display of attached control elements
US20170352187A1 (en) * 2016-06-03 2017-12-07 J. Michelle HAINES System and method for implementing computer-simulated reality interactions between users and publications

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 20767201

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 20767201

Country of ref document: EP

Kind code of ref document: A1