US20210043007A1 - Virtual Path Presentation - Google Patents
- Publication number: US20210043007A1 (application US 17/078,271)
- Authority: US (United States)
- Prior art keywords: user, real, path, world space, world
- Legal status: Abandoned (the legal status is an assumption and is not a legal conclusion; Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed)
Classifications
- G06T19/006—Mixed reality
- G01C21/20—Instruments for performing navigational calculations
- G01C21/3647—Guidance involving output of stored or live camera images or video streams
- G01C21/365—Guidance using head up displays or projectors, e.g. virtual vehicles or arrows projected on the windscreen or on the road itself
- G06F3/011—Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
- G06F3/0346—Pointing devices with detection of the device orientation or free movement in a 3D space, e.g. 3D mice, 6-DOF [six degrees of freedom] pointers using gyroscopes, accelerometers or tilt-sensors
- G06T15/005—General purpose rendering architectures
- G06V20/20—Scene-specific elements in augmented reality scenes
- G06T2200/04—Indexing scheme for image data processing involving 3D image data
- G06T2200/24—Indexing scheme for image data processing involving graphical user interfaces [GUIs]
Definitions
- the present subject matter relates to displaying information, and more specifically, to presenting virtual barriers or virtual paths added to a real-world scene.
- One new technology that is making its way into the world of emergency responders is digital displays. These displays may be on a handheld device, such as a mobile phone, or on a head-mounted display (HMD), such as a virtual reality (VR) display or an augmented reality (AR) display, which may be integrated into their emergency equipment, such as their helmet.
- Textual information can be presented to the emergency responder through the display and the information can be updated in real-time through the digital wireless interface from a command center or other information sources.
- FIG. 1A shows a user wearing an embodiment of a head-mounted display presenting a virtual barrier
- FIG. 1B shows a user wearing the embodiment of a head-mounted display presenting the virtual barrier from a different perspective
- FIG. 2A shows a user wearing an embodiment of a head-mounted display presenting a virtual barrier
- FIG. 2B shows a user wearing the embodiment of a head-mounted display presenting a different state of the virtual barrier
- FIG. 3 shows a user wearing an embodiment of a head-mounted display presenting a brightness mask used to direct the user
- FIG. 4A shows in-view and out-of-view objects near a user wearing an embodiment of a head-mounted display
- FIG. 4B shows a distortion mask presenting out-of-view data to direct a user wearing an embodiment of a head-mounted display
- FIG. 5 shows an overhead map of a path presented by an embodiment of a head-mounted display
- FIG. 6A shows part of a 3D path presented by an embodiment of a head-mounted display
- FIG. 6B shows the exit of a 3D path presented by an embodiment of a head-mounted display
- FIG. 7 shows a path for a hand motion presented by an embodiment of a head-mounted display.
- FIG. 8 shows a block diagram of an embodiment of an HR system
- FIG. 9 is a flowchart of an embodiment of a method for directing a user to approach a virtual barrier
- FIG. 10 is a flowchart of an embodiment of a method for directing a user to retreat from a virtual barrier
- FIG. 11 is a flowchart of an embodiment of a method for providing directional guidance to a user.
- FIG. 12 is a flowchart of an embodiment of a method to provide directional guidance to a user wearing a head-mounted display.
- Hybrid Reality (HR) refers to an image that merges real-world imagery with imagery created in a computer, which is sometimes called virtual imagery. While an HR image can be a still image, it can also be a moving image, such as imagery created using a video stream. HR can be displayed by a traditional two-dimensional display device, such as a computer monitor, one or more projectors, or a smartphone screen. HR imagery can also be displayed by a head-mounted display (HMD). Many different technologies can be used in an HMD to display HR imagery.
- a virtual reality (VR) HMD system may receive images of a real-world object, objects, or scene, and composite those images with a virtual object, objects, or scene to create an HR image.
- An augmented reality (AR) HMD system may present a virtual object, objects, or scene on a transparent screen which then naturally mixes the virtual imagery with a view of a scene in the real-world.
- a display which mixes live video with virtual objects is sometimes denoted AR, but for the purposes of this disclosure, an AR HMD includes at least a portion of the display area that is transparent to allow at least some of the user's view of the real-world to be directly viewed through the transparent portion of the AR HMD.
- the display used by an HR system represents a scene which is a visible portion of the whole environment.
- the term “scene” and “field of view” (FOV) are used to indicate what is visible to a user.
- the word “occlude” is used herein to mean that a pixel of a virtual element is mixed with an image of another object to change the way the object is perceived by a viewer.
- this can be done through use of a compositing process to mix the two images, a Z-buffer technique to remove elements of the image that are hidden from view, a painter's algorithm to render closer objects later in the rendering process, or any other technique that can replace a pixel of the image of the real-world object with a different pixel value generated from any blend of real-world object pixel value and an HR system determined pixel value.
- the virtual object occludes the real-world object if the virtual object is rendered, transparently or opaquely, in the line of sight of the user as they view the real-world object.
- the terms “occlude”, “transparency”, “rendering” and “overlay” are used to denote the mixing or blending of new pixel values with existing object pixel values in an HR display.
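As a minimal sketch of the pixel mixing described above, the following Python function (the name and 0-255 channel convention are illustrative assumptions, not taken from the disclosure) blends a virtual element's pixel over a real-world pixel with a simple alpha-compositing rule:

```python
def blend_pixel(virtual_rgb, real_rgb, alpha):
    """Mix a virtual element's pixel with the real-world pixel behind it.

    alpha = 1.0 fully occludes the real pixel; alpha = 0.0 leaves it
    unchanged.  Channel values are 0-255 integers.
    """
    return tuple(
        round(alpha * v + (1.0 - alpha) * r)
        for v, r in zip(virtual_rgb, real_rgb)
    )

# An opaque virtual barrier pixel completely replaces the real pixel;
# a half-transparent overlay mixes the two values channel by channel.
opaque = blend_pixel((200, 0, 0), (10, 20, 30), 1.0)   # -> (200, 0, 0)
mixed = blend_pixel((200, 0, 0), (10, 20, 30), 0.5)    # -> (105, 10, 15)
```

A Z-buffer or painter's algorithm, as mentioned above, decides *which* virtual pixel reaches this blend; the blend itself is the final per-pixel step.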
- a sensor may be mounted on or near the display, on the viewer's body, or be remote from the user.
- Remote sensors may include, but are not limited to, fixed sensors attached in an environment, sensors attached to robotic extensions, sensors attached to autonomous or semi-autonomous drones, or sensors attached to other persons.
- Data from the sensors may be raw or filtered.
- Data from the sensors may be transmitted wirelessly or using a wired connection.
- Sensors used by some embodiments of HR systems include, but are not limited to, a camera that captures images in the visible spectrum, an infrared depth camera, a microphone, a sound locator, a Hall effect sensor, an air-flow meter, a fuel level sensor, an oxygen sensor, an electronic nose, a gas detector, an anemometer, a mass flow sensor, a Geiger counter, a gyroscope, an infrared temperature sensor, a flame detector, a barometer, a pressure sensor, a pyrometer, a time-of-flight camera, radar, or lidar.
- Sensors in some HR system embodiments that may be attached to the user include, but are not limited to, a biosensor, a biochip, a heartbeat sensor, a pedometer, a skin resistance detector, or skin temperature detector.
- the display technology used by an HR system embodiment may include any method of projecting an image to an eye.
- Conventional technologies include, but are not limited to, cathode ray tube (CRT), liquid crystal display (LCD), light emitting diode (LED), plasma or organic LED (OLED) screens, or projectors based on those technologies or digital micromirror devices (DMD).
- virtual retina displays such as direct drawing on the eye's retina using a holographic grating, may be used.
- direct machine to brain interfaces may be used in the future.
- the display of an HR system may also be an HMD or a separate device, such as, but not limited to, a hand-held mobile phone, a tablet, a fixed monitor or a TV screen.
- connection technology used by an HR system may include any physical link and associated protocols, such as, but not limited to, wires, transmission lines, solder bumps, near-field connections, infra-red connections, or radio frequency (RF) connections such as cellular, satellite or Wi-Fi® (a registered trademark of the Wi-Fi Alliance).
- Virtual connections, such as software links, may also be used to connect to external networks and/or external computing resources.
- aural stimuli and information may be provided by a sound system.
- the sound technology may include monaural, binaural, or multi-channel systems.
- a binaural system may include a headset or another two-speaker system but may also include systems with more than two speakers directed to the ears.
- the sounds may be presented as 3D audio, where each sound has a perceived position in space, achieved by using reverberation and head-related transfer functions to mimic how sounds change as they move in a particular space.
- objects in the display may move.
- the movement may be due to the user moving within the environment, for example walking, crouching, turning, or tilting the head.
- the movement may be due to an object moving, for example a dog running away, a car coming towards the user, or a person entering the FOV.
- the movement may also be due to an artificial movement, for example the user moving an object on a display or changing the size of the FOV.
- the motion may be due to the user deliberately distorting all or part of the FOV, for example adding a virtual fish-eye lens.
- all motion is considered relative; any motion may be resolved to a motion from a single frame of reference, for example the user's viewpoint.
- the perspective of any generated object overlay may be corrected so that it changes with the shape and position of the associated real-world object. This may be done with any conventional point-of-view transformation based on the angle of the object from the viewer; note that the transformation is not limited to simple linear or rotational functions, with some embodiments using non-Abelian transformations. It is contemplated that motion effects, for example blur or deliberate edge distortion, may also be added to a generated object overlay.
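The point-of-view correction described above can be illustrated with a minimal pinhole-projection sketch. The function name, the yaw-only rotation, and the frame conventions are assumptions for illustration, not the disclosed method; a full 3-DOF orientation would chain further (non-commuting) rotations:

```python
import math

def project_point(world_xyz, cam_xyz, cam_yaw, focal=1.0):
    """Project a world-space point into the viewer's image plane.

    The point is expressed in the camera frame (translate, then rotate
    by the viewer's yaw about the vertical y axis), then projected with
    a simple pinhole model.  Returns (x, y) on the image plane, or None
    when the point is behind the viewer.  cam_yaw is in radians.
    """
    dx = world_xyz[0] - cam_xyz[0]
    dy = world_xyz[1] - cam_xyz[1]
    dz = world_xyz[2] - cam_xyz[2]
    c, s = math.cos(-cam_yaw), math.sin(-cam_yaw)
    xc = c * dx + s * dz
    zc = -s * dx + c * dz
    if zc <= 0.0:
        return None          # behind the viewer, not rendered
    return (focal * xc / zc, focal * dy / zc)

# A barrier point 4 m straight ahead projects onto the image centre.
centre = project_point((0.0, 0.0, 4.0), (0.0, 0.0, 0.0), 0.0)  # -> (0.0, 0.0)
```

Re-running this projection each frame with the viewer's current position and orientation is what keeps a generated overlay aligned with its real-world object.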
- images from cameras may be processed before algorithms are executed.
- Algorithms used after image processing for embodiments disclosed herein may include, but are not limited to, object recognition, motion detection, camera motion and zoom detection, light detection, facial recognition, text recognition, or mapping an unknown environment.
- the image processing may also use conventional filtering techniques, such as, but not limited to, static, adaptive, linear, non-linear, and Kalman filters. Deep-learning neural networks may be trained in some embodiments to mimic functions which are hard to create algorithmically. Image processing may also be used to prepare the image, for example by reducing noise, restoring the image, edge enhancement, or smoothing.
- objects may be detected in the FOV of one or more cameras.
- Objects may be detected by using conventional algorithms, such as, but not limited to, edge detection, feature detection (for example surface patches, corners and edges), greyscale matching, gradient matching, pose consistency, or database look-up using geometric hashing.
- Genetic algorithms and trained neural networks using unsupervised learning techniques may also be used in embodiments to detect types of objects, for example people, dogs, or trees.
- object detection may be performed on a single frame of a video stream, although techniques using multiple frames are also envisioned.
- Advanced techniques such as, but not limited to, Optical Flow, camera motion, and object motion detection may be used between frames to enhance object recognition in each frame.
- rendering the object may be done by the HR system embodiment using databases of similar objects, the geometry of the detected object, or how the object is lit, for example specular reflections or bumps.
- the locations of objects may be generated from maps and object recognition from sensor data.
- Mapping data may be generated on the fly using conventional techniques, for example the Simultaneous Location and Mapping (SLAM) algorithm used to estimate locations using Bayesian methods, or extended Kalman filtering, which linearizes a non-linear Kalman filter to optimally estimate the mean and covariance of a state (map), or particle filters, which use Monte Carlo methods to estimate hidden states (map).
- the locations of objects may also be determined a priori, using techniques such as, but not limited to, reading blueprints, reading maps, receiving GPS locations, receiving relative positions to a known point (such as a cell tower, access point, or other person) determined using depth sensors, WiFi time-of-flight, or triangulation to at least three other points.
- Gyroscope sensors on or near the HMD may be used in some embodiments to determine head position and to generate relative motion vectors which can be used to estimate location.
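A minimal dead-reckoning sketch of that idea, integrating gyroscope heading changes and per-step distances into a location estimate (all names and the 2D simplification are illustrative):

```python
import math

def dead_reckon(start_xy, start_heading, steps):
    """Estimate location from relative motion (e.g. gyro + pedometer).

    `steps` is a sequence of (turn_radians, distance) pairs: the change
    in heading reported by the gyroscope, followed by the distance
    moved.  Returns the estimated (x, y) position and final heading.
    """
    x, y = start_xy
    heading = start_heading
    for turn, dist in steps:
        heading += turn
        x += dist * math.cos(heading)
        y += dist * math.sin(heading)
    return (x, y), heading

# Walk 3 m east, then turn 90 degrees left and walk 4 m north;
# the estimate ends up at approximately (3.0, 4.0).
pos, heading = dead_reckon((0.0, 0.0), 0.0, [(0.0, 3.0), (math.pi / 2, 4.0)])
```

In practice such relative-motion estimates drift and would be fused with the a priori sources listed above (blueprints, GPS, triangulation) rather than used alone.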
- sound data from one or more microphones may be processed to detect specific sounds. Sounds that might be identified include, but are not limited to, human voices, glass breaking, human screams, gunshots, explosions, door slams, or a sound pattern a particular machine makes when defective.
- Gaussian Mixture Models and Hidden Markov Models may be used to generate statistical classifiers that are combined and looked up in a database of sound models.
- One advantage of using statistical classifiers is that sounds can be detected more consistently in noisy environments.
- Eye tracking of one or both of the viewer's eyes may be performed. Eye tracking may be used to measure the point of the viewer's gaze.
- the position of each eye is known, providing a reference frame for determining head-to-eye angles, so the position and rotation of each eye can be used to estimate the gaze point.
- Eye position determination may be done using any suitable technique and/or device, including, but not limited to, devices attached to an eye, tracking the eye position using infra-red reflections, for example Purkinje images, or using the electric potential of the eye detected by electrodes placed near the eye which uses the electrical field generated by an eye independently of whether the eye is closed or not.
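The gaze-point estimate from known eye positions and rotations can be sketched as a ray intersection. This simplified horizontal-plane (2D) model is an assumption for illustration only; a real tracker works in 3D and handles measurement noise:

```python
import math

def gaze_point_2d(left_eye, left_angle, right_eye, right_angle):
    """Estimate the horizontal gaze point from two eye rays.

    Each eye is an (x, z) position in the head frame with a measured
    gaze angle in radians from the forward z axis.  The fixation point
    is the intersection of the two rays; returns None when the rays
    are parallel (gaze at infinity).
    """
    lx, lz = math.sin(left_angle), math.cos(left_angle)
    rx, rz = math.sin(right_angle), math.cos(right_angle)
    denom = lx * rz - lz * rx
    if abs(denom) < 1e-12:
        return None
    # Solve left_eye + t*(lx, lz) == right_eye + u*(rx, rz) for t.
    ex = right_eye[0] - left_eye[0]
    ez = right_eye[1] - left_eye[1]
    t = (ex * rz - ez * rx) / denom
    return (left_eye[0] + t * lx, left_eye[1] + t * lz)
```

For example, two eyes 6 cm apart converging symmetrically recover a fixation point 1 m straight ahead.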
- HR imagery is becoming increasingly common and is making its way from entertainment and gaming into industrial and commercial applications.
- Examples of systems that may find HR imagery useful include aiding a person doing a task, for example repairing machinery, testing a system, or responding to an emergency.
- HR imagery might also be used to provide information to a user.
- This information may be associated with real objects in the environment or may be related to the environment as a whole, for example an ambient or average value.
- the information to be provided to the user may be unrelated to the real environment they are working in. Providing the various types of information in a way that is readily understood and does not confuse, distract, or obscure details that the user needs can be a challenge.
- an HR system which aids a person doing a task, for example repairing machinery, testing a system, or responding to an emergency
- there may be objects in the environment that should be avoided, for example a critical component that cannot be repaired if broken, a sensitive detector that cannot be touched, or a device that may require a time-consuming and complex reconfiguration if knocked.
- the virtual barrier is an artificial object that is created as an overlay on the screen used to indicate a barrier, for example to provide information that may be occluded or recommend that the user's movements be restricted.
- the HR system may use additional stimuli to reinforce the virtual barrier, for example sounds, haptic pushes, and changes to the display. As one non-limiting example, a buzzing sound may increase in volume or frequency to indicate potential encroachment as a user approaches a virtual barrier.
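A hedged sketch of such proximity-driven escalation, mapping the user's distance from a virtual barrier onto an alert volume. The linear ramp and parameter names are illustrative choices, not taken from the disclosure:

```python
def buzz_volume(distance_m, warn_radius_m=2.0, max_volume=1.0):
    """Scale an alert volume as the user approaches a virtual barrier.

    Silent outside `warn_radius_m`; ramps linearly up to `max_volume`
    at the barrier itself (distance 0).  All names are illustrative.
    """
    if distance_m >= warn_radius_m:
        return 0.0
    return max_volume * (1.0 - distance_m / warn_radius_m)

print(buzz_volume(3.0))   # well clear of the barrier: 0.0
print(buzz_volume(1.0))   # halfway inside the warning zone: 0.5
print(buzz_volume(0.0))   # at the barrier: full volume, 1.0
```

The same distance-to-intensity mapping could drive haptic pulse rate or display changes instead of volume.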
- a virtual barrier may be rendered in any manner, such as, but not limited to, a wall, a fence, a pole, an icon, or a cross.
- a blanket color covering the component may be used to indicate that the component should be avoided.
- additional graphics e.g. icons or text
- the position and orientation of a virtual barrier may be fixed in 3D space. If there is relative motion as described previously herein, a virtual barrier may appear as a static object in the environment by updating the overlay to show a virtual barrier in the same position and orientation in 3D space.
- a virtual barrier may also change appearance, such as, but not limited to, growing larger, changing color, or rendering a different image, to indicate a change in status.
- the stimuli may be changed, increased in volume, or updated at a more frequent rate to encourage the user to move a hand or to move away.
- a directional haptic shake in the HR system, glove or suit may be used to prompt the user to move in a specific direction.
- a surprising flash on the display may also be used to indicate that a virtual barrier is being crossed, causing the user to stop and reassess. Sounds that have a position in space may be presented to encourage the user to move in a specific direction, for example away from a virtual barrier.
- the user may move away from a strident alarm or move towards a sound that is associated with continuing the task at hand.
- the stimuli may include unexpected or unpleasant odors added to a breathing system, a raised suit temperature, or added moisture, to make the user more uncomfortable and so treat a virtual barrier as something to be avoided.
- the virtual barrier may reinforce the user's action, such as, but not limited to, flashing, offering a reassuring voice or sound, or reducing any applied discomfort.
- a virtual barrier may also be used to direct the user of an HR system to controlled actions, such as, but not limited to, a next task, a specific location, a specific orientation, a specific height, a head turn, a body rotation, a hand re-position, or a hand gesture.
- Time-varying stimuli may be used to indicate a pattern of motions to aid user comprehension of the desired action.
- a virtual barrier may indicate a direction using stimuli that have a spatial position, for example a haptic pattern on one side of a suit, glove or head-mounted system, or a sound that has a simulated 3D position.
- a virtual barrier may change the display to include directional attributes, for example shaking the display in a preferred direction or adding arrows.
- a gradient is added to the display to indicate direction.
- the gradient may be any visual effect that has a direction, such as, but not limited to, brightness, distortion, or time-based flashes of out-of-view objects on one side of the display. For example, if the action is to move left, the right side of the display may be dimmed and the left side of the display enhanced by brightening.
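The brightness gradient could be sketched per scan line as follows; the linear fade, the two-region split, and the function name are illustrative assumptions, and a real HR display would apply the mask per frame on the GPU:

```python
def apply_brightness_gradient(row, bright_cols):
    """Dim a row of greyscale pixels from left to right.

    The first `bright_cols` pixels keep full brightness (cf. region 304
    in FIG. 3); the rest fade linearly to black at the right edge
    (cf. region 306).
    """
    n = len(row)
    fade = n - bright_cols
    out = []
    for i, p in enumerate(row):
        if i < bright_cols or fade <= 1:
            out.append(p)
        else:
            scale = 1.0 - (i - bright_cols) / (fade - 1)
            out.append(round(p * scale))
    return out

# An 8-pixel full-white row: the left half stays bright, the right
# half fades to black, nudging the user leftwards.
faded = apply_brightness_gradient([255] * 8, 4)
# -> [255, 255, 255, 255, 255, 170, 85, 0]
```

Growing `bright_cols` over successive frames as the user turns the correct way implements the reinforcement described for FIG. 3.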
- a plurality of virtual barriers may also be used to provide and enhance a specific path, for example a safe exit route in a smoke-filled building, or to a required position and orientation with respect to a physical object.
- a path is defined by a sequence of virtual barriers placed in the correct relative positions and orientations on the screen; note that in any situation, only a portion of a virtual barrier sequence may be in effect or in view.
- FIG. 1A shows how one embodiment of an HR system shows a virtual barrier 108 A correctly positioned between walls 102 , 104 .
- the orientation of the user 100 in space is indicated by compass 106 A.
- the field of view (FOV) of user 100 is indicated by the dotted box 110 ; note that box 110 and virtual barrier 108 A are not in the real environment.
- the image of virtual barrier 108 A is rendered on the head-mounted display 112 in the correct orientation of the user 100 according to the world-position of the gap between walls 102 , 104 .
- FIG. 1B shows the user 100 in a new position and orientation as indicated by the compass 106 B.
- user 100 is still looking at the gap between walls 102 , 104 .
- a virtual barrier 108 B is rendered on the head-mounted display 112 in the gap between walls 102 , 104 , but correctly rendered according to the new position and orientation of user 100 .
- the rendering of virtual barrier 108 A, 108 B at the times of FIG. 1A and FIG. 1B reinforce the appearance of a fixed solid object in real 3D space on display 112 .
- the walls 102 , 104 and other real-world objects are also visible to the user 100 in their FOV.
- the user 100 may view the walls 102 , 104 through a transparent portion of the display 112 , such as in an AR HMD, or an image of the walls 102 , 104 may be presented in the display 112 with the virtual barrier 108 A, 108 B, such as in a VR HMD.
- FIG. 2A shows an example embodiment of an HR system rendering a virtual barrier 250 on the display 240 of user 200 .
- the position and orientation of the gap between walls 210 , 212 is determined with respect to user 200 .
- the field of view of user 200 is indicated by the dotted box 230 ; note that box 230 and virtual barrier 250 are not in the real environment.
- the image of virtual barrier 250 is rendered on the head-mounted display 240 in the correct orientation of the user 200 according to the world-position of the gap between walls 210 , 212 .
- the image of the virtual barrier 250 rendered is selected to denote a default stop barrier, comprising a pole 202 , stand 204 , stand 206 , and icon 208 .
- FIG. 2B shows an example embodiment of an HR system rendering a virtual barrier 260 on the display 240 of user 200 .
- a virtual barrier 260 is rendered on the head-mounted display 240 in the gap between walls 210 , 212 correctly rendered according to the current position and orientation of user 200 .
- the status of the virtual barrier 260 has changed, for example a danger level has increased behind the barrier (e.g. the collapse of a floor behind the world position between walls 210 , 212 ).
- the change of status causes the HR system to render a different virtual barrier 260 on head-mounted display 240 .
- the virtual barrier appears as a striped wall barrier 222 with hazard symbol 228 , indicating that the barrier is less approachable than at the time of FIG. 2A .
- FIG. 3 shows an example embodiment of an HR system rendering a brightness gradient within the field of view 300 of head-mounted display 360 worn by user 350 .
- the rendered field of view 300 is superimposed on real world space including walls 320 , 322 for clarity.
- the field of view 300 includes the visible portions of real-world walls 320 , 322 and a virtual barrier 302 positioned at the gap.
- the field of view 300 is split into two regions 304 , 306 .
- in region 304 , the scene is rendered at a brightness level that may be brighter than, or the same as, the brightness before the gradient was applied at the time of FIG. 3 .
- in region 306 , a gradual reduction in brightness from left to right is applied, with the right-most edge details almost completely black.
- the addition of the brightness gradient in regions 304 , 306 encourages user 350 to move to the left where objects are more visible; this may be reinforced by making region 304 larger at later times as user 350 turns in the correct direction.
- FIG. 4A shows an example snapshot of an embodiment of an HR system display.
- the field of view 410 is the same size as the display.
- Within the field of view 410 are two walls 412 , 414 and a virtual barrier 416 rendered by the example HR system.
- FIG. 4B shows an example snapshot of an embodiment of an HR system display showing the same scene as FIG. 4A at a different time.
- the field of view 410 is split into three regions 402 , 404 , 406 .
- in region 402 , objects previously out of the field of view 410 are rendered, for example fire hydrant 420 .
- Other embodiments may include a different region of the field of view to show objects that are not actually within the field of view of the user, such as a region at the top of the display or a region at the bottom of the display.
- eye gaze tracking may be used to determine when a user looks to a predefined region of the display, and objects that are out of view may be shown in that region in response to the user looking at that region of the display.
- in region 404 , the left side of the actual field of view is rendered at an x-compression level defined by the ratios of regions 402 , 404 , and 406 .
- in region 406 , the right side of the actual field of view is rendered at a higher x-compression level than used in region 404 .
- the addition of the distortion gradient in regions 402 , 404 , 406 encourages a user to move to the left where objects are not distorted; this may be reinforced by making region 402 larger at later times as the field of view moves in the correct direction.
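One way to sketch the x-compression gradient is nearest-neighbour resampling of each half of a scan line into unequal output widths; this is an illustrative simplification with assumed names, not the disclosed renderer:

```python
def distort_row(row, left_out, right_out):
    """Squeeze a pixel row so its right half is compressed harder.

    The left half of `row` is resampled (nearest neighbour) into
    `left_out` pixels and the right half into `right_out` pixels,
    mimicking the x-compression gradient of regions 404 and 406 in
    FIG. 4B.  Columns freed by the compression could then show
    out-of-view objects, as in region 402.
    """
    half = len(row) // 2
    def resample(src, width):
        return [src[int(i * len(src) / width)] for i in range(width)]
    return resample(row[:half], left_out) + resample(row[half:], right_out)

row = list(range(8))            # pixel columns 0..7
# Left half keeps all 4 columns; right half is squeezed into 2.
squeezed = distort_row(row, 4, 2)   # -> [0, 1, 2, 3, 4, 6]
```

Widening `left_out` over successive frames as the field of view turns the correct way matches the reinforcement described for region 402.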
- FIG. 5 shows an overhead map of a path presented by an embodiment of a head-mounted display.
- the path is defined by the sequence of virtual barriers 502 , 504 , 512 , 514 , 518 on the left and by the sequence of virtual barriers 506 , 508 , 510 , 516 , 520 on the right.
- each virtual barrier is represented as the edge of the path but may be rendered in any manner.
- the HR system display view 500 is rendered in full-screen as the path from an overhead perspective, but the map may be shown in a window of the display in some embodiments.
- the current position of the user relative to the path is shown by icon 522 which moves as the user moves.
- the arrow 524 indicates routing instructions, and may be enhanced, for example, using voice commands or text prompts.
- the user is on the path and so the HR system prompts are confined to simple routing instructions; however, if the user starts to stray from the desired path, each virtual barrier 502 , 504 , 512 , 514 , 518 , 506 , 508 , 510 , 516 , 520 may use one or more encroaching stimuli to keep the user on path.
- the overhead map may show the whole map or a current portion which may scroll or update.
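Keeping the user on the mapped path requires knowing how far they have strayed from it. A minimal point-to-polyline distance sketch follows; the function names and the corridor-half-width trigger are illustrative assumptions:

```python
import math

def off_path_distance(user_xy, waypoints):
    """Distance from the user to the nearest segment of a planned path.

    `waypoints` is the ordered list of (x, y) points defining the path
    centre line (as on the overhead map of FIG. 5).  When the returned
    distance exceeds the corridor half-width, the HR system could start
    escalating the barrier stimuli.
    """
    def seg_dist(p, a, b):
        ax, ay = a; bx, by = b; px, py = p
        vx, vy = bx - ax, by - ay
        length_sq = vx * vx + vy * vy
        if length_sq == 0.0:
            return math.hypot(px - ax, py - ay)
        # Clamp the projection onto the segment to [0, 1].
        t = max(0.0, min(1.0, ((px - ax) * vx + (py - ay) * vy) / length_sq))
        return math.hypot(px - (ax + t * vx), py - (ay + t * vy))

    return min(seg_dist(user_xy, a, b)
               for a, b in zip(waypoints, waypoints[1:]))

# An L-shaped path; a user 1 m off the first leg is within a 2 m
# corridor, so only routing prompts (arrow 524) are needed.
path = [(0.0, 0.0), (10.0, 0.0), (10.0, 10.0)]
drift = off_path_distance((5.0, 1.0), path)   # -> 1.0
```

The same distance feeds the icon 522 placement and decides when the flanking virtual barriers switch from passive edges to active stimuli.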
- FIG. 6A shows part of a 3D path presented by an embodiment of a head-mounted display.
- the current portion of the path facing the user is defined by the sequence of virtual barriers 602 , 604 on the left and by the sequence of virtual barriers 606 , 608 on the right.
- each virtual barrier is represented as a blank wall but may be rendered in any manner.
- Real-world objects are not shown in FIG. 6A for clarity, but embodiments may include real-world objects overlaid by the virtual barriers.
- the HR system display view 650 is rendered in full-screen as the portion of the path directly in front of the user, but the map may be shown in a window of the display in some embodiments.
- the current position of the user relative to the path is indicated by the lower edge of the screen.
- the arrows 610 , 612 indicate routing instructions, and may be enhanced, for example, using voice commands or text prompts.
- the user is on the path and so the HR system prompts are confined to simple routing instructions; however, if the user starts to stray from the desired path, each virtual barrier 602 , 604 , 606 , 608 may use one or more encroaching stimuli to keep the user on path.
- the display 650 is updated to reflect the current perspective of the user.
- FIG. 6B shows the exit of a 3D path presented by an embodiment of a head-mounted display.
- the current portion of the path facing the user is defined by virtual barrier 632 on the left and by virtual barrier 634 on the right.
- each virtual barrier is represented as a blank wall but may be rendered in any manner.
- Real-world objects are not shown in FIG. 6B for clarity, but embodiments may include real-world objects overlaid by the virtual barriers.
- the HR system display view 655 is rendered in full-screen as the portion of the path directly in front of the user, but the map may be shown in a window of the display.
- the current position of the user relative to the path is indicated by the lower edge of the screen.
- the arrows 620 , 622 , 624 indicate routing instructions to the exit, and may be enhanced, for example, using voice commands or text prompts.
- the user is on the path and so the HR system prompts are confined to simple routing instructions; however, if the user starts to stray from the desired path, each virtual barrier 632 , 634 may use one or more encroaching stimuli to keep the user on path.
- the map is updated to reflect the current perspective of the user.
- the exit of the path is shown as icon 630 but may be enhanced or replaced with a view of the real-world from the correct perspective.
- FIG. 7 shows a path for a hand motion presented by an embodiment of a head-mounted display.
- the required path for hand 702 starts at 3D position 704 and ends at 3D position 706 and is blocked by real-world components 710 , 712 .
- the real-world interference components 710 , 712 may be rendered as artificial objects or shown as real-world objects.
- the path between 3D locations 704 , 706 that the hand 702 is to follow is shown by dotted line segments 720 , 722 , 724 . Note that the dotted line segments 720 , 722 , 724 might not be shown on the display but are included in FIG. 7 to show the desired path of the hand 702 .
- a sequence of virtual barriers 730 , 732 , 734 on the left and a sequence of virtual barriers 740 , 742 , 744 on the right guarding the path are displayed to the user.
- each virtual barrier 730 , 732 , 734 , 740 , 742 , 744 may use one or more encroaching stimuli to keep the user on path, such as, but not limited to, a haptic stimulus in a glove worn by hand 702 .
- the position of the hand 702 is updated to show the current location in 3D space or is shown as the real-world hand.
- the virtual barriers 730 , 732 , 734 , 740 , 742 , 744 may not be shown on the display to reduce screen clutter.
- FIG. 8 is a block diagram of an embodiment of an HR system 800 which may have some components implemented as part of a head-mounted assembly.
- the HR system 800 may be considered a computer system that can be adapted to be worn on the head, carried by hand, or otherwise attached to a user.
- a structure 805 is included which is adapted to be worn on the head of a user.
- the structure 805 may include straps, a helmet, a hat, or any other type of mechanism to hold the HR system on the head of the user as an HMD.
- the HR system 800 also includes a display 850 coupled to position the display 850 in a field-of-view (FOV) of the user.
- the structure 805 may position the display 850 in a field of view of the user.
- the display 850 may be a stereoscopic display with two separate views of the FOV, such as view 852 for the user's left eye, and view 854 for the user's right eye.
- the two views 852 , 854 may be shown as two images on a single display device or may be shown using separate display devices that are included in the display 850 .
- the display 850 may be transparent, such as in an augmented reality (AR) HMD.
- the view of the FOV of the real-world as seen through the display 850 by the user is composited with virtual objects that are shown on the display 850 .
- the virtual objects may occlude real objects in the FOV as overlay elements and may themselves be transparent or opaque, depending on the technology used for the display 850 and the rendering of the virtual object.
- a virtual object, such as a virtual barrier may be positioned in a virtual space, that could be two-dimensional or three-dimensional, depending on the embodiment, and may be anchored to an associated real object in real space.
- two different views of the virtual barrier may be rendered and shown in two different relative positions on the two views 852 , 854 , depending on the disparity as defined by the inter-ocular distance of a viewer.
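The disparity-based placement described above can be sketched numerically. The following is a minimal pinhole-camera sketch, not the patent's implementation; the function names, the 64 mm default inter-ocular distance, and the focal length in pixels are all illustrative assumptions.

```python
def screen_disparity(depth_m, iod_m=0.064, focal_px=800.0):
    """Horizontal pixel disparity between the left- and right-eye views
    for a virtual object at depth_m metres (pinhole-camera sketch).

    iod_m    -- inter-ocular distance (assumed 64 mm default)
    focal_px -- display focal length in pixels (assumed value)
    """
    if depth_m <= 0:
        raise ValueError("depth must be positive")
    return iod_m * focal_px / depth_m

def eye_positions(center_x_px, depth_m, iod_m=0.064, focal_px=800.0):
    """Return (left_x, right_x) screen positions for the two views 852, 854:
    each eye's image of the object is offset by half the disparity."""
    d = screen_disparity(depth_m, iod_m, focal_px)
    return center_x_px - d / 2.0, center_x_px + d / 2.0
```

Nearer objects produce larger disparity, so a virtual barrier anchored close to the user separates further in the two views 852, 854 than one far away.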
- the HR system 800 includes one or more sensors in a sensing block 840 to sense at least a portion of the FOV of the user by gathering the appropriate information for that sensor, for example visible light from a visible light camera, from the FOV of the user. Any number of any type of sensor, including sensors described previously herein, may be included in the sensor block 840 , depending on the embodiment.
- the HR system 800 may also include an I/O block 820 to allow communication with external devices.
- the I/O block 820 may include one or both of a wireless network adapter 822 coupled to an antenna 824 and a network adapter 826 coupled to a wired connection 828 .
- the wired connection 828 may be plugged into a portable device, for example a mobile phone, or may be a component of an umbilical system such as used in extreme environments.
- the HR system 800 includes a sound processor 860 which takes input from one or more microphones 862 .
- the microphones 862 may be attached to the user.
- External microphones, for example attached to an autonomous drone, may send sound data samples through wireless or wired connections to I/O block 820 instead of, or in addition to, the sound data received from the microphones 862 .
- the sound processor 860 may generate sound data which is transferred to one or more speakers 864 , which are a type of sound reproduction device.
- the generated sound data may be analog samples or digital values. If more than one speaker 864 is used, the sound processor may generate or simulate 2D or 3D sound placement.
- a first speaker may be positioned to provide sound to the left ear of the user and a second speaker may be positioned to provide sound to the right ear of the user. Together, the first speaker and the second speaker may provide binaural sound to the user.
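The 2D/3D sound placement mentioned above can be illustrated with constant-power panning between the two speakers. This is a generic audio technique offered as a sketch, not the sound processor 860's actual method; the function name and azimuth convention are assumptions.

```python
import math

def stereo_gains(azimuth_deg):
    """Constant-power panning sketch: map a source azimuth in degrees
    (-90 = hard left, +90 = hard right, 0 = centre) to a pair of
    (left_gain, right_gain) amplitudes whose combined power is constant."""
    # map azimuth [-90, +90] onto a pan angle in [0, pi/2]
    theta = (azimuth_deg + 90.0) / 180.0 * (math.pi / 2.0)
    return math.cos(theta), math.sin(theta)
```

A sound placed at the world position of a virtual barrier to the user's left would thus be reproduced louder in the first (left-ear) speaker, reinforcing the barrier's direction.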
- the HR system 800 includes a stimulus block 870 .
- the stimulus block 870 is used to provide other stimuli to expand the HR system user experience.
- Embodiments may include numerous haptic pads attached to the user that provide a touch stimulus.
- Embodiments may also include other stimuli, such as, but not limited to, changing the temperature of a glove, changing the moisture level or breathability of a suit, or adding smells to a breathing system.
- the HR system 800 may include a processor 810 and one or more memory devices 830 , which may also be referred to as a tangible medium or a computer readable medium.
- the processor 810 is coupled to the display 850 , the sensing block 840 , the memory 830 , I/O block 820 , sound block 860 , and stimulus block 870 , and is configured to execute the instructions 832 encoded on (i.e. stored in) the memory 830 .
- the HR system 800 may include an article of manufacture comprising a tangible medium 830 , that is not a transitory propagating signal, encoding computer-readable instructions 832 that, when applied to a computer system 800 , instruct the computer system 800 to perform one or more methods described herein, thereby configuring the processor 810 .
- While the processor 810 included in the HR system 800 may be able to perform methods described herein autonomously, in some embodiments, processing facilities outside of that provided by the processor 810 included inside of the HR system 800 may be used to perform one or more elements of methods described herein.
- the processor 810 may receive information from one or more of the sensors 840 and send that information through the wireless network adapter 822 to an external processor, such as a cloud processing system or an external server.
- the external processor may then process the sensor information to identify a location for a virtual barrier in the FOV and send the location to the processor 810 through the wireless network adapter 822 .
- the processor 810 may then use the geometry, appearance and location of the virtual barrier in the FOV to render and show the virtual barrier on the display 850 .
- the instructions 832 may instruct the HR system 800 to detect one or more objects in a field-of-view (FOV) using at least one sensor 840 coupled to the computer system 800 .
- the instructions 832 may further instruct the HR system 800 , using at least one sensor 840 , to determine the world position of the one or more objects.
- the instructions 832 may further instruct the HR system 800 to establish a world position for a barrier according to the world position of the one or more objects detected in the FOV.
- the instructions 832 may further instruct the HR system 800 to render an image of a virtual barrier on the display 850 at a position corresponding to the world position of a barrier.
- the instructions 832 instruct the HR system 800 to render the virtual barrier using images determined by information received by the HR system 800 related to conditions occluded behind the world position of the barrier.
- the instructions 832 may further instruct the HR system 800 to render an image of a virtual barrier on the display 850 at a position corresponding to the world position of a barrier at a fixed position over periods of time to correct for motion.
- the instructions 832 instruct the HR system 800 to render the virtual barrier from different perspectives as the user moves as determined by gyroscopes in the sensor block 840 .
- the instructions 832 may further instruct the HR system 800 to present an additional sensory stimulus to the user to encourage the user to avoid the world position of the virtual barrier object.
- the instructions 832 instruct the HR system 800 to determine the distance of the user from the world position of the virtual barrier object using at least one sensor 840 .
- the instructions 832 instruct the HR system 800 to provide one or more stimuli, for example an unnatural sound presented by sound processor 860 , a change of appearance of the virtual barrier object on the display 850 , a visual flash on the display 850 , a sudden loud noise presented by sound processor 860 , or a haptic push delivered through the stimulus block 870 .
- the haptic push may be delivered by any mechanism, such as, but not limited to, the HR system 800 , a suit worn by the user, or a glove worn by the user.
- the instructions 832 may further instruct the HR system 800 to calculate a direction for the user to move to avoid the world position of the virtual object and to deliver a haptic push in the determined direction to the HR system 800 , a suit worn by the user, or a glove worn by the user using the stimulus block 870 .
- These computer program instructions may also be stored in a computer readable medium that can direct a computer, other programmable data processing apparatus, or other devices to function in a particular manner, such that the instructions stored in the computer readable medium produce an article of manufacture including instructions which implement the function/act specified in the flowchart and/or block diagram block or blocks.
- the computer program instructions may also be loaded onto a computer, other programmable data processing apparatus, or other devices to cause a series of operational steps to be performed on the computer, other programmable apparatus or other devices to produce a computer implemented process such that the instructions which execute on the computer or other programmable apparatus provide processes for implementing the functions/acts specified in the flowchart and/or block diagram block or blocks.
- each block in the flowchart or block diagrams may represent a module, segment, or portion of code, which comprises one or more executable instructions for implementing the specified logical function(s).
- the functions noted in the block may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved.
- FIG. 9 is a flowchart 900 of an embodiment of a method for directing a user to approach a virtual barrier.
- the method starts 901 and a user position is received 902 from hardware integrated into a head-mounted display in some embodiments.
- the user position may be obtained relative to a known position using a depth sensor camera or gyroscopes, using absolute positioning systems such as GPS, or other mechanisms, depending on the embodiment.
- the flowchart 900 continues by determining 904 a distance ‘d’ from the known world-position of a virtual barrier, where the distance ‘d’ is the difference between the user location and the world-position of the virtual barrier.
- the flowchart 900 continues by comparing 906 ‘d’ to a nearness threshold ‘T1’.
- If the user is distant from the virtual barrier, then ‘d’ is greater than or equal to ‘T1’ and the flowchart returns to the start 901 . If the user is proximal to the virtual barrier, then ‘d’ is less than ‘T1’ and the flowchart 900 continues by starting a stimulus 908 , such as a sound, visual cue or haptic push in some embodiments.
- the flowchart 900 continues by receiving 910 a user position and determining 912 a distance ‘d’ to the virtual barrier at a new time instant.
- the distance ‘d’ is compared 914 to ‘T1’ and if ‘d’ is less than or equal to ‘T1’, then the user is still proximal to the barrier, and an update for ‘d’ 910 , 912 is repeatedly determined at subsequent time points. If ‘d’ is greater than ‘T1’, then the user has moved away from the barrier and the flowchart 900 continues by stopping the stimulus 916 and then returning to the start 901 .
- the threshold T1 may be a fixed value, for example set by default or by the user. In other embodiments, the threshold T1 may be a fixed value determined by system context, for example the type of barrier, the status of a barrier, or the current environment. In yet other embodiments, the threshold T1 may be variable depending on the state of the HR system or on user state.
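One pass of flowchart 900 can be sketched as a function of the user and barrier positions. This is a minimal illustration under assumed names; positions are treated as 3D tuples and `t1` as the nearness threshold ‘T1’.

```python
import math

def proximity_stimulus_step(user_pos, barrier_pos, t1):
    """One pass of flowchart 900: return True if the stimulus should be
    active (user within nearness threshold t1 of the barrier, steps
    906/908) and False if it should be stopped (steps 914/916)."""
    d = math.dist(user_pos, barrier_pos)   # determine distance, step 904/912
    return d < t1
```

Run at each time step, the function starts the stimulus as the user approaches and stops it once the user has moved away, matching the loop between steps 910 and 916.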
- FIG. 10 is a flowchart 1000 of an embodiment of a method for directing a user to retreat from a virtual barrier.
- the method starts 1001 and a user position is received 1002 from hardware integrated into a head-mounted display in some embodiments.
- the user position may be obtained relative to a known position using a depth sensor camera or gyroscopes, using absolute positioning systems such as GPS, or other mechanisms, depending on the embodiment.
- the flowchart 1000 continues by determining 1004 a distance ‘d’ from the known world-position of a virtual barrier, where the distance ‘d’ is the difference between the user location and the world-position of the virtual barrier.
- the flowchart 1000 continues by comparing 1006 ‘d’ to a nearness threshold ‘T1’.
- If the user is proximal to the virtual barrier, then ‘d’ is less than ‘T1’ and the flowchart 1000 continues by starting a reassuring stimulus 1008 , such as a sound, visual cue or haptic push in some embodiments.
- the flowchart 1000 continues by receiving 1010 a user position and determining 1012 a distance ‘d’ to the virtual barrier at a new time instant.
- the distance ‘d’ is compared 1014 to ‘T2’ and if ‘d’ is less than or equal to a threshold ‘T2’, then the user is still proximal to the barrier, and an update for ‘d’ 1010 , 1012 is repeatedly determined at subsequent time points. If ‘d’ is greater than ‘T2’, then the user has moved away from the barrier and the flowchart 1000 continues by stopping the stimulus 1016 and then returning to the start 1001 .
- the thresholds T1 and T2 may be fixed values, for example set by default or by the user. In other embodiments, the thresholds T1 and T2 may be fixed values determined by system context, for example the type of barrier, the status of a barrier, or the current environment. In yet other embodiments, the thresholds T1 and T2 may be variable depending on the state of the HR system or on user state.
- the thresholds T1 and T2 may have the same value.
- the HR system may require that the stimulus is applied for a minimum time, thus sometimes overriding the stopping 1016 of the stimulus.
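Using separate start and stop thresholds as in flowchart 1000 gives hysteresis: the stimulus does not flicker on and off when the user hovers near a single threshold. A minimal sketch, assuming T2 ≥ T1 and hypothetical names:

```python
def hysteresis_stimulus(d, t1, t2, active):
    """One pass of flowchart 1000: the stimulus turns on when the user
    comes within t1 of the barrier (steps 1006/1008) and, once active,
    stays on until the user has retreated beyond t2 (steps 1014/1016).
    Choosing t2 >= t1 avoids rapid on/off flicker at the boundary."""
    if not active:
        return d < t1      # start condition
    return d <= t2         # keep active until the user is clearly away
```

With T1 = 3 and T2 = 4, for example, a user oscillating around a distance of 3.5 keeps an already-started stimulus active but never triggers a fresh one.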
- FIG. 11 is a flowchart 1100 of an embodiment of a method for providing directional guidance to a user.
- the method starts 1101 and a world-position is established 1110 for a barrier.
- the flowchart 1100 continues by rendering 1120 an image of a virtual barrier object at a position corresponding to the world position on a head-mounted display worn by a user.
- the flowchart 1100 continues by presenting 1140 an additional sensory stimulus to the user to encourage the user to avoid the world position of the virtual barrier object; in some embodiments, the additional sensory stimulus is presented 1140 depending on user proximity to the established world-position of the barrier.
- the flowchart 1100 then returns to the start 1101 .
- the world-position is established 1110 by capturing 1112 an image of the field-of-view using a camera coupled to the HR system.
- objects in the field of view are detected 1114 .
- the position of the detected objects is computed using depth information from a sensor coupled to the HR system.
- using the depth and object orientation, the objects can be located 1116 in 3D real-world space.
- the virtual barrier is located at a fixed position relative to one or more of the objects in 3D real-world space.
- a vector to the virtual barrier may be determined 1142 using the world-position of the virtual barrier and computing the position relative to the current viewpoint of the user.
- a direction for the user to move to avoid the world position of the virtual barrier is calculated 1144 , for example an opposite vector.
- Information related to the direction may be communicated 1146 to a haptic system coupled to the HR system, for example one or more of a plurality of haptic pads in clothing worn by the user.
- the additional stimulus may then be applied 1148 by one or more haptic pads to encourage the user in the calculated direction.
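The "opposite vector" calculation of steps 1142–1144 can be sketched as follows; this is an illustrative implementation with assumed names, treating positions as 3D tuples.

```python
import math

def avoidance_direction(user_pos, barrier_pos):
    """Steps 1142-1144 sketch: the vector to the barrier is computed from
    the user's viewpoint, and the direction for the user to move is its
    opposite -- the unit vector from the barrier toward the user."""
    away = tuple(u - b for u, b in zip(user_pos, barrier_pos))
    norm = math.sqrt(sum(c * c for c in away))
    if norm == 0.0:
        return (0.0, 0.0, 0.0)   # user exactly at the barrier: no defined direction
    return tuple(c / norm for c in away)
```

The resulting unit vector could then be communicated (step 1146) to the haptic pads so that the push is felt in the calculated direction.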
- FIG. 12 is a flowchart 1200 of an embodiment of a method to provide directional guidance to a user wearing a head-mounted display.
- the method starts 1201 and includes obtaining 1203 a path through real-world space.
- the path can be obtained in any way, including, but not limited to, receiving a path from an external source, detecting locations of real world objects and calculating a path based on those locations, and receiving input from the user to define the path.
- Two or more virtual path indicators are presented 1205 to the user on the HMD.
- the virtual path indicators may show a path for the user to follow and/or may be virtual barriers indicating a place where the user should not go.
- the virtual path indicators can be overlaid on a view of a portion of the real-world space, to direct the user to follow the path.
- the plurality of virtual path indicators are displayed as a two-dimensional map of the path, but in other embodiments, the plurality of virtual path indicators are displayed as a path between two points in three dimensions in real-world space. In some cases, a subset of the plurality of virtual path indicators are displayed in correct perspective to the user as a portion of a three-dimensional map.
- the virtual path indicators may be presented sequentially, over time, as the user moves. So the method may include presenting a first virtual path indicator 1252 as a first virtual barrier object at a first location on the HMD based on a first location of the user at a first time, and presenting a second virtual path indicator 1254 as a second virtual barrier object at a second location on the HMD based on a second location of the user at a second time.
- the first virtual barrier object may be positioned at a first fixed position in real-world space
- the second virtual barrier object may be positioned at a second fixed position in real-world space as the user moves.
- Additional virtual barrier objects may be presented to the user as the user moves to guide the user to an end point of the path 1209 .
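The sequential presentation of indicators 1252, 1254 as the user moves can be sketched as advancing through an ordered list of waypoints. This is an illustrative model, not the patent's method; the waypoint list, index tracking, and `reached_radius` parameter are all assumptions.

```python
import math

def advance_indicator(path_points, idx, user_pos, reached_radius=1.0):
    """Flowchart 1200 sketch: path_points is the ordered list of real-world
    waypoints obtained at 1203. Skip past any waypoints the user is within
    reached_radius of, then return (new_idx, waypoint_to_display) -- the
    next virtual path indicator to present -- or (new_idx, None) when the
    end point 1209 has been reached."""
    while (idx < len(path_points)
           and math.dist(user_pos, path_points[idx]) <= reached_radius):
        idx += 1
    if idx >= len(path_points):
        return idx, None
    return idx, path_points[idx]
```

Called at each position update, this yields the first indicator based on the user's first location and later indicators based on later locations, each anchored at its fixed position in real-world space.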
- Embodiments may be useful in a variety of applications and in a variety of environments. While use by emergency responders has been described in some detail above, many other fields of endeavor may also use embodiments. Non-limiting examples of environments where embodiments may be used are described below.
- One example environment where embodiments may be used is in surgical operations.
- a surgeon may use an embodiment to prevent a scalpel from touching a healthy internal organ.
- the virtual barriers may be situated just outside a cancer tumor to segregate it from the remaining part of the subject organ. If a scalpel held by a surgeon, or in some cases held by a machine controlled by a surgeon, crosses the virtual barriers, a haptic shake may be presented by a glove worn by the surgeon. If the scalpel goes further beyond the cancer into the healthy organ, the stimuli may be changed, for example, the amplitude of the shake may be increased, or an audio alert may be generated.
- the haptic shake may be administered to the surgeon's wrist to avoid disturbing subtle finger movements of the surgeon.
- Another example environment is a helmet or goggles worn by a rider of a motorcycle or bicycle; real-world obstacles may be marked with virtual barriers, and in case the rider gets within a certain distance of a virtual barrier, a haptic shake may be added to the helmet or goggles.
- the haptic shake may be administered by handlebars of the cycle, and/or autonomous control of the cycle may be used to avoid a collision with the real-world obstacle marked by the virtual barrier.
- Another example environment where embodiments may be used is a helmet worn by a worker in a factory or warehouse.
- the worker can see virtual barriers when she/he is in front of dangerous materials, such as toxic chemicals, highly heated materials or machines, heavy objects, or the like.
- Stimuli such as haptic shakes, alerting sounds, and unpleasant odors can also be administered by the helmet or other equipment.
- Information about the dangerous materials encountered by each worker may be shared with co-workers in a timely manner or synchronized by transmitting data through wireless devices carried by workers.
- Virtual path indicators may be generated by (i) identifying a current location using GPS, a cellular connection, a WiFi connection, or any other location detection mechanism and setting a goal location (manually or by using data transmitted from a remote server); (ii) sensing real objects by a sensor or obtaining geographical data from a remote server; (iii) calculating a route which avoids the real objects; and (iv) displaying virtual path indicators in accordance with the calculated route. For instance, a worker may set a location of a particular part in a warehouse as a goal. The HMD worn by the worker may then display a route to that location as a virtual path in the HMD.
- Virtual barriers may also be shown to the worker in the HMD to keep the worker away from dangerous areas, in addition to showing the virtual path.
- the route may be calculated based on a shortest distance, a safest path, or any other criteria.
- a computer embodied in the HMD or a remote server may utilize a map of the factory or warehouse to calculate the path. To ensure more security, the calculated path may maintain a minimum distance from dangerous areas.
- the route may be calculated by a computer embodied in a remote server monitoring the factory's entire system.
- the HMD may then display arrows and dotted lines in accordance with the optimum route. The arrows and dotted lines may be changed based on the current location of the worker and the calculated route on a moment-to-moment basis.
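The route-calculation step (iii) above can be sketched as a search over a 2D occupancy grid of the factory floor. This is a generic shortest-distance sketch, not the patent's algorithm; the grid representation and function name are assumptions, and a "safest path" variant could instead weight cells by their distance from dangerous areas.

```python
from collections import deque

def calc_route(grid, start, goal):
    """Breadth-first search on a 2D grid where 1 marks a dangerous or
    blocked cell (an area guarded by a virtual barrier) and 0 is free.
    Returns the shortest route as a list of (row, col) cells, or None
    if no safe route exists."""
    rows, cols = len(grid), len(grid[0])
    prev = {start: None}          # visited set + back-pointers
    queue = deque([start])
    while queue:
        r, c = queue.popleft()
        if (r, c) == goal:
            path, cur = [], goal  # walk back-pointers to recover the route
            while cur is not None:
                path.append(cur)
                cur = prev[cur]
            return path[::-1]
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nr, nc = r + dr, c + dc
            if (0 <= nr < rows and 0 <= nc < cols
                    and grid[nr][nc] == 0 and (nr, nc) not in prev):
                prev[(nr, nc)] = (r, c)
                queue.append((nr, nc))
    return None
```

The minimum-distance-from-danger requirement mentioned earlier could be approximated by also marking cells adjacent to dangerous cells as blocked before running the search.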
- Another example environment where embodiments may be used is a helmet worn by a worker at a construction site.
- a worker wears a helmet with an HMD which presents virtual barriers to the worker as he/she is about to enter a dangerous or restricted area, such as a soft ground area, an area with a deep hole, an area storing heavy or fragile construction materials, or the like.
- Stimuli such as haptic shakes, alerting sounds, and unpleasant odors can also be administered by the helmet or other equipment.
- a central server may control and provide the information about dangerous or restricted areas to the HMDs of the workers and keep the information updated.
- a worker wearing the helmet with HMD may be able to see virtual path indicators.
- a worker at a construction site may select a location at the construction site as a goal and obtain its location using GPS data or other data obtained from a remote server.
- the HMD may then generate and/or display virtual barriers by obtaining data of dangerous areas from data already stored in the helmet or data obtained from a remote server, such as locations of a soft ground area, an open pit, or a high traffic area.
- a computer embodied in the HMD or a remote server, having a map of the construction site, may then calculate a route toward the goal location.
- the route may be a two-dimensional route using paths on the ground, or may be a three-dimensional route that includes paths at different levels, such as different floors in a building.
- the route may be calculated to maintain a minimum distance from virtual barriers. After the route is calculated, the HMD may then display arrows and dotted lines as a virtual path display of the route.
- Another example environment where embodiments may be used is piloting an airplane.
- When a pilot wears a helmet with an HMD, she/he can receive guidance from the helmet.
- Real obstacles and/or hazardous locations may be overlaid or highlighted by virtual barriers which visually support the pilot. These virtual barriers may be generated based on GPS data and map information.
- the pilot may also receive stimuli, such as haptic shakes, alerting sounds, or unpleasant odors, to reinforce the danger as the virtual barriers are approached.
- Another example environment where embodiments may be used is a diving mask worn by a diver.
- a diver may be presented with virtual barriers by a HMD integrated into the diving mask when she/he faces dangerous creatures or is about to enter a restricted area in the water.
- she/he may receive haptic shakes or an alerting sound from the mask or a diving suit.
- the diver may also be able to see virtual path indicators.
- a diver may set a desired destination as a goal, and the location of the diver may be calculated using GPS (if at the surface), using geographical data, or by using other location determination techniques.
- the location of the goal is also determined and a path calculated to that goal which may then be shown on the HMD as a virtual path display to guide the diver to the goal location.
- Because GPS signals may not be available underwater, inertial navigation techniques may be used to determine the current location of the diver as they follow the instructions of the virtual path.
- virtual barriers may also be shown to the diver, in conjunction with the virtual path, to keep the diver away from danger.
- the HMD may have safe places stored in its memory, or safe places may be obtained from a remote server. If the diver encounters an emergency situation, this may be indicated to the HMD, a safe place may be selected and a virtual path to that safe place shown to the diver on the HMD to guide them to the safe place.
- aspects of the various embodiments may be embodied as a system, device, method, or computer program product apparatus. Accordingly, elements of the present disclosure may take the form of an entirely hardware embodiment, an entirely software embodiment (including firmware, resident software, micro-code, or the like) or an embodiment combining software and hardware aspects that may all generally be referred to herein as a “server,” “circuit,” “module,” “client,” “computer,” “logic,” or “system,” or other terms. Furthermore, aspects of the various embodiments may take the form of a computer program product embodied in one or more computer-readable medium(s) having computer program code stored thereon.
- a computer-readable storage medium may be embodied as, for example, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or other like storage devices known to those of ordinary skill in the art, or any suitable combination of computer-readable storage mediums described herein.
- a computer-readable storage medium may be any tangible medium that can contain, or store a program and/or data for use by or in connection with an instruction execution system, apparatus, or device.
- a computer data transmission medium such as a transmission line, a coaxial cable, a radio-frequency carrier, and the like, may also be able to store data, although any data storage in a data transmission medium can be said to be transitory storage. Nonetheless, a computer-readable storage medium, as the term is used herein, does not include a computer data transmission medium.
- Computer program code for carrying out operations for aspects of various embodiments may be written in any combination of one or more programming languages, including object oriented programming languages such as Java, Python, C++, or the like, conventional procedural programming languages, such as the “C” programming language or similar programming languages, or low-level computer languages, such as assembly language or microcode.
- the computer program code, if loaded onto a computer or other programmable apparatus, produces a computer-implemented method.
- the instructions which execute on the computer or other programmable apparatus may provide the mechanism for implementing some or all of the functions/acts specified in the flowchart and/or block diagram block or blocks.
- the program code may execute entirely on the user's computer, partly on the user's computer, as a stand-alone software package, partly on the user's computer and partly on a remote computer or entirely on the remote computer or server, such as a cloud-based server.
- the remote computer may be connected to the user's computer through any type of network, including a local area network (LAN) or a wide area network (WAN), or the connection may be made to an external computer (for example, through the Internet using an Internet Service Provider).
- the computer program code stored in/on (i.e. embodied therewith) the non-transitory computer-readable medium produces an article of manufacture.
- the computer program code if executed by a processor causes physical changes in the electronic devices of the processor which change the physical flow of electrons through the devices. This alters the connections between devices which changes the functionality of the circuit. For example, if two transistors in a processor are wired to perform a multiplexing operation under control of the computer program code, if a first computer instruction is executed, electrons from a first source flow through the first transistor to a destination, but if a different computer instruction is executed, electrons from the first source are blocked from reaching the destination, but electrons from a second source are allowed to flow through the second transistor to the destination. So a processor programmed to perform a task is transformed from what the processor was before being programmed to perform that task, much like a physical plumbing system with different valves can be controlled to change the physical flow of a fluid.
- the singular forms “a”, “an”, and “the” include plural referents unless the content clearly dictates otherwise.
- the term “or” is generally employed in its sense including “and/or” unless the content clearly dictates otherwise.
- the term “coupled” includes direct and indirect connections. Moreover, where first and second devices are coupled, intervening devices including active devices may be located there between.
Description
- This application is a continuation of U.S. patent application Ser. No. 16/031,772, now U.S. Pat. No. 10,818,088, entitled Virtual Barrier Objects, filed Jul. 10, 2018. This application is also related to U.S. patent application Ser. No. 16/031,797, now U.S. Pat. No. 10,650,600 entitled Virtual Path Display, filed Jul. 10, 2018. The contents of both aforementioned applications are incorporated by reference herein for any and all purposes.
- The present subject matter relates to displaying information, and more specifically, to presenting virtual barriers or virtual paths added to a real-world scene.
- Many situations require presenting information to a user in a way that ensures the user receives it when it is needed and acts on it accordingly. Emergency response is one of many professions where this is important: the ability to receive the right information at the right time can be a matter of life or death. Traditionally, emergency responders have relied on audio transmissions over a radio for a majority of their information, but that is changing with the advent of widespread wireless digital communication.
- One new technology that is making its way into the world of emergency responders is digital displays. These displays may be on a handheld device, such as a mobile phone, or on a head-mounted display (HMD), such as a virtual reality (VR) display or an augmented reality (AR) display, which may be integrated into their emergency equipment, such as their helmet. Textual information can be presented to the emergency responder through the display and the information can be updated in real-time through the digital wireless interface from a command center or other information sources.
- The accompanying drawings, which are incorporated in and constitute part of the specification, illustrate various embodiments. Together with the general description, the drawings serve to explain various principles. In the drawings:
FIG. 1A shows a user wearing an embodiment of a head-mounted display presenting a virtual barrier; -
FIG. 1B shows a user wearing the embodiment of a head-mounted display presenting the virtual barrier from a different perspective; -
FIG. 2A shows a user wearing an embodiment of a head-mounted display presenting a virtual barrier; -
FIG. 2B shows a user wearing the embodiment of a head-mounted display presenting a different state of the virtual barrier; -
FIG. 3 shows a user wearing an embodiment of a head-mounted display presenting a brightness mask used to direct the user; -
FIG. 4A shows in-view and out-of-view objects near a user wearing an embodiment of a head-mounted display; -
FIG. 4B shows a distortion mask presenting out-of-view data to direct a user wearing an embodiment of a head-mounted display; -
FIG. 5 shows an overhead map of a path presented by an embodiment of a head-mounted display; -
FIG. 6A shows part of a 3D path presented by an embodiment of a head-mounted display; -
FIG. 6B shows the exit of a 3D path presented by an embodiment of a head-mounted display; -
FIG. 7 shows a path for a hand motion presented by an embodiment of a head-mounted display. -
FIG. 8 shows a block diagram of an embodiment of an HR system; -
FIG. 9 is a flowchart of an embodiment of a method for directing a user to approach a virtual barrier; -
FIG. 10 is a flowchart of an embodiment of a method for directing a user to retreat from a virtual barrier; -
FIG. 11 is a flowchart of an embodiment of a method for providing directional guidance to a user; and -
FIG. 12 is a flowchart of an embodiment of a method to provide directional guidance to a user wearing a head-mounted display. - In the following detailed description, numerous specific details are set forth by way of examples in order to provide a thorough understanding of the relevant teachings. However, it should be apparent to those skilled in the art that the present teachings may be practiced without such details. In other instances, well known methods, procedures and components have been described at a relatively high-level, without detail, in order to avoid unnecessarily obscuring aspects of the present concepts. A number of descriptive terms and phrases are used in describing the various embodiments of this disclosure. These descriptive terms and phrases are used to convey a generally agreed upon meaning to those skilled in the art unless a different definition is given in this specification. Some descriptive terms and phrases are presented in the following paragraphs for clarity.
- Hybrid Reality (HR), as the phrase is used herein, refers to an image that merges real-world imagery with imagery created in a computer, which is sometimes called virtual imagery. While an HR image can be a still image, it can also be a moving image, such as imagery created using a video stream. HR can be displayed by a traditional two-dimensional display device, such as a computer monitor, one or more projectors, or a smartphone screen. HR imagery can also be displayed by a head-mounted display (HMD). Many different technologies can be used in an HMD to display HR imagery. A virtual reality (VR) HMD system may receive images of a real-world object, objects, or scene, and composite those images with a virtual object, objects, or scene to create an HR image. An augmented reality (AR) HMD system may present a virtual object, objects, or scene on a transparent screen which then naturally mixes the virtual imagery with a view of a scene in the real-world. A display which mixes live video with virtual objects is sometimes denoted AR, but for the purposes of this disclosure, an AR HMD includes at least a portion of the display area that is transparent to allow at least some of the user's view of the real-world to be directly viewed through the transparent portion of the AR HMD. The display used by an HR system represents a scene which is a visible portion of the whole environment. As used herein, the terms “scene” and “field of view” (FOV) are used to indicate what is visible to a user.
- The word “occlude” is used herein to mean that a pixel of a virtual element is mixed with an image of another object to change the way the object is perceived by a viewer. In a VR HMD, this can be done through use of a compositing process to mix the two images, a Z-buffer technique to remove elements of the image that are hidden from view, a painter's algorithm to render closer objects later in the rendering process, or any other technique that can replace a pixel of the image of the real-world object with a different pixel value generated from any blend of real-world object pixel value and an HR system determined pixel value. In an AR HMD, the virtual object occludes the real-world object if the virtual object is rendered, transparently or opaquely, in the line of sight of the user as they view the real-world object. In the following description, the terms “occlude”, “transparency”, “rendering” and “overlay” are used to denote the mixing or blending of new pixel values with existing object pixel values in an HR display.
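The per-pixel Z-buffer test and blending described above can be sketched as follows. This is a minimal illustration under stated assumptions, not the method of any particular embodiment; the function and parameter names are invented for the example:

```python
import numpy as np

def composite_pixel(real_rgb, virt_rgb, virt_alpha, real_depth, virt_depth):
    """Blend a virtual pixel over a real-world pixel.

    The virtual pixel only occludes the real one if it is closer to the
    viewer (a per-pixel Z-buffer test); otherwise the real pixel is kept.
    """
    if virt_depth >= real_depth:
        # virtual element is hidden behind the real-world object
        return np.asarray(real_rgb, dtype=float)
    # standard "over" alpha blend: out = a * virtual + (1 - a) * real
    a = float(virt_alpha)
    return a * np.asarray(virt_rgb, dtype=float) + (1.0 - a) * np.asarray(real_rgb, dtype=float)
```

A fully transparent overlay (`virt_alpha = 0`) leaves the real-world pixel unchanged, while `virt_alpha = 1` replaces it entirely, matching the range of blends the paragraph describes.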
- In some embodiments of HR systems, there are sensors which provide the information used to render the HR imagery. A sensor may be mounted on or near the display, on the viewer's body, or be remote from the user. Remote sensors may include, but are not limited to, fixed sensors attached in an environment, sensors attached to robotic extensions, sensors attached to autonomous or semi-autonomous drones, or sensors attached to other persons. Data from the sensors may be raw or filtered. Data from the sensors may be transmitted wirelessly or using a wired connection.
- Sensors used by some embodiments of HR systems include, but are not limited to, a camera that captures images in the visible spectrum, an infrared depth camera, a microphone, a sound locator, a Hall effect sensor, an air-flow meter, a fuel level sensor, an oxygen sensor, an electronic nose, a gas detector, an anemometer, a mass flow sensor, a Geiger counter, a gyroscope, an infrared temperature sensor, a flame detector, a barometer, a pressure sensor, a pyrometer, a time-of-flight camera, radar, or lidar. Sensors in some HR system embodiments that may be attached to the user include, but are not limited to, a biosensor, a biochip, a heartbeat sensor, a pedometer, a skin resistance detector, or skin temperature detector.
- The display technology used by an HR system embodiment may include any method of projecting an image to an eye. Conventional technologies include, but are not limited to, cathode ray tube (CRT), liquid crystal display (LCD), light emitting diode (LED), plasma or organic LED (OLED) screens, or projectors based on those technologies or digital micromirror devices (DMD). It is also contemplated that virtual retina displays, such as direct drawing on the eye's retina using a holographic grating, may be used. It is also contemplated that direct machine to brain interfaces may be used in the future.
- The display of an HR system may also be an HMD or a separate device, such as, but not limited to, a hand-held mobile phone, a tablet, a fixed monitor or a TV screen.
- The connection technology used by an HR system may include any physical link and associated protocols, such as, but not limited to, wires, transmission lines, solder bumps, near-field connections, infra-red connections, or radio frequency (RF) connections such as cellular, satellite or Wi-Fi® (a registered trademark of the Wi-Fi Alliance). Virtual connections, such as software links, may also be used to connect to external networks and/or external compute resources.
- In many HR embodiments, aural stimuli and information may be provided by a sound system. The sound technology may include monaural, binaural, or multi-channel systems. A binaural system may include a headset or another two-speaker system but may also include systems with more than two speakers directed to the ears. The sounds may be presented as 3D audio, where each sound has a perceived position in space, achieved by using reverberation and head-related transfer functions to mimic how sounds change as they move in a particular space.
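As a simplified illustration of giving a sound a perceived position, a constant-power panning law approximates placement between two speakers; full head-related transfer function processing, as mentioned above, is far more involved. The function below is a hypothetical sketch:

```python
import math

def pan_gains(azimuth_deg):
    """Constant-power stereo panning gains for a sound source.

    azimuth_deg: -90 (full left) .. +90 (full right), 0 = center.
    Returns (left_gain, right_gain). This is a crude stand-in for
    full binaural rendering with head-related transfer functions.
    """
    # map azimuth onto a pan angle in [0, pi/2]
    theta = (azimuth_deg + 90.0) / 180.0 * (math.pi / 2.0)
    # cos/sin pair keeps total power (l^2 + r^2) constant across the arc
    return math.cos(theta), math.sin(theta)
```

Scaling the left and right sample streams by these gains makes a warning sound appear to come from one side, which is one way a system could nudge a user away from a virtual barrier.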
- In many HR system embodiments, objects in the display may move. The movement may be due to the user moving within the environment, for example walking, crouching, turning, or tilting the head. The movement may be due to an object moving, for example a dog running away, a car coming towards the user, or a person entering the FOV. The movement may also be due to an artificial movement, for example the user moving an object on a display or changing the size of the FOV. In one embodiment, the motion may be due to the user deliberately distorting all or part of the FOV, for example adding a virtual fish-eye lens. In the following description, all motion is considered relative; any motion may be resolved to a motion from a single frame of reference, for example the user's viewpoint.
- When there is motion in an HR system, the perspective of any generated object overlay may be corrected so that it changes with the shape and position of the associated real-world object. This may be done with any conventional point-of-view transformation based on the angle of the object from the viewer; note that the transformation is not limited to simple linear or rotational functions, with some embodiments using non-Abelian transformations. It is contemplated that motion effects, for example blur or deliberate edge distortion, may also be added to a generated object overlay.
- In some HR embodiments, images from cameras, whether sensitive to one or more of visible, infra-red, or microwave spectra, may be processed before algorithms are executed. Algorithms used after image processing for embodiments disclosed herein may include, but are not limited to, object recognition, motion detection, camera motion and zoom detection, light detection, facial recognition, text recognition, or mapping an unknown environment. The image processing may also use conventional filtering techniques, such as, but not limited to, static, adaptive, linear, non-linear, and Kalman filters. Deep-learning neural networks may be trained in some embodiments to mimic functions which are hard to create algorithmically. Image processing may also be used to prepare the image, for example by reducing noise, restoring the image, edge enhancement, or smoothing.
- In some HR embodiments, objects may be detected in the FOV of one or more cameras. Objects may be detected by using conventional algorithms, such as, but not limited to, edge detection, feature detection (for example surface patches, corners and edges), greyscale matching, gradient matching, pose consistency, or database look-up using geometric hashing. Genetic algorithms and trained neural networks using unsupervised learning techniques may also be used in embodiments to detect types of objects, for example people, dogs, or trees.
- In embodiments of an HR system, object recognition may be performed on a single frame of a video stream, although techniques using multiple frames are also envisioned. Advanced techniques, such as, but not limited to, Optical Flow, camera motion, and object motion detection may be used between frames to enhance object recognition in each frame.
- After object recognition, rendering the object may be done by the HR system embodiment using databases of similar objects, the geometry of the detected object, or how the object is lit, for example specular reflections or bumps.
- In some embodiments of an HR system, the locations of objects may be generated from maps and object recognition from sensor data. Mapping data may be generated on the fly using conventional techniques, for example the Simultaneous Location and Mapping (SLAM) algorithm used to estimate locations using Bayesian methods, or extended Kalman filtering which linearizes a non-linear Kalman filter to optimally estimate the mean or covariance of a state (map), or particle filters which use Monte Carlo methods to estimate hidden states (map). The locations of objects may also be determined a priori, using techniques such as, but not limited to, reading blueprints, reading maps, receiving GPS locations, receiving relative positions to a known point (such as a cell tower, access point, or other person) determined using depth sensors, WiFi time-of-flight, or triangulation to at least three other points.
- Gyroscope sensors on or near the HMD may be used in some embodiments to determine head position and to generate relative motion vectors which can be used to estimate location.
- In embodiments of an HR system, sound data from one or more microphones may be processed to detect specific sounds. Sounds that might be identified include, but are not limited to, human voices, glass breaking, human screams, gunshots, explosions, door slams, or a sound pattern a particular machine makes when defective. Gaussian Mixture Models and Hidden Markov Models may be used to generate statistical classifiers that are combined and looked up in a database of sound models. One advantage of using statistical classifiers is that sounds can be detected more consistently in noisy environments.
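A minimal sketch of statistical sound classification in the spirit described above, using a single diagonal Gaussian per sound class rather than full Gaussian Mixture or Hidden Markov Models; the class labels and feature vectors are hypothetical:

```python
import numpy as np

def log_likelihood(x, mean, var):
    """Log-likelihood of feature vector x under a diagonal Gaussian."""
    x, mean, var = (np.asarray(v, dtype=float) for v in (x, mean, var))
    return float(-0.5 * np.sum(np.log(2 * np.pi * var) + (x - mean) ** 2 / var))

def classify(x, models):
    """Pick the most likely sound class for feature vector x.

    models: dict mapping label -> (mean, var) for that class's Gaussian,
    standing in for a database of sound models.
    """
    return max(models, key=lambda label: log_likelihood(x, *models[label]))
```

Because the decision compares likelihoods rather than hard thresholds, a moderately noisy feature vector still lands on the nearest model, which is the robustness property the paragraph attributes to statistical classifiers.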
- In some embodiments of an HR system, eye tracking of one or both viewer's eyes may be performed. Eye tracking may be used to measure the point of the viewer's gaze. In an HMD, the position of each eye is known, and so there is a reference frame for determining head-to-eye angles, and so the position and rotation of each eye can be used to estimate the gaze point. Eye position determination may be done using any suitable technique and/or device, including, but not limited to, devices attached to an eye, tracking the eye position using infra-red reflections, for example Purkinje images, or using the electric potential of the eye detected by electrodes placed near the eye which uses the electrical field generated by an eye independently of whether the eye is closed or not.
- Turning now to the current disclosure, systems that display HR imagery are becoming increasingly common and are making their way from entertainment and gaming into industrial and commercial applications. Examples of systems that may find HR imagery useful include aiding a person doing a task, for example repairing machinery, testing a system, or responding to an emergency.
- Many of the same environments where HR imagery might be used also provide information to a user. This information may be associated with real objects in the environment or may be related to the environment as a whole, for example an ambient or average value. In other cases, the information to be provided to the user is unrelated to the real environment they are working in. Providing the various types of information in a way that the user can readily understand, without confusing or distracting the user or obscuring details that the user needs, can be a challenge.
- In an HR system which aids a person doing a task, for example repairing machinery, testing a system, or responding to an emergency, there may be areas of the environment that should not be entered because of potential danger, for example exposure to toxins, possible electrical shock, an unbreathable atmosphere, or a potentially unstable platform. In another example, there may be objects in the environment that should be avoided, for example a critical component that cannot be repaired if broken, a sensitive detector that cannot be touched, or a device that may require a time-consuming and complex reconfiguration if knocked.
- The virtual barrier is an artificial object created as an overlay on the screen to indicate a barrier, for example to provide information that may be occluded or to recommend that the user's movements be restricted. To simulate a physical object, in some embodiments the HR system may use additional stimuli to reinforce the virtual barrier, for example sounds, haptic pushes and changes to the display. As one non-limiting example, a buzzing sound may increase in volume or frequency to indicate potential encroachment as a user approaches a virtual barrier.
- It is contemplated that a virtual barrier may be rendered in any manner, such as, but not limited to, a wall, a fence, a pole, an icon, or a cross. In some embodiments, a blanket color covering the component may be used to indicate that the component should be avoided. In some embodiments, additional graphics (e.g. icons or text) may be presented to provide further information.
- In some embodiments, the position and orientation of a virtual barrier may be fixed in 3D space. If there is relative motion as described previously herein, a virtual barrier may appear as a static object in the environment by updating the overlay to show a virtual barrier in the same position and orientation in 3D space.
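Keeping a virtual barrier fixed in 3D space amounts to re-projecting its world position into display coordinates for every new head pose. A simplified pinhole-camera sketch is shown below; the yaw-only rotation and intrinsic parameters (`focal`, `cx`, `cy`) are illustrative assumptions, not values from any embodiment:

```python
import numpy as np

def project_to_screen(world_pt, cam_pos, cam_yaw, focal=500.0, cx=320.0, cy=240.0):
    """Project a world-space 3D point into pixel coordinates.

    Re-running this each frame with the current head position and yaw
    keeps a world-anchored virtual barrier appearing static in 3D space
    as the user moves. Returns None when the point is behind the camera.
    """
    p = np.asarray(world_pt, dtype=float) - np.asarray(cam_pos, dtype=float)
    c, s = np.cos(-cam_yaw), np.sin(-cam_yaw)
    # rotate into the camera frame (y up, z forward)
    x = c * p[0] + s * p[2]
    z = -s * p[0] + c * p[2]
    if z <= 0:
        return None  # behind the viewer; nothing to draw
    return (cx + focal * x / z, cy - focal * p[1] / z)
```

Projecting each corner of a barrier this way, every frame, produces the perspective-correct, position-locked rendering the paragraph describes.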
- In some embodiments, a virtual barrier may also change appearance, such as, but not limited to, growing larger, changing color, or rendering a different image, to indicate a change in status.
- In some embodiments, if a virtual barrier is about to be crossed, the stimuli may be changed, increased in volume, or updated at a more frequent rate to encourage the user to move a hand or to move away. In one example embodiment, when a virtual barrier is being approached, a directional haptic shake in the HR system, glove or suit may be used to prompt the user to move in a specific direction. In another example embodiment, a surprising flash on the display may also be used to indicate that a virtual barrier is being crossed, causing the user to stop and reassess. Sounds that have a position in space may be presented to encourage the user to move in a specific direction, for example away from a virtual barrier. In some embodiments, the user may move away from a strident alarm or move towards a sound that is associated with continuing the task at hand.
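The escalation of a stimulus with proximity can be modeled as a simple function of the user's distance to the barrier. The linear ramp and the threshold values below are illustrative assumptions only:

```python
def buzz_level(distance_m, warn_radius=3.0, max_volume=1.0):
    """Volume of a warning buzz as a user nears a virtual barrier.

    Silent beyond warn_radius, then ramping linearly up to max_volume
    at the barrier itself, so the stimulus intensifies as the barrier
    is approached.
    """
    if distance_m >= warn_radius:
        return 0.0
    return max_volume * (1.0 - distance_m / warn_radius)
```

The same distance-to-intensity mapping could drive haptic pulse rate or flash frequency instead of volume.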
- In some embodiments, the stimuli may include adding unexpected or unpleasant odors to a breathing system, raising the temperature of a suit, or adding moisture, to make the user more uncomfortable and so treat a virtual barrier as something to be avoided.
- If a user moves away from a virtual barrier, it is contemplated that the virtual barrier may reinforce the user's action, such as, but not limited to, flashing, offering a reassuring voice or sound, or reducing any applied discomfort.
- In some embodiments, a virtual barrier may also be used to direct the user of an HR system to controlled actions, such as, but not limited to, a next task, a specific location, a specific orientation, a specific height, a head turn, a body rotation, a hand re-position, or a hand gesture. Time-varying stimuli may be used to indicate a pattern of motions to aid user comprehension of the desired action. A virtual barrier may indicate a direction using stimuli that have a spatial position, for example a haptic pattern on one side of a suit, glove or head-mounted system, or a sound that has a simulated 3D position. A virtual barrier may change the display to include directional attributes, for example shaking the display in a preferred direction or adding arrows. In an example embodiment, a gradient is added to the display to indicate direction. The gradient may be any visual effect that has a direction, such as, but not limited to, brightness, distortion, or time-based flashes of out-of-view objects on one side of the display. For example, if the action is to move left, the right side of the display may be dimmed out and the left side of the display enhanced by brightening.
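A brightness gradient like the one described above can be expressed as a per-pixel gain mask that is multiplied into the rendered frame. This is a hypothetical sketch; the parameter names and the linear ramp are assumptions:

```python
import numpy as np

def directional_brightness_mask(width, height, move_left=True, min_gain=0.1):
    """Per-pixel brightness gain that dims one side of the display.

    With move_left=True the right side fades toward min_gain, nudging
    the user toward the brighter left side. Multiply the rendered
    frame by this mask element-wise.
    """
    ramp = np.linspace(1.0, min_gain, width)  # bright -> dim, left to right
    if not move_left:
        ramp = ramp[::-1]  # dim the left side instead
    # repeat the one-row ramp for every display row
    return np.tile(ramp, (height, 1))
```

A display pipeline could widen the bright region frame by frame as the user turns the right way, reinforcing the cue as the text describes.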
- A plurality of virtual barriers may also be used to provide and enhance a specific path, for example a safe exit route in a smoke-filled building, or to a required position and orientation with respect to a physical object. In one embodiment, a path is defined by a sequence of virtual barriers placed in the correct relative position and orientation on the screen; note that in any situation, only a portion of a virtual barrier sequence may be in effect or in view.
FIG. 1A shows how one embodiment of an HR system shows a virtual barrier 108A correctly positioned between walls. The position and orientation of user 100 in space is indicated by compass 106A. The field of view (FOV) of user 100 is indicated by the dotted box 110; note that box 110 and virtual barrier 108A are not in the real environment. The image of virtual barrier 108A is rendered on the head-mounted display 112 in the correct orientation for the user 100 according to the world-position of the gap between the walls. -
FIG. 1B shows the user 100 in a new position and orientation as indicated by the compass 106B. At the time shown in FIG. 1B, user 100 is still looking at the gap between the walls. Similar to FIG. 1A, a virtual barrier 108B is rendered on the head-mounted display 112 in the gap between the walls as viewed by user 100. The renderings of the virtual barrier 108A, 108B at the times of FIG. 1A and FIG. 1B reinforce the appearance of a fixed solid object in real 3D space on display 112. - It should be noted that the
walls are real-world objects seen by user 100 in their FOV. Depending on the technology used, the user 100 may view the walls directly through a transparent portion of display 112, such as in an AR HMD, or an image of the walls may be shown on display 112 composited with the virtual barrier. -
FIG. 2A shows an example embodiment of an HR system rendering a virtual barrier 250 on the display 240 of user 200. The position and orientation of the gap between walls 210, 212 is known relative to user 200. The field of view of user 200 is indicated by the dotted box 230; note that box 230 and virtual barrier 250 are not in the real environment. The image of virtual barrier 250 is rendered on the head-mounted display 240 in the correct orientation for the user 200 according to the world-position of the gap between walls 210, 212. The image of virtual barrier 250 rendered is selected to denote a default stop barrier, comprising a pole 202, stand 204, stand 206, and icon 208. -
FIG. 2B shows an example embodiment of an HR system rendering a virtual barrier 260 on the display 240 of user 200. Similar to the vignette shown in FIG. 2A, a virtual barrier 260 is rendered on the head-mounted display 240 in the gap between walls 210, 212 as viewed by user 200. At the time of FIG. 2B, the status of the virtual barrier 260 has changed, for example a danger level has increased behind the barrier (e.g. the collapse of a floor behind the world position between walls 210, 212). The change of status causes the HR system to render a different virtual barrier 260 on head-mounted display 240. In FIG. 2B, the virtual barrier appears as a striped wall barrier 222 with hazard symbol 228, indicating that the barrier is less approachable than at the time of FIG. 2A. -
FIG. 3 shows an example embodiment of an HR system rendering a brightness gradient within the field of view 300 of head-mounted display 360 worn by user 350. The rendered field of view 300 is superimposed on real-world space including walls. The field of view 300 includes the visible portions of the real-world walls and a virtual barrier 302 positioned at the gap. The field of view 300 is split into two regions 304, 306. In region 304, the scene is rendered at a brightness level that may be brighter than, or the same as, before the gradient was applied at the time of FIG. 3. In region 306, a gradual reduction in brightness from left to right is applied, with the right-most edge details almost completely black. The addition of the brightness gradient in regions 304, 306 encourages user 350 to move to the left where objects are more visible; this may be reinforced by making region 304 larger at later times as user 350 turns in the correct direction. -
FIG. 4A shows an example snapshot of an embodiment of an HR system display. The field of view 410 is the same size as the display. Within the field of view 410 are two walls and a virtual barrier 416 rendered by the example HR system. In the real world, there is a fire hydrant 420 on the left which is not in the field of view 410. -
FIG. 4B shows an example snapshot of an embodiment of an HR system display showing the same scene as FIG. 4A at a different time. The field of view 410 is split into three regions 402, 404, 406. In region 402, objects previously out of the field of view 410 are rendered, for example fire hydrant 420. Other embodiments may use a different region of the field of view to show objects that are not actually within the field of view of the user, such as a region at the top of the display or a region at the bottom of the display. In some embodiments, eye gaze tracking may be used to determine when a user looks to a predefined region of the display, and objects that are out of view may be shown in that region in response to the user looking at that region of the display. - In
region 404, the left side of the actual field of view is rendered at an x-compression level defined by the ratios of regions 402, 404 and 406. In region 406, the right side of the actual field of view is rendered at a higher x-compression level than used in region 404. The addition of the distortion gradient in regions 404, 406 encourages the user to turn toward the less compressed content; this may be reinforced by making region 402 larger at later times as the field of view moves in the correct direction. -
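The x-compression applied to a display region can be sketched as a horizontal resampling of image columns; a narrower output width corresponds to a higher compression level. This nearest-neighbour version is a simplified, illustrative stand-in for a production resampler:

```python
import numpy as np

def compress_columns(img, out_width):
    """Nearest-neighbour horizontal resample of an image to out_width columns.

    Used here to squeeze part of the field of view into a smaller display
    region: the smaller out_width is relative to the source width, the
    higher the x-compression.
    """
    h, w = img.shape[:2]
    # map each output column back to a source column index
    cols = (np.arange(out_width) * w / out_width).astype(int)
    return img[:, cols]
```

Rendering the left and right portions of the captured view through this function with different `out_width` values, then placing the results side by side, reproduces the graded distortion described for regions 404 and 406.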
FIG. 5 shows an overhead map of a path presented by an embodiment of a head-mounted display. In FIG. 5, the path is defined by a sequence of virtual barriers. The HR system display view 500 is rendered in full-screen as the path from an overhead perspective, but the map may be shown in a window of the display in some embodiments. The current position of the user relative to the path is shown by icon 522 which moves as the user moves. The arrow 524 indicates routing instructions, and may be enhanced, for example, using voice commands or text prompts. At the time of FIG. 5, the user is on the path and so the HR system prompts are confined to simple routing instructions; however, if the user starts to stray from the desired path, each virtual barrier may react as described previously herein. -
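Detecting that a user is straying from a barrier-defined path reduces to measuring the distance from the user's position to the nearest path segment. A minimal 2D sketch, with an illustrative tolerance value:

```python
import math

def point_segment_dist(p, a, b):
    """Distance from point p to the line segment a-b (2D)."""
    ax, ay = a
    bx, by = b
    px, py = p
    dx, dy = bx - ax, by - ay
    seg_len2 = dx * dx + dy * dy
    # parameter of the closest point on the segment, clamped to [0, 1]
    t = 0.0 if seg_len2 == 0 else max(0.0, min(1.0, ((px - ax) * dx + (py - ay) * dy) / seg_len2))
    cx, cy = ax + t * dx, ay + t * dy
    return math.hypot(px - cx, py - cy)

def off_path(p, waypoints, tolerance=1.0):
    """True when position p is farther than tolerance from every segment
    of the waypoint polyline, i.e. the user is straying from the path."""
    return min(point_segment_dist(p, waypoints[i], waypoints[i + 1])
               for i in range(len(waypoints) - 1)) > tolerance
```

When `off_path` becomes true, a system could escalate the nearest virtual barrier's stimuli; while it stays false, simple routing prompts suffice.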
FIG. 6A shows part of a 3D path presented by an embodiment of a head-mounted display. In FIG. 6A, the current portion of the path facing the user is defined by a sequence of virtual barriers. Real-world objects are not shown in FIG. 6A for clarity, but embodiments may include real-world objects overlaid by the virtual barriers. The HR system display view 650 is rendered in full-screen as the portion of the path directly in front of the user, but the map may be shown in a window of the display in some embodiments. The current position of the user relative to the path is indicated by the lower edge of the screen. The arrows indicate routing instructions. At the time of FIG. 6A, the user is on the path and so the HR system prompts are confined to simple routing instructions; however, if the user starts to stray from the desired path, each virtual barrier may react as described previously herein. As the user moves, display 650 is updated to reflect the current perspective of the user. -
FIG. 6B shows the exit of a 3D path presented by an embodiment of a head-mounted display. In FIG. 6B, the current portion of the path facing the user is defined by virtual barrier 632 on the left and by virtual barrier 634 on the right. In this embodiment, each virtual barrier is represented as a blank wall but may be rendered in any manner. Real-world objects are not shown in FIG. 6B for clarity, but embodiments may include real-world objects overlaid by the virtual barriers. The HR system display view 655 is rendered in full-screen as the portion of the path directly in front of the user, but the map may be shown in a window of the display. The current position of the user relative to the path is indicated by the lower edge of the screen. The arrows indicate routing instructions. At the time of FIG. 6B, the user is on the path and so the HR system prompts are confined to simple routing instructions; however, if the user starts to stray from the desired path, each virtual barrier may react as described previously herein. The exit of the path is indicated by icon 630 but may be enhanced or replaced with a view of the real-world from the correct perspective. -
FIG. 7 shows a path for a hand motion presented by an embodiment of a head-mounted display. The required path for hand 702 starts at 3D position 704 and ends at 3D position 706 and is blocked by real-world components. To avoid the real-world interference components, a path between 3D locations 704, 706 that the hand 702 is to follow is shown by dotted line segments. The line segments are superimposed on the real-world view of FIG. 7 to show the desired path of the hand 702. A sequence of virtual barriers may be placed along the path segments so that, if the hand 702 strays from the path, each virtual barrier may provide stimuli to redirect the hand 702. The position of the hand 702 is updated to show the current location in 3D space or is shown as the real-world hand. In some embodiments, the virtual barriers -
FIG. 8 is a block diagram of an embodiment of an HR system 800 which may have some components implemented as part of a head-mounted assembly. The HR system 800 may be considered a computer system that can be adapted to be worn on the head, carried by hand, or otherwise attached to a user. In the embodiment of the HR system 800 shown, a structure 805 is included which is adapted to be worn on the head of a user. The structure 805 may include straps, a helmet, a hat, or any other type of mechanism to hold the HR system on the head of the user as an HMD. - The
HR system 800 also includes a display 850 coupled to the structure 805. The structure 805 may position the display 850 in a field-of-view (FOV) of the user. In some embodiments, the display 850 may be a stereoscopic display with two separate views of the FOV, such as view 852 for the user's left eye, and view 854 for the user's right eye. The two views 852, 854 together make up the display 850. In some embodiments, the display 850 may be transparent, such as in an augmented reality (AR) HMD. In systems where the display 850 is transparent, the view of the FOV of the real-world as seen through the display 850 by the user is composited with virtual objects that are shown on the display 850. The virtual objects may occlude real objects in the FOV as overlay elements and may themselves be transparent or opaque, depending on the technology used for the display 850 and the rendering of the virtual object. A virtual object, such as a virtual barrier, may be positioned in a virtual space, which could be two-dimensional or three-dimensional, depending on the embodiment, and may be anchored to an associated real object in real space. Note that if the display 850 is a stereoscopic display, two different views of the virtual barrier may be rendered and shown in two different relative positions on the two views 852, 854. - In some embodiments, the
HR system 800 includes one or more sensors in a sensing block 840 to sense at least a portion of the FOV of the user by gathering the appropriate information for that sensor, for example visible light from a visible light camera, from the FOV of the user. Any number of any type of sensor, including sensors described previously herein, may be included in the sensor block 840, depending on the embodiment. - The
HR system 800 may also include an I/O block 820 to allow communication with external devices. The I/O block 820 may include one or both of a wireless network adapter 822 coupled to an antenna 824 and a network adapter 826 coupled to a wired connection 828. The wired connection 828 may be plugged into a portable device, for example a mobile phone, or may be a component of an umbilical system such as used in extreme environments. - In some embodiments, the
HR system 800 includes a sound processor 860 which takes input from one or more microphones 862. In some HR systems 800, the microphones 862 may be attached to the user. External microphones, for example attached to an autonomous drone, may send sound data samples through wireless or wired connections to I/O block 820 instead of, or in addition to, the sound data received from the microphones 862. The sound processor 860 may generate sound data which is transferred to one or more speakers 864, which are a type of sound reproduction device. The generated sound data may be analog samples or digital values. If more than one speaker 864 is used, the sound processor may generate or simulate 2D or 3D sound placement. In some HR systems 800, a first speaker may be positioned to provide sound to the left ear of the user and a second speaker may be positioned to provide sound to the right ear of the user. Together, the first speaker and the second speaker may provide binaural sound to the user. - In some embodiments, the
HR system 800 includes a stimulus block 870. The stimulus block 870 is used to provide other stimuli to expand the HR system user experience. Embodiments may include numerous haptic pads attached to the user that provide a touch stimulus. Embodiments may also include other stimuli, such as, but not limited to, changing the temperature of a glove, changing the moisture level or breathability of a suit, or adding smells to a breathing system. - The
HR system 800 may include a processor 810 and one or more memory devices 830, which may also be referred to as a tangible medium or a computer readable medium. The processor 810 is coupled to the display 850, the sensing block 840, the memory 830, the I/O block 820, the sound block 860, and the stimulus block 870, and is configured to execute the instructions 832 encoded on (i.e. stored in) the memory 830. Thus, the HR system 800 may include an article of manufacture comprising a tangible medium 830, that is not a transitory propagating signal, encoding computer-readable instructions 832 that, when applied to a computer system 800, instruct the computer system 800 to perform one or more methods described herein, thereby configuring the processor 810. - While the
processor 810 included in the HR system 800 may be able to perform methods described herein autonomously, in some embodiments, processing facilities outside of that provided by the processor 810 included inside of the HR system 800 may be used to perform one or more elements of methods described herein. In one non-limiting example, the processor 810 may receive information from one or more of the sensors 840 and send that information through the wireless network adapter 822 to an external processor, such as a cloud processing system or an external server. The external processor may then process the sensor information to identify a location for a virtual barrier in the FOV and send the location to the processor 810 through the wireless network adapter 822. The processor 810 may then use the geometry, appearance, and location of the virtual barrier in the FOV to render and show the virtual barrier on the display 850. - In some embodiments, the
instructions 832 may instruct the HR system 800 to detect one or more objects in a field-of-view (FOV) using at least one sensor 840 coupled to the computer system 800. The instructions 832 may further instruct the HR system 800, using at least one sensor 840, to determine the world position of the one or more objects. - The
instructions 832 may further instruct the HR system 800 to establish a world position for a barrier according to the world position of the one or more objects detected in the FOV. - The
instructions 832 may further instruct the HR system 800 to render an image of a virtual barrier on the display 850 at a position corresponding to the world position of a barrier. In one non-limiting example, the instructions 832 instruct the HR system 800 to render the virtual barrier using images determined by information received by the HR system 800 related to conditions occluded behind the world position of the barrier. - The
instructions 832 may further instruct the HR system 800 to render an image of a virtual barrier on the display 850 at a position corresponding to the world position of a barrier at a fixed position over periods of time to correct for motion. In one non-limiting example, the instructions 832 instruct the HR system 800 to render the virtual barrier from different perspectives as the user moves, as determined by gyroscopes in the sensor block 840. - The
instructions 832 may further instruct the HR system 800 to present an additional sensory stimulus to the user to encourage the user to avoid the world position of the virtual barrier object. In one non-limiting example, the instructions 832 instruct the HR system 800 to determine the distance of the user from the world position of the virtual barrier object using at least one sensor 840. As the distance changes over time, the instructions 832 instruct the HR system 800 to provide one or more stimuli, for example an unnatural sound presented by the sound processor 860, a change of appearance of the virtual barrier object on the display 850, a visual flash on the display 850, a sudden loud noise presented by the sound processor 860, or a haptic push delivered through the stimulus block 870. The haptic push may be delivered by any mechanism, such as, but not limited to, the HR system 800, a suit worn by the user, or a glove worn by the user. - In some embodiments, the
instructions 832 may further instruct the HR system 800 to calculate a direction for the user to move to avoid the world position of the virtual object and to deliver a haptic push in the determined direction to the HR system 800, a suit worn by the user, or a glove worn by the user using the stimulus block 870. - Aspects of various embodiments are described with reference to flowchart illustrations and/or block diagrams of methods, apparatus, systems, and computer program products according to various embodiments disclosed herein. It will be understood that various blocks of the flowchart illustrations and/or block diagrams, and combinations of blocks in the flowchart illustrations and/or block diagrams, can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general-purpose computer, special purpose computer, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing apparatus, create means for implementing the functions/acts specified in the flowchart and/or block diagram block or blocks.
- These computer program instructions may also be stored in a computer readable medium that can direct a computer, other programmable data processing apparatus, or other devices to function in a particular manner, such that the instructions stored in the computer readable medium produce an article of manufacture including instructions which implement the function/act specified in the flowchart and/or block diagram block or blocks. The computer program instructions may also be loaded onto a computer, other programmable data processing apparatus, or other devices to cause a series of operational steps to be performed on the computer, other programmable apparatus or other devices to produce a computer implemented process such that the instructions which execute on the computer or other programmable apparatus provide processes for implementing the functions/acts specified in the flowchart and/or block diagram block or blocks.
- The flowchart and/or block diagrams in the figures help to illustrate the architecture, functionality, and operation of possible implementations of systems, methods and computer program products of various embodiments. In this regard, each block in the flowchart or block diagrams may represent a module, segment, or portion of code, which comprises one or more executable instructions for implementing the specified logical function(s). It should also be noted that, in some alternative implementations, the functions noted in the block may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved. It will also be noted that each block of the block diagrams and/or flowchart illustration, and combinations of blocks in the block diagrams and/or flowchart illustration, can be implemented by special purpose hardware-based systems that perform the specified functions or acts, or combinations of special purpose hardware and computer instructions.
-
FIG. 9 is a flowchart 900 of an embodiment of a method for directing a user to approach a virtual barrier. The method starts 901 and a user position is received 902 from hardware integrated into a head-mounted display in some embodiments. The user position may be obtained relatively from a known position using a depth sensor camera or gyroscopes, using absolute positioning systems such as GPS, or other mechanisms, depending on the embodiment. The flowchart 900 continues by determining 904 a distance ‘d’ from the known world-position of a virtual barrier, where the distance ‘d’ is the difference between the user location and the world-position of the virtual barrier. The flowchart 900 continues by comparing 906 ‘d’ to a nearness threshold ‘T1’. If the user is not near the virtual barrier, then ‘d’ is greater than or equal to ‘T1’ and the flowchart returns to the start 901. If the user is proximal to the virtual barrier, then ‘d’ is less than ‘T1’ and the flowchart 900 continues to start a stimulus 908, such as a sound, visual cue, or haptic push in some embodiments. - The
flowchart 900 continues by receiving 910 a user position and determining 912 a distance ‘d’ to the virtual barrier at a new time instant. The distance ‘d’ is compared 914 to ‘T1’ and if ‘d’ is less than or equal to ‘T1’, then the user is still proximal to the barrier, and an update for ‘d’ 910, 912 is repeatedly determined at subsequent time points. If ‘d’ is greater than ‘T1’, then the user has moved away from the barrier and the flowchart 900 continues by stopping the stimulus 916 and then returning to the start 901. - In some embodiments, the threshold T1 may be a fixed value, for example set by default or by the user. In other embodiments, the threshold T1 may be a fixed value determined by system context, for example the type of barrier, the status of a barrier, or the current environment. In yet other embodiments, the threshold T1 may be variable depending on the state of the HR system or on user state.
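The loop of FIG. 9 can be sketched in a few lines. This is an illustrative reading, not code from the patent; `proximity_alert_step` and the straight-line distance metric are assumptions:

```python
import math

def distance(user_pos, barrier_pos):
    """Euclidean distance 'd' between the user and the barrier's world-position."""
    return math.dist(user_pos, barrier_pos)

def proximity_alert_step(user_pos, barrier_pos, t1, stimulus_active):
    """One pass of the FIG. 9 loop: start the stimulus when d < T1, stop it when d > T1."""
    d = distance(user_pos, barrier_pos)
    if not stimulus_active:
        return d < t1       # start stimulus 908 only when the user becomes proximal
    return not (d > t1)     # stop stimulus 916 once the user has moved away

# Example: user approaches a barrier at the origin with T1 = 2.0.
active = False
for pos in [(5.0, 0.0, 0.0), (3.0, 0.0, 0.0), (1.5, 0.0, 0.0), (2.5, 0.0, 0.0)]:
    active = proximity_alert_step(pos, (0.0, 0.0, 0.0), 2.0, active)
```

In this trace the stimulus turns on at the third position (d = 1.5 < T1) and off again at the fourth (d = 2.5 > T1).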
-
FIG. 10 is a flowchart 1000 of an embodiment of a method for directing a user to retreat from a virtual barrier. The method starts 1001 and a user position is received 1002 from hardware integrated into a head-mounted display in some embodiments. The user position may be obtained relatively from a known position using a depth sensor camera or gyroscopes, using absolute positioning systems such as GPS, or other mechanisms, depending on the embodiment. The flowchart 1000 continues by determining 1004 a distance ‘d’ from the known world-position of a virtual barrier, where the distance ‘d’ is the difference between the user location and the world-position of the virtual barrier. The flowchart 1000 continues by comparing 1006 ‘d’ to a nearness threshold ‘T1’. If the user is near the virtual barrier, then ‘d’ is less than or equal to ‘T1’ and the flowchart returns to the start 1001. If the user is not proximal to the virtual barrier, then ‘d’ is greater than ‘T1’ and the flowchart 1000 continues to start a reassuring stimulus 1008, such as a sound, visual cue, or haptic push in some embodiments. - The
flowchart 1000 continues by receiving 1010 a user position and determining 1012 a distance ‘d’ to the virtual barrier at a new time instant. The distance ‘d’ is compared 1014 to a threshold ‘T2’ and if ‘d’ is less than or equal to ‘T2’, then the user is still proximal to the barrier, and an update for ‘d’ 1010, 1012 is repeatedly determined at subsequent time points. If ‘d’ is greater than ‘T2’, then the user has moved away from the barrier and the flowchart 1000 continues by stopping the stimulus 1016 and then returning to the start 1001. - In some embodiments, the thresholds T1 and T2 may be fixed values, for example set by default or by the user. In other embodiments, the thresholds T1 and T2 may be fixed values determined by system context, for example the type of barrier, the status of a barrier, or the current environment. In yet other embodiments, the thresholds T1 and T2 may be variable depending on the state of the HR system or on user state.
-
- In some embodiments, the thresholds T1 and T2 may have the same value. In such embodiments the HR system may require that the stimulus be applied for a minimum time, thus often overriding the stopping 1016 of the stimulus.
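Read together with FIG. 9, using two thresholds amounts to hysteresis: a stimulus started when the user comes inside T1 is only released once the user moves beyond T2, so setting T2 somewhat larger than T1 keeps the stimulus from flickering when the user hovers at a single boundary. A sketch under that assumption (the class and method names are illustrative, not from the patent):

```python
class BarrierStimulus:
    """Hysteresis sketch: start the stimulus inside T1, stop it only beyond T2 (T2 >= T1)."""

    def __init__(self, t1, t2):
        assert t2 >= t1, "release threshold T2 should not be inside T1"
        self.t1, self.t2 = t1, t2
        self.active = False

    def update(self, d):
        if not self.active and d < self.t1:
            self.active = True    # start stimulus (908 / 1008)
        elif self.active and d > self.t2:
            self.active = False   # stop stimulus (916 / 1016)
        return self.active

s = BarrierStimulus(t1=1.0, t2=2.0)
history = [s.update(d) for d in [3.0, 0.8, 1.5, 1.9, 2.5]]
```

Distances 1.5 and 1.9 sit between T1 and T2, so the stimulus stays on until the user is clearly away (d = 2.5 > T2). With T1 = T2 the behavior reduces to the single-threshold case, matching the remark that equal thresholds may need a minimum-on-time override.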
-
FIG. 11 is a flowchart 1100 of an embodiment of a method for providing directional guidance to a user. The method starts 1101 and a world-position is established 1110 for a barrier. The flowchart 1100 continues by rendering 1120 an image of a virtual barrier object at a position corresponding to the world position on a head-mounted display worn by a user. The flowchart 1100 continues by presenting 1140 an additional sensory stimulus to the user to encourage the user to avoid the world position of the virtual barrier object; in some embodiments, the additional sensory stimulus is presented 1140 depending on user proximity to the established world-position of the barrier. The flowchart 1100 then returns to the start 1101. - In some embodiments the world-position is established 1110 by capturing 1112 an image of the field-of-view using a camera coupled to the HR system. Using object recognition, objects in the field of view are detected 1114. The position of the detected objects is computed using depth information from a sensor coupled to the HR system. By using depth and object orientation, the objects can be located 1116 in 3D real-world space. In some embodiments, the virtual barrier is located at a fixed position relative to one or more of the objects in 3D real-world space.
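Locating 1116 a detected object in 3D real-world space from a camera pixel and a depth sample is typically a pinhole-camera back-projection. A minimal sketch, assuming simple camera intrinsics (the focal lengths and principal point below are made-up example values):

```python
def pixel_to_camera_3d(u, v, depth, fx, fy, cx, cy):
    """Back-project pixel (u, v) with a depth sample into camera-space 3D coordinates."""
    x = (u - cx) * depth / fx
    y = (v - cy) * depth / fy
    return (x, y, depth)

# A detected object at the image center lies on the camera's optical axis, 2 m ahead.
point = pixel_to_camera_3d(u=320, v=240, depth=2.0, fx=500.0, fy=500.0, cx=320.0, cy=240.0)
```

A full system would then transform the camera-space point into world coordinates using the HMD's tracked pose before anchoring the virtual barrier to it.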
-
- In some embodiments a vector to the virtual barrier may be determined 1142 using the world-position of the virtual barrier and computing the position relative to the current viewpoint of the user. A direction for the user to move to avoid the world position of the virtual barrier is calculated 1144, for example an opposite vector. Information related to the direction may be communicated 1146 to a haptic system coupled to the HR system, for example one or more of a plurality of haptic pads in clothing worn by the user. The additional stimulus may then be applied 1148 by one or more haptic pads to encourage the user in the calculated direction.
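The “opposite vector” calculation 1144 and the selection of a haptic pad 1146, 1148 can be sketched as follows; the pad layout and function names are illustrative assumptions, not from the patent:

```python
import math

def avoidance_direction(user_pos, barrier_pos):
    """Unit vector pointing away from the barrier (the 'opposite vector' 1144)."""
    v = [b - u for u, b in zip(user_pos, barrier_pos)]
    norm = math.sqrt(sum(c * c for c in v))
    return tuple(-c / norm for c in v)

def pick_haptic_pad(direction, pads):
    """Choose the pad whose mounting direction is most aligned with the push direction."""
    def dot(a, b):
        return sum(x * y for x, y in zip(a, b))
    return max(pads, key=lambda name: dot(pads[name], direction))

pads = {"left": (-1, 0, 0), "right": (1, 0, 0), "front": (0, 1, 0), "back": (0, -1, 0)}
d = avoidance_direction(user_pos=(0, 0, 0), barrier_pos=(2, 0, 0))  # barrier on the +x side
pad = pick_haptic_pad(d, pads)
```

With the barrier to the user's right (+x), the avoidance direction is (-1, 0, 0) and the pad on the user's left side is selected.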
-
FIG. 12 is a flowchart 1200 of an embodiment of a method to provide directional guidance to a user wearing a head-mounted display. The method starts 1201 and includes obtaining 1203 a path through real-world space. The path can be obtained in any way, including, but not limited to, receiving a path from an external source, detecting locations of real-world objects and calculating a path based on those locations, and receiving input from the user to define the path.
- Two or more virtual path indicators are presented 1205 to the user on the HMD. The virtual path indicators may show a path for the user to follow and/or may be virtual barriers indicating a place where the user should not go. The virtual path indicators can be overlaid on a view of a portion of the real-world space, to direct the user to follow the path. In some embodiments, the plurality of virtual path indicators are displayed as a two-dimensional map of the path, but in other embodiments, the plurality of virtual path indicators are displayed as a path between two points in three dimensions in real-world space. In some cases, a subset of the plurality of virtual path indicators are displayed in correct perspective to the user as a portion of a three-dimensional map.
- In some embodiments, the virtual path indicators may be presented sequentially, over time, as the user moves. So the method may include presenting a first
virtual path indicator 1252 as a first virtual barrier object at a first location on the HMD based on a first location of the user at a first time, and presenting a second virtual path indicator 1254 as a second virtual barrier object at a second location on the HMD based on a second location of the user at a second time. The first virtual barrier object may be positioned at a first fixed position in real-world space, and the second virtual barrier object may be positioned at a second fixed position in real-world space as the user moves. Additional virtual barrier objects may be presented to the user as the user moves to guide the user to an end point of the path 1209. - Embodiments may be useful in a variety of applications and in a variety of environments. While use by emergency responders has been described in some detail above, many other fields of endeavor may also use embodiments. Non-limiting examples of environments where embodiments may be used are described below.
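The sequential presentation of path indicators described above can be sketched as a function that drops waypoints the user has already reached and returns the next few to render; the function name, reach radius, and lookahead count are illustrative assumptions:

```python
import math

def next_path_indicators(waypoints, user_pos, reach_radius=0.5, lookahead=2):
    """Return the next few waypoints to render as virtual path indicators.

    Leading waypoints already reached (within reach_radius of the user) are
    skipped, so indicators appear sequentially as the user advances.
    """
    remaining = list(waypoints)
    while remaining and math.dist(remaining[0], user_pos) <= reach_radius:
        remaining.pop(0)  # the user has passed this waypoint
    return remaining[:lookahead]

path = [(0, 0), (1, 0), (2, 0), (3, 0)]
# A user standing on the first waypoint sees indicators for the next two positions.
show = next_path_indicators(path, user_pos=(0.1, 0.0))
```

Calling this each frame with the tracked user position yields the first indicator 1252, then the second indicator 1254, and so on toward the end of the path.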
-
- One example environment where embodiments may be used is in surgical operations. A surgeon may use an embodiment to prevent a scalpel from touching a healthy internal organ. For example, the virtual barriers may be situated just outside a cancer tumor to segregate it from the remaining part of the subject organ. If a scalpel held by a surgeon, or in some cases held by a machine controlled by a surgeon, crosses the virtual barriers, a haptic shake may be presented by a glove worn by the surgeon. If the scalpel goes beyond the cancer into the healthy organ, the stimuli may be changed; for example, the amplitude of the shake may be increased, or an audio alert may be generated. In one embodiment, the haptic shake may be administered to the surgeon's wrist to avoid disturbing subtle finger movements of the surgeon.
-
- Another example environment where embodiments may be used is in a helmet or goggles worn by a rider of a motorcycle or bicycle. For example, other vehicles and/or the shoulder of the road are marked with virtual barriers, and in case the rider gets within a certain distance of the virtual barriers, a haptic shake may be added to the helmet or goggles. In other embodiments, the haptic shake may be administered by the handlebars of the cycle, and/or autonomous control of the cycle may be used to avoid a collision with the real-world obstacle marked by the virtual barrier.
-
- Another example environment where embodiments may be used is a helmet worn by a worker in a factory or warehouse. For example, by wearing a helmet with an HMD, the worker can see virtual barriers when she/he is in front of dangerous materials, such as toxic chemicals, highly heated materials or machines, heavy objects, or the like. Stimuli, such as haptic shakes, alerting sounds, and unpleasant odors, can also be administered by the helmet or other equipment. Information about the dangerous materials encountered by each worker may be shared with co-workers in a timely manner or synchronized by transmitting data through wireless devices carried by workers.
-
- By wearing the helmet a worker may be able to see virtual path indicators. Virtual path indicators may be generated by (i) identifying a current location using GPS, a cellular connection, a WiFi connection, or any other location detection mechanism and setting a goal location (manually or by using data transmitted from a remote server); (ii) sensing real objects by a sensor or obtaining geographical data from a remote server; (iii) calculating a route which avoids the real objects; and (iv) displaying virtual path indicators in accordance with the calculated route. For instance, a worker may set a location of a particular part in a warehouse as a goal. The HMD worn by the worker may then display a route to that location as a virtual path in the HMD. Virtual barriers may also be shown to the worker in the HMD to keep the worker away from dangerous areas, in addition to showing the virtual path. The route may be calculated based on a shortest distance, a safest path, or any other criteria. A computer embodied in the HMD or a remote server may utilize a map of the factory or warehouse to calculate the path. For added safety, the calculated path may maintain a minimum distance from dangerous areas. The route may be calculated by a computer embodied in a remote server monitoring the factory's entire system. The HMD may then display arrows and dotted lines in accordance with the optimum route. The arrows and dotted lines may be changed based on the current location of the worker and the calculated route on a moment-to-moment basis.
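Steps (i)-(iv) above reduce to a standard grid search once the map is known. A minimal sketch of step (iii) using breadth-first search over an occupancy grid (the grid and endpoints are made-up example data; a real system would use the warehouse map):

```python
from collections import deque

def shortest_route(grid, start, goal):
    """BFS over a 2D occupancy grid: 0 = free, 1 = obstacle. Returns a cell path or None."""
    rows, cols = len(grid), len(grid[0])
    prev = {start: None}
    queue = deque([start])
    while queue:
        cell = queue.popleft()
        if cell == goal:
            path = []
            while cell is not None:          # walk predecessors back to the start
                path.append(cell)
                cell = prev[cell]
            return path[::-1]
        r, c = cell
        for nr, nc in ((r - 1, c), (r + 1, c), (r, c - 1), (r, c + 1)):
            if 0 <= nr < rows and 0 <= nc < cols and grid[nr][nc] == 0 and (nr, nc) not in prev:
                prev[(nr, nc)] = cell
                queue.append((nr, nc))
    return None  # goal unreachable

grid = [
    [0, 1, 0],
    [0, 1, 0],
    [0, 0, 0],
]
route = shortest_route(grid, start=(0, 0), goal=(0, 2))
```

BFS gives a shortest-distance route; a safest-path criterion would swap in a weighted search (e.g. Dijkstra or A*) with higher costs near dangerous cells. Step (iv) then renders one indicator per route cell.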
-
- Another example environment where embodiments may be used is a helmet worn by a worker at a construction site. For instance, a worker wears a helmet with an HMD which presents virtual barriers to the worker as he/she is about to enter a dangerous or restricted area, such as a soft ground area, an area with a deep hole, an area storing heavy or fragile construction materials, or the like. Stimuli, such as haptic shakes, alerting sounds, and unpleasant odors, can also be administered by the helmet or other equipment. A central server may control and provide the information about dangerous or restricted areas to the HMDs of the workers and keep the information updated.
-
- Also, a worker wearing the helmet with an HMD may be able to see virtual path indicators. For example, a worker at a construction site may select a location at the construction site as a goal and obtain its location using GPS data or other data obtained from a remote server. The HMD may then generate and/or display virtual barriers by obtaining data of dangerous areas from data already stored in the helmet or data obtained from a remote server, such as locations of a soft ground area, an open pit, or a high-traffic area. A computer embodied in the HMD or a remote server, having a map of the construction site, may then calculate a route toward the goal location. This may be done by searching for a shortest-distance path, a safest path, a path that avoids the virtual barriers, or using any other criteria to compute the route. The route may be a two-dimensional route using paths on the ground, or may be a three-dimensional route that includes paths at different levels, such as different floors in a building. For enhanced safety, the route may be calculated to maintain a minimum distance from virtual barriers. After the route is calculated, the HMD may then display arrows and dotted lines as a virtual path display of the route.
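Maintaining a minimum distance from virtual barriers, as described above, is commonly done by inflating obstacles before the route is planned: every free cell within the safety margin of a dangerous cell is itself treated as blocked, and the planner runs on the inflated map. An illustrative sketch using a multi-source breadth-first distance transform:

```python
from collections import deque

def inflate_obstacles(grid, margin):
    """Mark every free cell within `margin` 4-connected steps of an obstacle as blocked."""
    rows, cols = len(grid), len(grid[0])
    dist = [[None] * cols for _ in range(rows)]
    queue = deque()
    for r in range(rows):
        for c in range(cols):
            if grid[r][c] == 1:
                dist[r][c] = 0
                queue.append((r, c))
    while queue:                              # multi-source BFS outward from all obstacles
        r, c = queue.popleft()
        for nr, nc in ((r - 1, c), (r + 1, c), (r, c - 1), (r, c + 1)):
            if 0 <= nr < rows and 0 <= nc < cols and dist[nr][nc] is None:
                dist[nr][nc] = dist[r][c] + 1
                queue.append((nr, nc))
    return [[1 if dist[r][c] is not None and dist[r][c] <= margin else grid[r][c]
             for c in range(cols)] for r in range(rows)]

# Cells adjacent to the open pit at (1, 1) become off-limits to the route planner.
safe = inflate_obstacles([[0, 0, 0, 0], [0, 1, 0, 0], [0, 0, 0, 0]], margin=1)
```

Any shortest-path search run on the inflated grid then automatically keeps the calculated route at least `margin` cells away from every marked danger area.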
-
- Another example environment where embodiments may be used is piloting an airplane. When a pilot wears a helmet with an HMD, she/he can receive guidance from the helmet. Real obstacles and/or hazardous locations may be overlaid or highlighted by virtual barriers which visually support the pilot. These virtual barriers may be generated based on GPS data and map information. The pilot may also receive stimuli, such as haptic shakes, alerting sounds, or unpleasant odors, to reinforce the danger as the virtual barriers are approached.
-
- Another example environment where embodiments may be used is a diving mask worn by a diver. A diver may be presented with virtual barriers by an HMD integrated into the diving mask when she/he faces dangerous creatures or is about to enter a restricted area in the water. At the same time, she/he may receive haptic shakes or an alerting sound from the mask or a diving suit.
-
- By wearing the diving mask with an HMD, the diver may also be able to see virtual path indicators. A diver may set a desired destination as a goal, and the location of the diver may be calculated using GPS (if at the surface), using geographical data, or by using other location-determination techniques. The location of the goal is also determined and a path calculated to that goal, which may then be shown on the HMD as a virtual path display to guide the diver to the goal location. Because GPS signals may not be available underwater, inertial navigation techniques may be used to determine the current location of the diver as they follow the instructions of the virtual path. In some embodiments, virtual barriers may also be shown to the diver, in conjunction with the virtual path, to keep the diver away from danger. The HMD may have safe places stored in its memory, or safe places may be obtained from a remote server. If the diver encounters an emergency situation, this may be indicated to the HMD, a safe place may be selected, and a virtual path to that safe place shown to the diver on the HMD to guide them to the safe place.
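Position tracking between GPS fixes, as described above, can be done by dead reckoning: integrating heading and speed over time from a known fix. A simplified 2D sketch (a real system would use inertial sensors and correct for drift and current):

```python
import math

def dead_reckon(start, legs):
    """Integrate (heading_degrees, speed_m_s, duration_s) legs from a known fix."""
    x, y = start
    for heading, speed, duration in legs:
        rad = math.radians(heading)
        x += speed * duration * math.sin(rad)  # heading 0 deg = +y (north), 90 deg = +x (east)
        y += speed * duration * math.cos(rad)
    return (x, y)

# Swim east for 60 s at 0.5 m/s, then north for 30 s at 0.5 m/s from the surface fix.
pos = dead_reckon((0.0, 0.0), [(90.0, 0.5, 60.0), (0.0, 0.5, 30.0)])
```

The estimated position (about 30 m east, 15 m north of the fix) is what the HMD would compare against the virtual path's waypoints while the diver is underwater.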
- As will be appreciated by those of ordinary skill in the art, aspects of the various embodiments may be embodied as a system, device, method, or computer program product apparatus. Accordingly, elements of the present disclosure may take the form of an entirely hardware embodiment, an entirely software embodiment (including firmware, resident software, micro-code, or the like) or an embodiment combining software and hardware aspects that may all generally be referred to herein as a “server,” “circuit,” “module,” “client,” “computer,” “logic,” or “system,” or other terms. Furthermore, aspects of the various embodiments may take the form of a computer program product embodied in one or more computer-readable medium(s) having computer program code stored thereon.
- Any combination of one or more computer-readable storage medium(s) may be utilized. A computer-readable storage medium may be embodied as, for example, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or other like storage devices known to those of ordinary skill in the art, or any suitable combination of computer-readable storage mediums described herein. In the context of this document, a computer-readable storage medium may be any tangible medium that can contain, or store a program and/or data for use by or in connection with an instruction execution system, apparatus, or device. Even if the data in the computer-readable storage medium requires action to maintain the storage of data, such as in a traditional semiconductor-based dynamic random access memory, the data storage in a computer-readable storage medium can be considered to be non-transitory. A computer data transmission medium, such as a transmission line, a coaxial cable, a radio-frequency carrier, and the like, may also be able to store data, although any data storage in a data transmission medium can be said to be transitory storage. Nonetheless, a computer-readable storage medium, as the term is used herein, does not include a computer data transmission medium.
- Computer program code for carrying out operations for aspects of various embodiments may be written in any combination of one or more programming languages, including object oriented programming languages such as Java, Python, C++, or the like, conventional procedural programming languages, such as the “C” programming language or similar programming languages, or low-level computer languages, such as assembly language or microcode. The computer program code if loaded onto a computer, or other programmable apparatus, produces a computer implemented method. The instructions which execute on the computer or other programmable apparatus may provide the mechanism for implementing some or all of the functions/acts specified in the flowchart and/or block diagram block or blocks. In accordance with various implementations, the program code may execute entirely on the user's computer, partly on the user's computer, as a stand-alone software package, partly on the user's computer and partly on a remote computer or entirely on the remote computer or server, such as a cloud-based server. In the latter scenario, the remote computer may be connected to the user's computer through any type of network, including a local area network (LAN) or a wide area network (WAN), or the connection may be made to an external computer (for example, through the Internet using an Internet Service Provider). The computer program code stored in/on (i.e. embodied therewith) the non-transitory computer-readable medium produces an article of manufacture.
- The computer program code, if executed by a processor causes physical changes in the electronic devices of the processor which change the physical flow of electrons through the devices. This alters the connections between devices which changes the functionality of the circuit. For example, if two transistors in a processor are wired to perform a multiplexing operation under control of the computer program code, if a first computer instruction is executed, electrons from a first source flow through the first transistor to a destination, but if a different computer instruction is executed, electrons from the first source are blocked from reaching the destination, but electrons from a second source are allowed to flow through the second transistor to the destination. So a processor programmed to perform a task is transformed from what the processor was before being programmed to perform that task, much like a physical plumbing system with different valves can be controlled to change the physical flow of a fluid.
- Unless otherwise indicated, all numbers expressing quantities, properties, measurements, and so forth, used in the specification and claims are to be understood as being modified in all instances by the term “about.” The recitation of numerical ranges by endpoints includes all numbers subsumed within that range, including the endpoints (e.g. 1 to 5 includes 1, 2.78, π, 3.33, 4, and 5).
- As used in this specification and the appended claims, the singular forms “a”, “an”, and “the” include plural referents unless the content clearly dictates otherwise. Furthermore, as used in this specification and the appended claims, the term “or” is generally employed in its sense including “and/or” unless the content clearly dictates otherwise. As used herein, the term “coupled” includes direct and indirect connections. Moreover, where first and second devices are coupled, intervening devices including active devices may be located therebetween.
- The description of the various embodiments provided above is illustrative in nature and is not intended to limit this disclosure, its application, or uses. Thus, different variations beyond those described herein are intended to be within the scope of embodiments. Such variations are not to be regarded as a departure from the intended scope of this disclosure. As such, the breadth and scope of the present disclosure should not be limited by the above-described exemplary embodiments, but should be defined only in accordance with the following claims and equivalents thereof.
Claims (20)
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US17/078,271 US20210043007A1 (en) | 2018-07-10 | 2020-10-23 | Virtual Path Presentation |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US16/031,772 US10818088B2 (en) | 2018-07-10 | 2018-07-10 | Virtual barrier objects |
US17/078,271 US20210043007A1 (en) | 2018-07-10 | 2020-10-23 | Virtual Path Presentation |
Related Parent Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US16/031,772 Continuation US10818088B2 (en) | 2018-07-10 | 2018-07-10 | Virtual barrier objects |
Publications (1)
Publication Number | Publication Date |
---|---|
US20210043007A1 true US20210043007A1 (en) | 2021-02-11 |
Family
ID=69138464
Family Applications (2)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US16/031,772 Active US10818088B2 (en) | 2018-07-10 | 2018-07-10 | Virtual barrier objects |
US17/078,271 Abandoned US20210043007A1 (en) | 2018-07-10 | 2020-10-23 | Virtual Path Presentation |
Family Applications Before (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US16/031,772 Active US10818088B2 (en) | 2018-07-10 | 2018-07-10 | Virtual barrier objects |
Country Status (1)
Country | Link |
---|---|
US (2) | US10818088B2 (en) |
Cited By (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US11857378B1 (en) * | 2019-02-14 | 2024-01-02 | Onpoint Medical, Inc. | Systems for adjusting and tracking head mounted displays during surgery including with surgical helmets |
Families Citing this family (15)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
EP3327544B1 (en) * | 2016-11-25 | 2021-06-23 | Nokia Technologies Oy | Apparatus, associated method and associated computer readable medium |
US10497161B1 (en) | 2018-06-08 | 2019-12-03 | Curious Company, LLC | Information display by overlay on an object |
US10650600B2 (en) | 2018-07-10 | 2020-05-12 | Curious Company, LLC | Virtual path display |
US10902678B2 (en) | 2018-09-06 | 2021-01-26 | Curious Company, LLC | Display of hidden information |
US11055913B2 (en) | 2018-12-04 | 2021-07-06 | Curious Company, LLC | Directional instructions in an hybrid reality system |
US10970935B2 (en) | 2018-12-21 | 2021-04-06 | Curious Company, LLC | Body pose message system |
US10872584B2 (en) | 2019-03-14 | 2020-12-22 | Curious Company, LLC | Providing positional information using beacon devices |
AU2020308728A1 (en) * | 2019-06-24 | 2022-02-10 | Touchmagix Media Pvt. Ltd. | Interactive reality activity augmentation |
US11852500B1 (en) * | 2019-08-29 | 2023-12-26 | Snap Inc. | Navigation assistance for the visually impaired |
US11175730B2 (en) * | 2019-12-06 | 2021-11-16 | Facebook Technologies, Llc | Posture-based virtual space configurations |
US11257280B1 (en) | 2020-05-28 | 2022-02-22 | Facebook Technologies, Llc | Element-based switching of ray casting rules |
US11256336B2 (en) | 2020-06-29 | 2022-02-22 | Facebook Technologies, Llc | Integration of artificial reality interaction modes |
US11178376B1 (en) | 2020-09-04 | 2021-11-16 | Facebook Technologies, Llc | Metering for display modes in artificial reality |
US11294475B1 (en) | 2021-02-08 | 2022-04-05 | Facebook Technologies, Llc | Artificial reality multi-modal input switching model |
CN114281195B (en) * | 2021-12-27 | 2022-08-05 | 广东景龙建设集团有限公司 | Method and system for selecting assembled stone based on virtual touch gloves |
Family Cites Families (117)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US3861350A (en) | 1971-07-23 | 1975-01-21 | Albert B Selleck | Warning system and device, and malodorous warning composition of matter and process for its preparation |
US5309169A (en) * | 1993-02-01 | 1994-05-03 | Honeywell Inc. | Visor display with fiber optic faceplate correction |
US5815411A (en) * | 1993-09-10 | 1998-09-29 | Criticom Corporation | Electro-optic vision system which exploits position and attitude |
US7406214B2 (en) | 1999-05-19 | 2008-07-29 | Digimarc Corporation | Methods and devices employing optical sensors and/or steganography |
US20020176259A1 (en) | 1999-11-18 | 2002-11-28 | Ducharme Alfred D. | Systems and methods for converting illumination |
US20020191004A1 (en) | 2000-08-09 | 2002-12-19 | Ebersole John Franklin | Method for visualization of hazards utilizing computer-generated three-dimensional representations |
US20020196202A1 (en) | 2000-08-09 | 2002-12-26 | Bastian Mark Stanley | Method for displaying emergency first responder command, control, and safety information using augmented reality |
US6903752B2 (en) * | 2001-07-16 | 2005-06-07 | Information Decision Technologies, Llc | Method to view unseen atmospheric phenomenon using augmented reality |
AU2002366994A1 (en) | 2002-01-15 | 2003-07-30 | Information Decision Technolog | Method and system to display both visible and invisible hazards and hazard information |
US20030210812A1 (en) | 2002-02-26 | 2003-11-13 | Ali Khamene | Apparatus and method for surgical navigation |
WO2007004493A1 (en) | 2005-07-01 | 2007-01-11 | National Institute For Materials Science | Fluorophor and method for production thereof and illuminator |
US20070045641A1 (en) | 2005-08-23 | 2007-03-01 | Yin Chua Janet B | Light source with UV LED and UV reflector |
US7853061B2 (en) | 2007-04-26 | 2010-12-14 | General Electric Company | System and method to improve visibility of an object in an imaged subject |
US9015029B2 (en) | 2007-06-04 | 2015-04-21 | Sony Corporation | Camera dictionary based on object recognition |
US20090065715A1 (en) | 2007-08-24 | 2009-03-12 | Lee Wainright | Universal ultraviolet/ IR/ visible light emitting module |
US7961202B2 (en) | 2007-10-26 | 2011-06-14 | Mitel Networks Corporation | Method and apparatus for maintaining a visual appearance of at least one window when a resolution of the screen changes |
JP2011515717A (en) | 2008-03-24 | 2011-05-19 | グーグル インコーポレイテッド | Panoramic image in driving instructions |
US9398266B2 (en) | 2008-04-02 | 2016-07-19 | Hernan Carzalo | Object content navigation |
US20100117828A1 (en) | 2008-11-07 | 2010-05-13 | Stuart Owen Goldman | Alarm scheme with olfactory alerting component |
US8009022B2 (en) | 2009-05-29 | 2011-08-30 | Microsoft Corporation | Systems and methods for immersive interaction with virtual objects |
US10019634B2 (en) | 2010-06-04 | 2018-07-10 | Masoud Vaziri | Method and apparatus for an eye tracking wearable computer |
US20110270135A1 (en) | 2009-11-30 | 2011-11-03 | Christopher John Dooley | Augmented reality for testing and training of human performance |
US8614539B2 (en) | 2010-10-05 | 2013-12-24 | Intematix Corporation | Wavelength conversion component with scattering particles |
US9348141B2 (en) | 2010-10-27 | 2016-05-24 | Microsoft Technology Licensing, Llc | Low-latency fusing of virtual and real content |
KR20130136566A (en) | 2011-03-29 | 2013-12-12 | 퀄컴 인코포레이티드 | Modular mobile connected pico projectors for a local multi-user collaboration |
US8863039B2 (en) | 2011-04-18 | 2014-10-14 | Microsoft Corporation | Multi-dimensional boundary effects |
US20120289290A1 (en) | 2011-05-12 | 2012-11-15 | KT Corporation, KT TECH INC. | Transferring objects between application windows displayed on mobile terminal |
US20130249948A1 (en) | 2011-08-26 | 2013-09-26 | Reincloud Corporation | Providing interactive travel content at a display device |
US20130222371A1 (en) | 2011-08-26 | 2013-08-29 | Reincloud Corporation | Enhancing a sensory perception in a field of view of a real-time source within a display screen through augmented reality |
US20130249947A1 (en) | 2011-08-26 | 2013-09-26 | Reincloud Corporation | Communication using augmented reality |
KR101407670B1 (en) | 2011-09-15 | 2014-06-16 | 주식회사 팬택 | Mobile terminal, server and method for forming communication channel using augmented reality |
US9229231B2 (en) | 2011-12-07 | 2016-01-05 | Microsoft Technology Licensing, Llc | Updating printed content with personalized virtual data |
US9361530B2 (en) | 2012-01-20 | 2016-06-07 | Medivators Inc. | Use of human input recognition to prevent contamination |
US9165381B2 (en) | 2012-05-31 | 2015-10-20 | Microsoft Technology Licensing, Llc | Augmented books in a mixed reality environment |
JP5991039B2 (en) | 2012-06-18 | 2016-09-14 | 株式会社リコー | Information processing apparatus and conference system |
US9645394B2 (en) | 2012-06-25 | 2017-05-09 | Microsoft Technology Licensing, Llc | Configured virtual environments |
US9292085B2 (en) | 2012-06-29 | 2016-03-22 | Microsoft Technology Licensing, Llc | Configuring an interaction zone within an augmented reality environment |
US8953841B1 (en) * | 2012-09-07 | 2015-02-10 | Amazon Technologies, Inc. | User transportable device with hazard monitoring |
EP2920773A4 (en) | 2012-11-16 | 2016-07-06 | Flir Systems | Synchronized infrared beacon / infrared detection system |
WO2014147898A1 (en) | 2013-03-19 | 2014-09-25 | 株式会社村田製作所 | Laminate ceramic electronic component |
US9367136B2 (en) * | 2013-04-12 | 2016-06-14 | Microsoft Technology Licensing, Llc | Holographic object feedback |
US9286725B2 (en) | 2013-11-14 | 2016-03-15 | Nintendo Co., Ltd. | Visually convincing depiction of object interactions in augmented reality images |
EP3075106A4 (en) | 2013-11-25 | 2017-06-14 | ABL IP Holding LLC | System and method for communication with a mobile device via a positioning system including rf communication devices and modulated beacon light sources |
CN103697900A (en) | 2013-12-10 | 2014-04-02 | 郭海锋 | Method for early warning on danger through augmented reality by vehicle-mounted emotional robot |
US10586395B2 (en) * | 2013-12-30 | 2020-03-10 | Daqri, Llc | Remote object detection and local tracking using visual odometry |
KR20150101612A (en) | 2014-02-27 | 2015-09-04 | 엘지전자 주식회사 | Head Mounted Display with closed-view and Method for controlling the same |
US10203762B2 (en) | 2014-03-11 | 2019-02-12 | Magic Leap, Inc. | Methods and systems for creating virtual and augmented reality |
US10430985B2 (en) | 2014-03-14 | 2019-10-01 | Magic Leap, Inc. | Augmented reality systems and methods utilizing reflections |
KR20150118813A (en) | 2014-04-15 | 2015-10-23 | 삼성전자주식회사 | Providing Method for Haptic Information and Electronic Device supporting the same |
US20150325047A1 (en) | 2014-05-06 | 2015-11-12 | Honeywell International Inc. | Apparatus and method for providing augmented reality for maintenance applications |
US9588586B2 (en) | 2014-06-09 | 2017-03-07 | Immersion Corporation | Programmable haptic devices and methods for modifying haptic strength based on perspective and/or proximity |
US20160147408A1 (en) | 2014-11-25 | 2016-05-26 | Johnathan Bevis | Virtual measurement tool for a wearable visualization device |
US10235768B2 (en) | 2014-12-10 | 2019-03-19 | Mitsubishi Electric Corporation | Image processing device, in-vehicle display system, display device, image processing method, and computer readable medium |
US10065074B1 (en) | 2014-12-12 | 2018-09-04 | Enflux, Inc. | Training systems with wearable sensors for providing users with feedback |
US9746921B2 (en) | 2014-12-31 | 2017-08-29 | Sony Interactive Entertainment Inc. | Signal generation and detector systems and methods for determining positions of fingers of a user |
US9953216B2 (en) | 2015-01-13 | 2018-04-24 | Google Llc | Systems and methods for performing actions in response to user gestures in captured images |
KR102317803B1 (en) | 2015-01-23 | 2021-10-27 | 삼성전자주식회사 | Electronic device and method for controlling a plurality of displays |
US10212355B2 (en) | 2015-03-13 | 2019-02-19 | Thales Defense & Security, Inc. | Dual-mode illuminator for imaging under different lighting conditions |
AU2016233280B2 (en) | 2015-03-16 | 2021-03-25 | Magic Leap, Inc. | Augmented reality pulse oximetry |
EP3274986A4 (en) | 2015-03-21 | 2019-04-17 | Mine One GmbH | Virtual 3d methods, systems and software |
WO2016181398A1 (en) | 2015-05-11 | 2016-11-17 | Vayyar Imaging Ltd | System, device and methods for imaging of objects using electromagnetic array |
JP6609994B2 (en) | 2015-05-22 | 2019-11-27 | 富士通株式会社 | Display control method, information processing apparatus, and display control program |
EP3310196A4 (en) | 2015-06-19 | 2019-03-20 | Oakley, Inc. | Sports helmet having modular components |
KR102447438B1 (en) | 2015-07-01 | 2022-09-27 | 삼성전자주식회사 | Alarm device and method for informing location of objects thereof |
US20170103440A1 (en) | 2015-08-01 | 2017-04-13 | Zhou Tian Xing | Wearable augmented reality eyeglass communication device including mobile phone and mobile computing via virtual touch screen gesture control and neuron command |
US9852599B1 (en) | 2015-08-17 | 2017-12-26 | Alarm.Com Incorporated | Safety monitoring platform |
WO2017039308A1 (en) | 2015-08-31 | 2017-03-09 | Samsung Electronics Co., Ltd. | Virtual reality display apparatus and display method thereof |
US10297129B2 (en) | 2015-09-24 | 2019-05-21 | Tyco Fire & Security Gmbh | Fire/security service system with augmented reality |
FR3042925B1 (en) | 2015-10-26 | 2017-12-22 | St Microelectronics Crolles 2 Sas | SYSTEM FOR CONVERTING THERMAL ENERGY INTO ELECTRICAL ENERGY. |
US20170169170A1 (en) | 2015-12-11 | 2017-06-15 | Yosko, Inc. | Methods and systems for location-based access to clinical information |
US20180239417A1 (en) | 2015-12-30 | 2018-08-23 | Shenzhen Royole Technologies Co. Ltd. | Head-mounted display device, head-mounted display system, and input method |
WO2017117562A1 (en) | 2015-12-31 | 2017-07-06 | Daqri, Llc | Augmented reality based path visualization for motion planning |
US20170192091A1 (en) | 2016-01-06 | 2017-07-06 | Ford Global Technologies, Llc | System and method for augmented reality reduced visibility navigation |
US9978180B2 (en) | 2016-01-25 | 2018-05-22 | Microsoft Technology Licensing, Llc | Frame projection for augmented reality environments |
CA3055023A1 (en) | 2016-03-01 | 2017-09-08 | ARIS MD, Inc. | Systems and methods for rendering immersive environments |
CN105781618A (en) | 2016-03-15 | 2016-07-20 | 华洋通信科技股份有限公司 | Coal mine safety integrated monitoring system based on Internet of Things |
WO2017161192A1 (en) | 2016-03-16 | 2017-09-21 | Nils Forsblom | Immersive virtual experience using a mobile communication device |
US20170277257A1 (en) | 2016-03-23 | 2017-09-28 | Jeffrey Ota | Gaze-based sound selection |
US10551826B2 (en) | 2016-03-24 | 2020-02-04 | Andrei Popa-Simil | Method and system to increase operator awareness |
US9928662B2 (en) | 2016-05-09 | 2018-03-27 | Unity IPR ApS | System and method for temporal manipulation in virtual environments |
US9922464B2 (en) | 2016-05-10 | 2018-03-20 | Disney Enterprises, Inc. | Occluded virtual image display |
US9925920B2 (en) | 2016-05-24 | 2018-03-27 | Ford Global Technologies, Llc | Extended lane blind spot detection |
US10046236B2 (en) | 2016-06-13 | 2018-08-14 | Sony Interactive Entertainment America, LLC | Browser-based cloud gaming |
US10171929B2 (en) | 2016-06-23 | 2019-01-01 | Lightbox Video Inc. | Positional audio assignment system |
US10102732B2 (en) | 2016-06-28 | 2018-10-16 | Infinite Designs, LLC | Danger monitoring system |
AU2017301435B2 (en) | 2016-07-25 | 2022-07-14 | Magic Leap, Inc. | Imaging modification, display and visualization using augmented and virtual reality eyewear |
CN107657662A (en) | 2016-07-26 | 2018-02-02 | 金德奎 | Augmented reality equipment and its system and method that can be directly interactive between a kind of user |
US10486742B2 (en) * | 2016-08-01 | 2019-11-26 | Magna Electronics Inc. | Parking assist system using light projections |
US10155159B2 (en) | 2016-08-18 | 2018-12-18 | Activision Publishing, Inc. | Tactile feedback systems and methods for augmented reality and virtual reality systems |
US10656731B2 (en) | 2016-09-15 | 2020-05-19 | Daqri, Llc | Peripheral device for head-mounted display |
US10134192B2 (en) | 2016-10-17 | 2018-11-20 | Microsoft Technology Licensing, Llc | Generating and displaying a computer generated image on a future pose of a real world object |
US10281982B2 (en) * | 2016-10-17 | 2019-05-07 | Facebook Technologies, Llc | Inflatable actuators in virtual reality |
US10088902B2 (en) * | 2016-11-01 | 2018-10-02 | Oculus Vr, Llc | Fiducial rings in virtual reality |
DE102016121663A1 (en) | 2016-11-11 | 2018-05-17 | Osram Gmbh | Activating a transmitting device of a lighting device |
US9911020B1 (en) | 2016-12-08 | 2018-03-06 | At&T Intellectual Property I, L.P. | Method and apparatus for tracking via a radio frequency identification device |
US11347054B2 (en) | 2017-02-16 | 2022-05-31 | Magic Leap, Inc. | Systems and methods for augmented reality |
WO2018160593A1 (en) | 2017-02-28 | 2018-09-07 | Magic Leap, Inc. | Virtual and real object recording in mixed reality device |
US10250328B2 (en) | 2017-03-09 | 2019-04-02 | General Electric Company | Positioning system based on visible light communications |
US10408624B2 (en) | 2017-04-18 | 2019-09-10 | Microsoft Technology Licensing, Llc | Providing familiarizing directional information |
US10460585B2 (en) | 2017-06-05 | 2019-10-29 | Symbol Technologies, Llc | RFID directed video snapshots capturing targets of interest |
US10528228B2 (en) | 2017-06-21 | 2020-01-07 | Microsoft Technology Licensing, Llc | Interaction with notifications across devices with a digital assistant |
US20190007548A1 (en) | 2017-06-28 | 2019-01-03 | The Travelers Indemnity Company | Systems and methods for discrete location-based functionality |
EP3422149B1 (en) | 2017-06-30 | 2023-03-01 | Nokia Technologies Oy | Methods, apparatus, systems, computer programs for enabling consumption of virtual content for mediated reality |
US10867205B2 (en) | 2017-07-18 | 2020-12-15 | Lenovo (Singapore) Pte. Ltd. | Indication of characteristic based on condition |
KR20200042474A (en) | 2017-07-24 | 2020-04-23 | 사이륨 테크놀로지즈 인코포레이티드 | Thin layer material for short wavelength infrared generation |
US10725537B2 (en) * | 2017-10-02 | 2020-07-28 | Facebook Technologies, Llc | Eye tracking system using dense structured light patterns |
CN115175064A (en) | 2017-10-17 | 2022-10-11 | 奇跃公司 | Mixed reality spatial audio |
US10748426B2 (en) | 2017-10-18 | 2020-08-18 | Toyota Research Institute, Inc. | Systems and methods for detection and presentation of occluded objects |
US20190132815A1 (en) | 2017-10-27 | 2019-05-02 | Sentry Centers Holdings LLC | Systems and methods for beacon integrated with displays |
US11360546B2 (en) | 2017-12-22 | 2022-06-14 | Ultrahaptics Ip Ltd | Tracking in haptic systems |
KR102050999B1 (en) | 2017-12-27 | 2019-12-05 | 성균관대학교산학협력단 | Method and apparatus for transmitting of energy and method and node for receiving of energy |
US11032662B2 (en) | 2018-05-30 | 2021-06-08 | Qualcomm Incorporated | Adjusting audio characteristics for augmented reality |
US10497161B1 (en) | 2018-06-08 | 2019-12-03 | Curious Company, LLC | Information display by overlay on an object |
US10706629B2 (en) | 2018-06-15 | 2020-07-07 | Dell Products, L.P. | Coordinate override in virtual, augmented, and mixed reality (xR) applications |
PL3821152T3 (en) | 2018-07-10 | 2024-04-02 | Saint-Gobain Performance Plastics Rencol Limited | Torque assembly and method of making and using the same |
US10650600B2 (en) | 2018-07-10 | 2020-05-12 | Curious Company, LLC | Virtual path display |
US10902678B2 (en) | 2018-09-06 | 2021-01-26 | Curious Company, LLC | Display of hidden information |
- 2018-07-10: US application 16/031,772 filed; granted as US10818088B2 (Active)
- 2020-10-23: US application 17/078,271 filed; published as US20210043007A1 (not active, Abandoned)
Also Published As
Publication number | Publication date |
---|---|
US10818088B2 (en) | 2020-10-27 |
US20200020161A1 (en) | 2020-01-16 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US20210043007A1 (en) | Virtual Path Presentation | |
US10650600B2 (en) | Virtual path display | |
US11238666B2 (en) | Display of an occluded object in a hybrid-reality system | |
US11282248B2 (en) | Information display by overlay on an object | |
US11995772B2 (en) | Directional instructions in an hybrid-reality system | |
US10872584B2 (en) | Providing positional information using beacon devices | |
US10482662B2 (en) | Systems and methods for mixed reality transitions | |
US20180190022A1 (en) | Dynamic depth-based content creation in virtual reality environments | |
US10970935B2 (en) | Body pose message system | |
US20180136716A1 (en) | Method for operating a virtual reality system, and virtual reality system | |
JP7322713B2 (en) | Information processing device, information processing method, and program | |
US11889291B2 (en) | Head-related transfer function | |
EP4341797A1 (en) | Audio enhanced augmented reality |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
| STPP | Information on status: patent application and granting procedure in general | Free format text: APPLICATION DISPATCHED FROM PREEXAM, NOT YET DOCKETED |
| AS | Assignment | Owner name: CURIOUS COMPANY, LLC, OREGON. Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:YOUNG, BRUCE A;JONES, ANTHONY MARK;JONES, JESSICA AF;SIGNING DATES FROM 20180528 TO 20180529;REEL/FRAME:054443/0076 |
| STPP | Information on status: patent application and granting procedure in general | Free format text: DOCKETED NEW CASE - READY FOR EXAMINATION |
| STPP | Information on status: patent application and granting procedure in general | Free format text: NON FINAL ACTION MAILED |
| STPP | Information on status: patent application and granting procedure in general | Free format text: RESPONSE TO NON-FINAL OFFICE ACTION ENTERED AND FORWARDED TO EXAMINER |
| STPP | Information on status: patent application and granting procedure in general | Free format text: FINAL REJECTION MAILED |
| STCB | Information on status: application discontinuation | Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION |