US20140361988A1 - Touch Free Interface for Augmented Reality Systems - Google Patents
- Publication number
- US20140361988A1 (Application No. US 14/345,592)
- Authority
- US
- United States
- Prior art keywords
- processor
- visual data
- user
- predefined
- augmented reality
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Abandoned
Classifications
- G06F3/017—Gesture based interaction, e.g. based on a set of recognized hand gestures
- G06F3/011—Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
- G06F3/005—Input arrangements through a video camera
- G06F3/012—Head tracking input arrangements
- G06F3/016—Input arrangements with force or tactile feedback as computer generated output to the user
- G06F3/0304—Detection arrangements using opto-electronic means
- G06F3/0346—Pointing devices with detection of the device orientation or free movement in a 3D space, e.g. 3D mice, 6-DOF pointers using gyroscopes, accelerometers or tilt-sensors
- G06F3/04817—Interaction techniques based on graphical user interfaces [GUI] using icons
- G06F3/0484—GUI interaction techniques for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element
- G06F3/04842—Selection of displayed objects or displayed text elements
- G06F3/04883—GUI input of data by handwriting using a touch-screen or digitiser, e.g. gesture or text
- G06F3/165—Management of the audio stream, e.g. setting of volume, audio stream path
- G06T19/006—Mixed reality
- G06T7/70—Determining position or orientation of objects or cameras
- G06V40/28—Recognition of hand or arm movements, e.g. recognition of deaf sign language
- H04N23/667—Camera operation mode switching, e.g. between still and video, sport and normal or high- and low-resolution modes
- G02B27/017 and G02B27/0172—Head mounted head-up displays characterised by optical features
- G02B2027/0138—Head-up displays comprising image capture systems, e.g. camera
- G02B2027/014—Head-up displays comprising information/image processing systems
- G02B2027/0178—Eyeglass type
- G02B2027/0187—Display position adjusting means slaved to motion of at least a part of the body of the user, e.g. head, eye
- G06T2200/24—Indexing scheme for image data processing or generation involving graphical user interfaces [GUIs]
Definitions
- the present invention relates to methods and systems for augmented reality.
- Augmented reality is a term for a live, direct or indirect view of a physical, real-world environment whose elements are augmented by computer-generated information such as text, sound, video, graphics or GPS data. Artificial information about the environment and its objects is thus overlaid on a real world view or image. Augmentation is typically in real-time and in semantic context with environmental elements, so that information about the surrounding real world of the user becomes interactive and digitally manipulatable.
- the main hardware components for augmented reality are a processor, display, sensors and input devices. These elements, specifically a CPU, display, camera and MEMS sensors such as an accelerometer, GPS, or solid state compass, are present in portable devices such as smartphones, which allows them to function as augmented reality platforms.
- Augmented reality systems have found applications in entertainment, navigation, assembly processes, maintenance, and medical procedures.
- Portable augmented reality systems have also found applications in tourism and sightseeing, where augmented reality is used to present information about the real world objects and places being viewed.
- An immersive augmented reality experience is provided using a head-mounted display, typically in the form of goggles or a helmet.
- in a head-mounted display, virtual visual objects are superimposed on the user's view of a real world scene.
- the head mounted display is tracked with sensors that allow the system to align virtual information with the physical world.
- the tracking may be performed, for example, using any one or more of such technologies as digital cameras or other optical sensors, accelerometers, GPS, gyroscopes, solid state compasses, RFID and wireless sensors.
- Head-mounted displays are either optical see-through or video see-through.
- Optical see-through employs solutions such as half-silver mirrors to pass images through the lens and overlay information to be reflected into the user's eyes, and transparent LCD projectors that display the digital information and images directly or indirectly to the user's retina.
- the present invention provides an interactive system for augmented reality.
- the interactive system of the invention includes a wearable data display device that may be incorporated for example, into a pair of glasses or goggles.
- the wearable display has a device providing location extraction capabilities (such as GPS) and a compass.
- the system also includes a user interface that allows a user to select computer generated data to augment a real world scene that the user is viewing.
- a camera obtains images of the real-world scene being viewed.
- a processor detects a predefined object, such as a user's finger, in images of the real world scene captured by the camera. When the user points to an element in the scene, data relating to the element are displayed on the data display device and are superimposed on the user's view of the scene.
- the invention provides a method for augmented reality comprising:
- the image sensor may be selected from a camera, a light sensor, an IR sensor, an ultrasonic sensor, a proximity sensor, a CMOS image sensor, a shortwave infrared (SWIR) image sensor, and a reflectivity sensor.
- One or more of the state sensors may be selected from an optical sensor, an accelerometer, a GPS receiver, a gyroscope, a compass, a magnetic sensor, a sensor indicating the direction of the device relative to the Earth's magnetic field, a gravity sensor, and an RFID detector.
- the data associated with the identified object may be obtained by searching in a memory for data associated with the real world object.
- the predefined object may be, for example, a hand, a part of a hand, two hands, parts of two hands, a finger, part of a finger, or a finger tip.
- the viewing device may be configured to be worn by a user, for example, glasses or goggles.
- the viewing device may be incorporated in a mobile communication device.
- the step of identifying, in the images of the real world scene obtained by the image sensor or sensors, may comprise determining a location (X,Y) of the predefined object in an image obtained by the image sensors and determining one or both of a location and an orientation of the display device as provided by the state sensors.
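- As an illustration of what this identification step might produce, the following minimal sketch pairs the fingertip position in image coordinates with the device pose read from the state sensors; the class names, fields, and callbacks are assumptions made for illustration and are not taken from the patent.

```python
from dataclasses import dataclass
from typing import Optional, Tuple

@dataclass
class DevicePose:
    """Pose of the display device as reported by the state sensors (assumed layout)."""
    latitude: float      # from GPS
    longitude: float     # from GPS
    heading_deg: float   # from compass / magnetic sensor
    pitch_deg: float     # from gyroscope / accelerometer

@dataclass
class PointingObservation:
    """Result of one identification step: where the predefined object is in the image,
    plus the device pose needed to relate that pixel to the real world."""
    fingertip_xy: Tuple[int, int]   # (X, Y) of the predefined object in the image
    pose: DevicePose

def identify_pointing(image, detect_fingertip, read_pose) -> Optional[PointingObservation]:
    # detect_fingertip(image) -> (x, y) or None; read_pose() -> DevicePose.
    # Both are hypothetical callbacks standing in for the gesture detection module
    # and the state sensors.
    xy = detect_fingertip(image)
    if xy is None:
        return None
    return PointingObservation(fingertip_xy=xy, pose=read_pose())
```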
- the method of the invention may further comprise communicating with an external device or website.
- the communication may comprise sending a message to an application running on the external device, a service running on the external device, an operating system running on the external device, a process running on the external device, one or more applications running on a processor of the external device, a software program running in the background of the external device, or to one or more services running on the external device.
- the method may further comprise sending a message to an application running on the mobile communication device, a service running on the mobile communication device, an operating system running on the mobile communication device, a process running on the mobile communication device, one or more applications running on a processor of the mobile communication device, a software program running in the background of the mobile communication device, or to one or more services running on the mobile communication device.
- the method may further comprise sending a message requesting a data relating to a real world object identified in an image from an application running on the external device, a service running on the external device, an operating system running on the external device, a process running on the external device, one or more applications running on a processor of the external device, a software program running in the background of the external device, or to one or more services running on the external device.
- the method may further comprise sending a message requesting a data relating to a real world object identified in an image from an application running on the mobile communication device, a service running on the mobile communication device, an operating system running on the mobile communication device, a process running on the mobile communication device, one or more applications running on a processor of the mobile communication device, a software program running in the background of the mobile communication device, or to one or more services running on the mobile communication device.
- the message to the external device or website may be a command.
- the command may be selected from a command to run an application on the external device or website, a command to stop an application running on the external device or website, a command to activate a service running on the external device or website, a command to stop a service running on the external device or website, or a command to send data relating to a real world object identified in an image.
- the message to the mobile communication device may be a command.
- the command may be selected from a command to run an application on the mobile communication device, a command to stop an application running on the mobile communication device, a command to activate a service running on the mobile communication device, a command to stop a service running on the mobile communication device, or a command to send data relating to a real world object identified in an image.
- the method may further comprise receiving from the external device or website data relating to a real world object identified in an image and presenting the received data to a user.
- the communication with the external device or website may be over a communication network.
- the command to the external device may be selected from depressing a virtual key displayed on a display device of the external device; rotating a selection carousel; switching between desktops, running on the external device a predefined software application; turning off an application on the external device; turning speakers on or off; turning volume up or down; locking the external device, unlocking the external device, skipping to another track in a media player or between IPTV channels; controlling a navigation application; initiating a call, ending a call, presenting a notification, displaying a notification; navigating in a photo or music album gallery, scrolling web-pages, presenting an email, presenting one or more documents or maps, controlling actions in a game, pointing at a map, zooming-in or out on a map or images, painting on an image, grasping an activatable icon and pulling the activatable icon out from the display device, rotating an activatable icon, emulating touch commands on the external device, performing one or more multi-touch commands, a touch gesture command, typing, clicking on a
- the predefined gesture may be selected from a swiping motion, a pinching motion of two fingers, pointing, a left to right gesture, a right to left gesture, an upwards gesture, a downwards gesture, a pushing gesture, opening a clenched fist, opening a clenched fist and moving towards the image sensor, a tapping gesture, a waving gesture, a clapping gesture, a reverse clapping gesture, closing a hand into a fist, a pinching gesture, a reverse pinching gesture, a gesture of splaying fingers on a hand, a reverse gesture of splaying fingers on a hand, pointing at an activatable icon, holding an activating object for a predefined amount of time, clicking on an activatable icon, double clicking on an activatable icon, clicking from the right side on an activatable icon, clicking from the left side on an activatable icon, clicking from the bottom on an activatable icon, clicking from the top on an activatable
- the data associated with the identified object may be any one or more of visual data, audio data, or textual data.
- the data associated with the identified object may be an activatable icon.
- the activatable icon may be a 2D or 3D activatable icon.
- the activatable icon may be perceived by a user in a 3D space in front of the user.
- the method of the invention may have two or more operational modes.
- the method may change the operational mode of the system upon identification of a predefined gesture.
- An operational mode may be specified by any one or more of: the gestures to be identified; the algorithms that are active in the gesture detection module; the resolution of images captured by the image sensor; the capture rate of images captured by the image sensor; the level of detail of the data to be presented; the source of the data to be presented; the activatable icons to be displayed on the display device; and an active on-line service.
- the operational mode may be a mode selected from: a mode of video recording of images by the image sensor upon identification of a predefined gesture; a mode of recording sounds by a microphone upon identification of a predefined gesture and stopping the recording upon identification of another predefined gesture; a mode of continuously monitoring video or sound and, following detection of a predefined gesture, recording the video or sound starting from a predefined amount of time prior to identification of the gesture and stopping the recording after identification of another predefined gesture; a mode of adding tags in a captured and real-time recorded video upon identification of a predefined gesture; a mode of selecting an area in the field of view as captured by the camera, copying the area to another location in the field of view, and resizing it; a mode employing a tracker on a selected area in an image and presenting the selected area in real-time in the resized and relocated area on the display device; and a mode of capturing an image upon identification of a predefined gesture.
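- One way such operational modes could bundle these parameters is illustrated by the sketch below, which defines a hypothetical mode table and switches modes when a predefined gesture is identified; the mode names, gesture names, and parameter values are assumptions, not taken from the patent.

```python
from dataclasses import dataclass
from typing import Dict, List, Tuple

@dataclass
class OperationalMode:
    gestures: List[str]          # gestures the detection module listens for in this mode
    capture_rate_fps: int        # image-sensor capture rate
    resolution: Tuple[int, int]  # image resolution analyzed
    detail_level: str            # level of detail of data presented to the user
    icons: List[str]             # activatable icons shown in this mode

MODES: Dict[str, OperationalMode] = {
    "standby": OperationalMode(["wake_wave"], 2, (320, 240), "none", []),
    "browse":  OperationalMode(["point", "swipe_left", "swipe_right", "record_pinch"],
                               15, (640, 480), "summary", ["info"]),
    "record":  OperationalMode(["stop_tap"], 30, (1280, 720), "full", ["stop"]),
}

class ModeController:
    def __init__(self, initial: str = "standby"):
        self.current = initial

    def on_gesture(self, gesture: str) -> None:
        # A predefined gesture may change the operational mode (assumed mapping).
        transitions = {"wake_wave": "browse", "record_pinch": "record", "stop_tap": "browse"}
        if gesture in transitions:
            self.current = transitions[gesture]

    def params(self) -> OperationalMode:
        return MODES[self.current]
```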
- the method of the invention may further comprise running a tracking algorithm that tracks the identified real world object and maintains the displayed associated visual data in a fixed position relative to the identified real world object.
- An object recognition module may be employed to detect the predefined object only when the display device has level of motion below a predetermined threshold.
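- A sketch of one way to gate object recognition on device motion is shown below: motion is estimated from consecutive accelerometer readings and detection runs only while it stays below a threshold. The threshold value and helper names are assumptions for illustration.

```python
import numpy as np

MOTION_THRESHOLD = 0.15  # assumed units: change in acceleration (g) per sample

def motion_level(accel_samples: np.ndarray) -> float:
    # accel_samples: (N, 3) array of recent accelerometer readings.
    # Mean magnitude of sample-to-sample change is used as a crude motion measure.
    deltas = np.diff(accel_samples, axis=0)
    return float(np.linalg.norm(deltas, axis=1).mean())

def maybe_detect(frame, accel_samples, detect_predefined_object):
    """Run the (assumed) detector only when the display device is sufficiently still."""
    if motion_level(accel_samples) < MOTION_THRESHOLD:
        return detect_predefined_object(frame)
    return None
```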
- the method may further comprise providing feedback when a predefined gesture has been identified.
- the feedback may be, for example, visual feedback, audio feedback, haptic feedback, directional vibration, air tactile feedback, or ultrasonic feedback.
- the feedback may be a visual indication in a form selected from an activatable icon displayed on the display device, a change in an activatable icon displayed on the display device, a change in color of an activatable icon displayed on the display device, a change in size of an activatable icon displayed on the display device, animation of an activatable icon displayed on the display device, an indication light, an indicator moving on a display device, an indicator moving on the display device that appears on top of all other images or video appearing on the display device and the appearance of a glow around the predefined object.
- the feedback may be a vibration, a directional vibration indication, or an air tactile indication.
- part of an activatable icon displayed on the display device may not be presented where the predefined object is located, so that the predefined object appears to be on top of the activatable icon.
- Activatable icons may be removed from the display device when the display device has a level of activity above a predefined threshold.
- the removed activatable icons may be returned to the display device, for example, when the display device has a level of motion below the predefined threshold.
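- The occlusion effect described above can be illustrated with a simple compositing sketch: wherever a mask of the predefined object overlaps the icon, the icon pixels are simply not drawn, so the object appears in front of the icon. The mask representation and blending scheme are assumptions.

```python
import numpy as np

def composite_icon(frame: np.ndarray, icon: np.ndarray, top_left: tuple,
                   object_mask: np.ndarray) -> np.ndarray:
    """Draw `icon` onto `frame` at `top_left`, skipping pixels covered by `object_mask`
    (a boolean mask of the predefined object, same height/width as `frame`).
    Assumes the icon fits entirely inside the frame."""
    out = frame.copy()
    y, x = top_left
    h, w = icon.shape[:2]
    region = out[y:y + h, x:x + w]
    covered = object_mask[y:y + h, x:x + w]   # True where the hand/finger is
    region[~covered] = icon[~covered]         # icon drawn only where the object is absent
    out[y:y + h, x:x + w] = region
    return out
```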
- the method may be brought into an active mode when a predefined action is performed.
- the predefined action may be selected from: bringing the predefined object into the field of view from below; the user placing the predefined object in a certain location or pose, such as pointing at the bottom right corner of the camera field of view or opening a hand in the camera field of view; performing a predefined gesture such as moving the hand from right to left across the field of view; when an activatable icon is displayed, performing a predefined gesture correlated to the activatable icon, such as pointing at the activatable icon, performing a waving gesture at the location where the activatable icon is presented, or sliding the floating activatable icon from one location to another by performing a gesture in the 3D space where the activatable icon is perceived to be located; touching the device; or tapping on the device if the device is provided with an accelerometer.
- the system may enter the active mode when the user passes a hand near the device if the device is provided with a proximity sensor, or ultrasonic sensor.
- the system may also be activated by a voice command, or when the user places the predefined object in a particular location in the field of view.
- the system may enter the active mode only when there is relevant data associated with the real world in the field of view of the user. The system may indicate to the user when there is relevant data to be presented, or when it is ready for interaction.
- the method of the invention may further comprise attaching a visual indication to a real-world object indicating the existence in a memory of data correlated with the real-world object.
- the visual indication may be overlaid on an image of the real-world object.
- the visual indication may be selected from an activatable icon, a photo, and an image of an envelope.
- the method of the invention may further comprise a calibration process to record one or more physical parameters of the predefined object.
- the calibration process may comprise any one or more steps selected from presenting on the display activatable icons in different locations in a 3D space, extracting physical features of the predefined object, and determining a correlation between dimensions of the predefined object and its distance from the camera.
- the calibration process may comprise a step of constructing a triangle having vertices at one of the image sensors and at a tip of the predefined object and having a side formed by a user's line of sight. The distance of the real world object from the camera may be estimated based on information extracted in the calibration.
- the method may further comprise displaying a keyboard enabling text typing.
- the keyboard may be displayed upon detection of a predefined gesture, such as a gesture from right to left, presenting an open hand, presenting two open hands in a predefined region of the field of view of an image sensor.
- the keyboard may be displayed upon performing a click gesture in a 3D typing area or where a predefined activatable icon is perceived to be located.
- the invention also provides a system comprising a device configured to execute the method of the invention.
- the invention also provides a computer program comprising computer program code means for performing all the steps of the method of the invention when said program is run on a computer.
- the computer program may be embodied on a computer readable medium.
- a user may interact with a visual image typically displayed through glasses.
- the user's view of reality is, thus, augmented by the information presented on the display.
- One issue with augmented reality devices is the manner in which the user interacts with and controls the device.
- Traditional control devices e.g., a mouse, track ball, or touch screen, are difficult to use with augmented reality devices.
- Using gesture recognition in an augmented reality system is not trivial, because the user, and thus the augmented reality device, is constantly moving in real time.
- the invention thus provides a computer program product containing instructions for causing a processor to perform a method comprising:
- the augmented information may include at least one of information associated with objects in the environment; images associated with the environment; and distances associated with the environment.
- the correlating may include determining a reference location in three dimensional space of at least a portion of the user's hand, and determining in at least one of the augmented information and the image information data associated with the reference location.
- the altering may include changing the augmented information as a function of the data associated with the reference location.
- FIG. 1 shows schematically a system for augmented reality in accordance with one embodiment of the invention
- FIG. 2 shows a system for augmented reality comprising a set of goggles in accordance with one embodiment of the invention
- FIG. 3 shows the system of FIG. 2 in use
- FIG. 4 a shows a view of a real-world scene displayed on a display device of the system of FIG. 2
- FIG. 4 b shows the view of FIG. 4 a with the user's finger pointing to an object in the view
- FIG. 4 c shows visual text relating to the object at which the user's finger is pointing overlaid on the view of FIG. 4 b;
- FIG. 5 shows a system for augmented reality integral with a communication device in accordance with another embodiment of the invention.
- FIG. 6 a shows designating an area in the field of view of an image sensor by the user performing a gesture of “drawing” the contour of the area
- FIG. 6 b shows resizing the selected area by performing a second gesture
- FIG. 6 c shows the area after resizing
- FIG. 6 d shows the area after being dragged to a new location in the field of view.
- FIG. 1 shows schematically a system 30 for augmented reality in accordance with one embodiment of the invention.
- the system 30 includes one or more image sensors 32 configured to obtain images of a real world scene. Any type of image sensor may be used in the system of the invention, such as a camera, a light sensor, an IR sensor, an ultrasonic sensor, a proximity sensor, a CMOS image sensor, a shortwave infrared (SWIR) image sensor, or a reflectivity sensor.
- the system 30 further includes a viewing device 34 having one or more display devices 35 that enable a user to see both the real world scene and external information, such as images, videos, or audio signals, superimposed upon the real world scene.
- Any type of display device that allows a user to both see the real world scene and the displayed data may be used in the system of the invention.
- the display devices 35 may comprise, for example, a surface upon which visual material is presented to a user or one or more projectors that display images directly to the user's retina.
- a processor 36 obtains orientation and/or location data of the system 30 from one or more state sensors 38, which may be, for example, any one or more of an optical sensor, an accelerometer, a GPS receiver, a gyroscope, a solid state compass, a magnetic sensor, a gravity sensor, and an RFID detector.
- the processor 36 may be, for example, a dedicated processor, a general purpose processor, a DSP (digital signal processor), a GPU (graphics processing unit), dedicated hardware, or a processor that can run on an external device.
- the system 30 may run as software on the viewing device 34 or on another device 37, such as a smartphone, that incorporates the other components of the system 30.
- the processor 36 is configured to run a gesture detection module 40 that identifies in images of the real world scene obtained by the image sensor 32 one or more real world objects at which a predefined object is pointing.
- the real world objects may be, for example, a building or a billboard. Determination of the real world objects utilizes data provided by the state sensors 38 .
- the predefined object may be a user's finger or other object such as a stylus or wand.
- the processor 36 When the processor 36 has identified a real world object at which the predefined object is pointing, the processor searches in a memory 42 for data associated with the identified object.
- the data may be, for example, visual data, audio data, or textual data.
- the visual data may be textual information relating to the identified object.
- the processor displays the associated visual data associated with the identified object on the display of the viewing device.
- the memory 42 may be integral with the system 30 or may be remotely located and accessed over a communication network, such as the Internet.
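- Putting the pieces of this description together, a high-level loop for the processor 36 might look like the following sketch; the helper objects (image sensor, state sensors, gesture detector, object resolver, memory lookup, and display renderer) are hypothetical stand-ins for the modules named above, not an implementation disclosed in the patent.

```python
def augmented_reality_loop(image_sensor, state_sensors, gesture_detector,
                           resolve_world_object, memory, display):
    """Sketch of the processing loop: detect the predefined object, resolve which
    real-world object it points at, fetch associated data, and display it."""
    while True:
        frame = image_sensor.capture()
        pose = state_sensors.read()                # location and/or orientation data
        fingertip = gesture_detector.find(frame)   # (x, y) in the image, or None
        if fingertip is None:
            display.show(frame)                    # nothing to augment in this frame
            continue
        world_object = resolve_world_object(fingertip, pose)
        data = memory.lookup(world_object)         # local memory 42 or a remote store
        display.show(frame, overlay=data, anchor=world_object)
```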
- the system 30 may thus comprise a communication module 39 allowing the system 30 to communicate with a network, wireless network, cellular network, an external device such as another device 30 , a mobile phone, tablet, or an Internet website and so on.
- the data may be an activatable icon.
- activatable icon refers to a region in an image or video associated with one or more messages or commands that are activated by a user interaction.
- the activatable icons may be, for example, a 2D or 3D visual element such as virtual buttons, a virtual keyboard or icon.
- Activatable icons are activated by means of one or more predefined objects that are recognizable by the system, and may be, for example, a stylus, one or more of a user's hands or a portion of a hand, one or more fingers or a portion of a finger such as a finger tip.
- Activation of one or more of the activatable icons by a predefined object results in the generation of a message or a command addressed to an operating system, one or more services, one or more applications, one or more devices, one or more remote applications, one or more remote services, or one or more remote devices.
- the processor 36 may be configured to send a message or command to the device 37 or to a remote device, to an application running on the device, to a service running on the device 37, to an operating system running on the device, to a process running on the device, to a software program running in the background, or to one or more services running on the device.
- the message or command may be sent over a communication network such as the Internet or a cellular phone network.
- the command may be, for example, a command to run an application on the device, a command to stop an application running on the device, a command to activate a service running on the device, a command to stop a service running on the device, or a command to send data to the processor 36 relating to a real world object identified in an image by the processor 36 .
- the command may be a command to the device 37 such as depressing a virtual key displayed on a display device of the device; rotating a selection carousel; switching between desktops, running on the device a predefined software application; turning off an application on the device; turning speakers on or off; turning volume up or down; locking the device, unlocking the device, skipping to another track in a media player or between IPTV channels; controlling a navigation application; initiating a call, ending a call, presenting a notification, displaying a notification; navigating in a photo or music album gallery, scrolling web-pages, presenting an email, presenting one or more documents or maps, controlling actions in a game, controlling interactive video or animated content, editing video or images, pointing at a map, zooming-in or out on a map or images, painting on an image, pushing an activatable icon away from the display device, grasping an activatable icon and pulling the activatable icon out from the display device, rotating an activatable icon, emulating touch commands on the device,
- the communication module may be used to transmit a message that may be addressed, for example, to a remote device.
- the message may be, for example a command to a remote device.
- the command may be, for example a command to run an application on the remote device, a command to stop an application running on the remote device, a command to activate a service running on the remote device, a command to stop a service running on the remote device.
- the message may be a command to the remote device selected from depressing a virtual key displayed on a display device of the remote device; rotating a selection carousel; switching between desktops, running on the remote device a predefined software application; turning off an application on the remote device; turning speakers on or off; turning volume up or down; locking the remote device, unlocking the remote device, skipping to another track in a media player or between IPTV channels; controlling a navigation application; initiating a call, ending a call, presenting a notification, displaying a notification; navigating in a photo or music album gallery, scrolling web-pages, presenting an email, presenting one or more documents or maps, controlling actions in a game, pointing at a map, zooming-in or out on a map or images, painting on an image, grasping an activatable icon and pulling the activatable icon out from the display device, rotating an activatable icon, emulating touch commands on the remote device, performing one or more multi-touch commands, a touch gesture command, typing, clicking
- the message can be a request for data associated with the identified object.
- the data request message may be addressed to an application, a service, a process, or a thread running on the device, to an application, a service, a process, or a thread running on an external device, or to an online service.
- an object recognition module for detecting the predefined object can be employed only when the headset is not moving significantly, as determined from information obtained by the state sensors.
- FIG. 2 shows a system 2 for augmented reality in accordance with one embodiment of the invention.
- the system 2 comprises a portable viewing device that may be for example, an interactive head-mounted eyepiece such as a pair of eyeglasses or goggles 4 .
- the goggles 4 are provided with an image sensor 6 that obtains images of a real-world scene 8 .
- the scene 8 may include, for example, one or more buildings 12 , or one or more billboards 14 .
- the goggles may be provided with one or more display devices 10 that are located in the goggles 4 so as to be positioned in front of a user's eyes when the goggles 4 are worn by the user.
- the display devices 10 may be, for example, see-through devices such as transparent LCD screens through which the real world scene is viewed, together with presenting external data.
- the system 2 further comprises a processor 16 that is configured to identify, in images captured by the image sensor 6, a predefined object performing a gesture or pointing at a real world object in the real world scene 8 or at activatable icons displayed to the user.
- the system 2 also includes one or more location and/or orientation sensors 23 such as a GPS receiver, an accelerometer, a gyroscope, a solid state compass, a magnetic sensor, or a gravity sensor.
- FIG. 5 shows a system 40 for augmented reality in accordance with another embodiment of the invention.
- the system 40 is integrated into a mobile communication device 42 such as a mobile phone, tablet, or camera.
- a front view of the communication device 42 is shown in FIG. 5 a
- a rear view of the communication device 42 is shown in FIG. 5 b .
- the communication device 42 is provided with an image sensor 46 on its rear surface, opposite to the display device, that obtains images of a real-world scene.
- the communication device 42 is also provided with a display device 48 on its front surface that is positioned in front of a user when the camera 46 is directed towards a real world scene.
- the display device 48 may be, for example, an LCD screen that presents to the user images of a real world scene obtained by the camera 46, together with visual data, as explained below.
- the system 40 utilizes the camera 46 , the display device 48 , and the processor of the communication device 42 , and further comprises one or more state sensors, contained within the housing of the communication device 42 , which are not seen in FIG. 5 .
- the processor is configured to identify, in images captured by the image sensor 46, a predefined object pointing at a real world object in the real world scene.
- FIG. 3 a shows the system 2 in use.
- the goggles 4 are placed over the eyes of a user 18 .
- the user faces the real world scene 8 and thus views the scene 8 .
- FIG. 3 b shows the system 40 in use.
- the user 18 holds the communication device 42 with the image sensors 46 facing the real world scene 8 and the display device 48 facing the user.
- the system 2 or 40 now executes the following process.
- the view of the scene 8 that the user would see when using the system 2 or 40 is displayed on the display device.
- FIG. 4 a shows the view of the scene 8 that the user would see when using the system 2 or 40 to view the real world scene 8 .
- the processor 36 analyzes images obtained by the image sensors to determine when a predefined object in images captured by the image sensors is performing a predefined gesture in relation to a real world object in the real world scene 8 .
- the viewing device 34 such as the goggles 4 or the communication device 42 is often not stable in use, due to movement of the user as occurs during walking, or movement of the user's head or hand. In this situation, the signal generated by the sensors 38 may be noisy and inaccurate. In this case, the machine vision module 37 runs a tracking algorithm that tracks the identified real world object and maintains the displayed associated visual data in a fixed position relative to the identified real world object.
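- A minimal illustration of such a tracker is sketched below, assuming OpenCV template matching is used to re-locate the identified real world object in each new frame so that the overlay can be redrawn at the tracked position; the patch size and matching threshold are assumptions, and the patent does not prescribe this particular tracking method.

```python
import cv2
import numpy as np

class OverlayTracker:
    """Track a small template around the identified real-world object and report
    where the associated visual data should be drawn in the current frame."""

    def __init__(self, first_frame: np.ndarray, anchor_xy: tuple, patch: int = 48):
        # Assumes the anchor is far enough from the frame border for a full patch.
        x, y = anchor_xy
        self.template = first_frame[y - patch // 2:y + patch // 2,
                                    x - patch // 2:x + patch // 2].copy()
        self.anchor = anchor_xy

    def update(self, frame: np.ndarray, min_score: float = 0.6):
        result = cv2.matchTemplate(frame, self.template, cv2.TM_CCOEFF_NORMED)
        _, score, _, top_left = cv2.minMaxLoc(result)
        if score < min_score:
            return self.anchor          # low confidence: keep the last known position
        h, w = self.template.shape[:2]
        self.anchor = (top_left[0] + w // 2, top_left[1] + h // 2)
        return self.anchor
```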
- the predefined gesture relating to a real world object or to an activatable icon may be, for example, pointing at the real world object or an activatable icon, or performing a swiping gesture over the real world object or an activatable icon.
- the activatable icon may or may not be correlated to a real world object.
- Other possible predefined gestures include a swiping motion, a pinching motion of two fingers such as with the forefinger and thumb or the middle finger and thumb, pointing, a left to right gesture, a right to left gesture, an upwards gesture, a downwards gesture, a pushing gesture, opening a clenched fist, opening a clenched fist and moving towards the image sensor, a tapping gesture, a waving gesture, a clapping gesture, a reverse clapping gesture, closing a hand into a fist, a pinching gesture, a reverse pinching gesture, a gesture of splaying fingers on a hand, a reverse gesture of splaying fingers on a hand, pointing at an activatable icon, or at a real world object, pointing at an activatable icon or a real world object for a predefined amount of time, clicking on an activatable icon or real world object, double clicking on an activatable icon or real world object, clicking with a forefinger on an activ
- the predefined object may be, for example, a user hand, a part of a user's hand, such as the user's finger 20 or parts of two different hands.
- the predefined object may be a stylus or wand.
- the feedback may be a visual indication in a form selected from an activatable icon displayed on a display device, a change in an activatable icon on a display device, a change in color of an activatable icon on a display device, a change in size of an activatable icon, animation of an activatable icon, an indication light, an indicator moving on a display device, a vibration, a directional vibration indication, an air tactile indication.
- the indication may be provided by an indicator moving on a display device that appears on top of all other images or video appearing on the display device.
- Visual feedback may be the appearance of a glow around the predefined object when a system recognizes the predefined object.
- the gesture detection module 40 may use any method for detecting the predefined objects in images obtained by the image sensor 32 .
- the gesture detection module may detect the predefined object as disclosed in WO2005/091125 or WO 2010/086866.
- the processor 16 is further configured to determine the real world object in the scene 8 towards which the predefined gesture was performed. Thus, for example, in the image shown in FIG. 4 b , the processor 16 would determine that the user's finger 20 is pointing at the billboard 14 by determining the fingertip location (X,Y) in the image and combining this information with the location of the user and the orientation of the goggles 4 from the state sensors 21 .
- the real world object is thus identified by the processor without presenting to the user a cursor or other marker to indicate the real world object that the user wishes to select, enabling direct pointing at a real world object to start an interaction.
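- The combination of fingertip image position, user location, and device orientation can be illustrated with a simple bearing calculation, as a sketch: the fingertip's horizontal offset from the image centre is converted to an angle using the camera's field of view, added to the compass heading, and matched against the known bearings of candidate objects such as the billboard 14. The field-of-view value, tolerance, and candidate table are assumptions; the patent does not specify this particular geometry.

```python
import math

HFOV_DEG = 60.0      # assumed horizontal field of view of the image sensor
IMAGE_WIDTH = 640    # assumed image width in pixels

def pointing_bearing(fingertip_x: int, heading_deg: float) -> float:
    """Absolute bearing (degrees) of the direction the user is pointing at."""
    offset_deg = (fingertip_x - IMAGE_WIDTH / 2) / IMAGE_WIDTH * HFOV_DEG
    return (heading_deg + offset_deg) % 360.0

def resolve_object(fingertip_x: int, heading_deg: float,
                   candidates: dict, tolerance_deg: float = 5.0):
    """candidates: {object_name: bearing_deg of the object from the user's GPS position}."""
    bearing = pointing_bearing(fingertip_x, heading_deg)
    best, best_err = None, tolerance_deg
    for name, obj_bearing in candidates.items():
        err = abs((obj_bearing - bearing + 180.0) % 360.0 - 180.0)  # wrap-aware difference
        if err < best_err:
            best, best_err = name, err
    return best

# Example: the user faces east (90 deg) and the billboard lies at 95 deg from the user.
print(resolve_object(fingertip_x=420, heading_deg=90.0, candidates={"billboard_14": 95.0}))
```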
- the processor 16 searches in a memory, which may be integral with the processor 16 or may be remotely located, for data relating to the real-world object to which the user's finger 20 is pointing.
- the memory may have stored data relating to the billboard 14 .
- the data is displayed on the display device 10 superimposed on the user's view of the scene.
- visual data 21 relating to the billboard 14 is displayed on the display device 10 , as shown in FIG. 4 c.
- the visual data 21 may be static or animated.
- the visual data 21 may include one or more activatable icons, such that when a predefined gesture is performed relative to one of the activatable icons, a command associated with the activatable icon is executed.
- the command may be, for example, to display specific visual material relating to the selected real world object.
- the activatable icons may be 2D or 3D activatable icons and may be presented to the user so that the user perceives the icon in front of him in a 3D space.
- an activatable icon is a region in a 2D or 3D image or video associated with one or more messages activated by user interaction.
- the activatable icons may be, for example, a 2D or 3D visual element.
- the activatable icons may be virtual buttons, a virtual keyboard, a 2D or 3D activatable icon, a region in an image or a video.
- An activatable icon may consist of two or more activatable icons.
- the processor may not present part of the activatable icon where the predefined object is located, so that the predefined object appears to be on top of the activatable icon.
- the activatable icons may be removed when the user rapidly moves his head and then returned when the head motion is below a predefined motion speed.
- the system 2 may have two or more operational modes and the processor 16 may be configured to identify one or more predefined gestures to change between the operational modes.
- a gesture may be used to turn the system on or off, select the source of the visual material to be presented, select the level of details of the visual material to be presented, select the buttons or activatable icons to be presented to the user, or activate an online service, such as an online service related to a selected real world object.
- Yet another mode of operation may be to start video recording of images by the image sensor and/or recording of sounds by a microphone upon identification of a predefined gesture and to stop recording upon identification of another predefined gesture.
- Yet another mode of operation is continuously monitoring video and/or sound, but following a detection of a predefined gesture, the video/sound is recorded starting from a predetermined amount of time prior to identification of the gesture, and stopping the recording after identification of another predefined gesture.
- the predetermined time may be defined by the user.
- Yet another mode of operation is adding tags in a captured and real-time recorded video upon identification of a predefined gesture.
- FIG. 6 Yet another mode of operation is shown in FIG. 6 .
- an area 62 in the field of view 60 as captured by the image sensor is designated by the user performing a gesture of “drawing” the contour of the area, shown by phantom lines in FIG. 6 a .
- the selected area is then resized by the user performing a second gesture, such as separating two fingers or bringing two fingers closer together as indicated by the arrows 66 in FIG. 6 b , until the selected area attains the desired size ( 67 in FIG. 6 c ).
- the area 67 is then dragged to a new location in the field of view ( FIG. 6 d ) and copied in the new location in the field of view.
- the system then employs a tracker on the selected area and the selected area is presented in real-time in the resized and relocated area set by the user on the display device.
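- A sketch of this select, resize, and relocate behaviour of FIG. 6 is given below, assuming OpenCV: a region of interest is cut from the camera frame, resized to the size the user set with the second gesture, and redrawn at the dragged-to location on every new frame so it updates in real time. The coordinates and sizes are placeholders, and the real system would update the rectangles from the user's gestures.

```python
import cv2
import numpy as np

class RelocatedRegion:
    """Present a user-selected area of the camera view, resized and relocated."""

    def __init__(self, src_rect, dst_rect):
        self.src = src_rect   # (x, y, w, h) drawn by the "contour" gesture
        self.dst = dst_rect   # (x, y, w, h) after resizing and dragging

    def render(self, frame: np.ndarray) -> np.ndarray:
        sx, sy, sw, sh = self.src
        dx, dy, dw, dh = self.dst
        patch = frame[sy:sy + sh, sx:sx + sw]       # the selected area in this frame
        patch = cv2.resize(patch, (dw, dh))         # resize to the user-set size
        out = frame.copy()
        out[dy:dy + dh, dx:dx + dw] = patch         # copy to the new location
        return out

# Usage with placeholder rectangles on a synthetic frame:
frame = np.zeros((480, 640, 3), dtype=np.uint8)
view = RelocatedRegion(src_rect=(100, 120, 80, 60), dst_rect=(400, 40, 160, 120)).render(frame)
```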
- a region of the images containing a bounding box around each displayed activatable icon may be defined that remains fixed.
- the system employs a machine vision tracker to track this bounding box.
- the headset is determined not to be moving significantly when the distance between the locations of the bounding box in two frames of a video sequence is less than a predefined distance, as determined using a video tracker, and the correlation value of the tracker of the bounding box is below a predefined value.
- CPU usage can be minimized by searching for the predefined object only in the vicinity of each displayed activatable icon.
- the object recognition module is not activated all the time, but only when the headset is not moving significantly, as determined from information obtained by the state sensors.
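- As a sketch of this optimization: when the tracked bounding boxes have stayed put between frames, the detector is run only on crops around each displayed activatable icon instead of on the full frame. The stability test, margin, and detector callback are assumptions for illustration.

```python
import numpy as np

def stable(prev_box, cur_box, max_shift: float = 4.0) -> bool:
    """Boxes are (x, y, w, h); stable if the box moved less than max_shift pixels."""
    return np.hypot(cur_box[0] - prev_box[0], cur_box[1] - prev_box[1]) < max_shift

def detect_near_icons(frame, icon_boxes, prev_boxes, detect, margin: int = 30):
    """Run `detect` (an assumed fingertip detector) only in crops around stable icons."""
    hits = []
    for box, prev in zip(icon_boxes, prev_boxes):
        if not stable(prev, box):
            continue                       # device moving too much near this icon: skip
        x, y, w, h = box
        x0, y0 = max(0, x - margin), max(0, y - margin)
        crop = frame[y0:y + h + margin, x0:x + w + margin]
        found = detect(crop)
        if found is not None:
            fx, fy = found
            hits.append((fx + x0, fy + y0))  # convert back to full-frame coordinates
    return hits
```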
- a user may choose different filters to screen data correlated with real-world objects, such as a filter to display only data generated by friends, data from registered sources, or data generated in the last three months.
- the system 2 may have a stand-by mode in which the power consumption by the system 2 is minimal.
- the active mode may be different from the stand-by mode, for example, in the number of video frames per second that are being analyzed by the system, the resolution of images that are being analyzed, the portion of the image frame that is being analyzed, and/or the detection modules that are activated.
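- These differences between the stand-by and active modes can be captured in a small parameter table, as in the sketch below; the patent does not specify concrete values, so the numbers and module names are purely illustrative.

```python
# Assumed, illustrative parameters for the two power modes.
POWER_MODES = {
    "standby": {
        "frames_per_second": 1,                     # analyze very few frames
        "resolution": (160, 120),                   # low-resolution analysis
        "analyzed_region": "bottom_right_corner",   # only the assumed wake-up region
        "detection_modules": ["wake_gesture"],
    },
    "active": {
        "frames_per_second": 30,
        "resolution": (640, 480),
        "analyzed_region": "full_frame",
        "detection_modules": ["fingertip", "gestures", "object_recognition"],
    },
}
```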
- the system 2 can be brought to the active mode by any technique.
- the system 2 may be brought to the active mode, for example, by bringing the predefined object into the field of view from below; by placing the predefined object in a certain location or pose, such as pointing at the bottom right corner of the camera's field of view or opening the hand in the field of view; by performing a predefined gesture, such as moving the hand from right to left across the field of view; when an activatable icon is displayed, by performing a predefined gesture correlated with the activatable icon, such as pointing at the icon, performing a waving gesture at the location where the icon is presented, or sliding the floating activatable icon from one location to another by performing a gesture in the 3D space where the icon is perceived to be located; by touching the device; or by tapping on the device, if the device is provided with an accelerometer.
- the system may enter the active mode when the user passes a hand near the device, if the device is provided with a proximity sensor or an ultrasonic sensor.
- the system may also be activated by a voice command, or when the user places the predefined object in a particular location in the field of view.
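- These transitions between stand-by and active mode can be summarized as a small state machine; the sketch below uses hypothetical trigger names and illustrative per-mode analysis settings, and is not the disclosed implementation.

```python
from enum import Enum, auto

class Mode(Enum):
    STAND_BY = auto()
    ACTIVE = auto()

# Frame rate and resolution analyzed in each mode (illustrative values only).
MODE_SETTINGS = {
    Mode.STAND_BY: {"fps_analyzed": 2,  "scale": 0.25},
    Mode.ACTIVE:   {"fps_analyzed": 30, "scale": 1.0},
}

def next_mode(mode, events):
    """Switch between stand-by and active mode based on any detected trigger
    (events is a set of trigger names produced by the detection modules)."""
    triggers = {"hand_entered_from_below", "right_to_left_swipe",
                "pointed_at_icon", "device_tapped", "voice_command",
                "proximity_wave"}
    if mode is Mode.STAND_BY and events & triggers:
        return Mode.ACTIVE
    if mode is Mode.ACTIVE and "deactivation_gesture" in events:
        return Mode.STAND_BY
    return mode
```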
- the system may enter the active mode only when there is relevant data associated with the real world in the field of view of the user. The system may indicate to the user when there is relevant data to be presented, or when it is ready for interaction.
- a visual indication may be attached to a real-world object to let the user know that there is data correlated with the real-world object.
- An indication of relevant data may be overlaid on the location of the real-world object as a small visual indication: an activatable icon of "i" may indicate information, a logo of "photos" may indicate images related to the real-world object, and a logo of an "envelope" may indicate a message left by a friend or other user correlated with the real-world object.
- the system 2 may be configured to undergo a calibration process to record various physical parameters of the predefined object, so as to facilitate identification of the predefined object by the processor in images obtained by the camera. This may be done, for example, by presenting activatable icons to the user on the display in different locations in the 3D space, extracting physical features of the predefined object, such as its size or orientation, and determining a correlation between the dimensions of the predefined object and its distance from the camera.
- the calibration may involve calculating the triangle formed by the camera, the user's line of sight, and the tip of the predefined object, in order to determine what the user is pointing at. The accuracy is improved by estimating the distance of the real-world object from the camera based on information extracted during the calibration.
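- Under a simple pinhole-camera assumption, the correlation between the object's apparent size and its distance reduces to a roughly constant product of pixel width and distance, as in the following illustrative sketch; the function names and numeric values are assumptions, not the patent's calibration procedure.

```python
def calibrate_scale(observed_width_px, known_distance_m):
    """During calibration the predefined object is observed at a known distance;
    for a pinhole camera the product width*distance is roughly constant
    (~ focal_length_px * real_width_m), so it can be reused later."""
    return observed_width_px * known_distance_m

def estimate_distance(observed_width_px, scale):
    """Estimate how far the predefined object (e.g., a fingertip) is from the
    camera, given its current apparent width in pixels."""
    return scale / observed_width_px

# Example: the fingertip appeared 60 px wide at a calibrated distance of 0.4 m.
scale = calibrate_scale(60, 0.4)
print(estimate_distance(40, scale))   # fingertip farther away -> ~0.6 m
```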
- the processor may be configured to identify, in images of the real world scene obtained by the camera, another user of a system of the invention.
- the identification of another user in the real world scene may be performed, for example, by informing a remote server of the locations of the devices in a particular geographical area.
- the locations of the other devices can be sent to all of the devices in the geographical area.
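- A minimal sketch of how a remote server might report nearby devices follows, assuming locations are exchanged as latitude/longitude pairs and that proximity is measured with the haversine formula; the function names and radius are illustrative only.

```python
import math

def nearby_devices(own_id, locations, radius_m=200.0):
    """Given device locations reported to the server (dev_id -> (lat, lon) in
    degrees), return the ids of other devices within radius_m of own_id,
    e.g. other users who may then be represented as avatars."""
    def haversine(a, b):
        R = 6_371_000.0                        # mean Earth radius in metres
        lat1, lon1, lat2, lon2 = map(math.radians, (*a, *b))
        dlat, dlon = lat2 - lat1, lon2 - lon1
        h = (math.sin(dlat / 2) ** 2
             + math.cos(lat1) * math.cos(lat2) * math.sin(dlon / 2) ** 2)
        return 2 * R * math.asin(math.sqrt(h))

    own = locations[own_id]
    return [dev_id for dev_id, loc in locations.items()
            if dev_id != own_id and haversine(own, loc) <= radius_m]
```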
- the two systems may be used for game playing.
- the other user may be represented as an avatar with whom the user can interact by gestures, such as sending the other user a message such as a "like".
- the processor may be configured to display a keyboard that enables text typing with one or more fingers or hands.
- Display of the keyboard may be initiated upon detection of a predefined gesture, such as a gesture from right to left, or by the user presenting an open hand, or two open hands, in a predefined region of the field of view of the camera, such as the bottom part of the field of view.
- Yet another way to initiate the display of the keyboard is when the user performs a click gesture in the 3D space where the typing area or an activatable icon is perceived to be located.
- the keyboard may be used, for example, to write a note, conduct a search, or communicate with online services (such as Skype or Twitter) by typing on the virtual keyboard.
- the system may not present part of the keyboard where the predefined object is located, so that
- When the system is in a typing mode, an animated hand may be presented on the keyboard, whose position is correlated with the user's hands and fingers. The fingertips of the animated hand may be located above a virtual keystroke at the location where the character of the keystroke is seen.
- the keyboard and the animated hands are preferably opaque, so that the user is unable to see the background behind the keyboard. This tends to make the keyboard clearer to the user.
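- Mapping a tracked fingertip to the virtual keystroke under it can be as simple as a hit test against the key rectangles of the displayed keyboard, as in this illustrative sketch; the layout and coordinate conventions are assumptions.

```python
def key_under_fingertip(fingertip_xy, key_layout):
    """Return the character of the virtual keystroke located at the fingertip's
    position on the displayed keyboard, or None when no key is under it.
    key_layout maps characters to (x, y, w, h) rectangles in display coordinates."""
    fx, fy = fingertip_xy
    for char, (x, y, w, h) in key_layout.items():
        if x <= fx < x + w and y <= fy < y + h:
            return char
    return None

# A tiny illustrative layout: three keys in a row, each 40x40 display units.
layout = {"a": (0, 0, 40, 40), "s": (40, 0, 40, 40), "d": (80, 0, 40, 40)}
print(key_under_fingertip((55, 10), layout))   # -> "s"
```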
Landscapes
- Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- General Engineering & Computer Science (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Human Computer Interaction (AREA)
- Multimedia (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Computer Graphics (AREA)
- Computer Hardware Design (AREA)
- Software Systems (AREA)
- Health & Medical Sciences (AREA)
- General Health & Medical Sciences (AREA)
- Optics & Photonics (AREA)
- Audiology, Speech & Language Pathology (AREA)
- Psychiatry (AREA)
- Social Psychology (AREA)
- Signal Processing (AREA)
- User Interface Of Digital Computer (AREA)
- Processing Or Creating Images (AREA)
- Position Input By Displaying (AREA)
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US14/345,592 US20140361988A1 (en) | 2011-09-19 | 2012-09-19 | Touch Free Interface for Augmented Reality Systems |
Applications Claiming Priority (3)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US201161536144P | 2011-09-19 | 2011-09-19 | |
PCT/IL2012/050376 WO2013093906A1 (en) | 2011-09-19 | 2012-09-19 | Touch free interface for augmented reality systems |
US14/345,592 US20140361988A1 (en) | 2011-09-19 | 2012-09-19 | Touch Free Interface for Augmented Reality Systems |
Related Parent Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/IL2012/050376 A-371-Of-International WO2013093906A1 (en) | 2011-09-19 | 2012-09-19 | Touch free interface for augmented reality systems |
Related Child Applications (5)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US15/060,533 Continuation US20160259423A1 (en) | 2011-09-19 | 2016-03-03 | Touch fee interface for augmented reality systems |
US15/090,527 Continuation US20160291699A1 (en) | 2011-09-19 | 2016-04-04 | Touch fee interface for augmented reality systems |
US15/096,674 Continuation US20160306433A1 (en) | 2011-09-19 | 2016-04-12 | Touch fee interface for augmented reality systems |
US15/144,209 Continuation US10401967B2 (en) | 2011-09-19 | 2016-05-02 | Touch free interface for augmented reality systems |
US15/256,481 Continuation US20170052599A1 (en) | 2011-09-19 | 2016-09-02 | Touch Free Interface For Augmented Reality Systems |
Publications (1)
Publication Number | Publication Date |
---|---|
US20140361988A1 true US20140361988A1 (en) | 2014-12-11 |
Family
ID=47189999
Family Applications (8)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US14/345,592 Abandoned US20140361988A1 (en) | 2011-09-19 | 2012-09-19 | Touch Free Interface for Augmented Reality Systems |
US15/060,533 Abandoned US20160259423A1 (en) | 2011-09-19 | 2016-03-03 | Touch fee interface for augmented reality systems |
US15/090,527 Abandoned US20160291699A1 (en) | 2011-09-19 | 2016-04-04 | Touch fee interface for augmented reality systems |
US15/096,674 Abandoned US20160306433A1 (en) | 2011-09-19 | 2016-04-12 | Touch fee interface for augmented reality systems |
US15/144,209 Expired - Fee Related US10401967B2 (en) | 2011-09-19 | 2016-05-02 | Touch free interface for augmented reality systems |
US15/256,481 Abandoned US20170052599A1 (en) | 2011-09-19 | 2016-09-02 | Touch Free Interface For Augmented Reality Systems |
US16/557,183 Active US11093045B2 (en) | 2011-09-19 | 2019-08-30 | Systems and methods to augment user interaction with the environment outside of a vehicle |
US17/401,427 Active US11494000B2 (en) | 2011-09-19 | 2021-08-13 | Touch free interface for augmented reality systems |
Family Applications After (7)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US15/060,533 Abandoned US20160259423A1 (en) | 2011-09-19 | 2016-03-03 | Touch fee interface for augmented reality systems |
US15/090,527 Abandoned US20160291699A1 (en) | 2011-09-19 | 2016-04-04 | Touch fee interface for augmented reality systems |
US15/096,674 Abandoned US20160306433A1 (en) | 2011-09-19 | 2016-04-12 | Touch fee interface for augmented reality systems |
US15/144,209 Expired - Fee Related US10401967B2 (en) | 2011-09-19 | 2016-05-02 | Touch free interface for augmented reality systems |
US15/256,481 Abandoned US20170052599A1 (en) | 2011-09-19 | 2016-09-02 | Touch Free Interface For Augmented Reality Systems |
US16/557,183 Active US11093045B2 (en) | 2011-09-19 | 2019-08-30 | Systems and methods to augment user interaction with the environment outside of a vehicle |
US17/401,427 Active US11494000B2 (en) | 2011-09-19 | 2021-08-13 | Touch free interface for augmented reality systems |
Country Status (5)
Country | Link |
---|---|
US (8) | US20140361988A1 (ja) |
JP (3) | JP2014531662A (ja) |
KR (3) | KR20220032059A (ja) |
CN (2) | CN103858073B (ja) |
WO (1) | WO2013093906A1 (ja) |
Cited By (131)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20140028716A1 (en) * | 2012-07-30 | 2014-01-30 | Mitac International Corp. | Method and electronic device for generating an instruction in an augmented reality environment |
US20140125580A1 (en) * | 2012-11-02 | 2014-05-08 | Samsung Electronics Co., Ltd. | Method and device for providing information regarding an object |
US20140225918A1 (en) * | 2013-02-14 | 2014-08-14 | Qualcomm Incorporated | Human-body-gesture-based region and volume selection for hmd |
US20140240226A1 (en) * | 2013-02-27 | 2014-08-28 | Robert Bosch Gmbh | User Interface Apparatus |
US20140270352A1 (en) * | 2013-03-14 | 2014-09-18 | Honda Motor Co., Ltd. | Three dimensional fingertip tracking |
US20140285520A1 (en) * | 2013-03-22 | 2014-09-25 | Industry-University Cooperation Foundation Hanyang University | Wearable display device using augmented reality |
US20140306881A1 (en) * | 2013-04-15 | 2014-10-16 | Olympus Corporation | Wearable device, program and display controlling method of wearable device |
US20140358002A1 (en) * | 2011-12-23 | 2014-12-04 | Koninklijke Philips N.V. | Method and apparatus for interactive display of three dimensional ultrasound images |
US20150146007A1 (en) * | 2013-11-26 | 2015-05-28 | Honeywell International Inc. | Maintenance assistant system |
US20150243105A1 (en) * | 2013-07-12 | 2015-08-27 | Magic Leap, Inc. | Method and system for interacting with user interfaces |
US20150271396A1 (en) * | 2014-03-24 | 2015-09-24 | Samsung Electronics Co., Ltd. | Electronic device and method for image data processing |
US20150356789A1 (en) * | 2013-02-21 | 2015-12-10 | Fujitsu Limited | Display device and display method |
US20150358614A1 (en) * | 2014-06-05 | 2015-12-10 | Samsung Electronics Co., Ltd. | Wearable device and method for providing augmented reality information |
US9213403B1 (en) | 2013-03-27 | 2015-12-15 | Google Inc. | Methods to pan, zoom, crop, and proportionally move on a head mountable display |
US20150364037A1 (en) * | 2014-06-12 | 2015-12-17 | Lg Electronics Inc. | Mobile terminal and control system |
US20160048203A1 (en) * | 2014-08-18 | 2016-02-18 | Universal City Studios Llc | Systems and methods for generating augmented and virtual reality images |
US20160048024A1 (en) * | 2014-08-13 | 2016-02-18 | Beijing Lenovo Software Ltd. | Information processing method and electronic device |
US20160054802A1 (en) * | 2014-08-21 | 2016-02-25 | Samsung Electronics Co., Ltd. | Sensor based ui in hmd incorporating light turning element |
US20160103437A1 (en) * | 2013-06-27 | 2016-04-14 | Abb Technology Ltd | Method and data presenting device for assisting a remote user to provide instructions |
CN105843390A (zh) * | 2016-02-24 | 2016-08-10 | 上海理湃光晶技术有限公司 | 一种图像缩放的方法与基于该方法的ar眼镜 |
JP2016161734A (ja) * | 2015-03-02 | 2016-09-05 | セイコーエプソン株式会社 | 表示装置、表示装置の制御方法、および、プログラム |
US20160313956A1 (en) * | 2015-04-24 | 2016-10-27 | Panasonic Intellectual Property Corporation Of America | Head-mounted display apparatus worn on user's head |
US9507426B2 (en) * | 2013-03-27 | 2016-11-29 | Google Inc. | Using the Z-axis in user interfaces for head mountable displays |
US9526443B1 (en) * | 2013-01-19 | 2016-12-27 | Bertec Corporation | Force and/or motion measurement system and a method of testing a subject |
US20160378294A1 (en) * | 2015-06-24 | 2016-12-29 | Shawn Crispin Wright | Contextual cursor display based on hand tracking |
WO2016206874A1 (de) * | 2015-06-23 | 2016-12-29 | Siemens Aktiengesellschaft | Interaktionssystem |
US20170061700A1 (en) * | 2015-02-13 | 2017-03-02 | Julian Michael Urbach | Intercommunication between a head mounted display and a real world object |
US9685005B2 (en) * | 2015-01-02 | 2017-06-20 | Eon Reality, Inc. | Virtual lasers for interacting with augmented reality environments |
WO2017112099A1 (en) * | 2015-12-23 | 2017-06-29 | Intel Corporation | Text functions in augmented reality |
US9770203B1 (en) | 2013-01-19 | 2017-09-26 | Bertec Corporation | Force measurement system and a method of testing a subject |
WO2017172211A1 (en) * | 2016-03-31 | 2017-10-05 | Intel Corporation | Augmented reality in a field of view including a reflection |
ES2643863A1 (es) * | 2016-05-24 | 2017-11-24 | Sonovisión Ingenieros España, S.A.U. | Método para proporcionar mediante realidad aumentada guiado, inspección y soporte en instalación o mantenimiento de procesos para ensamblajes complejos compatible con s1000d y dispositivo que hace uso del mismo |
WO2018071019A1 (en) * | 2016-10-13 | 2018-04-19 | Ford Motor Company | Dual-mode augmented reality interfaces for mobile devices |
US9990779B2 (en) * | 2016-05-13 | 2018-06-05 | Meta Company | System and method for modifying virtual objects in a virtual environment in response to user interactions |
US10010286B1 (en) | 2013-01-19 | 2018-07-03 | Bertec Corporation | Force measurement system |
US20180189474A1 (en) * | 2016-12-30 | 2018-07-05 | Baidu Online Network Technology (Beijing) Co., Ltd. | Method and Electronic Device for Unlocking Electronic Device |
CN108369640A (zh) * | 2015-12-17 | 2018-08-03 | 诺基亚技术有限公司 | 用于控制场景的捕获图像的图像处理以调适捕获图像的方法、装置或计算机程序 |
US20180225520A1 (en) * | 2015-02-23 | 2018-08-09 | Vivint, Inc. | Techniques for identifying and indexing distinguishing features in a video feed |
US10055888B2 (en) | 2015-04-28 | 2018-08-21 | Microsoft Technology Licensing, Llc | Producing and consuming metadata within multi-dimensional data |
US10068403B1 (en) | 2017-09-21 | 2018-09-04 | Universal City Studios Llc | Locker management techniques |
US10133345B2 (en) | 2016-03-22 | 2018-11-20 | Microsoft Technology Licensing, Llc | Virtual-reality navigation |
US10156726B2 (en) * | 2015-06-29 | 2018-12-18 | Microsoft Technology Licensing, Llc | Graphene in optical systems |
US10168768B1 (en) | 2016-03-02 | 2019-01-01 | Meta Company | Systems and methods to facilitate interactions in an interactive space |
US10186088B2 (en) | 2016-05-13 | 2019-01-22 | Meta Company | System and method for managing interactive virtual frames for virtual objects in a virtual environment |
US10231662B1 (en) | 2013-01-19 | 2019-03-19 | Bertec Corporation | Force measurement system |
US20190129179A1 (en) * | 2017-11-02 | 2019-05-02 | Olympus Corporation | Work assistance apparatus, work assistance method, and computer-readable, non-transitory recording medium recording work assistance program executed by computer |
US10310624B2 (en) * | 2014-12-17 | 2019-06-04 | Konica Minolta, Inc. | Electronic apparatus, method for controlling electronic apparatus, and control program for the same |
US10405024B2 (en) * | 2016-10-26 | 2019-09-03 | Orcam Technologies Ltd. | Wearable device and methods for transmitting information based on physical distance |
US10413230B1 (en) | 2013-01-19 | 2019-09-17 | Bertec Corporation | Force measurement system |
US20190339837A1 (en) * | 2018-05-04 | 2019-11-07 | Oculus Vr, Llc | Copy and Paste in a Virtual Reality Environment |
US10481755B1 (en) * | 2017-04-28 | 2019-11-19 | Meta View, Inc. | Systems and methods to present virtual content in an interactive space |
US10602200B2 (en) | 2014-05-28 | 2020-03-24 | Lucasfilm Entertainment Company Ltd. | Switching modes of a media content item |
US10606359B2 (en) | 2014-12-19 | 2020-03-31 | Immersion Corporation | Systems and methods for haptically-enabled interactions with objects |
US10620779B2 (en) * | 2017-04-24 | 2020-04-14 | Microsoft Technology Licensing, Llc | Navigating a holographic image |
US10646153B1 (en) | 2013-01-19 | 2020-05-12 | Bertec Corporation | Force measurement system |
US10656711B2 (en) | 2016-07-25 | 2020-05-19 | Facebook Technologies, Llc | Methods and apparatus for inferring user intent based on neuromuscular signals |
US10684367B2 (en) | 2014-11-26 | 2020-06-16 | Samsung Electronics Co., Ltd. | Ultrasound sensor and object detecting method thereof |
US10687759B2 (en) | 2018-05-29 | 2020-06-23 | Facebook Technologies, Llc | Shielding techniques for noise reduction in surface electromyography signal measurement and related systems and methods |
US10768426B2 (en) | 2018-05-21 | 2020-09-08 | Microsoft Technology Licensing, Llc | Head mounted display system receiving three-dimensional push notification |
US10789952B2 (en) * | 2018-12-20 | 2020-09-29 | Microsoft Technology Licensing, Llc | Voice command execution from auxiliary input |
US10842407B2 (en) | 2018-08-31 | 2020-11-24 | Facebook Technologies, Llc | Camera-guided interpretation of neuromuscular signals |
US10856796B1 (en) | 2013-01-19 | 2020-12-08 | Bertec Corporation | Force measurement system |
US10905383B2 (en) | 2019-02-28 | 2021-02-02 | Facebook Technologies, Llc | Methods and apparatus for unsupervised one-shot machine learning for classification of human gestures and estimation of applied forces |
US10909767B1 (en) * | 2019-08-01 | 2021-02-02 | International Business Machines Corporation | Focal and interaction driven content replacement into augmented reality |
US10909762B2 (en) | 2018-08-24 | 2021-02-02 | Microsoft Technology Licensing, Llc | Gestures for facilitating interaction with pages in a mixed reality environment |
US10921764B2 (en) | 2018-09-26 | 2021-02-16 | Facebook Technologies, Llc | Neuromuscular control of physical objects in an environment |
US10929099B2 (en) * | 2018-11-02 | 2021-02-23 | Bose Corporation | Spatialized virtual personal assistant |
US10937414B2 (en) | 2018-05-08 | 2021-03-02 | Facebook Technologies, Llc | Systems and methods for text input using neuromuscular information |
US10970936B2 (en) | 2018-10-05 | 2021-04-06 | Facebook Technologies, Llc | Use of neuromuscular signals to provide enhanced interactions with physical objects in an augmented reality environment |
US10970374B2 (en) | 2018-06-14 | 2021-04-06 | Facebook Technologies, Llc | User identification and authentication with neuromuscular signatures |
US10990174B2 (en) | 2016-07-25 | 2021-04-27 | Facebook Technologies, Llc | Methods and apparatus for predicting musculo-skeletal position information using wearable autonomous sensors |
CN112789577A (zh) * | 2018-09-20 | 2021-05-11 | 脸谱科技有限责任公司 | 增强现实系统中的神经肌肉文本输入、书写和绘图 |
US20210158630A1 (en) * | 2019-11-25 | 2021-05-27 | Facebook Technologies, Llc | Systems and methods for contextualized interactions with an environment |
US11036302B1 (en) | 2018-05-08 | 2021-06-15 | Facebook Technologies, Llc | Wearable devices and methods for improved speech recognition |
US11045137B2 (en) | 2018-07-19 | 2021-06-29 | Facebook Technologies, Llc | Methods and apparatus for improved signal robustness for a wearable neuromuscular recording device |
US11052288B1 (en) | 2013-01-19 | 2021-07-06 | Bertec Corporation | Force measurement system |
US11069148B2 (en) | 2018-01-25 | 2021-07-20 | Facebook Technologies, Llc | Visualization of reconstructed handstate information |
US11079846B2 (en) | 2013-11-12 | 2021-08-03 | Facebook Technologies, Llc | Systems, articles, and methods for capacitive electromyography sensors |
US11099716B2 (en) | 2016-12-23 | 2021-08-24 | Realwear, Inc. | Context based content navigation for wearable display |
EP3847531A4 (en) * | 2019-01-31 | 2021-11-03 | Huawei Technologies Co., Ltd. | HAND-ENTRY DETECTION ON THE FACE FOR INTERACTION WITH A DEVICE HAVING AN INTEGRATED CAMERA |
US11179066B2 (en) | 2018-08-13 | 2021-11-23 | Facebook Technologies, Llc | Real-time spike detection and identification |
US11216069B2 (en) | 2018-05-08 | 2022-01-04 | Facebook Technologies, Llc | Systems and methods for improved speech recognition using neuromuscular information |
CN114089879A (zh) * | 2021-11-15 | 2022-02-25 | 北京灵犀微光科技有限公司 | 一种增强现实显示设备的光标控制方法 |
US20220103748A1 (en) * | 2020-09-28 | 2022-03-31 | Ilteris Canberk | Touchless photo capture in response to detected hand gestures |
US20220113814A1 (en) | 2019-09-30 | 2022-04-14 | Yu Jiang Tham | Smart ring for manipulating virtual objects displayed by a wearable device |
US11311209B1 (en) | 2013-01-19 | 2022-04-26 | Bertec Corporation | Force measurement system and a motion base used therein |
US11321880B2 (en) | 2017-12-22 | 2022-05-03 | Sony Corporation | Information processor, information processing method, and program for specifying an important region of an operation target in a moving image |
US11337652B2 (en) | 2016-07-25 | 2022-05-24 | Facebook Technologies, Llc | System and method for measuring the movements of articulated rigid bodies |
US11354896B2 (en) | 2019-07-31 | 2022-06-07 | Seiko Epson Corporation | Display device, display method, and computer program |
US11409497B2 (en) | 2016-12-23 | 2022-08-09 | Realwear, Inc. | Hands-free navigation of touch-based operating systems |
US11435857B1 (en) * | 2021-04-29 | 2022-09-06 | Google Llc | Content access and navigation using a head-mounted device |
US11481031B1 (en) | 2019-04-30 | 2022-10-25 | Meta Platforms Technologies, Llc | Devices, systems, and methods for controlling computing devices via neuromuscular signals of users |
US11481030B2 (en) | 2019-03-29 | 2022-10-25 | Meta Platforms Technologies, Llc | Methods and apparatus for gesture detection and classification |
US20220350997A1 (en) * | 2021-04-29 | 2022-11-03 | Google Llc | Pointer-based content recognition using a head-mounted device |
US11493993B2 (en) | 2019-09-04 | 2022-11-08 | Meta Platforms Technologies, Llc | Systems, methods, and interfaces for performing inputs based on neuromuscular control |
US11507216B2 (en) * | 2016-12-23 | 2022-11-22 | Realwear, Inc. | Customizing user interfaces of binary applications |
US11520399B2 (en) | 2020-05-26 | 2022-12-06 | Snap Inc. | Interactive augmented reality experiences using positional tracking |
US11531402B1 (en) | 2021-02-25 | 2022-12-20 | Snap Inc. | Bimanual gestures for controlling virtual and graphical elements |
US11540744B1 (en) | 2013-01-19 | 2023-01-03 | Bertec Corporation | Force measurement system |
US20230068391A1 (en) * | 2020-02-28 | 2023-03-02 | Nec Corporation | Locker system, locker management system, and locker management method |
US11635736B2 (en) | 2017-10-19 | 2023-04-25 | Meta Platforms Technologies, Llc | Systems and methods for identifying biological structures associated with neuromuscular source signals |
US20230131474A1 (en) * | 2019-10-16 | 2023-04-27 | Samuel R. Pecota | Augmented reality marine navigation |
US11644799B2 (en) | 2013-10-04 | 2023-05-09 | Meta Platforms Technologies, Llc | Systems, articles and methods for wearable electronic devices employing contact sensors |
US11644902B2 (en) * | 2020-11-30 | 2023-05-09 | Google Llc | Gesture-based content transfer |
US11656686B2 (en) * | 2014-09-09 | 2023-05-23 | Ultrahaptics Ip Ltd | Method and apparatus for modulating haptic feedback |
US11666264B1 (en) | 2013-11-27 | 2023-06-06 | Meta Platforms Technologies, Llc | Systems, articles, and methods for electromyography sensors |
US20230221799A1 (en) * | 2022-01-10 | 2023-07-13 | Apple Inc. | Devices and methods for controlling electronic devices or systems with physical objects |
US11740313B2 (en) | 2020-12-30 | 2023-08-29 | Snap Inc. | Augmented reality precision tracking and display |
US20230333378A1 (en) * | 2017-08-25 | 2023-10-19 | Snap Inc. | Wristwatch based interface for augmented reality eyewear |
US11797087B2 (en) | 2018-11-27 | 2023-10-24 | Meta Platforms Technologies, Llc | Methods and apparatus for autocalibration of a wearable electrode sensor system |
US11798429B1 (en) | 2020-05-04 | 2023-10-24 | Snap Inc. | Virtual tutorials for musical instruments with finger tracking in augmented reality |
US11804041B2 (en) | 2018-12-04 | 2023-10-31 | Apple Inc. | Method, device, and system for generating affordances linked to a representation of an item |
US11816267B2 (en) | 2020-06-23 | 2023-11-14 | Ultraleap Limited | Features of airborne ultrasonic fields |
US11830351B2 (en) | 2015-02-20 | 2023-11-28 | Ultrahaptics Ip Ltd | Algorithm improvements in a haptic system |
US11842517B2 (en) | 2019-04-12 | 2023-12-12 | Ultrahaptics Ip Ltd | Using iterative 3D-model fitting for domain adaptation of a hand-pose-estimation neural network |
USD1009884S1 (en) * | 2019-07-26 | 2024-01-02 | Sony Corporation | Mixed reality eyeglasses or portion thereof with an animated graphical user interface |
US11861070B2 (en) | 2021-04-19 | 2024-01-02 | Snap Inc. | Hand gestures for animating and controlling virtual and graphical elements |
US11857331B1 (en) | 2013-01-19 | 2024-01-02 | Bertec Corporation | Force measurement system |
US11868531B1 (en) | 2021-04-08 | 2024-01-09 | Meta Platforms Technologies, Llc | Wearable device providing for thumb-to-finger-based input gestures detected based on neuromuscular signals, and systems and methods of use thereof |
US11886639B2 (en) | 2020-09-17 | 2024-01-30 | Ultraleap Limited | Ultrahapticons |
US11883847B2 (en) | 2018-05-02 | 2024-01-30 | Ultraleap Limited | Blocking plate structure for improved acoustic transmission efficiency |
US11921471B2 (en) | 2013-08-16 | 2024-03-05 | Meta Platforms Technologies, Llc | Systems, articles, and methods for wearable devices having secondary power sources in links of a band for providing secondary power in addition to a primary power source |
US11921928B2 (en) | 2017-11-26 | 2024-03-05 | Ultrahaptics Ip Ltd | Haptic effects from focused acoustic fields |
US11925863B2 (en) | 2020-09-18 | 2024-03-12 | Snap Inc. | Tracking hand gestures for interactive game control in augmented reality |
US11961494B1 (en) | 2019-03-29 | 2024-04-16 | Meta Platforms Technologies, Llc | Electromagnetic interference reduction in extended reality environments |
US12001610B2 (en) | 2016-08-03 | 2024-06-04 | Ultrahaptics Ip Ltd | Three-dimensional perceptions in haptic systems |
US12002448B2 (en) | 2019-12-25 | 2024-06-04 | Ultraleap Limited | Acoustic transducer structures |
US12072406B2 (en) | 2020-12-30 | 2024-08-27 | Snap Inc. | Augmented reality precision tracking and display |
US12086324B2 (en) | 2020-12-29 | 2024-09-10 | Snap Inc. | Micro hand gestures for controlling virtual and graphical elements |
US12089953B1 (en) | 2019-12-04 | 2024-09-17 | Meta Platforms Technologies, Llc | Systems and methods for utilizing intrinsic current noise to measure interface impedances |
US12100288B2 (en) | 2023-07-27 | 2024-09-24 | Ultrahaptics Ip Ltd | Calibration techniques in haptic systems |
Families Citing this family (115)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US9865125B2 (en) | 2010-11-15 | 2018-01-09 | Bally Gaming, Inc. | System and method for augmented reality gaming |
JP2014531662A (ja) | 2011-09-19 | 2014-11-27 | アイサイト モバイル テクノロジーズ リミテッド | 拡張現実システムのためのタッチフリーインターフェース |
US11068049B2 (en) | 2012-03-23 | 2021-07-20 | Microsoft Technology Licensing, Llc | Light guide display and field of view |
US9558590B2 (en) | 2012-03-28 | 2017-01-31 | Microsoft Technology Licensing, Llc | Augmented reality light guide display |
US10191515B2 (en) | 2012-03-28 | 2019-01-29 | Microsoft Technology Licensing, Llc | Mobile device light guide display |
US9717981B2 (en) | 2012-04-05 | 2017-08-01 | Microsoft Technology Licensing, Llc | Augmented reality and physical games |
US10502876B2 (en) | 2012-05-22 | 2019-12-10 | Microsoft Technology Licensing, Llc | Waveguide optics focus elements |
US10192358B2 (en) | 2012-12-20 | 2019-01-29 | Microsoft Technology Licensing, Llc | Auto-stereoscopic augmented reality display |
US20140094148A1 (en) | 2013-05-08 | 2014-04-03 | Vringo Infrastructure Inc. | Cognitive Radio System And Cognitive Radio Carrier Device |
US9672627B1 (en) * | 2013-05-09 | 2017-06-06 | Amazon Technologies, Inc. | Multiple camera based motion tracking |
KR102157313B1 (ko) | 2013-09-03 | 2020-10-23 | 삼성전자주식회사 | 촬영된 영상을 이용하여 객체를 인식하는 방법 및 컴퓨터 판독 가능한 기록 매체 |
KR102165818B1 (ko) * | 2013-09-10 | 2020-10-14 | 삼성전자주식회사 | 입력 영상을 이용한 사용자 인터페이스 제어 방법, 장치 및 기록매체 |
JP5877824B2 (ja) * | 2013-09-20 | 2016-03-08 | ヤフー株式会社 | 情報処理システム、情報処理方法および情報処理プログラム |
KR102119659B1 (ko) | 2013-09-23 | 2020-06-08 | 엘지전자 주식회사 | 영상표시장치 및 그것의 제어 방법 |
CN103501473B (zh) * | 2013-09-30 | 2016-03-09 | 陈创举 | 基于mems传感器的多功能耳机及其控制方法 |
KR101499044B1 (ko) * | 2013-10-07 | 2015-03-11 | 홍익대학교 산학협력단 | 사용자의 손동작 및 음성에 기초하여 사용자가 의도한 텍스트를 취득하는 웨어러블 컴퓨터 및 사용자가 의도한 텍스트를 취득하는 방법 |
US9671826B2 (en) * | 2013-11-27 | 2017-06-06 | Immersion Corporation | Method and apparatus of body-mediated digital content transfer and haptic feedback |
US10586395B2 (en) | 2013-12-30 | 2020-03-10 | Daqri, Llc | Remote object detection and local tracking using visual odometry |
US9264479B2 (en) * | 2013-12-30 | 2016-02-16 | Daqri, Llc | Offloading augmented reality processing |
EP2899609B1 (en) * | 2014-01-24 | 2019-04-17 | Sony Corporation | System and method for name recollection |
DE102014201578A1 (de) * | 2014-01-29 | 2015-07-30 | Volkswagen Ag | Vorrichtung und Verfahren zur Signalisierung eines Eingabebereiches zur Gestenerkennung einer Mensch-Maschine-Schnittstelle |
US20150227231A1 (en) * | 2014-02-12 | 2015-08-13 | Microsoft Corporation | Virtual Transparent Display |
WO2015161062A1 (en) * | 2014-04-18 | 2015-10-22 | Bally Gaming, Inc. | System and method for augmented reality gaming |
US9501871B2 (en) | 2014-04-30 | 2016-11-22 | At&T Mobility Ii Llc | Explorable augmented reality displays |
TWI518603B (zh) | 2014-05-22 | 2016-01-21 | 宏達國際電子股份有限公司 | 影像編輯方法與電子裝置 |
CN106575151A (zh) * | 2014-06-17 | 2017-04-19 | 奥斯特豪特集团有限公司 | 用于头戴式计算的外部用户接口 |
JP6500355B2 (ja) * | 2014-06-20 | 2019-04-17 | 富士通株式会社 | 表示装置、表示プログラム、および表示方法 |
US20150379770A1 (en) * | 2014-06-27 | 2015-12-31 | David C. Haley, JR. | Digital action in response to object interaction |
US9959591B2 (en) * | 2014-07-31 | 2018-05-01 | Seiko Epson Corporation | Display apparatus, method for controlling display apparatus, and program |
CN104156082A (zh) * | 2014-08-06 | 2014-11-19 | 北京行云时空科技有限公司 | 面向时空场景的用户界面及应用的控制系统及智能终端 |
CN104133593A (zh) * | 2014-08-06 | 2014-11-05 | 北京行云时空科技有限公司 | 一种基于体感的文字输入系统及方法 |
CN104197950B (zh) * | 2014-08-19 | 2018-02-16 | 奇瑞汽车股份有限公司 | 地理信息显示的方法及系统 |
JP5989725B2 (ja) * | 2014-08-29 | 2016-09-07 | 京セラドキュメントソリューションズ株式会社 | 電子機器及び情報表示プログラム |
DE102014217843A1 (de) * | 2014-09-05 | 2016-03-10 | Martin Cudzilo | Vorrichtung zum Vereinfachen der Reinigung von Oberflächen und Verfahren zum Erfassen geleisteter Reinigungsarbeiten |
US20160109701A1 (en) * | 2014-10-15 | 2016-04-21 | GM Global Technology Operations LLC | Systems and methods for adjusting features within a head-up display |
TWI613615B (zh) * | 2014-10-15 | 2018-02-01 | 在地實驗文化事業有限公司 | 導覽系統及方法 |
US10108256B2 (en) * | 2014-10-30 | 2018-10-23 | Mediatek Inc. | Systems and methods for processing incoming events while performing a virtual reality session |
CN107077214A (zh) * | 2014-11-06 | 2017-08-18 | 皇家飞利浦有限公司 | 用于在医院中使用的通信的方法和系统 |
US9600076B2 (en) * | 2014-12-19 | 2017-03-21 | Immersion Corporation | Systems and methods for object manipulation with haptic feedback |
CN104537401B (zh) * | 2014-12-19 | 2017-05-17 | 南京大学 | 基于射频识别和景深感知技术的现实增强系统及工作方法 |
US20160196693A1 (en) * | 2015-01-06 | 2016-07-07 | Seiko Epson Corporation | Display system, control method for display device, and computer program |
TWI619041B (zh) * | 2015-01-09 | 2018-03-21 | Chunghwa Telecom Co Ltd | Augmented reality unlocking system and method |
CN104570354A (zh) * | 2015-01-09 | 2015-04-29 | 京东方科技集团股份有限公司 | 交互式眼镜和导览系统 |
US10317215B2 (en) | 2015-01-09 | 2019-06-11 | Boe Technology Group Co., Ltd. | Interactive glasses and navigation system |
JP2016133541A (ja) * | 2015-01-16 | 2016-07-25 | 株式会社ブリリアントサービス | 電子眼鏡および電子眼鏡の制御方法 |
US10018844B2 (en) | 2015-02-09 | 2018-07-10 | Microsoft Technology Licensing, Llc | Wearable image display system |
US10317677B2 (en) | 2015-02-09 | 2019-06-11 | Microsoft Technology Licensing, Llc | Display system |
WO2016139850A1 (ja) * | 2015-03-05 | 2016-09-09 | ソニー株式会社 | 情報処理装置、制御方法、およびプログラム |
JP6596883B2 (ja) | 2015-03-31 | 2019-10-30 | ソニー株式会社 | ヘッドマウントディスプレイ及びヘッドマウントディスプレイの制御方法、並びにコンピューター・プログラム |
US20160292920A1 (en) * | 2015-04-01 | 2016-10-06 | Caterpillar Inc. | Time-Shift Controlled Visualization of Worksite Operations |
US10156908B2 (en) * | 2015-04-15 | 2018-12-18 | Sony Interactive Entertainment Inc. | Pinch and hold gesture navigation on a head-mounted display |
CN105138763A (zh) * | 2015-08-19 | 2015-12-09 | 中山大学 | 一种增强现实中实景与现实信息叠加的方法 |
US10768187B2 (en) * | 2015-08-25 | 2020-09-08 | Hitachi High-Tech Corporation | Automatic analysis device and specimen inspection automation system |
CN105205454A (zh) * | 2015-08-27 | 2015-12-30 | 深圳市国华识别科技开发有限公司 | 自动捕捉目标物的系统和方法 |
KR102456597B1 (ko) * | 2015-09-01 | 2022-10-20 | 삼성전자주식회사 | 전자 장치 및 그의 동작 방법 |
KR101708455B1 (ko) * | 2015-09-08 | 2017-02-21 | 엠더블유엔테크 주식회사 | 핸드 플로트 입체 메뉴 시스템 |
CN105183173B (zh) * | 2015-10-12 | 2018-08-28 | 重庆中电大宇卫星应用技术研究所 | 一种将战术和摩尔斯码手势输入并转换为语音的装置 |
DE102015221860A1 (de) * | 2015-11-06 | 2017-05-11 | BSH Hausgeräte GmbH | System und Verfahren zur Erleichterung einer Bedienung eines Haushaltsgeräts |
CN105872815A (zh) * | 2015-11-25 | 2016-08-17 | 乐视网信息技术(北京)股份有限公司 | 视频播放的方法及装置 |
JP2017129406A (ja) * | 2016-01-19 | 2017-07-27 | 日本電気通信システム株式会社 | 情報処理装置、スマートグラスおよびその制御方法、並びにコンピュータ・プログラム |
SE541141C2 (en) * | 2016-04-18 | 2019-04-16 | Moonlightning Ind Ab | Focus pulling with a stereo vision camera system |
CN105915715A (zh) * | 2016-05-25 | 2016-08-31 | 努比亚技术有限公司 | 来电提醒方法及其装置、佩戴式音频设备和移动终端 |
WO2017217752A1 (ko) * | 2016-06-17 | 2017-12-21 | 이철윤 | 상품과 포장상자의 3차원 합성영상 생성 시스템 및 방법 |
CN106155315A (zh) * | 2016-06-28 | 2016-11-23 | 广东欧珀移动通信有限公司 | 一种拍摄中增强现实效果的添加方法、装置及移动终端 |
CN106125932A (zh) * | 2016-06-28 | 2016-11-16 | 广东欧珀移动通信有限公司 | 一种增强现实中目标对象的识别方法、装置及移动终端 |
CN106157363A (zh) * | 2016-06-28 | 2016-11-23 | 广东欧珀移动通信有限公司 | 一种基于增强现实的拍照方法、装置和移动终端 |
CN106066701B (zh) * | 2016-07-05 | 2019-07-26 | 上海智旭商务咨询有限公司 | 一种ar和vr数据处理设备与方法 |
KR20180009170A (ko) * | 2016-07-18 | 2018-01-26 | 엘지전자 주식회사 | 이동 단말기 및 그의 동작 방법 |
CN106354257A (zh) * | 2016-08-30 | 2017-01-25 | 湖北睛彩视讯科技有限公司 | 一种基于增强现实技术的移动场景融合系统及方法 |
CN106980362A (zh) * | 2016-10-09 | 2017-07-25 | 阿里巴巴集团控股有限公司 | 基于虚拟现实场景的输入方法及装置 |
JP2018082363A (ja) * | 2016-11-18 | 2018-05-24 | セイコーエプソン株式会社 | 頭部装着型表示装置およびその制御方法、並びにコンピュータープログラム |
US10996814B2 (en) | 2016-11-29 | 2021-05-04 | Real View Imaging Ltd. | Tactile feedback in a display system |
WO2018113740A1 (en) * | 2016-12-21 | 2018-06-28 | Zyetric Technologies Limited | Combining virtual reality and augmented reality |
USD864959S1 (en) | 2017-01-04 | 2019-10-29 | Mentor Acquisition One, Llc | Computer glasses |
CN110622219B (zh) * | 2017-03-10 | 2024-01-19 | 杰创科增强现实有限公司 | 交互式增强现实 |
EP4250066A3 (en) | 2017-03-21 | 2023-11-29 | InterDigital VC Holdings, Inc. | Method and system for the detection and augmentation of tactile interactions in augmented reality |
US10489651B2 (en) * | 2017-04-14 | 2019-11-26 | Microsoft Technology Licensing, Llc | Identifying a position of a marker in an environment |
US11054894B2 (en) | 2017-05-05 | 2021-07-06 | Microsoft Technology Licensing, Llc | Integrated mixed-input system |
US20210117680A1 (en) * | 2017-05-10 | 2021-04-22 | Humane, Inc. | Wearable multimedia device and cloud computing platform with laser projection system |
US11023109B2 (en) | 2017-06-30 | 2021-06-01 | Microsoft Techniogy Licensing, LLC | Annotation using a multi-device mixed interactivity system |
US10895966B2 (en) | 2017-06-30 | 2021-01-19 | Microsoft Technology Licensing, Llc | Selection using a multi-device mixed interactivity system |
CN107340871A (zh) * | 2017-07-25 | 2017-11-10 | 深识全球创新科技(北京)有限公司 | 集成手势识别与超声波触觉反馈的装置及其方法和用途 |
WO2019021447A1 (ja) * | 2017-07-28 | 2019-01-31 | 株式会社オプティム | ウェアラブル端末表示システム、ウェアラブル端末表示方法およびプログラム |
WO2019021446A1 (ja) * | 2017-07-28 | 2019-01-31 | 株式会社オプティム | ウェアラブル端末表示システム、ウェアラブル端末表示方法およびプログラム |
CN107635057A (zh) * | 2017-07-31 | 2018-01-26 | 努比亚技术有限公司 | 一种虚拟现实终端控制方法、终端和计算机可读存储介质 |
US10506217B2 (en) * | 2017-10-09 | 2019-12-10 | Facebook Technologies, Llc | Head-mounted display tracking system |
US20190129607A1 (en) * | 2017-11-02 | 2019-05-02 | Samsung Electronics Co., Ltd. | Method and device for performing remote control |
US10739861B2 (en) * | 2018-01-10 | 2020-08-11 | Facebook Technologies, Llc | Long distance interaction with artificial reality objects using a near eye display interface |
US10706628B2 (en) * | 2018-02-28 | 2020-07-07 | Lenovo (Singapore) Pte. Ltd. | Content transfer |
US20190324549A1 (en) * | 2018-04-20 | 2019-10-24 | Immersion Corporation | Systems, devices, and methods for providing immersive reality interface modes |
JP7056423B2 (ja) * | 2018-07-10 | 2022-04-19 | オムロン株式会社 | 入力装置 |
US11360558B2 (en) * | 2018-07-17 | 2022-06-14 | Apple Inc. | Computer systems with finger devices |
US10890653B2 (en) | 2018-08-22 | 2021-01-12 | Google Llc | Radar-based gesture enhancement for voice interfaces |
US10770035B2 (en) | 2018-08-22 | 2020-09-08 | Google Llc | Smartphone-based radar system for facilitating awareness of user presence and orientation |
US10698603B2 (en) | 2018-08-24 | 2020-06-30 | Google Llc | Smartphone-based radar system facilitating ease and accuracy of user interactions with displayed objects in an augmented-reality interface |
CN109348003A (zh) * | 2018-09-17 | 2019-02-15 | 深圳市泰衡诺科技有限公司 | 应用控制方法及装置 |
CN110942518B (zh) * | 2018-09-24 | 2024-03-29 | 苹果公司 | 上下文计算机生成现实(cgr)数字助理 |
KR102620702B1 (ko) * | 2018-10-12 | 2024-01-04 | 삼성전자주식회사 | 모바일 장치 및 모바일 장치의 제어 방법 |
US10788880B2 (en) | 2018-10-22 | 2020-09-29 | Google Llc | Smartphone-based radar system for determining user intention in a lower-power mode |
CN109782639A (zh) * | 2018-12-29 | 2019-05-21 | 深圳市中孚能电气设备有限公司 | 一种电子设备工作模式的控制方法及控制装置 |
US20220057636A1 (en) * | 2019-01-24 | 2022-02-24 | Maxell, Ltd. | Display terminal, application control system and application control method |
JP6720385B1 (ja) * | 2019-02-07 | 2020-07-08 | 株式会社メルカリ | プログラム、情報処理方法、及び情報処理端末 |
CN110109547A (zh) * | 2019-05-05 | 2019-08-09 | 芋头科技(杭州)有限公司 | 基于手势识别的命令激活方法和系统 |
JP7331462B2 (ja) * | 2019-05-24 | 2023-08-23 | 京セラドキュメントソリューションズ株式会社 | ロボットシステム、ロボット制御方法及び電子装置 |
US10747371B1 (en) * | 2019-06-28 | 2020-08-18 | Konica Minolta Business Solutions U.S.A., Inc. | Detection of finger press from live video stream |
CN113012214A (zh) * | 2019-12-20 | 2021-06-22 | 北京外号信息技术有限公司 | 用于设置虚拟对象的空间位置的方法和电子设备 |
JP2022022568A (ja) * | 2020-06-26 | 2022-02-07 | 沖電気工業株式会社 | 表示操作部および装置 |
CN113190110A (zh) * | 2021-03-30 | 2021-07-30 | 青岛小鸟看看科技有限公司 | 头戴式显示设备的界面元素控制方法及装置 |
US20220326781A1 (en) * | 2021-04-08 | 2022-10-13 | Snap Inc. | Bimanual interactions between mapped hand regions for controlling virtual and graphical elements |
CN117121478A (zh) * | 2021-04-09 | 2023-11-24 | 三星电子株式会社 | 包括多个相机的可穿戴电子装置 |
CN113141529B (zh) * | 2021-04-25 | 2022-02-25 | 聚好看科技股份有限公司 | 显示设备及媒资播放方法 |
KR102629771B1 (ko) * | 2021-09-30 | 2024-01-29 | 박두고 | 손 또는 손가락 추적을 이용한 객체 인식용 웨어러블 장치 |
US11967147B2 (en) * | 2021-10-01 | 2024-04-23 | At&T Intellectual Proerty I, L.P. | Augmented reality visualization of enclosed spaces |
US20230315208A1 (en) * | 2022-04-04 | 2023-10-05 | Snap Inc. | Gesture-based application invocation |
CN115309271B (zh) * | 2022-09-29 | 2023-03-21 | 南方科技大学 | 基于混合现实的信息展示方法、装置、设备及存储介质 |
Citations (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20020060648A1 (en) * | 2000-11-17 | 2002-05-23 | Taichi Matsui | Image-display control apparatus |
US20060101349A1 (en) * | 2000-05-29 | 2006-05-11 | Klony Lieberman | Virtual data entry device and method for input of alphanumeric and other data |
US20100177035A1 (en) * | 2008-10-10 | 2010-07-15 | Schowengerdt Brian T | Mobile Computing Device With A Virtual Keyboard |
US20110029185A1 (en) * | 2008-03-19 | 2011-02-03 | Denso Corporation | Vehicular manipulation input apparatus |
US20110221669A1 (en) * | 2010-02-28 | 2011-09-15 | Osterhout Group, Inc. | Gesture control in an augmented reality eyepiece |
US20120032874A1 (en) * | 2010-08-09 | 2012-02-09 | Sony Corporation | Display apparatus assembly |
US20130050069A1 (en) * | 2011-08-23 | 2013-02-28 | Sony Corporation, A Japanese Corporation | Method and system for use in providing three dimensional user interface |
Family Cites Families (52)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH0981309A (ja) * | 1995-09-13 | 1997-03-28 | Toshiba Corp | 入力装置 |
JP3365246B2 (ja) | 1997-03-14 | 2003-01-08 | ミノルタ株式会社 | 電子スチルカメラ |
JP3225882B2 (ja) * | 1997-03-27 | 2001-11-05 | 日本電信電話株式会社 | 景観ラベリングシステム |
DE19917660A1 (de) * | 1999-04-19 | 2000-11-02 | Deutsch Zentr Luft & Raumfahrt | Verfahren und Eingabeeinrichtung zum Steuern der Lage eines in einer virtuellen Realität graphisch darzustellenden Objekts |
AU7651100A (en) | 1999-09-15 | 2001-04-17 | Roche Consumer Health Ag | Pharmaceutical and/or cosmetical compositions |
US6771294B1 (en) * | 1999-12-29 | 2004-08-03 | Petri Pulli | User interface |
SE0000850D0 (sv) * | 2000-03-13 | 2000-03-13 | Pink Solution Ab | Recognition arrangement |
US7215322B2 (en) | 2001-05-31 | 2007-05-08 | Siemens Corporate Research, Inc. | Input devices for augmented reality applications |
US7126558B1 (en) | 2001-10-19 | 2006-10-24 | Accenture Global Services Gmbh | Industrial augmented reality |
AU2003217587A1 (en) * | 2002-02-15 | 2003-09-09 | Canesta, Inc. | Gesture recognition system using depth perceptive sensors |
US6943754B2 (en) * | 2002-09-27 | 2005-09-13 | The Boeing Company | Gaze tracking system, eye-tracking assembly and an associated method of calibration |
US7676079B2 (en) * | 2003-09-30 | 2010-03-09 | Canon Kabushiki Kaisha | Index identification method and apparatus |
IL161002A0 (en) | 2004-03-22 | 2004-08-31 | Itay Katz | Virtual video keyboard system |
CN101375598A (zh) * | 2004-06-01 | 2009-02-25 | L-3通信公司 | 视频闪光/视觉警报 |
US20060200662A1 (en) * | 2005-02-01 | 2006-09-07 | Microsoft Corporation | Referencing objects in a virtual environment |
WO2007011306A2 (en) * | 2005-07-20 | 2007-01-25 | Bracco Imaging S.P.A. | A method of and apparatus for mapping a virtual model of an object to the object |
KR100687737B1 (ko) * | 2005-03-19 | 2007-02-27 | 한국전자통신연구원 | 양손 제스쳐에 기반한 가상 마우스 장치 및 방법 |
JP2009508274A (ja) * | 2005-09-13 | 2009-02-26 | スペースタイムスリーディー・インコーポレーテッド | 3次元グラフィカル・ユーザ・インターフェースを提供するシステム及び方法 |
JP4851771B2 (ja) * | 2005-10-24 | 2012-01-11 | 京セラ株式会社 | 情報処理システム、および携帯情報端末 |
US7509588B2 (en) * | 2005-12-30 | 2009-03-24 | Apple Inc. | Portable electronic device with interface reconfiguration mode |
US7725547B2 (en) * | 2006-09-06 | 2010-05-25 | International Business Machines Corporation | Informing a user of gestures made by others out of the user's line of sight |
JP4961914B2 (ja) * | 2006-09-08 | 2012-06-27 | ソニー株式会社 | 撮像表示装置、撮像表示方法 |
JP5228307B2 (ja) * | 2006-10-16 | 2013-07-03 | ソニー株式会社 | 表示装置、表示方法 |
JP2010512693A (ja) * | 2006-12-07 | 2010-04-22 | アダックス,インク. | データの付加、記録および通信のためのシステムと方法 |
KR101285360B1 (ko) * | 2007-01-25 | 2013-07-11 | 삼성전자주식회사 | 증강현실을 이용한 관심 지점 표시 장치 및 방법 |
US8942764B2 (en) * | 2007-10-01 | 2015-01-27 | Apple Inc. | Personal media device controlled via user initiated movements utilizing movement based interfaces |
JP4933406B2 (ja) * | 2007-11-15 | 2012-05-16 | キヤノン株式会社 | 画像処理装置、画像処理方法 |
US8165345B2 (en) * | 2007-12-07 | 2012-04-24 | Tom Chau | Method, system, and computer program for detecting and characterizing motion |
JP5250834B2 (ja) * | 2008-04-03 | 2013-07-31 | コニカミノルタ株式会社 | 頭部装着式映像表示装置 |
WO2009128064A2 (en) * | 2008-04-14 | 2009-10-22 | Pointgrab Ltd. | Vision based pointing device emulation |
US8971565B2 (en) | 2008-05-29 | 2015-03-03 | Hie-D Technologies, Llc | Human interface electronic device |
US8397181B2 (en) * | 2008-11-17 | 2013-03-12 | Honeywell International Inc. | Method and apparatus for marking a position of a real world object in a see-through display |
US9041660B2 (en) * | 2008-12-09 | 2015-05-26 | Microsoft Technology Licensing, Llc | Soft keyboard control |
CN102356398B (zh) | 2009-02-02 | 2016-11-23 | 视力移动技术有限公司 | 用于视频流中的对象识别和跟踪的系统和方法 |
US8880865B2 (en) * | 2009-02-20 | 2014-11-04 | Koninklijke Philips N.V. | System, method and apparatus for causing a device to enter an active mode |
US8009022B2 (en) | 2009-05-29 | 2011-08-30 | Microsoft Corporation | Systems and methods for immersive interaction with virtual objects |
KR101622196B1 (ko) * | 2009-09-07 | 2016-05-18 | 삼성전자주식회사 | 휴대용 단말기에서 피오아이 정보 제공 방법 및 장치 |
US20110107216A1 (en) * | 2009-11-03 | 2011-05-05 | Qualcomm Incorporated | Gesture-based user interface |
JP4679661B1 (ja) * | 2009-12-15 | 2011-04-27 | 株式会社東芝 | 情報提示装置、情報提示方法及びプログラム |
KR20110075250A (ko) | 2009-12-28 | 2011-07-06 | 엘지전자 주식회사 | 객체 추적 모드를 활용한 객체 추적 방법 및 장치 |
US20110214082A1 (en) * | 2010-02-28 | 2011-09-01 | Osterhout Group, Inc. | Projection triggering through an external marker in an augmented reality eyepiece |
US9128281B2 (en) * | 2010-09-14 | 2015-09-08 | Microsoft Technology Licensing, Llc | Eyepiece with uniformly illuminated reflective display |
US20110213664A1 (en) * | 2010-02-28 | 2011-09-01 | Osterhout Group, Inc. | Local advertising content on an interactive head-mounted eyepiece |
US8788197B2 (en) | 2010-04-30 | 2014-07-22 | Ryan Fink | Visual training devices, systems, and methods |
US8593375B2 (en) | 2010-07-23 | 2013-11-26 | Gregory A Maltz | Eye gaze user interface and method |
US20120062602A1 (en) * | 2010-09-13 | 2012-03-15 | Nokia Corporation | Method and apparatus for rendering a content display |
US8941559B2 (en) | 2010-09-21 | 2015-01-27 | Microsoft Corporation | Opacity filter for display device |
US8768006B2 (en) * | 2010-10-19 | 2014-07-01 | Hewlett-Packard Development Company, L.P. | Hand gesture recognition |
CN102147926A (zh) * | 2011-01-17 | 2011-08-10 | 中兴通讯股份有限公司 | 3d图标的处理方法、装置及移动终端 |
US9336240B2 (en) * | 2011-07-15 | 2016-05-10 | Apple Inc. | Geo-tagging digital images |
JP2014531662A (ja) * | 2011-09-19 | 2014-11-27 | アイサイト モバイル テクノロジーズ リミテッド | 拡張現実システムのためのタッチフリーインターフェース |
CN108469899B (zh) | 2012-03-13 | 2021-08-10 | 视力移动技术有限公司 | 识别可穿戴显示装置的观察空间中的瞄准点或区域的方法 |
- 2012
- 2012-09-19 JP JP2014531374A patent/JP2014531662A/ja active Pending
- 2012-09-19 US US14/345,592 patent/US20140361988A1/en not_active Abandoned
- 2012-09-19 CN CN201280048836.8A patent/CN103858073B/zh active Active
- 2012-09-19 KR KR1020227001961A patent/KR20220032059A/ko not_active Application Discontinuation
- 2012-09-19 KR KR1020147009451A patent/KR20140069124A/ko active Application Filing
- 2012-09-19 WO PCT/IL2012/050376 patent/WO2013093906A1/en active Application Filing
- 2012-09-19 CN CN202210808606.2A patent/CN115167675A/zh active Pending
- 2012-09-19 KR KR1020197034815A patent/KR20190133080A/ko not_active Application Discontinuation
- 2016
- 2016-03-03 US US15/060,533 patent/US20160259423A1/en not_active Abandoned
- 2016-04-04 US US15/090,527 patent/US20160291699A1/en not_active Abandoned
- 2016-04-12 US US15/096,674 patent/US20160306433A1/en not_active Abandoned
- 2016-05-02 US US15/144,209 patent/US10401967B2/en not_active Expired - Fee Related
- 2016-09-02 US US15/256,481 patent/US20170052599A1/en not_active Abandoned
- 2017
- 2017-10-02 JP JP2017192930A patent/JP2018028922A/ja active Pending
- 2019
- 2019-08-30 US US16/557,183 patent/US11093045B2/en active Active
- 2020
- 2020-09-18 JP JP2020157123A patent/JP7297216B2/ja active Active
- 2021
- 2021-08-13 US US17/401,427 patent/US11494000B2/en active Active
Patent Citations (8)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20060101349A1 (en) * | 2000-05-29 | 2006-05-11 | Klony Lieberman | Virtual data entry device and method for input of alphanumeric and other data |
US20020060648A1 (en) * | 2000-11-17 | 2002-05-23 | Taichi Matsui | Image-display control apparatus |
US6822643B2 (en) * | 2000-11-17 | 2004-11-23 | Canon Kabushiki Kaisha | Image-display control apparatus |
US20110029185A1 (en) * | 2008-03-19 | 2011-02-03 | Denso Corporation | Vehicular manipulation input apparatus |
US20100177035A1 (en) * | 2008-10-10 | 2010-07-15 | Schowengerdt Brian T | Mobile Computing Device With A Virtual Keyboard |
US20110221669A1 (en) * | 2010-02-28 | 2011-09-15 | Osterhout Group, Inc. | Gesture control in an augmented reality eyepiece |
US20120032874A1 (en) * | 2010-08-09 | 2012-02-09 | Sony Corporation | Display apparatus assembly |
US20130050069A1 (en) * | 2011-08-23 | 2013-02-28 | Sony Corporation, A Japanese Corporation | Method and system for use in providing three dimensional user interface |
Cited By (201)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20140358002A1 (en) * | 2011-12-23 | 2014-12-04 | Koninklijke Philips N.V. | Method and apparatus for interactive display of three dimensional ultrasound images |
US10966684B2 (en) * | 2011-12-23 | 2021-04-06 | Koninklijke Philips N.V | Method and apparatus for interactive display of three dimensional ultrasound images |
US20140028716A1 (en) * | 2012-07-30 | 2014-01-30 | Mitac International Corp. | Method and electronic device for generating an instruction in an augmented reality environment |
US20140125580A1 (en) * | 2012-11-02 | 2014-05-08 | Samsung Electronics Co., Ltd. | Method and device for providing information regarding an object |
US9836128B2 (en) * | 2012-11-02 | 2017-12-05 | Samsung Electronics Co., Ltd. | Method and device for providing information regarding an object |
US11540744B1 (en) | 2013-01-19 | 2023-01-03 | Bertec Corporation | Force measurement system |
US10010286B1 (en) | 2013-01-19 | 2018-07-03 | Bertec Corporation | Force measurement system |
US9770203B1 (en) | 2013-01-19 | 2017-09-26 | Bertec Corporation | Force measurement system and a method of testing a subject |
US10413230B1 (en) | 2013-01-19 | 2019-09-17 | Bertec Corporation | Force measurement system |
US11857331B1 (en) | 2013-01-19 | 2024-01-02 | Bertec Corporation | Force measurement system |
US9526443B1 (en) * | 2013-01-19 | 2016-12-27 | Bertec Corporation | Force and/or motion measurement system and a method of testing a subject |
US10231662B1 (en) | 2013-01-19 | 2019-03-19 | Bertec Corporation | Force measurement system |
US11311209B1 (en) | 2013-01-19 | 2022-04-26 | Bertec Corporation | Force measurement system and a motion base used therein |
US11052288B1 (en) | 2013-01-19 | 2021-07-06 | Bertec Corporation | Force measurement system |
US10856796B1 (en) | 2013-01-19 | 2020-12-08 | Bertec Corporation | Force measurement system |
US10646153B1 (en) | 2013-01-19 | 2020-05-12 | Bertec Corporation | Force measurement system |
US11262835B2 (en) * | 2013-02-14 | 2022-03-01 | Qualcomm Incorporated | Human-body-gesture-based region and volume selection for HMD |
US20140225918A1 (en) * | 2013-02-14 | 2014-08-14 | Qualcomm Incorporated | Human-body-gesture-based region and volume selection for hmd |
US10133342B2 (en) * | 2013-02-14 | 2018-11-20 | Qualcomm Incorporated | Human-body-gesture-based region and volume selection for HMD |
US9965896B2 (en) * | 2013-02-21 | 2018-05-08 | Fujitsu Limited | Display device and display method |
US20150356789A1 (en) * | 2013-02-21 | 2015-12-10 | Fujitsu Limited | Display device and display method |
US20140240226A1 (en) * | 2013-02-27 | 2014-08-28 | Robert Bosch Gmbh | User Interface Apparatus |
US20140270352A1 (en) * | 2013-03-14 | 2014-09-18 | Honda Motor Co., Ltd. | Three dimensional fingertip tracking |
US9122916B2 (en) * | 2013-03-14 | 2015-09-01 | Honda Motor Co., Ltd. | Three dimensional fingertip tracking |
US20140285520A1 (en) * | 2013-03-22 | 2014-09-25 | Industry-University Cooperation Foundation Hanyang University | Wearable display device using augmented reality |
US9507426B2 (en) * | 2013-03-27 | 2016-11-29 | Google Inc. | Using the Z-axis in user interfaces for head mountable displays |
US9811154B2 (en) | 2013-03-27 | 2017-11-07 | Google Inc. | Methods to pan, zoom, crop, and proportionally move on a head mountable display |
US9213403B1 (en) | 2013-03-27 | 2015-12-15 | Google Inc. | Methods to pan, zoom, crop, and proportionally move on a head mountable display |
US20140306881A1 (en) * | 2013-04-15 | 2014-10-16 | Olympus Corporation | Wearable device, program and display controlling method of wearable device |
US20160103437A1 (en) * | 2013-06-27 | 2016-04-14 | Abb Technology Ltd | Method and data presenting device for assisting a remote user to provide instructions |
US9829873B2 (en) * | 2013-06-27 | 2017-11-28 | Abb Schweiz Ag | Method and data presenting device for assisting a remote user to provide instructions |
US10295338B2 (en) | 2013-07-12 | 2019-05-21 | Magic Leap, Inc. | Method and system for generating map data from an image |
US10866093B2 (en) | 2013-07-12 | 2020-12-15 | Magic Leap, Inc. | Method and system for retrieving data in response to user input |
US10533850B2 (en) | 2013-07-12 | 2020-01-14 | Magic Leap, Inc. | Method and system for inserting recognized object data into a virtual world |
US20150243105A1 (en) * | 2013-07-12 | 2015-08-27 | Magic Leap, Inc. | Method and system for interacting with user interfaces |
US10571263B2 (en) | 2013-07-12 | 2020-02-25 | Magic Leap, Inc. | User and object interaction with an augmented reality scenario |
US11221213B2 (en) | 2013-07-12 | 2022-01-11 | Magic Leap, Inc. | Method and system for generating a retail experience using an augmented reality system |
US11656677B2 (en) | 2013-07-12 | 2023-05-23 | Magic Leap, Inc. | Planar waveguide apparatus with diffraction element(s) and system employing same |
US11060858B2 (en) | 2013-07-12 | 2021-07-13 | Magic Leap, Inc. | Method and system for generating a virtual user interface related to a totem |
US11029147B2 (en) | 2013-07-12 | 2021-06-08 | Magic Leap, Inc. | Method and system for facilitating surgery using an augmented reality system |
US10473459B2 (en) | 2013-07-12 | 2019-11-12 | Magic Leap, Inc. | Method and system for determining user input based on totem |
US10591286B2 (en) | 2013-07-12 | 2020-03-17 | Magic Leap, Inc. | Method and system for generating virtual rooms |
US10352693B2 (en) | 2013-07-12 | 2019-07-16 | Magic Leap, Inc. | Method and system for obtaining texture data of a space |
US10408613B2 (en) | 2013-07-12 | 2019-09-10 | Magic Leap, Inc. | Method and system for rendering virtual content |
US10495453B2 (en) | 2013-07-12 | 2019-12-03 | Magic Leap, Inc. | Augmented reality system totems and methods of using same |
US10767986B2 (en) * | 2013-07-12 | 2020-09-08 | Magic Leap, Inc. | Method and system for interacting with user interfaces |
US10641603B2 (en) | 2013-07-12 | 2020-05-05 | Magic Leap, Inc. | Method and system for updating a virtual world |
US11921471B2 (en) | 2013-08-16 | 2024-03-05 | Meta Platforms Technologies, Llc | Systems, articles, and methods for wearable devices having secondary power sources in links of a band for providing secondary power in addition to a primary power source |
US11644799B2 (en) | 2013-10-04 | 2023-05-09 | Meta Platforms Technologies, Llc | Systems, articles and methods for wearable electronic devices employing contact sensors |
US11079846B2 (en) | 2013-11-12 | 2021-08-03 | Facebook Technologies, Llc | Systems, articles, and methods for capacitive electromyography sensors |
US20150146007A1 (en) * | 2013-11-26 | 2015-05-28 | Honeywell International Inc. | Maintenance assistant system |
US9740935B2 (en) * | 2013-11-26 | 2017-08-22 | Honeywell International Inc. | Maintenance assistant system |
US11666264B1 (en) | 2013-11-27 | 2023-06-06 | Meta Platforms Technologies, Llc | Systems, articles, and methods for electromyography sensors |
US9560272B2 (en) * | 2014-03-24 | 2017-01-31 | Samsung Electronics Co., Ltd. | Electronic device and method for image data processing |
US20150271396A1 (en) * | 2014-03-24 | 2015-09-24 | Samsung Electronics Co., Ltd. | Electronic device and method for image data processing |
US10600245B1 (en) * | 2014-05-28 | 2020-03-24 | Lucasfilm Entertainment Company Ltd. | Navigating a virtual environment of a media content item |
US11508125B1 (en) | 2014-05-28 | 2022-11-22 | Lucasfilm Entertainment Company Ltd. | Navigating a virtual environment of a media content item |
US10602200B2 (en) | 2014-05-28 | 2020-03-24 | Lucasfilm Entertainment Company Ltd. | Switching modes of a media content item |
US10484673B2 (en) * | 2014-06-05 | 2019-11-19 | Samsung Electronics Co., Ltd. | Wearable device and method for providing augmented reality information |
US20150358614A1 (en) * | 2014-06-05 | 2015-12-10 | Samsung Electronics Co., Ltd. | Wearable device and method for providing augmented reality information |
US9886848B2 (en) * | 2014-06-12 | 2018-02-06 | Lg Electronics Inc. | Mobile terminal and control system |
US20150364037A1 (en) * | 2014-06-12 | 2015-12-17 | Lg Electronics Inc. | Mobile terminal and control system |
US20160048024A1 (en) * | 2014-08-13 | 2016-02-18 | Beijing Lenovo Software Ltd. | Information processing method and electronic device |
US9696551B2 (en) * | 2014-08-13 | 2017-07-04 | Beijing Lenovo Software Ltd. | Information processing method and electronic device |
US10606348B2 (en) * | 2014-08-18 | 2020-03-31 | Universal City Studios Llc | Systems and methods for generating augmented and virtual reality images |
US10241568B2 (en) * | 2014-08-18 | 2019-03-26 | Universal City Studios Llc | Systems and methods for generating augmented and virtual reality images |
US9690375B2 (en) * | 2014-08-18 | 2017-06-27 | Universal City Studios Llc | Systems and methods for generating augmented and virtual reality images |
US20160048203A1 (en) * | 2014-08-18 | 2016-02-18 | Universal City Studios Llc | Systems and methods for generating augmented and virtual reality images |
US11586277B2 (en) | 2014-08-18 | 2023-02-21 | Universal City Studios Llc | Systems and methods for generating augmented and virtual reality images |
US20190196581A1 (en) * | 2014-08-18 | 2019-06-27 | Universal City Studios Llc | Systems and methods for generating augmented and virtual reality images |
US9910504B2 (en) * | 2014-08-21 | 2018-03-06 | Samsung Electronics Co., Ltd. | Sensor based UI in HMD incorporating light turning element |
US20160054802A1 (en) * | 2014-08-21 | 2016-02-25 | Samsung Electronics Co., Ltd. | Sensor based UI in HMD incorporating light turning element |
US11656686B2 (en) * | 2014-09-09 | 2023-05-23 | Ultrahaptics Ip Ltd | Method and apparatus for modulating haptic feedback |
US11768540B2 (en) | 2014-09-09 | 2023-09-26 | Ultrahaptics Ip Ltd | Method and apparatus for modulating haptic feedback |
US10684367B2 (en) | 2014-11-26 | 2020-06-16 | Samsung Electronics Co., Ltd. | Ultrasound sensor and object detecting method thereof |
US10310624B2 (en) * | 2014-12-17 | 2019-06-04 | Konica Minolta, Inc. | Electronic apparatus, method for controlling electronic apparatus, and control program for the same |
US10606359B2 (en) | 2014-12-19 | 2020-03-31 | Immersion Corporation | Systems and methods for haptically-enabled interactions with objects |
US9685005B2 (en) * | 2015-01-02 | 2017-06-20 | Eon Reality, Inc. | Virtual lasers for interacting with augmented reality environments |
US20170061700A1 (en) * | 2015-02-13 | 2017-03-02 | Julian Michael Urbach | Intercommunication between a head mounted display and a real world object |
US11830351B2 (en) | 2015-02-20 | 2023-11-28 | Ultrahaptics Ip Ltd | Algorithm improvements in a haptic system |
US10963701B2 (en) * | 2015-02-23 | 2021-03-30 | Vivint, Inc. | Techniques for identifying and indexing distinguishing features in a video feed |
US20180225520A1 (en) * | 2015-02-23 | 2018-08-09 | Vivint, Inc. | Techniques for identifying and indexing distinguishing features in a video feed |
JP2016161734A (ja) * | 2015-03-02 | 2016-09-05 | Seiko Epson Corporation | Display device, control method of display device, and program |
US10747488B2 (en) * | 2015-04-24 | 2020-08-18 | Panasonic Intellectual Property Corporation Of America | Head-mounted display apparatus worn on user's head |
US10359982B2 (en) * | 2015-04-24 | 2019-07-23 | Panasonic Intellectual Property Corporation Of America | Head-mounted display apparatus worn on user's head |
US9983840B2 (en) * | 2015-04-24 | 2018-05-29 | Panasonic Intellectual Property Corporation Of America | Head-mounted display apparatus worn on user's head |
US20160313956A1 (en) * | 2015-04-24 | 2016-10-27 | Panasonic Intellectual Property Corporation Of America | Head-mounted display apparatus worn on user's head |
US9946504B2 (en) * | 2015-04-24 | 2018-04-17 | Panasonic Intellectual Property Corporation Of America | Head-mounted display apparatus worn on user's head |
US20190317717A1 (en) * | 2015-04-24 | 2019-10-17 | Panasonic Intellectual Property Corporation Of America | Head-mounted display apparatus worn on user's head |
US20180039466A1 (en) * | 2015-04-24 | 2018-02-08 | Panasonic Intellectual Property Corporation Of America | Head-mounted display apparatus worn on user's head |
US20180239573A1 (en) * | 2015-04-24 | 2018-08-23 | Panasonic Intellectual Property Corporation Of America | Head-mounted display apparatus worn on user's head |
US10055888B2 (en) | 2015-04-28 | 2018-08-21 | Microsoft Technology Licensing, Llc | Producing and consuming metadata within multi-dimensional data |
WO2016206874A1 (de) * | 2015-06-23 | 2016-12-29 | Siemens Aktiengesellschaft | Interaction system |
US20160378294A1 (en) * | 2015-06-24 | 2016-12-29 | Shawn Crispin Wright | Contextual cursor display based on hand tracking |
US10409443B2 (en) * | 2015-06-24 | 2019-09-10 | Microsoft Technology Licensing, Llc | Contextual cursor display based on hand tracking |
US10156726B2 (en) * | 2015-06-29 | 2018-12-18 | Microsoft Technology Licensing, Llc | Graphene in optical systems |
CN108369640A (zh) * | 2015-12-17 | 2018-08-03 | Nokia Technologies Oy | Method, apparatus or computer program for controlling image processing of a captured image of a scene to adapt the captured image |
US11587202B2 (en) * | 2015-12-17 | 2023-02-21 | Nokia Technologies Oy | Method, apparatus or computer program for controlling image processing of a captured image of a scene to adapt the captured image |
US9697648B1 (en) | 2015-12-23 | 2017-07-04 | Intel Corporation | Text functions in augmented reality |
WO2017112099A1 (en) * | 2015-12-23 | 2017-06-29 | Intel Corporation | Text functions in augmented reality |
US10082940B2 (en) | 2015-12-23 | 2018-09-25 | Intel Corporation | Text functions in augmented reality |
CN105843390A (zh) * | 2016-02-24 | 2016-08-10 | 上海理湃光晶技术有限公司 | An image scaling method and AR glasses based on the method |
US10168768B1 (en) | 2016-03-02 | 2019-01-01 | Meta Company | Systems and methods to facilitate interactions in an interactive space |
US10133345B2 (en) | 2016-03-22 | 2018-11-20 | Microsoft Technology Licensing, Llc | Virtual-reality navigation |
US9933855B2 (en) | 2016-03-31 | 2018-04-03 | Intel Corporation | Augmented reality in a field of view including a reflection |
WO2017172211A1 (en) * | 2016-03-31 | 2017-10-05 | Intel Corporation | Augmented reality in a field of view including a reflection |
US10438419B2 (en) | 2016-05-13 | 2019-10-08 | Meta View, Inc. | System and method for modifying virtual objects in a virtual environment in response to user interactions |
US10186088B2 (en) | 2016-05-13 | 2019-01-22 | Meta Company | System and method for managing interactive virtual frames for virtual objects in a virtual environment |
US9990779B2 (en) * | 2016-05-13 | 2018-06-05 | Meta Company | System and method for modifying virtual objects in a virtual environment in response to user interactions |
ES2643863A1 (es) * | 2016-05-24 | 2017-11-24 | Sonovisión Ingenieros España, S.A.U. | Method for providing, by means of augmented reality, guidance, inspection and support in the installation or maintenance of processes for complex assemblies, compatible with S1000D, and device making use of the same |
US11337652B2 (en) | 2016-07-25 | 2022-05-24 | Facebook Technologies, Llc | System and method for measuring the movements of articulated rigid bodies |
US10656711B2 (en) | 2016-07-25 | 2020-05-19 | Facebook Technologies, Llc | Methods and apparatus for inferring user intent based on neuromuscular signals |
US10990174B2 (en) | 2016-07-25 | 2021-04-27 | Facebook Technologies, Llc | Methods and apparatus for predicting musculo-skeletal position information using wearable autonomous sensors |
US12001610B2 (en) | 2016-08-03 | 2024-06-04 | Ultrahaptics Ip Ltd | Three-dimensional perceptions in haptic systems |
US11119585B2 (en) | 2016-10-13 | 2021-09-14 | Ford Motor Company | Dual-mode augmented reality interfaces for mobile devices |
CN109804638A (zh) * | 2016-10-13 | 2019-05-24 | Ford Motor Company | Dual-mode augmented reality interfaces for mobile devices |
WO2018071019A1 (en) * | 2016-10-13 | 2018-04-19 | Ford Motor Company | Dual-mode augmented reality interfaces for mobile devices |
US10887486B2 (en) * | 2016-10-26 | 2021-01-05 | Orcam Technologies, Ltd | Wearable device and methods for transmitting information based on physical distance |
US20190379925A1 (en) * | 2016-10-26 | 2019-12-12 | OrCam Technologies, Ltd. | Wearable device and methods for transmitting information based on physical distance |
US10405024B2 (en) * | 2016-10-26 | 2019-09-03 | Orcam Technologies Ltd. | Wearable device and methods for transmitting information based on physical distance |
US11507216B2 (en) * | 2016-12-23 | 2022-11-22 | Realwear, Inc. | Customizing user interfaces of binary applications |
US11099716B2 (en) | 2016-12-23 | 2021-08-24 | Realwear, Inc. | Context based content navigation for wearable display |
US11409497B2 (en) | 2016-12-23 | 2022-08-09 | Realwear, Inc. | Hands-free navigation of touch-based operating systems |
US11947752B2 (en) | 2016-12-23 | 2024-04-02 | Realwear, Inc. | Customizing user interfaces of binary applications |
US20180189474A1 (en) * | 2016-12-30 | 2018-07-05 | Baidu Online Network Technology (Beijing) Co., Ltd. | Method and Electronic Device for Unlocking Electronic Device |
US10620779B2 (en) * | 2017-04-24 | 2020-04-14 | Microsoft Technology Licensing, Llc | Navigating a holographic image |
US10481755B1 (en) * | 2017-04-28 | 2019-11-19 | Meta View, Inc. | Systems and methods to present virtual content in an interactive space |
US20230333378A1 (en) * | 2017-08-25 | 2023-10-19 | Snap Inc. | Wristwatch based interface for augmented reality eyewear |
US10957135B2 (en) | 2017-09-21 | 2021-03-23 | Universal City Studios Llc | Locker management techniques |
US10068403B1 (en) | 2017-09-21 | 2018-09-04 | Universal City Studios Llc | Locker management techniques |
US11635736B2 (en) | 2017-10-19 | 2023-04-25 | Meta Platforms Technologies, Llc | Systems and methods for identifying biological structures associated with neuromuscular source signals |
US10823964B2 (en) * | 2017-11-02 | 2020-11-03 | Olympus Corporation | Work assistance apparatus, work assistance method, and computer-readable, non-transitory recording medium recording work assistance program executed by computer |
US20190129179A1 (en) * | 2017-11-02 | 2019-05-02 | Olympus Corporation | Work assistance apparatus, work assistance method, and computer-readable, non-transitory recording medium recording work assistance program executed by computer |
US11921928B2 (en) | 2017-11-26 | 2024-03-05 | Ultrahaptics Ip Ltd | Haptic effects from focused acoustic fields |
US12008682B2 (en) | 2017-12-22 | 2024-06-11 | Sony Group Corporation | Information processor, information processing method, and program image to determine a region of an operation target in a moving image |
US11321880B2 (en) | 2017-12-22 | 2022-05-03 | Sony Corporation | Information processor, information processing method, and program for specifying an important region of an operation target in a moving image |
US11069148B2 (en) | 2018-01-25 | 2021-07-20 | Facebook Technologies, Llc | Visualization of reconstructed handstate information |
US11883847B2 (en) | 2018-05-02 | 2024-01-30 | Ultraleap Limited | Blocking plate structure for improved acoustic transmission efficiency |
US20190339837A1 (en) * | 2018-05-04 | 2019-11-07 | Oculus Vr, Llc | Copy and Paste in a Virtual Reality Environment |
US10937414B2 (en) | 2018-05-08 | 2021-03-02 | Facebook Technologies, Llc | Systems and methods for text input using neuromuscular information |
US11036302B1 (en) | 2018-05-08 | 2021-06-15 | Facebook Technologies, Llc | Wearable devices and methods for improved speech recognition |
US11216069B2 (en) | 2018-05-08 | 2022-01-04 | Facebook Technologies, Llc | Systems and methods for improved speech recognition using neuromuscular information |
US10768426B2 (en) | 2018-05-21 | 2020-09-08 | Microsoft Technology Licensing, Llc | Head mounted display system receiving three-dimensional push notification |
US11129569B1 (en) | 2018-05-29 | 2021-09-28 | Facebook Technologies, Llc | Shielding techniques for noise reduction in surface electromyography signal measurement and related systems and methods |
US10687759B2 (en) | 2018-05-29 | 2020-06-23 | Facebook Technologies, Llc | Shielding techniques for noise reduction in surface electromyography signal measurement and related systems and methods |
US10970374B2 (en) | 2018-06-14 | 2021-04-06 | Facebook Technologies, Llc | User identification and authentication with neuromuscular signatures |
US11045137B2 (en) | 2018-07-19 | 2021-06-29 | Facebook Technologies, Llc | Methods and apparatus for improved signal robustness for a wearable neuromuscular recording device |
US11179066B2 (en) | 2018-08-13 | 2021-11-23 | Facebook Technologies, Llc | Real-time spike detection and identification |
US10909762B2 (en) | 2018-08-24 | 2021-02-02 | Microsoft Technology Licensing, Llc | Gestures for facilitating interaction with pages in a mixed reality environment |
US10842407B2 (en) | 2018-08-31 | 2020-11-24 | Facebook Technologies, Llc | Camera-guided interpretation of neuromuscular signals |
US10905350B2 (en) | 2018-08-31 | 2021-02-02 | Facebook Technologies, Llc | Camera-guided interpretation of neuromuscular signals |
CN112789577A (zh) * | 2018-09-20 | 2021-05-11 | Facebook Technologies, Llc | Neuromuscular text entry, writing and drawing in augmented reality systems |
US11567573B2 (en) * | 2018-09-20 | 2023-01-31 | Meta Platforms Technologies, Llc | Neuromuscular text entry, writing and drawing in augmented reality systems |
US10921764B2 (en) | 2018-09-26 | 2021-02-16 | Facebook Technologies, Llc | Neuromuscular control of physical objects in an environment |
US10970936B2 (en) | 2018-10-05 | 2021-04-06 | Facebook Technologies, Llc | Use of neuromuscular signals to provide enhanced interactions with physical objects in an augmented reality environment |
US10929099B2 (en) * | 2018-11-02 | 2021-02-23 | Bose Corporation | Spatialized virtual personal assistant |
US11797087B2 (en) | 2018-11-27 | 2023-10-24 | Meta Platforms Technologies, Llc | Methods and apparatus for autocalibration of a wearable electrode sensor system |
US11941176B1 (en) | 2018-11-27 | 2024-03-26 | Meta Platforms Technologies, Llc | Methods and apparatus for autocalibration of a wearable electrode sensor system |
US11804041B2 (en) | 2018-12-04 | 2023-10-31 | Apple Inc. | Method, device, and system for generating affordances linked to a representation of an item |
US10789952B2 (en) * | 2018-12-20 | 2020-09-29 | Microsoft Technology Licensing, Llc | Voice command execution from auxiliary input |
EP3847531A4 (en) * | 2019-01-31 | 2021-11-03 | Huawei Technologies Co., Ltd. | HAND-OVER-FACE INPUT SENSING FOR INTERACTION WITH A DEVICE HAVING A BUILT-IN CAMERA |
US11393254B2 (en) | 2019-01-31 | 2022-07-19 | Huawei Technologies Co., Ltd. | Hand-over-face input sensing for interaction with a device having a built-in camera |
US10905383B2 (en) | 2019-02-28 | 2021-02-02 | Facebook Technologies, Llc | Methods and apparatus for unsupervised one-shot machine learning for classification of human gestures and estimation of applied forces |
US11481030B2 (en) | 2019-03-29 | 2022-10-25 | Meta Platforms Technologies, Llc | Methods and apparatus for gesture detection and classification |
US11961494B1 (en) | 2019-03-29 | 2024-04-16 | Meta Platforms Technologies, Llc | Electromagnetic interference reduction in extended reality environments |
US11842517B2 (en) | 2019-04-12 | 2023-12-12 | Ultrahaptics Ip Ltd | Using iterative 3D-model fitting for domain adaptation of a hand-pose-estimation neural network |
US11481031B1 (en) | 2019-04-30 | 2022-10-25 | Meta Platforms Technologies, Llc | Devices, systems, and methods for controlling computing devices via neuromuscular signals of users |
USD1009884S1 (en) * | 2019-07-26 | 2024-01-02 | Sony Corporation | Mixed reality eyeglasses or portion thereof with an animated graphical user interface |
US11354896B2 (en) | 2019-07-31 | 2022-06-07 | Seiko Epson Corporation | Display device, display method, and computer program |
US10909767B1 (en) * | 2019-08-01 | 2021-02-02 | International Business Machines Corporation | Focal and interaction driven content replacement into augmented reality |
US11493993B2 (en) | 2019-09-04 | 2022-11-08 | Meta Platforms Technologies, Llc | Systems, methods, and interfaces for performing inputs based on neuromuscular control |
US11747915B2 (en) | 2019-09-30 | 2023-09-05 | Snap Inc. | Smart ring for manipulating virtual objects displayed by a wearable device |
US20220113814A1 (en) | 2019-09-30 | 2022-04-14 | Yu Jiang Tham | Smart ring for manipulating virtual objects displayed by a wearable device |
US20230131474A1 (en) * | 2019-10-16 | 2023-04-27 | Samuel R. Pecota | Augmented reality marine navigation |
US20210158630A1 (en) * | 2019-11-25 | 2021-05-27 | Facebook Technologies, Llc | Systems and methods for contextualized interactions with an environment |
US11907423B2 (en) * | 2019-11-25 | 2024-02-20 | Meta Platforms Technologies, Llc | Systems and methods for contextualized interactions with an environment |
US12089953B1 (en) | 2019-12-04 | 2024-09-17 | Meta Platforms Technologies, Llc | Systems and methods for utilizing intrinsic current noise to measure interface impedances |
US12002448B2 (en) | 2019-12-25 | 2024-06-04 | Ultraleap Limited | Acoustic transducer structures |
US20230068391A1 (en) * | 2020-02-28 | 2023-03-02 | Nec Corporation | Locker system, locker management system, and locker management method |
US11798429B1 (en) | 2020-05-04 | 2023-10-24 | Snap Inc. | Virtual tutorials for musical instruments with finger tracking in augmented reality |
US12014645B2 (en) | 2020-05-04 | 2024-06-18 | Snap Inc. | Virtual tutorials for musical instruments with finger tracking in augmented reality |
US11520399B2 (en) | 2020-05-26 | 2022-12-06 | Snap Inc. | Interactive augmented reality experiences using positional tracking |
US12008153B2 (en) | 2020-05-26 | 2024-06-11 | Snap Inc. | Interactive augmented reality experiences using positional tracking |
US11816267B2 (en) | 2020-06-23 | 2023-11-14 | Ultraleap Limited | Features of airborne ultrasonic fields |
US11886639B2 (en) | 2020-09-17 | 2024-01-30 | Ultraleap Limited | Ultrahapticons |
US11925863B2 (en) | 2020-09-18 | 2024-03-12 | Snap Inc. | Tracking hand gestures for interactive game control in augmented reality |
US20220103748A1 (en) * | 2020-09-28 | 2022-03-31 | Ilteris Canberk | Touchless photo capture in response to detected hand gestures |
US11546505B2 (en) * | 2020-09-28 | 2023-01-03 | Snap Inc. | Touchless photo capture in response to detected hand gestures |
US11644902B2 (en) * | 2020-11-30 | 2023-05-09 | Google Llc | Gesture-based content transfer |
US12086324B2 (en) | 2020-12-29 | 2024-09-10 | Snap Inc. | Micro hand gestures for controlling virtual and graphical elements |
US12072406B2 (en) | 2020-12-30 | 2024-08-27 | Snap Inc. | Augmented reality precision tracking and display |
US11740313B2 (en) | 2020-12-30 | 2023-08-29 | Snap Inc. | Augmented reality precision tracking and display |
US11531402B1 (en) | 2021-02-25 | 2022-12-20 | Snap Inc. | Bimanual gestures for controlling virtual and graphical elements |
US11868531B1 (en) | 2021-04-08 | 2024-01-09 | Meta Platforms Technologies, Llc | Wearable device providing for thumb-to-finger-based input gestures detected based on neuromuscular signals, and systems and methods of use thereof |
US11861070B2 (en) | 2021-04-19 | 2024-01-02 | Snap Inc. | Hand gestures for animating and controlling virtual and graphical elements |
US11435857B1 (en) * | 2021-04-29 | 2022-09-06 | Google Llc | Content access and navigation using a head-mounted device |
US11995899B2 (en) * | 2021-04-29 | 2024-05-28 | Google Llc | Pointer-based content recognition using a head-mounted device |
US20220350997A1 (en) * | 2021-04-29 | 2022-11-03 | Google Llc | Pointer-based content recognition using a head-mounted device |
CN114089879A (zh) * | 2021-11-15 | 2022-02-25 | 北京灵犀微光科技有限公司 | A cursor control method for an augmented reality display device |
US20230221799A1 (en) * | 2022-01-10 | 2023-07-13 | Apple Inc. | Devices and methods for controlling electronic devices or systems with physical objects |
US12100288B2 (en) | 2023-07-27 | 2024-09-24 | Ultrahaptics Ip Ltd | Calibration techniques in haptic systems |
Also Published As
Publication number | Publication date |
---|---|
KR20140069124A (ko) | 2014-06-09 |
US20160306433A1 (en) | 2016-10-20 |
US20220107687A1 (en) | 2022-04-07 |
US20160259423A1 (en) | 2016-09-08 |
CN103858073B (zh) | 2022-07-29 |
JP2021007022A (ja) | 2021-01-21 |
US10401967B2 (en) | 2019-09-03 |
KR20190133080A (ko) | 2019-11-29 |
CN103858073A (zh) | 2014-06-11 |
WO2013093906A1 (en) | 2013-06-27 |
US11093045B2 (en) | 2021-08-17 |
US20170052599A1 (en) | 2017-02-23 |
CN115167675A (zh) | 2022-10-11 |
US20160291699A1 (en) | 2016-10-06 |
JP2014531662A (ja) | 2014-11-27 |
KR20220032059A (ko) | 2022-03-15 |
JP7297216B2 (ja) | 2023-06-26 |
JP2018028922A (ja) | 2018-02-22 |
US20200097093A1 (en) | 2020-03-26 |
US11494000B2 (en) | 2022-11-08 |
US20160320855A1 (en) | 2016-11-03 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US11494000B2 (en) | Touch free interface for augmented reality systems | |
US11262835B2 (en) | Human-body-gesture-based region and volume selection for HMD | |
US20210407203A1 (en) | Augmented reality experiences using speech and text captions | |
US10120454B2 (en) | Gesture recognition control device | |
CN105229582B (zh) | 基于近距离传感器和图像传感器的手势检测 | |
US11755122B2 (en) | Hand gesture-based emojis | |
US9329678B2 (en) | Augmented reality overlay for control devices | |
JP7092028B2 (ja) | 情報処理装置、情報処理方法、及びプログラム | |
US20120229509A1 (en) | System and method for user interaction | |
US20200142495A1 (en) | Gesture recognition control device | |
EP2886173A1 (en) | Augmented reality overlay for control devices |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
AS | Assignment |
Owner name: EYESIGHT MOBILE TECHNOLOGIES LTD., ISRAEL Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:KATZ, ITAY;SHENFELD, AMNON;SIGNING DATES FROM 20140318 TO 20140319;REEL/FRAME:032482/0531 |
|
STCV | Information on status: appeal procedure |
Free format text: BOARD OF APPEALS DECISION RENDERED |
|
STCB | Information on status: application discontinuation |
Free format text: ABANDONED -- AFTER EXAMINER'S ANSWER OR BOARD OF APPEALS DECISION |