WO2015025251A1 - Wearable augmented reality eyeglass communication device comprising a mobile phone and a mobile computing device controlled by virtual touch gesture and neural command - Google Patents
Wearable augmented reality eyeglass communication device comprising a mobile phone and a mobile computing device controlled by virtual touch gesture and neural command
- Publication number
- WO2015025251A1 (PCT application No. PCT/IB2014/063914)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- user
- data
- augmented reality
- processor
- product
- Prior art date
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/03—Arrangements for converting the position or the displacement of a member into a coded form
- G06F3/0304—Detection arrangements using opto-electronic means
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T19/00—Manipulating 3D models or images for computer graphics
- G06T19/006—Mixed reality
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F1/00—Details not covered by groups G06F3/00 - G06F13/00 and G06F21/00
- G06F1/16—Constructional details or arrangements
- G06F1/1613—Constructional details or arrangements for portable computers
- G06F1/163—Wearable computers, e.g. on a belt
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/011—Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
- G06F3/012—Head tracking input arrangements
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/011—Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
- G06F3/013—Eye tracking input arrangements
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/017—Gesture based interaction, e.g. based on a set of recognized hand gestures
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/03—Arrangements for converting the position or the displacement of a member into a coded form
- G06F3/041—Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
- G06F3/042—Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means by opto-electronic means
- G06F3/0425—Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means by opto-electronic means using a single imaging device like a video camera for tracking the absolute position of a single or a plurality of objects with respect to an imaged reference surface, e.g. video camera imaging a display or a projection screen, a table or a wall surface, on which a computer generated image is displayed or projected
- G06F3/0426—Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means by opto-electronic means using a single imaging device like a video camera for tracking the absolute position of a single or a plurality of objects with respect to an imaged reference surface, e.g. video camera imaging a display or a projection screen, a table or a wall surface, on which a computer generated image is displayed or projected tracking fingers with respect to a virtual keyboard projected or printed on the surface
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06Q—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
- G06Q20/00—Payment architectures, schemes or protocols
- G06Q20/30—Payment architectures, schemes or protocols characterised by the use of specific devices or networks
- G06Q20/306—Payment architectures, schemes or protocols characterised by the use of specific devices or networks using TV related infrastructures
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06Q—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
- G06Q20/00—Payment architectures, schemes or protocols
- G06Q20/30—Payment architectures, schemes or protocols characterised by the use of specific devices or networks
- G06Q20/308—Payment architectures, schemes or protocols characterised by the use of specific devices or networks using the Internet of Things
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06Q—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
- G06Q20/00—Payment architectures, schemes or protocols
- G06Q20/30—Payment architectures, schemes or protocols characterised by the use of specific devices or networks
- G06Q20/32—Payment architectures, schemes or protocols characterised by the use of specific devices or networks using wireless devices
- G06Q20/321—Payment architectures, schemes or protocols characterised by the use of specific devices or networks using wireless devices using wearable devices
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06Q—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
- G06Q20/00—Payment architectures, schemes or protocols
- G06Q20/30—Payment architectures, schemes or protocols characterised by the use of specific devices or networks
- G06Q20/32—Payment architectures, schemes or protocols characterised by the use of specific devices or networks using wireless devices
- G06Q20/327—Short range or proximity payments by means of M-devices
- G06Q20/3276—Short range or proximity payments by means of M-devices using a pictured code, e.g. barcode or QR-code, being read by the M-device
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06Q—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
- G06Q30/00—Commerce
- G06Q30/06—Buying, selling or leasing transactions
- G06Q30/0601—Electronic shopping [e-shopping]
- G06Q30/0641—Shopping interfaces
-
- G—PHYSICS
- G08—SIGNALLING
- G08B—SIGNALLING OR CALLING SYSTEMS; ORDER TELEGRAPHS; ALARM SYSTEMS
- G08B21/00—Alarms responsive to a single specified undesired or abnormal condition and not otherwise provided for
- G08B21/02—Alarms for ensuring the safety of persons
- G08B21/06—Alarms for ensuring the safety of persons indicating a condition of sleep, e.g. anti-dozing alarms
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B27/00—Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
- G02B27/01—Head-up displays
- G02B27/0101—Head-up displays characterised by optical features
- G02B2027/0138—Head-up displays characterised by optical features comprising image capture systems, e.g. camera
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B27/00—Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
- G02B27/01—Head-up displays
- G02B27/0101—Head-up displays characterised by optical features
- G02B2027/014—Head-up displays characterised by optical features comprising information/image processing systems
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B27/00—Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
- G02B27/01—Head-up displays
- G02B27/017—Head mounted
- G02B2027/0178—Eyeglass type
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B27/00—Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
- G02B27/01—Head-up displays
- G02B27/017—Head mounted
Definitions
- This application relates generally to wearable personal digital interfaces and, more specifically, to an augmented reality eyeglass communication device.
- Typically, a person who goes shopping visits several stores to compare the assortment of goods, prices, and availability of desired products.
- Handheld digital devices, e.g., smartphones, have become efficient assistants for shopping.
- The person may, for example, create a list of products to buy and save this list on a smartphone.
- When at the store, the smartphone may be used to scan product barcodes to retrieve product information or to perform a payment based on payment information encoded in the product barcodes.
- However, constantly holding the smartphone in a hand may be inconvenient for the person shopping at the store. For example, when the person wants to pick up a bulky product, the person first needs to free his hands and, therefore, put the smartphone into his pocket. After inspecting the desired product, the person needs to take the smartphone out of the pocket again in order to scan a barcode of the desired product or to see which products are left in the list of products to buy.
- Provided herein are an augmented reality eyeglass communication device for facilitating shopping and a method for facilitating shopping using the augmented reality eyeglass communication device.
- The augmented reality eyeglass communication device may comprise a frame having a first end and a second end, a right earpiece connected to the first end of the frame, and a left earpiece connected to the second end of the frame.
- The eyeglass communication device may comprise a processor disposed in the frame, the right earpiece, or the left earpiece and configured to receive one or more commands of a user, perform operations associated with the commands of the user, receive product information, and process the product information.
- The eyeglass communication device may comprise a display connected to the frame and configured to display data received from the processor.
- The display may include an optical prism element and a projector embedded in the display. The projector may be configured to project the data received from the processor onto the optical prism element.
- The eyeglass communication device may comprise a transceiver electrically connected to the processor and configured to receive and transmit data over a wireless network.
- A Subscriber Identification Module (SIM) card slot may be disposed in the frame.
- The eyeglass communication device may comprise a camera disposed on the frame, the right earpiece, or the left earpiece; at least one earphone disposed on the right earpiece or the left earpiece; a microphone configured to sense a voice command of the user; and a charging unit connected to the frame, the right earpiece, or the left earpiece.
- The eyeglass communication device may be configured to perform phone communication functions.
- A method for facilitating shopping using an augmented reality eyeglass communication device may include receiving, by a processor of the eyeglass communication device, product information associated with products comprised in a list of products of a user. Furthermore, the method may involve receiving, by the processor, location information associated with the location of the user. In further embodiments, the method may include searching, based on the product information, by the processor, a database associated with a store for availability, location, and pricing information associated with the products. The method may involve receiving, by the processor, the availability, location, and pricing information associated with the products, and displaying, by a display of the eyeglass communication device, the availability, location, and pricing information associated with the products.
- Modules, subsystems, or devices can be adapted to perform the recited steps.
- Other features and exemplary embodiments are described below.
- FIG. 1 illustrates an environment within which an augmented reality eyeglass communication device for facilitating shopping and a method for facilitating shopping using an augmented reality eyeglass communication device may be implemented, in accordance with an example embodiment.
- FIG. 2 is a schematic representation of an augmented reality eyeglass communication device for facilitating shopping, in accordance with an example embodiment.
- FIG. 3 shows a schematic representation of tracking a hand gesture command performed by an augmented reality eyeglass communication device.
- FIG. 4 is a flow chart illustrating a method for facilitating shopping using an augmented reality eyeglass communication device, in accordance with an example embodiment.
- FIG. 5 shows a payment performed by an augmented reality eyeglass communication device, in accordance with an example embodiment.
- FIG. 6 is a schematic diagram illustrating an example of a computer system for performing any one or more of the methods discussed herein.
- An augmented reality eyeglass communication device for facilitating shopping and a method for facilitating shopping using the augmented reality eyeglass communication device are described herein.
- The eyeglass communication device allows a user to visually access information by simply looking through eyeglass lenses configured as a display. Being worn by the user, the eyeglass communication device may provide for convenient carrying in many situations and environments, such as physical activity, sports, travel, shopping, telephone conversations, leisure time, and so forth.
- Disposing a processor, a transmitter, and a SIM card slot in the structure of the eyeglass communication device, as well as inserting a SIM card into the SIM card slot, may allow the eyeglass communication device to perform communication functions of a mobile phone, e.g., a smartphone, and display data on a display of the eyeglass communication device.
- A user may review the data by simply looking through the lenses of the eyeglass communication device.
- The user may store information in a memory unit of the eyeglass communication device and review the information on the display of the eyeglass communication device.
- The user may perform a number of functions of the smartphone, such as accepting or declining phone calls, making phone calls, listening to music stored in the memory unit of the eyeglass communication device or a remote device or accessed via the Internet, viewing maps, checking weather forecasts, and controlling remote devices to which the eyeglass communication device is currently connected, such as a computer, a TV, an audio or video system, and so forth. Additionally, the eyeglass communication device may allow the user to take a photo or video and upload it to a remote device or to the Internet.
- An augmented reality eyeglass communication device may be a useful tool for facilitating shopping.
- For example, the user may use the eyeglass communication device when shopping at a store.
- The display of the eyeglass communication device may be configured as an eyeglass lens, such as a prescription lens or a lens without diopters, and may include an optical prism element and a projector embedded into the display.
- the camera lens may be configured to track eye movements. The tracked eye movements may be transmitted to the processor and interpreted as a command.
- the projector may project an image received from a processor of the eyeglass communication device to the optical prism element.
- the optical prism element may be configured so as to focus the image to a retina of the user.
- the eyeglass communication device may be configured to sense and process voice commands of the user. Therefore, the user may give voice commands to the eyeglass communication device and immediately see data associated with the commands on the display of the eyeglass communication device.
- the commands of the user may be processed by a processor of the eyeglass communication device or may be sent to a remote device, such as a search server, and information received from the remote device may be displayed on the display of the eyeglass communication device.
- The device may be used as a hands-free mobile computing device, to synchronize with one or more external devices in real time, track user activity, and so forth.
- The device may also comprise an embedded emergency button configured to provide a medical alert signal, a request for help signal, or another informational signal.
- FIG. 1 illustrates an environment 100 within which a user 105 wearing an augmented reality eyeglass communication device 200 for facilitating shopping and methods for facilitating shopping using an augmented reality eyeglass communication device 200 can be implemented.
- The environment 100 may include the user 105, the eyeglass communication device 200, a communication network 110, a store server 115, a financial organization server 120, and a communication server 125.
- The device 200 may communicate with the store server 115, the financial organization server 120, and the communication server 125 via the network 110.
- the device 200 may retrieve information associated with a product 130 by, for example, scanning an image or a barcode of the product 130 or reading an RFID tag of the product 130.
- the barcode may include a one-dimensional barcode, a two-dimensional barcode, a three-dimensional barcode, a quick response code, a snap tag code, and other machine readable codes.
- the barcode may encode payment data, personal data, credit card data, debit card data, gift card data, prepaid card data, bank checking account data, digital cash data, and so forth. Additionally, the barcode may include a link to a web-resource, a payment request, advertising information, and other information.
- the barcode may encode electronic key data and be scannable by a web-camera of an access control system. The scanned data may be processed by the access control system and access to an item related to the access control system may be granted based on the processing.
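- The barcode payload handling described above can be pictured with a short sketch. This is a minimal illustration only, assuming a hypothetical JSON payload with fields such as "type", "amount", and "sku"; the patent does not prescribe any particular encoding.

```python
# Minimal sketch (hypothetical payload format): routing the contents of a scanned
# barcode/QR code to the matching handler. Field names such as "type" and
# "card_number" are illustrative assumptions, not part of the patent.
import json

def route_barcode_payload(decoded_text: str) -> str:
    """Return a human-readable action for a decoded barcode string."""
    try:
        payload = json.loads(decoded_text)          # assume the code encodes JSON
    except json.JSONDecodeError:
        return f"Open link or show raw data: {decoded_text}"

    kind = payload.get("type")
    if kind == "payment":
        return f"Request payment of {payload['amount']} via card {payload['card_number'][-4:]}"
    if kind == "electronic_key":
        return f"Send key {payload['key_id']} to the access control system"
    if kind == "product":
        return f"Look up product {payload['sku']} in the store database"
    return "Unknown payload type"

print(route_barcode_payload('{"type": "product", "sku": "123456"}'))
```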
- The network 110 may include the Internet or any other network capable of communicating data between devices. Suitable networks may include or interface with any one or more of, for instance, a local intranet, a PAN (Personal Area Network), a LAN (Local Area Network), a WAN (Wide Area Network), a MAN (Metropolitan Area Network), a virtual private network (VPN), a storage area network (SAN), a frame relay connection, an Advanced Intelligent Network (AIN) connection, a synchronous optical network (SONET) connection, a digital T1, T3, E1 or E3 line, a Digital Data Service (DDS) connection, a DSL (Digital Subscriber Line) connection, an Ethernet connection, an ISDN (Integrated Services Digital Network) line, a dial-up port such as a V.90, V.34 or V.34bis analog modem connection, a cable modem, an ATM (Asynchronous Transfer Mode) connection, or an FDDI (Fiber Distributed Data Interface) or CDDI (Copper Distributed Data Interface) connection. Furthermore, communications may also include or interface with any of a variety of wireless networks.
- the network 110 can further include or interface with any one or more of an RS-232 serial connection, an IEEE-1394 (Firewire) connection, a Fiber Channel connection, an IrDA (infrared) port, a SCSI (Small Computer Systems Interface) connection, a Universal Serial Bus (USB) connection or other wired or wireless, digital or analog interface or connection, mesh or Digi® networking.
- the network 110 may include any suitable number and type of devices (e.g., routers and switches) for forwarding commands, content, and/or web object requests from each client to the online community application and responses back to the clients.
- the device 200 may be compatible with one or more of the following network standards: GSM, CDMA, LTE, IMS, Universal Mobile Telecommunication System (UMTS), RFID, 4G, 5G, 6G and higher.
- the device 200 may communicate with the GPS satellite via the network 110 to exchange data on a geographical location of the device 200. Additionally, the device 200 may communicate with mobile network operators using a mobile base station.
- the device 200 may be used as a standalone system operating via a WiFi module or a Subscriber Identity Module (SIM) card.
- the methods described herein may also be practiced in a wide variety of network environments (represented by network 110) including, for example, TCP/IP- based networks, telecommunications networks, wireless networks, etc.
- the computer program instructions may be stored in any type of computer-readable media.
- the program may be executed according to a variety of computing models including a client/server model, a peer-to-peer model, on a stand-alone computing device, or according to a distributed computing model in which various functionalities described herein may be effected or employed at different locations.
- the user 105 wearing the device 200 may interact via the bidirectional communication network 110 with the one or more remote devices (not shown).
- the one or more remote devices may include a television set, a set-top box, a personal computer (e.g., a tablet or a laptop), a house signaling system, and the like.
- the device 200 may connect to the one or more remote devices wirelessly or by wires using various connections such as a USB port, a parallel port, an infrared transceiver port, a radiofrequency transceiver port, and so forth.
- FIG. 2 shows a schematic representation of an exemplary eyeglass communication device 200 for facilitating shopping.
- the device 200 may comprise a frame 205 having a first end 210 and a second end 215.
- the first end 210 of the frame 205 may be connected to a right earpiece 220.
- the second end 215 of the frame 205 may be connected to a left earpiece 225.
- the frame 205 may be configured as a single unit or may consist of several pieces.
- the frame 205 may consist of two pieces connected to each other by a connector (not shown).
- the connector may include two magnets, one on each piece of the frame 205. When two parts of the connector are connected, the connector may look like a nose bridge of ordinary eyeglasses.
- the device 200 may comprise a processor 230 disposed in the frame 205, the right earpiece 220 or the left earpiece 225.
- the processor 230 may be configured to receive one or more commands of a user, perform operations associated with the commands of the user, receive product information, and process the product information.
- The processor 230 may run an operating system, such as iOS, Android, Windows Mobile, Blackberry, Symbian, Asha, Linux, Nemo Mobile, and so forth.
- the processor 230 may be configured to establish connection with a network to view text, photo or video data, maps, listen to audio data, watch multimedia data, receive and send e-mails, perform payments, etc. Additionally, the processor 230 may download applications, receive and send text, video, and multimedia data. In a certain embodiment, the processor 230 may be configured to process a hand gesture command of the user.
- The device 200 may also comprise at least one display 235.
- The display 235 may be embedded into the frame 205.
- The frame 205 may comprise openings for disposing the display 235.
- Alternatively, the frame 205 may be implemented without openings and may partially enclose two displays 235.
- the display 235 may be configured as an eyeglass lens, such as prescription lenses, non-prescription lenses, e.g., darkened lenses, safety lenses, lenses without diopters, and the like.
- the eyeglass lens may be changeable.
- the display 235 may be configured to display data received from the processor 230.
- the data received from the processor 230 may include video data, text data, payment data, personal data, barcode information, time data, notifications, and so forth.
- the display 235 may include an optical prism element 240 and a projector 245 embedded in the display 235.
- the display 235 may include a see-through material to display simultaneously a picture of real world and data requested by the user.
- the display 235 may be configured so that the optical prism element 240 and the projector 245 cannot be seen when looking from any side on the device 200. Therefore, the user 105 wearing the device 200 and looking through displays 235 may not see the optical prism element 240 and the projector 245.
- the projector 245 may receive an image 247 from the processor 230 and may project the image 247 to the optical prism element 240.
- the optical prism element 240 may be configured so as to focus the image 247 to a retina of the user.
- the projector 245 may be configured to project the data received from the processor 230 to a surface in environment of the user.
- The surface in the environment of the user may be any surface, such as a vertical surface, a horizontal surface, an inclined surface, a surface of a physical object in the environment of the user, or a part of a body of the user.
- For example, the surface may be a wall, a table, a hand of the user, or a sheet of paper.
- the data may include a virtual touch screen environment.
- the virtual touch screen environment may be see-through to enable the user to see the surroundings.
- Virtual objects in the virtual touch screen environment may be moveable and deformable.
- the user may interact with virtual objects visualized in the virtual touch screen environment.
- the device 200 may provide gesture tracking, surface tracking, code example tracking, and so forth.
- the device 200 may comprise a gesture sensor capable of measuring electrical activity associated with a muscle movement.
- the muscle movement may be detected and interpreted as a command.
- the user may interact with the data and/or objects projected by the projector 245 (e.g. a rear projector system), such as the virtual touch screen.
- the camera 260 may capture images or video of user body parts in relation to the projected objects and recognize user commands provided via virtual control components. Alternatively, motions of user fingers or hands may be detected by one or more sensors and interpreted by the processor.
- the device 200 may comprise two cameras, one for each eye of the user. Each of the two cameras may have a 23 degree field of view.
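- As a rough illustration of interaction with projected virtual controls, the sketch below tests a detected fingertip position against the layout of a projected interface. The fingertip coordinates, control names, and layout are assumptions for the example, not details from the patent.

```python
# Minimal sketch (assumed geometry): deciding which projected virtual control a
# detected fingertip is touching. The fingertip coordinates are presumed to come
# from the camera 260 or gesture sensor; control names are illustrative only.
from dataclasses import dataclass
from typing import Optional

@dataclass
class VirtualControl:
    name: str
    x: float      # top-left corner in projected-surface coordinates
    y: float
    width: float
    height: float

    def contains(self, px: float, py: float) -> bool:
        return self.x <= px <= self.x + self.width and self.y <= py <= self.y + self.height

def hit_test(controls: list[VirtualControl], px: float, py: float) -> Optional[str]:
    """Return the name of the control under the fingertip, if any."""
    for control in controls:
        if control.contains(px, py):
            return control.name
    return None

layout = [VirtualControl("answer_call", 0.1, 0.1, 0.3, 0.2),
          VirtualControl("decline_call", 0.6, 0.1, 0.3, 0.2)]
print(hit_test(layout, 0.25, 0.2))   # -> "answer_call"
```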
- The projector 245 may be configured to be rotatable to enable the projector 245 to project an image onto the optical prism element 240, as well as onto a surface in the environment of the user.
- the image projected by the projector 245 may be refracted by an optical prism element embedded into a display 235 and directed to the surface in environment of the user.
- the data projected by the projector to the optical prism element may be perceived by a human eye as located at a distance of 3 to 8 meters.
- the device 200 may comprise a transceiver 250 electrically coupled to the processor 230.
- the transceiver 250 may be configured to receive and transmit data from a remote device over a wireless network, receive one or more commands of the user, and transmit the data and the one or more commands to the remote device.
- the remote device may include a store server, a communication server, a financial organization server, and so forth.
- the transceiver 250 may be disposed in the frame 205, the right earpiece 220, or the left earpiece 225.
- the device 200 may comprise a receiver configured to sense a change in frequency of a WiFi signal.
- the change may be caused by a move of a user hand.
- the change may be processed by the processor and a hand gesture associated with the change may be recognized and the corresponding command may be performed.
- The command may include controlling temperature settings, adjusting the volume on a stereo, flipping a channel on a television set, shutting off lights, causing a fireplace to blaze to life, and so forth.
- the change in frequency may be sensed in a line of sight of the user, outside the line of sight of the user, through a wall, and so forth.
- the receiver sensing WiFi signal may be activated by a specific combination of gestures serving as an activating sequence or a password.
- WiFi signal change may be sensed by a microphone.
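- The Wi-Fi based gesture sensing described above can be sketched as follows. This is only an illustrative assumption: a gesture event is flagged when the variance of recent signal-strength samples exceeds a threshold; the patent itself does not specify the detection algorithm.

```python
# Illustrative sketch only: flagging a possible hand-gesture event from a window of
# Wi-Fi signal-strength samples. The patent describes sensing a change in the Wi-Fi
# signal; the variance threshold and sample source here are assumptions.
from statistics import pvariance

def gesture_detected(rssi_window: list[float], threshold: float = 4.0) -> bool:
    """Return True when signal variation in the window exceeds the threshold."""
    if len(rssi_window) < 2:
        return False
    return pvariance(rssi_window) > threshold

quiet = [-52.0, -52.5, -51.8, -52.2]          # no hand movement
waving = [-52.0, -47.5, -58.0, -45.0, -55.0]  # hand moving through the signal path
print(gesture_detected(quiet), gesture_detected(waving))  # False True
```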
- The device 200 may comprise a SIM card slot 255 disposed in the frame 205, the right earpiece 220, or the left earpiece 225 and configured to receive a SIM card (not shown).
- The SIM card may store a phone number of the SIM card, an operator of the SIM card, an available balance of the SIM card, and so forth. Therefore, when the SIM card is received in the SIM card slot 255, the device 200 may perform phone communication functions, i.e., may function as a mobile phone, in particular, a smartphone.
- the device 200 may comprise a camera 260 disposed on the frame 205, the right earpiece 220 or the left earpiece 225.
- the camera 260 may include one or more of the following: a digital camera, a mini-camera, a motion picture camera, a video camera, a still photography camera, and so forth.
- the camera 260 may be configured to take a photo or record a video, capture a sequence of images, such as the images containing a hand of the user.
- the camera 260 may communicate the captured photo or video to the transceiver 250. Alternatively, the camera 260 may transmit the images to the processor to recognize the hand gesture command.
- the camera 260 may be configured to perform simultaneously video recording and image capturing.
- FIG. 3 shows a schematic representation 300 of an embodiment of the device 200, in which the camera 260 may be configured to track a hand gesture command of the user 105.
- the tracked hand gesture command of the user may be communicated to a processor of the device 200.
- The user 105 may give a command to perform a phone call, e.g., by moving a user hand up.
- the camera 260 may track the hand gesture command of the user 105 and communicate data associated with the tracked data to the processor of the device 200.
- The processor may process the received data and may give a command to the projector 245 to project an image of a keyboard, i.e., a virtual keyboard 305, onto a surface 310 in the environment of the user 105.
- The user 105 may point at digits of a telephone number on the virtual keyboard 305.
- The camera 260 may detect the digits pointed at by the user 105 and communicate the digits to the processor.
- The processor may process the received digits and give a command to perform the phone call.
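- The virtual keyboard dialing flow above can be summarized in a short sketch. The helper names detect_key and place_call are hypothetical stand-ins for the camera-based key detection and the phone function of the device.

```python
# Sketch of the dialing flow described above, under assumed interfaces: detect_key()
# stands in for the camera-based detection of the key the user points at, and
# place_call() for the phone function of the device. Both are hypothetical names.
from typing import Optional

def detect_key(frame) -> Optional[str]:
    """Hypothetical: return the virtual key under the user's fingertip, if any."""
    return frame  # placeholder so the example runs

def place_call(number: str) -> None:
    print(f"Dialing {number} ...")

def dial_on_virtual_keyboard(frames) -> None:
    digits = []
    for frame in frames:
        key = detect_key(frame)
        if key is None:
            continue
        if key == "CALL":
            place_call("".join(digits))
            return
        if key.isdigit():
            digits.append(key)

dial_on_virtual_keyboard(["5", "5", "5", None, "0", "1", "2", "3", "CALL"])
```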
- the device 200 may comprise several cameras mounted on any side of the device 200 and directed in a way allowing capture of all areas around the device 200.
- the cameras may be mounted on front, rear, top, left and right sides of the device 200.
- the areas captured by the front-, rear-, top-, left- and right-side cameras may be displayed on the display 235 simultaneously or one by one.
- the user may select, for example, by voice command, one of the cameras, and the data captured by the selected camera may be shown on the display 235.
- the camera 260 may be configured to allow focusing on an object selected by the user, for example, by voice command.
- The camera 260 may be configured to scan a barcode. Scanning a barcode may involve capturing an image of the barcode using the camera 260. The scanned barcode may be processed by the processor 230 to retrieve the barcode information. Using the camera 260 of the device 200, the user may capture pictures of various cards, tickets, or coupons. Such pictures, stored in the device 200, may comprise data related to the captured cards, tickets, or coupons. [0044] One having ordinary skill in the art would understand that the term "barcode" is not limited to printed barcodes having particular formats, but also covers barcodes displayed on a screen of a PC, smartphone, laptop, or another wearable personal digital device (WPD), and so forth. Additionally, barcodes may be transmitted to and from the eyeglass communication device electronically.
- Barcodes may be in the form of an Electronic Product Code (EPC) designed as a universal identifier that provides a unique identity for every physical object (not just a trade item category) anywhere in the world. It should be noted that EPCs are not exclusively used with RFID data carriers. They can be constructed based on reading of optical data carriers, such as linear barcodes and two-dimensional barcodes, e.g., Data Matrix symbols. For purposes of this document, all optical data carriers are referred to herein as "barcodes".
- the camera 260 may be configured to capture an image of a product.
- the captured image may be processed by the processor to retrieve image information.
- the image information may include a name of the product or a trademark of the product.
- Information associated with the product may be retrieved from the image information and displayed on the display 235.
- the device 200 may comprise at least one earphone 270 disposed on the right earpiece 220 or the left earpiece 225.
- the earphone 270 may play sounds received by the transceiver 250 from the control device.
- the device 200 may comprise a microphone 275.
- the microphone 275 may sense the voice command of the user and communicate it to the transceiver 250.
- the voice command may also include a voice memo, a voice message, and so forth. Additionally, the microphone 275 may sense other voice data and transmit the voice data to the processor.
- the device 200 may comprise a charging unit 280 connected to the frame 205, the right earpiece 220 or the left earpiece 225.
- the charging unit 280 may be configured to provide power to elements of the device 200.
- the charging unit may include one or more solar cells, a wireless charger accessory, a vibration charger configured to charge the devices using natural movement vibrations, and so forth.
- The device 200 may include at least one electroencephalograph (EEG) sensor.
- Neurons of the human brain can interact through a chemical reaction and emit a measurable electrical impulse.
- EEG sensors may sense the electrical impulses and translate the pulses into one or more commands. By sensing the electrical impulses, the device may optimize brain fitness and performance of the user, measure and monitor cognitive health and wellbeing of the user, and so forth.
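- As a toy illustration of translating sensed electrical impulses into commands, the sketch below maps a window of EEG samples to a coarse command using an amplitude threshold. The thresholds and command names are assumptions; a real EEG pipeline would be considerably more involved.

```python
# Toy sketch, not a clinical-grade pipeline: translating an EEG sample window into a
# command by comparing a simple amplitude feature against calibrated thresholds.
# The thresholds and command names are assumptions for illustration.
from typing import Optional

def eeg_to_command(samples_uv: list[float]) -> Optional[str]:
    """Map a window of EEG samples (microvolts) to a coarse device command."""
    if not samples_uv:
        return None
    mean_abs = sum(abs(s) for s in samples_uv) / len(samples_uv)
    if mean_abs > 40.0:        # strong, deliberate activity burst
        return "select_item"
    if mean_abs > 20.0:        # moderate activity
        return "next_item"
    return None                # rest: no command

print(eeg_to_command([35.0, -42.0, 50.0, -48.0]))  # -> "select_item"
```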
- the device 200 may comprise a memory slot 285 disposed on the frame 205, the right earpiece 220 or the left earpiece 225.
- The memory slot 285 may be configured to receive a memory unit (not shown).
- the device 200 may display data stored in the memory unit of the device 200.
- data may include a photo or a video recorded by the camera 260, the information received from a remote device, payment information of the user in the form of a scannable barcode, discount or membership cards of the user, tickets, coupons, boarding passes, any personal information of the user, and so forth.
- the memory unit may include a smart media card, a secure digital card, a compact flash card, a multimedia card, a memory stick, an extreme digital card, a trans flash card, and so forth.
- The device 200 may comprise at least one sensor.
- The sensor may include at least one eye-tracking unit, at least one motion sensing unit, and an accelerometer determining an activity of the user.
- the eye-tracking unit may track an eye movement of the user, generate a command based on the eye movement, and communicate the command to the transceiver 250.
- the motion sensing unit may sense head movement of the user, i.e. motion of the device 200 about a horizontal or vertical axis. In particular, the motion sensing unit may sense motion of the frame 205, the right earpiece 220 or the left earpiece 225.
- the user may give commands by moving the device 200, for example, by moving the head of the user.
- the user may choose one or more ways to give commands: by voice using the microphone 275, by eye movement using the eye-tracking unit, by head movement using the motion sensing unit, for example, by nodding or shaking the head, or use all these ways simultaneously.
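- The several command channels mentioned above (voice, eye movement, head movement) could feed a single dispatcher, as in the following sketch. The event structure and command names are illustrative assumptions.

```python
# Minimal sketch of combining the command channels listed above (voice, eye tracking,
# head motion) into one handler. Event structure and command names are assumptions.
from typing import Callable, Optional

def handle_command(command: str) -> None:
    print(f"Executing: {command}")

# Each recognizer maps a raw event from its sensor to a device command (or None).
RECOGNIZERS: dict[str, Callable[[str], Optional[str]]] = {
    "voice": lambda text: text if text in {"call", "open list", "take photo"} else None,
    "eye":   lambda move: "scroll_down" if move == "look_down" else None,
    "head":  lambda move: "accept" if move == "nod" else ("decline" if move == "shake" else None),
}

def dispatch(source: str, raw_event: str) -> None:
    recognizer = RECOGNIZERS.get(source)
    command = recognizer(raw_event) if recognizer else None
    if command:
        handle_command(command)

dispatch("voice", "take photo")
dispatch("head", "nod")
```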
- the device 200 may comprise one or more biometric sensors to sense biometric parameters of the user.
- the biometric parameters may be stored to the memory and processed by the processor to receive historical biometric data.
- the biometric sensors may include sensors for measuring a blood pressure, a pulse, a heart rate, a glucose level, a body temperature, an environment temperature, arterial properties, and so forth.
- the sensed data may be processed by the processor and/or shown on the display 235.
- one or more automatic alerts may be provided based on the measuring, such as visual alerts, audio alerts, voice alerts, and so forth.
- the device 200 may comprise one or more accelerometers. Using the accelerometers, the various physical data related to the user may be received, such as calories burned, sleep quality, breaths per minute, snoring breaks, steps walked, distance walked, and the like. In some embodiments, using the accelerometers, the device 200 may control snoring by sensing the position of the user while he is asleep.
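- One of the accelerometer-derived metrics mentioned above, steps walked, can be approximated with simple threshold-crossing peak detection, as in this rough sketch; the threshold value is an assumption.

```python
# Rough sketch (thresholds are assumptions): counting steps from accelerometer
# magnitude samples by detecting upward crossings of a threshold, one of the
# physical-activity metrics mentioned above.
def count_steps(magnitudes_g: list[float], threshold: float = 1.2) -> int:
    steps, above = 0, False
    for m in magnitudes_g:
        if m > threshold and not above:
            steps += 1            # rising edge: one step
            above = True
        elif m <= threshold:
            above = False
    return steps

walk = [1.0, 1.3, 1.0, 0.9, 1.4, 1.0, 1.25, 0.95]
print(count_steps(walk))  # -> 3
```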
- The device 200 may comprise a light indicator and buttons 295, such as an on/off button and a reset button.
- the device 200 may comprise a USB slot 297 to connect to other devices, for example, to a computer.
- The device 200 may comprise a gesture recognition unit including at least a three-dimensional (3D) gesture recognition sensor.
- the gesture recognition unit may be configured to track hand gesture commands of the user. Moreover, non-verbal communication of a human (gestures, hand gestures, emotion signs, directional indications, and facial expressions) may be recognized by the gesture recognition unit, a camera, and/or other sensors. Multiple hand gesture commands or gestures of other humans may be identified simultaneously. In various embodiments, hand gesture commands or gestures of other humans may be identified based on depth data, finger data, hand data, and other data, which may be received from sensors of the device 200.
- the 3D gesture recognition sensor may capture three dimensional data in real time with high precision.
- a human hand may be interpreted as a collection of vertices and lines in a 3D mesh. Based on relative position and interaction of the vertices and lines, the gesture may be inferred.
- a skeletal representation of a user body may be generated.
- a virtual skeleton of the user may be computed by the device 200 and parts of the body may be mapped to certain segments of the virtual skeleton.
- user gestures may be determined faster, since only key parameters are analyzed.
- deformable 2D templates of hands may be used.
- Deformable templates may be sets of points on the outline of human hands, combined by simple linear interpolation into an average shape, together with point variability parameters and external deformators. Parameters of the hands may be derived directly from images or videos using a template database of previously captured hand gestures.
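- A much-simplified version of the template matching described above is sketched below: a detected hand outline is compared against stored outlines by mean point distance, and the closest template wins. Real deformable templates also model point variability and deformations.

```python
# Simplified sketch of matching a detected hand outline against a template database,
# in the spirit of the deformable 2D templates described above. Here we only compare
# equally sampled outline points by mean Euclidean distance.
import math

Outline = list[tuple[float, float]]

def mean_distance(a: Outline, b: Outline) -> float:
    return sum(math.dist(p, q) for p, q in zip(a, b)) / len(a)

def best_gesture(outline: Outline, templates: dict[str, Outline]) -> str:
    return min(templates, key=lambda name: mean_distance(outline, templates[name]))

templates = {
    "open_palm": [(0, 0), (1, 2), (2, 3), (3, 2), (4, 0)],
    "fist":      [(0, 0), (1, 1), (2, 1), (3, 1), (4, 0)],
}
observed = [(0, 0), (1, 1.9), (2, 2.8), (3, 2.1), (4, 0)]
print(best_gesture(observed, templates))  # -> "open_palm"
```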
- facial expressions of the user including a blink, a wink, a surprise expression, a frown, a clench, a smile, and so forth, may be tracked by the camera 260 and interpreted as user commands.
- user blinking may be interpreted by the device 200 as a command to capture a photo or a video.
- the device 200 may enable the user to control, remotely or non-remotely, various machines, mechanisms, robots, and so forth.
- Information associated with key components of the body parts may be used to recognize gestures.
- Important parameters, such as palm position or joint angles, may be obtained.
- relative position and interaction of user body parts may be determined in order to infer gestures.
- Meaningful gestures may be associated with templates stored in a template database.
- images or videos of the user body parts may be used for gesture interpretation. Images or videos may be taken by the camera 260.
- The device 200 may comprise an RFID reader (not shown) to read an RFID tag of a product.
- the read RFID tag may be processed by the processor 230 to retrieve the product information.
- the device 200 may be configured to allow the user to view data in 3D format.
- the device 200 may comprise two displays 235 enabling the user to view data in 3D format. Viewing the data in 3D format may be used, for example, when working with such applications as games, simulators, and the like.
- The device 200 may be configured to enable head tracking. The user may control, for example, video games by simply moving his head. A video game application with head tracking may use 3D effects to coordinate actual head movements of the user with the game environment.
- the device 200 may comprise a vibration unit (not shown).
- the vibration unit may be mounted to the frame 205, the right earpiece 220 or the left earpiece 225.
- the vibration unit may generate vibrations.
- the user may feel the vibrations generated by the vibration unit.
- the vibration may notify the user about receipt of the data from the remote device, alert notification, and the like.
- the device 200 may comprise a communication circuit.
- the communication circuit may include one or more of the following: a Bluetooth module, a WiFi module, a communication port, including a universal serial bus (USB) port, a parallel port, an infrared transceiver port, a radiofrequency transceiver port, an embedded transmitter, and so forth.
- the device 200 may communicate with external devices using the communication circuit.
- the device 200 may comprise a GPS unit
- the GPS unit may be disposed on the frame 205, the right earpiece 220 or the left earpiece 225.
- the GPS unit may detect coordinates indicating a position of the user 105.
- the coordinates may be shown on the display 235, for example, on request of the user, stored in the memory unit 285, or sent to a remote device.
- the device 200 may comprise a Wi-Fi module (not shown) and a Wi-Fi signal detecting sensor (not shown).
- the Wi-Fi signal detecting sensor may be configured to detect change of a Wi-Fi signal caused by the hand gesture command of the user and communicate data associated with the detected change to the processor 230.
- the processor 230 may be further configured to process the data associated with the detected change of the Wi-Fi signal and perform the detected hand gesture command in accordance with the processed data. For example, a user may give a command to turn off the light in the room, e.g., by moving a user hand up and down.
- the Wi-Fi signal changes due to movement of the user hand.
- the Wi-Fi signal detecting sensor may detect change of the Wi-Fi signal and
- the processor 230 may process the received data to determine the command given by the user and send a command to a light controlling unit of the room to turn off the light.
- The device 200 may produce signals used to control a device remotely (e.g., a TV set, an audio system, and so forth), enable a two-way radio alert, a medical care alert, or a radar, activate a door opener, or control a transport vehicle, a navigational beacon, a toy, and the like.
- The device 200 may include control elements to control operation or functions of the device.
- Access to the device 200 may be controlled by a password, a Personal Identification Number (PIN) code, and/or biometric authorization.
- The biometric authorization may include fingerprint scanning, palm scanning, face scanning, retina scanning, and so forth. The scanning may be performed using one or more biometric sensors. Additionally, the device 200 may include a fingerprint reader configured to scan a fingerprint. The scanned fingerprint may be matched to one or more approved fingerprints, and if the scanned fingerprint corresponds to one of the approved fingerprints, access to the device 200 may be granted.
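- The fingerprint-based authorization flow above can be sketched as follows. Real fingerprint readers rely on fuzzy template matching rather than exact hash comparison; the hash check here only illustrates the grant/deny step.

```python
# Flow sketch only: granting access when a scanned fingerprint matches an approved
# one. Real fingerprint readers use fuzzy template matching, not exact hashes; the
# hash comparison here just illustrates the authorization step described above.
import hashlib

APPROVED_HASHES = {hashlib.sha256(b"enrolled-fingerprint-template").hexdigest()}

def access_granted(scanned_template: bytes) -> bool:
    return hashlib.sha256(scanned_template).hexdigest() in APPROVED_HASHES

print(access_granted(b"enrolled-fingerprint-template"))  # True
print(access_granted(b"unknown-finger"))                 # False
```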
- A Software Development Kit (SDK) and/or an Application Programming Interface (API) may be provided for the device 200; the SDK and/or API may be used for third-party integration purposes.
- The device 200 may comprise a GPS module to track the geographical location of the device, an alert unit to alert the user about some events by vibration and/or sound, one or more subscriber identification module (SIM) cards, one or more additional memory units, a physical interface (e.g., a microSecureDigital (microSD) slot) to receive memory devices external to the device, a two-way radio transceiver for communication purposes, and an emergency button configured to send an alarm signal.
- the vibration and sound of the alert unit may be used by a guide tool and an exercise learning service.
- The device may be configured to analyze one or more music records stored in a memory unit.
- the device may communicate, over a network, with one or more music providers and receive data on music records suggested by the music providers for sale which are similar to the music records stored in the memory unit of the device.
- the received data may be displayed by the device.
- the processor may be configured to communicate with a gambling cloud service or a gaming cloud service, exchange gambling or gaming data with the gambling cloud service or the gaming cloud service, and, based on a user request, transfer payments related to gambling or gaming using payment data of the user associated with an account of the user in the cloud service, using payment data of the user stored in a memory unit or using a swipe card reader to read payment card data.
- FIG. 4 is a flow chart illustrating a method 400 for facilitating shopping using an augmented reality eyeglass communication device 200.
- the method 400 may start with receiving product information associated with products comprised in a list of products of a user at operation 402.
- the product information, e.g., names or types of the products, may be received by a processor 230 of the device 200 by sensing a command of the user.
- the user may pronounce names of products the user wishes to buy and may give a voice command to include these products into the list of products.
- the device 200 may sense the voice command of the user via a microphone 275 and communicate the command to the processor 230.
- the processor 230 may receive location information associated with location of the user at operation 404.
- the processor 230 may search a database associated with a store for availability, location and pricing information associated with the products included into the list of products of the user. The search may be based on the product information.
- the store may include any store in proximity to the location of the user or any store selected by the user.
- the processor 230 may receive the availability, location and pricing information associated with the product from the database of the store. The availability, location and pricing information associated with the product may be displayed to the user on a display 235 of the device 200 at operation 410.
- the method 400 may comprise plotting, by the processor 230, a route for the user on a map of the store based on the availability, location and pricing information associated with the product and the location information associated with the location of the user.
- the route may be displayed on the display 235.
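A minimal sketch of this lookup and route plotting, assuming the store database can be reduced to a product-to-(aisle, price, stock) mapping; the `STORE_DB` contents and the aisle-ordering heuristic are invented for illustration and are not the patent's data model.

```python
from dataclasses import dataclass
from typing import Dict, List

@dataclass
class StoreEntry:
    aisle: int        # where the product is shelved
    price: float      # current price
    in_stock: bool

# Hypothetical store database keyed by product name.
STORE_DB: Dict[str, StoreEntry] = {
    "milk":   StoreEntry(aisle=3, price=1.20, in_stock=True),
    "bread":  StoreEntry(aisle=1, price=0.90, in_stock=True),
    "coffee": StoreEntry(aisle=7, price=4.50, in_stock=False),
}

def lookup(shopping_list: List[str]) -> Dict[str, StoreEntry]:
    """Availability, location and pricing for every product on the list."""
    return {p: STORE_DB[p] for p in shopping_list if p in STORE_DB}

def plan_route(found: Dict[str, StoreEntry]) -> List[str]:
    """Order in-stock products by aisle so the user walks the store once."""
    in_stock = [p for p, e in found.items() if e.in_stock]
    return sorted(in_stock, key=lambda p: found[p].aisle)

shopping_list = ["milk", "coffee", "bread"]
found = lookup(shopping_list)
for product, entry in found.items():
    print(f"{product}: aisle {entry.aisle}, ${entry.price:.2f}, "
          f"{'available' if entry.in_stock else 'out of stock'}")
print("suggested route:", " -> ".join(plan_route(found)))
```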
- the user may give a command to provide description of a product present in the store.
- the device 200 may sense the command of the user via the microphone and communicate the command to the processor 230 of the device 200.
- the processor 230 may receive information associated with the product whose description is requested by the user.
- the information associated with the product may be received by means of taking a picture of the product, scanning a barcode of the product, or reading an RFID tag of the product. The received information may then be processed by the processor 230.
- the processor 230 may search, based on the received information associated with the product, the description of the product in a database available in a network, e.g., in the Internet. After receiving, by the processor, the description of the product from the network, the description of the product present in the store may be displayed to the user on the display 235.
- the user may give a command to provide description of a product by means of a hand gesture, for example, by moving a hand of the user from left to right.
- the method 400 may comprise tracking, by a camera of the device 200, a hand gesture command of the user.
- the hand gesture command of the user may be processed by a processor of the device 200.
- the processor may give a command to a projector of the device 200 to project the description of the product onto a surface in the environment of the user, e.g., a wall or the product itself, according to the hand gesture command.
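The left-to-right gesture mentioned above could, for instance, be recognised from the horizontal positions the camera tracker reports for the hand; the sketch below classifies such a trajectory and maps it to a projector command. The normalised coordinates, thresholds, and command names are assumptions made only for illustration.

```python
from typing import List

def is_left_to_right_swipe(x_positions: List[float],
                           min_travel: float = 0.4,
                           tolerance: float = 0.05) -> bool:
    """Classify a tracked hand trajectory (normalised x in [0, 1], one sample per
    camera frame) as a left-to-right swipe: the hand must move mostly rightwards
    and cover at least `min_travel` of the field of view."""
    if len(x_positions) < 2:
        return False
    travel = x_positions[-1] - x_positions[0]
    backwards = any(b < a - tolerance for a, b in zip(x_positions, x_positions[1:]))
    return travel >= min_travel and not backwards

def on_gesture(x_positions: List[float]) -> str:
    """Map a recognised gesture to a projector command."""
    if is_left_to_right_swipe(x_positions):
        return "PROJECT_PRODUCT_DESCRIPTION"   # e.g. onto a wall or the product itself
    return "NO_OP"

trajectory = [0.10, 0.22, 0.35, 0.51, 0.68, 0.81]  # would come from the camera tracker
print(on_gesture(trajectory))
```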
- the processor 230 may optionally receive information about the products put by the user into a shopping cart.
- the information about the products may be received by means of taking a picture of the product, scanning a barcode of the product, and reading a RFID tag of the product.
- the processor 230 may remove, based on the received information, the products put by the user into the shopping cart from the list of products.
- in case a product included in the list of products is not available in the store, the device 200 may notify the user about such an absence, for example, by means of a sound or vibration notification or by showing the notification on the display 235.
- the processor 230 may search availability information associated with the unavailable product in a database of a store located proximate to the location of the user, based on location information of the user.
- the processor 230 may search the database associated with the store for information about a product having the same characteristics as the unavailable product.
- after the processor 230 receives the information about the product having the same characteristics as the unavailable product, the information may be displayed to the user on the display 235.
- when all products the user needs are put into the shopping cart, the user may give a command to perform a payment.
- the processor 230 may receive information about the products put by the user into the shopping cart and, based on the received information, may generate a payment request.
- the generated payment request may be sent, by means of the transceiver 250, to a financial organization to perform a payment.
- the financial organization may include a bank.
- the financial organization may confirm the payment, for example, based on SIM information of the user received together with the payment request or any other information associated with the device 200 and stored in a database of the financial organization.
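A sketch of how such a payment request might be assembled from the cart contents before the transceiver 250 sends it to the financial organization; the JSON field names, the currency, the placeholder SIM identifier, and the merchant account string are illustrative assumptions rather than the patent's message format.

```python
import json
from dataclasses import asdict, dataclass
from typing import List

@dataclass
class CartItem:
    name: str
    price: float
    quantity: int = 1

def build_payment_request(items: List[CartItem], sim_id: str,
                          merchant_account: str) -> str:
    """Assemble a payment request payload from the scanned cart contents."""
    total = round(sum(i.price * i.quantity for i in items), 2)
    request = {
        "merchant_account": merchant_account,
        "amount": total,
        "currency": "USD",
        "sim_id": sim_id,  # lets the financial organization confirm the payer
        "items": [asdict(i) for i in items],
    }
    return json.dumps(request)

def send_to_financial_organization(payload: str) -> None:
    """Stand-in for the transceiver 250 transmitting the request over the network."""
    print("sending payment request:", payload)

cart = [CartItem("milk", 1.20), CartItem("bread", 0.90, quantity=2)]
send_to_financial_organization(
    build_payment_request(cart, sim_id="SIM-0001-DEMO", merchant_account="STORE-42"))
```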
- one example embodiment of the method 300 in respect of facilitating shopping will now be illustrated by FIG. 5.
- the user 105 may give a command, for example, by voice or by eye movement, to scan a barcode of a product 130.
- the device 200 may scan the barcode of the product 130 by means of a camera. After scanning the barcode of the product 130, the user 105 may receive payment data associated with the product 130.
- the payment data may encode payment request information, such as receiving account, amount to be paid, and so forth. However, in some embodiments, the amount to be paid may be provided by the user 105.
- the user may choose to pay electronically using the payment data stored on the device 200 or by a payment card.
- the user 105 may dispose the payment card in front of the camera of the device 200.
- information about the payment card may be stored in a memory unit of the device 200 or may be reached via the Internet.
- the device 200 may receive payment data associated with the payment card.
- the device 200 may generate a payment request 502 based on the payment data of the payment card and the payment data of the product 130.
- the payment request 502 may be then sent via the network 110 to the financial organization 504 associated with the payment data of the payment card.
- the financial organization 504 may process the payment request 502 and may either perform the payment or deny the payment.
- a report 506 may be generated and sent to the device 200 via the network 110.
- the report 506 may inform the user 105 whether the payment succeeded or was denied.
- the user 105 may be notified about the report 506 by showing the report 506 on the display of the device 200, playing a sound in earphones of the device 200, or by generating a vibration by a vibration unit of the device 200.
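The report handling can be reduced to a small fan-out over whichever output channels are enabled, as in the sketch below; the channel names, messages, and print statements are placeholders standing in for the display, the earphones, and the vibration unit.

```python
from dataclasses import dataclass

@dataclass
class PaymentReport:
    succeeded: bool
    message: str

def notify_user(report: PaymentReport,
                show: bool = True, sound: bool = True, vibrate: bool = True) -> None:
    """Deliver the payment report over the enabled output channels."""
    status = "payment succeeded" if report.succeeded else "payment denied"
    if show:
        print(f"[display] {status}: {report.message}")
    if sound:
        print(f"[earphones] playing {'success' if report.succeeded else 'failure'} tone")
    if vibrate:
        print("[vibration unit] short pulse")

notify_user(PaymentReport(succeeded=True, message="$3.00 paid to STORE-42"))
```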
- the user 105 may receive payments from other users via the device 200. Payment data associated with another user may be received by the device 200.
- the payment data may include payment account information associated with another user, payment transfer data, and so forth. Based on the payment data, an amount may be transferred from the payment account of another user to a payment account of the user.
- the information on the payment account of the user may be stored in the memory of the device 200 or on a server.
- the device 200 may be used for different purposes.
- the device may enable hands free check-in and/or check-out, and so forth. Additionally, the device may perform hands free video calls, take pictures, record video, get directions to a location, and so forth.
- the augmented reality eyeglass communication device may make and receive calls over a radio link while moving around a wide geographic area via a cellular network, access a public phone network, send and receive text, photo, and video messages, access the Internet, capture videos and photos, play games, and so forth.
- the augmented reality eyeglass communication device may be used to purchase products in a retail environment.
- the augmented reality eyeglass communication device, on receiving a user request to read one or more product codes, may read the product codes corresponding to products.
- the reading may include scanning the product code by the augmented reality eyeglass communication device and decoding the product code to receive product information.
- prior to the reading, a product price and an aisle location of products may be determined. Each reading may be stored in a list of read products on the augmented reality eyeglass communication device. Additionally, the user may create one or more product lists. In some embodiments, a request to check a total amount and price of the reading may be received from the user. Additionally, the user may give a command to remove some items from the reading, so some items may be selectively removed.
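The list of read products described above behaves like a small mutable container with a running total; a possible sketch, with invented product codes and prices, follows.

```python
from dataclasses import dataclass, field
from typing import Dict, List

@dataclass
class Reading:
    code: str      # decoded product code
    name: str
    price: float

@dataclass
class ReadingList:
    """List of read products kept on the device."""
    readings: List[Reading] = field(default_factory=list)

    def add(self, reading: Reading) -> None:
        self.readings.append(reading)

    def remove(self, code: str) -> None:
        """Selectively remove one previously read item."""
        self.readings = [r for r in self.readings if r.code != code]

    def summary(self) -> Dict[str, float]:
        return {"items": len(self.readings),
                "total": round(sum(r.price for r in self.readings), 2)}

scans = ReadingList()
scans.add(Reading("0123456789012", "milk", 1.20))
scans.add(Reading("5012345678900", "coffee", 4.50))
scans.remove("5012345678900")   # user asks to drop an item from the reading
print(scans.summary())          # {'items': 1, 'total': 1.2}
```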
- Data associated with the product information may be transmitted to a payment processing system.
- the augmented reality eyeglass communication device may calculate the total price of the reading, and payment may be authorized and the authorization may be transmitted to the payment processing system.
- the payment processing system may perform the payment and funds may be transferred to a merchant account.
- the total price may be encoded in a barcode and the barcode may be displayed on a display of the augmented reality eyeglass communication device. The displayed barcode may be scanned by a sales person to accelerate check out.
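One plausible way to encode the total price is the price-embedded EAN-13 scheme used on in-store labels: a store prefix, an item or transaction reference, the price in cents, and a check digit. The sketch below only builds the 13-digit value; the "02" prefix and the field widths are assumptions, and rendering a scannable barcode image would be delegated to a barcode library.

```python
def ean13_check_digit(digits12: str) -> str:
    """EAN-13 check digit: weight 1 for odd positions, 3 for even (left to right)."""
    total = sum(int(d) * (3 if i % 2 else 1) for i, d in enumerate(digits12))
    return str((10 - total % 10) % 10)

def total_price_barcode(total_cents: int, prefix: str = "02",
                        item_ref: str = "00001") -> str:
    """Pack a checkout total into a 13-digit, price-embedded EAN-13 value.
    Layout (illustrative): 2-digit store prefix, 5-digit reference,
    5-digit price in cents, 1 check digit."""
    if not (0 <= total_cents <= 99999):
        raise ValueError("total does not fit in the 5-digit price field")
    body = f"{prefix}{item_ref}{total_cents:05d}"   # 12 data digits
    return body + ean13_check_digit(body)

code = total_price_barcode(total_cents=1995)        # $19.95
print(code)   # 13-digit value a sales person's scanner can read from the display
```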
- compensation may be selectively received based on predetermined criteria.
- the compensation may include a cashback, a discount, a gift card, and so forth.
- the user may pay with a stored payment card by sending a request to make payment via an interface of the augmented reality eyeglass communication device.
- the payment card may include any credit or debit card.
- the augmented reality eyeglass communication device may connect to a wireless network of a merchant to receive information, receive digital coupons and offers to make a purchase, receive promotional offers and advertising, or for other purposes.
- promotional offers and advertising may be received from a merchant, a mobile payment service provider, a third party, and so forth.
- a digital receipt may be received by email.
- the digital receipt may contain detailed information on cashback, discount, and so forth.
- a remote order for home delivery of one or more unavailable products may be placed with a merchant.
- Another possible use of the augmented reality eyeglass communication device is accessing game and multimedia data.
- a user request to display the game and multimedia data or perform communication may be received, and the augmented reality eyeglass communication device may communicate, over a network, with a game and multimedia server to transfer game and multimedia data or with a communication server to transfer communication data.
- the transferred data may be displayed on a display of the augmented reality eyeglass communication device.
- a user command may be received and transferred to the game and multimedia server, the server may process the command and transfer data related to the processing to the augmented reality eyeglass communication device.
- the augmented reality eyeglass communication device may receive incoming communication data and notify the user about the incoming communication data. To notify the user, an audible sound may be generated. The sound may correspond to the incoming communication data. A user command may be received in response to the incoming communication data, and the incoming communication data may be processed according to the user command.
- the game and multimedia data or the incoming communication data may be transferred to a television set, a set-top box, a computer, a laptop, a smartphone, a wearable personal digital device, and so forth.
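Hand-off of the received data to one of the paired devices can be thought of as routing to a named sink, as in this small sketch; the sink names and the print statements stand in for real transports and are purely illustrative.

```python
from typing import Callable, Dict

# Each sink stands in for a paired device the content can be handed off to.
SINKS: Dict[str, Callable[[bytes], None]] = {
    "glasses_display": lambda data: print(f"rendering {len(data)} bytes on the eyeglass display"),
    "tv":              lambda data: print(f"casting {len(data)} bytes to the TV set"),
    "smartphone":      lambda data: print(f"handing {len(data)} bytes to the smartphone"),
}

def route(data: bytes, target: str = "glasses_display") -> None:
    """Deliver received game/multimedia or communication data to the chosen device."""
    SINKS.get(target, SINKS["glasses_display"])(data)

route(b"\x00" * 1024, target="tv")   # user chose to continue on the TV set
```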
- the augmented reality eyeglass communication device may be used to alert a driver and prevent the driver from falling asleep.
- the augmented reality eyeglass communication device may include a neuron sensor and a camera to detect the state of an eye of the driver (open or closed) by processing frontal or side views of face images taken by the camera to analyze slackening facial muscles, the blinking pattern, and the period of time the eyes stay closed between blinks. Once it is determined that the driver is asleep, an audible, voice, light, and/or vibration alarm may be generated.
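A simple way to turn the per-frame eye state into an alarm decision is to track both the current closed-eye run and the fraction of closed frames over a recent window (a PERCLOS-style measure). The frame rate, window length, and thresholds below are illustrative assumptions, not values from the patent.

```python
from collections import deque
from typing import Deque

class DrowsinessMonitor:
    """Track per-frame eye state from the camera and decide when to raise an alarm."""

    def __init__(self, fps: int = 30, max_closed_s: float = 1.0,
                 perclos_threshold: float = 0.4, window_s: float = 10.0):
        self.max_closed_frames = int(max_closed_s * fps)
        self.perclos_threshold = perclos_threshold
        self.history: Deque[bool] = deque(maxlen=int(window_s * fps))  # True = closed
        self.closed_run = 0

    def update(self, eyes_open: bool) -> bool:
        """Feed one frame's eye state; return True if an alarm should fire."""
        self.history.append(not eyes_open)
        self.closed_run = 0 if eyes_open else self.closed_run + 1
        perclos = sum(self.history) / len(self.history)  # fraction of closed frames
        return (self.closed_run >= self.max_closed_frames
                or perclos >= self.perclos_threshold)

monitor = DrowsinessMonitor()
frames = [True] * 60 + [False] * 40       # eyes closed for over a second at 30 fps
for eyes_open in frames:
    if monitor.update(eyes_open):
        print("alarm: sound / voice / light / vibration")
        break
```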
- the augmented reality eyeglass communication device may be used for personal navigation.
- the augmented reality eyeglass communication device may comprise a GPS unit to determine a geographical location of a user and a magnetic direction sensor to determine an orientation of a head of the user.
- the processor of the augmented reality eyeglass communication device may receive a destination or an itinerary, one or more geographical maps, the geographical location of the user, and the orientation of the head of the user, and generate navigation hints.
- the navigation hints may be provided to the user via a plurality of Light Emitting Diodes (LEDs).
- the LEDs may be disposed in a peripheral field of vision of the user and provide navigation hints by changing their color. For example, the LEDs located in the direction where the user needs to move to reach the destination or to follow the itinerary may have a green color, while the LEDs located in a wrong direction may have a red color.
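The LED selection can be sketched as: compute the bearing from the GPS position to the destination, subtract the head heading reported by the magnetic direction sensor, and light the LED closest to that relative direction. The eight-LED layout and the coordinates in the example are assumptions for illustration only.

```python
import math
from typing import List, Tuple

def bearing_deg(lat1: float, lon1: float, lat2: float, lon2: float) -> float:
    """Initial great-circle bearing from the user's position to the destination."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dlon = math.radians(lon2 - lon1)
    y = math.sin(dlon) * math.cos(phi2)
    x = math.cos(phi1) * math.sin(phi2) - math.sin(phi1) * math.cos(phi2) * math.cos(dlon)
    return (math.degrees(math.atan2(y, x)) + 360.0) % 360.0

def led_colors(user_pos: Tuple[float, float], destination: Tuple[float, float],
               head_heading_deg: float, n_leds: int = 8) -> List[str]:
    """Light the LED that points toward the destination green, the rest red.
    LED 0 is assumed to sit straight ahead, with the others spaced evenly
    around the periphery of the user's field of vision."""
    target = bearing_deg(*user_pos, *destination)
    relative = (target - head_heading_deg) % 360.0      # 0 degrees = straight ahead
    green_index = round(relative / (360.0 / n_leds)) % n_leds
    return ["green" if i == green_index else "red" for i in range(n_leds)]

print(led_colors(user_pos=(48.8566, 2.3522), destination=(48.8606, 2.3376),
                 head_heading_deg=90.0))
```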
- one or more geographical maps, the geographical location of the user, one or more messages, one or more alternative routes, one or more travel alerts, and so forth may be displayed on the display of the augmented reality eyeglass communication device.
- the augmented reality eyeglass communication device may receive user commands via a microphone.
- the augmented reality eyeglass communication device may comprise at least one electroencephalograph (EEG) sensor sensing one or more electrical impulses associated with the brain activity of the user.
- the electrical impulses may be translated in one or more commands.
- the electrical impulses may be used to detect and optimize brain fitness and performance of the user, and to measure and monitor cognitive health and well-being of the user.
- based on the electrical impulses, an undesired condition of the user may be detected and an alert associated with the undesired condition may be provided.
- the undesired condition may include chronic stress, anxiety, depression, aging, a decreasing estrogen level, an excess oxytocin level, prolonged cortisol secretion, and so forth.
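As a very rough sketch of how EEG impulses might be turned into a command or an alert, the code below estimates alpha- and beta-band power with an FFT and applies placeholder ratio thresholds. The bands, thresholds, and command names are illustrative assumptions (a deployed system would be calibrated per user), and the example presumes NumPy is available.

```python
import numpy as np

def band_power(signal: np.ndarray, fs: float, low: float, high: float) -> float:
    """Average spectral power of one EEG channel in the [low, high] Hz band."""
    freqs = np.fft.rfftfreq(signal.size, d=1.0 / fs)
    spectrum = np.abs(np.fft.rfft(signal)) ** 2
    mask = (freqs >= low) & (freqs <= high)
    return float(spectrum[mask].mean())

def interpret(signal: np.ndarray, fs: float = 256.0) -> str:
    """Coarse mapping of band-power ratios to a device command or an alert."""
    alpha = band_power(signal, fs, 8.0, 12.0)    # relaxation-related rhythm
    beta = band_power(signal, fs, 13.0, 30.0)    # concentration / arousal rhythm
    if beta > 3.0 * alpha:
        return "ALERT_POSSIBLE_STRESS"
    if alpha > 2.0 * beta:
        return "COMMAND_SELECT"                  # e.g. a deliberate relaxed "click"
    return "NO_COMMAND"

fs = 256.0
t = np.arange(0, 2.0, 1.0 / fs)
sample = np.sin(2 * np.pi * 10 * t) + 0.2 * np.random.randn(t.size)  # alpha-dominated test signal
print(interpret(sample, fs))
```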
- healthy lifestyle tips may be provided to the user via the augmented reality eyeglass communication device.
- the healthy lifestyle tips may be associated with mental stimulation, physical exercise, healthy nutrition, stress management, sleep, and so forth.
- FIG. 6 shows a diagrammatic representation of a machine in the example electronic form of a computer system 600, within which a set of instructions for causing the machine to perform any one or more of the methodologies discussed herein may be executed.
- the machine operates as a standalone device or may be connected (e.g., networked) to other machines.
- the machine may operate in the capacity of a server or a client machine in a server-client network environment, or as a peer machine in a peer-to-peer (or distributed) network environment.
- the machine may be a personal computer (PC), a tablet PC, a set-top box (STB), a Personal Digital Assistant (PDA), a cellular telephone, a portable music player (e.g., a portable hard drive audio device such as a Moving Picture Experts Group Audio Layer 3 (MP3) player), a web appliance, a network router, switch or bridge, or any machine capable of executing a set of instructions (sequential or otherwise) that specify actions to be taken by that machine.
- the term "machine" shall also be taken to include any collection of machines that individually or jointly execute a set (or multiple sets) of instructions to perform any one or more of the methodologies discussed herein.
- the example computer system 600 includes a processor or multiple processors 602 (e.g., a central processing unit (CPU), a graphics processing unit (GPU), or both), a main memory 604 and a static memory 606, which communicate with each other via a bus 608.
- the computer system 600 may further include a video display unit 610 (e.g., a liquid crystal display (LCD) or a cathode ray tube (CRT)).
- the computer system 600 may also include an alphanumeric input device 612 (e.g., a keyboard), a cursor control device 614 (e.g., a mouse), a disk drive unit 616, a signal generation device 618 (e.g., a speaker) and a network interface device 620.
- the disk drive unit 616 includes a computer-readable medium 622, on which is stored one or more sets of instructions and data structures (e.g., instructions 624) embodying or utilized by any one or more of the methodologies or functions described herein.
- the instructions 624 may also reside, completely or at least partially, within the main memory 604 and/or within the processors 602 during execution thereof by the computer system 600.
- the main memory 604 and the processors 602 may also constitute machine-readable media.
- the instructions 624 may further be transmitted or received over a network 626 via the network interface device 620 utilizing any one of a number of well-known transfer protocols (e.g., Hyper Text Transfer Protocol (HTTP)).
- while the computer-readable medium 622 is shown in an example embodiment to be a single medium, the term "computer-readable medium" should be taken to include a single medium or multiple media (e.g., a centralized or distributed database and/or associated caches and servers) that store the one or more sets of instructions.
- the term "computer-readable medium” shall also be taken to include any medium that is capable of storing, encoding, or carrying a set of instructions for execution by the machine and that causes the machine to perform any one or more of the methodologies of the present application, or that is capable of storing, encoding, or carrying data structures utilized by or associated with such a set of instructions.
- the term "computer-readable medium" shall accordingly be taken to include, but not be limited to, solid-state memories, optical and magnetic media, and carrier wave signals. Such media may also include, without limitation, hard disks, floppy disks, flash memory cards, digital video disks, random access memories (RAMs), read only memories (ROMs), and the like.
- the example embodiments described herein may be implemented in an operating environment comprising software installed on a computer, in hardware, or in a combination of software and hardware.
Abstract
The invention relates to an augmented reality eyeglass communication device and to a method for facilitating shopping using an augmented reality eyeglass communication device. The augmented reality eyeglass communication device may comprise: a frame and right and left eyeglass temples connected to the frame; a processor configured to receive one or more commands of a user, perform operations associated with the commands of the user, receive product information and process the product information; a display connected to the frame and configured to display data received from the processor; a transceiver electrically connected to the processor and configured to receive and transmit data over a wireless network; a SIM card slot, a camera, an earphone, a microphone and a charging unit.
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US13/973,146 (US9153074B2) | 2011-07-18 | 2013-08-22 | Wearable augmented reality eyeglass communication device including mobile phone and mobile computing via virtual touch screen gesture control and neuron command |
US13/973,146 | 2013-08-22 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2015025251A1 (fr) | 2015-02-26 |
Family
ID=52483128
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/IB2014/063914 (WO2015025251A1) | Wearable augmented reality eyeglass communication device including mobile phone and mobile computing via virtual touch screen gesture control and neuron command | 2013-08-22 | 2014-08-14 |
Country Status (2)
Country | Link |
---|---|
US (1) | US9153074B2 (fr) |
WO (1) | WO2015025251A1 (fr) |
Families Citing this family (327)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US9811818B1 (en) * | 2002-10-01 | 2017-11-07 | World Award Academy, World Award Foundation, Amobilepay, Inc. | Wearable personal digital device for facilitating mobile device payments and personal use |
US20130018715A1 (en) * | 2011-07-18 | 2013-01-17 | Tiger T G Zhou | Facilitating mobile device payments using product code scanning to enable self checkout |
US9100493B1 (en) * | 2011-07-18 | 2015-08-04 | Andrew H B Zhou | Wearable personal digital device for facilitating mobile device payments and personal use |
US9153074B2 (en) * | 2011-07-18 | 2015-10-06 | Dylan T X Zhou | Wearable augmented reality eyeglass communication device including mobile phone and mobile computing via virtual touch screen gesture control and neuron command |
US9704154B2 (en) * | 2002-10-01 | 2017-07-11 | World Award Academy, World Award Foundation, Amobilepay, Inc. | Wearable personal digital device for facilitating mobile device payments and personal use |
US7771320B2 (en) | 2006-09-07 | 2010-08-10 | Nike, Inc. | Athletic performance sensing and/or tracking systems and methods |
US20130141313A1 (en) * | 2011-07-18 | 2013-06-06 | Tiger T.G. Zhou | Wearable personal digital eyeglass device |
US20130179307A1 (en) * | 2012-01-10 | 2013-07-11 | Thermo Fisher Scientific Inc. | Methods And Systems For Restocking Inventory |
US8918208B1 (en) * | 2012-02-07 | 2014-12-23 | Ryan Hickman | Projection of interactive map data |
US10965164B2 (en) | 2012-07-06 | 2021-03-30 | Energous Corporation | Systems and methods of wirelessly delivering power to a receiver device |
US10256657B2 (en) | 2015-12-24 | 2019-04-09 | Energous Corporation | Antenna having coaxial structure for near field wireless power charging |
US12057715B2 (en) | 2012-07-06 | 2024-08-06 | Energous Corporation | Systems and methods of wirelessly delivering power to a wireless-power receiver device in response to a change of orientation of the wireless-power receiver device |
US10312715B2 (en) * | 2015-09-16 | 2019-06-04 | Energous Corporation | Systems and methods for wireless power charging |
US11502551B2 (en) | 2012-07-06 | 2022-11-15 | Energous Corporation | Wirelessly charging multiple wireless-power receivers using different subsets of an antenna array to focus energy at different locations |
US9867062B1 (en) | 2014-07-21 | 2018-01-09 | Energous Corporation | System and methods for using a remote server to authorize a receiving device that has requested wireless power and to determine whether another receiving device should request wireless power in a wireless power transmission system |
US10992187B2 (en) | 2012-07-06 | 2021-04-27 | Energous Corporation | System and methods of using electromagnetic waves to wirelessly deliver power to electronic devices |
US10992185B2 (en) | 2012-07-06 | 2021-04-27 | Energous Corporation | Systems and methods of using electromagnetic waves to wirelessly deliver power to game controllers |
US10381880B2 (en) | 2014-07-21 | 2019-08-13 | Energous Corporation | Integrated antenna structure arrays for wireless power transmission |
US10439448B2 (en) | 2014-08-21 | 2019-10-08 | Energous Corporation | Systems and methods for automatically testing the communication between wireless power transmitter and wireless power receiver |
US9876394B1 (en) | 2014-05-07 | 2018-01-23 | Energous Corporation | Boost-charger-boost system for enhanced power delivery |
US9787103B1 (en) | 2013-08-06 | 2017-10-10 | Energous Corporation | Systems and methods for wirelessly delivering power to electronic devices that are unable to communicate with a transmitter |
US10063105B2 (en) | 2013-07-11 | 2018-08-28 | Energous Corporation | Proximity transmitters for wireless power charging systems |
US9632683B2 (en) * | 2012-11-08 | 2017-04-25 | Nokia Technologies Oy | Methods, apparatuses and computer program products for manipulating characteristics of audio objects by using directional gestures |
KR20140073237A (ko) * | 2012-12-06 | 2014-06-16 | 삼성전자주식회사 | 디스플레이 장치 및 디스플레이 방법 |
US9547917B2 (en) | 2013-03-14 | 2017-01-17 | Paypay, Inc. | Using augmented reality to determine information |
US9383819B2 (en) | 2013-06-03 | 2016-07-05 | Daqri, Llc | Manipulation of virtual object in augmented reality via intent |
US9354702B2 (en) | 2013-06-03 | 2016-05-31 | Daqri, Llc | Manipulation of virtual object in augmented reality via thought |
US9874749B2 (en) * | 2013-11-27 | 2018-01-23 | Magic Leap, Inc. | Virtual and augmented reality systems and methods |
EP2818948B1 (fr) * | 2013-06-27 | 2016-11-16 | ABB Schweiz AG | Procédé et dispositif de présentation de données pour aider un utilisateur à distance à fournir des instructions |
US10289987B1 (en) * | 2018-04-10 | 2019-05-14 | Patricia A. Walker | Banking system using a wearable device for delivering virtual currency |
US10825004B1 (en) * | 2018-04-10 | 2020-11-03 | Patricia A. Walker | Banking system using a wearable device for delivering virtual currency |
KR102165818B1 (ko) | 2013-09-10 | 2020-10-14 | 삼성전자주식회사 | 입력 영상을 이용한 사용자 인터페이스 제어 방법, 장치 및 기록매체 |
US9405366B2 (en) | 2013-10-02 | 2016-08-02 | David Lee SEGAL | Systems and methods for using imagined directions to define an action, function or execution for non-tactile devices |
US9581816B2 (en) * | 2013-11-05 | 2017-02-28 | Mutualink, Inc. | Digital glass enhanced media system |
KR102091519B1 (ko) * | 2013-11-05 | 2020-03-20 | 엘지전자 주식회사 | 이동 단말기 및 이의 제어방법 |
US9151953B2 (en) | 2013-12-17 | 2015-10-06 | Amazon Technologies, Inc. | Pointer tracking for eye-level scanners and displays |
US9558592B2 (en) | 2013-12-31 | 2017-01-31 | Daqri, Llc | Visualization of physical interactions in augmented reality |
US9626801B2 (en) * | 2013-12-31 | 2017-04-18 | Daqri, Llc | Visualization of physical characteristics in augmented reality |
US20150186708A1 (en) * | 2013-12-31 | 2015-07-02 | Sagi Katz | Biometric identification system |
JP6851133B2 (ja) * | 2014-01-03 | 2021-03-31 | ハーマン インターナショナル インダストリーズ インコーポレイテッド | ユーザに方向付けられた個人情報アシスタント |
FR3016229B1 (fr) | 2014-01-07 | 2016-02-05 | Systemes Et Technologies Identification Stid | Lecteur de controle d’acces et module complementaire de controle |
US10019149B2 (en) | 2014-01-07 | 2018-07-10 | Toshiba Global Commerce Solutions Holdings Corporation | Systems and methods for implementing retail processes based on machine-readable images and user gestures |
US9910501B2 (en) | 2014-01-07 | 2018-03-06 | Toshiba Global Commerce Solutions Holdings Corporation | Systems and methods for implementing retail processes based on machine-readable images and user gestures |
FR3016228B1 (fr) * | 2014-01-07 | 2016-02-05 | Systemes Et Technologies Identification Stid | Lecteur de controle d’acces avec dispositif de detection d’ouverture |
CN104794733B (zh) * | 2014-01-20 | 2018-05-08 | 株式会社理光 | 对象跟踪方法和装置 |
WO2015123771A1 (fr) * | 2014-02-18 | 2015-08-27 | Sulon Technologies Inc. | Suivi de gestes et commande en réalité augmentée et virtuelle |
US9615177B2 (en) | 2014-03-06 | 2017-04-04 | Sphere Optics Company, Llc | Wireless immersive experience capture and viewing |
US10394330B2 (en) | 2014-03-10 | 2019-08-27 | Qualcomm Incorporated | Devices and methods for facilitating wireless communications based on implicit user cues |
JP6689203B2 (ja) | 2014-03-19 | 2020-04-28 | インテュイティブ サージカル オペレーションズ, インコーポレイテッド | 立体ビューワのための視線追跡を統合する医療システム |
JP6644699B2 (ja) * | 2014-03-19 | 2020-02-12 | インテュイティブ サージカル オペレーションズ, インコーポレイテッド | 視線追跡を使用する医療装置、システム、及び方法 |
KR102469752B1 (ko) * | 2014-07-31 | 2022-11-22 | 삼성전자주식회사 | 웨어러블 장치 및 그 제어 방법 |
US11328334B1 (en) * | 2014-04-30 | 2022-05-10 | United Services Automobile Association (Usaa) | Wearable electronic devices for automated shopping and budgeting with a wearable sensor |
US9501871B2 (en) | 2014-04-30 | 2016-11-22 | At&T Mobility Ii Llc | Explorable augmented reality displays |
US10158257B2 (en) | 2014-05-01 | 2018-12-18 | Energous Corporation | System and methods for using sound waves to wirelessly deliver power to electronic devices |
EP3143474B1 (fr) * | 2014-05-15 | 2020-10-07 | Federal Express Corporation | Dispositifs pouvant être portés pour un traitement de courrier et leurs procédés d'utilisation |
US9958947B2 (en) | 2014-06-25 | 2018-05-01 | Comcast Cable Communications, Llc | Ocular focus sharing for digital content |
US10068703B1 (en) | 2014-07-21 | 2018-09-04 | Energous Corporation | Integrated miniature PIFA with artificial magnetic conductor metamaterials |
EP2977855B1 (fr) * | 2014-07-23 | 2019-08-28 | Wincor Nixdorf International GmbH | Clavier virtuel et procédé de saisie pour un clavier virtuel |
US9576329B2 (en) * | 2014-07-31 | 2017-02-21 | Ciena Corporation | Systems and methods for equipment installation, configuration, maintenance, and personnel training |
WO2016018044A1 (fr) * | 2014-07-31 | 2016-02-04 | Samsung Electronics Co., Ltd. | Dispositif portable et son procédé de commande |
WO2016022008A1 (fr) * | 2014-08-08 | 2016-02-11 | Samsung Electronics Co., Ltd. | Procédé et appareil de génération de profil environnemental |
KR102243235B1 (ko) * | 2014-08-14 | 2021-04-22 | 삼성전자주식회사 | 전자 장치, 그 제어 방법, 기록 매체 및 상기 전자 장치와 연동되는 이어잭 단자 캡 |
KR101524575B1 (ko) * | 2014-08-20 | 2015-06-03 | 박준호 | 웨어러블 디바이스 |
US10617342B2 (en) | 2014-09-05 | 2020-04-14 | Vision Service Plan | Systems, apparatus, and methods for using a wearable device to monitor operator alertness |
US10448867B2 (en) * | 2014-09-05 | 2019-10-22 | Vision Service Plan | Wearable gait monitoring apparatus, systems, and related methods |
US9633497B2 (en) * | 2014-09-05 | 2017-04-25 | Shenzhen Mindray Bio-Medical Electronics Co., Ltd. | Systems and methods for medical monitoring device gesture control lockout |
US10261674B2 (en) | 2014-09-05 | 2019-04-16 | Microsoft Technology Licensing, Llc | Display-efficient text entry and editing |
US11918375B2 (en) | 2014-09-05 | 2024-03-05 | Beijing Zitiao Network Technology Co., Ltd. | Wearable environmental pollution monitor computer apparatus, systems, and related methods |
US9996874B2 (en) | 2014-09-11 | 2018-06-12 | Oracle International Corporation | Character personal shopper system |
US10725533B2 (en) * | 2014-09-26 | 2020-07-28 | Intel Corporation | Systems, apparatuses, and methods for gesture recognition and interaction |
US10263967B2 (en) * | 2015-09-01 | 2019-04-16 | Quantum Interface, Llc | Apparatuses, systems and methods for constructing unique identifiers |
US9857590B2 (en) * | 2014-10-11 | 2018-01-02 | Floyd Steven Hall, Jr. | Method and system for press-on displays for fashionable eyewear and nosewear |
US9720259B2 (en) * | 2014-10-13 | 2017-08-01 | William Hart | Eyewear pupilometer |
WO2016060461A1 (fr) | 2014-10-15 | 2016-04-21 | Jun Ho Park | Dispositif portable |
GB2532192A (en) | 2014-10-29 | 2016-05-18 | Ibm | Secure pairing of personal device with host device |
WO2016066470A1 (fr) | 2014-10-30 | 2016-05-06 | Philips Lighting Holding B.V. | Contrôle de la sortie d'informations contextuelles à l'aide d'un dispositif informatique |
FR3028980B1 (fr) * | 2014-11-20 | 2017-01-13 | Oberthur Technologies | Procede et dispositif d'authentification d'un utilisateur |
JP2016110590A (ja) * | 2014-12-10 | 2016-06-20 | コニカミノルタ株式会社 | 画像処理装置、データ登録方法およびデータ登録プログラム |
US10379357B2 (en) | 2015-01-08 | 2019-08-13 | Shai Goldstein | Apparatus and method for displaying content |
US20160204839A1 (en) * | 2015-01-12 | 2016-07-14 | Futurewei Technologies, Inc. | Multi-band Antenna for Wearable Glasses |
US10050868B2 (en) | 2015-01-16 | 2018-08-14 | Sri International | Multimodal help agent for network administrator |
US10291653B2 (en) * | 2015-01-16 | 2019-05-14 | Sri International | Visually intuitive interactive network management |
US10215568B2 (en) | 2015-01-30 | 2019-02-26 | Vision Service Plan | Systems and methods for tracking motion, performance, and other data for an individual such as a winter sports athlete |
WO2016140643A1 (fr) * | 2015-03-02 | 2016-09-09 | Hewlett-Packard Development Company, L.P. | Projection d'un affichage virtuel |
WO2016141373A1 (fr) | 2015-03-05 | 2016-09-09 | Magic Leap, Inc. | Systèmes et procédés de réalité augmentée |
US10180734B2 (en) | 2015-03-05 | 2019-01-15 | Magic Leap, Inc. | Systems and methods for augmented reality |
US10838207B2 (en) * | 2015-03-05 | 2020-11-17 | Magic Leap, Inc. | Systems and methods for augmented reality |
US10921896B2 (en) | 2015-03-16 | 2021-02-16 | Facebook Technologies, Llc | Device interaction in augmented reality |
US9934443B2 (en) | 2015-03-31 | 2018-04-03 | Daon Holdings Limited | Methods and systems for detecting head motion during an authentication transaction |
US10360617B2 (en) | 2015-04-24 | 2019-07-23 | Walmart Apollo, Llc | Automated shopping apparatus and method in response to consumption |
US9690374B2 (en) * | 2015-04-27 | 2017-06-27 | Google Inc. | Virtual/augmented reality transition system and method |
KR102393228B1 (ko) | 2015-05-11 | 2022-04-29 | 매직 립, 인코포레이티드 | 뉴럴 네트워크들을 활용하여 생체 인증 사용자 인식을 위한 디바이스들, 방법들 및 시스템들 |
KR20160133972A (ko) * | 2015-05-14 | 2016-11-23 | 엘지전자 주식회사 | 결제 정보와 관련된 결제 과정의 진행을 디스플레이부에 디스플레이할 수 있는 착용형 디스플레이 디바이스 및 그 제어 방법 |
US9824572B2 (en) * | 2015-05-21 | 2017-11-21 | Donald J Arndt | System, method, and computer program product for locating lost or stolen items |
US10642349B2 (en) * | 2015-05-21 | 2020-05-05 | Sony Interactive Entertainment Inc. | Information processing apparatus |
IL239191A0 (en) * | 2015-06-03 | 2015-11-30 | Amir B Geva | Image sorting system |
US10825049B2 (en) | 2015-06-09 | 2020-11-03 | Visa International Service Association | Virtual reality and augmented reality systems and methods to generate mobile alerts |
US20160374616A1 (en) * | 2015-06-24 | 2016-12-29 | Daqri, Llc | Electrode contact quality |
US10198620B2 (en) | 2015-07-06 | 2019-02-05 | Accenture Global Services Limited | Augmented reality based component replacement and maintenance |
US20170013107A1 (en) * | 2015-07-07 | 2017-01-12 | Rodney J. Adams | Sky zero |
CN105046635B (zh) * | 2015-07-08 | 2018-05-22 | 国家电网公司 | 一种基于谷歌眼镜的智能变电站设备识别系统及方法 |
US10685488B1 (en) * | 2015-07-17 | 2020-06-16 | Naveen Kumar | Systems and methods for computer assisted operation |
US9854372B2 (en) | 2015-08-29 | 2017-12-26 | Bragi GmbH | Production line PCB serial programming and testing method and system |
US9905088B2 (en) | 2015-08-29 | 2018-02-27 | Bragi GmbH | Responsive visual communication system and method |
US9949013B2 (en) | 2015-08-29 | 2018-04-17 | Bragi GmbH | Near field gesture control system and method |
US9949008B2 (en) | 2015-08-29 | 2018-04-17 | Bragi GmbH | Reproduction of ambient environmental sound for acoustic transparency of ear canal device system and method |
US9843853B2 (en) | 2015-08-29 | 2017-12-12 | Bragi GmbH | Power control for battery powered personal area network device system and method |
US9972895B2 (en) | 2015-08-29 | 2018-05-15 | Bragi GmbH | Antenna for use in a wearable device |
US10122421B2 (en) | 2015-08-29 | 2018-11-06 | Bragi GmbH | Multimodal communication system using induction and radio and method |
KR20170028130A (ko) | 2015-09-03 | 2017-03-13 | 박준호 | 웨어러블 디바이스 |
US9298283B1 (en) | 2015-09-10 | 2016-03-29 | Connectivity Labs Inc. | Sedentary virtual reality method and systems |
US10523033B2 (en) | 2015-09-15 | 2019-12-31 | Energous Corporation | Receiver devices configured to determine location within a transmission field |
US11710321B2 (en) | 2015-09-16 | 2023-07-25 | Energous Corporation | Systems and methods of object detection in wireless power charging systems |
US10778041B2 (en) | 2015-09-16 | 2020-09-15 | Energous Corporation | Systems and methods for generating power waves in a wireless power transmission system |
JP2017072887A (ja) | 2015-10-05 | 2017-04-13 | 株式会社東芝 | 情報処理システム |
US10734717B2 (en) | 2015-10-13 | 2020-08-04 | Energous Corporation | 3D ceramic mold antenna |
US9549174B1 (en) * | 2015-10-14 | 2017-01-17 | Zspace, Inc. | Head tracked stereoscopic display system that uses light field type data |
US10506322B2 (en) | 2015-10-20 | 2019-12-10 | Bragi GmbH | Wearable device onboard applications system and method |
US9866941B2 (en) | 2015-10-20 | 2018-01-09 | Bragi GmbH | Multi-point multiple sensor array for data sensing and processing system and method |
US10104458B2 (en) | 2015-10-20 | 2018-10-16 | Bragi GmbH | Enhanced biometric control systems for detection of emergency events system and method |
US9980189B2 (en) | 2015-10-20 | 2018-05-22 | Bragi GmbH | Diversity bluetooth system and method |
US10063108B1 (en) | 2015-11-02 | 2018-08-28 | Energous Corporation | Stamped three-dimensional antenna |
US10027180B1 (en) | 2015-11-02 | 2018-07-17 | Energous Corporation | 3D triple linear antenna that acts as heat sink |
US9857874B2 (en) | 2015-11-03 | 2018-01-02 | Chunghwa Picture Tubes, Ltd. | Augmented reality system and augmented reality interaction method |
KR101774661B1 (ko) * | 2015-11-11 | 2017-09-04 | 현대자동차주식회사 | 운전자세 제어장치 및 방법 |
ITUB20156053A1 (it) * | 2015-11-11 | 2017-05-11 | Mariano Pisetta | Didattica digitale interattiva su dispositivi a realta' aumentata |
CN108604383A (zh) | 2015-12-04 | 2018-09-28 | 奇跃公司 | 重新定位系统和方法 |
US9939891B2 (en) | 2015-12-21 | 2018-04-10 | Bragi GmbH | Voice dictation systems using earpiece microphone system and method |
US9980033B2 (en) | 2015-12-21 | 2018-05-22 | Bragi GmbH | Microphone natural speech capture voice dictation system and method |
US10027159B2 (en) | 2015-12-24 | 2018-07-17 | Energous Corporation | Antenna for transmitting wireless power signals |
US10038332B1 (en) | 2015-12-24 | 2018-07-31 | Energous Corporation | Systems and methods of wireless power charging through multiple receiving devices |
US11863001B2 (en) | 2015-12-24 | 2024-01-02 | Energous Corporation | Near-field antenna for wireless power transmission with antenna elements that follow meandering patterns |
US10079515B2 (en) | 2016-12-12 | 2018-09-18 | Energous Corporation | Near-field RF charging pad with multi-band antenna element with adaptive loading to efficiently charge an electronic device at any position on the pad |
US10200790B2 (en) * | 2016-01-15 | 2019-02-05 | Bragi GmbH | Earpiece with cellular connectivity |
CN105487232A (zh) * | 2016-01-18 | 2016-04-13 | 京东方科技集团股份有限公司 | 一种智能穿戴设备 |
CA3011552A1 (fr) | 2016-01-19 | 2017-07-27 | Walmart Apollo, Llc | Systeme de commande d'article utilisable |
US10803145B2 (en) | 2016-02-05 | 2020-10-13 | The Intellectual Property Network, Inc. | Triggered responses based on real-time electroencephalography |
US10085091B2 (en) | 2016-02-09 | 2018-09-25 | Bragi GmbH | Ambient volume modification through environmental microphone feedback loop system and method |
JP6889728B2 (ja) | 2016-03-11 | 2021-06-18 | マジック リープ, インコーポレイテッドMagic Leap,Inc. | 畳み込みニューラルネットワークにおける構造学習 |
US10085082B2 (en) | 2016-03-11 | 2018-09-25 | Bragi GmbH | Earpiece with GPS receiver |
US20170259167A1 (en) * | 2016-03-14 | 2017-09-14 | Nathan Sterling Cook | Brainwave virtual reality apparatus and method |
US10045116B2 (en) | 2016-03-14 | 2018-08-07 | Bragi GmbH | Explosive sound pressure level active noise cancellation utilizing completely wireless earpieces system and method |
US10867314B2 (en) | 2016-03-22 | 2020-12-15 | Magic Leap, Inc. | Head mounted display system configured to exchange biometric information |
US10052065B2 (en) | 2016-03-23 | 2018-08-21 | Bragi GmbH | Earpiece life monitor with capability of automatic notification system and method |
US10015579B2 (en) | 2016-04-08 | 2018-07-03 | Bragi GmbH | Audio accelerometric feedback through bilateral ear worn device system and method |
US10013542B2 (en) | 2016-04-28 | 2018-07-03 | Bragi GmbH | Biometric interface system and method |
US10638316B2 (en) | 2016-05-25 | 2020-04-28 | Intel Corporation | Wearable computer apparatus with same hand user authentication |
US10888039B2 (en) | 2016-07-06 | 2021-01-05 | Bragi GmbH | Shielded case for wireless earpieces |
US10555700B2 (en) | 2016-07-06 | 2020-02-11 | Bragi GmbH | Combined optical sensor for audio and pulse oximetry system and method |
US11085871B2 (en) | 2016-07-06 | 2021-08-10 | Bragi GmbH | Optical vibration detection system and method |
US10201309B2 (en) | 2016-07-06 | 2019-02-12 | Bragi GmbH | Detection of physiological data using radar/lidar of wireless earpieces |
US10045110B2 (en) | 2016-07-06 | 2018-08-07 | Bragi GmbH | Selective sound field environment processing system and method |
US10582328B2 (en) | 2016-07-06 | 2020-03-03 | Bragi GmbH | Audio response based on user worn microphones to direct or adapt program responses system and method |
US10216474B2 (en) | 2016-07-06 | 2019-02-26 | Bragi GmbH | Variable computing engine for interactive media based upon user biometrics |
US10158934B2 (en) | 2016-07-07 | 2018-12-18 | Bragi GmbH | Case for multiple earpiece pairs |
US10621583B2 (en) | 2016-07-07 | 2020-04-14 | Bragi GmbH | Wearable earpiece multifactorial biometric analysis system and method |
US10165350B2 (en) | 2016-07-07 | 2018-12-25 | Bragi GmbH | Earpiece with app environment |
US10516930B2 (en) | 2016-07-07 | 2019-12-24 | Bragi GmbH | Comparative analysis of sensors to control power status for wireless earpieces |
US10587943B2 (en) | 2016-07-09 | 2020-03-10 | Bragi GmbH | Earpiece with wirelessly recharging battery |
US10649211B2 (en) | 2016-08-02 | 2020-05-12 | Magic Leap, Inc. | Fixed-distance virtual and augmented reality systems and methods |
US10397686B2 (en) | 2016-08-15 | 2019-08-27 | Bragi GmbH | Detection of movement adjacent an earpiece device |
US11254327B2 (en) | 2016-08-16 | 2022-02-22 | Ford Global Technologies, Llc | Methods and apparatus to present anticipated vehicle maneuvers to a passenger |
US10977348B2 (en) | 2016-08-24 | 2021-04-13 | Bragi GmbH | Digital signature using phonometry and compiled biometric data system and method |
US10409091B2 (en) | 2016-08-25 | 2019-09-10 | Bragi GmbH | Wearable with lenses |
US10104464B2 (en) | 2016-08-25 | 2018-10-16 | Bragi GmbH | Wireless earpiece and smart glasses system and method |
US11200026B2 (en) | 2016-08-26 | 2021-12-14 | Bragi GmbH | Wireless earpiece with a passive virtual assistant |
US10887679B2 (en) | 2016-08-26 | 2021-01-05 | Bragi GmbH | Earpiece for audiograms |
US11086593B2 (en) | 2016-08-26 | 2021-08-10 | Bragi GmbH | Voice assistant for wireless earpieces |
US10313779B2 (en) | 2016-08-26 | 2019-06-04 | Bragi GmbH | Voice assistant system for wireless earpieces |
US10200780B2 (en) | 2016-08-29 | 2019-02-05 | Bragi GmbH | Method and apparatus for conveying battery life of wireless earpiece |
WO2018044711A1 (fr) * | 2016-08-31 | 2018-03-08 | Wal-Mart Stores, Inc. | Systèmes et procédés permettant de faire des achats au détail tout en désactivant des composants sur la base d'un emplacement |
US10303865B2 (en) | 2016-08-31 | 2019-05-28 | Redrock Biometrics, Inc. | Blue/violet light touchless palm print identification |
US11490858B2 (en) | 2016-08-31 | 2022-11-08 | Bragi GmbH | Disposable sensor array wearable device sleeve system and method |
US10580282B2 (en) | 2016-09-12 | 2020-03-03 | Bragi GmbH | Ear based contextual environment and biometric pattern recognition system and method |
US10598506B2 (en) | 2016-09-12 | 2020-03-24 | Bragi GmbH | Audio navigation using short range bilateral earpieces |
US10852829B2 (en) | 2016-09-13 | 2020-12-01 | Bragi GmbH | Measurement of facial muscle EMG potentials for predictive analysis using a smart wearable system and method |
CN106355479A (zh) * | 2016-09-22 | 2017-01-25 | 京东方科技集团股份有限公司 | 一种虚拟试衣方法、虚拟试衣眼镜及虚拟试衣系统 |
US11283742B2 (en) | 2016-09-27 | 2022-03-22 | Bragi GmbH | Audio-based social media platform |
CN106408285A (zh) * | 2016-09-30 | 2017-02-15 | 中国银联股份有限公司 | 基于ar技术的支付方法以及支付系统 |
US10460095B2 (en) | 2016-09-30 | 2019-10-29 | Bragi GmbH | Earpiece with biometric identifiers |
US10049184B2 (en) | 2016-10-07 | 2018-08-14 | Bragi GmbH | Software application transmission via body interface using a wearable device in conjunction with removable body sensor arrays system and method |
US10585939B2 (en) * | 2016-10-11 | 2020-03-10 | International Business Machines Corporation | Real time object description service integrated with knowledge center on augmented reality (AR) and virtual reality (VR) devices |
SG10201608646SA (en) | 2016-10-14 | 2018-05-30 | Mastercard Asia Pacific Pte Ltd | Augmented Reality Device and Method For Product Purchase Facilitation |
KR102690172B1 (ko) * | 2016-10-17 | 2024-08-01 | 엘지전자 주식회사 | Hmd 디바이스 |
KR102662708B1 (ko) * | 2016-10-17 | 2024-05-03 | 엘지전자 주식회사 | Hmd 디바이스 |
CN110084089A (zh) * | 2016-10-26 | 2019-08-02 | 奥康科技有限公司 | 用于分析图像和提供反馈的可佩戴设备和方法 |
WO2018079167A1 (fr) * | 2016-10-27 | 2018-05-03 | ソニー株式会社 | Appareil de traitement d'informations, système de traitement d'informations, procédé de traitement d'informations et programme |
US10276028B2 (en) * | 2016-10-28 | 2019-04-30 | Matthew Bronniman | Eyeglass tracking system and method |
US10771877B2 (en) | 2016-10-31 | 2020-09-08 | Bragi GmbH | Dual earpieces for same ear |
US10455313B2 (en) | 2016-10-31 | 2019-10-22 | Bragi GmbH | Wireless earpiece with force feedback |
US10698983B2 (en) | 2016-10-31 | 2020-06-30 | Bragi GmbH | Wireless earpiece with a medical engine |
US10942701B2 (en) | 2016-10-31 | 2021-03-09 | Bragi GmbH | Input and edit functions utilizing accelerometer based earpiece movement system and method |
US10117604B2 (en) | 2016-11-02 | 2018-11-06 | Bragi GmbH | 3D sound positioning with distributed sensors |
US10617297B2 (en) | 2016-11-02 | 2020-04-14 | Bragi GmbH | Earpiece with in-ear electrodes |
US10225638B2 (en) | 2016-11-03 | 2019-03-05 | Bragi GmbH | Ear piece with pseudolite connectivity |
US10821361B2 (en) | 2016-11-03 | 2020-11-03 | Bragi GmbH | Gaming with earpiece 3D audio |
US10062373B2 (en) | 2016-11-03 | 2018-08-28 | Bragi GmbH | Selective audio isolation from body generated sound system and method |
US10923954B2 (en) | 2016-11-03 | 2021-02-16 | Energous Corporation | Wireless power receiver with a synchronous rectifier |
US10205814B2 (en) | 2016-11-03 | 2019-02-12 | Bragi GmbH | Wireless earpiece with walkie-talkie functionality |
US10045117B2 (en) | 2016-11-04 | 2018-08-07 | Bragi GmbH | Earpiece with modified ambient environment over-ride function |
US10058282B2 (en) | 2016-11-04 | 2018-08-28 | Bragi GmbH | Manual operation assistance with earpiece with 3D sound cues |
US10063957B2 (en) | 2016-11-04 | 2018-08-28 | Bragi GmbH | Earpiece with source selection within ambient environment |
US10045112B2 (en) | 2016-11-04 | 2018-08-07 | Bragi GmbH | Earpiece with added ambient environment |
US10636063B1 (en) | 2016-11-08 | 2020-04-28 | Wells Fargo Bank, N.A. | Method for an augmented reality value advisor |
US10158634B2 (en) | 2016-11-16 | 2018-12-18 | Bank Of America Corporation | Remote document execution and network transfer using augmented reality display devices |
US10212157B2 (en) | 2016-11-16 | 2019-02-19 | Bank Of America Corporation | Facilitating digital data transfers using augmented reality display devices |
TWI587206B (zh) * | 2016-11-24 | 2017-06-11 | 財團法人工業技術研究院 | 互動顯示裝置及系統 |
WO2018098436A1 (fr) | 2016-11-28 | 2018-05-31 | Spy Eye, Llc | Dispositif d'affichage non obstruant monté sur l'oeil |
US10943229B2 (en) | 2016-11-29 | 2021-03-09 | Bank Of America Corporation | Augmented reality headset and digital wallet |
US10339583B2 (en) | 2016-11-30 | 2019-07-02 | Bank Of America Corporation | Object recognition and analysis using augmented reality user devices |
US10600111B2 (en) | 2016-11-30 | 2020-03-24 | Bank Of America Corporation | Geolocation notifications using augmented reality user devices |
US10685386B2 (en) | 2016-11-30 | 2020-06-16 | Bank Of America Corporation | Virtual assessments using augmented reality user devices |
US10481862B2 (en) | 2016-12-02 | 2019-11-19 | Bank Of America Corporation | Facilitating network security analysis using virtual reality display devices |
US10586220B2 (en) | 2016-12-02 | 2020-03-10 | Bank Of America Corporation | Augmented reality dynamic authentication |
US20180158243A1 (en) * | 2016-12-02 | 2018-06-07 | Google Inc. | Collaborative manipulation of objects in virtual reality |
US10311223B2 (en) | 2016-12-02 | 2019-06-04 | Bank Of America Corporation | Virtual reality dynamic authentication |
US10607230B2 (en) | 2016-12-02 | 2020-03-31 | Bank Of America Corporation | Augmented reality dynamic authentication for electronic transactions |
US10109095B2 (en) | 2016-12-08 | 2018-10-23 | Bank Of America Corporation | Facilitating dynamic across-network location determination using augmented reality display devices |
US10109096B2 (en) | 2016-12-08 | 2018-10-23 | Bank Of America Corporation | Facilitating dynamic across-network location determination using augmented reality display devices |
CN116455101A (zh) | 2016-12-12 | 2023-07-18 | 艾诺格思公司 | 发射器集成电路 |
US10210767B2 (en) | 2016-12-13 | 2019-02-19 | Bank Of America Corporation | Real world gamification using augmented reality user devices |
US10217375B2 (en) | 2016-12-13 | 2019-02-26 | Bank Of America Corporation | Virtual behavior training using augmented reality user devices |
WO2018122709A1 (fr) * | 2016-12-26 | 2018-07-05 | Xing Zhou | Dispositif de communication portable à lunettes à réalité augmentée comprenant un téléphone mobile et un dispositif informatique mobile contrôlé par un geste tactile virtuel et une commande neuronale |
KR20180075843A (ko) * | 2016-12-27 | 2018-07-05 | 러브투트레일 주식회사 | 증강현실을 이용한 경로 안내 장치 및 이를 이용한 경로 안내 방법 |
US10506327B2 (en) | 2016-12-27 | 2019-12-10 | Bragi GmbH | Ambient environmental sound field manipulation based on user defined voice and audio recognition pattern analysis system and method |
WO2018127782A1 (fr) * | 2017-01-03 | 2018-07-12 | Xing Zhou | Dispositif de communication portable à lunettes à réalité augmentée comprenant un téléphone mobile et un dispositif informatique mobile contrôlé par un geste tactile virtuel et une commande neuronale |
US10680319B2 (en) | 2017-01-06 | 2020-06-09 | Energous Corporation | Devices and methods for reducing mutual coupling effects in wireless power transmission systems |
US10389161B2 (en) | 2017-03-15 | 2019-08-20 | Energous Corporation | Surface mount dielectric antennas for wireless power transmitters |
US10439442B2 (en) | 2017-01-24 | 2019-10-08 | Energous Corporation | Microstrip antennas for wireless power transmitters |
EP3570740B1 (fr) | 2017-01-23 | 2023-08-30 | Naqi Logix Inc. | Appareil et procédé pour utiliser une direction imaginée afin de définir au moins une action |
US10812936B2 (en) | 2017-01-23 | 2020-10-20 | Magic Leap, Inc. | Localization determination for mixed reality systems |
WO2018140555A1 (fr) | 2017-01-30 | 2018-08-02 | Walmart Apollo, Llc | Systèmes, procédés et appareil de distribution de produits et de gestion de chaîne d'approvisionnement |
US20180217666A1 (en) * | 2017-01-30 | 2018-08-02 | Neuroverse, Inc. | Biometric control system |
US10405081B2 (en) | 2017-02-08 | 2019-09-03 | Bragi GmbH | Intelligent wireless headset system |
US10582290B2 (en) | 2017-02-21 | 2020-03-03 | Bragi GmbH | Earpiece with tap functionality |
US10771881B2 (en) | 2017-02-27 | 2020-09-08 | Bragi GmbH | Earpiece with audio 3D menu |
US10684693B2 (en) | 2017-03-02 | 2020-06-16 | Samsung Electronics Co., Ltd. | Method for recognizing a gesture and an electronic device thereof |
WO2018165212A1 (fr) * | 2017-03-06 | 2018-09-13 | Fynd Technologies, Inc. | Bracelet de géolocalisation, système et procédés |
EP3596705A4 (fr) | 2017-03-17 | 2020-01-22 | Magic Leap, Inc. | Système de réalité mixte à déformation de contenu virtuel couleur et procédé de génération de contenu virtuel l'utilisant |
CA3054619C (fr) | 2017-03-17 | 2024-01-30 | Magic Leap, Inc. | Systeme de realite mixte a deformation de contenu virtuel et procede de generation de contenu virtuel l'utilisant |
AU2018233733B2 (en) | 2017-03-17 | 2021-11-11 | Magic Leap, Inc. | Mixed reality system with multi-source virtual content compositing and method of generating virtual content using same |
US11380430B2 (en) | 2017-03-22 | 2022-07-05 | Bragi GmbH | System and method for populating electronic medical records with wireless earpieces |
US11694771B2 (en) | 2017-03-22 | 2023-07-04 | Bragi GmbH | System and method for populating electronic health records with wireless earpieces |
US10575086B2 (en) | 2017-03-22 | 2020-02-25 | Bragi GmbH | System and method for sharing wireless earpieces |
US11544104B2 (en) | 2017-03-22 | 2023-01-03 | Bragi GmbH | Load sharing between wireless earpieces |
WO2018183292A1 (fr) | 2017-03-29 | 2018-10-04 | Walmart Apollo, Llc | Gestion de chaîne d'approvisionnement d'inventaire au détail |
US11093927B2 (en) * | 2017-03-29 | 2021-08-17 | International Business Machines Corporation | Sensory data collection in an augmented reality system |
US20180284914A1 (en) * | 2017-03-30 | 2018-10-04 | Intel Corporation | Physical-surface touch control in virtual environment |
WO2018183892A1 (fr) | 2017-03-30 | 2018-10-04 | Energous Corporation | Flat antennas having two or more resonant frequencies for use in wireless power transmission systems |
US10429675B2 (en) * | 2017-03-31 | 2019-10-01 | Mark Greget | System for using augmented reality for vision |
US10708699B2 (en) | 2017-05-03 | 2020-07-07 | Bragi GmbH | Hearing aid with added functionality |
US10511097B2 (en) | 2017-05-12 | 2019-12-17 | Energous Corporation | Near-field antennas for accumulating energy at a near-field distance with minimal far-field gain |
US12074452B2 (en) | 2017-05-16 | 2024-08-27 | Wireless Electrical Grid Lan, Wigl Inc. | Networked wireless charging system |
US11462949B2 (en) | 2017-05-16 | 2022-10-04 | Wireless electrical Grid LAN, WiGL Inc | Wireless charging method and system |
US12074460B2 (en) | 2017-05-16 | 2024-08-27 | Wireless Electrical Grid Lan, Wigl Inc. | Rechargeable wireless power bank and method of using |
US10146501B1 (en) | 2017-06-01 | 2018-12-04 | Qualcomm Incorporated | Sound control by various hand gestures |
US11116415B2 (en) | 2017-06-07 | 2021-09-14 | Bragi GmbH | Use of body-worn radar for biometric measurements, contextual awareness and identification |
US11013445B2 (en) | 2017-06-08 | 2021-05-25 | Bragi GmbH | Wireless earpiece with transcranial stimulation |
US10848853B2 (en) | 2017-06-23 | 2020-11-24 | Energous Corporation | Systems, methods, and devices for utilizing a wire of a sound-producing device as an antenna for receipt of wirelessly delivered power |
US20180373327A1 (en) * | 2017-06-26 | 2018-12-27 | Hand Held Products, Inc. | System and method for selective scanning on a binocular augmented reality device |
US20190019011A1 (en) * | 2017-07-16 | 2019-01-17 | Tsunami VR, Inc. | Systems and methods for identifying real objects in an area of interest for use in identifying virtual content a user is authorized to view using an augmented reality device |
US10344960B2 (en) | 2017-09-19 | 2019-07-09 | Bragi GmbH | Wireless earpiece controlled medical headlight |
US11272367B2 (en) | 2017-09-20 | 2022-03-08 | Bragi GmbH | Wireless earpieces for hub communications |
US10254548B1 (en) | 2017-09-29 | 2019-04-09 | Hand Held Products, Inc. | Scanning device |
US10691931B2 (en) | 2017-10-04 | 2020-06-23 | Toshiba Global Commerce Solutions | Sensor-based environment for providing image analysis to determine behavior |
US10122219B1 (en) | 2017-10-10 | 2018-11-06 | Energous Corporation | Systems, methods, and devices for using a battery as an antenna for receiving wirelessly delivered power from radio frequency power waves |
US11342798B2 (en) | 2017-10-30 | 2022-05-24 | Energous Corporation | Systems and methods for managing coexistence of wireless-power signals and data signals operating in a same frequency band |
CN109960964A (zh) | 2017-12-14 | 2019-07-02 | 红石生物特征科技有限公司 | Contactless palm print acquisition device and method thereof |
US10676022B2 (en) | 2017-12-27 | 2020-06-09 | X Development Llc | Visually indicating vehicle caution regions |
CN108319363A (zh) * | 2018-01-09 | 2018-07-24 | 北京小米移动软件有限公司 | VR-based product display method, apparatus and electronic device |
US10615647B2 (en) | 2018-02-02 | 2020-04-07 | Energous Corporation | Systems and methods for detecting wireless power receivers and other objects at a near-field charging pad |
US10673414B2 (en) | 2018-02-05 | 2020-06-02 | Tectus Corporation | Adaptive tuning of a contact lens |
US11159057B2 (en) | 2018-03-14 | 2021-10-26 | Energous Corporation | Loop antennas with selectively-activated feeds to control propagation patterns of wireless power signals |
US10706396B2 (en) | 2018-03-19 | 2020-07-07 | Capital One Services, Llc | Systems and methods for translating a gesture to initiate a financial transaction |
JP7052496B2 (ja) * | 2018-03-30 | 2022-04-12 | ブラザー工業株式会社 | Communication device and computer program for the communication device |
US10505394B2 (en) | 2018-04-21 | 2019-12-10 | Tectus Corporation | Power generation necklaces that mitigate energy absorption in the human body |
US10838239B2 (en) | 2018-04-30 | 2020-11-17 | Tectus Corporation | Multi-coil field generation in an electronic contact lens system |
US10895762B2 (en) | 2018-04-30 | 2021-01-19 | Tectus Corporation | Multi-coil field generation in an electronic contact lens system |
US10790700B2 (en) | 2018-05-18 | 2020-09-29 | Tectus Corporation | Power generation necklaces with field shaping systems |
US11050752B2 (en) | 2018-06-07 | 2021-06-29 | Ebay Inc. | Virtual reality authentication |
US11515732B2 (en) | 2018-06-25 | 2022-11-29 | Energous Corporation | Power wave transmission techniques to focus wirelessly delivered power at a receiving device |
US11137622B2 (en) | 2018-07-15 | 2021-10-05 | Tectus Corporation | Eye-mounted displays including embedded conductive coils |
US10897705B2 (en) | 2018-07-19 | 2021-01-19 | Tectus Corporation | Secure communication between a contact lens and an accessory device |
CN112513712B (zh) | 2018-07-23 | 2023-05-09 | 奇跃公司 | Mixed reality system with virtual content warping and method of generating virtual content using the same |
US10943521B2 (en) | 2018-07-23 | 2021-03-09 | Magic Leap, Inc. | Intra-field sub code timing in field sequential displays |
US10602513B2 (en) * | 2018-07-27 | 2020-03-24 | Tectus Corporation | Wireless communication between a contact lens and an accessory device |
US10722128B2 (en) | 2018-08-01 | 2020-07-28 | Vision Service Plan | Heart rate detection system and method |
US10529107B1 (en) | 2018-09-11 | 2020-01-07 | Tectus Corporation | Projector alignment in a contact lens |
EP3877831A4 (fr) | 2018-11-09 | 2022-08-03 | Beckman Coulter, Inc. | Service glasses with selective data provision |
US11437735B2 (en) | 2018-11-14 | 2022-09-06 | Energous Corporation | Systems for receiving electromagnetic energy using antennas that are minimally affected by the presence of the human body |
US10838232B2 (en) | 2018-11-26 | 2020-11-17 | Tectus Corporation | Eye-mounted displays including embedded solenoids |
US11087577B2 (en) * | 2018-12-14 | 2021-08-10 | Johnson Controls Tyco IP Holdings LLP | Systems and methods of secure pin code entry |
US10644543B1 (en) | 2018-12-20 | 2020-05-05 | Tectus Corporation | Eye-mounted display system including a head wearable object |
WO2020160015A1 (fr) | 2019-01-28 | 2020-08-06 | Energous Corporation | Miniaturized antenna systems and methods for wireless power transmissions |
EP3921945A1 (fr) | 2019-02-06 | 2021-12-15 | Energous Corporation | Systems and methods of estimating optimal phases to use for individual antennas in an antenna array |
WO2020214897A1 (fr) | 2019-04-18 | 2020-10-22 | Beckman Coulter, Inc. | Securing data of objects in a laboratory environment |
WO2020226791A1 (fr) | 2019-05-07 | 2020-11-12 | Apple Inc. | Adjustment mechanism for head-mounted display |
EP3973468A4 (fr) | 2019-05-21 | 2022-09-14 | Magic Leap, Inc. | Hand pose estimation |
US11030459B2 (en) * | 2019-06-27 | 2021-06-08 | Intel Corporation | Methods and apparatus for projecting augmented reality enhancements to real objects in response to user gestures detected in a real environment |
US11907417B2 (en) | 2019-07-25 | 2024-02-20 | Tectus Corporation | Glance and reveal within a virtual environment |
US10944290B2 (en) | 2019-08-02 | 2021-03-09 | Tectus Corporation | Headgear providing inductive coupling to a contact lens |
KR20210019185A (ko) * | 2019-08-12 | 2021-02-22 | 엘지전자 주식회사 | Multimedia device and control method therefor |
CN112492193B (zh) * | 2019-09-12 | 2022-02-18 | 华为技术有限公司 | Callback stream processing method and device |
US11139699B2 (en) | 2019-09-20 | 2021-10-05 | Energous Corporation | Classifying and detecting foreign objects using a power amplifier controller integrated circuit in wireless power transmission systems |
CN115104234A (zh) | 2019-09-20 | 2022-09-23 | 艾诺格思公司 | Systems and methods for protecting wireless power receivers using multiple rectifiers and establishing in-band communications using multiple rectifiers |
US11381118B2 (en) | 2019-09-20 | 2022-07-05 | Energous Corporation | Systems and methods for machine learning based foreign object detection for wireless power transmission |
WO2021055898A1 (fr) | 2019-09-20 | 2021-03-25 | Energous Corporation | Systems and methods for machine learning based foreign object detection for wireless power transmission |
US11662807B2 (en) | 2020-01-06 | 2023-05-30 | Tectus Corporation | Eye-tracking user interface for virtual tool control |
KR20210048725A (ko) | 2019-10-24 | 2021-05-04 | 삼성전자주식회사 | Method for controlling a camera and electronic device therefor |
US10901505B1 (en) | 2019-10-24 | 2021-01-26 | Tectus Corporation | Eye-based activation and tool selection systems and methods |
WO2021119483A1 (fr) | 2019-12-13 | 2021-06-17 | Energous Corporation | Charging station with guiding contours for aligning an electronic device on the charging station and efficiently transferring near-field radio-frequency energy to the electronic device |
US10985617B1 (en) | 2019-12-31 | 2021-04-20 | Energous Corporation | System for wirelessly transmitting energy at a near-field distance without using beam-forming control |
US11405774B2 (en) | 2020-01-14 | 2022-08-02 | Facebook Technologies, Llc | Collective artificial reality device configuration |
US11562059B2 (en) * | 2020-01-14 | 2023-01-24 | Meta Platforms Technologies, Llc | Administered authentication in artificial reality systems |
US11799324B2 (en) | 2020-04-13 | 2023-10-24 | Energous Corporation | Wireless-power transmitting device for creating a uniform near-field charging area |
US11126405B1 (en) * | 2020-06-19 | 2021-09-21 | Accenture Global Solutions Limited | Utilizing augmented reality and artificial intelligence to automatically generate code for a robot |
US11995774B2 (en) * | 2020-06-29 | 2024-05-28 | Snap Inc. | Augmented reality experiences using speech and text captions |
US11294459B1 (en) | 2020-10-05 | 2022-04-05 | Bank Of America Corporation | Dynamic enhanced security based on eye movement tracking |
KR20220059975A (ko) * | 2020-11-02 | 2022-05-11 | 현대자동차주식회사 | Vehicle and control method thereof |
AU2021104706A4 (en) * | 2020-11-03 | 2021-09-30 | Christopher Mooney | A consumer product type source origin meta data identification and data processing system |
US11556912B2 (en) * | 2021-01-28 | 2023-01-17 | Bank Of America Corporation | Smartglasses-to-smartglasses payment systems |
US12014030B2 (en) | 2021-08-18 | 2024-06-18 | Bank Of America Corporation | System for predictive virtual scenario presentation |
US11592899B1 (en) | 2021-10-28 | 2023-02-28 | Tectus Corporation | Button activation within an eye-controlled user interface |
US11916398B2 (en) | 2021-12-29 | 2024-02-27 | Energous Corporation | Small form-factor devices with integrated and modular harvesting receivers, and shelving-mounted wireless-power transmitters for use therewith |
US11619994B1 (en) | 2022-01-14 | 2023-04-04 | Tectus Corporation | Control of an electronic contact lens using pitch-based eye gestures |
US20230351364A1 (en) * | 2022-04-29 | 2023-11-02 | INAMO Inc. | System and process for gift package of a prepaid card and a wearable device |
US11874961B2 (en) | 2022-05-09 | 2024-01-16 | Tectus Corporation | Managing display of an icon in an eye tracking augmented reality device |
Family Cites Families (59)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US6764012B2 (en) * | 1997-02-10 | 2004-07-20 | Symbol Technologies, Inc. | Signaling arrangement for and method of signaling in a wireless local area network |
DE69840547D1 (de) * | 1997-10-30 | 2009-03-26 | Myvu Corp | Interface system for eyeglasses |
JP2001251542A (ja) * | 1999-12-28 | 2001-09-14 | Casio Comput Co Ltd | Portable imaging device |
US7461936B2 (en) * | 2000-06-02 | 2008-12-09 | Oakley, Inc. | Eyeglasses with detachable adjustable electronics module |
US7278734B2 (en) * | 2000-06-02 | 2007-10-09 | Oakley, Inc. | Wireless interactive headset |
US6729726B2 (en) * | 2001-10-06 | 2004-05-04 | Stryker Corporation | Eyewear for hands-free communication |
US20130225290A1 (en) * | 2011-11-02 | 2013-08-29 | Dylan T. X. Zhou | Wearable personal mini cloud game and multimedia device |
US9016565B2 (en) * | 2011-07-18 | 2015-04-28 | Dylan T X Zhou | Wearable personal digital device for facilitating mobile device payments and personal use |
US8985442B1 (en) * | 2011-07-18 | 2015-03-24 | Tiger T G Zhou | One-touch payment using haptic control via a messaging and calling multimedia system on mobile device and wearable device, currency token interface, point of sale device, and electronic payment card |
JP3988632B2 (ja) * | 2002-11-28 | 2007-10-10 | 日本電気株式会社 | Control method for eyeglass-type display |
US8065235B2 (en) * | 2003-05-05 | 2011-11-22 | International Business Machines Corporation | Portable intelligent shopping device |
US7079876B2 (en) * | 2003-12-23 | 2006-07-18 | Isaac Levy | Wireless telephone headset built into eyeglasses |
US20050278446A1 (en) * | 2004-05-27 | 2005-12-15 | Jeffery Bryant | Home improvement telepresence system and method |
JP2005352024A (ja) * | 2004-06-09 | 2005-12-22 | Murata Mfg Co Ltd | Eyeglass-type interface device and security system |
US8931896B2 (en) * | 2004-11-02 | 2015-01-13 | E-Vision Smart Optics Inc. | Eyewear including a docking station |
US20060109350A1 (en) * | 2004-11-24 | 2006-05-25 | Ming-Hsiang Yeh | Glasses type audio-visual recording apparatus |
US20060153409A1 (en) * | 2005-01-10 | 2006-07-13 | Ming-Hsiang Yeh | Structure of a pair of glasses |
US20070104333A1 (en) * | 2005-11-08 | 2007-05-10 | Bill Kuo | Headset with built-in power supply |
US7810750B2 (en) * | 2006-12-13 | 2010-10-12 | Marcio Marc Abreu | Biologically fit wearable electronics apparatus and methods |
CN101101373A (zh) * | 2006-09-01 | 2008-01-09 | 刘美鸿 | Virtual screen display device with a single-piece aspheric lens structure |
US7631968B1 (en) * | 2006-11-01 | 2009-12-15 | Motion Research Technologies, Inc. | Cell phone display that clips onto eyeglasses |
US7484847B2 (en) * | 2007-01-02 | 2009-02-03 | Hind-Sight Industries, Inc. | Eyeglasses having integrated telescoping video camera and video display |
US7798638B2 (en) * | 2007-01-02 | 2010-09-21 | Hind-Sight Industries, Inc. | Eyeglasses with integrated video display |
US9217868B2 (en) * | 2007-01-12 | 2015-12-22 | Kopin Corporation | Monocular display device |
GB0720165D0 (en) * | 2007-10-16 | 2007-11-28 | 3M Innovative Properties Co | Light-emitting device |
US20090219788A1 (en) * | 2008-03-03 | 2009-09-03 | Henley Jr Horace | Combination watch and cell phone foldable onto each other for use around a wrist of a user |
US8957835B2 (en) * | 2008-09-30 | 2015-02-17 | Apple Inc. | Head-mounted display apparatus for retaining a portable electronic device with display |
WO2010062481A1 (fr) * | 2008-11-02 | 2010-06-03 | David Chaum | Near-eye display system and apparatus |
KR101529921B1 (ko) * | 2008-11-04 | 2015-06-18 | 엘지전자 주식회사 | Watch-type terminal |
JP5389493B2 (ja) * | 2009-03-25 | 2014-01-15 | オリンパス株式会社 | Eyeglass-mounted image display device |
US8427508B2 (en) | 2009-06-25 | 2013-04-23 | Nokia Corporation | Method and apparatus for an augmented reality user interface |
TWI554076B (zh) * | 2009-09-04 | 2016-10-11 | 普露諾洛股份有限公司 | Remote phone manager |
US8210676B1 (en) * | 2009-09-21 | 2012-07-03 | Marvin James Hunt | Sportsman's reading glasses |
US20110193963A1 (en) * | 2010-02-04 | 2011-08-11 | Hunter Specialties, Inc. | Eyewear for acquiring video imagery |
US8482859B2 (en) | 2010-02-28 | 2013-07-09 | Osterhout Group, Inc. | See-through near-eye display glasses wherein image light is transmitted to and reflected from an optically flat film |
JP2013521576A (ja) * | 2010-02-28 | 2013-06-10 | オスターハウト グループ インコーポレイテッド | Local advertising content on an interactive head-mounted eyepiece |
US20110213664A1 (en) * | 2010-02-28 | 2011-09-01 | Osterhout Group, Inc. | Local advertising content on an interactive head-mounted eyepiece |
US20110246284A1 (en) * | 2010-04-01 | 2011-10-06 | Gary Chaikin | Systems and Methods for Adding Functionality to Merchant Sales and Facilitating Data Collection. |
US9400978B2 (en) * | 2010-04-09 | 2016-07-26 | Paypal, Inc. | Methods and systems for selecting accounts and offers in payment transactions |
EP2409736A1 (fr) * | 2010-07-22 | 2012-01-25 | Sony Ericsson Mobile Communications AB | Method for operating a training device |
US20120029994A1 (en) * | 2010-07-28 | 2012-02-02 | Symbol Technologies, Inc. | Coupon organization using a bar code reader |
US8582206B2 (en) * | 2010-09-15 | 2013-11-12 | Microsoft Corporation | Laser-scanning virtual image display |
US20120188501A1 (en) * | 2011-01-26 | 2012-07-26 | Lewis Page Johnson | Eyeglass temple insert and assembly |
US8787006B2 (en) * | 2011-01-31 | 2014-07-22 | Apple Inc. | Wrist-worn electronic device and methods therefor |
WO2012112822A2 (fr) * | 2011-02-16 | 2012-08-23 | Visa International Service Association | Snap mobile payment apparatuses, methods and systems |
US20120281961A1 (en) * | 2011-05-06 | 2012-11-08 | Predator Outdoor Products, Llc | Eyewear for acquiring video imagery with one button technology |
US9330499B2 (en) | 2011-05-20 | 2016-05-03 | Microsoft Technology Licensing, Llc | Event augmentation with real-time information |
US20120316456A1 (en) * | 2011-06-10 | 2012-12-13 | Aliphcom | Sensory user interface |
US20130002724A1 (en) | 2011-06-30 | 2013-01-03 | Google Inc. | Wearable computer with curved display and navigation tool |
US20130265300A1 (en) * | 2011-07-03 | 2013-10-10 | Neorai Vardi | Computer device in form of wearable glasses and user interface thereof |
US20130241927A1 (en) * | 2011-07-03 | 2013-09-19 | Neorai Vardi | Computer device in form of wearable glasses and user interface thereof |
US20130002559A1 (en) * | 2011-07-03 | 2013-01-03 | Vardi Nachum | Desktop computer user interface |
US9047600B2 (en) * | 2011-07-18 | 2015-06-02 | Andrew H B Zhou | Mobile and wearable device payments via free cross-platform messaging service, free voice over internet protocol communication, free over-the-top content communication, and universal digital mobile and wearable device currency faces |
US20130172068A1 (en) * | 2011-11-02 | 2013-07-04 | Tiger T G Zhou | Wearable personal digital flexible cloud game, multimedia, communication and computing device |
US20130021374A1 (en) | 2011-07-20 | 2013-01-24 | Google Inc. | Manipulating And Displaying An Image On A Wearable Computing System |
US9285592B2 (en) | 2011-08-18 | 2016-03-15 | Google Inc. | Wearable device with input and output structures |
US20130054390A1 (en) * | 2011-08-22 | 2013-02-28 | Metrologic Instruments, Inc. | Encoded information reading terminal with nfc payment processing functionality |
US9720231B2 (en) * | 2012-09-26 | 2017-08-01 | Dolby Laboratories Licensing Corporation | Display, imaging system and controller for eyewear display device |
US10268276B2 (en) * | 2013-03-15 | 2019-04-23 | Eyecam, LLC | Autonomous computing and telecommunications head-up displays glasses |
- 2013-08-22: US US13/973,146 patent/US9153074B2/en not_active Expired - Fee Related
- 2014-08-14: WO PCT/IB2014/063914 patent/WO2015025251A1/fr active Application Filing
Patent Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20120235887A1 (en) * | 2010-02-28 | 2012-09-20 | Osterhout Group, Inc. | See-through near-eye display glasses including a partially reflective, partially transmitting optical element and an optically flat film |
US20130346168A1 (en) * | 2011-07-18 | 2013-12-26 | Dylan T X Zhou | Wearable augmented reality eyeglass communication device including mobile phone and mobile computing via virtual touch screen gesture control and neuron command |
US20130050432A1 (en) * | 2011-08-30 | 2013-02-28 | Kathryn Stone Perez | Enhancing an object of interest in a see-through, mixed reality display device |
Cited By (69)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US10539794B2 (en) | 2015-03-16 | 2020-01-21 | Magic Leap, Inc. | Methods and systems for detecting health conditions by imaging portions of the eye, including the fundus |
US10365488B2 (en) | 2015-03-16 | 2019-07-30 | Magic Leap, Inc. | Methods and systems for diagnosing eyes using aberrometer |
US20170007450A1 (en) | 2015-03-16 | 2017-01-12 | Magic Leap, Inc. | Augmented and virtual reality display systems and methods for delivery of medication to eyes |
US20170007843A1 (en) | 2015-03-16 | 2017-01-12 | Magic Leap, Inc. | Methods and systems for diagnosing and treating eyes using laser therapy |
US10345592B2 (en) | 2015-03-16 | 2019-07-09 | Magic Leap, Inc. | Augmented and virtual reality display systems and methods for diagnosing a user using electrical potentials |
US10539795B2 (en) | 2015-03-16 | 2020-01-21 | Magic Leap, Inc. | Methods and systems for diagnosing and treating eyes using laser therapy |
US10345593B2 (en) | 2015-03-16 | 2019-07-09 | Magic Leap, Inc. | Methods and systems for providing augmented reality content for treating color blindness |
US10345590B2 (en) | 2015-03-16 | 2019-07-09 | Magic Leap, Inc. | Augmented and virtual reality display systems and methods for determining optical prescriptions |
US10359631B2 (en) | 2015-03-16 | 2019-07-23 | Magic Leap, Inc. | Augmented reality display systems and methods for re-rendering the world |
US10545341B2 (en) | 2015-03-16 | 2020-01-28 | Magic Leap, Inc. | Methods and systems for diagnosing eye conditions, including macular degeneration |
US10371946B2 (en) | 2015-03-16 | 2019-08-06 | Magic Leap, Inc. | Methods and systems for diagnosing binocular vision conditions |
US10371949B2 (en) | 2015-03-16 | 2019-08-06 | Magic Leap, Inc. | Methods and systems for performing confocal microscopy |
US10371947B2 (en) | 2015-03-16 | 2019-08-06 | Magic Leap, Inc. | Methods and systems for modifying eye convergence for diagnosing and treating conditions including strabismus and/or amblyopia |
US10371948B2 (en) | 2015-03-16 | 2019-08-06 | Magic Leap, Inc. | Methods and systems for diagnosing color blindness |
US10371945B2 (en) | 2015-03-16 | 2019-08-06 | Magic Leap, Inc. | Methods and systems for diagnosing and treating higher order refractive aberrations of an eye |
US10379351B2 (en) | 2015-03-16 | 2019-08-13 | Magic Leap, Inc. | Methods and systems for diagnosing and treating eyes using light therapy |
US10379350B2 (en) | 2015-03-16 | 2019-08-13 | Magic Leap, Inc. | Methods and systems for diagnosing eyes using ultrasound |
US10379354B2 (en) | 2015-03-16 | 2019-08-13 | Magic Leap, Inc. | Methods and systems for diagnosing contrast sensitivity |
US10379353B2 (en) | 2015-03-16 | 2019-08-13 | Magic Leap, Inc. | Augmented and virtual reality display systems and methods for diagnosing health conditions based on visual fields |
US10386639B2 (en) | 2015-03-16 | 2019-08-20 | Magic Leap, Inc. | Methods and systems for diagnosing eye conditions such as red reflex using light reflected from the eyes |
US10386640B2 (en) | 2015-03-16 | 2019-08-20 | Magic Leap, Inc. | Methods and systems for determining intraocular pressure |
US10386641B2 (en) | 2015-03-16 | 2019-08-20 | Magic Leap, Inc. | Methods and systems for providing augmented reality content for treatment of macular degeneration |
US10429649B2 (en) | 2015-03-16 | 2019-10-01 | Magic Leap, Inc. | Augmented and virtual reality display systems and methods for diagnosing using occluder |
US10437062B2 (en) | 2015-03-16 | 2019-10-08 | Magic Leap, Inc. | Augmented and virtual reality display platforms and methods for delivering health treatments to a user |
US10444504B2 (en) | 2015-03-16 | 2019-10-15 | Magic Leap, Inc. | Methods and systems for performing optical coherence tomography |
WO2016149416A1 (fr) * | 2015-03-16 | 2016-09-22 | Magic Leap, Inc. | Methods and systems for diagnosing and treating health ailments |
US10451877B2 (en) | 2015-03-16 | 2019-10-22 | Magic Leap, Inc. | Methods and systems for diagnosing and treating presbyopia |
US11747627B2 (en) | 2015-03-16 | 2023-09-05 | Magic Leap, Inc. | Augmented and virtual reality display systems and methods for diagnosing health conditions based on visual fields |
US10459229B2 (en) | 2015-03-16 | 2019-10-29 | Magic Leap, Inc. | Methods and systems for performing two-photon microscopy |
US10466477B2 (en) | 2015-03-16 | 2019-11-05 | Magic Leap, Inc. | Methods and systems for providing wavefront corrections for treating conditions including myopia, hyperopia, and/or astigmatism |
US10473934B2 (en) | 2015-03-16 | 2019-11-12 | Magic Leap, Inc. | Methods and systems for performing slit lamp examination |
US11474359B2 (en) | 2015-03-16 | 2022-10-18 | Magic Leap, Inc. | Augmented and virtual reality display systems and methods for diagnosing health conditions based on visual fields |
US11256096B2 (en) | 2015-03-16 | 2022-02-22 | Magic Leap, Inc. | Methods and systems for diagnosing and treating presbyopia |
US10527850B2 (en) | 2015-03-16 | 2020-01-07 | Magic Leap, Inc. | Augmented and virtual reality display systems and methods for determining optical prescriptions by imaging retina |
US11156835B2 (en) | 2015-03-16 | 2021-10-26 | Magic Leap, Inc. | Methods and systems for diagnosing and treating health ailments |
US10345591B2 (en) | 2015-03-16 | 2019-07-09 | Magic Leap, Inc. | Methods and systems for performing retinoscopy |
US20170000342A1 (en) | 2015-03-16 | 2017-01-05 | Magic Leap, Inc. | Methods and systems for detecting health conditions by imaging portions of the eye, including the fundus |
US10564423B2 (en) | 2015-03-16 | 2020-02-18 | Magic Leap, Inc. | Augmented and virtual reality display systems and methods for delivery of medication to eyes |
US10983351B2 (en) | 2015-03-16 | 2021-04-20 | Magic Leap, Inc. | Augmented and virtual reality display systems and methods for diagnosing health conditions based on visual fields |
US10969588B2 (en) | 2015-03-16 | 2021-04-06 | Magic Leap, Inc. | Methods and systems for diagnosing contrast sensitivity |
US10775628B2 (en) | 2015-03-16 | 2020-09-15 | Magic Leap, Inc. | Methods and systems for diagnosing and treating presbyopia |
US10788675B2 (en) | 2015-03-16 | 2020-09-29 | Magic Leap, Inc. | Methods and systems for diagnosing and treating eyes using light therapy |
US10459231B2 (en) | 2016-04-08 | 2019-10-29 | Magic Leap, Inc. | Augmented reality systems and methods with variable focus lens elements |
US11106041B2 (en) | 2016-04-08 | 2021-08-31 | Magic Leap, Inc. | Augmented reality systems and methods with variable focus lens elements |
US11614626B2 (en) | 2016-04-08 | 2023-03-28 | Magic Leap, Inc. | Augmented reality systems and methods with variable focus lens elements |
US10943100B2 (en) | 2017-01-19 | 2021-03-09 | Mindmaze Holding Sa | Systems, methods, devices and apparatuses for detecting facial expression |
US10515474B2 (en) | 2017-01-19 | 2019-12-24 | Mindmaze Holding Sa | System, method and apparatus for detecting facial expression in a virtual reality system |
US11989340B2 (en) | 2017-01-19 | 2024-05-21 | Mindmaze Group Sa | Systems, methods, apparatuses and devices for detecting facial expression and for tracking movement and location in at least one of a virtual and augmented reality system |
US11495053B2 (en) | 2017-01-19 | 2022-11-08 | Mindmaze Group Sa | Systems, methods, devices and apparatuses for detecting facial expression |
US11195316B2 (en) | 2017-01-19 | 2021-12-07 | Mindmaze Holding Sa | System, method and apparatus for detecting facial expression in a virtual reality system |
US10521014B2 (en) | 2017-01-19 | 2019-12-31 | Mindmaze Holding Sa | Systems, methods, apparatuses and devices for detecting facial expression and for tracking movement and location in at least one of a virtual and augmented reality system |
US11709548B2 (en) | 2017-01-19 | 2023-07-25 | Mindmaze Group Sa | Systems, methods, devices and apparatuses for detecting facial expression |
US11991344B2 (en) | 2017-02-07 | 2024-05-21 | Mindmaze Group Sa | Systems, methods and apparatuses for stereo vision and tracking |
US11774823B2 (en) | 2017-02-23 | 2023-10-03 | Magic Leap, Inc. | Display system with variable power reflector |
US11300844B2 (en) | 2017-02-23 | 2022-04-12 | Magic Leap, Inc. | Display system with variable power reflector |
US10962855B2 (en) | 2017-02-23 | 2021-03-30 | Magic Leap, Inc. | Display system with variable power reflector |
US11921794B2 (en) | 2017-09-15 | 2024-03-05 | Kohler Co. | Feedback for water consuming appliance |
US11314214B2 (en) | 2017-09-15 | 2022-04-26 | Kohler Co. | Geographic analysis of water conditions |
US10887125B2 (en) | 2017-09-15 | 2021-01-05 | Kohler Co. | Bathroom speaker |
US11099540B2 (en) | 2017-09-15 | 2021-08-24 | Kohler Co. | User identity in household appliances |
US11892811B2 (en) | 2017-09-15 | 2024-02-06 | Kohler Co. | Geographic analysis of water conditions |
US10448762B2 (en) | 2017-09-15 | 2019-10-22 | Kohler Co. | Mirror |
US11314215B2 (en) | 2017-09-15 | 2022-04-26 | Kohler Co. | Apparatus controlling bathroom appliance lighting based on user identity |
US10663938B2 (en) | 2017-09-15 | 2020-05-26 | Kohler Co. | Power operation of intelligent devices |
US11949533B2 (en) | 2017-09-15 | 2024-04-02 | Kohler Co. | Sink device |
US11328533B1 (en) | 2018-01-09 | 2022-05-10 | Mindmaze Holding Sa | System, method and apparatus for detecting facial expression for motion capture |
WO2020041455A1 (fr) * | 2018-08-24 | 2020-02-27 | Nikola Mrvaljevic | Augmented reality for detecting athletic fatigue |
US11855831B1 (en) | 2022-06-10 | 2023-12-26 | T-Mobile Usa, Inc. | Enabling an operator to resolve an issue associated with a 5G wireless telecommunication network using AR glasses |
US11886767B2 (en) | 2022-06-17 | 2024-01-30 | T-Mobile Usa, Inc. | Enable interaction between a user and an agent of a 5G wireless telecommunication network using augmented reality glasses |
Also Published As
Publication number | Publication date |
---|---|
US9153074B2 (en) | 2015-10-06 |
US20130346168A1 (en) | 2013-12-26 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US9153074B2 (en) | Wearable augmented reality eyeglass communication device including mobile phone and mobile computing via virtual touch screen gesture control and neuron command | |
US20170103440A1 (en) | Wearable augmented reality eyeglass communication device including mobile phone and mobile computing via virtual touch screen gesture control and neuron command | |
US20170115742A1 (en) | Wearable augmented reality eyeglass communication device including mobile phone and mobile computing via virtual touch screen gesture control and neuron command | |
US9100493B1 (en) | Wearable personal digital device for facilitating mobile device payments and personal use | |
US11417066B2 (en) | System and method for selecting targets in an augmented reality environment | |
US10356398B2 (en) | Method for capturing virtual space and electronic device using the same | |
WO2018127782A1 (fr) | Wearable augmented reality eyeglass communication device including a mobile phone and a mobile computing device controlled by virtual touch gesture and neural command | |
CN107851319A (zh) | Regional augmented reality persistent tag objects | |
US9153195B2 (en) | Providing contextual personal information by a mixed reality device | |
CN117043718A (zh) | Activating a hands-free mode for operating an electronic mirroring device | |
CN109863532A (zh) | Generating and displaying customized avatars in media overlays | |
CN107924590A (zh) | Image-based tracking in an augmented reality system | |
CN109791621A (zh) | Systems and methods for device pairing using optical codes | |
US20130172068A1 (en) | Wearable personal digital flexible cloud game, multimedia, communication and computing device | |
US20150309263A2 (en) | Planar waveguide apparatus with diffraction element(s) and system employing same | |
US20180350148A1 (en) | Augmented reality display system for overlaying apparel and fitness information | |
CN110168586A (zh) | Context generation and selection of customized media content | |
CN106462825A (zh) | Data mesh platform | |
CN109310353A (zh) | Conveying information via a computer-implemented agent | |
WO2018122709A1 (fr) | Wearable augmented reality eyeglass communication device including a mobile phone and a mobile computing device controlled by virtual touch gesture and neural command | |
CN109791664A (zh) | Deriving audiences through filter activity | |
KR20190104282A (ko) | Method for providing information based on an image and mobile terminal therefor | |
KR20170031722A (ko) | Information processing method using a wearable device | |
US20230403460A1 (en) | Techniques for using sensor data to monitor image-capture trigger conditions for determining when to capture images using an imaging device of a head-wearable device, and wearable devices and systems for performing those techniques | |
WO2024044184A1 (fr) | External computer vision for an eyewear device | |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
| 121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 14838589; Country of ref document: EP; Kind code of ref document: A1 |
| NENP | Non-entry into the national phase | Ref country code: DE |
| 122 | Ep: pct application non-entry in european phase | Ref document number: 14838589; Country of ref document: EP; Kind code of ref document: A1 |