EP2972560A2 - Social data-aware wearable display system - Google Patents
Social data-aware wearable display system
- Publication number
- EP2972560A2 (application EP14775050.9A)
- Authority
- EP
- European Patent Office
- Prior art keywords
- data
- display
- social
- sensor
- sensor data
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Withdrawn
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/011—Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B27/00—Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
- G02B27/01—Head-up displays
- G02B27/017—Head mounted
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B27/00—Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
- G02B27/01—Head-up displays
- G02B27/0101—Head-up displays characterised by optical features
- G02B2027/0138—Head-up displays characterised by optical features comprising image capture systems, e.g. camera
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B27/00—Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
- G02B27/01—Head-up displays
- G02B27/0101—Head-up displays characterised by optical features
- G02B2027/014—Head-up displays characterised by optical features comprising information/image processing systems
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B27/00—Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
- G02B27/01—Head-up displays
- G02B27/017—Head mounted
- G02B2027/0178—Eyeglass type
Definitions
- the present invention relates generally to electrical and electronic hardware, electromechanical and computing devices. More specifically, techniques related to a social data-aware wearable display system are described.
- Conventional wearable devices often are not hands-free, and even hands-free wearable display devices typically are not equipped to access social data automatically, particularly in context (i.e., pertaining to a user's behavior, location, and environment).
- FIG. 1 illustrates an exemplary wearable display device
- FIG. 2 illustrates an exemplary social data-aware wearable display system
- FIG. 3 illustrates another exemplary wearable display device.
- motion may be detected using an accelerometer that responds to an applied force and produces an output signal representative of the acceleration (and hence in some cases a velocity or displacement) produced by the force.
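The relationship noted above — that an accelerometer's output can also yield a velocity or displacement — comes from integrating the acceleration signal over time. A minimal sketch using trapezoidal integration (function and variable names are ours, purely illustrative):

```python
def integrate(samples, dt):
    """Trapezoidal integration of evenly spaced samples.

    Returns the cumulative integral at each sample time, starting at 0.
    """
    out = [0.0]
    for a, b in zip(samples, samples[1:]):
        out.append(out[-1] + 0.5 * (a + b) * dt)
    return out

# Constant 2 m/s^2 acceleration for 1 s, sampled at 10 Hz:
accel = [2.0] * 11
velocity = integrate(accel, 0.1)          # v(t) = 2t  -> 2.0 m/s at t = 1 s
displacement = integrate(velocity, 0.1)   # s(t) = t^2 -> 1.0 m at t = 1 s
```

Integrating once gives velocity; integrating the result again gives displacement, which is the chain the passage alludes to.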
- Embodiments may be used to couple or secure a wearable device onto a body part.
- Techniques described are directed to systems, apparatuses, devices, and methods for using accelerometers, or other devices capable of detecting motion, to detect the motion of an element or part of an overall system.
- the described techniques may be used to accurately and reliably detect the motion of a part of the human body or an element of another complex system.
- operations of disclosed processes may be performed in an arbitrary order, unless otherwise provided in the claims.
- FIG. 1 illustrates an exemplary wearable display device.
- wearable device 100 includes frame 102, lenses 104, display 106, and sensors 108-110.
- an object may be seen through lenses 104 (e.g., person 112).
- frame 102 may be implemented similarly to a pair of glasses.
- frame 102 may be configured to house lenses 104, which may be non-prescription or prescription lenses.
- frame 102 may be configured to be worn on a face (e.g., over a bridge of a nose, over a pair of ears, or the like) such that a user may be able to see through lenses 104.
- frame 102 may include sensors 108-110.
- one or more of sensors 108-110 may be configured to capture visual (e.g., image, video, or the like) data.
- one or more of sensors 108-110 may include a camera, light sensor, or the like, without limitation.
- one or more of sensors 108-110 also may be configured to capture audio data or other sensor data (e.g., temperature, location, light, or the like).
- one or more of sensors 108-110 may include a microphone, vibration sensor, or the like, without limitation.
- one or more of sensors 108-110, or sensors disposed elsewhere on frame 102 may be configured to capture secondary sensor data (e.g., environmental, location, movement, or the like).
- one or more of sensors 108-110 may be disposed in different locations on frame 102 than shown, or coupled to a different part of frame 102, for capturing sensor data associated with a different direction or location relative to frame 102.
- display 106 may be disposed anywhere in a field of vision or field of view of an eye. In some examples, display 106 may be disposed on one or both of lenses 104. In other examples, display 106 may be implemented independently of lenses 104. In some examples, display 106 may be disposed in an unobtrusive portion of said field of vision. For example, display 106 may be disposed on a peripheral portion of lenses 104, such as near a corner of one or both of lenses 104. In other examples, display 106 may be implemented unobtrusively, for example by operating in two or more modes, where display 106 is disabled in one mode and enabled in another mode.
- display 106 may be configured to act similarly to, or provide the same function as, lenses 104 (i.e., prescription lens or non-prescription lens).
- display 106 may mimic a portion of a clear lens where lenses 104 are clear.
- display 106 may mimic a portion of a prescription lens having a prescription similar, or identical, to lenses 104.
- display 106 may have other characteristics in common with lenses 104 (e.g., UV protection, tinting, coloring, and the like).
- information may appear temporarily, and then disappear after a predetermined period of time (i.e., for a length of time long enough to be read or recognized by a user).
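The temporary presentation described above — information shown, then cleared after a predetermined period — can be sketched as a notice that renders only until its deadline passes (class and parameter names are our own; the injectable clock exists only to make the sketch testable):

```python
import time


class TimedNotice:
    """Holds a message that is considered visible only for a fixed duration."""

    def __init__(self, text, duration_s, now=time.monotonic):
        self._now = now
        self.text = text
        self.expires_at = now() + duration_s

    def visible(self):
        return self._now() < self.expires_at

    def render(self):
        # The display shows the text while visible, then blank.
        return self.text if self.visible() else ""
```

In use, the wearable's display loop would call `render()` each frame; once the predetermined period elapses, the notice silently disappears.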
- display 106 may be implemented using transmissive display technology (e.g., liquid crystal display (LCD) type, or the like).
- display 106 may be implemented using reflective display technology (e.g., liquid crystal on silicon (LCoS) type, or the like), for example, with an electrically controlled reflective material in a backplane.
- the quantity, type, function, structure, and configuration of the elements shown may be varied and are not limited to the examples provided.
- FIG. 2 illustrates an exemplary social data-aware wearable display system.
- system 200 includes wearable device 202, including display 204, mobile device 206, applications 208-210, network 212, server 214 and storage 216.
- wearable device 202 may include communication facility 202a and sensor 202b.
- sensor 202b may be implemented as one or more sensors configured to capture sensor data, as described herein.
- communication facility 202a may be configured to exchange data with mobile device 206 and network 212 (i.e., server 214 using network 212), for example using a short-range communication protocol (e.g., Bluetooth®, NFC, ultra wideband, or the like) or a longer-range communication protocol (e.g., satellite, mobile broadband, GPS, WiFi, and the like).
- mobile device 206 may be implemented as a mobile communication device, mobile computing device, tablet computer, or the like, without limitation.
- wearable device 202 may be configured to capture sensor data (i.e., using sensor 202b) associated with an object (e.g., person 218) seen by a user while wearing wearable device 202.
- wearable device 202 may capture visual data associated with person 218 when a user wearing wearable device 202 sees person 218.
- wearable device 202 may be configured to send said visual data to mobile device 206 or server 214 for processing by application 208 and/or application 210, as described herein.
- mobile device 206 also may be implemented with a secondary sensor (not shown) configured to capture secondary sensor data (e.g., movement, location (i.e., using GPS), or the like).
- mobile device 206 may be configured to run or implement application 208, or other various applications.
- server 214 may be configured to run or implement application 210, or other various applications.
- applications 208-210 may be implemented in a distributed manner using both mobile device 206 and server 214.
- one or both of applications 208-210 may be configured to process sensor data received from wearable device 202, and to generate pertinent social data (i.e., social data relevant to sensor data captured by wearable device 202, and thus relevant to a user's environment) using the sensor data for presentation on display 204.
- social data may refer to data associated with a social network or social graph, for example, associated with a user.
- social data may be associated with a social network account (e.g., Facebook®, Twitter®, Linkedln®, Instagram®, Google+®, or the like).
- social data also may be associated with other databases configured to store social data (e.g., contacts lists and information, calendar data associated with a user's contacts, or the like).
- application 208 may be configured to derive characteristic data from sensor data captured using wearable device 202.
- wearable device 202 may be configured to capture visual data associated with one or more objects (e.g., person 218, or the like) able to be seen or viewed using wearable device 202, and application 208 may be configured to derive a face outline, facial features, a gait, or other characteristics, associated with said one or more objects.
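As a toy stand-in for this derivation step — a real system would use a face-detection or landmark library rather than the scheme below — characteristic data can be as simple as a vector of mean brightnesses over a grid of image cells. All names and the signature scheme here are our assumptions, not the patent's:

```python
def grid_signature(pixels, width, height, grid=4):
    """Derive a coarse characteristic vector from grayscale pixel data.

    The image (a flat list of width*height brightness values) is divided
    into grid x grid cells; the mean brightness of each cell becomes one
    component of the characteristic vector. Assumes width, height >= grid.
    """
    sig = []
    for gy in range(grid):
        for gx in range(grid):
            total, count = 0, 0
            for y in range(gy * height // grid, (gy + 1) * height // grid):
                for x in range(gx * width // grid, (gx + 1) * width // grid):
                    total += pixels[y * width + x]
                    count += 1
            sig.append(total / count)
    return sig
```

The point of any such vector is only that similar views of the same object produce similar vectors, which the matching step can then exploit.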
- application 210 may be configured to run various algorithms using sensor data, including secondary sensor data, captured by wearable device 202 in order to generate (i.e., gather, obtain or determine by querying and cross-referencing with a database) pertinent social data associated with said sensor data.
- application 210 also may be configured to run one or more algorithms on secondary sensor data and derived data from mobile device 206 in order to generate pertinent social data associated with said sensor data.
- said algorithms may include a facial recognition algorithm, a social database mining algorithm, an intelligent contextual information provisioning algorithm (i.e., to enable mobile device 206 and/or wearable device 202 to provide data or services in response, or otherwise react, to sensor, social, and environmental data), or the like.
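One of these steps — matching derived characteristic data against stored social data — can be sketched as a nearest-neighbor lookup over a mock database. The Euclidean metric, the threshold, and every name below are our assumptions for illustration, not the patent's actual algorithms:

```python
import math


def match_identity(signature, database, threshold=50.0):
    """Match a characteristic vector against a mock social database.

    database maps identity names to stored signatures. Returns the
    closest identity strictly within the distance threshold, or None.
    """
    best_name, best_dist = None, threshold
    for name, stored in database.items():
        dist = math.dist(signature, stored)
        if dist < best_dist:
            best_name, best_dist = name, dist
    return best_name
```

A production recognizer would be far more involved, but the shape is the same: sensor-derived features in, a social-graph identity (or nothing) out.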
- one or both of applications 208-210 also may be configured to format or otherwise process data (i.e., pertinent social data) to be presented, for example, using display 204.
- pertinent social data may be gathered from social networking databases, or other databases configured to store social data, as described herein.
- pertinent social data may include identity data associated with an identity, for example, of a member of a social network.
- identity data may reference or describe a name and other identifying information (e.g., a telephone number, an e-mail address, a physical address, a relationship (i.e., with a user of the social network to which said member belongs), a unique identification (e.g., a handle, a username, a social security number, a password, or the like), and the like) associated with an identity.
- applications 208-210 may be configured to obtain identity data associated with sensor data, for example, associated with an image or video of person 218, and to provide said identity data to wearable device 202 to present using display 204.
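Before identity data reaches display 204, it would plausibly be condensed into a single short line that fits an unobtrusive display region. A sketch of that formatting step (the record fields and the truncation rule are our assumptions):

```python
def format_identity(record, max_len=40):
    """Format an identity record into one short line for a heads-up display.

    record is a dict with optional keys such as "name" and "relationship";
    long lines are truncated with an ellipsis to fit the display region.
    """
    parts = [record.get("name", "Unknown")]
    if "relationship" in record:
        parts.append(record["relationship"])
    line = " - ".join(parts)
    return line if len(line) <= max_len else line[: max_len - 1] + "\u2026"
```

Keeping the line short matches the earlier point that the display occupies only a peripheral portion of the field of vision.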
- pertinent social data generated by applications 208-210 also may reference or describe an event or other social information (e.g., a birthday, a favorite food, a frequented venue (e.g., restaurant, cafe, shop, store, or the like) nearby, a relationship to a user (e.g., friend of a friend, co-worker, boss's daughter, or the like), a relationship status (e.g., friendship status, or the like), or the like) relevant to a member of a social network identified using sensor data.
- the quantity, type, function, structure, and configuration of the elements shown may be varied and are not limited to the examples provided.
- FIG. 3 illustrates another exemplary wearable display device.
- wearable device 302 includes viewing area 304 and focus feature 306.
- viewing area 304 may include display 308, which may be disposed on some or all of viewing area 304.
- display 308 may be dynamically focused using focus feature 306, for example, implemented in a frame arm of wearable device 302, to adapt to a user's eye focal length such that information and images (i.e., graphics) presented on display 308 appear focused to a user.
- focus feature 306 may be implemented with a sensor (or an array of sensors) to detect a touching motion (e.g., a tap of a finger, a sliding of a finger, or the like).
- focus feature 306 may be configured to translate said touching motion into a focal change implemented on display 308, for example, using software configured to adjust display 308, or by optically moving lens surfaces with respect to each other (i.e., laterally or vertically).
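The translation from touching motion to focal change might be sketched as a linear mapping from slide distance to a clamped diopter adjustment. The gain and limit values below are invented for illustration and are not specified by the patent:

```python
def slide_to_diopters(slide_mm, gain=0.05, limit=3.0):
    """Translate a finger slide along the frame arm into a focal change.

    slide_mm: signed slide distance in millimetres; gain: diopters per
    millimetre. The result is clamped so a single gesture cannot exceed
    +/- limit diopters.
    """
    return max(-limit, min(limit, slide_mm * gain))
```

Clamping keeps an accidental long swipe from driving the display far out of a usable focal range.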
- a camera (not shown), either visual or infrared or other type, may be implemented facing a user and configured to sense one or more parameters associated with a user's eye (e.g., pupil opening size, or the like). Said one or more parameters may be used by wearable device 302 to automatically focus information or images presented on display 308.
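One crude way such an eye parameter could drive autofocus is a clamped linear map from measured pupil diameter to a normalized focus setting. The constants and the linear model are purely illustrative; a real system would calibrate per user:

```python
def pupil_to_focus(pupil_mm, near_mm=2.0, far_mm=8.0):
    """Map a measured pupil diameter to a normalized focus setting.

    Returns 0.0 at near_mm and 1.0 at far_mm, clamped outside that
    range, so the display's focus stage receives a bounded input.
    """
    t = (pupil_mm - near_mm) / (far_mm - near_mm)
    return max(0.0, min(1.0, t))
```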
- the quantity, type, function, structure, and configuration of the elements shown may be varied and are not limited to the examples provided.
Landscapes
- Engineering & Computer Science (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- General Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- Optics & Photonics (AREA)
- Human Computer Interaction (AREA)
- User Interface Of Digital Computer (AREA)
Abstract
Techniques associated with a social data-aware wearable display system are described, including a wearable device having a frame configured to be worn, a display coupled to the frame, the display being disposed in a field of vision, a sensor configured to capture sensor data, and a communication facility configured to send the sensor data to another device and to receive social data to be presented on the display, the system also having an application configured to process the sensor data and to generate the social data using the sensor data.
Applications Claiming Priority (3)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US201361780892P | 2013-03-13 | 2013-03-13 | |
US14/205,138 US20150260989A1 (en) | 2014-03-11 | 2014-03-11 | Social data-aware wearable display system |
PCT/US2014/026861 WO2014160500A2 (fr) | 2013-03-13 | 2014-03-13 | Social data-aware wearable display system |
Publications (1)
Publication Number | Publication Date |
---|---|
EP2972560A2 true EP2972560A2 (fr) | 2016-01-20 |
Family
ID=51625650
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
EP14775050.9A Withdrawn EP2972560A2 (fr) | 2013-03-13 | 2014-03-13 | Système d'affichage vestimentaire compatible avec les données sociales |
Country Status (5)
Country | Link |
---|---|
EP (1) | EP2972560A2 (fr) |
AU (1) | AU2014243705A1 (fr) |
CA (1) | CA2906575A1 (fr) |
RU (1) | RU2015143311A (fr) |
WO (1) | WO2014160500A2 (fr) |
Families Citing this family (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US9905143B1 (en) * | 2016-12-01 | 2018-02-27 | Varjo Technologies Oy | Display apparatus and method of displaying using image renderers and optical combiners |
CN108966198A (zh) * | 2018-08-30 | 2018-12-07 | OPPO Guangdong Mobile Telecommunications Co., Ltd. | Network connection method and apparatus, smart glasses, and storage medium |
Family Cites Families (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO1999023524A1 (fr) * | 1997-10-30 | 1999-05-14 | The Microoptical Corporation | Eyeglass interface system |
US20110213664A1 (en) * | 2010-02-28 | 2011-09-01 | Osterhout Group, Inc. | Local advertising content on an interactive head-mounted eyepiece |
EP2801065A4 (fr) * | 2012-01-05 | 2015-08-05 | Visa Int Service Ass | Transaction visual capturing apparatuses, methods and systems |
-
2014
- 2014-03-13 EP EP14775050.9A patent/EP2972560A2/fr not_active Withdrawn
- 2014-03-13 RU RU2015143311A patent/RU2015143311A/ru not_active Application Discontinuation
- 2014-03-13 CA CA2906575A patent/CA2906575A1/fr not_active Abandoned
- 2014-03-13 WO PCT/US2014/026861 patent/WO2014160500A2/fr active Application Filing
- 2014-03-13 AU AU2014243705A patent/AU2014243705A1/en not_active Abandoned
Non-Patent Citations (1)
Title |
---|
See references of WO2014160500A3 * |
Also Published As
Publication number | Publication date |
---|---|
AU2014243705A1 (en) | 2015-11-05 |
CA2906575A1 (fr) | 2014-10-02 |
WO2014160500A2 (fr) | 2014-10-02 |
RU2015143311A (ru) | 2017-04-19 |
WO2014160500A3 (fr) | 2014-11-20 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US10962809B1 (en) | Eyewear device with finger activated touch sensor | |
US11333891B2 (en) | Wearable display apparatus having a light guide element that guides light from a display element and light from an outside | |
US9442567B2 (en) | Gaze swipe selection | |
US20140285402A1 (en) | Social data-aware wearable display system | |
EP3757718B1 (fr) | Wearable devices for mail processing and methods of use thereof | |
CN117356116A (zh) | Beacons for locating wearable devices and delivering content to wearable devices | |
US20180329209A1 (en) | Methods and systems of smart eyeglasses | |
US11726338B2 (en) | Head-mounted electronic display device with lens position sensing | |
US20160086382A1 (en) | Providing location occupancy analysis via a mixed reality device | |
US20170344107A1 (en) | Automatic view adjustments for computing devices based on interpupillary distances associated with their users | |
CN106575162B (zh) | Facilitating dynamic eye-torsion-based eye tracking on computing devices | |
KR20150091322A (ko) | Multi-touch interaction techniques on eyewear | |
EP3067782B1 (fr) | Information processing apparatus, control method, and program | |
EP3092523B1 (fr) | Wearable display apparatus | |
US11567569B2 (en) | Object selection based on eye tracking in wearable device | |
US20150260989A1 (en) | Social data-aware wearable display system | |
EP2972560A2 (fr) | Social data-aware wearable display system | |
KR101805749B1 (ko) | User authentication device | |
CN117425889A (zh) | Bend estimation as a biometric signal | |
EP4260591A1 (fr) | Leveraging cloud anchors in authentication | |
CN116325711A (zh) | Image capture eyewear with automatic sending of images to recipients determined based on environment | |
KR20180082729A (ко) | Wearable smart glasses device and method of displaying images using the same |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PUAI | Public reference made under article 153(3) epc to a published international application that has entered the european phase |
Free format text: ORIGINAL CODE: 0009012 |
|
17P | Request for examination filed |
Effective date: 20151013 |
|
AK | Designated contracting states |
Kind code of ref document: A2 Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR |
|
AX | Request for extension of the european patent |
Extension state: BA ME |
|
DAX | Request for extension of the european patent (deleted) | ||
STAA | Information on the status of an ep patent application or granted ep patent |
Free format text: STATUS: THE APPLICATION IS DEEMED TO BE WITHDRAWN |
|
18D | Application deemed to be withdrawn |
Effective date: 20161001 |