WO2014058357A1 - Methods and apparatus for presenting contextually relevant data in augmented reality - Google Patents

Methods and apparatus for presenting contextually relevant data in augmented reality

Info

Publication number
WO2014058357A1
Authority
WO
WIPO (PCT)
Prior art keywords
data
user
vehicle
location
sensor
Prior art date
2012-10-08
Application number
PCT/SE2012/051072
Other languages
English (en)
Inventor
Joakim Söderberg
Original Assignee
Telefonaktiebolaget L M Ericsson (Publ)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
2012-10-08
Filing date
2012-10-08
Publication date
2014-04-17
Application filed by Telefonaktiebolaget L M Ericsson (Publ) filed Critical Telefonaktiebolaget L M Ericsson (Publ)
Priority to PCT/SE2012/051072
Publication of WO2014058357A1

Classifications

    • G PHYSICS
    • G02 OPTICS
    • G02B OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00 Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/01 Head-up displays
    • G02B27/017 Head mounted
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F16/00 Information retrieval; Database structures therefor; File system structures therefor
    • G06F16/90 Details of database functions independent of the retrieved data types
    • G06F16/95 Retrieval from the web
    • G06F16/953 Querying, e.g. by the use of web search engines
    • G06F16/9537 Spatial or temporal dependent retrieval, e.g. spatiotemporal queries
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00 Scenes; Scene-specific elements
    • G06V20/20 Scenes; Scene-specific elements in augmented reality scenes
    • G PHYSICS
    • G02 OPTICS
    • G02B OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00 Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/01 Head-up displays
    • G02B27/0101 Head-up displays characterised by optical features
    • G02B2027/0138 Head-up displays characterised by optical features comprising image capture systems, e.g. camera
    • G PHYSICS
    • G02 OPTICS
    • G02B OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00 Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/01 Head-up displays
    • G02B27/0101 Head-up displays characterised by optical features
    • G02B2027/014 Head-up displays characterised by optical features comprising information/image processing systems
    • G PHYSICS
    • G02 OPTICS
    • G02B OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00 Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/01 Head-up displays
    • G02B27/0179 Display position adjusting means not related to the information to be displayed
    • G02B2027/0187 Display position adjusting means not related to the information to be displayed slaved to motion of at least a part of the body of the user, e.g. head, eye

Definitions

  • Virtual reality technology is known.
  • Virtual reality technology provides computer-simulated environments. It is used in gaming or in training simulations, such as for pilots. Users can typically experience being in different, sometimes remote, environments, which can be places or situations in the real or an imaginary world. Users can wear a head-mounted display to experience the visually simulated environment, which can be highly visual and three dimensional. Speakers or headphones may also provide sounds to the user to enhance the experience or make it more realistic.
  • Users of augmented reality apparatus are provided with data that is contextually relevant to the user and to the user's location and/or situation.
  • The information that is provided to a user is specific to the user and to his or her particular location and situation, which makes the information especially convenient for the user.
  • The information provided to the user is not generic - in other words, different users in the same location are provided with information that is "customized" to their preferences, habits, etc.
  • A network node comprises: a receiving means for receiving data from at least one sensor associated with a user; a processor for enriching the received sensor data with pre-stored data corresponding to the user to obtain data that is contextually relevant for the user; a memory for storing the received sensor data, the pre-stored data and the enriched data; and a transmitting means for transmitting the enriched data to an augmented reality apparatus associated with the user. A sketch of this pipeline follows.
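As a loose illustration of this pipeline, the following Python sketch models the receive/enrich/store/transmit steps. All names here (SensorReading, EnrichmentNode, user_db, transmit) are hypothetical stand-ins invented for this example; the patent does not prescribe any particular implementation.

```python
from dataclasses import dataclass

@dataclass
class SensorReading:
    """One reading from a sensor associated with a user (hypothetical format)."""
    user_id: str
    sensor_type: str   # e.g. "gps", "air_quality", "camera"
    value: object      # raw payload, e.g. (lat, lon) for a GPS sensor

class EnrichmentNode:
    """Sketch of the network node: receive, enrich, store and transmit."""

    def __init__(self, user_db, transmit):
        self.user_db = user_db    # pre-stored data per user (preferences etc.)
        self.memory = []          # stores received, pre-stored and enriched data
        self.transmit = transmit  # callback delivering data to the AR apparatus

    def receive(self, reading: SensorReading) -> None:
        prestored = self.user_db.get(reading.user_id, {})
        # enrich the received sensor data with the pre-stored user data
        enriched = {"reading": reading, "prestored": prestored}
        self.memory.append(enriched)
        self.transmit(reading.user_id, enriched)
```

A transmit callback as simple as lambda user_id, data: print(user_id, data) is enough to exercise the sketch.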
  • FIG. 3 illustrates a node in accordance with exemplary embodiments.
  • FIG. 5 illustrates a vehicle in accordance with exemplary embodiments.
  • The invention can additionally be considered to be embodied entirely within any form of computer readable carrier, such as solid-state memory, magnetic disk, or optical disk containing an appropriate set of computer instructions that would cause a processor to carry out the techniques described herein.
  • Any such form of embodiment as described above may be referred to herein as "logic configured to" perform a described action, or alternatively as "logic that" performs a described action.
  • Mobile devices such as smartphones are currently being equipped with sensors.
  • The number and functionality of these sensors are ever increasing.
  • These sensors can be used to detect, inter alia, location and environmental parameters (including weather-related parameters) as well as objects and/or sounds in the user's vicinity.
  • A smartphone associated with a user can therefore be used to gather information about the user's location, actions/activities and identity.
  • The information provided to the user may be customized to user preferences or situations.
  • For example, museum data presented to the user may be filtered by a set of pre-stored user preferences.
  • The museum exhibits may include exhibits related to paintings, photography and sculpture.
  • A user may have a preference for paintings, and this preference may be pre-stored.
  • The information presented to the user would then highlight exhibit information related to paintings in this example - that is, information relating to the list of current exhibits has been filtered by the user's preference for paintings (see the sketch below).
  • The information presented to the user is therefore specific to the user, reflecting his or her preferences, condition, etc.
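To make the filtering step concrete, here is a minimal Python sketch of the museum example; the exhibit records and the preference set are invented for illustration.

```python
def filter_by_preferences(exhibits, preferences):
    """Keep only the exhibit information matching pre-stored user preferences."""
    return [e for e in exhibits if e["category"] in preferences]

exhibits = [
    {"title": "Impressionist Masters", "category": "paintings"},
    {"title": "Street Photography", "category": "photography"},
    {"title": "Modern Bronzes", "category": "sculpture"},
]
prestored_preferences = {"paintings"}  # this user's pre-stored preference

print(filter_by_preferences(exhibits, prestored_preferences))
# -> only the paintings exhibit is highlighted for this user
```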
  • A user may be in an area where environmental sensors detect the presence of a particular element or compound in the atmosphere. Sensors can also detect the levels of these elements/compounds. The presence of a particular element may not be of concern to everyone, but at certain levels it may affect a user who suffers from allergies or asthma, for example. As the user travels within this area, he or she may be presented not only with information about the area but also with warnings or advisories regarding the detected elements/compounds. The user may also be provided with instructions on how to handle the situation.
  • System 100 includes a (user) communication device 110 and an augmented reality (AR) apparatus 120, both associated with a user.
  • Communication device 110 may be a mobile communication device such as a smartphone.
  • Communication device 110 may include a plurality of sensors 115 (only one such sensor is illustrated).
  • AR apparatus 120 may include a projection means for superimposing data onto objects within a user's field of view.
  • System 100 also includes a network node 130 and data sources 140. The user communication device 110 and AR apparatus 120 can communicate with network node 130 over a network such as a communication network.
  • A data enrichment module 136 can enrich the extracted data. Enrichment may include comparing the extracted data, such as location coordinates, with data from other sources 140 to determine, for example, a particular user location.
  • Data from sources 140 can include weather-related data, public service announcements, traffic conditions, information about objects and areas in the user's vicinity (such as in the museum example above), etc. The location as determined can be used to obtain, for example, weather-related data for that particular location, which can then be provided to the user. A sketch of such an enrichment step follows.
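A sketch of what enrichment module 136 might do with location coordinates, assuming sources 140 expose lookup calls (resolve_location and fetch_weather are invented names, not interfaces from the patent):

```python
class Sources:
    """Hypothetical stand-in for data sources 140 (weather, traffic, ...)."""

    def resolve_location(self, lat, lon):
        # a real system would reverse-geocode against a location service
        return "Moderna Museet, Stockholm"

    def fetch_weather(self, lat, lon):
        # a real system would query a weather service for these coordinates
        return {"condition": "rain", "temperature_c": 8}

def enrich(extracted, sources):
    """Compare extracted sensor data with other sources to build enriched data."""
    enriched = dict(extracted)
    if "coordinates" in extracted:
        lat, lon = extracted["coordinates"]
        enriched["location"] = sources.resolve_location(lat, lon)  # where the user is
        enriched["weather"] = sources.fetch_weather(lat, lon)      # data for that place
    return enriched

print(enrich({"coordinates": (59.33, 18.08)}, Sources()))
```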
  • Node 130 can also communicate with data sources 140 via the communication interface 132.
  • The user preferences and settings may be stored in user database 135.
  • The enriched data may be provided to the communication device 110 or AR apparatus 120 via communication interface 132.
  • Node 130 can also include additional processors, memory, etc.
  • Node 130 can include another memory for storing at least one of: received sensor data, extracted data, enriched data, data from other sources, etc.
  • Information in the user database may include, for example, user preferences, user characteristics such as health records, and user account information such as banking, credit card and e-mail details. The nature and extent of this information may be limited by the user's willingness to provide it.
  • The data that is contextually relevant to the user is provided to the user via at least one sensory medium of an augmented reality user apparatus at 230.
  • The sensory medium can be video, audio or tactile, as in the dispatch sketch below.
  • The data can be projected onto objects within the user's field of view. This could include projecting the data onto a display within augmented reality apparatus 120. Data could also be projected onto objects being viewed by the user, using projector 125 for example.
  • An audio message may be provided to the user via speaker 125.
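A dispatch over the three sensory media could look like the sketch below; the apparatus attributes (projector, speaker, haptics) are hypothetical handles, not element numbers from the figures.

```python
def present(data, medium, apparatus):
    """Provide contextually relevant data via a chosen sensory medium."""
    if medium == "video":
        apparatus.projector.project(data)  # e.g. onto a display or viewed objects
    elif medium == "audio":
        apparatus.speaker.play(data)       # e.g. a spoken message
    elif medium == "tactile":
        apparatus.haptics.pulse(data)      # e.g. a vibration pattern
    else:
        raise ValueError(f"unsupported sensory medium: {medium}")
```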
  • The user action can be estimated at 223.
  • For example, a user's intention to use a debit card may be estimated.
  • The user intention can also be determined from past user activity, etc.
  • Pre-stored user data corresponding to the determined user identity can be retrieved at 224. Referring to the debit card example, the PIN code for the user is retrieved (sketched below).
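The estimation and retrieval steps (223, 224) might reduce to something like the following sketch. The event names and record layout are invented, and a real implementation would of course protect data such as PIN codes far more rigorously.

```python
def estimate_action(recent_events):
    """Estimate the user's intended action from recent sensor events (naive)."""
    if "card_reader_in_view" in recent_events:
        return "use_debit_card"
    return "unknown"

def retrieve_prestored(user_id, action, user_db):
    """Retrieve pre-stored user data matching the estimated action (step 224)."""
    record = user_db.get(user_id, {})
    if action == "use_debit_card":
        return record.get("debit_pin")  # sensitive: encrypted and consent-gated in practice
    return None

user_db = {"alice": {"debit_pin": "****"}}
action = estimate_action({"card_reader_in_view"})
print(retrieve_prestored("alice", action, user_db))
```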
  • Node 300 may be located on a network such as a radio network, a public network, a private network or a combination thereof.
  • Node 300 may include communication interfaces 310 (for receiving data) and 340 (for transmitting data), a processor 320 and computer readable medium 330 in the form of a memory.
  • The communication interfaces, the processor and the computer readable medium may all be interconnected via bus 350.
  • The network node may enrich the data as described above with respect to FIGs. 1 to 3.
  • Device 400 further includes a receiver 430 for receiving the enriched data.
  • The enriched data results from processing the transmitted data and filtering the processed data with pre-stored user data.
  • Device 400 includes a projecting means 440 for projecting the received enriched data.
  • Device 400 also includes a processor 450, memory 460 and bus 470, the functionality of each of which is known and is not described further herein. A minimal device-side sketch follows.
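On the apparatus side, the behaviour of device 400 reduces to a short receive-and-project loop; receiver and projector below are hypothetical stand-ins for elements 430 and 440.

```python
def run_device(receiver, projector):
    """Device 400 sketch: receive enriched data (430) and project it (440)."""
    while True:
        enriched = receiver.receive()  # blocks until enriched data arrives
        if enriched is None:           # a None sentinel ends the loop in this sketch
            break
        projector.project(enriched)    # superimpose onto the user's view
```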
  • Exemplary embodiments as described above may be implemented within a vehicle.
  • A vehicle could be, but is not limited to: a motorcycle, a car, a truck, a bus, a boat, a ship, a train or an airplane.
  • A plurality of sensors can be associated with a vehicle.
  • The vehicle sensors can be similar to those associated with a user communication device.
  • The vehicle sensors could also supplement (or substitute for) sensors associated with a communication device of a user associated with or traveling in the vehicle.
  • The augmented reality apparatus can also be associated with the vehicle. Vehicle preferences may be used to filter sensor data that has been processed.
  • Vehicle 500 can have an augmented reality apparatus 520 associated therewith.
  • Vehicle 500 can also include a user communication device 510 associated with an occupant of the vehicle.
  • The user can be the driver or owner of the car.
  • In a common carrier such as a bus, train or plane, the user can be a passenger or an operator of the common carrier.
  • Data from sensors 515 and 550, and images and the like detected by camera 525, can be processed in the manner described above to provide user(s) within vehicle 500 with contextually relevant data.
  • The data may be contextually relevant to the vehicle in some embodiments.
  • Vehicle preferences can be used to filter the data, as in the sketch below.
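Filtering by vehicle preferences could mirror the user-preference filtering sketched earlier; the topics and preference set here are invented for the example.

```python
def filter_for_vehicle(processed_data, vehicle_preferences):
    """Filter processed sensor data using pre-stored vehicle preferences."""
    return [d for d in processed_data if d["topic"] in vehicle_preferences]

vehicle_preferences = {"traffic", "fuel"}  # e.g. stored within the vehicle
processed_data = [
    {"topic": "traffic", "text": "Congestion ahead on the E4"},
    {"topic": "museums", "text": "New exhibit two blocks away"},
]
print(filter_for_vehicle(processed_data, vehicle_preferences))
# -> only the traffic item is presented inside the vehicle
```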
  • The data from vehicle 500 can be processed by a network node connected to the vehicle via a communication network.
  • Alternatively, the data can be processed within the vehicle.
  • The vehicle preferences can be stored within the vehicle.
  • The user or vehicle preferences can be stored within a memory device associated with the car, for example.
  • The car can be connected to a network to obtain the data from other sources.
  • The user device can be utilized to provide data that is specific to the user associated with that particular device.
  • The data provided can also be contextually relevant to the particular vehicle 500 from which the data was gathered.
  • The data may be provided by projector 527 onto a display 560 connected to AR apparatus 520.
  • An audio amplification means such as a speaker can also be included within the vehicle for presenting the data (in an audio format).
  • Multiple displays may be provided, with each display being available to one user or passenger.

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Databases & Information Systems (AREA)
  • Optics & Photonics (AREA)
  • Multimedia (AREA)
  • Data Mining & Analysis (AREA)
  • General Engineering & Computer Science (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

A method of presenting data in an augmented reality environment is disclosed, the method comprising the steps of receiving data from at least one sensor associated with a user, enriching the received sensor data with pre-stored data corresponding to the user to obtain data that is contextually relevant for the user, and presenting the contextually relevant data to the user via at least one sensory medium associated with an augmented reality user apparatus.
PCT/SE2012/051072 2012-10-08 2012-10-08 Methods and apparatus for presenting contextually relevant data in augmented reality WO2014058357A1 (fr)

Priority Applications (1)

Application Number Priority Date Filing Date Title
PCT/SE2012/051072 WO2014058357A1 (fr) 2012-10-08 2012-10-08 Methods and apparatus for presenting contextually relevant data in augmented reality

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/SE2012/051072 WO2014058357A1 (fr) 2012-10-08 2012-10-08 Methods and apparatus for presenting contextually relevant data in augmented reality

Publications (1)

Publication Number Publication Date
WO2014058357A1 (fr) 2014-04-17

Family

ID=50477688

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/SE2012/051072 WO2014058357A1 (fr) 2012-10-08 2012-10-08 Methods and apparatus for presenting contextually relevant data in augmented reality

Country Status (1)

Country Link
WO (1) WO2014058357A1 (fr)

Patent Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8131118B1 (en) * 2008-01-31 2012-03-06 Google Inc. Inferring locations from an image
US20110098056A1 (en) * 2009-10-28 2011-04-28 Rhoads Geoffrey B Intuitive computing methods and systems
US20110161076A1 (en) * 2009-12-31 2011-06-30 Davis Bruce L Intuitive Computing Methods and Systems
US20110164163A1 (en) * 2010-01-05 2011-07-07 Apple Inc. Synchronized, interactive augmented reality displays for multifunction devices
EP2400733A1 * 2010-06-28 2011-12-28 Lg Electronics Inc. Mobile terminal for displaying augmented reality information
EP2466258A1 * 2010-12-15 2012-06-20 The Boeing Company Methods and systems for augmented navigation
US20120203799A1 (en) * 2011-02-08 2012-08-09 Autonomy Corporation Ltd System to augment a visual data stream with user-specific content
US20120224060A1 (en) * 2011-02-10 2012-09-06 Integrated Night Vision Systems Inc. Reducing Driver Distraction Using a Heads-Up Display

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
AJANKI A ET AL.: "Contextual information access with Augmented Reality", MACHINE LEARNING FOR SIGNAL PROCESSING (MLSP), 2010 IEEE INTERNATIONAL WORKSHOP, 29 August 2010 (2010-08-29), pages 95 - 100, XP031765933 *

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20210233539A1 (en) * 2018-10-15 2021-07-29 Orcam Technologies Ltd. Using voice and visual signatures to identify objects
US11430216B2 (en) 2018-10-22 2022-08-30 Hewlett-Packard Development Company, L.P. Displaying data related to objects in images

Similar Documents

Publication Publication Date Title
US9418481B2 (en) Visual overlay for augmenting reality
US20060009702A1 (en) User support apparatus
CN103914139B (zh) Information processing device, information processing method, and program
KR101229078B1 (ko) Apparatus and method for operating mobile mixed-reality content based on indoor and outdoor context awareness
CN104252229B (zh) Device and method for detecting whether a driver is interested in an advertisement by tracking the driver's eye gaze
CN105874528B (zh) Information display terminal, information display system, and information display method
CN108027652A (zh) Information processing device, information processing method, and program
US20150094118A1 (en) Mobile device edge view display insert
US20140241585A1 (en) Systems, methods, and apparatus for obtaining information from an object attached to a vehicle
US20150088637A1 (en) Information processing system, information processing method, and non-transitory computer readable storage medium
US20120092370A1 (en) Apparatus and method for amalgamating markers and markerless objects
US20220129942A1 (en) Content output system, terminal device, content output method, and recording medium
JPWO2017163514A1 (ja) Glasses-type wearable terminal, control method therefor, and control program
TW201741630A (zh) Image processing method, apparatus, device, and user interface system
CN110348463A (zh) Method and apparatus for identifying a vehicle
TW201944324A (zh) Guidance system
CN116710878A (zh) Context-aware extended reality system
CN110998409A (zh) Augmented reality glasses, method for determining the pose of augmented reality glasses, and motor vehicle suited to using the glasses or the method
CN114096996A (zh) Method and apparatus for using augmented reality in traffic
WO2014058357A1 (fr) Methods and apparatus for presenting contextually relevant data in augmented reality
KR20150045465A (ko) Method for providing a person with information associated with an event
CN110321854A (zh) Method and apparatus for detecting a target object
KR20120070888A (ko) Reporting method, and electronic device and recording medium for performing the same
JP2011197276A (ja) Advertisement image display device and advertisement image display method
JP7119985B2 (ja) Map generation device, map generation system, map generation method, and map generation program

Legal Events

Date Code Title Description
121 EP: the EPO has been informed by WIPO that EP was designated in this application

Ref document number: 12886193

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 EP: PCT application non-entry in European phase

Ref document number: 12886193

Country of ref document: EP

Kind code of ref document: A1