EP2057445A1 - Dispositif et procédé de simulation d'éclairage et d'ombres dans un système à réalité amplifiée - Google Patents

Dispositif et procédé de simulation d'éclairage et d'ombres dans un système à réalité amplifiée

Info

Publication number
EP2057445A1
Authority
EP
European Patent Office
Prior art keywords
sensor
illumination angle
unit
virtual
sensors
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Withdrawn
Application number
EP05778974A
Other languages
German (de)
English (en)
Inventor
Ankit Jamwal
Alexandra Musto
Reiner Müller
Günter Schrepfer
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Gigaset Communications GmbH
Original Assignee
Siemens AG
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Siemens AG filed Critical Siemens AG
Priority to EP05778974A priority Critical patent/EP2057445A1/fr
Publication of EP2057445A1 publication Critical patent/EP2057445A1/fr
Withdrawn legal-status Critical Current

Links

Classifications

    • GPHYSICS
    • G01MEASURING; TESTING
    • G01JMEASUREMENT OF INTENSITY, VELOCITY, SPECTRAL CONTENT, POLARISATION, PHASE OR PULSE CHARACTERISTICS OF INFRARED, VISIBLE OR ULTRAVIOLET LIGHT; COLORIMETRY; RADIATION PYROMETRY
    • G01J1/00Photometry, e.g. photographic exposure meter
    • G01J1/10Photometry, e.g. photographic exposure meter by comparison with reference light or electric value provisionally void
    • G01J1/16Photometry, e.g. photographic exposure meter by comparison with reference light or electric value provisionally void using electric radiation detectors
    • G01J1/1626Arrangements with two photodetectors, the signals of which are compared
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T15/003D [Three Dimensional] image rendering
    • G06T15/50Lighting effects
    • G06T15/60Shadow generation

Definitions

  • The present invention relates to a device and a method for light guidance in an augmented reality ("extended reality") system, and in particular to a device and a method for generating virtual shadow and/or virtual brightening areas for inserted virtual objects in accordance with the actual lighting conditions, which can be used in mobile terminals such as mobile phones or PDAs (Personal Digital Assistant).
  • Augmented reality represents a new field of technology in which, for example, the current visual perception of the real environment is superimposed in real time with additional visual information.
  • A user simultaneously perceives both the real environment and the virtual image components generated, for example, by computer graphics as a combined representation (summation image).
  • This mixing of real and virtual image constituents into an "augmented reality" enables the user to carry out his actions by directly including the superimposed and thus simultaneously perceptible additional information.
  • In order for an augmented reality to appear as realistic as possible, a significant problem lies in determining the real lighting conditions so that the virtual lighting conditions, or the so-called light guidance, for the virtual object to be inserted can be optimally adapted.
  • The adaptation of the virtual lighting conditions to the actual lighting conditions is understood below to mean, in particular, the insertion of virtual shadow and/or brightening areas for the virtual object to be inserted.
  • In known approaches, the illumination direction is measured dynamically by means of image processing, a specially shaped object such as a "shadow catcher" being placed in the scene and the shadows cast onto this object by itself being evaluated.
  • However, this object or "shadow catcher" must always be visible in the image when changes in the lighting take place, which is not practical, in particular for mobile augmented reality systems.
  • The invention is therefore based on the object of providing a device and a method for light guidance in an augmented reality ("extended reality") system which are simple and user-friendly and can be used in particular for mobile applications.
  • A data processing unit can determine an illumination angle with respect to the optical axis of the recording unit on the basis of the previously known sensor positioning, the sensor orientation, the properties of the sensor directional diagram and the detected sensor output signals.
  • The light guidance, i.e. a virtual shadow and/or a virtual brightening area for the virtual object, can subsequently be inserted into the display unit. In this way, a very realistic light guidance for the virtual object is obtained with minimal effort.
  • A one-dimensional illumination angle is determined by forming the ratio of two sensor output signals, taking into account the sensor directional diagram and the sensor orientation.
  • A spatial illumination angle may also be estimated on the basis of only one one-dimensional illumination angle together with the time of day; in particular, in a daylight environment, the time-of-day-dependent sun altitude, i.e. the vertical illumination angle, can be taken into account.
  • A detection unit for detecting a color temperature of the present illumination and an analysis unit for analyzing the color temperature can be used, the detection unit preferably being implemented by the already existing recording unit or camera.
  • The directional diagrams of the sensors are preferably identical, and the distances between the sensors are preferably as large as possible.
  • The determination of the illumination angle as a function of the recording unit is carried out continuously with respect to a time axis, as a result of which a particularly realistic light guidance can be generated for the virtual objects.
  • The sensors with their sensor orientations and associated directional diagrams can preferably be arranged rotatably. Furthermore, a threshold decision unit for determining a uniqueness of an illumination angle can be provided, the virtual light guidance being switched off in the absence of uniqueness. Accordingly, in the case of diffuse lighting conditions or lighting conditions with a large number of light sources distributed in space, no virtual shadow and/or brightening areas are generated for the virtual object.
  • In the method according to the invention, a real object is first recorded with a recording unit which has an optical axis and is displayed on a display unit.
  • A virtual object to be inserted is generated by a data processing unit and is likewise displayed on the display unit, superimposed on the real object.
  • By means of at least two photosensitive sensors, each having a previously known sensor positioning, sensor orientation and sensor directional diagram, the present illumination is subsequently detected and output in each case as sensor output signals.
  • From these, an illumination angle with respect to the optical axis is determined, and a light guidance is carried out, i.e. virtual shadow and/or virtual brightening areas for the virtual object are inserted depending on the determined illumination angle.
  • FIG. 1 shows a simplified representation of an application case for the method according to the invention and the associated device for carrying out a light guidance in an augmented reality system;
  • FIG. 2 shows a simplified representation of the device according to FIG. 1 in order to illustrate the mode of operation of the sensor directional diagrams of the sensors when determining an illumination angle;
  • FIG. 3 shows a simplified representation to illustrate the one-dimensional illumination angle determined in an augmented reality system according to the invention;
  • FIG. 4 shows a simplified representation to illustrate the determination of a spatial illumination angle by means of two one-dimensional illumination angles.
  • FIG. 1 shows a simplified representation of an "Augmented Reality System” or “augmented reality” system, as may be implemented, for example, in a mobile terminal and in particular a mobile telecommunication terminal or mobile phone H, respectively.
  • an image of a real environment or a real object RO to be recorded is recorded by a camera or recording unit AE integrated in the mobile terminal H with an associated real shadow RS and displayed on a display unit I.
  • a so-called virtual object VO is superimposed on the recorded real object with its ZUT-associated shadow, which may be a flowerpot, for example, resulting in an augmented reality or the so-called augmented reality.
  • the real object RO with its associated real shadow RS and the virtual object VO can also represent any other objects.
  • Furthermore, a light source L is represented, for example in the form of an incandescent lamp, which is primarily responsible for illuminating the real environment or the real object RO and thus generates the real shadow or shadow area RS associated with the real object RO.
  • Since such a real shadow RS also changes correspondingly with a change in the illumination conditions, for example is shortened, lengthened or rotated by a predetermined angle, such illumination conditions must also be taken into account in a so-called light guidance for the virtual object VO.
  • Accordingly, not only is the virtual object VO added to the real environment represented on the display unit I, but a corresponding virtual light guidance, i.e. for example a virtual shadow VS of the virtual object VO and/or a virtual brightening area VA of the virtual object VO, is also supplemented depending on the respective lighting conditions.
  • In order to enable such a light guidance for the virtual object VO, at least two photosensitive sensors S are provided, each having a previously known sensor directional diagram with a known sensor orientation as well as a previously known sensor positioning.
  • The sensor output signals, or the amplitude values output at the respective sensors, can then be evaluated in such a way that an illumination angle with respect to the optical axis of the recording unit AE can be determined, on the basis of which a virtual light guidance in the image of the display unit I for the virtual object VO, i.e. the generation of a virtual shadow area VS and/or a virtual brightening area VA, can be carried out.
  • This calculation is performed, for example, by a data processing unit already present in the mobile telecommunication terminal H, which is also responsible, for example, for connection establishment and termination as well as a multiplicity of further functionalities of the mobile terminal H.
  • FIG. 2 shows a simplified illustration of the basic mode of operation in the determination of an illumination angle, as it is required for the light guidance according to the invention or for the generation of virtual shadow and virtual brightening areas.
  • According to FIG. 2, the recording unit AE, or a conventional camera, and at least two photosensitive sensors S1 and S2 are arranged, for example, on the housing surface of the mobile terminal H.
  • The recording unit AE has an optical axis OA, which below constitutes the reference axis for the illumination angle α to be determined with respect to a light source L.
  • The sensors S1 and S2 have a previously known sensor positioning and, according to FIG. 2, are at previously known distances d1 and d2 from the recording unit AE. Furthermore, the sensors S1 and S2 have previously known sensor orientations SA1 and SA2 with respect to the optical axis OA of the recording unit, each of which is correlated with a respective known directional diagram RD1 and RD2. According to FIG. 2, the sensor orientations SA1 and SA2 are parallel to the optical axis OA of the recording unit, resulting in a simplified calculation of the one-dimensional illumination angle α.
  • The curves of the directional diagrams RD1 and RD2 are elliptical according to FIG. 2 and have an elliptical lobe shape in a spatial representation.
  • A distance from the sensor to the edge of the elliptic curve, or of the spatial elliptical lobe, of the sensor directional diagram corresponds to the amplitude of a sensor output signal SS1 or SS2 which is output at the sensor when light from the light source L falls on the sensors S1 and S2 at a corresponding angle β1 or β2 to the sensor orientation SA1 or SA2.
  • The amplitude of the sensor output signal SS1 or SS2 is thus a direct measure of the angle β1 or β2, which is why, with knowledge of the properties of the directional diagrams RD1 and RD2 (or of their curve shapes), of the sensor positions or distances d1 and d2, and of the sensor orientations SA1 and SA2 with respect to the optical axis OA, a one-dimensional illumination angle α can be uniquely determined.
  • Depending on this determined illumination angle α, the corresponding virtual light guidance can now be performed and, for example, a virtual shadow area VS and/or a virtual brightening area VA can be inserted realistically, i.e. true to angle, into the image of the display unit I according to FIG. 1.
  • The photosensitive sensors S, or S1 and S2, can be realized, for example, by a photodiode, a phototransistor or other photosensitive elements which have a previously known directional diagram. A directional diagram can also be adjusted or shaped via a lens arrangement located in front of the photosensitive sensor.
  • By forming the ratio of the two sensor output signals SS1 and SS2, the resulting one-dimensional light incidence angle or illumination angle α in the plane defined by the two sensor elements S1 and S2 can therefore be determined, similar to the monopulse method in radar technology.
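  • Purely as an illustration of such a ratio evaluation, the following minimal Python sketch assumes identical Gaussian directional diagrams whose orientations are squinted by ±δ about the optical axis, as in amplitude-comparison monopulse; the Gaussian lobe model, the values of δ and σ, and all names are assumptions of this sketch and are not specified in the patent (FIG. 2 itself shows parallel sensor orientations with elliptical lobes).

```python
import math

def one_dim_illumination_angle(ss1: float, ss2: float,
                               squint_deg: float = 20.0,
                               sigma_deg: float = 40.0) -> float:
    """Estimate the one-dimensional illumination angle (in degrees, relative
    to the optical axis OA) from the two sensor output amplitudes SS1, SS2.

    Assumes identical Gaussian lobes g(beta) = exp(-beta^2 / (2 sigma^2))
    squinted by +/-squint_deg about the optical axis; for this model
    ln(SS1/SS2) is exactly linear in the illumination angle alpha:
        alpha = sigma^2 * ln(SS1 / SS2) / (2 * squint)
    """
    if ss1 <= 0.0 or ss2 <= 0.0:
        raise ValueError("sensor amplitudes must be positive")
    return sigma_deg ** 2 * math.log(ss1 / ss2) / (2.0 * squint_deg)

# Example: a slightly stronger signal on S1 places the source on S1's side,
# e.g. one_dim_illumination_angle(0.9, 0.7) -> approx. +10 degrees.
```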
  • Since only a one-dimensional illumination angle can be determined with two such light-sensitive sensors, according to the embodiment of FIG. 4 two such one-dimensional illumination angles are determined in order to obtain the spatial illumination angle required for a realistic light guidance.
  • In FIG. 4, two such arrangements as shown in FIGS. 2 and 3 are combined, so that one-dimensional illumination angles αy, for example in a y-direction, and αz, for example in a z-direction, can each be determined. This makes it possible to determine a resulting spatial illumination angle for a light source L in space.
  • For this purpose, a third photosensitive sensor is preferably arranged, for example on the housing surface of the mobile terminal H, such that it is located in a further plane.
  • It is arranged, for example, perpendicular to the x-y plane of the first two sensors, in an x-z or y-z plane, as a result of which a right-angled coordinate system results.
  • In this case, one of the three sensors is used twice to determine the two one-dimensional illumination angles αy and αz.
  • However, other sensor arrangements, and in particular a larger number of sensors, are also possible, as a result of which the accuracy or the detection range of the lighting conditions can be further improved.
  • In this case, the respective sensor orientations, sensor positionings and sensor directional diagrams are taken into account accordingly in the evaluation of the output sensor output signals.
  • A common method for determining the spatial illumination angle from two one-dimensional illumination angles is, for example, the triangulation method known from GPS (Global Positioning System) systems.
  • However, any other methods for determining a spatial illumination angle are also possible.
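  • As one simple example of such an "other method", the following sketch fuses the two one-dimensional angles αy and αz into a spatial light direction by elementary vector geometry, taking the optical axis OA as the x-axis; this construction is an assumption for illustration and is not necessarily the triangulation variant referred to above.

```python
import math

def spatial_light_direction(alpha_y_deg: float, alpha_z_deg: float):
    """Combine the one-dimensional illumination angles alpha_y (measured in
    the x-y plane) and alpha_z (measured in the x-z plane) into a unit vector
    pointing from the device towards the light source L, with the optical
    axis OA taken as the x-axis."""
    ty = math.tan(math.radians(alpha_y_deg))  # lateral offset per unit depth
    tz = math.tan(math.radians(alpha_z_deg))  # vertical offset per unit depth
    norm = math.sqrt(1.0 + ty * ty + tz * tz)
    return (1.0 / norm, ty / norm, tz / norm)
```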
  • Such a spatial illumination angle can, however, also be determined or estimated on the basis of only one one-dimensional illumination angle, provided that the plane of the two light-sensitive sensors necessary for this one-dimensional illumination angle is parallel to the horizon or the earth's surface and the main illumination source is realized by the sun or sunlight, as is usually the case, for example, in a daylight environment.
  • For this purpose, a time of day at a specific location is taken into consideration, from which a position of the sun, or a second illumination angle perpendicular to the earth's surface, can be estimated.
  • In this case, illumination changes taking place in the horizontal direction are detected by the two sensors S1 and S2, i.e. via the one-dimensional illumination angle α, while the lighting changes taking place in the vertical direction are derived from the instantaneous time of day.
  • For this purpose, a timer unit which is usually present in mobile terminals H, for example in the form of a clock with time-zone indication and daylight-saving-time consideration, is used.
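  • A minimal sketch of such a time-of-day based estimate is given below, using the standard astronomical approximation of the solar elevation from the day of the year, the local solar time and the latitude; the concrete formula and parameter names are assumptions of this sketch, since the patent only requires that some time-of-day-dependent sun altitude be taken into account.

```python
import math

def sun_elevation_deg(day_of_year: int, solar_hour: float,
                      latitude_deg: float) -> float:
    """Rough solar elevation (vertical illumination angle) for a given day of
    the year, local solar time in hours and geographic latitude.  Time-zone
    and daylight-saving corrections (as provided by the timer unit) are
    assumed to have been applied already when computing solar_hour."""
    # Solar declination, textbook approximation (+/-23.44 degrees over a year).
    decl = math.radians(
        -23.44 * math.cos(math.radians(360.0 / 365.0 * (day_of_year + 10))))
    # Hour angle: 15 degrees per hour away from solar noon.
    hour_angle = math.radians(15.0 * (solar_hour - 12.0))
    lat = math.radians(latitude_deg)
    sin_h = (math.sin(lat) * math.sin(decl) +
             math.cos(lat) * math.cos(decl) * math.cos(hour_angle))
    return math.degrees(math.asin(sin_h))

# Example: around noon on June 21 at 48 degrees north the sun stands at
# roughly 65 degrees above the horizon: sun_elevation_deg(172, 12.0, 48.0)
```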
  • For determining a daylight or artificial-light environment, a detection unit for detecting a color temperature of the present illumination can be provided, an analysis unit analyzing or evaluating the detected color temperature. Since the conventional recording units or cameras used in mobile terminals H generally provide such information regarding a color temperature in any case, the recording unit AE and the data processing unit of the mobile terminal H are preferably used as the detection unit for the color temperature. Owing to the use of already existing timer units and recording units, a particularly simple and cost-effective implementation results for this second embodiment.
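  • The analysis of the detected color temperature can then be as simple as a threshold decision; the sketch below is an assumed illustration, and the 4500 K boundary is not taken from the patent (incandescent lamps typically lie around 2700-3000 K, daylight around 5000-6500 K).

```python
def is_daylight(color_temperature_k: float,
                daylight_threshold_k: float = 4500.0) -> bool:
    """Crude daylight/artificial-light decision based on the color
    temperature reported by the camera's white-balance estimation."""
    return color_temperature_k >= daylight_threshold_k
```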
  • As mentioned above, the properties or curves according to FIG. 2 of the sensor directional diagrams of the sensors S used are preferably identical, and the distances between the sensors are as large as possible.
  • Preferably, the determination of the illumination angle is carried out continuously with respect to time as a function of the recording unit AE; more precisely, for each image of an acquired sequence of images, the associated calculations and a corresponding light guidance are carried out. In principle, however, in order to save resources such as computing capacity, such calculations can also be limited to predetermined intervals which are independent of the functionality of the recording unit.
  • The sensors with their previously known sensor orientations and associated sensor directional diagrams can also be arranged rotatably.
  • In this case, the changing angle values for the sensor orientations must also be recorded and transmitted to the data processing unit for compensation or consideration.
  • Furthermore, a threshold decision unit can be provided for determining a uniqueness of an illumination angle and thus of the lighting conditions, in which case, in the absence of uniqueness, the virtual light guidance for the virtual objects is switched off, i.e. no virtual shadow and/or virtual brightening areas are generated in the image of the display unit.
  • Faulty virtual light guidance can thereby be prevented, which in turn permits a very realistic depiction of virtual objects.
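  • A threshold decision of this kind could, for example, compare the contrast between the sensor amplitudes against a minimum value, as in the following sketch; the contrast measure and the threshold of 0.2 are illustrative assumptions, since the patent does not specify the decision criterion.

```python
def virtual_shading_enabled(sensor_amplitudes, min_contrast: float = 0.2) -> bool:
    """Enable the virtual light guidance (shadow/brightening areas) only when
    the sensor amplitudes show enough contrast to indicate one dominant light
    direction; under diffuse illumination all sensors receive similar
    amplitudes and the contrast tends towards zero."""
    hi, lo = max(sensor_amplitudes), min(sensor_amplitudes)
    if hi <= 0.0:
        return False  # no light detected at all
    contrast = (hi - lo) / hi
    return contrast >= min_contrast
```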
  • The present invention has been described above with reference to a mobile telecommunication terminal such as a mobile phone H. However, it is not limited thereto and includes in the same way other mobile devices such as PDAs (Personal Digital Assistant); it can also be applied to stationary augmented reality systems. Furthermore, the present invention has been described with reference to a single light source such as an incandescent lamp or the sun. However, the invention is not limited thereto, but in the same way also includes other main light sources, which can be composed of a multiplicity of light sources, or other types of light sources. Furthermore, the invention has been described with reference to two or three light-sensitive sensors for determining an illumination angle. However, it is not limited thereto, but equally includes systems with a multiplicity of photosensitive sensors which can be arbitrarily positioned and aligned with respect to the recording unit AE and the optical axis OA.

Abstract

The invention relates to a device and a method for light guidance in an augmented reality system. A recording unit (AE) having an optical axis records a real object (RO, RS) and displays it on a display unit (I). A data processing unit generates a virtual object (VO) and likewise displays this virtual object (VO) on the display unit (I). From data relating to at least two photosensitive sensors (S), in particular a known positioning, an orientation, a directional diagram and an output signal, an illumination angle is determined, and the light guidance for the virtual object (VO) in the display unit (I) is carried out as a function of this illumination angle.
EP05778974A 2004-10-13 2005-07-05 Dispositif et procédé de simulation d'éclairage et d'ombres dans un système à réalité amplifiée Withdrawn EP2057445A1 (fr)

Priority Applications (1)

Application Number Priority Date Filing Date Title
EP05778974A EP2057445A1 (fr) 2004-10-13 2005-07-05 Dispositif et procédé de simulation d'éclairage et d'ombres dans un système à réalité amplifiée

Applications Claiming Priority (3)

Application Number Priority Date Filing Date Title
EP04024431 2004-10-13
PCT/EP2005/053194 WO2006040200A1 (fr) 2004-10-13 2005-07-05 Dispositif et procede de simulation d'eclairage et d'ombre dans un systeme de realite augmentee
EP05778974A EP2057445A1 (fr) 2004-10-13 2005-07-05 Dispositif et procédé de simulation d'éclairage et d'ombres dans un système à réalité amplifiée

Publications (1)

Publication Number Publication Date
EP2057445A1 true EP2057445A1 (fr) 2009-05-13

Family

ID=34926981

Family Applications (1)

Application Number Title Priority Date Filing Date
EP05778974A Withdrawn EP2057445A1 (fr) 2004-10-13 2005-07-05 Dispositif et procédé de simulation d'éclairage et d'ombres dans un système à réalité amplifiée

Country Status (5)

Country Link
US (1) US20080211813A1 (fr)
EP (1) EP2057445A1 (fr)
JP (1) JP2008516352A (fr)
TW (1) TW200614097A (fr)
WO (1) WO2006040200A1 (fr)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108320320A (zh) * 2018-01-25 2018-07-24 重庆爱奇艺智能科技有限公司 一种信息显示方法、装置及设备

Families Citing this family (54)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP1913559B1 (fr) * 2005-08-09 2016-10-19 Qualcomm Connected Experiences, Inc. Procede et dispositifs pour visualiser un modele numerique dans un environnement reel
US8930834B2 (en) * 2006-03-20 2015-01-06 Microsoft Corporation Variable orientation user interface
US8139059B2 (en) * 2006-03-31 2012-03-20 Microsoft Corporation Object illumination in a virtual environment
US20080040692A1 (en) * 2006-06-29 2008-02-14 Microsoft Corporation Gesture input
US9171399B2 (en) * 2013-03-12 2015-10-27 Autodesk, Inc. Shadow rendering in a 3D scene based on physical light sources
DE102008012066A1 (de) * 2008-02-29 2009-09-10 Navigon Ag Verfahren zum Betrieb einer Navigationseinrichtung
US8847956B2 (en) * 2008-03-10 2014-09-30 Koninklijke Philips N.V. Method and apparatus for modifying a digital image
WO2009141497A1 (fr) * 2008-05-22 2009-11-26 Nokia Corporation Dispositif et procédé pour afficher et mettre à jour des objets graphiques en fonction du mouvement d'un dispositif
JP2010008289A (ja) * 2008-06-27 2010-01-14 Sharp Corp 携帯端末装置
JP2010033367A (ja) * 2008-07-29 2010-02-12 Canon Inc 情報処理装置及び情報処理方法
US20100066750A1 (en) * 2008-09-16 2010-03-18 Motorola, Inc. Mobile virtual and augmented reality system
US8797321B1 (en) 2009-04-01 2014-08-05 Microsoft Corporation Augmented lighting environments
US8405658B2 (en) * 2009-09-14 2013-03-26 Autodesk, Inc. Estimation of light color and direction for augmented reality applications
KR101082285B1 (ko) * 2010-01-29 2011-11-09 주식회사 팬택 증강 현실 제공 단말기 및 방법
US20110234631A1 (en) * 2010-03-25 2011-09-29 Bizmodeline Co., Ltd. Augmented reality systems
US8878773B1 (en) 2010-05-24 2014-11-04 Amazon Technologies, Inc. Determining relative motion as input
US8502659B2 (en) 2010-07-30 2013-08-06 Gravity Jack, Inc. Augmented reality and location determination methods and apparatus
US8493206B2 (en) 2010-07-30 2013-07-23 Gravity Jack, Inc. Augmented reality and location determination methods and apparatus
US8519844B2 (en) * 2010-07-30 2013-08-27 Gravity Jack, Inc. Augmented reality and location determination methods and apparatus
US9489102B2 (en) 2010-10-22 2016-11-08 Hewlett-Packard Development Company, L.P. System and method of modifying lighting in a display system
US8854802B2 (en) 2010-10-22 2014-10-07 Hewlett-Packard Development Company, L.P. Display with rotatable display screen
WO2012054063A1 (fr) 2010-10-22 2012-04-26 Hewlett-Packard Development Company L.P. Système d'affichage à réalité augmentée et procédé d'affichage
US20120135783A1 (en) * 2010-11-29 2012-05-31 Google Inc. Mobile device image feedback
KR20120057799A (ko) * 2010-11-29 2012-06-07 삼성전자주식회사 휴대단말에서 사전 기능 제공 방법 및 장치
JP2012120067A (ja) * 2010-12-03 2012-06-21 Brother Ind Ltd シースルー型画像表示装置およびシースルー型画像表示方法
US8872854B1 (en) * 2011-03-24 2014-10-28 David A. Levitt Methods for real-time navigation and display of virtual worlds
US8643703B1 (en) 2011-03-30 2014-02-04 Amazon Technologies, Inc. Viewer tracking image display
US8810598B2 (en) 2011-04-08 2014-08-19 Nant Holdings Ip, Llc Interference based augmented reality hosting platforms
US9449427B1 (en) 2011-05-13 2016-09-20 Amazon Technologies, Inc. Intensity modeling for rendering realistic images
US9041734B2 (en) * 2011-07-12 2015-05-26 Amazon Technologies, Inc. Simulating three-dimensional features
US9852135B1 (en) 2011-11-29 2017-12-26 Amazon Technologies, Inc. Context-aware caching
US9903830B2 (en) 2011-12-29 2018-02-27 Lifescan Scotland Limited Accurate analyte measurements for electrochemical test strip based on sensed physical characteristic(s) of the sample containing the analyte
US11073959B2 (en) * 2012-06-08 2021-07-27 Apple Inc. Simulating physical materials and light interaction in a user interface of a resource-constrained device
US9157883B2 (en) 2013-03-07 2015-10-13 Lifescan Scotland Limited Methods and systems to determine fill direction and fill error in analyte measurements
KR20140122458A (ko) * 2013-04-10 2014-10-20 삼성전자주식회사 휴대 단말 장치의 화면 표시 방법 및 장치
US9466149B2 (en) * 2013-05-10 2016-10-11 Google Inc. Lighting of graphical objects based on environmental conditions
US10371660B2 (en) 2013-05-17 2019-08-06 Lifescan Ip Holdings, Llc Accurate analyte measurements for electrochemical test strip based on multiple calibration parameters
US10262462B2 (en) 2014-04-18 2019-04-16 Magic Leap, Inc. Systems and methods for augmented and virtual reality
US9269012B2 (en) 2013-08-22 2016-02-23 Amazon Technologies, Inc. Multi-tracker object tracking
US9243276B2 (en) 2013-08-29 2016-01-26 Lifescan Scotland Limited Method and system to determine hematocrit-insensitive glucose values in a fluid sample
US9459231B2 (en) 2013-08-29 2016-10-04 Lifescan Scotland Limited Method and system to determine erroneous measurement signals during a test measurement sequence
US9582516B2 (en) 2013-10-17 2017-02-28 Nant Holdings Ip, Llc Wide area augmented reality location-based services
US10096296B2 (en) * 2013-11-13 2018-10-09 Red Hat, Inc. Temporally adjusted application window drop shadows
CN103793063B (zh) * 2014-03-11 2016-06-08 哈尔滨工业大学 多通道增强现实系统
CN106133796B (zh) 2014-03-25 2019-07-16 苹果公司 用于在真实环境的视图中表示虚拟对象的方法和系统
US9857869B1 (en) 2014-06-17 2018-01-02 Amazon Technologies, Inc. Data optimization
CN104123743A (zh) * 2014-06-23 2014-10-29 联想(北京)有限公司 图像阴影添加方法及装置
US20160293142A1 (en) * 2015-03-31 2016-10-06 Upton Beall Bowden Graphical user interface (gui) shading based on context
DE102016006855A1 (de) 2016-06-04 2017-12-07 Audi Ag Verfahren zum Betreiben eines Anzeigesystems und Anzeigesystem
US10922878B2 (en) * 2017-10-04 2021-02-16 Google Llc Lighting for inserted content
US11302067B2 (en) * 2018-08-31 2022-04-12 Edx Technologies, Inc. Systems and method for realistic augmented reality (AR) lighting effects
US11189061B2 (en) 2019-06-25 2021-11-30 Universal City Studios Llc Systems and methods for virtual feature development
US11216665B2 (en) * 2019-08-15 2022-01-04 Disney Enterprises, Inc. Representation of real-world features in virtual space
WO2021109885A1 (fr) * 2019-12-06 2021-06-10 Guangdong Oppo Mobile Telecommunications Corp., Ltd. Détection de source de lumière pour des technologies de réalité étendue

Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
DE4423778A1 (de) * 1994-06-30 1996-01-04 Christian Steinbrucker Meßsystem mit vier photosensitiven Komponenten zum Bestimmen des räumlichen Fehlwinkels einer punktförmigen Lichtquelle bezogen auf eine Grundflächennormale

Family Cites Families (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH02115708A (ja) * 1988-10-25 1990-04-27 Matsushita Electric Ind Co Ltd 入射光源方向判定追尾装置
JP2606818Y2 (ja) * 1993-10-15 2001-01-29 カルソニックカンセイ株式会社 自動車用日射検出センサ
DE9418382U1 (de) * 1994-11-16 1996-03-21 Smit Michael Mischbildgenerator
JP3671478B2 (ja) * 1995-11-09 2005-07-13 株式会社デンソー 車両の日射検出装置及び車両用空気調和装置
DE19838460A1 (de) * 1998-08-25 2000-03-09 Daimler Chrysler Ag Einrichtung zur Bestimmung des Einfallswinkels einer Lichtquelle, insbesondere der Sonne
JP3486575B2 (ja) * 1999-08-31 2004-01-13 キヤノン株式会社 複合現実感提示装置およびその方法並びに記憶媒体
US6903707B2 (en) * 2000-08-09 2005-06-07 Information Decision Technologies, Llc Method for using a motorized camera mount for tracking in augmented reality
US7071898B2 (en) * 2002-07-18 2006-07-04 Information Decision Technologies, Llc Method for using a wireless motorized camera mount for tracking in augmented reality
JP2003287434A (ja) * 2002-01-25 2003-10-10 Iwane Kenkyusho:Kk 画像情報検索システム
US7042421B2 (en) * 2002-07-18 2006-05-09 Information Decision Technologies, Llc. Method for advanced imaging in augmented reality
US7209577B2 (en) * 2005-07-14 2007-04-24 Logitech Europe S.A. Facial feature-localized and global real-time video morphing

Patent Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
DE4423778A1 (de) * 1994-06-30 1996-01-04 Christian Steinbrucker Meßsystem mit vier photosensitiven Komponenten zum Bestimmen des räumlichen Fehlwinkels einer punktförmigen Lichtquelle bezogen auf eine Grundflächennormale

Non-Patent Citations (4)

* Cited by examiner, † Cited by third party
Title
BABA M ET AL: "An advanced rangefinder equipped with a new image sensor with the ability to detect the incident angle of a light stripe", JOURNAL OF OPTICS. A, PURE AND APPLIED OPTICS, INSTITUTE OF PHYSICS PUBLISHING, BRISTOL, GB, vol. 6, no. 1, 1 January 2004 (2004-01-01), pages 10 - 16, XP020081520, ISSN: 1464-4258 *
HIROTO MATSUOKA ET AL: "Regeneration of Real Objects in the Real World", INTERNATIONAL CONFERENCE ON COMPUTER GRAPHICS AND INTERACTIVE TECHNIQUES ACM SIGGRAPH 2002; JULY 21 - 26, 2002, SAN ANTONIO, TEXAS, USA, ACM NEW YORK, 21 July 2002 (2002-07-21), pages 243, XP007910276, ISBN: 978-1-58113-525-1, Retrieved from the Internet <URL:http://portal.acm.org/citation.cfm?id=1242256> [retrieved on 20091022] *
MATSUOKA H ET AL: "Environment mapping for objects in the real world: a trial using artoolkit", AGUMENTED REALITY TOOLKIT, THE FIRST IEEE INTERNATIONAL WORKSHOP SEP. 29, 2002, PISCATAWAY, NJ, USA,IEEE, 1 January 2002 (2002-01-01), pages 70 - 71, XP010620353, ISBN: 978-0-7803-7680-9 *
See also references of WO2006040200A1 *

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108320320A (zh) * 2018-01-25 2018-07-24 重庆爱奇艺智能科技有限公司 一种信息显示方法、装置及设备
CN108320320B (zh) * 2018-01-25 2021-04-20 重庆爱奇艺智能科技有限公司 一种信息显示方法、装置及设备

Also Published As

Publication number Publication date
US20080211813A1 (en) 2008-09-04
TW200614097A (en) 2006-05-01
WO2006040200A1 (fr) 2006-04-20
JP2008516352A (ja) 2008-05-15

Similar Documents

Publication Publication Date Title
EP2057445A1 (fr) Dispositif et procédé de simulation d'éclairage et d'ombres dans un système à réalité amplifiée
DE60205662T2 (de) Vorrichtung und Verfahren zur Berechnung einer Position einer Anzeige
DE602004001500T2 (de) Apparat für dreidimensionale Messungen
CN102737370B (zh) 检测图像前景的方法及设备
EP2400261A1 (fr) Procédé de mesure optique et système de mesure destiné à la détermination de coordonnées 3D sur la surface d'un objet de mesure
DE102008016215A1 (de) Informationsvorrichtungsbediengerät
DE102009029391B4 (de) Bildverarbeitungsgerät und Bildverarbeitungsverfahren
DE10058244A1 (de) Messverfahren zur Ermittlung der Position eines Objektes vor einem Bildschirm und Vorrichtung zur Durchführung des Verfahrens
DE112013000590T5 (de) Verbesserter Konstrast zur Objekterfassung und Charaktersierung durch optisches Abbilden
CH695121A5 (de) Verfahren und Anordnung zur Durchführung von geodätischen Messungen mittels Videotachymeter.
WO2005101308A2 (fr) Procede de compensation de rotation d'images spheriques
WO2002006851A1 (fr) Procede pour determiner la distance de visibilite
DE202013011910U1 (de) Vermessungssystem zur Vermessung von Gliedmaßen
DE102013211492A1 (de) Bestimmung eines Messfehlers
DE112004001034T5 (de) 3D- und 2D-Meßsystem und -verfahren mit erhöhter Sensitivität und erhöhtem Dynamikbereich
DE112017001464B4 (de) Abstandsmessvorrichtung und Abstandsmessverfahren
DE102008023439B4 (de) Augmented Reality Fernglas zur Navigationsunterstützung
DE112008003807T5 (de) Ferngesteuertes Zeigen
DE102004008904A1 (de) Vorrichtung und Verfahren zur Bestimmung von Raumkoordinaten eines Objekts
DE102018104913A1 (de) Schwingungsüberwachung eines Objekts mittels Videokamera
DE3049397A1 (de) Verfahren und vorrichtung zum scharf-einstellen von fotographischen apparaten
EP1533629A2 (fr) Télémétrie à terminal mobile
DE10153113A1 (de) Verfahren und Vorrichtung zur Entfernungsbestimmung
EP3712696B1 (fr) Dispositif et procédé d'optimisation des propriétés d'une prise de vue photographique
DE3329603A1 (de) Anordnung zur automatischen scharfeinstellung fotografischer kameras

Legal Events

Date Code Title Description
PUAI Public reference made under article 153(3) epc to a published international application that has entered the european phase

Free format text: ORIGINAL CODE: 0009012

17P Request for examination filed

Effective date: 20070129

AK Designated contracting states

Kind code of ref document: A1

Designated state(s): AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HU IE IS IT LI LT LU LV MC NL PL PT RO SE SI SK TR

RAP1 Party data changed (applicant data changed or rights of an application transferred)

Owner name: GIGASET COMMUNICATIONS GMBH

17Q First examination report despatched

Effective date: 20091104

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: THE APPLICATION IS DEEMED TO BE WITHDRAWN

18D Application deemed to be withdrawn

Effective date: 20100515