WO2006135241A1 - Visual system - Google Patents
Visual system
- Publication number
- WO2006135241A1 (PCT/NO2005/000416)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- light
- infrared
- camera
- computer
- projection device
- Prior art date
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/14—Digital output to display device ; Cooperation and interconnection of the display device with other functional units
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/03—Arrangements for converting the position or the displacement of a member into a coded form
- G06F3/033—Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
- G06F3/0354—Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor with detection of 2D relative movements between the device, or an operating part thereof, and a plane or surface, e.g. 2D mice, trackballs, pens or pucks
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B21/00—Microscopes
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B27/00—Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
- G02B27/18—Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00 for optical projection, e.g. combination of mirror and condenser and objective
- G02B27/20—Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00 for optical projection, e.g. combination of mirror and condenser and objective for imaging minute objects, e.g. light-pointer
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/03—Arrangements for converting the position or the displacement of a member into a coded form
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/03—Arrangements for converting the position or the displacement of a member into a coded form
- G06F3/0304—Detection arrangements using opto-electronic means
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/03—Arrangements for converting the position or the displacement of a member into a coded form
- G06F3/033—Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
- G06F3/0354—Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor with detection of 2D relative movements between the device, or an operating part thereof, and a plane or surface, e.g. 2D mice, trackballs, pens or pucks
- G06F3/03545—Pens or stylus
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/03—Arrangements for converting the position or the displacement of a member into a coded form
- G06F3/033—Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
- G06F3/038—Control and interface arrangements therefor, e.g. drivers or device-embedded control circuitry
- G06F3/0386—Control and interface arrangements therefor, e.g. drivers or device-embedded control circuitry for light pen
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/03—Arrangements for converting the position or the displacement of a member into a coded form
- G06F3/041—Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
- G06F3/042—Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means by opto-electronic means
- G06F3/0425—Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means by opto-electronic means using a single imaging device like a video camera for tracking the absolute position of a single or a plurality of objects with respect to an imaged reference surface, e.g. video camera imaging a display or a projection screen, a table or a wall surface, on which a computer generated image is displayed or projected
Definitions
- This invention relates to the projection of an infrared or near infrared image onto a reference surface from a projection device placed on a guiding object, pen or pointer in order to be able to find and continuously follow the image position and also the orientation of the guiding object, pen or pointer relative to the reference surface. It also relates to an apparatus, a system and the method of finding and following the image position and the orientation of the guiding object, pen or pointer.
- the published international patent application WO0227461 relates to a drawing, writing and pointing device for data presentations, computer supported office work and/or other interactive use of a computer.
- the invention is also related to a camera and recognition system for interaction with a drawing, writing and pointing device for data presentations, computer supported office work and/or other interactive use of a computer.
- the prior invention is further related to a system for drawing, writing and pointing which is suitable for data presentations, computer supported office work and/or other interactive use of a computer, comprising a drawing, writing and pointing device, a camera and recognition system in addition to a computer system with a projection screen, a computer screen, a flat screen or a virtual screen.
- Each marked object comprises one or more code patterns.
- Each code pattern is adapted to be detected by a camera and recognition system.
- the code patterns are selected such that they have good auto-correlation and cross-correlation properties.
- the U.S. Pat. No. 5938308 describes a hand-held pointer which is able to project an image onto another image in order to assist a person in the description of this second image. This involves use of a laser source or another coherent light source and a diffractive optical element to produce a desired image such as a complex pattern image.
- the present invention in one aspect, relates to the projection of an infrared or near-infrared image or possibly a low intensity visible light image onto a reference surface from a projection device comprised by a guiding object, pen or pointer in order to be able to find and continuously follow the image position and also the orientation of the guiding object, pen or pointer relative to the reference surface.
- a projection device comprised by a guiding object, pen or pointer
- Reference surface in this context may be taken as any form of display surface or the like, comprising an arbitrary 3-dimensional surface, a plane surface, a rear projection screen (with a projector at the back side), a front projection or display screen, such as an LCD or CRT screen, oriented more or less vertically, as well as a table top surface.
- the projected image comprises one or more code patterns which have good auto-correlation and cross-correlation properties as described in WO0227461.
- the pattern image is made hardly visible or essentially invisible for a human being by using very low changes in intensity levels or by using infrared or near-infrared light at a relatively narrow wavelength band. This is obtained, inter alia, by providing a projection device comprising diffractive optical elements, being per se known in the art.
- the detection of the projected image position may be performed by a camera and recognition system as in WO0227461, to find and report the image position and orientation continuously as an input device to a computer system.
- the pattern image projected from the guiding object, the pen or the pointer is not directly visible, or is only hardly visible, to a human being.
- Hardly visible or essentially invisible in this context means that for all practical purposes the pattern image will be negligible within a total image displayed to an audience or user(s). Therefore the image will neither distract the presenter nor the audience, while the computer system which receives the position and orientation of this pattern can provide a desired visual feedback to the presenter and audience, such as an arrow, a drawing or writing stroke sequence or, e.g., a selection of an object on the computer-generated screen.
- the wavelength used can be in the infrared or near-infrared range, well outside the human vision wavelength range used in data projector systems.
- the energy required to make the image pattern detectable by the camera recognition system is much lower than what would be needed to make an arrow or logo of visible light clearly visible to the presenter and audience in the presence of the very high light intensity from the data projector.
- the camera recognition system may be equipped with an optical filter which blocks the projector light but passes through the infrared or near infrared light, making the detection process insensitive to other patterns in the data projector images of visible light.
- the infrared or near-infrared light also may be coded in the guiding object, pen or pointer and be decoded by an infrared remote control decoder placed along with the camera to give information of the actual state and actual events of the guiding object, pen or pointer. This may also include information regarding user identification.
- an infrared emitter may also be placed along with the camera and recognition system, while an infrared receiver and decoder is placed inside the guiding object, the pen or the pointer.
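The description leaves the actual coding of the infrared link open. Purely as an illustration of the idea of carrying pen state (button presses, user identification) over the same infrared channel, the Python sketch below encodes and decodes a hypothetical on/off-keying frame. The frame layout, bit lengths and function names are assumptions made for this example only and are not specified by the application.

```python
# Hypothetical on/off-keying frame for reporting pen state over the IR link.
# Assumed layout (illustration only): 4 start bits '1011', a 4-bit user id,
# 2 button bits, 1 even-parity bit. Carrier, modulation and timing are ignored.

START = [1, 0, 1, 1]

def encode_state(user_id: int, button1: bool, button2: bool) -> list[int]:
    bits = [(user_id >> i) & 1 for i in (3, 2, 1, 0)]   # 4-bit user id, MSB first
    bits += [int(button1), int(button2)]                # button states
    bits.append(sum(bits) % 2)                          # even parity over the payload
    return START + bits

def decode_state(frame: list[int]):
    if len(frame) != 11 or frame[:4] != START:
        return None                                     # not a valid frame
    payload, parity = frame[4:10], frame[10]
    if sum(payload) % 2 != parity:
        return None                                     # parity error
    user_id = int("".join(map(str, payload[:4])), 2)
    return {"user_id": user_id, "button1": bool(payload[4]), "button2": bool(payload[5])}

print(decode_state(encode_state(user_id=5, button1=True, button2=False)))
```

A real implementation would additionally define the carrier frequency, modulation and bit timing of the remote-control link, which this sketch deliberately leaves out.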
- FIG. 1 is a schematic axial section of a projection device for positioning using transmissive diffractive optical elements according to a preferred embodiment of the present invention.
- FIG. 2 is a schematic axial section of another preferred embodiment with the same optical alignment of optical elements as FIG. 1 but with a slimmer tip.
- FIG. 3 is a schematic axial section of a projection device for positioning using reflective diffractive optical element according to a preferred embodiment of the present invention.
- FIG. 4 is a schematic axial section of a projection device for positioning using transmissive diffractive optical element according to a preferred embodiment of the present invention.
- FIG. 5 shows the device operating close to the display screen.
- FIG. 6 shows the device operating distant to the display screen.
- FIG. 7 shows the device in a tilted orientation while operating close to the display screen.
- FIG. 8 shows the device in a tilted orientation while operating distant to the display screen.
- FIG. 9 illustrates a system according to the invention where the camera and recognition apparatus is placed before the front projection screen or possible other display screen, and the projection device is operating closely to the screen.
- FIG. 10 shows the system of Fig. 9, where the projection device is operating distantly from the screen.
- FIG. 11 illustrates a system where a camera and recognition apparatus is placed behind the back projection screen or possible other display screen, and the projection device is operating closely to the screen.
- FIG. 12 shows the system of Fig. 11, where the projection device is operating distantly from the screen.
- FIG. 13 is an example of the spectrum from a typical data projector for a given RGB-setting, displayed on a front projection screen 1 meter from the data projector and recorded using a calibrated spectrum analyzer.
- FIG. 14 illustrates a preferred type of pattern image to be projected by the projection device according to the present invention.
- FIG. 15 shows a phase pattern generated by the so-called "Gerchberg-Saxton algorithm" to iteratively obtain a phase pattern to be used as the diffractive optical element to give the desired intensity pattern as displayed in FIG. 14.
- FIG. 16 shows a simulation of the resulting intensity pattern obtained in the far-field, or alternatively, by using a system with lenses and/or curved mirrors, obtained in vicinity of the diffractive optical element.
- FIG. 17 shows the phase pattern generated for forming an asymmetric part of the desired intensity pattern of FIG. 14.
- FIG. 18 shows the resulting intensity pattern obtained in the far-field.
- FIG. 19 shows the phase grating with two discrete phase levels for the desired symmetric intensity pattern shown in FIG. 14.
- FIG. 20 shows the resulting intensity pattern obtained in the far-field.
- FIG. 21 shows the phase grating with two discrete phase levels for an asymmetric part of the desired intensity pattern of FIG. 14.
- FIG. 22 shows the resulting diffraction pattern with severe errors due to the inability of a phase-only grating with two discrete phase levels to produce an asymmetric pattern.
- FIG. 23 shows an example of a simple symmetric intensity pattern that can be made by a phase-only diffractive grating with two discrete phase levels.
- FIG. 24 shows another example of a simple symmetric intensity pattern that can be made by a phase-only diffractive grating with two discrete phase levels.
- near-infrared light or near-infrared radiation: electromagnetic radiation of wavelength 750 - 1300 nm.
- infrared light or infrared radiation: electromagnetic radiation of wavelength > 1300 nm.
- visible light or visible radiation: electromagnetic radiation of wavelength 400 - 750 nm, i.e., light normally detected by humans.
- an asymmetric pattern or image means a pattern or image that cannot be inverted according to the symmetry operation in DEFINITION (4) above so as to obtain the same or a very similar pattern or image. It is "asymmetric" (not symmetric) with respect to the symmetry operation defined in DEFINITION (4) above.
- reference surface is considered to comprise any form of display, such as an arbitrary 3-dimensional surface, a plane surface or a rear projection screen, or possibly a front projection screen. Instead of a more or less vertical surface there may also be a table top surface.
- the intensity distribution of the image is selected to have good auto-correlation and cross-correlation properties.
- the image formed from near-infrared or infrared light is in the sensitivity range of the camera system used to locate and track the pattern.
- the spectral output of a representative data projector is shown in FIG. 13. As can be seen, the light is confined to the wavelength range of approximately 400 - 700 nm.
- the light used to simultaneously project another image using the data projector does not interfere with the near-infrared or infrared image and its associated camera detection system, since the wavelengths used are different from those used by the data projector.
- an example of such a pattern image with good auto-correlation and cross-correlation properties as well as circular symmetry is shown in FIG. 14.
- the gray scale colors are inverted for document printing quality reasons, such that white and black are used to represent low and high light intensity, respectively.
- Only a fraction of the image as depicted in FIG. 14 is required to locate and track the center of the whole pattern based on the methods and systems described in WO0227461.
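The referenced camera and recognition system is that of WO0227461; as a generic illustration of why good auto-correlation and cross-correlation properties matter for locating the pattern, the NumPy sketch below finds a stored template of the pattern in a camera frame by FFT-based cross-correlation. A pattern with a sharp auto-correlation peak produces one unambiguous maximum in the correlation surface, even when only part of the pattern is inside the frame. This is a minimal sketch, not the patented method.

```python
import numpy as np

def correlate_fft(frame: np.ndarray, template: np.ndarray) -> np.ndarray:
    """Circular cross-correlation of a camera frame with a pattern template via the FFT."""
    f = frame.astype(float) - frame.mean()
    t = np.zeros_like(f)                                   # zero-pad template to frame size
    t[: template.shape[0], : template.shape[1]] = template - template.mean()
    # correlation(shift) = IFFT( FFT(frame) * conj(FFT(template)) )
    return np.fft.ifft2(np.fft.fft2(f) * np.conj(np.fft.fft2(t))).real

def locate_pattern(frame: np.ndarray, template: np.ndarray) -> tuple[int, int]:
    """Return the (row, col) offset of the template's upper-left corner in the frame."""
    corr = correlate_fft(frame, template)
    row, col = np.unravel_index(np.argmax(corr), corr.shape)
    return int(row), int(col)
```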
- the projected patterns from the preferred embodiments of the present invention can be described geometrically as part of conic sections, and can thus be analysed further to find the actual azimuth and elevation orientation angles and the distance of the pen relative to the reference surface.
- the image must have sharp and well-defined edges, or alternatively be distributed over a larger area with lower contrast and sharpness requirements.
- a preferred embodiment for projecting the image is by using a computer generated diffractive optical element and an infrared diode laser.
- a phase pattern of a diffractive optical element needed to generate a pattern as in FIG. 14 is readily made by computer calculations. There exists a vast number of computational methods to compute the diffractive optical element pattern, as reported in the scientific and technical literature [ref: Jakob Blad, "New Design Method for Diffractive Optics", Chalmers University of Technology, Göteborg (2003), pp. 15-23].
- FIG. 15 shows a phase pattern generated by using the so-called "Gerchberg-Saxton algorithm" [ref: Jörgen Bengtsson, "Diffractive Optics Design", Chalmers University of Technology, Göteborg (1997), pp. 25-27] to iteratively obtain a phase pattern that can be used as the diffractive optical element to give an approximate intensity pattern like that displayed in FIG. 14.
- the gray-scale plot in FIG. 15, inverted here for document printing quality reasons, represents phase levels in the range [0, 2π] as distributed over the diffractive optical element.
- the corresponding "diffracted image" in terms of its Fourier transform is depicted in FIG. 16.
- 256 pixels were used in the diffractive optical element at 16 phase levels, and it is seen to reproduce approximately the same intensity distribution as in FIG. 14, apart from a reduction in light intensity due to diffraction losses.
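For reference, the Gerchberg-Saxton iteration mentioned above is compact enough to sketch: it alternates between the diffractive-element plane, where the amplitude is forced to be uniform (a phase-only element), and the far-field plane, where the amplitude is replaced by the desired pattern while the computed phase is kept. The NumPy sketch below follows the textbook form of the algorithm; the iteration count, the random initialisation and the final quantisation to 16 levels are illustrative choices, not values taken from the description.

```python
import numpy as np

def gerchberg_saxton(target_intensity: np.ndarray, iterations: int = 200) -> np.ndarray:
    """Compute a phase-only DOE whose far-field intensity approximates the target.

    target_intensity: 2-D array with the desired far-field intensity (e.g. FIG. 14).
    Returns the phase pattern in radians, in the range [0, 2*pi).
    """
    target_amp = np.sqrt(target_intensity)
    rng = np.random.default_rng(0)
    phase = rng.uniform(0.0, 2 * np.pi, target_intensity.shape)   # random starting phase

    for _ in range(iterations):
        doe_field = np.exp(1j * phase)                  # unit amplitude: phase-only element
        far_field = np.fft.fft2(doe_field)              # propagate to the far field
        far_field = target_amp * np.exp(1j * np.angle(far_field))  # impose target amplitude
        doe_field = np.fft.ifft2(far_field)             # propagate back to the DOE plane
        phase = np.angle(doe_field) % (2 * np.pi)       # keep only the phase

    return phase

# Quantising the result to 16 levels corresponds to the situation described for FIG. 15/16:
# phase16 = np.round(phase / (2 * np.pi) * 16) % 16 * (2 * np.pi / 16)
```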
- a threshold in the detection camera system can be used to encode these as "bright/white" (recall that figures 14-24 are inverted).
- FIG 17 shows that the phase pattern for the corresponding intensity distribution of an asymmetric pattern (with similarities to a part of the pattern in FIG 14) can be generated using the same algorithm, with the same pixel resolution in the diffractive optical element and the same number of phase levels (concerning the meaning of "asymmetric", see DEFINITION (5)).
- a phase-only grating with two discrete phase levels can be used to generate any symmetric diffraction pattern without interference and mixing of the positive and negative orders (concerning the meaning of "symmetric", see DEFINITION (4)).
- the diffractive optical element and the corresponding diffraction pattern for the image in FIG 14 are displayed in FIG 19 and FIG 20, respectively.
- the resolution is lower owing to the restricted flexibility of the two discrete phase levels of the diffractive optical element.
- in FIG 22, the result from attempting to generate an asymmetric pattern using the two discrete phase level grating of FIG 21 is shown.
- the resulting diffraction pattern in FIG 22 is a superposition of the original pattern and its inverted image. This is due to the inability of a phase-only diffraction grating with two discrete phase levels to produce an asymmetric pattern.
- a phase grating with more than two phase levels is a prerequisite for producing an asymmetric pattern for use in the detector camera system. Moreover, it will generally give better image quality than the two discrete phase level grating when used with the same pixel resolution.
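The restriction to symmetric patterns can be verified numerically: with only the phase levels 0 and π available, the transmitted field is real-valued, its Fourier transform is conjugate-symmetric, and the far-field intensity is therefore always centro-symmetric in the sense of DEFINITION (4). A small NumPy check, as a sketch:

```python
import numpy as np

rng = np.random.default_rng(1)

# Random binary phase element: phases 0 or pi only -> field values +1 or -1 (real-valued).
binary_phase = rng.integers(0, 2, size=(64, 64)) * np.pi
field = np.exp(1j * binary_phase)

intensity = np.abs(np.fft.fft2(field)) ** 2

# A real-valued field has a conjugate-symmetric spectrum, so the far-field intensity obeys
# I[(-u) % N, (-v) % N] == I[u, v]; flipping both axes and realigning the origin shows this.
flipped = np.roll(intensity[::-1, ::-1], shift=(1, 1), axis=(0, 1))
print(np.allclose(intensity, flipped))   # True: only centro-symmetric patterns are possible
```

This is consistent with FIG 22, where the asymmetric target appears superposed with its own inverted copy, and with the statement that more than two phase levels are needed for asymmetric patterns.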
- Other examples of simple symmetric patterns that can also be made by diffraction gratings with two discrete phase levels are displayed in FIG 23 and FIG 24.
- the image pattern as in FIG 24 can be used to estimate the angular spread of the diffracted pattern owing to the resolution of the grating.
- θ is the fan out angle as applicable for the first order diffraction steering of a laser beam by a one-dimensional blazed grating, given by sin θ = λ/(Λδ) [ref: E. Hallstig, L. Sjöqvist, M. Lindgren; Opt. Eng. Volume 42(3) (2003) pp. 613-619]
- λ is the wavelength of the light (unit is length)
- δ is the pixel-pitch (unit is length)
- Λ is the period of the grating in pixels.
- the pixel-pitch can be estimated from the resolution of creating the grating, and for a typical diffractive optical element produced on polymer materials or micro-machined silicon the resolution is typically 0.5 µm or better. Hence it is possible to have 1 µm as pixel pitch.
- the wavelength is taken as 850 nm, being in the near-infrared range; using 4 phase levels equally spaced between 0 and 3π/4 radians gives the maximum diffracted 1st order beam at an angle given by sin θ ≈ 0.2125, i.e., an angle of approximately 12°.
- 5 cm of free-space propagation after reflection from (or transmission through) a two-dimensional grating with similar resolution and phase-level accuracy can be used to produce an approximately 2 cm diameter circle or similar pattern. It is noted that a higher resolution or a smaller pixel pitch could generate an even larger angular spread.
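Worked through explicitly, the numbers above follow directly from sin θ = λ/(Λδ). The short script below reproduces them; the 5 cm propagation distance is taken from the text, and everything else is plain arithmetic.

```python
import math

wavelength = 850e-9        # lambda: near-infrared wavelength [m]
pixel_pitch = 1e-6         # delta: pixel pitch of the diffractive element [m]
period_pixels = 4          # Lambda: grating period in pixels (one phase level per pixel)

sin_theta = wavelength / (period_pixels * pixel_pitch)
theta = math.degrees(math.asin(sin_theta))
print(f"sin(theta) = {sin_theta:.4f}, theta = {theta:.1f} deg")   # 0.2125, ~12.3 deg

propagation = 0.05         # 5 cm of free-space propagation after the grating [m]
diameter = 2 * propagation * math.tan(math.radians(theta))
print(f"pattern diameter after 5 cm: {diameter * 100:.1f} cm")    # ~2.2 cm
```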
- Suggestions of embodiments that can provide the diffracted pattern as an image in close vicinity (< approx. 10 cm) of the diffractive optical element are discussed in the following.
- the diffraction phenomenon generates a Fourier transform of any amplitude and phase distribution on the diffractive optical element in the "far-field".
- the Fourier transform can be moved from the far-field to become closer to the output from the diffractive optical element by using a lens or spherical mirrors acting as a lens, placed in vicinity of the diffractive optical element.
- the phase distribution of a lens or a curved mirror has the property to move the Fourier transform of a planar wave-front to the focal plane.
- the position and size of the Fourier transform pattern relative to the diffractive optical element and laser diode can be controlled.
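For orientation, the standard 2-f (Fourier-transform) property of a lens makes this concrete: a lens of focal length f maps light leaving the diffractive optical element at angle θ to an off-axis position x ≈ f·tan θ in its focal plane. As a purely illustrative example (the focal length is an assumption, not a value from the description), with sin θ ≈ 0.2125 as computed above and f = 10 mm, the first diffraction order lands at x ≈ 10 mm × 0.217 ≈ 2.2 mm from the optical axis, so the whole pattern can be formed within a few millimetres just behind the lens instead of only in the far field.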
- Diffractive optical elements with phase distribution patterns as in FIG. 15 can be provided by manufacturers of diffractive optical elements. These can be in the form of transmissive or reflective components. These components can be used in alternative embodiments of the present invention in optical alignment with optical elements like lenses, mirrors and light sources, as illustrated in FIG 1 - FIG 4. Referring to FIG 1 and FIG 2, the guiding device 16 (pen, pointer) is shown.
- a resulting pattern 10 is projected from the guiding device to the screen when operated closely, and a pattern 11 is projected from the guiding device to the screen when operated distantly.
- the guiding device 16 (pen, pointer) with a casing 12 has a battery 1, a printed circuit board 2, a laser diode 3, a transmissive diffractive optical element 5 with or without a lens 4 mounted near the tip 7, a reflective diffractive curved annular mirror 14 and a curved annular mirror 13, and two buttons 8 and 9.
- a resulting pattern 10 is projected from the guiding device to the screen when operated closely, and a pattern 11 is projected from the guiding device to the screen when operated distantly.
- the guiding device 16 (pen, pointer) with a casing 12 has a battery 1, a printed circuit board 2, a laser diode 3, a refractive collimating lens 4, a transmissive diffractive optical element 5 mounted near the rear end 7, a curved annular mirror 13 and a neutral window and/or another transmissive diffractive optical element 15 for the light forming the pattern for distant operation.
- a resulting pattern 10 is projected from the guiding device to the screen when operated closely, and a pattern 11 is projected from the guiding device to the screen when operated distantly.
- the casings 12 of these embodiments are intended to resemble a conventional whiteboard marker or pen and to provide the user with a natural, intuitive and ergonomic writing, drawing and pointing tool.
- One or more battery cells supply the energy required to emit light from the light source or a number of light sources.
- the printed circuit board may provide power management, the interface to the one, two or more button switches, a laser diode driver circuit, and circuits for modulating the laser, a remote infrared link and/or a radio-frequency link.
- the laser diode may be an infrared or near infrared diode.
- the purpose of the collimating lens 4 is to increase the aperture of the laser beam to cover the surface of the diffractive optical element 5.
- the concave lenses 6 and convex mirrors 13 and possibly 14 are for spreading the pattern to a large area when the guiding device is operated closely to the screen.
- the annular shape of the mirrors 13, 14, and of the possibly reflective diffractive optical elements 13, 14, leaves a free-field path for the central part of the laser beam forming the optical intensity image when the guiding device is operated distantly from the screen.
- the guiding object can be held in different orientations and distances to the screen as illustrated in FIG 5, 6, 7 and 8.
- the changes in the projected pattern image position, shape and size can be utilized to find the lateral position, the orientation (elevation and azimuth) and to estimate the distance from the guiding object to the screen surface.
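The patent analyses the projected conic sections to recover these quantities; the NumPy sketch below is only a crude stand-in to illustrate the idea. It assumes the detected pattern has been reduced to a binary mask, uses second-order image moments as a proxy for the conic's axes, and maps the axis ratio and apparent size to rough tilt and distance estimates. The calibration inputs and the simple cosine-ratio tilt model are hypothetical and much simpler than a full conic-section analysis.

```python
import numpy as np

def ring_pose_estimate(mask: np.ndarray, ref_size_px: float, ref_distance: float) -> dict:
    """Very rough pose estimate from a binary mask of the detected pattern.

    mask         : 2-D boolean array, True where the projected pattern is detected.
    ref_size_px  : the same major-axis measure, recorded at a known reference distance
                   with the pen held perpendicular to the screen (calibration value).
    ref_distance : that known reference distance.
    """
    ys, xs = np.nonzero(mask)
    cx, cy = float(xs.mean()), float(ys.mean())        # lateral position of the pattern
    cov = np.cov(np.vstack((xs - cx, ys - cy)))        # second-order moments of the blob
    minor_var, major_var = np.sort(np.linalg.eigvalsh(cov))
    major = 4.0 * np.sqrt(major_var)                   # apparent major-axis scale [px]
    minor = 4.0 * np.sqrt(minor_var)                   # apparent minor-axis scale [px]
    tilt = float(np.degrees(np.arccos(np.clip(minor / major, 0.0, 1.0))))
    distance = ref_distance * ref_size_px / major      # crude pinhole-camera scaling
    return {"center": (cx, cy), "tilt_deg": tilt, "distance": distance}
```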
- FIG 9 and FIG 10 illustrate a situation where the guiding device and the camera and recognition system are located in front of the front projection screen, the rear-projection screen or other display system, and where the guiding device can be used closely to the screen and distantly to the screen.
- When close to the screen (FIG 9), the guiding device may be within the field of view of camera 18.
- if the guiding device is provided with a code pattern, there may be a combined function comprising the method described in WO0227461.
- FIG 11 and FIG 12 show a configuration where the guiding device is operated in front of the rear projection screen and can be used closely to the screen and/or distantly to the screen, while the pattern image from the guiding device is projected onto the rear projection screen surface and can be detected by the camera and recognition system located behind the screen, close to the projector.
Landscapes
- Engineering & Computer Science (AREA)
- General Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Human Computer Interaction (AREA)
- Multimedia (AREA)
- Optics & Photonics (AREA)
- Analytical Chemistry (AREA)
- Chemical & Material Sciences (AREA)
- Position Input By Displaying (AREA)
- Projection Apparatus (AREA)
- Length Measuring Devices By Optical Means (AREA)
- Transforming Electric Information Into Light Information (AREA)
Priority Applications (7)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
KR1020077010477A KR101158157B1 (en) | 2004-11-12 | 2005-11-02 | Visual system |
EP05858006A EP1817655B1 (en) | 2004-11-12 | 2005-11-02 | Visual system |
CA2586849A CA2586849C (en) | 2004-11-12 | 2005-11-02 | Method and apparatus for forming, projecting and detecting a coded pattern image with a camera and recognition system |
MX2007005464A MX2007005464A (en) | 2004-11-12 | 2005-11-02 | Visual system. |
AU2005333078A AU2005333078A1 (en) | 2004-11-12 | 2005-11-02 | Visual system |
JP2007541123A JP5225681B2 (en) | 2004-11-12 | 2005-11-02 | Visual system |
US11/719,065 US8436836B2 (en) | 2004-11-12 | 2005-11-02 | Method and apparatus for forming, projecting and detecting a coded pattern image with a camera and recognition system |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
NO20044936A NO323926B1 (en) | 2004-11-12 | 2004-11-12 | Visual system and control object and apparatus for use in the system. |
NO20044936 | 2004-11-12 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2006135241A1 true WO2006135241A1 (en) | 2006-12-21 |
Family
ID=35220544
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/NO2005/000416 WO2006135241A1 (en) | 2004-11-12 | 2005-11-02 | Visual system |
Country Status (11)
Country | Link |
---|---|
US (1) | US8436836B2 (en) |
EP (2) | EP1817655B1 (en) |
JP (2) | JP5225681B2 (en) |
KR (1) | KR101158157B1 (en) |
CN (1) | CN101095098A (en) |
AU (1) | AU2005333078A1 (en) |
CA (1) | CA2586849C (en) |
MX (1) | MX2007005464A (en) |
NO (1) | NO323926B1 (en) |
TW (1) | TWI453524B (en) |
WO (1) | WO2006135241A1 (en) |
Cited By (9)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN100462900C (en) * | 2007-05-24 | 2009-02-18 | 司秉玉 | Method and equipment for making interactive operation with display system |
WO2010001072A2 (en) * | 2008-07-04 | 2010-01-07 | Optinnova | Interactive display device and method, using a detection camera and optical pointer |
ITMN20090001A1 (en) * | 2009-01-14 | 2010-07-15 | Matteo Mode | OPTICAL INTERACTION DEVICE WITH GRAPHIC INTERFACE SYSTEMS WITH PIXEL GRID SCREEN |
GB2478400A (en) * | 2010-02-24 | 2011-09-07 | Intel Corp | Interactive Projected Displays |
WO2012004447A1 (en) | 2010-07-08 | 2012-01-12 | Nokia Corporation | Visual data distribution |
CN101408813B (en) * | 2007-10-10 | 2012-07-18 | 夏普株式会社 | Display system and method for detecting pointed position |
WO2013104988A1 (en) | 2012-01-09 | 2013-07-18 | Epson Norway Research And Development As | Low interference system and method for synchronization, identification and tracking of visual and interactive systems |
DE112011103849T5 (en) | 2010-11-22 | 2013-10-02 | Epson Norway Research And Development As | Camera-based multi-touch interaction and lighting system and method |
PL422791A1 (en) * | 2017-09-08 | 2019-03-11 | Sle Spółka Z Ograniczoną Odpowiedzialnością | Device for displaying the multimedia |
Families Citing this family (74)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2007003682A1 (en) | 2005-06-30 | 2007-01-11 | Nokia Corporation | Camera control means to allow operating of a destined location of the information surface of a presentation and information system |
TW200802036A (en) * | 2006-06-01 | 2008-01-01 | Micro Nits Co Ltd | Input method of a pointer input system |
JP2008306512A (en) * | 2007-06-08 | 2008-12-18 | Nec Corp | Information providing system |
ES2323399B2 (en) * | 2007-06-19 | 2010-02-26 | Gat Microencapsulation Ag | SUSPENSIONS IN AGRICULTURAL SULFONILE AND COMBINATIONS OILS. |
WO2010023348A1 (en) * | 2008-08-26 | 2010-03-04 | Multitouch Oy | Interactive displays |
US9948872B2 (en) | 2009-03-02 | 2018-04-17 | Flir Systems, Inc. | Monitor and control systems and methods for occupant safety and energy efficiency of structures |
US9756264B2 (en) | 2009-03-02 | 2017-09-05 | Flir Systems, Inc. | Anomalous pixel detection |
US9473681B2 (en) | 2011-06-10 | 2016-10-18 | Flir Systems, Inc. | Infrared camera system housing with metalized surface |
US9635285B2 (en) | 2009-03-02 | 2017-04-25 | Flir Systems, Inc. | Infrared imaging enhancement with fusion |
US9998697B2 (en) | 2009-03-02 | 2018-06-12 | Flir Systems, Inc. | Systems and methods for monitoring vehicle occupants |
US9674458B2 (en) | 2009-06-03 | 2017-06-06 | Flir Systems, Inc. | Smart surveillance camera systems and methods |
US9208542B2 (en) | 2009-03-02 | 2015-12-08 | Flir Systems, Inc. | Pixel-wise noise reduction in thermal images |
US9451183B2 (en) | 2009-03-02 | 2016-09-20 | Flir Systems, Inc. | Time spaced infrared image enhancement |
US9517679B2 (en) | 2009-03-02 | 2016-12-13 | Flir Systems, Inc. | Systems and methods for monitoring vehicle occupants |
US9986175B2 (en) | 2009-03-02 | 2018-05-29 | Flir Systems, Inc. | Device attachment with infrared imaging sensor |
US9843742B2 (en) | 2009-03-02 | 2017-12-12 | Flir Systems, Inc. | Thermal image frame capture using de-aligned sensor array |
USD765081S1 (en) | 2012-05-25 | 2016-08-30 | Flir Systems, Inc. | Mobile communications device attachment with camera |
US10757308B2 (en) | 2009-03-02 | 2020-08-25 | Flir Systems, Inc. | Techniques for device attachment with dual band imaging sensor |
US9235876B2 (en) | 2009-03-02 | 2016-01-12 | Flir Systems, Inc. | Row and column noise reduction in thermal images |
US10244190B2 (en) | 2009-03-02 | 2019-03-26 | Flir Systems, Inc. | Compact multi-spectrum imaging with fusion |
AT508438B1 (en) * | 2009-04-16 | 2013-10-15 | Isiqiri Interface Tech Gmbh | DISPLAY AREA AND A COMBINED CONTROL DEVICE FOR A DATA PROCESSING SYSTEM |
US9756262B2 (en) | 2009-06-03 | 2017-09-05 | Flir Systems, Inc. | Systems and methods for monitoring power systems |
US9716843B2 (en) | 2009-06-03 | 2017-07-25 | Flir Systems, Inc. | Measurement device for electrical installations and related methods |
US9843743B2 (en) | 2009-06-03 | 2017-12-12 | Flir Systems, Inc. | Infant monitoring systems and methods using thermal imaging |
US10091439B2 (en) | 2009-06-03 | 2018-10-02 | Flir Systems, Inc. | Imager with array of multiple infrared imaging modules |
US9819880B2 (en) | 2009-06-03 | 2017-11-14 | Flir Systems, Inc. | Systems and methods of suppressing sky regions in images |
US9292909B2 (en) | 2009-06-03 | 2016-03-22 | Flir Systems, Inc. | Selective image correction for infrared imaging devices |
EP2336861A3 (en) * | 2009-11-13 | 2011-10-12 | Samsung Electronics Co., Ltd. | Multi-touch and proximate object sensing apparatus using sensing array |
US9706138B2 (en) | 2010-04-23 | 2017-07-11 | Flir Systems, Inc. | Hybrid infrared sensor array having heterogeneous infrared sensors |
US9848134B2 (en) | 2010-04-23 | 2017-12-19 | Flir Systems, Inc. | Infrared imager with integrated metal layers |
US9207708B2 (en) | 2010-04-23 | 2015-12-08 | Flir Systems, Inc. | Abnormal clock rate detection in imaging sensor arrays |
CN101882012A (en) * | 2010-06-12 | 2010-11-10 | 北京理工大学 | Pen type interactive system based on projection tracking |
CN102298786B (en) * | 2010-06-22 | 2015-09-02 | 上海科技馆 | The devices and methods therefor that a kind of virtual drawing realizes |
JP5307108B2 (en) * | 2010-10-29 | 2013-10-02 | 株式会社コナミデジタルエンタテインメント | Detection system, electronic blackboard apparatus to which the detection system is applied, its control method, and computer program |
JP5692904B2 (en) * | 2010-11-17 | 2015-04-01 | 任天堂株式会社 | Input system, information processing apparatus, information processing program, and pointing position calculation method |
US8619065B2 (en) * | 2011-02-11 | 2013-12-31 | Microsoft Corporation | Universal stylus device |
US9179182B2 (en) | 2011-04-12 | 2015-11-03 | Kenneth J. Huebner | Interactive multi-display control systems |
US10079982B2 (en) | 2011-06-10 | 2018-09-18 | Flir Systems, Inc. | Determination of an absolute radiometric value using blocked infrared sensors |
US9509924B2 (en) | 2011-06-10 | 2016-11-29 | Flir Systems, Inc. | Wearable apparatus with integrated infrared imaging module |
CA2838992C (en) | 2011-06-10 | 2018-05-01 | Flir Systems, Inc. | Non-uniformity correction techniques for infrared imaging devices |
US10841508B2 (en) | 2011-06-10 | 2020-11-17 | Flir Systems, Inc. | Electrical cabinet infrared monitor systems and methods |
US10169666B2 (en) | 2011-06-10 | 2019-01-01 | Flir Systems, Inc. | Image-assisted remote control vehicle systems and methods |
US9900526B2 (en) | 2011-06-10 | 2018-02-20 | Flir Systems, Inc. | Techniques to compensate for calibration drifts in infrared imaging devices |
CN109618084B (en) | 2011-06-10 | 2021-03-05 | 菲力尔系统公司 | Infrared imaging system and method |
CN103828343B (en) | 2011-06-10 | 2017-07-11 | 菲力尔系统公司 | Based on capable image procossing and flexible storage system |
US9961277B2 (en) | 2011-06-10 | 2018-05-01 | Flir Systems, Inc. | Infrared focal plane array heat spreaders |
US9235023B2 (en) | 2011-06-10 | 2016-01-12 | Flir Systems, Inc. | Variable lens sleeve spacer |
US9143703B2 (en) | 2011-06-10 | 2015-09-22 | Flir Systems, Inc. | Infrared camera calibration techniques |
US9058653B1 (en) | 2011-06-10 | 2015-06-16 | Flir Systems, Inc. | Alignment of visible light sources based on thermal images |
US9706137B2 (en) | 2011-06-10 | 2017-07-11 | Flir Systems, Inc. | Electrical cabinet infrared monitor |
US10051210B2 (en) | 2011-06-10 | 2018-08-14 | Flir Systems, Inc. | Infrared detector array with selectable pixel binning systems and methods |
US10389953B2 (en) | 2011-06-10 | 2019-08-20 | Flir Systems, Inc. | Infrared imaging device having a shutter |
TW201306407A (en) * | 2011-07-28 | 2013-02-01 | Touch Micro System Tech | Linear scan structure and laser designator using the same |
US9707491B2 (en) | 2011-10-19 | 2017-07-18 | Randy Wayne Clark | Light activated glow-in-the-dark doodler |
CN102968219A (en) * | 2012-01-04 | 2013-03-13 | 北京理工大学 | Interactive computer input equipment on basis of graphic processing |
US9067127B2 (en) | 2012-01-13 | 2015-06-30 | Randy Wayne Clark | Light emitting toys and light activated targets |
WO2013189259A1 (en) * | 2012-06-20 | 2013-12-27 | Ming Fong | System for reproducing virtual objects |
US9811884B2 (en) | 2012-07-16 | 2017-11-07 | Flir Systems, Inc. | Methods and systems for suppressing atmospheric turbulence in images |
JP2014199633A (en) * | 2013-03-15 | 2014-10-23 | 株式会社リコー | Coordinate detection device, coordinate detection system, coordinate detection method, and coordinate detection program |
US20150370441A1 (en) * | 2014-06-23 | 2015-12-24 | Infosys Limited | Methods, systems and computer-readable media for converting a surface to a touch surface |
US9973692B2 (en) | 2013-10-03 | 2018-05-15 | Flir Systems, Inc. | Situational awareness by compressed display of panoramic views |
US11297264B2 (en) | 2014-01-05 | 2022-04-05 | Teledyne Fur, Llc | Device attachment with dual band imaging sensor |
CN104978077B (en) * | 2014-04-08 | 2020-01-31 | 联想(北京)有限公司 | interaction method and system |
US10192335B1 (en) | 2014-08-25 | 2019-01-29 | Alexander Wellen | Remote control highlighter |
JP6477131B2 (en) | 2015-03-27 | 2019-03-06 | セイコーエプソン株式会社 | Interactive projector, interactive projection system, and control method of interactive projector |
JP6485160B2 (en) | 2015-03-27 | 2019-03-20 | セイコーエプソン株式会社 | Interactive projector and interactive projector control method |
JP6459705B2 (en) | 2015-03-27 | 2019-01-30 | セイコーエプソン株式会社 | Interactive projector, interactive projection system, and interactive projector control method |
US9857918B2 (en) | 2015-03-31 | 2018-01-02 | Fujitsu Limited | Content display control method and system |
CN106353957A (en) * | 2016-09-29 | 2017-01-25 | 南京仁光电子科技有限公司 | Laser-projector-based interaction device and method |
US10412362B2 (en) | 2017-07-27 | 2019-09-10 | Qualcomm Incorporated | Active alignment correction for optical systems |
CN110161793A (en) * | 2019-04-16 | 2019-08-23 | 苏州佳世达光电有限公司 | A kind of projection adjustment system, projector and supporting mechanism |
CN111354018B (en) * | 2020-03-06 | 2023-07-21 | 合肥维尔慧渤科技有限公司 | Object identification method, device and system based on image |
CN115199152B (en) * | 2021-04-09 | 2023-09-15 | 南京工业大学 | Intelligent coded lock based on grating diffraction |
CN113203031A (en) * | 2021-05-17 | 2021-08-03 | 深圳市科拜斯物联网科技有限公司 | AI (Artificial intelligence) food waste identification monitoring device based on Internet of things and using method thereof |
Citations (11)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US4675665A (en) * | 1982-12-22 | 1987-06-23 | International Business Machines Corporation | Realtime tracking of a movable cursor |
US5138304A (en) * | 1990-08-02 | 1992-08-11 | Hewlett-Packard Company | Projected image light pen |
EP0515015A2 (en) | 1991-05-10 | 1992-11-25 | nVIEW CORPORATION | Method and apparatus for interacting with a computer generated projected image |
US5572251A (en) | 1994-03-17 | 1996-11-05 | Wacom Co., Ltd. | Optical position detecting unit and optical coordinate input unit |
US5938308A (en) | 1996-06-25 | 1999-08-17 | Digital Opitcs Corporation | Projection pointer |
US6050690A (en) * | 1998-01-08 | 2000-04-18 | Siemens Information And Communication Networks, Inc. | Apparatus and method for focusing a projected image |
US6275214B1 (en) * | 1999-07-06 | 2001-08-14 | Karl C. Hansen | Computer presentation system and method with optical tracking of wireless pointer |
US20010022575A1 (en) * | 1998-10-21 | 2001-09-20 | Richter Woflgang | Input device for a computer |
US20010026645A1 (en) * | 2000-02-22 | 2001-10-04 | Kazunori Hiramatsu | System and method of pointed position detection, presentation system, and program |
WO2002027461A1 (en) | 2000-09-11 | 2002-04-04 | Njoelstad Tormod | Drawing, writing and pointing device |
EP0718748B1 (en) * | 1994-12-22 | 2002-05-22 | Canon Kabushiki Kaisha | Pointed-position detecting apparatus and method |
Family Cites Families (17)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH05265637A (en) * | 1992-03-16 | 1993-10-15 | Toshiba Corp | Three-dimensional pointing device |
JP3277658B2 (en) * | 1993-12-28 | 2002-04-22 | 株式会社日立製作所 | Information display device |
US5712658A (en) * | 1993-12-28 | 1998-01-27 | Hitachi, Ltd. | Information presentation apparatus and information display apparatus |
JP3470170B2 (en) * | 1993-12-28 | 2003-11-25 | 株式会社日立製作所 | Remote instruction input method and device |
EP0686935B1 (en) * | 1994-06-09 | 2002-09-18 | Corporation for National Research Initiatives | Pointing interface |
EP0704721A3 (en) * | 1994-09-27 | 1997-12-17 | AT&T Corp. | Methods and apparatus for generating and displaying holographic images utilizing a laser pointer |
GB9516441D0 (en) * | 1995-08-10 | 1995-10-11 | Philips Electronics Uk Ltd | Light pen input systems |
US5953001A (en) * | 1997-12-23 | 1999-09-14 | International Business Machines Corporation | Computer input stylus and texture control system |
JP3377431B2 (en) * | 1998-03-03 | 2003-02-17 | シャープ株式会社 | 3D position / direction indicator |
JP4697916B2 (en) * | 2000-07-07 | 2011-06-08 | キヤノン株式会社 | Coordinate input device, control method therefor, and program |
JP2002073267A (en) * | 2000-08-31 | 2002-03-12 | Canon Inc | Coordinate input device |
US7242388B2 (en) * | 2001-01-08 | 2007-07-10 | Vkb Inc. | Data input device |
JP4728540B2 (en) * | 2001-09-20 | 2011-07-20 | 株式会社リコー | Image projection device for meeting support |
KR100449710B1 (en) * | 2001-12-10 | 2004-09-22 | 삼성전자주식회사 | Remote pointing method and apparatus therefor |
JP4129168B2 (en) * | 2002-11-15 | 2008-08-06 | 日本放送協会 | Position detection apparatus, position detection method, and position detection program |
US7242818B2 (en) * | 2003-01-17 | 2007-07-10 | Mitsubishi Electric Research Laboratories, Inc. | Position and orientation sensing with a projector |
US7205527B2 (en) * | 2003-12-09 | 2007-04-17 | Delphi Technologies, Inc. | Apparatus for fanning out near infrared radiation in an automotive occupant position restraint system |
-
2004
- 2004-11-12 NO NO20044936A patent/NO323926B1/en not_active IP Right Cessation
-
2005
- 2005-11-02 AU AU2005333078A patent/AU2005333078A1/en not_active Abandoned
- 2005-11-02 JP JP2007541123A patent/JP5225681B2/en not_active Expired - Fee Related
- 2005-11-02 KR KR1020077010477A patent/KR101158157B1/en not_active IP Right Cessation
- 2005-11-02 EP EP05858006A patent/EP1817655B1/en not_active Not-in-force
- 2005-11-02 MX MX2007005464A patent/MX2007005464A/en active IP Right Grant
- 2005-11-02 US US11/719,065 patent/US8436836B2/en not_active Expired - Fee Related
- 2005-11-02 WO PCT/NO2005/000416 patent/WO2006135241A1/en active Application Filing
- 2005-11-02 CN CNA2005800387032A patent/CN101095098A/en active Pending
- 2005-11-02 CA CA2586849A patent/CA2586849C/en not_active Expired - Fee Related
- 2005-11-02 EP EP13160729.3A patent/EP2607995A1/en not_active Withdrawn
- 2005-11-10 TW TW094139424A patent/TWI453524B/en not_active IP Right Cessation
-
2012
- 2012-10-03 JP JP2012221513A patent/JP5490199B2/en not_active Expired - Fee Related
Patent Citations (11)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US4675665A (en) * | 1982-12-22 | 1987-06-23 | International Business Machines Corporation | Realtime tracking of a movable cursor |
US5138304A (en) * | 1990-08-02 | 1992-08-11 | Hewlett-Packard Company | Projected image light pen |
EP0515015A2 (en) | 1991-05-10 | 1992-11-25 | nVIEW CORPORATION | Method and apparatus for interacting with a computer generated projected image |
US5572251A (en) | 1994-03-17 | 1996-11-05 | Wacom Co., Ltd. | Optical position detecting unit and optical coordinate input unit |
EP0718748B1 (en) * | 1994-12-22 | 2002-05-22 | Canon Kabushiki Kaisha | Pointed-position detecting apparatus and method |
US5938308A (en) | 1996-06-25 | 1999-08-17 | Digital Opitcs Corporation | Projection pointer |
US6050690A (en) * | 1998-01-08 | 2000-04-18 | Siemens Information And Communication Networks, Inc. | Apparatus and method for focusing a projected image |
US20010022575A1 (en) * | 1998-10-21 | 2001-09-20 | Richter Woflgang | Input device for a computer |
US6275214B1 (en) * | 1999-07-06 | 2001-08-14 | Karl C. Hansen | Computer presentation system and method with optical tracking of wireless pointer |
US20010026645A1 (en) * | 2000-02-22 | 2001-10-04 | Kazunori Hiramatsu | System and method of pointed position detection, presentation system, and program |
WO2002027461A1 (en) | 2000-09-11 | 2002-04-04 | Njoelstad Tormod | Drawing, writing and pointing device |
Non-Patent Citations (1)
Title |
---|
See also references of EP1817655A4 * |
Cited By (16)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN100462900C (en) * | 2007-05-24 | 2009-02-18 | 司秉玉 | Method and equipment for making interactive operation with display system |
CN101408813B (en) * | 2007-10-10 | 2012-07-18 | 夏普株式会社 | Display system and method for detecting pointed position |
WO2010001072A3 (en) * | 2008-07-04 | 2010-11-04 | Optinnova | Interactive display device and method, using a detection camera and optical pointer |
WO2010001072A2 (en) * | 2008-07-04 | 2010-01-07 | Optinnova | Interactive display device and method, using a detection camera and optical pointer |
FR2933511A1 (en) * | 2008-07-04 | 2010-01-08 | Optinnova | INTERACTIVE VISUALIZATION DEVICE AND METHOD USING DETECTION CAMERA AND OPTICAL POINTER |
WO2010082226A1 (en) * | 2009-01-14 | 2010-07-22 | Matteo Mode | Pointing device, graphic interface and process implementing the said device |
ITMN20090001A1 (en) * | 2009-01-14 | 2010-07-15 | Matteo Mode | OPTICAL INTERACTION DEVICE WITH GRAPHIC INTERFACE SYSTEMS WITH PIXEL GRID SCREEN |
GB2478400A (en) * | 2010-02-24 | 2011-09-07 | Intel Corp | Interactive Projected Displays |
WO2012004447A1 (en) | 2010-07-08 | 2012-01-12 | Nokia Corporation | Visual data distribution |
EP2591398A1 (en) * | 2010-07-08 | 2013-05-15 | Nokia Corp. | Visual data distribution |
EP2591398A4 (en) * | 2010-07-08 | 2014-04-02 | Nokia Corp | Visual data distribution |
US9100681B2 (en) | 2010-07-08 | 2015-08-04 | Nokia Technologies Oy | Visual data distribution |
DE112011103849T5 (en) | 2010-11-22 | 2013-10-02 | Epson Norway Research And Development As | Camera-based multi-touch interaction and lighting system and method |
WO2013104988A1 (en) | 2012-01-09 | 2013-07-18 | Epson Norway Research And Development As | Low interference system and method for synchronization, identification and tracking of visual and interactive systems |
EP3318961A1 (en) | 2012-01-09 | 2018-05-09 | Epson Norway Research and Development AS | Low interference system and method for synchronization, identification and tracking of visual and interactive systems |
PL422791A1 (en) * | 2017-09-08 | 2019-03-11 | Sle Spółka Z Ograniczoną Odpowiedzialnością | Device for displaying the multimedia |
Also Published As
Publication number | Publication date |
---|---|
EP1817655A1 (en) | 2007-08-15 |
NO20044936L (en) | 2006-05-15 |
EP1817655B1 (en) | 2013-03-27 |
NO323926B1 (en) | 2007-07-23 |
JP5225681B2 (en) | 2013-07-03 |
KR20070084084A (en) | 2007-08-24 |
CN101095098A (en) | 2007-12-26 |
JP2013033494A (en) | 2013-02-14 |
CA2586849C (en) | 2014-06-10 |
TWI453524B (en) | 2014-09-21 |
JP5490199B2 (en) | 2014-05-14 |
EP1817655A4 (en) | 2011-04-13 |
EP2607995A1 (en) | 2013-06-26 |
JP2008520034A (en) | 2008-06-12 |
US20090040195A1 (en) | 2009-02-12 |
US8436836B2 (en) | 2013-05-07 |
TW200628957A (en) | 2006-08-16 |
NO20044936D0 (en) | 2004-11-12 |
CA2586849A1 (en) | 2006-12-21 |
AU2005333078A1 (en) | 2006-12-21 |
KR101158157B1 (en) | 2012-06-19 |
MX2007005464A (en) | 2007-07-10 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US8436836B2 (en) | Method and apparatus for forming, projecting and detecting a coded pattern image with a camera and recognition system | |
US8494229B2 (en) | Device and method for determining gaze direction | |
US9784973B2 (en) | Micro doppler presentations in head worn computing | |
EP3722927B1 (en) | Power management for head worn computing system | |
US9811159B2 (en) | Eye imaging in head worn computing | |
US10191279B2 (en) | Eye imaging in head worn computing | |
US9811153B2 (en) | Eye imaging in head worn computing | |
US9532715B2 (en) | Eye imaging in head worn computing | |
US20170235152A1 (en) | See-through computer display systems | |
JP3937533B2 (en) | Remote coordinate input device and remote coordinate input method | |
CN101336089A (en) | Eye tracker equipment | |
CN105212890B (en) | Eye tracker equipment | |
US20190204439A1 (en) | Three-dimensional reconstruction system and method, mobile device, eye protection method, ar device | |
JP2006260487A (en) | Pointer system |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
| DPE2 | Request for preliminary examination filed before expiration of 19th month from priority date (pct application filed from 20040101) | |
| 121 | Ep: the epo has been informed by wipo that ep was designated in this application | |
| WWE | Wipo information: entry into national phase | Ref document number: 2005858006; Country of ref document: EP |
| WWE | Wipo information: entry into national phase | Ref document number: 2586849; Country of ref document: CA |
| WWE | Wipo information: entry into national phase | Ref document number: 3381/DELNP/2007; Country of ref document: IN. Ref document number: MX/a/2007/005464; Country of ref document: MX |
| WWE | Wipo information: entry into national phase | Ref document number: 1020077010477; Country of ref document: KR |
| WWE | Wipo information: entry into national phase | Ref document number: 2007541123; Country of ref document: JP |
| WWE | Wipo information: entry into national phase | Ref document number: 200580038703.2; Country of ref document: CN. Ref document number: 2005333078; Country of ref document: AU |
| ENP | Entry into the national phase | Ref document number: 2005333078; Country of ref document: AU; Date of ref document: 20051102; Kind code of ref document: A |
| WWP | Wipo information: published in national office | Ref document number: 2005858006; Country of ref document: EP |
| WWE | Wipo information: entry into national phase | Ref document number: 11719065; Country of ref document: US |