WO2005006024A2 - Optical method and device for use in communication - Google Patents
- Publication number: WO2005006024A2 (PCT application PCT/IL2004/000614)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- pattern
- input
- light
- operable
- sensing unit
- Prior art date
Classifications
- H04M1/0272—Details of the structure or mounting of a projector or beamer module assembly in a portable telephone set
- G06F1/1626—Constructional details or arrangements for portable computers with a single-body enclosure integrating a flat display, e.g. Personal Digital Assistants [PDAs]
- G06F1/1639—Details related to the display arrangement, the display being based on projection
- G06F1/1643—Details related to the display arrangement, the display being associated to a digitizer, e.g. laptops that can be used as penpads
- G06F1/1647—Details related to the display arrangement, including at least an additional display
- G06F1/1694—Constructional details or arrangements related to integrated I/O peripherals, the I/O peripheral being a single or a set of motion sensors for pointer control or gesture input obtained by sensing movements of the portable computer
- H04M1/0285—Pen-type handsets
- G09G3/001—Control arrangements or circuits for visual indicators other than cathode-ray tubes, using specific devices, e.g. projection systems
- H04M1/72436—User interfaces specially adapted for cordless or mobile telephones, with interactive means for internal management of messages for text messaging, e.g. short messaging services [SMS] or e-mails
- H04M2250/12—Details of telephonic subscriber devices including a sensor for measuring a physical value, e.g. temperature or motion
- H04M2250/22—Details of telephonic subscriber devices including a touch pad, a touch sensor or a touch detector
Definitions
- This invention relates to an optical method and device, which is particularly useful in communication.
- Optical pointers have been developed and widely used.
- The earliest optical pointers used tiny incandescent bulbs, a lens, and a mask or transparency to project a dot or arrow.
- Such pointer devices were about as big as a full-size (D-cell) flashlight, required a separate power pack attached by wires, and typically plugged into the wall. Performance of such devices was limited, since the beam could not be collimated as well as a laser beam, but they were nonetheless a major advance over the pointing stick.
- Since these devices used an incandescent lamp, any color was possible using optical filters, though given the limited brightness, white was most common.
- U.S. Patent No. 4,200,367 discloses a non-laser projector for a film transparency, having a first housing enclosing an image transmitting system and a second housing with an open end through which the illumination from a projection bulb supported within the second housing is adapted to pass.
- The first and second housings are adjustably coupled to each other such that, when they are located in juxtaposed position, the illumination from the bulb is directed into the first housing so as to project an image of the transparency film along an optical path defined by the transmitting system onto a rear projection screen mounted in one wall of the first housing.
- The illumination from the lamp may also be advantageously utilized for non-photographic purposes, e.g., reading.
- The rear projection screen is pivotally mounted to the first housing such that it may be moved out of alignment with the optical path, thereby enabling the image to be projected onto a remote viewing surface.
- The first laser-based pointers used helium-neon (HeNe) lasers, with their high-voltage power supplies packaged as compactly as possible, but still required a separate power pack or a bulky case that included heavy batteries.
- The development of inexpensive visible laser diodes contributed significantly to the development of optical pointer devices.
- A laser diode device is known as the combination of a semiconductor chip that does the actual lasing and a monitor photodiode chip (used for feedback control of the power output), housed in a package (usually with three leads) that looks like a metal-can transistor with a window in the top.
- Diode lasers use nearly microscopic chips of gallium arsenide or other semiconductor materials to generate coherent light in a very small package.
- The energy-level differences between the conduction-band and valence-band electrons in these semiconductors provide the mechanism for laser action.
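The band-gap mechanism described above also fixes the emission color. As a rough illustration not taken from the patent, the emission wavelength in nanometers is approximately 1240 divided by the gap energy in electron-volts; the function name and the GaAs gap value below are assumptions for the sketch.

```python
def emission_wavelength_nm(band_gap_ev: float) -> float:
    """Approximate laser-diode emission wavelength from the band-gap energy.

    Uses lambda (nm) ~= h*c / E_gap, with h*c expressed as ~1239.84 eV*nm.
    """
    HC_EV_NM = 1239.84  # Planck's constant times speed of light, in eV*nm
    return HC_EV_NM / band_gap_ev

# GaAs has a band gap of roughly 1.42 eV at room temperature, which gives
# near-infrared emission around 873 nm.
print(round(emission_wavelength_nm(1.42)))
```

This is why visible red pointers use modified alloys (e.g., AlGaInP) with wider gaps than plain GaAs, rather than the bare material.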
- Laser diodes are now quite inexpensive and widely available. The most common types, found in popular devices like CD players and laser pointers, have a maximum output in the 3 to 5 mW range.
- Laser diodes are only slightly larger than a grain of sand, run on low voltage and low current, and can be mass-produced; their development was originally driven by the CD player/CD-ROM revolution, barcode scanners, and other applications where a compact, low-cost laser source is needed.
- Pointers are commonly available with red or green beams, and at 3 mW or 5 mW of power.
- Laser pattern heads and generators have also been developed and are widely used. Pattern heads are either built-in (selected by a thumb-wheel type arrangement) or are in the form of interchangeable tips that slip over the end of the pointer. Passing the laser beam through a pattern head provides for projecting patterns, in the form of arrows, stars, squares or many other pre-designed shapes.
- Patent publication WO 03/036553 discloses an arrangement for, and a method of, projecting an image on a viewing surface by sweeping a light beam along a plurality of scan lines extending over the viewing surface and selectively illuminating parts of the image at selected positions of the light beam on the scan lines.
- The viewing surface can be remote from a housing supporting the arrangement, or can be located on the housing.
- The present invention takes advantage of the general principles of a laser pointer, and provides for sensing an input pattern (graphics) so as to operate an illumination or projection process accordingly, thereby enabling display of an illuminated pattern indicative of the sensed input pattern.
- This allows communication between people at two or more sides by presenting (displaying) at one side the graphic information input at another side.
- The term "communication" as used herein signifies projection of visual patterns from one side, where the pattern is created (input), to at least one other side where the pattern is viewed. It should be noted that the pattern may be viewed at or from the first side as well.
- The term "created" as used herein does not necessarily signify actual patterning (drawing) carried out at the first side, but may also refer to reception of a certain graphics input at the first side by the device of the present invention.
- The first side is not necessarily the side where the pattern (graphics) is created, but may actually be the side where the graphics is input (e.g., received from a remote side) and is projected to be viewed by the device user.
- The terms "first side" and "second side" refer to the two sites where the graphics pattern is, respectively, input and projected.
- The device of the present invention can be used similarly to a standard pen, in that it can be held in the hand and manipulated as if to draw, trace or write text or graphics according to the user's intentions and abilities. Additionally or alternatively, graphics (e.g., text) may be downloaded/uploaded from an external or attached device, or typed via an integrated keypad into the device memory. The user has the option to use the device to project what has been recorded onto a surface, for example by means of rapid deflection or manipulation of a laser beam path.
- A "surface" or "plane" on which a pattern is projected or displayed is a surface of any geometry, whether flat or not; it may or may not be stationary, may be a surface of a certain object (e.g., a person's back), and may even be a "virtual" surface in air space.
- A method for use in communication between two or more parties comprising: identifying a pattern input at a first party side and generating data indicative of the input pattern, and using said data indicative of the input pattern for operating an illumination process to create an illuminated pattern, indicative of said input pattern, on at least one surface exposed to at least one of said two or more party sides.
- The identifying of the pattern may include identifying the pattern created as a certain motion (e.g., the user's motion while drawing, or a motion while scanning certain graphics).
- The pattern to be projected may be created by the user's actuation of a touch screen or keypad, or by the user's operation of a computer mouse.
- The motion pattern may be identified (sensed) using one of the following: a roller-ball system, a joystick/pointing-stick system, a touch-pad or pressure-sensitive display system, an optical sensing system, an imaging system, a gyro and accelerometer system, or a keypad system.
- The pattern identification includes filtering the pattern features to select only those features that are to be included in the illuminated pattern.
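The feature-filtering step described above can be pictured as "blanking": transit strokes between drawn elements are sensed but excluded from illumination. A minimal sketch, in which the `(x, y, pen_down)` sample format and all names are assumptions for illustration, not the patent's representation:

```python
from typing import List, Tuple

# Each sensed sample: position plus a flag saying whether this point
# belongs to the drawn pattern (True) or to a transit/blanked stroke (False).
Sample = Tuple[float, float, bool]

def filter_pattern(samples: List[Sample]) -> List[Tuple[float, float]]:
    """Keep only the features that should appear in the illuminated pattern."""
    return [(x, y) for x, y, pen_down in samples if pen_down]

# Two drawn segments separated by a blanked move of the "pen".
strokes = [(0, 0, True), (1, 0, True), (2, 5, False), (3, 5, True)]
print(filter_pattern(strokes))  # [(0, 0), (1, 0), (3, 5)]
```

The blanked points are still useful to the sensing unit for tracking position; they are simply never sent to the illumination unit.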
- The operation of the illumination process may include operating a light manipulation system (e.g., a deflection system) to direct one or more light beams in accordance with the input pattern.
- The operation of the illumination process may include operating a spatial light modulator (SLM) to affect a light beam passing therethrough in accordance with the input pattern, to thereby produce an output light pattern of the SLM indicative of the identified input pattern; or operating a matrix of light sources in accordance with the input pattern, to thereby produce an output light pattern (structured light).
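For the matrix-of-light-sources variant, driving the illumination unit amounts to mapping the pattern's coordinates onto an on/off grid, one cell per light source. A hedged sketch; the boolean-grid "frame" representation and the function name are assumptions, not the patent's interface:

```python
def pattern_to_frame(points, width, height):
    """Map pattern coordinates onto an on/off matrix of light sources.

    Each True cell stands for one light source switched on; coordinates
    falling outside the matrix are simply ignored.
    """
    frame = [[False] * width for _ in range(height)]
    for x, y in points:
        if 0 <= x < width and 0 <= y < height:
            frame[y][x] = True
    return frame

frame = pattern_to_frame([(0, 0), (2, 1)], width=3, height=2)
print(frame)  # [[True, False, False], [False, False, True]]
```

An SLM-based unit would be driven the same way conceptually, with the grid controlling pixel transmission instead of individual emitters.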
- The data indicative of the identified input pattern is stored and used to operate the illumination process so as to create high-frequency repetitions of the illuminated pattern on the projection surface, such that these repetitions are substantially not noticeable to the human eye.
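The repetition principle above relies on persistence of vision: re-drawing the stored pattern at roughly 25-30 Hz or more makes the flicker substantially unnoticeable. The budget arithmetic below is a generic illustration, not a figure from the patent:

```python
def dwell_time_per_point(num_points: int, refresh_hz: float = 30.0) -> float:
    """Maximum time (in seconds) a scanned beam may spend on each point of
    the pattern if the whole pattern must be redrawn refresh_hz times/sec."""
    return 1.0 / (refresh_hz * num_points)

# A 200-point pattern redrawn at 30 Hz leaves about 167 microseconds
# of beam dwell time per point.
print(f"{dwell_time_per_point(200):.2e}")
```

This is why the deflection system must be fast: the denser the pattern, the shorter the dwell time available at a fixed refresh rate.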
- A method for use in communication between two or more parties comprising: identifying an input motion pattern created at a first party side and generating data indicative of the input pattern; and using said data indicative of the input pattern for operating an illumination process to create an illuminated pattern, indicative of said input motion pattern, on at least one surface exposed to at least one of said two or more party sides.
- A method for projecting a pattern comprising: identifying a pattern input in a communication device, generating data indicative of the input pattern, and using said data indicative of the input pattern for operating an illumination process to create an illuminated pattern, indicative of said input pattern, on at least one plane exposed to the device user.
- A method for use in communication between two or more parties comprising: identifying a pattern input at a first party side and generating data indicative of the input pattern, and using said data indicative of the input pattern for operating an illumination process to create an illuminated pattern, indicative of said input pattern, and to project the illuminated pattern on at least one surface exposed to at least one of said two or more party sides with high-frequency repetitions of said illuminated pattern such that said repetitions are substantially not noticeable to the human eye.
- A device comprising: a sensing unit accommodated at a first party side and operable to identify a pattern input at the first side and generate data indicative of the input pattern; an illumination unit configured and operable to create at least one light pattern; and a control unit connectable to the sensing unit and to the illumination unit, the control unit being configured and operable for receiving the data indicative of the input pattern and generating operating data to operate the illumination unit to create the at least one illuminated light pattern indicative of said input pattern on at least one surface exposed to at least one second party side, the device thereby enabling communication between the first and second parties.
- A device comprising: a sensing unit configured for identifying an input motion pattern created at a first party side and generating data indicative of the input motion pattern; an illumination unit configured and operable to create at least one light pattern; and a control unit connectable to the sensing unit and to the illumination unit, the control unit being configured and operable for receiving the data indicative of the input motion pattern and generating operating data to operate the illumination unit to create the at least one illuminated light pattern indicative of said input pattern on at least one surface exposed to at least one second party side, the device thereby providing for communication between the first and second parties.
- A device comprising: a sensing unit configured for identifying an input pattern created at a first party side and generating data indicative of the input pattern; an illumination unit configured and operable to create at least one light pattern; and a control unit connectable to the sensing unit and to the illumination unit, the control unit being configured and operable for receiving the data indicative of the input pattern and generating operating data to operate the illumination unit to create the at least one illuminated light pattern, indicative of said input pattern, and project said at least one illuminated pattern, with high-frequency repetitions of said illuminated pattern such that said repetitions are substantially not noticeable to the human eye, onto at least one surface exposed to at least one second party side.
- A communication device configured for data exchange with other communication systems via a communication link, the device comprising: a sensing unit configured and operable to identify a graphics pattern in a message input to the communication device and generate data indicative of the input pattern; an illumination unit configured and operable to create at least one light pattern; and a control unit connectable to the sensing unit and to the illumination unit and being configured and operable for receiving the data indicative of the input pattern and generating operating data to operate the illumination unit to create the at least one illuminated light pattern indicative of said input pattern and output said at least one illuminated pattern towards at least one surface.
- The present invention also provides a mobile phone device comprising: a sensing unit configured and operable to identify a graphics pattern in a message input to the mobile phone device and generate data indicative of the input pattern; an illumination unit configured and operable to create at least one light pattern; and a control unit connectable to the sensing unit and to the illumination unit and being configured and operable for receiving the data indicative of the input pattern and generating operating data to operate the illumination unit to create the at least one illuminated light pattern indicative of said input pattern and output said at least one illuminated pattern towards at least one plane.
- The input pattern may be a pattern input by a user of the communication device, a pattern received at the device via a communication link, or a pattern selected by the device user from pre-stored graphics.
- Fig. 1A is a block diagram of a device according to the invention
- Fig. 1B is a flow diagram of a method according to the invention
- Fig. 2 is a schematic illustration of a hand-held (pen-like) device of the present invention
- Fig. 3 illustrates another example of the device of the present invention utilizing a touch screen
- Fig. 4 illustrates a mobile phone device configured according to the invention
- Figs. 5A to 5C schematically illustrate three examples, respectively, of the illumination unit configuration suitable to be used in the device of the present invention
- Fig. 6 illustrates the principles of the "blanking" (data filtering) aspect of the present invention, used when creating data indicative of an input motion pattern to be illuminated/projected
- Fig. 7 illustrates an example of the device of the present invention
- Fig. 8 illustrates an example of the device of the present invention utilizing a light deflection system having separate deflectors for X- and Y-axes deflections
- Fig. 9 schematically illustrates another configuration of a light deflection system suitable to be used in the device of Fig. 8
- Fig. 10 illustrates yet another example of the device of the present invention.
- A device 10 is configured as a so-called "laser drawer" operable as a communication device, wherein the communication between two or more sides is achieved by projecting an input pattern, created by the device 10 at the first side (e.g., by a first user), onto a certain plane or surface exposed to the second side (second user).
- The term "created" does not necessarily signify actual patterning (drawing) carried out at the first side, but may also refer to reception of a certain graphics input (e.g., text) by the device 10.
- The device 10 includes a sensing unit 12, an illumination unit 14, and a control unit (CPU) 16 for operating the illumination unit 14 in accordance with data coming from the sensing unit 12.
- The sensing unit 12 may be incorporated in a common housing 17 (preferably a hand-held housing, for example shaped like a pen) carrying the illumination and control units, or may be associated with one or more external sensors.
- The sensing unit 12 is configured to detect a pattern created at the first side, and to generate data indicative of the detected pattern (the input pattern).
- The sensing unit 12 includes one or more appropriately designed sensors, and may also include, as a constructional part thereof, a processor configured and operable to translate the sensed data into a pattern of coordinates; alternatively, such a processor may be part of the control unit 16.
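The translation of sensed data into a pattern of coordinates, mentioned above, can be sketched as accumulating relative movement samples into absolute positions. The delta-based `(dx, dy)` input format is an assumption for illustration, not the patent's data format:

```python
def deltas_to_coordinates(deltas, start=(0, 0)):
    """Accumulate relative (dx, dy) movement samples into absolute
    coordinates forming the input pattern."""
    x, y = start
    coords = [start]
    for dx, dy in deltas:
        x, y = x + dx, y + dy
        coords.append((x, y))
    return coords

# Three movement samples traced from the origin.
print(deltas_to_coordinates([(1, 0), (0, 2), (-1, 0)]))
# [(0, 0), (1, 0), (1, 2), (0, 2)]
```

Whether this runs in the sensing unit's own processor or in the control unit 16 is, per the description, an implementation choice.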
- The input pattern is indicative of graphics, such as a picture or text.
- This may be graphics created (e.g., "drawn") by the user's operation of the device (e.g., by device motion or via a keypad), or "pre-existing" graphics that were previously saved, downloaded, shared, etc.
- A pattern indicative of graphics to be projected is that of a motion carried out by the individual's limb or by an object which is in physical contact with the individual. It should be understood that the sensing of motion may be implemented with or without direct contact with the moving object (e.g., the individual's limb); for example, motion of the individual's hand over a mobile phone may be sensed by equipping the phone device with a triangulating system of sensors.
- The input pattern indicative of certain graphics is created as a motion pattern.
- The sensing unit 12 is thus configured for sensing a motion or graphics input and generating data indicative thereof.
- The motion pattern is created by a movement of the entire device 10, e.g., a user moves the pen-like device 10 while "drawing" a picture to be presented (projected) to him and/or to another user; thus the sensing unit 12 simply identifies its own motion.
- The sensing unit 12 is capable of detecting the direction and distance of travel effected by the user or by another object whose motion is to be projected. Alternatively, the sensing unit 12 can detect the effected force or acceleration and its direction.
- The sensing unit 12 can utilize at least one of roller balls, touch pads (finger or stylus), optical sensing technology, gyros and accelerometers, joystick-like buttons or pads sensing direction and force, and other suitable known techniques, as will be described more specifically further below.
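For the accelerometer case mentioned above, the motion pattern must be recovered from acceleration rather than measured displacement. A minimal dead-reckoning sketch under that assumption (this naive double integration is an illustration, not the patent's method, and drifts quickly in practice):

```python
def integrate_motion(accels, dt):
    """Return positions obtained by double-integrating evenly spaced
    acceleration samples (simple Euler integration)."""
    velocity, position = 0.0, 0.0
    positions = []
    for a in accels:
        velocity += a * dt         # first integration: acceleration -> velocity
        position += velocity * dt  # second integration: velocity -> position
        positions.append(position)
    return positions

# Constant acceleration of 1 unit/s^2 sampled at 10 Hz for 1 second;
# the Euler result (0.55) slightly overshoots the ideal 0.5*a*t^2 = 0.5.
track = integrate_motion([1.0] * 10, dt=0.1)
print(round(track[-1], 3))  # 0.55
```

Gyro data would be handled analogously for rotation, which is one reason such sensing units combine gyros and accelerometers.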
- The control unit 16 is typically a computer device (a chip with an embedded application, e.g., vector/raster graphics algorithms) preprogrammed for processing and analyzing data coming from the sensing unit 12 and being indicative of the detected pattern (e.g., a motion pattern).
- the control unit 16 receives the pattern-related data (input pattern) and generates output data to operate the illumination (or projection) unit 14 to enable generation of an illuminated (projected) pattern indicative of the input pattern.
- the illumination unit 14 includes a light source assembly 24 that is configured for generating either a single light beam or a plurality of light beams; and, depending on the light source assembly configuration, may also include a light directing assembly 26 shown in the figure in dashed lines.
- a pattern to be projected is identified (sensed), and data indicative of such an input pattern is created.
- the pattern identification may consist of detecting a motion carried out by a user (e.g., actual movement or typing), or may consist of scanning a pre-existing graphics.
- FIG. 2 exemplifies a hand held communication device 100 of the present invention. To facilitate understanding, the same reference numbers are used for identifying components that are common in all the examples of the invention.
- the device 100 has a pen-like housing 17 carrying a sensing unit 12; an illumination unit 14; and a control unit 16. Also provided in the device 100 are power source 29 (battery arrangement) and user interface unit 27 (buttons) allowing the user to operate the device.
- the sensing unit 12 is configured for sensing the motion of the device 100 while being moved by a user and generating measured data (input pattern) indicative of the so-created motion pattern.
- the control unit 16 receives the measured data and processes it to generate output data for operating the illumination unit 14.
- Light 46 exiting the device 100, i.e., light produced by the illumination unit, propagates towards a remote plane (projecting surface).
- the device generally designated 110, includes an internal motion input unit (a sensing unit) 12 in the form of a graphic touch screen, on which a user draws his input pattern (graphics) by a stylus.
- the device also includes illumination and control units which are not shown here.
- Light 46, indicative of the drawn pattern, exits the device 110 via an aperture 90.
- the device 110 may be employed when a user wishes to project graphics onto a surface in front of him, and possibly to see the projected picture concurrently while drawing it.
- the device is preferably configured to provide absolute positioning and a format for review and editing of new (drawn) or stored graphics.
- the device of the present invention may be configured to project the same picture onto more than one plane.
- the illumination unit is configured to define more than one path of light indicative of the sensed pattern (input pattern).
- using a light separating unit (e.g., a beam splitter), two identical light patterns 46 concurrently propagate from the device 110 in different directions towards two different projection planes.
- projection onto two or more different planes may be selectively implemented, i.e., the device normally operating in the single-projection mode and being selectively operable in the multiple-projection mode.
- the beam separating unit may be controllably shiftable between its operative and inoperative positions, e.g., movable with respect to the optical path of a laser beam coming from a laser source.
- the multiple-projection mode may be implemented using multiple illumination sub-units, each including a laser source assembly and possibly also a light directing assembly.
- the control unit operates each of the illumination sub-units in accordance with the same data indicative of the input pattern.
- the present invention may be used in a variety of applications, for example in a mobile phone device.
- the sensing unit is configured and operated to detect an input text pattern 92 (generally, graphics) and generate data indicative of the input pattern.
- the pattern to be identified is that input or selected by a phone user (using the phone keypad or touch screen), or the pattern received in the phone device while being generated at another communication system.
- the sensing unit is capable of identifying the pattern presented on the phone display, or digital data indicative of the received graphics, e.g., a phone message (for example, an SMS).
- the sensing unit may contain any known suitable sensor(s), e.g., accelerometers to sense movement, or may be constituted by a graphics pad of its display unit and a stylus (including even a fingernail) used to input graphics.
- accelerometer configurations are described in the following US Patents: 4,945,765; 5,006,487; 5,447,067; 6,581,465; 6,705,166; 6,750,775. Reference is made to Figs.
- the illumination unit 14 utilizes manipulation of a single laser beam in accordance with the input pattern (e.g., motion pattern or input graphics).
- the illumination unit 14 includes a laser source 24 generating a laser beam, and the light directing assembly 26 in the form of a light beam deflector or manipulator for displacing the beam in accordance with the sensed input pattern.
- the input pattern data is stored in a memory utility of the control unit (16 in Figs. 1 and 2).
- Beam manipulation options generally fall into two categories: reflection and transmission, implemented using mirrors, lenses or fibers moved by galvanometers, piezo-electric actuators or MEMS devices.
- the beam manipulating arrangement 26 is configured for moving a laser beam along two mutually perpendicular axes quickly and precisely and at a reasonable angle of movement in order to be suitable for the needs of the device.
- the beam directing unit 26 may also be designed to optimize laser graphics capabilities (either vector or raster).
- laser graphics utilizes programming (operating or manipulating) a laser beam, by means of a computer system, to draw an image that can be projected onto almost any type of surface, presenting the so-called "electronic paint brush".
- the so-created images can be animated sequences that zoom, dissolve and rotate. It is known to synchronize the fast moving laser beams (reflecting from an array of mirrors) with music to thereby produce fantastic visual displays of crisscrossing, multi-colored beam patterns.
- Laser graphics begin with a small dot of laser light.
- Laser vector graphics utilizes the parallelism of laser beams: when laser beams strike a surface, the reflection back to individual's eyes appears only as a bright dot of light. Laser images are drawn by guiding a laser beam (and thus a very bright dot) along the path of the original drawing.
- the information about this path is to be defined as a series of horizontal and vertical coordinates, which is accomplished through a digitizing process, e.g., utilizing the so-called "digitizing tablet" device.
- the latter consists of the following: The original art is placed on the tablet, pin-registered to assure perfect alignment with each successive frame, and traced by hand one line at a time. The locations of key points along these lines are thus entered into the control unit, which then outputs the individual changes along the horizontal and vertical axes as a connect-the-dot list of instructions.
- these X-Y signals are simultaneously output as operating voltages to scanners (deflectors) of the illumination (projection) unit.
- Blanking can be performed with a third scanner, an acousto-optic modulator, or by electronically controlling the laser output as done with semiconductor lasers.
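The connect-the-dot instruction list with blanking described above can be sketched as follows (an illustrative example, not part of the original disclosure; function and tuple layout are assumptions): each digitized stroke is traced with the laser on, and the jump to the start of the next stroke is made with the laser blanked.

```python
# Sketch: turning digitized strokes (lists of X-Y key points) into a
# flat connect-the-dot instruction list for the X-Y scanners, with
# blanked (laser-off) jumps between strokes. Names are illustrative.

def strokes_to_instructions(strokes):
    """Each instruction is (x, y, laser_on)."""
    instructions = []
    for stroke in strokes:
        first = True
        for (x, y) in stroke:
            # Jump to the start of each stroke with the laser blanked,
            # then draw the remainder of the stroke with the laser on.
            instructions.append((x, y, not first))
            first = False
    return instructions
```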
- Persistence of vision is the only reason that images drawn with laser light appear to exist at all; without it, neither a static laser image nor an animated laser character could be perceived.
- a laser image, after all, is merely a dot of laser light tracing out what is essentially a connect-the-dot picture over and over again, approximately thirty times per second. Without persistence of vision, one would merely see the moving dot. With the benefit of this electro-chemical process, the entire path of the dot is retained. The human eye and brain perceive the image being traced, and not merely the dot which traces it.
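The retrace rate above implies a simple point budget per frame. As a non-limiting arithmetic sketch (scanner speed figures are assumptions for the example):

```python
# Sketch: the per-frame dot budget implied by persistence of vision. If
# the deflectors can position the beam at a given rate (points per
# second) and the whole picture is retraced about thirty times per
# second, the number of dots available per frame follows directly.

def points_per_frame(scanner_points_per_second, refresh_hz=30):
    return scanner_points_per_second // refresh_hz
```

For instance, an assumed 30,000 point-per-second scanner leaves about 1,000 dots to draw each frame of the connect-the-dot picture.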
- Raster graphics utilizes the same persistence of vision phenomena described above, and represents images not by connecting dots and lines as in vector graphics, but by displaying rows and rows of dots. As with television, dots are closely spaced and displayed in fast repetition. The eye and brain merge the dots and the viewer sees a solid two-dimensional object. Raster graphics can be displayed using the same laser deflection and blanking systems used in vector laser graphics. Raster graphics excel over vector graphics in their ability to fill a defined area, and to move very quickly. Certain objects are more easily recognized as a filled area, rather than a vector outline.
- the control unit 16 of the device of the present invention may operate the illumination unit 14 in consideration of this advantage. This can be realized in several ways: (a) upon selection by the user; (b) using a look-up table which defines certain circumstances when raster graphics is to be operated rather than vector graphics; (c) using other adaptive algorithms, such as neural networks, which can decide, by way of "self-improvement", whether to use raster or vector graphics. Circumstances defining when either one of raster and vector graphics is preferred may include parameters of displayed patterns, such as types of shapes and forms, whether the pattern includes single letters or sentences, and parameters related to the environmental conditions in which illumination/projection is to be carried out. In the latter case, the device may include environmental sensor(s), for example a light-meter.
- Piezoelectric elements deflect a light beam depending on the voltage supplied to these elements.
- Piezo actuators are very precise and strong, have low power consumption, and display extremely fast response times, although they suffer from a relatively small scan angle and high expense. Almost any actuator may deflect a beam via mirrors, optic-fiber cantilevers, lenses, prisms, or other beam-moving materials.
- Graphics, animations, abstracts and dynamic beam effects are generated by X-Y scanning of the laser beam using galvanometer scanners.
- the scanners are large (i.e. macroscopic) mechanically controlled mirrors, with limited applicability for small, hand-held devices (e.g., a 3mm tube galvanometer, commercially available from ABEM, Sweden). For two-dimensional scanning, two perpendicular tubes are used.
- the beam directing unit of the device of the present invention utilizes deflectors manufactured by solid-state microelectronics technology, MEMS, which enables smaller size, higher performance, and greater functionality of the device.
- MEMS systems interface with both electronic and non-electronic signals, and interact with the non-electrical physical world as well as the electronic world by merging signal processing with sensing and/or actuation.
- A MEMS system deals with moving-part mechanical elements, making possible miniature systems such as accelerometers, fluid-pressure and flow sensors, gyroscopes, and micro-optical devices.
- MEMS is also widely used to fabricate micro-optical components and optical systems, such as deformable micromirror arrays for adaptive optics, optical scanners for bar-code scanning, and optical switches for fiber-optic communication.
- the illumination unit 14 includes a light source 24 in the form of a two-dimensional array (matrix) of point-like laser sources, generally at 24A, each for generating a laser beam.
- the laser sources 24A are mounted on a planar support element 24B and are arranged in a spaced-apart relationship.
- the light source 24 is operated by the control unit to selectively actuate the laser sources, in accordance with data indicative of the input pattern (motion or input graphics), and to produce a plurality of spatially separated light components (structured light), generally at L, indicative together of the pattern (picture) to be projected (displayed).
- the sensing unit (12 in Figs. 1 and 2) may utilize any graphics input options.
- the main consideration for graphics input is whether the input comes from an internal motion sensing component, an internal or external keypad or graphics pad, or other internal, external or attached devices.
- Some of the standard motion sensing options for use in the device of the present invention include: roller balls, touch pads (finger or stylus), optical sensing technology, gyros and accelerometers, joystick-like buttons or pads to sense direction and force, and many others. Any of these may be used either alone or in combination to sense motion and direction information.
- Any graphics input systems or combination of such systems used in the device of the present invention is capable of sensing direction and distance of travel (or acceleration).
- the sensing unit can be implemented using various configurations. This may be the so-called internal input motion unit, in which case it includes a touch screen, keypad or graphics pad.
- the type of sensor(s) used in the sensing unit determines the type of motion which can be detected.
- the motion sensing unit may include motion sensors of different types.
- the device may include both internal input motion unit in the form of a graphics (touch) screen and a connecting port for connecting to an external motion sensing assembly, and may be operable to selectively actuate either one of the internal and external motion input means.
- the motion sensing unit may utilize a computer mouse that is typically used to perform meaningful and useful two-dimensional instructions on a computer screen by direct translation of the manual sliding of a mouse-like input device on a flat surface which mimics the orientation of the screen itself. This may be a mechanical mouse.
- Such a mouse typically carries a rubber ball slightly protruding from a cage containing two rollers set at right angles. As one rolls the ball across the desktop, it turns the rollers, which in turn send horizontal and vertical positioning information back to the computer, enabling the computer to move the mouse pointer on the screen left, right, up and down.
- the construction and operation of such a mechanical computer mouse are known per se and therefore need not be described in more detail, except to note that mechanical computer mice come in all shapes and sizes, including some shaped and held like a pen with a small roller ball at the tip.
- Another type of known mouse suitable to be used in the present invention is an optical mouse, which has no rolling ball.
- the motion sensing assembly 12 may utilize the optical navigation technology, such as that used in Microsoft's IntelliMouse, where one or more LEDs are used to illuminate the features of a surface, and a miniature camera receives and processes the image and produces direction/speed data.
- This technology does not require a reflective pad, in fact almost any surface will suffice. The need for a fixed surface or reference point may be bypassed by measuring the inertia of movement itself, without any limitation of space.
- Inertial sensing may be performed with two types of sensors: accelerometers which sense translational acceleration, and gyroscopes which sense rotational rate. Together, accelerometers, tilt and pressure sensors, and tiny gyroscopes, can detect exact movements.
- micromechanical accelerometers may be fabricated using the MEMS technology described above.
- graphics input (e.g., via motion sensors) may involve movements made horizontally, as in most desktop environments, or vertically, as in a wall or blackboard type environment.
- Air drawing should preferably be of similar performance for horizontal and vertical surfaces, as, for example, a roller ball is capable of moving the same way in either case.
- Air drawing using three-dimensional accelerometers, for example, allows the input to be processed to determine with a higher degree of accuracy whether the movement is horizontal or vertical. Based on the sensed input pattern, a two-dimensional graphic can be displayed. It should also be noted that motion to be sensed may be made to reproduce a mental concept, or to trace an existing drawing or graphic by physically tracing it with the moving input device.
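The horizontal/vertical determination from a three-axis accelerometer could rely on the static gravity component, as in the following non-limiting sketch (axis conventions and the tolerance value are assumptions of the example):

```python
# Sketch: classifying the drawing plane from the gravity vector reported
# by a three-axis accelerometer held still. Inputs are in units of g
# (magnitude ~1); the device z-axis is assumed normal to its face.

def drawing_plane(gx, gy, gz, tol=0.3):
    """Return 'horizontal', 'vertical' or 'oblique'."""
    if abs(gz) > 1.0 - tol:
        return "horizontal"   # gravity mostly along the device z-axis
    if abs(gz) < tol:
        return "vertical"     # gravity lies within the drawing plane
    return "oblique"
```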
- the device of the present invention may be used as a laser pointer to draw or move the beam point on a surface like a wall to create a drawing or graphic (generally, to create a pattern). The laser point can also be used to trace existing images or objects.
- the movements required to draw or trace with the laser point can be recorded by the motion sensor and processed for immediate or eventual display projection or upload to a computer for analysis.
- graphics input can also be generated by using a light beam (a laser beam) as a two-dimensional scanner.
- a drawing, especially simple line drawings, or even three-dimensional objects can be scanned, and visual and contrast information sensed for example by an integrated camera.
- the scan is then processed to determine the best and most efficient (for example, least detailed) way to display the scanned object so that the projection display resembles the original.
- the following exemplifies the operation of the device of the present invention, i.e., creation of the input pattern.
- Fig. 6 exemplifies a multiple-circle pattern 30 to be illuminated on a surface (projecting plane). Data indicative of this pattern can be created as follows: A user starts his motion with his index finger at point 31A in space. The motion sensing unit starts sensing this motion.
- the mouse buttons can be used: for example, keeping the button pressed while moving the mouse tells the program that this movement is “active” and should be displayed; and releasing the button tells the program that the current movement (while the button is released) is "passive” and should not be displayed.
- the communication device may include user interface means (e.g., buttons) to enable distinguishing between those movements that are and are not to be considered in creating the pattern to be projected.
- This non-displaying or "blanking" of the laser itself is accomplished in a number of manners, utilizing light beam manipulation (controlling the operation of the illuminator). Blanking input may for example be accomplished in the following manner.
- the sensor which makes contact with a surface is pressure sensitive, whereby a firm pressure against the surface indicates a movement intended to be used in the pattern creation (in projection or displaying); a softer pressure (but still contact) against the surface indicates movement describing positional information but not movement for display. It should be understood that other methods of inputting blanks while writing on surfaces may be used as well, such as the assumption that fast movements are positional information movements ("passive" movement) and slow movements are "active" to be used in the pattern creation, or vice versa. Blanking using accelerometers or gyros can also be accomplished with buttons as in the mouse-related example and the above described speed-sensing method.
- non-surface writing can sense changes in a vertical position: dips or lower movements indicate “active" movement for display, and heights or upper movements indicate positioning ("passive” movement).
- motion sensing can utilize both surface- writing aspects and accelerometer or gyro aspects. For example, a user may draw on a surface, and position is determined by accelerometer. When the device is moving while contacting a surface (e.g., surface or pressure sensor is activated), this indicates “active” (display) movement, while lifting the device off the surface (and the surface sensor) is indicative of position information.
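The speed-sensing blanking rule mentioned above (fast movements are positional, slow movements are drawn, or vice versa) might be sketched as follows; this is an illustration only, and the threshold and sample interval are assumptions:

```python
# Sketch: labelling segments of a sensed path as "active" (to be drawn)
# or "passive" (positional/blanked) by speed, per the rule that fast
# movements convey position only. Units and threshold are illustrative.

import math

def classify_segments(points, dt=0.01, speed_threshold=50.0):
    """Label each segment between successive (x, y) samples."""
    labels = []
    for (x0, y0), (x1, y1) in zip(points, points[1:]):
        speed = math.hypot(x1 - x0, y1 - y0) / dt
        labels.append("passive" if speed > speed_threshold else "active")
    return labels
```

The opposite convention (slow = passive) is obtained by inverting the comparison, as the text notes.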
- the control unit (16 in Figs. 1 and 2) generates output data for operating the illumination unit, i.e., for the beam deflector arrangement in the example of Fig. 5A, for the SLM in the example of Fig. 5B, and for the light source assembly in the example of Fig. 5C.
- This processing algorithm is optimized for the specific motion sensing and laser projecting (illumination) options used in the device, for example optimized for such parameters as a power supply to the actuators, parameters defining a response profile, compensation for limitations or physical characteristics of the particular beam-moving or generating system. Additionally, the sensing unit has its own characteristics which must be taken into consideration when interpreting the sensed input pattern.
- the control unit is capable of interpreting the user's intentions and designing instructions for the laser projector (illumination unit) which best match those intentions. It should also be noted that the control unit may be preprogrammed for compressing a projection in the direction of projection: if the projection is to be projected on the floor at some distance from the first party, then the second party, who is assumed to be viewing the horizontal floor (display) with a line of sight more perpendicular to the floor (display) surface, sees the image at normal dimensions rather than elongated. The amount of compression may be calculated by the device based on the pitch angle (sensed by the device) at which the device is held during projection with respect to the projecting surface (floor).
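Under a simplifying flat-floor, small-image assumption (this geometric sketch is an illustration, not the disclosed algorithm), a beam meeting the floor at elevation angle theta stretches lengths along the projection direction by roughly 1/sin(theta), so the pattern can be pre-compressed by sin(theta):

```python
# Sketch: pre-compressing a pattern along the projection axis so that a
# floor projection appears at normal proportions to a viewer looking
# straight down. elevation_deg is the angle between beam and floor.

import math

def compress_pattern(points, elevation_deg):
    k = math.sin(math.radians(elevation_deg))
    # y is taken (by assumption) as the coordinate along the
    # projection direction; it is scaled down to counter elongation.
    return [(x, y * k) for (x, y) in points]
```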
- a similar modification to the projection can be performed if the display surface is vertical and perpendicular to the line of sight of the second party, but not the first party (the device operator).
- the device (its sensing unit) might be capable of sensing the roll orientation of the device and displaying the projection in the proper orientation, irrespective of the orientation in which the device is held in the hand.
- Accelerometers can be used to detect jitter and shaking of the unit during projection and modulate the projection to counter the jitter, stabilizing the projection on the surface.
- Other image stabilizing methods can be used as well.
- the control unit may be configured and operable to allow the user to change the angle of projection (and therefore the size of the display).
- the device may be configured to allow standard display/projection effects, such as pulsed projections, fade-ins and outs, size fluctuations, shaking, rotating, warping, melting, eclipsing, morphing, etc.
- Instructions can also be optimized when a user draws words or graphics using movements whose speed, order and direction may be best suited for manual writing but not for rapid laser scanning; the processing algorithm might decide that a better looking, more efficient result requires projecting movements backwards, jumping between letters or graphics lines using a different positioning/blanking order, or scanning certain graphics horizontally (as in raster graphics).
- the control unit may operate to reduce the size of a displayed area in accordance with a maximum recommended angle of projection.
- the desired approach may be to project all drawings at the same angle, every time.
- the complexity or "fill" of a drawing may result in a dilute or dim image, or may exceed the laser projector's recommended speed or cycle time.
- the angle of projection may be reduced in order to create a better or more aesthetic result.
- Consider the case where a user inputs a point as the graphic. The control unit in the device of the present invention may operate such that even a point, if input into the sensor unit, will be displayed much more dilute and spread out than a laser pointer point (such as the circle mentioned above). This can make it impossible to keep a narrow beam of laser light directed at the eye, since the beam moves around so rapidly and so widely. It should be noted that the projected image (pattern) need not be static: as the laser beam cycles through the graphics, small changes from cycle to cycle will appear to the eye as movement.
- the device may be designed to display animation. Animation may be input to the processor by inputting separate frames, as in a traditional animation. Alternatively, two or more images may be merged by the processor using existing merging algorithms and thus produce more "frames" to smooth the animation.
- a scrolling marquee may be used to display longer text by displaying a window of, say, a few letters at a time, with the text moving across the window.
- Animations may run once, or may be looped (repeated) for extended projection.
- the "animation” may also simply be the display of separate images in sequence, not intending to simulate movement. For example, a sentence may be displayed a few words per image, a second or two per image.
- Frames or images used in these animations or dynamic displays may be inputted in any of the ways described above.
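The scrolling marquee described above amounts to sliding a fixed-width window over the text, one character per frame. As a non-limiting sketch (the window width and padding are assumptions):

```python
# Sketch: generating marquee frames for a longer text. Each frame is a
# fixed-width window; padding lets the text scroll fully in and out.

def marquee_frames(text, window=5):
    padded = " " * window + text + " " * window
    return [padded[i:i + window] for i in range(len(padded) - window + 1)]
```

Each returned frame would be handed to the illumination unit as one image in the animation sequence, looped for extended projection.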
- Handwriting recognition analysis may be applied to the graphics input to convert any handwriting to more standard fonts for projection display. Such converted handwriting can then be manipulated with standard editing tools, for example, cutting and pasting and also even spell correction.
- the motion sensing unit is configured to detect direction and distance of travel effected by the user or another object whose motion is going to be projected, or to detect the effected force or acceleration and its direction.
- the case may be such that signals indicative of the detected motion directly operate the illumination unit to illuminate a pattern indicative of the motion signals (either one-to-one or after some processing by a mapping algorithm).
- the control unit may carry out a pattern recognition algorithm.
- This algorithm includes identification of motion, specific patterns in the motion (direct lines, curves), and repetitive patterns (e.g., a circle). Pattern identification can be either ad-hoc or based on pre-determined patterns to be introduced by the user or selected by him from a look-up table. The analysis results or part thereof may be stored for future use, or directly used by the control unit to operate the illumination unit accordingly. It may also be the case that the user creates a pattern, stores it, and then, using the control unit illuminates a second pattern which is a repetition of the first pattern that he created.
- Reference is made to Fig. 7, exemplifying a pen-like device, generally designated 200, constructed and operated according to the present invention.
- the device 200 includes a sensor unit or graphics input 12, illumination and control units (not shown here), and control buttons 27. Also provided in the device is a graphics screen/pad 40 (e.g., LCD) which may serve as an internal input motion utility or for displaying previews of graphics, either the last graphic inputted (e.g., drawn), or a graphic from the memory.
- a key pad arrangement 42 (multiple buttons) is also provided to allow the user to directly type a text or to scroll through the list of stored graphics and displaying each one on the screen 40. For example, 4-way buttons may be used to choose characters, and a disambiguation system may be employed to speed (and make more accurate) the text entry. When a graphic is chosen, a certain button of the key pad arrangement 42 may be used to select the graphic for projection.
- the device 200 and the separate (external) motion input unit may communicate with one another via wires (or fibers) or via wireless signal transmission (such as Bluetooth technology), or may be attached structurally, e.g., integral within the same unit. To this end, the device is equipped with appropriately designed communication ports 44.
- the device 200 may also allow for modifying inputted or drawn graphics either by an internal preview and mechanism for modification or a similar mechanism on an external or attached device. For example, the graphic may be previewed as a "vector graphic". The dots may be selected and moved by stylus or buttons, and the graphic created by these dots is modified accordingly, possibly to create new frames for animation, as described above.
- the device 200 may be designed in a linear orientation, where an output laser beam 46 propagates straight from the end of the device 200, opposite to the motion input unit 12, or the beam 46 emanates from the device perpendicular to the lengthwise orientation of the device (L-shaped design, where the beam exits from side of the device).
- the device 200 may also include a motion sensor for itself, in order to minimize the resulting unwanted movement or "jiggle" of the device when turning the "record" mode ON and OFF.
- "Jiggle" can be minimized in a number of ways, including easy access to a light pressure button, or a light sensor, at or near the finger or thumb position on the device.
- the initiation and exit can be inferred using an algorithm in the control unit that estimates the start and end of a graphic movement. Even if the intended graphic is embedded within a larger series of movements, the user may then cut away any unintended or extraneous movements with a graphics display device in order to arrive at the intended graphic.
- This "record” button may or may not be the same button as the "blanking” button. For example, a long press of the button may indicate an initiation or exit from the "record” mode, while a short press of the same button may indicate that a blanking should start or stop.
- Blanking and record button conflicts may be avoided in this way, or by assigning either blanking features to the movement interpretation (as described above) or surface pressure sensors (as also described above), and/or record indications to movement processing algorithms. Both extraneous movements and blanking movements can be modified, subtracted or added (as the case may be) after input has been completed, by an integrated or external graphics display device and graphics manipulation methods (for example, passive motion may be represented by different colored lines or dotted lines).
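The long-press/short-press discrimination between "record" and "blanking" on a single button might be implemented as in the following sketch (the 0.5-second boundary is an assumption for illustration):

```python
# Sketch: distinguishing the "record" toggle (long press) from the
# "blanking" toggle (short press) on the same button by press duration.

def interpret_press(duration_s, long_press_s=0.5):
    """Return which mode toggle a button press of this length means."""
    return "record" if duration_s >= long_press_s else "blanking"
```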
- a separate anti-jiggling device may be used for controlling the internal motion input unit.
- Such an anti-jiggling device includes a control unit (CPU) and a transmitter, the communication device thus being equipped with an appropriate signal receiver.
- Reference is made to Figs. 8 to 10, exemplifying devices of the present invention utilizing a light directing assembly based on light deflection.
- the beam deflection can be realized by reflection, transmission or a combination of the two modes. Reflection and transmission can be realized using mirrors, lenses or fibers moved by galvanometers, piezo-electric actuators or MEMS devices. Any of these options is capable of deflecting a beam quickly and precisely, and at a reasonable angle of movement in order to be suitable for the needs of the device.
- Fig. 8 shows a communication device 300 according to the invention.
- the device 300 includes a housing 17 designed so as to be conveniently held by a user, a motion sensing unit 12, a control unit 16, an illumination unit 14 and a power source 29.
- the illumination unit 14 is designed similar to the above-described example of Fig. 3A, namely, includes a laser source 24, and a light directing assembly 26 in the form of a beam deflector.
- the beam deflector assembly 26 is configured as a manipulation device using separate units for deflecting a light beam along the X- and Y-axes, respectively.
- the deflector 26 includes two mirrors 60A and 60B driven (by appropriate actuators which are not specifically shown) for rotation about two mutually perpendicular axes, respectively, thus deflecting the laser beam in two mutually perpendicular directions.
- the laser source 24 and mirrors 60A and 60B are mounted in the so-called "180° back-illumination" configuration, i.e., the laser source 24 emits a laser beam directed towards mirror 60A, which deflects (reflects) the beam to mirror 60B, which in turn deflects the laser beam towards the output direction of the device. Two spatial degrees of freedom for the movement of the laser beam are thus established by this back-directing arrangement.
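The two-mirror arrangement can be illustrated numerically. The following sketch is illustrative only and not part of the patent: the function name, the sample pattern points and the use of a projection plane at a fixed distance are assumptions. It converts a target point of the projected pattern into rotation angles for the X- and Y-mirrors, using the fact that rotating a mirror by an angle theta deflects the reflected beam by 2*theta:

```python
import math

def galvo_angles(x, y, distance):
    """Convert a target point (x, y) on a projection plane at the given
    distance into rotation angles for two single-axis mirrors (one per
    axis, as with mirrors 60A and 60B).  A mirror rotated by theta
    deflects the reflected beam by 2*theta, hence the factor 1/2."""
    theta_x = 0.5 * math.atan2(x, distance)
    theta_y = 0.5 * math.atan2(y, distance)
    return theta_x, theta_y

# trace a small square pattern, one mirror command per vertex
pattern = [(-0.1, -0.1), (0.1, -0.1), (0.1, 0.1), (-0.1, 0.1)]
commands = [galvo_angles(px, py, 2.0) for px, py in pattern]
```

The control unit would stream such angle pairs to the mirror actuators at the rate required by the pattern being projected.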
- This configuration has the advantage of a smaller footprint.
- Fig. 9 shows an example in which a single mirror 60 is used in the "back-illumination" configuration: a beam emitted by a laser source 24 propagates to the mirror 60 along a path 62A, which forms an angle α>90° with the desired output beam direction 62B.
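the geometry of this back-illumination fold can be checked with the law of reflection: the mirror normal must bisect the reversed incident ray and the desired output ray. The following 2-D sketch is illustrative only; the particular direction vectors and angle values are assumptions, not from the patent:

```python
import math

def mirror_normal(d_in, d_out):
    """Unit normal of a flat mirror that reflects a beam travelling along
    direction d_in into direction d_out (2-D unit vectors); by the law of
    reflection the normal is parallel to d_out - d_in."""
    bx, by = d_out[0] - d_in[0], d_out[1] - d_in[1]
    norm = math.hypot(bx, by)
    return (bx / norm, by / norm)

def reflect(d, n):
    """Reflect propagation direction d off a mirror with unit normal n."""
    dot = d[0] * n[0] + d[1] * n[1]
    return (d[0] - 2 * dot * n[0], d[1] - 2 * dot * n[1])

# assumed example: the paths form an angle of 135 degrees (> 90 degrees),
# so the propagation directions differ by 45 degrees
d_in = (math.cos(math.radians(45.0)), math.sin(math.radians(45.0)))
d_out = (1.0, 0.0)                      # desired output direction 62B
n = mirror_normal(d_in, d_out)
r = reflect(d_in, n)                    # should reproduce d_out
```

Tilting the mirror about this nominal orientation then scans the output beam about direction 62B.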
- alternatively, a second stationary mirror can be used, similarly to the example of Fig. 8, where one of the two mirrors would be stationary and the other would be a 2-D scanning mirror.
- the light deflector assembly may be of any known suitable configuration utilizing either one-dimensional or two-dimensional deflectors, for example based on MEMS scanning mirrors.
- Fig. 10 exemplifies yet another configuration of a communication device 400 of the present invention.
- the device is configured generally similar to the previously described examples, namely, includes such main constructional parts as a sensing unit 12, an illumination unit 14, and a control unit 16, as well as a power source 29.
- the illumination unit of device 400 is configured generally similar to the devices shown in Figs. 5A and 9, namely utilizes a laser source 24 and a light directing assembly 26 in the form of a beam deflector (X-Y scanning transmission optics or scanner).
- Device 400 differs from the above-described device 300 in that its deflector assembly 26 is configured for beam manipulation using a single transmission unit that performs the manipulation along both the X- and Y-axes simultaneously. Additionally, the illumination unit of device 400 has a so-called "forward-illumination" configuration: the laser source 24 is oriented such that a laser beam 66 emitted by the laser source propagates towards an output facet 68 of the device. The laser beam 66 passes through the scanner 26, which operates to deflect the laser beam 66 towards the required direction (according to the input motion pattern).
- the scanner 26 may include a crystal 70 with non-parallel facets 70A and 70B.
- variation of the index of refraction of the crystal 70 by application of an external field effects a change in the angle of the axis of propagation of the laser beam 66 exiting the crystal 70 with respect to the input laser beam axis.
- the frequency of the voltage change is dictated by the control unit 16, based on the input pattern to be illuminated/projected.
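the steering principle of the non-parallel-facet element can be sketched with Snell's law at the exit facet: a voltage-induced change of refractive index changes the exit angle, and hence steers the beam. The sketch below is illustrative only; the wedge apex angle, index values and the normal-incidence assumption are not from the patent:

```python
import math

def exit_deflection(n, apex_deg):
    """Angular deviation (radians) of a beam entering a thin wedge at
    normal incidence: inside the wedge the beam meets the exit facet at
    the apex angle A, Snell's law gives sin(theta_out) = n*sin(A), and
    the deviation is theta_out - A (about (n - 1)*A for small A)."""
    A = math.radians(apex_deg)
    return math.asin(n * math.sin(A)) - A

# a small index change (e.g. electro-optically induced) steers the beam
base = exit_deflection(1.500, 1.0)
steered = exit_deflection(1.505, 1.0)
delta = steered - base   # net beam steering caused by the index change
```

Driving the index modulation at the rate dictated by the control unit then traces the required pattern.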
- the crystal 70 is mounted on an X-Y actuator 72.
- the crystal may be replaced by a glass plate with non-parallel facets (e.g., a prism) mounted for movement by the actuator 72; the movement of such a glass plate effects the beam deflection.
- the transmission-based scanner may be of any known suitable configuration, for example based on acousto-optic deflection.
- An example of an acousto-micro-optic deflector is described in US Patent No. 6,751,009. As indicated above, various techniques can be used to affect the intensity and/or direction of a light beam. Some techniques can affect both the intensity and the direction of a light beam, and/or implement one function by means of the other (i.e., deflecting a fraction of the beam can perform either modulation or deflection).
- Such techniques may utilize acousto-optic modulators, electro-optic and magneto-electro-optic effects, a rotating prism or mirror to deflect the beam, or piezo-electric actuators to deflect the beam.
- the intensity or the beam spread can also be modified by changing the transparency or optical characteristics of certain materials that remain constantly in the beam path.
- the light directing arrangement 26 may utilize a shutter 80 mounted for movement across the beam propagation axis so as to selectively block the beam.
- the movement is fast enough to respond before a new active motion pattern feature is to be illuminated/projected.
- the moveable shutter 80 may be designed as a rounded iris with a fast response.
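the blanking behaviour described above can be sketched as a simple schedule: the shutter stays closed while the deflector repositions between pattern features and opens only while a feature is drawn. The function name and the timing parameters below are illustrative assumptions, not from the patent:

```python
def shutter_schedule(features, move_time, draw_time):
    """Build a time-stamped open/close schedule for a beam shutter:
    the beam is blocked during each repositioning move and unblocked
    only while a pattern feature is being illuminated."""
    t, events = 0.0, []
    for feature in features:
        t += move_time                       # reposition, shutter closed
        events.append((t, "open", feature))
        t += draw_time                       # draw, shutter open
        events.append((t, "close", feature))
    return events

schedule = shutter_schedule(["segment-1", "segment-2"],
                            move_time=0.002, draw_time=0.010)
```

The same schedule could equally drive an acousto-optic modulator used in place of a mechanical shutter, as described next.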
- a moveable shutter may be replaced by an acousto-optic modulator. Such a modulator, when operated, creates a drastic change in the index of refraction in front of the crystal 70 (within a region of the beam propagation path), and consequently the beam is blocked.
- in the examples described above, the filter/shutter is located in the optical path of the deflected light.
- alternatively, a filter or shutter may be associated with the laser source, thus affecting the light propagation on its way to the first mirror (generally, the light deflector).
Landscapes
- Engineering & Computer Science (AREA)
- Computer Hardware Design (AREA)
- Theoretical Computer Science (AREA)
- Human Computer Interaction (AREA)
- Physics & Mathematics (AREA)
- General Engineering & Computer Science (AREA)
- General Physics & Mathematics (AREA)
- Signal Processing (AREA)
- Optical Communication System (AREA)
- Position Input By Displaying (AREA)
Abstract
Description
Claims
Priority Applications (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
PCT/IL2004/000614 WO2005006024A2 (en) | 2003-07-09 | 2004-07-08 | Optical method and device for use in communication |
US11/631,478 US20080048979A1 (en) | 2003-07-09 | 2004-07-08 | Optical Method and Device for use in Communication |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US60/485,942 | 2003-07-09 | ||
PCT/IL2004/000614 WO2005006024A2 (en) | 2003-07-09 | 2004-07-08 | Optical method and device for use in communication |
Publications (2)
Publication Number | Publication Date |
---|---|
WO2005006024A2 true WO2005006024A2 (en) | 2005-01-20 |
WO2005006024A3 WO2005006024A3 (en) | 2005-03-24 |
Family
ID=34062107
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/IL2004/000614 WO2005006024A2 (en) | 2003-07-09 | 2004-07-08 | Optical method and device for use in communication |
Country Status (1)
Country | Link |
---|---|
WO (1) | WO2005006024A2 (en) |
Citations (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US5818424A (en) * | 1995-10-19 | 1998-10-06 | International Business Machines Corporation | Rod shaped device and data acquisition apparatus for determining the position and orientation of an object in space |
US20010030642A1 (en) * | 2000-04-05 | 2001-10-18 | Alan Sullivan | Methods and apparatus for virtual touchscreen computer interface controller |
- 2004-07-08 WO PCT/IL2004/000614 patent/WO2005006024A2/en active Application Filing
Patent Citations (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US5818424A (en) * | 1995-10-19 | 1998-10-06 | International Business Machines Corporation | Rod shaped device and data acquisition apparatus for determining the position and orientation of an object in space |
US20010030642A1 (en) * | 2000-04-05 | 2001-10-18 | Alan Sullivan | Methods and apparatus for virtual touchscreen computer interface controller |
Also Published As
Publication number | Publication date |
---|---|
WO2005006024A3 (en) | 2005-03-24 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US20080048979A1 (en) | Optical Method and Device for use in Communication | |
US11252385B2 (en) | Scanning laser projection display for small handheld devices | |
JP4994744B2 (en) | Portable image projector | |
JP4560019B2 (en) | Image projection device | |
US9348144B2 (en) | Display device and control method thereof | |
US20160210780A1 (en) | Applying real world scale to virtual content | |
KR101185670B1 (en) | A dual position display in an electronic device | |
CN117891335A (en) | Keyboard for virtual, augmented and mixed reality display systems | |
JPWO2018003860A1 (en) | Display device, program, display method and control device | |
US20170017323A1 (en) | External user interface for head worn computing | |
US20160026239A1 (en) | External user interface for head worn computing | |
US20080018591A1 (en) | User Interfacing | |
JPWO2018003861A1 (en) | Display device and control device | |
CN108933850A (en) | Mobile terminal | |
CN101095098A (en) | Visual system | |
US9851574B2 (en) | Mirror array display system | |
US20040070563A1 (en) | Wearable imaging device | |
JP6776578B2 (en) | Input device, input method, computer program | |
WO2005006024A2 (en) | Optical method and device for use in communication | |
US20150160543A1 (en) | Mobile microprojector | |
KR20200083762A (en) | A hologram-projection electronic board based on motion recognition | |
CN202058091U (en) | Portable communication device | |
CN101326477A (en) | Assemblies and methods for displaying an image | |
JP2018081415A (en) | Input device, input method and computer program |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
AK | Designated states |
Kind code of ref document: A2 Designated state(s): AE AG AL AM AT AU AZ BA BB BG BR BW BY BZ CA CH CN CO CR CU CZ DE DK DM DZ EC EE EG ES FI GB GD GE GH GM HR HU ID IL IN IS JP KE KG KP KR KZ LC LK LR LS LT LU LV MA MD MG MK MN MW MX MZ NA NI NO NZ OM PG PH PL PT RO RU SC SD SE SG SK SL SY TJ TM TN TR TT TZ UA UG US UZ VC VN YU ZA ZM ZW |
|
AL | Designated countries for regional patents |
Kind code of ref document: A2 Designated state(s): GM KE LS MW MZ NA SD SL SZ TZ UG ZM ZW AM AZ BY KG KZ MD RU TJ TM AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HU IE IT LU MC NL PL PT RO SE SI SK TR BF BJ CF CG CI CM GA GN GQ GW ML MR NE SN TD TG |
|
121 | Ep: the epo has been informed by wipo that ep was designated in this application | ||
WPC | Withdrawal of priority claims after completion of the technical preparations for international publication |
Ref country code: WO |
|
CFP | Corrected version of a pamphlet front page | ||
CR1 | Correction of entry in section i |
Free format text: IN PCT GAZETTE 03/2005 UNDER (30) DELETE "60/485,942 9 JULY 2003 (09.07.2003) US" |
|
WWE | Wipo information: entry into national phase |
Ref document number: 11631478 Country of ref document: US |
|
122 | Ep: pct application non-entry in european phase | ||
WWP | Wipo information: published in national office |
Ref document number: 11631478 Country of ref document: US |