WO2010008657A1 - Method and apparatus for touchless input to an interactive user device - Google Patents

Method and apparatus for touchless input to an interactive user device Download PDF

Info

Publication number
WO2010008657A1
Authority
WO
WIPO (PCT)
Prior art keywords
light
recited
user
light sources
template
Prior art date
Application number
PCT/US2009/042840
Other languages
French (fr)
Inventor
Paul Futter
William O. Camp
Karin Johanne Spalink
Ivan Nelson Wakefield
Original Assignee
Sony Ericsson Mobile Communications Ab
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Sony Ericsson Mobile Communications Ab filed Critical Sony Ericsson Mobile Communications Ab
Priority to EP09789638A priority Critical patent/EP2304532A1/en
Publication of WO2010008657A1 publication Critical patent/WO2010008657A1/en

Links

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/017Gesture based interaction, e.g. based on a set of recognized hand gestures
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/041Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
    • G06F3/042Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means by opto-electronic means
    • G06F3/0421Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means by opto-electronic means by interrupting or reflecting a light beam, e.g. optical touch-screen

Definitions

  • the present invention relates to interactive user devices, more particularly to providing for touchless user input to such devices.
  • Mobile communication devices such as cellular phones, laptop computers, pagers, personal communication system (PCS) receivers, personal digital assistants (PDA), and the like, provide advantages of ubiquitous communication without geographic or time constraints. Advances in technology and services have also given rise to a host of additional features beyond that of mere voice communication including, for example, audio-video capturing, data manipulation, electronic mailing, interactive gaming, multimedia playback, short or multimedia messaging, web browsing, etc.
  • Other enhancements such as location-awareness features, e.g., satellite positioning system (SPS) tracking, enable users to monitor their location and receive, for instance, navigational directions.
  • SPS satellite positioning system
  • The above-described needs are fulfilled, at least in part, by mounting a plurality of light sources spaced from each other in a defined spatial relationship, for example in a linear configuration, on a surface of an interactive user device.
  • At least one light sensor is also positioned at the surface of the housing. The light sensor senses light that is reflected from an object placed by the user, such as the user's finger, within an area of the light generated by the light sources.
  • a processor in the user device can recognize the sensed reflected light as a user input command correlated with a predefined operation and respond accordingly to implement the operation.
  • the interactive device may be a mobile phone or other hand held device.
  • the predefined operation may relate to any function of the device that is normally responsive to user input.
  • a viable alternative is provided for keypad, joystick and mouse activation.
  • This alternative is not limited to handheld devices as it is applicable also to computer systems.
  • Each of the light sources preferably exhibits an identifiable unique characteristic.
  • the light sources may comprise LED's of different colors or emanate signals of different pulse rates.
  • the light sensor can identify components of the reflected light with corresponding sources.
  • the relative magnitudes of the one or more components are used as an indication of the position, in single dimension or two-dimension, of the user object.
  • the position is correlated by the processor with a predefined device operation.
  • Each light source may have an outer layer of film through which a unique image can be projected. The projected image may aid the user for positioning the user object.
  • the position of the user object may be linked to the device display. For example, one or more of the predetermined operations may be displayed as a menu listing. A listed element may be highlighted in the display as the user's object attains the spatial position associated with the element. Selection of a particular input may be completed by another user input, such as an audible input sensed by a microphone or a capacitive sensor, to trigger the operation by the processor.
  • a plurality of light sensors may be mounted on the housing surface. The number of sensors may be equal in number to the number of sources and positioned in a defined spatial relationship with respective sources, for example, linearly configured and in longitudinal alignment with the sources.
  • the processor can correlate the relative linear position of the light source with a predefined device operation.
  • This exemplified configuration of sources and sensors also can be used to track real time movement of the user object. For example, a sweep of the user's finger across the light beams generated by a particular plurality of adjacent sources can be correlated to device function (for example, terminating a call), while the sweep across a different plurality of light beams can be correlated with a different device function.
  • a retractable template can be provided at the bottom of the device.
  • the template may be imprinted with a plurality of two-dimensional indicia on its upper surface.
  • the template can be extended laterally from the housing to lie flat on the surface supporting the housing.
  • Each of the indicia can be correlated with a device function, as a guide for the appropriate positioning of the user's finger.
  • the template may be coupled electrically to the processor so that touching one of the indicia will trigger the photo sensing operation. For example, at each of the indicia a switch may be operable by depression of the user's finger to signal the processor. Alternatively, a capacitive sensor may be employed.
  • each of the indicia may represent a text entry, similar to an English language keyboard.
  • When extended to a different position, the template may represent text entry for a different language or, instead, a plurality of different input commands. Correlation of indicia positions with device operations for different lengths of template extension may be stored in the memory of the device.
  • the position of the user object in both the two-dimensional lateral and longitudinal components can be determined by the processor in response to the input data received from the plurality of sensors.
  • the distance in the lateral direction i.e., the direction parallel to the housing surface, can be determined based on the relative magnitudes of light sensed among the light sensors.
  • the distance in the longitudinal direction i.e., the direction perpendicular to the housing surface, also can be determined based on the relative magnitudes of the totality of the sensed reflected light.
  • FIG. 1 is a block diagram of an interactive user device, exemplified as a mobile communication device.
  • FIG. 2 is a perspective view of a configuration including a plurality of light sources with corresponding photo-sensors.
  • FIG. 3 is a variation of the configuration shown in FIG. 2.
  • FIG. 4 is a plan view of a configuration such as shown in FIG. 2 illustrative of one mode of operation.
  • FIG. 5 is a plan view of a configuration such as shown in FIG. 2 with additional modification.
  • FIG. 6 is a flow chart exemplifying one mode of operation.
  • FIG. 1 is a block diagram of a mobile communication device such as a mobile phone.
  • mobile communication device 100 includes one or more actuators 101, communications circuitry 103, camera 105, one or more sensors 107, and user interface 109. While specific reference will be made thereto, it is contemplated that mobile communication device 100 may embody many forms and include multiple and/or alternative components.
  • User interface 109 includes display 111, keypad 113, microphone 115, and speaker 117.
  • Display 111 provides a graphical interface that permits a user of mobile communication device 100 to view call status, configurable features, contact information, dialed digits, directory addresses, menu options, operating states, time, and other service information, such as physical configuration policies associating triggering events to physical configurations for automatically modifying a physical configuration of mobile communication device 100, scheduling information (e.g., date and time parameters) for scheduling these associations, etc.
  • the graphical interface may include icons and menus, as well as other text, soft controls, symbols, and widgets. In this manner, display 111 enables users to perceive and interact with the various features of mobile communication device 100.
  • Keypad 113 may be a conventional input mechanism. That is, keypad 113 may provide for a variety of user input operations. For example, keypad 113 may include alphanumeric keys for permitting entry of alphanumeric information, such as contact information, directory addresses, phone lists, notes, etc. In addition, keypad 113 may represent other input controls, such as a joystick, button controls, dials, etc. Various portions of keypad 113 may be utilized for different functions of mobile communication device 100, such as for conducting voice communications, SMS messaging, MMS messaging, etc. Keypad 113 may include a "send" key for initiating or answering received communication sessions, and an "end” key for ending or terminating communication sessions.
  • Special function keys may also include menu navigation keys, for example, for navigating through one or more menus presented via display 111, to select different mobile communication device functions, profiles, settings, etc.
  • Other keys associated with mobile communication device 100 may include a volume key, an audio mute key, an on/off power key, a web browser launch key, a camera key, etc. Keys or key-like functionality may also be embodied through a touch screen and associated soft controls presented via display 111.
  • Microphone 115 converts spoken utterances of a user into electronic audio signals, while speaker 117 converts audio signals into audible sounds. Microphone 115 and speaker 117 may operate as parts of a voice (or speech) recognition system.
  • a user via user interface 109, can construct user profiles, enter commands, generate user-defined policies, initialize applications, input information (e.g., physical configurations, scheduling information, triggering events, etc.), and select options from various menu systems of mobile communication device 100.
  • input information e.g., physical configurations, scheduling information, triggering events, etc.
  • Communications circuitry 103 enables mobile communication device 100 to initiate, receive, process, and terminate various forms of communications, such as voice communications (e.g., phone calls), SMS messages (e.g., text and picture messages), and MMS messages.
  • Communications circuitry 103 can enable mobile communication device 100 to transmit, receive, and process data, such as endtones, image files, video files, audio files, ringbacks, ringtones, streaming audio, streaming video, etc.
  • the communications circuitry 103 includes audio processing circuitry 119, controller (or processor) 121, location module 123 coupled to antenna 125, memory 127, transceiver 129 coupled to antenna 131, and wireless controller 133 (e.g., a short range transceiver) coupled to antenna 135.
  • Wireless controller 133 acts as a local wireless interface, such as an infrared transceiver and/or a radio frequency adaptor (e.g., Bluetooth adapter), for establishing communication with an accessory, hands-free adapter, another mobile communication device, computer, or other suitable device or network.
  • a radio frequency adaptor e.g., Bluetooth adapter
  • Processing communication sessions may include storing and retrieving data from memory 127, executing applications to allow user interaction with data, displaying video and/or image content associated with data, broadcasting audio sounds associated with data, and the like.
  • memory 127 may represent a hierarchy of memory, which may include both random access memory (RAM) and read-only memory (ROM).
  • Computer program instructions, such as "automatic physical configuration" application instructions, and corresponding data for operation, can be stored in non-volatile memory, such as erasable programmable read-only memory (EPROM), electrically erasable programmable read-only memory (EEPROM), and/or flash memory; however, they may also be stored in other types or forms of storage.
  • Memory 127 may be implemented as one or more discrete devices, stacked devices, or integrated with controller/processor 121.
  • Memory 127 may store program information, such as one or more user profiles, one or more user defined policies, one or more triggering events, one or more physical configurations, scheduling information, etc.
  • system software, specific device applications, program instructions, program information, or parts thereof may be temporarily loaded to memory 127, such as to a volatile storage device, e.g., RAM.
  • Communication signals received by mobile communication device 100 may also be stored to memory 127, such as to a volatile storage device.
  • Controller/processor 121 controls operation of mobile communication device 100 according to programs and/or data stored to memory 127. Control functions may be implemented in a single controller (or processor) or via multiple controllers (or processors). Suitable controllers may include, for example, both general purpose and special purpose controllers, as well as digital signal processors, local oscillators, microprocessors, and the like. Controller/processor 121 may also be implemented as a field programmable gate array (FPGA) controller, reduced instruction set computer (RISC) processor, etc. Controller/processor 121 may interface with audio processing circuitry 119, which provides basic analog output signals to speaker 117 and receives analog audio inputs from microphone 115.
  • FPGA field programmable gate array
  • RISC reduced instruction set computer
  • Controller/processor 121, in addition to orchestrating various operating system functions, can also enable execution of software applications.
  • One such application can be triggered by event detector module 137.
  • Event detector 137 is responsive to a signal from the user to initiate processing data received from sensors, as to be more fully described below.
  • the processor implements this application to determine the spatial location of the user object and to identify a user input command associated therewith.
  • FIG. 2 is a perspective view of a housing 200 of an interactive device, such as the communication device exemplified in FIG. 1.
  • a lower surface of the housing may be placed to rest on a planar support surface, such as a table, desk or counter.
  • Mounted on the side surface of the housing is a linear array of six light sources 202 and a corresponding linear array of six photo-sensors 204.
  • the sources may comprise, for example, light emitting diodes (LEDs). As shown, each light source is in relative vertical alignment on the side surface of the housing.
  • Illustrated in the drawing figure is a user's finger placed in proximity to the fourth vertically aligned pair of light source and photosensor.
  • the position of the user's hand represents the selection by the user of a specific input command to be transmitted to the processor.
  • the light generated by the source of this pair is reflected back to the photo-sensor of the pair.
  • the user may use any object dimensioned to provide appropriate overlap of a single generated light beam. Data received from the plurality of photo-sensors are processed to determine which photo-sensor has the strongest response to light generated by the LEDs.
  • the linear position of the user object can be determined by the processor by evaluating the relative strengths of the received photo-sensor inputs.
  • the processor can then access a database that relates position to predefined operation input selections.
  • the user selection is implemented by sensing a static placement of the object in the vicinity of a photo-sensor. As the user's finger or object must be moved to the desired position to effect the command selection, provision may be made to prevent reading of the sensor outputs until the user object has attained the intended position. Such provision may be implemented by triggering reading of the sensor outputs in response to an additional criterion.
  • criterion may comprise, for example, an audible input to the device microphone.
  • Such input may be a voice command or an audible tapping of the support surface when the object has reached its intended position.
  • Another such input may be a change in sensed capacitance when the user object is placed sufficiently close to the housing.
  • FIG. 2 may also be operated in a dynamic mode.
  • the user's finger or other object may be moved over time across the path of a plurality of the light beams. Such movement can be tracked to provide the processor with a corresponding time sequence of sources and, thus, object positions.
  • Specific user interface commands can be mapped in memory to respective various combinations of position sequences. For example, a finger sweep across all light beams may be translated to a command for terminating a call.
  • FIG. 3 is a variation of the configuration shown in FIG. 2, wherein light from fewer sources reflects from the user object to fewer sensors.
  • the LEDs may be of different colors or may produce light signals of different pulse widths.
  • Light sensed by the photo-sensors thus may be identified with respective sources.
  • the processor can access a database that correlates light beam characteristics with the light sources.
  • Specifically illustrated are two sources 202 located near respective ends of the housing.
  • Sensor 204 is located near the center of the housing.
  • the user's finger is positioned intermediate the two sources in the vertical (or lateral) direction, somewhat closer to the upper source.
  • the light reflected from the object to the photo sensor 204 comprises a beam generated by the upper source and a beam generated by the lower source. As the object (finger) is closer to the upper source, its reflected beam will be of greater amplitude than the beam reflected by the lower source.
  • the lateral position of the object along-side the device can be determined by evaluating the relative strengths of the light received by sensor 204.
  • the beam components are distinguishable by virtue of their unique characteristics.
  • FIG. 4 is illustrative of an operational mode in which the two-dimensional position of the object can be determined using a configuration of light sources and photo-sensors such as shown in FIG. 2.
  • the user's finger is depicted in a first position that is relatively close to the housing and a second position that is further from the housing. In the first position, as the object is close in the longitudinal (horizontal) direction, only a few light source reflections will reach the third photo-sensor 204. Three such beams are illustrated, the reflected beam of the closest source being the strongest of the three.
  • the processor can evaluate the relative strengths of all reflected beams while identifying received data with the respective photo-sensors. This evaluation can determine the object location in the lateral direction (parallel to the housing edge) as well as its distance from the phone edge, i.e., the object location in the longitudinal direction.
  • FIG. 5 is a top view of the housing at rest on a support surface.
  • Template 210 is retractably coupled to the housing 200 near its bottom. Shown in a position extended from the housing, as indicated by the arrow, the template 210 can lie flat on the support surface to ease user implementation. The template can be retracted in the direction opposite the arrow to be encompassed by the housing when not in use.
  • the template 210 is imprinted with a plurality of indicia 212 on its upper surface.
  • the indicia are exemplified by a two-dimensional spaced array in rows and columns.
  • the indicia may be images of icons that are recognizable by the user.
  • the two-dimensional position of each of the indicia can be correlated with a device function and serve as a guide for the appropriate positioning of the user's finger.
  • the template may be coupled electrically to the processor so that touching one of the indicia will trigger the photo sensing operation. For example, at each of the indicia a switch may be operable by depression of the user's finger to signal the processor. Alternatively, a capacitive sensor may be employed.
  • the template may be utilized in a plurality of extended positions, the indicia representing a different set of commands for each extended position.
  • each of the indicia may represent a text entry, similar to an English language keyboard.
  • the template may represent text entry for a different language or, instead, a plurality of different input commands. Correlation of indicia positions with device operations for different lengths of template extension may be stored in the memory of the device.
  • FIG. 6 is a flowchart exemplifying a typical mode of operation.
  • Step 601 may be initiated in response to another command from the processor in dependence on a particular mode of operation of the device that calls for user input, or may be active at any time in the powered mode.
  • At step 603, a determination is made as to whether data representing sensed reflected light are to be input to the processor. For example, a triggering signal may be required to indicate that the user object has been placed at the desired location and that a selection is to be made, such as in the utilization of the two-dimensional template. (If, in another mode of operation, no triggering signal is required, step 603 may not be necessary.) If it is determined in step 603 that readout of the data produced by the light sensors is not to be activated, the flow chart reverts to step 601. If it is determined at step 603 that sensed reflected light is to be used to activate a user input selection, the sensed data are input to the processor at step 605.
  • the processor evaluates the received data to determine the spatial position of the object. This evaluation may lead to a determination of a linear position for one dimensional operational mode or a determination of a two-dimensional position in other modes of operation.
  • the processor accesses an appropriate data base in the memory to correlate the determined position of the object with the appropriate selected command.
  • the command is implemented by the processor.
  • the flow chart process can end at this point or revert to step 601 for receipt of another user input.

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Position Input By Displaying (AREA)

Abstract

A plurality of light sources is mounted on a housing of an interactive user device. The sources are spaced from each other in a defined spatial relationship, for example in a linear configuration. At least one light sensor is also positioned at the surface of the housing. The light sensor senses light that is reflected from an object placed by the user, such as the user's finger, within an area of the light generated by the light sources. A processor in the user device can recognize the sensed reflected light as a user input command correlated with a predefined operation and respond accordingly to implement the operation.

Description

METHOD AND APPARATUS FOR TOUCHLESS INPUT
TO AN INTERACTIVE USER DEVICE Field of the Invention
The present invention relates to interactive user devices, more particularly to providing for touchless user input to such devices.
Background
Mobile communication devices, such as cellular phones, laptop computers, pagers, personal communication system (PCS) receivers, personal digital assistants (PDA), and the like, provide advantages of ubiquitous communication without geographic or time constraints. Advances in technology and services have also given rise to a host of additional features beyond that of mere voice communication including, for example, audio-video capturing, data manipulation, electronic mailing, interactive gaming, multimedia playback, short or multimedia messaging, web browsing, etc. Other enhancements, such as location-awareness features, e.g., satellite positioning system (SPS) tracking, enable users to monitor their location and receive, for instance, navigational directions.
The focus of the structural design of mobile phones continues to stress compactness of size, incorporating powerful processing functionality within smaller and slimmer phones. Convenience and ease of use continue to be objectives for improvement, extending, for example, to development of hands-free operation. Users may now communicate through wired or wireless headsets that enable users to speak with others without having to hold their mobile communication devices to their heads. Device users, however, must still physically manipulate their devices. The plethora of additional enhancements increases the need for user input that is implemented by components such as keypad and joystick-type elements. As these elements become increasingly smaller in handheld devices, their use can become cumbersome. In addition, development of joystick mechanics and display interaction for these devices has become complex and these elements more costly.
Accordingly, a need exists for a more convenient and less expensive means for providing user input to an interactive user device.
Disclosure
The above-described needs are fulfilled, at least in part, by mounting a plurality of light sources spaced from each other in a defined spatial relationship, for example in a linear configuration, on a surface of an interactive user device. At least one light sensor is also positioned at the surface of the housing. The light sensor senses light that is reflected from an object placed by the user, such as the user's finger, within an area of the light generated by the light sources. A processor in the user device can recognize the sensed reflected light as a user input command correlated with a predefined operation and respond accordingly to implement the operation.
The interactive device, for example, may be a mobile phone or other handheld device. The predefined operation may relate to any function of the device that is normally responsive to user input. Thus, a viable alternative is provided for keypad, joystick and mouse activation. This alternative is not limited to handheld devices as it is applicable also to computer systems.
Each of the light sources preferably exhibits an identifiable unique characteristic. For example, the light sources may comprise LEDs of different colors or emanate signals of different pulse rates. The light sensor can identify components of the reflected light with corresponding sources. The relative magnitudes of the one or more components are used as an indication of the position, in one or two dimensions, of the user object. The position is correlated by the processor with a predefined device operation. Each light source may have an outer layer of film through which a unique image can be projected. The projected image may aid the user in positioning the user object.
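The patent text does not spell out how the reflected components are separated; one illustrative way to do it when each LED blinks at its own pulse rate is to correlate the sampled sensor signal against each source's known on/off pattern. The sketch below is an assumption-laden illustration (the sample rate, pulse rates and signal shapes are invented), not the patented method itself.

import numpy as np

# Hypothetical sketch: each LED is modulated at its own pulse rate, and the
# sampled photo-sensor signal is correlated against each source's reference
# pattern to recover one reflected-light magnitude per source.
SAMPLE_RATE_HZ = 1000
SOURCE_PULSE_RATES_HZ = [50, 70, 90, 110, 130, 150]   # one invented rate per LED

def source_pattern(rate_hz, n_samples, sample_rate=SAMPLE_RATE_HZ):
    """Reference on/off (0/1) pattern for one LED."""
    t = np.arange(n_samples) / sample_rate
    return (np.sin(2 * np.pi * rate_hz * t) > 0).astype(float)

def per_source_magnitudes(sensor_samples):
    """Correlate the sensor signal with each LED's pattern; a larger value
    means a stronger reflection of that LED off the user object."""
    x = np.asarray(sensor_samples, dtype=float)
    x = x - x.mean()
    magnitudes = []
    for rate in SOURCE_PULSE_RATES_HZ:
        ref = source_pattern(rate, len(x))
        ref = ref - ref.mean()
        magnitudes.append(float(np.dot(x, ref)) / len(x))
    return magnitudes

# Synthetic check: a reflection dominated by source index 3 is identified as such.
n = 2000
signal = 0.2 * source_pattern(SOURCE_PULSE_RATES_HZ[3], n) + 0.05 * np.random.rand(n)
print(int(np.argmax(per_source_magnitudes(signal))))   # -> 3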
The position of the user object may be linked to the device display. For example, one or more of the predetermined operations may be displayed as a menu listing. A listed element may be highlighted in the display as the user's object attains the spatial position associated with the element. Selection of a particular input may be completed by another user input, such as an audible input sensed by a microphone or a capacitive sensor, to trigger the operation by the processor.
A plurality of light sensors may be mounted on the housing surface. The number of sensors may be equal to the number of sources and positioned in a defined spatial relationship with respective sources, for example, linearly configured and in longitudinal alignment with the sources. As the position of the user object is in proximity to the light sensor (and its paired light source) that detects the greatest amount of reflected light, the processor can correlate the relative linear position of the light source with a predefined device operation. This exemplified configuration of sources and sensors also can be used to track real-time movement of the user object. For example, a sweep of the user's finger across the light beams generated by a particular plurality of adjacent sources can be correlated to a device function (for example, terminating a call), while a sweep across a different plurality of light beams can be correlated with a different device function.
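The display linkage and second-input confirmation described above might be organized as follows; this is a minimal sketch with invented menu entries, not an implementation taken from the patent.

# Hypothetical sketch of the display linkage: the highlighted menu entry follows
# the slot the user object currently occupies, and the operation only runs once
# a separate confirming input (voice, tap, or capacitive touch) is detected.
MENU = ["Answer", "End call", "Contacts", "Messages", "Camera", "Settings"]

def highlighted_entry(current_slot):
    """Menu entry to highlight for the slot the user object currently occupies."""
    return MENU[current_slot] if 0 <= current_slot < len(MENU) else None

def confirm_selection(current_slot, confirmation_received, perform):
    """Carry out the highlighted operation only after the second user input."""
    entry = highlighted_entry(current_slot)
    if entry is not None and confirmation_received:
        perform(entry)

confirm_selection(1, True, lambda operation: print("run:", operation))   # -> run: End call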
The light sources and photo-sensors preferably are mounted on a side surface of the device housing. The user can then place the device on a table or countertop easily within reach of the user's hand. A retractable template can be provided at the bottom of the device. The template may be imprinted with a plurality of two-dimensional indicia on its upper surface. The template can be extended laterally from the housing to lie flat on the surface supporting the housing. Each of the indicia can be correlated with a device function, as a guide for the appropriate positioning of the user's finger. The template may be coupled electrically to the processor so that touching one of the indicia will trigger the photo-sensing operation. For example, at each of the indicia a switch may be operable by depression of the user's finger to signal the processor. Alternatively, a capacitive sensor may be employed.
When fully extended, each of the indicia may represent a text entry, similar to an English-language keyboard. When extended to a different position, the template may represent text entry for a different language or, instead, a plurality of different input commands. Correlation of indicia positions with device operations for different lengths of template extension may be stored in the memory of the device.
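The stored correlation between indicia positions and device operations for different template extensions could be organized as a simple per-extension lookup table; the sketch below is illustrative only, and every name and table entry is invented.

# Illustrative per-extension lookup: one command table per template position,
# keyed by the (row, column) of the indicium under the user's finger.
TEMPLATE_MAPS = {
    "fully_extended": {                       # e.g. a keyboard-like text layout
        (0, 0): "type_q", (0, 1): "type_w", (0, 2): "type_e",
        (1, 0): "type_a", (1, 1): "type_s", (1, 2): "type_d",
    },
    "half_extended": {                        # e.g. a small set of device commands
        (0, 0): "answer_call", (0, 1): "end_call",
        (1, 0): "volume_up",   (1, 1): "volume_down",
    },
}

def command_for_indicium(extension, row, col):
    """Operation associated with the touched indicium at this template extension."""
    return TEMPLATE_MAPS.get(extension, {}).get((row, col))

print(command_for_indicium("half_extended", 1, 0))    # -> volume_up
print(command_for_indicium("fully_extended", 0, 2))   # -> type_e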
The two-dimensional position of the user object, in both its lateral and longitudinal components, can be determined by the processor in response to the input data received from the plurality of sensors. The distance in the lateral direction, i.e., the direction parallel to the housing surface, can be determined based on the relative magnitudes of light sensed among the light sensors. The distance in the longitudinal direction, i.e., the direction perpendicular to the housing surface, can also be determined based on the relative magnitudes of the totality of the sensed reflected light.
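As one hedged reading of the paragraph above: the lateral coordinate can be taken as a weighted centroid of the per-sensor magnitudes, and the longitudinal distance can be read from a calibration of total reflected power against distance. Both of those specific choices, and all numbers below, are assumptions rather than patent text.

# Assumed 2-D estimate: lateral position as a weighted centroid of per-sensor
# magnitudes; longitudinal distance from a monotone calibration of total
# reflected power versus distance. Spacing and calibration data are invented.
SENSOR_PITCH_MM = 10.0    # hypothetical spacing between adjacent photo-sensors

def lateral_position_mm(magnitudes):
    """Centroid of the sensed magnitudes along the sensor row (0 = first sensor)."""
    total = sum(magnitudes)
    if total == 0:
        return None                      # nothing reflected: no object present
    centroid = sum(i * m for i, m in enumerate(magnitudes)) / total
    return centroid * SENSOR_PITCH_MM

def longitudinal_distance_mm(magnitudes, calibration):
    """Distance from the housing, read off the nearest (total_power, distance) entry."""
    total = sum(magnitudes)
    return min(calibration, key=lambda pair: abs(pair[0] - total))[1]

calibration = [(1.2, 10), (0.6, 20), (0.3, 40), (0.1, 80)]   # invented data points
readings = [0.05, 0.20, 0.25, 0.08, 0.01, 0.01]
print(round(lateral_position_mm(readings), 1))           # -> 17.2 (mm along the array)
print(longitudinal_distance_mm(readings, calibration))   # -> 20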
Brief Description of the Drawings
The present invention is illustrated by way of example, and not by way of limitation, in the figures of the accompanying drawing and in which like reference numerals refer to similar elements and in which:
FIG. 1 is a block diagram of an interactive user device, exemplified as a mobile communication device;
FIG. 2 is a perspective view of a configuration including a plurality of light sources with corresponding photo-sensors.
FIG. 3 is a variation of the configuration shown in FIG. 2.
FIG. 4 is a plan view of a configuration such as shown in FIG. 2 illustrative of one mode of operation.
FIG. 5 is a plan view of a configuration such as shown in FIG. 2 with additional modification.
FIG. 6 is a flow chart exemplifying one mode of operation.
Detailed Description
In the following description, for purposes of explanation, numerous specific details are set forth in order to provide a thorough understanding of exemplary embodiments. It should be appreciated that exemplary embodiments may be practiced without these specific details or with an equivalent arrangement. In other instances, well-known structures and devices are shown in block diagram form in order to avoid unnecessarily obscuring exemplary embodiments.
FIG. 1 is a block diagram of a mobile communication device such as a mobile phone. In this example, mobile communication device 100 includes one or more actuators 101, communications circuitry 103, camera 105, one or more sensors 107, and user interface 109. While specific reference will be made thereto, it is contemplated that mobile communication device 100 may embody many forms and include multiple and/or alternative components.
User interface 109 includes display 111, keypad 113, microphone 115, and speaker 117. Display 111 provides a graphical interface that permits a user of mobile communication device 100 to view call status, configurable features, contact information, dialed digits, directory addresses, menu options, operating states, time, and other service information, such as physical configuration policies associating triggering events to physical configurations for automatically modifying a physical configuration of mobile communication device 100, scheduling information (e.g., date and time parameters) for scheduling these associations, etc. The graphical interface may include icons and menus, as well as other text, soft controls, symbols, and widgets. In this manner, display 111 enables users to perceive and interact with the various features of mobile communication device 100.
Keypad 113 may be a conventional input mechanism. That is, keypad 113 may provide for a variety of user input operations. For example, keypad 113 may include alphanumeric keys for permitting entry of alphanumeric information, such as contact information, directory addresses, phone lists, notes, etc. In addition, keypad 113 may represent other input controls, such as a joystick, button controls, dials, etc. Various portions of keypad 113 may be utilized for different functions of mobile communication device 100, such as for conducting voice communications, SMS messaging, MMS messaging, etc. Keypad 113 may include a "send" key for initiating or answering received communication sessions, and an "end" key for ending or terminating communication sessions. Special function keys may also include menu navigation keys, for example, for navigating through one or more menus presented via display 111, to select different mobile communication device functions, profiles, settings, etc. Other keys associated with mobile communication device 100 may include a volume key, an audio mute key, an on/off power key, a web browser launch key, a camera key, etc. Keys or key-like functionality may also be embodied through a touch screen and associated soft controls presented via display 111.
Microphone 115 converts spoken utterances of a user into electronic audio signals, while speaker 117 converts audio signals into audible sounds. Microphone 115 and speaker 117 may operate as parts of a voice (or speech) recognition system. Thus, a user, via user interface 109, can construct user profiles, enter commands, generate user-defined policies, initialize applications, input information (e.g., physical configurations, scheduling information, triggering events, etc.), and select options from various menu systems of mobile communication device 100.
Communications circuitry 103 enables mobile communication device 100 to initiate, receive, process, and terminate various forms of communications, such as voice communications (e.g., phone calls), SMS messages (e.g., text and picture messages), and MMS messages. Communications circuitry 103 can enable mobile communication device 100 to transmit, receive, and process data, such as endtones, image files, video files, audio files, ringbacks, ringtones, streaming audio, streaming video, etc. The communications circuitry 103 includes audio processing circuitry 119, controller (or processor) 121, location module 123 coupled to antenna 125, memory 127, transceiver 129 coupled to antenna 131, and wireless controller 133 (e.g., a short range transceiver) coupled to antenna 135.
Wireless controller 133 acts as a local wireless interface, such as an infrared transceiver and/or a radio frequency adaptor (e.g., Bluetooth adapter), for establishing communication with an accessory, hands-free adapter, another mobile communication device, computer, or other suitable device or network.
Processing communication sessions may include storing and retrieving data from memory 127, executing applications to allow user interaction with data, displaying video and/or image content associated with data, broadcasting audio sounds associated with data, and the like. Accordingly, memory 127 may represent a hierarchy of memory, which may include both random access memory (RAM) and read-only memory (ROM). Computer program instructions, such as "automatic physical configuration" application instructions, and corresponding data for operation, can be stored in non-volatile memory, such as erasable programmable read-only memory (EPROM), electrically erasable programmable read-only memory (EEPROM), and/or flash memory; however, they may also be stored in other types or forms of storage. Memory 127 may be implemented as one or more discrete devices, stacked devices, or integrated with controller/processor 121. Memory 127 may store program information, such as one or more user profiles, one or more user defined policies, one or more triggering events, one or more physical configurations, scheduling information, etc. In addition, system software, specific device applications, program instructions, program information, or parts thereof, may be temporarily loaded to memory 127, such as to a volatile storage device, e.g., RAM. Communication signals received by mobile communication device 100 may also be stored to memory 127, such as to a volatile storage device.
Controller/processor 121 controls operation of mobile communication device 100 according to programs and/or data stored to memory 127. Control functions may be implemented in a single controller (or processor) or via multiple controllers (or processors). Suitable controllers may include, for example, both general purpose and special purpose controllers, as well as digital signal processors, local oscillators, microprocessors, and the like. Controller/processor 121 may also be implemented as a field programmable gate array (FPGA) controller, reduced instruction set computer (RISC) processor, etc. Controller/processor 121 may interface with audio processing circuitry 119, which provides basic analog output signals to speaker 117 and receives analog audio inputs from microphone 115.
Controller/processor 121, in addition to orchestrating various operating system functions, can also enable execution of software applications. One such application can be triggered by event detector module 137. Event detector 137 is responsive to a signal from the user to initiate processing of data received from the sensors, as described more fully below. The processor implements this application to determine the spatial location of the user object and to identify a user input command associated therewith.
FIG. 2 is a perspective view of a housing 200 of an interactive device, such as the communication device exemplified in FIG. 1. A lower surface of the housing may be placed to rest on a planar support surface, such as a table, desk or counter. Mounted on the side surface of the housing is a linear array of six light sources 202 and a corresponding linear array of six photo-sensors 204. The sources may comprise, for example, light emitting diodes (LEDs). As shown, each light source is in relative vertical alignment on the side surface of the housing.
Illustrated in the drawing figure is a user's finger placed in proximity to the fourth vertically aligned pair of light source and photosensor. The position of the user's hand represents the selection by the user of a specific input command to be transmitted to the processor. As shown, the light generated by the source of this pair is reflected back to the photo-sensor of the pair. In lieu of using a finger for input selection, the user may use any object dimensioned to provide appropriate overlap of a single generated light beam. Data received from the plurality of photo-sensors are processed to determine which photo-sensor has the strongest response to light generated by the LEDs. As the sensed reflected light is unique to the fourth light source in this example, the linear position of the user object can be determined by the processor by evaluating the relative strengths of the received photo-sensor inputs. The processor can then access a database that relates position to predefined operation input selections.
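The static selection step described for FIG. 2 amounts to picking the photo-sensor with the strongest response and looking its slot up in a position-to-operation table; a minimal sketch, with an invented table and noise floor, follows.

# Minimal sketch of the FIG. 2 static mode: pick the photo-sensor with the
# strongest response and look its slot up in a position-to-operation table.
POSITION_TO_OPERATION = {
    0: "answer_call", 1: "end_call", 2: "volume_up",
    3: "volume_down", 4: "mute",     5: "launch_camera",
}

def select_operation(sensor_magnitudes, noise_floor=0.05):
    """Operation for the sensor seeing the most reflected light, or None if no
    sensor rises above the noise floor (no finger in front of the array)."""
    strongest = max(range(len(sensor_magnitudes)), key=lambda i: sensor_magnitudes[i])
    if sensor_magnitudes[strongest] < noise_floor:
        return None
    return POSITION_TO_OPERATION.get(strongest)

print(select_operation([0.01, 0.02, 0.02, 0.47, 0.03, 0.02]))   # -> volume_down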
As described, the user selection is implemented by sensing a static placement of the object in the vicinity of a photo-sensor. As the user's finger or object must be moved to the desired position to effect the command selection, provision may be made to prevent reading of the sensor outputs until the user object has attained the intended position. Such provision may be implemented by triggering reading of the sensor outputs in response to an additional criterion. Such criterion may comprise, for example, an audible input to the device microphone. Such input may be a voice command or an audible tapping of the support surface when the object has reached its intended position. Another such input may be a change in sensed capacitance when the user object is placed sufficiently close to the housing.
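One illustrative way to gate the readout on an audible trigger is a short-window energy threshold on the microphone samples; the window length and threshold below are assumptions, not values from the patent.

# Illustrative gating: the photo-sensor outputs are read only after a short,
# loud burst (the user tapping the support surface) is heard.
def tap_detected(mic_samples, window=64, threshold=0.5):
    """True if any window of microphone samples exceeds the energy threshold."""
    for start in range(0, max(len(mic_samples) - window, 0) + 1, window):
        chunk = mic_samples[start:start + window]
        energy = sum(s * s for s in chunk) / len(chunk)
        if energy > threshold:
            return True
    return False

def read_sensors_if_triggered(mic_samples, read_sensors):
    """Gate the photo-sensor readout on the audible trigger."""
    return read_sensors() if tap_detected(mic_samples) else None

quiet = [0.01] * 256
tapped = quiet[:128] + [0.9, -0.8, 0.95, -0.85] * 16 + quiet[:64]
print(read_sensors_if_triggered(quiet, lambda: [0.1] * 6))    # -> None
print(read_sensors_if_triggered(tapped, lambda: [0.1] * 6))   # -> [0.1, 0.1, 0.1, 0.1, 0.1, 0.1]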
The embodiment of FIG. 2 may also be operated in a dynamic mode. The user's finger or other object may be moved over time across the path of a plurality of the light beams. Such movement can be tracked to provide the processor with a corresponding time sequence of sources and, thus, object positions. Specific user interface commands can be mapped in memory to respective various combinations of position sequences. For example, a finger sweep across all light beams may be translated to a command for terminating a call.
FIG. 3 is a variation of the configuration shown in FIG. 2, wherein light from fewer sources reflects from the user object to fewer sensors. The light sources, which may comprise LEDs, are uniquely encoded. For example, the LEDs may be of different colors or may produce light signals of different pulse widths. Light sensed by the photo-sensors thus may be identified with respective sources. The processor can access a database that correlates light beam characteristics with the light sources. Specifically illustrated are two sources 202 located near respective ends of the housing. Sensor 204 is located near the center of the housing. The user's finger is positioned intermediate the two sources in the vertical (or lateral) direction, somewhat closer to the upper source. The light reflected from the object to the photo-sensor 204 comprises a beam generated by the upper source and a beam generated by the lower source. As the object (finger) is closer to the upper source, its reflected beam will be of greater amplitude than the beam reflected by the lower source. The lateral position of the object alongside the device can be determined by evaluating the relative strengths of the light received by sensor 204. The beam components are distinguishable by virtue of their unique characteristics.
FIG. 4 is illustrative of an operational mode in which the two-dimensional position of the object can be determined using a configuration of light sources and photo-sensors such as shown in FIG. 2. The user's finger is depicted in a first position that is relatively close to the housing and a second position that is further from the housing. In the first position, as the object is close in the longitudinal (horizontal) direction, only a few light source reflections will reach the third photo-sensor 204. Three such beams are illustrated, the reflected beam of the closest source being the strongest of the three. In the second position, as the object is further away, more light source reflections, including weaker reflected beams, will arrive at the third photo-sensor 204. Weak reflected beams from some of the sources may also reach the second and fourth photo-sensors. The processor can evaluate the relative strengths of all reflected beams while identifying received data with the respective photo-sensors. This evaluation can determine the object location in the lateral direction (parallel to the housing edge) as well as its distance from the phone edge, i.e., the object location in the longitudinal direction.
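Returning to the dynamic mode described at the start of this passage, the tracked time sequence of source positions can be reduced to a sweep and matched against a stored gesture table. The sketch below assumes one sensor frame per time step and an invented gesture-to-command mapping.

# Hypothetical sketch of the dynamic mode: each frame of sensor readings is
# reduced to the index of the strongest response, consecutive repeats are
# collapsed, and the resulting sweep is looked up in a gesture table.
GESTURES = {
    (0, 1, 2, 3, 4, 5): "terminate_call",   # full sweep across all six beams
    (5, 4, 3, 2, 1, 0): "answer_call",      # full sweep in the opposite direction
    (2, 3, 4): "volume_up",                 # partial sweep over adjacent beams
}

def crossed_slots(frames, noise_floor=0.05):
    """Time-ordered slot indices the object passed over, repeats collapsed."""
    slots = []
    for frame in frames:
        i = max(range(len(frame)), key=lambda k: frame[k])
        if frame[i] < noise_floor:
            continue                        # no object in front of the array
        if not slots or slots[-1] != i:
            slots.append(i)
    return tuple(slots)

def gesture_command(frames):
    return GESTURES.get(crossed_slots(frames))

sweep = [[0.4, 0, 0, 0, 0, 0], [0, 0.4, 0, 0, 0, 0], [0, 0, 0.4, 0, 0, 0],
         [0, 0, 0, 0.4, 0, 0], [0, 0, 0, 0, 0.4, 0], [0, 0, 0, 0, 0, 0.4]]
print(gesture_command(sweep))   # -> terminate_call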
With the aid of the arrangement shown in FIG. 5, the user can take advantage of the multitude of possible commands made available by two-dimensional position recognition, discussed with respect to FIG. 4. Although the sensors are not shown in FIG. 5, the configuration of sources and photo-sensors, relative to each other, may be the same as illustrated in FIG. 2. FIG. 5 is a top view of the housing at rest on a support surface. Template 210 is retractably coupled to the housing 200 near its bottom. Shown in a position extended from the housing, as indicated by the arrow, the template 210 can lie flat on the support surface to ease user implementation. The template can be retracted in the direction opposite the arrow to be encompassed by the housing when not in use.
The template 210 is imprinted with a plurality of indicia 212 on its upper surface. As illustrated, the indicia are exemplified by a two-dimensional spaced array in rows and columns. The indicia may be images of icons that are recognizable by the user. The two-dimensional position of each of the indicia can be correlated with a device function and serve as a guide for the appropriate positioning of the user's finger. The template may be coupled electrically to the processor so that touching one of the indicia will trigger the photo-sensing operation. For example, at each of the indicia a switch may be operable by depression of the user's finger to signal the processor. Alternatively, a capacitive sensor may be employed.
The template may be utilized in a plurality of extended positions, the indicia representing a different set of commands for each extended position. For example, when fully extended, each of the indicia may represent a text entry, similar to an English-language keyboard. When extended to a different position, the template may represent text entry for a different language or, instead, a plurality of different input commands. Correlation of indicia positions with device operations for different lengths of template extension may be stored in the memory of the device.
FIG. 6 is a flowchart exemplifying a typical mode of operation.
The device is powered on at start and the light sources are activated to generate respective light beams at step 601. Step 601 may be initiated in response to another command from the processor in dependence on a particular mode of operation of the device that calls for user input, or may be active at any time in the powered mode.
At step 603, a determination is made as to whether data representing sensed reflected light are to be input to the processor. For example, a triggering signal may be required to indicate that the user object has been placed at the desired location and that a selection is to be made, such as in the utilization of the two-dimensional template. (If, in another mode of operation, no triggering signal is required, step 603 may not be necessary.) If it is determined in step 603 that readout of the data produced by the light sensors is not to be activated, the flow chart reverts to step 601. If it is determined at step 603 that sensed reflected light is to be used to activate a user input selection, the sensed data are input to the processor at step 605. The processor, at step 607, evaluates the received data to determine the spatial position of the object. This evaluation may lead to a determination of a linear position for a one-dimensional operational mode or a determination of a two-dimensional position in other modes of operation. At step 609, the processor accesses an appropriate database in the memory to correlate the determined position of the object with the appropriate selected command. At step 611, the command is implemented by the processor. The flow chart process can end at this point or revert to step 601 for receipt of another user input.
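Pulling steps 601 through 611 together, the flow of FIG. 6 can be condensed into a small control loop; the helper callables below stand in for device-specific code and are assumptions, not patent text.

# The FIG. 6 flow condensed into a loop. The helpers handle trigger detection,
# sensor readout, position estimation, command lookup, and execution.
def touchless_input_loop(read_trigger, read_sensor_data, estimate_position,
                         lookup_command, execute, cycles=1):
    """Run the sense/decide/execute cycle `cycles` times (steps 601-611)."""
    for _ in range(cycles):
        if not read_trigger():              # step 603: wait for the trigger signal
            continue
        data = read_sensor_data()           # step 605: read the photo-sensor outputs
        position = estimate_position(data)  # step 607: 1-D or 2-D position estimate
        command = lookup_command(position)  # step 609: position-to-command database
        if command is not None:
            execute(command)                # step 611: perform the selected command

# One-shot demonstration with stubbed helpers.
touchless_input_loop(
    read_trigger=lambda: True,
    read_sensor_data=lambda: [0.0, 0.5, 0.1],
    estimate_position=lambda data: max(range(len(data)), key=lambda i: data[i]),
    lookup_command=lambda position: {1: "end_call"}.get(position),
    execute=lambda cmd: print("executing", cmd),
)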
In this disclosure there are shown and described only preferred embodiments of the invention and but a few examples of its versatility. It is to be understood that the invention is capable of use in various other combinations and environments and is capable of changes or modifications within the scope of the inventive concept as expressed herein. The use of reflected light as a user input, as described herein, may be used as an alternative to traditional user input implementations or in addition to user interfaces maintained by the user devices.

Claims

WHAT IS CLAIMED IS:
1. A method comprising: generating a plurality of light beams from sources spaced from each other in a defined relationship; imposing a user object within an area of the light generated in the generating step; sensing light reflected from the user object; correlating the light sensed in the sensing step with a predefined operation of a user device.
2. A method as recited in claim 1, further comprising performing the predefined operation in response to the step of correlating.
3. A method as recited in claim 2, wherein: the step of generating comprises defining a unique characteristic for each of the light sources; the step of sensing comprises identifying components of the reflected light having characteristics that correspond to respective light sources; and the step of correlating comprises establishing relative magnitudes of the components of the reflected light.
4. A method as recited in claim 3, wherein the step of correlating further comprises: determining a two-dimensional position of the object in accordance with the relative magnitudes of the reflected light components; and identifying the predefined operation that corresponds to the position of the object.
5. A method as recited in claim 4, further comprising: formulating a template containing a plurality of two-dimensional position indicia, each of the indicia corresponding to a respective user device operation; and wherein the imposing step comprises employing the template by the user to position the object.
6. A method as recited in claim 5, wherein the object comprises the user's finger.
7. A method as recited in claim 4, further comprising displaying an image associated with the predefined operation that corresponds to the position of the object.
8. A method as recited in claim 2, wherein the step of sensing comprises: accessing a plurality of light sensors spaced in correspondence with respective ones of the light sources; identifying the light sensor that detects the greatest magnitude of reflected light with its corresponding light source; and the correlating step comprises determining a predefined operation that corresponds to the identified light source.
9. A method as recited in claim 8, wherein: the step of imposing comprises sweeping the object across a plurality of the light sources; the step of identifying is applied to each of the plurality of light sources; and the step of correlating comprises determining a predetermined operation that corresponds to the plurality of light sources identified.
10. A method as recited in claim 1, further comprising: detecting a user input; and wherein the step of sensing is triggered in response to the detection of the user input.
11. A method as recited in claim 10, wherein the detecting step comprises receiving an audible signal.
12. A method as recited in claim 10, wherein the detecting step comprises sensing a capacitive field.
13. Apparatus comprising: an interactive user device embodied in a housing, the interactive device comprising a processor, a display, and a memory; a plurality of light sources spaced from each other at an outer surface of the housing; and at least one light sensor positioned at the surface of the housing; wherein the at least one light sensor is configured to input to the processor data that correspond to sensed light generated by any of the light sources and reflected by an imposed user object, and the processor is configured to correlate the input data with a predefined operation of the user device.
14. Apparatus as recited in claim 13, wherein the plurality of light sources is in a linear configuration, and a plurality of light sensors, equal in number to the number of light sources, are configured in a linear direction parallel to the light sources, each light sensor in proximity to a respective light source.
15. Apparatus as recited in claim 13, wherein each of the light sources is a light emitting diode of specific color.
16. Apparatus as recited in claim 13, wherein each of the light sources emanates light at a uniquely identifiable pulse rate.
17. Apparatus as recited in claim 13, wherein the housing further comprises a retractable template extendable in a lateral direction from the surface to an open position, the template having a planar surface perpendicular to the housing surface in the open position, and wherein the template surface contains a plurality of two-dimensional position indicia, each of the indicia corresponding to a respective user device operation.
18. Apparatus as recited in claim 17, wherein the template indicia correspond to a first set of device operations when the template is extended to a first position and correspond to a second set of device operations when the template is extended to a second position.
19. Apparatus as recited in claim 13, wherein each light source comprises an outer film through which a unique image can be projected.
20. Apparatus as recited in claim 13, wherein the interactive user device comprises a mobile phone.
PCT/US2009/042840 2008-07-15 2009-05-05 Method and apparatus for touchless input to an interactive user device WO2010008657A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
EP09789638A EP2304532A1 (en) 2008-07-15 2009-05-05 Method and apparatus for touchless input to an interactive user device

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US12/173,114 2008-07-15
US12/173,114 US20100013763A1 (en) 2008-07-15 2008-07-15 Method and apparatus for touchless input to an interactive user device

Publications (1)

Publication Number Publication Date
WO2010008657A1 true WO2010008657A1 (en) 2010-01-21

Family

ID=40933709

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/US2009/042840 WO2010008657A1 (en) 2008-07-15 2009-05-05 Method and apparatus for touchless input to an interactive user device

Country Status (3)

Country Link
US (1) US20100013763A1 (en)
EP (1) EP2304532A1 (en)
WO (1) WO2010008657A1 (en)

Families Citing this family (21)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9213443B2 (en) * 2009-02-15 2015-12-15 Neonode Inc. Optical touch screen systems using reflected light
US8775023B2 (en) 2009-02-15 2014-07-08 Neonode Inc. Light-based touch controls on a steering wheel and dashboard
US8643628B1 (en) 2012-10-14 2014-02-04 Neonode Inc. Light-based proximity detection system and user interface
US8917239B2 (en) * 2012-10-14 2014-12-23 Neonode Inc. Removable protective cover with embedded proximity sensors
JP5493702B2 (en) * 2009-10-26 2014-05-14 セイコーエプソン株式会社 Projection display with position detection function
JP5326989B2 (en) * 2009-10-26 2013-10-30 セイコーエプソン株式会社 Optical position detection device and display device with position detection function
GB201010953D0 (en) * 2010-06-29 2010-08-11 Elliptic Laboratories As User control of electronic devices
JP2012038164A (en) * 2010-08-09 2012-02-23 Sony Corp Information processing unit
US20130297251A1 (en) * 2012-05-04 2013-11-07 Abl Ip Holding, Llc System and Method For Determining High Resolution Positional Data From Limited Number of Analog Inputs
US10282034B2 (en) 2012-10-14 2019-05-07 Neonode Inc. Touch sensitive curved and flexible displays
US9921661B2 (en) 2012-10-14 2018-03-20 Neonode Inc. Optical proximity sensor and associated user interface
US9164625B2 (en) 2012-10-14 2015-10-20 Neonode Inc. Proximity sensor for determining two-dimensional coordinates of a proximal object
US9741184B2 (en) 2012-10-14 2017-08-22 Neonode Inc. Door handle with optical proximity sensors
US10585530B2 (en) 2014-09-23 2020-03-10 Neonode Inc. Optical proximity sensor
US10324565B2 (en) 2013-05-30 2019-06-18 Neonode Inc. Optical proximity sensor
KR20140076057A (en) * 2012-12-12 2014-06-20 한국전자통신연구원 Apparatus and method for motion input based on multi-sensor
TWI454968B (en) 2012-12-24 2014-10-01 Ind Tech Res Inst Three-dimensional interactive device and operation method thereof
JP6561400B2 (en) * 2015-02-10 2019-08-21 任天堂株式会社 Information processing apparatus, information processing program, information processing system, and information processing method
JP6534011B2 (en) 2015-02-10 2019-06-26 任天堂株式会社 INFORMATION PROCESSING APPARATUS, INFORMATION PROCESSING PROGRAM, INFORMATION PROCESSING SYSTEM, AND INFORMATION PROCESSING METHOD
US10746898B2 (en) * 2018-08-01 2020-08-18 Infineon Technologies Ag Method and device for object recognition and analysis
CN115039060A (en) 2019-12-31 2022-09-09 内奥诺德公司 Non-contact touch input system

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7512427B2 (en) * 2005-09-02 2009-03-31 Nokia Corporation Multi-function electronic device with nested sliding panels

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP1039365A2 (en) * 1999-03-26 2000-09-27 Nokia Mobile Phones Ltd. An input arrangement for manual entry of data and a mobile phone
US6906701B1 (en) * 2001-07-30 2005-06-14 Palmone, Inc. Illuminatable buttons and method for indicating information using illuminatable buttons
WO2004102301A2 (en) * 2003-05-15 2004-11-25 Qinetiq Limited Non contact human-computer interface
US20060190836A1 (en) * 2005-02-23 2006-08-24 Wei Ling Su Method and apparatus for data entry input
EP1895392A1 (en) * 2006-08-30 2008-03-05 Siemens Aktiengesellschaft Device for operating the functions of a device

Also Published As

Publication number Publication date
EP2304532A1 (en) 2011-04-06
US20100013763A1 (en) 2010-01-21

Similar Documents

Publication Publication Date Title
US20100013763A1 (en) Method and apparatus for touchless input to an interactive user device
CN105446646B (en) Content input method, device and touch control device based on dummy keyboard
US8825113B2 (en) Portable terminal and driving method of the same
KR101231106B1 (en) apparatus and method of providing user interface for flexible mobile device
US20110319136A1 (en) Method of a Wireless Communication Device for Managing Status Components for Global Call Control
KR101081432B1 (en) Touch-controlled cursor type handheld electronic device
EP2332032B1 (en) Multidimensional navigation for touch-sensitive display
US20100079379A1 (en) Portable communication device having an electroluminescent driven haptic keypad
US20100177037A1 (en) Apparatus and method for motion detection in a portable terminal
US20060061557A1 (en) Method for using a pointing device
US7423628B2 (en) Track wheel with reduced space requirements
JP2012509524A (en) Portable communication device having a touch sensitive input device with non-linear active area
US9819915B2 (en) Smart laser phone
US7961176B2 (en) Input apparatus and method using optical sensing, and portable terminal using the same
US8195252B2 (en) Input device for mobile terminal using scroll key
CN106843672A (en) A kind of terminal screen locking operation device and method
US20110316805A1 (en) Electronic device
CN106775305A (en) A kind of terminal quick calling apparatus and method
KR100878715B1 (en) A Method of Clicking the Key on Mobile Phones Equipped with Optical Pointing Devices
KR101483302B1 (en) Method of controlling operation of mobile telephone by using touch-screen
CA2498322C (en) Track wheel with reduced space requirements
KR20080079446A (en) Pointing method of portable apparatus
KR101344302B1 (en) Method For Scrolling Using Touch Screen And Portable Terminal Having Scroll Function Using Touch Screen
KR20110119464A (en) Touch screen device and methods of operating terminal using the same
JP2012008988A (en) Electronic equipment

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 09789638

Country of ref document: EP

Kind code of ref document: A1

DPE1 Request for preliminary examination filed after expiration of 19th month from priority date (pct application filed from 20040101)
WWE Wipo information: entry into national phase

Ref document number: 2009789638

Country of ref document: EP

NENP Non-entry into the national phase

Ref country code: DE