WO2012119308A1 - An apparatus and method for remote user input - Google Patents

An apparatus and method for remote user input

Info

Publication number
WO2012119308A1
Authority
WO
WIPO (PCT)
Prior art keywords
pointer
selection
remote surface
image
positioning
Application number
PCT/CN2011/071631
Other languages
French (fr)
Inventor
Kongqiao Wang
Anping Zhao
Liangfeng Xu
Jundong XUE
Chunli JING
Original Assignee
Nokia Corporation
Application filed by Nokia Corporation filed Critical Nokia Corporation
Priority to PCT/CN2011/071631
Publication of WO2012119308A1

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01: Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03: Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/041: Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
    • G06F3/042: Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means by opto-electronic means
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01: Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03: Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/041: Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
    • G06F3/043: Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means using propagating acoustic waves
    • G06F3/0436: Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means using propagating acoustic waves in which generating transducers and detecting transducers are attached to a single acoustic waves transmission substrate

Definitions

  • Embodiments of the present invention relate to an apparatus and method for remote user input.
  • Touch sensitive displays are now in common use in mobile electronic devices such as mobile cellular telephones, tablet computers etc.
  • a touch sensitive display displays an image.
  • a user is able to input different user commands by touching the image on the display at different portions of the image or by using different touch styles such as tap or trace.
  • an apparatus comprising: a projector configured to project an image onto a remote surface; selection circuitry configured to facilitate estimation of a value dependent upon displacement of the pointer from the apparatus for comparison with a reference value and positioning circuitry configured to position the pointer relative to the image projected onto the remote surface.
  • a method comprising: projecting an image onto a remote surface; positioning the pointer relative to the image projected onto the remote surface; detecting a selection event by analyzing a value dependent upon displacement of the pointer from the source of the projecting; using the position of the pointer relative to the image projected onto the remote surface to locate the selection event within the image; and determining a user input command, if any, based upon the location of the selection event within the image.
  • an apparatus comprising: a projector configured to project an image onto a remote surface; selection circuitry configured to facilitate estimation of a value dependent upon distance of the pointer from the apparatus for comparison with a reference value dependent upon a distance of the remote surface from the apparatus and positioning circuitry configured to position the pointer relative to the image projected onto the remote surface
  • a method comprising: projecting an image onto a remote surface; positioning the pointer relative to the image projected onto the remote surface; detecting a selection event by analyzing kinematics of the pointer; using the position of the pointer relative to the image projected onto the remote surface to locate the selection event within the image; and determining a user input command, if any, based upon the location of the selection event within the image.
  • Embodiments enable a user to input user commands at a projected image remote from the apparatus, for example, by touching the projected image.
  • Fig 1 illustrates an example of a remote user input apparatus configured to enable detection of a user input command at a projected image remote from the apparatus
  • Fig 2 illustrates an example of the remote user input apparatus
  • Fig 3 illustrates an example of the remote user input apparatus
  • Fig 4A illustrates a side view of an apparatus projecting an image onto a remote surface
  • Fig 4B illustrates a top view of an apparatus projecting an image onto a remote surface
  • Fig 5 illustrates a front view of an image projected onto a remote surface
  • Fig 6A illustrates a front view of pointer in front of the image projected onto the remote surface
  • Fig 6B illustrates a co-ordinate system for locating the pointer
  • Fig 7 illustrates a side view of a pointer in front of the image projected onto the remote surface
  • Fig 8 illustrates a method for determining a user input command made at a projected image remote from the apparatus
  • Fig 9 illustrates another method for determining a user input command made at a projected image remote from the apparatus
  • Figs 10A and 10B illustrate different uses of the apparatus.
  • the Figures illustrates an example of an apparatus 2 comprising: a projector 4 configured to project an image 34 onto a remote surface 30; selection circuitry 8 configured to facilitate estimation of a value dependent upon a displacement of a pointer 50 from the apparatus 2 for comparison with a reference value; and positioning circuitry 6 configured to position the pointer 50 relative to the image 34 projected onto the remote surface 30.
  • the displacement may, for example, be defined by a separation distance (z) of a pointer 50 from the apparatus 2 and/or by a transverse displacement in a plane between the apparatus 2 and the remote surface 30.
  • Fig 1 illustrates an example of a remote user input apparatus 2.
  • the remote user input apparatus 2 is configured to enable detection of a user input command at a location remote from the apparatus 2.
  • the user input apparatus 2 projects an image 34 onto a remote surface 30.
  • a pointer 50 can be touched against the projected image 34 on the remote surface 30 to generate a user input command.
  • Different user input commands may be generated by touching different portions of the projected image 34 on the remote surface 30 to generate different user commands.
  • different user input commands may be generated by touching the projected image 34 on the remote surface 30 in different ways to generate different user commands.
  • the touch may be a tap or a trace or a stationary pause.
  • the pointer 50 is something that has or identifies an end-point of a physical thing used by a user to indicate user-selection.
  • the end-point may be an end-point 50 of a user's limb 52 e.g. a hand-tip or fingertip.
  • the end-point may be an end-point of a physical device carried in a hand of a user (not illustrated).
  • the pointer 50 may have a position defined by a co-ordinate system 54 that can be defined by a distance z between the apparatus 2 and the pointer 50 and by a location (x,y) within a projected plane of the image 34 at distance z.
  • the apparatus 2 comprises: a projector 4 configured to project an image 34 onto a remote surface 30; selection circuitry 8 configured to facilitate estimation of a value dependent upon a distance (z) of a pointer 50 from the apparatus 2 for comparison with a reference value dependent upon a distance D of the remote surface 30 from the apparatus 2; and positioning circuitry 6 configured to position the pointer 50 relative to the image 34 projected onto the remote surface 30.
  • the projector 4 may, for example, comprise circuitry that generates light that is projected primarily in a specific first direction 26 as a beam which has a cross-section 35 that may expand with distance from the apparatus 2.
  • the light is encoded such that the cross-section of the beam forms a desired image 34 when it reaches a remote surface 30.
  • the light may be encoded when the light is generated.
  • a two-dimensional array of light emitting diodes may be used to generate a two-dimensional array of pixels in the projected image 34.
  • the light may be encoded after the light is generated using, for example, a two-dimensional array of filter elements which may, for example, comprise a two-dimensional array of nematic liquid crystal cells in combination with cross-polarizers.
  • the positioning circuitry 6 may, for example, comprise circuitry that is configured to capture, over time, information, such as for example reflected light, that positions the pointer 50.
  • the positioning will be relative to some defined reference system 54 (x, y) which has a known relationship to the projected image (see Fig 6B).
  • the area of the image 34 is smaller than and lies wholly within the area 32 over which the pointer 50 can be positioned by positioning circuitry 6.
  • the selection circuitry 8 may, for example, comprise circuitry configured to facilitate estimation of a value ( F(x, y, z)) dependent upon displacement (x and/or y and/or z) of the pointer 50 from the apparatus 2 for comparison with a reference value.
  • the difference d between the distance (D) of the remote surface 30 from the apparatus 2 and the distance (z) of the pointer 50 from the apparatus 2 indicates the spacing of the pointer 50 from the remote surface 30.
  • the difference d may therefore be monitored and a user selection event determined when the difference d becomes zero.
  • the selection circuitry 8 may, for example, comprise circuitry configured to facilitate estimation of a value dependent upon and indicative of distance (z) of the pointer 50 from the apparatus 2 for comparison with a reference value dependent upon and indicative of a distance (D) of the remote surface 30 from the apparatus 2.
  • the first or second time derivative of the distance (z) of the pointer 50 from the apparatus 2 can be used to indicate a pause (a constant spacing of the pointer 50 from the remote surface 30). Time derivatives of the distance (z) may therefore be monitored and a user selection event determined when one or more of the time derivatives becomes a predetermined value for a predetermined time.
  • the selection circuitry 8 may, for example, comprise circuitry configured to facilitate estimation of a value (derivative of z) that is dependent upon but not indicative of distance (z) of the pointer 50 from the apparatus 2 for comparison with a reference value.
  • the first or second time derivative of a transverse displacement (x and/or y) of the pointer 50 from the apparatus 2 can be used to indicate a pause (a constant transverse displacement of the pointer 50 from the apparatus 2).
  • Time derivatives of the transverse displacement may therefore be monitored and a user selection event determined when one or more of the time derivatives becomes a predetermined value (e.g. zero) for a predetermined time.
  • the selection circuitry 8 may, for example, comprise circuitry configured to facilitate estimation of a value (derivative of x and/or y) that is dependent upon but not indicative of transverse displacement of the pointer 50 from the apparatus 2 for comparison with a reference value.
  • a light or sound wave may be transmitted by the selection circuitry 8 for reflection by the pointer 50.
  • the light or sound wave reflected by the pointer 50 may be detected by a sensor of the selection circuitry 8.
  • the time of flight of the wave from transmission to reception may be determined as a value, measured in time, indicative of the distance (z) of the pointer 50 from the apparatus 2.
  • a phase of the wave at reception may be compared to a phase at transmission to determine a value, measured in wavelengths, indicative of the distance (z) of the pointer 50 from the apparatus 2.
  • the selection circuitry may comprise at least one of multiple stereoscopic cameras.
  • the additional camera may be provided by the selection circuitry 8 or by the positioning circuitry 6.
  • the pointer 50 may, for example, be positioned relative to the image 34 projected onto the remote surface 30 according to the perspective of one of the stereoscopic cameras.
  • the pointer 50 may then, for example, be positioned relative to the image 34 projected onto the remote surface 30 according to the perspective of another of the stereoscopic cameras.
  • the different positions of the pointer from the different cameras as a result of parallax can be used, with a value of the distance separating the two stereoscopic cameras, to estimate the distance of the pointer 50 from the apparatus 2.
  • the pointer 50 may, for example, be positioned by emitting structural infra red light.
  • the structural light comprises a pattern of light which when reflected has a pattern that depends upon a distance of the reflecting object.
  • Fig 3 illustrates an example of the apparatus 2.
  • the apparatus 2 comprises a housing 22, which in this example has faces arranged in a cuboid geometry. It has two parallel opposing end faces, two parallel opposing side faces and a front face that is parallel to and opposes a back face. The end faces are orthogonal to the front face, back face and side faces.
  • the apparatus 2 may be placed back face downwards on a flat surface when it is used as a remote user input apparatus.
  • One of the end faces 20 comprises an output aperture 23 for the projector 4.
  • the output aperture is oriented to project in a first direction 26.
  • the end face also comprises a positioning sensor of the positioning circuitry 6.
  • the positioning sensor 6 is oriented to sense in the first direction 26.
  • the end face also comprises a selection sensor of the selection circuitry 8.
  • the selection sensor 8 is oriented to sense in the first direction.
  • the selection circuitry 8 also comprises one or more emitters configured to transmit infra-red light in the first direction 26.
  • the output aperture 23, the positioning sensor 6 and the selection sensor 8 are positioned in a rectilinear arrangement so that they lie along a straight line that is parallel to the edge where the end face 20 meets the back face.
  • the rectilinear arrangement is horizontal when it is used as a remote user input apparatus.
  • Fig 2 illustrates an example of the remote user input apparatus 2.
  • the apparatus comprises a projector 4, positioning circuitry 6 and selection circuitry 8.
  • the positioning circuitry 6 comprises a sensor such as a camera sensor.
  • the camera viewing angle is greater (wider) than a projection angle of the projector 4.
  • the camera sensor may, as an example, have a focal length of between 20cm and 200cm.
  • the selection circuitry 8 comprises a sensor 16 such as an infrared camera and one or more emitters 18 such as infra red light emitting diodes.
  • the apparatus 2 also comprises a processor 10 and a memory 12.
  • the processor 10 is configured to read from and write to the memory 12.
  • the processor 10 may also comprise an output interface via which data and/or commands are output by the processor 10 and an input interface via which data and/or commands are input to the processor 10.
  • the memory 12 stores a computer program 14 comprising computer program instructions that control the operation of the apparatus 2 when loaded into the processor 10.
  • the selection circuitry 8 comprises circuitry for determining information indicative of a timing off-set between a wave reflected from the pointer 50 and a reference.
  • the timing off-set is a value that varies with a distance z of the pointer 50 from the apparatus 2.
  • the one or more emitters 18 are configured to transmit infra red (IR) light waves that have a characteristic that varies in time.
  • the IR light may be transmitted as pulses having an on/off duty cycle in which case the intensity is a characteristic that changes between zero/low and high repeatedly.
  • the IR light is reflected from the remote pointer 50 and the reflected IR light is detected by the IR sensor 16.
  • the IR sensor 16 is configured to detect the characteristic of the received IR light. For example, the sensor 16 may detect the intensity of the received IR light.
  • the time of flight between sending at time t1 an IR light pulse and receiving at time t2 the reflected IR light pulse may be used to determine the distance z of the pointer 50.
  • z = ½ (t2 - t1) * c, where c is the speed of light
  • the time of flight between sending at time t1 an IR light pulse and receiving at time t2 the reflected IR light pulse may be used as a proxy to represent the distance z of the pointer 50 as it is related to it by a constant.
  • the distance D between the apparatus 2 and the remote surface 30 may be varied.
  • the selection circuitry 8 may also be used to measure the time of flight of IR pulses reflected off the remote surface 30.
  • the time of flight may be used to determine the distance D to the remote surface 30 or it may be used as a proxy to represent the distance D to the remote surface 30. This may occur as a calibration before the determination of the distance z (or its proxy).
  • the processor 10 is configured to track a separation distance d defined by a difference between the distance z (or its proxy) and the distance D (or its proxy).
  • the image 34 comprises a plurality of active portions 40A, 40B, 40C that are distinct and separated by an inactive portion 40D. If the user touches any of the active portions 40A, 40B, 40C with the pointer 50 then a respective different user input command is executed. If the user touches any of the inactive portion 40D with the pointer 50 then no user input command is executed.
  • Fig 8 illustrates a method for determining a user input command made by a user at a remote surface 30.
  • the method is suitable for performance by the processor 10 of the apparatus 2.
  • a selection event is detected by analyzing a change in the separation distance d between the pointer and the remote surface 30.
  • the selection event may be detected by analyzing kinematics of the pointer, including a change in separation distance between the pointer and the remote surface 30.
  • the location in three dimensions of the pointer 50 may be defined by giving a position (x,y) in two dimensions and a distance z from the apparatus 2.
  • a further or alternative constraint may be that the time derivative of z becomes zero instantaneously or for an extended period (a pause).
  • where a tracing input occurs via the pointer, the time derivatives of x and y will typically be non-zero.
  • the position (x,y) of the pointer relative to the image 34 projected onto the remote surface 30 is used to locate the selection event within the image 34.
  • a user input command, if any, is determined based upon the location of the selection event within the image 34.
  • Fig 9 illustrates another method for determining a user input command made by a user at a remote surface 30. The method is suitable for performance by the processor 10 of the apparatus 2. At block 71 , an optical calibration may occur to determine the distance D as described above.
  • camera data received from a camera sensor of the positioning circuitry 6 is processed to identify the pointer within a captured image.
  • an image portion corresponding to the pointer 50 is isolated within the captured image using computer vision techniques.
  • the image portion corresponding to the pointer 50 moves with the pointer and the pointer can be positioned by positioning (x,y) the image portion within the image.
  • a value dependent upon a separation distance z of the pointer from the apparatus 2 is estimated. This may, for example, be achieved by transmitting a pulse of IR light from an emitter 18 of the selection circuitry 8 and detecting the reflected pulse at the sensor 16 of the selection circuitry 8.
  • the time of flight between the transmission of the pulse and the detection of the illumination of the pointer by the pulse can be used to determine the distance z or as a proxy for the distance z.
  • the illumination of the pointer by the pulse can be detected by detecting a change in intensity at (only) the image portion corresponding to the pointer 50.
  • a selection event is detected by analyzing kinematics of the pointer.
  • the values of x, y, z and their first and second derivatives may be analyzed.
  • a selection event may be detected whenever there is a discontinuity in a time derivative of z indicating that the pointer has been moved towards the remote surface 30 and then away from the remote surface 30.
  • a selection event may be detected whenever (D-z) becomes zero i.e. changes in time to reach zero indicating that the pointer has been moved towards the remote surface 30.
  • the pointer 50 is positioned relative to the image projected onto the remote surface and the position (x,y) of the pointer relative to the image projected onto the remote surface 30 is used to locate the selection event within the image.
  • a user input command, if any, is determined based upon the location of the selection event within the image.
  • the computer program instructions 14 provide the logic and routines that enable the apparatus to perform the methods illustrated in Figs 8 and 9.
  • the processor 10 by reading the memory 12 is able to load and execute the computer program 14.
  • the apparatus 2 therefore comprises: at least one processor 10; and at least one memory 12 including computer program code 14, the at least one memory 12 and the computer program code 14 configured to, with the at least one processor 10, cause the apparatus at least to perform the method of Fig 8 or 9.
  • the computer program may arrive at the apparatus 2 via any suitable delivery mechanism.
  • the delivery mechanism may be, for example, a computer- readable storage medium, a computer program product, a memory device, a record medium such as a compact disc read-only memory (CD-ROM) or digital versatile disc (DVD), an article of manufacture that tangibly embodies the computer program 14.
  • the delivery mechanism may be a signal configured to reliably transfer the computer program 14.
  • although the memory 12 is illustrated as a single component, it may be implemented as one or more separate components, some or all of which may be integrated/removable and/or may provide permanent/semi-permanent/dynamic/cached storage. References to 'computer-readable storage medium', 'computer program product', 'tangibly embodied computer program' etc.
  • references to computer program, instructions, code etc. should be understood to encompass software for a programmable processor or firmware such as, for example, the programmable content of a hardware device whether instructions for a processor, or configuration settings for a fixed-function device, gate array or programmable logic device etc.
  • 'circuitry' refers to all of the following:
  • circuitry refers to combinations of hardware circuits and software (and/or firmware), such as (as applicable): (i) to a combination of processor(s) or (ii) to portions of processor(s)/software (including digital signal processor(s)), software, and memory(ies) that work together to cause an apparatus to perform various functions; and (c) to circuits, such as a microprocessor(s) or a portion of a microprocessor(s), that require software or firmware for operation, even if the software or firmware is not physically present.
  • This definition of 'circuitry' applies to all uses of this term in this application, including in any claims.
  • circuitry would also cover an implementation of merely a processor (or multiple processors) or portion of a processor and its (or their) accompanying software and/or firmware.
  • circuitry would also cover, for example and if applicable to the particular claim element, a baseband integrated circuit or applications processor integrated circuit.
  • 'module' refers to a unit or apparatus that excludes certain parts/components that would be added by an end manufacturer or a user.
  • the apparatus 2, for example as illustrated in Fig 1, may be a module and it may comprise interfaces to the projector 4, positioning circuitry 6 and selection circuitry 8 that enable incorporation within an electronic device.
  • the processor 10 and memory 12 illustrated in Fig 2 may be part of an electronic device such as a mobile cellular telephone, computer, personal media player etc.
  • Fig 10A illustrates the apparatus 2 in use. In this example, a normal to the remote surface 30 is not parallel to the first direction 26.
  • Fig 10B illustrates the apparatus 2 in use. In this example, a normal to the remote surface 30 is parallel to the first direction 26.
  • the blocks illustrated in the Figs 8 and 9 may represent steps in a method and/or sections of code in the computer program 14.
  • the illustration of a particular order to the blocks does not necessarily imply that there is a required or preferred order for the blocks and the order and arrangement of the block may be varied. Furthermore, it may be possible for some blocks to be omitted.

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Human Computer Interaction (AREA)
  • General Physics & Mathematics (AREA)
  • Acoustics & Sound (AREA)
  • Position Input By Displaying (AREA)

Abstract

An apparatus comprising: a projector configured to project an image onto a remote surface; selection circuitry configured to facilitate estimation of a value dependent upon displacement of the pointer from the apparatus for comparison with a reference value; and positioning circuitry configured to position the pointer relative to the image projected onto the remote surface.

Description

AN APPARATUS AND METHOD FOR REMOTE USER INPUT
TECHNOLOGICAL FIELD
Embodiments of the present invention relate to an apparatus and method for remote user input.
BACKGROUND
Touch sensitive displays are now in common use in mobile electronic devices such as mobile cellular telephones, tablet computers etc. A touch sensitive display displays an image. A user is able to input different user commands by touching the image on the display at different portions of the image or by using different touch styles such as tap or trace.
BRIEF SUMMARY
According to various, but not necessarily all, embodiments of the invention there is provided an apparatus comprising: a projector configured to project an image onto a remote surface; selection circuitry configured to facilitate estimation of a value dependent upon displacement of the pointer from the apparatus for comparison with a reference value and positioning circuitry configured to position the pointer relative to the image projected onto the remote surface.
According to various, but not necessarily all, embodiments of the invention there is provided a method comprising: projecting an image onto a remote surface; positioning the pointer relative to the image projected onto the remote surface; detecting a selection event by analyzing a value dependent upon displacement of the pointer from the source of the projecting; using the position of the pointer relative to the image projected onto the remote surface to locate the selection event within the image; and determining a user input command, if any, based upon the location of the selection event within the image.
According to various, but not necessarily all, embodiments of the invention there is provided an apparatus comprising: a projector configured to project an image onto a remote surface; selection circuitry configured to facilitate estimation of a value dependent upon distance of the pointer from the apparatus for comparison with a reference value dependent upon a distance of the remote surface from the apparatus and positioning circuitry configured to position the pointer relative to the image projected onto the remote surface.
According to various, but not necessarily all, embodiments of the invention there is provided a method comprising: projecting an image onto a remote surface; positioning the pointer relative to the image projected onto the remote surface; detecting a selection event by analyzing kinematics of the pointer; using the position of the pointer relative to the image projected onto the remote surface to locate the selection event within the image; and determining a user input command, if any, based upon the location of the selection event within the image.
Embodiments enable a user to input user commands at a projected image remote from the apparatus, for example, by touching the projected image.
BRIEF DESCRIPTION
For a better understanding of various examples of embodiments of the present invention reference will now be made by way of example only to the accompanying drawings in which:
Fig 1 illustrates an example of a remote user input apparatus configured to enable detection of a user input command at a projected image remote from the apparatus; Fig 2 illustrates an example of the remote user input apparatus;
Fig 3 illustrates an example of the remote user input apparatus;
Fig 4A illustrates a side view of an apparatus projecting an image onto a remote surface;
Fig 4B illustrates a top view of an apparatus projecting an image onto a remote surface;
Fig 5 illustrates a front view of an image projected onto a remote surface; Fig 6A illustrates a front view of pointer in front of the image projected onto the remote surface;
Fig 6B illustrates a co-ordinate system for locating the pointer;
Fig 7 illustrates a side view of a pointer in front of the image projected onto the remote surface;
Fig 8 illustrates a method for determining a user input command made at a projected image remote from the apparatus;
Fig 9 illustrates another method for determining a user input command made at a projected image remote from the apparatus;
Figs 10A and 10B illustrate different uses of the apparatus.
DETAILED DESCRIPTION
The Figures illustrate an example of an apparatus 2 comprising: a projector 4 configured to project an image 34 onto a remote surface 30; selection circuitry 8 configured to facilitate estimation of a value dependent upon a displacement of a pointer 50 from the apparatus 2 for comparison with a reference value; and positioning circuitry 6 configured to position the pointer 50 relative to the image 34 projected onto the remote surface 30.
The displacement may, for example, be defined by a separation distance (z) of a pointer 50 from the apparatus 2 and/or by a transverse displacement in a plane between the apparatus 2 and the remote surface 30. Fig 1 illustrates an example of a remote user input apparatus 2. The remote user input apparatus 2 is configured to enable detection of a user input command at a location remote from the apparatus 2.
Referring to Fig 5, 6A and 6B, the user input apparatus 2 projects an image 34 onto a remote surface 30. A pointer 50 can be touched against the projected image 34 on the remote surface 30 to generate a user input command. Different user input commands may be generated by touching different portions of the projected image 34 on the remote surface 30 to generate different user commands. Alternatively different user input commands may be generated by touching the projected image 34 on the remote surface 30 in different ways to generate different user commands. For example, the touch may be a tap or a trace or a stationary pause.
The pointer 50 is something that has or identifies an end-point of a physical thing used by a user to indicate user-selection. As illustrated in Fig 6A, the end-point may be an end-point 50 of a user's limb 52 e.g. a hand-tip or fingertip. Alternatively, the end-point may be an end-point of a physical device carried in a hand of a user (not illustrated). As illustrated in Fig 6B, the pointer 50 may have a position defined by a co-ordinate system 54 that can be defined by a distance z between the apparatus 2 and the pointer 50 and by a location (x,y) within a projected plane of the image 34 at distance z.
The apparatus 2 comprises: a projector 4 configured to project an image 34 onto a remote surface 30; selection circuitry 8 configured to facilitate estimation of a value dependent upon a distance (z) of a pointer 50 from the apparatus 2 for comparison with a reference value dependent upon a distance D of the remote surface 30 from the apparatus 2; and positioning circuitry 6 configured to position the pointer 50 relative to the image 34 projected onto the remote surface 30.
As illustrated in Figs 4A and 4B, the projector 4 may, for example, comprise circuitry that generates light that is projected primarily in a specific first direction 26 as a beam which has a cross-section 35 that may expand with distance from the apparatus 2. The light is encoded such that the cross-section of the beam forms a desired image 34 when it reaches a remote surface 30. The light may be encoded when the light is generated. For example, a two-dimensional array of light emitting diodes may be used to generate a two-dimensional array of pixels in the projected image 34. Alternatively, the light may be encoded after the light is generated using, for example, a two-dimensional array of filter elements which may, for example, comprise a two-dimensional array of nematic liquid crystal cells in combination with cross-polarizers.
As illustrated in Figs 4A and 4B, the positioning circuitry 6 may, for example, comprise circuitry that is configured to capture, over time, information, such as for example reflected light, that positions the pointer 50. The positioning will be relative to some defined reference system 54 (x, y) which has a known relationship to the projected image (see Fig 6B). Thus it is possible to position the pointer 50 relative to the image 34 projected onto the remote surface 30. In the examples of Figs 4A, 4B and 5, at the remote surface 30 the area of the image 34 is smaller than and lies wholly within the area 32 over which the pointer 50 can be positioned by positioning circuitry 6.
The selection circuitry 8 may, for example, comprise circuitry configured to facilitate estimation of a value (F(x, y, z)) dependent upon displacement (x and/or y and/or z) of the pointer 50 from the apparatus 2 for comparison with a reference value.
For example, the difference d between the distance (D) of the remote surface 30 from the apparatus 2 and the distance (z) of the pointer 50 from the apparatus 2 indicates the spacing of the pointer 50 from the remote surface 30. When the difference d becomes zero it is indicative that the pointer 50 has touched the remote surface 30. The difference d may therefore be monitored and a user selection event determined when the difference d becomes zero. The selection circuitry 8 may, for example, comprise circuitry configured to facilitate estimation of a value dependent upon and indicative of distance (z) of the pointer 50 from the apparatus 2 for comparison with a reference value dependent upon and indicative of a distance (D) of the remote surface 30 from the apparatus 2.
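A minimal sketch of this comparison, assuming distances in metres and an illustrative tolerance to absorb measurement noise (neither the units nor the tolerance are specified in the document):

```python
# Illustrative sketch, not part of the patent disclosure: monitor the
# pointer-to-surface spacing d = D - z and report a selection event when
# the pointer reaches the projected image. A small tolerance absorbs noise.

def detect_touch(surface_distance_D: float,
                 pointer_distance_z: float,
                 tolerance: float = 0.01) -> bool:
    """Return True when the pointer is judged to have touched the surface.

    surface_distance_D: distance D from the apparatus to the remote surface
    pointer_distance_z: estimated distance z from the apparatus to the pointer
    tolerance: how close (in the same units) d must get to zero to count
    """
    d = surface_distance_D - pointer_distance_z
    return d <= tolerance


# With the surface 1.50 m away, a pointer estimated at 1.495 m is within
# the 1 cm tolerance, so a selection event is reported.
assert detect_touch(1.50, 1.495, tolerance=0.01)
assert not detect_touch(1.50, 1.30, tolerance=0.01)
```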
Alternatively or additionally, the first or second time derivative of the distance (z) of the pointer 50 from the apparatus 2 can be used to indicate a pause (a constant spacing of the pointer 50 from the remote surface 30). Time derivatives of the distance (z) may therefore be monitored and a user selection event determined when one or more of the time derivatives becomes a predetermined value for a predetermined time. The selection circuitry 8 may, for example, comprise circuitry configured to facilitate estimation of a value (derivative of z) that is dependent upon but not indicative of distance (z) of the pointer 50 from the apparatus 2 for comparison with a reference value.
Alternatively or additionally, the first or second time derivative of a transverse displacement (x and/or y) of the pointer 50 from the apparatus 2 can be used to indicate a pause (a constant transverse displacement of the pointer 50 from the apparatus 2). Time derivatives of the transverse displacement may therefore be monitored and a user selection event determined when one or more of the time derivatives becomes a predetermined value (e.g. zero) for a predetermined time. The selection circuitry 8 may, for example, comprise circuitry configured to facilitate estimation of a value (derivative of x and/or y) that is dependent upon but not indicative of transverse displacement of the pointer 50 from the apparatus 2 for comparison with a reference value.
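A minimal sketch of such pause detection, assuming sampled (t, x, y) positions and illustrative speed and hold-time thresholds (the sampling scheme and thresholds are assumptions, not taken from the document):

```python
# Illustrative sketch, not part of the patent disclosure: estimate the
# transverse speed of the pointer by finite differences and report a pause
# when the speed stays below a threshold for a predetermined hold time.

from typing import Sequence, Tuple


def pause_detected(samples: Sequence[Tuple[float, float, float]],
                   speed_threshold: float = 5.0,
                   hold_time: float = 0.5) -> bool:
    """samples: (t, x, y) tuples ordered in time (units are arbitrary here).

    Returns True if the finite-difference speed stays at or below
    speed_threshold for a contiguous stretch of at least hold_time seconds.
    """
    held = 0.0
    for (t0, x0, y0), (t1, x1, y1) in zip(samples, samples[1:]):
        dt = t1 - t0
        if dt <= 0:
            continue
        speed = ((x1 - x0) ** 2 + (y1 - y0) ** 2) ** 0.5 / dt
        if speed <= speed_threshold:
            held += dt
            if held >= hold_time:
                return True
        else:
            held = 0.0  # movement resets the pause timer
    return False


# A pointer that barely moves for 0.6 s registers as a pause.
track = [(0.1 * i, 100.0 + 0.1 * i, 200.0) for i in range(7)]
assert pause_detected(track)
```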
There are various different techniques available for determining a value indicative of the distance (z) of the pointer 50 from the apparatus 2.
For example, a light or sound wave may be transmitted by the selection circuitry 8 for reflection by the pointer 50. The light or sound wave reflected by the pointer 50 may be detected by a sensor of the selection circuitry 8. The time of flight of the wave from transmission to reception may be determined as a value, measured in time, indicative of the distance (z) of the pointer 50 from the apparatus 2. Alternatively, a phase of the wave at reception may be compared to a phase at transmission to determine a value, measured in wavelengths, indicative of the distance (z) of the pointer 50 from the apparatus 2.
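A minimal sketch of both estimates, assuming the transmit/receive timings and phases are already available (the function names and constants are illustrative only):

```python
# Illustrative sketch, not part of the patent disclosure: two ways of turning
# a reflected wave into a value indicative of the distance z.

import math

SPEED_OF_LIGHT = 299_792_458.0  # metres per second


def distance_from_time_of_flight(t_transmit: float, t_receive: float) -> float:
    """Round-trip time of flight: the wave travels 2z, so z = c * dt / 2."""
    return SPEED_OF_LIGHT * (t_receive - t_transmit) / 2.0


def path_in_wavelengths(phase_tx_rad: float, phase_rx_rad: float) -> float:
    """Phase comparison: the one-way path expressed in wavelengths.

    Only the fractional part of the path is observable; resolving the
    whole-number ambiguity needs extra information (e.g. several frequencies).
    """
    phase_shift = (phase_rx_rad - phase_tx_rad) % (2.0 * math.pi)
    return phase_shift / (2.0 * math.pi) / 2.0  # halved: round trip to one way


# A 10 ns round trip corresponds to roughly 1.5 m.
print(round(distance_from_time_of_flight(0.0, 10e-9), 3))  # 1.499
```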
As another example, the selection circuitry may comprise at least one of multiple stereoscopic cameras. The additional camera may be provided by the selection circuitry 8 or by the positioning circuitry 6. The pointer 50 may, for example, be positioned relative to the image 34 projected onto the remote surface 30 according to the perspective of one of the stereoscopic cameras. The pointer 50 may then, for example, be positioned relative to the image 34 projected onto the remote surface 30 according to the perspective of another of the stereoscopic cameras. The different positions of the pointer from the different cameras as a result of parallax can be used, with a value of the distance separating the two stereoscopic cameras, to estimate the distance of the pointer 50 from the apparatus 2.
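A minimal sketch of the parallax calculation, assuming a rectified pinhole-camera pair with the focal length expressed in pixels (an assumption; the document does not specify the camera model):

```python
# Illustrative sketch, not part of the patent disclosure: with two cameras
# separated by a known baseline, the parallax (disparity) of the pointer
# between the two views gives its distance from the apparatus.

def depth_from_disparity(focal_length_px: float,
                         baseline_m: float,
                         disparity_px: float) -> float:
    """z = f * B / disparity for a rectified stereo pair.

    focal_length_px: camera focal length expressed in pixels
    baseline_m: distance separating the two cameras, in metres
    disparity_px: shift of the pointer between the two images, in pixels
    """
    if disparity_px <= 0:
        raise ValueError("pointer must be visible in both views")
    return focal_length_px * baseline_m / disparity_px


# Example: f = 800 px, baseline 6 cm, disparity 40 px gives z = 1.2 m.
print(depth_from_disparity(800.0, 0.06, 40.0))  # 1.2
```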
The pointer 50 may, for example, be positioned by emitting structural infra red light. The structural light comprises a pattern of light which when reflected has a pattern that depends upon a distance of the reflecting object. Fig 3 illustrates an example of the apparatus 2. The apparatus 2 comprises a housing 22, which in this example has faces arranged in a cuboid geometry. It has two parallel opposing end faces, two parallel opposing side faces and a front face that is parallel to and opposes a back face. The end faces are orthogonal to the front face, back face and side faces. The apparatus 2 may be placed back face downwards on a flat surface when it is used as a remote user input apparatus. One of the end faces 20 comprises an output aperture 23 for the projector 4. The output aperture is oriented to project in a first direction 26.
The end face also comprises a positioning sensor of the positioning circuitry 6. The positioning sensor 6 is oriented to sense in the first direction 26.
The end face also comprises a selection sensor of the selection circuitry 8. The selection sensor 8 is oriented to sense in the first direction. In this example, the selection circuitry 8 also comprises one or more emitters configured to transmit infra-red light in the first direction 26.
The output aperture 23, the positioning sensor 6 and the selection sensor 8 are positioned in a rectilinear arrangement so that they lie along a straight line that is parallel to the edge where the end face 20 meets the back face. When the apparatus 2 is placed back face downwards on a flat horizontal surface, then the rectilinear arrangement is horizontal when it is used as a remote user input apparatus.
Fig 2 illustrates an example of the remote user input apparatus 2. As in Fig 1, the apparatus comprises a projector 4, positioning circuitry 6 and selection circuitry 8.
The positioning circuitry 6 comprises a sensor such as a camera sensor. The camera viewing angle is greater (wider) than a projection angle of the projector 4. The camera sensor may, as an example, have a focal length of between 20cm and 200cm.
The selection circuitry 8 comprises a sensor 16 such as an infrared camera and one or more emitters 18 such as infra red light emitting diodes.
The apparatus 2 also comprises a processor 10 and a memory 12. The processor 10 is configured to read from and write to the memory 12. The processor 10 may also comprise an output interface via which data and/or commands are output by the processor 10 and an input interface via which data and/or commands are input to the processor 10.
The memory 12 stores a computer program 14 comprising computer program instructions that control the operation of the apparatus 2 when loaded into the processor 10. In this example, the selection circuitry 8 comprises circuitry for determining information indicative of a timing off-set between a wave reflected from the pointer 50 and a reference. The timing off-set is a value that varies with a distance z of the pointer 50 from the apparatus 2. In this example, the one or more emitters 18 are configured to transmit infra red (IR) light waves that have a characteristic that varies in time. For example, the IR light may be transmitted as pulses having an on/off duty cycle in which case the intensity is a characteristic that changes between zero/low and high repeatedly.
The IR light is reflected from the remote pointer 50 and the reflected IR light is detected by the IR sensor 16. The IR sensor 16 is configured to detect the characteristic of the received IR light. For example, the sensor 16 may detect the intensity of the received IR light.
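A minimal sketch of recovering such a timing off-set from a pulsed intensity signal, assuming both the emitted on/off pulse train and the received intensity are available as sampled sequences (the sampling model and the use of cross-correlation are assumptions, not taken from the document):

```python
# Illustrative sketch, not part of the patent disclosure: estimate the timing
# off-set between the emitted on/off pulse train (the reference) and the
# received reflected intensity by cross-correlation. The lag, in sample
# periods, is the value that varies with the pointer distance z.

import numpy as np


def timing_offset_samples(reference: np.ndarray, received: np.ndarray) -> int:
    """Return the lag (in samples) at which `received` best matches `reference`."""
    ref = reference - reference.mean()  # remove the DC level
    rec = received - received.mean()
    corr = np.correlate(rec, ref, mode="full")
    # Output lags run from -(len(ref) - 1) upwards; keep the best one.
    return int(np.argmax(corr) - (len(ref) - 1))


# Build a reference pulse train and a copy delayed by 7 samples plus noise.
rng = np.random.default_rng(0)
reference = np.tile(np.r_[np.ones(10), np.zeros(10)], 8)
received = np.r_[np.zeros(7), reference][: len(reference)]
received = received + 0.02 * rng.standard_normal(len(received))
print(timing_offset_samples(reference, received))  # expected: 7
```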
The time of flight between sending at time t1 an IR light pulse and receiving at time t2 the reflected IR light pulse may be used to determine the distance z of the pointer 50: z = ½ (t2 - t1) * c, where c is the speed of light. Alternatively, the time of flight between sending at time t1 an IR light pulse and receiving at time t2 the reflected IR light pulse may be used as a proxy to represent the distance z of the pointer 50 as it is related to it by a constant. The distance D between the apparatus 2 and the remote surface 30 may be varied. The selection circuitry 8 may also be used to measure the time of flight of IR pulses reflected off the remote surface 30. The time of flight may be used to determine the distance D to the remote surface 30 or it may be used as a proxy to represent the distance D to the remote surface 30. This may occur as a calibration before the determination of the distance z (or its proxy).
The processor 10 is configured to track a separation distance d defined by a difference between the distance z (or its proxy) and the distance D (or its proxy). When the separation distance d becomes zero, the position of the pointer indicates a portion of the image 34 has been selected by the user touching it with the pointer 50 and a user input command, if any, is determined based upon the selected portion of the image.
For example, referring to Fig 5, the image 34 comprises a plurality of active portions 40A, 40B, 40C that are distinct and separated by an inactive portion 40D. If the user touches any of the active portions 40A, 40B, 40C with the pointer 50 then a respective different user input command is executed. If the user touches any of the inactive portion 40D with the pointer 50 then no user input command is executed.
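A minimal sketch of mapping a selection event to a user input command, assuming the active portions are axis-aligned rectangles in normalised image co-ordinates (the layout and command names are illustrative only):

```python
# Illustrative sketch, not part of the patent disclosure: map the (x, y)
# location of a selection event to an active portion of the projected image,
# or to no command at all if it falls in the inactive portion.

from typing import Dict, Optional, Tuple

# Each active portion is an axis-aligned rectangle: (x_min, y_min, x_max, y_max)
ACTIVE_PORTIONS: Dict[str, Tuple[float, float, float, float]] = {
    "command_A": (0.05, 0.10, 0.30, 0.40),
    "command_B": (0.40, 0.10, 0.65, 0.40),
    "command_C": (0.70, 0.10, 0.95, 0.40),
}


def command_for_selection(x: float, y: float) -> Optional[str]:
    """Return the command for the portion containing (x, y), or None."""
    for command, (x0, y0, x1, y1) in ACTIVE_PORTIONS.items():
        if x0 <= x <= x1 and y0 <= y <= y1:
            return command
    return None  # the selection landed in the inactive portion


print(command_for_selection(0.50, 0.25))  # command_B
print(command_for_selection(0.50, 0.90))  # None: no user input command
```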
Fig 8 illustrates a method for determining a user input command made by a user at a remote surface 30. The method is suitable for performance by the processor 10 of the apparatus 2. At block 62, a selection event is detected by analyzing a change in the separation distance d between the pointer and the remote surface 30. The selection event may be detected by analyzing kinematics of the pointer, including a change in separation distance between the pointer and the remote surface 30. In this example, the location in three dimensions of the pointer 50 may be defined by giving a position (x,y) in two dimensions and a distance z from the apparatus 2.
By analyzing (D-z), and optionally the time derivatives of z, it is possible to identify when the pointer touches the remote surface 30 by requiring (D-z) to become zero.
A further or alternative constraint may be that the time derivative of z becomes zero instantaneously or for an extended period (a pause).
Additional or alternative requirements may be that the time derivatives of x and y become zero for a single point touch. However, in other embodiments where a tracing input occurs via the pointer, the time derivatives of x and y will typically be non-zero. At block 64, the position (x,y) of the pointer relative to the image 34 projected onto the remote surface 30 is used to locate the selection event within the image 34.
At block 66, a user input command, if any, is determined based upon the location of the selection event within the image 34.
For example, referring to Fig 5, if the selection event occurs within any of the active portions 40A, 40B, 40C then a respective different user input command is executed. However, if the selection event occurs in the inactive portion 40D then no user input command is executed. Fig 9 illustrates another method for determining a user input command made by a user at a remote surface 30. The method is suitable for performance by the processor 10 of the apparatus 2. At block 71, an optical calibration may occur to determine the distance D as described above.
At block 72, camera data received from a camera sensor of the positioning circuitry 6 is processed to identify the pointer within a captured image.
At block 73, an image portion corresponding to the pointer 50 is isolated within the captured image using computer vision techniques. The image portion corresponding to the pointer 50 moves with the pointer and the pointer can be positioned by positioning (x,y) the image portion within the image.
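A minimal sketch of one possible isolation technique, frame differencing between consecutive camera frames (the document does not prescribe a particular computer vision technique; the change threshold is an assumption):

```python
# Illustrative sketch, not part of the patent disclosure: isolate the moving
# pointer as the region that changed between two frames and position it by
# the centroid (x, y) of the changed pixels.

import numpy as np


def locate_pointer(previous_frame: np.ndarray,
                   current_frame: np.ndarray,
                   change_threshold: float = 25.0):
    """Return the (x, y) centroid of changed pixels, or None if nothing moved.

    Frames are greyscale 2-D arrays of identical shape.
    """
    diff = np.abs(current_frame.astype(np.float32)
                  - previous_frame.astype(np.float32))
    changed_rows, changed_cols = np.nonzero(diff > change_threshold)
    if changed_rows.size == 0:
        return None
    return float(changed_cols.mean()), float(changed_rows.mean())


# A bright fingertip appearing around column 12, row 30 is localised there.
prev = np.zeros((48, 64), dtype=np.uint8)
curr = prev.copy()
curr[28:33, 10:15] = 200
print(locate_pointer(prev, curr))  # approximately (12.0, 30.0)
```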
At block 74, a value dependent upon a separation distance z of the pointer from the apparatus 2 (the source of projecting) is estimated. This may, for example, be achieved by transmitting a pulse of IR light from an emitter 18 of the selection circuitry 8 and detecting the reflected pulse at the sensor 16 of the selection circuitry 8. The time of flight between the transmission of the pulse and the detection of the illumination of the pointer by the pulse can be used to determine the distance z or as a proxy for the distance z. The illumination of the pointer by the pulse can be detected by detecting a change in intensity at (only) the image portion corresponding to the pointer 50.
At block 75, a selection event is detected by analyzing kinematics of the pointer. The values of x, y, z and their first and second derivatives may be analyzed. For example, a selection event may be detected whenever there is a discontinuity in a time derivative of z indicating that the pointer has been moved towards the remote surface 30 and then away from the remote surface 30. There may be additional or alternative constraints required on z and/or the derivatives of x and y. For example it may also be required that z is close to D. For example it may be required that there is little positional (x,y) change when the discontinuity in the second derivative of z occurs.
As another example, a selection event may be detected whenever (D-z) becomes zero i.e. changes in time to reach zero indicating that the pointer has been moved towards the remote surface 30. At block 76, the pointer 50 is positioned relative to the image projected onto the remote surface and the position (x,y) of the pointer relative to the image projected onto the remote surface 30 is used to locate the selection event within the image. At block 77, a user input command, if any, is determined based upon the location of the selection event within the image.
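A minimal sketch combining these constraints, assuming sampled (t, x, y, z) positions, distances in metres and illustrative thresholds (none of which are specified in the document):

```python
# Illustrative sketch, not part of the patent disclosure: detect a selection
# event from the pointer kinematics by looking for a change of sign in dz/dt
# (approach followed by retreat) while z is close to the surface distance D
# and the transverse position (x, y) changes little.

from typing import Optional, Sequence, Tuple


def selection_event(track: Sequence[Tuple[float, float, float, float]],
                    surface_distance_D: float,
                    near_surface: float = 0.05,
                    max_xy_motion: float = 0.02) -> Optional[Tuple[float, float]]:
    """track: (t, x, y, z) samples in time order; z grows towards the surface.

    Returns the (x, y) of the detected tap, or None if no tap is found.
    """
    for i in range(1, len(track) - 1):
        t0, x0, y0, z0 = track[i - 1]
        t1, x1, y1, z1 = track[i]
        t2, x2, y2, z2 = track[i + 1]
        dz_in = (z1 - z0) / (t1 - t0)   # speed towards the surface before the sample
        dz_out = (z2 - z1) / (t2 - t1)  # speed after it
        approached_then_retreated = dz_in > 0 and dz_out < 0
        close_to_surface = abs(surface_distance_D - z1) <= near_surface
        steady_xy = (abs(x2 - x0) <= max_xy_motion
                     and abs(y2 - y0) <= max_xy_motion)
        if approached_then_retreated and close_to_surface and steady_xy:
            return (x1, y1)
    return None


# The pointer moves out to the surface at 1.5 m, taps and pulls back.
samples = [(0.0, 0.20, 0.30, 1.20), (0.1, 0.20, 0.30, 1.35),
           (0.2, 0.20, 0.30, 1.49), (0.3, 0.20, 0.30, 1.35)]
print(selection_event(samples, surface_distance_D=1.50))  # (0.2, 0.3)
```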
Referring back to Fig 2, the computer program instructions 14 provide the logic and routines that enable the apparatus to perform the methods illustrated in Figs 8 and 9. The processor 10, by reading the memory 12, is able to load and execute the computer program 14.
The apparatus 2 therefore comprises: at least one processor 10; and at least one memory 12 including computer program code 14, the at least one memory 12 and the computer program code 14 configured to, with the at least one processor 10, cause the apparatus at least to perform the method of Fig 8 or 9.
The computer program may arrive at the apparatus 2 via any suitable delivery mechanism. The delivery mechanism may be, for example, a computer-readable storage medium, a computer program product, a memory device, a record medium such as a compact disc read-only memory (CD-ROM) or digital versatile disc (DVD), an article of manufacture that tangibly embodies the computer program 14. The delivery mechanism may be a signal configured to reliably transfer the computer program 14. Although the memory 12 is illustrated as a single component, it may be implemented as one or more separate components, some or all of which may be integrated/removable and/or may provide permanent/semi-permanent/dynamic/cached storage. References to 'computer-readable storage medium', 'computer program product', 'tangibly embodied computer program' etc. or a 'controller', 'computer', 'processor' etc. should be understood to encompass not only computers having different architectures such as single/multi-processor architectures and sequential (Von Neumann)/parallel architectures but also specialized circuits such as field-programmable gate arrays (FPGA), application specific circuits (ASIC), signal processing devices and other processing circuitry. References to computer program, instructions, code etc. should be understood to encompass software for a programmable processor or firmware such as, for example, the programmable content of a hardware device whether instructions for a processor, or configuration settings for a fixed-function device, gate array or programmable logic device etc.
As used in this application, the term 'circuitry' refers to all of the following:
(a) hardware-only circuit implementations (such as implementations in only analog and/or digital circuitry) and
(b) to combinations of hardware circuits and software (and/or firmware), such as (as applicable): (i) to a combination of processor(s) or (ii) to portions of processor(s)/software (including digital signal processor(s)), software, and memory(ies) that work together to cause an apparatus to perform various functions; and (c) to circuits, such as a microprocessor(s) or a portion of a microprocessor(s), that require software or firmware for operation, even if the software or firmware is not physically present. This definition of 'circuitry' applies to all uses of this term in this application, including in any claims. As a further example, as used in this application, the term "circuitry" would also cover an implementation of merely a processor (or multiple processors) or portion of a processor and its (or their) accompanying software and/or firmware. The term "circuitry" would also cover, for example and if applicable to the particular claim element, a baseband integrated circuit or applications processor integrated circuit.
As used here 'module' refers to a unit or apparatus that excludes certain parts/components that would be added by an end manufacturer or a user. The apparatus 2, for example as illustrated in Fig 1, may be a module and it may comprise interfaces to the projector 4, positioning circuitry 6 and selection circuitry 8 that enable incorporation within an electronic device. For example, the processor 10 and memory 12 illustrated in Fig 2 may be part of an electronic device such as a mobile cellular telephone, computer, personal media player etc.
Fig 10A illustrates the apparatus 2 in use. In this example, a normal to the remote surface 30 is not parallel to the first direction 26. Fig 10B illustrates the apparatus 2 in use. In this example, a normal to the remote surface 30 is parallel to the first direction 26.
The blocks illustrated in Figs 8 and 9 may represent steps in a method and/or sections of code in the computer program 14. The illustration of a particular order to the blocks does not necessarily imply that there is a required or preferred order for the blocks and the order and arrangement of the blocks may be varied. Furthermore, it may be possible for some blocks to be omitted.
Although embodiments of the present invention have been described in the preceding paragraphs with reference to various examples, it should be appreciated that modifications to the examples given can be made without departing from the scope of the invention as claimed.
Features described in the preceding description may be used in combinations other than the combinations explicitly described.
Although functions have been described with reference to certain features, those functions may be performable by other features whether described or not.
Although features have been described with reference to certain embodiments, those features may also be present in other embodiments whether described or not. Whilst endeavoring in the foregoing specification to draw attention to those features of the invention believed to be of particular importance it should be understood that the Applicant claims protection in respect of any patentable feature or combination of features hereinbefore referred to and/or shown in the drawings whether or not particular emphasis has been placed thereon.

Claims

1. An apparatus comprising:
a projector configured to project an image onto a remote surface;
selection circuitry configured to facilitate estimation of a value dependent upon displacement of the pointer from the apparatus for comparison with a reference value; and
positioning circuitry configured to position the pointer relative to the image projected onto the remote surface.
2. An apparatus as claimed in claim 1, wherein the value dependent upon displacement is a distance measured in a first direction between the apparatus and the remote surface.
3. An apparatus as claimed in claim 2, wherein the reference value is dependent upon a distance in the first direction of the remote surface from the apparatus.
4. An apparatus as claimed in any preceding claim, wherein the value dependent upon displacement is dependent upon motion or lack of motion of the pointer.
5. An apparatus as claimed in any preceding claim, wherein the projector comprises an output aperture oriented to project in a first direction, the positioning circuitry comprises a positioning sensor oriented to sense in the first direction and the selection circuitry comprises a selection sensor oriented to sense in the first direction.
6. An apparatus as claimed in claim 5, wherein the output aperture, the positioning sensor and the selection sensor are positioned at a face of the apparatus.
7. An apparatus as claimed in claim 5 or 6, wherein the output aperture, the positioning sensor and the selection sensor are positioned in a rectilinear arrangement.
8. An apparatus as claimed in claim 7, wherein the rectilinear arrangement is configured for horizontal orientation when the projector, the selection circuitry and the positioning circuitry are in use.
9. An apparatus as claimed in any preceding claim, wherein the positioning circuitry comprises a positioning sensor, the positioning sensor being a camera sensor.
10. An apparatus as claimed in claim 9, wherein the camera sensor is an infra-red camera sensor.
11. An apparatus as claimed in claim 10, wherein the positioning circuitry additionally comprises one or more infra-red light emitters.
12. An apparatus as claimed in any preceding claim, wherein the selection circuitry comprises a selection sensor configured to sense waves reflected from the pointer.
13. An apparatus as claimed in claim 12, wherein the selection sensor is configured to detect sound waves or the selection sensor is configured to detect light or the selection sensor is one of at least two stereoscopic cameras.
14. An apparatus as claimed in any preceding claim, wherein the selection circuitry comprises circuitry configured to determine information indicative of a timing off-set between a wave reflected from the pointer and a reference, wherein the timing off-set is a value that varies with a distance of the pointer from the apparatus.
15. An apparatus as claimed in any preceding claim, wherein the selection circuitry comprises circuitry configured to determine information indicative of a timing off-set between a wave reflected from the remote surface and a reference, to determine the reference value.
16. An apparatus as claimed in claim 15, configured so that determining information indicative of a timing off-set between a wave reflected from the remote surface and a reference, wherein the timing off-set determines the reference value, occurs as a calibration before determining information indicative of a timing off-set between a wave reflected from the pointer and a reference.
17. An apparatus as claimed in any of claims 14 to 16, wherein the selection circuitry comprises one or more emitters configured to transmit waves, for reflection from the pointer, having a characteristic that varies in time and wherein the circuitry configured to determine information indicative of a timing off-set is configured to detect the characteristic of a received reflected wave.
18. An apparatus as claimed in any of claims 14 to 17, wherein the selection circuitry comprises one or more emitters configured to transmit waves having an on/off duty cycle and wherein the circuitry configured to determine information indicative of a timing off-set between a wave reflected from the pointer is configured to detect a duty cycle of a received reflected wave.
19. An apparatus as claimed in any preceding claim comprising:
at least one processor; and
at least one memory including computer program code
the at least one memory and the computer program code configured to, with the at least one processor, cause the apparatus at least to perform: detecting a selection event by analyzing a change in displacement between the pointer and the apparatus.
20. An apparatus as claimed in any preceding claim comprising:
at least one processor; and
at least one memory including computer program code
the at least one memory and the computer program code configured to, with the at least one processor, cause the apparatus at least to perform:
detecting a selection event by analyzing kinematics of the pointer including a change in displacement between the pointer and the apparatus.
21. An apparatus as claimed in claim 19 or 20, wherein the kinematics of the pointer analyzed include a change in separation distance between the pointer and the apparatus.
22. An apparatus as claimed in claim 19 or 20, wherein the kinematics of the pointer analyzed include a pause in movement of the pointer.
23. An apparatus as claimed in claim 19, 20, 21 or 22, wherein the at least one memory and the computer program code are configured to, with the at least one processor, cause the apparatus at least to perform:
using the position of the pointer relative to the image projected onto the remote surface to locate the selection event within the image; and
determining a user input command, if any, based upon the location of the selection event within the image.
24. An apparatus as claimed in any of claims 19 to 23, wherein the at least one memory and the computer program code are configured to, with the at least one processor, cause the apparatus at least to perform:
processing camera data to identify the pointer.
25. An apparatus as claimed in claim 24, wherein the at least one memory and the computer program code are configured to, with the at least one processor, cause the apparatus at least to perform:
isolating the pointer and positioning the pointer as it moves.
26. An apparatus as claimed in any preceding claim configured as a module comprising interfaces that enable incorporation within an electronic device.
27. A method comprising:
projecting an image onto a remote surface;
positioning the pointer relative to the image projected onto the remote surface;
detecting a selection event by analyzing a value dependent upon displacement of the pointer from the source of the projecting;
using the position of the pointer relative to the image projected onto the remote surface to locate the selection event within the image; and
determining a user input command, if any, based upon the location of the selection event within the image.
28. A method as claimed in claim 27, further comprising: processing camera data to identify the pointer.
29. A method as claimed in claim 28, further comprising: isolating the pointer and positioning the pointer as it moves.
30. A method as claimed in any of claims 27 to 29, wherein analyzing a value dependent upon displacement of the pointer from the source of the projecting involves analyzing a displacement of the pointer from an apparatus performing the projecting or a derivative thereof.
31. A method as claimed in any of claims 27 to 30, wherein analyzing a value dependent upon displacement of the pointer from the source of the projecting involves analyzing a separation distance in a first direction between an apparatus performing the projecting and the pointer or a derivative thereof.
32. A method as claimed in any of claims 27 to 31, wherein analyzing a value dependent upon displacement of the pointer from the source of the projecting involves analyzing to detect a pause in movement of the pointer.
33. A method as claimed in any of claims 27 to 32, further comprising: detecting a selection event by analyzing changes in the separation distance of the pointer and changes in the position of the pointer.
34. A method as claimed in any of claims 27 to 33, further comprising:
estimating a reference value dependent upon a separation distance of the pointer from the remote surface;
comparing the value dependent upon the separation distance of the pointer from the source of the projecting and the reference value dependent upon a separation distance of the remote surface from the source of the projecting to detect a selection event.
35. A computer program which, when loaded into a processor, enables the methods of any of claims 27 to 34.
36. An apparatus comprising means for performing the methods of any of claims 27 to 34.
37. An apparatus comprising:
a projector configured to project an image onto a remote surface;
selection circuitry configured to facilitate estimation of a value dependent upon distance of the pointer from the apparatus for comparison with a reference value dependent upon a distance of the remote surface from the apparatus and positioning circuitry configured to position the pointer relative to the image projected onto the remote surface.
38. A method comprising:
projecting an image onto a remote surface;
positioning the pointer relative to the image projected onto the remote surface;
detecting a selection event by analyzing kinematics of the pointer;
using the position of the pointer relative to the image projected onto the remote surface to locate the selection event within the image; and
determining a user input command, if any, based upon the location of the selection event within the image.
PCT/CN2011/071631 2011-03-09 2011-03-09 An apparatus and method for remote user input WO2012119308A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
PCT/CN2011/071631 WO2012119308A1 (en) 2011-03-09 2011-03-09 An apparatus and method for remote user input

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/CN2011/071631 WO2012119308A1 (en) 2011-03-09 2011-03-09 An apparatus and method for remote user input

Publications (1)

Publication Number Publication Date
WO2012119308A1 true WO2012119308A1 (en) 2012-09-13

Family

ID=46797416

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2011/071631 WO2012119308A1 (en) 2011-03-09 2011-03-09 An apparatus and method for remote user input

Country Status (1)

Country Link
WO (1) WO2012119308A1 (en)


Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN1322329A (en) * 1998-10-07 2001-11-14 Intel Corporation Input device using scanning sensors
CN101729628A (en) * 2008-10-15 2010-06-09 LG Electronics Inc. Mobile terminal having image projection
CN101840302A (en) * 2009-03-12 2010-09-22 LG Electronics Inc. Portable terminal and method of providing a user interface of the mobile terminal

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9251409B2 (en) 2011-10-18 2016-02-02 Nokia Technologies Oy Methods and apparatuses for gesture recognition

Similar Documents

Publication Publication Date Title
US11112872B2 (en) Method, apparatus and computer program for user control of a state of an apparatus
KR20110066198A (en) Stereo optical sensors for resolving multi-touch in a touch detection system
US10101817B2 (en) Display interaction detection
TWI536226B (en) Optical touch device and imaging processing method for optical touch device
US20120120030A1 (en) Display with an Optical Sensor
TWI461990B (en) Optical imaging device and image processing method for optical imaging device
JP2010511945A (en) Interactive input system and method
Moeller et al. ZeroTouch: a zero-thickness optical multi-touch force field
US9207811B2 (en) Optical imaging system capable of detecting a moving direction of an object and imaging processing method for optical imaging system
KR102298652B1 (en) Method and apparatus for determining disparty
US20130016069A1 (en) Optical imaging device and imaging processing method for optical imaging device
TWI439906B (en) Sensing system
WO2012119308A1 (en) An apparatus and method for remote user input
TWI454653B (en) Systems and methods for determining three-dimensional absolute coordinates of objects
US20160321810A1 (en) Optical navigation sensor, electronic device with optical navigation function and operation method thereof
US9652081B2 (en) Optical touch system, method of touch detection, and computer program product
US9152275B2 (en) Optical touch system, method of touch detection and non-transitory computer readable medium recording program instructions
US20120032921A1 (en) Optical touch system
US20160139735A1 (en) Optical touch screen
US20140145959A1 (en) Information processing apparatus, extension device, and input control method
US9535535B2 (en) Touch point sensing method and optical touch system
TWI464626B (en) Displacement detecting apparatus and displacement detecting method
US20180074648A1 (en) Tapping detecting device, tapping detecting method and smart projecting system using the same
US8922528B2 (en) Optical touch device without using a reflective frame or a non-reflective frame
TWI547849B (en) Optical sensing electronic devices and optical sensing method

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 11860340

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 11860340

Country of ref document: EP

Kind code of ref document: A1