GB2408326A - Modulated-beam optical pointer for position detection system - Google Patents

Modulated-beam optical pointer for position detection system

Info

Publication number
GB2408326A
GB2408326A GB0425660A
Authority
GB
United Kingdom
Prior art keywords
image
optical
video
event
image capture
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
GB0425660A
Other versions
GB2408326B (en)
GB0425660D0 (en)
Inventor
Steven Lee Jones
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Individual
Original Assignee
Individual
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Individual
Publication of GB0425660D0
Publication of GB2408326A
Application granted
Publication of GB2408326B
Expired - Fee Related (current legal status)
Anticipated expiration

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/033Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
    • G06F3/038Control and interface arrangements therefor, e.g. drivers or device-embedded control circuitry
    • G06F3/0386Control and interface arrangements therefor, e.g. drivers or device-embedded control circuitry for light pen

Abstract

An optical pointer positioning system includes a video camera 4, which captures an image, and a pointer device 5, which projects an infra-red beam onto the image, the beam being modulated by a signal derived from the operation of the video camera and transmitted remotely to the pointer. The signal is applied such that the beam is on for one field of a video frame and off for the other. The detected position of the mark is then digitally filtered and used to position a steadied computer-generated cursor. The modulated optical beam is also encoded with the status of user-operable buttons which, once detected and decoded, are used to simulate mouse button events.

Description

Pointer Position Detection

The invention relates to a pointer
position detection system, particularly but not exclusively for use in the wireless control of computers and computer displays.
Video projectors are a common tool used in many modern presentations, often being used to project computer generated slides or images onto a screen for viewing by a large audience. Such presentations are becoming increasingly interactive, with a presenter changing slides, or starting or stopping moving images, using a controlling device such as a keyboard or mouse. Operating the computer can undesirably tie the presenter to its locality when, for example, attempting to direct a commentary towards an audience.
To free the presenter from being bound to the computer a number of wireless presentation control devices have been developed. These wireless devices range from the basic, which provide single buttons to advance slides, to more advanced types, which provide complete cursor control using some form of joystick, tracker ball or inertial sensor. Wireless devices that provide cursor control using these methods are notoriously difficult to control. In particular, for a presenter viewing a presentation screen at an angle, i.e. not facing the screen straight on, controlling the cursor on the screen is non-intuitive. Smooth cursor control consequently requires some practice.
Recently there have been proposals for wireless control systems based upon laser pointers. These project a visible light spot onto the display screen from which the cursor location is determined. Such input devices offer more intuitive cursor control than comparable methods.
Prior art laser pointer control systems commonly work by tracking the position of the visible laser pointer spot, within a projected image, using a camera mounted on top of the video projector. The view from the camera is analysed by a computer using image processing algorithms. The computer looks for a portion of the image that contains pixels with brightness above a set threshold level. It can use this information to identify where, within the projected image, a presenter is pointing and can control the position of a cursor accordingly, or invoke a preprogrammed event.
A reliable control system based on such conventional image processing techniques is difficult to implement. Extraneous sunlight or reflections off shiny surfaces within the camera's view can falsely be interpreted by the computer as a laser spot, causing erratic control behaviour. Changes in ambient light level can cause either false locations to be identified, or laser spots to be missed. Consequently, careful calibration of the system's image processing algorithms is required at start-up to set threshold detection levels. Strict control of the ambient lighting conditions must then be enforced to continue reliable operation. This manual intensity calibration procedure and controlled presentation environment is not ideal for practical situations. As a result, there is a requirement for a robust optical detection method that can more effectively distinguish the location of the spot, leading to a more reliable control system.
Exaggerated pointer shake is also a common problem encountered in prior art laser pointer control systems. This effect is caused by the large distance between the presenter and the screen leading to a magnification of any slight hand movement. This is distracting for an audience as the position that the presenter is attempting to indicate is unclear. This effect detracts significantly from the use of optical pointing devices as a control method, and must be overcome if approaches of this type are to gain general popularity.
As discussed, successful spot detection within these prior art systems requires an optical intensity that is considerably brighter than ambient light levels. Laser sources are therefore commonly employed as the optical input pointer. The use of an infrared source has the advantage of producing a spot that is invisible to the eye but visible to the camera, thus obviating apparent laser pointer shake. It also overcomes the problem, with visible optical pointers, of the appearance of both the visible laser spot and computer-generated cursor on the screen at once, which can be distracting to an audience. However, the high optical intensity required, using the prior art systems, means that invisible wavelength pointers such as infrared lasers are too dangerous to employ in public situations, due to the inherent risk of eye damage.
Optical pointer control systems have also been proposed which use non-video-camera based techniques. These systems commonly utilise high frequency modulated (light intensity) pointers coupled with a non-video-camera based optical detector/decoder to determine the pointed position. A major disadvantage of such approaches is that a manual positional calibration regime must be employed in the system's configuration.
Positional calibration is required to instruct the system as to where the projection screen is placed within the optical detector's field of view. This allows the position of the optical pointer spot to be determined relative to the projected image. Video based control systems have the advantage of utilising the computer-projector-video-camera-computer 'closed loop' to provide automatic calibration.
Optical pointer control systems have also been produced using image processing algorithms that locate the pointer position by detecting changes in the captured image.
Although this approach improves detection sensitivity, successful detection can only be achieved whilst the pointed spot is moving and fails if the pointer remains static.
This approach is also restricted to visible pointers that produce a small, high intensity centre point, such as that produced by a laser.
Presentation control systems should ideally allow the user to select specific points of interest within their presentation. Such actions are commonly performed by the click of a button. In previous optical pointer control systems the use of a flashing spot to indicate a button press has been proposed, the system interpreting this flashing as a button press. The problem with this approach is that flashing the spot on and off causes an interruption in the flow of positional information, as during the off period its position cannot be detected. This shortfall leads to jerky cursor control during button presses.
Alternative proposals have suggested using the momentary appearance of additional laser spots or shapes to indicate a button press. Projecting additional spots or shapes onto the video presentation in this manner has the disadvantage of being visually distracting. Another commonly proposed technique for communicating the actuation of a user-operated button is to include an additional wireless communication link to transmit the button status. This, however, adds to the overall cost and complexity of the system. Consequently, there is a need for a method of communicating button presses without interfering with the spot tracking process, introducing additional spots or requiring an additional wireless communication link.
In consideration of the prior shortcomings, it is an aspect of the present invention to provide an improved system and method for detecting the position of an optical pointer.
At its most general, according to the present invention there is provided an optical pointer position detection system comprising: an optoelectronic detector for detecting an image, and an optical device for producing a modulated optical beam, wherein the modulation applied to the optical beam is derived from an image capture event of the optoelectronic detector.
According to another aspect of the invention there is provided a method for detecting an optical mark, comprising steps of projecting a modulated optical pointer at a video image, to produce an optical mark, and detecting the mark using an optoelectronic detector, wherein the modulation applied to the optical pointer is derived from an image capture event of the optoelectronic detector.
According to yet another aspect of the invention there is provided a method for controlling an electronically generated cursor comprising the steps of projecting a cursor containing video output of a computer, projecting a modulated optical pointer at the video output, to produce an optical mark, detecting the mark using an optoelectronic detector, wherein the modulation applied to the optical pointer is derived from an image capture event of the optoelectronic detector, and invoking a software algorithm according to an output of the optoelectronic detector to position the cursor according to the detected position of the mark.
The software algorithm used to position the cursor may further comprise a step of digitally filtering or averaging the detected positions over time before positioning the cursor. The optoelectronic detector may be a video camera.
The step of detecting the mark may include image processing of the output of the optoelectronic detector. The image processing may include calculating a difference between images captured by the detector over a time interval ('differencing calculation'), such as a difference in intensity, wavelength, or a combination of both.
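By way of illustration, such a differencing calculation can be sketched as follows for greyscale captures, assuming NumPy arrays; the function name and threshold value are illustrative choices, not details taken from the disclosure.

```python
import numpy as np

def difference_image(frame_a: np.ndarray, frame_b: np.ndarray,
                     threshold: int = 30) -> np.ndarray:
    """Binary mask of pixels whose intensity changed between two
    captures taken one modulation interval apart."""
    # Subtract in a signed type so that 8-bit values cannot wrap around.
    diff = np.abs(frame_a.astype(np.int16) - frame_b.astype(np.int16))
    # Static scene content cancels; the modulated mark survives.
    return diff > threshold
```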
The above method for controlling an electronically generated cursor can further comprise the step of simulating a mouse button press event such that if, for example, the projected image of a computer output contains icons, the icons can be selected.
The occurrence of a simulated mouse button press may be determined using the above described image processing technique.
According to an aspect of the invention the modulation applied to the optical beam is derived from the occurrence of a capture synchronization event of the optoelectronic detector. The optical detector is preferably a video camera.
The capture synchronisation event may be a frame synchronisation event of the optoelectronic detector. In such a case, the time interval of the aforementioned image processing differencing calculation could correspond to one or more frame synchronisation events.
Alternatively, the capture synchronisation event may correspond to a field synchronisation event of the optoelectronic detector. In such a case, the time interval of the aforementioned image processing differencing calculation could correspond to one or more field synchronisation events.
It is preferable that the capture synchronisation event corresponds to a field synchronisation event, with the difference calculation calculating the difference between odd and even fields of the complete frame. This is preferable as a calculation is then obtained for each complete video frame capture, as opposed to every other frame as is only possible when using frame synchronisation. It is also preferable to use a half-frame video camera as the optoelectronic detector. Using such a device provides odd and even fields that are produced by the same set of video pixels, allowing a more accurate differencing calculation to be obtained.
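A minimal sketch of this odd/even field differencing, assuming the interlaced frame arrives as a single array with the two fields on alternating rows; the row convention and names are assumptions of this illustration rather than details from the disclosure.

```python
import numpy as np

def field_difference(frame: np.ndarray) -> np.ndarray:
    """Difference the two fields of one interlaced frame."""
    odd_field = frame[0::2].astype(np.int16)   # lines 1, 3, 5, ...
    even_field = frame[1::2].astype(np.int16)  # lines 2, 4, 6, ...
    # Trim one row if the frame has an odd number of lines.
    n = min(len(odd_field), len(even_field))
    # The modulated spot appears in only one field, so it survives the
    # subtraction while static scene content cancels out.
    return np.abs(odd_field[:n] - even_field[:n])
```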
It is preferable that the optics of the camera are configured such that the image seen by the camera is defocused. Defocusing the image causes the optical mark to become an enlarged diffused spot. This has the benefit of providing a larger spot from which to derive the centre of the pointed position.
According to an aspect of the present invention an electromagnetic signal is transmitted remotely to the optical pointer to indicate the occurrence of an image capture. This signal is used by the optical pointer when deriving beam modulation.
Preferably the electromagnetic signal is transmitted wirelessly and more preferably at a radio frequency.
The optical pointer may have an optical emission in the visible, infrared, ultraviolet or other part of the electromagnetic spectrum or multiple combinations thereof. The optical emission may be coherent or non-coherent. Preferably the optical pointer emits in the infrared. Non-visible wavelengths such as this are preferred as any pointer shake is not visible to the audience. Also, the image processor, fully aware of the true spot positions, has the opportunity to perform digital filtering before optimally positioning the computer cursor. Consequently, a significant advancement provided by the current invention is digitally-steadied cursor placement.
Digitally-steadied cursor placement may be provided by a simple implementation of a smoothing algorithm that averages the perceived spot locations over time. An alternative method would be to use a more advanced digital filter that was configured for the particular mechanical spatial frequency of the pointer shake. This mechanical spatial frequency could be determined by examining the perceived spot locations over time and then analysing the positional data using a Fourier transform algorithm.
Furthermore, as the shake is deterministic in nature and will occur more strongly along specific axes of the presenter's hand, the filtering algorithm may also be tailored to each axis.
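A sketch of the simple smoothing implementation, with independent window lengths per axis standing in for a filter tuned to the measured shake frequency; the class name and window sizes are illustrative assumptions.

```python
from collections import deque

class SpotSmoother:
    """Average perceived spot locations over time, per axis."""

    def __init__(self, window_x: int = 8, window_y: int = 4):
        # A longer window damps the axis along which shake is stronger.
        self._xs = deque(maxlen=window_x)
        self._ys = deque(maxlen=window_y)

    def update(self, x: float, y: float) -> tuple[float, float]:
        """Feed one raw detection, return the steadied cursor position."""
        self._xs.append(x)
        self._ys.append(y)
        return (sum(self._xs) / len(self._xs),
                sum(self._ys) / len(self._ys))
```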
The device may be hand-held, fixed or attached to another static or moveable object.
The modulation applied to the light emission of said device may be amplitude modulation, frequency modulation or spatial modulation, causing the output to change in intensity, wavelength (colour), pattern, shape, position or combinations thereof. The optical emission may be a defined point, such as that produced by a conventional laser pointer, or an enlarged spot or pattern covering a larger area.
There are a number of preferred methods to generate the modulated infrared light of the preferred optical pointer. For example, a high power IR laser source could be employed with some suitable diverging and collimating optics. Diverging the laser beam would mean that the optical power could be spread over a large enough area to reduce it to a safe intensity level. Diverging the laser beam also provides a larger target for detection by the image processing algorithm, allowing a more accurate measurement to be obtained. As an alternative, a non-coherent IR LED could be employed with suitable collimating optics. As a further alternative, a conventional wideband incandescent light source could be employed with an external modulator, such as an LCD shutter. When combined with suitable collimating and filtering optics, this could also be used to generate a modulated IR source.
The present invention preferably allows large variations in ambient light level to be accommodated without the need for any intensity recalibration. The improved detection capability preferably allows greatly reduced optical pointer intensity levels to be reliably located. The use of low intensity levels means that invisible infrared wavelengths may be more safely used. Furthermore, lower positional calibration constraints are placed on the system due to the compensation which naturally occurs in eye-to-hand co-ordination, i.e. a user will intrinsically correct their (invisibly) pointed location until the computer cursor coincides with their intended location.
Particular elements within the system's closed optical loop cause geometrical distortions to occur between, for example, the original presentation slide and the captured image after it has been projected by the video projector and recaptured by the video camera. These geometrical distortions may be caused by projector optics, camera optics or axial misalignment between the projector and camera. As the system identifies the location of a pointed position within a captured image, and uses this to infer the location pointed at within the presentation slide, these geometric distortions cause inaccuracies between the calculated and actual pointed position. To compensate for these geometrical distortions a positional translation algorithm is ideally implemented. This positional translation algorithm may be obtained manually, requiring user interaction, or automatically by the system. Suitable calibration schemes used to obtain such positional translation algorithms are well known to those skilled in the art.
In an automatic calibration scenario, an algorithm could, for example, require a series of distortion measurements to be performed by the system at start up. This approach relies on the fact that in certain embodiments of the present invention the computer can project a known image through the projector and then capture it again using the video capture camera mounted on top of the projector (ensuring that any IR filters are retracted beforehand). By comparing the known source image to the eventual captured image, a map of the distortion introduced by the system can be calculated.
In a preferred automatic calibration scenario, images are captured by the camera as a series of image pairs. For the first captured image the projector projects a known bright image such as a grid or chequer board and for the second captured image a negative of the first bright image is projected. By subtracting one image from the other, a difference image is obtained in which the position and dimensions of the video projector output are clearly identified. This provides an improved calibration method as the calibration routine uses a differential intensity value, as opposed to an absolute intensity value, leading to an improved signal to noise ratio.
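A sketch of locating the projector output from such an image pair; thresholding the difference at half its maximum is an illustrative choice, not a value given in the disclosure.

```python
import numpy as np

def projector_bounds(pattern: np.ndarray, negative: np.ndarray):
    """(top, bottom, left, right) of the projected image in camera pixels."""
    # Differential intensity: the projected area changes between the two
    # captures while the surrounding scene does not.
    diff = np.abs(pattern.astype(np.int16) - negative.astype(np.int16))
    mask = diff > diff.max() // 2
    rows = np.flatnonzero(mask.any(axis=1))
    cols = np.flatnonzero(mask.any(axis=0))
    if rows.size == 0 or cols.size == 0:
        raise ValueError("projected pattern not found in captures")
    return rows[0], rows[-1], cols[0], cols[-1]
```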
In a manual calibration scenario the interaction of the presenter is required. In such a scenario the presenter could be asked to point to a series of regions within the projected image using the pointing device. This process relies on the fact that the pointer is now placed at a known position within the presentation environment and the system can detect this position within the captured image, allowing it to relate the two. As an example, the presenter may be asked to point to the four corners of the projected image at the request of the system. At each stage the captured image is analysed to locate the position of the pointed spot. Using these calculated points the geometrical distortion introduced by the system can be inferred and the required positional translation algorithm calculated.
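A sketch of deriving the positional translation from four such corner correspondences, here realised with OpenCV's perspective transform; OpenCV is one plausible implementation, not one named by the patent, and the coordinate values are placeholders.

```python
import cv2
import numpy as np

# Spot positions detected in the captured image as the presenter points at
# each corner in turn (camera pixel coordinates; values are placeholders).
detected = np.float32([[42, 31], [598, 25], [612, 470], [35, 462]])
# The corresponding corners of the presentation slide (slide coordinates).
slide = np.float32([[0, 0], [1024, 0], [1024, 768], [0, 768]])

pt = cv2.getPerspectiveTransform(detected, slide)

def to_slide_coords(x: float, y: float) -> tuple[float, float]:
    """Map a detected spot position into slide coordinates."""
    p = pt @ np.array([x, y, 1.0])
    return p[0] / p[2], p[1] / p[2]
```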
Once the positional translation algorithm has been obtained using either of the above approaches, captured images can then be corrected for geometrical distortion before pointed positional information is extracted.
Another benefit provided by the present invention is the method of optically encoding button presses, and other user operated events, without interrupting the optical detection process. This allows complete interactive cursor control to be achieved without interrupting the flow of control, interfering with the projected image or requiring additional hardware.
With the image processing techniques described above, spot calculation requires an image pair in which the spot is present in one image capture and absent in the other. Consequently, there exist two image pair arrangements in which successful detection can occur. Firstly, the spot could be present in the first image and absent in the second. Secondly, the spot could be absent in the first image and present in the second. In either of these two scenarios a successful differencing calculation for the location of the pointed spot can be performed and the location of the spot determined. By using one image pair arrangement to represent a '1', and the other a '0', additional information is essentially encoded in each image pair with no impairment to detection performance. This bit-per-image-pair modulation scheme can be used to encode the state of a single user-operated button. Alternatively, the method can be used to encode a series of bits forming a multi-bit word, allowing numerous buttons or user operable features to be transmitted.
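A sketch of decoding the bit carried by one image pair once the spot location is known from the differencing step; comparing raw intensities at that location is one plausible test, assumed here for illustration.

```python
import numpy as np

def decode_pair(first: np.ndarray, second: np.ndarray,
                spot_row: int, spot_col: int) -> int:
    """1 if the spot was present in the first capture and absent in the
    second; 0 for the opposite arrangement."""
    a = int(first[spot_row, spot_col])
    b = int(second[spot_row, spot_col])
    # The sign of the difference, not its magnitude, carries the bit,
    # so position tracking is unaffected.
    return 1 if a > b else 0
```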
The optical detector may be photosensitive in the visible, infrared, ultraviolet or other part of the electromagnetic spectrum or multiple combinations thereof. It may also be capable of distinguishing between the different wavelength components of the electromagnetic spectrum (i.e. colour). The optical detector may also be fitted with an optical filter passing one or more specific wavelength bands.
As previously discussed, the scene viewed by the photosensitive device may contain the output of a visual display device such as a video projector, video screen or any other such image presentation device. The scene may additionally or alternatively include a non-active target surface such as a white/blackboard or any other region of interest.
The optical pointing device may also possess a variety of buttons, switches or other user operable fixtures. Signals generated from these features or any other data source may be transmitted by the device using RF, infrared or any other known transmission medium. Alternatively, said signals may be incorporated into the optical emission modulation scheme of said device in a similar manner to the above-described optically encoded button press technique.
Embodiments of the invention will now be described by way of example with reference to the accompanying drawings, in which:
Figure 1 is a diagram of a system according to the invention illustrating the interaction of a presenter, a pointing device, a video projector, a projected image, a camera and a computer;
Figure 2 is a schematic diagram of the system illustrating the interconnection of the pointing device, camera and computer;
Figure 3 is a schematic diagram showing the optics of the pointing device of Figure 1;
Figure 4 is a schematic diagram showing an alternative arrangement of the optics shown in Figure 3 based on a wideband optical source;
Figure 5 is a schematic diagram showing an alternative arrangement of the optics shown in Figure 3 based on an IR laser source;
Figure 6 shows a diagram of the video image captured by the video camera with the optical spot present;
Figure 7 shows a diagram of the video image captured by the video camera with the optical spot absent;
Figure 8 shows a diagram of the resultant image after an image processing differencing calculation;
Figure 9 is a flow diagram illustrating the processing of an image captured by the camera and computer;
Figure 10 is a diagram showing the positional translation algorithm;
Figure 11 shows a diagram of a projector with inbuilt video camera device;
Figure 12 shows a diagram of a dual wavelength video camera device and hand-held pointer;
Figure 13 shows a diagram of a dual wavelength sensitive image sensor formed from two monochrome CCDs;
Figure 14 shows a flow diagram of the image processing algorithm for the dual wavelength embodiment;
Figure 15 shows a diagram of a pattern modulated hand-held pointer;
Figure 16 shows examples of image pairs suitable for use in the pattern modulation embodiment;
Figure 17 shows a flow diagram of the image processing algorithm for the pattern modulated hand-held pointer;
Figure 18 is a diagram of the hand-held camera pointer system;
Figure 19 is a schematic diagram of the system illustrating the interconnection of the pointing device, camera and computer for the hand-held camera embodiment;
Figure 20 shows odd-field capture with IR target during 'on' period of modulation;
Figure 21 shows the even-field capture during 'off' period of modulation;
Figure 22 shows the resultant image after odd/even field differencing;
Figure 23 shows a flow diagram of the image processing algorithm for the hand-held camera embodiment;
Figure 24 shows a diagram of the dual wavelength hand-held camera arrangement;
Figure 25 shows a diagram of the image processing algorithm for the dual wavelength hand-held camera embodiment.

Embodiment 1a

Referring to Figure 1, one embodiment of the present invention comprises a computer 1, a video projector 2, a projection screen 3, a video camera device 4 and a pointing device 5. The computer 1, for example a conventional Pentium based PC, contains a series of graphical presentation slides 6, such as those generated by Microsoft Powerpoint. The SVGA output of the computer 1 is connected to a video projector 2, such as a Sanyo PLV 20B. The video projector 2 projects the presentation slides 6 generated by the computer 1 onto the projection screen 3. A presenter 7 views the projected presentation slides 6 and directs the output of the hand held optical pointing device 5 onto the projection screen 3. The output of the hand held pointing device 5 produces an infrared optical beam that is coincident with the projected presentation slides 6. The video camera device 4 is placed on top of the video projector 2 at an angle so as to be able to view the projected presentation slides 6 and the output of the hand held optical pointing device 5. The video output signal generated by the video camera device 4 is connected to the video capture port of the computer 1, which is capable of capturing video images.
The video camera device 4 and pointing device 5 are described in more detail with reference to Figure 2. The video camera device 4 comprises a housing 8, an image sensor 9, a retractable infrared pass filter 10, a video field timing extraction device 11, an RF transmitter 12 and an aerial 13. The scene viewed by image sensor 9, such as a Maplin L77AB CCD device, is viewed through retractable infrared pass filter 10, such as Edmund Optics C45-670. The video output signal of image sensor 9 is connected to the video capture port of the computer 1, which is fitted with a video capture card 22 such as a Brooktree BT848 card. Image sensor 9 composes its video frames using a technique called interleaving. Interleaving is a process whereby a video camera splits a complete video frame into two halves. These two halves are called fields; one field contains the odd numbered horizontal lines (odd field) and the other field the even numbered horizontal lines (even field). The exposures of these two fields are also separated in time. Consequently, at time tn the odd numbered horizontal lines are exposed, at time tn+1 the even numbered horizontal lines are exposed, and so on. One field is produced every 1/60th of a second, providing 30 complete video frames a second.
The video output signal of image sensor 9 is also connected to the video field timing extraction device 11, such as National Semiconductor IC LM1881N. Video timing extraction chips are a common component produced by various manufacturers, capable of analysing a video signal and providing odd/even field event timing signals as individual digital outputs. The odd/even field pulse output of the video field timing extraction device 11 is connected to the RF transmitter 12, such as RF Solutions AM RT5 433, and transmitted via the aerial 13.
The pointing device 5 comprises a body 14, an aerial 15, an RF receiver 16, a user operable button 17, modulator logic 18 and a light emitter 19. The aerial 15 receives the RF signal transmitted by transmitter 12. The RF receiver 16, such as RF Solutions AM HRR3 433, demodulates the received signal. The modulator logic 18 takes the output of the RF receiver 16 and the output of the user operable button 17 and applies the resultant modulated signal to light emitter 19, producing a modulated infrared optical beam.
The optics of light emitter 19 are described in more detail with reference to Figure 3.
Light emitter 19 shown in Figure 3 comprises an infrared LED 20, such as Siemens SFH487, and collimating lens 21. The light output of infrared LED 20 is collimated by collimating lens 21 and projected coaxially along the axis of the pointing device 5.
Figure 4 shows an alternative arrangement for the light emitter 19 comprising a wideband source 23, a collimating lens 24, a light modulator 25 and an IR filter 26. In this arrangement the modulator 25 is used to modulate the collimated output of the wideband source 23. The IR pass filter 26 is used to filter out the visible wavelengths from the modulated wideband wavelength spectrum, resulting in a modulated IR beam. Figure 5 shows an alternative arrangement for the light emitter 19 comprising an IR laser source 27, a diverging lens 28 and collimating lens 29.

Referring to Figures 1, 2 and 3, the video camera device 4 captures a series of images of the projection screen 3. As the image sensor 9 views the projection screen 3 through a retractable infrared pass filter 10, it only sees light in the infrared range. As the video projector 2 only produces light in the visible spectrum, the presentation slides 6 projected onto the projection screen 3 are not visible to the image sensor 9 unless retractable infrared filter 10 is retracted. The spot produced by the pointing device 5 is in the infrared range and consequently visible to the image sensor 9. The video output signal of image sensor 9 is connected to both the video capture card port 22 of computer 1 and the input of the video field timing extraction device 11. The video field timing extraction device 11 extracts the field synchronization signal from the video output signal produced by image sensor 9 and transmits it using the RF transmitter 12 through the aerial 13. The transmitted RF signal is detected by the RF receiver 16 within the hand held pointing device 5. The signal extracted from the received RF signal is passed through the modulation logic 18 and used to amplitude modulate the light emitter 19 of the pointing device 5. With the user operable button 17 in the off state, the received field synchronization signal is applied by the modulation logic 18 in a manner such that an odd field capture of the image sensor 9 causes the light emitter 19 of the optical pointing device 5 to turn on.
Consequently, in the field image captured by the image sensor 9 during the odd field exposure, the spot produced by the optical pointer 5 is present (see Figure 6).
Correspondingly, the received signal relating to a subsequent even field capture of the image sensor 9 causes the light output of the optical pointing device 5 to turn off.
Consequently, in the image captured by the image sensor 9 during the even field exposure, the spot produced by the optical pointer 5 is no longer present (see Figure 7). With the user operable button 17 in the on state, the pointer modulation scheme is reversed and so light emitter 19 is off during an odd field capture and on during an
even field capture.
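The modulation logic just described reduces to an exclusive-or of field parity and button state; a minimal sketch follows, with the Boolean modelling an illustrative simplification of the modulator logic 18.

```python
def emitter_on(odd_field: bool, button_pressed: bool) -> bool:
    """Light emitter 19 state for the current field capture."""
    # Button up: emit during odd fields only.
    # Button down: the scheme is reversed (emit during even fields).
    return odd_field != button_pressed  # logical XOR
```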
The completed two-field frame obtained by the image sensor 9 has now been captured by the video capture card 22 fitted to the computer 1. This complete video frame is composed of two near identical video fields, with the only difference being the appearance in one of the fields of the output from the hand held pointer device 5. By using a differencing algorithm between the two fields the presence of the spot can be determined (see Figure 8). This process is best described with reference to the flow diagram shown in Figure 9.
In step s1 the full frame composed of both odd (f1) and even (f2) fields has been captured by the computer 1. In step s2 a difference image (fd) is calculated from the subtraction of f1 from f2. In step s3 the difference image (fd) is transposed by the positional translation algorithm (pt), which has been calculated at start up, to correct for geometrical distortion. This process is depicted in Figure 10, where 30 is the original image, 31 represents the distortion introduced by the system, 32 is the distorted capture image, 33 represents the application of the positional translation algorithm and 34 is the corrected image. This transposition results in the corrected difference image (fdt). In step s4 the corrected difference image (fdt) is searched for any image artefacts concurrent with a modulated pointed spot. In step s5 the detected spot is assessed to calculate its centre point within the image by estimating its centre of gravity (COG). In step s6 the position of the spot centre point is fed into the input of the digital filter algorithm, the output of which is then used to position the computer cursor in step s7. Consequently, in the projected image produced by video projector 2 the mouse cursor appears at the position where the presenter is pointing.
The field arrangement of the pointed spot is then assessed in step s8 to determine if the user operated button is depressed (i.e. present in the odd field image and absent in the even field image, or vice versa). The result of this assessment is then used to action any mouse button events, such as producing a left mouse button click event.
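A sketch of the centre-of-gravity estimate of step s5, assuming the corrected difference image fdt is a non-negative array in which the spot is the dominant artefact; the names follow the flow diagram, the function name is illustrative.

```python
import numpy as np

def spot_centre(fdt: np.ndarray) -> tuple[float, float]:
    """Estimate the spot centre as the intensity-weighted mean position."""
    total = fdt.sum()
    if total == 0:
        raise ValueError("no spot present in the difference image")
    rows, cols = np.indices(fdt.shape)
    y = (rows * fdt).sum() / total
    x = (cols * fdt).sum() / total
    return x, y
```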
Embodiment 1b

Figure 11 shows a second embodiment of the present invention, with an alternative arrangement for the video camera device and video projector. This arrangement comprises a video camera device 35, a video projector 36, a wavelength dependent beam-splitter 37, a video projector image generator 38 and optional image processing device 39. Figure 11 also depicts the forward projected visible light, represented by 40, and the reflected light from the IR spot, represented by 41. In this arrangement the video camera device 35 is housed internally within the video projector 36. Also incorporated within the video projector 36 is a wavelength dependent beam-splitter 37. The wavelength dependent beam-splitter 37 is chosen such that it is transparent to visible light but reflects light of the wavelength used by the pointing device 5. The beam-splitter 37 is placed inline with the image projection path so that visible light produced by the video projector image generator 38 passes through the beam-splitter 37. However, light of the wavelength produced by the pointing device 5, which is reflected back along the light path, is split off at an angle towards the video camera device 35.
The video output signal produced by the video camera device may then be connected to the video capture port of the computer 1. Alternatively, the video output signal may be processed by an image processing device 39 contained within the video projector. This device 39 performs the necessary image processing on the video signal to extract the position of the pointed spot (as described previously in Figure 9). The device then passes the calculated coordinates of the detected pointed position onto the computer 1. This device alleviates the need for the computer 1 to be capable of capturing and processing the video image.
Embodiment 1c

Figure 12 shows a third embodiment of the present invention with an alternative to the single optical wavelength pointer and monochrome video camera device shown in Figure 2. In this arrangement the single optical wavelength pointing device 5 is replaced by a dual optical wavelength pointing device 42 which contains a dual infrared wavelength light source 43. The video camera device 4 is also replaced with a dual wavelength sensitive video camera device 44. In this arrangement the dual wavelength pointing device 42 is capable of producing two coaxially aligned infrared beams, each at a different wavelength (wavelength x and wavelength y). These beams are capable of being independently modulated and are aligned such that the beam spots are both incident at the same point on the projection screen 3. The modulated dual infrared wavelength optical beam is depicted in Figure 12 as 45. The dual wavelength video camera device 44 is fitted with a dual wavelength sensitive image sensor 46 which is capable of simultaneously capturing separate images at two different optical wavelengths. The two wavelengths are chosen to be the same as the two wavelengths employed in the dual wavelength pointing device 42. The image sensor may be formed using a single multi-wavelength sensitive CCD device or may be formed using two monochrome CCD cameras. An arrangement using two monochrome CCD cameras is shown in Figure 13. Figure 13 comprises a primary monochrome CCD camera 47, a primary infrared wavelength filter 48, a secondary monochrome CCD camera 49, a secondary infrared wavelength filter 50, a beam-splitter 51 and video combiner 52. The beam-splitter 51 is placed in the incoming light path so that it splits the incoming optical signal towards both primary monochrome CCD camera 47 and secondary monochrome CCD camera 49. Primary monochrome CCD camera 47 is fitted with primary infrared wavelength filter 48 such that it only sees the image at wavelength x. Secondary monochrome CCD camera 49 is fitted with secondary infrared wavelength filter 50 such that it only sees the image at wavelength y.
Operation of the dual wavelength arrangement is best described with reference to the flow diagram shown in Figure 14. In step s10, a field image is captured which contains two image planes fxn and fyn (where x and y denote the two wavelengths at which the image plane is measured and n represents the field number). In step s11, a difference image (fxd) is calculated from the difference between the field image captured at wavelength x (fxn) and the previous field image captured at wavelength x (fxn-1). In step s12 the difference image (fxd) is transposed by the positional translation algorithm (pt), which has been calculated at start up, to correct for geometrical distortion. This transposition results in the corrected difference image (fxdt). In step s13 the corrected difference image (fxdt) is searched for any image artefacts concurrent with a modulated pointed spot. In step s14 the detected spot is assessed to calculate its centre point within the image by estimating its centre of gravity (COG). In step s15 the position of the spot centre point is fed into the input of the digital filter algorithm, the output of which is then used to position the computer cursor in step s16. Consequently, in the projected image produced by video projector 2 the mouse cursor appears at the position where the presenter is pointing. In step s17, two images are captured relating to the next field event, fxn+1 and fyn+1. In step s18, a difference image (fyd) is calculated using the field image captured at wavelength y (fyn+1) and the previous field image captured at wavelength y (fyn). In step s19 the image is transposed, in step s20 the spot position is detected, in step s21 the spot centre point is calculated, in step s22 the resultant position is digitally filtered and in step s23 it is used to reposition the computer cursor. Consequently, in this alternative arrangement a cursor position is calculated for every field capture, as opposed to every other field capture with the single wavelength embodiment.
Embodiment 1d

Figure 15 shows a fourth embodiment of the present invention, having an alternative arrangement for the hand held pointer device 5 shown in Figure 2. In Figure 15 the spot producing light emitter 19 is replaced by a light source that can produce a variable pattern. In this arrangement the output of the pointing device produces a predetermined pattern on the projection screen instead of a spot. The pattern produced by the variable pattern IR light source 53 can be electronically varied between two preset patterns. The pattern modulated infrared output is represented in Figure 15 by 54. This variation is performed in synchronization with the field synchronization signal transmitted from the video camera device.
These patterns are chosen to be suitably different such that when one pattern image is subtracted from the other by the differencing calculation, the centre point of the pattern can still be determined. Examples of such patterns are shown in Figure 16, with 55 and 56 depicting one possible image pair and 57 and 58 depicting another.
Operation of the alternative pattern modulated arrangement is best described with reference to the flow diagram shown in Figure 17. In step s25, a field image fn is captured (where n represents the field number). In step s26, a difference image (fd) is calculated from the current field image (fn) and the previous field image (fn-1). In step s27 the difference image (fd) is transposed by the positional translation algorithm (pt), which has been calculated at start up, to correct for geometrical distortion. This transposition results in the corrected difference image (fdt). In step s28 the corrected difference image (fdt) is searched for any image artefacts concurrent with a modulated pointed pattern. The detected pattern difference is then assessed in step s29 to calculate its centre point within the image by estimating its centre of gravity (COG).
In step s30 the position of the pattern difference centre point is fed into the input of the digital filter algorithm, the output of which is then used to position the computer cursor in step s31. Consequently, in the projected image produced by video projector 2 the mouse cursor appears at the position where the presenter is pointing. In step s32 the field arrangement of the modulated pattern is assessed to determine the state of any user operated buttons. If a pattern modulation scheme is employed which uses non-circularly-symmetric patterns then the pattern difference image may also be assessed in step s33 to determine the rotation angle of the pointing device. The algorithm is then repeated for the next captured field. This approach allows a pointer position to be
obtained for every field capture.
Embodiment 2

Figure 18 shows an alternative embodiment to that shown in Figure 1. Referring to Figure 18, the pointing system according to the alternative embodiment of the invention comprises a computer 69, a video capture port RF receiver 60, a video projector 61, a projection screen 62, a hand-held video camera device 63 and an IR target projection device 64. The video projector 61 projects the presentation slides 65 generated by the computer 69 onto the projection screen 62. The IR target projection device 64, placed on top of the video projector 61 (or alternatively inside it), projects an IR target onto the same projection screen 62. A presenter 66 views the projected presentation slides 65 and directs the hand-held video camera device 63 at the projection screen 62.
The hand-held video camera device 63, video capture port RF receiver 60 and IR target projection device 64 are described in more detail with reference to Figure 19.
The hand-held video camera device 63 comprises a housing 67, an image sensor 68, an RF transmitter 69 and an aerial 70. The IR target projection device 64 comprises a body 71, an aerial 72, an RF receiver 73, a video synchronization timing extraction device 74, modulator logic 75 and an IR target generating source 76. The video capture port RF receiver 60 comprises an RF receiver 77 and an aerial 78.
The scene viewed by image sensor 68 is viewed through infrared pass filter 79. The video output signal of image sensor 68 is connected to the RF transmitter 69, and transmitted via the aerial 70. The transmitted RF video signal is received by both the aerial 72 fitted to IR target projection device 64 and the aerial 78 fitted to the video capture port RF receiver 60. The RF signal received by aerial 72 is decoded by the RF demodulator 80 and the extracted video signal passed to the video synchronization timing extraction device 74, which extracts field synchronization timing from the video signal. The extracted field synchronization timing signal is applied through the modulator logic 75 to modulate the output of the IR target generating source 76. The output of the IR target generating source 76 is projected onto the projection screen 62.
The signal received by aerial 78 fitted to the video capture port RF receiver 60 is decoded by the RF demodulator 80 and the video signal passed to the video capture port of the computer 69, which is fitted with a video capture card.
The extracted field synchronization timing signal is applied to IR target generating source 76 so that a received signal corresponding to an odd field capture of the hand held video camera device 63 causes the light output of the IR target generating source 76 to turn on. Conversely, the subsequent received signal corresponding to an even field capture of the hand-held camera device 63 causes the light output of the IR target generating source 76 to turn off. Consequently, in the odd video field image captured by the hand-held video camera, the IR target is present. This odd video field image is shown in Figure 20. Correspondingly, in the even video field image captured by the hand-held video camera, the IR target is absent. This even video field image is shown in Figure 21. The transmitted video image comprising both odd and even fields is received by the video capture port RF receiver 60 and subsequently processed by the computer 69 to produce a field difference image, as shown in Figure 22.
This process is best described with reference to the flow diagram shown in Figure 23.
In step s35 the full frame composed of both odd (f1) and even (f2) fields has been captured by the computer 69. In step s36 a difference image (fd) is calculated from the subtraction of f1 from f2. In step s37 the difference image (fd) is transposed by the positional translation algorithm (pt), which has been calculated at start up, to correct for image distortion. This transposition results in the corrected difference image (fdt).
In step s38 the corrected difference image (fdt) is searched for any image artefacts concurrent with the modulated projected target. In step s39 the centre point of the projected target and its orientation are calculated. In step s40, the position and orientation of the projected target within the captured image are used to determine the pointing position of the hand-held camera. In step s41, the calculated pointing position of the hand-held camera is fed into the input of the digital filter algorithm, the output of which is then used to position the computer cursor. Consequently, in the projected image produced by video projector 61 the mouse cursor appears at the position where the presenter is pointing.
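A sketch of the centre-and-orientation calculation of steps s39 and s40, using image moments; the moment-based orientation formula is a standard technique assumed here for illustration (it requires a non-circularly-symmetric target) rather than a method spelt out in the disclosure.

```python
import numpy as np

def target_pose(diff: np.ndarray) -> tuple[float, float, float]:
    """Return (x, y, angle in radians) of the target in a difference image."""
    total = diff.sum()
    if total == 0:
        raise ValueError("no target present in the difference image")
    rows, cols = np.indices(diff.shape)
    cy = (rows * diff).sum() / total
    cx = (cols * diff).sum() / total
    # Second central moments give the target's principal axis.
    mu20 = (((cols - cx) ** 2) * diff).sum()
    mu02 = (((rows - cy) ** 2) * diff).sum()
    mu11 = ((cols - cx) * (rows - cy) * diff).sum()
    angle = 0.5 * np.arctan2(2.0 * mu11, mu20 - mu02)
    return cx, cy, angle
```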
Embodiment 2b

Figure 24 shows an alternative arrangement to the single wavelength target generator and monochrome CCD camera set up shown in Figure 19. Unlike Figure 19, the alternative embodiment shown in Figure 24 employs a dual wavelength target generator 81 and dual wavelength sensitive image sensor 82. The dual wavelength target generator 81 is capable of alternating the wavelength of the projected target between two IR wavelengths (wavelength x and y). The modulated dual wavelength target output is represented in Figure 24 by 83. This alternation is performed in synchronization with the field synchronization signal transmitted from the hand-held video camera device. The dual wavelength sensitive image sensor 82 is capable of simultaneously capturing separate images at two different optical wavelengths (wavelength x and y).
A flow diagram of the image processing algorithm for the dual-wavelength target embodiment is shown in Figure 25. In step s43, a field image is captured which contains two image planes fxn and fyn (where x and y denote the two wavelengths at which the image plane is measured and n represents the field number). In step s44, a difference image (fxd) is calculated from the difference between the field image captured at wavelength x (fxn) and the previous field image captured at wavelength x (fxn-1). In step s45 the difference image (fxd) is transposed by the positional translation algorithm (pt), which has been calculated at start up, to correct for geometrical distortion. This transposition results in the corrected difference image (fxdt). In step s46 the corrected difference image (fxdt) is searched for any image artefacts concurrent with a modulated IR target. In step s47 the centre point and orientation of the modulated IR target are calculated and used to determine the pointed position of the hand-held camera. In step s48 the calculated pointed camera position is fed into the digital filter algorithm, the output of which is then used to position the computer cursor in step s49. In step s50 the next field images are captured (fxn+1 and fyn+1). In step s51, a difference image (fyd) is calculated using the field image captured at wavelength y (fyn+1) and the previous field image captured at wavelength y (fyn). In step s52 the image is transposed, in step s53 the location of the modulated target is determined, in step s54 the calculated centre point and orientation of the target are used to determine the pointed position of the camera, in step s55 the calculated position and orientation are digitally filtered and then used to reposition the computer cursor in step s56. This alternative arrangement allows the system to calculate the pointed direction of the hand-held camera for each field occurrence.

Claims (1)

    1. A pointer position detection system, comprising: an optoelectronic detector for detecting an image; and an optical device for producing a modulated optical beam, wherein the modulation applied to the optical beam is derived from an image capture event of the optoelectronic detector.
    2. A system according to claim 1, wherein the optoelectronic detector comprises a video camera.
    3. A system according to claim 2, wherein the image capture event is a frame synchronization event of the video camera.
    4. A system according to claim 2, wherein the image capture event is a field synchronization event of the video camera.
    5. A system according to claim 1, 2, 3 or 4, wherein the occurrence of the image capture event is transmitted remotely to the optical device.
    6. A system according to claim 5, wherein the occurrence of the image capture event is transmitted wirelessly via an electromagnetic signal.
    7. A system according to any one of claims 1 to 6, wherein the optical device has one or more buttons or operable fixtures and the modulation applied to the optical beam is also derived from the state of the buttons or operable fixtures.
    8. A system according to any one of claims 1 to 7 wherein the optical beam comprises an infra-red beam.
    9. A system according to any one of claims 1 to 8 further comprising a means for generating a video image.
    10. A system according to any one of claims 1 to 9 further comprising a means for displaying a video image.
    11. A method of detecting an optical mark, comprising the steps of directing a modulated optical emission device at a scene, to produce an optical mark, and detecting the mark using an optoelectronic detector, wherein the modulation applied to the optical emission device is derived from an image capture event of the optoelectronic detector.
    12. A method according to claim 11, wherein the step of detecting the mark includes image processing of the output of the optoelectronic detector.
    13. A method according to claim 12, wherein the image processing includes calculating a difference between images captured by the detector over a time interval.
    14. A method according to claim 13, wherein the difference between images is a difference in intensity measured at one or more wavelengths.
    15. A method according to any one of claims 11 to 14, wherein the optoelectronic detector is a video camera.
    16. A method according to claim 15, wherein the image capture event of the optoelectronic detector is a video frame synchronization event.
    17. A method according to claim 15, wherein the image capture event of the optoelectronic detector is a video field synchronization event.
    18. A method according to any one of claims 11 to 17, wherein the occurrence of the image capture event is transmitted remotely to the optical emitter.
    19. A method according to claim 18, wherein the occurrence of the image capture event is transmitted wirelessly via an electromagnetic signal.
    20. A method according to any one of claims 11 to 19 where the scene contains a video image.
    21. A method according to claim 20 where the video image contains a visible marker the position of which is derived from the detected position of the optical mark.
    22. A method according to claim 20 where the video image contains a visible marker whose position is derived from the detected position of the optical mark which has been digitally filtered, averaged or smoothed.
    23. A method according to any one of claims 11 to 22 wherein the optical emission device has one or more buttons or operable fixtures and the modulation applied to the optical emission is also derived from the state of the buttons or operable fixtures.

    24. A method according to claim 23 where the output of the image processing is also used to determine the state of the buttons or operable fixtures.
    25. A method according to claim 24 where the state of the buttons or operable fixtures determined by the output of the image processing is used to invoke an event.
    26. A pointer position detection system substantially as described herein with reference to the accompanying drawings.
GB0425660A 2003-11-22 2004-11-22 Pointer position detection Expired - Fee Related GB2408326B (en)

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
GB0327251A GB0327251D0 (en) 2003-11-22 2003-11-22 Pointer position detection

Publications (3)

Publication Number Publication Date
GB0425660D0 GB0425660D0 (en) 2004-12-22
GB2408326A true GB2408326A (en) 2005-05-25
GB2408326B GB2408326B (en) 2006-04-19

Family

ID=29764310

Family Applications (2)

Application Number Title Priority Date Filing Date
GB0327251A Ceased GB0327251D0 (en) 2003-11-22 2003-11-22 Pointer position detection
GB0425660A Expired - Fee Related GB2408326B (en) 2003-11-22 2004-11-22 Pointer position detection

Family Applications Before (1)

Application Number Title Priority Date Filing Date
GB0327251A Ceased GB0327251D0 (en) 2003-11-22 2003-11-22 Pointer position detection

Country Status (1)

Country Link
GB (2) GB0327251D0 (en)

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US4893922A (en) * 1986-07-09 1990-01-16 Precitronic Gesellschaft Fur Feinmechanik Und Electronic Mbh Measurement system and measurement method
JPH104228A (en) * 1996-06-14 1998-01-06 Nec Corp Laser pointer
US6324296B1 (en) * 1997-12-04 2001-11-27 Phasespace, Inc. Distributed-processing motion tracking system for tracking individually modulated light points
US6608688B1 (en) * 1998-04-03 2003-08-19 Image Guided Technologies, Inc. Wireless optical instrument for position measurement and method of use therefor

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
GB2445012A (en) * 2006-05-25 2008-06-25 Everest Display Inc Presentation system with image capturing projector
GB2442398A (en) * 2006-06-29 2008-04-02 Spiral Scratch Ltd Control device
EP4131978A4 (en) * 2020-04-30 2023-08-30 Huawei Technologies Co., Ltd. Pointing remote control method and system

Also Published As

Publication number Publication date
GB2408326B (en) 2006-04-19
GB0327251D0 (en) 2003-12-24
GB0425660D0 (en) 2004-12-22


Legal Events

Date Code Title Description
PCNP Patent ceased through non-payment of renewal fee

Effective date: 20131122