GB2524247A - Control of data processing - Google Patents

Control of data processing

Info

Publication number
GB2524247A
GB2524247A
Authority
GB
United Kingdom
Prior art keywords
user
thermal
temperature
detection
touch
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
GB1404729.4A
Other versions
GB201404729D0 (en)
GB2524247B (en)
Inventor
Stephen Andrew Humphries
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Sony Interactive Entertainment Europe Ltd
Original Assignee
Sony Computer Entertainment Europe Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Sony Computer Entertainment Europe Ltd filed Critical Sony Computer Entertainment Europe Ltd
Priority to GB1404729.4A
Publication of GB201404729D0
Publication of GB2524247A
Application granted
Publication of GB2524247B
Legal status: Active

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/002 Specific input/output arrangements not covered by G06F3/01 - G06F3/16
    • G06F3/005 Input arrangements through a video camera
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011 Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G06F3/017 Gesture based interaction, e.g. based on a set of recognized hand gestures
    • G06F3/03 Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/041 Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
    • G06F3/042 Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means by opto-electronic means

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Multimedia (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

Capturing successive thermal images via a thermal camera, detecting a user action from the thermal images and controlling a data processing operation in response to the detection. The user action may be a user touching a surface, and the detection comprises detecting from the thermal images a change in the temperature of the surface at a detection position from an initial temperature to a temperature higher than the initial temperature. The detecting step may detect user movement while touching the surface along a path larger than the detection position associated with a single touch, so that it detects heat residue left by a finger touch. The detected user actions may be filtered so that they decay more slowly than a rate of decay associated with the cooling of the surface. The surface may be actively cooled in response to detection of a user's touch. The thermal camera may be stereoscopic and may be associated with a visible light camera.

Description

CONTROL OF DATA PROCESSING
This invention relates to the control of data processing.
It is known to use touch-sensitive controls in order to allow user control of data processing apparatus. These may be embodied as a so-called touch screen in which they are combined with a display device, or as separate stand-alone touch-sensitive input devices.
However, not all data processing apparatus is operable using a touch-sensitive input device.
Various aspects and features of the present invention are defined in the appended claims and within the text of the accompanying description and include at least a method of data processing, a data processing apparatus and computer software.
Embodiments of the invention will now be described, by way of example only, with reference to the accompanying drawings, in which:
Figure 1 schematically illustrates a computer games machine with an associated thermal camera;
Figure 2 schematically illustrates a part of the arrangement of Figure 1 in more detail;
Figure 3 schematically illustrates a filter;
Figure 4 schematically illustrates a user touching a surface;
Figure 5 schematically illustrates the decay of the thermal signature corresponding to a single touch;
Figure 6 schematically illustrates the thermal signature associated with a linear touching motion;
Figure 7 is a schematic flowchart illustrating a calibration process;
Figure 8 is a schematic flowchart illustrating a touch detection process;
Figure 9 is a schematic flowchart illustrating a shape detection process;
Figures 10a and 10b schematically illustrate the association between a display screen image and the field of view of a thermal camera;
Figure 11 schematically illustrates a combination of a visible light camera and a thermal camera;
Figure 12 is a schematic flowchart illustrating a control technique;
Figure 13 schematically illustrates a portable entertainment device;
Figure 14 schematically illustrates the field of view of a thermal camera;
Figure 15 schematically illustrates a stereoscopic thermal camera;
Figure 16 schematically illustrates a cooled touch surface; and
Figure 17 schematically illustrates the internal structure of a computer games machine.
Referring now to the drawings, Figure 1 schematically illustrates a computer games machine 10 with an associated thermal camera 20.
A thermal camera is a device for capturing an image, in a generally similar way to a conventional (visible light) camera except that the thermal camera is substantially insensitive to visible light in the 450-750 nm range, but is responsive to infrared radiation, for example electromagnetic radiation having a wavelength as long as (say) 14,000 nm. Thermal cameras are currently available with wavelength responses appropriate to various applications such as night vision, meteorology, non-invasive testing and the like. A type of thermal camera which is particularly relevant to the present embodiments is a thermal camera which detects heat in the temperature range associated with a domestic room environment containing a living human or animal body, which is to say about 0° C up to about 40° C. The thermal camera can be referred to as a forward looking infrared camera. In some embodiments, it is in a known (or at least a substantially constant) position and orientation. The position and orientation can be fixed with respect to, say, a head mountable display (HMD), for example by mounting the thermal camera in a forward-viewing position with respect to the HMD.
A thermal camera provides a pixel-based output in which a pixel value at a particular pixel position is associated with a detected temperature at that image position. For display purposes, the detected temperatures are sometimes associated with false colours, for example mapping brighter colours to higher temperatures and/or moving towards the red/yellow end of the visible colour spectrum to indicate higher temperatures. However, this is simply an artificial mapping used to assist a human user in making sense of an image acquired by a thermal camera. The thermal camera 20 in the present embodiments does not require the use of false colour mapping, but equally the embodiments would work using a camera 20 which provided false colour mapping.
The thermal camera 20 provides an input to the games machine 10, for example for the control of operations of the games machine 10. Some of the internal operations of the games machine 10 will be discussed below with reference to Figure 17, but at this stage in the description it is sufficient to describe the games machine 10 as a general-purpose data processing device capable of receiving and/or processing camera data as an input, and optionally having other input devices (such as games controllers, keyboards, computer mice and the like) and one or more output devices such as a display (not shown) or the like. It is noted that although the embodiments are described with respect to a games machine, this is just an example of broader data processing technology and the present disclosure is applicable to other types of data processing systems such as personal computers, tablet computers, mobile telephones and the like.
In general terms, in at least some embodiments, the images detected by the camera 20 are processed to provide control inputs to the games machine 10 which may be similar to the type of input provided by a mouse or touchscreen control. Techniques for achieving this will be described.
Figure 2 schematically illustrates a part of the arrangement of Figure 1 in more detail. It will be understood that many different functions may be carried out by the games machine 10, but a subset of those functions relevant to the present technique will be described.
Signals from the thermal camera 20 are passed to a position detector 30. The position detector 30 compares the images captured by the thermal camera 20 with calibration or baseline image data, the derivation of which will be described further below. From the comparison, the position detector 30 detects image positions within the captured images which correspond to areas which are hotter than the corresponding areas of the calibration or baseline image.
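By way of illustration, the comparison performed by the position detector 30 might be sketched as follows; the noise margin value and the function name are illustrative assumptions rather than details from the description:

```python
import numpy as np

def detect_hot_positions(frame, baseline, margin=0.5):
    """Report image positions that are hotter than the stored baseline.

    frame, baseline: 2-D arrays of detected temperatures (degrees C).
    margin: minimum rise (degrees C, an assumed value) treated as a genuine
            change, to reject sensor noise.
    Returns a list of (x, y) coordinates and the temperatures at them.
    """
    hotter = frame > (baseline + margin)
    ys, xs = np.nonzero(hotter)
    return list(zip(xs, ys)), frame[hotter]
```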
Data defining the detected positions (for example, (x,y) coordinates within the captured thermal image) and associated detected temperatures is passed to a filter 40 which, under the control of filter parameters provided by an analyser 50, applies a temporal filtering operation to the position data. This operation will be discussed further below.
The analyser 50 operates with respect to the filtered position data and in response to context data 60 to be discussed further below, to analyse the nature of the detected positions and generate a control output 70 to control one or more operations of the games machine 10.
Figure 3 schematically illustrates a filter suitable for use as the filter 40. A purpose of the filter 40 is to allow the analyser to lengthen the time for which a detected position is retained within the position data. This is significant in situations where the analyser 50 is attempting to detect the user forming a line or a shape such as a box by movements of the user's finger or hand. As discussed below, a detection technique involves detecting the way in which a surface which the user has touched heats up in response to that touch. The filter is used here because the natural cooling of the touched surface may take place more quickly than the time taken by the user to draw a complete shape such as a bounding box, making it difficult for an analysis technique to detect the presence of a drawn bounding box (because there will not be a single occasion at which the entire bounding box is present in the captured thermal images). By providing a temporal filtering arrangement (the filter 40) to provide a sustain function to the detected position data, detected positions corresponding to a shape such as a bounding box may be retained during the time period taken by the user to draw the bounding box.
The filter 40 operates under the control of filter parameters provided by the analyser 50.
If the analyser 50 is attempting to detect a single point touch, it may be the case that the analyser 50 provides filter parameters to the filter 40 which do not in fact lengthen or sustain the period over which a detected touch point is retained. In contrast, if the analyser 50 is attempting to detect a drawn shape such as a bounding box, it may provide filter parameters to the filter 40 which provide a sustain function having, for example, a decay time constant of (say) two seconds.
Turning to Figure 3, in a very schematic form, the filter 40 comprises an adder 80 which acts upon the detected temperature values at coordinates identified by the position detector 30, adding a current value of the detected temperature to a previous value multiplied by a parameter supplied as a filter parameter by the analyser 50. If the parameter is zero, no filtering effect is provided and the currently detected value is passed unchanged. If the parameter is greater than zero but less than one, the temperature value is sustained over a time period longer than the actual period over which that temperature value is detected. Accordingly, this provides the effect of filtering detected user actions so as to cause detected user actions relating to a user touch at a detection surface position to decay more slowly than a rate of decay associated with the cooling of the surface.
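A minimal sketch of such a filter, assuming a per-pixel recursion of the form y[t] = x[t] + a*y[t-1] and a frame-rate-based mapping from a decay time constant to the feedback parameter (the class name and the mapping are illustrative assumptions):

```python
import numpy as np

class SustainFilter:
    """Per-pixel temporal filter: output = current + a * previous output."""

    def __init__(self, shape, a=0.0):
        self.a = a                      # a = 0 passes values through unchanged
        self.state = np.zeros(shape)

    def set_decay(self, tau_seconds, frame_rate):
        # Map a decay time constant (e.g. two seconds) to a per-frame feedback
        # coefficient: the sustained value falls by a factor 1/e over tau.
        if tau_seconds > 0:
            self.a = float(np.exp(-1.0 / (tau_seconds * frame_rate)))
        else:
            self.a = 0.0

    def apply(self, temperatures):
        self.state = temperatures + self.a * self.state
        return self.state
```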
Figure 4 schematically illustrates a user (in particular, a user's finger 100, though touches by a user's hand, finger, foot or the like could be detected) touching a surface 110.
Depending on the nature of the surface material, the surface will heat up in response to the touch. An area of elevated temperature 120 (relative to the previous temperature at that area) will be formed. Some characteristic features of the area 120 are that its temperature will generally be lower than body temperature but higher than a previous temperature at that area (and generally higher than the temperature of the remainder of the surface 110) and that once the finger is released or moved, the temperature of the area 120 will decay back to its previous value. The way in which this decay is exhibited will be described with reference to Figures 5 and 6 below. Both of Figures 5 and 6 provide a succession of schematic images as captured by the thermal camera 20 over a time period, with time running from the top to the bottom of the respective diagram.
Figure 5 schematically illustrates the decay of the thermal signature corresponding to a single touch. The area 120 is shown initially at an elevated temperature. Note that depending on the relative position of the thermal camera 20 and the user's hand, the area 120 may not be detectable by the camera until the user has moved his finger 100 out of the way. In the second drawing of Figure 5, the area 120 is reducing in temperature until, in the third drawing, it has returned to the original temperature at that image position.
Figure 6 schematically illustrates the thermal signature associated with a linear touching motion. Here, in the first drawing, an initial touch area 130 is detected. As the user moves the user's finger along an example horizontal line (relative to the thermal image orientation), the hottest touch area moves with the user's finger but, because of either or both of (a) the thermal time constant of the material of the surface, corresponding to the time taken for the surface to cool down naturally; and (b) the sustain action of the filter 40 as discussed above, previously touched areas may still be detected as a trail 140 behind the area 130. The analyser 50 is operable to detect a linear sweep, or in other words a touch along a line, by detecting such a trail connected with the currently touched area 130.
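One way to realise this trail detection is to group above-threshold pixels into connected regions and keep the region containing the hottest, currently touched, point. A sketch follows; the use of scipy and the handling of the threshold are illustrative assumptions:

```python
import numpy as np
from scipy import ndimage

def detect_sweep_trail(filtered, threshold):
    """Return a boolean mask of the trail connected to the current touch,
    or None if nothing above threshold is present."""
    touched = filtered > threshold
    labels, count = ndimage.label(touched)       # group touched pixels into blobs
    if count == 0:
        return None
    hottest = np.unravel_index(np.argmax(filtered), filtered.shape)
    trail_label = labels[hottest]                # blob containing the hottest point
    return (labels == trail_label) if trail_label else None
```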
Figure 7 is a schematic flowchart illustrating a calibration process. The process comprises a step 200 to enter the calibration mode, a step 210 at which an initial or minimum temperature is detected at each image position in images captured by the thermal camera 20, and a step 220 at which that detected temperature is stored as the calibration or baseline data corresponding to the captured thermal images.
Note that the calibration process could be carried out only once, for example at initialisation of the use of the thermal camera and/or in response to a detection (for example by a motion detector, not shown, associated with the camera 20) of motion of the camera 20, in which case each of the steps 200, 210, 220 would be carried out once and the step 210 would detect the initial temperature at each image position.
Alternatively, however, the calibration process could be continuously carried out by means of a looped operation of the steps 210, 220 (as indicated by the broken line in Figure 7), with the step 210 detecting a minimum temperature at each position, or in other words whether the currently detected temperature is lower than that held in the store as the baseline data. To do this, the step 210 can access the stored data and compare it with the currently detected data for each image position. If the currently detected temperature data indicates a lower temperature than the stored temperature data, then the stored temperature data can be replaced by the currently detected temperature data.
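In sketch form, the looped operation of the steps 210 and 220 reduces to a per-pixel running minimum (the function name is an illustrative assumption):

```python
import numpy as np

def update_baseline(baseline, frame):
    """One pass of the looped calibration: wherever the current frame is
    cooler than the stored baseline, adopt the cooler value (step 220)."""
    return np.minimum(baseline, frame)

# One-shot calibration stores the first captured frame:
#   baseline = first_frame.copy()
# Continuous calibration repeats, once per captured frame:
#   baseline = update_baseline(baseline, frame)
```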
Figure 8 is a schematic flowchart illustrating a touch detection process, as carried out by the analyser 50 for example. At a step 230 the analyser 50 sets a decay period associated with the filter 40, by setting the filter parameter data discussed above. For example, in respect of a touch detection process, the filter parameter may be zero such that there is no sustain function associated with the detected temperature data. At a step 240, the analyser 50 detects a positive change (that is to say, an increase) in detected temperature at a particular image point. For example, the analyser 50 could detect an increase in temperature to a temperature between the previous temperature (in the baseline data) at that point and body temperature. For example, the analyser 50 could detect an increase in temperature from the initial baseline temperature Ti to a temperature Td in the range: (Ti + 0.5(Tb - Ti)) < Td < (Tb - 0.2(Tb - Ti)) (where Tb is body temperature).
In other words, in order to meet the detection criteria, the detected temperature has to be at least halfway between the initial temperature and body temperature, but no more than 80% of the way between the initial temperature and body temperature. The reason for the upper limit is that the aim is to detect the after-effect of the body touching a surface rather than the body itself. Therefore, in this example, a lower threshold temperature in the inequality given above is higher than the initial temperature, and a higher threshold temperature is higher than the lower threshold, but lower than the body temperature of the user.
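Expressed directly in Python (a sketch; the default of 37 reflects the absolute body temperature value mentioned below):

```python
def touch_detected(td, ti, tb=37.0):
    """Test (Ti + 0.5*(Tb - Ti)) < Td < (Tb - 0.2*(Tb - Ti))."""
    lower = ti + 0.5 * (tb - ti)   # at least halfway towards body temperature
    upper = tb - 0.2 * (tb - ti)   # no more than 80% of the way there
    return lower < td < upper
```

For example, with an initial surface temperature of 20 degrees C and a body temperature of 37 degrees C, detected temperatures between 28.5 and 33.6 degrees C satisfy the window.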
Although an absolute value (of about 37°C) can be used as the body temperature, in other embodiments the body temperature can be taken as the highest detected temperature of any non-stationary (moving) object in the thermal images.
This therefore represents an example of detecting a user touching a surface, by detecting, from thermal images, a change in temperature of the surface at a detection position from an initial temperature to a temperature higher than the initial temperature.
In response to such a detection, at a step 250, a touch command is triggered corresponding to that image position. Techniques by which touch commands and other commands are applied will be discussed below.
Figure 9 is a schematic flowchart illustrating a shape detection process in which user movement, while touching the surface, is detected along a path larger than the detection position associated with a single touch. Here, the process steps are similar to those described with reference to Figure 8. At a step 260, a decay period corresponding to the operation of the filter 40 is set. In the case of the detection of a line or shape, a longer decay period corresponding to a non-zero filter parameter may be set, for example a decay time constant of (say) two seconds. At a step 270 a line or shape is detected using the techniques discussed in connection with Figure 6. At a step 280 the detected line or shape is interpreted as a drag, draw, box or other command depending upon the context data 60.
It is possible for the operations of Figures 8 and 9 to operate in parallel, for example with the steps of Figure 8 making use of unfiltered position data and the steps of Figure 9 making use of filtered position data.
The context data 60 can be used to define what the analyser 50 is expected to detect, if that is appropriate to the current data-processing situation. For example, if a display object has been selected and the system is expecting a drag or move operation, then the context data 60 could instruct the analyser 50 to carry out the operations of Figure 9. If a set of menu options is provided on-screen and so a point command is expected as a next data-processing operation, the context data 60 could instruct the analyser 50 to carry out the operations of Figure 8.
Accordingly, Figures 2, 8 and 9 provide examples of a method of data processing comprising: capturing successive thermal images of a scene; detecting a user action from the thermal images; and controlling a data processing operation in response to the detection.
Figures 10a and 10b schematically illustrate the association between a display screen image and the field of view of a thermal camera. In particular, Figure 10a provides a simple schematic example of a displayed menu having multiple options (in this case, two options, a one player mode and a two player mode). To activate the relevant options, the user is expected to provide a touching motion generally corresponding to the positions of touchpads 300, 310.
The games machine 10 is operable to map the displayed screen, or at least parts of it such as displayed positions, to thermal image positions within the field of view of the thermal camera. In some embodiments this could be a one-to-one mapping so that the whole of the display screen is mapped to the whole of the field of view of the thermal camera. However, this could be hard for the user to operate, because the user would be trying to execute touch commands by touching the appropriate area of a separate surface different to the display screen. It could be hard for the user to establish which position to touch in order to activate a particular function. Various techniques can alleviate this potential problem. One example, shown schematically in Figure 10b, is to enlarge the touch areas 320, 330 which the user may use to activate the touchpads 300, 310, so that less accuracy is required on the part of the user.
A second possibility is to provide a provisional indication on the display screen of where the user has touched, so that the user can determine whether the user has touched the correct control, possibly confirming it by re-touching that same position within a predetermined period of, for example, three seconds. A third possibility is to provide a cursor or other position indication on the display screen of Figure 10a corresponding to a detected position of the user's finger (with respect to the thermal images) before the user makes a touch. The cursor can be generated either using a visible camera to detect the position of the user's finger or by establishing the most extreme position of a moving object at body temperature (see above) in the thermal image. A fourth possibility is to provide markings such as a graticule on a touch surface intended for use with the present arrangements. It will be appreciated that two or more of the above techniques may be combined in any example system.
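By way of a sketch, the one-to-one mapping described above can be combined with the enlarged touch areas 320, 330 of Figure 10b as a simple hit test; the pad layout, the slack margin and the function names are illustrative assumptions:

```python
def to_screen(cam_xy, cam_size, screen_size):
    """One-to-one mapping of a thermal-image position to a screen position."""
    return (cam_xy[0] * screen_size[0] / cam_size[0],
            cam_xy[1] * screen_size[1] / cam_size[1])

def hit_test(screen_xy, pads, slack=40):
    """Return the name of the touchpad whose enlarged rectangle contains the
    point, in the manner of the touch areas 320, 330 of Figure 10b."""
    x, y = screen_xy
    for name, (left, top, right, bottom) in pads.items():
        if left - slack <= x <= right + slack and top - slack <= y <= bottom + slack:
            return name
    return None

# Example: two menu touchpads as in Figure 10a.
pads = {"one player": (100, 200, 300, 260), "two players": (100, 300, 300, 360)}
print(hit_test(to_screen((100, 160), (640, 480), (1280, 720)), pads))  # "one player"
```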
Figure 11 schematically illustrates a combination of a visible light camera 340 and the thermal camera 20. A combiner 350 combines the image data detected by the pair of cameras to form a composite output image having, for each of a set of image locations (or those image locations available in the thermal images), visible pixel data and associated temperature data.
This output can be used in various ones of the techniques discussed above, for example to allow the user's finger to be identified and displayed as a cursor in the arrangements of Figures 10a and 10b.
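Assuming the two cameras have been registered so that their images are pixel-aligned (a significant practical assumption), the combiner 350 might do no more than stack a temperature channel onto the visible pixel data:

```python
import numpy as np

def combine(visible_rgb, thermal):
    """Composite image: for each image location, visible pixel data (three
    channels) plus associated temperature data (a fourth channel)."""
    assert visible_rgb.shape[:2] == thermal.shape  # pre-registered, same resolution
    return np.dstack([visible_rgb, thermal])
```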
Note, however, that the use of visible light detection is not required according to the present disclosure. The arrangements described here could work in the dark or in low light levels.
Figure 12 is a schematic flowchart illustrating a control technique corresponding to the techniques described above. At a step 400, the field of view of the thermal camera 20 is associated with a displayed screen image. The various types of mapping discussed above may be used. At a step 410, a surface touch or other operation (such as a drag, draw or box operation) is detected using one of the techniques described above with reference to Figures 8 and 9. At a step 420, the detected touch or other operation is mapped to a command relevant to the currently displayed screen image.
The mapping of a detected touch or touch-movement to a command, in any of the present examples, is carried out in the same way as a mouse click (for a single touch) or a mouse click-and-drag (for a touch-movement) is used to control a data processing operation. In other words, the control output generated by the analyser 50 can simply mimic the control output generated by a mouse driver.
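In outline, and assuming a hypothetical detection format and event structure (neither is a real mouse-driver API), the control output 70 of the analyser 50 might be shaped as follows:

```python
from dataclasses import dataclass

@dataclass
class PointerEvent:
    kind: str   # "click" for a single touch, "drag" for a touch-movement
    x: int
    y: int

def to_pointer_event(action):
    """Translate an analyser detection (hypothetical dict format) into the
    kind of event a mouse driver would emit."""
    if action["type"] == "touch":
        x, y = action["pos"]
        return PointerEvent("click", x, y)
    x, y = action["path"][-1]            # a drag ends at the last path position
    return PointerEvent("drag", x, y)
```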
One possible use of these techniques is to allow legacy (previously sold) data-processing apparatus which was not touch sensitive to be made controllable using a user's touch. An example will now be described.
Figure 13 schematically illustrates a portable entertainment device 430, in this example a Sony ® PlayStation Portable ® entertainment device. This is used here as an example of a legacy device which was not touch sensitive and can act, in this context, as the games machine 10. Figure 14 schematically illustrates the field of view 440 of a thermal camera directed towards the entertainment device 430. A sub-area 450 within the field of view 440 represents the area corresponding to the display screen 460 of the entertainment device 430. The entertainment device 430 can detect the area 450 by various techniques. In one example technique, the user is asked to calibrate the alignment of the thermal camera by touching, in turn, various screen positions on the display screen 460. For example, displayed instructions may instruct the user to touch each screen corner in turn. In another example technique, a thermal signature corresponding to areas of the entertainment device 430 can be compared to thermal images captured by the thermal camera in order to determine the location of the display screen 460 in the field of view 440 of the thermal camera. In another possible technique, if the thermal camera is even slightly sensitive to visible light (or is associated with a visible camera as discussed with reference to Figure 11) then characteristic patterns may be displayed on the display screen 460 to allow those patterns to be identified in the captured images from the thermal camera 20. Of course, combinations of these techniques may be used.
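For the corner-touch calibration technique, the four detected corner touches fix a perspective mapping from thermal-image coordinates to display-screen coordinates. A sketch using OpenCV follows; the corner ordering, the use of cv2 and the helper name are illustrative assumptions:

```python
import numpy as np
import cv2

def thermal_to_screen_transform(corner_touches, screen_w, screen_h):
    """corner_touches: four touch positions in thermal-image coordinates, in
    the order top-left, top-right, bottom-right, bottom-left of the screen."""
    src = np.float32(corner_touches)
    dst = np.float32([[0, 0], [screen_w, 0], [screen_w, screen_h], [0, screen_h]])
    return cv2.getPerspectiveTransform(src, dst)

# Mapping a later detected touch into screen coordinates (480x272 being the
# display resolution of the entertainment device of Figure 13):
#   H = thermal_to_screen_transform(corners, 480, 272)
#   sx, sy = cv2.perspectiveTransform(np.float32([[touch_xy]]), H)[0, 0]
```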
Once the position 450 of the display screen 460 in the field of view 440 of the thermal camera 20 has been identified, the user's touch on the display screen 460 can be detected using the various techniques discussed above, and mapped to data-processing commands relevant to the entertainment device 430.
Figure 15 schematically illustrates a stereoscopic thermal camera arrangement comprising a pair of thermal cameras 20', 20" each of which provides image data to a processor 470 which combines the detected thermal image data into a 3-D thermal image having associated depth values. The arrangement of Figure 15 can be used in place of any of the usages of the thermal camera 20 discussed above. The depth data can assist in providing an indication of whether the user is touching a surface or the user's hand is simply near a surface.
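With depth values available, distinguishing a touch from a nearby hover reduces to comparing the fingertip depth against the surface depth at the same image position; in sketch form (the 1 cm tolerance is an illustrative assumption):

```python
def is_touching(finger_depth, surface_depth, tolerance=0.01):
    """Treat the finger as touching when its depth agrees with the surface
    depth to within a tolerance (here 1 cm, assumed), rather than merely
    being near the surface."""
    return abs(finger_depth - surface_depth) <= tolerance
```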
In the example discussed above, the natural cooling process was allowed to take place, so that once the user had touched a surface, the temperature of the surface would decay naturally back to its original value. That original value was in turn generally determined by ambient conditions. As an alternative approach, Figure 16 schematically illustrates an actively cooled touch surface 500 having an upper touch surface 510 and a cooling element 520 operable under the control of a controller 530 (which, in turn, is responsive to the analyser 50).
An example of the cooling element is a Peltier element acting as an example of a heat pump. A temperature sensor 540 can provide information to the controller 530 as to the current temperature of the surface 510 and/or the upper surface (as drawn) of the element 520, to allow the surface 510 temperature to be controlled to a desired range or value.
This arrangement has various advantageous features. It allows the touch surface temperature to be returned to a baseline value more quickly after a touch operation has been detected. So, for example, the analyser 50 could instruct the controller 530 to cool the surface 510 back to the baseline temperature as soon as the analyser 50 has successfully detected a touch operation. In turn, this allows a more rapid detection of successive touch operations to be made. Furthermore, the arrangement of Figure 16 allows the baseline temperature of the touch surface 510 to be set lower than the current ambient temperature, which could potentially allow more accurate detection of user touch in environments in which the ambient temperature is close to normal body temperature.
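In sketch form, the controller 530 can be as simple as an on/off loop with hysteresis around the baseline temperature; the hysteresis value and the class name are illustrative assumptions:

```python
class CoolingController:
    """Drive the cooling element until the sensed surface temperature
    returns to the baseline; on/off control with hysteresis."""

    def __init__(self, baseline, hysteresis=0.5):
        self.baseline = baseline
        self.hysteresis = hysteresis
        self.cooling = False

    def update(self, sensed_temperature):
        if sensed_temperature > self.baseline + self.hysteresis:
            self.cooling = True    # surface still warm after a touch: pump heat away
        elif sensed_temperature <= self.baseline:
            self.cooling = False   # back at (or below) baseline: stop
        return self.cooling
```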
The embodiments discussed above have related to detecting a user touching a surface, but similar techniques could be used simply to detect the presence of a user or to detect body movements or actions of the user.
Figure 17 schematically illustrates parts of the internal structure of a computer games machine such as the computer games machine 10 (which, as discussed, is an example of a general-purpose data-processing machine). Figure 17 illustrates a central processing unit (CPU) 600, a hard disk drive (HDD) 610, a graphics processing unit (GPU) 620, a random access memory (RAM) 630, a read-only memory (ROM) 640 and an interface 650, all connected to one another by a bus structure 660. The HDD 610 and the ROM 640 are examples of a machine-readable non-transitory storage medium. The interface 650 can provide an interface to the thermal camera 20, to other input devices, to a computer network such as the Internet, to a display device (not shown) and so on. Operations of the apparatus shown in Figure 17 to perform one or more of the operations described above are carried out by the CPU 600 and the GPU 620 under the control of appropriate computer software stored by the HDD 610, the RAM 630 and/or the ROM 640. It will be appreciated that such computer software, and the storage media (including the non-transitory machine-readable storage media) by which such software is provided or stored, are considered as embodiments of the present disclosure.
The arrangements discussed above have various technical advantages. They require a low power consumption, as they can operate with passive (unpowered) touch surfaces. They can allow legacy (non-touch) devices to operate under touch control. They can work in low light or darkness.

Claims (19)

  1. A method of data processing comprising: capturing successive thermal images of a scene; detecting a user action from the thermal images; and controlling a data processing operation in response to the detection.
  2. A method according to claim 1, in which: the user action is the user touching a surface; and the detecting step comprises detecting, from the thermal images, a change in temperature of the surface at a detection position from an initial temperature to a temperature higher than the initial temperature.
  3. A method according to claim 2, in which the detecting step comprises detecting a surface temperature in a range between a lower threshold and a higher threshold; the lower threshold being higher than the initial temperature; and the higher threshold being higher than the lower threshold, but lower than the body temperature of the user.
  4. A method according to claim 3, comprising the step of detecting the body temperature of the user from the thermal images as the highest temperature of any moving object in the thermal images.
  5. A method according to any one of claims 2 to 4, in which the detecting step comprises detecting user movement, while touching the surface, along a path larger than the detection position associated with a single touch.
  6. A method according to claim 5, in which the detecting step comprises filtering the detected user actions so as to cause detected user actions relating to a user touch at a detection surface position to decay more slowly than a rate of decay associated with the cooling of the surface.
  7. A method according to any one of claims 2 to 6, comprising the step of actively cooling the surface in response to detection of a user touch.
  8. A method according to any one of the preceding claims, comprising the step of mapping image positions within the thermal images to display positions on displayed images.
  9. A method according to claim 8, comprising the step of displaying, on the displayed images, an indication of the position of a user with respect to the thermal images.
  10. A method of data processing, the method being substantially as hereinbefore described with reference to the accompanying drawings.
  11. Computer software which, when executed by a computer, causes the computer to carry out the method of any one of the preceding claims.
  12. A machine-readable non-transitory storage medium which stores computer software according to claim 11.
  13. Data processing apparatus comprising: a detector, configured in respect of a captured thermal image of a scene, to detect a user action from the thermal image; and a controller configured to control a data processing operation in response to the detection.
  14. Apparatus according to claim 13, in which the apparatus comprises a computer games machine.
  15. Apparatus according to claim 13 or claim 14, comprising a thermal camera configured to capture the thermal image.
  16. Apparatus according to claim 15, in which the thermal camera is a stereoscopic thermal camera.
  17. Apparatus according to claim 15 or claim 16, comprising a visible light camera associated with the thermal camera.
  18. Apparatus according to any one of claims 13 to 17, comprising: a detection surface, the detector being operable to detect a user touch on the detection surface; and a cooling arrangement operable to cool the detection surface.
  19. Apparatus according to claim 18, in which the cooling arrangement is responsive to the detector so as to cool the detection surface in response to detection, by the detector, of a user action with respect to the surface.
GB1404729.4A (priority date 2014-03-17, filing date 2014-03-17): Control of data processing. Active; granted as GB2524247B (en).

Priority Applications (1)

Application Number | Priority Date | Filing Date | Title
GB1404729.4A (GB2524247B (en)) | 2014-03-17 | 2014-03-17 | Control of data processing

Publications (3)

Publication Number Publication Date
GB201404729D0 GB201404729D0 (en) 2014-04-30
GB2524247A (en) 2015-09-23
GB2524247B GB2524247B (en) 2021-04-21

Family

Family ID: 50634895

Family Applications (1)

Application Number | Title | Priority Date | Filing Date
GB1404729.4A (Active; granted as GB2524247B (en)) | Control of data processing | 2014-03-17 | 2014-03-17

Country Status (1)

Country Link
GB (1) GB2524247B (en)

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20120075462A1 (en) * 2010-09-23 2012-03-29 Sony Computer Entertainment Inc. Blow tracking user interface system and method
US20120075463A1 (en) * 2010-09-23 2012-03-29 Sony Computer Entertainment Inc. User interface system and method using thermal imaging
US20120146903A1 (en) * 2010-12-08 2012-06-14 Omron Corporation Gesture recognition apparatus, gesture recognition method, control program, and recording medium

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
Saba, Elliot N. et al., "Dante vision: In-air and touch gesture sensing for natural surface interaction with combined depth and thermal cameras", 2012 IEEE International Conference on Emerging Signal Processing Applications (ESPA), 12 January 2012, pages 167-170 *
Larson, Eric et al., "HeatWave: Thermal Imaging for Surface User Interaction", Proceedings of the SIGCHI Conference on Human Factors in Computing Systems, 7 May 2011, pages 2565-2574 *

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111121239A (en) * 2018-11-01 2020-05-08 珠海格力电器股份有限公司 Intelligent control method and system for intelligent household appliance and intelligent household appliance
CN109782775A (en) * 2019-03-13 2019-05-21 刘乐 A kind of automobile obstacle avoidance system based on thermal image
CN109782775B (en) * 2019-03-13 2020-04-14 刘乐 Automobile obstacle avoidance system based on thermal image
