GB2526386A - Selection system - Google Patents

Selection system

Info

Publication number
GB2526386A
Authority
GB
United Kingdom
Prior art keywords
finger
user
screen
cursor
selection system
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Withdrawn
Application number
GB1416386.9A
Other versions
GB201416386D0 (en)
Inventor
Grant Morgan Phillips
Brian R A Wybrow
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Individual
Original Assignee
Individual
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Individual filed Critical Individual
Publication of GB201416386D0
Publication of GB2526386A
Legal status: Withdrawn

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/041Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
    • G06F3/042Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means by opto-electronic means
    • G06F3/0425Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means by opto-electronic means using a single imaging device like a video camera for tracking the absolute position of a single or a plurality of objects with respect to an imaged reference surface, e.g. video camera imaging a display or a projection screen, a table or a wall surface, on which a computer generated image is displayed or projected
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F1/00Details not covered by groups G06F3/00 - G06F13/00 and G06F21/00
    • G06F1/16Constructional details or arrangements
    • G06F1/1613Constructional details or arrangements for portable computers
    • G06F1/163Wearable computers, e.g. on a belt
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F1/00Details not covered by groups G06F3/00 - G06F13/00 and G06F21/00
    • G06F1/16Constructional details or arrangements
    • G06F1/1613Constructional details or arrangements for portable computers
    • G06F1/1633Constructional details or arrangements of portable computers not specific to the type of enclosures covered by groups G06F1/1615 - G06F1/1626
    • G06F1/1684Constructional details or arrangements related to integrated I/O peripherals not covered by groups G06F1/1635 - G06F1/1675
    • G06F1/169Constructional details or arrangements related to integrated I/O peripherals not covered by groups G06F1/1635 - G06F1/1675 the I/O peripheral being an integrated pointing device, e.g. trackball in the palm rest area, mini-joystick integrated between keyboard keys, touch pads or touch stripes
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G06F3/014Hand-worn input/output arrangements, e.g. data gloves
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03Arrangements for converting the position or the displacement of a member into a coded form
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F1/00Details not covered by groups G06F3/00 - G06F13/00 and G06F21/00
    • G06F1/16Constructional details or arrangements
    • G06F1/1613Constructional details or arrangements for portable computers
    • G06F1/1633Constructional details or arrangements of portable computers not specific to the type of enclosures covered by groups G06F1/1615 - G06F1/1626
    • G06F1/1637Details related to the display arrangement, including those related to the mounting of the display in the housing
    • G06F1/1654Details related to the display arrangement, including those related to the mounting of the display in the housing the display being detachable, e.g. for remote use

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Computer Hardware Design (AREA)
  • Multimedia (AREA)
  • Position Input By Displaying (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

Movement of a finger of the user, or of pointing equipment, implements the selection of a particular feature by causing a cursor to move in any chosen direction, until it lies within the recognisable boundaries of a target image representing the Application which the user desires to use. This may be applied in a mobile phone or Smart Watch, carried i) on the wrist, or ii) on, or by, other parts of the body; for conveniently accessing information contained within other devices, such as iPads (RTM), Tablets, Laptops and Personal Computers; and for conveniently accessing information contained within a pair of Smart Glasses. A single miniature camera may be used, together with a software program residing in a microcontroller held in the device, to rapidly take, and store, a series of digital photographs of the finger, and to cause the cursor to move as the finger, or pointing equipment, moves in any direction.

Description

SELECTION SYSTEM
This invention relates to a system which provides means for selection, by the user of the system, of information displayed to the user, either as an image on a visual display screen, or as an image presented directly to the eyes of the user, wherein, in the case of the use of a display screen, the device is a Mobile Phone, iPad, Tablet, Smart Watch, Laptop, or Personal Computer, and wherein, in the case of direct presentation of information to the eyes of the user, the device is a single Smart Glass, or a pair of Smart Glasses, held on the head of the user, wherein each device utilises a so-called Head-Up display, wherein the said system thus allows the user to access information such as so-called Applications and other information, which are available due to the operation of at least one microcontroller located inside the device.
With the ever-increasing tendency for mobile devices to become smaller, particularly in the case of watches, a limit is eventually reached when attempting to select a feature by use of a finger touching the screen of a device which utilises touch-screen technology, for example a wrist-watch, such that it becomes necessary to use pointing equipment, such as a stylus, because a finger is likely to be larger than any one image representing a selectable feature displayed on the screen of the device, and because the finger obscures objects on the screen, of which there are many.
Moreover, the fingernails can also present a problem, because they can either damage the screen, or prevent proper contact with the screen.
Although pointing equipment is available to supplement the use of the fingers, this equipment constitutes a part of such devices which the user would prefer not to have to carry and to have to use, and thus represents an inconvenience. It also gives rise to extra cost.
This problem, concerning selection, is particularly relevant to devices which are mounted on the wrist of the user, and is overcome by application of the principles of the present invention, wherein an accompanying advantage is that it also offers a more convenient means for accessing information on all devices which, at present, utilise touch-screen technology, the need for which is obviated in the application of the present invention.
Where the invention is applied to smart glasses, both hands are free, thereby allowing variation in the method of use.
According to the present invention, a selection system provides more convenient means, than are presently available, A) for selecting a feature displayed as an image on the visual display screen of a mobile device such as a mobile phone, or Smart Watch, wherein the said mobile phone, or Smart Watch, is either carried, i) on the wrist, or ii) on, or by, other parts of the body, B) for selecting a feature displayed as an image on the visual display screen of other devices, such as iPads, Tablets, Laptops, Personal Computers, and C) for direct presentation of information to the eyes of the user, where the device is a single Smart Glass, or a pair of Smart Glasses, held on the head of the user, wherein each device utilises a so-called Head-Up display, wherein for cases A, and B, the said invention does not require direct contact of a finger of the user, with the screen, or direct contact of pointing equipment such as a stylus pen, with the screen, and wherein for case, C, the user does not have to make direct contact with controls located on the body of the said Smart Glasses, wherein, in cases, A, B, and C, the method of operation of the said system, involves the user holding a finger, for example the fore-finger, or pointing equipment, in a range of possible locations in space, in relation to the position, in space, of the device, so that, when attempting to locate a particular, so-called, Application, held within the memory and operating system contained within the microcontroller inside the device, the user is able to navigate around the screen of the device, for cases A, and B, and is able to navigate around the virtual image presented to the user, for case C, wherein, for cases A, and B, the said Application is represented by a particular image which is visible on the screen of the said device, and wherein, for case C, the said Application is represented by a virtual image presented in front of the eyes of the user.
The said navigation around the screen is achieved by arranging for A) movement of a finger of the user of the invention, whether aided or unaided by auxiliary equipment, such as a camera-recognisable mark formed on a finger pocket attached to the finger, or by means of a camera-recognisable mark created on the finger, or B) movement of pointing equipment possessing camera-recognisable features, which is held by the user, to implement the selection of a particular feature displayed on the screen of the device, by causing a cursor which is displayed on the said screen, to move across the screen of the device, in any chosen direction, until it lies within the recognisable boundaries of a target image representing the Application which the user desires to use, wherein the cursor is initially made visible on the screen, by pressing a button mounted on the device.
With the cursor lying in this position, the user then rapidly moves the finger away from observation by the camera, and presses a button on the device, which then causes locking-on to the desired target image, so that the desired Application is then launched.
Alternatively, the rapid movement away from the screen, can, itself, trigger the locking-on to the target Application, in order to launch the Application.
The cursor is automatically turned off when the process of locking-on to the target has occurred, but will be seen again, in a prominent, parked position on the screen, when the Application has finished. Alternatively, the cursor can be made visible by pressing a button located on the device.
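The cursor lifecycle just described (made visible by a button press, locked onto a target when the finger is withdrawn or a button is pressed, then re-appearing parked once the Application finishes) can be summarised as a small state machine. The following Python sketch is illustrative only; the state names and the rapid_exit signal are assumptions, not details taken from the specification.

    from enum import Enum, auto

    class CursorState(Enum):
        HIDDEN = auto()   # cursor not shown
        ACTIVE = auto()   # cursor visible and following the finger
        LOCKED = auto()   # locked onto a target; Application launching
        PARKED = auto()   # Application finished; cursor shown in its parked position

    def step(state: CursorState, button: bool, rapid_exit: bool,
             over_target: bool, app_finished: bool) -> CursorState:
        """One update of a hypothetical cursor controller."""
        if state is CursorState.HIDDEN and button:
            return CursorState.ACTIVE          # button press makes the cursor visible
        if state is CursorState.ACTIVE and over_target and (button or rapid_exit):
            return CursorState.LOCKED          # lock on and launch the Application
        if state is CursorState.LOCKED and app_finished:
            return CursorState.PARKED          # cursor reappears, parked prominently
        return state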
Movement of the cursor is achieved by various means, which include i) the use of at least one miniature camera mounted on the device, or ii) at least one sensor mounted on the device, or a combination of both i) and ii), such that, for instance, a finger located over the back of the hand carrying a mobile wrist-watch, can cause the said cursor to move to the desired location so that it rests over a particular image.
Where a camera, or cameras, are utilised in the invention, their location and inclination will, in one variant of the invention, allow recognition of an in-focus image of the finger, previously taken and recorded, and utilised in a software program residing in the microcontroller of the device, so as to give rise to locking-on to the said cursor as a result of the operation of the software program.
Alternatively, a single miniature camera, together with a software program residing in a microcontroller held in the device, will rapidly take, and store, a series of digital photographs of the finger, or of the camera-recognisable surface of the pointing equipment, and thus cause the cursor to move as the finger, or pointing equipment, moves in any direction. This is similar to the way in which a digital mouse functions, but places the camera lens of the mouse in the device, and replaces the mouse mat with the surface of the finger, or the camera-recognisable surface of the pointing equipment, which is viewed by the camera. Depending on the sensitivity of the camera, an auxiliary source of light, such as a Light Emitting Diode (LED), is used to illuminate the surface of the finger, or the camera-recognisable surface of the pointing equipment, which is viewed by the camera. The auxiliary source of light will be particularly important under conditions of low ambient lighting.
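The optical-mouse analogy amounts to estimating the frame-to-frame shift of the finger's surface texture. A minimal sketch of one standard way to obtain such a displacement, phase correlation over two grayscale frames held as NumPy arrays, is given below; it illustrates the principle and is not the patent's own algorithm.

    import numpy as np

    def estimate_shift(prev: np.ndarray, curr: np.ndarray) -> tuple[int, int]:
        """Estimate the (dy, dx) displacement of the finger surface between
        two grayscale frames by phase correlation, much as an optical
        mouse sensor tracks the surface beneath it."""
        f0, f1 = np.fft.fft2(prev), np.fft.fft2(curr)
        cross = np.conj(f0) * f1
        cross /= np.abs(cross) + 1e-9          # keep phase information only
        corr = np.fft.ifft2(cross).real
        dy, dx = np.unravel_index(np.argmax(corr), corr.shape)
        # FFT indices are periodic: fold large values back to negative shifts.
        if dy > prev.shape[0] // 2:
            dy -= prev.shape[0]
        if dx > prev.shape[1] // 2:
            dx -= prev.shape[1]
        return int(dy), int(dx)

The cursor would then be advanced by a gain-scaled version of the estimated shift, exactly as a mouse driver scales sensor counts into pointer movement.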
The cursor moves in a plane which is parallel with that of the screen, if the screen is flat, and moves over a curved region, which is parallel with that of the screen, if the screen is curved. A further feature of the invention is that, by moving the finger in three-dimensional space, a particular layer of Applications can be selected from a range of layers, wherein each layer contains, for instance, a particular type of Application; for example, one layer can contain games, and another layer can contain cooking recipes, and so on, for a range of different Applications.
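The text does not say how the finger's distance maps to a layer; one plausible reading, sketched below, calibrates the finger's apparent size at each layer's depth and picks the nearest. The calibration numbers are invented for illustration.

    # Apparent finger width (pixels) calibrated at each layer's depth -- invented values.
    LAYER_CALIBRATION = [220.0, 160.0, 110.0]   # e.g. layer 0 = games, layer 1 = recipes

    def select_layer(observed_width_px: float) -> int:
        """Choose the Application layer whose calibrated apparent finger
        width is closest to the width currently seen by the camera."""
        return min(range(len(LAYER_CALIBRATION)),
                   key=lambda i: abs(LAYER_CALIBRATION[i] - observed_width_px))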
As stated, movement of the cursor is implemented by either i) movement of a finger of the user of the selection system, whether the finger is aided or unaided by means of auxiliary equipment, such as a camera-recognisable mark formed on a finger pocket attached to the finger, or is aided or unaided by means of a camera-recognisable mark created on the finger, or ii) movement, by the user, of pointing equipment possessing camera-recognisable features, wherein this movement gives rise to the selection of a particular feature displayed on the screen of the device, by causing a cursor which is displayed on the said screen, to move across the screen of the device, in any chosen direction, as the said finger, or the said pointing equipment moves, until the cursor lies within the recognisable boundaries of a target image displayed on the screen of the device, wherein the image represents the Application which the user desires to use, and wherein the cursor is initially made visible on the screen, by the pressing, by the user, of a button mounted on the device, and wherein the cursor disappears either automatically after a pre-settable time, or disappears as a result of a further pressing of the said button, wherein, after the said target image has been chosen by moving the cursor as described, the user rapidly moves the finger, or the pointing equipment, away from observation by a camera which is mounted on the said device, so that no further movement of the cursor can occur, and the user then presses a button on the device, which causes locking-on to the desired target image, so that the desired Application is then launched.
Alternatively, after the said target image has been chosen by moving the cursor as described, rapid movement of the said finger, or the said pointing equipment, away from the screen, itself, triggers the function of locking-on to the target Application, without the need for the pressing of a said button in order to launch the Application.
The cursor is then automatically turned off when the process of locking on to the target has occurred, wherein the cursor will be seen again, on the screen, in a prominent, parked position on the screen, when the Application has finished.
Another alternative is to arrange for the cursor to be made visible by pressing a button located on the device, and to be made invisible by a second pressing of the said button.
As an example of the method of operation of the selection system, a finger of the right hand of the user is, for instance, located over the back of the left hand which carries a mobile wrist-watch, and causes the said cursor to move to the desired location so that it rests over a particular image representing a desired Application.
As an alternative to the method of responding to the movement of the finger or pointing equipment, as described, the software program is arranged to only respond to movement of images which are recognised as those which have been pre-programmed into the microcontroller residing in the device containing the selection system, so that the equivalent of the recognition of a fingerprint of the user, or the equivalent of the pointing equipment "print", is operating, so that the user can ensure that the selection system cannot be accessed by other users unless their fingerprints have been previously recorded by the selection system, or unless these users have access to the pointing equipment which is recognisable by the selection system, wherein the said pointing device is able to generate each one of many camera-recognisable images which are selectable by the user, so that a further level of security is achievable in pointing equipment which is sold to, or can otherwise be made available to, potential users of the selection system.
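A minimal sketch of this fingerprint-like gate, assuming enrolled reference images and a normalised-correlation score; the threshold and helper names are assumptions rather than details taken from the patent.

    import numpy as np

    def similarity(candidate: np.ndarray, reference: np.ndarray) -> float:
        """Normalised cross-correlation of two equally sized grayscale
        images, in [-1, 1]; 1.0 indicates a perfect match."""
        a = candidate - candidate.mean()
        b = reference - reference.mean()
        denom = np.sqrt((a * a).sum() * (b * b).sum()) + 1e-9
        return float((a * b).sum() / denom)

    def is_authorised(frame: np.ndarray, enrolled: list[np.ndarray],
                      threshold: float = 0.8) -> bool:
        """Respond only to finger (or pointing-equipment) images previously
        recorded by the selection system."""
        return any(similarity(frame, ref) >= threshold for ref in enrolled)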
As an alternative to the use of a camera, the movement of a finger, or of pointing equipment, is responded to by means of sensors located in the device, wherein the device contains cursor control software, and wherein the cursor then moves in response to the movement of the finger, or to movement of the pointing equipment, wherein the said sensors detect, and respond to, the changing position in space, of the finger, or of the pointing equipment, as either moves in a plane which is parallel with that of the screen of the device, wherein the finger or pointing device moves in a fixed plane, within detectable limits in both cases.
One example of the use of sensors to control the movement of the cursor of the device is the use of directional heat sensors, such as Passive Infrared Sensors (PIRs), which are located on the device, wherein the sensors detect heat emanating from a finger, or from the functional part of the pointing equipment, wherein either is located at a pre-selectable distance from a particular face of the sensor, and in a pre-selectable direction with respect to that face, wherein the operation of a software program residing in the microcontroller situated within the device thereby provides three-dimensional control of the said cursor, based upon the corresponding response to the position of the said finger, or to the position of the said pointing equipment.
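One way several directional heat sensors could be combined into a direction estimate, assuming each reports a scalar intensity and has a known facing angle, is to weight each sensor's direction vector by its reading. This is an illustration of the principle only, not the patented method.

    import math

    def estimate_bearing(sensors: list[tuple[float, float]]) -> float:
        """Intensity-weighted mean direction of the heat source (the finger),
        given (facing angle in radians, intensity reading) per sensor."""
        x = sum(r * math.cos(a) for a, r in sensors)
        y = sum(r * math.sin(a) for a, r in sensors)
        return math.atan2(y, x)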
Another type of sensor is based upon the use of ultrasound radiation, wherein transmitters/sensors transmit and receive ultrasound radiation, wherein transmitted ultrasound radiation is reflected by the said finger, or by the functional part of the said pointing equipment, to an accompanying ultrasound sensor, wherein the time taken for the transmitted radiation to reach the said finger or the functional part of the said pointing equipment, and the time taken for the received radiation to reach the sensor, is processed by means of a software program residing in a microcontroller held in the device, wherein the use of two such transmitters/receivers will then allow three-dimensional detection of the position of the finger, or of the functional part of the said pointing equipment, together with the corresponding following of the movement of the finger by the cursor, according to the processing action of the software program.
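The time-of-flight arithmetic implied here is standard: halve the round trip to obtain each range, then intersect the range circles of two transceivers a known baseline apart. A sketch follows, reduced to two dimensions for brevity (the three-dimensional case adds a further transceiver).

    import math

    SPEED_OF_SOUND = 343.0  # m/s in air at roughly 20 degrees Celsius

    def echo_range(round_trip_s: float) -> float:
        """Range to the finger from one transceiver; the pulse travels out
        and back, so the one-way distance is half the total path."""
        return SPEED_OF_SOUND * round_trip_s / 2.0

    def locate(d1: float, d2: float, baseline: float) -> tuple[float, float]:
        """Intersect the range circles of transceivers placed at (0, 0) and
        (baseline, 0); returns the finger position (x, y) above the device."""
        x = (d1 ** 2 - d2 ** 2 + baseline ** 2) / (2.0 * baseline)
        y = math.sqrt(max(d1 ** 2 - x ** 2, 0.0))
        return x, y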
In yet another method of applying the principles of the selection system, a miniature camera mounted on the said device of the selection system is able to focus accurately on the said finger, or on the functional part of the said pointing equipment, wherein the use of two such cameras can be aided by software which requires both cameras to have an image of the finger, or of the mark, in focus at the same time, wherein each image will be unique for any particular position of the finger, and will thus identify where the finger is, in relation to the position of the two cameras, and will consequently cause the cursor to move to a unique, previously mapped, position.
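A common sharpness score that could implement the "both cameras in focus at once" condition is the variance of a Laplacian response; the sketch below assumes grayscale frames and an arbitrary threshold.

    import numpy as np

    def sharpness(gray: np.ndarray) -> float:
        """Variance of a 4-neighbour Laplacian: high when the finger lies
        in the camera's focal plane, low when the image is blurred."""
        g = gray.astype(float)
        lap = (-4.0 * g
               + np.roll(g, 1, 0) + np.roll(g, -1, 0)
               + np.roll(g, 1, 1) + np.roll(g, -1, 1))
        return float(lap.var())

    def finger_located(frame_a: np.ndarray, frame_b: np.ndarray,
                       threshold: float = 100.0) -> bool:
        """Both cameras must hold a sharp image simultaneously before the
        pair of views is mapped to a cursor position."""
        return sharpness(frame_a) > threshold and sharpness(frame_b) > threshold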
Another variant of the selection system utilises the characteristics of the image representing an Application to show the user where a said finger, or the functional part of the said pointing equipment, is taking them to on the screen, wherein the image representing the Application becomes lighter, or flashes on and off, when the software has detected that it is in the region required for implementation of selection of that Application.
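This highlight-instead-of-cursor variant reduces to a hit test between the tracked finger position and each icon's screen rectangle; a minimal sketch, with an invented Icon type:

    from dataclasses import dataclass

    @dataclass
    class Icon:
        x: int          # screen rectangle of the Application image
        y: int
        w: int
        h: int
        highlighted: bool = False

    def update_highlights(px: int, py: int, icons: list[Icon]) -> None:
        """Lighten (or flash) exactly the icon under the tracked position."""
        for icon in icons:
            icon.highlighted = (icon.x <= px < icon.x + icon.w and
                                icon.y <= py < icon.y + icon.h)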
Other variants of the selection system utilise, slider operated, or thumbwheel operated, parts, which cause a cursor to move in response to the movement by the user of the said system, of the said parts, wherein the said parts are conveniently mounted on the said device, and thereby allow the user to drive miniature mechanisms contained within the said device, wherein such mechanisms are miniature versions of potentiometer based devices used in joysticks, or are miniature versions of similar devices which utilise optical techniques, in order to implement movement of a screen cursor via the use of a software program residing in the microcontroller situated in the device.
It is pointed out, with reference to the foregoing, that each of the described variants is applicable to the use of Smart Glasses.
In order to describe the invention in more detail, reference will now be made to the accompanying diagrams, in which: Figure 1 shows three-dimensional views of the hands of the user, and of a device, and exemplifies a method of applying the principles of the invention.
Figure 2 shows a three-dimensional view of externally visible embodiments of the invention.
Figure 3 shows two-dimensional and three-dimensional views of externally visible embodiments of the invention.
Figure 4 shows a three-dimensional view of externally visible embodiments of the invention, in use.
Figure 5 shows a three-dimensional view of externally visible embodiments of the invention, in use.
Figure 6 shows a three-dimensional view of externally visible embodiments of the invention, in use.
Figure 7 shows a three-dimensional view of externally visible embodiments of the invention, in use.
With reference to Figure 1, which represents a three-dimensional view, the left hand, 1, of the user of the selection system, carries a mobile device, 2, possessing the operational features of the system, which is held on the wrist of the left hand by means of a strap, 3. The right hand, 4, of the user, is shown with its forefinger, 5, pointing towards the mobile device, 2, such that the end region, 6, of the forefinger, 5, can be monitored by means of sensors, 7, and 8, located on the side-face, SF, of the frame, F. The display screen, S, which lies within the inner boundaries of the frame, F, of the mobile device, 2, can be seen to contain two rectangles, 9, and 10, wherein rectangle, 9, represents an image of a computer Application, formed on the screen, and the smaller rectangle, 10, represents an image of the cursor of the mobile device, 2.
It can thus be seen that the combination of information received by sensors, 7, and 8, with the operation of a software program residing in the microcontroller contained within the mobile device, 2, will enable detection of the end region, 6, of the forefinger, 5, when the end region, 6, is at a distance from the two sensors which is pre-programmed into the microcontroller, whereupon the software program will thus lock on to the cursor, 10, and cause the cursor on the screen, S, to move about on the screen as the forefinger moves.
Design of the software program residing in the said microcontroller, will thus allow control of the position of the cursor, 10, such that movement of the forefinger in a particular direction, will cause the cursor to move accordingly, and eventually, to the Application represented by rectangle, 9.
The plane, HP, shown in the lower, three-dimensional diagram of Figure 1, is a plane which is a tangent to the curve, C1, which represents the top, outer edge of the curved frame, F, of the device, 2, which is shown in plan view in the upper diagram of Figure 1. The plane, VP, is at a right angle with respect to plane, HP, and the planes, HP, and VP, are shown in enlarged form, on the right hand side of the lower diagram in Figure 1, as also is curve, C1. Curve, C2, represents the top, outer edge of the curved frame, F, of the device, 2, at the opposite end of the frame, F.
With reference to Figure 2, which represents a three-dimensional view, the device, 2, described with reference to Figure 1, is shown again, in greater clarity, and with shading, but with only some of the features. Since these features have already been described with reference to Figure 1, they will not be described again.
With reference to Figure 3, which represents a three-dimensional view, this shows the is device, 2, again, but also shows, in the upper part of the diagram, for clarity in interpreting previous diagrams, a front profile view of the device, 2. Since these features have already been described with reference to Figure 1, they will not be described again.
With reference to Figure 4, which represents a three-dimensional view, the device, 2, is shown in relation to the right hand, 4, of the user, but without the left hand, so that the directional functionality of the sensors, 7, 8, can be seen more clearly in relation to the forefinger, 5, of the user. Since these features have already been described with reference to Figure 1, they will not be described again.
With reference to Figure 5, which represents a three-dimensional view, the device, 2, has a miniature camera, 11, located at the centre of the rear part of the surface of the frame, F, of the device, 2, with the camera, 11, circled, and also shown in the inset, enlarged diagram to the lower right of Figure 5.
It can thus be seen that, as the right hand, 4, of the user, moves to the left of the diagram, in the direction of the lines, L1 (lower) and L2 (upper), having arrows at each of their ends to show two possible directions of hand movement, the view of the region of the forefinger, 5, represented by the extreme points, P1, R1, changes to the view represented by the extreme points, P2, R2. This thus means that the camera, 11, can process, in rapid succession, a series of digital images of the lower surface of the forefinger, 5, whose front boundaries, as viewed in Figure 5, are represented, in profile, by the positions, P1, R1, and P2, R2. The system is thus able to cause the cursor, 10 (see the inset diagram to the lower right of Figure 5), to move to the position of the Application, 9, by responding to the change in the digital image seen by the camera, and processed by the software program. The smaller diagram of the device, 2, at the upper left of Figure 5, shows how the forefinger, 5, can be moving relative to the screen of the device, 2, when the device is turned through 90 degrees with respect to the orientation shown in the middle part of Figure 5, wherein the forefinger, 5, of the right hand, 4, would be moving in the same direction as is shown in Figure 5.
With reference to Figure 6, which represents a three-dimensional view, the device, 2, is shown turned through 90 degrees with respect to the orientation shown in Figure 5, whilst remaining in a plane which is a tangent to the curve, C1, and which touches the circle representing the circular boundary of the periphery of the camera lens of the camera, 11, at two points which are diametrically opposite one another, such that a straight line through them passes through the uppermost part of curve, C1. It can thus be seen that, as the forefinger, 5, of the right hand (not shown in the diagram for reasons of better clarity) moves across the region above the screen, S, to the left, the camera will obtain a view of the lower surface of the forefinger, 5, represented in profile by the points, P1, R1, as it moves to the position, P2, R2. Thus, as already described with reference to Figure 5, the camera, 11, can process, in rapid succession, a series of digital images of the lower surface of the forefinger, 5, whose boundaries, as viewed in Figure 6, are represented, in profile, by the positions, P1, R1, and P2, R2.
The smaller diagram at the top left of Figure 6, shows the device, 2, turned through an angle of 90 degrees, as already explained previously, with reference to Figure 5.
Whilst the above descriptions given with respect to Figures 5 and 6 effectively refer to the movement of the forefinger, 5, across the region above the screen, S, in two directions which are at 90 degrees with respect to one another, whilst in the same plane, with the device, 2, then remaining in the same orientation, the forefinger, 5, could be moved sideways, so that a view of the surface across the width of the forefinger, rather than a view across the length of the forefinger, would be analysed by the software program, and would cause the cursor, 10, to move accordingly.
The movement of the forefinger, 5, will thus be responded to in the same way that the movement of a digital mouse will be responded to, when the mouse is moved over a supporting surface, such as a mouse mat.
It is pointed out that any part of the body could be used to cause the cursor to move in the way described in the foregoing, but that it will be important for the software program which is used in conjunction with the miniature camera, 11, to be designed so that the system responds to a sufficiently narrow depth of focus, so that the user can rapidly move out of the focusable range of view, in order to give rise to the selection of a desired target Application.
It is pointed out that a wider depth of field can be accommodated by use of a more complicated software program, which can rapidly acquire, and rapidly compare, different images, wherein, for instance, the decreasing size of an image will reveal movement away from the camera, and can be used to select a particular layer, whilst a change in position of the image, with a limit placed on the response to a change in size of the image, can be used to cause the cursor to move only in a desired direction, in one plane.
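The wider depth-of-field scheme separates the two kinds of movement by comparing how the image's size and its position change between frames; a sketch, with the tolerance chosen arbitrarily:

    def classify_motion(prev_size: float, curr_size: float,
                        prev_pos: tuple[float, float],
                        curr_pos: tuple[float, float],
                        size_tolerance: float = 0.05):
        """A shrinking image means the finger moved away from the camera
        (layer selection); otherwise only the positional change drives
        the cursor, capping the response to size changes as described."""
        scale = curr_size / prev_size
        if abs(scale - 1.0) > size_tolerance:
            return ("layer", scale)
        dx = curr_pos[0] - prev_pos[0]
        dy = curr_pos[1] - prev_pos[1]
        return ("cursor", (dx, dy))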
With reference to Figure 7, which represents a three-dimensional view, a pair of so-called Virtual Image Glasses, or Smart Glasses, 20, which are provided with a so-called Head-Up display, has arms, 21, and 22, wherein miniature cameras, or sensors, 23, and 24, are mounted on the outside of the arm, 21, so that the surface of the fore-finger, 25, of the right hand, 26, of the user, can be detected, and responded to, as already described with reference to earlier Figures. The said cameras or sensors thus replace the usual controls which are utilised for communicating with the electronic circuitry residing in the body of the pair of glasses, 20, wherein at least one microcontroller resides in the body of the pair of glasses, 20, in order to allow the user of the present pair of glasses of the invention to choose particular Applications, and to carry out other forms of manipulation, according to the methods already referred to with reference to earlier Figures. The invention thus allows control of the pair of glasses without needing to make direct contact with any part of it.
It is further pointed out with reference to Figure 7, that the camera, or sensor, 24, can be located further forward on the arm, 21, of the pair of glasses, 20, and designed so that it captures images, or detects radiation, from a region forward of the glasses, wherein the fore-finger, 25, of the right hand, 26, is then placed forward of the position shown in Figure 7. Similarly, the camera or sensor, 23, can be placed in a forward position towards the front of the left hand arm, 22. In a more elaborate arrangement, the forward facing cameras or sensors can be in addition to the ones shown in Figure 7.
In all cases, electronic circuitry, and microcontroller components, are located within the body of the pair of glasses, 20.
It is pointed out, with reference to the foregoing, that communication with auxiliary equipment, by means of direct wiring or by methods not involving direct wiring, will provide further flexibility in operation.
It is further pointed out that, in design variants of the selection system, the position of any camera or sensor on the device is decided upon for the purpose of allowing use of the system by persons who are disabled and thus have limited control of the hand and arm.

Claims (22)

  1. A selection system which provides more convenient means, than are presently available, A) for selecting a feature displayed as an image on the visual display screen of a mobile device such as a mobile phone, or Smart Watch, wherein the said mobile phone, or Smart Watch, is either carried, i) on the wrist, or ii) on, or by, other parts of the body, B) for selecting a feature displayed as an image on the visual display screen of other devices, such as iPads, Tablets, Laptops, Personal Computers, and C) for direct presentation of information to the eyes of the user, where the device is a single Smart Glass, or a pair of Smart Glasses, held on the head of the user, wherein each device utilises a so-called Head-Up display, wherein for cases A, and B, the said invention does not require direct contact of a finger of the user, with the screen, or direct contact of pointing equipment such as a stylus pen, with the screen, and wherein for case, C, the user does not have to make direct contact with controls located on the body of the said Smart Glasses, wherein, in cases, A, B, and C, the method of operation of the said system, involves the user holding a finger, for example the fore-finger, or pointing equipment, in a range of possible locations in space, in relation to the position, in space, of the device, so that, when attempting to locate a particular, so-called, Application, held within the memory and operating system contained within the microcontroller inside the device, the user is able to navigate around the screen of the device, for cases A, and B, and is able to navigate around the virtual image presented to the user, for case C, wherein, for cases A, and B, the said Application is represented by a particular image which is visible on the screen of the said device, and wherein, for case C, the said Application is represented by a virtual image presented in front of the eye, or eyes, of the user.
  2. A selection system as claimed in Claim 1, wherein, where a display screen is utilised, the said system does not require direct contact of a finger of the user, with the screen, or direct contact of pointing equipment such as a stylus pen, with the screen, and wherein, where a Head-Up display is utilised, the fingers are not required to make direct contact with any part of the associated equipment in order to operate it.
  3. A selection system as claimed in Claim 2, wherein the user holds a finger, or pointing equipment, in a range of possible locations in space, in relation to the position, in space, of the device, so that the user is able to navigate around the screen of the device, or around the virtual image presented to the eye or eyes of the user, when attempting to locate a particular, so-called, Application, held within the memory and operating system contained within the microcontroller inside the device, wherein the said Application is represented by a particular image which is visible on the screen of the said device, or by a virtual image presented to the eye or eyes of the user.
  4. A selection system as claimed in Claim 3, wherein the said navigation around the screen is achieved, either A) by movement of a finger of the user of the selection system, whether the finger is aided or unaided by means of auxiliary equipment, such as a camera-recognisable mark formed on a finger pocket attached to the finger, or is aided or unaided by means of a camera-recognisable mark created on the finger, or B) by movement of pointing equipment held by the user, which possesses camera-recognisable features, wherein either method gives rise to the selection of a particular feature displayed on the screen of the device, or displayed as a virtual image to the user, by causing a cursor which is displayed on the said screen, to move across the screen of the device, in any chosen direction, as the said finger, or the said pointing equipment moves, until the cursor lies within the recognisable boundaries of a target image displayed on the screen of the device, wherein the image represents the Application which the user desires to use, and wherein the cursor is initially made visible on the screen, by the pressing, by the user, of a button mounted on the device, and wherein the cursor disappears either automatically after a pre-settable time, or disappears as a result of a further pressing of the said button.
  5. A selection system as claimed in Claim 4, wherein, after the said target image has been chosen by moving the cursor as described, the user rapidly moves the finger, or the pointing equipment, away from observation by a camera which is mounted on the said device, so that no further movement of the cursor can occur, and the user then presses a button on the device, which causes locking-on to the desired target image, so that the desired Application is then launched.
  6. A selection system as claimed in Claim 4, wherein, after the said target image has been chosen by moving the cursor as described, rapid movement of the said finger, or the said pointing equipment, away from the screen, or away from the said Glasses, itself, triggers the function of locking-on to the target Application, without the need for the pressing of a said button in order to launch the Application.
  7. A selection system as claimed in Claim 5, or Claim 6, wherein the cursor is automatically turned off when the process of locking-on to the target has occurred, wherein the cursor will be seen again, on the screen, or as a virtual image, in a prominent, parked position on the screen, when the Application has finished.
  8. A selection system as claimed in Claim 5, or Claim 6, wherein the cursor can be made visible by pressing a button located on the device, and can be made invisible by a second pressing of the said button.
  9. A selection system as claimed in Claim 7, or Claim 8, wherein, for instance, a finger of the right hand of the user is located over the back of the left hand which carries a mobile wrist-watch, and causes the said cursor to move to the desired location so that it rests over a particular image representing a desired Application, and wherein, in the case of the use of a Smart Glass, or a pair of Smart Glasses, a finger of the right hand moves in a region in space such that an image of the finger can be viewed by the camera mounted on the said Glasses, so that a cursor can be guided, by movement of the finger, around the virtual image presented to the user, so that the user can then select a desired Application.
  10. A selection system as claimed in Claim 9, wherein at least one miniature camera, together with a software program residing in a microcontroller held in the device, will rapidly take, and store, a series of digital photographs of the said finger, or of the said pointing equipment, and cause the cursor to move, as the finger, or pointing equipment, moves in any direction in a fixed plane which is parallel with that of the screen of the device, and, in the case of the use of Smart Glasses, in a fixed plane, within detectable limits in both cases, wherein this functionality is similar to the way in which a digital mouse functions when moving over a mouse mat, or the like, but places the camera lens of the mouse into the said device, and replaces the mouse mat with the surface of the finger, or with the camera-detectable surface of the pointing equipment, wherein either is viewed by the camera, and wherein, depending on the sensitivity of the camera, an auxiliary source of light, such as a Light Emitting Diode (LED), is used to illuminate the surface of the finger, or of the functional surface of the pointing equipment, which is viewed by the camera, wherein the auxiliary source of light will be particularly important under conditions of low ambient lighting.
  11. A selection system as claimed in Claim 10, wherein the said cursor moves in a plane which is parallel with that of the screen, if the screen is flat, and moves over a curved region, which is parallel with that of the screen, if the screen is curved, wherein a further feature of the invention is that, by moving the finger, or the pointing equipment, in three-dimensional space, in a direction at right angles with the plane of the screen of the device, and in a direction along the principal axis of the camera mounted on the said Smart Glasses, a particular layer of Applications is selectable from a range of layers, wherein each layer contains, for instance, a particular type of Application, and wherein, for example, one layer contains games, and another layer contains cooking recipes, and so on, for a range of different Applications.
  12. A selection system as claimed in Claim 11, wherein the said software program will only respond to movement of images which are recognised as those which have been pre-programmed into the microcontroller residing in the device containing the selection system, so that the equivalent of the recognition of a fingerprint of the user, or the equivalent of the pointing equipment "print", is operating, so that the user can ensure that the selection system cannot be accessed by other users, unless their fingerprints have been previously recorded by the selection system or unless these users have access to the pointing equipment which is recognisable by the selection system.
  13. A selection system as claimed in Claim 12, wherein the said pointing device is able to generate each one of many camera-recognisable images which are selectable by the user, so that a further level of security is achievable in pointing equipment which is sold to, or can otherwise be made available to, potential users of the selection system.
  14. A selection system as claimed in Claim 3, wherein movement of a finger, or of the pointing device, is responded to by means of sensors located in the device, wherein the device contains cursor control software, and wherein the cursor then moves in response to the movement of the finger, or to movement of the pointing equipment, wherein the said sensors detect, and respond to, the changing position in space, of the finger, or of the pointing equipment, as either moves in a plane which is parallel with that of the screen of the device, within detectable limits, and wherein, in the case of the use of Smart Glasses, the said sensors are used in addition to the use of miniature cameras, wherein the finger or pointing device moves in a fixed plane, within detectable limits in both cases.
  15. A selection system as claimed in Claim 14, wherein the said sensors are directional heat sensors, such as Passive Infrared Sensors (PIRs), located on the device, wherein the sensors detect heat emanating from a finger, or from the functional part of the pointing equipment, wherein either is located at a pre-selectable distance from a particular face of the sensor, and in a pre-selectable direction with respect to that face, and wherein the system functions according to the operation of a software program residing in the microcontroller situated within the device, thereby providing three-dimensional control of the said cursor, based upon the corresponding response to the position of the said finger, or to the position of the said pointing equipment.
  16. A selection system as claimed in Claim 14, wherein transmitters/sensors, which transmit and receive ultrasound radiation, transmit ultrasound radiation which is reflected by the said finger, or by the functional part of the said pointing equipment, to an accompanying ultrasound sensor, wherein the time taken for the transmitted radiation to reach the said finger or the functional part of the said pointing equipment, and the time taken for the received radiation to reach the sensor, is processed by means of a software program residing in a microcontroller held in the device, wherein the use of two such transmitters/receivers will then allow three-dimensional detection of the position of the finger or the functional part of the said pointing equipment, and the corresponding following of the movement of the finger by the cursor, according to the processing action of the software program.
  17. A selection system as claimed in Claim 3, wherein a miniature camera is able to focus accurately on the said finger, or on the functional part of the said pointing equipment, wherein the use of two such cameras can be aided by software which requires both cameras to have an image of the finger, or of the mark, in focus at the same time, wherein each image will be unique for any particular position of the finger, and will thus identify where the finger is, in relation to the position of the two cameras, and will consequently cause the cursor to move to a unique, previously mapped, position.
  18. A selection system as claimed in Claim 4, wherein instead of using a cursor to show the user where a said finger, or the functional part of the said pointing equipment, is taking them to, on the screen, the system causes an image representing an Application to become lighter, or to flash on and off, when the software has detected that it is in the region required for implementation of selection of a particular Application.
  19. A selection system as claimed in Claim 2, wherein slider operated, or thumbwheel operated, parts, cause a cursor to move in response to the movement by the user of the said system, of the said parts, wherein the said parts are conveniently mounted on the said device, and thereby allow the user to drive miniature mechanisms contained within the said device, wherein such mechanisms are miniature versions of potentiometer based devices used in joysticks, or are miniature versions of similar devices which utilise optical techniques, to implement movement of a screen cursor, via the use of a software program residing in the microcontroller situated in the device.
  20. A selection system as claimed in any previous Claim, wherein communication with auxiliary equipment by means of direct wiring or by methods not involving direct wiring will provide further flexibility in operation.
  21. A selection system as claimed in Claim 20, wherein the position of any camera or sensor on the device is decided upon for the purpose of allowing use of the system by persons who are disabled and thus have limited control of the hand and arm.
  22. Methods and apparatus, arranged and constructed to operate substantially as hereinbefore described with reference to any one of the embodiments illustrated in Figures 1 to 7 of the accompanying drawings.
GB1416386.9A 2014-05-24 2014-09-17 Selection system Withdrawn GB2526386A (en)

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
GBGB1409368.6A GB201409368D0 (en) 2014-05-24 2014-05-24 Selection system

Publications (2)

Publication Number Publication Date
GB201416386D0 GB201416386D0 (en) 2014-10-29
GB2526386A (en) 2015-11-25

Family

ID=51177489

Family Applications (2)

Application Number Title Priority Date Filing Date
GBGB1409368.6A Ceased GB201409368D0 (en) 2014-05-24 2014-05-24 Selection system
GB1416386.9A Withdrawn GB2526386A (en) 2014-05-24 2014-09-17 Selection system

Family Applications Before (1)

Application Number Title Priority Date Filing Date
GBGB1409368.6A Ceased GB201409368D0 (en) 2014-05-24 2014-05-24 Selection system

Country Status (1)

Country Link
GB (2) GB201409368D0 (en)


Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2008010024A1 (en) * 2006-07-16 2008-01-24 Cherradi I Free fingers typing technology
US8624836B1 (en) * 2008-10-24 2014-01-07 Google Inc. Gesture-based small device input
US20110221656A1 (en) * 2010-02-28 2011-09-15 Osterhout Group, Inc. Displayed content vision correction with electrically adjustable lens
US20130196757A1 (en) * 2012-01-30 2013-08-01 Microsoft Corporation Multiplayer gaming with head-mounted display

Non-Patent Citations (3)

* Cited by examiner, † Cited by third party
Title
Gerald Lynch, Samsung Smartwatch patent predicts gesture control on your wrist, accessible at: http://gizmodo.com/samsung-smartwatch-patent-predicts-gesture-control-on-y-1581952225 [Accessed 5 Jan 2015] *
Richard Metz, A gestural Interface for smart watches, accessible at: http://www.technologyreview.com/news/520841/a-gestural-interface-for-smart-watches/ [Accessed 5 Jan 2015] *
Shea Harris, Google patent details indicate google glass(es) may use hand gestures, accessible at: http://www.androidauthority.com/google-glass-glasses-hand-gestures-patent-87051/ [Accessed 5 Jan 2015] *

Also Published As

Publication number Publication date
GB201416386D0 (en) 2014-10-29
GB201409368D0 (en) 2014-07-09

Similar Documents

Publication Publication Date Title
CN104272218B (en) Virtual hand based on joint data
US11816296B2 (en) External user interface for head worn computing
EP3066551B1 (en) Multi-modal gesture based interactive system and method using one single sensing system
US8730169B2 (en) Hybrid pointing device
US10444908B2 (en) Virtual touchpads for wearable and portable devices
US20110298708A1 (en) Virtual Touch Interface
ES2709051T3 (en) Control of a graphical user interface
US9207852B1 (en) Input mechanisms for electronic devices
US9494415B2 (en) Object position determination
US20170083115A1 (en) Method for displaying a virtual interaction on at least one screen and input device, system and method for a virtual application by means of a computing unit
KR101343748B1 (en) Transparent display virtual touch apparatus without pointer
US11714540B2 (en) Remote touch detection enabled by peripheral device
CN102341814A (en) Gesture recognition method and interactive input system employing same
CN105829948B (en) Wearable display input system
Matulic et al. Pensight: Enhanced interaction with a pen-top camera
US11640198B2 (en) System and method for human interaction with virtual objects
Grubert et al. Glasshands: Interaction around unmodified mobile devices using sunglasses
US8581847B2 (en) Hybrid pointing device
WO2003003185A1 (en) System for establishing a user interface
Colaço Sensor design and interaction techniques for gestural input to smart glasses and mobile devices
GB2526386A (en) Selection system
Takahashi et al. Extending Three-Dimensional Space Touch Interaction using Hand Gesture
KR20140021166A (en) Two-dimensional virtual touch apparatus
Grubert et al. Towards Interaction Around Unmodified Camera-equipped Mobile Devices

Legal Events

Date Code Title Description
WAP Application withdrawn, taken to be withdrawn or refused ** after publication under section 16(1)