AU2002320736B2 - Selecting Moving Objects on a System - Google Patents


Info

Publication number
AU2002320736B2
AU2002320736B2
Authority
AU
Australia
Prior art keywords
gui object
sdr
pointing device
code
display space
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Ceased
Application number
AU2002320736A
Other versions
AU2002320736A1 (en)
Inventor
Sammy Chan
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Canon Inc
Original Assignee
Canon Inc
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Priority claimed from AUPR9630A external-priority patent/AUPR963001A0/en
Application filed by Canon Inc filed Critical Canon Inc
Priority to AU2002320736A priority Critical patent/AU2002320736B2/en
Publication of AU2002320736A1 publication Critical patent/AU2002320736A1/en
Application granted granted Critical
Publication of AU2002320736B2 publication Critical patent/AU2002320736B2/en
Anticipated expiration legal-status Critical
Ceased legal-status Critical Current


Landscapes

  • User Interface Of Digital Computer (AREA)

Description

S&F Ref: 616590
AUSTRALIA
PATENTS ACT 1990 COMPLETE SPECIFICATION FOR A STANDARD PATENT
ORIGINAL
Name and Address of Applicant: Canon Kabushiki Kaisha, 30-2, Shimomaruko 3-chome, Ohta-ku, Tokyo 146, Japan

Actual Inventor(s): Sammy Chan

Address for Service: Spruson & Ferguson, St Martins Tower, Level 31 Market Street, Sydney NSW 2000 (CCN 3710000177)

Invention Title: Selecting Moving Objects on a System

ASSOCIATED PROVISIONAL APPLICATION DETAILS: [33] Country: AU; [31] Applic. No(s): PR9630; [32] Application Date: 19 Dec 2001

The following statement is a full description of this invention, including the best method of performing it known to me/us:

SELECTING MOVING OBJECTS ON A SYSTEM

Technical Field of the Invention

The present invention relates generally to selecting dynamically moving objects represented on a display screen and, in particular, to performing this selection using a remote control device. The present invention relates to a method and apparatus for selecting a dynamically moving object represented on a display screen. The invention also relates to a computer program product including a computer readable medium having recorded thereon a computer program for selecting a dynamically moving object represented on a display screen.
Background Art

Display devices such as televisions (TVs) often display both static and dynamically moving graphical objects, as do display devices such as cathode ray tubes (CRTs) and Liquid Crystal Displays (LCDs) used with Personal Computers (PCs). These moving objects will be referred to in this description as Graphical User Interface (GUI) objects. The GUI objects of interest in this description are graphical elements such as icons or animated graphics that represent certain information to the user, and whose movements or positions on the display often change continuously in various directions or at various rates.
Common pointing devices used in PC systems include the mouse, trackball and joystick, and these are used to position a cursor at a desired position on the PC display screen. For example, the cursor may be positioned in such a manner as to define an insertion point at which text is to be incorporated into a document being displayed by a word processing application running on the PC. The pointing devices, and particularly the joystick, can also be used for dynamic applications such as PC based action games in which, for example, a representation of an aeroplane can thereby be guided about the display screen.
181202 616590.DOC

The aforementioned pointing devices typically have a range of operation that enables the cursor to be moved about the entire expanse of the display screen, and a constant relationship is typically maintained between an operating position of the pointing device and the corresponding location of the cursor on the display screen. In highly dynamic video applications, however, this arrangement requires a high degree of sensory-motor coordination from the typical user, who is therefore unable to easily track a desired dynamically moving object on the screen.
Summary of the Invention

According to one aspect of the invention, there is provided a method of selecting a GUI object in a display space using a user controlled pointing device, the method comprising the steps of: (i) establishing an initial range mapping between the pointing device and the display space; (ii) defining an initial Selected Display Region (SDR) in the display space dependent on the initial range mapping; (iii) receiving, from the pointing device, a set of coordinates in the initial SDR; (iv) updating the range mapping dependent upon the received set of coordinates; (v) redefining the SDR dependent upon the updated range mapping; (vi) determining a presence of any GUI object within the redefined SDR; (vii) if one GUI object is present in the redefined SDR, selecting the GUI object; and (viii) if more than one GUI object is present in the redefined SDR, repeating steps (iii) to (viii).
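The loop defined by steps (i) to (viii) can be sketched in outline as follows. This is an illustrative Python sketch only, not the patented implementation; the `Rect` and `Obj` helper types, the normalised touch coordinates, and the three-by-three subdivision of the SDR are assumptions made for the example:

```python
from dataclasses import dataclass

@dataclass
class Rect:
    x: float
    y: float
    w: float
    h: float

    def contains(self, px, py):
        return self.x <= px < self.x + self.w and self.y <= py < self.y + self.h

@dataclass
class Obj:
    x: float
    y: float
    name: str

def select_gui_object(display, objects, touches):
    """Steps (i)-(viii): shrink the SDR around successive touches until
    exactly one GUI object remains inside it."""
    sdr = display                               # (i)-(ii): initial SDR covers the display
    for tx, ty in touches:                      # (iii): normalised touch in [0, 1]
        # (iv)-(v): the pad's full range maps onto one ninth of the current SDR
        col, row = min(int(tx * 3), 2), min(int(ty * 3), 2)
        cand = Rect(sdr.x + col * sdr.w / 3, sdr.y + row * sdr.h / 3,
                    sdr.w / 3, sdr.h / 3)
        inside = [o for o in objects if cand.contains(o.x, o.y)]  # (vi)
        if len(inside) == 1:
            return inside[0]                    # (vii): unique object, select it
        if inside:
            sdr = cand                          # (viii): several objects, refine and repeat
    return None                                 # touches exhausted without a unique object
```

With a 90 by 90 display and objects at (5, 5) and (10, 10), two touches aimed at the top-left corner first narrow the SDR to the top-left ninth and then isolate the object at (5, 5).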
According to another aspect of the invention, there is provided a method of selecting a moving GUI object in a display space using a pointing device, the method comprising the steps of: receiving an initial coordinate position from the pointing device; receiving a directional signal from the pointing device; and selecting the GUI object located closest to the initial coordinate position in the display space in a direction dependent upon the directional signal, wherein at least one of these steps comprises an initial sub-step of freezing motion of the GUI objects in the display space until a selection is made.
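The directional variant can be sketched as below. This is a hypothetical Python illustration: the 45-degree angular tolerance and the `Obj` helper class are assumptions for the example, not details from the claim, which only requires selecting the closest object in a direction dependent upon the directional signal:

```python
import math

class Obj:
    def __init__(self, x, y, name):
        self.x, self.y, self.name = x, y, name

def select_in_direction(objects, origin, angle_deg, tolerance_deg=45.0):
    """Select the object nearest to `origin` whose bearing from the origin
    lies within `tolerance_deg` of the signalled direction. Object motion
    is assumed frozen while the selection is made."""
    ox, oy = origin
    best, best_dist = None, float("inf")
    for obj in objects:
        dx, dy = obj.x - ox, obj.y - oy
        bearing = math.degrees(math.atan2(dy, dx))
        # smallest angular difference between the bearing and the requested direction
        diff = abs((bearing - angle_deg + 180.0) % 360.0 - 180.0)
        if diff <= tolerance_deg:
            dist = math.hypot(dx, dy)
            if dist < best_dist:
                best, best_dist = obj, dist
    return best
```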
According to another aspect of the invention, there is provided an apparatus for selecting a GUI object in a display space using a user controlled pointing device, the apparatus comprising: (i) means for establishing an initial range mapping between the pointing device and the display space; (ii) means for defining an initial Selected Display Region (SDR) in the display space dependent on the initial range mapping; (iii) means for receiving, from the pointing device, a set of coordinates in the initial SDR; (iv) means for updating the range mapping dependent upon the received set of coordinates; (v) means for redefining the SDR dependent upon the updated range mapping; (vi) means for determining a presence of any GUI object within the redefined SDR; (vii) means for selecting, if one GUI object is present in the redefined SDR, the GUI object; and (viii) means for repeating, if more than one GUI object is present in the redefined SDR, steps (iii) to (viii).
According to another aspect of the invention, there is provided an apparatus for selecting a moving GUI object in a display space using a pointing device, the apparatus comprising: means for receiving an initial coordinate position from the pointing device; means for receiving a directional signal from the pointing device; and means for selecting the GUI object located closest to the initial coordinate position in the display space in a direction dependent upon the directional signal, wherein the motion of the GUI objects is frozen in the display space until a selection is made.
According to another aspect of the invention, there is provided a computer program for directing a processor to execute a procedure for selecting a GUI object in a display space using a user controlled pointing device, the program comprising: (i) code for establishing an initial range mapping between the pointing device and the display space; (ii) code for defining an initial Selected Display Region (SDR) in the display space dependent on the initial range mapping; (iii) code for receiving, from the pointing device, a set of coordinates in the initial SDR; (iv) code for updating the range mapping dependent upon the received set of coordinates; (v) code for redefining the SDR dependent upon the updated range mapping; (vi) code for determining a presence of any GUI object within the redefined SDR; (vii) code for selecting, if one GUI object is present in the redefined SDR, the GUI object; and (viii) code for repeating, if more than one GUI object is present in the redefined SDR, steps (iii) to (viii).
According to another aspect of the invention, there is provided a computer program for directing a processor to execute a procedure for selecting a moving GUI object in a display space using a pointing device, the program comprising: code for receiving an initial coordinate position from the pointing device; code for receiving a directional signal from the pointing device; and code for selecting the GUI object located closest to the initial coordinate position in the display space in a direction dependent upon the directional signal, wherein the motion of the GUI objects is frozen in the display space until a selection is made.
According to another aspect of the invention, there is provided a computer program product including a computer readable medium having recorded thereon a computer program for directing a processor to execute a procedure for selecting a GUI object in a display space using a user controlled pointing device, the program comprising: (i) code for establishing an initial range mapping between the pointing device and the display space; (ii) code for defining an initial Selected Display Region (SDR) in the display space dependent on the initial range mapping; (iii) code for receiving, from the pointing device, a set of coordinates in the initial SDR; (iv) code for updating the range mapping dependent upon the received set of coordinates; (v) code for redefining the SDR dependent upon the updated range mapping; (vi) code for determining a presence of any GUI object within the redefined SDR; (vii) code for selecting, if one GUI object is present in the redefined SDR, the GUI object; and (viii) code for repeating, if more than one GUI object is present in the redefined SDR, steps (iii) to (viii).
According to another aspect of the invention, there is provided a computer program product including a computer readable medium having recorded thereon a computer program for directing a processor to execute a procedure for selecting a moving GUI object in a display space using a pointing device, the program comprising: code for receiving an initial coordinate position from the pointing device; code for receiving a directional signal from the pointing device; and code for selecting the GUI object located closest to the initial coordinate position in the display space in a direction dependent upon the directional signal, wherein the motion of the GUI objects is frozen in the display space until a selection is made.
Other aspects of the invention are also disclosed.
Brief Description of the Drawings

One or more embodiments of the present invention will now be described with reference to the drawings, in which:

Fig. 1 shows a block diagram of a system in which a user can select a GUI object on a display screen;

Fig. 2 is a schematic block diagram of a general-purpose computer upon which the arrangements described can be practised;

Fig. 3A shows an illustrative touch pad device;

Fig. 3B shows an initial Selected Display Region (SDR);

Fig. 3C shows a subsequent SDR after revised range mapping;

Fig. 3D shows an alternative range mapping arrangement;

Fig. 4 shows a process whereby a user can operate the system of Fig. 1 for selecting GUI objects by providing positional inputs;

Fig. 5 depicts a system process flow chart of method steps by which the software application in Fig. 1 selects the GUI object using positional inputs;

Fig. 6 depicts selective magnification of the Selected Display Region (SDR) on the PC display screen;

Fig. 7 shows a process whereby a user can operate the system of Fig. 1 for selecting objects by providing directional inputs;

Fig. 8 illustrates one arrangement for mathematically defining a closest object in the specified direction; and

Fig. 9 depicts a system process flow chart of method steps by which the application of Fig. 1 selects the GUI object using directional inputs.
Detailed Description including Best Mode

Where reference is made in any one or more of the accompanying drawings to steps and/or features which have the same reference numerals, those steps and/or features have for the purposes of this description the same function(s) or operation(s), unless the contrary intention appears.
As previously noted, it is often desirable for the user to have the capability of selecting one of a possibly large number of GUI objects that are moving dynamically on a display screen of a PC system or a TV system that has an associated PC capability.
Fig. 1 shows a block diagram of a system 100 by which a user 102 may select a GUI object 118 on a display area 402 of a display device 112. The user 102 uses a remote control device 104 in order to perform the aforementioned selection. The exemplary GUI object 118 has a direction and a velocity depicted by a vector 120. GUI objects typically represent information specific to an application 116 that is being run on a PC module 201. The exemplary remote control device 104 has an infra-red (IR) transmitter which converts user inputs 106 to corresponding commands 110 directed to the PC module 201. The exemplary remote control device 104 described in this specification is a touch pad device 410 equipped with a touch pad region 416 as well as other controls (see Fig. 3A). Other devices, however, such as joy-sticks, multi-way rocker switches, or other controls that provide positional and/or directional inputs may alternately be used.
When selected, the GUI object 118 can be highlighted on the display area 402 in order to provide feedback to the user 102 that selection has been performed. The software application 116 generates the dynamic GUI objects that are displayed on the display device 112, interprets the commands 110, and sends corresponding graphical data 114 to the display device 112, thereby changing the display of the GUI objects accordingly. The graphical data 114 can highlight a GUI object that has been thus selected by the user 102, thereby providing visual feedback 108 from the display device 112 to the user 102, and providing the user 102 with confirmation that the desired GUI object has been successfully selected.
Fig. 2 illustrates how the method of selecting the GUI object is preferably practised using a general-purpose computer system 200. In this arrangement, the processes described in relation to Figs. 4, 5, 7 and 9 may be implemented as software, such as the application program 116 executing within the computer system 200. In particular, the steps of the method of selecting a GUI object are effected by instructions in the software that are carried out by the computer. The instructions may be formed as one or more code modules, each for performing one or more particular tasks. The software may also be divided into two separate parts, in which a first part performs the GUI object selection methods, and a second part manages a user interface between the first part and the user.
The software may be stored in a computer readable medium, including the storage devices described below, for example. The software is loaded into the computer from the computer readable medium, and then executed by the computer. A computer readable medium having such software or computer program recorded on it is a computer program product. The use of the computer program product in the computer preferably effects an advantageous apparatus for selecting a desired GUI object.
The computer system 200 comprises the computer module 201, input devices such as a keyboard 202, a mouse 203, and the touch pad device 410, and output devices including a printer 215 and the display device 112 having the display area 402. A Modulator-Demodulator (Modem) transceiver device 216 is used by the computer module 201 for communicating to and from a communications network 220. The modem is, for example, connectable to the network 220 via a telephone line 221 or other functional medium. The modem 216 can be used to obtain access to the Internet and other network systems, such as a Local Area Network (LAN) or a Wide Area Network (WAN).
The computer module 201 typically includes at least one processor unit 205 and a memory unit 206, for example formed from semiconductor random access memory (RAM) and read only memory (ROM). The module 201 also includes input/output interfaces, including a video interface 207, and an I/O interface 213 for the keyboard 202, the mouse 203, the touch pad device 410 by means of a wireless connection 224, and optionally a joystick (not illustrated). The module 201 also has an interface 208 for the modem 216. A storage device 209 is provided and typically includes a hard disk drive 210 and a floppy disk drive 211. A magnetic tape drive (not illustrated) may also be used. A CD-ROM drive 212 is typically provided as a non-volatile source of data. The components 205 to 213 of the computer module 201 typically communicate via an interconnected bus 204, in a manner which results in a conventional mode of operation of the computer system 200 known to those in the relevant art. Examples of computers on which the described arrangements can be practised include IBM-PCs and compatibles, Sun Sparcstations, or alike computer systems evolved therefrom.
Typically, the application program 116 is resident on the hard disk drive 210 and read and controlled in its execution by the processor 205. Intermediate storage of the program and any data fetched from the network 220 may be accomplished using the semiconductor memory 206, possibly in concert with the hard disk drive 210. In some instances, the application program may be supplied to the user encoded on a CD-ROM or floppy disk and read via the corresponding drive 212 or 211, or alternatively may be read by the user from the network 220 via the modem device 216. Still further, the software can also be loaded into the computer system 200 from other computer readable media.
The term "computer readable medium" as used herein refers to any storage or transmission medium that participates in providing instructions and/or data to the computer system 200 for execution and/or processing. Examples of storage media include floppy disks, magnetic tape, CD-ROM, a hard disk drive, a ROM or integrated circuit, a magneto-optical disk, or a computer readable card such as a PCMCIA card and the like, whether or not such devices are internal or external of the computer module 201.
Examples of transmission media include radio or infra-red transmission channels, as well as a network connection to another computer or networked device, and the Internet or Intranets, including email transmissions and information recorded on websites and the like.
The method of selecting a GUI object may alternatively be implemented in dedicated hardware such as one or more integrated circuits performing the functions or sub functions of selecting a GUI object. Such dedicated hardware may include graphic processors, digital signal processors, or one or more microprocessors and associated memories.
Two variations of the method of selecting a GUI object are described, each requiring the user 102 to use the touch pad device 410 in a slightly different manner, and for which the display device 112 operates in correspondingly different feedback modes.
Preparatory to describing the aforementioned selection methods, it is convenient to describe the exemplary touch pad device 410 that is used as the remote control device 104.
Fig. 3A illustrates the touch pad device 410 that is described in this specification.
The touch-pad device comprises an outer casing 432 and a number of controls. It can also include a small display (not shown) that mirrors, typically at a lower resolution, the display on the display area 402 of the display device 112. In general, controls on the remote control device 104 are used for making GUI object selection, and also for other purposes that are specific to the exemplary application 116. In the present example, a touch pad region 416 serves as the control for providing positional or directional information that is used to select the desired GUI object. A MULTI-SELECT toggle button 414 serves to enable or disable multiple selections of GUI objects, and an EXIT button 430 allows the user to exit an operation in a current context.
The touch pad device 410 allows the user 102 to specify positional information in the two-dimensional display space 402, in terms of a pair of coordinates, by touching a position on the touch pad region 416 with a finger or a stylus. The touch pad region 416 also allows the user 102 to specify directional information in the two-dimensional display space 402. Directional information is specified in terms of an initial finger placement coordinate on the touch pad region 416, and an angle defined by the user holding their finger down and dragging it in a desired direction along the surface of the touch pad region 416. An imaginary line (see 908 in Fig. 8) formed between the initial finger placement position and the finger position after the dragging action is completed is used to derive an angle relative to an axis.
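The angle derived from the imaginary line can be computed directly from the two touch coordinates. A minimal sketch, assuming pad coordinates in a conventional Cartesian frame and an angle measured anti-clockwise from the positive x-axis:

```python
import math

def drag_angle(start, end):
    """Angle (in degrees, measured from the positive x-axis) of the line
    from the initial finger placement to the end of the drag."""
    dx = end[0] - start[0]
    dy = end[1] - start[1]
    return math.degrees(math.atan2(dy, dx)) % 360.0
```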
The MULTI-SELECT button 414 is a toggle button which has an ON or OFF state. When OFF, only one GUI object can be selected at any one time: when a GUI object is selected, any previously selected GUI objects are de-selected. When the button 414 is ON, a new selection can be made without de-selecting any previous GUI object selections. The EXIT button 430 is used to exit an operation in the current context so that it can be cancelled or retried. Thus, for example, if the user no longer wishes to select a GUI object, or if the desired GUI object is outside a Selected Display Region (SDR), the user can use the EXIT button to revert to the state just prior to inputting the last position.
This allows the user to cancel or retry the last operation. If the user wishes to make multiple selections of GUI objects, the MULTI-SELECT button has to be toggled ON before further selections are made. Otherwise, the next selection will de-select the current selection. While selecting multiple GUI objects, selecting an already selected GUI object will de-select it.
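The selection semantics of the MULTI-SELECT button can be summarised as below. This is an illustrative sketch only, with GUI objects stood in for by plain hashable values:

```python
def apply_selection(selected, obj, multi_select):
    """Update the set of selected GUI objects for one selection event.

    With MULTI-SELECT OFF, the new object replaces any previous selection.
    With MULTI-SELECT ON, selecting an already-selected object de-selects
    it; otherwise the object is added to the current selection.
    """
    if not multi_select:
        return {obj}
    updated = set(selected)
    if obj in updated:
        updated.discard(obj)   # re-selecting toggles the object off
    else:
        updated.add(obj)
    return updated
```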
In a first arrangement, the user provides what are referred to as "positional inputs". This allows the user 102 to progressively perform selection, while maintaining the complete dynamic range of the touch pad device 410, in a smaller and smaller display region, referred to as a Selected Display Region (SDR), on the display area 402 (see Fig. 3B) of the display device 112. This process continues until the SDR contains only the desired GUI object.
Fig. 3B shows a first instance of the display device 112 and the display area 402, in regard to which a technique called "range mapping" between the display area 402 of the display device 112 and the touch pad region 416 of the touch pad device 410 will now be described. In the case when positional inputs are used to select a GUI object, the selection process commences by performing an initial range mapping of the touch pad region 416 of the touch pad 410 (see Fig. 3A) to the entire display area 402. This initial range mapping defines an initial SDR that encompasses the entire display area 402. The initial SDR is thus depicted by an elliptical boundary 404. The touch pad region 416 is divided into nine touch regions exemplified by touch regions 412 and 420, and in the initial range mapping, these nine touch regions map to nine corresponding display regions exemplified by display regions 400 and 406 on the display area 402. The term "range mapping" denotes the relationship between the display area 402, a current SDR, and the touch pad region 416, and denotes how a touch on the touch pad region 416 is interpreted by the application 116 as corresponding to a particular display region in the current SDR of the display area 402. In accordance with the initial range mapping, a user touch input to the top left hand touch region 412 of the touch pad region 416 is interpreted as corresponding to the top left hand display region 400 of the SDR 404. Similarly, an input to the bottom right hand touch region 420 of the touch pad region 416 is interpreted as corresponding to the bottom right hand display region 406 of the SDR 404.
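The nine-way mapping just described amounts to selecting one of a 3 by 3 grid of sub-regions of the current SDR. A minimal sketch, assuming the touch coordinate is normalised to [0, 1] and an SDR given as an (x, y, w, h) tuple:

```python
def touch_to_display_region(tx, ty, sdr):
    """Map a normalised touch (tx, ty) on the touch pad region to one of
    nine equal display regions of the current SDR."""
    x, y, w, h = sdr
    col = min(int(tx * 3), 2)   # 0..2; clamped so tx == 1.0 stays in range
    row = min(int(ty * 3), 2)
    return (x + col * w / 3, y + row * h / 3, w / 3, h / 3)
```

A touch near the top left of the pad thus selects the top left ninth of the SDR, matching the mapping of touch region 412 to display region 400.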
If the user notices a desired GUI object, say in the display region 400 of the initial SDR 404, the user 102 touches a point on the touch region 412 of the touch pad region 416 that corresponds to an approximate position of the desired GUI object in the display region 400 in the current SDR 404. The touch pad device 410 then transmits a corresponding command 110, representing the location of the touch region 412 that has been pressed by the user, to the application 116. In one arrangement, the application 116 determines whether there is only a single GUI object in the display region 400. If this is the case, the application 116 selects that single GUI object in the display region 400, and that GUI object is highlighted in the display to show the user that the GUI object has been selected. If, however, there are multiple GUI objects in the display region 400, then none of these GUI objects are highlighted, and the application 116 revises the range mapping and re-maps the touch pad region 416 to just the specified display region 400, thereby specifying a new SDR 404'.
Fig. 3C shows a second instance of the display device 112 and the display area 402, after the aforementioned revised range mapping. As noted, the new SDR is now defined by the elliptical boundary 404', and the rectangular boundary of the display region 400 is highlighted, as depicted in Fig. 3C by a bold border. The touch pad region 416 of the touch pad 410 is re-mapped by the application 116 to only the highlighted new SDR 404'. The practical effect of the aforementioned highlighting is to provide the user with immediate visual feedback as to the region of the display to which the user has pointed, and this provides a new frame of reference to the user for inputting the next position on the touch pad 410. As a result of the revised range mapping, an input to the touch region 412 of the touch pad region 416 is now interpreted as corresponding to a display region 422 of the SDR 404'. Similarly, an input to the touch region 420 of the touch pad region 416 is interpreted as corresponding to a display region 428 of the new SDR 404'.
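Because the full dynamic range of the pad is re-applied to each successively smaller SDR, interpreting a touch is, in the continuous case, a simple scaling of the pad coordinate onto the current SDR. A hedged sketch, again assuming normalised pad coordinates and an (x, y, w, h) SDR:

```python
def touch_to_display_point(tx, ty, sdr):
    """Scale a normalised touch (tx, ty in [0, 1]) onto the current SDR
    (x, y, w, h), so the pad's full range always spans the SDR."""
    x, y, w, h = sdr
    return (x + tx * w, y + ty * h)
```

The same pad movement therefore corresponds to a smaller display movement once the SDR has been refined, giving the user progressively finer control.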
If the desired GUI object is seen by the user to be, say, in the display region 422 of the new SDR 404', the user 102 touches a point on the touch region 412 of the touch pad region 416 that corresponds to an approximate position of the desired GUI object in the display region 422 in the new SDR 404'. The touch pad device 410 then transmits a corresponding command 110, representing the location of the touch region 412 that has been pressed by the user, to the application 116. In the described arrangement, the application 116 determines whether there is only a single GUI object in the display region 422. If this is the case, the application 116 selects that single GUI object in the display region 422, and that GUI object is highlighted in the display to show the user that the GUI object has been selected. If, however, there are multiple GUI objects in the display region 422, then the application 116 revises the range mapping and re-maps the touch pad region 416 to just the specified display region 422, thereby specifying yet another new SDR 404".
In the described arrangement, if there is only one GUI object in a particular display region that the user has pointed to, then that GUI object is selected and highlighted. The highlighting of the GUI object indicates to the user that he has actually selected the highlighted GUI object. If, however, there are multiple GUI objects in the display region that has been pointed to, then no GUI object is highlighted. A revised range mapping is performed, and a new SDR is thereby defined and is also highlighted.
The user can again touch a point on the touch pad region 416 that corresponds, in the estimation of the user 102, to the coordinate position of the GUI object in the newly highlighted SDR. The user continues with the above process until the desired GUI object is selected.
The described arrangement highlights the SDR and the desired GUI object in the display area 402 in order to provide visual feedback cues to the user. These cues help the user track and select the GUI object. Other feedback indications can also be used. For example, indicative text messages can be provided, either on the display area 402, or on another auxiliary display (not shown).
In general, the range-mapping technique enables the touch pad region 416 of the touch pad device 410 to be mapped to an SDR comprising the entire display area 402, as well as to new SDRs encompassing any sub-section of the display area 402. Successive range mapping operations thus provide the user of the touch pad 410 with a convenient and effective method for rapidly converging on a desired GUI object displayed in the display area 402, while maintaining the full dynamic range of the touch pad region 416 over successively smaller SDRs.
It is apparent that although the arrangement in Fig. 3 deals specifically with a two-dimensional display area 402, and a corresponding 2-D pointing device in the form of the touch pad 410, this is merely an example. Thus the aforementioned range-mapping concept can be applied to an N-dimensional display space and a corresponding N-dimensional pointing device.
A three-dimensional arrangement could be implemented, for example, using a "virtual-reality" display arrangement together with a joy stick having auxiliary controls to enable three-dimensional pointing.
Fig. 3D shows an alternative range mapping arrangement, wherein range mapping is performed on the basis of a user selected coordinate position of the desired GUI object, and the actual coordinate position of that GUI object which is closest to the user selected coordinate position, thus not relying on the 9-segment touch region as the basis for defining new SDRs. The display device 112 and the display area 402 show three displayed GUI objects 1000, 1006 and 1008. The touch pad region 416 is initially mapped to the entire display area 402, which thus forms the initial SDR. The user selects a coordinate position 1004 by touching a particular (corresponding) position of the touch pad region 416. The touch pad device 410 transmits a command 110 that represents the location of the touch region 412 that has been pressed by the user, to the application 116.
The application 116 then determines which is the closest GUI object to the user selected coordinate position; in this case it is the GUI object 1008, which is closer to the user selected coordinate position 1004 than the other displayed GUI objects 1000 and 1006. The application 116 then defines a new SDR 1010 having the aforementioned closest GUI object 1008 at the centre of the new SDR, and the user selected coordinate 1004 near a boundary thereof. The application 116 also revises the range mapping and re-maps the touch pad region 416 to the new SDR 1010.
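The Fig. 3D arrangement can be sketched as follows. This is an illustrative Python fragment in which the `Obj` class and the choice of a square SDR, sized so that the touched coordinate lies on its boundary, are assumptions made for the example:

```python
import math

class Obj:
    def __init__(self, x, y, name):
        self.x, self.y, self.name = x, y, name

def refine_sdr_around_closest(objects, point):
    """Pick the GUI object closest to the user-selected coordinate and
    build a new SDR centred on that object, sized so that the selected
    coordinate falls on the SDR boundary."""
    px, py = point
    closest = min(objects, key=lambda o: math.hypot(o.x - px, o.y - py))
    r = math.hypot(closest.x - px, closest.y - py)   # object-to-touch distance
    sdr = (closest.x - r, closest.y - r, 2 * r, 2 * r)   # (x, y, w, h)
    return closest, sdr
```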
Fig. 4 shows a process 300 whereby the user 102 can operate the system of Fig. 1 for selecting the desired GUI object by providing positional inputs. The process commences at a step 302 at which stage the user 102 is ready to select the desired GUI object represented on the display area 402 of the display device 112. Thereafter, at a step 304 the user inputs an estimated coordinate position of the desired GUI object using the touch pad region 416 of the touch pad device 410. Thereafter, a testing step 314 determines whether the SDR is devoid of GUI objects. If there is no GUI object in the SDR, then the process 300 is directed in accordance with a "yes" arrow back to the step 304. If, on the other hand, a GUI object is found in the SDR, then the process 300 is directed in accordance with a "no" arrow to a step 306, in which the application 116 highlights the perimeter of the new SDR resulting from the user input, using the range mapping process.
The step 306 also remaps the touch pad region 416 of the touch pad 410 to just the highlighted new SDR. The remapping ensures that the entire dynamic range of the touch pad device 410 is applied to the new SDR, thereby providing the user 102 with a finer granularity of control within the new SDR.
In a following testing step 308 the application 116 determines whether there is only one GUI object in the SDR that has now been highlighted on the display area 402. If there is only one displayed GUI object in the SDR, then the process is directed in accordance with a "yes" arrow to a step 310 in which the application 116 selects and highlights that GUI object. The highlighting operation provides the user 102 with visual feedback 108 that he has, indeed, selected the desired GUI object. The process 300 thereafter terminates at a step 312.
Returning to the testing step 308, if there are GUI objects other than the desired GUI object in the SDR, then the process 300 is directed in accordance with a "no" arrow to the step 304, where the user provides another positional input to the touch pad device 410.
Fig. 5 depicts a flow chart of method steps by which the application 116 of Fig. 1 selects the desired GUI object using the positional inputs. The process 700 commences with a step 702 which establishes an (initial) range mapping between the display area 402 of the display device 112 and the selectable range of the touch pad device 410. The process 700 is then directed to a step 704 that receives estimated coordinate information, referred to the display area 402, for the desired GUI object. Thereafter, a step 706 determines a new range mapping between the display area 402 and the selectable range of the touch pad device 410. Thereafter, the step 708 determines a new SDR based upon the new range mapping, and the step 708 also highlights the new SDR to provide a visual feedback indication. Thereafter, a testing step 722 determines whether the SDR is devoid of GUI objects. If no GUI object is to be found, then the process 700 is directed in accordance with a "yes" arrow back to the step 704. If, on the other hand, a GUI object is to be found in the SDR, then the process 700 is directed in accordance with a "no" arrow to a subsequent step 710 which remaps the touch pad region 416 of the touch pad 410 to just the aforementioned highlighted SDR.
Thereafter, a testing step 712 ascertains whether objects other than the desired GUI object are present in the highlighted current SDR. If no other objects are detected, then the process 700 is directed in accordance with a "yes" arrow to a step 714 which highlights the representation of the GUI object, at which point the process 700 terminates.
If, on the other hand, other objects are detected in the step 712, then the process 700 is directed in accordance with a "no" arrow to the step 704 which receives a (new) set of estimated coordinates for the desired GUI object.
The steps 704-712 are repeated until a single object is present in the current SDR, at which point the process 700 is directed in accordance with the "yes" arrow to the step 714 which highlights the desired GUI object and terminates the process.
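The refinement loop of steps 704-712 can be sketched in a few lines: each touch shrinks the SDR around the indicated point, and selection completes when exactly one object remains inside. The halving rule used to shrink the region, and all names here, are illustrative assumptions; the patent leaves the exact shrink geometry to the range mapping arrangement in use.

```python
def select_object(inputs, objects, display):
    """Iteratively refine the Selected Display Region until one GUI
    object remains. `inputs` is a sequence of normalised (tx, ty) touch
    positions in [0, 1]; `objects` are (x, y) positions; `display` is
    (x0, y0, x1, y1). Returns the selected object, or None if the
    supplied inputs do not resolve the selection."""
    sdr = display
    for tx, ty in inputs:
        x0, y0, x1, y1 = sdr
        # Range mapping: the whole pad spans only the current SDR.
        px, py = x0 + tx * (x1 - x0), y0 + ty * (y1 - y0)
        # Assumed shrink rule: the new SDR is a half-size box around px, py.
        w, h = (x1 - x0) / 2, (y1 - y0) / 2
        sdr = (px - w / 2, py - h / 2, px + w / 2, py + h / 2)
        inside = [o for o in objects
                  if sdr[0] <= o[0] <= sdr[2] and sdr[1] <= o[1] <= sdr[3]]
        if len(inside) == 1:
            return inside[0]      # single object in the SDR: select it
    return None                   # selection not yet resolved
```

A single touch near the lower-left object of a two-object display already resolves the selection in this toy geometry: `select_object([(0.15, 0.2)], [(100, 100), (500, 400)], (0, 0, 640, 480))` returns `(100, 100)`.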
The step 702 can be implemented, in one arrangement indicated by reference numerals contained in a dashed box 718 connected by a dashed line 720 to the step 702 in question, by the processor 205, in conjunction with the touch pad device 410 and the display device 112, as can the step 710. Similarly, the step 704 can be implemented by the touch pad device 410, in conjunction with the processor 205. Similarly, the steps 706 and 712 can be implemented by the processor 205. Similarly, the steps 708 and 714 can be implemented by the processor 205, in conjunction with the video interface 207 and the display device 112.
As the user 102 directs her attention to successively smaller SDRs 404, 404', 404'', the relevant segment of the display area 402 becomes smaller and visual resolution decreases. The SDR can, however, be enlarged on the display so that each SDR is shown in magnified form, preferably using the entire display area 402. This displays the GUI objects in greater detail.
Fig. 6 shows this aforementioned arrangement whereby, as the attention of the user 102 is directed from a current SDR 502 to a new SDR 500, the new SDR 500 is magnified and mapped to the entire display area 402 as depicted by dashed lines 504, 508, 506 and 510. This provides the user 102 with a magnified view of the contents of the new SDR 500 as shown. In order to provide the user with a frame of reference as to where the magnified new SDR 500 lies in regard to the overall display area 402, a reduced representation of both the overall display area 402 and the new SDR 500 (called a "picture-in-picture view") can be shown, at 516 and 514. This picture-in-picture view can be intelligently placed at one of the corners of the display area 402 to avoid obstructing any of the GUI objects.
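The magnification is again a linear mapping, this time from the SDR out to the full display, and the "intelligent" corner placement can be approximated by counting how many GUI objects each candidate corner would cover. Both functions below are sketches under those assumptions; the patent gives no formulas or placement heuristic.

```python
def magnify(sdr, display):
    """Scale factors and origin that map the SDR (x0, y0, x1, y1) onto
    the full display area: a point p in the SDR is drawn at
    (sx * (p.x - x0) + ox, sy * (p.y - y0) + oy)."""
    x0, y0, x1, y1 = sdr
    dx0, dy0, dx1, dy1 = display
    return ((dx1 - dx0) / (x1 - x0), (dy1 - dy0) / (y1 - y0), dx0, dy0)

def pip_corner(objects, display, thumb_w, thumb_h):
    """Pick the display corner for the picture-in-picture view that
    obstructs the fewest GUI objects (an illustrative heuristic)."""
    dx0, dy0, dx1, dy1 = display
    corners = {
        "top-left": (dx0, dy0), "top-right": (dx1 - thumb_w, dy0),
        "bottom-left": (dx0, dy1 - thumb_h),
        "bottom-right": (dx1 - thumb_w, dy1 - thumb_h),
    }
    def covered(origin):
        ox, oy = origin
        return sum(ox <= x <= ox + thumb_w and oy <= y <= oy + thumb_h
                   for x, y in objects)
    return min(corners, key=lambda name: covered(corners[name]))
```

For example, a 100 x 100 SDR magnified onto a 640 x 480 display yields scale factors of 6.4 and 4.8, and an object near the top-left corner pushes the thumbnail to another corner.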
181202 616590.DOC IS) 202 61 6590.DOC 19- Having described selection of GUI objects using positional information, we turn to a second arrangement in which the user 102 can select a GUI object using Directional Inputs. In this arrangement, the touch pad region 416 of the touch pad device 410 is mapped to the entire display area 402 (see Fig. 3B). When the user observes the display 402 and notices a desired GUI object, he touches an initial position on the touch pad region 416 that corresponds to the perceived position of the desired GUI object on the display 402, and holds his finger down on the touch pad region 416 at the initially selected position. The touch pad device 410 transmits the coordinates of the touched position to the application 116. The application determines which is the nearest GUI object to the specified position and marks that GUI object in the memory 206 as a "targeted" GUI object. This currently targeted GUI object is highlighted in the display 402 to provide the user 102 with visual feedback of which GUI object has been targeted.
If the targeted GUI object is not the desired GUI object, the user uses the touch pad device 410 to indicate a direction to the desired GUI object by dragging his finger in the direction of that desired GUI object relative to the currently targeted and highlighted GUI object. The touch pad device 410 transmits the specified direction to the application 116 that then selects and highlights the GUI object that is nearest to the currently targeted GUI object in the specified direction. If that newly highlighted GUI object is the desired one, the user then lifts his finger off the touch pad to finalise the selection thereof.
Fig. 7 shows a process 600 for this arrangement whereby the user 102 can select a GUI object using directional input information. The process 600 commences at a step 602, at which juncture the user 102 is ready to select the desired GUI object. Thereafter, a step 604 receives an (initial) set of estimated coordinates for the desired GUI object from the user 102 who is using the touch pad device 410. Thereafter, at a step 606 the GUI object nearest to the aforementioned estimated coordinate position is identified, and in a following step 608, the identified GUI object is targeted and highlighted, thereby indicating to the user 102 which object has been designated.
In a following step 614, the user determines whether the targeted (highlighted) GUI object is, indeed, the desired object. If this is the case, then the process 600 is directed in accordance with a "yes" arrow to a step 616. At that point, the user 102 releases control of the touch pad 410, finalising the selection of the desired GUI object, and the process 600 then terminates at a step 622.
Returning to the testing step 614, if the highlighted GUI object is seen by the user 102 not to be the desired GUI object, then the process 600 is directed in accordance with a "no" arrow to a step 612. The user has up to this point kept his or her finger on the touch pad region 416. The user maintains his finger in contact with the touch pad region 416 and "drags" his finger in the direction of the desired GUI object. In a subsequent step 610 the application 116 identifies a GUI object that is "nearest" to the currently identified GUI object in the specified direction. This is described in more detail in regard to Fig. 8.
The process 600 is then directed back to the step 608 which highlights the newly identified GUI object.
The process 600 cycles through the steps 608, 614, 612 and 610, effectively pursuing the desired GUI object under the guidance of directional inputs provided by the user.
Fig. 8 shows one arrangement 900 by which directional inputs provided by the user 102 can be used in order to select a GUI object which is nearest to the currently identified GUI object, in the direction specified by the user. In Fig. 8, an object 920 is the desired GUI object; however, the user 102 has, by touching the touch pad region 416, indicated a location on the display area 402 corresponding to that of a cursor 904 as shown. Having regard to the object 920, and to other objects 914 and 902, the cursor 904 is closest to the object 902 as depicted by an arrow 906, and accordingly the object 902 is highlighted by the application 116. The user 102 receives the visual feedback 108 by virtue of this highlighting, and drags his finger along the touch pad region 416, thereby establishing a direction vector 908 originating from the cursor 904. While the user is dragging his or her finger along the touch pad, or, for that matter, while the user's finger is stationary upon the touch pad, noise can be generated by slight unintentional movements of the user's finger. Such noise can potentially have the undesirable effect of causing a different GUI object to be highlighted, rather than the desired GUI object. Noise filtering can, however, easily be applied when the present arrangement is implemented, thereby overcoming this problem. Thus, for example, any dragging movement whose magnitude falls below a certain threshold can be ignored. The direction vector 908 points along a dashed line 916 that is shown for convenience. The object 914 is closer to the cursor 904 than is the object 920, as shown by arrows 912 and 918 respectively. However, the object 914 falls outside the triangle 910 that establishes a maximum angular deviation about the line 916 into which GUI objects must fall in order to be selected.
Accordingly, the GUI object 920 is deemed to be the closest GUI object to the cursor 904 because the GUI object 920 falls within the maximum angular deviation established by the triangle 910.
Accordingly, the GUI object 920 is highlighted, and this information is fed back visually to the user 102.
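The Fig. 8 mechanism combines three ingredients: a noise threshold on drag magnitude, a maximum angular deviation about the drag direction (the triangle 910), and nearest-distance selection among the objects that survive the angular test. The sketch below implements that combination; the specific threshold values, the function names, and the choice to measure from the current position are illustrative assumptions.

```python
import math

def next_target(current, drag_vec, objects,
                max_angle_deg=30.0, noise_threshold=5.0):
    """Among objects other than `current`, pick the nearest one lying
    within `max_angle_deg` of the drag direction. Drags shorter than
    `noise_threshold` are ignored as unintentional finger jitter."""
    dx, dy = drag_vec
    if math.hypot(dx, dy) < noise_threshold:
        return current                         # filter out jitter
    drag_angle = math.atan2(dy, dx)
    best, best_dist = current, float("inf")
    for obj in objects:
        if obj == current:
            continue
        ox, oy = obj[0] - current[0], obj[1] - current[1]
        deviation = abs(math.atan2(oy, ox) - drag_angle)
        deviation = min(deviation, 2 * math.pi - deviation)  # wrap at pi
        dist = math.hypot(ox, oy)
        if deviation <= math.radians(max_angle_deg) and dist < best_dist:
            best, best_dist = obj, dist
    return best
```

With a rightward drag from `(0, 0)`, an object at `(5, 5)` (45 degrees off-axis) is rejected by the angular test even though it is nearer, and the on-axis object at `(20, 1)` is selected instead, mirroring how object 914 loses to object 920 in Fig. 8.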
Fig. 9 shows a system process 800 of method steps by which the application 116 running on the PC system 200 selects a GUI object on the display device 112 using directional input information from the user 102. The process 800 commences at a step 802 that receives an estimated coordinate position of the desired GUI object. Thereafter, a step 804 highlights a GUI object on the display device 112 that is closest to the aforementioned estimated coordinate position. A subsequent testing step 806 establishes whether the highlighted GUI object is, indeed, the desired GUI object. If this is the case, then the process 800 is directed in accordance with a "yes" arrow to a step 814 at which point the process 800 terminates. If, on the other hand, the highlighted object is not the desired object, then the process 800 is directed in accordance with a "no" arrow to a step 808 which receives a directional input that points towards the desired GUI object. The process 800 is then directed to a step 810 which highlights the "closest" GUI object, having regard to the aforementioned directional input. The process 800 is then directed in accordance with an arrow 812 back to the testing step 806.
The step 802 can be implemented, in one arrangement indicated by reference numerals contained in a dashed box 816 connected by a dashed line 818 to the step 802 in question, by the processor 205, in conjunction with the touch pad device 410, as can the step 808. Similarly, the step 804 can be implemented by the processor 205, in conjunction with the video interface 207 and the display device 112. Similarly, the steps 806 and 814 can be implemented by the processor 205. Similarly, the step 808 can be implemented by the touch pad device 410, in conjunction with the processor 205.
Industrial Applicability
It is apparent from the above that the arrangements described are applicable to the image processing industry.
The foregoing describes only some embodiments of the present invention, and modifications and/or changes can be made thereto without departing from the scope and spirit of the invention, the embodiments being illustrative and not restrictive.
There may be variations of the method depending on the type of control devices available and the user's preference or level of coordination. Thus, although the use of a touch pad device 410 is considered in the detailed description, other controls which provide positional and/or directional input can also be adopted. For example, a joystick or a multi-way rocker switch can be used instead for the positional input arrangement. For a joystick, one of the nine regions can be selected by moving the joystick to one of the nine positions and then pressing a joystick button. For a multi-way rocker switch, one of the nine regions can be selected by pressing the switch in one of the eight positions or the centre position.
Furthermore, if the remote control device is equipped with a mini-touch screen, a low-resolution image can be displayed on it to provide the user with better precision when inputting a position.
Furthermore, while the user is making a selection, the GUI objects are typically still moving in real-time, driven by the application 116. In order for the user to complete a selection, all the GUI objects in the SDR can be temporarily frozen until a GUI object selection is made.
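Temporarily freezing the objects in the SDR can be as simple as a flag that suspends each object's motion update while a selection is pending. The class and field names below are an assumed minimal mechanism; the patent only states that objects "can be temporarily frozen".

```python
class GuiObject:
    """Minimal sketch of a moving GUI object whose motion can be
    suspended while a selection is in progress."""
    def __init__(self, x, y, vx, vy):
        self.x, self.y, self.vx, self.vy = x, y, vx, vy
        self.frozen = False

    def tick(self, dt):
        if not self.frozen:          # motion suspended while frozen
            self.x += self.vx * dt
            self.y += self.vy * dt

obj = GuiObject(0, 0, 10, 0)
obj.frozen = True
obj.tick(1.0)    # no movement while a selection is pending
obj.frozen = False
obj.tick(1.0)    # motion resumes once the selection completes
```

In the described system the application 116 would set the flag on every object inside the current SDR when a selection sequence begins, and clear it once step 714 (or step 616) completes.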
In the context of this specification, the word "comprising" means "including principally but not necessarily solely" or "having" or "including" and not "consisting only of". Variations of the word "comprising", such as "comprise" and "comprises", have corresponding meanings.

Claims (18)

  1. A method of selecting a GUI object in a display space using a user controlled pointing device, the method comprising the steps of:
(i) establishing an initial range mapping between the pointing device and the display space;
(ii) defining an initial Selected Display Region (SDR) in the display space dependent on the initial range mapping;
(iii) receiving, from the pointing device, a set of coordinates in the initial SDR;
(iv) updating the range mapping dependent upon the received set of coordinates;
(v) redefining the SDR dependent upon the updated range mapping;
(vi) determining a presence of any GUI object within the redefined SDR;
(vii) selecting, if one GUI object is present in the redefined SDR, the GUI object; and
(viii) repeating, if more than one GUI object is present in the redefined SDR, steps (iii) to (viii).

  2. A method according to claim 1, wherein: step (vii) has an additional sub-step of providing a feedback signal to thereby identify the selected GUI object; and the method comprises a further step of: (ix) if no GUI objects are present in the redefined SDR, repeating steps (iii) to (ix).
  3. A method according to claim 1, wherein the GUI object in the step (vii) is the only GUI object in the redefined SDR.

  4. A method according to claim 1, wherein step (v) has an additional sub-step of providing a feedback signal to thereby identify the redefined SDR.

  5. A method according to claim 1, wherein step (iii) comprises an initial sub-step of: freezing motion of GUI objects in the display space until a coordinate position is received from the pointing device.

  6. A method according to claim 1, wherein the SDR is displayed in a magnified format in the display space.
  7. A method according to claim 1, wherein the display space is N-dimensional.
  8. A method according to claim 7, wherein: N = 2; and said pointing device is a touch pad.
  9. A method of selecting a moving GUI object in a display space using a pointing device, the method comprising the steps of:
(i) receiving an initial coordinate position from the pointing device;
(ii) receiving a directional signal from the pointing device; and
(iii) selecting the GUI object located closest to the initial coordinate position in the display space and in a direction dependent upon the directional signal,
wherein at least one of step (i) and step (ii) comprises an initial sub-step of freezing motion of the GUI objects in the display space until a selection is made.

  10. A method according to claim 9, wherein the display space is N-dimensional.
  11. A method according to claim 10, wherein: N = 2; and said pointing device is a touch pad.
  12. A method according to claim 9 wherein, when a new GUI object is selected, a previously selected GUI object is de-selected.
  13. A method according to claim 9 wherein, when a new GUI object is selected, a previously selected GUI object is not de-selected.
  14. An apparatus for selecting a GUI object in a display space using a user controlled pointing device, the apparatus comprising:
(i) means for establishing an initial range mapping between the pointing device and the display space;
(ii) means for defining an initial Selected Display Region (SDR) in the display space dependent on the initial range mapping;
(iii) means for receiving, from the pointing device, a set of coordinates in the initial SDR;
(iv) means for updating the range mapping dependent upon the received set of coordinates;
(v) means for redefining the SDR dependent upon the updated range mapping;
(vi) means for determining a presence of any GUI object within the redefined SDR;
(vii) means for selecting, if one GUI object is present in the redefined SDR, the GUI object; and
(viii) means for repeating, if more than one GUI object is present in the redefined SDR, steps (iii) to (viii).
  15. An apparatus according to claim 14, wherein the GUI object in the step (vii) is the only GUI object in the redefined SDR.
  16. An apparatus for selecting a moving GUI object in a display space using a pointing device, the apparatus comprising:
(i) means for receiving an initial coordinate position from the pointing device;
(ii) means for receiving a directional signal from the pointing device; and
(iii) means for selecting the GUI object located closest to the initial coordinate position in the display space in a direction dependent upon the directional signal,
wherein the motion of the GUI objects is frozen in the display space until a selection is made.
  17. A computer program for directing a processor to execute a procedure for selecting a GUI object in a display space using a user controlled pointing device, the program comprising:
(i) code for establishing an initial range mapping between the pointing device and the display space;
(ii) code for defining an initial Selected Display Region (SDR) in the display space dependent on the initial range mapping;
(iii) code for receiving, from the pointing device, a set of coordinates in the initial SDR;
(iv) code for updating the range mapping dependent upon the received set of coordinates;
(v) code for redefining the SDR dependent upon the updated range mapping;
(vi) code for determining a presence of any GUI object within the redefined SDR;
(vii) code for selecting, if one GUI object is present in the redefined SDR, the GUI object; and
(viii) code for repeating, if more than one GUI object is present in the redefined SDR, steps (iii) to (viii).

  18. A computer program according to claim 17, wherein the GUI object in the step (vii) is the only GUI object in the redefined SDR.
  19. A computer program for directing a processor to execute a procedure for selecting a moving GUI object in a display space using a pointing device, the program comprising:
(i) code for receiving an initial coordinate position from the pointing device;
(ii) code for receiving a directional signal from the pointing device; and
(iii) code for selecting the GUI object located closest to the initial coordinate position in the display space in a direction dependent upon the directional signal,
wherein the motion of the GUI objects is frozen in the display space until a selection is made.

  20. A computer program product including a computer readable medium having recorded thereon a computer program for directing a processor to execute a procedure for selecting a GUI object in a display space using a user controlled pointing device, the program comprising:
(i) code for establishing an initial range mapping between the pointing device and the display space;
(ii) code for defining an initial Selected Display Region (SDR) in the display space dependent on the initial range mapping;
(iii) code for receiving, from the pointing device, a set of coordinates in the initial SDR;
(iv) code for updating the range mapping dependent upon the received set of coordinates;
(v) code for redefining the SDR dependent upon the updated range mapping;
(vi) code for determining a presence of any GUI object within the redefined SDR;
(vii) code for selecting, if one GUI object is present in the redefined SDR, the GUI object; and
(viii) code for repeating, if more than one GUI object is present in the redefined SDR, steps (iii) to (viii).
  21. A computer program product according to claim 20, wherein the GUI object in the step (vii) is the only GUI object in the redefined SDR.
  22. A computer program product including a computer readable medium having recorded thereon a computer program for directing a processor to execute a procedure for selecting a moving GUI object in a display space using a pointing device, the program comprising:
(i) code for receiving an initial coordinate position from the pointing device;
(ii) code for receiving a directional signal from the pointing device; and
(iii) code for selecting the GUI object located closest to the initial coordinate position in the display space and in a direction dependent upon the directional signal,
wherein the motion of the GUI objects is frozen in the display space until a selection is made.

  23. A method substantially as described herein with reference to Figs. 1, 2, 3A-3D, and 4-6, or with reference to Figs. 1, 2, 3A, and 7-9.
  24. An apparatus substantially as described herein with reference to Figs. 1, 2, 3A-3D, and 4-6, or with reference to Figs. 1, 2, 3A, and 7-9.

  25. A computer program substantially as described herein with reference to Figs. 1, 2, 3A-3D, and 4-6, or with reference to Figs. 1, 2, 3A, and 7-9.
  26. A computer program product substantially as described herein with reference to Figs. 1, 2, 3A-3D, and 4-6, or with reference to Figs. 1, 2, 3A, and 7-9.

DATED this 23rd Day of August, 2005
CANON KABUSHIKI KAISHA
Patent Attorneys for the Applicant
SPRUSON & FERGUSON
AU2002320736A 2001-12-19 2002-12-19 Selecting Moving Objects on a System Ceased AU2002320736B2 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
AU2002320736A AU2002320736B2 (en) 2001-12-19 2002-12-19 Selecting Moving Objects on a System

Applications Claiming Priority (3)

Application Number Priority Date Filing Date Title
AUPR9630A AUPR963001A0 (en) 2001-12-19 2001-12-19 Selecting moving objects on a system
AUPR9630 2001-12-19
AU2002320736A AU2002320736B2 (en) 2001-12-19 2002-12-19 Selecting Moving Objects on a System

Publications (2)

Publication Number Publication Date
AU2002320736A1 AU2002320736A1 (en) 2003-07-10
AU2002320736B2 true AU2002320736B2 (en) 2005-09-08

Family

ID=39277064

Family Applications (1)

Application Number Title Priority Date Filing Date
AU2002320736A Ceased AU2002320736B2 (en) 2001-12-19 2002-12-19 Selecting Moving Objects on a System

Country Status (1)

Country Link
AU (1) AU2002320736B2 (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111870945A (en) * 2020-08-10 2020-11-03 网易(杭州)网络有限公司 Control selection method, device, host and medium

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103927101B (en) * 2014-03-27 2016-10-19 小米科技有限责任公司 The method and apparatus of operational controls

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US4987411A (en) * 1987-07-02 1991-01-22 Kabushiki Kaisha Toshiba Pointing apparatus
US5959628A (en) * 1994-06-28 1999-09-28 Libera, Inc. Method for providing maximum screen real estate in computer controlled display systems
US6362842B1 (en) * 1998-01-29 2002-03-26 International Business Machines Corporation Operation picture displaying apparatus and method therefor

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US4987411A (en) * 1987-07-02 1991-01-22 Kabushiki Kaisha Toshiba Pointing apparatus
US5959628A (en) * 1994-06-28 1999-09-28 Libera, Inc. Method for providing maximum screen real estate in computer controlled display systems
US6362842B1 (en) * 1998-01-29 2002-03-26 International Business Machines Corporation Operation picture displaying apparatus and method therefor

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111870945A (en) * 2020-08-10 2020-11-03 网易(杭州)网络有限公司 Control selection method, device, host and medium

Similar Documents

Publication Publication Date Title
US7451408B2 (en) Selecting moving objects on a system
US5757361A (en) Method and apparatus in computer systems to selectively map tablet input devices using a virtual boundary
JP3168156B2 (en) Cursor movement control device
JP4035497B2 (en) Image display system, image display apparatus, image display method, and program
JP4482561B2 (en) Common on-screen zone for menu activation and stroke input
JP4370326B2 (en) Manipulating on-screen objects using zones surrounding the object
US20170115871A1 (en) Disambiguation of Multitouch Gesture Recognition for 3D Interaction
KR100686165B1 (en) Portable terminal having osd function icon and method of displaying osd function icon using same
JP4389090B2 (en) Information display device
JP3996852B2 (en) Remote control with touchpad for highlighting preselected parts of displayed slides
US20020015064A1 (en) Gesture-based user interface to multi-level and multi-modal sets of bit-maps
EP2256609A2 (en) Mobile device capable of touch-based zooming and control method thereof
JP2010086230A (en) Information processing apparatus, information processing method and program
KR100950080B1 (en) Method of controlling software functions, electronic device, and computer program product
JP2003296012A (en) System for inputting and displaying graphic and method of using interface
KR20090116435A (en) Method and apparatus processing image based on the first inputted command
JP5981175B2 (en) Drawing display device and drawing display program
JPH11249782A (en) Terminal equipment having plural windows, window display method and recording medium recording window display control program
TWI442305B (en) A operation method and a system of the multi-touch
AU2002320736B2 (en) Selecting Moving Objects on a System
US20070006086A1 (en) Method of browsing application views, electronic device, graphical user interface and computer program product
US20100017757A1 (en) Method and system to reduce workload and skills required in usage of mouse or other pointing devices
JP2005100132A (en) Display control device
JP4457130B2 (en) Display input device
JP2006330848A (en) Image editing system, method, and program

Legal Events

Date Code Title Description
FGA Letters patent sealed or granted (standard patent)
MK14 Patent ceased section 143(a) (annual fees not paid) or expired