WO2001016688A1 - User interface and method thereof - Google Patents

User interface and method thereof

Info

Publication number
WO2001016688A1
Authority
WO
WIPO (PCT)
Prior art keywords
cursor
software product
product according
display
display region
Prior art date
Application number
PCT/AU2000/001030
Other languages
French (fr)
Inventor
Mark Henry Lateo
Original Assignee
Telecottage Computers Pty Ltd
Priority date
Filing date
Publication date
Priority claimed from AUPQ2537A external-priority patent/AUPQ253799A0/en
Priority claimed from AUPQ5358A external-priority patent/AUPQ535800A0/en
Application filed by Telecottage Computers Pty Ltd filed Critical Telecottage Computers Pty Ltd
Priority to AU69735/00A priority Critical patent/AU6973500A/en
Publication of WO2001016688A1 publication Critical patent/WO2001016688A1/en

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0481Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F3/04812Interaction techniques based on cursor appearance or behaviour, e.g. being affected by the presence of displayed objects


Abstract

A computer software product stored on a computer readable medium and executable by a processor of a computational device, said software product including: display instructions to generate a user interface display upon a display device coupled to said processor; cursor instructions to display a movable cursor in response to data from a pointing device, and motion detection instructions to detect dynamic interaction of said cursor with a region of the display corresponding to at least one predetermined interaction; wherein said product operatively initiates a function associated with said region upon detecting corresponding predetermined interaction.

Description

USER INTERFACE AND METHOD THEREOF
FIELD OF THE INVENTION
This invention relates to a user interface and method thereof. This invention has particular, but not exclusive, application to a user interface of a computerised system or device and a method thereof, and for illustrative purposes reference will be made to such an application. However, it is to be understood that this invention could be used in other applications, such as equipment controlled via a programmable logic controller interface, televisions, appliances with screens and other screen based apparatus.
BACKGROUND OF THE INVENTION
In computers, a command is a specific order from a user to the computer's operating system or to an application program to perform a service or task. Most present day computer systems employ a graphical user interface (GUI) to let a user enter commands and select options through objects displayed on the screen. Displayed objects are pictorial representations such as windows, menus, icons, images, graphics, text, links, buttons, dialog boxes and the like. Examples of GUIs include the Microsoft Windows and NT, IBM OS/2, Apple Computer Mac OS, Linux, Novell and Unix operating systems.
To navigate within a GUI, a screen cursor or pointer enables a user to select individual points on the display screen. The cursor is typically displayed as a small arrow, an I-beam or a grabber. The cursor is moved to a desired screen location by the user in response to moving a pointing device, such as a mouse, trackball, joystick, touch pad or the like. Pointing devices have one or more buttons or switches which enable a user to specify further input by pressing and quickly releasing, i.e. "clicking", the button or switch. Accordingly, a user can issue a command by moving the pointing device to position the cursor onto or near a desired command selection or object, such as a menu item or icon, and clicking the button. Many operations require clicking the button twice in rapid succession, known as a "double click".
This pressing or releasing of the button to issue commands may cause stress on hand movements in fingers, wrists and tendons, leading to injuries such as repetitive strain injury (RSI). Further, a pointing device requires a level of precision and it can be difficult for the elderly, children or people with disabilities to coordinate clicking efficiently. Constant clicking may also lead to mechanical failure of the pointing device. We have found a graphical user interface which presents the user with an alternative to the clicking of a pointing device.
SUMMARY OF THE INVENTION
Accordingly, in one aspect this invention resides in a computer software product stored on a computer readable medium and executable by a processor of a computational device, said software product including: display instructions to generate a user interface display upon a display device coupled to said processor; cursor instructions to display a movable cursor in response to data from a pointing device, and motion detection instructions to detect dynamic interaction of said cursor with a region of the display corresponding to at least one predetermined interaction; wherein said product operatively initiates a function associated with said region upon detecting the corresponding predetermined interaction.
The computational device may be a computer system, such as a desktop or laptop computer; a computerised device, such as a mobile phone, personal digital assistant, electronic organiser or electric appliance or instrument; equipment controlled via a programmable logic controller interface; a television; or any other screen based apparatus. For the purpose of illustration only, we will hereinafter refer to a computer system or computerised device. However, it is to be appreciated that a person skilled in the art will be able to include the invention as described into any electronic apparatus having a display.
The display may be any suitable output surface and projection mechanism that shows text and/or graphic images to the user. The pointing device may be any suitable device for controlling the movement of the cursor on the display. For example, the pointing device may be a mouse, trackball, touch pad, joystick, laser or optical pointer or the like.
The region of the display may be the whole screen of the display, a portion of the screen or a displayed object. The display region may be mono-functional, being capable of initiating a single designated function. For example, if the display region is an object, then the response may be activation of the object's associated function. Alternatively, the display region may be multi-functional, responding with any number of specified functions depending on the number of predetermined interactions associated with the display region. For example, the function may be activation of an object, the launching of a specified program, allowing a user security access to the computer, shutting down the computer or any other known computer function.
To initiate an associated function of a desired region of the display, the user moves the cursor to dynamically interact with the display region. It is to be understood that a dynamic interaction requires more than simply pointing the cursor at a display region to enable the region to recognise the cursor, such as when a user highlights an object or moves through a menu without activating a command. To be dynamic, the cursor may interact with the display region in a state of action based on time, direction, speed, duration, or a combination thereof.
Examples of dynamic interactions of a cursor with a display region may be as follows:
• The user may locate the cursor to remain within the desired display region for a period of time, for example at least 5 seconds. The cursor's delay on the display region may be static. Alternatively, movement of the cursor may be allowed as long as the cursor remains within the borders of the display region for the defined period.
• The user may move the cursor over a display region with a combination of speed and direction. For example, the user may move the cursor over the display region in a fast vertical direction or a slow horizontal trajectory.
• The user may manipulate the cursor to move on and off a display region a number of times. For example, the cursor may be moved onto a display region two or more times within a period of time to activate a response from the display region. Alternatively, the cursor can be moved onto a display region two or more times, each touch requiring a set duration of time, for example one second.
• The user may move the cursor in a particular pattern or sequence within or over the display region. For ease of reference, we will refer to this type of dynamic interaction as a cursor movement pattern. For example, a cursor movement pattern over a display region may include one or more of the following:
> rotating the cursor in a clockwise or anticlockwise direction
> sliding the cursor from left to right or right to left
> sliding the cursor up or down
> moving the cursor in a defined way or set pattern (e.g. a wave, a zigzag or an alphabet character)
> moving the cursor diagonally
The above dynamic interactions are given by way of example and it is to be appreciated that any dynamic interaction of the cursor according to the invention would be suitable depending on the shape and size of the display region.
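By way of illustration only, the dwell-time interaction described above could be implemented with ordinary pointer events. The following is a minimal Javascript sketch using modern DOM APIs, not code from this specification; the element identifier, the five second threshold and the activation callback are assumptions made for the example.

```javascript
// Minimal sketch: activate a display region when the cursor dwells on it for
// a set period (here 5 seconds) without any button click. Movement inside
// the region is permitted; leaving the region cancels the timer.
function attachDwellActivation(element, onActivate, dwellMs = 5000) {
  let timer = null;

  element.addEventListener('mouseenter', () => {
    // Start the dwell timer when the cursor enters the display region.
    timer = setTimeout(() => onActivate(element), dwellMs);
  });

  element.addEventListener('mouseleave', () => {
    // Cancel the timer if the cursor leaves before the period elapses.
    clearTimeout(timer);
    timer = null;
  });
}

// Hypothetical usage: the element id and the handler are illustrative only.
attachDwellActivation(
  document.getElementById('print-button'),
  (el) => console.log('dwell activation of', el.id)
);
```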
The display region, facilitated by the motion detection instructions, may detect the dynamic interaction of the cursor, such as the cursor movement patterns, by any suitable method. In one embodiment, the display region includes a two-dimensional or three-dimensional matrix. For example, the matrix may include a lattice or web of lines. Suitably, the matrix includes a grid of lines extending in x-y directions or x-y-z directions on the display. The matrix may overlay the display region and may be visible or invisible to the user.
The lines of the matrix may detect the cursor as it moves across the lines and/or the intersections of lines, and in particular may detect cursor movement patterns. For example, the lines of the matrix may recognise mouse-move events, mouse-over events or other corresponding or similar pointing device movement events. The cursor movement patterns may be recognised if the user moves the cursor in the required pattern in a particular region of the matrix, i.e. the pattern may be embedded in a particular area of the matrix. For example, the user may have to move the cursor over a number of specified x-y or x-y-z coordinates in the matrix to initiate a response from the display region. Alternatively, the matrix may recognise the cursor movement pattern by the relative directional movement of the cursor over the lines, allowing the user to enter the cursor movement pattern anywhere on the matrix. For example, the matrix may be able to follow the relative direction and distance between the coordinates that the cursor has moved over, enabling the recognition of a specified cursor movement pattern.
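One reading of the relative-direction alternative above is that the matrix reduces raw cursor coordinates to a short sequence of directional steps and compares that sequence against a stored cursor movement pattern. The sketch below illustrates this idea under stated assumptions: the 40 pixel cell size, the four-letter direction alphabet and the 'DUDU' pattern (a crude 'W' drawn with steep strokes) are invented for the example and are not taken from the specification.

```javascript
// Minimal sketch: quantise cursor motion into grid cells and record the
// dominant relative direction of each cell-to-cell step ('R', 'L', 'U', 'D').
// A cursor movement pattern then becomes a short string of directions, so it
// can be recognised anywhere on the matrix, independent of screen position.
const CELL = 40;            // assumed grid spacing in pixels
let lastCell = null;
let trace = '';

document.addEventListener('mousemove', (e) => {
  const cell = { x: Math.floor(e.clientX / CELL), y: Math.floor(e.clientY / CELL) };
  if (lastCell && (cell.x !== lastCell.x || cell.y !== lastCell.y)) {
    const dx = cell.x - lastCell.x;
    const dy = cell.y - lastCell.y;
    const dir = Math.abs(dx) >= Math.abs(dy) ? (dx > 0 ? 'R' : 'L') : (dy > 0 ? 'D' : 'U');
    if (dir !== trace[trace.length - 1]) trace += dir;   // collapse repeated directions
    trace = trace.slice(-16);                            // keep a short rolling window
    if (trace.endsWith('DUDU')) {                        // hypothetical 'W'-like pattern
      console.log('cursor movement pattern recognised');
      trace = '';
    }
  }
  lastCell = cell;
});
```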
The matrix may be multifunctional being capable of recognising a plurality of cursor movement patterns. Each cursor movement pattern may have a specified associated function. For example, a cursor movement pattern in the form of the letter "W" may execute a word processing document and a cursor movement pattern in the form of the letter "X" may shut down the computer. Further, the matrix may be used for security, wherein the user must enter a preset cursor movement pattern before the computer will operate. This security measure may be used as an adjunct to the password.
In another embodiment, the display region may include two or more recognition areas, each of which independently detects the cursor passing thereover. For example, each recognition area may detect a mouse-move event, a mouse-over event or other corresponding or similar pointing device movement event. The recognition areas may be associated with the display region by being randomly arranged within the display region. The user can then manipulate the cursor to randomly move over the area of the display region until all recognition areas within the display region have detected the cursor, thereby activating the associated function. Suitably, the recognition areas are arranged in a pattern within the display region. For example, the recognition areas may form arrangements such as a line, rows/columns, a circle or any other shape. The recognition areas may not respond to the cursor unless the cursor moves over the recognition areas within a display region in a particular sequence or order. For example, if the recognition areas are in a horizontal line they may only detect the cursor if it is moved sequentially from extreme left to extreme right or vice versa. Accordingly, a user does not need to be aware of the position of the recognition areas in a display region, just the specific cursor movement pattern that is required to enable cursor recognition by all recognition areas, activating a response from the display region.
The invention as described may eliminate the need for pressing or clicking the button to instigate a response from a display region. However, it is to be appreciated that a display region may still retain its traditional functionality of responding to traditional clicking, so that the user may have a choice. It is also to be appreciated that a response may be activated from a display region having a single recognition area. However, to prevent a false trigger of the display region from random movements of the cursor over the display screen by the user, it is preferred that at least two recognition areas are included in the display region. Suitably, the display region including recognition areas is a displayed object. However, it is to be appreciated that the recognition areas may be arranged over the whole display screen or a portion thereof, such as in an arrangement of rows and columns.
When a display region initially detects the cursor, the particular dynamic interaction required to activate the response from the display region may be illustrated or presented to the user. For example, the illustration may be a graphical representation, such as an animated image or a window overlay depicting the required cursor movement pattern, or a sound such as a spoken instruction. The illustration may also include a guide for directing the cursor to perform the required cursor movement pattern in the appropriate display region.
Feedback may be provided to a user. Feedback may occur as the cursor is recognised by each recognition area in a display region and/or upon activation of a response from the display region. The feedback can be in the form of a graphic, sound, multimedia or a combination thereof. For example, audio feedback in the form of sounds and/or visual feedback in the form of various shapes, graphics and multimedia in the form of video may be generated in response to detection of the cursor by a recognition area or activation of the response. It is understood that the feedback is optional and may be customised to suit the user. Computer applications employing the user interface of the invention may include a library of the feedback objects.
The software product according to the invention may be integrated into the operating system, affecting all software applications in a computer system. For example, the desktop, control buttons and menus in a software program or a suite of similar software programs, for example Microsoft Office, may integrate a software product according to the invention during manufacture by the OEM (Original Equipment Manufacturer). Alternatively, the software product of the invention may interact with a traditional operating system and/or other software applications by associating at least one predetermined interaction with display regions that initiate a function in the traditional computer application and generating appropriate recognition areas and/or a matrix to detect dynamic interaction of the cursor.
In a further alternative, the software product of the invention may enhance or augment the operating system and/or other software applications to recognise traditional objects and convert them to objects in accordance with the present invention. A traditional object that is activated by the pressing or clicking of a switch or button on the pointing device may be converted to an object which responds to a specific dynamic cursor interaction, such as a cursor movement pattern, by generating recognition areas or a matrix within the object. The software product of the invention, whether part of the operating system or another software application, may scan any of the software programs in a computer system or instruct another software program, such as a plug-in, to scan for specific codes related to objects that are desired to be converted. For example, if the program that requires converting is an Internet interface, the HTML in a Web page may be scanned for a specific code, such as a Uniform Resource Locator (URL), indicating a hyperlink. When hyperlinks are located in the Web page, the software product according to the invention may generate recognition areas or a matrix with the located hyperlinks, converting them to objects that have at least one predetermined interaction.
A person skilled in the art would be able to use any suitable object orientated programming language or technologies such as C++, Java, Javascript, VB, ActiveX and the like to program a software product in accordance with the invention. The invention as described may be employed in object based computer systems and software applications currently known in the art or be employed to create new GUIs, operating systems or software applications. Applications related to the Internet, such as Web browsers, interfaces, plug-ins and Web page editors, are particularly suitable to utilise the invention as described as they employ hyperlinks and other objects to navigate through Web pages. However, any program or device that employs objects such as program interfaces and menu and icon systems, for example word processors, spreadsheets, databases, games, utilities, CAD etc., may employ the invention. Even accessories, such as screen savers, may integrate the invention.
In another aspect, this invention resides in a computational device implemented method for a user to interface with a computer system of the type operatively generating a display and a pointing device operated cursor on said display, the method including the steps of: monitoring dynamic interaction of said cursor with a region of the display in order to detect a predetermined interaction; and activating a function associated with said predetermined interaction upon detecting said interaction.
In a further aspect this invention resides in a user interface including: a display; a pointing device operated cursor on said display, and a display region responsive to a selective dynamic interaction with said cursor.
In a still further aspect this invention resides in a method of activating a displayed object in a computer system operable by a pointing device and cursor and including the steps of: generating two or more recognition areas associated with said displayed object, each said recognition area being responsive to detecting the cursor, and moving the cursor over said object by the user, wherein a response from all of said recognition areas associated with the said object activates said object.
BRIEF DESCRIPTION OF THE FIGURES
In order that this invention may be more readily understood and put into practical effect, reference will now be made to the accompanying drawings which illustrate preferred embodiments of the invention and wherein:
FIG. 1 illustrates a block diagram of a typical prior art personal computer;
FIG. 2 illustrates the sequential movement of a cursor from left to right over an object to cause activation of the object;
FIG. 3 illustrates the sequential movement of a cursor in a clockwise rotational movement over an object to cause activation of the object;
FIG. 4 illustrates the movement of the cursor from the top to the bottom of an object to cause activation of the object;
FIG. 5 illustrates the movement of the cursor in a zig-zag pattern over an object to cause activation of the object;
FIG. 6 illustrates the horizontal movement of the cursor from the extreme right to the extreme left over an object to cause activation of the object;
FIG. 7 illustrates the rotational movement of the cursor in an anticlockwise direction over an object to cause activation of the object;
FIG. 8 illustrates the movement of the cursor in a wave pattern over an object to cause activation of the object;
FIG. 9 illustrates a pop-up window overlay displaying the particular cursor movement pattern required to activate an object;
FIG. 10 illustrates a "tractor-beam" facility which ensures that the cursor locks onto the window overlay of FIG. 9;
FIG. 11 illustrates a custom pointer which may pop up to indicate to a user the required cursor movement pattern to activate an object;
FIG. 12 illustrates a display region including a matrix overlaying the entire computer display;
FIG. 13 illustrates a cursor movement pattern on the matrix;
FIG. 14 illustrates a matrix visible on a portion of the display;
FIG. 15 illustrates an invisible matrix on a computer display; and
FIG. 16 illustrates a flow chart for the steps required to initiate a function associated with a region of the display in accordance with the invention.
DETAILED DESCRIPTION OF PREFERRED EMBODIMENTS
With reference to Figure 1, there is depicted a block diagram of a typical prior art personal computer 20. The personal computer of Figure 1 includes a processor 22 which is coupled to a memory 24, including primary volatile memory, for example RAM, and secondary memory devices, for example hard-drives and
CD-ROMs. Processor 22 executes a software product in the form of an operating system stored in memory 24. The processor usually executes further software products also stored in memory.
Processor 22 is also coupled to data entry modules such as a user controlled motion sensor 28 and a data entry device, for example a keyboard 30. Motion sensor 28 may be in any one of several forms. For example, it may be a mouse, trackball, joystick, touch pad or the like. Pointing devices have one or more buttons or switches which enable a user to specify further input by pressing and quickly releasing, i.e. "clicking", the button or switch.
Data generated by the processor under control of software 32 may be displayed on a display screen 26.
Most present day computers are operated by an operating system which contains instructions that generate a graphical user interface which is displayed upon display screen 26. The operating system GUI typically displays icons which are pictorial representations of functions that the operating system, or a software application run under the operating system, is capable of performing.
Many software products are programmed in object oriented programming languages, so image data representing an icon is contained in a field of an object while coded instructions of a function that the object is capable of executing are associated with another field of the same object. In order to initiate execution of a function, the operating system contains programmed instructions which convert positional data received from motion sensor 28 into a user movable icon, known as a cursor, displayed upon screen 26 as part of the GUI. As the motion sensor 28 is moved, the positional data provided to processor 22 changes and is processed by the operating system to correspondingly change the position of the cursor displayed upon display 26. The operating system monitors the position of the cursor, usually by updating the cursor's screen coordinates.
The ranges of screen coordinates occupied by stationary icons or other features of the GUI are also accessible to the operating system. Typically, the operating system contains instructions for determining when the cursor coordinate range is overlapping the coordinate range of a stationary object on the GUI screen, for example a functional icon. In that event the operating system typically changes the shape of the stationary icon as a prompt for the operator of the motion sensor 28 to enter a command confirming that the function associated with the stationary icon is to be initiated. At present, confirmation is typically initiated by clicking on a button of motion sensor 28, thereby transmitting a confirmation signal to processor 22. The present invention will be described bearing the foregoing in mind.
Accordingly in the interests of brevity it is to be understood that the software product of the present invention contains instructions including computer instructions facilitating the functionality that will be described. Such instructions may include display instructions to generate a user interface, cursor instructions to display a movable cursor in response to data from a pointing device and motion detection instructions to detect dynamic interaction of said cursor with a region of the display. The actual programming of a software product to implement the functionality is straightforward and will not be described in detail.
The present invention is particularly well-suited to object based computer systems that use pointing devices such as a mouse, joystick, trackball or touch pad to move a cursor on a display screen by corresponding movement of the pointing device. For the purposes of illustration only, the invention will now be described in connection with the more popular mouse-based system. It is appreciated that the present invention can also be employed in any computational device such as mobile phones, personal digital assistants, electronic organisers, screen based electric appliances and instruments, equipment controlled via a programmable logic controller interface and other screen based apparatus.
EXAMPLE 1
Referring to FIG. 16, there is illustrated a flow chart for the steps required to initiate a function associated with a region of the display according to the invention as described. As shown in steps 40 to 42, the operating system contains programmed instructions which monitor and convert positional data received from a user controlled motion sensor, such as a pointing device, into a user movable cursor displayed upon the screen as part of the GUI. These steps are known in the art.
The operating system contains instructions for determining when the cursor coordinate range is overlapping the coordinate range of a specific region of the screen that will initiate a desired function. In step 43, the dynamic interaction of the cursor is detected and compared against the predetermined interaction associated with the specified region. At step 44, a positive match of the detected dynamic interaction with a corresponding predetermined interaction initiates the desired associated function in a next step 45. After initiating the associated function, or if the detected dynamic interaction of the cursor does not match the predetermined interaction, the operating system continues to monitor position data from the motion sensor, returning to step 40.
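For illustration only, the flow of FIG. 16 can be restated as an event loop that feeds cursor samples to any registered display region and compares the accumulated dynamic interaction against that region's predetermined interaction. The sketch below is a hypothetical Javascript rendering of steps 40 to 45, not code disclosed in the specification; the element id and the simple two second dwell test used at registration are placeholders.

```javascript
// Minimal sketch of the FIG. 16 loop: positional data arrives as mousemove
// events (steps 40-42); each event is routed to any registered display region
// under the cursor, which compares the accumulated dynamic interaction against
// its predetermined interaction (step 43) and, on a positive match (step 44),
// initiates the associated function (step 45) before monitoring resumes.
const regions = [];

function registerRegion(element, matchesPredeterminedInteraction, initiateFunction) {
  regions.push({ element, matchesPredeterminedInteraction, initiateFunction, samples: [] });
}

document.addEventListener('mousemove', (e) => {
  for (const region of regions) {
    const box = region.element.getBoundingClientRect();
    const inside = e.clientX >= box.left && e.clientX <= box.right &&
                   e.clientY >= box.top && e.clientY <= box.bottom;
    if (!inside) { region.samples = []; continue; }               // cursor left the region: reset
    region.samples.push({ x: e.clientX, y: e.clientY, t: e.timeStamp });
    if (region.matchesPredeterminedInteraction(region.samples)) { // steps 43-44
      region.initiateFunction();                                  // step 45
      region.samples = [];                                        // return to monitoring (step 40)
    }
  }
});

// Hypothetical registration: the element id and the two second dwell test
// are placeholders, not values taken from the specification.
registerRegion(
  document.getElementById('open-document'),
  (samples) => samples.length > 1 && samples[samples.length - 1].t - samples[0].t > 2000,
  () => console.log('associated function initiated')
);
```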
In preferred embodiments of the invention, the dynamic interaction of a cursor may be detected by cursor recognition areas or a matrix included in the region of the display, which are described in the following examples.
EXAMPLE 2
Generally, computer applications are event-driven. Events are actions that occur as the result of the user doing something. For example, the user may move the cursor over a link, click a button or type in a text field. Each action creates an event and a computer application is programmed to react to events in different ways according to the type of event. For example, a mousemove event occurs when the user moves the cursor. A computer application can recognise a mousemove event when the cursor is positioned within the borders of an object. This is sometimes referred to as a mouseover, where an object changes or responds to the cursor as it passes over that object. The terms mousemove and mouseover are elements used in a programming language such as Javascript. It is to be appreciated that a person skilled in the art will be able to manipulate corresponding elements in other programming languages to achieve the same outcome.
According to one embodiment of the invention, two or more cursor recognition areas are generated within a display region, such as an object. Each recognition area recognises a mousemove or mouseover event when the cursor passes within the borders of a respective recognition area. Therefore, the object recognises the cursor anywhere within its borders and each recognition area within an object also independently recognises the cursor within its respective border. The object is not activated unless the cursor moves over all the recognition areas within the object in a particular sequence.
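A minimal sketch of this embodiment follows, assuming each recognition area is a child element of the object listening for mouseover events, and that the object activates only when all areas fire in a fixed left-to-right order. The class name, the reset on leaving the object and the synthetic click at the end are assumptions for illustration, not details from the specification.

```javascript
// Minimal sketch: an object is activated only when the cursor passes over all
// of its recognition areas in the expected order. The recognition areas are
// assumed to be child elements of the object carrying an agreed class name.
function attachSequenceActivation(objectEl, onActivate) {
  const areas = Array.from(objectEl.querySelectorAll('.recognition-area')); // assumed class
  let next = 0;                                   // index of the next area expected

  areas.forEach((area, index) => {
    area.addEventListener('mouseover', () => {
      if (index === next) {
        next += 1;                                // correct order: advance
      } else {
        next = index === 0 ? 1 : 0;               // wrong order: start the sequence again
      }
      if (next === areas.length) {
        next = 0;
        onActivate(objectEl);                     // all areas triggered: activate the object
      }
    });
  });

  // Leaving the object resets the sequence, preventing accidental activation.
  objectEl.addEventListener('mouseleave', () => { next = 0; });
}

// Hypothetical usage: activate the object as if it had been clicked.
attachSequenceActivation(document.getElementById('telephone-link'), (el) =>
  el.dispatchEvent(new MouseEvent('click', { bubbles: true }))
);
```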
Referring to FIGS. 2 to 8, the movement of a cursor 10, shown as an arrow pointer, across an object 11 is illustrated by the dotted path. The object 11 includes a plurality of recognition areas 12. Referring to FIGS. 2 and 3, the recognition areas 12 are arranged respectively in a horizontal line and two rows. The user moves the cursor 10 across the display screen by manipulation of the mouse to a desired object 11 to be activated. FIG. 2 depicts the sequential horizontal movement of the cursor 10 across the object 11 from left to right in a straight-line trajectory. FIG. 3 depicts the sequential rotational movement of the cursor 10 in a clockwise direction. As the cursor 10 moves within the object 11, each recognition area 12 detects the cursor 10 passing thereover. When all recognition areas 12 have detected the cursor 10, the object 11 is activated to perform its associated function or action. In order to prevent false activation of the object it is preferred that the recognition areas 12 are triggered in a particular sequence or order. For example, an object may be activated if the cursor 10 is moved in a left to right horizontal trajectory across the object, but may not be activated if the user moves the cursor from right to left. FIGS. 4 to 8 illustrate further examples of cursor movement patterns that can be used to activate objects.
The recognition areas 12 are highlighted in the figures to assist in the understanding of the invention. This is an optional feature and it is to be appreciated that the user generally just has to be aware of the specified cursor movement pattern required to activate an object 11 and not the exact position of the recognition areas 12 within an object 11.
Feedback can be provided to a user when the cursor 10 has been detected within the borders of a recognition area 12 and/or when an object 11 is activated. For example, audio feedback in the form of sounds and/or visual feedback in the form of various shapes and graphics may be generated in response to detection of the cursor 10 within a recognition area 12 and/or upon activation of the object 11. This is illustrated in FIGS. 2 and 3, wherein each recognition area 12 generates a graphic image of a star as the cursor 10 moves across the object 11. Feedback is optional and may be customised to suit the user. A computer application may become easier to manipulate and control when employing the invention. An object 11, whether a button, icon, hyperlink etc., retains its basic functionality, however it has the advantage of being able to be activated without clicking. A pattern or sequence of recognition areas 12 can be individually and independently programmed by the user into an object 11 to give it greater flexibility in activation.
EXAMPLE 3
The invention as described may be incorporated into navigation software for Internet/Intranet World Wide Web browsers. An Internet Web browser is the software interface that allows a computer user to control their position on Web pages. The World Wide Web (or Web for short) is a massive collection of data stored globally on computers. These computers are connected together on a large network, collectively known as the Internet. Web pages are connected by links known as hyperlinks. To move from one page to the next a user clicks on a hyperlink with the mouse or other pointing device and a new Web page is downloaded replacing the existing Web page on the computer screen. There are two main types of hyperlinks: text and graphical. A typical text hyperlink is generally represented as an underlined word. For example, a text hyperlink may look like this: telephone. Alternatively, a hyperlink may be represented as a graphic (image, icon, photo) hyperlink, as illustrated in FIG. 8.
Every Web page has an address or URL (Uniform Resource Locator). Clicking on the hyperlink transports the user to the address or URL that was programmed into the hyperlink. On a newly downloaded Web page, new hyperlinks would offer the opportunity to access even more Web pages. Traditional navigation of the Web makes a user repeat this process constantly, adding up to a lot of mouse clicks. On a Web page in accordance with the invention as described, the hyperlinks can be automatically activated by a cursor movement pattern without the clicking of a mouse button.
Further, software, such as a Web browser plug-in, may be programmed to automatically search a Web page and convert hyperlinks into objects that are activated by a specific cursor movement pattern. For example, the software may scan the HTML in a Web page looking for the following URL code: <A HREF= >. HTML is the programming language that produces Web pages. A URL is the address of a Web page. Hyperlinks are attached to this, for example <A HREF="http://www.name.com.au"></A>. Upon locating the hyperlink, recognition areas may be generated in the object. The software may be a plug-in that is loaded into the Internet browser by the user and that generates recognition areas in traditional objects. In addition to generating recognition areas, the software may individually customise the hyperlink being converted. For example, the software may also convert the actual size, shape, colour, graphic, icon, font and style of the hyperlinks. The software may be turned on and off within the Internet browser, operating system or application at any time, thereby deactivating the converted hyperlinks as required. The original HTML tags will not be modified, thereby retaining the integrity of the original Web page on the server. The user may also have a choice of the particular pattern of the recognition areas generated in an object, customising a preferred cursor movement pattern for activating objects to suit the user.
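By way of illustration only, a modern browser script could approximate this conversion by enumerating anchor elements and replacing the click requirement with a left-to-right sweep across logical recognition strips. The sketch below is an assumption-laden rendering of the idea, not the plug-in described above: the four-strip division and the use of current DOM APIs are invented for the example, and a very fast sweep may skip a strip and reset.

```javascript
// Minimal sketch: convert each hyperlink on the page into an object that is
// activated by sweeping the cursor across it from left to right rather than
// by clicking. The link is divided logically (not in the HTML) into a number
// of recognition strips; all strips must detect the cursor in order, so the
// original markup and href are left unmodified and plain clicks still work.
const STRIPS = 4;                                        // assumed number of strips

function convertHyperlink(link) {
  let reached = -1;                                      // rightmost strip reached, in order

  link.addEventListener('mousemove', (e) => {
    const box = link.getBoundingClientRect();
    const strip = Math.min(STRIPS - 1,
      Math.floor(((e.clientX - box.left) / box.width) * STRIPS));
    if (strip <= reached + 1) {
      reached = Math.max(reached, strip);                // left-to-right progress only
    } else {
      reached = -1;                                      // entered mid-link or skipped a strip: restart
    }
    if (reached === STRIPS - 1) {
      reached = -1;
      window.location.href = link.href;                  // activate the hyperlink without a click
    }
  });

  link.addEventListener('mouseleave', () => { reached = -1; });
}

document.querySelectorAll('a[href]').forEach(convertHyperlink);
```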
Other embodiments of the invention include a Web page editor that creates Web pages with objects that are activated by one or more cursor movement patterns and a Web browser or Internet interface that has objects such as navigation icons and menu buttons that are also activated by one or more cursor movement patterns. The Web browser interface may also automatically convert a traditional object into an object that is activated by a cursor movement pattern by automatically generating recognition areas into the object.
The invention as described is user friendly and has the ability to be used by all computer consumers. It may provide quicker and easier program navigation. There is the potential that mouse clicks may not be needed and that the mouse button may become redundant. However, suitably the user retains the option to use traditional clicking of the mouse to activate an object. The invention is particularly applicable, but not restricted, to Internet use and has potential use for all object orientated software programs. A particular application employing the invention as described may offer the feature of personalising an object to suit the user by giving the user a choice of a particular cursor movement pattern to cause activation and of the activation feedback. Activating objects by a cursor movement pattern instead of clicking may place less stress on hand and wrist movements.
EXAMPLE 4
To increase the ease of activating a response from a display region such as an object, a trigger window 13, as illustrated in FIG. 9, may pop up displaying the recognition areas or trigger components 15 in the particular pattern required to activate the object. This trigger window 13 appears as a user interface overlay which illustrates or requests a movement of the mouse pointer to process a software driven mouse event. As the trigger window exists as a graphic object, it is not limited in size or orientation (i.e. vertical, horizontal, clockwise or counter clockwise triggers). Once the response conditions of the display region have been met, the trigger window 13 will disappear. The cursor (which has since moved in order to meet the trigger conditions of the display region) will be returned to the location which was stored before the appearance of the trigger window. At this point, a software driven Mouse_Event Click event will be invoked to activate the underlying object, though the Click event is activated by the cursor movement pattern rather than the traditional mouse click.
In FIGS. 9 and 10, the trigger window 13 is a diagram box displaying a horizontal array of trigger components 15. The dark boxes indicate the trigger components 15 which have detected the cursor 10. The white boxes have not been triggered. When all recognition areas have been triggered, the object will execute its associated function.
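The software driven Click event described above can be approximated in a browser with a synthetic MouseEvent once every trigger component has detected the cursor. The following sketch is illustrative only: the overlay styling, the component count and the ordering rule are invented, and the cursor-restoration step is omitted because a browser script cannot reposition the real pointer.

```javascript
// Minimal sketch: show a trigger-window overlay containing a horizontal row
// of trigger components; when every component has detected the cursor in
// left-to-right order, remove the overlay and invoke a software driven click
// on the underlying object.
function showTriggerWindow(targetEl, componentCount = 4) {
  const overlay = document.createElement('div');
  overlay.style.cssText =
    'position:fixed; top:40%; left:40%; padding:8px; background:#eee;' +
    'border:1px solid #333; display:flex; gap:4px; z-index:9999;';
  let next = 0;

  for (let i = 0; i < componentCount; i++) {
    const comp = document.createElement('div');
    comp.style.cssText = 'width:24px; height:24px; background:#fff; border:1px solid #333;';
    comp.addEventListener('mouseover', () => {
      if (i !== next) return;                     // components must trigger in order
      comp.style.background = '#333';             // feedback: darken the triggered component
      next += 1;
      if (next === componentCount) {
        overlay.remove();                         // trigger conditions met: remove the window
        targetEl.dispatchEvent(new MouseEvent('click', { bubbles: true })); // software driven click
      }
    });
    overlay.appendChild(comp);
  }
  document.body.appendChild(overlay);
}
```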
To ensure the cursor falls within the region of the trigger components 15 of the trigger window 13, the trigger components 15 may maintain "tractor beams" 14 for the cursor which will ensure the cursor will snap into and lock onto the target component 15, as illustrated in FIG. 10. This tractor-beam 14 facility extends the sensitive "hotspot" areas (marked as outlined rectangles in FIG. 10) outside of the trigger components and places a temporary indicator within the target trigger component 15 to indicate when a component 15 has been triggered. This effectively guides and zooms the cursor into the trigger component 15, reducing the amount of accuracy required to make it trigger.
In some instances it may not be clear what type of cursor movement is required to activate a response in certain display regions. In these uncertain regions, a custom pointer may be used. Once the cursor passes over a valid display region, the pointer changes to a custom pointer which denotes the type of activation required. For example, an animated cursor of a revolving arrow may indicate a clockwise trigger event, as illustrated in FIG. 11. Slight customisation of these custom pointers is available to the user, for example changing colour or size. It is preferred that for the most part the custom pointer will not be modifiable, as it will be loaded when its associated responsive display regions are loaded. This will ensure a standard operation of the responsive display regions.
Referring to FIG. 11, the animated cursor of revolving arrows indicates that a clockwise rotation over the display region would be required to trigger the associated response from the display region.
EXAMPLE 5
The main interface used by modern computers is the GUI (Graphical User Interface). This interface lends itself to the use of a mouse or similar pointing devices. Buttons and menu commands in software applications (such as word processors, spreadsheets, databases, games etc.) are clicked or double clicked to be activated with the currently available technology. Using the invention as described, an alternate navigation system becomes available. A program can search for the control buttons and menus in a software program and convert them into objects that respond to dynamic cursor interaction, such as a cursor movement pattern. Their original formatting (shape, size, colour, location etc.) can be preserved or an enhanced look and feel can be applied using an object library. Upon conversion, clicking to activate is no longer required and all the associated benefits of the invention as described become available to the user.
In another embodiment, rather than searching for the control buttons and menus in a software program and converting them into objects that respond to dynamic cursor interaction, such as a cursor movement pattern, the control buttons and menus in a software program or a suite of similar software programs, for example Microsoft Office, have the invention integrated into them during manufacture by the OEM (Original Equipment Manufacturer).
EXAMPLE 6
The cursor recognition area may be in the form of a matrix or grid. The matrix is usually invisible to the user; however, the grid lines of the matrix have the option of being made visible if required (usually only when changing the configuration of the matrix). When visible, the configuration of the matrix, such as colours, shape, dimensions and physical location on screen, may be changed by the user. A password or graphical password in the form of a preset cursor movement pattern may then lock these settings. The matrix acts as a transparent layer on top of the desktop or application and is sensitive to cursor movements. The visible matrix is illustrated in FIGS. 12 and 14.
The matrix is sensitive to cursor movements and as such can be configured to respond when a preset cursor movement mode or pattern is entered. These preset patterns can be considered to be a graphical password. A preset cursor movement pattern may be embedded anywhere within the invisible matrix. Alternatively, the preset cursor movement pattern may be entered anywhere on the matrix and recognised by the relative directional movements of the cursor. The patterns can be transparent, and when triggered can be set to control applications. Application control includes some of the following: Launched (opened), Resized (minimised, restored, maximised), Saved, Printed and Terminated (closed). The ability to create customised cursor movement patterns is also available. For example, if a cursor movement pattern in the form of a large 'W' were embedded in the matrix and set to load a word processor whenever it was triggered, then drawing a large 'W', as illustrated in FIG. 13, within the matrix would trigger the matrix and load the word processor. A library of preset cursor movement patterns is available to the matrix. The library contains a set of prefabricated templates (with multimedia feedback), such as shown in Table 1. These templates can be modified by the user to complete common computer tasks.
[Table 1: prefabricated cursor movement pattern templates (not reproduced)]
The user may select from the above template list and/or mix and match any of the configurations within the matrix library interface. For example, if Item 9 were selected, the cursor movement pattern for the virus scanner would be embedded into the matrix and trigger whenever a large 'V' was drawn by the cursor within the matrix.
The matrix may be invoked (launched) from the usual START menu/Programs menu, a desktop shortcut, by hitting a 'hot key' on the keyboard, e.g. 'ALT M', or loaded when preset conditions arise, e.g. at WINDOWS boot up. An icon in 'Settings - Control Panel' of the operating system and in the 'System Tray' of the operating system would also afford access to the matrix properties.
The matrix lends itself to security applications, especially if the timer facility is active. The timer measures the time taken for a trigger sequence to be completed by the user. If the trigger sequence is not completed within a predetermined timeframe, then the cursor movement pattern will not trigger. The cursor movement pattern embedded in the matrix can be used as an adjunct to passwords. As the matrix can be rendered invisible, locked, resized and moved to any position on a computer screen, an unfamiliar user would have considerable difficulty knowing the exact location and required "password" (i.e. letter, shape or pattern) to be entered in the invisible matrix in order to trigger the associated response within a set timeframe. FIGS. 14 and 15 illustrate a visible and an invisible use of the matrix as security.
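A hedged sketch of the timed graphical password follows, reusing the direction-trace idea from the earlier matrix sketch. The three second time limit, the stored 'DUDU' pattern and the unlock callback are assumptions for the example and do not appear in the specification.

```javascript
// Minimal sketch: a graphical password entered as a cursor movement pattern
// that must be completed within a time limit. The pattern is stored as a
// string of relative directions (as in the earlier matrix sketch); feeding
// the first direction of an attempt starts the timer.
function createGraphicalPassword(pattern, onUnlock, timeLimitMs = 3000) {
  let trace = '';
  let startedAt = null;

  return function feedDirection(direction) {       // e.g. 'D', 'U', 'L' or 'R'
    const now = Date.now();
    if (startedAt === null) startedAt = now;
    if (now - startedAt > timeLimitMs) {            // too slow: discard the attempt
      trace = '';
      startedAt = now;
    }
    trace += direction;
    if (trace.endsWith(pattern)) {                  // correct pattern within the time limit
      trace = '';
      startedAt = null;
      onUnlock();
    }
  };
}

// Hypothetical usage: directions would be fed from the matrix mousemove
// handler; a 'W'-like stroke sequence unlocks within three seconds.
const feedDirection = createGraphicalPassword('DUDU', () => console.log('unlocked'));
```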
Greater control of software installation may be achieved by the inclusion of a matrix during the installation process. By requiring the user to trigger a randomly generated cursor movement pattern during installation on a randomly positioned matrix, installation of the new software program will be restricted if the incorrect cursor movement pattern is used. The "installation cursor movement patterns" could be randomly generated from a unique number, say a specific serial number found only in the User Manual. Greater control of software installation may contribute to the reduction of software piracy (illegal installation of unlicensed software).
EXAMPLE 7
The PDA (Personal Digital Assistant) is a multipurpose electronic device, not unlike the early electronic organisers or calculators. A number of PDA manufacturers have various models on the market, including the 3Com Palm, Casio E, Compaq Aero, Ericsson MC, Hewlett-Packard Jornada, IBM WorkPad, Philips Nino and Psion Revo. Microsoft has developed a specific OS for the PDA; their Windows CE is basically a modified Win95 OS. Most PDA manufacturers use the Windows CE OS and some have developed proprietary operating systems.
The invention as described lends itself to the PDA, Internet enabled phones and other similar devices. The units may be purpose built by the manufacturer employing a user interface in accordance with the invention as described, or converted later. A program may search for the control buttons and menus in an existing software program and convert them into display regions which respond to a specified dynamic interaction with a cursor. Their original formatting (shape, size, colour, location etc.) can be preserved or an enhanced look and feel can be applied using an object library. Upon conversion, clicking to activate is no longer required and all the associated benefits of the invention as described become available to the user.
Original Equipment Manufacturers of Operating Systems can integrate the user interface of the invention into the operating system of the PDA or similar electronic device during manufacture. Then all application programs on the hand held computer would automatically have access to the benefits of the invention.
It will of course be realised that while the foregoing has been given by way of illustrative example of this invention, all such and other modifications and variations thereto as would be apparent to persons skilled in the art are deemed to fall within the broad scope and ambit of this invention as is herein set forth.

Claims

CLAIMS:
1. A computer software product stored on a computer readable medium and executable by a processor of a computational device, said software product including: display instructions to generate a software product display upon a display device coupled to said processor; cursor instructions to display a movable cursor in response to data from a pointing device, and motion detection instructions to detect dynamic interaction of said cursor with a region of the display corresponding to at least one predetermined interaction; wherein said product operatively initiates a function associated with said region upon detecting the corresponding predetermined interaction.
2. A software product according to claim 1, wherein the computational device is a desktop or laptop computer.
3. A software product according to claim 2, wherein the pointing device is a mouse, trackball, touch pad, joystick, laser or optical pointer or the like.
4. A software product according to any one of the preceding claims, wherein the region of display is the whole display screen, a portion of the display screen or a displayed object.
5. A software product according to any one of the preceding claims, wherein the display region initiates a single function or is multi-functional.
6. A software product according to any one of the preceding claims, wherein the dynamic interaction is a state of action of the cursor based on time, direction, speed, duration, or a combination thereof.
7. A software product according to claim 6, wherein the dynamic interaction is locating the cursor within the region for a defined period of time.
8. A software product according to claim 6, wherein the dynamic interaction is movement of the cursor over the region in a combination of speed and direction.
9. A software product according to claim 6, wherein the dynamic interaction is movement of the cursor on and off a display region a plurality of times.
10. A software product according to claim 6, wherein the dynamic interaction is movement of the cursor over the display region in a pattern.
11. A software product according to claim 10, wherein the pattern is selected from the group of rotating the cursor in a clockwise or anticlockwise direction, moving the cursor from left to right or right to left, moving the cursor up or down, moving the cursor in a wave, a zigzag or alphabet character or other design, moving the cursor diagonally and a combination of one or more of the group.
12. A software product according to claim 10 or claim 11, wherein the display region includes a matrix of lines which detect the movement of the cursor.
13. A software product according to claim 12, wherein the lines of the matrix recognise mouse-move events, mouse-over events or other corresponding pointing device movement events.
14. A software product according to claim 12 or claim 13, wherein the matrix is two-dimensional or three-dimensional.
15. A software product according to any one of claims 12 to 14, wherein the matrix includes a grid of lines extending in x-y directions or x-y-z directions on the display.
16. A software product according to any one of claims 12 to 15, wherein the matrix is visible or invisible to a user.
17. A software product according to any one of claims 12 to 16, wherein the matrix recognises a plurality of cursor movement patterns.
18. A software product according to claim 17, wherein each cursor movement pattern has a specified associated function.
19. A software product according to any one of claims 12 to 18, wherein the matrix is in the desktop.
20. A software product according to any one of claims 12 to 19, wherein the matrix is used for security and the cursor movement pattern operates as a password.
21. A software product according to claim 10, wherein the display region includes two or more recognition areas, and wherein each recognition area independently recognises the cursor as passing thereover.
22. A software product according to claim 21, wherein each recognition area is capable of detecting a mouse-move event, a mouse-over event or other corresponding pointing device movement event.
23. A software product according to claim 21 or claim 22, wherein the recognition areas are randomly arranged within the display region.
24. A software product according to claim 23, wherein the display region initiates a function when all recognition areas have detected the cursor.
25. A software product according to claim 21 or claim 22, wherein the recognition areas are arranged in a pattern within the display region.
26. A software product according to claim 25, wherein the recognition areas are arranged in a line, rows and columns, a circle or other shape.
27. A software product according to any one of claims 25 or 26, wherein the display region will not initiate a function until the recognition areas detect the cursor in a particular sequence.
28. A software product according to any one of claims 25 to 27, wherein the pattern is selected from the group of rotating the cursor in a clockwise or anticlockwise direction, moving the cursor from left to right or right to left, moving the cursor up or down, moving the cursor in a wave, a zigzag or alphabet character or other design, moving the cursor diagonally and a combination of one or more of the group.
29. A software product according to any one of claims 21 to 28, wherein the display region is an object.
30. A software product according to any one of the preceding claims, wherein the selective dynamic interaction required to activate a response from the display region is illustrated to the user.
31. A software product according to claim 30, wherein the illustration is as an animated image or a window overlay displaying the required cursor movement pattern.
32. A software product according to claim 30 or claim 31, wherein the illustration includes a guide for directing the cursor to perform the required cursor movement pattern for a specified display region.
33. A software product according to any one of the preceding claims, wherein feedback is provided to the user.
34. A software product according to claim 33, wherein feedback is provided as the cursor is detected by each recognition area in a display region.
35. A software product according to claim 33 or claim 34, wherein feedback is provided upon activation of a response from the display region.
36. A software product according to any one of claims 33 to 35, wherein the feedback is graphic, sound, multimedia or a combination thereof.
37. A software product according to any one of claims 10 to 29, wherein the matrix or recognition areas are generated in a display region by the operating system.
38. A software product according to any one of claims 10 to 29, wherein the matrix or recognition areas are generated in a display region by a software program product other than the operating system.
39. A software product according to claim 1 , wherein the computational device is a mobile phone, a personal digital assistant, an electronic organiser, an electric appliance having a display, equipment controlled via a programmable logic controller interface or a television.
40. A software product according to claim 1 , wherein the software product is an Internet browser.
41. A software product according to claim 2, wherein the region of the display is a hyperlink.
42. A computational device implemented method for a user to interface with a computer system of the type operatively generating a display and a pointing device operated cursor on said display, the method including the step of: monitoring dynamic interaction of said cursor with a region of the display in order to detect a predetermined interaction; and activating a function associated with said predetermined interaction upon detecting said interaction.
43. A method according to claim 42, wherein the region of the display includes a matrix or two or more cursor recognition areas to monitor the dynamic interaction of the cursor.
44. A method of activating a displayed object in a computer system operable by a pointing device and cursor and including the steps of: generating two or more recognition areas associated with said displayed object, each said recognition area being responsive to detecting the cursor, and moving the cursor over said object by the user, wherein a response from all of said recognition areas associated with the said object activates said object.
PCT/AU2000/001030 1999-08-30 2000-08-30 User interface and method thereof WO2001016688A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
AU69735/00A AU6973500A (en) 1999-08-30 2000-08-30 User interface and method thereof

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
AUPQ2537 1999-08-30
AUPQ2537A AUPQ253799A0 (en) 1999-08-30 1999-08-30 Method and apparatus for activating a displayed object in a computer system
AUPQ5358 2000-01-31
AUPQ5358A AUPQ535800A0 (en) 2000-01-31 2000-01-31 User interface and method of use thereof

Publications (1)

Publication Number Publication Date
WO2001016688A1 true WO2001016688A1 (en) 2001-03-08

Family

ID=25646135

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/AU2000/001030 WO2001016688A1 (en) 1999-08-30 2000-08-30 User interface and method thereof

Country Status (1)

Country Link
WO (1) WO2001016688A1 (en)

Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5060135A (en) * 1988-09-16 1991-10-22 Wang Laboratories, Inc. Apparatus for manipulating documents in a data processing system utilizing reduced images of sheets of information which are movable
EP0597794A2 (en) * 1992-11-13 1994-05-18 International Business Machines Corporation Method and system for graphic interaction between data and applications within a data processing system
US5450539A (en) * 1992-07-09 1995-09-12 Apple Computer, Inc. Apparatus and method of dynamically displaying a graphic button on a monitor
US5611040A (en) * 1995-04-05 1997-03-11 Microsoft Corporation Method and system for activating double click applications with a single click
US5651120A (en) * 1986-06-12 1997-07-22 Keiji Kitagawa Graphic data processing apparatus using displayed graphics for application program selection
GB2319445A (en) * 1996-09-23 1998-05-20 Ibm Web browser which automatically loads selected types of graphics
US5818423A (en) * 1995-04-11 1998-10-06 Dragon Systems, Inc. Voice controlled cursor movement
US5959628A (en) * 1994-06-28 1999-09-28 Libera, Inc. Method for providing maximum screen real estate in computer controlled display systems

Legal Events

Date Code Title Description
AK Designated states

Kind code of ref document: A1

Designated state(s): AE AG AL AM AT AU AZ BA BB BG BR BY BZ CA CH CN CR CU CZ DE DK DM DZ EE ES FI GB GD GE GH GM HR HU ID IL IN IS JP KE KG KP KR KZ LC LK LR LS LT LU LV MA MD MG MK MN MW MX MZ NO NZ PL PT RO RU SD SE SG SI SK SL TJ TM TR TT TZ UA UG US UZ VN YU ZA ZW

AL Designated countries for regional patents

Kind code of ref document: A1

Designated state(s): GH GM KE LS MW MZ SD SL SZ TZ UG ZW AM AZ BY KG KZ MD RU TJ TM AT BE CH CY DE DK ES FI FR GB GR IE IT LU MC NL PT SE BF BJ CF CG CI CM GA GN GW ML MR NE SN TD TG

121 Ep: the epo has been informed by wipo that ep was designated in this application
DFPE Request for preliminary examination filed prior to expiration of 19th month from priority date (pct application filed before 20040101)
REG Reference to national code

Ref country code: DE

Ref legal event code: 8642

122 Ep: pct application non-entry in european phase
NENP Non-entry into the national phase

Ref country code: JP