WO2003007144A1 - A system and a method for user interaction
- Publication number
- WO2003007144A1 (PCT/SE2002/001274)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- objects
- screen
- function
- displayed
- display
- Prior art date
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0481—Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0484—Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
- G06F3/0485—Scrolling or panning
-
- G—PHYSICS
- G09—EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
- G09G—ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
- G09G5/00—Control arrangements or circuits for visual indicators common to cathode-ray tube indicators and other visual indicators
- G09G5/14—Display of multiple viewports
-
- G—PHYSICS
- G09—EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
- G09G—ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
- G09G2340/00—Aspects of display data processing
- G09G2340/10—Mixing of images, i.e. displayed pixel being the result of an operation, e.g. adding, on the corresponding input pixels
Definitions
- the present invention relates to a system for making possible user interaction with objects on a screen, which system comprises a screen and means for displaying digitally generated objects on the screen.
- the invention further relates to a method for making possible user interaction with objects on a screen.
- the invention also relates to a computer program directly loadable into the internal memory of a computer, which computer program comprises software for implementing the method according to the invention, and a computer-readable medium having stored thereon a computer program intended to make a computer implement the method according to the invention.
- the invention also relates to the use of the inventive system for programming the movements of an industrial robot.
- the present invention further relates to a system for making possible user interaction with objects on a screen, which system comprises a screen and means for displaying digitally generated information objects and at least one digitally generated function object on the screen, the function object being activatable by a user through a pointing device for control of a system function.
- the invention further relates to a method for making possible user interaction with objects on a screen.
- the invention also relates to a computer program directly loadable into the internal memory of a computer, which computer program comprises software for implementing the method according to the invention, and a computer-readable medium having stored thereon a computer program intended to make a computer implement the method according to the invention.
- the invention also relates to the use of the inventive system for programming the movements of an industrial robot.
- function objects are for instance scroll bars which, when activated by means of a pointing device, will produce a displacement vertically or laterally of information objects displayed on the screen.
- when for instance a conventional word processing program executed in a personal computer (PC) is involved, an activation of said type of scroll bars will produce a displacement in the desired direction of the text displayed on the screen.
- the available screen area is also limited.
- the programming of the movements of an industrial robot by means of a programming unit, which is hand carried and communicates with the control arrangement of the robot, may be mentioned as an example of such an application.
- through a screen on the hand carried programming unit it is possible for an operator to study the input program code that controls how the different parts of the robot move, and through some kind of input device it is possible for the operator to input new program code or edit previously input program code.
- since this type of robot programming often takes place under rough external environmental conditions and the risks of the hand carried programming unit being exposed to impacts and hits are considerable, it is desirable to use a screen in the programming unit having as small a screen area as possible.
- the durability of the type of screen in question is namely greater the smaller the dimensions of the screen.
- a further advantage of a screen of small dimensions is that the screen requires less current supply the smaller it is.
- the amount of charge required by the programming unit can thus be limited when using a small screen, which in turn results in decreased explosion hazards when the programming takes place in an environment with high explosion danger.
- the requirement of a small screen area will, however, make it more difficult to present the information required by the user through the screen in a well-arranged and user-friendly manner.
- An object of the present invention is to achieve a system offering improved possibilities to effectively use an available screen area.
- the inventive idea also includes a system offering such effective use without the system user, for instance a robot programmer, having to forgo the possibility of interacting with the objects displayed on the screen in question.
- the inventive solution implies that it is possible for the user to simultaneously perceive on the screen objects that are present in several different layers, and for the user to control the mutual order between the layers so as to, for instance, accentuate the objects that are present in a certain layer before the objects of the other layers. Thereby the user is, inter alia, offered an excellent possibility of navigating, in a simple and clear manner, to the desired information within the amount of information displayable through a screen.
- the active objects, i.e. the objects that are present in the active layer, are displayed with a higher spatial frequency and/or display sharpness than the passive objects, i.e. the objects that are present in a passive layer.
- the active objects can be considered to be in focus for the eyes of the user, whereas the passive objects are out of focus but still perceivable to the user when looking at the image shown on the screen.
- the inventive solution makes it possible for the user to perceive the passive objects and the information comprised therein without any larger part of the brain capacity of the user having to be used for this.
- the active objects are displayed in a stronger shade of colour than the passive objects. It is realized that this will also offer the user a possibility of controlling, through the layer shifting means, which layer's objects are to be made to appear more clearly than the other layers' objects.
- the different layers comprise mutually co-operating objects, which are adapted to co-operate in such a manner that an operation initiated by the user on an object present in a first layer will produce an operation on a co-operating object present in a second layer.
- This will for instance offer the user a possibility of immediately, through information displayed on the screen, learning how an alteration of a function parameter displayed on the screen in a first layer will affect an object controlled by this function parameter and displayed in a second layer.
- Said function parameter is for instance included in a program code that controls the movements of a robot, the robot, which consequently constitutes the object controlled by the function parameter, being displayed in said second layer.
- a robot image displayed in a layer on the screen is suitably provided with a colour marking indicating the part of the robot that is controlled by a program sequence displayed in another layer on the screen.
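The following is a minimal, hypothetical sketch of the cross-layer co-operation described above: a function-parameter object in a first layer notifies a co-operating robot-image object in a second layer, which then colour-marks the affected robot part. All class and method names are illustrative assumptions, not part of the patent.

```python
class RobotImageObject:
    """Object in the second layer: a schematic robot image."""
    def __init__(self):
        self.highlighted_part = None

    def on_parameter_changed(self, parameter_name, value):
        # Colour-mark the robot part controlled by the edited parameter.
        self.highlighted_part = parameter_name
        print(f"Highlighting robot part for '{parameter_name}' (new value {value})")


class ParameterObject:
    """Object in the first layer: an editable function parameter."""
    def __init__(self, name, value):
        self.name, self.value = name, value
        self._listeners = []

    def connect(self, listener):
        self._listeners.append(listener)

    def set_value(self, value):
        self.value = value
        for listener in self._listeners:
            listener.on_parameter_changed(self.name, value)


# Usage: editing the parameter in the first layer immediately updates the robot image in the second layer.
robot_image = RobotImageObject()
joint_speed = ParameterObject("axis_3_speed", 100)
joint_speed.connect(robot_image)
joint_speed.set_value(150)
```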
- the respective function object as to its surface size is adapted for co-operation with a pointing device in the form of a finger.
- the screen is a touch screen, in which case the respective function object as to its surface size is adapted for co-operation with a pointing device in the form of a finger as indicated above.
- the inventive solution further implies that the function object or objects, which are displayed on the screen and by means of which the user through a pointing device controls different types of system functions, are visible to and activatable by the user without their display on the screen entailing a limitation of the screen area available for display of information objects.
- the inventive solution entails that it is possible for the user to use a screen with a smaller screen area for showing a certain amount of information as compared to conventional solutions where a part of the screen area is reserved only for display of said function objects.
- the expression function object will in this description and the subsequent claims refer to an object that is digitally generated and displayed or displayable on a screen, which object is activatable by a user by means of a pointing device, in the form of a computer mouse, a finger, a pointing pen or the like, for control of a system function.
- Said system function is for instance of a type that will produce some kind of alteration of the objects which are displayed or intended to be displayed on the screen, or some kind of alteration in how these objects are displayed on the screen, such as for instance a system function for producing a size alteration of the objects displayed on the screen.
- said system function may also be of a type that will not directly affect the objects displayed on the screen or the form of their display, such as for instance a system function for initiating a print-out of information on a printer or initiating a storing of for instance program code on a storage medium.
- the expression information object will in this description and the subsequent claims refer to an object that is digitally generated and displayed or displayable on a screen, which object carries an information content intended for a user.
- the information object is possibly affectable by the user by means of a pointing device after the user has first marked the information object on the screen.
- the information object is for instance an image, a symbol, an individual character, a combination of characters etc.
- At least one function object is adapted to control a scrolling function, an activation of this function object by means of the pointing device initiating a movement vertically or laterally of at least some of the information objects displayed on the screen.
- hereby it is possible to obtain a conventional image scrolling function in an image displayed on the screen without any part of the screen area having to be reserved only for displaying the function objects related to the image scrolling function, such as for instance scroll bars. Since the image scrolling functions in accordance with the invention are implementable in a completely software-based manner, the need for hardware-based function members for control of the image scrolling on a screen is eliminated, which results in cost savings.
- the screen is a touch screen, the respective function object preferably being adapted as to its surface size for co-operation with a pointing device in the form of a finger.
- the expression touch screen will in this description and the subsequent claims refer to a screen adapted to be able to receive control commands by the user pointing at or lightly pressing against parts of the screen with a pointing device, for instance in the form of one of the fingers of the user.
- the area available for display of information objects will not be affected by the surface size of a function object displayed on the screen.
- a great latitude is obtained concerning the choice of surface size of a function object and its localization on the screen.
- the inventive system is a programming device, preferably for programming the movements of an industrial robot.
- the programming device is with advantage adapted to communicate with the control unit of the robot through a wireless connection, for instance implemented by means of Bluetooth technology.
- hand carried programming units are in some cases used when programming the movements of an industrial robot, in which case it is desirable to reduce the size of the screen of the programming unit as far as possible. It is realized that the inventive solution is very advantageous to use in this application.
- although the inventive solution is particularly favourable for use together with touch screens, it is of course also applicable for use together with conventional screens where the user, through a pointing device in the form of a computer mouse or the like, controls the localization of a marker displayed on the screen.
- a function object is activated either directly when the marker is moved over the function object or when the user presses some kind of function button after the marker has been localized onto the function object by means of the pointing device.
- the invention also relates to a method for making possible user interaction with objects on a screen according to claim 23.
- the invention also relates to a computer program directly loadable into the internal memory of a computer according to claim 45, which computer program comprises software for implementing the inventive method.
- the invention also relates to a computer-readable medium according to claim 46, which medium has stored thereon a computer program intended to make a computer implement the inventive method.
- the invention also relates to the use of the inventive system for programming the movements of an industrial robot.
- Fig 1 is a very schematic illustration of how objects displayed on a screen in accordance with the invention are displayed in different layers
- Fig 2 is a schematic illustration of how the objects displayed in the different layers illustrated in Fig 1 will appear to a user watching the screen
- Fig 3 is a schematically shown screen surface illustrating a practical application of the inventive system
- Figs 4-6 show schematically illustrated screen areas provided with information objects and function objects
- Fig 7 is a simplified block diagram illustrating components included in an embodiment of the system according to the invention.
- Fig 8 is a schematic illustration of a programming unit included in a preferred embodiment of the system according to the invention.
- Fig 9 is a schematic illustration of a programming device when used for programming the movements of an industrial robot.
- the inventive system comprises means for displaying digitally generated objects on a screen 1, which objects consist of function objects and/or information objects.
- said display means are adapted to display the objects 3 in two or more transparent and mutually superimposed layers 7a, 7b, which layers comprise an active layer 7a and one or several passive layers 7b.
- This display of objects in several layers is illustrated very schematically in Fig 1.
- the objects 3 are intended to be displayed in an active layer 7a and two passive layers 7b, but it is also possible to let the number of passive layers be larger as well as smaller than two.
- That certain objects 3 are displayed in one and the same layer implies that they are concatenated with each other and affectable in group in such a manner that for instance a displacement of the layer vertically or laterally on the screen results in a corresponding displacement of all objects that are present in the layer, and that a removal of the layer from the screen results in a removal from the screen of all objects that are present in the layer. It is of course also possible to let the objects that are present in one and the same layer be commonly affectable in many other manners according to requirements and application. That certain objects are present in one and the same layer will however not exclude that these objects are also individually affectable by the user.
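As a rough illustration of the grouping behaviour just described, the sketch below (Python, with hypothetical names) treats a layer as a container whose displacement or removal affects all of its objects, while each object remains individually addressable.

```python
class ScreenObject:
    def __init__(self, x, y):
        self.x, self.y = x, y

    def move(self, dx, dy):
        self.x += dx
        self.y += dy


class Layer:
    def __init__(self, objects):
        self.objects = list(objects)
        self.visible = True

    def move(self, dx, dy):
        # Displacing the layer displaces every object it contains.
        for obj in self.objects:
            obj.move(dx, dy)

    def remove_from_screen(self):
        # Removing the layer removes all of its objects from the display.
        self.visible = False


layer = Layer([ScreenObject(10, 10), ScreenObject(40, 25)])
layer.move(0, -15)           # scroll the whole layer upwards
layer.objects[0].move(5, 0)  # an object remains individually affectable
```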
- the denomination "active objects” will be used for the objects that are displayed in a so-called active layer and the denomination "passive objects” for the objects that are displayed in a so-called passive layer.
- the reference 3a is used for indicating active objects and the reference 3b for indicating passive objects.
- The layers 7a, 7b are in Fig 1, for illustrative purposes, reproduced as being separate physical layers, but in reality the different layers are of course only of virtual character.
- Fig 2 shows the objects 3a, 3b displayed in the different layers 7a, 7b as they are meant to appear to a user watching the screen 1 in question.
- the means 9 for displaying objects 3a, 3b on the screen 1 consist for instance of conventional computer components, such as a data processing unit 13 connected to the screen 1, storage medium 14, application programs 11 executable in the data processing unit 13 etc. These components are illustrated very schematically in Fig 7.
- the display means 9 are adapted to display the active objects 3a in such a manner that they are visually distinguished from the passive objects 3b. In this manner it is possible to get the active objects 3a to appear in such a manner that they are distinguished from the passive objects for a user who is watching the objects displayed on the screen 1.
- the layer comprising the objects that are of primary interest to the user at a certain moment, for instance the objects the user at the moment intends to manipulate in some way, is intended to constitute the currently active layer 7a.
- the active objects 3a are therefore suitably displayed in a visually more conspicuous manner than the passive objects 3b, i.e. the active objects 3a are suitably displayed in such a manner that they will appear more clearly than the passive objects to a person watching the screen 1.
- the active objects 3a are for instance displayed with a display sharpness or spatial frequency that distinguishes them from the passive objects 3b. It is also possible to use hatching in order to visually distinguish active and passive objects. It is also possible to achieve the visual distinction by displaying the active objects 3a in a shade of colour that distinguishes them from the passive objects 3b. It is of course also possible to use different combinations of distinguishing display sharpness, spatial frequency, hatching and shade of colour in order to achieve the desired visual distinction.
- the active objects 3a are preferably intended to be displayed with a spatial frequency and/or display sharpness that is higher than the spatial frequency and/or display sharpness of the passive objects 3b, and/or in a shade of colour that is stronger than the shade of colour of the passive objects 3b.
- the higher display sharpness of the active objects 3a is for instance produced in that these are displayed with a higher resolution than the passive objects 3b.
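One possible way to realise this visual distinction, sketched here with the Pillow imaging library purely as an assumed rendering back-end: passive layers are blurred and drawn with reduced opacity, while the active layer is composited at full sharpness. The patent does not prescribe any particular rendering technique, so this is an illustration only.

```python
from PIL import Image, ImageFilter

def compose(layers, active_index, size=(320, 240)):
    """layers: full-size RGBA Image objects, ordered back to front."""
    canvas = Image.new("RGBA", size, (255, 255, 255, 255))
    for i, layer in enumerate(layers):
        if i != active_index:
            # Passive layer: lower display sharpness (blur) and weaker colour (reduced alpha).
            layer = layer.filter(ImageFilter.GaussianBlur(radius=2))
            faded = layer.getchannel("A").point(lambda a: a // 2)
            layer.putalpha(faded)
        canvas = Image.alpha_composite(canvas, layer)
    return canvas
```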
- it is also possible to use layers that are kept hidden from the user and are made to appear to the user as an active or passive layer only when the user so orders, i.e. layers that are kept hidden while the objects in an active layer and one or several passive layers are displayed on the screen, and the objects of which are made to be displayed on the screen as passive or active objects when so desired.
- the inventive system further comprises a layer shifting means, which is activatable by the user and adapted to produce a shift of layers when being activated so that the active layer 7a is changed into a passive layer 7b and one of the passive layers 7b is changed into an active layer 7a.
- the layer shifting means 6 is adapted to control the display means 9 to shift shade of colour and/or spatial frequency and/or display sharpness and/or hatching of the objects displayed in the layer that is changed from an active into a passive layer at the shift of layers and in the layer that is changed from a passive into an active layer at the shift of layers.
- hereby it is possible for the user to control which layer is, at a certain given moment, to constitute the active layer, and consequently which layer's objects are, at a certain given moment, to be made to appear most clearly to the user.
- the layer shifting means, which is schematically indicated at 6 in Fig 7, comprises a program sequence stored on a storage medium, which program sequence will perform the above-described shift of layers when being activated.
- the layer shifting means further comprises a control member communicating with said program sequence, by means of which control member it is possible for the user to activate the program sequence to perform a desired shift of layers.
- This control member is for instance software-based and consists of one or several function objects displayed on the screen, which objects are activatable by the user through a pointing device, such as a computer mouse or the like, by means of which the user controls the localization of a marker displayed on the screen.
- a function object 4 is either activated directly when the marker is moved over the function object or when the user presses some kind of function button after the marker has been localized on the function object by means of the pointing device.
- the previously mentioned display means 9 are adapted to display on the screen 1 a function object 4, included in the layer shifting means, for the respective layer 7a, 7b.
- each of these function objects 4 consists of an icon. It is possible to display all icons in one and the same layer, but the respective icon is with advantage displayed in the layer associated with the icon. It is for instance also possible to let said function object 4 consist of a browsing flap or the like associated with the respective layer. By activating an icon or a browsing flap, respectively, the user initiates a shift of layers so that the layer associated with the icon/flap is changed into an active layer and the previously active layer is changed into a passive layer.
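A hedged sketch of this layer-shifting behaviour follows: each layer has an associated icon, and activating the icon makes that layer the active one while the previously active layer becomes passive. The names below are assumptions made for illustration, not the patent's implementation.

```python
class LayerShifter:
    def __init__(self, layers):
        self.layers = layers     # e.g. ["program code", "robot image"]
        self.active = 0          # index of the currently active layer

    def on_icon_activated(self, layer_index):
        if layer_index != self.active:
            previous = self.active
            self.active = layer_index
            # The display means would now redraw: the new active layer sharply,
            # the former active layer blurred / in a weaker shade of colour.
            print(f"'{self.layers[previous]}' -> passive, "
                  f"'{self.layers[layer_index]}' -> active")


shifter = LayerShifter(["program code", "robot image"])
shifter.on_icon_activated(1)   # user presses the icon of the robot-image layer
```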
- it is also possible to let the control member included in the layer shifting means 6 be a hardware-based control member, for instance consisting of a function lever or one or several function buttons on a keyboard.
- each layer is for instance associated with a specific function button.
- Systems where the layer shifting means comprises software-based control members in combination with hardware-based control members are of course also possible within the scope of the invention.
- the screen 1 is a so-called touch screen
- a function object 4 included in the layer shifting means is in this case adapted to be activated in that the user with his finger 5, or any other pointing device, presses against the area of the screen surface that is covered by the function object 4 in question.
- the touch screen consequently has sensors that detect pressure. It is however also possible to let the touch screen be provided with sensors that do not require any direct touch of the screen 1 for activation of a function object.
- the screen is for instance provided with sensors, such as photocells, which detect a light beam from a light beam emitting pointing device directed against an area of the screen, or which detect the shadow from a pointing device placed in front of an area of the screen. It is of course also possible to let the screen be provided with other types of sensors detecting that the pointing device is placed in front of or directed against an area of the screen.
- the respective function object 4 is suitably adapted as to its surface size for co-operation with a pointing device 5 in the form of a finger.
- the different layers 7a, 7b comprise mutually co-operating objects 3a', 3b', which are adapted to co-operate in such a manner that an operation initiated by the user, for instance by means of a pointing device, on an object 3a', 3b' present in a first layer 7a, 7b will produce an operation on a co-operating object 3b', 3a' present in a second layer 7b, 7a.
- a marking performed by the user on a first object 3a' in a first layer 7a will for instance result in an alteration, which is visible to the user and illustrative for the application in question, of a second object 3b' in a second layer 7b.
- the objects 3a', 3b' in question are in this case interconnected through a program sequence, which, when the first object 3a' is marked or in any other manner affected by the user, will perform a predetermined alteration of the second object 3b'.
- Said operation may for instance imply that the user alters a function parameter displayed on the screen 1 in a first layer, whereupon it is possible for the user to perceive on the screen how this alteration affects an object which is controlled by this function parameter and displayed in a second layer.
- in Fig 3 an application relating to robot programming is illustrated, where for instance program code 3a' controlling the movements of an industrial robot is displayed in the active layer 7a, whereas an image 3b' of the industrial robot is displayed in the passive layer 7b.
- hereby it is possible for the robot programmer to focus on the displayed program code in order to check it and perform desired alterations therein, at the same time as the robot programmer "in the background" perceives how a performed alteration in the program code affects the industrial robot controlled by the program code and/or the part of the robot controlled by the program code sequence in question displayed on the screen.
- the part of the robot that is controlled by the program code sequence in question is for instance marked with a colour marking.
- a screen surface 2 of a screen 1 comprised in a system according to the invention is schematically illustrated.
- the inventive system comprises means for displaying on the screen 1 digitally generated information objects, schematically indicated at 3, and one or several digitally generated function objects, schematically indicated at 4a and 4b in Figs 4-6.
- Figs 4-6 further illustrate an embodiment of the invention where the function objects 4a, 4b are of another type than the function objects 4 included in the layer shifting means and are displayed on the screen in a first layer, whereas information objects 3 are displayed in a second layer.
- the display means 9 are here adapted to display a function object 4a, 4b on an area of the screen that is also available for simultaneous display of information objects 3, the function object 4a, 4b being adapted to be visible to the user and activatable through a pointing device 5 even when an information object 3 simultaneously and visibly to the user is displayed on the screen area covered by the function object 4a, 4b, i.e. even when an information object 3 superimposes the function object 4a, 4b in question.
- Said information objects 3 are in Figs 4-6 displayed with a higher display sharpness than the function objects 4a, 4b, and they are consequently supposed to be present in the currently active layer 7a.
- Said function objects 4a, 4b are activatable by a user through a pointing device 5, for instance a finger as illustrated in Figs 4- 6, for control of a system function.
- the respective function object 4a, 4b is associated with a program sequence responsible for a certain system function.
- Said function object 4a is preferably adapted to control a scrolling function, in which case an activation of this function object 4a by means of the pointing device 5 initiates a movement vertically or laterally of at least some of the information objects 3 displayed on the screen.
- This type of function object is schematically illustrated at 4a1-4a4 in Figs 4-6. It is however also possible to let the function object 4a, 4b be adapted to control any other type of system function as previously described.
- the function object is for instance related to a function menu, in which case an activation of the function object by means of the pointing device 5 is adapted to bring forth a presentation on the screen 2 of different selectable system functions.
- the latter type of function object is schematically illustrated at 4b in Figs 4-6.
- the function objects 4a, 4b suitably have a symbol or text that will help the user to understand which system function the respective function object controls.
- since the function objects 4a, 4b are made to appear to the user less clearly on the screen 1 than the information objects 3, the user's possibility of perceiving the information content of the information objects 3 is not to any appreciable extent disturbed by the function objects 4a, 4b that are simultaneously displayed on the screen, even in case an information object 3 superimposes a function object 4a, 4b.
- said function objects 4a, 4b constitute passive objects as long as the function object 4a, 4b is not activated by the user through the pointing device 5, whereas the function object 4a, 4b is changed into an active object when the function object 4a, 4b is activated by the user through the pointing device 5, so that the activated function object 4a, 4b, and possibly also the rest of the function objects 4a, 4b, will appear more clearly than the information objects 3.
- said function objects 4a, 4b consequently also constitute control members included in the layer shifting means for initiation of a shift of layers.
- a function object 4a, 4b is displayed with a lower sharpness or weaker shade of colour than the information objects 3 as long as the function object 4a, 4b is not activated by the user through the pointing device 5, whereas the display sharpness or shade of colour of said objects 3, 4a, 4b is shifted when the function object 4a, 4b is activated by the user through the pointing device 5, so that the activated function object 4a, 4b, and possibly also the rest of the function objects 4a, 4b, will obtain a display sharpness on the screen that is higher than the display sharpness of the information objects 3 or a stronger shade of colour than these.
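A brief, assumption-laden sketch of the state change described above: the function objects are rendered faintly as long as none is activated, and the display states of function objects and information objects are swapped once the user activates a function object. Names and state labels are illustrative only.

```python
class DisplayState:
    SHARP = "high sharpness / strong colour"
    FAINT = "low sharpness / weak colour"


class Screen:
    def __init__(self, information_objects, function_objects):
        self.information_objects = information_objects
        self.function_objects = function_objects
        self.info_state, self.func_state = DisplayState.SHARP, DisplayState.FAINT

    def on_function_object_activated(self, function_object):
        # The activated function object (and here all function objects) becomes
        # the clearly displayed content; the information objects recede.
        self.info_state, self.func_state = DisplayState.FAINT, DisplayState.SHARP
        return function_object  # the associated system function would now run


screen = Screen(["program line 1", "program line 2"], ["scroll up", "menu"])
screen.on_function_object_activated("scroll up")
```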
- a first function object 4a1 is adapted to control an up-scrolling function, an activation of this first function object by means of the pointing device 5 initiating a movement downwards on the screen of at least some of the information objects 3 displayed on the screen, and a second function object 4a2 is adapted to control a down-scrolling function, an activation of this second function object by means of the pointing device 5 initiating a movement upwards on the screen of at least some of the information objects 3 displayed on the screen.
- a third function object 4a3 is adapted to control a first lateral scrolling function, an activation of this third function object by means of the pointing device 5 initiating a movement in one lateral direction on the screen of at least some of the information objects 3 displayed on the screen, and at least one fourth function object 4a4 is adapted to control a second lateral scrolling function, an activation of this fourth function object by means of the pointing device 5 initiating a movement in the other lateral direction on the screen of at least some of the information objects 3 displayed on the screen.
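An illustrative sketch of the four scrolling function objects 4a1-4a4 follows, with hypothetical names, an assumed scroll step, and screen coordinates where y grows downwards: activating the up-scroll object moves the displayed information objects downwards, the down-scroll object moves them upwards, and the two lateral objects move them in opposite lateral directions.

```python
SCROLL_STEP = 20  # pixels per activation; an assumed value

# Displacement applied to the information objects when the corresponding
# function object is activated (up-scroll moves the content downwards, etc.).
SCROLL_VECTORS = {
    "4a1_up_scroll":    (0, +SCROLL_STEP),
    "4a2_down_scroll":  (0, -SCROLL_STEP),
    "4a3_left_scroll":  (-SCROLL_STEP, 0),
    "4a4_right_scroll": (+SCROLL_STEP, 0),
}

def on_scroll_object_activated(name, information_objects):
    dx, dy = SCROLL_VECTORS[name]
    for obj in information_objects:   # each object is a dict with screen coordinates
        obj["x"] += dx
        obj["y"] += dy

lines = [{"text": "line of program code", "x": 0, "y": 0}]
on_scroll_object_activated("4a1_up_scroll", lines)   # content shifted downwards by one step
```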
- function objects 4b of the previously described type, which are related to menu functions, may also be displayed on the screen in this manner.
- it is also possible to let a function object 4a, which is displayed on the screen in accordance with the inventive solution and related to an image scrolling function, be designed as a scroll bar of conventional design.
- the function objects 4a, 4b are suitably displayed in a shade of colour that distinguishes them from the information objects 3 and/or with a display sharpness that distinguishes them from the information objects 3.
- the information objects 3 are suitably displayed with a higher sharpness, i.e. they are adapted to appear clearly and sharply to a user watching the screen 1, whereas the function objects 4a, 4b are displayed with lower sharpness, i.e. they are adapted to appear less clearly and sharply to the user as compared to the information objects 3.
- the distinguishing display sharpness is for instance produced by displaying the function objects 4a, 4b with a lower resolution than the information objects 3. Since the function objects 4a, 4b are made to appear to the user less clearly on the screen 1 than the information objects 3, the user's possibility of perceiving the information content of the information objects 3 is not to any appreciable extent disturbed by the function objects 4a, 4b that are simultaneously displayed on the screen, even in case an information object 3 superimposes a function object 4a, 4b.
- an information object 3 displayed on the screen is markable by the user through the pointing device 5, in which case an operation performed by the user by means of the pointing device on a part of a marked information object 3, which is displayed on a screen area covered by a function object 4a, 4b and which consequently superimposes this function object 4a, 4b, is adapted to affect the marked information object without activating the function object 4a, 4b.
- This embodiment will in the following be more closely described with reference to Figs 4-6.
- Fig 5 illustrates how the user marks an information object 3, in this case a line with for instance program code, by pressing thereon with his finger 5 on an area of the screen that is not covered by a function object 4a, 4b. That the information object 3 has been marked in this manner is for instance indicated in that the information object 3 changes its shade of colour, as illustrated in Fig 5.
- a marked information object 3 has higher priority than a function object 4a, 4b, which implies that a pressing performed by the user on an already marked information object 3 will be interpreted by the system as an operation on the information object 3 even though the user presses on the information object 3 on an area of the screen 1 that is also covered by a function object 4a, 4b, as illustrated in Fig 6.
- the pressing will consequently result in an operation on the marked information object 3 and not in an activation of the function object 4a1.
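The priority rule described above can be illustrated with the following hedged sketch: a press at a position covered by both an already marked information object and a function object is routed to the information object, while an unmarked press on a function-object area activates the function object. The rectangle representation, helper function, and names are assumptions for illustration only.

```python
def hit(rect, x, y):
    left, top, width, height = rect
    return left <= x < left + width and top <= y < top + height

def handle_press(x, y, information_objects, function_objects):
    # 1. A marked information object has the highest priority.
    for info in information_objects:
        if info["marked"] and hit(info["rect"], x, y):
            return ("operate_on_information_object", info)
    # 2. Otherwise an underlying function object (e.g. a scroll object) is activated.
    for func in function_objects:
        if hit(func["rect"], x, y):
            return ("activate_function_object", func)
    # 3. Otherwise the press marks the information object under the finger, if any
    #    (i.e. marking takes place on an area not covered by a function object).
    for info in information_objects:
        if hit(info["rect"], x, y):
            info["marked"] = True
            return ("mark_information_object", info)
    return ("no_target", None)
```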
- the inventive system is intended to constitute a programming device for making possible programming of for instance the movements of an industrial robot, in which case the information objects 3 for instance constitute program code arranged in lines.
- the inventive system is with advantage a text-editing device, in which case the information objects 3 constitute text objects, such as characters arranged in lines or combinations of characters.
- a preferred embodiment of the inventive system is shown, which here constitutes a programming device 10 for programming an industrial robot 20.
- the industrial robot illustrated in Fig 9 is only a very simplified type of industrial robot, shown for the purpose of exemplification, and this is consequently not in any way to be interpreted in a manner that is limiting for the invention.
- the device 10 is schematically shown connected to the robot 20 through communication lines 40, 41 and a robot control unit 50.
- the programming device 10 is, however, with advantage arranged to communicate with the control unit of the robot through a wireless connection, for instance implemented by means of Bluetooth technology.
- the very schematically shown device 10 is preferably a programming unit, also called Teach Pendant Unit (TPU), and comprises a screen 1, which preferably is a pressure sensitive screen, a so-called touch screen, by means of which it is possible to make inputs to the device 10.
- it is however also possible to use a screen sensitive to light or other sorts of input, and also a screen that is not intended to be used for any sort of input and consequently only has a display function.
- the device 10 also comprises a control lever 12, by means of which it is possible for an operator 30 to control movements of the robot 20 for programming purposes.
- the device 10 further comprises a data processing unit, schematically indicated by the square 13, to which the screen 1 is connected.
- the data processing unit 13 preferably comprises any available type of microprocessor and also different types of memories, data busses and other equipment necessary for executing computer-readable program code, for instance in the form of application programs, system programs, operating systems etc.
- An application program for programming the robot is also included in the device 10, which application program includes a graphical user interface.
- the graphical user interface presents graphic objects, for instance in the form of activatable buttons, text, images, dialogue boxes, activatable icons, etc.
- these graphic objects, e.g. icons, correspond to computer program components, implemented in for instance Java, Java Script, C, C++ or Visual Basic. By activating such an information object, for instance by pressing with a finger on the area of the screen 1 where said object is shown, it is possible to initiate execution of the corresponding computer program component in order to program the robot.
- the execution of these components will either take place in the existing computer processing unit of the device or in other appliances with which the device is communicating.
- Such a component is for instance used for programming reference positions for the industrial robot.
- Other components are for instance used for monitoring the status of different parts of the control system of the robot, controlling of mechanical parts included in the robot, controlling/handling of signals, controlling/handling of input/output units, inputting and monitoring of function values, and handling of configuration databases in the control system of the robot.
- the different objects are displayed on the screen in the abovedescribed manner.
- Software for implementing the inventive method is preferably arranged to be included in a computer program directly loadable into the internal memory of a computer.
- a computer program is suitably provided stored on a computer-readable storage medium such as for instance an optical storage medium in the form of a CD-ROM disc, a DVD disc etc., or a magnetic storage medium in the form of a diskette, a cassette tape etc.
Landscapes
- Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- General Engineering & Computer Science (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Human Computer Interaction (AREA)
- Computer Hardware Design (AREA)
- User Interface Of Digital Computer (AREA)
Priority Applications (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
EP02746249A EP1410163A1 (en) | 2001-06-29 | 2002-06-27 | A system and a method for user interaction |
US10/481,599 US20040212626A1 (en) | 2001-06-29 | 2002-06-27 | System and a method for user interaction |
Applications Claiming Priority (4)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
SE0102319-1 | 2001-06-29 | ||
SE0102319A SE0102319D0 (en) | 2001-06-29 | 2001-06-29 | System and method for enabling user interaction |
SE0102318A SE0102318D0 (en) | 2001-06-29 | 2001-06-29 | Systems and procedure for user interaction |
SE0102318-3 | 2001-06-29 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2003007144A1 true WO2003007144A1 (en) | 2003-01-23 |
Family
ID=26655503
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/SE2002/001274 WO2003007144A1 (en) | 2001-06-29 | 2002-06-27 | A system and a method for user interaction |
Country Status (3)
Country | Link |
---|---|
US (1) | US20040212626A1 (en) |
EP (1) | EP1410163A1 (en) |
WO (1) | WO2003007144A1 (en) |
Families Citing this family (17)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
SE0103531D0 (en) * | 2001-10-23 | 2001-10-23 | Abb Ab | Industrial Robot System |
US8050782B2 (en) * | 2004-05-20 | 2011-11-01 | Abb Research Ltd. | Method and system to retrieve and display technical data for an industrial device |
EP1880269A4 (en) * | 2005-05-04 | 2012-09-12 | Hillcrest Lab Inc | Methods and systems for scrolling and pointing in user interfaces |
US7433741B2 (en) * | 2005-09-30 | 2008-10-07 | Rockwell Automation Technologies, Inc. | Hybrid user interface having base presentation information with variably prominent supplemental information |
WO2007099511A2 (en) * | 2006-03-03 | 2007-09-07 | Syddansk Universitet | Programmable robot and user interface |
US20080056146A1 (en) * | 2006-08-29 | 2008-03-06 | Elliott Steven L | Method and apparatus for determining maximum round trip times for a network socket |
US20080056147A1 (en) * | 2006-08-29 | 2008-03-06 | Elliott Steven L | Method and apparatus for determining minimum round trip times for a network socket |
EP2453325A1 (en) | 2010-11-16 | 2012-05-16 | Universal Robots ApS | Method and means for controlling a robot |
DE102010063222B4 (en) * | 2010-12-16 | 2019-02-14 | Robert Bosch Gmbh | Device and method for programming a handling device and handling device |
KR102050895B1 (en) | 2011-09-28 | 2020-01-08 | 유니버셜 로보츠 에이/에스 | Calibration and programming of robots |
US9378581B2 (en) * | 2012-03-13 | 2016-06-28 | Amazon Technologies, Inc. | Approaches for highlighting active interface elements |
JP7042554B2 (en) | 2014-03-04 | 2022-03-28 | ユニバーサル ロボッツ アクツイエセルスカプ | Industrial robots with safety functions and methods for their safety control |
JP6678648B2 (en) | 2014-09-26 | 2020-04-08 | テラダイン、 インコーポレイテッド | Grippers and automatic test equipment |
JP6868574B2 (en) | 2015-07-08 | 2021-05-12 | ユニバーサル ロボッツ アクツイエセルスカプ | A programmable robot equipped with a method for end users to program industrial robots and software for their execution. |
TWI805545B (en) | 2016-04-12 | 2023-06-21 | 丹麥商環球機器人公司 | Method and computer program product for programming a robot by demonstration |
US11179856B2 (en) | 2017-03-30 | 2021-11-23 | Soft Robotics, Inc. | User-assisted robotic control systems |
JP6763846B2 (en) | 2017-11-24 | 2020-09-30 | ファナック株式会社 | Teaching device and teaching method for teaching robots |
Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
EP0605945A1 (en) * | 1992-12-15 | 1994-07-13 | Firstperson, Inc. | Method and apparatus for presenting information in a display system using transparent windows |
US6037937A (en) * | 1997-12-04 | 2000-03-14 | Nortel Networks Corporation | Navigation tool for graphical user interface |
US6057840A (en) * | 1998-03-27 | 2000-05-02 | Sony Corporation Of Japan | Computer-implemented user interface having semi-transparent scroll bar tool for increased display screen usage |
US6118427A (en) * | 1996-04-18 | 2000-09-12 | Silicon Graphics, Inc. | Graphical user interface with optimal transparency thresholds for maximizing user performance and system efficiency |
US6134102A (en) * | 1995-07-22 | 2000-10-17 | Kuka Roboter Gmbh | Programming device |
Family Cites Families (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US5119079A (en) * | 1990-09-17 | 1992-06-02 | Xerox Corporation | Touch screen user interface with expanding touch locations for a reprographic machine |
JP3039204B2 (en) * | 1993-06-02 | 2000-05-08 | キヤノン株式会社 | Document processing method and apparatus |
US5872573A (en) * | 1996-12-30 | 1999-02-16 | Barlo Graphics N.V. | Method and system for improving legibility of text and graphic objects laid over continuous-tone graphics |
US6353451B1 (en) * | 1998-12-16 | 2002-03-05 | Intel Corporation | Method of providing aerial perspective in a graphical user interface |
US6456262B1 (en) * | 2000-05-09 | 2002-09-24 | Intel Corporation | Microdisplay with eye gaze detection |
-
2002
- 2002-06-27 EP EP02746249A patent/EP1410163A1/en not_active Withdrawn
- 2002-06-27 US US10/481,599 patent/US20040212626A1/en not_active Abandoned
- 2002-06-27 WO PCT/SE2002/001274 patent/WO2003007144A1/en not_active Application Discontinuation
Patent Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
EP0605945A1 (en) * | 1992-12-15 | 1994-07-13 | Firstperson, Inc. | Method and apparatus for presenting information in a display system using transparent windows |
US6134102A (en) * | 1995-07-22 | 2000-10-17 | Kuka Roboter Gmbh | Programming device |
US6118427A (en) * | 1996-04-18 | 2000-09-12 | Silicon Graphics, Inc. | Graphical user interface with optimal transparency thresholds for maximizing user performance and system efficiency |
US6037937A (en) * | 1997-12-04 | 2000-03-14 | Nortel Networks Corporation | Navigation tool for graphical user interface |
US6057840A (en) * | 1998-03-27 | 2000-05-02 | Sony Corporation Of Japan | Computer-implemented user interface having semi-transparent scroll bar tool for increased display screen usage |
Non-Patent Citations (2)
Title |
---|
COLBY G. ET AL.: "Transparency and blur as selective cues for complex visual information", IMAGE HANDLING AND REPRODUCTION SYSTEMS INTEGRATION, CONF. PROCEEDINGS OF THE SPIE, vol. 1460, 26 February 1991 (1991-02-26), SAN JOSE, CA, USA, pages 114 - 125, XP002956205 * |
KAMBA T. ET AL.: "Using small screen space more efficiently", HUMAN FACTORS IN COMPUTING SYSTEMS. COMMON GROUND. CHI96 CONFERENCE PROCEEDINGS, 13 April 1996 (1996-04-13) - 18 April 1996 (1996-04-18), VANCOUVER, BC, CANADA, pages 383 - 390, XP002956206 * |
Also Published As
Publication number | Publication date |
---|---|
US20040212626A1 (en) | 2004-10-28 |
EP1410163A1 (en) | 2004-04-21 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
WO2003007144A1 (en) | A system and a method for user interaction | |
CA2290166C (en) | Touch screen region assist for hypertext links | |
US20030193481A1 (en) | Touch-sensitive input overlay for graphical user interface | |
EP2606416B1 (en) | Highlighting of objects on a display | |
US8402386B2 (en) | Method and apparatus for two-dimensional scrolling in a graphical display window | |
KR20190009846A (en) | Remote hover touch system and method | |
US20040104942A1 (en) | Display and operating device, in particular a touch panel | |
US20060156249A1 (en) | Rotate a user interface | |
US20100134416A1 (en) | System and method of tactile access and navigation for the visually impaired within a computer system | |
CN107077274A (en) | Contextual tab in mobile band | |
WO2015030607A9 (en) | Gaze-controlled interface method and system | |
WO2019199504A1 (en) | System, method, computer readable medium, and viewer-interface for prioritized selection of mutually occluding objects in a virtual environment | |
JP2002351592A (en) | Method and system for magnifying/reducing graphical user interface (gui) widget based on selection pointer proximity | |
KR100222362B1 (en) | A method for rapid repositioning of a display pointer | |
US9158457B2 (en) | Adjustment of multiple user input parameters | |
US20060152495A1 (en) | 3D input device function mapping | |
CN102306158A (en) | Information display device | |
CN110075519B (en) | Information processing method and device in virtual reality, storage medium and electronic equipment | |
CN103752010B (en) | For the augmented reality covering of control device | |
US9791932B2 (en) | Semaphore gesture for human-machine interface | |
CN111736689A (en) | Virtual reality device, data processing method, and computer-readable storage medium | |
US7355586B2 (en) | Method for associating multiple functionalities with mouse buttons | |
WO2005081096A2 (en) | Control system for computer control devices | |
EP1182535A1 (en) | Haptic terminal | |
US11249732B2 (en) | GUI controller design support device, system for remote control and program |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
AK | Designated states |
Kind code of ref document: A1 Designated state(s): AE AG AL AM AT AU AZ BA BB BG BR BY BZ CA CH CN CO CR CU CZ DE DK DM DZ EC EE ES FI GB GD GE GH GM HR HU ID IL IN IS JP KE KG KP KR KZ LC LK LR LS LT LU LV MA MD MG MK MN MW MX MZ NO NZ OM PH PL PT RO RU SD SE SG SI SK SL TJ TM TN TR TT TZ UA UG US UZ VN YU ZA ZM ZW |
|
AL | Designated countries for regional patents |
Kind code of ref document: A1 Designated state(s): GH GM KE LS MW MZ SD SL SZ TZ UG ZM ZW AM AZ BY KG KZ MD RU TJ TM AT BE CH CY DE DK ES FI FR GB GR IE IT LU MC NL PT SE TR BF BJ CF CG CI CM GA GN GQ GW ML MR NE SN TD TG |
|
121 | Ep: the epo has been informed by wipo that ep was designated in this application | ||
DFPE | Request for preliminary examination filed prior to expiration of 19th month from priority date (pct application filed before 20040101) | ||
WWE | Wipo information: entry into national phase |
Ref document number: 2002746249 Country of ref document: EP |
|
WWP | Wipo information: published in national office |
Ref document number: 2002746249 Country of ref document: EP |
|
REG | Reference to national code |
Ref country code: DE Ref legal event code: 8642 |
|
WWE | Wipo information: entry into national phase |
Ref document number: 10481599 Country of ref document: US |
|
NENP | Non-entry into the national phase |
Ref country code: JP |
|
WWW | Wipo information: withdrawn in national office |
Country of ref document: JP |