AU2011379028A1 - On screen help with contextual shortcut on an appliance - Google Patents

On screen help with contextual shortcut on an appliance Download PDF

Info

Publication number
AU2011379028A1
Authority
AU
Australia
Prior art keywords
assistance
appliance
processing means
touch
information
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
AU2011379028A
Other versions
AU2011379028B2 (en)
Inventor
Jerome Brasseur
Arnd Hofmann
Petter Karlsson
Martin Knausenberger
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Electrolux Home Products Corp NV
Original Assignee
Electrolux Home Products Corp NV
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Electrolux Home Products Corp NV filed Critical Electrolux Home Products Corp NV
Publication of AU2011379028A1 publication Critical patent/AU2011379028A1/en
Application granted granted Critical
Publication of AU2011379028B2 publication Critical patent/AU2011379028B2/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Links

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0484Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F8/00Arrangements for software engineering
    • G06F8/60Software deployment
    • G06F8/65Updates
    • FMECHANICAL ENGINEERING; LIGHTING; HEATING; WEAPONS; BLASTING
    • F24HEATING; RANGES; VENTILATING
    • F24CDOMESTIC STOVES OR RANGES ; DETAILS OF DOMESTIC STOVES OR RANGES, OF GENERAL APPLICATION
    • F24C7/00Stoves or ranges heated by electric energy
    • F24C7/08Arrangement or mounting of control or safety devices
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/016Input arrangements with force or tactile feedback as computer generated output to the user
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0481Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F3/0482Interaction with lists of selectable items, e.g. menus
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F9/00Arrangements for program control, e.g. control units
    • G06F9/06Arrangements for program control, e.g. control units using stored programs, i.e. using an internal store of processing equipment to receive or retain programs
    • G06F9/44Arrangements for executing specific programs
    • G06F9/451Execution arrangements for user interfaces
    • G06F9/453Help systems

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Software Systems (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Human Computer Interaction (AREA)
  • Computer Security & Cryptography (AREA)
  • Chemical & Material Sciences (AREA)
  • Combustion & Propulsion (AREA)
  • Mechanical Engineering (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

On-screen help with contextual shortcut on an appliance. The invention relates to an appliance (100) with a user interface (110) comprising a display (111) for showing indications about conditions in the appliance, and first (112, 114) and second (113) touch-sensitive input means. The second input means causes the interface to enter an assistance mode displaying assistance items, which a processing means (121) extracts from a collection (122) of assistance information based on the content of the display when the assistance mode is invoked. In one embodiment, a data record ensures that the user returns to the same page after consulting the assistance mode. In another embodiment, the appliance is adapted to update the collection of assistance information on the basis of data received at an external communication interface. In another embodiment, the display reacts differently to short and long activations of a given touch-sensitive input means. In another embodiment, the display includes parallax compensation.

Description

ON-SCREEN HELP WITH CONTEXTUAL SHORTCUT ON AN APPLIANCE

Technical field of the invention

The invention disclosed herein relates to graphic user interfaces for appliances, preferably cooking appliances. More precisely, the invention relates to methods and devices for providing contextual shortcuts, particularly shortcuts to assistance information.

Background of the invention

Household appliances are becoming more and more complex, and the development of new functionalities is accompanied by parallel efforts to keep the man-machine interface user-friendly and simple to handle. The problem of structuring information in graphical user interfaces (GUIs) for household appliances has been addressed in several references. For instance, WO 2005/26620 discloses a domestic cooking appliance with a display means adapted to output different information in different operating states of the appliance, offering situation-related information to the user. Such display means may be configured to hide information that is not related to the current operating state, so as to allow the user to focus on the relevant information. However, even the quantity of information pertaining to a particular operating state may easily overwhelm a user, especially during the initial learning phase or when the appliance is being displayed in a retail store and buying decisions are to be taken. Hence, there is a need for improved GUIs for household appliances.

Summary of the invention

In view of the above shortcomings of the prior art, it is an object of the present invention to propose a man-machine interface wherein the information is organized in an improved manner. It is a particular object to hide certain information from the user if this information is currently of lesser relevance.
An appliance may be equipped with control means in the form of a visual display for outputting information (e.g., an indication about conditions in the appliance) and first touch-sensitive input means (e.g., an area of a touch screen, a mechanical knob or button), which are communicatively coupled to a processing means. Together, the display and the input means may be regarded as a GUI. The processing means is operable to invoke a specified control action (e.g., changing an operating condition of the appliance or browsing between menu pages) in response to actuation of the first touch-sensitive input means. The first touch-sensitive input means is preferably a 'soft button' in the sense that its significance may change over time and/or with respect to different operating conditions, so that it can be associated with different control actions in different operating modes. To verify what control action is currently associated with the first touch-sensitive input means, the user may consult the visual display, which, when controlled by the processing means, may display textual or graphic information indicating this.

The inventors have realized that information can be organized not only into different views (e.g., menu pages) and spatially within a particular view, but also, conceptually speaking, along a further dimension. Accordingly, as set forth in the independent claims, the invention proposes devices and methods enabling an assistance mode, in which one or more assistance items are shown on the visual display. The processing means is configured to store a collection of assistance information (e.g., a database) and to extract those assistance items therein that are relevant to the page currently displayed when the assistance mode is activated. For instance, this may be achieved by storing associations between assistance items and information susceptible of being displayed, so that an assistance page may be dynamically generated (or rendered) on the basis of the information currently displayed at the moment of activation. Since the assistance items to be displayed are extracted contextually (i.e., in response to the content of the display at the moment when the assistance mode was entered), the assistance mode offers the user a shortcut to the relevant assistance information. Preferably, the appliance comprises a second touch-sensitive input means for entering the assistance mode. Like the first touch-sensitive input means, it may be either a touch-sensitive display area or a mechanical button or knob.

Thus, the information (e.g., static information about the appliance, current state information, measurements, available control actions, assistance items) susceptible of being displayed by the visual display is structured both thematically and with respect to its expected frequency of use. Generally speaking, the thematic structure manifests itself by the partition into menu views, which may be associated with each of the available operating states of the appliance, while the frequency of use may translate into a partition into main information and assistance information.
Assistance information, which may include instructions for use, troubleshooting information and support details, is typically consulted less frequently than the main information, especially after an initial learning period, and may advantageously be localized in the assistance mode, so that it is hidden from the user except when the assistance mode is active. The invention therefore provides for a menu system with clearly arranged, visually distinct menu pages which is still rich in information thanks to the additional depth afforded by the assistance mode.

Advantageously, the activation of the assistance mode from a menu page changes the menu page into an assistance page containing all assistance items that are relevant in view of the menu page. If several assistance items are shown, the user can easily browse through these in the assistance mode and select the one he or she needs. This is preferred over the option of associating each menu page with several input means that lead to different (sets of) assistance items. One drawback of such a solution is that navigation efficiency requires each such input means to carry an indication regarding the associated assistance item(s), which occupies valuable space that could otherwise have been used to display the main information on the menu page.

The assistance items may comprise textual information (e.g., characters), pictorial information (e.g., pictograms, symbols, icons, still and moving images) and aural information (e.g., speech, music, sampled, synthesized or recorded sounds). Within the inventive concept, assistance items may as well include tactile and haptic information (e.g., Braille print, vibrations) and the like.
Preferable ways of generating (or rendering) an assistance page on the basis of a menu view will now be discussed. The features for enabling the generation, which characterize a first group of embodiments, are useful alone and may be practised independently of the other inventive features.

Since experienced users access assistance information relatively seldom, as already discussed above, it is preferable to limit the second touch-sensitive input means to a single mechanical button or knob, or to a single area of the touch screen. The processing means is adapted to maintain a data record, preferably a non-permanent data record in a volatile memory, relating to the information currently shown in the visual display. The data record may be a full copy of the display content but is preferably limited to the information needed to determine what assistance item(s) is (are) to be displayed. Generally speaking, it is advantageous to represent the display content in terms of its semantic content rather than its graphic form. For example, the display content may be partitioned into functional blocks and described by a list of the functions to which the blocks relate; for the purpose of extracting assistance items, however, the relative spatial positions of these functional blocks are unimportant and are preferably omitted from the data record for simplicity. When the assistance mode is entered from a menu view, the processing means generates and displays an assistance page containing an assistance menu comprising selectable assistance items which have been extracted as a function of the content of the menu view from which the assistance mode was entered. As such, a given assistance item may appear in one assistance menu only (that is, in the assistance menu generated from one particular menu view). One or more assistance items in one assistance menu may, however, also appear in assistance menus associated with many other - or even all - menu views in the menu system. Assistance items appearing in all assistance menus, regardless of the menu views from which those menus were generated, may be regarded as static items. Contact details for a service centre, which may be of use in all operating modes of the appliance, are preferably presented as static assistance menu items.

Preferably, said data record, which indicates the information shown in the visual display when the assistance mode was entered, is maintained unchanged throughout the assistance mode. The information in the data record may be used to facilitate navigation, namely by showing information corresponding to the content of the data record on the visual display when the assistance mode is exited. This way, when a user exits the assistance mode after having studied the assistance information, he or she will return to a display image identical or quasi-identical (disregarding clocks and the like) to the one from which he or she invoked the assistance mode. In order that the display image (e.g., menu view) to which the user exits the assistance mode faithfully reproduces the previous display image, the data record preferably encodes the displayed information in sufficient detail that a visually identical display image can be generated. This may require the data record to be richer in content than in the case where the data record is merely used to control the extraction of assistance items.
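The functional-block representation described above lends itself to a simple contextual lookup: the data record lists the functions currently on screen, and each assistance item in the collection declares which functions it relates to. The following Python sketch is a minimal illustration under those assumptions; the class and function names (AssistanceItem, extract_assistance_items) and the sample content are hypothetical and not taken from the patent.

```python
# Minimal sketch: extracting assistance items contextually from a semantic
# data record that lists the functional blocks currently on screen.
# All names and sample data are hypothetical.
from dataclasses import dataclass
from typing import FrozenSet, List


@dataclass(frozen=True)
class AssistanceItem:
    title: str
    text: str
    related_functions: FrozenSet[str]     # empty set = static item, always shown


COLLECTION: List[AssistanceItem] = [
    AssistanceItem("Choosing a fan speed", "Use a low speed for delicate pastry ...",
                   frozenset({"fan_speed"})),
    AssistanceItem("Setting the temperature", "Preheat before inserting food ...",
                   frozenset({"temperature"})),
    AssistanceItem("Service centre", "Call +00 000 000 for support (hypothetical).",
                   frozenset()),          # static: relevant in every view
]


def extract_assistance_items(display_record: FrozenSet[str]) -> List[AssistanceItem]:
    """Return the items relevant to the functions named in the data record,
    plus static items that are shown regardless of context."""
    return [item for item in COLLECTION
            if not item.related_functions                 # static item
            or item.related_functions & display_record]   # shares a function


if __name__ == "__main__":
    # Data record for a fan-assisted cooking view showing fan speed + temperature.
    record = frozenset({"fan_speed", "temperature"})
    for item in extract_assistance_items(record):
        print("-", item.title)
```

An item with an empty set of related functions behaves like the static service-centre entry mentioned above: it is included in every assistance menu regardless of context.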
If the menu system executed by the GUI is composed of predefined pages, each comprising, possibly, static information combined with areas for browsing, commanding control actions etc., and being interrelated by a browsing structure, then it is advantageous to associate each page with an identifier. The pages may be selected from a page database, which links each identifier to more comprehensive page information sufficient to generate the visual image of the page. The data record may then simply contain the identifier of the currently displayed page, which requires limited storage space. Advantageously, the assistance items that are to be displayed when the assistance mode is entered from a specific menu page can be retrieved from a lookup table associating menu pages with assistance items (a minimal sketch of this arrangement follows below).

It is currently envisaged to apply the invention to household (domestic) appliances, preferably cooking appliances and most preferably ovens or microwave ovens.

A second group of embodiments of the present invention is intended to mitigate or overcome a problem identified by the inventors. They have realized that the static nature of user's manuals is sometimes an obstacle to subsequent improvement of existing products. Both minor improvements, such as bug fixes in software (upgrades), and the inclusion of new functionalities of potential utility to the user may be rejected for the sole reason that the product has already been released to consumers and it is too late to update the accompanying manuals. This problem arises both when the user's manual is distributed as an electronic document stored on a memory in the appliance and when it is distributed as a paper manual.

To address this, the processing means within the appliance further comprises an external communication interface for receiving data (or instructions) relating to maintenance of the collection of assistance information, by way of addition, removal or replacement of assistance items. It is noted that the features characterizing this second group of embodiments may be practised independently of the other inventive features.

The maintenance of the collection of assistance information may be carried out in a dedicated maintenance mode. The external communication interface may be adapted to receive data relating to maintenance of other parts of the GUI as well, whereby a single maintenance episode may update both the collection of assistance information and, for instance, the main information contained in the menus. This setup enables a communicative link to be established from the appliance to an external site such as a customer service centre, user support forum, repair service or the like, even though the external site may be geographically remote. The communication link can be used to distribute software updates, software upgrades and accompanying updates to a menu system executed by the GUI in the appliance. Manually entered messages, automatic error messages identifying a failure condition in the appliance, semi-automatic repair orders, spare part orders and the like may be transmitted over the communication link in the opposite direction, towards the external site. The communication link may be wired or wireless.
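Returning to the page-identifier variant described at the start of this passage, the sketch below reduces the data record to a single page key and the contextual extraction to a table lookup, and also shows the return to the originating page when the assistance mode is exited. The identifiers mirror the sc10/sc13 example used later in the description; the table contents, class name and method names are assumptions made for illustration.

```python
# Minimal sketch of the page-identifier approach: the data record stores only
# the identifier of the page shown when assistance was invoked, and a lookup
# table maps page identifiers to assistance items. Table contents and names
# are illustrative assumptions.
from typing import Dict, List, Optional

ASSISTANCE_TABLE: Dict[str, List[str]] = {
    "sc10": ["Getting started", "Service centre"],
    "sc13": ["About fan-assisted cooking", "Interactive help", "Service centre"],
}


class MenuController:
    def __init__(self, start_page: str) -> None:
        self.current_page = start_page
        self._data_record: Optional[str] = None    # page id captured on entry

    def enter_assistance_mode(self) -> List[str]:
        self._data_record = self.current_page       # kept unchanged while in help
        self.current_page = self.current_page + "a"  # e.g. "sc13" -> "sc13a"
        return ASSISTANCE_TABLE.get(self._data_record, ["Service centre"])

    def exit_assistance_mode(self) -> str:
        # Restore the page recorded on entry so the user resumes where they left off.
        if self._data_record is not None:
            self.current_page = self._data_record
            self._data_record = None
        return self.current_page


if __name__ == "__main__":
    ui = MenuController(start_page="sc13")
    print(ui.enter_assistance_mode())   # items relevant to fan-assisted cooking
    print(ui.exit_assistance_mode())    # -> "sc13"
```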
Advantageously, the processing means is adapted to generate a new collection of assistance information on the basis of instructions received at the external communication interface and also of a static data record. Suitably, the static record contains a portion of the data that has been identified as relatively less prone to change, e.g., the hierarchic structure of the menu system and graphic decorative material, while the instructions to be received at the external communication interface relate to a portion of the data that is typically more variable. Such a more variable portion may relate to the wording of menu texts, the association of menu information with assistance items, etc. In the maintenance mode, the processing means combines information from both the static record and the instructions received by the external communication interface, formats these into user-readable form if necessary and stores the output data as a collection of assistance information. The collection of assistance information thus generated replaces the previous version of the collection, and this constitutes the update. In the maintenance mode, the processing means may compile computer-readable code, execute formatting commands or interpret presentational mark-up and other codes defined by a mark-up language, such as HTML, XML, CSS and the like. Output data other than assistance items may be stored in forms other than said collection. An advantage of separating the data underlying the assistance items into a static and a variable portion is that a smaller quantity of data needs to be transmitted to the appliance. This separation is advantageous from a further point of view, namely that it may reduce the vulnerability of the appliance to inadvertent amendments that could be prejudicial to the functioning of the appliance. If all or most data that are critical to the functioning are gathered in the static portion, the risk of a standstill is reduced.

A further possibility falling within the scope of the invention would be to express the amendments to be made to the collection of assistance information in incremental form, that is, in terms of additions and deletions to be made within specific assistance items. Items that are not affected by additions or deletions are left unamended in the maintenance mode. If a non-incremental representation of the amendments is used, then, for completeness, the data underlying unamended assistance items must also be transmitted to and received by the external communication interface. Using an incremental representation may be the more economical option in that a relatively smaller transmitted data volume is sufficient to enable the maintenance operation and in that the maintenance operation is in most cases concerned with only a subset of the collection of assistance information.
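The following sketch illustrates, under stated assumptions, how a maintenance mode might rebuild the collection of assistance information from a protected static record (here, the menu structure and the item keys per page) and a variable portion received over the external communication interface (here, a JSON payload of item texts). The data layout, the JSON format and all names are hypothetical; the patent does not prescribe a particular representation.

```python
# Minimal sketch of the maintenance-mode merge described above: a new
# collection of assistance information is generated from a protected static
# record plus a variable portion received over the external communication
# interface. Names, file formats and sample data are illustrative assumptions.
import json
from typing import Dict, List

# Static record: data identified as unlikely to change (here, the menu
# structure and which assistance-item keys belong to each page).
STATIC_RECORD: Dict[str, List[str]] = {
    "sc10": ["welcome_help", "service_centre"],
    "sc13": ["fan_speed_help", "temperature_help", "service_centre"],
}


def rebuild_collection(static_record: Dict[str, List[str]],
                       received_payload: str) -> Dict[str, Dict[str, str]]:
    """Combine the static record with the variable portion (here a JSON
    payload of item texts) into a ready-to-use collection keyed by page."""
    variable_items: Dict[str, str] = json.loads(received_payload)
    collection: Dict[str, Dict[str, str]] = {}
    for page_id, item_keys in static_record.items():
        collection[page_id] = {
            key: variable_items.get(key, "(text not yet delivered)")
            for key in item_keys
        }
    return collection


if __name__ == "__main__":
    # Payload as it might arrive at the external communication interface;
    # only the variable (textual) portion is transmitted.
    payload = json.dumps({
        "fan_speed_help": "Lower fan speeds suit delicate dishes.",
        "temperature_help": "Preheat the cavity before baking.",
        "service_centre": "Support: +00 000 000 (hypothetical number).",
        "welcome_help": "Swipe horizontally to change cooking mode.",
    })
    new_collection = rebuild_collection(STATIC_RECORD, payload)
    print(json.dumps(new_collection["sc13"], indent=2))
```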
Preferably, the appliance is adapted to enter the maintenance mode in response to a remote control command. For instance, the external communication interface may be adapted to receive and recognize a particular message format that triggers the maintenance mode. This makes it possible to distribute the data necessary for maintenance of the collection of assistance information over an extended period, which may be advantageous if bandwidth is limited, but to carry these maintenance actions out at a well-defined point in time. This is useful to avoid divergences between different instances of the same appliance type with respect to the software installed, and particularly with respect to the assistance information.

In this group of embodiments, where the assistance information in an appliance can be updated after its assembly and delivery to the consumer, the assistance information may include information produced by the user of the appliance and/or other users of the same appliance type. For instance, the user-generated information may include user-rated assistance items, discussion threads, frequently asked questions, questions and answers, recipes, how-to items, as well as social-media items like images, videos, polls, top lists etc.

In a third group of embodiments, there are provided appliances with visual displays capable of more accurate and reliable interaction with a user. The inventors have realized that display systems that are rich in information may suffer from overcrowding, wherein the visual features are so small that they are difficult to perceive visually and hard to actuate. In the third group of embodiments, this problem is solved by configuring touch-sensitive input means in the display system with multiple meanings depending on the duration of actuation. More precisely, the processor associates a tactile activation of relatively shorter duration with a primary control action and associates a tactile activation of relatively longer duration with a secondary control action. The secondary control action may be to enter an assistance mode of the type described previously. As one example, a touch-sensitive means may react to a short press by entering "oven cooking settings", while a long press will enter "oven help". Clearly, the appliance may additionally include plural further touch-sensitive input means.
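A minimal sketch of the duration-dependent behaviour described above follows; the 0.8-second threshold, the class name and the example actions are assumptions chosen only for illustration.

```python
# Minimal sketch of duration-dependent dispatch: a short press triggers the
# primary control action, a long press enters the assistance mode. The
# threshold value and all names are illustrative assumptions.
import time
from typing import Optional

LONG_PRESS_SECONDS = 0.8   # assumed threshold separating short from long presses


def primary_action() -> str:
    return "oven cooking settings opened"


def enter_assistance_mode() -> str:
    return "oven help opened"


class DurationSensitiveButton:
    """Tracks press/release timestamps and dispatches on release."""

    def __init__(self) -> None:
        self._pressed_at: Optional[float] = None

    def press(self, now: Optional[float] = None) -> None:
        self._pressed_at = time.monotonic() if now is None else now

    def release(self, now: Optional[float] = None) -> str:
        if self._pressed_at is None:
            return "ignored"                       # release without press
        held = (time.monotonic() if now is None else now) - self._pressed_at
        self._pressed_at = None
        if held >= LONG_PRESS_SECONDS:
            return enter_assistance_mode()         # secondary control action
        return primary_action()                    # primary control action


if __name__ == "__main__":
    button = DurationSensitiveButton()
    button.press(now=0.0)
    print(button.release(now=0.2))   # short press -> primary action
    button.press(now=1.0)
    print(button.release(now=2.5))   # long press  -> assistance mode
```

Dispatching on release rather than on press also keeps the sketch compatible with the guard mechanism discussed further below.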
This group of embodiments is particularly useful when the display device is a touch screen, wherein the first touch-sensitive input means may be a sensitive region of the touch screen. As a screen image becomes populated with a greater number of sensitive regions, the size of each decreases, and so the probability that a user inadvertently touches and activates an adjacent region grows. According to this embodiment, however, the number of sensitive regions needed to encode a given number of functions can be limited so that each remains conveniently large.

In a further development, there is provided a guard mechanism for avoiding inadvertent tactile activation of the first touch-sensitive input means, which as noted above may be a region of a touch screen. The guard mechanism may comprise one or more of the following rules:

    • The processing means responds only to a release of an object (e.g., a finger) from the first touch-sensitive input means. In other words, only a complete application-and-release cycle will be perceived as a tactile activation by the processing means. For instance, if the user realizes that his or her finger has touched an unintended region, the option of sliding the finger out of the region may avoid submitting an inadvertent control action to the appliance.
    • The processing means responds to an application of an object (e.g., a finger) to the first touch-sensitive input means by displaying a visual indication identifying the first touch-sensitive input means. In particular, the visual indication may relate to the control action with which the touch-sensitive input means is associated. Hence, during the time period between application and release of the object, an indication is shown alerting the user of the control action that will be submitted if the tactile activation is completed. Preferably, the indication is shown in an area of the screen where it is not hidden by the touching object.

Further, the guard mechanism can be specifically adapted to a partition of the control actions as to the seriousness of an inadvertent activation. For instance, the control actions may be divided into operational commands and navigation commands, wherein the former control present and/or future conditions prevailing in the appliance other than in the processing means and visual display, such as the start of a wash cycle or the start of a sleep mode. The navigation commands may be used to enter and leave different parts of a menu system executed by the processing means and shown by the visual display, and so will have a limited impact on the physical reality outside the menu system. In particular, an inadvertent entry of a navigation command will not spend energy or time and need not be subject to the same precautionary measures. Based hereon, it is advantageous to configure the guard mechanism with a further rule:

    • The processing means responds to an invoked operational command by requesting a separate confirmatory activation. The confirmatory activation may relate to activating a different touch-sensitive input means. This gives the user yet another opportunity to cancel a control action that he or she has initiated inadvertently.

Accordingly, the navigation commands are not subject to the separate confirmatory activation, which facilitates navigation around the menu system.
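The guard rules listed above can be combined into a small state machine. The sketch below is one possible reading of those rules, with the operational/navigation split and a print statement standing in for the on-screen indication; all names and the confirmation flow are illustrative assumptions.

```python
# Minimal sketch of the guard mechanism described above: activation only on
# release, a visual hint while the finger rests on the region, and a separate
# confirmation step for operational (as opposed to navigation) commands.
# Names, the command split and the print-based "display" are assumptions.
from enum import Enum, auto


class CommandKind(Enum):
    NAVIGATION = auto()    # only moves within the menu system
    OPERATIONAL = auto()   # changes physical conditions in the appliance


class GuardedRegion:
    def __init__(self, label: str, kind: CommandKind) -> None:
        self.label = label
        self.kind = kind
        self._touched = False
        self._awaiting_confirmation = False

    def on_touch(self) -> None:
        # Rule: show an indication of the bound action while the region is touched.
        self._touched = True
        print(f"(display) about to invoke: {self.label}")

    def on_slide_out(self) -> None:
        # Sliding the finger off the region cancels the activation.
        self._touched = False
        print("(display) cancelled")

    def on_release(self) -> str:
        # Rule: only a complete press-and-release cycle counts as an activation.
        if not self._touched:
            return "no action"
        self._touched = False
        if self.kind is CommandKind.OPERATIONAL and not self._awaiting_confirmation:
            self._awaiting_confirmation = True
            return f"confirm '{self.label}'?"      # separate confirmatory step
        self._awaiting_confirmation = False
        return f"{self.label} executed"


if __name__ == "__main__":
    start_cycle = GuardedRegion("start wash cycle", CommandKind.OPERATIONAL)
    start_cycle.on_touch()
    print(start_cycle.on_release())    # -> asks for confirmation
    start_cycle.on_touch()
    print(start_cycle.on_release())    # -> executed after confirmation

    back = GuardedRegion("back to menu", CommandKind.NAVIGATION)
    back.on_touch()
    back.on_slide_out()
    print(back.on_release())           # -> no action (finger slid out)
```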
In a further development of the preceding embodiment, the potential cancellation of a control action is facilitated by storage of an indication regarding the content of the screen image from which the invocation of the control action was initiated. The storage may be in the form of a data record relating to the semantic and/or graphical content of the screen image, as discussed above. Hence, if a control action is cancelled for lack of the confirmatory action, the user is brought back to the screen image. This typically leads to more efficient user interaction than if the user had been brought back to a root position in the menu system after the cancelling.

It is noted that the features appearing in the above third group of embodiments can be practised alone and independently of the other inventive features described herein.

The invention relates both to appliances with the features outlined above and to the methods which these perform during operation. The invention further relates to the method of transmitting, from a remote site, update information and executable update instructions to an appliance, causing it to enter the maintenance mode and update its collection of assistance information. The method may comprise the final step of confirming that the requested update has been completed successfully.

It is noted that the invention relates to all combinations of features, even if they are recited in different claims.

Brief description of the drawings

The invention will now be described in more detail with reference to the accompanying drawings, on which:

figures 1 and 2 are generalized block diagrams of appliances with graphic user interfaces, in accordance with embodiments of the present invention;

figures 3 and 4 are exemplary views of menu systems suitable to be applied in connection with the present invention;

figure 5 schematically illustrates a process of dynamically updating a collection of assistance information;

figure 6 schematically illustrates a process of generating assistance information through user interaction; and

figure 7 is an exemplary sequence of menu views produced by a menu system in an appliance according to the present invention.

Unless otherwise indicated, like reference numbers are used to indicate like drawing items.

Detailed description of embodiments

Figure 1 shows an appliance 100, such as a cooking appliance or in particular an oven or microwave oven, comprising a control section 120, functional components 131, 132, 133 controllable by electric signals from the processing means 120, a user interface 110 enabling a user to communicate with the processing means 120, and an external communication interface 140 enabling the processing means 120 to transmit data to external entities or sites and/or to receive data from such an entity or site. The external communication interface 140 may be a wired interface, such as an Ethernet adapter or memory card reader. It may also be wireless, e.g., a Bluetooth, Zigbee or WLAN adapter or an antenna for communication over a wireless telephone network. The processing means 120 comprises at least a microprocessor 121 and a memory 122 for storing, inter alia, assistance information. The memory 122 may for instance be organized as a database, preferably a relational database, from which items can be extracted individually in response to queries in a per se known fashion.
The user interface 110 includes a visual display 111 and several touch-sensitive input means 112, 113, 114. The user interface 110 executes a menu system. In the appliance 100 shown in figure 1, the first and third input means 112, 114 are soft buttons which are, in each menu view, associated with particular control actions affecting the operating conditions in the appliance (e.g., by sending control information to the functional components 131, 132, 133) or associated with browsing actions relative to the menu system. The control action currently associated with each input means 112, 114 may for example be indicated by text or symbols in the lower portion of the display 111. The second touch-sensitive input means 113 is associated with an assistance mode, in which the display 111 shows assistance information extracted by the microprocessor 121 from the memory 122 in the processing means 120.

Figure 2 shows an appliance similar to that of figure 1, however equipped with a touch-sensitive display 211 instead of the conventional display 111. First and second touch-sensitive input means 212, 213 are provided in the form of areas in the touch-sensitive screen, which are preferably visually distinctive. The second input means 213 is, similarly to the embodiment shown in figure 1, associated with the assistance mode.

Figure 3 is a view 300 of a menu system for an appliance of the type shown in figures 1 and 2. The view 300 is comprised of a macro navigation section 310, a fine navigation section 320 and navigation means 330, which may be provided in the form of touch-sensitive areas in the displayed image or may be buttons separate from the display. Alternatively, if the display is touch-sensitive, the navigation may be commanded by finger gestures, e.g., horizontal swipes for macro navigation, vertical swipes for fine navigation and taps for selection. The macro navigation section 310 is operable to cyclically browse between menu views 311 to 315, which may in a cooking appliance be views relating to conventional baking, steam cooking, fan-assisted cooking, service functionalities and grill cooking. The fine navigation section 320 may change in response to browsing between different menu views. For example, the menu items 321, 322, 323 shown in figure 3 relate to the fan-assisted cooking only and will be replaced by other items in other menu views. In particular, several menu views may include an item with an "assistance" symbol similar to that of the third menu item 323; this item 323 is used for entering the assistance mode, but because of the contextual matching of assistance items to menu views, the appearance of the assistance menu opened in response to activating the third menu item 323 may vary between menu views. The third menu item 323, as well as its counterparts in different menu views, therefore realizes a second touch-sensitive input means in the sense of the present invention.

Alternatively, the assistance mode may be activated by a second touch-sensitive input means in the form of a static area of a touch screen or a button located outside the display.

Figure 4 is another exemplifying view 400 of a menu system, which enables a user to perform macro navigation by selecting an item (e.g., fan-assisted cooking) in a first column 401, whereby items relating to the selection appear in a second column 402 in a hierarchic fashion.
Fine navigation is performed by selection of an item (e.g., fan speed) from said second column 402, upon which selection a third column 403 with selectable further menu items appears (e.g., different fan speeds). As suggested by the symbols, the content of the menus may be exactly as in the menu system of figure 3, including the item for invoking the assistance mode, and it is possible to represent the same structure of hierarchic menus using either one of the menu systems.

Figure 5 illustrates information flows during a maintenance process for updating the assistance information in the appliance 100. A remote site 599, at which update information is prepared, and the appliance 100, via its external communication interface 140, are connected to a communication network 580. The external communication interface 140 receives update information, which is in this example a new instance of a variable portion of the assistance information, e.g., all textual information, which is used as a basis for generating a new collection of assistance information 122. The appliance 100 further comprises a memory storing a static data record 510, which may be protected from any amendments or from amendments ordered by unauthorised parties, and is also used as a basis for generating said new collection of assistance information 122. The static record contains information which a system designer has identified as being less prone or suitable to change, e.g., certain pictorial information and style sheets. After new update information has been received at the communication interface 140 (and, possibly, after a message triggering the start of the maintenance process has been received), the microprocessor 121 is configured to generate a new collection of assistance information 122 by concatenating input from the external communication interface 140 and the static data record 510 and carrying out any formatting and similar actions that may be necessary to obtain a collection of assistance information 122 that is ready to use.

Figure 6 illustrates a process for generating new assistance information by user interaction. The remote site 599 and a plurality of appliances 100, all of which may be located in different geographic sites, are interconnected via a communication network 580. In an initial step, a first appliance 100a sends a question Q via the network 580 to the remote site 599. A response A (which is preferably self-contained as to its content) is generated in manual or automatic fashion at the remote site 599 and is transmitted to the first appliance 100a and to all further connected appliances 100b, 100c, 100d. The users of the further connected appliances 100b, 100c, 100d may or may not read the response A immediately, but the appliances 100b, 100c, 100d may include the response A in their respective collections of assistance information for later reference.

Many variations to the process shown in figure 6 can be envisaged, considering that the question Q and response A can be replaced by other information without affecting the technical features of the connected devices. For instance, the question Q may be replaced by a message encoding a rating of an assistance item ("Was this helpful?") by a user of the first appliance 100a, and the response A may be a summary of ratings from different users.
This way, if assistance items are amended by adding the rating summary, navigation can be facilitated by favouring highly rated assistance items.
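As a rough illustration of the figure-6 interaction, the sketch below models the remote site broadcasting a response to every connected appliance, each of which folds the response into its local collection of assistance information. The in-memory classes stand in for a real communication network, and all names and the generated answer text are assumptions.

```python
# Minimal sketch of the figure-6 style interaction: one appliance submits a
# question (or a rating), the remote site produces a response, and every
# connected appliance stores the response in its local collection of
# assistance information. All names are illustrative assumptions.
from typing import Dict, List


class Appliance:
    def __init__(self, name: str) -> None:
        self.name = name
        self.assistance_collection: Dict[str, str] = {}

    def receive(self, topic: str, text: str) -> None:
        # Store the broadcast response for later consultation in assistance mode.
        self.assistance_collection[topic] = text


class RemoteSite:
    def __init__(self, connected: List[Appliance]) -> None:
        self.connected = connected

    def handle_question(self, topic: str, question: str) -> None:
        # In practice the answer may be written by support staff; here it is
        # generated automatically for illustration only.
        answer = f"Q: {question}\nA: (support answer for '{topic}')"
        for appliance in self.connected:          # broadcast to every appliance
            appliance.receive(topic, answer)


if __name__ == "__main__":
    fleet = [Appliance(f"oven_{i}") for i in range(3)]
    site = RemoteSite(fleet)
    site.handle_question("steam_cleaning", "How often should I run steam cleaning?")
    print(fleet[2].assistance_collection["steam_cleaning"])
```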
Figure 7 illustrates a sequence of menu views which may be produced during use of an appliance according to the invention. Below each menu view there is a touch-sensitive input means labelled with an assistance symbol "i" for activating the assistance mode. The menu system illustrated in figure 7 is composed of pages associated with identifiers, such as sc10 and sc13. Next to each menu view is symbolically illustrated a data record 701 relating to the information currently shown in the visual display. Since the menu system is organized as pages, it is sufficient to store a page identifier in the data record 701.

The top menu view is a "home" view (identifier: sc10), as indicated by the encircled symbol, in which a user may select one of three menu items, each corresponding to a further menu view corresponding to an operational mode of the appliance. Selecting the lowermost item, which is in this example fan-assisted cooking, opens a menu view relating to the fan-assisted cooking mode (identifier: sc13) with menu items such as fan speed and temperature. By pressing the assistance input means "i", the user will open an assistance menu associated with fan-assisted cooking (identifier: sc13a), as indicated by the rightward arrow. The data record 701 retains the value sc13, which identifies the page from which the assistance mode was invoked. The assistance menu may contain items for obtaining deepened assistance information about fan-assisted cooking, for getting interactive help on this subject, for placing repair orders directed to errors that are typically discovered during fan-assisted cooking, and for leaving the assistance mode. The assistance input means "i" may be used both for entering and exiting the assistance mode. When the user leaves the assistance mode, he or she is taken to the page identified by the data record 701, that is, the menu view corresponding to fan-assisted cooking, as indicated by the down/leftward arrow. The user can then continue browsing the menu system from where he or she left it.

In one embodiment (not shown), there is provided an appliance comprising a touch screen for showing indications about conditions prevailing in the appliance and for receiving user input; and a processing means communicatively coupled to the touch screen and configured to variably associate at least one region of the touch screen with one of a plurality of control actions and further to indicate, using the touch screen, the control action with which the region is currently associated. In the appliance, the processing means is operable to associate a tactile activation of relatively shorter duration of a first region of the touch screen with a primary control action and to associate a tactile activation of relatively longer duration of the first region with an assistance mode. An assistance item extracted from a collection of assistance information is then shown on the touch screen, wherein the assistance item to be shown is extracted on the basis of the primary control action associated with the first region of the touch screen.

In a further embodiment, a touch screen in an appliance comprises a visual display surface and a touch-sensitive surface. The two surfaces are substantially parallel to one another and axially separated by some distance, such as at least 2 mm, such as at least 4 mm.
The separation may be necessary for reasons of heat insulation or for protecting the visual display surface against sharp objects. The two surfaces may be separated by a transparent plate, a laminated plate or by two parallel transparent plates with an intermediate air gap. By a parallax-related effect, a geometric deviation between regions of the visual and the touch-sensitive surfaces arises when the visual display is viewed from an oblique angle, as is often the case for domestic appliances, for which both floor-level mounting and elevated mounting are foreseen. This makes accurate interaction with the touch screen less intuitive. In view of this, the processing means is operable to apply a parallax compensation, by which locations of tactile activations of the touch screen are shifted by a distance in at least one screen direction. The distance and the screen direction are chosen so that the parallax error is partially or completely compensated.

Since the placement and mounting of the appliance have a considerable impact on the parallax error, the parallax compensation is preferably adapted specifically for a particular installed appliance. The parallax compensation may also be specifically adapted to a particular user, as body height, vision defects and various behavioural parameters may lead to significant differences between persons. The user currently interacting with the appliance may identify him- or herself through a login procedure, or may be biometrically identified en route.

In particular, the processing means may be adapted to compute a mean deviation (or median or suitable quantile), with respect to at least one spatial direction, between the locations of past tactile activations of regions of the touch screen and the respective centres of those regions. The processing means then uses said computed mean deviation, or a predefined percentage thereof, as the distance to be used in the parallax compensation. In a further development of this, the processing means is operable in a calibration mode, in which a user is requested to touch predetermined regions to allow computation of a mean deviation; a minimal sketch of such a compensation is given at the end of this description.

Further embodiments of the present invention will become apparent to a person skilled in the art after studying the description above. Even though the present description and drawings disclose embodiments and examples, the invention is not restricted to these specific examples. Numerous modifications and variations can be made without departing from the scope of the present invention, which is defined by the accompanying claims. Any reference signs appearing in the claims are not to be understood as limiting their scope.

The systems and methods disclosed hereinabove may be implemented as software, firmware, hardware or a combination thereof. In a hardware implementation, the division of tasks between functional units referred to in the above description does not necessarily correspond to the division into physical units; to the contrary, one physical component may have multiple functionalities, and one task may be carried out by several physical components in cooperation. Certain components or all components may be implemented as software executed by a digital signal processor or microprocessor, or be implemented as hardware or as an application-specific integrated circuit.
Such software may be distributed on computer-readable media, which may comprise computer storage media (or non-transitory media) and communication media (or transitory media). As is well known to a person skilled in the art, the term computer storage media includes both volatile and nonvolatile, removable and non-removable media implemented in any method or technology for storage of information such as computer-readable instructions, data structures, program modules or other data. Computer storage media includes, but is not limited to, RAM, ROM, EEPROM, flash memory or other memory technology, CD-ROM, digital versatile disks (DVD) or other optical disk storage, magnetic cassettes, magnetic tape, magnetic disk storage or other magnetic storage devices, or any other medium which can be used to store the desired information and which can be accessed by a computer. Further, it is well known to the skilled person that communication media typically embodies computer-readable instructions, data structures, program modules or other data in a modulated data signal such as a carrier wave or other transport mechanism and includes any information delivery media.
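Returning to the parallax compensation described earlier, the sketch below shows one way a processing means might derive a correction offset from calibration touches and apply it (or a predefined fraction of it) to subsequent touch coordinates. The averaging rule, the coordinate layout and all names are assumptions made for illustration; they are not taken from the patent.

```python
# Minimal sketch of parallax compensation: the mean deviation between past
# touch locations and the centres of the regions the user intended to hit is
# used as a fixed offset applied to subsequent touches. All names, the data
# layout and the averaging rule are illustrative assumptions.
from typing import List, Tuple

Point = Tuple[float, float]


def mean_deviation(touches: List[Point], region_centres: List[Point]) -> Point:
    """Average (dx, dy) between where the user touched and where the
    corresponding region centre lies, e.g. gathered in a calibration mode."""
    n = len(touches)
    dx = sum(t[0] - c[0] for t, c in zip(touches, region_centres)) / n
    dy = sum(t[1] - c[1] for t, c in zip(touches, region_centres)) / n
    return (dx, dy)


def compensate(raw_touch: Point, offset: Point, fraction: float = 1.0) -> Point:
    """Shift a raw touch location against the measured offset; 'fraction'
    corresponds to applying only a predefined percentage of the deviation."""
    return (raw_touch[0] - fraction * offset[0],
            raw_touch[1] - fraction * offset[1])


if __name__ == "__main__":
    # Calibration: the user aimed at these region centres but, viewing the
    # screen from an oblique angle, consistently touched a few units too low.
    centres = [(10.0, 40.0), (30.0, 40.0), (50.0, 40.0)]
    touches = [(10.2, 36.9), (30.1, 37.2), (49.8, 37.0)]
    offset = mean_deviation(touches, centres)
    print("offset:", offset)                       # roughly (0.0, -3.0)
    print("corrected:", compensate((25.0, 34.0), offset))
```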

Claims (15)

1. An appliance (100) comprising: a visual display (111, 211) for showing indications about conditions prevailing in the appliance; a first touch-sensitive input means (112, 114; 212); and a processing means (120), which is communicatively coupled to the visual display and the touch-sensitive input means and which is configured: to associate the touch-sensitive input means with a control action selectable from a plurality of control actions in respect of the appliance, and to indicate, using the visual display, the control action with which the touch-sensitive input means is currently associated, characterized in that the processing means is configured to store a collection of assistance information containing assistance items, wherein the collection of assistance information preferably comprises at least one in the group comprising: textual information, pictorial information, aural information, and in that the appliance further comprises a second touch-sensitive input means (113; 213) for entering an assistance mode, wherein the processing means is configured to extract one or more assistance items, to be shown on the visual display, from the collection of assistance information on the basis of indications present on the visual display at the moment of activation of the second touch-sensitive input means.
2. The appliance of claim 1, further comprising an external communication interface (140) communicatively coupled to the processing means and configured to receive external instructions relating to addition and/or removal of assistance items in the collection of assistance information.
3. The appliance of claim 2, wherein the processing means is operable in a maintenance mode, in which it implements instructions received at the external communication interface by amending the stored collection of assistance information, and in which it generates a new collection of assistance information on the basis of a static data record (510) and the received instructions.
4. The appliance of claim 2 or 3, wherein the processing means is configured to enter a maintenance mode, in which it implements instructions received at the external communication interface by amending the stored collection of assistance information, in response to a message received at the external communication interface.
5. The appliance of any of claims 2 to 4, wherein the collection of assistance information includes items generated by user interaction, such as at least one in the group comprising: a frequently asked question, a user-rated assistance item.
6. The appliance of any of claims 2 to 5, wherein the external communication interface (140) is further configured to transmit data relating to a failure condition prevailing in the appliance to a remote site.
7. The appliance of any of the preceding claims, wherein: the second touch-sensitive input means is a single button (113); the processing means is configured to maintain a data record (701) containing data relating to indications currently shown on the visual display; the processing means is configured to associate the data record with the one or more assistance items to be extracted from the collection of assistance information; and the processing means is configured to show, in the assistance mode, an initial assistance menu, from which the one or more assistance items extracted by the processing means are individually selectable by a user.
8. The appliance of claim 7, wherein the processing means is configured to show an initial assistance menu that further comprises at least one static item.
9. The appliance of claim 7 or 8, wherein the processing means is configured to maintain the data record unchanged in the assistance mode and to exit the assistance mode by showing indications corresponding to the content of the data record on the visual display.
10. The appliance of any of claims 7 to 9, wherein the processing means is adapted to show pages selected from a collection of pages, each carrying an identifier, wherein the data record contains an identifier of a page shown at the moment of activation of the second touch-sensitive input means.
11. The appliance of any of the preceding claims, wherein the processing means is operable to associate a tactile activation of relatively shorter duration of the first touch-sensitive input means with a primary control action and to associate a tactile activation of relatively longer duration of the first touch-sensitive input means with entry into the assistance mode, wherein the one or more assistance items to be shown are extracted on the basis of the primary control action currently associated with the first touch-sensitive input means.
12. The appliance of claim 11, wherein the processing means is configured to apply a guard mechanism to avoid inadvertent tactile activation of the first touch-sensitive input means, said guard mechanism including that: the processing means responds only to a release of an object from the first touch-sensitive input means as an activation; and the processing means responds to an application of an object to the first touch-sensitive input means by displaying a visual indication identifying the first touch-sensitive input means.
13. The appliance of claim 11 or 12, wherein: the control actions include operational commands, for controlling present and/or future conditions prevailing in the appliance other than in the processing means and visual display, and navigation commands; and the guard mechanism includes that the processing means responds to an invoked operational command by requesting a separate confirmatory activation.
14. The appliance of claim 13, wherein: the guard mechanism further includes that the processing means responds to an invoked operational command by further storing information relating to a screen image, from which the operational command is invoked, and that it responds to a failing confirmatory activation by returning to this screen image.
15. A computer-implemented method of updating a collection of assistance information stored in a processing means (120) in an appliance (100), which is equipped with an external communication interface (140) communicatively coupled to the processing means, the method being performed at a remote site (599) and comprising the steps of: preparing an executable update instruction relating to addition and/or deletion of assistance items in the collection of assistance information; transmitting the update instruction for execution at the appliance by means of a wired, wireless or portable data carrier; and receiving a confirmation of completion of the update from the appliance.
AU2011379028A 2011-10-13 2011-10-13 On screen help with contextual shortcut on an appliance Active AU2011379028B2 (en)

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/EP2011/067918 WO2013053395A1 (en) 2011-10-13 2011-10-13 On screen help with contextual shortcut on an appliance

Publications (2)

Publication Number Publication Date
AU2011379028A1 true AU2011379028A1 (en) 2014-04-10
AU2011379028B2 AU2011379028B2 (en) 2018-01-25

Family

ID=44789481

Family Applications (1)

Application Number Title Priority Date Filing Date
AU2011379028A Active AU2011379028B2 (en) 2011-10-13 2011-10-13 On screen help with contextual shortcut on an appliance

Country Status (5)

Country Link
US (1) US9898169B2 (en)
EP (1) EP2766808B1 (en)
CN (1) CN103890720B (en)
AU (1) AU2011379028B2 (en)
WO (1) WO2013053395A1 (en)

Families Citing this family (19)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP2702922A3 (en) * 2012-08-29 2014-10-15 Diehl AKO Stiftung & Co. KG Method for operating a domestic appliance and domestic appliance with ergonomic operating device
US9961721B2 (en) * 2013-01-17 2018-05-01 Bsh Home Appliances Corporation User interface for oven: info mode
US9554689B2 (en) * 2013-01-17 2017-01-31 Bsh Home Appliances Corporation User interface—demo mode
US20140201688A1 (en) * 2013-01-17 2014-07-17 Bsh Home Appliances Corporation User interface - gestural touch
CA2910884C (en) * 2014-10-30 2023-05-23 Braeburn Systems Llc Quick edit system for programming a thermostat
USD781891S1 (en) 2014-12-17 2017-03-21 Go Daddy Operating Company, LLC Display screen with graphical user interface
USD762699S1 (en) * 2014-12-17 2016-08-02 Go Daddy Operating Company, LLC Display screen with graphical user interface
USD770497S1 (en) 2014-12-17 2016-11-01 Go Daddy Operating Company, LLC Display screen with graphical user interface
USD763290S1 (en) 2014-12-17 2016-08-09 Go Daddy Operating Company, LLC Display screen with graphical user interface
CN105982508B (en) * 2015-02-05 2018-11-23 佛山市顺德区美的电热电器制造有限公司 Electric heating utensil and its display control method
US9646503B2 (en) 2015-02-11 2017-05-09 Honeywell International Inc. Cockpit display systems and methods for generating navigation displays including landing diversion symbology
EP3115697A1 (en) * 2015-07-10 2017-01-11 Electrolux Appliances Aktiebolag A control unit for a domestic appliance
US10628518B1 (en) * 2016-01-12 2020-04-21 Silenceux Francois Linking a video snippet to an individual instruction of a multi-step procedure
DE102016223476A1 (en) * 2016-11-25 2018-05-30 BSH Hausgeräte GmbH Haptic control for a household appliance
WO2018098475A2 (en) * 2016-11-28 2018-05-31 Cloudamize, Inc. System and method for automated aggregation of system information from disparate information sources
JP2018146840A (en) * 2017-03-07 2018-09-20 オンキヨー株式会社 Electronic device, method, program, and computer readable recording medium
US10503739B2 (en) * 2017-04-20 2019-12-10 Breville USA, Inc. Crowdsourcing responses in a query processing system
NL2022189B1 (en) 2018-12-12 2020-07-03 Fri Jado Bv rotisserie oven, method carried out by a control system of a rotisserie oven, and computer program
CN114173362B (en) * 2019-10-17 2023-08-22 Oppo广东移动通信有限公司 Method and apparatus for wireless communication

Family Cites Families (24)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5535323A (en) * 1992-06-29 1996-07-09 Digital Equipment Corporation Method of and system for displaying context sensitive and application independent help information
DE19754406A1 (en) 1997-12-09 1999-06-10 Bosch Gmbh Robert Radio receiver
US6104334A (en) * 1997-12-31 2000-08-15 Eremote, Inc. Portable internet-enabled controller and information browser for consumer devices
DE10100826B4 (en) * 2000-02-01 2005-11-10 Lg Electronics Inc. Internet refrigerator and operating procedures for this
JP2004505698A (en) * 2000-08-08 2004-02-26 エルジー エレクトロニクス インコーポレイティド Method and apparatus for displaying instructions for using a washing machine
US20040095370A1 (en) * 2001-08-02 2004-05-20 Maytag Corporation Installation instruction system for an appliance incorporating electronic interface screen
US6750433B2 (en) * 2001-11-29 2004-06-15 General Electric Company Oven display and user interface
US6687486B2 (en) * 2002-01-31 2004-02-03 General Instrument Corporation Method and apparatus to configure, provision and control a set-top terminal
KR100484820B1 (en) * 2002-10-10 2005-04-22 엘지전자 주식회사 Refrigerator system which is able to watch TV
KR100529878B1 (en) * 2002-11-08 2005-11-22 엘지전자 주식회사 Internet refrigerator with Web pad
DE10342321A1 (en) 2003-09-12 2005-04-07 BSH Bosch und Siemens Hausgeräte GmbH Control for a household appliance and display of information
US20070288331A1 (en) * 2006-06-08 2007-12-13 Whirlpool Corporation Product demonstration system and method
AU2006259965B2 (en) * 2005-06-23 2009-06-11 Lg Electronics Inc. Refrigerator
KR101202495B1 (en) * 2005-07-01 2012-11-16 엘지전자 주식회사 Method of controlling air conditioner
US8424321B2 (en) * 2005-09-16 2013-04-23 Lg Electronics Inc. Refrigerator having a plurality of display units
JP4588005B2 (en) * 2006-09-14 2010-11-24 シャープ株式会社 Communication terminal device, video display device, and control program
DE102006047813A1 (en) * 2006-10-06 2008-04-10 Lechmetall Landsberg Gmbh Edelstahlerzeugnisse Cooking appliance with automatic cooking program preselection and method for setting such a cooking appliance
JP2009044602A (en) * 2007-08-10 2009-02-26 Olympus Imaging Corp Imaging apparatus, imaging system and imaging method
WO2009149219A2 (en) * 2008-06-03 2009-12-10 Whirlpool Corporation Appliance development toolkit
KR101078464B1 (en) 2008-07-07 2011-10-31 엘지전자 주식회사 Display apparatus with local key and control method of the same
KR101517083B1 (en) * 2009-05-11 2015-05-15 엘지전자 주식회사 A Portable terminal controlling refrigerator and operation method for the same
KR101563487B1 (en) * 2009-05-11 2015-10-27 엘지전자 주식회사 Portable terminal controlling home appliance
US20110093158A1 (en) * 2009-10-21 2011-04-21 Ford Global Technologies, Llc Smart vehicle manuals and maintenance tracking system
WO2011105768A2 (en) 2010-02-23 2011-09-01 엘지전자 주식회사 Refrigerator including a terminal, and method for controlling same

Also Published As

Publication number Publication date
US20140317501A1 (en) 2014-10-23
EP2766808B1 (en) 2016-07-06
WO2013053395A1 (en) 2013-04-18
EP2766808A1 (en) 2014-08-20
CN103890720A (en) 2014-06-25
AU2011379028B2 (en) 2018-01-25
CN103890720B (en) 2017-07-11
US9898169B2 (en) 2018-02-20

Similar Documents

Publication Publication Date Title
AU2011379028B2 (en) On screen help with contextual shortcut on an appliance
JP6120954B2 (en) Screen reader with customizable web page output
Khan et al. Blind-friendly user interfaces – a pilot study on improving the accessibility of touchscreen interfaces
KR100829381B1 (en) Container for wireless internet ui
KR20140144104A (en) Electronic apparatus and Method for providing service thereof
US20160092152A1 (en) Extended screen experience
JP5986001B2 (en) Three-dimensional handler operation method and terminal device supporting the same
US20100269090A1 (en) Method of making it possible to simplify the programming of software
US9075500B2 (en) Method and system for presenting and navigating embedded user interface elements
Zhao et al. HUMAN-COMPUTER INTERACTION AND USER EXPERIENCE IN SMART HOME RESEARCH: A CRITICAL ANALYSIS.
US20140257790A1 (en) Information processing method and electronic device
US20150160792A1 (en) Dynamically-generated selectable option icons
CN102018571A (en) Medical instrument and application method thereof
JP2016522943A (en) User interface for controlling software applications
JP2023548807A (en) Information processing methods, devices and electronic equipment
US10067670B2 (en) Multi-switch option scanning
US11907518B2 (en) Cooking recipe display system, information terminal, cooking recipe display method, and program
Caporusso et al. Interface digital twins: rendering physical devices accessible to people who are blind
Prati et al. Design guidelines towards 4.0 HMIS: how to translate physical buttons in digital buttons
KR101875485B1 (en) Electronic apparatus and Method for providing service thereof
CN110362211A (en) Computer input, running gear and computer program product
EP4446871A1 (en) Generating software components
JP6109397B1 (en) Computer mounting method
KR101783279B1 (en) The System And The Method For Giving Imformation By Stages
KR101076235B1 (en) Apparatus and method for providing user interface using touch screen

Legal Events

Date Code Title Description
FGA Letters patent sealed or granted (standard patent)