WO2016009390A1 - Tactile interface for a touchscreen - Google Patents

Tactile interface for a touchscreen

Info

Publication number
WO2016009390A1
Authority
WO
WIPO (PCT)
Prior art keywords
software
touchscreen
user interface
tactile
cover
Application number
PCT/IB2015/055408
Other languages
French (fr)
Inventor
Sharon Greenberg
Original Assignee
Sharon Greenberg
Application filed by Sharon Greenberg
Publication of WO2016009390A1


Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03 Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/033 Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
    • G06F3/039 Accessories therefor, e.g. mouse pads
    • G06F3/0393 Accessories for touch pads or touch screens, e.g. mechanical guides added to touch screens for drawing straight lines, hard keys overlaying touch screens or touch pads

Definitions

  • This disclosure relates generally to touchscreen covers with buttons, and, in particular, to a method and apparatus for creating, defining and manufacturing touchscreen covers with tactile buttons.
  • A graphical user interface is used by a device's software to define how the end user can operate the software.
  • The end user needs to look at the screen and press the interface's buttons using a keyboard, mouse or touchscreen.
  • Various accessories for the touchscreen are used to define physical buttons over the touchscreen, wherein the physical button layout corresponds to the software interface.
  • US8564538 (Touch screen overlays and methods for manufacturing the same); US20050164148 (Tactile overlay for an imaging display);
  • US20100302168 (Overlay keyboard for touch screen devices); US20110260976 (Tactile overlay for virtual keyboard); US20030112225 (Electronic device having a movable keypad);
  • an apparatus comprising a cover for covering at least part of a touchscreen of a device; the cover comprising one or more tactile buttons; the buttons are arranged in a layout; at least one of the buttons represents a usage; and when situating the cover with the touchscreen, a software is at least partly usable without viewing the touchscreen, wherein the software comprises a user interface; elements of the user interface correspond to the buttons and the layout; and the corresponding elements are usable as represented by the buttons.
  • the cover can be made from various materials, in various shapes and sizes, to accommodate various touchscreens.
  • the cover is designed to cover at least part of a touchscreen.
  • the cover includes one or more buttons. At least some of the buttons have tactile features thus the buttons can be perceptible by touch.
  • buttons can include various features like sizes, shapes, textures, and the like. Various buttons can be used to represent various software functions and usages like Play, Stop, Up, Down, Select, and the like. The buttons are arranged in a layout. Various covers for various touchscreens can be designed with various layouts with various buttons to accommodate various needs and requirements.
  • An appropriate software includes designing at least part of the elements (like buttons, slides, keyboard) of at least part of at least one of the software user interfaces (like the TUI (Touch User Interface) or the GUI (Graphic User Interface)) according to the buttons' associated usage and the button layout. Designing a software user interface according to a selected cover's button layout enables using the cover's buttons for using the software's interface when the cover is situated with the touchscreen.
  • buttons can be associated and can be used to use features like various software buttons, slides, and the like.
  • the software can support various covers using multiple interfaces, for example without the cover, with the cover, with a cover using a different layout, and the like.
  • the elements of the interfaces and the design of the interfaces can include updating (manually or automatically) the software settings (like usage, functionality, size, interface layout, and the like) according to the cover information.
  • the software architecture (the structures of the software system) can also be designed to be at least partly compliant with the cover's buttons and layout, thus the cover buttons can be used for controlling and using the software activities like navigating the software screens and menus, managing content, switching activity, and the like.
  • the cover can be supported by many types of software applications like music players, RSS reader, phone dialer, calendar, contacts, and the like.
  • the cover can be supported with core software like launchers and operating system, thus the cover can also be used to support functions like activating various software applications, switching between software applications, managing the software applications, and the like.
  • Designing the software user interface according to the cover's button layout enables controlling the software without looking at the screen.
  • This method enables a visually impaired end user to use the software without needing to look at the screen and without having to learn each software's interface.
  • A method to substantially enable non-visual use of an appropriate software on a device comprising a touchscreen, the method comprising: situating a cover of an apparatus with the touchscreen; and using the cover's tactile buttons to use the software; wherein at least part of the elements of at least one of the user interfaces corresponds to at least part of the buttons, the buttons' associated usage and at least part of the button layout, so the corresponding part of the interfaces is usable with the cover.
  • Another method comprises providing one or more layouts of tactile buttons of a cover, wherein the cover can cover at least part of the touchscreen; designing at least part of the elements of at least one interface of the software to correspond to at least part of the layout; and situating the cover with the touchscreen, wherein the tactile buttons of the cover are used to use the elements of the software.
  • the disclosed apparatus allows visually impaired end users to use a software running on a device with a touchscreen. Designing the user interface according to a fixed layout, wherein the layout comprises tactile buttons and the buttons have associated usage, substantially reduces the learning curve of how to use the software. Using feedback (like sound and voice announcements) can inform the end user about his current screen, menu and options without the need of learning how to proceed, other than pressing the button with the associated usage.
  • a system comprising: a user interface apparatus with tactile features, configured to be used with at least part of a device's touchscreen; said user interface apparatus defining a user interface layout over said touchscreen; an interface module configured to support usage of said user interface layout; and at least one translation module configured to translate said usage of said user interface layout to a usage of a software.
  • the user interface apparatus may comprise a cover configured to cover at least part of said device's touchscreen.
  • the cover may comprise one or more tactile buttons arranged in a layout.
  • the tactile buttons may be selected from the group consisting of: cutouts, extruded areas, ridges, molded key features, indentations, protrusions, depression and any combination thereof.
  • the cover may further comprise additional tactile features for easier tactile recognition.
  • the cover may further comprise signage for easier visual recognition.
  • the cover may comprise at least one viewer for viewing and/or touching the touchscreen.
  • the user interface apparatus may be at least partly a part of said touchscreen and/or said device.
  • the system may further comprise a connector for connecting said user interface apparatus with said device.
  • the user interface apparatus may further comprise an identifier for identifying said apparatus to said interface module.
  • the interface module may comprise a local application running on said device.
  • the interface module may be configured to receive said software usage from said translation module and communicate it to said software for execution.
  • the translation module may be configured to communicate said software usage to said software for execution.
  • the system may be configured to derive at least part of said translation from an identifier, said identifier comprising information regarding said user interface apparatus.
  • the system may be configured to derive said translation from a configuration definition related to said software.
  • the software may be selected from the group consisting of local or remote program, application, web application, simulated application, browser, website, server, API, and any combination thereof.
  • the usage of the user interface apparatus may comprise one of: navigation, selection, operation, and input.
  • the selection may comprise one of: application selection, operating system service selection and item selection within an application or an operating system service.
  • the system may further comprise a feedback module comprising one of sound, speech, vibration, lights, and any combination thereof.
  • the translation module may further comprise a software development kit configured to enable access of software pre-designed to interact with said system.
  • the translation module may further comprise an application programming interface configured to enable access of software pre-designed to interact with said system.
  • a user interface apparatus with tactile features configured to be used with at least part of a device's touchscreen, comprising: one or more tactile buttons arranged in a layout; and an identifier for identifying said apparatus to said device.
  • the tactile buttons may be selected from the group consisting of: cutouts, extruded areas, ridges, molded key features, indentations, protrusions, depression and any combination thereof.
  • the apparatus may further comprise additional tactile features for easier tactile recognition.
  • the apparatus may further comprise signage for easier visual recognition.
  • the apparatus may further comprise at least one viewer for viewing and/or touching the touchscreen.
  • the apparatus may be at least partly a part of said touchscreen and/or said device.
  • the apparatus may further comprise a connector for connecting said apparatus with said device.
  • a method of enabling non-visual use of a target software via a first device said first device comprising a touchscreen and said target software running on said first device or on a second device communicating with said first device, the method comprising: defining a user interface layout over said touchscreen by situating a cover relative to said touchscreen, said cover comprising tactile buttons arranged in a layout; and using a software to translate usage of said user interface layout to said target software usage.
  • the architecture of said software may be at least partly compliant with said layout.
  • the settings of said software may be determined according to information provided by an identifier of said cover.
  • situating may comprise coupling said cover with said touchscreen and/or said first device using a coupling mechanism.
  • a method of enabling non-visual use of a software via a device comprising a touchscreen, comprising: providing one or more layouts of tactile buttons of a cover covering at least part of the touchscreen; designing at least part of the elements of at least one interface of the software to correspond to at least part of the layout; situating the cover relative to the touchscreen; and using the tactile buttons of the cover to use said corresponding elements of the software.
  • the architecture of said software is at least partly compliant with said layout.
  • the software may be selected from the group consisting of: program, application, website, server, webapp, API, and any combination thereof.
  • a tactile keyboard comprising a slide button configured to select symbols comprising letters, the symbols spread along said slide button in a predefined order.
  • the keyboard may be connected with a device's touchscreen, said device running an interface application configured to interpret usage of said keyboard and provide feedback.
  • the feedback may depend on the speed of sliding across the slide button.
  • the feedback may comprise vocalizing the name of a letter during slow sliding.
  • the feedback may comprise generating a tone relative to said speed when the speed is high.
  • the properties of said sliding may change the usage interpretation.
  • Changing the usage interpretation may comprise changing the zoom of said symbols spread along said slide button.
  • the tactile keyboard may further comprise tactile means for indicating change of letter types.
  • Fig. 1 depicts two modes of usage of the user interface apparatus according to the present invention
  • Figs. 2A through 2C show three exemplary embodiments of the user interface apparatus according to the present invention
  • Fig. 3 is a block diagram showing an embodiment of the system according to the present invention
  • Fig. 4 is a flowchart showing an exemplary scenario implemented by the interface application
  • Fig. 5 is a chart of an exemplary software translation and usage
  • Fig. 6 shows a schematic representation of a translated application API of an exemplary dialer application
  • Figs. 7A and 7B show two respective embodiments of the basic structure of the system according to the present invention.
  • the present invention seeks to overcome the requirement of looking at the device's touchscreen for using and operating a software.
  • the present invention uses the insight that:
  • Any function that can be performed by a touchscreen can also be performed by a keyboard.
  • Any software is actually a decision tree which may be traversed using the keys of a keyboard to fully activate any function.
  • a user interface apparatus for attaching to a touchscreen display of a device.
  • the user interface apparatus may comprise a cover for covering at least part of a touchscreen of a device, the cover comprising one or more tactile buttons arranged in a predetermined layout.
  • buttons can include various features like sizes, shapes, textures, and the like.
  • Various buttons can be used to represent various software functions and usages like Play, Stop, Up, Down, Select, and the like.
  • the buttons are arranged in a layout.
  • Various covers for various touchscreens can be designed with various layouts with various buttons to accommodate various needs and requirements.
  • An appropriate software includes designing at least part of the elements (like buttons, slides, keyboard) of at least part of at least one of the software user interface (like the TUI (Touch User Interface) or the GUI (Graphic User Interface)) according to the buttons associated usage and the buttons layout.
  • Designing a software user interface according to a selected cover's buttons layout enables using the cover's button for using the software's interface when the cover is connected with the touchscreen.
  • the buttons can be associated and can be used to use features like various software buttons, slides, and the like.
  • the software can support various covers using multiple interfaces; for example without the cover, with the cover, with a cover using a different layout, and the like.
  • the elements of the interfaces and the design of the interfaces can include updating (manually or automatically) the software settings (like usage, functionality, size, interface layout, and the like) according to the cover information.
  • the software architecture (the structures of the software system) can also be designed to be at least partly compliant with the cover's button and layout, thus the cover buttons can be used for controlling and using the software activities like navigating the software screens and menus, managing content, switching activity, and the like.
  • the cover can be supported by many types of software applications like music players, RSS reader, phone dialer, calendar, contacts, and the like.
  • the cover can be supported with core software like launchers and operating system, thus the cover can also be used to support functions like activating various software applications, switching between software applications, managing the software applications, and the like.
  • In a preferred embodiment of the cover, at least some of the buttons are defined by cutouts from the cover, for cheaper manufacturing.
  • Other embodiments may include extruded areas, ridges, molded key features, indentations, protrusions, depressions, or the like, arranged to provide users with a tactile feel (e.g., for motion and/or location).
  • the apparatus can include a connector for connecting (wired/wireless) with the device.
  • a preferred connector has wireless connection capabilities (like Bluetooth, NFC, Wi-Fi, and the like) for wirelessly connecting with a device which supports such wireless connection.
  • An optional embodiment of the apparatus comprises an identifier for identifying apparatus properties and/or apparatus associated information and/or additional information.
  • Identifying the apparatus properties makes it easier to support various covers (like cover layouts, styles, sizes, button-associated usage, and the like), thus the software interface and/or interface elements can be optimized for the cover in use.
  • Another optional feature of the identifier is the apparatus associated information. For example, associating a user ID with a specific apparatus enables shortening at least part of the end user identification process.
  • Yet another optional feature of the identifier is including or providing some additional information, like the apparatus position relative to the touchscreen, for dynamic and automatic software interface adjustments.
  • the apparatus can be combined with the device or with the device's touchscreen in various ways, including using an adhesion mechanism to couple the cover to the touchscreen; using a mount mechanism for mounting the cover over the touchscreen and/or the device; using a case for casing the device wherein the cover is part of the case (like a wallet case), and the like.
  • An optional embodiment of the apparatus comprises additional tactile characteristics for easier tactile recognition of each button without looking.
  • Braille symbols can be included with at least some of the buttons for easier recognition of the buttons and/or the button associated usage, by the visually impaired.
  • the apparatus may include signage indicating the meaning of the buttons for users with substantially proper vision.
  • the apparatus can include a viewer for viewing and/or touching the touchscreen.
  • the apparatus can also be combined and/or be a part of the touchscreen and/or device and/or any other part.
  • the buttons' associated functions can be updated by the software to support additional features, like associating the buttons used for traversing the application decision tree with controlling a music player, or associating the keyboard layout with sliding the values in a slide-bar, and the like.
  • the user interface apparatus according to the present invention may be used in various modes, including as a buffer or translator to an application, as a remote control to an application, and the like. Two of the modes of usage, are depicted in Fig. 1.
  • User interface apparatus 100 may be used in a first mode to provide a touchscreen user interface navigation tool to users who are unable, selectively or un-selectively, permanently or temporarily, to see or to look at the touchscreen. This category may include the visually impaired, the blind, drivers while driving, and more.
  • the device functions made available to those users by the user interface apparatus 100 may include at least some of the usual functionality of the device, such as, for example, browser 110, phone 120 and apps 130.
  • In a second usage mode, the user interface apparatus 100 may be used as a remote control device 140, to help operate at least one application running on another device, wherein the other device has or doesn't have a screen or a touchscreen.
  • a navigation application running on a vehicle's multimedia system, where the use of the user interface apparatus 100 enables control of the application and/or the system's operating system without looking at the screen of the other device or performing touch operation thereon.
  • user interface apparatus 100 comprises a cover 101A/101B/101C with sets of buttons.
  • the buttons are arranged in a layout.
  • the buttons represent various usages of the underlying user interface.
  • optional button set 102 can be used to browse between the software applications by using the "Left" and "Right" buttons and activate the selected software application using the "Choose" button (as respectively signed on surface 101A).
  • Optional button set 103 can be used to browse inside the selected software application menus using the "Prev" (Previous) and "Next" buttons (as respectively signed on surface 101A), browse between the software application options using the "Up" and "Down" buttons (as respectively signed on surface 101A) and activate the selected option using the "Select" button (as respectively signed on surface 101A).
  • Optional button set 104 can be used to activate popular functions like "Help" and "Search" (as respectively marked on surface 101A).
  • Optional button set 105 can also be used to answer popular questions with answers like "Yes" and "No" (as respectively marked on surface 101A).
  • Optional button set 106 can be used as a keyboard layout (as marked with the a-z letters, 0-9 numbers, and various symbols on cover 101A).
  • Optional button set 107 can be used as arrows and other required keys, like "Space bar" and "Enter" (as respectively marked with the arrow symbols, "Space", and "Enter" on surface 101A).
  • cover 101A can contain additional tactile characteristics such as, for example, 108 for easier tactile recognition of the button by the visually impaired, and signage 109 for easier visual recognition of the button by users with substantially proper vision.
  • buttons 1-3 can similarly be used to browse between the software applications by using the "Prev" and "Next" buttons and activate the selected software application using the "OK" button. The same buttons may also be used to navigate inside the selected software application menus. Buttons 4-7 may be used to perform predefined functions such as, for example, "Back" (#4), "Apps" (#5) - to return to the main application selection screen, "Messages" (#6) - to view the latest operating system messages, and "Switches" (#7) - to view and control the system settings like date, time, wifi, 3g and the like.
  • Button #8 may function as a continuous keyboard, as will be explained in detail below, accompanied by buttons for language selection (#9), "Space" (#10), "Enter" (#11), and "Backspace" (#12).
  • Buttons 13-15 are functional buttons to invoke, for example, "voice to text" functionality (#13), "Settings" (#14) selection and "Search" (#15) functionality.
  • the keyboard (#8) and its accompanying buttons (#9-12) define an optional layout of a keyboard.
  • sliding across the slide button (#8) can be used to locate a single letter, symbol, action, and the like (hereinafter LETTER).
  • the LETTERS are arranged in a specific order, for example a-z will be spread where "a" is relatively located on one side of button #8 and "z" is relatively located on the other side of button #8.
  • the LETTERS are relatively spread along button #8 for easier orientation.
  • the speed of sliding across button #8 affects the type of feedback used for indicating the current optional selection.
  • button #9 can be used to cycle through various types of the LETTERS at button #8, for example: a-z, A-Z, 0-9, !-), and the like.
  • button #9 can include additional usages, like quick selection, i.e.
  • Button #10 can be used as a spacebar, i.e. to enter spaces between words. In a preferred embodiment, button #10 can include additional usages, like traversing the LETTERS, i.e. by sliding left and right along button #10 the insertion location of the next LETTER can be updated. Button #11 can be used as Enter, i.e. to create and go to a new line in the text. In a preferred embodiment, button #11 can include additional usages, like reading the current text, i.e. by double pressing button #11 the entire text box is read out by the system.
  • Button #12 can be used as a backspace, i.e. to delete the last inserted LETTER and go back.
  • button #12 can include additional usages, like discarding the whole line of LETTERS, i.e. by double pressing button #12 the last line of LETTERS is deleted and removed.
  • Various optional embodiments of the keyboard are demonstrated in Figs. 2A, 2B and 2C.
  • Other embodiments of the user interface apparatus 100 may include covers with various layouts including various buttons with various associated usage.
  • buttons can be made with any appropriate method, material, size and properties, like a cutout from cover 101, a structure, a structure with a nib, or any other combination for creating a tactile button.
  • User interface apparatus 100 can cover part or all of the device's touchscreen.
  • User interface apparatus 100 can be made with any appropriate features, thickness, size, shape, properties and materials such as plastic, rubber, silicon, leather, or any other suitable material or combination of materials.
  • user interface apparatus 100 includes a mechanism (not shown) for locating and/or fitting and/or situating and/or mounting and/or adhering and/or combining user interface apparatus 100 with the touchscreen (not shown) of a device, and/or with the device, and/or any required configuration.
  • User interface apparatus 100 can also be removable, fixed or be a part of the device and/or device touchscreen and/or any required configuration.
  • cover 101 of user interface apparatus 100 can include none, some, or all of the proposed buttons (see Figs. 2A and 2B) in any layout arrangement with any usage representation.
  • other embodiments can include none, or any additional buttons with any required usage, like various symbols, combinations, meanings, functions, usages, and the like.
  • Fig. 2C provides a view of another embodiment of a user interface apparatus 100 attached to a device.
  • Cover 101C is attached over a touchscreen of a smartphone 202.
  • Combining cover 101C with smartphone 202 creates an arrangement 200 that can help any type of user that can't or won't look at the screen for any reason, and especially those with permanent or temporary visual impairment, to navigate the user interface, with the help of a designated software service, as will be described in detail below.
  • Cover 101C may comprise some of the buttons and functionality of cover 101A.
  • button sets 203, 204, 205, and 206 have similar functionality to button sets 102, 103, 106, and 107, respectively.
  • the end user can use smartphone 202 by holding or placing the smartphone 202 and feeling cover 101C's buttons.
  • the system 300 comprises a system server 310, comprising at least one of the following - a website translations repository 320 and an application translations repository 330.
  • Website translations repository 320 stores at least one translation of a website, the translation comprising a translated architecture of the website, namely, the decision tree used for navigating through the website.
  • application translations repository 330 stores at least one translation of an application, the translation comprising a translated architecture of the application, namely, the decision tree used for navigating through the application.
  • the website translations stored on the system server 310 may be acquired by analyzing the website HTML (online or offline) or by using a service such as Kimono, provided by Kimono-Labs (www.kimonlabs.com) that turns websites into structured APIs 380 which communicate with the various websites 395.
  • the system 300 additionally comprises an interface module 350 running on user devices such as smartphone, tablet PC or any other touchscreen equipped device.
  • Interface module 350 is uniquely configured to serve a specific user interface device configuration, selected for example from the configuration shown in Figs. 2A through 2C. Namely, each selection of a button on the user interface device (cover) is uniquely interpreted by the interface module 350 as a specific command (e.g. navigation, specific function activation, keyboard etc. as described above).
  • User device 340 also comprises various applications 360 and a browser 370.
  • the system can be implemented and used using various technologies, including: running locally as an application, remotely as a server (e.g. SaaS - Software As A Service), as a website, as a webapp (e.g. web.whatsapp.com), as an API (e.g. as part of the operating system), and the like.
  • any combination of the technologies can be used, for example - the interface module can run locally (at the device) while the translation module is located on a remote server.
  • Another example - users without the installed interface module can use the system as a webapp or website, through the device's web browser.
  • the system can access and use at least one of the following software (hereinafter APP): applications, websites, webapps, servers, APIs, and the like.
  • the system can use the operating system using the API of the operating system, and browse amazon using Amazon API (https://developer.amazon.com/public/apis).
  • Web Browser Automation e.g. Selenium
  • the APPs can be stored and/or executed locally (i.e. on the device), remotely (i.e. on a server) and/or virtually (e.g. running locally or remotely using Selenium).
  • the system software translates the APP "on the fly" and/or uses a premade translation of the APP (e.g. an API) (hereinafter TRANSLATION).
  • the TRANSLATION can be done using the APP API (premade), analyzing the APP code (premade and/or "on the fly"), running a simulated web-browser, and the like.
  • the TRANSLATION is used to translate the end-user usage at the interface module of the system, to a usage at the APP. For example - see Fig. 5 and Fig. 6.
  • the TRANSLATION can be stored locally and/or remotely (on a server/s).
  • Figs 7A and 7B show two respective embodiments of the basic structure of the system.
  • the system comprises a tactile user interface apparatus 700 configured to be used with at least part of a device's touchscreen 710 and to define a user interface layout over device's touchscreen 710; an interface module 720 configured to support the usage of the defined user interface layout; and at least one translation module 730 for translating the usage of the defined user interface layout to a usage of a software 740.
  • When situating or connecting the tactile user interface apparatus 700 with a device's touchscreen 710, the tactile interface apparatus 700 defines a set of buttons in a layout at the device's touchscreen 710. The tactile buttons are used to enable usage of software 740 via the system without looking at the device's touchscreen 710.
  • Interface module 720 supports the defined user interface layout by translating the usage performed on the device's touchscreen 710 to a defined usage at software 740 using translation module 730.
  • the translation is kept at translation module 730 and interface module 720 can read, write, modify or save the translation.
  • the software 740 can update the device's touchscreen 710 or send feedback via interface module 720.
  • translation module 730 receives the usage performed at the defined user interface layout on the device's touchscreen 710 from interface module 720, translates the usage and performs a defined usage at software 740.
  • Other embodiments may include additional modules to support additional features like a feedback module to support announcements to the user using sound, speech, vibration, lights, and the like; communication module to support communication with additional various devices like external screen, speakers, headphones, operation systems and the like; software communicator to support various modes of communicating with various software; API module to support accessing and using the system using an API; SDK module to support development for the system using a SDK; and the like.
  • the translation module 730 can additionally be used as an API or a SDK to provide access to software designed for usage with the system.
  • the system software can run its own "apps" / "phone" / "browser" and/or use the pre-existing, naturally or modified (or any combination thereof), compatible "apps" / "phone" / "browser".
  • For example, the software may be able to use the Android native phone app (via API); if the native phone app itself doesn't support an API, a proprietary phone app may need to be developed which uses the phone OS (Operating System) resources API to fit the layout requirements.
  • Another example: for sites with an API (like amazon.com) the system uses the API to simulate surfing / browsing.
  • the system may have to run a simulated web-browser at its servers, which will simulate the browsing / surfing while analyzing the generated web-page to translate the generated web-page to the user interface apparatus layout requirements.
  • Fig. 4 is a flowchart showing an exemplary scenario implemented by the interface module 350 running on the user device; a schematic, illustrative sketch of this flow is given at the end of this list.
  • the local app, namely the interface module 350 executable on the user device, is launched.
  • the launching may occur regularly, as an app, website or the like, or may be triggered, for example, by attaching the user interface apparatus 100 of the present invention to the user's smartphone, using NFC technology for automatic identification of the apparatus, or by informing the OS application launcher of the apparatus being attached.
  • interface module 350 requests and receives from the OS the ID of the currently attached user interface apparatus 100, i.e. the location of the cover over the touchscreen in case the cover doesn't cover the whole touchscreen, and the names, functions, and locations of the cover's buttons, for example, to resolve the differences between Figs. 2A, 2B, and 2C, or any other layout of buttons with various locations, properties and function representation.
  • Another example is to differentiate between a full-function / "regular" cover layout, a task-specific cover (e.g. for operating a limited set of specific apps or specific needs), a configurable cover layout (e.g. a development cover in which the button functions are programmable), and other forms and combinations, one of which will be explained in detail below.
  • the OS can inform the interface module 350 about the status of the OS and current apps, for example, if the cover is placed while the end user is using a Smart TV app, interface module 350 will initialize and execute with the Smart TV function.
  • a default execution start may be defined, such as a home page automatically generated or defined by the user.
  • interface module 350 may "learn" over time the user's preferred usage mode and thus enhance the user experience by getting him faster to his destination.
  • interface module 350 checks whether the current screen ID is the ID of an OS navigation screen (i.e. a launcher - a screen for selecting applications to launch or other OS services). If not, namely the current screen is that of a browser or an application - interface module 350 sends the current screen ID to the system server 310 (step 440).
  • the system server identifies the appropriate link in its databases, finds or creates the translation
  • the server can be remote and/or local.
  • step 460 interface module 350 selects a current item from the translated page and uses text-to-speech capabilities or previous knowledge of the page's layout to indicate the current position (e.g. "clock” or "select contact” or "John Doe") to the user. If in step 430 the screen is identified as a navigation screen, the program goes directly to step 460.
  • interface module 350 receives from the user interface apparatus 100 coordinates of user selection and checks in step 470 whether the user has entered a navigation command (e.g. commands #2 (next) or #3 (prev) in the exemplary apparatus of Fig. 2B). If the user has entered a navigation command, interface module 350 uses the page translation to find the required item (step 475) and loops back to step 460 to inform the user of the selected item. Otherwise, if the user selection is not of a navigation command, interface module 350 checks in step 480 whether the selection requires the display of a new page/screen (e.g. "contacts" screen or parking app zone selection screen or functional screen such as "Definitions") and returns to step 440 if it does. Otherwise, if the user has selected the keyboard, in step 485 interface module 350 aggregates the selected keyboard symbols until "enter" is selected and uses the appropriate API to communicate the alphanumeric input to the currently running app or service (step 490).
  • a navigation command e.g. commands #2 (next) or #3 (prev
  • a chart 500 of an exemplary software (app, website, and the like) translation and usage is shown. This chart is used to illustrate software support for the apparatus from the launcher level down to the single action. Many other formations and support levels can be implemented.
  • various covers with various layouts with various buttons with various usages can be used for various software at various levels running on various devices with various touchscreens.
  • the launcher layer 501 of the software running on the device supports selecting which software 502 to execute. The selection can be done by pressing the user interface element (hereinafter UIE) associated with switching software. The pressing can be done directly on the touchscreen or via the associated cover button.
  • UIE user interface element
  • cover 101A's optional button set 102 represents switching the software; thus, when attaching cover 101A to a touchscreen of the device, and the software of the device supports the layout of the cover ("naturally" or via "translation" as has been explained above), a visually impaired end user can switch and select between software applications by pressing the UIE associated with button set 102.
  • the software can use various indications like sound, speech (using recorded audio or TTS (Text To Speech) engine), vibration or any other form of feedback to indicate the current status, selection and any other information relevant to the user.
  • Any type of software, locally or remotely, can be supported, like dialer, calendar, music player, RSS reader, games, and the like.
  • buttons 508 can be used to answer the "yes/no" of field 506, or by selecting the answer using optional button set 103.
  • optional button set 106 can be used as a keyboard for inserting and editing the content in field 507.
  • software 1 , software 2, and software 3 of software 502 can represent, respectively, a dialer application, a music application, and a calendar application.
  • By pressing the tactile button "5-Apps" and the buttons "Prev", "Next", and "OK" at cover 101B, the user can switch between the proposed software applications.
  • the user can choose between screen 1 , screen 2, screen 3, and screen 4 of software 2 that can represent, respectively, artists screen, albums screen, podcast screen, and streaming screen.
  • the software supports the use of the user interface apparatus 100 from the launcher level and down to a single usage like pressing a button.
  • Other kinds of support are also contemplated, like support only an element of the user interface, a few elements, a menu, a screen, application, operating system, and any other type of user interface at any level.
  • Fig. 6 shows a schematic representation of a translated application API of an exemplary dialer application, showing the application's decision tree and one possible path traversing the decision tree.
  • In the embodiments described above, the user interface device was shown in its usage as a navigation tool, simulating a touchscreen user interface by navigating a decision tree.
  • the user interface device of the present invention may, however, also operate in other modes, for example as follows.
  • An application may be programmed with a specific mode of operation designed to be carried out using a predefined layout of the user interface device.
  • a user interface device layout is dictated by the application and each button has a predefined usage.
  • An application may be programmed with the ability to broadcast its user interface requirements to a local or remote user interface device configuration software, whereby the user interface device buttons may be configured accordingly and the configuration communicated to the application.
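Purely by way of an illustrative sketch, and not as the implementation of the present invention, the following Python fragment outlines the Fig. 4 scenario referenced above: the interface module identifies the attached cover, obtains a translation of the current screen, announces the current item, and handles navigation, selection and keyboard input. The class names, the stub device/server hooks, the page contents and the "type:" convention are all assumptions made only for this example; the step numbers in the comments refer to Fig. 4.

from dataclasses import dataclass
from typing import List

@dataclass
class Page:
    screen_id: str
    items: List[str]              # translated, ordered items of the current screen
    position: int = 0

    def current(self) -> str:
        return self.items[self.position]

class _StubDevice:
    """Stand-in for the OS hooks the interface module would rely on."""
    def get_cover_id(self): return "cover-101B"
    def current_screen(self): return "Launcher"
    def send_text(self, screen, text): print(f"text sent to {screen}: {text}")

class _StubServer:
    """Stand-in for system server 310 holding premade page translations."""
    PAGES = {"Launcher": ["Dialer", "Music"], "Dialer": ["Contacts", "Keypad"]}
    def translate(self, screen_id: str) -> Page:
        return Page(screen_id, self.PAGES.get(screen_id, ["(empty)"]))

def run_interface_module(device, server, speak, presses):
    """Process a stream of tactile-button presses coming from the cover."""
    speak(f"cover {device.get_cover_id()} attached")      # step 420: identify the cover
    page = server.translate(device.current_screen())      # steps 430-450: fetch translation
    speak(page.current())                                  # step 460: announce current item
    for press in presses:
        if press in ("next", "prev"):                      # step 470: navigation command
            step = 1 if press == "next" else -1
            page.position = (page.position + step) % len(page.items)   # step 475
            speak(page.current())                          # back to step 460
        elif press == "select":                            # step 480: a new page/screen is needed
            page = server.translate(page.current())
            speak(page.current())
        elif press.startswith("type:"):                    # steps 485-490: keyboard input
            device.send_text(page.screen_id, press[5:])
    return page

run_interface_module(_StubDevice(), _StubServer(), print,
                     ["next", "prev", "select", "type:123"])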

Abstract

A system comprising: a user interface apparatus with tactile features, configured to be used with at least part of a device's touchscreen; the user interface apparatus defining a user interface layout over the touchscreen; an interface module configured to support usage of the user interface layout; and at least one translation module configured to translate the usage of said user interface layout to a usage of a software.

Description

TACTILE INTERFACE FOR A TOUCHSCREEN
FIELD OF THE INVENTION
This disclosure relates generally to touchscreen covers with buttons, and, in particular, to a method and apparatus for creating, defining and manufacturing touchscreen covers with tactile buttons.
CROSS-REFERENCE TO RELATED PATENT APPLICATIONS
This patent application claims priority from and is related to U.S. Provisional Patent Application Serial Number 62/025,019, filed 07/16/2014, which U.S. Provisional Patent Application is incorporated by reference in its entirety herein.
BACKGROUND OF THE INVENTION
A graphical user interface is used by a device's software to define how the end user can operate the software. The end user needs to look at the screen and press the interface's buttons using a keyboard, mouse or touchscreen. Various accessories for the touchscreen are used to define physical buttons over the touchscreen, wherein the physical button layout corresponds to the software interface.
Some of the accessories are described, for example, in: USD704691 (Case for electronic device);
US20070013662 (Multi-configurable tactile touch-screen keyboard and associated methods);
US8564538 (Touch screen overlays and methods for manufacturing the same); US20050164148 (Tactile overlay for an imaging display);
US20100302168 (Overlay keyboard for touch screen devices); US20110260976 (Tactile overlay for virtual keyboard); US20030112225 (Electronic device having a movable keypad);
US20120328349 (Keyboard overlay for optimal touch typing on a proximity- based touch screen);
US5412189 (Touch screen apparatus with tactile information); US8704790 (User interface system).
SUMMARY OF THE INVENTION
The present invention seeks to overcome the requirement of looking at the device's touchscreen for using and operating a software. In accordance with aspects of the various described embodiments, an apparatus for a
touchscreen display of a device is provided. For example, in one application an apparatus is provided comprising a cover for covering at least part of a touchscreen of a device; the cover comprising one or more tactile buttons; the buttons are arranged in a layout; at least one of the buttons represents a usage; and when situating the cover with the touchscreen, a software is at least partly usable without viewing the touchscreen, wherein the software comprises a user interface; elements of the user interface correspond to the buttons and the layout; and the corresponding elements are usable as represented by the buttons. The cover can be made from various materials, in various shapes and sizes, to accommodate various touchscreens. The cover is designed to cover at least part of a touchscreen. The cover includes one or more buttons. At least some of the buttons have tactile features, thus the buttons can be perceived by touch. Various buttons can include various features like sizes, shapes, textures, and the like. Various buttons can be used to represent various software functions and usages like Play, Stop, Up, Down, Select, and the like. The buttons are arranged in a layout. Various covers for various touchscreens can be designed with various layouts with various buttons to accommodate various needs and requirements. An appropriate software includes designing at least part of the elements (like buttons, slides, keyboard) of at least part of at least one of the software user interfaces (like the TUI (Touch User Interface) or the GUI (Graphic User Interface)) according to the buttons' associated usage and the button layout. Designing a software user interface according to a selected cover's button layout enables using the cover's buttons for using the software's interface when the cover is situated with the touchscreen. The buttons can be associated with and can be used to use features like various software buttons, slides, and the like. The software can support various covers using multiple interfaces, for example without the cover, with the cover, with a cover using a different layout, and the like. Furthermore, the elements of the interfaces and the design of the interfaces can include updating (manually or automatically) the software settings (like usage, functionality, size, interface layout, and the like) according to the cover information. The software architecture (the structures of the software system) can also be designed to be at least partly compliant with the cover's buttons and layout, thus the cover buttons can be used for controlling and using software activities like navigating the software screens and menus, managing content, switching activity, and the like. The cover can be supported by many types of software applications like music players, RSS readers, phone dialers, calendars, contacts, and the like. Furthermore, the cover can be supported by core software like launchers and the operating system, thus the cover can also be used to support functions like activating various software applications, switching between software applications, managing the software applications, and the like.
When situating the cover with the touchscreen of a device, software that support the cover's button layout can be used without looking at the screen. This enables visually impaired end users to use the software just by feeling the tactile buttons.
Designing the software user interface according to the cover's button layout enables controlling the software without looking at the screen. This method enables a visually impaired end user to use the software without needing to look at the screen and without having to learn each software's interface. A method is provided to substantially enable non-visual use of an appropriate software on a device comprising a touchscreen, the method comprising: situating a cover of an apparatus with the touchscreen; and using the cover's tactile buttons to use the software; wherein at least part of the elements of at least one of the user interfaces corresponds to at least part of the buttons, the buttons' associated usage and at least part of the button layout, so the corresponding part of the interfaces is usable with the cover. Another method comprises providing one or more layouts of tactile buttons of a cover, wherein the cover can cover at least part of the touchscreen; designing at least part of the elements of at least one interface of the software to correspond to at least part of the layout; and situating the cover with the touchscreen, wherein the tactile buttons of the cover are used to use the elements of the software.
The disclosed apparatus allows visually impaired end users to use a software running on a device with a touchscreen. Designing the user interface according to a fixed layout, wherein the layout comprises tactile buttons and the buttons have associated usage, substantially reduces the learning curve of how to use the software. Using feedback (like sound and voice announcements) can inform the end user about his current screen, menu and options without the need of learning how to proceed, other than pressing the button with the associated usage.
Thus there is provided according to a first aspect of the invention a system comprising: a user interface apparatus with tactile features, configured to be used with at least part of a device's touchscreen; said user interface
apparatus defining a user interface layout over said touchscreen; an interface module configured to support usage of said user interface layout; and at least one translation module configured to translate said usage of said user interface layout to a usage of a software.
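Purely as a non-limiting sketch, the following Python fragment shows one way an interface module might forward button usage to a translation module that maps it to a usage of a target software. The class names, the button labels and the toy "dialer" actions are assumptions made only for this illustration.

from typing import Callable, Dict

class TranslationModule:
    """Holds a premade TRANSLATION: cover button -> software action."""
    def __init__(self, translation: Dict[str, Callable[[], None]]):
        self.translation = translation

    def translate_and_execute(self, button: str) -> None:
        action = self.translation.get(button)
        if action is None:
            print(f"no usage mapped to button {button!r}")
        else:
            action()                      # perform the defined usage at the software

class InterfaceModule:
    """Receives usage of the tactile layout and forwards it for translation."""
    def __init__(self, translator: TranslationModule):
        self.translator = translator

    def on_button_press(self, button: str) -> None:
        self.translator.translate_and_execute(button)

# Example wiring against a toy "dialer" software
dialer_log = []
translation = {
    "Next":   lambda: dialer_log.append("move to next contact"),
    "Prev":   lambda: dialer_log.append("move to previous contact"),
    "Select": lambda: dialer_log.append("dial selected contact"),
}
ui = InterfaceModule(TranslationModule(translation))
ui.on_button_press("Next")
ui.on_button_press("Select")
print(dialer_log)   # ['move to next contact', 'dial selected contact']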
The user interface apparatus may comprise a cover configured to cover at least part of said device's touchscreen.
The cover may comprise one or more tactile buttons arranged in a layout. The tactile buttons may be selected from the group consisting of: cutouts, extruded areas, ridges, molded key features, indentations, protrusions, depression and any combination thereof.
The cover may further comprise additional tactile features for easier tactile recognition.
The cover may further comprise signage for easier visual recognition.
The cover may comprise at least one viewer for viewing and/or touching the touchscreen.
The user interface apparatus may be at least partly a part of said touchscreen and/or said device.
The system may further comprise a connector for connecting said user interface apparatus with said device.
The user interface apparatus may further comprise an identifier for identifying said apparatus to said interface module. The interface module may comprise a local application running on said device.
The interface module may be configured to receive said software usage from said translation module and communicate it to said software for execution.
The translation module may be configured to communicate said software usage to said software for execution.
The system may be configured to derive at least part of said translation from an identifier, said identifier comprising information regarding said user interface apparatus.
The system may be configured to derive said translation from a configuration definition related to said software.
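As an illustration only, such a configuration definition could be a simple mapping from cover buttons to actions of one target software; the field names and the "music-player" actions below are assumed purely for the example.

# Hypothetical configuration definition relating one cover layout to one target software.
CONFIG_DEFINITION = {
    "software": "music-player",
    "layout": "cover-101B",
    "translation": {
        "1-Prev":  {"action": "previous_track"},
        "2-Next":  {"action": "next_track"},
        "3-OK":    {"action": "play_pause"},
        "8-Slide": {"action": "seek", "argument": "position"},
    },
}

def resolve(button: str) -> dict:
    """Derive the software usage for a pressed cover button."""
    return CONFIG_DEFINITION["translation"].get(button, {"action": "ignore"})

print(resolve("2-Next"))   # {'action': 'next_track'}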
The software may be selected from the group consisting of local or remote program, application, web application, simulated application, browser, website, server, API, and any combination thereof. The usage of the user interface apparatus may comprise one of: navigation, selection, operation, and input.
The selection may comprise one of: application selection, operating system service selection and item selection within an application or an operating system service.
The system may further comprise a feedback module comprising one of sound, speech, vibration, lights, and any combination thereof.
The translation module may further comprise a software development kit configured to enable access of software pre-designed to interact with said system.
The translation module may further comprise an application programming interface configured to enable access of software pre-designed to interact with said system.
According to another aspect of the present invention there is provided a user interface apparatus with tactile features, configured to be used with at least part of a device's touchscreen, comprising: one or more tactile buttons arranged in a layout; and an identifier for identifying said apparatus to said device.
The tactile buttons may be selected from the group consisting of: cutouts, extruded areas, ridges, molded key features, indentations, protrusions, depression and any combination thereof.
The apparatus may further comprise additional tactile features for easier tactile recognition.
The apparatus may further comprise signage for easier visual recognition. The apparatus may further comprise at least one viewer for viewing and/or touching the touchscreen.
The apparatus may be at least partly a part of said touchscreen and/or said device. The apparatus may further comprise a connector for connecting said apparatus with said device.
According to another aspect of the present invention there is provided a method of enabling non-visual use of a target software via a first device, said first device comprising a touchscreen and said target software running on said first device or on a second device communicating with said first device, the method comprising: defining a user interface layout over said touchscreen by situating a cover relative to said touchscreen, said cover comprising tactile buttons arranged in a layout; and using a software to translate usage of said user interface layout to said target software usage.
The architecture of said software may be at least partly compliant with said layout.
The settings of said software may be determined according to information provided by an identifier of said cover. Situating may comprise coupling said cover with said touchscreen and/or said first device using a coupling mechanism.
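As a hedged illustration, the information provided by such an identifier (for example, read from an NFC tag on the cover) might resemble the following; every field name and value here is an assumption made for the example.

# Hypothetical identifier payload describing the attached cover.
COVER_IDENTIFIER = {
    "cover_model": "101C",
    "layout": "nav+keyboard",
    "covers_full_screen": False,
    "position_mm": {"x_offset": 0, "y_offset": 42},   # cover position relative to the touchscreen
    "user_id": "user-0001",                            # optional apparatus-associated information
}

def apply_settings(identifier: dict) -> dict:
    """Pick interface settings according to the cover information."""
    return {
        "layout": identifier["layout"],
        "remap_touch_origin": identifier["position_mm"],
        "skip_login": identifier.get("user_id") is not None,
    }

print(apply_settings(COVER_IDENTIFIER))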
According to another aspect of the present invention there is provided a method of enabling non-visual use of a software via a device comprising a touchscreen, the method comprising: providing one or more layouts of tactile buttons of a cover covering at least part of the touchscreen; designing at least part of the elements of at least one interface of the software to correspond to at least part of the layout; situating the cover relative to the touchscreen; and using the tactile buttons of the cover to use said
corresponding elements of the software. The architecture of said software is at least partly compliant with said layout.
The software may be selected from the group consisting of: program, application, website, server, webapp, API, and any combination thereof.
According to another aspect of the present invention there is provided a tactile keyboard comprising a slide button configured to select symbols comprising letters, the symbols spread along said slide button in a predefined order. The keyboard may be connected with a device's touchscreen, said device running an interface application configured to interpret usage of said keyboard and provide feedback.
The feedback may depend on the speed of sliding across the slide button. The feedback may comprise vocalizing the name of a letter during slow sliding.
The feedback may comprise generating a tone relative to said speed when the speed is high.
The properties of said sliding may change the usage interpretation. Changing the usage interpretation may comprise changing the zoom of said symbols spread along said slide button.
The tactile keyboard may further comprise tactile means for indicating change of letter types.
BRIEF DESCRIPTION OF THE DRAWINGS
For better understanding of the invention and to show how the same may be carried into effect, reference will now be made, purely by way of example, to the accompanying drawings.
With specific reference now to the drawings in detail, it is stressed that the particulars shown are by way of example and for purposes of illustrative discussion of the preferred embodiments of the present invention only, and are presented in the cause of providing what is believed to be the most useful and readily understood description of the principles and conceptual aspects of the invention. In this regard, no attempt is made to show structural details of the invention in more detail than is necessary for a fundamental understanding of the invention, the description taken with the drawings making apparent to those skilled in the art how the several forms of the invention may be embodied in practice. In the accompanying drawings:
Fig. 1 depicts two modes of usage of the user interface apparatus according to the present invention; Figs. 2A through 2C show three exemplary embodiments of the user interface apparatus according to the present invention;
Fig. 3 is a block diagram showing an embodiment of the system according to the present invention; Fig. 4 is a flowchart showing an exemplary scenario implemented by the interface application;
Fig. 5 is a chart of an exemplary software translation and usage;
Fig. 6 shows a schematic representation of a translated application API of an exemplary dialer application; and Figs. 7A and 7B show two respective embodiments of the basic structure of the system according to the present invention.
DETAILED DESCRIPTION OF EMBODIMENTS OF THE INVENTION
The following detailed description of the invention refers to the drawings referred to above and is presented to allow a person of ordinary skill in the art to make and use various aspects of the inventions. Descriptions of specific materials, techniques, and applications are provided only as examples.
Various modifications to the examples described herein will be readily apparent to those skilled in the art, and the general principles defined herein may be applied to other examples and applications without departing from the spirit and scope of the inventions. Dimensions of components and features shown in the figures are chosen for convenience or clarity of presentation and are not necessarily shown to scale. Wherever possible, the same reference numbers will be used throughout the drawings and the following description to refer to the same and like parts.
The present invention seeks to overcome the requirement of looking at the device's touchscreen for using and operating a software. The present invention uses the insight that:
1. Any function that can be performed by a touchscreen can also be performed by a keyboard.
2. Any software is actually a decision tree which may be traversed using the keys of a keyboard to fully activate any function.
In accordance with aspects of the various described embodiments, a user interface apparatus for attaching to a touchscreen display of a device is provided.
The user interface apparatus may comprise a cover for covering at least part of a touchscreen of a device, the cover comprising one or more tactile buttons arranged in a predetermined layout.
Various buttons can include various features like sizes, shapes, textures, and the like. Various buttons can be used to represent various software functions and usages like Play, Stop, Up, Down, Select, and the like. The buttons are arranged in a layout. Various covers for various touchscreens can be designed with various layouts and buttons to accommodate various needs and requirements. Designing an appropriate software includes designing at least part of the elements (like buttons, slides, keyboard) of at least part of at least one of the software user interfaces (like the TUI (Touch User Interface) or the GUI (Graphic User Interface)) according to the buttons' associated usage and the button layout.
Designing a software user interface according to a selected cover's button layout enables using the cover's buttons to operate the software's interface when the cover is connected with the touchscreen. The buttons can be associated with, and used to operate, features like various software buttons, slides, and the like. The software can support various covers using multiple interfaces; for example without the cover, with the cover, with a cover using a different layout, and the like. Furthermore, the elements of the interfaces and the design of the interfaces can include updating (manually or automatically) the software settings (like usage, functionality, size, interface layout, and the like) according to the cover information. The software architecture (the structure of the software system) can also be designed to be at least partly compliant with the cover's buttons and layout, thus the cover buttons can be used for controlling and using software activities like navigating the software screens and menus, managing content, switching activity, and the like. The cover can be supported by many types of software applications like music players, RSS readers, phone dialers, calendars, contacts, and the like.
Furthermore, the cover can be supported by core software like launchers and operating systems, thus the cover can also be used to support functions like activating various software applications, switching between software applications, managing the software applications, and the like.
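Purely as an illustration of the button-to-usage association described above, the correspondence between cover buttons and software usages could be represented as a simple mapping consumed by the interface layer. The cover identifier, button names and action strings below are hypothetical and are not taken from the disclosed covers; this is a minimal sketch, not the actual implementation:

```python
# Hypothetical layout descriptor for a cover in the style of Fig. 2A.
# Button names and action strings are illustrative only.
COVER_LAYOUTS = {
    "cover_101A": {
        "Left":   "launcher.previous_app",
        "Right":  "launcher.next_app",
        "Choose": "launcher.activate_app",
        "Prev":   "app.previous_menu",
        "Next":   "app.next_menu",
        "Up":     "app.previous_option",
        "Down":   "app.next_option",
        "Select": "app.activate_option",
    },
}

def resolve_usage(cover_id: str, button: str) -> str:
    """Translate a tactile button press into a software usage string."""
    layout = COVER_LAYOUTS.get(cover_id, {})
    return layout.get(button, "noop")

if __name__ == "__main__":
    print(resolve_usage("cover_101A", "Choose"))  # -> launcher.activate_app
```

A software whose user interface is designed against such a mapping can then be driven entirely through the cover's buttons, with or without the touchscreen being visible.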
A preferred embodiment of the cover defines at least some of the buttons by cutouts from the cover, for cheaper manufacturing. Other button embodiments may include extruded areas, ridges, molded key features, indentations, protrusions, depressions, or the like arranged to provide users with a tactile feel (e.g., for motion and/or location). The apparatus can include a connector for connecting (wired/wireless) with the device. A preferred connector has wireless connection capabilities (like Bluetooth, NFC, Wi-Fi, and the like) for wirelessly connecting with a device which supports such a wireless connection.

An optional embodiment of the apparatus comprises an identifier for identifying apparatus properties and/or apparatus associated information and/or additional information. Identifying the apparatus properties makes it easier to support various covers (like cover layouts, styles, sizes, button associated usage, and the like) - thus the software interface and/or interface elements can be optimized for the cover in use. Another optional feature of the identifier is the apparatus associated information. For example - associating a user ID with a specific apparatus enables shortening at least part of the end user identification process. Yet another optional feature of the identifier is including or providing some additional information, like the apparatus position relative to the touchscreen, for dynamic and automatic software interface adjustments.

The apparatus can be combined with the device or with the device's touchscreen in various ways, including using an adhesion mechanism to couple the cover to the touchscreen; using a mount mechanism for mounting the cover over the touchscreen and/or the device; using a case for casing the device wherein the cover is part of the case (like a wallet case), and the like. An optional embodiment of the apparatus comprises additional tactile characteristics for easier tactile recognition of each button without looking. For example, Braille symbols can be included with at least some of the buttons for easier recognition of the buttons and/or the button associated usage by the visually impaired. The apparatus may include signage for signing the meaning of the buttons for users with substantially proper vision. The apparatus can include a viewer for viewing and/or touching the touchscreen. The apparatus can also be combined and/or be a part of the touchscreen and/or device and/or any other part. In some embodiments, the button's associated function can be updated by the software to support additional features, like associating the buttons used for traversing the application decision tree to control a music player, or associating the keyboard layout to slide the values in a slide-bar, and the like. The user interface apparatus according to the present invention may be used in various modes, including as a buffer or translator to an application, as a remote control to an application, and the like. Two of the modes of usage are depicted in Fig. 1.
User interface apparatus 100 may be used in a first mode to provide a touchscreen user interface navigation tool to users who are unable, selectively or un-selectively, permanently or temporarily, to see or to look at the touchscreen. This category may include the visually impaired, the blind, drivers while driving, and more. The device functions made available to those users by the user interface apparatus 100 may include at least some of the usual functionality of the device, such as, for example, browser 110, phone 120 and apps 130.
In its second usage mode, the user interface apparatus 100 may be used as a remote control device 140, to help operate at least one application running on another device, whether or not the other device has a screen or a touchscreen. For example, a navigation application running on a vehicle's multimedia system, where the use of the user interface apparatus 100 enables control of the application and/or the system's operating system without looking at the screen of the other device or performing touch operations thereon.
With reference to Figs. 2A through 2C, views of three exemplary embodiments of the user interface apparatus 100 are shown. In these embodiments, user interface apparatus 100 comprises a cover 101A/101B/101C with sets of buttons. The buttons are arranged in a layout. In these examples the buttons represent various usages of the underlying user interface.
In the example of Fig. 2A, optional button set 102 can be used to browse between the software applications by using the "Left" and "Right" buttons and activate the selected software application using the "Choose" button (as respectively signed on surface 101A). Optional button set 103 can be used to browse inside the selected software application menus using the "Prev" (Previous) and "Next" buttons (as respectively signed on surface 101A), browse between the software application options using the "Up" and "Down" buttons (as respectively signed on surface 101A) and activate the selected option using the "Select" button (as respectively signed on surface 101A).
Optional button set 104 can be used to activate popular functions like "Help" and "Search" (as respectively marked on surface 101A). Optional button set 105 can also be used to answer popular questions with answers like "Yes" and "No" (as respectively marked on surface 101A). Optional button set 106 can be used as a keyboard layout (as marked with the a-z letters, 0-9 numbers, and various symbols on cover 101A). Optional button set 107 can be used as arrows and other required keys, like "Space bar" and "Enter" (as respectively marked with the arrow symbols, "Space", and "Enter" on surface 101A).
Preferably, cover 101A can contain additional tactile characteristics such as, for example, 108 for easier tactile recognition of the button by the visually impaired, and signage 109 for easier visual recognition of the button by users with substantially proper vision.
In the example of Fig. 2B, buttons 1 - 3 can similarly be used to browse between the software applications by using the "Prev" and "Next" buttons and activate the selected software application using the "OK" button. The same buttons may also be used to navigate inside the selected software application menus. Buttons 4 - 7 may be used to perform predefined functions such as, for example, "Back" (#4), "Apps" (#5) - to return to the main application selection screen, "Messages" (#6) - to view the latest operating system messages and "Switches" (#7) - to view and control the system settings like date, time, wifi, 3g and the like. Buttons 8 - 12 may function as a continuous keyboard (#8), as will be explained in detail below, and accompanying buttons for language selection (#9), "Space" (#10), "Enter" (#11), and "Backspace" (#12). Buttons 13 - 15 are functional buttons to invoke, for example, "voice to text" functionality (#13), "Settings" (#14) selection and "Search" (#15) functionality.
The keyboard (#8) and its accompanying buttons (#9-12) define an optional layout of a keyboard. In this embodiment of the keyboard, sliding across the slide button (#8) can be used to locate a single letter, symbol, action, and the like (hereinafter LETTER). Preferably, the LETTERS are arranged in a specific order, for example a-z will be spread where "a" is relatively located on one side of button #8 and "z" is relatively located on the other side of button #8. Preferably, the LETTERS are relatively spread along button #8 for easier orientation. According to one embodiment, the speed of sliding across button #8 affects the type of feedback used for indicating the current optional selection. For example - during a relatively slow slide, the name of the LETTER is vocalized, and during a relatively fast slide, a relative tone is generated. In another embodiment, the speed of sliding can initiate a zoom in and zoom out to the sequence of the LETTERS when sliding slower or faster, respectively. In another embodiment, an auto-complete, dictionary and predictions can be made using the sliding properties, i.e. sliding locations, speed, direction and the like. Button #9 can be used to cycle through various types of the LETTERS at button #8, for example: a-z, A-Z, 0-9, !-),
and the like. In a preferred embodiment, button #9 can include additional usages, like quick selection, i.e. - a long press can return the type of LETTERS to a-z, and a double press can return the type of LETTERS to 0-9. Button #10 can be used as a spacebar i.e. to enter spaces between words. In a preferred embodiment, button #10 can include additional usages, like traversing the LETTERS, i.e. - by sliding left and right along button #10 the insertion location of the next LETTER can be updated. Button #11 can be used as Enter i.e. to create and go to a new line in the text. In a preferred embodiment, button #11 can include additional usages, like reading the current text, i.e. - by double pressing button #11 the entire text box is read out by the system. Button #12 can be used as a backspace i.e. to delete the last inserted LETTER and go back. In another embodiment, button #12 can include additional usages, like discarding the whole line of LETTERS, i.e. - by double pressing button #12 the last line of LETTERS is deleted and removed. Various optional embodiments of the keyboard are demonstrated in Figs. 2A, 2B and 2C.
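By way of a non-limiting sketch, the speed-dependent feedback of the slide keyboard could be implemented along the following lines. The threshold value and the speech/tone outputs are hypothetical placeholders for whatever text-to-speech and audio facilities the host platform provides:

```python
import string

LETTERS = string.ascii_lowercase          # symbols spread along slide button #8
SLOW_SPEED_THRESHOLD = 0.15               # hypothetical, in button-lengths per second

def letter_at(position: float) -> str:
    """Map a normalized slide position (0.0 .. 1.0) to a LETTER."""
    index = min(int(position * len(LETTERS)), len(LETTERS) - 1)
    return LETTERS[index]

def feedback_for_slide(position: float, speed: float) -> dict:
    """Choose feedback as described above: vocalize the LETTER when sliding slowly,
    emit a speed-relative tone when sliding quickly."""
    letter = letter_at(position)
    if speed <= SLOW_SPEED_THRESHOLD:
        return {"type": "speech", "text": letter}   # e.g. handed to a TTS engine
    return {"type": "tone", "pitch_hz": 220 + 880 * position, "gain": min(speed, 1.0)}

def zoom_for_speed(speed: float) -> float:
    """Optional zoom of the LETTER sequence: slower sliding zooms in, faster zooms out."""
    return 2.0 if speed <= SLOW_SPEED_THRESHOLD else 0.5

if __name__ == "__main__":
    print(feedback_for_slide(0.1, 0.05))   # slow slide near "c" -> speech feedback
    print(feedback_for_slide(0.9, 0.80))   # fast slide near "x" -> tone feedback
```

The same slide properties (position, speed, direction) could also feed the auto-complete and prediction behaviors mentioned above; that part is omitted from the sketch.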
Other embodiments of the user interface apparatus 100 may include covers with various layouts including various buttons with various associated usage.
Any of the buttons can be made with any appropriate method, material, size and properties, like a cutout from cover 101, a structure, a structure with a nib, or any other combination of creating a tactile button.
User interface apparatus 100 can cover part or all of the device's touchscreen. User interface apparatus 100 can be made with any appropriate features, thickness, size, shape, properties and materials such as plastic, rubber, silicone, leather, or any other suitable material or combination of materials.
Usefully, user interface apparatus 100 includes a mechanism (not shown) for locating and/or fitting and/or situating and/or mounting and/or adhering and/or combining user interface apparatus 100 with the touchscreen (not shown) of a device, and/or with the device, and/or any required configuration. User interface apparatus 100 can also be removable, fixed or be a part of the device and/or device touchscreen and/or any required configuration.
In other embodiments, cover 101 of user interface apparatus 100 can include none, some, or all of the proposed buttons (see Figs. 2A and 2B) in any layout arrangement with any usage representation. In addition, other embodiments can include no, or any, additional buttons with any required usage, like various symbols, combinations, meanings, functions, usages, and the like.
FIG. 2C provides a view of another embodiment of a user interface apparatus 100 attached to a device. Cover 101C is attached over a touchscreen of a smartphone 202. Combining cover 101C with smartphone 202 creates an arrangement 200 that can help any type of user that can't or won't look at the screen for any reason, and especially those with permanent or temporary visual impairment, to navigate the user interface, with the help of a designated software service, as will be described in detail below. Cover 101C may comprise some of the buttons and functionality of cover 101A. For example, button sets 203, 204, 205, and 206 have similar functionality to button sets 102, 103, 106, and 107, respectively. The end user can use smartphone 202 by holding or placing the smartphone 202 and feeling cover 101C buttons.

Fig. 3 is a block diagram of an embodiment of the system 300 according to the present invention. The system 300 comprises a system server 310, comprising at least one of the following - a website translations repository 320 and an application translations repository 330. Website translations repository 320 stores at least one translation of a website, the translation comprising a translated architecture of the website, namely, the decision tree used for navigating through the website. Similarly, application translations repository 330 stores at least one translation of an application, the translation comprising a translated architecture of the application, namely, the decision tree used for navigating through the application. The website translations stored on the system server 310 may be acquired by analyzing the website HTML (online or offline) or by using a service such as Kimono, provided by Kimono-Labs (www.kimonlabs.com), that turns websites into structured APIs 380 which communicate with the various websites 395.
The system 300 additionally comprises an interface module 350 running on user devices such as a smartphone, tablet PC or any other touchscreen-equipped device. Interface module 350 is uniquely configured to serve a specific user interface device configuration, selected for example from the configurations shown in Figs. 2A through 2C. Namely, each selection of a button on the user interface device (cover) is uniquely interpreted by the interface module 350 as a specific command (e.g. navigation, specific function activation, keyboard etc. as described above). User device 340 also comprises various applications 360 and a browser 370.
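For illustration only, the kind of translation held in repositories 320 and 330 could be stored as decision trees keyed by a screen or link identifier. The screen IDs and node contents below are invented for the sketch and do not correspond to any actual repository entry:

```python
from typing import Optional

# Hypothetical entries for the website and application translations repositories.
# Each translation is the decision tree used to navigate the target software.
WEBSITE_TRANSLATIONS = {
    "news_site_home": {
        "label": "Home",
        "children": ["Top stories", "Sports", "Weather"],
    },
}

APPLICATION_TRANSLATIONS = {
    "dialer_main": {
        "label": "Dialer",
        "children": ["Contacts", "Keypad", "Call log"],
    },
}

def find_translation(screen_id: str) -> Optional[dict]:
    """Server-side lookup: return the stored translation for a given screen ID."""
    return WEBSITE_TRANSLATIONS.get(screen_id) or APPLICATION_TRANSLATIONS.get(screen_id)

if __name__ == "__main__":
    print(find_translation("dialer_main"))
```

The interface module 350 would query such a store (or a remote equivalent) and use the returned tree to drive navigation, as described below with reference to Fig. 4.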
According to embodiments of the invention, the system can be implemented and used using various technologies, including: running locally as an application, remotely as a server (e.g. SaaS - Software As A Service), as a website, as a webapp (e.g. web.whatsapp.com), as an API (e.g. as part of the operating system), and the like. Furthermore, any combination of the technologies can be used, for example - the interface module can run locally (at the device) while the translation module is located on a remote server. Another example - users without the installed interface module can use the system as a webapp or website, through the device's web browser. The system can access and use at least one of the following software
technologies: applications, websites, webapps, servers, APIs, and the like (hereinafter APP). For example - the system can use the operating system using the API of the operating system, and browse Amazon using the Amazon API (https://developer.amazon.com/public/apis). Another example - the system can use a webapp (e.g. web.whatsapp.com) using Web Browser Automation (e.g. Selenium). The APPs can be stored and/or executed locally (i.e. on the device), remotely (i.e. on a server) and/or virtually (e.g. running locally or remotely using Selenium). To access and use at least one of the APPs, the system software translates "on the fly" (e.g.
https://www.jetbrains.com/idea/features/code_analysis.html) and/or uses a premade translation of the APP (e.g. API) (hereinafter TRANSLATION). For example, the TRANSLATION can be done using the APP API (premade), analyzing the APP code (premade and/or "on the fly"), running a
virtual simulation of the APP while analyzing the APP behaviors and commands (premade and/or "on the fly"), pre-designed for being accessed by the system (premade), and the like. The TRANSLATION is used to translate the end-user usage at the interface module of the system to a usage at the APP. For example - see Fig. 5 and Fig. 6. The TRANSLATION can be stored locally and/or remotely (on a server/s).
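As one possible illustration of the Web Browser Automation route mentioned above, a server-side component could drive a real browser with Selenium and expose the resulting page structure as a TRANSLATION. This is a minimal sketch under the assumptions that Selenium and a local chromedriver are installed; the target URL and link-selection logic are placeholders:

```python
from selenium import webdriver
from selenium.webdriver.common.by import By

def translate_page_on_the_fly(url: str) -> list[str]:
    """Open a page in a simulated browser and return its link texts,
    which the interface module can then read out and navigate."""
    driver = webdriver.Chrome()          # assumes a local chromedriver
    try:
        driver.get(url)
        links = driver.find_elements(By.TAG_NAME, "a")
        return [link.text for link in links if link.text.strip()]
    finally:
        driver.quit()

def activate_link(url: str, link_text: str) -> None:
    """Simulate the end user's selection by clicking the chosen link."""
    driver = webdriver.Chrome()
    try:
        driver.get(url)
        driver.find_element(By.LINK_TEXT, link_text).click()
    finally:
        driver.quit()

if __name__ == "__main__":
    # Placeholder URL; any site without an API could be translated this way.
    print(translate_page_on_the_fly("https://example.com"))
```

A premade TRANSLATION (for example one built from a published API) would simply replace the on-the-fly scraping step while keeping the same usage mapping.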
Figs. 7A and 7B show two respective embodiments of the basic structure of the system. The system comprises a tactile user interface apparatus 700 configured to be used with at least part of a device's touchscreen 710 and to define a user interface layout over device's touchscreen 710; an interface module 720 configured to support the usage of the defined user interface layout; and at least one translation module 730 for translating the usage of the defined user interface layout to a usage of a software 740.
When situating or connecting the tactile user interface apparatus 700 with a device's touchscreen 710, the tactile interface apparatus 700 defines a set of buttons in a layout at the device's touchscreen 710. The tactile buttons are used to enable usage of software 740 via the system without looking at the device's touchscreen 710.
In the embodiment of Fig. 7A, interface module 720 supports the defined user interface layout by translating the usage performed on the device's
touchscreen 710 to a defined usage at software 740 using translation module 730. The translation is kept at translation module 730 and interface module 720 can read, write, modify or save the translation. The software 740 can update the device's touchscreen 710 or send feedback via interface module 720.
In another embodiment, as demonstrated in Fig. 7B, translation module 730 receives the usage performed at the defined user interface layout on the device's touchscreen 710 from interface module 720, translates the usage and performs a defined usage at software 740. Other embodiments may include additional modules to support additional features like a feedback module to support announcements to the user using sound, speech, vibration, lights, and the like; a communication module to support communication with various additional devices like an external screen, speakers, headphones, operating systems and the like; a software communicator to support various modes of communicating with various software; an API module to support accessing and using the system using an API; an SDK module to support development for the system using an SDK; and the like. In another
embodiment, the translation module 730 can additionally be used as an API or an SDK to provide access to software designed for usage with the system. According to embodiments of the invention, the system software can run its own "apps" / "phone" / "browser" and/or use pre-existing "apps" / "phone" / "browser" that are naturally compatible, modified to be compatible, or any combination thereof. For example, the software may be able to use the Android native phone app (via API), but a proprietary phone app may be required to be developed, which uses the phone OS (Operating System) resources API, to fit it to the layout requirements (if the native phone app itself doesn't support an API). Another example - for sites with an API (like amazon.com) the system uses the API to simulate surfing / browsing. For sites without an API (like ynet.com) the system may have to run a simulated web-browser at its servers, which will simulate the browsing / surfing while analyzing the generated web-page to translate the generated web-page to the user interface apparatus layout requirements.
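Purely as an illustration of the two wirings of Figs. 7A and 7B, the interface module and the translation module could be composed in either direction. The class and method names below are invented for the sketch and do not correspond to any specific implementation:

```python
class TargetSoftware:
    """Stand-in for software 740; records the usages performed on it."""
    def perform(self, usage: str) -> str:
        return f"performed: {usage}"

class TranslationModule:
    """Stand-in for translation module 730: maps layout usage to software usage."""
    TABLE = {"button_next": "select_next_item", "button_ok": "activate_item"}

    def translate(self, layout_usage: str) -> str:
        return self.TABLE.get(layout_usage, "noop")

class InterfaceModuleFig7A:
    """Fig. 7A style: the interface module asks for a translation, then acts on the software."""
    def __init__(self, translator: TranslationModule, software: TargetSoftware):
        self.translator, self.software = translator, software

    def on_button(self, layout_usage: str) -> str:
        return self.software.perform(self.translator.translate(layout_usage))

class TranslatingModuleFig7B(TranslationModule):
    """Fig. 7B style: the translation module itself performs the usage at the software."""
    def __init__(self, software: TargetSoftware):
        self.software = software

    def on_usage_from_interface(self, layout_usage: str) -> str:
        return self.software.perform(self.translate(layout_usage))

if __name__ == "__main__":
    sw = TargetSoftware()
    print(InterfaceModuleFig7A(TranslationModule(), sw).on_button("button_ok"))
    print(TranslatingModuleFig7B(sw).on_usage_from_interface("button_next"))
```

Feedback, communication, API and SDK modules as mentioned above would attach around this core in the same compositional manner.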
Fig. 4 is a flowchart showing an exemplary scenario implemented by the interface module 350 running on the user device. In step 410 the local app, namely the interface module 350 executable on the user device, is launched. The launching may be performed regularly, as an app, website or the like, or triggered, for example, by attaching the user interface apparatus 100 of the present invention to the user's smartphone, using NFC technology for automatic identification of the apparatus or informing the OS applications launcher of the apparatus being attached.
In step 420 interface module 350 requests and receives from the OS the ID of the currently attached user interface apparatus 100 - i.e. the location of the cover over the touchscreen in case the cover doesn't cover the whole touchscreen, and the names, functions, and locations of the cover's buttons, for example, to resolve the differences between Figs. 2A, 2B, and 2C, or any other layout of buttons with various locations, properties and function representations.
Another example is to differentiate between a full function / "regular" cover layout, a task specific cover (e.g. for operating a limited set of specific apps or specific needs), and a configurable cover layout (e.g. a development cover in which the button functions are programmable), and other forms and combinations, one of which will be explained in detail below. In addition, the OS can inform the interface module 350 about the status of the OS and current apps; for example, if the cover is placed while the end user is using a Smart TV app, interface module 350 will initialize and execute with the Smart TV function.
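Step 420 could, for instance, reduce to a lookup from the reported cover ID to a layout descriptor and an initial mode. The IDs and descriptor fields below are hypothetical examples only:

```python
# Hypothetical registry of known covers, keyed by the identifier reported by
# the OS (e.g. read over NFC when the cover is attached).
KNOWN_COVERS = {
    "cover-2a":  {"kind": "full-function", "buttons": ["Left", "Right", "Choose"]},
    "cover-2b":  {"kind": "full-function", "buttons": ["Prev", "Next", "OK"]},
    "cover-tv":  {"kind": "task-specific", "task": "smart-tv"},
    "cover-dev": {"kind": "configurable", "buttons": []},
}

def configure_interface(cover_id: str, current_app: str) -> dict:
    """Resolve the attached cover and choose an initial mode for the interface module."""
    descriptor = KNOWN_COVERS.get(cover_id, {"kind": "unknown"})
    mode = descriptor.get("task", current_app)   # e.g. start in the Smart TV function
    return {"cover": descriptor, "initial_mode": mode}

if __name__ == "__main__":
    print(configure_interface("cover-tv", current_app="launcher"))
```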
A default execution start may be defined, such as a home page automatically generated or defined by the user. Alternatively and additionally, interface module 350 may "learn" over time the user's preferred usage mode and thus enhance the user experience by getting him faster to his destination.
In step 430 interface module 350 checks whether the current screen ID is the ID of an OS navigation screen (i.e. a launcher - a screen for selecting applications to launch or other OS services). If not, namely the current screen is that of a browser or an application - interface module 350 sends the current screen ID to the system server 310 (step 440). The system server identifies the appropriate link in its databases, finds or creates the translation
associated with the link and communicates it to the interface module 350 (step 450). The server can be remote and/or local.
In step 460 interface module 350 selects a current item from the translated page and uses text-to-speech capabilities or previous knowledge of the page's layout to indicate the current position (e.g. "clock" or "select contact" or "John Doe") to the user. If in step 430 the screen is identified as a navigation screen, the program goes directly to step 460.
In step 465 interface module 350 receives from the user interface apparatus 100 coordinates of user selection and checks in step 470 whether the user has entered a navigation command (e.g. commands #2 (next) or #3 (prev) in the exemplary apparatus of Fig. 2B). If the user has entered a navigation command, interface module 350 uses the page translation to find the required item (step 475) and loops back to step 460 to inform the user of the selected item. Otherwise, if the user selection is not of a navigation command, interface module 350 checks in step 480 whether the selection requires the display of a new page/screen (e.g. "contacts" screen or parking app zone selection screen or functional screen such as "Definitions") and returns to step 440 if it does. Otherwise, if the user has selected the keyboard, in step 485 interface module 350 aggregates the selected keyboard symbols until "enter" is selected and uses the appropriate API to communicate the alphanumeric input to the currently running app or service (step 490).
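As an illustrative, highly simplified sketch of the flow above (not the actual implementation), the navigation loop of steps 430 through 480 could be scripted as follows; the screens, items and text-to-speech output are reduced to stubs, and the keyboard handling of steps 485-490 is omitted:

```python
# Scripted simulation of the Fig. 4 scenario: a fake launcher screen, one
# fake translated application screen, and a short sequence of selections.
LAUNCHER_ITEMS = ["Dialer", "Music", "Calendar"]            # OS navigation screen
TRANSLATED_SCREENS = {"Dialer": ["Contacts", "Keypad"]}     # as fetched in steps 440-450

def speak(text: str) -> None:
    """Placeholder for the text-to-speech indication of step 460."""
    print(f"[TTS] {text}")

def run_scripted_session(selections: list[str]) -> None:
    items, index = LAUNCHER_ITEMS, 0                        # step 430: navigation screen
    speak(items[index])
    for command in selections:                              # step 465: user selections
        if command == "next":                               # steps 470/475: navigation
            index = (index + 1) % len(items)
        elif command == "prev":
            index = (index - 1) % len(items)
        elif command == "choose" and items[index] in TRANSLATED_SCREENS:
            items, index = TRANSLATED_SCREENS[items[index]], 0   # step 480: new screen
        speak(items[index])                                 # back to step 460

if __name__ == "__main__":
    run_scripted_session(["choose", "next"])   # Dialer -> Contacts -> Keypad
```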
With reference to FIG. 5, a chart 500 of an exemplary software (app, website, and the like) translation and usage is shown. This chart is used to
demonstrate how a software running on a device can be used without looking at the screen of the device when used with the apparatus attached to the touchscreen. In this example, the support of the software for the apparatus is from the launcher level and down to the single action. Many other formations and support levels can be implemented. Furthermore, various covers with various layouts with various buttons with various usages can be used for various software at various levels running on various devices with various touchscreens. In this example, the launcher layer 501 of the software running on the device supports selecting which software 502 to execute. The selection can be done by pressing the user interface element (hereinafter UIE) associated with switching software. The pressing can be done directly on the touchscreen or via the associated cover button. In this example, cover 101A optional button set 102 represents switching the software; thus, when the cover 101A is attached to a touchscreen of the device and the software of the device supports the layout of the cover ("naturally" or via "translation" as has been explained above), a visually impaired end user can switch and select between software applications by pressing the UIE associated with button set 102. The software can use various indications like sound, speech (using recorded audio or a TTS (Text To Speech) engine), vibration or any other form of feedback to indicate the current status, selection and any other information relevant to the user. Any type of software, locally or remotely, can be supported, like a dialer, calendar, music player, RSS reader, games, and the like.
Once a software is selected, in this example "software 2", the end user can select which screen 503 to use. In this example, cover 101A optional button set 103 represents traveling inside screens, menus and other options, thus traveling and selecting between screens 503, menus 504, modules 505, and buttons 508. Furthermore, optional button set 105 can be used to answer the "yes/no" of field 506, or the answer can be selected using optional button set 103. Furthermore, optional button set 106 can be used as a keyboard for inserting and editing the content in field 507.
As an example: software 1, software 2, and software 3 of software 502 can represent, respectively, a dialer application, a music application, and a calendar application. By pressing the tactile button #5 ("Apps") and the buttons "Prev", "Next", and "OK" at cover 101B, the user can switch between the proposed software applications. When selecting software 2, thus the music application, the user can choose between screen 1, screen 2, screen 3, and screen 4 of software 2, which can represent, respectively, an artists screen, albums screen, podcast screen, and streaming screen.
In this example, the software supports the use of the user interface apparatus 100 from the launcher level and down to a single usage like pressing a button. Other kinds of support are also contemplated, like supporting only an element of the user interface, a few elements, a menu, a screen, an application, an operating system, and any other type of user interface at any level.
Fig. 6 shows a schematic representation of a translated application API of an exemplary dialer application, showing the application's decision tree and one possible path traversing the decision tree.
Using for example the cover 101A of Fig. 2A, a set of selections for
performing the highlighted operation shown in Fig. 6 may look like:
1. Vocalize "Dialer"
2. Select "Choose"
3. Vocalize "Contacts"
4. Select "Choose"
5. Vocalize 1st name
6. Select "Choose"
7. Vocalize "Email"
8. Select "Next"
9. Vocalize "Phone"
10. Select "Choose"
11. Vocalize "Edit"
12. Select "Next"
13. Vocalize "Dial"
14. Select "Choose"
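A toy version of such a traversal could be expressed as a decision tree plus a cursor, where "Next" moves between siblings and "Choose" descends, the current item being vocalized at each stop. The tree below reproduces only the branch highlighted in Fig. 6, and the node names are illustrative; running it echoes the fourteen-step sequence listed above, plus a final activation line:

```python
# A tiny slice of the dialer decision tree of Fig. 6 (illustrative only).
DIALER_TREE = {
    "Dialer": {
        "Contacts": {
            "1st name": {
                "Email": {},
                "Phone": {
                    "Edit": {},
                    "Dial": {},
                },
            },
        },
    },
}

def traverse(tree: dict, commands: list[str]) -> None:
    """Replay "Next" / "Choose" selections, echoing what the user would hear."""
    node, siblings, index = tree, list(tree.keys()), 0
    print(f'Vocalize "{siblings[index]}"')
    for command in commands:
        print(f'Select "{command}"')
        if command == "Next":
            index = (index + 1) % len(siblings)
        elif command == "Choose":
            child = node[siblings[index]]
            if not child:                      # leaf node: activate it, e.g. place the call
                print(f'Activate "{siblings[index]}"')
                return
            node, siblings, index = child, list(child.keys()), 0
        print(f'Vocalize "{siblings[index]}"')

if __name__ == "__main__":
    traverse(DIALER_TREE, ["Choose", "Choose", "Choose", "Next",
                           "Choose", "Next", "Choose"])
```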
In the given examples, the user interface device was shown in its usage as a navigation tool, simulating the touchscreen user interface by navigating a decision tree.
In other uses the user interface device of the present invention may permanently or configurably enable the operation of a specific application. For example:
1. An application may be programmed with a specific mode of operation designed to be carried out using a predefined layout of the user interface device. In this embodiment a user interface device layout is dictated by the application and each button has a predefined functionality within the application.
2. An application may be programmed with the ability to broadcast its user interface requirements to a local or remote user interface device configuration software, whereby the user interface device buttons may be configured accordingly and the configuration communicated to the application.

It should be understood that the above description is merely exemplary and that there are various embodiments of the present invention that may be devised; that the features described in the above-described embodiments, and those not described herein, may be used separately or in any suitable combination; and that the invention can be devised in accordance with embodiments not necessarily described above. Furthermore, the above description is exemplary only and it will be apparent to those of ordinary skill in the art that numerous modifications and variations are possible. For example, various exemplary methods and systems described herein may be used alone or in combination with various other systems and methods. Additionally, particular examples have been discussed, and how these examples are thought to address certain disadvantages in related art. This discussion is not meant, however, to restrict the various examples to methods and/or systems that actually address or solve the disadvantages.

Claims

1. A system comprising: a user interface apparatus with tactile features, configured to be used with at least part of a device's touchscreen; said user interface apparatus defining a user interface layout over said touchscreen; an interface module configured to support usage of said user interface layout; and at least one translation module configured to translate said usage of said user interface layout to a usage of a software.
2. The system of claim 1, wherein said user interface apparatus
comprises a cover configured to cover at least part of said device's touchscreen.
3. The system of claim 2, wherein said cover comprises one or more
tactile buttons arranged in a layout.
4. The system of claim 3, wherein said tactile buttons are selected from the group consisting of: cutouts, extruded areas, ridges, molded key features, indentations, protrusions, depressions and any combination thereof.
5. The system of claim 3, wherein said cover further comprises additional tactile features for easier tactile recognition.
6. The system of claim 2, wherein said cover further comprises signage for easier visual recognition.
7. The system of claim 2, wherein said cover comprises at least one
viewer for viewing and/or touching the touchscreen.
8. The system of claim 1, wherein said user interface apparatus is at least partly a part of said touchscreen and/or said device.
9. The system of claim 1, further comprising a connector for connecting said user interface apparatus with said device.
10. The system of claim 1, wherein said user interface apparatus further comprises an identifier for identifying said apparatus to said interface module.
11. The system of claim 1, wherein said interface module comprises a local application running on said device.
12. The system of claim 1, wherein said interface module is configured to receive said software usage from said translation module and communicate it to said software for execution.
13. The system of claim 1, wherein said translation module is configured to communicate said software usage to said software for execution.
14. The system of claim 1, configured to derive at least part of said translation from an identifier, said identifier comprising information regarding said user interface apparatus.
15. The system of claim 1, configured to derive said translation from a configuration definition related to said software.
16. The system of claim 1, wherein said software is selected from the
group consisting of local or remote program, application, web
application, simulated application, browser, website, server, API, and any combination thereof.
17. The system of claim 1, wherein said usage of the user interface
apparatus comprises one of: navigation, selection, operation, and input.
18. The system of claim 17, wherein said selection comprises one of:
application selection, operating system service selection and item selection within an application or an operating system service.
19. The system of claim 1, further comprising a feedback module
comprising one of sound, speech, vibration, lights, and any
combination thereof.
20. A user interface apparatus with tactile features, configured to be used with at least part of a device's touchscreen, comprising: one or more tactile buttons arranged in a layout; and
an identifier for identifying said apparatus to said device.
21. The apparatus of claim 20, wherein said tactile buttons are selected from the group consisting of: cutouts, extruded areas, ridges, molded key features, indentations, protrusions, depressions and any
combination thereof.
22. The apparatus of claim 20, further comprising additional tactile features for easier tactile recognition.
23. The apparatus of claim 20, further comprising signage for easier visual recognition.
24. The apparatus of claim 20, further comprising at least one viewer for viewing and/or touching the touchscreen.
25. The apparatus of claim 20, wherein said apparatus is at least partly a part of said touchscreen and/or said device.
26. The apparatus of claim 20, further comprising a connector for
connecting said apparatus with said device.
27. A method of enabling non-visual use of a target software via a first device, said first device comprising a touchscreen and said target software running on said first device or on a second device
communicating with said first device, the method comprising:
defining a user interface layout over said touchscreen by situating a cover relative to said touchscreen, said cover comprising tactile buttons arranged in a layout; and using a software to translate usage of said user interface layout to said target software usage.
28. The method of claim 27, wherein the architecture of said software is at least partly compliant with said layout.
29. The method of claim 27, wherein settings of said software are
determined according to information provided by an identifier of said cover.
30. The method of claim 27, wherein said situating comprises coupling said cover with said touchscreen and/or said first device using a coupling mechanism.
31. A method of enabling non-visual use of a software via a device comprising a touchscreen, the method comprising:
providing one or more layouts of tactile buttons of a cover covering at least part of the touchscreen;
designing at least part of the elements of at least one interface of the software to correspond to at least part of the layout;
situating the cover relative to the touchscreen; and
using the tactile buttons of the cover to use said corresponding elements of the software.
32. A method according to claim 31, wherein the architecture of said software is at least partly compliant with said layout.
33. The method of claim 31 , wherein said software is selected from the group consisting of: program, application, website, server, webapp, API, and any combination thereof.
34. The system of claim 1, wherein said translation module further comprises a software development kit configured to enable access of software pre-designed to interact with said system.
35. The system of claim 1, wherein said translation module further comprises an application programming interface configured to enable access of software pre-designed to interact with said system.
36. A tactile keyboard comprising a slide button configured to select symbols comprising letters, the symbols spread along said slide button in a predefined order.
37. The tactile keyboard of claim 36, said keyboard connected with a device's touchscreen, said device running an interface application configured to interpret usage of said keyboard and provide feedback.
38. The tactile keyboard of claim 37, wherein said feedback depends on the speed of sliding across the slide button.
39. The tactile keyboard of claim 38, wherein said feedback comprises vocalizing the name of a letter during slow sliding.
40. The tactile keyboard of claim 38, wherein said feedback comprises generating a tone relative to said speed when the speed is high.
41. The tactile keyboard of claim 37, wherein properties of said sliding change the usage interpretation.
42. The tactile keyboard of claim 41, wherein said changing the usage interpretation comprises changing the zoom of said symbols spread along said slide button.
43. The tactile keyboard of claim 37, further comprising tactile means for indicating change of letter types.
PCT/IB2015/055408 2014-07-16 2015-07-16 Tactile interface for a touchscreen WO2016009390A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US201462025019P 2014-07-16 2014-07-16
US62/025,019 2014-07-16

Publications (1)

Publication Number Publication Date
WO2016009390A1 true WO2016009390A1 (en) 2016-01-21

Family

ID=55077969

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/IB2015/055408 WO2016009390A1 (en) 2014-07-16 2015-07-16 Tactile interface for a touchscreen

Country Status (1)

Country Link
WO (1) WO2016009390A1 (en)

Patent Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20050099403A1 (en) * 2002-06-21 2005-05-12 Microsoft Corporation Method and system for using a keyboard overlay with a touch-sensitive display screen
US8482540B1 (en) * 2011-01-18 2013-07-09 Sprint Communications Company L.P. Configuring a user interface for use with an overlay

Similar Documents

Publication Publication Date Title
CN102144209B (en) Multi-tiered voice feedback in an electronic device
CN104461346B (en) A kind of method of visually impaired people's Touch Screen, device and intelligent touch screen mobile terminal
CN101002175B (en) Method, apparatus and computer program product to utilize context ontology in mobile device application personalization
CN102428429B (en) Searching Method Of A List And Portable Device Using The Same
CN102929505B (en) Adaptive input language switches
US20190311717A1 (en) Method and apparatus for executing application on basis of voice commands
EP2523107A1 (en) Mobile terminal and system for managing applications using the same
EP3933555A1 (en) Method and device for mapping applications to number keys
KR20180081849A (en) A lock screen method and mobile terminal
TW201243716A (en) Customized launching of applications
CN102426511A (en) System level search user interface
CN103873908A (en) Display apparatus, remote control apparatus, and method for providing user interface using the same
KR100649149B1 (en) Mobile communication device having display-hot keys and method of the same
CN102163120A (en) Prominent selection cues for icons
KR101882293B1 (en) Integrated keyboard for character input and content recommendation
CN103399847A (en) Application language library for managing computing environment language
CN102687504A (en) Method, navigation and display system for widgets on internet-enabled devices
US11907741B2 (en) Virtual input device-based method and system for remotely controlling PC
CN111367458A (en) Barrier-free film for intelligent touch screen mobile terminal and barrier-free touch screen method
Alajarmeh Non-visual access to mobile devices: A survey of touchscreen accessibility for users who are visually impaired
KR20110064629A (en) Operation method and device for optional key map of portable device
KR20140127146A (en) display apparatus and controlling method thereof
KR102044916B1 (en) System and providing method thereof of educational tutorial platform of smart device
WO2016009390A1 (en) Tactile interface for a touchscreen
JP2017531868A (en) Website information providing method and apparatus based on input method

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 15822564

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

32PN Ep: public notification in the ep bulletin as address of the adressee cannot be established

Free format text: NOTING OF LOSS OF RIGHTS PURSUANT TO RULE 112(1) EPC (EPO FORM 1205 DATED 12/05/2017)

122 Ep: pct application non-entry in european phase

Ref document number: 15822564

Country of ref document: EP

Kind code of ref document: A1