KR101864584B1 - Method, Device, and Non-transitory Computer-Readable Medium for Providing Time Information By Tactile Interface Device - Google Patents


Info

Publication number
KR101864584B1
Authority
KR
South Korea
Prior art keywords
output signal
tactile
time
module
information
Prior art date
Application number
KR1020170098689A
Other languages
Korean (ko)
Inventor
조진수
정정일
Original Assignee
가천대학교 산학협력단
주식회사 피씨티
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 가천대학교 산학협력단 and 주식회사 피씨티
Priority to KR1020170098689A
Application granted
Publication of KR101864584B1

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01: Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/016: Input arrangements with force or tactile feedback as computer generated output to the user
    • G: PHYSICS
    • G09: EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09B: EDUCATIONAL OR DEMONSTRATION APPLIANCES; APPLIANCES FOR TEACHING, OR COMMUNICATING WITH, THE BLIND, DEAF OR MUTE; MODELS; PLANETARIA; GLOBES; MAPS; DIAGRAMS
    • G09B 21/00: Teaching, or communicating with, the blind, deaf or mute
    • G09B 21/001: Teaching or communicating with blind persons
    • G09B 21/003: Teaching or communicating with blind persons using tactile presentation of the information, e.g. Braille displays

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Health & Medical Sciences (AREA)
  • Human Computer Interaction (AREA)
  • Audiology, Speech & Language Pathology (AREA)
  • General Health & Medical Sciences (AREA)
  • Business, Economics & Management (AREA)
  • Educational Administration (AREA)
  • Educational Technology (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The present invention relates to a method, a device, and a non-transitory computer-readable medium for providing time information through a tactile interface device. In order to improve the level of informatization of the blind and to increase the efficiency with which they use smart devices (smart phones, smart pads, etc.) or smart braille devices, the blind are enabled to intuitively use, interact with, and control a tactile interface device corresponding to a multi-array tactile-cell-based smart braille device, so that time information can be provided to the blind intuitively and the alarm and timer functions can be used intuitively.

Description

[0001] The present invention relates to a method, an apparatus, and a non-transitory computer-readable medium for providing time information through a tactile interface device.

More particularly, the present invention relates to a method, an apparatus, and a non-transitory computer-readable medium for providing time information via a tactile interface device which, in order to increase the efficiency with which the visually impaired use smart devices (smart phones, smart pads, etc.) or smart braille devices and to improve their level of informatization, allows the visually impaired to intuitively use, interact with, and control a tactile interface device corresponding to a multi-array tactile-cell-based smart braille device, thereby intuitively providing time information to the visually impaired and allowing them to intuitively use functions such as an alarm and a timer.

In the information society, it is essential to acquire and use information using computers, and this tendency is the same for the general public as well as the visually impaired.

In order to improve the level of informatization of the visually impaired, they should be able to acquire and use information at a level similar to that of sighted persons. This is important because it makes the daily life of the visually impaired more convenient, and ultimately because it can improve their welfare by providing various educational opportunities and expanding opportunities for social advancement and participation.

However, the conventional way of using a computer is mainly to visually recognize the visual information output through a monitor and to input information using a tool such as a keyboard, a mouse, or a touch pad. This imposes a great limitation on visually impaired persons, who cannot use sight to perceive the output information of the computer at the same level as sighted persons or to interact with it through immediate information input. As a result, visual impairment significantly degrades the efficiency of computer use and thereby greatly deprives the visually impaired of opportunities for information acquisition and utilization.

Various techniques have been developed that allow the visually impaired to recognize visual information and interact with the computer through hearing, the tactile sense, and the like. A representative technology is the screen reader, which assists the visually impaired in using the computer through hearing: it is a device or software that outputs, as speech, the contents displayed on the computer screen and the keyboard information entered by the user.

However, the screen reader has difficulty in locating screen output information because it navigates the Graphic User Interface (GUI) elements of the output screen as a single line of linearized information, without the two-dimensional spatial position information of the output screen; the more information the screen contains, the greater the difficulty. In addition, since a screen reader provides only a simple textual description for graphic information such as pictures and diagrams, beyond characters and GUI elements, the visually impaired have great difficulty understanding and interacting with graphic information.

Another related art is the braille information terminal, which transmits character information through braille cells by means of the tactile sense. It is used either as an independent device providing some computer functions useful to the visually impaired, or as a screen-output auxiliary device that outputs in braille the text information of the computer screen analyzed by a screen reader. Rather than acting as an interface for efficient interaction with the computer, both serve as substitute devices that perform only a few limited computer functions, or as output auxiliary devices that merely output text information in braille. In particular, a braille information terminal specialized for braille output has the same problem as the screen reader in that it cannot display graphic information.

Korean Published Patent No. 10-2012-0063982

An object of the present invention is to provide a method, an apparatus, and a non-transitory computer-readable medium for providing time information through a tactile interface device which, in order to increase the efficiency with which the visually impaired use smart devices (smart phones, smart pads, etc.) or smart braille devices, allows the visually impaired to intuitively use, interact with, and control a tactile interface device corresponding to a multi-array tactile-cell-based smart braille device, thereby providing time information intuitively to the visually impaired and allowing intuitive use of the alarm and timer functions.

According to an aspect of the present invention, there is provided a method for providing time information through a tactile interface device, the method being implemented in a computing device including a processor, the tactile interface device being connected to the computing device and capable of interacting with a user. The computing device includes: a time notification module for providing time information; and a TUI module for converting information that can be displayed graphically into a form that can be displayed as tactile graphics on the tactile interface device, the TUI module including a first output signal generating unit for converting the display screen into an output signal to the tactile interface device in a first manner, and a second output signal generating unit for converting the display screen into an output signal to the tactile interface device in a second manner. The method includes: a time notification step in which the time notification module is executed in the computing device to output information including time information to a visual display device connected to the computing device; and a first TUI step in which the TUI module is executed in the computing device to generate an output signal for implementing a tactile screen of the tactile interface device corresponding to a screen displayed on the display device by the time notification module, and to generate, from user input received from the tactile interface device, an input signal for input to the time notification module, wherein the first TUI step includes a time notification output signal generation step of generating a time notification output signal for implementing a tactile screen of the tactile interface device corresponding to a screen that can be displayed on the computing device by the time notification module.

In some embodiments, the time notification step may include: a first time representation step of generating first time data representing combined information of at least two of day, hour, minute, and second; and a second time representation step of generating second time data representing each piece of at least one of day, hour, minute, and second. The time notification output signal generation step may include: a first time output signal generation step of generating, by the first output signal generating unit of the TUI module, a first time output signal for controlling the tactile cell array of the tactile interface device so that the first time data is expressed in braille form; and a second time output signal generation step of generating, by the second output signal generating unit of the TUI module, a second time output signal for controlling the tactile cell array of the tactile interface device so that the second time data is expressed in graphical form, wherein the time notification output signal includes the first time output signal and the second time output signal.
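
The split between a braille-form first output signal and a graphic-form second output signal can be pictured with a short sketch. The following Python fragment is only a minimal illustration under assumed conventions: a hypothetical cell-array width, a made-up digit-to-braille-dot table, and a simple bar-style graphic for the second representation; none of these names or encodings come from the patent itself.

```python
# Minimal sketch (assumptions: the dot patterns below are illustrative digit
# cells, not the official Korean braille number table; bar width is arbitrary).
DIGIT_DOTS = {                     # digit -> raised dots of a 2x3 braille cell
    "0": (2, 4, 5), "1": (1,), "2": (1, 2), "3": (1, 4), "4": (1, 4, 5),
    "5": (1, 5), "6": (1, 2, 4), "7": (1, 2, 4, 5), "8": (1, 2, 5), "9": (2, 4),
}

def first_time_output(hour: int, minute: int) -> list[tuple[int, ...]]:
    """Braille-form signal: combined HH:MM rendered as a row of braille cells."""
    return [DIGIT_DOTS[ch] for ch in f"{hour:02d}{minute:02d}"]

def second_time_output(hour: int, minute: int, width: int = 24) -> list[list[int]]:
    """Graphic-form signal: each unit (hour, minute) shown as a raised-pin bar."""
    rows = []
    for value, scale in ((hour, 24), (minute, 60)):
        raised = round(value / scale * width)
        rows.append([1] * raised + [0] * (width - raised))
    return rows

if __name__ == "__main__":
    print(first_time_output(9, 30))   # four braille cells: 0, 9, 3, 0
    print(second_time_output(9, 30))  # two pin rows representing hour and minute
```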

In some embodiments, the time notification output signal includes a control signal for a two-dimensional tactile cell array, and the tactile display of the tactile interface device implemented by the time notification output signal includes: a content area implemented based on the content portion of a screen that can be displayed on the computing device by the time notification module; a cursor area that implements the position and shape of the current user's cursor; and a page area implemented based on the page information of a screen that can be displayed on the computing device by the main screen module, wherein the first time output signal and the second time output signal may be implemented in the content area.

In some embodiments, the content area may be divided into a plurality of sub-content areas, the cursor area may be divided into a plurality of sub-cursor areas, the position of the cursor implemented in the cursor area may correspond to the position of the focused sub-content area, and each sub-cursor area and its corresponding sub-content area may be aligned with each other on one axis. The first time output signal may be displayed in one sub-content area, and in the second time output signal each of one or more of day, hour, minute, and second may be displayed in its own sub-content area.

In some embodiments, the method further includes: an alarm step in which an alarm module is executed within the computing device to output information including alarm settings to a visual display device connected to the computing device; and a second TUI step in which the TUI module is executed in the computing device to generate an output signal for implementing a tactile screen of the tactile interface device corresponding to a screen displayed on the display device by the alarm module, and to generate, from user input received from the tactile interface device, an input signal for input to the alarm module, wherein the second TUI step includes an alarm output signal generation step of generating an alarm output signal for implementing a tactile display of the tactile interface device corresponding to a screen that can be displayed on the computing device by the alarm module.

In some embodiments, the alarm output signal includes a control signal for a two-dimensional tactile cell array, and the tactile display of the tactile interface device implemented by the alarm output signal includes: a content area implemented based on the content portion of a screen that can be displayed on the computing device by the alarm module; a cursor area that implements the position and shape of the current user's cursor; and a page area implemented based on the page information of a screen that can be displayed on the computing device by the main screen module.

In some embodiments, the alarm step may include: an alarm status display step of generating alarm status data representing the currently set alarm status; and an alarm setting input step of setting or changing the alarm status according to the user's input to the tactile interface device.

In some embodiments, the alarm setting input step may include an alarm braille input step of setting or changing an item of the alarm status by the user's braille input to the tactile interface device, or an alarm keypad input step of setting or changing an item of the alarm status by the user's keypad input to the tactile interface device.

In some embodiments, the method further includes: a timer step in which a timer module is executed within the computing device to output information including a timer status and a timer setting to a visual display device connected to the computing device; and a third TUI step in which the TUI module is executed in the computing device to generate an output signal for implementing a tactile screen of the tactile interface device corresponding to a screen displayed on the display device by the timer module, and to generate, from user input received from the tactile interface device, an input signal for input to the timer module, wherein the third TUI step includes a timer output signal generation step of generating a timer output signal for implementing a tactile display of the tactile interface device corresponding to a screen that can be displayed on the computing device by the timer module.

In some embodiments, the timer output signal includes a control signal for a two-dimensional tactile cell array, and the tactile display of the tactile interface device implemented by the timer output signal includes: a content area implemented based on the content portion of a screen that can be displayed on the computing device by the timer module; a cursor area that implements the position and shape of the current user's cursor; and a page area implemented based on the page information of a screen that can be displayed on the computing device by the main screen module.

In some embodiments, the timer step generates timer data representing real-time information of at least one of hour, minute, and second of the currently set timer, and the timer output signal generation step may generate, by the second output signal generating unit of the TUI module, a timer output signal for controlling the tactile cell array of the tactile interface device so that the timer information is expressed in graphical form.

In some embodiments, the timer output signal includes a control signal for a two-dimensional tactile cell array, and the tactile display of the tactile interface device implemented by the timer output signal includes: a content area implemented based on the content portion of a screen that can be displayed on the computing device by the timer module; a cursor area that implements the position and shape of the current user's cursor; and a page area implemented based on the page information of a screen that can be displayed on the computing device by the main screen module, and the timer output signal may be implemented in the content area.

In some embodiments, the content area may be divided into a plurality of sub-content areas, the cursor area may be divided into a plurality of sub-cursor areas, the position of the cursor implemented in the cursor area may correspond to the position of the focused sub-content area, and each sub-cursor area and its corresponding sub-content area may be aligned with each other on one axis, wherein each of one or more of hour, minute, and second in the timer output signal may be displayed in its own sub-content area.

In some embodiments, the first TUI step further includes a buffer output signal generation step of generating a buffer output signal based on the time notification output signal and the previous tactile display information of the tactile interface device, wherein the time notification output signal and the buffer output signal may include control signals for a two-dimensional tactile cell array.

In some embodiments, the first TUI step is performed each time the screen that can be displayed on the computing device is changed by the time notification module, and the buffer output signal generation step generates a buffer output signal based on the difference between the immediately preceding output signal and the current, changed output signal.
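
A buffer output based on the difference between the previous and current cell-array states could be computed as in the sketch below; this is only an assumed realization (representing an output signal as a 2D list of 0/1 pin states, and re-driving only changed cells, is illustrative rather than taken from the patent text).

```python
# Minimal sketch: compute which tactile cells changed between the previous and
# the current output signal, so only those cells need to be re-driven.
def buffer_output_signal(previous: list[list[int]],
                         current: list[list[int]]) -> list[tuple[int, int, int]]:
    """Return (row, col, new_state) for every cell whose state changed."""
    changes = []
    for r, (prev_row, cur_row) in enumerate(zip(previous, current)):
        for c, (old, new) in enumerate(zip(prev_row, cur_row)):
            if old != new:
                changes.append((r, c, new))
    return changes

# Example: when only the minute digit changes, the buffer signal contains just
# the pins belonging to that digit, leaving the rest of the display untouched.
```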

In order to solve the above problems, the present invention also provides a non-transitory computer-readable medium storing instructions which cause a computing device to provide time information to a visually impaired person by controlling a tactile interface device connected to the computing device, the instructions causing the computing device to perform the following steps: a time notification step of outputting information including time information to a visual display device connected to the computing device; and a first TUI step of generating an output signal for implementing a tactile screen of the tactile interface device corresponding to a screen displayed on the display device by the time notification step, and of generating, from user input received from the tactile interface device, an input signal for input to the time notification module, wherein the first TUI step includes a time notification output signal generation step of generating a time notification output signal for implementing a tactile screen of the tactile interface device corresponding to a screen that can be displayed on the computing device by the time notification step.

In the present invention, the time notification step may include: a first time representation step of generating first time data expressing combined information of at least two of day, hour, minute, and second; and a second time representation step of generating second time data representing each piece of at least one of day, hour, minute, and second. The first TUI step may include: a first time output signal generation step of generating a first time output signal for controlling the tactile cell array of the tactile interface device so that the first time data is expressed in braille form; and a second time output signal generation step of generating a second time output signal for controlling the tactile cell array of the tactile interface device so that the second time data is expressed in graphical form, wherein the time notification output signal includes the first time output signal and the second time output signal.

According to an embodiment of the present invention, a Tactile User Interface (TUI) for multi-array tactile-cell-based smart braille devices, and an operating environment for a braille or tactile interface device based on a main operating system such as Android for interlocking with, controlling, and utilizing applications, can be provided so that the visually impaired can enjoy various contents (books, education, multimedia, the Internet, games, etc.) in daily life, such as education, at a level similar to that of sighted persons.

According to an embodiment of the present invention, visual information (information that can be perceived by sight) provided by a smart device is converted into tactile information (information that can be perceived by the tactile sense) and output on a multi-array tactile-cell-based tactile display; the various inputs of the user are managed; and the various application software operating in the smart device can be controlled and managed.

According to an embodiment of the present invention, the operation of the operating system (Android OS) and the applications of a smart braille device is controlled through the input units of a multi-array tactile-cell-based smart braille device (braille keyboard / direction keys), and the Graphic User Interface (GUI) information output to the screen by the device is converted into Tactile User Interface (TUI) data for the visually impaired and provided as tactile and braille information, so that the visually impaired can more easily control a general smart device or smart braille device and utilize various contents.

According to an embodiment of the present invention, intuitive time information, such as the time of day and the date, can be provided to the visually impaired based on the multi-array tactile-cell-based smart braille device and the TUI.

According to an embodiment of the present invention, a specified-time notification function and a timer function that provide a simplified time-information input interface can be provided to the visually impaired.

According to an embodiment of the present invention, in addition to recognizing time by listening to voice output, which is the time-recognition method most familiar to the visually impaired, the effect of converting braille characters into a braille-graphic form that is displayed and updated as the time changes can be obtained.

According to an embodiment of the present invention, a time notation method that provides the combined time information as braille, and an intuitive time notation method that allows the blind to intuitively recognize the time information, can be provided.

FIG. 1 is a schematic diagram of an overall system including a tactile interface device and a computing device according to an embodiment of the present invention.
FIG. 2 is a schematic diagram illustrating an internal configuration of a computing device according to an embodiment of the present invention.
FIG. 3 is a diagram schematically showing an operating environment of the braille OS unit according to an embodiment of the present invention.
FIG. 4 is a diagram schematically showing an internal configuration of a braille OS unit according to an embodiment of the present invention.
FIG. 5 is a schematic diagram illustrating an internal configuration of a TUI module according to an embodiment of the present invention.
FIG. 6 schematically shows a tactile display screen in a tactile interface device according to an embodiment of the present invention.
FIG. 7 schematically illustrates a method of controlling a tactile interface device according to an embodiment of the present invention.
FIG. 8 schematically illustrates a detailed procedure of a trigger step according to an embodiment of the present invention.
FIG. 9 schematically shows a display screen in a computing apparatus and a tactile display in the tactile interface apparatus according to an embodiment of the present invention.
FIG. 10 schematically shows the area configuration of a tactile display in a tactile interface device according to an embodiment of the present invention.
FIG. 11 schematically shows an input / output process of the tactile interface device according to an embodiment of the present invention.
FIG. 12 schematically shows a process of generating an output signal according to an embodiment of the present invention.
FIG. 13 schematically shows a process of generating a cursor area output signal according to an embodiment of the present invention.
FIG. 14 schematically shows an example of a form of cursor area notation according to an embodiment of the present invention.
FIG. 15 schematically illustrates some examples of cursor area notation according to an embodiment of the present invention.
FIG. 16 schematically shows a process of generating a buffer output signal according to an embodiment of the present invention.
FIG. 17 schematically shows an example of a TTS step according to an embodiment of the present invention.
FIG. 18 schematically illustrates an example of an event step according to an embodiment of the present invention.
FIG. 19 shows a screen for controlling the tactile interface device according to an embodiment of the present invention.
FIG. 20 shows a screen for controlling the tactile interface device according to an embodiment of the present invention.
FIG. 21 shows a screen for controlling the tactile interface device according to an embodiment of the present invention.
FIG. 22 schematically shows the configuration of a braille clock module and a tactile interface device according to an embodiment of the present invention.
FIG. 23 schematically shows an internal configuration of a braille clock module according to an embodiment of the present invention.
FIG. 24 schematically shows an internal configuration of a TUI module of a braille clock module according to an embodiment of the present invention.
FIG. 25 schematically shows an internal configuration of a time notification module according to an embodiment of the present invention.
FIG. 26 schematically shows an internal configuration of an alarm module according to an embodiment of the present invention.
FIG. 27 schematically shows an internal configuration of a timer module according to an embodiment of the present invention.
FIG. 28 schematically illustrates the main steps of a method for providing time information according to an embodiment of the present invention.
FIG. 29 schematically illustrates sub-steps of the time notification step and the first TUI step according to an embodiment of the present invention.
FIG. 30 schematically shows data processing of the first TUI step according to an embodiment of the present invention.
FIG. 31 schematically illustrates an alarm step and a second TUI step according to an embodiment of the present invention.
FIG. 32 schematically illustrates a detailed process of an alarm step according to an embodiment of the present invention.
FIG. 33 schematically illustrates a timer step and a third TUI step according to an embodiment of the present invention.
FIG. 34 schematically illustrates a detailed process of a timer step according to an embodiment of the present invention.
FIG. 35 shows a screen for the time notification step and the corresponding TUI-converted screen of the tactile interface device according to an embodiment of the present invention.
FIG. 36 shows a screen for the time notification step and the corresponding TUI-converted screen of the tactile interface device according to an embodiment of the present invention.
FIG. 37 shows a screen for the alarm step and the corresponding TUI-converted screen of the tactile interface device according to an embodiment of the present invention.
FIG. 38 shows a screen for the alarm step and the corresponding TUI-converted screen of the tactile interface device according to an embodiment of the present invention.
FIG. 39 shows a screen for the alarm step and the corresponding TUI-converted screen of the tactile interface device according to an embodiment of the present invention.
FIG. 40 shows a screen for the alarm step and the corresponding TUI-converted screen of the tactile interface device according to an embodiment of the present invention.
FIG. 41 shows a screen for the alarm step and the corresponding TUI-converted screen of the tactile interface device according to an embodiment of the present invention.
FIG. 42 shows a screen for the alarm step and the corresponding TUI-converted screen of the tactile interface device according to an embodiment of the present invention.
FIG. 43 shows a screen for the timer step and the corresponding TUI-converted screen of the tactile interface device according to an embodiment of the present invention.
FIG. 44 shows a screen for the timer step and the corresponding TUI-converted screen of the tactile interface device according to an embodiment of the present invention.
FIG. 45 shows a screen for the timer step and the corresponding TUI-converted screen of the tactile interface device according to an embodiment of the present invention.
FIG. 46 is a diagram showing an example of an internal configuration of a computing device according to an embodiment of the present invention.

In the following, various embodiments and/or aspects are described with reference to the drawings. In the following description, for purposes of explanation, numerous specific details are set forth in order to provide a thorough understanding of one or more aspects. However, it will be appreciated by those of ordinary skill in the art that such aspect(s) may be practiced without these specific details. The following description and the annexed drawings set forth certain illustrative aspects in detail; these aspects are, however, merely indicative of some of the various ways in which the principles of the various aspects may be employed, and the description is intended to include all such aspects and their equivalents.

In addition, various aspects and features will be presented in terms of a system that may include multiple devices, components, and/or modules, and so forth. It should be understood and appreciated that the various systems may include additional devices, components, and/or modules, and/or may not include all of the devices, components, modules, etc. discussed in connection with the drawings.

As used herein, terms such as "an embodiment", "an example", and "an aspect" should not be construed as meaning that any aspect or design described is better than, or advantageous over, other aspects or designs. The terms 'component', 'module', 'system', 'interface', and the like used below generally refer to a computer-related entity, and may mean, for example, hardware, a combination of hardware and software, or software.

It should also be understood that the terms "comprises" and/or "comprising" mean that the stated feature and/or component is present, but do not exclude the presence or addition of one or more other features, components, and/or combinations thereof.

Also, terms including ordinal numbers such as 'first' and 'second' may be used to describe various elements, but the elements are not limited by these terms. The terms are used only to distinguish one element from another; for example, without departing from the scope of the present invention, a first element may be referred to as a second element, and similarly, a second element may be referred to as a first element. The term 'and/or' includes any combination of a plurality of related listed items or any one of a plurality of related listed items.

Furthermore, in the embodiments of the present invention, all terms used herein, including technical and scientific terms, unless otherwise defined, have the same meaning as commonly understood by those of ordinary skill in the art to which this invention belongs. Terms such as those defined in commonly used dictionaries are to be interpreted as having a meaning consistent with their meaning in the context of the related art and, unless explicitly defined in the embodiments of the present invention, are not to be interpreted as having an ideal or overly formal meaning.

1. Braille OS module as a virtual operating system controlling the tactile interface device

FIG. 1 is a schematic diagram of an overall system including a tactile interface device and a computing device according to an embodiment of the present invention.

The tactile interface device 1000 shown in FIG. 1 is only one example, and the present invention is not limited thereto. The tactile interface device 1000 connected to the computing device A of the present invention described below encompasses any interface device capable of providing tactile graphics, and the computing device A encompasses any computing device capable of outputting tactile graphics to such a device.

As shown in FIG. 1, the computing device A is connected to the tactile interface device 1000, and the connection may be either wired or wireless.

The computing device A may include a smart phone, a tablet personal computer (PC), a mobile phone, a video phone, an e-book reader, a desktop PC, a laptop PC, a netbook PC, a personal digital assistant (PDA), a portable medical device, a portable multimedia player (PMP), an MP3 player, a mobile medical device, a camera, or a wearable device (for example, a head-mounted device (HMD), electronic apparel, an electronic bracelet, an electronic necklace, an electronic tattoo, or a smart watch), and the like.

Such a computing device A may include one or more processors and memories, and may be optionally connected to a display device such as a monitor or may have its own display module.

Alternatively, the computing device A may be integrated with the tactile interface device 1000. In this case, the tactile interface device 1000 and the computing device A that controls it may be recognized as a single device. In such an integrated device, the computing device A and the tactile interface device 1000 may share a processor and memory.

The tactile interface apparatus 1000 shown in FIG. 1 includes an external apparatus connection unit 1100 capable of performing a wired or wireless communication with the computing apparatus A and the like; A tactile display unit 1200 for providing tactile information to a user through a plurality of pins; A direction key unit 1300 for changing the position of the input coordinate or detail to be focused; A keyboard unit 1400 for receiving information from a user in a key input form; And a control unit 1500 for controlling operations of the external device connection unit 1100, the tactile display unit 1200, the direction key unit 1300, and the keyboard unit 1400.

The external device connection unit 1100 includes at least one of a communication module unit capable of performing wireless communication and a wired connection unit capable of a wired connection with the external device. The communication module unit may include one or more of a Bluetooth communication module, a Zigbee communication module, an infrared communication module, a Bluetooth low energy (BLE) communication module, an audio communication module, an LTE (long term evolution) communication module, a WiFi communication module, a wireless LAN (WLAN) module, a WiBro module, and a wireless USB module. The wired connection unit 1120 may include a connection module using a universal serial bus (USB) interface, or any other wired connection module capable of transmitting and receiving data.

The tactile display unit 1200 provides tactile information as tactile pixels of at least one dimension, and each tactile pixel is constituted by a plurality of pins that move up and down when power is applied to a transducer including a piezoelectric ceramic and an elastic body. Preferably, the tactile pixels provide tactile information in two dimensions.

The tactile display unit 1200 may include: a display data receiving unit for receiving data generated from data received from an external user terminal or generated within the tactile interface device; a tactile data converter for converting the data into tactile display data; a plurality of pin driving modules driven by the tactile display data; and a driving power supply unit supplying the power for driving the tactile display unit 1200, and it provides tactile information or tactile graphics based on the received data.

The tactile display unit 1200 may display or provide tactile pixels of one or more dimensions. In one example, the tactile pixel is constituted by a plurality of pins moving up and down by applying power to a transducer including a piezoelectric ceramic and an elastic body.

Specifically, the tactile pixel is represented by a plurality of pin drive module groups, and each of the pin drive module groups is made up of a plurality of pin drive modules. Alternatively, the entire pin drive module may constitute one pin drive module group.

Meanwhile, the direction key unit 1300 changes the position of the input coordinate or detail to be focused.

The keyboard unit 1400 may be composed of a plurality of keys, and the input of each key can be converted by the keyboard unit into a command for an application being executed in the computing device A.

Meanwhile, the user performs input to the tactile interface apparatus through the direction key unit 1300 and the keyboard unit 1400, and the tactile interface apparatus 1000 converts the input command or information and transmits it to the computing device A.

Preferably, the keyboard unit 1400 includes a braille keyboard that converts the braille characters generally used by the visually impaired into ordinary characters and transmits a character input signal to the computer.

The keyboard unit 1400 receives the braille and transmits it to the tactile interface apparatus 1000 or the computing apparatus A connected to the tactile interface apparatus 1000. The keyboard unit 1400 may include a point key, a shortcut key, and an execution or space key.

Because a single braille letter consists of several dots, when the corresponding braille keys are pressed at the same time, the information of the resulting braille character can be transmitted. The transmitted braille information may then be translated into an ordinary character by software in the tactile interface apparatus 1000 or software of the computing device.
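
A chorded braille-key input of this kind could be decoded roughly as follows; the dot-to-letter table below is an illustrative assumption only (the actual Korean braille tables and key layout of the device are not specified here).

```python
# Minimal sketch: keys pressed simultaneously form a set of dots (1..6),
# which is then looked up as one braille character and mapped to text.
# The table covers only a few Latin letters for illustration.
DOTS_TO_CHAR = {
    frozenset({1}): "a",
    frozenset({1, 2}): "b",
    frozenset({1, 4}): "c",
    frozenset({1, 4, 5}): "d",
    frozenset({1, 5}): "e",
}

def decode_chord(pressed_keys: set[int]) -> str:
    """Translate one simultaneous press of braille dot keys into a character."""
    return DOTS_TO_CHAR.get(frozenset(pressed_keys), "?")

print(decode_chord({1, 4, 5}))  # -> "d"
```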

For this reason, the tactile display unit 1200 plays the same role as the monitor of a general computer, while the direction key unit 1300 and the keyboard unit 1400 function as a keyboard and serve as input devices.

FIG. 2 is a schematic diagram illustrating an internal configuration of a computing device according to an embodiment of the present invention.

A computing device that controls the tactile interface device 1000 according to the present embodiment may include a processor, a bus (corresponding to the bidirectional arrows between the processor, the memory, and the network interface unit), a network interface unit, and a memory. The memory may include braille OS unit execution code 3400, general OS unit execution code 3500, application program unit execution code 3600, and an input/output control routine 3700. The processor includes a braille OS unit 2000, a general OS unit 3100, and an application program unit 3200. Here, the general OS unit 3100 corresponds to the main operating system of the computing device, and may be, for example, Google's ANDROID OS or Microsoft's WINDOWS OS.

In other embodiments, the computing device A that controls the tactile interface device 1000 may include more components than the components of FIG.

The memory may be a computer-readable recording medium and may include a permanent mass storage device such as a random access memory (RAM), a read only memory (ROM), and a disk drive. Braille OS unit executable code 3400, general OS executable code 3500, application program executable code 3600, and input / output control routine 3700 can be stored in the memory. These software components may be loaded from a computer readable recording medium separate from the memory using a drive mechanism (not shown). Such a computer-readable recording medium may include a computer-readable recording medium (not shown) such as a floppy drive, a disk, a tape, a DVD / CD-ROM drive, or a memory card. In other embodiments, the software components may be loaded into the memory via the network interface 3300 rather than from a computer readable recording medium.

The bus may enable communication and data transfer between components of a computing device controlling the tactile interface device. The bus may be configured using a high-speed serial bus, a parallel bus, a Storage Area Network (SAN), and / or other suitable communication technology.

The network interface unit 3300 may be a computer hardware component for connecting the computing device A that controls the tactile interface device 1000 to a computer network. The network interface unit 3300 may connect the computing device to a computer network via a wireless or wired connection, and the computing device that controls the tactile interface device may likewise be connected to the tactile interface device wirelessly or by wire through the network interface unit 3300.

The processor may be configured to process the instructions of a computer program by performing basic arithmetic, logic, and input/output operations of the computing device that controls the tactile interface device. The instructions may be provided to the processor by the memory or the network interface unit 3300 via the bus. The processor may be configured to execute the braille OS unit 2000, the general OS unit 3100, and the application program unit 3200, and such program codes may be stored in a recording device such as the memory.

The braille OS unit 2000 may be configured to perform a method of controlling the tactile interface apparatus 1000 to be described below. The above-mentioned processor may omit some components according to the method of controlling the tactile interface device, or may further include additional components not shown, or two or more components may be combined.

FIG. 3 is a diagram schematically showing an operating environment of the braille OS unit according to an embodiment of the present invention.

The general OS unit 3100 corresponds to a software module implementing the main operating system of the computing device A, and the braille OS unit 2000 and the application program unit 3200 operate under the control of the general OS unit 3100. That is, the method for controlling the tactile interface device according to the present invention may be implemented by a braille OS unit corresponding to a virtual sub operating system for the tactile interface device, which operates under the control of the main operating system of the computing device A.

The braille OS unit 2000 controls the overall operation and input/output of the tactile interface apparatus 1000, and at the same time controls the input/output between the application program unit 3200 operating under the control of the general OS unit 3100 and the tactile interface apparatus 1000.

The braille OS unit 2000 may include an application program for controlling the environment of the tactile interface apparatus 1000 and an embedded basic program capable of inputting and outputting data from the tactile interface apparatus 1000.

Meanwhile, in the environment of the braille OS unit 2000, developers can develop software or application programs for the visually impaired based on the display of the actual computing device, for example the display of a smartphone, and the input/output to and from the tactile interface device 1000 can then be handled by the braille OS unit 2000.

In other words, with the braille OS unit 2000 performing the method of controlling the tactile interface device of the present invention, software with an ordinary GUI developed by an ordinary developer can be implemented on the tactile interface apparatus 1000 in a form capable of input and output, which has the effect that ordinary software developers can supply a variety of software to the visually impaired.

FIG. 4 is a view schematically showing an internal configuration of the braille OS unit according to an embodiment of the present invention. As shown in FIG. 4, the braille OS unit 2000 includes a trigger module 2100, a main screen module 2200, an environment setting module 2300, a TUI (Tactile User Interface) module 2400, a built-in application module 2500, a text-to-speech (TTS) module 2600, and an event module 2700.

The trigger module 2100 confirms the connection between the computing device A and the tactile interface device 1000, and calls the execution of the main screen module 2200 when it determines that the computing device A and the tactile interface device 1000 are interconnected.

The braille OS unit 2000 is basically implemented in a computing device that interfaces with a visual display. Therefore, in order to use the tactile display device by executing each module of the Braille OS portion in the computing device, the Braille OS portion must be executed.

In general, the execution of the application on the smart phone is performed by the user checking the application icon and the touch input on the GUI of the smartphone, but this operation may be difficult for the visually impaired.

Unlike applications executed on a general computing device, the execution of the braille OS unit must be initiated by a visually impaired person for whom visual recognition is difficult. Therefore, the trigger module 2100 operates in the background of the main operating system, and when the blind user establishes a connection between the tactile interface device and the computing device A (preferably by pressing a physical button on the tactile interface device), the trigger module 2100 first calls the execution of the main screen module 2200.
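
The background trigger behavior described above might look roughly like the sketch below. This is only an assumed illustration: the polling loop, the `is_device_connected()` check, and the `launch_main_screen()` call are hypothetical names, not APIs defined in the patent.

```python
import time

def is_device_connected() -> bool:
    """Hypothetical check of the wired/wireless link to the tactile interface device."""
    return False  # placeholder; e.g., query a USB or Bluetooth connection state

def launch_main_screen() -> None:
    """Hypothetical call that starts the main screen module of the braille OS unit."""
    pass

def trigger_module_loop(poll_interval: float = 1.0) -> None:
    """Run in the background of the main OS; when the blind user connects the
    tactile interface device (e.g., via a physical button), call the main screen."""
    connected = False
    while True:
        now_connected = is_device_connected()
        if now_connected and not connected:
            launch_main_screen()          # called once, on the connect transition
        connected = now_connected
        time.sleep(poll_interval)
```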

As shown in FIG. 9, the main screen module 2200 displays the main screen of the braille OS unit, analogous to a Windows desktop, on which the user can select a desired module or function.

Preferably, the main screen module 2200 provides an interface to be displayed on the computing device, as shown in FIG. 9(A). That is, when the main screen module is executed, the visual display device connected to the computing device displays the GUI interface shown in FIG. 9(A). At the same time, the output signal of the GUI interface is converted by the TUI module 2400 into an output signal of the tactile interface and transmitted to the tactile interface device 1000, so that the visually impaired person can recognize the main screen tactilely. An application program built into the braille OS unit, or an external application program that can operate under the control of the braille OS unit, can be produced to provide such a GUI.

On the other hand, the main screen module 2200 is provided by the braille OS unit 2000. That is, when the braille OS unit 2000 is executed, the main screen module provided by the braille OS unit, that is, the sub-operating system executed under the control of the main operating system, is executed. Alternatively, the main screen module can be executed by the trigger module.

The environment setting module 2300 is a module that provides an interface for setting the environment of the braille OS unit 2000 and/or the environment of the tactile interface device 1000. Similarly, when the environment setting module 2300 is executed, an interface for environment setting is provided on the display device connected to the computing device A, and at the same time the TUI module 2400 converts the interface into a form that the tactile interface device 1000 can input and output, and transmits it to the device.

The TUI module 2400 generates an output signal for realizing a tactile screen of the tactile interface device 1000 corresponding to a screen that can be displayed on the computing device A by the main screen module or another application module, and generates an input signal for input to the main screen module 2200 from the user input received from the tactile interface apparatus 1000.

That is, the TUI module 2400 converts the GUI provided by the main screen module 2200, by the built-in applications of the braille OS unit 2000, or by an external application operating under the control of the main operating system into a TUI that can interface with the tactile interface device 1000. Accordingly, if an ordinary developer develops only ordinary software operating on the computing device A, the interface of that software is converted by the TUI module 2400 into a form capable of interfacing with the tactile interface device, so the software can be used by the visually impaired.

That is, in the braille OS unit 2000, an application module is called according to the user's input in the main screen module 2200, and a second TUI step of executing the TUI module provided by the sub operating system is performed; in this second TUI step, an output signal for implementing a tactile screen of the tactile interface device corresponding to a screen that can be displayed on the computing device by the application module is generated, and an input signal for input to the application module is generated from the user input received from the tactile interface device.

The built-in application module 2500 may correspond to applications such as an alarm, a clock, a basic document creator, and a basic document viewer. Like the main screen module, a built-in application may be produced with a GUI intended for the general public; after conversion by the TUI module 2400, it can be used by the blind through the tactile interface device.

The TTS module 2600 may itself perform a TTS function for elements displayed or focused by the main screen module 2200, the environment setting module 2300, or another application module, or it may request TTS execution by providing the text to be spoken to a module that performs the TTS function.

The event module 2700 generates an event notification output signal to the tactile interface device based on the connection state between the user terminal and the tactile interface device. Since a visually impaired user cannot easily notice, for example, a connection failure, the braille OS unit provides the event module so that information about the connection state can be delivered to the tactile interface device immediately.

FIG. 5 is a schematic diagram illustrating an internal configuration of a TUI module according to an embodiment of the present invention.

The TUI module includes: an output signal generation unit 2410 that generates an output signal for realizing a tactile screen of the tactile interface apparatus 1000 corresponding to a screen that can be displayed on the computing device A by the main screen module 2200 or another application; an input signal generation unit 2420 that converts an input such as a key input on the tactile interface device into a form that can be input to the main screen module or another application running on the computing device and generates an input signal; a scaling information load unit 2430 that, during operation of the output signal generation unit, loads scaling information suited to the pixels of the tactile display unit of the connected tactile interface device; and a buffer output signal generation unit 2440 that generates a buffer output signal based on the output signal and the previous tactile display information of the tactile interface device.
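
One way to picture how these four units fit together is the skeleton below; the class and method names are hypothetical and merely mirror the reference numerals 2410 to 2440, not an actual API of the braille OS unit.

```python
class TUIModule:
    """Sketch of the TUI module structure (units 2410-2440), under assumed interfaces."""

    def __init__(self, cell_rows: int, cell_cols: int):
        self.cell_rows, self.cell_cols = cell_rows, cell_cols   # from scaling info (2430)
        self.previous_cells = [[0] * cell_cols for _ in range(cell_rows)]

    def load_scaling_info(self, device_info: dict) -> None:
        """2430: adopt the pin/cell resolution reported by the connected device."""
        self.cell_rows = device_info["rows"]
        self.cell_cols = device_info["cols"]

    def generate_output_signal(self, screen: dict) -> list[list[int]]:
        """2410: convert a GUI screen description into a 2D cell-array state."""
        cells = [[0] * self.cell_cols for _ in range(self.cell_rows)]
        # ... fill content / cursor / page areas based on `screen` ...
        return cells

    def generate_buffer_output_signal(self, cells: list[list[int]]) -> list[tuple[int, int, int]]:
        """2440: emit only the cells that differ from the previous display state."""
        diff = [(r, c, cells[r][c])
                for r in range(self.cell_rows) for c in range(self.cell_cols)
                if cells[r][c] != self.previous_cells[r][c]]
        self.previous_cells = cells
        return diff

    def generate_input_signal(self, device_event: dict) -> dict:
        """2420: map a key / braille / direction-key event to an application command."""
        return {"command": device_event.get("key", "unknown")}
```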

Here, the output signal and the buffer output signal include a control signal for a two-dimensional haptic cell array.

A program module including the main screen module 2200 and the like can be converted into an input / output signal through the TUI module 2400 to the tactile interface device.

The scaling information load unit 2430 preferably loads the tactile display pixel information in such a way that the tactile display information of the tactile interface device is automatically received from the connected tactile interface device. Alternatively, such tactile display pixel information may be stored in the memory of the computing device A.
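
As a rough illustration of what such scaling information is used for, the helper below maps a GUI coordinate onto the connected device's cell grid; the resolutions used are arbitrary assumptions, not values from the patent.

```python
def scale_to_cells(x: int, y: int,
                   screen_w: int, screen_h: int,
                   cell_cols: int, cell_rows: int) -> tuple[int, int]:
    """Map a GUI pixel coordinate to a tactile cell coordinate for the connected device."""
    col = min(cell_cols - 1, x * cell_cols // screen_w)
    row = min(cell_rows - 1, y * cell_rows // screen_h)
    return row, col

# Example with assumed resolutions: a 1080x1920 phone screen and a 10x30 cell display.
print(scale_to_cells(540, 960, 1080, 1920, 30, 10))  # -> (5, 15)
```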

FIG. 6 schematically shows a tactile display screen in a tactile interface device according to an embodiment of the present invention.

As described above, the TUI module 2400 includes the output signal generation unit 2410, which generates an output signal for implementing a tactile screen of the tactile interface apparatus 1000 corresponding to a screen that can be displayed on the computing device A.

Meanwhile, the output signal includes a control signal for a two-dimensional tactile cell array, and the tactile display of the tactile interface device implemented by the output signal may include: a content area L1 implemented based on the content portion of a screen that can be displayed on the computing device by the main screen module 2200; a cursor area L2 that implements the position and shape of the current user's cursor; and a page area L3 implemented based on the page information of a screen that can be displayed on the computing device by the main screen module.
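
The three-area layout (content L1, cursor L2, page L3) could be laid out on the 2D cell array roughly as below; the grid size and column ranges are illustrative assumptions, not dimensions given in the patent.

```python
# Minimal sketch: partition an assumed 10x32 cell array into cursor (L2),
# content (L1), and page (L3) columns, then fill each area independently.
ROWS, COLS = 10, 32
CURSOR_COLS = range(0, 2)     # L2: leftmost columns mark the focused row
CONTENT_COLS = range(2, 28)   # L1: braille / tactile-graphic content
PAGE_COLS = range(28, 32)     # L3: current page indicator

def compose_screen(content_rows: list[list[int]], cursor_row: int,
                   page_index: int) -> list[list[int]]:
    cells = [[0] * COLS for _ in range(ROWS)]
    for r, row in enumerate(content_rows[:ROWS]):              # content area
        for i, c in enumerate(CONTENT_COLS):
            cells[r][c] = row[i] if i < len(row) else 0
    for c in CURSOR_COLS:                                       # cursor area
        cells[cursor_row][c] = 1
    for i, c in enumerate(PAGE_COLS):                           # page area
        cells[0][c] = 1 if i == page_index else 0
    return cells
```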

Meanwhile, as shown in FIG. 6, the tactile interface device is provided with various physical keys, and each physical key is assigned a function so that it can be used more intuitively. When a physical key is pressed, the TUI module 2400 generates the corresponding command for the application operating on the computing device and inputs that command to the application.
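
A physical-key-to-command mapping of this sort might be expressed as a simple lookup table, as in the sketch below; the key names and command strings are purely hypothetical.

```python
# Minimal sketch: translate a physical key press on the device into an
# application command (names are illustrative, not the device's real key set).
KEY_COMMANDS = {
    "prev_item": "focus_previous",
    "next_item": "focus_next",
    "execute":   "activate_focused",
    "page_up":   "previous_page",
    "page_down": "next_page",
}

def key_to_command(key_name: str) -> str:
    return KEY_COMMANDS.get(key_name, "noop")

print(key_to_command("execute"))  # -> "activate_focused"
```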

FIG. 7 schematically illustrates a method of controlling a tactile interface device according to an embodiment of the present invention.

In step S10, a triggering step is performed in which the blind user turns the tactile interface device on, or presses a key, to connect the tactile interface device to the computing device, thereby calling the execution of the main screen module.

As described above, the trigger module 2100 of the braille OS unit 2000 operates in the background under the control of the main operating system of the computing device, and when the above conditions are satisfied, it calls the execution of the main screen module 2200.

For a visually impaired user, the triggering step S10 preferably calls the execution of the main screen module 2200, because it is more advantageous to start from the first page, i.e. the main screen.

In step S20, the main screen module 2200 is executed. On the GUI display connected to the computing device A, the screen of the main screen module 2200 is displayed visually as a result of this execution. As described above, since the main screen module 2200 is fundamentally displayed visually, developers can develop the main screen module 2200 or other applications in the ordinary way, without having to consider braille or the visually impaired. In addition, if a blind user has a problem using a specific application, a sighted person can operate an ordinary interface element connected to the computing device (for example, a touch display) to resolve the problem the blind user encountered while using the application.

In step S30, the TUI module 2400 is executed for the main screen displayed by the main screen module 2200: the output elements of the main screen module 2200 are converted into driving signals for the tactile display, and inputs from the tactile interface device 1000 are converted into commands in a form that can be input to the main screen module 2200. This is referred to as the first TUI step (S30) for convenience.

In step S40, a separate application is executed by the user's operation. The additional application may be an application provided by the braille OS unit 2000 or an application provided separately from the braille OS unit 2000.

In step S50, the TUI module 2400 is executed on a screen displayed by the separate application, converts the output elements of the screen into a driving signal of the tactile display unit 1200 of the tactile interface apparatus 1000, and converts the input from the tactile interface device into a command that can be input to the application. This is referred to as a second TUI step (S50) for convenience. Here, the second TUI step may also be performed by the TUI module 2400 provided by the braille OS unit 2000.

In step S60, the execution of the TTS function for a specific area is called according to the user's operation of the tactile interface device. Preferably, the braille OS unit 2000 determines the part to be the TTS target and requests the TTS application operated by the main OS to perform the TTS function.

In step S70, an event module for notifying the tactile interface apparatus 1000 is executed based on the connection state of the computing device and the tactile interface apparatus.

FIG. 8 schematically illustrates a detailed procedure of the triggering step according to an embodiment of the present invention.

The triggering step may be performed before the step of executing the main screen module 2200. That is, when a sighted user has executed the braille OS unit 2000 on the computing device but the main screen module 2200 has not been called, the visually impaired user may turn on the power of the tactile interface device (which is, or then becomes, connected to the computing device), or may operate a tactile interface device that is already ON to request a connection to the computing device.

The triggering step includes: confirming a connection between the computing device A and the tactile interface device 1000 (S11); and calling the execution of the step of executing the main screen module 2200 when it is determined that the computing device A and the tactile interface device 1000 are connected to each other.

FIG. 9 schematically shows a display screen in a computing apparatus and a tactile display in the tactile interface apparatus according to an embodiment of the present invention.

FIG. 9(A) shows the display screen of the computing device when the main screen module is executed in the case where the computing device is a smartphone, and FIG. 9(B) shows the state of the tactile interface device at the same point in time.

The main screen module 2200 displays each menu (content) in a one-dimensionally arranged form and displays a cursor next to each menu. That is, the main screen module divides the contents into a plurality of units and displays them on the computing device in that divided form. The TUI module 2400 then maps the contents divided into a plurality of units onto a plurality of sub content areas.

Similarly, in the tactile interface device 1000, the respective menus (contents) are displayed in a one-dimensionally arranged form, and a cursor corresponding to each menu is displayed next to it.

FIG. 10 schematically shows the area configuration of a tactile display in a tactile interface device according to an embodiment of the present invention.

The tactile display of the tactile interface device implemented by the output signal generated by the TUI module 2400 may include: a content area L1 implemented based on the content part of the screen that can be displayed on the computing device A by the main screen module 2200 (see FIG. 9); a cursor area L2 that implements the position and shape of the current user's cursor; and a page area L3 implemented based on page information of the screen that can be displayed on the computing device by the main screen module 2200.

In addition, the content area may be divided into a plurality of sub content areas, the cursor area may be divided into a plurality of sub-cursor areas, and the position of the cursor implemented in the cursor area may indicate the position of the focused sub content area. This type of tactile display corresponds to a structure that allows the visually impaired to identify and input information most efficiently.

The sub content area is composed of a plurality of braille cell groups, and each braille cell group includes a plurality of braille cells, for example, six braille cells. Here, one braille cell refers to one braille hole.

Likewise, each of the sub-cursor areas also includes at least one braille cell, preferably a plurality of braille cells, of a type capable of indicating the position and shape information of the cursor.

More preferably, as in the "focused L2 area" and the "focused L1 area" of FIG. 10, the sub-cursor area and the sub-content area corresponding to it are aligned on one axis. In such a structure, the visually impaired user can most efficiently grasp where the focus of his or her cursor is. This type of interface corresponds to a form derived from various experiments conducted by the applicant of the present invention with blind users.

FIG. 11 schematically shows an input/output process of the tactile interface device according to an embodiment of the present invention.

Steps S110, S120, and S130 are performed by the TUI module 2400. On the other hand, in step S140, a command is input to the application, and the interface or screen output of the application is changed accordingly.

Steps S150, S160, and S170 are performed by the TUI module 2400. By performing these steps, a change of an application interface or a screen output is transferred to the tactile interface apparatus 1000, and the user can immediately recognize such a change.

FIG. 12 schematically shows a process of generating an output signal according to an embodiment of the present invention.

The output signal generating step may include: a content area output signal generating step (S310) of generating a control signal of the two-dimensional tactile cell array for the content area; a cursor area output signal generating step (S320) of generating a control signal of the two-dimensional tactile cell array for the cursor area; and a page area output signal generating step (S330) of generating a control signal of the two-dimensional tactile cell array for the page area.

As shown in FIG. 10, the content area may be divided into a plurality of sub content areas, and the cursor area may be divided into a plurality of sub-cursor areas. In the content area output signal generation step (S310), the data of the content part of the screen that can be displayed on the computing device is divided to generate the output signals of the plurality of sub content areas. In the cursor area output signal generation step (S320), an output signal is generated for controlling the tactile cell array at the portion of the cursor area corresponding to the sub content area in which the user's cursor is located.

FIG. 13 schematically shows a process of generating a cursor area output signal according to an embodiment of the present invention.

The cursor area output signal generation step (S320) includes: determining the position of the user's cursor (S321); and determining the shape of the user's cursor (S322 and S323).

The user's cursor performs a function of focusing a part of the content area. However, since a tactile display has far fewer pixels than a normal display, there may be cases in which the information of one sub content area cannot be displayed all at once.

In one embodiment of the present invention, the step of determining the shape of the user's cursor (S322 and S323) in addition to determining the position of the cursor is performed, thereby allowing the visually impaired to intuitively use the software.

Specifically, the step of determining the shape of the user's cursor (S322 and S323) includes: a step (S322) of loading the original text of the sub content area in which the cursor is located, together with the output signal or the tactile display information of the tactile interface device for that sub content area; and a step (S323) of generating a cursor area output signal by determining the cursor type based on the original text and the output signal or the tactile display information.

That is, in the cursor area output signal generation step, the original text information of the sub content area in which the user's cursor is located is compared with the tactile display information currently shown in the tactile interface device for that sub content area, and the shape of the user's cursor is determined accordingly.

Preferably, the step of determining the shape of the user's cursor (S322 and S323) determines a first cursor type when the tactile display information in the tactile interface device for the sub content area in which the user's cursor is located includes all of the original text information of the main screen module for that sub content area, and determines a second cursor type when the tactile display information includes only a part of the original text information of the main screen module for that sub content area.

More preferably, where the original text information of the main screen module 2200 for the sub content area in which the user's cursor is located consists of first original text information, second original text information following the first original text information, and third original text information following the second original text information, the second cursor type includes: a 2-1 cursor type displayed when the tactile display information in the tactile interface device for that sub content area corresponds to the first original text information; a 2-2 cursor type displayed when the tactile display information corresponds to a part of the second original text information; and a 2-3 cursor type displayed when the tactile display information corresponds to the third original text information.
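The following is a hedged sketch of this cursor-type decision. The function name, the type labels, and the simple prefix/suffix comparison are illustrative assumptions, not the patent's actual implementation.

```python
# Decide the cursor type from the original text of the focused sub content area
# and the text currently shown on the tactile display (assumed representation).
FIRST_TYPE = "first cursor type"   # the whole original text fits in the sub content area
TYPE_2_1 = "2-1 cursor type"       # the first original text information is displayed
TYPE_2_2 = "2-2 cursor type"       # part of the second original text information is displayed
TYPE_2_3 = "2-3 cursor type"       # the third original text information is displayed

def cursor_type(original_text, displayed_text, cell_capacity):
    """Choose a cursor type for the focused sub content area."""
    if len(original_text) <= cell_capacity:
        return FIRST_TYPE                                   # everything is visible at once
    if displayed_text == original_text[:cell_capacity]:
        return TYPE_2_1                                     # showing the beginning
    if displayed_text == original_text[-len(displayed_text):]:
        return TYPE_2_3                                     # showing the end
    return TYPE_2_2                                         # showing a middle portion

print(cursor_type("On some models the position of the cursor is represented",
                  "On some models the position of", 30))    # -> "2-1 cursor type"
```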

In a preferred embodiment of the present invention, the information in the content area is divided into a plurality of sub content areas and displayed on the tactile interface device.

For example, it is assumed that the information in the content area is as follows.

 "The base of a refreshable braille display often integrates a pure braille keyboard. There are two sets of four keys on each side, one is a refreshable braille display consisting of a row of electro-mechanical character cells, each of which can raise or lower a combination of eight round-tipped pins . Other variants exist that use a conventional QWERTY keyboard for input and braille pins for output, as well as input-only and output-only devices.

On some models the position of the cursor is represented by vibrating the dots, and some models have a switch associated with each cell to move the cursor to that cell directly.

The mechanism which raises the dots uses the piezo effect of some crystals, so they are expanded when a voltage is applied to them. Such a crystal is connected to a lever, which in turn raises the dot. There is a crystal for each dot of the display, i.e. eight per character. "

In this case, the TUI module assigns "The base of a refreshable braille display often integrates a pure braille keyboard. There are two sets of four keys on each side, one is a refreshable braille display consisting of a row of electro-mechanical character cells, each of which can raise or lower a combination of eight round-tipped pins. Other variants exist that use a conventional QWERTY keyboard for input and braille pins for output, as well as input-only and output-only devices." to the first sub content area and generates an output signal therefrom;

assigns "On some models the position of the cursor is represented by vibrating the dots, and some models have a switch associated with each cell to move the cursor to that cell directly." to the second sub content area and generates an output signal therefrom;

 "The mechanism that raises the dots uses the piezo effect of some crystals, so they expand when a voltage is applied to them. Such a crystal is connected to a lever, which in turn raises the dot. There is a crystal for each dot of the display, i.e. quot ;, " eight per character. " can be allocated to the third sub content area, and an output signal for the third sub content area can be generated.

However, in a case where one sub content area can display only 30 characters, the original text of the second sub content area can be divided as follows.

"On some models the position of"

"the cursor is represented by v" (the first part of the second original text information)

"ibrating the dots, and some mo" (the second part of the second original text information)

"dels have a switch associated" (the third part of the second original text information)

"with each cell to move the cur" (the fourth part of the second original text information)

"Sor to that cell directly." (Third original information)

Here, one sub content area may display only one of the first to third original text information at a time. In this case, the blind user checks the braille information of the cursor displayed in the sub-cursor area corresponding to the sub content area (the second sub content area) and can see where, within the original text, the information currently output by the tactile interface device is located. Thereafter, the blind user can move between the first original text information and the third original text information by operating the direction keys of the tactile interface device. In this way, the visually impaired user can grasp the information more intuitively while minimizing the inconvenience caused by the limited tactile pixels of the tactile interface device.
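A minimal sketch of this paging behaviour follows, assuming hypothetical helper names: the original text of a sub content area is divided into fixed-width chunks, as in the 30-character example above, and the direction keys move between chunks.

```python
# Page a sub content area's original text in fixed-width chunks (assumed width of 30).
def split_into_pages(text, width=30):
    """Divide the original text into pages that each fit one sub content area."""
    return [text[i:i + width] for i in range(0, len(text), width)]

pages = split_into_pages(
    "On some models the position of the cursor is represented by vibrating the dots, "
    "and some models have a switch associated with each cell to move the cursor to that cell directly."
)
index = 0                                   # currently displayed page
index = min(index + 1, len(pages) - 1)      # right direction key: next page
index = max(index - 1, 0)                   # left direction key: previous page
print(pages[0])                             # "On some models the position of"
```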

FIG. 14 schematically shows examples of cursor area notation according to an embodiment of the present invention.

FIG. 14 shows examples of a total of four cursor shapes that can be displayed in the cursor area, more preferably in a sub-cursor area. By reading such a cursor shape and operating a direction key, the user can change the display contents of the sub content area.

The first cursor type, the 2-1 cursor type, the 2-2 cursor type, and the 2-3 cursor type correspond to the first cursor shape, the second cursor shape, the third cursor shape, and the fourth cursor shape of FIG. 14, respectively.

Figure 15 schematically illustrates some examples of cursor region notation in accordance with an embodiment of the present invention.

As shown in FIG. 15(A), the entire contents are divided into contents 1 to 6 and shown in the respective sub content areas. When the user focuses on the first sub content area, a tactile graphic for the cursor is displayed in the sub-cursor area arranged side by side with it on the same axis.

Here, when content 1 cannot be displayed in one sub content area, content 1 is virtually divided into "content 1A, content 1B, content 1C". Initially, content 1A is displayed as a tactile graphic as shown in FIG. 15(B), and the tactile graphic for the cursor on the left has the second cursor shape of FIG. 14.

Then, when the user presses the right direction key, content 1B is displayed as a tactile graphic as shown in FIG. 15(C), and the tactile graphic for the cursor on the left has the third cursor shape of FIG. 14.

Then, when the user presses the right direction key again, content 1C is displayed as a tactile graphic as shown in FIG. 15(D), and the tactile graphic for the cursor on the left has the fourth cursor shape of FIG. 14.

FIG. 16 schematically shows a process of generating a buffer output signal according to an embodiment of the present invention.

Hereinafter, the operation of the buffer output signal generation step for the conversion of the main screen module will be described for convenience of explanation. However, the buffer output signal generation step may be applied not only to the conversion of the main screen module but also to the conversion of display screens of other applications.

As described above, the first TUI step (S30) may include: an output signal generation step (S150) of generating an output signal for implementing a tactile screen of the tactile interface device corresponding to a screen that can be displayed on the computing device by the main screen module 2200 or another application; and a buffer output signal generation step (S160) of generating a buffer output signal based on the output signal and the previous tactile display information of the tactile interface device.

Here, the output signal and the buffer output signal include a control signal for a two-dimensional haptic cell array.

Preferably, the first TUI step (S30) is performed every time the screen that can be displayed on the computing device is changed by the main screen module, and the buffer output signal generation step generates the buffer output signal based on the difference between the immediately preceding output signal and the currently changed output signal.

Specifically, the buffer output signal generation step (S160) includes: loading the previous output signal generated by the output signal generation unit before the screen is changed by the current user's input (S410);

calculating the difference between the current output signal generated by the output signal generation unit after the screen is changed by the current user's input and the previous output signal (S420);

and generating a buffer output signal based on the difference between the current output signal and the previous output signal (S430).

Such a buffer output signal includes a control signal only for the portion of the tactile cell array in the tactile interface device that has actually changed, thereby allowing the tactile interface device to operate more smoothly in physical terms.
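The following is a hedged sketch of this idea: the previous and current output signals (modelled here as flat lists of 0/1 cell states, an assumed data layout) are compared, and control data is emitted only for cells whose state changed.

```python
# Generate a buffer output signal as the difference between two cell-state frames.
def buffer_output_signal(prev_cells, curr_cells):
    """Return {cell_index: new_state} for every cell that differs from the previous frame."""
    return {i: new for i, (old, new) in enumerate(zip(prev_cells, curr_cells)) if old != new}

previous_frame = [0, 1, 1, 0, 0, 1]
current_frame = [0, 1, 0, 0, 1, 1]
print(buffer_output_signal(previous_frame, current_frame))  # {2: 0, 4: 1}
```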

FIG. 17 schematically shows an example of a TTS step according to an embodiment of the present invention.

As described above, the method of controlling the tactile interface apparatus 1000 further includes a TTS step of performing a TTS command in response to a user's command.

The TTS step may include: determining whether to perform TTS based on an input signal or command from the tactile interface device (S510); loading information on the currently set TTS mode (S520); a cursor position loading step (S530) of loading the position of the current user's cursor; determining the original text information to be subjected to TTS based on the TTS mode and the position of the user's cursor (S540); and requesting execution of the TTS module for the original text information determined in step S540 (S550).

Preferably, the TTS step may be performed for each sub content area. That is, the original text of the sub content area in which the current cursor is positioned can be outputted as a voice by the TTS step.

More preferably, the TTS mode includes a first mode for performing TTS only on the focused sub content area, and a second mode for performing TTS on the focused sub content area and the sub content area following it.

That is, " On some models the position of the cursor is represented by vibrating the dots, and some models have a switch associated with each cell to move the cursor to that cell directly. Mechanism which raises the dots uses the piezo effect of some crystals, so they expand when a voltage is applied to them. Such a crystal is connected to a lever, which in turn raises the dot. There is a crystal for each dot of the display, i.e. quot ;, " eight per character. " corresponds to the second sub content area, and the area where the cursor is focused is the first sub content area, the TTS mode includes a first mode for processing only the first sub content area, And a second mode for processing the second sub-content area, if the command related to the user's OFF is input.

FIG. 18 schematically illustrates an example of an event step according to an embodiment of the present invention.

As shown in FIG. 18, the event step includes: determining the connection state between the computing device and the tactile interface device (S610); and generating an event output signal to be output to the tactile interface device when the connection state is determined to correspond to a preset reference (S620).

By performing such an event step, the visually impaired user can intuitively detect a problem when something goes wrong with the connection between the tactile interface device and the computing device.
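A minimal sketch of this event step follows, with hypothetical names: the connection state is checked and an event output signal is produced when the state matches a preset reference such as a lost connection.

```python
# Generate an event output signal based on the connection state (assumed labels).
def event_step(is_connected):
    """Return an event label when the connection state requires notifying the user."""
    if not is_connected:
        return "EVENT_CONNECTION_LOST"   # would be rendered as a tactile or vibration pattern
    return None

print(event_step(False))  # -> "EVENT_CONNECTION_LOST"
```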

FIG. 19, FIG. 20, and FIG. 21 each show a screen for controlling the tactile interface device according to an embodiment of the present invention.

As shown in FIGS. 19 through 21, the method for controlling the tactile interface device according to an embodiment of the present invention allows a visually impaired user to interface with a program intuitively, and makes it possible to supply software for the visually impaired by developing general software, without knowledge of braille and without developing a separate input/output API.

2. Method of providing time information through the tactile interface device

FIG. 22 schematically shows the configuration of a braille clock module and a tactile interface device according to an embodiment of the present invention.

The configuration of the computing device A shown in FIG. 22 follows the description given above with reference to FIGS. 2 and 3. Likewise, the configuration of the braille OS unit 2000 in FIG. 22 follows the description given with reference to FIGS. 4 and 5.

Meanwhile, the braille clock module 5000 of FIG. 22 may correspond to one form of the application program unit 3200 of FIG. 3, or may correspond to the built-in application module 2500 of FIG. 4.

Similarly, the information output from the braille clock module 5000 may be displayed in a visual form recognizable by a sighted person on a display device connected to the computing device A, and the input/output signals may be converted into a form that the visually impaired person can interface with in the tactile interface apparatus 1000, either by the common TUI module or by a dedicated TUI module for the braille clock module built into the braille clock module 5000.

Preferably, in the embodiment of the present invention, the braille OS unit 2000 includes the common TUI module 2400, and it is desirable for the braille clock module 5000 to include its own separate TUI module 5100 that can provide an interface better suited to the braille clock module.

Similarly, the execution of the braille clock module 5000 is controlled by the braille OS unit 2000, and while the braille clock module 5000 is running, the TTS module 2600, the event module 2700, and the like may be executed simultaneously.

FIG. 23 schematically shows an internal configuration of a braille clock module according to an embodiment of the present invention.

As described above, the braille clock module 5000 is implemented in a computing device A including a processor, and provides time information through a tactile interface device 1000 that is connected to the computing device A and capable of interacting with a user.

The braille clock module 5000 includes: a TUI module 5100 for converting the input/output signals of the internal function modules of the braille clock module into a form that can be recognized or input more intuitively through the tactile interface device 1000 (this TUI module is preferably a module included in the braille clock module, separate from the TUI module 2400 of the braille OS unit 2000 of FIG. 4); a time notification module 5200 for providing time information; an alarm module 5300 for providing an interface by which a blind user can set an alarm and for providing an alarm function; and a timer module 5400 that provides an interface by which a blind user can set a timer and provides timer information to the blind user in a more intuitive form.

The TUI module 5100 converts information that can be displayed graphically by the time notification module 5200, the alarm module 5300, and the timer module 5400 into a form that can be displayed as tactile graphics in the tactile interface device 1000.

The TUI module 5100 includes a first output signal generation unit 5111 that converts a display screen into an output signal to the tactile interface device in a first method, and a second output signal generation unit 5112 that converts the display screen into an output signal to the tactile interface device in a second method. This differs from the general TUI module 2400 provided in the braille OS unit 2000 of FIGS. 4 and 5 in that not all information is converted in the same way; by controlling the tactile cells differently depending on the category of information, time information can be provided to the visually impaired more intuitively.

First, when the braille clock module 5000 is operated, the time notification module 5200 is executed first in the computing device A. The time notification module 5200 corresponds to a module that directly provides information on the current time to the visually impaired user through the tactile interface device 1000.

When the time notification module 5200 is executed, a time notification step (S1000) of outputting information including time information to the visual display device connected to the computing device A is performed. In other words, the time notification module 5200 is an application that can also be used by sighted users, similar to the application program unit 3200 of FIG. 3 or the built-in application module 2500 of FIG. 4, and its basic output screen corresponds to GUI-based visual output.

Thereafter, the TUI module 5100 is executed in the computing device A in accordance with the execution of the time notification module 5200, and a first TUI step (S1100) is performed in which the TUI module 5100 generates an output signal for implementing the tactile screen of the tactile interface apparatus 1000 corresponding to the screen displayed on the display device by the time notification module 5200, and generates an input signal for input to the time notification module 5200 from the user input received from the tactile interface apparatus 1000. The TUI module 5100 converts the input and output signals of the time notification module 5200, the alarm module 5300, and the timer module 5400 into a form that can be input and output by the tactile interface device 1000; however, for convenience, the conversion of the input/output signals of the time notification module 5200 will be referred to as the first TUI step (S1100).

The first TUI step (S1100) performed by the TUI module 5100 includes a time notification output signal generating step of generating a time notification output signal for implementing a tactile screen of the tactile interface device 1000 corresponding to a screen that can be displayed on the computing device by the time notification module.

Meanwhile, the time notification module 5200 provides a menu interface through which the alarm module 5300 and the timer module 5400 can be called, and this menu interface is implemented in the tactile interface device 1000 through the TUI module 5100. Here, the user can execute the alarm module 5300 by performing an input on the corresponding menu in the tactile interface device 1000.

When the alarm module 5300 is executed in the computing device A, an alarm step (S1200) of outputting information including an alarm setting to the visual display device connected to the computing device A is performed. The TUI module 5100 is then executed in the computing device A, and a second TUI step (S1300) is performed in which an output signal for implementing the tactile screen of the tactile interface device 1000 corresponding to the screen displayed on the display device by the alarm module 5300 is generated, and an input signal for input to the alarm module 5300 is generated from the user input received from the tactile interface apparatus 1000.

The second TUI step (S1300) may include an alarm output signal generating step of generating an alarm output signal for realizing a tactile screen of the tactile interface device 1000 corresponding to a screen that can be displayed on the computing device A by the alarm module 5300.

Likewise, the menu interface provided by the time notification module 5200 for calling the alarm module 5300 and the timer module 5400 is implemented in the tactile interface device 1000 through the TUI module 5100, and the user can execute the timer module 5400 by performing an input on the corresponding menu in the tactile interface device 1000.

When the timer module 5400 is executed in the computing device A, a timer step (S1400) of outputting information including a timer status and a timer setting to the visual display device connected to the computing device A is performed. The TUI module 5100 is then executed in the computing device A, and a third TUI step (S1500) is performed in which an output signal for implementing the tactile screen of the tactile interface device 1000 corresponding to the screen displayed on the display device by the timer module 5400 is generated, and an input signal for input to the timer module 5400 is generated from the user input received from the tactile interface device 1000.

The third TUI step (S1500) may include a timer output signal generating step of generating a timer output signal for realizing a tactile screen of the tactile interface device 1000 corresponding to a screen that can be displayed on the computing device A by the timer module 5400.

FIG. 24 schematically shows an internal configuration of a TUI module of a braille clock module according to an embodiment of the present invention.

The output signal generation unit 5110, the input signal generation unit 5120, the scaling information load unit 5130, and the buffer output signal generation unit 5140 of the TUI module 5100 operate in the same manner as described with reference to FIGS. 5, 6, 10, 12, 13, 14, 15, and 16, and duplicated description is partly omitted.

Specifically, the TUI module 5100 of the braille clock module 5000 includes: an output signal generation unit 5110 for generating an output signal for realizing a tactile display of the tactile interface device 1000 corresponding to a screen that can be displayed on the computing device A by the main screen module 2200 or another application; an input signal generation unit 5120 for converting an input, such as a key input, from the tactile interface device 1000 into a form that can be input to the main screen module or another application executed in the computing device A and generating an input signal; a scaling information load unit 5130 for loading scaling information suited to the pixels of the tactile display unit of the connected tactile interface device for use in the operation of the output signal generation unit 5110; and a buffer output signal generation unit 5140 for generating a buffer output signal based on the output signal and the previous tactile display information of the tactile interface device 1000.

Here, the output signal and the buffer output signal include a control signal for a two-dimensional haptic cell array.

The scaling information load unit 5130 preferably loads the tactile display pixel information by automatically receiving the tactile display information of the tactile interface device from the connected tactile interface device. Alternatively, such tactile display pixel information may be stored in the memory of the computing device.

The output signal generation unit 5110 generates an output signal for realizing a tactile screen of the tactile interface device corresponding to a screen that can be displayed on the computing device A. Here, the output signal includes a control signal for the two-dimensional tactile cell array, and the tactile display of the tactile interface device implemented by the output signal includes, as with the TUI module 2400 built into the braille OS unit 2000 and as shown in FIG. 6: a content area L1 implemented based on the content part of the screen that can be displayed on the computing device A by the main screen module 2200; a cursor area L2 that implements the position and shape of the current user's cursor; and a page area L3 implemented based on page information of the screen that can be displayed on the computing device A by the main screen module 2200.

Also, as shown in FIG. 6, the tactile interface device is provided with various physical keys, and each physical key is assigned a function so that it can be used more intuitively. Upon input of a physical key, an instruction for an application operating in the computing device is generated by the TUI module, and the corresponding instruction can be input to that application.

In addition, the tactile display of the tactile interface device implemented by the output signal generated by the TUI module 5100 of the braille clock module may include: a content area L1 implemented based on the content portion of a screen that can be displayed on the computing device A by the time notification module 5200, the alarm module 5300, or the timer module 5400; a cursor area L2 that implements the position and shape of the current user's cursor; and a page area L3 implemented based on page information of a screen that can be displayed on the computing device by the main screen module 2200.

In addition, the content area may be divided into a plurality of sub content areas, the cursor area may be divided into a plurality of sub-cursor areas, and the position of the cursor implemented in the cursor area may indicate the position of the focused sub content area. Such a tactile display form corresponds to a structure in which a visually impaired person can confirm and input information most effectively, and it enables a more intuitive interface for the visually impaired in an interface providing time information.

More preferably, in the tactile graphics of the tactile interface device implemented by the TUI module 5100 of the braille clock module 5000, as in the "focused L2 area" and the "focused L1 area" of FIG. 10, the sub-cursor area and the sub-content area corresponding to it are aligned on one axis. In such a structure, the visually impaired user can most efficiently grasp where the focus of his or her cursor is. This type of interface corresponds to a form derived from various experiments conducted by the applicant of the present invention with blind users.

The buffer output signal generation unit 5140 generates a buffer output signal from the output signal generated by the output signal generation unit 5110 based on the outputs of the time notification module 5200, the alarm module 5300, and the timer module 5400.

Specifically, for example, the first TUI step (S1100), the second TUI step (S1300), or the third TUI step (S1500) may include: an output signal generation step of generating an output signal for realizing a tactile screen of the tactile interface apparatus 1000 corresponding to a screen that can be displayed on the computing device by the time notification module 5200, the alarm module 5300, or the timer module 5400; and a buffer output signal generation step of generating a buffer output signal based on the output signal and the previous tactile display information of the tactile interface device.

Here, the output signal and the buffer output signal include a control signal for a two-dimensional haptic cell array.

Preferably, the first TUI step (S1100), the second TUI step (S1300), or the third TUI step (S1500) is performed every time the screen that can be displayed on the computing device is changed by the time notification module 5200, the alarm module 5300, or the timer module 5400, and the buffer output signal generation step generates the buffer output signal based on the difference between the immediately preceding output signal and the currently changed output signal.

Specifically, as shown in FIG. 16, the buffer output signal generation step may include: loading the previous output signal generated by the output signal generation unit before the screen is changed by the current user's input (S410); calculating the difference between the current output signal generated by the output signal generation unit 5110 after the screen is changed by the current user's input and the previous output signal (S420); and generating a buffer output signal based on the difference between the current output signal and the previous output signal (S430).

Such a buffer output signal includes a control signal only for the portion of the tactile cell array in the tactile interface device that has actually changed, so that the tactile interface device can operate more smoothly in physical terms.

FIG. 25 schematically shows an internal configuration of a time notification module according to an embodiment of the present invention.

The time notification module 5200 includes: a braille time expression unit 5210 for expressing time in braille; a graphic time expression unit 5220 for graphically expressing time; and a menu display unit 5225 for providing an interface or a menu for executing the alarm module or the timer module.

The braille time expression unit 5210 generates first time data expressing a combination of two or more of day, hour, minute, and second. The graphic time expression unit 5220 generates second time data expressing each of at least one of day, hour, minute, and second. The first time data corresponds to data expressing a combination of at least two of the current day, hour, minute, and second, together with meta information indicating that it is to be TUI-converted by the first output signal generation unit 5111 of the TUI module 5100. The second time data corresponds to data for each of at least one of the current day, hour, minute, and second, together with meta information indicating that it is to be TUI-converted by the second output signal generation unit 5112 of the TUI module 5100.

Specifically, the first time data generated by the braille time expression unit 5210 corresponds to, for example, "Wednesday 14:20:35", and the first time data is converted into a braille form representing the time through the first output signal generation unit 5111.

On the other hand, the second time data generated by the graphic time expression unit 5220 corresponds to, for example, data of a plurality of fields such as "Wednesday", "14 o'clock", "20 minutes", and "35 seconds", and each item of the second time data is converted into a graphical form representing the time through the second output signal generation unit 5112.
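The following is a hedged sketch of generating both kinds of time data together with the meta information that tells the TUI module which output signal generation unit should convert each item. The field names and the "braille"/"graphic" tags are assumptions for illustration, not the patent's actual data format.

```python
# Build first time data (one combined string, to be rendered in braille) and
# second time data (per-field values, to be rendered graphically), each tagged
# with meta information for the appropriate output signal generation unit.
import datetime

def make_time_data(now):
    first_time_data = {
        "meta": "braille",                     # convert via the first output signal generation unit
        "value": now.strftime("%A %H:%M:%S"),  # e.g. "Wednesday 14:20:35"
    }
    second_time_data = [
        {"meta": "graphic", "field": "weekday", "value": now.isoweekday()},
        {"meta": "graphic", "field": "hour", "value": now.hour},
        {"meta": "graphic", "field": "minute", "value": now.minute},
        {"meta": "graphic", "field": "second", "value": now.second},
    ]
    return first_time_data, second_time_data

first, second = make_time_data(datetime.datetime(2017, 8, 2, 14, 20, 35))
print(first["value"])  # "Wednesday 14:20:35"
```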

The time expressed by the braille time expression unit 5210 must be read and interpreted by the visually impaired user in order for the time to be recognized. On the other hand, the time expressed by the graphic time expression unit 5220 allows the blind user to intuitively recognize the current time even while simply sweeping the tactile interface device with one hand; therefore, in order to intuitively recognize information corresponding to the time, it is more efficient to use the tactile graphics output by the graphic time expression unit 5220 on the tactile interface device.

That is, the time notification module 5200 may perform: a first time expression step (S1010) of generating first time data expressing combined information of at least two of day, hour, minute, and second, performed by the braille time expression unit 5210; and a second time expression step (S1020) of generating second time data expressing each of at least one of day, hour, minute, and second, performed by the graphic time expression unit.

Thereafter, the TUI module 5100 performs: a first time output signal generation step (S1110) in which the first output signal generation unit 5111 of the TUI module generates a first time output signal for controlling the tactile cell array of the tactile interface device so as to express the first time data in braille form; and a second time output signal generation step (S1120) in which the second output signal generation unit 5112 of the TUI module generates a second time output signal for controlling the tactile cell array of the tactile interface device so as to express the second time data in graphical form. The time notification output signal includes the first time output signal and the second time output signal.

Meanwhile, the second output signal generation unit 5112 expresses the information input from the time notification module 5200 not as general braille characters but in a graphical form recognizable on the braille cells, according to predetermined rules. For example, when the tactile interface device outputs tactile graphics in units of braille cell groups in a 2*3 array, the graphic conversion for each day of the week can be performed as follows.

[Figure 112017075181251-pat00001: preset graphic patterns of the braille cell groups for each day of the week]

Meanwhile, the information "14" may be converted by the second output signal generation unit 5112 such that the entirety of two 2*3 braille cell groups (12 cells) is turned ON together with two braille cells of one additional 2*3 braille cell group, so that a total of 14 cells are raised.

That is, the output signal generation unit 5110 converts the information through the first output signal generation unit 5111 or the second output signal generation unit 5112 based on the attached meta information, and in the second output signal generation unit 5112 the information is converted into a graphic signal according to the mapping rules preset for numbers and days of the week.
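A hedged sketch of the graphic number conversion described above follows: a value such as 14 is expressed by raising 14 pins in total, i.e. two full 2*3 braille cell groups plus two cells of a further group. The return format is an illustrative assumption.

```python
# Convert a number into per-group raised-cell counts for the graphic representation.
CELLS_PER_GROUP = 6  # one 2*3 braille cell group

def number_to_graphic(value):
    """Return, for each braille cell group in order, how many cells to raise so that
    the total number of raised cells equals the value."""
    full_groups, remainder = divmod(value, CELLS_PER_GROUP)
    groups = [CELLS_PER_GROUP] * full_groups
    if remainder:
        groups.append(remainder)
    return groups

print(number_to_graphic(14))  # [6, 6, 2] -> two full groups ON plus two cells of a third group
```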

Meanwhile, the time notification output signal generated by the first TUI step (S1100) includes a control signal for the two-dimensional tactile cell array, and the tactile display of the tactile interface device implemented by the time notification output signal includes, as in FIGS. 6 and 10: a content area implemented based on the content part of the screen that can be displayed on the computing device A by the time notification module 5200; a cursor area that implements the position and shape of the current user's cursor; and a page area implemented based on page information of the screen that can be displayed on the computing device A by the main screen module 2200. The first time output signal and the second time output signal may be implemented in the content area.

In addition, the content area may be divided into a plurality of sub content areas, the cursor area may be divided into a plurality of sub-cursor areas, and the position of the cursor implemented in the cursor area may indicate the position of the focused sub content area, with the sub-cursor area and the sub-content area corresponding to it aligned on one axis. The first time output signal is displayed in one sub content area, and in the second time output signal each of at least one of day, hour, minute, and second is displayed in its own sub content area.

In such a layout, the visually impaired user can intuitively call the alarm module 5300 or the timer module 5400 in one page, and at the same time intuitively recognize the information on the current time.

FIG. 26 schematically shows an internal configuration of an alarm module according to an embodiment of the present invention.

The alarm module 5300 includes: an alarm status display unit 5310 for generating alarm status data representing the currently set alarm status, in order to output information including the alarm setting to the visual display device connected to the computing device; and an alarm setting input unit 5320 for setting or changing the alarm status according to the user's input to the tactile interface device or a general input to the computing device.

The screen implemented by the alarm status display unit 5310 is converted by the TUI module 5100 into a form that can be expressed by the tactile interface apparatus, and the information input through the tactile interface apparatus is converted by the TUI module 5100 and input to the alarm setting input unit 5320.

The alarm setting input unit 5320 includes: an alarm setting item determination unit 5321 that determines what the alarm setting item is based on the current cursor position; an alarm braille input unit 5322 that provides an interface for changing the alarm setting information by direct braille input of the alarm time (for example, to set the alarm to 2:30, the user inputs 2 and 30 with the braille keyboard); and an alarm keypad input unit 5323 that provides an interface for changing the alarm setting information by the user's direction key input (for example, when the current setting is 1 o'clock, pressing the right direction key changes it to 2 o'clock).

The input / output in the alarm setting input unit 5320 is converted into a form that can be implemented in the tactile interface apparatus 1000 by the TUI module 5100.

FIG. 27 schematically shows an internal configuration of a timer module according to an embodiment of the present invention.

The timer module 5400 is executed in the computing device A to perform a timer step (S1400) of outputting information including a timer status and a timer setting to the visual display device connected to the computing device A.

The TUI module 5100 is executed in the computing device A in accordance with the input and output arising from the execution of the timer module 5400, and a third TUI step (S1500) may be performed in which the TUI module 5100 generates an output signal for implementing the tactile display of the tactile interface device 1000 corresponding to the screen displayed on the display device by the timer module 5400, and generates an input signal for input to the timer module 5400 from the user input received from the tactile interface device 1000.

Meanwhile, the third TUI step (S1500) includes a timer output signal generating step of generating a timer output signal for realizing a tactile screen of the tactile interface device 1000 corresponding to a screen that can be displayed on the computing device by the timer module 5400.

The timer module 5400 includes: a timer status display unit 5410 for generating current timer status data in order to output information including timer information to the visual display device connected to the computing device A; and a timer setting input unit 5420 for setting or changing the timer status according to the user's input to the tactile interface device or a general input to the computing device.

In this specification, " timer " refers to a timer for the remaining time, and for example, when setting a 30 second timer, the remaining time from 30 seconds to 0 seconds refers to the time being displayed in real time.

The screen realized by the timer status display unit 5410 is converted by the TUI module 5100 into a form that can be expressed by the tactile interface apparatus, and the information input through the tactile interface apparatus 1000 is converted by the TUI module 5100 and input to the timer setting input unit 5420.

Preferably, the screen implemented by the timer status display unit 5410 is converted into a graphic form by the second output signal generation unit 5112 of the TUI module. That is, if a 30-second timer is set and "30 seconds" itself is simply displayed in braille, the visually impaired user cannot readily confirm how much time currently remains.

In contrast, in the embodiment of the present invention, "30 seconds" is displayed in graphic form: specifically, 30 braille cells, corresponding to the number 30, are raised in the sub content area corresponding to seconds, and as time elapses the cells are turned OFF in real time, so that the remaining time is presented to the blind user graphically and intuitive timer information is provided.
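A minimal sketch of this graphic countdown follows, using an assumed representation: the number of raised cells in the "seconds" sub content area tracks the remaining seconds and is reduced in real time as the timer runs down.

```python
# Render the remaining seconds as raised/lowered cells and count down in real time.
import time

def render_seconds(remaining, total=30):
    """Return one entry per cell in the seconds area: 1 = raised cell, 0 = lowered cell."""
    return [1] * remaining + [0] * (total - remaining)

def run_timer(total_seconds=30):
    for remaining in range(total_seconds, -1, -1):
        cells = render_seconds(remaining, total_seconds)
        # in the real device the cell pattern would be sent to the tactile cell array here
        print(f"{remaining:2d}s", "".join(str(c) for c in cells))
        time.sleep(1)

# run_timer(5)  # uncomment to watch a 5-second countdown
```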

Meanwhile, the timer setting input unit 5420 includes: a timer setting item determination unit 5421 that determines what the timer setting item is based on the current cursor position (for example, whether the position of the current cursor corresponds to hour, minute, or second); a timer braille input unit 5422 that provides an interface for changing the timer setting information by direct braille input of the time (for example, to set the timer to 2 hours 30 minutes, the user inputs 2 and 30 with the braille keyboard); and a timer keypad input unit 5423 that provides an interface for changing the timer setting information by the user's direction key input (for example, when the current timer setting is one hour, pressing the right direction key changes it to two hours).

The input / output in the timer setting input unit 5420 is converted into a form that can be implemented in the tactile interface apparatus 1000 by the TUI module 5100.

FIG. 28 schematically illustrates the main steps of a method for providing time information according to an embodiment of the present invention.

The method of providing time information corresponds to a method, implemented by a computing device including a processor, of providing time information through a tactile interface device connected to the computing device and capable of interacting with a user, and it can be implemented by the organic operation of the internal components of the braille clock module described above.

Meanwhile, the computing device in which such a method is implemented includes: a time notification module 5200 for providing time information; and a TUI module 5100 that converts information that can be displayed graphically into a form that can be displayed as tactile graphics in the tactile interface device. In addition, as described above, the TUI module 5100 includes a first output signal generation unit 5111 that converts a display screen into an output signal to the tactile interface device in a first method, and a second output signal generation unit 5112 that converts the display screen into an output signal to the tactile interface device in a second method.

In step S1000, a time notification step is performed in which the time notification module 5200 is executed in the computing device and outputs information including time information to the visual display device connected to the computing device.

In step S1100, a first TUI step is performed in which the TUI module 5100 is executed in the computing device, generates an output signal for implementing the tactile screen of the tactile interface device corresponding to the screen displayed on the display device by the time notification module 5200, and generates an input signal for input to the time notification module from the user input received at the tactile interface device.

In step S1200, an alarm step is performed in which the alarm module 5300 is executed in the computing device and outputs information including an alarm setting to the visual display device connected to the computing device.

In step S1300, a second TUI step is performed in which the TUI module 5100 is executed in the computing device, generates an output signal for implementing the tactile screen of the tactile interface device corresponding to the screen displayed on the display device by the alarm module 5300, and generates an input signal for input to the alarm module from the user input received at the tactile interface device.

In step S1400, a timer module 5400 is executed within the computing device to perform a timer step of outputting information including a timer status and a timer setting to a visual display device connected to the computing device.

In step S1500, a third TUI step is performed in which the TUI module 5100 is executed in the computing device, generates an output signal for implementing the tactile screen of the tactile interface device corresponding to the screen displayed on the display device by the timer module 5400, and generates an input signal for input to the timer module from the user input received at the tactile interface device.

The first TUI step (S1100), the second TUI step (S1300), and the third TUI step (S1500) are names assigned merely to identify for which module the TUI step is performed; all of them are carried out by the operation of the TUI module 5100. Meanwhile, the information input to the TUI module 5100 in the time notification step (S1000), the alarm step (S1200), and the timer step (S1400) includes the conversion-subject information and meta information about the conversion-subject information, and the meta information indicates whether the conversion-subject information should be converted by the first output signal generation unit 5111 according to the first method or by the second output signal generation unit 5112 according to the second method.
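A hedged sketch of this meta-information routing follows. The tag names and the placeholder converters are assumptions: items tagged for the first method are handed to a braille conversion, items tagged for the second method to a graphic conversion.

```python
# Dispatch conversion-subject items according to their meta information.
def first_output_signal(value):
    return f"<braille:{value}>"    # placeholder for a real braille cell encoding

def second_output_signal(value):
    return f"<graphic:{value}>"    # placeholder for a real graphic cell encoding

def generate_output_signal(item):
    """Route one conversion-subject item based on its meta information."""
    if item["meta"] == "braille":                  # first method
        return first_output_signal(item["value"])
    return second_output_signal(item["value"])     # second method (graphic form)

print(generate_output_signal({"meta": "braille", "value": "Wednesday 14:20:35"}))
print(generate_output_signal({"meta": "graphic", "value": 14}))
```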

FIG. 29 schematically illustrates sub-steps of the time notification step and the first TUI step according to an embodiment of the present invention.

The time notification step (S1000) may include: a first time expression step (S1010) of generating first time data expressing combined information of at least two of day, hour, minute, and second; and a second time expression step (S1020) of generating second time data expressing each of at least one of day, hour, minute, and second. The first time data and the second time data generated in the time notification step are displayed on a general display (for example, a touch screen) of the computing device.

The first TUI step (S1100) may include: a first time output signal generation step (S1110) in which the first output signal generation unit 5111 of the TUI module 5100 generates a first time output signal for controlling the tactile cell array of the tactile interface device so as to express the first time data in braille form; and a second time output signal generation step (S1120) in which the second output signal generation unit 5112 of the TUI module 5100 generates a second time output signal for controlling the tactile cell array of the tactile interface device so as to express the second time data in graphical form. The time notification output signal may include the first time output signal and the second time output signal. In FIG. 29, the first time expression step (S1010) and the second time expression step (S1020) are followed by the first time output signal generation step (S1110) and the second time output signal generation step (S1120); however, the present invention is not limited to this order, and the steps may also be performed in the order of the first time expression step (S1010), the first time output signal generation step (S1110), the second time expression step (S1020), and the second time output signal generation step (S1120).

FIG. 30 schematically illustrates data processing in a first TUI step according to an embodiment of the present invention.

As described above, the first output signal generation unit 5111 of the TUI module 5100 may perform a first time output signal generation step (S1110) of generating a first time output signal for controlling the tactile cell array of the tactile interface device so as to express the first time data in braille.

Here, the first time data is data for the time as a whole, for example "Wednesday 14:40" as shown in FIG. 30, and is converted into braille data by the first output signal generation unit 5111.

The second output signal generation unit 5112 of the TUI module 5100 may perform a second time output signal generation step (S1120) of generating a second time output signal for controlling the tactile cell array of the tactile interface device so as to express the second time data in graphical form.

Here, the second time data is a set of data for each element constituting the time, for example "Wednesday", "14 o'clock", and "42 minutes", and the second time data is converted into tactile graphic data by the second output signal generation unit 5112.

FIG. 31 schematically illustrates an alarm step and a second TUI step according to an embodiment of the present invention.

In step S1610, an alarm status expression step of generating alarm status data expressing the currently set alarm status is performed. The alarm status data is displayed on the display device of the computing device A and simultaneously input to the TUI module 5100.

In step S1620, the inputted alarm status data is converted into a form that can be implemented in the tactile interface device.

In step S1630, an alarm setting input step is performed to set or change the alarm status according to the input to the tactile interface device of the user.

In step S1640, the TUI module converts the input / output signal associated with the interface for such input.

FIG. 32 schematically illustrates a detailed process of an alarm step according to an embodiment of the present invention.

In step S1631, an alarm setting item determination step is performed to determine what the alarm setting item is based on the current cursor position.

Thereafter, in accordance with the input to the tactile interface device of the user, step S1632 or S1633 may be performed.

In step S1632, an alarm braille input step is performed to provide an interface for changing the alarm setting information by direct braille input of the alarm time (for example, to set the alarm to 2:30, the user inputs 2 and 30 with the braille keyboard).

In step S1633, an alarm keypad input step is performed to provide an interface for changing the alarm setting information by the user's direction key input (for example, when the current setting is 2 o'clock, pressing the right direction key changes the hour setting by a preset interval).

By providing such a plurality of braille-based input interfaces, the user can set the alarm function more intuitively.
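
As a rough illustration of the two input paths above, the sketch below models the braille-entry path (S1632) and the direction-key path (S1633). The key names, the AlarmSetting structure, and the one-step interval are assumptions for demonstration only, not the device firmware.

```python
from dataclasses import dataclass

@dataclass
class AlarmSetting:
    hour: int = 0
    minute: int = 0

def alarm_braille_input(setting: AlarmSetting, hour_text: str, minute_text: str) -> AlarmSetting:
    """S1632: the user types the alarm time directly on the braille keyboard (e.g. "2" and "30")."""
    return AlarmSetting(hour=int(hour_text) % 24, minute=int(minute_text) % 60)

def alarm_keypad_input(setting: AlarmSetting, key: str, focused_item: str) -> AlarmSetting:
    """S1633: a direction key changes the focused item by a preset interval (here, 1)."""
    step = 1 if key == "right" else -1
    if focused_item == "hour":
        return AlarmSetting((setting.hour + step) % 24, setting.minute)
    return AlarmSetting(setting.hour, (setting.minute + step) % 60)

if __name__ == "__main__":
    s = AlarmSetting()
    s = alarm_braille_input(s, "2", "30")       # braille path  -> 02:30
    s = alarm_keypad_input(s, "right", "hour")  # keypad path   -> 03:30
    print(s)
```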

FIG. 33 schematically illustrates the timer step and the third TUI step according to an embodiment of the present invention.

In steps S1710 and S1730, a timer module is executed in the computing device, and a timer step of outputting information including the timer status and the timer settings to a visual display device connected to the computing device is performed.

In steps S1720 and S1740, the TUI module 5100 is executed in the computing device, and a third TUI step is performed, which generates an output signal for implementing the tactile display of the tactile interface device 1000 corresponding to the screen displayed on the display device by the timer module 5400, and which generates an input signal for input to the timer module 5400 from the user input entered at the tactile interface device 1000.

Meanwhile, the tactile graphic of the tactile interface device implemented by the third TUI step (S1740) can be divided into a content area, a cursor area, and a page area, as shown in FIGS. 6 and 10.

The content area may be divided into a plurality of sub content areas, the cursor area may be divided into a plurality of sub-cursor areas, and the position of the cursor implemented in the cursor area corresponds to the position of the focused sub content area. The sub-cursor area and the sub content area corresponding to that sub-cursor area are aligned with each other on one axis, and in the timer output signal each of hour, minute, and second is displayed in one sub content area.

For example, if a timer of 1 hour, 10 minutes, and 10 seconds is set and 5 seconds have elapsed so that 1 hour, 10 minutes, and 5 seconds remain, the first sub content area graphically represents the information on hours, the second sub content area graphically represents the information on minutes, and the third sub content area graphically represents the information on seconds.
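
A minimal sketch of this example follows, assuming a row width of 24 tactile cells per sub content area; the scaling and cell count are illustrative assumptions, not taken from the specification.

```python
def timer_rows(hours: int, minutes: int, seconds: int, cells_per_row: int = 24):
    """Return one bar-graph row ('#' = raised cell) per sub content area."""
    scales = {"hour": (hours, 23), "minute": (minutes, 59), "second": (seconds, 59)}
    rows = {}
    for name, (value, maximum) in scales.items():
        filled = round(value / maximum * cells_per_row)
        rows[name] = "".join("#" if i < filled else "." for i in range(cells_per_row))
    return rows

if __name__ == "__main__":
    # Remaining time of 1 hour, 10 minutes, 5 seconds, as in the example above.
    for name, row in timer_rows(1, 10, 5).items():
        print(f"{name:>6}: {row}")
```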

FIG. 34 schematically illustrates a detailed process of a timer step according to an embodiment of the present invention.

In step S1731, a timer setting item determination step is performed to determine which timer setting item is selected, based on the position of the current user's cursor.

Thereafter, step S1732 or step S1733 may be performed according to the user's input to the tactile interface device.

In step S1732, a timer braille input step is performed to provide an interface for changing the timer setting information by the user's direct braille input of the timer time (for example, to set the timer to 2 hours and 30 minutes, the user inputs 2 and 30 on the braille keyboard).

In step S1733, a timer keypad input step is performed to provide an interface for changing the timer setting information by the user's direction key input (for example, when the current timer setting is 1 hour, pressing the right direction key changes the setting by a preset interval).

By providing such a plurality of braille-based input interfaces, the user can set the timer function more intuitively.

FIG. 35 shows a screen of the time notification step and a screen of the tactile interface device in which TUI conversion has been performed for the time notification step, according to an embodiment of the present invention.

FIG. 35A shows the display screen of the computing device in the time notification step (S1000), and FIG. 35B shows the tactile graphic implemented in the tactile interface device.

As shown in FIG. 35B, the tactile graphic is divided into a content area, a cursor area, and a page area, and the content area and the cursor area each include a plurality of sub content areas and sub-cursor areas, respectively.

The position of the cursor is displayed in the sub-cursor area, and the content corresponding to that cursor position is displayed in the sub content area aligned in parallel with the corresponding sub-cursor area. With this structure, a visually impaired person can intuitively recognize which item the currently displayed information belongs to, and can easily provide input for it.
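
The sketch below illustrates this alignment under assumed dimensions: each row pairs one sub-cursor cell with the sub content area lying on the same axis, so the focused item and its content can be read together. The labels and layout are hypothetical examples.

```python
def compose_rows(items, focused_index):
    """Pair each sub-cursor cell with the sub content area on the same row."""
    rows = []
    for i, (label, content) in enumerate(items):
        cursor_cell = "o" if i == focused_index else " "     # sub-cursor area
        rows.append(f"[{cursor_cell}] {label}: {content}")   # sub content area
    return rows

if __name__ == "__main__":
    items = [("hour", "02"), ("minute", "30"), ("second", "00")]
    print("\n".join(compose_rows(items, focused_index=1)))
```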

The cursor shape is determined on the basis of the difference between the content of the original text and the content output on the tactile interface device, as described with reference to FIGS. 14 and 15. The description thereof is not repeated here because it is the same as that given with reference to FIGS. 14 and 15.

On the other hand, if the user further inputs a specific physical key of the tactile interface device, the device can be switched to the input mode for the focused item, and in this case the shape of the cursor can be changed to indicate that the device is currently in the input mode. In the embodiment described below, a form in which the pins at (1, 3) and (2, 3) of a 2x3 array are raised corresponds to the cursor shape in the input mode.
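
A tiny sketch of this input-mode cursor pattern, using the 1-based (row, column) indexing of the text; the array dimensions follow the embodiment described above, everything else is illustrative.

```python
def input_mode_cursor():
    """2 x 3 pin array with only (row 1, col 3) and (row 2, col 3) raised."""
    raised = {(1, 3), (2, 3)}
    return [[1 if (r, c) in raised else 0 for c in range(1, 4)] for r in range(1, 3)]

if __name__ == "__main__":
    for row in input_mode_cursor():
        print(row)  # [0, 0, 1] then [0, 0, 1]
```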

In the present invention, the intuitive time information display method converts the hour, minute, and second information of the smart device operating system's time information in real time into tactile information in the form of a bar graph, using a plurality of tactile cells located in the same row of the content area adapted to the multi-array tactile cells. In addition, in order to address the inconvenience of recognizing day-of-week information in the time display methods of existing braille information terminals, the day-of-week information is provided together with the time information.

FIG. 36 shows a screen of the time notification step and a screen of the tactile interface device in which TUI conversion has been performed for the time notification step, according to an embodiment of the present invention.

As shown in FIG. 36, the tactile graphic of the tactile interface device implemented by the time notification step (S1000) includes a plurality of sub content areas, one of which displays the combined time information in braille form, while one or more further sub content areas are each allocated to hour, minute, or second and display the corresponding time information in graphic form.

Through such a dual interface, a visually impaired person can grasp the time information he or she wants to know more intuitively and efficiently.

FIG. 37 shows a screen of the alarm step and a screen of the tactile interface device in which TUI conversion has been performed for the alarm step, according to an embodiment of the present invention.

In the left screen of FIG. 37, the visually impaired user focuses on the item of the alarm setting information to be input, for example, the hour or the minute, and when the user presses a specific physical key of the tactile interface device, the input by the alarm braille input unit of FIG. 26 is invoked.

In the right screen of FIG. 37, the output interface is changed, and the user can input the time information by braille input.

FIG. 38 shows a screen of the alarm step and a screen of the tactile interface device in which TUI conversion has been performed for the alarm step, according to an embodiment of the present invention.

In the left screen of FIG. 38, the visually impaired user focuses on the item of the alarm setting information to be input, for example, the hour or the minute, and when the user presses a specific physical key of the tactile interface device, the input by the alarm keypad input unit of FIG. 26 is invoked.

In the right screen of FIG. 38, the user can input time information through the keypad input on the same output interface.

Preferably, the user focuses an item by placing the cursor on the item to be input, and presses a first key to enter the input mode for that item.

If the user then presses a braille point key, the output screen is changed while entering the braille input mode as shown in FIG. 37; if the user instead presses the direction keypad, the setting value is changed at a preset interval on the current output screen.

FIG. 39 shows a screen of the alarm step and a screen of the tactile interface device in which TUI conversion has been performed for the alarm step, according to an embodiment of the present invention.

FIG. 39 shows the change in the output of the tactile interface device and the computing device when the device has been switched to the input mode for the focused item and the keypad is then pressed.

Specifically, starting from the first screens of the tactile interface device and the computing device, the setting of 5 minutes is changed to 10 minutes and then to 15 minutes according to successive inputs of the right direction key.
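
The sketch below models this behavior as a small state machine; the key names, the browse/input/braille-entry modes, and the 5-minute interval are assumptions chosen to match the 5 → 10 → 15 example, not the actual device firmware.

```python
def handle_key(state: dict, key: str) -> dict:
    """Tiny state machine for the focus -> input mode -> adjust flow."""
    state = dict(state)
    if state["mode"] == "browse" and key == "enter_input":
        state["mode"] = "input"             # cursor shape switches to the input-mode form
    elif state["mode"] == "input" and key == "point":
        state["mode"] = "braille_entry"     # output screen changes to braille entry (FIG. 37)
    elif state["mode"] == "input" and key in ("left", "right"):
        step = 5 if key == "right" else -5  # preset interval: 5 minutes assumed
        state["minute"] = (state["minute"] + step) % 60
    return state

if __name__ == "__main__":
    s = {"mode": "browse", "minute": 5}
    for k in ("enter_input", "right", "right"):
        s = handle_key(s, k)
    print(s)  # minute: 5 -> 10 -> 15, as in FIG. 39
```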

FIG. 40 shows a screen of the alarm step and a screen of the tactile interface device in which TUI conversion has been performed for the alarm step, according to an embodiment of the present invention.

As shown in FIG. 40, the alarm status information is divided over a plurality of sub content areas and implemented as a tactile graphic.

FIG. 41 shows a screen of the alarm step and a screen of the tactile interface device in which TUI conversion has been performed for the alarm step, according to an embodiment of the present invention.

FIG. 41 shows a page for saving and loading alarm information.

FIG. 42 shows a timer setting screen of the timer step and a screen of the tactile interface device in which TUI conversion has been performed for the timer step, according to an embodiment of the present invention.

As shown in FIG. 42, the user performs input for the hour, minute, and second items of the time information for setting the timer. As in the alarm setting, the user can set each item through direct braille point key input.

FIG. 43 shows a timer setting screen of the timer step and a screen of the tactile interface device in which the TUI conversion is performed for the timer step according to an embodiment of the present invention.

As shown in FIG. 43, the user performs an input for items of hour, minute, and second with respect to time information for setting a timer. As in the alarm setting, the user can set each item through the keypad input.

FIG. 44 shows a screen of the timer step and a screen of the tactile interface device in which TUI conversion has been performed for the timer step, according to an embodiment of the present invention.

As shown in FIG. 44, the timer information corresponding to the remaining time in the timer step (S1400) is expressed in each sub content area in graphic form, rather than in braille, for the hour, minute, and second information. Thus, a visually impaired person can grasp the timer information in a more intuitive form.

FIG. 45 shows a screen of the timer step and a screen of the tactile interface device in which TUI conversion has been performed for the timer step, according to an embodiment of the present invention.

As shown in FIG. 45, the timer information corresponding to the remaining time in the timer step (S1400) is likewise expressed in each sub content area in graphic form, with the hour, minute, and second information not in braille. Thus, a visually impaired person can grasp the timer information in a more intuitive form.

FIG. 46 is a diagram showing an example of the internal configuration of a computing device according to an embodiment of the present invention.

As shown in FIG. 46, the computing device 11000 includes at least one processor 11100, a memory 11200, a peripheral interface 11300, an input/output subsystem (I/O subsystem) 11400, a power circuit 11500, and a communication circuit 11600. Here, the computing device 11000 may correspond to the user terminal (A) connected to the tactile interface device, or to the computing device (A) described above.

The memory 11200 may include, for example, high-speed random access memory, a magnetic disk, SRAM, DRAM, ROM, flash memory, or non-volatile memory. The memory 11200 may include software modules, sets of instructions, or other various data required for the operation of the computing device 11000.

At this point, accessing memory 11200 from other components, such as processor 11100 or peripheral device interface 11300, may be controlled by processor 11100.

Peripheral device interface 11300 may couple the input and / or output peripheral devices of computing device 11000 to processor 11100 and memory 11200. The processor 11100 may execute a variety of functions and process data for the computing device 11000 by executing a software module or set of instructions stored in the memory 11200.

The input / output subsystem 11400 may couple various input / output peripherals to the peripheral interface 11300. For example, input / output subsystem 11400 may include a controller for coupling a peripheral, such as a monitor, keyboard, mouse, printer, or a touch screen or sensor, as needed, to peripheral interface 11300. According to another aspect, the input / output peripheral devices may be coupled to the peripheral device interface 11300 without going through the input / output subsystem 11400.

The power circuit 11500 may supply power to all or some of the components of the terminal. For example, the power circuit 11500 may include a power management system, one or more power sources such as a battery or alternating current (AC), a charging system, a power failure detection circuit, a power converter or inverter, and any other components for the creation, management, and distribution of power.

Communication circuitry 11600 may enable communication with other computing devices using at least one external port.

Alternatively, as described above, the communication circuit 11600 may include RF circuitry and, where necessary, enable communication with other computing devices by transmitting and receiving RF signals, also known as electromagnetic signals.

FIG. 46 is merely an example of the computing device 11000; the computing device 11000 may omit some of the components shown in FIG. 46, may further include additional components not shown in FIG. 46, or may have a configuration or arrangement in which two or more components are combined. For example, a computing device for a mobile communication terminal may further include a touch screen, a sensor, and the like in addition to the components illustrated in FIG. 46, and the communication circuit 11600 may include circuitry for various communication methods (WiFi, 3G, LTE, Bluetooth, NFC, Zigbee, etc.). The components that may be included in the computing device 11000 may be implemented in hardware, software, or a combination of both hardware and software, including one or more signal processing or application-specific integrated circuits.

The methods according to embodiments of the present invention may be implemented in the form of program instructions that can be executed by various computing devices and recorded on a computer-readable medium. In particular, the program according to the present embodiment may be configured as a PC-based program or as an application dedicated to a mobile terminal. The application to which the present invention is applied may be installed in a user terminal through a file provided by a file distribution system. For example, the file distribution system may include a file transmission unit (not shown) that transmits the file in response to a request from the user terminal.

The apparatus described above may be implemented as hardware components, software components, and/or a combination of hardware and software components. For example, the apparatus and components described in the embodiments may be implemented within a computer system using, for example, a processor, a controller, an arithmetic logic unit (ALU), a digital signal processor, a microcomputer, a field programmable gate array (FPGA), a programmable logic unit (PLU), a microprocessor, or any other device capable of executing and responding to instructions. The processing device may execute an operating system (OS) and one or more software applications running on the operating system. The processing device may also access, store, manipulate, process, and generate data in response to the execution of software. For ease of understanding, the processing device is sometimes described as being used singly, but those skilled in the art will recognize that the processing device may include a plurality of processing elements and/or a plurality of types of processing elements. For example, the processing device may include a plurality of processors, or one processor and one controller. Other processing configurations, such as parallel processors, are also possible.

The software may include a computer program, code, instructions, or a combination of one or more of these, and may configure the processing device to operate as desired, or may instruct the processing device independently or collectively. The software and/or data may be embodied, permanently or temporarily, in any type of machine, component, physical device, virtual equipment, computer storage medium or device, or transmitted signal wave, so as to be interpreted by the processing device or to provide instructions or data to the processing device. The software may be distributed over networked computing devices and stored or executed in a distributed manner. The software and data may be stored on one or more computer-readable recording media.

The method according to an embodiment may be implemented in the form of program instructions that can be executed through various computer means and recorded on a computer-readable medium. The computer-readable medium may include program instructions, data files, data structures, and the like, alone or in combination. The program instructions recorded on the medium may be those specially designed and configured for the embodiments, or those known and available to those skilled in the art of computer software. Examples of computer-readable media include magnetic media such as hard disks, floppy disks, and magnetic tape; optical media such as CD-ROMs and DVDs; magneto-optical media such as floptical disks; and hardware devices specially configured to store and execute program instructions, such as ROM, RAM, and flash memory. Examples of program instructions include not only machine language code, such as that produced by a compiler, but also high-level language code that can be executed by a computer using an interpreter or the like. The hardware devices described above may be configured to operate as one or more software modules to perform the operations of the embodiments, and vice versa.

While the present invention has been particularly shown and described with reference to exemplary embodiments thereof, the invention is not limited to the disclosed embodiments. For example, appropriate results may be achieved even if the described techniques are performed in a different order from the described method, and/or components of the described systems, structures, devices, circuits, and the like are combined or coupled in a form different from that described, or are replaced or substituted by other components or equivalents.

Therefore, other implementations, other embodiments, and equivalents to the claims are also within the scope of the following claims.

Claims (17)

What is claimed is: 1. A method for providing time information through a tactile interface device, the method being implemented in a computing device comprising a processor, and the tactile interface device being connected to the computing device and interacting with a user,
The computing device includes: a time notification module for providing time information; And a TUI module that converts information that can be displayed graphically into a form that can be displayed in a tactile graphics on a tactile interface device,
The TUI module comprises: a first output signal generator capable of converting a display screen into an output signal to a tactile interface device according to a first method; And a second output signal generation section capable of converting a display screen into an output signal to a tactile interface device in a second method,
A time notification step of executing a time notification module in the computing device and outputting information including time information to a visual display device connected to the computing device;
Wherein the TUI module is executed in the computing device to generate an output signal for implementing a tactile screen of the tactile interface device corresponding to a screen displayed on the display device by the time notification module, A first TUI step of generating an input signal for input to the time notification module from a user input,
Wherein the first TUI step includes a time notification output signal generation step of generating a time notification output signal for implementing a tactile screen of the tactile interface device corresponding to a screen that can be displayed on the computing device by the time notification module, in the method for providing time information via the tactile interface device.
The method according to claim 1,
The time informing step includes:
A first time representation step of generating first time data expressing combined information of at least two of day, hour, minute, and second;
A second time representation step of generating second time data representing each piece of information of at least one of day, hour, minute, and second,
The time notification output signal generation step may include:
A first time output signal generation step of generating, by the first output signal generation unit of the TUI module, a first time output signal for controlling the tactile cell array of the tactile display device so as to express the first time data in braille form; and
A second time output signal generation step of generating, by the second output signal generation unit of the TUI module, a second time output signal for controlling the tactile cell array of the tactile display device so as to express the second time data in graphic form,
Wherein the time announcement output signal comprises the first time output signal and the second time output signal.
The method of claim 2,
Wherein the time announcement output signal comprises a control signal for a two-dimensional tactile cell array, and the tactile screen of the tactile display device implemented by the time announcement output signal comprises: a content area implemented based on a content part of the screen that can be displayed on the computing device by the time notification module; and a cursor area for implementing the position and shape of the current user's cursor,
Wherein the first time output signal and the second time output signal are implemented in the content area.
The method of claim 3,
The content area may be divided into a plurality of sub content areas,
The cursor area may be divided into a plurality of sub-cursor areas,
Wherein a position of a cursor implemented in the cursor area corresponds to a position of the focused sub content area,
Wherein the sub-cursor area and the sub-content area corresponding to the sub-cursor area are aligned on a single axis,
Wherein the first time output signal is displayed in one sub content area,
Wherein in the second time output signal, each one or more of day, hour, minute, and second is displayed in one sub-content area.
The method according to claim 1,
The method comprises:
An alarm step in which an alarm module is executed within the computing device to output information including an alarm setting to a visual display device connected to the computing device; And
Wherein the TUI module is executed in the computing device to generate an output signal for realizing a tactile screen of the tactile interface device corresponding to a screen displayed on the display device by the alarm module, Further comprising a second TUI step of generating an input signal for input to the alarm module from a user input,
Wherein the second TUI step includes an alarm output signal generation step of generating an alarm output signal for implementing a tactile screen of the tactile interface device corresponding to a screen that can be displayed on the computing device by the alarm module, in the method for providing time information via a tactile interface device.
The method of claim 5,
Wherein the alarm output signal includes a control signal for a two-dimensional tactile cell array, and the tactile screen of the tactile display device implemented by the alarm output signal comprises: a content area implemented based on a content part of the screen that can be displayed on the computing device by the alarm module; and a cursor area that implements the position and shape of the current user's cursor.
The method of claim 6,
The alarming step comprises:
An alarm status display step of generating alarm status data expressing the currently set alarm status; And
And an alarm setting input step of setting or changing an alarm status according to an input to the tactile interface device of the user.
The method of claim 7,
Wherein the alarm setting input step comprises:
an alarm braille input step of setting or changing an alarm status item by the user's braille input to the tactile interface device, or an alarm keypad input step of setting or changing an alarm status item by the user's keypad input to the tactile interface device, wherein the time information is provided through the tactile interface device.
The method according to claim 1,
The method comprises:
A timer step in which the timer module is executed in the computing device and outputs information including a timer status and a timer setting to a visual display device connected to the computing device; And
Wherein the TUI module is executed in the computing device to generate an output signal for implementing a tactile screen of the tactile interface device corresponding to a screen displayed on the display device by the timer module, Further comprising a third TUI step of generating an input signal for input to the timer module from a user input,
Wherein the third TUI step includes a timer output signal generation step of generating a timer output signal for implementing a tactile screen of the tactile interface device corresponding to a screen that can be displayed on the computing device by the timer module, in the method for providing time information via a tactile interface device.
The method of claim 9,
Wherein the timer output signal includes a control signal for a two-dimensional tactile cell array, and the tactile screen of the tactile display device implemented by the timer output signal comprises: a content area implemented based on a content part of the screen that can be displayed on the computing device by the timer module; and a cursor area that implements the position and shape of the current user's cursor.
The method of claim 10,
Wherein the timer step comprises:
A timer status presentation step of generating timer data representing real-time information of at least one of hour, minute, and second of currently set timer information;
The timer output signal generation step includes:
a step in which the second output signal generation unit of the TUI module generates a timer output signal for controlling the tactile cell array of the tactile interface device so as to display the timer information in graphic form, in the method for providing time information via a tactile interface device.
The method of claim 11,
Wherein the timer output signal includes a control signal for a two-dimensional tactile cell array, and the tactile screen of the tactile display device implemented by the timer output signal comprises: a content area implemented based on a content part of the screen that can be displayed on the computing device by the timer module; and a cursor area for implementing the position and shape of the current user's cursor,
Wherein the timer output signal is implemented in the content area.
The method of claim 12,
The content area may be divided into a plurality of sub content areas,
The cursor area may be divided into a plurality of sub-cursor areas,
Wherein a position of a cursor implemented in the cursor area corresponds to a position of the focused sub content area,
Wherein the sub-cursor area and the sub-content area corresponding to the sub-cursor area are aligned on a single axis,
Wherein in the timer output signal, each one or more of hour, minute, and second is displayed in one sub-content area.
The method according to claim 1,
The first TUI step
A buffer output signal generation step of generating a buffer output signal based on the time announcement output signal and the previous tactile display information of the tactile interface device,
Wherein the output signal and the buffer output signal comprise control signals for a two-dimensional haptic cell array.
15. The method of claim 14,
Wherein the first TUI step is performed each time a screen that can be displayed on the computing device is changed by the time notification module,
Wherein the buffer output signal generation step generates a buffer output signal based on a difference between the immediately previous output signal and the current modified output signal.
A non-transitory computer-readable medium for controlling a tactile interface device connected to a computing device to provide time information to a blind person, the non-transitory computer-readable medium storing instructions for causing the computing device to perform the following steps:
The steps include:
A time notification step of outputting information including time information to a visual display device connected to the computing device;
A first TUI step of generating an output signal for implementing a tactile screen of the tactile interface device corresponding to the screen displayed on the display device by the time notification step, and of generating, from a user input to the tactile interface device, an input signal for input to the computing device,
Wherein the first TUI step includes a time announcement output signal generation step of generating a time announcement output signal for implementing a tactile screen of the tactile interface device corresponding to a screen that can be displayed on the computing device by the time notification step.
17. The non-transitory computer-readable medium of claim 16,
The time informing step includes:
A first time representation step of generating first time data expressing combined information of at least two of day, hour, minute, and second;
A second time representation step of generating second time data representing each piece of information of at least one of day, hour, minute, and second,
Wherein the first TUI step comprises:
A first time output signal generating step of generating a first time output signal for controlling the tactile cell array of the tactile display device so that the first time data can be expressed in braille;
And a second time output signal generation step of generating a second time output signal for controlling the tactile cell array of the tactile display device so that the second time data can be expressed in a graphical form,
Wherein the time announcement output signal comprises the first time output signal and the second time output signal.
KR1020170098689A 2017-08-03 2017-08-03 Method, Device, and Non-transitory Computer-Readable Medium for Providing Time Information By Tactile Interface Device KR101864584B1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
KR1020170098689A KR101864584B1 (en) 2017-08-03 2017-08-03 Method, Device, and Non-transitory Computer-Readable Medium for Providing Time Information By Tactile Interface Device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
KR1020170098689A KR101864584B1 (en) 2017-08-03 2017-08-03 Method, Device, and Non-transitory Computer-Readable Medium for Providing Time Information By Tactile Interface Device

Publications (1)

Publication Number Publication Date
KR101864584B1 true KR101864584B1 (en) 2018-06-07

Family

ID=62621877

Family Applications (1)

Application Number Title Priority Date Filing Date
KR1020170098689A KR101864584B1 (en) 2017-08-03 2017-08-03 Method, Device, and Non-transitory Computer-Readable Medium for Providing Time Information By Tactile Interface Device

Country Status (1)

Country Link
KR (1) KR101864584B1 (en)

Cited By (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR101975761B1 (en) * 2018-07-02 2019-05-07 이단경 Braille input and output system for mobile device
KR102008844B1 (en) * 2019-03-14 2019-08-12 주식회사 피씨티 Method, Device, and Non-transitory Computer-Readable Medium for Providing Calculator Function By Tactile Interface Device
KR102078354B1 (en) * 2019-03-14 2020-02-17 주식회사 피씨티 Method, Device, and Non-transitory Computer-Readable Medium for Providing E-mail Management Function By Tactile Interface Device
KR102078360B1 (en) * 2019-03-14 2020-04-07 주식회사 피씨티 Method, Device, and Non-transitory Computer-Readable Medium for Providing Braille Application Management Function By Tactile Interface Device
KR102109005B1 (en) * 2019-03-15 2020-05-11 가천대학교 산학협력단 Method and System for Providing CardGame Function By Tactile Interface Device
KR102099616B1 (en) * 2019-03-14 2020-05-15 가천대학교 산학협력단 Tactile Display Tablet

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP3171265U (en) * 2011-08-11 2011-10-20 株式会社テクノスジャパン Medication management device
KR20120063982A (en) 2010-12-08 2012-06-18 한국전자통신연구원 Apparatus and method of man-machine interface for invisible user
KR20160111218A (en) * 2015-03-16 2016-09-26 주식회사 인프라웨어 Braille view port display device and method for visual handicapped person
KR101744118B1 (en) * 2016-04-20 2017-06-07 가천대학교 산학협력단 Tactile Interfacing Device and Tactile Interfacing Method

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20120063982A (en) 2010-12-08 2012-06-18 한국전자통신연구원 Apparatus and method of man-machine interface for invisible user
JP3171265U (en) * 2011-08-11 2011-10-20 株式会社テクノスジャパン Medication management device
KR20160111218A (en) * 2015-03-16 2016-09-26 주식회사 인프라웨어 Braille view port display device and method for visual handicapped person
KR101744118B1 (en) * 2016-04-20 2017-06-07 가천대학교 산학협력단 Tactile Interfacing Device and Tactile Interfacing Method

Cited By (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR101975761B1 (en) * 2018-07-02 2019-05-07 이단경 Braille input and output system for mobile device
KR102008844B1 (en) * 2019-03-14 2019-08-12 주식회사 피씨티 Method, Device, and Non-transitory Computer-Readable Medium for Providing Calculator Function By Tactile Interface Device
KR102078354B1 (en) * 2019-03-14 2020-02-17 주식회사 피씨티 Method, Device, and Non-transitory Computer-Readable Medium for Providing E-mail Management Function By Tactile Interface Device
KR102078360B1 (en) * 2019-03-14 2020-04-07 주식회사 피씨티 Method, Device, and Non-transitory Computer-Readable Medium for Providing Braille Application Management Function By Tactile Interface Device
KR102099616B1 (en) * 2019-03-14 2020-05-15 가천대학교 산학협력단 Tactile Display Tablet
KR102109005B1 (en) * 2019-03-15 2020-05-11 가천대학교 산학협력단 Method and System for Providing CardGame Function By Tactile Interface Device

Similar Documents

Publication Publication Date Title
KR101864584B1 (en) Method, Device, and Non-transitory Computer-Readable Medium for Providing Time Information By Tactile Interface Device
CN110554828B (en) Accessing a system user interface on an electronic device
CN106662966B (en) Multi-dimensional object rearrangement
KR102157759B1 (en) Devices, methods, and graphical user interfaces for wireless pairing with peripheral devices and displaying status information concerning the peripheral devices
CN107491181B (en) Dynamic phrase extension for language input
US10845880B2 (en) Method, device, and computer-readable medium for controlling tactile interface device interacting with user
KR101893014B1 (en) Method, Device, and Non-transitory Computer-Readable Medium for Controlling Tactile Interface Device
JP2022188134A (en) Context specific user interface
US10891875B2 (en) Method, device, and non-transitory computer-readable medium for controlling tactile interface device
KR102120451B1 (en) Method, Device, and Computer-Readable Medium for Providing Internet Browsing Service by Tactile Interface Device
KR102055696B1 (en) System, Method, and Non-transitory Computer-Readable Medium for Providing Messenger By Tactile Interface Device
KR102108998B1 (en) System, Method, and Non-transitory Computer-Readable Medium for Providing Word Processor By Tactile Interface Device
KR20170128196A (en) Method, Device, and Computer-Readable Medium for Controlling Tactile Interface Device
KR102008844B1 (en) Method, Device, and Non-transitory Computer-Readable Medium for Providing Calculator Function By Tactile Interface Device
KR102187871B1 (en) Method, Device, and Non-transitory Computer-Readable Medium for Providing Braille Education Support Function By Tactile Interface Device
KR102066123B1 (en) System, Method, and Non-transitory Computer-Readable Medium for Providing Book Information By Tactile Interface Device
KR101935543B1 (en) Method, Device, and Computer-Readable Medium for Controlling Tactile Interface Device
KR20220115659A (en) System for Sending Documents Containing Private Information for Blind User
KR102078363B1 (en) Method, Device, and Non-transitory Computer-Readable Medium for Providing Image Viewer Function By Tactile Interface Device
KR102109005B1 (en) Method and System for Providing CardGame Function By Tactile Interface Device
KR102078360B1 (en) Method, Device, and Non-transitory Computer-Readable Medium for Providing Braille Application Management Function By Tactile Interface Device
KR102078354B1 (en) Method, Device, and Non-transitory Computer-Readable Medium for Providing E-mail Management Function By Tactile Interface Device
KR102261668B1 (en) Method, Device, and Non-transitory Computer-Readable Medium for Providing Braille Graphic Education Support Function By Tactile Interface Device
KR102668131B1 (en) Devices, methods, and graphical user interfaces for wireless pairing with peripheral devices and displaying status information concerning the peripheral devices
KR20220115660A (en) Method, Device, and Non-transitory Computer-Readable Medium for Providing Braille Education Support Function By Tactile Interface Device

Legal Events

Date Code Title Description
GRNT Written decision to grant