US20190012079A1 - Input Assistance Device, Smart Phone, and Input Assistance Method - Google Patents

Input Assistance Device, Smart Phone, and Input Assistance Method

Info

Publication number
US20190012079A1
US20190012079A1 (application US16/131,687)
Authority
US
United States
Prior art keywords
input
word
pattern image
pattern
character
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Abandoned
Application number
US16/131,687
Other languages
English (en)
Inventor
Kosuke Onoyama
Takashi Ozaki
Hidetake Ogino
Makoto Kimura
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Yamaha Corp
Original Assignee
Yamaha Corp
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Yamaha Corp filed Critical Yamaha Corp
Assigned to YAMAHA CORPORATION reassignment YAMAHA CORPORATION ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS). Assignors: OGINO, Hidetake, KIMURA, MAKOTO, ONOYAMA, KOSUKE, OZAKI, TAKASHI
Publication of US20190012079A1 publication Critical patent/US20190012079A1/en

Classifications

    • G06F 3/04886 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, using a touch-screen or digitiser, by partitioning the display area of the touch-screen or the surface of the digitising tablet into independently controllable areas, e.g. virtual keyboards or menus
    • G06F 17/276
    • G06F 3/0233 Character input methods
    • G06F 3/0237 Character input methods using prediction or retrieval techniques
    • G06F 3/0482 Interaction with lists of selectable items, e.g. menus
    • G06F 40/274 Converting codes to words; Guess-ahead of partial word inputs
    • H04M 1/724 User interfaces specially adapted for cordless or mobile telephones
    • H04M 1/72519

Definitions

  • the present disclosure relates to a technology for assisting input of various types of data to an electronic device.
  • a virtual keyboard called a software keyboard is generally used as an input unit to input words indicating various types of data.
  • the virtual keyboard is realized by displaying, in a display screen of a touch panel, pattern images (hereinafter referred to as virtual operators) of operators corresponding to characters and various types of symbols (such as alphabet letters, syllabary characters, numbers, and arithmetic symbols).
  • a keyboard (hereinafter, referred to as a full-size keyboard) having about 80 to 110 keys is generally used as an input unit.
  • In many cases, it is difficult to provide a virtual keyboard with a sufficient number of virtual operators, like the full-size keyboard, due to restrictions such as the narrowness of a display screen. Therefore, various techniques have been proposed to enable input of words and the like even though the virtual operators are few. As an example of such techniques, there is the technique disclosed in JP-A-2015-228154 as Patent Literature 1.
  • the display content is switched to a display of the virtual operators corresponding to the respective characters of the column of “あ (a)”.
  • the user recognizes the screen illustrated in FIG. 7B and touches the virtual operator corresponding to the character “う (u)” to input the character.
  • the “か (ka)” column consists of the five characters “か (ka)”, “き (ki)”, “く (ku)”, “け (ke)”, and “こ (ko)”, in this order.
  • when the virtual operator corresponding to the character “か (ka)” is touched by the fingertip or the like, the respective virtual operators corresponding to the characters “き (ki)”, “く (ku)”, “け (ke)”, and “こ (ko)” are displayed above, below, and to the left and right of it (see FIG. 8B ), and the user slides the fingertip or the like (flicking operation) in the direction of the virtual operator corresponding to the character “く (ku)”.
  • Patent Literature 1 JP-A-2015-228154
  • a non-limited object of the present invention is to provide a technology that enables efficient input of words indicating various types of data to an electronic device which uses a virtual keyboard as an input device.
  • an input assistance device includes at least one memory storing instructions; and at least one processor configured to implement the stored instructions to execute a plurality of tasks.
  • the plurality of tasks includes a display control task which causes a display device to display a plurality of first pattern images each corresponding to a different character and, triggered when any one of the plurality of first pattern images is designated, to display a plurality of second pattern images, each corresponding to a word related to the character of the designated first pattern image, around the designated first pattern image as a reference, so as to prompt an input of a word.
  • a smart phone having functions of the input assistance device.
  • an input assistance method includes displaying a plurality of first pattern images each corresponding to a different character by a display device, and displaying, triggered when any one of the plurality of first pattern images is designated, a plurality of second pattern images, each corresponding to a word related to the character of the designated first pattern image, around the designated first pattern image as a reference, so as to prompt an input of a word.
  • FIG. 1 is a perspective view illustrating an exterior of an electronic device 10 according to an embodiment of the present invention
  • FIG. 2 is a block diagram illustrating an exemplary configuration of the electronic device 10 according to an embodiment of the present invention
  • FIG. 3 is a diagram illustrating an example of a management table which is stored in a nonvolatile storage unit 134 of the electronic device 10 according to an embodiment of the present invention
  • FIG. 4 is a diagram illustrating an example of a screen which is displayed according to a setting assistance program in a display unit 120 a by a control unit 100 of the electronic device 10 according to an embodiment of the present invention
  • FIG. 5 is a flowchart illustrating a flow of an input assistance process which is performed according to an input assistance program by the control unit 100 according to an embodiment of the present invention
  • FIGS. 6A and 6B are diagrams illustrating an example of a candidate selection screen which is displayed in the display unit 120 a by the control unit 100 in order to prompt input of a word in the input assistance process according to an embodiment of the present invention
  • FIGS. 7A and 7B are diagrams illustrating an example of a virtual keyboard of the related art.
  • FIGS. 8A and 8B are diagrams illustrating an example of a virtual keyboard of the related art.
  • FIG. 1 is a perspective view illustrating an outline of an electronic device 10 according to an embodiment of the present invention.
  • the electronic device 10 is, for example, a tablet terminal, and includes a user IF unit 120 such as a touch panel.
  • a user of the electronic device 10 can enter various types of inputs by touching the user IF unit 120 .
  • the electronic device 10 stores a setting assistance program to perform various types of settings (for example, settings of a filtering condition) on a network device such as a router.
  • the user of the electronic device 10 can connect the electronic device 10 through a communication cable to the network device which is a target of a setting work (hereinafter referred to as a setting target device), and can perform the setting work on the setting target device according to the setting assistance program.
  • In the following, description will be given of a case where the electronic device 10 has a wired connection to the setting target device. However, a wireless connection may be employed.
  • the setting work is realized by inputting various types of commands and causing the electronic device 10 to operate according to the command.
  • the command input to the electronic device 10 is realized by inputting a character string indicating a command or an argument thereof (hereinafter, both will be collectively referred to as “command character string”) through an operation on a virtual keyboard displayed in the user IF unit 120 .
  • the electronic device 10 of the embodiment includes a display control unit which controls displaying of various types of screens to prompt the user to input the command character string. Therefore, the user can input the command character string more efficiently than in the related art.
  • FIG. 2 is a block diagram illustrating an exemplary configuration of the electronic device 10 .
  • the electronic device 10 includes a control unit 100 , a communication IF unit 110 , a storage unit 130 , and a bus 140 through which data is exchanged between these various types of elements, in addition to the user IF unit 120 .
  • the control unit 100 is, for example, a CPU (Central Processing Unit).
  • the control unit 100 supports the setting work by executing the setting assistance program.
  • the setting assistance program is stored in the storage unit 130 (more specifically, a nonvolatile storage unit 134 ).
  • the setting assistance program includes an input assistance program which causes the control unit 100 to support the input of the command character string.
  • the communication IF unit 110 is, for example, an NIC (Network Interface Card).
  • the communication IF unit 110 is connected to the setting target device through the communication cable for example.
  • the communication IF unit 110 passes data received from the setting target device through the communication cable to the control unit 100 , and on the other hand, transmits the data received from the control unit 100 to the setting target device through the communication cable.
  • a wireless LAN IF wirelessly communicating with an access point of a wireless LAN may be used as a communication IF unit 110 .
  • the user IF unit 120 includes a display unit 120 a and an operation input unit 120 b as illustrated in FIG. 2 .
  • the display unit 120 a is a drive circuit which performs a drive control with respect to a display device such as a liquid crystal display (not illustrated in FIG. 2 ).
  • the display unit 120 a displays images indicating various types of screens under the control of the control unit 100 . As an example of a screen displayed in the display unit 120 a, there is a screen to prompt the user to perform the setting work.
  • the operation input unit 120 b is a transparent position detecting sensor in a sheet shape which is provided to cover the display screen of the display unit 120 a.
  • the position detecting method of the position detecting sensor may be an electrostatic capacitive type or an electromagnetic induction type.
  • the operation input unit 120 b forms a touch panel together with the display unit 120 a. The user may perform various types of input operations by touching the operation input unit 120 b using a touch pen or a fingertip, or by moving the fingertip while touching (a flicking operation).
  • the operation input unit 120 b passes operation content data (for example, coordinate data of a touch position in a two-dimensional coordinate space with the upper left corner or the like of the display screen of the display unit 120 a as the origin) indicating the touch position or the trajectory of a flicking operation of the user's fingertip to the control unit 100 . Therefore, the user's operation content is transferred to the control unit 100 .
  • the storage unit 130 includes a volatile storage unit 132 and the nonvolatile storage unit 134 .
  • the volatile storage unit 132 is, for example, a RAM (Random Access Memory).
  • the volatile storage unit 132 is used by the control unit 100 as a work area when various types of programs are executed.
  • the nonvolatile storage unit 134 is, for example, a flash ROM (Read Only Memory) or a hard disk.
  • Various types of programs are stored in the nonvolatile storage unit 134 .
  • As specific examples of the programs stored in the nonvolatile storage unit 134 , there are a kernel which realizes an OS (Operating System) in the control unit 100 , a web browser, a mailer, and the setting assistance program described above.
  • the setting assistance program includes the input assistance program and a management table.
  • FIG. 3 is a diagram illustrating an example of the management table. As illustrated in FIG. 3 , in the management table, all the available command character string data, indicating each command and argument thereof for the setting work, are grouped by head character. In addition, each command character string data is associated with subsequent character string data indicating other command character strings (hereinafter referred to as subsequent character strings) which can follow, with a space interposed, the command character string indicated by that command character string data. For example, in a case where the word indicated by the command character string data is a command, the subsequent character string data associated with the command character string data indicates an argument which can be designated for the command.
  • the command character string data corresponding to each head character is stored in a descending order of the frequency of use in the setting work.
  • the subsequent character string data corresponding to each command character string data is also stored in a descending order of the frequency of use as an argument in the setting work.
  • the frequency of use in the setting work may be obtained using statistics, for example.
  • In the embodiment, the command character string data and the subsequent character string data are stored in the management table in descending order of the frequency of use in the setting work, but they may instead be stored in dictionary order, such as alphabetical order.
  • Priority data indicating a priority corresponding to an order of the frequency of use or the dictionary order may be stored in the management table in association with each of the command character string data and the subsequent character string data.
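For illustration only, the management table can be modeled as a simple in-memory structure. The following Python sketch is a hypothetical rendering, not code from the patent: the table name, the accessor functions, and all entries other than “show”, “save”, “set”, and the arguments of “show” (which appear in FIGS. 6A and 6B) are assumptions.

```python
# Hypothetical in-memory model of the management table: command character
# strings grouped by head character, each paired with its subsequent
# character strings, both kept in descending order of frequency of use.
MANAGEMENT_TABLE: dict[str, list[tuple[str, list[str]]]] = {
    "s": [
        ("show", ["account", "arp", "log", "status", "config"]),
        ("save", []),
        ("set", []),
        # ... remaining commands starting with "s"
    ],
    # ... groups for the other head characters
}

def candidates_for_head_character(head: str) -> list[str]:
    """Command character strings starting with `head`, most frequent first."""
    return [command for command, _ in MANAGEMENT_TABLE.get(head, [])]

def subsequent_candidates(command: str) -> list[str]:
    """Subsequent character strings (e.g. arguments) stored for `command`."""
    for group in MANAGEMENT_TABLE.values():
        for name, followers in group:
            if name == command:
                return followers
    return []
```

Under these assumptions, candidates_for_head_character("s") returns the candidates presented when the “s” key is touched, and subsequent_candidates("show") returns the five argument candidates illustrated in FIG. 6B.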
  • the control unit 100 reads the kernel from the nonvolatile storage unit 134 to the volatile storage unit 132 by being triggered when the electronic device 10 is powered on (not illustrated in FIG. 2 ), and starts the operation of the kernel.
  • the control unit 100 which operates according to the kernel and has an OS realized therein, can perform another program according to an instruction issued through the operation input unit 120 b. For example, when a web browser is instructed to be executed through the operation input unit 120 b, the control unit 100 reads the web browser from the nonvolatile storage unit 134 to the volatile storage unit 132 , and starts the operation of the web browser. Similarly, when a setting assistance program is instructed to be executed through the operation input unit 120 b, the control unit 100 reads the setting assistance program from the nonvolatile storage unit 134 to the volatile storage unit 132 , and starts the operation of the setting assistance program.
  • the control unit 100 which operates according to the setting assistance program, first causes the display unit 120 a to display a command input screen A 01 (see FIG. 4 ) which displays a command prompt (“#” in the example illustrated in FIG. 4 ) to prompt the user to input a command. Further, the control unit 100 starts the input assistance program to support a command input.
  • the control unit 100 operating in accordance with the input assistance program serves as the display control unit described above.
  • the input assistance process, that is, the process performed by the display control unit, includes the following two steps. This two-step process is the featured point of the embodiment.
  • In a first step, a plurality of pattern images each corresponding to a different character are displayed in the display unit 120 a to prompt the user to designate the head character of a desired command character string.
  • In a second step, triggered when one of the plurality of pattern images displayed in the display unit 120 a in the first step is designated, a plurality of pattern images (hereinafter, the pattern images displayed in the first step will be referred to as “first pattern images”, and the pattern images displayed in the second step will be referred to as “second pattern images”) each corresponding to a command character string starting from the character corresponding to the designated pattern image are displayed with the first pattern image as a reference (the center in this example, but not limited thereto), and the user is prompted to input the command character string.
  • Hereinafter, the input assistance process, which most clearly shows the feature of the embodiment, will be described in detail.
  • FIG. 5 is a flowchart illustrating a flow of the input assistance process.
  • First, the control unit 100 displays the virtual keyboard A 02 , in addition to the command input screen A 01 (see FIG. 4 ), in the display unit 120 a to prompt the user to input a command (Step SA 100 ).
  • the process of Step SA 100 is the first step.
  • a plurality of virtual operators are provided in the virtual keyboard A 02 .
  • the plurality of virtual operators provided in the virtual keyboard are roughly classified into virtual operators corresponding to the characters of alphabets (the first pattern image; hereinafter referred to as a character input key) and other virtual operators.
  • Examples of the other virtual operators include a virtual operator for inputting a special character such as a space (in the example illustrated in FIG. 4 , the virtual operator assigned the character string “SPACE”) and a virtual operator for switching to number input (in the example illustrated in FIG. 4 , the virtual operator assigned the character string “123”).
  • the user who views the virtual keyboard A 02 can input the head character of a desired command character string by performing a touch operation on the corresponding character input key.
  • the operation input unit 120 b passes the operation content data indicating the touch position to the control unit 100 .
  • In Step SA 110 , the control unit 100 waits for the operation input unit 120 b to pass the operation content data.
  • When the operation content data is passed, the control unit 100 determines the operation content of the user with reference to the operation content data. More specifically, the control unit 100 determines whether the coordinate position indicated by the operation content data passed from the operation input unit 120 b corresponds to one of the character input keys or to one of the other virtual operators. In the former case, the control unit 100 determines that the touch operation was performed on a character input key; in the latter case, that it was performed on one of the other virtual operators.
  • In a case where the coordinate position indicated by the operation content data passed from the operation input unit 120 b is neither a position of a character input key nor a position of one of the other virtual operators, the control unit 100 considers the touch operation an invalid operation and waits for input again.
  • In a case where the touch operation was performed on one of the other virtual operators, the control unit 100 determines in Step SA 170 whether the operation is an instruction to end the setting assistance program.
  • In a case where the determination result is “Yes”, the command input screen A 01 and the virtual keyboard A 02 are deleted from the display screen of the display unit 120 a , and the input assistance program and the setting assistance program are ended.
  • In a case where the determination result is “No”, the control unit 100 performs a process in accordance with the operation content (Step SA 180 ), and performs the process of Step SA 110 again. For example, in a case where there is a touch operation on the virtual operator for switching to number input, the control unit 100 switches the virtual keyboard A 02 to a virtual keyboard for number input in Step SA 180 , and performs the process of Step SA 110 again.
  • In Step SA 120 , the control unit 100 narrows down the candidates of the user's input command character string from the user's operation content, presents the candidates to the user, and waits for a user's operation.
  • the process of Step SA 120 is the second step.
  • More specifically, in Step SA 120 , the control unit 100 specifies the character corresponding to the character input key touched by the user, reads from the management table the command character string data indicating the command character strings starting from that character, and presents the command character strings indicated by the command character string data as the candidates of the user's input command character string.
  • That is, the control unit 100 causes the display unit 120 a to display approximate fan-shaped pattern images (the second pattern images), each assigned a command character string indicated by the command character string data read out of the management table in the above manner.
  • In the embodiment, the control unit 100 causes the display unit 120 a to display a predetermined number of the approximate fan-shaped pattern images in a clockwise direction from the 9 o'clock position, with the touched character input key as the center. For example, in a case where the character designated by the touch operation is “s”, the image surrounding the virtual operator corresponding to the character “s” is updated as illustrated in FIG. 6A .
  • In the embodiment, the predetermined number of pattern images is five.
  • However, the number of command character strings starting from the designated character may be six or more.
  • In this case, the pattern images corresponding to the sixth and subsequent command character strings may be displayed by scrolling, triggered when the lower end of the second pattern image corresponding to the lowest-priority command character string (“set” in the example illustrated in FIG. 6A ) among the second pattern images displayed in the display unit 120 a is flicked (see arrow C 3 in FIG. 6A ).
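The placement rule described above, a fixed number of approximate fan-shaped images laid out clockwise from the 9 o'clock position around the touched key, can be sketched geometrically as follows. This is a minimal illustration; the function name, the equal angular spacing, and the radius parameter are assumptions, not details from the patent.

```python
import math

def fan_positions(center_x: float, center_y: float, radius: float,
                  count: int = 5) -> list[tuple[float, float]]:
    """Centers of `count` second pattern images laid out clockwise from the
    9 o'clock position around the touched character input key.

    Assumes equal angular spacing; in screen coordinates (y grows downward),
    increasing the mathematical angle moves clockwise on the screen.
    """
    start = math.pi  # 9 o'clock position
    step = 2 * math.pi / count
    return [(center_x + radius * math.cos(start + i * step),
             center_y + radius * math.sin(start + i * step))
            for i in range(count)]
```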
  • the pattern image assigned the character string “back” in FIG. 6A is a virtual operator with which the user can cancel an input.
  • the pattern image assigned the character string “help” is a virtual operator with which the user can view a help screen.
  • the pattern image assigned the character string “confirm” is a virtual operator with which the user can commit the completed input to the command prompt of the command input screen A 01 .
  • the second pattern images corresponding to the command character strings and the second pattern images corresponding to these virtual operators may be displayed in regions separated from each other. In other words, the second pattern images displayed in the separated regions prompt the user to input a word corresponding to a different type of process for each region.
  • the user who recognizes the image illustrated in FIG. 6A may select a desired command character string by a flicking operation on the second pattern image.
  • For example, in a case where the user desires to input “set” as the command character string, the user slides the fingertip touching the virtual operator corresponding to the character “s” toward the pattern image assigned the character string “set”, and then performs an operation of returning to the virtual operator corresponding to the character “s” (the flicking operation illustrated by the trajectory with arrow C 1 in FIG. 6A ), so as to select the character string “set”.
  • In a case where the fingertip passes over a plurality of second pattern images before returning, the command character string (“save” in this example) corresponding to the second pattern image passed immediately before returning to the first pattern image is selected.
  • To indicate the current selection, the second pattern image where the fingertip of the user is located may be displayed in inverse video.
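The selection rule just described, in which the candidate whose image the fingertip passed immediately before returning to the first pattern image wins, might be hit-tested as in the following sketch. The Region type and the rectangular hit areas are assumptions made for illustration; the patent draws the images as approximate fan shapes.

```python
from dataclasses import dataclass

@dataclass
class Region:
    """Hypothetical rectangular hit area of one pattern image."""
    label: str
    x0: float
    y0: float
    x1: float
    y1: float

    def contains(self, x: float, y: float) -> bool:
        return self.x0 <= x <= self.x1 and self.y0 <= y <= self.y1

def selected_candidate(trajectory: list[tuple[float, float]],
                       key_region: Region,
                       candidate_regions: list[Region]) -> str | None:
    """Word selected by a flick that leaves the character input key, passes
    over second pattern images, and returns to the key."""
    last_candidate = None
    left_key = False
    for x, y in trajectory:
        if key_region.contains(x, y):
            if left_key and last_candidate is not None:
                return last_candidate  # fingertip returned: selection fixed
        else:
            left_key = True
            for region in candidate_regions:
                if region.contains(x, y):
                    last_candidate = region.label
                    break
    return None  # the fingertip never returned to the key
```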
  • In Step SA 130 , the control unit 100 determines, with reference to the operation content data passed from the operation input unit 120 b , whether the user has selected a candidate. In a case where the determination result of Step SA 130 is “Yes”, the control unit 100 inputs the command character string selected by the user to the command prompt (also referred to as a command line) of the command input screen A 01 (Step SA 140 ), and performs the process of Step SA 120 again. However, in Step SA 120 performed after Step SA 140 , the control unit 100 reads the subsequent character string data stored in the management table in association with the command character string data indicating the command character string selected immediately before, and presents the command character strings indicated by the subsequent character string data as the candidates of the command character string which the user inputs next.
  • For example, assume that the command character string selected by the flicking operation is “show”.
  • In the management table, the subsequent character string data indicating the character strings “account”, “arp”, “log”, “status”, and “config” is stored in association with the command character string data which indicates the command character string “show”. Therefore, the control unit 100 displays the pattern image assigned the command character string “show” at the position of the virtual operator corresponding to the character “s”, displays the pattern images assigned the character strings “account”, “arp”, “log”, “status”, and “config” surrounding it (see FIG. 6B ), and prompts the user to select a command character string following the command character string “show”.
  • In a case where the determination result of Step SA 130 is “No”, the control unit 100 performs a process according to the operation content of the user (Step SA 150 ). For example, in a case where the operation content of the user is a touch operation on the “help” key, the control unit 100 causes the display unit 120 a to display the help screen.
  • In Step SA 160 subsequent to Step SA 150 , in a case where the operation content of the user is a touch operation on the “confirm” key and the determination result is therefore “Yes”, the command input is considered finished, and Step SA 100 and the subsequent processes are performed again.
  • In a case where the determination result of Step SA 160 is “No”, the control unit 100 considers that the command input is ongoing, and performs Step SA 120 and the subsequent processes. Hitherto, the flow of the input assistance process in the embodiment has been described.
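Putting the flowchart together, the loop of Steps SA 100 to SA 180 can be condensed into the following sketch. The user-interface helpers (show_keyboard, wait_for_event, and so on), the event fields, and the table accessors are placeholders assumed for this illustration, not an API described in the patent.

```python
def input_assistance_loop(ui, table):
    """Condensed sketch of Steps SA100-SA180 of FIG. 5 (helper names assumed)."""
    while True:
        ui.show_keyboard()                      # SA100: first pattern images
        event = ui.wait_for_event()             # SA110: wait for a touch
        if event.kind != "character_key":
            if event.is_end_request:            # SA170 "Yes": end the programs
                ui.close()
                return
            ui.handle_other_operation(event)    # SA180, e.g. number keyboard
            continue
        # SA120: present candidates around the touched character input key
        candidates = table.candidates_for_head_character(event.char)
        while True:
            choice = ui.present_candidates(candidates)  # second pattern images
            if choice.kind == "word":                   # SA130 "Yes"
                ui.append_to_prompt(choice.word)        # SA140: echo to prompt
                candidates = table.subsequent_candidates(choice.word)
            else:
                ui.handle_misc_operation(choice)        # SA150, e.g. "help"
                if choice.kind == "confirm":            # SA160 "Yes": finished
                    break  # back to SA100
```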
  • The point to note is that, according to the embodiment, there is no need to input the characters of a command character string one by one, so the time taken to input a command is significantly reduced. Moreover, there is no need to take the fingertip off the operation input unit 120 b from touching the head character of a desired command character string until the command character string is selected by the flicking operation, and further until the subsequent character string of the command character string is selected. Therefore, the number of touches on the operation input unit 120 b is reduced compared to a mode in which the input candidates are displayed in a separate frame, and the input can be made efficiently.
  • the candidates to be presented to the user according to the user's designation of the head character may be narrowed down according to the type of input item to which the word is input instead of being narrowed down according to the type of the application program to which the word is input.
  • For example, the present invention may be applied to input assistance of an address.
  • In this case, the character string data indicating the names of prefectures are classified by head character and stored in the management table.
  • In addition, subsequent character string data indicating the names of the municipalities belonging to the prefecture indicated by each character string data is stored in the management table in association with that character string data.
  • the input assistance program may start by being triggered when the cursor is positioned in an address input column.
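Under the same assumed structure as the management-table sketch earlier, the address variant only swaps the table contents. The sample prefectures and municipalities below are illustrative, not taken from the patent.

```python
# Prefecture names grouped by head character, with the municipalities of each
# prefecture stored as the subsequent character strings (sample entries only).
ADDRESS_TABLE: dict[str, list[tuple[str, list[str]]]] = {
    "t": [
        ("Tokyo", ["Chiyoda", "Shinjuku", "Shibuya"]),
        ("Tochigi", ["Utsunomiya", "Nikko"]),
    ],
    # ... groups for the other head characters
}
```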
  • a presenting order of the candidates to be presented to the user by displaying the second pattern image may be changed depending on the type of application program to which the word is input.
  • Moreover, whether the word candidates are presented and whether the presenting order is changed need not depend only on the type of an application program.
  • The presentation of the word candidates and the presenting order may instead be changed depending on the type of the control target device. For example, a network device and an audio device are different device types. In this way, the operation target may be an application program or a device, and the candidates of words presented, as well as their presenting order, may differ according to the operation target.
  • In the embodiment, the virtual keyboard A 02 is displayed in the display unit 120 a to prompt the user to designate the head character of a desired command character string.
  • When the head character is designated, the plurality of second pattern images which correspond to the command character strings starting from the character are displayed around the first pattern image corresponding to the designated character so as to prompt the user to designate the command character string.
  • However, the control unit 100 may be caused to perform Step SA 140 and the subsequent processes of the flowchart illustrated in FIG. 5 , triggered when any one of the command character strings input to the command input screen A 01 is designated, so as to prompt the user to input the subsequent command character string (or edit the input command character string).
  • For example, the screen illustrated in FIG. 6B may be overlapped with the command input screen A 01 , triggered when an operation of designating the command character string “show” (a touch operation on the corresponding place of the command input screen A 01 ) is performed in a situation where “#show log . . . ” has been input to the command prompt of the command input screen A 01 .
  • In the embodiment, the virtual keyboard A 02 is displayed in the display unit 120 a in order to prompt the user to designate the head character of a desired command character string.
  • Then, the plurality of second pattern images corresponding to the command character strings starting from the character are displayed surrounding the first pattern image corresponding to the designated character so as to prompt the user to designate the command character string.
  • However, the plurality of second pattern images displayed surrounding the user's designated first pattern image as a center may instead correspond to command character strings in which any one of the characters matches the character corresponding to the first pattern image.
  • In other words, the virtual keyboard A 02 may be displayed in the display unit 120 a in order to urge the user to designate a character related to a desired input word.
  • Then, the plurality of second pattern images corresponding to the words related to the character are displayed surrounding the first pattern image corresponding to the designated character, so as to prompt the user to designate the word.
  • The target application of the present invention is not limited to the tablet terminal.
  • The present invention may be applied to any electronic device which uses a virtual keyboard as the input unit, such as a smart phone, a PDA (Personal Digital Assistant), or a portable game console, so that the user can input each word of a command or an address efficiently.
  • In the embodiment, the display control unit which performs the input assistance process (input assistance method) clearly showing the feature of the present invention is configured by a software module.
  • the display control unit may be configured by a hardware module such as an electronic circuit.
  • the electronic circuit may be a circuit configured by an FPGA (Field Programmable Gate Array).
  • An input assistance device having the display control unit may be provided as a single body.
  • an embodiment of the present invention provides an input assistance device which includes the following display control unit.
  • the display control unit performs a process (the process of the first step) of displaying a plurality of first pattern images each corresponding to a different character on the display unit (for example, a display device which serves as a display unit in the electronic device).
  • Triggered when any one of the plurality of first pattern images displayed by the display device is designated, the display control unit performs a process (the process of the second step) of displaying the plurality of second pattern images corresponding to the words related to the character of the designated first pattern image around the first pattern image as a center, so as to prompt the user to input a word.
  • The word corresponding to the second pattern image, that is, the word related to the character of the designated first pattern image, may be a word in which any one of its characters matches the character corresponding to the first pattern image, such as a word starting from the character corresponding to the first pattern image.
  • Hereinafter, a word in which any one of its characters matches the character corresponding to the first pattern image designated by the user is called “a word including the character”.
  • the word related to the character of the designated first pattern image is not limited to the word including the character corresponding to the first pattern image.
  • In a case where the characters corresponding to the plurality of first pattern images are the head characters (that is, “あ (a)”, “か (ka)”, “さ (sa)”, “た (ta)”, “な (na)”, “は (ha)”, “ま (ma)”, “や (ya)”, “ら (ra)”, and “わ (wa)”) of the columns of the Japanese syllabary, a word containing any character of the column corresponding to the character of the designated first pattern image may be set as the related word.
  • For example, when the first pattern image corresponding to “あ (a)” is designated, the second pattern images corresponding to words containing any one of the characters (that is, “あ (a)”, “い (i)”, “う (u)”, “え (e)”, and “お (o)”) belonging to the column of “あ (a)” may be displayed surrounding the first pattern image.
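A minimal sketch of this column-based matching, assuming a plain dictionary lookup (the table and function names are hypothetical):

```python
# The designated first pattern image names a column head (e.g. "あ"); a word
# is treated as related if it contains any character of that column.
COLUMNS = {
    "あ": "あいうえお",
    "か": "かきくけこ",
    # ... the remaining columns ("さ", "た", "な", "は", "ま", "や", "ら", "わ")
}

def related_words(column_head: str, dictionary: list[str]) -> list[str]:
    """Words containing at least one character of the designated column."""
    chars = set(COLUMNS.get(column_head, ""))
    return [word for word in dictionary if chars & set(word)]
```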
  • the second pattern images corresponding to the plurality of words related to the character corresponding to the first pattern image are displayed surrounding the first pattern image.
  • the user moves the fingertip toward the second pattern image corresponding to a desired word among the plurality of displayed second pattern images so as to input the word.
  • various modes may be considered for selecting the candidates of the word to prompt the user to select, by being triggered when the touch operation is performed on the first pattern image.
  • the display control unit selects the candidates to be presented to the user by displaying the second pattern images according to the type of the application program of the word input destination or the type of the input item to which the word is input (both will be collectively referred to as the “type of the application of word input destination”). This is because the words input by the user can be considered to be narrowed down to some degree according to the type of the application.
  • For example, in the case of the setting work described above, the word input by the user is considered to be one of the commands (or the arguments thereof).
  • In the case of address input, the words input by the user are considered to be the name of a prefecture and the name of a municipality.
  • the display control unit switches the second pattern image displayed surrounding the first pattern image according to the user's operation.
  • Thereby, the user can input a word efficiently regardless of the restriction.
  • a presenting order of the candidates shown to the user by displaying the second pattern image in the second step may be changed depending on the type of the application of the word input destination.
  • In a more preferable mode, the display control unit in the input assistance device of the present invention performs the following third step in addition to the first and second steps.
  • The display control unit performs the process of the third step, triggered when any one of the plurality of second pattern images displayed by the display device in the second step is selected.
  • In the third step, the display control unit displays a third pattern image corresponding to the selected word at the position of the first pattern image, and displays fourth pattern images, each showing a candidate of a word following the selected word (a subsequent word), around the third pattern image as a reference (the center in this example, but not limited thereto).
  • the input assistance method includes the first step of causing the display device to display the plurality of first pattern images each corresponding to a different character, and the second step performed by being triggered when any one of the plurality of first pattern images displayed by the display device in the first step is designated.
  • In the second step, the plurality of second pattern images each corresponding to a word related to the character of the designated first pattern image are displayed surrounding the first pattern image as a center to prompt the input of a word.
  • the present invention may provide a program causing a general computer such as a CPU to perform the input assistance method, that is, a program causing the CPU to perform the first and second steps.
  • By operating the control unit of an existing tablet terminal or an existing smart phone according to the program, it is possible to improve the efficiency with which words indicating various types of data are input to the existing tablet terminal or the existing smart phone.
  • the program may be distributed by being stored in a computer-readable recording medium such as a CD-ROM (Compact Disk-Read Only Memory) or a flash ROM (Read Only Memory), or may be downloaded through an electronic telecommunication circuit such as the Internet.

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Human Computer Interaction (AREA)
  • Health & Medical Sciences (AREA)
  • Artificial Intelligence (AREA)
  • Audiology, Speech & Language Pathology (AREA)
  • Computational Linguistics (AREA)
  • General Health & Medical Sciences (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Signal Processing (AREA)
  • Input From Keyboards Or The Like (AREA)
  • User Interface Of Digital Computer (AREA)
US16/131,687 2016-03-15 2018-09-14 Input Assistance Device, Smart Phone, and Input Assistance Method Abandoned US20190012079A1 (en)

Applications Claiming Priority (3)

Application Number Priority Date Filing Date Title
JP2016-050417 2016-03-15
JP2016050417A JP6798117B2 (ja) 2016-03-15 2016-03-15 Input assistance device
PCT/JP2017/009945 WO2017159607A1 (fr) 2016-03-15 2017-03-13 Smart phone, input assistance device, and input assistance method therefor

Related Parent Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2017/009945 Continuation WO2017159607A1 (fr) 2016-03-15 2017-03-13 Smart phone, input assistance device, and input assistance method therefor

Publications (1)

Publication Number Publication Date
US20190012079A1 (en) 2019-01-10

Family

ID=59851581

Family Applications (1)

Application Number Title Priority Date Filing Date
US16/131,687 Abandoned US20190012079A1 (en) 2016-03-15 2018-09-14 Input Assistance Device, Smart Phone, and Input Assistance Method

Country Status (4)

Country Link
US (1) US20190012079A1 (fr)
JP (1) JP6798117B2 (fr)
CN (1) CN108700953B (fr)
WO (1) WO2017159607A1 (fr)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP7119857B2 (ja) * 2018-09-28 2022-08-17 Fujitsu Limited Editing program, editing method, and editing device

Family Cites Families (21)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH07306847A (ja) * 1994-05-12 1995-11-21 Sharp Corp Computer operation support device
JPH1027089A (ja) * 1996-07-11 1998-01-27 Fuji Xerox Co Ltd Computer operation support device
JP2002014954A (ja) * 2000-06-28 2002-01-18 Toshiba Corp Chinese input conversion processing device, Chinese input conversion processing method, and recording medium
JP2002351600A (ja) * 2001-05-28 2002-12-06 Allied Brains Inc Input operation support program
JP2005196250A (ja) * 2003-12-26 2005-07-21 Kyocera Corp Information input support device and information input support method
CN101002455B (zh) * 2004-06-04 2011-12-28 B. F. Ghassabian Apparatus and method for enhancing data input in mobile and fixed environments
CN100527057C (zh) * 2004-08-05 2009-08-12 Motorola Inc. Character prediction method and electronic device using the method
JP5110763B2 (ja) * 2004-09-30 2012-12-26 Casio Computer Co., Ltd. Information display control device and program
JP4639124B2 (ja) * 2005-08-23 2011-02-23 Canon Inc Character input assisting method and information processing apparatus
CN101008864A (zh) * 2006-01-28 2007-08-01 北京优耐数码科技有限公司 Multifunctional, multilingual input system and method for a numeric keypad
JP2009169456A (ja) * 2008-01-10 2009-07-30 Nec Corp Electronic device, information input method and information input control program used for the electronic device, and portable terminal device
CN101526870B (zh) * 2008-03-07 2012-02-01 eGalax_eMPIA Technology Inc. Sliding input device and method thereof
EP2175355A1 (fr) * 2008-10-07 2010-04-14 Research In Motion Limited Portable electronic device and method for rendering and entering secondary characters
CN101876878A (zh) * 2009-04-29 2010-11-03 深圳富泰宏精密工业有限公司 Word prediction input system and method
CN102081490B (zh) * 2009-11-27 2013-01-30 沈阳格微软件有限责任公司 Dot-and-dash phonetic Chinese character input system for touch-screen devices
JP2011118507A (ja) * 2009-12-01 2011-06-16 Mitsubishi Electric Corp Character input device
JP5572059B2 (ja) * 2010-10-21 2014-08-13 Kyocera Corp Display device
JP5660611B2 (ja) * 2010-12-17 2015-01-28 NEC Casio Mobile Communications, Ltd. Electronic device, character input method, and program
JP5647919B2 (ja) * 2011-03-07 2015-01-07 NTT Docomo, Inc. Character recognition device, character recognition method, character recognition system, and character recognition program
EP2669782B1 (fr) * 2012-05-31 2016-11-23 BlackBerry Limited Touch-screen keyboard with corrective word prediction
JP5850014B2 (ja) * 2013-09-13 2016-02-03 Casio Computer Co., Ltd. Character input device and program

Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6002390A (en) * 1996-11-25 1999-12-14 Sony Corporation Text input device and method
US20120005576A1 (en) * 2005-05-18 2012-01-05 Neuer Wall Treuhand Gmbh Device incorporating improved text input mechanism
US20130019173A1 (en) * 2011-07-14 2013-01-17 Microsoft Corporation Managing content through actions on context based menus
US20130198690A1 (en) * 2012-02-01 2013-08-01 Microsoft Corporation Visual indication of graphical user interface relationship
US20150040056A1 (en) * 2012-04-06 2015-02-05 Korea University Research And Business Foundation Input device and method for inputting characters
US20140351753A1 (en) * 2013-05-23 2014-11-27 Samsung Electronics Co., Ltd. Method and apparatus for user interface based on gesture

Also Published As

Publication number Publication date
JP2017168947A (ja) 2017-09-21
CN108700953B (zh) 2024-02-06
CN108700953A (zh) 2018-10-23
WO2017159607A1 (fr) 2017-09-21
JP6798117B2 (ja) 2020-12-09

Similar Documents

Publication Publication Date Title
US10359932B2 (en) Method and apparatus for providing character input interface
JP4863211B2 (ja) 文字データ入力装置
JP4501018B2 (ja) 携帯端末装置および入力装置
JP2004213269A (ja) 文字入力装置
JP5801348B2 (ja) 入力システム、入力方法およびスマートフォン
WO2014075408A1 (fr) Méthode et appareil pour établir un clavier virtuel
US20150074587A1 (en) Touch screen device and character input method thereof
JP2003271294A (ja) データ入力装置、データ入力方法、及びプログラム
JP5888423B2 (ja) 文字入力装置、文字入力方法、文字入力制御プログラム
KR101030177B1 (ko) 데이터 입력장치 및 데이터 입력방법
US20190012079A1 (en) Input Assistance Device, Smart Phone, and Input Assistance Method
KR101204151B1 (ko) 휴대 단말기의 문자 입력장치
JP7036862B2 (ja) 電子機器、制御方法、及びプログラム
JP2013003802A (ja) 文字入力装置、文字入力装置の制御方法、制御プログラム、及び記録媒体
JP4317634B2 (ja) 文字入力装置及び方法並びにこれに利用される記憶媒体
JP6925789B2 (ja) 電子機器、制御方法、及びプログラム
JP6029628B2 (ja) 表示制御装置、表示制御方法、及び表示制御プログラム
JP2010097401A (ja) 文字入力装置、文字入力方法及び文字入力プログラム
JPH07146754A (ja) リモート情報処理装置
KR20160112337A (ko) 터치스크린을 이용한 한글 입력방법
JP2014149385A (ja) 図形表示制御装置、図形表示制御方法及びプログラム
JP2016218890A (ja) 電子機器および入力方法
JP2016218898A (ja) 情報処理装置および情報処理方法
JP5270729B2 (ja) 文字データ入力装置
JP2019139485A (ja) 入力受付装置

Legal Events

Date Code Title Description
STPP Information on status: patent application and granting procedure in general

Free format text: DOCKETED NEW CASE - READY FOR EXAMINATION

AS Assignment

Owner name: YAMAHA CORPORATION, JAPAN

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:ONOYAMA, KOSUKE;OZAKI, TAKASHI;OGINO, HIDETAKE;AND OTHERS;SIGNING DATES FROM 20181030 TO 20181102;REEL/FRAME:047470/0447

STPP Information on status: patent application and granting procedure in general

Free format text: NON FINAL ACTION MAILED

STPP Information on status: patent application and granting procedure in general

Free format text: RESPONSE TO NON-FINAL OFFICE ACTION ENTERED AND FORWARDED TO EXAMINER

STPP Information on status: patent application and granting procedure in general

Free format text: ADVISORY ACTION MAILED

STPP Information on status: patent application and granting procedure in general

Free format text: NON FINAL ACTION MAILED

STCB Information on status: application discontinuation

Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION