WO2010087301A1 - Sign language keyboard and sign language searching apparatus using the same - Google Patents
- Publication number
- WO2010087301A1 (PCT/JP2010/050897)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- input
- finger
- sign language
- input area
- upper body
- Prior art date
Classifications
-
- H—ELECTRICITY
- H03—ELECTRONIC CIRCUITRY
- H03M—CODING; DECODING; CODE CONVERSION IN GENERAL
- H03M11/00—Coding in connection with keyboards or like devices, i.e. coding of the position of operated keys
- H03M11/02—Details
- H03M11/04—Coding of multifunction keys
- H03M11/06—Coding of multifunction keys by operating the multifunction key itself in different ways
- H03M11/08—Coding of multifunction keys by operating the multifunction key itself in different ways by operating selected combinations of multifunction keys
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F1/00—Details not covered by groups G06F3/00 - G06F13/00 and G06F21/00
- G06F1/16—Constructional details or arrangements
- G06F1/1613—Constructional details or arrangements for portable computers
- G06F1/1626—Constructional details or arrangements for portable computers with a single-body enclosure integrating a flat display, e.g. Personal Digital Assistants [PDAs]
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/02—Input arrangements using manually operated switches, e.g. using keyboards or dials
- G06F3/0202—Constructional details or processes of manufacture of the input device
- G06F3/0219—Special purpose keyboards
-
- G—PHYSICS
- G09—EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
- G09B—EDUCATIONAL OR DEMONSTRATION APPLIANCES; APPLIANCES FOR TEACHING, OR COMMUNICATING WITH, THE BLIND, DEAF OR MUTE; MODELS; PLANETARIA; GLOBES; MAPS; DIAGRAMS
- G09B21/00—Teaching, or communicating with, the blind, deaf or mute
Definitions
- the present invention relates to a sign language keyboard used for visual communication expressed in sign language and a sign language search device using the same.
- sign language is often used as an information transmission means for hearing impaired people (deaf people).
- Sign language is a visual language that uses finger movements and non-finger movements simultaneously, and it stands alongside spoken language as a language in its own right.
- how to associate these sign language expressions with electronic information has been a challenge.
- In Patent Document 1, in order for both hearing persons and hearing-impaired persons to grasp information content easily, quickly, and accurately, text data is subjected to morphological analysis, keywords are extracted, and the text is converted into bulleted text data and fragmentary sign language video data that are displayed visually to the deaf person.
- Sign language conveys semantic content through coordinated finger movements and non-finger movements, but it is difficult to infer the meaning of a sign from these two movements alone; the only way to learn these meanings has been to prepare cards or the like in which the finger movements and non-finger movements are separated in advance.
- The present invention has been made in view of such circumstances, and its first object is to provide input means capable of simple input by focusing on the characteristic that sign language is composed of finger movements and non-finger movements.
- A second object of the present invention is to provide search means capable of retrieving the semantic content of sign language from finger movements and non-finger movements.
- the present invention comprises the following means.
- Claim 1 of the present invention is an input device for inputting sign language information that expresses semantic content through coordinated finger movements and non-finger movements, comprising at least: a plurality of upper body input areas in which the speaker's upper body parts are displayed in decomposed form; a finger input area in which a finger shape is assigned to each key; and a wrist state input area representing movements in which the speaker's wrist is turned vertically upward, vertically downward, and horizontally.
- Input identification means identifies the finger shape input in the finger input area, the upper body part input subsequently, and the wrist movement input as necessary, making this an input device capable of inputting dynamically changing sign language expressions.
- Because the input device is divided into the upper body input area, the finger input area, and the wrist state input area, finger movements and non-finger movements can be input smoothly in conjunction with each other, and the information they signify can be input accordingly.
- The upper body input area, the finger input area, and the wrist state input area may each be assigned to a general-purpose keyboard.
- By assigning each area to a general-purpose keyboard in this manner, inexpensive and simple sign language input can be achieved without redesigning the physical configuration of the hardware.
- The upper body input area is arranged on the left side of the keyboard as seen by the sign language input person, the finger input area is arranged on the right side of the keyboard, and the wrist state input area is arranged at the top of the keyboard.
- Alternatively, the upper body input area, the finger input area, and the wrist state input area may each be displayed as a keyboard image on a flat touch panel input screen.
- Claim 5 of the present invention is a search device for retrieving the semantic information of sign language from finger movements and non-finger movements, comprising: input means having at least a plurality of upper body input areas in which the speaker's upper body part is displayed in decomposed form, a finger input area in which a finger shape is assigned to each key, and a wrist state input area representing movements in which the speaker's wrist is turned vertically upward, vertically downward, and horizontally; and control means that, after acquiring finger movement information through input in the finger input area of the input means, predicts the non-finger movement information that may be combined with that finger movement information, namely inputs from the upper body input area and the wrist state input area, reads out significant words consisting of combinations of finger movement information and non-finger movement information stored in storage means, and displays them on display means.
- In Claim 6, after the control means acquires the finger movement information and reads out and displays the significant words consisting of combinations of finger movement information and non-finger movement information stored in the storage means, further input from the upper body input area and/or the wrist state input area causes a narrowing search of the significant words by combining the finger movement information with the upper body input information and/or wrist state input information, and the search result information is displayed on the display means.
- With the sign language keyboard of the present invention, expressions in sign language and their semantic content can be searched more easily and accurately.
- Explanatory drawing showing the arrangement of an embodiment
- Explanatory drawing showing the upper body input area of an embodiment
- Explanatory drawing showing the input procedure of an embodiment (1)
- Explanatory drawing showing the input procedure of an embodiment (2)
- Explanatory drawing showing the input procedure of an embodiment (3)
- Explanatory drawing showing the input procedure of an embodiment (4)
- Explanatory drawing showing the input procedure of an embodiment (5)
- Explanatory drawing showing the search procedure of an embodiment (2)
- The sign language keyboard of the present invention is a general-purpose keyboard device having the key tops shown in FIG. 1, with an upper body input area, a finger input area, and a wrist state input area allocated among the key tops.
- The upper body input area (101 to 109) is arranged on the left side as viewed from the input side (front side) of the keyboard, the finger input area (201 to 224) is arranged on the right side, and the wrist state input area (301 to 305) is arranged at the top.
- the upper body input area is composed of nine keys indicated by the key tops 101 to 109, where 101 is the upper left head, 102 is the central upper head, 103 is the upper right head, 104 is the left shoulder, and 105 is the lower face. , 106 is the right shoulder, 107 is the left hand, 108 is the abdomen, and 109 is the right hand.
- the finger input area is composed of 24 keys indicated by the key tops 201 to 224, and each key top is displayed with a finger shape used as sign language assigned thereto.
- the wrist state input area is composed of five keys indicated by the key tops 301 to 305.
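Taken together, the three areas amount to a lookup from key codes to input regions. The following is a minimal illustrative sketch, not part of the patent; the dictionaries and function are assumptions, with key codes and meanings taken from the embodiment:

```python
# Hypothetical model of the three input regions described above:
# 101-109 upper body parts, 201-224 finger shapes, 301-305 wrist states.
UPPER_BODY = {
    101: "upper-left head", 102: "upper-center head", 103: "upper-right head",
    104: "left shoulder",   105: "lower face",        106: "right shoulder",
    107: "left hand",       108: "abdomen",           109: "right hand",
}
# The 24 finger-shape keys; the shape names here are placeholders.
FINGER = {code: f"hand-shape-{code - 200}" for code in range(201, 225)}
WRIST = {301: "small", 302: "large",
         303: "wrist down", 304: "wrist horizontal", 305: "wrist up"}

def region_of(key_code):
    """Classify a key code into one of the three input regions."""
    if key_code in UPPER_BODY:
        return "upper_body"
    if key_code in FINGER:
        return "finger"
    if key_code in WRIST:
        return "wrist"
    return "other"  # e.g. shift key 401 or finger bending key 402
```

An input identification routine can then dispatch on the region of each incoming key code rather than on individual keys.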
- The left shift key 401 is a key for selecting the hand shape of the left hand: after the left shift key 401 is pressed, pressing one of the finger input keys 201 to 224 inputs a left-hand finger shape. When the left shift key 401 is not pressed, the keyboard is in right-hand finger input mode. Furthermore, the left-hand finger input mode can be kept on by using Caps Lock (a function realized by simultaneously pressing the shift key "Shift" and the control key "Ctrl").
- A key whose key top displays an "L-shaped figure" for bending a finger (finger bending key 402) is also provided.
- When the finger bending key 402 is pressed once, a bending operation is input.
- When it is pressed again, an operation of returning the bent finger is input. A finger bent 90 degrees or more is treated as bent; a finger bent only slightly is treated as not bent on this keyboard.
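The press-count behaviour of the bend key can be modeled as a simple toggle; a minimal sketch (the class and method names are illustrative assumptions):

```python
class FingerBendState:
    """Illustrative model of finger bending key 402.

    Pressing the key once inputs a bending action; pressing it again
    inputs the action of returning the finger, as described above.
    """

    def __init__(self):
        self.bent = False  # finger starts unbent

    def press_bend_key(self):
        # Each press flips the state and reports the action it inputs.
        self.bent = not self.bent
        return "bend" if self.bent else "return"
```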
- The wrist state input keys 301 and 302 display a small circle and a large circle on their key tops, meaning "small" and "large" to indicate the size of the sign.
- The wrist state input keys 303 to 305 indicate the angle of the wrist: key 303 for the wrist pointing vertically downward, key 304 for horizontal, and key 305 for vertically upward.
- Keys other than those described above may be assigned functions as shortcut keys for conjunctions and the like, such as "is" and "but".
- The sign language keyboard is connected to a general-purpose personal computer through an interface such as USB, or through a wireless interface such as Bluetooth or infrared communication.
- In the personal computer, a central processing unit (CPU) is connected via a bus (BUS) to a main memory (MM) and a hard disk device (HD) serving as large-capacity storage.
- It also has a display device (DISP) and a printer device (PRN) as output devices.
- an application program (APL) is installed together with an operating system (OS).
- the application program (APL) is read into the central processing unit (CPU) via the bus (BUS) and the main memory (MM) and sequentially executed.
- This application program has a function of converting sign language information input with the sign language keyboard into the sentence it means: based on an analysis algorithm module conforming to sign language grammar, sentence information registered in a conversion table is output to the display device (DISP).
- Alternatively, the analysis algorithm module may be provided in the sign language keyboard itself, and the converted sentence information may be input to the personal computer as a text code.
- For example, the sign meaning "Thank you for your work" is made by forming both hands into fists and moving the right hand down from the neck toward the left hand placed in front of the chest.
- To input this, first the left shift key 401 for selecting the left hand is pressed, then the key 201 meaning "fist" among the finger keys is pressed, and the key 107 at the left-hand position among the upper body input keys is pressed.
- Next, the key 201 meaning "fist" among the finger keys is pressed again, then the upper body input key 105 indicating the neck position is pressed, and finally the upper body input key 108 meaning the chest position is pressed.
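As a sketch, the key sequence of this example can be treated as a lookup into the conversion table mentioned earlier. The table layout and function name below are assumptions for illustration; the key codes come from the example above:

```python
# Keys pressed for the sign "Thank you for your work", per the text:
# left shift (401) held with fist key (201), left-hand position (107),
# fist key (201) again, neck position (105), chest position (108).
CONVERSION_TABLE = {
    ((401, 201), 107, 201, 105, 108): "Thank you for your work",
}

def lookup(sequence):
    """Resolve an input key sequence to its sentence, or None."""
    return CONVERSION_TABLE.get(tuple(sequence))
```

In a real implementation the analysis algorithm module would apply sign language grammar rather than exact-sequence matching, but the table captures the basic key-sequence-to-meaning mapping.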
- FIG. 4 shows the procedure for inputting the sign meaning "together" with the sign language keyboard, representing the action of "raising the index fingers of both hands, spreading them to the left and right of the chest, bringing the fingers close together, and touching the sides of the fingers together".
- FIG. 5 shows the procedure for inputting the sign meaning "small" with the sign language keyboard: the left index finger is raised, the right index finger and middle finger are raised and placed to the left and right of the left index finger, forming the kanji for "small".
- FIG. 6 shows a procedure for inputting the sign language “search” with the sign language keyboard, and represents an operation of “making a ring with the index finger and the thumb and moving it from left to right at the height of the face”.
- FIG. 8 shows a procedure for inputting a sign language meaning “camp” using a sign language keyboard.
- In sign language, "camp" requires a complicated movement because the hand changes shape while changing position. Specifically, it represents the action of "turning the back of the left hand upward, placing the left arm in front of the chest parallel to the body, opening the fingers of the right hand, placing it on the back of the left hand, and raising it to the position of the neck".
- In the above, a sign language keyboard device in which the upper body input keys and finger keys are assigned to a general-purpose keyboard device has been described.
- However, the keyboard need not necessarily have a mechanical key-press mechanism.
- For example, an image displayed on a touch panel display device may be used instead.
- Alternatively, the keyboard shown in FIG. 1 may be printed on the surface of a touchpad operated with a touch pen, with the touch position of the pen providing the same input function as the keyboard.
- an embodiment in which the sign language keyboard of the present embodiment is used as input means of a search device will be described.
- the keyboard is connected to a personal computer (not shown) via an interface such as a USB.
- The personal computer is a general-purpose computer having a central processing unit and a main memory, with a hard disk device, a display device, and the like connected by a bus.
- The hard disk device stores, together with the operating system, a sign language application program, sign language video data, and a word database of words expressed by combinations of finger movements and non-finger movements; the central processing unit sequentially reads and executes this sign language application program via the bus and main memory, thereby realizing the functions of the present embodiment described below.
- The keyboard described in the present embodiment (see FIG. 1) is connected to the bus via a USB interface. When a sign language search is performed using the keyboard, the central processing unit first loads the sign language application program and waits for interrupt processing from the keyboard.
- First, when a finger input key on the keyboard is pressed (here, the key for the state with the index finger raised, that is, finger key 203), the sign language application program detects the interrupt code of that finger key, reads out the words that use this finger key from the word database, and displays the candidate words on the display device together with sign language images (see FIG. 9).
- At this time, a learning function causes only the group of frequently used words in the word database to be displayed.
- In FIG. 9, the six words that can be expressed using the finger movement of finger key 203, namely "I", "sign language", "tomorrow", "think", "listen", and "play", are displayed as text together with sign language images. If the search target word is found at this stage, the search ends.
- If a large number of search results are returned, a key indicating a non-finger movement, for example the upper body input key 101 (the key indicating the upper right of the head), is pressed following the input of the finger movement of finger key 203.
- The candidate words are thereby narrowed to those signified by signs made with the index finger positioned at the upper right of the head, and the narrowed candidates are displayed on the display device together with sign language images (FIG. 10). If the narrowing search shown in FIG. 10 is still not sufficient, the words can be narrowed further by restricting the sign action through additional non-finger key input.
- For example, as shown in FIG. 11, following finger key 203 → upper body input key 101, pressing finger key 203 while holding down the shift key 401 designates the index finger of the left hand (see FIG. 11).
- At this stage, the candidates are narrowed to words corresponding to signs that use the left index finger while the right index finger is placed at the upper right of the head; as shown in the figure, the result is narrowed to the words "listen" and "play".
- Finally, pressing the upper body input key 103 representing the upper left of the head signifies a sign action in which the left index finger is placed at the upper left of the head; this narrows the result to the sign meaning the word "play", in which the index fingers of both hands are placed above either side of the head (see FIG. 11).
- In this way, at each stage as keys are pressed in sequence (finger key 203 → upper body input key 101 → shift key 401 + finger key 203 → upper body input key 103), the sign action is progressively constrained and the corresponding candidate words are narrowed down.
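The narrowing search in this walkthrough can be sketched as incremental prefix filtering over a word database keyed by input sequences. The data structure below is an assumption for illustration; only the sequence for "play" is spelled out in the text (finger key 203 → upper body key 101 → shift 401 + finger key 203 → upper body key 103), and the other entries are placeholders sharing the leading finger key 203:

```python
# Each word is stored with the ordered key sequence that expresses it.
# (401, 203) represents shift key 401 held with finger key 203.
# All sequences except "play" (and the shape of "listen") are
# illustrative placeholders.
WORD_DB = {
    "I":             (203, 108),
    "sign language": (203, 105),
    "tomorrow":      (203, 106),
    "think":         (203, 101, 102),
    "listen":        (203, 101, (401, 203), 106),
    "play":          (203, 101, (401, 203), 103),
}

def narrow(pressed):
    """Return the words whose key sequence starts with the keys pressed so far."""
    pressed = tuple(pressed)
    return [w for w, seq in WORD_DB.items() if seq[:len(pressed)] == pressed]
```

With this toy database, each key press shrinks the candidate list as in FIGS. 9 to 11: `narrow([203])` yields all six words, `narrow([203, 101])` yields the words signed near the upper right of the head, and the full sequence yields only "play".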
- In the above embodiment, an independent keyboard connected via an interface such as USB to a personal computer with the sign language application program installed has been described; however, the present invention is not limited to this, and it may be realized by an electronic dictionary, a smartphone, or the like, as long as the device has a keyboard with a key layout as shown in FIG. 1.
- Furthermore, although the sign language application program has been described as installed on the hard disk device of a personal computer, the invention is not limited to this: the program may be installed on a server, with input results and search results displayed on a terminal device via a communication network such as the Internet.
- the sign language keyboard of the present invention can be used as an input system or search system for meaning contents expressed in sign language, or as an electronic dictionary for sign language.
Abstract
Description
- According to Claim 5, simply by designating the finger input area to which a finger movement is assigned, the non-finger movements that may be combined with this finger movement as a starting action are predicted, and the significant words those combinations mean are displayed on the display means, making sign language search possible.
- According to Claim 6, following the input from the finger input area, the candidate words can be further narrowed down by inputs from the upper body input area and the wrist state input area.
210 to 224: finger input area (finger input keys)
301 to 305: wrist state input area (wrist state input keys)
401: left shift key
402: finger bending key
Claims (6)
- An input device for inputting sign language information that expresses semantic content through coordinated finger movements and non-finger movements, comprising:
a plurality of upper body input areas in which at least the speaker's upper body parts are displayed in decomposed form;
a finger input area in which a finger shape is assigned to each key;
a wrist state input area representing, respectively, movements in which the speaker's wrist is turned vertically upward, vertically downward, and horizontally; and
input identification means that identifies the finger shape input in the finger input area, the upper body part input subsequently, and the wrist movement input as necessary, thereby enabling input of dynamically changing sign language expressions. - The input device according to claim 1, wherein the upper body input area, the finger input area, and the wrist state input area are each assigned to a general-purpose keyboard.
- The input device according to claim 2, wherein the upper body input area is arranged on the left side of the keyboard as seen by the sign language input person, the finger input area is arranged on the right side of the keyboard, and the wrist state input area is arranged at the top of the keyboard.
- The input device according to claim 1, wherein the upper body input area, the finger input area, and the wrist state input area are each displayed as a keyboard image on a flat touch panel input screen.
- A search device for retrieving the semantic information of sign language from finger movements and non-finger movements, comprising:
input means having a plurality of upper body input areas in which at least the speaker's upper body part is displayed in decomposed form,
a finger input area in which a finger shape is assigned to each key, and
a wrist state input area representing, respectively, movements in which the speaker's wrist is turned vertically upward, vertically downward, and horizontally; and
control means that, after acquiring finger movement information through input in the finger input area of the input means, predicts the non-finger movement information that may be combined with the finger movement information, namely inputs from the upper body input area and the wrist state input area, reads out significant words consisting of combinations of finger movement information and non-finger movement information stored in storage means, and displays them on display means. - The sign language search device according to claim 5, wherein, after acquiring the finger movement information and reading out and displaying on the display means significant words consisting of combinations of finger movement information and non-finger movement information stored in the storage means,
when there is further input from the upper body input area and/or the wrist state input area, the control means performs a narrowing search of the significant words by combining the finger movement information with the upper body input information and/or wrist state input information, and causes the search result information to be displayed on the display means.
Priority Applications (4)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2010548492A JP5628691B2 (ja) | 2009-01-27 | 2010-01-25 | 手話キーボードおよびそれを用いた手話検索装置 |
KR1020117019697A KR101298926B1 (ko) | 2009-01-27 | 2010-01-25 | 수화 키보드 및 이를 이용한 수화 검색 장치 |
US13/146,533 US8711100B2 (en) | 2009-01-27 | 2010-01-25 | Sign language keyboard and sign language searching apparatus using the same |
EP20100735769 EP2392989A4 (en) | 2009-01-27 | 2010-01-25 | SIGN LANGUAGE KEYBOARD AND SIGN LANGUAGE SEARCHING DEVICE USING THE SAME |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2009035850 | 2009-01-27 | ||
JP2009-035850 | 2009-01-27 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2010087301A1 true WO2010087301A1 (ja) | 2010-08-05 |
Family
ID=42395562
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2010/050897 WO2010087301A1 (ja) | 2009-01-27 | 2010-01-25 | 手話キーボードおよびそれを用いた手話検索装置 |
Country Status (5)
Country | Link |
---|---|
US (1) | US8711100B2 (ja) |
EP (1) | EP2392989A4 (ja) |
JP (1) | JP5628691B2 (ja) |
KR (1) | KR101298926B1 (ja) |
WO (1) | WO2010087301A1 (ja) |
Families Citing this family (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
KR102088909B1 (ko) * | 2013-03-15 | 2020-04-14 | 엘지전자 주식회사 | 이동 단말기 및 그의 변형 키패드 운용방법 |
US9495351B1 (en) * | 2013-10-20 | 2016-11-15 | Mary Shawver | Writing a visual language |
TWI501205B (zh) * | 2014-07-04 | 2015-09-21 | Sabuz Tech Co Ltd | 手語圖像輸入方法及裝置 |
US11836299B1 (en) * | 2022-10-04 | 2023-12-05 | Lenovo (Singapore) Pte. Ltd. | Virtual sign language system |
Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2000089660A (ja) * | 1998-09-09 | 2000-03-31 | Matsushita Electric Ind Co Ltd | 手話学習支援装置および手話学習支援プログラムを記録した記録媒体 |
JP2003044203A (ja) * | 2001-08-02 | 2003-02-14 | Canon I-Tech Inc | 情報処理装置 |
JP2005099977A (ja) * | 2003-09-24 | 2005-04-14 | Hitachi Ltd | 手話編集方法および装置 |
JP2008216397A (ja) | 2007-02-28 | 2008-09-18 | Nippon Telegr & Teleph Corp <Ntt> | 情報を提示する装置と方法及びその提示情報を作成する装置と方法 |
Family Cites Families (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US5659764A (en) * | 1993-02-25 | 1997-08-19 | Hitachi, Ltd. | Sign language generation apparatus and sign language translation apparatus |
JPH08315185A (ja) * | 1995-05-18 | 1996-11-29 | Hitachi Ltd | 手話編集装置 |
JPH1139300A (ja) * | 1997-07-24 | 1999-02-12 | Toshiba Corp | 文字列予測装置、文字列予測方法及び文字列予測プログラムを記録した記録媒体 |
US6116907A (en) * | 1998-01-13 | 2000-09-12 | Sorenson Vision, Inc. | System and method for encoding and retrieving visual signals |
JP2008134919A (ja) * | 2006-11-29 | 2008-06-12 | Alpine Electronics Inc | 文字入力装置 |
US8862988B2 (en) * | 2006-12-18 | 2014-10-14 | Semantic Compaction Systems, Inc. | Pictorial keyboard with polysemous keys for Chinese character output |
CN101663880A (zh) * | 2007-04-27 | 2010-03-03 | 吴谊镇 | 用于输入中文字符的方法和设备 |
Non-Patent Citations (1)
Title |
---|
See also references of EP2392989A4 |
Also Published As
Publication number | Publication date |
---|---|
EP2392989A1 (en) | 2011-12-07 |
EP2392989A4 (en) | 2014-01-08 |
JPWO2010087301A1 (ja) | 2012-08-02 |
JP5628691B2 (ja) | 2014-11-19 |
KR20110111504A (ko) | 2011-10-11 |
US8711100B2 (en) | 2014-04-29 |
KR101298926B1 (ko) | 2013-08-22 |
US20110285635A1 (en) | 2011-11-24 |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 10735769 Country of ref document: EP Kind code of ref document: A1 |
|
WWE | Wipo information: entry into national phase |
Ref document number: 2010548492 Country of ref document: JP |
|
NENP | Non-entry into the national phase |
Ref country code: DE |
|
WWE | Wipo information: entry into national phase |
Ref document number: 13146533 Country of ref document: US Ref document number: 2010735769 Country of ref document: EP |
|
ENP | Entry into the national phase |
Ref document number: 20117019697 Country of ref document: KR Kind code of ref document: A |