WO2011118312A1 - Coordinate Input Device and Program (座標入力装置及びプログラム) - Google Patents
Coordinate Input Device and Program
- Publication number
- WO2011118312A1 (PCT/JP2011/053757)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- input
- coordinates
- coordinate
- information
- event
- Prior art date
Classifications
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0487—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
- G06F3/0488—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
- G06F3/04883—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures for inputting data by handwriting, e.g. gesture or text
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V30/00—Character recognition; Recognising digital ink; Document-oriented image-based pattern recognition
- G06V30/10—Character recognition
- G06V30/14—Image acquisition
- G06V30/142—Image acquisition using hand-held instruments; Constructional details of the instruments
- G06V30/1423—Image acquisition using hand-held instruments; Constructional details of the instruments the instrument generating sequences of position coordinates corresponding to handwriting
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F2203/00—Indexing scheme relating to G06F3/00 - G06F3/048
- G06F2203/048—Indexing scheme relating to G06F3/048
- G06F2203/04808—Several contacts: gestures triggering a specific function, e.g. scrolling, zooming, right-click, when the user establishes several contacts with the surface simultaneously; e.g. using several fingers or a combination of fingers and pen
Definitions
- the present invention relates to a coordinate input device arranged on the surface of a display screen or a projection screen.
- the present invention relates to a coordinate input device that can simultaneously detect coordinate input of a plurality of points on a display screen or a projection screen and can issue an event corresponding to a combination of state information regarding the detected plurality of points, and a program thereof.
- an electronic board system as one of the devices to which the coordinate input device is applied.
- An event corresponding to an operation input detected by the coordinate input device is issued, and a character object or an image object (for example, composed of lines of various colors and thicknesses) reflecting the locus of the operation input can be drawn on the operation screen.
- an object on the operation screen can be operated through an operation input on the operation screen.
- the electronic board system can issue an event to a computer system operating in cooperation with the operation screen through an operation input accompanied by a specific movement.
- the computer system can execute enlargement / reduction, graphic deletion, and other commands through issued events.
- some coordinate input devices applied to an electronic board system can simultaneously detect a plurality of operation input coordinates and the size of an object used for operation input.
- Some coordinate input devices of this type can acquire operation input coordinates and the size of an input object through detection of a shadow caused by the input object.
- False detection may occur at the start of input. For example, when input is started with the palm, focusing on the size of the shadow, the size of the shadow of the input object changes from the size of one finger to the size of the palm. For this reason, although the input is actually made with the palm, it may be mistaken for an input with one finger.
- There is a method in which a plurality of pieces of state information, such as the size of the object detected at the start of input and the number of input locations, are stored in a storage area of the screen generation device, and erroneous operations are prevented by confirming a series of state transitions.
- However, when the number of detections from the start to the end of input is less than a threshold value, or when the number of detection points corresponding to an input stroke is small, the input is executed as a click action or as a continuous event of a stroke shorter than the line intended by the user. In this case, the input characters or lines may not be correctly reflected in the drawing on the operation screen.
- The present invention has been made in consideration of the above problems. An object of the present invention is to provide a technique capable of suppressing erroneous input when, in a coordinate input device capable of simultaneously detecting coordinate inputs of a plurality of points, an event corresponding to a combination of state information regarding the detected plurality of points is issued.
- Processing (means) for detecting information related to an operation input on the displayed or projected operation screen; processing (means) for calculating the elapsed time from the previous up operation (an operation in which the input object moves away from the operation screen, i.e., an input end operation) to a new down operation (an operation in which the input object approaches the operation screen, i.e., an input start operation); and processing (means) for issuing, if the calculated elapsed time is equal to or less than a threshold value, an event of input information for the coordinates detected by the down operation. A coordinate input device having these processes (means) is proposed.
- A program that causes a computer to execute the corresponding processes is also proposed.
- According to the present invention, even when a stroke such as a character is input quickly, it is possible to accurately grasp the start point of the stroke at the start of writing or in the middle of input. As a result, erroneous determination of the operation input is suppressed, and the user's character input can be accurately reflected in the drawing.
- FIG. 1 is a system configuration diagram showing an embodiment of an electronic board system according to the present invention.
- FIG. 1 shows an embodiment of an electronic board system.
- the electronic board system shown in FIG. 1 includes an electronic board 101, an input pen 102, an operation screen projection device 103, a control computer 104, a keyboard 105 attached to the control computer 104, and a display device 106.
- The electronic board 101 is a coordinate input device that detects the operation coordinates of an input operation performed with a finger, a stylus pen, the input pen 102, or another input object.
- The electronic board 101 is an optical input device that uses light irradiated parallel to the projection surface of the operation screen and detects, by the principle of triangulation, the position where an input object blocks the light.
- the basic principle of this type of coordinate input device is well known. For example, two light sources (for example, an infrared light source) and an image sensor (imaging device) are disposed at positions on both ends of the upper side of the frame shape or near the center of the upper side.
- When two light sources are arranged at the left and right ends of the upper side, each light source emits or scans a light beam over the entire range of the side opposite to the side on which it is arranged and the entire range of the lower side.
- the angle of view of the image sensor is about 90 °.
- the irradiation angle of each light source and the angle of view of the image sensor are set to about 180 °, respectively.
- a retroreflective member is arranged on the inner side of the frame (opposite surface of the light beam) on the three sides excluding the upper side. For this reason, the incident light is reflected in the same direction as the incident direction. The reflected light is received by an image sensor arranged in the vicinity of the light source. In the case of this type of coordinate input device, a plurality of coordinate inputs can be detected simultaneously.
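The triangulation step described above can be sketched as follows. This is a minimal illustration under assumed conventions (two sensors at the top-left and top-right corners of the input area, with angles measured from the top edge down toward the detection surface); it is not the patent's implementation:

```python
import math

def triangulate(theta_l, theta_r, width):
    """Estimate the (x, y) coordinate of an input object from the shadow
    angles seen by two sensors placed at the top-left and top-right corners.

    theta_l: angle (radians) at the left sensor, measured from the top edge.
    theta_r: angle (radians) at the right sensor, measured from the top edge.
    width:   distance between the two sensors.
    """
    # The object lies on one ray from each sensor; intersect them:
    #   y = x * tan(theta_l)   and   y = (width - x) * tan(theta_r)
    tl, tr = math.tan(theta_l), math.tan(theta_r)
    x = width * tr / (tl + tr)
    y = x * tl
    return x, y
```

For example, an object seen at 45° from both corners of a 100-unit-wide area lies at the center of the detection surface, at (50, 50).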
- the electronic board 101 is arranged at a position in front of the screen or whiteboard on which the operation screen is projected from the operation screen projection device 103. That is, the operation input detection surface is formed in front of the screen or whiteboard. In this embodiment, an operation screen is projected, but a configuration in which the electronic board 101 is integrally disposed on the surface of a display device such as a flat display is also conceivable.
- The input area 107 assumed by the electronic board 101 for coordinate input by an input object is not limited to a large area such as a screen or whiteboard, and also includes small areas such as the display screen of a mobile phone, an electronic book, or another portable terminal.
- FIG. 10 shows the connection relationship of the electronic circuits constituting the electronic board 101.
- the image sensors 1001 and 1002 are driven by a drive circuit 1003, and the operation of the drive circuit 1003 is controlled by the CPU 1006.
- The drive circuit 1003 gives the screen capture timing to the left and right image sensors 1001 and 1002.
- Image signals output from the image sensors 1001 and 1002 are amplified by an amplifier 1004, input to an analog / digital conversion circuit (A / D) 1005, and converted into a digital signal format.
- The CPU 1006 detects the number, coordinate positions, sizes, and the like of the input objects based on the position information of the shadows of the input objects appearing in the imaging data of the two left and right image sensors 1001 and 1002, and generates packet data having the data structure described later.
- The generated packet data is output to the control computer 104 through the USB interface 1007 and a USB cable.
- FIG. 10 assumes that the light source always emits light. However, when it is necessary to control the light emission timing of the light source, the light source may be connected to a drive circuit (not shown) controlled by the CPU 1006, and the emission timing of the infrared light may be switched.
- The operation screen projection device 103 is used to project onto the screen or whiteboard the operation screen together with the characters and objects input through input objects.
- the control computer 104 has a function equivalent to that of a general-purpose personal computer, and a display content control program 1041 for processing character objects and image objects is stored in the internal memory. In addition, the control computer 104 detects an operation input by an input object, and also executes an event generation process according to the detected state.
- The display content control program 1041 executes event generation processing and, as part of its functions, executes the erroneous input prevention program proposed by the inventors.
- the function corresponding to the program can be executed in the electronic board 101 or can be implemented in the operation screen projection device 103.
- the implementation of the function may be in the form of hardware (for example, a semiconductor integrated circuit or a processing board) or a program (for example, firmware or an application).
- FIG. 2 shows an example of the state of operation input that can be detected by the coordinate input device.
- In this system, it is possible to detect a one-point input state in which only one finger or the electronic pen 102 is in contact with the coordinate input surface (virtual surface) of the electronic board 101, a two-point input state in which two fingers are simultaneously in contact with the coordinate input surface (virtual surface) of the electronic board, and a palm input state in which the entire palm is in contact with the coordinate input surface (virtual surface) of the electronic board 101. These states transition from one to another depending on the contact state between the hand and the coordinate input surface (virtual surface) of the electronic board 101. This makes it possible to use different input methods depending on the function, such as drawing a line with one-point input, erasing a line with two-point input, and scrolling the screen with the palm.
- The detection surface is described as a virtual surface because the light traveling surface used for coordinate input by the electronic board 101 differs from a physically existing surface such as a screen surface, a whiteboard surface, or a display screen.
- FIG. 3 shows an example data structure of information output from the electronic board 101 to the control computer 104. Specifically, an example of a data structure corresponding to input data for one frame and input data corresponding to one input event is shown.
- the input data 301 for one frame includes a time 302 when an input object existing on the input area 107 is detected, the number of detected objects 303, coordinate information 304 for each detected object, and detected object size information 305.
- the configuration can cope with n (two or more) detected objects.
- When there is no object on the coordinate input surface, the electronic board 101 transmits, to the control computer 104, data composed of the detection time and the detected object number information “0” as input data (packet data example 306).
- In the one-point input state, data composed of the detection time, the detected object number information “1”, and the coordinate information and detected object size information of the corresponding input object is transmitted from the electronic board 101 to the control computer 104 as input data (packet data example 307).
- In this case, the number of detected objects is “1”, and the size of the input object specified from the detected object size information is a small value equal to or smaller than a threshold value (for example, 10). For this reason, it can be determined that the input is a one-point input.
- In the two-point input state, data composed of the detection time, the detected object number information “2”, and the coordinate information and detected object size information of each input object is transmitted from the electronic board 101 to the control computer 104 as input data (packet data example 308).
- Since the number of detected objects is “2”, it can be determined that the input is a two-point input.
- In the palm input state, the electronic board 101 transmits, to the control computer 104, data composed of the detection time, the detected object number information “1”, and the coordinate information and detected object size information of the palm corresponding to the detected object as input data (packet data example 309).
- In this case, the number of detected objects is “1”, and the size of the input object specified from the detected object size information is a large value equal to or larger than a threshold value (for example, 10). For this reason, it can be determined that the input is a palm input.
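The frame data and the classification implied by packet examples 306-309 can be sketched as follows. The field names mirror items 302-305 of FIG. 3 and the threshold value 10 comes from the description above; the type names and function are assumed illustrations, not the patent's implementation:

```python
from dataclasses import dataclass
from typing import List, Tuple

SIZE_THRESHOLD = 10  # example size threshold from the description

@dataclass
class FrameData:
    """One frame of input data (cf. fields 302-305 of FIG. 3)."""
    time: float                        # detection time (302)
    count: int                         # number of detected objects (303)
    coords: List[Tuple[float, float]]  # coordinates per object (304)
    sizes: List[float]                 # size per object (305)

def classify(frame: FrameData) -> str:
    """Classify the input state as in packet examples 306-309."""
    if frame.count == 0:
        return "none"           # example 306: no object on the surface
    if frame.count == 1:
        if frame.sizes[0] <= SIZE_THRESHOLD:
            return "one-point"  # example 307: one small object
        return "palm"           # example 309: one large object
    if frame.count == 2:
        return "two-point"      # example 308: two objects
    return "other"
```

For instance, a frame with one detected object of size 25 would be classified as palm input, while the same count with size 5 would be a one-point input.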
- FIG. 4 shows an example of a hand movement during an input start operation when an attempt is made to input to the electronic board 101 with two fingers or the palm.
- When inputting with two fingers, the input is often made such that the second finger touches the coordinate input surface after the first finger has already touched it.
- In this case, the detected input changes from the input size at detection count P1 (one-finger input) to the input size at detection count P2 (two-finger input). In this way, the input is performed while changing from one-finger input to two-finger input.
- When inputting with the palm, the input is often made such that a part of the palm, such as the fingertips, touches the coordinate input surface first.
- In this case, the detected input changes from the input size at detection count P′1 (one-finger input) to the input size at detection count P′2 (palm input). In this way, the input is performed while changing from one-finger input to palm input.
- In either case, the intended input starts after an input with one finger is detected first. For this reason, if the input type is determined at the one-finger stage, an erroneous input occurs.
- Erroneous input determination (error events) can be suppressed by adopting a mechanism that accumulates input information from the start of input but does not issue an event until a certain number of detections (for example, the counts P2 and P′2) is reached.
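Such a suppression mechanism might be sketched as follows; the class name and interface are hypothetical, and the threshold corresponds to a detection count such as P2 or P′2:

```python
class EventGate:
    """Withhold event issuance until a minimum number of detections
    has accumulated since the start of input (cf. counts P2 / P'2)."""

    def __init__(self, min_detections: int):
        self.min_detections = min_detections
        self.issuing = False  # becomes True once the input state is settled
        self.buffer = []      # input information accumulated before issuance

    def feed(self, info):
        """Accumulate one detection; return the events to issue now.

        Before the threshold is reached, nothing is issued (the input
        state may still change, e.g. from one finger to a palm).  Once
        reached, the buffered detections are flushed and subsequent
        detections pass through immediately.
        """
        if self.issuing:
            return [info]
        self.buffer.append(info)
        if len(self.buffer) < self.min_detections:
            return []  # still suppressing
        self.issuing = True
        out, self.buffer = self.buffer, []
        return out
```

With a threshold of 3, the first two detections produce no events; the third flushes all three at once, and later detections are issued one by one.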
- FIG. 5 is a diagram illustrating a relationship between detected coordinates when a line is written on the electronic board 101 using a finger or the like and lines actually drawn by applying the above-described erroneous input determination suppression process.
- As indicated by reference numeral 501, when a line is written slowly, the number of detected coordinates for one stroke is large.
- In the case of reference numeral 501, the circles representing the detected coordinates are closely spaced. In this case, even if the input information for a certain number of detections from the start of input is not processed, the drawn line does not differ greatly from the input line intended by the user.
- FIG. 6 shows the relationship between the input time and the input standby time of each stroke when characters are input by handwriting using the electronic board 101.
- When the character “F” is input, the handwritten data is composed of strokes a1, a2, and a3.
- The input times of the strokes are Ta1, Ta2, and Ta3, and the input standby times are Ta12 and Ta23.
- FIG. 7 shows a difference in input point distribution examples between click input and line input.
- As indicated by reference numeral 701, at the time of a click input, if the input start point is taken as the origin, there is very little coordinate movement in the X and Y directions until the input ends. That is, the input points are concentrated in the vicinity of the origin.
- As indicated by reference numeral 702, when a line is input, if the input start point is taken as the origin, coordinate movement of the input point occurs in the X or Y direction by the end of the input, and the input points are likely to be distributed at positions away from the origin. Therefore, when the input object moves beyond a certain distance from the input start point by the end of the input, it can be estimated that the input is a line input rather than a click input.
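This distance-based discrimination can be sketched as follows (a hypothetical helper; the function name and the way the threshold is applied are assumptions):

```python
import math

def is_line_input(points, distance_threshold):
    """Estimate whether a sequence of input coordinates is a line input
    (True) or a click input (False), based on displacement from the
    input start point (cf. reference numerals 701 and 702)."""
    if not points:
        return False
    x0, y0 = points[0]  # input start point taken as the origin
    return any(math.hypot(x - x0, y - y0) >= distance_threshold
               for x, y in points[1:])
```

Points clustered near the start point fall below the threshold and are treated as a click; any point that strays beyond it marks the input as a line.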
- FIG. 8 shows a functional block configuration of an erroneous input preventing apparatus realized as a partial function of the display content control program 1041 stored in the internal memory of the control computer 104.
- a circuit corresponding to the functional block configuration can be mounted on an electronic substrate or a semiconductor integrated circuit.
- the erroneous input prevention device includes an input information analysis unit 801, an execution function control unit 802, and an input information storage unit 803.
- the input information analysis unit 801 decomposes the input data 301 output from the electronic board 101 into detection time 302, detected object number information 303, coordinate information 304, and detected object size information 305.
- Based on the extracted input information (decomposed information) and the history of input information stored in the input information storage unit 803, the execution function control unit 802 generates drawing information for the operation screen presented to the user through the display device 106 and the operation screen projection device 103. Further, the execution function control unit 802 executes a processing operation for storing the newly extracted input information in the input information storage unit 803 as the latest history information. Details of the processing operations executed by the execution function control unit 802 will be described later.
- FIG. 9 shows a flowchart corresponding to an erroneous input prevention processing operation executed as a function of the execution function control unit 802.
- the processing function of the execution function control unit 802 is executed as a part of the program function.
- the processing of the execution function control unit 802 will be described.
- When the execution function control unit 802 detects the start of input (that is, when the number of input objects changes from “0” to “1” or more), it acquires the input information (input object number information, coordinate information, and input object size information) as information to be added to the input history (step 901).
- Next, the execution function control unit 802 reads, from the data area, the state of the event issuance start flag managed in step 910 described later, and determines whether or not the current state is “event issuance started” (step 902).
- event issuance indicates an operation of actually reflecting input information on the content of the operation screen.
- When the execution function control unit 802 determines that the current state is “event issuance started” (when an affirmative result is obtained in step 902), it generates an issue event for the input information acquired in step 901 (step 903). Thereafter, the execution function control unit 802 issues the event, reflects the input information on the display content of the operation screen, and sets the event issuance start flag (step 910). Further, the execution function control unit 802 adds the input information to the history of the input information storage unit 803 (step 911). Thereafter, the execution function control unit 802 determines whether or not the input has ended (step 912). If the input has not ended, the execution function control unit 802 returns to step 901 to acquire the input start information again.
- When a negative result is obtained in step 902 (that is, when event issuance has not started), the execution function control unit 802 executes a process of determining whether or not the input information belongs to ongoing handwritten input (FIG. 6) (step 904). Specifically, it determines whether or not the elapsed time from the previous input end time (up operation detection time) to the current input start time (down operation detection time) is within a specified time (step 904).
- The specified time here is set taking into account the input language (for example, Japanese, English, etc.), the character type (for example, kanji, hiragana, cursive, block, etc.), the character size, whether the user is an adult or a child, and so on. However, it is desirable that the specified time can be set selectively by the user. When a function for adjusting this time is installed, erroneous input can be further reduced.
- If the elapsed time is within the specified time, the execution function control unit 802 determines that a line is being written continuously and generates an issue event for the current input information (step 903). Thereby, when the vertical stroke shown in FIG. 5 is input, a situation in which the head portion of the input line is missing can be prevented. Thereafter, the execution function control unit 802 issues the event, reflects the input information on the display content of the operation screen, and sets the event issuance start flag (step 910). Further, the execution function control unit 802 adds the input information to the history of the input information storage unit 803 (step 911). Thereafter, the execution function control unit 802 determines whether or not the input has ended (step 912). If the input has not ended, the execution function control unit 802 returns to step 901 to acquire the input start information again.
- In step 904, it may be determined that the input started after more than the specified time had elapsed since the end of the previous input. That is, the execution function control unit 802 determines that the current input is not a continuous input such as written characters. In this case, the execution function control unit 802 acquires the input start information from the history information and determines whether or not the specified time has elapsed since the start of the input (step 905). If the specified time has elapsed, the execution function control unit 802 determines that the number of input objects and the size of the shadow have been settled, and generates an issue event corresponding to each determination result (step 903).
- the execution function control unit 802 issues an event, reflects the input information on the display content of the operation screen, and sets an event issuance start flag (step 910). Further, the execution function control unit 802 adds the input information to the history of the input information storage unit 803 (step 911). Thereafter, the execution function control unit 802 determines whether or not the input has ended (step 912). If the input has not ended, the execution function control unit 802 returns to step 901 to acquire the input start information again.
- Next, the operation performed when it is determined in step 905 that the specified time has not yet elapsed since the start of the input will be described.
- the execution function control unit 802 determines whether or not the input object number information is “0” (step 906).
- When the input object number information is “0”, the execution function control unit 802 generates an issue event for the most frequent input method (in this embodiment, one of one-finger, two-finger, or palm input) in order to avoid erroneous determination at the initial stage of input (step 907). Thereafter, the execution function control unit 802 issues the event, reflects the input information on the display content of the operation screen, and sets the event issuance start flag (step 910). Further, the execution function control unit 802 adds the input information to the history of the input information storage unit 803 (step 911). Thereafter, the execution function control unit 802 determines whether or not the input has ended (step 912). If the input has not ended, the execution function control unit 802 returns to step 901 to acquire the input start information again.
- Next, the case where a negative result is obtained in step 906 will be described.
- In order to determine whether the input is a click input or a line input, the execution function control unit 802 extracts the input start point from the history information and determines whether or not the coordinates of the current input information have moved beyond a specified distance from it (step 908).
- The specified distance here is also set taking into account the input language (e.g., Japanese, English, etc.), the character type (e.g., kanji, hiragana, cursive, block, etc.), the character size, whether the user is an adult or a child, and so on. However, it is desirable that the specified distance can be set selectively by the user. When a function for adjusting this distance is installed, erroneous input can be further reduced.
- If the coordinates have moved beyond the specified distance, the execution function control unit 802 determines that a line is being written quickly and generates an issue event for the information from the input start point onward so that the display content reflects the line from the input start point (step 909). Accordingly, it is possible to prevent a situation in which the head portion of the input line is missing when the horizontal stroke shown in FIG. 5 is input. Thereafter, the execution function control unit 802 issues the event, reflects the input information on the display content of the operation screen, and sets the event issuance start flag (step 910). Further, the execution function control unit 802 adds the input information to the history of the input information storage unit 803 (step 911). Thereafter, the execution function control unit 802 determines whether or not the input has ended (step 912). If the input has not ended, the execution function control unit 802 returns to step 901 to acquire the input start information again.
- If a negative result is obtained in step 908, the execution function control unit 802 adds the input information to the history (step 911). Thereafter, the execution function control unit 802 determines whether or not the input has ended (step 912). If the input has not ended, the execution function control unit 802 returns to step 901 to acquire the input start information again.
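The branching of steps 901-912 described above can be condensed into the following sketch. The function name, the `state` dictionary layout, and the event tuples are hypothetical, chosen only to mirror the flowchart; this is not the patent's implementation:

```python
def handle_input(info, state, *, specified_time, specified_distance):
    """One pass of the erroneous input prevention flow (steps 901-912).

    info:  dict with 'time', 'count', 'coord' for the new detection (901)
    state: mutable dict with 'issuing', 'history', 'last_up_time'
    Returns the event to issue for this detection, or None.
    """
    event = None
    if state["issuing"]:                                   # step 902
        event = ("issue", info)                            # step 903
    elif info["time"] - state["last_up_time"] <= specified_time:
        # within handwriting standby time: a continuing stroke (904 -> 903)
        event = ("issue", info)
    elif state["history"] and \
            info["time"] - state["history"][0]["time"] >= specified_time:
        # input state (object count / size) considered settled   step 905
        event = ("issue", info)                            # step 903
    elif info["count"] == 0:                               # step 906
        event = ("issue_most_frequent", info)              # step 907
    else:
        start = state["history"][0]["coord"] if state["history"] else info["coord"]
        dx = info["coord"][0] - start[0]
        dy = info["coord"][1] - start[1]
        if (dx * dx + dy * dy) ** 0.5 >= specified_distance:   # step 908
            event = ("issue_from_start", start)            # step 909
    if event is not None:
        state["issuing"] = True                            # step 910
    state["history"].append(info)                          # step 911
    return event                                           # caller loops (912)
```

A fast stroke thus produces no event on the first detection, but once the coordinates move past the specified distance an event is issued from the stored start point, so the head of the line is not lost.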
- the present invention is intended to process information related to operation input detected using a coordinate input device, and does not depend on the coordinate input method. Therefore, the present invention can be applied to prevention of erroneous input to a device (for example, a capacitive touch panel) that can simultaneously detect multiple points. Therefore, the coordinate input device according to the embodiment includes a touch panel. In this case, the operation input surface of the coordinate input device matches the surface of the touch panel.
- The coordinate input device may be any device as long as it can detect at least simultaneous operation inputs for a plurality of coordinates; it may be an independent device or a device integrated with a display device (for example, a flat panel display).
- the coordinate input device according to the invention can also be applied to tablets and portable terminals.
- the erroneous input prevention device according to the invention may be mounted inside the coordinate input device, or may be mounted on a device integrated with the coordinate input device.
- the erroneous input prevention device according to the invention may be mounted on various other devices that operate in cooperation with the coordinate input device.
Abstract
Description
Fig. 1 shows an example embodiment of an electronic board system. The electronic board system shown in Fig. 1 comprises an electronic board 101, an input pen 102, an operation screen projection device 103, a control computer 104, and a keyboard 105 and a display device 106 attached to the control computer 104.
Fig. 2 shows examples of the operation input states detectable by the coordinate input device. The system can detect a single-point input state in which only one finger or the electronic pen 102 touches the coordinate input surface (virtual surface) of the electronic board 101, a two-point input state in which two fingers touch the coordinate input surface (virtual surface) simultaneously, and a palm input state in which the entire palm touches the coordinate input surface (virtual surface) of the electronic board 101. These states transition among one another according to the state of contact between the hand and the coordinate input surface (virtual surface) of the electronic board 101. This makes it possible to assign input methods to functions, for example drawing a line with single-point input, erasing a line with two-point input, and scrolling the screen with palm input.
Fig. 3 shows an example data structure of the information output from the electronic board 101 to the control computer 104. Specifically, it shows example data structures for one frame of input data and for the input data corresponding to one input event.
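The per-frame data output from the electronic board to the control computer (detection time 302, number of detected objects 303, coordinate information 304, and detected object size information 305, per the reference-numeral list) might be modeled as below. The Python type and field names are hypothetical; the source only names the fields, not their representation.

```python
from dataclasses import dataclass
from typing import List, Tuple

@dataclass
class DetectedObject:
    coordinates: Tuple[int, int]   # 304: coordinate information (x, y)
    size: Tuple[int, int]          # 305: detected object size information

@dataclass
class InputFrame:
    detection_time: float          # 302: detection time of the frame
    objects: List[DetectedObject]  # 303: number of detected objects = len(objects)
```

One `InputFrame` then corresponds to one frame of input data, with the object count derived from the list length rather than stored separately.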
Fig. 4 shows an example of hand motion during the input start action when the user attempts to input on the electronic board 101 with two fingers or a palm.
Fig. 5 is a diagram showing the relationship between the coordinates detected when a line is drawn on the electronic board 101 with a finger or the like and the line actually drawn when the above-described suppression of the erroneous-input determination is applied. As indicated by reference numeral 501, when a line is drawn slowly, the number of detected coordinates per stroke is large; in the case of 501, the circles representing the detected coordinates are densely arranged. In this case, even if a fixed number of input records from the start of input are not processed, the drawn line does not differ greatly from the input line the user intended. In the case of Fig. 5, however, the three leading detected coordinates of the horizontal stroke (shown in color) and the three leading detected coordinates of the vertical stroke (shown in color) are not reflected in the drawn line; as a result, each drawn stroke is shorter than the stroke as input.
Fig. 6 shows the relationship between the input duration and the input wait time of each stroke when characters are handwritten using the electronic board 101. As shown in Fig. 6, when the character "F" is input, the handwritten data consists of strokes a1, a2, and a3.
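The inter-stroke wait time illustrated in Fig. 6 underlies the elapsed-time check of claim 1: when the wait from the previous up operation to a new down operation is short, the new down can be treated as a follow-on stroke of the same character and an event issued at its detected coordinates immediately. A minimal sketch, assuming a hypothetical function name and time unit:

```python
def should_issue_immediately(t_down, t_prev_up, threshold):
    """Claim-1 style check: a down operation that follows the previous up
    operation within `threshold` seconds (e.g. stroke a2 right after a1
    when writing "F") gets an input-information event at its detected
    coordinates right away, without further confirmation."""
    return (t_down - t_prev_up) <= threshold
```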
Fig. 7 shows the difference in the distribution of input points between a click input and a line input. As indicated by reference numeral 701, during a click input, taking the input start point as the origin, the coordinates move very little in the X and Y directions until the input ends; that is, the input points are concentrated near the origin. By contrast, as indicated by reference numeral 702, during a line input, taking the input start point as the origin, the coordinates of the input points move in the X or Y direction before the input ends, so the input points are likely to be distributed at positions far from the origin. Therefore, if the coordinates move at least a fixed distance from the input start point before the input ends, it can be inferred that the input is a line input rather than a click input.
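The distance-based inference described for Fig. 7 can be sketched as follows; the function name `classify_input` and the Euclidean distance measure are illustrative assumptions, as the source does not specify the distance metric.

```python
import math

def classify_input(points, distance_threshold):
    """Classify a sequence of input points as a 'line' if any point has
    moved at least distance_threshold from the input start point before
    the input ends; otherwise treat it as a 'click'."""
    x0, y0 = points[0]                       # input start point as origin
    for x, y in points[1:]:
        if math.hypot(x - x0, y - y0) >= distance_threshold:
            return "line"
    return "click"
```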
Fig. 8 shows the functional block configuration of the erroneous-input prevention device realized as part of the functions of the display content control program 1041 stored in the internal memory of the control computer 104. Circuits corresponding to this functional block configuration can, of course, also be implemented on a circuit board or in a semiconductor integrated circuit.
Fig. 9 shows a flowchart corresponding to the erroneous-input prevention processing executed as a function of the execution function control unit 802. As described above, the processing functions of the execution function control unit 802 are executed as part of the functions of a program; below, they are described as processing performed by the execution function control unit 802.
In the embodiment described above, an optical input device using the principle of triangulation was applied as the coordinate input device capable of simultaneous input of multiple coordinates. However, the present invention processes information about operation inputs detected using a coordinate input device and does not depend on the coordinate input method. It can therefore also be applied to preventing erroneous input to a device that can detect multiple points simultaneously (for example, a capacitive touch panel). Accordingly, the coordinate input device according to the embodiment also includes a touch panel; in this case, the operation input surface of the coordinate input device coincides with the surface of the touch panel.
102 Input pen
103 Operation screen projection device
104 Control computer
105 Keyboard
106 Display device
107 Input area
1041 Display processing program
301 Input data
302 Detection time
303 Number of detected objects
304 Coordinate information
305 Detected object size information
801 Input information analysis unit
802 Execution function control unit
803 Input information storage unit
Claims (4)
- A coordinate input device capable of detecting down and up operations of an input object on an operation input surface and of detecting simultaneous operation inputs to a plurality of coordinates, the coordinate input device comprising:
processing means for detecting information about an operation input of the input object on a displayed or projected operation screen;
processing means for calculating the elapsed time from the previous up operation to a new down operation; and
processing means for issuing an input information event at the detected coordinates of the down operation when the calculated elapsed time is equal to or less than a threshold.
- A coordinate input device capable of detecting simultaneous operation inputs to a plurality of coordinates, the coordinate input device comprising:
processing means for detecting the coordinates of an operation input of an input object on a displayed or projected operation screen;
processing means for storing the group of detected coordinate data in a storage area as object data;
processing means for calculating the movement distance from the coordinates at the start of input to the current coordinates; and
processing means for issuing an input start point event at the coordinates at the start of input when the calculated movement distance is equal to or greater than a threshold.
- A program for causing a computer that receives, from a coordinate input device capable of detecting down and up operations of an input object on an operation input surface and of detecting simultaneous operation inputs to a plurality of coordinates, information about an operation input of the input object on a displayed or projected operation screen, to execute:
a process of calculating the elapsed time from the previous up operation to a new down operation; and
a process of issuing an input information event at the detected coordinates of the down operation when the calculated elapsed time is equal to or less than a threshold.
- A program for causing a computer that receives, from a coordinate input device capable of detecting simultaneous operation inputs to a plurality of coordinates, the coordinates of an operation input of an input object on a displayed or projected operation screen, to execute:
a process of storing the group of detected coordinate data in a storage area as object data;
a process of calculating the movement distance from the coordinates at the start of input to the current coordinates; and
a process of issuing an input start point event at the coordinates at the start of input when the calculated movement distance is equal to or greater than a threshold.
Priority Applications (3)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN2011800149326A CN102804111A (zh) | 2010-03-24 | 2011-02-22 | 坐标输入装置以及程序 |
EP11759120.6A EP2551751A4 (en) | 2010-03-24 | 2011-02-22 | COORDINATE INPUT DEVICE AND PROGRAM |
US13/634,442 US20130002542A1 (en) | 2010-03-24 | 2011-02-22 | Coordinate input device and program |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2010068003A JP5486977B2 (ja) | 2010-03-24 | 2010-03-24 | 座標入力装置及びプログラム |
JP2010-068003 | 2010-03-24 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2011118312A1 true WO2011118312A1 (ja) | 2011-09-29 |
Family
ID=44672884
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2011/053757 WO2011118312A1 (ja) | 2010-03-24 | 2011-02-22 | 座標入力装置及びプログラム |
Country Status (5)
Country | Link |
---|---|
US (1) | US20130002542A1 (ja) |
EP (1) | EP2551751A4 (ja) |
JP (1) | JP5486977B2 (ja) |
CN (1) | CN102804111A (ja) |
WO (1) | WO2011118312A1 (ja) |
Cited By (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
TWI764906B (zh) * | 2016-09-01 | 2022-05-21 | 日商和冠股份有限公司 | 座標輸入處理裝置、情感推定裝置、情感推定系統及情感推定用資料庫之構築裝置 |
Families Citing this family (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN103577079B (zh) * | 2012-07-24 | 2017-11-07 | 腾讯科技(深圳)有限公司 | 电子设备中实现与应用交互的方法及电子设备 |
JP2014026365A (ja) * | 2012-07-25 | 2014-02-06 | Brother Ind Ltd | パネル制御装置、パネル制御方法、及びパネル制御プログラム |
WO2015052765A1 (ja) * | 2013-10-08 | 2015-04-16 | 日立マクセル株式会社 | 投射型映像表示装置、操作検出装置及び投射型映像表示方法 |
JP6219260B2 (ja) * | 2014-11-26 | 2017-10-25 | アルプス電気株式会社 | 入力装置とその制御方法及びプログラム |
CN105068688B (zh) * | 2015-08-12 | 2018-09-18 | Tcl移动通信科技(宁波)有限公司 | 一种触摸屏的报点方法及其装置 |
Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH09265345A (ja) * | 1996-03-28 | 1997-10-07 | Sanyo Electric Co Ltd | ストローク情報符号化方式 |
JP2000357046A (ja) * | 1999-06-15 | 2000-12-26 | Mitsubishi Electric Corp | 手書入力装置、方法ならびに手書入力プログラムを記録したコンピュータで読取可能な記録媒体 |
JP2006343856A (ja) * | 2005-06-07 | 2006-12-21 | Fujitsu Ltd | 手書き情報入力装置。 |
JP2009086886A (ja) | 2007-09-28 | 2009-04-23 | Hitachi Software Eng Co Ltd | 電子ボードシステム |
Family Cites Families (11)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US5864635A (en) * | 1996-06-14 | 1999-01-26 | International Business Machines Corporation | Distinguishing gestures from handwriting in a pen based computer by stroke analysis |
EP1191430A1 (en) * | 2000-09-22 | 2002-03-27 | Hewlett-Packard Company, A Delaware Corporation | Graphical user interface for devices having small tactile displays |
US7254775B2 (en) * | 2001-10-03 | 2007-08-07 | 3M Innovative Properties Company | Touch panel system and method for distinguishing multiple touch inputs |
US8373660B2 (en) * | 2003-07-14 | 2013-02-12 | Matt Pallakoff | System and method for a portable multimedia client |
JP2006192246A (ja) * | 2004-12-13 | 2006-07-27 | Nintendo Co Ltd | ゲーム装置およびゲームプログラム |
US9785329B2 (en) * | 2005-05-23 | 2017-10-10 | Nokia Technologies Oy | Pocket computer and associated methods |
CN1991699A (zh) * | 2005-12-28 | 2007-07-04 | 中兴通讯股份有限公司 | 一种实现手写输入的方法 |
US8169421B2 (en) * | 2006-06-19 | 2012-05-01 | Cypress Semiconductor Corporation | Apparatus and method for detecting a touch-sensor pad gesture |
WO2008013658A2 (en) * | 2006-07-03 | 2008-01-31 | Cliff Kushler | System and method for a user interface for text editing and menu selection |
US8745518B2 (en) * | 2009-06-30 | 2014-06-03 | Oracle America, Inc. | Touch screen input recognition and character selection |
US8957918B2 (en) * | 2009-11-03 | 2015-02-17 | Qualcomm Incorporated | Methods for implementing multi-touch gestures on a single-touch touch surface |
-
2010
- 2010-03-24 JP JP2010068003A patent/JP5486977B2/ja not_active Expired - Fee Related
-
2011
- 2011-02-22 CN CN2011800149326A patent/CN102804111A/zh active Pending
- 2011-02-22 EP EP11759120.6A patent/EP2551751A4/en not_active Withdrawn
- 2011-02-22 WO PCT/JP2011/053757 patent/WO2011118312A1/ja active Application Filing
- 2011-02-22 US US13/634,442 patent/US20130002542A1/en not_active Abandoned
Patent Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH09265345A (ja) * | 1996-03-28 | 1997-10-07 | Sanyo Electric Co Ltd | ストローク情報符号化方式 |
JP2000357046A (ja) * | 1999-06-15 | 2000-12-26 | Mitsubishi Electric Corp | 手書入力装置、方法ならびに手書入力プログラムを記録したコンピュータで読取可能な記録媒体 |
JP2006343856A (ja) * | 2005-06-07 | 2006-12-21 | Fujitsu Ltd | 手書き情報入力装置。 |
JP2009086886A (ja) | 2007-09-28 | 2009-04-23 | Hitachi Software Eng Co Ltd | 電子ボードシステム |
Non-Patent Citations (1)
Title |
---|
See also references of EP2551751A4 |
Cited By (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
TWI764906B (zh) * | 2016-09-01 | 2022-05-21 | 日商和冠股份有限公司 | 座標輸入處理裝置、情感推定裝置、情感推定系統及情感推定用資料庫之構築裝置 |
Also Published As
Publication number | Publication date |
---|---|
JP5486977B2 (ja) | 2014-05-07 |
EP2551751A4 (en) | 2015-07-15 |
EP2551751A1 (en) | 2013-01-30 |
CN102804111A (zh) | 2012-11-28 |
US20130002542A1 (en) | 2013-01-03 |
JP2011203796A (ja) | 2011-10-13 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US8446376B2 (en) | Visual response to touch inputs | |
TWI514229B (zh) | 圖形編輯方法以及電子裝置 | |
CN110058782B (zh) | 基于交互式电子白板的触摸操作方法及其系统 | |
US8907907B2 (en) | Display device with touch panel, event switching control method, and computer-readable storage medium | |
WO2011118312A1 (ja) | 座標入力装置及びプログラム | |
WO2011048840A1 (ja) | 入力動作解析方法および情報処理装置 | |
CN104166509A (zh) | 一种非接触式屏幕交互方法及系统 | |
KR102463657B1 (ko) | 다중 오브젝트 구조를 인식하기 위한 시스템 및 방법 | |
US20140354550A1 (en) | Receiving contextual information from keyboards | |
US20170192465A1 (en) | Apparatus and method for disambiguating information input to a portable electronic device | |
JP2016062183A (ja) | 情報処理装置、その制御方法、プログラム、及び記憶媒体 | |
JP2015022624A (ja) | 情報処理装置およびその制御方法、コンピュータプログラム、記憶媒体 | |
WO2011118313A1 (ja) | 座標入力装置及びプログラム | |
JP5440926B2 (ja) | 情報処理システム及びそのプログラム | |
JP5651358B2 (ja) | 座標入力装置及びプログラム | |
US20180059806A1 (en) | Information processing device, input control method for controlling input to information processing device, and computer-readable storage medium storing program for causing information processing device to perform input control method | |
US20140146001A1 (en) | Electronic Apparatus and Handwritten Document Processing Method | |
JP4901670B2 (ja) | 電子ボードシステム | |
US20130187893A1 (en) | Entering a command | |
JP2012022430A (ja) | 情報処理システム及びそのプログラム | |
JP2003186620A (ja) | ポインティング機能付き表示装置を備えた情報処理装置 | |
JP4901672B2 (ja) | 電子ボードシステム | |
US20230070034A1 (en) | Display apparatus, non-transitory recording medium, and display method | |
JP7508916B2 (ja) | 表示装置、表示方法およびプログラム | |
JP5140628B2 (ja) | 電子ボードシステム及びプログラム |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
WWE | Wipo information: entry into national phase |
Ref document number: 201180014932.6 Country of ref document: CN |
|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 11759120 Country of ref document: EP Kind code of ref document: A1 |
|
WWE | Wipo information: entry into national phase |
Ref document number: 13634442 Country of ref document: US |
|
NENP | Non-entry into the national phase |
Ref country code: DE |
|
REEP | Request for entry into the european phase |
Ref document number: 2011759120 Country of ref document: EP |
|
WWE | Wipo information: entry into national phase |
Ref document number: 2011759120 Country of ref document: EP |