WO2014133258A1 - Pen input apparatus and method for operating the same - Google Patents

Pen input apparatus and method for operating the same Download PDF

Info

Publication number
WO2014133258A1
Authority
WO
WIPO (PCT)
Prior art keywords
pen input
pen
performance
touch
sensing unit
Application number
PCT/KR2014/000024
Other languages
French (fr)
Inventor
Beom Soo Yoo
Original Assignee
Pandoratech Co., Ltd.
Application filed by Pandoratech Co., Ltd. filed Critical Pandoratech Co., Ltd.
Publication of WO2014133258A1 publication Critical patent/WO2014133258A1/en

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/033Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
    • G06F3/038Control and interface arrangements therefor, e.g. drivers or device-embedded control circuitry
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/033Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F13/00Interconnection of, or transfer of information or other signals between, memories, input/output devices or central processing units
    • G06F13/14Handling requests for interconnection or transfer
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/033Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
    • G06F3/0354Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor with detection of 2D relative movements between the device, or an operating part thereof, and a plane or surface, e.g. 2D mice, trackballs, pens or pucks
    • G06F3/03545Pens or stylus

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Human Computer Interaction (AREA)
  • Position Input By Displaying (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

There are disclosed a pen input apparatus and a method for operating the same. The apparatus includes a touch sensing unit arranged at an end of the pen input apparatus to sense touch accompanying pen input and thereby determine performance or non-performance of a user's pen input; an optical sensing unit arranged at the end of the pen input apparatus to sense a locus of the user's pen input; a motion sensing unit to sense state change in one or more of rotation, motion and tilt about one or more axes; and a pen input analyzing unit to generate pen input data after analyzing the performance or non-performance of the pen input determined by the touch sensing unit, the locus of the pen input sensed by the optical sensing unit and the state change sensed by the motion sensing unit. According to the embodiments of the present disclosure, the motion of the pen input apparatus may be recognized and the user's pen input may be realized accurately.

Description

PEN INPUT APPARATUS AND METHOD FOR OPERATING THE SAME
The present invention relates to a pen input apparatus and a method for operating the same, and more particularly to a pen input apparatus and an operating method which can recognize a user's pen input without an auxiliary device.
With the development of digital technology, users can write memos in various ways, using various digital devices. Specifically, users can write characters or draw pictures with a wireless pen mouse, and the written characters or pictures may be displayed on computers or laptops. Users can also write memos directly on smartphones or tablet PCs, using touch pens.
However, when writing with a conventional wireless pen mouse or touch pen, the user's pen input is accompanied by movement of the pen that leaves no mark (e.g., the spacing between words). This movement has to be compensated for in order to realize the user's pen input accurately, and there have been various attempts to recognize and track it.
One attempt uses a pad dedicated to coordinate recognition. When a user writes with a touch pen on such a pad, an electromagnetic response between the pad and the pen enables coordinates to be set for the inputs created by the pen, so that the user's pen input can be recognized based on those coordinates.
Another attempt uses a communication device for location measurement. Such a location measurement device tracks the movement of an electronic pen while communicating with it continuously. In particular, it measures its relative distance to the electronic pen through this communication, tracks the pen's location, and thereby recognizes the positions of the pen inputs made with the pen.
A third attempt uses a notebook dedicated to coordinate recognition. A pattern formed on the notebook is recognized by an electronic pen that includes a camera, and the positions of the user's pen inputs can be recognized based on the recognized pattern.
However, all of these attempts require auxiliary devices, namely the dedicated pad, the location measurement device or the dedicated notebook, and they suffer from uneconomically high cost and user inconvenience.
Technology that improves user convenience while overcoming these disadvantages is therefore required.
[PRIOR ART REFERENCE]
Prior reference 001: KR 10-0970585 B1
Prior reference 002: KR 10-1066829 B1
Exemplary embodiments of the present disclosure provide a pen input apparatus and a method for operating the same which can accurately realize a user's pen input by recognizing the movement of the pen, even without an auxiliary device.
To achieve these objects and other advantages and in accordance with the purpose of the invention, as embodied and broadly described herein, a pen input apparatus includes a touch sensing unit arranged at an end of the pen input apparatus to sense touch accompanying pen input and thereby determine performance or non-performance of a user's pen input; an optical sensing unit arranged at the end of the pen input apparatus to sense a locus of the user's pen input; a motion sensing unit to sense state change in one or more of rotation, motion and tilt about one or more axes; and a pen input analyzing unit to generate pen input data after analyzing the performance or non-performance of the pen input determined by the touch sensing unit, the locus of the pen input sensed by the optical sensing unit and the state change sensed by the motion sensing unit.
The pen input analyzing unit may generate a first pen input, accompanying touch, based on the performance or non-performance of the pen input and the locus of the pen input; a second pen input, accompanying no touch, based on the performance or non-performance of the pen input and the state change; and the pen input data by combining the first pen input and the second pen input. When one of the first and second pen inputs follows the other in the pen input data, the pen input start point of the one is in accord with the pen input finish point of the other.
In another aspect of the present disclosure, a method for operating a pen input apparatus includes steps of sensing touch accompanying the pen input to determine performance or non-performance of the user's pen input; sensing a locus of the user's pen input; sensing change in one or more states of rotation, motion and tilt about one or more axes of the pen input apparatus; and generating pen input data by analyzing the performance or non-performance of the pen input determined in the step of sensing the touch, the locus of the pen input sensed in the step of sensing the locus, and the state change sensed in the step of sensing the state change.
The step of generating the pen input data may include generating a first pen input, accompanying touch, based on the performance or non-performance of the pen input and the locus of the pen input; generating a second pen input, accompanying no touch, based on the performance or non-performance of the pen input and the state change; and generating the pen input data by combining the first pen input and the second pen input. When one of the first and second pen inputs follows the other in the pen input data, the pen input start point of the one is in accord with the pen input finish point of the other.
According to the embodiments of the present disclosure, the motion of the pen input apparatus may be recognized and the user’s pen input may be realized accurately.
Additional advantages, objects, and features of the invention will be set forth in part in the description which follows and in part will become apparent to those having ordinary skill in the art upon examination of the following or may be learned from practice of the invention. The objectives and other advantages of the invention may be realized and attained by the structure particularly pointed out in the written description and claims hereof as well as the appended drawings.
FIG. 1 is a diagram illustrating a pen input system according to one embodiment of the present disclosure;
FIG. 2 is a block diagram illustrating a pen input apparatus according to one embodiment of the present disclosure;
FIG. 3 is a diagram illustrating a pen input apparatus according to another embodiment of the present disclosure;
FIG. 4 is a diagram illustrating operation of the pen input apparatus according to one embodiment of the present disclosure;
FIG. 5 is a flow chart illustrating a method for operating a pen input apparatus according to one embodiment of the present disclosure; and
FIG. 6 is a flow chart illustrating a step of generating pen input data by analyzing pen input shown in FIG. 5.
Hereinafter, exemplary embodiments of the disclosed subject matter are described more fully with reference to the accompanying drawings. The disclosed subject matter may, however, be embodied in many different forms and should not be construed as limited to the exemplary embodiments set forth herein. Rather, the exemplary embodiments are provided so that this disclosure is thorough and complete, and will convey the scope of the disclosed subject matter to those skilled in the art. In the drawings, the size and relative sizes of layers and regions may be exaggerated for clarity. Like reference numerals in the drawings denote like elements.
FIG. 1 is a diagram illustrating a pen input system 100 according to one embodiment of the present disclosure.
The pen input system 100 may include a pen input apparatus 110 and a terminal 120.
The pen input apparatus 110 is an apparatus which receives a user's pen input and generates pen input data based on it. The pen input apparatus 110 replaces a conventional keyboard or mouse and may be used for writing letters or drawing pictures. Examples of the pen input apparatus 110 include an electronic pen and a pen mouse, but the embodiments of the disclosure are not limited thereto; various devices capable of receiving a user's pen input may be used.
The terminal 120 may be provided with the pen input data from the pen input apparatus 110 and display the pen input based on the pen input data. Examples of the terminal 120 include a smartphone, a tablet PC, a laptop and a computer, but the examples are not limited thereto; any external device capable of transmitting data to and receiving data from the pen input apparatus 110 and displaying the pen input may be used.
FIG. 2 is a block diagram illustrating a pen input apparatus 200 according to one embodiment of the present disclosure.
The pen input apparatus 200 may include a touch sensing unit 210, an optical sensing unit 220, a motion sensing unit 230, a pen input analyzing unit 240, a communication unit 250, a memory unit 260 and a user interface unit 270.
The touch sensing unit 210 may sense touch generated by pen input. When the user performs pen input, the pen input apparatus 200 touches an external object (e.g., paper or a pad). The touch sensing unit 210 senses this touch and thereby determines whether the user is performing pen input. To make it easy to sense contact with the external object when pen input occurs, the touch sensing unit 210 may be arranged at an end of the pen input apparatus 200. The touch sensing unit 210 may include a touch sensor and various other components.
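The disclosure does not specify how the touch sensor's raw signal is converted into the performance/non-performance decision. The sketch below is one minimal, hypothetical way to do it, assuming a pressure-like tip signal; the function name, threshold and debounce count are illustrative, not from the patent:

```python
def detect_pen_down(pressure_samples, threshold=0.05, min_samples=3):
    """Decide performance (True) or non-performance (False) of pen
    input from raw tip-sensor readings. Requiring several consecutive
    samples above the threshold debounces spurious contacts."""
    consecutive = 0
    for p in pressure_samples:
        consecutive = consecutive + 1 if p > threshold else 0
        if consecutive >= min_samples:
            return True
    return False
```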
The optical sensing unit 220 may sense the locus of the user's pen input. The optical sensing unit 220 may emit light outward and sense the motion of the end of the pen input apparatus 200, in other words, the locus of the user's pen input. To sense the locus of the pen input easily, the optical sensing unit 220 may be arranged at the end of the pen input apparatus 200. Since the optical sensing unit 220 senses the locus by detecting the light reflected back after emission, it senses the locus with a high recognition rate while the touch sensing unit 210 is sensing touch. However, even when the touch sensing unit 210 senses no touch with the outside, the locus of the pen input may still be sensed within a predetermined range (e.g., a predetermined range of distances or recognition rates), depending on the type of optical sensing unit. The optical sensing unit 220 may include one or more of an optical sensor, a light emitting diode (LED), a laser sensor and a BlueTrack sensor. When the optical sensor detects light emitted from the LED and reflected off a desk or notebook, the optical sensing unit 220 processes the signal transmitted from the optical sensor and recognizes the locus of the pen input made by the pen input apparatus 200. This configuration of the optical sensing unit 220 is exemplary, and other configurations for sensing the locus of the input made by the pen input apparatus 200 may be used according to embodiments of the present disclosure.
The motion sensing unit 230 may sense the overall motion of the pen input apparatus 200, in other words, changes in states such as one or more of rotation, motion and tilt about one or more axes. For that, the motion sensing unit 230 may include one or more measuring sensors for sensing state changes of the pen input apparatus 200. Such a measuring sensor may include one or more of an acceleration sensor, a gyro sensor, a geomagnetic sensor (e.g., a compass) and an electromagnetic field sensor. These components of the motion sensing unit 230 are examples, and the motion sensing unit 230 may be composed of various other components capable of sensing state changes of the pen input apparatus 200.
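As a rough illustration of what "state change" from such measuring sensors can look like, the following sketch derives tilt from one accelerometer sample and accumulates rotation from a gyro rate. This is textbook IMU arithmetic, not a method prescribed by the patent, and all names are hypothetical:

```python
import math

def tilt_from_accel(ax, ay, az):
    """Estimate pitch and roll (radians) from one accelerometer sample,
    assuming the pen is quasi-static so gravity dominates the reading."""
    pitch = math.atan2(-ax, math.hypot(ay, az))
    roll = math.atan2(ay, az)
    return pitch, roll

def integrate_gyro(angle, rate, dt):
    """Accumulate rotation (radians) about one axis from a gyro rate
    sample (rad/s) over one sampling interval dt (s)."""
    return angle + rate * dt
```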
The pen input analyzing unit 240 may generate pen input data. The pen input data is data on the user's pen input, and it may include a first pen input and a second pen input. The first pen input means a pen input accompanying touch, such as the characters or lines realized through the pen input apparatus 200. The second pen input means a pen input accompanying no touch, such as the spacing between strokes or words. For example, a pen input for “기” includes first pen inputs for the strokes “ㄱ” and “ㅣ” and a second pen input for the pen movement between “ㄱ” and “ㅣ”.
The analysis of the pen input analyzing unit 240 may be performed based on the results of the sensing performed by the touch sensing unit 210, the optical sensing unit 220 and the motion sensing unit 230. Specifically, the pen input analyzing unit 240 may generate the first pen input, accompanying the touch sensed by the touch sensing unit 210, based on the performance or non-performance of the pen input determined by the touch sensing unit 210 and the locus of the pen input sensed by the optical sensing unit 220. In other words, the locus of the pen input sensed by the optical sensing unit 220 while the touch sensing unit 210 is sensing touch may be generated as the first pen input. Similarly, the second pen input, accompanying no touch, may be generated based on the performance or non-performance of the pen input determined by the touch sensing unit 210 and the state change sensed by the motion sensing unit 230. In other words, the state change sensed by the motion sensing unit 230 while the touch sensing unit 210 is sensing no touch may be generated as the second pen input. When the first pen input and the second pen input have been generated, they are combined to generate the pen input data. When one of the first and second pen inputs follows the other, the start point of the one pen input is made identical to the finish point of the other, which makes it possible to realize continuous pen inputs accurately.
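A minimal sketch of this combining step, assuming each pen input arrives as a list of 2-D points (absolute for touch segments from the optical sensor, displacement-derived for non-touch segments from motion sensing); the segment data layout is an assumption made for illustration:

```python
def combine_pen_inputs(segments):
    """Chain alternating first (touch) and second (non-touch) pen inputs
    into one pen-input-data sequence. Each segment is
    {'touch': bool, 'points': [(x, y), ...]}. When one segment follows
    another, it is translated so its start point coincides with the
    previous segment's finish point."""
    pen_input_data = []
    cursor = (0.0, 0.0)  # current pen position
    for seg in segments:
        pts = seg['points']
        if not pts:
            continue
        dx = cursor[0] - pts[0][0]  # shift needed to align the start point
        dy = cursor[1] - pts[0][1]
        shifted = [(x + dx, y + dy) for x, y in pts]
        pen_input_data.append({'touch': seg['touch'], 'points': shifted})
        cursor = shifted[-1]  # the next segment starts where this one ends
    return pen_input_data
```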
Moreover, the pen input analyzing unit 240 may generate a third pen input, accompanying no touch, based on the performance or non-performance of the pen input determined by the touch sensing unit 210 and the locus of the pen input sensed by the optical sensing unit 220. In other words, the locus of the pen input sensed by the optical sensing unit 220 while the touch sensing unit 210 is sensing no touch may be generated as the third pen input. When the third pen input is generated, the second pen input may be compensated by the third pen input. The locus of the pen input is sensed by the optical sensing unit 220 at a high recognition rate when the touch sensing unit 210 senses touch, but the locus may still be sensed within a predetermined range (e.g., a range of predetermined distances or recognition rates) even when the touch sensing unit 210 senses no touch. Accordingly, the pen input analyzing unit 240 compensates the second pen input using the third pen input, such that the motion of the pen input apparatus is reflected in the pen input data more accurately.
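The patent leaves the compensation method open; one simple, hypothetical reading is to blend the motion-derived hover trajectory (second pen input) with whatever partial trajectory the optical sensor still captured (third pen input), trusting the optical locus by a fixed weight:

```python
def compensate_second_input(motion_points, optical_points, weight=0.5):
    """Blend a non-touch trajectory from motion sensing with the partial
    trajectory the optical sensor still senses near the surface.
    Both point lists are assumed to be time-aligned; `weight` is the
    (illustrative) trust placed in the optical locus."""
    compensated = [
        (weight * ox + (1.0 - weight) * mx,
         weight * oy + (1.0 - weight) * my)
        for (mx, my), (ox, oy) in zip(motion_points, optical_points)
    ]
    # keep the tail of the motion trace where the optical sensor lost track
    compensated.extend(motion_points[len(optical_points):])
    return compensated
```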
Alternatively, the pen input apparatus 200 may have no optical sensing unit 220, and the pen input analyzing unit 240 may generate the pen input data based on the sensing of the touch sensing unit 210 and the motion sensing unit 230 alone. Specifically, the pen input analyzing unit 240 may generate a first pen input and a second pen input based on the performance or non-performance of the pen input determined by the touch sensing unit 210 and the state change sensed by the motion sensing unit 230. In other words, the state change sensed by the motion sensing unit 230 while the touch sensing unit 210 is sensing touch may be generated as the first pen input, and the state change sensed by the motion sensing unit 230 while the touch sensing unit 210 is sensing no touch may be generated as the second pen input. When the first pen input and the second pen input have been generated, the pen input analyzing unit 240 may combine them and generate the pen input data based on the combined pen input. When one of the first and second pen inputs follows the other in the pen input data, the pen input start point of the one is made identical to the pen input finish point of the other, such that continuous pen inputs may be realized accurately.
The communication unit 250 may transmit one or more of the pen input data generated by the pen input analyzing unit 240 and the pen input data stored in the memory unit 260 to an external terminal. The communication unit 250 may transmit the pen input data in real time or according to the operation of the pen input apparatus 200. For example, when the user interface unit 270 receives the user's input requesting transmission of the pen input data, the communication unit 250 may transmit the pen input data stored in the memory unit 260. Alternatively, when the touch sensing unit 210 senses no touch, the communication unit 250 may transmit the pen input data stored in the memory unit 260 immediately or after a predetermined time. The communication unit 250 may communicate with the external terminal through a wired communication method (e.g., Universal Serial Bus (USB) or a mouse port (PS/2 port)) or a wireless communication method (e.g., Bluetooth, Infrared Data Association (IrDA) or other RF communication). However, the examples of the communication unit 250 are not limited thereto, and various communication methods enabling data communication may be applied.
The pen input data generated by the pen input analyzing unit 240 may be stored in the memory unit 260, and the pen input data stored in the memory unit 260 may be transmitted to the external terminal by the communication unit 250. In one embodiment, while communication between the communication unit 250 and the external terminal is not connected, the pen input data is stored in the memory unit 260; once the communication is connected, the communication unit 250 may transmit the stored pen input data to the external terminal.
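A sketch of this store-and-forward behavior, with the transport (`send_fn`) and link check (`is_connected_fn`) left abstract since the patent allows USB, PS/2, Bluetooth, IrDA or other RF links; both callables and the class itself are hypothetical names:

```python
from collections import deque

class PenDataBuffer:
    """Hold pen input data in the memory unit while the link to the
    external terminal is down, then flush once it is (re)connected."""

    def __init__(self, send_fn, is_connected_fn):
        self._queue = deque()
        self._send = send_fn                # e.g. a Bluetooth or USB write
        self._is_connected = is_connected_fn

    def submit(self, pen_input_data):
        if self._is_connected():
            self._send(pen_input_data)      # real-time path
        else:
            self._queue.append(pen_input_data)

    def flush(self):
        """Call when the connection to the terminal is established."""
        while self._queue and self._is_connected():
            self._send(self._queue.popleft())
```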
The user interface unit 270 may receive the user's input. The user's input may include one or more commands for starting/finishing the operation of the pen input apparatus and for transmitting the pen input data. The user interface unit 270 may include a button, a touch sensing unit (or a touch sensor) and an input device (e.g., a joystick or a jog wheel). The user interface unit 270 may be exposed on the outside of the pen input apparatus 200 and may consist of a plurality of input devices of one type or a combination of various input devices. This configuration is an example, and various configurations capable of receiving the user's input may be applied according to embodiments of the present disclosure.
A writing instrument may be coupled to a writing instrument coupling unit (not shown). When the writing instrument is coupled to the coupling unit, the pen input apparatus 200 may also be used as an ordinary writing instrument. To make the writing instrument easy to use, the coupling unit may be provided at an end of the pen input apparatus 200.
FIG. 3 is a diagram illustrating a pen input apparatus according to another embodiment of the present disclosure.
Referring to FIG. 3, a pen input apparatus 300 includes a touch sensing unit 310, an optical sensing unit 320, a motion sensing unit 330, a pen input analyzing unit 340, a communication unit 350, a memory unit 360 and a user interface unit 370.
Descriptions of the touch sensing unit 310, the optical sensing unit 320, the motion sensing unit 330, the pen input analyzing unit 340, the communication unit 350, the memory unit 360 and the user interface unit 370 which are provided in the pen input apparatus 300 shown in FIG. 3 are the same as the descriptions of the touch sensing unit 210, the optical sensing unit 220, the motion sensing unit 230, the pen input analyzing unit 240, the communication unit 250, the memory unit 260 and the user interface unit 270 which are provided in the pen input apparatus 200 shown in FIG. 2, respectively.
The structure and type of the pen input apparatus 300 shown in FIG. 3 are examples. Various structures and types may be applied according to embodiments.
FIG. 4 is a diagram illustrating operation of the pen input apparatus according to one embodiment of the present disclosure.
Referring to FIG. 4 (a), the pen input apparatus may sense its own operation and generate pen inputs 410, 430 and 450. The user's pen input includes a pen input 410, a pen input 430 and a pen input 450, which are performed sequentially. The pen inputs 410 and 450 refer to pen inputs accompanying touch, and the pen input 430 refers to a pen input accompanying no touch. The pen input apparatus generates pen input data based on the pen inputs 410, 430 and 450. At this time, the pen input finish point 420 of the pen input 410 is in accord with the pen input start point 420 of the pen input 430, connecting the continuous pen inputs 410, 430 and 450 with each other. Likewise, the pen input finish point 440 of the pen input 430 is in accord with the pen input start point 440 of the pen input 450.
The pen input data is displayed as shown in FIG. 4 (b). Referring to FIG. 4 (b), the pen inputs 410 and 450 are pen inputs accompanying touch and are therefore displayed with an outline or color, while the pen input 430 is a pen input accompanying no touch and is therefore displayed transparently. Accordingly, the entire character “기” may be displayed.
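How "displayed transparently" is rendered is up to the terminal; a plausible sketch using matplotlib draws touch segments solid and non-touch segments faint and dashed, so the character reads as written while the connecting motion stays visible. It expects the segment format assumed in the combine_pen_inputs() sketch above:

```python
import matplotlib.pyplot as plt

def display_pen_input_data(pen_input_data):
    """Render touch segments as solid strokes and non-touch segments
    nearly transparently, as in FIG. 4 (b)."""
    for seg in pen_input_data:
        xs = [p[0] for p in seg['points']]
        ys = [p[1] for p in seg['points']]
        if seg['touch']:
            plt.plot(xs, ys, color='black', linewidth=2)
        else:
            plt.plot(xs, ys, color='gray', alpha=0.2, linestyle='--')
    plt.gca().set_aspect('equal')
    plt.show()
```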
FIG. 5 is a flow chart illustrating a method 500 for operating a pen input apparatus according to one embodiment of the present disclosure.
In a step (510), touch accompanying the pen input may be sensed. In the step (510), touch with an external object is sensed, and it is determined whether the pen input is being performed by the user.
In a step (520), the locus of the user's pen input may be sensed. The step (520) of sensing the locus may be performed while the step (510) of sensing the touch is being performed, or within a predetermined range (e.g., a range of predetermined distances and recognition rates) even when the touch of the pen input is not sensed.
In a step (530), change in one or more of rotation, motion and tilt about one or more axes may be sensed.
A step (540) analyzes the performance or non-performance of the pen input determined in the step (510), the locus of the pen input sensed in the step (520) and the state change sensed in the step (530), and generates the pen input data based on the result of the analysis. The pen input data means data about the user's pen input whose positions are set in consideration of the motion of the pen input apparatus.
In one embodiment, the method 500 may further include a step of transmitting the pen input data to an external terminal. The transmitting step may transmit the pen input data in real time; alternatively, it may transmit the pen input data according to the user's input requesting the transmission, or it may transmit the pen input data immediately or after a predetermined time when no touch is sensed in the step (510).
In another embodiment, the method 500 may further include a step of storing the pen input data. The stored pen input data may be transmitted to the external terminal immediately or in a predetermined time. For example, the storing step may store the pen input data, unless the communication with the external terminal is connected. After that, when the communication with the external terminal is connected, the stored pen input data may be transmitted to the external terminal.
In a further embodiment, the method 500 may further include a step of receiving the user’s input for one or more of starting/finishing of the operation of the pen input apparatus and transmitting of the pen input data.
In a still further embodiment, the method 500 may further include a step of coupling a writing material to an end of the pen input apparatus.
FIG. 6 is a flow chart illustrating a step of generating pen input data by analyzing pen input shown in FIG. 5.
Referring to FIGS. 5 and 6, the step (540) may include a step (610) of generating a first pen input, a step (620) of generating a second pen input and a step (630) of combining the first pen input and the second pen input.
In the step (610), the first pen input accompanying touch may be generated based on the performance or non-performance of the pen input determined in the step (510) and the locus of the pen input sensed in the step (520). In other words, the locus of the pen input sensed in the step (520) while the touch is sensed in the step (510) may be generated as the first pen input.
In the step (620), the second pen input accompanying no touch may be generated based on the performance or non-performance of the pen input determined in the step (510) and the state change sensed in the step (530). In other words, the state change sensed in the step (530) while no touch is sensed in the step (510) may be generated as the second pen input.
In the step (630), the first pen input and the second pen input may be combined, and the pen input data may be generated based on the combined pen input. In the step (630), when one of the first and second pen inputs follows the other, the pen input start point of the one is made to accord with the pen input finish point of the other, such that the continuous pen inputs may be realized accurately.
Additionally, the step (540) may further include a step of generating a third pen input accompanying no touch, based on the performance or non-performance determined in the step (510) and the locus sensed in the step (520). In other words, the locus of the pen input sensed in the step (520) while no touch is sensed in the step (510) may be generated as the third pen input. Once the third pen input is generated, the step (540) may further include a step of compensating the second pen input using the third pen input.
Alternatively, the step (520) of the method 500 shown in FIG. 5 may be omitted. In that case, the step (540) is performed by analyzing the pen input based on the step (510) and the step (530) and generating the pen input data. Specifically, the step (540) may include a step of generating a first pen input accompanying touch and a second pen input accompanying no touch based on the performance or non-performance of the pen input determined in the step (510) and the state change sensed in the step (530). In other words, the state change sensed in the step (530) while the touch is sensed in the step (510) may be generated as the first pen input, and the state change sensed in the step (530) while no touch is sensed in the step (510) may be generated as the second pen input. Once the first pen input and the second pen input have been generated, the step (540) may further include a step of combining them to generate the pen input data. The combining step makes the pen input start point of the one accord with the pen input finish point of the other when one of the first and second pen inputs follows the other. Accordingly, the continuous pen inputs may be realized accurately.
It should be understood that the embodiments disclosed in the present disclosure may be realized by hardware, software, firmware, middleware, microcode and combinations thereof. When the embodiments are realized by hardware, processing units may include one or more of ASICs, DSPs, DSPDs, programmable logic devices, FPGAs (field programmable gate arrays), processors, controllers, micro-controllers, microprocessors, other electronic units designed to perform the functions disclosed herein, and combinations thereof.
When realizing the embodiments by software, firmware, middleware or microcode, the program code and the code segments may be stored in a machine-readable medium (e.g., a storage component). A code segment may refer to any combination of procedures, functions, sub-programs, programs, routines, sub-routines, modules, software packages, classes, commands, data structures and program statements. A code segment may be coupled to another code segment or a hardware circuit by transmitting and/or receiving information, arguments, parameters or memory contents. Information, arguments, parameters and data may be transmitted, forwarded or received using any suitable means including memory sharing, message passing, token passing and network transmission.
In case of a software implementation, the technologies disclosed herein may be realized by modules (e.g., procedures and functions) that perform the functions described herein. The software codes may be stored in memory units and executed by processors. A memory unit may be realized within the processor or outside the processor. When the memory unit is realized outside the processor, the memory unit may be coupled to the processor through various well-known means.
Various variations and modifications of the pen input apparatus described above are possible in the component parts and/or arrangements of the subject combination arrangement within the scope of the disclosure, the drawings and the appended claims. In addition to variations and modifications in the component parts and/or arrangements, alternative uses will also be apparent to those skilled in the art.

Claims (14)

  1. A pen input apparatus comprising:
    a touch sensing unit arranged in an end of the pen input apparatus to sense touch accompanied by pen input to determine performance or non-performance of a user’s pen input;
    an optical sensing unit arranged in the end of the pen input apparatus to sense a locus of the user’s pen input;
    a motion sensing unit to sense state change in one or more of rotation, motion and slope on one or more axes; and
    a pen input analyzing unit to generate pen input data after analyzing performance or non-performance of the pen input determined by the touch sensing unit, the locus of the pen input sensed by the optical sensing unit and the state change sensed by the motion sensing unit.
  2. The pen input apparatus as claimed in claim 1, further comprising:
    a communication unit to transmit the pen input data to an external terminal.
  3. The pen input apparatus as claimed in claim 2, further comprising:
    a memory unit to store the pen input data.
  4. The pen input apparatus as claimed in claim 3, wherein the communication unit transmits the pen input data stored in the memory unit, when no user’s pen input is sensed for a predetermined time.
  5. The pen input apparatus as claimed in claim 1, further comprising:
    a user interface unit to receive the user’s input for one or more of starting/finishing of the operation of the pen input apparatus and transmitting of the pen input data.
  6. The pen input apparatus as claimed in claim 1, wherein the pen input analyzing unit generates,
    a first pen input accompanying the touch based on performance or non-performance of the pen input and a locus of the pen input;
    a second pen input accompanying no touch based on performance or non-performance of the pen input and the state change; and
    the pen input data by combining the first pen input and the second pen input, and
    when one of the first and second pen inputs follows the other one in the pen input data, a pen input start point of the one is in accord with a pen input finish point of the other.
  7. The pen input apparatus as claimed in claim 6, wherein the pen input analyzing unit generates,
    a third pen input accompanying no touch based on performance or non-performance of the pen input and a locus of the pen input, and
    the second pen input is compensated, using the third pen input.
  8. A method for operating a pen input apparatus comprising steps of:
    sensing touch accompanied by the pen input to determine performance or non-performance of the user’s pen input;
    sensing a locus of the user’s pen input;
    sensing change in states of rotation, motion and slope on one or more axes of the pen input apparatus; and
    generating pen input data by analyzing performance or non-performance of the pen input determined in the step of sensing the touch, the locus of the pen input sensed by the step of sensing the locus and the change in the states sensed in the step of sensing the change in the states.
  9. The method for operating the pen input apparatus according to claim 8, further comprising a step of:
    transmitting the pen input data to an external device.
  10. The method for operating the pen input apparatus according to claim 9, further comprising a step of:
    storing the pen input data.
  11. The method for operating the pen input apparatus according to claim 10, wherein the step of transmitting the pen input data comprises,
    a step of transmitting the stored pen input data, when no user’s pen input is sensed for a predetermined time in the step of sensing the touch.
  12. The method for operating the pen input apparatus according to claim 8, further comprising:
    a step of receiving the user’s input for one or more of starting/finishing of an operation of the pen input apparatus and transmitting of the pen input data.
  13. The method for operating the pen input apparatus according to claim 8, wherein the step of generating the pen input data comprises generating,
    a first pen input accompanying the touch based on performance or non-performance of the pen input and a locus of the pen input;
    a second pen input accompanying no touch based on performance or non-performance of the pen input and the state change; and
    the pen input data by combining the first pen input and the second pen input, and
    when one of the first and second pen inputs follows the other one in the pen input data, a pen input start point of the one is in accord with a pen input finish point of the other.
  14. The method for operating the pen input apparatus as claimed in claim 13, wherein the step of generating the pen input data further comprises generating,
    a third pen input accompanying no touch based on performance or non-performance of the pen input and a locus of the pen input, and
    the second pen input is compensated, using the third pen input.
PCT/KR2014/000024 2013-02-27 2014-01-03 Pen input apparatus and method for operating the same WO2014133258A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
KR1020130021229A KR101311506B1 (en) 2013-02-27 2013-02-27 Pen input apparatus and operation method thereof
KR10-2013-0021229 2013-02-27

Publications (1)

Publication Number Publication Date
WO2014133258A1 true WO2014133258A1 (en) 2014-09-04

Family

ID=49456715

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/KR2014/000024 WO2014133258A1 (en) 2013-02-27 2014-01-03 Pen input apparatus and method for operating the same

Country Status (2)

Country Link
KR (1) KR101311506B1 (en)
WO (1) WO2014133258A1 (en)

Families Citing this family (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR102228812B1 (en) * 2014-01-10 2021-03-16 주식회사 엘지유플러스 Digital device, server, and method for processing/managing writing data
KR102336708B1 (en) * 2015-06-17 2021-12-07 엘지이노텍 주식회사 Interactive whiteboard system
KR101954109B1 (en) 2017-06-28 2019-03-05 서울대학교산학협력단 the trace tracing system using the fingerprint recognition type virtual sign device
KR102328569B1 (en) * 2021-08-03 2021-11-19 (주)티에프프로젝트 A multipurpose pen

Family Cites Families (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2000099251A (en) 1998-09-25 2000-04-07 Sanyo Electric Co Ltd Electronic pen device and character recognizing method
US7889186B2 (en) 2005-04-29 2011-02-15 Avago Technologies Ecbu Ip (Singapore) Pte. Ltd. Pen input device and method for tracking pen position
JP2009099041A (en) 2007-10-18 2009-05-07 Smk Corp Pen type input device

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20030036979A (en) * 2001-11-01 2003-05-12 핑거시스템 주식회사 The apparatus of pen-type optical mouse and controlling method thereof
KR20030072988A (en) * 2002-03-08 2003-09-19 정대진 Electronic-pen and operating method there for
KR20090011841A (en) * 2007-07-27 2009-02-02 삼성전자주식회사 Trajectory estimation apparatus and method based on pen-type optical mouse

Also Published As

Publication number Publication date
KR101311506B1 (en) 2013-09-25

Similar Documents

Publication Publication Date Title
WO2015016569A1 (en) Method and apparatus for constructing multi-screen display
WO2012141352A1 (en) Gesture recognition agnostic to device orientation
WO2018026202A1 (en) Touch sensing device for determining information related to pen, control method therefor, and pen
WO2015030303A1 (en) Portable device displaying augmented reality image and method of controlling therefor
WO2015122559A1 (en) Display device and method of controlling therefor
WO2018151449A1 (en) Electronic device and methods for determining orientation of the device
WO2014025131A1 (en) Method and system for displaying graphic user interface
EP2761973A1 (en) Method of operating gesture based communication channel and portable terminal system for supporting the same
WO2016010195A1 (en) Mobile device and control method thereof
WO2014133258A1 (en) Pen input apparatus and method for operating the same
WO2018004140A1 (en) Electronic device and operating method therefor
WO2013005869A1 (en) Adaptive user interface
WO2020130356A1 (en) System and method for multipurpose input device for two-dimensional and three-dimensional environments
WO2020017890A1 (en) System and method for 3d association of detected objects
US11854310B2 (en) Face liveness detection method and related apparatus
WO2023059087A1 (en) Augmented reality interaction method and apparatus
WO2015030460A1 (en) Method, apparatus, and recording medium for interworking with external terminal
WO2016080557A1 (en) Wearable device and control method therefor
CN107783669A (en) Cursor generation system, method and computer program product
WO2009119978A1 (en) Method and system for inputting document information
WO2016072610A1 (en) Recognition method and recognition device
WO2020209455A1 (en) System and method for natural three-dimensional calibration for robust eye tracking
WO2019203591A1 (en) High efficiency input apparatus and method for virtual reality and augmented reality
WO2013115493A1 (en) Method and apparatus for managing an application in a mobile electronic device
WO2021153961A1 (en) Image input system using virtual reality and image data generation method using same

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 14757696

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 14757696

Country of ref document: EP

Kind code of ref document: A1