WO2023218886A1 - Information input device, information input method, and information input program - Google Patents

Information input device, information input method, and information input program

Info

Publication number
WO2023218886A1
Authority
WO
WIPO (PCT)
Prior art keywords
information
input device
unit
estimated value
writing instrument
Prior art date
Application number
PCT/JP2023/015665
Other languages
French (fr)
Japanese (ja)
Inventor
卓吾 岩間
政樹 武田
Original Assignee
ゼブラ株式会社 (Zebra Co., Ltd.)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by ゼブラ株式会社 (Zebra Co., Ltd.)
Publication of WO2023218886A1

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/033Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
    • G06F3/0346Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor with detection of the device orientation or free movement in a 3D space, e.g. 3D mice, 6-DOF [six degrees of freedom] pointers using gyroscopes, accelerometers or tilt-sensors

Definitions

  • the present disclosure relates to an information input system, an information input method, and an information input program.
  • Patent Document 1 discloses a writing instrument that detects a writing motion and provides writing information to a user.
  • In this writing instrument, an inclined portion is provided at the tip of the barrel, and a motion sensor is housed in the inclined portion.
  • the motion sensor measures the handwriting motion, and handwriting information is acquired based on the measurement results of the motion sensor.
  • the motion sensor is placed near the tip of the writing instrument to suppress errors between the writing information measured by the motion sensor and the actual writing motion.
  • However, because the motion sensor cannot be placed at the very tip of the writing instrument, there is a limit to how much the accuracy of handwritten information can be improved by bringing the motion sensor closer to the tip of the writing instrument.
  • In addition, this method may give the user a sense of discomfort.
  • the present disclosure provides an information input system, an information input method, and an information input program that can improve the accuracy of input information while suppressing the discomfort that a user feels during input operations.
  • An information input system includes an acquisition unit that acquires observed values regarding the operation of an input device from a sensor, an estimation unit that estimates an estimated value of a state of the input device based on the observed values, and an output unit that outputs information indicated by the trajectory of the estimated values to a display device as input information from the input device.
  • the estimation unit includes a model acquisition unit that acquires an input device model that models the input device, a disturbance calculation unit that calculates a disturbance to the input device model using the observed values and mass information indicating the mass of the input device, an analysis execution unit that simulates the operation of the input device with the input device model by applying the disturbance to the input device model to which the mass information is given, and an estimated value deriving unit that derives the estimated value by associating the state of the input device model indicated by the simulation execution result with the current state of the input device.
  • An information input method is an information input method executed by an information input system including a processor.
  • This information input method includes the steps of acquiring observed values regarding the operation of an input device from a sensor, estimating an estimated value of the state of the input device based on the observed values, and outputting information indicated by the trajectory of the estimated values to a display device as input information from the input device.
  • the step of estimating the estimated value includes a step of obtaining an input device model that models the input device, and a step of calculating a disturbance to the input device model using the observed value and mass information indicating the mass of the input device.
  • the step of estimating the estimated value further includes a step of simulating the operation of the input device with the input device model by applying the disturbance to the input device model to which the mass information is given, and a step of deriving the estimated value by associating the state of the input device model indicated by the simulation execution result with the current state of the input device.
  • An information input program includes a step of acquiring observed values regarding the operation of an input device from a sensor, a step of estimating an estimated value of the state of the input device based on the observed values, and a step of outputting information indicated by the trajectory of the estimated values to a display device as input information from the input device.
  • the step of estimating the estimated value includes a step of obtaining an input device model that models the input device, and a step of calculating a disturbance to the input device model using the observed value and mass information indicating the mass of the input device.
  • the step of estimating the estimated value further includes a step of simulating the operation of the input device with the input device model by applying the disturbance to the input device model to which the mass information is given, and a step of deriving the estimated value by associating the state of the input device model indicated by the simulation execution result with the current state of the input device.
  • observed values regarding the operation of the input device are acquired from the sensor, and disturbances to the input device model are calculated using the observed values and mass information. Then, by applying a disturbance to the input device model, a simulation of the operation of the input device using the input device model is executed. The state of the input device model indicated by the simulation execution result is associated with the current state of the input device. As a result, an estimated value of the state of the input device is derived.
  • By executing a simulation using the input device model in this way, information obtained at the position of the sensor can be converted into information at an arbitrary position of the input device and processed. Therefore, the estimated value can be derived with high accuracy based on the observed values from the sensor, regardless of the position of the sensor with respect to the input device.
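The following is a minimal sketch of this flow in Python, not the patented implementation: observed values are turned into a per-element disturbance, the disturbance drives a simple model update, and the updated model state is read back as the estimate. All class names, helper functions, and numbers are illustrative assumptions.

```python
from dataclasses import dataclass

@dataclass
class Observation:
    accel: tuple   # observed acceleration from the sensor
    gyro: tuple    # observed angular velocity from the sensor

@dataclass
class DeviceModel:
    element_masses: list   # mass information: mass of each model element [kg]
    state: list            # 1-D position of each element, for brevity

def compute_disturbance(obs, model):
    # Disturbance: a force per element derived from the observed acceleration
    # and the element mass.
    return [m * obs.accel[0] for m in model.element_masses]

def run_simulation(model, forces, dt=0.01):
    # Apply the disturbance to the model and advance its state (toy integration).
    model.state = [x + 0.5 * (f / m) * dt * dt
                   for x, f, m in zip(model.state, forces, model.element_masses)]
    return model.state

def derive_estimate(sim_state):
    # Associate the simulated model state with the state of the real input
    # device (identity mapping between simulation space and virtual space here).
    return sim_state

model = DeviceModel(element_masses=[0.002, 0.005, 0.003], state=[0.0, 0.0, 0.0])
obs = Observation(accel=(0.5, 0.0, 0.0), gyro=(0.0, 0.0, 0.0))
estimate = derive_estimate(run_simulation(model, compute_disturbance(obs, model)))
print(estimate)
```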
  • the input device may be a writing instrument that includes a pen tip at one end in the axial direction.
  • the estimation unit may estimate an estimated value of the position of the pen tip and an estimated value of the angle in the axial direction as the estimated values. In this case, it is possible to accurately acquire information indicated by the trajectory of the pen tip as handwriting information, while suppressing the discomfort felt by the user during the writing operation.
  • the information input system may further include a setting unit that sets a virtual plane in a space where the input device is placed, and a recognition unit that recognizes a trajectory of the estimated value with respect to the virtual plane. In this case, the user's writing motion on the virtual surface can be recognized.
  • the estimation unit may further include a monitoring unit that monitors changes in the estimated value for the virtual surface, and an adjustment unit that adjusts the estimated value for the virtual surface.
  • the monitoring unit may include a calculation unit that calculates the amount of change in the estimated value from a reference value, using a predetermined state of the input device with respect to the virtual surface as the reference value, and a determination unit that determines whether the amount of change is within a first tolerance range.
  • When the determination unit determines that the amount of change is not within the first tolerance range, the adjustment unit may reduce the amount of change in the estimated value indicated by the simulation execution result by making the disturbance applied to the input device model smaller than the disturbance calculated by the disturbance calculation unit.
  • the determination unit may determine whether the amount of change is within a second tolerance range that is larger than the first tolerance range.
  • the adjustment unit may return the estimated value to the reference value when the determination unit determines that the amount of change is not within the second tolerance range. According to this configuration, when the estimated value changes further from the reference value, by returning the estimated value to the reference value, it becomes possible to more reliably recognize the locus of the estimated value with respect to the virtual plane.
  • the adjustment unit may increase the reduction rate of the disturbance applied to the input device model, relative to the disturbance calculated by the disturbance calculation unit, as the amount of change deviates further from the first tolerance range. In this configuration, the larger the amount of change in the estimated value from the reference value, the smaller the movement of the estimated value indicated by the simulation execution result. This allows the user to recognize that the estimated value deviates from the reference value.
  • FIG. 1 is a diagram illustrating an example of application of an information input system according to an embodiment.
  • FIG. 2 is a diagram showing an example of a hardware configuration related to the information input system.
  • FIG. 3 is a sectional view showing an example of the configuration of a writing instrument included in the information input system.
  • FIG. 4 is a diagram illustrating an example of a functional configuration related to the information input system.
  • FIG. 5 is a diagram showing an example of a virtual plane set by the setting unit.
  • FIG. 6 is a diagram illustrating an example of the configuration of the estimation section.
  • FIG. 7 is a diagram illustrating a part of the configuration of the estimating section in FIG. 6 in more detail.
  • FIG. 8(a) is a diagram showing the positional relationship between the virtual plane and the writing instrument.
  • FIG. 8(b) is a diagram of FIG. 8(a) viewed from another angle.
  • FIG. 9 is a diagram showing an example of the configuration of the recognition section.
  • FIG. 10 is a diagram showing how drawing points are set on a virtual plane.
  • FIGS. 11(a) and 11(b) are diagrams showing an example of an image of the calibration process.
  • FIGS. 12(a) and 12(b) are diagrams showing an example of the image of the calibration process.
  • FIG. 13 is a diagram showing a part of the configuration of the recognition unit shown in FIG. 8 in more detail.
  • FIG. 14 is a diagram showing an example of an image in which handwritten information is extracted.
  • FIG. 15(a) is a diagram showing an example of current handwriting information.
  • FIG. 15(b) is a diagram showing an example of past handwriting information.
  • FIG. 16 is a flowchart showing an example of the processing contents of the information input method implemented in the information input system.
  • FIG. 17 is a flowchart illustrating an example of estimation processing.
  • FIG. 18 is a flowchart illustrating an example of feedback processing.
  • FIG. 19 is a flowchart illustrating an example of drift correction processing.
  • FIG. 20 is a flowchart illustrating an example of recognition processing.
  • FIG. 21 is a flowchart illustrating an example of adjustment processing.
  • FIG. 22 is a flowchart illustrating an example of the appraisal process.
  • FIG. 1 is a diagram showing an example of application of the information input system 1 according to the present embodiment.
  • the information input system 1 is a system for inputting handwriting information D4 (an example of input information) regarding a writing action by the user U to a display screen of an electronic device.
  • the information input system 1 includes, for example, a terminal 10, a writing instrument 20 (an example of an input device), and a database 30.
  • the writing instrument 20 is a tool used for writing characters, symbols, illustrations, and the like.
  • the writing instrument 20 may be a pen capable of writing using ink or graphite, such as a ballpoint pen, fountain pen, marker, or mechanical pencil, or may be a pointing device such as a stylus pen.
  • the user U may be a scribe who writes using the writing instrument 20.
  • the writing instrument 20 is connected to the terminal 10 by short-range wireless communication.
  • the short-range wireless communication may be, for example, a communication method such as Bluetooth (registered trademark) or Wi-Fi (registered trademark).
  • the communication method between the writing instrument 20 and the terminal 10 is not limited. In this embodiment, a case is illustrated in which one writing instrument 20 communicates with the terminal 10, but the number of writing instruments 20 is not limited. For example, two or more writing instruments 20 may communicate with one terminal 10.
  • the terminal 10 is a computer used by a scribe.
  • the terminal 10 may be a mobile terminal such as, for example, a high-performance mobile phone (smartphone), a tablet terminal, a wearable terminal (for example, a head-mounted display (HMD), smart glasses, or a smart watch), a laptop personal computer, or a mobile phone.
  • the terminal 10 may be a stationary terminal such as a desktop personal computer.
  • the terminal 10 is connected to a database 30 via a communication network.
  • the communication network may include, for example, the Internet or an intranet.
  • the database 30 is a non-transitory storage device that stores various data used by the information input system 1.
  • the database 30 records threshold information D11, past handwriting information D12, and a writing instrument model D13 (an example of an input device model).
  • the database 30 may store threshold information D11 and past handwriting information D12 regarding a plurality of users U, for example.
  • the database 30 may store multiple types of writing instrument models D13.
  • the database 30 may be constructed as a single database or may be a collection of multiple databases.
  • FIG. 2 is a diagram showing an example of a hardware configuration related to the information input system 1.
  • FIG. 2 shows a terminal computer 100 functioning as a terminal 10.
  • the terminal computer 100 includes, for example, a processor 101, a main storage section 102, an auxiliary storage section 103, a communication section 104, an input interface 105, and an output interface 106 as hardware components.
  • Processor 101 is a computing device that executes an operating system and application programs.
  • the processor 101 may be, for example, a CPU or a GPU, but the type of processor 101 is not limited thereto.
  • the main storage unit 102 is a device that stores programs for implementing the terminal 10, calculation results output from the processor 101, and the like.
  • the main storage unit 102 includes, for example, at least one of a ROM and a RAM.
  • the auxiliary storage unit 103 is generally a device that can store a larger amount of data than the main storage unit 102.
  • the auxiliary storage unit 103 is configured by a nonvolatile storage medium such as a hard disk or flash memory.
  • the auxiliary storage unit 103 stores a client program P1 for causing the terminal computer 100 to function as the terminal 10 and various data.
  • the communication unit 104 is a device that performs data communication with other computers via a communication network.
  • the communication unit 104 is configured by, for example, a network card or a wireless communication module.
  • the input interface 105 is a device that receives data based on user U's operations or actions.
  • the input interface 105 is configured by at least one of a keyboard, an operation button, a pointing device, a microphone, a sensor, and a camera.
  • the keyboard and operation buttons may be displayed on the touch panel.
  • the data input to the input interface 105 is not limited.
  • input interface 105 may accept input or selected data via a keyboard, operating buttons, or pointing device.
  • the input interface 105 may accept audio data input through a microphone.
  • the input interface 105 may accept image data (eg, video data or still image data) captured by a camera.
  • the output interface 106 is a device that outputs data processed by the terminal computer 100.
  • the output interface 106 is configured by at least one of a monitor, a touch panel, an HMD, and a speaker.
  • Display devices such as monitors, touch panels, and HMDs display processed data on screens.
  • the speaker outputs audio indicated by the processed audio data.
  • Each functional element of the terminal 10 is realized by loading the client program P1, which is an example of the information input program, into the processor 101 or the main storage unit 102, and causing the processor 101 to execute the program.
  • the client program P1 includes codes for realizing each functional element of the terminal 10.
  • the processor 101 operates the communication unit 104, input interface 105, or output interface 106 in accordance with the client program P1, and reads and writes data in the main storage unit 102 or the auxiliary storage unit 103. Through this processing, each functional element of the terminal 10 is realized.
  • the client program P1 may be provided after being non-transitorily recorded on a tangible recording medium such as a CD-ROM, a DVD-ROM, or a semiconductor memory. Alternatively, the client program P1 may be provided via a communication network as a data signal superimposed on a carrier wave.
  • FIG. 3 is a schematic diagram showing an example of the configuration of the writing instrument 20.
  • FIG. 3 shows a cross section of the writing instrument 20 taken along the axial direction L.
  • the writing instrument 20 includes, for example, a cylinder portion 201, a refill 203, a substrate 206, a pressure sensor 207, and a motion sensor 208 (an example of a sensor).
  • the cylindrical portion 201 may be a substantially cylindrical member extending along the axial direction L of the writing instrument 20.
  • the cylindrical portion 201 has an opening 202 at the tip in the axial direction L.
  • the refill 203 may be a cylindrical member filled with ink.
  • the refill 203 has an outer diameter smaller than the inner diameter of the cylindrical portion 201 and is housed inside the cylindrical portion 201 .
  • the tip of the refill 203 (hereinafter referred to as the “pen nib 204”) is exposed from the opening 202 of the cylindrical portion 201.
  • the substrate 206 is housed inside the cylindrical portion 201 behind the base end 205 of the refill 203 (that is, the end opposite to the pen tip 204 in the axial direction L).
  • When the scribe holds the cylindrical portion 201 and presses the pen tip 204 against the medium, the ink inside the refill 203 oozes out from the pen tip 204. Therefore, the scribe can write by moving the writing instrument 20 while pressing the pen tip 204 against the medium.
  • the pressure sensor 207 is provided, for example, inside the cylindrical portion 201 between the base end 205 of the refill 203 and the substrate 206.
  • the pressure sensor 207 detects the pressure that the pen tip 204 receives from the medium when writing with the writing instrument 20 as writing pressure. However, when a writing operation is performed in the air, no writing pressure is generated on the writing instrument 20, so the pressure sensor 207 does not detect the writing pressure. In this embodiment, it is assumed that a writing operation is performed in the air, and a description of the process by which the pressure sensor 207 detects the writing pressure will be omitted.
  • the motion sensor 208 is provided, for example, on the substrate 206 inside the cylindrical portion 201.
  • the motion sensor 208 detects the motion of the writing instrument 20.
  • the motion sensor 208 may be a six-axis sensor that includes an acceleration sensor that detects acceleration in three axes that are orthogonal to each other, and a gyro sensor that detects angular velocity in the three axes. In this case, the motion sensor 208 detects observed values of acceleration in three axial directions and observed values of angular velocity in three axial directions when writing with the writing instrument 20 .
  • the writing instrument 20 includes, for example, a communication section 21 and an acquisition section 22 as functional elements.
  • the acquisition unit 22 acquires observed values of acceleration and angular velocity detected by the motion sensor 208 during a writing operation, either continuously or intermittently at a predetermined frequency.
  • the communication unit 21 transmits the observed values of acceleration and angular velocity acquired by the acquisition unit 22 to the terminal 10.
  • the communication unit 21 is capable of communicating with the terminal 10, for example, by short-range wireless communication such as Bluetooth (registered trademark).
  • the communication unit 21 and the acquisition unit 22 may be incorporated into the writing instrument 20 as a simple computer device.
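As an illustration of the acquisition and transmission roles described above, the following sketch samples a six-axis observation at a fixed frequency and hands it to a transmit stub. The sensor read and the wireless send are placeholders, not the actual firmware of the writing instrument 20.

```python
import time
from dataclasses import dataclass

@dataclass
class ObservedValue:
    timestamp: float
    accel: tuple   # (ax, ay, az) accelerations on three orthogonal axes [m/s^2]
    gyro: tuple    # (wx, wy, wz) angular velocities about the same axes [rad/s]

def read_motion_sensor():
    return (0.0, 0.0, 9.8), (0.0, 0.0, 0.0)   # placeholder 6-axis reading

def transmit(observed):
    print(observed)                            # placeholder for short-range wireless send

def acquisition_loop(frequency_hz=100, samples=5):
    period = 1.0 / frequency_hz
    for _ in range(samples):
        accel, gyro = read_motion_sensor()
        transmit(ObservedValue(time.time(), accel, gyro))
        time.sleep(period)

acquisition_loop()
```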
  • FIG. 4 is a diagram showing an example of a functional configuration related to the information input system 1.
  • the terminal 10 includes, as functional elements, a communication section 11, an acquisition section 12, a display section 13 (an example of a display device), a setting section 14, an estimation section 15, a recognition section 16, an adjustment section 17, an appraisal section 18, and an output section 19.
  • the communication unit 11 receives information transmitted from the communication unit 21 of the writing instrument 20.
  • the information transmitted from the communication unit 21 of the writing instrument 20 includes, for example, observed values of acceleration and angular velocity acquired by the acquisition unit 22 of the writing instrument 20.
  • the acquisition unit 12 acquires the observed values of acceleration and angular velocity that the communication unit 11 receives.
  • the display unit 13 displays, for example, a virtual object associated with real space (i.e., the world coordinate system).
  • the display unit 13 may be realized, for example, by an information processing terminal including an HMD worn by the user U, or may be realized by a tablet terminal, projection mapping, or the like.
  • the display unit 13 captures an image in the same imaging direction as the line of sight of the user U, and displays the captured image with the virtual space superimposed thereon.
  • the user U can view virtual objects that do not exist in reality, which correspond to the arrangement of real objects arranged in real space.
  • the display unit 13 displays virtual objects using a virtual space that is three-dimensional coordinates.
  • the display unit 13 places a virtual object at a preset position on the virtual space, and calculates the correspondence between the virtual space and the real space.
  • the coordinate system in the virtual space will be referred to as the virtual coordinate system ⁇ .
  • the display unit 13 displays an image of the virtual object as viewed from a position and direction in virtual space that respectively correspond to the imaging position and imaging direction in real space. That is, the display unit 13 displays an image of the virtual object viewed from the line of sight of the virtual user U.
  • the display of the virtual object described above can be performed using conventional MR (Mixed Reality) technology.
  • the setting unit 14 sets a virtual surface S in the virtual space.
  • the virtual surface S may be a virtual writing surface for recognizing the writing motion of the writing instrument 20 by the user U.
  • the setting unit 14 sets the virtual plane S at an arbitrary position in the virtual space.
  • the virtual plane S set by the setting unit 14 is displayed in the virtual space by the display unit 13.
  • FIG. 5 is a diagram showing an example of the virtual surface S.
  • the virtual surface S may be, for example, a two-dimensional virtual plane set at an arbitrary position in the virtual coordinate system ⁇ .
  • the coordinate system that defines the virtual surface S will be referred to as the screen coordinate system ⁇ .
  • the virtual surface S is not limited to the example shown in FIG. 5, and may be a virtual curved surface that changes three-dimensionally, for example. In this case, the virtual surface S can be set on the surface of any three-dimensional virtual object displayed in the virtual space.
  • the type of virtual surface S is determined by receiving input from user U, for example. Therefore, when receiving an input from the user U, the setting unit 14 sets the virtual plane S corresponding to the input in the virtual space.
  • the setting unit 14 does not need to set the virtual plane S in response to the input from the user U, and may set the virtual plane S in advance at a fixed position in the virtual space, for example.
  • the virtual surface S does not need to be fixed in the virtual space, and may be set to move in the virtual space.
  • FIG. 6 is a diagram showing an example of the configuration of the estimation unit 15.
  • the estimation unit 15 estimates an estimated value of the state of the writing instrument 20 in the virtual space.
  • the state of the writing instrument 20 in the virtual space may be the position and orientation of the writing instrument 20 arranged in the virtual space. Therefore, the estimated value of the state of the writing instrument 20 may be the estimated value of the position of the writing instrument 20, or the estimated value of the posture of the writing instrument 20.
  • the estimated value of the position of the writing instrument 20 may be, for example, the estimated value of the position of the pen tip 204 (see FIG. 5).
  • the estimated value of the position of the pen tip 204 is represented by coordinate values of the virtual coordinate system ⁇ indicating the position of the pen tip 204.
  • the estimated value of the posture of the writing instrument 20 may be, for example, an estimated value of the angle in the axial direction L that changes as the writing instrument 20 moves.
  • the estimated value of the angle of the writing instrument 20 may be, for example, the current angle of the axial direction L of the writing instrument 20 with respect to the initial angle of the axial direction L of the writing instrument 20 in the virtual coordinate system ⁇ .
  • Hereinafter, the observed value of acceleration and the observed value of angular velocity are collectively referred to as the observed value D10, and the estimated value of the position of the writing instrument 20 and the estimated value of the posture of the writing instrument 20 are collectively referred to as the estimated value D20.
  • the estimation unit 15 estimates the estimated value D20 by executing a simulation of the operation of the writing instrument 20 using a writing instrument model D13 that models the writing instrument 20.
  • the estimation unit 15 includes, for example, a model processing unit 151, an analysis unit 152, an estimated value derivation unit 153, and a feedback unit 154.
  • the model processing unit 151 includes, for example, a model acquisition unit 155 and an initialization processing unit 156.
  • the model acquisition unit 155 acquires the writing instrument model D13 from the database 30.
  • the writing instrument model D13 is, for example, CAD data of the writing instrument 20, and is divided into a finite number of elements that can be numerically analyzed.
  • the writing instrument model D13 does not need to be stored in the database 30 in advance, and may be generated by the terminal 10.
  • Model information D21 indicating the specifications of the writing instrument 20 is given to the writing instrument model D13.
  • the specifications of the writing instrument 20 may include, for example, the shape, dimensions, material, and mass of the writing instrument 20. These pieces of information are given to each element constituting the writing instrument model D13 and are reflected in the writing instrument model D13.
  • the information indicating the mass of the writing instrument 20 may be, for example, mass information indicating the distribution of mass for each part of the writing instrument 20.
  • the mass information is given to the writing instrument model D13 as information indicating the distribution of mass for each element of the writing instrument model D13. By referring to the mass information, it is possible to determine which parts have a higher mass and which parts have a lower mass in the writing instrument model D13.
  • the thickness of the pen tip 204 can be determined from the information indicating the shape and dimensions of the writing instrument 20.
  • the model information D21 may include color information indicating the color of the pen tip 204.
  • the model information D21 may be provided to the writing instrument model D13 in advance, or may be provided to the writing instrument model D13 at the time of simulation execution.
  • the initialization processing unit 156 initializes the state (for example, position and orientation) of the writing instrument model D13.
  • Initializing means setting the position and orientation of the writing instrument model D13 in the simulation space to initial values.
  • the initial value of the position of the writing instrument model D13 may be any coordinate value of the coordinate system in the simulation space.
  • the initial value of the angle of the writing instrument model D13 may be an angle from an arbitrary reference line of the coordinate system in the simulation space.
  • the coordinate system in the simulation space does not necessarily have to match the virtual coordinate system ⁇ in the virtual space, but is associated with the virtual coordinate system ⁇ .
  • the coordinate values of the coordinate system in the simulation space can be converted to the coordinate values of the virtual coordinate system ⁇ .
  • the initialization processing unit 156 initializes the position and orientation of the writing instrument model D13 at the timing when the model acquisition unit 155 acquires the writing instrument model D13, or at any timing when initialization is required.
  • the model processing unit 151 provides the analysis unit 152 with the writing instrument model D13 initialized by the initialization processing unit 156 and the model information D21 given to the writing instrument model D13.
  • the analysis unit 152 includes, for example, a disturbance calculation unit 157 and an analysis execution unit 158.
  • the disturbance calculation unit 157 uses the observed value D10 and the model information D21 to calculate a disturbance to be applied to the writing instrument model D13.
  • the disturbance may be, for example, a force and torque applied to the writing instrument 20 by a user during operation of the writing instrument 20.
  • the disturbance calculation unit 157 calculates the force and torque to be applied to each element of the writing instrument model D13 from the acceleration and angular velocity indicated by the observed value D10 and the mass information indicated by the model information.
  • the force F to be applied to an element of the writing instrument model D13 is expressed by the following equation (1), where m is the mass of the element and Sa is the acceleration:

    F = A(m, Sa) ... (1)

    Here, A is an arbitrary function defined to derive the force F from the mass m and the acceleration Sa.
  • the disturbance calculation unit 157 calculates torque based on the force F.
  • the disturbance calculation section 157 provides the calculated disturbance as a disturbance condition D22 to the analysis execution section 158 together with the writing instrument model D13 and model information D21.
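A possible reading of this disturbance calculation is sketched below, assuming the function A of equation (1) reduces to F = m * Sa and that the torque is the cross product of a lever arm with F. The lever arms, masses, and reference point are illustrative values, not taken from the patent.

```python
import numpy as np

def compute_disturbance(accel, element_masses, element_positions, ref_point):
    """Return per-element forces and torques to apply to the writing instrument model."""
    forces, torques = [], []
    for m, p in zip(element_masses, element_positions):
        F = m * np.asarray(accel)                      # equation (1) with A(m, Sa) = m * Sa
        r = np.asarray(p) - np.asarray(ref_point)      # lever arm from the reference point
        forces.append(F)
        torques.append(np.cross(r, F))                 # torque calculated based on the force F
    return forces, torques

accel = [0.2, -0.1, 9.8]                               # observed acceleration Sa [m/s^2]
masses = [0.004, 0.002, 0.001]                         # mass m of each element [kg]
positions = [[0, 0, 0.00], [0, 0, 0.07], [0, 0, 0.14]] # element positions along the axis [m]
forces, torques = compute_disturbance(accel, masses, positions, ref_point=[0, 0, 0.07])
print(forces[0], torques[0])
```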
  • the analysis execution unit 158 executes a simulation of the operation of the writing instrument 20 using the writing instrument model D13.
  • the analysis execution unit 158 sets the disturbance condition D22 as an analysis condition for executing a simulation of the operation of the writing instrument 20.
  • the analysis execution unit 158 causes the writing instrument model D13 to operate in the simulation space by applying a disturbance condition D22 to the writing instrument model D13. As a result, the position and orientation of the writing instrument model D13 in the simulation space change.
  • the force F and torque included in the disturbance condition D22 and the mass information included in the model information D21 are given to each element of the writing instrument model D13.
  • Since the force F is proportional to the mass m, a large force F is applied to an element with a high mass m, and a small force F is applied to an element with a low mass m.
  • For example, if the mass m near the pen tip 204 is higher than in other parts, a relatively large force F is applied to the pen tip 204, and the amount of change in the position of the pen tip 204 in the simulation space becomes large.
  • Conversely, if the mass m near the pen tip 204 is lower than in other parts, a relatively small force F is applied to the pen tip 204, and the amount of change in the position of the pen tip 204 in the simulation space becomes small.
  • As a result, the motion of the writing instrument model D13 in the simulation space becomes a natural motion that takes the mass m of the writing instrument 20 into account, that is, a motion that is close to the motion of the actual writing instrument 20.
  • the analysis execution unit 158 provides the simulation execution result to the estimated value derivation unit 153.
  • the execution result of the simulation may be a result indicating the position and orientation of the writing instrument model D13 after execution of the simulation, or may be a result indicating the amount of change in the position and orientation of the writing instrument model D13 caused by the execution of the simulation.
  • the estimated value deriving unit 153 derives the estimated value D20 of the current position and orientation of the writing instrument 20 from the simulation execution result. For example, the estimated value deriving unit 153 derives the estimated value D20 by associating the position and orientation of the writing instrument model D13 in the simulation space with the position and orientation of the writing instrument 20 in the virtual space. Associating the position and orientation of the writing instrument model D13 with the position and orientation of the writing instrument 20 means converting the position and orientation of the writing instrument model D13 in the coordinate system of the simulation space into a position and orientation in the virtual coordinate system ⁇, and regarding the converted position and orientation of the writing instrument model D13 as the position and orientation of the writing instrument 20 in the virtual space.
  • Alternatively, the estimated value deriving unit 153 may utilize the correspondence between the coordinate system in the simulation space and the virtual coordinate system ⁇ in the virtual space. For example, the estimated value deriving unit 153 converts the amount of change in the position and orientation of the writing instrument model D13 caused by the execution of the simulation (that is, indicated by the simulation execution result) into an amount of change in position and orientation in the virtual space, and adds the converted amount of change to the position and orientation of the writing instrument 20 before the operation in the virtual space (that is, before the simulation was executed). The estimated value deriving unit 153 then derives the position and orientation of the writing instrument 20 to which the amount of change has been added as the estimated value D20 of the current position and orientation of the writing instrument 20.
  • the estimated value deriving unit 153 provides the estimated value D20 to the recognizing unit 16 and the feedback unit 154, respectively.
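A simplified sketch of this derivation is given below, assuming an identity rotation between the simulation space and the virtual coordinate system and a one-dimensional angle for brevity; the numeric pose values are illustrative, not from the patent.

```python
import numpy as np

R_SIM_TO_VIRTUAL = np.eye(3)   # assumed correspondence between simulation and virtual space

def derive_estimate(prev_position, prev_angle, sim_delta_position, sim_delta_angle):
    """Map the simulated change of pose into virtual space and accumulate it."""
    delta_virtual = R_SIM_TO_VIRTUAL @ sim_delta_position
    return prev_position + delta_virtual, prev_angle + sim_delta_angle

prev_pos = np.array([0.10, 0.20, 0.30])   # pose before the writing motion (illustrative)
pos, ang = derive_estimate(prev_pos, 0.05,
                           sim_delta_position=np.array([0.002, 0.0, -0.001]),
                           sim_delta_angle=0.01)
print(pos, ang)   # estimated value D20 of the current position and angle
```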
  • the position of the pen tip 204 of the writing instrument model D13 can be determined using the observed value D10 from the motion sensor 208.
  • the observed value D10 from the motion sensor 208 indicates not the position of the pen tip 204 but the acceleration and angular velocity at the position where the motion sensor 208 is provided. By integrating at least one of the acceleration and the angular velocity, the position of the motion sensor 208 can be determined.
  • the position of the motion sensor 208 can be converted into the position of the pen tip 204 using, for example, the following equation (2):

    VH = VS + Rot (VH0 - VS0) ... (2)

    Here, VS is a position vector indicating the position of the motion sensor 208, VS0 is a position vector indicating the initial value of the position of the motion sensor 208, VH is a position vector indicating the position of the pen tip 204, VH0 is a position vector indicating the initial value of the position of the pen tip 204, RS is a rotation vector indicating the rotation of the motion sensor 208, RS0 is a rotation vector indicating the initial value of the rotation of the motion sensor 208, and Rot is the transformation rotation matrix obtained from RS and RS0. Since the position of the pen tip 204 with respect to the motion sensor 208 does not change, (VH0 - VS0) is always constant (see FIG. 5). In equation (2), VS, RS, and RS0 are obtained from the acceleration and angular velocity indicated by the observed value D10. Therefore, the estimated value deriving unit 153 can estimate the position of the pen tip 204 and the orientation of the writing instrument 20 from the observed value D10 using equation (2).
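A sketch of this conversion, using the reconstruction of equation (2) above: the pen tip position VH is obtained from the sensor position VS and the rotation of the sensor relative to its initial attitude. SciPy's rotation utilities stand in for obtaining the matrix Rot from the rotation vectors RS and RS0; all numeric values are illustrative.

```python
import numpy as np
from scipy.spatial.transform import Rotation

VS0 = np.array([0.00, 0.00, 0.08])   # initial position of the motion sensor 208
VH0 = np.array([0.00, 0.00, 0.00])   # initial position of the pen tip 204
RS0 = np.array([0.0, 0.0, 0.0])      # initial rotation vector of the motion sensor 208

def pen_tip_position(VS, RS):
    """VH = VS + Rot (VH0 - VS0), with Rot obtained from RS and RS0."""
    Rot = Rotation.from_rotvec(RS) * Rotation.from_rotvec(RS0).inv()
    return VS + Rot.apply(VH0 - VS0)

VS = np.array([0.01, 0.00, 0.085])   # current sensor position (integrated from D10)
RS = np.array([0.0, 0.2, 0.0])       # current rotation vector of the sensor
print(pen_tip_position(VS, RS))      # estimated position of the pen tip 204
```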
  • FIG. 7 is a diagram showing an example of the configuration of the feedback section 154.
  • the feedback unit 154 monitors the estimated value D20 of the current position and orientation of the writing instrument 20 in the virtual coordinate system ⁇ , and feeds back the monitoring results to the analysis unit 152.
  • the feedback unit 154 includes, for example, a monitoring unit 141 and an adjustment unit 142.
  • the monitoring unit 141 monitors the amount of change in the estimated value D20 of the current position and orientation of the writing instrument 20 in the virtual coordinate system ⁇.
  • the monitoring unit 141 includes, for example, a calculation unit 143 and a determination unit 144.
  • the calculation unit 143 uses the predetermined position and orientation of the writing instrument 20 with respect to the virtual plane S as a reference value, and calculates the amount of change in the estimated value D20 from the reference value.
  • the reference value may be the initial value of the position and orientation of the writing instrument 20 in the virtual coordinate system ⁇ .
  • the initial values may be, for example, the position and orientation of the writing instrument 20 in a state in which the tip of the pen tip 204 faces the center Sc of the virtual plane S in the virtual coordinate system ⁇ .
  • the state in which the tip of the pen nib 204 faces the center Sc of the virtual surface S may be defined as a state in which a virtual line extending from the pen nib 204 intersects the center Sc of the virtual surface S. As long as the tip of the pen nib 204 faces the center Sc of the virtual surface S, the axial direction L of the writing instrument 20 may, for example, be inclined with respect to the normal direction of the virtual surface S, or may be along the normal direction of the virtual surface S.
  • the initial state is not limited to a state in which the tip of the pen tip 204 faces the center Sc of the virtual surface S, and may be a state in which the tip of the pen tip 204 points at a position offset from the center Sc of the virtual surface S.
  • FIG. 8(a) is a diagram showing the positional relationship between the virtual surface S and the writing instrument 20.
  • FIG. 8(b) is a diagram of FIG. 8(a) viewed from another angle.
  • FIGS. 8A and 8B show the writing instrument 20A indicated by the reference value and the current writing instrument 20 indicated by the estimated value D20.
  • the amount of change in the estimated value D20 from the reference value can be expressed, for example, by taking the inner product of the unit direction vector V1 in which the pen tip 204 of the writing instrument 20A is directed and the unit direction vector V2 in which the pen tip 204 of the current writing instrument 20 is directed.
  • From this inner product, the angle θ formed by the unit direction vector V1 and the unit direction vector V2 can be calculated.
  • the calculation unit 143 calculates the angle θ formed by the unit direction vector V1 and the unit direction vector V2 as an index indicating the amount of change in the estimated value D20 from the reference value.
  • the calculation unit 143 provides the determination unit 144 with a calculation result D41 indicating the calculated angle θ.
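A small sketch of this calculation: the deviation of the estimate from the reference is expressed as the angle between the reference pointing direction V1 and the current pointing direction V2, obtained from their inner product. The vector values are illustrative.

```python
import numpy as np

def pointing_angle(v1, v2):
    """Angle (rad) between two direction vectors via their normalized dot product."""
    v1 = v1 / np.linalg.norm(v1)
    v2 = v2 / np.linalg.norm(v2)
    cos_theta = np.clip(np.dot(v1, v2), -1.0, 1.0)   # clamp to guard against rounding
    return np.arccos(cos_theta)

V1 = np.array([0.0, 0.0, -1.0])   # direction in which the pen tip of the reference pose points
V2 = np.array([0.1, 0.0, -1.0])   # direction in which the pen tip of the current estimate points
print(np.degrees(pointing_angle(V1, V2)))
```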
  • the determination unit 144 determines whether the amount of change (for example, the angle θ) indicated by the calculation result D41 is within an allowable range.
  • the determination unit 144 first sets a first region R1 and a second region R2 on the virtual surface S.
  • the first region R1 may be, for example, a region inside the outer edge of the virtual surface S and including at least the center Sc.
  • the boundary of the first region R1 may be located midway between the outer edge of the virtual surface S and the center Sc, or may be shifted to the inside or outside of the midpoint.
  • the second region R2 may be larger than the first region R1.
  • the second region R2 may be, for example, a region inside the outer edge of the virtual surface S.
  • the determination unit 144 sets, as the first tolerance range, the range of the angle θ formed by the unit direction vector V2 with respect to the unit direction vector V1 when the intersection PV2 of the unit direction vector V2 and the virtual surface S is located in the first region R1.
  • That the angle θ is within the first tolerance range means that the intersection PV2 is within the first region R1.
  • Similarly, the determination unit 144 sets, as the second tolerance range, the range of the angle θ when the intersection PV2 is located in the second region R2.
  • That the angle θ is within the second tolerance range means that the intersection PV2 between the unit direction vector V2 and the virtual surface S is within the second region R2.
  • the second tolerance range is a range of the angle θ larger than the first tolerance range.
  • the determination unit 144 first determines whether the angle θ is within the second tolerance range. When it determines that the angle θ is not within the second tolerance range, that is, when it determines that the intersection PV2 between the unit direction vector V2 and the virtual surface S is not within the second region R2, the determination unit 144 provides the adjustment unit 142 with a determination result D42 indicating that the angle θ is not within the second tolerance range.
  • When it determines that the angle θ is within the second tolerance range, the determination unit 144 next determines whether the angle θ is within the first tolerance range. When it determines that the angle θ is not within the first tolerance range, the determination unit 144 provides the adjustment unit 142 with a determination result D42 indicating that the angle θ is not within the first tolerance range.
  • When it determines that the angle θ is within the first tolerance range, the determination unit 144 repeats the above determination for the angle θ obtained based on the next estimated value D20.
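A sketch of this two-stage determination, with angular thresholds standing in for the first and second tolerance ranges. In the description those ranges are defined from the regions R1 and R2 on the virtual surface S; the threshold values here are assumptions.

```python
THETA_FIRST = 0.15    # first tolerance range (rad): intersection PV2 inside region R1
THETA_SECOND = 0.40   # second tolerance range (rad): intersection PV2 inside region R2

def determine(theta):
    """Return the determination result D42 to hand to the adjustment unit 142."""
    if theta > THETA_SECOND:
        return "not within second tolerance"   # -> calibrate back to the reference value
    if theta > THETA_FIRST:
        return "not within first tolerance"    # -> reduce the disturbance
    return "within first tolerance"            # -> keep monitoring the next estimate

for theta in (0.05, 0.25, 0.60):
    print(theta, determine(theta))
```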
  • the adjustment unit 142 adjusts the estimated value D20 of the current position and orientation of the writing instrument 20 in the virtual coordinate system ⁇ based on the monitoring result by the monitoring unit 141 (specifically, the determination result D42 by the determination unit 144).
  • When the adjustment unit 142 receives the determination result D42 indicating that the angle θ is not within the second tolerance range, the adjustment unit 142 executes a calibration process to return the estimated value D20 of the current position and orientation of the writing instrument 20 in the virtual coordinate system ⁇ to the reference value. For example, the adjustment unit 142 adjusts the position and orientation of the writing instrument model D13 in the simulation space so that the estimated value D20 becomes the reference value (that is, so that the pen tip 204 of the writing instrument 20 faces the center Sc of the virtual surface S).
  • the adjustment unit 142 calculates the disturbance necessary to move the unit direction vector V2 of the writing instrument 20 indicated by the estimated value D20 to the unit direction vector V1 of the writing instrument 20A indicated by the reference value, and provides the calculated disturbance to the analysis execution unit 158. This disturbance can be set by adjusting the function A of the above-mentioned equation (1).
  • the analysis execution unit 158 changes the position and orientation of the writing instrument model D13 in the simulation space by applying the disturbance provided from the adjustment unit 142 to the writing instrument model D13. Then, the adjustment unit 142 reflects the position and orientation of the writing instrument model D13 after execution of the simulation on the current position and orientation of the writing instrument 20 in the virtual space. Thereby, the adjustment unit 142 returns the estimated value D20 of the current position and orientation of the writing instrument 20 to the reference value.
  • When the adjustment unit 142 receives the determination result D42 indicating that the angle θ is not within the first tolerance range, the adjustment unit 142 adjusts the amount of change in the estimated value D20 with respect to the reference value.
  • the amount of change in the estimated value D20 corresponds to the amount of change in the position and orientation of the writing instrument model D13 caused by execution of the simulation.
  • the amount of change in the position and orientation of the writing instrument model D13 depends on disturbances set as analysis conditions when executing the simulation. The disturbance can be adjusted by the magnitude of the function A shown in equation (1).
  • the adjustment unit 142 makes the disturbance function A given to the writing instrument model D13 during simulation execution smaller than the disturbance function A calculated by the disturbance calculation unit 157.
  • the disturbance function A calculated by the disturbance calculation unit 157 is a value calculated based on the observed value D10 and the model information D21. This disturbance may be regarded as a disturbance applied to the actual writing instrument 20. Therefore, by reducing the function A, the disturbance applied to the writing instrument model D13 can be made smaller than the disturbance applied to the actual writing instrument 20.
  • the amount of change in the position and orientation of the writing instrument model D13 caused by the execution of the simulation can be made smaller than the amount of change in the actual position and orientation of the writing instrument 20.
  • the motion of the writing instrument model D13 in the simulation space becomes smaller than the motion of the actual writing instrument 20.
  • the movement of the estimated value D20 in the virtual space indicated by the simulation execution result becomes smaller. In other words, the movement of the estimated value D20 becomes slower than the actual movement of the writing instrument. This prevents the estimated value D20 from deviating further from the reference value.
  • the adjustment unit 142 sets the function A of the disturbance applied to the writing instrument model D13 at the time of executing the simulation so that its reduction rate with respect to the function A of the disturbance calculated by the disturbance calculation unit 157 becomes larger as the estimated value D20 deviates further from the reference value, that is, as the angle θ deviates further from the first tolerance range.
  • For example, the adjustment unit 142 may set the function A of the disturbance to be applied to the writing instrument model D13 to half of the function A of the disturbance calculated by the disturbance calculation unit 157.
  • In this case, the amount of movement of the estimated value D20 indicated by the execution result of the simulation can be suppressed to about half of the amount of movement of the actual writing instrument 20.
  • the adjustment unit 142 may also set the function A of the disturbance imparted to the writing instrument model D13 to zero.
  • In this case, the disturbance applied to the writing instrument model D13 in the simulation space becomes zero, and the operation of the actual writing instrument 20 is no longer reflected in the operation of the writing instrument model D13.
  • As a result, even if the user moves the writing instrument 20 significantly, the estimated value D20 indicated by the execution result of the simulation does not move in the virtual space.
  • In this way, the movement of the estimated value D20 can be made smaller as the estimated value D20 deviates further from the reference value in the virtual space.
  • the user can be made aware that the current position and orientation of the writing instrument 20 deviate greatly from the virtual plane S.
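One possible scaling profile consistent with this description is sketched below: the reduction rate of the disturbance grows from zero at the edge of the first tolerance range to one at the second tolerance range, so the disturbance is roughly halved midway and zero beyond. The linear profile and the thresholds are assumptions; the text only requires the reduction rate to increase.

```python
def scaled_disturbance(disturbance, theta, theta_first=0.15, theta_second=0.40):
    if theta <= theta_first:
        return disturbance                     # apply the calculated disturbance as-is
    if theta >= theta_second:
        return 0.0 * disturbance               # zero disturbance: the estimate stops moving
    reduction = (theta - theta_first) / (theta_second - theta_first)
    return (1.0 - reduction) * disturbance     # partially reduced disturbance

for theta in (0.10, 0.275, 0.50):
    print(theta, scaled_disturbance(1.0, theta))
```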
  • the monitoring unit 141 may monitor whether or not to perform drift correction on the estimated value D20. In this case, the monitoring unit 141 sets a drift correction condition that requires drift correction of the estimated value D20.
  • a detection error generally referred to as a drift error may occur in the motion sensor 208.
  • When drift errors accumulate, a phenomenon occurs in which the estimated value D20 gradually deviates from the virtual surface S in the virtual space. If such a phenomenon occurs, the center of the distribution of the estimated values D20 within a certain period of time, obtained when the distribution of the estimated values D20 is reflected on the virtual surface S, gradually shifts.
  • the center of the distribution of estimated values D20 within a certain period of time may be the average value or median value of the distribution of estimated values D20 within a certain period of time.
  • the center of the distribution of estimated values D20 within a certain period of time is located near the center Sc of the virtual surface S, for example.
  • the calculation unit 143 may calculate the amount of shift of the center of the distribution of the current estimated values D20 within a certain period of time, using as a reference value the center of the distribution of the estimated values D20 within a certain period of time when drift errors have not accumulated. In this case, the determination unit 144 determines that the drift correction condition is satisfied when the calculated amount of shift exceeds a preset threshold. On the other hand, the determination unit 144 determines that the drift correction condition is not satisfied when the calculated amount of shift does not exceed the threshold.
  • the above “threshold” can be obtained from the threshold information D11 stored in the database 30.
  • the above threshold value may be set in advance by, for example, the information input system 1, the provider of the information input system 1, or the user.
  • the above threshold value may be set, for example, based on a past statistical value indicating the center of the distribution of the estimated value D20 within a certain period of time, or may be set arbitrarily by the user.
  • When the determination unit 144 determines that the drift correction condition is satisfied, the adjustment unit 142 executes a calibration process to return the estimated value D20 to the reference value. For example, the adjustment unit 142 adjusts the position and orientation of the writing instrument model D13 in the simulation space so that the estimated value D20 becomes the reference value (that is, so that the pen tip 204 of the writing instrument 20 faces the center Sc of the virtual surface S). For example, the adjustment unit 142 performs the same process as the calibration process described above. Thereby, the center of the distribution of the current estimated values D20 within a certain period of time can be returned to the vicinity of the center Sc of the virtual surface S. On the other hand, if the determination unit 144 determines that the drift correction condition is not satisfied, it repeats the above determination for the next estimated value D20.
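A sketch of this drift-correction condition: the center (here the mean) of the recent distribution of estimates projected on the virtual surface is compared with the reference center, and calibration is triggered when the shift exceeds a threshold. The threshold value stands in for the threshold information D11 and is illustrative.

```python
import numpy as np

def needs_drift_correction(recent_points, reference_center, threshold):
    center = np.mean(np.asarray(recent_points), axis=0)           # center of the distribution
    shift = np.linalg.norm(center - np.asarray(reference_center)) # shift from the reference
    return shift > threshold

recent = [(0.04, 0.01), (0.05, 0.02), (0.06, 0.01)]   # recent estimates on the virtual surface
print(needs_drift_correction(recent, reference_center=(0.0, 0.0), threshold=0.03))
```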
  • FIG. 9 is a diagram showing an example of the configuration of the recognition unit 16.
  • the recognition unit 16 recognizes the trajectory of the writing instrument 20 based on the estimated value D20 estimated by the estimation unit 15.
  • the recognition unit 16 includes, for example, a virtual line setting unit 161, a drawing point setting unit 162, and a writing information acquisition unit 163.
  • the virtual line setting unit 161 sets a virtual line VL (for example, a virtual ray) in the virtual space.
  • FIG. 10 is a diagram showing how the drawing point P is set on the virtual surface S by the virtual line VL.
  • the virtual line VL may be, for example, a straight line virtually set to set the drawing point P on the virtual surface S.
  • the virtual line VL extends linearly from the pen tip 204 of the writing instrument 20 toward the virtual surface S, for example.
  • the drawing point P may be an input point set on the virtual surface S in order to recognize the writing operation of the writing instrument 20 on the virtual surface S.
  • The estimated value D20 indicates the coordinate value of the position of the pen tip 204 in the virtual coordinate system and the attitude of the writing instrument 20 in the virtual coordinate system (that is, the angle of the axial direction L). Therefore, the virtual line setting unit 161 uses the estimated value D20 to set, as the virtual line VL, a straight line passing through the position of the pen tip 204 indicated by the estimated value D20 and along the axial direction L of the writing instrument 20.
  • The position of the pen tip 204 indicated by the estimated value D20 may be, for example, a coordinate point (that is, an input point) indicating the position of the pen tip 204 in the virtual coordinate system.
  • The posture of the writing instrument 20 in the virtual coordinate system can be expressed by a direction vector V along the axial direction L of the writing instrument 20.
  • Let the coordinate values of the origin of the screen coordinate system in the virtual coordinate system be (x1, y1, z1), let the coordinate values of the position of the pen tip 204 in the virtual coordinate system be (x2, y2, z2), and let the direction vector V be (Vx, Vy, Vz).
  • Then the virtual line VL is represented as a straight line passing through the coordinate values (x2, y2, z2) along the direction vector (Vx, Vy, Vz).
  • the virtual line VL is expressed by the following equation (3), where t is an arbitrary real number (that is, a parameter).
  • the virtual line setting unit 161 sets a virtual line VL passing through the position of the pen tip 204 and along the axial direction L of the writing instrument 20 using the estimated value D20 and equation (3).
  • the virtual line VL extends linearly from the pen tip 204 in the direction toward which the pen tip 204 faces, and intersects the virtual surface S.
  • the virtual line setting section 161 provides the drawing point setting section 162 with virtual line information D1 indicating the virtual line VL.
  • the drawing point setting unit 162 sets, as a drawing point P, the intersection between the virtual line VL indicated by the virtual line information D1 and the virtual plane S set in the virtual space.
  • the plane equation representing the virtual surface S is expressed by the following equation (4). Therefore, the drawing point setting unit 162 calculates the intersection between the virtual line VL and the virtual surface S by substituting equation (3) representing the virtual line VL into equation (4) representing the virtual surface S. For example, by substituting equation (3) into equation (4), equation (5) is obtained. By solving equation (5) for t, equation (6) is obtained. Then, by substituting equation (6) into equation (3), the intersection between the virtual surface S and the virtual line VL can be found.
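  • The computation in equations (3) to (6) is the standard intersection of a parametric line with a plane; the equations themselves appear only as figures in the original, so the forms assumed below (VL as p2 + t·v, the virtual surface S as a·x + b·y + c·z + d = 0) and all variable names are illustrative.

```python
def intersect_line_with_plane(p2, v, plane):
    """Illustrative intersection of the virtual line VL with the virtual surface S.

    p2    : (x2, y2, z2), position of the pen tip 204 in the virtual coordinate system.
    v     : (Vx, Vy, Vz), direction vector V along the axial direction L.
    plane : (a, b, c, d) with a*x + b*y + c*z + d = 0 (assumed form of equation (4)).

    VL is assumed to be parameterized as (x, y, z) = p2 + t * v (equation (3));
    substituting into the plane equation and solving for t corresponds to
    equations (5) and (6).
    """
    a, b, c, d = plane
    x2, y2, z2 = p2
    vx, vy, vz = v
    denom = a * vx + b * vy + c * vz
    if abs(denom) < 1e-12:
        return None  # line is parallel to the plane; no drawing point P can be set
    t = -(a * x2 + b * y2 + c * z2 + d) / denom
    return (x2 + t * vx, y2 + t * vy, z2 + t * vz)  # the drawing point P
```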
  • The drawing point setting unit 162 sets the intersection of the virtual surface S and the virtual line VL as the drawing point P. For example, the drawing point setting unit 162 sets the position of the drawing point P as a coordinate value in the screen coordinate system.
  • the drawing point setting section 162 provides coordinate information D2 including the coordinate values of the position of the drawing point P to the adjustment section 17 and the writing information acquisition section 163.
  • the coordinate information D2 may include the coordinate values of the position of the pen tip 204 in addition to the coordinate values of the position of the drawing point P.
  • The adjustment unit 17 refers to the coordinate information D2 and determines whether it is necessary to adjust the positions of the virtual surface S and the writing instrument 20 in the virtual coordinate system.
  • If the adjustment unit 17 determines that the positions of the virtual surface S and the writing instrument 20 need to be adjusted, it performs an adjustment process to adjust the positions of the virtual surface S and the writing instrument 20.
  • the adjustment process may include, for example, a process similar to the feedback process by the feedback unit 154 described above.
  • the adjustment unit 17 may perform the adjustment process together with the feedback process described above, or may perform the adjustment process when the feedback process is not performed. Alternatively, the adjustment unit 17 does not need to perform the adjustment process when the feedback process is performed.
  • the adjustment unit 17 first sets a first region R1 and a second region R2 on the virtual surface S, similarly to the feedback process. Then, the adjustment unit 17 determines whether the drawing point P is located within the second region R2 of the virtual surface S. That is, the adjustment unit 17 determines whether the drawing point P falls within the virtual plane S or not. If the adjustment unit 17 determines that the drawing point P is not located within the second region R2, the adjustment unit 17 determines that the position of the virtual surface S and the writing instrument 20 needs to be adjusted. In this case, the adjustment unit 17 executes a calibration process to return the position of the virtual surface S with respect to the writing instrument 20 to the reference position. The calibration process here is different from the calibration process by the adjustment unit 142 described above.
  • FIG. 11(a), FIG. 11(b), FIG. 12(a), and FIG. 12(b) are diagrams showing an example of the image of the calibration process by the adjustment unit 17.
  • the pen tip 204 of the writing instrument 20 is facing the virtual surface S, and the drawing point P is located within the virtual surface S.
  • the adjustment unit 17 performs calibration to return the position of the virtual surface S with respect to the writing instrument 20 to the reference position so that the drawing point P is set within the virtual surface S.
  • the state in which the position of the virtual surface S with respect to the writing instrument 20 is at the reference position may be, for example, a state in which the virtual surface S and the writing instrument 20 are arranged such that the tip of the pen tip 204 faces the center Sc of the virtual surface S.
  • The fact that the tip of the pen tip 204 points toward the center Sc of the virtual surface S means that the virtual line VL extending from the pen tip 204 intersects the center Sc of the virtual surface S (that is, the intersection of the virtual line VL and the virtual surface S may be located at the center Sc).
  • In this state, the axial direction L of the writing instrument 20 may be, for example, inclined with respect to the normal direction of the virtual surface S, or may be along the normal direction of the virtual surface S.
  • The state in which the virtual surface S and the writing instrument 20 are at the reference position is not limited to the state in which the tip of the pen tip 204 faces the center Sc of the virtual surface S; it may also be a state in which the tip of the pen tip 204 faces a position within the virtual surface S that is shifted from the center Sc. That is, as long as the virtual line VL extending from the pen tip 204 intersects the virtual surface S, the intersection of the virtual line VL and the virtual surface S may be offset from the center Sc of the virtual surface S.
  • The adjustment unit 17 moves at least one of the virtual surface S and the writing instrument 20 in the virtual coordinate system so that the positions of the virtual surface S and the writing instrument 20 become the reference positions. At this time, the adjustment unit 17 may move only the virtual surface S without moving the writing instrument 20 in the virtual coordinate system, may move only the writing instrument 20 without moving the virtual surface S, or may move both the virtual surface S and the writing instrument 20.
  • For example, the adjustment unit 17 rotates only the virtual surface S with respect to the writing instrument 20 in the virtual coordinate system.
  • As a result, the position of the screen coordinate system that defines the virtual surface S in the virtual coordinate system also rotates.
  • The virtual surface S and the screen coordinate system move to the position shown in FIG. 12(b).
  • The positions of the writing instrument 20 and the virtual surface S are thereby adjusted so that the tip of the pen tip 204 faces the center Sc of the virtual surface S, so the virtual line VL extending from the pen tip 204 intersects the center Sc of the virtual surface S. This completes the calibration process that returns the positions of the virtual surface S and the writing instrument 20 in the virtual coordinate system to the reference positions.
  • When the adjustment unit 17 determines that the drawing point P is within the second region R2, it then determines whether the drawing point P is within the first region R1.
  • If the determination unit 144 determines that the drawing point P is not within the first region R1, an adjustment process is performed to adjust the amount of change in the estimated value D20 with respect to a reference value, using a predetermined position and orientation of the writing instrument 20 with respect to the virtual surface S as the reference value.
  • This adjustment process may be the same process as the adjustment process by the adjustment unit 142 described above. In this case, the adjustment unit 17 makes the disturbance function A given to the writing instrument model D13 during simulation execution smaller than the disturbance function A calculated by the disturbance calculation unit 157.
  • the amount of change in the position and orientation of the writing instrument model D13 caused by execution of the simulation can be made smaller than the amount of change in the actual position and orientation of the writing instrument 20, respectively.
  • the movement of the estimated value D20 in the virtual space indicated by the simulation execution result becomes smaller. This prevents the estimated value D20 from deviating further from the reference value.
  • If the determination unit 144 determines that the drawing point P is within the first region R1, it repeats the above determination for the drawing point P obtained based on the next estimated value D20.
  • FIG. 13 is a diagram showing an example of the configuration of the handwritten information acquisition section 163.
  • the handwritten information acquisition section 163 includes, for example, a trajectory information acquisition section 164 and a handwritten information extraction section 165.
  • the trajectory information acquisition unit 164 includes, for example, a coordinate string recording unit 166 and a distance calculation unit 167.
  • Upon receiving the coordinate information D2 from the drawing point setting section 162, the coordinate string recording section 166 records the coordinate values of the drawing point P in chronological order and also records the coordinate values of the pen tip 204 in chronological order.
  • the coordinate string recording unit 166 records the coordinate values of the drawing point P and the coordinate values of the pen tip 204 in a time-series manner in association with each other.
  • the coordinate string recording unit 166 stores drawing point trajectory information D31 in which the coordinate values of the drawing point P are recorded in time series, and pen tip trajectory information D32 in which the coordinate values of the pen tip 204 are recorded in time series, in the distance calculation unit 167. and provided to the handwritten information extraction unit 165.
  • the drawing point trajectory information D31 may be information indicating a two-dimensional trajectory of the coordinate values of the drawing point P on the virtual surface S.
  • the pen tip trajectory information D32 may be information indicating a three-dimensional trajectory of the coordinate values of the pen tip 204 in the virtual space.
  • The distance calculation unit 167 calculates the distance between the drawing point P and the pen tip 204 using the drawing point trajectory information D31 and the pen tip trajectory information D32. For example, from the drawing point trajectory information D31 and the pen tip trajectory information D32, the distance calculation unit 167 calculates the distance between the coordinate value of the drawing point P at a certain time and the coordinate value of the pen tip 204 at that time (that is, the coordinate value of the pen tip 204 associated with the coordinate value of the drawing point P). The distance calculation unit 167 records the calculated distances in chronological order in association with the coordinate values of the drawing point P and the coordinate values of the pen tip 204.
  • the distance calculation unit 167 provides the writing information extraction unit 165 with distance information D33 indicating the distance between the coordinate values of the drawing point P and the coordinate values of the pen tip 204.
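  • A rough sketch of the bookkeeping performed by the coordinate string recording unit 166 and the distance calculation unit 167 is given below; the class name and the choice to keep the drawing point as a 3D coordinate for the distance computation are assumptions for illustration only.

```python
import math

class TrajectoryRecorder:
    """Illustrative recorder for the drawing-point trajectory (D31), the pen-tip
    trajectory (D32), and the distance between each associated pair (D33)."""

    def __init__(self):
        self.drawing_points = []  # D31: coordinates of the drawing point P, in chronological order
        self.pen_tips = []        # D32: coordinates of the pen tip 204, in chronological order
        self.distances = []       # D33: distance between P and the pen tip, same order

    def record(self, drawing_point, pen_tip):
        # Both points are assumed to be 3-tuples in the same coordinate system,
        # so that the pairwise distance is well defined.
        self.drawing_points.append(drawing_point)
        self.pen_tips.append(pen_tip)
        self.distances.append(math.dist(drawing_point, pen_tip))
```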
  • the trajectory information acquisition unit 164 provides the writing information extraction unit 165 with information including the drawing point trajectory information D31, the pen tip trajectory information D32, and the distance information D33 as the trajectory information D3 indicating the trajectory of the writing instrument 20. .
  • the writing information extraction unit 165 extracts writing information D4 indicating the trajectory of the drawing point P that the user U intends to write on the virtual surface S from the drawing point trajectory information D31 while referring to the distance information D33.
  • the locus of the drawing point P indicated by the writing information D4 is expressed as a character written on the virtual surface S using the writing instrument 20, for example. Therefore, the written information D4 may be character information written by the user U, for example.
  • the trajectories of the drawing points P indicated by the drawing point trajectory information D31 are connected like one stroke.
  • Therefore, when the writing information D4 includes character information, it is necessary to extract the characters written on the virtual surface S from the drawing point trajectory information D31 by distinguishing the trajectory of the drawing point P that the user U intended to write on the virtual surface S from the trajectory of the drawing point P that the user U did not intend to write on the virtual surface S.
  • the trajectory of the drawing point P that the user U intends to write on the virtual surface S is the trajectory of the coordinate values of the drawing point P excluding the breaks in the character stroke.
  • the character stroke may be an action of drawing lines constituting a character on the virtual surface S.
  • a break in a character stroke may be a location between any line that makes up a character and the next line.
  • The writing information extraction unit 165 refers to the change in the distance between the virtual surface S and the writing instrument 20 and extracts, from the drawing point trajectory information D31, the trajectory of the drawing point P that the user U intends to write on the virtual surface S as the writing information D4.
  • the handwriting information extraction unit 165 determines whether the change in the distance between the drawing point P and the pen tip 204 is less than a preset threshold.
  • If the writing information extraction unit 165 determines that the change in the distance between the drawing point P and the pen tip 204 is less than the threshold, it extracts the trajectory of the drawing point P associated with that change in distance as the writing information D4 that the user U intends to write on the virtual surface S.
  • If the writing information extraction unit 165 determines that the change in the distance between the drawing point P and the pen tip 204 is not less than the threshold, it determines that the trajectory of the drawing point P associated with that change in distance is not writing information D4 intended to be written on the virtual surface S.
  • the above “threshold” can be obtained from threshold information stored in the database 30.
  • the above threshold value may be set in advance by the information input system 1, the provider of the information input system 1, or the user U, for example.
  • the above threshold value may be set, for example, based on past statistical values indicating changes in the distance between the drawing point P and the pen tip 204 of the same user U, or may be set arbitrarily by the user U. .
  • a threshold value for a change in the distance between the drawing point P of the same user U and the pen tip 204 may be set in advance using artificial intelligence (AI) technology.
  • FIG. 14 is a diagram showing an example of an image in which the written information D4 is extracted.
  • trajectories T1 and T2 indicate the trajectories of the pen tip 204 when it is assumed that the pen tip 204 is on the virtual surface S during writing.
  • a trajectory T1 indicated by a solid line indicates a trajectory of the pen tip 204 that the user U intends to write on the virtual surface S.
  • a trajectory T2 indicated by a broken line indicates a trajectory of the pen tip 204 in mid-air where the user U does not intend to write on the virtual surface S.
  • The writing information extraction unit 165 determines whether or not the change Δd in the distance between the drawing point P and the pen tip 204 is less than a preset threshold value, thereby extracting from the drawing point trajectory information D31 the coordinate values of the drawing point P indicating the trajectory T1 as the writing information D4.
  • The writing information extraction unit 165 may also extract the writing information D4 based on, for example, a change in the moving speed of the pen tip 204.
  • the handwriting information extraction unit 165 may extract the handwriting information D4 based on both the change ⁇ d in the distance between the drawing point P and the pen tip 204 and the change in the moving speed of the pen tip 204.
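  • A minimal sketch of the extraction rule just described follows; it keeps only drawing points whose change in pen-tip distance stays below the threshold, assuming evenly sampled recordings. The real extraction unit 165 may also combine this with the pen-tip speed as noted above; the function and variable names are illustrative.

```python
def extract_writing_segments(drawing_points, distances, threshold):
    """Extract the trajectory assumed to be intentionally written (writing info D4).

    drawing_points : chronological coordinates of the drawing point P (D31).
    distances      : chronological distances between P and the pen tip 204 (D33).
    threshold      : preset threshold for the change Δd in distance.
    """
    writing = []
    for i in range(1, len(distances)):
        delta_d = abs(distances[i] - distances[i - 1])
        if delta_d < threshold:          # small change in distance -> intended stroke
            writing.append(drawing_points[i])
    return writing
```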
  • the handwritten information extraction unit 165 provides the extracted handwritten information D4 to the output unit 19.
  • the output unit 19 outputs the written information D4 to the display unit 13.
  • the display unit 13 displays the handwritten information D4 in real time on a virtual surface S arranged in the virtual space. Thereby, the user U can see the written information D4 displayed on the virtual surface S.
  • the output unit 19 may output the handwritten information D4 not only to the display unit 13 but also to another display device.
  • the output unit 19 may output the handwritten information D4 to a display screen of a personal computer, a cloud server, a smart device (for example, a smartphone or a tablet terminal), or the like.
  • the appraisal unit 18 receives the trajectory information D3 recorded in the trajectory information acquisition unit 164, and evaluates the writing information D4.
  • The appraisal unit 18 may determine that the writing action by the user U has ended when it receives a signal indicating that the user U has finished writing, or it may determine that the writing action by the user U has ended when a predetermined period of time has elapsed since the writing action stopped.
  • the appraisal section 18 includes, for example, a non-written information extraction section 181 and a comparison section 182.
  • the non-writing information extraction unit 181 extracts the trajectory of the pen tip 204 that the user U does not intend to write on the virtual surface S from the pen tip trajectory information D32 as non-writing information D5.
  • the non-written information D5 can be extracted in the same manner as the written information D4.
  • For example, when the non-writing information extraction unit 181 determines that the change Δd in the distance between the drawing point P and the pen tip 204 is not less than a preset threshold, it extracts the trajectory of the pen tip 204 associated with that change in distance as the non-writing information D5 that the user U does not intend to write on the virtual surface S.
  • On the other hand, when the non-writing information extraction unit 181 determines that the change Δd in the distance between the drawing point P and the pen tip 204 is less than the threshold, it determines that the trajectory of the pen tip 204 associated with that change in distance is not non-writing information D5.
  • The comparison unit 182 receives the non-writing information D5 from the non-writing information extraction unit 181, and acquires information including the writing information D4 and the non-writing information D5 as current handwriting information D45 indicating the current handwriting of the user U (see FIG. 15(a), described later).
  • the comparison unit 182 may extract the written information D4 from the trajectory information D3, or may receive the written information D4 from the written information extraction unit 165. Furthermore, the comparison unit 182 acquires past handwriting information D12 from the database 30.
  • the past handwriting information D12 is information indicating the past handwriting of the user U, and corresponds to the current handwriting information D45.
  • the comparison unit 182 compares the current handwriting information D45 and the past handwriting information D12. For example, the comparison unit 182 calculates the degree of matching between the current handwriting information D45 and the past handwriting information D12.
  • The above-mentioned "degree of matching" may be an index indicating how similar the current handwriting information D45 and the past handwriting information D12 are. If each coordinate value indicated by the current handwriting information D45 is similar to each coordinate value indicated by the past handwriting information D12, the degree of matching between the current handwriting information D45 and the past handwriting information D12 will be high; otherwise, the degree of matching will be low.
  • the comparison unit 182 compares each coordinate value indicated by the current handwriting information D45 and each coordinate value indicated by the past handwriting information D12 in a time series, and determines the degree of deviation of the coordinate values for each time series as a degree of coincidence. It may be calculated. In this case, the comparison unit 182 may calculate a statistical value (for example, an average value or a median value) of the degree of deviation of coordinate values for each time series as the degree of coincidence.
  • The comparison unit 182 determines whether the calculated degree of matching is greater than or equal to a preset threshold. If the comparison unit 182 determines that the degree of matching is greater than or equal to the threshold, it determines that the current handwriting information D45 and the past handwriting information D12 are the same, and determines that the user U who provided the current handwriting information D45 and the user U who provided the past handwriting information D12 are the same person.
  • If the comparison unit 182 determines that the degree of matching is not greater than or equal to the threshold, it determines that the current handwriting information D45 and the past handwriting information D12 are not the same, and determines that the user U who provided the current handwriting information D45 and the user U who provided the past handwriting information D12 are different people.
  • the above “threshold” can be obtained from the threshold information D11 stored in the database 30.
  • the above threshold value may be set in advance by the information input system 1, the provider of the information input system 1, or the user U, for example.
  • the above threshold value may be set based on a past statistical value indicating the degree of matching between the current handwriting information D45 and the past handwriting information D12, or may be set arbitrarily by the user U, for example.
  • a threshold value for the degree of matching between the current handwriting information D45 and the past handwriting information D12 may be set in advance using artificial intelligence (AI) technology.
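  • One plausible way to compute the "degree of matching" is the statistical deviation of corresponding coordinates mentioned above (for example, an average deviation), mapped to a bounded similarity score. The mapping and the assumption that the two recordings are already aligned in time are illustrative choices, not fixed by the text.

```python
import numpy as np

def degree_of_matching(current_handwriting, past_handwriting):
    """Compare current handwriting info D45 with past handwriting info D12 (sketch).

    Both arguments are arrays of shape (N, 2) or (N, 3) of coordinates recorded
    in the same chronological order (assumed alignment).
    Returns a score in (0, 1]; larger means more similar.
    """
    current = np.asarray(current_handwriting, dtype=float)
    past = np.asarray(past_handwriting, dtype=float)
    n = min(len(current), len(past))
    deviations = np.linalg.norm(current[:n] - past[:n], axis=1)  # per-sample deviation
    mean_dev = float(np.mean(deviations))                        # statistical value of the deviation
    return 1.0 / (1.0 + mean_dev)                                # map deviation to a matching degree

def is_same_person(current, past, threshold):
    # Mirrors the threshold comparison performed by the comparison unit 182.
    return degree_of_matching(current, past) >= threshold
```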
  • FIG. 15(a) is a diagram showing an example of an image of the current handwriting information D45.
  • FIG. 15(b) is a diagram showing an example of an image of past handwriting information D12.
  • The writing information D4 included in the current handwriting information D45 is similar to the past writing information D14 included in the past handwriting information D12, which corresponds to the writing information D4.
  • On the other hand, the non-writing information D5 included in the current handwriting information D45 is significantly different from the past non-writing information D15 included in the past handwriting information D12, which corresponds to the non-writing information D5.
  • The comparison unit 182 compares the current handwriting information D45, which includes the writing information D4 and the non-writing information D5, with the past handwriting information D12, which includes the past writing information D14 and the past non-writing information D15.
  • In this case, the comparison unit 182 determines that the degree of matching between the current handwriting information D45 and the past handwriting information D12 is not equal to or greater than the threshold value, and determines that the current handwriting information D45 and the past handwriting information D12 are not the same. The comparison unit 182 therefore determines that the user U who provided the current handwriting information D45 and the user U who provided the past handwriting information D12 are different people. In this way, the comparison unit 182 evaluates the writing information D4 not only by comparing the writing information D4 with the past writing information D14, but also by comparing the non-writing information D5, which the user U does not intend to write, with the past non-writing information D15. Therefore, in this embodiment, as shown in FIGS. 15(a) and 15(b), impersonation can be detected even when the writing information D4 itself closely resembles the past writing information D14, because the non-writing information D5 differs from the past non-writing information D15.
  • The comparison unit 182 may extract the past writing information D14 and the past non-writing information D15 from the past handwriting information D12, and calculate the degree of matching between the writing information D4 and the past writing information D14 and the degree of matching between the non-writing information D5 and the past non-writing information D15, respectively.
  • the comparison unit 182 may determine whether each degree of matching is greater than or equal to a threshold value.
  • In this way, the comparison unit 182 determines whether or not the user U who provided the writing information D4 is the user himself/herself by determining whether or not the degree of matching between the current handwriting information D45 and the past handwriting information D12 is greater than or equal to the threshold value.
  • the comparison unit 182 outputs to the output unit 19 an appraisal result D6 indicating whether or not the user U who provided the written information D4 is the user himself/herself.
  • the output unit 19 may output the appraisal result D6 to the display unit 13, or may output the appraisal result D6 to a notification unit such as a speaker.
  • FIG. 16 is a flowchart illustrating an example of the processing contents of the information input method implemented in the information input system 1.
  • The processing by the information input system 1 includes an observation process (step S1) for acquiring the observed value D10 from the motion sensor 208, an estimation process (step S2) for estimating the estimated value D20, a recognition process (step S3) for recognizing the trajectory of the writing instrument 20, and an output process (step S4) for outputting the writing information D4.
  • steps S1 to S4 are, for example, repeatedly executed at predetermined intervals.
  • In step S1, the motion sensor 208 of the writing instrument 20 acquires observed values D10 of acceleration and angular velocity in accordance with the operation of the writing instrument 20.
  • the acquisition unit 12 of the terminal 10 acquires the observed value D10 from the motion sensor 208.
  • In step S2, the estimation unit 15 estimates the estimated value D20 of the position and orientation of the writing instrument 20 based on the observed value D10.
  • In step S3, the recognition unit 16 sets a drawing point P on the virtual surface S based on the estimated value D20.
  • The recognition unit 16 then acquires the writing information D4 for the virtual surface S based on the trajectory indicated by the drawing point P.
  • In step S4, the output unit 19 outputs the writing information D4 to the display unit 13.
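  • The overall loop of steps S1 to S4 could be orchestrated roughly as follows; all object and method names here are hypothetical placeholders for the units described above, not an API defined by the embodiment.

```python
def information_input_loop(sensor_readings, estimator, recognizer, display):
    """Illustrative flow of steps S1-S4 (all names are hypothetical)."""
    for observed in sensor_readings:                    # S1: observed values D10 from the motion sensor 208
        estimated = estimator.estimate(observed)        # S2: estimated value D20 of position and orientation
        writing_info = recognizer.recognize(estimated)  # S3: drawing point P and writing information D4
        display.show(writing_info)                      # S4: output to the display unit 13
```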
  • FIG. 17 is a flowchart illustrating an example of the estimation process performed by the estimation unit 15.
  • the model acquisition unit 155 acquires a writing instrument model D13 that is a model of the writing instrument 20 from the database 30 (step S21).
  • the initialization processing unit 156 initializes the state (eg, position and orientation) of the writing instrument model D13 (step S22).
  • the disturbance calculation unit 157 calculates the disturbance to be applied to the writing instrument model D13 (step S23).
  • the disturbance may be a force and torque applied to the writing instrument 20 by a user during operation of the writing instrument 20.
  • the analysis execution unit 158 sets the disturbance condition D22 as an analysis condition for executing a simulation of the operation of the writing instrument 20 (step S24).
  • the analysis execution unit 158 executes a simulation of the operation of the writing instrument 20 using the writing instrument model D13 (step S25).
  • the analysis execution unit 158 causes the writing instrument model D13 to operate in the simulation space by applying a disturbance condition D22 to the writing instrument model D13.
  • the estimated value deriving unit 153 derives the estimated value D20 of the current position and orientation of the writing instrument 20 from the simulation execution result (step S26).
  • the estimated value deriving unit 153 derives the estimated value D20 by associating the position and orientation of the writing instrument model D13 in the simulation space with the position and orientation of the writing instrument 20 in the virtual space, respectively.
  • the estimated value D20 is estimated based on the observed value D10.
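  • A highly simplified sketch of steps S23 to S26 is shown below: a point-mass stand-in for the writing instrument model D13 is advanced by the computed disturbance. The real model also carries orientation, torque, and inertia; the time step, the translation-only reduction, and the data layout are assumptions for illustration.

```python
import numpy as np

def estimate_state(observed_accel, model_mass, state, dt=0.01):
    """One estimation cycle, reduced to translation only (illustrative sketch).

    observed_accel : acceleration from the motion sensor 208 (part of the observed value D10).
    model_mass     : mass information of the writing instrument model D13.
    state          : dict with 'position' and 'velocity' of the model in the simulation space.
    Returns the updated state, which would then be mapped to the estimated value D20.
    """
    # S23: disturbance (force) computed from the observed value and the mass information.
    disturbance_force = model_mass * np.asarray(observed_accel)
    # S24/S25: apply the disturbance to the mass model and integrate its motion.
    acceleration = disturbance_force / model_mass
    state['velocity'] = state['velocity'] + acceleration * dt
    state['position'] = state['position'] + state['velocity'] * dt
    # S26: the resulting model state is associated with the current state of the writing instrument 20.
    return state
```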
  • FIG. 18 is a flowchart illustrating an example of feedback processing performed by the feedback unit 154.
  • When the estimation unit 15 estimates the estimated value D20, the calculation unit 143 calculates the amount of change in the estimated value D20 from a reference value, using a predetermined position and orientation of the writing instrument 20 with respect to the virtual surface S as the reference value (step S31).
  • The amount of change in the estimated value D20 from the reference value can be expressed, for example, using the inner product of the unit direction vector V1 toward which the pen tip 204 of the writing instrument 20A at the reference position is directed and the unit direction vector V2 toward which the pen tip 204 of the writing instrument 20 indicated by the estimated value D20 is directed.
  • The calculation unit 143 calculates the angle θ formed by the unit direction vector V1 and the unit direction vector V2 as an index indicating the amount of change in the estimated value D20 from the reference value.
  • The determination unit 144 determines whether the angle θ (that is, the amount of change in the estimated value D20) calculated by the calculation unit 143 is within the second tolerance range (step S32). If the determination unit 144 determines that the angle θ is not within the second tolerance range (step S32: No), the adjustment unit 142 performs calibration to return the estimated value D20 to the reference value (step S33). For example, the adjustment unit 142 calculates the disturbance necessary to move the unit direction vector V2 of the writing instrument 20 indicated by the estimated value D20 to the unit direction vector V1 of the writing instrument 20A indicated by the reference value. The adjustment unit 142 changes the position and orientation of the writing instrument model D13 in the simulation space by applying the calculated disturbance to the writing instrument model D13.
  • the adjustment unit 142 reflects the position and orientation of the writing instrument model D13 after execution of the simulation in the current position and orientation of the writing instrument 20 in the virtual space, thereby obtaining an estimated value of the current position and orientation of the writing instrument 20. Return D20 to standard value.
  • Next, the determination unit 144 determines whether the angle θ is within the first tolerance range (step S34). When the determination unit 144 determines that the angle θ is not within the first tolerance range (step S34: No), the adjustment unit 142 adjusts the amount of change in the estimated value D20 with respect to the reference value (step S35). For example, the adjustment unit 142 makes the disturbance function A given to the writing instrument model D13 during simulation execution smaller than the disturbance function A calculated by the disturbance calculation unit 157. Thereby, the adjustment unit 142 makes the disturbance applied to the writing instrument model D13 smaller than the disturbance applied to the actual writing instrument 20.
  • If the determination unit 144 determines in step S34 that the angle θ is within the first tolerance range (step S34: Yes), it returns to step S31 and repeats steps S31 to S35 for the angle θ obtained based on the next estimated value D20.
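  • A small sketch of the check in steps S31 to S35: the angle θ between the reference direction V1 and the estimated direction V2 is obtained from their inner product, and the disturbance is scaled down when θ leaves the first tolerance range. The degree-based ranges and the fixed scale factor are illustrative assumptions.

```python
import numpy as np

def feedback_step(v1, v2, disturbance, first_range_deg, second_range_deg, reduction=0.5):
    """Illustrative feedback check.

    v1 : unit direction of the pen tip at the reference position (writing instrument 20A).
    v2 : unit direction of the pen tip indicated by the estimated value D20.
    Returns (needs_calibration, disturbance_to_apply).
    """
    cos_theta = float(np.clip(np.dot(v1, v2), -1.0, 1.0))
    theta = np.degrees(np.arccos(cos_theta))      # S31: amount of change from the reference value
    if theta > second_range_deg:                  # S32: outside the second tolerance range
        return True, disturbance                  # S33: calibrate back to the reference value
    if theta > first_range_deg:                   # S34: outside the first tolerance range
        return False, disturbance * reduction     # S35: apply a reduced disturbance to the model
    return False, disturbance
```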
  • FIG. 19 is a flowchart illustrating an example of the drift correction process performed by the feedback unit 154.
  • the determination unit 144 determines whether the drift correction condition is satisfied (step S41).
  • The drift correction condition is satisfied when the amount of shift of the center of the distribution of the current estimated values D20 within a certain period of time exceeds a threshold value, where the reference value is the center of the distribution of the estimated values D20 within a certain period of time during which drift errors have not accumulated.
  • If the determination unit 144 determines that the drift correction condition is satisfied (step S41: Yes), the adjustment unit 142 performs calibration to return the estimated value D20 to the reference value (step S42).
  • On the other hand, if the determination unit 144 determines that the drift correction condition is not satisfied (step S41: No), it returns to step S41 and repeats steps S41 and S42 for the next estimated value D20.
  • FIG. 20 is a flowchart showing an example of the recognition process performed by the recognition unit 16.
  • the virtual line setting unit 161 sets a straight line passing through the position of the pen tip 204 and along the axial direction L of the writing instrument 20 as the virtual line VL (step S51).
  • the virtual line VL extends linearly from the pen tip 204 in the direction toward which the pen tip 204 faces, and intersects the virtual surface S.
  • the drawing point setting unit 162 calculates the intersection between the virtual line VL and the virtual surface S (step S52).
  • the drawing point setting unit 162 sets the intersection of the virtual line VL and the virtual surface S as a drawing point P.
  • The position of the drawing point P is set, for example, as a coordinate value in the screen coordinate system.
  • the trajectory information acquisition unit 164 acquires trajectory information D3 indicating the trajectory of the writing instrument 20 (step S54).
  • the trajectory information D3 includes, for example, drawing point trajectory information D31 in which the coordinate values of the drawing point P are recorded in chronological order, pen tip trajectory information D32 in which the coordinate values of the pen tip 204 are recorded in chronological order, and the coordinates of the drawing point P. It includes distance information D33 indicating the distance between the value and the coordinate value of the pen tip 204.
  • the writing information extraction unit 165 extracts writing information D4 indicating the trajectory of the drawing point P that the user U intends to write on the virtual surface S from the drawing point trajectory information D31 while referring to the distance information D33. (Step S55).
  • The writing information extraction unit 165 determines, for example, whether a change in the distance between the drawing point P and the pen tip 204 indicated by the distance information D33 is less than a preset threshold. Then, when the writing information extraction unit 165 determines that the change in the distance between the drawing point P and the pen tip 204 is less than the threshold value, it extracts the trajectory of the drawing point P associated with the change in distance as the writing information D4.
  • FIG. 21 is a flowchart illustrating an example of the adjustment process performed by the adjustment unit 17.
  • the adjustment unit 17 determines whether the drawing point P is located within the second region R2 of the virtual surface S (step S61).
  • If it is determined in step S61 that the drawing point P is not located within the second region R2 (step S61: No), the adjustment unit 17 executes a calibration process to return the position of the virtual surface S with respect to the writing instrument 20 to the reference position (step S62).
  • On the other hand, if the adjustment unit 17 determines that the drawing point P is within the second region R2 (step S61: Yes), it determines whether the drawing point P is within the first region R1 (step S63).
  • If it is determined that the drawing point P is not within the first region R1 (step S63: No), the determination unit 144 adjusts the amount of change in the estimated value D20 with respect to a reference value, using a predetermined position and orientation of the writing instrument 20 with respect to the virtual surface S as the reference value (step S64).
  • If it is determined that the drawing point P is within the first region R1 (step S63: Yes), the adjustment unit 17 returns to step S61 and repeats steps S61 to S64 for the drawing point P obtained based on the next estimated value D20.
  • FIG. 22 is a flowchart illustrating an example of the appraisal process performed by the appraisal unit 18.
  • the appraisal unit 18 determines whether or not the writing operation by the user U has been completed (step S71). When determining that the writing action by the user U has not been completed (step S71: No), the appraisal unit 18 repeats step S71 until it is determined that the writing action by the user U has been completed. On the other hand, when determining that the writing action by the user U has ended (step S71: Yes), the appraisal unit 18 extracts the non-writing information D5 from the trajectory information D3.
  • The non-writing information extraction unit 181 extracts, as the non-writing information D5, the trajectory of the pen tip 204 that the user U does not intend to write on the virtual surface S (step S72).
  • The comparison unit 182 acquires information including the writing information D4 and the non-writing information D5 as current handwriting information D45 indicating the current handwriting of the user U, and calculates the degree of matching between the current handwriting information D45 and the past handwriting information D12 (step S73).
  • the comparison unit 182 evaluates the handwriting information D4 based on the degree of matching between the current handwriting information D45 and the past handwriting information D12 (step S74). For example, the comparison unit 182 determines whether the degree of matching is greater than or equal to a preset threshold.
  • the comparison unit 182 determines that the degree of matching is greater than or equal to the threshold, it determines that the user U who provided the current handwriting information D45 and the user U who provided the past handwriting information D12 are the same person. That is, the comparison unit 182 determines that the user U who provided the written information D4 is the user himself/herself. On the other hand, if the comparison unit 182 determines that the degree of matching is not equal to or greater than the threshold, it determines that the user U who provided the current handwriting information D45 and the user U who provided the past handwriting information D12 are different people. In other words, the comparison unit 182 determines that the user U who provided the written information D4 is an impersonator.
  • the estimated value D20 can be derived with high accuracy based on the observed value D10 from the motion sensor 208. Furthermore, by applying a disturbance to the writing instrument model D13 having mass information indicating the mass of the writing instrument 20, it is possible to give the writing instrument model D13 a natural movement that takes the mass of the writing instrument 20 into consideration. In other words, it is possible to bring the change in the state of the writing instrument 20 indicated by the estimated value D20 closer to the actual change in the state of the writing instrument 20 accompanying the writing operation. As a result, it is possible to suppress the discomfort that the user U feels during the writing operation.
  • the writing instrument 20 having the pen tip 204 at one end in the axial direction L may be used as an input device.
  • the information indicated by the trajectory of the pen tip 204 can be accurately acquired as handwriting information, and the discomfort felt by the user U during the writing operation can be suppressed.
  • The information input system 1 may include a setting unit 14 that sets the virtual surface S in the space where the writing instrument 20 is placed, and a recognition unit 16 that recognizes the trajectory of the estimated value D20 with respect to the virtual surface S. In this case, the writing motion of the user U on the virtual surface S can be recognized.
  • the estimating unit 15 may include a monitoring unit 141 that monitors changes in the estimated value D20, and an adjusting unit 142 that adjusts the estimated value D20.
  • the estimated value D20 can be adjusted so that the change in the estimated value D20 becomes small. Thereby, a situation in which the estimated value D20 deviates significantly from the virtual plane S can be suppressed. As a result, it becomes possible to recognize the locus of the estimated value D20 with respect to the virtual surface S more reliably.
  • The monitoring unit 141 may include a calculation unit 143 that calculates the angle θ as the amount of change in the estimated value D20 from a reference value, using a predetermined state of the writing instrument 20 with respect to the virtual surface S as the reference value, and a determination unit 144 that determines whether or not the angle θ is within a first tolerance range.
  • When the determination unit 144 determines that the angle θ is not within the first tolerance range, the adjustment unit 142 may make the disturbance applied to the writing instrument model D13 smaller than the disturbance calculated by the disturbance calculation unit 157, thereby making the angle θ of the estimated value D20 indicated by the simulation execution result smaller.
  • The determination unit 144 may determine whether the angle θ is within a second tolerance range that is larger than the first tolerance range.
  • The adjustment unit 142 may return the estimated value D20 to the reference value when the determination unit 144 determines that the angle θ is not within the second tolerance range. According to this configuration, when the estimated value D20 deviates further from the reference value, returning the estimated value D20 to the reference value makes it possible to recognize the trajectory of the estimated value D20 with respect to the virtual surface S more reliably.
  • The adjustment unit 142 may increase the reduction rate of the disturbance applied to the writing instrument model D13, relative to the disturbance calculated by the disturbance calculation unit 157, as the angle θ deviates further from the first tolerance range.
  • In this case, the larger the angle θ, the smaller the movement of the estimated value D20 indicated by the simulation execution result. This makes it possible for the user U to recognize that the estimated value D20 has deviated from the reference value.
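  • The deviation-dependent reduction rate could look like the following monotone mapping; the specific functional form and the floor value are assumptions, since the text only requires that the reduction grows as θ moves further outside the first tolerance range.

```python
def disturbance_scale(theta, first_range, max_reduction=0.9):
    """Return a multiplicative scale (<= 1) for the disturbance (illustrative sketch).

    theta       : angle between the reference and estimated directions.
    first_range : upper bound of the first tolerance range.
    The further theta exceeds first_range, the stronger the reduction.
    """
    if theta <= first_range:
        return 1.0                      # inside the tolerance range: no reduction
    excess = theta - first_range
    # Decreases toward (1 - max_reduction) as the excess grows.
    return max(1.0 - max_reduction, 1.0 / (1.0 + excess))
```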
  • In the above embodiment, a case is illustrated in which the terminal 10 includes, as functional elements, the communication unit 11, the acquisition unit 12, the display unit 13, the setting unit 14, the estimation unit 15, the adjustment unit 17, the recognition unit 16, the appraisal unit 18, and the output unit 19.
  • the information input system 1 can acquire the writing information D4 without using the terminal 10.
  • the information input system 1 may include a server separate from the terminal 10. In this case, at least one of the above functional elements included in the terminal 10 may be implemented in the server.
  • the terminal 10 or the database 30 stores various information such as the threshold information D11, the past handwriting information D12, and the writing instrument model D13.
  • If the writing instrument 20 includes a storage device, the storage device may store the various information.
  • In the above embodiment, the motion sensor 208 includes an acceleration sensor and a gyro sensor. However, the motion sensor 208 may include a magnetic field sensor that detects magnetic fields in three mutually orthogonal axes instead of the acceleration sensor or the gyro sensor.
  • motion sensor 208 may include a magnetic field sensor in addition to an acceleration sensor and a gyro sensor.
  • the writing instrument 20 is used as an example of the input device.
  • the input device may be any device other than the writing instrument 20 as long as it is capable of detecting an input operation by the user U.
  • the processing procedure executed in the information input system 1 is not limited to the example shown in the embodiment described above. For example, some of the steps (processes) described above may be omitted, or each step may be executed in a different order. Any two or more of the steps described above may be combined, or some of the steps may be modified or deleted. Alternatively, other steps may be performed in addition to each of the above steps.
  • the purpose and scene of use of the writing instrument 20 as an example of an input device are not limited.
  • the writing implement 20 may be used in a learning scene that involves writing, or may be used in a scene other than a learning scene. Learning that involves writing may be learning that takes place at school, home, or other places.
  • the writing instrument 20 may be used in the creation of illustrations or charts.
  • The information input system is not limited to the above-mentioned situations, and can be applied to various situations in which an input device such as the writing instrument 20 is used.
  • An acquisition unit that acquires observed values regarding the operation of the input device from the sensor; an estimation unit that estimates an estimated value of the state of the input device based on the observed value; an output unit that outputs information indicated by the trajectory of the estimated value to a display device as input information from the input device,
  • the estimation unit is a model acquisition unit that acquires an input device model that models the input device; a disturbance calculation unit that calculates a disturbance to the input device model using the observed value and mass information indicating the mass of the input device; an analysis execution unit that performs a simulation of the operation of the input device using the input device model by applying the disturbance to the input device model including the mass information;
  • An information input system comprising: an estimated value deriving unit that derives the estimated value by associating a state of the input device model indicated by the execution result of the simulation with a current state of the input device.
  • the input device is a writing instrument including a pen tip at one end in the axial direction, The information input system according to [1], wherein the estimator estimates, as the estimated values, an estimated value of the position of the pen tip and an estimated value of the angle in the axial direction.
  • a setting unit that sets a virtual plane in a space in which the input device is arranged;
  • the estimation unit includes: a monitoring unit that monitors changes in the estimated value with respect to the virtual surface;
  • the monitoring unit a calculation unit that calculates an amount of change in the estimated value from the reference value, using a predetermined state of the input device with respect to the virtual plane as a reference value; a determination unit that determines whether the amount of change is within a first tolerance range, The adjustment unit makes the disturbance applied to the input device model smaller than the disturbance calculated by the disturbance calculation unit when the determination unit determines that the amount of change is not within the first tolerance range.
  • the determination unit determines whether the amount of change is within a second tolerance range that is larger than the first tolerance range, The information input system according to [5], wherein the adjustment unit returns the estimated value to the reference value when the determination unit determines that the amount of change is not within the second tolerance range.
  • the adjustment unit adjusts the amount of change so that the amount of change deviates from the first tolerance range when the determination unit determines that the amount of change is within the second tolerance range and not within the first tolerance range. , the information input system according to [6], wherein the reduction rate of the disturbance given to the input device model with respect to the disturbance calculated by the disturbance calculation unit is increased.
  • An information input method executed by an information input system including a processor, the method comprising: obtaining observed values regarding the operation of the input device from the sensor; estimating an estimated value of the state of the input device based on the observed value; and outputting information indicated by the trajectory of the estimated value to a display device as input information by the input device,
  • the step of estimating the estimated value includes: obtaining an input device model that models the input device; calculating a disturbance to the input device model using the observed value and mass information indicating the mass of the input device; performing a simulation of the operation of the input device using the input device model by applying the disturbance to the input device model including the mass information;
  • An information input method comprising: deriving the estimated value by associating the state of the input device model indicated by the execution result of the simulation with the current state of the input device.
  • [9] Obtaining observed values regarding the operation of the input device from the sensor; estimating an estimate of the state of the input device based on the observed value; causing a computer to perform a step of outputting information indicated by the trajectory of the estimated value to a display device as input information from the input device;
  • the step of estimating the estimated value includes: obtaining an input device model that models the input device; calculating a disturbance to the input device model using the observed value and mass information indicating the mass of the input device; performing a simulation of the operation of the input device using the input device model by applying the disturbance to the input device model including the mass information;
  • An information input program comprising: deriving the estimated value by associating the state of the input device model indicated by the execution result of the simulation with the current state of the input device.
  • DESCRIPTION OF SYMBOLS 1... Information input system, 12... Acquisition part, 13... Display part (an example of a display device), 14... Setting part, 15... Estimation part, 16... Recognition part, 19... Output part, 20... Writing implement (an example of an input device), 101... Processor, 141... Monitoring section, 143... Calculating section, 142... Adjusting section, 144... Judging section, 153... Estimated value deriving section, 155... Model acquiring section, 157... Disturbance calculating section, 158... Analysis execution section, 204... Pen tip, 208... Motion sensor (an example of a sensor), D4... Writing information (an example of input information), D10... Observed value, D13... Writing instrument model (an example of an input device model), D20... Estimated value, L... Axial direction, S... Virtual surface, T1... Trajectory, θ... Angle (an example of the amount of change).

Abstract

This information input system comprises: an acquisition unit for acquiring, from a sensor, an observation value regarding the action of an input apparatus; an estimation unit for estimating an estimate value of the state of the input apparatus on the basis of the observation value; and an output unit for outputting information indicated by the locus of the estimate value to a display device as input information by the input apparatus. The estimation unit includes: a model acquisition unit for acquiring an input apparatus model in which the input apparatus is modeled; a disturbance calculation unit for calculating a disturbance on the input apparatus model, using the observation value and mass information that indicates the mass of the input apparatus; an analysis execution unit for giving the disturbance to the input apparatus model, which includes the mass information, so as to execute a simulation of the action of the input apparatus using the input apparatus model; and an estimate value derivation unit for associating the state of the input apparatus model indicated by the result of executing the simulation with the current state of the input apparatus, so as to derive the estimate value.

Description

Information input system, information input method, and information input program
The present disclosure relates to an information input system, an information input method, and an information input program.
As technologies related to information input systems, for example, the technologies described in Patent Documents 1 to 5 are known. For example, Patent Document 1 discloses a writing instrument that detects a writing motion and provides writing information to a user. In this writing instrument, an inclined portion is provided at the tip of the barrel, and a motion sensor is housed in the inclined portion. The motion sensor measures the handwriting motion, and handwriting information is acquired based on the measurement results of the motion sensor.
JP 2021-033563 A; JP 2018-067131 A; JP 2016-530612 A (published Japanese translation of a PCT application); JP 2013-125487 A; JP 2003-114754 A
In the above-mentioned writing instrument, the motion sensor is placed near the tip of the writing instrument to suppress errors between the writing information measured by the motion sensor and the actual writing motion. However, since the motion sensor cannot be placed at the tip of the writing instrument, there is a limit to improving the accuracy of handwritten information by bringing the motion sensor closer to the tip of the writing instrument. On the other hand, it is also conceivable to perform a correction calculation to convert the handwriting information measured at the position of the motion sensor into the handwriting information at the position of the tip of the writing instrument. However, in this case, there is a possibility that the movement of the tip of the writing instrument obtained by the correction calculation will be an unnatural movement that differs from the actual writing motion. Therefore, this method may give the user a sense of discomfort.
The present disclosure provides an information input system, an information input method, and an information input program capable of improving the accuracy of input information while suppressing the discomfort that the user feels during an input operation.
An information input system according to one embodiment of the present disclosure includes: an acquisition unit that acquires, from a sensor, observed values relating to the motion of an input device; an estimation unit that estimates an estimated value of the state of the input device based on the observed values; and an output unit that outputs information indicated by the trajectory of the estimated value to a display device as input information from the input device. The estimation unit includes: a model acquisition unit that acquires an input device model in which the input device is modeled; a disturbance calculation unit that calculates a disturbance to the input device model using the observed values and mass information indicating the mass of the input device; an analysis execution unit that executes a simulation of the motion of the input device using the input device model by applying the disturbance to the input device model, which includes the mass information; and an estimated value derivation unit that derives the estimated value by associating the state of the input device model indicated by the result of the simulation with the current state of the input device.
An information input method according to one embodiment of the present disclosure is an information input method executed by an information input system including a processor. The information input method includes the steps of: acquiring, from a sensor, observed values relating to the motion of an input device; estimating an estimated value of the state of the input device based on the observed values; and outputting information indicated by the trajectory of the estimated value to a display device as input information from the input device. The step of estimating the estimated value includes the steps of: acquiring an input device model in which the input device is modeled; calculating a disturbance to the input device model using the observed values and mass information indicating the mass of the input device; executing a simulation of the motion of the input device using the input device model by applying the disturbance to the input device model, which includes the mass information; and deriving the estimated value by associating the state of the input device model indicated by the result of the simulation with the current state of the input device.
An information input program according to one embodiment of the present disclosure causes a computer to execute the steps of: acquiring, from a sensor, observed values relating to the motion of an input device; estimating an estimated value of the state of the input device based on the observed values; and outputting information indicated by the trajectory of the estimated value to a display device as input information from the input device. The step of estimating the estimated value includes the steps of: acquiring an input device model in which the input device is modeled; calculating a disturbance to the input device model using the observed values and mass information indicating the mass of the input device; executing a simulation of the motion of the input device using the input device model by applying the disturbance to the input device model, which includes the mass information; and deriving the estimated value by associating the state of the input device model indicated by the result of the simulation with the current state of the input device.
In the above embodiments, observed values relating to the motion of the input device are acquired from the sensor, and a disturbance to the input device model is calculated using the observed values and the mass information. A simulation of the motion of the input device using the input device model is then executed by applying the disturbance to the input device model. The state of the input device model indicated by the result of the simulation is associated with the current state of the input device, whereby an estimated value of the state of the input device is derived. By using a simulation based on the input device model in this way, information obtained at the position of the sensor can be converted into information at an arbitrary position of the input device and processed. Therefore, the estimated value can be derived with high accuracy from the observed values of the sensor regardless of the position of the sensor on the input device. Furthermore, by applying the disturbance to an input device model that includes mass information indicating the mass of the input device, the input device model can be given a natural motion that takes the mass of the input device into account. In other words, the change in the state of the input device indicated by the estimated value can be brought close to the actual change in the state of the input device accompanying the input operation. As a result, the discomfort felt by the user during the input operation can be suppressed.
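As a rough sketch only, the Python loop below illustrates the flow just described: acquire an observation, compute a disturbance, advance the simulation, and map the model state back to a device estimate. All names here (Observation, Estimate, and the three injected callables) are hypothetical placeholders and do not come from the disclosure.

from dataclasses import dataclass
from typing import Callable, Iterable, List, Tuple

Vec3 = Tuple[float, float, float]

@dataclass
class Observation:           # observed value D10: 3-axis acceleration and angular velocity
    accel: Vec3
    gyro: Vec3

@dataclass
class Estimate:              # estimated value D20: pen-tip position and axis angle
    tip_position: Vec3
    axis_angle: Vec3

def estimate_trajectory(observations: Iterable[Observation],
                        model_state,
                        compute_disturbance: Callable,
                        run_simulation_step: Callable,
                        map_model_to_device: Callable[..., Estimate]) -> List[Estimate]:
    """One estimate per observation: disturbance -> simulation -> mapping."""
    trajectory: List[Estimate] = []
    for obs in observations:
        disturbance = compute_disturbance(obs, model_state)      # disturbance condition D22
        model_state = run_simulation_step(model_state, disturbance)
        trajectory.append(map_model_to_device(model_state))      # state of the model -> estimate D20
    return trajectory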
The input device may be a writing instrument including a pen tip at one end in the axial direction. The estimation unit may estimate, as the estimated values, an estimated value of the position of the pen tip and an estimated value of the angle of the axial direction. In this case, information indicated by the trajectory of the pen tip can be acquired accurately as handwriting information while suppressing the discomfort felt by the user during the writing motion.
The information input system may further include a setting unit that sets a virtual surface in the space in which the input device is placed, and a recognition unit that recognizes the trajectory of the estimated value with respect to the virtual surface. In this case, the user's writing motion with respect to the virtual surface can be recognized.
The estimation unit may further include a monitoring unit that monitors a change in the estimated value with respect to the virtual surface, and an adjustment unit that adjusts the estimated value with respect to the virtual surface. With this configuration, when the change in the estimated value with respect to the virtual surface is large, the estimated value can be adjusted so that the change becomes small. This suppresses situations in which the estimated value deviates greatly from the virtual surface. As a result, the trajectory of the estimated value with respect to the virtual surface can be recognized more reliably.
The monitoring unit may include a calculation unit that calculates, with a predetermined state of the input device relative to the virtual surface taken as a reference value, the amount of change of the estimated value from the reference value, and a determination unit that determines whether the amount of change is within a first tolerance range. When the determination unit determines that the amount of change is not within the first tolerance range, the adjustment unit may make the disturbance applied to the input device model smaller than the disturbance calculated by the disturbance calculation unit, thereby reducing the amount of change in the estimated value indicated by the simulation result. In this configuration, when the estimated value changes greatly from the reference value, the change in the state of the input device model caused by the simulation can be reduced by reducing the disturbance applied to the input device model. The movement of the estimated value indicated by the simulation result is thereby kept small. As a result, situations in which the estimated value deviates greatly from the virtual surface are suppressed, and the trajectory of the estimated value with respect to the virtual surface can be recognized more reliably.
The determination unit may determine whether the amount of change is within a second tolerance range larger than the first tolerance range. The adjustment unit may return the estimated value to the reference value when the determination unit determines that the amount of change is not within the second tolerance range. With this configuration, when the estimated value changes even more greatly from the reference value, returning the estimated value to the reference value makes it possible to recognize the trajectory of the estimated value with respect to the virtual surface even more reliably.
When the determination unit determines that the amount of change is within the second tolerance range but not within the first tolerance range, the adjustment unit may increase the reduction rate of the disturbance applied to the input device model, relative to the disturbance calculated by the disturbance calculation unit, the further the amount of change falls outside the first tolerance range. In this configuration, the larger the amount of change of the estimated value from the reference value, the smaller the movement of the estimated value indicated by the simulation result becomes. This allows the user to recognize that the estimated value has deviated from the reference value.
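As an illustration of this attenuation, the following sketch scales the calculated disturbance down as the change θ moves past the first tolerance limit while it remains inside the second. The linear ramp and the function name attenuated_disturbance are assumptions, not something specified in the disclosure.

def attenuated_disturbance(disturbance: float, theta: float,
                           first_limit: float, second_limit: float) -> float:
    """Scale the calculated disturbance down as theta moves past the first tolerance limit."""
    if theta <= first_limit:                 # within the first tolerance range
        return disturbance                   # apply the disturbance as calculated
    if theta >= second_limit:                # outside the second tolerance range
        return 0.0                           # the estimate would be returned to the reference value instead
    # Between the two limits: the reduction rate grows with the excess over first_limit.
    reduction = (theta - first_limit) / (second_limit - first_limit)
    return disturbance * (1.0 - reduction)

For example, attenuated_disturbance(1.0, theta, first_limit=10.0, second_limit=30.0) applies the full disturbance up to 10 degrees of change and fades it out toward 30 degrees.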
According to one embodiment of the present disclosure, it is possible to improve the accuracy of input information while suppressing the discomfort that the user feels during an input operation.
FIG. 1 is a diagram showing an example of application of an information input system according to an embodiment.
FIG. 2 is a diagram showing an example of a hardware configuration related to the information input system.
FIG. 3 is a sectional view showing an example of the configuration of a writing instrument included in the information input system.
FIG. 4 is a diagram showing an example of a functional configuration related to the information input system.
FIG. 5 is a diagram showing an example of a virtual surface set by the setting unit.
FIG. 6 is a diagram showing an example of the configuration of the estimation unit.
FIG. 7 is a diagram showing a part of the configuration of the estimation unit of FIG. 6 in more detail.
FIG. 8(a) is a diagram showing the positional relationship between the virtual surface and the writing instrument, and FIG. 8(b) is a view of FIG. 8(a) from another angle.
FIG. 9 is a diagram showing an example of the configuration of the recognition unit.
FIG. 10 is a diagram showing how drawing points are set on the virtual surface.
FIGS. 11(a) and 11(b) are diagrams showing an example of an image of the calibration process.
FIGS. 12(a) and 12(b) are diagrams showing an example of an image of the calibration process.
FIG. 13 is a diagram showing a part of the configuration of the recognition unit of FIG. 8 in more detail.
FIG. 14 is a diagram showing an example of an image in which handwriting information is extracted.
FIG. 15(a) is a diagram showing an example of current handwriting information, and FIG. 15(b) is a diagram showing an example of past handwriting information.
FIG. 16 is a flowchart showing an example of the processing of the information input method performed in the information input system.
FIG. 17 is a flowchart showing an example of the estimation process.
FIG. 18 is a flowchart showing an example of the feedback process.
FIG. 19 is a flowchart showing an example of the drift correction process.
FIG. 20 is a flowchart showing an example of the recognition process.
FIG. 21 is a flowchart showing an example of the adjustment process.
FIG. 22 is a flowchart showing an example of the appraisal process.
[Information input system configuration]
FIG. 1 is a diagram showing an example of application of the information input system 1 according to the present embodiment. The information input system 1 is a system for inputting handwriting information D4 (an example of input information) relating to a writing motion by a user U to the display screen of an electronic device. The information input system 1 includes, for example, a terminal 10, a writing instrument 20 (an example of an input device), and a database 30. The writing instrument 20 is a tool used for writing characters, symbols, illustrations, and the like. The writing instrument 20 may be a pen that writes with ink or graphite, such as a ballpoint pen, fountain pen, marker, or mechanical pencil, or may be a pointing device such as a stylus pen. The user U may be a writer who writes using the writing instrument 20.
The writing instrument 20 is connected to the terminal 10 by short-range wireless communication. The short-range wireless communication may use, for example, a communication method such as Bluetooth (registered trademark) or Wi-Fi (registered trademark). The communication method between the writing instrument 20 and the terminal 10 is not limited. In the present embodiment, a case in which one writing instrument 20 communicates with the terminal 10 is illustrated, but the number of writing instruments 20 is not limited. For example, two or more writing instruments 20 may communicate with one terminal 10.
The terminal 10 is a computer used by the writer. The terminal 10 may be a portable terminal such as a high-performance mobile phone (smartphone), a tablet terminal, a wearable terminal (for example, a head-mounted display (HMD), smart glasses, or a smart watch), a laptop personal computer, or a mobile phone. The terminal 10 may also be a stationary terminal such as a desktop personal computer. The terminal 10 is connected to the database 30 via a communication network. The communication network may include, for example, the Internet or an intranet.
The database 30 is a non-transitory storage device that stores various data used by the information input system 1. In the present embodiment, the database 30 stores threshold information D11, past handwriting information D12, and a writing instrument model D13 (an example of an input device model). The database 30 may store, for example, threshold information D11 and past handwriting information D12 for a plurality of users U. The database 30 may store a plurality of types of writing instrument models D13. The database 30 may be constructed as a single database or may be a collection of a plurality of databases.
FIG. 2 is a diagram showing an example of a hardware configuration related to the information input system 1. FIG. 2 shows a terminal computer 100 that functions as the terminal 10. The terminal computer 100 includes, as hardware components, for example, a processor 101, a main storage unit 102, an auxiliary storage unit 103, a communication unit 104, an input interface 105, and an output interface 106. The processor 101 is an arithmetic device that executes an operating system and application programs. The processor 101 may be, for example, a CPU or a GPU, but the type of the processor 101 is not limited to these.
The main storage unit 102 is a device that stores programs for realizing the terminal 10, computation results output from the processor 101, and the like. The main storage unit 102 is constituted by, for example, at least one of a ROM and a RAM. The auxiliary storage unit 103 is generally a device capable of storing a larger amount of data than the main storage unit 102. The auxiliary storage unit 103 is constituted by a nonvolatile storage medium such as a hard disk or a flash memory. The auxiliary storage unit 103 stores a client program P1 for causing the terminal computer 100 to function as the terminal 10, as well as various data. The communication unit 104 is a device that performs data communication with other computers via the communication network. The communication unit 104 is constituted by, for example, a network card or a wireless communication module.
The input interface 105 is a device that receives data based on the user U's operations or actions. For example, the input interface 105 is constituted by at least one of a keyboard, operation buttons, a pointing device, a microphone, a sensor, and a camera. The keyboard and the operation buttons may be displayed on a touch panel. The data input to the input interface 105 is not limited. For example, the input interface 105 may receive data input or selected via the keyboard, the operation buttons, or the pointing device. Alternatively, the input interface 105 may receive audio data input through the microphone, or image data (for example, video data or still image data) captured by the camera.
The output interface 106 is a device that outputs data processed by the terminal computer 100. For example, the output interface 106 is constituted by at least one of a monitor, a touch panel, an HMD, and a speaker. Display devices such as a monitor, a touch panel, and an HMD display the processed data on a screen. The speaker outputs sound indicated by processed audio data.
Each functional element of the terminal 10 is realized by loading the client program P1, which is an example of the information input program, into the processor 101 or the main storage unit 102 and causing the processor 101 to execute it. The client program P1 includes code for realizing each functional element of the terminal 10. The processor 101 operates the communication unit 104, the input interface 105, or the output interface 106 in accordance with the client program P1, and reads and writes data in the main storage unit 102 or the auxiliary storage unit 103. Each functional element of the terminal 10 is realized by this processing.
The client program P1 may be provided after being recorded non-transitorily on a tangible recording medium such as a CD-ROM, a DVD-ROM, or a semiconductor memory. Alternatively, the client program P1 may be provided via a communication network as a data signal superimposed on a carrier wave.
FIG. 3 is a schematic diagram showing an example of the configuration of the writing instrument 20. FIG. 3 shows a cross section of the writing instrument 20 taken along a plane following the axial direction L. In the present embodiment, a case in which the writing instrument 20 is a ballpoint pen is illustrated. As shown in FIG. 3, the writing instrument 20 includes, for example, a cylindrical portion 201, a refill 203, a substrate 206, a pressure sensor 207, and a motion sensor 208 (an example of a sensor).
The cylindrical portion 201 may be a substantially cylindrical member extending along the axial direction L of the writing instrument 20. The cylindrical portion 201 has an opening 202 at its tip in the axial direction L. The refill 203 may be a cylindrical member filled with ink. The refill 203 has an outer diameter smaller than the inner diameter of the cylindrical portion 201 and is housed inside the cylindrical portion 201. With the refill 203 housed in the cylindrical portion 201, the tip of the refill 203 (hereinafter referred to as the "pen tip 204") is exposed from the opening 202 of the cylindrical portion 201. The substrate 206 is housed inside the cylindrical portion 201 behind the base end 205 of the refill 203 (that is, the end opposite to the pen tip 204 in the axial direction L). When the writer holds the cylindrical portion 201 and presses the pen tip 204 against a medium, the ink inside the refill 203 seeps out from the pen tip 204. The writer can therefore write by moving the writing instrument 20 with the pen tip 204 pressed against the medium.
The pressure sensor 207 is provided, for example, inside the cylindrical portion 201 between the base end 205 of the refill 203 and the substrate 206. The pressure sensor 207 detects, as writing pressure, the pressure that the pen tip 204 receives from the medium when writing with the writing instrument 20. However, when a writing motion is performed in the air, no writing pressure is generated on the writing instrument 20, so the pressure sensor 207 does not detect writing pressure. In the present embodiment, a situation in which the writing motion is performed in the air is assumed, and a description of the process in which the pressure sensor 207 detects writing pressure is omitted.
The motion sensor 208 is provided, for example, on the substrate 206 inside the cylindrical portion 201. The motion sensor 208 detects the motion of the writing instrument 20. For example, the motion sensor 208 may be a six-axis sensor including an acceleration sensor that detects acceleration in three mutually orthogonal axial directions and a gyro sensor that detects angular velocity about those three axes. In this case, the motion sensor 208 detects observed values of the acceleration in the three axial directions and observed values of the angular velocity about the three axes when writing with the writing instrument 20.
The writing instrument 20 includes, for example, a communication unit 21 and an acquisition unit 22 as functional elements. The acquisition unit 22 acquires the observed values of acceleration and angular velocity detected by the motion sensor 208 during the writing motion, either continuously or intermittently at a predetermined frequency. The communication unit 21 transmits the observed values of acceleration and angular velocity acquired by the acquisition unit 22 to the terminal 10. As described above, the communication unit 21 can communicate with the terminal 10 by short-range wireless communication such as Bluetooth (registered trademark). The communication unit 21 and the acquisition unit 22 may be incorporated into the writing instrument 20 as a simple computer device.
FIG. 4 is a diagram showing an example of a functional configuration related to the information input system 1. The terminal 10 includes, as functional elements, for example, a communication unit 11, an acquisition unit 12, a display unit 13 (an example of a display device), a setting unit 14, an estimation unit 15, a recognition unit 16, an adjustment unit 17, an appraisal unit 18, and an output unit 19. The communication unit 11 receives information transmitted from the communication unit 21 of the writing instrument 20. The information transmitted from the communication unit 21 of the writing instrument 20 includes, for example, the observed values of acceleration and angular velocity acquired by the acquisition unit 22 of the writing instrument 20. The acquisition unit 12 acquires the observed values of acceleration and angular velocity received by the communication unit 11.
The display unit 13 displays, for example, a virtual object associated with real space (that is, the world coordinate system). The display unit 13 may be realized by, for example, an information processing terminal including an HMD worn by the user U, or by a tablet terminal, projection mapping, or the like. The display unit 13 captures an image, for example in an imaging direction corresponding to the line of sight of the user U, and displays the captured image with the virtual space superimposed on it. Through the display unit 13, the user U can view virtual objects that do not exist in reality and that are arranged according to the arrangement of real objects placed in real space.
The display unit 13 displays virtual objects using a virtual space defined by three-dimensional coordinates. The display unit 13 places a virtual object at a preset position in the virtual space and calculates the correspondence between the virtual space and real space. In the following description, the coordinate system of the virtual space is referred to as the virtual coordinate system α. The display unit 13 displays, as the virtual object, an image viewed from the position and direction in the virtual space that respectively correspond to the imaging position and imaging direction in real space. In other words, the display unit 13 displays an image of the virtual object as seen from the virtual line of sight of the user U. The display of such virtual objects can be performed using conventional MR (Mixed Reality) technology.
The setting unit 14 sets a virtual surface S in the virtual space. The virtual surface S may be a virtual writing surface for recognizing the writing motion of the writing instrument 20 by the user U. The setting unit 14 sets the virtual surface S at an arbitrary position in the virtual space. The virtual surface S set by the setting unit 14 is displayed in the virtual space by the display unit 13. FIG. 5 is a diagram showing an example of the virtual surface S. As shown in FIG. 5, the virtual surface S may be, for example, a two-dimensional virtual plane set at an arbitrary position in the virtual coordinate system α. Hereinafter, the coordinate system that defines the virtual surface S is referred to as the screen coordinate system β.
The virtual surface S is not limited to the example shown in FIG. 5 and may be, for example, a virtual curved surface that varies three-dimensionally. In that case, the virtual surface S can be set on the surface of any three-dimensional virtual object displayed in the virtual space. The type of the virtual surface S is determined, for example, by receiving an input from the user U. Accordingly, when receiving an input from the user U, the setting unit 14 sets the virtual surface S corresponding to that input in the virtual space. The setting unit 14 does not have to set the virtual surface S in response to an input from the user U; for example, it may set the virtual surface S in advance at a fixed position in the virtual space. The virtual surface S does not need to be fixed in the virtual space and may be set so as to move within the virtual space.
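For illustration, a flat virtual surface S can be represented by an origin together with two in-plane unit axes, and a point given in the virtual coordinate system α can be projected onto those axes to obtain its coordinates in the screen coordinate system β. This representation and the class name VirtualPlane are assumptions made for the sketch, not the patent's definition.

import numpy as np

class VirtualPlane:
    """Flat virtual writing surface S placed in the virtual coordinate system α."""

    def __init__(self, origin, u_axis, v_axis):
        self.origin = np.asarray(origin, dtype=float)      # e.g. the plane center Sc
        self.u = np.asarray(u_axis, dtype=float)           # in-plane x direction (unit vector)
        self.v = np.asarray(v_axis, dtype=float)           # in-plane y direction (unit vector)
        self.normal = np.cross(self.u, self.v)             # plane normal

    def to_screen(self, point_alpha):
        """Project a 3-D point given in α coordinates to 2-D screen coordinates β."""
        d = np.asarray(point_alpha, dtype=float) - self.origin
        return float(d @ self.u), float(d @ self.v)

# Example: a plane centered at (0, 0, 1) facing the user.
plane = VirtualPlane(origin=(0.0, 0.0, 1.0), u_axis=(1.0, 0.0, 0.0), v_axis=(0.0, 1.0, 0.0))
print(plane.to_screen((0.1, -0.2, 1.0)))   # -> (0.1, -0.2)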
FIG. 6 is a diagram showing an example of the configuration of the estimation unit 15. The estimation unit 15 estimates an estimated value of the state of the writing instrument 20 in the virtual space. The state of the writing instrument 20 in the virtual space may be the position and orientation of the writing instrument 20 placed in the virtual space. Accordingly, the estimated value of the state of the writing instrument 20 may be an estimated value of the position of the writing instrument 20 or an estimated value of the orientation of the writing instrument 20. The estimated value of the position of the writing instrument 20 may be, for example, an estimated value of the position of the pen tip 204 (see FIG. 5).
The estimated value of the position of the pen tip 204 is represented by coordinate values of the virtual coordinate system α indicating the position of the pen tip 204. The estimated value of the orientation of the writing instrument 20 may be, for example, an estimated value of the angle of the axial direction L, which changes as the writing instrument 20 moves. The estimated value of the angle of the writing instrument 20 may be, for example, the current angle of the axial direction L of the writing instrument 20 relative to the initial angle of the axial direction L of the writing instrument 20 in the virtual coordinate system α. Hereinafter, the observed values of acceleration and angular velocity are collectively referred to as the "observed value D10", and the estimated values of the position and orientation of the writing instrument 20 are collectively referred to as the "estimated value D20".
The estimation unit 15 estimates the estimated value D20 by executing a simulation of the motion of the writing instrument 20 using a writing instrument model D13 in which the writing instrument 20 is modeled. As shown in FIG. 6, the estimation unit 15 includes, for example, a model processing unit 151, an analysis unit 152, an estimated value derivation unit 153, and a feedback unit 154. The model processing unit 151 includes, for example, a model acquisition unit 155 and an initialization processing unit 156. The model acquisition unit 155 acquires the writing instrument model D13 from the database 30. The writing instrument model D13 is, for example, CAD data of the writing instrument 20 and is divided into a finite number of elements that can be analyzed numerically. Being analyzable numerically means that the model can be handled by a numerical analysis method such as the finite element method, the finite volume method, the finite difference method, or the boundary element method. The writing instrument model D13 does not have to be stored in the database 30 in advance and may be generated by the terminal 10.
Model information D21 indicating the specifications of the writing instrument 20 is attached to the writing instrument model D13. The specifications of the writing instrument 20 may include, for example, the shape, dimensions, material, and mass of the writing instrument 20. These pieces of information are assigned to each element constituting the writing instrument model D13 and are reflected in the writing instrument model D13. The information indicating the mass of the writing instrument 20 may be, for example, mass information indicating the distribution of mass over the parts of the writing instrument 20. The mass information is attached to the writing instrument model D13 as information indicating the mass distribution over the elements of the writing instrument model D13. By referring to the mass information, the high-mass and low-mass parts of the writing instrument model D13 can be identified. The thickness of the pen tip 204 can be determined from the information indicating the shape and dimensions of the writing instrument 20. The model information D21 may include color information indicating the color of the pen tip 204. The model information D21 may be attached to the writing instrument model D13 in advance or may be attached to the writing instrument model D13 when the simulation is executed.
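A toy representation of the writing instrument model D13 as a list of finite elements, each carrying its own mass from the mass information, might look as follows. The field names and helper methods are assumptions for the sketch and are not part of the disclosure.

from dataclasses import dataclass
from typing import List, Tuple

@dataclass
class ModelElement:
    """One finite element of the writing instrument model D13."""
    element_id: int
    centroid: Tuple[float, float, float]   # position in the model's local frame
    mass: float                            # per-element mass from the mass information

@dataclass
class WritingInstrumentModel:
    elements: List[ModelElement]
    pen_tip_element_id: int                # element carrying the pen tip 204

    def total_mass(self) -> float:
        return sum(e.mass for e in self.elements)

    def heaviest_element(self) -> ModelElement:
        """The mass distribution lets high-mass and low-mass parts of the model be told apart."""
        return max(self.elements, key=lambda e: e.mass)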
The initialization processing unit 156 initializes the state (for example, the position and orientation) of the writing instrument model D13. Initializing means setting the position and orientation of the writing instrument model D13 in the simulation space to initial values. The initial value of the position of the writing instrument model D13 may be any coordinate value of the coordinate system in the simulation space. The initial value of the angle of the writing instrument model D13 may be an angle from an arbitrary reference line of the coordinate system in the simulation space.
The coordinate system in the simulation space does not necessarily have to coincide with the virtual coordinate system α of the virtual space, but it is associated with the virtual coordinate system α. Coordinate values in the simulation-space coordinate system can be converted into coordinate values in the virtual coordinate system α. The initialization processing unit 156 initializes the position and orientation of the writing instrument model D13 when the model acquisition unit 155 acquires the writing instrument model D13, or at any timing at which initialization is required. The model processing unit 151 provides the analysis unit 152 with the writing instrument model D13 initialized by the initialization processing unit 156 and the model information D21 attached to the writing instrument model D13.
The analysis unit 152 includes, for example, a disturbance calculation unit 157 and an analysis execution unit 158. The disturbance calculation unit 157 calculates, using the observed value D10 and the model information D21, the disturbance to be applied to the writing instrument model D13. The disturbance may be, for example, the force and torque applied to the writing instrument 20 by the user when the writing instrument 20 is moved. The disturbance calculation unit 157 calculates the force and torque to be applied to each element of the writing instrument model D13 from the acceleration and angular velocity indicated by the observed value D10 and the mass information indicated by the model information. The force F to be applied to an element of the writing instrument model D13 is expressed by the following equation (1), where m is the mass of the element and Sa is the acceleration. In equation (1), A is an arbitrary function defined to derive the force F from the mass m and the acceleration Sa. The disturbance calculation unit 157 calculates the torque based on the force F. The disturbance calculation unit 157 provides the calculated disturbance, as a disturbance condition D22, to the analysis execution unit 158 together with the writing instrument model D13 and the model information D21.

F = A(m, Sa)   (1)
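A minimal sketch of the disturbance calculation follows, assuming the simplest choice A(m, Sa) = m * Sa for the arbitrary function A in equation (1), and taking the torque as the moment of each element force about the model's center of mass. Both choices and all names are illustrative assumptions, not the disclosure's definitions.

import numpy as np

def compute_disturbance(accel, element_positions, element_masses, center_of_mass):
    """Per-element force and torque (disturbance condition D22) from one observation.

    accel             -- observed 3-axis acceleration Sa (sensor frame)
    element_positions -- (N, 3) element centroids
    element_masses    -- (N,) per-element masses m from the mass information
    center_of_mass    -- (3,) reference point for the torque arm
    """
    accel = np.asarray(accel, dtype=float)
    masses = np.asarray(element_masses, dtype=float)[:, None]
    forces = masses * accel                       # F = A(m, Sa), here simply m * Sa per element
    arms = np.asarray(element_positions, dtype=float) - np.asarray(center_of_mass, dtype=float)
    torques = np.cross(arms, forces)              # moment of each element force about the center of mass
    return forces, torques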
The analysis execution unit 158 executes a simulation of the motion of the writing instrument 20 using the writing instrument model D13. The analysis execution unit 158 sets the disturbance condition D22 as an analysis condition for executing the simulation of the motion of the writing instrument 20. By applying the disturbance condition D22 to the writing instrument model D13, the analysis execution unit 158 makes the writing instrument model D13 move in the simulation space. As a result, the position and orientation of the writing instrument model D13 in the simulation space change. When the simulation is executed, the force F and torque included in the disturbance condition D22 and the mass information included in the model information D21 are assigned to each element of the writing instrument model D13.
Here, since the force F is proportional to the mass m, a large force F is applied to elements with a high mass m and a small force F is applied to elements with a low mass m. For example, when the mass m near the pen tip 204 is higher than in other parts, a relatively large force F is applied to the pen tip 204, and the amount of change in the position of the pen tip 204 in the simulation space becomes large. Conversely, when the mass m near the pen tip 204 is lower than in other parts, a relatively small force F is applied to the pen tip 204, and the amount of change in the position of the pen tip 204 in the simulation space becomes small.
As a result, the motion of the writing instrument model D13 in the simulation space becomes a natural motion that takes the mass m of the writing instrument 20 into account, that is, a motion close to the motion of the actual writing instrument 20. By executing such a simulation, the position and orientation of the writing instrument model D13 in the simulation space after execution of the simulation change from the position and orientation of the writing instrument model D13 before execution of the simulation. The analysis execution unit 158 provides the simulation result to the estimated value derivation unit 153. The simulation result may be a result indicating the position and orientation of the writing instrument model D13 after execution of the simulation, or a result indicating the amount of change in the position and orientation of the writing instrument model D13 caused by execution of the simulation.
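As a crude stand-in for the numerical analysis described here, the sketch below advances a single rigid-body pose by explicit integration of the net element force and the observed angular velocity over one time step. A real element-wise solver would be far more elaborate; the integration scheme and every name are assumptions.

import numpy as np

def run_simulation_step(position, velocity, orientation_deg, forces, gyro, total_mass, dt=0.01):
    """One explicit integration step of the model's pose in simulation space.

    position, velocity -- (3,) current model position and velocity
    orientation_deg    -- (3,) current orientation as angles about x, y, z in degrees
    forces             -- (N, 3) per-element forces from the disturbance condition D22
    gyro               -- (3,) observed angular velocity in degrees per second
    """
    net_force = np.sum(np.asarray(forces, dtype=float), axis=0)
    accel = net_force / total_mass                               # a heavier model responds less
    velocity = np.asarray(velocity, dtype=float) + accel * dt
    position = np.asarray(position, dtype=float) + velocity * dt
    orientation_deg = np.asarray(orientation_deg, dtype=float) + np.asarray(gyro, dtype=float) * dt
    return position, velocity, orientation_deg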
The estimated value derivation unit 153 derives the estimated value D20 of the current position and orientation of the writing instrument 20 from the simulation result. For example, the estimated value derivation unit 153 derives the estimated value D20 by associating the position and orientation of the writing instrument model D13 in the simulation space with the position and orientation of the writing instrument 20 in the virtual space. Associating the position and orientation of the writing instrument model D13 with the position and orientation of the writing instrument 20 means converting the position and orientation of the writing instrument model D13 in the coordinate system of the simulation space into the position and orientation of the writing instrument model D13 in the virtual coordinate system α, and regarding the converted position and orientation of the writing instrument model D13 as the position and orientation of the writing instrument 20 in the virtual space.
In this way, the estimated value derivation unit 153 uses the correspondence between the coordinate system of the simulation space and the virtual coordinate system α of the virtual space. For example, it derives the estimated value D20 of the current position and orientation of the writing instrument 20 by reflecting the position and orientation of the writing instrument model D13 indicated by the simulation result (that is, after execution of the simulation) in the current position and orientation of the writing instrument 20 in the virtual space.
The estimated value derivation unit 153 may instead convert the amount of change in the position and orientation of the writing instrument model D13 caused by execution of the simulation into the corresponding amounts of change in the virtual space, and add the converted amounts of change to the position and orientation of the writing instrument 20 before the motion (that is, before execution of the simulation) in the virtual space. In this case, the estimated value derivation unit 153 derives the position and orientation of the writing instrument 20 after the amounts of change have been added as the estimated value D20 of the current position and orientation of the writing instrument 20. The estimated value derivation unit 153 provides the estimated value D20 to the recognition unit 16 and the feedback unit 154.
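The mapping from the simulation-space coordinate system to the virtual coordinate system α can be pictured as a fixed rigid transform. The sketch below applies such a transform to the simulated pen-tip position and axial direction to obtain the estimate D20; the transform itself, and the use of a direction vector to represent the angle of the axial direction, are assumptions of the sketch.

import numpy as np

def map_model_to_device(sim_tip_position, sim_axis_direction, R_sim_to_alpha, t_sim_to_alpha):
    """Convert the simulated pose of the model into the estimate D20 in the virtual coordinate system α.

    sim_tip_position   -- (3,) pen-tip position of the model in simulation space
    sim_axis_direction -- (3,) unit vector along the model's axial direction L in simulation space
    R_sim_to_alpha     -- (3, 3) rotation from simulation space to α
    t_sim_to_alpha     -- (3,) translation from simulation space to α
    """
    R = np.asarray(R_sim_to_alpha, dtype=float)
    tip_alpha = R @ np.asarray(sim_tip_position, dtype=float) + np.asarray(t_sim_to_alpha, dtype=float)
    axis_alpha = R @ np.asarray(sim_axis_direction, dtype=float)   # directions ignore the translation
    return tip_alpha, axis_alpha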
In the simulation space, the position of the pen tip 204 of the writing instrument model D13 can be determined using the observed value D10 from the motion sensor 208. The observed value D10 from the motion sensor 208 indicates not the position of the pen tip 204 but the acceleration and angular velocity at the position where the motion sensor 208 is provided. The position of the motion sensor 208 is obtained by integrating at least one of this acceleration and angular velocity. The position of the motion sensor 208 can be converted into the position of the pen tip 204 using, for example, the following equation (2).

[Equation (2): image not reproduced here; it relates the pen-tip position VH to the motion-sensor position VS using the quantities defined below.]

In equation (2), VS is a position vector indicating the position of the motion sensor 208, VS0 is a position vector indicating the initial position of the motion sensor 208, VH is a position vector indicating the position of the pen tip 204, VH0 is a position vector indicating the initial position of the pen tip 204, RS is a rotation vector indicating the rotation of the motion sensor 208, RS0 is a rotation vector indicating the initial rotation of the motion sensor 208, and Rot is a transformation rotation matrix obtained from the angle between VH and VS. Since the position of the pen tip 204 relative to the motion sensor 208 does not change, (VH - VS) is always constant (see FIG. 5). In equation (2), VS, RS, and RS0 are obtained from the acceleration and angular velocity indicated by the observed value D10. Therefore, the estimated value derivation unit 153 can estimate the position of the pen tip 204 and the orientation of the writing instrument 20 from the observed value D10 using equation (2).
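Because the offset between the motion sensor 208 and the pen tip 204 is fixed in the instrument, one plausible reading of equation (2) is to rotate the initial offset (VH0 - VS0) by the sensor's rotation since the initial state and add it to the current sensor position. The sketch below implements that reading with SciPy's rotation utilities; it is an assumed interpretation, not a verbatim reproduction of equation (2).

import numpy as np
from scipy.spatial.transform import Rotation

def pen_tip_position(vs, rs_deg, vs0, vh0, rs0_deg):
    """Estimate the pen-tip position VH from the motion-sensor position VS.

    vs, vs0         -- current and initial sensor positions
    vh0             -- initial pen-tip position
    rs_deg, rs0_deg -- current and initial sensor rotations as xyz Euler angles (degrees)
    """
    offset0 = np.asarray(vh0, dtype=float) - np.asarray(vs0, dtype=float)   # (VH0 - VS0), constant length
    r_now = Rotation.from_euler("xyz", rs_deg, degrees=True)
    r_init = Rotation.from_euler("xyz", rs0_deg, degrees=True)
    rot = r_now * r_init.inv()                     # rotation of the sensor since the initial state
    return np.asarray(vs, dtype=float) + rot.apply(offset0)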
FIG. 7 is a diagram showing an example of the configuration of the feedback unit 154. The feedback unit 154 monitors the estimated value D20 of the current position and orientation of the writing instrument 20 in the virtual coordinate system α and feeds the monitoring result back to the analysis unit 152. As shown in FIG. 7, the feedback unit 154 includes, for example, a monitoring unit 141 and an adjustment unit 142. The monitoring unit 141 monitors the amount of change in the current position and orientation of the writing instrument 20 in the virtual coordinate system α. The monitoring unit 141 includes, for example, a calculation unit 143 and a determination unit 144. The calculation unit 143 calculates the amount of change of the estimated value D20 from a reference value, the reference value being a predetermined position and orientation of the writing instrument 20 with respect to the virtual surface S.
The reference value may be the initial values of the position and orientation of the writing instrument 20 in the virtual coordinate system α. The initial values may be, for example, the position and orientation of the writing instrument 20 in a state in which, in the virtual coordinate system α, the pen tip 204 points toward the center Sc of the virtual surface S. The pen tip 204 pointing toward the center Sc of the virtual surface S may mean that a virtual line extending from the pen tip 204 intersects the center Sc of the virtual surface S. As long as the pen tip 204 points toward the center Sc of the virtual surface S, the axial direction L of the writing instrument 20 may, for example, be inclined with respect to the normal direction of the virtual surface S or may lie along the normal direction of the virtual surface S. The initial state is not limited to a state in which the pen tip 204 points toward the center Sc of the virtual surface S; it may be a state in which the pen tip 204 points toward a position offset from the center Sc of the virtual surface S.
FIG. 8(a) is a diagram showing the positional relationship between the virtual surface S and the writing instrument 20. FIG. 8(b) is a view of FIG. 8(a) from another angle. FIGS. 8(a) and 8(b) show the writing instrument 20A indicated by the reference value and the current writing instrument 20 indicated by the estimated value D20. The amount of change of the estimated value D20 from the reference value can be expressed, for example, by taking the inner product of the unit direction vector V1 in which the pen tip 204 of the writing instrument 20A points and the unit direction vector V2 in which the pen tip 204 of the current writing instrument 20 points. By taking the inner product of the unit direction vector V1 and the unit direction vector V2, the angle θ formed by the unit direction vector V1 and the unit direction vector V2 can be calculated.
 The larger the amount of change in the estimated value D20 from the reference value, the larger the angle θ formed by the unit direction vector V1 and the unit direction vector V2 becomes; the smaller the amount of change in the estimated value D20 from the reference value, the smaller the angle θ becomes. The calculation unit 143 therefore calculates the angle θ formed by the unit direction vector V1 and the unit direction vector V2 as an index indicating the amount of change in the estimated value D20 from the reference value. The calculation unit 143 provides a calculation result D41 indicating the calculated angle θ to the determination unit 144.
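 As a minimal sketch of this calculation (assuming the unit direction vectors are available as three-element arrays; the function and variable names are illustrative and do not appear in the specification), the angle θ can be obtained from the inner product as follows.

    import numpy as np

    def change_angle(v1, v2):
        # v1: unit direction vector of the reference writing instrument 20A
        # v2: unit direction vector of the current writing instrument 20
        # Clamp the inner product to [-1, 1] to guard against rounding noise
        # before taking the arccosine.
        cos_theta = float(np.clip(np.dot(v1, v2), -1.0, 1.0))
        return float(np.arccos(cos_theta))  # angle theta in radians

    # Example: the current pen-tip direction tilted away from the reference direction.
    v1 = np.array([0.0, 0.0, -1.0])
    v2 = np.array([0.0, 0.3, -1.0])
    v2 = v2 / np.linalg.norm(v2)
    theta = change_angle(v1, v2)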
 The determination unit 144 determines whether the amount of change (for example, the angle θ) indicated by the calculation result D41 is within an allowable range. The determination unit 144 first sets a first region R1 and a second region R2 on the virtual plane S. As shown in FIGS. 8(a) and 8(b), the first region R1 may be, for example, a region inside the outer edge of the virtual plane S that includes at least the center Sc. The boundary of the first region R1 may be located midway between the outer edge of the virtual plane S and the center Sc, or may be shifted inward or outward from that midpoint. The second region R2 may be a region larger than the first region R1. The second region R2 may be, for example, a region inside the outer edge of the virtual plane S.
 The determination unit 144 sets, as a first allowable range, the range of the angle θ formed by the unit direction vector V2 with respect to the unit direction vector V1 when the intersection PV2 of the unit direction vector V2 and the virtual plane S is located in the first region R1. The angle θ being within the first allowable range means that the intersection PV2 falls within the first region R1. The determination unit 144 sets, as a second allowable range, the range of the angle θ when the intersection PV2 is located in the second region R2. The angle θ being within the second allowable range means that the intersection PV2 of the unit direction vector V2 and the virtual plane S falls within the second region R2. The second allowable range is a range of the angle θ larger than the first allowable range.
 The determination unit 144 first determines whether the angle θ is within the second allowable range. When the determination unit 144 determines that the angle θ is not within the second allowable range, that is, when it determines that the intersection PV2 of the unit direction vector V2 and the virtual plane S does not fall within the second region R2, it provides the adjustment unit 142 with a determination result D42 indicating that the angle θ is not within the second allowable range. On the other hand, when the determination unit 144 determines that the angle θ is within the second allowable range, that is, when it determines that the intersection PV2 falls within the second region R2, it determines whether the angle θ is within the first allowable range. When the determination unit 144 determines that the angle θ is not within the first allowable range, that is, when it determines that the intersection PV2 does not fall within the first region R1, it provides the adjustment unit 142 with a determination result D42 indicating that the angle θ is within the second allowable range but not within the first allowable range. On the other hand, when the determination unit 144 determines that the angle θ is within the first allowable range, it repeats the above determination for the angle θ obtained from the next estimated value D20.
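 The two-step determination described above can be summarized as the following sketch, assuming the first and second allowable ranges are given as upper limits on the angle θ (the limits themselves would be derived from the sizes of the regions R1 and R2; the names are illustrative).

    def judge_change(theta, first_limit, second_limit):
        # first_limit  : upper bound of the first allowable range (intersection PV2 in R1)
        # second_limit : upper bound of the second allowable range (intersection PV2 in R2)
        if theta > second_limit:
            return "outside_second"   # D42: not within the second allowable range
        if theta > first_limit:
            return "outside_first"    # D42: within the second range but not the first
        return "within_first"         # no feedback; check the next estimated value D20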
 The adjustment unit 142 adjusts the estimated value D20 of the current position and orientation of the writing instrument 20 in the virtual coordinate system α based on the monitoring result of the monitoring unit 141 (specifically, the determination result D42 of the determination unit 144). When the adjustment unit 142 receives a determination result D42 indicating that the angle θ is not within the second allowable range, it executes a calibration process that returns the estimated value D20 of the current position and orientation of the writing instrument 20 in the virtual coordinate system α to the reference value. For example, the adjustment unit 142 adjusts the position and orientation of the writing instrument model D13 in the simulation space so that the estimated value D20 becomes the reference value (that is, so that the pen tip 204 of the writing instrument 20 points toward the center Sc of the virtual plane S).
 For example, the adjustment unit 142 calculates the disturbance required to move the unit direction vector V2 of the writing instrument 20 indicated by the estimated value D20 to the unit direction vector V1 of the writing instrument 20A indicated by the reference value, and provides the calculated disturbance to the analysis execution unit 158. This disturbance can be set by adjusting the function A of equation (1) described above. The analysis execution unit 158 changes the position and orientation of the writing instrument model D13 in the simulation space by applying the disturbance provided by the adjustment unit 142 to the writing instrument model D13. The adjustment unit 142 then reflects the position and orientation of the writing instrument model D13 after the simulation has been executed in the current position and orientation of the writing instrument 20 in the virtual space. In this way, the adjustment unit 142 returns the estimated value D20 of the current position and orientation of the writing instrument 20 to the reference value.
 When the adjustment unit 142 receives a determination result D42 indicating that the angle θ is within the second allowable range but not within the first allowable range, it adjusts the amount of change in the estimated value D20 with respect to the reference value. The amount of change in the estimated value D20 corresponds to the amount of change in the position and orientation of the writing instrument model D13 caused by executing the simulation. The amount of change in the position and orientation of the writing instrument model D13 depends on the disturbance set as an analysis condition when the simulation is executed. The disturbance can be adjusted through the magnitude of the function A shown in equation (1).
 The adjustment unit 142 therefore makes the disturbance function A applied to the writing instrument model D13 during execution of the simulation smaller than the disturbance function A calculated by the disturbance calculation unit 157. The disturbance function A calculated by the disturbance calculation unit 157 is a value calculated based on the observed value D10 and the model information D21. This disturbance may be regarded as the disturbance applied to the actual writing instrument 20. Therefore, by making the function A smaller, the disturbance applied to the writing instrument model D13 can be made smaller than the disturbance applied to the actual writing instrument 20.
 In other words, the amount of change in the position and orientation of the writing instrument model D13 caused by executing the simulation can be made smaller than the amount of change in the position and orientation of the actual writing instrument 20. As a result, the motion of the writing instrument model D13 in the simulation space becomes smaller than the motion of the actual writing instrument 20. Accordingly, the movement of the estimated value D20 in the virtual space indicated by the simulation result also becomes smaller. That is, the movement of the estimated value D20 becomes slower than the motion of the actual writing instrument. This prevents the estimated value D20 from deviating even further from the reference value.
 When reducing the disturbance applied to the writing instrument model D13, the adjustment unit 142 sets the disturbance function A applied to the writing instrument model D13 so that the reduction rate of the function A applied to the writing instrument model D13 during the simulation, relative to the disturbance function A calculated by the disturbance calculation unit 157, becomes larger the further the estimated value D20 deviates from the reference value, that is, the further the angle θ deviates from the first allowable range. For example, when the angle θ does not deviate greatly from the first allowable range (for example, when the angle θ is closer to the upper limit of the first allowable range than to the upper limit of the second allowable range), the adjustment unit 142 sets the disturbance function A applied to the writing instrument model D13 to half the disturbance function A calculated by the disturbance calculation unit 157. In this case, the amount of movement of the estimated value D20 indicated by the simulation result can be kept to about half the amount of movement of the actual writing instrument 20.
 On the other hand, when the angle θ deviates greatly from the first allowable range (for example, when the angle θ is closer to the upper limit of the second allowable range than to the upper limit of the first allowable range), the adjustment unit 142 sets the disturbance function A applied to the writing instrument model D13 to zero. As a result, the disturbance applied to the writing instrument model D13 in the simulation space becomes zero, and the motion of the actual writing instrument 20 is no longer reflected in the motion of the writing instrument model D13. In this case, the estimated value D20 indicated by the simulation result no longer moves in the virtual space, so even if the user moves the writing instrument 20 greatly, the estimated value D20 does not move in the virtual space. By changing the reduction rate of the disturbance applied to the writing instrument model D13 according to the magnitude of the angle θ in this way, the movement of the estimated value D20 in the virtual space can be made smaller the further the estimated value D20 deviates from the reference value. As a result, the user can be made aware that the current position and orientation of the writing instrument 20 deviate greatly from the virtual plane S.
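 One way to realize the reduction rate described above is sketched below, assuming the "far from the first allowable range" case is separated from the "close to the first allowable range" case at the midpoint between the two upper limits; the specification only gives the two example reduction values (one half and zero), so the breakpoint used here is an assumption.

    def attenuated_function_a(a_calculated, theta, first_limit, second_limit):
        # a_calculated: disturbance function A computed by the disturbance calculation unit 157
        if theta <= first_limit:
            return a_calculated            # within the first allowable range: no reduction
        midpoint = 0.5 * (first_limit + second_limit)
        if theta <= midpoint:
            return 0.5 * a_calculated      # close to the first range: halve the disturbance
        return 0.0                         # far outside: ignore the actual pen motion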
 The monitoring unit 141 may monitor whether drift correction of the estimated value D20 should be performed. In this case, the monitoring unit 141 sets a drift correction condition under which drift correction of the estimated value D20 becomes necessary. A detection error generally called a drift error may occur in the motion sensor 208. When such errors accumulate, a phenomenon occurs in which the estimated value D20 gradually shifts with respect to the virtual plane S in the virtual space. When this phenomenon occurs, the center of the distribution of the estimated values D20 within a certain period of time, obtained when the distribution of the estimated values D20 is projected onto the virtual plane S, gradually shifts from the center of the distribution of the estimated values D20 within a certain period of time in the case where no drift error has accumulated. The center of the distribution of the estimated values D20 within a certain period of time may be the average value or the median value of the distribution of the estimated values D20 within that period. When no drift error has accumulated, the center of the distribution of the estimated values D20 within a certain period of time is located, for example, near the center Sc of the virtual plane S.
 The calculation unit 143 may therefore calculate the amount of shift of the center of the distribution of the current estimated values D20 within a certain period of time, using as a reference value the center of the distribution of the estimated values D20 within a certain period of time when no drift error has accumulated. In this case, the determination unit 144 determines that the drift correction condition is satisfied when the calculated amount of shift exceeds a preset threshold. On the other hand, the determination unit 144 determines that the drift correction condition is not satisfied when the calculated amount of shift does not exceed the threshold. This threshold can be obtained from the threshold information D11 stored in the database 30. The threshold may be set in advance by, for example, the information input system 1, the provider of the information input system 1, or the user. The threshold may be set, for example, based on past statistical values indicating the center of the distribution of the estimated values D20 within a certain period of time, or may be set arbitrarily by the user.
 When the determination unit 144 determines that the drift correction condition is satisfied, the adjustment unit 142 executes a calibration process that returns the estimated value D20 to the reference value. For example, the adjustment unit 142 adjusts the position and orientation of the writing instrument model D13 in the simulation space so that the estimated value D20 becomes the reference value (that is, so that the pen tip 204 of the writing instrument 20 points toward the center Sc of the virtual plane S). For example, the adjustment unit 142 performs the same process as the calibration process described above. As a result, the center of the distribution of the current estimated values D20 within a certain period of time can be returned to the vicinity of the center Sc of the virtual plane S. On the other hand, when the determination unit 144 determines that the drift correction condition is not satisfied, it repeats the above determination for the next estimated value D20.
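 A drift-correction check along these lines could be sketched as follows, assuming the drawing-point projections of the estimated values D20 collected over a fixed time window are available as a two-dimensional array (the use of the mean as the distribution center and the function names are illustrative).

    import numpy as np

    def needs_drift_correction(window_points, reference_center, threshold):
        # window_points   : (N, 2) array of estimated values D20 projected onto the
        #                   virtual plane S over a fixed time window
        # reference_center: distribution center when no drift error has accumulated
        #                   (for example, near the center Sc)
        current_center = np.mean(window_points, axis=0)   # the median is another option
        shift = float(np.linalg.norm(current_center - reference_center))
        return shift > threshold   # True: drift correction condition satisfied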
 FIG. 9 is a diagram showing an example of the configuration of the recognition unit 16. The recognition unit 16 recognizes the trajectory of the writing instrument 20 based on the estimated value D20 estimated by the estimation unit 15. As shown in FIG. 9, the recognition unit 16 includes, for example, a virtual line setting unit 161, a drawing point setting unit 162, and a writing information acquisition unit 163. The virtual line setting unit 161 sets a virtual line VL (for example, a virtual ray) in the virtual space. FIG. 10 is a diagram showing how a drawing point P is set on the virtual plane S by the virtual line VL. The virtual line VL may be, for example, a straight line that is virtually set in order to set the drawing point P on the virtual plane S. As shown in FIG. 10, the virtual line VL extends, for example, in a straight line from the pen tip 204 of the writing instrument 20 toward the virtual plane S.
 The drawing point P may be an input point set on the virtual plane S in order to recognize the writing motion of the writing instrument 20 with respect to the virtual plane S. The estimated value D20 indicates the coordinate value of the position of the pen tip 204 in the virtual coordinate system α and the orientation of the writing instrument 20 in the virtual coordinate system α (that is, the angle of the axial direction L). The virtual line setting unit 161 therefore uses the estimated value D20 to set, as the virtual line VL, a straight line that passes through the position of the pen tip 204 indicated by the estimated value D20 and extends along the axial direction L of the writing instrument 20. The position of the pen tip 204 indicated by the estimated value D20 may be, for example, a coordinate point (that is, an input point) indicating the position of the pen tip 204 in the virtual coordinate system α.
 The orientation of the writing instrument 20 in the virtual coordinate system α can be expressed by a direction vector V along the axial direction L of the writing instrument 20. For example, let the coordinate value of the origin of the screen coordinate system β in the virtual coordinate system α be (x1, y1, z1), let the coordinate value of the position of the pen tip 204 in the virtual coordinate system α be (x2, y2, z2), and let the direction vector V be (Vx, Vy, Vz). In this case, the virtual line VL is expressed as a straight line along the direction vector (Vx, Vy, Vz) passing through the coordinate value (x2, y2, z2).
 For example, the virtual line VL is expressed by the following equation (3), where t is an arbitrary real number (that is, a parameter). The virtual line setting unit 161 uses the estimated value D20 and equation (3) to set the virtual line VL passing through the position of the pen tip 204 and extending along the axial direction L of the writing instrument 20. The virtual line VL extends in a straight line from the pen tip 204 in the direction in which the pen tip 204 points and intersects the virtual plane S. The virtual line setting unit 161 provides virtual line information D1 indicating the virtual line VL to the drawing point setting unit 162.

   (x, y, z) = (x2 + Vx·t, y2 + Vy·t, z2 + Vz·t)   …(3)
 The drawing point setting unit 162 sets, as the drawing point P, the intersection of the virtual line VL indicated by the virtual line information D1 and the virtual plane S set in the virtual space. When the normal vector of the virtual plane S in the virtual coordinate system α is (Nx, Ny, Nz), the plane equation representing the virtual plane S is expressed by the following equation (4). The drawing point setting unit 162 therefore obtains the intersection of the virtual line VL and the virtual plane S by substituting equation (3), which represents the virtual line VL, into equation (4), which represents the virtual plane S. For example, substituting equation (3) into equation (4) yields equation (5). Solving equation (5) for t yields equation (6). Then, by substituting equation (6) into equation (3), the intersection of the virtual plane S and the virtual line VL can be obtained.

   Nx·(x − x1) + Ny·(y − y1) + Nz·(z − z1) = 0   …(4)

   Nx·(x2 + Vx·t − x1) + Ny·(y2 + Vy·t − y1) + Nz·(z2 + Vz·t − z1) = 0   …(5)

   t = {Nx·(x1 − x2) + Ny·(y1 − y2) + Nz·(z1 − z2)} / (Nx·Vx + Ny·Vy + Nz·Vz)   …(6)
 The drawing point setting unit 162 sets the intersection of the virtual plane S and the virtual line VL as the drawing point P. The drawing point setting unit 162 sets the position of the drawing point P, for example, as a coordinate value in the screen coordinate system β. The drawing point setting unit 162 provides coordinate information D2 including the coordinate value of the position of the drawing point P to the adjustment unit 17 and the writing information acquisition unit 163. The coordinate information D2 may include the coordinate value of the position of the pen tip 204 in addition to the coordinate value of the position of the drawing point P.
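 The intersection described by equations (3) to (6) can be computed, for example, as in the following sketch, assuming the pen tip position, the direction vector V, and the plane parameters are given as NumPy arrays (the function and parameter names are illustrative).

    import numpy as np

    def drawing_point(pen_tip, direction, plane_origin, plane_normal):
        # pen_tip      : (x2, y2, z2), position of the pen tip 204 in the virtual coordinate system alpha
        # direction    : (Vx, Vy, Vz), direction vector V along the axial direction L
        # plane_origin : (x1, y1, z1), origin of the screen coordinate system beta on the virtual plane S
        # plane_normal : (Nx, Ny, Nz), normal vector of the virtual plane S
        denom = float(np.dot(plane_normal, direction))
        if abs(denom) < 1e-9:
            return None                      # virtual line VL parallel to the plane: no drawing point
        t = float(np.dot(plane_normal, plane_origin - pen_tip)) / denom   # equation (6)
        return pen_tip + t * direction        # substitute t into equation (3) to obtain P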
 Reference is made to FIG. 9 again. The adjustment unit 17 refers to the coordinate information D2 and determines whether the positions of the virtual plane S and the writing instrument 20 in the virtual coordinate system α need to be adjusted. When the adjustment unit 17 determines that the positions of the virtual plane S and the writing instrument 20 need to be adjusted, it performs an adjustment process that adjusts the positions of the virtual plane S and the writing instrument 20. The adjustment process may include, for example, a process similar to the feedback process by the feedback unit 154 described above. The adjustment unit 17 may perform the adjustment process together with the feedback process described above, or may perform the adjustment process when the feedback process is not performed. Alternatively, the adjustment unit 17 need not perform the adjustment process when the feedback process is performed.
 As shown in FIG. 8(a), the adjustment unit 17 first sets the first region R1 and the second region R2 on the virtual plane S in the same manner as in the feedback process. The adjustment unit 17 then determines whether the drawing point P is located within the second region R2 of the virtual plane S. That is, the adjustment unit 17 determines whether the drawing point P falls within the virtual plane S. When the adjustment unit 17 determines that the drawing point P is not located within the second region R2, it determines that the positions of the virtual plane S and the writing instrument 20 need to be adjusted. In this case, the adjustment unit 17 executes a calibration process that returns the position of the virtual plane S with respect to the writing instrument 20 to a reference position. This calibration process is different from the calibration process by the adjustment unit 142 described above.
 FIGS. 11(a), 11(b), 12(a), and 12(b) are diagrams showing an example of the calibration process by the adjustment unit 17. In the state shown in FIG. 11(a), the pen tip 204 of the writing instrument 20 faces the virtual plane S, and the drawing point P is located within the virtual plane S. When this state changes to the state shown in FIG. 11(b), the direction of the pen tip 204 deviates from the virtual plane S, so no drawing point P is set on the virtual plane S. In such a case, the adjustment unit 17 executes calibration that returns the position of the virtual plane S with respect to the writing instrument 20 to the reference position so that the drawing point P is set within the virtual plane S.
 The state in which the position of the virtual plane S with respect to the writing instrument 20 is at the reference position may be, for example, a state in which the virtual plane S and the writing instrument 20 are arranged such that the pen tip 204 points toward the center Sc of the virtual plane S. The pen tip 204 pointing toward the center Sc of the virtual plane S may mean that the virtual line VL extending from the pen tip 204 intersects the center Sc of the virtual plane S (that is, the intersection of the virtual line VL and the virtual plane S is located at the center Sc). As long as the pen tip 204 points toward the center Sc of the virtual plane S, the axial direction L of the writing instrument 20 may, for example, be inclined with respect to the normal direction of the virtual plane S or may lie along the normal direction of the virtual plane S. The state in which the virtual plane S and the writing instrument 20 are at the reference position is not limited to a state in which the pen tip 204 points toward the center Sc of the virtual plane S, and may be a state in which the pen tip 204 points toward a position shifted from the center Sc of the virtual plane S. That is, as long as the virtual line VL extending from the pen tip 204 intersects the virtual plane S, the intersection of the virtual line VL and the virtual plane S may be shifted from the center Sc of the virtual plane S.
 When executing the calibration process, the adjustment unit 17 moves at least one of the virtual plane S and the writing instrument 20 in the virtual coordinate system α so that the positions of the virtual plane S and the writing instrument 20 become the reference positions. At this time, the adjustment unit 17 may move only the virtual plane S without moving the writing instrument 20 in the virtual coordinate system α, may move only the writing instrument 20 without moving the virtual plane S, or may move both the virtual plane S and the writing instrument 20.
 In the example shown in FIGS. 12(a) and 12(b), the adjustment unit 17 rotates only the virtual plane S with respect to the writing instrument 20 in the virtual coordinate system α. At this time, as the virtual plane S rotates, the position of the screen coordinate system β, which defines the virtual plane S in the virtual coordinate system α, also rotates. As a result, the virtual plane S and the screen coordinate system β move to the positions shown in FIG. 12(b). In the state shown in FIG. 12(b), the positions of the writing instrument 20 and the virtual plane S have been adjusted so that the pen tip 204 points toward the center Sc of the virtual plane S, so the virtual line VL extending from the pen tip 204 intersects the center Sc of the virtual plane S. This completes the calibration process that returns the positions of the virtual plane S and the writing instrument 20 in the virtual coordinate system α to the reference positions.
 When the adjustment unit 17 determines that the drawing point P is within the second region R2, it determines whether the drawing point P is within the first region R1. When the adjustment unit 17 determines that the drawing point P is not within the first region R1, it performs an adjustment process that adjusts the amount of change in the estimated value D20 with respect to a reference value, using a predetermined position and orientation of the writing instrument 20 with respect to the virtual plane S as the reference value. This adjustment process may be the same as the adjustment process by the adjustment unit 142 described above. In this case, the adjustment unit 17 makes the disturbance function A applied to the writing instrument model D13 during execution of the simulation smaller than the disturbance function A calculated by the disturbance calculation unit 157.
 As a result, the amounts of change in the position and orientation of the writing instrument model D13 caused by executing the simulation can each be made smaller than the amounts of change in the position and orientation of the actual writing instrument 20. Consequently, the movement of the estimated value D20 in the virtual space indicated by the simulation result becomes smaller. This prevents the estimated value D20 from deviating even further from the reference value. On the other hand, when the adjustment unit 17 determines that the drawing point P is within the first region R1, it repeats the above determination for the drawing point P obtained from the next estimated value D20.
 FIG. 13 is a diagram showing an example of the configuration of the writing information acquisition unit 163. As shown in FIG. 13, the writing information acquisition unit 163 includes, for example, a trajectory information acquisition unit 164 and a writing information extraction unit 165. The trajectory information acquisition unit 164 includes, for example, a coordinate string recording unit 166 and a distance calculation unit 167. Upon receiving the coordinate information D2 from the drawing point setting unit 162, the coordinate string recording unit 166 records the coordinate values of the drawing point P in time series and also records the coordinate values of the pen tip 204 in time series. The coordinate string recording unit 166 records the coordinate values of the drawing point P and the coordinate values of the pen tip 204 in association with each other in time series. The coordinate string recording unit 166 provides the distance calculation unit 167 and the writing information extraction unit 165 with drawing point trajectory information D31, in which the coordinate values of the drawing point P are recorded in time series, and pen tip trajectory information D32, in which the coordinate values of the pen tip 204 are recorded in time series. The drawing point trajectory information D31 may be information indicating the two-dimensional trajectory of the coordinate values of the drawing point P on the virtual plane S. The pen tip trajectory information D32 may be information indicating the three-dimensional trajectory of the coordinate values of the pen tip 204 in the virtual space.
 The distance calculation unit 167 calculates the distance between the drawing point P and the pen tip 204 using the drawing point trajectory information D31 and the pen tip trajectory information D32. For example, the distance calculation unit 167 calculates, from the drawing point trajectory information D31 and the pen tip trajectory information D32, the distance between the coordinate value of the drawing point P at a certain time and the coordinate value of the pen tip 204 at that time (that is, the coordinate value of the pen tip 204 associated with the coordinate value of the drawing point P). The distance calculation unit 167 records the calculated distances in time series and associates them in time series with the coordinate values of the drawing point P and the coordinate values of the pen tip 204. The distance calculation unit 167 provides the writing information extraction unit 165 with distance information D33 indicating the distance between the coordinate value of the drawing point P and the coordinate value of the pen tip 204. As a result, the trajectory information acquisition unit 164 provides the writing information extraction unit 165 with information including the drawing point trajectory information D31, the pen tip trajectory information D32, and the distance information D33 as trajectory information D3 indicating the trajectory of the writing instrument 20.
 The writing information extraction unit 165 extracts, from the drawing point trajectory information D31 while referring to the distance information D33, writing information D4 indicating the trajectory of the drawing points P that the user U intended to write on the virtual plane S. The trajectory of the drawing points P indicated by the writing information D4 is expressed, for example, as characters written on the virtual plane S using the writing instrument 20. The writing information D4 may therefore be, for example, character information written by the user U. The trajectory of the drawing points P indicated by the drawing point trajectory information D31 is connected as if drawn in a single stroke. Therefore, when the writing information D4 includes character information, for example, it is necessary to extract the characters written on the virtual plane S by distinguishing, in the drawing point trajectory information D31, the trajectory of the drawing points P that the user U intended to write on the virtual plane S from the trajectory of the drawing points P that the user U did not intend to write on the virtual plane S.
 The trajectory of the drawing points P that the user U intended to write on the virtual plane S may be, for example, the trajectory of the coordinate values of the drawing points P excluding the breaks between character strokes when the writing information D4 includes character information. A character stroke may be a motion of drawing a line constituting a character on the virtual plane S. A break between character strokes may be the portion between any line constituting a character and the next line. When the user U does not intend to write on the virtual plane S (that is, when the user U performs a motion corresponding to a break between character strokes), the user U moves the writing instrument 20 away from the virtual plane S compared with when the user U intends to write on the virtual plane S (that is, when the user U performs a character stroke motion).
 As a result, when the user U moves the writing instrument 20 without intending to write on the virtual plane S, the distance between the virtual plane S and the writing instrument 20 becomes larger than when the user U moves the writing instrument 20 with the intention of writing on the virtual plane S. The writing information extraction unit 165 therefore refers to changes in the distance between the virtual plane S and the writing instrument 20 and extracts, from the drawing point trajectory information D31, the trajectory of the drawing points P that the user U intended to write on the virtual plane S as the writing information D4. For example, the writing information extraction unit 165 determines whether the change in the distance between the drawing point P and the pen tip 204 is less than a preset threshold. When the writing information extraction unit 165 determines that the change in the distance between the drawing point P and the pen tip 204 is less than the threshold, it extracts the trajectory of the drawing points P associated with that change in distance as the writing information D4 that the user U intended to write on the virtual plane S.
 On the other hand, when the writing information extraction unit 165 determines that the change in the distance between the drawing point P and the pen tip 204 is not less than the threshold, it determines that the trajectory of the drawing points P associated with that change in distance is not writing information D4 that the user U intended to write on the virtual plane S. This threshold can be obtained from the threshold information stored in the database 30. The threshold may be set in advance by, for example, the information input system 1, the provider of the information input system 1, or the user U. The threshold may be set, for example, based on past statistical values indicating changes in the distance between the drawing point P and the pen tip 204 for the same user U, or may be set arbitrarily by the user U. The threshold for the change in the distance between the drawing point P and the pen tip 204 for the same user U may also be set in advance using artificial intelligence (AI) technology.
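 A sketch of this distance-based extraction, assuming the drawing points and the associated pen-tip distances are recorded as time-ordered lists (the threshold handling follows the description above; the function and parameter names are illustrative):

    def extract_writing_points(points, distances, threshold):
        # points    : drawing points P in time series (from the trajectory information D31)
        # distances : distance between P and the pen tip 204 at each time step (from D33)
        writing = []
        for i in range(1, len(points)):
            delta_d = abs(distances[i] - distances[i - 1])
            if delta_d < threshold:
                writing.append(points[i])   # intended stroke (trajectory T1)
            # otherwise the segment is treated as an in-air movement (trajectory T2)
        return writing                       # candidate writing information D4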
 FIG. 14 is a diagram showing an example of how the writing information D4 is extracted. As shown in FIG. 14, when the user U writes a line constituting a character and then moves to write the next line, the distance between the virtual plane S and the writing instrument 20 in the Sz-axis direction increases. In FIG. 14, trajectories T1 and T2 indicate the trajectory of the pen tip 204 on the assumption that the pen tip 204 is on the virtual plane S during writing. The trajectory T1 shown by the solid line indicates the trajectory of the pen tip 204 that the user U intended to write on the virtual plane S. On the other hand, the trajectory T2 shown by the broken line indicates the trajectory of the pen tip 204 in the air, which the user U did not intend as writing on the virtual plane S.
 As shown in FIG. 14, the distance between the drawing point P and the pen tip 204 when the user U performs a motion not intended as writing on the virtual plane S (that is, a motion corresponding to the trajectory T2) is larger than the distance between the drawing point P and the pen tip 204 when the user U performs a motion intended as writing on the virtual plane S (that is, a motion corresponding to the trajectory T1). The writing information extraction unit 165 therefore extracts the coordinate values of the drawing points P indicated by the trajectory T1 from the drawing point trajectory information D31 as the writing information D4 by determining whether the change Δd in the distance between the drawing point P and the pen tip 204 is less than the preset threshold.
 Instead of extracting the writing information D4 based on the change Δd in the distance between the drawing point P and the pen tip 204, the writing information extraction unit 165 may extract the writing information D4 based on, for example, changes in the moving speed of the pen tip 204. Alternatively, the writing information extraction unit 165 may extract the writing information D4 based on both the change Δd in the distance between the drawing point P and the pen tip 204 and changes in the moving speed of the pen tip 204.
 The writing information extraction unit 165 provides the extracted writing information D4 to the output unit 19. The output unit 19 outputs the writing information D4 to the display unit 13. The display unit 13 displays the writing information D4 in real time on the virtual plane S arranged in the virtual space. This allows the user U to see the writing information D4 displayed on the virtual plane S. The output unit 19 may output the writing information D4 not only to the display unit 13 but also to another display device. For example, the output unit 19 may output the writing information D4 to the display screen of a personal computer, a cloud server, or a smart device (for example, a smartphone or a tablet terminal).
 Reference is made to FIG. 13 again. When the writing motion by the user U is completed, the appraisal unit 18 receives the trajectory information D3 recorded in the trajectory information acquisition unit 164 and appraises the writing information D4. The appraisal unit 18 may determine that the writing motion by the user U has ended when it receives a signal indicating that the user U has finished writing, or may determine that the writing motion has ended when a predetermined time has elapsed after the writing motion stopped. The appraisal unit 18 includes, for example, a non-writing information extraction unit 181 and a comparison unit 182. The non-writing information extraction unit 181 extracts, from the pen tip trajectory information D32, the trajectory of the pen tip 204 that the user U did not intend as writing on the virtual plane S as non-writing information D5. The non-writing information D5 can be extracted in the same manner as the writing information D4.
 That is, when the non-writing information extraction unit 181 determines that the change Δd in the distance between the drawing point P and the pen tip 204 is not less than the preset threshold, it extracts the trajectory of the pen tip 204 associated with that change in distance as the non-writing information D5, which the user U did not intend as writing on the virtual plane S. On the other hand, when the non-writing information extraction unit 181 determines that the change Δd in the distance between the drawing point P and the pen tip 204 is less than the threshold, it determines that the trajectory of the pen tip 204 associated with that change in distance is not non-writing information D5.
 The comparison unit 182 receives the non-writing information D5 from the non-writing information extraction unit 181 and acquires information including the writing information D4 and the non-writing information D5 as current handwriting information D45 indicating the current handwriting of the user U (see FIG. 15(a) described later). The comparison unit 182 may extract the writing information D4 from the trajectory information D3, or may receive the writing information D4 from the writing information extraction unit 165. The comparison unit 182 further acquires past handwriting information D12 from the database 30. The past handwriting information D12 is information indicating the past handwriting of the user U and corresponds to the current handwriting information D45. The comparison unit 182 compares the current handwriting information D45 with the past handwriting information D12. For example, the comparison unit 182 calculates the degree of coincidence between the current handwriting information D45 and the past handwriting information D12.
 The degree of coincidence may be an index indicating how similar the current handwriting information D45 and the past handwriting information D12 are. If the coordinate values indicated by the current handwriting information D45 and the coordinate values indicated by the past handwriting information D12 are similar, the degree of coincidence between the current handwriting information D45 and the past handwriting information D12 is high; otherwise, the degree of coincidence is low. The comparison unit 182 may, for example, compare the coordinate values indicated by the current handwriting information D45 with the coordinate values indicated by the past handwriting information D12 in time series and calculate the degree of deviation of the coordinate values at each time step as the degree of coincidence. In this case, the comparison unit 182 may calculate a statistic (for example, the average value or the median value) of the degrees of deviation of the coordinate values over the time series as the degree of coincidence.
 The comparison unit 182 determines whether the calculated degree of coincidence is equal to or greater than a preset threshold. When the comparison unit 182 determines that the degree of coincidence is equal to or greater than the threshold, it determines that the current handwriting information D45 and the past handwriting information D12 are the same and that the user U who provided the current handwriting information D45 and the user U who provided the past handwriting information D12 are the same person. On the other hand, when the comparison unit 182 determines that the degree of coincidence is not equal to or greater than the threshold, it determines that the current handwriting information D45 and the past handwriting information D12 are not the same and that the user U who provided the current handwriting information D45 and the user U who provided the past handwriting information D12 are different persons.
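 As one possible form of this comparison (the specification leaves the exact index open; the mapping from the per-time-step deviation to a degree of coincidence below is an assumption, as are the function names), the time-aligned coordinate sequences could be compared as follows.

    import numpy as np

    def handwriting_coincidence(current, past):
        # current, past: time-aligned (N, 2) coordinate sequences containing both the
        # writing trajectory and the non-writing trajectory (D45 and D12).
        deviation = np.linalg.norm(current - past, axis=1)    # per-time-step offset
        return 1.0 / (1.0 + float(np.mean(deviation)))        # larger offset -> lower degree

    def same_person(current, past, threshold):
        # True if the degree of coincidence is equal to or greater than the threshold.
        return handwriting_coincidence(current, past) >= threshold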
 This threshold can be obtained from the threshold information D11 stored in the database 30. The threshold may be set in advance by, for example, the information input system 1, the provider of the information input system 1, or the user U. The threshold may be set, for example, based on past statistical values indicating the degree of coincidence between the current handwriting information D45 and the past handwriting information D12, or may be set arbitrarily by the user U. The threshold for the degree of coincidence between the current handwriting information D45 and the past handwriting information D12 may also be set in advance using artificial intelligence (AI) technology.
 FIG. 15(a) is a diagram showing an example of the current handwriting information D45. FIG. 15(b) is a diagram showing an example of the past handwriting information D12. In the example shown in FIGS. 15(a) and 15(b), the writing information D4 of the current handwriting information D45 is similar to the past writing information D14, which corresponds to the writing information D4, in the past handwriting information D12. However, the non-writing information D5 of the current handwriting information D45 differs greatly from the past non-writing information D15, which corresponds to the non-writing information D5, in the past handwriting information D12. In this case, the comparison unit 182 compares the current handwriting information D45, which includes the writing information D4 and the non-writing information D5, with the past handwriting information D12, which includes the past writing information D14 and the past non-writing information D15, as a whole.
 As a result, the comparison unit 182 determines that the degree of coincidence between the current handwriting information D45 and the past handwriting information D12 is not equal to or greater than the threshold, and determines that the current handwriting information D45 and the past handwriting information D12 are not the same. In this case, the comparison unit 182 determines that the user U who provided the current handwriting information D45 and the user U who provided the past handwriting information D12 are different persons. In this way, the comparison unit 182 does not simply compare the writing information D4 with the past writing information D14; it appraises the writing information D4 including a comparison between the non-writing information D5, which the user U did not intend as writing, and the past non-writing information D15. Therefore, in the present embodiment, as shown in FIGS. 15(a) and 15(b), even when the writing information D4 and the past writing information D14 are similar, it can be determined from the difference between the non-writing information D5 and the past non-writing information D15 that the user U is not the person in question (that is, that the user U is impersonating that person).
 The comparison unit 182 may extract the past written information D14 and the past non-written information D15 from the past handwriting information D12, and may separately calculate the degree of matching between the written information D4 and the past written information D14 and the degree of matching between the non-written information D5 and the past non-written information D15. In this case, the comparison unit 182 may determine whether each degree of matching is equal to or greater than a threshold. By determining whether the degree of matching between the current handwriting information D45 and the past handwriting information D12 is equal to or greater than the threshold, the comparison unit 182 appraises whether the user U who provided the written information D4 is the genuine user. The comparison unit 182 then outputs, to the output unit 19, an appraisal result D6 indicating whether the user U who provided the written information D4 is the genuine user. The output unit 19 may output the appraisal result D6 to the display unit 13, or may output the appraisal result D6 to a notification unit such as a speaker.
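 The composite appraisal described above can be sketched as follows. This is only an illustration of the idea of combining a written-stroke score and a non-written-stroke score against a threshold; the function names, the weighted average, and the toy `match_score` similarity are assumptions, not the comparison method actually used by the comparison unit 182.

```python
def appraise(current, past, threshold, written_weight=0.5):
    """Appraise whether the current handwriting matches the past handwriting.

    `current` and `past` are dicts with "written" and "non_written" stroke
    sequences (lists of (x, y) points).  `match_score` is a stand-in for any
    trajectory similarity measure returning a value in [0, 1].
    """
    written_score = match_score(current["written"], past["written"])
    non_written_score = match_score(current["non_written"], past["non_written"])
    # Combine both scores so that a forged signature with unnatural pen
    # movement between strokes is still rejected.
    overall = written_weight * written_score + (1.0 - written_weight) * non_written_score
    return overall >= threshold  # True -> same person, False -> possible impersonation

def match_score(a, b):
    """Toy similarity: 1 / (1 + mean point-wise distance) over the shorter track."""
    n = min(len(a), len(b))
    if n == 0:
        return 0.0
    d = sum(((ax - bx) ** 2 + (ay - by) ** 2) ** 0.5
            for (ax, ay), (bx, by) in zip(a[:n], b[:n])) / n
    return 1.0 / (1.0 + d)

current = {"written": [(0, 0), (1, 1), (2, 1)], "non_written": [(2, 1), (3, 4)]}
past    = {"written": [(0, 0), (1, 1), (2, 1)], "non_written": [(2, 1), (2, 2)]}
print(appraise(current, past, threshold=0.6))
```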
[Operation of the information input system]
 Next, the operation of the information input system 1 according to this embodiment will be described. FIG. 16 is a flowchart illustrating an example of the processing of the information input method performed by the information input system 1.
 First, an overview of the processing performed by the information input system 1 will be described with reference to FIG. 16. The processing by the information input system 1 includes an observation process (step S1) of acquiring the observed value D10 from the motion sensor 208, an estimation process (step S2) of estimating the estimated value D20, a recognition process (step S3) of recognizing the trajectory of the writing instrument 20, and an output process (step S4) of outputting the written information D4. Steps S1 to S4 are, for example, executed repeatedly at predetermined intervals.
 In step S1, the motion sensor 208 of the writing instrument 20 acquires the observed values D10 of acceleration and angular velocity in accordance with the movement of the writing instrument 20, and the acquisition unit 12 of the terminal 10 then acquires the observed value D10 from the motion sensor 208. In step S2, the estimation unit 15 estimates the estimated value D20 of the position and orientation of the writing instrument 20 based on the observed value D10. In step S3, the recognition unit 16 sets a drawing point P on the virtual surface S based on the estimated value D20, and then acquires the written information D4 for the virtual surface S based on the trajectory of the drawing point P. In step S4, the output unit 19 outputs the written information D4 to the display unit 13. By repeating steps S1 to S4, the user U can view the written information D4 written with the writing instrument 20 in real time via the display unit 13.
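 A minimal sketch of the S1 to S4 cycle is shown below. The `sensor`, `estimator`, `recognizer`, and `display` objects and their methods are assumed interfaces standing in for the motion sensor 208, the estimation unit 15, the recognition unit 16, and the display unit 13; they are not defined by the disclosure.

```python
import time

def input_loop(sensor, estimator, recognizer, display, steps=1000, period_s=0.01):
    """One possible realization of the S1-S4 cycle of FIG. 16.

    Assumed interfaces (not defined by the disclosure):
      sensor.read()            -> (accel, gyro)            ... step S1 (D10)
      estimator.update(obs)    -> (position, orientation)  ... step S2 (D20)
      recognizer.update(pose)  -> stroke points or None    ... step S3 (D4)
      display.draw(stroke)                                  ... step S4
    """
    for _ in range(steps):
        obs = sensor.read()               # S1: acquire observed value D10
        pose = estimator.update(obs)      # S2: estimate position/orientation D20
        stroke = recognizer.update(pose)  # S3: drawing point -> written info D4
        if stroke is not None:
            display.draw(stroke)          # S4: show the handwriting in real time
        time.sleep(period_s)              # repeat at a predetermined interval
```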
 FIG. 17 is a flowchart illustrating an example of the estimation process performed by the estimation unit 15. First, the model acquisition unit 155 acquires, from the database 30, the writing instrument model D13 that models the writing instrument 20 (step S21). Next, the initialization processing unit 156 initializes the state (for example, the position and orientation) of the writing instrument model D13 (step S22). Meanwhile, when the acquisition unit 12 acquires the observed value D10, the disturbance calculation unit 157 calculates the disturbance to be applied to the writing instrument model D13 (step S23). The disturbance may be the force and torque applied to the writing instrument 20 by the user during operation of the writing instrument 20. Next, the analysis execution unit 158 sets the disturbance condition D22 as an analysis condition for executing a simulation of the movement of the writing instrument 20 (step S24).
 Next, the analysis execution unit 158 executes a simulation of the movement of the writing instrument 20 using the writing instrument model D13 (step S25). The analysis execution unit 158 causes the writing instrument model D13 to move in the simulation space by applying the disturbance condition D22 to the writing instrument model D13, whereby the position and orientation of the writing instrument model D13 in the simulation space change. Next, the estimated value deriving unit 153 derives the estimated value D20 of the current position and orientation of the writing instrument 20 from the result of the simulation (step S26). For example, the estimated value deriving unit 153 derives the estimated value D20 by associating the position and orientation of the writing instrument model D13 in the simulation space with the position and orientation of the writing instrument 20 in the virtual space. Through steps S21 to S26, the estimated value D20 is estimated based on the observed value D10.
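 The simulation-based estimation of steps S23 to S26 can be illustrated with a deliberately simple rigid-body update. The point-mass model, the first-order integration, and the gravity handling below are assumptions made for illustration only; the actual analysis execution unit 158 may use any physics solver, and this sketch is not the patented method.

```python
import numpy as np

def estimate_pose(obs_accel, obs_gyro, state, mass, dt):
    """Advance a crude rigid-body pen model by one simulation step.

    state: dict with "pos", "vel" (3-vectors) and "rot" (3x3 rotation matrix,
    sensor frame -> world frame).  obs_accel / obs_gyro are raw motion-sensor
    readings in the sensor frame.  The "disturbance" is the external force
    recovered from the observed acceleration and the pen's mass.
    """
    g = np.array([0.0, 0.0, -9.81])

    # Disturbance (force) acting on the model: F = m * a_world (gravity removed).
    accel_world = state["rot"] @ np.asarray(obs_accel, dtype=float) + g
    force = mass * accel_world

    # Apply the disturbance to the model and integrate the translation.
    state["vel"] = state["vel"] + (force / mass) * dt
    state["pos"] = state["pos"] + state["vel"] * dt

    # First-order orientation update from body-frame angular velocity
    # (no re-orthonormalization; a real solver would do better).
    wx, wy, wz = obs_gyro
    omega = np.array([[0.0, -wz,  wy],
                      [ wz, 0.0, -wx],
                      [-wy,  wx, 0.0]])
    state["rot"] = state["rot"] + state["rot"] @ omega * dt
    return state["pos"], state["rot"]

# A pen at rest: the accelerometer reads +g in the sensor frame, so the
# recovered disturbance is zero and the pose does not move.
state = {"pos": np.zeros(3), "vel": np.zeros(3), "rot": np.eye(3)}
print(estimate_pose([0.0, 0.0, 9.81], [0.0, 0.0, 0.0], state, mass=0.02, dt=0.01)[0])
```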
 FIG. 18 is a flowchart illustrating an example of the feedback process performed by the feedback unit 154. When the estimation unit 15 estimates the estimated value D20, the calculation unit 143 calculates the amount of change of the estimated value D20 from a reference value, using a predetermined position and orientation of the writing instrument 20 with respect to the virtual surface S as the reference value (step S31). The amount of change of the estimated value D20 from the reference value can be expressed, for example, by taking the inner product of the unit direction vector V1 in which the pen tip 204 of the writing instrument 20A in the reference state points and the unit direction vector V2 in which the pen tip 204 of the writing instrument 20 currently points. By taking the inner product of the unit direction vector V1 and the unit direction vector V2, the angle θ formed by the two vectors can be calculated. The calculation unit 143 therefore calculates the angle θ formed by the unit direction vector V1 and the unit direction vector V2 as an index indicating the amount of change of the estimated value D20 from the reference value.
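 A sketch of the inner-product computation of the angle θ in step S31 follows; the function name and the example vectors are illustrative.

```python
import math

def change_angle(v1, v2):
    """Angle (radians) between the reference direction V1 and the current
    pen-tip direction V2, computed from their inner product."""
    dot = sum(a * b for a, b in zip(v1, v2))
    norm1 = math.sqrt(sum(a * a for a in v1))
    norm2 = math.sqrt(sum(b * b for b in v2))
    # Clamp to [-1, 1] to avoid domain errors from floating-point noise.
    cos_theta = max(-1.0, min(1.0, dot / (norm1 * norm2)))
    return math.acos(cos_theta)

# Example: a pen tilted 30 degrees away from the reference direction.
print(math.degrees(change_angle((0.0, 0.0, -1.0),
                                (0.5, 0.0, -math.sqrt(3) / 2))))
```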
 Next, the determination unit 144 determines whether the angle θ calculated by the calculation unit 143 (that is, the amount of change of the estimated value D20) is within the second tolerance range (step S32). When the determination unit 144 determines that the angle θ is not within the second tolerance range (step S32: No), the adjustment unit 142 executes a calibration that returns the estimated value D20 to the reference value (step S33). For example, the adjustment unit 142 calculates the disturbance required to move the unit direction vector V2 of the writing instrument 20 indicated by the estimated value D20 to the unit direction vector V1 of the writing instrument 20A indicated by the reference value. The adjustment unit 142 changes the position and orientation of the writing instrument model D13 in the simulation space by applying the calculated disturbance to the writing instrument model D13. The adjustment unit 142 then reflects the position and orientation of the writing instrument model D13 after the simulation in the current position and orientation of the writing instrument 20 in the virtual space, thereby returning the estimated value D20 of the current position and orientation of the writing instrument 20 to the reference value.
 On the other hand, when the determination unit 144 determines that the angle θ is within the second tolerance range, it determines whether the angle θ is within the first tolerance range (step S34). When the determination unit 144 determines that the angle θ is not within the first tolerance range (step S34: No), the amount of change of the estimated value D20 with respect to the reference value is adjusted. For example, the adjustment unit 142 makes the function A of the disturbance applied to the writing instrument model D13 during execution of the simulation smaller than the function A of the disturbance calculated by the disturbance calculation unit 157. In this way, the adjustment unit 142 makes the disturbance applied to the writing instrument model D13 smaller than the disturbance applied to the actual writing instrument 20.
 As a result, the movement of the writing instrument model D13 in the simulation space becomes smaller than the actual movement of the writing instrument 20; in other words, the movement of the estimated value D20 indicated by the result of the simulation becomes smaller. When the determination unit 144 determines that the angle θ is within the first tolerance range (step S34: Yes), the process returns to step S31, and steps S31 to S35 are repeated for the angle θ obtained from the next estimated value D20.
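 Steps S32 to S35 can be summarized as the following decision, sketched under the assumption that the tolerance ranges are expressed as angle limits and that the reduction rate grows linearly between the first and second ranges; the linear scaling is an assumption introduced here, the disclosure only states that the reduction rate increases as θ moves away from the first range.

```python
def feedback_step(theta, disturbance, first_range, second_range):
    """Decide how to treat the disturbance applied to the pen model,
    given the angle `theta` between the reference and current directions.

    Returns (adjusted_disturbance, calibrate):
      - outside the second (wider) range   -> calibrate back to the reference
      - outside the first (narrower) range -> shrink the disturbance, more so
        the further theta is from the first range
      - inside the first range             -> use the disturbance as computed
    """
    if theta > second_range:
        return 0.0, True                      # step S33: calibration
    if theta > first_range:
        # Reduction rate grows as theta moves from first_range to second_range.
        excess = (theta - first_range) / (second_range - first_range)
        return disturbance * (1.0 - excess), False   # damped movement
    return disturbance, False                 # step S34: Yes -> no adjustment

print(feedback_step(theta=0.4, disturbance=1.0, first_range=0.2, second_range=0.6))
```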
 FIG. 19 is a flowchart illustrating an example of the drift correction process performed by the feedback unit 154. When the estimation unit 15 estimates the estimated value D20, the determination unit 144 determines whether a drift correction condition is satisfied (step S41). The drift correction condition is satisfied when, with the center of the distribution of the estimated values D20 over a fixed period in which no drift error has accumulated taken as the reference value, the center of the distribution of the current estimated values D20 over a fixed period exceeds a threshold. When the determination unit 144 determines that the drift correction condition is satisfied (step S41: Yes), the adjustment unit 142 executes a calibration that returns the estimated value D20 to the reference value (step S42). On the other hand, when the determination unit 144 determines that the drift correction condition is not satisfied (step S41: No), the process returns to step S41, and steps S41 and S42 are repeated for the next estimated value D20.
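 A sketch of the drift correction check of steps S41 and S42 over a sliding window follows; the window length, the use of a simple mean as the distribution center, and the class name are assumptions made for illustration.

```python
from collections import deque

class DriftMonitor:
    """Check the drift correction condition of FIG. 19 on a sliding window.

    The reference is the window mean recorded when no drift had accumulated;
    correction is triggered when the current window mean drifts away from
    that reference by more than `threshold`.
    """
    def __init__(self, reference_mean, threshold, window=100):
        self.reference_mean = reference_mean
        self.threshold = threshold
        self.window = deque(maxlen=window)

    def update(self, estimate):
        """Add a new estimate; return True if calibration should run (S42)."""
        self.window.append(estimate)
        current_mean = sum(self.window) / len(self.window)
        return abs(current_mean - self.reference_mean) > self.threshold

monitor = DriftMonitor(reference_mean=0.0, threshold=0.05, window=50)
print(monitor.update(0.2))  # a single large sample already shifts the window mean
```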
 FIG. 20 is a flowchart illustrating an example of the recognition process performed by the recognition unit 16. When the estimation unit 15 estimates the estimated value D20, the virtual line setting unit 161 sets, as the virtual line VL, a straight line that passes through the position of the pen tip 204 and extends along the axial direction L of the writing instrument 20 (step S51). The virtual line VL extends linearly from the pen tip 204 in the direction in which the pen tip 204 points and intersects the virtual surface S. Next, the drawing point setting unit 162 calculates the intersection of the virtual line VL and the virtual surface S (step S52). The drawing point setting unit 162 then sets the intersection of the virtual line VL and the virtual surface S as the drawing point P. The position of the drawing point P is set, for example, as a coordinate value in the screen coordinate system β.
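 The intersection of the virtual line VL with the virtual surface S in steps S51 and S52 is an ordinary line-plane intersection; a sketch is given below, assuming the surface S is planar and represented by a point and a normal vector (an assumption for illustration).

```python
import numpy as np

def drawing_point(pen_tip, direction, plane_point, plane_normal):
    """Intersection of the virtual line VL with the virtual surface S.

    pen_tip:      position of the pen tip 204 (3-vector)
    direction:    unit vector along the axial direction L of the pen
    plane_point:  any point on the virtual surface S
    plane_normal: normal vector of the virtual surface S
    Returns the drawing point P, or None if the line is parallel to S.
    """
    pen_tip, direction = np.asarray(pen_tip, float), np.asarray(direction, float)
    plane_point, plane_normal = np.asarray(plane_point, float), np.asarray(plane_normal, float)

    denom = np.dot(plane_normal, direction)
    if abs(denom) < 1e-9:          # line VL parallel to the surface: no intersection
        return None
    t = np.dot(plane_normal, plane_point - pen_tip) / denom
    return pen_tip + t * direction

# Pen tip 10 cm in front of a surface at z = 0, pointing straight at it.
print(drawing_point([0.0, 0.0, 0.1], [0.0, 0.0, -1.0], [0.0, 0.0, 0.0], [0.0, 0.0, 1.0]))
```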
 Next, the trajectory information acquisition unit 164 acquires the trajectory information D3 indicating the trajectory of the writing instrument 20 (step S54). The trajectory information D3 includes, for example, drawing point trajectory information D31 in which the coordinate values of the drawing point P are recorded in time series, pen tip trajectory information D32 in which the coordinate values of the pen tip 204 are recorded in time series, and distance information D33 indicating the distance between the coordinate value of the drawing point P and the coordinate value of the pen tip 204. Next, referring to the distance information D33, the writing information extraction unit 165 extracts, from the drawing point trajectory information D31, the written information D4 indicating the trajectory of the drawing point P that the user U intended as writing on the virtual surface S (step S55). For example, the writing information extraction unit 165 determines whether the change in the distance between the drawing point P and the pen tip 204 indicated by the distance information D33 is less than a preset threshold. When the writing information extraction unit 165 determines that the change in the distance between the drawing point P and the pen tip 204 is less than the threshold, it extracts the trajectory of the drawing point P associated with that change in distance as the written information D4.
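 The extraction of the written information D4 from the drawing-point trajectory in step S55 can be sketched as a segmentation by the change in the pen-tip-to-drawing-point distance; the data layout and the sample values below are illustrative assumptions.

```python
def extract_written_segments(drawing_points, distances, delta_threshold):
    """Split the drawing-point trajectory D31 into written segments (D4).

    drawing_points: list of (x, y) drawing points P, in time order
    distances:      distance between P and the pen tip 204 at each sample (D33)
    A sample is treated as writing when the distance change from the previous
    sample stays below `delta_threshold`; larger jumps are treated as movement
    the user did not intend as writing.
    """
    segments, current = [], []
    for i in range(1, len(drawing_points)):
        if abs(distances[i] - distances[i - 1]) < delta_threshold:
            current.append(drawing_points[i])
        elif current:
            segments.append(current)
            current = []
    if current:
        segments.append(current)
    return segments

points = [(0, 0), (1, 0), (2, 0), (3, 5), (4, 5)]
dists = [0.10, 0.11, 0.10, 0.60, 0.61]
print(extract_written_segments(points, dists, delta_threshold=0.05))
```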
 FIG. 21 is a flowchart illustrating an example of the adjustment process performed by the adjustment unit 17. When the drawing point setting unit 162 sets the drawing point P, the adjustment unit 17 determines whether the drawing point P is located within the second region R2 of the virtual surface S (step S61). When the adjustment unit 17 determines that the drawing point P is not located within the second region R2 (step S61: No), it executes a calibration process that returns the position of the virtual surface S with respect to the writing instrument 20 to the reference position (step S62). On the other hand, when the adjustment unit 17 determines that the drawing point P is within the second region R2 (step S61: Yes), it determines whether the drawing point P is within the first region R1 (step S63). When it is determined that the drawing point P is not within the first region R1 (step S63: No), the amount of change of the estimated value D20 with respect to a reference value is adjusted, using a predetermined position and orientation of the writing instrument 20 with respect to the virtual surface S as the reference value (step S64). When the adjustment unit 17 determines that the drawing point P is within the first region R1 (step S63: Yes), the process returns to step S61, and steps S61 to S64 are repeated for the drawing point P obtained from the next estimated value D20.
 FIG. 22 is a flowchart illustrating an example of the appraisal process performed by the appraisal unit 18. First, the appraisal unit 18 determines whether the writing operation by the user U has ended (step S71). When the appraisal unit 18 determines that the writing operation by the user U has not ended (step S71: No), it repeats step S71 until it determines that the writing operation has ended. On the other hand, when the appraisal unit 18 determines that the writing operation by the user U has ended (step S71: Yes), it extracts the non-written information D5 from the trajectory information D3. For example, based on the change Δd in the distance between the drawing point P and the pen tip 204, the non-written information extraction unit 181 extracts, as the non-written information D5, the trajectory of the pen tip 204 that the user U did not intend as writing on the virtual surface S (step S72).
 Next, the comparison unit 182 acquires the information including the written information D4 and the non-written information D5 as the current handwriting information D45 indicating the current handwriting of the user U, and calculates the degree of matching between the current handwriting information D45 and the past handwriting information D12 (step S73). Next, the comparison unit 182 appraises the written information D4 based on the degree of matching between the current handwriting information D45 and the past handwriting information D12 (step S74). For example, the comparison unit 182 determines whether the degree of matching is equal to or greater than a preset threshold. When the comparison unit 182 determines that the degree of matching is equal to or greater than the threshold, it judges that the user U who provided the current handwriting information D45 and the user U who provided the past handwriting information D12 are the same person, that is, that the user U who provided the written information D4 is the genuine user. On the other hand, when the comparison unit 182 determines that the degree of matching is not equal to or greater than the threshold, it judges that the user U who provided the current handwriting information D45 and the user U who provided the past handwriting information D12 are different persons, that is, that the user U who provided the written information D4 is an impersonator.
[Effects]
 The effects of the information input system 1, the information input method, and the information input program according to this embodiment described above will now be described. In this embodiment, the observed value D10 relating to the movement of the writing instrument 20 is acquired from the motion sensor 208, and the disturbance to the writing instrument model D13 is calculated using the observed value D10 and the mass information. A simulation of the movement of the writing instrument 20 is then executed using the writing instrument model D13 by applying the disturbance to the writing instrument model D13. The state of the writing instrument model D13 indicated by the result of the simulation is associated with the current state of the writing instrument 20, whereby the estimated value D20 of the state of the writing instrument 20 is derived. By using the simulation with the writing instrument model D13 in this way, information obtained at the position of the motion sensor 208 can be converted into information at an arbitrary position of the writing instrument 20 and processed. Therefore, regardless of the position of the motion sensor 208 with respect to the writing instrument 20, the estimated value D20 can be derived accurately based on the observed value D10 from the motion sensor 208. Furthermore, by applying a disturbance to the writing instrument model D13 having mass information indicating the mass of the writing instrument 20, the writing instrument model D13 can be given natural movement that takes the mass of the writing instrument 20 into account. In other words, the change in the state of the writing instrument 20 indicated by the estimated value D20 can be brought closer to the actual change in the state of the writing instrument 20 accompanying the writing operation. As a result, the discomfort felt by the user U during the writing operation can be suppressed.
 As in this embodiment, the writing instrument 20 having the pen tip 204 at one end in the axial direction L may be used as the input device. In this case, the information indicated by the trajectory of the pen tip 204 can be acquired accurately as handwriting information while suppressing the discomfort felt by the user U during the writing operation.
 As in this embodiment, the information input system 1 may include the setting unit 14 that sets the virtual surface S in the space in which the writing instrument 20 is placed, and the recognition unit 16 that recognizes the trajectory of the estimated value D20 with respect to the virtual surface S. In this case, the writing motion of the user U with respect to the virtual surface S can be recognized.
 As in this embodiment, the estimation unit 15 may include the monitoring unit 141 that monitors changes in the estimated value D20 and the adjustment unit 142 that adjusts the estimated value D20. With this configuration, when the change in the estimated value D20 is large, the estimated value D20 can be adjusted so that the change becomes smaller. This suppresses a situation in which the estimated value D20 deviates significantly from the virtual surface S, and as a result, the trajectory of the estimated value D20 with respect to the virtual surface S can be recognized more reliably.
 As in this embodiment, the monitoring unit 141 may include the calculation unit 143 that calculates the angle θ as the amount of change of the estimated value D20 from a reference value, using a predetermined state of the writing instrument 20 with respect to the virtual surface S as the reference value, and the determination unit 144 that determines whether the angle θ is within the first tolerance range. When the determination unit 144 determines that the angle θ is not within the first tolerance range, the adjustment unit 142 may make the disturbance applied to the writing instrument model D13 smaller than the disturbance calculated by the disturbance calculation unit 157, thereby reducing the angle θ of the estimated value D20 indicated by the result of the simulation. With this configuration, when the estimated value D20 changes greatly from the reference value, reducing the disturbance applied to the writing instrument model D13 reduces the change in the state of the writing instrument model D13 caused by executing the simulation, and hence the movement of the estimated value D20 indicated by the result of the simulation. As a result, a situation in which the estimated value D20 deviates significantly from the virtual surface S can be suppressed, and the trajectory of the estimated value D20 with respect to the virtual surface S can be recognized more reliably.
 As in this embodiment, the determination unit 144 may determine whether the angle θ is within the second tolerance range, which is larger than the first tolerance range. When the determination unit 144 determines that the angle θ is not within the second tolerance range, the adjustment unit 142 may return the estimated value D20 to the reference value. With this configuration, when the estimated value D20 changes even more greatly from the reference value, returning the estimated value D20 to the reference value makes it possible to recognize the trajectory of the estimated value D20 with respect to the virtual surface S even more reliably.
 As in this embodiment, when the determination unit 144 determines that the angle θ is within the second tolerance range but not within the first tolerance range, the adjustment unit 142 may increase the reduction rate of the disturbance applied to the writing instrument model D13, relative to the disturbance calculated by the disturbance calculation unit 157, as the angle θ deviates further from the first tolerance range. With this configuration, the larger the angle θ, the smaller the movement of the estimated value D20 indicated by the result of the simulation, which allows the user U to recognize that the estimated value D20 has deviated from the reference value.
[Modifications]
 The present disclosure has been described in detail above based on its embodiments. However, the present disclosure is not limited to the embodiments described above, and can be modified in various ways without departing from its gist.
 In the embodiment described above, the terminal 10 has, as functional elements, the communication unit 11, the acquisition unit 12, the display unit 13, the setting unit 14, the estimation unit 15, the adjustment unit 17, the recognition unit 16, the appraisal unit 18, and the output unit 19. However, at least one of these functional elements may be implemented in the writing instrument 20. When all of these functional elements are implemented in the writing instrument 20, the information input system 1 can acquire the written information D4 without using the terminal 10. The information input system 1 may include a server separate from the terminal 10; in that case, at least one of the above functional elements of the terminal 10 may be implemented in the server. The embodiment described above also illustrates the case where the terminal 10 or the database 30 stores various information such as the threshold information D11, the past handwriting information D12, and the writing instrument model D13. However, when the writing instrument 20 includes a storage device, that storage device may store the various information.
 In the embodiment described above, the motion sensor 208 includes an acceleration sensor and a gyro sensor. However, instead of the acceleration sensor or the gyro sensor, the motion sensor 208 may include a magnetic field sensor that detects magnetic fields in three mutually orthogonal axial directions. Alternatively, the motion sensor 208 may include a magnetic field sensor in addition to the acceleration sensor and the gyro sensor. The embodiment described above also illustrates the case where the writing instrument 20 is used as an example of the input device. However, the input device may be a device other than the writing instrument 20, as long as it is capable of detecting an input operation by the user U.
 The processing procedure executed in the information input system 1 is not limited to the examples shown in the embodiment described above. For example, some of the steps (processes) described above may be omitted, or the steps may be executed in a different order. Any two or more of the steps described above may be combined, some of the steps may be modified or deleted, or other steps may be executed in addition to the above steps.
 The purpose and situation of use of the writing instrument 20 as an example of the input device are not limited. The writing instrument 20 may be used in learning situations that involve writing, or in situations other than learning; learning that involves writing may take place at school, at home, or elsewhere. The writing instrument 20 may also be used for creating illustrations, diagrams, or charts. The information input system is not limited to the situations described above and can be applied to various situations in which an input device, such as the writing instrument 20, is used.
The gist of the present disclosure is shown below.
[1] An acquisition unit that acquires observed values regarding the operation of the input device from the sensor;
an estimation unit that estimates an estimated value of the state of the input device based on the observed value;
an output unit that outputs information indicated by the trajectory of the estimated value to a display device as input information from the input device,
The estimation unit is
a model acquisition unit that acquires an input device model that models the input device;
a disturbance calculation unit that calculates a disturbance to the input device model using the observed value and mass information indicating the mass of the input device;
an analysis execution unit that performs a simulation of the operation of the input device using the input device model by applying the disturbance to the input device model including the mass information;
An information input system comprising: an estimated value deriving unit that derives the estimated value by associating a state of the input device model indicated by the execution result of the simulation with a current state of the input device.
[2] The input device is a writing instrument including a pen tip at one end in the axial direction,
The information input system according to [1], wherein the estimator estimates, as the estimated values, an estimated value of the position of the pen tip and an estimated value of the angle in the axial direction.
[3] a setting unit that sets a virtual plane in a space in which the input device is arranged;
The information input system according to [1] or [2], further comprising a recognition unit that recognizes a trajectory of the estimated value with respect to the virtual surface.
[4] The estimation unit includes:
a monitoring unit that monitors changes in the estimated value with respect to the virtual surface;
The information input system according to [3], further comprising: an adjustment unit that adjusts the estimated value for the virtual plane.
[5] The monitoring unit:
a calculation unit that calculates an amount of change in the estimated value from the reference value, using a predetermined state of the input device with respect to the virtual plane as a reference value;
a determination unit that determines whether the amount of change is within a first tolerance range,
    wherein, when the determination unit determines that the amount of change is not within the first tolerance range, the adjustment unit makes the disturbance applied to the input device model smaller than the disturbance calculated by the disturbance calculation unit, thereby reducing the amount of change in the estimated value indicated by the execution result of the simulation; the information input system according to [4].
[6] The determination unit determines whether the amount of change is within a second tolerance range that is larger than the first tolerance range,
The information input system according to [5], wherein the adjustment unit returns the estimated value to the reference value when the determination unit determines that the amount of change is not within the second tolerance range.
[7] The information input system according to [6], wherein, when the determination unit determines that the amount of change is within the second tolerance range and not within the first tolerance range, the adjustment unit increases the reduction rate of the disturbance applied to the input device model, relative to the disturbance calculated by the disturbance calculation unit, as the amount of change deviates further from the first tolerance range.
[8] An information input method executed by an information input system including a processor,
Obtaining observed values regarding the operation of the input device from the sensor;
estimating an estimate of the state of the input device based on the observed value;
outputting information indicated by the trajectory of the estimated value to a display device as input information by the input device,
The step of estimating the estimated value includes:
obtaining an input device model that models the input device;
calculating a disturbance to the input device model using the observed value and mass information indicating the mass of the input device;
performing a simulation of the operation of the input device using the input device model by applying the disturbance to the input device model including the mass information;
An information input method comprising: deriving the estimated value by associating the state of the input device model indicated by the execution result of the simulation with the current state of the input device.
[9] Obtaining observed values regarding the operation of the input device from the sensor;
estimating an estimate of the state of the input device based on the observed value;
causing a computer to perform a step of outputting information indicated by the trajectory of the estimated value to a display device as input information from the input device;
The step of estimating the estimated value includes:
obtaining an input device model that models the input device;
calculating a disturbance to the input device model using the observed value and mass information indicating the mass of the input device;
performing a simulation of the operation of the input device using the input device model by applying the disturbance to the input device model including the mass information;
An information input program comprising: deriving the estimated value by associating the state of the input device model indicated by the execution result of the simulation with the current state of the input device.
 DESCRIPTION OF REFERENCE SIGNS 1... information input system, 12... acquisition unit, 13... display unit (an example of a display device), 14... setting unit, 15... estimation unit, 16... recognition unit, 19... output unit, 20... writing instrument (an example of an input device), 101... processor, 141... monitoring unit, 142... adjustment unit, 143... calculation unit, 144... determination unit, 153... estimated value deriving unit, 155... model acquisition unit, 157... disturbance calculation unit, 158... analysis execution unit, 204... pen tip, 208... motion sensor (an example of a sensor), D4... written information (an example of input information), D10... observed value, D13... writing instrument model (an example of an input device model), D20... estimated value, L... axial direction, S... virtual surface, T1... trajectory, θ... angle (an example of the amount of change).

Claims (9)

  1.  An information input system comprising:
      an acquisition unit that acquires observed values regarding the operation of an input device from a sensor;
      an estimation unit that estimates an estimated value of the state of the input device based on the observed values; and
      an output unit that outputs information indicated by the trajectory of the estimated value to a display device as input information from the input device,
      wherein the estimation unit includes:
      a model acquisition unit that acquires an input device model that models the input device;
      a disturbance calculation unit that calculates a disturbance to the input device model using the observed values and mass information indicating the mass of the input device;
      an analysis execution unit that executes a simulation of the operation of the input device using the input device model by applying the disturbance to the input device model including the mass information; and
      an estimated value deriving unit that derives the estimated value by associating a state of the input device model indicated by the execution result of the simulation with a current state of the input device.
  2.  The information input system according to claim 1, wherein the input device is a writing instrument including a pen tip at one end in an axial direction, and
      the estimation unit estimates, as the estimated value, an estimated value of the position of the pen tip and an estimated value of the angle of the axial direction.
  3.  The information input system according to claim 2, further comprising:
      a setting unit that sets a virtual plane in a space in which the input device is placed; and
      a recognition unit that recognizes the trajectory of the estimated value with respect to the virtual plane.
  4.  The information input system according to claim 3, wherein the estimation unit further includes:
      a monitoring unit that monitors changes in the estimated value with respect to the virtual plane; and
      an adjustment unit that adjusts the estimated value with respect to the virtual plane.
  5.  The information input system according to claim 4, wherein the monitoring unit includes:
      a calculation unit that calculates an amount of change in the estimated value from a reference value, using a predetermined state of the input device with respect to the virtual plane as the reference value; and
      a determination unit that determines whether the amount of change is within a first tolerance range, and
      wherein, when the determination unit determines that the amount of change is not within the first tolerance range, the adjustment unit makes the disturbance applied to the input device model smaller than the disturbance calculated by the disturbance calculation unit, thereby reducing the amount of change in the estimated value indicated by the execution result of the simulation.
  6.  The information input system according to claim 5, wherein the determination unit determines whether the amount of change is within a second tolerance range that is larger than the first tolerance range, and
      the adjustment unit returns the estimated value to the reference value when the determination unit determines that the amount of change is not within the second tolerance range.
  7.  The information input system according to claim 6, wherein, when the determination unit determines that the amount of change is within the second tolerance range and not within the first tolerance range, the adjustment unit increases the reduction rate of the disturbance applied to the input device model, relative to the disturbance calculated by the disturbance calculation unit, as the amount of change deviates further from the first tolerance range.
  8.  An information input method executed by an information input system comprising a processor, the method comprising:
      acquiring observed values regarding the operation of an input device from a sensor;
      estimating an estimated value of the state of the input device based on the observed values; and
      outputting information indicated by the trajectory of the estimated value to a display device as input information from the input device,
      wherein estimating the estimated value includes:
      acquiring an input device model that models the input device;
      calculating a disturbance to the input device model using the observed values and mass information indicating the mass of the input device;
      executing a simulation of the operation of the input device using the input device model by applying the disturbance to the input device model including the mass information; and
      deriving the estimated value by associating the state of the input device model indicated by the execution result of the simulation with the current state of the input device.
  9.  An information input program causing a computer to execute:
      acquiring observed values regarding the operation of an input device from a sensor;
      estimating an estimated value of the state of the input device based on the observed values; and
      outputting information indicated by the trajectory of the estimated value to a display device as input information from the input device,
      wherein estimating the estimated value includes:
      acquiring an input device model that models the input device;
      calculating a disturbance to the input device model using the observed values and mass information indicating the mass of the input device;
      executing a simulation of the operation of the input device using the input device model by applying the disturbance to the input device model including the mass information; and
      deriving the estimated value by associating the state of the input device model indicated by the execution result of the simulation with the current state of the input device.
PCT/JP2023/015665 2022-05-12 2023-04-19 Information input device, information input method, and information input program WO2023218886A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2022078746 2022-05-12
JP2022-078746 2022-05-12

Publications (1)

Publication Number Publication Date
WO2023218886A1 true WO2023218886A1 (en) 2023-11-16

Family

ID=88730246

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2023/015665 WO2023218886A1 (en) 2022-05-12 2023-04-19 Information input device, information input method, and information input program

Country Status (1)

Country Link
WO (1) WO2023218886A1 (en)

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2004338235A (en) * 2003-05-15 2004-12-02 Pilot Corporation Writing utensil
JP2018519583A (en) * 2015-06-10 2018-07-19 アップル インコーポレイテッド Device and method for operating a user interface with a stylus

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2004338235A (en) * 2003-05-15 2004-12-02 Pilot Corporation Writing utensil
JP2018519583A (en) * 2015-06-10 2018-07-19 アップル インコーポレイテッド Device and method for operating a user interface with a stylus
JP2021089751A (en) * 2015-06-10 2021-06-10 アップル インコーポレイテッドApple Inc. Device and method for manipulating user interface with stylus

Similar Documents

Publication Publication Date Title
JP4810146B2 (en) Method and apparatus for generating input text
Kumar et al. Hand data glove: a wearable real-time device for human-computer interaction
Kratz et al. A $3 gesture recognizer: simple gesture recognition for devices equipped with 3D acceleration sensors
KR100465241B1 (en) Motion recognition system using a imaginary writing plane and method thereof
US11663784B2 (en) Content creation in augmented reality environment
EP2903256B1 (en) Image processing device, image processing method and program
Oh et al. Inertial sensor based recognition of 3-D character gestures with an ensemble classifiers
JP2012531658A (en) Virtual world processing apparatus and method
TW201627824A (en) Method and system for identifying handwriting track
US10310615B2 (en) Apparatus and method of using events for user interface
US20100103092A1 (en) Video-based handwritten character input apparatus and method thereof
TWI567592B (en) Gesture recognition method and wearable apparatus
CN106990836B (en) Method for measuring spatial position and attitude of head-mounted human input device
Pan et al. Handwriting trajectory reconstruction using low-cost imu
Patil et al. Handwriting recognition in free space using WIMU-based hand motion analysis
Ousmer et al. Recognizing 3D trajectories as 2D multi-stroke gestures
Suriya et al. A survey on hand gesture recognition for simple mouse control
Fatmi et al. American Sign Language Recognition using Hidden Markov Models and Wearable Motion Sensors.
JP7408562B2 (en) Program, information processing device, quantification method, and information processing system
US9442577B2 (en) System for inputting graphical elements
WO2023218886A1 (en) Information input device, information input method, and information input program
Younas et al. Finger air writing-movement reconstruction with low-cost imu sensor
JP2023167505A (en) Information input system, information input method, and information input program
JP2023167500A (en) Information input system, information input method, and information input program
CN110390281B (en) Sign language recognition system based on sensing equipment and working method thereof

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 23803368

Country of ref document: EP

Kind code of ref document: A1