WO2016056259A1 - Procédé de traitement de données de système d'entrée de gestes - Google Patents

Procédé de traitement de données de système d'entrée de gestes

Info

Publication number
WO2016056259A1
WO2016056259A1 (PCT/JP2015/054773)
Authority
WO
WIPO (PCT)
Prior art keywords
gesture
coordinate system
sensor
vector
reference coordinate
Prior art date
Application number
PCT/JP2015/054773
Other languages
English (en)
Japanese (ja)
Inventor
亮平 神谷
Original Assignee
株式会社ログバー
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 株式会社ログバー
Publication of WO2016056259A1 publication Critical patent/WO2016056259A1/fr

Links

Images

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/033Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
    • G06F3/0346Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor with detection of the device orientation or free movement in a 3D space, e.g. 3D mice, 6-DOF [six degrees of freedom] pointers using gyroscopes, accelerometers or tilt-sensors

Definitions

  • the present invention relates to a data processing method of a gesture input system that corrects the input values of the gesture input system and assists the user's intended gesture input.
  • in recent years, input methods based on gesture detection, which perform input by detecting a user's movement and posture, have been attracting attention.
  • Patent Document 1 describes a technique related to a wearable body, attached to a hand or finger, for detecting handwritten characters using a gyro sensor. By writing a character in the air or on a surface while wearing this body, a change in the angle of the hand can be detected and the handwritten character output on a computer. This technology makes it possible to perform new character input without using an input device such as a keyboard.
  • Patent Document 2 describes a technique related to an input device that controls the movement of a cursor on the screen of a display device such as a television or a personal computer by inputting a locus of a gesture drawn in the air.
  • in this technique, the direction of gravitational acceleration detected by the acceleration sensor mounted on the input device is taken as a direction common to both the input device and the display device, and a two-dimensional rotation process is applied to the values detected by the gyro sensor. As a result, regardless of how the input device is held, the movement of the cursor on the screen of the display device can be controlled efficiently.
  • in other words, the inclination of the input device relative to the installed display is corrected. That is, the input value is corrected by grasping, with respect to the vertical direction in which gravitational acceleration acts, the relationship between the reference coordinate system used when the user inputs and the reference coordinate system of the display.
  • the user's input posture and device holding method affect the coordinate system that the user uses as a reference, and do not affect the coordinate system that the display uses as a reference.
  • the technique described in Patent Document 2 efficiently corrects the movement of the cursor on the display on the assumption that the direction of gravitational acceleration is unchanged in the coordinate system on which the display is a reference.
  • however, not all gesture input systems can define a coordinate system that uses the vertical direction as an invariant reference.
  • the coordinate system of the canvas that the user imagines when drawing a gesture is not necessarily in a fixed relationship with the vertical direction.
  • in an input device in which the user may use, as the coordinate system of the space for drawing a gesture, a coordinate system that does not take the vertical direction as a reference, the correction related to the direction in which the user holds or wears the input device cannot be performed by the technique described in Patent Document 2.
  • it is therefore an object of the present invention to provide a data processing method of a gesture input system that can correct the sensor output by a three-dimensional rotation process so that the coordinate system that the user uses as a reference and the coordinate system that the sensor uses as a reference match.
  • in a gesture input system using a wearable device, a data processing method according to the present invention subjects the gesture trajectory acquired by a first sensor provided in the wearable device to a three-dimensional rotation process, so that a device reference coordinate system based on the orientation of the wearable device matches a gesture reference coordinate system that the user uses as a reference when performing gesture input.
  • thereby, the device reference coordinate system and the gesture reference coordinate system are matched, and the input that the user performs using the wearable device can be acquired more accurately.
  • further, in order to make the sensor reference coordinate system, which is the reference when the first sensor detects values, match the device reference coordinate system based on the orientation of the wearable device, a three-dimensional rotation process is performed on the gesture trajectory acquired by the first sensor.
  • thereby, even when the sensor reference coordinate system and the device reference coordinate system do not match due to spatial constraints within the wearable device or the like, the three-dimensional input that the user performs using the wearable device can be acquired more accurately by the three-dimensional rotation process.
  • the gesture reference coordinate system is determined by a value detected by a second sensor provided in the wearable device. As described above, by determining the gesture reference coordinate system using the second sensor, it is possible to flexibly cope with a change in the gesture reference coordinate system.
  • the method includes: a step of performing a rotation process so that the first coordinate axis of the device reference coordinate system and the first coordinate axis of the gesture reference coordinate system overlap; a step of setting a reference vector group of two or more vectors extending from the origin of the device reference coordinate system within the plane represented by the second and third coordinate axes of the device reference coordinate system; a step of selecting, as a rotation reference vector, the vector in the reference vector group that minimizes the angle with the gravity vector detected by the second sensor; and a step of performing a rotation process so that the direction of the rotation reference vector overlaps the direction of the two-dimensional vector consisting of the second-axis component and the third-axis component of the gravity vector in the device reference coordinate system.
  • the reference vector group includes at least a first reference vector extending from the origin of the device reference coordinate system in the positive direction of the second coordinate axis, and a second reference vector extending from the origin of the device reference coordinate system in the negative direction of the second coordinate axis.
  • thereby, the correction of the deviation between the gesture reference coordinate system and the device reference coordinate system can be performed in the main gesture input postures, such as when the user is standing upright or lying on either the left or right side of the body.
  • rotation processing is performed so that the direction of the rotation reference vector overlaps the direction of the two-dimensional vector consisting of the second-axis component and the third-axis component of the gravity vector in the device reference coordinate system. As a result, even when the user takes an unexpected gesture input posture, the correction process is prevented from increasing the deviation between the gesture reference coordinate system and the device reference coordinate system.
  • the second sensor is an acceleration sensor.
  • thereby, the direction in which gravitational acceleration acts, that is, the vertically downward direction, can be easily identified.
  • the first sensor is a gyro sensor. Accordingly, the data processing method according to the present invention can be applied to a gesture input system that acquires a user's gesture input based on a change in angular velocity detected by a gyro sensor.
  • a calculation using quaternions is used for the rotation processing.
  • thereby, the computational load on the system can be reduced.
  • a wearable device using the data processing method according to the present invention includes a sensor unit; the sensor unit has a gyro sensor that detects angular velocity; and the gesture trajectory acquired by the gyro sensor is corrected by a three-dimensional rotation process based on the mounting position and mounting angle of the sensor unit.
  • the degree of freedom of arrangement of the sensor unit in the wearable device is increased, and the housing of the wearable device can be downsized.
  • the sensor unit has an acceleration sensor for detecting acceleration;
  • the gesture trajectory acquired by the gyro sensor is corrected by a three-dimensional rotation process based on the gravitational acceleration detected by the acceleration sensor.
  • a wearable device using the data processing method according to the present invention includes: angular velocity detection means for detecting angular velocity; gesture trajectory derivation means for deriving a gesture trajectory from the detected angular velocity; first gesture trajectory correction means for correcting the gesture trajectory by a three-dimensional rotation process; acceleration detection means for detecting acceleration; and second gesture trajectory correction means for correcting the gesture trajectory by a three-dimensional rotation process based on the detected acceleration.
  • an input intended by a user can be acquired more accurately.
  • FIG. 1 shows the wearing state of the ring-type device given as an example of the application target of the data processing method of the gesture input system according to Embodiment 1 of the present invention.
  • another figure shows an example of the gesture input by the ring-type device given as an example of the application target of the data processing method of the gesture input system according to Embodiment 1 of the present invention.
  • FIG. 1 is a diagram showing a wearing state of a ring-type device 1 as an example of an application target of a data processing method of a gesture input system according to the present invention.
  • the direction from the base of the finger to the fingertip is the z_g axis, the direction from the belly of the finger to the back of the finger is the y_g axis, and the x_g axis is taken perpendicular to these.
  • the coordinate system represented by the x_g axis, the y_g axis, and the z_g axis is used as the gesture reference coordinate system, that is, the reference when the user performs gesture input using the ring-type device 1.
  • assume that the user wears the ring-type device 1 in the correct orientation, that is, in a state where the device reference coordinate system based on the orientation of the ring-type device 1 matches the gesture reference coordinate system.
  • the ring-type device 1 includes: a sensor unit 11 for detecting movement of the ring-type device 1; an auxiliary input unit 12 that receives an instruction to start gesture input reception by an arbitrary means such as a physical button or a touch sensor; a feedback unit 13 for conveying the start and end of gesture input reception to the user by any means such as light, vibration, or sound; a calculation unit 14 for performing various calculations related to the gesture input process; a main storage unit 15 that is a volatile memory used in the calculations by the calculation unit 14; an auxiliary storage unit 16, such as a non-volatile memory, for storing various setting information relating to the gesture input process; a communication unit 17 for transmitting information acquired by gesture input to a general smartphone terminal, tablet terminal, personal computer, etc.; and a power supply unit 18 that supplies power to each element of the ring-type device 1 described so far.
  • the sensor unit 11 includes, as a first sensor, a gyro sensor 11a that detects a triaxial angular velocity generated in the ring type device 1, and an acceleration sensor 11b that detects a triaxial acceleration generated in the ring type device 1.
  • the sensor unit 11 uses, as its reference when detecting angular velocity and acceleration, a sensor reference coordinate system represented by the x_s axis, the y_s axis, and the z_s axis.
  • auxiliary input unit 12 and the feedback unit 13 may be omitted.
  • the sensor unit 11 may be configured to start accepting a gesture input when the user performs an arbitrary operation such as shaking the ring device 1.
  • the gesture input detection method described here is an example in the present embodiment, and the gesture detection method in the present invention is not limited to this.
  • the gesture input by the ring-type device 1 is acquired by considering the three-dimensional rotation of the vector v(i) shown in Equation (1).
  • i indicates the number of steps, and the last step at which the gesture input ends is defined as step n. That is, i is an integer of 0 or more and n or less.
  • the time at step i is t (i), such that the time when step i is 0 is t (0) and the time when step i is n is t (n).
  • the determination that gesture input has ended may be made under an arbitrary condition, such as when the magnitude of the movement of the ring-type device 1 detected by the sensor unit 11 falls below a preset threshold, or when the auxiliary input unit 12 is operated.
  • q(i) is called a quaternion at time t(i), and is used as a quantity representing three-dimensional rotation.
  • the quaternion q is a quantity consisting of a single scalar component q_w and three vector components along the unit vector i in the x_s direction, the unit vector j in the y_s direction, and the unit vector k in the z_s direction, and is expressed as shown in Equation (2).
  • i, j, and k satisfy the relationship shown in Expression (3).
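To make the quaternion relations of Equations (2) and (3) concrete, the following is a minimal Python sketch of quaternion multiplication and the rotation it induces. This code is illustrative only and not part of the publication; the (w, x, y, z) component ordering and all function names are our own choices.

```python
import math

def qmul(a, b):
    # Hamilton product of quaternions given as (w, x, y, z) tuples
    aw, ax, ay, az = a
    bw, bx, by, bz = b
    return (aw*bw - ax*bx - ay*by - az*bz,
            aw*bx + ax*bw + ay*bz - az*by,
            aw*by - ax*bz + ay*bw + az*bx,
            aw*bz + ax*by - ay*bx + az*bw)

def qconj(q):
    # conjugate; for a unit quaternion this is also the inverse
    w, x, y, z = q
    return (w, -x, -y, -z)

def qrotate(q, v):
    # rotate the 3-vector v by the unit quaternion q: q * (0, v) * conj(q)
    w, x, y, z = qmul(qmul(q, (0.0, v[0], v[1], v[2])), qconj(q))
    return (x, y, z)

def axis_angle(axis, theta):
    # unit quaternion for a rotation of theta radians about a unit axis
    s = math.sin(theta / 2.0)
    return (math.cos(theta / 2.0), axis[0]*s, axis[1]*s, axis[2]*s)
```

For example, `axis_angle((0, 0, 1), math.pi / 2)` yields the quaternion that rotates the x_s axis onto the y_s axis.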
  • FIG. 3 is a flowchart showing a process when a gesture input by the user is acquired using the ring-type device 1.
  • the start of gesture input is accepted according to an arbitrary condition such as input to the auxiliary input unit 12.
  • in step S11, the angular velocity ω(i) at time t(i) is obtained from the gyro sensor 11a as shown in Equation (6).
  • in step S12, the quaternion q(i) is obtained by Equation (7).
  • in step S13, the displacement vector Δ(i) of the coordinates having the vector v(i) as a position vector is obtained.
  • in step S14, using the obtained displacement vector Δ(i), the point p(i), which is the projection of the displacement of the vector v(i) at time t(i) onto the plane represented by the x_s axis and the y_s axis, is obtained by Equation (9).
  • in step S15, it is determined whether the gesture input is complete; until then, the process from step S11 to step S14 is repeated while advancing step i in step S16.
  • a vector v (i) at each time from time t (0) to t (n) can be obtained.
  • a trajectory drawn by a point having the vector v (i) as a position vector is distributed on a spherical surface with the origin as the center and the norm of the vector v (i) as the radius.
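The loop of steps S11 to S16 can be sketched as follows. This is an illustrative paraphrase of Equations (6) to (9), not the publication's own formulas; the fixed sample period dt, the initial vector v(0) = (0, 0, 1), and all function names are our assumptions.

```python
import math

def qmul(a, b):
    # Hamilton product of quaternions given as (w, x, y, z) tuples
    aw, ax, ay, az = a
    bw, bx, by, bz = b
    return (aw*bw - ax*bx - ay*by - az*bz,
            aw*bx + ax*bw + ay*bz - az*by,
            aw*by - ax*bz + ay*bw + az*bx,
            aw*bz + ax*by - ay*bx + az*bw)

def qrotate(q, v):
    # rotate the 3-vector v by the unit quaternion q: q * (0, v) * conj(q)
    w, x, y, z = qmul(qmul(q, (0.0, v[0], v[1], v[2])),
                      (q[0], -q[1], -q[2], -q[3]))
    return (x, y, z)

def delta_quaternion(omega, dt):
    # incremental rotation over one sample period from gyro output (rad/s)
    n = math.sqrt(omega[0]**2 + omega[1]**2 + omega[2]**2)
    if n < 1e-12:
        return (1.0, 0.0, 0.0, 0.0)
    half = n * dt / 2.0
    s = math.sin(half) / n
    return (math.cos(half), omega[0]*s, omega[1]*s, omega[2]*s)

def track_gesture(samples, dt, v0=(0.0, 0.0, 1.0)):
    """Integrate gyro samples omega(i) into projected points p(i)
    on the x_s-y_s plane (cf. steps S11 to S16)."""
    q = (1.0, 0.0, 0.0, 0.0)   # cumulative orientation quaternion
    points = []
    for omega in samples:
        q = qmul(delta_quaternion(omega, dt), q)   # cf. step S12
        v = qrotate(q, v0)       # position vector v(i) on the unit sphere
        points.append((v[0], v[1]))   # cf. step S14: drop the z_s component
    return points
```

A single sample rotating 90 degrees about the y_s axis moves the initial vector onto the x_s axis, so the projected point becomes (1, 0).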
  • FIG. 4B shows the plane represented by the x_s axis and the y_s axis viewed from the negative direction of the z_s axis; the locus of the user's gesture input can be obtained from the locus of the points p(i) derived from the displacement vectors Δ(i) from time t(1) to time t(i).
  • the gesture trajectory obtained by the above processing is based on the angular velocity ω(i) detected by the sensor unit 11 in step S11 with reference to the x_s axis, the y_s axis, and the z_s axis; that is, it is obtained in the sensor reference coordinate system.
  • if the directions of the x_s and x_g axes, the y_s and y_g axes, and the z_s and z_g axes are the same, that is, if the sensor reference coordinate system and the gesture reference coordinate system match, the gesture input intended by the user is correctly recognized.
  • however, the sensor reference coordinate system and the gesture reference coordinate system do not always match. In particular, when restrictions are imposed on the mounting position of the sensor unit 11 by constraints such as housing size, as in the ring-type device 1, a deviation often occurs between the two coordinate systems, as shown in FIG. 5.
  • in this case, the gesture locus that the user inputs based on the gesture reference coordinate system is detected based on the sensor reference coordinate system, causing rotation and distortion.
  • by correcting this deviation, it is possible to correctly detect the gesture trajectory that the user inputs based on the gesture reference coordinate system.
  • the quaternion q (-1) and the vector v (-1) represent the deviation of the gesture reference coordinate system as viewed from the sensor reference coordinate system due to the mounting position and orientation of the sensor unit 11.
  • the vector n is the normal vector of the plane determined by the z_s axis and the z_g axis: a vector of magnitude 1 whose direction is determined by the cross product of the unit vector in the positive direction of the z_s axis and the unit vector in the positive direction of the z_g axis.
  • the first rotation angle is the angle from the z_s axis to the z_g axis about the rotation axis given by the vector n.
  • the vector m is the unit vector in the positive direction of the z_g axis, and the second rotation angle is the angle, about the rotation axis given by the vector m, from the x_s axis after the first rotation to the x_g axis.
  • the vector v(−1) is the amount of parallel translation from the origin of the sensor reference coordinate system.
  • by performing the three-dimensional rotation process using the quaternion q(−1) and the parallel translation process using the vector v(−1), it is possible to correct the deviation in the gesture input data caused by the deviation between the sensor reference coordinate system and the gesture reference coordinate system.
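A minimal sketch of applying this mounting correction follows, with q(−1) and v(−1) treated as a constant quaternion and translation determined by the mounting position of the sensor unit. The function names are hypothetical and the quaternion convention is (w, x, y, z).

```python
def qmul(a, b):
    # Hamilton product of quaternions given as (w, x, y, z) tuples
    aw, ax, ay, az = a
    bw, bx, by, bz = b
    return (aw*bw - ax*bx - ay*by - az*bz,
            aw*bx + ax*bw + ay*bz - az*by,
            aw*by - ax*bz + ay*bw + az*bx,
            aw*bz + ax*by - ay*bx + az*bw)

def qrotate(q, v):
    # rotate the 3-vector v by the unit quaternion q: q * (0, v) * conj(q)
    w, x, y, z = qmul(qmul(q, (0.0, v[0], v[1], v[2])),
                      (q[0], -q[1], -q[2], -q[3]))
    return (x, y, z)

def correct_mounting(point, q_mount, v_mount):
    """Map a point detected in the sensor reference frame into the gesture
    reference frame: rotate by the fixed quaternion q(-1), then translate
    by the fixed vector v(-1)."""
    r = qrotate(q_mount, point)
    return (r[0] + v_mount[0], r[1] + v_mount[1], r[2] + v_mount[2])
```

Because q(−1) and v(−1) depend only on how the sensor unit is mounted, they can be computed once and reused for every sample of the trajectory.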
  • the rotation process is performed using quaternions.
  • however, the rotation process may be performed using other mathematically equivalent representations of rotation, such as a rotation matrix or an exponential map.
  • Embodiment 2 of the present invention will be described below with reference to the drawings.
  • in the following description, components identical to those in Embodiment 1 are given the same reference symbols, and duplicate description is omitted.
  • the application target of the data processing method of the gesture input system according to the present embodiment is also the ring-type device 1 similar to that described in the first embodiment.
  • in Embodiment 1, since it was assumed that the user wears the ring-type device 1 in a state where the device reference coordinate system based on the orientation of the ring-type device 1 matches the gesture reference coordinate system used as a reference when the user performs gesture input, correcting the deviation between the sensor reference coordinate system and the device reference coordinate system was equivalent to correcting the deviation between the sensor reference coordinate system and the gesture reference coordinate system. The present embodiment shows a data processing method that performs correction when, in addition to the deviation between the sensor reference coordinate system and the device reference coordinate system, there is also a deviation between the device reference coordinate system and the gesture reference coordinate system.
  • since the direction from the base of the finger to the fingertip is the z_g axis, if the z_r axis is set in the direction perpendicular to the ring surface, the z_g axis and the z_r axis coincide unless the user wears the ring-type device 1 facing the opposite direction.
  • FIG. 7 is a flowchart showing a process when the correction process according to the present embodiment is performed and a user's gesture input is acquired.
  • in step S21, the acceleration sensor 11b detects the gravitational acceleration at time t(0), and the two-dimensional vector g(0), consisting of the x_s axis component and the y_s axis component of the gravitational acceleration as shown in Equation (12), is obtained.
  • the vector g (0) is as shown in FIG.
  • consider the two-dimensional vector e_1 consisting of the x_s and y_s components of the unit vector in the positive direction of the x_r axis, the two-dimensional vector e_2 consisting of the x_s and y_s components of the unit vector in the negative direction of the x_r axis, the two-dimensional vector e_3 consisting of the x_s and y_s components of the unit vector in the positive direction of the y_r axis, and the two-dimensional vector e_4 consisting of the x_s and y_s components of the unit vector in the negative direction of the y_r axis.
  • let δ_1 be the angle between the vector g(0) and the vector e_1, δ_2 the angle between g(0) and e_2, δ_3 the angle between g(0) and e_3, and δ_4 the angle between g(0) and e_4. Among δ_1 to δ_4, let ψ be the smallest angle, and let e be the vector among e_1 to e_4 that forms the angle ψ with g(0).
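The selection of the reference vector e and the angle ψ can be sketched as follows. For simplicity this illustrative sketch (the names are ours) assumes that e_1 to e_4 project onto the sensor plane as axis-aligned unit vectors; in general they would be computed from the device geometry.

```python
import math

def select_reference(g):
    """Choose among e1..e4 (here assumed to be the +/-x and +/-y unit
    directions in the sensor x_s-y_s plane) the vector forming the
    smallest angle psi with the projected gravity vector g(0)."""
    candidates = ((1.0, 0.0), (-1.0, 0.0), (0.0, 1.0), (0.0, -1.0))
    gn = math.hypot(g[0], g[1])
    best_e, psi = None, math.pi
    for e in candidates:
        cosd = (e[0] * g[0] + e[1] * g[1]) / gn   # cosine of delta_k
        delta = math.acos(max(-1.0, min(1.0, cosd)))
        if delta < psi:
            best_e, psi = e, delta
    return best_e, psi
```

For a user standing upright, g(0) points roughly along the negative y direction, so e_4 is selected and ψ stays small.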
  • in step S22, the quaternion q(−2) for correcting the deviation between the gesture reference coordinate system and the device reference coordinate system, and the vector v(0) that takes this deviation into account, are calculated using Equation (13).
  • the vector l is a unit vector in the z_r axis direction, and its direction is the direction in which a right-handed screw advances when it is rotated in the direction that brings the vector e into coincidence with the vector g(0).
  • in step S23, the angular velocity ω(i) is acquired from the gyro sensor 11a, and in step S24, the quaternion q(i) is calculated by Equation (7).
  • in step S25, the displacement vector Δ(i) of the vector v(i) is obtained using the quaternion q(i) obtained in step S24, the quaternion q(i−1) at time t(i−1), and the vector v(0) obtained in step S22.
  • in step S26, using the obtained displacement vector Δ(i), the point p(i), which is the projection of the displacement of the vector v(i) at time t(i) onto the plane represented by the x_g axis and the y_g axis, is determined by Equation (15).
  • in step S27, it is determined whether the gesture input is complete; until then, the process from step S23 to step S26 is repeated while advancing step i in step S28.
  • by performing both the correction for the deviation between the sensor reference coordinate system and the device reference coordinate system and the correction for the deviation between the device reference coordinate system and the gesture reference coordinate system, the user can perform the intended gesture input without being careful about the direction in which the ring-type device 1 is worn.
  • for example, even when the finger wearing the ring-type device 1 is rotated about the z_g axis, so that the y_g axis intended by the user, who regards the vertical direction as the y_g axis, does not match the actual y_g axis, the correction by the data processing method according to the present embodiment allows the gesture input intended by the user to be acquired correctly.
  • the correction processing by the data processing method according to the present embodiment can be applied even when the user sets the y g axis of the gesture reference coordinate system to other than vertically upward.
  • for example, assume that the gesture reference coordinate system is as shown in the figure, with the y_g axis horizontal to the ground. In this case, the direction of g(0) is the negative direction of the x_g axis, so δ_2 is ψ and the vector e_2 is e.
  • the correction process may be applied only when the correction reference angle ψ falls within a preset range.
  • since the vector e indicating the reference direction of the correction process is dynamically selected from four directions, ψ can in principle range over −45° < ψ ≤ +45°; however, the range of correctable angles may be narrowed, for example to −10° < ψ < +10°, and the correction process performed only when ψ falls within that range.
  • in some situations it may be preferable not to apply this correction process at all, and such problems can be avoided by narrowing the range of correctable angles.
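Gating the correction on a narrowed angle window can be sketched as below. The ±10° default and the function name are our own choices; g(0) is the gravity vector projected onto the sensor plane.

```python
import math

def gated_roll_angle(g, limit_deg=10.0):
    """Deviation psi of the projected gravity g(0) from the nearest of the
    four reference directions (multiples of 90 degrees). Returns None when
    psi lies outside the allowed window, meaning the correction is skipped."""
    theta = math.atan2(g[1], g[0])
    quarter = math.pi / 2.0
    # wrap the deviation into (-45, 45] degrees around the nearest axis
    psi = theta - quarter * round(theta / quarter)
    if abs(psi) > math.radians(limit_deg):
        return None   # unusual input posture: do not apply the correction
    return psi
```

With gravity close to a reference axis the small residual angle is returned and corrected; a gravity direction more than 10 degrees away leaves the input uncorrected.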
  • Embodiment 3 of the present invention will be described below with reference to the drawings.
  • in the following description, components identical to those in Embodiment 1 are given the same reference symbols, and duplicate description is omitted.
  • the application target of the data processing method of the gesture input system according to the present embodiment is also the ring-type device 1 similar to that described in the first embodiment.
  • FIG. 10(a) shows a case where a leftward rotation about the wrist is performed with the direction of the belly of the index finger wearing the ring-type device 1 kept fixed.
  • in this case, the vector v(i) moves in the plane represented by the z_g axis and the x_g axis, as shown in the figure.
  • the trajectory of the point p(i) in the plane represented by the x_g axis and the y_g axis moves in the negative direction of the x_g axis, as shown in the figure, and the straight line intended by the user is input correctly.
  • however, when the finger is twisted, the vector v(i) includes a component in the y_g axis direction, as shown in the figure. Therefore, the gesture trajectory is curved as shown in FIG. 12B, which differs from the straight-line gesture input intended by the user.
  • FIG. 13 is a flowchart showing a process when the user's gesture input is acquired by correcting the influence of the twisting of the finger.
  • in step S31, the angular velocity ω(i) at time t(i) is obtained, and in step S32, the quaternion q(i) is calculated from the angular velocity ω(i).
  • in step S33, using the quaternion q(i) at time t(i), the rotation angle θ_t(i) about the z_g axis from time t(0) to time t(i) is obtained by Equation (16).
  • in Equation (16), the quaternion q(i) is treated as the composition of a rotation about the z_g axis, a rotation about the y_g axis applied after the rotation about the z_g axis, and a rotation about an axis determined by the z_g axis and the y_g axis.
  • step S34 quaternion q t (i) used for correcting the influence of finger torsion is obtained by equation (17).
  • the vector m in Expression (17) is a unit vector in the positive direction on the z g axis.
  • in step S35, the displacement vector Δ(i) corrected for the influence of finger twist is obtained using the quaternion q_t(i) by Equation (18); then, in step S36, the point p(i) indicating the gesture trajectory is calculated by Equation (19).
  • in step S37, it is determined whether the gesture input is complete; until then, the process from step S31 to step S36 is repeated while advancing step i in step S38.
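One standard way to extract the rotation angle about the z_g axis from q(i), in the spirit of Equations (16) and (17), is the swing-twist decomposition sketched below. The publication's exact formulas may differ, and all names here are our own.

```python
import math

def qmul(a, b):
    # Hamilton product of quaternions given as (w, x, y, z) tuples
    aw, ax, ay, az = a
    bw, bx, by, bz = b
    return (aw*bw - ax*bx - ay*by - az*bz,
            aw*bx + ax*bw + ay*bz - az*by,
            aw*by - ax*bz + ay*bw + az*bx,
            aw*bz + ax*by - ay*bx + az*bw)

def twist_about_z(q):
    """Swing-twist decomposition: the component of q that rotates about the
    z axis, and the corresponding twist angle (cf. theta_t(i) in Eq. (16))."""
    w, x, y, z = q
    n = math.hypot(w, z)
    if n < 1e-12:                  # pure 180-degree swing: twist undefined
        return (1.0, 0.0, 0.0, 0.0), 0.0
    return (w / n, 0.0, 0.0, z / n), 2.0 * math.atan2(z, w)

def remove_twist(q):
    # compose q with the inverse twist quaternion (cf. Eq. (17)) so that
    # rotation about z_g (finger roll) no longer distorts the trajectory
    (tw, tx, ty, tz), _ = twist_about_z(q)
    return qmul((tw, -tx, -ty, -tz), q)
```

Applied to a pure finger roll, `remove_twist` returns the identity quaternion, so the rolled gesture no longer bends the projected trajectory.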

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The present invention relates to a data processing method for a gesture input system that can correct a sensor output by means of a three-dimensional rotation so that a coordinate system referenced by the user coincides with a coordinate system referenced by the sensor. For a gesture input system using a wearable device, the present invention is characterized by the three-dimensional rotation of a gesture trajectory acquired by a first sensor provided on the wearable device, the rotation being performed so that a device reference coordinate system based on the orientation of the wearable device coincides with a gesture reference coordinate system that is referenced by the user when the user performs gesture input.
PCT/JP2015/054773 2014-10-07 2015-02-20 Procédé de traitement de données de système d'entrée de gestes WO2016056259A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2014206354A JP2016076104A (ja) 2014-10-07 2014-10-07 ジェスチャ入力システムのデータ加工方法
JP2014-206354 2014-10-07

Publications (1)

Publication Number Publication Date
WO2016056259A1 true WO2016056259A1 (fr) 2016-04-14

Family

ID=55652889

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2015/054773 WO2016056259A1 (fr) 2014-10-07 2015-02-20 Procédé de traitement de données de système d'entrée de gestes

Country Status (2)

Country Link
JP (1) JP2016076104A (fr)
WO (1) WO2016056259A1 (fr)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111240468A (zh) * 2019-12-31 2020-06-05 北京诺亦腾科技有限公司 手部动作捕捉的校准方法、装置、电子设备及存储介质
WO2023166885A1 (fr) * 2022-03-04 2023-09-07 株式会社小松製作所 Procédé d'étalonnage d'informations

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP3309520B1 (fr) * 2016-10-17 2018-08-29 SICK STEGMANN GmbH Système de mesure d'angle destiné à déterminer un angle de rotation
JP2018163132A (ja) 2017-03-24 2018-10-18 望月 玲於奈 姿勢算出プログラム、姿勢情報を用いたプログラム

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH0728591A (ja) * 1993-05-13 1995-01-31 Toshiba Corp 空間操作マウスシステム及び空間動作パターン入力方法
JP2002062981A (ja) * 2000-08-16 2002-02-28 Nippon Hoso Kyokai <Nhk> 表示画面指示装置
JP2009301531A (ja) * 2007-10-22 2009-12-24 Sony Corp 空間操作型入力装置、制御装置、制御システム、制御方法、空間操作型入力装置の製造方法及びハンドヘルド装置
JP2013011979A (ja) * 2011-06-28 2013-01-17 Jvc Kenwood Corp 動作指示装置、動作指示システム、動作指示方法、及びプログラム

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP5797046B2 (ja) * 2011-07-27 2015-10-21 任天堂株式会社 ポインティングシステム、情報処理システム、座標系等の設定方法、情報処理装置、および情報処理プログラム

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH0728591A (ja) * 1993-05-13 1995-01-31 Toshiba Corp 空間操作マウスシステム及び空間動作パターン入力方法
JP2002062981A (ja) * 2000-08-16 2002-02-28 Nippon Hoso Kyokai <Nhk> 表示画面指示装置
JP2009301531A (ja) * 2007-10-22 2009-12-24 Sony Corp 空間操作型入力装置、制御装置、制御システム、制御方法、空間操作型入力装置の製造方法及びハンドヘルド装置
JP2013011979A (ja) * 2011-06-28 2013-01-17 Jvc Kenwood Corp 動作指示装置、動作指示システム、動作指示方法、及びプログラム

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111240468A (zh) * 2019-12-31 2020-06-05 北京诺亦腾科技有限公司 手部动作捕捉的校准方法、装置、电子设备及存储介质
WO2023166885A1 (fr) * 2022-03-04 2023-09-07 株式会社小松製作所 Procédé d'étalonnage d'informations

Also Published As

Publication number Publication date
JP2016076104A (ja) 2016-05-12

Similar Documents

Publication Publication Date Title
KR100543701B1 (ko) 공간형 입력 장치 및 방법
JP6524661B2 (ja) 入力支援方法、入力支援プログラムおよび入力支援装置
TWI378367B (fr)
US7952561B2 (en) Method and apparatus for controlling application using motion of image pickup unit
JP4582116B2 (ja) 入力装置、制御装置、制御システム、制御方法及びそのプログラム
US20050237296A1 (en) Apparatus, system and method for virtual user interface
WO2016056259A1 (fr) Procédé de traitement de données de système d&#39;entrée de gestes
US20160098125A1 (en) Determining User Handedness and Orientation Using a Touchscreen Device
JP5732792B2 (ja) 情報処理装置及び情報処理プログラム
US11009964B2 (en) Length calibration for computer models of users to generate inputs for computer systems
CN103597426B (zh) 空中指向设备和方法
JP2007040763A (ja) 加速度センサの補正装置
JP5589549B2 (ja) 情報処理装置及び情報処理装置の操作方法
US20150260750A1 (en) Electronic apparatus and program
CN105389578A (zh) 信息处理装置、信息处理系统以及信息处理方法
US11531392B2 (en) Tracking upper arm movements using sensor modules attached to the hand and forearm
JP6209581B2 (ja) 姿勢算出装置、姿勢算出方法、携帯機器およびプログラム
CN112306261B (zh) 低功耗倾斜补偿指点方法及相应的指点电子设备
CN110096134B (zh) 一种vr手柄射线抖动矫正方法、装置、终端和介质
CN112328099B (zh) 低功率指向方法以及实现该指向方法的电子设备
JP5652647B2 (ja) 小型電子機器、処理方法及びプログラム
US11961601B1 (en) Adaptive user interface for determining errors in performance of activities
KR20200032492A (ko) 필기 입력에 대한 보정 방법, 이를 위한 전자 장치 및 저장 매체
US11454646B2 (en) Initiation of calibration of multiple sensor modules related to an orientation of a user of the sensor modules
JPH10232739A (ja) ペン型入力装置

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 15848211

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 15848211

Country of ref document: EP

Kind code of ref document: A1