JP2014146354A - Program, information input device and control method therefor


Info

Publication number: JP2014146354A
Application number: JP2014057631A
Authority: JP (Japan)
Prior art keywords: value, input, operation, output, input device
Legal status: Pending
Other languages: Japanese (ja)
Inventors: Junya Okura (純也 大倉), Rii Tsuchikura (利威 土蔵)
Original Assignee: Sony Computer Entertainment Inc (株式会社ソニー・コンピュータエンタテインメント)
Application filed by Sony Computer Entertainment Inc


Abstract

A program capable of facilitating a user's operation input is provided.
An input value indicating the content of a user's operation input received by an operation input device is acquired every unit time. The program causes a computer connected to the operation input device to function so as to calculate, as the output value of a parameter to be operated, a value obtained by changing a reference value, determined according to one of the acquired input values, by change amounts corresponding to each of the acquired input values.
[Selection] Figure 1A

Description

  The present invention relates to a program for receiving an operation input performed by a user, an information input device, and a control method thereof.

  There are information input devices that accept an operation input performed by a user on an operation input device and perform various types of information processing. Such an information input device acquires an input value indicating the operation content when the user performs an operation input on the operation input device, and executes processing such as moving a character or moving a viewpoint set in a virtual space according to the acquired input value. Specific examples of such operation input include an operation in which the user touches a touch sensor with a finger, a stylus, or the like, and an operation in which an operation member such as an analog stick is moved. Some information input devices also have, as an operation input device, a built-in sensor (such as a gyroscope) that detects the tilt of the housing. Such an information input device can perform various kinds of information processing by having the user perform an operation of tilting the housing and acquiring, as an input value, the value of the sensor's detection result indicating the tilt.

  In such an information input device, the range of values that the input value can take is limited by physical restrictions of the operation input device. Consequently, if only the input value indicating the content of the operation input performed by the user at a given moment is used, the range of values the user can input is limited. As an example, consider a process that changes the orientation of a viewpoint set in a virtual space according to the tilt amount when the user performs an operation of tilting the housing. In this case, in order to change the orientation of the viewpoint greatly, the housing must be tilted by a correspondingly large amount, and depending on the shape of the housing and the way the user holds it, inputting large values can become difficult.

  On the other hand, an input value indicating the content of the operation input performed by the user may be used as the amount of change per unit time of the parameter to be operated. When such an operation input method is applied to the example described above, the viewpoint orientation keeps changing in the direction corresponding to the tilt direction for as long as the user keeps the housing of the information input device tilted in that direction. However, with such an operation input method it is difficult to finely adjust the parameter value in some situations, and operation may be difficult for the user.

  The present invention has been made in view of the above circumstances, and one of its objects is to provide a program, an information input device, and a control method thereof that can facilitate a user's operation input when the value of a parameter to be operated is changed according to an input value indicating the content of an operation input performed by the user.

  The program according to the present invention causes a computer connected to an operation input device that receives a user's operation input to function as: input value acquisition means for acquiring, every unit time, an input value indicating the content of the operation input received by the operation input device; and output value calculation means for calculating an output value of a parameter to be operated according to a plurality of input values acquired by the input value acquisition means. The output value calculation means calculates, as the output value, a value obtained by changing a reference value determined according to one input value acquired by the input value acquisition means by change amounts corresponding to each of the plurality of input values acquired by the input value acquisition means every unit time. This program may be stored in a computer-readable information storage medium.
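  As a rough illustration of this calculation (a sketch only, not the claimed implementation; the linear reference and change-amount mappings and the coefficient values below are placeholders), the output value can be viewed as a reference term derived from the most recent input value plus change amounts accumulated over all input values acquired so far:

# Minimal sketch of the claimed output-value computation.
# reference() and change_amount() stand in for the mappings described later
# in the specification; here they are simple placeholder linear functions.

ALPHA = 0.5   # placeholder coefficient for the reference component
BETA = 0.02   # placeholder coefficient for the per-unit-time change amount

def reference(input_value):
    return ALPHA * input_value

def change_amount(input_value):
    return BETA * input_value

def output_value(inputs):
    """inputs: list of input values I(0)..I(n) acquired once per unit time."""
    # Reference value from the most recently acquired input value ...
    ref = reference(inputs[-1])
    # ... changed by an amount corresponding to each acquired input value.
    return ref + sum(change_amount(i) for i in inputs)

print(output_value([0.0, 0.2, 0.4, 0.4]))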

  In the above program, the reference value may be a value corresponding to the input value last acquired by the input value acquisition means.

  In addition, the output value calculation means may output, as the output value, a value obtained by changing the reference value by change amounts determined according to each of a plurality of input values acquired by the input value acquisition means after the user has performed a predetermined start operation.

  Alternatively, the output value calculation means may output, as the output value, a value obtained by changing the reference value by change amounts determined according to each of a plurality of input values acquired by the input value acquisition means after the user has performed a predetermined start operation and after a predetermined start condition has been satisfied.

  Further, when the input value acquired by the input value acquisition means exceeds a predetermined value, the output value calculation means may calculate the output value using a predetermined upper limit value as the change amount.

  In the above program, the operation input device may be a sensor that detects a tilt of a housing held in the user's hand; when the user performs an operation input that tilts the housing, the input value acquisition means acquires, as the input value, a value indicating the tilt direction and tilt amount of the housing produced by the operation; the output value is a value indicating a direction and a magnitude; the reference value is a value indicating a direction and a magnitude determined according to the tilt direction and tilt amount indicated by the one input value; and the change amount is a change of a magnitude corresponding to the tilt amount, in a direction determined according to the tilt direction indicated by the corresponding input value.

  Further, in the above program, the input value may be constituted by a first input component value and a second input component value representing rotation amounts about two reference axes, and the output value may be constituted by a first output component value and a second output component value indicating magnitudes along each of two reference directions. The output value calculation means may determine the change amount for the first output component value according to the first input component value, determine the change amount for the second output component value according to the second input component value, and determine the change amounts for the respective output component values by different calculation methods.

  In the above program, the reference value may be determined such that, in the range where the absolute value of the input value exceeds a predetermined value, the ratio of the change in the reference value to the change in the input value is smaller than that ratio in the range where the absolute value of the input value is equal to or less than the predetermined value.

  Further, the program may cause the computer to further function as means for displaying on a display screen an image indicating the magnitude of the change amount corresponding to the input value last acquired by the input value acquisition means, the change amount being used when the output value calculation means calculates the output value.

  The information input device according to the present invention is an information input device connected to an operation input device that receives a user's operation input, and includes: input value acquisition means for acquiring, every unit time, an input value indicating the content of the operation input received by the operation input device; and output value calculation means for calculating an output value of a parameter to be operated according to a plurality of input values acquired by the input value acquisition means. The output value calculation means calculates, as the output value, a value obtained by changing a reference value determined according to one input value acquired by the input value acquisition means by change amounts corresponding to each of the plurality of input values acquired by the input value acquisition means every unit time.

  The information input device may include touch sensors disposed on each of the front surface and the back surface of the housing, and start operation reception means that accepts, as a start operation, an operation in which the user brings at least one finger of each hand into contact with the touch sensor disposed on the front surface of the housing and also brings at least one finger of each hand into contact with the touch sensor disposed on the back surface of the housing. In this case, the operation input device is a sensor that detects the tilt of the housing, and the output value calculation means may output, as the output value, a value obtained by changing the reference value by change amounts determined according to each of a plurality of input values acquired by the input value acquisition means after the start operation is received.

  Furthermore, in the information input device, the start operation reception means may accept, as an end operation, an operation in which the user releases the state of keeping at least one finger of each hand in contact with the touch sensor disposed on the front surface of the housing and at least one finger of each hand in contact with the touch sensor disposed on the back surface of the housing, and the output value calculation means may end the calculation of the output value when the end operation is accepted.

  Further, the control method of an information input device according to the present invention includes: an input value acquisition step of acquiring, every unit time, an input value indicating the content of an operation input received by an operation input device that receives a user's operation input; and an output value calculation step of calculating an output value of a parameter to be operated according to a plurality of input values acquired in the input value acquisition step. In the output value calculation step, a value obtained by changing a reference value determined according to one input value acquired in the input value acquisition step by change amounts corresponding to each of the plurality of input values acquired every unit time in the input value acquisition step is calculated as the output value.

FIG. 1A is a perspective view showing the external appearance of an information input device according to an embodiment of the present invention.
FIG. 1B is a perspective view showing the external appearance of the information input device according to the embodiment of the present invention.
FIG. 2 is a configuration block diagram showing the internal configuration of the information input device according to the embodiment.
FIG. 3 is a functional block diagram showing functions of the information input device according to the embodiment.
FIG. 4 is a diagram showing how a user grips the housing of the information input device according to the embodiment.
FIG. 5 is an explanatory diagram showing the relationship between a normal vector indicating the attitude of the housing and an input value.
FIG. 6A is a diagram showing the normal vector of FIG. 5 viewed from the X-axis negative direction side.
FIG. 6B is a diagram showing the normal vector of FIG. 5 viewed from the Y-axis negative direction side.
FIG. 7 is an explanatory diagram showing the relationship between the viewpoint coordinate system and the output angle values at the time the start operation is detected.
FIG. 8 is a diagram showing an example of a display image displayed by the information input device according to the embodiment.

  Hereinafter, embodiments of the present invention will be described in detail with reference to the drawings.

  FIGS. 1A and 1B are perspective views showing the external appearance of an information input device 1 according to an embodiment of the present invention. FIG. 1A shows the information input device 1 viewed from the front side, and FIG. 1B shows it viewed from the back side. In the following, a case where the information input device 1 according to the present embodiment is a portable game machine will be described.

  As shown in these drawings, the casing 10 of the information input device 1 has a substantially rectangular flat plate shape as a whole, and a touch panel 12 is provided on the surface thereof. The touch panel 12 has a substantially rectangular shape and includes a display 12a and a surface touch sensor 12b. The display 12a may be various image display devices such as a liquid crystal display panel and an organic EL display panel.

  The front touch sensor 12b is disposed so as to overlap the display 12a and includes a substantially rectangular detection surface having a shape and size corresponding to the display surface of the display 12a. When an object such as a user's finger or a stylus comes into contact with the detection surface, the sensor detects the contact position of the object. The front touch sensor 12b need not detect the position of an object only when the object actually touches the detection surface; it may detect the position when the object comes within a detectable range above the detection surface. The front touch sensor 12b may be of any type, such as a capacitive type, a pressure-sensitive type, or an optical type, as long as it is a device that can detect the position of an object on the detection surface. In the present embodiment, the front touch sensor 12b is a multipoint detection type touch sensor that can detect contact of an object at a plurality of locations.

  Furthermore, in the present embodiment, a back touch sensor 14 is disposed on the back surface side of the housing 10 so as to face the touch panel 12. The back touch sensor 14 includes a substantially rectangular detection surface having a shape and size corresponding to the display surface of the display 12a. That is, the display surface of the display 12a, the detection surface of the front touch sensor 12b, and the detection surface of the back touch sensor 14 all have substantially the same shape and size and are lined up in a straight line along the thickness direction (Z-axis direction) of the housing 10. The back touch sensor 14 may be of various types, like the front touch sensor 12b. In the present embodiment, the back touch sensor 14 is a multipoint detection type touch sensor that can detect contact of an object at a plurality of locations, similarly to the front touch sensor 12b. The user can hold the housing 10 of the information input device 1 with both hands and make operation inputs to the information input device 1 by bringing his or her fingers into contact with the detection surface of the front touch sensor 12b or the back touch sensor 14. At this time, since both the front touch sensor 12b and the back touch sensor 14 are multipoint detection type touch sensors, the user can perform various operation inputs by bringing a plurality of fingers into contact with these touch sensors simultaneously.

  Although not shown in FIGS. 1A and 1B, the information input device 1 may include, in addition to the front touch sensor 12b and the back touch sensor 14, various operation members for receiving user operation inputs, such as buttons and switches, on the front surface, back surface, side surfaces, or the like of the housing 10.

  In addition, a gyroscope 16 is disposed inside the housing 10 of the information input device 1 as a sensor for detecting the tilt of the housing 10. The gyroscope 16 is a piezoelectric vibration gyro or the like; it detects rotation of the housing 10 about a plurality of gyro reference axes set within the housing 10 and outputs an electrical signal corresponding to the detected rotation. In the present embodiment, the gyro reference axes are the X axis along the long side direction (horizontal direction) of the display 12a and the Y axis along the short side direction (vertical direction). The gyroscope 16 outputs a signal corresponding to the rotation of the housing 10 about each of these gyro reference axes.

  FIG. 2 is a configuration block diagram showing the internal configuration of the information input device 1. As shown in the figure, the information input apparatus 1 includes a control unit 20, a storage unit 22, and an image processing unit 24 therein. The control unit 20 is configured to include a CPU, for example, and executes various types of information processing according to programs stored in the storage unit 22. A specific example of processing executed by the control unit 20 will be described later. The storage unit 22 is, for example, a memory element such as a RAM or a ROM, a disk device, and the like, and stores a program executed by the control unit 20 and various data. The storage unit 22 also functions as a work memory for the control unit 20.

  The image processing unit 24 includes, for example, a GPU and a frame buffer memory, and draws an image to be displayed on the display 12a in accordance with an instruction output from the control unit 20. As a specific example, the image processing unit 24 includes a frame buffer memory corresponding to the display area of the display 12a, and the GPU writes an image to the frame buffer memory every predetermined time in accordance with an instruction from the control unit 20. The image written in the frame buffer memory is converted into a video signal at a predetermined timing and displayed on the display 12a.

  Hereinafter, functions realized by the information input device 1 in the present embodiment will be described. In the present embodiment, the information input device 1 executes a game application program stored in the storage unit 22 to generate a spatial image showing a scene in a virtual three-dimensional space and displays it on the display 12a. The user performs an operation input to the information input device 1 by tilting the housing 10 (hereinafter referred to as a tilt operation) while such a spatial image is displayed. In response to this operation input, the information input device 1 executes a process of changing the orientation of the viewpoint set in the virtual three-dimensional space (hereinafter referred to as the line-of-sight direction VD). That is, the parameter indicating the line-of-sight direction VD is the parameter to be operated by the tilt operation. The information input device 1 then generates an image showing the virtual three-dimensional space as viewed along the line-of-sight direction VD changed according to the tilt operation, and displays it on the display 12a. Thereby, the user can view the virtual three-dimensional space while tilting the housing 10 to change the line-of-sight direction. In order to realize such processing, in the present embodiment, the gyroscope 16 functions as an operation input device that receives an operation input by the user's tilt operation.

  As shown in FIG. 3, the information input device 1 functionally includes a start operation reception unit 30, an input value acquisition unit 32, an output value calculation unit 34, and a display image control unit 36. These functions are realized by the control unit 20 executing a program stored in the storage unit 22. The program may be provided stored on various computer-readable information storage media such as an optical disk or a memory card, or may be provided to the information input device 1 via a communication network such as the Internet.

  The start operation reception unit 30 receives an input of a predetermined start operation from the user. The input value acquisition unit 32, described later, acquires an input value indicating the content of the tilt operation performed by the user after the start operation has been accepted. Because the user starts the tilt operation of tilting the housing 10 only after executing the start operation, the line-of-sight direction VD can be kept from changing unintentionally when the tilt of the housing 10 changes during game play without the user intending to change the line-of-sight direction VD.

  In the present embodiment, the start operation is an operation in which the user holds the housing 10 with both hands so that at least one finger of each hand is in contact with the front touch sensor 12b and the back touch sensor 14 (hereinafter referred to as a gripping operation). Specifically, as shown in FIG. 4, the user brings the thumb of the left hand into contact with a predetermined area on the left side of the front touch sensor 12b and brings the index finger or another finger of the left hand into contact with a predetermined area of the back touch sensor 14 on the right side as viewed from the back. Similarly, the user brings the thumb of the right hand into contact with a predetermined area on the right side of the front touch sensor 12b and brings the index finger or another finger of the right hand into contact with a predetermined area of the back touch sensor 14 on the left side as viewed from the back. The start operation reception unit 30 receives the detection results of the front touch sensor 12b and the back touch sensor 14 and determines whether or not the user has performed such a gripping operation. Specifically, for example, the start operation reception unit 30 determines whether or not the user has performed a gripping operation by determining whether one or more of the user's fingers are in contact with each of four predetermined areas: the left and right sides of the front touch sensor 12b and the left and right sides of the back touch sensor 14. Alternatively, as a simpler process, it may be determined that the user has performed a gripping operation when the user's fingers are detected at two or more locations anywhere on the front touch sensor 12b and at two or more locations anywhere on the back touch sensor 14. When the user performs a tilt operation, it is natural for the user to grip both the left and right sides of the housing 10 with both hands, so by using such a gripping operation as the start operation, the user can start the tilt operation after performing the start operation without any sense of awkwardness.
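  The simpler determination described above can be sketched as follows (a non-normative illustration; the data structures and the constant name are assumptions):

# Sketch of the simpler gripping-operation check: the grip is recognized when
# at least two contacts are detected anywhere on the front touch sensor and
# at least two contacts anywhere on the back touch sensor.

MIN_CONTACTS_PER_SIDE = 2  # assumed constant; the specification says "two or more"

def is_gripping(front_contacts, back_contacts):
    """front_contacts / back_contacts: lists of (x, y) contact points
    reported by the multipoint front and back touch sensors."""
    return (len(front_contacts) >= MIN_CONTACTS_PER_SIDE and
            len(back_contacts) >= MIN_CONTACTS_PER_SIDE)

# Example: two thumbs on the front panel, two index fingers on the back panel.
print(is_gripping([(40, 120), (600, 120)], [(80, 100), (560, 100)]))  # True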

  Based on the detection result of the gyroscope 16, the input value acquisition unit 32 acquires an input value I indicating the content of the tilt operation performed by the user. In the present embodiment, the input value acquisition unit 32 calculates the input value I from the detection result of the gyroscope 16 every unit time (for example, 1/60 second) after the start operation reception unit 30 detects that the user has performed the start operation. Here, the input value I is a two-dimensional quantity consisting of a pair of numerical values, an x component value Ix and a y component value Iy, and indicates the relative tilt of the housing 10 with respect to an initial housing direction EDi (that is, the tilt direction and tilt amount of the housing 10 from the initial housing direction EDi). The initial housing direction EDi is a direction indicating the attitude of the housing 10 at the time the user performs the start operation, and is the normal direction of the back touch sensor 14 at the moment the start operation reception unit 30 detects the start operation. Hereinafter, the plane parallel to the touch panel 12 and the back touch sensor 14 at the time the start operation is detected is referred to as the reference plane RP for the tilt operation. Since the user usually holds the housing 10 so that the display 12a faces the user, the initial housing direction EDi is considered to substantially coincide with the user's line-of-sight direction at the time the start operation is performed.

Specifically, the x component value Ix and the y component value Iy are the X coordinate and the Y coordinate of the point obtained by projecting, onto the reference plane RP, a unit vector (hereinafter referred to as the normal vector V) oriented in the normal direction of the back touch sensor 14 (hereinafter referred to as the housing direction ED) when the user performs a tilt operation. FIG. 5 is an explanatory diagram showing the relationship between the normal vector V and the input value I. FIG. 6A shows the normal vector V of FIG. 5 viewed from the X-axis negative direction side, and FIG. 6B shows the normal vector V of FIG. 5 viewed from the Y-axis negative direction side. These drawings illustrate the attitude of the housing 10 at the time the start operation is performed; the orientation of the housing 10 after the tilt operation is not illustrated. As can be seen from these drawings, with the magnitude of the normal vector V taken as 1, the x component value Ix and the y component value Iy each take a value in the numerical range from a maximum of 1 to a minimum of −1. The combination of these component values indicates how much the user has rotated the housing 10 about the X axis and the Y axis with respect to the initial housing direction EDi. That is, if the rotation angles of the housing 10 about the X axis and the Y axis produced by the tilt operation are θx and θy, the input value acquisition unit 32 can obtain the x component value Ix and the y component value Iy of the input value I using the rotation angles θx and θy obtained from the detection result of the gyroscope 16. Specifically, the following relationships hold:
Ix = sin θy
Iy = sin θx
Here, for rotation about the X axis, the direction of clockwise rotation as viewed from the X-axis negative direction side is taken as the positive direction, and for rotation about the Y axis, the direction of counterclockwise rotation as viewed from the Y-axis negative direction side is taken as the positive direction. Further, it is assumed that the user does not tilt the housing 10 beyond 90 degrees in any direction from the initial housing direction EDi.
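As a concrete illustration of these relationships (a sketch only; the function name and the use of degrees are choices made here, not part of the specification):

import math

def input_value_from_tilt(theta_x_deg, theta_y_deg):
    """Compute the input value I = (Ix, Iy) from the rotation angles of the
    housing about the X and Y axes, measured from the initial housing
    direction EDi (angles in degrees for readability)."""
    ix = math.sin(math.radians(theta_y_deg))  # Ix = sin(theta_y)
    iy = math.sin(math.radians(theta_x_deg))  # Iy = sin(theta_x)
    return ix, iy

# Example: the housing tilted 30 degrees about the Y axis only
# gives I = (0.5, 0.0).
print(input_value_from_tilt(0.0, 30.0))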

  In the following, the time at which the start operation reception unit 30 detects the start operation is set to t = 0, and the time value t is assumed to increase by 1 each time the unit time elapses thereafter, until the start operation reception unit 30 detects the end operation. Further, the input value I acquired by the input value acquisition unit 32 at time t is written as I(t), and the x component value and the y component value of the input value I(t) are written as Ix(t) and Iy(t), respectively. Since the direction of the normal vector V coincides with the initial housing direction EDi at time t = 0, Ix(0) = 0 and Iy(0) = 0. In the following, the housing direction ED at time t is written as ED(t).

  The output value calculation unit 34 calculates the output value Po of the parameter to be operated according to the plurality of input values I acquired by the input value acquisition unit 32 while the user is performing the gripping operation. In the present embodiment, as described above, the parameter to be operated is the line-of-sight direction parameter indicating the line-of-sight direction VD set in the virtual three-dimensional space. Specifically, the output value Po of the line-of-sight direction parameter is assumed to consist of two output angle values Pox and Poy indicating the orientation of the line-of-sight direction VD with respect to an initial line-of-sight direction VDi. Here, the initial line-of-sight direction VDi is the line-of-sight direction at the time the start operation is detected. These output angle values Pox and Poy indicate rotation angles of the line-of-sight direction VD about two reference axes, and the reference axes are defined based on the viewpoint coordinate system at the time the start operation is detected. The viewpoint coordinate system is a coordinate system using a Vx axis, a Vy axis, and a Vz axis that are orthogonal to each other; the Vx axis corresponds to the horizontal direction of the screen of the display 12a, the Vy axis to the vertical direction of the screen, and the Vz axis to the line-of-sight direction VD. Specifically, the output angle value Pox indicates the rotation angle about the Vx axis of the viewpoint coordinate system at the time the start operation is detected, and the output angle value Poy indicates the rotation angle about the Vy axis. FIG. 7 is an explanatory diagram showing the relationship between the viewpoint coordinate system and the output angle values Pox and Poy; it shows the viewpoint position VP, the initial line-of-sight direction VDi, and the projection plane PP of the image set in the virtual three-dimensional space. The output angle values Pox and Poy are the values to be calculated by the output value calculation unit 34. The line-of-sight direction VD is the direction obtained by rotating the initial line-of-sight direction VDi according to the output angle values Pox and Poy. That is, the output angle values Pox and Poy are values indicating the magnitudes of rotation about the two reference axes, the Vx axis and the Vy axis, and the line-of-sight direction VD is determined by rotating the initial line-of-sight direction VDi by the direction and magnitude represented by this pair of angle values.

  Hereinafter, a specific example of a method for calculating the output value Po will be described. The output value calculation unit 34 calculates, as the output value Po, a value obtained by changing a reference value Pr, determined according to one of the plurality of input values I acquired by the input value acquisition unit 32, by change amounts Pd corresponding to each of the plurality of input values I acquired by the input value acquisition unit 32 every unit time. Like the output value Po, the reference value Pr consists of two angle values, the reference angle values Prx and Pry. Similarly, the change amount Pd consists of two angle values: a difference angle value Pdx indicating the change amount for the output angle value Pox and a difference angle value Pdy indicating the change amount for the output angle value Poy.

In the present embodiment, the output value calculation unit 34 updates the output value Po each time the unit time elapses and the input value acquisition unit 32 acquires a new input value I. Here, the reference value Pr(n) at time t = n is a value determined according to the input value I(n) most recently acquired by the input value acquisition unit 32. As for the change amount Pd, a change amount is determined for each of all the input values I(t) (t = 0, 1, 2, ..., n) acquired after the time t = 0 at which the start operation was detected, and all of these change amounts are reflected in the output value Po(n). As a result, the output angle values Pox(n) and Poy(n) at time t = n are calculated, for example, by the following calculation formula.
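Based on this description, the calculation presumably takes the following form:

Pox(n) = Prx(n) + Pdx(0) + Pdx(1) + ... + Pdx(n)
Poy(n) = Pry(n) + Pdy(0) + Pdy(1) + ... + Pdy(n)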

Here, a specific example of a method for calculating the reference angle values Prx(n) and Pry(n) will be described. Since these reference angle values are determined according to the input value I(n) at time t = n, they are uniquely determined according to the attitude of the housing 10 at time t = n. For example, Prx(n) and Pry(n) are calculated by the following formulas using the x component value Ix(n) and the y component value Iy(n) of the input value I(n):
Prx(n) = sin⁻¹(α · Iy(n))
Pry(n) = sin⁻¹(α · Ix(n))
Here, α is a predetermined coefficient.

  When α is 1, the reference angle values Prx(n) and Pry(n) coincide with θx and θy, respectively, according to the calculation formulas described above. That is, the reference value Pr represents a rotation angle that rotates the initial line-of-sight direction VDi in a direction and by an amount that match the tilt direction and tilt amount of the housing direction ED(n) with respect to the initial housing direction EDi. When α is smaller than 1, the reference value Pr represents a rotation angle that rotates the initial line-of-sight direction VDi, in a direction matching the tilt direction of the housing direction ED(n) with respect to the initial housing direction EDi, by an angle smaller than the tilt amount. Conversely, when α is greater than 1, the reference value Pr represents a rotation angle that rotates the initial line-of-sight direction VDi by an angle larger than the tilt amount of the housing direction ED(n) with respect to the initial housing direction EDi. In any case, by calculating the reference angle values Prx(n) and Pry(n) in this way, the output value Po(n) includes a component that rotates the initial line-of-sight direction VDi in a direction corresponding to the attitude of the housing 10 at time t = n. The coefficient α may be a negative value; in this case, the rotation direction from the initial line-of-sight direction VDi indicated by the reference value Pr is opposite to the tilt direction of the housing 10. In the above example, the formula for calculating Prx(n) from Iy(n) and the formula for calculating Pry(n) from Ix(n) are the same function, but the output value calculation unit 34 may calculate Prx(n) and Pry(n) by different calculation methods. Specifically, for example, the output value calculation unit 34 may calculate Prx(n) and Pry(n) using different coefficients α1 and α2 instead of the single coefficient α.

Further, the reference angle value Prx may be determined such that, in the range where the absolute value of the input value Iy exceeds a predetermined threshold value Ith, the rate of change of the reference angle value Prx with respect to a change in the input value Iy is smaller than the rate of change in the range where the absolute value of the input value Iy is equal to or less than the threshold value Ith. Specifically, for example, the output value calculation unit 34 calculates the reference angle value Prx by a formula in which the coefficient multiplying Iy is reduced once the absolute value of Iy exceeds the threshold value Ith.
In this way, in the range where the absolute value of the input value Iy exceeds the threshold value Ith, the coefficient by which Iy is multiplied is halved, and the rate of change of the reference angle value Prx per unit change of the input value Iy becomes correspondingly smaller. Thereby, as the input value Iy approaches its upper limit value (here, +1) or its lower limit value (here, −1), changes in the input value Iy are less readily reflected in changes in the reference angle value Prx. Here, the ratio between the coefficient in the range where the absolute value of the input value Iy exceeds the threshold value Ith and the coefficient in the range at or below the threshold value Ith is 1/2, but this ratio may be some other value. Further, instead of using different calculation formulas for the range where the absolute value of the input value Iy is at or below the threshold value Ith and the range where it exceeds the threshold value Ith, the reference angle value Prx may be calculated using a single function chosen so that the rate of change of the reference angle value Prx with respect to a change in the input value Iy becomes smaller as the absolute value of the input value Iy increases (for example, a function whose second derivative is negative). Similarly, when the reference angle value Pry is calculated using the input value Ix, it may be calculated by a formula similar to that for the reference angle value Prx described above.
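A sketch of this kind of soft-limited reference-angle calculation is shown below; since the exact formula is not reproduced in this text, the handling of the portion above the threshold (halving only the excess so that the mapping stays continuous at Ith) and the constant values are assumptions:

import math

ALPHA = 0.8   # assumed coefficient alpha
I_TH = 0.6    # assumed threshold Ith on the input component value

def reference_angle(i_component):
    """Reference angle (radians) from one input component value in [-1, 1].
    Above the threshold Ith the effective coefficient is halved, so the
    reference angle grows more slowly near the ends of the input range."""
    magnitude = abs(i_component)
    sign = 1.0 if i_component >= 0 else -1.0
    if magnitude <= I_TH:
        scaled = ALPHA * i_component
    else:
        # Only the portion exceeding the threshold is scaled by alpha/2
        # (assumed here to keep the mapping continuous at Ith).
        scaled = sign * ALPHA * (I_TH + (magnitude - I_TH) / 2.0)
    return math.asin(scaled)

# Prx(n) would be reference_angle(Iy(n)) and Pry(n) reference_angle(Ix(n)).
print(reference_angle(0.4), reference_angle(0.9))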

On the other hand, the difference angle values Pdx(t) and Pdy(t) may, like the reference angle values Prx(n) and Pry(n), be angle values indicating a rotation angle that rotates the line-of-sight direction VD in a direction and by an amount determined according to the input value I(t). However, it is desirable that the difference angle values Pdx(t) and Pdy(t) be relatively small compared with the reference angle values Prx(t) and Pry(t) determined according to the same input value I(t). Therefore, these difference angle values are calculated by a function different from that for the reference angle values. For example, the difference angle values Pdx(t) and Pdy(t) are expressed by the following functions using Iy(t) and Ix(t), respectively:
Pdx(t) = sin⁻¹(F(Iy(t)))
Pdy(t) = sin⁻¹(F(Ix(t)))
Here, F(x) is a predetermined function defined using a predetermined coefficient β. The coefficient β may be a positive value or a negative value, like the coefficient α. Further, F(x) may be limited so that its absolute value does not exceed a predetermined upper limit value; in this case, when the absolute value of x exceeds a predetermined value, the absolute value of F(x) takes a value equal to this upper limit value. Note that, as with the reference value Pr, Pdx(t) and Pdy(t) may be calculated by different calculation formulas. Specifically, for example, the output value calculation unit 34 may use different values of the coefficient β when calculating Pdx(t) and when calculating Pdy(t). In this way, the amount of change in the line-of-sight direction VD can be made to differ between the case where the housing 10 is rotated about the Y axis and the case where it is rotated about the X axis.
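Because the concrete definition of F(x) is not reproduced in this text, the sketch below assumes the simplest form consistent with the description, namely F(x) = β·x with its absolute value clamped to an upper limit; the numerical values are placeholders:

import math

BETA = 0.02            # assumed coefficient beta (may also be negative)
F_UPPER_LIMIT = 0.015  # assumed upper limit on |F(x)|

def f(x):
    """Assumed form of F(x): linear in x with coefficient beta, with its
    absolute value clamped to a predetermined upper limit."""
    value = BETA * x
    return max(-F_UPPER_LIMIT, min(F_UPPER_LIMIT, value))

def difference_angles(ix, iy):
    """Change amounts (radians) for one unit time: Pdx from Iy, Pdy from Ix."""
    pdx = math.asin(f(iy))
    pdy = math.asin(f(ix))
    return pdx, pdy

# With the housing held at I = (0.9, 0.5), the per-unit-time change amounts are:
print(difference_angles(0.9, 0.5))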

As described above, the output value calculation unit 34 needs to calculate a new output value Po, reflecting the input value I newly acquired by the input value acquisition unit 32, each time the unit time elapses. However, the output value calculation unit 34 does not necessarily have to redo the calculation of the output value Po from the beginning; it is sufficient to replace, in the previously calculated output value Po, the reference value Pr with the value determined according to the newly acquired input value I and then further change the result by the change amount corresponding to the newly acquired input value I. As a specific example, the output angle values Pox(n) and Poy(n) at time t = n are calculated from the output angle values Pox(n−1) and Poy(n−1) at time t = (n−1) by the following formula.
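Based on this description and the stored values described in the next paragraph, the update presumably takes the following form:

Pox(n) = Pox(n−1) − Prx(n−1) + Prx(n) + Pdx(n)
Poy(n) = Poy(n−1) − Pry(n−1) + Pry(n) + Pdy(n)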
When calculating the output value Po using such a calculation formula, the output value calculation unit 34, upon calculating the output angle values Pox(n−1) and Poy(n−1) at time t = (n−1), temporarily stores in the storage unit 22 the calculated output angle values Pox(n−1) and Poy(n−1) together with the reference angle values Prx(n−1) and Pry(n−1) used for that calculation. In the next calculation, the output angle values Pox(n) and Poy(n) are calculated by the above-described formula using the previous output angle values and reference angle values stored in the storage unit 22 and the reference angle values and difference angle values calculated from the newly acquired input value I(n).

Alternatively, the output value calculation unit 34 may calculate the output angle values Pox (n) and Poy (n) by the following calculation formula.
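Based on the description in the next paragraph, this alternative presumably keeps a running sum of the difference angle values (written here as Sx and Sy purely for illustration) and adds the current reference angle value:

Sx(n) = Sx(n−1) + Pdx(n),  Pox(n) = Prx(n) + Sx(n)
Sy(n) = Sy(n−1) + Pdy(n),  Poy(n) = Pry(n) + Sy(n)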
In this example, the output value calculation unit 34 stores in the storage unit 22 a cumulative value obtained by summing the difference angle values Pdx calculated since the start operation. Each time an input value I(t) is newly acquired, the cumulative value stored in the storage unit 22 is updated by adding the difference angle value Pdx(t) calculated from that input value I(t). The output angle value Pox(t) is then calculated by adding the reference angle value Prx(t) calculated from the input value I(t) to the updated cumulative value. The output angle value Poy(t) can be calculated by the same process.

  In the description so far, the output value calculation unit 34 calculates the change amount Pd for every input value I acquired after the start operation is detected and reflects all of these change amounts Pd in the output value Po. However, only the change amounts Pd calculated for input values I obtained after the user has performed the start operation and a predetermined start condition has been satisfied may be reflected in the output value Po. The start condition in this case may be, for example, that a predetermined time has elapsed since the start operation was detected, or that the amount of change of the input value I per unit time has fallen below a predetermined value. In this way, immediately after the user performs the start operation the output value Po is determined based only on the reference value Pr, and the changing of the output value Po according to the change amounts Pd starts only after a certain period of time has elapsed since the start operation (or after the user has held the attitude of the housing 10 steady).

  The display image control unit 36 updates the image displayed on the display 12a in accordance with the output value Po of the line-of-sight direction parameter output by the output value calculation unit 34. In the present embodiment, the display image control unit 36 sets the line-of-sight direction VD(t) at time t by rotating the initial line-of-sight direction VDi by the rotation direction and rotation amount determined according to the output value Po(t) output from the output value calculation unit 34 at time t, and displays on the display 12a a spatial image of the virtual three-dimensional space viewed along the line-of-sight direction VD(t). By repeating such processing every unit time, image elements included in the display image move within the display image according to changes in the output value Po. These image elements may be various objects arranged in the virtual space, or icons or the like on a menu screen.
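  A sketch of how the line-of-sight direction could be derived from the two output angle values (the rotation order and axis conventions are assumptions; the specification only states that VDi is rotated about the Vx and Vy axes of the start-time viewpoint coordinate system):

import math

def rotate_about_axis(v, axis, angle):
    """Rodrigues' rotation of vector v about a unit axis by angle (radians)."""
    ax, ay, az = axis
    vx, vy, vz = v
    c, s = math.cos(angle), math.sin(angle)
    dot = ax * vx + ay * vy + az * vz
    cross = (ay * vz - az * vy, az * vx - ax * vz, ax * vy - ay * vx)
    return tuple(v_i * c + cr_i * s + a_i * dot * (1 - c)
                 for v_i, cr_i, a_i in zip((vx, vy, vz), cross, (ax, ay, az)))

def line_of_sight(vdi, vx_axis, vy_axis, pox, poy):
    """Rotate the initial line-of-sight direction VDi by Pox about the Vx axis
    and by Poy about the Vy axis of the start-time viewpoint coordinate system."""
    vd = rotate_about_axis(vdi, vx_axis, pox)
    vd = rotate_about_axis(vd, vy_axis, poy)
    return vd

# Start-time viewpoint coordinate system: Vx = screen horizontal, Vy = screen
# vertical, VDi = initial line-of-sight direction (here the world Z axis).
print(line_of_sight((0, 0, 1), (1, 0, 0), (0, 1, 0),
                    math.radians(10), math.radians(20)))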

  In the present embodiment, the display image control unit 36 constructs, in the storage unit 22, a virtual three-dimensional space in which objects such as game character objects and background objects are arranged. It then instructs the image processing unit 24 to draw an image showing this virtual three-dimensional space viewed along the line-of-sight direction VD determined according to the output value Po calculated by the output value calculation unit 34. In response to the instruction, the GPU in the image processing unit 24 generates the image and writes it in the frame buffer, whereby a game image is displayed on the display 12a. Thereby, the user can view the virtual three-dimensional space from the line-of-sight direction VD changed according to the tilt operation the user has performed.

  When the display image control unit 36 updates the display image according to the output value Po calculated by the output value calculation unit 34, it may include in the display image an image indicating the magnitude of the change amount used to calculate the output value Po (hereinafter referred to as the indicator image 40). FIG. 8 is a diagram illustrating an example of a display image including such an indicator image 40. Here, in particular, the indicator image 40 indicates the magnitude of the change amount corresponding to the input value I last acquired by the input value acquisition unit 32. That is, when the display image is updated according to the output value Po(t) calculated at time t, an indicator image 40 indicating the magnitude of the difference angle values Pdx(t) and Pdy(t) calculated according to the input value I(t) is displayed. The indicator image 40 may be an image that changes according to the sum of the absolute values of the difference angle values Pdx(t) and Pdy(t), or an image that changes according to the sum of the squares of Pdx(t) and Pdy(t). Alternatively, it may be an image that indicates the magnitude of each of Pdx(t) and Pdy(t) individually. Further, it may be an image indicating not the magnitudes of Pdx(t) and Pdy(t) themselves but, for example, their magnitudes relative to the magnitude of the reference value Pr(t) or the output value Po(t). By displaying such an indicator image 40, the current change status of the operation target parameter can be explicitly conveyed to the user.
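  For instance, one of the magnitude measures mentioned above could be computed as follows (a trivial sketch; which measure and what scaling to use are left open by the specification):

def indicator_magnitude(pdx, pdy, use_squares=False):
    """Magnitude shown by the indicator image: either the sum of absolute
    values or the sum of squares of the latest difference angle values."""
    if use_squares:
        return pdx * pdx + pdy * pdy
    return abs(pdx) + abs(pdy)

print(indicator_magnitude(0.01, -0.02))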

  Further, in the present embodiment, the process of updating the parameter to be operated according to the input value I described above is repeatedly executed until the user performs a predetermined end operation. When the user performs the predetermined end operation, the output value calculation unit 34 ends the process of calculating the output value Po of the parameter to be operated, and the display image control unit 36 maintains the line-of-sight direction VD determined by the output value Po at the time of the end operation and displays an image of the virtual three-dimensional space viewed from that line-of-sight direction VD.

  Specifically, the end operation is an operation that releases the state in which the user is performing the gripping operation (that is, the state in which fingers of both hands are in contact with the front touch sensor 12b and the back touch sensor 14). In this example, it is assumed that the user keeps performing the gripping operation while performing the tilt operation. That is, after the user starts the gripping operation, the start operation reception unit 30 repeatedly determines, at predetermined intervals, whether or not that state is maintained. When it determines that the gripping state has ended (that is, when the user releases a finger that had been in contact with one of the predetermined areas set on the left and right of the front touch sensor 12b and the back touch sensor 14), the start operation reception unit 30 accepts this operation as an end operation indicating the end of the tilt operation and notifies the input value acquisition unit 32 to that effect. In this way, the input value acquisition unit 32 acquires input values indicating the content of the operation input performed by the user while the user continues the gripping operation (that is, from the start operation to the end operation), and can end the input value acquisition process when the user performs the end operation.

  As described above, in the information input device 1 according to the present embodiment, the output value Po of the parameter to be operated is determined based on the reference value Pr and the change amounts Pd. Of these, the reference value Pr is a component that reflects the input value I at a single moment, while the change amounts Pd are determined for each of the plurality of input values I acquired after the user performs the start operation and are accumulated into the output value Po. While the user continues the tilt operation, the contribution of the change amounts Pd to the output value Po increases with time, so the range of values the output value Po can take can be made larger than when the output value Po is calculated using only the input value I at a single moment. Conversely, immediately after the tilt operation is started, the contribution of the change amounts Pd to the output value Po is small compared with the reference value Pr, so an output value Po substantially corresponding to the reference value Pr is output. For example, after performing the start operation, the user can tilt the housing 10 in the direction in which the line-of-sight direction VD is to be changed and then immediately perform the end operation, thereby changing the line-of-sight direction VD to a direction close to the one the user intended.

  The embodiments of the present invention are not limited to those described above. For example, in the above description the operation target parameter is a parameter representing the line-of-sight direction VD, but the operation target parameter may be any of various other parameters. For example, the information input device 1 may change the position coordinates of the viewpoint VP set in the virtual three-dimensional space according to the user's tilt operation, may change the position or traveling direction of a character placed in a virtual three-dimensional space or virtual plane according to the user's tilt operation, or may change the display range displayed on the display 12a within a virtual plane.

  The information input device 1 is not limited to a gyroscope, and may detect the direction and amount of the tilt operation performed by the user by various detection means such as a geomagnetic sensor. Further, the operation input method performed by the user is not limited to the tilt operation of tilting the housing 10. For example, the information input device 1 may acquire the input value I by receiving the user's operation input on the front touch sensor 12b or the back touch sensor 14. As a specific example of such processing, the user first brings one finger into contact with one point on the back touch sensor 14 and then, while maintaining that state, brings another finger into contact with another point on the back touch sensor 14. At this time, the information input device 1 takes the position first touched by the user's finger as a reference position and acquires, as the input value I, a value indicating the direction and distance from the reference position to the position touched next. In this way, the user can perform an operation input indicating a direction and an amount relative to a reference position using the touch sensor, just as in the case of performing the tilt operation after the start operation. Alternatively, the user may perform an operation input for changing the parameter to be operated simply by bringing one finger into contact with one point on the back touch sensor 14. In this case, the information input device 1 can acquire, as the input value I, a value indicating the direction and amount of the position touched by the user's finger relative to a reference position, for example by taking the center position of the detection surface of the back touch sensor 14 as the reference position. The same operation inputs as those on the back touch sensor 14 described above can also be performed using the front touch sensor 12b.
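  A sketch of how such a touch-based input value could be formed (a non-normative illustration; normalizing by the sensor size is an assumption made here so that the components stay in a range comparable to the tilt-based input value):

def touch_input_value(reference_pos, touch_pos, sensor_width, sensor_height):
    """Input value I = (Ix, Iy): direction and distance of the current touch
    position relative to the reference position, normalized by sensor size."""
    ix = (touch_pos[0] - reference_pos[0]) / sensor_width
    iy = (touch_pos[1] - reference_pos[1]) / sensor_height
    return ix, iy

# Reference position = first contact point (or the center of the back touch
# sensor in the single-finger variant); second contact a little to the right.
print(touch_input_value((480, 272), (600, 272), 960, 544))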

  In the above description, the input value I consists of two component values indicating the X coordinate and the Y coordinate of the projection point obtained by projecting the normal vector V, which indicates the orientation of the housing 10, onto the reference plane RP. Instead of this, the information input device 1 may acquire, as the input value I, the rotation angles θx and θy themselves about the X axis and the Y axis of the housing 10, and calculate the output angle values Pox and Poy using these rotation angles θx and θy. Also, in the above description the input value I is a two-dimensional value indicating a direction and an amount relative to a reference position, but the input value I may be a one-dimensional value. For example, the information input device 1 may acquire, as the input value I, only the coordinate value in the X-axis direction of the position touched by the user's finger on the front touch sensor 12b or the back touch sensor 14, and execute scroll processing that scrolls the display image displayed on the display 12a in the horizontal direction by an amount corresponding to a one-dimensional output value Po calculated using the plurality of acquired input values I. Further, the input value I may be a three-dimensional value consisting of a set of three component values. For example, in the above description the component values indicating the rotation amounts about the X axis and the Y axis detected by the gyroscope 16 are used as the input value; in addition to these component values, a component value Iz indicating the rotation amount about the Z axis, which is orthogonal to both the X axis and the Y axis, may be included in the input value. In this case, the information input device 1 rotates the image displayed on the display 12a about the line-of-sight direction VD in response to the user's operation of rotating the housing 10 about the normal direction of the back touch sensor 14, and the output value calculation unit 34 calculates an output angle value Poz indicating the rotation amount of the display image according to the component value Iz.

  In the above description, the gyroscope 16, the front touch sensor 12b, and the back touch sensor 14 function as operation input devices, but operation inputs may be accepted from the user through various other operation input devices such as an analog stick. In addition, the start operation and the end operation that instruct the start and end of the operation input for which the input value I is acquired are not limited to those described above and may be, for example, the press of a button. Furthermore, although all of these operation input devices are arranged in the housing 10 of the information input device 1, the operation input devices may be configured separately from the information input device 1. As a specific example, the information input device 1 may be a consumer game machine, a personal computer, or the like, and the operation input device may be a controller that is connected to the information input device 1 by wire or wirelessly and incorporates a gyroscope. In this case, the user grips and tilts the housing of the controller by hand, and the information input device 1 calculates the output value based on the detection result of the gyroscope transmitted from the controller.

1 Information input device, 10 Housing, 12 Touch panel, 12a Display, 12b Front surface touch sensor, 14 Back surface touch sensor, 16 Gyroscope, 20 Control unit, 22 Storage unit, 24 Image processing unit, 30 Start operation reception unit, 32 Input value acquisition unit, 34 Output value calculation unit, 36 Display image control unit.

Claims (13)

  1. A program for causing a computer connected to an operation input device that receives a user's operation input to function as:
    input value acquisition means for acquiring, for each unit time, an input value indicating the content of the operation input received by the operation input device; and
    output value calculation means for calculating an output value of a parameter to be operated according to a plurality of input values acquired by the input value acquisition means,
    wherein the output value calculation means calculates, as the output value, a value obtained by changing a reference value, which is determined according to one input value acquired by the input value acquisition means, by a change amount corresponding to each of the plurality of input values acquired per unit time by the input value acquisition means.
  2. The program according to claim 1, wherein
    the reference value is a value corresponding to the input value last acquired by the input value acquisition means.
  3. The program according to claim 1 or 2, wherein
    the output value calculation means outputs, as the output value, a value obtained by changing the reference value by a change amount determined according to each of a plurality of input values acquired by the input value acquisition means after the user performs a predetermined start operation.
  4. The program according to claim 1 or 2, wherein
    the output value calculation means outputs, as the output value, a value obtained by changing the reference value by a change amount determined according to each of a plurality of input values acquired by the input value acquisition means after the user performs a predetermined start operation and a predetermined start condition is then satisfied.
  5. The program according to any one of claims 1 to 4, wherein,
    when the input value acquired by the input value acquisition means exceeds a predetermined value, the output value calculation means calculates the output value using a predetermined upper limit value as the change amount.
  6. The program according to any one of claims 1 to 5, wherein
    the operation input device is a sensor that detects the tilt of a housing held in the user's hand,
    when the user performs an operation input of tilting the housing, the input value acquisition means acquires, as the input value, a value indicating the tilt direction and tilt amount of the housing resulting from that operation,
    the output value is a value indicating a direction and a magnitude,
    the reference value is a value indicating a direction and a magnitude determined according to the tilt direction and tilt amount indicated by the one input value, and
    the change amount is a change amount whose magnitude corresponds to the tilt amount indicated by the corresponding input value and whose direction is determined according to the tilt direction indicated by that input value.
  7. The program according to claim 6, wherein
    the input value is constituted by a first input component value and a second input component value representing rotation amounts about two reference axes,
    the output value is constituted by a first output component value and a second output component value indicating magnitudes along two respective reference directions, and
    the output value calculation means determines the change amount for the first output component value according to the first input component value and the change amount for the second output component value according to the second input component value, determining the change amount for each output component value by a different calculation method.
  8. The program according to any one of claims 1 to 7, wherein
    the reference value is determined such that, in a range where the absolute value of the input value exceeds a predetermined value, the rate of change of the reference value with respect to a change of the input value is smaller than the rate of change in a range where the absolute value of the input value is less than or equal to the predetermined value.
  9. The program according to any one of claims 1 to 8, for causing the computer to further function as:
    means for displaying, on a display screen, an image indicating the magnitude of the change amount corresponding to the input value last acquired by the input value acquisition means when the output value calculation means calculates the output value.
  10. An information input device connected to an operation input device that receives a user's operation input, the information input device comprising:
    input value acquisition means for acquiring, for each unit time, an input value indicating the content of the operation input received by the operation input device; and
    output value calculation means for calculating an output value of a parameter to be operated according to a plurality of input values acquired by the input value acquisition means,
    wherein the output value calculation means calculates, as the output value, a value obtained by changing a reference value, which is determined according to one input value acquired by the input value acquisition means, by a change amount corresponding to each of the plurality of input values acquired per unit time by the input value acquisition means.
  11. The information input device according to claim 10, further comprising:
    touch sensors disposed on the front surface and the back surface of a housing; and
    start operation accepting means for accepting, as a start operation, an operation in which the user brings at least one finger of each hand into contact with the touch sensor disposed on the front surface of the housing and brings at least one finger of each hand into contact with the touch sensor disposed on the back surface of the housing,
    wherein the operation input device is a sensor that detects the tilt of the housing, and
    the output value calculation means outputs, as the output value, a value obtained by changing the reference value by a change amount determined according to each of a plurality of input values acquired by the input value acquisition means after the start operation is accepted.
  12. The information input device according to claim 11, wherein
    the start operation accepting means accepts, as an end operation, an operation in which the user releases the state in which at least one finger of each hand is in contact with the touch sensor disposed on the front surface of the housing and at least one finger of each hand is in contact with the touch sensor disposed on the back surface of the housing, and
    the output value calculation means ends the calculation of the output value when the end operation is accepted.
  13. A control method for an information input device, comprising:
    an input value acquisition step of acquiring, for each unit time, an input value indicating the content of an operation input received by an operation input device that receives a user's operation input; and
    an output value calculation step of calculating an output value of a parameter to be operated according to a plurality of input values acquired in the input value acquisition step,
    wherein, in the output value calculation step, a value obtained by changing a reference value, which is determined according to one input value acquired in the input value acquisition step, by a change amount determined according to each of a plurality of input values acquired per unit time in the input value acquisition step, is calculated as the output value.
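  As a non-normative sketch of the upper limit on the change amount recited in claim 5 and the reduced rate of change of the reference value recited in claim 8, with every constant and function name assumed only for illustration:

    # A sketch of the upper limit on the change amount (claim 5) and the reduced
    # rate of change of the reference value above a threshold (claim 8).
    DELTA_LIMIT = 1.0  # predetermined upper limit on the per-unit-time change amount
    REF_KNEE = 0.5     # predetermined input magnitude above which the reference mapping flattens
    GAIN = 2.0         # assumed base gain shared by both mappings

    def change_amount(input_value):
        """Change amount per unit time, clamped to the predetermined upper limit."""
        delta = GAIN * input_value
        return max(-DELTA_LIMIT, min(DELTA_LIMIT, delta))

    def reference_value(input_value):
        """Reference value whose rate of change halves once |input| exceeds the knee."""
        magnitude = abs(input_value)
        if magnitude <= REF_KNEE:
            ref = GAIN * magnitude
        else:
            ref = GAIN * REF_KNEE + 0.5 * GAIN * (magnitude - REF_KNEE)  # smaller slope
        return ref if input_value >= 0 else -ref

    print(change_amount(0.3), change_amount(0.9))      # 0.6 then 1.0: second value clamped
    print(reference_value(0.3), reference_value(0.9))  # 0.6 then 1.4: reduced slope above the knee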
JP2014057631A 2014-03-20 2014-03-20 Program, information input device and control method therefor Pending JP2014146354A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
JP2014057631A JP2014146354A (en) 2014-03-20 2014-03-20 Program, information input device and control method therefor

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
JP2014057631A JP2014146354A (en) 2014-03-20 2014-03-20 Program, information input device and control method therefor

Related Parent Applications (1)

Application Number Title Priority Date Filing Date
JP2010105856 Division 2010-04-30

Publications (1)

Publication Number Publication Date
JP2014146354A true JP2014146354A (en) 2014-08-14

Family

ID=51426489

Family Applications (1)

Application Number Title Priority Date Filing Date
JP2014057631A Pending JP2014146354A (en) 2014-03-20 2014-03-20 Program, information input device and control method therefor

Country Status (1)

Country Link
JP (1) JP2014146354A (en)

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2007310840A (en) * 2006-05-22 2007-11-29 Sony Computer Entertainment Inc Information processor, and control method and program of information processor
WO2009034982A1 (en) * 2007-09-14 2009-03-19 Kyocera Corporation Electronic apparatus
JP2009516280A (en) * 2005-11-16 2009-04-16 ソニー エリクソン モバイル コミュニケーションズ, エービー Method for displaying parameter status information and related portable electronic devices and parameters
JP2009189660A (en) * 2008-02-15 2009-08-27 Nintendo Co Ltd Information processing program and information processing device
JP2010015535A (en) * 2008-06-02 2010-01-21 Sony Corp Input device, control system, handheld device, and calibration method

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2009516280A (en) * 2005-11-16 2009-04-16 ソニー エリクソン モバイル コミュニケーションズ, エービー Method for displaying parameter status information and related portable electronic devices and parameters
JP2007310840A (en) * 2006-05-22 2007-11-29 Sony Computer Entertainment Inc Information processor, and control method and program of information processor
WO2009034982A1 (en) * 2007-09-14 2009-03-19 Kyocera Corporation Electronic apparatus
JP2009189660A (en) * 2008-02-15 2009-08-27 Nintendo Co Ltd Information processing program and information processing device
JP2010015535A (en) * 2008-06-02 2010-01-21 Sony Corp Input device, control system, handheld device, and calibration method

Similar Documents

Publication Publication Date Title
TWI470534B (en) Three dimensional user interface effects on a display by using properties of motion
EP2353065B1 (en) Controlling and accessing content using motion processing on mobile devices
US10055018B2 (en) Glove interface object with thumb-index controller
JP5675627B2 (en) Mobile device with gesture recognition
US9292083B2 (en) Interacting with user interface via avatar
US9507431B2 (en) Viewing images with tilt-control on a hand-held device
US9600078B2 (en) Method and system enabling natural user interface gestures with an electronic system
US8872762B2 (en) Three dimensional user interface cursor control
US8836768B1 (en) Method and system enabling natural user interface gestures with user wearable glasses
JP5293603B2 (en) Input device, control device, control system, control method, and handheld device
Lee Hacking the nintendo wii remote
US7683883B2 (en) 3D mouse and game controller based on spherical coordinates system and system for use
US20130050069A1 (en) Method and system for use in providing three dimensional user interface
EP2613223A1 (en) System and method for enhanced gesture-based interaction
US9268480B2 (en) Computer-readable storage medium, apparatus, system, and method for scrolling in response to an input
US20080010616A1 (en) Spherical coordinates cursor, mouse, and method
JP2012061301A (en) Game system, game device, game program, and game processing method
EP2348383B1 (en) System and method of virtual interaction
US8655622B2 (en) Method and apparatus for interpreting orientation invariant motion
JP5430246B2 (en) Game device and game program
US8310537B2 (en) Detecting ego-motion on a mobile device displaying three-dimensional content
US9229540B2 (en) Deriving input from six degrees of freedom interfaces
US20080211768A1 (en) Inertial Sensor Input Device
KR20160106629A (en) Target positioning with gaze tracking
US9152248B1 (en) Method and system for making a selection in 3D virtual environment

Legal Events

Date Code Title Description
A977 Report on retrieval

Free format text: JAPANESE INTERMEDIATE CODE: A971007

Effective date: 20150630

A131 Notification of reasons for refusal

Free format text: JAPANESE INTERMEDIATE CODE: A131

Effective date: 20150721

A521 Written amendment

Free format text: JAPANESE INTERMEDIATE CODE: A523

Effective date: 20150917

A02 Decision of refusal

Free format text: JAPANESE INTERMEDIATE CODE: A02

Effective date: 20151124