WO2017208619A1 - Information processing device, information processing method, and program - Google Patents

Information processing device, information processing method, and program

Info

Publication number
WO2017208619A1
WO2017208619A1 (application PCT/JP2017/014461)
Authority
WO
WIPO (PCT)
Prior art keywords
movement
information
processing apparatus
information processing
control unit
Prior art date
Application number
PCT/JP2017/014461
Other languages
French (fr)
Japanese (ja)
Inventor
麻紀 井元
健太郎 井田
拓也 池田
陽方 川名
龍一 鈴木
Original Assignee
Sony Corporation
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Sony Corporation
Publication of WO2017208619A1

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/20 Analysis of motion

Definitions

  • This disclosure relates to an information processing apparatus, an information processing method, and a program.
  • However, the operation intended by the user may not be recognized by the device. For example, if the detected gesture does not match any gesture associated with a process, the detected gesture is not recognized as an operation. As a result, the processing desired by the user is not executed.
  • In this regard, Patent Document 1 discloses an information processing terminal that, when a gesture is detected but cannot be recognized, provides the user with an interface for associating the detected gesture with a process. According to that invention, a gesture that could not be recognized can be corrected into a recognizable gesture.
  • In Patent Document 1, a user's gesture and the process corresponding to that gesture are associated with each other by the user, so that the process intended by the user can be reliably executed based on the gesture.
  • However, a burden is placed on the user in order for the apparatus to recognize the gesture.
  • Therefore, the present disclosure proposes a mechanism that can cause the apparatus to execute the processing intended by the user while reducing the user's burden of operating the apparatus.
  • According to the present disclosure, there is provided an information processing apparatus including: a processing unit that executes processing based on movement information of an operating body; and a control unit that controls the movement information used in the processing based on a first direction, determined from a change in position during the movement of the operating body, and on the movement path length of the operating body.
  • In addition, according to the present disclosure, there is provided an information processing method including: executing processing based on movement information of an operating body; and controlling the movement information used in the processing based on a first direction, determined from a change in position during the movement of the operating body, and on the movement path length of the operating body.
  • Further, according to the present disclosure, there is provided a program for causing a computer to realize: a processing function that executes processing based on movement information of an operating body; and a control function that controls the movement information used in the processing based on a first direction, determined from a change in position during the movement of the operating body, and on the movement path length of the operating body.
  • As described above, according to the present disclosure, a mechanism is provided that can cause the apparatus to execute the processing intended by the user while reducing the user's burden of operating the apparatus.
  • Note that the above effects are not necessarily limiting; together with or in place of the above effects, any of the effects shown in this specification, or other effects that can be understood from this specification, may be achieved.
  • FIG. 2 is a block diagram schematically illustrating an example of a functional configuration of an information processing apparatus according to an embodiment of the present disclosure.
  • FIG. 3 is a diagram for describing movement information correction processing in the information processing apparatus according to an embodiment of the present disclosure.
  • FIG. 4 is a diagram illustrating an example of a change in the position of the moving operating body in the information processing apparatus according to an embodiment of the present disclosure.
  • FIG. 5 is a flowchart conceptually showing an example of overall processing of an information processing apparatus according to an embodiment of the present disclosure.
  • FIG. 6 is a flowchart conceptually showing an example of details of correction processing of movement information in the information processing apparatus according to an embodiment of the present disclosure.
  • FIG. 14 is a diagram for describing an example of correction control in an information processing device according to a second modification of an embodiment of the present disclosure.
  • FIG. 14 is a flowchart conceptually showing an example of overall processing of an information processing apparatus according to a third modification of an embodiment of the present disclosure.
  • It is a diagram illustrating an example of options for selecting whether or not to apply correction.
  • FIG. 16 is a diagram for describing an example in which a preview indicating the corrected movement information is displayed as an option object in the information processing apparatus according to the fourth modification of an embodiment of the present disclosure.
  • FIG. 16 is a flowchart conceptually showing an example of details of correction processing of movement information in an information processing apparatus according to a fifth modification of an embodiment of the present disclosure.
  • It is a flowchart conceptually showing an example of overall processing, including movement information correction cancellation processing, in an information processing apparatus according to a sixth modification of an embodiment of the present disclosure.
  • It is an explanatory diagram illustrating a hardware configuration of an information processing apparatus according to an embodiment of the present disclosure.
  • FIG. 1 is a diagram illustrating a configuration example of an information processing system according to an embodiment of the present disclosure.
  • the information processing system includes an information processing apparatus 100, a measurement apparatus 200, and a projection apparatus 300.
  • the information processing apparatus 100, the measurement apparatus 200, and the projection apparatus 300 are connected for communication.
  • the information processing apparatus 100 controls the projection of the projection apparatus 300 using the measurement result of the measurement apparatus 200. Specifically, the information processing apparatus 100 recognizes the operating tool from the measurement result provided from the measurement apparatus 200. Then, the information processing apparatus 100 controls the mode of projection by the projection apparatus 300 based on the movement information about the recognized operating body. For example, the information processing apparatus 100 controls the projection position of the virtual object 20 to be projected on the projection apparatus 300 according to the movement of the user's hand measured by the measurement apparatus 200.
  • The measuring device 200 measures the situation around the measuring device 200. Specifically, the measuring apparatus 200 measures phenomena from which the position or state of an object (for example, a user or an operating body) existing around the measuring apparatus 200 can be grasped. Then, the measuring apparatus 200 provides information obtained by the measurement (hereinafter also referred to as measurement information) to the information processing apparatus 100 as a measurement result.
  • For example, the measurement apparatus 200 is a depth sensor, and can measure the positional relationship between a body part (for example, a finger or a hand) of the user on which a marker is mounted and surrounding objects (that is, the three-dimensional positions of the body part and the surrounding objects).
  • the measurement information may be 3D image information.
  • The measurement device 200 may instead be an inertial sensor worn by the user, or a ring-type or bracelet-type wearable device having such an inertial sensor. Moreover, the measuring apparatus 200 may be installed in any manner as long as an operation area can be secured. For example, the measuring device 200 may be provided on the floor instead of on the ceiling as shown in FIG. 1.
  • The projection apparatus 300 projects an image based on instruction information from the information processing apparatus 100. Specifically, the projection apparatus 300 projects an image provided from the information processing apparatus 100 onto a designated location. For example, the projection apparatus 300 projects the virtual object 20 onto the projection area 10 shown in FIG. 1. In FIG. 1, the projection apparatus 300 is a 2D (two-dimensional) projector, but it may instead be a 3D projector.
  • input devices that use the user's body as an operating body have been developed and are in widespread use.
  • For example, there are input devices operated by directly touching a display screen, such as that of a smartphone, with a finger, and input devices operated by performing a gesture, such as the user waving an arm toward a projected screen.
  • These input devices have the advantage that the operation is intuitive and easy to use.
  • However, when the user operates the device by performing a gesture in the air, the hand becomes less likely to move linearly as it moves away from the user's body, and instead tends to move in an arc centered on the user's body. This is considered to be due to a difference between the somatic sensation of the human hand or finger and the actual range of motion of the hand or finger. As a result, the operation actually recognized by the apparatus does not match the operation intended by the user, and the user must perform the operation again. Furthermore, processing corresponding to an operation not intended by the user may be executed. In addition, when attempting the intended operation, the user may be forced into an unreasonable posture, which risks placing a burden on the body.
  • However, even with such technology, the burden on the user may not be reduced.
  • For example, when user operations in directions other than the specific direction related to the recognized scroll operation are ignored, only part of the user's operation amount is reflected in the scroll process.
  • That is, only the component of the user's operation amount in the specific direction related to the recognized scroll operation is reflected in the scroll process. Therefore, the scroll amount is smaller than the user's actual operation amount.
  • When the processing amount per operation amount decreases in this way, a larger operation amount is required to execute the same processing, and as a result the number of operations may increase. This can increase the operation burden.
  • In particular, the operation burden can increase further when operating a wide display screen.
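The shortfall described above can be illustrated with a short sketch. The stroke coordinates, the scroll axis, and the helper names below are illustrative assumptions, not values from the disclosure:

```python
import math

def path_length(points):
    """Total arc length of the polyline through the sampled positions."""
    return sum(math.dist(p, q) for p, q in zip(points, points[1:]))

def projected_amount(points, axis):
    """Component of the net displacement along a unit scroll axis:
    the conventional approach in which off-axis motion is discarded."""
    dx = points[-1][0] - points[0][0]
    dy = points[-1][1] - points[0][1]
    return dx * axis[0] + dy * axis[1]

# An arc-shaped stroke: the hand drifts downward while moving right.
stroke = [(0.0, 0.0), (1.0, -0.2), (2.0, -0.6), (3.0, -1.2)]
axis = (1.0, 0.0)  # the specific direction related to the scroll operation

scroll = projected_amount(stroke, axis)  # only the on-axis component
actual = path_length(stroke)             # the distance the user moved
```

Because `scroll` is smaller than `actual`, the scroll amount falls short of the distance the user actually moved, which is the drawback addressed by the correction described later.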
  • Therefore, the present disclosure proposes an information processing system capable of causing a device to execute the process intended by the user while reducing the user's burden of operating the device, and an information processing device 100 for realizing that system.
  • FIG. 2 is a block diagram schematically illustrating an example of a functional configuration of the information processing apparatus 100 according to an embodiment of the present disclosure.
  • the information processing apparatus 100 includes a communication unit 102, a recognition unit 104, a correction control unit 106, and a processing control unit 108.
  • the communication unit 102 communicates with a device external to the information processing device 100. Specifically, the communication unit 102 receives a measurement result from the measurement apparatus 200 and transmits projection instruction information to the projection apparatus 300. For example, the communication unit 102 communicates with the measurement apparatus 200 and the projection apparatus 300 using a wired communication method. Note that the communication unit 102 may communicate using a wireless communication method.
  • the recognition unit 104 performs recognition processing based on the measurement result of the measurement device 200. Specifically, the recognition unit 104 recognizes the operating tool based on the measurement information received from the measurement device 200.
  • the operation body includes a part of the user's body. For example, the recognition unit 104 recognizes the user's finger or hand based on the three-dimensional image information obtained from the measurement apparatus 200. Note that the operation body recognized by the recognition unit 104 may be a pen-type or stick-type operation device.
  • the recognition unit 104 recognizes the position of the operating body. Specifically, the recognition unit 104 recognizes the position of the operation body in the operation area where the operation body is recognized. For example, the recognition unit 104 recognizes the position of the operation body in the three-dimensional space in the operation region based on the distance between the measurement apparatus 200 and the operation body measured by the measurement apparatus 200. Note that the position of the operating tool in the depth direction from the measuring apparatus 200 toward the operating tool may be recognized based on the recognized size of the operating tool.
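The size-based depth estimate mentioned in the note above can be sketched under a simple pinhole-camera assumption. The function name, the reference values, and the linear model are illustrative, not part of the disclosure:

```python
def depth_from_size(observed_px, reference_px, reference_depth_m):
    """Under a pinhole-camera model, apparent size is inversely
    proportional to distance, so a larger observed size implies a
    smaller depth relative to the calibrated reference."""
    return reference_depth_m * reference_px / observed_px

# A hand calibrated at 100 px across when 1.0 m from the sensor
# now appears 50 px across, so it is estimated to be farther away.
depth = depth_from_size(50.0, 100.0, 1.0)
```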
  • the recognizing unit 104 recognizes the movement of the operating body by recognizing the position of the operating body. For example, the recognizing unit 104 recognizes the presence / absence of movement of the operating body, the moving direction, the speed and acceleration of movement, the moving path, and the like based on the change in the position of the operating body.
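A minimal sketch of deriving such movement attributes from a change in position follows; the 2D coordinates, the sampling interval, and the attribute names are assumptions for illustration:

```python
import math

def movement_attributes(positions, dt):
    """Derive presence of movement, moving direction, and speed from
    the two most recent sampled 2D positions (sampled every dt seconds)."""
    if len(positions) < 2:
        return {"moving": False}
    (x0, y0), (x1, y1) = positions[-2], positions[-1]
    dx, dy = x1 - x0, y1 - y0
    dist = math.hypot(dx, dy)
    return {
        "moving": dist > 0.0,
        "direction": (dx / dist, dy / dist) if dist else (0.0, 0.0),
        "speed": dist / dt,  # units per second
    }

# A hand that moved from (0, 0) to (0.3, 0.4) in 0.1 s:
# direction is approximately (0.6, 0.8) and speed approximately 5.0.
attrs = movement_attributes([(0.0, 0.0), (0.3, 0.4)], dt=0.1)
```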
  • The correction control unit 106 controls correction for an input operation. Specifically, the correction control unit 106 controls the movement information of the operating tool for an operation of moving the operating tool. More specifically, the correction control unit 106 functions as a control unit that controls the movement information used in the process executed based on the movement information of the operating body (hereinafter also referred to as the operation-handling process), based on a first direction (hereinafter also referred to as the estimated movement direction) determined from a change in position during the movement of the operating body and on the movement path length of the operating body. For example, the movement information includes time-series discrete coordinate information or continuous movement route information.
  • FIG. 3 is a diagram for describing movement information correction processing in the information processing apparatus 100 according to an embodiment of the present disclosure.
  • the correction control unit 106 acquires movement information of the operating tool. For example, the correction control unit 106 acquires information related to the position of the user's hand as the moving operation body recognized by the recognition unit 104.
  • FIG. 4 is a diagram illustrating an example of a change in the position of the moving operating body in the information processing apparatus 100 according to an embodiment of the present disclosure.
  • FIG. 4 discretely shows changes in the hand position corresponding to the movement of the user's hand in FIG. 3.
  • The correction control unit 106 estimates the movement path of the operating tool. For example, the correction control unit 106 acquires information identifying the operating body positions P50A to P50E as illustrated in FIG. 4. Then, the correction control unit 106 calculates the movement path R60 of the operating body that passes through these positions P50A to P50E based on that information. Note that when information regarding the movement route is provided from the recognition unit 104, this processing may be omitted.
  • The correction control unit 106 calculates a tangent at each position of the moving operating body. For example, the correction control unit 106 calculates a tangent to the calculated movement route R60 at each of the operating body positions P50A to P50E. Note that the positions used for tangent calculation may be thinned out, that is, tangents may be calculated only for a sampled subset of positions.
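The tangent calculation can be approximated by finite differences over the sampled positions, as in the sketch below. The central-difference scheme is an illustrative choice; the disclosure does not fix a particular numerical method:

```python
import math

def unit_tangents(points):
    """Approximate the unit tangent at each sampled position with a
    central difference (one-sided at the end points)."""
    tans = []
    for i in range(len(points)):
        a = points[max(i - 1, 0)]
        b = points[min(i + 1, len(points) - 1)]
        dx, dy = b[0] - a[0], b[1] - a[1]
        norm = math.hypot(dx, dy)
        tans.append((dx / norm, dy / norm) if norm else (0.0, 0.0))
    return tans

# On a straight horizontal path every tangent is (1.0, 0.0).
straight = unit_tangents([(0.0, 0.0), (1.0, 0.0), (2.0, 0.0)])
```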
  • The correction control unit 106 then determines whether to correct the movement information based on whether the calculated tangent direction has changed. For example, as shown in FIG. 4, the correction control unit 106 determines that the movement information is to be corrected because the tangent direction at the position P50D differs from the tangent direction at the positions P50A to P50C.
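The correction decision can be reduced to checking whether any tangent deviates from the initial tangent direction. The 20-degree threshold below is an illustrative assumption; the disclosure only speaks of a change by a predetermined value or more:

```python
import math

def tangent_direction_changed(tangents, threshold_deg=20.0):
    """Return True when some tangent deviates from the first tangent's
    direction by more than the threshold, i.e. correction is warranted."""
    ref = math.atan2(tangents[0][1], tangents[0][0])
    for tx, ty in tangents[1:]:
        diff = abs(math.degrees(math.atan2(ty, tx) - ref))
        diff = min(diff, 360.0 - diff)  # shortest angular distance
        if diff > threshold_deg:
            return True
    return False

# P50A..P50C-like tangents agree; a P50D-like tangent turns by 45 degrees.
changed = tangent_direction_changed([(1.0, 0.0), (1.0, 0.0), (0.7, 0.7)])
```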
  • The correction control unit 106 calculates the estimated movement direction of the operating tool. Specifically, the correction control unit 106 determines, as the estimated movement direction, the direction determined from the tangent at the position of the moving operating body. More specifically, the correction control unit 106 determines, as the estimated movement direction, the direction that is parallel to the calculated tangent and in which the operating body is moving. Specifically, the correction control unit 106 determines, as the estimated movement direction, the direction that is parallel to the tangent at the position of the operating body during the period from when the operating body is recognized until a predetermined period elapses, and that points in the direction in which the operating body is moving. For example, the direction parallel to the tangent at the positions P50A to P50C shown in FIG. 4 and pointing from the position P50A toward P50C is determined as the estimated movement direction.
  • the method of determining the estimated movement direction is not limited to the above method.
  • For example, the correction control unit 106 may determine, as the estimated movement direction, the direction parallel to the tangent direction that occurs most frequently among the tangents at the positions of the operating body.
  • For example, as shown in FIG. 4, the tangent directions differ among the positions P50A to P50C, the position P50D, and the position P50E, but the direction in which the tangent directions coincide at the three positions P50A to P50C is determined as the estimated movement direction.
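This majority-vote variant can be sketched by quantizing tangent angles into bins and picking the most frequent one. The 10-degree bin width and the helper names are illustrative assumptions:

```python
import math
from collections import Counter

def majority_direction(tangents, bin_deg=10.0):
    """Estimate the movement direction as the most frequent tangent
    direction, with angles quantized into bins of bin_deg degrees."""
    def angle_bin(t):
        return round(math.degrees(math.atan2(t[1], t[0])) / bin_deg)
    best_bin, _count = Counter(map(angle_bin, tangents)).most_common(1)[0]
    theta = math.radians(best_bin * bin_deg)
    return (math.cos(theta), math.sin(theta))

# Three tangents agree (as at P50A..P50C); the last two differ
# (as at P50D and P50E), so the majority direction wins.
est = majority_direction([(1.0, 0.0), (1.0, 0.0), (1.0, 0.0),
                          (0.7, 0.7), (0.0, 1.0)])
```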
  • The correction control unit 106 calculates the movement path length of the operating tool. Specifically, the correction control unit 106 calculates the length of the movement path R60 of the operating tool as shown in FIG. 4.
  • Then, the correction control unit 106 determines, as the movement information used in the operation-handling process, movement information corresponding to movement in the estimated movement direction with a length equal to the movement path length of the operating body. For example, the correction control unit 106 adopts the determined estimated movement direction as the movement direction of the operation, and the calculated movement path length as the operation amount. In other words, instead of the actual movement path R60 of the operating body, the movement path R70 shown in FIG. 3 is used for the operation-handling process.
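Putting the pieces together, the corrected movement information replaces the actual arc-shaped path (like R60) with a straight movement (like R70): the estimated direction, with the full path length as magnitude. The data layout below is an assumption for illustration:

```python
import math

def corrected_movement(points, est_direction):
    """Build movement information for a straight movement in the
    estimated direction whose length equals the actual arc length."""
    length = sum(math.dist(p, q) for p, q in zip(points, points[1:]))
    x0, y0 = points[0]
    return {
        "direction": est_direction,            # unit vector
        "amount": length,                      # full path length
        "end": (x0 + est_direction[0] * length,
                y0 + est_direction[1] * length),
    }

# A 2-unit path along x keeps its full length after correction.
info = corrected_movement([(0.0, 0.0), (1.0, 0.0), (2.0, 0.0)], (1.0, 0.0))
```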
  • The processing control unit 108 controls the processing of the information processing apparatus 100 as a whole. Specifically, the processing control unit 108 executes processing based on the movement information of the operating tool. More specifically, the processing control unit 108 performs display control processing based on the movement information of the operating tool recognized by the recognition unit 104, or on the movement information corrected by the correction control unit 106. For example, the processing control unit 108 determines the operation direction and the operation amount for the processing from the movement direction and the movement amount of the operating body. Then, the processing control unit 108 changes the projection position of the virtual object projected by the projection device 300 according to the determined operation direction and operation amount.
  • FIG. 5 is a flowchart conceptually showing an example of overall processing of the information processing apparatus 100 according to an embodiment of the present disclosure.
  • the information processing apparatus 100 starts an application (step S401). Specifically, the process control unit 108 activates an application in response to a user application activation operation recognized by the recognition unit 104. Note that the application may be automatically started.
  • the information processing apparatus 100 determines whether an end operation has been recognized (step S402). Specifically, the process control unit 108 determines whether the user operation recognized by the recognition unit 104 is an application end operation.
  • the information processing apparatus 100 determines whether the movement of the operating tool has been recognized (step S403). Specifically, the correction control unit 106 determines whether the recognition unit 104 has recognized the movement of the operating tool.
  • The information processing apparatus 100 determines whether the moving direction of the operating tool has changed (step S404). Specifically, when it is determined that the movement of the operating tool has been recognized by the recognition unit 104, the correction control unit 106 estimates the moving direction of the operating tool and determines whether the estimated moving direction changed during the movement.
  • the information processing apparatus 100 determines the estimated moving direction based on the change in the position of the operating tool (step S405). Specifically, when it is determined that the movement direction of the operating tool has changed during movement, the correction control unit 106 determines the movement direction before the change as the estimated movement direction.
  • the information processing apparatus 100 calculates the movement path length of the operating tool (step S406). Specifically, the correction control unit 106 calculates the length of the movement path estimated based on the recognized change in the position of the operating tool.
  • the information processing apparatus 100 executes movement information correction control (step S407).
  • the correction control unit 106 corrects the movement direction and the movement amount related to the movement information based on the estimated movement direction and the movement path length.
  • the information processing apparatus 100 executes processing based on the movement information (step S408). Specifically, the process control unit 108 executes projection control of the projection apparatus 300 based on the corrected movement direction and movement amount.
  • If it is determined that the end operation has been recognized (step S402 / YES), the information processing apparatus 100 ends the application (step S409) and ends the process.
  • steps S404 to S407 are collectively referred to as step S500A.
  • FIG. 6 is a flowchart conceptually showing an example of details of the movement information correction processing in the information processing apparatus 100 according to an embodiment of the present disclosure.
  • the information processing apparatus 100 stores the position of the operating tool within a predetermined period (step S501). Specifically, the correction control unit 106 causes the storage unit to store information indicating the position of the operating tool during a period corresponding to a predetermined number of frames provided from the recognition unit 104.
  • the information processing apparatus 100 calculates a tangent for the movement path of the operating body for each of the stored positions of the operating body (step S502). Specifically, the correction control unit 106 estimates the movement route based on the stored information indicating the position of the operating tool. Next, the correction control unit 106 calculates a tangent to the estimated movement path for each position of the operating tool.
  • the information processing apparatus 100 determines whether the calculated direction of the tangent has changed (step S503). Specifically, the correction control unit 106 determines whether the slope of the tangent with respect to the position of the operating tool has changed by a predetermined value or more between the positions before and after the time series.
  • The information processing apparatus 100 determines the direction of the tangent before the change as the estimated movement direction (step S504). Specifically, when the correction control unit 106 determines that the slope of the tangent has changed by a predetermined value or more, it determines, as the estimated movement direction, the direction that is parallel to the tangent at the position of the operating body before the slope changed and that matches the direction of the position change (that is, the movement direction).
  • the information processing apparatus 100 calculates the movement path length of the operating tool (step S505). Specifically, the correction control unit 106 calculates the length of the travel route estimated as described above.
  • the information processing apparatus 100 corrects the movement direction related to the movement information to the estimated movement direction (step S506). Specifically, the correction control unit 106 changes the movement direction related to the movement information to the estimated movement direction determined from the movement direction related to the estimated movement route (that is, the actual movement direction).
  • The information processing apparatus 100 corrects the movement amount related to the movement information to an amount corresponding to the calculated movement route length (step S507). Specifically, the correction control unit 106 changes the movement amount related to the movement information to the movement path length calculated from the estimated movement path.
  • Note that the processes of steps S501 to S505 are collectively referred to as step S600.
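The flow of steps S501 to S507 can be condensed into one routine, sketched below under the same illustrative assumptions as before (2D positions, central-difference tangents, a 20-degree threshold standing in for the predetermined value):

```python
import math

def correct_movement_info(positions, threshold_deg=20.0):
    """Mirror of the correction flow: compute tangents along the stored
    positions, detect a tangent-direction change, and if one occurred
    return the pre-change direction with the full path length as the
    corrected amount; return None when no correction is needed."""
    if len(positions) < 3:
        return None
    tans = []
    for i in range(len(positions)):  # central-difference tangents
        a = positions[max(i - 1, 0)]
        b = positions[min(i + 1, len(positions) - 1)]
        dx, dy = b[0] - a[0], b[1] - a[1]
        norm = math.hypot(dx, dy)
        tans.append((dx / norm, dy / norm) if norm else (0.0, 0.0))
    ref = math.atan2(tans[0][1], tans[0][0])  # direction before any change
    for tx, ty in tans[1:]:
        diff = abs(math.degrees(math.atan2(ty, tx) - ref))
        if min(diff, 360.0 - diff) > threshold_deg:
            length = sum(math.dist(p, q)
                         for p, q in zip(positions, positions[1:]))
            return {"direction": (math.cos(ref), math.sin(ref)),
                    "amount": length}
    return None

# A path that starts straight and then curves upward is corrected;
# a straight path is left untouched.
curved = correct_movement_info(
    [(0.0, 0.0), (1.0, 0.0), (2.0, 0.0), (2.5, 0.5), (2.7, 1.0)])
straight_info = correct_movement_info([(0.0, 0.0), (1.0, 0.0), (2.0, 0.0)])
```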
  • As described above, the information processing apparatus 100 according to an embodiment of the present disclosure executes processing based on the movement information of the operating tool, and controls the movement information used in the processing based on the first direction determined from the change in position during the movement of the operating tool and on the movement path length of the operating tool.
  • For this reason, the movement information used in the processing is corrected to movement information corresponding to the movement the user intended, so that even for operations in a range beyond the user's range of motion, processing can be executed in accordance with the user's intention. Further, the operation amount can be reduced by correcting it based on the movement path length. Therefore, the apparatus can be made to execute the processing intended by the user while the user's burden of operating the apparatus is reduced. As a result, the user can perform a desired operation without changing the operation method or operation posture, and can concentrate on the purpose of the operation without paying attention to how the apparatus is operated.
  • The information processing apparatus 100 determines, as the movement information used in the processing, movement information corresponding to movement in the first direction with a length equal to the movement path length. For this reason, processing can be performed based on a length equivalent to the distance over which the user actually moved the operating body. Therefore, it is possible to prevent part of the user's operation from being discarded (that is, wasted), and the operation burden can be reduced more reliably.
  • Note that the length of the movement indicated by the movement information may be shorter than the movement path length or longer than it. Further, as will be described later, the length in the movement information may be controlled based on various information.
  • the information processing apparatus 100 determines the control mode of the movement information based on the information related to the movement mode of the operating body.
  • the movement information control mode includes the presence or absence of the movement information control.
  • For example, when the movement path curves in an arc, it is likely that correction of the movement information is desired. Therefore, the accuracy or adaptability of the correction control can be improved by determining the correction control mode based on the movement mode of the operating body.
  • the movement mode of the operating body includes a change in the moving direction of the operating body.
  • the moving direction of the user operating body generally changes during the movement. Accordingly, the correction control can be effectively performed by executing the correction control when a change in the moving direction of the operating tool is detected.
  • the first direction includes a direction determined from a tangent to the position of the moving operating body. For this reason, the moving direction of the operating body can be grasped more accurately. Therefore, it is possible to execute processing according to the user's intention by correcting the movement direction according to the movement information in a direction closer to the direction intended by the user.
  • The information processing apparatus 100 may also execute correction control for operations other than the gesture of waving a hand in the air as described above. Examples in which the correction control is applied will be described with reference to FIGS. 7 and 8.
  • FIG. 7 is a diagram illustrating a first application example of the movement information correction control by the information processing apparatus 100 according to the first modification example of the embodiment of the present disclosure.
  • FIG. 7 shows an example in which the user performs an operation by moving the user's hand, which is the operating body, from the upper part of the thigh to the side part.
  • the recognition unit 104 recognizes the user's hand 50 and the movement of the hand 50.
  • the correction control unit 106 determines the presence / absence of correction based on the recognized change in the position of the hand 50. As shown in FIG. 7, since the hand 50 is moved in an arc as in the movement route R61, it is determined that correction control is performed here.
  • the correction control unit 106 determines an estimated movement direction for the hand 50 based on the change in the position of the hand 50, and calculates the length of the movement route R61. Then, the correction control unit 106 changes the movement direction related to the movement information of the operating tool to the determined estimated movement direction (that is, the direction of the movement route R71), and changes the movement amount related to the movement information to the length of the movement route R61. In other words, the correction control unit 106 changes the movement information of the operating tool from the movement information related to the movement route R61 to the movement information related to the movement route R71.
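The correction applied to the route R61 can be pictured as replacing the arc with a straight movement of equal length in the estimated direction. A minimal sketch, where the function name and return shape are assumptions:

```python
import math

def correct_movement(path, estimated_direction):
    """Replace an arc-shaped movement path with corrected movement
    information: the movement direction becomes the estimated direction,
    and the movement amount becomes the arc length of the original path."""
    arc_length = sum(math.dist(a, b) for a, b in zip(path, path[1:]))
    return estimated_direction, arc_length
```

The arc length is accumulated over successive position samples, so the corrected movement amount matches the distance the operating body actually traveled.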
  • FIG. 8 is a diagram illustrating a second application example of the movement information correction control by the information processing apparatus 100 according to the first modification example of the embodiment of the present disclosure.
  • FIG. 8 shows an example in which the user performs an operation of tracing the palm of the left hand with the finger of the right hand.
  • the recognition unit 104 recognizes the user's finger 51 and the movement of the finger 51.
  • the correction control unit 106 determines the presence or absence of correction based on the recognized change in the position of the finger 51. As shown in FIG. 8, the finger 51 is moved in an arc as in the movement path R62, and therefore it is determined here that correction control is performed.
  • the correction control unit 106 determines an estimated movement direction for the finger 51 based on the change in the position of the finger 51, and calculates the length of the movement route R62. Then, the correction control unit 106 changes the movement direction related to the movement information of the operating tool to the determined estimated movement direction (that is, the direction of the movement route R72), and changes the movement amount related to the movement information to the length of the movement route R62. In other words, the correction control unit 106 changes the movement information of the operating tool from the movement information related to the movement route R62 to the movement information related to the movement route R72.
  • FIG. 9 is a diagram illustrating a third application example of the movement information correction control by the information processing apparatus 100 according to the first modification example of the embodiment of the present disclosure.
  • FIG. 9 shows an example in which the user performs an operation of tracing the forearm from the wrist of the right arm with the finger of the left hand.
  • the recognition unit 104 recognizes the user's finger 51 and the movement of the finger 51.
  • the correction control unit 106 determines the presence or absence of correction based on the recognized change in the position of the finger 51. As shown in FIG. 9, since the finger 51 is moved in an arc as in the movement path R63, it is determined here that correction control is performed.
  • the correction control unit 106 determines an estimated movement direction for the finger 51 based on the change in the position of the finger 51, and calculates the length of the movement route R63. Then, the correction control unit 106 changes the movement direction related to the movement information of the operating tool to the determined estimated movement direction (that is, the direction of the movement route R73), and changes the movement amount related to the movement information to the length of the movement route R63. In other words, the correction control unit 106 changes the movement information of the operating tool from the movement information related to the movement route R63 to the movement information related to the movement route R73.
  • FIG. 10 is a diagram illustrating a fourth application example of the movement information correction control by the information processing apparatus 100 according to the first modification example of the embodiment of the present disclosure.
  • FIG. 10 shows an example in which the user performs an operation to move the finger pointing at the projection area from left to right toward the projection area.
  • the recognition unit 104 recognizes the user's finger 51 directed to the projection area 10 and the movement of the finger 51.
  • the correction control unit 106 determines the presence or absence of correction based on the recognized change in the position of the finger 51. As shown in FIG. 10, the finger 51 is moved in an arc on the projection area 10 like a movement path R64, and therefore it is determined that correction control is performed here.
  • the correction control unit 106 determines an estimated movement direction for the finger 51 based on the change in the position of the finger 51, and calculates the length of the movement route R64. Then, the correction control unit 106 changes the movement direction related to the movement information of the operating tool to the determined estimated movement direction (that is, the direction of the movement route R74), and changes the movement amount related to the movement information to the length of the movement route R64. In other words, the correction control unit 106 changes the movement information of the operating tool from the movement information related to the movement route R64 to the movement information related to the movement route R74.
  • the change in the position of the operating tool may be a change in position estimated from the position of the operating tool.
  • the estimated movement direction may be determined based on a change in the position on the projection region 10 indicated by the user's finger 51 as shown in FIG.
  • the estimated change in position may be a change in the position of the operation target operated by the operating tool.
  • FIG. 11 is a diagram illustrating a fifth application example of the movement information correction control by the information processing apparatus 100 according to the first modification example of the embodiment of the present disclosure.
  • FIG. 11 shows an example in which the user performs an operation of sliding the thumb of the right hand holding a device having a touch screen such as a smartphone on the touch screen.
  • the recognition unit 104 recognizes the user's finger 51 on the touch screen 30 and the movement of the finger 51.
  • the correction control unit 106 determines the presence or absence of correction based on the recognized change in the position of the finger 51. As shown in FIG. 11, since the finger 51 moves on the touch screen 30 in an arc along the movement path R65, it is determined here that correction control is to be performed.
  • the correction control unit 106 determines an estimated movement direction for the finger 51 based on the change in the position of the finger 51, and calculates the length of the movement route R65. Then, the correction control unit 106 changes the movement direction related to the movement information of the operating tool to the determined estimated movement direction (that is, the direction of the movement route R75), and changes the movement amount related to the movement information to the length of the movement route R65. In other words, the correction control unit 106 changes the movement information of the operating tool from the movement information related to the movement route R65 to the movement information related to the movement route R75. Note that the correction control process of the information processing apparatus 100 may also be applied to an apparatus in which a touch panel and a display unit are provided separately, instead of an apparatus including a touch screen.
  • the information processing apparatus 100 can be applied to various types of operations. Therefore, for various operations, it is possible to reduce a burden on operations while causing the apparatus to execute processing intended by the user.
  • the information processing apparatus 100 may determine the control mode based on the degree of change in the moving direction of the operating tool. Specifically, the correction control unit 106 determines to perform correction control when the degree of change in the moving direction of the operating tool is equal to or greater than a predetermined value. The process of this modification will be described in detail with reference to FIG. 12.
  • FIG. 12 is a diagram for describing an example of correction control in the information processing apparatus 100 according to the second modification of an embodiment of the present disclosure.
  • the recognition unit 104 recognizes the movement of the operating body. For example, the recognition unit 104 recognizes the movement of the operating body by recognizing the positions P51A to P51D of the operating body that are moved so as to draw a circle as illustrated in FIG.
  • the correction control unit 106 calculates the amount of change in the movement direction related to the movement of the recognized operating tool. For example, the correction control unit 106 calculates a tangent to the movement route R66 for each of the operating body positions P51A to P51D as shown in FIG. Then, the correction control unit 106 calculates a difference in tangent slope between the calculated tangents.
  • the correction control unit 106 determines a correction control mode based on whether the calculated change amount exceeds a predetermined value. For example, the correction control unit 106 determines that the correction control is to be executed when the difference in the slope of the tangent between the calculated tangents exceeds a predetermined value.
  • the predetermined value includes a value determined from a difference in tangent slope between tangents.
  • the predetermined value may be an average value, a mode value, a median value, or the like of the tangent slope difference between the tangent lines.
  • the predetermined value may be a value determined before the operation.
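The tangent-slope check of FIG. 12 might be sketched as follows, comparing tangent directions at successive positions P51A to P51D against a predetermined value; the threshold parameter and function name are assumptions:

```python
import math

def exceeds_change_threshold(positions, threshold):
    """Compute tangent directions along the movement route from successive
    (x, y) samples and report whether the largest change in direction
    between any two tangents exceeds the predetermined value (radians)."""
    tangents = [math.atan2(b[1] - a[1], b[0] - a[0])
                for a, b in zip(positions, positions[1:])]
    return (max(tangents) - min(tangents)) > threshold
```

A circular movement as in FIG. 12 produces steadily rotating tangents, so the spread quickly exceeds any modest threshold and correction control is triggered.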
  • the information processing apparatus 100 determines the control mode based on the degree of change in the moving direction of the operating tool. For this reason, it is possible to correct the movement information for an operation (for example, a rotation operation) in which the moving direction of the operating body changes by a predetermined degree. Therefore, it is possible to further reduce the operation burden on the user by increasing the variation of the operation to be corrected.
  • the information processing apparatus 100 may determine the movement information correction control mode using pattern matching on the movement mode of the operating tool. Specifically, the correction control unit 106 determines the movement information control mode based on a comparison between the movement mode of the operating tool and patterns of movement modes stored in advance. Examples of the movement mode for which patterns are prepared include the movement speed, acceleration, and movement path of the operating body. For example, the recognition unit 104 performs matching between the movement mode of the operating tool and a movement mode pattern stored in advance, and the correction control unit 106 determines whether to control the movement information based on the result of that matching. Furthermore, the patterns of movement modes may be obtained by machine learning: by accumulating the movement speed, acceleration, or movement path of the operating tool for an operation that is performed repeatedly, a pattern of the movement mode is derived.
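A crude version of this pattern matching could compare an observed sequence of movement speeds against stored (possibly machine-learned) patterns; the names and the mean-absolute-difference metric are assumptions for illustration:

```python
def matches_stored_pattern(observed, patterns, tolerance):
    """Return True when the observed movement mode (e.g. speed samples)
    is within the tolerance of any stored movement-mode pattern, using
    mean absolute difference over the overlapping samples."""
    def mean_abs_diff(a, b):
        n = min(len(a), len(b))
        return sum(abs(x - y) for x, y in zip(a, b)) / n
    return any(mean_abs_diff(observed, p) <= tolerance for p in patterns)
```

In practice a learned per-user model would replace the fixed tolerance, but the control flow is the same: match succeeds, correction runs.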
  • FIG. 13 is a flowchart conceptually showing an example of overall processing of the information processing apparatus 100 according to the third modification example of the embodiment of the present disclosure. Note that description of processing that is substantially the same as the processing described above is omitted.
  • If it is determined that the end operation has not been recognized after the application is started (step S411) (step S412 / NO), the information processing apparatus 100 determines whether the movement of the operating tool has been recognized (step S413).
  • the information processing apparatus 100 determines whether the movement mode of the operating tool matches a predetermined pattern (step S414). Specifically, the recognizing unit 104 performs matching with a learning pattern stored in advance for the recognized movement speed, acceleration, or movement path of the operating tool. Then, when the matching with the learning pattern is successful, the correction control unit 106 determines to execute the movement information correction process. On the other hand, if the matching with the learning pattern fails, it is determined not to execute the movement information correction process.
  • If it is determined that the movement mode of the operating body matches the predetermined pattern (step S414 / YES), the movement information correction process is executed (steps S415 to S417), and then the process is executed based on the corrected movement information (step S418).
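Steps S413 to S418 amount to: recognize the movement, run correction only when a stored pattern matches, then execute the process on whichever movement information results. A schematic sketch with injected callbacks (all hypothetical):

```python
def process_movement(movement, matches_pattern, correct, execute):
    """Steps S414-S418 in miniature: correct the movement information only
    when its mode matches a predetermined pattern, then execute the process
    based on the (possibly corrected) movement information."""
    if matches_pattern(movement):
        movement = correct(movement)
    return execute(movement)
```

The callbacks stand in for the recognition unit 104, the correction control unit 106, and the processing control unit 108 respectively.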
  • the information processing apparatus 100 determines the control mode of the movement information based on a comparison between the movement mode of the operating tool and movement mode patterns stored in advance. This improves the certainty that the movement information correction process is performed when a movement matching a pattern for which correction is desired occurs. Therefore, it becomes easy for the user to learn which patterns are corrected, and usability can be improved.
  • the movement mode of the operating body includes the speed, acceleration, or moving path of the operating body.
  • the higher the speed or acceleration of the movement of the operating body, the more likely the movement of the operation is to be distorted. Therefore, by controlling the presence or absence of the correction process based on the speed or acceleration of the movement of the operating tool, the correction process can be executed more reliably when correction is desired. Further, pattern matching on the movement path makes it possible to execute the correction process when correction is desired even more reliably than matching on the speed or acceleration of movement.
  • the pattern of the movement mode of the operating body is obtained by machine learning of the movement mode of the operating body.
  • the operation of the operation body generally differs among individual users. Therefore, a machine-learned pattern for each user is used for the determination process of whether or not correction is performed, so that a correction process suitable for each user can be provided.
  • a learning pattern may be prepared for each individual user, and a learning pattern may be prepared for each user attribute.
  • the correction control unit 106 determines the control mode of the movement information based on information related to a selection operation, by the subject of the operation using the operating tool, of the control mode of the movement information. For example, the process control unit 108 causes the display device to display options (hereinafter also referred to as option objects) for selecting whether or not to correct the movement information. Then, the correction control unit 106 determines whether or not to correct the movement information based on recognition of an operation for selecting a displayed option. The process of this modification will be described in detail with reference to FIG. 14.
  • FIG. 14 is a diagram illustrating an example of options for whether or not to correct the movement information whose display is controlled by the information processing apparatus 100 according to the fourth modification example of the embodiment of the present disclosure.
  • the correction control unit 106 determines whether or not the movement information of the operating tool is corrected. For example, the correction control unit 106 determines the presence / absence of correction based on the movement mode of the operating tool as described above.
  • the correction control unit 106 causes the processing control unit 108 to display the option objects. For example, when the operation object 21 projected at a position on the projection area corresponding to the position of the operating body is moved along the movement path R67 as shown in FIG. 14, the correction control unit 106 determines that the correction process is to be executed. Then, the processing control unit 108 causes the projection device 300 to project the option object 40A for selecting correction and the option object 40B for selecting no correction.
  • the position where the option objects 40A and 40B are projected may be a position adjacent to the projection position of the operation object 21.
  • the option objects 40A and 40B are projected at positions adjacent to the projected operation object 21, on an extension line of the movement path of the operation object 21.
  • the correction control unit 106 controls whether or not the correction process is executed based on the selection result of the option object. For example, when the recognition unit 104 recognizes an operation for selecting the option object 40A, the correction control unit 106 executes the correction process. Conversely, when an operation for selecting the option object 40B is recognized, the correction process is not executed.
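The branch on the selection result in FIG. 14 reduces to choosing between corrected and uncorrected movement information; the option identifiers 40A/40B follow the figure, the rest is an assumed sketch:

```python
def select_movement_info(before_correction, after_correction, selected):
    """Use the user's choice between option objects to decide which
    movement information drives the subsequent process: 40A selects the
    corrected information, 40B the uncorrected information."""
    return after_correction if selected == "40A" else before_correction
```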
  • the option object is selected by an operating body (for example, the left hand) different from the operating body (for example, the right hand) that is operating the operation object. Further, the option object may be selected according to the number of fingers operating the operation object.
  • whether or not the movement information is corrected may also be selected by another method. For example, an option may be selected by voice input, or based on the line of sight of the user who performs the operation.
  • the displayed option object may be a preview showing the corrected movement information.
  • FIG. 15 is a diagram for describing an example in which a preview indicating movement information after correction is displayed as an option object in the information processing apparatus 100 according to the fourth modification of an embodiment of the present disclosure.
  • the correction control unit 106 calculates the movement direction related to the corrected movement information and the movement direction related to the movement information before correction. For example, the correction control unit 106 calculates the estimated movement direction based on the movement information before correction. Further, the correction control unit 106 calculates a movement direction (hereinafter also referred to as the latest movement direction) estimated from the positions, among the time-series positions related to the movement information before correction, that follow the point at which the movement direction is determined to have changed.
  • the process control unit 108 causes the projection apparatus 300 to project the option object indicating the movement direction related to the calculated corrected movement information and the option object indicating the movement direction related to the calculated movement information before correction.
  • the processing control unit 108 causes the projection device 300 to project the option object 41A indicating the estimated movement direction and the option object 41B indicating the latest movement direction calculated by the correction control unit 106.
  • FIG. 16 is a diagram illustrating an example of the choice object mode control in the information processing apparatus 100 according to the fourth modification example of the embodiment of the present disclosure.
  • the correction control unit 106 determines whether or not the movement information of the operation tool is corrected based on the movement mode of the operation tool, and calculates an estimated correction proposal level.
  • the proposal level of correction may be an index indicating the appropriateness (in other words, reliability) of the correction to be performed with respect to the correction estimated to be desired by the user.
  • the correction control unit 106 calculates the correction proposal level according to the degree of change in the movement direction. For example, when the movement path of the operating body is a meandering movement path R68 as shown in FIG. 16, the degree of change in the movement direction is small, so the correction control unit 106 calculates the proposal level to be lower than in the case where the movement direction changes without meandering. Note that the proposal level of correction may be calculated by an external device.
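One way to realize a proposal level consistent with this description is to score the net change between the first and the latest tangent direction, so a meandering path whose small turns cancel out scores low while a decisive turn scores high; the normalization to [0, 1] is an assumption:

```python
import math

def correction_proposal_level(positions):
    """Score the degree of change in the moving direction as the absolute
    difference between the first and the latest tangent direction of the
    movement path, normalized by pi so the level lies in [0, 1]."""
    tangents = [math.atan2(b[1] - a[1], b[0] - a[0])
                for a, b in zip(positions, positions[1:])]
    return min(abs(tangents[-1] - tangents[0]) / math.pi, 1.0)
```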
  • the correction control unit 106 calculates the movement direction related to the corrected movement information and the movement direction related to the movement information before correction.
  • the processing control unit 108 causes the projection apparatus 300 to project the option object indicating the movement direction related to the corrected movement information and the option object indicating the movement direction related to the movement information before correction in a manner corresponding to the proposal level.
  • the processing control unit 108 causes the projection device 300 to project the option object 42A indicating the estimated movement direction whose mode has been changed according to the proposal level and the option object 42B indicating the latest movement direction.
  • the mode of the option object to be controlled may be another visual mode such as the color, shading, or luminance of the option object.
  • the information processing apparatus 100 determines the movement information control mode based on the information related to the selection operation of the movement information control mode performed by the subject of the operation using the operating tool.
  • When the user desires to input the operation targeted by the correction control as it is, correcting the movement information may cause processing that does not conform to the user's intention to be executed.
  • By letting the user select whether the movement information is corrected, the correction process can be controlled according to the user's intention. Therefore, it is possible to prevent processing based on movement information corrected against the user's intention from being executed.
  • the selection operation includes an operation for selecting a displayed option. For this reason, when the option is visually presented to the user, the possibility that the user misses the option or erroneously selects the option can be suppressed. Note that the options may be presented to the user by voice.
  • the displayed options include a preview showing movement information after control. For this reason, it is possible to present to the user material for determining whether or not the movement information should be corrected by the apparatus. Therefore, the user can more accurately determine whether or not the movement information is corrected.
  • the information processing apparatus 100 controls the mode of the displayed options based on the evaluation information regarding the control of the movement information before execution. Since the strength of the recommendation is thus presented to the user, the user has more material for determining whether the apparatus should correct the movement information. Therefore, the user can more accurately determine whether or not the movement information should be corrected.
  • the movement information correction mode may be controlled based on other information.
  • the correction control unit 106 determines the control mode of the movement information based on the attribute information or mode information to be operated.
  • the operation target includes an application or a displayed virtual object.
  • the operation target attribute information includes the type, format, or identifier of the operation target.
  • the correction control unit 106 determines the degree of correction (for example, a correction parameter) of the movement information of the operating tool according to the type of application operated by the operating tool.
  • the operation target aspect information includes the size, shape, or movement speed of the operation target.
  • the correction control unit 106 determines the degree of correction of the movement information of the operating tool according to the size of the virtual object operated by the operating tool.
  • the correction control unit 106 determines the control mode of the movement information based on the mode information of the place where the operation is performed.
  • the form information of the place where the operation is performed includes the shape, texture or moisture of the place where the operation is performed.
  • the correction control unit 106 determines the degree of correction of the movement information of the operating tool according to the undulation of the surface on which the operation is performed.
  • the correction control unit 106 determines the control mode of the movement information based on the attribute information or mode information of the operation subject.
  • the attribute information of the subject of operation includes the age, sex or health status of the user who operates the operating tool. For example, the correction control unit 106 increases the correction amount of the movement information of the operating tool as the user is older.
  • the mode information of the subject of the operation includes the physique, position, posture or action of the user who operates the operating tool. For example, the correction control unit 106 decreases the correction amount of the movement information of the operating tool as the length of the user's arm is longer.
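The direction of these adjustments (more correction for older users, less for longer arms) can be captured in a toy scaler; the specific coefficients below are invented for illustration and are not values from the disclosure:

```python
def scale_correction(base_amount, user_age=None, arm_length_cm=None):
    """Scale a base correction amount by operating-subject information:
    increase it with the user's age and decrease it with arm length.
    Both coefficients are illustrative assumptions."""
    amount = base_amount
    if user_age is not None:
        amount *= 1.0 + 0.01 * user_age  # older user: larger correction
    if arm_length_cm is not None:
        # longer arm: smaller correction (60 cm taken as a reference length)
        amount *= 60.0 / max(arm_length_cm, 60.0)
    return amount
```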
  • FIG. 17 is a flowchart conceptually showing a detailed example of the movement information correction process (step S500B) in the information processing apparatus 100 according to the fifth modification example of the embodiment of the present disclosure. Note that description of processing that is substantially the same as the processing described above is omitted.
  • the information processing apparatus 100 executes correction preparation processing (step S600). Specifically, the processes of steps S501 to S505 described above are executed.
  • the information processing apparatus 100 determines whether attribute information or aspect information to be operated has been acquired (step S521). Specifically, the correction control unit 106 acquires attribute information from the application. The recognizing unit 104 recognizes the aspect of the virtual object and provides the aspect information to the correction control unit 106.
  • the information processing apparatus 100 determines a correction amount according to the attribute or aspect of the operation target (step S522). Specifically, the correction control unit 106 calculates a correction amount corresponding to the attribute indicated by the acquired attribute information or the mode indicated by the mode information.
  • the information processing apparatus 100 determines whether or not the operation location mode information has been acquired (step S523). Specifically, the recognizing unit 104 recognizes an operation surface on which the user moves the operation body, and provides the correction control unit 106 with mode information related to the mode of the recognized operation surface.
  • the information processing apparatus 100 determines a correction amount according to the operation location mode (step S524). Specifically, the correction control unit 106 calculates a correction amount to a degree according to the mode indicated by the acquired mode information of the operation place.
  • the information processing apparatus 100 determines whether the attribute information or the mode information of the operation subject has been acquired (step S525). Specifically, the recognition unit 104 recognizes the user who performs the operation, and provides the correction control unit 106 with information identifying the recognized user or mode information related to the recognized user's mode. Then, the correction control unit 106 acquires the attribute information of the user from a storage unit.
  • the information processing apparatus 100 determines a correction amount according to the attribute or aspect of the operating subject (step S526). Specifically, the correction control unit 106 calculates a correction amount corresponding to the attribute indicated by the acquired user attribute information or the mode indicated by the mode information.
  • the information processing apparatus 100 determines the correction amount according to the standard (step S527). Specifically, the correction control unit 106 sets the correction amount to an initial value.
  • the information processing apparatus 100 corrects the movement direction related to the movement information based on the correction amount and the estimated movement direction (step S528). Specifically, the correction control unit 106 changes the movement direction related to the movement information to a direction obtained by applying the calculated correction amount to the estimated movement direction determined from the estimated movement route.
  • the information processing apparatus 100 corrects the movement amount related to the movement information based on the correction amount and the calculated movement route length (step S529). Specifically, the correction control unit 106 changes the movement amount related to the movement information to a length obtained by applying the calculated correction amount to the movement route length calculated from the length of the estimated movement route in the specific direction.
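Reading steps S521 to S527 as a first-available cascade (the disclosure could equally combine the factors), the correction amount falls back to a standard initial value when no information is acquired; the function and parameter names are assumptions:

```python
def decide_correction_amount(target_amount, place_amount, subject_amount,
                             standard=0.0):
    """Pick the correction amount derived from the operation target, the
    operation place, or the operating subject, in that order, falling back
    to the standard (initial) value when none was acquired (None)."""
    for amount in (target_amount, place_amount, subject_amount):
        if amount is not None:
            return amount
    return standard
```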
  • the information processing apparatus 100 determines the control mode of the movement information based on the attribute information or the mode information of the operation target.
  • the operation method or the ease of operation differs depending on the operation target. Therefore, by controlling the presence or degree of correction according to the operation target, the correction processing can be matched to whether correction is actually desired, that is, to user needs.
  • the information processing apparatus 100 determines the control mode of the movement information based on the mode information of the place where the operation is performed. In general, the ease of operation differs depending on the operation location. Accordingly, by controlling the presence or degree of correction according to the operation location, the correction processing can likewise be matched to whether correction is desired, that is, to user needs.
  • the information processing apparatus 100 determines the control mode of the movement information based on the attribute information or mode information of the operation subject.
  • the operation method or the range of operation differs depending on the operation subject. Therefore, by controlling the presence or degree of correction according to the operation subject, the correction processing can be matched to whether correction is desired, that is, to user needs.
  • the movement information control mode includes the degree of movement information control. For this reason, more precise correction control can be performed by controlling not only the presence or absence of correction but also its degree, which makes it possible to respond to finer-grained user needs.
  • the correction of the movement information may be canceled.
  • the correction control unit 106 controls cancellation of movement information control in accordance with the presence or absence of a predetermined operation (hereinafter also referred to as a cancel operation) after the start of movement information control. More specifically, when the cancel operation is recognized, the correction control unit 106 changes the movement information used for the operation handling process back to the movement information before the control. Examples of the cancel operation include shaking the hand, stopping the operation for a predetermined time, and increasing or decreasing the number of fingers.
  • for example, after the movement information is corrected and a virtual object or the like is moved in the direction related to the corrected movement information according to the user's operation, when the user performs a cancel operation, the correction control unit 106 moves the virtual object to the position it would have reached had it been moved, in accordance with the operation, in the direction related to the movement information before correction.
  • FIG. 18 is a flowchart conceptually showing an example of overall processing including movement information correction cancellation processing in the information processing apparatus 100 according to the sixth modification of an embodiment of the present disclosure. Note that description of processing that is substantially the same as the processing described above is omitted.
  • if it is determined that the end operation is not recognized after the application is started (step S431) (step S432 / NO), the information processing apparatus 100 determines whether the movement of the operating tool is recognized (step S433).
  • if it is determined that the movement of the operating tool has been recognized (step S433 / YES), the information processing apparatus 100 determines whether the moving direction of the operating tool has changed (step S434).
  • if it is determined that the moving direction of the operating tool has changed (step S434 / YES), the movement information correction process is executed (steps S435 to S437), and processing is then executed based on the corrected movement information (step S438).
  • the information processing apparatus 100 determines whether a cancel operation for correcting the movement information has been recognized (step S439). Specifically, the correction control unit 106 determines whether the cancel operation has been recognized by the recognition unit 104 after the process is executed or during the process based on the corrected movement information.
  • the information processing apparatus 100 changes the movement information to the movement information before correction (step S440). Specifically, when the cancel operation is recognized, the correction control unit 106 causes the processing control unit 108 to stop or undo the processing based on the corrected movement information, and provides the processing control unit 108 with the movement information before correction.
  • the information processing apparatus 100 executes processing based on the changed movement information (step S441). Specifically, the processing control unit 108 stops or undoes the processing based on the corrected movement information and executes processing based on the movement information before correction.
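The revert path of the flow above (keep the pre-correction movement information so step S440 can restore it) can be sketched as follows. This is a hypothetical illustration: the class and method names, the dictionary keys, and the placeholder correction rule are assumptions, not taken from the publication.

```python
class CorrectionController:
    """Sketch of the cancel flow (steps S438-S441): retain the
    movement information before control so that a recognized cancel
    operation can revert processing to it."""

    def __init__(self):
        self._pre_correction = None  # movement information before control

    def correct(self, movement):
        # Keep a copy of the pre-correction movement information (for S440).
        self._pre_correction = dict(movement)
        # Placeholder correction: snap the direction to the estimated one.
        corrected = dict(movement)
        corrected["direction_deg"] = movement.get("estimated_direction_deg",
                                                  movement["direction_deg"])
        return corrected

    def on_cancel(self):
        # Step S440: change the movement information back to the
        # movement information before correction.
        return self._pre_correction
```

The processing control unit would then stop or undo the processing based on the corrected movement and re-execute it with the returned pre-correction movement (step S441).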
  • the information processing apparatus 100 controls cancellation of movement information control according to the presence or absence of a predetermined operation after the start of movement information control. For this reason, even when movement information is corrected contrary to the user's intention, execution of processing based on the corrected movement information can be suppressed. Therefore, usability can be improved.
  • the cancellation of the control of the movement information includes changing the movement information used for the processing back to the movement information before the control. For this reason, not only is the correction canceled, but processing based on the pre-correction movement information is automatically executed, so the user's operation is not wasted. Therefore, the burden on the user can be further reduced.
  • the correction of the movement information may be canceled together with the operation itself.
  • in that case, the processing control unit 108 stops or undoes the processing based on the corrected movement information and does not execute an alternative process.
  • the determination of the movement information control mode may be performed before the operation is started.
  • the correction control unit 106 determines a correction mode based on information related to the operation that is determined before the operation is started. More specifically, the correction control unit 106 determines the correction mode based on the determined operation subject, operation location, or operation target. For example, when the operation target is determined to be a map application or an application that displays products in a three-dimensional space, the correction control unit 106 determines not to perform correction automatically.
  • the determination of the movement information control mode may be performed at the start of the operation.
  • the correction control unit 106 determines a correction mode based on information related to the operation determined at the start of the operation. For example, by touching a slide bar that can be moved only in a one-dimensional direction or a seek bar of a player that plays video or music, these virtual objects are determined as operation targets. When these virtual objects are determined as operation targets, the correction control unit 106 determines to perform correction. Note that whether or not correction is actually performed may be selected by the user as described above.
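The target-dependent decision described above can be sketched as a simple lookup. This is a hypothetical illustration: the target identifiers and function name are assumptions; the publication only gives the examples of a slide bar / seek bar (correct) and a map or 3D product-display application (no automatic correction), and notes that the user may still override the choice.

```python
# Hypothetical target identifiers (not from the publication).
ONE_DIMENSIONAL_TARGETS = {"slide_bar", "seek_bar"}
FREE_MOVEMENT_TARGETS = {"map_application", "3d_product_view"}

def decide_correction(operation_target):
    """Decide the correction mode once the operation target is determined."""
    if operation_target in ONE_DIMENSIONAL_TARGETS:
        return True   # movement constrained to one dimension: correct
    if operation_target in FREE_MOVEMENT_TARGETS:
        return False  # free 2D/3D movement: no automatic correction
    return False      # default assumption: leave movement uncorrected
```

The default branch is my own assumption; a real implementation could instead prompt the user, as the selection-operation variants in this publication describe.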
  • the determination timing of the movement information control mode includes the time before the operation starts or the time at which the operation starts. For this reason, the processing result based on the corrected movement information (for example, movement of a virtual object) can be presented to the user from the beginning of the operation. Therefore, the user can perform the operation while confirming the processing result based on the corrected movement information.
  • FIG. 19 is an explanatory diagram illustrating a hardware configuration of the information processing apparatus 100 according to an embodiment of the present disclosure.
  • the information processing apparatus 100 includes a processor 132, a memory 134, a bridge 136, a bus 138, an interface 140, an input device 142, an output device 144, a storage device 146, a drive 148, a connection port 150, and a communication device 152.
  • the processor 132 functions as an arithmetic processing unit, and realizes the functions of the recognition unit 104, the correction control unit 106, and the processing control unit 108 in the information processing apparatus 100 in cooperation with various programs.
  • the processor 132 operates various logical functions of the information processing apparatus 100 by executing a program stored in the memory 134 or another storage medium using the control circuit.
  • the processor 132 may be a central processing unit (CPU), a graphics processing unit (GPU), a digital signal processor (DSP), or a system-on-a-chip (SoC).
  • the memory 134 stores a program used by the processor 132 or an operation parameter.
  • the memory 134 includes a RAM (Random Access Memory), and temporarily stores a program used in the execution of the processor 132 or a parameter that changes as appropriate in the execution.
  • the memory 134 includes a ROM (Read Only Memory), and the function of the storage unit is realized by the RAM and the ROM. Note that an external storage device may be used as part of the memory 134 via the connection port 150 or the communication device 152.
  • the processor 132 and the memory 134 are connected to each other by an internal bus including a CPU bus or the like.
  • the bridge 136 connects the buses. Specifically, the bridge 136 connects an internal bus to which the processor 132 and the memory 134 are connected and a bus 138 to be connected to the interface 140.
  • the input device 142 is used for a user to operate the information processing apparatus 100 or input information to the information processing apparatus 100.
  • the input device 142 includes input means for the user to input information and an input control circuit that generates an input signal based on the user's input and outputs the input signal to the processor 132.
  • the input means may be a mouse, keyboard, touch panel, switch, lever, microphone, or the like.
  • a user of the information processing apparatus 100 can input various data or instruct a processing operation to the information processing apparatus 100 by operating the input device 142.
  • the output device 144 is used to notify the user of information.
  • the output device 144 may be a liquid crystal display (LCD) device, an OLED (Organic Light Emitting Diode) device, a projector, a speaker, a headphone, or the like, or a module that outputs to the device.
  • the input device 142 or the output device 144 may include an input / output device.
  • the input / output device may be a touch screen.
  • the storage device 146 is a device for storing data.
  • the storage device 146 may include a storage medium, a recording device that records data on the storage medium, a reading device that reads data from the storage medium, a deletion device that deletes data recorded on the storage medium, and the like.
  • the storage device 146 stores programs executed by the processor 132 and various data.
  • the drive 148 is a storage medium reader / writer, and is built in or externally attached to the information processing apparatus 100.
  • the drive 148 reads information stored in a mounted removable storage medium such as a magnetic disk, an optical disk, a magneto-optical disk, or a semiconductor memory, and outputs the information to the memory 134.
  • the drive 148 can also write information on a removable storage medium.
  • the connection port 150 is a port for directly connecting a device to the information processing apparatus 100.
  • the connection port 150 may be a USB (Universal Serial Bus) port, an IEEE 1394 port, a SCSI (Small Computer System Interface) port, or the like.
  • the connection port 150 may be an RS-232C port, an optical audio terminal, an HDMI (registered trademark) (High-Definition Multimedia Interface) port, or the like. Data may be exchanged between the information processing apparatus 100 and the external device by connecting the external device to the connection port 150.
  • the communication device 152 mediates communication between the information processing device 100 and the external device, and realizes the function of the communication unit 102. Specifically, the communication device 152 executes communication according to a wireless communication method or a wired communication method. For example, the communication device 152 performs wireless communication according to a cellular communication method such as WCDMA (registered trademark) (Wideband Code Division Multiple Access), WiMAX (registered trademark), LTE (Long Term Evolution), or LTE-A.
  • the communication device 152 may execute wireless communication according to an arbitrary wireless communication method, such as a short-range wireless communication method (for example, Bluetooth (registered trademark), NFC (Near Field Communication), wireless USB, or TransferJet (registered trademark)) or a wireless LAN (Local Area Network) method (for example, Wi-Fi (registered trademark)).
  • the communication device 152 may execute wired communication such as signal line communication or wired LAN communication.
  • the information processing apparatus 100 may not have a part of the configuration described with reference to FIG. 19 or may have any additional configuration.
  • a one-chip information processing module in which all or part of the configuration described with reference to FIG. 19 is integrated may be provided.
  • the movement information used in the processing is corrected to movement information related to the movement of the operating tool intended by the user, so that processing in accordance with the user's intention can be executed even for an operation that would exceed the user's range of motion. Further, the amount of operation can be reduced by correcting the operation amount based on the movement path length. Therefore, the apparatus can be made to execute the processing intended by the user while the user's burden of operating the apparatus is reduced. As a result, the user can perform a desired operation without changing the operation method or the operation posture of the operating tool, and can concentrate on the purpose of the operation without paying attention to how the apparatus is operated.
  • in the above description, the information processing apparatus 100 is operated by a single user, but the present technology is not limited to such an example.
  • the movement information correction control mode is determined and managed for each of a plurality of users. Further, the movement information correction control mode may be collectively managed for a plurality of users.
  • the operation target may be displayed by another display device.
  • the operation target may be displayed by a display device such as a tablet terminal, a smartphone, or a stationary display.
  • the operation target may be displayed by a HUD (Head Up Display) that transmits light from the external environment while displaying an image on a display unit or projecting image light onto the user's eyes, or by an HMD (Head Mount Display) or the like that displays a captured external image together with the image.
  • the position of the operating body is recognized by using a touch sensor (for example, a capacitive or pressure-sensitive sensor) integrated with the display unit.
  • recognition of the position of the operating tool is realized using analysis processing of an image obtained by imaging by the imaging device.
  • each result of processing based on movement information corrected with one of a plurality of correction parameters is displayed as a preview, and by selecting one of the previewed results, the user selects which correction parameter is used.
  • the information processing system may be a server client type system or a cloud service type system.
  • the information processing apparatus 100 may be a server installed at a place different from the place where the measurement apparatus 200 and the projection apparatus 300 are installed.
  • the information processing system may be applied to other cases.
  • the information processing apparatus 100 may receive, as an operating body, the hand of a user performing product assembly work, and may automatically correct the operation based on the input or propose a correction of the operation.
  • (1) An information processing apparatus comprising:
  • a processing unit that executes processing based on movement information of an operating body; and
  • a control unit that controls movement information used in the processing based on a first direction determined based on a change in position in the movement of the operating body and a movement path length of the operating body.
  • (2) The control unit determines movement information related to movement in the first direction having a length corresponding to the movement path length as movement information used in the processing.
  • the control unit determines a control mode of the movement information based on information related to a movement mode of the operating body.
  • the information processing apparatus according to (1) or (2).
  • the movement mode of the operating body includes a change in the moving direction of the operating body.
  • the information processing apparatus determines a control mode of the movement information based on a comparison between a movement mode of the operation body and a pattern of the movement mode of the operation body stored in advance.
  • the information processing apparatus according to (3) or (4).
  • the movement mode of the operation body includes a speed of movement, acceleration, or a movement path of the operation body.
  • the information processing apparatus according to (5).
  • the movement mode pattern of the operation body is obtained by machine learning of the movement mode of the operation body.
  • the information processing apparatus according to (5) or (6).
  • the control unit determines a control mode of the movement information based on information related to a selection operation of the control mode of the movement information by a subject of the operation using the operation body;
  • the information processing apparatus according to any one of (1) to (7).
  • the selection operation includes an operation of selecting a displayed option.
  • the displayed options include a preview showing movement information after control,
  • the control unit controls an aspect of the displayed option based on evaluation information about control of the movement information before execution.
  • (12) The control unit determines a control mode of the movement information based on attribute information or mode information of an operation target by the operating body.
  • the information processing apparatus according to any one of (1) to (11).
  • the control unit determines a control mode of the movement information based on mode information of a place where an operation by the operating body is performed.
  • the information processing apparatus according to any one of (1) to (12).
  • the control unit determines a control mode of the movement information based on attribute information or mode information of an operation subject by the operating body.
  • the information processing apparatus according to any one of (1) to (13).
  • the movement information control mode includes the presence or absence of control of the movement information, or the degree of the control.
  • the control unit controls cancellation of the control of the movement information according to the presence or absence of a predetermined operation after the start of the control of the movement information;
  • the information processing apparatus according to any one of (1) to (15).
  • Canceling control of the movement information includes changing movement information used for the processing to movement information before control.
  • the first direction includes a direction determined from a tangent to the position of the operating body that moves.
  • (19) An information processing method including: executing, using a processor, processing based on movement information of an operating body; and controlling movement information used in the processing based on a first direction determined based on a change in position in the movement of the operating body and a movement path length of the operating body.
  • (20) A program for causing a computer to realize: a processing function for executing processing based on movement information of an operating body; and a control function for controlling movement information used in the processing based on a first direction determined based on a change in position in the movement of the operating body and a movement path length of the operating body.

Abstract

[Problem] To provide a device with which a user can cause the intended processing to be executed in the device, and the burden on the user related to operating the device can be reduced. [Solution] Provided is an information processing device which is provided with: a processing unit which executes processing on the basis of movement information related to an operation body; and a control unit which, on the basis of a first direction determined on the basis of the positional change in the movement of the operation body, and the movement path length of the operation body, controls the movement information used in the processing. Also provided is an information processing method which includes: a step in which a processor is used to execute processing on the basis of movement information related to an operation body; and a step in which the movement information used in the processing is controlled on the basis of a first direction determined on the basis of the positional change in the movement of the operation body, and the movement path length of the operation body. Furthermore, provided is a program for causing a computer to implement the function of the information processing device.

Description

Information processing apparatus, information processing method, and program
 This disclosure relates to an information processing apparatus, an information processing method, and a program.
 In recent years, with the development of information processing technology, various technologies have been researched and developed concerning input for operating devices. Specifically, there are techniques related to input in a two-dimensional space using a touch screen or the like, and techniques related to input in a three-dimensional space based on recognized user gestures.
 Here, an operation intended by the user may not be recognized by the device. For example, if a detected gesture does not match the gesture associated with a process, the detected gesture is not recognized as an operation. As a result, the processing desired by the user is not executed.
 In contrast, Patent Literature 1 discloses an invention relating to an information processing terminal that, when a gesture is detected and the detected gesture cannot be recognized, provides the user with an interface for associating the detected gesture with a process. According to that invention, a gesture that could not be recognized can be corrected into a recognizable gesture.
JP 2012-256099 A
 However, it is desired that the burden of operating the apparatus be further reduced. For example, in Patent Literature 1, because the user associates each gesture with the process corresponding to it, the process intended by the user can be reliably executed based on the gesture. On the other hand, since the user must perform the association, a burden arises in making the apparatus recognize gestures.
 Therefore, the present disclosure proposes a mechanism capable of causing the apparatus to execute processing intended by the user while reducing the burden on the user of operating the apparatus.
 According to the present disclosure, there is provided an information processing apparatus including: a processing unit that executes processing based on movement information of an operating body; and a control unit that controls the movement information used in the processing based on a first direction determined based on a change in position in the movement of the operating body and a movement path length of the operating body.
 In addition, according to the present disclosure, there is provided an information processing method including: executing, using a processor, processing based on movement information of an operating body; and controlling the movement information used in the processing based on a first direction determined based on a change in position in the movement of the operating body and a movement path length of the operating body.
 Furthermore, according to the present disclosure, there is provided a program for causing a computer to realize: a processing function for executing processing based on movement information of an operating body; and a control function for controlling the movement information used in the processing based on a first direction determined based on a change in position in the movement of the operating body and a movement path length of the operating body.
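The "first direction" determined from a change in position, and the movement path length, can be sketched as follows. This is a hypothetical illustration (the function name and sampling assumptions are my own): the direction is approximated by the most recent displacement between sampled positions, in the spirit of the tangent mentioned in item (18) above, and the path length is the cumulative distance along the samples.

```python
import math

def first_direction_and_path_length(positions):
    """From sampled (x, y) positions of the operating body, derive:
    - a first direction from the change in position, approximated here
      by the direction of the most recent displacement (a tangent to
      the movement), and
    - the movement path length, i.e. the cumulative distance along
      the sampled positions.
    """
    path_length = 0.0
    for (x0, y0), (x1, y1) in zip(positions, positions[1:]):
        path_length += math.hypot(x1 - x0, y1 - y0)
    (xa, ya), (xb, yb) = positions[-2], positions[-1]
    first_direction_rad = math.atan2(yb - ya, xb - xa)
    return first_direction_rad, path_length
```

Movement information related to movement in this first direction, with a length corresponding to the path length, would then be used in the processing, per item (2).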
 As described above, according to the present disclosure, a mechanism is provided that can cause the apparatus to execute the processing intended by the user while reducing the user's burden of operating the apparatus. Note that the above effects are not necessarily limiting; together with or in place of the above effects, any of the effects shown in this specification, or other effects that can be grasped from this specification, may be achieved.
FIG. 1 is a diagram illustrating a configuration example of an information processing system according to an embodiment of the present disclosure.
FIG. 2 is a block diagram schematically illustrating an example of a functional configuration of an information processing apparatus according to an embodiment of the present disclosure.
FIG. 3 is a diagram for describing movement information correction processing in the information processing apparatus according to an embodiment of the present disclosure.
FIG. 4 is a diagram illustrating an example of a change in the position of a moving operating body in the information processing apparatus according to an embodiment of the present disclosure.
FIG. 5 is a flowchart conceptually showing an example of overall processing of the information processing apparatus according to an embodiment of the present disclosure.
FIG. 6 is a flowchart conceptually showing an example of details of movement information correction processing in the information processing apparatus according to an embodiment of the present disclosure.
FIG. 7 is a diagram illustrating a first application example of movement information correction control by the information processing apparatus according to a first modification of an embodiment of the present disclosure.
FIG. 8 is a diagram illustrating a second application example of movement information correction control according to the first modification.
FIG. 9 is a diagram illustrating a third application example of movement information correction control according to the first modification.
FIG. 10 is a diagram illustrating a fourth application example of movement information correction control according to the first modification.
FIG. 11 is a diagram illustrating a fifth application example of movement information correction control according to the first modification.
FIG. 12 is a diagram for describing an example of correction control in the information processing apparatus according to a second modification of an embodiment of the present disclosure.
FIG. 13 is a flowchart conceptually showing an example of overall processing of the information processing apparatus according to a third modification of an embodiment of the present disclosure.
FIG. 14 is a diagram illustrating an example of options for the presence or absence of movement information correction whose display is controlled by the information processing apparatus according to a fourth modification of an embodiment of the present disclosure.
FIG. 15 is a diagram for describing an example in which a preview showing corrected movement information is displayed as an option object in the information processing apparatus according to the fourth modification.
FIG. 16 is a diagram illustrating an example of aspect control of option objects in the information processing apparatus according to the fourth modification.
FIG. 17 is a flowchart conceptually showing an example of details of movement information correction processing in the information processing apparatus according to a fifth modification of an embodiment of the present disclosure.
FIG. 18 is a flowchart conceptually showing an example of overall processing including movement information correction cancellation processing in the information processing apparatus according to a sixth modification of an embodiment of the present disclosure.
FIG. 19 is an explanatory diagram illustrating a hardware configuration of the information processing apparatus according to an embodiment of the present disclosure.
 Hereinafter, preferred embodiments of the present disclosure will be described in detail with reference to the accompanying drawings. Note that, in this specification and the drawings, components having substantially the same functional configuration are denoted by the same reference numerals, and redundant description thereof is omitted.
 なお、説明は以下の順序で行うものとする。
 1.本開示の一実施形態
  1.1.システム構成
  1.2.装置の構成
  1.3.装置の処理
  1.4.本開示の一実施形態のまとめ
  1.5.変形例
 2.本開示の一実施形態に係る情報処理装置のハードウェア構成
 3.むすび
The description will be given in the following order.
1. One embodiment of the present disclosure
 1.1. System configuration
 1.2. Configuration of the apparatus
 1.3. Processing of the apparatus
 1.4. Summary of the embodiment of the present disclosure
 1.5. Modifications
2. Hardware configuration of the information processing apparatus according to an embodiment of the present disclosure
3. Conclusion
 <1.本開示の一実施形態>
 本開示の一実施形態に係る情報処理システムおよび当該情報処理システムを実現するための情報処理装置について説明する。
<1. One Embodiment of the Present Disclosure>
An information processing system according to an embodiment of the present disclosure and an information processing apparatus for realizing the information processing system will be described.
  <1.1.システム構成>
 まず、図1を参照して、本開示の一実施形態に係る情報処理システムの構成について説明する。図1は、本開示の一実施形態に係る情報処理システムの構成例を示す図である。
<1.1. System configuration>
First, a configuration of an information processing system according to an embodiment of the present disclosure will be described with reference to FIG. FIG. 1 is a diagram illustrating a configuration example of an information processing system according to an embodiment of the present disclosure.
 図1に示したように、情報処理システムは、情報処理装置100、測定装置200および投影装置300を備える。情報処理装置100と測定装置200および投影装置300との間はそれぞれ通信接続される。 As shown in FIG. 1, the information processing system includes an information processing apparatus 100, a measurement apparatus 200, and a projection apparatus 300. The information processing apparatus 100, the measurement apparatus 200, and the projection apparatus 300 are connected for communication.
 情報処理装置100は、測定装置200の測定結果を用いて投影装置300の投影を制御する。具体的には、情報処理装置100は、測定装置200から提供される測定結果から操作体を認識する。そして、情報処理装置100は、認識された操作体についての移動情報に基づいて投影装置300による投影の態様を制御する。例えば、情報処理装置100は、測定装置200により測定されるユーザの手の移動に応じて投影装置300に投影させる仮想オブジェクト20の投影位置などを制御する。 The information processing apparatus 100 controls the projection of the projection apparatus 300 using the measurement result of the measurement apparatus 200. Specifically, the information processing apparatus 100 recognizes the operating tool from the measurement result provided from the measurement apparatus 200. Then, the information processing apparatus 100 controls the mode of projection by the projection apparatus 300 based on the movement information about the recognized operating body. For example, the information processing apparatus 100 controls the projection position of the virtual object 20 to be projected on the projection apparatus 300 according to the movement of the user's hand measured by the measurement apparatus 200.
 測定装置200は、測定装置200の周辺の状況を測定する。具体的には、測定装置200は、測定装置200の周辺に存在する物体(例えばユーザまたは操作体など)の位置または状態が把握される現象を測定する。そして、測定装置200は、測定により得られる情報(以下、測定情報とも称する。)を測定結果として情報処理装置100へ提供する。例えば、測定装置200は、デプスセンサであり、マーカが装着されたユーザの身体の部位(例えば指または手)と周辺の物体との位置関係(すなわち身体の部位および周辺の物体の三次元空間上の位置)を測定することができる。測定情報は、三次元画像情報であってよい。なお、測定装置200は、ユーザに装着される慣性センサまたは当該慣性センサを有する指輪型もしくは腕輪型のウェアラブル装置であってもよい。また、測定装置200は、操作領域を確保できればどのように設置されてもよい。例えば、測定装置200は、図1に示したような天井のほか、床に設けられてよい。 The measurement apparatus 200 measures the situation around the measurement apparatus 200. Specifically, the measurement apparatus 200 measures phenomena from which the position or state of an object (for example, a user or an operating body) existing around the measurement apparatus 200 can be grasped. Then, the measurement apparatus 200 provides information obtained by the measurement (hereinafter also referred to as measurement information) to the information processing apparatus 100 as a measurement result. For example, the measurement apparatus 200 is a depth sensor and can measure the positional relationship between a body part of the user (for example, a finger or a hand) on which a marker is worn and surrounding objects (that is, the positions of the body part and the surrounding objects in three-dimensional space). The measurement information may be three-dimensional image information. Note that the measurement apparatus 200 may be an inertial sensor worn by the user, or a ring-type or bracelet-type wearable device having such an inertial sensor. The measurement apparatus 200 may also be installed in any manner as long as the operation region can be secured; for example, it may be provided on the floor as well as on the ceiling as shown in FIG. 1.
 投影装置300は、情報処理装置100の指示情報に基づいて画像を投影する。具体的には、投影装置300は、情報処理装置100から提供される画像を指示される場所へ投影する。例えば、投影装置300は、情報処理装置100に指示される図1に示した投影領域10に仮想オブジェクト20を投影する。なお、図1では、投影装置300は2D(Dimension)プロジェクタである例をしめしたが、投影装置300は3Dプロジェクタであってもよい。 The projection apparatus 300 projects an image on the basis of instruction information from the information processing apparatus 100. Specifically, the projection apparatus 300 projects an image provided from the information processing apparatus 100 onto a designated location. For example, the projection apparatus 300 projects the virtual object 20 onto the projection region 10 shown in FIG. 1, as designated by the information processing apparatus 100. Although FIG. 1 shows an example in which the projection apparatus 300 is a 2D (two-dimensional) projector, the projection apparatus 300 may be a 3D projector.
 ここで、近年、ユーザの身体を操作体として用いる入力装置が開発され、普及している。例えば、スマートフォンのような表示画面をユーザが指などで直接的に触れることにより操作する入力装置または投影される画面に向かってユーザが腕を振るなどのジェスチャを行うことにより操作する入力装置がある。これらの入力装置は、操作が直感的で利用しやすいという利点を有する。 In recent years, input devices that use the user's body as an operating body have been developed and have come into widespread use. For example, there are input devices operated by the user directly touching a display screen, such as that of a smartphone, with a finger, and input devices operated by the user performing a gesture, such as waving an arm, toward a projected screen. These input devices have the advantage of being intuitive and easy to use.
 他方で、このような入力装置を操作するユーザには操作について負担がかかるおそれがあった。例えば、ユーザがスマートフォンを片手で持ち、当該スマートフォンを持つ手の親指を使ってスクロール操作を行う場合、ユーザは当該親指を直線的に動かしているつもりであっても実際には当該親指は弧を描いて動かされていることがある。また例えば、テーブルなどに投影される画像に対してユーザが手などを用いて直線的なジェスチャを行うことにより装置を操作する場合、ユーザは自己の体から手が離れるほど直線的な動きがしにくくなり、自己の体を中心とした円弧状の動きをしてしまう。同様に、ユーザが空中においてジェスチャを行うことにより装置を操作する場合も、ユーザは自己の体から手が離れるほど直線的な動きがしにくくなり、自己の体を中心とした円弧状の動きをしてしまう。これは、人間の手または指などについての体性感覚と実際の手または指などの可動域との間にずれが存在するためであると考えられている。そのため、実際に装置に認識される操作とユーザが意図する操作とが一致せず、ユーザは操作をやり直すことになる。さらに、ユーザが意図しない操作に応じた処理が実行される恐れもある。また、ユーザが意図する操作を試みる場合には、ユーザは無理な姿勢を強いられることにもなりかねず、身体に負担がかかるおそれがある。 On the other hand, such input devices can place an operational burden on the user. For example, when the user holds a smartphone with one hand and performs a scroll operation with the thumb of that hand, the thumb may actually be moved in an arc even though the user intends to move it in a straight line. Likewise, when the user operates an apparatus by making a linear gesture with a hand or the like on an image projected onto a table or the like, linear movement becomes more difficult the farther the hand is from the user's body, and the movement tends to become an arc centered on the body. The same applies when the user operates an apparatus by making gestures in the air. This is considered to be because there is a gap between the somatic sensation of a human hand or finger and the actual range of motion of the hand or finger. As a result, the operation actually recognized by the apparatus does not match the operation intended by the user, and the user has to redo the operation. Furthermore, processing corresponding to an operation not intended by the user may be executed. In addition, when attempting the operation the user intends, the user may be forced into an unreasonable posture, which may place a burden on the body.
 これに対し、ユーザが意図しない操作に応じた処理が実行されることを防止する技術が存在する。例えば、特定の方向についてスクロール操作が認識されると、当該特定の方向のみについてスクロール処理を実行し、当該特定の方向以外の方向へのスクロール処理を実行しない技術がある。当該技術によれば、ユーザが意図しない方向へのスクロールが抑制され、操作のやり直しの抑制が期待される。 In contrast, there is a technique for preventing processing according to an operation that is not intended by the user. For example, when a scroll operation is recognized in a specific direction, there is a technique in which a scroll process is executed only in the specific direction and a scroll process in a direction other than the specific direction is not executed. According to the technique, scrolling in a direction not intended by the user is suppressed, and suppression of re-operation is expected.
 しかし、当該技術では、やはりユーザの負担が軽減されないおそれがある。例えば、当該技術では、認識されたスクロール操作に係る特定の方向以外の方向についてのユーザの操作が無視されるため、ユーザの操作量の一部しかスクロール処理に反映されない。言い換えると、ユーザの操作量のうちの認識されたスクロール操作に係る特定の方向の成分のみがスクロール処理に反映される。従って、実際のユーザの操作量に比べてスクロール量が目減りしてしまう。このように、操作量に対する処理量が目減りすると、同一の処理を実行させるための操作であっても操作量が増加し、結果として操作回数も増加しかねない。これは操作負担の増大を招くおそれがある。近年では、スマートフォンおよびタブレット端末が大型化しているため、広い表示画面を操作するために操作負担がさらに増大しかねない。 However, with this technique, the burden on the user may still not be reduced. For example, since user operations in directions other than the specific direction of the recognized scroll operation are ignored, only part of the user's operation amount is reflected in the scroll processing. In other words, only the component of the user's operation amount in the specific direction of the recognized scroll operation is reflected in the scroll processing. The scroll amount therefore falls short of the user's actual operation amount. When the processing amount falls short of the operation amount in this way, the operation amount needed to execute the same processing increases, and as a result the number of operations may also increase. This may increase the operation burden. In recent years, as smartphones and tablet terminals have grown larger, the operation burden for operating a large display screen may increase even further.
 そこで、本開示では、ユーザが意図する処理を装置に実行させると共に、装置の操作にかかるユーザの負担を低減することが可能な情報処理システムおよび当該情報処理システムを実現するための情報処理装置100を提案する。 Therefore, in the present disclosure, an information processing system capable of causing a device to execute a process intended by a user and reducing a user's burden on the operation of the device, and an information processing device 100 for realizing the information processing system. Propose.
  <1.2.装置の構成>
 次に、図2を参照して、本開示の一実施形態に係る情報処理装置100の構成について説明する。図2は、本開示の一実施形態に係る情報処理装置100の機能構成の例を概略的に示すブロック図である。
<1.2. Configuration of device>
Next, the configuration of the information processing apparatus 100 according to an embodiment of the present disclosure will be described with reference to FIG. FIG. 2 is a block diagram schematically illustrating an example of a functional configuration of the information processing apparatus 100 according to an embodiment of the present disclosure.
 図2に示したように、情報処理装置100は、通信部102、認識部104、補正制御部106および処理制御部108を備える。 2, the information processing apparatus 100 includes a communication unit 102, a recognition unit 104, a correction control unit 106, and a processing control unit 108.
   (通信部)
 通信部102は、情報処理装置100の外部の装置と通信する。具体的には、通信部102は、測定装置200から測定結果を受信し、投影装置300へ投影指示情報を送信する。例えば、通信部102は、有線通信方式を用いて測定装置200および投影装置300と通信する。なお、通信部102は、無線通信方式を用いて通信してもよい。
(Communication Department)
The communication unit 102 communicates with a device external to the information processing device 100. Specifically, the communication unit 102 receives a measurement result from the measurement apparatus 200 and transmits projection instruction information to the projection apparatus 300. For example, the communication unit 102 communicates with the measurement apparatus 200 and the projection apparatus 300 using a wired communication method. Note that the communication unit 102 may communicate using a wireless communication method.
   (認識部)
 認識部104は、測定装置200の測定結果に基づいて認識処理を行う。具体的には、認識部104は、測定装置200から受信される測定情報に基づいて操作体を認識する。操作体としては、ユーザの身体の部位がある。例えば、認識部104は、測定装置200から得られる三次元画像情報に基づいてユーザの指または手などを認識する。なお、認識部104により認識される操作体は、ペン型またはスティック型などの操作機器であってもよい。
(Recognition part)
The recognition unit 104 performs recognition processing based on the measurement result of the measurement device 200. Specifically, the recognition unit 104 recognizes the operating tool based on the measurement information received from the measurement device 200. The operation body includes a part of the user's body. For example, the recognition unit 104 recognizes the user's finger or hand based on the three-dimensional image information obtained from the measurement apparatus 200. Note that the operation body recognized by the recognition unit 104 may be a pen-type or stick-type operation device.
 また、認識部104は、操作体の位置を認識する。具体的には、認識部104は、操作体が認識される操作領域における操作体の位置を認識する。例えば、認識部104は、測定装置200により測定された測定装置200と操作体との距離に基づいて、操作領域における三次元空間上の操作体の位置を認識する。なお、測定装置200から操作体へ向かう奥行き方向における操作体の位置は、認識された操作体の大きさに基づいて認識されてもよい。また、認識部104は、操作体の位置を認識することにより操作体の移動を認識する。例えば、認識部104は、操作体の位置の変化に基づいて操作体の移動の有無、移動方向、移動の速さおよび加速ならびに移動経路などを認識する。 Also, the recognition unit 104 recognizes the position of the operating body. Specifically, the recognition unit 104 recognizes the position of the operation body in the operation area where the operation body is recognized. For example, the recognition unit 104 recognizes the position of the operation body in the three-dimensional space in the operation region based on the distance between the measurement apparatus 200 and the operation body measured by the measurement apparatus 200. Note that the position of the operating tool in the depth direction from the measuring apparatus 200 toward the operating tool may be recognized based on the recognized size of the operating tool. The recognizing unit 104 recognizes the movement of the operating body by recognizing the position of the operating body. For example, the recognizing unit 104 recognizes the presence / absence of movement of the operating body, the moving direction, the speed and acceleration of movement, the moving path, and the like based on the change in the position of the operating body.
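The recognition described above, which derives the presence, direction, speed, and path of movement from a time series of positions, can be sketched as follows. This is an illustrative sketch, not the disclosed implementation: it assumes the positions are obtained as uniformly sampled 2D coordinates, and the function name `movement_attributes` is hypothetical.

```python
import math

def movement_attributes(positions, dt=1.0):
    """Derive movement presence, direction, speed, and path length
    from a time series of 2D positions sampled every dt seconds."""
    if len(positions) < 2:
        return {"moving": False}
    # Path length: sum of distances between successive samples.
    path_length = sum(
        math.dist(p, q) for p, q in zip(positions, positions[1:])
    )
    # Overall direction: unit vector from the first to the last sample.
    dx = positions[-1][0] - positions[0][0]
    dy = positions[-1][1] - positions[0][1]
    norm = math.hypot(dx, dy)
    direction = (dx / norm, dy / norm) if norm > 0 else (0.0, 0.0)
    # Mean speed over the whole trace.
    speed = path_length / (dt * (len(positions) - 1))
    return {"moving": path_length > 0, "direction": direction,
            "speed": speed, "path_length": path_length}
```

In practice the positions would come from the depth-sensor measurements via the recognition unit 104, and acceleration could be derived analogously from successive speed values.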
   (補正制御部)
 補正制御部106は、入力される操作についての補正を制御する。具体的には、補正制御部106は、操作体が移動する操作について操作体の移動情報を制御する。より具体的には、補正制御部106は、制御部として、操作体の移動における位置の変化に基づいて決定される第1の方向(以下、推定移動方向とも称する。)と操作体の移動経路長とに基づいて、操作体の移動情報に基づいて実行される処理(以下、操作対応処理とも称する。)で用いられる移動情報を制御する。例えば、移動情報としては、時系列の離散的な座標情報または連続的な移動経路情報などがある。また、補正制御部106は、操作体の移動態様に係る情報に基づいて移動情報の制御態様を決定する。図3を参照して、移動情報の補正について詳細に説明する。図3は、本開示の一実施形態に係る情報処理装置100における移動情報の補正処理を説明するための図である。
(Correction control unit)
The correction control unit 106 controls correction for an input operation. Specifically, the correction control unit 106 controls movement information of the operating tool for an operation of moving the operating tool. More specifically, the correction control unit 106 functions as a control unit in a first direction (hereinafter, also referred to as an estimated movement direction) determined based on a change in position in the movement of the operating body, and the movement path of the operating body. Based on the length, the movement information used in the process executed based on the movement information of the operating body (hereinafter also referred to as an operation corresponding process) is controlled. For example, the movement information includes time-series discrete coordinate information or continuous movement route information. Further, the correction control unit 106 determines the control mode of the movement information based on the information related to the movement mode of the operating tool. The movement information correction will be described in detail with reference to FIG. FIG. 3 is a diagram for describing movement information correction processing in the information processing apparatus 100 according to an embodiment of the present disclosure.
    (補正制御態様の決定)
 補正制御部106は、操作体の移動情報を取得する。例えば、補正制御部106は、認識部104により認識された移動する操作体としてのユーザの手の位置に係る情報を取得する。
(Determination of correction control mode)
The correction control unit 106 acquires movement information of the operating tool. For example, the correction control unit 106 acquires information related to the position of the user's hand as the moving operation body recognized by the recognition unit 104.
 次に、補正制御部106は、操作体の移動方向の変化に基づいて移動情報の制御態様を決定する。具体的には、補正制御部106は、操作体の移動方向の変化に基づいて移動情報の補正有無を決定する。さらに、図4を参照して、操作体の移動方向について詳細に説明する。図4は、本開示の一実施形態に係る情報処理装置100における移動する操作体の位置の変化の例を示す図である。なお、図4は、図3におけるユーザの手の動きに対応する手の位置の変化を離散的に示している。 Next, the correction control unit 106 determines a movement information control mode based on a change in the movement direction of the operating tool. Specifically, the correction control unit 106 determines whether or not to correct the movement information based on a change in the movement direction of the operating tool. Furthermore, the moving direction of the operating tool will be described in detail with reference to FIG. FIG. 4 is a diagram illustrating an example of a change in the position of the moving operating body in the information processing apparatus 100 according to an embodiment of the present disclosure. FIG. 4 discretely shows changes in the hand position corresponding to the movement of the user's hand in FIG.
 まず、補正制御部106は、操作体の移動経路を推定する。例えば、補正制御部106は、図4に示したような操作体の位置P50A~P50Eが特定される情報を認識部104から取得する。そして、補正制御部106は、当該情報に基づいてこれらの位置P50A~P50Eを通る操作体の移動経路R60を算出する。なお、認識部104から移動経路に係る情報が提供される場合は当該処理が省略されてもよい。 First, the correction control unit 106 estimates the movement path of the operating tool. For example, the correction control unit 106 acquires information identifying the operating body positions P50A to P50E as illustrated in FIG. Then, the correction control unit 106 calculates the movement path R60 of the operating body that passes through these positions P50A to P50E based on the information. In addition, when the information regarding a movement route is provided from the recognition unit 104, the processing may be omitted.
 次に、補正制御部106は、移動する操作体の位置についての接線を算出する。例えば、補正制御部106は、操作体の位置P50A~P50Eの各々について、算出された移動経路R60についての接線をそれぞれ算出する。なお、接線の算出対象となる操作体の位置はまびかれてもよい。 Next, the correction control unit 106 calculates tangents at the positions of the moving operating body. For example, the correction control unit 106 calculates, for each of the operating body positions P50A to P50E, a tangent to the calculated movement path R60. Note that the positions of the operating body used for the tangent calculation may be thinned out.
 そして、補正制御部106は、算出された接線の方向が変化したかの判定に基づいて移動情報の補正有無を決定する。例えば、補正制御部106は、図4に示したように位置P50Dについての接線の方向が位置P50A~P50Cについての接線の方向と異なるため、移動情報を補正する旨を決定する。 Then, the correction control unit 106 determines whether or not the movement information is corrected based on the determination of whether the calculated tangent direction has changed. For example, as shown in FIG. 4, the correction control unit 106 determines that the movement information is to be corrected because the tangential direction for the position P50D is different from the tangential direction for the positions P50A to P50C.
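The decision above, namely correcting the movement information only when the tangent direction changes, can be sketched as follows. This is an illustrative sketch: the tangents are approximated by finite differences between successive samples, and the 10-degree threshold is an assumed parameter, not a value from the disclosure.

```python
import math

def tangent_angles(positions):
    # Approximate the tangent at each sample by the direction toward
    # the next sample (finite-difference approximation).
    return [
        math.atan2(q[1] - p[1], q[0] - p[0])
        for p, q in zip(positions, positions[1:])
    ]

def needs_correction(positions, threshold_rad=math.radians(10)):
    """Return True if the tangent direction changes by more than the
    threshold anywhere along the trace."""
    angles = tangent_angles(positions)
    # Compare successive tangent angles, wrapping the difference
    # into (-pi, pi] so that the comparison is rotation-safe.
    return any(
        abs(math.atan2(math.sin(b - a), math.cos(b - a))) > threshold_rad
        for a, b in zip(angles, angles[1:])
    )
```

A straight drag yields identical tangent angles and no correction, while an arced drag such as the one in FIG. 4 triggers the correction decision.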
    (補正制御の実行)
 移動情報を補正すると決定される場合、補正制御部106は、操作体の推定移動方向を算出する。具体的には、補正制御部106は、移動する操作体の位置についての接線から決定される方向を推定移動方向に決定する。より具体的には、補正制御部106は、算出された接線と平行な方向であって操作体が移動している方向を推定移動方向に決定する。詳細には、補正制御部106は、操作体の移動が認識されてから所定の期間経過後までの間の操作体の位置についての接線と平行な方向であって操作体が移動している方向を推定移動方向に決定する。例えば、図4に示したような位置P50A~P50Cについての接線と平行な方向であって位置P50AからP50Cへ向かう方向が推定移動方向に決定される。
(Execution of correction control)
When it is determined to correct the movement information, the correction control unit 106 calculates the estimated movement direction of the operating tool. Specifically, the correction control unit 106 determines the direction determined from the tangent to the position of the moving operating body as the estimated movement direction. More specifically, the correction control unit 106 determines a direction in which the operating body is moving in a direction parallel to the calculated tangent as an estimated movement direction. Specifically, the correction control unit 106 is a direction parallel to a tangent to the position of the operating body between the time when the operating body is recognized and after a predetermined period has elapsed, and the direction in which the operating body is moving. Is determined as the estimated moving direction. For example, the direction that is parallel to the tangent to the positions P50A to P50C as shown in FIG. 4 and that goes from the position P50A to P50C is determined as the estimated movement direction.
 なお、推定移動方向の決定方法は上記の方法に限られない。例えば、補正制御部106は、操作体の位置についての接線の方向の最も一致数の多い方向と平行な方向を推定移動方向に決定してもよい。詳細には、図4に示したような位置P50A~P50C、位置P50Dおよび位置P50Eのそれぞれで接線の方向が異なるが、接線の方向が位置P50A~P50Cの3つの位置で一致する方向が推定移動方向に決定される。 Note that the method of determining the estimated movement direction is not limited to the above. For example, the correction control unit 106 may determine, as the estimated movement direction, the direction parallel to the tangent direction shared by the largest number of positions of the operating body. In detail, although the tangent directions differ among the positions P50A to P50C, the position P50D, and the position P50E shown in FIG. 4, the direction in which the tangent directions coincide at the three positions P50A to P50C is determined as the estimated movement direction.
 図3を参照して移動情報の補正処理についての説明に戻ると、補正制御部106は、操作体の移動経路長を算出する。具体的には、補正制御部106は、図3に示したような操作体の移動経路R60の長さを算出する。 Returning to the description of the movement information correction process with reference to FIG. 3, the correction control unit 106 calculates the movement path length of the operating tool. Specifically, the correction control unit 106 calculates the length of the movement path R60 of the operating tool as shown in FIG.
 次に、補正制御部106は、操作体の移動経路長に相当する長さの推定移動方向への移動に係る移動情報を、操作対応処理で用いられる移動情報に決定する。例えば、補正制御部106は、決定された推定移動方向を操作の移動方向とし、算出された移動経路長を操作量として決定する。言い換えると、実際の操作体の移動経路R60の代わりに、図3に示したような移動経路R70が操作対応処理に用いられる。 Next, the correction control unit 106 determines movement information related to movement in the estimated movement direction of a length corresponding to the movement path length of the operating body as movement information used in the operation handling process. For example, the correction control unit 106 determines the determined estimated movement direction as the movement direction of the operation, and determines the calculated movement path length as the operation amount. In other words, instead of the actual movement path R60 of the operating body, the movement path R70 as shown in FIG. 3 is used for the operation handling process.
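The substitution of the straight path R70 for the actual path R60 can be illustrated as follows: the corrected movement keeps the measured path length but redirects it along the estimated movement direction. This is a sketch under the simplifying assumption that the estimated direction is taken from the tangent at the start of the trace; in practice it would be determined as described above.

```python
import math

def correct_movement(positions):
    """Replace an arced trace by a straight movement of equal path
    length in the estimated (initial-tangent) direction.

    Returns (start_point, corrected_end_point)."""
    # Movement path length of the actual trace (e.g. R60).
    path_length = sum(
        math.dist(p, q) for p, q in zip(positions, positions[1:])
    )
    # Estimated movement direction: unit tangent at the start.
    dx = positions[1][0] - positions[0][0]
    dy = positions[1][1] - positions[0][1]
    norm = math.hypot(dx, dy)
    ux, uy = dx / norm, dy / norm
    # Corrected movement (e.g. R70): same start, same length, straight.
    x0, y0 = positions[0]
    return (x0, y0), (x0 + ux * path_length, y0 + uy * path_length)
```

Because the full path length is preserved, none of the user's operation amount is lost, unlike techniques that keep only the component along the specific direction.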
   (処理制御部)
 処理制御部108は、情報処理装置100の処理を全体的に制御する。具体的には、処理制御部108は、操作体の移動情報に基づいて処理を実行する。より具体的には、処理制御部108は、認識部104により認識された操作体の移動に係る移動情報、または補正制御部106により補正された操作体の移動情報に基づいて表示制御処理を実行する。例えば、処理制御部108は、操作体の移動方向および移動量から処理における操作方向および操作量を決定する。そして、処理制御部108は、決定された操作方向および操作量に応じて、投影装置300に投影させる仮想オブジェクトの投影位置などを変更する。
(Processing control unit)
The processing control unit 108 controls the processing of the information processing apparatus 100 as a whole. Specifically, the process control unit 108 executes a process based on the movement information of the operating tool. More specifically, the processing control unit 108 performs display control processing based on movement information related to movement of the operating tool recognized by the recognition unit 104 or movement information of the operating tool corrected by the correction control unit 106. To do. For example, the process control unit 108 determines the operation direction and the operation amount in the process from the movement direction and the movement amount of the operating body. Then, the processing control unit 108 changes the projection position of the virtual object to be projected on the projection device 300 according to the determined operation direction and operation amount.
  <1.3.装置の処理>
 次に、情報処理装置100の処理について説明する。
<1.3. Device processing>
Next, processing of the information processing apparatus 100 will be described.
   (全体処理)
 まず、図5を参照して、情報処理装置100の全体処理について説明する。図5は、本開示の一実施形態に係る情報処理装置100の全体処理の例を概念的に示すフローチャートである。
(Overall processing)
First, the overall processing of the information processing apparatus 100 will be described with reference to FIG. FIG. 5 is a flowchart conceptually showing an example of overall processing of the information processing apparatus 100 according to an embodiment of the present disclosure.
 情報処理装置100は、アプリケーションを起動する(ステップS401)。具体的には、処理制御部108は、認識部104により認識されるユーザのアプリケーション起動操作に応じてアプリケーションを起動する。なお、アプリケーションは自動的に起動させられてもよい。 The information processing apparatus 100 starts an application (step S401). Specifically, the process control unit 108 activates an application in response to a user application activation operation recognized by the recognition unit 104. Note that the application may be automatically started.
 次に、情報処理装置100は、終了操作を認識したかを判定する(ステップS402)。具体的には、処理制御部108は、認識部104により認識されたユーザの操作がアプリケーションの終了操作であるかを判定する。 Next, the information processing apparatus 100 determines whether an end operation has been recognized (step S402). Specifically, the process control unit 108 determines whether the user operation recognized by the recognition unit 104 is an application end operation.
 終了操作を認識していないと判定されると(ステップS402/NO)、情報処理装置100は、操作体の移動が認識されたかを判定する(ステップS403)。具体的には、補正制御部106は、認識部104により操作体の移動が認識されたかを判定する。 If it is determined that the end operation has not been recognized (step S402 / NO), the information processing apparatus 100 determines whether the movement of the operating tool has been recognized (step S403). Specifically, the correction control unit 106 determines whether the recognition unit 104 has recognized the movement of the operating tool.
 操作体の移動が認識されたと判定されると(ステップS403/YES)、情報処理装置100は、操作体の移動方向が変化したかを判定する(ステップS404)。具体的には、補正制御部106は、認識部104により操作体の移動が認識されたと判定されると、操作体の移動方向を推定し、推定される操作体の移動方向が移動中に変化したかを判定する。 When it is determined that the movement of the operating body has been recognized (step S403/YES), the information processing apparatus 100 determines whether the movement direction of the operating body has changed (step S404). Specifically, when it is determined that the movement of the operating body has been recognized by the recognition unit 104, the correction control unit 106 estimates the movement direction of the operating body and determines whether the estimated movement direction has changed during the movement.
 操作体の移動方向が変化したと判定されると(ステップS404/YES)、情報処理装置100は、操作体の位置の変化に基づいて推定移動方向を決定する(ステップS405)。具体的には、補正制御部106は、操作体の移動方向が移動中に変化したと判定されると、変化前の移動方向を推定移動方向に決定する。 If it is determined that the moving direction of the operating tool has changed (step S404 / YES), the information processing apparatus 100 determines the estimated moving direction based on the change in the position of the operating tool (step S405). Specifically, when it is determined that the movement direction of the operating tool has changed during movement, the correction control unit 106 determines the movement direction before the change as the estimated movement direction.
 次に、情報処理装置100は、操作体の移動経路長を算出する(ステップS406)。具体的には、補正制御部106は、認識された操作体の位置の変化に基づいて推定される移動経路の長さを算出する。 Next, the information processing apparatus 100 calculates the movement path length of the operating tool (step S406). Specifically, the correction control unit 106 calculates the length of the movement path estimated based on the recognized change in the position of the operating tool.
 次に、情報処理装置100は、移動情報の補正制御を実行する(ステップS407)。具体的には、補正制御部106は、移動情報に係る移動方向および移動量を推定移動方向および移動経路長に基づいて補正する。 Next, the information processing apparatus 100 executes movement information correction control (step S407). Specifically, the correction control unit 106 corrects the movement direction and the movement amount related to the movement information based on the estimated movement direction and the movement path length.
 次に、情報処理装置100は、移動情報に基づく処理を実行する(ステップS408)。具体的には、処理制御部108は、補正後の移動方向および移動量に基づいて投影装置300の投影制御を実行する。 Next, the information processing apparatus 100 executes processing based on the movement information (step S408). Specifically, the process control unit 108 executes projection control of the projection apparatus 300 based on the corrected movement direction and movement amount.
 なお、終了操作を認識したと判定されると(ステップS402/YES)、情報処理装置100は、アプリケーションを終了し(ステップS409)、処理を終了する。また、以下では、ステップS404~S407の処理をまとめてステップS500Aとも称する。 If it is determined that the end operation has been recognized (step S402 / YES), the information processing apparatus 100 ends the application (step S409) and ends the process. Hereinafter, the processes in steps S404 to S407 are collectively referred to as step S500A.
   (移動情報の補正処理)
 続いて、図6を参照して、情報処理装置100における移動情報の補正処理(ステップS500A)の詳細について説明する。図6は、本開示の一実施形態に係る情報処理装置100における移動情報の補正処理の詳細の例を概念的に示すフローチャートである。
(Movement information correction process)
Next, details of the movement information correction process (step S500A) in the information processing apparatus 100 will be described with reference to FIG. FIG. 6 is a flowchart conceptually showing an example of details of the movement information correction processing in the information processing apparatus 100 according to an embodiment of the present disclosure.
 情報処理装置100は、所定期間内の操作体の位置を記憶する(ステップS501)。具体的には、補正制御部106は、認識部104から提供された、所定のフレーム数に相当する期間の操作体の位置を示す情報を記憶部に記憶させる。 The information processing apparatus 100 stores the position of the operating tool within a predetermined period (step S501). Specifically, the correction control unit 106 causes the storage unit to store information indicating the position of the operating tool during a period corresponding to a predetermined number of frames provided from the recognition unit 104.
 次に、情報処理装置100は、記憶された操作体の位置の各々について、操作体の移動経路について接線をそれぞれ算出する(ステップS502)。具体的には、補正制御部106は、記憶された操作体の位置を示す情報に基づいて移動経路を推定する。次いで、補正制御部106は、操作体の位置の各々について、推定される移動経路についての接線をそれぞれ算出する。 Next, the information processing apparatus 100 calculates a tangent for the movement path of the operating body for each of the stored positions of the operating body (step S502). Specifically, the correction control unit 106 estimates the movement route based on the stored information indicating the position of the operating tool. Next, the correction control unit 106 calculates a tangent to the estimated movement path for each position of the operating tool.
 次に、情報処理装置100は、算出された接線の方向が変化したかを判定する(ステップS503)。具体的には、補正制御部106は、操作体の位置についての接線の傾きが時系列の前後の位置の間で所定値以上変化したかを判定する。 Next, the information processing apparatus 100 determines whether the calculated direction of the tangent has changed (step S503). Specifically, the correction control unit 106 determines whether the slope of the tangent with respect to the position of the operating tool has changed by a predetermined value or more between the positions before and after the time series.
 接線の方向が変化したと判定されると(ステップS503/YES)、情報処理装置100は、方向が変化する前の接線の方向を推定移動方向に決定する(ステップS504)。具体的には、補正制御部106は、接線の傾きが所定値以上変化したと判定されると、傾きが変化する前の操作体の位置についての接線と平行な方向であって、操作体の位置の変化方向(すなわち移動方向)を推定移動方向に決定する。 When it is determined that the direction of the tangent has changed (step S503/YES), the information processing apparatus 100 determines the direction of the tangent before the change as the estimated movement direction (step S504). Specifically, when it is determined that the inclination of the tangent has changed by the predetermined value or more, the correction control unit 106 determines, as the estimated movement direction, the direction that is parallel to the tangents at the positions of the operating body before the inclination changed and that corresponds to the direction of the change in position (that is, the movement direction).
 また、情報処理装置100は、操作体の移動経路長を算出する(ステップS505)。具体的には、補正制御部106は、上述のように推定された移動経路の長さを算出する。 Further, the information processing apparatus 100 calculates the movement path length of the operating tool (step S505). Specifically, the correction control unit 106 calculates the length of the travel route estimated as described above.
 次に、情報処理装置100は、移動情報に係る移動方向を推定移動方向に補正する(ステップS506)。具体的には、補正制御部106は、移動情報に係る移動方向を、推定された移動経路に係る移動方向(すなわち実際の移動方向)から決定された推定移動方向へ変更する。 Next, the information processing apparatus 100 corrects the movement direction related to the movement information to the estimated movement direction (step S506). Specifically, the correction control unit 106 changes the movement direction related to the movement information to the estimated movement direction determined from the movement direction related to the estimated movement route (that is, the actual movement direction).
 The information processing apparatus 100 also corrects the movement amount in the movement information to an amount corresponding to the calculated movement path length (step S507). Specifically, the correction control unit 106 changes the movement amount in the movement information from the length of the estimated movement path in a specific direction to the calculated movement path length.
 In the following, the processes of steps S501 to S505 are collectively referred to as step S600.
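The flow of steps S501 to S507 can be sketched as follows. This is a minimal illustration, not an implementation from the specification: the function name, the fixed angle threshold, and the use of finite differences to approximate the tangent at each sampled position are all assumptions made for the example.

```python
import math


def correct_movement(positions, angle_threshold_deg=20.0):
    """positions: time-ordered list of (x, y) samples of the operating body.

    Returns the corrected movement vector (estimated movement direction
    scaled by the full path length), or None when no correction applies.
    """
    # Tangent direction at each sample, approximated by finite differences.
    angles = [math.atan2(y1 - y0, x1 - x0)
              for (x0, y0), (x1, y1) in zip(positions, positions[1:])]

    # S505: movement path length is the sum of segment lengths (arc length).
    path_length = sum(math.hypot(x1 - x0, y1 - y0)
                      for (x0, y0), (x1, y1) in zip(positions, positions[1:]))

    # S503: has the tangent direction changed by the threshold or more?
    changed = False
    for a in angles[1:]:
        # Normalized angular difference against the initial tangent.
        d = math.atan2(math.sin(a - angles[0]), math.cos(a - angles[0]))
        if abs(math.degrees(d)) >= angle_threshold_deg:
            changed = True
            break
    if not changed:
        return None  # S503/NO: movement information is left as-is

    # S504: estimated movement direction = tangent before the change.
    direction = angles[0]
    # S506/S507: corrected movement = estimated direction x path length.
    return (path_length * math.cos(direction),
            path_length * math.sin(direction))
```

For an arc-like trace the corrected vector points along the initial tangent but keeps the full traced length, so none of the user's operation amount is lost.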
  <1.4. Summary of Embodiment of Present Disclosure>
 As described above, according to an embodiment of the present disclosure, the information processing apparatus 100 executes processing based on the movement information of the operating body, and controls the movement information used in that processing based on a first direction, determined from the change in position during the movement of the operating body, and on the movement path length of the operating body.
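In illustrative notation (the symbols below are not from the specification), the controlled movement information can be written compactly as the movement path length applied along the first direction:

```latex
% p_1,\dots,p_n : sampled positions of the operating body
% \hat{u}       : unit vector of the first direction (estimated movement direction)
L = \sum_{i=1}^{n-1} \left\lVert p_{i+1} - p_i \right\rVert,
\qquad
v_{\text{corrected}} = L\,\hat{u}
```

Here \(L\) is the movement path length of steps S505/S507 and \(v_{\text{corrected}}\) is the movement information used in the processing.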
 Conventionally, it has been difficult for a user operating a device with an operating body to perform the intended operation in a range beyond the range of motion of the body part holding the operating body, so processing not intended by the user could be executed. There are also techniques that use only the component of the user's operation in a specific direction for processing, but with such techniques the operation amount used for processing falls short of the user's actual operation amount (that is, the movement amount of the operating body). The user may therefore be forced to move the operating body farther in order to perform a desired operation.
 In contrast, according to an embodiment of the present disclosure, the movement information used in processing is corrected to movement information reflecting the movement of the operating body that the user intends, so processing in accordance with the user's intention can be executed even for operations beyond the user's range of motion. In addition, because the operation amount is corrected based on the movement path length, loss of the operation amount can be suppressed. It is therefore possible to cause the apparatus to execute the processing the user intends while reducing the burden of operating the apparatus. The user can perform a desired operation without changing the way the operating body is handled or the operating posture, and can thus concentrate on the purpose of the operation rather than on the operation of the apparatus itself.
 The information processing apparatus 100 also determines, as the movement information used in the processing, movement information corresponding to movement in the first direction of a length corresponding to the movement path length. Processing can therefore be executed based on a length corresponding to the distance the user actually moved the operating body, which prevents part of the user's operation from going unreflected in the processing (that is, being wasted) and reduces the burden of the operation more reliably. Note that the movement length in the movement information may be less than or greater than the movement path length and, as described later, may be controlled based on various kinds of information.
 The information processing apparatus 100 also determines the control mode of the movement information based on information about the movement mode of the operating body, the control mode including whether the movement information is controlled at all. It is not always preferable for the user that correction control of the movement information be performed constantly: if processing related to correction control runs even in situations where the user can operate correctly, the processing load of the apparatus increases needlessly. Executing correction control according to how the operating body is moved therefore reduces the processing load of the apparatus. In particular, an operation that exceeds the range of motion of the body part with which the user handles the operating body, as described above, is easy to infer from the movement mode; for example, a movement path that curves into an arc readily indicates that correction of the movement information is desired. Determining the mode of correction control based on the movement mode of the operating body thus improves the accuracy and adaptability of the correction control.
 The movement mode of the operating body includes a change in the moving direction of the operating body. In an operation that exceeds the range of motion of the body part with which the user handles the operating body, the moving direction of the operating body generally changes during the movement. Executing correction control when a change in the moving direction of the operating body is detected therefore makes the correction control effective.
 The first direction includes a direction determined from a tangent at the position of the moving operating body. The moving direction of the operating body can thus be grasped more accurately, and by correcting the movement direction in the movement information to a direction closer to the direction the user intends, processing in accordance with the user's intention can be executed.
  <1.5. Modifications>
 An embodiment of the present disclosure has been described above. The embodiment is not limited to the example described; first to seventh modifications of the embodiment are described below.
  (First modification)
 As a first modification of the embodiment, the information processing apparatus 100 may execute correction control for operations other than the gesture of waving a hand in the air described above. Examples to which the correction control is applied are described with reference to FIGS. 7 to 11.
 FIG. 7 is a diagram illustrating a first application example of movement information correction control by the information processing apparatus 100 according to the first modification. FIG. 7 shows an example in which the user performs an operation by moving the hand, which is the operating body, from the top of the thigh to its side. The recognition unit 104 recognizes the user's hand 50 and the movement of the hand 50. The correction control unit 106 determines whether to correct based on the recognized change in the position of the hand 50. As shown in FIG. 7, the hand 50 moves along an arc as in movement path R61, so it is determined here that correction control is performed. Next, the correction control unit 106 determines the estimated movement direction of the hand 50 based on the change in its position and calculates the length of movement path R61. The correction control unit 106 then changes the movement direction in the movement information of the operating body to the determined estimated movement direction (that is, the direction of movement path R71) and changes the movement amount in the movement information to the calculated length of movement path R61. In other words, the correction control unit 106 changes the movement information of the operating body from movement information corresponding to movement path R61 to movement information corresponding to movement path R71.
 FIG. 8 is a diagram illustrating a second application example of movement information correction control by the information processing apparatus 100 according to the first modification. FIG. 8 shows an example in which the user performs an operation of tracing the palm of the left hand with a finger of the right hand. The recognition unit 104 recognizes the user's finger 51 and the movement of the finger 51. The correction control unit 106 determines whether to correct based on the recognized change in the position of the finger 51. As shown in FIG. 8, the finger 51 moves along an arc as in movement path R62, so it is determined here that correction control is performed. Next, the correction control unit 106 determines the estimated movement direction of the finger 51 based on the change in its position and calculates the length of movement path R62. The correction control unit 106 then changes the movement direction in the movement information of the operating body to the determined estimated movement direction (that is, the direction of movement path R72) and changes the movement amount in the movement information to the calculated length of movement path R62. In other words, the correction control unit 106 changes the movement information of the operating body from movement information corresponding to movement path R62 to movement information corresponding to movement path R72.
 FIG. 9 is a diagram illustrating a third application example of movement information correction control by the information processing apparatus 100 according to the first modification. FIG. 9 shows an example in which the user performs an operation of tracing the right forearm from the wrist with a finger of the left hand. The recognition unit 104 recognizes the user's finger 51 and the movement of the finger 51. The correction control unit 106 determines whether to correct based on the recognized change in the position of the finger 51. As shown in FIG. 9, the finger 51 moves along an arc as in movement path R63, so it is determined here that correction control is performed. Next, the correction control unit 106 determines the estimated movement direction of the finger 51 based on the change in its position and calculates the length of movement path R63. The correction control unit 106 then changes the movement direction in the movement information of the operating body to the determined estimated movement direction (that is, the direction of movement path R73) and changes the movement amount in the movement information to the calculated length of movement path R63. In other words, the correction control unit 106 changes the movement information of the operating body from movement information corresponding to movement path R63 to movement information corresponding to movement path R73.
 FIG. 10 is a diagram illustrating a fourth application example of movement information correction control by the information processing apparatus 100 according to the first modification. FIG. 10 shows an example in which the user moves a finger pointing at the projection area from left to right relative to the projection area. The recognition unit 104 recognizes the user's finger 51 directed toward the projection area 10 and the movement of the finger 51. The correction control unit 106 determines whether to correct based on the recognized change in the position of the finger 51. As shown in FIG. 10, the finger 51 moves over the projection area 10 along an arc as in movement path R64, so it is determined here that correction control is performed. Next, the correction control unit 106 determines the estimated movement direction of the finger 51 based on the change in its position and calculates the length of movement path R64. The correction control unit 106 then changes the movement direction in the movement information of the operating body to the determined estimated movement direction (that is, the direction of movement path R74) and changes the movement amount in the movement information to the calculated length of movement path R64. In other words, the correction control unit 106 changes the movement information of the operating body from movement information corresponding to movement path R64 to movement information corresponding to movement path R74. Note that the change in the position of the operating body may be a change in a position estimated from the position of the operating body; for example, the estimated movement direction may be determined based on a change in the position on the projection area 10 pointed to by the user's finger 51 as shown in FIG. 10. The estimated change in position may also be a change in the position of an operation target operated by the operating body.
 FIG. 11 is a diagram illustrating a fifth application example of movement information correction control by the information processing apparatus 100 according to the first modification. FIG. 11 shows an example in which the user slides, on the touch screen, the thumb of the right hand holding a device that has a touch screen, such as a smartphone. The recognition unit 104 recognizes the user's finger 51 on the touch screen 30 and the movement of the finger 51. The correction control unit 106 determines whether to correct based on the recognized change in the position of the finger 51. As shown in FIG. 11, the finger 51 moves over the touch screen 30 along an arc as in movement path R65, so it is determined here that correction control is performed. Next, the correction control unit 106 determines the estimated movement direction of the finger 51 based on the change in its position and calculates the length of movement path R65. The correction control unit 106 then changes the movement direction in the movement information of the operating body to the determined estimated movement direction (that is, the direction of movement path R75) and changes the movement amount in the movement information to the calculated length of movement path R65. In other words, the correction control unit 106 changes the movement information of the operating body from movement information corresponding to movement path R65 to movement information corresponding to movement path R75. Note that the correction control processing of the information processing apparatus 100 may also be applied to an apparatus in which a touch panel and a display unit are provided separately, instead of an apparatus including a touch screen.
 As described above, according to the first modification, the information processing apparatus 100 can be applied to various forms of operation. For various operations, therefore, the burden of the operation can be reduced while causing the apparatus to execute the processing the user intends.
  (Second modification)
 As a second modification of the embodiment, the information processing apparatus 100 may determine the control mode based on the degree of change in the moving direction of the operating body. Specifically, the correction control unit 106 determines to execute correction control when the degree of change in the moving direction of the operating body is equal to or greater than a predetermined value. The processing of this modification is described in detail with reference to FIG. 12, a diagram for explaining an example of correction control in the information processing apparatus 100 according to the second modification.
 The recognition unit 104 recognizes the movement of the operating body. For example, the recognition unit 104 recognizes the movement by recognizing positions P51A to P51D of the operating body as it is moved so as to draw a circle, as shown in FIG. 12.
 The correction control unit 106 calculates the amount of change in the moving direction of the recognized movement. For example, the correction control unit 106 calculates the tangent to movement path R66 at each of positions P51A to P51D shown in FIG. 12, and then calculates the differences in slope between the calculated tangents.
 The correction control unit 106 then determines the mode of correction control based on whether the calculated amount of change exceeds a predetermined value. For example, the correction control unit 106 determines to execute correction control when the difference in slope between the calculated tangents exceeds a predetermined value. The predetermined value may be derived from the differences in tangent slope themselves, for example their average, mode, or median, or it may be a value decided before the operation.
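The decision in this modification can be sketched as follows. The function names and the fixed threshold are illustrative assumptions; the specification also allows the threshold to be derived from the tangent-slope differences themselves (average, mode, or median).

```python
import math


def tangent_angles(positions):
    """Approximate tangent directions along a sampled path by finite differences."""
    return [math.atan2(y1 - y0, x1 - x0)
            for (x0, y0), (x1, y1) in zip(positions, positions[1:])]


def should_correct(positions, threshold_deg=15.0):
    """Enable correction control when the tangent direction changes between
    successive samples by the threshold or more (e.g. a circular motion)."""
    angles = tangent_angles(positions)
    for a, b in zip(angles, angles[1:]):
        # Normalized angular difference between successive tangents.
        diff = math.atan2(math.sin(b - a), math.cos(b - a))
        if abs(math.degrees(diff)) >= threshold_deg:
            return True  # execute correction control
    return False  # leave the movement information uncorrected
```

A circular trace such as P51A to P51D in FIG. 12 would trip the threshold, while a straight drag would not.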
 As described above, according to the second modification, the information processing apparatus 100 determines the control mode based on the degree of change in the moving direction of the operating body. Movement information can therefore also be corrected for operations in which the moving direction changes to a certain degree (for example, rotation operations). Increasing the variety of operations subject to correction in this way further reduces the operation burden on the user.
  (Third modification)
 As a third modification of the embodiment, the information processing apparatus 100 may determine the mode of movement information correction control using pattern matching on the movement mode of the operating body. Specifically, the correction control unit 106 determines the control mode of the movement information based on a comparison between the movement mode of the operating body and patterns of movement modes stored in advance. Movement modes for which patterns are prepared include the speed, acceleration, and movement path of the operating body. For example, the recognition unit 104 matches the movement mode of the operating body against the stored patterns, and the correction control unit 106 decides whether to control the movement information based on the result of that pattern matching. The patterns of movement modes may further be obtained by machine learning: for example, by accumulating the movement speed, acceleration, or movement path of the operating body for operations that are repeatedly redone, a pattern of the movement mode can be derived.
 Next, the processing of this modification is described in detail with reference to FIG. 13, a flowchart conceptually showing an example of the overall processing of the information processing apparatus 100 according to the third modification. Description of processing substantially identical to the processing described above is omitted.
 If the information processing apparatus 100 determines after starting the application (step S411) that no end operation has been recognized (step S412/NO), it determines whether movement of the operating body has been recognized (step S413).
 If it is determined that movement of the operating body has been recognized (step S413/YES), the information processing apparatus 100 determines whether the movement mode of the operating body matches a predetermined pattern (step S414). Specifically, the recognition unit 104 matches the recognized speed, acceleration, or movement path of the operating body against learning patterns stored in advance. If the matching against a learning pattern succeeds, the correction control unit 106 determines to execute the movement information correction processing; if it fails, the correction processing is not executed.
 If it is determined that the movement mode of the operating body matches a predetermined pattern (step S414/YES), the movement information correction processing is executed (steps S415 to S417), and processing is then executed based on the corrected movement information (step S418).
 As described above, according to the third modification, the information processing apparatus 100 determines the control mode of the movement information based on a comparison between the movement mode of the operating body and patterns of movement modes stored in advance. This makes it more certain that correction processing is performed when a movement matching a pattern for which correction is desired occurs. It also becomes easier for the user to learn which patterns trigger correction, improving usability.
 The movement mode of the operating body includes the speed, acceleration, and movement path of the operating body. The higher the speed or acceleration of the movement, the more easily the movement of the operation becomes distorted, so controlling whether correction processing is performed based on speed or acceleration makes it more certain that correction is executed when it is desired. Performing pattern matching on the movement path makes this still more certain than matching on speed or acceleration alone.
 The patterns of the movement mode of the operating body are obtained by machine learning of the movement mode. Since operation of an operating body generally differs between individual users, using patterns machine-learned for each user in the determination of whether to correct makes it possible to provide correction processing suited to each user. Learning patterns may be prepared per individual user or per user attribute.
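The matching in step S414 can be sketched as follows. The mean-squared-distance criterion, the tolerance value, and the restriction to equal-length speed profiles are illustrative assumptions; the specification does not fix a particular matching algorithm, and a learned classifier could serve the same role.

```python
def matches_pattern(speeds, patterns, tolerance=0.5):
    """Decide whether an observed speed profile matches a stored pattern.

    speeds:   per-sample speeds of the operating body for one operation.
    patterns: stored speed profiles (e.g. accumulated from operations that
              users tended to redo), each the same length as `speeds`.
    """
    for pattern in patterns:
        if len(pattern) != len(speeds):
            continue  # only compare profiles of equal length in this sketch
        # Mean-squared distance between observed and stored profiles.
        mse = sum((s - p) ** 2 for s, p in zip(speeds, pattern)) / len(speeds)
        if mse <= tolerance:
            return True  # S414/YES: run the correction (S415-S417)
    return False  # S414/NO: skip the correction processing
```

Per-user or per-attribute learning patterns would simply mean selecting a different `patterns` list before calling the function.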
  (Fourth modification)
 As a fourth modification of the embodiment, whether to correct the movement information may be selected by the user. Specifically, the correction control unit 106 determines the control mode of the movement information based on information about a selection operation, performed by the subject of the operation using the operating body, of the control mode. For example, the processing control unit 108 causes the display device to display options for selecting whether to correct the movement information (hereinafter also referred to as option objects), and the correction control unit 106 decides whether to correct based on recognition of an operation selecting one of the displayed options. The processing of this modification is described in detail with reference to FIG. 14, which shows an example of the options, whose display is controlled by the information processing apparatus 100 according to the fourth modification, for selecting whether to correct the movement information.
 The correction control unit 106 determines whether to correct the movement information of the operating body, for example based on the movement mode of the operating body as described above.
 When it is determined that the movement information of the operating body is to be corrected, the correction control unit 106 causes the processing control unit 108 to display the option objects. For example, when the operation object 21, projected at a position on the projection area corresponding to the position of the operating body, is moved along a path such as the movement path R67 shown in FIG. 14, the correction control unit 106 determines that correction processing is to be executed. The processing control unit 108 then causes the projection device 300 to project an option object 40A for selecting correction and an option object 40B for selecting no correction. The positions at which the option objects 40A and 40B are projected may be adjacent to the projection position of the operation object 21. For example, the option objects 40A and 40B are projected adjacent to the projected operation object 21, on an extension line of the movement path of the operation object 21.
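The placement described here, adjacent to the operation object on the extension line of its movement path, could be sketched as follows (the coordinate convention, offsets, and function name are illustrative assumptions):

```python
import math

def option_positions(path_end, prev_point, offset=40.0, spacing=24.0):
    """Place two option objects adjacent to the operation object, spaced
    along the extension line of its movement path (one slot per option)."""
    dx, dy = path_end[0] - prev_point[0], path_end[1] - prev_point[1]
    norm = math.hypot(dx, dy) or 1.0   # avoid division by zero
    ux, uy = dx / norm, dy / norm      # unit vector of the last segment
    return [(path_end[0] + ux * (offset + i * spacing),
             path_end[1] + uy * (offset + i * spacing)) for i in range(2)]
```

With the last segment pointing along +x, the two options land farther along +x beyond the operation object.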
 Next, the correction control unit 106 controls whether the correction processing is executed on the basis of which option object is selected. For example, when the recognition unit 104 recognizes an operation selecting the option object 40A, the correction control unit 106 executes the correction processing; conversely, when an operation selecting the option object 40B is recognized, the correction processing is not executed. An option object is selected with an operating body (for example, the left hand) different from the operating body (for example, the right hand) operating the operation object. An option object may also be selected according to the number of fingers operating the operation object. The selection of whether to correct the movement information may also be made by another method; for example, an option may be selected by voice input, or on the basis of the line of sight of the user performing the operation.
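The dispatch on the selected option object might look like the following sketch (the option identifiers 40A and 40B are reused from the text; the function name and return convention are assumptions):

```python
def apply_choice(selection, raw_movement, corrected_movement):
    """Dispatch on the user's choice: option 40A applies the correction,
    option 40B keeps the raw (uncorrected) movement information."""
    if selection == "40A":   # "correct" option selected
        return corrected_movement
    if selection == "40B":   # "do not correct" option selected
        return raw_movement
    raise ValueError("unrecognized option: %r" % (selection,))
```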
 Note that the displayed option objects may be previews showing the corrected movement information. The case in which the option objects are previews of the corrected movement information is described with reference to FIG. 15, which illustrates an example in which previews showing the corrected movement information are displayed as option objects in the information processing apparatus 100 according to the fourth modification of an embodiment of the present disclosure.
 When it is determined that the movement information of the operating body is to be corrected, the correction control unit 106 calculates the movement direction of the corrected movement information and the movement direction of the uncorrected movement information. For example, the correction control unit 106 calculates the estimated movement direction on the basis of the uncorrected movement information. The correction control unit 106 also calculates the movement direction estimated from the positions, among those in the uncorrected movement information, that follow the point at which the movement direction is determined to have changed over time (hereinafter also referred to as the most recent movement direction).
 Next, the processing control unit 108 causes the projection device 300 to project an option object indicating the calculated movement direction of the corrected movement information and an option object indicating the calculated movement direction of the uncorrected movement information. For example, the processing control unit 108 causes the projection device 300 to project an option object 41A indicating the estimated movement direction and an option object 41B indicating the most recent movement direction, both calculated by the correction control unit 106.
 Furthermore, the correction control unit 106 may control the appearance of the options. Specifically, the correction control unit 106 controls the appearance of the displayed options on the basis of evaluation information about the control of the movement information before it is executed. Control of the appearance of the option objects is described in detail with reference to FIG. 16, which shows an example of such appearance control in the information processing apparatus 100 according to the fourth modification of an embodiment of the present disclosure.
 The correction control unit 106 determines whether to correct the movement information of the operating body on the basis of its movement mode, and also calculates an estimated proposal level for the correction. The proposal level may be an index indicating how appropriate (in other words, how reliable) the correction to be executed is with respect to the correction the user is estimated to desire. Specifically, the correction control unit 106 calculates the proposal level according to the degree of change in the movement direction. For example, when the movement path of the operating body is a meandering path such as the movement path R68 shown in FIG. 16, the degree of change in the movement direction is small, so the proposal level is calculated to be lower than when the movement direction changes without meandering. Note that the proposal level may instead be calculated by an external device.
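One hypothetical way to derive a proposal level from the degree of direction change along a 2-D path (the normalization to [0, 1] and the use of only the first and last segment headings are simplifying assumptions; a meandering path with little net change scores low):

```python
import math

def proposal_level(path):
    """Estimate a correction-proposal level in [0, 1] from the net change
    in heading between the first and last segments of a 2-D path."""
    if len(path) < 3:
        return 0.0
    def heading(p, q):
        return math.atan2(q[1] - p[1], q[0] - p[0])
    first = heading(path[0], path[1])
    last = heading(path[-2], path[-1])
    # Wrap the difference into (-pi, pi], then normalize by pi.
    change = abs(math.atan2(math.sin(last - first), math.cos(last - first)))
    return change / math.pi
```

A right-angle turn yields 0.5, while a nearly straight path yields a level near 0, so the suggestion would be presented weakly.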
 Next, when it is determined that the movement information of the operating body is to be corrected, the correction control unit 106 calculates the movement direction of the corrected movement information and the movement direction of the uncorrected movement information.
 The processing control unit 108 then causes the projection device 300 to project the option object indicating the movement direction of the corrected movement information and the option object indicating the movement direction of the uncorrected movement information, each in an appearance corresponding to the proposal level. For example, the processing control unit 108 causes the projection device 300 to project an option object 42A indicating the estimated movement direction and an option object 42B indicating the most recent movement direction, with their appearance changed according to the proposal level. Here, because the proposal level is low, the outlines of the option objects 42A and 42B are thinner, and their transparency higher, than those of the option objects 41A and 41B shown in FIG. 15, weakening their presence. The controlled appearance of an option object may instead be another visual attribute, such as its color, shading, or luminance.
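Mapping the proposal level to the visual weight of an option object could be sketched as follows (the linear mapping and parameter names are assumptions; color or luminance could be scaled the same way):

```python
def option_style(level, max_alpha=1.0, max_outline_px=3.0):
    """Map a proposal level in [0, 1] to the visual weight of an option
    object: a lower level gives a thinner outline and higher transparency."""
    level = min(max(level, 0.0), 1.0)  # clamp out-of-range levels
    return {"alpha": max_alpha * level, "outline_px": max_outline_px * level}
```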
 As described above, according to the fourth modification, the information processing apparatus 100 determines the control mode of the movement information on the basis of information concerning a selection operation, performed by the subject operating the operating body, of that control mode. If the movement information were corrected even when the user deliberately wishes an operation subject to correction control to be input as-is, processing contrary to the user's intention could be executed. By letting the user select whether the movement information is corrected, whether the correction processing is executed can be controlled according to the user's intention. It is therefore possible to prevent processing based on movement information corrected against the user's intention from being executed.
 The selection operation includes an operation selecting a displayed option. Because the options are presented to the user visually, the risk that the user overlooks an option or selects the wrong one can be reduced. Note that the options may instead be presented to the user by voice.
 The displayed options also include previews showing the movement information after control. This gives the user material for judging whether the apparatus should be allowed to correct the movement information, so the user can judge more accurately whether the movement information should be corrected.
 The information processing apparatus 100 also controls the appearance of the displayed options on the basis of evaluation information about the control of the movement information before it is executed. Presenting the strength of the recommendation to the user provides additional material for judging whether the apparatus should be allowed to correct the movement information, so the user can judge still more accurately whether the movement information should be corrected.
 (Fifth modification)
 As a fifth modification of the embodiment of the present disclosure, the correction mode of the movement information may be controlled on the basis of other information. Specifically, the correction control unit 106 determines the control mode of the movement information on the basis of attribute information or aspect information of the operation target. Operation targets include applications and displayed virtual objects. Attribute information of the operation target includes its type, format, or identifier; for example, the correction control unit 106 determines the degree of correction (for example, a correction parameter) of the movement information of the operating body according to the type of application operated with the operating body. Aspect information of the operation target includes its size, shape, or movement speed; for example, the correction control unit 106 determines the degree of correction of the movement information of the operating body according to the size of the virtual object operated with the operating body.
 The correction control unit 106 also determines the control mode of the movement information on the basis of aspect information of the place where the operation is performed, such as the shape, texture, or moisture of that place. For example, the correction control unit 106 determines the degree of correction of the movement information of the operating body according to the undulation of the surface on which the operation is performed.
 The correction control unit 106 also determines the control mode of the movement information on the basis of attribute information or aspect information of the subject of the operation. Attribute information of the subject includes the age, sex, or health condition of the user operating the operating body; for example, the correction control unit 106 increases the correction amount of the movement information of the operating body the older the user is. Aspect information of the subject includes the physique, position, posture, or behavior of the user operating the operating body; for example, the correction control unit 106 decreases the correction amount of the movement information of the operating body the longer the user's arm is.
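A minimal sketch of scaling the correction amount by subject attributes, assuming a linear model with made-up coefficients and reference values (the patent specifies only the directions of the effects: larger correction with age, smaller with arm length):

```python
def subject_correction_gain(age_years, arm_length_cm,
                            base=1.0, age_coeff=0.01, arm_coeff=0.005,
                            ref_age=40, ref_arm_cm=60):
    """Scale the correction amount by the operator's attributes: older
    users get a larger correction, longer-armed users a smaller one."""
    gain = (base
            + age_coeff * (age_years - ref_age)
            - arm_coeff * (arm_length_cm - ref_arm_cm))
    return max(gain, 0.0)  # a gain cannot be negative
```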
 The processing of this modification is described with reference to FIG. 17, a flowchart conceptually showing a detailed example of the movement-information correction processing (step S500B) in the information processing apparatus 100 according to the fifth modification of an embodiment of the present disclosure. Description of processing substantially identical to that described above is omitted.
 The information processing apparatus 100 executes correction preparation processing (step S600). Specifically, the processing of steps S501 to S505 described above is executed.
 Next, the information processing apparatus 100 determines whether attribute information or aspect information of the operation target has been acquired (step S521). Specifically, the correction control unit 106 acquires attribute information from the application, and the recognition unit 104 recognizes the aspect of the virtual object and provides the aspect information to the correction control unit 106.
 When it is determined that attribute information or aspect information of the operation target has been acquired (step S521/YES), the information processing apparatus 100 determines a correction amount according to the attribute or aspect of the operation target (step S522). Specifically, the correction control unit 106 calculates a correction amount of a degree corresponding to the attribute indicated by the acquired attribute information or the aspect indicated by the aspect information.
 The information processing apparatus 100 also determines whether aspect information of the operation place has been acquired (step S523). Specifically, the recognition unit 104 recognizes the operation surface over which the user moves the operating body and provides aspect information about the recognized operation surface to the correction control unit 106.
 When it is determined that aspect information of the operation place has been acquired (step S523/YES), the information processing apparatus 100 determines a correction amount according to the aspect of the operation place (step S524). Specifically, the correction control unit 106 calculates a correction amount of a degree corresponding to the aspect indicated by the acquired aspect information of the operation place.
 The information processing apparatus 100 also determines whether attribute information or aspect information of the operation subject has been acquired (step S525). Specifically, the recognition unit 104 recognizes the user performing the operation and provides the correction control unit 106 with information identifying the recognized user or with aspect information about that user's aspect. The correction control unit 106 then acquires the user's attribute information from a storage unit or the like on the basis of the provided identifying information. The user's attribute information may instead be acquired from an external device via communication.
 When it is determined that attribute information or aspect information of the operation subject has been acquired (step S525/YES), the information processing apparatus 100 determines a correction amount according to the attribute or aspect of the operation subject (step S526). Specifically, the correction control unit 106 calculates a correction amount of a degree corresponding to the attribute indicated by the acquired attribute information of the user or the aspect indicated by the aspect information.
 When none of the above information is acquired, the information processing apparatus 100 determines the correction amount according to a standard (step S527). Specifically, the correction control unit 106 sets the correction amount to an initial value.
 Next, the information processing apparatus 100 corrects the movement direction of the movement information on the basis of the correction amount and the estimated movement direction (step S528). Specifically, the correction control unit 106 changes the movement direction of the movement information to the direction obtained by adding the calculated correction amount to the estimated movement direction determined from the movement direction of the estimated movement path.
 The information processing apparatus 100 also corrects the movement amount of the movement information on the basis of the correction amount and the calculated movement-path length (step S529). Specifically, the correction control unit 106 changes the movement amount of the movement information to the length obtained by adding the calculated correction amount to the movement-path length calculated from the length of the estimated movement path in a specific direction.
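Steps S528 and S529 taken together could be sketched as follows (the additive combination of correction amounts with the estimated direction and path length follows the text; the units and function name are assumptions):

```python
def correct_movement(estimated_direction_deg, path_length,
                     direction_delta_deg, length_delta):
    """Apply the determined correction amounts: shift the movement
    direction by the direction correction (step S528) and extend the
    movement amount by the length correction (step S529)."""
    return {
        "direction_deg": (estimated_direction_deg + direction_delta_deg) % 360.0,
        "amount": path_length + length_delta,
    }
```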
 As described above, according to the fifth modification, the information processing apparatus 100 determines the control mode of the movement information on the basis of attribute information or aspect information of the operation target. In general, the operation method or the ease of operation differs depending on the operation target. By controlling the presence or degree of correction according to the operation target, the correction processing can be controlled according to whether correction is desired, that is, according to user needs.
 The information processing apparatus 100 also determines the control mode of the movement information on the basis of aspect information of the place where the operation is performed. In general, the ease of operation differs depending on the aspect of the operation place. By controlling the presence or degree of correction according to the operation place, the correction processing can be controlled according to whether correction is desired, that is, according to user needs.
 The information processing apparatus 100 also determines the control mode of the movement information on the basis of attribute information or aspect information of the operation subject. In general, the manner or range of operation differs depending on the operation subject. By controlling the presence or degree of correction according to the operation subject, the correction processing can be controlled according to whether correction is desired, that is, according to user needs.
 The control mode of the movement information also includes the degree of control of the movement information. Because not only whether to correct but also the degree of correction is controlled, finer-grained correction control is possible, and finer user needs can therefore be accommodated.
 (Sixth modification)
 As a sixth modification of the embodiment of the present disclosure, the correction of the movement information may be canceled. Specifically, the correction control unit 106 controls cancellation of the control of the movement information according to the presence or absence of a predetermined operation (hereinafter also referred to as a cancel operation) after the control of the movement information has started. More specifically, when the cancel operation is recognized, the correction control unit 106 changes the movement information used for the operation-response processing back to the pre-control movement information. Cancel operations include waving a hand, pausing the operation for a predetermined time, and increasing or decreasing the number of fingers. For example, after the movement information has been corrected and a virtual object or the like has been moved in the direction of the corrected movement information in response to the user's operation, when the user performs a cancel operation, the correction control unit 106 moves the virtual object to the position it would have reached had it been moved in the direction of the uncorrected movement information in response to that operation.
 The cancellation processing for the correction of movement information is described in detail with reference to FIG. 18, a flowchart conceptually showing an example of the overall processing, including the cancellation processing, in the information processing apparatus 100 according to the sixth modification of an embodiment of the present disclosure. Description of processing substantially identical to that described above is omitted.
 When the information processing apparatus 100 determines, after starting the application (step S431), that no end operation has been recognized (step S432/NO), it determines whether movement of the operating body has been recognized (step S433).
 When it is determined that movement of the operating body has been recognized (step S433/YES), the information processing apparatus 100 determines whether the movement direction of the operating body has changed (step S434).
 When it is determined that the movement direction of the operating body has changed (step S434/YES), the movement-information correction processing is executed (steps S435 to S437), after which processing is executed on the basis of the corrected movement information (step S438).
 Next, the information processing apparatus 100 determines whether a cancel operation for the correction of the movement information has been recognized (step S439). Specifically, the correction control unit 106 determines whether the recognition unit 104 has recognized a cancel operation after, or while, the processing based on the corrected movement information is executed.
 When it is determined that a cancel operation for the correction of the movement information has been recognized (step S439/YES), the information processing apparatus 100 changes the movement information back to the uncorrected movement information (step S440). Specifically, when the cancel operation is recognized, the correction control unit 106 causes the processing control unit 108 to cancel or abort the processing based on the corrected movement information and provides the uncorrected movement information to the processing control unit 108.
 Next, the information processing apparatus 100 executes processing based on the changed movement information (step S441). Specifically, the processing control unit 108 cancels or aborts the processing based on the corrected movement information and executes processing based on the uncorrected movement information.
 As described above, according to the sixth modification, the information processing apparatus 100 controls cancellation of the control of the movement information according to the presence or absence of a predetermined operation after the control of the movement information has started. Even when the movement information has been corrected against the user's intention, the execution of processing based on the corrected movement information can thus be suppressed, improving usability.
 Cancellation of the control of the movement information also includes changing the movement information used for processing back to the pre-control movement information. Because processing based on the uncorrected movement information is then executed automatically, and not merely the correction canceled, the user's operation is not wasted, further reducing the burden on the user. Note that the cancellation of the correction of the movement information may instead cancel the operation itself; for example, the processing control unit 108 cancels or aborts the processing based on the corrected movement information and executes no alternative processing.
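The restore-on-cancel behavior could be sketched as follows (the class, its history mechanism, and the string placeholders for movement information are illustrative assumptions):

```python
class MovementCorrector:
    """Keeps the pre-correction movement information so that a cancel
    gesture (e.g. waving a hand) can restore it after the correction
    has already been applied."""
    def __init__(self):
        self._history = []

    def apply(self, raw, corrected):
        # Remember the uncorrected movement information, return the
        # corrected version for the operation-response processing.
        self._history.append(raw)
        return corrected

    def cancel(self):
        # Revert to the movement information before the last correction;
        # None when there is nothing to cancel.
        if not self._history:
            return None
        return self._history.pop()
```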
 (Seventh modification)
 As a seventh modification of the embodiment of the present disclosure, the control mode of the movement information may be determined before the operation starts. Specifically, the correction control unit 106 determines the correction mode on the basis of information about the operation that is determined before the operation starts; more specifically, on the basis of the determined operation subject, operation place, or operation target. For example, when the operation target is determined to be a map application or an application that displays products or the like in three-dimensional space, the correction control unit 106 automatically determines that correction is not to be performed.
 The control mode of the movement information may also be determined when the operation starts. Specifically, the correction control unit 106 determines the correction mode on the basis of information about the operation that is determined at the start of the operation. For example, when a slide bar that can be moved only in one dimension, or the seek bar of a player that plays video or music, is touched, that virtual object is determined to be the operation target, and the correction control unit 106 determines that correction is to be performed. Note that whether correction is actually performed may be selected by the user as described above.
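The target-based decision before or at operation start might be sketched as follows (the category sets merely restate the examples in the text; the names and the three-valued return are hypothetical):

```python
# Hypothetical target categories standing in for the examples in the text.
ONE_DIMENSIONAL_TARGETS = {"slide_bar", "seek_bar"}
FREE_MOVEMENT_TARGETS = {"map_app", "3d_product_view"}

def correction_enabled_for(target):
    """Decide the correction mode before/at operation start:
    one-dimensional widgets enable correction, free-panning targets
    disable it, and unknown targets defer to a run-time decision."""
    if target in ONE_DIMENSIONAL_TARGETS:
        return True
    if target in FREE_MOVEMENT_TARGETS:
        return False
    return None  # fall back to a decision during the operation
```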
 Thus, according to the seventh modification, the timing at which the control mode of the movement information is determined includes before or at the start of an operation. The processing result based on the corrected movement information (for example, the movement of a virtual object) can therefore be presented to the user from the very beginning of the operation. Accordingly, the user can operate while checking the processing result based on the corrected movement information.
 <2. Hardware Configuration of the Information Processing Apparatus According to an Embodiment of the Present Disclosure>
 The information processing apparatus 100 according to an embodiment of the present disclosure has been described above. The processing of the information processing apparatus 100 described above is realized by cooperation between software and the hardware of the information processing apparatus 100 described below.
 FIG. 19 is an explanatory diagram illustrating a hardware configuration of the information processing apparatus 100 according to an embodiment of the present disclosure. As illustrated in FIG. 19, the information processing apparatus 100 includes a processor 132, a memory 134, a bridge 136, a bus 138, an interface 140, an input device 142, an output device 144, a storage device 146, a drive 148, a connection port 150, and a communication device 152.
  (Processor)
 The processor 132 functions as an arithmetic processing device and, in cooperation with various programs, realizes the functions of the recognition unit 104, the correction control unit 106, and the processing control unit 108 in the information processing apparatus 100. The processor 132 operates the various logical functions of the information processing apparatus 100 by using a control circuit to execute programs stored in the memory 134 or another storage medium. For example, the processor 132 may be a CPU (Central Processing Unit), a GPU (Graphics Processing Unit), a DSP (Digital Signal Processor), or an SoC (System-on-a-Chip).
  (Memory)
 The memory 134 stores programs, operation parameters, and the like used by the processor 132. For example, the memory 134 includes a RAM (Random Access Memory) and temporarily stores programs used in execution by the processor 132 and parameters that change as appropriate during that execution. The memory 134 also includes a ROM (Read Only Memory); the RAM and the ROM together realize the function of a storage unit. Note that an external storage device may be used as part of the memory 134 via the connection port 150, the communication device 152, or the like.
 Note that the processor 132 and the memory 134 are connected to each other by an internal bus such as a CPU bus.
  (Bridge and bus)
 The bridge 136 connects buses. Specifically, the bridge 136 connects the internal bus to which the processor 132 and the memory 134 are connected with the bus 138, which connects to the interface 140.
  (Input device)
 The input device 142 is used by a user to operate the information processing apparatus 100 or to input information into the information processing apparatus 100. For example, the input device 142 includes input means for the user to input information and an input control circuit that generates an input signal on the basis of the user's input and outputs it to the processor 132. The input means may be a mouse, a keyboard, a touch panel, a switch, a lever, a microphone, or the like. By operating the input device 142, the user of the information processing apparatus 100 can input various data into the information processing apparatus 100 and instruct it to perform processing operations.
  (Output device)
 The output device 144 is used to notify the user of information. For example, the output device 144 may be a device such as a liquid crystal display (LCD) device, an OLED (Organic Light Emitting Diode) device, a projector, a speaker, or headphones, or a module that outputs to such a device.
 Note that the input device 142 or the output device 144 may include an input/output device. For example, the input/output device may be a touch screen.
  (Storage device)
 The storage device 146 is a device for storing data. The storage device 146 may include a storage medium, a recording device that records data on the storage medium, a reading device that reads data from the storage medium, a deletion device that deletes data recorded on the storage medium, and the like. The storage device 146 stores programs executed by the processor 132 and various data.
  (Drive)
 The drive 148 is a reader/writer for storage media and is built into or externally attached to the information processing apparatus 100. The drive 148 reads information stored on a mounted removable storage medium, such as a magnetic disk, an optical disc, a magneto-optical disk, or a semiconductor memory, and outputs the information to the memory 134. The drive 148 can also write information to the removable storage medium.
  (Connection port)
 The connection port 150 is a port for directly connecting a device to the information processing apparatus 100. For example, the connection port 150 may be a USB (Universal Serial Bus) port, an IEEE 1394 port, a SCSI (Small Computer System Interface) port, or the like. The connection port 150 may also be an RS-232C port, an optical audio terminal, an HDMI (registered trademark) (High-Definition Multimedia Interface) port, or the like. By connecting an external device to the connection port 150, data may be exchanged between the information processing apparatus 100 and that external device.
  (Communication device)
 The communication device 152 mediates communication between the information processing apparatus 100 and external devices and realizes the function of the communication unit 102. Specifically, the communication device 152 performs communication in accordance with a wireless or wired communication scheme. For example, the communication device 152 performs wireless communication in accordance with a cellular communication scheme such as WCDMA (registered trademark) (Wideband Code Division Multiple Access), WiMAX (registered trademark), LTE (Long Term Evolution), or LTE-A. The communication device 152 may also perform wireless communication in accordance with any other wireless communication scheme, such as a short-range scheme such as Bluetooth (registered trademark), NFC (Near Field Communication), wireless USB, or TransferJet (registered trademark), or a wireless LAN (Local Area Network) scheme such as Wi-Fi (registered trademark). Further, the communication device 152 may perform wired communication such as signal-line communication or wired LAN communication.
 Note that the information processing apparatus 100 may omit part of the configuration described with reference to FIG. 19, or may have any additional configuration. A one-chip information processing module integrating all or part of the configuration described with reference to FIG. 19 may also be provided.
 <3. Conclusion>
 As described above, according to an embodiment of the present disclosure, the movement information used in processing is corrected to movement information corresponding to the movement of the operating body that the user intends, so that processing in line with the user's intention can be executed even for operations in a range exceeding the user's range of motion. Further, because the operation amount is corrected on the basis of the movement path length, loss of operation amount can be suppressed. It is therefore possible to cause the apparatus to execute the processing the user intends while reducing the burden that operating the apparatus places on the user. As a result, the user can perform a desired operation without changing the operation method or posture of the operating body, and can concentrate on the purpose of the operation without paying attention to the operation of the apparatus itself.
 The preferred embodiments of the present disclosure have been described in detail above with reference to the accompanying drawings, but the technical scope of the present disclosure is not limited to these examples. It is obvious that a person having ordinary knowledge in the technical field of the present disclosure can conceive of various changes or modifications within the scope of the technical ideas described in the claims, and it is understood that these naturally also belong to the technical scope of the present disclosure.
 For example, in the embodiment above, the information processing apparatus 100 has a single user, but the present technology is not limited to this example. Specifically, there may be a plurality of users. For example, the control mode of the movement-information correction may be determined and managed for each of a plurality of users, or the control modes of the movement-information correction for a plurality of users may be managed collectively.
 In the embodiment above, an example in which the operation target is projected by the projection device 300 was described, but the operation target may be displayed by another display device. For example, the operation target may be displayed by a display device such as a tablet terminal, a smartphone, or a stationary display. As another example, the operation target may be displayed by an HUD (Head Up Display), which transmits light from the external scene and displays an image on a display unit or projects image light onto the user's eyes, or by an HMD (Head Mount Display), which displays a captured image of the external scene together with other images. In the former case, the position of the operating body is recognized using a touch sensor (for example, a capacitive or pressure-sensitive sensor) integrated with the display unit. In the latter case, the position of the operating body is recognized through analysis of images obtained by imaging with an imaging device.
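The two recognition paths mentioned above (touch sensor vs. image analysis) amount to a dispatch on the display type. The sketch below is illustrative only; the display-type names and the stubbed image-analysis result are assumptions, not identifiers from the disclosure:

```python
def recognize_position(display_type, touch_event=None, camera_frame=None):
    """Select how the operating body's position is recognized:
    displays with an integrated touch sensor report the position
    directly, while camera-based setups (e.g. an HMD or projection)
    recover it by analysing the captured image."""
    if display_type in {"tablet", "smartphone", "stationary_display"}:
        # Position comes directly from the integrated touch sensor.
        return touch_event
    if display_type in {"hmd", "projection"}:
        # Image analysis is stubbed here as a precomputed detection
        # attached to the frame by an upstream vision pipeline.
        return camera_frame.get("detected_position") if camera_frame else None
    raise ValueError(f"unknown display type: {display_type}")
```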
 In the embodiment above, an example was described in which the degree of correction of the movement information is controlled automatically by the information processing apparatus 100, but it may instead be selected by the user. For example, each of the processing results based on movement information corrected with a plurality of correction parameters is shown as a preview, and the user chooses which correction parameter to use by selecting one of the previewed processing results.
 The information processing system according to an embodiment of the present disclosure may also be a server-client system or a cloud-service system. For example, the information processing apparatus 100 may be a server installed at a location different from the location where the measurement device 200 and the projection device 300 are installed.
 The information processing system according to an embodiment of the present disclosure may also be applied to other use cases. For example, the information processing apparatus 100 may receive an operation in which the operating body is, for instance, the hand of a user performing product assembly work, and may automatically correct the operation, or propose a correction of the operation, on the basis of the input operation.
 The effects described in this specification are merely explanatory or illustrative, and are not limiting. That is, the technology according to the present disclosure may exhibit, in addition to or instead of the above effects, other effects that are apparent to those skilled in the art from the description of this specification.
 The steps shown in the flowcharts of the embodiment above include not only processing performed chronologically in the described order but also processing executed in parallel or individually without necessarily being processed chronologically. It goes without saying that even the order of steps processed chronologically may be changed as appropriate in some cases.
 It is also possible to create a computer program for causing the hardware built into the information processing apparatus 100 to exhibit functions equivalent to the functional configurations of the information processing apparatus 100 described above. A storage medium storing the computer program is also provided.
 Note that the following configurations also belong to the technical scope of the present disclosure.
(1)
An information processing apparatus including:
a processing unit that executes processing on the basis of movement information of an operating body; and
a control unit that controls the movement information used in the processing on the basis of a first direction determined from a change in position during the movement of the operating body and a movement path length of the operating body.
(2)
The information processing apparatus according to (1), in which the control unit determines, as the movement information used in the processing, movement information corresponding to movement in the first direction by a length equivalent to the movement path length.
(3)
The information processing apparatus according to (1) or (2), in which the control unit determines a control mode of the movement information on the basis of information about a movement mode of the operating body.
(4)
The information processing apparatus according to (3), in which the movement mode of the operating body includes a change in the movement direction of the operating body.
(5)
The information processing apparatus according to (3) or (4), in which the control unit determines the control mode of the movement information on the basis of a comparison between the movement mode of the operating body and a pre-stored pattern of the movement mode of the operating body.
(6)
The information processing apparatus according to (5), in which the movement mode of the operating body includes the movement speed, acceleration, or movement path of the operating body.
(7)
The information processing apparatus according to (5) or (6), in which the pattern of the movement mode of the operating body is obtained by machine learning of the movement mode of the operating body.
(8)
The information processing apparatus according to any one of (1) to (7), in which the control unit determines the control mode of the movement information on the basis of information about a selection operation, by the subject of the operation using the operating body, of the control mode of the movement information.
(9)
The information processing apparatus according to (8), in which the selection operation includes an operation of selecting a displayed option.
(10)
The information processing apparatus according to (9), in which the displayed options include a preview showing the movement information after control.
(11)
The information processing apparatus according to (9) or (10), in which the control unit controls the mode of the displayed options on the basis of evaluation information about the control of the movement information before execution.
(12)
The information processing apparatus according to any one of (1) to (11), in which the control unit determines the control mode of the movement information on the basis of attribute information or mode information of the target of the operation by the operating body.
(13)
The information processing apparatus according to any one of (1) to (12), in which the control unit determines the control mode of the movement information on the basis of mode information of the place where the operation by the operating body is performed.
(14)
The information processing apparatus according to any one of (1) to (13), in which the control unit determines the control mode of the movement information on the basis of attribute information or mode information of the subject of the operation by the operating body.
(15)
The information processing apparatus according to any one of (1) to (14), in which the control mode of the movement information includes whether the movement information is controlled and the degree of control.
(16)
The information processing apparatus according to any one of (1) to (15), in which the control unit controls cancellation of the control of the movement information in accordance with the presence or absence of a predetermined operation after the start of the control of the movement information.
(17)
The information processing apparatus according to (16), in which the cancellation of the control of the movement information includes changing the movement information used in the processing back to the pre-control movement information.
(18)
The information processing apparatus according to any one of (1) to (17), in which the first direction includes a direction determined from a tangent at the position of the moving operating body.
(19)
An information processing method including, using a processor:
executing processing on the basis of movement information of an operating body; and
controlling the movement information used in the processing on the basis of a first direction determined from a change in position during the movement of the operating body and a movement path length of the operating body.
(20)
A program for causing a computer to realize:
a processing function that executes processing on the basis of movement information of an operating body; and
a control function that controls the movement information used in the processing on the basis of a first direction determined from a change in position during the movement of the operating body and a movement path length of the operating body.
 Reference Signs List
 100 information processing apparatus
 102 communication unit
 104 recognition unit
 106 correction control unit
 108 processing control unit
 200 measurement device
 300 projection device

Claims (20)

  1.  An information processing apparatus comprising:
     a processing unit that executes processing on the basis of movement information of an operating body; and
     a control unit that controls the movement information used in the processing on the basis of a first direction determined from a change in position during the movement of the operating body and a movement path length of the operating body.
  2.  The information processing apparatus according to claim 1, wherein the control unit determines, as the movement information used in the processing, movement information corresponding to movement in the first direction by a length equivalent to the movement path length.
  3.  The information processing apparatus according to claim 1, wherein the control unit determines a control mode of the movement information on the basis of information about a movement mode of the operating body.
  4.  The information processing apparatus according to claim 3, wherein the movement mode of the operating body includes a change in the movement direction of the operating body.
  5.  The information processing apparatus according to claim 3, wherein the control unit determines the control mode of the movement information on the basis of a comparison between the movement mode of the operating body and a pre-stored pattern of the movement mode of the operating body.
  6.  The information processing apparatus according to claim 5, wherein the movement mode of the operating body includes the movement speed, acceleration, or movement path of the operating body.
  7.  The information processing apparatus according to claim 5, wherein the pattern of the movement mode of the operating body is obtained by machine learning of the movement mode of the operating body.
  8.  The information processing apparatus according to claim 1, wherein the control unit determines the control mode of the movement information on the basis of information about a selection operation, by the subject of the operation using the operating body, of the control mode of the movement information.
  9.  The information processing apparatus according to claim 8, wherein the selection operation includes an operation of selecting a displayed option.
  10.  The information processing apparatus according to claim 9, wherein the displayed options include a preview showing the movement information after control.
  11.  The information processing apparatus according to claim 9, wherein the control unit controls the mode of the displayed options on the basis of evaluation information about the control of the movement information before execution.
  12.  The information processing apparatus according to claim 1, wherein the control unit determines the control mode of the movement information on the basis of attribute information or mode information of the target of the operation by the operating body.
  13.  The information processing apparatus according to claim 1, wherein the control unit determines the control mode of the movement information on the basis of mode information of the place where the operation by the operating body is performed.
  14.  The information processing apparatus according to claim 1, wherein the control unit determines the control mode of the movement information on the basis of attribute information or mode information of the subject of the operation by the operating body.
  15.  The information processing apparatus according to claim 1, wherein the control mode of the movement information includes whether the movement information is controlled and the degree of control.
  16.  The information processing apparatus according to claim 1, wherein the control unit controls cancellation of the control of the movement information in accordance with the presence or absence of a predetermined operation after the start of the control of the movement information.
  17.  The information processing apparatus according to claim 16, wherein the cancellation of the control of the movement information includes changing the movement information used in the processing back to the pre-control movement information.
  18.  The information processing apparatus according to claim 1, wherein the first direction includes a direction determined from a tangent at the position of the moving operating body.
  19.  An information processing method comprising, using a processor:
     executing processing on the basis of movement information of an operating body; and
     controlling the movement information used in the processing on the basis of a first direction determined from a change in position during the movement of the operating body and a movement path length of the operating body.
  20.  A program for causing a computer to realize:
     a processing function that executes processing on the basis of movement information of an operating body; and
     a control function that controls the movement information used in the processing on the basis of a first direction determined from a change in position during the movement of the operating body and a movement path length of the operating body.
PCT/JP2017/014461 2016-05-31 2017-04-07 Information processing device, information processing method, and program WO2017208619A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2016-108410 2016-05-31
JP2016108410 2016-05-31

Publications (1)

Publication Number Publication Date
WO2017208619A1 true WO2017208619A1 (en) 2017-12-07

Family

ID=60478193

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2017/014461 WO2017208619A1 (en) 2016-05-31 2017-04-07 Information processing device, information processing method, and program

Country Status (1)

Country Link
WO (1) WO2017208619A1 (en)

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH0612493A (en) * 1992-06-25 1994-01-21 Hitachi Ltd Gesture recognizing method and user interface method
JP2012043194A (en) * 2010-08-19 2012-03-01 Sony Corp Information processor, information processing method, and program
JP2012185694A (en) * 2011-03-07 2012-09-27 Elmo Co Ltd Drawing system
JP2012256099A (en) * 2011-06-07 2012-12-27 Sony Corp Information processing terminal and method, program, and recording medium
JP2015148947A * 2014-02-06 2015-08-20 Sony Corporation Information processing system, information processing method, and program
JP2015165346A * 2014-02-28 2015-09-17 Fuji Xerox Co., Ltd. Image processing apparatus, image processing method, image processing system, and program


Similar Documents

Publication Publication Date Title
US11112856B2 (en) Transition between virtual and augmented reality
US20200279104A1 (en) Gesture-based casting and manipulation of virtual content in artificial-reality environments
KR102105637B1 (en) Input through context-sensitive collision of objects and hands in virtual reality
EP3411777B1 (en) Method of object motion tracking with remote device for mixed reality system, mixed reality system and computer readable medium
US10635184B2 (en) Information processing device, information processing method, and program
JP6288372B2 (en) Interface control system, interface control device, interface control method, and program
EP3090331B1 (en) Systems with techniques for user interface control
EP2980677B1 (en) Wearable device and method of operating the same
WO2017177006A1 (en) Head mounted display linked to a touch sensitive input device
CN109074217A (en) Application for multiple point touching input detection
TWI525477B (en) System and method for receiving user input and program storage medium thereof
JP6921193B2 (en) Game programs, information processing devices, information processing systems, and game processing methods
US20150199020A1 (en) Gesture ui device, gesture ui method, and computer-readable recording medium
WO2015102974A1 (en) Hangle-based hover input method
TWI668600B (en) Method, device, and non-transitory computer readable storage medium for virtual reality or augmented reality
JP6816727B2 (en) Information processing equipment, information processing methods and programs
WO2017208637A1 (en) Information processing device, information processing method, and program
WO2016147498A1 (en) Information processing device, information processing method, and program
WO2017208619A1 (en) Information processing device, information processing method, and program
US9454233B2 (en) Non-transitory computer readable medium
AU2015297289A1 (en) Wearable device and method of operating the same
JP2016119019A (en) Information processing apparatus, information processing method, and program
US20140266982A1 (en) System and method for controlling an event in a virtual reality environment based on the body state of a user
JP2018013926A (en) Display control method
JP6373546B2 (en) Information processing apparatus, information processing method, and program

Legal Events

Date Code Title Description
121  Ep: the epo has been informed by wipo that ep was designated in this application
     Ref document number: 17806180
     Country of ref document: EP
     Kind code of ref document: A1

NENP Non-entry into the national phase
     Ref country code: DE

122  Ep: pct application non-entry in european phase
     Ref document number: 17806180
     Country of ref document: EP
     Kind code of ref document: A1

NENP Non-entry into the national phase
     Ref country code: JP