WO2023238568A1 - Information processing method, program, and information processing device - Google Patents

Information processing method, program, and information processing device

Info

Publication number
WO2023238568A1
WO2023238568A1 (PCT/JP2023/017267)
Authority
WO
WIPO (PCT)
Prior art keywords
movement
mode
information processing
command
switching
Prior art date
Application number
PCT/JP2023/017267
Other languages
French (fr)
Japanese (ja)
Inventor
泰生 菰田
Original Assignee
株式会社ジンズホールディングス (JINS Holdings Inc.)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 株式会社ジンズホールディングス
Publication of WO2023238568A1

Links

Images

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer

Definitions

  • the present invention relates to an information processing method, a program, and an information processing device.
  • the disclosed technology aims to improve usability by making it possible to more appropriately realize the operation intended by the user when an input operation is performed using the movement of a predetermined part where a sensor is provided.
  • An information processing method in one aspect of the disclosed technology includes: an information processing apparatus sequentially acquiring, from a sensor capable of measuring the movement of a predetermined part of a user, data regarding that movement at a plurality of points in time; determining, while in a non-movement mode, whether the motion of the predetermined part is a mode-switching motion based on the data; switching from the non-movement mode to a movement mode when the motion is determined to be a mode-switching motion; outputting movement information regarding the pointer based on the data while in the movement mode; and switching from the movement mode back to the non-movement mode when a predetermined condition is satisfied after the movement information is output.
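The two-mode control summarized above can be sketched as a small state machine. This is an illustrative sketch only: the class and mode names, the sample shape, and the `feed` interface are assumptions and do not appear in the patent.

```python
# Minimal sketch of the non-movement/movement mode machine. All names are
# illustrative assumptions, not taken from the patent text.

MODE_NON_MOVING = "non-moving"
MODE_MOVING = "moving"

class PointerModeMachine:
    def __init__(self):
        # The device starts in the non-movement mode, so unintended
        # movement of the predetermined part does not move the pointer.
        self.mode = MODE_NON_MOVING

    def feed(self, is_switch_gesture, dx, dy, done):
        """Process one sensor sample.

        is_switch_gesture: True when the sample completes a mode-switching
        motion; dx, dy: pointer deltas derived from the movement data;
        done: True when the predetermined end condition is satisfied.
        Returns the movement information output for this sample.
        """
        if self.mode == MODE_NON_MOVING:
            if is_switch_gesture:
                self.mode = MODE_MOVING
            return (0, 0)            # non-movement information
        out = (dx, dy)               # movement information
        if done:                     # predetermined condition satisfied
            self.mode = MODE_NON_MOVING
        return out
```

In this sketch the machine always emits a report, so the receiving side can treat the stream exactly like mouse reports, which matches the usability argument made later in the text.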
  • FIG. 1 is a diagram showing an example of an information processing system in an embodiment.
  • FIG. 2 is a block diagram illustrating an example of the configuration of an external information processing device in an embodiment.
  • FIG. 3 is a block diagram showing an example of the configuration of an information processing device in the embodiment.
  • FIG. 4 is a block diagram showing an example of the configuration of a processing unit in the embodiment.
  • FIG. 5 is a diagram showing an example of data regarding the moving speed in the embodiment.
  • FIG. 6 is a diagram showing an example of the relationship between data, movement, and output in the embodiment.
  • FIG. 7 is a diagram showing an example of the relationship between the movement of a predetermined part, a mode, and a command in the embodiment.
  • FIG. 8 is a diagram illustrating an example of the relationship between movements, commands, and pointer movements in the embodiment.
  • FIG. 9 is a flowchart illustrating an example (part 1) of processing related to command issuance in the embodiment.
  • FIG. 10 is a flowchart illustrating an example (part 2) of processing related to command issuance in the embodiment.
  • FIG. 1 is a diagram showing an example of an information processing system 1 in an embodiment.
  • The information processing system 1 shown in FIG. 1 includes an external information processing device (hereinafter also referred to as the "external device") 10 and eyewear 30; the external device 10 and the eyewear 30 are connected via a network and can exchange data with each other.
  • the eyewear 30 has the information processing device 20 mounted on the bridge portion, for example.
  • the information processing device 20 includes a pair of nose pads and a bridge portion, each of which may have bioelectrodes 32, 34, and 36.
  • the information processing device 20 includes a 3-axis acceleration sensor and/or a 3-axis angular velocity sensor (or a 6-axis sensor). Note that the bioelectrodes 32, 34, and 36 are not necessarily required.
  • the information processing device 20 detects sensor signals, electro-oculography signals, etc. and transmits them to the external device 10.
  • The installation position of the information processing device 20 does not necessarily have to be the bridge portion; it may be any position from which the sensor signal and the electro-oculography signal can be obtained while the eyewear 30 is worn. Further, the information processing device 20 may be removably attached to the bridge portion.
  • the external device 10 is an information processing device that has a communication function.
  • the external device 10 is a server device, a personal computer, a tablet terminal, a mobile terminal such as a smartphone, or the like.
  • the external device 10 acquires data regarding the movement of the user's head received from the information processing device 20, and executes a plurality of operation processes based on this data.
  • the operation process includes, for example, pointer (cursor) movement, clicking, dragging, scrolling, and the like.
  • the external device 10 may receive a command for instructing an operation process or data indicating the amount of movement of the pointer from the information processing device 20 in accordance with the movement of the head.
  • FIG. 2 is a block diagram showing an example of the configuration of the external device 10 in the embodiment.
  • External device 10 includes one or more processing units (CPUs) 110, one or more network communication interfaces 120, memory 130, a user interface 150, an image sensor 160, and one or more communication buses 170 for interconnecting these components.
  • the network communication interface 120 is connected to a network through a mobile communication antenna or a wireless LAN communication antenna, and is capable of data communication with the information processing device 20.
  • User interface 150 may include a display device and an input device (such as a keyboard and/or mouse, or some other pointing device).
  • the user interface 150 is capable of moving a pointer displayed on a display device.
  • the image sensor 160 is an image sensor that receives an optical signal, converts the optical signal into an electrical signal, and generates an image.
  • the image includes at least one of a still image and a moving image.
  • Memory 130 may be, for example, a high-speed random access memory such as DRAM or SRAM, or another random-access solid-state memory; it may be a non-volatile memory such as one or more magnetic disk storage devices, optical disk storage devices, flash memory devices, or other non-volatile solid-state storage devices; and it may be a computer-readable non-transitory recording medium.
  • the memory 130 stores data used by the information processing system 1.
  • the memory 130 stores data transmitted from the eyewear 30 (information processing device 20).
  • memory 130 stores programs, modules and data structures, or a subset thereof, that are executed by CPU 110.
  • The CPU 110 configures the acquisition unit 112 and the processing control unit 113 by executing a program stored in the memory 130.
  • the acquisition unit 112 acquires instructions (commands) regarding operations, data indicating the amount of movement of the pointer, etc. from the information processing device 20 of the eyewear 30.
  • The processing control unit 113 executes the commands related to the operations acquired by the acquisition unit 112, and controls the movement of the pointer using the acquired amount of movement. For example, the processing control unit 113 controls operations such as clicking and dragging using commands and data obtained from the eyewear 30, which functions like an input device such as a pointing device (e.g., a mouse).
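The external-device side described here can be sketched as a consumer of mouse-like reports. The report shape `(dx, dy, command)` and the class name are assumptions for illustration; the patent only states that the external device applies received movement information and commands as a mouse driver would.

```python
# Hedged sketch of the external device (processing control unit) applying
# received reports. The (dx, dy, command) report format is an assumption.

class PointerHost:
    def __init__(self):
        self.x, self.y = 0, 0        # current pointer position
        self.commands = []           # executed operation commands

    def handle(self, dx, dy, command=None):
        """Apply one report: move the pointer by the received amount and
        execute any attached command (click, drag, etc.)."""
        self.x += dx
        self.y += dy
        if command:
            self.commands.append(command)
```

Because the host only ever sees deltas and command names, non-movement information `(0, 0)` leaves the pointer where it is with no special handling.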
  • The acquisition unit 112 of the CPU 110 may sequentially acquire data regarding the movement speed of a predetermined part of the user from a sensor capable of measuring the movement of that part (for example, a sensor attached to the predetermined part).
  • the processing control unit 113 may execute mode control processing, determination processing, and output processing of the information processing device 20, which will be described later, and may control operations such as clicking and dragging.
  • the processing of the processing control unit 113 may also include operations such as a function key, a numeric keypad, and a setting change, which may be associated with the movement of the user's head.
  • FIG. 3 is a block diagram showing an example of the configuration of the information processing device 20 in the embodiment.
  • The information processing device 20 includes a processing unit 202, a transmitting unit 204, a six-axis sensor 206, a power supply unit 208, and the bioelectrodes 32, 34, and 36. Each of the bioelectrodes 32, 34, and 36 is connected to the processing unit 202, for example by an electric wire via an amplification unit. Note that the bioelectrodes 32, 34, and 36 are not necessarily required.
  • the information processing device 20 also includes a memory that stores processing data. This memory may be a computer readable non-transitory storage medium.
  • the transmitting unit 204 transmits, for example, data regarding the moving speed packetized by the processing unit 202, moving information, non-moving information, or a command, which will be described later, to the external device 10.
  • the transmitter 204 transmits non-movement information, movement information, commands, etc. to the external device 10 by wireless communication such as Bluetooth (registered trademark) and wireless LAN, or by wired communication.
  • the 6-axis sensor 206 is a 3-axis acceleration sensor and a 3-axis angular velocity sensor. Further, each of these sensors may be provided separately, or one of the sensors may be provided.
  • the 6-axis sensor 206 outputs the detected sensor signal to the processing unit 202.
  • the power supply unit 208 supplies power to the processing unit 202, the transmission unit 204, the 6-axis sensor 206, and the like.
  • The processing unit 202 includes a processor; it processes the sensor signal obtained from the 6-axis sensor 206 and, as necessary, the electro-oculography signals obtained from the bioelectrodes 32, 34, and 36, and may, for example, packetize these signals and output the packet to the transmitting unit 204. Further, the processing unit 202 may simply amplify the sensor signal obtained from the 6-axis sensor 206.
  • The processing unit 202 may convert the sensor signal from the 6-axis sensor 206 into movement speed data representing information regarding head movement.
  • the information related to the movement of the head is, for example, information related to the movement of the head up and down (back and forth) and to the left and right.
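The conversion from head movement to pointer movement can be sketched as integrating angular velocity over one sample period. The axis mapping (yaw to horizontal, pitch to vertical) and the gain value are assumptions for this sketch; the patent only states that up/down and left/right head movement is used.

```python
# Illustrative conversion from head angular velocity to pointer deltas.
# Axis mapping and gain are assumptions, not values from the patent.

def angular_velocity_to_delta(yaw_dps, pitch_dps, dt_s, gain=10.0):
    """Integrate angular velocity (degrees/second) over one sampling
    period dt_s (seconds) into pointer deltas in pixels."""
    dx = yaw_dps * dt_s * gain      # left/right head turn -> horizontal
    dy = pitch_dps * dt_s * gain    # up/down head nod -> vertical
    return (round(dx), round(dy))
```

A slow drift below one pixel per sample rounds to `(0, 0)`, which in practice acts as a small dead zone.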
  • An example in which the processing unit 202 includes a processor and executes processing corresponding to an input device will be described below.
  • FIG. 4 is a block diagram showing an example of the configuration of the processing unit 202 in the embodiment.
  • the processing unit 202 includes an acquisition unit 222, a mode control unit 224, and an output unit 228.
  • the acquisition unit 222 sequentially acquires data regarding the movement speed of a predetermined region at multiple points in time from a sensor capable of measuring the motion of the predetermined region of the user.
  • For example, the acquisition unit 222 sequentially acquires angular velocity data transmitted from an angular velocity sensor that samples at a predetermined sampling rate.
  • Obtaining data from a sensor includes obtaining data directly or indirectly.
  • Although the predetermined part in the embodiment is the head, it may also be an arm, a leg, or the like.
  • the sensor is a sensor capable of measuring the moving speed of a predetermined part, and in the embodiment, an angular velocity sensor or an acceleration sensor is suitable, and may be attached to a predetermined part of the user.
  • the data regarding the moving speed is data on an angular velocity vector including direction.
  • the mode control unit 224 controls state transition between a movement mode in which movement of the pointer of the external device 10 is controlled and a non-movement mode in which the pointer is not moved, based on data regarding movement speed.
  • the mode control unit 224 includes a determination unit 226 that performs various determinations.
  • The determination unit 226 of the mode control unit 224 determines whether the movement of the predetermined part of the user is a mode-switching movement, based on the sequentially acquired data regarding the movement speed.
  • the mode switching operation may be one or more operations of a predetermined part set in advance, or may be an operation set by the user.
  • When the determination unit 226 determines that the motion of the user's predetermined part is a mode-switching motion, the mode control unit 224 switches from the non-movement mode to the movement mode. For example, if the determination unit 226 determines that the movement direction and/or amount of movement of the predetermined part indicated by the sequentially acquired movement speed data corresponds to one of one or more mode-switching operations, the mode control unit 224 switches from the non-movement mode to the movement mode.
  • For the determination of the mode-switching operation, for example, it may be determined whether the operation involves movement exceeding a threshold value in a predetermined direction within a predetermined time, or whether a combination of a plurality of arbitrary operations has occurred.
  • The threshold value may be a predetermined value capable of detecting sudden movement. For example, if the predetermined part is the head, a threshold is set that is exceeded when the head is quickly shaken horizontally or vertically; if the predetermined part is an arm or a leg, a threshold is set that is exceeded when that part is moved quickly. Note that the threshold for each direction may be set differently for each movement.
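The per-direction threshold test above amounts to asking whether, within the predetermined time window, the speed in a given direction exceeded that direction's threshold. A minimal sketch, assuming samples are pre-labeled with a direction (the labeling and threshold values are assumptions):

```python
# Sketch of the "sudden movement" threshold test. Samples are assumed to
# be (direction, speed) pairs collected within the predetermined window.

def exceeds_threshold(samples, direction, threshold):
    """True when any sample moving in `direction` within the window meets
    or exceeds that direction's threshold."""
    return any(d == direction and s >= threshold for d, s in samples)
```

Per-direction thresholds then reduce to calling this with a different `threshold` value for each direction, as the text allows.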
  • the output unit 228 outputs the data processed by the processing unit 202 when the information processing device 20 is powered on.
  • the output data is transmitted from the transmitter 204 to the external device 10 via a network or Bluetooth (registered trademark).
  • the output unit 228 outputs non-movement information that does not move the pointer on the external device 10 side.
  • The non-movement information is information, such as (0, 0), indicating that the amount of movement in the X and Y coordinates is 0, so that the pointer does not move regardless of the sequentially acquired data regarding the movement speed.
  • When the current state is the movement mode, the output unit 228 outputs movement information regarding the pointer based on the sequentially acquired data regarding the movement speed. For example, the output unit 228 outputs information such as the amounts of movement (X1, Y1) of the X and Y coordinates in order to move the pointer.
  • the mode control unit 224 switches from the movement mode to the non-movement mode.
  • the predetermined conditions include, for example, conditions regarding the movement amount and motion indicated by the data regarding the movement speed.
  • Through this, when the sensor measures the movement of a predetermined part and an input operation is performed, unintended pointer movement can be prevented, the operation the user intends can be realized more appropriately, and usability can be improved. Furthermore, by introducing a non-movement mode in which the pointer is unrelated to the movement of the predetermined part, the pointer can be prevented from moving in response to that movement when the user does not intend to perform an operation. Further, the external device 10 only needs to move the pointer according to the non-movement information, movement information, commands, and the like transmitted from the information processing device 20, in the same way as when a mouse or other pointing device is operated, so the technique can be applied to an existing external device 10 without adding a special program on the external device side. As a result, the limited processing resources of the external device 10 can be used effectively.
  • The mode control unit 224 may set the non-movement mode as the starting mode. This prevents the pointer from moving in response to the movement of the predetermined part immediately after the information processing device 20 is powered on, and gives the user an operational feel in which the pointer moves only when the user wants it to, as with a mouse.
  • The processing of the mode control unit 224 may include switching from the non-movement mode to a first state of the movement mode when the movement of the user's predetermined part is a first mode-switching action. For example, if it is determined, based on the movement direction and movement amount indicated by the sequentially acquired movement speed data, that the movement of the predetermined part is the first mode-switching movement, the mode control unit 224 switches from the non-movement mode to the first state among a plurality of states of the movement mode.
  • For example, the determination unit 226 determines, based on the sequentially acquired data regarding the movement speed, whether there is a movement in which the amount of movement in the left direction is equal to or greater than a threshold and then, within a predetermined time, the amount of movement in the right direction is equal to or greater than a threshold. If the determination unit 226 detects this movement, it determines that the series of actions of the user's predetermined part is the first mode-switching action, in which processing proceeds from pointer movement to command issuance.
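The left-then-right check just described can be sketched directly: a left movement at or above the threshold followed, within the predetermined time, by a right movement at or above the threshold. The 60 deg/s threshold and 0.5 s window are illustrative assumptions.

```python
# Sketch of the first mode-switching gesture: left >= threshold, then
# right >= threshold within the window. Threshold/window values are
# assumptions, not values from the patent.

def is_first_switch_gesture(samples, threshold=60.0, window_s=0.5):
    """samples: chronological list of (t_seconds, direction, speed)."""
    left_times = [t for t, d, s in samples
                  if d == "left" and s >= threshold]
    for t0 in left_times:
        for t, d, s in samples:
            if d == "right" and s >= threshold and 0 < t - t0 <= window_s:
                return True
    return False
```

The mirrored second gesture (right then left, described later) would swap the two direction labels.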
  • The processing of the output unit 228 may include outputting a command corresponding to the first mode-switching operation after the pointer stops moving, following the output of the movement information.
  • For example, the first state is a click-wait state, and the command corresponding to the first mode-switching operation is a left click.
  • With this, the user does not need to perform both the mode-switching action and a separate command-determination action; for the frequently used left click, the series of actions from pointer movement to left click can be performed as a continuation of the mode-switching action. Furthermore, by reducing the command-determination processing, the processing load on the information processing device 20 can be reduced. Note that the command corresponding to the first mode-switching operation is not limited to the left click and may be changed to a command frequently used by the user.
  • The processing of the mode control unit 224 may include switching from the non-movement mode to a second state of the movement mode when the movement of the predetermined part is a second mode-switching operation. For example, if it is determined, based on the movement direction and movement amount indicated by the sequentially acquired movement speed data, that the movement of the predetermined part is the second mode-switching movement, the mode control unit 224 switches from the non-movement mode to the second state among the plurality of states of the movement mode.
  • For example, the determination unit 226 determines, based on the sequentially acquired data regarding the movement speed, whether there is a movement in which the amount of movement in the right direction is equal to or greater than a threshold and then, within a predetermined time, the amount of movement in the left direction is equal to or greater than a threshold. If the determination unit 226 detects this movement, it determines that the movement of the user's predetermined part is the second mode-switching movement, in which a determination process for deciding a command is performed after the pointer is moved.
  • In the second state, the determination unit 226 determines whether the movement of the predetermined part is a command movement, based on the data regarding the movement speed after the pointer stops moving. For example, if a plurality of command operations are set, the determination unit 226 determines whether the movement corresponds to one of the plurality of command operations based on the sequentially acquired movement speed data.
  • The output unit 228 may output a command corresponding to the command operation when the determination unit 226 determines that the operation is a command operation. For example, if the command action is a momentary movement to the left, a left-click command is output; if it is a momentary movement to the right, a right-click command is output; if it is a momentary upward movement, a double-click command is output; and if it is a momentary downward movement, a command to start dragging is output.
  • the predetermined condition for transitioning from the moving mode to the non-moving mode may include that the output unit 228 outputs a command.
  • the mode control unit 224 detects that a command has been output from the output unit 228, it controls the mode to transition from the moving mode to the non-moving mode.
  • The processing of the output unit 228 may include outputting movement information that causes the pointer to make a movement corresponding to the mode-switching operation. For example, after the mode is switched, the output unit 228 may output, for a predetermined time, pointer movement information corresponding to the preset mode-switching operation instead of movement information based on the movement speed data, and then output movement information based on the movement speed data.
  • This allows the user to confirm, by watching the movement of the pointer, whether the movement of the predetermined part was recognized as intended.
  • the above-mentioned movement information may include information indicating a predetermined movement regardless of the movement of a predetermined part of the user.
  • For example, the movement information may be information indicating that the pointer circles three times.
  • the user can check whether his or her complex movements have been input correctly by understanding the simple movements of the pointer.
  • In the non-movement mode, the determination unit 226 may determine, based on the sequentially acquired data regarding the movement speed, whether the movement of the predetermined part of the user is a command movement that does not involve movement of the pointer.
  • the output unit 228 may output a command corresponding to the command movement.
  • With this, the user can cancel the non-movement mode by directly performing a command operation while in the non-movement mode, and commands that do not require pointer movement (such as scrolling) can be output directly.
  • FIG. 5 is a diagram showing an example of data regarding the moving speed in the example.
  • the data shown in FIG. 5 is stored in a memory within the information processing device 20, for example. Note that data regarding the movement speed of the line of sight based on the electro-oculogram signal may also be stored as the data regarding the movement speed.
  • FIG. 6 is a diagram showing an example of the relationship between data, movement, and output in the embodiment.
  • Immediately after pairing, the information processing device 20 is controlled to be in the non-movement mode. That is, after pairing, even though the predetermined part of the user moves, the mode of the processing unit 202 is the non-movement mode, so non-movement information is output. Thereby, the external device 10 acquires the non-movement information and does not move the pointer.
  • Next, the determination unit 226 determines that the movement is the first mode-switching motion when the sequentially acquired data regarding the movement speed indicates a left-and-right movement.
  • The mode control unit 224 then causes a transition from the non-movement mode to the first state of the movement mode.
  • At this time, the output unit 228 may output non-movement information, or, if there is a time lag between data acquisition and output, it may output movement information for the series of left-and-right movements. Further, the output unit 228 may output motion information indicating a command.
  • The user then moves the predetermined part until 2.65 seconds, at which point the pointer reaches the target position the user is aiming for.
  • the sensor detects this movement, data regarding the movement speed is acquired by the acquisition unit 222, and movement information based on this data is output by the output unit 228.
  • After that, the sequentially acquired data regarding the movement speed indicate that the movement direction and movement amount are approximately 0.
  • Accordingly, the determination unit 226 determines that the pointer has stopped, and the output unit 228 outputs a left-click command corresponding to the first mode-switching operation.
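The stop determination used in this example (movement direction and amount approximately 0) can be sketched as a dead-zone check over recent movement amounts. The dead-zone size and the number of consecutive samples required are assumptions for illustration.

```python
# Sketch of the stop test preceding command issuance in the first state.
# eps (dead-zone size) and n (consecutive samples) are assumed values.

def is_stopped(deltas, eps=1, n=3):
    """deltas: chronological list of (dx, dy) movement amounts. True when
    the last n samples are all within +/- eps of zero."""
    if len(deltas) < n:
        return False
    return all(abs(dx) <= eps and abs(dy) <= eps for dx, dy in deltas[-n:])
```

Requiring several consecutive near-zero samples avoids issuing the click on a single quiet sample in the middle of a movement.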
  • FIG. 7 is a diagram showing an example of the relationship between the movement of a predetermined part, the mode, and the command in the embodiment.
  • The commands issued for each subsequent movement of the predetermined part are as follows. Note that after a command is issued, the state returns to the non-movement mode.
  • From the non-movement mode, it is possible to transition to a mode in which the pointer moves (pointer movement), a mode in which a command is issued directly, and the like.
  • For example, as with a mouse wheel, a command can be issued without moving the pointer.
  • When a direct command operation corresponding to a scroll command is detected, the command is issued directly without moving the pointer.
  • In the stop mode, the pointer does not move and no command is issued unless an action is taken to cancel the stop mode.
  • In other words, a mode is provided that does not accept any motion other than the motion that cancels the stop mode. This makes it possible to issue commands based on the actions the user intends more appropriately.
  • FIG. 8 is a diagram illustrating an example of the relationship between movements, commands, and pointer movements in the embodiment.
  • the movement of the pointer refers to the movement of the pointer that is set according to the movement of a predetermined part or a command after the mode is switched.
  • The relationships are as follows (movement of the predetermined part : command : pointer movement):
    • Left, then right : Move : Left, then right
    • Right, then left : Move : Right, then left
    • Left, then left : Stop mode : Circle three times
    • Up and down three times (back and forth) : Setting mode : Circle two times
    • Up, then down : Scroll up one level : Up, then down
    • Down, then up : Scroll down one level : Down, then up
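The rows above can be held as a lookup from recognized gesture to its command and the confirmation movement shown by the pointer. The tuple keys and the command/movement labels are illustrative encodings of the table, not identifiers from the patent.

```python
# The gesture table as a lookup: (first phase, second phase) ->
# (command, confirmation pointer movement). Labels are illustrative.

GESTURE_TABLE = {
    ("left", "right"): ("move", "left-right"),
    ("right", "left"): ("move", "right-left"),
    ("left", "left"): ("stop_mode", "circle x3"),
    ("updown", "x3"): ("setting_mode", "circle x2"),
    ("up", "down"): ("scroll_up", "up-down"),
    ("down", "up"): ("scroll_down", "down-up"),
}
```

A table-driven design makes it straightforward to let the user remap gestures, as the text elsewhere suggests for the first mode-switching command.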
  • For the pointer movement, the output unit 228 may output movement information in which the pointer moves faster or more widely than the movement indicated by the actual data, for example. Since the movement indicated by the data and the movement of the pointer correspond, the user can intuitively understand that the movement has been appropriately recognized.
  • the pointer makes a predetermined movement regardless of the movement indicated by the data.
  • Since the pointer makes a simple movement that the predetermined part does not normally perform, the user can understand whether the motion of the predetermined part has been appropriately recognized.
  • FIGS. 9 and 10 are flowcharts illustrating an example of processing related to command issuance in the embodiment.
  • the processes shown in FIGS. 9 and 10 are examples of processes executed by the information processing device 20.
  • In step S102 shown in FIG. 9, the acquisition unit 222 sequentially acquires data regarding the movement speed of a predetermined part of the user from a sensor capable of measuring the motion of that part.
  • the sensor is, for example, an angular velocity sensor attached to the eyewear 30, the predetermined part is, for example, the user's head, and the data regarding the moving speed is, for example, angular velocity data.
  • In step S104, the mode control unit 224 determines whether the current state is the non-movement mode or the movement mode. For example, the mode control unit 224 sets the non-movement mode as the default, and sets the movement mode after a mode-switching operation is detected until a predetermined condition is satisfied.
  • If the current state is the non-movement mode (step S104-YES), the process proceeds to step S106; if the current state is the movement mode (step S104-NO), the process proceeds to step S122 shown in FIG. 10.
  • In step S106, the determination unit 226 determines whether the sequentially acquired data regarding the moving speed indicates one of the one or more preset mode-switching operations. If the data indicates a mode-switching operation (step S106-YES), the process proceeds to step S114; if not (step S106-NO), the process proceeds to step S108.
  • In step S108, the determination unit 226 determines whether the data regarding the movement speed acquired in step S102 indicates a command operation that does not involve pointer movement (a direct command operation). If the data indicates a direct command operation (step S108-YES), the process proceeds to step S110; if not (step S108-NO), the process proceeds to step S112.
  • In step S110, the output unit 228 outputs a command corresponding to the direct command operation.
  • step S112 the output unit 228 outputs non-movement information that does not move the pointer of the external device 10, for example, information indicating that the amount of movement is (0,0).
  • step S114 the mode control unit 224 switches from the non-moving mode to the moving mode.
  • step S116 the determination unit 226 determines whether the data regarding the moving speed acquired in step S106 indicates a first mode switching operation. If the data indicates a first mode switching operation (step S116-YES), the process proceeds to step S118, and if the data does not indicate a first mode switching operation (step S118-NO), the process proceeds to step S120. In other words, if it is determined that the operation is the first mode switching operation, the process proceeds to step S118, and if it is determined that the operation is the second mode switching operation, the process proceeds to step S120.
  • In step S118, the mode control unit 224 sets the first state, in which a command is issued after the pointer stops.
  • In step S120, the mode control unit 224 sets the second state, in which it waits for an operation corresponding to a predetermined command.
  • In step S122 shown in FIG. 10, that is, in the movement mode, the determination unit 226 determines whether the sequentially acquired data regarding the movement speed indicates a motion for stopping the pointer. If the data indicates a stopping motion (step S122-YES), the process proceeds to step S126; if not (step S122-NO), the process proceeds to step S124.
  • In step S124, the output unit 228 outputs movement information indicating the amount of movement of the pointer based on the sequentially acquired data regarding the movement speed.
  • In step S126, the mode control unit 224 determines whether the current state is the first state. If the current state is the first state (step S126-YES), the process proceeds to step S128; if the current state is the second state (step S126-NO), the process proceeds to step S130.
  • In step S128, the output unit 228 outputs a command corresponding to the command operation associated with the first mode switching operation. After outputting the command, the mode control unit 224 switches from the movement mode to the non-movement mode.
  • In step S130, the output unit 228 outputs a command corresponding to a predetermined command operation performed after the pointer stops. After outputting the command, the mode control unit 224 switches from the movement mode to the non-movement mode.
  • Note that the output unit 228 may output pointer movement information in response to the mode switching operation or the command operation. Through this, the user can confirm whether the movement of the predetermined part has been correctly recognized by the information processing device 20.
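The flow of steps S104 through S130 amounts to a small two-mode state machine. The following Python sketch is purely illustrative and is not the embodiment's implementation; the class names, the `classify` callback, and the output tuples are assumptions introduced only to make the control flow concrete.

```python
from enum import Enum, auto

class Mode(Enum):
    NON_MOVEMENT = auto()   # pointer held still (default)
    MOVEMENT = auto()       # pointer follows the measured motion

class State(Enum):
    NONE = auto()
    FIRST = auto()          # issue a command after the pointer stops (S118)
    SECOND = auto()         # wait for a motion matching a predetermined command (S120)

class PointerController:
    """Illustrative state machine for steps S104-S130; all names are assumed."""

    def __init__(self, classify):
        # classify(sample) -> "switch1", "switch2", "direct", "stop", or None
        self.classify = classify
        self.mode = Mode.NON_MOVEMENT       # non-movement mode as the default (S104)
        self.state = State.NONE

    def on_sample(self, sample):
        """Process one movement-speed sample and return the resulting output."""
        kind = self.classify(sample)
        if self.mode is Mode.NON_MOVEMENT:
            if kind in ("switch1", "switch2"):      # S106: mode switching motion
                self.mode = Mode.MOVEMENT           # S114
                self.state = State.FIRST if kind == "switch1" else State.SECOND  # S116-S120
                return ("moved", 0, 0)
            if kind == "direct":                    # S108: direct command motion
                return ("command", "direct")        # S110
            return ("moved", 0, 0)                  # S112: non-movement info (0, 0)
        if kind == "stop":                          # S122: pointer stopping motion
            issued = "first" if self.state is State.FIRST else "second"  # S126-S130
            self.mode = Mode.NON_MOVEMENT           # back to the non-movement mode
            self.state = State.NONE
            return ("command", issued)
        dx, dy = sample                             # S124: movement info from the data
        return ("moved", dx, dy)
```

A real implementation would derive `classify` from the threshold determinations described elsewhere in this document; here it is left as a pluggable callback.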
  • As described above, when an input operation is performed by measuring the motion of a predetermined part with a sensor, it is possible to more appropriately realize the operation intended by the user and improve usability. Further, according to the embodiment, when an input operation is performed using the movement of a predetermined part, it becomes possible to provide an interface that correctly recognizes the movement intended by the user when the user wants to move the pointer or perform a predetermined operation, as with an input device such as a mouse.
  • In the above embodiment, movements of the user's head, arms, legs, and the like were given as examples of movements of the predetermined part, but the movement of the line of sight of the eyeballs may also be used.
  • In this case, the movement of the eyeball may be determined from the sensor signal detected by each bioelectrode (an example of a sensor), and the processing unit 202 may control the processing based on data regarding this movement.
  • Further, the 6-axis sensor may be attached not only to the head but also to any position on the human body.
  • In addition, data from the 3-axis acceleration sensor may be used as the movement speed data; in this case, after the acceleration is detected, it is converted into velocity by predetermined calculation processing, and the converted velocity data may be used.
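As a rough sketch of the acceleration-to-velocity conversion mentioned above, the detected acceleration can be numerically integrated over the sampling interval. This is only one possible "predetermined calculation processing" (simple Euler integration, assumed here for illustration); a practical implementation would also need drift compensation and filtering.

```python
def accel_to_velocity(accel_samples, dt, v0=(0.0, 0.0, 0.0)):
    """Integrate 3-axis acceleration samples [m/s^2] into velocity [m/s].

    accel_samples: iterable of (ax, ay, az) taken at a fixed interval dt [s].
    v0: initial velocity. Returns the velocity vector after each sample.
    """
    vx, vy, vz = v0
    velocities = []
    for ax, ay, az in accel_samples:
        # v(t + dt) ~= v(t) + a(t) * dt  (simple Euler integration)
        vx += ax * dt
        vy += ay * dt
        vz += az * dt
        velocities.append((vx, vy, vz))
    return velocities
```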
  • In the embodiment, the information processing device 20 on the eyewear 30 side functions as an interface that replaces a mouse or the like, and outputs operation commands such as left click, right click, and scroll to the external device 10.
  • Alternatively, the information processing device 20 may amplify or packetize the sequentially acquired data regarding the movement speed and transmit the data to the external device 10.
  • In this case, the external device 10 may have the determination function described above and may output the determined command to an OS (Operating System) or the like to control the movement and operation of the pointer on the screen.
  • The processing described above in the embodiment may also be realized by an information processing device in which the information processing device 20 and part of the configuration of the external device 10 are integrally formed.
  • For example, a head-mounted display may include the processing unit 202 of the information processing device 20 and the display 151 of the external device 10, a pointer corresponding to the movement of the user's head may be superimposed on an image displayed on the display 151, and input operations such as those described above may be performed.
  • As another example, an image sensor 160 may be used as the sensor capable of measuring the motion of a predetermined part of the user.
  • In this case, the external device 10 serves as the information processing device described above, and the CPU 110 includes the processing unit 202.
  • For example, the motion of a predetermined part (such as the head) of the user is captured using the live view mode of the image sensor 160, and the processing unit 202 acquires data regarding the movement of the predetermined part recognized by image recognition.
  • The subsequent processing is similar to that of the embodiment. Thereby, by introducing the above-described non-movement mode on a personal computer having a camera, it is possible to more appropriately realize the operation intended by the user and improve usability.
  • 10 Information processing device (external device)
  • 20 Information processing device
  • 30 Eyewear
  • 110 CPU
  • 112 Acquisition unit
  • 113 Processing control unit
  • 120 Network communication interface
  • 130 Memory
  • 150 User interface
  • 160 Image sensor
  • 202 Processing unit
  • 204 Transmission unit
  • 206 6-axis sensor
  • 208 Power supply unit
  • 222 Acquisition unit
  • 224 Mode control unit
  • 226 Determination unit
  • 228 Output unit

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • User Interface Of Digital Computer (AREA)
  • Position Input By Displaying (AREA)

Abstract

This information processing device performs: sequential acquisition, from a sensor that is capable of measuring an action of a prescribed part of a user, of data relating to the action of the prescribed part at a plurality of time points; determination, based on the data, of whether or not the action of the prescribed part is a mode switching action, when in a non-movement mode; switching from the non-movement mode to a movement mode when it is determined that the action of the prescribed part is the mode switching action; outputting movement information relating to a pointer on the basis of the data when in the movement mode; and switching from the movement mode to the non-movement mode when a prescribed condition has been satisfied after the output of the movement information.

Description

Information processing method, program, and information processing device
 The present invention relates to an information processing method, a program, and an information processing device.
 Conventionally, there are known techniques for controlling input operations, such as click operations on an input device like a mouse, using a sensor that detects the movement of a user's head (see, for example, Patent Documents 1 and 2).
Patent Document 1: Japanese Patent Application Publication No. 4-309996
Patent Document 2: Japanese Patent Application Publication No. 9-258887
 However, in the conventional technology, when input operations are performed using head movements, there are usability problems: the pointer is constantly moving, and unintended operations occur due to natural head movements.
 Therefore, the disclosed technology aims to improve usability by making it possible to more appropriately realize the operation intended by the user when an input operation is performed using the movement of a predetermined part where a sensor is provided.
 An information processing method in one aspect of the disclosed technology includes an information processing device executing the following: sequentially acquiring, from a sensor capable of measuring the movement of a predetermined part of a user, data regarding the movement of the predetermined part at a plurality of points in time; when in a non-movement mode, determining whether the movement of the predetermined part is a mode switching movement based on the data; when it is determined that the movement of the predetermined part is a mode switching movement, switching from the non-movement mode to a movement mode; when in the movement mode, outputting movement information regarding a pointer based on the data; and when a predetermined condition is satisfied after the movement information is output, switching from the movement mode to the non-movement mode.
 According to the disclosed technology, when an input operation is performed by measuring the motion of a predetermined part with a sensor, it is possible to more appropriately realize the operation intended by the user and improve usability.
FIG. 1 is a diagram showing an example of an information processing system in an embodiment.
FIG. 2 is a block diagram showing an example of the configuration of an external information processing device in the embodiment.
FIG. 3 is a block diagram showing an example of the configuration of an information processing device in the embodiment.
FIG. 4 is a block diagram showing an example of the configuration of a processing unit in the embodiment.
FIG. 5 is a diagram showing an example of data regarding movement speed in the embodiment.
FIG. 6 is a diagram showing an example of the relationship between data, movement, and output in the embodiment.
FIG. 7 is a diagram showing an example of the relationship between the movement of a predetermined part, modes, and commands in the embodiment.
FIG. 8 is a diagram showing an example of the relationship between movements, commands, and pointer movements in the embodiment.
FIG. 9 is a flowchart showing an example (part 1) of processing related to command issuance in the embodiment.
FIG. 10 is a flowchart showing an example (part 2) of processing related to command issuance in the embodiment.
 Embodiments of the present invention will be described below with reference to the drawings. However, the embodiments described below are merely examples, and there is no intention to exclude various modifications and techniques not specified below. That is, the present invention can be implemented with various modifications without departing from its spirit. In the description of the drawings below, the same or similar parts are denoted by the same or similar reference symbols. The drawings are schematic and do not necessarily correspond to actual dimensions or proportions. The drawings may also include portions whose dimensional relationships and ratios differ from one another.
[Example]
 In the embodiment, eyewear is taken as an example of a wearable terminal equipped with an acceleration sensor and/or an angular velocity sensor and, if necessary, bioelectrodes, but the present invention is not limited thereto. FIG. 1 is a diagram showing an example of an information processing system 1 in the embodiment. The information processing system 1 shown in FIG. 1 includes an external information processing device (hereinafter also referred to as an "external device") 10 and eyewear 30; the external device 10 and the eyewear 30 are connected via a network and are capable of data communication.
 The eyewear 30 has the information processing device 20 mounted on, for example, its bridge portion. The information processing device 20 includes a pair of nose pads and a bridge portion, which may have bioelectrodes 32, 34, and 36, respectively. The information processing device 20 includes a 3-axis acceleration sensor and/or a 3-axis angular velocity sensor (a 6-axis sensor may be used). Note that the bioelectrodes 32, 34, and 36 are not necessarily required.
 The information processing device 20 detects sensor signals, electro-oculography signals, and the like and transmits them to the external device 10. The installation position of the information processing device 20 does not necessarily have to be the bridge portion; it suffices that the device is positioned where sensor signals and electro-oculography signals can be acquired while the eyewear 30 is worn. Further, the information processing device 20 may be removably attached to the bridge portion.
 The external device 10 is an information processing device that has a communication function. For example, the external device 10 is a server device, a personal computer, a tablet terminal, or a mobile terminal such as a smartphone. The external device 10 acquires data regarding the movement of the user's head received from the information processing device 20 and executes various operation processes based on this data. The operation processes include, for example, pointer (cursor) movement, clicking, dragging, and scrolling. The external device 10 may also receive from the information processing device 20, according to the movement of the head, commands instructing operation processes and data indicating the amount of pointer movement.
<Configuration of external device 10>
 FIG. 2 is a block diagram showing an example of the configuration of the external device 10 in the embodiment. The external device 10 includes one or more processing units (CPUs) 110, one or more network communication interfaces 120, a memory 130, a user interface 150, an image sensor 160, and one or more communication buses 170 for interconnecting these components.
 The network communication interface 120 is connected to a network through a mobile communication antenna or a wireless LAN antenna and is capable of data communication with the information processing device 20.
 The user interface 150 may include a display device and an input device (such as a keyboard and/or mouse, or some other pointing device). The user interface 150 can be used to move a pointer displayed on the display device. The image sensor 160 receives an optical signal, converts it into an electrical signal, and generates an image. The image includes at least one of a still image and a moving image.
 The memory 130 is, for example, a high-speed random access memory such as DRAM, SRAM, or another random access solid-state storage device; it may also be a non-volatile memory such as one or more magnetic disk storage devices, optical disk storage devices, flash memory devices, or other non-volatile solid-state storage devices, and may be a computer-readable non-transitory recording medium.
 The memory 130 stores data used by the information processing system 1. For example, the memory 130 stores data transmitted from the eyewear 30 (information processing device 20).
 As another example, the memory 130 may be one or more storage devices installed remotely from the CPU 110. In some embodiments, the memory 130 stores programs, modules, and data structures executed by the CPU 110, or a subset thereof.
 The CPU 110 configures an acquisition unit 112 and a processing control unit 113 by executing a program stored in the memory 130.
 The acquisition unit 112 acquires commands regarding operations, data indicating the amount of pointer movement, and the like from the information processing device 20 of the eyewear 30.
 The processing control unit 113 executes the commands regarding operations acquired by the acquisition unit 112 and controls the movement of the pointer using the amount of pointer movement. For example, the processing control unit 113 controls operations such as clicking and dragging using commands and data acquired from the eyewear 30, which has functions similar to those of an input device such as a pointing device (e.g., a mouse).
 Note that the acquisition unit 112 of the CPU 110 may sequentially acquire data regarding the movement speed of a predetermined part of the user from a sensor capable of measuring the movement of the predetermined part (for example, a sensor attached to the predetermined part of the user). In this case, the processing control unit 113 may execute the mode control processing, determination processing, and output processing of the information processing device 20 described later, and may control operations such as clicking and dragging. In the processing of the processing control unit 113, in addition to mouse operations such as clicking, dragging, and scrolling, operations such as function keys, a numeric keypad, and setting changes may also be associated with movements of the user's head.
<Configuration of information processing device 20>
 FIG. 3 is a block diagram showing an example of the configuration of the information processing device 20 in the embodiment. As shown in FIG. 3, the information processing device 20 includes a processing unit 202, a transmission unit 204, a 6-axis sensor 206, a power supply unit 208, and the bioelectrodes 32, 34, and 36. Each of the bioelectrodes 32, 34, and 36 is connected to the processing unit 202 by an electric wire, for example via an amplifier. Note that the bioelectrodes 32, 34, and 36 are not essential components. The information processing device 20 also includes a memory that stores processing data. This memory may be a computer-readable non-transitory recording medium.
 The transmission unit 204 transmits, for example, the data regarding the movement speed packetized by the processing unit 202, as well as movement information, non-movement information, or commands described later, to the external device 10. For example, the transmission unit 204 transmits the non-movement information, movement information, commands, and the like to the external device 10 by wireless communication such as Bluetooth (registered trademark) or wireless LAN, or by wired communication.
 The 6-axis sensor 206 comprises a 3-axis acceleration sensor and a 3-axis angular velocity sensor. These sensors may be provided separately, or only one of them may be provided. The 6-axis sensor 206 outputs the detected sensor signal to the processing unit 202. The power supply unit 208 supplies power to the processing unit 202, the transmission unit 204, the 6-axis sensor 206, and the like.
 The processing unit 202 includes a processor and processes the sensor signal obtained from the 6-axis sensor 206 and, as necessary, the electro-oculography signals obtained from the bioelectrodes 32, 34, and 36; for example, it may packetize the sensor signal and the electro-oculography signals and output the packets to the transmission unit 204. Alternatively, the processing unit 202 may simply amplify the sensor signal obtained from the 6-axis sensor 206.
 The processing unit 202 may also generate movement speed data from the sensor signal of the 6-axis sensor 206 based on information regarding head movement. The information regarding head movement is, for example, information regarding up-down (forward-backward) and left-right movements of the head. An example in which the processing unit 202 has a processor and executes processing corresponding to an input device is described below.
 FIG. 4 is a block diagram showing an example of the configuration of the processing unit 202 in the embodiment. In the example shown in FIG. 4, the processing unit 202 includes an acquisition unit 222, a mode control unit 224, and an output unit 228.
 The acquisition unit 222 sequentially acquires, from a sensor capable of measuring the movement of a predetermined part of the user, data regarding the movement speed of the predetermined part at a plurality of points in time. For example, the acquisition unit 222 sequentially acquires angular velocity data transmitted from an angular velocity sensor that samples at a predetermined sampling rate. Acquiring data from a sensor includes acquiring the data directly or indirectly. The predetermined part in the embodiment is the head, but it may also be an arm, a leg, or the like. The sensor is one capable of measuring the movement speed of the predetermined part; in the embodiment, an angular velocity sensor or an acceleration sensor is suitable, and the sensor may be attached to the predetermined part of the user. For example, when the sensor is a 3-axis angular velocity sensor, the data regarding the movement speed is angular velocity vector data including direction.
 The mode control unit 224 controls, based on the data regarding the movement speed, the state transition between a movement mode in which the movement of the pointer of the external device 10 is controlled and a non-movement mode in which the pointer is not moved. The mode control unit 224 includes a determination unit 226 that performs various determinations.
 For example, when the current state is the non-movement mode, the determination unit 226 of the mode control unit 224 determines, based on the sequentially acquired data regarding the movement speed, whether the movement of the predetermined part of the user is a mode switching movement. As a specific example, the mode switching movement is one or more preset movements of the predetermined part, and may be a movement set by the user.
 When the determination unit 226 determines that the movement of the predetermined part of the user is a mode switching movement, the mode control unit 224 switches from the non-movement mode to the movement mode. For example, when the determination unit 226 determines that the movement direction and/or movement amount of the predetermined part indicated by the sequentially acquired data regarding the movement speed corresponds to one of the one or more mode switching movements, the mode control unit 224 switches from the non-movement mode to the movement mode. For the determination of the mode switching movement, it may be determined, for example, whether the movement exceeds a threshold in a predetermined direction within a predetermined time, or whether a combination of a plurality of arbitrary movements has occurred.
 The threshold may be set in advance to a value that allows sudden movements to be detected. For example, when the predetermined part is the head, a threshold is set that is exceeded when the head is quickly shaken horizontally or vertically; when the predetermined part is an arm or a leg, a threshold is set that is exceeded when the arm or leg is moved quickly. The threshold for each direction may be set differently according to the respective movement.
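The per-direction threshold determination described above can be sketched as follows. This is illustrative only: the threshold values, the axis-to-direction mapping, and the sign conventions are all assumptions introduced for the example, not values disclosed in the embodiment.

```python
# Illustrative per-direction thresholds for an angular-velocity sample (deg/s).
# The values and the axis-to-direction mapping are assumptions.
THRESHOLDS = {"left": 120.0, "right": 120.0, "up": 150.0, "down": 150.0}

def detect_direction(wx, wy):
    """Classify one angular-velocity sample as a sudden directional move, or None.

    wx: yaw rate (positive = right), wy: pitch rate (positive = up).
    """
    if wx <= -THRESHOLDS["left"]:
        return "left"
    if wx >= THRESHOLDS["right"]:
        return "right"
    if wy >= THRESHOLDS["up"]:
        return "up"
    if wy <= -THRESHOLDS["down"]:
        return "down"
    return None  # ordinary slow motion: no switching candidate
```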
 The output unit 228 outputs the data processed by the processing unit 202, for example when the information processing device 20 is powered on. The output data is transmitted from the transmission unit 204 to the external device 10 via a network or Bluetooth (registered trademark).
 For example, in the non-movement mode, the output unit 228 outputs non-movement information that does not move the pointer on the external device 10 side. The non-movement information is, for example, information such as (0, 0), indicating that the movement amounts of the X and Y coordinates are 0 so that the pointer does not move, regardless of the sequentially acquired data regarding the movement speed.
 When the current state is the movement mode, the output unit 228 outputs movement information regarding the pointer based on the sequentially acquired data regarding the movement speed. For example, the output unit 228 outputs information such as movement amounts (X1, Y1) of the X and Y coordinates for moving the pointer, based on the sequentially acquired data regarding the movement speed.
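One way to derive the movement amounts (X1, Y1) from a movement-speed sample is to scale the angular velocity by the sampling interval and a sensitivity gain. This mapping is an assumption for illustration (the embodiment does not specify the conversion); the default `dt` and `gain` values are likewise invented.

```python
def pointer_delta(wx, wy, dt=0.01, gain=8.0):
    """Map one angular-velocity sample (deg/s) to a pointer delta in pixels.

    wx: yaw rate (positive = right), wy: pitch rate (positive = up);
    dt is the sampling interval [s], gain the pixels-per-degree sensitivity.
    """
    dx = round(wx * dt * gain)    # horizontal movement amount X1
    dy = round(-wy * dt * gain)   # vertical movement amount Y1 (screen Y grows downward)
    return dx, dy
```

In the non-movement mode the same interface would simply emit (0, 0) regardless of the sample, matching the non-movement information described above.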
 When the current mode is the movement mode and a predetermined condition is satisfied after the movement information is output, the mode control unit 224 switches from the movement mode to the non-movement mode. The predetermined condition includes, for example, conditions regarding the movement amount or the motion indicated by the data regarding the movement speed.
 Through the above processing, when an input operation is performed by measuring the movement of a predetermined part with a sensor, pointer movements not intended by the user are prevented, the operation intended by the user can be realized more appropriately, and usability can be improved. Further, by introducing a non-movement mode that is unaffected by the movement of the predetermined part when the user does not intend to move the pointer, it is possible to prevent the pointer from moving in accordance with the movement of the predetermined part when the user does not intend an operation. In addition, the external device 10 only needs to move the pointer according to the non-movement information, movement information, commands, and the like transmitted from the information processing device 20, just as when a mouse or pointing device is operated; therefore, the processing can be applied to an existing external device 10 without adding a special program on the external device 10 side. As a result, the limited processing resources of the external device 10 can be used effectively.
 また、モード制御部224は、移動速度に関するデータを用いてポインタ移動の制御が開始される場合、開始のモードとして非移動モードが設定されてもよい。これにより、情報処理装置20の電源をONにしてからすぐにポインタが所定部位の動きに合わせて移動することを防止し、マウス等のように、使いたい時にだけポインタが動くようになる操作感をユーザに提供することが可能になる。 Furthermore, when control of pointer movement is started using data related to movement speed, the mode control unit 224 may set a non-movement mode as the starting mode. This prevents the pointer from moving in accordance with the movement of a predetermined part immediately after the power of the information processing device 20 is turned on, and provides an operational feel that allows the pointer to move only when you want to use it, like with a mouse. can be provided to users.
 Furthermore, switching from the non-movement mode to the movement mode by the mode control unit 224 may include switching from the non-movement mode to a first state of the movement mode when the motion of the user's predetermined part is a first mode switching motion. For example, when the motion of the predetermined part is determined to be the first mode switching motion based on the movement direction and movement amount indicated by the sequentially acquired data regarding the movement speed, the mode control unit 224 switches from the non-movement mode to the first state among a plurality of states of the movement mode.
 For example, when the first mode switching motion is a left-right movement, the determination unit 226 determines, based on the sequentially acquired data regarding the movement speed, whether there is a movement with a leftward movement amount equal to or greater than a threshold and a rightward movement amount equal to or greater than a threshold within a predetermined time. If the determination unit 226 detects this movement, it determines that the series of motions of the user's predetermined part is the first mode switching motion, which performs the processing from pointer movement through command issuance.
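As a minimal sketch of the left-then-right detection described above (the function name, thresholds, and data format are illustrative assumptions, not part of the disclosure), the determination could look like the following:

```python
# Illustrative sketch: detect a leftward stroke of at least `threshold`
# displacement followed by a rightward stroke of at least `threshold`
# within `window` seconds, from sequentially acquired velocity samples.
# All names and values are assumptions for illustration.

def is_left_right_gesture(samples, threshold=5.0, window=0.85):
    """samples: list of (timestamp, vx) horizontal velocity readings.
    Returns True if the left-right mode switching motion is detected."""
    if not samples:
        return False
    start = samples[0][0]
    dt = 0.05                      # assumed 20 Hz sampling interval
    left = right = 0.0
    seen_left = False
    for t, vx in samples:
        if t - start > window:     # gesture must complete within the window
            break
        if vx < 0 and not seen_left:
            left += -vx * dt       # accumulate leftward displacement
            if left >= threshold:
                seen_left = True
        elif vx > 0 and seen_left:
            right += vx * dt       # rightward displacement after the left stroke
            if right >= threshold:
                return True
    return False
```

The same structure, with the stroke order reversed, would serve for the right-left second mode switching motion described below.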
 Furthermore, when the movement mode is in the first state, the output unit 228 may output the movement information and then, after the pointer stops moving, output a command corresponding to the first mode switching motion. For example, the first state is a click-waiting state, and the command corresponding to the first mode switching motion is a left click.
 Through the above processing, after the switching motion from the non-movement mode to the movement mode, a left-click command can be output simply by moving the pointer and stopping it at the target position. That is, the user does not need to perform both a mode switching motion and a command determination motion; for the frequently used left click, the series of motions from pointer movement to left click can be performed as a single sequence starting from the mode switching motion. Furthermore, by reducing the command determination processing, the processing load on the information processing device 20 can be reduced. Note that the command corresponding to the first mode switching motion is not limited to a left click, and may be changed to a command frequently used by the user.
 Furthermore, switching from the non-movement mode to the movement mode by the mode control unit 224 may include switching from the non-movement mode to a second state of the movement mode when the motion of the predetermined part is a second mode switching motion. For example, when the motion of the predetermined part is determined to be the second mode switching motion based on the movement direction and movement amount indicated by the sequentially acquired data regarding the movement speed, the mode control unit 224 switches from the non-movement mode to the second state among the plurality of states of the movement mode.
 For example, when the second mode switching motion is a right-left movement, the determination unit 226 determines, based on the sequentially acquired data regarding the movement speed, whether there is a movement with a rightward movement amount equal to or greater than a threshold and a leftward movement amount equal to or greater than a threshold within a predetermined time. If the determination unit 226 detects this movement, it determines that the motion of the user's predetermined part is the second mode switching motion, in which a motion determination process for determining a command is performed after the pointer is moved.
 Furthermore, when the movement mode is in the second state, following the output of the movement information, the determination unit 226 determines, after the pointer stops moving, whether the motion of the predetermined part is a command motion based on the data regarding the movement speed. For example, when a plurality of command motions are set, the determination unit 226 determines, based on the sequentially acquired data regarding the movement speed, whether the motion corresponds to one of the plurality of command motions.
 Furthermore, when the determination unit 226 determines that the motion is a command motion, the output unit 228 may output the command corresponding to that command motion. For example, if the command motion is a momentary movement to the left, a left-click command is output; if the command motion is a momentary movement to the right, a right-click command is output; if the command motion is a momentary upward movement, a double-click command is output; and if the command motion is a momentary downward movement, a drag-start command is output.
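The momentary-movement command motions just described amount to a lookup from a detected direction to a command. A minimal sketch (the names and string values are assumptions for illustration, not part of the disclosure):

```python
# Illustrative mapping from momentary command motions to output commands.
# Directions and command names are assumptions for illustration.

COMMAND_MOTIONS = {
    "left": "left_click",
    "right": "right_click",
    "up": "double_click",
    "down": "drag_start",
}

def resolve_command(direction):
    """Return the command for a momentary movement in `direction`,
    or None if the direction is not a configured command motion."""
    return COMMAND_MOTIONS.get(direction)
```

Since the mapping is data rather than logic, reassigning it would also cover the passage above stating that the commands may be changed to ones the user uses frequently.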
 Through the above processing, for example, by moving the pointer, stopping once on a target icon, and then momentarily moving the head to the right, a command to right-click on that icon is output, and the context menu of the icon can be opened. That is, a command can be defined by combining a plurality of motions, which makes it possible to increase the variations of commands.
 Furthermore, the predetermined condition for transitioning from the movement mode to the non-movement mode may include the output of a command by the output unit 228. For example, when the mode control unit 224 detects that a command has been output from the output unit 228, it controls the mode to transition from the movement mode to the non-movement mode.
 Through the above processing, the mode can automatically transition to the non-movement mode after a command is output, and the pointer can be prevented from moving unnecessarily after the input operation is completed.
 Furthermore, when switching from the non-movement mode to the movement mode, the output unit 228 may output motion information that causes the pointer to make a movement corresponding to the mode switching motion. For example, for a predetermined time after the mode switch, the output unit 228 may output, instead of the movement information, pointer motion information corresponding to the preset mode switching motion, and then output movement information based on the movement speed data following that motion information.
 Through the above processing, the user can confirm whether the movement of the predetermined part was as intended by observing the movement of the pointer.
 Furthermore, the above-mentioned motion information may include information indicating a predetermined movement that is unrelated to the motion of the user's predetermined part. For example, while the movement of the predetermined part is left-right-left-right, the motion information may indicate three circles.
 Through the above processing, by observing the simple movement of the pointer, the user can confirm whether his or her own complex movement was input correctly.
 Furthermore, when the current mode is the non-movement mode, the determination unit 226 may determine, based on the sequentially acquired data regarding the movement speed, whether the motion of the user's predetermined part is a command motion that does not involve movement of the pointer.
 In this case, when it is determined that the motion of the user's predetermined part is the above command motion, the output unit 228 may output the command corresponding to that command motion.
 Through the above processing, by directly performing a command motion while in the non-movement mode, the user can cancel the non-movement mode and directly output a command that does not require pointer movement (for example, a scroll operation).
 <Data example>
 FIG. 5 is a diagram showing an example of the data regarding the movement speed in the embodiment. The data shown in FIG. 5 represent data acquired at a predetermined sampling rate (for example, 20 Hz, that is, one sample every 0.05 seconds). The data shown in FIG. 5 are stored, for example, in a memory within the information processing device 20. Note that data regarding the movement speed of the line of sight based on electro-oculogram signals may also be stored as the data regarding the movement speed.
 <Specific example>
 FIG. 6 is a diagram showing an example of the relationship among data, motion, and output in the embodiment. In the example shown in FIG. 6, after the information processing device 20 is powered on and the external device 10 and the information processing device 20 are paired via Bluetooth (registered trademark) or the like, the mode is controlled to the non-movement mode. That is, after pairing, although the user's predetermined part is moving, the mode of the processing unit 202 is the non-movement mode, so non-movement information is output. Accordingly, the external device 10 acquires the non-movement information and does not move the pointer.
 Next, when the predetermined part moves left and then right within the predetermined time up to 0.85 seconds, the determination unit 226 determines, for example, that the sequentially acquired data regarding the movement speed indicate a left-right movement and thus that the motion is the first mode switching motion. The mode control unit 224 causes a transition from the non-movement mode to the first state of the movement mode. At this time, the output unit 228 may output non-movement information or, if there is a time lag between data acquisition and output, may output movement information for the series of left-right motions. The output unit 228 may also output motion information indicating the command.
 Next, the predetermined part moves until 2.65 seconds; the user moves the predetermined part so that the pointer is positioned at the user's target position. In the first state, the sensor detects this movement, the data regarding the movement speed are acquired by the acquisition unit 222, and movement information based on these data is output by the output unit 228.
 Next, when the predetermined part stops for a predetermined time (for example, between 2.70 and 2.80 seconds), the sequentially acquired data regarding the movement speed indicate a movement direction and movement amount of approximately zero. Accordingly, the determination unit 226 determines that the pointer has stopped, and the output unit 228 outputs the left-click command corresponding to the first mode switching motion.
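The stop determination just described checks that the recent velocity samples are all approximately zero. A minimal sketch (the function name and tolerance are assumptions for illustration):

```python
# Illustrative sketch of the stop determination: the pointer is considered
# stopped when every velocity sample in the stop window (e.g. 2.70-2.80 s,
# i.e. a few samples at 20 Hz) is near zero. The tolerance is an assumption.

def is_stopped(recent_velocities, tolerance=0.1):
    """recent_velocities: list of (vx, vy) samples over the stop window.
    Returns True when both components of every sample are within tolerance."""
    return all(abs(vx) <= tolerance and abs(vy) <= tolerance
               for vx, vy in recent_velocities)
```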
 FIG. 7 is a diagram showing an example of the relationship among the motion of the predetermined part, the mode, and the command in the embodiment. For example, starting from the non-movement mode, the motions of the predetermined part and the processing that follows are as shown below. Note that after a command is issued, the state returns to the non-movement mode.
 Motion of the predetermined part: subsequent processing
 Left-right (switching motion to movement mode): pointer movement → stop → left click
 Right-left (switching motion to movement mode): pointer movement → stop → click selection (left: left click, right: right click, up: double click, down: drag start)
 Left-right-left-right or right-left-right-left (stop motion): stop mode
 Three vertical round trips (setting motion): setting mode → mode selection (right: setting change mode, left: setting display mode)
 Up-down (direct command motion): scroll one step up
 Down-up (direct command motion): scroll one step down
 Up-down-up (direct command motion): continuous scroll up → up or down → stop scrolling
 Down-up-down (direct command motion): continuous scroll down → up or down → stop scrolling
 As described above, it is possible to transition from the non-movement mode to a mode in which the pointer moves (pointer movement), a mode in which a command is directly issued, and the like. For example, a mouse wheel issues commands without moving the pointer. To provide the same functionality as this wheel function, when a direct command motion corresponding to a scroll command is determined, the command is issued directly without moving the pointer.
 In the stop mode, the pointer does not move and no command is issued unless a motion to cancel the stop mode is performed. For example, to prevent an unintended operation from being performed when the user moves the predetermined part in sync with another person, such as during a meeting, a mode is provided that does not accept any motion other than the motion to cancel the stop mode. This makes it possible to issue commands based on the motions the user intends more appropriately.
 FIG. 8 is a diagram showing an example of the relationship among the motion, the command, and the pointer movement in the embodiment. The pointer movement refers to the movement of the pointer that is set, after the mode switch, according to the motion of the predetermined part or the command. In the example shown in FIG. 8, the relationships are as follows.
 Motion indicated by the data: command: pointer movement
 Left-right: move: left-right
 Right-left: move: right-left
 Left-right-left-right: stop mode: three circles
 Three vertical round trips: setting mode: two circles
 Up-down: scroll one step up: up-down
 Down-up: scroll one step down: down-up
 When the command is a movement command or the like, the movement indicated by the data and the movement of the pointer are the same, but the movement of the pointer may be emphasized to highlight that the command has been recognized. As the emphasis processing, the output unit 228 may output, for example, motion information with a movement that is faster or larger than the movement indicated by the actual data. Since the movement indicated by the data and the movement of the pointer are the same, the user can intuitively understand that the motion has been appropriately recognized.
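The emphasis processing mentioned above amounts to scaling the output motion relative to the measured motion. A minimal sketch (the function name and gain value are assumptions for illustration):

```python
# Illustrative sketch of the emphasis processing: the output motion mirrors
# the measured motion but is amplified so the user can see that the command
# was recognized. The gain value is an assumption for illustration.

def emphasize(motion, gain=2.0):
    """motion: list of (dx, dy) pointer deltas; returns the same motion
    amplified by `gain`."""
    return [(dx * gain, dy * gain) for dx, dy in motion]
```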
 When the command is for the stop mode or the like, the command involves no movement, so the pointer makes a predetermined movement unrelated to the movement indicated by the data. In this case, by using a simple movement that the predetermined part would not normally make, the user can confirm whether the motion of the predetermined part has been appropriately recognized.
 <Operation>
 FIGS. 9 and 10 are flowcharts showing an example of the processing related to command issuance in the embodiment. The processing shown in FIGS. 9 and 10 is an example of processing executed by the information processing device 20.
 In step S102 shown in FIG. 9, the acquisition unit 222 sequentially acquires, from a sensor capable of measuring the motion of a predetermined part of the user, data regarding the movement speed of the predetermined part. The sensor is, for example, an angular velocity sensor mounted on the eyewear 30; the predetermined part is, for example, the user's head; and the data regarding the movement speed are, for example, angular velocity data.
 In step S104, the mode control unit 224 determines whether the current state is the non-movement mode or the movement mode. For example, the mode control unit 224 sets the non-movement mode as the default, and sets the movement mode from when a mode switching motion is detected until a predetermined condition is satisfied. Here, if the current state is the non-movement mode (step S104-YES), the processing proceeds to step S106; if the current state is the movement mode (step S104-NO), the processing proceeds to step S122 shown in FIG. 10.
 In step S106, the determination unit 226 determines whether the sequentially acquired data regarding the movement speed indicate one or more preset mode switching motions. If the data indicate a mode switching motion (step S106-YES), the processing proceeds to step S114; if the data do not indicate a mode switching motion (step S106-NO), the processing proceeds to step S108.
 In step S108, the determination unit 226 determines whether the data regarding the movement speed acquired in step S106 indicate a command motion that does not involve pointer movement (a direct command motion). If the data indicate a direct command motion (step S108-YES), the processing proceeds to step S110; if the data do not indicate a direct command motion (step S108-NO), the processing proceeds to step S112.
 In step S110, the output unit 228 outputs the command corresponding to the direct command motion.
 In step S112, the output unit 228 outputs non-movement information that does not move the pointer of the external device 10, for example, information indicating a movement amount of (0, 0).
 In step S114, the mode control unit 224 switches from the non-movement mode to the movement mode.
 In step S116, the determination unit 226 determines whether the data regarding the movement speed acquired in step S106 indicate the first mode switching motion. If the data indicate the first mode switching motion (step S116-YES), the processing proceeds to step S118; if the data do not indicate the first mode switching motion (step S116-NO), the processing proceeds to step S120. In other words, if the motion is determined to be the first mode switching motion, the processing proceeds to step S118, and if the motion is determined to be the second mode switching motion, the processing proceeds to step S120.
 In step S118, the mode control unit 224 sets the first state, in which a command is issued after the pointer stops.
 In step S120, the mode control unit 224 sets the second state, in which the device waits for a motion corresponding to a predetermined command.
 In step S122 shown in FIG. 10, that is, in the movement mode, the determination unit 226 determines whether the sequentially acquired data regarding the movement speed indicate a stopping motion of the pointer. If the data indicate a stopping motion of the pointer (step S122-YES), the processing proceeds to step S126; if the data do not indicate a stopping motion of the pointer (step S122-NO), the processing proceeds to step S124.
 In step S124, the output unit 228 outputs movement information indicating the movement amount of the pointer based on the sequentially acquired data regarding the movement speed.
 In step S126, the mode control unit 224 determines whether the current state is the first state. If the current state is the first state (step S126-YES), the processing proceeds to step S128; if the current state is the second state (step S126-NO), the processing proceeds to step S130.
 In step S128, the output unit 228 outputs the command corresponding to the command motion associated with the first mode switching motion. In addition, after the command is output, the mode control unit 224 switches from the movement mode to the non-movement mode.
 In step S130, the output unit 228 outputs the command corresponding to the predetermined command motion performed after the pointer stops. In addition, after the command is output, the mode control unit 224 switches from the movement mode to the non-movement mode.
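The flow of FIGS. 9 and 10 is essentially a small state machine: the non-movement mode waits for a mode switching motion, and the movement mode emits movement information until a stop is detected, issues the command, and returns to the non-movement mode. A minimal sketch under those assumptions (all class, method, and string names are illustrative, not part of the disclosure):

```python
# Illustrative state-machine sketch of the mode control flow in FIGS. 9-10.
# Motion classification is assumed to have been done upstream; inputs are
# symbolic labels. All names are assumptions for illustration.

class PointerModeController:
    def __init__(self):
        self.mode = "non_movement"   # default mode (cf. step S104)
        self.state = None            # "first" or "second" within movement mode

    def on_motion(self, motion):
        """motion: "left_right", "right_left", "stop", or a movement sample.
        Returns a label describing the output for this step."""
        if self.mode == "non_movement":
            if motion == "left_right":            # first mode switching motion
                self.mode, self.state = "movement", "first"
                return "switched_to_first_state"
            if motion == "right_left":            # second mode switching motion
                self.mode, self.state = "movement", "second"
                return "switched_to_second_state"
            return "no_move"                      # non-movement info (0, 0)
        # movement mode (cf. steps S122-S130)
        if motion == "stop":
            command = ("left_click" if self.state == "first"
                       else "await_command_motion")
            self.mode, self.state = "non_movement", None  # back to default
            return command
        return "move"                             # movement information
```

In the first state a stop directly yields the left click; in the second state it yields a wait for the subsequent command motion, matching the two branches of step S126.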
 Through the above processing, when an input operation is performed using the movement of the predetermined part where the sensor is provided, the operation intended by the user can be realized more appropriately and usability can be improved.
 Note that when the determination unit 226 determines a mode switching motion or a command motion, the output unit 228 may output pointer motion information corresponding to the mode switching motion or the command motion. Through this processing, the user can confirm whether the movement of the predetermined part has been correctly recognized by the information processing device 20.
 Note that the processing steps included in the processing flow described with reference to FIG. 9 can be executed in any order or in parallel, as long as no contradiction arises in the processing content, and other steps may be added between the processing steps. Furthermore, a step described as a single step for convenience can be divided into a plurality of steps and executed, while steps described as a plurality of steps for convenience can be understood as a single step.
 As described above, according to the embodiment, when an input operation is performed by measuring the motion of a predetermined part with a sensor, the operation intended by the user can be realized more appropriately and usability can be improved. Furthermore, according to the embodiment, when an input operation is performed using the movement of a predetermined part, it is possible to provide an interface in which, as with an input device such as a mouse, the movement intended by the user is recognized and the input operation can be performed when the user wants to move the pointer or perform a predetermined operation.
 In the above embodiment, movements of the user's head, arms, legs, and the like were given as examples of the movement of the predetermined part, but the movement of the line of sight of the eyes may also be used. In this case, the movement of the eyeballs may be obtained from the sensor signals detected from the bioelectrodes (an example of the sensor), and the processing unit 202 may control the processing based on data regarding this movement.
 Although the embodiment has been described using sensor signals from the six-axis sensor mounted on the eyewear 30, as described above, the six-axis sensor may be worn not only on the head but at any position on the human body.
 In the embodiment, it was mentioned that data from a three-axis acceleration sensor may be used as the movement speed data; in this case, after the acceleration is detected, it may be converted into velocity by predetermined arithmetic processing, and the converted velocity data may be used.
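The acceleration-to-velocity conversion mentioned above can be sketched as simple numerical integration over the fixed sampling interval (the function name, the 20 Hz interval, and the initial-velocity handling are assumptions for illustration):

```python
# Illustrative sketch: convert 3-axis accelerometer samples to velocity
# estimates by accumulating a * dt at each step. A real implementation
# would also need drift compensation; this sketch omits it.

def accel_to_velocity(accels, dt=0.05, v0=(0.0, 0.0, 0.0)):
    """accels: list of (ax, ay, az) samples; returns the list of velocity
    estimates after each sample."""
    vx, vy, vz = v0
    velocities = []
    for ax, ay, az in accels:
        vx += ax * dt
        vy += ay * dt
        vz += az * dt
        velocities.append((vx, vy, vz))
    return velocities
```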
 Although the present invention has been described above using an embodiment, the technical scope of the present invention is not limited to the scope described in the above embodiment. It will be apparent to those skilled in the art that various changes or improvements can be made to the above embodiment. It is clear from the description of the claims that embodiments incorporating such changes or improvements can also be included within the technical scope of the present invention.
 <Modification 1>
 In the embodiment described above, the information processing device 20 on the eyewear 30 side serves as an interface that replaces a mouse or the like, and the information processing device 20 outputs operation commands such as left click, right click, and scroll to the external device 10. Alternatively, the information processing device 20 may amplify or packetize the sequentially acquired data regarding the movement speed and transmit them to the external device 10. In this case, the external device 10 may have the functions of the processing unit 202 described above, output the determined command to an OS (Operating System) or the like, and control the movement and operation of the pointer on the screen.
 <Modification 2>
 In Modification 2, the processing described above in the embodiment may be realized by an information processing device in which the information processing device 20 and part of the configuration of the external device 10 are integrally formed. For example, a head-mounted display may include the processing unit 202 of the information processing device 20 and the display 151 of the external device 10; a pointer corresponding to the movement of the user's head is superimposed on an image displayed on the display 151, and the input operations described above may be performed.
 <Modification 3>
 In Modification 3, an imaging sensor 160 is used as the sensor capable of measuring the movement of the predetermined part of the user. In this case, the external device 10 is the information processing device described above, and the CPU 110 includes the processing unit 202. As a specific example, the movement of a predetermined part of the user (such as the head) is captured using, for example, the live view mode of the imaging sensor 160, and the processing unit 202 acquires data on the movement of the predetermined part recognized by image recognition. The subsequent processing is the same as in the embodiment. Thus, by introducing the non-movement mode described above on a personal computer equipped with a camera, the operations the user intends can be realized more appropriately and usability can be improved.
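The data path of Modification 3 can be sketched as follows. A face or landmark detector (not shown here) is assumed to yield one head position per video frame; the per-frame displacement then serves as the movement data fed to the processing unit. The frame rate and pixel units are assumptions for illustration.

```python
# Hypothetical sketch of Modification 3: per-frame head positions from
# image recognition are converted into velocity samples analogous to the
# sensor data of the embodiment. Frame rate and units are assumptions.

def positions_to_motion(positions, fps=30.0):
    """Convert per-frame (x, y) head positions into (vx, vy) in px/s."""
    motion = []
    for (x0, y0), (x1, y1) in zip(positions, positions[1:]):
        # Displacement per frame times frame rate gives velocity.
        motion.append(((x1 - x0) * fps, (y1 - y0) * fps))
    return motion

# Head drifting right by 2 px per frame at 30 fps -> vx = 60 px/s.
motion = positions_to_motion([(100 + 2 * i, 50) for i in range(5)])
```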
10 Information processing device (external device)
20 Information processing device
30 Eyewear
110 CPU
112 Acquisition unit
113 Processing control unit
120 Network communication interface
130 Memory
150 User interface
160 Imaging sensor
202 Processing unit
204 Transmission unit
206 6-axis sensor
208 Power supply unit
222 Acquisition unit
224 Mode control unit
226 Determination unit
228 Output unit

Claims (9)

  1.  An information processing method executed by an information processing device, the method comprising:
      sequentially acquiring, from a sensor capable of measuring the movement of a predetermined part of a user, data on the movement of the predetermined part at a plurality of points in time;
      when in a non-movement mode, determining, based on the data, whether the movement of the predetermined part is a mode switching movement;
      when it is determined that the movement of the predetermined part is a mode switching movement, switching from the non-movement mode to a movement mode;
      when in the movement mode, outputting movement information regarding a pointer based on the data; and
      when a predetermined condition is satisfied after the movement information is output, switching from the movement mode to the non-movement mode.
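The method of claim 1 amounts to a two-state machine. The sketch below is an illustrative reading, not the claimed implementation: the trigger gesture ("nod"), the end condition ("still"), and the callback-based design are placeholders assumed for clarity.

```python
from enum import Enum

# Minimal sketch of the claimed mode switching. Gesture predicates and
# the "predetermined condition" are placeholders, not defined gestures.

class Mode(Enum):
    NON_MOVEMENT = 0
    MOVEMENT = 1

class PointerController:
    def __init__(self, is_mode_switch, end_condition):
        self.mode = Mode.NON_MOVEMENT
        self.is_mode_switch = is_mode_switch  # data -> bool
        self.end_condition = end_condition    # data -> bool
        self.outputs = []

    def feed(self, data):
        if self.mode is Mode.NON_MOVEMENT:
            # In the non-movement mode, sensor data never moves the
            # pointer; it is only tested against the switching movement.
            if self.is_mode_switch(data):
                self.mode = Mode.MOVEMENT
        else:
            # In the movement mode, output movement information, then
            # fall back once the predetermined condition is satisfied.
            self.outputs.append(("move", data))
            if self.end_condition(data):
                self.mode = Mode.NON_MOVEMENT

ctl = PointerController(
    is_mode_switch=lambda d: d == "nod",   # assumed trigger gesture
    end_condition=lambda d: d == "still",  # assumed end condition
)
for sample in ["drift", "nod", "left", "still", "drift"]:
    ctl.feed(sample)
```

Note how the trailing "drift" sample is ignored: after the end condition fires, the controller is back in the non-movement mode, which is the usability point of the claim.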
  2.  The information processing method according to claim 1, wherein
      switching from the non-movement mode to the movement mode includes switching from the non-movement mode to a first state of the movement mode when the movement of the predetermined part is a first mode switching movement, and
      the information processing device further executes, when the movement mode is in the first state, following the output of the movement information, outputting a command corresponding to the first mode switching movement after the pointer stops moving.
  3.  The information processing method according to claim 1, wherein
      switching from the non-movement mode to the movement mode includes switching from the non-movement mode to a second state of the movement mode when the movement of the predetermined part is a second mode switching movement, and
      the information processing device further executes, when the movement mode is in the second state, following the output of the movement information and after the pointer stops moving, determining based on the data whether the movement of the predetermined part is a command movement, and outputting a command corresponding to the command movement when it is determined to be the command movement.
  4.  The information processing method according to claim 2 or 3, wherein the predetermined condition includes that the command has been output.
  5.  The information processing method according to claim 1, further comprising, when switching from the non-movement mode to the movement mode, outputting motion information that causes the pointer to make a motion corresponding to the mode switching movement.
  6.  The information processing method according to claim 5, wherein the motion information includes information indicating a motion that is predetermined irrespective of the movement of the predetermined part.
  7.  The information processing method according to claim 1, wherein the information processing device further executes:
      when in the non-movement mode, determining, based on the data, whether the movement of the predetermined part is a command movement that does not involve movement of the pointer; and
      when it is determined that the movement of the predetermined part is a command movement that does not involve movement of the pointer, outputting a command corresponding to the command movement that does not involve movement of the pointer.
  8.  A program that causes an information processing device to execute:
      sequentially acquiring, from a sensor capable of measuring the movement of a predetermined part of a user, data on the movement of the predetermined part at a plurality of points in time;
      when in a non-movement mode, determining, based on the data, whether the movement of the predetermined part is a mode switching movement;
      when it is determined that the movement of the predetermined part is a mode switching movement, switching from the non-movement mode to a movement mode;
      when in the movement mode, outputting movement information regarding a pointer based on the data; and
      when a predetermined condition is satisfied after the movement information is output, switching from the movement mode to the non-movement mode.
  9.  An information processing device comprising a processor, wherein the processor executes:
      sequentially acquiring, from a sensor capable of measuring the movement of a predetermined part of a user, data on the movement of the predetermined part at a plurality of points in time;
      when in a non-movement mode, determining, based on the data, whether the movement of the predetermined part is a mode switching movement;
      when it is determined that the movement of the predetermined part is a mode switching movement, switching from the non-movement mode to a movement mode;
      when in the movement mode, outputting movement information regarding a pointer based on the data; and
      when a predetermined condition is satisfied after the movement information is output, switching from the movement mode to the non-movement mode.
PCT/JP2023/017267 2022-06-08 2023-05-08 Information processing method, program, and information processing device WO2023238568A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2022-092871 2022-06-08
JP2022092871A JP2023179927A (en) 2022-06-08 2022-06-08 Information processing method, program, and information processing device

Publications (1)

Publication Number Publication Date
WO2023238568A1 true WO2023238568A1 (en) 2023-12-14

Family

ID=89118137

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2023/017267 WO2023238568A1 (en) 2022-06-08 2023-05-08 Information processing method, program, and information processing device

Country Status (2)

Country Link
JP (1) JP2023179927A (en)
WO (1) WO2023238568A1 (en)

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2016062274A (en) * 2014-09-17 2016-04-25 株式会社東芝 Recognition device, recognition method, and recognition program
JP2017142857A (en) * 2017-05-08 2017-08-17 株式会社ニコン Input device
JP2022026251A (en) * 2020-07-30 2022-02-10 株式会社ジンズホールディングス Program, information processing method, and information processing device
Also Published As

Publication number Publication date
JP2023179927A (en) 2023-12-20
Legal Events

Date Code Title Description
121 EP: the EPO has been informed by WIPO that EP was designated in this application

Ref document number: 23819554

Country of ref document: EP

Kind code of ref document: A1